Showing posts with label Poisson Distribution. Show all posts

Poisson Distribution Example_Likelihood Ratio testing_1

Example) Mathematical Statistics and Data Analysis 3ED, Chapter 9, Q7. 

Let $X_{1}, \cdots ,X_{n}$ be a sample from a Poisson distribution. Find the likelihood ratio for testing $H_{0}: \lambda = \lambda_{0}$ versus $H_{A}: \lambda = \lambda_{1}$, where $\lambda_{1}> \lambda_{0}$. Use the fact that the sum of independent Poisson random variables follows a Poisson distribution to explain how to determine a rejection region for a test at level $\alpha$.


$\triangleright$ Think First
The LRT rejects $H_{0}\Leftrightarrow \frac {L(data|H_{0})}{L(data|H_{1})} < C$ 

If $X_{1}$ ~ Poisson($\lambda_{1}$) and $X_{2}$ ~ Poisson($\lambda_{2}$) are independent, 
then ($X_{1}$+$X_{2}$) ~ Poisson($\lambda_{1}$+$\lambda_{2}$).
Thus, under $H_{0}$, $\sum x_{i}$ ~ Poisson($n \cdot \lambda_{0}$).

$\triangleright$ Solution
First, the likelihood ratio is...
$\rightarrow \frac {L(data|H_{0})}{L(data|H_{1})}=\prod \frac {e^{-\lambda_{0}}\lambda_{0}^{x_{i}}}{x_{i}!} / \prod \frac {e^{-\lambda_{1}}\lambda_{1}^{x_{i}}}{x_{i}!} = e^{n(\lambda_{1}-\lambda_{0})} \cdot \left(\frac {\lambda_{0}}{\lambda_{1}}\right)^{\sum x_{i}}$ 

Second, the LRT is...
Reject $H_{0}\Leftrightarrow$ $e^{n(\lambda_{1}-\lambda_{0})} \cdot (\frac {\lambda_{0}}{\lambda_{1}})^{\sum x_{i}} < C$
Taking logs: $(\sum x_{i}) \ln \frac {\lambda_{0}}{\lambda_{1}} < \ln C-n(\lambda_{1}-\lambda_{0})$ 
Dividing through: $\sum x_{i} > \frac {\ln C-n(\lambda_{1}-\lambda_{0})}{\ln \lambda_{0}-\ln \lambda_{1}} = C$, where $C$ now denotes a new constant. (The inequality flips because $\ln \frac {\lambda_{0}}{\lambda_{1}}< 0$.)

Thus, reject $H_{0}\Leftrightarrow \sum x_{i}> C$ 

Finally, the significance level $\alpha$ is...
$\alpha$ = P(reject $H_{0}$ | $H_{0}$) = $P(\sum x_{i} > C \mid \lambda = \lambda_{0})$ 

So we have the following equation for $C$: 
$\alpha = P(Y> C)$, where $Y$ ~ Poisson($n \cdot \lambda_{0}$)
$\alpha = 1-F_{n \cdot \lambda_{0}}(C)$, where $F_{n \cdot \lambda_{0}}$ is the CDF of $Y$. 

$\therefore C= F^{-1}_{n\cdot \lambda_{0}}(1-\alpha)$. (Since $Y$ is discrete, an exact level $\alpha$ is usually not attainable; in practice take the smallest $C$ with $P(Y>C)\leqslant \alpha$.)
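As a quick numerical check, the threshold $C$ can be read off the Poisson quantile function. This is a sketch assuming SciPy is available; the values of n, lambda0, and alpha below are arbitrary illustrations, not from the problem.

```python
# Sketch: find the rejection threshold C for the Poisson LRT at level alpha.
# Requires scipy; n, lam0, alpha are arbitrary example values.
from scipy.stats import poisson

def rejection_threshold(n, lam0, alpha):
    """Smallest integer C with P(sum X_i > C | lambda0) <= alpha,
    where sum X_i ~ Poisson(n * lam0) under H0."""
    mu = n * lam0
    # ppf returns the smallest integer c with F(c) >= 1 - alpha
    return int(poisson.ppf(1 - alpha, mu))

c = rejection_threshold(n=10, lam0=1.0, alpha=0.05)
# attained (actual) size of the test; <= alpha because Y is discrete
attained = 1 - poisson.cdf(c, 10 * 1.0)
```

Because the distribution is discrete, the attained size sits strictly below the nominal 0.05 unless a randomized test is used.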

Poisson Distribution Example_Sufficient

Example)

Let $X_{1}, X_{2}$ be a random sample of size 2 from a Poisson distribution $f(x_{1}|\lambda)=\lambda^{x_{1}} \cdot \frac {e^{-\lambda}}{x_{1}!}$. Show that $T=X_{1}+ X_{2}$ is a sufficient statistic for $\lambda$. 

$\triangleright$  Solution
The joint distribution of the random sample is $\prod_{i=1}^{2}f(x_{i}|\lambda)=\frac {\lambda^{x_{1}+x_{2}}}{x_{1}!x_{2}!} \cdot e^{-2\lambda}$ 

The joint probability of $X_{1}=x_{1}, X_{2}=x_{2}$ and $T=X_{1}+X_{2}=t$ is
$\rightarrow f(x_{1}, x_{2}, t|\lambda)=\frac {\lambda^t}{x_{1}!x_{2}!} \cdot e^{-2\lambda}$ 

We know $T=X_{1}+X_{2}$ has a Poisson distribution with parameter $2\lambda$
$\rightarrow g(t|\lambda)=\frac {(2\lambda)^t}{t!}\cdot e^{-2\lambda}$  

Consequently, the conditional distribution of the sample, given T=t is
$\rightarrow f(x_{1}, x_{2}|T=t; \lambda)=\frac {f(x_{1}, x_{2}, t|\lambda)}{g(t|\lambda)}=\frac {\lambda^t}{x_{1}!x_{2}!}\cdot e^{-2\lambda} / \frac {(2\lambda)^t}{t!}\cdot e^{-2\lambda} = \frac {t!}{x_{1}!x_{2}!}\cdot \left(\frac {1}{2}\right)^{t}$ 
$\therefore$ This does not depend on the parameter $\lambda$, since $\lambda$ cancels out; hence $T=X_{1}+X_{2}$ is sufficient for $\lambda$. 
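The cancellation can be verified numerically: given $T=t$, the conditional law of $X_{1}$ should be Binomial($t$, 1/2) no matter what $\lambda$ is. A stdlib-only sketch (the values of t and the two lambdas are arbitrary):

```python
# Sketch: conditional pmf of X1 given X1 + X2 = t for Poisson samples,
# compared against Binomial(t, 1/2). Values of lam and t are arbitrary.
from math import comb, exp, factorial

def pois_pmf(x, lam):
    return lam ** x * exp(-lam) / factorial(x)

def conditional_pmf(t, lam):
    """P(X1 = x1 | X1 + X2 = t) for x1 = 0..t, from the joint pmf."""
    joint = [pois_pmf(x, lam) * pois_pmf(t - x, lam) for x in range(t + 1)]
    total = sum(joint)
    return [p / total for p in joint]

t = 5
binom = [comb(t, x) * 0.5 ** t for x in range(t + 1)]  # Binomial(t, 1/2)
cond_a = conditional_pmf(t, lam=0.7)
cond_b = conditional_pmf(t, lam=3.2)
```

Both conditional pmfs match the Binomial(t, 1/2) pmf, confirming the conditional distribution is free of $\lambda$.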

Poisson Distribution Example_MLE

Example) Mathematical Statistics and Data Analysis, 3ED, Chapter 8. Q3. 

One of the earliest applications of the Poisson distribution was made by Student(1907) in studying errors made in counting yeast cells or blood corpuscles with a haemacytometer. In this study, yeast cells were killed and mixed with water and gelatin; the mixture was then spread on a glass and allowed to cool. Four different concentrations were used. Counts were made on 400 squares, and the data are summarized in the following table (we deal with only one of the four data sets here): 

(#cells, Concentration 2)
= (0, 103) (1, 143) (2, 98) (3, 42) (4, 8) (5, 4) (6, 2) (7, 0) (8, 0) (9, 0) (10, 0) (11, 0) (12, 0)

a) Find the log-likelihood of $\lambda$. 
b) Find the maximum likelihood estimate of $\lambda$. 
c) Calculate the maximum likelihood estimate from the data.
d) Find an approximate 95% confidence interval for $\lambda$.

$P(Y=y)= \frac {e^{-\lambda} \lambda^y}{y!}$ 

$\triangleright$ Solution (a)
$l(\lambda) = \sum y_{i} \cdot \log \lambda - n\lambda - \sum \log(y_{i}!)$ 

$\triangleright$ Solution (b)
$l'(\lambda) = \frac {\sum y_{i}}{\lambda} -n=(set)0\rightarrow \hat {\lambda}=\frac {\sum y_i}{n} = \bar{y}$ 

$\triangleright$ Solution (c)
$\hat{\lambda}=\bar{y}= \frac {529}{400}=1.3225$ 

$\triangleright$ Solution (d)
We can find the variance by using the Fisher information: 
$- \frac {d^2}{d\lambda^2}[\sum y_i \log \lambda-n\lambda]=\frac {\sum y_{i}}{\lambda^2}$; evaluated at $\hat {\lambda}=\frac {\sum y_{i}}{n}$ this is $\frac {\sum y_{i}}{\hat {\lambda}^2}= \frac {n^2}{\sum y_{i}}$, so $\sigma ^2_{\hat{\lambda}}\approx \frac {\sum y_{i}}{n^2}=\frac {\hat{\lambda}}{n}$ 
$\hat {\lambda} \pm Z_{0.975} \sqrt{\frac {\hat{\lambda}}{n}} = \hat {\lambda} \pm Z_{0.975} \cdot \frac {\sqrt{\sum y_{i}}}{n} = 1.3225 \pm 1.96 \cdot \frac {23}{400}=(1.210, 1.435)$ 
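Parts (c) and (d) can be reproduced with a short stdlib-only computation over the tabulated counts (a sketch; the variable names are mine):

```python
# Sketch: MLE and approximate 95% CI for lambda from the Concentration 2 counts.
from math import sqrt

counts = {0: 103, 1: 143, 2: 98, 3: 42, 4: 8, 5: 4, 6: 2}  # cells -> #squares
n = sum(counts.values())                       # 400 squares
total = sum(k * v for k, v in counts.items())  # 529 cells observed in all
lam_hat = total / n                            # MLE = sample mean
se = sqrt(lam_hat / n)                         # asymptotic s.e. = sqrt(529)/400
ci = (lam_hat - 1.96 * se, lam_hat + 1.96 * se)
```

This recovers $\hat{\lambda}=1.3225$ and the interval $(1.210, 1.435)$ from the solution above.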

Poisson Distribution

Discrete Random Variable 
: X ~ Poisson($\lambda$) gives the probability of a number of events occurring in a fixed time interval when events occur at rate $\lambda$. We assume that the times between events are independent of one another! 
$\bigstar P(X=k)= e^{-\lambda} \cdot \frac {\lambda^k}{k!}$, k=0,1,...,  0 $\leqslant$  $\lambda < \infty$ 

$\bigstar$ Moment Generating Function, m(t)=$e^{\lambda(e^t-1)}$ 
Proof 
m(t)=$E[e^{tX}]=\sum_{x=0}^{\infty}e^{tx}\cdot e^{-\lambda}\cdot \frac {\lambda^x}{x!}= e^{-\lambda}\sum_{x=0}^{\infty} \frac {e^{tx}\lambda^x}{x!}$ 
               (tip!! $\sum p(x) = 1 \rightarrow \sum_{x=0}^{\infty}e^{-\lambda}\cdot \frac{\lambda^x}{x!}=1 \rightarrow \sum_{x=0}^{\infty} \frac{\lambda^x}{x!}=e^{\lambda}$) 
                = $e^{-\lambda}\cdot \sum_{x=0}^{\infty}\frac{(e^t\lambda)^x}{x!}=e^{-\lambda}\cdot e^{e^t \cdot \lambda}=e^{\lambda(e^t-1)}$  
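The mgf identity can be sanity-checked numerically: a truncated version of the series $E[e^{tX}]$ should match the closed form $e^{\lambda(e^t-1)}$. A stdlib-only sketch with arbitrary values of $\lambda$ and $t$:

```python
# Sketch: compare the truncated mgf series with the closed form.
# lam and t are arbitrary example values; 150 terms is ample for convergence.
from math import exp, factorial

lam, t = 2.0, 0.3
series = sum(exp(t * x) * exp(-lam) * lam ** x / factorial(x)
             for x in range(150))
closed = exp(lam * (exp(t) - 1))
```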

$\bigstar$ E(X)=$\lambda$, Var(X)=$\lambda$ 
Proof 
$\rightarrow$ Using the mgf, $E[X]=m'(0)$. Write $m(t)=e^u$, where $u=\lambda \cdot(e^t-1)$.  
$m'(t)= \frac{dm}{du}\cdot \frac{du}{dt}=e^u \cdot\lambda e^t=e^{\lambda(e^t-1)}\cdot \lambda e^t$ $\therefore m'(0)=e^0 \cdot \lambda e^0= \lambda$ 

$\rightarrow$ For the variance, we need $E[X^2]=m''(0)$
$m'(t)=e^{\lambda(e^t-1)}\cdot \lambda e^t$, $m''(t)=\lambda e^t \cdot e^{\lambda(e^t-1)}\cdot(\lambda e^t+1)$ $\rightarrow m''(0)=\lambda(\lambda+1)$ 
$\therefore Var(X)=E(X^2)-[E(X)]^2=\lambda^2+\lambda - \lambda^2=\lambda$ 

Maximum Likelihood Estimate (MLE) in Poisson Distribution 
Proof
$L(\lambda)=L(x_1, x_2,...,x_n|\lambda)=\prod_{i=1}^{n}\frac {\lambda^{x_{i}}\cdot e^{-\lambda}}{x_{i}!}$ 
$l(\lambda)=-n\lambda+ \sum x_{i}\log \lambda-\log(\prod x_{i}!)$
$l'(\lambda)=-n + \frac {\sum x_{i}}{\lambda}=(set)0$ $\rightarrow n\lambda=\sum x_{i}\rightarrow \lambda= \frac{\sum x_{i}}{n}$ 
$\hat{\lambda}=\frac{1}{n} \sum x_{i}=\bar{x}$  

So the expected value and variance of $\hat{\lambda}$ are...
$E[\bar{x}]=E[\frac{\sum x_{i}}{n}]=\frac {1}{n}\cdot n\cdot E[x]=\lambda$ 
$Var[\bar{x}]=Var[\frac{\sum x_{i}}{n}]=\frac {1}{n^2}\cdot \sum Var(x_{i})= \frac {\lambda}{n}$ 
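These two facts can be checked by simulation. A stdlib-only sketch using a seeded generator and Knuth's multiplication method to draw Poisson variates (the values of lam, n, and the number of replications are arbitrary):

```python
# Sketch: simulate the sampling distribution of lam_hat = x-bar.
# Expect mean(lam_hat) ~ lam and var(lam_hat) ~ lam / n.
import random
from math import exp

def poisson_sample(lam, rng):
    """One draw from Poisson(lam) via Knuth's multiplication method."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)        # seeded for reproducibility
lam, n, reps = 2.0, 40, 5000  # arbitrary example sizes
estimates = [sum(poisson_sample(lam, rng) for _ in range(n)) / n
             for _ in range(reps)]
mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps
```

The simulated mean lands near $\lambda=2$ and the simulated variance near $\lambda/n=0.05$.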




$\bigstar$ Sufficient Statistics 
Proof
$X_{1}, \cdots, X_n \sim$ Poisson,  $P(X_{1}=x_{1}, \cdots, X_{n}=x_{n})= \frac {\lambda^{\sum x_{i}} \cdot e^{-n\lambda}}{\prod x_{i}!}$ 
$= g(\sum x_{i}, \lambda)\cdot h(x_{1}, \cdots , x_{n})$, where $g(\sum x_{i}, \lambda) = \lambda^{\sum x_{i}}\cdot e^{-n\lambda}$ and $h(x_{1}, \cdots , x_{n})= \frac {1}{\prod x_{i}!}$. By the factorization theorem, $\sum x_{i}$ is sufficient for $\lambda$. 



$\bigstar$ Poisson Distribution is a part of exponential Family 
Proof
$p_{\theta}(x)=e^{-\theta} \cdot \frac {\theta^x}{x!}= \frac {1}{x!}\cdot\exp [x \log \theta - \theta]$, which has the one-parameter exponential family form $h(x)\exp[c(\theta)T(x)-d(\theta)]$ with $T(x)=x$, $c(\theta)=\log \theta$, $d(\theta)=\theta$, and $h(x)=\frac {1}{x!}$ 



