Discrete Random Variable
: $X \sim$ Poisson($\lambda$) = the distribution of the number of events occurring in a fixed time interval at rate $\lambda$. We assume that the times between events are independent of one another!
$\bigstar P(X=k)= e^{-\lambda} \cdot \frac {\lambda^k}{k!}$, k=0,1,..., 0 $\leqslant$ $\lambda < \infty$
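As a quick sanity check (not part of the original notes), the pmf can be computed directly from this formula and verified to sum to 1 over $k=0,1,\dots$ (the rate $\lambda = 3$ below is an arbitrary illustrative value):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# The pmf sums to 1 over k = 0, 1, 2, ... (truncated at 100 terms,
# which is far past the negligible tail for lam = 3).
total = sum(poisson_pmf(k, 3.0) for k in range(100))
```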
$\bigstar$ Moment Generating Function, m(t)=$e^{\lambda(e^t-1)}$
Proof
m(t)=$E[e^{tX}]=\sum_{x=0}^{\infty}e^{tx}\cdot e^{-\lambda}\cdot \frac {\lambda^x}{x!}= e^{-\lambda}\sum_{x=0}^{\infty} \frac {e^{tx}\lambda^x}{x!}$
(tip!! $\sum p(x) = 1 \rightarrow \sum_{x=0}^{\infty}e^{-\lambda}\cdot \frac{\lambda^x}{x!}=1 \rightarrow e^{-\lambda} \cdot \sum_{x=0}^{\infty} \frac{\lambda^x}{x!}=1$)
= $e^{-\lambda}\cdot \sum_{x=0}^{\infty}\frac{(e^t\lambda)^x}{x!}=e^{-\lambda}\cdot e^{e^t \cdot \lambda}=e^{\lambda(e^t-1)}$
$\bigstar$ E(X)=$\lambda$, Var(X)=$\lambda$
Proof
$\rightarrow$ by using mgf, we need E[X]=m'(0), $m'(t)=e^u$, where $u=\lambda \cdot(e^t-1)$
$m'(t)= \frac{dm}{du}\cdot \frac{du}{dt}=e^u \cdot\lambda e^t=e^{\lambda(e^t-1)}\cdot \lambda e^t$ $\therefore m'(0)=e^0 \cdot \lambda e^0= \lambda$
$\rightarrow$ For the variance, we need $E[X^2]=m''(0)$
$m'(t)=e^{\lambda(e^t-1)}\cdot \lambda e^t$, $m''(t)=\lambda [e^t \cdot e^{\lambda(e^t-1)}\cdot \lambda e^t+e^{\lambda(e^t-1)}\cdot e^t]$ $\rightarrow m''(0)=\lambda(\lambda+1)$
$\therefore Var(X)=E(X^2)-[E(X)]^2=\lambda^2+\lambda - \lambda^2=\lambda$
Maximum Likelihood Estimate (MLE) in Poisson Distribution
Proof
$L(\lambda)= L(x_1, x_2,...,x_n|\lambda)=\prod_{i=1}^{n}\frac {\lambda^{x_{i}}\cdot e^{-\lambda}}{x_{i}!}$
$l(\lambda)=-n\lambda+ \sum x_{i}\log \lambda-\log(\prod x_{i}!)$
$l'(\lambda)=-n + \frac {\sum x_{i}}{\lambda}\overset{set}{=}0$ $\rightarrow n\lambda=\sum x_{i}\rightarrow \hat\lambda= \frac{\sum x_{i}}{n}$
$\hat{\lambda}=\frac{1}{n} \sum x_{i}=\bar{x}$
So the expected value and variance of $\hat{\lambda}$ are...
$E[\bar{x}]=E[\frac{\sum x_{i}}{n}]=\frac {1}{n}\cdot n\cdot E[x]=\lambda$
$Var[\bar{x}]=Var[\frac{\sum x_{i}}{n}]=\frac {1}{n^2}\cdot \sum Var(x_{i})= \frac {\lambda}{n}$
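A minimal sketch of the MLE as the sample mean (the sample counts below are made up for illustration only):

```python
# Hypothetical observed Poisson counts (illustrative data, not from the text)
sample = [2, 0, 3, 1, 4, 2, 1, 0, 2, 3]

# MLE of lambda is the sample mean: lam_hat = (sum x_i) / n
lam_hat = sum(sample) / len(sample)
```

As shown above, $E[\hat{\lambda}]=\lambda$ (unbiased) and $Var[\hat{\lambda}]=\lambda/n$, so the estimate tightens as $n$ grows.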
Example) Mathematical Statistics and Data Analysis, 3ED, Chapter 8. Q3.
One of the earliest applications of the Poisson distribution was made by Student (1907) in studying errors made in counting yeast cells or blood corpuscles with a haemacytometer. In this study, yeast cells were killed and mixed with water and gelatin; the mixture was then spread on a glass and allowed to cool. Four different concentrations were used. Counts were made on 400 squares and the data are summarized in the following table (we'll deal with only one data set):
(#cells, Concentration 2)
= (0, 103) (1, 143) (2, 98) (3, 42) (4, 8) (5, 4) (6, 2) (7, 0) (8, 0) (9, 0) (10, 0) (11, 0) (12, 0)
a) What is the log-likelihood for $\lambda$?
b) What is the maximum likelihood estimate of $\lambda$?
c) Calculate the maximum likelihood estimate from the data above.
d) Find an approximate 95% confidence interval for $\lambda$.
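For parts c) and d), a sketch using the Concentration 2 counts above, with the large-sample interval $\hat{\lambda} \pm 1.96\sqrt{\hat{\lambda}/n}$ that follows from $Var[\hat{\lambda}]=\lambda/n$:

```python
from math import sqrt

# Concentration 2 data: {cell count k: number of squares showing that count}
data = {0: 103, 1: 143, 2: 98, 3: 42, 4: 8, 5: 4, 6: 2}

n = sum(data.values())                             # 400 squares in total
lam_hat = sum(k * m for k, m in data.items()) / n  # MLE = sample mean = 529/400

# Approximate 95% CI: lam_hat +/- 1.96 * sqrt(lam_hat / n)
se = sqrt(lam_hat / n)
ci = (lam_hat - 1.96 * se, lam_hat + 1.96 * se)
```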
Proof
$X_{1}, \cdots, X_n \sim$ Poisson, $P(X_{1}=x_{1}, \cdots, X_{n}=x_{n})= \frac {\lambda^{\sum x_{i}}\cdot e^{-n\lambda}}{\prod x_{i}!}$
$\rightarrow \lambda^{\sum x_{i}}e^{-n\lambda}\cdot \frac {1}{\prod x_{i}!}$, where $g(\sum x_{i}, \lambda) = \lambda^{\sum x_{i}}e^{-n\lambda}$, $h(x_{1}, \cdots , x_{n})= \frac {1}{\prod x_{i}!}$; by the factorization theorem, $\sum x_{i}$ is sufficient for $\lambda$.
Let $X_{1}, X_{2}$ be a random sample of size 2 from a Poisson distribution, $f(x_{1})=\frac {\lambda^{x_{1}}\cdot e^{-\lambda}}{x_{1}!}$. Show that $T=X_{1}+ X_{2}$ is a sufficient statistic for $\lambda$.
Solution??!!
$\bigstar$ Poisson Distribution is a member of the exponential family
Proof
$p_{\theta}(x)=e^{-\theta} \cdot \frac {\theta^x}{x!}= \exp [x \log \theta - \theta] \cdot \frac {1}{x!}$, where $h(x)=\frac{1}{x!}$, $T(x)=x$, $c(\theta)=\log \theta$, $d(\theta)=\theta$
$\bigstar$ Likelihood Ratio Testing
Example) Mathematical Statistics and Data Analysis 3ED, Chapter 9, Q7.
Let $X_{1}, \cdots ,X_{n}$ be a sample from a Poisson distribution. Find the likelihood ratio for testing $H_{0}: \lambda = \lambda_{0}$ versus $H_{A}: \lambda = \lambda_{1}$, where $\lambda_{1}> \lambda_{0}$. Use the fact that the sum of independent Poisson random variables follows a Poisson distribution to explain how to determine a rejection region for a test at level $\alpha$.