
Binomial Distribution Example_using generating function

Example) 

Let Y be binomial (15, $\frac {1}{3}$). Evaluate Var(Y).
(We know the variance of a binomial distribution is npq, but what if we didn't know this formula?)

$\triangleright$ Think First 
$Var(Y)=E(Y^2)-[E(Y)]^2$. So we need the second moment, which we can obtain from the probability generating function.

$\triangleright$ Solution
The probability generating function is $G(z)=E(z^Y)=(q+pz)^n$.
$G'(z)=E(Y \cdot z^{Y-1})=n \cdot (q+pz)^{n-1}\cdot p$
So when z=1, $G'(1)=E(Y)=np$

$G''(z)=E(Y (Y-1) \cdot z^{Y-2})=n (n-1)\cdot (q+pz)^{n-2}\cdot p^2$,
so when z=1, $G''(1)=E(Y(Y-1))=E(Y^2)-E(Y)=n(n-1)p^2$

$Var(Y)=E(Y^2)-[E(Y)]^2=n(n-1)p^2+np-(np)^2=n^2p^2-np^2+np-n^2p^2=np(1-p)=npq \;\because q=1-p$

$\therefore Var(Y)=npq=15 \cdot \frac{1}{3} \cdot \frac{2}{3}=\frac {10}{3}$
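As a quick sanity check (not part of the original derivation), the generating-function argument can be verified numerically in pure Python: approximate $G'(1)$ and $G''(1)$ with finite differences and recover Var(Y):

```python
# Numerical check of the generating-function derivation.
# G(z) = E(z^Y) = (q + p*z)^n for Y ~ Binomial(n, p); here n = 15, p = 1/3.
n, p = 15, 1 / 3
q = 1 - p

def G(z):
    """Probability generating function of Binomial(n, p)."""
    return (q + p * z) ** n

h = 1e-4
# Central finite differences approximate the derivatives at z = 1.
G1 = (G(1 + h) - G(1 - h)) / (2 * h)          # ~ G'(1) = E(Y) = np
G2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2  # ~ G''(1) = E(Y(Y-1)) = n(n-1)p^2

# Var(Y) = E(Y^2) - E(Y)^2 = G''(1) + G'(1) - G'(1)^2
var = G2 + G1 - G1**2
print(G1, var)  # ~5.0 and ~3.3333 = 10/3
```

The finite-difference estimates agree with np = 5 and npq = 10/3 to several decimal places, confirming the algebra above.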

Binomial Distribution Example_Hypothesis Testing 2

Example) I have no idea where I found this example; sorry!

An experimenter has prepared a drug dosage level that she claims will induce sleep for 80% of people suffering from insomnia. After examining the dosage, we feel that her claims regarding the effectiveness of the dosage are inflated. In an attempt to disprove her claim, we administer her prescribed dosage to 20 insomniacs and we observe Y, the number for whom the drug induces sleep. The rejection region was found to be {Y $\leq$ 12}.

(a)  $H_{0}$? $H_{1}$?
(b) In terms of this problem, what's a Type I error? 
(c) Find $\alpha$
(d) In terms of this problem, what's a Type II error? 
(e) Find  $\beta$ when p=0.6
(f) Find $\beta$ when p=0.4 


$\triangleright$ Solution (a)
$H_{0}$ : p=0.8
$H_{1}$ : p < 0.8

$\triangleright$ Solution (b)
Rejecting $H_{0}$ when $H_{0}$ is true $\rightarrow$ Conclude that the drug is worse than claimed when in fact 80% of insomniacs are able to sleep after taking the drug.

$\triangleright$ Solution (c)
$\alpha$ = P(reject $H_{0}$ when $H_{0}$ is true) 
  = P(Y $\leq$ 12) where Y is binomial (20, 0.8) 
  = P(Y=0)+P(Y=1)+...+P(Y=12) =  $\binom{20}{0}(0.8)^0(0.2)^{20}+ \cdots +\binom{20}{12}(0.8)^{12}(0.2)^8= 0.032$
  
$\triangleright$ Solution (d)
Do not reject $H_{0}$ when $H_{1}$ is true $\rightarrow$ Conclude that the drug is as effective as claimed when in fact it is worse than claimed.

$\triangleright$ Solution (e)
 $\beta$= P(do not reject $H_{0}$ when $H_{1}$ is true)=P(Y$\geq$ 13) where Y~Binomial(20, 0.6)
  = 1-P(Y $\leq$ 12) =0.416 (power = 0.584)

$\triangleright$ Solution (f)
$\beta$ = P(Y $\geq$ 13) = 1-P(Y $\leq$ 12) = 0.021, where Y~Binomial(20, 0.4)
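The three probabilities above are just binomial CDF sums, so they are easy to reproduce; here is a pure-Python check (the helper `binom_cdf` is mine, not from the original post):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(Y <= k) for Y ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

n = 20
# (c) alpha = P(Y <= 12) under H0: p = 0.8
alpha = binom_cdf(12, n, 0.8)
# (e), (f) beta = P(Y >= 13) = 1 - P(Y <= 12) under the true p
beta_06 = 1 - binom_cdf(12, n, 0.6)
beta_04 = 1 - binom_cdf(12, n, 0.4)
print(round(alpha, 3), round(beta_06, 3), round(beta_04, 3))  # ~0.032, ~0.416, ~0.021
```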

Binomial Distribution Example_Hypothesis test

Example) Mathematical Statistics and Data Analysis 3ED, Chapter 9. Q1. 

A coin is tossed independently 10 times to test the hypothesis $H_{0}: p=\frac {1}{2}$ that the probability of heads is $\frac {1}{2}$, against $H_{1}: p\neq \frac {1}{2}$. The test rejects if either 0 or 10 heads are observed.
a) What is the significance level of the test?
b) If, in fact, the probability of heads is 0.1, what is the power of the test?

$\triangleright$ Think First!
This is a binomial example. Under $H_{0}$, X~Bin(10, 0.5), since the coin is tossed 10 times and the probability of heads is 0.5.

$\triangleright$  Solution (a)
$\alpha =$ P(reject $H_{0}$ |$H_{0}$ is true) = P(X=0 | $H_{0}$) + P(X=10 | $H_{0}$)

Under $H_{0}$, $\alpha = \binom{10}{0}(0.5)^0(0.5)^{10}+ \binom{10}{10}(0.5)^{10}(0.5)^0 = \frac{1}{1024}+\frac{1}{1024}=$ 0.0020   

$\triangleright$  Solution (b)
$1-\beta$ = P(reject $H_{0}$ when $H_{1}$ is true) =  P(X=0|$H_{1}$)+P(X=10|$H_{1}$)
         = $\binom{10}{0}(0.1)^0(0.9)^{10}+ \binom{10}{10}(0.1)^{10}(0.9)^0 =$ 0.3487
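Both numbers fall out of two-term binomial sums, so a tiny script (my addition, not from the original post) can confirm them:

```python
# Rejection region is {X = 0 or X = 10} for X ~ Bin(10, p).
n = 10

# (a) significance level under H0: p = 0.5
alpha = 0.5**n + 0.5**n  # P(X=0) + P(X=10) = 2/1024

# (b) power when the true p = 0.1
power = 0.9**n + 0.1**n  # P(X=0) + P(X=10) under p = 0.1

print(round(alpha, 4), round(power, 4))  # ~0.002 and ~0.3487
```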

Binomial Distribution

Discrete Random Variable 
: X~ Bin(n, p) = # of successes in n Bernoulli trials, each with success probability p.

$\bigstar P(X=k)=\binom{n}{k}p^k(1-p)^{n-k}$, k=0,1,...,n
$\bigstar$ E(X)=np, Var(X)=np(1-p)=npq (where q=1-p) 
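These two facts are easy to verify numerically from the pmf alone; a minimal pure-Python sketch (the values n = 20, p = 0.3 are arbitrary choices for illustration):

```python
from math import comb

def pmf(k, n, p):
    """Binomial pmf P(X = k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 20, 0.3
probs = [pmf(k, n, p) for k in range(n + 1)]

# The pmf sums to 1, E(X) = np, and Var(X) = np(1-p).
total = sum(probs)
mean = sum(k * q for k, q in zip(range(n + 1), probs))
var = sum(k**2 * q for k, q in zip(range(n + 1), probs)) - mean**2
print(total, mean, var)  # ~1.0, ~6.0 (= np), ~4.2 (= npq)
```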



Binomial Distribution Example_Likelihood & MLE

Example) Mathematical Statistics and Data Analysis 3ED Chapter 8. Q31.

George spins a coin three times and observes no heads. He then gives the coin to Hilary. She spins it until the first head occurs, and ends up spinning it four times in total. Let $\theta $ denote the probability that the coin comes up heads.
a) What is the likelihood of $\theta $?
b) What is the MLE of $\theta $?


$\triangleright$ Think First!! 
This is a binomial distribution example! $P(X=k)=\binom{n}{k}p^k(1-p)^{n-k}$
Each individual spin is a Bernoulli trial with pmf $f(x|\theta)=\theta^x(1-\theta)^{1-x}$, x=0,1.


$\triangleright$ Solution (a)  
They spin the coin 7 times in total. (George: 3 times, Hilary: 4 times)
Let X be 1 (if the result is a head), 0 (otherwise).

George's case (X): the likelihood of $\theta $ is $L_{1}(\theta | X_{1},...,X_{n})= \prod_{i=1}^{n}f(X_{i}|\theta)=\prod_{i=1}^{n} \theta^{x_{i}}(1-\theta)^{1-x_{i}}$
                       George spins the coin three times, so here n is 3.
                       There are no heads, so $X_{1}=X_{2}=X_{3}=0$.

Hilary's case (Y): the pmf of Y (geometric) is $g(y|\theta)=\theta (1-\theta)^{y-1}$
                        The likelihood of $\theta $ from Y: $L_{2}(\theta|y)=\theta(1-\theta)^{y-1}$
                        Here Y is the number of tosses required, so Y is 4.
                        (Why $\theta^1$? Because there is exactly one head.)

$\rightarrow$ The joint likelihood of $\theta $ from both George and Hilary is
$\therefore$ $ L(\theta|X_{1},...,X_{n}, Y)= \left [ \prod_{i=1}^{n}\theta^{x_{i}}(1-\theta)^{1-x_i} \right ]\theta (1-\theta)^{y-1}$ $=\theta^{1+\sum x_{i}}(1-\theta)^{n-\sum x_{i}+y-1}$

$\triangleright$ Solution (b)  
$\Rightarrow$ The log-likelihood is $\ln L = \left [1+\sum _{i=1}^{n} x_{i} \right] \ln(\theta)+ \left [ n - \sum _{i=1}^{n} x_{i}+y -1 \right ] \ln (1-\theta)$
$\Rightarrow$ Taking the derivative with respect to $\theta $: $\frac{d \ln L}{d \theta}=\frac{1+\sum x_{i}}{\theta}+\frac{n-\sum x_{i} +y -1}{\theta - 1}$, set it to 0
$\Rightarrow \frac{1+\sum x_{i}}{\theta}=\frac{n-\sum x_{i} +y -1}{1-\theta} \rightarrow (n+y)\theta =1+\sum x_{i} \rightarrow \hat{\theta} = \frac{1+\sum_{i=1}^{n}x_{i}}{n+y}$

     $\star \frac{d}{dx}(\log (1-x))=\frac{1}{x-1}$ 

Now n=3, $X_{1}=X_{2}=X_{3}=0, Y=4$, 
$\therefore \theta_{MLE}= \frac{1+0+0+0}{3+4}=\frac {1}{7}$
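The calculus answer can be double-checked without any derivatives: with the observed data the likelihood reduces to $L(\theta)=\theta(1-\theta)^6$, and a simple grid search (my own sketch, not from the original post) should peak at the same place:

```python
# Numerical check of the MLE: with n = 3, x_i = 0, y = 4 the likelihood is
# L(theta) = theta^(1 + sum x_i) * (1 - theta)^(n - sum x_i + y - 1)
#          = theta * (1 - theta)^6.
def likelihood(theta):
    return theta * (1 - theta) ** 6

# Grid search over (0, 1); a step of 1e-5 is fine enough to pin down the maximizer.
grid = [i / 100000 for i in range(1, 100000)]
theta_hat = max(grid, key=likelihood)
print(theta_hat)  # ~0.14286 = 1/7
```

The grid maximizer agrees with the closed-form answer $\hat{\theta}=\frac{1}{7}$.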