Statistics

Maximum Likelihood Estimator / Exponential, Poisson, Binomial, Bernoulli, Normal, Uniform / Invariance Property / Consistency / Central Limit Theorem / Slutsky's Theorem

by jangpiano 2020. 8. 20.

<Two Typical Ways for Estimation>


There are two typical ways to estimate parameters: maximum likelihood estimation and the method of moments.

In this post, I will introduce the maximum likelihood estimator as a method of estimation.




<Maximum Likelihood Estimation> 




Because the log function is monotonically increasing, a parameter value that maximizes the likelihood function of Θ also maximizes log(likelihood function of Θ), i.e. the log-likelihood. So we find the maximum likelihood estimator by setting the derivative of the log-likelihood equal to 0 and solving for the parameter; this critical point is indeed the maximum whenever the log-likelihood is smooth and concave, which is the case for most of the distributions below (the uniform distribution is the exception and is handled separately).
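
As a compact summary of this recipe, for i.i.d. observations X1, ..., Xn with density or mass function f(x; θ):

```latex
L(\theta) = \prod_{i=1}^{n} f(x_i;\theta), \qquad
\ell(\theta) = \log L(\theta) = \sum_{i=1}^{n} \log f(x_i;\theta)

\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} L(\theta) = \arg\max_{\theta} \ell(\theta),
\qquad \text{found (when } \ell \text{ is differentiable) by solving } \frac{d\ell(\theta)}{d\theta} = 0
```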


<MLE of some famous distributions>


<Maximum likelihood estimator of Exponential distribution>
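
A sketch of the standard derivation, assuming X1, ..., Xn are i.i.d. Exp(λ) with density f(x; λ) = λe^(-λx) for x > 0 (rate parameterization):

```latex
\ell(\lambda) = \sum_{i=1}^{n} \log\left(\lambda e^{-\lambda x_i}\right)
             = n \log\lambda - \lambda \sum_{i=1}^{n} x_i

\frac{d\ell(\lambda)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
\quad\Longrightarrow\quad
\hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{X}}
```

Under the mean parameterization f(x; θ) = (1/θ)e^(-x/θ), the same steps give the sample mean as the maximum likelihood estimator of θ.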


<Maximum likelihood estimator of Poisson distribution>
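
A sketch, assuming X1, ..., Xn are i.i.d. Poisson(λ) with mass function p(x; λ) = e^(-λ)λ^x / x!:

```latex
\ell(\lambda) = \sum_{i=1}^{n} \left( -\lambda + x_i \log\lambda - \log x_i! \right)
             = -n\lambda + \left(\sum_{i=1}^{n} x_i\right) \log\lambda - \sum_{i=1}^{n} \log x_i!

\frac{d\ell(\lambda)}{d\lambda} = -n + \frac{\sum_{i=1}^{n} x_i}{\lambda} = 0
\quad\Longrightarrow\quad
\hat{\lambda} = \bar{X}
```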

<Maximum likelihood estimator of Normal distribution>
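
A sketch, assuming X1, ..., Xn are i.i.d. N(μ, σ²):

```latex
\ell(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2

\frac{\partial\ell}{\partial\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \mu) = 0
\quad\Longrightarrow\quad \hat{\mu} = \bar{X}

\frac{\partial\ell}{\partial\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i - \mu)^2 = 0
\quad\Longrightarrow\quad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{X})^2
```

Note that the maximum likelihood estimator of σ² divides by n rather than n − 1, so it is a biased (though consistent) estimator.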

<Maximum likelihood estimator of Uniform distribution>

The random variable of UNIF(0, θ) must lie between 0 and θ, so every observation satisfies 0 ≤ Xi ≤ θ. The likelihood L(θ) = θ^(-n) is a decreasing function of θ, so it is maximized by the smallest admissible θ. However, θ must be at least as large as the largest observation X(n); otherwise the likelihood is 0. So the maximum likelihood estimator of θ is X(n). (The derivative approach used above does not apply here, because the support of the distribution depends on θ.)
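
In equation form (the indicator 1{·} is 1 when its condition holds and 0 otherwise):

```latex
L(\theta) = \prod_{i=1}^{n} \frac{1}{\theta}\,\mathbf{1}\{0 \le x_i \le \theta\}
          = \theta^{-n}\,\mathbf{1}\{x_{(n)} \le \theta\}
\quad\Longrightarrow\quad
\hat{\theta} = X_{(n)} = \max(X_1, \dots, X_n)
```

since θ^(-n) is decreasing in θ over the admissible range θ ≥ x(n).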



<Maximum likelihood estimator of Bernoulli distribution>
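
A sketch, assuming X1, ..., Xn are i.i.d. Bernoulli(p) with mass function p(x; p) = p^x (1-p)^(1-x), x ∈ {0, 1}:

```latex
\ell(p) = \sum_{i=1}^{n} \left[ x_i \log p + (1 - x_i)\log(1-p) \right]

\frac{d\ell(p)}{dp} = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{n - \sum_{i=1}^{n} x_i}{1-p} = 0
\quad\Longrightarrow\quad
\hat{p} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{X}
```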


<Maximum likelihood estimator of Binomial distribution>
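
A sketch, assuming X1, ..., Xn are i.i.d. Binomial(m, p) with the number of trials m known:

```latex
\ell(p) = \sum_{i=1}^{n} \left[ \log\binom{m}{x_i} + x_i \log p + (m - x_i)\log(1-p) \right]

\frac{d\ell(p)}{dp} = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{nm - \sum_{i=1}^{n} x_i}{1-p} = 0
\quad\Longrightarrow\quad
\hat{p} = \frac{\sum_{i=1}^{n} x_i}{nm}
```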


<Properties of MLE>


1. Invariance Property 


The invariance property states that if θ̂ is the maximum likelihood estimator of θ, then for any function g, the maximum likelihood estimator of g(θ) is g(θ̂). In other words, the maximum likelihood estimator of a transformed parameter is obtained simply by applying the same transformation to the maximum likelihood estimator of the original parameter.
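
For example, since the maximum likelihood estimator of p for Bernoulli data is the sample mean p̂, the maximum likelihood estimator of the log-odds log(p / (1 - p)) is log(p̂ / (1 - p̂)). A minimal numerical sketch of this, using a simulated Bernoulli sample and a simple grid search (my own illustration of the property):

```python
# A minimal numerical sketch of the invariance property: for Bernoulli data the MLE of p
# is the sample mean p_hat, so the MLE of the log-odds psi = log(p / (1 - p)) should be
# log(p_hat / (1 - p_hat)). We check this by maximizing the likelihood directly over psi.
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=1000)         # simulated Bernoulli(p = 0.3) sample
p_hat = x.mean()                            # MLE of p

def loglik_psi(psi):
    # log-likelihood written directly in terms of psi, with p = 1 / (1 + exp(-psi))
    p = 1.0 / (1.0 + np.exp(-psi))
    return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

psi_grid = np.linspace(-5.0, 5.0, 20001)
psi_hat = psi_grid[np.argmax([loglik_psi(s) for s in psi_grid])]

print("log(p_hat / (1 - p_hat)) =", np.log(p_hat / (1 - p_hat)))
print("argmax over psi          =", psi_hat)   # agrees up to the grid spacing
```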

2. Consistency


Consistency means that as the sample size n grows, the estimator converges in probability to the true parameter. Under regularity conditions, the maximum likelihood estimator is consistent.
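
A minimal simulation sketch (assuming a Poisson model, where the maximum likelihood estimator of λ is the sample mean):

```python
# A minimal sketch of consistency: for Poisson(lambda) data the MLE is the sample mean,
# and its error shrinks as the sample size n grows.
import numpy as np

rng = np.random.default_rng(1)
true_lambda = 4.0
for n in [10, 100, 1_000, 10_000, 100_000]:
    sample = rng.poisson(true_lambda, size=n)
    lambda_hat = sample.mean()                      # MLE of lambda
    print(f"n = {n:6d}   lambda_hat = {lambda_hat:.4f}   "
          f"|error| = {abs(lambda_hat - true_lambda):.4f}")
```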


<Maximum likelihood estimator - Normal distribution> 


<PROOF - Maximum likelihood estimator - Normal distribution> 




<Example of the Central limit theorem of MLE>
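
A minimal simulation sketch of the asymptotic normality of the maximum likelihood estimator, using the exponential rate as the example (my own choice of example): the Fisher information for Exp(λ) is I(λ) = 1/λ², so sqrt(n)(λ̂ - λ) is approximately N(0, λ²).

```python
# A minimal sketch of the central limit theorem for the MLE: for Exp(lambda) data,
# lambda_hat = 1 / sample mean, and sqrt(n) * (lambda_hat - lambda) is approximately
# N(0, lambda^2), since the Fisher information is I(lambda) = 1 / lambda^2.
import numpy as np

rng = np.random.default_rng(2)
true_lambda, n, n_rep = 2.0, 500, 20_000

samples = rng.exponential(scale=1.0 / true_lambda, size=(n_rep, n))
lambda_hat = 1.0 / samples.mean(axis=1)          # MLE of lambda in each replication
z = np.sqrt(n) * (lambda_hat - true_lambda)      # centered and scaled estimates

print("mean of z      :", round(z.mean(), 3))    # approximately 0
print("std of z       :", round(z.std(), 3))     # approximately true_lambda = 2
print("theoretical std:", true_lambda)
```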





