<Two Typical Ways of Estimation>
There are two typical ways to estimate parameters: maximum likelihood estimation and the method of moments.
In this post, I will introduce the maximum likelihood estimator (MLE).
<Maximum Likelihood Estimation>
Because the log function is monotonically increasing, a parameter value that maximizes the likelihood function L(θ) also maximizes the log-likelihood l(θ) = log L(θ). So we can work with the log-likelihood, which is usually easier to differentiate: setting the derivative of l(θ) equal to 0 and solving for θ gives the maximum likelihood estimator, provided the critical point is actually a maximum.
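As a quick numeric sketch of this point (the data here are simulated by me purely for illustration), we can check on a Bernoulli sample that the likelihood and the log-likelihood peak at exactly the same parameter value, which is also the analytic solution s/n:

```python
import math
import random

random.seed(1)

# Hypothetical Bernoulli(p=0.3) sample, kept small so the raw
# likelihood does not underflow (illustration only).
data = [1 if random.random() < 0.3 else 0 for _ in range(50)]
s, n = sum(data), len(data)

def likelihood(p):
    # L(p) = p^s * (1-p)^(n-s)
    return p ** s * (1 - p) ** (n - s)

def log_likelihood(p):
    # l(p) = s*log(p) + (n-s)*log(1-p)
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Grid search over p in (0, 1); log is strictly increasing,
# so both functions are maximized at the same grid point.
grid = [k / 1000 for k in range(1, 1000)]
argmax_L = max(grid, key=likelihood)
argmax_logL = max(grid, key=log_likelihood)
print(argmax_L, argmax_logL, s / n)
```

Both argmaxes agree with each other and with s/n, the value where the derivative of the log-likelihood vanishes.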
<MLE of some famous distributions>
<Maximum likelihood estimator of Exponential distribution>
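For Exp(λ) with density λe^(-λx), the log-likelihood is l(λ) = n·log λ − λ·Σxᵢ, and setting l'(λ) = n/λ − Σxᵢ = 0 gives λ̂ = 1/X̄, the reciprocal of the sample mean. A minimal simulation check (data simulated for illustration):

```python
import random

random.seed(2)
true_rate = 2.0
# Hypothetical Exp(lambda=2) sample (illustration only).
data = [random.expovariate(true_rate) for _ in range(100_000)]

# MLE: lambda_hat = n / sum(x) = 1 / sample mean,
# from solving n/lambda - sum(x) = 0.
lam_hat = len(data) / sum(data)
print(lam_hat)  # close to 2.0
```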
<Maximum likelihood estimator of Poisson distribution>
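For Poisson(λ), the log-likelihood is l(λ) = −nλ + (Σxᵢ)·log λ − Σ log(xᵢ!), and l'(λ) = −n + Σxᵢ/λ = 0 gives λ̂ = X̄, the sample mean. A sketch with data simulated via Knuth's method (the standard library has no Poisson sampler):

```python
import math
import random

random.seed(3)

def poisson_sample(lam):
    # Knuth's method: count uniforms until their product drops below e^(-lam).
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Hypothetical Poisson(lambda=4) sample (illustration only).
data = [poisson_sample(4.0) for _ in range(20_000)]

# MLE of lambda is the sample mean.
lam_hat = sum(data) / len(data)
print(lam_hat)  # close to 4.0
```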
<Maximum likelihood estimator of Normal distribution>
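For N(μ, σ²), solving the two likelihood equations gives μ̂ = X̄ and σ̂² = (1/n)·Σ(xᵢ − X̄)². Note that the MLE of σ² divides by n, not n − 1, so it is a (slightly) biased estimator of the variance. A quick simulated check (illustration only):

```python
import random

random.seed(4)
# Hypothetical N(mu=5, sigma=3) sample (illustration only).
data = [random.gauss(5.0, 3.0) for _ in range(100_000)]
n = len(data)

mu_hat = sum(data) / n  # MLE of mu: the sample mean
# MLE of sigma^2: divide by n (not n-1), so it differs from the
# unbiased sample variance by a factor of (n-1)/n.
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n
print(mu_hat, sigma2_hat)  # near 5 and 9
```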
<Maximum likelihood estimator of Uniform distribution>
Every observation from UNIF(0, θ) must lie between 0 and θ, so the likelihood L(θ) = 1/θⁿ is positive only when θ ≥ X(n), the sample maximum. Since 1/θⁿ is decreasing in θ, L(θ) is maximized by the smallest admissible θ, and θ cannot be smaller than X(n). So the maximum likelihood estimator of θ is X(n). (Note that the derivative is never 0 here; the maximum sits on the boundary of the admissible range, so we argue by monotonicity instead.)
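A small simulated check (data simulated for illustration) that the sample maximum recovers θ, always from below since X(n) ≤ θ:

```python
import random

random.seed(5)
theta = 7.0
# Hypothetical UNIF(0, theta=7) sample (illustration only).
data = [random.uniform(0, theta) for _ in range(10_000)]

# L(theta) = theta**(-n) for theta >= max(data), and it is decreasing,
# so the MLE is the sample maximum X(n).
theta_hat = max(data)
print(theta_hat)  # slightly below 7.0, since X(n) can never exceed theta
```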
<Maximum likelihood estimator of Bernoulli distribution>
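For Bernoulli(p), the log-likelihood is l(p) = s·log p + (n − s)·log(1 − p) with s = Σxᵢ, and l'(p) = 0 gives p̂ = s/n = X̄, the observed proportion of successes. A simulated check (illustration only):

```python
import random

random.seed(6)
# Hypothetical Bernoulli(p=0.3) sample (illustration only).
data = [1 if random.random() < 0.3 else 0 for _ in range(50_000)]

# MLE of p is the sample proportion of successes.
p_hat = sum(data) / len(data)
print(p_hat)  # close to 0.3
```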
<Maximum likelihood estimator of Binomial distribution>
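For a sample X₁, …, Xₙ from Binomial(m, p) with m known, each Xᵢ counts successes in m trials, so the likelihood equation gives p̂ = Σxᵢ / (n·m): total successes over total trials. A simulated check, drawing each binomial as a sum of m Bernoulli trials (illustration only):

```python
import random

random.seed(7)
m, p = 10, 0.4
# Hypothetical Binomial(m=10, p=0.4) sample, each draw built
# as a sum of m Bernoulli(p) trials (illustration only).
data = [sum(random.random() < p for _ in range(m)) for _ in range(20_000)]

# With m known, the MLE is total successes / total trials.
p_hat = sum(data) / (len(data) * m)
print(p_hat)  # close to 0.4
```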
<Properties of MLE>
1. Invariance Property
The invariance property of the maximum likelihood estimator says: if θ̂ is the MLE of θ, then for any function g, g(θ̂) is the MLE of g(θ). In other words, to estimate a transformed parameter, we can simply transform the estimator.
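For example, since the MLE of σ² for a normal sample is (1/n)·Σ(xᵢ − X̄)², invariance tells us the MLE of σ is just its square root; no separate maximization is needed. A simulated sketch (illustration only):

```python
import math
import random

random.seed(8)
# Hypothetical N(0, sigma=2) sample (illustration only).
data = [random.gauss(0.0, 2.0) for _ in range(50_000)]
n = len(data)

mu_hat = sum(data) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n  # MLE of sigma^2

# Invariance: the MLE of g(theta) is g(theta_hat),
# so the MLE of sigma = sqrt(sigma^2) is sqrt(sigma2_hat).
sigma_hat = math.sqrt(sigma2_hat)
print(sigma_hat)  # close to 2.0
```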
2. Consistency
Consistency means that as the sample size n grows, the estimator converges in probability to the true parameter value.
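A simulated sketch of consistency (my choice of example: the MLE λ̂ = 1/X̄ for an exponential sample), showing the estimation error at increasing sample sizes:

```python
import random

random.seed(9)
true_rate = 2.0

def exp_mle(n):
    # MLE lambda_hat = 1 / sample mean for an Exp(2) sample of size n.
    data = [random.expovariate(true_rate) for _ in range(n)]
    return len(data) / sum(data)

# Absolute error |lambda_hat - lambda| at several sample sizes;
# it typically shrinks as n grows, reflecting consistency.
errors = {n: abs(exp_mle(n) - true_rate) for n in (100, 10_000, 1_000_000)}
print(errors)
```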
<Maximum likelihood estimator - Normal distribution>
<PROOF - Maximum likelihood estimator - Normal distribution>
<Example of the Central limit theorem of MLE>
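A simulation sketch of the asymptotic normality of the MLE, using my own choice of example rather than necessarily the one originally pictured here: √n(θ̂ − θ) converges to N(0, 1/I(θ)), and for Exp(λ) the Fisher information is I(λ) = 1/λ², so the limiting standard deviation is λ itself:

```python
import math
import random
import statistics

random.seed(10)
true_rate, n, reps = 2.0, 400, 2_000

def exp_mle():
    # MLE lambda_hat = n / sum(x) for one Exp(2) sample of size n.
    data = [random.expovariate(true_rate) for _ in range(n)]
    return n / sum(data)

# Standardized errors sqrt(n)*(lambda_hat - lambda) over many replications.
# Asymptotically these are N(0, 1/I(lambda)) with I(lambda) = 1/lambda^2,
# so their standard deviation should be near lambda = 2.0 and mean near 0.
z = [math.sqrt(n) * (exp_mle() - true_rate) for _ in range(reps)]
print(statistics.mean(z), statistics.stdev(z))
```

A histogram of `z` would look approximately bell-shaped, which is the content of the central limit theorem for the MLE.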