Complete Sufficient Statistics
<MVUE - Minimum Variance Unbiased Estimator>
This post introduces another way to find the minimum variance unbiased estimator (MVUE), which is the best unbiased estimator of a parameter.
As I mentioned in the previous post, MSE (mean squared error) is a good indicator of how far an estimator is from the parameter. Since the MSE can be decomposed as Var(T(X)) + bias^2, comparing unbiased estimators comes down to comparing their variances: every unbiased estimator has the parameter as its expected value, so the bias term vanishes and the variance is the only quantity left to minimize.
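To recall the decomposition itself (a standard identity, written here for reference):

$$\mathrm{MSE}(T)=E_\theta\big[(T(X)-\theta)^2\big]=\mathrm{Var}_\theta\big(T(X)\big)+\big(\mathrm{bias}(T)\big)^2,\qquad \mathrm{bias}(T)=E_\theta\big[T(X)\big]-\theta.$$

For an unbiased estimator the bias term is zero, so the MSE is exactly the variance, and minimizing MSE among unbiased estimators means minimizing variance.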
In a previous post I introduced the Cramér-Rao lower bound as one method of finding the Minimum Variance Unbiased Estimator.
In this post I will introduce another method, which uses the complete sufficient statistic: a statistic that has both 'completeness' and 'sufficiency.'
<Sufficient Statistics>
When the conditional distribution of the random sample X1, X2, ..., Xn given S(X) does not depend on the parameter θ, S(X) is called a 'sufficient statistic' (a 'jointly sufficient statistic' when S(X) is a vector). This means that all the information about θ contained in the sample is captured by S(X), which makes data reduction possible: we can keep all the information about the parameter while keeping only S(X) instead of the whole sample.
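As a small illustration (using the Bernoulli model, which is only an example and not part of this post's derivations), let X1, ..., Xn be iid Bernoulli(p) and S(X) = X1 + ... + Xn. For any x1, ..., xn with x1 + ... + xn = s,

$$P\big(X_1=x_1,\dots,X_n=x_n \mid S(X)=s\big)=\frac{p^{s}(1-p)^{\,n-s}}{\binom{n}{s}p^{s}(1-p)^{\,n-s}}=\frac{1}{\binom{n}{s}},$$

which does not depend on p, so S(X) is sufficient for p.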
Sufficient statistics are not unique: any statistic that contains all the information about the parameter is a sufficient statistic.
The Factorization Theorem gives a necessary and sufficient condition for a statistic to be a sufficient statistic for the parameter.
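In symbols, the theorem says that S(X) is a sufficient statistic for θ if and only if the joint density (or pmf) factors as

$$f(x_1,\dots,x_n;\theta)=g\big(S(x);\theta\big)\,h(x_1,\dots,x_n),$$

where g depends on the data only through S(x) and h does not involve θ.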
Although every statistic that carries all the information about the parameter is a sufficient statistic, there is a 'best' sufficient statistic: one that keeps only the core information needed to estimate the parameter and discards everything unnecessary.
This best (minimal) sufficient statistic can be expressed as a function of every other sufficient statistic. I have uploaded a separate post on sufficient statistics with plenty of examples; please refer to it.
[Statistics] - Sufficient Statistics
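As a quick illustration of this 'best' (minimal) sufficient statistic, take the N(μ, 1) model (an example chosen here, not taken from the linked post): the full sample (X1, ..., Xn), the pair (ΣXi, ΣXi²), and the sample mean are all sufficient for μ, but the sample mean

$$\bar{X}=\frac{1}{n}\sum_{i=1}^{n}X_i$$

is the minimal one: it can be written as a function of each of the other sufficient statistics, while none of them can be recovered from X̄ alone.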
By taking the conditional expectation of an unbiased estimator given a sufficient statistic, we can obtain an estimator that is better than (or at least as good as) the original unbiased estimator. This is the Rao-Blackwell Theorem: the conditional expectation of an unbiased estimator given a sufficient statistic is still unbiased and has variance no larger than that of the original estimator.
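In symbols, if W is an unbiased estimator of g(θ) and S is a sufficient statistic, define φ(S) = E(W | S). Sufficiency guarantees that this conditional expectation does not involve θ, so φ(S) is a genuine estimator, and

$$E_\theta\big[\varphi(S)\big]=E_\theta\big[E(W\mid S)\big]=E_\theta[W]=g(\theta),\qquad \mathrm{Var}_\theta\big(\varphi(S)\big)\le \mathrm{Var}_\theta(W),$$

where the variance inequality follows from the decomposition Var(W) = Var(E(W|S)) + E(Var(W|S)).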
However, sufficiency alone is not enough to find the Minimum Variance Unbiased Estimator. We also need 'uniqueness' and 'unbiasedness' to make a statistic the MVUE. Completeness is the property that adds 'uniqueness' to a sufficient statistic.
<Complete Sufficient Statistics: Uniqueness>
A complete statistic is defined by a specific condition on expectations. When a sufficient statistic satisfies this completeness condition, it becomes a 'complete sufficient statistic,' which has the property of uniqueness.
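For reference, the standard form of that condition is: a statistic T is complete for the family of its distributions if

$$E_\theta\big[g(T)\big]=0\ \text{ for every }\theta \quad\Longrightarrow\quad P_\theta\big(g(T)=0\big)=1\ \text{ for every }\theta .$$

Uniqueness follows immediately: if φ1(T) and φ2(T) are both unbiased for the same quantity, then E_θ[φ1(T) − φ2(T)] = 0 for every θ, and completeness forces φ1(T) = φ2(T) almost surely.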
Here is an example showing that the largest observation X(n) is a complete sufficient statistic for the parameter θ of the Uniform(0, θ) distribution.
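A sketch of how the argument goes: for X1, ..., Xn iid Uniform(0, θ), the joint density factors as θ^{-n} 1{x(n) ≤ θ} · 1{x(1) ≥ 0}, so T = X(n) is sufficient, and T has density

$$f_T(t;\theta)=\frac{n\,t^{\,n-1}}{\theta^{\,n}},\qquad 0<t<\theta .$$

If E_θ[g(T)] = ∫₀^θ g(t) n t^{n−1} θ^{−n} dt = 0 for every θ > 0, differentiating with respect to θ forces g(t) = 0 for almost every t, so T is also complete. Since E_θ[T] = nθ/(n+1), the estimator (n+1)T/n is then the MVUE of θ.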
<Lehmann-Scheffé Theorem>
This is the final and core theorem we need in order to find the MVUE using a complete sufficient statistic.
It shows that the three properties 'unbiasedness,' 'completeness,' and 'sufficiency' are enough to make an estimator the MVUE.
Formally, if T is a complete sufficient statistic and φ(T) is an unbiased estimator of g(θ), then φ(T) is the unique minimum variance unbiased estimator of g(θ).
Equivalently, for any unbiased estimator W of g(θ), E(W|T) is the unique MVUE of g(θ).
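Why this works can be summarized in two steps. Let T be a complete sufficient statistic and W any unbiased estimator of g(θ), and set φ(T) = E(W | T). By the Rao-Blackwell Theorem,

$$E_\theta\big[\varphi(T)\big]=g(\theta),\qquad \mathrm{Var}_\theta\big(\varphi(T)\big)\le \mathrm{Var}_\theta(W),$$

and completeness makes φ(T) the only unbiased estimator that is a function of T (any other would agree with it almost surely). Since this holds for every unbiased W, φ(T) is the unique MVUE of g(θ).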
<Relationship between the Exponential Family and Complete Sufficient Statistics>
It can be complicated to verify completeness directly, so the following property is very useful when checking whether a statistic is a complete sufficient statistic: if the distribution belongs to the exponential family, its natural sufficient statistic is guaranteed to be a complete sufficient statistic.
[Statistics] - Sufficient Statistics
[Statistics] - Exponential family
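As a sketch, for a one-parameter exponential family (assuming the usual regularity condition that the natural parameter space contains an open interval), the density has the form

$$f(x;\theta)=h(x)\,c(\theta)\,\exp\big(w(\theta)\,t(x)\big),$$

and the statistic

$$T(X)=\sum_{i=1}^{n}t(X_i)$$

is a complete sufficient statistic for θ. The analogous result holds for k-parameter exponential families, with T built from (Σ t1(Xi), ..., Σ tk(Xi)).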
There are some examples showing how to find the MVUE by using a complete sufficient statistic and unbiasedness. The important property used here is that a one-to-one function of a complete sufficient statistic is also a complete sufficient statistic.
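One illustration of this recipe, using the Poisson model as an example: for X1, ..., Xn iid Poisson(λ), the pmf belongs to the exponential family, so T = ΣXi is a complete sufficient statistic. The sample mean X̄ = T/n is a one-to-one function of T, hence also complete sufficient, and

$$E_\lambda(\bar{X})=\lambda,$$

so by the Lehmann-Scheffé Theorem X̄ is the unique MVUE of λ.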
<Basu's Theorem>
Basu's Theorem simply says that an ancillary statistic and a complete sufficient statistic are always independent.
An ancillary statistic here means a statistic whose distribution does not depend on the parameter θ.
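A classic application, assuming X1, ..., Xn iid N(μ, σ²) with σ² known: X̄ is a complete sufficient statistic for μ, while the sample variance

$$S^2=\frac{1}{n-1}\sum_{i=1}^{n}\big(X_i-\bar{X}\big)^2$$

has a distribution that does not depend on μ, so it is ancillary. Basu's Theorem then says that X̄ and S² are independent.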