I'm taking a course in statistical inference. We showed in class that:
1. the best estimator for the mean of a normal distribution (and probably of most common distributions?) is the sample mean
2. the best estimator for the upper endpoint theta of a uniform distribution on [0, theta] is the sample maximum multiplied by (n+1)/n. it then follows that the best estimator for the mean is max * (n+1)/(2n)
(where by "best" I mean minimum-variance unbiased, I think)
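To make point 2 concrete, here's a small Monte Carlo check (with made-up parameters theta=10, n=50) comparing the two unbiased estimators of the uniform's mean: the plain sample mean versus the scaled maximum (n+1)/(2n) * max. Both should center on theta/2 = 5, but the max-based one should have much smaller variance (theoretically theta^2/(4n(n+2)) versus theta^2/(12n)):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 10.0, 50, 20_000

# trials independent samples of size n from Uniform(0, theta)
x = rng.uniform(0, theta, size=(trials, n))

sample_mean = x.mean(axis=1)                    # unbiased, var = theta^2 / (12 n)
scaled_max = (n + 1) / (2 * n) * x.max(axis=1)  # MVUE, var = theta^2 / (4 n (n+2))

print("means:    ", sample_mean.mean(), scaled_max.mean())  # both near 5
print("variances:", sample_mean.var(), scaled_max.var())    # max-based is far smaller
```

With n=50 the variance ratio is roughly (n+2)/3 ≈ 17x in favor of the scaled maximum, which is what makes the mixture question below interesting.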
This got me pretty curious about what happens in between. Suppose each observation is drawn from a normal distribution with probability alpha and from the uniform with probability 1 - alpha. As I slide alpha from 1 to 0, somewhere along the way the best estimator of the mean must switch over from the sample mean to something involving the max. What exactly is that value of alpha? And are there values of alpha for which the best estimator is neither of the two estimators I mentioned?
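I don't have the theory, but one can at least watch the crossover empirically. Here's a rough Monte Carlo sketch (all parameter choices are mine: theta=10, n=50, and the normal component is given the same mean theta/2 and variance theta^2/12 as the uniform, so the overall mean is theta/2 for every alpha). It just compares the mean-squared error of the two candidate estimators as alpha slides; note the max-based one is no longer unbiased once the normal component is present:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, trials = 10.0, 50, 20_000
mu, sigma = theta / 2, theta / np.sqrt(12)  # match uniform's mean and variance
true_mean = theta / 2

mse = {}
for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    # each observation is normal with prob alpha, uniform otherwise
    is_norm = rng.random((trials, n)) < alpha
    x = np.where(is_norm,
                 rng.normal(mu, sigma, (trials, n)),
                 rng.uniform(0, theta, (trials, n)))
    mse_mean = ((x.mean(axis=1) - true_mean) ** 2).mean()
    mse_max = (((n + 1) / (2 * n) * x.max(axis=1) - true_mean) ** 2).mean()
    mse[alpha] = (mse_mean, mse_max)
    print(f"alpha={alpha:.2f}  MSE(sample mean)={mse_mean:.4f}  MSE(scaled max)={mse_max:.4f}")
```

At alpha=0 the scaled max should win and at alpha=1 the sample mean should win; where the curves cross (and whether some third estimator beats both near the crossover) is exactly my question.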
I suspect a mixture of the uniform and normal might be difficult to work with, and also a bit weird, seeing as one has infinite support and the other is bounded. It's probably better to approach the problem with some other distribution P that shares the uniform's bounded support, yet whose mean is best estimated by the sample mean. I tried something with triangular distributions a few weeks back, but got lost in the math along the way.
What is your best guess about how the best estimator will behave for this sort of mixture?
Any suggestions for a distribution to try in place of the normal?
Any previous work or experience with such problems? (I didn't find anything on Google.)