I'm taking a course in statistical inference. We showed in class that:

1. the best estimator for the mean of a normal distribution (and probably most common distributions?) is the sample mean

2. the best estimator for the upper endpoint theta of a uniform distribution on [0, theta] is the sample maximum multiplied by (n+1)/n. it then follows that the best estimator for the mean is max * (n+1)/(2n)

(where by "best" i think i mean minimum variance unbiased)
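the (n+1)/n correction for the uniform piece is easy to sanity-check by simulation. a minimal sketch (the values theta = 10 and n = 20 are arbitrary illustration choices, not anything special):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 10.0, 20, 200_000  # arbitrary illustration values

# draw many samples of size n from U(0, theta)
samples = rng.uniform(0, theta, size=(trials, n))
m = samples.max(axis=1)

theta_hat = (n + 1) / n * m      # MVUE for the upper endpoint theta
mean_hat = theta_hat / 2         # hence an unbiased estimator of the mean

print(theta_hat.mean())            # close to theta = 10
print(mean_hat.mean())             # close to theta/2 = 5
print(mean_hat.var())              # much smaller than ...
print(samples.mean(axis=1).var())  # ... the sample mean's variance
```

both estimators of the mean are unbiased here, but the max-based one has variance of order 1/n^2 versus 1/n for the sample mean, which is why it wins for the uniform.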

This got me pretty curious as to what happens in between. Suppose i take some mixture of (alpha) normal distribution and (1-alpha) uniform distribution. As I slide alpha from 1 to 0, somewhere along the way, the best estimate of the mean must switch over from the sample mean to something involving the max. what exactly is that value of alpha? are there values of alpha for which the best estimator is neither of the two estimators i mentioned?

i suspect a mixture of the uniform and normal distributions might be difficult to work with, and also a bit weird, seeing as one has unbounded support and the other bounded. it is probably better to approach this problem with some other distribution P which shares the same bounded support as the uniform, yet for which the sample mean is still the best estimator of the mean of P. i tried doing something with triangular distributions a few weeks back, but got lost in the math along the way.
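one rough way to see what happens in between, without any theory, is a Monte Carlo sweep over alpha. the sketch below mixes N(theta/2, 0.1^2) with U(0, theta) so both components share the same mean, and compares the empirical MSE of the two candidate estimators (all parameter values are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, trials = 1.0, 50, 50_000
true_mean = theta / 2  # normal is centered at theta/2, so both components share a mean

for alpha in [1.0, 0.5, 0.1, 0.0]:
    # each observation comes from N(theta/2, 0.1^2) w.p. alpha, else U(0, theta)
    from_normal = rng.random((trials, n)) < alpha
    x = np.where(from_normal,
                 rng.normal(theta / 2, 0.1, (trials, n)),
                 rng.uniform(0, theta, (trials, n)))
    mse_mean = ((x.mean(axis=1) - true_mean) ** 2).mean()
    mse_max = ((x.max(axis=1) * (n + 1) / (2 * n) - true_mean) ** 2).mean()
    print(f"alpha={alpha}: sample-mean MSE={mse_mean:.2e}, max-based MSE={mse_max:.2e}")
```

neither estimator is claimed to be the MVUE at intermediate alpha; this only shows where each naive estimator starts losing badly, and roughly where the crossover sits for these particular parameters.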

some questions:

what is your best guess on how the best estimator will behave for this sort of mixture?

suggestions as to what distribution to try in place of the normal?

previous work or experience with such problems? (i didn't find anything on google)

## behavior of best estimator on some constructed mixture distribution


- Xanthir

### Re: behavior of best estimator on some constructed mixture distribution

The best estimator stops being the sample mean as soon as you mix in any of the uniform. It stops being the max-thing as soon as you mix in any of the normal. The mixed distribution has some more complex best estimator.

However, there are indeed limits *in practice* where you can still use the sample mean or the max-thing as the best estimator and get reasonable results. They'll depend on how much error you're willing to tolerate, and probably on the ranges in question.
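One way to see that trade-off: even a small normal admixture makes the max-based estimator biased, since an occasional normal draw can land above theta. A quick sketch (5% contamination; all parameter values are arbitrary illustration choices):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, trials = 1.0, 50, 200_000
alpha = 0.05  # small normal contamination

from_normal = rng.random((trials, n)) < alpha
x = np.where(from_normal,
             rng.normal(theta / 2, 0.3, (trials, n)),  # occasionally exceeds theta
             rng.uniform(0, theta, (trials, n)))

est = x.max(axis=1) * (n + 1) / (2 * n)
print(est.mean() - theta / 2)  # no longer (exactly) zero: the estimator picks up a bias
```

Whether that bias is tolerable is exactly the error-budget question: below some contamination level you may not care, but the estimator is no longer unbiased.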


(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

### Re: behavior of best estimator on some constructed mixture distribution

can you explain why the mixed distribution has some other best estimator? is that based on intuition or some paper?

thanks


- Xanthir

### Re: behavior of best estimator on some constructed mixture distribution

The best estimator depends on the distribution in question. A mixture isn't either of its input distributions, so in general it has a different best estimator.

