Today’s class centered on likelihoods. First we looked at the Cauchy distribution. I asserted without proof that the average of N (standard) Cauchy-distributed random variables has the same distribution as a single such variable, so that taking averages doesn’t improve our estimates. The likelihood does, however, get more and more peaked as we get more and more data. We looked at the likelihood for 1, 2, 3 and more Cauchy observations. We noted that for 2 we often get a “two-peaked” likelihood, and with 3 we sometimes see odd bumps on the likelihood. These quiet down as more and more data are obtained.
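The first claim is easy to check empirically. Here is a quick simulation sketch (my own illustration, not from the lecture; assumes NumPy): since a standard Cauchy has quartiles at ±1, its interquartile range is exactly 2, and if averaging helped, the IQR of the sample mean would shrink as N grows. It doesn’t.

```python
import numpy as np

rng = np.random.default_rng(0)

def iqr_of_means(n_per_mean, n_means=100_000):
    """Interquartile range of averages of n_per_mean standard Cauchy draws."""
    samples = rng.standard_cauchy((n_means, n_per_mean))
    means = samples.mean(axis=1)
    q75, q25 = np.percentile(means, [75, 25])
    return q75 - q25

# The IQR stays near 2 (the IQR of a single standard Cauchy)
# whether we average 1, 10, or 100 draws.
for n in (1, 10, 100):
    print(n, iqr_of_means(n))
```

Note that the IQR is used rather than the sample variance: a Cauchy has no finite variance, so spread has to be measured by quantiles.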

I then stated the Likelihood Principle and discussed the fact that it is a consequence of the Sufficiency Principle and the Conditionality Principle, both of which seem unremarkable. Yet the Likelihood Principle is quite controversial, and many frequentist procedures violate it. Bayesian procedures never violate it: in Bayesian inference, the information about the parameters contained in the data enters only through the likelihood, which the Bayesian mantra uses automatically.
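A standard textbook illustration of the Principle (my example, not from the lecture): suppose we see 9 successes and 3 failures. Under a fixed-n binomial design (n = 12 tosses) and under a stop-at-the-3rd-failure negative binomial design, the likelihoods for θ differ only by a constant factor, so the Likelihood Principle says the two experiments carry identical evidence about θ — even though frequentist p-values for the two designs generally differ.

```python
from math import comb

def binom_lik(theta, n=12, x=9):
    # Fixed number of trials n; observe x successes.
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

def negbinom_lik(theta, r=3, x=9):
    # Sample until the r-th failure; observe x successes along the way.
    return comb(x + r - 1, x) * theta**x * (1 - theta)**r

# The ratio is constant in theta: comb(12, 9) / comb(11, 9) = 220 / 55 = 4.
for t in (0.2, 0.5, 0.8):
    print(t, binom_lik(t) / negbinom_lik(t))
```

Because the θ-dependent parts are identical, any inference that respects the Likelihood Principle must treat the two data sets the same.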


This entry was posted on September 27, 2012 at 12:48 pm and is filed under STAT 330.

September 27, 2012 at 2:04 pm

Hi professor, two questions:

– First, I think you said you have a proof that the sample average of Cauchy variables is itself Cauchy-distributed (so the average doesn’t improve the estimate). If you do have it, can you upload it?

– And second, in chart 60 the likelihood function is written as L(x0 | {x1, …, xn}). Is that correct, or should it be L({x1, …, xn} | x0)?

Regards

September 27, 2012 at 8:31 pm

I really should have used a ; instead of a | in writing the likelihood, e.g., L(x0; {x1, x2, …, xn}). Usually the likelihood has the data on the right and the parameters on the left.

I’ll make a pdf of the proof and post it.
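As a concrete sketch of that notation (my illustration, not from the slides; assumes NumPy and a standard-scale Cauchy with unknown location x0), the likelihood L(x0; {x1, …, xn}) is just the product of Cauchy densities evaluated at the data, viewed as a function of x0. Two well-separated observations reproduce the “two-peaked” likelihood from the lecture:

```python
import numpy as np

def cauchy_log_lik(x0, data):
    """Log of L(x0; {x1, ..., xn}) = prod_i 1 / (pi * (1 + (xi - x0)^2))."""
    data = np.asarray(data)
    return -np.sum(np.log(np.pi * (1.0 + (data - x0) ** 2)))

# Two observations at -3 and +3: the likelihood in x0 has a peak near
# each data point and a dip in between (modes at +/- 2*sqrt(2) here).
data = [-3.0, 3.0]
grid = np.linspace(-6, 6, 1201)
ll = np.array([cauchy_log_lik(x0, data) for x0 in grid])
print(grid[np.argmax(ll)])
```

With the semicolon notation, x0 (the parameter) sits on the left and the data on the right, matching the convention above.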