S.O.S. Mathematics CyberBoard

Your Resource for mathematics help on the web!

All times are UTC [ DST ]




Post new topic Reply to topic  [ 7 posts ] 
PostPosted: Mon, 16 Apr 2012 22:56:49 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Fri, 1 Jul 2011 01:17:26 UTC
Posts: 460
In statistics, given some sample data, one typically wants estimates of the variance and other statistical quantities. Since the mean is usually unknown, the sample average is used in its place, and a correction is then needed to obtain an unbiased estimate.

For the variance, when averaging the squares of the differences between the sample values and the sample average, a correction factor of n/(n-1) must be applied to get an unbiased estimate.

For the third central moment the factor is n^2/[(n-1)(n-2)].

I guess the general formula for the kth central moment factor would be (n-k)!n^k/n!

Has this been derived and is there a neat way to do it? I used brute force to get the second and third moments.
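As a quick consistency check (a sketch, not from the thread; the helper name `conjectured_factor` is made up), the conjectured factor (n-k)! n^k / n! does reduce to the two factors quoted above for k = 2 and k = 3:

```python
# Check that (n-k)! * n^k / n! reproduces the stated correction factors
# for the 2nd and 3rd central moments. Exact rational arithmetic via Fraction.
from fractions import Fraction
from math import factorial

def conjectured_factor(n, k):
    """The conjectured k-th central moment factor (n-k)! * n^k / n!, exactly."""
    return Fraction(factorial(n - k) * n ** k, factorial(n))

n = 7
# k = 2: (n-2)! n^2 / n! = n^2 / (n(n-1)) = n/(n-1)
assert conjectured_factor(n, 2) == Fraction(n, n - 1)
# k = 3: (n-3)! n^3 / n! = n^3 / (n(n-1)(n-2)) = n^2/((n-1)(n-2))
assert conjectured_factor(n, 3) == Fraction(n ** 2, (n - 1) * (n - 2))
```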


PostPosted: Tue, 17 Apr 2012 14:50:33 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Mon, 23 Jun 2003 17:34:53 UTC
Posts: 2094
Location: San Antonio,Texas USA
Try a search for "U-statistic"; for example, the article on U-statistics gives a fairly decent explanation. "Neat" is in the eye of the beholder, but the short answer to your question is "yes".

_________________
Live long and prosper.


PostPosted: Tue, 17 Apr 2012 22:11:42 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Fri, 1 Jul 2011 01:17:26 UTC
Posts: 460
raortega3 wrote:
The first portion of your question refers to parameter estimation.

When you refer to the variance, I'm not sure if this is what you were implying, but it isn't the 2nd moment.

Variance(x) = E(x^2) - E(x)^2, where E(x^2) is the 2nd moment.

I say this because you first talk about parameter estimation and then move to moments.


The variance is the second CENTRAL moment. The question I was asking was about central moments.

Central moment means moment around the mean.


PostPosted: Tue, 17 Apr 2012 22:19:22 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Fri, 1 Jul 2011 01:17:26 UTC
Posts: 460
royhaas wrote:
Try a search for "U-statistic", for example, the article at U-statistics gives a fairly decent explanation. "Neat" is in the eye of the beholder, but the short answer to your question is "yes".

Although the article appears to give the result for the second central moment, I didn't see anything about higher moments.


PostPosted: Thu, 19 Apr 2012 22:17:35 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Fri, 1 Jul 2011 01:17:26 UTC
Posts: 460
My conjecture is wrong. The fourth central moment estimate includes the square of the second central moment.


PostPosted: Fri, 20 Apr 2012 09:09:07 UTC 
Moderator

Joined: Mon, 29 Dec 2008 17:49:32 UTC
Posts: 6782
Location: On this day Taiwan becomes another Tiananmen under Dictator Ma.
mathematic wrote:
My conjecture is wrong. The fourth central moment estimate includes the square of the second central moment.


Indeed, if m_r=\frac{1}{n}\sum (X_i-\bar{X})^r is the r-th sample central moment and \mu_r=\mathbb{E}(X_i-\mu)^r is the r-th (population) central moment, then
\begin{aligned}
m_1&=0\\
\mathbb{E}m_2&=\frac{n-1}{n}\mu_2\\
\mathbb{E}m_3&=\frac{(n-1)(n-2)}{n^2}\mu_3\\
\mathbb{E}m_4&=\frac{(n-1)[3(2n-3)\mu_2^2+(n^2-3n+3)\mu_4]}{n^3}\\
\mathbb{E}m_5&=\frac{(n-1)(n-2)[10(n-2)\mu_2\mu_3+(n^2-2n+2)\mu_5]}{n^4}
\end{aligned}
(assuming X_i are independent identically distributed such that all relevant moments exist)
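These identities can be checked exactly for small n (a sketch, not from the thread; the three-point distribution below is an arbitrary choice): enumerate every equally likely n-tuple from a finite distribution, average m_r over the tuples with exact rational arithmetic, and compare with the claimed combination of population moments.

```python
# Exact verification of the E[m_2], E[m_3], E[m_4] identities for n = 3
# by brute-force enumeration of all equally likely samples.
from itertools import product
from fractions import Fraction

vals = [Fraction(0), Fraction(1), Fraction(3)]  # arbitrary skewed distribution
N = len(vals)
mu = sum(vals) / N
pop = {r: sum((v - mu) ** r for v in vals) / N for r in (2, 3, 4)}  # mu_r

def E_m(r, n):
    """Exact E[m_r]: average of (1/n) * sum((x_i - xbar)^r) over all N**n tuples."""
    total = Fraction(0)
    for tup in product(vals, repeat=n):
        xbar = sum(tup) / n
        total += sum((x - xbar) ** r for x in tup) / n
    return total / N ** n

n = 3
assert E_m(2, n) == Fraction(n - 1, n) * pop[2]
assert E_m(3, n) == Fraction((n - 1) * (n - 2), n ** 2) * pop[3]
assert E_m(4, n) == Fraction(n - 1, n ** 3) * (
    3 * (2 * n - 3) * pop[2] ** 2 + (n ** 2 - 3 * n + 3) * pop[4]
)
```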

Here is a quick way to derive them (and all higher ones too): Let X_1,X_2,\dots,X_n be iid L^\infty random variables, and set Y_i=n(X_i-\bar{X})=(n-1)X_i-\sum_{j\neq i}X_j. Then Y_1,Y_2,\dots are identically distributed, so taking expectations you get \mathbb{E}m_r=\frac{1}{n^{r+1}}\sum\mathbb{E}Y_i^r=\frac{1}{n^r}\mathbb{E}Y_1^r. But Y_1 is a sum of independent random variables, so its moment generating function is just the product of the individual m.g.f.s:
\begin{aligned}
M_{Y_1}(t)&=M_{X_1}((n-1)t)M_{X_2}(-t)\cdots M_{X_n}(-t)\\
&=M((n-1)t)M(-t)^{n-1}\\
&=\tilde{M}((n-1)t)\tilde{M}(-t)^{n-1}
\end{aligned}
where \tilde{M} is the central moment generating function of X_1: \tilde{M}(t)=\mathbb{E}\exp((X_i-\mu)t)=e^{-\mu t}M(t). So differentiate away (using the generalised Leibniz rule) and set t=0. Finally, use density to pass from L^\infty to L^r.
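For instance (a worked check, not in the original post), the r=2 case: since Y_1=(n-1)(X_1-\mu)-\sum_{j\geq 2}(X_j-\mu) is a sum of independent zero-mean terms, differentiating twice and setting t=0 gives
\begin{aligned}
\mathbb{E}Y_1^2&=(n-1)^2\mu_2+(n-1)\mu_2=n(n-1)\mu_2,\\
\mathbb{E}m_2&=\frac{1}{n^2}\mathbb{E}Y_1^2=\frac{n-1}{n}\mu_2,
\end{aligned}
recovering the second line of the table above.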

Edit: correct signs.

_________________
\begin{aligned}
Spin(1)&=O(1)=\mathbb{Z}/2&\quad&\text{and}\\
Spin(2)&=U(1)=SO(2)&&\text{are obvious}\\
Spin(3)&=Sp(1)=SU(2)&&\text{by }q\mapsto(\mathop{\mathrm{Im}}\mathbb{H}\ni p\mapsto qp\bar{q})\\
Spin(4)&=Sp(1)\times Sp(1)&&\text{by }(q_1,q_2)\mapsto(\mathbb{H}\ni p\mapsto q_1p\bar{q_2})\\
Spin(5)&=Sp(2)&&\text{by }\mathbb{HP}^1\cong S^4_{round}\hookrightarrow\mathbb{R}^5\\
Spin(6)&=SU(4)&&\text{by the irrep }\Lambda_+\mathbb{C}^4
\end{aligned}


PostPosted: Sat, 21 Apr 2012 01:26:10 UTC 
Member of the 'S.O.S. Math' Hall of Fame

Joined: Fri, 1 Jul 2011 01:17:26 UTC
Posts: 460
outermeasure wrote:
mathematic wrote:
My conjecture is wrong. The fourth central moment estimate includes the square of the second central moment.


Indeed, if m_r=\frac{1}{n}\sum (X_i-\bar{X})^r is the r-th sample central moment and \mu_r=\mathbb{E}(X_i-\mu)^r is the r-th (population) central moment, then
\begin{aligned}
m_1&=0\\
\mathbb{E}m_2&=\frac{n-1}{n}\mu_2\\
\mathbb{E}m_3&=\frac{(n-1)(n-2)}{n^2}\mu_3\\
\mathbb{E}m_4&=\frac{(n-1)[3(2n-3)\mu_2^2+(n^2-3n+3)\mu_4]}{n^3}\\
\mathbb{E}m_5&=\frac{(n-1)(n-2)[10(n-2)\mu_2\mu_3+(n^2-2n+2)\mu_5]}{n^4}
\end{aligned}
(assuming X_i are independent identically distributed such that all relevant moments exist)

Here is a quick way to derive them (and all higher ones too): Let X_1,X_2,\dots,X_n be iid L^\infty random variables, and set Y_i=n(X_i-\bar{X})=(n-1)X_i-\sum_{j\neq i}X_j. Then Y_1,Y_2,\dots are identically distributed, so taking expectations you get \mathbb{E}m_r=\frac{1}{n^{r+1}}\sum\mathbb{E}Y_i^r=\frac{1}{n^r}\mathbb{E}Y_1^r. But Y_1 is a sum of independent random variables, so its moment generating function is just the product of the individual m.g.f.s:
\begin{aligned}
M_{Y_1}(t)&=M_{X_1}((n-1)t)M_{X_2}(-t)\cdots M_{X_n}(-t)\\
&=M((n-1)t)M(-t)^{n-1}\\
&=\tilde{M}((n-1)t)\tilde{M}(-t)^{n-1}
\end{aligned}
where \tilde{M} is the central moment generating function of X_1: \tilde{M}(t)=\mathbb{E}\exp((X_i-\mu)t)=e^{-\mu t}M(t). So differentiate away (using the generalised Leibniz rule) and set t=0. Finally, use density to pass from L^\infty to L^r.

Thanks - this is what I was looking for in the first place.


Powered by phpBB © 2001, 2005-2011 phpBB Group.
Copyright © 1999-2013 MathMedics, LLC. All rights reserved.
Math Medics, LLC. - P.O. Box 12395 - El Paso TX 79913 - USA