Time series & variance
Hi, I want to do something that seems simple but am getting confused.
A fund is said to return 5% on average with a standard deviation of 10%, i.e. an approximate 95% confidence interval of -15% to +25% for a single year's return.
I want to know the confidence interval on a 30-year projection for this fund.
From a vague understanding (or misunderstanding) of stats, I think the variance of the average reduces with sample size, so that if the variance in one year is 100, the variance of the 30-year average should be 100/30, or about 3.33. That gives a confidence interval for the average annual growth over 30 years of 5 +/- 2*sqrt(100/30), i.e. roughly 1.35% to 8.65% (see the sketch below).
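For concreteness, here is a minimal sketch of that calculation, assuming the framing above: independent, identically distributed annual returns with mean 5% and standard deviation 10%, and a 95% interval taken as roughly two standard errors either side of the mean. This is only an illustration of the arithmetic I have in mind, not a claim that it is the right model.

```python
import math

mu = 0.05        # assumed mean annual return (5%)
sigma = 0.10     # assumed annual standard deviation (10%)
n_years = 30

# Variance of the average of n independent years is sigma^2 / n,
# so the standard error of the 30-year average return is sigma / sqrt(n).
se = sigma / math.sqrt(n_years)

# Approximate 95% interval: about two standard errors around the mean.
lo, hi = mu - 2 * se, mu + 2 * se
print(f"average annual return: {mu:.1%} +/- {2 * se:.2%} ({lo:.2%} to {hi:.2%})")
# prints roughly: 5.0% +/- 3.65% (1.35% to 8.65%)
```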
Is this correct? If not, how should I approach this question?
Thanks for any help a mathematician can give.