Standard Deviation Calculator
What is Standard Deviation?
The Standard Deviation (SD) is a measure of how spread out numbers are. It tells you if your data is clustered closely around the average, or scattered all over the place.
- Low SD: The numbers are close to the Mean (e.g., test scores of 85, 86, 84). The data is consistent.
- High SD: The numbers are spread out (e.g., test scores of 40, 95, 60). The data is volatile.
In finance, Standard Deviation is used to measure Risk. A stock with a high SD moves wildly up and down. A stock with a low SD is stable.
The Two Formulas: Sample vs. Population
This is where most students make mistakes on exams. There are two slightly different ways to calculate SD, depending on where your data comes from.
1. Population Standard Deviation ($\sigma$)
Use this when you have data for the entire group (e.g., you measured the height of every single student in the class).
You divide by N (the total number of items).
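Written out, with $\mu$ as the Population Mean and $x_i$ as each data point:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2}$$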
2. Sample Standard Deviation ($s$)
Use this when you only have data for a small portion of the group (e.g., you measured 50 people to estimate the height of the entire country).
You divide by N - 1. This is called "Bessel's Correction." It makes the result slightly larger to account for the uncertainty of estimating a whole population from a small sample.
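With $\bar{x}$ as the Sample Mean, the only change from the population formula is the denominator:

$$s = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}(x_i - \bar{x})^2}$$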
In most statistics and science problems, you are working with a Sample. If you aren't sure which to use, assume Sample ($s$).
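You can see the difference between the two formulas with Python's built-in statistics module (the test scores here are hypothetical):

```python
import statistics

scores = [85, 86, 84, 90, 88]  # hypothetical test scores

# Population SD: divides by N (use when you have the entire group)
sigma = statistics.pstdev(scores)

# Sample SD: divides by N - 1 (Bessel's Correction)
s = statistics.stdev(scores)

print(f"Population SD (sigma): {sigma:.4f}")
print(f"Sample SD (s):         {s:.4f}")
```

Because the sample formula divides by a smaller number (N - 1 instead of N), $s$ always comes out slightly larger than $\sigma$ on the same data.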
Understanding the Bell Curve (Normal Distribution)
If data follows a "Normal Distribution" (the classic Bell Curve shape), Standard Deviation unlocks a powerful rule called the 68-95-99.7 Rule.
- 68% of all data points fall within 1 Standard Deviation of the mean.
- 95% of all data points fall within 2 Standard Deviations of the mean.
- 99.7% of all data points fall within 3 Standard Deviations of the mean.
Example: If the average IQ is 100 with an SD of 15:
- 68% of people have an IQ between 85 and 115.
- 95% of people have an IQ between 70 and 130.
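The same ranges work for any mean and SD pair. A quick sketch using the IQ numbers above:

```python
mean, sd = 100, 15  # IQ example: mean of 100, SD of 15

# The 68-95-99.7 Rule: k Standard Deviations on either side of the mean
for k, pct in [(1, "68"), (2, "95"), (3, "99.7")]:
    low, high = mean - k * sd, mean + k * sd
    print(f"~{pct}% of people fall between {low} and {high}")
```

Running this prints the 85-115, 70-130, and 55-145 ranges; the rule only holds when the data is approximately normally distributed.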
Variance vs. Standard Deviation
Our calculator also outputs Variance. What is that?
Variance is simply Standard Deviation Squared ($s^2$).
While SD is easier to understand (because it is in the same units as the data, like "meters" or "dollars"), Variance is mathematically easier to use in complex algebra and calculus proofs. Think of SD as the "Real World" number and Variance as the "Math World" number.
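You can confirm the relationship directly in Python; the measurements below are hypothetical:

```python
import statistics

data = [4.2, 5.1, 3.8, 6.0, 4.9]  # hypothetical measurements in meters

s = statistics.stdev(data)       # sample SD, in meters
var = statistics.variance(data)  # sample variance, in meters squared

# Variance is exactly the Standard Deviation squared
print(f"SD = {s:.4f} m, Variance = {var:.4f} m^2, SD^2 = {s**2:.4f}")
```

Note the units: squaring the SD also squares its unit, which is why Variance is harder to interpret directly but easier to manipulate algebraically.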