Standard Deviation vs. Variance: What's the Difference?

Standard deviation and variance are two basic mathematical concepts that hold an important place across the financial sector, from accounting to economics to investing. Both measure how much the figures in a data set vary around their mean, and both help determine volatility and the distribution of returns. But there are inherent differences between the two: standard deviation is the square root of the variance, while the variance is the average of the squared differences between each data point and the mean.
Standard deviation and variance are two key measures commonly used in the financial sector.
Standard deviation is the spread of a group of numbers from the mean.
The variance measures the average degree to which each point differs from the mean.
While standard deviation is the square root of the variance, variance is the average of the squared differences between each data point and the mean.
The two concepts are useful and significant for traders, who use them to measure market volatility.

Standard Deviation

Standard deviation is a statistical measurement of how far the numbers in a group lie from their mean. Put simply, standard deviation measures how spread out the numbers in a data set are.
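To see what that means in practice, here is a minimal sketch using Python's built-in statistics module; the return figures are made up purely for illustration.

```python
import statistics

# Hypothetical daily returns, in percent (made-up figures for illustration)
returns = [2.0, -1.5, 0.5, 3.0, -2.0]

# Population standard deviation: spread of these numbers around their mean
print(statistics.pstdev(returns))  # ≈ 1.934

# Sample standard deviation (divides by n - 1 instead of n)
print(statistics.stdev(returns))   # ≈ 2.162
```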

Variance

A variance is the average of the squared differences from the mean.

To figure out the variance, first calculate the difference between each point in the data set and the mean. Then square each of those differences and average the results.
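As a concrete illustration, the short Python sketch below follows those steps literally; the data set is invented for the example.

```python
# Invented data set for illustration
data = [4, 8, 6, 5, 3, 7]

# Step 1: find the mean
mean = sum(data) / len(data)               # 5.5

# Step 2: take each point's difference from the mean and square it
squared_diffs = [(x - mean) ** 2 for x in data]

# Step 3: average the squared differences to get the (population) variance
variance = sum(squared_diffs) / len(data)  # ≈ 2.917

# The standard deviation is simply the square root of that variance
std_dev = variance ** 0.5                  # ≈ 1.708
```

Note that this is the population variance; for a sample, the squared differences are usually divided by n - 1 rather than n.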

The mean is the average of a group of numbers, and the variance measures the average degree to which each number differs from that mean. The size of the variance tracks the overall spread of the data: the variance is greater when the numbers cover a wider range and smaller when they cluster within a narrower range.
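A quick comparison makes the point; both groups below are invented, share the same mean of 10, and differ only in how widely the numbers are spread.

```python
import statistics

narrow = [9, 10, 10, 11]   # values clustered close to the mean of 10
wide = [2, 8, 12, 18]      # same mean of 10, but spread much further out

print(statistics.pvariance(narrow))  # 0.5
print(statistics.pvariance(wide))    # 34.0
```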

Key Differences

Other than how they’re calculated, there are a few other key differences between standard deviation and variance. For one thing, standard deviation is a statistical measure that shows how spread out the numbers in a data set are. Variance, on the other hand, assigns a numerical value to how much the numbers in a data set differ from the mean.

Standard deviation is the square root of the variance, and it is expressed in the same units as the underlying data, whereas the variance is expressed in those units squared. Because of the square root, the standard deviation is smaller than the variance whenever the variance is greater than 1, and larger whenever the variance is a decimal below 1, since the square root of a number less than 1 is larger (not smaller) than the original number.
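A short numeric check, using made-up figures, shows both cases.

```python
# Made-up figures illustrating both cases
variance_small = 0.04  # e.g. returns expressed as decimals
variance_large = 9.0   # e.g. a quantity measured in whole units

print(variance_small ** 0.5)  # 0.2 -- larger than 0.04
print(variance_large ** 0.5)  # 3.0 -- smaller than 9.0
```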