Comparing Standard Deviation and Average Deviation

Two of the most popular ways to measure variability or volatility in a set of data are standard deviation and average deviation, also known as mean absolute deviation. Though the two measurements are similar, they are calculated differently and offer slightly different views of data.

Determining volatility—that is, deviation from the center—is important in finance, so professionals in accounting, investing, and economics should be familiar with both concepts.

Key Takeaways

  • Standard deviation is the most common measure of variability and is frequently used to determine the volatility of financial instruments and investment returns.
  • Standard deviation is considered the most appropriate measure of variability when using a population sample, when the mean is the best measure of center, and when the distribution of data is normal.
  • Some argue that average deviation, or mean absolute deviation, is a better gauge of variability when there are distant outliers or the data is not well distributed.

Understanding Standard Deviation

Standard deviation is the most common measure of variability and is frequently used to determine the volatility of markets, financial instruments, and investment returns. To calculate the standard deviation:

  1. Find the mean, or average, of the data points by adding them and dividing the total by the number of data points.
  2. Subtract the mean from each data point and square the difference of each result.
  3. Find the mean of those squared differences and then the square root of the mean.

Squaring the differences between each point and the mean avoids the issue of negative differences for values below the mean, but it means the variance is no longer in the same unit of measure as the original data. Taking the square root means the standard deviation returns to the original unit of measure and is easier to interpret and use in further calculations.
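
The three steps above can be sketched in a few lines of Python; the data values here are hypothetical, chosen so the arithmetic works out cleanly:

```python
# Population standard deviation, computed step by step (hypothetical data).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)                     # step 1: mean of the data points
squared_diffs = [(x - mean) ** 2 for x in data]  # step 2: squared differences from the mean
variance = sum(squared_diffs) / len(data)        # step 3a: mean of the squared differences
std_dev = variance ** 0.5                        # step 3b: square root restores the original unit

print(mean, variance, std_dev)  # 5.0 4.0 2.0
```

Note that the variance (4.0) is in squared units, while the standard deviation (2.0) is back in the same unit as the data, which is why the square root in the final step matters.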

Average Deviation

The average deviation, or mean absolute deviation, is calculated similarly to standard deviation, but it uses absolute values instead of squares to circumvent the issue of negative differences between the data points and the mean.

To calculate the average deviation:

  1. Calculate the mean of all data points.
  2. Calculate the difference between the mean and each data point.
  3. Calculate the average of the absolute values of those differences.
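
The steps above can be sketched the same way; using the same hypothetical data as the standard deviation example makes the two measures easy to compare:

```python
# Mean absolute deviation, computed step by step (hypothetical data).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)              # step 1: mean of the data points
abs_diffs = [abs(x - mean) for x in data] # step 2: absolute differences from the mean
avg_dev = sum(abs_diffs) / len(data)      # step 3: average of the absolute differences

print(avg_dev)  # 1.5
```

For this data set the average deviation is 1.5, slightly lower than the standard deviation of 2.0, because squaring gives the points farthest from the mean extra weight.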

Standard Deviation Versus Average Deviation

Standard deviation is often used to measure the volatility of returns from investment funds or strategies. Higher volatility is generally associated with a higher risk of losses, so investors expect higher returns from funds that generate higher volatility. For example, a stock index fund should have a relatively low standard deviation compared with a growth fund.

The average deviation, or mean absolute deviation, is considered the closest alternative to standard deviation. It is also used to gauge volatility in markets and financial instruments, but it is used less frequently than standard deviation.

According to mathematicians, when a data set follows a normal distribution, that is, when there aren't many outliers, standard deviation is generally the preferable gauge of variability. But when there are large outliers, standard deviation registers higher levels of dispersion (deviation from the center) than mean absolute deviation, because squaring magnifies the contribution of points far from the mean.
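
The effect of an outlier on the two measures can be illustrated with a small sketch; the data sets below are hypothetical, and the mean absolute deviation helper is written by hand since the standard library does not provide one:

```python
import statistics

def mean_abs_dev(data):
    """Mean absolute deviation: average distance of each point from the mean."""
    m = statistics.fmean(data)
    return sum(abs(x - m) for x in data) / len(data)

calm = [10, 11, 12, 13, 14]        # no outliers
spiked = [10, 11, 12, 13, 50]      # one large outlier

# Without outliers the two measures are close in size.
print(statistics.pstdev(calm), mean_abs_dev(calm))      # ~1.41 vs 1.2

# With an outlier, squaring makes the standard deviation
# grow much faster than the mean absolute deviation.
print(statistics.pstdev(spiked), mean_abs_dev(spiked))  # ~15.43 vs ~12.32
```

`statistics.pstdev` computes the population standard deviation, matching the step-by-step calculation earlier in the article.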

Related Posts