The concept of variance

In probability and statistics, the variance is a measure of the dispersion of a set of data. It is defined as the average of the squared deviations from the distribution's mean. In other words, the variance is an arithmetic mean of squared differences.

What is the variance for?

The concept of variance was formalized by Ronald Fisher around 1918. The idea behind the formula is to measure how far the values of a variable spread out from their mean or total value. In this way, the variance helps us quantify how much observations are expected to deviate, which in turn helps us make predictions about the future.

There is also the sample variance, which estimates the dispersion of a whole population from a sample taken from it. We also speak of covariance when the aim is to measure how two variables disperse together.
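As a quick illustration of these three ideas, here is a minimal Python sketch using the standard library's statistics module; the data sets are hypothetical and chosen only for the example.

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample

# Population variance: mean of the squared deviations from the mean.
print(statistics.pvariance(data))  # 4.0

# Sample variance: divides by n - 1 (Bessel's correction) to estimate
# the population variance from a sample.
print(statistics.variance(data))   # ~4.571

# Covariance (Python 3.10+): how two variables disperse together.
x = [1, 2, 3, 4]
y = [2, 4, 6, 8]
print(statistics.covariance(x, y))  # ~3.33
```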

How is the variance calculated?

The formula for the population variance is shown below:

σ² = Σ (xᵢ − μ)² / N

Calculate the variance

To calculate the variance we need each data point, the mean of the data, and the number of data points. The result can never be less than zero, and because the deviations are squared, the variance is always expressed in squared units of measurement, as seen in the formula above. Therefore, to calculate the variance in statistics we compute the mean of the squared differences from the mean.
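As an illustration of this calculation step by step, here is a minimal Python sketch; the function name and the sample data are assumptions made for the example.

```python
def population_variance(data):
    """Mean of the squared deviations from the mean."""
    n = len(data)                     # number of data points
    mean = sum(data) / n              # mean of the data
    squared_diffs = [(x - mean) ** 2 for x in data]
    return sum(squared_diffs) / n     # average squared deviation

# Hypothetical example: heights in centimetres
heights = [170, 165, 180, 175, 160]
print(population_variance(heights))  # 50.0, expressed in cm squared
```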

Is the variance the same as the standard deviation?

No. The difference between variance and standard deviation is that the standard deviation works in the original units of measurement of the data, because it is the square root of the variance.
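Continuing the hypothetical heights example, this short Python sketch shows that taking the square root of the variance brings the result back to the original units (centimetres).

```python
import math
import statistics

heights = [170, 165, 180, 175, 160]       # hypothetical data in cm

variance = statistics.pvariance(heights)  # 50.0, in cm squared
std_dev = math.sqrt(variance)             # ~7.07, back in cm

print(variance, std_dev)
print(statistics.pstdev(heights))         # same standard deviation, computed directly
```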
