What is Efficiency Variance?

Efficiency variance is the difference between the theoretical amount of input required to produce a unit of output and the actual amount of input used to produce that unit. The expected input requirement is based on models or past experience. A gap between expected and actual input can stem from inefficiencies in labor or resource use, or from errors in the assumptions used to set the expectation. In manufacturing, efficiency variance can be used to analyze effectiveness with respect to labor, materials, machine time and other production factors.
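As a rough sketch of the definition above, the variance can be computed as the theoretical input minus the actual input, optionally multiplied by a standard cost per input unit to express it in dollars. The function name and the standard-rate parameter here are illustrative assumptions, not a standard formula from this article.

```python
def efficiency_variance(theoretical_input, actual_input, standard_cost_per_unit=1.0):
    """Difference between expected and actual input, valued at a standard cost.

    A positive result means fewer inputs were used than expected (favorable);
    a negative result means more inputs were used than expected (unfavorable).
    """
    return (theoretical_input - actual_input) * standard_cost_per_unit


# Example (hypothetical figures): a job is expected to take 10 labor hours
# but actually takes 12, at an assumed standard rate of $25 per hour.
print(efficiency_variance(10, 12, 25.0))  # -50.0, an unfavorable variance
```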

BREAKING DOWN Efficiency Variance

An important factor in measuring efficiency variance is developing a set of realistic assumptions about the theoretical amount of input that should be required. If the actual amount of input used exceeds the theoretical requirement, there is a negative efficiency variance; if actual input falls below the theoretical requirement, there is a positive efficiency variance. Because the theoretical baseline is often calculated for optimal conditions, a slightly negative efficiency variance is normally expected.
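The sign convention described above can be sketched as a simple classifier; the function name and labels here are hypothetical, chosen only to illustrate the rule.

```python
def classify_variance(theoretical_input, actual_input):
    """Apply the sign convention: actual above theoretical is negative
    (unfavorable), actual below theoretical is positive (favorable)."""
    diff = theoretical_input - actual_input
    if diff < 0:
        return "negative (unfavorable)"
    if diff > 0:
        return "positive (favorable)"
    return "no variance"


# With an optimal-conditions baseline of 100 machine hours, a slight
# overrun to 104 hours yields the small negative variance normally expected.
print(classify_variance(100, 104))  # negative (unfavorable)
```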