Yield variance is a measure of the dispersion of returns from an investment. It is calculated as the variance of the return on investment (ROI) across periods (its square root is the standard deviation), and it is a useful tool for assessing the riskiness of an investment.
To calculate yield variance, first compute the ROI for each period. The mean ROI is then subtracted from each period's ROI, and the resulting deviations are squared. The sum of these squared deviations divided by the number of periods gives the (population) variance; dividing by the number of periods minus one instead gives the sample variance, which is the usual choice when the observed returns are only a sample of the investment's history.
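The steps above can be sketched in a few lines of Python. The return figures here are purely illustrative, not real data:

```python
# Compute the variance of a series of periodic ROI values
# (illustrative numbers only).
returns = [0.05, 0.02, -0.01, 0.04, 0.03]

mean_roi = sum(returns) / len(returns)

# Average of squared deviations from the mean (population variance;
# divide by len(returns) - 1 instead for the sample variance).
variance = sum((r - mean_roi) ** 2 for r in returns) / len(returns)

# The standard deviation is the square root of the variance.
std_dev = variance ** 0.5

print(variance)  # 0.000424
```

In practice the same calculation is available as `statistics.pvariance` (population) and `statistics.variance` (sample) in the Python standard library.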
Yield variance is a useful measure of risk because it captures how widely returns are dispersed around their average: large swings in either direction increase the variance. It is therefore a good indicator of the volatility of an investment.
Because yield variance measures risk, investments with high yield variances are generally considered riskier than those with low yield variances.
By quantifying the riskiness of an investment, yield variance helps an investor judge whether an expected return is worth the volatility required to earn it. It can also be used to assess the risk/return trade-off of an investment portfolio.
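As a minimal sketch of such a comparison, the snippet below contrasts two hypothetical return series (the numbers are invented for illustration): both have the same mean return, but very different variances, so variance is what distinguishes their risk profiles.

```python
def variance(returns):
    """Population variance of a list of periodic returns."""
    mean = sum(returns) / len(returns)
    return sum((r - mean) ** 2 for r in returns) / len(returns)

stable = [0.03, 0.04, 0.03, 0.04, 0.03]      # steady returns
volatile = [0.10, -0.05, 0.12, -0.02, 0.02]  # swingy returns

# Both series have a mean return of 3.4%, but the volatile series
# has a much larger variance, i.e. more risk for the same reward.
print(variance(stable) < variance(volatile))  # True
```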