The Taylor Rule is a model used to forecast interest rates. Created by famed economist John Taylor in 1992, it suggests how the central bank should change interest rates to account for inflation and other economic conditions.

According to the Taylor Rule, the Federal Reserve should raise interest rates when inflation is above target or when GDP growth is too high, and it should lower rates when inflation is below target or when GDP growth is too slow. The Taylor Rule aims to stabilize the economy in the short term and stabilize inflation over the long term.

The Taylor Rule looks like this:

i = r* + pi + 0.5(pi - pi*) + 0.5(y - y*)

where i is the nominal federal funds rate, r* is the real federal funds rate, pi is the rate of inflation, pi* is the target inflation rate, y is the logarithm of real output, and y* is the logarithm of potential output.

The Taylor Rule ties the interest rate, the inflation rate, and GDP together into a single prescribed policy rate, which should determine the correct balance for an interest rate forecast. Essentially, the equation says that the difference between a nominal and a real interest rate is inflation. Real interest rates account for inflation; nominal rates do not. If, for example, the nominal interest rate on a three-year deposit is 4% and inflation over the three years is 3%, then the real interest rate is 1%. Taylor recommended the real interest rate should be 1.5 times the inflation rate.
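To make the arithmetic concrete, here is a minimal sketch in Python of how the prescribed rate could be computed. The function name taylor_rule_rate and the default values of 2% for the real federal funds rate and the inflation target are illustrative assumptions, not part of the rule itself.

```python
import math

def taylor_rule_rate(inflation, real_output, potential_output,
                     real_rate=0.02, target_inflation=0.02):
    """Return the nominal federal funds rate prescribed by the Taylor Rule.

    i = r* + pi + 0.5*(pi - pi*) + 0.5*(y - y*)
    where y and y* are natural logs of real and potential output.
    The 2% defaults for r* and pi* are illustrative assumptions.
    """
    output_gap = math.log(real_output) - math.log(potential_output)
    return (real_rate + inflation
            + 0.5 * (inflation - target_inflation)
            + 0.5 * output_gap)

# Example: inflation at 3% and output 1% above potential gives roughly
# 0.02 + 0.03 + 0.005 + 0.005 = 0.06, i.e. a prescribed rate near 6%.
print(taylor_rule_rate(inflation=0.03, real_output=1.01, potential_output=1.00))
```

In this sketch, raising the inflation input above the target or widening the output gap pushes the prescribed rate up, while values below target pull it down, mirroring the raise-or-lower guidance described above.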