Regression analysis is a statistical technique for modeling the relationship between variables. It rests on several standard assumptions, including the assumption that the model's errors are independent. Real data, however, often exhibit correlated error terms, and when this assumption is violated, inferences drawn from the model may be incorrect. One common correlation structure for error terms is lag-1 (first-order) autocorrelation. Several features of the model can be examined to compare fitting a model that accounts for first-order autoregressive errors against one that ignores them. The most influential factor is found to be the autocorrelation coefficient, φ: when |φ| is small, the differences between the two approaches are far less pronounced than when |φ| is large. There is also a trade-off between the rates of Type I and Type II errors depending on whether φ is positive or negative. Sample size and the value of the slope parameter, β1, play smaller roles in determining the differences between the two models. Overall, although autocorrelation does not always change the conclusions drawn from the model, accounting for it is necessary for sound statistical practice.
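To make the comparison concrete, the following is a minimal Python sketch, assuming lag-1 autoregressive errors of the form ε_t = φ·ε_{t-1} + u_t with u_t ~ N(0, σ²) and |φ| < 1. The sample size, parameter values, and the Cochrane-Orcutt-style transformation used to account for the AR(1) structure are illustrative assumptions, not the simulation design of the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (assumptions, not the paper's actual design)
n, beta0, beta1, phi, sigma = 100, 1.0, 0.5, 0.7, 1.0

# Simulate AR(1) errors: e_t = phi * e_{t-1} + u_t, u_t ~ N(0, sigma^2)
u = rng.normal(0.0, sigma, n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = phi * e[t - 1] + u[t]

# Simple linear regression data with autocorrelated errors
x = rng.uniform(0.0, 10.0, n)
y = beta0 + beta1 * x + e

# OLS fit that ignores the autocorrelation
X = np.column_stack([np.ones(n), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Cochrane-Orcutt-style transformation that accounts for AR(1) errors
# (uses the true phi for illustration; in practice phi is estimated)
Xs = X[1:] - phi * X[:-1]
ys = y[1:] - phi * y[:-1]
b_gls = np.linalg.lstsq(Xs, ys, rcond=None)[0]

print("OLS estimates (autocorrelation ignored):", b_ols)
print("AR(1)-adjusted estimates:", b_gls)
```

Note that under AR(1) errors the OLS point estimates remain unbiased; the differences the comparison targets show up mainly in the standard errors, and hence in the Type I and Type II error rates of tests on β1.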