### Abstract:

Regression analysis is a statistical method widely used in many fields such as economics, technology, the social sciences, and finance. A linear regression model describes the relationship between a dependent variable and one or more independent variables. Multicollinearity is defined as the existence of a near-linear dependency among the independent variables. Serious multicollinearity reduces the accuracy of the parameter estimates in a linear regression model. The Ordinary Least Squares (OLS) estimator is an unbiased estimator of the unknown parameters of the model, but its variance becomes very large in the presence of multicollinearity; detecting multicollinearity is therefore the first step toward solving the problem. Many methods have been suggested for detecting multicollinearity. We investigate these methods and propose the extra-sum-of-squares method as a new method for detecting multicollinearity. In our case study we used three independent variables, tar, nicotine, and weight, to predict the dependent variable, carbon monoxide.

In regression analysis, hypothesis tests can be used to check the significance of the fitted model. The analysis of variance provides the regression sum of squares (SSR), the residual sum of squares (SSE), the total sum of squares (SST), and the F value for the hypothesis test. The regression sum of squares accounts for the variation in y that is explained by the variation of the xi. When a new independent variable is added to the model, the regression sum of squares always increases and the residual sum of squares decreases, because the total sum of squares remains unchanged.
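The ANOVA bookkeeping described above can be sketched numerically. The data below are synthetic stand-ins, not the paper's cigarette measurements; the point is only that adding a regressor raises SSR, lowers SSE, leaves SST unchanged, and yields a positive extra sum of squares SSR(x2 | x1) = SSR(x1, x2) - SSR(x1).

```python
import numpy as np

# Synthetic data (hypothetical, not the tar/nicotine/weight data set).
rng = np.random.default_rng(0)
n = 25
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 + 0.8 * x2 + rng.normal(scale=0.5, size=n)

def anova(X, y):
    """Return (SSR, SSE, SST) for an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    yhat = X @ beta
    sst = np.sum((y - y.mean()) ** 2)
    sse = np.sum((y - yhat) ** 2)
    return sst - sse, sse, sst

ssr1, sse1, sst1 = anova(x1.reshape(-1, 1), y)             # model with x1 only
ssr12, sse12, sst12 = anova(np.column_stack([x1, x2]), y)  # model with x1 and x2

# Adding x2: SSR increases, SSE decreases, SST is unchanged.
extra_ss = ssr12 - ssr1  # extra sum of squares SSR(x2 | x1)
```

Here `extra_ss` is the quantity the proposed diagnostic compares against SSR(x2) computed from the model containing x2 alone.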
If no multicollinearity exists among the regressors x1, x2, x3, then SSR(xi | xj) = SSR(xi) and SSR(xi | xj, xk) = SSR(xi). This result expresses the independence condition, one of the conditions required to build a regression model; we conclude that multicollinearity undermines the independence of the independent variables of the regression model. In our model we demonstrate this result numerically, and we then address the problem with ridge regression.
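The ridge remedy mentioned above can be sketched as the estimator beta_k = (X'X + kI)^(-1) X'y applied to standardized regressors. The data below are synthetic (the near-collinear pair only mimics the tar-nicotine relationship; none of the paper's values are reproduced), and the biasing constant k = 5.0 is an arbitrary illustration, not a recommended choice.

```python
import numpy as np

# Synthetic, nearly collinear regressors (hypothetical values).
rng = np.random.default_rng(1)
n = 30
tar = rng.normal(12.0, 3.0, n)
nicotine = 0.08 * tar + rng.normal(0.0, 0.05, n)  # nearly linear in tar
weight = rng.normal(1.0, 0.1, n)
y = 1.0 + 0.8 * tar + 5.0 * nicotine + rng.normal(0.0, 0.5, n)

X = np.column_stack([tar, nicotine, weight])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize the regressors
yc = y - y.mean()                          # center the response

def ridge(Xs, yc, k):
    """Ridge estimator (X'X + kI)^(-1) X'y; k = 0 gives OLS."""
    p = Xs.shape[1]
    return np.linalg.solve(Xs.T @ Xs + k * np.eye(p), Xs.T @ yc)

b_ols = ridge(Xs, yc, 0.0)    # OLS coefficients, inflated by collinearity
b_ridge = ridge(Xs, yc, 5.0)  # ridge shrinks the coefficient vector
```

Because ridge adds the constant k to the diagonal of X'X, the solution is shrunk toward zero, trading a small bias for a large reduction in variance when the regressors are nearly collinear.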