Multicollinearity refers to a situation in multiple regression analysis where two or more independent variables are highly correlated, making it difficult to isolate the individual effect of each variable on the dependent variable. This correlation among independent variables can distort the estimated regression coefficients and undermines the overall interpretability and reliability of the regression model.
In linear regression, the objective is to minimize the residual (error) between the predicted and actual values.
Dealing with Multicollinearity:
- Remove one or more highly correlated variables from the model, using the Variance Inflation Factor to identify them.
- Combine correlated variables or use composite variables.
- Use regularization techniques (e.g., Ridge regression or Lasso regression) that can handle multicollinearity to some extent.
- Principal Component Analysis (PCA) can be applied to transform variables and create uncorrelated components.
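As a quick illustration of the regularization option above, here is a minimal numpy sketch on toy data, using the closed-form ridge solution beta = (X'X + alpha*I)^-1 X'y (the data, alpha value, and the choice to penalize the intercept are all illustrative simplifications):

```python
import numpy as np

# Toy data: x2 is a near-copy of x1, so X'X is close to singular.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # highly correlated with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 3 + 2 * x1 + rng.normal(size=n)

# Ridge closed form: beta = (X'X + alpha*I)^-1 X'y
# (in practice the intercept is usually not penalized; it is here for brevity)
alpha = 1.0
beta_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)

# Ordinary least squares for comparison
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

print("OLS coefficients:  ", beta_ols)    # x1/x2 weights can be unstable
print("Ridge coefficients:", beta_ridge)  # weight is shared more evenly
```

The penalty shrinks the coefficients toward zero, which stabilizes them: instead of two large, opposite-signed weights on the nearly identical columns, ridge spreads the effect across both.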
We will use the Variance Inflation Factor.
Variance Inflation Factor (VIF): it measures how much the variance of an estimated regression coefficient is inflated when the predictors are correlated.
VIF is computed from the R-square (coefficient of determination), the same metric we normally use to evaluate the performance of a regression model: for predictor j, VIF_j = 1 / (1 - R_j^2), where R_j^2 is obtained by regressing predictor j on the remaining predictors.
Let's say we have a linear regression equation of the form: y = b0 + b1*x1 + b2*x2 + b3*x3 + e.
We can therefore take one of the independent variables as the target variable and use the remaining independent variables to predict it.
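This auxiliary regression can be sketched in plain numpy, using three hypothetical predictors x1, x2, x3 (x1 is deliberately constructed to depend heavily on x2 so the inflation is visible):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
x1 = 0.9 * x2 + 0.1 * rng.normal(size=n)   # x1 is mostly driven by x2

# Regress x1 on the remaining predictors (with an intercept)
Z = np.column_stack([np.ones(n), x2, x3])
coef, *_ = np.linalg.lstsq(Z, x1, rcond=None)
resid = x1 - Z @ coef

# R-squared of this auxiliary regression, then VIF = 1 / (1 - R^2)
r2 = 1 - resid.var() / x1.var()
vif = 1 / (1 - r2)
print(f"R^2 = {r2:.3f}, VIF = {vif:.1f}")
```

Because x1 is almost fully explained by x2, the auxiliary R-square is close to 1 and the VIF is far above the usual thresholds.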
The variable with a VIF higher than our threshold will be eliminated.
NB: VIF-based feature elimination is done recursively, i.e., one variable at a time, until no remaining variable has a VIF above the threshold (commonly 5 or 10).
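The recursive elimination described above can be sketched as follows (the predictor names and the threshold of 5 are illustrative; x3 is built as a near-duplicate of x1 + x2 so it should be the first to go):

```python
import numpy as np

def vif(X, j):
    """VIF of column j: regress it on the other columns (plus an intercept)."""
    n = X.shape[0]
    Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    coef, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
    resid = X[:, j] - Z @ coef
    r2 = 1 - resid.var() / X[:, j].var()
    return 1 / (1 - r2)

def drop_high_vif(X, names, threshold=5.0):
    """Recursively drop the predictor with the highest VIF above threshold."""
    X, names = X.copy(), list(names)
    while X.shape[1] > 1:
        vifs = [vif(X, j) for j in range(X.shape[1])]
        worst = int(np.argmax(vifs))
        if vifs[worst] <= threshold:
            break   # no remaining variable exceeds the threshold
        print(f"dropping {names[worst]} (VIF = {vifs[worst]:.1f})")
        X = np.delete(X, worst, axis=1)
        names.pop(worst)
    return X, names

# Toy example: x3 is a near-duplicate of x1 + x2
rng = np.random.default_rng(1)
n = 300
x1, x2 = rng.normal(size=(2, n))
x3 = x1 + x2 + 0.05 * rng.normal(size=n)
x4 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3, x4])
X_kept, kept = drop_high_vif(X, ["x1", "x2", "x3", "x4"], threshold=5.0)
print("kept:", kept)
```

Note that only one variable is removed per pass: dropping x3 changes the VIFs of every remaining predictor, which is exactly why the procedure must recompute them before deciding on the next elimination.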
It's important to detect and address multicollinearity to ensure the reliability of regression results and the meaningful interpretation of coefficients. Ignoring multicollinearity can lead to misleading conclusions and hinder the usefulness of the regression model.
I would highly recommend visiting the link below 👇 to learn more about multicollinearity and VIF.