Northeastern University Overfitted Models Discussion
Question Description
Each part is around 100 words
Part 1:
Ridge regression is a statistical tool for modeling data when there are more predictor variables than observations, or when the predictor variables are correlated with one another (multicollinearity). It does not require unbiased estimators to make reliable predictive models; bias here refers to the difference between the expected value of an estimator and the true population parameter (Oleszak, 2019). Ridge regression avoids the overfitting common in least-squares regression models (Glen, 2017) by putting constraints on the coefficients, shrinking them via the lambda penalty term (Bhattacharyya, 2018). By adding this penalty to the sum of squared residuals, Ridge regression shrinks the coefficients and reduces the model's variance.
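To make the shrinkage concrete, here is a minimal sketch using scikit-learn, where the alpha parameter plays the role of the lambda term; the synthetic, nearly collinear data and the alpha values are my own illustrative choices, not from the sources cited above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
z = rng.normal(size=30)
X = np.column_stack([z, z + 0.01 * rng.normal(size=30)])  # two nearly collinear predictors
y = X @ np.array([3.0, 2.0]) + rng.normal(scale=0.5, size=30)

# OLS coefficients blow up under collinearity; Ridge shrinks them.
print("OLS coefficients:", LinearRegression().fit(X, y).coef_)
for alpha in (0.1, 1.0, 10.0):
    print(f"Ridge (alpha={alpha}):", Ridge(alpha=alpha).fit(X, y).coef_)
# Larger alpha -> stronger penalty -> smaller coefficients and lower variance.
```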
Lasso regression also uses shrinkage to create simple models for datasets with high levels of multicollinearity. Lasso is shorthand for Least Absolute Shrinkage and Selection Operator. It uses a different type of regularization than Ridge regression: instead of squaring the coefficients, it penalizes their absolute magnitudes (Bhattacharyya, 2018). This allows for models with fewer coefficients; as the penalty drives some coefficients to or near zero, those predictors can be eliminated from the model (Glen, 2015). The key difference between the two regularization techniques is how the coefficient penalties are assigned. Lasso regression uses L1 regularization, which adds a penalty equal to the absolute value of the magnitude of the coefficients, whereas Ridge regression uses L2 regularization, which adds a penalty equal to the square of the magnitude.
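A small sketch can show the selection behavior side by side; the data (two real signals, three noise predictors) and the alpha values are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
# Only the first and fourth predictors actually drive y.
y = X @ np.array([4.0, 0.0, 0.0, 2.0, 0.0]) + rng.normal(size=100)

print("Ridge coef:", np.round(Ridge(alpha=1.0).fit(X, y).coef_, 3))
print("Lasso coef:", np.round(Lasso(alpha=0.5).fit(X, y).coef_, 3))
# Ridge leaves small nonzero weights on the noise predictors; Lasso
# typically zeroes them out entirely, performing feature selection.
```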
In his 2019 DataCamp post, Oleszak explains succinctly that neither regression tool is better overall: both tolerate correlation between predictors but solve the multicollinearity issue with different forms of regularization. Cross-validating the two models can help determine which is better suited to the dataset at hand.
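A minimal sketch of that cross-validation comparison, assuming scikit-learn; the synthetic data, alpha grid, and scoring choice are illustrative, not from Oleszak:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 8))
y = X @ rng.normal(size=8) + rng.normal(size=100)

alphas = {"alpha": np.logspace(-3, 2, 20)}
for name, model in [("Ridge", Ridge()), ("Lasso", Lasso(max_iter=10000))]:
    search = GridSearchCV(model, alphas, cv=5,
                          scoring="neg_mean_squared_error").fit(X, y)
    print(f"{name}: best alpha={search.best_params_['alpha']:.4f}, "
          f"CV MSE={-search.best_score_:.3f}")
# Whichever model attains the lower cross-validated error is the
# better fit for this particular dataset.
```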
Part 2:
Ridge and LASSO regression both involve regularization, a way to avoid overfitting a model by penalizing high-valued regression coefficients. Since a simpler model is generally preferable, Ridge and LASSO use regularization to shrink the coefficients and reduce the model's effective complexity. However, the two regression models differ in important ways.
Ridge performs L2 regularization, adding a penalty equal to the square of the magnitude of the coefficients. Ridge is ideal for models with high levels of multicollinearity or when the number of predictors exceeds the number of observations. Note, however, that Ridge does NOT shrink any coefficient all the way to zero.
Lasso performs L1 regularization, adding a penalty equal to the absolute value of the magnitude of the coefficients. Lasso can shrink coefficients exactly to zero, making it ideal for models with high levels of multicollinearity and for variable elimination/feature selection.
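A quick numeric illustration of the difference between the two penalties (the coefficient vector is my own example, not from the sources):

```python
import numpy as np

beta = np.array([3.0, -0.5, 0.1])
l1_penalty = np.sum(np.abs(beta))  # Lasso / L1: 3.0 + 0.5 + 0.1 = 3.6
l2_penalty = np.sum(beta ** 2)     # Ridge / L2: 9.0 + 0.25 + 0.01 = 9.26

print(l1_penalty, l2_penalty)
# L2 punishes the large coefficient disproportionately (9.0 of the 9.26),
# while L1 charges each coefficient in proportion to its size, which is
# why L1 can push small coefficients all the way down to zero.
```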
A key aspect of statistical analysis is choosing the correct model. According to the PowerPoint provided for this module, LASSO has an advantage over Ridge: because it can shrink some coefficients to zero, LASSO produces simpler, more interpretable models that involve only a subset of the predictors. In a real-world setting, reducing the cognitive load on the viewers of one's model matters, so again, simpler is better. However, cross-validation is often used to verify which approach is correct.
According to Oleszak (2019), LASSO and Ridge can be more advantageous in statistical modeling than OLS regression because of regularization: both accept a small amount of bias in exchange for a lower variance, yielding a model that generalizes better than the OLS fit.
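A minimal sketch of that bias-variance trade, assuming scikit-learn; the collinear training data and the single alpha value are illustrative choices, not a prescription:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
z = rng.normal(size=60)
X = np.column_stack([z, z + 0.05 * rng.normal(size=60)])  # highly correlated predictors
y = 2 * z + rng.normal(size=60)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ols = LinearRegression().fit(X_tr, y_tr)
ridge = Ridge(alpha=5.0).fit(X_tr, y_tr)
print("OLS test MSE:  ", mean_squared_error(y_te, ols.predict(X_te)))
print("Ridge test MSE:", mean_squared_error(y_te, ridge.predict(X_te)))
# Ridge's slightly biased coefficients often earn a lower test error on
# collinear data like this, because the penalty tames the coefficient
# variance that multicollinearity inflates.
```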