A COMPARISON OF SOME SHRINKAGE METHODS FOR LOGISTIC REGRESSION MODEL

Authors

  • Soadad Rashied Hameed, Haifa Taha Abd

Abstract

Shrinkage produces a new estimate by pulling an initial estimate (such as the sample mean) toward a more central value; for example, two extreme mean values can be combined into a more central one, and repeating this for all means in the sample moves the sample mean toward the true population mean. In penalized regression the same idea is applied through a penalty parameter λ: adding a penalty function to the estimation criterion shrinks the model parameters toward zero and removes the variables that have no effect on the model. We describe four such methods in detail and compare them to determine which is the most efficient for estimation: the ordinary lasso, the adaptive lasso, SCAD, and the method proposed by the researchers, a Bayesian lasso with a normal exponential-gamma prior. A simulation study with sample sizes (250, 200, 150, 75) compared the methods by their mean squared error and mean absolute error. The adaptive lasso proved to be the best of the shrinkage methods, and the Bayesian lasso also gave good results.
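As a rough illustration of the penalized criterion described above (this is not the paper's code), the sketch below fits an L1-penalized (lasso) logistic regression with scikit-learn on simulated sparse data and reports the MSE and MAE of the estimated coefficients; the data-generating setup, sample size, and penalty strength C = 1/λ are illustrative assumptions.

# Minimal sketch (assumed setup, not the authors' implementation):
# L1-penalized (lasso) logistic regression on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 250, 10                                  # illustrative sample size and number of predictors
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.0, 0.0, 1.0,
                      0.0, 0.0, 0.0, 0.0, 0.0])  # sparse true coefficients
prob = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
y = rng.binomial(1, prob)

# The L1 penalty shrinks coefficients toward zero and sets the coefficients
# of uninformative predictors exactly to zero (variable selection).
lasso_fit = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
lasso_fit.fit(X, y)

beta_hat = lasso_fit.coef_.ravel()
mse = np.mean((beta_hat - beta_true) ** 2)       # mean squared error of the estimates
mae = np.mean(np.abs(beta_hat - beta_true))      # mean absolute error of the estimates
print("estimated coefficients:", np.round(beta_hat, 3))
print("MSE:", round(mse, 4), "MAE:", round(mae, 4))

Repeating such a fit over many simulated samples of different sizes, and averaging the error measures, is the general pattern of comparison the abstract describes.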

Published

2024-10-25

Issue

Section

Articles

How to Cite

A COMPARISON OF SOME SHRINKAGE METHODS FOR LOGISTIC REGRESSION MODEL. (2024). International Journal of Central Banking, 20(1), 828-839. https://ijocb.com/index.php/IJCB/article/view/61