ISSN: 2641-3086
Trends in Computer Science and Information Technology
Research Article | Open Access | Peer-Reviewed

Shrinkage Parameters for Each Explanatory Variable Found Via Particle Swarm Optimization in Ridge Regression

Eren Bas1*, Erol Egrioglu1 and Vedide Rezan Uslu2

1Department of Statistics, Faculty of Arts and Science, Forecast Research Laboratory, Giresun University, Giresun, 28200, Turkey
2Department of Statistics, Faculty of Arts and Science, University of Ondokuz Mayis, Samsun, 55139, Turkey
*Corresponding author: Eren Bas, Giresun University, Faculty of Arts and Science, Department of Statistics, Gure Campus, Giresun, Turkey, Tel: +90 454 3101400; Fax: +90 454 3101477; E-mail: eren.bas@giresun.edu.tr
Received: 23 February, 2017 | Accepted: 11 March, 2017 | Published: 13 March, 2017
Keywords: Ridge regression; Shrinkage parameters; Particle swarm optimization

Cite this as

Bas E, Egrioglu E, Uslu VR (2017) Shrinkage Parameters for Each Explanatory Variable Found Via Particle Swarm Optimization in Ridge Regression. Trends Comput Sci Inf Technol 2(1): 012-020. DOI: 10.17352/tcsit.000005

Abstract

Ridge regression is an improved estimation method for regression analysis when the assumption of independence among the explanatory variables cannot be met, a situation also called the multicollinearity problem. One way to deal with multicollinearity is to give up the unbiased property of β: ridge regression estimates the regression coefficients with bias in order to decrease their variance. One of the most important problems in ridge regression is deciding what value the shrinkage parameter (k) should take. In almost all studies in the literature, this k value is a single constant. In this study, by contrast, we find a different k value for each diagonal element, i.e. one shrinkage parameter per explanatory variable, instead of a single k, by using a new algorithm based on particle swarm optimization. To evaluate the performance of the proposed method, it is first applied to real-life data sets and compared with other methods suggested in the ridge regression literature. Finally, two different simulation studies are performed and the performance of the proposed method under different conditions is evaluated against other methods from the ridge regression literature.

Introduction

The functional relationship between a dependent variable and more than one independent variable is examined by multiple regression analysis. The purpose of multiple regression analysis is to build the best model for predicting the dependent variable from the independent variables. For this purpose, the most common estimation method is ordinary least squares (OLS). In this method, the model parameters are estimated by minimizing the sum of squared errors.

Some assumptions must hold for multiple regression analysis to be valid: there is no multicollinearity among the independent variables, the variance of the error term is constant for all independent variables, and the covariance between the error term and the independent variables is equal to zero.

One of the major problems in multiple regression analysis is multicollinearity. If there is an exact or high-degree linear relationship among the independent variables, this situation is called multicollinearity. Multicollinearity has important effects on the OLS estimates of the regression coefficients. In its presence, the OLS estimates of the regression coefficients have large variances; the coefficients can be estimated incorrectly and their standard errors can be exaggerated. If the regression coefficients are estimated incorrectly, statistically misleading results may be obtained.

Therefore, the ridge regression method is used to obtain stable coefficient estimates of the regression coefficients. In other words, ridge regression has been suggested to overcome the multicollinearity problem.

In the literature, it is commonly accepted that there is a multicollinearity problem if the variance inflation factor (VIF) values are greater than 10. This is a rule of thumb rather than an exact criterion. Similarly, the condition number can be used to detect multicollinearity via another rule of thumb. In short, the presence of multicollinearity is assessed by using such criteria.

The two methods most commonly used to assess the effects of multicollinearity are the VIF and the condition number. The diagonal elements of $\widehat{\mathrm{Var}}(\beta)$ are called the VIFs and are given by Equation 1.

$VIF_j = \dfrac{1}{1 - R_j^2}, \quad j = 1, \ldots, p$        (1)

In this Equation, $R_j^2$ is the coefficient of determination obtained from the multiple regression of $X_j$ on the remaining $(p-1)$ regressor variables in the model.

It can be said that there is a multicollinearity problem among the relevant independent variables if the VIF values are large (VIF ≥ 10); the larger the VIF values, the higher the degree of multicollinearity.
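As a concrete illustration of Equation 1, the VIF of each regressor can be obtained by regressing that column on the remaining ones. The sketch below is our own minimal NumPy helper, not code from the paper, and it assumes the columns of X are already centred so the auxiliary regressions need no intercept.

```python
import numpy as np

def vif(X):
    """Variance inflation factors of the columns of X (Equation 1)."""
    n, p = X.shape
    vifs = np.empty(p)
    for j in range(p):
        y_j = X[:, j]                      # regress X_j ...
        X_rest = np.delete(X, j, axis=1)   # ... on the remaining (p-1) regressors
        coef, *_ = np.linalg.lstsq(X_rest, y_j, rcond=None)
        resid = y_j - X_rest @ coef
        r2 = 1.0 - np.sum(resid**2) / np.sum((y_j - y_j.mean())**2)  # R_j^2
        vifs[j] = 1.0 / (1.0 - r2)
    return vifs
```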

The condition number is another method for detecting multicollinearity, based on the eigenvalues of the $X'X$ matrix. The formula of the condition number (CN) is given in Equation 2.

$\phi = \dfrac{\lambda_{\max}}{\lambda_{\min}}$        (2)

In this Equation, $\lambda$ denotes the eigenvalues of $X'X$. The relationship between the condition number and multicollinearity is given in Table 1.
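Equation 2 can likewise be evaluated directly from the eigenvalues of $X'X$; the helper below is a minimal sketch of ours, again assuming a numeric design matrix X.

```python
import numpy as np

def condition_number(X):
    """Condition number of Equation 2: largest over smallest eigenvalue of X'X."""
    eig = np.linalg.eigvalsh(X.T @ X)   # eigenvalues of the symmetric matrix X'X
    return eig.max() / eig.min()
```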

In summary, multicollinearity can be diagnosed with two rules of thumb. The first is that if the VIF values are greater than 10, multicollinearity is high. The second is to check the condition number against the thresholds given in Table 1.

In addition, another problem in ridge regression is finding the optimal biasing parameter (k) value. This k value is a very small constant determined by the researcher [1]. Several methods for finding it have been proposed in the literature, notably in the studies [2-22].

There are also many other methods in the ridge regression literature [23-29]. [30] proposed new methods that take account of the skewed eigenvalues of the matrix of explanatory variables. [31] proposed an iterative approach to minimize the mean squared error in ridge regression. [32] proposed new ridge parameters for ridge regression. [33] proposed an optimal estimation for the ridge regression parameter. [34,35] proposed some new estimators for estimating the ridge parameter.

In almost all of these studies, the k value is a single constant. In this study, by contrast, we find a different k value for each diagonal element, i.e. one per explanatory variable, instead of a single k, by using a new algorithm based on particle swarm optimization.

The rest of the paper is organized as follows. The second section is about ridge regression. The methodology of the paper is given in Section 3 and the implementation of the proposed method in Section 4. Two different simulation studies are performed under the title of simulation study and, finally, discussions are presented in Section 6.

Ridge regression

Ridge regression is a remedy used in the presence of multicollinearity and it was first proposed by [1]. The ridge regression method has two important advantages compared with the OLS method: it alleviates the multicollinearity problem and it decreases the mean square error (MSE). The solution technique of ridge regression is similar to that of OLS; the difference between ridge regression and OLS is the k value. This k value, also called the biasing parameter or shrinkage parameter, takes values between 0 and 1. It is added to the diagonal elements of the correlation matrix and thus biased regression coefficients are obtained.

The OLS and ridge estimates of the regression coefficients are shown in Equations 3 and 4, respectively.

$\hat{\beta} = (X'X)^{-1} X'Y$        (3)

$\hat{\beta}_R = (X'X + kI)^{-1} X'Y$        (4)
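For concreteness, Equations 3 and 4 amount to a few lines of linear algebra. The sketch below is our own minimal implementation, not the authors' code; it assumes X and Y are standardized, and it also accepts a vector k so that the per-variable shrinkage studied later in the paper can be plugged into the same formula.

```python
import numpy as np

def ols_estimate(X, Y):
    """Equation 3: OLS coefficients (X'X)^{-1} X'Y."""
    return np.linalg.solve(X.T @ X, X.T @ Y)

def ridge_estimate(X, Y, k):
    """Equation 4: ridge coefficients (X'X + kI)^{-1} X'Y.
    k may be a scalar (classical ridge) or a vector of per-variable
    shrinkage parameters, in which case kI becomes diag(k_1, ..., k_p)."""
    p = X.shape[1]
    K = k * np.eye(p) if np.isscalar(k) else np.diag(np.asarray(k, dtype=float))
    return np.linalg.solve(X.T @ X + K, X.T @ Y)
```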

As noted above, ridge regression is a biased regression method. This is shown in Equation 5.

$\hat{\beta}_R = (X'X + kI)^{-1} X'Y = (X'X + kI)^{-1} (X'X)\hat{\beta} = Z\hat{\beta}$        (5)

$E(\hat{\beta}_R) = E(Z\hat{\beta}) = Z\beta$

It is clearly seen that the ridge estimates of the regression coefficients, $\hat{\beta}_R$, are biased estimates. One of the most important points to consider in ridge regression is the choice of k. Many methods have been proposed in the literature to find the optimal k value. The ridge trace is one of them: it is a plot of the elements of the ridge estimator versus k, usually over the interval (0, 1) [1].

Other methods used in the literature to find the optimal k value are given in Equations 6-14, respectively.

$k = \dfrac{\rho \hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}$        (6)

$k = \dfrac{\rho \hat{\sigma}^2}{\sum_{i=1}^{p} \lambda_i \hat{\beta}_i^2}$        (7)

$k = \dfrac{\rho \hat{\sigma}^2}{\sum_{i=1}^{p} \left\{ \hat{\beta}_i^2 \left[ 1 + \left( 1 + \lambda_i \left( \hat{\beta}_i^2 / \hat{\sigma}^2 \right)^{1/2} \right) \right] \right\}}$        (8)

$k = \dfrac{\lambda_{\max} \hat{\sigma}^2}{(n - p - 1)\hat{\sigma}^2 + \lambda_{\max} \hat{\beta}_{\max}^2}$        (9)

$k = \max\left( 0, \; \dfrac{p \hat{\sigma}^2}{\hat{\beta}'\hat{\beta}} - \dfrac{1}{n \, (VIF_j)_{\max}} \right)$        (10)

$k = \dfrac{\hat{\sigma}^2 \sum_{i=1}^{p} (\lambda_i \hat{\beta}_i^2)}{\left[ s \sum_{i=1}^{p} (\lambda_i \hat{\beta}_i^2) \right]^2}$        (11)

$k = \dfrac{\left\{ \hat{\sigma}^2 \lambda_{\max} \sum_{i=1}^{p} (\lambda_i \hat{\beta}_i^2) + \left[ \sum_{i=1}^{p} (\lambda_i \hat{\beta}_i^2) \right]^2 \right\}}{\lambda_{\max} \sum_{i=1}^{p} (\lambda_i \hat{\beta}_i^2)}$        (12)

$k = \max\left( \dfrac{\hat{\sigma}^2}{\hat{\beta}_i^2} + \dfrac{1}{\lambda_i} \right), \quad i = 1, 2, \ldots, p$        (13)

$k = \dfrac{p \hat{\sigma}^2}{\sum_{i=1}^{p} \left\{ \hat{\beta}_i^2 \Big/ \left[ \left[ \left( \hat{\beta}_i^4 \lambda_i^2 / 4 \hat{\sigma}^2 \right) + \left( 6 \hat{\beta}_i^4 \lambda_i / \hat{\sigma}^2 \right) \right]^{1/2} - \left( \hat{\beta}_i^2 \lambda_i / 2 \hat{\sigma}^2 \right) \right] \right\}}$        (14)
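To make the notation concrete, the sketch below evaluates two of these rules, Equations 9 and 13, written directly in the paper's symbols ($\hat{\beta}$ the OLS coefficients, $\lambda_i$ the eigenvalues of $X'X$, $\hat{\sigma}^2$ the residual mean square). The helper names are ours, and several references state such rules in terms of the canonical coefficients $\hat{\alpha}$ rather than $\hat{\beta}$, so treat this as an illustrative reading rather than the authors' implementation.

```python
import numpy as np

def k_eq9(X, Y):
    """Equation 9: k = lambda_max * sigma^2 / ((n-p-1)*sigma^2 + lambda_max * max(beta^2))."""
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X, X.T @ Y)            # OLS coefficients
    sigma2 = np.sum((Y - X @ beta) ** 2) / (n - p)      # residual mean square
    lam_max = np.linalg.eigvalsh(X.T @ X).max()
    return lam_max * sigma2 / ((n - p - 1) * sigma2 + lam_max * np.max(beta ** 2))

def k_eq13(X, Y):
    """Equation 13: k = max_i( sigma^2 / beta_i^2 + 1 / lambda_i )."""
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X, X.T @ Y)
    sigma2 = np.sum((Y - X @ beta) ** 2) / (n - p)
    lam = np.linalg.eigvalsh(X.T @ X)
    return np.max(sigma2 / beta ** 2 + 1.0 / lam)
```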

In this paper, for the purpose of comparing results, we consider only the methods briefly introduced below.

[2] suggested another method for finding the k value, which is given in Equation 15.

$k = \dfrac{p \hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}$        (15)

In this Equation, $\hat{\sigma}^2$ and $\hat{\beta}$ are the OLS estimates. This method is called the fixed point ridge regression method (FPRRM).

[39] introduced an iterative method for finding the optimal k value. In this method, k is calculated as in Equation 16:

$k = \dfrac{p \hat{\sigma}^2(t-1)}{\hat{\beta}(t-1)' \hat{\beta}(t-1)}$        (16)

In this Equation, $\hat{\sigma}^2(t-1)$ and $\hat{\beta}(t-1)$ are the residual mean square and the estimated vector of regression coefficients at the (t-1)th iteration, respectively. This method is called the iterative ridge regression method (IRRM).
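Equations 15 and 16 differ only in whether the update rule is applied once or repeatedly. The sketch below is our reading of the iterative scheme (the helper name and the choice to recompute the residual mean square at every pass are ours); stopping after the first update reduces it to the FPRRM rule of Equation 15.

```python
import numpy as np

def iterative_ridge_k(X, Y, tol=1e-6, max_iter=100):
    """One reading of IRRM (Equation 16): start from the OLS fit and repeatedly
    apply k = p * sigma^2 / (beta' beta) until k stabilizes."""
    n, p = X.shape
    XtX, XtY = X.T @ X, X.T @ Y
    beta = np.linalg.solve(XtX, XtY)                     # OLS start, t = 0
    k_old = np.inf
    for _ in range(max_iter):
        sigma2 = np.sum((Y - X @ beta) ** 2) / (n - p)   # residual mean square at t-1
        k = p * sigma2 / (beta @ beta)                   # Equation 16
        if abs(k - k_old) < tol:
            break
        beta = np.linalg.solve(XtX + k * np.eye(p), XtY) # ridge refit with the new k
        k_old = k
    return k, beta
```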

In addition, the generalized ridge regression estimator of Hoerl and Kennard [1,40] is given in [41] by the following Equations 17-20.

Let $\Lambda$ and $Q$ be the matrices of eigenvalues and eigenvectors of $(X'X)$, respectively. In the orthogonal version of the classical linear regression model, $Z = XQ$, $\alpha = Q'\beta$, $\hat{\alpha} = \Lambda^{-1} Z'y$ and $K = \mathrm{diag}(k_1, k_2, \ldots, k_p)$ with $k_i > 0$; then

$\tilde{\beta} = Q (\Lambda + K)^{-1} \Lambda \hat{\alpha}$        (17)

$\tilde{\beta}$ is the generalized ridge estimator of $\beta$. Hoerl and Kennard [1,40] have shown that the values of $k_i$ which minimize the MSE of the regression coefficients are given by

$k_i = \dfrac{\sigma^2}{\alpha_i^2}$        (18)

and the estimates of the $k_i$ values can be obtained by using Equation 19.

$\hat{k}_i = \dfrac{\hat{\sigma}^2}{\hat{\alpha}_i^2}$        (19)

In [41], other estimation formulas for optimum shrinkage parameters are given below.

$\hat{k}_i = \dfrac{\lambda_i \hat{\sigma}^2}{(n - k)\hat{\sigma}^2 + \lambda_i \hat{\alpha}_i^2}$        (20)
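Once $X'X$ has been diagonalized, Equations 17-19 can be evaluated componentwise. The sketch below is our own illustration, not the authors' code, of the generalized ridge estimator with one $\hat{k}_i$ per component.

```python
import numpy as np

def generalized_ridge(X, Y):
    """Generalized ridge of Equations 17-19: one shrinkage parameter per component."""
    n, p = X.shape
    lam, Q = np.linalg.eigh(X.T @ X)          # Lambda (eigenvalues) and Q (eigenvectors)
    Z = X @ Q                                 # orthogonal regressors, Z = XQ
    alpha_hat = (Z.T @ Y) / lam               # alpha_hat = Lambda^{-1} Z'Y
    beta_ols = Q @ alpha_hat                  # OLS estimate in the original coordinates
    sigma2 = np.sum((Y - X @ beta_ols) ** 2) / (n - p)   # residual mean square
    k_hat = sigma2 / alpha_hat ** 2           # Equation 19
    beta_gr = Q @ (lam / (lam + k_hat) * alpha_hat)      # Equation 17
    return k_hat, beta_gr
```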

Methodology

Finding the optimal k value is an important problem in ridge regression. The k values recommended in the literature were given in the previous section. There are also heuristic methods, such as the genetic algorithms proposed by [18,21], for finding the optimal k value, and [22] found the k value by using particle swarm optimization (PSO). In all these methods, k is found as a single value. In this study, by contrast, we find a different k value for each explanatory variable instead of a single k, by using an algorithm based on particle swarm optimization. This paper is thus an improved form of the study in [22].

The objective function of the paper was created by considering both the mean absolute percentage error (MAPE) criterion and the VIF values at the same time. The aim of the objective function is to find the optimal k values for which the VIF values are less than 10 and the SSE (sum of squared errors) is minimal at the same time. We also add a term, denoted here by $P(k)$, to the second part of the objective function; this term can be called the penalty parameter. If the VIF value corresponding to any explanatory variable is greater than 10, the value of the objective function is increased. This is the effect of the penalty parameter and marks an undesirable result.

The optimization problem in the proposed method can be given in Equation 21.

Objective function:

$\min_{k_1, k_2, \ldots, k_p} \; \mathrm{MAPE}(k_1, k_2, \ldots, k_p) + P(k_1, k_2, \ldots, k_p)$        (21)

subject to: $0 \le k_j \le 1 \;\; (j = 1, 2, \ldots, p)$

where $\mathrm{MAPE}(k_1, k_2, \ldots, k_p)$ and $P(k_1, k_2, \ldots, k_p)$ are defined in Equations 22 and 23, respectively.

$\mathrm{MAPE}(k_1, k_2, \ldots, k_p) = \dfrac{1}{n} \sum_{i=1}^{n} \left| \dfrac{y_i - \hat{y}_i}{y_i} \right|$        (22)

$P(k_1, k_2, \ldots, k_p) = \begin{cases} 0, & \text{if } VIF_j < 10 \;\; \forall\, j = 1, 2, \ldots, p \\ \sum_{j=1}^{p} VIF_j, & \text{otherwise} \end{cases}$        (23)

(Here p denotes the number of explanatory variables.)
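Putting Equations 21-23 together, the fitness of a candidate vector $(k_1, \ldots, k_p)$ is the MAPE of the corresponding ridge fit plus the VIF penalty. The sketch below is our interpretation (function and variable names are ours); in particular, the VIFs of the ridge fit are computed here with the standard ridge formula, the diagonal of $(X'X+K)^{-1} X'X (X'X+K)^{-1}$, which the paper does not spell out, so that part is an assumption.

```python
import numpy as np

def fitness(k, X, Y):
    """Objective of Equation 21: MAPE of the ridge fit (Equation 22) plus the sum
    of the VIFs as a penalty whenever any VIF is >= 10 (Equation 23)."""
    XtX = X.T @ X
    A = np.linalg.inv(XtX + np.diag(k))        # (X'X + K)^{-1} with one k per regressor
    beta_r = A @ X.T @ Y                       # ridge coefficients for this k vector
    mape = np.mean(np.abs((Y - X @ beta_r) / Y))          # Equation 22
    vifs = np.diag(A @ XtX @ A)                # ridge VIFs (assumed formula, see lead-in)
    penalty = 0.0 if np.all(vifs < 10.0) else vifs.sum()  # Equation 23
    return mape + penalty
```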

The optimization problem defined in (21) is solved by using PSO in the proposed method. PSO is a popular artificial intelligence technique, first proposed by [42]. The algorithm of the proposed method is given below.

Algorithm

Step 1. The parameters such as pn, $c_1$, $c_2$, etc., are determined. These parameters are as follows:

pn: number of particles in the swarm

$c_1$: Cognitive coefficient

$c_2$: Social coefficient

maxt: Maximum iteration number

w: Inertia weight

Step 2. Generate random initial positions and velocities.

The initial positions and velocities are generated from a uniform distribution with parameters (0,1). Each particle has as many velocity components and as many position components as there are explanatory variables, the positions representing the values $(k_1, k_2, \ldots, k_p)$. $x_m^t$ denotes the position of particle m at iteration t and $v_m^t$ denotes the velocity of particle m at iteration t.

Step 3. The fitness function is defined as in (21) and the fitness values of the particles are calculated.

Step 4. Pbest and Gbest particles given in (24) and (25), respectively, are determined according to fitness values.

$Pbest_m^t = (p_m), \quad m = 1, 2, \ldots, pn$        (24)

$Gbest^t = (p_g)$        (25)

Pbest is constructed from the best results obtained at the related positions at iteration t. Gbest is the best result in the swarm at iteration t.

Step 5. New velocities and positions of the particles are calculated by using the Equations given in (26) and (27).

$v_m^{t+1} = \left[ w \times v_m^t + c_1 \times rand_1 \times \left( Pbest_m^t - x_m^t \right) + c_2 \times rand_2 \times \left( Gbest^t - x_m^t \right) \right]$        (26)

$x_m^{t+1} = x_m^t + v_m^{t+1}$        (27)

where $rand_1$ and $rand_2$ are random numbers generated from U(0,1).

Step 6. Steps 3 to 6 are repeated until t reaches maxt.

Step 7. The optimal values are obtained as Gbest.
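A compact version of Steps 1-7 is sketched below. It is our own minimal PSO loop rather than the authors' code; it assumes a fitness function with the signature sketched after Equation 23, and its default parameter values follow those reported in the Implementation section (pn = 30, w = 0.9, c1 = c2 = 2, maxt = 100).

```python
import numpy as np

def pso_shrinkage(X, Y, fitness, pn=30, w=0.9, c1=2.0, c2=2.0, maxt=100, seed=0):
    """Minimal PSO loop over the shrinkage vector (k_1, ..., k_p), following Steps 1-7.
    Positions are clipped to [0, 1] to respect the constraint of Equation 21; the
    clipping is our own choice, the paper only states the constraint."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    pos = rng.uniform(0.0, 1.0, size=(pn, p))           # Step 2: initial positions
    vel = rng.uniform(0.0, 1.0, size=(pn, p))           # Step 2: initial velocities
    fit = np.array([fitness(k, X, Y) for k in pos])     # Step 3: fitness values
    pbest, pbest_fit = pos.copy(), fit.copy()           # Step 4: Pbest
    g = int(fit.argmin())
    gbest, gbest_fit = pos[g].copy(), float(fit[g])     # Step 4: Gbest
    for _ in range(maxt):                               # Step 6: repeat until maxt
        r1 = rng.uniform(size=(pn, p))
        r2 = rng.uniform(size=(pn, p))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)  # Eq. 26
        pos = np.clip(pos + vel, 0.0, 1.0)                                  # Eq. 27
        fit = np.array([fitness(k, X, Y) for k in pos])                     # Step 3
        better = fit < pbest_fit                                            # Step 4
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if fit.min() < gbest_fit:
            gbest, gbest_fit = pos[fit.argmin()].copy(), float(fit.min())
    return gbest, gbest_fit                             # Step 7: Gbest holds the optimal k
```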

Implementation

The proposed algorithm was applied to two different and well-known data sets in order to investigate the performance of the proposed method. These two data sets, named "Import Data" and "Longley Data", were used to evaluate the performance of the proposed method. The Import data was analyzed by [43]. The variables of the Import data are imports (IMPORT-Y), domestic production (DOPROD-X1), stock formation (STOCK-X2) and domestic consumption (CONSUM-X3), all measured in billions of French francs for the years 1949 through 1959. Both the Import data and the Longley data were analyzed using the fixed point method ([2]), the iterative method ([39]), the method of [22] and the algorithm proposed in this paper. In the proposed algorithm, the PSO parameters were chosen as pn = 30, w = 0.9, $c_1 = c_2 = 2$ and maxt = 100. In the iterative ridge method, the stopping criterion was chosen as $10^{-6}$. The results of each method are presented in Tables 2 and 3, respectively.

As can be seen from Table 2, the proposed method has the minimum SSE and MAPE values. Moreover, there is no multicollinearity problem when the Import data are analyzed with the proposed method, whereas there is a multicollinearity problem when they are analyzed with the FPRRM and IRRM methods, because some VIF values of these methods are greater than 10. Although other methods can give smaller SSE and MAPE values, they still do not solve the multicollinearity problem, since some of their VIF values are clearly greater than 10.

As can be seen from Table 3, the proposed method has the minimum MAPE value compared with the other methods, but its SSE value is not the smallest one: the SSE value of OLS is smaller than that of the proposed method. However, the OLS method clearly suffers from a multicollinearity problem when the Longley data are analyzed with it, whereas the proposed method does not.

As a result, finding a k value for each explanatory variable gives better results than finding a single k value, and the proposed method has no multicollinearity problem.

Simulation study

Two different simulation studies are performed in this section in order to show the performance of the proposed method under different levels of multicollinearity and of the standard deviation of the error term, and its superiority when compared with other methods.

The First Simulation Study: In this simulation study, the proposed method was compared with the ridge regression methods given in [2,22,39]. The number of observations (n) was taken as 100, 500 and 1000; the standard deviation of the error term was taken as 0.01 and 1, so comparisons were made for a total of 6 cases. For each case, 1000 data sets with a multicollinearity problem were created.

The first three independent variables were generated from a standard normal distribution, as given in Equation 28.

$X_i \sim N(0, 1), \quad i = 1, 2, 3$        (28)

The last two independent variables were generated by using Equation 29. In this way, a multicollinearity problem is introduced into the data set by creating a high correlation between the independent variables $X_1$ and $X_4$, and between $X_1$ and $X_5$.

$X_i = U(10, 20) + U(5, 20) X_1 + N(0, 7), \quad i = 4, 5$        (29)

The observations of the dependent variable were obtained using Equation 30, so all the coefficients in the regression model are taken as 1.

$Y = \sum_i X_i + N(0,\sigma)$          (30)
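As an illustration, a minimal NumPy sketch of this data-generating scheme is given below. The function name is illustrative only, the $N(0,7)$ term is read as a normal distribution with standard deviation 7, and the uniform terms are assumed to be drawn independently for each observation.

```python
import numpy as np

def generate_first_simulation(n, sigma, seed=None):
    # Hypothetical helper reproducing Equations 28-30:
    #   X1-X3 ~ N(0,1); X4 and X5 depend linearly on X1 (Eq. 29),
    #   which induces the multicollinearity; Y sums all regressors
    #   (true coefficients = 1) plus N(0, sigma) error (Eq. 30).
    rng = np.random.default_rng(seed)
    X = np.empty((n, 5))
    X[:, :3] = rng.standard_normal((n, 3))          # Eq. 28
    for j in (3, 4):                                # X4 and X5, Eq. 29
        X[:, j] = (rng.uniform(10, 20, n)
                   + rng.uniform(5, 20, n) * X[:, 0]
                   + rng.normal(0, 7, n))           # N(0,7) read as sd = 7
    y = X.sum(axis=1) + rng.normal(0, sigma, n)     # Eq. 30
    return X, y
```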

For each data set generated in each case, the $\sum VIF^2$, SSE, MAPE and CN values were calculated using the proposed method and the methods given in [2,22,39]. The formula of SSE is given in Equation 31.

$SSE = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$          (31)

The most important indicator for the comparison of the methods is that the VIF and CN values should be small. The methods in [2] and [39] do not guarantee the solution of the multicollinearity problem, as seen in the numerical examples. The method in [22] and the proposed method guarantee that all VIF values are smaller than 10. Therefore, it is suitable to compare the proposed method with the method in [22] in terms of the SSE and MAPE criteria.
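For reference, the comparison criteria can be computed along the lines of the following sketch. The exact conventions used in the paper (whether $\sum VIF^2$ denotes the sum of squared VIFs, whether CN uses the eigenvalue ratio or its square root, and whether MAPE is reported as a percentage) are assumptions here; common textbook definitions are used.

```python
import numpy as np

def comparison_criteria(X, y, y_hat):
    # Hypothetical sketch of the criteria used to compare the methods.
    # VIFs are taken from the inverse correlation matrix of X and CN as
    # the ratio of the largest to the smallest eigenvalue of the
    # standardized X'X; the paper's exact scaling may differ.
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardized regressors
    R = np.corrcoef(X, rowvar=False)                   # correlation matrix of X
    vif = np.diag(np.linalg.inv(R))                    # variance inflation factors
    eig = np.linalg.eigvalsh(Z.T @ Z)
    cn = eig.max() / eig.min()                         # condition number
    sse = np.sum((y - y_hat) ** 2)                     # Eq. 31
    mape = 100 * np.mean(np.abs((y - y_hat) / y))      # mean absolute percentage error
    return {"sum_VIF2": float(np.sum(vif ** 2)), "CN": float(cn),
            "SSE": float(sse), "MAPE": float(mape)}
```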

The median and interquartile range (IQR) values of the results are given in Tables 4-9.

When all tables are examined, it is clearly seen that the $\sum VIF^2$ and CN values of the proposed method are lower than those of the other methods in all cases.

However, the proposed method produces lower MAPE values than the others despite producing higher SSE values. This may be because the objective function of the proposed method depends on the MAPE.

The Second Simulation Study: A second simulation study was performed for different levels of multicollinearity and of the standard deviation of the error term. The regressors were generated by using Equations 32-36, given by [44].

$w_{ij} \sim N(0,1), \quad i = 1, 2, \ldots, n; \; j = 1, 2, \ldots, 6$          (32)

$x_{ij} = (1 - \rho^2)^{1/2} w_{ij} + \rho\, w_{i,6}, \quad i = 1, 2, \ldots, n; \; j = 1, 2, 3$          (33)

$x_{ij} = w_{ij}, \quad i = 1, 2, \ldots, n; \; j = 4, 5$          (34)

$e_i \sim N(0,\sigma), \quad i = 1, 2, \ldots, n$          (35)

$y_i = \sum_{j=1}^{5} \beta_j x_{i,j} + e_i, \quad i = 1, 2, \ldots, n$          (36)

where $w_{i,j}$ are independent standard normal pseudorandom numbers and $\rho^2$ is the theoretical correlation between any two explanatory variables.
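A minimal NumPy sketch of this generating scheme is given below. The coefficient vector $\beta$ is not restated in this section, so it is passed in by the caller, and the helper name and example values are illustrative only.

```python
import numpy as np

def generate_gibbons_regressors(n, rho, sigma, beta, seed=None):
    # Hypothetical helper following Equations 32-36 (Gibbons, 1981).
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n, 6))                               # Eq. 32
    X = np.empty((n, 5))
    X[:, :3] = np.sqrt(1 - rho**2) * w[:, :3] + rho * w[:, [5]]   # Eq. 33
    X[:, 3:] = w[:, 3:5]                                          # Eq. 34
    e = rng.normal(0, sigma, n)                                   # Eq. 35
    y = X @ beta + e                                              # Eq. 36
    return X, y

# Example usage with illustrative values:
# X, y = generate_gibbons_regressors(100, 0.99, 0.1, beta=np.ones(5))
```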

The simulation study was conducted for a total of 8 cases with sample size $n = 100$, standard deviation of the error term $\sigma = 0.01,\ 0.1,\ 1,\ 5$, and different degrees of multicollinearity $\rho = 0.99,\ 0.999$ (Tables 10-17).

It is clearly seen in the tables of the second simulation study that the $\sum VIF^2$ and CN values of the proposed method do not change significantly when the standard deviation of the error term is changed. The $\sum VIF^2$ and CN values of the proposed method increase dramatically when the multicollinearity is increased. In addition, there is hardly any change in the MAPE values of the proposed method for reasonable standard deviations of the error term ($\sigma = 0.01,\ 0.1$), and the MAPE values of the proposed method even decrease when the multicollinearity is increased.

In this simulation study, different levels of the standard deviation of the error term are also employed. The results clearly show that when the standard deviation of the error term is greater than 1, the model deviates substantially from the linear regression model, because the MAPE values obtained are about 60, which is not acceptable. It is also clearly seen in the tables of the second simulation study that the prediction performance of the proposed method is affected quite negatively when the standard deviation of the error term is increased.

Discussion

There are some assumptions that must hold to create a model in multiple regression analysis. One of them is that there should not be a multicollinearity problem among the independent variables. The ridge regression method is often used in the literature when there is a multicollinearity problem among the independent variables.

However, ridge regression also has some problems. One of the most important problems in ridge regression is deciding what the shrinkage parameter (k) value should be. There are many studies in the literature on finding the optimal k value, and in these studies the k value was found as a single value. In this study, in contrast, we found a different k value corresponding to each explanatory variable instead of a single value of k by using a new algorithm based on particle swarm optimization. The proposed method was also supported by two simulation studies. This is an important novelty for the ridge regression literature.
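To make the core idea concrete, the sketch below computes ridge estimates with a separate shrinkage parameter for each explanatory variable, i.e. $\hat{\beta} = (X'X + K)^{-1} X'y$ with $K = \mathrm{diag}(k_1, \ldots, k_p)$. In the proposed method the vector k is chosen by particle swarm optimization as described in the earlier sections; here it is simply supplied by the caller, and any centering or standardization of the data is omitted.

```python
import numpy as np

def ridge_with_vector_k(X, y, k):
    # Ridge estimator with one shrinkage parameter per regressor:
    # beta_hat = (X'X + diag(k))^{-1} X'y.  In the proposed method
    # the vector k would come from the particle swarm optimizer.
    K = np.diag(np.asarray(k, dtype=float))
    return np.linalg.solve(X.T @ X + K, X.T @ y)
```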

In future studies, different artificial intelligence optimization techniques can be used to find these k values for each explanatory variable.

  1. Hoerl AE, Kennard RW (1970) Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12: 55–67. Link: https://goo.gl/5ZV56T
  2. Hoerl AE, Kennard RW, Baldwin KF (1975) Ridge regression: some simulations. Communications in Statistics 4: 105–123. Link: https://goo.gl/QGgP3L
  3. McDonald GC, Galarneau DI (1975) A Monte Carlo evaluation of some ridge-type estimators. Journal of the American Statistical Association 70: 407–412. Link: https://goo.gl/7ZN2co
  4. Lawless JF, Wang P (1976) A simulation study of ridge and other regression estimators. Communications in Statistics – Theory and Methods 14: 1589–1604. Link: https://goo.gl/WfUz0p
  5. Hocking RR, Speed FM, Lynn MJ (1976) A class of biased estimators in linear regression. Technometrics 18: 425–437. Link: https://goo.gl/NEjsRY
  6. Gunst RF, Mason RL (1977) Biased estimation in regression: an evaluation using mean squared error. Journal of the American Statistical Association 72: 616–628. Link: https://goo.gl/HfxIin
  7. Wichern D, Curchill G (1978) A comparison of ridge estimators. Technometrics 20: 301-311. Link: https://goo.gl/U6OiUQ
  8. Lawless JF (1978) Ridge and related estimation procedure Theory and Methods. Communications in Statistics 7: 139–164. Link: https://goo.gl/KceYME
  9. Nordberg L (1982) A procedure for determination of a good ridge parameter in linear regression, Communications in Statistics 11: 285–309. Link: https://goo.gl/pNqtc2
  10. Saleh AK, Kibria BM (1993) Performances of some new preliminary test ridge regression estimators and their properties. Communications in Statistics – Theory and Methods 22: 2747–2764. Link: https://goo.gl/4XqQNd
  11. Haq MS, Kibria BMG (1996) A shrinkage estimator for the restricted linear regression model: ridge regression approach. Journal of Applied Statistical Science 3: 301–316. Link: https://goo.gl/sjCZrw
  12. Kibria BM (2003) Performance of some new ridge regression estimators. Communications in Statistics – Simulation and Computation 32: 419–435. Link: https://goo.gl/3OJp6a
  13. Pasha GR, Shah MA (2004) Application of ridge regression to multicollinear data. Journal of Research Science 15: 97– 106. Link: https://goo.gl/5eP2I5
  14. Khalaf G, Shukur G (2005) Choosing ridge parameter for regression problem. Communications in Statistics – Theory and Methods 34: 1177–1182. Link: https://goo.gl/Nu1Xs4
  15. Norliza A, Maizah HA, Robin A (2006) A comparative study on some methods for handling multicollinearity problems. Mathematika 22: 109–119. Link: https://goo.gl/Tlyqej
  16. Alkhamisi MA, Shukur G (2007) A Monte Carlo study of recent ridge parameters. Communications in Statistics – Simulation and Computation 36: 535–547. Link: https://goo.gl/Mv2FMY
  17. Mardikyan S, Cetin E (2008) Efficient choice of biasing constant for ridge regression. Int. J. Contemp. Math. Sciences, 3: 527–536. Link: https://goo.gl/oOgsiH
  18. Prago-Alejo RJ, Torre-Trevino LM, Pina-Monarrez MR (2008) Optimal determination of k constant of ridge regression using a simple genetic algorithm. Electronics Robotics and Automotive Mechanics Conference. Link: https://goo.gl/uPVi0B
  19. Dorugade AV, Kashid DN (2010) Alternative method for choosing ridge parameter for regression. Applied Mathematical Sciences 4: 447–456. Link: https://goo.gl/E7MYJ5
  20. Al-Hassan Y (2010) Performance of new ridge regression estimators. Journal of the Association of Arab Universities for Basic and Applied Science 9: 23–26. Link: https://goo.gl/zTjoEe
  21. Ahn JJ, Byun HW, Oh KJ, Kim TY (2012) Using ridge regression with genetic algorithm to enhance real estate appraisal forecasting. Expert Systems with Applications 39: 8369–8379. Link: https://goo.gl/TM0Udi
  22. Uslu VR, Egrioglu E, Bas E (2014) Finding optimal value for the shrinkage parameter in ridge regression via particle swarm optimization. American Journal of Intelligent Systems 4: 142-147. Link: https://goo.gl/U06GuG
  23. Chitsaz S, Ahmed SE (2012) Shrinkage estimation for the regression parameter matrix in multivariate regression model. Journal of Statistical Computation and Simulation 82: 309-323. Link: https://goo.gl/lDIZzU
  24. Firinguetti L (1997) Ridge regression in the context of a system of seemingly unrelated regression equations. Journal of Statistical Computation and Simulation 56: 145-162. Link: https://goo.gl/wWROue
  25. Halawa AM, El Bassiouni MY (2000) Tests of regression coefficients under ridge regression models. Journal of Statistical Computation and Simulation 65: 341-356. Link: https://goo.gl/LUQbtW
  26. Dorugade AV, Kashid DN (2010) Variable selection in linear regression based on ridge estimator. Journal of Statistical Computation and Simulation 80: 1211-1224. Link: https://goo.gl/A0WJm7
  27. Golam Kibria BM (2004) Performance of the shrinkage preliminary test ridge regression estimators based on the conflicting of W, LR and LM tests, Journal of Statistical Computation and Simulation 74: 793-810. Link: https://goo.gl/TMvLYe
  28. Roozbeh M, Arashi M, Niroumand HA (2011) Ridge regression methodology in partial linear models with correlated errors. Journal of Statistical Computation and Simulation 81: 517-528. Link: https://goo.gl/Y2r4Nz
  29. Simpsona JR, Montgomery DC (1996) A biased-robust regression technique for the combined outlier-multicollinearity problem. Journal of Statistical Computation and Simulation 56: 1-22. Link: https://goo.gl/qgK7Fz
  30. Uzuke CA, Mbegbu JI, Nwosu CR (2015) Performance of Kibria, Khalaf and Shukur's methods when the eigenvalues are skewed. Communications in Statistics – Simulation and Computation. Link: https://goo.gl/VdvgIo
  31. Wong KY, Chiu SN (2015) An iterative approach to minimize the mean squared error in ridge regression. Computational Statistics 30: 625–639. Link: https://goo.gl/rdHK1p
  32. Dorugade AV (2014) New ridge parameters for ridge regression. Journal of the Association of Arab Universities for Basic and Applied Sciences 15: 94–99. Link: https://goo.gl/wpNPWf
  33. Khalaf G (2013) An optimal estimation for the ridge regression parameter. Journal of Fundamental and Applied Statistics 5: 11–19. Link: https://goo.gl/bo4bde
  34. Muniz G, Golam Kibria BM, Månsson K, Ghazi S (2012) On developing ridge regression parameters: a graphical investigation. Sort-Statistics and Operations Research Transactions 36: 115-138. Link: https://goo.gl/o13EFW
  35. Muniz G, Golam Kibria BM (2009) On some ridge regression estimators: an empirical comparisons. Communications in Statistics - Simulation and Computation 38: 621-630. Link: https://goo.gl/wqCKbh
  36. Nomura M (1988) On the almost unbiased ridge regression estimation. Communications in Statistics – Simulation and Computation 17: 729–743. Link: https://goo.gl/5kX0MM
  37. Montgomery DC, Peck EA, Vining GG (2006) Introduction to Linear Regression Analysis. John Wiley and Sons. Link: https://goo.gl/M3tgXY
  38. Batah FS, Ramnathan T, Gore SD (2008) The efficiency of modified jackknife and ridge type regression estimators: a comparison. Surveys in Mathematics and its Applications 3: 111–122. Link: https://goo.gl/bw8Xcf
  39. Hoerl AE, Kennard RW (1976) Ridge regression: iterative estimation of the biasing parameter. Communications in Statistics – Theory and Methods 5: 77–88. Link: https://goo.gl/VxoSpF
  40. Hoerl AE, Kennard RW (1970) Ridge regression: applications to nonorthogonal problems. Technometrics 12: 69–82. Link: https://goo.gl/HKnemY
  41. Firinguetti L (1999) A generalized ridge regression estimator and its finite sample properties. Communications in Statistics-Theory and Methods 28: 1217-1229. Link: https://goo.gl/VTzhbj
  42. Kennedy J, Eberhart R (1995) Particle swarm optimization. In Proceedings of IEEE International Conference on Neural Networks, Piscataway, NJ, USA, IEEE Press. 1942–1948
  43. Chatterjee S, Hadi AS (2006) Regression Analysis by Example. John Wiley and Sons. Link: https://goo.gl/Dx6iqn
  44. Gibbons DG (1981) A simulation study of some ridge estimators. Journal of the American Statistical Association 76:131–139. Link: https://goo.gl/XMzqJs
© 2017 Bas E, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
 
