# Stepwise Method of Multiple Regression

In this section, we will learn about the stepwise method of multiple regression. The stepwise method is a very popular method for regression analysis, although it is now less recommended; even so, it is worth understanding. In the stepwise method, variables are entered into the model according to a stepwise criterion. To enter variables in a stepwise manner, two further methods are listed: the forward method and the backward method. The forward and backward methods are part of the stepwise regression approach.

The first method of regression is the Enter method. It is also known as the forced entry method because all the variables are entered into the model at once, without any discrimination on the basis of their relative importance. The second is the Stepwise method, which combines two procedures: forward selection and backward elimination. The last method that we have in SPSS is the Remove method. A natural question arises: if the stepwise method already contains forward selection and backward elimination, why does SPSS keep stepwise, forward selection, and backward elimination as three separate methods? The reason is that there is a small difference between the stepwise method and the other two. In the stepwise method, as variables are entered into the model one at a time, each variable's relative contribution is assessed, and a variable that makes a non-significant contribution to the model is eliminated. At every step, the variable to enter is chosen on the basis of some mathematical criterion, and that criterion is usually the correlation with the dependent variable.
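The entry-then-assessment cycle described above can be sketched in Python. This is a minimal illustration, not SPSS's exact algorithm: the synthetic data, the entry criterion (simple correlation with the dependent variable, as the text describes), and the entry/removal thresholds `alpha_in` and `alpha_out` are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
# Synthetic data (hypothetical): y depends on columns 0 and 1; column 2 is noise.
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=n)

def p_values(Xsub, y):
    """Two-sided t-test p-values for each OLS slope (intercept fitted, then dropped)."""
    A = np.column_stack([np.ones(len(y)), Xsub])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    dof = len(y) - A.shape[1]
    sigma2 = resid @ resid / dof
    cov = sigma2 * np.linalg.inv(A.T @ A)
    t = beta / np.sqrt(np.diag(cov))
    return 2 * stats.t.sf(np.abs(t), dof)[1:]  # drop the intercept's p-value

selected, candidates = [], list(range(X.shape[1]))
alpha_in, alpha_out = 0.05, 0.10  # assumed entry/removal thresholds
while candidates:
    # Entry step: try the candidate most correlated with the dependent variable.
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in candidates]
    best = candidates[int(np.argmax(corrs))]
    trial = selected + [best]
    if p_values(X[:, trial], y)[-1] < alpha_in:
        selected.append(best)
        candidates.remove(best)
        # Removal step: a variable already in the model that has become
        # non-significant is eliminated again.
        pv = p_values(X[:, selected], y)
        for j, p in sorted(zip(selected, pv), key=lambda pair: -pair[1]):
            if p > alpha_out:
                selected.remove(j)
                candidates.append(j)
                break
    else:
        break  # the best remaining candidate is non-significant; stop

print("selected predictors:", sorted(selected))
```

With this data, the two genuinely predictive columns are retained while the noise column is assessed and (typically) rejected at the entry step.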
For example, the variable that shows the highest correlation with the dependent variable is taken as the first predictor, while the variable that shows the second-highest correlation is taken as the second predictor. So, in the stepwise method, every time we add a variable, an assessment of its relative contribution follows: if it contributes significantly, we keep that variable in the model; otherwise, we do not. This does not happen in forward regression. Forward regression is a method of selection, not elimination: variables are selected in the order of their correlation with the dependent variable, so the variable with the highest correlation is given first preference.
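The forward-selection ordering described above reduces to ranking candidates by the size of their correlation with the dependent variable. A small sketch, using hypothetical synthetic data (the coefficients and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 150
# Hypothetical data: y is driven strongly by column 2, weakly by column 0.
X = rng.normal(size=(n, 3))
y = 3.0 * X[:, 2] + 1.5 * X[:, 0] + rng.normal(size=n)

# Forward selection order = descending |correlation| with the dependent variable.
corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
order = sorted(range(X.shape[1]), key=lambda j: -corr[j])
print("entry order:", order)
```

The most strongly correlated predictor enters first, the next strongest second, and so on; unlike the stepwise method, no variable is removed once it has entered.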