Improve the accuracy of predictions with advanced regression procedures
IBM® SPSS® Regression software enables you to predict categorical outcomes and apply a range of nonlinear regression procedures. You can apply these procedures to business and analysis projects where ordinary regression techniques are limited or inappropriate, such as studying consumer buying habits, measuring responses to treatments or analyzing credit risk.
IBM® SPSS® Regression software expands the capabilities of IBM® SPSS® Statistics Base during the data analysis stage of the analytical process.
Predict categorical outcomes
Using multinomial logistic regression (MLR), regress a categorical dependent variable with more than two categories on a set of independent variables. This helps you accurately predict membership in key groups.
Use stepwise functionality, including forward entry, backward elimination, forward stepwise or backward stepwise, to find the best set of predictors.
For a large number of predictors, use Score and Wald methods to help you quickly reach results.
Assess your model fit using Akaike information criterion (AIC) and Bayesian information criterion (BIC).
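SPSS Statistics exposes multinomial logistic regression through its own dialogs and command syntax; purely as a conceptual sketch of the model behind the points above, and not an SPSS feature, the following Python/statsmodels code (with made-up variable names and data) fits a three-category outcome and reports AIC and BIC.

    # Conceptual sketch: multinomial logistic regression with AIC/BIC
    # (Python/statsmodels, synthetic data; illustrative only, not SPSS syntax).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    age = rng.normal(40, 10, n)
    income = rng.normal(50, 15, n)
    # A latent score assigns each case to one of three hypothetical segments.
    score = 0.05 * income - 0.03 * age + rng.normal(size=n)
    segment = np.digitize(score, [0.0, 1.5])         # categories 0, 1, 2

    X = sm.add_constant(pd.DataFrame({"age": age, "income": income}))
    result = sm.MNLogit(segment, X).fit(disp=False)  # outcome with more than two categories

    print(result.summary())
    print("AIC:", result.aic, "BIC:", result.bic)    # compare competing models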
Easily classify your data
Using binary logistic regression, build models in which the dependent variable is dichotomous; for example, buy versus not buy, pay versus default, graduate versus not graduate.
Predict the probability of events such as solicitation responses or program participation.
Select variables using six types of stepwise methods. These include forward (add the strongest predictor at each step until no significant predictors remain in the data set) and backward (at each step, remove the least significant predictor in the data set); a rough sketch of forward entry appears below.
Set inclusion or exclusion criteria.
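SPSS Statistics implements these stepwise methods within its Logistic Regression procedure; the loop below is only a rough illustration of the forward-entry idea (Python/statsmodels, hypothetical predictor names, not IBM's algorithm), adding the most significant predictor until none meets the inclusion criterion.

    # Rough sketch of forward stepwise entry for binary logistic regression
    # (illustrative only; not the SPSS implementation).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def forward_entry(y, X, p_enter=0.05):
        """Add the predictor with the smallest p-value until none is significant."""
        selected, remaining = [], list(X.columns)
        while remaining:
            pvals = {}
            for var in remaining:
                design = sm.add_constant(X[selected + [var]])
                fit = sm.Logit(y, design).fit(disp=False)
                pvals[var] = fit.pvalues[var]
            best = min(pvals, key=pvals.get)
            if pvals[best] >= p_enter:        # inclusion criterion not met
                break
            selected.append(best)
            remaining.remove(best)
        return selected

    # Hypothetical buy / not-buy data with three candidate predictors.
    rng = np.random.default_rng(1)
    X = pd.DataFrame(rng.normal(size=(400, 3)), columns=["recency", "frequency", "spend"])
    y = (X["spend"] + 0.5 * X["frequency"] + rng.normal(size=400) > 0).astype(int)
    print(forward_entry(y, X))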
Estimate parameters of nonlinear models
Estimate nonlinear equations using nonlinear regression (NLR) for unconstrained problems and constrained nonlinear regression (CNLR) for both constrained and unconstrained problems.
Using NLR, estimate models with arbitrary relationships between independent and dependent variables using iterative estimation algorithms.
With CNLR, use linear and nonlinear constraints on any combination of parameters.
Estimate parameters by minimizing any smooth loss function (objective function), and compute bootstrap estimates of parameter standard errors and correlations.
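In the product these capabilities belong to the NLR and CNLR procedures; the sketch below is only a conceptual parallel in Python/SciPy (synthetic data, simple box constraints rather than SPSS's general linear and nonlinear constraints), fitting a nonlinear curve and bootstrapping the parameter standard errors.

    # Conceptual sketch: constrained nonlinear least squares with bootstrap
    # standard errors (illustrative only; not the SPSS CNLR procedure).
    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b):
        """Exponential growth curve: y = a * exp(b * x)."""
        return a * np.exp(b * x)

    rng = np.random.default_rng(2)
    x = np.linspace(0, 4, 80)
    y = model(x, 2.0, 0.5) + rng.normal(scale=0.5, size=x.size)

    # Constrained fit: a in [0, 10], b in [0, 1].
    bounds = ([0.0, 0.0], [10.0, 1.0])
    params, _ = curve_fit(model, x, y, p0=[1.0, 0.1], bounds=bounds)

    # Bootstrap estimates of the parameter standard errors.
    boot = []
    for _ in range(500):
        idx = rng.integers(0, x.size, x.size)       # resample cases with replacement
        p, _ = curve_fit(model, x[idx], y[idx], p0=params, bounds=bounds)
        boot.append(p)
    print("estimates:", params)
    print("bootstrap SE:", np.std(boot, axis=0))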
Meet statistical assumptions
If the spread of residuals is not constant, use weighted least squares to estimate the model. For example, when predicting stock values, high-value stocks fluctuate more than low-value stocks.
Use two-stage least squares to estimate the dependent variable when the independent variables are correlated with the regression error terms, using instrumental variables to control for that correlation; both techniques are sketched below.
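Both techniques are available as procedures in the product; purely as a conceptual sketch (Python/statsmodels, synthetic data and a hypothetical instrument, not SPSS syntax), weighted least squares and a hand-rolled two-stage least squares estimate look like this.

    # Conceptual sketches of weighted least squares (WLS) and two-stage
    # least squares (2SLS); illustrative only, not SPSS syntax.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)

    # WLS: residual spread grows with the predictor (non-constant variance),
    # so each case is weighted by the inverse of its assumed error variance.
    share_value = rng.uniform(10, 200, 300)
    next_price = 1.02 * share_value + rng.normal(scale=0.05 * share_value)
    wls = sm.WLS(next_price, sm.add_constant(share_value),
                 weights=1.0 / share_value**2).fit()
    print("WLS coefficients:", wls.params)

    # 2SLS by hand: an instrument z stands in for the endogenous predictor x.
    n = 500
    z = rng.normal(size=n)                        # instrument, unrelated to the error
    e = rng.normal(size=n)                        # regression error term
    x = 0.8 * z + 0.5 * e + rng.normal(size=n)    # predictor correlated with the error
    y = 1.0 + 2.0 * x + e
    x_hat = sm.OLS(x, sm.add_constant(z)).fit().fittedvalues   # stage 1
    two_sls = sm.OLS(y, sm.add_constant(x_hat)).fit()          # stage 2
    print("2SLS coefficients:", two_sls.params)   # near the true values (1.0, 2.0)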
Evaluate the value of stimuli
Use probit analysis to estimate the effects of one or more independent variables on a categorical dependent variable.
Evaluate the value of stimuli using a logit or probit transformation of the proportion responding.
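SPSS provides this through its Probit Analysis procedure; the code below is only a conceptual illustration in Python/statsmodels with made-up dose-response data, fitting a probit model on the log dose and backing out the median effective dose.

    # Conceptual sketch of probit analysis for a dose-response study
    # (illustrative only; hypothetical dose levels and responses).
    import numpy as np
    from scipy.stats import norm
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    dose = np.repeat([0.5, 1.0, 2.0, 4.0, 8.0], 60)     # stimulus levels
    p_true = norm.cdf(-1.0 + 1.2 * np.log(dose))        # latent probit relationship
    responded = rng.binomial(1, p_true)                 # 1 = responded to the stimulus

    X = sm.add_constant(np.log(dose))
    probit = sm.Probit(responded, X).fit(disp=False)
    print(probit.params)

    # Median effective dose: the dose at which the predicted probability is 0.5.
    print("ED50:", np.exp(-probit.params[0] / probit.params[1]))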
Need help leveraging the power of IBM® SPSS® software? Let us help you with our analytical expertise and experience. Contact us today.