logistic regression

Given a binary response variable Y with probability of success p, logistic regression is a non-linear regression model with the following model equation:

p = exp(𝑿T𝜷) / (1 + exp(𝑿T𝜷)),

where 𝑿T𝜷 is the product of the transpose of the column matrix 𝑿 of explanatory variables and the unknown column matrix 𝜷 of regression coefficients. Rewriting this so that the right hand side is 𝑿T𝜷, we arrive at a new equation:

ln(p / (1 − p)) = 𝑿T𝜷.
The left hand side of this new equation is known as the logit function, defined on the open unit interval (0,1) with range the entire real line ℝ:

logit(p) := ln(p / (1 − p)),  where p ∈ (0,1).

Note that the logit of p is the natural logarithm of the odds of success (successes over failures) when the probability of success is p. Since Y is a binary response variable, it has a binomial distribution with parameter (probability of success) p = E[Y], so the logistic regression model equation can be rewritten as

logit⁡(E⁡[Y])=logit⁡(p)=𝑿T⁢𝜷. (1)
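As a quick numerical illustration of equation (1), the following Python sketch (all names and coefficient values here are hypothetical, chosen only for illustration) computes the logit and its inverse, and recovers a probability p from a linear predictor 𝑿T𝜷:

```python
import math

def logit(p):
    """Log-odds of success: ln(p / (1 - p)), defined for p in (0, 1)."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Logistic (sigmoid) function: maps any real x back into (0, 1)."""
    return 1 / (1 + math.exp(-x))

# Even odds (p = 0.5) give a logit of zero.
print(logit(0.5))  # 0.0

# Hypothetical coefficients and covariates (made up for illustration):
beta = [-1.5, 0.8, 0.3]   # intercept and two slopes
x = [1.0, 2.0, -1.0]      # leading 1 carries the intercept

eta = sum(b * xi for b, xi in zip(beta, x))  # linear predictor X^T beta
p = inv_logit(eta)                           # invert the link: p lies in (0, 1)
print(eta, p)
```

Whatever the value of the linear predictor, inverting the logit always returns a probability strictly between 0 and 1.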

Logistic regression is a particular type of generalized linear model (GLM). In addition, the associated logit function is the most appropriate and natural choice of link function. By natural we mean that logit(p) is equal to the natural parameter θ appearing in the distribution function for the GLM. To see this, first note that the distribution function for a binomial random variable Y is

P(Y = y) = C(n, y) p^y (1 − p)^(n−y),
where n is the number of trials, Y = y is the event that there are y successes in these n trials, C(n, y) is the binomial coefficient, and the parameter p is the probability of success. Let Y1, Y2, …, YN be N independent binomial random variables, with Yi corresponding to ni trials with probability of success pi. Then the joint probability distribution of these N random variables is simply the product of the individual binomial distributions. Equating this to the distribution for the GLM, which belongs to the exponential family of distributions, we have:

∏i=1N C(ni, yi) pi^yi (1 − pi)^(ni−yi) = ∏i=1N exp[ (yi θi − b(θi)) / a(φ) + c(yi, φ) ].
Taking the natural log on both sides, we have the equality of the log-likelihood function in two different forms:

∑i=1N [ ln C(ni, yi) + yi ln pi + (ni − yi) ln(1 − pi) ] = ∑i=1N [ (yi θi − b(θi)) / a(φ) + c(yi, φ) ].
Rearranging the left hand side and comparing term i, we have

yi ln(pi / (1 − pi)) + ni ln(1 − pi) + ln C(ni, yi) = (yi θi − b(θi)) / a(φ) + c(yi, φ),

so that, matching the coefficient of yi (with a(φ) = 1), θi = ln(pi / (1 − pi)) = logit(pi).
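The equality of the two log-likelihood forms, and the identification θ = logit(p), can be checked numerically. A minimal Python sketch (function names are illustrative), taking a(φ) = 1, b(θ) = n ln(1 + e^θ) and c(y, φ) = ln C(n, y):

```python
import math

def binom_loglik(y, n, p):
    """Binomial log-likelihood, written directly from the pmf."""
    return (math.log(math.comb(n, y))
            + y * math.log(p) + (n - y) * math.log(1 - p))

def expfam_loglik(y, n, p):
    """Same log-likelihood in exponential-family form, with
    theta = logit(p), b(theta) = n*ln(1 + e^theta), a(phi) = 1."""
    theta = math.log(p / (1 - p))
    b = n * math.log(1 + math.exp(theta))
    return y * theta - b + math.log(math.comb(n, y))

# The two forms agree for any (y, n, p).
print(abs(binom_loglik(3, 10, 0.4) - expfam_loglik(3, 10, 0.4)))
```

The agreement holds because n ln(1 + e^θ) = −n ln(1 − p), so the b(θ) term absorbs exactly the (n − y) ln(1 − p) piece of the binomial form.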

Next, setting the natural link function, the logit, of the expected value of Yi (which is pi) equal to the linear portion of the GLM, we have

logit(pi) = 𝑿iT𝜷,

giving us the model formula for the logistic regression.


  •

    Comparing the model equation for logistic regression to that of the normal (Gaussian) linear regression model, we see that the difference lies in the choice of link function. In the normal linear model, the regression equation looks like

    E⁡[Y]=𝑿T⁢𝜷. (2)

    The link function in this case is the identity function. The model equation is consistent because the linear terms on the right hand side allow E[Y] on the left hand side to vary over the reals. However, for a binary response variable, Equation (2) would not be appropriate: the left hand side is restricted to the unit interval, whereas the right hand side may fall outside of (0,1). Therefore, Equation (1) is more appropriate when dealing with binary response data.

  •

    The logit function is not the only choice of link function for logistic regression. Other, “non-natural” link functions are available. Two such examples are the probit function, i.e. the inverse cumulative normal distribution function Φ−1(p), and the complementary log-log function ln(−ln(1−p)). Both of these functions map the open unit interval onto the whole of ℝ.
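These three link functions can be compared numerically. A small Python sketch using only the standard library (NormalDist supplies Φ−1; function names are illustrative):

```python
import math
from statistics import NormalDist

def logit(p):
    """Natural (canonical) link: log-odds ln(p / (1 - p))."""
    return math.log(p / (1 - p))

def probit(p):
    """Probit link: inverse standard normal CDF, via the stdlib NormalDist."""
    return NormalDist().inv_cdf(p)

def cloglog(p):
    """Complementary log-log link: ln(-ln(1 - p))."""
    return math.log(-math.log(1 - p))

# All three map (0, 1) onto the real line, but cloglog is not
# symmetric about p = 0.5 (its value there is ln(ln 2), not 0).
for p in (0.1, 0.5, 0.9):
    print(p, logit(p), probit(p), cloglog(p))
```

Logit and probit are both odd about p = 0.5, so they mainly differ in tail behaviour; the asymmetry of the complementary log-log link makes it useful when one outcome is much rarer than the other.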

Title logistic regression
Canonical name LogisticRegression
Date of creation 2013-03-22 14:47:51
Last modified on 2013-03-22 14:47:51
Owner CWoo (3771)
Last modified by CWoo (3771)
Numerical id 12
Author CWoo (3771)
Entry type Definition
Classification msc 62J12
Classification msc 62J02
Defines logit
Defines probit
Defines complementary-log-log