
Logistic Regression

Equation:

$$\hat{y} = \frac{1}{1 + e^{-z}}$$

where:

  • $z = \theta^T x = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + ... + \theta_n x_n$
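The equation above can be sketched in a few lines of NumPy. The parameter values and the sample below are hypothetical, chosen only to illustrate the computation:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical parameters theta = [theta_0, theta_1, theta_2]
# and one sample x, with x[0] = 1 acting as the bias term.
theta = np.array([-1.0, 2.0, 0.5])
x = np.array([1.0, 0.3, 0.8])

z = theta @ x          # z = theta^T x = -1 + 0.6 + 0.4 = 0.0
y_hat = sigmoid(z)     # predicted probability; sigmoid(0) = 0.5
```

Note that $z$ can be any real number, but $\hat{y}$ is always squashed into $(0, 1)$, which is what lets it be read as a probability.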

Trap

Logistic regression models the probability of a binary outcome that follows a Bernoulli distribution. The Bernoulli distribution is a discrete probability distribution over a random variable that takes only two values, typically 0 and 1, where p is the probability of success (1) and 1 - p is the probability of failure (0).

In logistic regression, we estimate p (the probability of the positive class) based on a given dataset of independent variables. The model outputs a probability between 0 and 1, which matches the Bernoulli distribution's parameter p. This probability can then be used to make binary predictions by applying a threshold (typically 0.5).
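The thresholding step described above is a simple elementwise comparison. The probabilities below are hypothetical stand-ins for the output of a fitted model:

```python
import numpy as np

# Hypothetical predicted probabilities p from a fitted logistic model
probs = np.array([0.12, 0.48, 0.51, 0.93])

# Apply the usual 0.5 threshold to turn probabilities into class labels
preds = (probs >= 0.5).astype(int)   # -> array([0, 0, 1, 1])
```

In practice the threshold is a tunable choice: lowering it below 0.5 trades precision for recall on the positive class, which matters when the classes are imbalanced or errors have asymmetric costs.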

While commonly used for classification tasks, logistic regression is fundamentally a probabilistic model that estimates Bernoulli probabilities using a linear decision boundary. Because that boundary is linear in the input features, the model performs well only when the classes are (at least approximately) linearly separable in feature space; if they are not, the features must be transformed or a more flexible model used.
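The linearity of the boundary follows directly from the equation: $\hat{y} = 0.5$ exactly when $z = \theta^T x = 0$, so the class boundary is the hyperplane $\theta^T x = 0$. A minimal sketch, with hypothetical weights and a point chosen to sit on that hyperplane:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights; any x with theta^T x = 0 lies on the boundary,
# where the model is maximally uncertain (probability exactly 0.5).
theta = np.array([-1.0, 2.0])
x_on_boundary = np.array([1.0, 0.5])   # theta^T x = -1 + 1 = 0

p = sigmoid(theta @ x_on_boundary)     # 0.5 -> exactly on the boundary
```

Points on one side of the hyperplane get $\hat{y} > 0.5$, points on the other side get $\hat{y} < 0.5$, which is why the resulting classifier is linear regardless of the threshold chosen.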
