## Key Concepts

Review the core concepts you need to master this subject.

- Scikit-Learn *Logistic Regression* Implementation
- *Logistic Regression* sigmoid function
- *Classification Threshold* definition
- *Logistic Regression* interpretability
- *Log-Odds* calculation
- Logistic Regression Classifier
- *Logistic Regression* prediction
- *Logistic Regression* cost function

### Scikit-Learn *Logistic Regression* Implementation

*Scikit-Learn* has a *Logistic Regression* implementation that fits a model to a set of training data and can classify new or test data points into their respective classes. All important parameters can be specified, such as the norm used for the penalty and the solver used for optimization.
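
A minimal sketch of that workflow (the toy dataset and parameter choices below are illustrative):

```python
from sklearn.linear_model import LogisticRegression

# Illustrative training data: hours studied -> passed (1) or failed (0)
X_train = [[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]]
y_train = [0, 0, 0, 1, 1, 1]

# The penalty norm and the solver are specified as constructor parameters
model = LogisticRegression(penalty="l2", solver="lbfgs")
model.fit(X_train, y_train)

# Classify new data points and inspect their class probabilities
print(model.predict([[1.5], [5.5]]))        # predicted classes
print(model.predict_proba([[1.5], [5.5]]))  # [P(class 0), P(class 1)] per point
```

`predict` applies the default 0.5 threshold to the probabilities returned by `predict_proba`.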

### *Logistic Regression* sigmoid function

*Logistic Regression* models use the *sigmoid function* to map the *log-odds* of a data point into the range (0, 1), providing a probability for the classification decision. The *sigmoid function* is widely used in machine learning classification problems because its output can be interpreted as a probability and its derivative is easy to calculate.
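
A small sketch of the sigmoid and its convenient derivative:

```python
import math

def sigmoid(z):
    """Map a log-odds value z to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_derivative(z):
    """The derivative is expressible in terms of the sigmoid itself."""
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid(0.0))   # 0.5: log-odds of 0 means a 50/50 classification
print(sigmoid(4.0))   # close to 1
print(sigmoid(-4.0))  # close to 0
```

The identity `sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))` is what makes the derivative cheap during optimization.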

### *Classification Threshold* definition

A *Classification Threshold* determines the cutoff at which the probabilistic output of a machine learning algorithm classifies a data sample as belonging to the positive or negative class. A *Classification Threshold* of 0.5 is well suited to most problems, but a particular classification problem may need a fine-tuned threshold to improve overall accuracy.
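
A sketch of applying different thresholds to some hypothetical predicted probabilities:

```python
def classify(probability, threshold=0.5):
    """Assign the positive class (1) when probability meets the threshold."""
    return 1 if probability >= threshold else 0

probabilities = [0.2, 0.45, 0.55, 0.9]

# Default threshold of 0.5
print([classify(p) for p in probabilities])                 # [0, 0, 1, 1]

# A stricter threshold, e.g. when false positives are especially costly
print([classify(p, threshold=0.8) for p in probabilities])  # [0, 0, 0, 1]
```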

### *Logistic Regression* interpretability

*Logistic Regression* models have high interpretability compared to most classification algorithms due to their optimized feature coefficients. A feature coefficient can be thought of as a measure of the model's sensitivity to changes in that feature's value.
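
For example, with a hypothetical fitted coefficient of 0.8 for an "hours studied" feature, the coefficient has a direct odds interpretation:

```python
import math

# Hypothetical fitted coefficient for "hours studied" (made-up value)
coef_hours = 0.8

# Each extra hour studied adds 0.8 to the log-odds of passing,
# which multiplies the odds of passing by e^0.8
odds_multiplier = math.exp(coef_hours)
print(round(odds_multiplier, 2))  # 2.23: the odds roughly double per extra hour
```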

### *Log-Odds* calculation

The dot product of the feature coefficients and feature values, plus the intercept, gives the *Log-Odds* of a data sample belonging to the positive class in a *Logistic Regression* model. Log-odds can take any real value, and they are an indirect way to express probabilities.
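
A sketch of the calculation with made-up coefficients and feature values:

```python
import math

# Hypothetical model: coefficients, intercept, and one data sample
coefficients = [0.8, -0.5]
intercept = -1.0
features = [4.0, 2.0]

# Log-odds: dot product of coefficients and features, plus the intercept
log_odds = sum(c * x for c, x in zip(coefficients, features)) + intercept
print(log_odds)  # 0.8*4.0 + (-0.5)*2.0 + (-1.0) = 1.2

# The sigmoid converts the log-odds back into a probability
probability = 1.0 / (1.0 + math.exp(-log_odds))
print(round(probability, 3))
```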

### Logistic Regression Classifier

*Logistic Regression* is a supervised binary classification algorithm used to predict binary response variables that may indicate the presence or absence of some state. It is possible to extend *Logistic Regression* to multi-class classification problems by creating several one-vs-all binary classifiers. In a one-vs-all scheme, n - 1 classes are grouped as one, and a classifier learns to discriminate the remaining class from the grouped ensemble.
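
A sketch of the one-vs-all scheme using Scikit-Learn's `OneVsRestClassifier` (the three-class toy data is illustrative):

```python
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Illustrative 3-class data: one feature, classes clustered by value
X = [[1.0], [1.5], [5.0], [5.5], [9.0], [9.5]]
y = [0, 0, 1, 1, 2, 2]

# One-vs-all: trains one binary classifier per class,
# each discriminating its class from all the others grouped together
ovr = OneVsRestClassifier(LogisticRegression())
ovr.fit(X, y)

print(len(ovr.estimators_))  # 3 binary classifiers, one per class
print(ovr.predict([[1.2], [9.2]]))
```

At prediction time, the sample is assigned to whichever binary classifier reports the highest probability.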

### *Logistic Regression* prediction

*Logistic Regression* models predict the probability of an n-dimensional data point belonging to a specific class by constructing a linear decision boundary. This decision boundary splits the n-dimensional space in two. At prediction time, the point is classified according to which side of the boundary it falls on.
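
A sketch with a hypothetical 2D decision boundary (the coefficients are made up):

```python
# Hypothetical boundary: 1.0*x1 - 1.0*x2 + 0.0 = 0, i.e. the line x1 = x2
coefficients = [1.0, -1.0]
intercept = 0.0

def side_of_boundary(point):
    """Positive log-odds -> positive-class side; negative -> the other side."""
    z = sum(c * x for c, x in zip(coefficients, point)) + intercept
    return "positive class" if z > 0 else "negative class"

print(side_of_boundary([3.0, 1.0]))  # x1 > x2, so the positive-class side
print(side_of_boundary([1.0, 3.0]))  # x1 < x2, so the negative-class side
```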

### *Logistic Regression* cost function

The cost function measuring the inaccuracy of a *Logistic Regression* model across all samples is *Log Loss*. The lower this value, the better the model's predicted probabilities match the true labels. *Log Loss* is also known as *Cross-Entropy* loss.
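
A hand-rolled sketch of *Log Loss*, with illustrative labels and predicted probabilities:

```python
import math

def log_loss(y_true, y_prob):
    """Average negative log-likelihood over all samples."""
    m = len(y_true)
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_prob)) / m

y_true = [1, 0, 1, 0]

confident_correct = [0.9, 0.1, 0.8, 0.2]
confident_wrong = [0.1, 0.9, 0.2, 0.8]

# Lower log loss indicates predictions that better match the true labels
print(round(log_loss(y_true, confident_correct), 3))
print(round(log_loss(y_true, confident_wrong), 3))
```

Note how confident wrong predictions are punished much more heavily than confident correct ones are rewarded.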

1. When an email lands in your inbox, how does your email service know whether it’s a real email or spam? This evaluation is made billions of times per day, and one way it can be done is with Logistic…
2. With the data from Codecademy University, we want to predict whether each student will pass their final exam. And the first step to making that prediction is to predict the probability of each stud…
3. We saw that the output of a Linear Regression model does not provide the probabilities we need to predict whether a student passes the final exam. Step in **Logistic Regression!** In Logistic Re…
4. In Linear Regression we multiply the coefficients of our features by their respective feature values and add the intercept, resulting in our prediction, which can range from -∞ to +∞. In Logistic R…
5. How did our Logistic Regression model create the S-shaped curve we previously saw? The answer is the **Sigmoid Function**. The Sigmoid Function is a special case of the more general _Logistic…
6. Now that we understand how a Logistic Regression model makes its probability predictions, what coefficients and intercept should we use in our model to best predict whether a student will pass the …
7. $J(\mathbf{b}) = -\frac{1}{m}\sum_{i=1}^{m} [y^{(i)}\log(h(z^{(i)})) + (1-y^{(i)})\log(1-h(z^{(i)}))]$ Let’s go ahead and break down our log-loss function into two separate parts so it begins to make…
8. Many machine learning algorithms, including Logistic Regression, spit out a classification probability as their result. Once we have this probability, we need to make a decision on what class the s…
9. Now that you know the inner workings of how Logistic Regression works, let’s learn how to easily and quickly create Logistic Regression models with sklearn! [sklearn](http://scikit-learn.org/stable…
10. One of the defining features of Logistic Regression is the interpretability we have from the feature coefficients. How to handle interpreting the coefficients depends on the kind of data you are wo…
11. Congratulations! You just learned how a Logistic Regression model works and how to fit one to a dataset. Class is over, and the final exam for Codecademy University’s Introductory Machine Learning …

## What you'll create

Portfolio projects that showcase your new skills

## How you'll master it

Stress-test your knowledge with quizzes that help commit syntax to memory