
Supervised Learning: Regression


Gradient descent step

The size of each step that gradient descent takes is called the learning rate. Choosing an appropriate learning rate is key to achieving convergence: if the value is too large, the algorithm may overshoot and never reach the optimum, but if it is too small, reaching the desired value will take too long.
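A minimal sketch of a single gradient descent step on the one-dimensional function f(x) = x², whose gradient is 2x. The function name, starting point, and learning rate value here are illustrative assumptions, not taken from the cheatsheet:

```python
def gradient_descent_step(x, learning_rate):
    gradient = 2 * x                    # derivative of f(x) = x**2
    return x - learning_rate * gradient # step opposite the gradient

x = 10.0
for _ in range(50):
    x = gradient_descent_step(x, learning_rate=0.1)
print(x)  # close to the minimum at x = 0
```

With a learning rate of 0.1, each step shrinks x by a constant factor and the iterate approaches 0; a rate above 1.0 would make the iterates grow without bound instead, illustrating the overshooting problem described above.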

Gradient Descent in Regression

Gradient Descent is an iterative algorithm used to tune the parameters of regression models so that the loss is minimized.
