How do we know when we should stop changing the parameters m and b? How will we know when our program has learned enough?

To answer this, we have to define convergence. Convergence is when the loss stops changing (or changes very slowly) as the parameters are updated.

Hopefully, the algorithm will converge at the best values for the parameters m and b.
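A convergence check like this can be added directly to a gradient descent loop. The sketch below is a minimal, hypothetical example (the data, learning rate, and tolerance are invented for illustration): it fits a line y = m*x + b by gradient descent on mean squared error, and stops once the loss changes by less than a small tolerance between iterations.

```python
def loss(m, b, xs, ys):
    # Mean squared error of the line y = m*x + b
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys)) / len(xs)

def step_gradient(m, b, xs, ys, learning_rate):
    # One gradient descent step: move m and b against the loss gradients
    n = len(xs)
    m_grad = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    b_grad = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    return m - learning_rate * m_grad, b - learning_rate * b_grad

# Made-up data that is perfectly fit by m = 2, b = 0
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]

m, b = 0.0, 0.0
previous_loss = loss(m, b, xs, ys)
tolerance = 1e-9    # "changes very slowly" threshold
iterations = 0

while True:
    m, b = step_gradient(m, b, xs, ys, learning_rate=0.01)
    current_loss = loss(m, b, xs, ys)
    iterations += 1
    if abs(previous_loss - current_loss) < tolerance:
        break       # converged: the loss has (nearly) stopped changing
    previous_loss = current_loss

print(f"converged after {iterations} iterations: m={m:.3f}, b={b:.3f}")
```

Note that the loop stops because the *loss* stabilized, not because it hit a fixed iteration count; with a smaller tolerance it would run longer and land even closer to m = 2, b = 0.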



Run the file. Look at the graph. This graph shows how much the parameter b changed with each iteration of a gradient descent runner.
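Reading convergence off such a graph can also be done in code. This is a hypothetical sketch: the b_history list below is invented for illustration (your graph will show different numbers), and it finds the first iteration where b changes by less than a small threshold.

```python
# Invented list of b values recorded at each iteration of a gradient
# descent runner (stand-in for the values plotted in the graph)
b_history = [0.0, 0.8, 1.3, 1.62, 1.82, 1.90, 1.95, 1.97, 1.98, 1.99, 1.99]

threshold = 0.005  # treat changes smaller than this as "converged"

# First iteration whose b value barely differs from the previous one
num_iterations = next(
    i for i in range(1, len(b_history))
    if abs(b_history[i] - b_history[i - 1]) < threshold
)
convergence_b = b_history[num_iterations]

print(num_iterations, convergence_b)
```

For the exercise itself, read num_iterations and convergence_b off the graph rather than from this made-up data.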


How many iterations did it take for this program to converge?

Enter your answer in a variable called num_iterations.


At what b value did this program converge?

Enter your answer in a variable called convergence_b.
