
We have a function to find the gradient of b at every point. To find the m gradient, or the way the loss changes as the slope of our line changes, we can use this formula:

  m_gradient = (-2/N) * Σ x_i * (y_i - (m*x_i + b))
Once more:

  • N is the number of points you have in your dataset
  • m is the current guess for the slope
  • b is the current guess for the intercept

To find the m gradient:

  • we find the sum of x_value * (y_value - (m*x_value + b)) for all the x_values and y_values we have
  • and then we multiply that sum by a factor of -2/N

Once we have a way to calculate both the m gradient and the b gradient, we’ll be able to follow both of those gradients downwards to the point of lowest loss for both the m value and the b value. Then, we’ll have the best m and the best b to fit our data!
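As a sketch of how the two gradients work together, here is one gradient-descent step. The formula for `get_gradient_at_b` follows the same pattern as the m gradient but without the leading `x` factor; the `step_gradient` helper and its `learning_rate` parameter are illustrative names, not part of this exercise:

```python
def get_gradient_at_b(x, y, m, b):
    # b gradient: (-2/N) * sum of (y - (m*x + b))
    N = len(x)
    diff = sum(y[i] - (m * x[i] + b) for i in range(N))
    return -(2 / N) * diff

def get_gradient_at_m(x, y, m, b):
    # m gradient: (-2/N) * sum of x * (y - (m*x + b))
    N = len(x)
    diff = sum(x[i] * (y[i] - (m * x[i] + b)) for i in range(N))
    return -(2 / N) * diff

def step_gradient(x, y, m, b, learning_rate=0.01):
    # Nudge m and b a small step against their gradients,
    # moving toward the point of lowest loss.
    new_m = m - learning_rate * get_gradient_at_m(x, y, m, b)
    new_b = b - learning_rate * get_gradient_at_b(x, y, m, b)
    return new_m, new_b
```

Repeating `step_gradient` many times walks both guesses downhill until the loss stops improving.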



Define a function called get_gradient_at_m() that takes in a set of x values, x, a set of y values, y, a slope m, and an intercept value b.

For now, have it return m.


In this function, we want to go through all of the x values and all of the y values and compute x*(y - (m*x+b)) for each of them.

Create a variable called diff that has the sum of all of these values, and return it from the function.


Define a variable called m_gradient and set it equal to -2/N multiplied by diff.

Instead of returning diff, return m_gradient.
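Putting the three checkpoints together, a possible solution (not the only valid one) looks like this:

```python
def get_gradient_at_m(x, y, m, b):
    N = len(x)
    diff = 0
    # Checkpoint 2: sum x*(y - (m*x + b)) over every point.
    for i in range(N):
        x_val = x[i]
        y_val = y[i]
        diff += x_val * (y_val - (m * x_val + b))
    # Checkpoint 3: scale the sum by -2/N and return it.
    m_gradient = -(2 / N) * diff
    return m_gradient
```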
