We have a function to find the gradient of b at every point. To find the m gradient, or the way the loss changes as the slope of our line changes, we can use this formula:

$$\frac{-2}{N}\sum_{i=1}^{N} x_i\left(y_i - (m x_i + b)\right)$$

Once more:
- N is the number of points you have in your dataset
- m is the current guess for the slope
- b is the current guess for the intercept
To find the m gradient:
- we find the sum of x_value * (y_value - (m*x_value + b)) for all the y_values and x_values we have
- and then we multiply the sum by a factor of -2/N, where N is the number of points we have.
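
For example, here is a minimal sketch of that calculation in Python. The tiny x and y lists and the starting guesses m = 0 and b = 0 are placeholder values chosen just for illustration:

```python
# A minimal sketch of the m gradient formula described above.
# The data and the current guesses are placeholder values.
x = [1, 2, 3]
y = [5, 1, 3]
m = 0  # current slope guess
b = 0  # current intercept guess

N = len(x)
total = 0
for x_value, y_value in zip(x, y):
    # x_value * (actual y minus the y our current line predicts)
    total += x_value * (y_value - (m * x_value + b))

m_gradient = -2 / N * total
print(m_gradient)
```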
Once we have a way to calculate both the m gradient and the b gradient, we’ll be able to follow both of those gradients downwards to the point of lowest loss for both the m value and the b value. Then, we’ll have the best m and the best b to fit our data!
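
To make "following the gradients downwards" concrete, here is one possible sketch of a single downhill step. It assumes the two gradient functions from this lesson are already defined; the step_gradient name and the 0.01 step size are illustrative choices for this sketch, not part of the exercise:

```python
# A sketch of one downhill step, assuming get_gradient_at_b() and
# get_gradient_at_m() are already defined with the same parameter order.
# The function name and the 0.01 step size are illustrative assumptions.
def step_gradient(x, y, m_current, b_current, learning_rate=0.01):
    b_gradient = get_gradient_at_b(x, y, m_current, b_current)
    m_gradient = get_gradient_at_m(x, y, m_current, b_current)
    # Nudge b and m a small amount against their gradients.
    new_b = b_current - learning_rate * b_gradient
    new_m = m_current - learning_rate * m_gradient
    return new_m, new_b
```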
Instructions
Define a function called get_gradient_at_m() that takes in a set of x values, x, a set of y values, y, a slope m, and an intercept value b. For now, have it return m.
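
A sketch of what this first step might look like, with the parameters in the order they are described above:

```python
# Step sketch: define the function and, for now, just return m.
def get_gradient_at_m(x, y, m, b):
    return m
```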
In this function, we want to go through all of the x values and all of the y values and compute x*(y - (m*x+b)) for each of them. Create a variable called diff that has the sum of all of these values, and return it from the function.
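
Building on the stub above, one way this step might look:

```python
# Step sketch: sum x*(y - (m*x + b)) over every point and return that sum.
def get_gradient_at_m(x, y, m, b):
    diff = 0
    for i in range(len(x)):
        x_val = x[i]
        y_val = y[i]
        diff += x_val * (y_val - (m * x_val + b))
    return diff
```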
Define a variable called m_gradient and set it equal to -2/N multiplied by diff. Instead of returning diff, return m_gradient.
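
Putting the pieces together, the finished function might look something like this sketch:

```python
# Step sketch: scale the summed differences by -2/N and return the result.
def get_gradient_at_m(x, y, m, b):
    N = len(x)
    diff = 0
    for i in range(N):
        x_val = x[i]
        y_val = y[i]
        diff += x_val * (y_val - (m * x_val + b))
    m_gradient = -2 / N * diff
    return m_gradient
```

Called with the same placeholder x, y, m, and b as in the earlier sketch, this function returns the same m gradient value.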