After finding the weighted sum, the second step is to *constrain the weighted sum to produce a desired output*.

Why is that important? Imagine a perceptron with inputs in the range of 100-1000, but whose goal is simply to predict whether or not something will occur — `1` for “Yes” and `0` for “No”. This would result in a very large weighted sum.

How can the perceptron produce a meaningful output in this case? This is exactly where **activation functions** come in! These are special functions that transform the weighted sum into a desired and constrained output.

For example, if you want to train a perceptron to detect whether a point is above or below a line (which we will be doing in this lesson!), you might want the output to be a `+1` or `-1` label. For this task, you can use the “sign activation function” to help the perceptron make the decision:

- If the weighted sum is positive, return `+1`
- If the weighted sum is negative, return `-1`
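The rule above can be sketched as a small Python function (the name `sign_activation` is illustrative, not part of the lesson's starter code):

```python
def sign_activation(weighted_sum):
    # Sign activation: constrain any weighted sum to a +1/-1 label.
    # Following the instructions below, a sum of exactly 0 maps to +1.
    if weighted_sum >= 0:
        return 1
    return -1
```

No matter how large the weighted sum grows (say, `52` or `5200`), the output stays in the small, meaningful set `{+1, -1}`.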

In this lesson, we will focus on using the sign activation function because it is the simplest way to get started with perceptrons and eventually visualize one in action.

### Instructions

**1.** Inside the `.activation()` method, return `1` if the `weighted_sum` is greater than or equal to `0`.

**2.** Inside the `.activation()` method, return `-1` if the `weighted_sum` is less than `0`.

**3.** Try it out for yourself! Print out the result of the method `.activation()` called on `cool_perceptron` if the weighted sum is `52`.
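Putting the steps together, a minimal sketch of how the finished class might look is shown below. The constructor and the `.weighted_sum()` method are assumptions about the lesson's starter code and may differ from what you are given; the `.activation()` body and the final `print` call follow the instructions above.

```python
class Perceptron:
    # Assumed starter-code layout: two inputs, each with weight 1
    def __init__(self, num_inputs=2, weights=[1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        # Multiply each input by its corresponding weight and total the results
        return sum(inputs[i] * self.weights[i] for i in range(self.num_inputs))

    def activation(self, weighted_sum):
        # Sign activation function (steps 1 and 2)
        if weighted_sum >= 0:
            return 1
        return -1

cool_perceptron = Perceptron()
# Step 3: a weighted sum of 52 is positive, so the perceptron outputs +1
print(cool_perceptron.activation(52))  # → 1
```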