Perceptron
Learn about the most basic type of neural network: the single-neuron perceptron! You will use it to divide linearly separable data.
Key Concepts
Review core concepts you need to learn to master this subject
Perceptron Bias Term
Perceptrons as Linear Classifiers
Adjusting Perceptron Weights
Perceptron Weighted Sum
Optimizing Perceptron Weights
Introduction to Perceptrons
Perceptron Activation Functions
Perceptron Training Error
Perceptron Bias Term
weighted_sum = x1*w1 + x2*w2 + x3*w3 + 1*wbias
The bias term is an adjustable numerical term added to a perceptron’s weighted sum of inputs and weights, and tuning it can increase the accuracy of the classification model.
The bias term is helpful because it serves as another model parameter, in addition to the weights, that can be tuned to make the model’s performance on the training data as good as possible.
The input paired with the bias weight is fixed at 1, while the bias weight itself is adjusted during training.
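As a quick illustration, here is a minimal sketch of that weighted sum in Python. The variable names and example values below are hypothetical, not the course’s exact code:

```python
# Weighted sum of three inputs plus the bias term (the bias input is fixed at 1).
inputs = [1.0, 2.0, 3.0]        # x1, x2, x3 (example values)
weights = [0.5, -0.25, 0.5]     # w1, w2, w3 (example values)
w_bias = 0.25                   # bias weight, adjustable during training

weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + 1 * w_bias
print(weighted_sum)  # 0.5*1.0 + (-0.25)*2.0 + 0.5*3.0 + 0.25 = 1.75
```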
1. Similar to how atoms are the building blocks of matter and how microprocessors are the building blocks of a computer, perceptrons are the building blocks of Neural Networks. If you look closely,…
2. So the perceptron is an artificial neuron that can make a simple decision. Let’s implement one from scratch in Python! The perceptron has three main components: * Inputs: Each input correspon…
3. Great! Now that you understand the structure of the perceptron, here’s an important question — how are the inputs and weights magically turned into an output? This is a two-step process, and …
4. After finding the weighted sum, the second step is to constrain the weighted sum to produce a desired output. Why is that important? Imagine if a perceptron had inputs in the range of 100-1000 … (a sketch of this two-step forward pass appears after this list)
5. Our perceptron can now make a prediction given inputs, but how do we know if it gets those predictions right? Right now we expect the perceptron to be very bad because it has random weights. We h…
6. Now that we have our training set, we can start feeding inputs into the perceptron and comparing the actual outputs against the expected labels! Every time the output mismatches the expected labe…
7. What do we do once we have the errors for the perceptron? We slowly nudge the perceptron towards a better version of itself that eventually has zero error. The only way to do that is to change the…
8. But one question still remains — how do we tweak the weights optimally? We can’t just play around randomly with the weights until the correct combination magically pops up. There needs to be … (a sketch of this training rule appears after this list)
9. You have understood that the perceptron can be trained to produce correct outputs by tweaking the regular weights. However, there are times when a minor adjustment is needed for the perceptron to…
10. So far so good! The perceptron works as expected, but everything seems to be taking place behind the scenes. What if we could visualize the perceptron’s training process to gain a better understand…
11. Let’s recap what you just learned! The perceptron has inputs, weights, and an output. The weights are parameters that define the perceptron and they can be used to represent a line. In other words…
12. Congratulations! You have now built your own perceptron from scratch. Let’s step back and think about what you just accomplished and see if there are any limits to a single perceptron. Earlier, …
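The two-step forward pass described in items 2-4 above can be sketched in a few lines of Python. This is a minimal illustration rather than the course’s exact code; the class name, the 0/1 step-function activation, and the random starting weights are assumptions:

```python
import random

class Perceptron:
    def __init__(self, num_inputs=2):
        # One weight per input plus one bias weight, all starting at random values.
        self.weights = [random.uniform(-1, 1) for _ in range(num_inputs + 1)]

    def weighted_sum(self, inputs):
        # Step 1: multiply each input by its weight; the bias input is fixed at 1.
        return sum(x * w for x, w in zip(list(inputs) + [1], self.weights))

    def activation(self, weighted_sum):
        # Step 2: constrain the weighted sum to a simple 0/1 decision (step function).
        return 1 if weighted_sum >= 0 else 0

    def predict(self, inputs):
        return self.activation(self.weighted_sum(inputs))
```

With random weights the predictions are essentially guesses, which is why the later lessons introduce a training set, an error signal, and a rule for nudging the weights.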
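For the training steps in items 5-9, the classic perceptron learning rule nudges every weight, including the bias weight, in proportion to the error on each example. The `train` method, learning-rate value, and AND-gate data below are illustrative assumptions that build on the sketch above:

```python
def train(self, training_set, learning_rate=0.1, epochs=500):
    # training_set maps input tuples to expected 0/1 labels.
    for _ in range(epochs):
        total_error = 0
        for inputs, label in training_set.items():
            error = label - self.predict(inputs)   # 0 if correct, +1 or -1 if wrong
            total_error += abs(error)
            # Nudge each weight (bias weight included) to reduce the error.
            for i, x in enumerate(list(inputs) + [1]):
                self.weights[i] += learning_rate * error * x
        if total_error == 0:   # every training example classified correctly
            break

Perceptron.train = train  # attach the method to the sketch class above

# Example: AND-gate data is linearly separable, so training reaches zero error.
data = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
p = Perceptron(num_inputs=2)
p.train(data)
print([p.predict(point) for point in data])  # [0, 0, 0, 1]
```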
What you'll create
Portfolio projects that showcase your new skills
How you'll master it
Stress-test your knowledge with quizzes that help commit syntax to memory