From the course: Training Neural Networks in C++ (2021)
Activation functions
- [Instructor] We are almost there, but our neuron is still missing something. So let me tell you what's wrong with weighted sums. There are two inconveniences I'd like to mention. First, values aren't constrained, so a sum may sometimes result in a very large value or a very small value. Second, a weighted sum is a linear function, so the threshold to "fire" is not very well-defined. That is, a change between true and false is not very notable, and most importantly, it's not easily trained. It turns out that the functions that make learning easier are nonlinear. This is the real reason to add an element to our neuron. So what's wrong with having a very large or a very small value? Consider this example where we have a two-input neuron, and we are feeding 1,000 to x0 and 2 to x1. For now, let's leave the bias weight at zero, so the bias is not shown, to keep the diagram simple. If we run the neuron, we'll have a result…
Contents
- Neurons and the brain (1m 49s)
- A simple model of a neuron (5m 43s)
- Activation functions (6m 21s)
- Perceptrons: A better model of a neuron (3m 53s)
- Challenge: Finish the perceptron (1m)
- Solution: Finish the perceptron (37s)
- Logic gates (3m 11s)
- Challenge: Logic gates with perceptrons (1m)
- Solution: Logic gates with perceptrons (46s)