From the course: Training Neural Networks in C++ (2021)


Activation functions


- [Instructor] We are almost there, but our neuron is still missing something. So let me tell you what's wrong with weighted sums. There are two inconveniences I'd like to mention. First, values aren't constrained, so a sum may sometimes result in a very large or a very small value. Second, a weighted sum is a linear function, so the threshold to "fire" is not very well-defined. That is, a change between true and false is not very notable, and most importantly, it's not easily trained. It turns out that other functions that make learning easier are nonlinear. This is the real reason to add an element to our neuron. So what's wrong with having a very large or a very small value? Consider this example where we have a two-input neuron, and we are feeding 1,000 to x0 and 2 to x1. For now, let's leave the bias weight at zero, so the bias is not shown, to keep the diagram simple. If we run the neuron, we'll have a result…
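The excerpt cuts off before showing the fix, but the idea can be illustrated with a minimal C++ sketch. The weights here (0.5 each) are hypothetical, chosen only to reproduce a large weighted sum from the inputs in the example, and the sigmoid is one common choice of activation rather than necessarily the one used later in the course. It squashes the unbounded sum into the range (0, 1) and introduces the nonlinearity the instructor is describing:

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// Weighted sum of inputs; the bias weight is left at zero, as in the example.
double weighted_sum(const std::vector<double>& x, const std::vector<double>& w) {
    double sum = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
        sum += x[i] * w[i];
    return sum;
}

// Sigmoid activation: maps any real value into (0, 1), giving a bounded,
// nonlinear output with a smooth, trainable transition between "fire" states.
double sigmoid(double z) {
    return 1.0 / (1.0 + std::exp(-z));
}

int main() {
    std::vector<double> x = {1000.0, 2.0}; // x0 = 1,000 and x1 = 2, from the example
    std::vector<double> w = {0.5, 0.5};    // hypothetical weights for illustration

    double z = weighted_sum(x, w);
    std::cout << "weighted sum: " << z << "\n";          // large, unconstrained value
    std::cout << "activation:   " << sigmoid(z) << "\n"; // squashed into (0, 1)
    return 0;
}
```

With these numbers, the raw weighted sum is 501, while the sigmoid output saturates very close to 1, showing how an activation function keeps the neuron's output in a fixed range no matter how extreme the inputs are.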
