Introduction

Welcome to our exploration of "Understanding Activation Functions". In this lesson, we will investigate activation functions — essential components in neural networks that determine the output of a neuron. We will focus on the theory and C++ implementations of five activation functions. Let's begin our journey into the world of neural networks using C++.

Theoretical Understanding of Activation Functions

Activation functions play a crucial role in neural networks by determining the output of each neuron. You can think of them as gates: they decide whether a neuron should be activated or not, based on the input. In this lesson, we will explore five types of activation functions.

Step Function

Mathematical Formula:

f(x) = \begin{cases} 1 & \text{if } x \geq 0 \\ 0 & \text{if } x < 0 \end{cases}

The step function outputs 1 when the input is zero or positive and 0 otherwise, acting as a hard binary threshold.
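
C++ Implementation:

A minimal sketch in C++ (the helper name `stepFunction` is illustrative):

```cpp
#include <iostream>

// Step function: 1 for non-negative inputs, 0 otherwise.
double stepFunction(double x) {
    return (x >= 0.0) ? 1.0 : 0.0;
}

int main() {
    std::cout << stepFunction(2.5) << "\n";   // prints 1
    std::cout << stepFunction(-1.0) << "\n";  // prints 0
    return 0;
}
```
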
Sigmoid Function

Mathematical Formula:

f(x) = \frac{1}{1 + e^{-x}}

The sigmoid function maps any real value to a value between 0 and 1, producing an S-shaped curve. It is often used when the output needs to represent a probability.
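
C++ Implementation:

A minimal sketch using std::exp from <cmath> (the helper name `sigmoid` is illustrative):

```cpp
#include <cmath>
#include <iostream>

// Sigmoid: maps any real input into the open interval (0, 1).
double sigmoid(double x) {
    return 1.0 / (1.0 + std::exp(-x));
}

int main() {
    std::cout << sigmoid(0.0) << "\n";  // prints 0.5
    std::cout << sigmoid(4.0) << "\n";  // prints ~0.982
    return 0;
}
```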

ReLU Function

Mathematical Formula:

f(x) = \max(0, x)

The ReLU (Rectified Linear Unit) function returns the input value itself if it is positive; otherwise, it returns zero. It is widely used in neural networks due to its simplicity and effectiveness.

C++ Implementation:
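
A minimal sketch using std::max from <algorithm> (the helper name `relu` is illustrative):

```cpp
#include <algorithm>
#include <iostream>

// ReLU: passes positive inputs through unchanged, clamps negatives to zero.
double relu(double x) {
    return std::max(0.0, x);
}

int main() {
    std::cout << relu(3.0) << "\n";   // prints 3
    std::cout << relu(-2.0) << "\n";  // prints 0
    return 0;
}
```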

ReLU Function Visualization

You can visualize the ReLU function in C++ using matplotlibcpp as before:
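
A minimal sketch, assuming matplotlib-cpp (matplotlibcpp.h) and its Python dependencies are installed; the sampling range and step are arbitrary choices:

```cpp
#include <vector>
#include "matplotlibcpp.h"

namespace plt = matplotlibcpp;

double relu(double x) { return x > 0.0 ? x : 0.0; }

int main() {
    // Sample the function on [-10, 10] with step 0.1.
    std::vector<double> xs, ys;
    for (int i = 0; i <= 200; ++i) {
        double x = -10.0 + 0.1 * i;
        xs.push_back(x);
        ys.push_back(relu(x));
    }

    plt::plot(xs, ys);
    plt::title("ReLU Function");
    plt::xlabel("x");
    plt::ylabel("f(x)");
    plt::show();
    return 0;
}
```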

Tanh Function

Mathematical Formula:

f(x) = \tanh(x) = \frac{2}{1 + e^{-2x}} - 1

The tanh function maps any real value to a value between -1 and 1, producing a zero-centered S-shaped curve.
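
C++ Implementation:

A minimal sketch; the standard library already provides std::tanh in <cmath> (the wrapper name `tanhActivation` is illustrative):

```cpp
#include <cmath>
#include <iostream>

// Tanh: maps any real input into the interval (-1, 1).
double tanhActivation(double x) {
    return std::tanh(x);
}

int main() {
    std::cout << tanhActivation(0.0) << "\n";  // prints 0
    std::cout << tanhActivation(2.0) << "\n";  // prints ~0.964
    return 0;
}
```
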
Softplus Function

Mathematical Formula:

f(x) = \ln(1 + e^{x})

The softplus function is a smooth approximation of the ReLU function. Unlike ReLU, it is differentiable everywhere, including at x = 0.

C++ Implementation:
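
A minimal sketch; std::log1p(std::exp(x)) computes ln(1 + e^x) with better accuracy than std::log(1.0 + std::exp(x)) when e^x is small (the helper name `softplus` is illustrative):

```cpp
#include <cmath>
#include <iostream>

// Softplus: smooth approximation of ReLU, f(x) = ln(1 + e^x).
double softplus(double x) {
    return std::log1p(std::exp(x));
}

int main() {
    std::cout << softplus(0.0) << "\n";   // prints ~0.693 (ln 2)
    std::cout << softplus(-5.0) << "\n";  // prints ~0.0067
    return 0;
}
```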

Softplus Function Visualization

You can visualize the Softplus function in C++ using matplotlibcpp as before:
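
A minimal sketch, again assuming matplotlib-cpp is available; the sampling range mirrors the ReLU plot above:

```cpp
#include <cmath>
#include <vector>
#include "matplotlibcpp.h"

namespace plt = matplotlibcpp;

double softplus(double x) { return std::log1p(std::exp(x)); }

int main() {
    // Sample the function on [-10, 10] with step 0.1.
    std::vector<double> xs, ys;
    for (int i = 0; i <= 200; ++i) {
        double x = -10.0 + 0.1 * i;
        xs.push_back(x);
        ys.push_back(softplus(x));
    }

    plt::plot(xs, ys);
    plt::title("Softplus Function");
    plt::xlabel("x");
    plt::ylabel("f(x)");
    plt::show();
    return 0;
}
```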

Lesson Summary and Practice

In this lesson, we explored the theory and C++ implementations of several important activation functions: step, sigmoid, ReLU, tanh, and softplus. You have learned how to implement these functions, their mathematical formulas, and how to visualize them using C++ and the matplotlibcpp library.

By mastering these activation functions, you are building a strong foundation for understanding and constructing neural networks in C++.
