In this post, you will see different types of **activation functions** used in **neural networks**, presented in the form of an animation. If you are starting out in deep learning and want to learn about the different types of activation functions, you may want to bookmark this page for quicker access in the future.

Without further ado, let’s take a look at the animation, which illustrates the different types of activation functions:

Here is the list of the different types of activation functions shown in the above animation:

- Identity function (Used in Adaline – Adaptive Linear Neuron)
- Sigmoid function
- Tanh function
- ArcTan function (inverse tangent function)
- ReLU (Rectified Linear Unit)
- Leaky ReLU (Improved version of ReLU)
- Randomized ReLU
- Parametric ReLU
- Binary step function (used in the Perceptron)
- Exponential Linear Unit (ELU)
- Soft Sign
- Inverse Square Root Unit (ISRU)
- Inverse Square Root Linear Unit (ISRLU)
- Square Non-linearity
- Bipolar ReLU
- Soft Plus
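To make the shapes of these functions concrete, here is a minimal NumPy sketch of several of the activation functions listed above. The function names and signatures are my own for illustration; deep learning frameworks such as PyTorch and TensorFlow ship their own optimized implementations.

```python
import numpy as np

def identity(x):
    # Identity: output equals input (as in Adaline)
    return x

def sigmoid(x):
    # Squashes input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1)
    return np.tanh(x)

def arctan(x):
    # Inverse tangent; output in (-pi/2, pi/2)
    return np.arctan(x)

def relu(x):
    # Rectified Linear Unit: zero for negative inputs
    return np.maximum(0.0, x)

def softsign(x):
    # x / (1 + |x|); smoother, slower-saturating alternative to tanh
    return x / (1.0 + np.abs(x))

def softplus(x):
    # log(1 + e^x); a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def binary_step(x):
    # Perceptron-style threshold: 1 if x >= 0, else 0
    return np.where(x >= 0, 1.0, 0.0)
```

For example, `sigmoid(0.0)` returns `0.5`, and `relu(-3.0)` returns `0.0`, which matches the flat negative region visible in the animation.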

The following represents different variants of ReLU:

- Leaky ReLU
- Randomized ReLU
- Parametric ReLU
- Exponential Linear Unit (ELU)
- Bipolar ReLU

Out of the above activation functions, the most commonly used are the following:

- Sigmoid
- Tanh
- ReLU and its different variants
