Swish

Swish is an activation function for neural networks introduced in 2017. It is defined by a simple formula: $f(x) = x \cdot \text{sigmoid}(\beta x)$. The function has a learnable parameter $\beta$, but most implementations omit it and use $f(x) = x\sigma(x)$, which is identical to the SiLU function introduced by other authors prior to Swish.
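As a sketch, the formula above can be implemented in a few lines of NumPy (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def sigmoid(x):
    # standard logistic sigmoid: 1 / (1 + e^-x)
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: f(x) = x * sigmoid(beta * x)
    # With beta = 1 (the common default) this is exactly SiLU.
    return x * sigmoid(beta * x)
```

Note that Swish is smooth and non-monotonic: it dips slightly below zero for small negative inputs before approaching zero, unlike ReLU.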

Tanh Activation

When building artificial intelligence or machine learning models, neural networks play a vital role in analyzing data and producing insights. To make these models more accurate and efficient, they rely on activation functions. One such function is the Tanh Activation, or hyperbolic tangent, which helps improve the performance of neural networks.

What is Tanh Activation?

An activation function acts on a neuron's weighted input, introducing the non-linearity that lets a network model complex relationships. The hyperbolic tangent is defined as $\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$ and maps any real input into the range $(-1, 1)$, producing zero-centered outputs.
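A minimal sketch of the definition above, computed directly from exponentials and checked against NumPy's built-in (the function name is illustrative):

```python
import numpy as np

def tanh_manual(x):
    # hyperbolic tangent from its definition:
    # (e^x - e^-x) / (e^x + e^-x); output lies in (-1, 1)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
```

In practice one would simply call `np.tanh` (or the framework equivalent), which is numerically stable for large inputs where the naive exponentials overflow.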

Tanh Exponential Activation Function

When it comes to real-time computer vision tasks, lightweight neural networks are often used because they have fewer parameters than standard networks. However, this compactness can limit their performance.

The Tanh Exponential Activation Function (TanhExp)

To improve the performance of these lightweight networks, a novel activation function called the Tanh Exponential Activation Function (TanhExp) was developed. It is defined as $f(x) = x \tanh(e^x)$.
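The definition above can be sketched directly in NumPy (an illustrative implementation, not from the paper's code):

```python
import numpy as np

def tanhexp(x):
    # TanhExp: f(x) = x * tanh(e^x)
    # For moderately large positive x, tanh(e^x) saturates at 1,
    # so the function approaches the identity f(x) = x.
    return x * np.tanh(np.exp(x))
```

Because $\tanh(e^x) \to 1$ quickly as $x$ grows, TanhExp behaves almost linearly for positive inputs while remaining smooth and slightly negative for small negative inputs, similar in shape to Swish.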
