AI Joke 😂 - Activities EXPLAINED


In our last post we challenged the community to try to explain a joke. Guess we won this time :)

Explanation

The human asked the AI what activities it likes. "Activities" was a pun on activations. Activations in Deep Learning are the non-linear functions applied after computing the weighted sum in each layer. Examples of activation functions include (see the code sketch after this list):

  1. Sigmoid
  2. Tanh
  3. ReLU
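
As a minimal sketch (using NumPy; the example values are just illustrative), the three activations above can be written as:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1)
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

# Each is applied element-wise to a layer's weighted sums (pre-activations):
x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # ~[0.12 0.5  0.88]
print(tanh(x))     # ~[-0.96 0.   0.96]
print(relu(x))     # [0. 0. 2.]
```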

The AI responded "Whatever gets me backpropagating fastest without any vanishing." An activation function works well when its derivative is cheap to compute and when the gradients in the backpropagation algorithm do not vanish in the layers closer to the input. These factors are very influential in how well and how fast a Neural Network learns and converges to the target function.
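Here is a toy illustration of the vanishing part (a sketch that ignores the weight terms in the chain rule): the sigmoid's derivative is at most 0.25, so multiplying many of those derivatives together during backpropagation shrinks the gradient exponentially with depth, while ReLU's derivative is 1 wherever the unit is active.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s(x) * (1 - s(x)), at most 0.25 (at x = 0)
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(float)

# Chain-rule product of activation derivatives across 20 layers,
# with every pre-activation fixed at the sigmoid's best case (x = 0)
# and ReLU units all active.
depth = 20
sig_chain = sigmoid_grad(np.zeros(depth)).prod()    # 0.25 ** 20
relu_chain = relu_grad(np.ones(depth)).prod()       # 1.0 ** 20

print(f"sigmoid gradient factor after {depth} layers: {sig_chain:.2e}")  # ~9.09e-13
print(f"ReLU gradient factor after {depth} layers:    {relu_chain:.2e}") # 1.00e+00
```

Even in the sigmoid's best case, the gradient reaching the early layers is about twelve orders of magnitude smaller than at the output, which is exactly the vanishing the AI was joking about.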

If this felt a bit too advanced, check out our series on AI.

To learn more check out these links:

Understanding Activation Functions in Neural Networks

How the backpropagation algorithm works

Why are deep neural networks hard to train?

[Image: Poppy robot]

Better luck next time :)

Happy AI blogging!!
Make sure to upvote if you enjoyed.
From neurallearner :)
