Neural Networks and TensorFlow - Deep Learning Series [Part 2]

in #programming · 6 years ago

In this lesson on deep learning with TensorFlow we will be looking at feedforward neural networks.

More specifically, we'll be discussing how they are made of artificial neurons and how these neurons receive inputs, process them, and send signals forward.

In TensorFlow, as in other neural network libraries, the convention is that the inputs entering the network are multiplied by weights, to which a bias term is added. This convention makes both the computation and the optimization of the network much more efficient.
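As a minimal sketch of that convention, here is a single artificial neuron computing a weighted sum of its inputs plus a bias. The input, weight, and bias values are made up purely for illustration; the lesson's actual code is in the linked video and GitHub repository.

```python
import numpy as np

# Hypothetical toy values, just to illustrate the convention:
x = np.array([0.5, -1.2, 3.0])   # inputs entering the neuron
w = np.array([0.8, 0.1, -0.4])   # weights, one per input
b = 0.25                         # bias term

# The neuron's pre-activation: multiply inputs by weights,
# sum them up, and add the bias.
z = np.dot(w, x) + b
print(z)
```

In a real network this value `z` would then be passed through an activation function before being sent forward to the next layer.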

Here we also discuss the three most common activation functions (applied inside the neurons):

  • the sigmoid activation
  • the hyperbolic tangent
  • the rectified linear unit
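To briefly show how these three functions behave, here is a sketch in plain NumPy (assumed for illustration; TensorFlow provides its own versions such as `tf.nn.sigmoid`, `tf.nn.tanh`, and `tf.nn.relu`):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Hyperbolic tangent: zero-centred output in (-1, 1).
    return np.tanh(z)

def relu(z):
    # Rectified linear unit: zero for negative inputs,
    # passes positive inputs through unchanged.
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))
print(tanh(z))
print(relu(z))
```

Running this on a few sample values makes the differences visible: sigmoid saturates near 0 and 1, tanh near -1 and 1, while ReLU is unbounded above and exactly zero below.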

Each of these functions has its place, and some are better suited to certain types of projects than others. To be honest, the two that I've used the most (in computer vision) are sigmoid and ReLU. Anyway, we'll discuss activation functions in more detail when the time comes. Here we're only introducing them and briefly mentioning how they work and what they look like. Please watch the video for the complete lesson:

To stay in touch with me, follow @cristi


Cristi Vlad, Self-Experimenter and Author


Creative posts can add insight. Success always, @cristi.

GitHub link not opening. Can you provide some other link for the code?

Click on "show more" to see the full GitHub link. If you click on it directly, it's truncated.
