Neural Networks and TensorFlow - Deep Learning Series [Part 4]


In this fourth tutorial on deep learning with Python and TensorFlow, we're going to look at Gradient Descent.

Gradient Descent is a very popular optimization algorithm that has been used extensively across deep learning projects.

The math behind gradient descent can be a bit intimidating, so we'll set it aside for now and focus on building an intuitive understanding of how it works.

As an optimization algorithm, the purpose of gradient descent is to update the model's parameters, the weights and the biases, so as to shrink the gap between the predicted output and the target output. In other words, it minimizes the loss and thereby improves the accuracy of your model.
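To make that concrete, here is a minimal sketch of gradient descent on a toy linear model, written in the TensorFlow 2 style with `GradientTape`. The data, learning rate, and variable names are made up for illustration; this is not the code from the video.

```python
import tensorflow as tf

# Toy data for the line y = 3x + 2 (hypothetical example for illustration).
xs = tf.constant([0.0, 1.0, 2.0, 3.0, 4.0])
ys = tf.constant([2.0, 5.0, 8.0, 11.0, 14.0])

# Parameters to learn: a weight and a bias, starting from arbitrary values.
w = tf.Variable(0.0)
b = tf.Variable(0.0)

learning_rate = 0.01

for step in range(500):
    with tf.GradientTape() as tape:
        # Forward pass: the model's current prediction.
        y_pred = w * xs + b
        # Loss: mean squared gap between prediction and target.
        loss = tf.reduce_mean(tf.square(y_pred - ys))

    # Gradients of the loss with respect to each parameter.
    dw, db = tape.gradient(loss, [w, b])

    # The gradient descent update: step each parameter against its gradient.
    w.assign_sub(learning_rate * dw)
    b.assign_sub(learning_rate * db)

print(w.numpy(), b.numpy())  # should approach 3.0 and 2.0
```

Each iteration nudges the weight and bias a small step in the direction that reduces the loss, which is the whole intuition behind gradient descent.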

Please see the video for the complete walk-through.

To stay in touch with me, follow @cristi


Cristi Vlad, Self-Experimenter and Author
