RE: Why Do I Sometimes Get Better Accuracy With A Higher Learning Rate Gradient Descent?

in #programming 7 years ago (edited)

Great article about gradient descent. Many modern machine learning algorithms (e.g. feedforward neural networks) use stochastic gradient descent to update their parameters. There are also modifications (e.g. mini-batch stochastic gradient descent) that speed up training while preserving accuracy. Looking forward to more of your articles about these techniques.
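To make the distinction concrete, here is a minimal sketch of mini-batch stochastic gradient descent on a toy linear-regression problem. It is an illustrative example, not code from the article; the function name, learning rate, and batch size are all assumptions chosen for clarity.

```python
# Illustrative sketch of mini-batch stochastic gradient descent.
# All names and hyperparameters here are hypothetical choices.
import numpy as np

def minibatch_sgd(X, y, lr=0.1, batch_size=8, epochs=100, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)  # shuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # Gradient of mean squared error computed on the mini-batch
            # only, rather than on the full dataset as in batch gradient
            # descent -- cheaper per step, noisier per step.
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

# Toy data: y = 3*x0 - 2*x1 plus a little noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + 0.01 * rng.normal(size=200)
w = minibatch_sgd(X, y)
```

With batch_size=1 this reduces to plain stochastic gradient descent; with batch_size=len(y) it becomes ordinary batch gradient descent, so the batch size is the knob that trades gradient noise against per-step cost.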


I briefly mentioned some of those techniques at the end.

You're correct that stochastic gradient descent has somewhat overtaken traditional gradient descent, if for no other reason than that it's faster to compute.

Anyway, there will definitely be more articles about these other techniques in the future.
