In this lesson of the series on building convolutional neural networks with TensorFlow and Python, we're going to build the training process for the CNN whose architecture we constructed previously.
First we're going to instantiate the dataset, after which we'll instantiate the loss function and the optimizer.
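As a minimal sketch of the dataset step: the series presumably works with an image dataset like MNIST, so here random arrays with MNIST-like shapes stand in for the real data (no download needed), along with a hypothetical `next_batch` helper of the kind the training loop will expect.

```python
import numpy as np

# Stand-in data with MNIST-like shapes: 28x28 grayscale images, 10 classes.
num_train, num_classes = 1000, 10
images = np.random.rand(num_train, 28, 28, 1).astype("float32")  # pixels in [0, 1]
# One-hot encode integer class labels, as cross entropy with logits expects.
labels = np.eye(num_classes, dtype="float32")[
    np.random.randint(0, num_classes, num_train)]

def next_batch(batch_size):
    """Draw a random mini-batch of (images, one-hot labels)."""
    idx = np.random.randint(0, num_train, batch_size)
    return images[idx], labels[idx]
```

In the real lesson this helper would be replaced by the actual dataset's batching mechanism; only the shapes matter here.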
For the loss function we're going to use cross entropy with logits (we have to specify the logits and the labels), while for optimization we're going to use the Adam optimizer. The learning rate we use for Adam is very small (0.0001), which can also be passed as the argument 1e-4 (the two notations mean the same value).
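The loss and optimizer setup can be sketched as follows. This is a hedged, self-contained illustration: a single dense layer stands in for the CNN built in the earlier lesson (so the snippet runs on its own), and the `x`, `y`, `logits` names are assumptions, not the series' actual identifiers.

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

x = tf.placeholder(tf.float32, shape=[None, 784])  # flattened input images
y = tf.placeholder(tf.float32, shape=[None, 10])   # one-hot labels

# Stand-in for the previously built CNN; its output layer must NOT apply
# softmax, because the loss op below applies it internally.
W = tf.Variable(tf.truncated_normal([784, 10], stddev=0.1))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b

# Cross entropy with logits: we pass the raw logits and the labels.
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))

# Adam with a small learning rate; 0.0001 and 1e-4 are the same number.
train_step = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(loss)
```

Passing raw logits (rather than softmax probabilities) to the loss op is both required for correctness and more numerically stable.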
Then we're going to define the prediction and the accuracy, which will help us track the model's performance as it trains. Finally, we'll create a session and execute the graph we've built in order to train the model.
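The steps above can be sketched end to end as a small training session. This is a self-contained illustration under the same assumptions as before: random stand-in batches and a single dense layer replace the real dataset and the CNN from the earlier lesson, purely so the snippet runs on its own.

```python
import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

x = tf.placeholder(tf.float32, shape=[None, 784])
y = tf.placeholder(tf.float32, shape=[None, 10])
W = tf.Variable(tf.truncated_normal([784, 10], stddev=0.1))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b  # stand-in for the CNN's output

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_step = tf.train.AdamOptimizer(1e-4).minimize(loss)

# Prediction: the class with the largest logit.
# Accuracy: the fraction of predictions that match the true labels.
correct = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        # Stand-in mini-batch; replace with real dataset batches.
        xb = np.random.rand(32, 784).astype("float32")
        yb = np.eye(10, dtype="float32")[np.random.randint(0, 10, 32)]
        sess.run(train_step, feed_dict={x: xb, y: yb})
    acc = sess.run(accuracy, feed_dict={x: xb, y: yb})
```

With random labels the accuracy stays near chance; with the real dataset and CNN, the same loop is where you would watch the loss fall and the accuracy rise.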