The loss function

For our loss function, we use the cross-entropy function. TensorFlow provides many such functions; in this case, we use the sparse_softmax_cross_entropy_with_logits function because our network outputs logits. To this function, we pass the true labels along with the logits, that is, the raw outputs of our network. We then apply the reduce_mean function to this cross-entropy to obtain the loss.

Now, using this cross-entropy, we can calculate the loss as the mean of the per-example cross-entropy vector. So the loss function is simply the mean of the cross-entropy.
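The steps above can be sketched as follows. This is a minimal, self-contained example assuming TensorFlow 2.x with eager execution; the logits and labels here are made-up values for illustration, not taken from the network described in the text:

```python
import tensorflow as tf

# Hypothetical raw outputs (logits) for a batch of 2 examples and 3 classes
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])

# True class indices (sparse labels), one per example
labels = tf.constant([0, 1])

# Per-example cross-entropy: a vector with one value per example
cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)

# The loss is the mean of that vector
loss = tf.reduce_mean(cross_entropy)
print(float(loss))
```

Note that the sparse variant takes integer class indices directly, so the labels do not need to be one-hot encoded, and the function applies the softmax internally, so the logits must not be passed through a softmax first.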