For the loss function, again, we take the logits produced by the DNN and pass them to the softmax_cross_entropy_with_logits function from TensorFlow, along with the true labels. This gives us the per-example cross-entropy, and applying the reduce_mean function to cross_entropy yields the final scalar loss. The following screenshot shows the lines of code that apply reduce_mean to cross_entropy to obtain the loss:
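As a rough sketch of what those lines look like, assuming a TensorFlow 1.x graph-mode setup, the code below computes the cross-entropy from the logits and averages it into a single loss value. The placeholder names and shapes (y_true, logits, 10 classes) are illustrative, not taken from the original code.

```python
import tensorflow as tf

# Illustrative placeholders; in practice `logits` comes from the DNN's
# output layer and `y_true` holds the one-hot encoded true labels.
y_true = tf.placeholder(tf.float32, shape=[None, 10])
logits = tf.placeholder(tf.float32, shape=[None, 10])

# Per-example cross-entropy computed directly from the raw logits,
# so no separate softmax layer is needed before the loss.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y_true,
                                                        logits=logits)

# Average the per-example values over the batch to get the scalar loss.
loss = tf.reduce_mean(cross_entropy)
```

Working from the logits rather than softmax probabilities is the usual choice here, because the combined op is more numerically stable than applying softmax and a log-loss separately.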