Hands On: Achieving 99%

At the beginning of Part II, we set a target for ourselves: 99% accuracy on MNIST. We just came tantalizingly close, reaching 98.6%. That last 0.4%? That's up to you.

I told you that configuring a network can be more art than science, and this Hands On is proof of that. Here's my suggestion: to find better hyperparameters than the ones we have now, drop compare.py. Instead, use neural_network_quieter.py, an alternative version of the network that logs accuracy only once every 10 epochs—it's much faster. Here are a few things that you can try: tweak one hyperparameter at a time, such as the number of hidden nodes, the learning rate, or the batch size; run for more epochs once a configuration looks promising; and keep notes on each run so you don't retry the same combination twice.
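If you'd rather automate that trial-and-error, a simple grid search can loop over hyperparameter combinations and track the best one. This is only a sketch: the grid values below are illustrative, and `evaluate()` is a placeholder you'd replace with a real training run of neural_network_quieter.py (the book doesn't prescribe this approach).

```python
import itertools

# Illustrative search grid -- these names and ranges are examples,
# not the hyperparameters from the book's solution.
grid = {
    "n_hidden_nodes": [100, 200, 400],
    "lr": [0.1, 0.5, 1.0],
    "batch_size": [128, 256, 600],
}

def evaluate(n_hidden_nodes, lr, batch_size):
    """Placeholder for a real training run.

    Replace this stub with code that trains the network using
    neural_network_quieter.py and returns its test accuracy.
    """
    return 0.986  # dummy score so the sketch runs end to end

best_accuracy, best_params = 0.0, None
for combo in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), combo))
    accuracy = evaluate(**params)
    if accuracy > best_accuracy:
        best_accuracy, best_params = accuracy, params

print("Best so far:", best_params, best_accuracy)
```

Grid search is exhaustive, so each extra hyperparameter multiplies the number of runs; with a slow training loop, a coarse grid followed by a finer search around the winner is usually more practical.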

Hitting 99% with our neural network is entirely possible. Happy hunting! Check out the 15_development/solution directory if you want to compare your hyperparameters to the ones that I found.