
Index
  • Acknowledgments
  • How the Heck Is That Possible?
  • About This Book
  • Before We Begin
Part I. From Zero to Image Recognition
1. How Machine Learning Works
  • Programming vs. Machine Learning
  • Supervised Learning
  • The Math Behind the Magic
  • Setting Up Your System
2. Your First Learning Program
  • Getting to Know the Problem
  • Coding Linear Regression
  • Adding a Bias
  • What You Just Learned
  • Hands On: Tweaking the Learning Rate
3. Walking the Gradient
  • Our Algorithm Doesn’t Cut It
  • Gradient Descent
  • What You Just Learned
  • Hands On: Basecamp Overshooting
4. Hyperspace!
  • Adding More Dimensions
  • Matrix Math
  • Upgrading the Learner
  • Bye Bye, Bias
  • A Final Test Drive
  • What You Just Learned
  • Hands On: Field Statistician
5. A Discerning Machine
  • Where Linear Regression Fails
  • Invasion of the Sigmoids
  • Classification in Action
  • What You Just Learned
  • Hands On: Weighty Decisions
6. Getting Real
  • Data Come First
  • Our Own MNIST Library
  • The Real Thing
  • What You Just Learned
  • Hands On: Tricky Digits
7. The Final Challenge
  • Going Multiclass
  • Moment of Truth
  • What You Just Learned
  • Hands On: Minesweeper
8. The Perceptron
  • Enter the Perceptron
  • Assembling Perceptrons
  • Where Perceptrons Fail
  • A Tale of Perceptrons
Part II. Neural Networks
9. Designing the Network
  • Assembling a Neural Network from Perceptrons
  • Enter the Softmax
  • Here’s the Plan
  • What You Just Learned
  • Hands On: Network Adventures
10. Building the Network
  • Coding Forward Propagation
  • Cross Entropy
  • What You Just Learned
  • Hands On: Time Travel Testing
11. Training the Network
  • The Case for Backpropagation
  • From the Chain Rule to Backpropagation
  • Applying Backpropagation
  • Initializing the Weights
  • The Finished Network
  • What You Just Learned
  • Hands On: Starting Off Wrong
12. How Classifiers Work
  • Tracing a Boundary
  • Bending the Boundary
  • What You Just Learned
  • Hands On: Data from Hell
13. Batchin’ Up
  • Learning, Visualized
  • Batch by Batch
  • Understanding Batches
  • What You Just Learned
  • Hands On: The Smallest Batch
14. The Zen of Testing
  • The Threat of Overfitting
  • A Testing Conundrum
  • What You Just Learned
  • Hands On: Thinking About Testing
15. Let’s Do Development
  • Preparing Data
  • Tuning Hyperparameters
  • The Final Test
  • Hands On: Achieving 99%
  • What You Just Learned… and the Road Ahead
Part III. Deep Learning
16. A Deeper Kind of Network
  • The Echidna Dataset
  • Building a Neural Network with Keras
  • Making It Deep
  • What You Just Learned
  • Hands On: Keras Playground
17. Defeating Overfitting
  • Overfitting Explained
  • Regularizing the Model
  • A Regularization Toolbox
  • What You Just Learned
  • Hands On: Keeping It Simple
18. Taming Deep Networks
  • Understanding Activation Functions
  • Beyond the Sigmoid
  • Adding More Tricks to Your Bag
  • What You Just Learned
  • Hands On: The 10 Epochs Challenge
19. Beyond Vanilla Networks
  • The CIFAR-10 Dataset
  • The Building Blocks of CNNs
  • Running on Convolutions
  • What You Just Learned
  • Hands On: Hyperparameters Galore
20. Into the Deep
  • The Rise of Deep Learning
  • Unreasonable Effectiveness
  • Where Now?
  • Your Journey Begins
A1. Just Enough Python
  • What Python Looks Like
  • Python’s Building Blocks
  • Defining and Calling Functions
  • Working with Modules and Packages
  • Creating and Using Objects
  • That’s It, Folks!
A2. The Words of Machine Learning