Index
Preface
  • Objective and Approach
  • Prerequisites
  • Other Resources
  • Conventions Used in This Book
  • Using Code Examples
  • O’Reilly Online Learning
  • How to Contact Us
  • Acknowledgments
I. Introduction to Generative Deep Learning
1. Generative Modeling
What Is Generative Modeling?
  • Generative Versus Discriminative Modeling
  • Advances in Machine Learning
  • The Rise of Generative Modeling
  • The Generative Modeling Framework
Probabilistic Generative Models
  • Hello Wrodl!
  • Your First Probabilistic Generative Model
  • Naive Bayes
  • Hello Wrodl! Continued
The Challenges of Generative Modeling
  • Representation Learning
Setting Up Your Environment
Summary
2. Deep Learning
Structured and Unstructured Data
Deep Neural Networks
  • Keras and TensorFlow
Your First Deep Neural Network
  • Loading the Data
  • Building the Model
  • Compiling the Model
  • Training the Model
  • Evaluating the Model
Improving the Model
  • Convolutional Layers
  • Batch Normalization
  • Dropout Layers
  • Putting It All Together
Summary
3. Variational Autoencoders
The Art Exhibition
Autoencoders
  • Your First Autoencoder
  • The Encoder
  • The Decoder
  • Joining the Encoder to the Decoder
  • Analysis of the Autoencoder
The Variational Art Exhibition
Building a Variational Autoencoder
  • The Encoder
  • The Loss Function
  • Analysis of the Variational Autoencoder
Using VAEs to Generate Faces
  • Training the VAE
  • Analysis of the VAE
  • Generating New Faces
  • Latent Space Arithmetic
  • Morphing Between Faces
Summary
4. Generative Adversarial Networks
Ganimals
Introduction to GANs
Your First GAN
  • The Discriminator
  • The Generator
  • Training the GAN
GAN Challenges
  • Oscillating Loss
  • Mode Collapse
  • Uninformative Loss
  • Hyperparameters
  • Tackling the GAN Challenges
Wasserstein GAN
  • Wasserstein Loss
  • The Lipschitz Constraint
  • Weight Clipping
  • Training the WGAN
  • Analysis of the WGAN
WGAN-GP
  • The Gradient Penalty Loss
  • Analysis of WGAN-GP
Summary
II. Teaching Machines to Paint, Write, Compose, and Play
5. Paint
Apples and Organges
CycleGAN
Your First CycleGAN
  • Overview
  • The Generators (U-Net)
  • The Discriminators
  • Compiling the CycleGAN
  • Training the CycleGAN
  • Analysis of the CycleGAN
Creating a CycleGAN to Paint Like Monet
  • The Generators (ResNet)
  • Analysis of the CycleGAN
Neural Style Transfer
  • Content Loss
  • Style Loss
  • Total Variance Loss
  • Running the Neural Style Transfer
  • Analysis of the Neural Style Transfer Model
Summary
6. Write
The Literary Society for Troublesome Miscreants
Long Short-Term Memory Networks
Your First LSTM Network
  • Tokenization
  • Building the Dataset
  • The LSTM Architecture
  • The Embedding Layer
  • The LSTM Layer
  • The LSTM Cell
Generating New Text
RNN Extensions
  • Stacked Recurrent Networks
  • Gated Recurrent Units
  • Bidirectional Cells
Encoder–Decoder Models
A Question and Answer Generator
  • A Question-Answer Dataset
  • Model Architecture
  • Inference
  • Model Results
Summary
7. Compose
Preliminaries
  • Musical Notation
Your First Music-Generating RNN
  • Attention
  • Building an Attention Mechanism in Keras
  • Analysis of the RNN with Attention
  • Attention in Encoder–Decoder Networks
  • Generating Polyphonic Music
The Musical Organ
Your First MuseGAN
The MuseGAN Generator
  • Chords, Style, Melody, and Groove
    • Chords
    • Style
    • Melody
    • Groove
  • The Bar Generator
  • Putting It All Together
The Critic
Analysis of the MuseGAN
Summary
8. Play
Reinforcement Learning
  • OpenAI Gym
World Model Architecture
  • The Variational Autoencoder
  • The MDN-RNN
  • The Controller
Setup
Training Process Overview
Collecting Random Rollout Data
Training the VAE
  • The VAE Architecture
  • Exploring the VAE
    • The full model
    • The encoder models
    • The decoder model
Collecting Data to Train the RNN
Training the MDN-RNN
  • The MDN-RNN Architecture
  • Sampling the Next z and Reward from the MDN-RNN
  • The MDN-RNN Loss Function
Training the Controller
  • The Controller Architecture
  • CMA-ES
  • Parallelizing CMA-ES
  • Output from the Controller Training
In-Dream Training
  • In-Dream Training the Controller
  • Challenges of In-Dream Training
Summary
9. The Future of Generative Modeling
Five Years of Progress
The Transformer
  • Positional Encoding
  • Multihead Attention
  • The Decoder
  • Analysis of the Transformer
  • BERT
  • GPT-2
  • MuseNet
Advances in Image Generation
  • ProGAN
  • Self-Attention GAN (SAGAN)
  • BigGAN
  • StyleGAN
Applications of Generative Modeling
  • AI Art
  • AI Music
10. Conclusion
Index