
Index
Title Page
Copyright and Credits
Deep Learning By Example
Packt Upsell
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Data Science - A Bird's Eye View
Understanding data science by an example
Design procedure of data science algorithms
Data pre-processing
Data cleaning
Data pre-processing
Feature selection
Model selection
Learning process
Evaluating your model
Getting to learn
Challenges of learning
Feature extraction – feature engineering
Noise
Overfitting
Selection of a machine learning algorithm
Prior knowledge
Missing values
Implementing the fish recognition/detection model
Knowledge base/dataset
Data analysis pre-processing
Model building
Model training and testing
Fish recognition – all together
Different learning types
Supervised learning
Unsupervised learning
Semi-supervised learning
Reinforcement learning
Data size and industry needs
Summary
Data Modeling in Action - The Titanic Example
Linear models for regression
Motivation
Advertising – a financial example
Dependencies
Importing data with pandas
Understanding the advertising data
Data analysis and visualization
Simple regression model
Learning model coefficients
Interpreting model coefficients
Using the model for prediction
Linear models for classification
Classification and logistic regression
Titanic example – model building and training
Data handling and visualization
Data analysis – supervised machine learning
Different types of errors
Apparent (training set) error
Generalization/true error
Summary
Feature Engineering and Model Complexity – The Titanic Example Revisited
Feature engineering
Types of feature engineering
Feature selection
Dimensionality reduction
Feature construction
Titanic example revisited
Missing values
Removing any sample with missing values in it
Missing value imputation
Assigning an average value
Using a regression or another simple model to predict the values of missing variables
Feature transformations
Dummy features
Factorizing
Scaling
Binning
Derived features
Name
Cabin
Ticket
Interaction features
The curse of dimensionality
Avoiding the curse of dimensionality
Titanic example revisited – all together
Bias-variance decomposition
Learning visibility
Breaking the rule of thumb
Summary
Get Up and Running with TensorFlow
TensorFlow installation
TensorFlow GPU installation for Ubuntu 16.04
Installing NVIDIA drivers and CUDA 8
Installing TensorFlow
TensorFlow CPU installation for Ubuntu 16.04
TensorFlow CPU installation for macOS X
TensorFlow GPU/CPU installation for Windows
The TensorFlow environment
Computational graphs
TensorFlow data types, variables, and placeholders
Variables
Placeholders
Mathematical operations
Getting output from TensorFlow
TensorBoard – visualizing learning
Summary
TensorFlow in Action - Some Basic Examples
Capacity of a single neuron
Biological motivation and connections
Activation functions
Sigmoid
Tanh
ReLU
Feed-forward neural network
The need for multilayer networks
Training our MLP – the backpropagation algorithm
Step 1 – forward propagation
Step 2 – backpropagation and weight update
TensorFlow terminologies – recap
Defining multidimensional arrays using TensorFlow
Why tensors?
Variables
Placeholders
Operations
Linear regression model – building and training
Linear regression with TensorFlow
Logistic regression model – building and training
Utilizing logistic regression in TensorFlow
Why use placeholders?
Set model weights and bias
Logistic regression model
Training
Cost function
Summary
Deep Feed-forward Neural Networks - Implementing Digit Classification
Hidden units and architecture design
MNIST dataset analysis
The MNIST data
Digit classification – model building and training
Data analysis
Building the model
Model training
Summary
Introduction to Convolutional Neural Networks
The convolution operation
Motivation
Applications of CNNs
Different layers of CNNs
Input layer
Convolution step
Introducing non-linearity
The pooling step
Fully connected layer
Logits layer
CNN basic example – MNIST digit classification
Building the model
Cost function
Performance measures
Model training
Summary
Object Detection – CIFAR-10 Example
Object detection
CIFAR-10 – modeling, building, and training
Used packages
Loading the CIFAR-10 dataset
Data analysis and preprocessing
Building the network
Model training
Testing the model
Summary
Object Detection – Transfer Learning with CNNs
Transfer learning
The intuition behind TL
Differences between traditional machine learning and TL
CIFAR-10 object detection – revisited
Solution outline
Loading and exploring CIFAR-10
Inception model transfer values
Analysis of transfer values
Model building and training
Summary
Recurrent-Type Neural Networks - Language Modeling
The intuition behind RNNs
Recurrent neural networks architectures
Examples of RNNs
Character-level language models
Language model using Shakespeare data
The vanishing gradient problem
The problem of long-term dependencies
LSTM networks
Why does LSTM work?
Implementation of the language model
Mini-batch generation for training
Building the model
Stacked LSTMs
Model architecture
Inputs
Building an LSTM cell
RNN output
Training loss
Optimizer
Building the network
Model hyperparameters
Training the model
Saving checkpoints
Generating text
Summary
Representation Learning - Implementing Word Embeddings
Introduction to representation learning
Word2Vec
Building the Word2Vec model
A practical example of the skip-gram architecture
Skip-gram Word2Vec implementation
Data analysis and pre-processing
Building the model
Training
Summary
Neural Sentiment Analysis
General sentiment analysis architecture
RNNs – sentiment analysis context
Exploding and vanishing gradients – recap
Sentiment analysis – model implementation
Keras
Data analysis and preprocessing
Building the model
Model training and results analysis
Summary
Autoencoders – Feature Extraction and Denoising
Introduction to autoencoders
Examples of autoencoders
Autoencoder architectures
Compressing the MNIST dataset
The MNIST dataset
Building the model
Model training
Convolutional autoencoder
Dataset
Building the model
Model training
Denoising autoencoders
Building the model
Model training
Applications of autoencoders
Image colorization
More applications
Summary
Generative Adversarial Networks
An intuitive introduction
Simple implementation of GANs
Model inputs
Variable scope
Leaky ReLU
Generator
Discriminator
Building the GAN network
Model hyperparameters
Defining the generator and discriminator
Discriminator and generator losses
Optimizers
Model training
Generator samples from training
Sampling from the generator
Summary
Face Generation and Handling Missing Labels
Face generation
Getting the data
Exploring the data
Building the model
Model inputs
Discriminator
Generator
Model losses
Model optimizer
Training the model
Semi-supervised learning with Generative Adversarial Networks (GANs)
Intuition
Data analysis and preprocessing
Building the model
Model inputs
Generator
Discriminator
Model losses
Model optimizer
Model training
Summary
Implementing Fish Recognition
Code for fish recognition
Other Books You May Enjoy
Leave a review - let other readers know what you think
