Index
Title Page
Copyright and Credits
Python: Advanced Guide to Artificial Intelligence
About Packt
Why subscribe?
Packt.com
Contributors
About the authors
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Conventions used
Get in touch
Reviews
Machine Learning Model Fundamentals
Models and data
Zero-centering and whitening
Training and validation sets
Cross-validation
Features of a machine learning model
Capacity of a model
Vapnik-Chervonenkis capacity
Bias of an estimator
Underfitting
Variance of an estimator
Overfitting
The Cramér-Rao bound
Loss and cost functions
Examples of cost functions
Mean squared error
Huber cost function
Hinge cost function
Categorical cross-entropy
Regularization
Ridge
Lasso
ElasticNet
Early stopping
Summary
Introduction to Semi-Supervised Learning
Semi-supervised scenario
Transductive learning
Inductive learning
Semi-supervised assumptions
Smoothness assumption
Cluster assumption
Manifold assumption
Generative Gaussian mixtures
Example of a generative Gaussian mixture
Weighted log-likelihood
Contrastive pessimistic likelihood estimation
Example of contrastive pessimistic likelihood estimation
Semi-supervised Support Vector Machines (S3VM)
Example of S3VM
Transductive Support Vector Machines (TSVM)
Example of TSVM
Summary
Graph-Based Semi-Supervised Learning
Label propagation
Example of label propagation
Label propagation in Scikit-Learn
Label spreading
Example of label spreading
Label propagation based on Markov random walks
Example of label propagation based on Markov random walks
Manifold learning
Isomap
Example of Isomap
Locally linear embedding
Example of locally linear embedding
Laplacian Spectral Embedding
Example of Laplacian Spectral Embedding
t-SNE
Example of t-distributed stochastic neighbor embedding 
Summary
Bayesian Networks and Hidden Markov Models
Conditional probabilities and Bayes' theorem
Bayesian networks
Sampling from a Bayesian network
Direct sampling
Example of direct sampling
A gentle introduction to Markov chains
Gibbs sampling
Metropolis-Hastings sampling
Example of Metropolis-Hastings sampling
Sampling example using PyMC3
Hidden Markov Models (HMMs)
Forward-backward algorithm
Forward phase
Backward phase
HMM parameter estimation
Example of HMM training with hmmlearn
Viterbi algorithm
Finding the most likely hidden state sequence with hmmlearn
Summary
EM Algorithm and Applications
MLE and MAP learning
EM algorithm
An example of parameter estimation
Gaussian mixture
An example of Gaussian Mixtures using Scikit-Learn
Factor analysis
An example of factor analysis with Scikit-Learn
Principal Component Analysis
An example of PCA with Scikit-Learn
Independent component analysis
An example of FastICA with Scikit-Learn
Addendum to HMMs
Summary
Hebbian Learning and Self-Organizing Maps
Hebb's rule
Analysis of the covariance rule
Example of covariance rule application
Weight vector stabilization and Oja's rule
Sanger's network
Example of Sanger's network
Rubner-Tavan's network
Example of Rubner-Tavan's network
Self-organizing maps
Example of SOM
Summary
Clustering Algorithms
k-Nearest Neighbors
KD Trees
Ball Trees
Example of KNN with Scikit-Learn
K-means
K-means++
Example of K-means with Scikit-Learn
Evaluation metrics
Homogeneity score
Completeness score
Adjusted Rand Index
Silhouette score
Fuzzy C-means
Example of fuzzy C-means with Scikit-Fuzzy
Spectral clustering
Example of spectral clustering with Scikit-Learn
Summary
Advanced Neural Models
Deep convolutional networks
Convolutions
Bidimensional discrete convolutions
Strides and padding
Atrous convolution
Separable convolution
Transpose convolution
Pooling layers
Other useful layers
Examples of deep convolutional networks with Keras
Example of a deep convolutional network with Keras and data augmentation
Recurrent networks
Backpropagation through time (BPTT)
LSTM
GRU
Example of an LSTM network with Keras
Transfer learning
Summary
Classical Machine Learning with TensorFlow
Simple linear regression
Data preparation
Building a simple regression model
Defining the inputs, parameters, and other variables
Defining the model
Defining the loss function
Defining the optimizer function
Training the model
Using the trained model to predict
Multi-regression
Regularized regression
Lasso regularization
Ridge regularization
ElasticNet regularization
Classification using logistic regression
Logistic regression for binary classification
Logistic regression for multiclass classification
Binary classification
Multiclass classification
Summary
Neural Networks and MLP with TensorFlow and Keras
The perceptron
MultiLayer Perceptron
MLP for image classification
TensorFlow-based MLP for MNIST classification
Keras-based MLP for MNIST classification
TFLearn-based MLP for MNIST classification
Summary of MLP with TensorFlow, Keras, and TFLearn
MLP for time series regression
Summary
RNN with TensorFlow and Keras
Simple Recurrent Neural Network
RNN variants
LSTM network
GRU network
TensorFlow for RNN
TensorFlow RNN Cell Classes
TensorFlow RNN Model Construction Classes
TensorFlow RNN Cell Wrapper Classes
Keras for RNN
Application areas of RNNs
RNN in Keras for MNIST data
Summary
CNN with TensorFlow and Keras
Understanding convolution
Understanding pooling
CNN architecture pattern - LeNet
LeNet for MNIST data
LeNet CNN for MNIST with TensorFlow
LeNet CNN for MNIST with Keras
LeNet for CIFAR10 Data
ConvNets for CIFAR10 with TensorFlow
ConvNets for CIFAR10 with Keras
Summary
Autoencoder with TensorFlow and Keras
Autoencoder types
Stacked autoencoder in TensorFlow
Stacked autoencoder in Keras
Denoising autoencoder in TensorFlow
Denoising autoencoder in Keras
Variational autoencoder in TensorFlow
Variational autoencoder in Keras
Summary
TensorFlow Models in Production with TF Serving
Saving and Restoring models in TensorFlow
Saving and restoring all graph variables with the saver class
Saving and restoring selected variables with the saver class
Saving and restoring Keras models
TensorFlow Serving
Installing TF Serving
Saving models for TF Serving
Serving models with TF Serving
TF Serving in the Docker containers
Installing Docker
Building a Docker image for TF serving
Serving the model in the Docker container
TensorFlow Serving on Kubernetes
Installing Kubernetes
Uploading the Docker image to Docker Hub
Deploying in Kubernetes
Summary
Deep Reinforcement Learning
OpenAI Gym 101
Applying simple policies to a cartpole game
Reinforcement learning 101
Q function (learning to optimize when the model is not available)
Exploration and exploitation in the RL algorithms
V function (learning to optimize when the model is available)
Reinforcement learning techniques
Naive Neural Network policy for Reinforcement Learning
Implementing Q-Learning
Initializing and discretizing for Q-Learning
Q-Learning with Q-Table
Q-Learning with Q-Network or Deep Q Network (DQN)
Summary
Generative Adversarial Networks
Generative Adversarial Networks 101
Best practices for building and training GANs
Simple GAN with TensorFlow
Simple GAN with Keras
Deep Convolutional GAN with TensorFlow and Keras
Summary
Distributed Models with TensorFlow Clusters
Strategies for distributed execution
TensorFlow clusters
Defining cluster specification
Create the server instances
Define the parameter and operations across servers and devices
Define and train the graph for asynchronous updates
Define and train the graph for synchronous updates
Summary
Debugging TensorFlow Models
Fetching tensor values with tf.Session.run()
Printing tensor values with tf.Print()
Asserting on conditions with tf.Assert()
Debugging with the TensorFlow debugger (tfdbg)
Summary
Tensor Processing Units
Getting Started
Understanding deep learning
Perceptron
Activation functions
Sigmoid
The hyperbolic tangent function
The Rectified Linear Unit (ReLU)
Artificial neural network (ANN)
One-hot encoding
Softmax
Cross-entropy
Dropout
Batch normalization
L1 and L2 regularization
Training neural networks
Backpropagation
Gradient descent
Stochastic gradient descent
Playing with TensorFlow playground
Convolutional neural network
Kernel
Max pooling
Recurrent neural networks (RNN)
Long short-term memory (LSTM)
Deep learning for computer vision
Classification
Detection or localization and segmentation
Similarity learning
Image captioning
Generative models
Video analysis
Development environment setup
Hardware and Operating Systems - OS
General Purpose - Graphics Processing Unit (GP-GPU)
Compute Unified Device Architecture - CUDA
CUDA Deep Neural Network - cuDNN
Installing software packages
Python
Open Computer Vision - OpenCV
The TensorFlow library
Installing TensorFlow
TensorFlow example to print Hello, TensorFlow
TensorFlow example for adding two numbers
TensorBoard
The TensorFlow Serving tool
The Keras library
Summary
Image Classification
The bigger deep learning models
The AlexNet model
The VGG-16 model
The Google Inception-V3 model
The Microsoft ResNet-50 model
The SqueezeNet model
Spatial transformer networks
The DenseNet model
Training a model for cats versus dogs
Preparing the data
Benchmarking with simple CNN
Augmenting the dataset
Augmentation techniques 
Transfer learning or fine-tuning of a model
Training on bottleneck features
Fine-tuning several layers in deep learning
Developing real-world applications
Choosing the right model
Tackling the underfitting and overfitting scenarios
Gender and age detection from face
Fine-tuning apparel models
Brand safety
Summary
Image Retrieval
Understanding visual features
Visualizing activation of deep learning models
Embedding visualization
Guided backpropagation
The DeepDream
Adversarial examples
Model inference
Exporting a model
Serving the trained model
Content-based image retrieval
Building the retrieval pipeline
Extracting bottleneck features for an image
Computing similarity between query image and target database
Efficient retrieval
Matching faster using approximate nearest neighbour
Advantages of ANNOY
Autoencoders of raw images
Denoising using autoencoders
Summary
Object Detection
Detecting objects in an image
Exploring the datasets
ImageNet dataset
PASCAL VOC challenge
COCO object detection challenge
Evaluating datasets using metrics
Intersection over Union
The mean average precision
Localizing algorithms 
Localizing objects using sliding windows
The scale-space concept
Training a fully connected layer as a convolution layer
Convolution implementation of sliding window
Thinking about localization as a regression problem
Applying regression to other problems
Combining regression with the sliding window
Detecting objects
Regions of the convolutional neural network (R-CNN)
Fast R-CNN
Faster R-CNN
Single shot multi-box detector
Object detection API
Installation and setup
Pre-trained models
Re-training object detection models
Data preparation for the Pet dataset
Object detection training pipeline
Training the model
Monitoring loss and accuracy using TensorBoard
Training a pedestrian detection for a self-driving car
The YOLO object detection algorithm
Summary
Semantic Segmentation
Predicting pixels
Diagnosing medical images
Understanding the earth from satellite imagery
Enabling robots to see
Datasets
Algorithms for semantic segmentation
The Fully Convolutional Network
The SegNet architecture
Upsampling the layers by pooling
Sampling the layers by convolution
Skipping connections for better training
Dilated convolutions
DeepLab
RefineNet
PSPnet
Large kernel matters
DeepLab v3
Ultra-nerve segmentation
Segmenting satellite images
Modeling FCN for segmentation
Segmenting instances
Summary
Similarity Learning
Algorithms for similarity learning
Siamese networks
Contrastive loss
FaceNet
Triplet loss
The DeepNet model
DeepRank
Visual recommendation systems
Human face analysis
Face detection
Face landmarks and attributes
The Multi-Task Facial Landmark (MTFL) dataset
The Kaggle keypoint dataset
The Multi-Attribute Facial Landmark (MAFL) dataset
Learning the facial key points
Face recognition
The labeled faces in the wild (LFW) dataset
The YouTube faces dataset
The CelebFaces Attributes dataset (CelebA)
CASIA web face database
The VGGFace2 dataset
Computing the similarity between faces
Finding the optimum threshold
Face clustering 
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think