Index
Title Page
Copyright
Machine Learning for Developers
Credits
Foreword
About the Author
About the Reviewers
www.PacktPub.com
Why subscribe?
Customer Feedback
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Errata
Piracy
Questions
Introduction - Machine Learning and Statistical Science
Machine learning in the bigger picture
Types of machine learning
Grades of supervision
Supervised learning strategies - regression versus classification
Unsupervised problem solving - clustering
Tools of the trade - programming language and libraries
The Python language
The NumPy library
The matplotlib library
What's matplotlib?
Pandas
SciPy
Jupyter notebook
Basic mathematical concepts
Statistics - the basic pillar of modeling uncertainty
Descriptive statistics - main operations
Mean
Variance
Standard deviation
Probability and random variables
Events
Probability
Random variables and distributions
Useful probability distributions
Bernoulli distributions
Uniform distribution
Normal distribution
Logistic distribution
Statistical measures for probability functions
Skewness
Kurtosis
Differential calculus elements
Preliminary knowledge
In search of changes - derivatives
Sliding on the slope
Chain rule
Partial derivatives
Summary
The Learning Process
Understanding the problem
Dataset definition and retrieval
The ETL process
Loading datasets and doing exploratory analysis with SciPy and pandas
Working interactively with IPython
Working on 2D data
Feature engineering
Imputation of missing data
One hot encoding
Dataset preprocessing
Normalization and feature scaling
Normalization or standardization
Model definition
Asking ourselves the right questions
Loss function definition
Model fitting and evaluation
Dataset partitioning
Common training terms - iteration, batch, and epoch
Types of training - online and batch processing
Parameter initialization
Model implementation and results interpretation
Regression metrics
Mean absolute error
Median absolute error
Mean squared error
Classification metrics
Accuracy
Precision score, recall, and F-measure
Confusion matrix
Clustering quality measurements
Silhouette coefficient
Homogeneity, completeness, and V-measure
Summary
References
Clustering
Grouping as a human activity
Automating the clustering process
Finding a common center - K-means
Pros and cons of K-means
K-means algorithm breakdown
K-means implementations
Nearest neighbors
Mechanics of K-NN
Pros and cons of K-NN
K-NN sample implementation
Going beyond the basics
The Elbow method
Summary
References
Linear and Logistic Regression
Regression analysis
Applications of regression
Quantitative versus qualitative variables
Linear regression
Determination of the cost function
The many ways of minimizing errors
Analytical approach
Pros and cons of the analytical approach
Covariance/correlation method
Covariance
Correlation
Searching for the slope and intercept with covariance and correlation
Gradient descent
Some intuitive background
The gradient descent loop
Formalizing our concepts
Expressing recursion as a process
Going practical - new tools for new methods
Useful diagrams for variable explorations - pairplot
Correlation plot
Data exploration and linear regression in practice
The Iris dataset
Getting an intuitive idea with Seaborn pairplot
Creating the prediction function
Defining the error function
Correlation fit
Polynomial regression and an introduction to underfitting and overfitting
Linear regression with gradient descent in practice
Logistic regression
Problem domain of linear regression and logistic regression
Logistic function predecessor - the logit functions
Link function
Logit function
Logit function properties
The importance of the logit inverse
The sigmoid or logistic function
Properties of the logistic function
Multiclass application - softmax regression
Practical example – cardiac disease modeling with logistic regression
The CHDAGE dataset
Dataset format
Summary
References
Neural Networks
History of neural models
The perceptron model
Improving our predictions - the ADALINE algorithm
Similarities and differences between a perceptron and ADALINE
Limitations of early models
Single and multilayer perceptrons
MLP origins
The feedforward mechanism
The chosen optimization algorithm - backpropagation
Types of problem to be tackled
Implementing a simple function with a single-layer perceptron
Defining and graphing transfer function types
Representing and understanding the transfer functions
Sigmoid or logistic function
Playing with the sigmoid
Rectified linear unit or ReLU
Linear transfer function
Defining loss functions for neural networks
L1 versus L2 properties
Summary
References
Convolutional Neural Networks
Origin of convolutional neural networks
Getting started with convolution
Continuous convolution
Discrete convolution
Kernels and convolutions
Stride and padding
Implementing the 2D discrete convolution operation in an example
Subsampling operation (pooling)
Improving efficiency with the dropout operation
Advantages of the dropout layers
Deep neural networks
Deep convolutional network architectures through time
LeNet-5
AlexNet
The VGG model
GoogLeNet and the Inception model
Batch-normalized Inception V2 and V3
Residual Networks (ResNet)
Types of problem solved by deep layers of CNNs
Classification
Detection
Segmentation
Deploying a deep neural network with Keras
Exploring a convolutional model with Quiver
Exploring a convolutional network with Quiver
Implementing transfer learning
References
Summary
Recurrent Neural Networks
Solving problems with order - RNNs
RNN definition
Types of sequence to be modeled
Development of RNN
Training method - backpropagation through time
Main problems of the traditional RNNs - exploding and vanishing gradients
LSTM
The gate and multiplier operation
Part 1 - set values to forget (input gate)
Part 2 - set values to keep
Part 3 - apply changes to cell
Part 4 - output filtered cell state
Univariate time series prediction with energy consumption data
Dataset description and loading
Dataset preprocessing
Summary
References
Recent Models and Developments
GANs
Types of GAN applications
Discriminative and generative models
Reinforcement learning
Markov decision process
Decision elements
Optimizing the Markov process
Basic RL techniques: Q-learning
References
Summary
Software Installation and Configuration
Linux installation
Initial distribution requirements
Installing Anaconda on Linux
pip Linux installation method
Installing the Python 3 interpreter
Installing pip
Installing necessary libraries
macOS X environment installation
Anaconda installation
Installing pip
Installing remaining libraries via pip
Windows installation
Anaconda Windows installation
Summary