Title Page
Copyright and Credits
    Machine Learning Quick Reference
About Packt
    Why subscribe?
    Packt.com
Contributors
    About the author
    About the reviewers
    Packt is searching for authors like you
Preface
    Who this book is for
    What this book covers
    To get the most out of this book
    Download the example code files
    Download the color images
    Conventions used
    Get in touch
    Reviews
Quantifying Learning Algorithms
    Statistical models
    Learning curve
    Machine learning
    Wright's model
    Curve fitting
    Residual
    Statistical modeling – the two cultures of Leo Breiman
    Training data – development data – test data
    Size of the training, development, and test set
    Bias-variance trade-off
    Regularization
    Ridge regression (L2)
    Least absolute shrinkage and selection operator (L1)
    Cross-validation and model selection
    K-fold cross-validation
    Model selection using cross-validation
    0.632 rule in bootstrapping
    Model evaluation
    Confusion matrix
    Receiver operating characteristic curve
    Area under ROC
    H-measure
    Dimensionality reduction
    Summary
Evaluating Kernel Learning
    Introduction to vectors
    Magnitude of the vector
    Dot product
    Linear separability
    Hyperplanes
    SVM
    Support vector
    Kernel trick
    Kernel
    Back to the Kernel trick
    Kernel types
    Linear kernel
    Polynomial kernel
    Gaussian kernel
    SVM example and parameter optimization through grid search
    Summary
Performance in Ensemble Learning
    What is ensemble learning?
    Ensemble methods
    Bootstrapping
    Bagging
    Decision tree
    Tree splitting
    Parameters of tree splitting
    Random forest algorithm
    Case study
    Boosting
    Gradient boosting
    Parameters of gradient boosting
    Summary
Training Neural Networks
    Neural networks
    How a neural network works
    Model initialization
    Loss function
    Optimization
    Computation in neural networks
    Calculation of activation for H1
    Backward propagation
    Activation function
    Types of activation functions
    Network initialization
    Backpropagation
    Overfitting
    Prevention of overfitting in NNs
    Vanishing gradient
    Overcoming the vanishing gradient
    Recurrent neural networks
    Limitations of RNNs
    Use case
    Summary
Time Series Analysis
    Introduction to time series analysis
    White noise
    Detection of white noise in a series
    Random walk
    Autoregression
    Autocorrelation
    Stationarity
    Detection of stationarity
    AR model
    Moving average model
    Autoregressive integrated moving average
    Optimization of parameters
    AR model
    ARIMA model
    Anomaly detection
    Summary
Natural Language Processing
    Text corpus
    Sentences
    Words
    Bag of words
    TF-IDF
    Executing the count vectorizer
    Executing TF-IDF in Python
    Sentiment analysis
    Sentiment classification
    TF-IDF feature extraction
    Count vectorizer (bag of words) feature extraction
    Model building with count vectorization
    Topic modeling
    LDA architecture
    Evaluating the model
    Visualizing the LDA
    The Naive Bayes technique in text classification
    The Bayes theorem
    How the Naive Bayes classifier works
    Summary
Temporal and Sequential Pattern Discovery
    Association rules
    Apriori algorithm
    Finding association rules
    Frequent pattern growth
    Frequent pattern tree growth
    Validation
    Importing the library
    Summary
Probabilistic Graphical Models
    Key concepts
    Bayes rule
    Bayes network
    Probabilities of nodes
    CPT
    Example of the training and test set
    Summary
Selected Topics in Deep Learning
    Deep neural networks
    Why do we need a deep learning model?
    Deep neural network notation
    Forward propagation in a deep network
    Parameters W and b
    Forward and backward propagation
    Error computation
    Backward propagation
    Forward propagation equation
    Backward propagation equation
    Parameters and hyperparameters
    Bias initialization
    Hyperparameters
    Use case – digit recognizer
    Generative adversarial networks
    Hinton's Capsule network
    The Capsule network and convolutional neural networks
    Summary
Causal Inference
    Granger causality
    F-test
    Limitations
    Use case
    Graphical causal models
    Summary
Advanced Methods
    Introduction
    Kernel PCA
    Independent component analysis
    Preprocessing for ICA
    Approach
    Compressed sensing
    Our goal
    Self-organizing maps
    SOM
    Bayesian multiple imputation
    Summary
Other Books You May Enjoy
    Leave a review - let other readers know what you think