Index
Title Page
Copyright and Credits
Mastering Predictive Analytics with scikit-learn and TensorFlow
Packt Upsell
Why subscribe?
Packt.com
Contributors
About the author
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Ensemble Methods for Regression and Classification
Ensemble methods and how they work
Bootstrap sampling
Bagging
Random forests
Boosting
Ensemble methods for regression
The diamond dataset
Training different regression models
KNN model
Bagging model
Random forests model
Boosting model
Using ensemble methods for classification
Predicting a credit card dataset
Training different classification models
Logistic regression model
Bagging model
Random forest model
Boosting model
Summary
Cross-validation and Parameter Tuning
Holdout cross-validation
K-fold cross-validation
Implementing k-fold cross-validation
Comparing models with k-fold cross-validation
Introduction to hyperparameter tuning
Exhaustive grid search
Hyperparameter tuning in scikit-learn
Comparing tuned and untuned models
Summary
Working with Features
Feature selection methods 
Removing dummy features with low variance
Identifying important features statistically
Recursive feature elimination
Dimensionality reduction and PCA
Feature engineering
Creating new features
Improving models with feature engineering
Training your model
Reducible and irreducible error
Summary
Introduction to Artificial Neural Networks and TensorFlow
Introduction to ANNs
Perceptrons
Multilayer perceptron
Elements of a deep neural network model
Deep learning
Elements of an MLP model
Introduction to TensorFlow
TensorFlow installation
Core concepts in TensorFlow
Tensors
Computational graph
Summary
Predictive Analytics with TensorFlow and Deep Neural Networks
Predictions with TensorFlow
Introduction to the MNIST dataset
Building classification models using the MNIST dataset
Elements of the DNN model
Building the DNN
Reading the data
Defining the architecture
Placeholders for inputs and labels
Building the neural network
The loss function
Defining optimizer and training operations
Training strategy and evaluation of classification accuracy
Running the computational graph
Regression with Deep Neural Networks (DNN)
Elements of the DNN model
Building the DNN
Reading the data
Objects for modeling
Training strategy
Input pipeline for the DNN
Defining the architecture
Placeholders for input values and labels
Building the DNN
The loss function
Defining optimizer and training operations
Running the computational graph
Classification with DNNs
Exponential linear unit activation function
Classification with DNNs
Elements of the DNN model
Building the DNN
Reading the data
Producing the objects for modeling
Training strategy
Input pipeline for DNN
Defining the architecture
Placeholders for inputs and labels
Building the neural network
The loss function
Evaluation nodes
Optimizer and the training operation
Run the computational graph
Evaluating the model with a set threshold
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think