With deep learning, rather than starting fresh on every project with costly training, validating and testing, you can use pretrained deep neural network models to:
make new predictions,
continue training them further with new data or
transfer the weights learned by a model for a similar problem into a new model—this is called transfer learning.
Keras comes bundled with the following convnet models, each pretrained on ImageNet, a growing dataset of 14+ million images (a brief usage sketch follows the list):
Xception
VGG16
VGG19
ResNet50
Inception v3
Inception-ResNet v2
MobileNet v1
DenseNet
NASNet
MobileNet v2
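For instance, here's a minimal sketch of using one of these bundled models to make a new prediction on a single image. It assumes the TensorFlow-bundled Keras (tensorflow.keras), chooses ResNet50 arbitrarily, and uses 'elephant.jpg' purely as a placeholder for an image file of your own:

```python
import numpy as np
from tensorflow.keras.applications.resnet50 import (
    ResNet50, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

# load ResNet50 with its ImageNet-pretrained weights
# (the weights download automatically the first time)
model = ResNet50(weights='imagenet')

# load and preprocess one image ('elephant.jpg' is a placeholder file name)
img = image.load_img('elephant.jpg', target_size=(224, 224))
x = image.img_to_array(img)    # array of shape (224, 224, 3)
x = np.expand_dims(x, axis=0)  # add a batch dimension
x = preprocess_input(x)        # ResNet50-specific preprocessing

# predict and display the top three ImageNet classes
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])
```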
ImageNet is too big for efficient training on most computers, so most people interested in using it start with one of the smaller pretrained models.
You can reuse just the architecture of each model and train it with new data, or you can reuse the pretrained weights. For a few simple examples, see:
https://keras.io/applications/
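As a rough illustration of those two options (a sketch under assumptions, not the library documentation's exact example), the following code again assumes tensorflow.keras, picks ResNet50, and supposes your new dataset has 10 classes, an arbitrary choice. Passing weights=None reuses only the architecture; loading the ImageNet weights without the top layer and freezing the base is the transfer-learning approach:

```python
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

# Option 1: reuse only the architecture (random initial weights,
# then train from scratch on your own data)
scratch_model = ResNet50(weights=None, classes=10)

# Option 2: transfer learning; reuse the ImageNet weights as a frozen
# convolutional base and train only a new classification head
base = ResNet50(weights='imagenet', include_top=False,
                input_shape=(224, 224, 3))
base.trainable = False  # freeze the pretrained layers

x = GlobalAveragePooling2D()(base.output)     # pool the base's feature maps
outputs = Dense(10, activation='softmax')(x)  # new head for 10 classes
transfer_model = Model(inputs=base.input, outputs=outputs)

transfer_model.compile(optimizer='adam',
                       loss='categorical_crossentropy',
                       metrics=['accuracy'])
# transfer_model.fit(x_train, y_train, ...)  # then train on your new data
```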
In the end-of-chapter projects, you’ll research and use some of these bundled models. You’ll also investigate the ImageNet Large Scale Visual Recognition Challenge, used for evaluating object-detection and image-recognition models. This competition ran from 2010 through 2017. ImageNet now has a continuously running challenge on the Kaggle competition site called the ImageNet Object Localization Challenge. The goal is to identify “all objects within an image, so those images can then be classified and annotated.” ImageNet releases the current participants’ leaderboard once per quarter.
A lot of what you’ve seen in the machine learning and deep learning chapters is what the Kaggle competition website is all about. There’s no obvious optimal solution for many machine learning and deep learning tasks, so people’s creativity is really the only limit. On Kaggle, companies and organizations fund competitions that encourage people worldwide to develop solutions that perform better than anything the sponsors have been able to produce for problems important to their business or organization. Sometimes companies offer prize money, which was as high as $1,000,000 in the famous Netflix competition. Netflix wanted a 10% or better improvement in their model for determining whether people will like a movie, based on how they rated previous ones. They used the results to help make better recommendations to members. Even if you do not win a Kaggle competition, it’s a great way to get experience working on problems of current interest.