Index
Title Page
Copyright and Credits
Artificial Intelligence By Example
Dedication
Packt Upsell
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Code in Action
Conventions used
Get in touch
Reviews
Become an Adaptive Thinker
Technical requirements
How to be an adaptive thinker
Addressing real-life issues before coding a solution
Step 1 – MDP in natural language
Step 2 – the mathematical representation of the Bellman equation and MDP
From MDP to the Bellman equation
Step 3 – implementing the solution in Python
The lessons of reinforcement learning
How to use the outputs
Machine learning versus traditional applications
Summary
Questions
Further reading
Think like a Machine
Technical requirements
Designing datasets – where the dream stops and the hard work begins
Designing datasets in natural language meetings
Using the McCulloch-Pitts neuron
The McCulloch-Pitts neuron
The architecture of Python TensorFlow
Logistic activation functions and classifiers
Overall architecture
Logistic classifier
Logistic function
Softmax
Summary
Questions
Further reading
Apply Machine Thinking to a Human Problem
Technical requirements
Determining what and how to measure
Convergence
Implicit convergence
Numerical – controlled convergence
Applying machine thinking to a human problem
Evaluating a position in a chess game
Applying the evaluation and convergence process to a business problem
Using supervised learning to evaluate result quality
Summary
Questions
Further reading
Become an Unconventional Innovator
Technical requirements
The XOR limit of the original perceptron
XOR and linearly separable models
Linearly separable models
The XOR limit of a linear model, such as the original perceptron
Building a feedforward neural network from scratch
Step 1 – Defining a feedforward neural network
Step 2 – how two children solve the XOR problem every day
Implementing a vintage XOR solution in Python with an FNN and backpropagation
A simplified version of a cost function and gradient descent
Linear separability was achieved
Applying the FNN XOR solution to a case study to optimize subsets of data
Summary
Questions
Further reading
Manage the Power of Machine Learning and Deep Learning
Technical requirements
Building the architecture of an FNN with TensorFlow
Writing code using the data flow graph as an architectural roadmap
A data flow graph translated into source code
The input data layer
The hidden layer
The output layer
The cost or loss function
Gradient descent and backpropagation
Running the session
Checking linear separability
Using TensorBoard to design the architecture of your machine learning and deep learning solutions
Designing the architecture of the data flow graph
Displaying the data flow graph in TensorBoard
The final source code with TensorFlow and TensorBoard
Using TensorBoard in a corporate environment
Using TensorBoard to explain the concept of classifying customer products to a CEO
Will your views on the project survive this meeting?
Summary
Questions
Further reading
References
Don't Get Lost in Techniques – Focus on Optimizing Your Solutions
Technical requirements
Dataset optimization and control
Designing a dataset and choosing an ML/DL model
Approval of the design matrix
Agreeing on the format of the design matrix
Dimensionality reduction
The volume of a training dataset
Implementing a k-means clustering solution
The vision 
The data
Conditioning management
The strategy
The k-means clustering program
The mathematical definition of k-means clustering
Lloyd's algorithm 
The goal of k-means clustering in this case study
The Python program
1 – The training dataset
2 – Hyperparameters
3 – The k-means clustering algorithm
4 – Defining the result labels
5 – Displaying the results – data points and clusters
Test dataset and prediction
Analyzing and presenting the results
AGV virtual clusters as a solution
Summary
Questions
Further reading
When and How to Use Artificial Intelligence
Technical requirements
Checking whether AI can be avoided
Data volume and applying k-means clustering
Proving your point
NP-hard – the meaning of P
NP-hard – the meaning of non-deterministic
The meaning of hard
Random sampling
The law of large numbers – LLN
The central limit theorem
Using a Monte Carlo estimator
Random sampling applications
Cloud solutions – AWS
Preparing your baseline model
Training the full sample training dataset
Training a random sample of the training dataset
Shuffling as an alternative to random sampling
AWS – data management
Buckets
Uploading files
Access to output results
SageMaker notebook
Creating a job
Running a job
Reading the results
Recommended strategy
Summary
Questions
Further reading
Revolutions Designed for Some Corporations and Disruptive Innovations for Small to Large Companies
Technical requirements
Is AI disruptive?
What is new and what isn't in AI
AI is based on mathematical theories that are not new
Neural networks are not new
Cloud server power, data volumes, and web sharing of the early 21st century started to make AI disruptive
Public awareness contributed to making AI disruptive
Inventions versus innovations
Revolutionary versus disruptive solutions
Where to start?
Discover a world of opportunities with Google Translate
Getting started
The program
The header
Implementing Google's translation service
Google Translate from a linguist's perspective
Playing with the tool
Linguistic assessment of Google Translate
Lexical field theory
Jargon
Translating is not just translating but interpreting
How to check a translation
AI as a new frontier
Lexical field and polysemy
Exploring the frontier – the program
k-nearest neighbor algorithm
The KNN algorithm
The knn_polysemy.py program
Implementing the KNN compressed function in Google_Translate_Customized.py
Conclusions on the Google Translate customized experiment
The disruptive revolutionary loop
Summary
Questions
Further reading
Getting Your Neurons to Work
Technical requirements
Defining a CNN
Defining a CNN
Initializing the CNN
Adding a 2D convolution
Kernel
Intuitive approach
Developers' approach
Mathematical approach
Shape
ReLU
Pooling
Next convolution and pooling layer
Flattening
Dense layers
Dense activation functions
Training a CNN model
The goal
Compiling the model
Loss function
Quadratic loss function
Binary cross-entropy
Adam optimizer
Metrics
Training dataset
Data augmentation
Loading the data
Testing dataset
Data augmentation
Loading the data
Training with the classifier
Saving the model
Next steps
Summary
Questions
Further reading and references
Applying Biomimicking to Artificial Intelligence
Technical requirements
Human biomimicking
TensorFlow, an open source machine learning framework
Does deep learning represent our brain or our mind?
A TensorBoard representation of our mind
Input data
Layer 1 – managing the inputs to the network
Weights, biases, and preactivation
Displaying the details of the activation function through the preactivation process
The activation function of Layer 1
Dropout and Layer 2
Layer 2
Measuring the precision of prediction of a network through accuracy values
Correct prediction accuracy
Cross-entropy
Training
Optimizing speed with Google's Tensor Processing Unit
Summary
Questions
Further reading
Conceptual Representation Learning
Technical requirements
Generate profit with transfer learning
The motivation of transfer learning
Inductive thinking
Inductive abstraction
The problem AI needs to solve
The Γ gap concept
Loading the Keras model after training
Loading the model to optimize training
Loading the model to use it
Using transfer learning to be profitable or see a project stopped
Defining the strategy
Applying the model
Making the model profitable by using it for another problem
Where transfer learning ends and domain learning begins
Domain learning
How to use the programs
The trained models used in this section
The training model program
GAP – loaded or unloaded
GAP – jammed or open lanes
The gap dataset
Generalizing the Γ (gap conceptual dataset)
Generative adversarial networks
Generating conceptual representations
The use of autoencoders
The motivation of conceptual representation learning meta-models
The curse of dimensionality
The blessing of dimensionality
Scheduling and blockchains
Chatbots
Self-driving cars
Summary
Questions
Further reading
Automated Planning and Scheduling
Technical requirements
Planning and scheduling today and tomorrow
A real-time manufacturing process
Amazon must expand its services to face competition
A real-time manufacturing revolution
CRLMM applied to an automated apparel manufacturing process
An apparel manufacturing process
Training the CRLMM
Generalizing the unit-training dataset
Food conveyor belt processing – positive pγ and negative nγ gaps
Apparel conveyor belt processing – undetermined gaps
The beginning of an abstract notion of gaps
Modifying the hyperparameters
Running a prediction program
Building the DQN-CRLMM
A circular process
Implementing a CNN-CRLMM to detect gaps and optimize
Q-Learning – MDP
MDP inputs and outputs
The input is a neutral reward matrix
The standard output of the MDP function
A graph interpretation of the MDP output matrix
The optimizer
The optimizer as a regulator 
Implementing Z – squashing the MDP result matrix
Implementing Z – squashing the vertex weights vector
Finding the main target for the MDP function
Circular DQN-CRLMM – a stream-like system that never starts nor ends
Summary
Questions
Further reading
AI and the Internet of Things (IoT)
Technical requirements
The Iotham City project
Setting up the DQN-CRLMM model
Training the CRLMM
The dataset
Training and testing the model
Classifying the parking lots
Adding an SVM function
Motivation – using an SVM to increase safety levels
Definition of a support vector machine
Python function
Running the CRLMM
Finding a parking space
Deciding how to get to the parking lot
Support vector machine
The itinerary graph
The weight vector
Summary
Questions
Further reading
References
Optimizing Blockchains with AI
Technical requirements
Blockchain technology background
Mining bitcoins
Using cryptocurrency
Using blockchains
Using blockchains in the A-F network
Creating a block
Exploring the blocks
Using naive Bayes in a blockchain process
A naive Bayes example
The blockchain anticipation novelty
The goal
Step 1 – the dataset
Step 2 – frequency
Step 3 – likelihood
Step 4 – naive Bayes equation
Implementation
Gaussian naive Bayes
The Python program
Implementing your ideas
Summary
Questions
Further reading
Cognitive NLP Chatbots
Technical requirements
IBM Watson
Intents
Testing the subsets
Entities
Dialog flow
Scripting and building up the model
Adding services to a chatbot
A cognitive chatbot service
The case study
A cognitive dataset
Cognitive natural language processing
Activating an image + word cognitive chat
Solving the problem
Implementation
Summary
Questions
Further reading
Improve the Emotional Intelligence Deficiencies of Chatbots
Technical requirements
Building a mind
How to read this chapter
The profiling scenario
Restricted Boltzmann Machines
The connections between visible and hidden units
Energy-based models
Gibbs random sampling
Running the epochs and analyzing the results
Sentiment analysis
Parsing the datasets
Conceptual representation learning meta-models
Profiling with images
RNN for data augmentation
RNNs and LSTMs
RNN, LSTM, and vanishing gradients
Prediction as data augmentation
Step 1 – providing an input file
Step 2 – running an RNN
Step 3 – producing data augmentation
Word embedding
The Word2vec model
Principal component analysis
Intuitive explanation
Mathematical explanation
Variance
Covariance
Eigenvalues and eigenvectors
Creating the feature vector
Deriving the dataset
Summing it up
TensorBoard Projector
Using Jacobian matrices
Summary
Questions
Further reading
Quantum Computers That Think
Technical requirements
The rising power of quantum computers
Quantum computer speed
Defining a qubit
Representing a qubit
The position of a qubit 
Radians, degrees, and rotations
Bloch sphere
Composing a quantum score
Quantum gates with Quirk
A quantum computer score with Quirk
A quantum computer score with IBM Q
A thinking quantum computer
Representing our mind's concepts
Expanding MindX's conceptual representations
Concepts in the mind-dataset of MindX
Positive thinking
Negative thinking
Gaps
Distances
The embedding program
The MindX experiment
Preparing the data
Transformation functions – the situation function
Transformation functions – the quantum function
Creating and running the score
Using the output
IBM Watson and scripts
Summary
Questions
Further reading
Answers to the Questions
Chapter 1 – Become an Adaptive Thinker
Chapter 2 – Think like a Machine
Chapter 3 – Apply Machine Thinking to a Human Problem
Chapter 4 – Become an Unconventional Innovator
Chapter 5 – Manage the Power of Machine Learning and Deep Learning
Chapter 6 – Don't Get Lost in Techniques, Focus on Optimizing Your Solutions
Chapter 7 – When and How to Use Artificial Intelligence
Chapter 8 – Revolutions Designed for Some Corporations and Disruptive Innovations for Small to Large Companies
Chapter 9 – Getting Your Neurons to Work
Chapter 10 – Applying Biomimicking to AI
Chapter 11 – Conceptual Representation Learning
Chapter 12 – Automated Planning and Scheduling
Chapter 13 – AI and the Internet of Things
Chapter 14 – Optimizing Blockchains with AI
Chapter 15 – Cognitive NLP Chatbots
Chapter 16 – Improve the Emotional Intelligence Deficiencies of Chatbots
Chapter 17 – Quantum Computers That Think