Neural Network Programming with Java Second Edition
Table of Contents
Neural Network Programming with Java Second Edition
Credits
About the Authors
About the Reviewer
www.PacktPub.com
eBooks, discount offers, and more
Why subscribe?
Customer Feedback
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Errata
Piracy
Questions
1. Getting Started with Neural Networks
Discovering neural networks
Why artificial neural networks?
How neural networks are arranged
The very basic element – artificial neuron
Giving life to neurons – activation function
The flexible values – weights
An extra parameter – bias
The parts forming the whole – layers
Learning about neural network architectures
Monolayer networks
Multilayer networks
Feedforward networks
Feedback networks
From ignorance to knowledge – learning process
Let the coding begin! Neural networks in practice
The Neuron class
The NeuralLayer class
The ActivationFunction interface
The neural network class
Time to play!
Summary
2. Getting Neural Networks to Learn
Learning ability in neural networks
How learning helps solve problems
Learning paradigms
Supervised learning
Unsupervised learning
The learning process
The cost function – finding the way down to the optimum
Learning in progress - weight update
Calculating the cost function
General error and overall error
Can the neural network learn forever? When is it good to stop?
Examples of learning algorithms
The delta rule
The learning rate
Implementing the delta rule
The core of the delta rule learning - train and calcNewWeight methods
Another learning algorithm - Hebbian learning
Adaline
Time to see the learning in practice!
Teaching the neural network – the training dataset
Amazing, it learned! Or, did it really? A further step – testing
Overfitting and overtraining
Summary
3. Perceptrons and Supervised Learning
Supervised learning – teaching the neural net
Classification – finding the appropriate class
Regression – mapping real inputs to outputs
A basic neural architecture – perceptrons
Applications and limitations
Linear separation
The XOR case
Multi-layer perceptrons
MLP properties
MLP weights
Recurrent MLP
Coding an MLP
Learning in MLPs
Backpropagation algorithm
The momentum
Coding the backpropagation
Levenberg-Marquardt algorithm
Coding the Levenberg-Marquardt with matrix algebra
Extreme learning machines
Practical example 1 – the XOR case with delta rule and backpropagation
Practical example 2 – predicting enrolment status
Summary
4. Self-Organizing Maps
Unsupervised learning in neural networks
Unsupervised learning algorithms
Competitive learning
Competitive layer
Kohonen self-organizing maps
Extending the neural network code to Kohonen
Zero-dimensional SOM
One-dimensional SOM
Two-dimensional SOM
2D competitive layer
SOM learning algorithm
Effect of neighboring neurons – the neighborhood function
The learning rate
A new class for competitive learning
Visualizing the SOMs
Plotting 2D training datasets and neuron weights
Testing Kohonen learning
Summary
5. Forecasting Weather
Neural networks for regression problems
Loading/selecting data
Building auxiliary classes
Getting a dataset from a CSV file
Building time series
Dropping NaNs
Getting weather data
Weather variables
Choosing input and output variables
Preprocessing
Normalization
Adapting NeuralDataSet to handle normalization
Adapting the learning algorithm to normalization
Java implementation of weather forecasting
Collecting weather data
Delaying variables
Loading the data and beginning to play!
Let's perform a correlation analysis
Creating neural networks
Training and test
Training the neural network
Plotting the error
Viewing the neural network output
Empirical design of neural networks
Designing experiments
Results and simulations
Summary
6. Classifying Disease Diagnosis
Foundations of classification problems
Categorical data
Working with categorical data
Logistic regression
Multiple classes versus binary classes
Confusion matrix
Sensitivity and specificity
Implementing a confusion matrix
Neural networks for classification
Disease diagnosis with neural networks
Breast cancer
Diabetes
Summary
7. Clustering Customer Profiles
Clustering tasks
Cluster analysis
Cluster evaluation and validation
Implementation
External validation
Applied unsupervised learning
Kohonen neural network
Profiling
Pre-processing
Implementation in Java
Card – credit analysis for customer profiling
Product profiling
How many clusters?
Summary
8. Text Recognition
Pattern recognition
Defined classes
Undefined classes
Neural networks in pattern recognition
Data pre-processing
Text recognition (optical character recognition)
Digit recognition
Digit representation
Implementation in Java
Generating data
Neural architecture
Experiments
Results
Summary
9. Optimizing and Adapting Neural Networks
Common issues in neural network implementations
Input selection
Data correlation
Transforming data
Dimensionality reduction
Data filtering
Cross-validation
Structure selection
Online retraining
Stochastic online learning
Implementation
Application
Adaptive neural networks
Adaptive resonance theory
Implementation
Summary
10. Current Trends in Neural Networks
Deep learning
Deep architectures
How to implement deep learning in Java
Hybrid systems
Neuro-fuzzy
Neuro-genetic
Implementing a hybrid neural network
Summary
A. References
Chapter 1: Getting Started with Neural Networks
Chapter 2: Getting Neural Networks to Learn
Chapter 3: Perceptrons and Supervised Learning
Chapter 4: Self-Organizing Maps
Chapter 5: Forecasting Weather
Chapter 6: Classifying Disease Diagnosis
Chapter 7: Clustering Customer Profiles
Chapter 8: Text Recognition
Chapter 9: Optimizing and Adapting Neural Networks
Chapter 10: Current Trends in Neural Networks
Index