Neural Network Methods in Natural Language Processing
- Authors
- Goldberg, Yoav
- Publisher
- Morgan & Claypool
- Tags
- sequence-to-sequence models, neural networks, computers, deep learning, machine learning, word embeddings, natural language processing, recurrent neural networks, supervised learning, general
- ISBN
- 9781627052986
- Date
- 2017-04-17
- Size
- 3.31 MB
- Language
- en
Table of Contents:
Preface
Acknowledgments
Introduction
Learning Basics and Linear Models
From Linear Models to Multi-layer Perceptrons
Feed-forward Neural Networks
Neural Network Training
Features for Textual Data
Case Studies of NLP Features
From Textual Features to Inputs
Language Modeling
Pre-trained Word Representations
Using Word Embeddings
Case Study: A Feed-forward Architecture for Sentence Meaning Inference
Ngram Detectors: Convolutional Neural Networks
Recurrent Neural Networks: Modeling Sequences and Stacks
Concrete Recurrent Neural Network Architectures
Modeling with Recurrent Networks
Conditioned Generation
Modeling Trees with Recursive Neural Networks
Structured Output Prediction
Cascaded, Multi-task and Semi-supervised Learning
Conclusion
Bibliography
Author's Biography