Neural Network Methods in Natural Language Processing

Table of Contents:

Preface

Acknowledgments

Introduction

Learning Basics and Linear Models

From Linear Models to Multi-layer Perceptrons

Feed-forward Neural Networks

Neural Network Training

Features for Textual Data

Case Studies of NLP Features

From Textual Features to Inputs

Language Modeling

Pre-trained Word Representations

Using Word Embeddings

Case Study: A Feed-forward Architecture for Sentence Meaning Inference

Ngram Detectors: Convolutional Neural Networks

Recurrent Neural Networks: Modeling Sequences and Stacks

Concrete Recurrent Neural Network Architectures

Modeling with Recurrent Networks

Conditioned Generation

Modeling Trees with Recursive Neural Networks

Structured Output Prediction

Cascaded, Multi-task and Semi-supervised Learning

Conclusion

Bibliography

Author's Biography