
Index
Title Page
Copyright and Credits
Hands-On Python Natural Language Processing
About Packt
Why subscribe?
Contributors
About the authors
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Section 1: Introduction
Understanding the Basics of NLP
Programming languages versus natural languages
Chatbots
Sentiment analysis
Machine translation
Named-entity recognition
Summary
NLP Using Python
Technical requirements
Understanding Python with NLP
Important Python libraries
NLTK corpora
Text processing
Part of speech tagging
Sentiment analysis
Machine translation
Part of speech tagging
VADER
Web scraping libraries and methodology
Summary
Section 2: Natural Language Representation and Mathematics
Building Your NLP Vocabulary
Technical requirements
Stemming
Transforming Text into Data Structures
Technical requirements
Word Embeddings and Distance Measurements for Text
Technical requirements
Exploring the components of a Skip-gram model
Output vector
Exploring Sentence-, Document-, and Character-Level Embeddings
Technical requirements
Building a Doc2Vec model
Building a fastText model
Building a spelling corrector/word suggestion module using fastText
Sent2Vec
Section 3: NLP and Learning
Identifying Patterns in Text Using Machine Learning
Technical requirements
Min-max standardization
Z-score standardization
From Human Neurons to Artificial Neurons for Understanding Text
Technical requirements
Exploring the biology behind neural networks
Neurons
Activation functions
Sigmoid
Tanh activation
Rectified linear unit
Layers in an ANN
Dropout
Let's talk Keras
Summary
Applying Convolutions to Text
Technical requirements
What is a CNN?
Understanding convolutions
Let's pad our data
Understanding strides in a CNN
What is pooling?
The fully connected layer
Detecting sarcasm in text using CNNs
Loading the libraries and the dataset
Performing basic data analysis and preprocessing our data
Loading the Word2Vec model and vectorizing our data
Splitting our dataset into train and test sets
Building the model
Evaluating and saving our model
Summary
Capturing Temporal Relationships in Text
Technical requirements
Baby steps toward understanding RNNs
Forward propagation in an RNN
Backpropagation through time in an RNN
Vanishing and exploding gradients
Architectural forms of RNNs
Different flavors of RNN
Carrying relationships both ways using bidirectional RNNs
Going deep with RNNs
Giving memory to our networks – LSTMs
Understanding an LSTM cell
Forget gate
Input gate
Output gate
Backpropagation through time in LSTMs
Building a text generator using LSTMs
Exploring memory-based variants of the RNN architecture
GRUs
Stacked LSTMs
Summary
State of the Art in NLP
Technical requirements
Transformers
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think