Natural Language Processing with PyTorch

English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 2h 58m | 360 MB

This course covers the use of advanced neural network constructs and architectures, such as recurrent neural networks, word embeddings, and bidirectional RNNs, to solve complex word and language modeling problems using PyTorch.

From chatbots to machine-generated literature, some of the hottest applications of ML and AI these days involve data in textual form. In this course, Natural Language Processing with PyTorch, you will gain the ability to design and implement complex text processing models using PyTorch, which is fast emerging as a popular choice for building deep learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs. First, you will learn how to leverage recurrent neural networks (RNNs) to capture sequential relationships within text data. Next, you will discover how to express text using word vector embeddings, a sophisticated form of encoding that is supported out-of-the-box in PyTorch via the torchtext utility. Finally, you will explore how to build complex multi-level RNNs and bidirectional RNNs to capture both backward and forward relationships within data, and you will round out the course by building sequence-to-sequence RNNs for language translation. When you are finished with this course, you will have the skills and knowledge to design and implement complex natural language processing models using sophisticated recurrent neural networks in PyTorch.
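As a taste of what the course covers, here is a minimal sketch of the core building block it uses throughout: a recurrent layer processing a batch of embedded token sequences in PyTorch. All sizes are illustrative, not values from the course.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative dimensions: 4 sequences, 10 tokens each, 32-dim embeddings.
batch_size, seq_len, embed_dim, hidden_dim = 4, 10, 32, 64
rnn = nn.RNN(input_size=embed_dim, hidden_size=hidden_dim, batch_first=True)

x = torch.randn(batch_size, seq_len, embed_dim)  # embedded token sequences
output, h_n = rnn(x)

# output holds the hidden state at every time step;
# h_n is the final hidden state, typically fed to a classifier head.
print(output.shape)  # torch.Size([4, 10, 64])
print(h_n.shape)     # torch.Size([1, 4, 64])
```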

Table of Contents

Course Overview
1 Course Overview

Implementing Recurrent Neural Networks (RNNs) in PyTorch
2 Module Overview
3 Prerequisites and Course Outline
4 RNNs for Natural Language Processing
5 Recurrent Neurons
6 Backpropagation Through Time
7 Coping with Vanishing and Exploding Gradients
8 Long Memory Cells
9 Module Summary
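The ideas in this module can be sketched in a few lines: an LSTM "long memory" cell stepped manually through time (the span that backpropagation through time unrolls over), with gradient clipping as one standard remedy for exploding gradients. The sizes and clipping threshold below are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
cell = nn.LSTMCell(input_size=8, hidden_size=16)

h = torch.zeros(2, 16)  # hidden state, batch of 2
c = torch.zeros(2, 16)  # cell state: the long-range memory
for t in range(5):      # unroll 5 time steps; BPTT spans all of them
    x_t = torch.randn(2, 8)
    h, c = cell(x_t, (h, c))

loss = h.sum()
loss.backward()  # backpropagation through time over the unrolled steps

# One remedy for exploding gradients: clip the total gradient norm.
torch.nn.utils.clip_grad_norm_(cell.parameters(), max_norm=1.0)
total_norm = torch.norm(torch.stack([p.grad.norm() for p in cell.parameters()]))
print(float(total_norm))  # at most 1.0 after clipping
```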

Performing Binary Text Classification Using Words
10 Module Overview
11 Word Embeddings to Represent Text Data
12 Introducing torchtext to Process Text Data
13 Feeding Text Data into RNNs
14 Setup and Data Cleaning
15 Using torchtext to Process Text Data
16 Designing an RNN for Binary Text Classification
17 Training the RNN
18 Using LSTM Cells and Dropout
19 Module Summary
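The shape of the model this module builds can be sketched as follows: embedding, LSTM, dropout, and a single logit for binary classification. The vocabulary size and dimensions are illustrative assumptions, not values from the course.

```python
import torch
import torch.nn as nn

class BinaryTextClassifier(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=50, hidden_dim=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.dropout = nn.Dropout(0.5)           # regularization, as in lesson 18
        self.fc = nn.Linear(hidden_dim, 1)       # one logit: positive vs. negative

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)     # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(embedded)        # final hidden state summarizes text
        return self.fc(self.dropout(h_n[-1]))    # (batch, 1)

torch.manual_seed(0)
model = BinaryTextClassifier()
logits = model(torch.randint(0, 1000, (8, 20)))  # batch of 8 texts, 20 tokens each
print(logits.shape)  # torch.Size([8, 1])
```

Training would pair these logits with `nn.BCEWithLogitsLoss` and an optimizer such as Adam.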

Performing Multi-class Text Classification Using Characters
20 Module Overview
21 Language Prediction Based on Names
22 Loading and Cleaning Data
23 Helper Functions to One Hot Encode Names
24 Designing an RNN for Multi-class Text Classification
25 Predicting Language from Names
26 Module Summary
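A helper of the kind lesson 23 writes can be sketched like this: one-hot encoding each character of a name, one row per character. The alphabet and function name are illustrative; the course's exact implementation may differ.

```python
import string
import torch

ALL_LETTERS = string.ascii_letters  # 52 characters: a-z, A-Z

def name_to_tensor(name: str) -> torch.Tensor:
    """Encode a name as (seq_len, 1, n_letters): one one-hot row per character."""
    tensor = torch.zeros(len(name), 1, len(ALL_LETTERS))
    for i, ch in enumerate(name):
        tensor[i, 0, ALL_LETTERS.index(ch)] = 1.0
    return tensor

t = name_to_tensor("Ada")
print(t.shape)       # torch.Size([3, 1, 52])
print(int(t.sum()))  # 3 -- exactly one hot entry per character
```

Each such tensor can then be fed one character at a time to an RNN whose output layer has one unit per language.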

Performing Sentiment Analysis Using Word Embeddings
27 Module Overview
28 Numeric Representations of Words
29 Word Embeddings Capture Context and Meaning
30 Generating Analogies Using GloVe Embeddings
31 Multilayer RNNs
32 Bidirectional RNNs
33 Data Cleaning and Preparation
34 Designing a Multilayer Bidirectional RNN
35 Performing Sentiment Analysis Using an RNN
36 Module Summary
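The multilayer bidirectional architecture this module designs can be sketched as a two-layer bidirectional LSTM whose forward and backward final states are concatenated before the classifier. Dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=100, hidden_size=64, num_layers=2,
               bidirectional=True, batch_first=True)

x = torch.randn(8, 25, 100)     # batch of 8 sequences of 25 embeddings (e.g. GloVe)
output, (h_n, c_n) = lstm(x)

# output is hidden_size * 2 wide: forward and backward directions concatenated.
print(output.shape)  # torch.Size([8, 25, 128])
# h_n stacks num_layers * num_directions final states.
print(h_n.shape)     # torch.Size([4, 8, 64])

# Concatenate the last layer's forward and backward states for a classifier head:
final = torch.cat([h_n[-2], h_n[-1]], dim=1)  # (8, 128)
```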

Performing Language Translation Using Sequence-to-Sequence Models
37 Module Overview
38 Using Sequences and Vectors with RNNs
39 Language Translation Using Encoders and Decoders
40 Representing Input and Target Sentences
41 Teacher Forcing
42 Setting up Helper Functions for Language Translation
43 Preparing Sentence Pairs
44 Designing the Encoder and Decoder
45 Training the Sequence-to-Sequence Model Using Teacher Forcing
46 Translating Sentences
47 Summary and Further Study
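The encoder-decoder loop with teacher forcing that the final module builds can be sketched as below. All vocabulary sizes, the GRU choice, the 50% forcing ratio, and the use of the first target token as the start token are illustrative assumptions, not course values.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
SRC_VOCAB, TGT_VOCAB, EMBED, HIDDEN = 30, 40, 16, 32

src_embed = nn.Embedding(SRC_VOCAB, EMBED)
encoder = nn.GRU(EMBED, HIDDEN, batch_first=True)
tgt_embed = nn.Embedding(TGT_VOCAB, EMBED)
decoder = nn.GRU(EMBED, HIDDEN, batch_first=True)
out_proj = nn.Linear(HIDDEN, TGT_VOCAB)

src = torch.randint(0, SRC_VOCAB, (2, 7))  # batch of 2 source sentences
tgt = torch.randint(0, TGT_VOCAB, (2, 5))  # ground-truth target sentences

_, hidden = encoder(src_embed(src))        # encoder's final state seeds the decoder

token = tgt[:, :1]                         # stand-in for an <sos> start token
logits_per_step = []
for t in range(1, tgt.size(1)):
    step_out, hidden = decoder(tgt_embed(token), hidden)
    step_logits = out_proj(step_out)       # (2, 1, TGT_VOCAB)
    logits_per_step.append(step_logits)
    # Teacher forcing: with some probability, feed the true next token
    # instead of the model's own prediction.
    use_teacher = torch.rand(1).item() < 0.5
    token = tgt[:, t:t+1] if use_teacher else step_logits.argmax(dim=-1)

logits = torch.cat(logits_per_step, dim=1)  # (2, 4, TGT_VOCAB)
print(logits.shape)
```

In training, these logits would be compared against the shifted target sentence with cross-entropy loss.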