Learn Node.js's most powerful feature for processing data on demand: Streams.
A practical guide to processing large volumes of data on demand, such as video, audio, integrations between systems, and databases, using the power of Node.js Streams.
You will learn, in practice, how to build and test complex projects at scale by:
- Processing terabytes of files;
- Creating E2E and Unit tests when using Node.js Streams;
- Using multiprocesses and multithreading in Node.js;
- Serving and consuming multimedia data on demand, such as video and audio.
Table of Contents
1 WATCH ME FIRST!
2 Meet your Instructor – Who’s Erick Wendel?
4 Project – Using the Observer Pattern in practice – an e-commerce payments use case
6 Buffers: The Key concept behind Streams
7 What are Streams in Practice – Readable, Writable and Transform Streams?
8 What are Duplex Streams – Transform and PassThrough
9 Duplex Streams in practice
10 Project – creating a chat application between servers using the native Node.js net module
11 Understanding the difference between streams API .pipe and pipeline
12 Project – Creating a stream data splitter and converting huge CSV files to NDJSON – PT01
13 Project – Creating a stream data splitter and converting huge CSV files to NDJSON – PT02
14 Async Iterators, Generator Functions, and on-demand processing
15 Working with Streams Operators – Consuming and processing data from SQL Databases on-demand
16 Aborting Async Operations
17 Project – Consuming Web APIs as Node.js Streams
19 WebStreams 101
20 Project – Consuming and producing massive data using Web streams (back + frontend) – PT01
21 Project – Consuming and producing massive data using Web streams (back + frontend) – PT02
23 Processing data in parallel using child processes and Node.js Streams