English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 6h 32m | 4.79 GB

An introduction to the linear algebra behind machine learning models

Linear Algebra for Machine Learning LiveLessons provides you with an understanding of the theory and practice of linear algebra, with a focus on machine learning applications.

About the Instructor

Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the book Deep Learning Illustrated (Addison-Wesley, 2020), an instant #1 bestseller that has been translated into six languages. Jon is renowned for his compelling lectures, which he offers in-person at Columbia University and New York University, as well as online via O’Reilly, YouTube, and the Super Data Science Podcast. Jon holds a PhD from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited over a thousand times.

Learn How To

- Appreciate the role of linear algebra in machine learning and deep learning
- Understand the fundamentals of linear algebra, a ubiquitous approach for solving for unknowns within high-dimensional spaces
- Develop a geometric intuition of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning
- Grasp the details of machine learning papers more readily, along with the other subjects that underlie ML, including calculus, statistics, and optimization algorithms
- Manipulate tensors of all dimensionalities, including scalars, vectors, and matrices, in the leading Python tensor libraries: NumPy, TensorFlow, and PyTorch
- Reduce the dimensionality of complex spaces down to their most informative elements with techniques such as eigendecomposition (eigenvectors and eigenvalues), singular value decomposition, and principal component analysis

Who Should Take This Course

- Users of high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms who would now like to understand the fundamentals underlying the abstractions, enabling them to expand their capabilities
- Software developers who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
- Data scientists who would like to reinforce their understanding of the subjects at the core of their professional discipline
- Data analysts or AI enthusiasts who would like to become data scientists or data/ML engineers and are keen to understand, from the ground up, the field they're entering (very wise!)

Lesson 1: Orientation to Linear Algebra

In Lesson 1, Jon starts with a definition of linear algebra. He then shows you how to use it to solve for unknowns in a system of linear equations. Next, he discusses why linear algebra is so crucial in modern machine learning, including deep learning. Finally, he finishes up with a brief history of algebra and some comprehension exercises.
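The kind of system the lesson solves can also be checked in code. Here is a minimal NumPy sketch with a made-up two-equation system (the specific equations are illustrative, not taken from the course):

```python
import numpy as np

# Hypothetical system (illustrative only):  3x - y = 0  and  2x - y = -1,
# written in matrix form as A @ [x, y] = b
A = np.array([[3., -1.],
              [2., -1.]])
b = np.array([0., -1.])

# Exact solve for a square, full-rank system
solution = np.linalg.solve(A, b)
```

Here `solution` holds the unknowns `[x, y]`; the same idea scales to systems with many more unknowns than could comfortably be solved by hand.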

Lesson 2: Data Structures for Algebra

Lesson 2 focuses on tensors, the fundamental data structure of linear algebra. Jon starts off with zero-dimensional scalar tensors. Then, he covers one-dimensional vector tensors, including the topics of transposition, norms, and unit vectors, as well as the bases for orthogonal and orthonormal vectors. The lesson wraps up with two-dimensional matrix tensors, higher-dimensional n-tensors, and a few exercises.
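The progression from scalars to vectors to matrices can be sketched in a few lines of NumPy (the course also covers the same structures in TensorFlow and PyTorch; the values here are illustrative):

```python
import numpy as np

x_scalar = np.array(25)            # rank-0 tensor: a single number
x_vector = np.array([3., 4.])      # rank-1 tensor: a vector
x_matrix = np.array([[1., 2.],
                     [3., 4.]])    # rank-2 tensor: a matrix

l2_norm = np.linalg.norm(x_vector)  # Euclidean (L2) norm of the vector
unit_vec = x_vector / l2_norm       # same direction, rescaled to norm 1
```

The `ndim` attribute of each array reports its rank, which is what distinguishes scalars, vectors, matrices, and higher-dimensional tensors.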

Lesson 3: Common Tensor Operations

Lesson 3 is about common tensor operations, including transposition, basic tensor arithmetic, reduction, and the dot product. It finishes up with exercises.
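Each of the four operations named above has a direct NumPy counterpart; this sketch uses small illustrative arrays rather than anything from the course:

```python
import numpy as np

X = np.array([[1., 2.],
              [3., 4.]])

X_T = X.T                          # transposition: rows become columns
doubled = X * 2                    # basic tensor arithmetic (element-wise)
total = X.sum()                    # reduction: collapse all elements to a scalar
dot = np.dot([1., 2.], [3., 4.])   # dot product: 1*3 + 2*4
```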

Lesson 4: Solving Linear Systems

In Lesson 4 you take a brief break from hands-on code demos to learn how to solve systems of linear equations by hand. The focus is on substitution and elimination strategies. The lesson finishes with exercises to reinforce those concepts.
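Although the lesson itself works by hand, the elimination strategy can be traced step by step in code. This sketch applies one row operation and then back-substitutes, using a made-up system (not one of the course's exercises):

```python
import numpy as np

# Hypothetical system (illustrative only):  x + y = 6  and  2x + y = 8
A = np.array([[1., 1.],
              [2., 1.]])
b = np.array([6., 8.])

# Elimination: R2 <- R2 - 2*R1 zeroes out the x term in the second row
A[1] = A[1] - 2 * A[0]
b[1] = b[1] - 2 * b[0]

# Back-substitution: solve the second row for y, then the first row for x
y = b[1] / A[1, 1]
x = (b[0] - A[0, 1] * y) / A[0, 0]
```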

Lesson 5: Matrix Multiplication

Lesson 5 is about matrix multiplication. Matrix-by-vector multiplication is covered first, followed by matrix-by-matrix multiplication. Next, the concepts of symmetric and identity matrices are discussed, followed by exercises that show their relevance to matrix multiplication. Finally, Jon wraps up with an explanation of the critical role of matrix multiplication in machine learning and deep learning applications.
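The lesson's sequence, matrix-by-vector, then matrix-by-matrix, then symmetric and identity matrices, maps cleanly onto NumPy's `@` operator (the arrays here are illustrative):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
v = np.array([5., 6.])
I = np.eye(2)        # 2x2 identity matrix

Av = A @ v           # matrix-by-vector multiplication
S = A @ A.T          # matrix-by-matrix; A @ A.T is always symmetric
AI = A @ I           # multiplying by the identity leaves A unchanged
```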

Lesson 6: Special Matrices and Matrix Operations

Lesson 6 covers a number of special matrices and special matrix operations that are essential to machine learning. These include the Frobenius norm, matrix inversion, diagonal matrices, orthogonal matrices, and the trace operator.

Lesson 7: Eigenvectors and Eigenvalues

Lesson 7 begins with Jon discussing what the eigenconcept is all about. He follows this with some exercises to warm you up for playing around with eigenvectors in Python, including high-dimensional eigenvectors.
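The defining property the lesson builds on is that an eigenvector `v` of a matrix `A` satisfies `A @ v = lambda * v`, where `lambda` is the corresponding eigenvalue. That property can be verified directly in NumPy (the matrix here is illustrative):

```python
import numpy as np

A = np.array([[2., 0.],
              [0., 3.]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` pairs with the eigenvalue at the same index
v = eigenvectors[:, 0]
lam = eigenvalues[0]
```

Checking `A @ v` against `lam * v` confirms the eigenvector property, and the same call works unchanged for high-dimensional matrices.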

Lesson 8: Matrix Determinants and Decomposition

Jon begins Lesson 8 by illustrating how to calculate the determinant of a 2 x 2 matrix as well as the determinant for larger matrices. This prepares you for being able to work on some exercises on determinants on your own. The second half of the lesson discusses the relationship between determinants and eigenvalues and provides an overview of the broad range of eigendecomposition applications in the real world.
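For a 2 x 2 matrix the determinant is ad − bc, and the relationship discussed in the lesson's second half, that the determinant equals the product of the eigenvalues, can be confirmed numerically (the matrix is illustrative):

```python
import numpy as np

A = np.array([[4., 2.],
              [1., 3.]])

det = np.linalg.det(A)               # ad - bc = 4*3 - 2*1
eigenvalues = np.linalg.eig(A)[0]
eig_product = np.prod(eigenvalues)   # should match the determinant
```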

Lesson 9: Machine Learning with Linear Algebra

In Lesson 9, Jon helps you tie together many of the concepts introduced previously to power several useful machine learning applications. You learn to use singular value decomposition to compress a media file, the Moore-Penrose pseudoinverse to perform regression, and principal component analysis to break a dataset down into its most influential components. Finally, Jon provides you with resources for further study of linear algebra.
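The regression-via-pseudoinverse idea can be sketched in a few lines of NumPy. The data below are made up for illustration (a noiseless line, not the course's dataset):

```python
import numpy as np

# Hypothetical noiseless data generated by y = 2x + 1
x = np.array([0., 1., 2., 3.])
y = 2 * x + 1

# Design matrix: a column of ones (for the intercept) plus the feature column
X = np.column_stack([np.ones_like(x), x])

# Moore-Penrose pseudoinverse gives the least-squares weights directly
w = np.linalg.pinv(X) @ y            # [intercept, slope]

# SVD underlies both the pseudoinverse and PCA
U, s, Vt = np.linalg.svd(X, full_matrices=False)
```

With noiseless data the recovered weights reproduce the generating intercept and slope exactly; with noisy data the same expression yields the least-squares fit.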

**Table of Contents**

Linear Algebra for Machine Learning (Machine Learning Foundations) – Introduction

Lesson 1: Orientation to Linear Algebra
Topics
1.1 Defining Linear Algebra
1.2 Solving a System of Equations Algebraically
1.3 Linear Algebra in Machine Learning and Deep Learning
1.4 Historical and Contemporary Applications
1.5 Exercise

Lesson 2: Data Structures for Algebra
Topics
2.1 Tensors
2.2 Scalars
2.3 Vectors and Vector Transposition
2.4 Norms and Unit Vectors
2.5 Basis, Orthogonal, and Orthonormal Vectors
2.6 Matrices
2.7 Generic Tensor Notation
2.8 Exercises

Lesson 3: Common Tensor Operations
Topics
3.1 Tensor Transposition
3.2 Basic Tensor Arithmetic
3.3 Reduction
3.4 The Dot Product
3.5 Exercises

Lesson 4: Solving Linear Systems
Topics
4.1 The Substitution Strategy
4.2 Substitution Exercises
4.3 The Elimination Strategy
4.4 Elimination Exercises

Lesson 5: Matrix Multiplication
Topics
5.1 Matrix-by-Vector Multiplication
5.2 Matrix-by-Matrix Multiplication
5.3 Symmetric and Identity Matrices
5.4 Exercises
5.5 Machine Learning and Deep Learning Applications

Lesson 6: Special Matrices and Matrix Operations
Topics
6.1 The Frobenius Norm
6.2 Matrix Inversion
6.3 Diagonal Matrices
6.4 Orthogonal Matrices
6.5 The Trace Operator

Lesson 7: Eigenvectors and Eigenvalues
Topics
7.1 The Eigenconcept
7.2 Exercises
7.3 Eigenvectors in Python
7.4 High-Dimensional Eigenvectors

Lesson 8: Matrix Determinants and Decomposition
Topics
8.1 The Determinant of a 2 x 2 Matrix
8.2 The Determinants of Larger Matrices
8.3 Exercises
8.4 Determinants and Eigenvalues
8.5 Eigendecomposition

Lesson 9: Machine Learning with Linear Algebra
Topics
9.1 Singular Value Decomposition
9.2 Media File Compression
9.3 The Moore-Penrose Pseudoinverse
9.4 Regression via Pseudoinversion
9.5 Principal Component Analysis
9.6 Resources for Further Study of Linear Algebra

Linear Algebra for Machine Learning (Machine Learning Foundations) – Summary
