Zach Star echoes my thoughts with this video on what he wishes his teachers told him about matrices.

Given the relevance of linear algebra to quantum computing, this could not be more timely.

With the rise of deep learning and quantum computing, there has never been a better time to learn linear algebra.

Linear algebra is central to almost all areas of mathematics. For instance, it is fundamental in modern presentations of geometry, including the definitions of basic objects such as lines, planes, and rotations. Functional analysis may likewise be viewed as the application of linear algebra to spaces of functions. Linear algebra is also used in most sciences and engineering fields, because it allows many natural phenomena to be modeled, and computations on those models to be carried out efficiently.
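To make the point about rotations concrete, here is a minimal sketch (my own illustration, not from the post) of a rotation as a linear map: a 2-D point is rotated by multiplying it by a 2×2 rotation matrix.

```python
import math

def rotate(point, angle_rad):
    """Apply the 2x2 rotation matrix [[cos, -sin], [sin, cos]] to a point."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    x, y = point
    return (c * x - s * y, s * x + c * y)

# Rotating (1, 0) by 90 degrees lands on (0, 1), up to floating-point error.
rotated = rotate((1.0, 0.0), math.pi / 2)
```

Because rotation is linear, rotating a sum of vectors gives the same result as summing the rotated vectors, which is exactly why matrices can represent it.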

Lesson Plan

- (0:00) Systems of Linear Equations (1 of 3)
- (16:20) Systems of Linear Equations (2 of 3)
- (27:55) Systems of Linear Equations (3 of 3)
- (47:18) Row Reduction and Echelon Forms (1 of 2)
- (54:49) Row Reduction and Echelon Forms (2 of 2)
- (1:04:10) Vector Equations (1 of 2)
- (1:14:05) Vector Equations (2 of 2)
- (1:24:54) The Matrix Equation Ax = b (1 of 2)
- (1:39:21) The Matrix Equation Ax = b (2 of 2)
- (1:44:48) Solution Sets of Linear Systems
- (1:57:49) Linear Independence
- (2:11:20) Linear Transformations (1 of 2)
- (2:25:10) Linear Transformations (2 of 2)
- (2:39:19) Matrix Operations
- (2:56:24) Matrix Inverse
- (3:12:17) Invertible Matrix Properties
- (3:24:24) Determinants (1 of 2)
- (3:44:40) Determinants (2 of 2)
- (4:04:28) Cramer’s Rule
- (4:18:20) Vector Spaces and Subspaces (1 of 2)
- (4:48:30) Vector Spaces and Subspaces (2 of 2)
- (5:13:13) Null Spaces, Column Spaces, and Linear Transformations
- (5:33:25) Basis of a Vector Space
- (5:59:43) Coordinate Systems in a Vector Space
- (6:15:41) Dimension of a Vector Space
- (6:26:35) Rank of a Matrix
- (6:50:09) Markov Chains
- (7:09:23) Eigenvalues and Eigenvectors
- (7:32:03) Matrix Diagonalization
- (7:49:08) Inner Product, Vector Length, Orthogonality

Derek Banas provides this tutorial on Linear Algebra, which has applications in quantum computing.

It’s always a good time to learn linear algebra.

Fortunately, Zach Star has a great visualization on how it all works.

Lex Fridman interviews Grant Sanderson, a math educator and the creator of 3Blue1Brown, a popular YouTube channel that uses programmatically animated visualizations to explain concepts in linear algebra, calculus, and other fields of mathematics.

OUTLINE:

- (0:00) Introduction
- (1:56) What kind of math would aliens have?
- (3:48) Euler’s identity and the least favorite piece of notation
- (10:31) Is math discovered or invented?
- (14:30) Difference between physics and math
- (17:24) Why is reality compressible into simple equations?
- (21:44) Are we living in a simulation?
- (26:27) Infinity and abstractions
- (35:48) Most beautiful idea in mathematics
- (41:32) Favorite video to create
- (45:04) Video creation process
- (50:04) Euler identity
- (51:47) Mortality and meaning
- (55:16) How do you know when a video is done?
- (56:18) What is the best way to learn math for beginners?
- (59:17) Happy moment

Lex Fridman interviews Gilbert Strang on Linear Algebra, Deep Learning, Teaching, and MIT OpenCourseWare.

Gilbert Strang is a professor of mathematics at MIT and perhaps one of the most famous and impactful teachers of math in the world. His MIT OpenCourseWare lectures on linear algebra have been viewed millions of times. This conversation is part of the Artificial Intelligence podcast.

A recent post on the math needed to do machine learning got me thinking and, when I get to thinking, I get to searching. I found this course on YouTube on Linear Algebra. In it, you’ll learn what linear algebra is and how it relates to vectors and matrices. Then you’ll look at what vectors and matrices are and how to work with them, including the knotty problem of eigenvalues and eigenvectors, and how to use them to solve problems. Finally, you’ll learn how to use these tools to do fun things with datasets, like rotating images of faces and extracting eigenvectors to see how the PageRank algorithm works.
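The PageRank connection is a nice motivator: the ranks are the dominant eigenvector of a link matrix, which power iteration finds by repeated matrix–vector multiplication. Here is a minimal sketch on a hypothetical three-page graph (the pages and link structure are my own example, not from the course):

```python
def power_iteration(matrix, steps=100):
    """Repeatedly apply `matrix` to a uniform start vector and renormalize,
    converging to the dominant eigenvector (the PageRank scores)."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(steps):
        # Matrix-vector product: w = matrix @ v
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        v = [x / total for x in w]
    return v

# Column j holds the probability of following a link from page j.
# Pages: A links to B and C; B links to C; C links to A.
link_matrix = [
    [0.0, 0.0, 1.0],  # inbound links to A
    [0.5, 0.0, 0.0],  # inbound links to B
    [0.5, 1.0, 0.0],  # inbound links to C
]

ranks = power_iteration(link_matrix)
# The stationary ranks work out to A: 0.4, B: 0.2, C: 0.4.
```

A and C tie for the top rank because each receives a full unit of inbound probability, while B only receives half of A’s.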

Towards the end of the course, you’ll encounter Jupyter notebooks and write code blocks in Python.