I think I'm basically going to follow this course.
Course page: http://web.mit.edu/18.06/www/index.shtml. The 2015 offering also has programming assignments in Julia, which looks interesting, so I'll use those alongside the lectures.
09/09 The Geometry of Linear Equations (1.1-2.1)
09/11 Elimination with Matrices (2.2-2.3)
09/14 Matrix Operations and Inverses (2.4-2.5)
There are 40 sections in total, and I expect each one to take at least 2 hours to read, so that is at least 80 hours of study time. Ideally spread over 30 days at 2-3 hours a day, which would let me finish by mid-January? I also want to work through Coding the Matrix on the side to practice Python.
The Geometry of Linear Equations
An overview of ideas. Questions:
A major application of linear algebra is to solving systems of linear equations. This lecture presents three ways of thinking about these systems. The "row method" focuses on the individual equations, the "column method" focuses on combining the columns, and the "matrix method" is an even more compact and powerful way of describing systems of linear equations.
Lecture 1 PDF. In this section we get to know linear algebra by working through systems of linear equations.
1.1 $w_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, $w_2 = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix}$, $w_3 = \begin{bmatrix} 7 \\ 8 \\ 9 \end{bmatrix}$. What can this set of vectors span?
A: a plane, since the vectors are dependent ($w_2 = \frac{1}{2}(w_1 + w_3)$), so they cannot fill all of $R^3$.
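A quick numerical check (my own, not from the book) using numpy: the matrix with $w_1, w_2, w_3$ as columns has rank 2, so the three vectors only span a plane.

```python
import numpy as np

# Columns are w1, w2, w3 from question 1.1.
W = np.array([[1, 4, 7],
              [2, 5, 8],
              [3, 6, 9]])

# Rank 2 (not 3): the columns are dependent (w2 = (w1 + w3) / 2),
# so their span is a plane inside R^3, not all of R^3.
print(np.linalg.matrix_rank(W))  # 2
```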
1.2 Compute $\begin{bmatrix} 1 & 2 & 0 \\ 2 & 0 & 3 \\ 4 & 1 & 1 \end{bmatrix} \begin{bmatrix} 3 \\ 2 \\ 1 \end{bmatrix}$
A: $\begin{bmatrix} 7 \\ 9 \\ 15 \end{bmatrix}$
1.3 Trivial... I'll come back and write this up once I figure out what to say.
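A small sketch (mine) of the row picture versus the column picture from the lecture, applied to question 1.2: each entry of $Ax$ is a row of $A$ dotted with $x$, and the same $Ax$ is a combination of the columns of $A$ weighted by the entries of $x$.

```python
import numpy as np

A = np.array([[1, 2, 0],
              [2, 0, 3],
              [4, 1, 1]])
x = np.array([3, 2, 1])

# Row picture: entry i of Ax is (row i of A) . x
row_picture = np.array([A[i] @ x for i in range(3)])

# Column picture: Ax is 3*(column 1) + 2*(column 2) + 1*(column 3)
col_picture = 3 * A[:, 0] + 2 * A[:, 1] + 1 * A[:, 2]

print(row_picture)   # [ 7  9 15]
print(col_picture)   # [ 7  9 15]
print(A @ x)         # same answer either way
```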
Professor Strang recommends this video from his Computational Science and Engineering I course (18.085) as an overview of the basics of linear algebra.
This session introduces the method of elimination, an essential tool for working with matrices. The method follows a simple algorithm. To help make sense of material presented later, we describe this algorithm in terms of matrix multiplication.
invertible: for $Ax = b$ to have a unique solution for every $b$, $A$ must be invertible.
pivot: the $i$-th entry of row $i$ after elimination; once the matrix has been reduced to an upper triangle, every pivot must be nonzero so that back substitution can recover the solution.
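A minimal sketch of elimination followed by back substitution, assuming no row exchanges are ever needed (every pivot is nonzero, as required above); `solve_by_elimination` is just a name I made up, and the 3×3 system is a small example.

```python
import numpy as np

def solve_by_elimination(A, b):
    """Forward elimination to an upper triangle U, then back substitution.
    Assumes every pivot is nonzero, so no row exchanges are needed."""
    U = A.astype(float).copy()
    c = b.astype(float).copy()
    n = len(b)
    for i in range(n):                 # pivot row i
        for j in range(i + 1, n):      # kill the entries below the pivot
            m = U[j, i] / U[i, i]      # multiplier l_ji
            U[j, i:] -= m * U[i, i:]
            c[j] -= m * c[i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):     # back substitution, bottom row up
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])
b = np.array([2, 12, 2])
print(solve_by_elimination(A, b))      # [ 2.  1. -2.], same as np.linalg.solve(A, b)
```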
This lecture looks at matrix multiplication from five different points of view. We then learn how to find the inverse of a matrix using elimination, and why the Gauss-Jordan method works.
If A is a square matrix, the most important question you can ask about it is whether it has an inverse $A^{-1}$. If it does, then $A^{-1}A = I = AA^{-1}$ and we say that A is invertible or nonsingular.
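A sketch of the Gauss-Jordan idea using sympy (my choice of tool, for exact arithmetic): row-reduce the augmented matrix $[A \mid I]$; when the left block becomes $I$, the right block is $A^{-1}$.

```python
from sympy import Matrix, eye

# Gauss-Jordan: reduce [A | I] to [I | A^{-1}].
A = Matrix([[1, 3],
            [2, 7]])
augmented = A.row_join(eye(2))
R, _ = augmented.rref()        # reduced row echelon form of [A | I]
A_inv = R[:, 2:]               # right block is the inverse

print(A_inv)                   # Matrix([[7, -3], [-2, 1]])
print(A * A_inv == eye(2))     # True: A A^{-1} = I
```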
This session explains inverses, transposes and permutation matrices. We also learn how elimination leads to a useful factorization A = LU and how hard a computer will work to invert a very large matrix.
We carry out the reduction by multiplying $A$ by elimination matrices, $E_{32}E_{31}E_{21}A = U$, and inverting those steps gives the two triangular factors of $A = LU$: $L$ lower triangular, $U$ upper triangular! Ex4.1 Ex4.2 Recitation
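As a check (my own), scipy's LU routine factors a matrix for us; note that `scipy.linalg.lu` uses partial pivoting, so in general it returns a factorization with a permutation, i.e. the $PA = LU$ idea of the next session, rather than a bare $A = LU$.

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])

# scipy factors A = P @ L @ U: P records any row exchanges,
# L is unit lower triangular, U is upper triangular.
P, L, U = lu(A)
print(np.allclose(A, P @ L @ U))   # True
print(L)                           # ones on the diagonal, multipliers below
print(U)                           # the pivots sit on the diagonal
```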
To account for row exchanges in Gaussian elimination, we include a permutation matrix P in the factorization PA = LU. Then we learn about vector spaces and subspaces; these are central to linear algebra.
Transposes, Permutations, Vector Spaces recitation
vector space: given a set of vectors with addition and scalar multiplication defined (satisfying the eight vector space rules), the collection of all vectors produced by linear combinations under those operations is called a vector space.
Counterexample: the points in the first quadrant of the xy-plane are $\neq$ a vector space, because multiplying any vector by a negative scalar takes us outside the quadrant.
subspace: a vector space that is contained inside another vector space. The subspaces of $R^2$: all of $R^2$, any line through the origin, and the zero vector $\{0\}$ alone.
Given a matrix A with columns in $R^3$, those columns and all their linear combinations form a subspace of $R^3$: the column space $C(A)$.
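A small numpy check (mine, with a made-up 3×2 example): $b$ lies in $C(A)$ exactly when appending $b$ as an extra column does not increase the rank.

```python
import numpy as np

A = np.array([[1, 0],
              [2, 1],
              [3, 1]])
b_in  = np.array([1, 3, 4])   # = column1 + column2, so it is in C(A)
b_out = np.array([0, 0, 1])   # not a combination of the columns

def in_column_space(A, b):
    # b is in C(A) iff adding b as a column leaves the rank unchanged.
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_column_space(A, b_in))    # True
print(in_column_space(A, b_out))   # False
```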
The column space of a matrix A tells us when the equation Ax = b will have a solution x. The null space of A tells us which values of x solve the equation Ax = 0.
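sympy can list a basis of the null space directly; the 3×4 matrix below is my working example for this stretch of the course.

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

# N(A) = all solutions of Ax = 0. nullspace() returns a basis of
# "special solutions", one for each free column of A.
for v in A.nullspace():
    print(v.T)     # the special solutions (-2, 1, 0, 0) and (2, 0, -2, 1)
```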
We apply the method of elimination to all matrices, invertible or not. Counting the pivots gives us the rank of the matrix. Further simplifying the matrix puts it in reduced row echelon form R and improves our description of the null space.
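Continuing with the same matrix, rref exposes the pivots: the rank equals the number of pivot columns, and the remaining $n - r$ free columns give the special solutions found above.

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

R, pivot_cols = A.rref()   # reduced row echelon form + pivot column indices
print(R)                   # [[1, 2, 0, -2], [0, 0, 1, 2], [0, 0, 0, 0]]
print(pivot_cols)          # (0, 2): two pivots, so rank(A) = 2
print(A.rank())            # 2; the nullspace therefore has dimension 4 - 2 = 2
```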
We describe all solutions to Ax = b based on the free variables and special solutions encoded in the reduced form R.
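For a right-hand side that actually lies in the column space, sympy's linsolve writes out the complete solution: one particular solution plus free-variable multiples of the special solutions. The vector $b$ below is my own choice, picked so the system is solvable.

```python
from sympy import Matrix, linsolve, symbols

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
b = Matrix([1, 5, 6])          # chosen to lie in C(A), so Ax = b is solvable

x1, x2, x3, x4 = symbols('x1 x2 x3 x4')
# The answer is x_particular plus any combination of the special solutions,
# written with the free variables x2 and x4 as parameters.
print(linsolve((A, b), [x1, x2, x3, x4]))
```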
A basis is a set of vectors, as few as possible, whose combinations produce all vectors in the space. The number of basis vectors for a space equals the dimension of that space.
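Tying back to question 1.1: sympy's columnspace() returns a basis for $C(A)$ (the pivot columns), and the number of basis vectors is the dimension.

```python
from sympy import Matrix

A = Matrix([[1, 4, 7],
            [2, 5, 8],
            [3, 6, 9]])

basis = A.columnspace()   # a basis for C(A): as few columns as possible
print(len(basis))         # 2, so C(A) is a plane, matching question 1.1
print(basis)              # the two pivot columns (1, 2, 3) and (4, 5, 6)
```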
For some vectors b the equation Ax = b has solutions and for others it does not. Some vectors x are solutions to the equation Ax = 0 and some are not. To understand these equations we study the column space, nullspace, row space and left nullspace of the matrix A.
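The four subspace dimensions all come from the rank: $\dim C(A) = r$, $\dim N(A) = n - r$, $\dim C(A^T) = r$, $\dim N(A^T) = m - r$. A quick sympy check on the same 3×4 matrix as above:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
m, n = A.shape                 # m = 3 rows, n = 4 columns
r = A.rank()                   # r = 2

print(len(A.columnspace()))    # dim C(A)   = r     = 2
print(len(A.nullspace()))      # dim N(A)   = n - r = 2
print(len(A.T.columnspace()))  # dim C(A^T) = r     = 2  (row space)
print(len(A.T.nullspace()))    # dim N(A^T) = m - r = 1  (left nullspace)
```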
As we learned last session, vectors don't have to be lists of numbers. In this session we explore important new vector spaces while practicing the skills we learned in the old ones. Then we begin the application of matrices to the study of networks.
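To preview the network application, here is the incidence matrix of a small directed graph I made up (4 nodes, 4 edges): each row is an edge, with -1 at its start node and +1 at its end node. For a connected graph the rank is (number of nodes - 1), and the leftover dependencies count the independent loops.

```python
import numpy as np

# Incidence matrix of a made-up directed graph with 4 nodes and 4 edges:
#   edge 1: node1 -> node2,  edge 2: node2 -> node3,
#   edge 3: node1 -> node3,  edge 4: node3 -> node4
A = np.array([[-1,  1,  0,  0],
              [ 0, -1,  1,  0],
              [-1,  0,  1,  0],
              [ 0,  0, -1,  1]])

r = np.linalg.matrix_rank(A)
print(r)               # 3 = (number of nodes) - 1, since the graph is connected
print(A.shape[0] - r)  # 1 independent loop (edges 1, 2, 3 form the triangle)
```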
In [ ]:
# Definition (echelon form): each pivot sits to the right of the pivot
# in the row above it, and any all-zero rows are at the bottom of the matrix.
By approaching what we've learned from new directions, the questions in this exam review session test the depth of your understanding. Notice the short questions (with answers) at the end. This unit reached the key ideas of subspaces, a higher level of linear algebra.
$u, v, w$ in $R^7$