Engineering Mathematics 2: Linear Algebra (공학수학2_선형대수)



1. 2020-10-27 (Linear Algebra)

Linear algebra is abstract; geometrical representations have their limitations.

Vector Space

Example of vector spaces
- $\mathbb{R}^3$
- $\mathbb{R}^n$

Addition and scalar multiplication are possible within them.


Subspace

Example:
A plane passing through the origin $(0, 0, 0)$ is a subspace of the vector space $\mathbb{R}^3$.

"whatever happens in the subspace stays in the subspace."


Column space of matrix A

Given a matrix A that is part of a system of 3 equations in 2 unknowns:
Ax=b
or
$\begin{bmatrix}1&0\\5&4\\2&4\end{bmatrix}\begin{bmatrix}u\\v\end{bmatrix}=\begin{bmatrix}b_1\\b_2\\b_3\end{bmatrix}$
only a very thin subset of possible b's will satisfy the equation
$u\begin{bmatrix}1\\5\\2\end{bmatrix}+v\begin{bmatrix}0\\4\\4\end{bmatrix}=\begin{bmatrix}b_1\\b_2\\b_3\end{bmatrix}$
(Here, the set of attainable right-hand side vectors $b$ is)
  • all possible combinations of the columns
  • referred to as the column space (a subspace) C(A), a subspace of $\mathbb{R}^m$ (here $\mathbb{R}^3$)

// 열공간,column_space

https://i.imgur.com/3732MTN.png
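A quick numerical sketch (my own check, not from the lecture): $b$ lies in C(A) exactly when appending $b$ as an extra column does not increase the rank.

```python
import numpy as np

A = np.array([[1, 0],
              [5, 4],
              [2, 4]], dtype=float)

def in_column_space(A, b):
    # b is in C(A) iff rank([A | b]) == rank(A)
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

b1 = A @ np.array([2.0, -1.0])   # a combination of the columns -> inside C(A)
b2 = np.array([1.0, 0.0, 0.0])   # an arbitrary b -> outside C(A) here
print(in_column_space(A, b1))    # True
print(in_column_space(A, b2))    # False
```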


Nullspace of matrix A

The nullspace contains all vectors x that give Ax=0

Example
$\begin{bmatrix}1&0&1\\5&4&9\\2&4&6\end{bmatrix} \begin{bmatrix}c\\c\\-c\end{bmatrix} = \begin{bmatrix}0\\0\\0\end{bmatrix}$ ....here, the second matrix (the vector of c's) is the nullspace vector
nullspace is a line
$x=c, y=c, z=-c$
where $c$ is any scalar or number.
That is, the nullspace is the set of x's that satisfy Ax=0 above.
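A small numpy sketch (mine) that recovers this nullspace numerically: the right singular vectors with zero singular value span N(A).

```python
import numpy as np

A = np.array([[1, 0, 1],
              [5, 4, 9],
              [2, 4, 6]], dtype=float)

# Right singular vectors belonging to zero singular values span the nullspace N(A).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))   # 2 here, since column 3 = column 1 + column 2
x = Vt[rank]                    # one basis vector for the one-dimensional nullspace
print(x / x[0])                 # proportional to (1, 1, -1), i.e. x = (c, c, -c)
print(np.round(A @ x, 10))      # (0, 0, 0)
```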


Solving Ax=b

Given Ax=b
Here,
A : usually known
x : what we want to find
b : usually known
(Suppose this describes a physical system for which we are trying to find the values of some components.)

If $AX_p=b$ and $AX_n=0,$
then $A(X_p+X_n)=AX_p+AX_n=b+0=b,$
so $x=X_p+X_n$ also satisfies $Ax=b.$

In the above,
$X_p$ : particular solution
$X_n$ : nullspace vector (any solution of $Ax=0$)
$(X_p+X_n)$ : our complete solution

// 해,solution

https://i.imgur.com/Jzg0HS5.png


Example

Given Ax=b as
$\begin{bmatrix}1&3&3&2\\2&6&9&7\\-1&-3&3&4\end{bmatrix}\begin{bmatrix}u\\v\\w\\y\end{bmatrix}=\begin{bmatrix}1\\5\\5\end{bmatrix}$
// the goal here is to solve for u, v, w, y

We will first re-arrange the above equation into what we call an echelon matrix U.

Forward elimination
// had to capture this; see the screenshot...
https://i.imgur.com/HrtHOSt.png


Approach 1
https://i.imgur.com/I5zgSBq.png


Approach 2
https://i.imgur.com/rLHeB9d.png


https://i.imgur.com/hpAjHxQ.png
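A numerical sketch of the same example (my own check): a particular solution plus any nullspace vector still satisfies Ax=b, which is the $X_p+X_n$ structure above.

```python
import numpy as np

A = np.array([[ 1,  3, 3, 2],
              [ 2,  6, 9, 7],
              [-1, -3, 3, 4]], dtype=float)
b = np.array([1, 5, 5], dtype=float)

# One particular solution (the system is consistent, so lstsq solves it exactly).
x_p, *_ = np.linalg.lstsq(A, b, rcond=None)

# A basis for the nullspace from the SVD (A has rank 2, so N(A) is 2-dimensional).
U, s, Vt = np.linalg.svd(A)
N = Vt[int(np.sum(s > 1e-10)):]          # rows span N(A)

x_n = 3.0 * N[0] - 2.0 * N[1]            # any combination of nullspace vectors
print(np.allclose(A @ x_p, b))           # True: particular solution
print(np.allclose(A @ (x_p + x_n), b))   # True: complete solution X_p + X_n
```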


2. 2020-10-29

Linear Independence, Basis and Dimensions

The numbers m (rows) and n (columns) do not give the true size of a linear system.
- it can have zero rows or columns
- some rows or columns can be combinations of others

Rank
- gives the true size of the linear system
- is the number of pivots in the elimination process
- is the number of genuinely independent rows in matrix A
// 계수,rank

Linear Independence

Given $c_1v_1+c_2v_2+\cdots+c_kv_k=0$

① If the equation can only be satisfied by having $c_1=0,c_2=0,\cdots,c_k=0$ then we say $v_1,v_2,\cdots,v_k$ are linearly independent.

② If any of the coefficients is non-zero, then we say $v_1,v_2,\cdots,v_k$ are linearly dependent.


Example

$A=\begin{bmatrix}1&3&3&2\\2&6&9&5\\-1&-3&3&0\end{bmatrix}$
If we perform $c_2=c_2-3c_1$
$A\rightarrow\begin{bmatrix}1&0&3&2\\2&0&9&5\\-1&0&3&0\end{bmatrix}$ // the columns are linearly dependent
// the second column becomes the zero vector

Ax=0
Here x is in the nullspace of A; N(A) must be {zero vector} if the columns of A are independent.

If we remove the zero column,
$A'=\begin{bmatrix}1&3&2\\2&9&5\\-1&3&0\end{bmatrix},$
$\begin{bmatrix}1&3&2\\2&9&5\\-1&3&0\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}$
still has nonzero solutions: the rank of $A$ is only 2, so even these three columns are linearly dependent (only the two pivot columns are independent).

https://i.imgur.com/q0yZY9a.png
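A quick numpy check (my own sketch) of the rank and the dependence among the columns of this A:

```python
import numpy as np

A = np.array([[ 1,  3, 3, 2],
              [ 2,  6, 9, 5],
              [-1, -3, 3, 0]], dtype=float)

print(np.linalg.matrix_rank(A))              # 2: only two genuinely independent columns
print(np.allclose(A[:, 1], 3 * A[:, 0]))     # True: column 2 = 3 * column 1 (dependence)

# The columns are independent only if Ax = 0 forces x = 0,
# i.e. only if the nullspace is trivial; here it is 2-dimensional.
U, s, Vt = np.linalg.svd(A)
print(A.shape[1] - int(np.sum(s > 1e-10)))   # 2 = n - r, the dimension of N(A)
```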


Spanning a subspace

  • When we say vectors $w_1,w_2,\cdots,w_l$ span the space $V,$ it means vector space $V$ consists of all linear combinations of $w_1,w_2,\cdots,w_l.$
    Every vector in $V$ can be written as $c_1w_1+c_2w_2+\cdots+c_lw_l$
    for some coefficients $c_i$

  • Column space of A is the space spanned by its columns.
  • Row space of A is the space spanned by its rows.

// 생성,span 부분공간,subspace 열공간,column_space 행공간,row_space

Basis for a vector space

  • A basis of space $V$ is a set of vectors where
    - they are linearly independent (not too many vectors)
    - they span the space $V$ (not too few vectors)
  • Every vector in the space $V$ is a unique combination of basis vectors
  • If the columns of a matrix are independent, they are a basis for its column space (and they span it as well)

// 기저,basis

Dimension of a vector space

  • A space has infinitely many different bases(←plural for basis)
  • The number of basis vectors is a property of the space
    (fixed for a given space V)
  • number of vectors in a basis = dimension of the space
  • A basis is
    - a maximal independent set: it cannot be made larger without losing independence
    - a minimal spanning set: it cannot be made smaller and still span the space

// 차원,dimension

Four Fundamental Subspaces

Given an m×n matrix A:
① Column space of A, denoted by C(A) ~ dimension is the rank r
② Nullspace of A, denoted by N(A) ~ dimension is n-r
③ Row space of A is the column space of $A^T$ ~ dimension is r
④ Left nullspace of A is the nullspace of $A^T$, denoted by $N(A^T)$ ~ dimension is m-r
It contains all vectors y such that $A^Ty=0$

Q&A: what is the left nullspace?
In ②, CHK whether the x in Ax=0 is the nullspace.
In ④, y is the left nullspace.
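A small numpy sketch (not from the lecture) verifying the four dimensions r, n-r, r, m-r for the matrix used in the independence example above:

```python
import numpy as np

def nullity(M, tol=1e-10):
    # dimension of the nullspace = columns minus (numerically nonzero) singular values
    s = np.linalg.svd(M, compute_uv=False)
    return M.shape[1] - int(np.sum(s > tol))

A = np.array([[ 1,  3, 3, 2],
              [ 2,  6, 9, 5],
              [-1, -3, 3, 0]], dtype=float)   # m = 3, n = 4
m, n = A.shape
r = np.linalg.matrix_rank(A)

print(r)                            # dim C(A)   = r     = 2
print(nullity(A))                   # dim N(A)   = n - r = 2
print(np.linalg.matrix_rank(A.T))   # dim C(A^T) = r     = 2 (row rank = column rank)
print(nullity(A.T))                 # dim N(A^T) = m - r = 1
```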

① Column space of A

  • Pivot columns of A are a basis for its column space
  • If a set of columns of A is independent, then the corresponding columns of the echelon matrix V are also independent.

Example
$V=\begin{bmatrix}d_1&*&*&*&*&0\\0&0&0&d_2&*&0\\0&0&0&0&0&d_3\\0&0&0&0&0&0\end{bmatrix}$
  • Assume columns 1, 4, 6 are the independent (pivot) columns
  • Columns 1, 4, 6 are then a basis for C(A)

  • Row rank = column rank (important theorem in linear algebra)
  • If the rows of a square matrix are independent, the columns are also independent
// 질문생략

https://i.imgur.com/sfkIdsA.png


Question

Is the dimension of the subspace spanned by the two vectors $(1,2,1)^T$ and $(1,0,0)^T$ equal to two, even though each vector has three components and the plane lies in the 3-dimensional space $\mathbb{R}^3$?
https://i.imgur.com/gzFJKG8.png

// 질문생략

3. Missed the most important class

4. 2020-11-05

https://i.imgur.com/pTwtFRf.png


https://i.imgur.com/fdxjKeB.png


Nullspace of A

$Ax=0,\;x=\begin{bmatrix}c\\c\\c\\c\end{bmatrix}$

Column space of A

Is b in the column space of A? (What values of b will satisfy Ax=b?)
....gave up transcribing; see the captures


https://i.imgur.com/I2aGJrQ.png


https://i.imgur.com/dewmp9e.png


https://i.imgur.com/z3t5RPQ.png


https://i.imgur.com/aMdwzyZ.png


https://i.imgur.com/UQhTlND.png


https://i.imgur.com/IxHTfwb.png


5. 2020-11-10

Linear Transformations
https://i.imgur.com/pLjEokj.png

// 선형변환,linear_transformation

Transformations represented by matrices
https://i.imgur.com/QD74p2V.png


The same transformation can be described by another set of basis vectors
https://i.imgur.com/O5QPzv5.png


Example
// the basis is 다항식,polynomial... calculus done with matrices? $A_{\textrm{diff}}$
https://i.imgur.com/xB5KWXK.png


And we can do the same for integration - integration matrix $A_{\textrm{int}}$
https://i.imgur.com/Bx84XIO.png
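A sketch of the idea in code (my own example, assuming the polynomial basis $\{1, t, t^2, t^3\}$; the lecture's exact $A_{\textrm{diff}}$ may differ): differentiation is linear, so it acts as a matrix on coefficient vectors.

```python
import numpy as np

# Coefficients (a0, a1, a2, a3) represent p(t) = a0 + a1*t + a2*t^2 + a3*t^3.
# d/dt maps it to a1 + 2*a2*t + 3*a3*t^2, which is the matrix below times (a0,...,a3).
A_diff = np.array([[0, 1, 0, 0],
                   [0, 0, 2, 0],
                   [0, 0, 0, 3]], dtype=float)

p = np.array([5, 4, 3, 2], dtype=float)    # p(t) = 5 + 4t + 3t^2 + 2t^3
print(A_diff @ p)                          # [4, 6, 6] -> 4 + 6t + 6t^2 = p'(t)

# Integration (with zero constant term) goes the other way: R^3 -> R^4.
A_int = np.array([[0,   0,   0],
                  [1,   0,   0],
                  [0, 1/2,   0],
                  [0,   0, 1/3]], dtype=float)
print(A_int @ (A_diff @ p))                # [0, 4, 3, 2]: p recovered up to the constant
```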


Transformation of the plane
Stretching
Rotation by 90° (ccw)
https://i.imgur.com/dqlAxfs.png


Reflection (45° line)
Projection (onto x axis) // 사영,projection
https://i.imgur.com/RmTTVQQ.png


Rotation through angle θ
https://i.imgur.com/bL6NqSd.png

// 회전,rotation

Projection onto θ-line
(the θ-line is the x-axis rotated counterclockwise by θ)
https://i.imgur.com/UF8RNRH.png

// 사영,projection

Reflection about mirror θ-line
https://i.imgur.com/QaclZdd.png

// 반사,reflection

https://i.imgur.com/9J5xzRD.png
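A short numerical sketch (mine) of the θ-dependent matrices above: rotation R, projection P onto the θ-line, and reflection H about the θ-line, together with their characteristic properties.

```python
import numpy as np

theta = np.deg2rad(30)
c, s = np.cos(theta), np.sin(theta)

R = np.array([[c, -s],
              [s,  c]])                  # rotation through theta (ccw)
P = np.array([[c*c, c*s],
              [c*s, s*s]])               # projection onto the theta-line
H = 2 * P - np.eye(2)                    # reflection about the theta-line

print(np.allclose(R.T @ R, np.eye(2)))   # rotations preserve length: R^T R = I
print(np.allclose(P @ P, P))             # projecting twice changes nothing: P^2 = P
print(np.allclose(H @ H, np.eye(2)))     # reflecting twice gives the identity: H^2 = I
```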


6. 2020-11-12

Orthogonal vectors and subspaces

Length of vector
Given vector $\vec{x}=( x_1,x_2,\cdots,x_n )$
Length squared is $||\vec{x}||^2=x_1^2+x_2^2+\cdots+x_n^2$
$=\vec{x}{}^T\vec{x}$

Example
Given $\vec{x}=\begin{bmatrix}1\\2\\-3\end{bmatrix}$
the length squared of $\vec{x}$ is $\vec{x}{}^T\vec{x}=[1\;2\;-3]\begin{bmatrix}1\\2\\-3\end{bmatrix}=14$

// $x^T$ is the transpose of x (see 전치행렬,transpose_matrix)

https://i.imgur.com/Rhe5H45.png

// $|x|$ for scalars, $||x||$ for vectors was mentioned. Is that the relation between 절대값,absolute_value and 노름,norm?? CHK

Orthogonal vectors
// 직교성,orthogonality
https://i.imgur.com/bcQxB3T.png


Example

Given vectors
$\vec{v_1}=(\cos\theta,\sin\theta)$
$\vec{v_2}=(-\sin\theta,\cos\theta)$
.......
https://i.imgur.com/9bQnGf9.png

orthogonal unit vectors or orthonormal vectors in $\mathbb{R}^2$
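Filling in the omitted check: the dot product vanishes and both vectors have unit length,
$\vec{v_1}\cdot\vec{v_2}=\cos\theta(-\sin\theta)+\sin\theta\cos\theta=0,\qquad ||\vec{v_1}||=||\vec{v_2}||=\sqrt{\cos^2\theta+\sin^2\theta}=1$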

Orthogonal Subspaces
https://i.imgur.com/2BRuEtl.png


https://i.imgur.com/iOTG1R1.png


https://i.imgur.com/5lQiQm3.png


Orthogonal Complement

The space of all vectors orthogonal to a subspace V of $\mathbb{R}^n$
Notation: $V^{\perp}$ or "V perp" (perp is short for perpendicular)
Example
The nullspace is the orthogonal complement of the row space
Recall
nullspace $N(A)$
row space $= C(A^T)$ (the column space of $A^T$)

$N(A)=\big(C(A^T)\big)^{\perp}$


https://i.imgur.com/NKdo2JA.png


https://i.imgur.com/dTlZ5jU.png



https://i.imgur.com/YXPWXRe.png


Projection onto a line
// 사영,projection 직선,line

Projection of vector $b$ onto the line in the direction of vector $a$
$\vec{p}=\hat{x}\vec{a}=\frac{a^{\top}b}{a^{\top}a}a$
and any vectors a and b satisfy the Schwarz (Cauchy-Schwarz) inequality, which is
$|a^{\top}b| \le ||a|| \, ||b||$
....
https://i.imgur.com/WEdTy5M.png

// 코시-슈바르츠_부등식,Cauchy-Schwartz_inequality 방향,direction

https://i.imgur.com/ZQwqPUD.png


Projection Matrix
https://i.imgur.com/q72ZJyR.png


https://i.imgur.com/W1sHaKy.png


https://i.imgur.com/A6NTTnv.png
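A small numpy sketch (mine, not the lecture's) of the rank-one projection matrix $P=\frac{aa^{\top}}{a^{\top}a}$ onto the line through $a$, checking $P^2=P$, $P^T=P$, and $Pb=\frac{a^{\top}b}{a^{\top}a}a$:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([1.0, 1.0, 1.0])

P = np.outer(a, a) / (a @ a)         # projection matrix onto the line through a
p = P @ b                            # projection of b

print(np.allclose(p, (a @ b) / (a @ a) * a))          # True: p = (a^T b / a^T a) a
print(np.allclose(P @ P, P), np.allclose(P.T, P))     # True True: P^2 = P, P^T = P
print(abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b))  # Schwarz inequality holds
```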


7. 2020-11-17

Projections and Least Squares
https://i.imgur.com/LO93ClW.png


Least Squares Problems with Several Variables
Normal Equation
Best Estimate
https://i.imgur.com/IqHKGxW.png


https://i.imgur.com/zUfBukb.png


Cross-Product Matrix $A^TA$
Projection Matrices
https://i.imgur.com/L6KnioQ.png


Least Square Fitting of Data
https://i.imgur.com/8c65HYX.png


https://i.imgur.com/tgzmcA2.png


https://i.imgur.com/AcNnhBf.png


https://i.imgur.com/acIOHEv.png


Orthogonal Bases and Gram-Schmidt

Recall that orthonormal vectors are orthogonal unit vectors
  • If Q (square or rectangular) has orthonormal columns, then $Q^TQ=I$
  • If Q is a square matrix, it is called an "orthogonal matrix"
    Then $Q^T=Q^{-1}$
  • We will see that orthonormal vectors are very convenient to work with.

https://i.imgur.com/Wm2uK5s.png
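A minimal Gram-Schmidt sketch (my own implementation) that turns independent columns into orthonormal ones and checks $Q^TQ=I$:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed independent), column by column."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]   # remove components along earlier q's
        Q[:, j] = v / np.linalg.norm(v)             # normalize
    return Q

A = np.array([[1.0, 1.0, 2.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))     # True: orthonormal columns
```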


8. 2020-11-19

https://i.imgur.com/8LD1s22.png


Rectangular Matrices with Orthonormal Columns

If Q has orthonormal columns, the least squares problem becomes easy
https://i.imgur.com/qCfSv05.png
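A tiny check (mine) of why orthonormal columns make least squares easy: $Q^TQ=I$, so the normal equations collapse to $\hat{x}=Q^Tb$.

```python
import numpy as np

# Any Q with orthonormal columns will do, e.g. from numpy's QR factorization.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q, R = np.linalg.qr(A)
b = np.array([1.0, 2.0, 3.0])

x_hat = Q.T @ b                                  # least-squares solution in the Q basis
x_ref, *_ = np.linalg.lstsq(Q, b, rcond=None)    # generic least-squares for comparison
print(np.allclose(x_hat, x_ref))                 # True
```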



https://i.imgur.com/yRpUrrJ.png


https://i.imgur.com/xHZSk7s.png


Eigenvalues and Eigenvectors
// 고유값,eigenvalue 고유벡터,eigenvector
https://i.imgur.com/Yo3UfMU.png


https://i.imgur.com/FufFyMZ.png


https://i.imgur.com/2ryE1Bn.png


https://i.imgur.com/GLIh9HY.png


https://i.imgur.com/Eo0YZTa.png



https://i.imgur.com/PyHEsDl.png
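A minimal numpy sketch (my own example matrix) of the defining relation $Av=\lambda v$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)      # columns of eigvecs are the eigenvectors
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True: A v = lambda v for each pair
print(eigvals)                           # the eigenvalues, 5 and 2 for this A
```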

9. 2020-11-24

Solution to the previous exercise:
https://i.imgur.com/dDM6nYX.png


https://i.imgur.com/Jm6fsXk.png


Diagonalization of a Matrix
// 대각화,diagonalization curr goto 대각행렬,diagonal_matrix
https://i.imgur.com/AUHDb9n.png


Diagonalizing matrix S
https://i.imgur.com/RsrHziZ.png


https://i.imgur.com/mIaTuR0.png


https://i.imgur.com/G7dQwhj.png


https://i.imgur.com/HwJwAhd.png


https://i.imgur.com/3sQwMUC.png



https://i.imgur.com/F8L8xvX.png
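A numpy sketch (mine) of $S^{-1}AS=\Lambda$, where the columns of the diagonalizing matrix S are the eigenvectors of A:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)            # S: eigenvectors as columns
Lambda = np.linalg.inv(S) @ A @ S        # S^{-1} A S should be diagonal
print(np.allclose(Lambda, np.diag(eigvals)))   # True

# Diagonalization makes powers easy: A^k = S Lambda^k S^{-1}
k = 3
print(np.allclose(np.linalg.matrix_power(A, k),
                  S @ np.diag(eigvals**k) @ np.linalg.inv(S)))   # True
```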


Orthogonality between two functions
https://i.imgur.com/BkQz0gB.png


https://i.imgur.com/2Btx2Kw.png
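Presumably the function analogue of $x^Ty=0$: the inner product becomes an integral. A worked instance (my example):
$\int_{-\pi}^{\pi}\sin x\,\cos x\,dx=\tfrac{1}{2}\int_{-\pi}^{\pi}\sin 2x\,dx=0,$ so $\sin x$ and $\cos x$ are orthogonal over $[-\pi,\pi]$.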


10. 2020-11-26

Fast Fourier Transform (FFT)
// 푸리에_변환,Fourier_transform
https://i.imgur.com/sD29I1b.png


https://i.imgur.com/ytTetQU.png



Everything below: TBW!!!



11. 2020-12-01 Review 1

Inverse of Matrix
https://i.imgur.com/Z1vHrN0.png
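A short sketch (my own example) computing an inverse numerically and checking $AA^{-1}=I$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

A_inv = np.linalg.inv(A)
print(A_inv)                                   # [[ 3., -1.], [-5.,  2.]] since det(A) = 1
print(np.allclose(A @ A_inv, np.eye(2)))       # True
```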


https://i.imgur.com/uJOdWjW.png


https://i.imgur.com/RP4oTYi.png


https://i.imgur.com/QTx0Ys1.png


https://i.imgur.com/txFRCVy.png


https://i.imgur.com/lkeCOTT.png


https://i.imgur.com/a4gTXIp.png


https://i.imgur.com/5mhj3Ve.png


https://i.imgur.com/8yIPQDk.png


12. Missed Review 2



Credits
Taught by: Prof. Beelee Chua