Linear Equations:

Linear Equations, for example:

$$a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n = b_1$$
$$a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n = b_m$$

can be written in matrix form, with m rows and n columns:

$$\begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ \vdots \\ b_m \end{bmatrix}$$

or in general:

$$A\mathbf{x} = \mathbf{b}$$


Matrix transpose:

$$(A^T)_{ij} = A_{ji}$$

and

$$(A^T)^T = A$$

$$(AB)^T = B^T A^T$$
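The transpose rules above can be checked numerically. A minimal NumPy sketch (NumPy and these particular matrices are not from the handout; they are made up for illustration):

```python
import numpy as np

# Hypothetical example matrices, chosen so the product AB is defined.
A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2x3
B = np.array([[1, 0],
              [0, 1],
              [2, 1]])           # 3x2

# (A^T)^T = A
assert np.array_equal(A.T.T, A)

# (AB)^T = B^T A^T
assert np.array_equal((A @ B).T, B.T @ A.T)

print(A.T.shape)  # transposing swaps rows and columns: 2x3 becomes 3x2
```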

Matrix multiplication:

  • What matrices can be multiplied by each other?

# cols of 1st matrix == # rows of 2nd matrix

  • What’s the size of the result of the multiplication?

# rows of 1st matrix × # cols of 2nd matrix

For example:

$$A_{2\times4}\,B_{4\times3} = C_{2\times3}$$

$(2\times4)(4\times3) = (2\times3)$
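The size rule can be sketched with NumPy (an assumption of mine, not part of the handout; the matrices are placeholders chosen only for their shapes):

```python
import numpy as np

# Hypothetical matrices chosen only for their shapes.
A = np.ones((2, 4))   # 2x4
B = np.ones((4, 3))   # 4x3: # cols of A (4) == # rows of B (4), so A @ B is defined

C = A @ B
print(C.shape)        # (2, 3): # rows of A by # cols of B
```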

  • Matrix multiplication is not commutative:

$$AB \neq BA$$

but associative:

$$(AB)C = A(BC)$$

Inverse of matrix:

  • The inverse of a matrix A is written as $A^{-1}$

$$AA^{-1} = I$$
$$A^{-1}A = I$$

where I is the identity matrix (1s on the diagonal, 0s elsewhere):

$$I = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}$$

$$AI = IA = A$$
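A quick NumPy sketch of the inverse property (the matrix below is a hypothetical invertible example, not from the handout):

```python
import numpy as np

# A hypothetical invertible 2x2 matrix (determinant = 10, so the inverse exists).
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

A_inv = np.linalg.inv(A)

# A A^-1 = A^-1 A = I, up to floating-point error
I = np.eye(2)
assert np.allclose(A @ A_inv, I)
assert np.allclose(A_inv @ A, I)
```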


Vectors:

  • a kind of matrix that only includes one column

$$\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$

such as

$$\mathbf{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$$

length/norm of a vector:

$$\|\mathbf{v}\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$$

e.g. $\|(3, 4)^T\| = \sqrt{3^2 + 4^2} = 5$
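A minimal sketch of the norm in NumPy (the vector is an assumed example, not taken from the handout):

```python
import numpy as np

v = np.array([3.0, 4.0])          # hypothetical example vector

# norm = sqrt(v1^2 + v2^2 + ... + vn^2)
manual = np.sqrt(np.sum(v ** 2))
builtin = np.linalg.norm(v)       # NumPy's built-in Euclidean norm

assert np.isclose(manual, builtin)
print(builtin)                    # 5.0
```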

Vector Dot Product/Inner Product:

$$\mathbf{u} \cdot \mathbf{v} = \mathbf{u}^T\mathbf{v} = u_1v_1 + u_2v_2 + \cdots + u_nv_n$$

Vector Outer Product:

$$\mathbf{u}\mathbf{v}^T = \begin{bmatrix} u_1v_1 & u_1v_2 & \cdots & u_1v_n \\ u_2v_1 & u_2v_2 & \cdots & u_2v_n \\ \vdots & \vdots & \ddots & \vdots \\ u_mv_1 & u_mv_2 & \cdots & u_mv_n \end{bmatrix}$$
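Both products can be sketched in NumPy (vectors here are hypothetical examples):

```python
import numpy as np

u = np.array([1, 2, 3])   # hypothetical vectors
v = np.array([4, 5, 6])

# Inner (dot) product: a scalar, u^T v
dot = u @ v
print(dot)                # 32  (1*4 + 2*5 + 3*6)

# Outer product: a matrix, u v^T
outer = np.outer(u, v)
print(outer.shape)        # (3, 3)
assert outer[0, 2] == u[0] * v[2]   # entry (i, j) is u_i * v_j
```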


Coordinate Rotation:

$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$$

$$x' = x\cos\theta - y\sin\theta, \quad y' = x\sin\theta + y\cos\theta$$

  • $\theta$ is the angle of rotation
  • x and y are the coordinates of a point along the x-axis and y-axis
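A small NumPy sketch of the rotation matrix (the 90° angle and the point are assumed examples, not from the handout):

```python
import numpy as np

theta = np.pi / 2   # hypothetical angle: rotate by 90 degrees

# Rotation matrix for angle theta
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

p = np.array([1.0, 0.0])      # a point on the x-axis
p_rot = R @ p                 # rotated coordinates (x', y')

# Rotating (1, 0) by 90 degrees gives (0, 1), up to floating-point error
assert np.allclose(p_rot, [0.0, 1.0])
```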

Least Squares Fitting example:

Problem: find the line that best fits these three points.

  • $(-1, 1),\ (1, 1),\ (2, 3)$

Solution:

  • Follow the equation $y = c + dx$, or in matrix form $A\mathbf{x} = \mathbf{b}$

c = intercept
d = slope

  • From the 3 points, we can get:

c - d = 1
c + d = 1
c + 2d = 3

OR

$$\begin{bmatrix} 1 & -1 \\ 1 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} c \\ d \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 3 \end{bmatrix}$$

  • This system has no exact solution, so solve the normal equations $A^TA\hat{\mathbf{x}} = A^T\mathbf{b}$, where

$$A^TA = \begin{bmatrix} 3 & 2 \\ 2 & 6 \end{bmatrix} \quad \text{and} \quad A^T\mathbf{b} = \begin{bmatrix} 5 \\ 6 \end{bmatrix}$$

OR

3c + 2d = 5
2c + 6d = 6

  • Finally we can get c = 9/7, d = 4/7,

and the best line to fit those 3 points is $y = \frac{9}{7} + \frac{4}{7}x$.

This method of least squares fitting applies only when m > n for an m-by-n matrix A and $A\mathbf{x} = \mathbf{b}$ has no exact solution.
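The worked example can be reproduced in NumPy (an assumption of mine, not part of the handout), solving the normal equations and cross-checking with NumPy's built-in least-squares solver:

```python
import numpy as np

# The three points from the example, fit with y = c + d*x:
# (-1, 1), (1, 1), (2, 3)
A = np.array([[1.0, -1.0],
              [1.0,  1.0],
              [1.0,  2.0]])
b = np.array([1.0, 1.0, 3.0])

# Normal equations: (A^T A) x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
c, d = x
assert np.allclose([c, d], [9 / 7, 4 / 7])   # matches the hand computation

# np.linalg.lstsq solves the same least-squares problem directly
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_lstsq)
```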


Eigenvalue & Eigenvector:

We have a square matrix A; if

$$A\mathbf{x} = \lambda\mathbf{x}$$

then $\lambda$ is called an eigenvalue and $\mathbf{x}$ is called an eigenvector.

Example:

  • For the handout's example matrix A, solving $\det(A - \lambda I) = 0$ gives $\lambda_1 = 5$ and $\lambda_2 = 2$
  • 5 and 2 are eigenvalues,

    and the corresponding vectors $\mathbf{x}_1$ and $\mathbf{x}_2$ (with $A\mathbf{x}_i = \lambda_i\mathbf{x}_i$) are eigenvectors.

If the $\lambda_i$s are distinct eigenvalues of a matrix, then the corresponding eigenvectors are linearly independent
(none can be expressed as a linear combination of the other vectors).
(The rank of a square matrix is the number of linearly independent rows or columns.)

A real, symmetric matrix has real eigenvalues, and its eigenvectors can be chosen to form an orthonormal matrix Q
($Q^TQ = I$ OR
$Q^{-1} = Q^T$).
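A NumPy sketch of these eigenvalue properties (the symmetric matrix below is a hypothetical example, not the one from the handout):

```python
import numpy as np

# A hypothetical real symmetric matrix (not the handout's example).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's solver for symmetric matrices: eigenvalues come out real,
# in ascending order, with orthonormal eigenvectors as columns.
vals, vecs = np.linalg.eigh(A)
print(vals)                      # [1. 3.]

# Each column of vecs satisfies A x = lambda x
for i in range(2):
    assert np.allclose(A @ vecs[:, i], vals[i] * vecs[:, i])

# The eigenvectors form an orthonormal matrix Q: Q^T Q = I
assert np.allclose(vecs.T @ vecs, np.eye(2))
```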

Singular Value Decomposition (SVD):

An m×n matrix A can be decomposed into:

$$A = UDV^T$$

U is m×m, V is n×n, and both of them are orthogonal matrices:

$$U^TU = I$$
$$V^TV = I$$

D is an m×n diagonal matrix (all positions other than the diagonal elements are 0), and the diagonal elements are called the singular values; they are all $\geq 0$.
Also, a matrix is non-singular iff all of the singular values are $> 0$.

If A is a square, non-singular matrix, its inverse can be written as

$$A^{-1} = VD^{-1}U^T$$

  • The singular values in D are the square roots of the non-zero eigenvalues of both the n×n matrix $A^TA$ and the m×m matrix $AA^T$
  • The columns of U are the eigenvectors of $AA^T$
  • The columns of V are the eigenvectors of $A^TA$
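These SVD properties can be verified in NumPy (the 3×2 matrix is a made-up example, not the one from the handout):

```python
import numpy as np

# A hypothetical 3x2 matrix (m > n), not the handout's example.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# full_matrices=True gives U as m x m and Vt (= V^T) as n x n
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n diagonal matrix D from the singular values
D = np.zeros(A.shape)
D[:len(s), :len(s)] = np.diag(s)

# A = U D V^T
assert np.allclose(U @ D @ Vt, A)

# U and V are orthogonal: U^T U = I, V^T V = I
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))

# The singular values are the square roots of the eigenvalues of A^T A
eigvals = np.linalg.eigvalsh(A.T @ A)        # ascending order
assert np.allclose(np.sort(s ** 2), eigvals)
```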

Reference:

  • Handout from COMP4102: Introduction to Computer Vision, School of Computer Science, Carleton University, 2019