Linear Algebra

5 minute read

Linear algebra provides a way of compactly representing and operating on sets of linear equations. For example, a system of $m$ linear equations in $n$ unknowns $x_1, \ldots, x_n$ can be written compactly as $Ax = b$, where $A$ collects the coefficients, $x$ the unknowns, and $b$ the constants.
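
As a concrete illustration, here is a minimal sketch in NumPy (the library choice is mine; the notes do not reference any code) that represents an arbitrary two-equation system as $Ax = b$ and solves it:

```python
import numpy as np

# Arbitrary example system:
#    4*x1 - 5*x2 = -13
#   -2*x1 + 3*x2 =   9
A = np.array([[ 4.0, -5.0],
              [-2.0,  3.0]])
b = np.array([-13.0, 9.0])

x = np.linalg.solve(A, b)   # solves A @ x = b
print(x)                    # [3. 5.]
```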

By $A \in \mathbb { R } ^ { m \times n }$ we denote a matrix with m rows and n columns, where the entries of A are real numbers.

By $x \in \mathbb { R } ^ { n }$, we denote a vector with n entries. By convention, an n-dimensional vector is often thought of as a matrix with n rows and 1 column, known as a column vector.

The $i$th element of a vector $x$ is denoted $x_i$.

We use the notation $a_{ij}$ (or $A_{ij}$, $A_{i,j}$, etc.) to denote the entry of $A$ in the $i$th row and $j$th column:

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$$
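
To connect the notation to code, here is a small NumPy sketch (again, the library is my choice) showing how rows, columns, and individual entries are accessed:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # A in R^{2x3}: m = 2 rows, n = 3 columns

print(A.shape)    # (2, 3)
print(A[0, 2])    # entry a_{13} -> 3 (NumPy indexing is 0-based)
print(A[1, :])    # second row of A -> [4 5 6]
print(A[:, 0])    # first column of A -> [1 4]
```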

The product of two matrices $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$ is the matrix $C = AB \in \mathbb{R}^{m \times p}$, where $C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$.

Note that in order for the matrix product to exist, the number of columns in $A$ must equal the number of rows in $B$.
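
The entry-wise definition can be checked against a library matrix product; the sketch below (NumPy assumed) computes each $C_{ij}$ with explicit loops and compares the result to `A @ B`:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 4))       # A in R^{3x4}
B = rng.random((4, 2))       # B in R^{4x2}: columns of A == rows of B

# C_ij = sum_k A_ik * B_kj
C = np.zeros((3, 2))
for i in range(3):
    for j in range(2):
        for k in range(4):
            C[i, j] += A[i, k] * B[k, j]

print(np.allclose(C, A @ B))  # True: matches the built-in matrix product
```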

Inner and Outer Products of Vectors

The inner product or dot product of two vectors $x, y \in \mathbb{R}^n$ is a real number given by $x^T y = \sum_{i=1}^{n} x_i y_i$.

Note also that $x^T y = y^T x$.

The outer product of two vectors $x \in \mathbb{R}^m$ and $y \in \mathbb{R}^n$ is the matrix $x y^T \in \mathbb{R}^{m \times n}$ whose entries are given by $\left( x y^T \right)_{ij} = x_i y_j$.
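
A quick NumPy illustration of both products (the example vectors are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

print(x @ y)           # inner product x^T y = 1*4 + 2*5 + 3*6 = 32.0
print(np.outer(x, y))  # outer product x y^T: a 3x3 matrix with entries x_i * y_j
```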

In addition to this, it is useful to know a few basic properties of matrix multiplication at a higher level:

  • Matrix multiplication is associative: (AB)C = A(BC)
  • Matrix multiplication is distributive: A(B + C) = AB + AC
  • Matrix multiplication is, in general, not commutative; that is, it can be the case that $A B \neq B A$
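
These properties are easy to verify numerically on random matrices; the following NumPy sketch is a check, not a proof:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((3, 3))
B = rng.random((3, 3))
C = rng.random((3, 3))

print(np.allclose((A @ B) @ C, A @ (B @ C)))    # associative: True
print(np.allclose(A @ (B + C), A @ B + A @ C))  # distributive: True
print(np.allclose(A @ B, B @ A))                # commutative? almost surely False
```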

The identity matrix, denoted $I \in \mathbb { R } ^ { n \times n }$, is a square matrix with ones on the diagonal and zeros everywhere else.

For all $A \in \mathbb{R}^{m \times n}$: $AI = A = IA$, where each identity matrix has the size required for the product to be defined.
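
A small NumPy check of the identity property (illustrative only):

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)       # A in R^{2x3}

print(np.allclose(A @ np.eye(3), A))   # A I = A
print(np.allclose(np.eye(2) @ A, A))   # I A = A
```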

The transpose of a matrix $A \in \mathbb{R}^{m \times n}$ results from “flipping” the rows and columns: $A^T \in \mathbb{R}^{n \times m}$ with $\left( A^T \right)_{ij} = A_{ji}$. Transposes have the following properties:

  1. $\left( A ^ { T } \right) ^ { T } = A$
  2. $( A B ) ^ { T } = B ^ { T } A ^ { T }$
  3. $( A + B ) ^ { T } = A ^ { T } + B ^ { T }$

A square matrix $A \in \mathbb{R}^{n \times n}$ is symmetric if $A = A^T$.
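
A brief NumPy sketch of the transpose properties; it also uses the fact that $A A^T$ is always symmetric:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((3, 4))
B = rng.random((4, 2))

print(np.allclose((A.T).T, A))            # (A^T)^T = A
print(np.allclose((A @ B).T, B.T @ A.T))  # (AB)^T = B^T A^T

S = A @ A.T                               # a symmetric 3x3 matrix
print(np.allclose(S, S.T))                # True
```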

The trace of a square matrix $A \in \mathbb{R}^{n \times n}$, denoted $\operatorname{tr}(A)$, is the sum of the diagonal elements of the matrix: $\operatorname{tr} A = \sum_{i=1}^{n} A_{ii}$.

For $A, B \in \mathbb{R}^{n \times n}$ and $t \in \mathbb{R}$, the trace has the following properties:

  1. $\operatorname { tr } A = \operatorname { tr } A ^ { T }$
  2. $\operatorname { tr } ( A + B ) = \operatorname { tr } A + \operatorname { tr } B$
  3. $\operatorname { tr } A B = \operatorname { tr } B A$
  4. $\operatorname { tr } ( t A ) = t \operatorname { tr } A$
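
A numerical check of the trace properties (illustrative NumPy sketch with arbitrary matrices):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((4, 4))
B = rng.random((4, 4))
t = 2.5

print(np.isclose(np.trace(A), np.trace(A.T)))                  # tr A = tr A^T
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # tr(A + B) = tr A + tr B
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # tr AB = tr BA
print(np.isclose(np.trace(t * A), t * np.trace(A)))            # tr(tA) = t tr A
```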

Norms of Vectors

A norm of a vector, written $\| x \|$, is informally a measure of the length of the vector.

The Euclidean or $\ell_2$ norm is $\| x \|_2 = \sqrt{ \sum_{i=1}^{n} x_i^2 }$.

Note that $\| x \|_2^2 = x^T x$.

More generally, the $\ell_p$ norm for a real number $p \geq 1$ is $\| x \|_p = \left( \sum_{i=1}^{n} | x_i |^p \right)^{1/p}$.

Norms can also be defined for matrices, such as the Frobenius norm $\| A \|_F = \sqrt{ \sum_{i=1}^{m} \sum_{j=1}^{n} A_{ij}^2 } = \sqrt{ \operatorname{tr} \left( A^T A \right) }$.
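
The following NumPy sketch evaluates these norms (the `ord` arguments are NumPy's conventions, not notation from the notes):

```python
import numpy as np

x = np.array([3.0, -4.0])
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.norm(x))                          # l2 norm: sqrt(9 + 16) = 5.0
print(np.isclose(np.linalg.norm(x) ** 2, x @ x))  # ||x||_2^2 = x^T x
print(np.linalg.norm(x, ord=1))                   # l1 norm: |3| + |-4| = 7.0
print(np.linalg.norm(A, ord='fro'))               # Frobenius norm: sqrt(30) ~ 5.48
```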

Linear Independence and Rank

A set of vectors is said to be (linearly) independent if no vector can be represented as a linear combination of the remaining vectors. The column rank of a matrix $A$ is the size of the largest subset of columns of $A$ that constitute a linearly independent set.

For $A \in \mathbb{R}^{m \times n}$:

  • If $\operatorname{rank}(A) = \min(m, n)$, then $A$ is said to be full rank.
  • $\operatorname { rank } ( A ) = \operatorname { rank } \left( A ^ { T } \right)$
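
For example, a NumPy sketch with a deliberately rank-deficient matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # second row = 2 * first row, so rows are dependent
              [1.0, 0.0, 1.0]])

print(np.linalg.matrix_rank(A))    # 2 < min(3, 3), so A is not full rank
print(np.linalg.matrix_rank(A.T))  # also 2: rank(A) = rank(A^T)
```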

The Inverse of a Matrix

The inverse of a square matrix $A \in \mathbb{R}^{n \times n}$, denoted $A^{-1}$, is the unique matrix such that $A^{-1} A = I = A A^{-1}$.

If a matrix does not have an inverse, it is said to be non-invertible or singular. In order for a square matrix $A$ to have an inverse, $A$ must be full rank.

The inverse has the following properties (assuming the relevant inverses exist):

  • $\left( A ^ { - 1 } \right) ^ { - 1 } = A$
  • $( A B ) ^ { - 1 } = B ^ { - 1 } A ^ { - 1 }$
  • $\left( A ^ { - 1 } \right) ^ { T } = \left( A ^ { T } \right) ^ { - 1 }$
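
A NumPy check of these properties on random (almost surely non-singular) matrices:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.random((3, 3))
B = rng.random((3, 3))

A_inv = np.linalg.inv(A)
print(np.allclose(A_inv @ A, np.eye(3)))                            # A^{-1} A = I
print(np.allclose(np.linalg.inv(A_inv), A))                         # (A^{-1})^{-1} = A
print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ A_inv))  # (AB)^{-1} = B^{-1} A^{-1}
print(np.allclose(np.linalg.inv(A.T), A_inv.T))                     # (A^{-1})^T = (A^T)^{-1}
```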

Orthogonal Matrices

Two vectors $x, y \in \mathbb{R}^n$ are orthogonal if $x^T y = 0$. A vector $x \in \mathbb{R}^n$ is normalized if $\| x \|_2 = 1$.

A square matrix $U$ is orthogonal if all its columns are orthogonal to each other and are normalized (the columns are then referred to as being orthonormal).

A nice property of orthogonal matrices is that operating on a vector with an orthogonal matrix does not change its Euclidean norm: $\| U x \|_2 = \| x \|_2$ for any $x \in \mathbb{R}^n$.
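
One way to see this numerically is to build an orthogonal matrix from a QR decomposition and compare norms (a sketch; QR is simply a convenient way to construct an orthogonal matrix):

```python
import numpy as np

rng = np.random.default_rng(5)

Q, _ = np.linalg.qr(rng.random((3, 3)))   # Q is orthogonal
print(np.allclose(Q.T @ Q, np.eye(3)))    # orthonormal columns: Q^T Q = I

x = rng.random(3)
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # ||Qx||_2 = ||x||_2
```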

Null Space, Column Space and Span

The span of a set of vectors is the set of all vectors that can be expressed as a linear combination of those vectors.

The null space of a matrix $A \in \mathbb{R}^{m \times n}$, denoted $\mathcal{N}(A)$, is the set of all vectors that equal 0 when multiplied by $A$: $\mathcal{N}(A) = \{ x \in \mathbb{R}^n : A x = 0 \}$.

The column space of $A$, denoted $C(A)$, is the span of the columns of $A$: $C(A) = \{ v \in \mathbb{R}^m : v = A x \text{ for some } x \in \mathbb{R}^n \}$.
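
The null space can be computed numerically from the SVD; the sketch below uses that approach (my choice of method, not one stated in the notes):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1: the second row is twice the first

# Right singular vectors whose singular values are (numerically) zero span N(A).
# s has only min(m, n) entries; any "missing" singular values are treated as zero.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]                       # rows form a basis of N(A)

print(null_basis.shape)                      # (2, 3): dim N(A) = 3 - rank = 2
print(np.allclose(A @ null_basis.T, 0.0))    # A x = 0 for every basis vector
```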

The Determinant

The determinant of a square matrix $A \in \mathbb{R}^{n \times n}$ is a real number, denoted $|A|$ or $\det A$. One formula for the determinant of an $n \times n$ matrix $A$ is the cofactor expansion along the first row: $|A| = \sum_{j=1}^{n} (-1)^{1+j} a_{1j} M_{1j}$, where $M_{1j}$ is the determinant of the $(n-1) \times (n-1)$ matrix obtained by deleting row 1 and column $j$ of $A$.

Example for a 2×2 matrix: $\begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc$.

Properties of determinants are as follows:

  1. $| A | = \left| A^T \right|$
  2. $| A B | = | A | \, | B |$
  3. $| A | = 0$ if and only if $A$ is singular
  4. For a non-singular matrix $A$, $\left| A^{-1} \right| = 1 / | A |$
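
A quick NumPy check of the 2×2 formula and these properties (the example matrices are arbitrary):

```python
import numpy as np

A = np.array([[ 4.0, -5.0],
              [-2.0,  3.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.det(A))   # 4*3 - (-5)*(-2) = 2.0
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))                        # |A| = |A^T|
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # |AB| = |A| |B|
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A)))     # |A^{-1}| = 1/|A|
```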

Eigenvalues and Eigenvectors

Given a square matrix $A \in \mathbb{R}^{n \times n}$, we say that $\lambda \in \mathbb{C}$ is an eigenvalue of $A$ and $x \in \mathbb{C}^n$ is the corresponding eigenvector if $A x = \lambda x$, with $x \neq 0$.

Properties of eigenvalues and eigenvectors:

  1. The rank of $A$ is equal to the number of non-zero eigenvalues of $A$ (this holds when $A$ is diagonalizable).

We can write all the eigenvector equations simultaneously as $A X = X \Lambda$, where the columns of $X$ are the eigenvectors of $A$ and $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ is a diagonal matrix containing the corresponding eigenvalues.
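
A NumPy illustration with an arbitrary symmetric matrix (`np.linalg.eig` returns the eigenvalues and a matrix whose columns are the corresponding eigenvectors):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, X = np.linalg.eig(A)   # eigenvalues 3 and 1 (order not guaranteed)
Lam = np.diag(eigvals)

print(np.allclose(A @ X, X @ Lam))                      # A X = X Lambda
print(np.allclose(A @ X[:, 0], eigvals[0] * X[:, 0]))   # A x = lambda x for one pair
```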
