Matrices
Matrix Definition
A matrix is a rectangular array of numbers (called elements), consisting of m rows and n columns:

a11 a12 … a1n
a21 a22 … a2n
 ⋮         ⋮
am1 am2 … amn

where aij denotes the element in row i and column j.
This is said to be a matrix of order m by n. For instance, here is a 2 by 3 matrix (2 rows, 3 columns):

1 2 3
4 5 6
Matrices are useful in solving a number of problems.
Matrix Algebra
Addition
It is only possible to add 2 matrices if they are of the same order (i.e. the same number of rows and columns). If we add matrices A and B to give a result X, then each element of X is simply the sum of the 2 corresponding elements of A and B:

xij = aij + bij
Since matrix addition is performed simply by adding the individual elements, clearly you will get the same result whatever order you add the matrices in. In maths jargon we say that the operation is both commutative and associative:
A+B = B+A
(A+B)+C=A+(B+C)
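The numeric example from the original is not shown here, so as an illustrative sketch (the matrices A and B below are made up for illustration), element-wise addition can be written in plain Python:

```python
# Element-wise addition of two matrices of the same order.
def mat_add(A, B):
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "matrices must be the same order"
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[6, 5, 4],
     [3, 2, 1]]

print(mat_add(A, B))                    # each element is the sum of the corresponding pair
print(mat_add(A, B) == mat_add(B, A))   # commutative: A+B = B+A
```

Since each element is computed independently, the commutativity check holds for any same-order pair of matrices.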
Subtraction
Subtraction of 2 matrices is analogous to addition: each element of the result is the difference of the 2 corresponding elements,

xij = aij − bij
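The original numeric example is not shown; a minimal sketch with assumed values:

```python
# Element-wise subtraction of two same-order matrices (illustrative values).
def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[6, 5, 4],
     [3, 2, 1]]

print(mat_sub(A, B))  # [[-5, -3, -1], [1, 3, 5]]
```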
Scalar Multiplication
Scalar multiplication, i.e. multiplying a matrix by a number (e.g. 3), is simply a matter of multiplying each element of the matrix by that number.
It should be clear from the above that scalar multiplication is commutative, i.e.
3×A = A×3
and also that scalar multiplication is distributive over addition and subtraction, ie
3×(A+B) = 3×A + 3×B
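As an illustrative sketch (matrices made up for illustration), scalar multiplication and its distributivity over addition can be checked directly:

```python
# Scalar multiplication: multiply every element of the matrix by the scalar.
def scal_mul(k, A):
    return [[k * a for a in row] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

print(scal_mul(3, A))  # [[3, 6], [9, 12]]
# distributive over addition: 3*(A+B) == 3*A + 3*B
print(scal_mul(3, mat_add(A, B)) == mat_add(scal_mul(3, A), scal_mul(3, B)))  # True
```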
Matrix Multiplication
We can multiply 2 matrices to give a matrix result:
X=A×B
It is only possible to multiply A and B if the number of columns of A is equal to the number of rows of B (A and B are then said to be conformable). If A is an n by m matrix, and B is an m by p matrix, then X will be an n by p matrix. The definition of X is:

xij = ai1×b1j + ai2×b2j + … + aim×bmj

i.e. element xij is the sum of the products of the elements of row i of A with the corresponding elements of column j of B.
For example, we can multiply a 2 by 3 matrix and a 3 by 2 matrix, resulting in a 2 by 2 matrix. If we change the order of the 2 matrices, we get a quite different result: this time we are multiplying a 3 by 2 matrix by a 2 by 3 matrix, and the result is a 3 by 3 matrix.
This shows that matrix multiplication is not commutative. In fact, if you exchange the matrices, the multiplication may become invalid because the matrices are no longer conformable. For example, if A is a 2 by 3 matrix and B is a 3 by 3 matrix, it is possible to form the product AB, but not BA.
Matrix multiplication is, however, associative and distributive (even though it is not commutative):
A×B ≠ B×A (in general)
(A×B)×C = A×(B×C)
(A+B)×C = A×C + B×C
A×(B+C) = A×B + A×C
A×(B−C) = A×B − A×C
(B−C)×A = B×A − C×A
and, for scalars a and b:
a×(B±C) = a×B ± a×C
(a±b)×C = a×C ± b×C
(ab)×C = a×(b×C)
a×(B×C) = (a×B)×C = B×(a×C)
Transposition
Transposing a matrix means converting an m by n matrix into an n by m matrix by “flipping” the rows and columns.
It is denoted by a superscript T, e.g. the transpose of A is written AT.
As an aside, there is an interesting relationship between transposition and multiplication:
(A×B)T = BT×AT
(AT)T = A
(A+B)T = AT + BT
(kA)T = kAT
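As an illustrative sketch (matrices made up for illustration), transposition and the reversal rule (A×B)T = BT×AT can be checked directly:

```python
# Transpose: flip rows and columns, then verify (A x B)^T == B^T x A^T.
def transpose(A):
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    # zip(*B) iterates over the columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[1, 2],
     [3, 4],
     [5, 6]]

print(transpose(A))  # the 2 by 3 matrix becomes 3 by 2
print(transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A)))  # True
```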
Equality
2 matrices are considered to be equal if they are of the same order, and if all their corresponding elements are equal.
Symmetric
A square matrix A = [aij] is said to be symmetric if A′ = A (where A′ denotes the transpose of A), that is, aij = aji for all possible values of i and j.
For example,

1 2
2 3

is a symmetric matrix, as A′ = A. Equivalently, the (i, j)th element of A is equal to the (i, j)th element of A′ for all i and j.
Skew-symmetric
A square matrix A = [aij] is said to be skew-symmetric if A′ = −A, that is, aji = −aij for all possible values of i and j.
We see that the diagonal elements of a skew-symmetric matrix are all zero: if we put i = j, we have aii = −aii, therefore 2aii = 0, i.e. aii = 0 for all i.
For example, the matrix

0 2
−2 0

is a skew-symmetric matrix, as B′ = −B.
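The two definitions can be sketched as simple checks in plain Python (the matrices below are illustrative):

```python
# Check symmetry (A' == A) and skew-symmetry (B' == -B).
def transpose(M):
    return [list(row) for row in zip(*M)]

def neg(M):
    return [[-x for x in row] for row in M]

A = [[1, 2], [2, 3]]      # symmetric: a[i][j] == a[j][i]
B = [[0, 2], [-2, 0]]     # skew-symmetric: note the zero diagonal

print(transpose(A) == A)        # True
print(transpose(B) == neg(B))   # True
```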
Result
- If A is any square matrix then A + A′ is symmetric and A − A′ is skew-symmetric.
- Every square matrix can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix.
Method: if A is any square matrix then
A = (A + A′)/2 + (A − A′)/2
where (A + A′)/2 is symmetric and (A − A′)/2 is skew-symmetric.
- If A is symmetric (or skew-symmetric) then kA is symmetric (or skew-symmetric).
Orthogonal Matrix
An n×n matrix A is an orthogonal matrix if AAT = I, where AT is the transpose of A and I is the identity matrix. In particular, an orthogonal matrix is always invertible, and A−1 = AT. In component form, ai1×aj1 + ai2×aj2 + … + ain×ajn = 1 if i = j, and 0 otherwise.
This relation makes orthogonal matrices particularly easy to compute with, since the transpose operation is much simpler than computing an inverse.
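A standard example of an orthogonal matrix is a 2 by 2 rotation matrix; as an illustrative sketch (the angle is arbitrary), we can verify R×RT = I numerically:

```python
# A rotation matrix is orthogonal: R x R^T = I, so R^-1 = R^T.
import math

theta = math.pi / 6
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P = mat_mul(R, transpose(R))
# Allow for floating-point rounding when comparing against the identity.
print(all(abs(P[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(2) for j in range(2)))  # True
```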