AIEEE Concepts®

A Complete Coverage Over AIEEE Exam

Matrices

A rectangular array of symbols or entries (which may be real or complex numbers) arranged in rows and columns is called a matrix.

Thus a system of m × n symbols arranged in a rectangular formation along m rows and n columns and enclosed by brackets [ ] is called an m by n matrix (written as an m × n matrix).

Thus,

    A = [ a11  a12  ...  a1n ]
        [ a21  a22  ...  a2n ]
        [ ...  ...  ...  ... ]
        [ am1  am2  ...  amn ]

In a compact form the above matrix is represented by A = [aij], 1 ≤ i ≤ m, 1 ≤ j ≤ n, or simply by [aij]m×n.

The numbers a11, a12, ... etc. of this rectangular array are called the elements of the matrix. The element aij belongs to the ith row and jth column and is called the (i, j)th element of the matrix. We shall use capital letters A, B, C, .... to denote matrices.
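As a quick illustration (a NumPy sketch added here, not part of the original notes), a matrix can be stored as a 2-dimensional array; note that NumPy counts rows and columns from 0, while the text counts them from 1.

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])   # a 2 x 3 matrix: m = 2 rows, n = 3 columns
    print(A.shape)              # (2, 3)
    print(A[0, 1])              # the (1, 2)th element a12 = 2 (0-based indexing)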

Equal Matrices

Two matrices are said to be equal if they have the same order and each element of one is equal to the corresponding element of the other.

Classification of Matrices

Row Matrix:

A matrix having a single row is called a row matrix.

Column Matrix:

A matrix having a single column is called a column matrix.

Real Matrix:

A matrix in which all the elements are real is called a real matrix.

Complex Matrix:

A matrix in which one or more of the elements are complex is called a complex matrix.

Square Matrix:

An m × n matrix A is said to be a square matrix if m = n, i.e. the number of rows equals the number of columns.

Note:
The diagonal running from the upper left-hand corner to the lower right-hand corner is known as the leading diagonal or principal diagonal. For example, in a 3 × 3 square matrix with diagonal elements 1, 3, 5, the diagonal containing these elements is the leading or principal diagonal.


Trace of a Matrix:

The sum of the elements of a square matrix A lying along the principal diagonal is called the trace of A, written tr(A).

Thus if A = [aij]n×n, then tr(A) = Σ (i = 1 to n) aii = a11 + a22 + ... + ann.
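A quick numerical check of the definition (an added NumPy sketch with arbitrarily chosen entries):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
    print(np.trace(A))          # 1 + 5 + 9 = 15, the sum of the principal diagonal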


Diagonal Matrix:

A square matrix all of whose elements, except those in the leading diagonal, are zero is called a diagonal matrix. For a square matrix A = [aij]n×n to be a diagonal matrix, aij = 0 whenever i ≠ j.

Note: A diagonal matrix with diagonal elements 3, 5, -1 can also be represented as diag(3, 5, -1).
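For illustration (an added NumPy sketch), numpy.diag builds exactly this kind of matrix from its diagonal elements:

    import numpy as np

    D = np.diag([3, 5, -1])     # the diagonal matrix diag(3, 5, -1)
    print(D)
    # [[ 3  0  0]
    #  [ 0  5  0]
    #  [ 0  0 -1]]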


Scalar Matrix:

A diagonal matrix, all of whose diagonal elements are equal, is called a scalar matrix.

For a square matrix A = [aij]n×n to be a scalar matrix, aij = 0 for i ≠ j and aij = m for i = j, where m ≠ 0.

Unit Matrix or Identity Matrix:

A diagonal matrix of order n which has unity for all its diagonal elements, is called a unit matrix of order n and is denoted by In.

Thus a square matrix A = [aij]n×n is a unit matrix if aij = 1 when i = j and aij = 0 when i ≠ j.
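As an added sketch (not part of the original notes), both special cases are easy to construct with NumPy:

    import numpy as np

    I = np.eye(3)               # unit (identity) matrix I3
    S = 7 * np.eye(3)           # scalar matrix: every diagonal element equals 7
    print(I)
    print(S)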


Triangular Matrix:

A square matrix in which all the elements below the principal diagonal are zero is called an Upper Triangular matrix, and a square matrix in which all the elements above the principal diagonal are zero is called a Lower Triangular matrix.

Note:
A diagonal matrix is both upper and lower triangular.

A triangular matrix A = [aij]n×n is called strictly triangular if aii = 0 for 1 ≤ i ≤ n.
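For illustration (an added NumPy sketch), numpy.triu and numpy.tril extract the upper and lower triangular parts of a square matrix:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
    U = np.triu(A)              # upper triangular: zeros below the principal diagonal
    L = np.tril(A)              # lower triangular: zeros above the principal diagonal
    print(U)
    print(L)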


Sub Matrix:

Any matrix obtained by omitting some rows and/or columns from a given m × n matrix A is called a sub matrix of A. The given matrix is a sub matrix of itself.

Null Matrix:

If all the elements of a matrix (square or rectangular) are zero, it is called a null or zero matrix.

For A = [aij] to be a null matrix, aij = 0 ∀ i, j.
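An added NumPy sketch of both ideas (row and column numbers chosen arbitrarily):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
    # sub matrix formed by keeping rows 1 and 3 and columns 2 and 3 (1-based, as in the text)
    S = A[np.ix_([0, 2], [1, 2])]
    print(S)                    # [[2 3]
                                #  [8 9]]
    Z = np.zeros((2, 3))        # a 2 x 3 null (zero) matrix
    print(Z)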


Algebra of Matrices

Scalar Multiplication

The matrix obtained by multiplying every element of a matrix A by a scalar λ is called the scalar multiple of A by λ and is denoted by λA, i.e. if A = [aij] then λA = [λaij].

Note: For two scalars λ and μ, (i) λ(A + B) = λA + λB; (ii) (λ + μ)A = λA + μA;

(iii) λ(μA) = (λμ)A; (iv) (-λ)A = -(λA) = λ(-A).
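These four identities can be verified numerically; the NumPy sketch below (added here, with arbitrary matrices and scalars) checks each one:

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])
    lam, mu = 2, 3

    print(np.array_equal(lam * (A + B), lam * A + lam * B))   # (i)
    print(np.array_equal((lam + mu) * A, lam * A + mu * A))   # (ii)
    print(np.array_equal(lam * (mu * A), (lam * mu) * A))     # (iii)
    print(np.array_equal((-lam) * A, -(lam * A)))             # (iv)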


Addition and Subtraction of Matrices:

Two matrices can be added only if they are of the same order, and the resulting matrix is then of the same order. If two matrices A and B are of the same order, they are said to be conformable for addition/subtraction.

Note:
Matrices of different orders can neither be added nor subtracted.


Addition of matrices is commutative as well as associative, i.e.
(i) A + B = B + A

(ii) (A + B) + C = A + (B + C)

(iii) A + O = O + A = A

(iv) A + (- A) = O.
If A1, A2, ...., Ap are p matrices conformable for addition and λ1, λ2, ...., λp are p arbitrary scalars, then λ1A1 + λ2A2 + .... + λpAp is called a linear combination of the matrices A1, A2, ...., Ap.

The equation A + X = O has a unique solution in the set of all m × n matrices.


All the laws of ordinary algebra hold for the addition or subtraction of matrices and their multiplication by scalars.
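As an added check (a NumPy sketch, not from the original notes), the laws (i)-(iv) above can be verified for concrete matrices:

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])
    C = np.array([[0, 1], [1, 0]])
    O = np.zeros((2, 2), dtype=int)

    print(np.array_equal(A + B, B + A))              # (i)  commutativity
    print(np.array_equal((A + B) + C, A + (B + C)))  # (ii) associativity
    print(np.array_equal(A + O, A))                  # (iii) additive identity
    print(np.array_equal(A + (-A), O))               # (iv) additive inverse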


Multiplication of Matrices

Two matrices can be multiplied only when the number of columns in the first, called the pre factor, is equal to the number of rows in the second, called the post factor. Such matrices are said to be conformable for multiplication.



Thus if A = [aij]m×n and B = [bjk]n×p, then C = AB = [cij]m×p, where cij = ai1 b1j + ai2 b2j + .... + ain bnj = Σ (k = 1 to n) aik bkj, i = 1, 2, 3, ...., m, j = 1, 2, 3, ...., p.

The ith row of A has n elements and the jth column of B has n elements. We obtain the (i, j)th element of C as the sum of the products of the corresponding elements of the ith row of A and the jth column of B.
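A small worked example (added NumPy sketch) of the rule above, multiplying a 2 × 3 matrix by a 3 × 2 matrix to get a 2 × 2 product:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])     # 2 x 3 pre factor
    B = np.array([[1, 0],
                  [0, 1],
                  [1, 1]])        # 3 x 2 post factor
    C = A @ B                     # 2 x 2 product
    print(C)                      # [[ 4  5]
                                  #  [10 11]]; e.g. c11 = 1*1 + 2*0 + 3*1 = 4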

Note:
Commutative law does not necessarily hold for multiplication of matrices.


If AB = BA then matrices A and B are called commutative matrices.


If AB = -BA then matrices A and B are called anti-commutative matrices.


The product of a row matrix of order 1 × n and a column matrix of order n × 1 is a 1 × 1 matrix.


The product of a column matrix of order n × 1 and a row matrix of order 1 × m is a matrix of order n × m.


It is possible that AB is defined but BA is not.


When AB and BA are both defined, AB may or may not be equal to BA.


If A is a square matrix of order n, then
A^m = A × A × A × .... × A (m times),

A^(r+s) = A^r . A^s and (A^r)^s = A^(rs), where r and s are positive integers.

Matrix multiplication is associative, i.e. (AB)C = A(BC).


Matrix multiplication is distributive with respect to addition. i.e. A(B + C) = AB + AC and (A + B)C = AC + BC.


Matrices possess divisors of zero, i.e. if the product AB = O, it is not necessary that at least one of A and B is a zero matrix.


Cancellation law does not necessarily hold, i.e. if AB = AC, it does not in general follow that B = C, even if A ≠ O.


For a scalar k, k(AB) = (kA)B = A(kB). In particular, A (- B) = - AB and (- A)(-B) = AB.
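The failures of commutativity and cancellation, and the existence of zero divisors, can all be seen in one small added example (entries chosen for illustration):

    import numpy as np

    A = np.array([[1, 0], [0, 0]])
    B = np.array([[0, 0], [1, 0]])
    C = np.array([[0, 0], [2, 0]])

    print(A @ B)                          # AB = O although neither A nor B is the zero matrix
    print(B @ A)                          # BA is not O, so AB != BA here
    print(np.array_equal(A @ B, A @ C))   # True: AB = AC ...
    print(np.array_equal(B, C))           # ... yet B != C, so cancellation fails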


Special Matrices

Transpose of a Matrix:

The matrix obtained from any given matrix A, by interchanging rows and columns, is called the transpose of A and is denoted by A' or AT.

If A = [aij]m×n and A' = [bij]n×m, then bij = aji ∀ i, j.


Properties of Transpose:

(i). (A')' = A

(ii). (A + B)' = A' + B', A and B being conformable matrices

(iii). (λA)' = λA', λ being a scalar

(iv). (AB)' = B'A', A and B being conformable for multiplication
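A quick numerical verification of these properties (an added NumPy sketch; .T is NumPy's transpose):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])
    B = np.array([[0, 1],
                  [1, 0],
                  [2, 3]])

    print(np.array_equal(A.T.T, A))              # (i)  (A')' = A
    print(np.array_equal((2 * A).T, 2 * A.T))    # (iii) (kA)' = kA'
    print(np.array_equal((A @ B).T, B.T @ A.T))  # (iv) (AB)' = B'A' (reversal law)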

Conjugate of a Matrix:

The matrix obtained from any given matrix A containing complex numbers as its elements, by replacing its elements with the corresponding complex conjugates, is called the conjugate of A and is denoted by Ā.

Properties of Conjugate:

(i). (Ā)‾ = A

(ii). (A + B)‾ = Ā + B̄

(iii). (λA)‾ = λ̄ Ā, λ being any number

(iv). (AB)‾ = Ā B̄, A and B being conformable for multiplication


Transpose Conjugate of a Matrix

The transpose of the conjugate of a matrix A is called the transpose conjugate of A and is denoted by A^θ. The conjugate of the transpose of A is the same as the transpose of the conjugate of A, i.e. (A')‾ = (Ā)' = A^θ.

If A = [aij]m×n, then A^θ = [bji]n×m where bji = āij, i.e. the (j, i)th element of A^θ = the conjugate of the (i, j)th element of A.

Properties of Transpose conjugate

(i). (A^θ)^θ = A

(ii). (A + B)^θ = A^θ + B^θ

(iii). (kA)^θ = k̄ A^θ, k being any number

(iv). (AB)^θ = B^θ A^θ
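For illustration (an added NumPy sketch), np.conj gives the conjugate and np.conj(A).T the transpose conjugate; the sketch also checks that conjugating and transposing can be done in either order:

    import numpy as np

    A = np.array([[1 + 2j, 3 - 1j],
                  [0 + 1j, 2 + 0j]])

    A_bar = np.conj(A)            # conjugate of A: conjugate every element
    A_theta = np.conj(A).T        # transpose conjugate of A

    print(A_bar)
    print(np.array_equal(np.conj(A.T), np.conj(A).T))   # True: same matrix either way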

Symmetric and Skew Symmetric Matrices:

A square matrix A = [aij] is said to be symmetric when aij = aji for all i and j, i.e. A' = A. If aij = -aji for all i and j (so that all the leading diagonal elements are zero), then the matrix is called a skew-symmetric matrix.


Hermitian and Skew- Hermitian Matrix:

A square matrix A = [aij] is said to be a Hermitian matrix if aij = āji ∀ i, j, i.e. A^θ = A.

Note:
If A is a Hermitian matrix then aii = āii, so aii is real ∀ i. Thus every diagonal element of a Hermitian matrix must be real.


A Hermitian Matrix over the set of real numbers is actually a real symmetric matrix.


A square matrix A = [aij] is said to be a skew-Hermitian matrix if aij = -āji ∀ i, j, i.e. A^θ = -A.


If A is a skew-Hermitian matrix then aii = -āii, i.e. aii + āii = 0, so aii must be purely imaginary or zero.


A skew-Hermitian Matrix over the set of real numbers is actually a real skew-symmetric matrix.


In a skew-symmetric matrix A, all the diagonal elements are zero: since aij = -aji for all i, j, we get aii = -aii, i.e. aii = 0.


A matrix which is both symmetric and skew-symmetric must be a null matrix.

For any real square matrix A, A + A' is a symmetric matrix and A - A' is a skew-symmetric matrix.
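An added NumPy sketch of the last two observations, together with a small Hermitian example (entries chosen arbitrarily):

    import numpy as np

    A = np.array([[1, 7, 3],
                  [2, 5, 8],
                  [6, 4, 9]])     # an arbitrary real square matrix

    S = A + A.T                   # symmetric: S' = S
    K = A - A.T                   # skew-symmetric: K' = -K
    print(np.array_equal(S.T, S))          # True
    print(np.array_equal(K.T, -K))         # True
    print(np.diagonal(K))                  # all zeros, as required of a skew-symmetric matrix

    H = np.array([[2, 1 + 1j],
                  [1 - 1j, 3]])
    print(np.array_equal(np.conj(H).T, H)) # True: H is Hermitian, and its diagonal is real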

