
Properties of block matrices

by Marco Taboga, PhD

In this lecture we summarize some simple properties enjoyed by block matrices (also called partitioned matrices).

We are going to assume that the reader is already familiar with the concept of a block matrix.


Addition of block matrices

If two block matrices $M_{1}$ and $M_{2}$ have the same dimension and are partitioned in the same way, we obtain their sum by adding the corresponding blocks.

Example If
$$M_{1}=\begin{bmatrix} A_{1} & B_{1}\\ C_{1} & D_{1}\end{bmatrix}\quad \text{and}\quad M_{2}=\begin{bmatrix} A_{2} & B_{2}\\ C_{2} & D_{2}\end{bmatrix}$$
we can compute their sum as
$$M_{1}+M_{2}=\begin{bmatrix} A_{1}+A_{2} & B_{1}+B_{2}\\ C_{1}+C_{2} & D_{1}+D_{2}\end{bmatrix}$$

All the pairs of corresponding blocks need to have the same dimension. For instance, in the example above, if $A_{1}$ is $J\times K$ ($J$ rows and $K$ columns), then $A_{2}$ must also be $J\times K$.

This property of block matrices is a direct consequence of the definition of matrix addition. Two matrices having the same dimension can be added together by adding their corresponding entries. For example, the $(j,k)$-th entry of $M_{1}+M_{2}$ is the sum of the $(j,k)$-th entry of $M_{1}$ and the $(j,k)$-th entry of $M_{2}$. This does not change if we first partition $M_{1}$ and $M_{2}$ and then add together the two blocks to which the $(j,k)$-th entries of $M_{1}$ and $M_{2}$ respectively belong.
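As a quick numerical check, the blockwise sum can be verified with NumPy's `np.block` (the specific numeric blocks below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical blocks: the top row of blocks has 1 row, the bottom row has 1 row;
# left blocks have 2 columns, right blocks have 1 column.
A1, B1 = np.array([[1, 2]]), np.array([[3]])
C1, D1 = np.array([[4, 5]]), np.array([[6]])
A2, B2 = np.array([[7, 8]]), np.array([[9]])
C2, D2 = np.array([[0, 1]]), np.array([[2]])

# Assemble the two partitioned matrices.
M1 = np.block([[A1, B1], [C1, D1]])
M2 = np.block([[A2, B2], [C2, D2]])

# Adding block by block gives the same matrix as adding entry by entry.
blockwise = np.block([[A1 + A2, B1 + B2], [C1 + C2, D1 + D2]])
assert np.array_equal(M1 + M2, blockwise)
```

Note that the corresponding blocks have matching dimensions, as required.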

Scalar multiplication of a block matrix

Remember that the multiplication of a matrix $M$ by a scalar $\alpha$ is performed by multiplying all the entries of $M$ by the scalar $\alpha$.

The same result can be achieved by multiplying all the blocks of $M$ by $\alpha$ (because then all the entries of each block are multiplied by $\alpha$).

Example If
$$M=\begin{bmatrix} A & B\\ C & D\end{bmatrix}$$
then
$$\alpha M=\begin{bmatrix} \alpha A & \alpha B\\ \alpha C & \alpha D\end{bmatrix}$$
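The same fact can be checked numerically (again with hypothetical blocks):

```python
import numpy as np

# Hypothetical blocks for illustration.
A = np.array([[1, 2], [3, 4]])
B = np.array([[5], [6]])

M = np.block([[A, B]])  # a 2x3 matrix split into two blocks
alpha = 3.0

# Multiplying each block by alpha gives the same result as
# multiplying the whole matrix by alpha.
assert np.array_equal(alpha * M, np.block([[alpha * A, alpha * B]]))
```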

Multiplication of block matrices

The multiplication of two block matrices can be carried out as if their blocks were scalars, by using the standard rule for matrix multiplication: the $(j,k)$-th block of the product $M_{1}M_{2}$ is equal to the dot product between the $j$-th row of blocks of $M_{1}$ and the $k$-th column of blocks of $M_{2}$.

Example Given two block matrices
$$M_{1}=\begin{bmatrix} A_{1} & B_{1}\\ C_{1} & D_{1}\end{bmatrix},\quad M_{2}=\begin{bmatrix} A_{2} & B_{2}\\ C_{2} & D_{2}\end{bmatrix}$$
we have that
$$M_{1}M_{2}=\begin{bmatrix} A_{1}A_{2}+B_{1}C_{2} & A_{1}B_{2}+B_{1}D_{2}\\ C_{1}A_{2}+D_{1}C_{2} & C_{1}B_{2}+D_{1}D_{2}\end{bmatrix}$$

As all the products must be well-defined, all the pairs of blocks involved in a multiplication must be conformable. For instance, in the example above, the number of columns of $A_{1}$ and the number of rows of $A_{2}$ must coincide for the product $A_{1}A_{2}$ to be well-defined.
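The blockwise product rule can be verified numerically. The sketch below builds conformable random blocks (the dimensions $K_1,K_2,S_1,S_2,L_1,L_2$ follow the naming used in the proof below; the numeric values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
K1, K2, S1, S2, L1, L2 = 2, 3, 4, 2, 3, 2

# Blocks of M1: A1, B1 on top; C1, D1 below.
A1, B1 = rng.normal(size=(K1, S1)), rng.normal(size=(K1, S2))
C1, D1 = rng.normal(size=(K2, S1)), rng.normal(size=(K2, S2))
# Blocks of M2, partitioned so that every block product is conformable:
# A2, B2 have S1 rows; C2, D2 have S2 rows.
A2, B2 = rng.normal(size=(S1, L1)), rng.normal(size=(S1, L2))
C2, D2 = rng.normal(size=(S2, L1)), rng.normal(size=(S2, L2))

M1 = np.block([[A1, B1], [C1, D1]])
M2 = np.block([[A2, B2], [C2, D2]])

# Blockwise product, treating the blocks as if they were scalars.
blockwise = np.block([
    [A1 @ A2 + B1 @ C2, A1 @ B2 + B1 @ D2],
    [C1 @ A2 + D1 @ C2, C1 @ B2 + D1 @ D2],
])
assert np.allclose(M1 @ M2, blockwise)
```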

A proof that the dot product formula can be applied to block matrices follows.


Let us start from the case of the two matrices $M_{1}$ and $M_{2}$ in the previous example. Suppose that the blocks $A_{1}$ and $C_{1}$ have $S_{1}$ columns. As a consequence, $A_{2}$ and $B_{2}$ must have $S_{1}$ rows for the block products to be well-defined. Further assume that the blocks $B_{1}$ and $D_{1}$ have $S_{2}$ columns. It follows that $C_{2}$ and $D_{2}$ must have $S_{2}$ rows. By the definition of matrix product, the $(k,l)$-th entry of $M_{1}M_{2}$ is
$$\left( M_{1}M_{2}\right)_{kl}=\sum_{s=1}^{S_{1}+S_{2}}\left( M_{1}\right)_{ks}\left( M_{2}\right)_{sl}$$
Now, suppose that the partition of $M_{1}$ leaves $K_{1}$ rows in the upper part of the matrix and $K_{2}$ in the lower one. Further assume that the partition of $M_{2}$ leaves $L_{1}$ columns to the left and $L_{2}$ to the right. Then, if $k\leq K_{1}$ and $l\leq L_{1}$ (upper-left quadrant of $M_{1}M_{2}$), we have
$$\left( M_{1}M_{2}\right)_{kl}=\sum_{s=1}^{S_{1}}\left( M_{1}\right)_{ks}\left( M_{2}\right)_{sl}+\sum_{s=1}^{S_{2}}\left( M_{1}\right)_{k,S_{1}+s}\left( M_{2}\right)_{S_{1}+s,l}=\sum_{s=1}^{S_{1}}\left( A_{1}\right)_{ks}\left( A_{2}\right)_{sl}+\sum_{s=1}^{S_{2}}\left( B_{1}\right)_{ks}\left( C_{2}\right)_{sl}=\left( A_{1}A_{2}+B_{1}C_{2}\right)_{kl}$$
where we have used the facts that: 1) $\left( M_{1}\right)_{ks}=\left( A_{1}\right)_{ks}$ when $k\leq K_{1}$ and $s\leq S_{1}$; 2) $\left( M_{2}\right)_{sl}=\left( A_{2}\right)_{sl}$ when $s\leq S_{1}$ and $l\leq L_{1}$; 3) $\left( M_{1}\right)_{k,S_{1}+s}=\left( B_{1}\right)_{ks}$ when $k\leq K_{1}$ and $s>0$; 4) $\left( M_{2}\right)_{S_{1}+s,l}=\left( C_{2}\right)_{sl}$ when $s>0$ and $l\leq L_{1}$. Thus, as far as the upper-left quadrant is concerned, the claim we wanted to prove is true. Along similar lines, we can discuss the case in which $k\leq K_{1}$ and $l>L_{1}$ (upper-right quadrant of $M_{1}M_{2}$), for which
$$\left( M_{1}M_{2}\right)_{kl}=\left( A_{1}B_{2}+B_{1}D_{2}\right)_{k,l-L_{1}}$$
We do not report the proofs for the remaining two quadrants, which are analogous. Moreover, similar case-by-case discussions can be performed if the block matrix is partitioned in a different manner (i.e., if it has different numbers of horizontal and vertical cuts).

The proof, although tedious, helps us better understand under what conditions all the blocks can be multiplied. The partitions need to be such that a vertical partition of $M_{1}$ leaves $S_{1}$ columns to the left and $S_{2}$ to the right if and only if a horizontal partition of $M_{2}$ leaves $S_{1}$ rows in the upper part of the matrix and $S_{2}$ in the lower part. There are no constraints on the horizontal partitions of $M_{1}$ and the vertical partitions of $M_{2}$.

Transpose of a block matrix

The transpose of a block matrix $M$ is the matrix $M^{\top}$ such that the $(j,k)$-th block of $M^{\top}$ is equal to the transpose of the $(k,j)$-th block of $M$.

Example The transpose of the partitioned matrix
$$M=\begin{bmatrix} A & B\\ C & D\end{bmatrix}$$
is
$$M^{\top}=\begin{bmatrix} A^{\top} & C^{\top}\\ B^{\top} & D^{\top}\end{bmatrix}$$

A proof follows.


Let us first prove the result for the matrix $M$ in the example. Suppose that: 1) $A$ and $B$ have $J_{1}$ rows; 2) $C$ and $D$ have $J_{2}$ rows; 3) $A$ and $C$ have $K_{1}$ columns; 4) $B$ and $D$ have $K_{2}$ columns. Then, we can define the $(j,k)$-th entry of $M$ as
$$M_{jk}=\begin{cases} A_{jk} & \text{if } j\leq J_{1} \text{ and } k\leq K_{1}\\ B_{j,k-K_{1}} & \text{if } j\leq J_{1} \text{ and } k>K_{1}\\ C_{j-J_{1},k} & \text{if } j>J_{1} \text{ and } k\leq K_{1}\\ D_{j-J_{1},k-K_{1}} & \text{if } j>J_{1} \text{ and } k>K_{1}\end{cases}$$
By the definition of transpose, we have that the $(k,j)$-th entry of $M^{\top}$ is
$$\left( M^{\top}\right)_{kj}=M_{jk}$$
Therefore,
$$\left( M^{\top}\right)_{kj}=\begin{cases} \left( A^{\top}\right)_{kj} & \text{if } j\leq J_{1} \text{ and } k\leq K_{1}\\ \left( B^{\top}\right)_{k-K_{1},j} & \text{if } j\leq J_{1} \text{ and } k>K_{1}\\ \left( C^{\top}\right)_{k,j-J_{1}} & \text{if } j>J_{1} \text{ and } k\leq K_{1}\\ \left( D^{\top}\right)_{k-K_{1},j-J_{1}} & \text{if } j>J_{1} \text{ and } k>K_{1}\end{cases}$$
We can easily check that this case-by-case definition corresponds to
$$M^{\top}=\begin{bmatrix} A^{\top} & C^{\top}\\ B^{\top} & D^{\top}\end{bmatrix}$$
The proofs for block matrices having different partitions (i.e., different numbers of horizontal and vertical cuts) are similar (the case-by-case definitions of the matrices change based on the number of cuts).
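The transpose rule can also be verified numerically. The sketch below uses hypothetical blocks whose dimensions follow the naming in the proof ($J_1=2$, $J_2=4$, $K_1=3$, $K_2=1$):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical blocks: A, B have J1 = 2 rows; C, D have J2 = 4 rows;
# A, C have K1 = 3 columns; B, D have K2 = 1 column.
A = rng.normal(size=(2, 3))
B = rng.normal(size=(2, 1))
C = rng.normal(size=(4, 3))
D = rng.normal(size=(4, 1))

M = np.block([[A, B], [C, D]])

# The (j,k)-th block of M^T is the transpose of the (k,j)-th block of M:
# the off-diagonal blocks B and C swap positions and are transposed.
assert np.array_equal(M.T, np.block([[A.T, C.T], [B.T, D.T]]))
```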

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define the block matrix [eq20] where $I$ is an identity matrix and $0$ is a matrix of zeros. Compute the product $MM^{\top}$.


We have that the transpose of $M$ is [eq21] and the product is [eq22]
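Since the exercise's partition of $M$ is not reproduced here, the numerical sketch below assumes, purely for illustration, the hypothetical layout $M=[I\;\;0]$ with a $3\times 3$ identity block and a $3\times 2$ zero block; for this layout the blockwise product gives $MM^{\top}=II^{\top}+00^{\top}=I$:

```python
import numpy as np

# Hypothetical partition (the exercise's actual layout is not shown here):
# M = [I 0] with a 3x3 identity block and a 3x2 block of zeros.
I = np.eye(3)
Z = np.zeros((3, 2))
M = np.block([[I, Z]])

# Blockwise: M M^T = I I^T + 0 0^T = I.
assert np.array_equal(M @ M.T, np.eye(3))
```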

Exercise 2

Define the block matrix [eq23] How is the transpose $M^{\top}$ structured?


The transpose of $M$ is [eq24]

How to cite

Please cite as:

Taboga, Marco (2021). "Properties of block matrices", Lectures on matrix algebra.
