
along time and across matrix components, i.e., a complete Karhunen-Loève expansion for matrix-valued signals can be obtained. This may not be possible for Orthogonality A or Orthogonality C induced from a scalar-valued inner product [13], [14]. In other words, the matrix-valued inner product (5) is fundamental to the study of matrix-valued signals.
3. Linear Independence and Gram-Schmidt Orthonormalization

3.1 Degenerate and Linearly Independent Matrix-valued Signals

Let us first introduce degenerate and linearly independent matrix-valued signals, and study their properties.
A matrix-valued signal $\mathbf{f}\in L^2(a,b;\mathbb{C}^{N\times N})$ is called a degenerate signal if $\langle \mathbf{f},\mathbf{f}\rangle$ does not have full rank; otherwise, it is called a nondegenerate signal. A sequence of matrix-valued signals $\mathbf{f}_k\in L^2(a,b;\mathbb{C}^{N\times N})$, $k=1,2,\ldots,K$, is called linearly independent if the following condition holds: if
$$\sum_{k=1}^{K} F_k \mathbf{f}_k \triangleq \mathbf{f} \qquad (11)$$
for constant matrices $F_k\in\mathbb{C}^{N\times N}$, $k=1,2,\ldots,K$, is degenerate, then the null space of the matrix $F_k^\dagger$ includes the null space of the matrix $\langle \mathbf{f},\mathbf{f}\rangle$ for every $k$, $k=1,2,\ldots,K$. Clearly, the above linear independence reduces to the conventional one when all the above matrices are diagonal. Furthermore, if $\mathbf{f}=0$ in (11), the above condition implies that all $F_k=0$, $k=1,2,\ldots,K$, since in this case the null space of $\langle \mathbf{f},\mathbf{f}\rangle$ is the whole space $\mathbb{C}^{N\times N}$. This coincides with the condition of the conventional linear independence.
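As a quick numerical companion to this definition (not part of the original development), the sketch below tests degeneracy of a sampled matrix-valued signal. It assumes that the matrix-valued inner product (5) has the form $\langle \mathbf{f},\mathbf{g}\rangle = \int_a^b \mathbf{f}(t)\mathbf{g}^\dagger(t)\,dt$, approximates the integral by a Riemann sum, and uses hypothetical signal samples; `gram` and `is_degenerate` are our own helper names.

```python
import numpy as np

def gram(f, dt):
    # Riemann-sum approximation of <f, f> = \int_a^b f(t) f(t)^dagger dt,
    # where f is sampled as an array of shape (T, N, N).
    return sum(ft @ ft.conj().T for ft in f) * dt

def is_degenerate(f, dt, tol=1e-10):
    # A matrix-valued signal is degenerate when <f, f> does not have full rank.
    G = gram(f, dt)
    return np.linalg.matrix_rank(G, tol=tol) < G.shape[0]

# Hypothetical example on (a, b) = (0, 1) with N = 2 and T samples.
T, dt = 200, 1.0 / 200
t = np.linspace(0.0, 1.0, T, endpoint=False)
f = np.array([np.diag([1.0, ti]) for ti in t])                       # f(t) = diag(1, t)
g = np.array([ti * np.array([[1.0, 0.0], [1.0, 0.0]]) for ti in t])  # rank one for every t

print(is_degenerate(f, dt))  # False: <f, f> is approximately diag(1, 1/3), full rank
print(is_degenerate(g, dt))  # True:  <g, g> is rank deficient
```

Since degeneracy depends only on the rank of the $N\times N$ matrix $\langle \mathbf{f},\mathbf{f}\rangle$, the check reduces to a numerical rank computation.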
Proposition 2: If matrix-valued signals $\mathbf{f}_k$, $k=1,2,\ldots,K$, are linearly independent, then all signals $\mathbf{f}_k$, $k=1,2,\ldots,K$, are nondegenerate.

Proof: Suppose, to the contrary, that one of the signals is degenerate; without loss of generality, assume $\mathbf{f}_1$ is degenerate. Let $F_1=I_N$ and $F_k=0$ for $k=2,3,\ldots,K$. Then, we have that $\sum_{k=1}^{K} F_k\mathbf{f}_k=\mathbf{f}_1$ is degenerate, while the null space of $F_1^\dagger$ contains only $0$ and hence does not include the nontrivial null space of $\langle \mathbf{f}_1,\mathbf{f}_1\rangle$. In other words, $\mathbf{f}_k$, $k=1,2,\ldots,K$, are not linearly independent. This contradicts the assumption in the proposition and therefore the proposition is proved. q.e.d.
As one can see, the above concept of a degenerate signal plays a role similar to that of $0$ in the conventional notion of linear dependence or independence.
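The null-space inclusion required by the above definition can also be checked numerically for a given choice of coefficient matrices. The following is a minimal sketch under the same assumptions as the sketch above (our own helper names); since the definition quantifies over all choices of $F_k$, such a check can only confirm the condition for one particular combination, or falsify linear independence, not prove it.

```python
import numpy as np

def null_space(A, tol=1e-10):
    # Orthonormal basis (as columns) of the null space of A, via the SVD.
    _, s, vh = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vh[rank:].conj().T

def null_included(A, B, tol=1e-8):
    # True when the null space of A is contained in the null space of B,
    # i.e., B v = 0 for every null-space basis vector v of A.
    V = null_space(A)
    return V.shape[1] == 0 or np.allclose(B @ V, 0.0, atol=tol)

def independence_condition(G, Fs, tol=1e-8):
    # Condition in the definition for one combination f = sum_k F_k f_k:
    # if G = <f, f> is rank deficient, then its null space must lie in the
    # null space of F_k^dagger for every k.
    if np.linalg.matrix_rank(G, tol=tol) == G.shape[0]:
        return True  # f is nondegenerate, nothing to check
    return all(null_included(G, Fk.conj().T, tol) for Fk in Fs)
```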
Proposition 3: Let $G_k\in\mathbb{C}^{N\times N}$, $k=1,2,\ldots,K$, be $K$ constant matrices, at least one of which has full rank. If matrix-valued signals $\mathbf{f}_k$, $k=1,2,\ldots,K$, are linearly independent, then $\sum_{k=1}^{K} G_k\mathbf{f}_k$ is nondegenerate.

Proof: Without loss of generality, let us assume $G_1$ has full rank. If $\sum_{k=1}^{K} G_k\mathbf{f}_k=\mathbf{g}$ is degenerate, then, by the linear independence of $\mathbf{f}_k$, $k=1,2,\ldots,K$, the null space of $G_1^\dagger$ includes the nontrivial null space of $\langle \mathbf{g},\mathbf{g}\rangle$ and thus cannot contain only $0$, which contradicts the assumption that $G_1$ has full rank. q.e.d.
It is easy to see that Proposition 2 is a special case of Proposition 3.
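As a small numerical illustration of Proposition 3 (our own example, again assuming the inner product form used in the sketches above): take $\mathbf{f}_1(t)=I_2$ and $\mathbf{f}_2(t)=tI_2$ on $(0,1)$, which can be verified to be linearly independent in the above sense, together with a full-rank $G_1$ and a rank-one $G_2$; the combination $G_1\mathbf{f}_1+G_2\mathbf{f}_2$ comes out nondegenerate, as the proposition predicts.

```python
import numpy as np

T, dt = 400, 1.0 / 400
t = np.linspace(0.0, 1.0, T, endpoint=False)

f1 = np.array([np.eye(2) for _ in t])          # f1(t) = I
f2 = np.array([ti * np.eye(2) for ti in t])    # f2(t) = t I

G1 = np.array([[1.0, 2.0], [0.0, 1.0]])        # full rank
G2 = np.array([[1.0, 1.0], [1.0, 1.0]])        # rank one

h = np.array([G1 @ a + G2 @ b for a, b in zip(f1, f2)])   # h(t) = G1 + t G2
Gh = sum(ht @ ht.conj().T for ht in h) * dt                # Riemann sum for <h, h>
print(np.linalg.matrix_rank(Gh))  # 2: full rank, so the combination is nondegenerate
```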
Proposition 4: If matrix-valued signals $\mathbf{f}_k$, $k=1,2,\ldots,K$, are linearly independent, then for any full-rank constant matrices $G_k\in\mathbb{C}^{N\times N}$, $k=1,2,\ldots,K$, the matrix-valued signals $\mathbf{g}_k\triangleq G_k\mathbf{f}_k$, $k=1,2,\ldots,K$, are also linearly independent.

Proof: For any constant matrices $F_k\in\mathbb{C}^{N\times N}$, $k=1,2,\ldots,K$, if
$$\sum_{k=1}^{K} F_k\mathbf{g}_k = \sum_{k=1}^{K} F_kG_k\mathbf{f}_k = \mathbf{f}$$
is degenerate, then for each $k$, $1\le k\le K$, the null space of the matrix $(F_kG_k)^\dagger = G_k^\dagger F_k^\dagger$ includes the null space of the matrix $\langle \mathbf{f},\mathbf{f}\rangle$, since $\mathbf{f}_k$, $k=1,2,\ldots,K$, are linearly independent. Because all matrices $G_k$, $k=1,2,\ldots,K$, have full rank, for each $k$, $1\le k\le K$, the null spaces of $F_k^\dagger$ and $G_k^\dagger F_k^\dagger$ are the same; thus, the null space of $F_k^\dagger$ includes the null space of $\langle \mathbf{f},\mathbf{f}\rangle$ as well. This proves the proposition. q.e.d.
Similar to the conventional linear dependence of vectors, we have the following result for matrix-valued signals.

Proposition 5: For a matrix-valued signal $\mathbf{f}\in L^2(a,b;\mathbb{C}^{N\times N})$ and two constant matrices $A,B\in\mathbb{C}^{N\times N}$, the matrix-valued signals $A\mathbf{f}$ and $B\mathbf{f}$ are linearly dependent.

Proof: If $A\mathbf{f}$ and $B\mathbf{f}$ are linearly independent, then from Proposition 2 it is easy to see that the matrices $A$ and $B$ both have full rank and $\mathbf{f}$ is nondegenerate. Then, we have
$$BA^{-1}A\mathbf{f} - B\mathbf{f} = 0,$$
i.e., a combination equal to $0$ whose coefficient matrix of $B\mathbf{f}$ is $-I_N\ne 0$, which, by the remark following (11), contradicts the assumption of the linear independence of $A\mathbf{f}$ and $B\mathbf{f}$. This proves the proposition. q.e.d.
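A small numerical check (our own illustration, not from the paper) makes the role of the matrix coefficients in Proposition 5 visible: with $\mathbf{f}(t)=\mathrm{diag}(1,t)$, $A=I_2$, and a permutation matrix $B$, no scalar $c$ makes $cA\mathbf{f}-B\mathbf{f}$ vanish, yet the matrix coefficient $BA^{-1}$ gives $BA^{-1}(A\mathbf{f})-B\mathbf{f}=0$ identically, as in the proof above.

```python
import numpy as np

T = 200
t = np.linspace(0.0, 1.0, T, endpoint=False)
f = np.array([np.diag([1.0, ti]) for ti in t])   # f(t) = diag(1, t)

A = np.eye(2)
B = np.array([[0.0, 1.0], [1.0, 0.0]])           # full-rank row swap

Af = np.array([A @ ft for ft in f])
Bf = np.array([B @ ft for ft in f])

# No scalar coefficient c makes c*Af - Bf vanish ...
best_scalar = min(np.max(np.abs(c * Af - Bf)) for c in np.linspace(-2.0, 2.0, 401))
print(best_scalar)                    # stays >= 1, never close to 0

# ... but the matrix coefficient B A^{-1} does.
C = B @ np.linalg.inv(A)
print(np.max(np.abs(C @ Af - Bf)))    # 0 (up to round-off): B A^{-1}(Af) - Bf = 0
```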
Although the result in Proposition 5 is obvious for conventional vectors, it may not be so for matrix-valued signals, due to the matrix-valued coefficient multiplications, as can be seen from the above proof. We next consider more general linear combinations of linearly independent matrix-valued signals.
For $1\le p\le K$, let $S_1,\ldots,S_p$ be a partition of the index set $\{1,2,\ldots,K\}$, where each $S_i$ has $K_i$ elements, $S_{i_1}\cap S_{i_2}=\emptyset$ for $1\le i_1\ne i_2\le p$, $\cup_{i=1}^{p}S_i=\{1,2,\ldots,K\}$, and $1\le K_1,\ldots,K_p\le K$ with $K_1+K_2+\cdots+K_p=K$.
Proposition 6: For each $i$, $1\le i\le p$, let $G_{k_i}\in\mathbb{C}^{N\times N}$, $k_i\in S_i$, be $K_i$ constant matrices, at least one of which has full rank. If matrix-valued signals $\mathbf{f}_k$, $k=1,2,\ldots,K$, are linearly independent, then the following $p$ matrix-valued signals
$$\sum_{k_i\in S_i} G_{k_i}\mathbf{f}_{k_i}, \quad \text{for } i=1,2,\ldots,p,$$
are linearly independent.
Proof: Let $F_i\in\mathbb{C}^{N\times N}$, $i=1,2,\ldots,p$, be constant matrices. Assume that
$$\sum_{i=1}^{p} F_i \sum_{k_i\in S_i} G_{k_i}\mathbf{f}_{k_i} = \mathbf{g}$$
is degenerate. Then,
$$\sum_{i=1}^{p} \sum_{k_i\in S_i} F_i G_{k_i}\mathbf{f}_{k_i} = \mathbf{g},$$