Joining Iso-Structured Models with Commutative Orthogonal Block
Structure
CARLA SANTOS1,5, CRISTINA DIAS2,5, CÉLIA NUNES3, JOÃO TIAGO MEXIA4,5
1Polytechnic Institute of Beja,
Beja,
PORTUGAL
2Polytechnic Institute of Portalegre,
Portalegre,
PORTUGAL
3Department of Mathematics and Center of Mathematics and Applications,
University of Beira Interior,
Covilhã,
PORTUGAL
4Department of Mathematics, SST,
New University of Lisbon,
Caparica,
PORTUGAL
5NOVAMATH - Center for Mathematics and Applications, SST,
New University of Lisbon,
Caparica,
PORTUGAL
Abstract: - In this work, we focus on a special class of mixed models, named models with commutative
orthogonal block structure (COBS), whose covariance matrix is a linear combination of known pairwise
orthogonal projection matrices that add to the identity matrix, and for which the orthogonal projection matrix
on the space spanned by the mean vector commutes with the covariance matrix. In COBS, the least squares
estimators give best linear unbiased estimators for estimable vectors. Our approach to COBS relies on
their algebraic structure, based on commutative Jordan algebras of symmetric matrices, which proves to be
advantageous as it leads to important results in the estimation. Specifically, we are interested in iso-structured
COBS, applying to them the operation of models joining. We show that joining iso-structured COBS gives
COBS and that the estimators for the joint model may be obtained from those for the individual models.
Key-Words: - Best linear unbiased estimators, COBS, Jordan algebra, Mixed model, Models joining, Variance
components.
Received: December 23, 2022. Revised: June 8, 2023. Accepted: June 27, 2023. Published: July 27, 2023.
1 Introduction
Different areas of knowledge, such as Agriculture,
Medical and Biological Sciences, Social Sciences,
and others, base their experimental designs on linear
models.
Using matrix notation, a linear model can be represented as
$$y = X\beta + e, \qquad (1)$$
where $y$ is the observations vector, $X$ is the design matrix, $\beta$ is a vector of unknown parameters and $e$ is the errors vector.
Classifying linear models according to the nature of the parameters constituting the vector $\beta$, we consider fixed effects models when all the parameters of $\beta$ are constants, random effects models when all the parameters of $\beta$ are random, except for the intercept, and mixed effects models when some effects are fixed and others are random.
WSEAS TRANSACTIONS on MATHEMATICS
DOI: 10.37394/23206.2023.22.64
Carla Santos, Cristina Dias, Célia Nunes, João Tiago Mexia
E-ISSN: 2224-2880
577
Volume 22, 2023
Linear mixed effects models, or, adopting an abbreviated designation, mixed models, arise from the need to assess the amount of variation caused by given sources in fixed effects designs, [1], and prove to be appropriate for analyzing datasets involving correlated data or repeated measures, [2], as is common in experimental data in agricultural and medical sciences, for example.
In this work we address classes of mixed models, focusing on models with commutative orthogonal block structure (COBS), which constitute a special class within the subclass of mixed models introduced in [3], [4], called models with orthogonal block structure.
To lighten the writing, we refer to linear models with commutative orthogonal block structure simply as COBS.
Our approach to COBS relies on their algebraic
structure, based on commutative Jordan algebras of
symmetric matrices (CJAS). This approach proves
to be advantageous as it leads to important results in
the estimation of variance components and the
construction of models, [5].
We are interested in the possibility of
performing joint analysis of models obtained
separately. In [6], [7], [8], the theory that provides this joint analysis relies on operations between models, which are based on binary operations defined on commutative Jordan algebras.
In [6], taking [7] as a starting point, two operations between COBS were introduced, called models crossing and models nesting, resorting to the Kronecker matrix product and the restricted Kronecker matrix product. In [8], another operation to build up complex models from simpler ones was introduced, named models joining, based on another binary operation defined on commutative Jordan algebras, the Cartesian product.
Since COBS have least squares estimators (LSE) giving best linear unbiased estimators (BLUE) for estimable vectors, [9], the possibility of joint analysis of COBS obtained independently is relevant since, as proved in [8], the models joining operation involving COBS results in a model that is also COBS.
In previous works on operations with COBS, [7], [8], no condition was assumed to aggregate the models involved in the operations into a family of models with the same fundamental structure. The present work develops the operation of joining models, considering initial models that belong to a family of iso-structured models, that is, models that are independent, have identical spaces spanned by their mean vectors, and have covariance matrices given by linear combinations of the same pairwise orthogonal orthogonal projection matrices (POOPM).
The paper is structured as follows.
In carrying out the estimation for COBS we use commutative Jordan algebras of symmetric matrices to express the algebraic structure of those models; therefore, we start by presenting key results about Jordan algebras in section 2. In section 3 we present the formulation of COBS and the definition of iso-structured COBS, as well as results that will be useful when we join iso-structured COBS. In section 4 we discuss the estimation in COBS. Section 5 is devoted to the operation of models joining, involving iso-structured COBS. Some concluding remarks are presented in section 6.
2 Jordan Algebras
To formalize the notion of an algebra of observables, [10] introduced the structures that were originally designated as "r-number systems", and which later came to be known as Jordan algebras. For our purposes, we will follow the approach of [3], [4], in which Jordan algebras were used in the study of models with orthogonal block structure. Specifically, we are interested in commutative Jordan algebras of symmetric matrices (CJAS), which are vector spaces of symmetric matrices that commute and are closed under squaring, [11].
The rediscovery of Jordan algebras to carry out linear statistical inference, [12], showed that every CJAS has one and only one basis, called the principal basis, constituted by POOPM.
Let $\{Q_1, \ldots, Q_m\} = pb(\mathcal{A})$ be the principal basis of the CJAS $\mathcal{A}$. A matrix, $M$, belonging to the CJAS $\mathcal{A}$ is a linear combination of the matrices of the $pb(\mathcal{A})$, [13],
$$M = \sum_{j=1}^{m} a_j Q_j. \qquad (2)$$
It is evident that a family of matrices $\{M_1, \ldots, M_u\}$ belonging to the CJAS $\mathcal{A}$ is commutative since, as shown below, given any two matrices of this CJAS, $M_i = \sum_{j=1}^{m} a_j Q_j$ and $M_{i'} = \sum_{j=1}^{m} b_j Q_j$, these matrices commute:
$$\left(\sum_{j=1}^{m} a_j Q_j\right)\left(\sum_{j=1}^{m} b_j Q_j\right) = \sum_{j=1}^{m} a_j b_j Q_j = \left(\sum_{j=1}^{m} b_j Q_j\right)\left(\sum_{j=1}^{m} a_j Q_j\right), \qquad (3)$$
considering that the matrices $M_i$, $i = 1, \ldots, u$, are diagonalized by the same orthogonal matrix, [14].
The orthogonal projection matrix on the image space of $M$ will be
$$T(M) = \sum_{j \in C(M)} Q_j, \qquad (4)$$
with $C(M) = \{ j : a_j \neq 0 \}$, so, considering $R(M)$ the image space of $M$, we have $R(M) = R(T(M))$.
Since the Moore-Penrose inverse of $M$, expressed by
$$M^{+} = \sum_{j=1}^{m} a_j^{+} Q_j, \qquad (5)$$
with $a_j^{+} = a_j^{-1}$ [$a_j^{+} = 0$] if $a_j \neq 0$ [$a_j = 0$], belongs to $\mathcal{A}$, the Moore-Penrose inverses of the matrices of $\mathcal{A}$ also belong to $\mathcal{A}$.
When $M$ is invertible we have $M^{+} = M^{-1}$, so the inverses of invertible matrices of $\mathcal{A}$ also belong to $\mathcal{A}$.
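As a small numerical illustration of (2)–(5) (a sketch in NumPy; the two projection matrices and the coefficients are assumed purely for the example), linear combinations of a principal basis commute, and their Moore-Penrose inverses are obtained by inverting only the non-null coefficients:

```python
import numpy as np

# Two POOPM on R^4 (an assumed toy principal basis): the projection on the
# span of the all-ones vector and the projection on its orthogonal complement.
n = 4
Q1 = np.ones((n, n)) / n
Q2 = np.eye(n) - Q1                  # Q1 @ Q2 = 0 and Q1 + Q2 = I_n

# Two linear combinations of the principal basis commute.
M = 3.0 * Q1 + 0.0 * Q2              # a_1 = 3, a_2 = 0: M is singular
N = 2.0 * Q1 + 5.0 * Q2
assert np.allclose(M @ N, N @ M)

# Moore-Penrose inverse as in (5): invert only the non-null coefficients.
M_plus = (1.0 / 3.0) * Q1
assert np.allclose(M_plus, np.linalg.pinv(M))

# Orthogonal projection on the image space of M, as in (4).
assert np.allclose(M @ M_plus, Q1)
```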
Among the operations on CJAS, we are interested in considering the cartesian product, introduced in [15].
Given the CJAS $\mathcal{A}_l$, $l = 1, \ldots, r$, with principal bases $pb(\mathcal{A}_l) = \{Q_{l,1}, \ldots, Q_{l,m_l}\}$, $l = 1, \ldots, r$, their cartesian product,
$$\mathcal{A} = \prod_{l=1}^{r} \mathcal{A}_l, \qquad (6)$$
will be the CJAS whose principal basis is constituted by the block-wise diagonal matrices
$$D(M_1, \ldots, M_r), \qquad (7)$$
with principal blocks $M_l$, $l = 1, \ldots, r$, all of them null sub-matrices except one, which belongs to the principal basis with that index, there always existing exactly one non-null sub-matrix.
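A minimal sketch of the cartesian product construction in (6)–(7), with two assumed toy CJAS on $\mathbb{R}^2$ and $\mathbb{R}^3$; the block-wise diagonal matrices with a single non-null principal block are again POOPM adding up to the identity:

```python
import numpy as np

def D(*blocks):
    """Block-wise diagonal matrix with the given principal blocks."""
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        out[i:i + k, i:i + k] = b
        i += k
    return out

# Principal bases of two small CJAS (illustrative choices).
J2, J3 = np.ones((2, 2)) / 2, np.ones((3, 3)) / 3
pb1 = [J2, np.eye(2) - J2]           # CJAS A_1 on R^2
pb2 = [J3, np.eye(3) - J3]           # CJAS A_2 on R^3

# Principal basis of the cartesian product: one non-null block per matrix.
pb = [D(Q, np.zeros((3, 3))) for Q in pb1] + [D(np.zeros((2, 2)), Q) for Q in pb2]

for a in range(len(pb)):             # pairwise orthogonal projections...
    assert np.allclose(pb[a] @ pb[a], pb[a])
    for b in range(a + 1, len(pb)):
        assert np.allclose(pb[a] @ pb[b], np.zeros((5, 5)))
assert np.allclose(sum(pb), np.eye(5))   # ...that add up to the identity
```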
3 Models with Commutative
Orthogonal Block Structure
Let us consider a linear mixed model,
$$y = \sum_{i=0}^{w} X_i \beta_i, \qquad (8)$$
where $\beta_0$ is fixed, and $\beta_i$, $i = 1, \ldots, w$, are independent random vectors having null mean vectors and covariance matrices $\sigma_i^2 I_{c_i}$, where $c_i$ is the number of columns of $X_i$, $i = 1, \ldots, w$. The matrices $X_i$, $i = 0, \ldots, w$, are known. When the covariance matrix is given by
$$V(\gamma) = \sum_{j=1}^{m} \gamma_j Q_j, \qquad (9)$$
where the $Q_1, \ldots, Q_m$ are POOPM whose sum is the identity matrix,
$$\sum_{j=1}^{m} Q_j = I_n, \qquad (10)$$
the model (8) has an orthogonal block structure (OBS), [3], [4]. Moreover, model (8) is a model with commutative orthogonal block structure (COBS) when $T$, the orthogonal projection matrix on the space, $\Omega$, spanned by the mean vector, commutes with the covariance matrix $V(\gamma)$, whatever $\gamma$ with nonnegative components, [5]. This commutativity between $T$ and $V(\gamma)$, the characteristic condition of COBS, is a necessary and sufficient condition for the LSE, for estimable functions, to be uniformly best linear unbiased estimators (UBLUE), as proven in [9].
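The characteristic COBS condition can be checked numerically. A sketch, assuming a balanced one-way random effects layout with two groups of two observations (so the mean space is spanned by the all-ones vector): the projection $T$ commutes with $V(\gamma)$ for every nonnegative $\gamma$:

```python
import numpy as np

# Balanced one-way random effects layout: 2 groups x 2 replicates (n = 4).
n = 4
J4 = np.ones((n, n)) / n                      # projection on the overall-mean direction
Pg = np.kron(np.eye(2), np.ones((2, 2)) / 2)  # projection on the group-mean space
Q = [J4, Pg - J4, np.eye(n) - Pg]             # POOPM: mean, between- and within-groups

assert np.allclose(sum(Q), np.eye(n))         # the POOPM add up to the identity

T = J4                                        # mean vector spans the all-ones direction
for gamma in [(1.0, 2.0, 3.0), (0.5, 0.0, 4.0)]:
    V = sum(g * q for g, q in zip(gamma, Q))  # covariance matrix, as in (9)
    assert np.allclose(T @ V, V @ T)          # the COBS commutativity condition
```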
As stressed in [16], although in OBS the estimators for estimable vectors and variance components have good behavior, the inference is somewhat complex due to the combination of estimators obtained from different orthogonal projections on the range spaces of the matrices $Q_j$, $j = 1, \ldots, m$. COBS, the class of OBS resulting from the imposition of commutativity between the matrices $T$ and $V(\gamma)$, allows overcoming this difficulty.
The study of COBS using an approach based on their algebraic structure leads to interesting results in the estimation of variance components and the construction of models. This approach has been adopted in several works. In [11], an alternative condition for the definition of COBS was established, resorting to U-matrices. The focus in [17] was on structured families of COBS. In [18], [19], the relationships between COBS and other models were considered. Works on inference in COBS were developed in [13], [16], [20], [21], [22], [23]. In [7], [8], operations with models were introduced.
In this work we are interested in COBS associated with experiments carried out with the same design, that is, models with the same algebraic structure and independent observations vectors.
Let us now designate by $\mathcal{A}$ the CJAS constituted by the linear combinations of the POOPM $Q_1, \ldots, Q_m$ in (9) and (10). The CJAS $\mathcal{A}$, whose principal basis is $pb(\mathcal{A}) = \{Q_1, \ldots, Q_m\}$, contains the products of its matrices and, also, their Moore-Penrose inverses, since the Moore-Penrose inverse of an orthogonal projection matrix is, itself, an orthogonal projection matrix. Indeed,
$$\left(\sum_{j=1}^{m} a_j Q_j\right)^{+} = \sum_{j=1}^{m} a_j^{+} Q_j, \qquad (11)$$
with $a_j^{+} = a_j^{-1}$ [$a_j^{+} = 0$] when $a_j \neq 0$ [$a_j = 0$], $j = 1, \ldots, m$.
Now, the matrices of a family of symmetric matrices commute if and only if they are diagonalized by the same orthogonal matrix $P$, [14]. Then that family will be contained in $\mathcal{V}(P)$, the family of matrices diagonalized by $P$, which is itself a CJAS.
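The diagonalization statement can be illustrated numerically (a sketch with assumed toy matrices): two commuting symmetric matrices are diagonalized by the orthogonal eigenvector matrix of a generic member of the family:

```python
import numpy as np

# Two commuting symmetric matrices built from the same projections.
J4 = np.ones((4, 4)) / 4
M1 = 3.0 * J4 + 1.0 * (np.eye(4) - J4)
M2 = 0.5 * J4 + 2.0 * (np.eye(4) - J4)
assert np.allclose(M1 @ M2, M2 @ M1)

# The orthogonal eigenvector matrix of one combination diagonalizes both.
_, P = np.linalg.eigh(M1 + 0.3 * M2)
for M in (M1, M2):
    Dm = P.T @ M @ P
    assert np.allclose(Dm, np.diag(np.diag(Dm)))  # off-diagonal entries vanish
```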
Since the intersection of CJAS is a CJAS, intersecting all the CJAS that contain a family of commuting symmetric matrices gives the least CJAS that contains that family, [6]. This will be the CJAS $\mathcal{A}(\mathcal{M})$, generated by the family $\mathcal{M}$. If $\mathcal{M} = \{M_1, \ldots, M_u\}$, with $M_i M_{i'} = M_{i'} M_i$, $i, i' = 1, \ldots, u$, is a family of commuting symmetric matrices, there will be a generated CJAS, $\mathcal{A}(\mathcal{M})$.
We point out that $T(M)$ and $M^{+}$ belong to $\mathcal{A}(\mathcal{M})$, so, if $\mathcal{A}$ is the CJAS to which the model is associated, both $T(M)$ and $M^{+}$ will belong to $\mathcal{A}$. We are assuming that $\mathcal{A}$ is the CJAS generated by $T, Q_1, \ldots, Q_m$, with $T$ the orthogonal projection matrix on $\Omega$.
To obtain $\mathcal{A}(\mathcal{M})$, with $\mathcal{M}$ the family constituted by $T$ and $Q_1, \ldots, Q_m$, we reorder the $Q_j$, giving the first $z_1$ ranks to the $Q_j$ with range space $R(Q_j) \subseteq \Omega$, the next $z_2$ ranks to the $Q_j$ such that we have neither $R(Q_j) \subseteq \Omega$ nor $R(Q_j) \subseteq \Omega^{\perp}$, with $\Omega^{\perp}$ the orthogonal complement of $\Omega$, and the last ranks to the $Q_j$ with range space $R(Q_j) \subseteq \Omega^{\perp}$; we will have $T = \sum_{j=1}^{z_1} Q_j + \sum_{j=z_1+1}^{z_1+z_2} T Q_j$.
Now the product of orthogonal projection matrices that commute is an orthogonal projection matrix. Then $\mathcal{A}(\mathcal{M})$ will contain the matrices $T Q_j$ and $Q_j - T Q_j$, with $j = z_1+1, \ldots, z_1+z_2$, and
$$\{\dot{Q}_1, \ldots, \dot{Q}_{\dot{m}}\} = \{Q_1, \ldots, Q_{z_1}\} \cup \{T Q_j,\, Q_j - T Q_j : j = z_1+1, \ldots, z_1+z_2\} \cup \{Q_{z_1+z_2+1}, \ldots, Q_m\}. \qquad (12)$$
That is, the matrices of the second set originate pairs $(T Q_j, Q_j - T Q_j)$ of matrices $\dot{Q}$. In this construction, the matrices $\dot{Q}_h$ with $R(\dot{Q}_h) \subseteq \Omega$ are grouped first, followed by the matrices $\dot{Q}_h$ originated by the $Q_j$ whose range spaces intersect both $\Omega$ and $\Omega^{\perp}$, and finally the matrices $\dot{Q}_h$ with $R(\dot{Q}_h) \subseteq \Omega^{\perp}$.
We point out that $\dot{Q}_1, \ldots, \dot{Q}_{\dot{m}}$ are POOPM that add up to $I_n$. We now establish the following result.
Proposition 3.1
The principal basis of $\mathcal{A}$ corresponds to $pb(\mathcal{A}) = \{\dot{Q}_1, \ldots, \dot{Q}_{\dot{m}}\}$, with $\mathcal{A}$ the CJAS generated by $\{T, Q_1, \ldots, Q_m\}$.
Proof: Any CJAS containing the matrices $T$ and $Q_1, \ldots, Q_m$ contains the matrices $\dot{Q}_1, \ldots, \dot{Q}_{\dot{m}}$, which are POOPM, thus constituting $pb(\mathcal{A})$.
The definition of iso-structured models with commutative orthogonal block structure (iso-structured COBS) was introduced in [24]. According to this definition, iso-structured COBS have:
- covariance matrices that are linear combinations of the same POOPM, $Q_1, \ldots, Q_m$;
- mean vectors that span the same space, $\Omega$.
The interest in families of iso-structured models lies in coping with inference for sets of models with the same algebraic structure and independent observation vectors. Our discussion is centered on what follows for the joint analysis of independent models when they have the same algebraic structure given by CJAS.
4 Estimation
We start by pointing out that the LSE $\tilde{\mu} = T y$ of the mean vector $\mu$, where $T$ is the orthogonal projection matrix on $\Omega$, is UBLUE.
To consider the estimation of variance components we assume to have the model given by
$$y = \sum_{i=0}^{w} X_i \beta_i, \qquad (13)$$
with $\beta_1, \ldots, \beta_w$ having null mean vectors, null cross-covariance matrices, and covariance matrices $\sigma_i^2 I_{c_i}$, where $c_i$ is the number of columns of $X_i$, $i = 1, \ldots, w$. The $\sigma_1^2, \ldots, \sigma_w^2$ will be the usual variance components, so $y$ has the covariance matrix,
$$V(\sigma^2) = \sum_{i=1}^{w} \sigma_i^2 M_i, \qquad (14)$$
with $M_i = X_i X_i^{\top}$, $i = 1, \ldots, w$.
If $M_1, \ldots, M_w$ commute, these matrices generate the CJAS $\mathcal{A}$, with $pb(\mathcal{A}) = \{Q_1, \ldots, Q_m\}$. Then we will have
$$M_i = \sum_{j=1}^{m} b_{i,j} Q_j, \quad i = 1, \ldots, w, \qquad (15)$$
with $B = [b_{i,j}]$ the $w \times m$ matrix of the coefficients of $M_1, \ldots, M_w$ on the principal basis, since these matrices belong to $\mathcal{A}$.
We thus get,
$$V(\sigma^2) = \sum_{j=1}^{m} \gamma_j Q_j, \qquad (16)$$
with $\gamma_j = \sum_{i=1}^{w} b_{i,j} \sigma_i^2$, $j = 1, \ldots, m$, as well as
$$\gamma = B^{\top} \sigma^2, \qquad (17)$$
where the $\gamma_j$, $j = 1, \ldots, m$, are the canonical variance components.
Given the relations between the $\gamma_j$, $j = 1, \ldots, m$, and the $\sigma_i^2$, $i = 1, \ldots, w$, we also get, when $B$ has linearly independent row vectors,
$$\sigma^2 = (B B^{\top})^{-1} B \gamma. \qquad (18)$$
Besides this we have
$$V(\gamma) = \sum_{j=1}^{z} \gamma'_j Q'_j + \sum_{j=z+1}^{m} \gamma'_j Q'_j,$$
with $Q'_1, \ldots, Q'_m$ the POOPM reordered as above, the $Q'_j$, $j = 1, \ldots, z$, having range spaces contained in $\Omega$, and $\gamma'_1, \ldots, \gamma'_m$ the corresponding canonical variance components, and for the $\gamma'_j$, $j = z+1, \ldots, m$, we have the estimators
$$\tilde{\gamma}'_j = \frac{\|Q'_j y\|^2}{g'_j}, \quad j = z+1, \ldots, m,$$
where $g'_j$ is the rank of $Q'_j$, $j = z+1, \ldots, m$. Note that, in general, we cannot estimate the $\gamma'_j$, $j = 1, \ldots, z$, and that the estimators for the remaining variance components follow from
$$\tilde{y} = (I_n - T) y,$$
implying $\tilde{y}$ to have a null mean vector and covariance matrix $\sum_{j=z+1}^{m} \gamma'_j Q'_j$, with $Q'_j \tilde{y} = Q'_j y$, $j = z+1, \ldots, m$.
Moreover, we will have $E[\tilde{\gamma}'_j] = \gamma'_j$, $j = z+1, \ldots, m$, which lightens the estimation of the estimable variance components.
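The unbiasedness behind these estimators can be checked without simulation, since $E\big[\|Q y\|^2\big] = \mathrm{tr}(Q V) + \mu^{\top} Q \mu$ for an orthogonal projection $Q$. A sketch, reusing the assumed balanced one-way layout and illustrative values of the canonical variance components:

```python
import numpy as np

# Balanced one-way layout POOPM: overall mean, between groups, within groups.
n = 4
J4 = np.ones((n, n)) / n
Pg = np.kron(np.eye(2), np.ones((2, 2)) / 2)
Q = [J4, Pg - J4, np.eye(n) - Pg]
g = [int(round(np.trace(q))) for q in Q]       # ranks of the projections: 1, 1, 2

gamma = np.array([2.0, 5.0, 1.0])              # illustrative canonical components
V = sum(gam * q for gam, q in zip(gamma, Q))

# For j with Q_j mu = 0 (here j = 1, 2, since mu lies in the range of Q_0):
# E[||Q_j y||^2] = tr(Q_j V) = g_j * gamma_j, so ||Q_j y||^2 / g_j is unbiased.
for j in (1, 2):
    assert np.isclose(np.trace(Q[j] @ V) / g[j], gamma[j])
```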
We now point out that,
$$E\big[\|Q'_j y\|^2\big] = g'_j \gamma'_j, \quad j = z+1, \ldots, m. \qquad (19)$$
Let us put $\gamma^1 = (\gamma'_1, \ldots, \gamma'_z)^{\top}$ and $\gamma^2 = (\gamma'_{z+1}, \ldots, \gamma'_m)^{\top}$, and consider the partition of matrix $B$,
$$B = [B_1 \; B_2], \qquad (20)$$
where matrix $B_1$ has $z$ columns. Thus
$$\gamma^1 = B_1^{\top} \sigma^2, \quad \gamma^2 = B_2^{\top} \sigma^2, \qquad (21)$$
and when $B_2$ has linearly independent row vectors the same happens with the column vectors of $B_2^{\top}$ and, [7],
$$\begin{cases} \sigma^2 = (B_2 B_2^{\top})^{-1} B_2 \gamma^2 \\ \gamma^1 = B_1^{\top} (B_2 B_2^{\top})^{-1} B_2 \gamma^2 \end{cases} \qquad (22)$$
So, we may estimate $\sigma^2$ and $\gamma^1$ through $\tilde{\gamma}^2$. Then the relevant parameters $\sigma^2$ and $\gamma^2$, of the random effects part of the model, determine each other. This reveals that there is segregation, since that part of the model segregates a sub-model.
If the columns of $B_2$ and the $w$ first columns of the identity matrix are identical, the corresponding components of $\gamma^2$ and $\sigma^2$ are also identical. This is called matching. Thus, estimating $\gamma^2$ leads directly to estimates of $\sigma^2$. Besides this, since the models are COBS, the LSE is UBLUE, [9], so we only considered in detail the estimation of the variance components, both canonical, $\gamma_j$, $j = 1, \ldots, m$, and usual, $\sigma_i^2$, $i = 1, \ldots, w$.
5 Joining Models
The matrices $X_i$, $i = 0, \ldots, w$, express the structure of the model. Namely, $\{M_1, \ldots, M_w\}$ and $\{T, Q_1, \ldots, Q_m\}$ generate the relevant CJAS. We thus say that models with the same matrices $X_i$, $i = 0, \ldots, w$, are iso-structured.
Let us now consider independent observations vectors of $r$ iso-structured COBS, $y(1), \ldots, y(r)$, and represent by $Q_j$, $j = 1, \ldots, m$, the POOPM of the $V(l)$, $l = 1, \ldots, r$. Applying the models joining operation to these models, by overlapping the observations vectors of the initial models, [8], we obtain a joined model, whose observations vector is
$$y = \big[\, y(1)^{\top} \cdots y(r)^{\top} \big]^{\top} \qquad (23)$$
and the CJAS $\mathcal{A} = \prod_{l=1}^{r} \mathcal{A}_l$, given by the cartesian product of the CJAS $\mathcal{A}_l$, with $pb(\mathcal{A}_l) = \{Q_1, \ldots, Q_m\}$, $l = 1, \ldots, r$, whose principal basis is constituted by the blockwise diagonal matrices with null principal blocks, but one which will belong to $pb(\mathcal{A}_l)$, with $l$ as its index. The null blocks will have the same dimension as the matrices of the corresponding CJAS, [8].
Given the independence of $y(1), \ldots, y(r)$, the covariance matrix of the joined model will be the blockwise diagonal matrix,
$$V = D\big(V(\gamma(1)), \ldots, V(\gamma(r))\big), \qquad (24)$$
with principal blocks $V(\gamma(1)), \ldots, V(\gamma(r))$, where $\gamma(1), \ldots, \gamma(r)$ are the vectors of canonical variance components for $y(1), \ldots, y(r)$. So, the estimators for variance components obtained for the individual models can be used for the joint model.
With $\beta(1), \ldots, \beta(r)$ the vectors of coefficients and $\mu(1), \ldots, \mu(r)$ the mean vectors of the iso-structured (individual) COBS, for the joint model we will have the vector of coefficients,
$$\beta = \big[\, \beta(1)^{\top} \cdots \beta(r)^{\top} \big]^{\top} \qquad (25)$$
and the mean vector
$$\mu = \big[\, \mu(1)^{\top} \cdots \mu(r)^{\top} \big]^{\top}. \qquad (26)$$
Thus, with $\otimes$ denoting the Kronecker matrix product, the orthogonal projection matrix on the space spanned by $\mu$ will be,
$$\bar{T} = D(T, \ldots, T) = I_r \otimes T, \qquad (27)$$
with $T$ the orthogonal projection matrix on $\Omega$.
As shown below, for the joined model, the covariance matrix, $D\big(V(\gamma(1)), \ldots, V(\gamma(r))\big)$, and the orthogonal projection matrix on the space spanned by the mean vector, $\bar{T} = D(T, \ldots, T)$, commute,
$$\begin{aligned} D(T, \ldots, T)\, D\big(V(\gamma(1)), \ldots, V(\gamma(r))\big) &= D\big(T V(\gamma(1)), \ldots, T V(\gamma(r))\big) \\ &= D\big(V(\gamma(1))\, T, \ldots, V(\gamma(r))\, T\big) \\ &= D\big(V(\gamma(1)), \ldots, V(\gamma(r))\big)\, D(T, \ldots, T). \end{aligned} \qquad (28)$$
This means that the joined model is also COBS. As already established, this guarantees that the LSE of the estimable functions in this model will be UBLUE. Namely, we also will have
$$\bar{T} y = D(T, \ldots, T)\big[\, y(1)^{\top} \cdots y(r)^{\top} \big]^{\top} = \big[\, (T y(1))^{\top} \cdots (T y(r))^{\top} \big]^{\top},$$
so the LSE of $\beta$ will be
$$\tilde{\beta} = \big[\, \tilde{\beta}(1)^{\top} \cdots \tilde{\beta}(r)^{\top} \big]^{\top},$$
with $\tilde{\beta}(l)$, $l = 1, \ldots, r$, the LSE estimator of the coefficients vector of the $l$-th model.
Now the $\tilde{\beta}(l)$, $l = 1, \ldots, r$, will be independent and BLUE for each model. Moreover, if the $\beta^{\circ}(l)$, $l = 1, \ldots, r$, are linear unbiased statistics obtained from the initial models, the
$$Cov\big(\beta^{\circ}(l)\big) - Cov\big(\tilde{\beta}(l)\big), \quad l = 1, \ldots, r,$$
will be positive semi-definite, since the $\tilde{\beta}(l)$, $l = 1, \ldots, r$, are BLUE. Now the $y(1), \ldots, y(r)$ are independent, so the $\beta^{\circ}(l)$, $l = 1, \ldots, r$, are independent, with
$$Cov\big(\beta^{\circ}\big) = D\big(Cov(\beta^{\circ}(1)), \ldots, Cov(\beta^{\circ}(r))\big),$$
and we have
$$Cov\big(\beta^{\circ}\big) - Cov\big(\tilde{\beta}\big) = D\big(Cov(\beta^{\circ}(1)) - Cov(\tilde{\beta}(1)), \ldots, Cov(\beta^{\circ}(r)) - Cov(\tilde{\beta}(r))\big),$$
which is positive semi-definite, so $\tilde{\beta}$ is BLUE for the joined model.
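The joining construction itself is easy to verify numerically. A sketch, joining two assumed iso-structured models on $\mathbb{R}^4$ that share the POOPM and mean space of the one-way layout used earlier: the joined model inherits the commutativity between projection and covariance, and the LSE of the joint mean stacks the individual LSE:

```python
import numpy as np

# Two iso-structured models on R^4: same POOPM and mean space, different gammas.
n = 4
J4 = np.ones((n, n)) / n
Pg = np.kron(np.eye(2), np.ones((2, 2)) / 2)
Q = [J4, Pg - J4, np.eye(n) - Pg]
T = J4                                          # projection on the common mean space

V1 = 2.0 * Q[0] + 1.0 * Q[1] + 3.0 * Q[2]       # covariance of model 1
V2 = 4.0 * Q[0] + 0.5 * Q[1] + 1.0 * Q[2]       # covariance of model 2

# Joined model: block-diagonal covariance and T_bar = I_r (x) T, here with r = 2.
Z = np.zeros((n, n))
V = np.block([[V1, Z], [Z, V2]])
T_bar = np.kron(np.eye(2), T)
assert np.allclose(T_bar @ V, V @ T_bar)        # the joined model is again COBS

# The LSE of the joint mean is the stacking of the individual LSE.
y1, y2 = np.arange(1.0, 5.0), np.arange(5.0, 9.0)
y = np.concatenate([y1, y2])
assert np.allclose(T_bar @ y, np.concatenate([T @ y1, T @ y2]))
```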
Going over to the estimators of variance components, we point out that the joined models, being iso-structured, have identical matrices $X_i$, $i = 0, \ldots, w$, and $Q_j$, $j = 1, \ldots, m$, as well as identical CJAS. The variance components are distinct from model to model, so we can estimate them separately.
6 Conclusion
The models joining operation opens the possibility of jointly treating models obtained separately.
When dealing with models with commutative orthogonal block structure (COBS), we obtain uniformly best linear unbiased estimators for estimable functions of the joined models and estimate their variance components.
Addressing iso-structured COBS, that is, models whose covariance matrices are linear combinations of the same POOPM and whose mean vectors span the same space, we have shown that joining iso-structured COBS gives COBS and that the estimators for the joint model may be obtained from those for the individual models. In particular, the estimators of variance components for the individual models can be used for the joint model. Moreover, the BLUE for the vector $\beta = \big[\beta(1)^{\top} \cdots \beta(r)^{\top}\big]^{\top}$ of coefficients for the joint model is $\tilde{\beta} = \big[\tilde{\beta}(1)^{\top} \cdots \tilde{\beta}(r)^{\top}\big]^{\top}$, where the sub-vectors $\tilde{\beta}(l)$, $l = 1, \ldots, r$, are the LSE for the coefficients' vectors of the sub-models. Thus, the estimators obtained for the sub-models can be applied to the joint model. Since joining COBS gives COBS, the optimality of the LSE extends from the individual models to the joint model.
Acknowledgment:
The authors are grateful to the reviewers for their
insightful and constructive comments and valuable
suggestions, which have helped improve this work.
References:
[1] Khuri A., Mathew T., Sinha B., Statistical
Tests for Mixed Linear Models. Wiley, New
York, 1998.
[2] West B., Welch K., Galecki A., Linear Mixed
Models: A Practical Guide Using Statistical
Software. Taylor and Francis Group, LLC,
2007.
[3] Nelder J. A., The analysis of randomized
experiments with orthogonal block structure I,
Block structure and the null analysis of
variance. Proc. R. Soc. Lond. Ser. A Math.,
Phys. Eng. Sci., Vol.283, 1965a, pp.147–162.
[4] Nelder J. A., The analysis of randomized
experiments with orthogonal block structure
II. Treatment structure and the general
analysis of variance. Proc. R. Soc. Lond. Ser.
A Math., Phys. Eng. Sci., Vol. 283, 1965b,
pp.163–178.
[5] Fonseca M., Mexia J. T., Zmyślony R.,
Inference in normal models with commutative
orthogonal block structure. Acta et
Commentationes Universitatis Tartuensis de
Mathematica, Vol.12, 2008, pp.3–16.
[6] Fonseca M., Mexia J. T., Zmyślony R., Binary
operations on Jordan algebras and orthogonal
normal models. Linear Algebra Appl.,
Vol.117(1), 2006, pp.75–86.
[7] Mexia J. T., Vaquinhas R., Fonseca M.,
Zmyślony R., COBS: segregation, matching,
crossing and nesting. in: Latest Trends and
Applied Mathematics, Simulation, Modelling,
4th International Conference on Applied
Mathematics, Simulation, Modelling, ASM’10.
2010, pp. 249–255.
[8] Santos C., Nunes C., Dias C., Mexia J. T.,
Joining models with commutative orthogonal
block structure. Linear Algebra and its
Applications, Vol.517, 2017, pp.235 – 245.
[9] Zmyślony R., A characterization of best linear
unbiased estimators in the general linear
model. Lecture Notes in Statistics, Vol.2,
1978, pp.365–373.
[10] Jordan P., von Neumann J., Wigner E., On an
algebraic generalization of the quantum
mechanical formalism. Annals of Mathematics,
Vol.35(1), 1934, pp.29–64.
[11] Santos C., Nunes C., Dias C., Mexia J. T.,
Models with commutative orthogonal block
structure: a general condition for
commutativity. Journal of Applied
Statistics, Vol.47(13-15), 2020, pp.2421–2430.
[12] Seely J., Linear spaces and unbiased
estimation. Ann. Math. Stat., Vol.41, 1970,
pp.1725 – 1734.
[13] Carvalho F., Mexia J. T., Santos C., Nunes C.,
Inference for types and structured families of
commutative orthogonal block
structures. Metrika, Vol.78, 2015, pp.337–
372.
[14] Schott J. R., Matrix Analysis for Statistics.
New York: John Wiley & Sons, 1997.
[15] Santos C, Ramos P, Mexia J T, Algebraic
structure of step nesting designs. Discussiones
Mathematicae Probability and Statistics, Vol.
30, 2010, pp.221–235.
[16] Carvalho F., Mexia J. T., Oliveira M.,
Estimation in Models with Commutative
Orthogonal Block Structure. J. Stat. Theory
Practice, Vol.3(2), 2009, pp.525–535.
[17] Carvalho F., Mexia J. T., Covas R, Structured
Families of Models with Commutative
Orthogonal Block Structures. AIP Conf.
Proc., Vol.1281, 2010, pp.1256–1259.
[18] Carvalho F., Mexia J. T., Santos C.,
Commutative orthogonal block structure and
error orthogonal models. Electronic Journal
of Linear Algebra, Vol.25, 2013, pp.119–128.
[19] Santos C, Nunes C, Mexia J. T., OBS, COBS
and mixed models associated to commutative
Jordan Algebra. Bulletin of the ISI, LXII,
Proceedings of 56th session of the
International Statistical Institute, Lisbon,
2008, pp. 3271–3274.
[20] Carvalho F., Mexia J. T., Oliveira M. M.,
Canonic inference and commutative
orthogonal block structure. Discussiones
Mathematicae Probability and Statistics,
Vol.28(2), 2008, pp. 171–181.
[21] Ferreira, S.S., Ferreira, D., Nunes, C.,
Carvalho, F., Mexia, J.T. (2019). Orthogonal
Block Structure and Uniformly Best Linear
Unbiased Estimators. In: Ahmed, S.,
Carvalho, F., Puntanen, S. (eds) Matrices,
Statistics and Big Data. IWMS 2016.
Contributions to Statistics. Springer, Cham.
https://doi.org/10.1007/978-3-030-17519-1_7
pp.89-98.
[22] Mexia J. T., Nunes C., Santos C., Structured
families of normal models with COBS. 17th
International Workshop in Matrices and
Statistics, Tomar (Portugal). Conference
paper. 2008.
[23] Nunes C., Santos C., Mexia J. T., Relevant
statistics for models with commutative
orthogonal block structure and unbiased
estimator for variance components. J.
Interdiscip. Math., Vol.11, 2008, pp.553–564.
[24] Santos C., Nunes C., Dias C., Mexia J. T.,
Operations with iso-structured models with
commutative orthogonal block structure: an
introductory approach. Statistical Modelling
on Risk Analysis: Selected contributions from
ICRA9, Perugia, Italy, May 25-27, 2022. (C.
P. Kitsos, T. A. Oliveira, F. Pierri, M.
Restaino, eds.) Springer Proceedings in
Mathematics & Statistics, Springer, 2023. eBook ISBN: 978-3-031-39864-3.
Contribution of Individuals to the Creation of a
Scientific Article (Ghostwriting Policy)
The authors equally contributed to the present
research, at all stages from the formulation of the
problem to the final findings and solution.
Sources of Funding for Research Presented in a
Scientific Article or Scientific Article Itself
This work is partially funded by national funds
through the FCT - Fundação para a Ciência e a
Tecnologia, I.P., under the scope of the projects
UIDB/00297/2020 and UIDP/00297/2020 (Center
for Mathematics and Applications).
Conflict of Interest
The authors have no conflicts of interest to declare
that are relevant to the content of this article.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the
Creative Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en
_US