Recursive Least-Squares Wiener Consensus Filter and Fixed-Point
Smoother in Distributed Sensor Networks
SEIICHI NAKAMORI
Professor Emeritus, Faculty of Education
Kagoshima University
1-20-6, Korimoto, Kagoshima, 890-0065
JAPAN
Abstract: - The distributed Kalman filter (DKF) is classified into the information fusion Kalman filter (IFKF), i.e.,
the centralized Kalman filter (CKF), and the Kalman consensus filter (KCF) in distributed sensor networks.
The KCF has the advantage of improving the state estimates at the sensor nodes uniformly by incorporating
the information of the observations and the filtering estimates at the neighbor nodes. In the first devised KCF,
the consensus gain is adjusted by the user. This paper designs the recursive least-squares (RLS) Wiener consensus filter
and fixed-point smoother, which do not need such adjustment, in linear discrete-time stochastic systems. In addition
to the observation equation at the sensor node, a new observation equation is introduced. Here, the new
observation is the sum of the filtering estimates of the signals at the neighbor nodes of the sensor node. Thus, the
RLS Wiener consensus estimators can be interpreted as incorporating the information of the observations at the
neighbor nodes indirectly, because those observations are used in the calculations of the filtering estimates. A
numerical simulation example shows that the proposed RLS Wiener consensus filter and fixed-point smoother
are superior in estimation accuracy to the RLS Wiener estimators.
Key-Words: - RLS Wiener consensus filter, RLS Wiener consensus fixed-point smoother, distributed sensor
networks, Kalman consensus filter, information fusion filter.
Received: April 25, 2022. Revised: December 19, 2022. Accepted: January 13, 2023. Published: February 22, 2023.
1 Introduction
Over the last decade or more, the Kalman consensus
filter (KCF) has been studied extensively in linear
discrete-time or continuous-time systems, e.g., [1]-
[9]. Casbeer and Beard [10] present an information
consensus filter (ICF) in distributed sensor
networks. Li, Caimou, and Haoji, [11], study the
KCF and the ICF, where they propose a new
optimization procedure to update the consensus
weights of the ICF. Wu et al., [4], propose the KCF
by introducing the consensus gain, as shown in (7)
and (8) of their paper, for linear continuous-time
systems. AminiOmam, Torkamani-Azar, and
Ghorashi, [12], propose a generalized Kalman
consensus filter for nonlinear discrete-time systems,
and its asymptotic stability is proved based on the
Lyapunov method. Chen et al.,
[13], propose the distributed state estimator in
discrete-time nonlinear systems and present the
distributed cubature information filtering algorithm.
Olfati-Saber [14]-[16] first proposed the
KCF. Referring to Olfati-Saber [14]-[16], Takaba
[17] explains in Japanese the distributed Kalman
filter (DKF). The DKF is classified into the
information fusion Kalman filter (IFKF), i.e., the
centralized Kalman filter (CKF), and the KCF in
distributed sensor networks. The KCF has the
advantage of improving the state estimates at the
sensor nodes uniformly by incorporating the
information of the observations and the filtering
estimates at the neighbor nodes. In the calculation of
the filtering estimate at the sensor node, the KCF,
[16], uses the one-step-ahead prediction estimates of
the states at the neighbor nodes of the sensor node in
addition to the observed value at the sensor node. In
the first devised KCF, the consensus gain is adjusted
by the user. This paper designs the recursive least-squares
(RLS) Wiener consensus filter and fixed-point
smoother, which do not need such adjustment, in linear
discrete-time stochastic systems. In addition to the
observation equation at the sensor node, a new
observation equation is introduced. Here, the new
observation is the sum of the filtering estimates of the
signals at the neighbor nodes of the sensor node.
Thus, the RLS Wiener consensus estimators can be
interpreted as incorporating the information of the
observations at the neighbor nodes indirectly, because
these observations are used in the calculations of the
filtering estimates at the neighbor nodes.
Section 2 introduces the least-squares consensus
fixed-point smoothing problem. Section 3 presents
the RLS Wiener consensus filtering and fixed-point
smoothing algorithms. Section 4 presents the
recursive algorithm for the estimation error variance
function of the RLS Wiener consensus fixed-point
smoother. Also, the asymptotic stability condition of
the RLS Wiener consensus filter and the existence
of the RLS Wiener consensus fixed-point smoother
are shown. A numerical simulation example is
shown in section 5 to demonstrate the estimation
characteristic of the RLS Wiener consensus filter
and fixed-point smoother. From the numerical
simulation example, the proposed RLS Wiener
consensus filter and fixed-point smoother are
superior in estimation accuracy to the RLS Wiener
filter and fixed-point smoother respectively.
2 Least-squares consensus fixed-point
smoothing problem
Consider the state equation for the state vector x(k)
in linear discrete-time stochastic systems

x(k+1) = \Phi x(k) + \Gamma w(k),
(1)

where x(k) is the state vector with n components at
time k, \Phi is the system matrix, \Gamma is the input matrix
and w(k) is the zero-mean input noise. For the
sensor nodes i, 1 \le i \le N, each sensor node has
the observation equation

y_i(k) = z_i(k) + v_i(k),  z_i(k) = H_i x(k),
(2)

where y_i(k) is the m-dimensional observed value at
the sensor node i, z_i(k) is the signal at the sensor
node i, H_i is the m by n observation matrix at the
sensor node i and v_i(k) is the zero-mean white
observation noise at the sensor node i. The auto-covariance
functions of w(k) and v_i(k) are given by

E[w(k) w^T(s)] = Q \delta_K(k-s),
E[v_i(k) v_i^T(s)] = R_i \delta_K(k-s),
(3)

where \delta_K(\cdot) denotes the Kronecker delta function.
According to the observation equation (2), Olfati-Saber
[16] presents the Kalman consensus filter as
follows.
Filtering estimate of the state x(k) at the sensor
node i: \hat{x}_i(k,k)

\hat{x}_i(k,k) = \bar{x}_i(k) + K_i(k) (y_i(k) - H_i \bar{x}_i(k))
              + C_i(k) \sum_{j \in N_i} (\bar{x}_j(k) - \bar{x}_i(k)),
\bar{x}_i(k) = \Phi \hat{x}_i(k-1,k-1).
(4)

K_i(k) and C_i(k) are the Kalman gain and the
consensus gain, respectively. The equations for the
Kalman gain and the consensus gain are shown in
[16]. In (4), the filtering estimate at the sensor
node i uses the observed value y_i(k) at the sensor
node i together with the one-step-ahead prediction
estimates \bar{x}_j(k) of x(k) at the neighbor
nodes j \in N_i of the sensor node i. Here, it should be
noted that the Kalman filter at each neighbor node j
calculates the estimates \bar{x}_j(k) recursively from its
observed values y_j(\cdot).
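The following sketch shows one update of the Kalman consensus filter in the spirit of (4). It is only an illustration: the exact Kalman and consensus gain expressions are given in [16]; here the Kalman gain is taken in the standard covariance form and the consensus gain C_i is treated as a user-supplied matrix, and the function and argument names are assumptions of the sketch.

import numpy as np

def kcf_update(x_bar_i, P_i, y_i, x_bar_neighbors, Phi, Gamma, Q, H_i, R_i, C_i):
    # Kalman gain from the prior covariance P_i (standard covariance form, assumed)
    S = H_i @ P_i @ H_i.T + R_i
    K_i = P_i @ H_i.T @ np.linalg.inv(S)
    # consensus term over the one-step-ahead predictions of the neighbor nodes
    consensus = sum((x_bar_j - x_bar_i for x_bar_j in x_bar_neighbors),
                    np.zeros_like(x_bar_i))
    # filtering estimate at node i, cf. (4)
    x_hat_i = x_bar_i + K_i @ (y_i - H_i @ x_bar_i) + C_i @ consensus
    # covariance update and one-step-ahead prediction for the next step
    P_post = (np.eye(P_i.shape[0]) - K_i @ H_i) @ P_i
    x_bar_next = Phi @ x_hat_i
    P_next = Phi @ P_post @ Phi.T + Gamma @ Q @ Gamma.T
    return x_hat_i, x_bar_next, P_next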
Referring to Olfati-Saber [14]-[16], Takaba [17]
summarizes the Kalman consensus filter as follows.
The observation y_i^c(k) at the sensor node i is given
by

y_i^c(k) = \begin{bmatrix} y_i(k) \\ y_{j_1}(k) \\ \vdots \\ y_{j_n}(k) \end{bmatrix}
         = H_i^c x(k) + v_i^c(k),
H_i^c = \begin{bmatrix} H_i \\ H_{j_1} \\ \vdots \\ H_{j_n} \end{bmatrix},
v_i^c(k) = \begin{bmatrix} v_i(k) \\ v_{j_1}(k) \\ \vdots \\ v_{j_n}(k) \end{bmatrix},
\{ j_1, j_2, \ldots, j_n \} = N_i.
(5)

From (2) and (5), y_i^c(k) consists of the observed
value y_i(k) at the sensor node i and its neighbor
observed values y_{j_1}(k), y_{j_2}(k), \ldots, y_{j_n}(k) at time k.
v_i^c(k) has the auto-covariance function

E[v_i^c(k) (v_i^c(s))^T] = R_i^c \delta_K(k-s),
R_i^c = \mathrm{diag}(R_i, R_{j_1}, \ldots, R_{j_n}).
(6)
The Kalman consensus filter calculates the filtering
estimate \hat{x}_i(k,k), at the sensor node i, of the state
x(k) recursively by (7) - (10).

\hat{x}_i(k,k) = \bar{x}_i(k) + K_i(k) (y_i^c(k) - H_i^c \bar{x}_i(k))
              + \gamma \sum_{j \in N_i} (\bar{x}_j(k) - \bar{x}_i(k)),
\bar{x}_i(k) = \Phi \hat{x}_i(k-1,k-1).
(7)

Kalman gain:

K_i(k) = P_i(k) (H_i^c)^T (H_i^c P_i(k) (H_i^c)^T + R_i^c)^{-1}.
(8)

Riccati equation:

P_i(k+1) = \Phi M_i(k) \Phi^T + \Gamma Q \Gamma^T,
(9)

M_i(k) = (I - K_i(k) H_i^c) P_i(k) (I - K_i(k) H_i^c)^T
        + K_i(k) R_i^c K_i^T(k).
(10)
Here, \gamma is a positive parameter determined by the user.
So, the Kalman consensus filter is suboptimal. In
(7), N_i denotes the set of neighbor nodes of the sensor
node i in the distributed sensor networks. At the
sensor node i, in estimating the state x(k), the
Kalman consensus filter uses the observations at the
sensor node i and the observations at the neighbor
nodes of the sensor node i, together with the one-step-ahead
prediction estimates \bar{x}_j(k) at the
neighbor nodes j \in N_i of the sensor node i. Taking
the Kalman consensus filtering algorithm into
consideration, we newly introduce the augmented
observation equation as follows.

\tilde{y}_i(k) = \tilde{H}_i x(k) + \tilde{v}_i(k),
\tilde{y}_i(k) = \begin{bmatrix} y_i(k) \\ \sum_{j \in N_i} \hat{z}_j(k,k) \end{bmatrix},
\tilde{H}_i = \begin{bmatrix} H_i \\ \sum_{j \in N_i} H_j \end{bmatrix},
\tilde{v}_i(k) = \begin{bmatrix} v_i(k) \\ -\Delta_i(k) \end{bmatrix}.
(11)

y_i(k) is the observed value at the sensor node i.
\sum_{j \in N_i} \hat{z}_j(k,k) denotes the sum of the filtering
estimates \hat{z}_j(k,k) of the signals z_j(k) = H_j x(k) at
the neighbor nodes j \in N_i of the sensor node i. In
this paper, the RLS Wiener filter calculates the
filtering estimates \hat{x}_j(k,k) of the state x(k), for the
neighbor nodes j, j \in N_i, of the sensor node i, with
the observed values y_j(\cdot) recursively. Since the
filtering estimate \hat{z}_j(k,k) = H_j \hat{x}_j(k,k) is calculated with the
information of the observed values y_j(\cdot), the RLS
Wiener consensus estimators in this paper do not
include the observed values from the neighbor
nodes in the observation equation (11). \Delta_i(k)
represents the sum of the filtering errors of the
signals at the neighbor nodes j \in N_i of the sensor
node i:

\Delta_i(k) = \sum_{j \in N_i} (z_j(k) - \hat{z}_j(k,k))
           = \sum_{j \in N_i} H_j (x(k) - \hat{x}_j(k,k)).
(12)
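A small sketch of how the augmented observation of (11) is formed at sensor node i from the neighbor-node filtering estimates is given below; the function and variable names are illustrative assumptions.

import numpy as np

def augmented_observation(y_i, H_i, z_hat_neighbors, H_neighbors):
    # sum of the neighbor-node filtering estimates of the signals, sum_{j in N_i} z_hat_j(k,k)
    z_sum = sum(z_hat_neighbors[1:], z_hat_neighbors[0])
    # sum of the neighbor observation matrices, sum_{j in N_i} H_j
    H_sum = sum(H_neighbors[1:], H_neighbors[0])
    y_tilde = np.vstack([y_i, z_sum])    # augmented observed value, cf. (11)
    H_tilde = np.vstack([H_i, H_sum])    # augmented observation matrix, cf. (11)
    return y_tilde, H_tilde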
It is seen that the processes z_{j_1}(k) - \hat{z}_{j_1}(k,k),
z_{j_2}(k) - \hat{z}_{j_2}(k,k), \ldots, z_{j_n}(k) - \hat{z}_{j_n}(k,k),
\{ j_1, \ldots, j_n \} = N_i, are mutually uncorrelated. Let the auto-covariance
function K(k,s) of the state x(k) have the semi-degenerate
kernel form

K(k,s) = E[x(k) x^T(s)]
       = \begin{cases} A(k) B^T(s), & 0 \le s \le k, \\ B(k) A^T(s), & 0 \le k \le s, \end{cases}
A(k) = \Phi^k,  B^T(s) = \Phi^{-s} K(s,s),
(13)

in wide-sense stationary stochastic systems [18].
The auto-covariance function R_{\Delta_i}(k) of \Delta_i(k) is given
by

R_{\Delta_i}(k) = E[\Delta_i(k) \Delta_i^T(k)]
              = \sum_{j \in N_i} H_j E[(x(k) - \hat{x}_j(k,k)) (x(k) - \hat{x}_j(k,k))^T] H_j^T
              = \sum_{j \in N_i} H_j (K(k,k) - S_j(k,k)) H_j^T,
S_j(k,k) = E[\hat{x}_j(k,k) \hat{x}_j^T(k,k)].
(14)
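The last equality in (14) uses the orthogonality of the filtering error to the filtering estimate; for completeness, the short justification (a standard least-squares argument) is

E[(x(k) - \hat{x}_j(k,k)) \hat{x}_j^T(k,k)] = 0,
E[(x(k) - \hat{x}_j(k,k)) (x(k) - \hat{x}_j(k,k))^T]
  = E[x(k) x^T(k)] - E[\hat{x}_j(k,k) \hat{x}_j^T(k,k)]
  = K(k,k) - S_j(k,k).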
Hence, the auto-covariance function of \tilde{v}_i(k) is
given by

E[\tilde{v}_i(k) (\tilde{v}_i(s))^T] = \tilde{R}_i(k) \delta_K(k-s),
\tilde{R}_i(k) = \mathrm{diag}(R_i, R_{\Delta_i}(k)).
(15)
Now, the consensus estimation problem is reduced
to estimating the state x(k) with the augmented
observations \tilde{y}_i(\cdot) of (11).
Let the fixed-point smoothing estimate \hat{x}_i(k,L)
of x(k), at the sensor node i, be expressed by

\hat{x}_i(k,L) = \sum_{m=1}^{L} h_i(k,m,L) \tilde{y}_i(m)
(16)

as a linear transformation of the observed values
\tilde{y}_i(m), 1 \le m \le L. In (16), h_i(k,m,L) is called the
impulse response function. We consider the fixed-point
smoothing problem, which minimizes the
mean-square value (MSV)

J = E[\| x(k) - \hat{x}_i(k,L) \|^2]
(17)

of the fixed-point smoothing error at the sensor node
i. From an orthogonal projection lemma, [18],

x(k) - \sum_{m=1}^{L} h_i(k,m,L) \tilde{y}_i(m) \perp \tilde{y}_i(s),
1 \le s \le L,
(18)

the impulse response function, at the sensor node i,
satisfies the Wiener-Hopf equation

E[x(k) \tilde{y}_i^T(s)] = \sum_{m=1}^{L} h_i(k,m,L) K_{\tilde{y}_i}(m,s),
K_{\tilde{y}_i}(m,s) = E[\tilde{y}_i(m) \tilde{y}_i^T(s)].
(19)

In (18), '\perp' denotes the notation of the
orthogonality. K_{\tilde{y}_i}(m,s) is the auto-covariance
function of the augmented observed value \tilde{y}_i(k), and
from (11), (13) and (15) it is written as
K_{\tilde{y}_i}(m,s) = \tilde{H}_i K(m,s) \tilde{H}_i^T + \tilde{R}_i(s) \delta_K(m-s).
Substituting (11), (13), and (15) into (19), we obtain
the equation for the optimal impulse response
function h_i(k,s,L) at the sensor node i.

h_i(k,s,L) \tilde{R}_i(s) = K(k,s) \tilde{H}_i^T
   - \sum_{m=1}^{L} h_i(k,m,L) \tilde{H}_i K(m,s) \tilde{H}_i^T
(20)

Starting with (20), the RLS Wiener estimation
algorithms are derived based on the invariant
imbedding method. Section 3 proposes the RLS
Wiener consensus filtering and fixed-point
smoothing algorithms.
3 RLS Wiener consensus filtering and
fixed-point smoothing algorithms
Starting with (20), which the optimal impulse
response function h_i(k,s,L) satisfies, based on the
preliminary formulations of the least-squares
consensus estimation problem, Theorem 1 presents
the RLS Wiener consensus filtering and fixed-point
smoothing algorithms.
Theorem 1 Let the state equation for the state x(k)
be given by (1). Let the observation equation at the
sensor node i with the consensus of the neighbor
nodes j \in N_i of the sensor node i be given by (11).
The auto-covariance function of the observation
noise \tilde{v}_i(k) is given by (15). Then the RLS Wiener
consensus filtering and fixed-point smoothing
algorithms consist of (21)-(29) in the linear discrete-time
wide-sense stationary stochastic system.

Fixed-point smoothing estimate of the signal z_i(k)
at the sensor node i: \hat{z}_i(k,L)

\hat{z}_i(k,L) = H_i \hat{x}_i(k,L)
(21)

Fixed-point smoothing estimate of the state x(k) at
the sensor node i: \hat{x}_i(k,L)

\hat{x}_i(k,L) = \hat{x}_i(k,L-1)
             + h_i(k,L) (\tilde{y}_i(L) - \tilde{H}_i \Phi \hat{x}_i(L-1,L-1))
(22)

Smoother gain at the sensor node i: h_i(k,L)

h_i(k,L) = (K(k,k) (\Phi^T)^{L-k} \tilde{H}_i^T - q_i(k,L-1) \Phi^T \tilde{H}_i^T)
           \times (\tilde{R}_i(L) + \tilde{H}_i (K(L,L) - \Phi S_i(L-1) \Phi^T) \tilde{H}_i^T)^{-1}
(23)

q_i(k,L) = q_i(k,L-1) \Phi^T + h_i(k,L) \tilde{H}_i (K(L,L) - \Phi S_i(L-1) \Phi^T),
q_i(k,k) = S_i(k)
(24)

Filter gain at the sensor node i: G_i(k)

G_i(k) = (K(k,k) \tilde{H}_i^T - \Phi S_i(k-1) \Phi^T \tilde{H}_i^T)
         \times (\tilde{R}_i(k) + \tilde{H}_i (K(k,k) - \Phi S_i(k-1) \Phi^T) \tilde{H}_i^T)^{-1}
(25)

Filtering estimate of the signal z_i(k) at the sensor
node i: \hat{z}_i(k,k)

\hat{z}_i(k,k) = H_i \hat{x}_i(k,k)
(26)

Filtering estimate of the state x(k) at the sensor
node i: \hat{x}_i(k,k)

\hat{x}_i(k,k) = \Phi \hat{x}_i(k-1,k-1)
             + G_i(k) (\tilde{y}_i(k) - \tilde{H}_i \Phi \hat{x}_i(k-1,k-1)),
\hat{x}_i(0,0) = 0
(27)

The variance of the filtering estimate \hat{x}_i(k,k) at the
sensor node i: S_i(k)

S_i(k) = \Phi S_i(k-1) \Phi^T + G_i(k) \tilde{H}_i (K(k,k) - \Phi S_i(k-1) \Phi^T),
S_i(0) = 0
(28)

Here, the variance \tilde{R}_i(k) of the observation noise \tilde{v}_i(k) is
given by

\tilde{R}_i(k) = \mathrm{diag}(R_i, R_{\Delta_i}(k)),
R_{\Delta_i}(k) = \sum_{j \in N_i} H_j (K(k,k) - S_j(k,k)) H_j^T,
S_j(k,k) = E[\hat{x}_j(k,k) \hat{x}_j^T(k,k)].
(29)
Proof of Theorem 1 is deferred to the Appendix.
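As a quick reference for how the recursions of Theorem 1 fit together, the following sketch implements one time step of the filtering equations (25), (27), (28) and one update of the fixed-point smoother (22)-(24), as reconstructed above; K(k,k) is taken as a constant matrix Kxx (wide-sense stationarity), and all function and variable names are illustrative assumptions.

import numpy as np

def consensus_filter_step(x_hat_prev, S_prev, y_tilde, H_tilde, R_tilde, Phi, Kxx):
    # D = K(k,k) - Phi S_i(k-1) Phi^T, the common term in (25) and (28)
    D = Kxx - Phi @ S_prev @ Phi.T
    innov_cov = R_tilde + H_tilde @ D @ H_tilde.T
    G = D @ H_tilde.T @ np.linalg.inv(innov_cov)         # filter gain (25)
    x_pred = Phi @ x_hat_prev
    x_hat = x_pred + G @ (y_tilde - H_tilde @ x_pred)    # filtering estimate (27)
    S = Phi @ S_prev @ Phi.T + G @ H_tilde @ D           # variance of the estimate (28)
    return x_hat, S

def fixed_point_smoother_step(x_kL_prev, q_prev, x_hat_prev, S_prev,
                              y_tilde, H_tilde, R_tilde, Phi, Kxx, L_minus_k):
    # q_prev is q_i(k, L-1); at L = k it is initialized with S_i(k)
    D = Kxx - Phi @ S_prev @ Phi.T
    innov_cov = R_tilde + H_tilde @ D @ H_tilde.T
    # smoother gain (23); (Phi^T)^(L-k) maps K(k,k) to the cross-covariance K(k,L)
    h = (Kxx @ np.linalg.matrix_power(Phi.T, L_minus_k) @ H_tilde.T
         - q_prev @ Phi.T @ H_tilde.T) @ np.linalg.inv(innov_cov)
    x_pred = Phi @ x_hat_prev
    x_kL = x_kL_prev + h @ (y_tilde - H_tilde @ x_pred)  # smoothing estimate (22)
    q = q_prev @ Phi.T + h @ H_tilde @ D                 # recursion (24)
    return x_kL, q

Starting from \hat{x}_i(0,0) = 0 and S_i(0) = 0, the filter step is run at every time; for a chosen fixed point k, the smoother step is then iterated over L = k+1, k+2, \ldots, with the smoothing estimate initialized to the filtering estimate \hat{x}_i(k,k) and q_i(k,k) = S_i(k).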
Section 4 proposes the algorithm for the RLS
Wiener consensus fixed-point smoothing error
variance function. Also, the asymptotic stability
condition of the RLS Wiener consensus filter and
the existence of the RLS Wiener consensus fixed-
point smoother are shown.
4 RLS Wiener consensus fixed-point
smoothing error variance function
The RLS Wiener consensus fixed-point smoothing
error variance function is defined by

\tilde{P}_i(k,L) = E[(x(k) - \hat{x}_i(k,L)) (x(k) - \hat{x}_i(k,L))^T].
(30)

From (22) and the relationship
q_i(k,L) = E[\hat{x}_i(k,L) \hat{x}_i^T(L,L)], (30) is developed as

\tilde{P}_i(k,L) = K(k,k) - E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)]
               = \tilde{P}_i(k,L-1)
                 - h_i(k,L) (\tilde{R}_i(L) + \tilde{H}_i (K(L,L) - \Phi S_i(L-1) \Phi^T) \tilde{H}_i^T) h_i^T(k,L),
\tilde{P}_i(k,k) = K(k,k) - S_i(k).
(31)

Here, h_i(k,L) is calculated by (23)-(25) and (28)
recursively. S_i(L) is calculated by (25) and (28)
recursively.
Also, the RLS Wiener consensus fixed-point
smoothing error variance function \tilde{P}_i(k,L) is
written as \tilde{P}_i(k,L) = K(k,k) - E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)].
E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)]
represents the variance of the fixed-point smoothing
estimate \hat{x}_i(k,L) at the sensor node i. \tilde{P}_i(k,L) and
E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)] are positive-semidefinite
matrices. From this fact, the variance of the fixed-point
smoothing estimate E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)] is
upper bounded by the variance K(k,k) of the state x(k) and
lower bounded by the zero matrix as

0 \le E[\hat{x}_i(k,L) \hat{x}_i^T(k,L)] \le K(k,k).

This shows the existence of the fixed-point
smoothing estimate \hat{x}_i(k,L).
The asymptotic stability of the filtering equation
(27) is assured by the condition that
(I - G_i(k) \tilde{H}_i) \Phi is a stable matrix. Namely, for the
stability of the filtering equation (27), all the
eigenvalues of (I - G_i(k) \tilde{H}_i) \Phi must lie within the
unit circle.
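A small numerical check of this stability condition can be sketched as follows; G is the filter gain (25) at the node under test, and the names are illustrative.

import numpy as np

def filter_is_asymptotically_stable(G, H_tilde, Phi, tol=1e-9):
    # all eigenvalues of (I - G H_tilde) Phi must lie inside the unit circle
    A = (np.eye(Phi.shape[0]) - G @ H_tilde) @ Phi
    return bool(np.all(np.abs(np.linalg.eigvals(A)) < 1.0 - tol))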
5 A numerical simulation example
Let us consider the state equation
󰇛󰇜󰇛󰇜󰇛󰇜
󰇣 
 󰇤
󰇛󰇜󰇛󰇜
󰇛󰇜



(32)
The auto-covariance function of the input noise
󰇛󰇜, with mean zero, is given by
󰇟󰇛󰇜󰇛󰇜󰇠󰇛󰇜
Fig.1 Directed graph of topological structure for the
distributed sensor networks with three sensor nodes.
Fig.1 illustrates the directed graph of the topological
structure for the distributed sensor networks with
three sensor nodes. Its adjacency matrix is given by
The observation equations at the sensor nodes are
given as follows.
Observation equation at the sensor node :
󰇛󰇜󰇛󰇜󰇛󰇜
󰇛󰇜󰇛󰇜
󰇟 󰇠
(33)
Observation equation at the sensor node :
󰇛󰇜󰇛󰇜󰇛󰇜
󰇛󰇜󰇛󰇜
󰇟 󰇠
(34)
Observation equation at the sensor node :
󰇛󰇜󰇛󰇜󰇛󰇜
󰇛󰇜󰇛󰇜
󰇟 󰇠
(35)
Here, the variances R_i of the observation noises
v_i(k), 1 \le i \le 3, are the same. Substituting \Phi, \tilde{H}_i,
K(k,k), \tilde{R}_i(k) and \tilde{y}_i(k) into the RLS Wiener
consensus fixed-point smoothing and filtering algorithms of
Theorem 1, we calculate the fixed-point smoothing
estimate \hat{z}_i(k,L) and the filtering estimate \hat{z}_i(k,k)
of the signal z_i(k), 1 \le i \le 3, recursively.
Fig.2 illustrates the signal z_i(k), the filtering
estimate \hat{z}_i(k,k) and the fixed-point smoothing
estimate \hat{z}_i(k,k+Lag) for the white Gaussian observation noise
at the sensor node under the consensus
with its neighbor node. Fig.3, Fig.4, and Fig.5 illustrate the mean-square
values of the filtering and fixed-point
smoothing errors z_i(k) - \hat{z}_i(k,k) and z_i(k) - \hat{z}_i(k,k+Lag)
of the signal z_i(k) at each of the three sensor nodes
vs. Lag, for the white Gaussian observation noises
v_1(k), v_2(k) and v_3(k), by the RLS
Wiener consensus estimators, under the consensus
of each sensor node with its neighbor node, and by
the RLS Wiener estimators. From Fig.3, Fig.4, and
Fig.5, it is seen that the estimation accuracies of the
RLS Wiener consensus filter and fixed-point
smoother are superior to those of the RLS Wiener
filter and fixed-point smoother, respectively, for each
observation noise. Here, the MSVs of the filtering
and fixed-point smoothing errors are calculated by
averaging the squared filtering errors z_i(k) - \hat{z}_i(k,k)
and the squared fixed-point smoothing errors
z_i(k) - \hat{z}_i(k,k+Lag) over the simulation interval,
for the RLS Wiener consensus estimators and the
RLS Wiener estimators.
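For reference, the MSV computation described above can be sketched as follows, assuming the signal, the filtering estimates and the fixed-point smoothing estimates for each Lag are stored as arrays over the simulation interval; the names are illustrative.

import numpy as np

def mean_square_values(z, z_filt, z_smooth_by_lag):
    # MSV of the filtering errors z_i(k) - z_hat_i(k,k)
    msv_filter = float(np.mean((z - z_filt) ** 2))
    # MSV of the fixed-point smoothing errors z_i(k) - z_hat_i(k, k+Lag), per Lag
    msv_smoother = {lag: float(np.mean((z[:len(zs)] - zs) ** 2))
                    for lag, zs in z_smooth_by_lag.items()}
    return msv_filter, msv_smoother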
Fig.2 Signal z_i(k), filtering estimate \hat{z}_i(k,k) and fixed-point smoothing estimate \hat{z}_i(k,k+Lag) for the
white Gaussian observation noise at the sensor node under consensus with its neighbor node.
Fig.3 Mean-square values of the filtering and fixed-point smoothing errors z_i(k) - \hat{z}_i(k,k) and
z_i(k) - \hat{z}_i(k,k+Lag) of the signal z_i(k) at the sensor node vs. Lag, for the white Gaussian observation noises
v_1(k), v_2(k) and v_3(k), by the RLS Wiener consensus estimators under the consensus of the
sensor node with its neighbor node and by the RLS Wiener estimators.
Fig.4 Mean-square values of the filtering and fixed-point smoothing errors z_i(k) - \hat{z}_i(k,k) and
z_i(k) - \hat{z}_i(k,k+Lag) of the signal z_i(k) at the sensor node vs. Lag, for the white Gaussian observation noises
v_1(k), v_2(k) and v_3(k), by the RLS Wiener consensus estimators under the consensus of the
sensor node with its neighbor node and by the RLS Wiener estimators.
Fig.5 Mean-square values of the filtering and fixed-point smoothing errors z_i(k) - \hat{z}_i(k,k) and
z_i(k) - \hat{z}_i(k,k+Lag) of the signal z_i(k) at the sensor node vs. Lag, for the white Gaussian observation noises
v_1(k), v_2(k) and v_3(k), by the RLS Wiener consensus estimators under the consensus of the
sensor node with its neighbor node and by the RLS Wiener estimators.
6 Conclusion
This paper has newly developed the RLS Wiener
consensus filter and fixed-point smoother in linear
discrete-time stochastic systems. The new point of
this paper is to incorporate the sum of the filtering
estimates of the signals at the neighbor nodes as an
observed value in the observation equation, as shown
in the augmented observation equation (11), and to
derive the RLS Wiener consensus estimators on that basis.
From the numerical simulation results in Section
5, the estimation accuracies of the RLS Wiener
consensus filter and the fixed-point smoother are
superior to those of the RLS Wiener filter and fixed-point
smoother, respectively, for each observation
noise.
A future task is to apply the robust RLS Wiener
filter to linear distributed sensor networks with
degraded observations generated by a state-space
model and an observation equation with uncertain
parameters.
Appendix A: Proof of Theorem 1
From (20), the impulse response function 󰇛󰇜
satisfies
󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜

(A-1)
Subtracting 󰇛󰇜󰆽󰇛󰇜 from
󰇛󰇜󰆽󰇛󰇜, we have
󰇛󰇛󰇜󰇛󰇜󰇜󰆽󰇛󰇜
󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
󰇛

 󰇛󰇜
󰇛󰇜󰇜󰆽󰇛󰇜󰇛󰆽󰇜
(A-2)
Introducing
󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
(A-3)
we obtain
󰇛󰇜󰇛󰇜
󰇛󰇜󰆽󰇛󰇜󰇛󰇜
(A-4)
Subtracting 󰇛󰇜󰆽󰇛󰇜 from 󰇛󰇜󰆽󰇛󰇜, we
have
󰇛󰇛󰇜󰇛󰇜󰇜󰆽󰇛󰇜
󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
󰇛
 󰇛󰇜
󰇛󰇜󰇜󰆽󰇛󰇜󰇛󰆽󰇜
(A-5)
From (A-3) and (A-5), we obtain
󰇛󰇜󰇛󰇜
󰇛󰇜󰆽󰇛󰇜󰇛󰇜
(A-6)
From (A-3), 󰇛󰇜 satisfies
󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
(A-7)
Introducing
󰇛󰇜󰇛󰇜󰆽󰇛󰇜

(A-8)
we obtain
󰇛󰇜󰆽󰇛󰇜
󰇛󰇜󰇛󰆽󰇜󰇛󰇜󰇛󰇜󰇛󰆽󰇜
(A-9)
Subtracting 󰇛󰇜 from 󰇛󰇜 and using (A-6),
we obtain
󰇛󰇜󰇛󰇜󰇛󰇜󰆽󰇛󰇜
󰇛󰇜󰇛󰇜

 󰆽󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇛󰇜󰇜
󰇛󰇜
(A-10)
Let us introduce the function
󰇛󰇜󰇛󰇜󰇛󰇜
(A-11)
From (A-10), we obtain
󰇛󰇜󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇜
(A-12)
Here,
󰇛󰇜󰇛󰇜
(A-13)
From (A-9), we have
󰇛󰇜󰇛󰇛󰇜󰇛󰆽󰇜
󰇛󰇜󰇛󰆽󰇜󰇜󰇛󰆽󰇛󰇜󰇜
(A-14)
From (A-12) and (A-14), we obtain (25).
From (20), 󰇛󰇜 satisfies
󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
󰇛󰇜󰇛󰇜󰇛󰆽󰇜
󰇛󰇜󰇛󰇜󰇛󰆽󰇜
(A-15)
where
󰇛󰇜
 󰇛󰇜󰆽󰇛󰇜
(A-16)
Subtracting 󰇛󰇜 from 󰇛󰇜 and using (A-
4) with (A-8), we have
󰇛󰇜󰇛󰇜
󰇛󰇜󰆽󰇛󰇜
󰇛

 󰇛󰇜
󰇛󰇜󰇜󰆽󰇛󰇜
󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇛󰇜󰇜
(A-17)
Introducing
󰇛󰇜󰇛󰇜󰇛󰇜
(A-18)
from (A-17) and (A-18), we obtain
󰇛󰇜󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇛󰇜󰇜󰇛󰇜
󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇜
(A-19)
In (A-1), putting , we have
󰇛󰇜󰆽󰇛󰇜󰇛󰇜󰇛󰆽󰇜
 󰇛󰇜󰆽󰇛󰇜󰇛󰆽󰇜
(A-20)
From (A-3), it is clear that
󰇛󰇜󰇛󰇜
(A-21)
From (A-18), we have
󰇛󰇜󰇛󰇜󰇛󰇜
(A-22)
Putting in (A-16), from (A-21), we have
󰇛󰇜
 󰇛󰇜󰆽󰇛󰇜
 󰇛󰇜󰆽󰇛󰇜
󰇛󰇜
(A-23)
From (A-11) and (A-22), we have
󰇛󰇜󰇛󰇜
(A-24)
From (A-15) and (A-18), we obtain
󰇛󰇜󰆽󰇛󰇜
󰇛󰇜󰇛󰇜󰇛󰆽󰇜
󰇛󰇜󰇛󰆽󰇜
(A-25)
Substituting (A-19) into (A-25), after some
manipulations, we obtain (23).
Now, from (16), the filtering estimate 󰇛󰇜 is
given by
󰇛󰇜
 󰇛󰇜󰆽󰇛󰇜
(A-26)
Let us introduce the function
󰇛󰇜
 󰇛󰇜󰆽󰇛󰇜
(A-27)
From (A-21), we get
󰇛󰇜󰇛󰇜
(A-28)
Subtracting the equation obtained by putting
in (A-27) from (A-27), we have
󰇛󰇜󰇛󰇜
󰇛󰇜󰆽󰇛󰇜
󰇛
 󰇛󰇜󰇛󰇜󰇜󰆽󰇛󰇜
󰇛󰇜󰆽󰇛󰇜
󰇛󰇜󰆽󰇛󰇜󰇛󰇜
(A-29)
Substituting (A-29) into (A-28), using (A-13), we
obtain (27).
The fixed-point smoothing estimate 󰇛󰇜 is given
by (16). Subtracting 󰇛󰇜 from 󰇛󰇜, and
using (A-4), we have
󰇛󰇜󰇛󰇜
󰇛󰇜󰆽󰇛󰇜
󰇛

 󰇛󰇜
󰇛󰇜󰇜󰆽󰇛󰇜
󰇛󰇜󰇛󰆽󰇛󰇜
󰆽󰇛󰇜󰇜
(A-30)
The initial condition of the fixed-point smoothing
estimate \hat{x}_i(k,L) at L = k is the filtering estimate
\hat{x}_i(k,k).
(Q.E.D.)
References:
[1] M. Alighanbari, J. P. How, An unbiased
Kalman consensus algorithm, 2006 American
Control Conference, 2006, pp. 3519-3524.
[2] A. T. Kamal, Information weighted consensus
for distributed estimation in vision networks,
UC Riverside, 2013.
https://escholarship.org/uc/item/0rz9v80g
[3] W. Yang, L. Shi, Y. Yuan, X. Wang, H. Shi,
Network design for distributed consensus
estimation over heterogeneous sensor networks,
IFAC Proceedings, Vol.47, No.3, 2014, pp.
5550-5555.
[4] J. Wu, A. Elser, S. Zeng, F. Allgower,
Consensus-based distributed Kalman-Bucy
filter for continuous-time systems, IFAC-
PapersOnLine, Vol.49, No.22, 2016, pp. 321-
326.
DOI: 10.1016/j.ifacol.2016.10.417
[5] S. Das, J. M. F. Moura, Consensus+innovations
distributed Kalman Filter with optimized gains,
IEEE Transactions on Signal Processing,
Vol.65, No.2, 2017, pp. 467-481.
DOI: 10.1109/TSP.2016.2617827
[6] R. Deshmukh, Development of Optimal
Kalman Consensus Filter and its Application to
Distributed Hybrid State Estimation, Theses
and Dissertations Available from ProQuest,
2017, pp. 1-54.
[7] H. Ji, F. L. Lewis, Z. Hou, D. Mikulski,
Distributed information-weighted Kalman
consensus filter for sensor networks,
Automatica, Vol.77, 2017, pp. 18-30.
DOI: 10.1016/j.automatica.2016.11.014
[8] S. Wang, H. Paul, A. Dekorsy, Distributed
optimal consensus-based Kalman filtering and
its relation to MAP estimation, 2018 IEEE
International Conference on Acoustics, Speech
and Signal Processing (ICASSP), 2018, pp.
3664-3668.
DOI: 10.1109/ICASSP.2018.8462418
[9] S. Battilotti, F. Cacace, M. d’Angelo, A.
Germani, Asymptotically optimal consensus-
based distributed filtering of continuous-time
linear systems, Automatica, Vol.122, 2020, pp.
1-7.
DOI: 10.1016/j.automatica.2020.109189
[10] D. W. Casbeer, R. Beard, Distributed
information filtering using consensus filters,
2009 American Control Conference, 2009, pp.
1882-1887.
[11] X. Li, H. Caimou, H. Haoji, Distributed filter
with consensus strategies for sensor networks,
Journal of Applied Mathematics, Article ID
683249, 2013, 9 pages.
[12] M. AminiOmam, F. Torkamani-Azar, S. A.
Ghorashi, Generalised Kalman-consensus filter,
IET Signal Processing, Vol.11, No.5, 2017, pp.
495-502.
[13] Q. Chen, W. Wang, C. Yin, X. Jin, J. Zhou,
Distributed cubature information filtering based
on weighted average consensus,
Neurocomputing, Vol. 243, 2017, pp. 115-124.
DOI: 10.1016/j.neucom.2017.03.004
[14] R. Olfati-Saber, Distributed Kalman filter with
embedded consensus filters, Proceedings of the
44th IEEE Conference on Decision and
Control, 2005, pp. 8179-8184.
DOI: 10.1109/CDC.2005.1583486
[15] R. Olfati-Saber, Distributed Kalman filtering
for sensor networks, 2007 46th IEEE
Conference on Decision and Control, 2007, pp.
5492-5498.
DOI: 10.1109/CDC.2007.4434303
[16] R. Olfati-Saber, Kalman-consensus filter:
Optimality, stability, and performance,
Proceedings of the 48th IEEE Conference on
Decision and Control (CDC) held jointly with
2009 28th Chinese Control Conference, 2009,
pp. 7036-7042.
DOI: 10.1109/CDC.2009.5399678
[17] K. Takaba, Distributed Kalman filter, Journal
of the Society of Instrument and Control
Engineers, Vol.56, No.12, 2017, pp. 937-942.
DOI: 10.11499/sicejl.56.937
[18] A.P. Sage, Estimation Theory with Applications
to Communications and Control, McGraw-Hill,
New York, 1971.
Sources of Funding for Research Presented in a
Scientific Article or Scientific Article Itself
No funding was received.
Contribution of Individual Authors to the
Creation of a Scientific Article (Ghostwriting
Policy)
The author contributed in the present research, at all
stages from the formulation of the problem to the
final findings and solution.
Conflict of Interest
The author has no conflict of interest to declare that
is relevant to the content of this article.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the
Creative Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en_US