Improved UFIR Filter for Fusing Recent INS-assisted Visual
Measurement under Colored Measurement Noise in UAV
Landing
Yide Zhang¹, Teng Li¹, Xin Zang¹, Jingwen Yu¹, Yuan Xu¹·*, Yuriy S. Shmaliy²
¹School of Electrical Engineering, University of Jinan, Jinan, CHINA
²Department of Electronics Engineering, Universidad de Guanajuato, Salamanca, MEXICO
*Corresponding Author
Abstract: In this paper, we discuss the landing process of unmanned aerial vehicles (UAVs) employing an inertial navigation system (INS) and visual measurement. Within the integrated scheme, an improved unbiased finite impulse response (UFIR) filter is developed for fusing recent INS-assisted visual measurements under colored measurement noise (CMN). The UFIR filter developed for CMN, called the cUFIR filter, is proposed first, and then a hybrid UFIR/cUFIR filter is developed in which both filters work in parallel. The Mahalanobis distance is used to select the better result as the final output of the filter. It is shown experimentally that the proposed method enhances the accuracy and reliability of data fusion, thereby improving the overall performance of UAV autonomous landing systems.
Key-Words: Autonomous landing, Data fusion, UFIR filtering, Mahalanobis distance.
Received: March 13, 2023. Revised: March 15, 2024. Accepted: May 17, 2024. Published: June 6, 2024.
1 Introduction
Autonomous landing is a critical task for unmanned aerial vehicles (UAVs), [1], [2], [3], in various applications such as surveillance, [4], reconnaissance, [5], [6], and delivery, [7]. To achieve safe and precise landings, UAVs rely on a combination of sensors and algorithms to perceive their environment and make accurate real-time decisions, [8]. Many approaches have been proposed for localizing UAVs. For example, [9] reports on the absolute navigation capability of a landmark-based inertial measurement unit (IMU)/vision navigation system (IMU/VNS) for UAVs. An inertial navigation system (INS)-based integrated UAV localization scheme has been proposed in [10]. In recent years, the use of AprilTags, also known as fiducial markers, has gained popularity in robotics and computer vision applications. These tags consist of unique visual patterns that can be easily detected and recognized by cameras, enabling precise localization and tracking, [11]. Integrating AprilTag detection with IMU data during the autonomous landing process presents an opportunity to improve the accuracy and reliability of UAV navigation systems, [12].
Fusion of navigation information is a critical task in localization, [13], [14], [15]. One of the most accurate and robust solutions here is the unbiased finite impulse response (UFIR) filter, [16], which has found wide application in a broad range of tracking problems, [17], [18]. In [19], the UFIR filter was extended to colored measurement noise (CMN) and analysed in detail in [16]. This opened new horizons for robust localization of moving objects in harsh environments, specifically for localization under harsh disturbances in the video camera bounding box, [16].
In this paper, we discuss the landing process of UAVs employing INS and visual measurement. Based on the integrated scheme, we improve the UFIR filter for fusion of recent INS-assisted visual measurements under CMN. The improved UFIR filter, called the cUFIR filter, is proposed. Then the UFIR and cUFIR filters are united in a hybrid scheme, in which the Mahalanobis distance is used to select the better result at the output. The experimental results demonstrate the better performance of the proposed hybrid scheme.
2 INS-Assisted Visual UAV Landing System
In this section, we develop the INS-assisted visual landing system. The structure of the INS-assisted visual UAV landing system is shown in Fig. 1, where the visual sensor and the INS sensors are mounted on the UAV. The video camera is used to measure the position $Lo_k^V$ and the attitude $\theta_k^V$ at the time index $k$. Meanwhile, the INS is used to measure the acceleration $a_k$ and the angular rate $\omega_k$ at the time index $k$. The $Lo_k^V$, $\theta_k^V$, $a_k$, and $\omega_k$ are inputs of the UFIR/cUFIR filter, which is the main filter in this structure; we will consider it in detail later. The output of the UFIR/cUFIR filter is the UAV's position $Lo_k$.
Figure 1: Structure of the INS-assisted visual UAV
landing system.
The geometric relationships between the different sensors involved in the system are illustrated in Fig. 2, where $(p_i^c, q_i^c)$ denotes the transformation from the camera coordinate system to the IMU coordinate system and represents the positional and orientational relationship between the camera and the IMU. In this system, the IMU is embedded within the camera, so this relationship remains unchanged throughout motion. Further, $(p_w^i, q_w^i)$ signifies the IMU's positional and orientational information in the world frame, and the camera's position and orientation in the global coordinate system are expressed by $(p_w^c, q_w^c)$.
Figure 2: Visualization of the different coordinate frames in the setup.
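To make these frame relationships concrete, the following minimal sketch composes the camera-to-IMU transformation $(p_i^c, q_i^c)$ with the IMU world pose $(p_w^i, q_w^i)$ to obtain the camera pose $(p_w^c, q_w^c)$ in the world frame. The helper names and the scalar-first quaternion convention are illustrative assumptions, not taken from the original system.

import numpy as np

def quat_to_rot(q):
    # Rotation matrix of a unit quaternion q = [w, x, y, z].
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def quat_mul(q1, q2):
    # Hamilton product q1 * q2 (composition of rotations).
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def camera_pose_in_world(p_ci, q_ci, p_wi, q_wi):
    # Compose the fixed camera-IMU extrinsics with the IMU world pose.
    p_wc = p_wi + quat_to_rot(q_wi) @ p_ci   # rotate, then translate
    q_wc = quat_mul(q_wi, q_ci)              # compose orientations
    return p_wc, q_wc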
3 Adaptive UFIR/cUFIR Filter
In this section, the adaptive UFIR/cUFIR filter will be derived. First, the data fusion model will be proposed. Then, the cUFIR filter will be developed based on the data fusion model. Finally, the adaptive UFIR/cUFIR filter will be presented.
3.1 Data Fusion Model for
INS-assisted Visual UAV Landing
We use the 9-dimensional state vector
$$x_k = [\, Lo_k \;\; Vel_k \;\; \theta_k \,]^T, \tag{1}$$
which includes the 3-dimensional position, velocity, and attitude, and where $Lo_k$ represents the position at the time index $k$, $Vel_k$ denotes the velocity of the UAV, and $\theta_k$ is the attitude, which includes 3 Euler angles. The state equation adopted in this work is as follows:
$$x_k = S_k x_{k-1} + \varpi_k, \tag{2}$$
where the following matrices are used:
$$S_k = \begin{bmatrix} I_{3\times3} & \Delta t\, I_{3\times3} & A \\ 0_{3\times3} & I_{3\times3} & B \\ 0_{3\times3} & 0_{3\times3} & C \end{bmatrix}, \tag{3}$$
$$A = -T^T(q_k)\,\lfloor a_k \rfloor \left( \frac{t^2}{2!} I_{3\times3} - \frac{t^3}{3!}\lfloor \omega_k \rfloor + \frac{t^4}{4!}\lfloor \omega_k \rfloor^2 \right), \tag{4}$$
$$B = -T^T(q_k)\,\lfloor a_k \rfloor \left( t\, I_{3\times3} - \frac{t^2}{2!}\lfloor \omega_k \rfloor + \frac{t^3}{3!}\lfloor \omega_k \rfloor^2 \right), \tag{5}$$
$$C = I_{3\times3} - t\,\lfloor \omega_k \rfloor + \frac{t^2}{2!}\lfloor \omega_k \rfloor^2, \tag{6}$$
where $S_k$ denotes the state transition matrix, $\varpi_k \sim \mathcal{N}(0, Q_k)$ is the system noise, the matrices $\lfloor \omega_k \rfloor$ and $\lfloor a_k \rfloor$ are the skew-symmetric matrices corresponding to $\omega_k$ and $a_k$, respectively, and $T^T(q_k)$ is the rotation matrix corresponding to the quaternion $q_k$. The skew-symmetric matrices are given by
$$\lfloor a_k \rfloor = \begin{bmatrix} 0 & -a_{z,k} & a_{y,k} \\ a_{z,k} & 0 & -a_{x,k} \\ -a_{y,k} & a_{x,k} & 0 \end{bmatrix}, \tag{7}$$
$$\lfloor \omega_k \rfloor = \begin{bmatrix} 0 & -\omega_{z,k} & \omega_{y,k} \\ \omega_{z,k} & 0 & -\omega_{x,k} \\ -\omega_{y,k} & \omega_{x,k} & 0 \end{bmatrix}, \tag{8}$$
where $(a_{x,k}, a_{y,k}, a_{z,k})$ is the acceleration in the body frame (b-frame) and $(\omega_{x,k}, \omega_{y,k}, \omega_{z,k})$ is the angular velocity in the b-frame. Thus, the observation equation of the data fusion model used in this work can be written as follows:
$$y_k = H x_k + \upsilon_k, \tag{9}$$
where $y_k = [\, \hat{Lo}_k \;\; \hat{\theta}_k \,]^T$ represents the observation vector, $\hat{Lo}_k$ and $\hat{\theta}_k$ are measured by the camera directly,
$$H = \begin{bmatrix} I_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} \end{bmatrix}$$
serves as the observation matrix, and $\upsilon_k \sim \mathcal{N}(0, R_k)$ is the measurement noise.
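For illustration, a minimal NumPy sketch of assembling the model matrices (3)-(9) is given below; it assumes that $\Delta t$ and $t$ denote the same sampling interval and follows the sign pattern of the reconstruction above, so it is a sketch of the model rather than the authors' implementation.

import numpy as np

def skew(v):
    # Skew-symmetric matrix of a 3-vector, as in (7) and (8).
    x, y, z = v
    return np.array([[0.0, -z, y],
                     [z, 0.0, -x],
                     [-y, x, 0.0]])

def transition_matrix(a_k, w_k, T_qk, dt):
    # State transition matrix S_k of (3) with the blocks A, B, C of (4)-(6).
    Ak, Wk = skew(a_k), skew(w_k)
    W2, I3 = Wk @ Wk, np.eye(3)
    A = -T_qk.T @ Ak @ (dt**2/2*I3 - dt**3/6*Wk + dt**4/24*W2)
    B = -T_qk.T @ Ak @ (dt*I3 - dt**2/2*Wk + dt**3/6*W2)
    C = I3 - dt*Wk + dt**2/2*W2
    Z = np.zeros((3, 3))
    return np.block([[I3, dt*I3, A], [Z, I3, B], [Z, Z, C]])

# Observation matrix H of (9): position and attitude are measured directly.
H = np.block([[np.eye(3), np.zeros((3, 3)), np.zeros((3, 3))],
              [np.zeros((3, 3)), np.zeros((3, 3)), np.eye(3)]])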
Note that the variable $\upsilon_k$ represents the measurement noise at time $k$. In the current mainstream methods, it is common to assume that the measurement noise is Gaussian white noise. However, ensuring the persistently white nature of the noise in practical applications poses challenges, [20].
To address this issue, the Gauss-Markov model is proposed for $\upsilon_k$ as follows:
$$\xi_k = \Theta_k \xi_{k-1} + v_k, \tag{10}$$
where $\xi_k$ is the CMN, $\Theta_k$ is the coloredness coefficient, and $v_k$ is white Gaussian driving noise with known covariance. To transform the model $y_k$ with CMN to another one with white Gaussian noise, we use Bryson's measurement differencing, [21], [22], and write the new observation $m_k$ as
$$m_k = y_k - \Theta_k y_{k-1} = O_k x_k + \bar{\upsilon}_k, \tag{11}$$
where $O_k = H_k - \Pi_k$, $\Pi_k = \Theta_k H_{k-1} S_k^{-1}$, and $\bar{\upsilon}_k = \Pi_k \varpi_k + v_k$.
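As a minimal illustration of the differencing step (10)-(11), assuming a time-invariant observation matrix $H$ (the function name is illustrative, not the authors' code):

import numpy as np

def difference_measurement(y_k, y_prev, Theta, H, S_k):
    # Bryson's measurement differencing (11): removes the CMN memory.
    Pi = Theta @ H @ np.linalg.inv(S_k)   # Pi_k = Theta_k H_{k-1} S_k^{-1}
    O = H - Pi                            # O_k = H_k - Pi_k
    m = y_k - Theta @ y_prev              # m_k = y_k - Theta_k y_{k-1}
    return m, O

The UFIR/cUFIR filtering algorithms will be developed next.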
3.2 Adaptive UFIR/cUFIR Filter
Using the above-discussed state-space model, the UFIR/cUFIR filtering algorithm can be developed as follows. First, we list the standard UFIR filtering algorithm, represented with the pseudo code as Algorithm 1.
Algorithm 1: Standard UFIR Filtering Algorithm
Data: $y_k$, $\hat{x}_0^U$
Result: $\hat{x}_k^U$
1: begin
2:   for $k = L-1 : \infty$ do
3:     $l_1 = k - L + D^U$;
4:     $G_{l_1}^U = I$;
5:     $\tilde{x}_{l_1}^U = y_{l_1}$ if $l_1 < L^U - 1$, else $\tilde{x}_{l_1}^U = \hat{x}_{l_1}^U$;
6:     for $j = l_1 + 1 : k$ do
7:       $\tilde{x}_j^{U-} = S_j \tilde{x}_{j-1}^U$;
8:       $G_j^U = [H^T H + (S G_{j-1}^U S^T)^{-1}]^{-1}$;
9:       $F_j^U = G_j^U H^T$;
10:      $\tilde{x}_j^U = \tilde{x}_j^{U-} + F_j^U (y_j - H \tilde{x}_j^{U-})$;
11:    end for
12:    $\hat{x}_k^U = \tilde{x}_k^U$;
13:  end for
14: end
Here, $D^U$ is the size of the filter (the averaging horizon).
Now we recall that the Kalman filter relies on white Gaussian noise in the system and in the measurement. In practical applications, CMN can affect its accuracy significantly, and therefore the UFIR approach is more preferable. A pseudo code of the proposed cUFIR filter operating in the presence of CMN is listed as Algorithm 2.

Figure 3: Structure of the adaptive UFIR/cUFIR filter.
Algorithm 2: cUFIR Filtering Algorithm
Data: $y_k$, $\hat{x}_0^{cU}$
Result: $\hat{x}_k^{cU}$
1: begin
2:   for $k = L-1 : \infty$ do
3:     $l_1 = k - L + D^U$;
4:     $G_{l_1}^{cU} = I$;
5:     $\tilde{x}_{l_1}^{cU} = y_{l_1}$ if $l_1 < L^U - 1$, else $\tilde{x}_{l_1}^{cU} = \hat{x}_{l_1}^{cU}$;
6:     for $j = l_1 + 1 : k$ do
7:       $m_j = y_j - \Theta_j y_{j-1}$;
8:       $\tilde{x}_j^{cU-} = S_j \tilde{x}_{j-1}^{cU}$;
9:       $G_j^{cU} = [O^T O + (S G_{j-1}^{cU} S^T)^{-1}]^{-1}$;
10:      $F_j^{cU} = G_j^{cU} O^T$;
11:      $\tilde{x}_j^{cU} = \tilde{x}_j^{cU-} + F_j^{cU} (m_j - O \tilde{x}_j^{cU-})$;
12:    end for
13:    $\hat{x}_k^{cU} = \tilde{x}_k^{cU}$;
14:  end for
15: end
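Under the differenced model (11), Algorithm 2 reuses the same recursion with $O$ and $m_j$ in place of $H$ and $y_j$. A hypothetical usage built on the two sketches above, where Ys is a list of raw measurements and x0, Theta, H, S are assumed to be defined:

import numpy as np

# Hypothetical glue: run the UFIR recursion on differenced measurements.
M, O = [], None
for j in range(1, len(Ys)):
    m_j, O = difference_measurement(Ys[j], Ys[j - 1], Theta, H, S)
    M.append(m_j)
x_cu, G_cu = ufir_update(np.array(M), x0, S, O)   # cUFIR estimate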
The adaptive UFIR/cUFIR lter is developed
for the structure shown in Fig. 3 as follows. We
employ the Mahalanobis distance to verify the per-
formances of the UFIR lter and the cUFIR lter
by using the following equations:
dU
k= (ykxU
k)TRk(ykxU
k),(12)
dcU
k= (mkxcU
k)TRk(mkxcU
k),(13)
where $d_k^U$ and $d_k^{cU}$ are the Mahalanobis distances of the UFIR and cUFIR estimates, respectively. Then we use the following condition: if $d_k^U < d_k^{cU}$,
Figure 4: Structure of the measurement test
equipment.
then $\hat{x}_k^U$ goes to the output; otherwise, $\hat{x}_k^{cU}$ goes to the output. A pseudo code of the adaptive hybrid UFIR/cUFIR filter is listed as Algorithm 3.
Algorithm 3: Adaptive Hybrid UFIR/cUFIR Filtering Algorithm
Data: $y_k$, $\hat{x}_0^U$, $\hat{x}_0^{cU}$
Result: $\hat{x}_k$
1: begin
2:   for $k = L-1 : \infty$ do
3:     Get $\hat{x}_k^U$ by using Algorithm 1;
4:     $d_k^U = (y_k - H\hat{x}_k^U)^T R_k^{-1} (y_k - H\hat{x}_k^U)$;
5:     Get $\hat{x}_k^{cU}$ by using Algorithm 2;
6:     $d_k^{cU} = (m_k - O_k\hat{x}_k^{cU})^T R_k^{-1} (m_k - O_k\hat{x}_k^{cU})$;
7:     if $d_k^U < d_k^{cU}$ then
8:       $\hat{x}_k = \hat{x}_k^U$;
9:     else
10:      $\hat{x}_k = \hat{x}_k^{cU}$;
11:    end if
12:  end for
13: end
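For illustration, the Mahalanobis gate of Algorithm 3 can be sketched as follows; using a common covariance $R$ for (12) and (13) is a simplifying assumption of this sketch.

import numpy as np

def select_estimate(x_u, x_cu, y_k, m_k, H, O, R):
    # Mahalanobis gate of Algorithm 3: the smaller distance wins.
    r_u, r_cu = y_k - H @ x_u, m_k - O @ x_cu
    Rinv = np.linalg.inv(R)
    d_u = r_u @ Rinv @ r_u        # d_k^U of (12)
    d_cu = r_cu @ Rinv @ r_cu     # d_k^cU of (13)
    return x_u if d_u < d_cu else x_cu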
4 Experimental Results
In this section, we test the lters developed by real
experimental data. First, we tune the lters and
then analyse ltering results.
4.1 Hardware Setting
The structure of the experimental test equipment used in this work is shown in Fig. 4. In the testing, a visual camera is used to measure the UAV's attitude and the distance between the UAV and the tag.
Figure 5: The UAV used in this work.
Meanwhile, the INS is used to measure $a_k$ and $\omega_k$. The height of the UAV is considered in this work; thus, we employ an ultrasonic sensor to measure the distance between the UAV and the floor, which is taken as the reference value.
In our experiments, we utilize the Z410 UAV as the data acquisition platform. The UAV is equipped with the Pixhawk 2.4.8 flight controller and the M8N GPS module. Additionally, a Raspberry Pi 3B+ onboard computer is used, which enables external control of the UAV through software tools such as DroneKit-Python, ROS, and OpenCV. An Intel T265 stereo camera is incorporated into the system. Developed by Intel, the T265 camera features two fisheye lenses, each with an approximate field of view of 170 degrees. The T265 camera is equipped with an integrated inertial measurement unit (IMU) utilizing Bosch's BMI055 sensor. The UAV used in this work is shown in Fig. 5.
4.2 Performance Evaluation
Fig. 6 displays the heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter, along with the reference value in the test. In this figure, the reference value is denoted by the green line, the camera solution is denoted by the blue line, the orange line denotes the UFIR filtering solution, the cUFIR filtering solution is denoted by the purple line, and the proposed UFIR/cUFIR filter is denoted by the red line. From this figure, we see that the heights provided by the UFIR filter, cUFIR filter, and UFIR/cUFIR filter have dead zones and that their outputs remain close to the reference value. Both the UFIR and the cUFIR filtering solutions lie closer to the reference value, and the height estimated by the proposed UFIR/cUFIR filter falls in between.
Figure 6: The heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter, along with the reference value.
The root mean square errors (RMSEs) of the heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter are shown in Fig. 7. In this work, we compute the RMSEs by the following equation:
$$Lo_k^{RMSE} = \sqrt{\frac{1}{k} \sum_{i=1}^{k} \left( z_i - z_i^r \right)^2}, \tag{14}$$
where $z_i$ is the measured height and $z_i^r$ is the reference value of the height, which is measured by the ultrasonic sensor.
From Fig. 7, we can easily see that the camera's solution accumulates errors. The UFIR and cUFIR filters perform better than the camera's solution, and the proposed UFIR/cUFIR filter has the smallest error, showing the best performance. The cumulative distribution function (CDF) of the heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter is shown in Fig. 8. From the figure, we can see that the hybrid filter gives the smallest error at the CDF level of 0.9. The height RMSEs produced by the UFIR filter, cUFIR filter, and UFIR/cUFIR filter are listed in Table 1. From Table 1, we see that the proposed filter gives an error of 0.010 m, which is better than those of the UFIR and cUFIR filters by about 28.57% and 16.67%, respectively.
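The quoted improvements follow directly from Table 1 and can be checked with a few lines:

# Relative improvement of the hybrid filter over UFIR and cUFIR (Table 1).
rmse = {"UFIR": 0.014, "cUFIR": 0.012, "UFIR/cUFIR": 0.010}
for name in ("UFIR", "cUFIR"):
    gain = 100 * (rmse[name] - rmse["UFIR/cUFIR"]) / rmse[name]
    print(f"{name}: {gain:.2f}% improvement")  # 28.57% and 16.67%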
5 Conclusion
In this study, we proposed a new scheme for INS-assisted visual localization for the autonomous landing of UAVs. Based on the data fusion model, the hybrid UFIR/cUFIR filter has been developed.
Figure 7: The RMSEs of the heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter.
Figure 8: The CDF of the heights measured by the camera, UFIR filter, cUFIR filter, and UFIR/cUFIR filter.
Table 1: Height RMSEs produced by the UFIR filter, cUFIR filter, and UFIR/cUFIR filter.

Filter        RMSE (m)
UFIR          0.014
cUFIR         0.012
UFIR/cUFIR    0.010
In the proposed structure, the conventional UFIR and cUFIR filters run simultaneously in the CMN environment, and the best result of the dynamically selected filter, chosen using the Mahalanobis distance, goes to the output. The test results demonstrate that the proposed UFIR/cUFIR filtering algorithm performs better than the UFIR and cUFIR filters, which results in the highest positioning accuracy.
References:
[1] A. R. Jha, Theory, Design, and Applications of Unmanned Aerial Vehicles. Boca Raton, FL: CRC Press, 2017.
[2] L. Setlak and R. Kowalik, “The dynamics of group flights of an unmanned aerial vehicle,” WSEAS Trans. Appl. Theor. Mech., vol. 14, pp. 129–139, 2019.
[3] X. Gao, Z. Chen, and Y. Hu, “Analysis of unmanned aerial vehicle MIMO channel capacity based on aircraft attitude,” WSEAS Trans. Inform. Sci. Appl., vol. 10, no. 2, pp. 58–67, 2013.
[4] P. Wilson, Surviving with Navigation & Signaling. Broomall, PA: Simon & Schuster, 2015.
[5] R. Perry, A History of Satellite Reconnaissance. Chantilly, VA: CSNR, 2015.
[6] B. C. Williams and G. C. Baker, “An electromagnetic induction technique for reconnaissance surveys of soil salinity hazards,” Australian Journal of Soil Research, vol. 20, no. 2.
[7] X. Dong, Y. Gao, J. Guo, S. Zuo, J. Xiang, D. Li, and Z. Tu, “An integrated UWB-IMU-vision framework for autonomous approaching and landing of UAVs,” Aerospace, vol. 9, no. 12, pp. 6336–6350, 2022.
[8] Y. Xu, D. Wan, S. Bi, H. Guo, and Y. Zhuang, “A FIR filter assisted with the predictive model and ELM integrated for UWB-based quadrotor aircraft localization,” Satellite Navigation, vol. 4, p. 2, 2023.
[9] C. Z. L. Huang, J. Song and G. Cai, “Observable modes and absolute navigation capability for landmark-based IMU/vision navigation system of UAV,” Optik, vol. 202, p. 163725, 2020.
[10] Y. Xu, D. Wan, Y. S. Shmaliy, X. Chen, T. Shen, and S. Bi, “Dual free-size LS-SVM assisted maximum correntropy Kalman filtering for seamless INS-based integrated drone localization,” IEEE Trans. Ind. Electron., 2023.
[11] X. Liang, G. Chen, S. Zhao, and Y. Yu, “Moving target tracking method for unmanned aerial vehicle/unmanned ground vehicle heterogeneous system based on AprilTags,” Meas. Control, vol. 53, no. 3-4, pp. 427–440, 2020.
[12] C. Huang and G. Cai, “Design and performance analysis of landmark-based INS/vision navigation system for UAV,” Optik, vol. 172, pp. 484–493, 2018.
[13] J.-P. Condomines, Nonlinear Kalman Filtering for Multi-Sensor Navigation of Unmanned Aerial Vehicles. London: Elsevier, 2018.
[14] M. H. Sadraey, Design of Unmanned Aerial Systems. Hoboken, NJ: Wiley, 2020.
[15] F. L. Lewis, L. Xie, and D. Popa, Optimal and Robust Estimation. Boca Raton, FL: CRC Press, 2008.
[16] Y. S. Shmaliy and S. Zhao, Optimal and Robust State Estimation: Finite Impulse Response (FIR) and Kalman Approaches. New York: Wiley & Sons, 2022.
[17] Y. Xu, Y. S. Shmaliy, Y. Li, and X. Chen, “UWB-based indoor human localization with time-delayed data using EFIR filtering,” IEEE Access, vol. 5, pp. 16676–16683, 2017.
[18] Y. Xu, Y. S. Shmaliy, C. K. Ahn, T. Shen, H. Guo, and Y. Zhuang, “Blind robust multi-horizon EFIR filter for tightly integrating INS and UWB,” IEEE Sensors J., vol. 21, no. 20, pp. 23037–23045, 2021.
[19] Y. S. Shmaliy, S. Zhao, and C. K. Ahn, “Kalman and UFIR state estimation with coloured measurement noise using backward Euler method,” IET Signal Process., vol. 14, no. 2, pp. 64–71, 2020.
[20] M.-C. O. S. Popescu, O. V. Olaru, and N. E. Mastorakis, “Processing data for colored noise using a dynamic state estimator,” Int. J. Comp. Commun., vol. 2, no. 3, pp. 77–86, 2008.
[21] A. E. Bryson and L. J. Henrikson, “Estimation using sampled-data containing sequentially correlated noise, Technical Report No. 533,” Tech. Rep. NR-372-912, Grant NGR 22-007-068, National Aeronautics and Space Administration, June 1967.
[22] A. E. Bryson and L. J. Henrikson, “Estimation using sampled data containing sequentially correlated noise,” J. Spacecraft and Rockets, vol. 5, no. 6, pp. 662–665, 1968.
Contribution of Individual Authors to the
Creation of a Scientific Article (Ghostwriting
Policy)
The authors contributed equally to the present research, at all stages from the formulation of the problem to the final findings and solution.
Sources of Funding for Research Presented in a
Scientific Article or Scientific Article Itself
No funding was received for conducting this study.
Conflict of Interest
The authors have no conflicts of interest to declare
that are relevant to the content of this article.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the
Creative Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en_US