6 Conclusions and Discussions
In this paper, we proposed two algorithms, the line-pulling linear regression (LPR) and the band-pulling linear regression (BPR), for creating a regression line from data generated by an underlying function with additive noise whose distribution is not necessarily normal with zero mean. These algorithms can place the regression line at the centre, top, or bottom of the data points through the choice of two pulling parameters: one setting reproduces the ordinary linear regression line, while other settings pull the resulting line below or above it. However, since the noise distribution in the data is unknown, the parameter values are chosen according to user requirements.
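As a minimal illustrative sketch of the pulling idea, a line can be fitted by iteratively reweighted least squares in which residuals below and above the current line receive different weights; the function pulled_regression and the weights w_below and w_above are hypothetical names introduced only for this illustration and are not the paper's exact LPR/BPR formulation or notation.

import numpy as np

def pulled_regression(x, y, w_below=1.0, w_above=1.0, n_iter=50):
    # Illustrative asymmetric-weight fit (not the exact LPR/BPR update rule):
    # points below the current line get weight w_below, points above get w_above.
    X = np.column_stack([np.ones_like(x, dtype=float), x])   # design matrix [1, x]
    beta = np.linalg.lstsq(X, y, rcond=None)[0]               # ordinary least-squares start
    for _ in range(n_iter):
        r = y - X @ beta                                      # signed residuals
        w = np.where(r < 0, w_below, w_above)                 # asymmetric weights
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))  # weighted LS step
    return beta                                               # [intercept, slope]

Under this scheme, equal weights reduce to ordinary linear regression, w_below > w_above pulls the fitted line toward the lower data points, and w_below < w_above pulls it toward the upper ones, mirroring the centre, bottom, and top placements described above.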
The numerical examples show that the LPR and BPR algorithms produce similar results; the noticeable difference is the number of iterations required, with the LPR algorithm converging faster than the BPR algorithm.
We applied these algorithms to smoothing the boundary of the pectoral muscle. The pulling parameters were chosen so that the regression line lies at the bottom of all data points, ensuring that only the muscle part is removed.
In addition, the LPR and BPR algorithms can be extended to more complicated models, such as quadratic or cubic polynomials in place of the linear equation, which is expected to widen their range of applications; a sketch of such an extension follows.
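A hedged sketch of this extension, reusing the same illustrative asymmetric-weight iteration with a polynomial design matrix (again, the names and the weighting scheme are assumptions for illustration, not the paper's formulation):

import numpy as np

def pulled_poly_regression(x, y, degree=2, w_below=1.0, w_above=1.0, n_iter=50):
    # Same illustrative pulling iteration as above, with polynomial features.
    X = np.vander(x, degree + 1, increasing=True)    # columns [1, x, x^2, ..., x^degree]
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r < 0, w_below, w_above)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta    # polynomial coefficients in increasing order of degree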
Acknowledgement:
The first author would like to express gratitude to the Science Achievement Scholarship of Thailand (SAST) for financial support of this work. This research was supported by the Department of Mathematics, Faculty of Science, Khon Kaen University, Fiscal Year 2022.