We will assume that the interval of time between any two consecutive realizations of the process is negligibly small compared to the observed interval of time t∞, and that the amplitude values of the process in that interval are mutually independent. Likewise, the entropy of the system increases logarithmically towards infinity, reaches infinity, and then theoretically drops to zero at infinity, as symbolically presented by the thick dashed line in Fig. 1.
Inside the system, four basic stochastic processes can be generated: continuous-time and discrete-time processes with continuous values, and continuous-time and discrete-time processes with discrete values. The random variables defining these processes are described by time-dependent continuous and discrete uniform probability density functions, respectively. A detailed analysis of these pdfs and of the related information and entropy functions will be presented in the following sections. A sketch of how the four processes can be generated is given below.
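As a minimal illustrative sketch (not part of the original text), the four basic processes can be generated from uniform distributions. NumPy is assumed here; continuous time is approximated by a dense grid, and the bound Xc, the number of discrete levels M, and all grid sizes are arbitrary choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

T, dt = 10.0, 0.01                    # observation window; fine step approximating continuous time
t_fine = np.arange(0.0, T, dt)        # dense grid standing in for continuous time
t_disc = np.arange(0.0, T, 1.0)       # discrete time instants

Xc, M = 1.0, 8                        # illustrative amplitude bound and number of discrete levels

# Continuous-valued processes: i.i.d. samples from the uniform pdf on [-Xc, Xc].
x_ct_cv = rng.uniform(-Xc, Xc, size=t_fine.size)   # continuous-time, continuous-valued
x_dt_cv = rng.uniform(-Xc, Xc, size=t_disc.size)   # discrete-time, continuous-valued

# Discrete-valued processes: i.i.d. samples from a uniform pmf over M levels.
levels = np.linspace(-Xc, Xc, M)
x_ct_dv = rng.choice(levels, size=t_fine.size)     # continuous-time, discrete-valued
x_dt_dv = rng.choice(levels, size=t_disc.size)     # discrete-time, discrete-valued
```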
2 Time-dependent uniform pdf of a continuous random variable

2.1 Definition of a time-dependent probability density function
Due to the importance of the probability density function (pdf) for our analysis, we will start with its precise definition, which will be used consistently in presenting our theory. The uniform pdf of a continuous random variable X is defined as
$$
f_X(x)=\begin{cases}\dfrac{1}{2X_c}, & -X_c \le x \le X_c\\[4pt] 0, & \text{otherwise}\end{cases}
=\begin{cases}\dfrac{1}{2\sqrt{3}\,\sigma_c}, & -X_c \le x \le X_c\\[4pt] 0, & \text{otherwise}\end{cases}
=\begin{cases}\dfrac{1}{2\sqrt{3}\,\sigma\sqrt{t}}, & -X_c \le x \le X_c\\[4pt] 0, & \text{otherwise}\end{cases}
\qquad (1)
$$
for the positive values 0 ≤ Xc ≤ ∞. Furthermore, we will express the variance as a linear function of time, i.e., σc² = σ²t, for a constant σ² defined in the interval 0 ≤ σ² ≤ ∞. The function is graphically presented in Fig. 1a) for a mean value equal to zero and varying values of Xc that define the variance σc² = Xc²/3. For our analysis, we could not rely on the definition in Montgomery and Runger's book [5], where zero values of the pdf are not considered. The closest to a proper definition are those in the books by Papoulis and Pillai [6], Peebles [7], and Manolakis [8], which define the variance but not its limits; even there, the limit values of the interval on which the pdf is positive are not specified. The definitions in Gray's book [9] and Proakis's book [10] are incomplete and cannot be used as such to develop our theory.
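As an illustrative numerical check (a sketch under stated assumptions, not from the source), the following Python/NumPy code evaluates the time-dependent pdf of equation (1) and verifies that its variance equals σ²t = Xc²/3; the values of t and σ and the helper name uniform_pdf are arbitrary choices for illustration.

```python
import numpy as np

def uniform_pdf(x, t, sigma=1.0):
    # Time-dependent uniform pdf of eq. (1): the variance grows linearly with
    # time, sigma_c^2 = sigma^2 * t, so the interval bound is Xc = sqrt(3)*sigma_c.
    sigma_c = sigma * np.sqrt(t)
    Xc = np.sqrt(3.0) * sigma_c
    return np.where(np.abs(x) <= Xc, 1.0 / (2.0 * Xc), 0.0)

# Numerical check that the variance of the pdf equals sigma^2 * t (= Xc^2 / 3).
t, sigma = 4.0, 0.5
Xc = np.sqrt(3.0) * sigma * np.sqrt(t)
x = np.linspace(-2.0 * Xc, 2.0 * Xc, 200001)
f = uniform_pdf(x, t, sigma)
variance = np.sum(x**2 * f) * (x[1] - x[0])   # Riemann sum for E[X^2] (zero mean)
print(variance, sigma**2 * t)                 # both approximately 1.0
```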
In our rigorous definition, we specify the interval of limit values to be 0 ≤ Xc ≤ ∞, which also includes the equality signs, because we need to explain the behavior of the pdf and the related information function at zero and infinite values of the parameters Xc and σc, i.e., not only when these parameters tend to infinity but also when they reach infinity. We also strictly specify the interval where the uniform density is different from zero as -Xc ≤ x ≤ Xc, which allows us to define the values of the information function for every x in the interval -∞ ≤ x ≤ ∞ and to calculate the entropy of the information function of the uniformly distributed random variable X. Namely, alongside the pdf, we will investigate the behavior of the information function I(X) and the entropy H(X) of the random variable X. These three functions are presented in Fig. 1.
Unlike Shannon, who in his famous paper [1] directly defines the entropy of a continuous random variable with the pdf fc(x), we will first define and analyze the information function, taken as the negative logarithm to the base 2 of the pdf in equation (1), and express it as
$$
I(X) = -\log_2 f_X(x)
= \begin{cases}\log_2 2X_c, & -X_c \le x \le X_c\\ -\log_2 0, & \text{otherwise}\end{cases}
= \begin{cases}\log_2 2\sqrt{3}\,\sigma_c, & -X_c \le x \le X_c\\ \infty, & \text{otherwise}\end{cases}
= \begin{cases}\log_2 2\sqrt{3}\,\sigma\sqrt{t}, & -X_c \le x \le X_c\\ \infty, & \text{otherwise}\end{cases}
\qquad (2)
$$
which simplifies our understanding of the physical meaning of both the information function and the related entropy function, the latter being expressed as the mean value of the information function.
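A minimal sketch (assuming NumPy; the helper names information and entropy are hypothetical) of the information function of equation (2) and of the entropy obtained as its mean value. It also confirms numerically that averaging I(X) over samples of X reproduces H(X) = log2(2√3 σ√t), which grows logarithmically with t, in line with the entropy behavior described in the introduction.

```python
import numpy as np

def information(x, t, sigma=1.0):
    # Information function of eq. (2): I(X) = -log2 f_X(x); constant inside
    # [-Xc, Xc] and infinite outside, where the pdf is zero.
    Xc = np.sqrt(3.0) * sigma * np.sqrt(t)
    return np.where(np.abs(x) <= Xc, np.log2(2.0 * Xc), np.inf)

def entropy(t, sigma=1.0):
    # Entropy as the mean value of the information function:
    # H(X) = log2(2*sqrt(3)*sigma*sqrt(t)), growing logarithmically with t.
    return np.log2(2.0 * np.sqrt(3.0) * sigma * np.sqrt(t))

# Monte Carlo check: averaging I(X) over samples drawn from the pdf gives H(X).
rng = np.random.default_rng(1)
t, sigma = 2.0, 1.0
Xc = np.sqrt(3.0) * sigma * np.sqrt(t)
samples = rng.uniform(-Xc, Xc, 10**6)
print(information(samples, t, sigma).mean(), entropy(t, sigma))   # both ~ 2.2925
```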
The values of the pdf inside the interval increase as the interval width decreases, as shown in Fig. 1a) for the intervals defined by Xc = 0, 1/8, 1/4, 1/2, 1, and 2, with the corresponding values of the pdf function being ∞, 4, 2, 1, 1/2, and 1/4, respectively, presented in italic font; these values can be checked directly, as in the short computation below. If the interval bound Xc drops to zero, the pdf becomes the Dirac delta function, represented by an arrow pointing to +∞ in Fig. 1a). If Xc tends to infinity, the pdf tends to zero.
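As a trivial arithmetic check (illustrative only, not from the source), the peak pdf value 1/(2Xc) reproduces the values quoted for Fig. 1a):

```python
import numpy as np

# Peak pdf value 1/(2*Xc) for the interval bounds quoted for Fig. 1a).
for Xc in [0.0, 1/8, 1/4, 1/2, 1.0, 2.0]:
    peak = np.inf if Xc == 0.0 else 1.0 / (2.0 * Xc)   # Xc -> 0 gives the Dirac delta limit
    print(f"Xc = {Xc}: f_X = {peak}")
# Prints inf, 4.0, 2.0, 1.0, 0.5, 0.25 -- the values marked in italics in Fig. 1a).
```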
If we form a continuous-time i.i.d. stochastic
process X(t), defined by the random variable X at
each time instant t, we may define the related