Singularity Behaviour of the Density, Information, and Entropy
Functions Defining a Uniform Non-stationary Stochastic Process
STEVAN BERBER
Electrical, Computer, and Software Engineering Department
The University of Auckland
5 Grafton Road, 1010 Auckland Central
NEW ZEALAND
Abstract: - Precise definitions and derivatives of the time-dependent continuous and discrete uniform
probability density functions and related information and entropy functions are investigated. A stochastic
system is formed that can represent a uniform noise source having a time-dependent variance and forming a
uniform non-stationary stochastic process. The information and entropy functions of the system are defined, and
their properties are investigated in the time domain, including the limit cases defined for infinite and zero
values of the time-dependent variance. In particular, the singularity properties of the entropy function will be
investigated when the time-dependent variance reaches infinity. Like in thermodynamics, where the physical
entropy of a system increases all the time, the information entropy of the stochastic system in information
theory is also expected to increase towards infinity when the variance increases. All investigations are
conducted for both the continuous and discrete random variables and their density functions. The presented
theory is of particular interest in analyzing the Gaussian density function having infinite variance and tending to
a uniform density function.
Key-Words: - Uniform probability density function, information function, information entropy, continuous and
discrete density, limits of information functions.
Received: April 7, 2021. Revised: February 23, 2022. Accepted: March 26, 2022. Published: April 28, 2022.
1 Introduction
The theory of stochastic processes having a
probability density function (pdf) that is a function
of time is important in the analysis of non-stationary
stochastic processes. In this paper, this theory is
further extended to the information theory by
defining and deriving the time-dependent
information and entropy functions. It is assumed
that the pdf function of a stochastic process is
uniform and its variance linearly depends on time.
The process is analyzed starting with the results in
information theory published in Shannon’s seminal
paper [1]. According to the presented theory, the
randomness in the system generating the process
will increase in time, and the average information
per random event, or the system entropy, will tend
to infinity. Furthermore, it will be shown that the
probability of all random events will tend to zero
when the variance tends to infinity. We say the
probabilities of events reach equilibrium when all
probability values reach theoretical zero carrying the
information contents that tend to infinity.
This theory is analogous to thermodynamics
theory. Namely, the second law of thermodynamics
states that the physical entropy of the enclosed
system always increases and has an identical
expression as the information entropy defined in this
paper, as noted, for example, in Glattfelder’s paper
[2]. Furthermore, the results of this paper will show
that the system entropy is finite and tends to infinity
when the probability of each random event tends to
zero carrying the information content that tends to
infinity.
Having in mind the thermodynamics theory,
the presented system operates irreversibly out of the
thermodynamic equilibrium with the ability to reach
the thermodynamic equilibrium. An analysis of a
system operating out of the thermodynamic
equilibrium is presented in Nicholson’s paper [3].
Inside the system generating a uniform non-stationary stochastic process, the entropy function
would be singular when the time-dependent
variance reaches infinity, i.e., the entropy will
suddenly change from infinity to zero, even though
that contradicts Leibniz’s famous statement Natura
non facit saltus (nature makes no jump) [4]. In
addition, the information function gets infinite
values due to the related all-zero pdf function.
We will assume that the interval of time
between any two consecutive realizations of the
process is negligibly small compared to the
observed interval of time t, and the amplitude
values of the process in that interval of time are
mutually independent. Likewise, the entropy of the
system increases logarithmically towards infinity,
reaches infinity, and then drops down to zero
theoretically in infinity, as symbolically presented
by a dashed thick line in Fig.1.
Inside the system, four basic stochastic
processes can be generated: continuous- and
discrete-time and continuous-valued processes, and
continuous- and discrete-time and discrete-valued
processes. The random variables defining these
processes are described by the time-dependent
continuous and discrete uniform probability density
functions, respectively. Detailed analysis of these
pdf functions and related information and entropy
functions will be presented in the following
sections.
2 Time-dependent uniform pdf of a
continuous random variable
2.1 Definition of a time-dependent
probability density function
Due to the importance of the probability density
function (pdf) for our analysis, we will start with its
precise definition which will be consistently used in
presenting our theory. The uniform pdf of a
continuous random variable X is defined as
$$
f_X(x) = \begin{cases} \dfrac{1}{2X_c}, & -X_c \le x \le X_c \\ 0, & \text{otherwise} \end{cases}
= \begin{cases} \dfrac{1}{2\sqrt{3}\,\sigma_c}, & -X_c \le x \le X_c \\ 0, & \text{otherwise} \end{cases}
= \begin{cases} \dfrac{1}{2\sqrt{3\sigma^2 t}}, & -X_c \le x \le X_c \\ 0, & \text{otherwise} \end{cases} \qquad (1)
$$
for the positive values 0 ≤ Xc ≤ ∞. Furthermore, we will express the variance as a linear function of time, i.e., σc² = σ²t, for a constant σ² defined in the interval 0 ≤ σ² ≤ ∞. The function is graphically presented in Fig. 1a), for the mean value equal to zero and varying values of Xc that define the variance σc². For our analysis, we could not rely on the definition in Montgomery and Runger's book [5], where zero values of the pdf are not considered. The closest to a proper definition appears in the books of Papoulis and Pillai [6], Peebles [7], and Manolakis [8], which define the variance but not its limits and do not specify the limit values of the pdf function that have positive values. The definitions in Gray's book [9] and Proakis's book [10] are incomplete and cannot be used as such to develop our theory.
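As a quick numerical illustration (not part of the original derivation), the time-dependent pdf in (1) can be evaluated in Python; the helper name uniform_pdf and the choice σ² = 1 are our own assumptions:

import math

def uniform_pdf(x, t, sigma2=1.0):
    # Time-dependent uniform pdf of (1): variance sigma_c^2 = sigma2 * t,
    # so the half-width of the support is X_c = sqrt(3 * sigma2 * t).
    Xc = math.sqrt(3.0 * sigma2 * t)
    if Xc == 0.0:
        # Degenerate case t = 0: the pdf collapses towards a Dirac delta at x = 0.
        return float('inf') if x == 0.0 else 0.0
    return 1.0 / (2.0 * Xc) if -Xc <= x <= Xc else 0.0

# At t = 1/3 and sigma2 = 1 the half-width is X_c = 1, so f_X(x) = 1/2 inside [-1, 1].
print(uniform_pdf(0.5, t=1.0 / 3.0))   # 0.5
print(uniform_pdf(2.0, t=1.0 / 3.0))   # 0.0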
In our rigorous definition, we specify the interval of limit values to be 0 ≤ Xc ≤ ∞, which also includes the equality sign due to the necessity to explain the behavior of the pdf function and the related information function at zero and infinite values of the parameters Xc and σc, i.e., when these parameters not only tend to infinity but also reach infinity. We also strictly specify the interval of uniform density values different from zero as -Xc ≤ x ≤ Xc, which allows us to define the values of the information function for every x in the interval -∞ ≤ x ≤ ∞ and to calculate the entropy of the information function of the uniformly distributed random variable X. Namely, alongside the pdf function, we will investigate the behavior of the information function I(X) and the entropy H(X) of random variable X. These three functions are presented in Fig. 1.
Unlike Shannon, who defines, in his famous paper [1], the entropy of a continuous random variable with the pdf function fc(x), we will first define and analyze the information function as the negative base-2 logarithm of the pdf function in equation (1) and express it as
$$
I(X) = -\log_2 f_c(x)
= \begin{cases} \log_2 2X_c, & -X_c \le x \le X_c \\ -\log_2 0 = \infty, & \text{otherwise} \end{cases}
= \begin{cases} \log_2 2\sqrt{3}\,\sigma_c, & -X_c \le x \le X_c \\ \infty, & \text{otherwise} \end{cases}
= \begin{cases} \log_2 2\sqrt{3\sigma^2 t}, & -X_c \le x \le X_c \\ \infty, & \text{otherwise} \end{cases} , \qquad (2)
$$
which simplifies our understanding of the physical
sense of both the information function and the
related entropy function that is expressed as the
mean value of the information function.
The information function values increase inside the interval Xc when the interval width decreases, as shown in Fig. 1, for the intervals defined by Xc = 0, 1/8, 1/4, 1/2, 1, and 2 with the corresponding values of the pdf function being ∞, 4, 2, 1, 1/2, and 1/4, respectively, which are presented in italic font. If the interval Xc drops to zero, the pdf function becomes the Dirac delta function, represented by an arrow line pointing to +∞ in Fig. 1a). If the interval Xc tends to infinity, the pdf function tends to zero.
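The behavior just described can be checked with a minimal numerical sketch of the information function in (2); it simply takes the negative base-2 logarithm of the pdf and adopts the convention -log2(0) = +∞ used in the text (the function name is ours):

import math

def information(x, Xc):
    # I(X) = -log2 f_X(x): finite inside [-Xc, Xc], +infinity outside,
    # and -infinity at x = 0 in the degenerate case Xc = 0 (Dirac delta).
    if Xc == 0.0:
        return float('-inf') if x == 0.0 else float('inf')
    if -Xc <= x <= Xc:
        return math.log2(2.0 * Xc)
    return float('inf')

for Xc in (0.125, 0.25, 0.5, 1.0, 2.0):
    print(Xc, information(0.0, Xc))   # -2, -1, 0, 1, 2 bits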
If we form a continuous-time i.i.d. stochastic
process X(t), defined by the random variable X at
each time instant t, we may define the related
realization of this process as uniform random
signals x(t). The theory of these processes uses the
same notation as presented in chapter 19 of Berber’s
book [11]. Suppose three pdf functions of uniform
random variables X are defined by Xc = 2, 0, and 1.
The three realizations of the related stochastic
processes, also called random signals in the
presented theory, are presented in Fig. 2a) on the
same coordinate system for the sake of simplicity.
The first realization, or the random signal x1(t),
takes the values between -2 and 2 in the time
interval from 0 to 2. The second realization is a
horizontal line overlapping the abscissa defining
certain events of generating zero amplitudes at each
time instant inside the interval from 2 to 6. The third
realization takes the values from -1 to 1 in the
interval from 6 to 8.
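The three realizations described above can be mimicked with a short simulation; the sketch below (illustrative only, with an arbitrary number of samples per interval) draws independent uniform amplitudes with the stated half-widths Xc = 2, 0, and 1 on the successive time intervals:

import random

def realization(Xc, n_samples):
    # Independent samples of a uniform random variable on [-Xc, Xc];
    # for Xc = 0 every amplitude is exactly zero (the certain event).
    return [random.uniform(-Xc, Xc) if Xc > 0 else 0.0 for _ in range(n_samples)]

x1 = realization(2.0, 20)   # values in [-2, 2], interval 0..2
x2 = realization(0.0, 40)   # all-zero amplitudes, interval 2..6
x3 = realization(1.0, 20)   # values in [-1, 1],  interval 6..8
print(max(map(abs, x1)) <= 2, set(x2) == {0.0}, max(map(abs, x3)) <= 1)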
Figure 1 a) Continuous uniform pdf function, and b) related information and entropy functions.
Figure 2 a) Realisations of three continuous-time, and b) three discrete-time stochastic processes.
The theory presented in this section can be
applied to discrete-time stochastic processes. A
realization of this process is expressed as a random
function of the discrete-time variable n instead of t,
as shown in Fig. 2b). In explaining and using these
processes we will follow the notation and theory
presented in chapter 4 of Berber’s book [11].
Contrary to a system with a fixed statistical property of the random experiment, we can imagine a process that is generated in time according to the varying distributions presented in Fig. 3, starting with an all-zero pdf function producing an undefined random signal. This signal is followed by a random signal defined by the Dirac delta pdf function and finishes with a distribution defined by Xc → ∞, producing a random signal with possible ±∞ amplitudes. Let us analyze the limit cases when the pdf function parameter σc or Xc tends to infinity and zero, which are essential for understanding the properties of a non-stationary process having a time-dependent variance.
Parameter σc or Xc tends to infinity. In this
case, when the variance of the pdf function tends to
infinity causing the defined positive pdf function
values to tend to zero, we may have
$$
\lim_{X_c \to \infty} f_X(x) = \lim_{X_c \to \infty} \frac{1}{2X_c}
= \lim_{\sigma_c \to \infty} \frac{1}{2\sqrt{3}\,\sigma_c}
= \lim_{t \to \infty} \frac{1}{2\sqrt{3\sigma^2 t}} = 0 \qquad (3)
$$
as indicated in Fig. 3 by a horizontal line on the left
graph. All random values x are spread in the infinite
interval from –∞ to +∞ and occur with the
probability that tends to zero (dashed arrow line)
and reaches infinity (bold arrow line). When the
variance tends to infinity, the probabilities are
infinitesimally small, and the process still has its
realizations defined on an infinite interval of
possible values.
Figure 3 System represented by hypothetical realizations of one undefined process and four continuous-time random processes characterized by four uniform pdf functions for Xc = 0, 1, 2, and ∞.
Therefore, when Xc reaches infinity, the random
signal values are generated with the probability of
zero causing the process and its realization to
vanish. We can say that the process is presented by
an empty graph. One can argue that these random values cannot exist due to the zero probability of their generation and can be ignored; consequently, the defined zero values of the limiting pdf function might seem pointless. However, we
will show that this function has meaning in the
physical world from the information function point
of view and cannot be ignored.
Parameter σc or Xc tends to zero. If one of the
parameters, σc, Xc, or t tends to zero, the pdf function
becomes the Dirac delta function according to
$$
\lim_{X_c \to 0} f_X(x) = \lim_{\sigma_c \to 0} \frac{1}{2\sqrt{3}\,\sigma_c}
= \lim_{t \to 0} \frac{1}{2\sqrt{3\sigma^2 t}}
= \begin{cases} \infty, & x = 0 \\ 0, & \text{otherwise} \end{cases} = \delta(x) \qquad (4)
$$
having the infinite value at x = 0. A realization of a
stochastic process, which is defined by this limit
uniform pdf function that is represented by the Dirac
delta function, is shown in Fig. 4. All amplitude
values are zero because the probability of their
generation is one.
Figure 4 A realization of a stochastic process defined by random variable X that is defined by the uniform pdf function as a Dirac delta function at the point x = 0.
2.2 Information function
Due to its importance for our analysis, we will define and investigate the properties of the information function expressed by eq. (2) for two limit cases when σc or Xc tends to infinity or zero.
Parameters σc or Xc tends to infinity. For the
first case, the information function is
$$
I_\infty(X) = \lim_{X_c \to \infty} I(X) = \lim_{X_c \to \infty} \log_2 \frac{1}{f_c(x)}
= \lim_{X_c \to \infty} \begin{cases} \log_2 2X_c, & -X_c \le x \le X_c \\ \infty, & \text{otherwise} \end{cases}
= \lim_{\sigma_c \to \infty} \begin{cases} \log_2 2\sqrt{3}\,\sigma_c, & -X_c \le x \le X_c \\ \infty, & \text{otherwise} \end{cases}
= \lim_{t \to \infty} \begin{cases} \log_2 2\sqrt{3\sigma^2 t}, & -X_c \le x \le X_c \\ \infty, & \text{otherwise} \end{cases}
= \infty , \qquad (5)
$$
for all x values, -∞ ≤ x ≤ ∞. We can understand
function in the following way. When the interval
±Xc of random variable X stretches to infinity, all
values of the random variable will exist and appear
with the infinitesimally small probability nearly
equal to zero. Thus, having these small probabilities
of appearance, the information content of all of
them will be close to infinity. The random events
persist to exist, and they can happen with a
probability close to zero.
In infinity (i.e., when the interval Xc reaches
infinity) the pdf function values become zero, thus
the probability of any event becomes zero, and a
realization of that stochastic process is an empty
graph. The random events persist to potentially
exist, and they can happen with the probability of
zero. From a theoretically strict point of view, the
random signal does not exist in time, i.e., there are no changes in signal values, or there are no changes in time. The pdf function is zero in the entire interval of x values, as defined in (3) and shown in Fig. 5a), and the information function is +∞ for all x values, as shown in Fig. 5b). A realization x(t) of the
related stochastic process X(t) is an empty
coordinate system.
Figure 5 Continuous uniform pdf function and related information function defined for the infinite variance value.
Parameter σc or t tends to zero. In this case,
the information content is
$$
I_0(X) = \lim_{\sigma_c \to 0} I(X) = \lim_{\sigma_c \to 0} \log_2 \frac{1}{f_c(x)}
= \lim_{\sigma_c \to 0} \begin{cases} \log_2 2\sqrt{3}\,\sigma_c, & -X_c \le x \le X_c \\ \log_2 (1/0), & \text{otherwise} \end{cases}
= \begin{cases} -\infty, & x = 0 \\ \infty, & \text{otherwise} \end{cases} \qquad (6)
$$
or
$$
I_0(X) = \lim_{t \to 0} \begin{cases} \log_2 2\sqrt{3\sigma^2 t}, & -X_c \le x \le X_c \\ \log_2 (1/0), & \text{otherwise} \end{cases}
= \begin{cases} -\infty, & x = 0 \\ \infty, & \text{otherwise} \end{cases} \qquad (7)
$$
which is presented in Fig. 6b) alongside the
corresponding pdf function shown in Fig. 6a).
Figure 6 Continuous uniform pdf function and related information function defined for the zero-variance value.
Therefore, in the case when the pdf function is
a delta function, the minimum information content
is –∞ and is represented by the inverted Dirac delta
function. For this case, we are certain that any
realization x of random variable X will be zero and
there is no uncertainty (information) about the value
of this realization, i.e., the information takes the
minimum value, which is –∞. However, the information content takes and remains at the +∞ value everywhere else on the x-axis where the pdf
function of X has zero values. A realization of the
related stochastic process is a horizontal line having
an amplitude of zero as shown in Fig. 4. Zero x
value occurs for sure, that is a certain event.
Therefore, there is a substantial difference between
the graphs in Fig. 5 and 6.
The same behavior of the information function
can be observed if interval Xc tends to zero, as
presented in Fig. 7. The function values are
increasing from minus infinity, for zero value 2Xc,
to infinity for the infinite value of 2Xc, which
complies with the findings in equations (5) and (7).
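Numerically, the behavior summarized in (5)-(7) and Fig. 7 can be observed by letting the half-width Xc shrink or grow; this is only a sanity-check sketch of the finite value log2(2Xc) inside the support:

import math

# Inside the support the information is log2(2*Xc): it falls towards -infinity
# as Xc -> 0 and grows without bound as Xc -> infinity.
for Xc in (1e-6, 1e-3, 0.5, 1.0, 1e3, 1e6):
    print(Xc, math.log2(2.0 * Xc))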
2.3 Entropy
By following Shannon’s theory presented in
his famous paper[1], the entropy of a continuous
random variable with the defined pdf function fc(x)
can be expressed as
$$
H(X) = -\int_{-\infty}^{\infty} f_c(x) \log_2 f_c(x)\, dx , \qquad (8)
$$
and calculated as
$$
H(X) = -\int_{-X_c}^{X_c} \frac{1}{2X_c} \log_2 \frac{1}{2X_c}\, dx
- \int_{-\infty}^{-X_c} 0 \cdot \log_2 0 \, dx - \int_{X_c}^{\infty} 0 \cdot \log_2 0 \, dx
= \log_2 2X_c = \log_2 2\sqrt{3}\,\sigma_c = \log_2 2\sqrt{3\sigma^2 t} . \qquad (9)
$$
We say that the entropy is defined as the mean value of the information function (2), and represents the information (uncertainty) content per random value x of random variable X. The contribution to the entropy value is zero for all x values outside of the 2Xc interval where the pdf function is zero. The entropy values are numbers that can be positive for Xc > 1/2, zero for Xc = 1/2, and negative for Xc < 1/2.
The positive values of the entropy are increasing
inside the interval Xc when the interval width is
increasing as shown in Fig. 1b), for the intervals
defined with Xc = 1 and 2 with the corresponding
values of the function being 1 and 2, which are
presented in italic font. For Xc < 1/2, the entropy is
negative and increases in absolute value.
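The closed form H(X) = log2 2Xc in (9) can be cross-checked by a direct numerical evaluation of (8) over the support; the rough Riemann-sum sketch below is our own illustration:

import math

def entropy_uniform(Xc, n=100000):
    # H(X) = -integral of f*log2(f) over [-Xc, Xc]; outside the support f = 0
    # and the 0*log2(0) contribution is taken as zero.
    f = 1.0 / (2.0 * Xc)           # the pdf is constant on the support
    dx = 2.0 * Xc / n
    return -sum(f * math.log2(f) * dx for _ in range(n))

for Xc in (0.25, 0.5, 1.0, 2.0):
    print(Xc, entropy_uniform(Xc), math.log2(2.0 * Xc))
# Xc = 1/2 gives zero entropy; Xc < 1/2 gives negative values, as stated above.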
One more note on the entropy: In this
theoretical analysis, we accept Shannon’s definition
of entropy in contrast to its definition as a
differential entropy which can be found in some
books, for example, in Haykin’s book [12]. We
consider it unnecessary to introduce the differential
entropy due to the continuity of the random variable
X and the strict definition of entropy as an integral transform, which makes the presented theory consistent. Let us analyze the limit cases for the
entropy when the pdf function parameter σc, t, or Xc
tends to infinity or zero.
Parameter σc, t, or Xc tends to infinity. If the
interval Xc tends to be infinite, someone can
calculate mistakenly the entropy using expression
(9) as
$$
H_\infty(X) = \lim_{X_c \to \infty} H(X) = \lim_{X_c \to \infty} \log_2 2X_c
= \lim_{\sigma_c \to \infty} \log_2 2\sqrt{3}\,\sigma_c
= \lim_{t \to \infty} \log_2 2\sqrt{3\sigma^2 t} = \infty , \qquad (10)
$$
which is specifically valid for the case when Xc →∞,
but not for the case when Xc = ∞, i.e., when Xc
reaches infinity. However, following (10), the
influence of zero values of the pdf function in
infinity is not considered. If the interval Xc reaches
infinity the entropy should be calculated using its
definition, which will include the zero-valued pdf
function, i.e.,
$$
H_\infty(X) = -\int_{-\infty}^{\infty} \lim_{X_c \to \infty} f_c(x) \log_2 f_c(x)\, dx
= -\int_{-\infty}^{\infty} 0 \cdot \log_2 0 \, dx = 0
$$
or
$$
H_\infty(X) = -\int_{-\infty}^{\infty} \lim_{t \to \infty} f_c(x) \log_2 f_c(x)\, dx
= -\int_{-\infty}^{\infty} 0 \cdot \log_2 0 \, dx = 0 . \qquad (11)
$$
Another confirmation of the validity of (11) can be obtained as follows. When Xc tends to infinity, the random variable takes values in the infinite interval stretching from –∞ to +∞ that occur with the probability of zero. To consider these probability values, we can confirm (11) by calculating the entropy as
$$
H_\infty(X) = \lim_{\sigma_c \to \infty} \int_{-\infty}^{\infty} f_c(x) \log_2 \frac{1}{f_c(x)}\, dx
= \lim_{\sigma_c \to \infty} \int_{-X_c}^{X_c} \frac{1}{2\sqrt{3}\,\sigma_c} \log_2 \left( 2\sqrt{3}\,\sigma_c \right) dx
$$
$$
= \lim_{\sigma_c \to \infty} \int_{-X_c}^{X_c} \frac{1}{2\sqrt{3}\,\sigma_c} \left( \log_2 2\sqrt{3} + \log_2 \sigma_c \right) dx
= \lim_{\sigma_c \to \infty} \int_{-X_c}^{X_c} \frac{\log_2 2\sqrt{3}}{2\sqrt{3}\,\sigma_c}\, dx
+ \lim_{\sigma_c \to \infty} \int_{-X_c}^{X_c} \frac{\log_2 \sigma_c}{2\sqrt{3}\,\sigma_c}\, dx .
$$
If the limit of the integral is equal to the integral of
the limits, we may have
$$
H_\infty(X) = \int_{-\infty}^{\infty} \lim_{\sigma_c \to \infty} \frac{\log_2 2\sqrt{3}}{2\sqrt{3}\,\sigma_c}\, dx
+ \int_{-\infty}^{\infty} \lim_{\sigma_c \to \infty} \frac{\log_2 \sigma_c}{2\sqrt{3}\,\sigma_c}\, dx
= \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{\log_2 2\sqrt{3\sigma^2}}{2\sqrt{3\sigma^2 t}}\, dx
+ \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{\log_2 \sqrt{t}}{2\sqrt{3\sigma^2 t}}\, dx .
$$
The first limit is zero and the second is of an indeterminate form. Applying L'Hôpital's rule, the second integral is also zero, i.e.,
$$
H_\infty(X) = \int_{-\infty}^{\infty} \lim_{\sigma_c \to \infty} \frac{d\left( \log_2 \sigma_c \right)/d\sigma_c}{d\left( 2\sqrt{3}\,\sigma_c \right)/d\sigma_c}\, dx
= \int_{-\infty}^{\infty} \lim_{\sigma_c \to \infty} \frac{\log_2 e}{2\sqrt{3}\,\sigma_c}\, dx
= \int_{-\infty}^{\infty} 0\, dx = 0 , \qquad (12)
$$
or, with respect to the time variable t, we may have
$$
H_\infty(X) = 0 + \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{\log_2 \sqrt{t}}{2\sqrt{3\sigma^2 t}}\, dx
= \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{d\left( \log_2 \sqrt{t} \right)/dt}{d\left( 2\sqrt{3\sigma^2 t} \right)/dt}\, dx
= \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{\log_2 e \,/\, 2t}{\sqrt{3\sigma^2}/\sqrt{t}}\, dx
= \int_{-\infty}^{\infty} \lim_{t \to \infty} \frac{\log_2 e}{2\sqrt{3\sigma^2}\,\sqrt{t}}\, dx
= \int_{-\infty}^{\infty} 0\, dx = 0 . \qquad (13)
$$
Therefore, considering the values of the
information and corresponding probability values,
the entropy, as the measure of the average
information content inside the random values x, is
zero. These entropy values are presented in Fig. 1b)
by circles connected by a full curve that reaches
infinite entropy.
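The conclusion of (12)-(13), that the integrand vanishes when the limit is taken before the integration, can also be seen numerically; the sketch below tracks the per-unit-length term fc(x)·log2(1/fc(x)) inside the support as t grows (with σ² = 1 assumed for illustration):

import math

def integrand(t, sigma2=1.0):
    # Contribution f_c(x) * log2(1/f_c(x)) at any point inside the support,
    # with f_c = 1 / (2*sqrt(3*sigma2*t)).
    f = 1.0 / (2.0 * math.sqrt(3.0 * sigma2 * t))
    return f * math.log2(1.0 / f)

for t in (1.0, 1e2, 1e4, 1e8):
    print(t, integrand(t))   # tends to 0 as t -> infinity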
Parameter σc or Xc tends to zero. When
parameters σc or Xc tends to zero, the entropy value
can be calculated as follows
$$
H_0(X) = \lim_{\sigma_c \to 0} \int_{-\infty}^{\infty} f_c(x) \log_2 \frac{1}{f_c(x)}\, dx
= -\int_{-\infty}^{\infty} \delta(x) \log_2 \delta(x)\, dx
= -\log_2 \delta(0) = -\infty . \qquad (14)
$$
For our analysis, we have separately defined
and used the information function I(X) as defined by
equation (2), which contains the information content
of the random variable X. Therefore, for the uniform
density, the information content defined inside the
Xc interval is numerically equal to the calculated
entropy, as can be seen in Fig. 7.
Precise graphical presentations of the relevant functions are shown in Fig. 8. It is important to note
the following: While the interval Xc tends to infinity
the entropy value tends to infinity. In infinity, the
intervals of zero values of entropy disappear and the
entropy calculated in the entire infinite interval
becomes zero. In contrast to entropy, if the
appearance of all values of random variable X is
happening with the probability of zero, the
information content of all of them is infinite as will
be seen from the following analysis.
Figure 7 Continuous uniform pdf function, the related information, and entropy, as functions of the size of interval 2Xc.
Figure 8 Precise presentation of the continuous
uniform pdf function, information, and entropy as
functions of the interval 2Xc.
In infinity, the events, i.e., the realizations of random variable X, potentially exist and can happen with the probability of zero, i.e., theoretically never. Consequently, from the strict point of view of generating the random signal x(t), which is a realization of the stochastic process X(t), there are no changes in the appearance of these random values x(t) at time instants t. However, the information content of all possible random values is infinite.
3 Discrete uniform random variables
3.1 Probability density function
It is important to note that we will distinguish
and use two types of the discrete uniform pdf
functions: a pdf function expressed in terms of the
Dirac delta functions, and a pdf function expressed
in terms of the Kronecker delta functions. Even
though these two types can be used to represent the
same pdf function, they are different in practical
applications and have different meanings in defining
related information functions.
When the Dirac delta functions are used, the
pdf function will be expressed as a function of a
continuous random variable value x. On the infinite
uncountable set of real values x, we will define an
infinite countable number of discrete points for
integer values x = s, where the pdf function has
either values different from zero or zero values that
are defined by the final weights of Dirac delta
functions. Consequently, the intervals between any
two adjacent delta functions contain continuous
values of x with the pdf function of zero value, as
shown in Fig. 9a). When Kronecker delta functions
are used, the pdf function will be expressed as a
function of discrete random variable values x = s
having amplitudes defined by the weights of the
Kronecker functions. Consequently, the pdf values inside the intervals between any two adjacent delta functions will be undefined; they are not zero, and we say that these intervals contain nothing in place of pdf values.
We must use these presentations of pdf
functions to derive appropriate expressions for the
information function and entropy and understand
their properties. These presentations are consistent
with the theoretical explanation presented by Berber
[14], for the case when the Dirac delta functions are
used and Kronecker delta functions are assumed as
an additional possible solution. Delta functions are
used to present the discrete pdf functions in Papoulis
and Pillai's book [6], even though the type of the
delta function is not specified. In the same book, a
primitive definition of the uniform discrete pdf is
presented on p. 98, which cannot be considered as a
precise one to be used in our theoretical
developments. A presentation of the pdf function in
terms of the impulse delta function is given in
Peebles book [7], where a detailed analysis of the
Dirac delta function (called the unit-impulse
function) is presented. The uniform discrete pdf
function can be expressed in terms of Dirac delta
functions as
$$
f_d(x) = \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \begin{cases} \dfrac{1}{2S+1}\, \delta(x-s), & -S \le x = s \le S \\[4pt]
0 \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
0, & x \ne s \end{cases} . \qquad (15)
$$
One example of this function is graphically presented in Fig. 9a) for the size of the discrete interval defined by S = 2. It is important to note that the pdf function is defined on the continuous interval -∞ < x < +∞ of possible random variable values, having the values fd(x) ≠ 0 at a finite, countable set of discrete instants of random values x, and zeros everywhere else.
We can also use here the Kronecker delta
functions to express the discrete pdf function in this
form
$$
f_d(x) = \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \begin{cases} \dfrac{1}{2S+1}\, \delta(x-s), & -S \le x = s \le S \\[4pt]
0, & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\text{undefined}, & x \ne s \end{cases} \qquad (16)
$$
This density is graphically presented in Fig. 9b) for the discrete interval S = 2, which defines the variance σd² = S(S+1)/3 = 2. The pdf function can be made time-dependent if we just express S as a function of time. Because this change of variables will not change the generality of the explanation, it will be avoided here.
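A small sketch of the discrete uniform law in (15)-(16) and of the variance σd² = S(S+1)/3 quoted above (the helper names are ours, not from the paper):

def pmf(s, S):
    # Probability mass 1/(2S+1) at the integer points s = -S, ..., S; zero weight elsewhere.
    return 1.0 / (2 * S + 1) if -S <= s <= S else 0.0

def variance(S):
    # Second moment of the zero-mean discrete uniform variable on {-S, ..., S}.
    return sum((s ** 2) * pmf(s, S) for s in range(-S, S + 1))

print(pmf(1, 2), variance(2), 2 * (2 + 1) / 3)   # 0.2, 2.0 and 2.0, as for S = 2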
In summary, we can say that the interval of the
random variable values x is continuous if the pdf
function values are zero everywhere else except at
points s, which are defined by the Dirac delta
functions. The Dirac delta function can be replaced
by the Kronecker delta function assuming that the
variable x is a discrete random variable having the
integer values from –∞ to +∞. In this case, the
values of the pdf function are defined at discrete
instants s and are not defined between them. We say
that the values of the pdf function do not exist
between points s. The pdf functions based on
Dirac’s and Kronecker's presentation are shown in
Fig. 9a) and 9b), respectively, with the related
random signals generated according to these pdf
functions and presented in Fig. 9c) and 9d),
respectively.
The random signal in Fig. 9c) is a continuous-
time discrete-valued signal. Any amplitude of the
signal x(n) is generated with the related probability
fd(x) and preserves that value until the next time
instant (n+1) because the probability of generating
amplitudes between x(n) and x(n+1) is zero. For that
reason, we can express this signal as a function of
continuous-time t, as in Fig. 9c). In contrast to this
signal, the signal in Fig. 9d) is a discrete-time
discrete-valued signal. These two signals, combined with the signals presented in Section 2.1, complete the set of basic signals that can be generated in the system. Therefore, the use of the Dirac and Kronecker delta functions has some differences and will be analyzed separately.
Figure 9 Discrete uniform pdf function represented by a) Dirac and b) Kronecker delta functions and the realizations (random signals) of related stochastic processes c) and d).
Let us analyze the limit cases when the density parameters σd or S tend to infinity and zero. We will note the differences in presenting the pdf function by Dirac and Kronecker delta functions.
Parameter σd or S tends to infinity. The
limits of the pdf function, expressed by the Dirac
delta functions, when the variance tends to infinity,
can be obtained as follows
$$
\lim_{S \to \infty} f_d(x) = \lim_{S \to \infty} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s) = 0 . \qquad (17)
$$
Precisely calculating, the limit values of the pdf
function are expressed in terms of Dirac delta
functions as
$$
\lim_{S \to \infty} f_d(x) = \lim_{S \to \infty} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \lim_{S \to \infty} \begin{cases} \dfrac{1}{2S+1}\, \delta(x-s), & -S \le x = s \le S \\[4pt] 0, & x \ne s \end{cases}
= \begin{cases} 0 \cdot \delta(x-s), & x = s \\ 0, & x \ne s \end{cases} , \qquad (18)
$$
because the values 1/(2S+1) tend to zero when S tends to infinity. In this case, all random values are distributed in the infinite interval stretching from –∞ to +∞ and occur with the probability of zero.
If the same pdf function is expressed in terms
of Kronecker delta functions, the limit values are
$$
\lim_{S \to \infty} f_d(x) = \lim_{S \to \infty} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \begin{cases} 0 \cdot \delta(x-s), & x = s \\ \text{undefined}, & x \ne s \end{cases} \qquad (19)
$$
and presented in graphical form as in Fig. 10b) with
the related random signal in Fig. 10d). In this case,
all random values are distributed in the infinite
countable interval stretching from –∞ to +∞ and
occur with the probability of zero.
Parameter σd or S tends to zero. In the second
case, parameters σd tends to zero and the pdf
function can be expressed by the Dirac delta
function at the zero point, i.e.,
$$
\lim_{S \to 0} f_d(x) = \lim_{S \to 0} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \delta(x) = \begin{cases} 1 \cdot \delta(x), & x = 0 \\ 0, & \text{otherwise} \end{cases}
$$
or in a precise form as
$$
\lim_{S \to 0} f_d(x) = \lim_{S \to 0} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \begin{cases} 1 \cdot \delta(x-s), & x = s = 0 \\ 0 \cdot \delta(x-s), & x = s \ne 0 \\ 0, & \text{otherwise} \end{cases}
= \begin{cases} 1 \cdot \delta(x), & x = 0 \\ 0 \cdot \delta(x-s), & x = s \ne 0 \\ 0, & \text{otherwise} \end{cases} . \qquad (20)
$$
Figure 10 Discrete uniform pdf function represented by a) Dirac and b) Kronecker delta functions when the variance tends to infinity, and realizations of the related stochastic processes, c) and d), respectively.
Therefore, the pdf function is defined as the
Dirac delta at x = 0 having the weight one, and by a
stream of delta functions of zero weights for all the
other values s, as presented in Fig. 11a). In between
these delta impulses, the pdf function values are
zero. Therefore, the zero-weight delta functions fill in the x-axis and make it continuous.
Figure 11 Discrete uniform pdf function represented by a) Dirac and b) Kronecker delta functions when the variance tends to zero, and realizations of the related stochastic processes, c) and d), respectively.
If the pdf function is expressed in terms of
Kronecker delta functions and the interval S tends to
zero, the pdf function is
$$
\lim_{S \to 0} f_d(x) = \lim_{S \to 0} \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s)
= \begin{cases} 1 \cdot \delta(x-s), & x = s = 0 \\ 0 \cdot \delta(x-s), & x = s \ne 0 \\ \text{undefined}, & \text{otherwise} \end{cases}
= \begin{cases} 1, & x = 0 \\ 0 \cdot \delta(x-s), & x = s \ne 0 \\ \text{undefined}, & \text{otherwise} \end{cases} . \qquad (21)
$$
This pdf function can be understood as the
Kronecker delta at x = 0 and by a stream of delta
functions of zero weights for all other x = s discrete
values. In between these delta functions, the pdf
function values are not defined. This function is
presented in Fig. 11b), and the related all-zero
discrete-time random signal in Fig. 11d).
3.2 Information function
Assuming that the pdf function is expressed in
terms of Dirac delta functions, we will separately
define and use the information function I(X) that
contains the information content, or the information,
of the random variable X. Having in mind the
properties of the delta function, the information
function can be derived in this form
$$
I(X) = -\log_2 f_d(x)
= -\log_2 \left[ \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s) \right]
= \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\infty, & x \ne s \end{cases} \qquad (22)
$$
Similarly, if the pdf function is expressed in terms
of Kronecker delta functions, the information
function is
$$
I(X) = -\log_2 f_d(x)
= -\log_2 \left[ \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s) \right]
= \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\text{undefined}, & x \ne s \end{cases} \qquad (23)
$$
due to the definition of the Kronecker delta
function. For example, if S = 2 we may have
$$
I(X) = \begin{cases} 2.322\, \delta(x-s), & -2 \le x = s \le 2 \\[4pt]
\infty \cdot \delta(x-s), & x = s < -2 \ \text{and} \ x = s > 2 \\[4pt]
\text{undefined}, & x \ne s \end{cases} \qquad (24)
$$
The graphs of both information functions, for Dirac
and Kronecker delta functions, are presented in Fig.
12a) and 12b), respectively.
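For the example in (24), the finite information value at the points inside the interval is log2(2S+1) = log2 5 ≈ 2.322 bits; a one-line numerical check (illustrative only):

import math

S = 2
print(math.log2(2 * S + 1))   # approximately 2.3219 bits at the points s = -2, ..., 2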
The pdf function and information function are
precisely calculated and presented in Fig. 17a) and
17b), for S = 0, 1, 2, and 4 defining the interval of
pdf function values that are different from zero. The
values of the defined information functions are
presented for all values of the independent variable
x from –∞ to +∞. This presentation is important to
be understood because we will investigate the
behavior of these functions when the variance, or
the interval S of these functions, tends to infinity
and reaches the infinite value.
Figure 12 Information functions represented by a) Dirac and b) Kronecker delta functions for S = 2.
Parameter σd or S tends to infinity. We will
investigate the information contents of the random
variable X for two limit cases when σd or S tends to
infinity or zero. For the first case, if Dirac delta
functions are used, the information can be calculated
using properties of the impulse function as
$$
I_\infty(X) = \lim_{S \to \infty} I(X) = \lim_{S \to \infty} \log_2 \frac{1}{f_d(x)}
= \lim_{S \to \infty} \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\infty, & x \ne s \end{cases}
= \begin{cases} \infty \cdot \delta(x-s), & x = s \\ \infty, & x \ne s \end{cases} \qquad (25)
$$
for all values -∞ ≤ x ≤ ∞, or in a simplified form as
$$
I_\infty(X) = \begin{cases} \infty \cdot \delta(x-s), & x = s \\ \infty, & x \ne s \end{cases} . \qquad (26)
$$
for all values -∞ ≤ x ≤ ∞. We can understand
this function in the following way. When the
interval S of random variable values tends to
infinity, all values of the random variable X will
potentially exist and appear with an infinitesimally
small probability. Thus, having these small
probabilities of appearance, the information content
of all of them will tend to infinity. This limit
information function is shown in Fig. 13a).
If Kronecker delta functions are used, the
information function is
$$
I_\infty(X) = \lim_{S \to \infty} I(X) = \lim_{S \to \infty} \log_2 \frac{1}{f_d(x)}
= \lim_{S \to \infty} \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\text{undefined}, & x \ne s \end{cases}
= \begin{cases} \infty \cdot \delta(x-s), & x = s \\ \text{undefined}, & x \ne s \end{cases} \qquad (27)
$$
or in a simplified form as
$$
I_\infty(X) = \begin{cases} \infty \cdot \delta(x-s), & x = s \\ \text{undefined}, & x \ne s \end{cases} . \qquad (28)
$$
This function is presented in Fig. 13b). We can
understand this function in the following way. When
the interval S of random variable values tends to
infinity, all values of the random variable at discrete
instants x = s will potentially exist and appear with
the infinitesimally small probability nearly equal to
zero. Thus, having these small probabilities of
appearance, the information content of all of them
will be close to infinity. Between any two
neighboring discrete instants, x = s, the information
function does not exist, i.e., it is not defined, as the
pdf function values in these intervals are not defined
as shown in Fig. 10b).
Figure 13 Discrete uniform information function represented by a) Dirac and b) Kronecker delta functions when the variance tends to infinity.
Parameter σd or S tends to zero. The
information function, for Dirac delta functions, can
be expressed as
$$
I_0(X) = \lim_{S \to 0} I_d(X)
= \lim_{S \to 0} \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\infty, & x \ne s \end{cases} \qquad (29)
$$
and calculated in this form
$$
I_0(X) = \begin{cases} 0 \cdot \delta(x), & x = 0 \\ \infty \cdot \delta(x-s), & x = s \ne 0 \\ \infty, & x \ne s \end{cases} \qquad (30)
$$
or, in its simplified form, as
$$
I_0(X) = \lim_{S \to 0} I_d(X)
= \lim_{S \to 0} \begin{cases} \log_2 (2S+1), & -S \le x \le S \\ \log_2 (1/0), & \text{otherwise} \end{cases}
= \begin{cases} 0, & x = 0 \\ \infty, & \text{otherwise} \end{cases} . \qquad (31)
$$
The information function is presented in Fig. 14a).
In this case, the minimum information content is
again 0 for x = 0, as shown in Fig. 14a). For this
discrete case, we are certain that any realization x of
the random variable X will be zero and there is no
uncertainty (information) about the value of this
realization, i.e., the information takes the minimum
value which is 0. The other discrete events of X
different from zero, and defined at the instants x = s,
are occurring with the probability of 0. We are
certain these events will not happen, thus their
information content is infinite, as shown in Fig. 14a)
by arrow lines pointing to the infinity. The infinite
values for the information are symbolically
presented by Dirac delta functions defined at
instants x = s. The probability of random values x ≠ s is zero and their information content is infinite,
which is presented by the left-right arrows at the top
in Fig. 14a. In this case, a realization of a stochastic
process X(n), defined by X at any discrete time-
instant n, is a discrete process having amplitudes of
zero values, as shown in Fig. 15. Zero sample values
(outcomes) occur for sure, they are certain events at
all time instants n. Between the time instants n the
random signal x2(n) preserves zero values because
of the zero probability of events that occur between
discrete realizations defined by any x = s, as shown
in Fig. 11a).
If the σd or S tends to zero, the information
function, for Kronecker delta functions, is
$$
I_0(X) = \lim_{S \to 0} I_d(X)
= \lim_{S \to 0} \begin{cases} \log_2 (2S+1)\, \delta(x-s), & -S \le x = s \le S \\[4pt]
\infty \cdot \delta(x-s), & x = s < -S \ \text{and} \ x = s > S \\[4pt]
\text{undefined}, & x \ne s \end{cases}
= \begin{cases} 0 \cdot \delta(x), & x = 0 \\ \infty \cdot \delta(x-s), & x = s \ne 0 \\ \text{undefined}, & x \ne s \end{cases} . \qquad (32)
$$
Figure 14 Discrete uniform information functions represented by a) Dirac and b) Kronecker delta functions when the variance tends to zero.
In this case, the minimum information content is 0
at the origin, as shown in Fig. 14b), unlike for the
continuous random variable when the minimum
information content is –∞ and represented by an
inverted Dirac delta function, as shown in Fig. 6b).
For this case, we are certain that any realization of
the random variable X will be zero and there is no
uncertainty (information) about the value of this
realization, i.e., the information takes the minimum
value which is 0. The information content remains
of the +∞ value everywhere else on the x-axis where
the discrete pdf function of X has zero value, i.e., for
x = s. Let us explain the behavior of the information
function for these two presentations. A realization of
the related stochastic process X(n), defined by
random variable X at any time-instant n, is a
discrete-time process having amplitude zero at
points in time n and is not defined between these
points, as shown in Fig. 16 for a hypothetical
random signal x(n). Zero random values (outcomes)
occur for sure; they are certain events at all time
instants n and carry no information. Because a zero
outcome occurs with the probability of one for every
n, the information content is zero, as shown in Fig.
14b). The other events of X at the instants x = s ≠ 0 are occurring with the probability of 0; therefore,
their information content is infinite, as shown in Fig.
14b). For that reason, the random signal in Fig. 16
does not have any amplitudes greater than zero. The
dependence of the pdf and information function on
the interval S is presented in Fig. 17.
Figure 15 Realizations of a continuous-time and discrete-valued stochastic process defined by random variable X having the uniform pdf function expressed by the Dirac delta function for S = 0.
Figure 16 A realization of discrete-time and discrete-valued stochastic processes defined by random variable X having the uniform pdf function expressed by a Kronecker delta function at S = 0.
Figure 17 a) Discrete uniform pdf function presented by Kronecker delta functions, b) related information and entropy functions, and c) an exemplary realization of the related stochastic process.
The pdf function is expressed in terms of
Kronecker functions in Fig. 17a) for the mean value
equal to zero and varying values of discrete interval
S = 0, 1, 2, and 4. The related dependence of the
information function on the interval S is presented in
Fig. 17b). The information function values
(information contents) of random values x are
increasing when the values of the pdf function are
decreasing. For the zero-values of the pdf function,
the information function is of the infinite value,
which is symbolically presented by non-overlapping
dashed and dashed-dot straight arrow lines in Fig.
17b). When the interval S of the pdf function tends
to infinity, the discrete information function values
tend to infinity. When S reaches infinity, all the
information values are equal to infinity
corresponding to the pdf function with all zero
values.
If we form a discrete-time stochastic process
defined by the realizations of random variable X at
each discrete-time instant n, we may represent the
related realizations of three processes for pdf
functions of X defined by S = 2, 0, and 1, as shown
in Fig. 17c). These three random signals are
presented on the same graph for the sake of
explanation. The first realization x1(n) takes integer values between -2 and 2 in the time interval 0
to 7. The second realization x2(n) is represented by
zero amplitudes at each discrete-time instant inside
the interval from 8 to 23. The third realization x3(n)
takes the integer values from -1 to 1 in the interval
from 24 to 31. The presented processes are discrete-
time and discrete-valued processes. The presented
analysis will be valid for the case when we use
Dirac delta functions to represent the pdf function,
which will produce a continuous-time discrete-
valued process. Precise exemplary calculations for
the discrete pdf function and related information
function are presented in Fig. 18. When the interval S increases from 0 to 20, the information values increase from 0 to a value above 5, as shown by circles in Fig. 18. For the same S values, the pdf function values decrease from one towards zero, as shown by squares in Fig. 18.
Figure 18 Discrete information functions (circles),
and uniform pdf functions (squares) for the interval
S as a parameter.
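The three discrete-time realizations sketched in Fig. 17c) can be emulated by drawing integers uniformly from {-S, ..., S}; a minimal sketch with the interval lengths assumed above:

import random

def discrete_realization(S, n_samples):
    # Independent integer amplitudes drawn uniformly from {-S, ..., S};
    # S = 0 yields the all-zero realization (the certain event).
    return [random.randint(-S, S) for _ in range(n_samples)]

x1 = discrete_realization(2, 8)    # instants 0..7
x2 = discrete_realization(0, 16)   # instants 8..23, all zeros
x3 = discrete_realization(1, 8)    # instants 24..31
print(x1, x2, x3)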
3.3 Entropy
Dirac delta function. By following Shannon’s
theory [1], the entropy can be calculated as the mean
value of the information function using the integral
transform. If the pdf function is expressed in terms
of Dirac delta functions, as in Berber’s paper [13],
the entropy function is expressed as follows
$$
H(X) = -\int_{-\infty}^{\infty} f_d(x) \log_2 f_d(x)\, dx
= -\int_{-\infty}^{\infty} \left[ \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s) \right]
\log_2 \left[ \sum_{s=-S}^{S} \frac{1}{2S+1}\, \delta(x-s) \right] dx - 0
$$
$$
= -\int_{-\infty}^{\infty} \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right]
\log_2 \left\{ \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right] \right\} dx .
$$
The solution of the integral is
$$
H(X) = -\sum_{s=-S}^{S} \frac{1}{2S+1} \log_2 \left\{ \frac{1}{2S+1} \left[ \delta(s+S) + \dots + \delta(s-S) \right] \right\}
$$
$$
= -\frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1} \left( 1 + 0 + \dots + 0 \right) \right]
- \dots - \frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1} \left( 0 + \dots + 0 + 1 \right) \right]
$$
$$
= -\frac{1}{2S+1} \log_2 \frac{1}{2S+1} - \dots - \frac{1}{2S+1} \log_2 \frac{1}{2S+1}
= -\log_2 \frac{1}{2S+1} = \log_2 (2S+1) . \qquad (33)
$$
Kronecker delta function. The same result can
be obtained if we express the pdf function in terms
of Kronecker delta functions. In this case, we will
express the entropy as a sum
$$
H(X) = E\left[ I(X) \right] = -\sum_{x} f_d(x) \log_2 f_d(x)
= -\sum_{x=-S}^{S} f_d(x) \log_2 f_d(x) - 0
$$
$$
= -\sum_{x=-S}^{S} \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right]
\log_2 \left\{ \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right] \right\} .
$$
Using the sifting property of the delta function, we may solve the sum for x as follows
$$
H(X) = -\sum_{x=-S}^{S} \frac{1}{2S+1}\, \delta(x+S)
\log_2 \left\{ \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right] \right\}
- \dots
- \sum_{x=-S}^{S} \frac{1}{2S+1}\, \delta(x-S)
\log_2 \left\{ \frac{1}{2S+1} \left[ \delta(x+S) + \dots + \delta(x-S) \right] \right\} ,
$$
and then each sum is calculated as follows
$$
H(X) = -\frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1}\, \delta(0) \right]
- \dots - \frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1}\, \delta(0) \right]
= -\frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1} \cdot 1 \right]
- \dots - \frac{1}{2S+1} \log_2 \left[ \frac{1}{2S+1} \cdot 1 \right]
= \log_2 (2S+1) . \qquad (34)
$$
The entropy values are numbers that are always zero
or positive for the discrete case, and represent the
average value of information (uncertainty) per
random value x of X. Due to our intention to
investigate the entropy as a function of the variance,
σd² = S(S+1)/3,
or as a function of the width of
interval S, we denote the entropy as H(X). The
positive values of the entropy are increasing inside
the interval S when the interval width is increasing
as shown in Fig. 19 for the intervals defined with S
= 0, 1, 2, and 4 with the corresponding values of the
entropy that are presented in italic font.
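The closed form H(X) = log2(2S+1) obtained in (33)-(34) can be verified by summing -p·log2 p over the 2S+1 equiprobable outcomes; a short sketch (our own check):

import math

def entropy_discrete(S):
    # Shannon entropy of the uniform pmf on {-S, ..., S}; the zero-probability
    # outcomes outside the interval contribute 0*log2(0) = 0 and are omitted.
    p = 1.0 / (2 * S + 1)
    return -sum(p * math.log2(p) for _ in range(2 * S + 1))

for S in (0, 1, 2, 4):
    print(S, entropy_discrete(S), math.log2(2 * S + 1))
# 0, 1.585, 2.322 and 3.170 bits, the values marked in Fig. 19.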
Parameter σd or S tends to infinity.
Kronecker delta function. If the interval S tends to
infinity someone can calculate mistakenly the
entropy in infinity using expression (34) as
$$
H_\infty(X) = \lim_{S \to \infty} \log_2 (2S+1) = \infty . \qquad (35)
$$
However, in this case, the influence of the zero
values of the pdf function in infinity is not taken
into account. When the interval S reaches infinity,
the entropy should be calculated using its definition,
which will include the zero-valued probabilities in
the pdf function, i.e.,
$$
H_\infty(X) = -\lim_{S \to \infty} \sum_{s=-S}^{S} f_d(x) \log_2 f_d(x)
= -\lim_{S \to \infty} \sum_{s=-S}^{S} 0 \cdot \log_2 0 = 0 . \qquad (36)
$$
In this case, all random variable values x are in the
infinite interval stretching from –∞ to +∞ and occur
with the probability of zero. To consider these
probability values, we can confirm (36) by calculating the entropy using an intermediate form from the derivation of entropy (34) as follows. The limit is
$$
H_\infty(X) = \lim_{S \to \infty} \sum_{x=-S}^{S} f_d(x) \log_2 \frac{1}{f_d(x)}
= \lim_{S \to \infty} \sum_{x=-S}^{S} \frac{1}{2S+1} \log_2 (2S+1) .
$$
Then, following the rule that the limit of the sum is equal to the sum of the limits, we obtain an indeterminate form that requires us to apply L'Hôpital's rule to get
$$
H_\infty(X) = \sum_{x=-S}^{S} \lim_{S \to \infty} \frac{\log_2 (2S+1)}{2S+1}
\overset{\text{L'H}}{=} \sum_{x=-S}^{S} \lim_{S \to \infty} \frac{d\left[ \log_2 (2S+1) \right]/dS}{d\left( 2S+1 \right)/dS}
= \sum_{x=-S}^{S} \lim_{S \to \infty} \frac{2\log_2 e \,/\, (2S+1)}{2}
= \sum_{x=-S}^{S} \lim_{S \to \infty} \frac{\log_2 e}{2S+1} = 0 . \qquad (37)
$$
Therefore, considering the values of the
information and corresponding probability values,
the entropy, as the measure of the average
information contents inside the random values x, is
zero. These entropy values are presented in Fig. 19
by circles connected by a full curve that reaches
infinite entropy, which then drops back to zero
entropy as symbolically presented by a dashed curve
alongside the full curve.
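The distinction emphasized in (35)-(37), namely that the total entropy log2(2S+1) grows without bound while the contribution of any single outcome, (1/(2S+1))·log2(2S+1), vanishes, can be illustrated numerically (a sketch only):

import math

for S in (10, 10**3, 10**6, 10**9):
    total = math.log2(2 * S + 1)         # entropy of the whole distribution
    per_outcome = total / (2 * S + 1)    # contribution of one individual outcome
    print(S, total, per_outcome)         # total -> infinity, per-outcome -> 0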
It is important to note the following two
properties of entropy, its infinite value when S tends
to infinity, and zero value when S reaches infinity.
While the discrete interval S tends to infinity the
calculated entropy value in the interval S tends to
infinity and is zero beyond the S interval. In infinity,
the intervals of random variable x with zero values
of entropy disappear, and in the entire infinite
interval, the entropy becomes zero because the S
value reaches infinity. The appearance of any value
x of the random variable X is happening with the
probability of zero and its information content is
infinite, as will be seen from the following analysis.
Dirac delta function. If the pdf function is
expressed in terms of Dirac delta functions, then the
pdf function in infinity will have all zero values and
the entropy value is zero calculated as follows
$$
H_\infty(X) = -\int_{-\infty}^{\infty} \lim_{S \to \infty} f_d(x) \log_2 f_d(x)\, dx
= -\int_{-\infty}^{\infty} 0 \cdot \log_2 0\, dx = 0 . \qquad (38)
$$
Therefore, considering the values of the information
and corresponding probability values, the entropy,
as the measure of the average information content
inside the random values x, is zero when the interval
S reaches infinity.
Parameter σd or S tends to zero. Kronecker
delta function. For the Kronecker delta function
presentation and the case when parameter σd or S
tends to zero, the entropy value can be calculated as
follows
$$
H_0(X) = \lim_{S \to 0} \sum_{n=-S}^{S} \frac{1}{2S+1} \log_2 (2S+1)
= \lim_{S \to 0} \log_2 (2S+1) = 0 . \qquad (39)
$$
The dependence of the pdf function values,
information, and entropy on the size of interval S is
presented in Fig. 19.
Dirac delta function. If the information is
expressed in terms of Dirac delta functions, we may
have
$$
H_0(X) = \lim_{S \to 0} H(X) = \lim_{S \to 0} \log_2 (2S+1) = 0 . \qquad (40)
$$
The calculated entropy values are increasing
towards infinity when S tends to infinity. When S
reaches infinity, the entropy drops down from the
infinite value to zero, as shown in eq. (36), which is
presented by a full curve with a loop at the end (in
infinity) and a dashed arrow curve showing
symbolically the return of the entropy to zero.
Figure 19 Discrete uniform pdf function and related information and entropy as functions of the size of interval S.
4 Conclusions
The presented theory has shown that the
entropy of the uniform stochastic process,
having a time-dependent variance, increases in
time, reaches infinity, and then drops to zero
showing the singularity property in infinity. In
contrast to the entropy function of the process,
all information function values tend to infinity when the variance tends to infinity and attain infinite values for the infinite variance. The
paper presented precise definitions and related
derivatives of the information and entropy
functions both for the continuous and discrete
uniform random variables assuming that the
variance can have any value between zero and
infinity. In particular, a uniform density
function with the variance that linearly depends
on time is defined and derived, and related non-
stationary processes are formed. This function
takes all zero values in infinity causing
singularity behavior of the entropy function.
Due to the possible existence of the continuous
and discrete processes in a stochastic system,
the time-dependent continuous and discrete
probability density functions and related
information and entropy functions are derived.
References:
[1]. Shannon, C. E. A Mathematical Theory of Communication. The Bell System Technical Journal, Vol. 27, pp. 379-423, 623-656, July, October, 1948.
[2]. Glattfelder, J. B. Information-Consciousness-Reality, (Springer Nature Switzerland AG, 2019).
[3]. Nicholson, B. S. et al. Time-information uncertainty relations in thermodynamics, Nature Physics, Vol. 16, Dec. 2020, pp. 1211-1215.
[4]. Bell, J. L. Continuity and infinitesimals, In:
Zalta, E.N. (ed.) The Stanford Encyclopedia of
Philosophy, Summer 2017 Edition.
[5]. Montgomery, D.C., Runger, G. C. Applied
Statistics and Probability for Engineers, (John
Wiley and Sons, New York, Fourth Edition,
1994).
[6]. Papoulis, A., Pillai, S. U. Probability, Random
Variables and Stochastic Processes, (McGraw-
Hill, New York, Fourth Edition, 2002).
[7]. Peebles. P. Z. Probability, Random Variables,
and Random Signal Principles, (McGraw-Hill,
New York, Fourth Edition, 2001).
[8]. Manolakis, D. G., Ingle, V. K., Kogan, S. M.
Statistical and Adaptive Signal Processing,
(Artech House. INC., 2005).
[9]. Gray, M. R. Davisson, L. D. An Introduction to
Statistical Signal Processing, (Cambridge
University Press, 2004).
[10]. Proakis, G. J. Digital Communications,
(McGraw-Hill, New York, Fourth Edition,
2001).
[11]. Berber, S. Discrete Communication Systems,
(Oxford University Press, 2021).
[12]. Haykin, S. Communication Systems, (John
Wiley & Sons Ltd., Fourth Edition, 2001).
[13]. Berber, S. The Exponential, Gaussian and
Uniform Truncated Discrete Density Functions
for Discrete-Time Systems Analysis, WSEAS
Transactions on Mathematics, ISSN / E-ISSN:
1109-2769 / 2224-2880, Volume 16, Art. #26,
2017, pp. 226-238.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the Creative
Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en_US