Exploring the Potential of EEG for Real-Time Interactions in
Immersive Virtual Reality
MIKKO KORKIAKOSKI1, THEODOROS ANAGNOSTOPOULOS2, OSKARI RAJALA1,
MIKAEL SARKINIEMI1, MARKUS HIRSIMAKI1, JERE KINNUNEN1, PAULA ALAVESA1,
PANOS KOSTAKOS1
1Center for Ubiquitous Computing, Faculty of Information Technology and Electrical Engineering,
University of Oulu,
Pentti Kaiteran katu 1, 90570 Oulu
FINLAND
2DigiT.DSS.Lab
Department of Business Administration
University of West Attica
250 Thivon & P. Ralli Str, Egaleo, 12241, Athens
GREECE
Abstract: - Brain-computer interfaces (BCIs) can use data from non-invasive electroencephalogram (EEG) to
transform different brain signals into binary code, often aiming to gain control of an end-effector (e.g., a
mouse cursor). In the past several years, advances in wearable and immersive technologies have made it
possible to integrate EEG with virtual reality (VR) headsets. These advances have enabled a new generation of
user studies that help researchers improve understanding of various issues in current VR design (e.g.,
cybersickness and locomotion). The main challenge in integrating EEG-based BCIs into VR environments is
to develop communication architectures that deliver robust, reliable, and lossless data flows. Furthermore, user
comfort and near real-time interactivity create additional challenges. We conducted two experiments in which a
consumer-grade EEG headband (Muse 2) was used to assess the feasibility of an EEG-based BCI in virtual
environments. We first conducted a pilot experiment that consisted of a simple task of object re-scaling inside
the VR space using focus values generated from the user’s EEG. The subsequent study experiment consisted of
two groups (control and experimental) performing two tasks: telekinesis and teleportation. Our user research
study shows the viability of EEG for real-time interactions in non-serious applications such as games. We
further suggest that a simplified way of calculating the mean EEG values is adequate for this type of use. In
addition, we discuss the findings to help improve the design of user research studies that deploy similar EEG-
based BCIs in VR environments.
Key-Words: - electro-encephalography, EEG, interaction, locomotion, user research study, virtual reality
Received: October 29, 2022. Revised: January 21, 2023. Accepted: February 16, 2023. Published: March 27, 2023.
1 Introduction
Immersive experiences offered by VR and
augmented reality (AR) are gaining traction in areas
outside gaming and simple simulations [1].
However, in tasks that require elevated levels of
concentration, such as teleoperated robotic
machinery [2], remote medical treatment and
education, virtual content can distract users if it is
not on par with their attentive state [3]. The most
common solutions for user control in commercial
VR devices are physical controllers and hand or
body tracking [4], [5]. In addition, the most
common feedback modalities are visual, audio, and
haptics (vibration). Capturing the user’s attention
using only these available options can be
challenging in six degrees of freedom (6DoF) or
360° virtual environments. Various studies have
explored the use of multimodal feedback to help
govern the attention of the user while BCIs have
also shown promise with respect to the potential use
of brain signals as a controller [6], [7], [8], [9].
Using electroencephalography (EEG) as a
controller is not a novel idea in VR research [10],
[11], [12]. However, because such devices have
emerged only recently, user studies remain scarce:
devices using wireless dry electrodes are very
sensitive to correct placement and remain susceptible
to interference. Against this backdrop, our
experimental setup consists of a popular head
mounted display (HMD) along with a portable EEG
device, delivering both user comfort and easy
electrode setup. This first experiment was a pilot
study with six participants where our main focus
was to assess the feasibility of our setup. This was
done by measuring user comfort while subjects were
exposed to closed-loop biofeedback given through a
virtual heads up display (HUD) shown in VR. The
pilot study consisted of a simple task of re-scaling
an object in the VR realm using only the user
generated EEG signals. In the second experiment,
the system architecture was simplified to lower the
latency caused by the EEG data conversion. This
experiment featured two tasks: telekinesis and
teleportation. In the telekinesis task, users were
instructed to move objects to predetermined
locations inside the VR space. Respectively, in the
teleportation task, participants were instructed to
move along a course of checkpoints set in VR using
the teleportation utility. In both tests, the actions
such as grab, move, and locomotion, were
controlled using the portable EEG device. We also
had a control group who wore the EEG device,
however their EEG values were not used as a
control mode.
This article presents two experiments, a pilot
experiment (see Subsection 3.1) and a study
experiment (see Subsection 3.2), in which some of
the controls are bypassed with a commercially
available non-invasive EEG-based BCI, enabling
users to have direct agency in the simulated
environment. A strict policy of social distancing was
in place due to the Covid-19 pandemic when the
study was planned and conducted. Specifically, we
adapted the user study for a smaller sample and used
a within-subject design. The testing was conducted in
three different locations during April 2021. Each of
these locations had their own observer and
researcher running the experiments. The protocol
for testing was decided well in advance to keep the
procedure uniform. Still, when conducting testing in
different locations and with a different set of
equipment, the comparability of the results is a
concern [13]. For all testing, the VR equipment and
the Muse 2 headsets were the same models, but not
the exact same units. The PCs used to run the
simulations differed; however, the simulation itself
was very lightweight, so all the computers were able
to run it at the intended framerates. This was made
possible by improvements made over the
proof-of-concept pilot experiment, which was run
with a similar setup and devices, but with a slightly
more complex system architecture and a less
polished VR space. For the analysis, a non-parametric
Kruskal-Wallis test [14] was used for the results of
the questionnaires, while the completion times were
analyzed with one-way ANOVA. The user research
study progress is presented in Fig. 1.
Fig. 1: User research study progress.
The purpose of this study was to explore the
feasibility of a consumer-grade EEG device as a
controller in immersive VR and to gain early insights
on user experience (sense of agency and
cybersickness) when using EEG for locomotion and
interactions. Our findings show that a simplified
approach can be adequate for non-serious use.
This paper is structured as follows. First, in
Section 2, we introduce related work and rationale
behind this study. Then, in Section 3, we describe
the research study beginning with the pilot
experiment description and continuing to the study
experiment. This section also describes the
architecture of our system, procedure and the
demographics of our test users. In Section 4 we go
through the results and analysis methods of this
mixed methods study. Discussion of the observed
results is performed in Section 5, while the research
effort is concluded in Section 6.
2 Related Work
Naturally occurring forms of interaction that bypass
traditional physical inputs, such as the keyboard
and the mouse, have for some time been core topics
in human-computer interaction (HCI) [15].
Recently, instinctual interactions that involve a
combination of voice, eye-tracking, and hand
gesture controls have become the industry norm for
top-tier mixed reality (MR) devices like the
Microsoft HoloLens 2 [16]. While the mouse and
keyboard combination might not be replaced in the
near future, other modalities and controls are
essential in achieving the true potential of
immersive VR [2], [3], [17]. EEG-based BCIs in
VR have been used as replacements for the typical
game controls [12], [18], enabling dynamic and
responsive training environments driven by task
customization and adaptability [19]. EEG data has
also been used to better understand disorientation
and physical discomfort [20], [21], [22], brain
activity during navigational tasks [21], and as an
attention enhancement [11].
Although EEG systems are mostly used in
conjunction with other data gathering methods [20],
[21], [22], [23], [24], EEG data has been
instrumental in measuring the cognitive load of
users [23], [24]. Consequently, this body of research
has clear implications for improving VR design. For
instance, the connection between locomotion in
immersive VR and cybersickness [25], also known
as VR sickness, has been studied in the past using
VR-native interactions like teleportation, gaze, free
locomotion, tracking combined with non-isometric
walking etc. [26], [27]. Within this context, prior
VR research has leveraged portable EEG devices as
a noninvasive BCI for evaluating virtual interactions
including locomotion [10], [28]. Similarly, proof-of-
concept systems that integrate a BCI and VR
headset have been developed, targeting improved
mindfulness in immersive environments [29].
Portable devices such as Emotiv EPOC, Muse 2,
and NeuroSky MindWave have sparked renewed
interest across research fields [30], [31]. In
particular, Muse 2 [32], is a light-weight portable
EEG headset, which has been validated against
large-system EEG setups for both continuous
recording of EEG data and in event-related brain
potentials (ERP) research [33]. As reported in the
literature [34], the Muse EEG system has been used
to detect the brain states for concentration and
relaxation [29], [35], task enjoyment [36], pain [37],
as well as detecting the cognitive state of the user
[38]. In this study we explore the use of EEG as an
additional mode of interaction. The goal of the
technical implementation was to use consumer
devices and measurement solutions that would
provide usable EEG data as close as possible to real-
time.
While using EEG can improve some aspects of
the user experience, VR can also influence the EEG
measurements by offering a dynamic and immersive
scene for feedback [6], [39]. BCIs connected with
VR can result in fewer errors due to enhanced
mental effort [40]. This suggests that EEG can improve
the engagement and focus of users in VR. In
this research effort we were interested in the
little-explored connection between VR sickness and
EEG-enabled locomotion. In addition, we targeted
an even less explored aspect of using EEG as a
mode of interaction: its effect on the sense of
agency. We did this based on the assumption that,
due to higher immersion and focus, there might be
observable differences in this aspect of the user
experience.
Experiments using immersive technologies (e.g.,
VR, AR, MR, XR) are usually conducted in
controlled environments and suitable research
facilities. As in other fields, the Covid-19 pandemic
has forced educators and researchers to work
remotely and to practice social distancing at the
workplace.
This has severely hampered user testing for the
devices and simulations. Covid-19 has also raised
new concerns especially regarding the cleanliness of
the equipment, when conducting experiments where
the devices are passed on from person to person
numerous times in short periods of time [13].
3 The User Research Study
We conducted a user research study to explore the
potential of using EEG data as a controller i.e., a
mode for interaction in immersive VR. The study
experiment (see Subsection 3.2) was preceded by a
pilot experiment (see Subsection 3.1) that we briefly
describe in the text below.
3.1 Pilot Experiment
The pilot experiment took place pre-Covid-19, in
November 2018, and aimed at validating the setup
of an HMD (Oculus Quest) used together with a
Muse 2 and at ensuring accurate data collection.
Muse 2 sends raw data on five channels, one of which
is ground, as presented in Fig. 2. The remaining four
channels correspond to four locations on a normal EEG
cap following the 10–20 standard. Muse 2 is
also cordless, delicate, and lightweight, which makes
it possible to fit under a VR HMD. On average,
each channel sends 255 values per second, which are
then fast Fourier transformed once every second to
obtain the final window of EEG values.
The frequency range in this window is from 1 Hz to 128 Hz,
with a time resolution of one second. The resulting
frequencies correspond to the microvolt values sent by
Muse 2. We consider these values to be relative
only to each other within an ongoing measuring session,
because skin conductivity varies from person to
person and a number of factors can cause
interference to the measurements and the Bluetooth
connection.
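To make the per-second windowing concrete, the following minimal Python sketch applies the same idea to one second of raw samples from a single channel; the 256-sample window size, the use of NumPy, and the function name are our assumptions for illustration and not the pilot implementation itself.

```python
import numpy as np

FS = 256  # assumed nominal samples per second per channel (the paper reports roughly 255)

def spectrum_for_window(samples):
    """Fast Fourier transform one second of raw EEG from a single channel.

    Returns the magnitudes of the 1 Hz ... 128 Hz bins, i.e. the window of
    EEG values that was recomputed once every second in the pilot experiment.
    """
    samples = np.asarray(samples, dtype=float)
    magnitudes = np.abs(np.fft.rfft(samples, n=FS))  # bins at 0, 1, ..., 128 Hz
    return magnitudes[1:]                            # keep 1 Hz to 128 Hz
```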
This architecture in the pilot experiment
was more complex than in the study experiment (see
Subsection 3.2), because in the pilot the equipment
consisted of a controller PC, a Raspberry Pi, a Muse 2,
an HMD, and a Polar H10 heart-rate sensor. It was
simplified for the actual study by using Bluetooth to
connect the Muse 2 directly to the PC instead of the
Raspberry Pi, and by adjusting the calculations for
transforming EEG signals into usable data. The Polar
H10 heart-rate monitor was not used again in the study,
since it did not add value to the analysis. Finally, the
Oculus Quest HMD was replaced with an Oculus Rift S.
Six subjects participated in the pilot experiment
(see Fig. 3). They wore a Muse 2 accompanied by a
Polar H10 heart-rate sensor [41] that was attached
to the chest with a strap. An Oculus Quest was used as
the HMD. The participants were asked to perform
an object-scaling task. Three participants, group 1
(G1), were exposed to their data through a virtual
wrist-mounted heads-up display (HUD). The other
three participants, in the second group (G2), were
treated similarly but without exposure to the
HUD. Both groups consisted of one female and two
male subjects, aged between 22 and 40 years.
All subjects were students, most with a computer science
and engineering background; one was majoring in
pedagogy and one in geography. Informed consent
was obtained from the students prior to the experiment.
The subjects were told how Muse 2 works and that
they would be able to rescale an object (a ball)
projected in the VR scene using EEG data.
Following a short guidance session in the VR play
area, the participants were asked to relax for 60
seconds with their eyes closed while the threshold
values were collected. The participants in the
first group were then asked to open their eyes and, while
focused, instructed to scale the ball. This group was
provided with a virtual wrist-worn HUD displaying
data from the Muse 2 (EEG) and the heart-rate monitor.
The participants in the second group repeated the
same tasks, but without having the HUD data
visible. On average, the task lasted two minutes.
The experience in VR consisted of a simple scene
with a hovering ball in the user’s vicinity. The size
of the ball changed depending on the focus level of
the user. After the experiment, participants in both
groups were asked to fill out a questionnaire. The
pilot experiment was conducted on the university
campus using a semi-CAVE, which provided a calm
and isolated environment, free of distractions that
could have influenced the EEG measurements.
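As a rough illustration of how the eyes-closed baseline could drive the scaling task, the Python sketch below compares incoming focus values against that baseline; the specific mapping and gain constant are hypothetical and only illustrate the idea of relative thresholding, not the pilot's actual formula.

```python
import statistics

def collect_baseline(relaxed_focus_values):
    """Summarize focus values recorded during the 60-second eyes-closed rest."""
    return statistics.mean(relaxed_focus_values), statistics.stdev(relaxed_focus_values)

def ball_scale(current_focus, baseline_mean, baseline_std, gain=0.25):
    """Hypothetical mapping: grow the ball as focus rises above the relaxed baseline."""
    z = (current_focus - baseline_mean) / max(baseline_std, 1e-6)
    return max(0.5, 1.0 + gain * z)  # clamp so the ball never collapses to nothing
```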
Fig. 2: The topological arrangement of electrodes
based on the 10–20 standard (top) and the Muse 2
headband (below).
3.2 Study Experiment Setup and Procedure
The study experiment took place during spring
2021. Due to restrictions caused by the Covid-19
pandemic, the study experiment was conducted in
three separate locations with 13 participants who
were either friends or family of the researchers.
While this may have introduced bias, it is also
possible that this allowed the participants to feel
more relaxed and comfortable during the
experiments. This in turn would have been
important, especially with regard to the collected
EEG data. The participants’ ages varied widely,
from 21 to 58 years (M = 35.5, SD = 15.7).
Fig. 3: Pilot experiment of the research study. The top
images show, from different angles, the virtual
environment where the scaling task took place; this
is the game view seen by the group that was
exposed to the closed-loop system with the wrist-
worn heads-up display (HUD). The bottom-left images
show the play area used by the participants to
familiarize themselves with the virtual environment.
The bottom-right image shows the semi-CAVE
environment used in the pilot experiment.
Each location had its own observer and
researcher running the experiments. The protocol
for testing was planned and rehearsed in advance. In
our case, the VR equipment and the Muse 2 headsets
used were the same models, albeit not the exact same
units. The VR scenes (Table 1) were run on
different PCs; however, the simulation itself was
very lightweight, so all the computers were able to
run it at the intended high framerates. Still,
when conducting tests in different locations and
with a different set of equipment, the comparability
of the results is always a concern [13].
The equipment in the study experiment included
an Oculus Rift HMD and a Muse 2 EEG headset.
The Unity development platform was used to create and
run the simulations. The BlueMuse software created by
Kowaleski and Wicklund [42] was used to connect the
Muse 2 to a PC. We simplified the architecture by
removing the Raspberry Pi and using Lab Streaming
Layer (LSL), so that the Muse 2 could stream the EEG
data directly to Unity. The BrainVision LSL Viewer [43]
was used to observe the EEG channels
simultaneously; it was also used to calibrate the Muse
2 before running the experiment. Based on the pilot
experiment and further testing with the Muse 2, it was
determined that the placement of the unit on the
participants’ heads together with the HMD had to be
precise, as even slight changes in its position could
produce errors or more interference. Since the
participants wore the Muse 2 under the HMD, the
calibration phase was kept as interference-free as
possible.
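For readers unfamiliar with LSL, the minimal Python sketch below shows how an EEG stream published by BlueMuse could be resolved and read with the pylsl package; this only illustrates the streaming idea and is not the Unity-side integration used in the study.

```python
from pylsl import StreamInlet, resolve_byprop

# BlueMuse advertises the Muse 2 data as an LSL stream of type "EEG".
streams = resolve_byprop('type', 'EEG', timeout=10)
inlet = StreamInlet(streams[0])

while True:
    sample, timestamp = inlet.pull_sample()  # one value per Muse 2 channel
    print(timestamp, sample)
```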
Table 1. Description of the different VR scenes and
user tasks used in the study experiment. All
participants went through the tasks in the order shown
in this table. The only difference between the
experimental and control groups was that the control
group used randomly generated fake focus values.

Setup | Purpose
Playroom | Calibration and familiarization with some EEG-mediated interactions
Teleportation | Test the use of teleportation
Timed teleportation | Test the use of teleportation with a 180-second time limit
Puzzle room | Interacting with objects using EEG: grabbing and dropping
We used a simple way of calculating the focus
values: the four data channels from Muse 2 are
summed and the sum is divided by four at a frequency
of 10 Hz. These averages are then transformed into
focus values by taking the standard deviation over
every 100 average samples at a time. The calculation
works on a first-in-first-out principle for the averages:
each time a new average is added to the buffer, the
oldest one is dropped and a new focus value is
calculated. This keeps the focus value transitions
smoother, as calculating the focus values from an
entirely new batch of data every time could induce
very abrupt changes in the value.
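A minimal Python sketch of this calculation is given below; the function and variable names are ours, and the snippet only illustrates the channel averaging and sliding-window standard deviation described above, not the Unity implementation used in the study.

```python
from collections import deque
import statistics

WINDOW = 100                   # number of channel averages used per focus value
buffer = deque(maxlen=WINDOW)  # first-in-first-out: the oldest average is dropped automatically

def push_channels(tp9, af7, af8, tp10):
    """Average the four Muse 2 channels (arriving at 10 Hz) into the buffer."""
    buffer.append((tp9 + af7 + af8 + tp10) / 4.0)

def current_focus():
    """Return the focus value: the standard deviation of the buffered averages."""
    return statistics.stdev(buffer) if len(buffer) > 1 else 0.0
```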
The participants were divided into groups based
on study experiment conditions. Due to the limited
availability of test users, we conducted a within-
subjects experiment with regard to the VR setup;
however, the experimental and control groups were
separate. The experimental group tested with
actual focus values generated from the users’ EEG
data, while the control group performed the same
tasks with randomly generated fake focus values.
Table 2. Comparison of mean focus values of the
control and experimental groups. Values were
calculated as the mean of the four Muse 2 channels
at a frequency of 10 Hz.

Control (𝑥, 𝜎) | Experimental (𝑥, 𝜎) | p-value
(0.39, 0.41) | (2.10, 0.87) | 0.004
(1.23, 1.63) | (4.11, 1.19) | <0.001
(0.96, 1.10) | (4.22, 0.97) | <0.001
(0.27, 0.29) | (3.54, 1.18) | 0.004
(1.13, 1.28) | (3.94, 0.48) | 0.007
The VR scenes tested are listed in Table 1:
(1) Playroom, (2) Teleportation, (3) Timed
Teleportation, and (4) Puzzle room. All these scenes
were tested by both the experimental and control
groups. The Playroom contained objects that the
users could interact with by using the telekinesis
system. This room was used to teach the user how
they are able to move the objects by focusing. In the
Puzzle room, the users used the telekinesis system
to place objects of certain shapes onto their
corresponding positions. These positions were
indicated by table-shaped pedestals, which changed
their color to green when the correct object
was placed on them. When all objects were in their
correct places, the task was completed. The
Teleportation scene is a hallway with five nodes. In
this scene the users had to use the teleportation
system to teleport into the nodes using their EEG.
The task was completed after the user had teleported
through the sequence of nodes. Timed Teleportation
used the same scene but with a 180-second time
limit. EEG values and completion times were
recorded for all the mentioned tasks.
4 Results and Analysis
The collected material consisted of EEG data,
completion times, general observations and
questionnaires. The pre-questionnaire contained a consent
form, demographic questions, and questions on
susceptibility to motion sickness. The post-
questionnaire contained the simulator sickness
questionnaire (SSQ) [44] and questions about sense
of agency adapted from [45].
The EEG of the participants was recorded in
all the scenes, and the focus values were determined
using the method detailed earlier. The p-values for the
recorded mean focus values were <0.050, as presented
in Table 2. For the completion times we had less data;
therefore, a Kruskal-Wallis test was used to analyze
those results.
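For illustration, comparisons of this kind can be run with standard SciPy routines; the sketch below uses placeholder arrays rather than the actual study data, and the one-way ANOVA line mirrors the completion-time analysis mentioned in the introduction.

```python
from scipy.stats import kruskal, f_oneway

# Placeholder data: in the study these would be the per-group focus values and
# completion times for a given scene, not the numbers shown here.
control_focus, experimental_focus = [0.4, 0.6, 1.1], [2.0, 3.5, 4.1]
control_times, experimental_times = [60.2, 71.5, 80.0], [190.0, 230.5, 250.1]

h_stat, p_focus = kruskal(control_focus, experimental_focus)    # non-parametric group comparison
f_stat, p_times = f_oneway(control_times, experimental_times)   # one-way ANOVA on completion times
print(f"Kruskal-Wallis p = {p_focus:.3f}, one-way ANOVA p = {p_times:.3f}")
```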
Table 3. Statistically significant results on mean
completion times and variance, in seconds.

Setup | Control (𝑥, 𝜎) | Experimental (𝑥, 𝜎) | p-value
Playroom | (62.01, 35.68) | (223.58, 161.42) | 0.045
Timed Teleportation | (79.01, 12.38) | (69.61, 25.82) | 0.032
Whole course | (297.75, 62.12) | (496.36, 148.50) | 0.022
As with the focus values and EEG, the
completion times were collected for all of the tasks.
The only statistically significant results were found
in the completion times for the Playroom and Timed
Teleportation, as well as in the total completion time
for the whole course. The whole course consists of all
the tasks, including the Playroom scene, as presented in
Table 3.
After conducting a Kruskal-Wallis test [14] on
the SSQ results, we did not find any statistically
significant differences between the groups.
Therefore, only the total simulator sickness values
are reported here, as presented in Table 4, and not
the subcomponents from the original questionnaire.
Table 4. Results from the SSQ (total simulator sickness).

Group | Measure | Mean rank | Variance | p-value
Control | Total Simulator Sickness | 33.66 | 581.88 | 0.283
Experimental | Total Simulator Sickness | 20.83 | 190.49 | 0.283
The post-questionnaire also contained statements
mapping the participants’ sense of agency in VR.
This questionnaire was loosely based on [45], and the
analysis was conducted using a Kruskal-Wallis test
[14]. The purpose of this questionnaire on agency was
to observe differences mainly in the telekinesis use
case. However, the only close-to-significant results
were found in the Teleportation task, as presented in
Table 5. Concretely, the questionnaire given to the
users contained five specific questions: (1) “Q1 = I was
able to interact with the environment the way I wanted
to”, (2) “Q2 = The teleportation task was (1 difficult, 7
easy) to perform”, (3) “Q3 = The color of the objects
reflected my level of concentration accurately in the
teleportation room”, (4) “Q4 = Did you gain enough
feedback for your actions in the teleportation room?”,
and (5) “Q5 = I felt the time limit affected my
performance in the teleportation room”.
Table 5. Results from the post-questionnaire section
measuring sense of agency for the teleportation task,
where the results were close to being statistically
significant. (Note: Q is the abbreviation for
“Question”).

Category | Q | Control (𝑥, 𝜎) | Experimental (𝑥, 𝜎) | p-value
Teleportation | Q1 | (6.0, 0.89) | (4.85, 1.57) | 0.071
Teleportation | Q2 | (5.83, 1.32) | (4.85, 1.57) | –
Teleportation | Q3 | (5.16, 2.13) | (4.57, 0.97) | –
Teleportation | Q4 | (5.33, 1.75) | (5.42, 1.61) | –
Teleportation | Q5 | (5.33, 1.03) | (4.85, 1.86) | –
4.1 Other Observations
After answering the post-questionnaire, the
participants were told which group they had been a
part of. Some of those who belonged to the control group,
which used fake focus values, reported that they had
become suspicious, especially about the telekinesis
system. They said it had seemed difficult to drop the
objects and that they had thought it could be
because they had had a tiresome day. In the
experimental group, some participants reported that they
had discovered a way to easily release the objects by
blinking. This is a common issue with EEG devices,
which pick up interference from the movement of facial
muscles. One person in that group also said they
could increase their focus value more easily if they
concentrated on looking at the edges of an object
instead of looking straight at it.
Participants in both groups reported they found
the focus features interesting and entertaining. Most
of them thought that these types of features can
enhance the immersive experience of VR. The
participants with eyeglasses had a disadvantage in
the tasks, because it was not possible for them to
wear their glasses under the HMD and Muse 2.
They reported that it was at times difficult for them
to see the objects in the scene.
5 Discussion
In this small-scale user research study, we coupled a
commercially available EEG device, Muse 2, with
an HMD in order to explore the suitability of using
brain signals as an alternative for the more
traditional controllers in immersive VR. Our focus
value calculation is based on the standard deviation,
which is an overly simplified way of using the EEG
data and not an accurate representation of the
level of focus. This, however, was enough for our
test users to have a mainly positive experience,
which suggests that for games and entertainment
purposes real-time EEG could be a useful addition.
If the goal were to use focus as a mode of interaction,
more sophisticated solutions may be needed, as even
in this study experiment blinking caused a peak in the
EEG values, and the users learned this quite quickly.
When comparing the focus values, it seems the
participants in the experimental group succeeded
better in keeping their focus level higher. This
suggests that the experimental group needed to stay
focused, while the control group was able to pass
the tasks with the help of the “fake” focus values
i.e., by chance. In this case it is unfortunate that we
cannot also show statistically significant results
between the experimental and the control group in
the post-questionnaire measuring sense of agency.
In the Teleportation task, only agency had a p-value
of 0.071, with the experimental group faring higher.
The reason for comparing the Teleportation, with
and without a time limit, was to see if it would be
easier for the participants to focus without a time
limit. There was a statistical difference between
timed and untimed Teleportation, but not in the way
that we expected. People fared better in the time-
limited version of the test, and the reason for this
was, in hindsight, obvious. The Teleportation scene
without the time limit was run first, giving the
participants an opportunity not only to learn the route
but also to acquaint themselves better with the
teleport system. Then, when it came to running the
time-limited version of the task, the participants
were already aware of how the whole scene worked
and did not have to learn the teleport mechanics or
the route. This led to the users achieving better
completion times in the time-limited scene. The
original idea for measuring completion times was to
see whether random values, as opposed to the actual
values, would influence the sense of agency of the test
users, especially in the tasks requiring interactions.
However, we did not get significant results with
such a small sample and therefore cannot compare
results between the groups.
Since the results from the SSQ were inconclusive, we
cannot say whether there were differences between the
groups. However, we did not observe much VR
sickness among the participants to begin with. This is
possibly due to teleportation being the most user-
friendly mode of locomotion in VR [46]. It was
also expected, since in the designated tasks other
than teleportation the user did not need to move.
5.1 Limitations
There are many limitations to this study experiment.
We ran the tests in three different locations. In each
location, the rehearsed testing protocol was used to
ensure the results would be comparable to each
other. However, while running tests in several
different locations, even with mostly identical
equipment, the comparability of the results is of
concern [13]. As with the research locations, Covid-
19 also forced limitations to the sample size and the
pool of available participants. The tests simply
could not be run in public with a large pool of
volunteers due to health concerns. Instead, the
participants were people familiar to the researchers,
and the experiments were run in private. Conducting
the testing like this can have both positive and negative
effects on the results. It is positive in that the
participants might feel more relaxed in the situation,
which could make the collection of biometric data,
such as EEG, more reliable. At the same time, however,
there is a chance of acquiescence bias. The
participants were informed that their honesty was valued
more than favorable comments, but this source of
bias cannot be completely ruled out. To mitigate this
bias, the purpose of the study experiment was
revealed to and discussed with the participants only
after the study experiment.
When interpreting our results, it is important to
remember that while we speak of focus values, we
used a commercially available EEG device and a
very simple way of calculating the focus value. We
also noticed that the EEG data recorded from Muse
2 is very sensitive to muscle movement. Muscle
movement, including blinking and eye movement, had a
noticeable effect on the amplitude of the raw EEG
sent by Muse 2, and this was easily observable in
the recorded data. The method used for calculating
the focus values was also simpler than the
one used in the pilot experiment. Using the standard
deviation of averages instead of a fast Fourier
transformation was faster but in turn less accurate.
Still, there was an observable difference between the
two groups, experimental and control. It is also
notable that the false EEG was not random; the
behavior of the participants was what brought
unpredictability to the setup in the control group.
5.2 Future Work
As the pandemic clears, we hope to repeat this study
with a larger sample to verify our results and obtain
more conclusive findings. In addition, we conducted
the study experiment using a within-subjects setup to
minimize social interactions due to the pandemic.
This, of course, is not an ideal setup for a study like
this.
EEG, and the calculations needed to use it for
real-time interactions in VR, have their limitations.
In the future, we would like to continue studying the
optimal setup for a good user experience when using
lightweight EEG devices in VR, both in games and
in serious applications. These applications have
different requirements for the accuracy of interactions.
Our current setup, with lossy communication and
simple calculations, might be useful for
entertainment purposes, but other solutions are
needed for more serious applications. Muscle
movements on the forehead and around the eyes caused
disturbances to the signal and, as noted in the
results, the users were able to cause a peak in the
measurements by blinking. We can suggest the use
of the current solution in certain types of future
studies, such as studies of the intuitiveness of EEG
controls in gameplay. Despite the inaccuracy of the
data, when studying the relevant aspects of the
user/player experience it is plausible to suggest that
the EEG control mode worked as intended most of
the time and has potential for entertainment
purposes.
5.3 Ethics Statement
We follow the ethical requirements established by
the Finnish advisory board on research integrity
(TENK) [47]. The gathered material has been
handled and informed consent from the participants
was obtained in accordance with Finnish and
European laws. We also consulted and followed the
guidelines of our local ethics board [48].
5.4 Declaration of Conflict of Interests
The authors declared no special conflicts of interest
with respect to the research, authorship, and/or
publication of this article.
6 Conclusion
In this small-scale user research study, we
demonstrate the potential of EEG as a controller in
less serious applications. We were not able to show
that using EEG for locomotion would influence
cybersickness; however, there were some slight
differences in task completion times between the
control and experimental groups, suggesting that
locomotion with teleportation using EEG was
slightly faster in the experimental group, where
participants were able to influence the locomotion.
It remains unclear whether this was due to the
intuitiveness of the modality or a higher sense of
affordance, which made adaptation and learning faster,
but we suggest exploring this topic in future research.
Acknowledgements:
The authors would like to thank the following computer
science majors at the University of Oulu, Finland:
Sami Rapakko, Jouni Lammi, Aleksi Sieria, and Olli
Torronen, for their mentoring services during the
conduct of this research.
References:
[1] Panos Kostakos, Paula Alavesa, Jonas
Oppenlaender, and Simo Hosio, VR ethnography: a
pilot study on the use of virtual reality ‘go-along’
interviews in Google street view, In Proceedings of
the 18th International Conference on Mobile and
Ubiquitous Multimedia (MUM’19), Pisa, Italy, 26
November, 2019, pp. 1 – 5.
[2] Yuxuan Zhang, Hexu Liu, Shih-Chung Kang, and
Mohamed Al-Hussein, Virtual reality applications
for the built environment: Research trends and
opportunities, Automation in Construction, Vol.
118, No. 103311, 2020, pp. 1 – 19.
[3] Oscar Ariza, Gerd Bruder, Nicholas Katzakis, and
Frank Steinicke. Analysis of proximity-based
multimodal feedback for 3rd selection in immersive
virtual environments, In Proceedings of the 25th
IEEE International Conference on Virtual Reality
and 3D User Interfaces (VR),
Tuebingen/Reutlingen, Germany, 18 March, 2018,
pp. 327 – 334.
[4] Panos Kostakos, Paula Alavesa, Mikko
Korkiakoski, Mario Monteiro Marques, Victor
Lobo, and Filipe Duarte, Wire to exist: Exploring
the effects of wayfinding affordances in
underground facilities using virtual reality,
Simulation & Gaming, Vol. 52, No. 2, 2021, pp. 107
– 131.
[5] Max B. Schafer, Kent W. Stewart, Nico Losch,
Peter P. Pott, Assessment of Commercial Virtual
Reality Controller for Telemanipulation of an
Articulated Robotic Arm, In Proceedings of the 8th
IEEE RAS/EMBS International Conference for
Biomedical Robotics and Biomechatronics
(BioRob), New York, NY, USA, 29 November,
2020, pp. 860 – 865.
[6] Jessica D. Bayliss, and Dana H. Ballard, A virtual
reality testbed for brain-computer interface research,
IEEE Transactions on Rehabilitation Engineering,
Vol. 8, No. 2, 2000, pp. 188 – 190.
[7] Marc Philipp Dietrich, Gotz Winterfeldt, and
Sebastian von Mammen, Towards eeg-based eye-
tracking for interaction design in head-mounted
devices, In the Proceedings of the 7th IEEE
International Conference on Consumer Electronics-
Berlin (ICCE-Berlin), Berlin, Germany, 03
September, 2017, pp. 227 – 232.
[8] Samantha N. Stahlke, Josh D. Bellyk, Owen R.
Meier, Pejman Mirza-Babaei, and Bill Kapralos,
Frontiers of immersive gaming technology: A
survey of novel game interaction design and serious
games for cognition, Recent Advances in
Technologies for Inclusive Well-Being, Springer,
2021, pp. 523 – 536.
[9] Jan-Philipp Tauscher, Fabian Wolf Schottky, Steve
Grogorick, Paul Maximilian Bittner, Maryam
Mustafa, and Marcus Magnor, Immersive eeg:
evaluating electroencephalography in virtual reality,
In Proceedings of the 26th IEEE International
Conference on Virtual Reality and 3D User
Interfaces (VR), Osaka, Japan, 23 March, 2019, pp.
1794 – 1800.
[10] Ioulietta Lazarou, Spiros Nikolopoulos, Panagiotis
C. Petrantonakis, Ioannis Kompatsiaris, and Madga
Tsolaki, Eeg-based brain-computer interfaces for
communication and rehabilitation of people with
motor impairment: a novel approach of the 21st
century, Frontiers in human neuroscience, Vol. 12,
No. 14, 2018, pp. 1 – 18.
[11] Baek Hwan Cho, Jong-Min Lee, J. H. Ku, Dong Pyo
Jang, J. S. Kim, In-Young Kim, Jang-Han Lee, and
Sun I. Kim, Attention enhancement system using
virtual reality and eeg biofeedback, In Proceedings
of IEEE International Conference on Virtual Reality
2002, Orlando, FL, USA, 24 March, 2002, pp. 156 – 163.
[12] Stephan Hertweck, Desiee Weber, Hisham Alwanni,
Fabian Unruh, Martin Fischbach, Marc Erich
Latoschik, and Tonio Ball, Brain activity in virtual
reality: assessing signal quality of high-resolution
eeg while using head-mounted displays, In
Proceedings of the 26th IEEE International
Conference on Virtual Reality and 3D User
Interfaces (VR), Osaka, Japan, 23 March, 2019, pp.
970 – 971.
[13] Anthony Steed, Francisco R. Ortega, Adam S.
Williams, Ernst Kruijff, Wolfgang Stuerzlinger,
Anil Ufuk Batmaz, Andrea Stevenson Won, Evan
Suma Rosenberg, Adalberto L. Simeone, and
Aleshia Hayes, Evaluating immersive experiences
during covid-19 and beyond, Interactions, Vol. 27,
No. 4, 2020, pp. 62 – 67.
[14] Patrick E. McKight, and Julius Najab, Kruskal-
wallis test, The Corsini Encyclopedia of Psychology,
Vol. 1, 2010, pp. 1 – 10.
[15] Mathew Turk, Multimodal interaction: A review,
Pattern Recognition Letters, Vol. 36, 2014, pp. 189
– 195.
[16] Microsoft: Introducing instinctual interactions,
Available online: https://learn.microsoft.com/en-
us/windows/mixed-reality/design/interaction-
fundamentals (accessed on 17 October 2022).
[17] Joseph LaViola, Msvt: A virtual reality-based
multimodal scientific visualization tool. In
Proceedings of the 3rd IASTED International
Conference on Computer Graphics and Imaging
(CGIM), Las Vegas, NV, USA, 20 November,
2000, pp. 1 – 7.
[18] Robert Leeb, Marcel Lancelle, Vera Kaiser, Dieter
W. Fellner, and Gert Pfurtscheller, Thinking
penguin: multimodal brain-computer interface
control of a vr game. IEEE Transactions on
Computational Intelligence and AI in Games, Vol.
5, No. 2, 2013, pp. 117 – 128.
[19] Arindam Dey, Alex Chatburn, and Mark
Billinghurst, Exploration of an eeg-based
cognitively adaptive training system in virtual
reality, In Proceedings of the 26th IEEE International
Conference on Virtual Reality and 3D User
Interfaces (VR), Osaka, Japan, 23 March, 2019, pp.
220 – 226.
[20] Richard H. Y. So, and Hiroyasu Ujike, Visually
induced motion sickness, visual stress and
photosensitive epileptic seizures: what do they have
in common? Applied Ergonomics, Vol. 41, No. 4,
2010, pp. 491 – 493.
[21] Jaeseok Heo, and Gilwon Yoon, Eeg studies on
physical discomforts induced by virtual reality
gaming. Journal of Electrical Engineering and
Technology, Vol. 15, No. 3, 2020, pp. 1323 – 1329.
[22] Chae-Won Lee, Min-Kook Choi, Kyu-Sung Kim,
and Sang-Chul Lee, Analysis of causal factors and
physical reactions according to visually induced
motion sickness, Journal of HCI Society of Korea,
Vol. 9, No. 1, 2014, pp. 11 – 21.
[23] Peter Gerjets, Carina Walter, Wolfgang Rosenstiel,
Martin Bogdan, and Thorsten O. Zander, Cognitive
state monitoring and the design of adaptive
instruction in digital environments: lessons learned
from the cognitive workload assessment using a
passive brain-computer interface approach,
Frontiers in neuroscience, Vol. 8, No. 385, 2014,
pp. 1 – 21.
[24] Guido Makransky, Thomas S. Terkildsen, and
Richard E. Mayer, Adding immersive virtual reality
to a science lab simulation causes more presence but
less learning, Learning and Instruction, Vol. 60,
2019, pp. 225 – 236.
[25] Lisa Rebenitsch, and Charles Owen, Review on
cybersickness in applications and visual displays,
Virtual Reality, Vol. 20, No. 2, 2016, pp. 101 – 125.
[26] Chris G. Christou, and Poppy Aristidou, Steering
versus teleport locomotion for head mounted
displays, In Proceedings of the 4th International
Conference on Augmented Reality, Virtual Reality,
and Computer Graphics (AVR), Porto, Portugal, 21
June, 2017, pp. 431 – 446.
[27] Jeremy Clifton, and Stephen Palmisano, Effects of
steering locomotion and teleporting on
cybersickness and presence in hmd-based virtual
reality, Virtual Reality, Vol. 24, No. 3, 2020, pp.
453 – 468.
[28] Xinyu Tan, Yi Li, and Yuan Gao, Combining brain-
computer interface with virtual reality: Review and
prospect, In Proceedings of the 3rd IEEE
International Conference on Computer and
Communications (ICCC), Chengdu, China, 13
December, 2017, pp. 514 – 518.
[29] Judith Amores, Xavier Benavides, and Pattie Maes,
PsychicVR: Increasing mindfulness by using virtual
reality and brain computer interfaces, In
Proceedings of the 2016 CHI International
Conference Extended Abstracts on Human Factors
in Computing Systems (CHI EA’16), San Jose,
California, USA, 07 May, 2016, pp. 2 2.
[30] Jiahui Xu, and Baichang Zhong, Review on portable
eeg technology in educational research, Computers
in Human Behavior, Vol. 81, 2018, pp. 340 – 349.
[31] Nikolas S. Williams, Genevieve M. McArthur,
Bianca de Wit, George Ibrahim, and Nicholas A.
Badcock, A validation of emotiv epoc flex saline for
eeg and erp research, PeerJ, Vol. 8, No. 9713, 2020,
pp. 1 – 32.
[32] Muse 2: Brain sensing headband technology
enhanced mediation, Available online:
https://choosemuse.com/muse-2/ (accessed on 21
November 2022).
[33] Olave E. Krigolson, Chad C. Williams, Angela
Norton, Cameron D. Hassall, and Francisco L.
Colino, Choosing muse: Validation of a low-cost,
portable eeg system for erp research. Frontiers in
neuroscience, Vol. 11, No. 109, 2017, pp. 1 – 10.
[34] Konstantinos Tsiakas, Maher Abujelala, and Fillia
Makedon, Task engagement as personalization
feedback for socially-assistive robots and cognitive
training, Technologies, Vol. 6, No. 2, 2018, pp. 1 – 17.
[35] Zhen Li, Jianjun Xu, and Tingshao Zhu, Prediction
of brain states of concentration and relaxation in real
time with portable electroencephalographs, arXiv
preprint, 2015, arXiv:1509.07642.
[36] Maher Abujelala, Cheryl Abellanoza, Aayush
Sharma, and Fillia Makedon, Brain-ee: Brain
enjoyment evaluation using commercial eeg
headband. In Proceedings of the 9th ACM
International Conference on Pervasive Technologies
Related to Assistive Environments (PETRA’16),
Corfu Island, Greece, 29 June, 2016, pp. 1 – 5.
[37] Thrasyvoulos Karydis, Filipe Aguiar, Simmie L.
Foster, and Andreas Mershin, Performance
characterization of self-calibrating protocols for
wearable eeg applications. In Proceedings of the 8th
ACM International Conference on Pervasive
Technologies Related to Assistive Environments
(PETRA’15), Corfu Island, Greece, 01 July, 2015,
pp. 1 – 7.
[38] Xi Liu, Pang-Ning Tan, Lei Liu, and Steven J.
Simske, Automated classification of eeg signals for
predicting students’ cognitive state during learning,
In Proceedings of the ACM International
Conference on Web Intelligence (WI’17), Leipzig,
Germany, 23 August, 2017, pp. 442 – 450.
[39] Seamas Weech, Sophie Kenny, and Michael
Barnett-Cowan, Presence and cybersickness in
virtual reality are negatively related: A review,
Frontiers in psychology, Vol. 10, No. 158, 2019, pp.
1 – 19.
[40] Jose del R. Millan, Rudiger Rupp, Gernot Mueller-
Putz, Roderick Murray-Smith, Claudio Giugliemma,
Michael Tangermann, Carmen Vidaurre, Febo
Cincotti, Andrea Kubler, Robert Leeb, Christa
Neuper, Klaus R. Mueller, and Donatella Mattia,
Combining brain-computer interfaces and assistive
technologies: state-of-the-art and challenges,
Frontiers in neuroscience, Vol. 4, No. 161, 2010, pp.
1 – 15.
[41] Polar H10: Heart rate sensor, Available online:
https://www.polar.com/blog/new-polar-h10-heart-
rate-sensor-2017/ (accessed on 05 December 2022).
[42] Jason Kowaleski and Stephen Wicklund, BlueMuse:
https://github.com/kowalej/BlueMuse (accessed on
12 June 2021).
[43] Thomasz Kucinski, Brain products:
https://www.brainproducts.com/ (accessed on 30
August 2021).
[44] Robert S. Kennedy, Norman E. Lane, Kevin S.
Berbaum, and Michael G. Lilienthal, Simulator
sickness questionnaire: An enhanced method for
quantifying simulator sickness. The International
Journal of Aviation Psychology, Vol. 3, No. 3, 1993,
pp. 203 – 220.
[45] Ferran Argelaguet, Ludovic Hoyet, Michael Trico,
and Anatole Lecuyer, The role of interaction in
virtual embodiment: Effects of the virtual hand
representation, In the Proceedings of the 2016 IEEE
International Conference on Virtual Reality (VR),
Greenville, SC, USA, 19 March, 2016, pp. 3 – 10.
[46] Jesus Mayor, Laura Raya, and Alberto Sanchez, A
comparative study of virtual reality methods of
interaction and locomotion based on presence,
cybersickness and usability, IEEE Transactions on
Emerging Topics in Computing, Vol. 9, No. 3, 2021,
pp. 1542 – 1553.
[47] The Finnish National Board on Research Integrity
TENK, Available online: https://tenk.fi/en (accessed
on 27 January 2023).
[48] The Ethics Committee of Human Sciences of
University of Oulu, Available online:
https://www.oulu.fi/en/university/faculties-and-
units/eudaimonia-institute/ethics-committee-human-
sciences (accessed on 14 February 2023).
Contribution of Individual Authors to the
Creation of a Scientific Article (Ghostwriting
Policy)
Panos Kostakos was responsible for
conceptualizing and developing the methodology.
Oskari Rajala, Mikael Sarkiniemi, Markus
Hirsimaki, and Jere Kinnunen developed the
software and the pilot experiments. Mikko Korkiakoski
contributed to the original draft preparation and
formal analysis. Theodoros Anagnostopoulos, Panos
Kostakos, and Paula Alavesa were involved in the
writing, reviewing, and editing of the manuscript.
Finally, all authors have read and agreed to the
published version of the manuscript.
Sources of Funding for Research Presented in a
Scientific Article or Scientific Article Itself
This work has been partially funded by the
European Commission grants IDUNN (101021911),
PRINCE (815362), and the Academy of Finland
6Genesis Flagship (318927).
Conflict of Interest
The authors have no conflict of interest to declare
that is relevant to the content of this article.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the
Creative Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en_US