Perception of Students on Online Exams and How Sequential Exams
and the Lockdown Browser Affect Student Anxiety and Performance
NURSEL SELVER RUZGAR, CLARE CHUA-CHOW
Ted Rogers School of Management,
Toronto Metropolitan University,
350 Victoria Street, Toronto, ON M5B 2K3,
CANADA
Abstract: - Online education has become increasingly popular over the past few years, especially with the
global pandemic forcing students to learn remotely. Although online education offers various benefits,
including flexibility, accessibility, and convenience, it presents unique challenges, including the use of
Lockdown Browser for sequential online exams that can increase students’ anxiety levels and decrease their
performance. In this paper, an empirical study was undertaken to examine students' preferences for online exams and how proctored online exams impact students' anxiety and performance, taking into consideration factors such as gender, class standing, and the availability of a personal study space. The findings reveal that sequential exams, errors in questions, use of the Lockdown Browser, writing exams in a different time zone, and presenting one question per page increase students' stress and anxiety. The results also suggest that there was a significant difference in anxiety levels between students who received different letter grades; specifically, students who received lower grades reported higher levels of anxiety. However, gender and the delivery mode of the course did not appear to have a significant impact on anxiety levels.
Key-Words: - Online education, students' stress and anxiety, online exam challenges, Lockdown Browser, students' performance, gender, class standing, having own room, Chi-square test
Received: June 16, 2022. Revised: April 25, 2023. Accepted: May 21, 2023. Published: July 5, 2023.
1 Introduction
Online education and online exams have gained
significant importance in recent years due to
advancements in technology and the global shift
towards digitalization, especially with the global
pandemic in 2020 forcing students to learn
remotely. Online education refers to the delivery of
education through digital platforms, allowing
learners to access educational resources from
anywhere and at any time. Online exams, on the
other hand, refer to the administration of exams
through digital means, eliminating the need for
physical attendance in exam centers. The shift to
online education and online exams has led to
concerns about student stress and anxiety related to
this new mode of learning.
Online education was an alternative to traditional education before the pandemic; however, the start of the pandemic in 2020 forced the migration of in-person classes to online classes, [1], making online education the only option for the education system. This sudden transition brought unique
challenges for both students and instructors.
Instructors used different delivery methods to teach their courses. Some posted pre-recorded lectures to learning management systems such as D2L Brightspace and Moodle, while others posted slides and held live lectures online using Zoom, Google Meet, or other online technologies. The
most important challenge that occurred for online assessments was how to maintain academic integrity, [2]. It was emphasized that to ensure academic integrity in online exams, universities have adopted a variety of measures, including assessment models and question types that make academic dishonesty less rewarding, the use of normative appeals and honor codes, machine learning approaches to detect dishonest behaviour after it has occurred, and direct online surveillance ('proctoring'), [3]. It was also stated that in a virtual educational setting,
there exists a wide range of methods to engage in
dishonest behaviour, such as manipulating data,
plagiarizing content, or having someone else take
exams on behalf of the student, [4], [5].
Additionally, individuals may resort to hiring a
person or a company to act as the student
throughout the educational period, as well as
copying assignments from both public and private
websites, [4], [6], [7]. Another form of cheating
involves sharing questions and answers through
unauthorized social media channels, such as
WhatsApp, Facebook, and so on, [8]. These
tendencies emphasize the importance of upholding
academic integrity by implementing proctored
online assessments, [4], [8].
Proctoring typically involves scanning the room,
verifying student ID cards, constant webcam
monitoring, and the requirement of a private room,
[3]. While proctoring tools such as lockdown browsers are designed to prevent cheating during exams and uphold academic integrity, they can also create additional stress and anxiety for students. According to reports, proctoring increases students' stress and anxiety levels and decreases their performance. It is believed that the
discomfort caused by proctoring may potentially
intensify feelings of anxiety, [3], [9], leading to a
detrimental impact on students' exam outcomes.
This study aims to explore students' perceptions of online exams and the impact of proctoring exams with the Lockdown Browser on students' online exam stress and anxiety, as well as on their performance. For this purpose, 313 engineering and business students participated in a volunteer-based survey during the 2020 and 2021 Fall and Winter academic terms. A 25-item scale was administered to the 313 students; reliability analysis yielded a Cronbach's alpha of 0.721. The data were analyzed with Chi-square tests using SPSS and Excel.
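As an aside for the reader (not part of the original analysis), the reported reliability coefficient can be reproduced with a short Python sketch; the response matrix below is simulated and merely stands in for the actual 313 x 25 survey data:

import numpy as np

def cronbach_alpha(items):
    # Cronbach's alpha for an (n_respondents x n_items) matrix of Likert scores.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of scale items (25 here)
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated stand-in for the survey data: 313 students, 25 items coded 1-5.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(313, 25))
print(round(cronbach_alpha(responses), 3))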
This paper is organized as follows. In section 2,
the literature review that this study aims to address
is discussed. In section 3, the aim and methodology
are described. In section 4 research findings are
presented and discussed. Finally, in section 5, the
conclusion and recommendations for future research
will be provided.
2 Literature Review
In the last few decades, with the growth of technology, online education has become increasingly popular, providing students with flexibility and accessibility for learning. After the COVID-19 pandemic started in March 2020, online education was no longer a trend; it became mainstream, [10], due to the lockdowns all around the world.
The COVID-19 pandemic changed many aspects of education and has compelled universities to re-evaluate their approaches to both assessment and teaching. Many institutions are now
contemplating the implementation of remote exam
delivery as a way to safeguard against potential
future disruptions, thereby ensuring their
preparedness for unforeseen circumstances, [11].
Instructors have made use of diverse
technological tools, encompassing a wide range of
options such as live chats, threaded discussions,
forms, PowerPoint presentations, email, videos,
software applications, spreadsheets, word
processors, online portals, electronic
portfolios/projects, and online exams, [12]. These
tools serve as valuable resources to facilitate
effective communication and enhance the learning
experience for students, [12]. As the transition to
remote learning took place, online examinations
gained significant popularity as a preferred
assessment method among instructors. However, a considerable number of instructors lacked prior experience in conducting online assessments and faced challenges such as the new online test environment, technical issues, academic integrity, and anxiety, [1]. To
facilitate the process, a pool of questions is created
within the online platform. These questions are then
presented individually, one per page, in a sequential
manner, [13]. Students are typically prohibited
from revisiting previously answered questions, and
to further enhance fairness, the order of questions
and their corresponding answers are shuffled. Since
cheating is a common phenomenon among
students, it is crucial to preserve the trust, honesty,
and integrity of online assessments, [14]. This necessitates maintaining academic integrity through proctored online assessments, [15]. When students
are allowed to take non-proctored online tests,
concerns arise regarding the potential for cheating.
This is because students may have access to
materials that are prohibited during the test or may
collaborate with others to complete the test
collectively. As a consequence, the resulting grades
may not accurately reflect the actual performance
of individual students, [16], [17].
Lockdown Browser is one of the proctoring
techniques. Proctoring with Lockdown Browser
involves several measures such as conducting a
room scan, scanning the student's ID card, ensuring
continuous monitoring through a webcam, and
requiring the availability of a private room. It has
been observed that this practice reinforces the well-
established inverse connection between students'
test anxiety and their performance in exams. This is
likely because proctoring introduces a sense of
unease and discomfort that can potentially trigger
anxiety, [4], [9]. Today, almost all students
experience anxiety and fear before or during exams,
[18]. Exam anxiety refers to a blend of physical
symptoms and emotional responses that hinder one's
capacity to perform effectively during examinations,
[11]. Several factors contribute to the onset of exam
anxiety, encompassing various elements such as the
length of the examination, the number of questions,
the specific testing methodology employed, the
instructions provided for the test, the testing
environment, and proctoring, [18], [19]. In the literature, many scholars have studied the effects of proctored exam anxiety, [1], [3], [4], [18], [20], [21], [23], [24], and the relation between exam anxiety and academic performance, [2], [9], [15], [16], [25], [26], [27], [28]. Other scholars have studied how to reduce test anxiety, [11], [13], [28], [30], [31]. Multiple studies have already presented
qualitative evidence indicating heightened levels of
anxiety in exams that involve proctoring, [3], [21],
[26]. The majority of studies have identified an
inverse correlation between exam anxiety and
academic achievement, [15], [16], [27]. Specifically,
individuals with high levels of trait test anxiety tend
to attain lower scores in exams, particularly when
taking assessments in an online proctored
environment.
3 Aim and Methodology
The rapid advancements in technology and the
widespread availability of the Internet have
revolutionized various aspects of life, including the
education system. These technological
developments have paved the way for new
educational approaches, blending with traditional
methods. Examples of such integration include
distance education, web-based (virtual or remote or
online) education, and hybrid education. These
emerging styles of education have brought about
significant changes and opportunities in the field of
learning. After the COVID-19 pandemic and lockdowns started, most educational institutions moved to online education. The sudden transition brought many challenges to instructors and students, especially with online assessments. The use of proctoring, adopted to maintain academic integrity and reduce cheating on online assessments, has increased students' stress and anxiety in addition to the mental health issues they already faced due to the pandemic. Students'
academic performance is the most crucial factor not
only impacting their learning of the materials but
also their future careers. The objective of this
research is to investigate the benefits and obstacles
associated with online education while examining
the impact of online learning on students' levels of
anxiety. Additionally, it will analyse the influence
of anxiety on students' academic performance and
overall success. Furthermore, the study will explore
potential approaches that educational institutions
and instructors can adopt to mitigate students'
anxiety specifically pertaining to online exams. By
undertaking this investigation, the research aims to
gain insights into the complex relationship between
online education, anxiety levels, and effective
strategies for alleviating student apprehensions in
the online learning environment.
This research endeavors to investigate and
provide insights into the responses to the following
inquiries:
What are students' perceptions of online education?
What are the challenges of online exams?
How do online exams affect students' stress and anxiety?
How does the Lockdown Browser affect students' anxiety?
Is there any positive or negative relationship between students' anxiety and performance?
Are there differences in students' anxiety and performance by gender, class standing, and having one's own room?
This research constitutes a component of prior
studies, [32], [33] that have investigated the
perceptions of students on online education during
the Covid pandemic. In this study, data were
gathered through two distinct methods. The first
method involved an online survey that collected
demographic information from students, along with
their perceptions of online learning, online exams,
and online exam challenges. The survey utilized a 5-point Likert scale, ranging from strongly disagree to
strongly agree. The second method involved
utilizing the Learning Management System (LMS)
to collect data on students' performance. A dataset
comprising information from 2,456 students was
obtained from the LMS, aiming to explore potential
correlations between performance and class standing. Additionally, a second dataset was
collected from 313 students who participated in the
online survey. This dataset specifically examined
students' perceptions of their performance in relation
to factors such as gender, class standing, and
whether they had their own room. It is important to
note that student participation in the survey was
voluntary, and their grades and class standings were
collected anonymously with their consent, [32],
[33].
The data collection process took place during
the Fall and Winter semesters of 2020 and 2021,
respectively, in mathematics and statistics courses at
two universities in Ontario, Canada. All the lectures
were presented in an online format. The lectures
were recorded and made available on the D2L
platform, which is the Learning Management
System (LMS) used by the two schools. In addition
to the recorded lectures, lecture slides were also
uploaded to D2L. The assessment for the course
consisted of two online tests, 10 assignments based
on a weekly schedule on the Mylab platform, a
group project, and an online final exam. Before each
exam, a practice test was posted on D2L to
encourage students and prepare them for the
upcoming assessment. To ensure a fair assessment
and minimize cheating, the two tests and the final
exam were supervised using the "Respondus
Lockdown Browser and Monitor" system. It was
communicated in advance that students were
required to take the test on a reliable computer
equipped with a webcam and microphone in a quiet
environment. Any noise or presence of another
individual in the room would trigger a flag in the
system, [32], [33]. Out of the total of 2,456 students,
313 students participated in the survey, comprising
167 (53.4%) male and 146 (46.6%) female students.
Among the participants, 167 (53.4%) were first-year
students, 93 (29.7%) were second-year students, 36
(11.5%) were third-year students, and 17 (5.4%)
were fourth-year students, [32], [33]. Moreover, 259
(82.7%) of the participants had their own room,
while 54 (17.3%) did not.
In terms of grades, 95 (30.4%) of the students
passed the course with a grade of A, 67 (21.4%)
with a grade of B, 78 (24.9%) with a grade of C, 51
(16.3%) with grade D, and 22 (7%) failed the course
with a grade F. Among the total 2,456 students,
1,242 (61.0%) were first-year students, 545 (26.8%)
were second-year students, 147 (7.2%) were third-
year students, and 102 (5.0%) were fourth-year
students. Similarly, out of the 2,456 students, 520
(21.2%) passed with grade A, 635 (25.9%) with
grade B, 655 (26.7%) with grade C, 379 (15.4%)
with grade D, and 267 (10.8%) failed the course,
[2], [37]. To simplify the grading system, the
categories A+, A, and A- were represented as A (80-
100), B+, B, and B- as B (70-79), C+, C, and C- as
C (60-69), D+, D, and D- as D (50-59), and failing
grades as F (0-49), [32], [33]. The evaluation of the
course included two midterm tests, a final exam, 10
weekly assignments, and an online project. The
collected data were analyzed using SPSS and Excel
software.
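For illustration only, the grade simplification described above can be written as a small Python helper; the function names are ours and not part of the study:

def simplify_grade(letter):
    # Collapse +/- modifiers: A+, A, A- -> A; B+, B, B- -> B; etc.; F stays F.
    return letter[0]

def band_from_percent(score):
    # Map a percentage score to the bands used in the analysis.
    if score >= 80:
        return "A"
    if score >= 70:
        return "B"
    if score >= 60:
        return "C"
    if score >= 50:
        return "D"
    return "F"  # 0-49

print(simplify_grade("B+"), band_from_percent(64.5))  # -> B C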
The study's goals were accomplished
using descriptive statistics, a Likert scale with
five response options, as well as Pearson's chi-
square test.
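As a sketch of the kind of test applied throughout Section 4 (the counts below are hypothetical, not taken from the study's tables), Pearson's chi-square test of independence on a gender-by-response contingency table can be run in Python as follows:

from scipy.stats import chi2_contingency

# Hypothetical 2 x 3 table: rows = gender (male, female),
# columns = collapsed responses (SD+D, N, SA+A) for one survey item.
observed = [[39, 32, 96],   # male
            [33, 47, 66]]   # female
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# At the 5% significance level, independence is rejected when p < 0.05.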
4 Findings and Discussion
This section presents the findings derived from the
analysed data, organized into four subsections:
online learning, online exams, anxiety, and the use
of the Lockdown Browser. Within each subsection,
the items will be discussed based on various factors,
including gender differences, class standings,
ownership of a personal room, and grades. The aim
is to explore and highlight the observations and
patterns that emerged from the data analysis,
considering these different aspects and their
potential influences on the respective topics.
The survey employed a 5-point Likert scale that included
response options from strongly disagree (SD) to
strongly agree (SA). However, for the sake of
simplicity, the negative responses (strongly disagree
and disagree) were combined to represent negative
perceptions, while the positive responses (strongly
agree and agree) were combined to represent
positive perceptions. This grouping allows for easier
interpretation and analysis of the data by
categorizing the responses into broader positive and
negative categories.
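A minimal sketch of this recoding step, assuming hypothetical responses for one item (the item name and counts are illustrative, not the study's data):

import pandas as pd

# Hypothetical raw responses for one item, coded 1..5 (SD, D, N, A, SA).
raw = pd.Series([1, 2, 5, 4, 3, 2, 5, 4, 4, 1], name="Anx20")

collapse = {1: "SD+D", 2: "SD+D", 3: "N", 4: "SA+A", 5: "SA+A"}
grouped = raw.map(collapse)

# Percentage breakdown per collapsed category, as reported in the tables.
print(grouped.value_counts(normalize=True).mul(100).round(1))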
4.1 Online Education
To investigate students' perceptions of online
learning, four specific items were included in the
study: Anx6, Anx7, Anx20, and Anx22. Anx6, which
explores the statement "It is very difficult to contact
group members for online group projects," aims to
assess the challenges students face when
communicating with their peers in online
collaborative projects. Anx7, on the other hand, asks
students to indicate their preference for working on
online projects individually, providing insight into
their personal work style and inclination towards
independent work. Anx20 focuses on the statement
"Statistics and mathematics are very hard to learn in
an online class," aiming to uncover students'
attitudes and tendencies regarding online courses in
these specific subjects. This item seeks to
understand whether students perceive statistics and
mathematics as more challenging when taught
online compared to traditional classroom settings.
Lastly, Anx22 delves into the statement "I spend
more time learning the materials in an online
course," aiming to gather information on the amount
of additional time students invest in self-directed
learning for online courses. This item seeks to
explore the time commitment and effort students
allocate to effectively grasp the course materials in
an online learning environment.
Table 1 depicts data on gender differences and
responses to statements related to online group
projects, preference for individual work, the
difficulty of learning statistics and mathematics
online, and the time spent on learning materials in
an online course. At a 5% significance level, there is
a statistically significant association between gender
and the perceived difficulty of learning statistics and
mathematics in an online class. The responses show
that slightly more males (23.4%) than females (22.6%) selected SD+D, i.e., disagreed that these subjects are hard to learn online. On the other hand,
more females (32.2%) than males (19.2%)
responded with N (neutral). This indicates that
females might have a less definite opinion or may
feel less strongly about the difficulty of learning
these subjects online compared to males. However,
there is no significant association between gender
and the difficulty of contacting group members for
online group projects or the preference for
individual work in online projects. Although not statistically significant, the responses suggest that males may perceive more difficulties in contacting group members compared to females.
There is also no significant association between
gender and the amount of time spent on learning
materials in an online course. In terms of spending
more time learning materials in an online course, a
higher percentage of males (53.3%) than females
(45.2%) responded with SA+A, indicating that
males tend to invest more time in studying the
course materials in an online setting. This finding
relates to the existing literature in two ways. Firstly, the literature reports that female students tend to spend more time online compared to their male counterparts.
Secondly, it highlights that students, both male and
female, who spend less than an average of 15 hours
online are less likely to achieve success, [35].
Table 1. Perceptions of students on online learning
versus gender
When comparing the items with having own
room, it is found that at a 5% significance level,
there is no statistically significant association
between owning a separate room and the difficulty
of contacting group members for online group
projects, the preference for individual work in
online projects, the difficulty of learning statistics
and mathematics in an online class, or the amount of
time spent on learning materials in an online course
(Table 2). However, a higher percentage of
individuals with their own room (59.4%) compared
to those without their own room (51.9%) responded
with SA+A, suggesting that individuals with their
room might allocate more time to learning materials
in an online course. Regarding the perception of
difficulty in learning statistics and mathematics in
an online class, 27.8% of individuals without their
own room and 22.0% of those with their own room
responded with SD+D. This suggests that
individuals without their own room may find these
subjects slightly more challenging to learn online.
Based on the study's results, it was discovered that
having a personal room had a beneficial effect on
academic performance, [33]. The failure rate among
students without their own room was approximately
three times higher compared to students who had
their own room, [33]. This contributes to the
findings in this work.
Table 2. Perception of students on online learning
versus having their own room
Table 3 shows the perception of students on
online learning and class standing. When the class
standing was examined with the items, at a 5%
significance level, there is no statistically significant
association between the class standing and the
difficulty of contacting group members for online
group projects, preference for individual work in
online projects, and the time spent on learning
materials in an online course (Table 3). Based on
the findings of an online survey conducted
among 307 students, it was revealed that 51%
of the respondents expressed that they dedicate
more time to their homework in comparison to
traditional classroom settings, [36]. Regarding
the difficulty of learning statistics and mathematics
in an online class, there is a borderline association because the two-sided asymptotic significance is 0.091. Notably, the percentage of students
responding with SD+D decreases as the class
standing progresses, while the percentage of
students responding with SA+A increases. This
indicates that students who are in higher class
standings may perceive statistics and mathematics
as less challenging to learn in an online setting, in
contrast to students in lower class standings. This
observation can be attributed to the fact that higher-
class students have taken more online courses, with
a significant portion of them being course retakes. It
is also found that students in higher class standings
may have slightly less difficulty in contacting group
members for online projects and as students advance
in their class standings, they are more likely to
prefer individual work for online projects.
Table 3. Perceptions of students on online learning
versus class standing
Table 4 presents students' perceptions of online learning in relation to their grades. When the
grade distribution is based on the four online
learning items at a 5% significance level, there is no
statistically significant association between the
grade and the difficulty of contacting group
members for online group projects, preference for
individual work in online projects, and the time
spent on learning materials in an online course
(Table 4). Findings suggest that students with higher
grades, particularly those with grade A, tend to find
it slightly less challenging to contact group
members for online projects. However, there is a
statistically significant association between the
grade and the perceived difficulty of learning
statistics and mathematics in an online class.
Regarding the perception of the difficulty of
learning statistics and mathematics in an online
class, differences were found among the grades. As
grades improve, there is a noticeable decrease in the
percentage of students responding with SD+D,
while the percentage of students responding with
SA+A shows an upward trend. This observation
suggests that students with higher grades tend to
perceive statistics and mathematics as less
challenging to learn in an online setting when
compared to students with lower grades. Several
factors may contribute to this phenomenon,
including test mode, exam anxiety, the course
delivery method, as well as the students'
backgrounds in mathematics and statistics, [37].
Table 4. Perceptions of students on online learning
versus performance
4.2 Online Exams
To investigate students' perspectives on online
exams, a survey was conducted consisting of eleven
items. The items included Anx5, Anx8, Anx9,
Anx10, Anx12, Anx13, Anx14, Anx16, Anx17,
Anx18, and Anx19. Each item focused on a specific
aspect of online exams. For instance, Anx5 explored
students' preferences between in-class tests and
online tests. The coding definitions for these items
are as follows: Anx8: "I feel unable to demonstrate
my knowledge when taking an online test." Anx9: "I
find online test questions to be easier compared to
in-class test questions." Anx10: "I prefer taking tests
on the Mylab platform rather than on D2L." Anx12:
"I prefer multiple-choice questions in online tests."
Anx13: "I dislike multiple-choice questions with
multiple parts in online tests because I cannot earn
partial credit or marks." Anx14: "I appreciate
written answer questions in online tests because I
have the opportunity to earn partial credit or marks."
Anx16: "Online tests present more challenges
compared to in-class tests." Anx17: "Online tests
should allow for more time to be given." Anx18:
"Due to the lack of a private room, it is difficult for
me to concentrate when taking online tests." Anx19:
"Cheating is effortless in online tests as I can easily
share questions with my social media group and
obtain answers." These specific items were designed
to gather insights into students' perceptions and
experiences related to various aspects of online
exams.
Table 5. Perceptions of students on online exams
versus gender
Table 5 presents the results of a survey
examining gender differences in students'
perceptions on online exams. The survey findings
reveal the following insights regarding gender
differences in students' perceptions of online exams
(Table 5). The survey included various statements
related to preferences, difficulties, and challenges
associated with online tests. At a 5% significance
level, the data suggests that there are no statistically
significant gender differences in most of the
responses. For the statement "I prefer in-class tests to online tests," there was a slightly higher proportion of females (49.3%) than males (38.3%) who preferred in-class tests, but the difference was not significant (p = 0.146). Similarly, for other
statements such as "I cannot show my knowledge
when writing an online test" and "The questions are
easier for online tests than for in-class tests," there
were no significant gender differences observed.
However, there were some notable trends. Females
had a slightly higher proportion (38.4%) who liked
written answer questions for online tests compared
to males (26.9%), and this difference approached
statistical significance (p = 0.063). Additionally,
more females (76.0%) perceived cheating as easy in
online tests compared to males (67.7%), although
this difference was not statistically significant (p =
0.258). This contributes to the literature finding that
a definitive pattern of gender differences in statistics
courses has not yet been identified, and it may vary
depending on the criteria used to measure course
success, such as exam scores versus overall course
performance, [38], [39].
When the influence of having one's own room
on students' perceptions of online exams is
examined at a 5% significance level, the data
suggests that having one's own room does not
significantly affect most of the responses. The
majority of the statements, such as "I prefer in-class
tests to online tests" and "The questions are easier
for online tests than for in-class tests," did not
exhibit statistically significant differences between
participants with and without their own room.
However, there were a few notable findings. Participants with their own room had a higher proportion (10.4%) who preferred multiple-choice questions for online tests compared to those without their own room (0%), and this difference was statistically significant (p = 0.037). Additionally, participants without their own room found it significantly more difficult to concentrate when writing tests compared to those with their own room (p = 0.000). Furthermore, there was a tendency for participants without their own room to agree more
strongly (73.4%) that more time should be given for online tests compared to those with their own room (64.8%), although this difference approached but did not reach statistical significance (p = 0.059).
When the students' perceptions of in-class
exams versus online exams across different
academic years are tested at a 5% level of
significance, the data indicates several noteworthy
findings. Firstly, there were significant differences
observed in the statement "The questions are easier
for online tests than for in-class exams" among the
different academic years (p = 0.023). Specifically,
students in the 1st year (38.9%) were less likely to agree with this statement compared to those in the 2nd year (55.9%), 3rd year (58.3%), and 4th year (52.9%). Secondly, regarding the preference for
multiple-choice questions in online exams, a
significant difference was found across academic
years (p = 0.029). Students in the 1st year (83.2%)
expressed a higher preference for multiple-choice
questions compared to those in the 2nd year
(78.5%), 3rd year (69.4%), and 4th year (52.9%).
Additionally, the perception of cheating being easy
in online exams showed a significant difference
across academic years (p = 0.028). Students in the 3rd year (86.1%) were more likely to agree with this statement than those in the 1st year (67.7%), 2nd year (74.2%), and 4th year (64.7%). In
contrast, there were no significant differences
observed in students' preferences for in-class exams
over online exams or their ability to demonstrate
knowledge in online tests across different academic
years. Therefore, the survey results suggest that
there are varying perceptions and preferences
regarding online exams among students in different
academic years. While students in the 1st year expressed a stronger preference for multiple-choice formats, students in upper years were more likely to find online test questions easier, and perceptions of the ease of cheating also varied across years. These
findings highlight the importance of considering
students' academic year when designing and
implementing online assessment strategies.
Table 6 presents data on students' preferences
and perceptions regarding in-class tests and online
tests, along with their corresponding grades. The
survey results, along with students' grades, provide
insights into their preferences for in-class tests
versus online tests (Table 6). Analysing the results
at a 5% significance level, several observations can
be made. Firstly, for the statement "I prefer in-class
tests to online tests," there is a significant difference
in the distribution of responses across grade
categories (p = 0.033). A higher percentage of
students in the A grade category (41.1%) strongly
disagreed or disagreed with this statement compared
to students in lower grade categories. Similarly, for
the statement "I cannot show my knowledge when
writing an online test," there is no significant
difference across grade categories (p = 0.791).
Regarding the difficulty of questions between online
and in-class tests, no significant difference is
observed across grade categories (p = 0.865).
Table 6. Perceptions of students on online exams
versus their performance
Similarly, for the preference between Mylab and
D2L platforms, there is no significant difference
across grade categories (p = 0.094). For the statement "I prefer multiple-choice questions for online tests," there is no statistically significant difference (p = 0.232), although a higher percentage of students in the A grade category (71.6%) strongly agreed or agreed with this statement. The analysis also
reveals that there is no significant difference across
grade categories for the statement related to
disliking multiple-choice questions with many parts
in an online test (p = 0.400). Similarly, for the
preference for written answer questions in online
tests, no significant difference is observed (p =
0.283). Regarding the perceived difficulty of online
tests compared to in-class tests, no significant
difference is found (p = 0.711). When it comes to the need for more time for online tests, there is no statistically significant difference across grade categories (p = 0.443), although a higher percentage of students in the A grade category (72.6%) strongly agreed or agreed with this statement. In terms of
concentration during tests, no significant difference
is observed across grade categories (p = 0.510).
Finally, for the statement about cheating in online
tests, there is no significant difference across grade
categories (p = 0.656). Overall, the analysis suggests
that preferences for test formats and perceptions of
difficulty vary among students of different grade
categories.
4.3 Stress and Anxiety
To investigate the factors contributing to students'
stress and anxiety during online exams, eight
specific items were included in the survey: Anx4,
Anx5, Anx8, Anx15, Anx21, Anx23, Anx24, and
Anx25. Each item is defined as follows: Anx4:
"Sequential tests increase my stress." Anx5: "I
prefer in-class tests to online tests." Anx8: "I cannot
show my knowledge when writing an online test."
Anx15: "Writing tests in a different time zone
increases my stress." Anx21: "Online education and
tests increase my mental health problems." Anx23:
"Having one question per page on an online test
increases my stress and anxiety." Anx24: "The
absence of a previous question feature in an online
test increases my stress." Anx25: "Errors in the test
questions increase my stress and anxiety." These
specific items were included in the survey to gain
insights into the factors contributing to students'
stress and anxiety levels during online exams. By
addressing these concerns, it was aimed to better
understand the challenges students face in online
testing environments and explore potential strategies
to alleviate stress and improve the overall
experience for students.
Table 7 presents the results of a survey
conducted to examine the responses of male and
female participants regarding their stress levels and
attitudes toward various aspects of online exams.
The findings pertaining to the responses of male and
female participants to statements regarding their
stress and anxiety levels during online exams are
summarized below (Table 7). For the statement
"Sequential tests increase my stress," there is a
statistically significant difference between males
and females (p = 0.028), with 11.4% of males and
13.0% of females strongly disagreeing or
disagreeing. The findings of a literature study
revealed that participants enrolled in a traditional
face-to-face classroom setting reported slightly
higher levels of stress compared to those taking the
same course online, [20]. In terms of the preference
for in-class tests over online tests, the difference
between genders is not statistically significant (p =
0.146). However, a higher percentage of females
(49.3%) express a preference for in-class tests
compared to males (38.3%). This finding adds to the
existing literature suggesting that female students
tend to experience higher levels of anxiety
compared to their male counterparts, [20], [34].
Table 7. Students' stress and anxiety by gender on
online exams
Similarly, when it comes to the perception of
not being able to demonstrate knowledge in online
tests, there is no significant gender difference (p =
0.221). However, a slightly lower percentage of females (39.7%) than males (40.7%) disagree or strongly disagree with this statement. The
statement regarding stress caused by writing tests in
different time zones does not show a significant
gender difference (p=0.884), with similar
proportions of males and females reporting stress.
Regarding the impact of online education and tests
on mental health problems, there is no significant
gender difference (p=0.164). However, a higher
percentage of males (47.9%) agree or strongly agree
with this statement compared to females (44.5%).
When considering the statement "One question per
page on an online test increases my stress and
anxiety," there is no significant gender difference in
responses (p=0.306). Both genders reported similar
levels of stress and anxiety related to this aspect of
online tests. The inability to go back to a previous question in online tests does not show a significant gender difference (p=0.638), with a high percentage of both genders (86.2% and 89.7%, respectively) agreeing or strongly agreeing with this statement. This suggests that students prefer in-person exams to online exams, which contradicts findings in the literature, [20]. Lastly, for the
statement on mistakes in test questions increasing
stress and anxiety, there is no significant gender
difference (p=0.195), although a slightly higher percentage of males (86.2%) than females (82.2%) express agreement or strong agreement. The
level of test anxiety experienced and its impact
varies based on individual factors, as well as the
interplay of personal, situational, and contextual
elements. Extensive research suggests that female
students often report higher levels of anxiety
compared to their male counterparts, [40].
When the responses of participants were examined based on whether they have their own room or not, only a few items showed statistically significant differences. Having their own room
appeared to influence the levels of stress and anxiety
related to writing tests in different time zones
(p=0.013) and having one question per page on an
online test (p=0.012).
When the responses of students from different academic years were examined for the various statements, no significant differences were found for any of the investigated statements. This suggests
that academic year progression does not appear to
have a significant impact on students' responses
regarding stress and anxiety associated with
sequential tests, online tests, time zone differences,
online education, test features, and mistakes in test
questions.
Table 8. Grade differences in students' stress and
anxiety in online exams
Table 8 displays the results of a survey aimed at
examining the responses of students from different
grades (A, B, C, D, F) regarding their stress levels
related to various aspects of online exams.
Concerning the statement "Sequential tests increase
my stress," no significant differences were found
among the grade groups (p = 0.364). All grade
groups reported similar levels of stress when faced
with sequential tests during online exams. For the
item "I prefer in-class tests to online tests,"
significant differences were observed among the
grade groups (p = 0.033). Students in grade F
expressed a significantly higher preference for in-
class tests compared to students in grades A, B, and
C. These findings align with prior research
indicating that elevated trait anxiety can negatively
impact exam performance, resulting in lower scores
for students with higher levels of anxiety, [15]. This
suggests that students who received lower grades in
their studies tended to have a stronger preference for
in-class tests over online tests. Similarly, there were
no statistically significant differences in responses
to "I cannot show my knowledge when writing an
online test" among the grade groups (p = 0.791). All
grade groups reported comparable difficulties in
demonstrating their knowledge during online exams.
The statement "Writing tests in a different time zone
increases my stress" did not reveal any significant
differences among the grade groups (p = 0.182).
Students from all grades reported similar levels of
stress when writing tests in different time zones.
Regarding the impact of online education and tests
on mental health problems, significant differences
were found among the grade groups (p = 0.020).
Students in grade F reported a significantly higher
increase in mental health problems compared to
students in grades A, B, and C. This suggests that
students who received a lower grade experienced
more mental health issues associated with online
education and tests as stated in the literature, [15].
For the item "One question per page on an online
test increases my stress and anxiety," no significant
differences were observed among the grade groups
(p = 0.070). Students from all grades reported
similar levels of stress and anxiety when faced with
one question per page on an online test. The inability to go back to a previous question in online tests did not show any statistically significant differences
among the grade groups (p = 0.131). All grade
groups expressed a high level of agreement with this
statement. Similarly, the impact of mistakes in test
questions on stress and anxiety did not reveal any
significant differences among the grade groups (p =
0.083). All grade groups reported similar levels of
stress related to this factor. Literature reveals a
negative correlation between performance and
anxiety, indicating that lower grades tend to lead to
increased anxiety, and vice versa, [40], [41].
4.4 Lockdown Browser
To investigate the effects of the Lockdown browser
on students' anxiety levels and academic
performance, four specific items were used for
measurement: Anx1, Anx2, Anx3, and Anx11.
These items were coded and defined as follows:
Anx1: "The use of Lockdown Browser alleviates my
stress." Anx2: "I would rather participate in a Zoom
session with my camera on during the test instead of
using Lockdown Browser." Anx3: "I have no
preference between joining a Zoom session with my
camera on or using Lockdown Browser during the
test." Anx11: "I am extremely concerned about
experiencing freezing issues during the tests."
Figure 1 presents the responses of male and
female participants regarding the statement
"Lockdown Browser reduces my stress." Out of the
total male participants (n = 167), 138 (82.6%)
expressed either Strongly Disagree (SD) or Disagree
(D), indicating that they do not believe Lockdown
Browser reduces their stress. On the other hand, 15
participants (9.0%) remained Neutral (N), and 14
participants (8.4%) expressed Agreement (A) or
Strong Agreement (SA), suggesting that they
believe Lockdown Browser helps reduce their
stress. Among the female participants (n = 146), 123
(84.2%) expressed SD or D, indicating a similar
trend to the male participants. Additionally, 16
participants (11.0%) remained Neutral, and only 7
participants (4.8%) expressed A or SA, suggesting a
lower percentage compared to the male participants.
These findings align with the existing literature, [3],
[26], which highlights that remote monitoring
during supervised exams has led to increased stress
and anxiety among students. Specifically, 40% of
participants reported feeling more pressure during
the examination due to remote monitoring, while
60% stated that they did not experience similar
pressure during traditional testing, [20]. Overall, the
majority of both male and female participants
expressed skepticism or disagreement regarding the
effectiveness of Lockdown Browser in reducing
their stress. The percentage of participants who
agreed or strongly agreed with the statement was
relatively low for both genders.
Fig. 1: Gender differences on "Lockdown Browser
reduces my stress"
*1 represents SD+D, 2 represents N and 3 represents
A+SA
The findings concerning the preferences of male
and female participants regarding their choice to
join Zoom with their camera on instead of using
Lockdown Browser during tests can be summarized
as follows. Among the male participants (n = 167),
39 (23.4%) expressed either Strongly Disagree (SD)
or Disagree (D), indicating that they do not prefer
joining Zoom with the camera on. On the other
hand, 25 participants (15.0%) remained Neutral (N),
and a majority of 103 participants (61.7%)
expressed Agreement (A) or Strong Agreement
(SA), suggesting that they prefer joining Zoom with
the camera on instead of using Lockdown Browser
during tests. Among the female participants (n =
146), a higher percentage of 58 participants (39.7%)
expressed SD or D, indicating that they do not
prefer joining Zoom with the camera on.
Additionally, 20 participants (13.7%) remained
Neutral, and 68 participants (46.6%) expressed A or
SA, suggesting that a lower percentage of female
participants prefer joining Zoom with the camera on
compared to the male participants. Overall, the data
suggests that a significant portion of both male and
female participants prefer joining Zoom with the
camera on instead of using Lockdown Browser
during tests. However, the percentage of female
participants expressing this preference is relatively
lower compared to male participants. The findings
pertaining to the preferences of male and female
participants regarding their choice to neither join
Zoom with the camera on nor use the Lockdown
Browser during tests can be summarized as follows.
Among the male participants (n = 167), 24 (14.4%)
expressed either Strongly Disagree (SD) or Disagree
(D), indicating that they do not prefer this option.
The majority of male participants, 109 (65.3%),
expressed Agreement (A) or Strong Agreement
(SA), suggesting that they prefer neither joining
Zoom with the camera on nor using Lockdown
Browser during tests. Among the female
participants (n = 146), a lower percentage of 10
participants (6.8%) expressed SD or D, indicating
that they do not prefer this option. Additionally, 24
participants (16.4%) remained Neutral, and the
majority of female participants, 112 (76.7%),
expressed A or SA, suggesting that they prefer
neither joining Zoom with the camera on nor using
Lockdown Browser during tests. Overall, the data
indicates that a majority of both male and female
participants prefer neither joining Zoom with the
camera on nor using the Lockdown browser during
tests. The percentage of male participants
expressing this preference is higher compared to
female participants.
Findings show the responses of male and female
participants regarding their level of worry about
experiencing freezing issues during tests. Among
the male participants (n = 167), 40 (24.0%)
expressed either Strongly Disagree (SD) or Disagree
(D), indicating that they are not very worried about
freezing issues. On the other hand, a majority of
male participants, 107 (64.1%), expressed
Agreement (A) or Strong Agreement (SA),
suggesting that they are indeed very worried about
freezing issues during tests. Among the female
participants (n = 146), a lower percentage of 22
participants (15.1%) expressed SD or D, indicating
that they are not very worried about freezing issues.
Additionally, the majority of female participants,
100 (68.5%), expressed A or SA, suggesting that
they are very worried about freezing issues during
tests. Overall, the data indicates that a significant
proportion of both male and female participants are
very worried about experiencing freezing issues
during tests. The percentage of male participants
expressing this worry is higher compared to female
participants, but the majority of both genders share
this concern.
Table 9. Perceptions of students on Lockdown
Browser and having own room
Table 9 shows the significance of having one's
own room on students' responses to the items related
to the use of the Lockdown Browser. When the
same items of the survey were examined according
to having their own room, the analysis indicates that
there is no significant difference in students'
responses to the items related to Lockdown Browser
based on whether they have their own room or not.
Both groups show similar levels of agreement,
preference, and concern (Table 9).
Similarly, when the four items related to the
Lockdown browser were analyzed, the results
indicate that there is no significant difference in
students' responses to the items related to Lockdown
Browser based on their class year. The agreement,
preference, and concern levels were generally
consistent across all class levels. However, a
noteworthy difference was observed in the
preference for neither joining Zoom nor using
Lockdown Browser, with first-year students
showing a higher inclination towards this option.
Research in the literature reveals that students
suffering from depression tend to report higher
levels of anxiety when it comes to being required to
appear on camera during class, as well as concerns
about potential embarrassment arising from their
surroundings while being on camera, [29]. A
specific interview study conducted with students
experiencing depression demonstrated that the
obligation to be on camera during online
coursework can exacerbate their depressive
symptoms. This effect is attributed to the common
reluctance among undergraduate students to disclose
their depression to peers and mentors, [29], [42].
Consequently, the presence of an active camera may
prove particularly detrimental to students when they
are feeling sad or disengaged, as it becomes
increasingly challenging for them to conceal their
depression from others, [29].
Figure 2 presents the responses of participants
belonging to different grade categories (A, B, C, D,
and F) regarding their perception of whether the
lockdown browser reduces their stress during tests.
For grade A (n = 95) participants, the majority of 75
(78.9%) expressed either Strongly Disagree (SD) or
Disagree (D), indicating that they do not believe that
the lockdown browser reduces their stress.
Additionally, 12 participants (12.6%) remained
Neutral (N), and only 8 participants (8.4%)
expressed Agreement (A) or Strong Agreement
(SA), suggesting that they find the lockdown
browser effective in reducing their stress. Among
grade B participants (n = 67), a higher percentage of
56 participants (83.6%) expressed SD or D,
indicating their disagreement with the statement.
Seven participants (10.4%) remained Neutral, and
four participants (6.0%) expressed A or SA,
indicating their agreement with the effectiveness of
the lockdown browser in reducing stress.
Fig. 2: Grades and students’ perception of whether
the lockdown browser reduces their stress during
tests
Similarly, for grade C (n = 78) participants, the
majority of 66 participants (84.6%) expressed SD or
D, while seven participants (9.0%) expressed A or
SA. Five participants (6.4%) remained Neutral. For
grade D (n = 51) participants, the highest proportion
of 46 participants (90.2%) expressed SD or D,
indicating their disagreement with the effectiveness
of the lockdown browser in reducing stress. Five
participants (9.8%) remained Neutral. Finally, for
grade F (n = 22) participants, a majority of 18
participants (81.8%) expressed SD or D, while two
participants (9.1%) expressed A or SA. Two
participants (9.1%) remained Neutral. Overall, the data suggest that participants from all grade categories generally do not perceive the lockdown browser as an effective means of reducing their stress during tests. In each grade category, the majority of participants expressed disagreement, while a small proportion expressed agreement or remained neutral. In the existing
literature, while proponents argue that proctoring
can effectively reduce cheating, several studies have
also highlighted the drawbacks associated with
proctoring and the use of online exams in general,
[3]. These drawbacks primarily pertain to perceived
exam difficulty, student performance, and test
anxiety, [3], [43]. Conversely, preliminary results
have shown that students express concerns
regarding the extent of personal data shared with
proctoring software providers, [3], [44]. The use of
live remote proctoring has faced criticism for
exacerbating test anxiety and infringing upon
personal privacy, [45], [46].
When the responses of participants belonging to different grade categories (A, B, C, D, and F) regarding their preference for joining Zoom with the camera on instead of using the lockdown browser during tests were examined, it was found that participants from different grade categories show varying preferences. The majority of participants in grades A, B, C, D, and F expressed agreement or strong agreement, while grade D showed a relatively higher proportion of disagreement or strong disagreement.
Neutral responses were observed across all grade
categories, suggesting a range of perspectives on
this preference. According to the literature, the
findings demonstrated that the non-proctored online
test resulted in a four-point grade advantage
compared to the traditional method, [16]. However,
another study also revealed that the group who took
exams in the unproctored environment exhibited
significantly greater variation in their performance
outcomes, [2]. Participants from different grade
categories show varying preferences regarding
neither joining Zoom with the camera on nor using
the lockdown browser during tests. The majority of
participants in grades A, B, C, D, and F expressed
agreement or strong agreement, indicating their
preference for this option. However, some
participants expressed disagreement or neutrality,
suggesting a diversity of opinions within each grade
category.
Fig. 3: Grades and students' level of worry about
experiencing freezing issues during online exams
Figure 3 shows the grades and students’ level of
worry about experiencing freezing issues during
online exams. Participants across different grade
categories exhibit varying levels of concern
regarding the occurrence of freezing issues during
tests. The majority of participants in grades A, B, C,
and D expressed agreement or strong agreement,
indicating a higher degree of worry. However, a
significant portion of participants expressed
disagreement, neutrality, or a lower level of
concern, particularly among grade F participants.
These findings contribute to the existing research on
challenges related to exams, [32]. However, these
findings contradict the study focused on
experiencing technical failures during the exam
submission process, where only 11% of participants
reported such inconveniences, mainly attributed
internet connectivity issues, while 89% stated that
they did not encounter any technical complications,
[20]. These contrasting results suggest that the level
of concern for freezing issues during tests varies
among students at different grade levels (Fig. 3).
Within the literature, it has been identified that a
notable portion of students are adversely affected by
the absence of reliable Internet connectivity and
appropriate electronic devices. This situation places
these students at a significant disadvantage when it
comes to any testing solution reliant on the Internet,
[47].
5 Conclusion
The purpose of this research is to explore students'
attitudes toward online education, online exams, the
implementation of proctoring during online exams,
and how these factors affect their anxiety levels
related to online learning. Additionally, the study
aims to analyze the impact of online learning on
students' anxiety levels based on various factors
such as gender, grade, having a personal room, and
class standing.
Regarding online courses, a significant
correlation is observed between gender and the
perceived difficulty of learning statistics and
mathematics in an online class. However, no
significant relationship is found between gender and
the difficulty of contacting group members for
online group projects, preference for individual
work in online projects, or the amount of time spent
on learning materials in an online course. Similarly,
owning a separate room does not show any
statistically significant association with the
difficulty of contacting group members for online
group projects, preference for individual work in
online projects, the difficulty of learning statistics
and mathematics in an online class, or the amount of
time spent on learning materials in an online course.
Furthermore, weak associations are identified
between class standings and the difficulty of
contacting group members, preference for individual
work, the perceived difficulty of learning statistics
and mathematics online, and the time spent on
learning materials in an online course. In terms of
grades, no statistically significant relationship is
observed between grades and the difficulty of
contacting group members for online group projects,
preference for individual work in online projects, or
the time spent on learning materials in an online
course. However, a statistically significant
association is found between grades and the
perceived difficulty of learning statistics and
mathematics in an online class.
When it comes to the challenges of online
exams, a significant majority of both males (76.6%)
and females (80.8%) indicated a preference for
multiple-choice questions in online tests. However,
participants from both groups expressed concerns
about multiple-choice questions with numerous
parts, as they believed they would not receive partial
credit or marks for their answers. Perceptions of
online tests being more challenging than in-person
exams varied, with a higher percentage of males
(31.1%) finding online tests more demanding
compared to females (23.3%). Both male and
female participants agreed that more time should be
allotted for online exams. While owning a private
room did not significantly influence most
perceptions related to online exams, it did have a
notable impact on the ability to concentrate during
tests. Additionally, participants who did not have
their own room expressed a stronger preference for
multiple-choice questions and a tendency to desire
more time for online exams. The findings suggest
that students' perceptions and preferences regarding
online exams differ across academic years. The
observed discrepancies in preferences for multiple-
choice questions, perceived question difficulty,
beliefs about cheating, and the desire for additional
time underscore the importance of considering the
distinct needs and perspectives of students at
different stages of their academic journey when
designing and implementing online assessments.
Regarding stress and anxiety, although there are
some variations in responses between males and
females, only a few of them reach statistical
significance. These findings indicate that gender
might have a limited impact on the levels of stress
and anxiety experienced during online exams, and
other factors may have a more substantial influence.
The survey results reveal that there are no
significant differences in responses among
participants from different academic years. The
stress levels and preferences concerning online
exams remain generally consistent across all
academic year groups. This suggests that the factors
examined in the survey are not significantly
influenced by the participants' academic year.
Moreover, the survey results suggest that students
from different grade levels exhibit diverse responses
to certain aspects of online exams. Specifically,
students who received lower grades demonstrate a
stronger preference for in-class tests and experience
more mental health issues associated with online
education and exams. However, no significant
differences are observed in other aspects, such as
sequential tests, the inability to demonstrate
knowledge, test format, and errors in test questions.
These findings indicate that grade level may play a
role in shaping students' perceptions and
experiences with online exams, but further
investigation is necessary to comprehend the
underlying factors influencing these discrepancies.
Regarding the relationship between students'
anxiety and performance, a majority of both male
and female participants demonstrated skepticism or
disagreement regarding the effectiveness of
Lockdown Browser in reducing their stress levels.
The percentage of participants who agreed or
strongly agreed with this statement was relatively
low for both genders. The data indicate that a
significant majority of both male and female
participants do not prefer joining Zoom with their
cameras on or using Lockdown Browser during
tests. However, the percentage of female
participants expressing this preference is relatively
higher compared to male participants.
Regarding the ownership of a separate room, the
analysis reveals that there is no significant
difference in students' responses related to
Lockdown Browser based on whether they have
their own room or not. Both groups show similar levels
of agreement, preference, and concern.
In terms of the use of Lockdown Browser in
online exams, the analysis indicates that there are no
significant differences in students' responses based
on their class level. However, there is a significant
difference in the preference for neither joining
Zoom nor using Lockdown Browser, with first-year
students showing a higher preference for this option.
When considering grades, the majority of
participants in grades A, B, C, and F expressed
agreement or strong agreement, while grade D
showed a higher proportion of disagreement or
strong disagreement. Neutral responses were
observed across all grade categories, suggesting a
range of perspectives on this preference. These findings are consistent with a quantitative study showing that proctoring stress can cause students to forget concepts they had previously learned or studied, [20].
The stress levels related to this factor were
similar across all grade groups. In general, the
survey results reveal that students from different
grade levels display different responses to specific
aspects of online exams. Particularly, students who
received lower grades expressed a stronger
inclination for in-person tests and encountered more
mental health challenges associated with online
education and exams. However, no significant
differences were found in other aspects, such as
sequential tests, the inability to demonstrate
knowledge, test format, and errors in test questions.
These findings suggest that grade level may
influence students' perceptions and experiences with
online exams.
Regarding the challenges of exams, the data
reveals that a significant percentage of both male
and female participants express significant worries
about experiencing freezing issues during tests.
While the proportion of male participants expressing
this concern is higher than that of female
participants, the majority of both genders share this
worry. Furthermore, participants from all grade
categories, except grade A, generally do not
perceive the lockdown browser as an effective
method for reducing their stress during tests. Most
participants in grade A express disagreement, while
a small proportion express agreement or remain
neutral. Participants across different grade
categories display varying levels of concern
regarding freezing issues during tests. The majority
of participants in grades A, B, C, and D express
agreement or strong agreement, indicating a higher
degree of worry. However, a notable portion of
participants expresses disagreement, neutrality, or a
lower level of concern, particularly among grade F
participants. These findings add to existing studies on exam challenges, [16], [32].
However, they contradict the study on the issue of
experiencing technical failures during the exam
submission process, where only 11% reported such
inconveniences, primarily attributed to internet
connectivity issues, while 89% reported no technical
complications, [20]. These contrasting results
suggest that the level of concern for freezing issues
during tests varies among students across different
grade levels.
The study suggests that gender may not be a
significant determining factor in the reported stress
levels and attitudes toward online exams. However,
further analysis and investigation may be necessary
to explore other potential factors that contribute to
these observations.
To alleviate students' anxiety during online proctored exams, the following recommendations can be made based on the existing research: (1) implement word problems instead of relying solely on multiple-choice questions, [21]; (2) allow students to review and revisit previously answered questions during the exam; (3) provide a pilot exam to familiarize students with the format and requirements before the actual exam; (4) communicate the exam format in detail well in advance, including instructions on how to address any access or internet issues that may arise; and (5) consider incorporating a Paired Adaptive Test (PAT) system, which combines short-answer and multiple-choice questions, [11].
However, it is important to acknowledge the
limitations of the study. One limitation is the sample
size. With a larger sample size, some differences
that are currently deemed insignificant may become
significant. For instance, at the 0.05 significance level there are indications of possible differences in responses among different class standings, but further analysis is necessary to confirm these findings. Additionally, there may be
borderline associations that require further
investigation or a larger sample size to draw
conclusive results. Another limitation is the focus on
Lockdown Browser and anxiety, which leaves room
for additional questions and considerations.
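To make the sample-size caveat above concrete, the following sketch (using hypothetical counts, not the survey data) shows how the same response proportions in two groups can miss the 0.05 threshold in a small sample yet cross it when the sample is larger.

```python
# Hedged illustration of the sample-size limitation noted above: identical
# response proportions can be non-significant in a small sample and
# significant in a larger one. All counts below are hypothetical.
from scipy.stats import chi2_contingency

def p_value(agree_a, disagree_a, agree_b, disagree_b):
    """Chi-square p-value for a 2x2 table of two groups' agree/disagree counts."""
    _, p, _, _ = chi2_contingency([[agree_a, disagree_a],
                                   [agree_b, disagree_b]])
    return p

# 30% vs 40% agreement in two groups, 50 respondents per group ...
small = p_value(15, 35, 20, 30)
# ... and the same proportions with 200 respondents per group.
large = p_value(60, 140, 80, 120)

print(f"n = 50 per group:  p = {small:.3f}")   # well above 0.05
print(f"n = 200 per group: p = {large:.3f}")   # falls below 0.05
```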
Future studies can explore the use of anxiety
tests specifically developed for online exams and
investigate alternative methods to reduce anxiety.
Furthermore, conducting similar studies comparing
online proctoring with traditional or hybrid
education formats can provide valuable insights for
comparison and analysis.
References:
[1] Hosseini, M. M., Egodawatte, G., Ruzgar, N.
S., Online assessment in a business
department during COVID-19: Challenges
and practices, The International Journal of
Management Education, Vol. 19, 2021,
https://doi.org/10.1016/j.ijme.2021.100556.
[2] Hollister, K. K., Berenson, M. L., Proctored
versus unproctored online exams: Studying
the impact of exam environment on student
performance, Decision Sciences Journal of
Innovative Education, Vol. 7 No. 1, 2009, p.
271-294.
[3] Conijn, R., Kleingeld, A., Matzat, U.,
Snijders, C., The fear of big brother: The
potential negative side-effects of proctored
exams, Journal of Computer Assisted
Learning, Vol. 38, No. 6, 2022, p. 1521-1534.
https://doi.org/10.1111/jcal.12651.
[4] Paredes, S. G., Peña, F. J. J., Alcazar, J. M. F.,
Remote proctored exams: Integrity assurance
in online education?, Distance Education,
Vol. 42, No. 2, 2021, pp. 200-218,
DOI: 10.1080/01587919.2021.1910495
[5] Nurunnabi, M., Hossain, M., Data falsification and question on academic integrity, Accountability in Research, Vol. 26, No. 2, 2019, p. 108-122. https://doi.org/10.1080/08989621.2018.1564664.
[6] Arnold, I., Cheating at online formative tests: Does it pay off?, Internet and Higher Education, Vol. 29, 2016, pp. 98-106. https://doi.org/10.1016/j.iheduc.2016.02.001.
[7] Awdry, R., Assignment outsourcing: Moving beyond contract cheating, Assessment and Evaluation in Higher Education, 2020, pp. 1-16. https://doi.org/10.1080/02602938.2020.1765311.
[8] Marques, T., Portugal F., M., Gomes, J., Understanding cheating behaviors: Proactive and reactive intentions, Ethics and Education, Vol. 14, No. 4, 2019, pp. 415-429. https://doi.org/10.1080/17449642.2019.1669310.
[9] Woldeab, D., Brothen, T., 21st century
assessment: Online proctoring, test anxiety,
and student performance, International
Journal of E-Learning & Distance Education,
Vol. 34, No. 1, 2019, p. 1-10.
[10] Kumar, P., Kumar, A., Palvia, S., Verma, S.,
Online business education research:
Systematic analysis and a conceptual model,
The International Journal of Management
Education, 2018, p. 26-35,
https://doi.org/10.1016/j.ijme.2018.11.002.
[11] Seeley, E. L., Andrade, M., Miller, R. M.,
Exam anxiety: Using paired adaptive tests to
reduce stress in business classes, e-Journal of
Business Education & Scholarship of
Teaching, Vol. 12, No. 3, 2018, p. 1-13.
[12] Schmidt, S. M. P., Ralph, D. L., Burskirk, B.,
Utilizing Online Exams: A Case Study,
Journal of College Teaching and Learning,
Vol. 6, No. 8, 2009.
[13] Budhai, S. S., Fourteen simple strategies to reduce cheating on online examinations, Faculty Focus, Higher Education Teaching & Learning, 2020, www.facultyfocus.com/articles/educational-assessment/fourteen-simple-strategies-to-reduce-cheating-on-online-examinations/ (Accessed 11 February 2021).
[14] Fontaine, S., Frenette, E., Hébert, M. H.,
Exam cheating among Quebec’s preservice
teachers: the influencing factors,
International Journal for Educational
Integrity, Vol.
16, No. 14, 2020,
https://doi.org/10.1007/s40979-020-00062-6.
[15] Woldeab, D., Brothen, T., 21st century
assessment: online proctoring, test anxiety,
and student performance, International
Journal of E-Learning & Distance Education,
(IJEDE), Vol. 34, No. 1, 2019, ISSN: 2292-
8588.
[16] Rodríguez-Villalobos, M., Fernandez-Garza, J., Heredia-Escorza, Y., Monitoring methods
and student performance in distance education
exams, Information and Learning Technology,
Vol. 40 No. 2, 2023, p. 164-176,
DOI: 10.1108/IJILT-04-2022-0085.
[17] Carstairs, J., Myors, B., Internet testing: a
natural experiment reveals test score
inflation on a high-stakes, unproctored
cognitive test, Computers in Human
Behavior, Vol. 25, No. 3, 2009, p. 738-
742, DOI: 10.1016/j.chb.2009.01.011.
[18] Dikmen, M., Test anxiety in online exams:
scale development and validity, Current
Psychology, 2022,
https://doi.org/10.1007/s12144-022-04072-0.
[19] Wadi, A., Yusoff, M. S. B, Abdul Rahim, A.
F., Lah, N. A. Z. N., Factors affecting test
anxiety: A qualitative analysis of medical students' views, BMC Psychology, Vol. 10,
No. 1, 2022, p. 1-8.
[20] Lazarevic, B., Bentz, D., Student perception of
stress in online and face-to-face learning: The
exploration of stress determinants, American
Journal of Distance Education, 2020,
DOI: 10.1080/08923647.2020.1748491
[21] Carlsbad, How Lockdown Browser impacts students' testing mindset, UWIRE, a division of Uloop, The Carroll News, John Carroll University, 2020.
[22] Dratva, J., Zysset, A., Schlatter, N., Wyl, A., Huber, M., Volken, T., Swiss university students' risk perception and general anxiety during the COVID-19 pandemic, International Journal of Environmental Research and Public Health, Vol. 17, 2020, 7433, DOI: 10.3390/ijerph17207433.
[23] Arora, S., Chaudhary, P., Singh, R. Kr., Impact of coronavirus and online exam anxiety on self-efficacy: the moderating role of coping strategy, Interactive Technology and Smart Education, Vol. 18, No. 3, 2021, p. 475-492, DOI: 10.1108/ITSE-08-2020-0158.
[24] Wachenheim, C. J., Final exam scores in
introductory economics courses: effect of
course delivery method and proctoring,
Review of Agricultural Economics, Vol. 31,
No. 3, p. 640-652.
[25] Daffin, L. W., Jr., Jones, A. A., Comparing
student performance
on proctored and non-
proctored exams in online psychology
courses, Online Learning, Vol. 22, No. 1,
2018, p. 131-145.
[26] Lee, J. W., Impact of proctoring
environments on student performance:
Online vs offline proctored exams, The
Journal of Asian Finance, Economics, and
Business, Vol. 7, No. 8, 2020, p. 653-660.
https://doi.org/10.13106/JAFEB.2020.VOL7.NO8.653.
[27] Sánchez-Cabrero, R., Arigita-García, A., Gil-Pareja, D., Sánchez-Rico, A., Martínez-López, F., Sierra-Macarrón, L., Measuring the relation between academic performance and emotional intelligence at the university level after the COVID-19 pandemic using TMMS-24, Sustainability, Vol. 14, 2022, 3142. https://doi.org/10.3390/su14063142.
[28] Jaap, A., Dewar, A., Duncan, C., Fairhurst, K.,
Hope, D., Kluth, D., Effect of remote online
exam delivery on student experience and
performance in applied knowledge tests,
BMC Medical Education, Vol. 21, No. 86, 2021, https://doi.org/10.1186/s12909-021-02521-1.
[29] Mohammed, T. F., Erika, M. N., Carly, A.
Busch, D., B., Brownell, S. E., Claiborne, C. T.,
Edwards, B. A., Wolf, J. G., Lunt, C., Tran, M.,
Vargas, C., Walker, K. M., Warkina, T. D., Witt,
M. L., Zheng, Y., Cooper, K. M., Aspects of
large-enrollment online college science
courses that exacerbate
and alleviate
student anxiety,
CBE-Life Sciences
Education, Vol. 20, ar. 69, 2021, p. 1-23.
[30] Coohey, C., Landsman, M. J., Cummings, S.
P., Teaching strategies to reduce test anxiety
among MSW students preparing for licensure,
Journal of Teaching in Social Work, Vol. 43,
No. 2, 2023, pp. 226-238,
DOI: 10.1080/08841233.2023.2170116.
[31] Stallman, H. M., Ohan, J. L., Chiera, B.,
Reducing distress in university students: A
randomised control
trial of two online
interventions, Australian Psychologist, Vol. 54,
2019, pp. 25-131. DOI: 10.1111/ap.12375.
[32] Ruzgar, N.S., Chua, C., How the preferences
of students change on online learning from
transition term to during the Covid pandemic
period, WSEAS Transactions on Advances in
Engineering Education, Vol. 18, 2021, p. 114-
134. DOI: 10.37394/232010.2021.18.11.
[33] Ruzgar, N., Chua-Chow, C., An empirical
study of student performance during the
COVID-19 pandemic, International Journal
of Education and Development using
Information and Communication Technology
(IJEDICT), Vol. 19, No. 1, 2023, p. 20-36.
[34] Santana, C., Gender differences in test
anxiety in high-stakes English proficiency
tests, Electronic Journal of Foreign Language
Teaching, Vol. 15, No. 1, 2018, p. 100-111.
[35] Rabin, L. A., Krishnan, A., Bergdoll, R.,
Fogel, J., Correlates of exam performance in
an introductory statistics course: Basic math
skills along with self-reported
psychological/behavioral and demographic
variables, Statistics Education Research
Journal, Vol. 20, No.1, Article 3.
https://doi.org/10.52041/serj.v20i1.97.
[36] Rodríguez, S., Reguerio, B., Piñeiro, I.,
Estévez, I., Valle, A., Gender differences in
mathematics motivation: differential effects
on performance in primary education, Front.
Psychol., Vol. 10, 2020.
https://doi.org/10.3389/fpsyg.2019.03050.
[37] Muthuprasad, T., Aiswarya, S., Aditya, K.S.,
Girish K. J., Students’ perception and
preference for online education in India
during COVID-19 pandemic, Social Sciences & Humanities Open, Vol. 3, 2021, 100101, p. 1-11. https://doi.org/10.1016/j.sshaho.2020.100101.
[38] Niederle, M., & Vesterlund, L., Explaining
the gender gap in math test scores: The role of
competition, The Journal of Economic
Perspectives, Vol. 24, No. 2, 2010, p. 129-
144. https://doi.org/10.1257/jep.24.2.129.
[39] Voyer, D., & Voyer, S. D., Gender differences
in scholastic achievement: A meta-analysis,
Psychological Bulletin, Vol. 140, No. 4, 2014,
p. 1174-1204.
https://doi.org/10.1037/a0036620.
[40] Morales-Martinez, G. E., Hedlefs-Aguilar, M.
I., Villarreal-Lozano, R. J., Moreno-
Rodriguez, C., Gonzalez-Rodriguez, E. A.,
Functional measurement applied to
engineering students’ test anxiety judgment
for online and face-to-face tests, European
Journal of Educational Research, Vol.10, No.
3, 2021, p. 1599-1612.
https://doi.org/10.12973/eu-jer.10.3.1599.
[41] Bhuvaneswari, U. L, Test anxiety and
educational adjustment of college students.
Indian Journal of Applied Research, Vol. 10,
No. 1, 2020, p. 18-19.
https://doi.org/10.36106/ijar/8012467.
[42] Cooper, K. M., Gin, L. E., Brownell, S. E.,
Depression as a concealable stigmatized
identity: What influences whether students
conceal or reveal their depression in
undergraduate research experiences?,
International Journal of STEM Education,
Vol. 7, 2020, p. 1-18.
[43] Butler-Henderson, K., Crawford, J., A
systematic review of online examinations: A
pedagogical innovation for scalable
authentication and integrity. Computers &
Education, Vol. 159, 2020, 104024. https://doi.org/10.1016/j.compedu.2020.104024.
[44] Balash, D. G., Kim, D., Shaibekova, D., Fainchtein, R. A., Sherr, M., Aviv, A. J., Examining the examiners: Students' privacy and security perceptions of online proctoring services, ArXiv Preprint ArXiv:2106.05917, 2021.
[45] Karim, M. N., Kaminsky, S. E., Behrend, T.
S., Cheating, reactions, and performance in
remotely proctored testing: An exploratory
experimental study, Journal of Business and
Psychology, Vol. 29, 2014, p. 555-572.
https://doi.org/10.1007/s10869-014-9343-z.
[46] Lilley, M., Meere, J., Barker, T., Remote live
invigilation: A pilot study, Journal of
Interactive Media in Education, Vol. 1, 2016,
p.1-5.
[47] Moore, R., Vitale, D., High school students' access to and use of technology at home and in school, ACT Center for Equity in Learning, 2018. Retrieved from https://equityinlearning.act.org/research-doc/high-school-students-access-to-and-use-of-technology-at-home-and-in-school/.
Contribution of Individual Authors to the
Creation of a Scientific Article (Ghostwriting
Policy)
Nursel Ruzgar and Clare Chua jointly prepared and administered the survey, performed the statistical analysis, and interpreted the results.
Sources of Funding for Research Presented in a
Scientific Article or Scientific Article Itself
There is no funding for this research.
Conflict of Interest
The authors have no conflict of interest to declare.
Creative Commons Attribution License 4.0
(Attribution 4.0 International, CC BY 4.0)
This article is published under the terms of the
Creative Commons Attribution License 4.0
https://creativecommons.org/licenses/by/4.0/deed.en_US