The Likelihood of Cheating on Formative Vocabulary Tests: Before and During Online Remote Learning in English Courses

Introduction: Early review studies identified the prevalence of cheating and the emergence of various forms of cheating in academic institutions. Now, there is growing concern about the rise of academic dishonesty in unproctored online tests conducted remotely. Purpose: This study examined the likelihood of student cheating on formative vocabulary tests conducted before and during online remote learning in English courses. The vocabulary tests were administered using the Socrative application in both learning conditions. Method: Using a quantitative research design, including multiple paired-sample t-tests and independent t-tests, this study collected 2,971 first- and second-year students' formative scores across six general English courses. Results: Multiple paired-sample t-tests confirmed that students' scores were significantly higher during online remote learning, with score differences between the two periods ranging from 0.10 to 2.21. This difference in score patterns indicated the likelihood of students cheating during online remote learning. Independent t-tests did not reveal a tendency for male students to cheat on online tests more often than female students. Conclusion: The findings of this study may serve as an initial phase of inquiry into the identification of formative test cheating in online English classes.


INTRODUCTION
Due to the emergence of COVID-19 at the beginning of 2020, a sudden shift from face-to-face to online classes revealed several issues in pedagogical practices. The growth in student cheating on online remote exams and formative tests is one of them. An early review study identified the prevalence of cheating and the emergence of various forms of cheating in academic institutions (McCabe et al., 2001), and now that higher education institutions are forced to organize online remote exams, there is growing concern about the rise of academic dishonesty in unproctored online test environments. Long before the outbreak, empirical research on student cheating predicted that, due to the lack of face-to-face contact between student and teacher, online remote cheating would be more prevalent than traditional forms of cheating (e.g., Fontaine, 2012; McNabb & Olmstead, 2009). A growing body of research has recently attempted to collect evidence of student cheating (Bilen & Matros, 2021; Vellanki et al., 2023), develop proctoring strategies (Nguyen et al., 2020), and search for appropriate assessment designs (Raje & Stitzel, 2020) for examinations held during COVID-19 online remote classes. Meccawy et al. (2021) gathered students' and lecturers' perspectives on the implementation of online remote tests during the COVID-19 period; both students and lecturers expressed concerns about the increase in cheating and plagiarism and urged the university to address the issue.
Three approaches have been used in prior studies to detect the likelihood of cheating. The first approach is to collect students' perceptions using scenarios designed to elicit students' personal perspectives on whether they would cheat on online tests (e.g., Daniels et al., 2021; Walsh et al., 2021). This method may include self-reported surveys or qualitative interviews to ascertain whether students cheated on online tests in previous terms (Janke et al., 2021). The second approach is to compare students' test scores in offline and online environments (e.g., Brallier & Palm, 2015; Chuang et al., 2017; Ranger et al., 2020). The last approach assesses the likelihood of student cheating by examining students' grade patterns (Arnold, 2016).
The current study takes the second and third approaches, i.e., comparing students' test scores in offline and online remote contexts and observing any unusual grade patterns that may indicate the likelihood of cheating on formative vocabulary tests. The use of the internet and technology, combined with the remote distance between students and teachers, appears to have enhanced the temptation to cheat. The findings of this study examine this assumption and deepen our understanding of the disproportions in student test performance prior to and during COVID-19 online remote learning.
This study builds upon these approaches to detecting student cheating on online remote tests. The literature review section below reviews studies on cheating on online remote tests both before and during COVID-19 online remote learning. It continues with a discussion of EFL teachers' concerns over the reliability and validity of assessments during online learning due to the high possibility of student cheating. Then, it brings up the practice of using technology-based approaches for formative assessment in online learning and the role of gender among students who cheat on a test. The following research problems are addressed using empirical data: (1) Was there a significant difference in student performance on vocabulary tests undertaken before the commencement of COVID-19-related online learning and those undertaken during COVID-19-related online learning?
(2) How do the performances of female and male students compare in vocabulary tests?

LITERATURE REVIEW

Cheating Practices in Higher Education
Cheating on a test is described as a violation of the regulations that have been established for a specific test and have been explicitly laid out for students (Dick et al., 2003). Most test rules prohibit copying classmates' answers, opening learning material sources such as books and modules, seeking answers from reachable people such as classmates and teachers, and using digital device aids that can assist in finding test answers; essentially, they require students to concentrate on answering test questions using their own knowledge, without outside assistance. Cheating has long been a problem in educational assessments, as it is often perceived as a quick way to earn a decent grade (Aiken, 1991). The scale of the problem is demonstrated in an early study by McCabe et al. (2001), who reviewed studies on cheating over the preceding 30 years and reported that students' impressions of their peers' behavior were the most powerful influence on their inclination to cheat. While it is true that not all students cheat, they are inclined to do so if they witness classmates cheating on tests. Moreover, tests that are seen as difficult learning tasks have a substantial, direct impact on students' likelihood of cheating, as they can generate negative emotions, such as anxiety and stress, as well as increased pressure prior to the tests (Wenzel & Reinhard, 2020).
Cheating on tests becomes more of a concern in online remote learning contexts. One of the primary reasons is that proctors are unable to supervise students completely during online remote testing, which increases the potential for students to cheat. Even though a number of proctoring approaches enable relatively secure online testing environments to be established, for example ProctorU or Proctorexam.com, this type of affordance was not available to the institution in question. Fask et al. (2014), for example, studied students' test-taking behaviors in offline and online environments. Their findings revealed that the online testing environment has a detrimental effect on performance, including increased ambient distractions, differences in student comfort, technical difficulties, and a reduced ability to seek clarification of potentially ambiguous exam questions. All of these negative consequences encourage students to cheat, meaning that online testing aids student cheating, a conclusion reinforced by further research (e.g., Chuang et al., 2017; Ranger et al., 2020). When online remote examinations are not proctored, students are more likely to cheat (Harmon & Lambrinos, 2008). It has been recognized that students perform much better on unproctored online remote tests than on proctored classroom assessments, raising the possibility of cheating (Brallier & Palm, 2015; Waluyo & Tuan, 2021). Thus, to combat academic dishonesty in online testing, previous research has emphasized the importance of (1) tightening the proctoring process using webcam recording software, which can be useful during tests and for post-test evaluation (Dendir & Maxwell, 2020), and (2) using paraphrased test questions whose answers are not readily available on the internet (Golden & Kohlbeck, 2020). Cheating, nevertheless, may not be completely eliminated due to the nature of online remote testing. However, an empirical study conducted by Ladyshewsky (2015) discovered no statistically significant differences between post-graduate students' scores on supervised in-class tests and unsupervised online tests, even though both types of tests included multiple-choice questions that are prone to cheating. These findings suggest that the higher the educational level at which students study, the less likely they are to cheat, regardless of the testing situation.

Online Remote Learning and Test during COVID-19
In March 2020, many higher education institutions worldwide transitioned from face-to-face learning to online learning in response to the COVID-19 pandemic. These significant shifts occurred spontaneously and without prior planning but were critical to reduce contact between students and teachers and to contain the spread of the COVID-19 virus.
Since then, educators have encountered numerous barriers and challenges, raising concerns about the effectiveness of COVID-19 online remote learning as a substitute for traditional teaching and learning. One of the points of contention is whether the new norm of online learning makes it easier for students to cheat. As a result, a growing number of empirical studies have been conducted on the subject in different countries. Janke et al. (2021) conducted a survey in Germany to determine the dangers of ad hoc online assessment for academic integrity. They surveyed 1,608 German students from various higher education institutions who had participated in COVID-19 online remote learning. As expected, their investigation found students' accounts of frequent cheating on tests and exams when enrolled in online learning. Similar findings have emerged from empirical studies involving students from a variety of countries, including Bangladesh, Canada (Daniels et al., 2021), and the United States of America (Walsh et al., 2021), but little is known about Thailand. Among the key factors that contribute to students cheating on tests during COVID-19 online remote learning are stress and anxiety related to COVID-19 circumstances (Apridayani et al., 2023). Negative emotions impair one's ability to focus on learning. Moreover, both university lecturers and students acknowledged that online remote learning makes it easier for students to cheat due to the lack of supervision (Reedy et al., 2021). The findings from these latest studies on student cheating during COVID-19 online remote learning corroborate the conclusions of previous studies on online test cheating.
Concerns regarding the reliability and validity of formative and summative tests delivered during COVID-19 emergency teaching have also been voiced by EFL teachers. In fact, Ghanbari and Nowroozi's (2021) qualitative study revealed that EFL teachers saw cheating as a key problem and concentrated their efforts on reducing the likelihood of student cheating on online tests. Test results, particularly those from formative assessments, can be utilized to track student progress and serve as a benchmark for continuous improvement of student learning throughout the course. Cheating can skew test results so that they fail to reflect students' actual knowledge and skills, thereby misleading teachers in their choice of subsequent teaching and learning materials. More crucially, a study by Shoaib and Zahran (2021) discovered that weak students viewed COVID-19 online remote learning as an opportunity to obtain better grades through cheating. In this case, teachers would have a difficult time identifying weak students and providing suitable interventions to aid their learning. In other instances, high performers who do not cheat on online remote examinations but receive lower results are deemed weak and receive further learning treatments. These circumstances may result in unconscious misinterpretations of students' English learning progress. Unfortunately, empirical evidence on the subject is still lacking, and student cheating on online remote assessments, particularly in present online remote learning practice, has not been thoroughly investigated in online remote English classes. Moorhouse and Kohnke (2021) conducted a review of articles concerning online English classes during the COVID-19 pandemic, and their review found no indication that the ELT community had identified student cheating on online tests as a concern. Thus, the current study intends to address this research gap.

It is critical to highlight that EFL teachers continue to undertake summative and formative assessments in their online English classes, with some adapting assessment plans to the online environment and others maintaining the same assessment plans as in face-to-face learning (Zhang et al., 2021; Waluyo, 2020). Of the two, formative assessment is more likely to be compromised by student cheating because of its iterative nature throughout the learning process. The results will not assist teachers in identifying students' deficiencies, nor will they assist students in making greater overall academic progress, as Arnold (2016) suggested after examining students' scores on online formative tests at a Dutch university. That study substantiated instances of cheating in online tests by identifying irregular grade patterns that exhibited a negative correlation with students' academic progress. Throughout the pandemic era, the ELT professional community has been actively engaged in the development of process-oriented and formative assessment practices (Chung & Choi, 2021). Online formative assessments have been suggested to be critical in connecting assessment, teaching, and learning because they enable teachers to identify students' weaknesses during the learning process, provide appropriate feedback for students' learning improvement, and direct teachers' subsequent teaching approaches toward student learning enhancement (Gikandi et al., 2011). Yet, this type of assessment may be ineffective unless efforts are made to identify and resolve student cheating on online tests.
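Arnold's (2016) grade-pattern approach, flagging score jumps that correlate negatively with students' actual progress, can be sketched in a few lines. The data below are simulated purely for illustration; the variable names, sample size, and effect sizes are assumptions of this sketch, not values from the cited study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 120

# Simulated cohort: weaker students gain more from unsupervised tests,
# so the online "jump" is built to vary inversely with true ability.
true_ability = rng.normal(70, 10, size=n)            # latent proficiency
online_jump = rng.normal(1.0, 0.8, size=n) - 0.03 * (true_ability - 70)
summative = true_ability + rng.normal(0, 5, size=n)  # end-of-course grade

# Arnold-style check: if score jumps correlate negatively with real
# progress, the formative-score increase likely does not reflect learning.
r, p = stats.pearsonr(online_jump, summative)
print(f"r = {r:.2f}, p = {p:.4f}")
```

In real data, a clearly negative r would mark the grade pattern as irregular, whereas a correlation near zero or positive would be consistent with genuine improvement.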
COVID-19 online learning has also been considered an opportunity to apply technology-based formative assessments (Prastikawati, 2021; Waluyo & Apridayani, 2021). One such practice is the deployment of online applications that incorporate Interactive Response Systems (IRS), which enable teachers to identify students' strengths and weaknesses in real time. Students can also observe and track their formative test outcomes. Socrative is one of several IRS-based educational apps that have been applied in the online teaching and learning space. Students who took tests in an online class that utilized Socrative for formative assessment were pleased with the results, since they arrived promptly and simply (Abdulla et al., 2021), and teachers maintained some continuity and active learning in the classroom despite being in a different location (Christianson, 2020). Teachers can develop multiple-choice, true/false, and short-answer questions using Socrative, and can use a variety of delivery methods and settings when presenting the app as a formative test. Teachers can select Instant Feedback, which provides quick feedback to students once they respond to a test question. They can choose Open Navigation, which empowers students to answer questions in whatever order they choose. There is also the Teacher Pace option, which allows teachers to manage the flow of questions and monitor responses as they occur. All of these activities take place in real time and are accessible through smartphones, laptops, and computers. Nonetheless, Rofiah and Waluyo's (2020) quantitative study, which examined the use of Socrative for formative assessment in the classroom, highlighted both Thai EFL students' approval of Socrative as a means for administering vocabulary formative tests and the risk of student cheating during exams. It is reasonable to assume that the possibility of cheating will be greater when the app is used in online exam environments. However, actual evidence for this is still sparse, which the current study will explore.
Meanwhile, by gender, significant differences will likely be noticeable when female and male students vary in their levels of self-control, shame, perceived external sanctions, grades, and cheating intentions (Tibbetts, 1999). Given that gender serves as both a control variable (Finn & Frone, 2004) and a personal factor influencing cheating behavior (McCabe & Trevino, 1993), exploring gender differences is pivotal in understanding the motivations behind students reporting suspected academic dishonesty. In alignment with this perspective, Simon et al.'s (2004) study substantiated the relevance of gender in this context by uncovering a substantial contrast between male and female students. Their findings emphasized that female students, in particular, displayed a significantly greater inclination to report suspected instances of academic dishonesty, shedding light on the intricate interplay between gender and reporting behavior in academic integrity matters. Previous studies found that male students cheat more frequently, or have a higher perception of cheating, than their female classmates (Muntada et al., 2013; Zhang et al., 2018). Gender disparities in online test cheating, on the other hand, have not been sufficiently investigated.

METHOD

Research Design
The primary objective of this study was to identify the likelihood of student cheating on formative vocabulary tests conducted before and during online remote learning. To achieve this objective, it employed a quantitative research design with an emphasis on examining disparities in student performance between in-class vocabulary tests taken before and online remote vocabulary tests taken during COVID-19 online remote learning. The vocabulary tests were administered using the Socrative application in both learning modes. This study tracked students' vocabulary test scores across six general English courses, involving students from different cohorts and academic majors, prior to and during the emergency online learning at a university in the south of Thailand.

Setting
This study was conducted in the context of six mandatory General English (GE) courses that began on February 10, 2020, and ended on May 1, 2020, during the third academic term of 2019-2020. It involved 2,971 first- and second-year students from various academic majors, distributed across six different English courses. The detailed descriptions of the courses and the numbers of students involved are elaborated below and summarized in Table 1.

Course 1
The first English course was GE61-122, entitled "Academic Listening and Speaking," and was taken by first-year students. A total of 387 students enrolled. The course places an emphasis on English proficiency practice in both informal and formal settings. Through dialogues, passages, reports, and announcements, it focuses on listening and pronunciation. Moreover, through group discussions, oral presentations, and report writing, it aims to develop academic speaking skills.

Course 2
The second English course was GEN61-123, "Academic Reading and Writing," which was studied by first-year students. There were 1,171 students in all. This course is primarily designed to help students improve their reading and writing skills through a variety of academic texts and exercises. It specifically strengthens students' abilities to conduct critical readings of academic publications, summarize key concepts from texts, create various types of academic reports, compose effective paragraphs and essays, and appropriately use citations and references throughout the writing process.

Course 3
The third English course, GEN61-124, was taken by second-year students and was named "English for Academic Communication." There were 156 students in all. This course aims to improve students' understanding of the English language and their ability to communicate effectively in academic and professional settings. It equips students with the communication methods and abilities necessary for academic correspondence. Moreover, it teaches students how to properly acknowledge their sources, which results in more effective academic communication.

Course 4
The fourth English course, GE61-127, was taken by second-year students and was entitled "English for Presentation in Sciences and Technology." There were 150 students in all. This course focuses on the four key English abilities of listening, speaking, reading, and writing, with an emphasis on scientific phrases, structures, and terminology. Further, it instills in students the abilities required for effective presentation.

Course 5
The fifth English course was GEN61-128, "English for Humanities and Social Sciences Presentation," which was taken by second-year students. There was a total of 76 students. This course aims to teach students how to plan, organize, and deliver excellent presentations while focusing on content, structure, and delivery. It emphasizes several facets of oral presentation, such as pronunciation, volume, intonation, body language, gestures, and images.

Course 6
The sixth English course, GEN61-129, was taken by second-year students and was titled "English for Media and Communication." There were 76 students in total. This course aims to help students improve their English communication abilities by utilizing a variety of artistic and communicative media. These include teleconferencing, conducting interviews, producing simple news stories, developing engaging commercials, writing scripts for blog sites, voice recording and pronunciation techniques, using a teleprompter, and speaking from a script. It builds students' confidence in their English speaking and communicative abilities.

Course Design and Data
Each of the six courses implemented weekly formative vocabulary tests over a period of 10 weeks. Students were obliged to study fifty academic English words from provided lists each week. Beginning in either week 2 or week 3, students' vocabulary knowledge was assessed during the first ten minutes of class, before the main lessons took place. Each test lasted ten minutes and consisted of fifteen multiple-choice questions. Students completed ten vocabulary tests using Socrative.com over the course of ten weeks, accessing the tests on their smartphones while teachers proctored in the classroom. When the COVID-19 outbreak occurred, the students were in the middle of the academic term; therefore, they took half of the formative tests in class and the other half online. Researchers tracked students' vocabulary scores in the selected courses. The data were cleaned, including the removal of incomplete test scores. As presented in Table 1, 2,971 students' scores were kept for further analysis. Below is a sample formative vocabulary test administered through the Socrative application.

Target Words of the Formative Vocabulary Tests
Each of the six selected courses had a target vocabulary of 500 academic English words ranging from A1 to B1 on the CEFR (Common European Framework of Reference). The words were divided into ten lists that students were required to study independently at home, one list per week. Students were assigned to write definitions and sample sentences for each word in the vocabulary lists provided.
It was expected that this technique would enable students to acquire vocabulary on their own. In their independent vocabulary learning, students were encouraged to make the most of any available resources, such as dictionaries, Google Translate, etc. Students could also consult teachers about the words through Facebook if they wished to do so.

Procedure
The research procedures consisted of two phases. In the first phase, students took the formative vocabulary tests in class. At that time, the teaching and learning process was normal and the COVID-19 outbreak had not reached the area. This occurred from February 10 to April 1, 2020. The second phase was the period when students took the formative vocabulary tests online remotely, after the COVID-19 outbreak had reached the area. In response, the university moved all English classes online from April 2, 2020, to the end of the term on May 1, 2020. Except for the mode of learning, all vocabulary test procedures were kept the same as in the first phase. Table 1 shows the number of formative vocabulary tests that students took in class and remotely online. Figure 3 illustrates a sample of the data collection procedure. All the courses had an equal number of tests in class and online, except for Course 1 and Course 4, as shown in Table 1.
Below are the procedures carried out in each research phase:

First Phase: In-Class Formative Vocabulary Tests
Each course prepared 500 target words, divided into ten vocabulary sets, prior to the start of the term. Each set had fifty words that students were required to study weekly, beginning in week two or three, depending on the course's lesson schedule. Each ten-minute test comprised fifteen multiple-choice questions, asking about the meanings of words, their parts of speech, synonyms, and antonyms, as well as sentence completion. Students completed the test by accessing Socrative.com via their smartphones. The teacher could monitor student progress from the classroom computer, display it on the projection screen, and roam around the room to prevent students from cheating.

Second Phase: Online Formative Vocabulary Tests Remotely
Due to the COVID-19 pandemic, the Thai government issued a national emergency decree on March 26, requiring Thai universities, including the site of this research, which was in the midst of the academic year 2019-2020, to transition from face-to-face to fully synchronous online learning on April 2, 2020. Teachers conducted lessons using a variety of conferencing platforms, including Zoom, Webex, and Microsoft Teams. When teachers administered vocabulary tests, they were able to track students' progress solely through their personal computers. They were unable to supervise the tests effectively due to several constraints, including the limited number of students visible per monitor, a lack of equipment, unfamiliarity with online teaching, and time management. These limits created an environment conducive to disobedience and cheating during the tests.

Data Analysis
This study used IBM SPSS 25 for data analysis. Following collection, the data were cleaned, computed in SPSS, and prepared for analysis. Incomplete scores from absent students were not considered; only students' complete scores from tests one to ten were included in the analysis for all courses. To answer the first research question, multiple paired-sample t-tests were performed, comparing students' formative vocabulary in-class and online test scores separately for each course. Independent t-tests were used to examine the second research question.
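Although the study ran its analyses in SPSS, the shape of both tests can be reproduced with SciPy. The sketch below uses simulated scores; the sample size, score scale, and score shift are illustrative assumptions, not the study's dataset.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200

# Hypothetical per-student mean scores (0-15 scale) for one course:
# in-class tests (phase 1) vs. online remote tests (phase 2).
in_class = rng.normal(10.0, 2.0, size=n).clip(0, 15)
online = (in_class + rng.normal(1.5, 1.0, size=n)).clip(0, 15)

# RQ1: paired-sample t-test -- same students, two testing conditions.
t_paired, p_paired = stats.ttest_rel(online, in_class)

# RQ2: independent t-test -- female vs. male students' online scores.
is_female = rng.random(n) < 0.5
t_ind, p_ind = stats.ttest_ind(online[is_female], online[~is_female])

print(f"paired: t = {t_paired:.2f}, p = {p_paired:.4g}")
print(f"independent: t = {t_ind:.2f}, p = {p_ind:.4g}")
```

The paired test compares each student with themselves across the two phases, while the independent test compares the two gender groups within one phase, mirroring the two research questions.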

DISCUSSION
This study identified the likelihood of student cheating on formative vocabulary tests conducted before and during online remote learning. It adopted the approaches employed by previous studies: comparing students' test score results in offline and online settings (e.g., Brallier & Palm, 2015; Chuang et al., 2017; Ranger et al., 2020) and exploring students' grade patterns (Arnold, 2016). The first analysis showed that students' test scores increased significantly when the tests moved to online remote settings in five courses. The increase was not statistically significant in one course, Course 5, which had the smallest sample size of the six. The descriptive patterns of the students' scores, as shown in Chart 1, confirmed that they achieved greater scores during online remote learning than during the preceding face-to-face learning. Thus, these findings corroborate previous studies indicating that students perform better on tests in online remote contexts (Chuang et al., 2017; Fask et al., 2014; Ranger et al., 2020). Given the notable increments observed across the majority of the chosen courses, this study aligns with Arnold's (2016) findings, suggesting the possibility that instances of student irregularities in online formative tests took place. These occurrences, if indeed present, could have a discernible impact on students' formative scores.

Several pedagogical implications emerge from the study's findings. Teachers are urged to exercise caution when interpreting students' formative test scores. Now that the study has established the potential for cheating, test scores may not accurately reflect students' true abilities. Teachers should be aware that weak students perceived the assessments administered during online remote learning as opportunities to cheat their way to a better grade (Shoaib & Zahran, 2021; Taherkhani & Aref, 2024). Teachers are urged to utilize paraphrased test questions for which the solutions are not readily available online (Golden & Kohlbeck, 2020) while tightening the proctoring process through the use of webcam recording software (Dendir & Maxwell, 2020). Moreover, formative evaluation cannot be treated in the same way as in face-to-face learning. Indeed, online formative assessments are crucial as a benchmark for differentiating the learning aid provided to students throughout the learning process (Gikandi et al., 2011). Nonetheless, teachers must monitor students' behaviour and performance in online remote learning classes. Teachers may wish to ask students who do poorly or well on formative assessments to validate their expected competencies. This type of technique may assist teachers in determining the validity of students' formative test results.
Furthermore, these initial results contribute to our understanding that, while employing technology-based formative assessments appears to be a smart idea, the risk of student cheating has been observed in both online and offline situations. Rofiah and Waluyo (2020) discovered that, although students accepted Socrative.com as a means for conducting vocabulary formative tests, they acknowledged using an online dictionary, chatting with online peers, and browsing the internet during formative tests on Socrative. These activities become even more convenient when teachers and students are in different locations in online classes. Students can create excuses for not turning on their cameras during COVID-19 online learning, such as poor internet connections or the lack of a camera. Even when students activate their cameras, teachers' visibility remains limited, particularly in large classes (Koçer & Köksal, 2024). In the current study, the online test setting did have teacher proctoring, and while it did not work perfectly, it could be a solution compared to a case with no teacher proctoring at all. For formative assessment, this study advocates that online tests administered with IRS-based technology such as Socrative account for only a small portion of a student's grade. Because online test results would then have a decreased impact on a student's grade, cheating during tests may be minimized.
The subsequent statistical analysis revealed that female and male students fared equally well on formative vocabulary tests prior to and during COVID-19 online learning in five courses, with the exception of Course 5, where a significant difference was observed. Given that all five of those courses had larger student populations than Course 5, this study only partially corroborates earlier research indicating that male students are more likely to cheat than female students (Muntada et al., 2013; Tibbetts, 1999; Zhang et al., 2018).
The current study's findings may indicate that cheating on tests in online environments differs from cheating in offline ones. Prior research indicates that online learning not only increases opportunities for cheating (Brallier & Palm, 2015; Harmon & Lambrinos, 2008; Pratiwi & Waluyo, 2022), but also causes a slew of negative emotions in students, such as stress, anxiety, and worry, especially during the COVID-19 pandemic, and introduces technical difficulties and personal discomforts. On this basis, this study asserts that students, regardless of gender, will cheat on online formative tests. COVID-19 inherently generates negative emotions and insecurity in students, whether about their personal safety and that of their families or about their academic performance in terms of grades, causing students to perceive formative tests as difficult, which can result in cheating as a temporary and easy solution (Apridayani, 2022; Wenzel & Reinhard, 2020).

CONCLUSION
After assessing students' performance on formative vocabulary tests prior to and throughout online remote classes in English courses, this study concluded that the considerable rise in scores on online remote tests indicates the likelihood of cheating. However, the supposition that male students cheat more frequently was not proven, contradicting findings from offline assessments. Given the study's shortcomings, this study is best suited as a pilot study for attempts to uncover student cheating on formative tests in remote online English classes. Had qualitative interviews with students been conducted, the study would have garnered further insights; however, language barriers and the restrictions of the COVID-19 outbreak prevented the researchers, who were foreigners, from conducting them. As online remote English classes continue, the researchers hope that the study's findings will alert English teachers to the possibility of cheating and how to address the issue.

Table 1
Students' Data

Table 3 exhibits the detailed results for each course. Chart 1 illustrates the differences in students' scores.

Table 3
Results of T-tests across the Six Courses

Table 4
Results for Cohen's d Effect Size across the Six Courses
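For paired designs such as this study's, Cohen's d is commonly computed as the mean of the score differences divided by the standard deviation of those differences (one of several paired-d variants; the paper does not state which variant it used). A minimal sketch with made-up scores, not the study's data:

```python
import numpy as np

def cohens_d_paired(before, after):
    """Cohen's d for paired samples: mean difference / SD of differences."""
    diff = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    return diff.mean() / diff.std(ddof=1)

# Illustrative scores (0-15 scale) for ten hypothetical students.
in_class = np.array([9, 10, 11, 8, 12, 10, 9, 11, 10, 8], dtype=float)
online = np.array([11, 12, 12, 10, 13, 12, 10, 13, 12, 10], dtype=float)

d = cohens_d_paired(in_class, online)
print(f"Cohen's d = {d:.2f}")  # positive d: higher scores online
```

By convention, |d| around 0.2 is a small effect, 0.5 medium, and 0.8 large, which is how per-course effect sizes like those in Table 4 are typically interpreted.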