4.1.1.1. The prior experiences
This section addresses the question “What were the prior experiences of participants with online assessment?”, which was designed to determine the students' prior experience with online assessment (see Chart 1 and Table 1).
Students' competence with | Advanced (%) | Good (%) | Introductory (%) | Poor (%) | None (%)
Web browser | 25 | 54 | 17 | 4 | 0
Chat | 20 | 39 | 33 | 8 | 0
Email | 0 | 21 | 17 | 59 | 3
Other online platforms | 4 | 8 | 17 | 38 | 33

Table 1: Percentage Distribution of Students’ Computer Competence
As the results indicate, nearly half of the students (46%) had no prior experience with online assessment, while 54% had used it before (Chart 1). Only 4% of students rated their web-browser ability as “poor”. An introductory proficiency level was sufficient for efficient use of the online assessment instrument.
The total percentage of students at or above the introductory competency level with the web browser was 96%. Students were given practice quizzes and were taught the key features of the online assessment tool prior to the final test; as a result, issues caused by browser usage were eliminated. Students should be trained in the use of online assessment tools regardless of their prior computer and internet skill levels.
4.1.1.2. The user interface

Evaluation of User Perception towards Online Assessment (Total: 50 students); cells show % (n)

Item | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Mean | SD
1. Overall framework and operation levels of the system are clear and smooth | 24 (12) | 34 (17) | 36 (18) | 6 (3) | 0 (0) | 3.76 | 0.88
2. Overall interface operation method is easy and appropriate | 18 (9) | 34 (17) | 34 (17) | 10 (5) | 4 (2) | 3.52 | 1.02
3. Log-in interface is clear and easy to operate | 22 (11) | 34 (17) | 30 (15) | 10 (5) | 4 (2) | 3.60 | 1.05
4. Register interface is clear and easy to operate | 24 (12) | 28 (14) | 42 (21) | 6 (3) | 0 (0) | 3.70 | 0.90
5. Exam interface is clear and easy to operate | 26 (13) | 38 (19) | 30 (15) | 6 (3) | 0 (0) | 3.84 | 0.88
6. Past exam results interface is clear and easy to operate | 18 (9) | 54 (27) | 18 (9) | 10 (5) | 0 (0) | 3.80 | 0.84
7. Statistical evaluation interface is clear and easy to operate | 16 (8) | 48 (24) | 26 (13) | 10 (5) | 0 (0) | 3.70 | 0.85
8. Exam result interface is clear and easy to operate | 28 (14) | 40 (20) | 26 (13) | 9 (3) | 0 (0) | 3.96 | 0.94
9. Help page interface is clear and easy to operate | 22 (11) | 36 (18) | 12 (6) | 14 (7) | 16 (8) | 3.34 | 1.38
Total mean (%) | 22.0 | 38.5 | 28.2 | 9.0 | 2.7 | 3.69 |

Table 2.1: Frequencies, Percentages and Means of Student Agreement in Online Assessment System “Screen and Interface Design” (Pre)
Table 2.1 shows the system assessment components in terms of screens and interface. The users' mean level of agreement is fairly high, at 3.69, and almost every standard deviation is smaller than one, indicating that users held broadly similar views about the user interface. According to the survey, the overall framework, the configuration of colors and background, the layout of screen and window design, and the overall interface operating method all received good ratings. The appropriateness of the screen design and the ease of use of the interface functions were likewise rated highly and evenly. However, 30% of respondents said the help page interface was confusing and difficult to use. That item also has the largest standard deviation (1.38) and the lowest mean in the category (3.34), which shows that users did not share a common opinion on whether the help page was useful. The trend for this question is negative compared with the other questions in the questionnaire. Although all users were instructed to read the help page, most did not and instead went straight to the exam pages. Some participants said the help page interface was not clear and easy to use because of reading difficulties. Online assessment systems should therefore provide more effective and simpler help pages that match learners' needs.
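The item means and standard deviations reported in Table 2.1 can be reproduced from the frequency counts by scoring the Likert responses from 5 (Strongly Agree) down to 1 (Strongly Disagree). The sketch below is illustrative; the function name is ours, and the use of the population (rather than sample) standard deviation is an assumption that happens to match the reported values.

```python
import math

def likert_stats(counts):
    """Mean and SD of a Likert item from frequencies ordered [SA, A, N, D, SD]."""
    scores = [5, 4, 3, 2, 1]  # Strongly Agree = 5 ... Strongly Disagree = 1
    n = sum(counts)
    mean = sum(s * c for s, c in zip(scores, counts)) / n
    # Population variance; assumed here because it reproduces the SDs in Table 2.1.
    var = sum(c * (s - mean) ** 2 for s, c in zip(scores, counts)) / n
    return round(mean, 2), round(math.sqrt(var), 2)

# Item 1 of Table 2.1 (n = 50): counts 12, 17, 18, 3, 0.
print(likert_stats([12, 17, 18, 3, 0]))  # (3.76, 0.88)
# Item 9 (help page): counts 11, 18, 6, 7, 8.
print(likert_stats([11, 18, 6, 7, 8]))   # (3.34, 1.38)
```

The category "total mean" (3.69) is then simply the unweighted average of the nine item means.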
On average, 86% of users rated the login interface as straightforward and simple to use; only 14% said it was difficult to use. The standard deviation for that item is slightly above one (1.05), indicating some spread in students' views of the login interface. According to 94% of users, the register interface is straightforward and easy to use, and the design of the register interface is adequate. Likewise, 94% of users on average rated the exam interface as straightforward and simple to use.
On average, 93% of users found the statistical evaluation interface straightforward and simple to use. The design of the past exam results interface was appropriate for 90% of the users and inappropriate for 10%. This result can be explained by the fact that that section of the online assessment instrument was not often used by students, which may have influenced their judgment. On average, 91% of users said the exam result interface was clear and simple to use and that its design was appropriate. Although the mean values in the user interface evaluation exceed the midpoint (3.69 overall), it can still be concluded that several areas leave considerable room for improvement.
4.1.1.3. The systematic use
The question “What are participants’ perceptions about the systematic use of the online assessment platforms?” examined the various aspects of systematic use.
Evaluation of User Perception towards Online Assessment (Total: 50 students); cells show % (n)

Item | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Mean | SD
1. I have browsed among platforms easily | 22 (11) | 46 (23) | 26 (13) | 6 (3) | 0 (0) | 3.84 | 0.83
2. I have followed the directions without any problem | 44 (22) | 28 (14) | 22 (11) | 6 (3) | 0 (0) | 4.10 | 0.94
3. It is easy to register to the system | 44 (22) | 28 (14) | 22 (11) | 6 (3) | 0 (0) | 4.10 | 0.94
4. It is easy to take an exam | 46 (23) | 28 (14) | 20 (10) | 6 (3) | 0 (0) | 4.14 | 0.94
5. Easier to correct work | 30 (15) | 44 (22) | 20 (10) | 6 (3) | 0 (0) | 3.98 | 0.86
6. Easy to use and comfortable | 36 (18) | 36 (18) | 22 (11) | 6 (3) | 0 (0) | 4.02 | 0.91
7. I often visit the past exam result page | 28 (14) | 40 (20) | 18 (9) | 14 (7) | 0 (0) | 3.82 | 0.99
8. Help page made me use the platform better | 16 (8) | 38 (19) | 28 (14) | 18 (9) | 0 (0) | 3.52 | 0.96
9. Seeing the time left makes me progress better | 36 (18) | 32 (16) | 22 (11) | 6 (3) | 4 (2) | 3.90 | 1.08
Total mean (%) | 33.6 | 35.6 | 22.2 | 8.2 | 0.4 | 3.94 |

Table 3.1: Frequencies, Percentages and Means of Student Agreement in Online Assessment System “System Use” (Pre)
Table 3.1 depicts the system use aspects (items 1-9). The means range from 3.52 to 4.14, and the standard deviations are mostly smaller than one. Practically all users therefore held similar views on how to use the system, and respondents navigated the site pages without incident. The difficulty with the help page reappears here: 18% of users said the help page had no influence on better system use, so it bears repeating that the help page needs to be enhanced for greater usability. On average, 94% of users said that browsing between pages was simple, that directions were straightforward to follow, that registering for the system and taking the exam were easy, that the system was easy and comfortable to use, and that corrections could be made quickly.
The total mean value for system use is nearly 4.00; such a high score may be due to the initial instruction in system use and the sample quizzes completed before the final test.
4.1.1.4. The Impact
The question “What are participants’ perceptions about the impact of the online assessment platforms on the learning process?” addressed the impact of the platforms.
Evaluation of User Perception towards Online Assessment (Total: 50 students); cells show % (n)

Item | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Mean | SD
1. Assessment is fair | 40 (20) | 32 (16) | 22 (11) | 6 (3) | 0 (0) | 4.06 | 0.93
2. It is hard to cheat | 20 (10) | 14 (7) | 14 (7) | 32 (16) | 20 (10) | 2.82 | 1.42
3. System feedback helps me to reflect on my merits in learning | 18 (9) | 62 (31) | 18 (9) | 2 (1) | 0 (0) | 3.96 | 0.66
4. Tracking past exam results makes me understand my progress | 40 (20) | 36 (18) | 14 (7) | 6 (3) | 4 (2) | 4.02 | 1.07
5. Statistical evaluation page gives detailed information on units where I am good or unsuccessful | 18 (9) | 30 (15) | 38 (19) | 14 (7) | 0 (0) | 3.52 | 0.94
6. It helps me to better understand my growth and improvements by using the system | 14 (7) | 48 (24) | 32 (16) | 6 (3) | 0 (0) | 3.70 | 0.78
7. It helps me to learn by using this system | 22 (11) | 40 (20) | 28 (14) | 10 (5) | 0 (0) | 3.74 | 0.91
8. I hope to use systems in other classes as well | 36 (18) | 28 (14) | 26 (13) | 6 (3) | 4 (2) | 3.86 | 1.10
9. Diverse types of questions make me feel better during the exam | 36 (18) | 32 (16) | 14 (7) | 18 (9) | 0 (0) | 3.86 | 1.10
Total mean (%) | 27.1 | 35.8 | 22.9 | 11.1 | 3.1 | 3.73 |

Table 4.1: Frequencies, Percentages, and Means of Student Agreement about “Impacts on Learning Process” (Pre)
The overall mean value in the “Impacts on the learning process” category is 3.73, the lowest among the categories of the questionnaire. This implies that new features should be created or added to the online assessment website to maximize its impact on the learning process. Except for the cheating item, standard deviations in this category are around 1.00, implying that students largely shared similar views about the online assessment tool. The most significant issue is that 52% of users said cheating was easy in the online assessment instrument, which lowered the total mean value of the category. The standard deviation for that question was 1.42, indicating that students had differing views on how difficult it was to cheat in the exam. To discourage cheating, questions were presented in a random order and the position of the answer options varied from user to user; in addition, all tests were completed in labs under the supervision of proctors. Based on the research findings, a new method of preventing cheating should still be devised. Aside from the cheating issue, 94% of participants believed the evaluation was fair.
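The anti-cheating measure described above, randomizing question order and answer-option positions per user, can be sketched as follows. This is only an illustration of the technique, not the system's actual implementation; the data layout, the function name, and the use of the student ID as a random seed are our assumptions.

```python
import random

def shuffled_exam(questions, student_id):
    """Return a per-student copy of the exam with the question order
    and the answer-option positions shuffled deterministically."""
    # Seeding by student ID gives each student a stable but distinct ordering.
    rng = random.Random(student_id)
    exam = []
    for q in rng.sample(questions, k=len(questions)):  # shuffle question order
        options = q["options"][:]                      # copy, leave the original intact
        rng.shuffle(options)                           # shuffle option positions
        exam.append({"text": q["text"], "options": options})
    return exam

questions = [
    {"text": "Q1", "options": ["a", "b", "c", "d"]},
    {"text": "Q2", "options": ["a", "b", "c", "d"]},
    {"text": "Q3", "options": ["a", "b", "c", "d"]},
]
# Two students receive the same questions, generally in different orders.
print([q["text"] for q in shuffled_exam(questions, student_id=101)])
print([q["text"] for q in shuffled_exam(questions, student_id=202)])
```

A deterministic per-student seed also lets the grader reconstruct exactly which option order each student saw.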
Another item with a low mean value (3.52) in this category is “Statistical evaluation page gives detailed information on units where I am good or unsuccessful”. Because this page is more sophisticated than the others, it may cause confusion, and extra effort may be required to grasp what it shows; that might explain the low mean. On average, 94% of users said that system feedback, tracking previous exam outcomes, and questions appearing page after page had a beneficial influence on the learning process, and 90% said they would like to take this type of evaluation in subsequent courses as well. Based on these findings, it may be concluded that students have favorable attitudes toward online testing.
4.1.1.5. Participants’ Opinion
The question “What are the participants’ opinions about the online assessment platforms?” investigated the general perceptions of students toward the online assessment tool.
Evaluation of User Perception towards Online Assessment (Total: 50 students); cells show % (n)

Item | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Mean | SD
1. System provides immediate feedback | 14 (7) | 44 (22) | 36 (18) | 6 (3) | 0 (0) | 3.66 | 0.79
2. Less anxious | 16 (8) | 36 (18) | 18 (9) | 14 (7) | 16 (8) | 3.22 | 1.32
3. Better than paper-and-pencil form | 66 (33) | 10 (5) | 14 (7) | 6 (3) | 4 (2) | 4.28 | 1.15
4. Consistent with the teaching style | 26 (13) | 40 (20) | 32 (16) | 2 (1) | 0 (0) | 3.90 | 0.81
5. Faster than paper-and-pencil | 56 (28) | 32 (16) | 6 (3) | 2 (1) | 4 (2) | 4.34 | 0.97
6. Contemporary | 58 (29) | 18 (9) | 18 (9) | 2 (1) | 4 (2) | 4.24 | 1.07
7. More systematic | 36 (18) | 36 (18) | 22 (11) | 2 (1) | 4 (2) | 3.98 | 1.01
8. Can be applied to other lessons | 24 (12) | 48 (24) | 14 (7) | 14 (7) | 0 (0) | 3.82 | 0.95
Total mean (%) | 32.9 | 29.3 | 17.8 | 5.3 | 3.6 | 3.49 |

Table 5.1: Percentages and Means of Student Agreement in Online Assessment System (Student Opinions) (Pre)
Students' general feelings about the online assessment instrument were examined in the "Student Opinions" category. The system offered quick feedback according to 94% of users. 90% of users believed that online assessment is superior to paper-and-pencil assessment, and 94% believed that it is faster. 96% of users considered online evaluation more modern and systematic, and nearly all users agreed that this type of online evaluation was consistent with the teaching style. However, 30% of users in this category disagreed that online testing made them less anxious. The greatest standard deviation, 1.32, belongs to the "less anxious" item, which also has the lowest mean value in the category; because of that spread, students' responses to this question must be considered mixed. Compared with the other items in the questionnaire, then, students did not consider this evaluation technique clearly less stressful than paper-based tests.
Users may react this way because they are unfamiliar with the online assessment system; it is expected that once they become acquainted with it, this reaction will fade.
4.1.2. Pre-interview
4.1.2.1. Teachers (n=3)
Common online assessment methods | Benefits
Multiple-choice questions | High-quality audio system
Reflective writing | Conserving students' time
Graded homework | Scoring objectivity
 | Transitioning from theory to practice
 | Making use of blended learning

Table 6: Common online writing assessment methods
The tables summarize the main themes and subthemes that emerged from the interviews with three teachers. They show that teachers used only three assessment tools to evaluate their students' performance, with multiple-choice questions taking precedence. English teachers faced a broader range of challenges than opportunities, and they reported dealing with issues related to online assessment in a variety of ways, either alone or with the assistance of others. Finally, when asked about opportunities to use online assessment, teachers mentioned some benefits, but with less certainty, and they explained why in the excerpts listed below. The findings are examined further in the discussion section, which includes direct quotes from the interviews.
The data from the interviews were analyzed to learn more about the difficulties faced during online testing. Insufficient experience is the most crucial difficulty that instructors confront, and this one issue spawned others, such as technological difficulties and problems monitoring students. For example, the following excerpts come from interviews in which teachers described insufficient experience with online assessment as a difficulty.
Teacher A:
“One of the primary issues, in my opinion, is insufficient experience. For example, due to the constraints of online training, some learning goals may be overlooked and may be unmet (e.g., teamwork).”
Teacher B:
“The main difficulty I confront is that standard evaluation methods (such as those utilized in today's schools) are no longer valid. I have to devise further methods of ensuring academic integrity.”
Almost all of the instructors interviewed expressed concerns about plagiarism, cheating, and verifying the identities of the individuals taking assessments. The majority stated that online tests permitted students to copy from one another or from internet sources.
The following are reactions from teachers expressing their discontent with plagiarism and cheating in online tests, and with some students' lack of enthusiasm: students offered excuses for not doing the assessment, knowing that teachers could not verify those excuses.
Teacher C:
“As an examiner, my worry about real test attempts is magnified in online assessment methods since I have no means to check that the exam is being attempted by the student.”
Teacher B:
“They often gave me various reasons why they can't do the assessments which maybe involves in the internet, the difficulty in using online platforms and so on. The trick is that students knew I cannot check the accuracy of those reasons which also helped them not do the assessment.”
The three teachers were perplexed as to how they might ensure that their students were not copying answers from one another. They suggested that online assessments might lead to academic dishonesty by giving students opportunities to obtain high grades through cheating, which is consistent with earlier research on the issues of online assessment.
The three teachers claimed that online testing had increased their workload. Aware that students might cheat, they resorted to extra labor such as preparing multiple test versions or question banks, which takes time and energy. Even instructors' roles as academic advisors suddenly moved online, adding to the pressure.
Teacher A complained about the added effort, saying:
“The online assessment is time-consuming, particularly when one considers the number of tests every semester, the number of students in sections, and the number of courses one teaches.”
Furthermore, the three teachers complained about recurring technical difficulties and internet disconnections that affected the reliability and validity of the tests, which could not be achieved unless all students took the exams under the same conditions. The online platforms frequently malfunctioned and required technical support, which could be inconvenient at test time; some teachers found this tiresome, and students found it stressful. Teachers complained that some students had connection difficulties or other technical challenges that disrupted the smooth flow of the assessment process. While taking their tests, students spent time calling technicians to have their difficulties resolved; as a result, their attention was distracted, which occasionally led to dissatisfaction because they were unable to focus and continue as before.
The three teachers also noted that the tool used for summative assessment provided limited alternatives for framing questions because it was built for all courses; consequently, the kinds of questions compatible with the program restricted the range of questions critical to achieving the course learning goals. Finally, they reported that students lacked the skills to budget their time during online tests and constantly requested more time to respond. According to these teachers, students were overly anxious about their attempts and occasionally doubtful about the checking or grading of their responses; as a result, many of them were inquisitive to the point of being bothersome at times.
From the teachers' perceptions, the main challenges to administering online writing assessments are:
Inadequate experience
Technical issues
Time commitment
Monitoring students
Students’ cheating
Test anxiety
Self-discipline
Less concentration
According to the interview findings, teachers face significant challenges when assessing students online. Items related to a lack of physical interaction, assessing speaking and translation courses, a high risk of cheating and plagiarism, technical difficulties, and assessing a large number of students scored highest among the factors, revealing the serious impact of such factors on assessment in full-time e-learning. The most pressing issue for teachers was ensuring the integrity of online assessments.
Advantages of online assessments (pre-interview, overall):
1. Economical, since it saves effort, time, and money.
2. Flexible: it can be applied before, during, or after the explanation.
3. Digital assessment tools are more efficient.
4. Teachers can monitor student progress in real time.
5. Personalized learning and its significance for students' outcomes.
6. Provides direct feedback and easily corrects misconceptions.
7. Significantly reduces the monitoring burden when examining large numbers of students.
8. Decreases the workload of teachers by saving time spent on routine work.
9. Allows teachers to quickly evaluate the performance of the group against the individual.
10. All data can be stored on a single server.

Disadvantages of online assessments (pre-interview):
1. Assessing practical skills (writing) is difficult and not accurate.
2. Challenges in technology adoption by students and teachers.
3. Difficulties in evaluating long-answer question types.

Table 7: Advantages and disadvantages of online writing assessments