Special Issue: Implications of COVID-19 on Higher Education

Are Universities Using the Right Assessment Tools During the Pandemic and Crisis Times?

All industries have been affected by the COVID-19 pandemic and have worked to develop alternative strategies and actions to survive and continue business operations; the education sector is no exception. University administrators and instructors have faced challenges in finding the appropriate mechanisms to manage the final examination process. This essay suggests that project-based learning (PBL) assessment could be an effective alternative to online examinations. It advocates the adoption of PBL by highlighting the challenges/pitfalls associated with online exams supported by proctoring software tools.


Introduction
In response to lockdowns implemented as a result of the coronavirus disease 2019 (COVID-19) outbreak, universities have shifted to teaching and administering examinations online. A typical solution adopted by many universities is to conduct exams online using software tools that can also provide proctoring, such as Respondus (https://web.respondus.com/), Proctorio (www.proctorio.com), or ProctorU (https://www.proctoru.com/). These tools can monitor students taking the exam from home by recording live video of the students and, in some cases, the surrounding environment (e.g., Respondus). Two proctoring techniques are typically used with these tools. The first uses a live proctor who watches students during the exam via webcam (e.g., ProctorU). In the second technique, an artificial intelligence biometric system (e.g., Respondus) analyzes the recorded videos of students and flags any suspicious cases.

The Challenges of Using Proctoring Tools
These proctoring software tools raise several concerns. First, the technical capability and readiness of universities, faculties, and students must be assessed. Universities need robust infrastructure that can accommodate a large number of students who must access the server at the same time (as several exams are often scheduled simultaneously) and whose proctored videos are recorded and uploaded to the university server. In addition, students need to be trained to use the proctoring tools that are integrated into learning management systems such as Blackboard and Moodle. IT teams can manage the technical readiness issue by extending server scalability and by training students with video demonstrations that provide step-by-step instructions on how to use the proctoring tool and access the exam online. Nevertheless, in some countries, not all university infrastructures are capable of administering exams online using these proctoring software tools, as is the case in India (Sharma, 2020) and Egypt (Khaled, 2020). Second, some courses require students to design or draw diagrams or perform mathematical calculations, but students cannot access other applications and tools once the exam has started, nor can they draw or write their answers on a piece of paper, scan it, and attach it. Thus, the tools fail to fulfill the requirements of some courses. Although the instructor might allow students to access certain external applications (e.g., Respondus allows access to a calculator or website), some applications (e.g., MS Access, IBM Rational Rose) cannot be accessed because they are not available through the Internet. Moreover, students cannot access their desktops to attach files unless this restriction is disabled.
Most importantly, proctoring tools cannot completely prevent students from cheating and, even if suspicious cases are detected, proving the intention to cheat might be difficult. Mitra and Gofman (2016) state that detecting cheating in online exams is more difficult than in traditional exams, which are controlled in the classroom with the presence of live proctors. In online exams, students can consult unauthorized sources (e.g., the Internet, their mobile phones, and friends) or have others take the exam for them. King et al. (2009) conducted empirical research to identify students' perceptions of cheating in online exams. They found that 73.6% of the surveyed students believed that it is easier to cheat in an online exam than in a traditional exam. A similar result was obtained by recent research in Norway, in which 212 students and 162 instructors were surveyed. The researchers found that both students and teachers perceived that cheating was easier with online exams (Chirumamilla et al., 2020).
Proctoring tools use artificial intelligence to detect cheating: when a student moves their body, face, or head away from the camera, the tool flags the moment as suspicious behavior (Mitra & Gofman, 2016). Many concerns have been raised in this context: students might move their bodies or make gestures for many valid reasons (e.g., nervousness or stress because of the exam) that cannot be proven to indicate an intention to cheat, especially since students are used to their home environment, meaning their movements will be more spontaneous and natural than in a real classroom environment. Other examples illustrate how difficult it is to prove cheating intention, such as a female student covering her face with a hijab or a student needing to use the washroom. In addition, students can employ multiple devices (e.g., smartwatches, smartphones, and laptops) as well as personal assistants who are hidden in the student's home or room and cannot be captured by the camera (Jalali & Noorbehbahani, 2017), which is typically placed in a fixed position and is usually built into the computer. Incidents such as an Internet disconnection, a power outage, or a family emergency might distract a student's attention during an online exam and might be a legitimate reason to allow the student to retake the exam. However, whether such an incident actually occurred would be difficult to verify. Proctoring software tools do not help in these contexts.
The use of proctor software assumes that students have access to the needed technology for online examinations (i.e., high-speed Internet and laptop with a camera and microphone) and that students' home environments are similar to a university classroom environment. Woldeab et al. (2017) investigated students' perceptions about proctored online exams prior to an exam. The results showed that 38% of the surveyed participants (a total of 865 students) reported that they were not at all confident that they had the necessary equipment for the exam, and the majority (52%) reported that they were not confident or only somewhat confident that they had the technological expertise to set up and complete online exams. Furthermore, previous research has shown that the use of online proctored exams increases students' anxiety due to general wariness about the use of technology in addition to the fear of testing (Woldeab et al., 2017;Woldeab & Brothen, 2019).
The use of these software tools also raises privacy concerns, as universities and the software vendor (e.g., Proctorio) have access to students' personal spaces in addition to biometric and exam data. Concerns about the video recordings of students include how they will be used after the exam and for what purposes, as well as who will have access to them. Mitra and Gofman (2016) found that 43% of surveyed students reported that they were not comfortable taking an online exam while being monitored via biometrics, and the majority (63%) considered such monitoring to be an invasion of their privacy. Privacy concerns led university students in Australia and the Netherlands (Doffman, 2020; DutchNews, 2020; Zhou, 2020) to sign an online petition opposing the use of these software tools. Universities have the right to monitor students during examinations and exercise full control over students through regulations implemented on campus; these rights and regulations do not extend to students' home environments. However, one might argue that online universities have been conducting teaching and exams online for years, so the issue of proctoring online exams is not new, and universities have already successfully implemented online proctoring, taking into consideration the related privacy legislation. Indeed, this is correct in cases where students make a choice from the outset to study at universities that are known to apply certain controls to online exams to ensure the integrity of the examination process. In contrast, in the case of the pandemic, online exams have been introduced to students at traditional universities, and these students might not accept video recordings since they were not aware of their use when they registered.
Some prior empirical research has found that students' grades significantly decreased after they took an online exam using a proctoring software tool (Alessio & Maurer, 2018;Alessio et al., 2017;Davis et al., 2016). However, the opportunities to cheat could benefit some students who might achieve higher grades due to cheating (Nader et al., 2019;Varble, 2014). Thus, the grading of online examinations may be unfair to students, as their GPAs may not reflect their real capabilities. This could negatively affect a university's reputation: if the GPAs of students do not accurately reflect their ability, the value and credibility of the credential offered by the university might diminish (Delbert et al., 2020;Nader et al., 2019).

An Alternative Solution: Project-Based Assessments
The top 10 universities in the world, according to the QS World University Rankings, have adopted several approaches to deal with exams and assessments, given the limitations of proctoring software tools. For instance, at the time of writing this paper, the University of Oxford (2020) has cancelled all examinations for first-year undergraduates and students are deemed to have passed, while examinations for second- and third-year students and postgraduates have been deferred to the next academic year. The University of Cambridge Faculty of Law (2020) decided to conduct exams online on an open-book basis with word limits and fewer questions, so they are more akin to an essay assignment. University College London (UCL) (2020) viewed it as unacceptable to postpone exams as this might affect students' progression to graduation, so it introduced capstone assessments as an alternative to exams for all UCL first-year students. For all other students, the university adopted alternative assessments, such as timed 24-hour open-book exams, longer-term assignments, and capstone assignments. As previous research reported, detecting cheating is more difficult in online exams, so this might explain why top-rated universities have opted for open-book exams, capstone projects, or exam deferral, as online exams proctored with software tools might not reflect students' actual abilities.
In response to the technical challenges related to using online proctoring software, the limitations of these software tools in preventing cheating, and the privacy concerns, the following section presents project-based learning (PBL) as an effective alternative assessment.
PBL is a form of learning by doing in which students acquire new knowledge and skills by working on a project for an extended period of time to solve a real-world problem or challenge (Pellegrino & Hilton, 2012;Peterson, 2012). Kokotsaki et al. (2016) define PBL as "an active student-centered form of instruction which is characterized by students' autonomy, constructive investigations, goal-setting, collaboration, communication and reflection within real-world practices" (p. 276). Krajcik and Blumenfeld (2006) state that a project-based classroom allows students to investigate questions or problems, propose hypotheses and explanations, discuss their ideas, criticize peers' ideas, and test their creative ideas. Previous research has shown that project-based learning results in improved learning; broadened knowledge and improved cognitive skills (such as problem-solving, critical thinking, and creativity); and more deeply developed soft skills such as communication and conflict resolution, time management, leadership, teamwork, and project management skills (Chandrasekaran & Al-Ameri, 2016;Chiang & Lee, 2016;Verbic et al., 2017;Lee et al., 2015;Fini et al., 2017;Condliffe, 2017;Taheri, 2018).
Scholars have referred to the key characteristics that distinguish PBL from other forms of learning (Krajcik & Shin, 2014;Kubiatko & Vaculová, 2011;Ravitz, 2010). For example, Larmer (2020) recently proposed the following seven essential elements of PBL:
1. Challenging problem or question: Students answer challenging questions or find solutions to challenging, meaningful problems.
2. Sustained inquiry: The project involves an extended process of posing questions and finding answers over a period of time, during which students need to find and use the required resources.
3. Authenticity: The project has a real-world context, process, tools, and quality standards, and has a real-world impact.
4. Student voice and choice: Students are empowered to make some decisions about the project, for example, how the work should be done and what they create, and the role of the instructor is limited to guiding and mentoring the students.
5. Reflection: The project provides opportunities for students to reflect on their learning experience, the challenges that arise in implementing the project activities, and the actions taken to overcome them.
6. Critique and revision: The project activities are conducted in an iterative process that allows students to give, receive, and apply feedback to develop their ideas and improve the project outcomes.
7. Public product: Students are encouraged to present and demonstrate their project outcome (e.g., computer system, website, poem, startup business proposal, graphics, social media campaign, speech, etc.) to people beyond the boundary of the classroom.
The application of PBL is not limited to science or technology/engineering fields but also can be applied to social sciences, humanities, and languages (Chu et al., 2017;Lo, 2018). As pointed out by scholars (Condliffe et al., 2017;Huberman et al., 2014;Pellegrino & Hilton, 2012), PBL is a more student-driven/student-centered approach to education, and it supports deeper learning through engagement with real-world problems, which is essential to prepare graduates to be successful in their careers. Lo (2018) reported that high-quality PBL in social studies classrooms is possible and can achieve deep learning. Furthermore, educators can create learning activities that teach students the professional competencies and skills needed by employers and the market.
What matters most is that students meet course learning outcomes. If course goals can be achieved by having students complete a project, an examination might not be needed. Instructors may have already used project methods to assess coursework during the semester. However, in the coming semesters, instructors might adopt a project-based learning approach throughout the whole semester and assign one large project that covers all the course learning outcomes, with the project mark representing the total course grade. While exams may have previously been the dominant mode of assessment, this may be the time to consider changing the way instructors assess students, giving more weight to projects than to exams and quizzes. University administrators can use the current challenge presented by COVID-19 as an opportunity to change their policies to support the shift to applied, project-based learning and assessment. Team projects are valuable tools, as they simulate reality, enable the application of knowledge, and facilitate the development of many competencies and skills, such as critical thinking, problem-solving, evaluation, innovation, reflection, teamwork, leadership, and presentation, many of which are difficult to assess through standard exams. This should be the focus of education in this century.
The author has adopted PBL as an assessment method in a course, Innovation and Entrepreneurship, in which 30 students were enrolled, with the project being worth 80% of the total course grade. The project was the foundation for the course curriculum and was completed over 14 weeks. The course was taught online using Microsoft Teams. The instructor used Teams to create small groups of five to six students who met online to discuss their project separately from the other groups during class time. The online platform allowed each group to focus better and think more creatively as there was no noise and distraction from other groups as would be the case in a classroom environment. The instructor was able to easily move between the group meetings, observe each group's discussion and ways of thinking about their project problems, and intervene when necessary to give feedback and guidance. The students found it easier to work on their ideas remotely via Teams, as they could record their notes immediately on their computers, use the Internet to find relevant information, use drawing tools on their computers, and share screens among themselves and with the instructor to get feedback. All of these tools might not be available when students work in groups in the classroom. Students worked on their group project in phases and had to meet certain milestones during the semester to demonstrate their progress. Students also had the opportunity to get mentoring and feedback from volunteers in industry (i.e., entrepreneurs) and to present their project outcome to the public (public product) through participation in a national innovation competition. In terms of grading the project, some deliverables/components (i.e., the business plan and prototype) were marked collectively, where the same grade was given to all group members.
Group members also gave individual presentations, which were graded individually, during which the instructor asked each group member several questions to evaluate their contribution to the project, their understanding of the concepts that were covered in the course, and how well they met the course learning outcomes. Students were informed that each group member should be able to answer questions related to any part of the project deliverables as well as demonstrate in detail the tasks that they performed individually to support the project.
This project allowed for deep learning. For example, knowledge about the concept of prototyping, the advantages and disadvantages of prototyping, and the process of creating a prototype (the remembering level of Bloom's taxonomy) could be assessed in an examination through essay questions; however, students' ability to answer such questions would not guarantee that they really understood, applied, and experienced the concept of prototyping. In contrast, through a project that tackled a real-world problem, students worked on creating prototypes for a product, website, mobile application, house, park, or whatever the project domain entailed. Thus, they were able to answer any questions because they were not focused on memorizing information. They also had opportunities to reflect on their experience of creating a prototype. This, in addition to developing teamwork and presentation skills, cannot be obtained through exams that focus mainly on knowledge. If the higher levels of Bloom's taxonomy (e.g., applying and creating) are achieved, which of course can be done through projects, then the lower levels (e.g., remembering and understanding) are attained by implication.

Conclusion
This essay highlights important concerns associated with adopting online exams supported by proctoring software tools and provides a rationale for PBL as an alternative assessment strategy. PBL allows students to develop the competencies and skills needed in the 21st century to succeed in their careers. To shift from traditional learning and assessment to PBL, faculty need the support of university administrators to implement the required academic policies and procedures and have the required resources available.
To adopt project-based learning successfully, instructors might need training on how to design and evaluate team projects as well as on how to ensure that projects comprehensively provide students with opportunities to meet course learning outcomes. They might also need training on how to ensure that students' work is fairly graded, both individually and collectively. Instructors also need to ensure that the key elements of PBL described previously are incorporated in projects.
This essay proposes that it is appropriate to adopt PBL during pandemics and other crisis times. Nevertheless, its use should not be limited to these situations; it should continue after the pandemic/crisis and can be adopted in place of exams and quizzes during normal times as well.