The COVID-19 pandemic dictated online exams and, at the same time, provided interesting opportunities for integrating innovative teaching and evaluation methods. Instead of adapting our paper-based exam to this changed testing environment (that is, asking the students to submit scans of their paper-based exams), we allowed the students in the Technion's Introduction to Computer Science course to use, during the exam itself, the integrated development environment (IDE) they had been working with throughout the semester. This post describes our experience with this evaluation method and our recommendations for its use. We call this evaluation method an executable exam, since the code the students submit can be executed by the students during the exam and by the teaching staff after the exam.
This post is based on the experience we gained during the Winter 2020 semester, in which we used an executable exam. The course, offered at the Technion – Israel Institute of Technology, was attended by about 1,200 computer science and electrical engineering freshmen and was taught by four lecturers, one of whom was Yael Erez, the first author of this post.
In order to understand the students' perception of this change in the evaluation approach, questionnaires were distributed to the students before and after the exam. In addition, interviews were conducted with the teaching staff of the course. An analysis of the qualitative and quantitative data gathered with these tools revealed not only the students' and staff's attitudes toward executable exams, but also potential challenges that should be considered and overcome.
In this post we offer 10 tips, targeted at lecturers, on how to manage executable exams. The tips are divided into three categories: (a) pedagogical tips, (b) technical tips, and (c) psychological tips. Clearly, the categories overlap to some degree, so a tip that fits several categories is presented under the most intuitive one. Figure 1 presents the ten tips on the course timeline.
Pedagogical tips address teaching-related issues; their aim is to ensure that the executable exam is fair, that it relates to the course goals, and that the students' answers faithfully represent their course-related knowledge.
Since the exam is executable, it is reasonable to consider evaluating students' code automatically, using a dedicated tool prepared in advance. In the case described in this post, the same automatic tool used for grading students' homework during the semester was used to evaluate the students' answers on the final executable exam. This tool checks compilation and runs automatic tests; these properties, however, are not always the main ones the course teaching staff wishes to evaluate. It is known, for example, that code that does not compile should sometimes be graded higher than code that does. Hence, in the case described here, the decision was to check all of the exams manually, in addition to running the automatic tests, and specifically, for submissions that did not compile, to check why they did not compile. From a practical point of view, this means the evaluators must decide what to evaluate in the students' code, and how, according to each of the course goals. Various questions should be asked: How should code that does not compile be graded? How should a program that terminates abnormally (e.g., with a stack overflow) be graded? Will code style be graded? And so on. To summarize, an easy-to-check strategy (such as grading based only on compilation or automatic tests) should be avoided, and an appropriate grading policy should be worked out carefully; the easy (automatic) option would likely send the students undesired messages about what is important in software development processes.
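To make these decisions concrete, here is a minimal sketch of such a grading harness. It assumes Python submissions graded by standard-input/standard-output tests; the file names, test cases, and flagging policy are all hypothetical, and this is not the tool used in the course described here.

```python
"""A minimal sketch of a grading harness (hypothetical, Python assumed)."""
import py_compile
import subprocess
import sys


def grade_submission(path: str) -> dict:
    result = {"compiles": False, "tests_passed": 0, "needs_manual_review": False}

    # Step 1: syntax ("compilation") check.
    try:
        py_compile.compile(path, doraise=True)
        result["compiles"] = True
    except py_compile.PyCompileError:
        # Do NOT auto-zero: a near-correct solution with one syntax error
        # may deserve more credit than trivial code that compiles.
        result["needs_manual_review"] = True
        return result

    # Step 2: run each test in a separate process with a timeout, so an
    # infinite loop or crash (e.g., a stack overflow) in one test does
    # not abort the whole grading run.
    tests = [("3 4\n", "7\n"), ("0 0\n", "0\n")]  # hypothetical I/O pairs
    for given_input, expected in tests:
        try:
            proc = subprocess.run(
                [sys.executable, path], input=given_input,
                capture_output=True, text=True, timeout=5,
            )
            if proc.returncode == 0 and proc.stdout == expected:
                result["tests_passed"] += 1
            else:
                # Abnormal termination or wrong output: flag, don't just zero.
                result["needs_manual_review"] = True
        except subprocess.TimeoutExpired:
            result["needs_manual_review"] = True

    return result
```

The key design choice the sketch illustrates is that automatic results only feed the grading policy; anything abnormal is routed to manual review rather than graded mechanically.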
Since executable exams allow the students to use an IDE, some kinds of questions widely used in paper-based exams are simply not suitable for this format, since the IDE renders them trivial. Examples of such questions are: What is the output of this program? and What are the compilation errors? At the same time, some questions are better suited to an executable exam than to a paper-based exam. For example, the students can be presented with a "buggy" computer program, alongside its documentation, and be asked to debug it. With an IDE on hand, it seems only natural to give the students several test cases with which to check their debugging process. Another example is to give the students a program with missing parts, alongside a description of what the program is supposed to do, and ask them to add the missing parts.
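For illustration, here is a hypothetical question of the "buggy program" kind (Python is assumed here; any course language works the same way). The students receive the documented but buggy function together with test cases that expose the bug, and are asked to fix it in the IDE.

```python
# Hypothetical exam handout: the function below does not match its
# documentation. Find and fix the bug; the provided tests expose it.

def max_in_list(values):
    """Return the largest element of the non-empty list `values`."""
    largest = 0                      # BUG: fails for all-negative lists,
    for v in values:                 # e.g., max_in_list([-3, -1]) == 0.
        if v > largest:
            largest = v
    return largest


# Test cases handed out with the question:
assert max_in_list([1, 5, 2]) == 5
assert max_in_list([-3, -1]) == -1   # this assertion exposes the bug
```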
Extraneous cognitive load is the load generated by the way information or tasks are presented to a learner (Sweller, 1988). This tip and the next one suggest ways of reducing extraneous cognitive load in executable exams.
In order to reduce the students' extraneous cognitive load (that is, load caused by factors unrelated to the content of the exam), it is important (a) to make sure that the exam format, in terms of working habits in the IDE, is similar to the one the students were accustomed to during the semester, for example, when submitting their homework, and (b) to adjust the exam questions to this format as much as possible (see Tip #2). Another example is the automatic checker: if an automatic checker was used to check homework, it may be a good idea to make it available to the students during the exam so that they can check their answers. Some features of the IDE and the automatic checker can be disabled but, to reduce the students' cognitive load, it is important to avoid adding new and unfamiliar features that may distract their attention.
Working in an IDE requires programmers to dedicate time to preparing the environment before starting work on their programming tasks. Such preparation may, of course, add extraneous cognitive load. To reduce this load and save the students precious time during an executable exam, we recommend providing a skeleton file alongside every question that involves code writing. Such a skeleton file can include declarations, function signatures, the main function, and even some simple tests. Since such files enable the students to focus on the main ideas learned in the course, they clearly help reduce extraneous cognitive load. Although the students can, of course, change the skeleton file and add their own tests, in the case described here, most of the students used and submitted the main function they were provided with.
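A skeleton file in this spirit might look as follows. The sketch is hypothetical (Python assumed; the function, its name, and the tests are illustrative only), but it shows all the elements mentioned above: a declaration, a function signature, a main function, and simple tests.

```python
# Hypothetical skeleton file: students fill in the body of
# count_vowels and submit this file.

VOWELS = "aeiou"  # a declaration provided to the students


def count_vowels(text: str) -> int:
    """Return the number of (lowercase) vowels in `text`."""
    # TODO: write your solution here.
    pass


def main():
    # Simple tests provided so students can run the file immediately;
    # they may add their own tests below.
    print(count_vowels("executable exam"))  # expected: 7
    print(count_vowels("xyz"))              # expected: 0


if __name__ == "__main__":
    main()
```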
About four weeks before the exam, we administered a simulation of the exam, which turned out to be crucial for the students as well as for the teaching staff. It is important to make this simulation mandatory and make certain all students participate.
A simulation exam enables the students to gain experience with the essence and procedure of an executable exam, which most of them will probably be taking for the first time. During the simulation, students can encounter the kinds of technical problems that may occur during the real exam. For example, some students discovered that they did not know how to work with the skeleton files provided, having never had the opportunity to work with such files before. Thanks to the simulation, they arrived at the real exam better prepared and more relaxed.
Moreover, the teaching staff received all of the submitted simulation exams and could test the checking procedures. After the simulation, the teaching staff better understood possible pitfalls and could pay special attention to ensuring that students did not fall into them during the real exam (e.g., working with Zip files without extracting them, or submitting the wrong files). The simulation also revealed technical issues the teaching staff had not been aware of, enabling them to arrive at the real exam with solutions ready for such issues.
The technical tips describe technological actions required for a smooth and glitch-free executable exam.
Since an executable exam is a computer-based exam, computer problems will happen, and prolonged ones can be devastating. We recommend coordinating alternative procedures with the students, in advance, in case such malfunctions occur. In the case described in this post, for example, (a) a link to a Zoom meeting, in which a lecturer was present during the entire exam to handle such problems, was published in advance; (b) students were asked to prepare a hotspot on their phones as an Internet connection backup; and (c) an alternative email address for submitting the exam was published (instead of Moodle, the Technion's LMS). All of these procedures proved vital and were, in fact, used during the exam.
Most students understand that unethical behavior, such as getting the answers from someone else, can harm them, and so they avoid it. Yet, if the Web is available, the temptation to get answers from external resources increases. Thus, to avoid plagiarism and enhance trust, we recommend disconnecting the students' computers from the Internet. If the exam is administered on campus, a computer lab can be used for this purpose. If the exam is taken on the students' personal computers, as was the case during the pandemic, we recommend using a dedicated environment. During the COVID-19 pandemic, many such environments were developed, upgraded, or adapted to the new learning situation (e.g., exam.net and gradescope.com).
Although it is preferable to let students write and check their code in an IDE during the exam, other tools may also be provided for this purpose. In the case described here, the students could use during the exam the same tool they had used during the semester for writing and checking their homework prior to submission (see Tip #3). This tool checks compilation and runs automatic tests; during the exam, however, the automatic test feature was disabled and only the compilation check was enabled. This was done for two reasons: first, since the automated tests run online, the servers cannot handle so many students running their tests simultaneously; second, we did not want the students to be distracted and to spend unnecessary time on tests. The students were notified of these decisions prior to the exam.
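One simple way to realize this is a configuration flag in the checking tool, so that the exam ships the familiar tool with its online tests switched off. The sketch below is hypothetical (all names are ours, not the course's actual checker) and merely illustrates the idea.

```python
# Hypothetical sketch: one checker for both semester and exam use,
# with the online tests disabled at exam time via configuration.
import py_compile
from dataclasses import dataclass


@dataclass
class CheckerConfig:
    compilation_check: bool = True
    online_tests: bool = True  # switched off during the exam


def local_syntax_check(path: str) -> bool:
    """Stand-in for the compilation check (Python submissions assumed)."""
    try:
        py_compile.compile(path, doraise=True)
        return True
    except py_compile.PyCompileError:
        return False


def check(path: str, config: CheckerConfig) -> None:
    if config.compilation_check:
        print("compiles:", local_syntax_check(path))
    if config.online_tests:
        # During the semester the file would be sent to the test server;
        # during the exam this branch is simply never taken.
        print("online tests would run here")


# Semester use:  check("hw3.py", CheckerConfig())
# Exam use:      check("exam.py", CheckerConfig(online_tests=False))
```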
The psychological category relates to emotions and behaviors. It is especially relevant when the exam format (computer-based and executable) is new to the students. Our data show that both the teaching staff and the students had mixed emotions with respect to the transition from paper-based exams to the executable exam format. The next two tips deal with these emotions.
The first time a computer-based executable exam is administered, resistance will most probably be expressed both by the course's teaching staff and by the students. Such reactions are natural and understandable. The staff is concerned about doing something different after years of doing something that worked well; they may also express concerns about plagiarism. Students may worry that the exam will be more difficult, and may point out that executable exams from previous years, which they would normally use to study for the exam, are unavailable. Such reactions, however, should not discourage the change leader, as they are very common in change processes (see Kotter's (1996) model of leading change). One way to cope with such voices of resistance is to maintain open communication with all stakeholders, in our case the students and the teaching staff; Tip #10 addresses this aspect.
Continuous communication with the course staff and students is crucial. Although such communication is clearly important at all times, not only when the exam format is changed to an executable one, it is especially important when a change affects so many students at the beginning of their studies. Communication that articulates the importance and positive aspects of executable exams enables information exchange and opinion sharing, reduces anxiety, and enhances trust.
This post offers 10 tips for the implementation of computer-based executable exams. The tips are presented in Figure 1 according to the time at which they should be considered. As you can see, most of the tips that should be considered prior to the exam are pedagogical. This highlights the fact that an executable exam is, first and foremost, a pedagogical act, one that creates the infrastructure for the technical and psychological actions required for its smooth execution. Once this infrastructure is in place, both the teaching staff and the students can concentrate on what is important: fair evaluation of the students' knowledge of the course contents.
| | Prior to the exam | During the exam | After the exam |
|---|---|---|---|
| Pedagogical tips | Tip 1. Decide on the grading policy and checking tools; Tip 2. Make sure the exam suits an executable format; Tips 3 & 4. Reduce cognitive load I & II; Tip 5. Administer a mandatory simulation | | |
| Technical tips | Tip 6. Prepare instructions in case of a malfunction | Tip 7. Disconnect the internet; Tip 8. Provide students with an IDE and checking tool | |
| Psychological tips | Tip 9. Expect, accept, and respect resistance; Tip 10. Maintain continuous communication with the course staff and students | | |

Figure 1: The 10 tips on a timeline
Although additional tips and categories clearly exist, an examination of our data identified these 10 tips as the most significant. We invite readers to share their experience and thoughts with respect to executable exams, whether they already implement such computer-based exams, plan to do so in the future, or neither.
References
Kotter, J. P. (1996). Leading Change. Boston, MA: Harvard Business School Press.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.
Yael Erez is a lecturer in the Technion's Faculty of Computer Science and a staff member of the ORT Braude Department of Electrical Engineering. She is currently studying toward a teaching certificate at the Technion's Department of Education in Science and Technology. Orit Hazzan is a professor at the Technion's Department of Education in Science and Technology. Her research focuses on computer science, software engineering, and data science education. For additional details, see https://orithazzan.net.technion.ac.il/.