Undergraduate computing classes typically deliver content through passive lectures and require students to write code from scratch. However, students do not always pay attention in lecture, and writing code from scratch can be overwhelming for novices. Students report feeling frustrated when they cannot figure out what is wrong with their code or when they must wait hours for help from instructional staff. Students from groups that have been historically marginalized are at greater risk of failure in introductory courses since they tend to have less prior programming experience. How can we help students succeed in programming courses, especially those without prior programming experience?
Research on active learning in STEM has shown that it improves learning, motivation, and pass rates compared with traditional lecture. In active learning, students construct knowledge through discussion, problem solving, role play, and other methods. A meta-analysis found that students in traditional lecture STEM classes were 1.5 times more likely to fail than those in active-learning classrooms.3 Active learning is particularly effective for students from historically marginalized groups. While there are many types of active learning, in this column I focus on four: interactive ebooks; Peer Instruction (PI); mixed-up code (Parsons) problems; and Process Oriented Guided Inquiry Learning (POGIL).
I have used these approaches in my lectures for several years in a data-oriented programming course, SI 206, which I teach in the School of Information at the University of Michigan. It is the second required programming course for our majors, but approximately 30%-40% of the students are from outside the School of Information. Many of the students are from computer science or engineering and know C++ but want to learn Python. Some of the students are business majors. There is a large variance in prior programming experience, in familiarity with Python, and in the level of programming skill that students want to develop. The course enrollment has doubled in five years, from approximately 100 students in 2018 to approximately 200 students in winter 2023. Readings are assigned before lecture from a free and interactive ebook. In lecture, students answer Peer Instruction questions in the ebook, solve Parsons problems, and work in groups to solve POGIL-style activities.
Interactive ebooks for computing courses typically allow students to run code and see the result directly in the ebook. They include support for typical instructional content such as videos, text, and images. Many also support a wide range of additional practice problems such as multiple-choice questions, fill-in-the-blank questions, matching questions, code-writing problems with unit tests, and mixed-up code (Parsons) problems. These problems provide immediate feedback and can be automatically graded. There are both free ebooks (for example, from Runestone Academy and OpenDSA) and commercial ebooks (for example, from zyBooks and Codio). Interactive ebooks improve learning versus static textbooks, and students prefer them. I have been creating free interactive ebooks on the Runestone Academy platform for many years. I use one of these ebooks, Python for Everybody—Interactive, in my course. A static version of this ebook was originally created by Charles Severance for his very popular Python for Everybody online course. Undergraduate students and I modified his static ebook to make the code examples runnable and to add many interactive practice problems as well as new content.
Peer Instruction (PI) was originally developed by Eric Mazur of Harvard University to improve students' understanding in physics. The name "Peer Instruction" is sometimes confused with generic peer instruction (simply working with peers). In PI as defined by Mazur, students read material before lecture and complete an assignment or quiz based on the reading, either before lecture or at its start. During lecture, the instructor displays a hard multiple-choice question with distractors (incorrect answers) based on common student misconceptions. Students answer the question individually (first vote), discuss their answers with peers, and then answer individually again (second vote). Next, the instructor shows the outcome of the two votes and leads a discussion about the question. There are many variants of Peer Instruction. Instructors do not always assign readings before lecture or include an assessment of the reading. Instructors may skip the second vote, especially if the percentage of students who answered correctly on the first vote is above 70% or below 30%. Learning is optimized when about half of the students get the question wrong on the first vote. There are a variety of ways to vote, including raising hands, using cards, or using electronic devices such as iClickers (handheld devices that look like television remotes).
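The skip-the-second-vote heuristic above can be sketched in a few lines of Python. This is an illustrative model only, not code from any PI tool; the function name and default thresholds are assumptions based on the 70%/30% percentages mentioned above.

```python
# Illustrative sketch (not from any actual PI system): decide whether a
# second vote is worthwhile, using the thresholds described above.

def should_run_second_vote(correct: int, total: int,
                           low: float = 0.30, high: float = 0.70) -> bool:
    """Return True when the first-vote results suggest peer discussion
    will help, that is, the fraction correct falls between low and high."""
    if total == 0:
        return False
    fraction_correct = correct / total
    return low <= fraction_correct <= high

# Learning is optimized when about half the class is wrong on the first
# vote, so a 50% result triggers discussion and a revote.
print(should_run_second_vote(50, 100))   # True
print(should_run_second_vote(90, 100))   # False: most already correct
```

In practice an instructor applies this judgment by eye from the vote histogram; the point of the sketch is only to make the threshold rule concrete.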
I was introduced to Peer Instruction (PI) at computing education conferences, where I learned of approximately a decade of positive research results for PI in many STEM subjects including computing.1 I also attended a workshop that explained the research, demonstrated the process, and provided resources. One of the resources was a website (see http://peerinstruction4cs.com/) with slides for several computing courses with embedded PI questions. When I first tried PI, I had students vote with iClicker devices. However, after every lecture I would have a line of students who had forgotten their iClicker devices or whose batteries had run out, but who still wanted credit for answering the Peer Instruction questions. Later there was an app that students could use to answer PI questions on their phones or laptops, but the app wasn't free. Since I was already creating and using an interactive ebook for the course, I decided to create a new tool to support answering PI questions by building it into the Runestone platform.
This tool supports both in-person discussion and remote discussion via a chat interface. The chat interface allows us to maximize the number of groups whose members chose different answers. It is also useful when a lecture is held on Zoom. The tool also includes an interface for students who answer PI questions outside of lecture. Such a student votes individually, enters a justification for their answer, views a saved chat discussion in which one of the participants answered the same way they did, and then votes again (still individually). We are currently studying the effectiveness of this new tool.
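The grouping idea described above can be illustrated with a small sketch: bucket students by their first-vote answer, then interleave the buckets so that adjacent students usually disagree before pairing them. All names and the data structure are hypothetical; this is not the Runestone implementation.

```python
# Hypothetical sketch of the grouping idea: pair students so that,
# where possible, partners chose different answers on the first vote.
from collections import defaultdict
from itertools import zip_longest

def make_discussion_pairs(votes):
    """votes maps student name -> answer choice.
    Returns a list of pairs, preferring pairs whose members disagree."""
    by_answer = defaultdict(list)
    for student, answer in votes.items():
        by_answer[answer].append(student)
    # Interleave students from different answer buckets, then pair
    # adjacent students in the interleaved order. With an odd count,
    # the final student is left unpaired.
    interleaved = [s for group in zip_longest(*by_answer.values())
                   for s in group if s is not None]
    return [tuple(interleaved[i:i + 2])
            for i in range(0, len(interleaved) - 1, 2)]

votes = {"ann": "A", "bob": "A", "cai": "B", "dee": "B"}
print(make_discussion_pairs(votes))  # each pair mixes an A with a B
```

A real tool would also need to handle group sizes larger than two and students who did not vote, but the interleaving trick captures the core idea of maximizing disagreement within groups.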
I first heard about mixed-up code (Parsons) problems at the 2012 ICER conference, just after I had started my Ph.D. In a Parsons problem, students place mixed-up code blocks in order to solve a problem (see the accompanying figure). Parsons problems can also have distractors: blocks that are not needed in a correct solution. I expected Parsons problems would be an easier form of practice than writing code from scratch. I added Parsons problems to a free ebook on the Runestone platform and watched what happened. I found that more students attempted the Parsons problems than nearby multiple-choice questions. This encouraged me to study Parsons problems in my dissertation work. However, I also found that some students really struggled to answer Parsons problems. Some gave up and never answered them. Practice must be successful for learning to occur, so I wanted to guide students to the correct solution without just giving them the solution. I added a form of adaptation triggered by clicking a "Help Me" button after at least three incorrect attempts. Each time the student clicks the "Help Me" button, it either removes and disables a distractor block or, if no distractor blocks remain, combines two blocks into one, until only three blocks remain to be arranged.
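The "Help Me" adaptation just described can be modeled in a few lines. This is an illustrative sketch of the behavior, not Runestone's actual code; the function name and list representation are assumptions.

```python
# Illustrative model (not Runestone's implementation) of one "Help Me"
# click: remove a distractor block if any remain; otherwise merge two
# solution blocks into one, stopping once only three blocks are left.

def help_me(blocks, distractors):
    """blocks: ordered correct-solution blocks; distractors: unneeded
    blocks. Mutates and returns both lists after one Help Me click."""
    if distractors:
        distractors.pop()                              # remove and disable a distractor
    elif len(blocks) > 3:
        blocks[0] = blocks[0] + "\n" + blocks.pop(1)   # combine two blocks into one
    return blocks, distractors

blocks = ["x = 1", "y = 2", "z = x + y", "print(z)"]
distractors = ["z = x - y"]
help_me(blocks, distractors)   # first click: distractor removed
help_me(blocks, distractors)   # second click: first two blocks merged
```

Each click makes the puzzle strictly easier while still requiring the student to assemble the remaining blocks themselves, which is the intent of the adaptation.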
Figure. A Parsons problem, with the mixed-up blocks on the left and the correct solution on the right. Notice the pairs of correct and distractor code blocks, grouped by purple edges with an "or" to indicate that students should pick one.
Research on Parsons problems has been growing.2 Several online systems support Parsons problems, including Runestone Academy (see https://runestone.academy/ns/books/index), PrairieLearn (see https://www.prairielearn.org/), Epplets (see https://epplets.org/), and Codio (see https://www.codio.com/). Studies have provided evidence that most students find solving adaptive Parsons problems useful for learning to program and perceive them as easier than writing the equivalent code. Solving Parsons problems is typically significantly faster for students than writing the equivalent code and results in similar learning gains. There is evidence that the distractors help novices learn to recognize common syntax and semantic errors. A log file analysis of ebooks with both adaptive and non-adaptive Parsons problems found that learners were nearly twice as likely to correctly solve an adaptive Parsons problem than a non-adaptive one. However, some students would rather write code from scratch than solve a Parsons problem, especially students with more prior programming experience. We added a new type of toggle problem to Runestone Academy that allows students to choose to solve either a Parsons problem or the equivalent code-writing problem. We also created another type of toggle problem that allows students to pop up an adaptive Parsons problem when they are struggling to write code. Students can solve the Parsons problem, but they still must at least type the solution in the code-writing problem area. We are currently studying both of these types of toggle problems.
One year at SIGCSE I attended a Process Oriented Guided Inquiry Learning (POGIL) workshop. POGIL is a form of group work that is carefully structured to allow students to actively discover important concepts. Research on POGIL has provided evidence that it improves teamwork skills and learning.4 There is a CSPOGIL website (see https://cspogil.org/Home) with worksheets from several researchers for several courses and languages. Students work through the worksheets in structured groups. I integrated POGIL-style activities into the ebook that I use in my course (Python for Everybody—Interactive) so that the questions provide immediate feedback and can be automatically graded. Over time, I changed most of my lectures into POGIL-style activities. When students work on a POGIL-style activity, I ask them to record what they learned and any remaining questions in a Padlet. After they have worked in groups, I go over those questions. I have seen a big improvement in student engagement when students work on these activities compared with passive lecture.
Active-learning techniques including interactive ebooks, Peer Instruction, mixed-up code (Parsons) problems, and POGIL can be used to improve student learning, skills, and motivation in computing courses. These approaches work well even in large courses with hundreds of students. This is important since the number of students in introductory computing courses has more than tripled at many institutions since 2006. My goal is to help more students succeed in computing courses, especially those from groups that do not have access to computing courses before college. I encourage computing instructors to try one or more of these approaches. I find that these approaches are not only better for students, but also increase my enjoyment of lecture since students are more engaged.
1. Crouch, C.H. and Mazur, E. Peer instruction: Ten years of experience and results. American Journal of Physics 69, 9 (Sept. 2001), 970–977.
2. Ericson, B.J. et al. Parsons problems and beyond: Systematic literature review and empirical study designs. In Proceedings of the 27th ACM Conference on Innovation and Technology in Computer Science Education Vol. 2 (Dublin, Ireland, 2022). ACM, New York, NY, USA.
3. Freeman, S. et al. Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences 111, 23 (2014), 8410–8415.
4. Yadav, A. et al. Collaborative learning, self-efficacy, and student performance in CS1 POGIL. In Proceedings of the 52nd ACM Technical Symposium on Computer Science Education (Mar. 2021), 775–781.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.