
Communications of the ACM


Crowdgrader Brings Crowdsourcing to the Task of Grading Homework



Grading homework assignments for a large class can be a time-consuming chore, and overworked teaching assistants often have little time to give students detailed feedback. But help may be on the way: A new crowdsourcing tool developed at the University of California, Santa Cruz involves students in the grading.

CrowdGrader allows students to submit their homework online and then distributes the submitted solutions anonymously to other students in the class for grading. Using a novel crowdsourcing algorithm that relies on a reputation system to assess each student's grading accuracy, CrowdGrader combines the student-provided grades into a consensus grade for each submission. As an incentive to take the grading task seriously, each student's overall grade for an assignment depends in part on the quality of his or her work as a grader.
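The paper describes CrowdGrader's algorithm in detail; as a rough illustration of the general idea of reputation-weighted consensus grading, the sketch below alternates between two steps: compute each submission's consensus grade as a reputation-weighted average, then set each grader's reputation inversely to how far their grades fall from the current consensus. All names, the specific update rule, and the smoothing constant are illustrative assumptions, not CrowdGrader's actual method.

```python
# Minimal sketch of reputation-weighted consensus grading.
# NOT the actual CrowdGrader algorithm: the update rule (reputation
# inversely proportional to a grader's mean squared error) and all
# names here are illustrative assumptions.

def consensus_grades(reviews, iterations=10):
    """reviews: dict mapping submission -> {grader: grade}.
    Returns (consensus, reputation) after alternating updates."""
    graders = {g for grades in reviews.values() for g in grades}
    reputation = {g: 1.0 for g in graders}  # start all graders equal
    consensus = {}
    for _ in range(iterations):
        # Step 1: consensus = reputation-weighted average per submission.
        for sub, grades in reviews.items():
            total_w = sum(reputation[g] for g in grades)
            consensus[sub] = sum(reputation[g] * x
                                 for g, x in grades.items()) / total_w
        # Step 2: reputation shrinks with a grader's squared error.
        for g in graders:
            errs = [(x - consensus[sub]) ** 2
                    for sub, grades in reviews.items()
                    for gr, x in grades.items() if gr == g]
            reputation[g] = 1.0 / (sum(errs) / len(errs) + 0.1)  # +0.1 avoids div-by-zero
    return consensus, reputation


# Example: "dave" grades far from the other reviewers, so his
# reputation drops and the consensus is pulled toward the majority.
reviews = {
    "hw1_alice": {"bob": 8, "carol": 9, "dave": 2},
    "hw1_bob":   {"alice": 7, "carol": 7, "dave": 1},
}
consensus, reputation = consensus_grades(reviews)
```

After a few iterations the outlier grader's weight is small, so the consensus grades sit near the majority opinion rather than the plain average.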

Luca de Alfaro, professor of computer science at UCSC's Baskin School of Engineering, worked with graduate student Michael Shavlovsky to develop CrowdGrader. They have been evaluating it for programming assignments (C++, Java, and Android) in computer science classes taught by de Alfaro and others at UCSC and at the University of Naples in Italy. Though CrowdGrader is still an experimental project, the preliminary results are encouraging, de Alfaro says.

"My impression is that the accuracy is not perfect, but it's no worse than what a TA does. There is always imprecision in grading," he says. "The real benefit is in the learning experience."

Grading other students' work actually helps students develop their programming skills, de Alfaro says, because they get to see how other students have solved the same problem. Also, because their own work is graded by up to five other students, they get more feedback than they do from a teaching assistant.

"TAs are fairly consistent in the way they grade the homework submissions, but they tend to have schematic grading criteria that focus on some things and might miss others. From what I saw, the feedback from students gave a more holistic evaluation of the students' work," de Alfaro says.

The researchers describe their work in "CrowdGrader: Crowdsourcing the Evaluation of Homework Assignments," which also presents the results of preliminary user studies. While more user studies are under way, de Alfaro says he is happy with the results so far. He plans to use CrowdGrader in more classes this fall.

CrowdGrader is available online, free of charge, to any user. "It's hosted on a very stable platform, so if anybody wants to try it for their own classes, they can," de Alfaro says.

This work was supported in part by a Google Research Award for de Alfaro's work on crowdsourced ranking.

