Teal Session: Students as Examiners - Crowd-Sourced Exam Preparation
Prof. Paul Denny presents on PeerWise, a tool that supports students in the creation, sharing, evaluation, and discussion of assessment questions. Prof. Denny was visiting from his home institution in New Zealand, the University of Auckland. This presentation took place on Tuesday, December 12, 2017, as part of the TEAL Fellowship.
The very first thing that Prof. Denny taught us was - always bring a laser pointer! Though this can be an issue in the TEAL rooms, as the pointer is not always visible on backlit screens. But, I digress. Below is a quick summary of some major points of the presentation. At the end of the blog post is the presentation PDF.
Prof. Denny began his session by asking a very thought-provoking question - "Does gamification lead to greater student engagement?" We've heard quite a bit about gamifying content, aiming to make it more fun and engaging, but does it actually do what it purports to do? To look into this question further, Prof. Denny developed PeerWise, a platform that allows students to add and answer questions, rate other contributed questions, and earn badges for different accomplishments within the system (both for authoring and answering). From his research, Prof. Denny reported that self-testing with feedback has the strongest correlation with exam improvement. This seems to support the idea that when you create, or author, content, you experience the "generation effect" - which is to say that when you create something, you are more likely to remember its details. By creating questions, not only do students have the chance to think about the content critically, they also create a rich self-testing environment for their classmates. The more questions, the more opportunity to rate and provide feedback, and the more opportunity to practice before any formal assessments.
Now, this isn't to say that Prof. Denny was not cognizant of the pitfalls of group-authoring or user-generated content. To illustrate this, he noted his recent experience using Trip Advisor in preparation for his Canada trip. He brought up a picture of Montreal, which was a beautiful city shot. He brought up a picture of Toronto, which was a decent skyline. And then...he brought up a picture of Hamilton. You can imagine that this shot did not do the city any justice. Each of these photos was uploaded by a Trip Advisor user, which clearly demonstrated the variance not only in quality but in appropriateness. Hamilton has many attractive qualities - if you evaluated the city on this photo alone, you'd be missing a whole lot. PeerWise addresses this by letting students rate questions by quality and by allowing student authors to improve their questions/answers after receiving feedback from their classmates. This dose of peer assessment contributes to stronger questions.
Prof. Denny has implemented this platform in his courses with a variety of parameters. Sometimes he's attached grades to authoring (say, 3%); sometimes not. Sometimes there are requirements on the number of questions to be authored. He's found, though, that while students will mostly author only what they are required to, they will often answer (practice) far more questions than any defined assessment standards require. These data are depicted in histograms in the presentation below. Prof. Denny was also very clear that while there was a loose correlation showing that the more questions a student answered, the better they did on the exam, an engaged student is likely going to do well whatever tool or platform is implemented.
PeerWise is only one student question-bank platform option - others that were noted are Quizzical (which seems to be an app) and Basilisk. Each has a different fee structure and varying integrations with our learning management system. If you are interested in using a tool like this in your course, please contact email@example.com to investigate piloting this tool.