Students’ own collective criteria - influence on peer feedback and lab report quality
Other conference contribution, 2017
What happens when you let students collectively decide what is meant by “good quality”, for example in a lab report? Will this adequately guide their own learning, as shown by their written lab reports and peer feedback comments? Or is it better just to tell them what you expect?
ABSTRACT
Intended audience:
This presentation will mainly interest teachers who use, or want to start using, peer review in their courses (at bachelor’s or master’s level). Teachers of courses with lab report assignments may also be interested in my example of using peer feedback in this learning context.
Problem statement:
Many students, at the onset of advanced-level studies at Chalmers, lack the ability to produce a good-quality lab report. Such generic skills are expected to be in place before students leave Chalmers with their master’s degree. Experience has shown that “just” telling the students how to write a good report is not sufficient. Might they listen more to each other than to the teacher?
Suggested solution:
If students are collectively given the chance to spell out what they mean by a “good quality” lab report, agreeing upon a list of criteria in class, this list can be used to guide their learning through the application of their own criteria. The list can support each student’s first-draft writing process, as well as the individual feedback students give each other in a structured peer review process. Finally, the teacher can reference the students’ own list of criteria when making the final assessment of the revised report.
Chosen learning scenarios:
This method has been applied for two consecutive years in an introductory course within the MPWPS program. An introductory lecture before labs start includes a collective exercise where students provide their take on quality criteria for lab reports. The teacher takes the role of secretary, recording the collective list on the whiteboard and making it available in PingPong after the lecture. (See Ref. 1.) If the students collectively overlook key aspects during the process, the teacher can drop gentle hints to ensure the quality of the final criteria list.
Seven obligatory short labs are performed in pairs, but each student is individually and randomly assigned only one of these labs for writing a formal lab report. Students are then assigned peer review roles, applying the quality criteria to another student’s report on a different lab. Revised lab reports are then submitted, along with a short text on how the peer feedback was incorporated into the final version. With peer review of first drafts, both the reviewer and the author of the report learn during the process, while carefully applying the quality criteria.
The above system has been in place during the two most recent years of the course, with one significant difference between them: the most recent year included a lecture dedicated to academic honesty and the avoidance of plagiarism. (See Ref. 2.) In earlier years, the quality criteria were provided by the teacher, not generated collectively by the students. (See Ref. 3.)
Student achievement measures:
We analyze student achievement (lab report quality) as a measure of learning in three different aspects: core subject content, generic written communication skills, and academic honesty. The latter analysis is reported in a separate presentation (Ref. 2). Generic written communication skills will be the focus of my presentation, with an attempt at measuring the degree to which the students’ own quality criteria list was actually applied successfully in their lab report and peer review writing. It will, however, not be possible to make a “fair and scientific” comparison with previous years’ lab reports, since other factors were changed at the same time.
Comparison of resulting achievement for different scenarios:
Student learning connected to similar quality criteria will be compared between the “student collectively generated” criteria (the latest two years of the course) and the “teacher generated” criteria (provided previously). The two years in which the student-generated criteria scenario was used had slightly different criteria lists, which may or may not be evident in the outcome of the students’ writing. The course has just finished, and the analysis will be performed during the coming study period before the KUL conference, so results are pending.
Alternative solutions:
In a more traditional scenario, the teacher informs the students of the quality criteria, set by the teacher, and applies those criteria when grading the assignment. Both comments and grades are usually provided by the teacher to individual students, who sometimes resubmit assignments after taking the teacher-provided feedback into account. However, in many cases, the feedback comes without any further requirement, and therefore without strong incentives for further learning.
Much research has been done on criteria-based assessment, including negotiating criteria with students (see e.g. Ref. 4). Here, however, I simply suggest one way of assessing which I have found to work well.
References:
1. “Lab report quality criteria”, available on PingPong at the following link:
https://pingpong.chalmers.se/courseId/7038/content.do?id=3366624
2. KUL 2017, Undervisa och examinera akademisk hederlighet [Teaching and examining academic honesty], submitted “short presentation”.
3. KUL 2012, Systematisk feedback och progression som stöd för studenters lärande inom generella kompetenser [Systematic feedback and progression as support for student learning in generic skills].
4. Biggs, J., “Teaching for Quality Learning at University”, 2003, chapters 8 and 9.
Keywords:
quality criteria
generic skills
formative feedback
peer review
Author
Sheila Galt
Chalmers, Microtechnology and Nanoscience (MC2), Photonics
Hans Hjelmgren
Electric, Computer, IT and Industrial Engineering
Gothenburg, Sweden
Subject Categories
Didactics
Learning
Pedagogical Work
Learning and teaching