
Student Partnership Assessment

Method

This systematic review was conducted according to the standard steps of a systematic review process: formulating the research questions, constructing inclusion and exclusion criteria, developing a search strategy, screening studies using selection criteria, extracting data, assessing quality and relevance, and synthesizing results to answer the research questions (Gough 2007). The previous section explains the conceptual framework adopted in this study and how the research questions were derived from it; as Sih, Pollack, and Zepeda (2019) emphasize, systematic review questions should be constructed around a sound conceptual framework. The review procedure is described in the following subsections.

Inclusion and exclusion criteria

Five criteria were devised for the selection of articles (see Table S1). This review aims to explore how student partnership is enacted in assessment communities of practice; therefore, to be included, a publication must fulfil the primary criterion of being an empirical study. The rationale for restricting the review to empirical studies is to ground the synthesis in collaborative efforts that have actually been implemented and trialed, so that practice-based conclusions can be drawn. The included empirical studies comprise quantitative, qualitative, and mixed methods research, whereas all theoretical and conceptual papers were excluded from the review.

 
Table S1. Inclusion and exclusion criteria
Inclusion criteria | Exclusion criteria | Rationale
English language | Languages other than English | A lack of language resources
Published between 2000 and March 2022 | Publications before 2000 | Consistency with the trend and development of student partnership in assessment literature
Empirical studies | Theoretical/conceptual papers | To provide practice-based conclusions
Assessment activities | Non-assessment-related activities | Relevance to the topic of the review
Higher education context | Contexts other than higher education, e.g., primary education, secondary education, professional training | Relevance to the topic of the review
Evidence of student-staff collaboration | Participatory practices without student-staff collaboration | In accordance with the definition of partnership adopted in the review

As the foci of this review are assessment-related activities that involve student partnership and the roles students adopt in partnership, the selected studies must provide sufficiently clear information for such data to be extracted. In addition, to qualify as empirical work on student partnership, a study must present clear evidence of collaboration between students and academic staff, particularly in the description of the research. Studies that examined participatory practices in which students and staff performed a task without engaging in dialogue and negotiation were not included in this review. This criterion is in accordance with the definition of partnership as a collaborative and reciprocal process explained in the previous section.

Literature search

The literature search process consisted of three parts. First, searches were performed in four major electronic academic databases, namely, ERIC, Web of Science, Scopus, and PsycInfo, which host a large number of academic journals covering broad disciplinary areas. Three groups of keywords, comprising the main concepts in the systematic review topic (“student partnership”, “assessment”, and “higher education”) as well as their synonyms and associated terms, were used to scan titles and abstracts for relevant articles (see Table S2). Although the notion of student-teacher collaboration in assessment dates back to the 1980s (Boud and Prosser 1980; Falchikov 1986), it began to gain widespread attention in the 1990s (e.g., Birenbaum 1996; Towler and Broadfoot 1992), with the term “partnership” coming into use in the late 1990s (Stefani 1998). Therefore, the time period parameter was set to search for articles published between 2000 and 2022. Owing to a lack of language resources to support translation work, the literature searches were limited to articles published in English.

 
Table S2. Search terms
Concept | Search terms
Student partnership | “student partnership” OR “student-staff partnership” OR “student-faculty partnership” OR “students as partners” OR “student-staff collaboration” OR “student voice” OR “student engagement”
Assessment | “assessment” OR “evaluation” OR “assessment for learning” OR “assessment as learning” OR “co-assessment” OR “co-creation” OR “self-assessment” OR “peer assessment”
Higher education | “higher education” OR “university” OR “college” OR “postsecondary” OR “tertiary”
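To make the search strategy concrete, the following Python sketch shows how the three concept groups in Table S2 can be combined into a single Boolean query of the kind listed in Table S3. The term groups come from the table; the helper itself is illustrative only and not part of the review protocol.

```python
# Sketch: compose the Boolean search string from the three concept groups
# in Table S2. Illustrative only; not part of the published review protocol.

CONCEPTS = {
    "student partnership": [
        "student partnership", "student-staff partnership",
        "student-faculty partnership", "students as partners",
        "student-staff collaboration", "student voice", "student engagement",
    ],
    "assessment": [
        "assessment", "evaluation", "assessment for learning",
        "assessment as learning", "co-assessment", "co-creation",
        "self-assessment", "peer assessment",
    ],
    "higher education": [
        "higher education", "university", "college",
        "postsecondary", "tertiary",
    ],
}

def or_group(terms: list[str]) -> str:
    """Quote each synonym and join the group with OR, in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# AND the three OR-groups together, mirroring the strings shown in Table S3.
query = " AND ".join(or_group(terms) for terms in CONCEPTS.values())
print(query)
```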

The second part consisted of manual searches, using the same parameters, in the web search engine Google Scholar and in two online journals, International Journal for Students as Partners and Teaching and Learning Together in Higher Education. These two journals were chosen for their prominent focus on student and staff collaborative work. As Google Scholar is less able to handle complex searches than databases, it was used as a secondary method to complement the database searches. The search string (“student partnership” AND “assessment” AND “higher education”) was used to search for articles published between 2000 and 2022, excluding patents. About 1,520 results were returned; as most were similar to the database search results, only the first 100 were screened. The third part was backward reference searching, which involved examining the reference lists and works cited in relevant articles and background reading materials to identify other relevant studies. In all three search processes, both peer-reviewed and non-peer-reviewed publications were included, as non-peer-reviewed publications such as book chapters, reports, and conference proceedings might also be relevant to the purpose of this study.

Study selection

The database searches yielded 1,775 results, whereas manual and backward reference searching identified 1,670 potential articles (see Table S3). The abstracts of the articles were screened against the third, fourth, and fifth inclusion and exclusion criteria in Table S1 (the first two criteria had already been applied through search filters). If an abstract did not clearly state the context of the study, that is, whether it was assessment related and whether it was conducted in a tertiary setting, the article was retained so that these criteria could be evaluated in full-text review. After abstract screening, 120 articles were selected for full-text review. Each selected article was then read thoroughly to determine its suitability against the last criterion in Table S1 (whether there was evidence of student-staff collaboration in the study reported) and, where necessary, against the previous criteria, resulting in the elimination of a further 75 articles. In total, 45 empirical studies proceeded to the subsequent stage of quality assessment and data extraction: 16 from database searches, 12 from manual searches, and 17 from backward reference searching. The steps taken to select relevant publications are shown in a flow diagram in Figure S1.
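The selection flow above can be summarized with simple bookkeeping; the following sketch is purely illustrative, with all counts taken from the text:

```python
# Sketch: bookkeeping for the study selection flow reported above.
# All counts come from the text; the structure is illustrative only.

identified = {"databases": 1775, "manual_and_backward": 1670}
total_identified = sum(identified.values())          # 3,445 abstracts screened

full_text_reviewed = 120                             # retained after abstract screening
excluded_at_full_text = 75                           # mostly failed the collaboration criterion
included = full_text_reviewed - excluded_at_full_text  # 45 studies

by_source = {"databases": 16, "manual": 12, "backward_reference": 17}
assert included == 45 and sum(by_source.values()) == included
print(total_identified, full_text_reviewed, included)
```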

 
Table S3. Search strategy used in each database
Database | Search string | Field searched | Search results
ERIC | AB ( “student partnership” OR “student-staff partnership” OR “student-faculty partnership” OR “students as partners” OR “student-staff collaboration” OR “student voice” OR “student engagement” ) AND AB ( “assessment” OR “evaluation” OR “assessment for learning” OR “assessment as learning” OR “co-assessment” OR “co-creation” OR “self-assessment” OR “peer assessment” ) AND AB ( “higher education” OR “university” OR “college” OR “postsecondary” OR “tertiary” ) | Abstract | 193
PsycInfo | ab(“student partnership” OR “student-staff partnership” OR “student-faculty partnership” OR “students as partners” OR “student-staff collaboration” OR “student voice” OR “student engagement”) AND ab(“assessment” OR “evaluation” OR “assessment for learning” OR “assessment as learning” OR “co-assessment” OR “co-creation” OR “self-assessment” OR “peer assessment”) AND ab(“higher education” OR “university” OR “college” OR “postsecondary” OR “tertiary”) | Abstract | 83
Scopus | ABS ( “student partnership” OR “student-staff partnership” OR “student-faculty partnership” OR “students as partners” OR “student-staff collaboration” OR “student voice” OR “student engagement” ) AND ABS ( “assessment” OR “evaluation” OR “assessment for learning” OR “assessment as learning” OR “co-assessment” OR “co-creation” OR “self-assessment” OR “peer assessment” ) AND ABS ( “higher education” OR “university” OR “college” OR “postsecondary” OR “tertiary” ) | Abstract | 967
Web of Science | ((AB=(“student partnership” OR “student-staff partnership” OR “student-faculty partnership” OR “students as partners” OR “student-staff collaboration” OR “student voice” OR “student engagement”)) AND AB=(“assessment” OR “evaluation” OR “assessment for learning” OR “assessment as learning” OR “co-assessment” OR “co-creation” OR “self-assessment” OR “peer assessment”)) AND AB=(“higher education” OR “university” OR “college” OR “postsecondary” OR “tertiary”) | Abstract | 532
Total | | | 1,775

Figure S1. Inclusion and exclusion of studies

Quality assessment and data extraction

In a systematic synthesis of research, it is important to assess the quality and relevance of the data obtained from the selected studies. Gough (2007) contends that, since questions and foci differ from one systematic review to another, the standards for judging the quality of included studies should comprise both generic and review-specific criteria. The quality assessment tool used in this systematic review was devised with this goal in mind: to evaluate the general robustness of the research procedure and the trustworthiness of the findings in a selected study, and also to assess whether the study was suitable for answering the systematic review questions. The assessment tool consisted of two parts with four questions each (see Table S4). Part 1 contained generic criteria adapted from Hong et al.’s (2018) Mixed Methods Appraisal Tool; Part 2 was designed based on the aim and research questions of this systematic review. For a study to be deemed fit for the purpose of this review, it must fulfil the following (expressed as a decision rule in the sketch after the list):

  • criteria 1.1, 2.1, and 2.2; and
  • at least two of the following criteria from Part 1: 1.2, 1.3, and 1.4; and
  • at least one of the following criteria from Part 2: 2.3 and 2.4.
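A minimal sketch of this fitness-for-purpose rule, assuming each study is scored as meeting or not meeting the criterion identifiers in Table S4 (the function itself is illustrative, not part of the published protocol):

```python
# Sketch: the fitness-for-purpose decision rule described above.
# Criterion IDs follow Table S4; `met` holds the IDs a study satisfies.

def fit_for_purpose(met: set[str]) -> bool:
    """Return True if a study satisfies the review's quality threshold."""
    mandatory = {"1.1", "2.1", "2.2"}       # all three must be met
    part1_optional = {"1.2", "1.3", "1.4"}  # at least two must be met
    part2_optional = {"2.3", "2.4"}         # at least one must be met
    return (mandatory <= met
            and len(met & part1_optional) >= 2
            and len(met & part2_optional) >= 1)

# Example: meets all mandatory criteria, 1.2 and 1.3, and 2.3 -> included.
print(fit_for_purpose({"1.1", "2.1", "2.2", "1.2", "1.3", "2.3"}))  # True
```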
 

Table S4. Quality assessment tool

Dimension | Quality criteria
1. Generic assessment | 1.1. Are the research objectives/questions clearly stated?
 | 1.2. Is the method of data collection appropriate?
 | 1.3. Is the method of data analysis explained?
 | 1.4. Are the conclusions drawn from the results/findings of the study?
2. Review-specific assessment | 2.1. Is student assessment a focus of the study?
 | 2.2. Is the collaborative relationship between students and staff made clear?
 | 2.3. Are students’ activities in the partnership clearly described?
 | 2.4. Are staff’s activities in the partnership clearly described?

Data extraction was performed concurrently with quality assessment using an Excel worksheet. The data extracted include descriptions of individual studies, such as context, year of study, methods, and participants, as well as information related to the review questions, for example, assessment activities, students’ actions/responsibilities, and staff’s actions/responsibilities. Both researchers discussed and agreed on the template before undertaking data extraction independently. Some studies that had appeared relevant at the screening stage were found unsuitable for the purpose of the review upon closer examination; two were excluded because they provided insufficient information on student and staff participation for the analysis of assessment partnership. Hence, after quality assessment and data extraction, 43 studies were retained, and the data extracted from them were synthesized to answer the review questions. The characteristics of the 43 included studies are presented in Table S5.
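As a minimal illustration of such a template (the field names below are assumptions inferred from the description above, not the authors’ actual worksheet columns), one extraction record might look like:

```python
# Sketch: one row of the extraction template described above. Field names
# are inferred from the text; the authors' actual Excel columns may differ.

from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    study: str                      # e.g., "Abdelmalak (2016)"
    context: str                    # course/institution setting
    year_of_study: str              # "Year unknown" where not reported
    methods: str                    # study design and instruments
    participants: str               # who took part, and how many
    assessment_activities: list[str] = field(default_factory=list)  # RQ1
    student_actions: list[str] = field(default_factory=list)        # RQ2
    staff_actions: list[str] = field(default_factory=list)          # RQ3
```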

 
Table S5. Characteristics of included studies
Study Location & year of study Context Participants Study design Study aim/focus/objectives
Abdelmalak
(2016)
USA;
Year unknown
15-week graduate-level education course (EDTECH);
A medium-sized university in the Southwest of the United States
Six volunteer graduate students (5 doctoral students) – 3 Caucasian female, 3 Mexican male;
Age range: 38-55 years old
One instructor
Qualitative multiple case studies.
Interviews, class observations, student artifacts.
Cross case analysis
To understand graduate students’ perceptions of the collaborative construction of course assignments.
Alsford
(2012)
UK;
2010-2012
A large London post-1992 university with three campuses Student Forum leaders formally recruited and employed with part-time hours
Student Forum members
Staff from the Educational Development Unit (EDU)
Reflective case study
Qualitative data by eliciting responses from participants, discussion with the EDU
Quantitative data about the participation of student forum members in particular activities
To introduce a model of student engagement based on an experimental project
Andrews et al.
(2018)
UK Portsmouth School of Architecture Students and academics from the School of Architecture
Student representatives from the BA (Hons) Interior Architecture and Design course
12 undergraduate students and 6 Masters of Architecture students participated in focus groups
Case study
National Student Survey
Focus groups with students
Workshops with academics
Course-level feedback, module grades
Student interviews
Interpretative analysis
To evaluate assessment and feedback strategies from across the School resulting in a creation of a new and innovative set of “assessment for learning” tools produced with students as partners
Baerheim and Meland
(2003)
Norway
2000-2001
Bergen Medical School,
General medicine/family medicine
Sixth-year medical students in family medicine Marks in examination
Scores on self-administered questionnaires
Students’ free text evaluation
Descriptive statistics, pragmatic approach
To evaluate how sixth-year medical students experienced proposing questions for their own written examination in family medicine, and to what extent their performance in the examination was influenced
Bell et al.
(2019)
Australia Film studies unit “From Silent to Sound Cinema” Four student partners, 3 academic staff, professional staff
9 students participated in focus groups; 12 responded to the online survey
Focus groups with students
Online survey (25% response rate)
Students’ and student partners’ reflective writing
Thematic analysis
To investigate the perceptions of students enrolled in the unit of redesigned learning activities and assessment and what student partners gain from working together in a curriculum redesign process
Cecchinato & Foschi (2017) Italy A non-compulsory course “e-Learning Technologies” within the Social, Work and Communication Psychology degree programme
University of Padua
42 fourth-year students Indicators of efficacy: attendance (Moodle event reports on the online interaction and participation in laboratory meetings), accountability (observations, forum discussion on divergence of opinions in peer grading), engagement (average grades, student survey) To present the transformation of a university course inspired by the theoretical background of the student voice approach
Chamunyonga et al. (2018) Australia Queensland University of Technology
Bachelor of Radiation Therapy programme
2 student representatives
Third-year students
One lecturer acted as a moderator
One senior radiation therapy lecturer
The University coordinator for the Students as Partners programme
22 students took Survey 1; 23 students took Survey 2
Semi-structured focus groups
Two surveys
To report the development and initial evaluation of alternative approaches to assessing treatment planning skills and knowledge in undergraduate radiation therapy education
Collins (2010) Australia An Australian regional university
First-year law course “Law in Context”
First-year law students (on-campus and off-campus)
Two academic staff
The Queensland Debating Society (facilitating assessment)
A specialist guest lecturer/senior Queensland Adjudicator
Student evaluation survey
Student focus groups
To report the experience of designing and delivering a course in legal theory and jurisprudence
Colson et al. (2021) Australia First-year foundational course for 400+ students in health undergraduate programmes – “Genes and Disease” (delivered mixed mode)
Griffith University
First-year foundational students
Academics teaching the course
123 students responded to the questionnaires
Mixed-methods approach
Anonymous online student questionnaires
Open-ended questionnaire questions, student public feedback comments and private written reflections
Descriptive and inferential statistics
Content analysis
To explore how the co-creation assessment task enabled students to develop graduate capabilities and demonstrate their creativity.
Cooper (2017) UK
2014
Professional practice module on a professional qualifying BA (Hons) Youth & Community Work programme
Block placement where students planned, delivered and evaluated a six-week youth work intervention
Students, fieldwork supervisors, university tutors
3 students (out of 18 students who completed the collaborative assessment)
3 supervisors, 3 tutors
Semi-structured interviews
Thematic analysis
To investigate the qualitative experiences of students, supervisors, and tutors involved in a summative collaborative assessment of placement learning
Cosker et al. (2021) France Medical Faculty of the University of Lorraine
3rd year medical curriculum
303 third-year medical students who had never undertaken the objective structured clinical examination (OSCE)
44 novice OSCE tutors
36 tutors and 185 students responded to the questionnaires
3 permanent medical teachers, one pedagogical engineer
Self-assessment questionnaires
Descriptive statistics
To study the perceived effectiveness of tutor-student partnership in a practice OSCE module by novice tutors and medical students
Deeley (2014) UK
January – March 2010
Service-learning, an optional Honours course for 3rd and 4th year students in a Public Policy undergraduate degree programme in a Scottish university 8 students (4 female, 4 male)
The teacher of the course (co-assessor)
Practitioner research study
Semi-structured interviews with students
A focus group with students
Overarching themes arising from the data were identified; thematic framework was drawn
To investigate the effects of using co-assessed oral presentation with students engaged in service-learning
Deeley (2018) UK Service-learning optional honours course in Social and Public Policy in a Scottish university 20 third- and fourth-year undergraduate students studying the degree of MA (Honours) Social Sciences Three focus groups and five semi-structured individual in-depth interviews with students
Sequential data analysis
To examine and critically evaluate a selection of different technological methods that were specifically chosen for their alignment with, and potential to enhance, extant assessment for learning practice
Deeley and Bovill (2017) UK
2013-2014
A four-year undergraduate MA Social Sciences degree in
a Scottish university
27 third- and fourth-year students taking two optional Public Policy honours courses
The course instructor
Case study
Questionnaires, students’ self-assessment of their essays
Identifying prevalent themes in the data
To investigate student perspectives on their learning during a staff-student partnership that engaged students as co-designers within assessment and feedback processes
Doyle et al. (2019) Ireland 3rd year undergraduate tax module 159 business students
104 students completed the survey
3 academics – the module leader delivered the lectures, the tutor delivered the tutorials, a third academic was involved in the research design of the co-creation assignment
A short survey
Instructors’ reflection
To outline an example of assessment co-creation
To explore students’ and instructors’ perceptions of certain aspects of assessment co-creation
El-Mowafy (2014) Australia
2011-2012
Geospatial Sciences learning unit “GPS Surveying” at Curtin University 30 and 32 third-year undergraduate students in 2011 and 2012 respectively Questionnaires
Marks given by different groups of assessors
Descriptive statistics, ICC coefficient
To investigate peer assessment of fieldwork
Geraghty et al. (2020) USA
January – April 2017 (survey period)
University of Illinois College of Medicine-Chicago Medical students, expert medical education faculty
563 of 753 medical students across all years participated in the survey – 63 were formally involved in the Student Curricular Board
Student survey
Chi-squared tests, Cronbach’s alpha, thematic analysis
To examine the impact of a novel student engagement programme known as the Student Curricular Board
Godbold et al. (2021) Australia Primary education degree programme in a research-intensive Australian university 61 students enrolled in a final-year course
7 students participated in two focus-group conversations
Exploratory qualitative study
Focus group conversations with students
Thematic analysis
To investigate how final-year undergraduate students experienced the shift toward partnership in the classroom
Hussain et al. (2019) UK First-year class in the Electronics and Electrical Engineering degree programme 202 current students in the Microelectronics Systems course
173 students responded to the feedback questionnaire
The instructor, 3 ex-students who took the course in the previous year
Feedback questionnaire for current students
Focus group interviews with ex-students
To explore the effectiveness of student involvement in improving the assessment and feedback for larger classes in the transnational educational provision
Kaur et al. (2017) Malaysia Masters of Education degree programme in a public university
Linguistically diverse classroom with varied levels of English language skills
114 in-service teachers (28-40 years old) in the 3rd and 4th semester of the programme
12 students participated in face-to-face interviews
Design-based methodology
Focus group discussions, open-ended questionnaire, video recording of group work activity, face-to-face interviews
Thematic analysis
To investigate students’ experiences with contextually sensitive assessment protocols for inclusive and fair assessment developed through faculty-student partnership
Kearney (2019) Australia An undergraduate primary education programme at the University of Notre Dame 200 out of 232 first-year students in the first semester of study (18-27 years old) Student surveys To explore the use of the Authentic Self and Peer Assessment for Learning (ASPAL) model to help acculturate first-year students to tertiary assessment practices
Kiester and Holowko (2020) USA
June 2018 – January 2019
An undergraduate first-year seminar course
A small liberal arts institution in the mid-Atlantic region
An assistant professor of sociology
A computer science and political science double major student partner
16 first-year students
Feedback surveys
Faculty and student partners’ observations and reflection
Open coding and common themes
To report the co-creation of the curriculum for an undergraduate first-year seminar
Leslie and Gorman (2017) UK
2011-2013
Project-based modules at the Mechanical Engineering and Design Department
Aston University
Group 1: 96 first-year students
Group 2: 90 first-year students
Action research
Questionnaire
Descriptive statistics, t-tests
An action research project to enhance students’ understanding of the principles of the logbook and awareness of how to complete the logbook to a high standard
Lorber et al. (2019) UK A law school in a UK university Five academics
Two professional colleagues
One member of the university Learning institute
Four law students
Case study
Views of staff expressed either in committees, via email, or through the staff questionnaire
A student-staff partnership project devised to help improve assessment and feedback practices with a law school
Lorente and Kirk (2013) Spain Physical education teacher education
Alternative assessment in an optional unit called “Physical Education Assessment” towards the degree “Ciencias de la Actividad Física y el Deporte” at the “Instituto Nacional de Educación Física de Cataluña”, Spain
40 students
The unit teacher
Action research
Field notes, documentary analysis of the portfolio of learning materials developed by students, meta-assessment conclusions made by students
Grounded theory approach to generate themes and subthemes
To reveal some lessons learned by a teacher educator as she sought to apply alternative, democratic assessment practices in physical education teacher education
Lubicz-Nawrocka (2018) UK Scottish universities
Across disciplines – education, environmental biology, service learning, medicine, geosciences, psychology, politics, veterinary science
10 staff members from four Scottish universities who engaged in one or more co-creation of the curriculum projects
10 students who had participated in the co-creation projects
Qualitative research
Interviews
Grounded theory approach
To explore the benefits of co-creation of the curriculum
Matthews et al. (2017) Australia A three-year Bachelor of Science degree with an optional Honours-research year and a four-year Bachelor of Biomedical Science degree in the Faculty of Science
A research-intensive university
268 students (out of 1,208 students who were emailed) Quantitative study
Online survey
Descriptive statistics, paired t-tests
To explore undergraduate students’ perceptions of how involved they were in partnership activities across their degree programmes and whether this matched their desired level of involvement in such practices
Meer and Chapman (2015) 2010-2013 Second year Human Resource Development course of an undergraduate Business and Management degree 24 students divided into four groups of six
The course lecturer
Four-year longitudinal action research
Semi-structured focus group, written responses to questions, students’ grades
To analyse the differences between marking criteria written by academics and students; to analyse peer, self, and lecturer grading of assessment using co-created marking criteria; to analyse the engagement and performance of the students during the intervention
Murphy et al. (2017) UK A large university in the northwest of England
Four degree programmes: events management, law, sport and exercise sciences, and quantity surveying
Core project team: Four academic staff members; three student project officers studying full-time master’s degree programmes in sport and exercise science
Co-creation workshop: 35 staff members and 60 students
35 staff members responded to the survey; 16 academic staff members took part in the interviews
Mixed-methods sequential explanatory approach
Questionnaires, interviews
Descriptive statistics, inductive thematic analysis
To explore staff perspectives of a staff-student partnership project aimed at improving feedback strategies
Orsmond et al. (2000) UK Two biology modules “Common Skills” and “Life on Earth” at Staffordshire University An assessing tutor
85 students from two different years of study
Group A (16 students); Group B (20 students); Group C (16 students); Group D (16 students); Group E (17 students)
Marks awarded by students and the tutor; evaluation questionnaire
Sign test, descriptive statistics
A report of the implementation of a method of student self and peer assessment involving student constructed marking criteria
Orsmond et al. (2002) UK The module “Work Experience and Personal Development” in Environmental Sciences and Applied Biology
Staffordshire University
22 first-year undergraduate biology students
2 tutors
Poster feedback questionnaire, marks awarded by students and tutors
Kolmogorov-Smirnov test
A study on the implementation of a method of student self and peer assessment involving student constructed marking criteria in the presence of exemplars
Peseta et al. (2016) Australia
2014
Sydney Teaching Colloquium, University of Sydney 6 undergraduates from the areas of education, health sciences, and arts/social sciences Student ambassadors’ reflections
Analysed by the ambassadors themselves to develop a set of themes
An account of an innovative students-as-partners initiative at the University of Sydney
Quesada et al. (2019) Spain Elementary Education in Early Childhood Education degree programme at the University of Cádiz 470 first- and second-year undergraduate students; 4 teachers
349 students completed the questionnaire; 31 students participated in focus groups
Questionnaires (teacher survey and student survey), focus groups with students
Qualitative analysis – thematic approach
To analyse the strengths, weaknesses, opportunities, and threats perceived by students participating in co-assessment practices; to explore the strengths, weaknesses, opportunities, and threats perceived by professors participating in the experience
Rivers et al. (2017) Australia Second-year molecular genetics course Two course convenors
Students taking the course (typical enrolment of 200)
Peer-assisted learning mentors
Student surveys; student comments on PeerWise A report of the implementation of peer-learning strategies that aim to provide students with greater ownership of the course content and its assessment
Smith et al. (2021a) UK
2020 – 2021
Business school, University of Sussex Six students from different years and degrees within and outside the business school
9 staff members (5 female; 4 male)
Non-teaching staff – academic developer, student academic success advisor
Five student partners and five staff project members
Case study
Semi-structured interviews
Thematic analysis
A report of a staff-student partnership to co-create generic assessment criteria for the business school of a university
Smith et al. (2021b) UK
2020 – 2021
Business school, University of Sussex 6 student connectors from different years of study and degrees from within and outside the business school
Staff selected from various departments and positions
Five student connectors and five staff members participated in interviews
Case study
Semi-structured interviews
Thematic analysis
A case study of a quality-enhancement staff-student partnership to identify the stages of the partnership co-creation process
Snelling et al. (2019)
Only Exemplar 2 is relevant
Australia University of Adelaide Exemplar 2:
Second- and third-year and academic staff from plant science-related degrees in the Faculty of Science
Pre-workshop survey, course evaluation A report of three exemplars of practice inspired by emerging evidence that student-staff partnerships have the potential to significantly enhance many areas of higher education
Only Exemplar 2 focuses on assessment
Stephenson (2006) New Zealand The course “Teacher roles in young children’s learning” in an Early Childhood diploma programme Class sizes ranged from 16 to 24 students (predominantly female)
The course teacher (author of the paper)
Action research over four years
Institution’s formal evaluation, an informal end-of-course evaluation, audiotaped sections of the class, teacher
Implementation of new methods of learning/teaching to reposition students as co-constructors of their own knowledge
Taylor et al. (2021) Australia
2008-2020
Business Faculty Stage 1 intervention: more than 4,500 UG and PG students
Stage 2 intervention: 785 UG and PG students
Stage 3 intervention: 318 students
All support workshops were hosted by the discipline-based staff members and attended by specialist teaching and learning advisors
Three-stage, targeted intervention
Student surveys
To identify the reasons for the limited attainment of positive cross-boundary crossings for all students within the context of an Australian Business Faculty; to outline the thirteen-year (2008-2020), iterative intervention process undertaken with 6,000 participating students; to provide postscript reflections related to the post-COVID-19, online experiences of students
Taylor et al. (2015) Australia
2010 – 2014
Accounting subjects within the Bachelor of Business degree in an Australian university Phase 1: approximately 85 students of second-year Financial Accounting in Semesters 1 & 2, 2010
Phase 2: 126 students of International Accounting in Semester 1, 2011 and Semesters 1 & 2, 2012
Phase 3: International Accounting students in Semester 1, 2013
Phase 4: International Accounting students in Semester 2, 2013 and Semester 1 2014
Student surveys A report of a four-phase, cross-institution, and cross-discipline project designed to embed peer-review processes as part of the assessment in two large, undergraduate accounting classes
Thompson et al. (2017) Australia
2015
Final-year capstone paramedic practice in a three-year bachelor of paramedic science degree
A university in southern Australia
Undergraduate students
90 of the 94 eligible participants enrolled in the subject responded to the questionnaire (96% response rate)
Questionnaire
Automated, university-mandated survey: Student Evaluation of Topic (SET) survey
To evaluate students’ perceptions of student-tutor consensus grading of simulation-based practical skills assessments in a final-year paramedic practice capstone subject.
Tiew (2010) Malaysia
2009
Weekly tutorials of the second- and third-year of studies at the Business School, Curtin University of Technology, Sarawak Campus Semester 1: 42 students
Semester 2: 35 students
Action research
Questionnaire
Calculation of means and percentages
To collate students’ opinions of peer and self-assessment on tutorial class participation
Ward et al. (2022) Ireland
2020 – 2021
Three undergraduate degree programmes in School of Computing, Dublin City University: Computer Applications, Enterprise Computing, and Data Science 20 class representatives (representing around 700 students)
18 class representatives participated in the focus group
Module coordinators, programme chair and the teaching convenor (student consultations)
Quality Promotions Office – upskilling the class representatives
Focus group with student representatives
Reflections of the Assistant Head for Teaching Excellence (incorporating findings from academics in the School)
To investigate whether student partnership made a difference to the student learning experience in COVID-19 times

Result synthesis

The data were analyzed using Braun and Clarke’s (2006) theoretical thematic analysis approach. The extracted data were organized according to the three research questions in an Excel spreadsheet. Each set of data was examined separately, and an initial code was assigned to describe each data segment. After all the data segments were coded, they were read thoroughly and examined for patterns. Codes that were associated with one another or related to the same issue were grouped together. For example, “create and refine assessment tools/tasks”, “create supporting resources for staff and students”, and “prepare answers for designed tasks” were codes for the second research question on students’ roles. As all three codes concerned the action of developing something, they were put in the same group, and a label was then assigned to describe the group; in this example, the label was “co-designers”. After all three sets of data were coded and grouped into categories, the categories were reviewed to check whether all the codes were adequately explained by the categories and whether all the data segments fit the codes and categories. Both researchers performed the analysis independently and then met to compare their codes and categories; discrepancies were resolved through negotiation and discussion. In total, 51 codes were grouped into 12 categories, which were further organized into three themes to answer the research questions. The naming of the three themes was guided by the research questions, and the findings are reported according to these themes: areas of partnership, roles of students, and support provided by university staff.
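A minimal sketch of the resulting code-category-theme hierarchy, using the “co-designers” example above (only this branch is shown; the full analysis produced 51 codes, 12 categories, and three themes):

```python
# Sketch: the code -> category -> theme hierarchy described above, showing
# only the "co-designers" branch given as an example in the text.

themes = {
    "roles of students": {                # theme (answers RQ2)
        "co-designers": [                 # category (label for a code group)
            "create and refine assessment tools/tasks",
            "create supporting resources for staff and students",
            "prepare answers for designed tasks",
        ],
    },
}

# Walk the hierarchy: each data segment's code rolls up to a category,
# and each category rolls up to one of the three themes.
for theme, categories in themes.items():
    for category, codes in categories.items():
        print(theme, "->", category, "->", len(codes), "codes")
```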

 
Articles included in the systematic review
  1. Abdelmalak, M. M. M. 2016. “Faculty-Student Partnerships in Assessment.” International Journal of Teaching and Learning in Higher Education 28 (2): 193-203. https://files.eric.ed.gov/fulltext/EJ1111135.pdf
  2. Alsford, S. 2012. “An Educational Development Student Forum: Working Partnerships with Students.” Journal of Applied Research in Higher Education 4 (2): 186-202. http://doi.org/10.1108/17581181211273291
  3. Andrews, M., R. Brown, and L. Mesher. 2018. “Engaging Students with Assessment and Feedback: Improving Assessment for Learning with Students as Partners.” Practitioner Research in Higher Education 11 (1): 32-46. https://files.eric.ed.gov/fulltext/EJ1180136.pdf
  4. Baerheim, A., and E. Meland. 2003. “Medical Students Proposing Questions for Their Own Written Final Examination: Evaluation of an Educational Project.” Medical Education 37 (8): 734-738. http://doi.org/10.1046/j.1365-2923.2003.01578.x
  5. Bell, A., S. Potter, L.-A. Morris, M. Strbac, A. Grundy, and M. Z. Yawary. 2019. “Evaluating the Process and Product of a Student-Staff Partnership for Curriculum Redesign in Film Studies.” Innovations in Education and Teaching International 56 (6): 740-750. http://doi.org/10.1080/14703297.2019.1588768
  6. Cecchinato, G., and L. C. Foschi. 2017. “Flipping the Roles: Analysis of a University Course Where Students Become Co-Creators of Curricula.” Teaching and Learning Together in Higher Education 1 (22): 1-9. https://repository.brynmawr.edu/tlthe/vol1/iss22/5
  7. Chamunyonga, C., J. Burbery, P. Caldwell, P. Rutledge, and C. Hargrave. 2018. “Radiation Therapy Students as Partners in the Development of Alternative Approaches to Assessing Treatment Planning Skills.” Journal of Medical Imaging and Radiation Sciences 49 (3): 309-315. http://doi.org/10.1016/j.jmir.2018.04.023
  8. Collins, P. 2010. “Inclusive Team Assessment of Off-Campus and On-Campus First Year Law Students Using Instantaneous Communication Technology.” The Law Teacher 44 (3): 309-333. http://doi.org/10.1080/03069400.2010.524032
  9. Colson, N., M.-A. Shuker, and L. Maddock. 2021. “Switching on the Creativity Gene: A Co-Creation Assessment Initiative in a Large First Year Genetics Course.” Assessment & Evaluation in Higher Education: 1-18. http://doi.org/10.1080/02602938.2021.2011133
  10. Cooper, S. 2017. “A Collaborative Assessment of Students’ Placement Learning.” Assessment & Evaluation in Higher Education 42 (1): 61-76. http://doi.org/10.1080/02602938.2015.1083093
  11. Cosker, E., V. Favier, P. Gallet, F. Raphael, E. Moussier, L. Tyvaert, M. Braun, and E. Feigerlova. 2021. “Tutor-Student Partnership in Practice OSCE to Enhance Medical Education.” Medical Science Educator 31: 1803-1812. http://doi.org/10.1007/s40670-021-01421-9
  12. Deeley, S. J. 2014. “Summative Co-Assessment: A Deep Learning Approach to Enhancing Employability Skills and Attributes.” Active Learning in Higher Education 15 (1): 39-51. http://doi.org/10.1177/1469787413514649
  13. Deeley, S. J. 2018. “Using Technology to Facilitate Effective Assessment for Learning and Feedback in Higher Education.” Assessment & Evaluation in Higher Education 43 (3): 439-448. http://doi.org/10.1080/02602938.2017.1356906
  14. Deeley, S. J., and C. Bovill. 2017. “Staff Student Partnership in Assessment: Enhancing Assessment Literacy through Democratic Practices.” Assessment & Evaluation in Higher Education 42 (3): 463-477. http://doi.org/10.1080/02602938.2015.1126551
  15. Doyle, E., P. Buckley, and J. Whelan. 2019. “Assessment Co-Creation: An Exploratory Analysis of Opportunities and Challenges Based on Student and Instructor Perspectives.” Teaching in Higher Education 24 (6): 739-754. http://doi.org/10.1080/13562517.2018.1498077
  16. El-Mowafy, A. 2014. “Using Peer Assessment of Fieldwork to Enhance Students’ Practical Training.” Assessment & Evaluation in Higher Education 39 (2): 223-241. http://doi.org/10.1080/02602938.2013.820823
  17. Geraghty, J. R., A. N. Young, T. D. M. Berkel, E. Wallbruch, J. Mann, Y. S. Park, L. E. Hirshfield, and A. Hyderi. 2020. “Empowering Medical Students as Agents of Curricular Change: A Value-Added Approach to Student Engagement in Medical Education.” Perspectives on Medical Education 9 (1): 60-65. http://doi.org/10.1007/s40037-019-00547-2
  18. Godbold, N., T.-Y. A. Hung, and K. E. Matthews. 2021. “Exploring the Role of Conflict in Co-Creation of Curriculum through Engaging Students as Partners in the Classroom.” Higher Education Research and Development 41 (4): 1104-1118. http://doi.org/10.1080/07294360.2021.1887095
  19. Hussain, S., K. A. A. Gamage, W. Ahmad, and M. A. Imran. 2019. “Assessment and Feedback for Large Classes in Transnational Engineering Education: Student-Staff Partnership-Based Innovative Approach.” Education Sciences 9 (3): 221-231. http://doi.org/10.3390/educsci9030221
  20. Kaur, A., M. Noman, and H. Nordin. 2017. “Inclusive Assessment for Linguistically Diverse Learners in Higher Education.” Assessment & Evaluation in Higher Education 42 (5): 756-771. http://doi.org/10.1080/02602938.2016.1187250
  21. Kearney, S. 2019. “Transforming the First-Year Experience through Self and Peer Assessment.” Journal of University Teaching & Learning Practice 16 (5): 20-35. http://doi.org/10.14453/jutlp.v16i5.3
  22. Kiester, E., and J. Holowko. 2020. “Redefining the Roles of Master and Apprentice: Crossing the Threshold through the Co-Creation of a First-Year Seminar.” International Journal for Students as Partners 4 (1): 66-81. http://doi.org/10.15173/ijsap.v4i1.3826
  23. Leslie, L. J., and P. C. Gorman. 2017. “Collaborative Design of Assessment Criteria to Improve Undergraduate Student Engagement and Performance.” European Journal of Engineering Education 42 (3): 286-301. http://doi.org/10.1080/03043797.2016.1158791
  24. Lorber, P., S. Rooney, and M. Van Der Enden. 2019. “Making Assessment Accessible: A Student-Staff Partnership Perspective.” Higher Education Pedagogies 4 (1): 488-502. http://doi.org/10.1080/23752696.2019.1695524
  25. Lorente, E., and D. Kirk. 2013. “Alternative Democratic Assessment in PETE: An Action-Research Study Exploring Risks, Challenges and Solutions.” Sport, Education and Society 18 (1): 77-96. http://doi.org/10.1080/13573322.2012.713859
  26. Lubicz-Nawrocka, T. 2018. “From Partnership to Self-Authorship: The Benefits of Co-Creation of the Curriculum.” International Journal for Students as Partners 2 (1): 47-63. http://doi.org/10.15173/ijsap.v2i1.3207
  27. Matthews, K. E., L. J. Groenendijk, and P. Chunduri. 2017. “We Want to Be More Involved: Student Perceptions of Students as Partners across the Degree Program Curriculum.” International Journal for Students as Partners 1 (2): 1-16. http://doi.org/10.15173/ijsap.v1i2.3063
  28. Meer, N., and A. Chapman. 2014. “Co-Creation of Marking Criteria: Students as Partners in the Assessment Process.” Business and Management Education in HE: 1-15. http://doi.org/10.11120/bmhe.2014.00008
  29. Murphy, R., S. Nixon, S. Brooman, and D. Fearon. 2017. “‘I Am Wary of Giving Too Much Power to Students:’ Addressing the ‘But’ in the Principle of Staff-Student Partnership.” International Journal for Students as Partners 1 (1): 1-16. http://doi.org/10.15173/ijsap.v1i1.3055
  30. Orsmond, P., S. Merry, and K. Reiling. 2000. “The Use of Student Derived Marking Criteria in Peer and Self-Assessment.” Assessment & Evaluation in Higher Education 25 (1): 23-38. http://doi.org/10.1080/02602930050025006
  31. Orsmond, P., S. Merry, and K. Reiling. 2002. “The Use of Exemplars and Formative Feedback When Using Student Derived Marking Criteria in Peer and Self-Assessment”. Assessment & Evaluation in Higher Education 27 (4): 309-323. http://doi.org/10.1080/0260293022000001337
  32. Peseta, T., A. Bell, A. Clifford, A. English, J. Janarthana, C. Jones, M. Teal, and J. Zhang. 2016. “Students as Ambassadors and Researchers of Assessment Renewal: Puzzling over the Practices of University and Academic Life.” International Journal of Academic Development 21(1): 54-66. http://doi.org/10.1080/1360144X.2015.1115406
  33. Quesada, V., M. A. G. Ruiz, M. B. G. Noche, and J. Cubero-Ibáñez. 2019. “Should I Use Co-Assessment in Higher Education? Pros and Cons from Teachers and Students’ Perspectives.” Assessment & Evaluation in Higher Education 44 (7): 987-1002. http://doi.org/10.1080/02602938.2018.1531970
  34. Rivers, J., A. Smith, D. Higgins, R. Mills, A. G. Maier, and S. M. Howitt. 2017. “Asking and Answering Questions: Partners, Peer Learning, and Participation.” International Journal for Students as Partners 1 (1): 1-10. http://doi.org/10.15173/ijsap.v1i1.3072
  35. Smith, S., K. Akhyani, D. Axson, A. Arnautu, and I. Stanimirova. 2021a. “Learning Together: A Case Study of a Partnership to Co-Create Assessment Criteria.” International Journal for Students as Partners 5 (2): 123-133. http://doi.org/10.15173/ijsap.v5i2.4647
  36. Smith, S., K. Akhyani, D. Axson, A. Arnautu, and I. Stanimirova. 2021b. “The Partnership Co-Creation Process: Conditions for Success?” International Journal for Students as Partners 5 (2): 48-66. http://doi.org/10.15173/ijsap.v5i2.4772
  37. Snelling, C., B. R. Loveys, S. Karanicolas, N. J. Schofield, W. Carlson-Jones, J. Weissgerber, R. Edmonds, and J. Ngu. 2019. “Partnership through Co-Creation: Lessons Learnt at the University of Adelaide.” International Journal for Students as Partners 3 (2): 62-77. http://doi.org/10.15173/ijsap.v3i2.3799
  38. Stephenson, A. 2006. “Troubling Teaching.” Australasian Journal of Early Childhood 31 (1): 51-56. http://doi.org/10.1177/183693910603100108
  39. Taylor, S., M. Ryan, and L. Elphinstone. 2021. “Generating Genuine Inclusion in Higher Education Utilising an Original, Transferable, and Customisable Model for Teaching and Assessing Reflective Learning.” Reflective Practice 22 (4): 531-549. http://doi.org/10.1080/14623943.2021.1933408
  40. Taylor, S., M. Ryan, and J. Pearce. 2015. “Enhanced Student Learning in Accounting Utilising Web-Based Technology, Peer-Review Feedback and Reflective Practices: A Learning Community Approach to Assessment.” Higher Education Research & Development 34 (6): 1251-1269. http://doi.org/10.1080/07294360.2015.1024625
  41. Thompson, J., D. Houston, K. Danise, T. Rayner, T. Pointon, S. Pope, A. Cayetano, B. Mitchell, and H. Grantham. 2017. “Student & Tutor Consensus: A Partnership in Assessment for Learning.” Assessment & Evaluation in Higher Education 42 (6): 942-952. http://doi.org/10.1080/02602938.2016.1211988
  42. Tiew, F. 2010. “Business Students’ View of Peer Assessment on Class Participation.” International Education Studies 3 (3): 126-131. http://doi.org/10.5539/ies.v3n3p126
  43. Ward, M., M. Senchea, and C. Gormley. 2022. “Teaching, Learning and Assessment within a School of Computing: Did Student Partnership Have an Impact?” All Ireland Journal of Teaching and Learning in Higher Education (AISHE-J) 14 (1): 1-19. https://ojs.aishe.org/index.php/aishe-j/article/view/609/979
 
Table S6. Areas of assessment partnership
Area | Sub-areas | Studies
Assessment and feedback design | Assessment tasks | Abdelmalak (2016), Baerheim and Meland (2003), Bell et al. (2019), Chamunyonga et al. (2018), Cosker et al. (2021), Deeley and Bovill (2017), Doyle et al. (2019), Hussain et al. (2019), Kiester and Holowko (2020), Lorente and Kirk (2013), Lubicz-Nawrocka (2018), Quesada et al. (2019), Rivers et al. (2017), Snelling et al. (2019)
 | Assessment criteria | Abdelmalak (2016), Andrews et al. (2018), Colson et al. (2021), El-Mowafy (2014), Hussain et al. (2019), Kearney (2019), Kiester and Holowko (2020), Leslie and Gorman (2017), Lorber et al. (2019), Lorente and Kirk (2013), Matthews et al. (2017), Meer and Chapman (2015), Orsmond et al. (2000), Orsmond et al. (2002), Smith et al. (2021b), Tiew (2010)
 | Grading method | Abdelmalak (2016), Godbold et al. (2021), Lorente and Kirk (2013), Matthews et al. (2017), Stephenson (2006)
 | Assessment process | Abdelmalak (2016), Godbold et al. (2021), Kaur et al. (2017), Matthews et al. (2017), Ward et al. (2022)
Execution and implementation | Self-assessment | Cooper (2017), Deeley (2014, 2018), Deeley and Bovill (2017), Kearney (2019), Lorente and Kirk (2013), Matthews et al. (2017), Meer and Chapman (2015), Orsmond et al. (2002), Quesada et al. (2019), Thompson et al. (2017), Tiew (2010)
 | Peer assessment | Cecchinato and Foschi (2017), Collins (2010), Colson et al. (2021), Cosker et al. (2021), El-Mowafy (2014), Hussain et al. (2019), Kaur et al. (2017), Kearney (2019), Lorente and Kirk (2013), Meer and Chapman (2015), Orsmond et al. (2002), Taylor et al. (2015), Tiew (2010)
 | Peer review | Andrews et al. (2018), Chamunyonga et al. (2018), Cosker et al. (2021), Deeley and Bovill (2017), Hussain et al. (2019), Lubicz-Nawrocka (2018), Matthews et al. (2017), Taylor et al. (2021)
Quality assurance | Surveys | Andrews et al. (2018), Bell et al. (2019), Kearney (2019), Kiester and Holowko (2020), Smith et al. (2021a, 2021b), Taylor et al. (2015)
 | Discussions/focus groups | Andrews et al. (2018), Cosker et al. (2021), Murphy et al. (2017), Smith et al. (2021b), Snelling et al. (2019)
 | Reports from student representatives | Geraghty et al. (2020), Peseta et al. (2016), Rivers et al. (2017), Smith et al. (2021b), Ward et al. (2022)
Policy establishment | Implementation of assessment and feedback | Alsford (2012), Andrews et al. (2018), Geraghty et al. (2020), Lorber et al. (2019), Murphy et al. (2017), Peseta et al. (2016), Ward et al. (2022)
 | Grading systems | Geraghty et al. (2020), Lorber et al. (2019), Smith et al. (2021a)
 | Dissemination of policy | Peseta et al. (2016), Smith et al. (2021a, 2021b)
 | Practical issues | Alsford (2012), Geraghty et al. (2020), Lorber et al. (2019), Ward et al. (2022)
 
Table S7. Student roles
Role | Activities | Studies
Co-designers | Create and refine assessment tools/tasks | Andrews et al. (2018), Baerheim and Meland (2003), Cecchinato and Foschi (2017), Chamunyonga et al. (2018), Cosker et al. (2021), Deeley and Bovill (2017), Hussain et al. (2019), Kiester and Holowko (2020), Lorente and Kirk (2013), Lubicz-Nawrocka (2018), Quesada et al. (2019), Rivers et al. (2017), Snelling et al. (2019)
 | Develop and review assessment criteria/rubrics including feedback strategies | Abdelmalak (2016), Andrews et al. (2018), Colson et al. (2021), El-Mowafy (2014), Hussain et al. (2019), Kaur et al. (2017), Kearney (2019), Kiester and Holowko (2020), Leslie and Gorman (2017), Lorber et al. (2019), Lubicz-Nawrocka (2018), Matthews et al. (2017), Meer and Chapman (2015), Murphy et al. (2017), Orsmond et al. (2000), Orsmond et al. (2002), Smith et al. (2021a, 2021b), Stephenson (2006), Tiew (2010)
 | Create supporting resources for staff and students | Peseta et al. (2016), Smith et al. (2021a, 2021b)
 | Prepare answers for designed tasks | Baerheim and Meland (2003)
 | Justify design of tasks | Chamunyonga et al. (2018)
 | Design methods to gather feedback from students | Smith et al. (2021b)
 | Devise strategies based on feedback | Andrews et al. (2018), Smith et al. (2021b)
Assessors | Evaluate/grade peers’ work | Cecchinato and Foschi (2017), Chamunyonga et al. (2018), Collins (2010), Colson et al. (2021), Cosker et al. (2021), El-Mowafy (2014), Hussain et al. (2019), Kaur et al. (2017), Kearney (2019), Lorente and Kirk (2013), Meer and Chapman (2015), Orsmond et al. (2000), Orsmond et al. (2002), Taylor et al. (2015), Tiew (2010)
 | Comment on peers’ work | Abdelmalak (2016), Andrews et al. (2018), Chamunyonga et al. (2018), Colson et al. (2021), Cosker et al. (2021), Deeley and Bovill (2017), Hussain et al. (2019), Lubicz-Nawrocka (2018), Matthews et al. (2017), Taylor et al. (2021)
 | Evaluate/grade one’s own work | Cooper (2017), Deeley (2014), El-Mowafy (2014), Kearney (2019), Lorente and Kirk (2013), Matthews et al. (2017), Meer and Chapman (2015), Orsmond et al. (2000), Orsmond et al. (2002), Quesada et al. (2019), Thompson et al. (2017), Tiew (2010)
 | Reflect on one’s own performance | Colson et al. (2021), Deeley (2014), Quesada et al. (2019)
 | Negotiate/moderate grades | Deeley (2014), El-Mowafy (2014), Hussain et al. (2019), Lorente and Kirk (2013), Meer and Chapman (2015), Quesada et al. (2019), Thompson et al. (2017)
 | Justify grades/marks | Quesada et al. (2019), Thompson et al. (2017)
 | Evaluate effectiveness of assessment strategies/tasks/policies | Andrews et al. (2018), Cosker et al. (2021), Geraghty et al. (2020), Smith et al. (2021a, 2021b), Snelling et al. (2019)
 | Evaluate experiences with assessment | Bell et al. (2019), Kearney (2019), Kiester and Holowko (2020), Taylor et al. (2015), Ward et al. (2022)
Consultants | Identify/highlight issues in existing practice | Alsford (2012), Andrews et al. (2018), Kiester and Holowko (2020), Lorber et al. (2019), Smith et al. (2021b), Ward et al. (2022)
 | Propose changes/strategies for improvement | Bell et al. (2019), Cosker et al. (2021), Kiester and Holowko (2020), Lorber et al. (2019), Peseta et al. (2016), Snelling et al. (2019)
 | Collect and report feedback from students | Geraghty et al. (2020), Murphy et al. (2017), Peseta et al. (2016), Rivers et al. (2017), Smith et al. (2021b)
 | Facilitate discussions/workshops with students | Murphy et al. (2017), Peseta et al. (2016)
 | Revise and test assessment ideas | Bell et al. (2019)
Decision makers | Decide on grading methods | Abdelmalak (2016), Lorente and Kirk (2013), Stephenson (2006)
 | Negotiate deadlines | Godbold et al. (2021), Matthews et al. (2017), Ward et al. (2022)
 | Choose assessment topics/tasks | Kaur et al. (2017), Matthews et al. (2017), Ward et al. (2022)
 | Negotiate weightings | Godbold et al. (2021), Matthews et al. (2017)
 | Vote on curricular and policy issues | Geraghty et al. (2020), Lorber et al. (2019)
 
Table S8. Support provided by university staff
Type of support | Actions | Studies
Essential knowledge | Provide specific instructions/guidelines | Abdelmalak (2016), Chamunyonga et al. (2018), Cosker et al. (2021), Deeley (2014), Deeley and Bovill (2017), El-Mowafy (2014), Hussain et al. (2019), Kearney (2019), Lorente and Kirk (2013), Taylor et al. (2015), Tiew (2010)
 | Explain assessment and partnership principles | Andrews et al. (2018), Chamunyonga et al. (2018), Doyle et al. (2019), Lorente and Kirk (2013), Murphy et al. (2017), Rivers et al. (2017), Snelling et al. (2019)
 | Clarify key concepts | Andrews et al. (2018), Cecchinato and Foschi (2017), Chamunyonga et al. (2018), Collins (2010), Orsmond et al. (2000, 2002)
 | Prepare relevant reading resources | Abdelmalak (2016), Doyle et al. (2019)
Training and coaching | Provide samples of work and exemplars | Abdelmalak (2016), Andrews et al. (2018), Collins (2010), El-Mowafy (2014), Hussain et al. (2019), Meer and Chapman (2015), Orsmond et al. (2002), Taylor et al. (2015)
 | Organize and conduct training workshops | Bell et al. (2019), Cosker et al. (2021), Geraghty et al. (2020), Ward et al. (2022)
 | Conduct pilot marking sessions | Colson et al. (2021), El-Mowafy (2014), Kearney (2019), Taylor et al. (2015), Taylor et al. (2021), Tiew (2010)
Accuracy and quality check | Review student-developed tools | Abdelmalak (2016), Andrews et al. (2018), Baerheim and Meland (2003), Bell et al. (2019), Colson et al. (2021), Cosker et al. (2021), Doyle et al. (2019), El-Mowafy (2014), Godbold et al. (2021), Hussain et al. (2019), Lorber et al. (2019), Murphy et al. (2017), Quesada et al. (2019)
 | Verify appropriateness of design/ideas | Baerheim and Meland (2003), Bell et al. (2019), Cecchinato and Foschi (2017), Chamunyonga et al. (2018), Doyle et al. (2019), Hussain et al. (2019), Kiester and Holowko (2020)
 | Moderate student assessment | Deeley (2014, 2018), El-Mowafy (2014), Lorente and Kirk (2013), Meer and Chapman (2015), Thompson et al. (2017)
Partnership management | Facilitate negotiation and discussion | Abdelmalak (2016), Andrews et al. (2018), Chamunyonga et al. (2018), El-Mowafy (2014), Godbold et al. (2021), Hussain et al. (2019), Kaur et al. (2017), Lorente and Kirk (2013), Meer and Chapman (2015), Orsmond et al. (2002), Tiew (2010)
 | Collate and synthesize student responses | Andrews et al. (2018), Leslie and Gorman (2017), Tiew (2010)