Engaging Students Early: Institutional learning from programme progress reviews

Deeba Parmar, Agi Ryder, David Westwood
Middlesex University

SOLSTICE & CLTR Conference, 8-9 June, 2011

Context and rationale

Programme Progress Review (PPR) meetings are an opportunity to engage students in one-to-one meetings with academic staff at two points during the first year of undergraduate study (weeks 9 and 18), in order to discuss feedback from formative assessment activities and to consider their progress holistically across all of their first-year modules. The purpose of these reviews is to engage students with their programme and with academic staff, and to encourage them to reflect on and take ownership of their academic development and learning needs. The initiative was introduced at Middlesex University in 2010/11, following a pilot study in the previous academic year.

The University runs a range of programmes in a variety of ways. In developing PPRs it was recognised that flexibility was necessary to accommodate the disciplinary contexts of the undergraduate portfolio at Middlesex, as well as other factors such as cohort size and the nature of individual programmes.

This project aims to learn from the different ways in which PPRs have been delivered within departments, the perceived successes and challenges in terms of both delivery and outcomes of PPRs, any proposed future changes to their delivery, and the perceived impact the PPR has had on students.

The intended outcome of the project is to identify the initiative's successful elements in order to share good practice, to provide recommendations on effective ways of delivering PPRs, and to inform further development of the process for future years, for example by considering potential roll-out to years 2 and 3 of undergraduate programmes and to postgraduate programmes.

Research methodology and design

As this project is intended to identify the different approaches departments employ with regard to PPRs, and to investigate the perceived effectiveness of the initiative, it employs an evaluative approach.

The project relies on a mixed-methods approach to data collection. Qualitative data is gathered through semi-structured questionnaires from each department/school and, where appropriate, follow-up interviews with Heads of Departments to identify the approaches taken and the perceived usefulness of the initiative. Quantitative data is collected to identify the programmes in which the PPR has been employed.

Through reading, coding and analysing the reports returned from Schools and departments, a number of themes emerged, encapsulating both practical, process-based learning and wider theoretical insights into the implementation of PPRs.

Summary of overall learning

1. Different ways of doing PPRs

Across Schools, and even within departments, differing ways of delivering PPRs were employed.

Communication with students
    • Some departments had clear communication strategies including information provided in programme handbooks, within induction, on the VLE, within lectures/seminars and through email.
    • Often a guidance document and template texts were provided to staff on how and what to communicate to their students.
    • Different groups of people arranged the PPRs with students, e.g. programme administrators, SAAs or programme leaders. In some cases students were allocated to specific tutors, whilst in others students signed up to times posted on tutors' doors. One department stated that it matched staff and students according to specialism and personality type.
    • For those students who were unable to attend their allotted slot, alternative arrangements were made to reallocate them to another time.

Conducting PPRs
    • Mostly PPRs were delivered one-to-one, occasionally in small groups, and in one area as online questionnaires (due to severe weather causing travel difficulties). PPRs were delivered in weeks 8-10 and weeks 17-18. In some departments the PPR was extended to include years 2 and 3.
    • Many departments reported difficulties in holding the PPRs because space is at a premium. Meetings were therefore conducted in tutors' offices (negotiated with other tutors where rooms were shared), lecture/seminar rooms, quiet areas around the campus and the main atrium.
    • The majority of departments stated that teaching was not cancelled but rescheduled to accommodate the PPRs. In addition, tutors' office hours were used to hold the meetings.
    • Most PPR sessions were scheduled for 15 minutes, slightly longer for small groups.

Record keeping
    • Most departments recorded attendance, a brief tutor summary of the issues raised and any developmental work or direction required. These records were held on central departmental spreadsheets or in paper form and further analysed by SAAs or programme teams. Most departments encouraged students to keep their own copy to refer to and to aid their development, tied into their PDP.

2. Successes and challenges

Engaging the engaged
    • It was noted that PPRs were not seen as mandatory by students, and attendance figures therefore varied dramatically, from 7% to 96%. Attendance was highest when the PPR was scheduled into teaching time and the rationale was communicated clearly to students. However, from the evaluations returned there was a sense that the PPRs were engaging those students who were already engaged in their programme, and failing to reach those considered ‘not engaged’, who did not attend their scheduled PPR. Non-attenders were followed up across all Schools, and some programmes had specific channels for ‘chasing’, with emails from programme administrators, SAAs, PLs or, in some cases, DoPs.
    • Across the board there appeared to be greater attendance at the first PPR in week 9 than at the second in week 18. Some programmes gave students the option of not attending if they were pleased with their progress and had no other issues they wished to raise.

Perceived benefits
    • Students reported that the PPR was beneficial and a positive experience. They felt the department took an interest in their progress, aspirations and difficulties.
    • Tutors became more aware of individual student issues, trends across a cohort and support requirements. They could refer students to appropriate support or, where a student had already taken advantage of it, discuss their progress.
    • PPRs provided the opportunity to organise a more co-ordinated approach to supporting students across modules and programmes, and to refer students to other types of support where necessary.
    • Issues arose around confidentiality with regard to students sharing feedback with their own tutors versus unfamiliar tutors. There were potential benefits and challenges to both, depending on circumstances.
    • Students appeared more confident reporting issues to staff and hopefully regarded the PPRs as the start of an ongoing dialogue.
    • Several smaller operational issues flagged in the PPRs were sorted out in a timely fashion (e.g. timetables, session bookings, online resources).
    • PPRs were felt to increase students' engagement with academic staff and with their learning.
    • Some departments tapped into existing personal tutor networks (e.g. Midwifery).

Perceived challenges
    • The large number of tutors involved made it difficult to ensure a consistent approach to advertising PPRs, delivery and record keeping.
    • Significant numbers of students did not keep appointments, wasting academic time and creating additional administrative work in chasing them up.
    • Staff workloads increased; in some cases of large cohorts, staff who did not teach on year one were brought in to help deliver PPRs.
    • Those students who were already engaged were more likely to take up the offer of a PPR. The challenge remains engaging those students who most need it but are least likely to attend.
    • There were challenges in communicating the purpose of the PPR to students and in making a clear distinction between PPRs and other opportunities for giving and receiving feedback. The initiative was sometimes perceived negatively if students were unable to receive feedback or to arrange to speak to a tutor outside of the two allotted PPRs.
    • Scheduling appropriate spaces for PPRs was difficult, with rooms unavailable or already timetabled for classes. Some meetings therefore had to be conducted in coffee areas, which was not seen as ideal.

3. Impact on students

The PPRs were generally regarded as beneficial for both students and tutors.
    • Tutors became more aware of individual student issues and trends across a cohort, particularly in departments that kept central records and discussed their learning from the PPRs. Two departments coded students using a traffic-light system of concern, discussed flagged students with their PLs, and noted them on a central database in order to provide an overview of the cohort.
    • Through the PPRs, students were pointed towards additional subject help and learning support, including the academic skills and English language support offered within the LDU, although some tutors expressed concern that there was no clear way of finding out whether this additional help had subsequently been sought or taken up.
    • Staff felt that students on the whole perceived the PPRs as a worthwhile exercise, and some departmental feedback stated that students felt more ‘taken care of’. It was also perceived that students were more confident in communicating issues about their learning, and one department noted deeper reflection on learning in students' PDPs after the PPRs had taken place.

Use of time
    • There was some tension over the time allocated to PPRs. Most of the feedback returned indicated shared agreement on the usefulness of the PPRs and their rationale. However, some felt that the initiative took time away from teaching without benefiting students; as one academic put it, ‘You can’t feed a cow by weighing it’. Others, by contrast, stated that the time available (10-15 minutes) did not allow any depth of review, since student numbers were too great to deliver it one-to-one, and they were therefore only scraping the surface.

Measures of success
    • Attendance was often used as a measure of success for this initiative. Students were encouraged to attend through a variety of means (email, reminders in class, the rationale set out in the programme handbook, and, on one programme, a 1% grade incentive), although attendance varied significantly even where incentives were offered.
    • It was noted that although the PPRs could help all students in their development, they were not engaging those students who were of concern (i.e. those not attending or not submitting work).
    • Others stated that it was difficult to tie measures of success to grades, as they were unsure whether the suggested support had been taken up, and progression boards had not yet taken place.


This project is ongoing, and the institutional learning presented here is likely to influence the initiative's adoption in the next academic year.


If you’d like to learn more about the project, please contact one of us:
Deeba Parmar (d.parmar @ mdx.ac.uk)
Agi Ryder (a.i.ryder @ mdx.ac.uk)
David Westwood (d.westwood @ mdx.ac.uk)