
Surfacing the complexity of students’ experiences of assessment and feedback processes using a rich picture approach

Pages 467-480 | Received 12 Sep 2022, Accepted 19 Apr 2024, Published online: 09 May 2024

ABSTRACT

Student-staff dialogue is often emphasised as a means of improving students’ engagement with assessment and feedback processes. However, focusing on dialogue alone overlooks the complexity of students’ experiences and the sociomaterial contexts in which they occur. To surface the roles of the social and the material in students’ experiences, we engaged pedagogies of mattering theory, and employed a rich picture (RP) approach in which students visually depicted their experiences of assessment and feedback. We anticipated that making use of a range of icons, symbols, and visual metaphors might enable participants to think about what matters in their everyday experiences, moving beyond solely human–human interactions, to highlight the significance of the objects, spaces and material elements that are involved. RPs were analysed using a form of content analysis and the following recurrent motifs were identified: Visual metaphors depicting uncertainty; emotive faces showing impacts on wellbeing; seasons, clocks and calendars depicting the pervasiveness of processes; and figures and objects depicting human and non-human elements. Based on the findings, we argue for a shift to greater embedding of meaningful relational approaches in assessment and feedback processes.

Introduction

Negative experiences of assessment and feedback may affect students’ engagement with these fundamental aspects of their education (Sambell 2016). Yet, many of the studies exploring students’ experiences and perceptions of such processes have been criticised for being repetitive and lacking theoretical or conceptual frameworks (Van der Kleij and Lipnevich 2021). Furthermore, where such frameworks are present, until recently many have overlooked the situated contexts in which assessment and feedback processes take place in favour of standardised approaches to supporting feedback processes (Gravett and Carless 2024). In the current study, we therefore build on the literature base concerning students’ experiences and perceptions of assessment and feedback to foreground the importance of meaningful connections between students and staff, and with their university environments. That is, connections that may include a breadth of human and non-human actors (i.e. physical objects, materials and spaces that shape the learning experience). To surface the roles of the social and the material in students’ experiences, we think with relational theories of mattering to understand assessment and feedback processes, and we adopt the rich picture method (Bell, Berg, and Morse 2016). As a novel approach to assessment and feedback research, we suggest that inviting students to use imagery to visually depict their experiences of assessment and feedback allows us to interrogate these experiences in more methodologically authentic and creative ways, enabling us to achieve deeper, original insights.

Students’ experiences of assessment and feedback processes

Assessment is the primary form of certification in higher education (Boud and Associates 2010), and therefore often acts as a ‘gate-keeper’ to students’ future careers (Harman and McDowell 2011). With the teacher maintaining the power over processes (Small and Attree 2016), students may struggle to learn the ‘rules’ of assessment, such as how and when it is appropriate for them to seek feedback from lecturers (Gravett and Winstone 2019). As a result, students may feel there is a lack of fairness and clarity in assessment and feedback processes (Deeley et al. 2019; MacKay et al. 2019). This may be due to the perception that assessment guidance is often ambiguous, that marking practices are inconsistent, and that feedback from staff about what to do next is unclear (Deeley et al. 2019). The perceived lack of transparency about how assessment processes are enacted may lead to the belief that there are different ‘rules’ for staff and students, ‘creating an “us vs them” mentality’ (MacKay et al. 2019, 324). Additionally, (well-intentioned) feedback may be interpreted as a personal criticism, evoking a strong emotional response (Pitt and Norton 2017), with potentially negative consequences for students’ motivation (Ryan and Henderson 2018) and self-esteem (Ossenberg, Henderson, and Mitchell 2019). There is even emerging evidence that assessment processes can have a damaging impact on students’ wellbeing (Jones et al. 2021).

Research has found that students perceive that more opportunities for dialogue with lecturers might counter some of these negative impacts by providing a space for seeking further clarification around areas of uncertainty (Ali, Ahmed, and Rose 2018; Chalmers, Mowat, and Chapman 2018). Indeed, dialogue can position the student as a proactive recipient with increased agency (Winstone et al. 2017), opening up more opportunities for students to elicit valuable informal feedback (Joughin et al. 2021). Furthermore, in some circumstances dialogue has been found to help redress unequal power dynamics (Green 2019; Rees et al. 2020). Despite the usefulness of these insights, dialogic approaches do not inevitably lead to productive, meaningful and equitable outcomes (Gravett 2022; Jørgensen 2019; Quinlan and Pitt 2021). Student agency is situated within entanglements of the human and non-human (Gravett and Carless 2024; Nieminen et al. 2022). Agency and engagement with feedback can be seen as being influenced by multiple contextual factors (e.g. features of feedback, student-teacher relationships, spaces, curriculum materials, and the roles of teachers and learners) and individual factors (e.g. beliefs and goals, feedback experience, and abilities) (Chong 2021). Providing opportunities for dialogue is important, but does not mean that effective engagement with feedback will necessarily follow, given the myriad of other factors that impact upon students’ engagement in learning (Gravett and Carless 2024).

We suggest that what is needed is a deeper exploration of students’ fine-grained experiences of interactions within assessment and feedback processes (Ajjawi and Boud 2017; Esterhazy 2018) that enables researchers to ‘attune to the emergent moments of feedback…. generated as people and things come together’ (Gravett and Carless 2024, 143–147). Specifically, in the current study, we engage pedagogies of mattering theory in order to re-examine what matters in students’ experiences of assessment and feedback, and to consider the significance of social and material factors in these processes. Gravett, Taylor, and Fairchild (2024) note that in ‘current formulations of a relational pedagogy, who and what matters is almost always understood as human’ (393). Thus, they recast relational pedagogies as pedagogies of mattering to ‘enable us to reconceptualise relations as a more-than-human concern, to surface and problematise the inequalities that underpin concepts of student engagement, as well as to explore enriched ways of enacting relational pedagogies’ (400–401). Pedagogies of mattering offer new insights into how both the material and the discursive matter in learning and teaching, understanding entanglements of staff–student relations with care, and teaching–learning relations with matters of power (Gravett, Taylor, and Fairchild 2024). Thinking with these ideas enables researchers to explore how learning and teaching practices do not always produce predictable results, and to look at assessment and feedback processes in new ways.

Surfacing complexity in students’ experiences: the rich picture method

Researchers wishing to make sense of complex experiences often begin by interviewing the individuals at the centre of those experiences. But interviews can be frustratingly limited. Visual methods, such as drawings, are beginning to show promise for designing research that taps into the complexity of professional practice.

(Cristancho 2015, 138)

When examining students’ experiences within higher education, ‘focus group discussions offer the potential to engage with students as partners and surface a more authentic voice’ (Bourne and Winstone 2021, 352). Yet, there may be a limit to the extent to which these discursive approaches can glean fresh insights into students’ experiences, particularly when the intention is to explore more complex aspects of assessment and feedback experiences. Indeed, Tai et al. (2019) highlight that ‘a study focussing on sociomaterial impacts of feedback would have different aims and methodologies, compared to a study grounded in a cognitivist tradition’ (43–44). Thus, in order to gain a different perspective that potentially taps into both the social and the material in students’ experiences of assessment and feedback, in the current study we drew on the rich picture (RP) method (Bell, Berg, and Morse 2016). The RP method involves participants creating visual and symbolic representations of their lived experiences using pens and paper: ‘The pictures produced capture not just representations of the physical world, but conceptual and emotional aspects relevant to the individuals concerned, aspects which alternative methods may not educe in the same manner’ (Berg et al. 2017, 1343). This approach gives participants active authorship of the data, and traverses linguistic, cultural and educational barriers (Berg et al. 2017), so it may be particularly effective when exploring topics where students are likely to acquiesce and/or where some individuals can dominate a focus group discussion. Bourne and Winstone (2021) also make a case for activity-oriented questions over didactic direct questioning in focus groups to enable diverse voices to be expressed. The RP approach has been used to gain ‘insight into tacit perceptions such as motivations, hopes, fears, goals and threats’ (Berg et al. 2017, 1343). It ‘allows participants to access unconscious thought…. [so] they are able to see links/connections/patterns and/or explore meanings/implications’ (Gisby et al. 2023, 206). Therefore, we anticipated that this approach might provide a means for students to contextualise assessment and feedback within their wider student experience by exploring links with factors that sit outside of certification processes. Icons, symbols, and visual metaphors can provide individuals with a language for communicating the complexity in their lives (Cristancho and Helmich 2019). We therefore hoped that making use of iconography might enable participants to think about what matters in their everyday experiences, moving beyond solely human–human interactions (e.g. between themselves and their lecturers), to highlight the significance of the objects, spaces and material elements that are involved.

Methods

Participants

An opportunity sample of seven students (six female, one male) was recruited from a university in the South-East of England; participants received online shopping vouchers as compensation for their time. Participants came from a range of subject areas, including psychology, liberal arts and sciences, law and criminology, aerospace engineering, theatre and performance, and media studies. Despite the diversity in subject areas of the sample, we intended to explore common experiences in assessment and feedback processes that cut across disciplines. One participant was in their second year of undergraduate study, five were in their final year, and one was a doctoral student (although this student only focused their discussions on their assessment and feedback experiences of previous taught degree programmes at the University). Since the seven participants each produced three RPs, the resulting 21 pictures were deemed a sufficiently large data set for the purposes of performing an in-depth qualitative analysis.

Rich picture approach

All institutional procedures for ethical review were followed and participants provided informed consent. Two 2-hour focus group sessions were led by two undergraduate interns under the supervision of three academic members of staff, all of whom are co-authors of this article. None of the authors were teachers or peers of the participants. Participants were provided with A3 sheets of paper and coloured pens. Although RPs are traditionally produced collaboratively (Berg et al. 2017), the flexibility of the approach means that it is equally suitable for participants to produce RPs on their own (Gisby et al. 2023). In this study, we asked participants to draw their RPs individually as an acknowledgement of the personal and emotional sensitivities of students’ experiences of assessment and feedback. Individual RPs are also sometimes preferred ‘for the purpose of isolating iconography’ and to stop students from copying each other (Berg and Pooley 2013, 36). The rest of the focus group time was given over to discussion to enable participants to share and debate their drawings collectively, and to empathise and search for commonalities in their experiences.

To get participants accustomed to the method, a warm-up drawing activity was conducted in which they drew a picture based on the question, ‘why is the subject you are studying for your degree important?’ Once participants had confirmed during this warm-up activity that they were comfortable with the method and instructions, they completed three RPs individually by following instructions to draw, visually display, annotate, and capture their thoughts in relation to the following questions:

  1. What is your experience of assessment and feedback at university?

  2. What do students want from assessment and feedback?

  3. What role do assessment and feedback play in your wider university experience?

These questions were chosen in order to encourage the participants to think broadly about their experiences within the wider context of their university life, rather than simply focusing on isolated incidents. Participants were given 10 minutes to draw each RP, and after each question, they spent 20 minutes as a group sharing and discussing the meaning behind their pictures. Focus group facilitators and the other participants asked probing questions during this discussion to elicit further explanations of the pictures. The ensuing discussions were audio recorded and transcribed, and the pictures were then collected.

Analytical approach

Analysis of the RPs was informed by the Eductive Interpretation (EI) procedure, outlined by Bell, Berg, and Morse (2019), which is a form of content analysis recommended for analysing RPs. EI enabled us to draw out messages, stories and emotions from across the RPs and explore the visual metaphors held within them. In practice, our approach to EI entailed each researcher analysing the RPs individually and identifying the icons or metaphors present. In keeping with the usual practice when analysing responses to focus group questions, rather than isolating out and focusing on each of the RPs produced for the three questions separately, we combined all of the RPs into a single data set for analysis. Once each researcher had analysed all of the RPs, they each produced EI themes, bringing together the icons and metaphors. For example, as can be seen in Figure 2, emotive faces appeared in multiple RPs, so these were extracted and used to generate a theme. Transcripts from the focus group discussions were used to aid interpretation of the images based on participants’ explanations, in order to avoid the images being taken out of the context in which they were drawn. Thus, discussion of the findings is supported by quotations from the focus groups. Upon sharing individual findings as a group, we encountered considerable consensus about the meaning behind the icons and metaphors in the RPs, which made the themes easier to establish.

Reflexivity of the researchers

The EI analysis was conducted by all staff and student co-authors, so the themes drew on our own experiences of designing and completing assessment tasks, as well as engaging in feedback interactions. We adopted an evolving co-constructed approach to data analysis, where student perspectives were brought to the fore to enable the researchers to view students’ experiences as guided by students themselves. In practice, this meant that during analysis, the two student co-researchers acted as ‘critical friends’ (Smith and McGannon 2018), interrogating interpretations of the data and enriching the analysis with their own student perspectives. This positions co-construction as a fundamental thread interwoven throughout the study, whereby connections between staff and students were present in the design of the study, the collection of data, and the process of analysis.

Findings and discussion

The themes generated from the EI analysis of the RPs were based on recurrent motifs. Each theme is listed as a heading below, and is illustrated by grouping similar excerpts from the RPs within a single figure to surface related metaphors and common narratives in students’ experiences. Each excerpt has been placed within its own ‘cell’ within the figure and given a letter to identify it (e.g. Figure 1(a) refers to the cell marked ‘a’ in Figure 1). Quotations related to students’ descriptions of these excerpts accompany their images. Further quotations from the focus groups have also been included where relevant to the themes.

Metaphors depicting uncertainty

Feelings of uncertainty about assessment and feedback were expressed through metaphors in the RPs. In Figure 1(a), the ephemeral image of a floating balloon with the word feedback written inside is used ‘because sometimes [feedback] appears and sometimes it doesn’t’, demonstrating the variability in feedback practices between lecturers. Similarly, another student drew a lighthouse (Figure 1(b)) as a metaphor for how feedback can be elusive and unclear; students may therefore need to ‘search in the dark’ to learn what they need to do differently if they want to improve:

I think in terms of what you want from feedback, you want to be able to define what the problem is. So by lighthouse, searching. Because often it’s like searching in the dark for something. I think it’s very hard if you just keep doing what you’re doing. You’re never going to know how to improve.

Figure 1. Uses of metaphors depicting uncertainty in excerpts from RPs.

Drawing of a balloon with the word feedback inside in the cell marked a. Drawing of a lighthouse in the cell marked b.

The perceptions of ambiguity and uncertainty expressed by the participants chime with MacKay et al.’s (2019) and Deeley et al.’s (2019) findings regarding a perceived lack of fairness, transparency and clarity in assessment and feedback processes. This could lead to a sense of frustration and feelings of anxiety (Beaumont, O’Doherty, and Shannon 2011), in which students unsuccessfully struggle to search for the answers to their difficulties within the physical artefacts they have been provided. This reinforces the importance of meaningful human–human interactions. These examples highlight situations in which feedback is either being provided without opportunities for seeking clarification, or in which students may not feel ‘allowed to ask why I’m getting that mark’. This impression of ‘not being allowed’ is unlikely to be rectified by simply increasing opportunities for dialogue, because students may not feel comfortable approaching their lecturers to question their judgements. Deeley et al. (2019) also found that students did not always feel able to seek further feedback out of fear that they would be perceived as challenging their lecturer’s authority or expertise. This likely fuels the ‘“us vs them” mentality’ (MacKay et al. 2019, 324), as assessment is positioned as an external and disembodied thing that is ‘done to’ students, instead of being a ‘material-discursive doing.... in which students have a stake’ (Gravett, Taylor, and Fairchild 2024, 399).

Emotive faces and metaphors depicting impacts on wellbeing

A wide spectrum of emotions related to assessment and feedback was expressed in the RPs, largely through the use of emotive faces. In one excerpt from Figure 2, the student depicts the sensation of being overwhelmed through their image of a person with an alarmed expression surrounded by lots of things (i.e. people and objects) on their mind:

Figure 2. Uses of emotive faces and metaphors depicting impacts on wellbeing in excerpts from RPs.


You’ve got societies, you’ve got maybe things at home, you’ve got other things. Then assessments end up taking a much bigger part of that than I would like it to…. it can play a big part and make you feel a little bit overwhelmed.

This reflects the conceptualisation of assessment as a material-discursive practice (Gravett, Taylor, and Fairchild 2024); the assessment process is entangled with the student and a multiplicity of other actors – ‘societies’, ‘things at home’ – that constitute their university life. Similarly, another student uses a face with a troubled expression surrounded by random symbols seemingly representing confusion (Figure 2) to highlight how receiving feedback can be ‘quite draining’ due to the ‘pressure of needing to be the best or needing to do really well in [our] lives’, stressing how students rely on the impacts of their assessments beyond university for their future careers (Harman and McDowell 2011). Therefore, effects are felt far beyond the classroom, ‘completely shap[ing] not only my mood, but everyone I know’s mood’, as another student noted. One participant places a face within a frame labelled as a mood changer (Figure 2) to show the power feedback has to transform their mood from one extreme to the other: ‘there are only two ways in which [feedback] could affect me.... either I get an ego boost or it will lower my confidence and that’s really a mood-changer for me’. The same student shows their sense of feeling overwhelmed through a drawing of a metaphorical bomb within a thought bubble (Figure 2): ‘there’s a bomb in my brain because I’m always too overwhelmed at the first instance just reading through [written feedback information]’.

Rather than students being protected from emotional reactions to feedback, there is growing recognition that the process of engaging with such emotional responses is an important part of students’ development (Carless and Boud 2018; Noble et al. 2020; Winstone, Balloo, and Carless 2022). Indeed, Molloy, Noble, and Ajjawi (2019) observe that educators’ attempts to suppress this emotional impact by ‘removing the static’ and ‘softening the blow’ may only serve to confuse the messages contained in feedback information. Therefore, students must learn how to ‘navigate the emotional turmoil usually associated with feedback processes’ (Noble et al. 2020, 69). However, a notable issue expressed through participants’ RPs was the impact of assessment on their self-worth and wellbeing, indicating the long-term impact of even a single feedback encounter. Grades were also seen as ‘a value judgement about you as opposed to just a judgement of your academic performance in that one specific task’ and ‘a number defining your value’. In another excerpt from Figure 2, the student juxtaposes happy and sad faces to show the two sides of assessment and feedback processes, noting their potential impact on students’ wellbeing:

mental health is dependent on assessment and feedback. It’s inextricably linked to it. I’ve got friends who’ve gone through really hard times directly following feedback. So in a way, it’s shaping the social environments at university quite a lot …. I think they become internalised into your self-image quite quickly so that it’s how you value yourself and how you see yourself as a worthy human being.

This is another example in which the boundaries between the social and the material are blurred. Assessment and feedback matter a great deal to students and carry a significant amount of influence (Gravett, Taylor, and Fairchild 2024). Likewise, in a powerful image of a stamped forehead being struck by lightning (Figure 2), the student depicts the deleterious effects of assessment and feedback, suggesting that processes can expose and exacerbate students’ pre-existing vulnerabilities:

The face has the fragile stamp on her head because of mental health, really. Because a lot of university students are vulnerable and they’re made even more vulnerable because of assessments. And it’s very easy to forget to look after yourself, and also your social interactions go down so much during assessment period because everyone’s focused on what they’re doing and they’re trying to pass. So it becomes a very fragile state during assessments.

Finally, in another excerpt from Figure 2, the student has drawn arrows from a happy face to an upset face to demonstrate the up-and-down process of working through different emotions, from submitting an assessment to receiving their mark and feedback. These findings emphasise the need for a ‘balancing act’ between feedback being constructive, assessment being suitably challenging, and the ‘psychological threats’ this poses to students’ wellbeing being managed (Jones et al. 2021, 438). Therefore, meaningful interpersonal connections and strong communication that signals how much students matter (Gravett, Taylor, and Fairchild 2024) may be the crucial support needed to help students navigate these processes.

Seasons, clocks and calendars depicting the pervasiveness of processes

The omnipresent and pervasive nature of assessments and their associated deadlines appeared to be a salient issue for students. This was represented by images depicting the passage of time. In Figure 3(a), a human figure is surrounded by different types of weather to reflect changing seasons: ‘my whole world ends up revolving around assessments, but more just that I have to plan my year based on assessments and feedback…. to make all my plans fit around my assessments’. Another drawing (Figure 3(b)) highlights how a 2-hour exam carries a disproportionate amount of weight compared to overall outcomes. Similarly, images of calendars (Figure 3(c,d)) show how consumed students are by their assessments:

Figure 3. Uses of seasons, clocks and calendars to depict the pervasiveness of processes in excerpts from RPs.

Picture of a stick figure surrounded by weather symbols in cell a. Picture of pie charts with a label that says time in cell b. Picture of calendars in cells c and d. Pictures of clocks and happy and sad faces in cells e to i.

[Assessment and feedback] is the university experience really…. the actual time that the assessment and feedback takes up, say, until our exam, has such a huge bearing on the outcome that’s so disconnected to the amount of time spent on it that it shapes a lot of future paths and directions in your life. And that’s technically what university is mostly focused on, that it completely determines for me the calendar of my year.

A number of the RP excerpts (Figure 3(e–i)) include clocks to mark the importance of time in assessment and feedback processes. In one of these excerpts, the student has drawn a clock on a chain hanging around the neck of a person, which could be seen as weighing her down. Some of these clocks are combined with happy and sad faces to portray feelings of satisfaction and dissatisfaction related to feedback turnaround times: ‘We were getting the feedback from the first [assessment] the day that the second [assessment] was due. So, then you can’t use that feedback for anything useful until the final [assessment]’. Indeed, the timeliness of feedback processes, in terms of giving students time to act on feedback information, has been highlighted as an important concern in the utility of feedback (Pitt and Norton 2017; Quinlan and Pitt 2021; Winstone et al. 2016).

Figures and objects depicting human and non-human elements

Students’ RPs included recurring images of people (stick figures, roles, faces) and objects (clocks, laptops, computers, books, desks, paper, a phone), emphasising how objects, spaces and material elements shape and foster experiences of assessment and feedback processes. Whilst these elements are present across all themes, certain excerpts particularly demonstrate the weight of human–material interactions. In Figure 4(a), the student depicts a brain connected to a piece of student work to represent their belief that assessments should be ‘a true reflection of [students’] actual thought processes’. Therefore, assessments are expected to be a physical manifestation of students’ understanding – the assignment as a physical artefact is entangled with the student. However, the student continues: ‘The feedback and the effort and energy [students] put in should be reflected in the grades, and it isn’t always’. Thus, there is a perceived disconnect between the human (thought processes and understanding) and the non-human (grades), suggesting a perceived lack of fairness in the assessment process. At times, technologies are also described as inhibiting and obstructing engagement. Feedback could be seen as: ‘disconnected and digital. I’ve never received feedback in person. It’s always been a number popping up in a file on a screen with very little context or explanation behind it’, as depicted by an image showing the transmission of information between two computers (Figure 4(b)), which almost completely excises the role of human–human interactions in processes. Another student uses a picture of a smartphone (Figure 4(c)) to illustrate the significance of the group chats that take place after the release of marks and feedback: ‘As soon as feedback comes out, our course group chat explodes, complaining about everything’, which they use to ‘often compare feedback’. These interactions occur online and are mediated through the object which symbolises this interaction: the student’s phone. These conversations might be seen as necessary to students if feedback comments and grades are simply transmitted to them without further discussion to enable clarification. Furthermore, this illustrates the complexity of interactions that are occurring outside of the classroom in students’ everyday lives, and not involving staff members. Strong and meaningful connections with lecturers – the human–human interactions – were seen as central to students benefiting from feedback, which highlights how simply opening up spaces for dialogue (e.g. making office hours available) without considering the wider context (e.g. that students might not feel comfortable seeking further feedback) is unlikely to be sufficient. This is underscored by the two human figures shown collaborating in Figure 4(d):

Figure 4. Figures and objects depicting human and non-human elements in excerpts from RPs.

Picture of a document connected to a brain with lines in cell a. Picture of laptops connected with a line in cell b. Picture of a smartphone in cell c. Picture of two stick figures jointly holding a document in cell d.

I think it’s always better when you have some sort of personal connection with a lecturer. I always find that if I have a real problem, I would go and see them because I think it’s really difficult just to send emails back and forth.

This entanglement of the human and non-human resonates with recent work exploring the impact of the material, such as artefacts, technologies and spaces, upon students’ experiences of feedback interactions (Ajjawi and Boud 2017; Gravett 2022; Nieminen et al. 2022; Quinlan and Pitt 2021). It emphasises the importance of enacting pedagogies of mattering to recognise assessment and feedback processes as both discursive and material. That is, dialogue and student agency both occur within situated sociomaterial contexts.

Insights and implications

Our findings reveal the complex impacts of assessment and feedback on students. By using the RP approach to revisit a thorny issue from an original perspective, students were able to communicate the complexity inherent in their situated experiences of assessment and feedback. Our article offers a number of contributions for educators. In this study, we were interested in exploring the role of both the social and the material in students’ learning experiences. Engaging ideas from pedagogies of mattering theory (Gravett, Taylor, and Fairchild 2024), we suggest that a renewed focus on the material may help educators to confront some of the challenges raised, through attending to assessment and feedback processes as situated affective encounters, generated when people, objects, spaces and material elements come together. Gaining a deeper understanding of the complex nature of students’ lives, such as their motivations and responsibilities outside the classroom, could enable more trust to form between staff and students. Connections built on trust might also assuage the ‘“us vs them” mentality’ between staff and students (MacKay et al. 2019, 324). Students’ RPs depict their experiences of assessment and feedback as full of ambiguity and uncertainty. Without strong connections, the power dynamics inherent to assessment situations (e.g. Small and Attree 2016) may lead students to feel unable to contact lecturers, leaving uncertainties unresolved.

Assessment and feedback processes are also perceived as being ever-present and ongoing, where time pressures are a significant source of challenge. It was evident that, for students, assessment and feedback are integral aspects of their overall student experience. This indicates that initiatives to ‘improve’ assessment and feedback are unlikely to be effective unless considered within the wider context of the student’s life at university; our participants discussed these processes as permeating all elements of their lives as a student. This potentially signals a need for an innovative re-examination of how processes are developed. Initiatives and developments in assessment and feedback tend to isolate actors and systems without recognising their entanglement. For example, an action plan might have separate lines for staff training on feedback and for developing electronic rubrics, which implies that these different processes can be improved in parallel without any interaction. We contend that efforts to enhance assessment and feedback processes can only truly be achieved through a holistic focus on assessment, feedback, teaching, learning and the wider student experience.

Digital assessment and feedback practices have become far more prevalent across the sector as a result of COVID-19, which may create additional challenges for staff (Watermeyer et al. 2021). However, technologies can be enabling in such circumstances. For example, if the main summative assessment and feedback process is completed using technology, then programme teams need to make space for other forms of meaningful dialogue, and think carefully about how connections can be fostered online. In fact, one solution to this conundrum involves disentangling feedback opportunities from summative grading events, so that the purpose of feedback (i.e. to support development) is not overpowered by the certification function of assessment (Winstone and Boud 2022). Other practices that might support strong connections between staff and students include staff sharing their own emotional responses to feedback (e.g. in response to peer review) to open up spaces for meaningful dialogue that help students to understand how emotion relates to their own responses to feedback (Bearman and Molloy 2017; Gravett et al. 2020). This could also be an opportunity to tackle threats to students’ wellbeing in situ, before they become ‘internalised into your self-image’, as one student put it.

Conclusion

Our study offers a contribution to the field through its adoption of the creative rich picture method, which served as a highly effective way to surface deeper considerations of the impact of assessment and feedback processes. The RP approach enabled participants to depict holistic connections between assessment, feedback and their student experience, highlighting the centrality and impact of assessment and feedback processes in their day-to-day lives. This approach provides further evidence for the argument that it is not enough for assessment to be ‘bolted on’ to a course without interacting with the learning process. The RP approach enabled our participants to become active creators of the data, producing their own visual artefacts. This participatory, material engagement changed the energy and power balance in the room. The RP method also offered participants space to articulate their thoughts via discussion. It enabled students to generate powerful metaphors and imagery, and we would recommend it as a valuable and enjoyable means of data collection. Furthermore, there may be potential for future research exploring experiences of digital assessment and feedback practices using the RP method.

The research findings also foreground the need to think about who and what matters to students. We recognise that for academics, increasing pressures, workloads, and precarity create real barriers to connecting meaningfully with students, impacting upon their own sense of mattering, wellbeing and agency. However, when academics are able to connect with students, meaningful relationships communicate to students that they are recognised by staff and that they matter (Gravett and Winstone 2022; Gravett, Taylor, and Fairchild 2024). Furthermore, embedding relational practices within feedback interactions might facilitate a greater understanding among educators of the direct impact of assessment and feedback on students’ experiences. Finally, it may also open up further spaces for conversations regarding the significance of mattering, and what resources academics may need to be able to meaningfully support and interact with students.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Ajjawi, R., and D. Boud. 2017. “Researching Feedback Dialogue: An Interactional Analysis Approach.” Assessment & Evaluation in Higher Education 42 (2): 252–265. https://doi.org/10.1080/02602938.2015.1102863.
  • Ali, N., L. Ahmed, and S. Rose. 2018. “Identifying Predictors of students’ Perception of and Engagement with Assessment Feedback.” Active Learning in Higher Education 19 (3): 239–251. https://doi.org/10.1177/1469787417735609.
  • Bearman, M., and E. Molloy. 2017. “Intellectual Streaking: The Value of Teachers Exposing Minds (And Hearts).” Medical Teacher 39 (12): 1284–1285. https://doi.org/10.1080/0142159X.2017.1308475.
  • Beaumont, C., M. O’Doherty, and L. Shannon. 2011. “Reconceptualising Assessment Feedback: A Key to Improving Student Learning?” Studies in Higher Education 36 (6): 671–687. https://doi.org/10.1080/03075071003731135.
  • Bell, S., T. Berg, and S. Morse. 2016. Rich Pictures: Encouraging Resilient Communities. Routledge. https://doi.org/10.4324/9781315708393.
  • Bell, S., T. Berg, and S. Morse. 2019. “Towards an Understanding of Rich Picture Interpretation.” Systemic Practice and Action Research 32 (6): 601–614. https://doi.org/10.1007/s11213-018-9476-5.
  • Berg, T., T. Bowen, C. Smith, and S. Smith. 2017. “Visualising the Future: Surfacing Student Perspectives on Post-Graduation Prospects Using Rich Pictures.” Higher Education Research & Development 36 (7): 1339–1354. https://doi.org/10.1080/07294360.2017.1325855.
  • Berg, T., and R. Pooley. 2013. “Contemporary Iconography for Rich Picture Construction.” Systems Research and Behavioral Science 30 (1): 31–42. https://doi.org/10.1002/sres.2121.
  • Boud, D., and Associates. 2010. Assessment 2020: Seven Propositions for Assessment Reform in Higher Education. Sydney, Australia: Australian Learning and Teaching Council.
  • Bourne, J., and N. E. Winstone. 2021. “Empowering students’ Voices: The Use of Activity-Oriented Focus Groups in Higher Education Research.” International Journal of Research & Method in Education 44 (4): 352–365. https://doi.org/10.1080/1743727X.2020.1777964.
  • Carless, D., and D. Boud. 2018. “The Development of Student Feedback Literacy: Enabling Uptake of Feedback.” Assessment & Evaluation in Higher Education 43 (8): 1315–1325. https://doi.org/10.1080/02602938.2018.1463354.
  • Chalmers, C., E. Mowat, and M. Chapman. 2018. “Marking and Providing Feedback Face-To-Face: Staff and Student Perspectives.” Active Learning in Higher Education 19 (1): 35–45. https://doi.org/10.1177/1469787417721363.
  • Chong, S. W. 2021. “Reconsidering Student Feedback Literacy from an Ecological Perspective.” Assessment & Evaluation in Higher Education 46 (1): 92–104. https://doi.org/10.1080/02602938.2020.1730765.
  • Cristancho, S. 2015. “Eye Opener: Exploring Complexity Using Rich Pictures.” Perspectives on Medical Education 4 (3): 138–141. https://doi.org/10.1007/s40037-015-0187-7.
  • Cristancho, S. M., and E. Helmich. 2019. “Rich Pictures: A Companion Method for Qualitative Research in Medical Education.” Medical Education 53 (9): 916–924. https://doi.org/10.1111/medu.13890.
  • Deeley, S. J., M. Fischbacher-Smith, D. Karadzhov, and E. Koristashevskaya. 2019. “Exploring the ‘Wicked’ Problem of Student Dissatisfaction with Assessment and Feedback in Higher Education.” Higher Education Pedagogies 4 (1): 385–405. https://doi.org/10.1080/23752696.2019.1644659.
  • Esterhazy, R. 2018. “What Matters for Productive Feedback? Disciplinary Practices and Their Relational Dynamics.” Assessment & Evaluation in Higher Education 43 (8): 1302–1314. https://doi.org/10.1080/02602938.2018.1463353.
  • Gisby, A., C. Ross, J. Francis-Smythe, and K. Anderson. 2023. “The ‘Rich Pictures’ Method: Its Use and Value, and the Implications for HRD Research and Practice.” Human Resource Development Review 22 (2): 204–228. https://doi.org/10.1177/15344843221148044.
  • Gravett, K. 2022. “Feedback Literacies as Sociomaterial Practice.” Critical Studies in Education 63 (2): 261–274. https://doi.org/10.1080/17508487.2020.1747099.
  • Gravett, K., and D. Carless. 2024. “Feedback Literacy-As-Event: Relationality, Space and Temporality in Feedback Encounters.” Assessment & Evaluation in Higher Education 49 (2): 142–153. https://doi.org/10.1080/02602938.2023.2189162.
  • Gravett, K., I. M. Kinchin, N. E. Winstone, K. Balloo, M. Heron, A. Hosein, S. Lygo-Baker, and E. Medland. 2020. “The Development of academics’ Feedback Literacy: Experiences of Learning from Critical Feedback via Scholarly Peer Review.” Assessment & Evaluation in Higher Education 45 (5): 651–665. https://doi.org/10.1080/02602938.2019.1686749.
  • Gravett, K., C. A. Taylor, and N. Fairchild. 2024. “Pedagogies of Mattering: Re-Conceptualising Relational Pedagogies in Higher Education.” Teaching in Higher Education 29 (2): 388–403. https://doi.org/10.1080/13562517.2021.1989580.
  • Gravett, K., and N. E. Winstone. 2019. “‘Feedback interpreters’: The Role of Learning Development Professionals in Facilitating University students’ Engagement with Feedback.” Teaching in Higher Education 24 (6): 723–738. https://doi.org/10.1080/13562517.2018.1498076.
  • Gravett, K., and N. E. Winstone. 2022. “Making Connections: Authenticity and Alienation within students’ Relationships in Higher Education.” Higher Education Research & Development 41 (2): 360–374. https://doi.org/10.1080/07294360.2020.1842335.
  • Green, S. 2019. “What Students don’t Make of Feedback in Higher Education: An Illustrative Study.” Journal of English for Academic Purposes 38:83–94. https://doi.org/10.1016/j.jeap.2019.01.010.
  • Harman, K., and L. McDowell. 2011. “Assessment Talk in Design: The Multiple Purposes of Assessment in HE.” Teaching in Higher Education 16 (1): 41–52. https://doi.org/10.1080/13562517.2010.507309.
  • Jones, E., M. Priestley, L. Brewster, S. J. Wilbraham, G. Hughes, and L. Spanner. 2021. “Student Wellbeing and Assessment in Higher Education: The Balancing Act.” Assessment & Evaluation in Higher Education 46 (3): 438–450. https://doi.org/10.1080/02602938.2020.1782344.
  • Jørgensen, B. M. 2019. “Investigating Non-Engagement with Feedback in Higher Education as a Social Practice.” Assessment & Evaluation in Higher Education 44 (4): 623–635. https://doi.org/10.1080/02602938.2018.1525691.
  • Joughin, G., D. Boud, P. Dawson, and J. Tai. 2021. “What Can Higher Education Learn from Feedback Seeking Behaviour in Organisations? Implications for Feedback Literacy.” Assessment & Evaluation in Higher Education 46 (1): 80–91. https://doi.org/10.1080/02602938.2020.1733491.
  • MacKay, J. R. D., K. Hughes, H. Marzetti, N. Lent, and S. M. Rhind. 2019. “Using National Student Survey (NSS) Qualitative Data and Social Identity Theory to Explore students’ Experiences of Assessment and Feedback.” Higher Education Pedagogies 4 (1): 315–330. https://doi.org/10.1080/23752696.2019.1601500.
  • Molloy, E., C. Noble, and R. Ajjawi. 2019. “Attending to Emotion in Feedback.” In The Impact of Feedback in Higher Education, edited by M. Henderson, R. Ajjawi, D. Boud, and E. Molloy, 83–105. Palgrave Macmillan. https://doi.org/10.1007/978-3-030-25112-3_6.
  • Nieminen, J. H., J. Tai, D. Boud, and M. Henderson. 2022. “Student Agency in Feedback: Beyond the Individual.” Assessment & Evaluation in Higher Education 47 (1): 95–108. https://doi.org/10.1080/02602938.2021.1887080.
  • Noble, C., S. Billett, L. Armit, L. Collier, J. Hilder, C. Sly, and E. Molloy. 2020. “‘It’s Yours to take’: Generating Learner Feedback Literacy in the Workplace.” Advances in Health Sciences Education 25 (1): 55–74. https://doi.org/10.1007/s10459-019-09905-5.
  • Ossenberg, C., A. Henderson, and M. Mitchell. 2019. “What Attributes Guide Best Practice for Effective Feedback? A Scoping Review.” Advances in Health Sciences Education 24 (2): 383–401. https://doi.org/10.1007/s10459-018-9854-x.
  • Pitt, E., and L. Norton. 2017. “‘Now that’s the Feedback I want!’ Students’ Reactions to Feedback on Graded Work and What They Do with it.” Assessment & Evaluation in Higher Education 42 (4): 499–516. https://doi.org/10.1080/02602938.2016.1142500.
  • Quinlan, K. M., and E. Pitt. 2021. “Towards Signature Assessment and Feedback Practices: A Taxonomy of Discipline-Specific Elements of Assessment for Learning.” Assessment in Education: Principles, Policy & Practice 28 (2): 191–207. https://doi.org/10.1080/0969594X.2021.1930447.
  • Rees, C. E., C. Davis, O. A. King, A. Clemans, P. E. S. Crampton, N. Jacobs, T. McKeown, J. Morphet, and K. Seear. 2020. “Power and Resistance in Feedback During Work-Integrated Learning: Contesting Traditional Student-Supervisor Asymmetries.” Assessment & Evaluation in Higher Education 45 (8): 1136–1154. https://doi.org/10.1080/02602938.2019.1704682.
  • Ryan, T., and M. Henderson. 2018. “Feeling Feedback: Students’ Emotional Responses to Educator Feedback.” Assessment & Evaluation in Higher Education 43 (6): 880–892. https://doi.org/10.1080/02602938.2017.1416456.
  • Sambell, K. 2016. “Assessment and Feedback in Higher Education: Considerable Room for Improvement?” Student Engagement in Higher Education 1 (1): 1–14.
  • Small, F., and K. Attree. 2016. “Undergraduate Student Responses to Feedback: Expectations and Experiences.” Studies in Higher Education 41 (11): 2078–2094. https://doi.org/10.1080/03075079.2015.1007944.
  • Smith, B., and K. R. McGannon. 2018. “Developing Rigor in Qualitative Research: Problems and Opportunities within Sport and Exercise Psychology.” International Review of Sport and Exercise Psychology 11 (1): 101–121. https://doi.org/10.1080/1750984X.2017.1317357.
  • Tai, J., P. Dawson, M. Bearman, and R. Ajjawi. 2019. “Beware the Simple Impact Measure: Learning from the Parallels with Student Engagement.” In The Impact of Feedback in Higher Education: Improving Assessment Outcomes for Learners, edited by M. Henderson, R. Ajjawi, D. Boud, and E. Molloy, 37–50. Palgrave Macmillan. https://doi.org/10.1007/978-3-030-25112-3_3.
  • Van der Kleij, F. M., and A. A. Lipnevich. 2021. “Student Perceptions of Assessment Feedback: A Critical Scoping Review and Call for Research.” Educational Assessment, Evaluation and Accountability 33 (2): 345–373. https://doi.org/10.1007/s11092-020-09331-x.
  • Watermeyer, R., T. Crick, C. Knight, and J. Goodall. 2021. “COVID-19 and Digital Disruption in UK Universities: Afflictions and Affordances of Emergency Online Migration.” Higher Education 81 (3): 623–641. https://doi.org/10.1007/s10734-020-00561-y.
  • Winstone, N. E., K. Balloo, and D. Carless. 2022. “Discipline-Specific Feedback Literacies: A Framework for Curriculum Design.” Higher Education 83 (1): 57–77. https://doi.org/10.1007/s10734-020-00632-0.
  • Winstone, N. E., and D. Boud. 2022. “The Need to Disentangle Assessment and Feedback in Higher Education.” Studies in Higher Education 47 (3): 656–667. https://doi.org/10.1080/03075079.2020.1779687.
  • Winstone, N. E., R. A. Nash, M. Parker, and J. Rowntree. 2017. “Supporting learners’ Agentic Engagement with Feedback: A Systematic Review and a Taxonomy of Recipience Processes.” Educational Psychologist 52 (1): 17–37. https://doi.org/10.1080/00461520.2016.1207538.
  • Winstone, N. E., R. A. Nash, J. Rowntree, and R. Menezes. 2016. “What Do Students Want Most from Written Feedback Information? Distinguishing Necessities from Luxuries Using a Budgeting Methodology.” Assessment & Evaluation in Higher Education 41 (8): 1237–1253. https://doi.org/10.1080/02602938.2015.1075956.