Hear, hear: What the CYI has learned from the Listening Fund evaluation so far and how we can all improve our practice

The Centre for Youth Impact is evaluating the work of the 22 partners supported through The Listening Fund (TLF). This blog by Matthew Hill, Head of Research and Learning at the Centre for Youth Impact, summarises the key learning from the initial organisational listening self-assessment and the project application forms. More detail can be found in the full summary report.

Partners are already doing lots of different forms of listening

As we know, there is already a huge amount of experience and expertise in engaging youth voice across the sector – often distinguishing non-formal provision from other settings. As such, it came as no surprise that TLF partners are already doing many different forms of listening: almost all (90%) are undertaking ongoing listening within practice, and four fifths (81%) are both running focus groups and collecting listening case studies of individual young people. Partners also felt that, to a large extent, they refer to listening in their organisational strategy and actively create opportunities for listening to young people within their work. Encouragingly, almost three-quarters (72%) of organisations felt that they act on what they have heard to a large extent.

How listening is framed

The top ‘use’ for listening by TLF partners was to give young people an opportunity to express themselves, followed by using listening to evaluate services and to influence their development. Still high, but slightly lower, was undertaking listening because young people have a democratic right to shape services.

The final question on the self-assessment asked partners to position themselves on Hart’s Ladder of Youth Participation (which outlines eight levels of youth participation in projects). Although no partners rated themselves on the bottom two rungs, there was wide variance in positioning. It will be fascinating to see how this changes over the two years of the fund.

Despite this existing good practice, partners recognise considerable scope for development in organisational listening across a number of areas, many of which are also relevant to the youth sector more widely.

A more systematic and consistent approach

Although listening is seen as a crucial part of ongoing practice, partners felt that the Listening Fund grant would allow their listening to be “set out in a more formalised process” and be “carried out more strategically across the organisation”. A crucial driver of this was the perceived variance in listening practice within organisations, with almost half (47%) feeling it varied to a large extent.

Doing justice to the data collected through listening

Partners generally rated their skills and practice in data analysis (both quantitative and qualitative) lower than in data collection. This echoes wider findings from our evaluation of the Youth Investment Fund (funded by DCMS/BLF and led by NPC), where organisations told us that although their data collection has advanced considerably in recent years, they often don’t do the data justice in terms of really analysing and understanding what it means.

Communicating about what you are NOT doing as a result

Partners generally scored themselves highly for communicating back to those who had engaged in their listening process, but lower for communicating with those who had not engaged (including explaining why they may not have acted on what they heard).

Feedback on feedback

Some of the lowest scores related to partners actually evaluating their own listening activity. Some see this as a ‘higher order’ element of listening, but it is also a fundamental principle: if we are asking young people for feedback, we should also give them an opportunity to feed back on whether they feel their voice is being heard. This type of analysis can also reveal whether our listening activity suffers from bias in terms of the groups engaged.

Reflections on the self-assessment itself

The overall feedback was positive, with partners generally finding the self-assessment valid and, crucially, seeing it as providing useful insights for improvement. Two main issues were raised. Firstly, some felt that it was less applicable to their focus on co-production (which they saw as going well beyond mere listening). Secondly, as an organisational self-assessment, the tool was seen as limited in its ability to capture variance within organisations.

We believe this is the first self-assessment of its kind (at least to be made publicly available) and so we are very open to critical (and uncritical!) feedback. You can download the self-assessment tool here – and you can find a whole host of other information and tools under the ‘Resources’ section of this website. If you would like more information on the evaluation please contact matthew.hill@youthimpact.uk.