New PIs wanted by Funding Councils

Posted in: Comment, News and Updates

The UK funding councils are consulting on HE performance indicators.  Hefce says that the indicators currently cover the following data:

  • widening participation indicators
  • non-continuation rates (including projected outcomes)
  • module completion rates
  • research output
  • employment of graduates.

... with HESA providing the detail.  Of course, data already exist for all of these, and this is an important criterion for inclusion.

Should the PIs change?  Can they sensibly be added to?  Well, ...

1. student satisfaction is clearly a missing component here and may well be the best proxy for 'teaching quality' in a conventional sense.  Data are readily available.

2. currently, it's all rather UG-focused, and M- and D-level completion rates (to timetable) seem not to be included.  Perhaps they should be.  Data are also readily available.

3. what the Americans call "outreach", i.e. all those community-focused (in a wide sense) initiatives, is not included.  It's hard to get a numerical fix on, though.

And then there's sustainability.  An indicator might be:

'institution orientated to address sustainability' or 'institutional focus on sustainability'

... but where are the readily available, valid, reliable data?  People & Planet will hardly do; nor will LiFE, though for different reasons.  Neither will anything (yet) from the NUS/HEA surveys.  I'd say that proposing an area of focus to them without a source of data will not be successful.  Of course, it does all come back to that validity question: what is to count as an institutional focus on sustainability?  There you have it.  Maybe EAUC is working on that ...

The SHED network was invited to share views.  So far, these are the only comments:

There is scope for an Environmental Impact measure, but it should draw upon a basket of current measurements, for example the carbon data you already have to submit, so it would be a measure of measures. I would be worried about adding a different measure, with new methodology and related data-collection issues (we are close to spending more time collecting data than trying to alter the realities of things). I think holistic ESD issues are best measured through the QAA processes, and getting the ESD element of that strengthened is essential.

Chris Willmore; Bristol


I do see the point that we should be focusing on outcomes rather than inputs. I also agree that we should avoid too much data gathering, particularly if it is interpreted in an accountability context. Perhaps I also agree that there is a difference between the needs of the QAA and the nature of the performance indicators addressed here, but I would probably question this distinction to some extent. Just because the QAA is currently focused in one area and performance indicators are used for another purpose does not make this the best possible way forward. Some may have interesting perspectives on this that could be helpful to this consultation process.

But I also question the nature of the current data collecting that we do, and I have sometimes been known to suggest that if we devoted a fraction of the time, money and effort that we spend asking our students what they think of their teachers and their teaching towards evaluating our impact on the sustainability attributes of our students (and indeed on their global perspectives and cultural sensitivities), we might be in a better position to address some of the problems that we encounter as we attempt to educate for sustainability, or for sustainable development, or to environmentally educate. Not necessarily more data collection, but data with a different focus. Note that some Australian universities receive funding on the basis of student feedback on their learning experiences (check it out here… and perhaps take this as an indicator of what might come your way if you let it … http://www.innovation.gov.au/HigherEducation/Policy/Documents/AdministrativeGuidelines.docx ... )

Personally, I see performance indicators as useful devices to help us understand whether we are actually on the way to achieving some of the things that we want to achieve. Even more important will be the process of designing performance indicators, as that process will help us understand the nature of our objectives. Institutions that have committed themselves to the Talloires Declaration, or that have incorporated sustainability or global citizenship into lists of graduate attributes or institutional objectives, could consider whether performance indicators in this area would be a valuable evaluative instrument to help them understand their impacts. And note that I am not being prescriptive about what we should be doing, only suggesting that we should be interested in monitoring what we say we do. And if I cannot encourage this discussion in Shed-share, where else?

Perhaps I might go further and suggest that I am not particularly interested in the carbon data submitted by universities, any more than in that submitted by any large business.  Acting as responsible corporate entities is what we all should do. Higher education has more challenging roles, and performance indicators should be designed accordingly. (And yes, I do get that HEIs act as role models for our students, but the outcomes that we may be most interested in could be our students’ perspectives, not carbon data.)

Kerry Shephard; Otago

