No, this is not about some dyslexic version of the BBC's brilliant French farce of (almost) the same name, but the tale is Gallic in one respect.
The term Ahelo is the acronym for 'Assessment of Higher Education Learning Outcomes', which is the Paris-based OECD's attempt to extend its PISA tentacles into higher education. The OECD has been quietly developing a way to test students across different institutions and countries, with the aim of discovering how much they have learned. Their expectation is that this will lead (of course) to more international league tables. Joy unbounded. You can read more about this on the Diverse blog, and there's a UK perspective on the Guardian.
As sensible commentators in the UK have noted, this is a deranged idea. The Guardian article quotes Alison Wolf:
“It is basically impossible to create complex language-based test items which are comparable in difficulty when translated into a whole lot of different languages. And that is before you even start on whether a given set of items can possibly be equally appropriate regardless of the subject studied or the very different nature of higher education courses in different countries, or the level of similarity between OECD question formats and those used for assessment in the system concerned.”
Wolf says that an OECD-style multi-country test is not the answer, but she adds:
“I do, however, think that the question of how much people actually learn on degree courses is a major one, long overdue for serious attention.”
Such comments are no doubt music to the ears of those in HEA / QAA / HEFCE / Offa / BIS / Cabinet Office / etc who always want more opportunities to hold universities to account – and don't be fooled; this is as much about the effectiveness of institutions as it is about student learning.
As such, it's no surprise that there are new efforts to explore 'learning gain', with HEFCE about to announce around a dozen pilot projects to look at ways to measure the skills and knowledge students develop in higher education. This is what schools know as 'added value'. The Guardian says that these projects will
"... range from surveying students at the beginning of their courses and then in years two and three to test how ready they are for employment, to asking them at the beginning and then again at the end of their university studies to write essays to test their ability to analyse, synthesise and think critically."
Clearly, students will be queuing down the street and round the corner to write these essays.
"To test how ready they are for employment" is the key idea here. Given that no one has the faintest idea what this means, it's going to be a tough ask. Employers hire potential, trusting themselves to do the rest. Possession of a (good) degree is part of that potential in a new graduate, and what an added-value score will add to this is not clear. I think it's a nonsense, and shall keep well away from it.
What (the prestigious) institutions are concerned about, of course, is that there may not be much correlation between 'learning gain' and 'student satisfaction' scores. Now, wouldn't that be awkward ...