I've been reading the report that UNESCO released last year: Shaping the Education of Tomorrow: Full-length Report on the UN Decade of Education for Sustainable Development. I briefly wrote about this at the time of its release, noting Arjen Wals' concerns about how UNESCO had treated the more critical aspects of his original text. In this recent reading, I was particularly interested in the data: their sources and their treatment.
Data are, of course, crucial, but it is difficult to ask questions about ESD across cultures. There are three main reasons for this:
- ESD's shaky conceptual grounding makes asking valid questions difficult
- Education systems, their socio-political contextualisation, and the language used to describe them, vary across the world
- The data collection process itself is not always as open and participative as it ought to be
As the best analyses are worthless if data are invalid, all this is a concern, and an early section of the report is devoted to this issue, showing the complexities and difficulties of the process. The main sources of data for the report were:
- A 2011 literature review
- An on-line Global Monitoring & Evaluation Survey [GMES]
- Surveys of 44 Key Informants [KIS]
- Learning-based case studies
- Internal review of contributions to ESD by the various UN agencies
- Reports on eight National ESD Journeys
- Reports from UNESCO ESD Chairs
Fig 1 [page 16] shows this in all its complex detail, and section 1.6 sets out a careful consideration of the limitations of the data collection. This is worth reading, and I have put it at the foot of the posting.
Appendix 1 [page 92] lists all the data sources, which illustrates their patchwork nature. In the UK, there was a response to the GMES (though by whom is not clear), and four organisations responded to the key informant survey [KIS]:
- Gaia Education
- Open to Create
- SEEd (Sustainability and Environmental Education)
- The Sustainable MBA
SEEd and Gaia are well known, but Open to Create and The Sustainable MBA less so (at least to me). Are these really the best sources for a picture of ESD across the UK, I wonder (if only to myself)? Obviously missing from this list is the UK National Commission for UNESCO, which knows a lot about ESD in the UK because it commissions reports on it, with a new one about to emerge. Its work, especially its 2010 report, ought to have been picked up by all these surveys but (oddly) it wasn't. Such a shame, as this report would have been much better informed about the richness of UK ESD, especially in relation to practice and policy across the UK's four jurisdictions.
Report Section 1.6: Limitations of the Global Monitoring and Evaluation Process
- The MEEG developed the GMEF to assess implementation of the Decade, but realized that this process would capture the changes occurring during the ten-year period marked by the DESD and not just initiatives developed under the label of the DESD. It is difficult to discern which processes and learning activities were developed specifically for the DESD and which gained or gathered momentum because of the DESD’s existence. It is, however, important to recognize the ESD processes and learning that have taken place or continue to take place.
- Global studies present challenges related to the distribution and collection of survey instruments. The involvement of local, regional and global NGOs in SD and ESD (considered key players in ESD) and of youth is under-reported, whereas much of the data come from UN-related sources, particularly UNESCO Headquarters and Field Offices.
- Using a universal template and questionnaire has advantages not only in creating uniformity in reporting, but also in making sure that all respondents report on the same ESD components and issues. From the data provided, however, it is clear that not all concepts included in the template were understood in the same manner, even though a glossary of key terms was provided. Even within the same country, organizations or officials have different understandings of concepts such as ‘problem-based learning’ or ‘multi-stakeholder engagement’.
- An interactive process sometimes took place involving multiple people who were knowledgeable about specific ESD areas, thereby strengthening the validity of those responses. However, there were also cases where the data entered in the surveys were not confirmed by multiple sources, so it could not be determined whether others also deemed the responses provided valid and supported them.
- Even though our intention was to have the designated country representatives manage the completion of the global survey by distributing it to key informants in all the specified ESD contexts, they often filled in the questionnaire themselves – thereby unwittingly privileging the ESD context they were most familiar with at the expense of the others. As a result, the global survey has limited value, as many participating countries did not turn in a separate survey for each of the ESD contexts. The GMES did include a question asking respondents to assess their own level of expertise in the ESD context on which they reported. Most respondents rated their level of expertise ‘high’ or ‘very high’.