Work programme 2009-2012
Mini-Seminar on Transparency Tools
One of the specific tasks of the 2009-2012 BFUG Working Group on Multidimensional Transparency Tools - Towards a transparent landscape of European Higher Education was “to organize a seminar on transparency tools open for the participation of the BFUG”. The mini-seminar aimed at sharing with the BFUG members the specific knowledge gathered through the Transparency Tools WG discussions. It was also a good opportunity to collect participants’ views on the issues that had been intensely debated within the WG.
The tools developed within the Bologna Process (ECTS, the Diploma Supplement, quality assurance, qualifications frameworks and learning outcomes) are complex transparency tools whose major merit is that they enable understanding of the learning experience. They coherently describe where one currently stands in higher education, where one can go, and how to get there. They help identify strengths, rather than compare the merits of alternative study choices.
In order to fulfill their transparency function, the Bologna Process tools need to rely on each other. This implies that the Bologna Process cannot be approached à la carte, and also that further effort is needed to make the Bologna Process infrastructure understandable to a less initiated public. It is hard to imagine that an average prospective student and their family have the detailed knowledge of the Bologna Process tools necessary to properly understand all the merits and risks associated with a qualification presented to them. Their task may become even harder if qualifications are marketed, rather than described for the purpose of information provision.
The Bologna Process tools face competition mainly from newspapers that offer easy-to-use tools, such as rankings. Bologna tools can be criticized for not properly addressing the need for information on employability, student support, student/staff ratios or other aspects of the quality of the learning experience. Rankings’ appeal to the public demonstrates that there is interest in such tools, and there are currently no indications that the public’s propensity for rankings will decrease. In this context, one key challenge for Bologna tools is to increase their understandability while maintaining their comprehensiveness.
Classifications and rankings enable straightforward comparisons between study alternatives and claim objectivity through their methods. Classifications cluster HEIs in groups of equivalence, while rankings traditionally order HEIs, departments or study programmes hierarchically. Innovations in classifications consist of portraying the profiles of HEIs based on a predetermined set of dimensions, allowing users to create their own grouping criteria; multidimensional rankings allow users to compare the performance of HEIs on dimensions they choose from a predetermined set, or to visualize the performance profile of HEIs. U-Multirank, as an example of a multidimensional ranking, fares better than traditional rankings in terms of being user-driven. It includes innovative indicators, especially under the regional engagement dimension. It also makes use of U-Map, a multidimensional classification tool, in order to determine which institutions are comparable, based on their activity profiles.
Criticism of rankings consists of many claims, of which some were highlighted:
- The choice of indicators, dimensions and aggregation weights is not necessarily relevant for users, but rather determined by measurement technologies, existing data and rankers’ preferences;
- They carry biases related to factors such as discipline, language, size of institution, and postgraduate and research intensity;
- There are important aspects of higher education that cannot be measured and quantified.
The main concern with classifications lies mostly with the users: some misperceive descriptive classifications as hierarchical, due to certain “public stereotypes”. Probably the most widespread “public stereotype” is that research universities are better than those focused on learning and teaching, rather than simply different.
Another recurring concern associated with classifications and rankings is that they are not always used for the purpose for which they were intended. They sometimes turn from information provision tools into funds distribution tools. In such cases, perverse incentives to score high are created, and the adequacy of the tool for transparency purposes decreases.
In conclusion, the perfect transparency tool does not exist, but it is important to make available to the public some tools that have only an information function, and equally important to communicate these tools properly. Transparency is created by using a mix of transparency tools, combining different approaches. In order to make better use of existing data, the Bologna Process tools should improve the information they provide on employability, student support, student/staff ratios and other aspects of the learning experience, while classifications and rankings should incorporate the information from the tools associated with the Bologna Process.
WG Activities
- Working Group on Transparency Tools 2009-2012
- Meeting 1 Brussels 30 November 2009
- Meeting 2 Brussels 11 October 2010
- Meeting 3 Brussels 18 April 2011
- Workshop Brussels 9 June 2011
- Mini-Seminar Kraków 12 October 2011
- Meeting 4 Brussels 15 November 2011
- Meeting 5 Brussels 10 January 2012
RELATED DOCUMENTS
- WG Transparency Tools - Report mini-seminar 12.10.2011
- WG Transparency Tools - Draft agenda mini-seminar
- Mini Seminar Transparency tools - Introduction
- Mini Seminar Transparency tools - Thoughts on transparency
- Mini Seminar Transparency tools - Diversity policy
- Mini Seminar Transparency tools - Global university rankings and their impact
- Mini Seminar Transparency tools - U-Multirank on the feasibility of a new approach