
One score fits all?


A new framework for colleges - and, from 2010, for school sixth forms - combines different performance measures into a single rating. But can complex institutions be measured accurately in such simple fashion? Chris Tyler explains.

By now, the new performance measure for the learning and skills sector will be a reality for college leaders. The Framework for Excellence (FfE) has been extensively trialled and adapted for use by colleges and training providers.

And school leaders take note - it's on the way for sixth forms too. From September 2009, FfE will be piloted at a school near you.

FfE was designed to support FE self-regulation and was introduced in the March 2006 white paper as "a single, standard set of key performance indicators covering the broad themes of responsiveness, quality and resources." However, it is now seen as a way to judge and compare all providers of post-16 education and training using a common scale.

The idea behind FfE is to support demand-led funding by sharing details about each institution's performance with learners and employers. This should help them to choose the education/training that is right for them and help the sector to meet the challenging targets set in the 2006 Leitch Report.

FfE was first piloted in 2007 and is being rolled out in the FE sector over three years.

In 2008-09, the second pilot will include local authorities, specialist designated colleges, independent specialist colleges, personal development and community development learning providers, offender learning providers and HE institutions with FE provision.

After a third pilot from September 2009 with selected schools, all FE providers, including school sixth forms, will use the framework from September 2010.

Performance indicators

The framework assesses the performance of providers through a number of performance indicators (PIs). There are seven key performance areas (KPAs) and each college's or provider's performance is aggregated to produce grades within three dimensions: responsiveness, effectiveness and finance. From these, a single overall performance rating (OPR) of outstanding, good, satisfactory or inadequate is derived.

Each PI is judged against a four-point scale, according to its own assessment criteria. The provisional 2008-09 assessment criteria will be reviewed in spring 2009; the revised criteria will then apply until 2012.

The Learning and Skills Council (LSC) is trying to avoid additional work for providers, so the majority of judgements will be based on data already in the system. Full details of the three dimensions, seven KPAs and the PIs that constitute them are available in the Framework for Excellence: Provider Guide 2008-09.

The first dimension, responsiveness to learners, is based on data from an annual learner views survey which includes issues such as information, advice and guidance; quality of teaching and learning; and satisfaction with the level of support received. Learner destination data collected by the provider is also taken into account.

Providers involved in Train to Gain or apprenticeship training, or that generate over £30,000 of employer responsive funding annually, will also be judged on responsiveness to employers.

Sixth form colleges in the 2007-08 pilot expressed concern that their responsiveness grade was based on an LSC-designed questionnaire for learners and possibly scant learner destination information - yet this counts for a third of their eventual overall performance rating. This and other concerns have been taken on board by the framework's designers.

Judgements on the second dimension, effectiveness, are probably the least contentious area as they use familiar data. The quality of outcomes is judged through qualification success rates sourced from individualised learner record (ILR) data and, for A levels, through the Learner Achievement Tracker (LAT), amended to recognise value added. The most recent Ofsted inspection grade is used as the indicator for quality of provision.

The final dimension, finance, is based on three KPAs. Financial health is judged by current ratio (solvency); operating surplus or deficit as percentage of turnover (sustainability); and borrowing as a percentage of certain reserves and debt (status). Data for these judgements is taken from colleges' accounts.
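As a purely hypothetical illustration (the figures are invented for clarity, not drawn from FfE guidance): a college with current assets of £5 million against current liabilities of £4 million has a current ratio of 1.25; an operating surplus of £300,000 on a turnover of £15 million represents 2 per cent of turnover; and borrowing of £6 million against reserves of £12 million gives a borrowing ratio of 50 per cent. Each figure would then be set against the framework's published criteria to produce a grade on the four-point scale.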

In the financial management and control KPA, providers are required to self-assess and grade their arrangements annually.

The third finance KPA is capital. It considers condition - the current state of building stock - and renewal, which identifies the progress being made to renew building stock.

The finance dimension has caused some concern in the pilots: many colleges are borrowing substantial sums to improve their premises, and the effect this has on their financial status can reduce their rating. It is also clear that these criteria will not work for schools. The LSC has noted the concerns.

Reaction so far

Reactions to the 2007-08 pilot from the 100 colleges and providers involved have been mixed. The pilot evaluation from the LSC concludes that the FfE has been largely successful in providing a robust assessment tool that will support the sector in self-regulation and give useful comparative data.

However, some issues need to be clarified. One is exactly how the performance indicators are drawn up. Another is the quality of data used. The LSC has also recognised that there needs to be a level playing field between providers, taking into account elements outside their control and ensuring that colleges are not penalised for good practice - building improvements for instance.

Stakeholders have been consulted regularly. ASCL has members on the FfE Inspection and Regulation Stakeholders Group and is also contacted, formally and informally, by members of the FfE team.

Any system that provides reliable performance indicators which can help inform decisions about how education and training are provided is welcome.

However, ASCL is unhappy with the concept of a single grade to represent the performance of large and complex institutions and has expressed this view forcefully and frequently.

Providers, meanwhile, are concerned about a potential increase in bureaucracy and about the accuracy of the data being used to make judgements. The relationship between the FfE, Ofsted inspection and other quality measures may also be problematic.

Received wisdom is that in preparing annual self-assessment reports (SARs), colleges should incorporate the FfE measures, thus meeting both requirements without further paperwork. But colleges that have attempted to do this have had mixed results. For some, the design of their SAR makes it feasible; for others, it has meant considerable adjustment.

For schools, the design of the self-evaluation form (SEF) may need amending if they are to avoid extra form-filling once the FfE is in place. The current FfE format would also need tweaking to accommodate schools' different processes in finance, for example.

The future

At present, the LSC shares framework results and ratings with individual providers as soon as possible. Some 2007-08 scores will be published in May 2009 but the information will only include outputs from well-established measures with well-established national datasets.

The responsiveness dimension will only be published where data has been gathered by a proven method and is robust. Overall performance ratings will not be published but the information will be shared with individual colleges, local LSC teams, Ofsted, the Higher Education Funding Council for England (HEFCE) and the new Learning and Skills Improvement Service (LSIS) in spring 2009.

LSIS (formerly QIA) has said it will use the framework's data to target its support services and help colleges to improve their performance.

The LSC has been responsive to stakeholder feedback and it is important to note that the framework is still a work in progress. Many of the indicators above will not be relevant to schools and will need to be replaced by more meaningful ones.

The common performance management framework that will extend to school sixth forms in 2010 will be designed on the basis of the framework, rather than being a simple extension of what currently exists.

But if this is so, it raises a crucial question: how can a common framework, set up to allow a single grade comparator of provision in different parts of a widely varying post-16 sector, exist and deliver the desired outcomes?

If too many adjustments are necessary to make FfE work across the sector, resulting comparisons will not be valid.

Raising Expectations: Enabling the system to deliver (the Machinery of Government white paper of 2008) makes quite clear that future local commissioning will be based on analysis of local demand, underpinned by a strong performance management system. So it appears that it will be up to the providers to be aware of the importance of FfE in local authority decision-making and correct any inaccurate assumptions about their provision.

There is a long way to go yet and ASCL will continue to argue that a single grade for each college or school is too simplistic. But leaders of all post-16 providers can expect that their future will contain a Framework for Excellence in some form.

Chris Tyler is ASCL's colleges consultant and a former college principal.


Find out more...

Framework for Excellence Provider Guide (July 2008) and Framework for Excellence: Putting the Framework into Practice (June 2008): http://ffe.lsc.gov.uk/

LSC Framework for Excellence Pilot Evaluation 2008: http://ffe.lsc.gov.uk/publications


Case study

Carmel College took part in the first Framework for Excellence pilot. Nick Burnham shares his verdict on the process and the outcomes.

When I reported to Carmel College governors on our experience of the Framework for Excellence pilot, I suggested that overall it had not been too burdensome from a bureaucratic point of view.

Very quickly the finance director informed me I should speak for myself! This is one illustration of how the framework has had a varied impact within and between colleges.

From our perspective the framework came to the right conclusion. The overall performance rating (OPR) was 'outstanding' and agreed with our recent Ofsted inspection. However, we did not agree with some of the judgements made in the key performance areas and felt that the advice/instruction to change our quality assurance systems was plainly wrong.

A key issue is the use of the indicators to arrive at judgements on very different institutions. Using the same criteria to judge sixth form colleges, general FE colleges and specialist institutions led, in our view, to erroneous judgements in aspects of the dimensions.

Regarding the effectiveness dimension, the judgements made about the college were clear and in our view correct. We had been inspected the previous academic year and therefore the Ofsted report was current and appropriate. Clearly over time this may change.

The responsiveness dimension raised some concerns. For the student population, we opted for a paper-based survey and, while this was an administrative burden, it did not prove a problem.

The timing did, however, as we were asked to complete the survey in November. The lower sixth in particular were not really in a position to answer many of the questions and, while the survey is now required in January and February, it is still not the ideal time to conduct a survey of sixth form college students.

The judgements made on the results raised a real concern. Apparently there is a strong correlation between the level of course that learners are on and their likelihood of responding negatively to questionnaires - the higher the level, the less positive the response.

We were disappointed with the judgement made on our students' views and we know that other similar sixth form colleges shared our opinion. This contextualisation needs more refinement.

The scope and methodology of the destinations survey, arranged by the LSC, were impressive and did not really impact on the college at all. The LSC compared our ILR data with HE and FE records to match progression and telephoned any students it could not match.

A few students rang to ask why we had given their details to other agencies, though we had sought their permission at enrolment. We felt the judgement made on this aspect of the framework was appropriate. As a sixth form college we were exempt from the employers' views survey.

Finance dimension

Again we had concerns about the finance judgements. It was suggested during the process that our self-assessment report (SAR) did not have the appropriate detail. This surprised us as our SAR process had been cited by Ofsted as an example of good practice. Unlike some colleges, we have not changed our self-assessment process, though we will incorporate the framework into our SAR.

This dimension is problematic, with issues around the treatment of surpluses and the impact of a significant building project on college financial health.

The results of the framework are due to be shared with local authorities and the LSC. There are some real concerns that need to be addressed before the very different institutions included in its scope can be confident that it provides an accurate picture of their performance.

Nick Burnham is vice principal of Carmel College, a sixth form college in Merseyside.
