Leader magazine
ASCL - Association of School and College Leaders

In a league of their own


John Dunford urges schools and colleges to create their own set of data that measures how well they are meeting their own aims, not how well they perform against government targets.

There are, as has often been said, lies, damned lies and statistics, but education - and, in particular, league tables - has brought the black art of statistics to a new level.

The amount of data available to schools and colleges is enormous - contextualised value added (CVA), Fischer Family Trust, A level Performance System (ALPS), Learner Achievement Trackers (LAT), A level Information Systems (ALIS), Jesson value added used by the Specialist Schools and Academies Trust, and the rest of the Durham University suite of age-related data, to name but a few.

The question facing school and college leaders, therefore, is as much what to use as how to use it. In one sense there is no choice - the data used by the government for accountability has to be top of the list. But Ofsted says that it will take the school/college's own data into account, so that can certainly be used in the self-evaluation programme.

When considering the use of the data, it all depends on the audience. Data that is appropriate for internal self-evaluation purposes - say, for annual reviews with heads of department - is certainly not in the right form for telling parents how the school is doing.

Other audiences for data include the senior leadership team, middle managers, the whole staff, the governing body, parents, the local media, the school improvement partner and Ofsted; the needs of each group have to be considered.

Most secondary schools and colleges now have a data manager, who may be a member of the senior leadership team. Some institutions will have a more junior data manager, reporting to a senior leader or directly to the principal.

Remain in control

Sometimes it may feel as if data is taking over the world, or at least your part of it. That's why it is really important to remain in control of the data and why it is vital to have a first-class data manager. Otherwise it will feel as if the data is taking control of you and steering your agenda. A good principle to work to is that data does not give the answers; it simply asks the questions.

One way to overcome the problem of controlling the data is to have, in addition to the demands imposed from elsewhere, your own data set. School and college leaders have a great deal of autonomy over the direction of the institution. Through you the ethos of the school/college develops and all members of the institutional community know why they are there and what they are aiming for.

When I was a head, after several years at the school I revisited its aims to see if we really were fulfilling them. The honest answer was that, for most of the aims, I didn't know how well we were doing. I had no evidence one way or the other.

So I developed a set of performance measures specifically designed to monitor how well the school was doing in pursuit of its aims.

The first aim, 'the school should make the maximum range of opportunities for all students' (I cannot recall the exact wording), could be monitored by recording the number of students taking part in extra-curricular activities and/or the number taking part in visits and exchanges each year.

Another aim, 'the school should promote the professional development of all staff,' could be gauged by the extent of staff participation in external courses, but that would only measure attendance, not its effect or outcomes. The measure that I used for this was the number of staff gaining promoted posts, in the school itself and elsewhere.

It may seem bizarre to measure one's performance by the number of staff who leave, but this was the number who left to gain promotion. So it was a measure of the extent to which the school was helping people towards the next stage of their career. It meant that the school lost good staff, but it also spelt out the fact that this was a school that cared about its staff and helped them in their careers.

Many of the staff then at Durham Johnston Comprehensive School are now on school leadership teams and six of them are headteachers. It was, I like to think, a performance indicator that worked. The best schools in this respect are factories of professional development.

The underlying point here is that the school itself was in control of its own performance indicators and thus of its own agenda. True accountability (as ASCL wrote in 2003 in the first paper that we produced on intelligent accountability) lies not in answerability to external agencies or performance measures imposed by outside bodies, but in the professional accountability that is driven by ourselves as leaders and educationists.

Qualitative data

Data does not have to be a set of numbers. Qualitative data can be as illuminating as quantitative measures. Nowhere is this more important than in gathering the views of parents and students. Many ASCL members use Kirkland Rowell, an ASCL premier partner, or similar companies to carry out the surveys for them and produce results in a form that enables school leaders to benchmark performance against that of other similar schools. This is critically important information for school self-improvement and for the self-evaluation process that feeds into the SEF.

I have recently had an exchange of letters with the schools minister, Vernon Coaker, in which I pointed out the current confusion about parents' surveys. The government's July white paper and the Ofsted framework both state that schools should carry out surveys of parent and student views and use this information in their self-evaluation.

In addition, the Ofsted process includes a survey of parents to be carried out in the short time between notification of an inspection and the arrival of the inspectors. Ofsted has also stated that it intends to carry out an annual survey of parents to inform the decision about which schools are to be inspected in the following year.

This is overkill and I have asked the government and Ofsted to slim down their plans and to ensure that parent surveys remain in the ownership of the school as part of the increasingly successful and rigorous self-evaluation now carried out annually in the vast majority of schools.

School report card

The school report card will sharpen the data debate. Even if you are not in one of the 120 schools piloting the report card, it is worth looking at the data that is proposed to go into it.

At one level, ASCL welcomes the notion of a report card, which has the capacity to give credit for more than just exam results. The Framework for Excellence (FfE) is already moving colleges in this direction. However, as with the FfE, we have grave fears about what the report card may turn into.

The ASCL contribution to the early debate on the report card was to articulate ten principles on which it should be based, the most important of which is that "it should be equally possible for a good school serving a disadvantaged community and a good school serving a less disadvantaged area to obtain a high score."

If the report card is to be dropped into the overcrowded waters of the accountability pool, it is important to relate it to the other accountability measures already there - school self-evaluation, Ofsted inspection, league tables and the whole panoply of targets and agencies putting pressure on schools.

Most important of all, the report card - if it is ever to see the light of day, and that is by no means certain if there is a change of government - must use a data set that gives a fair account of school performance.

Never forget that data helps us to ask the right questions, but it does not make the judgements for us, whether we are school or college leaders, Ofsted inspectors or the person designing the report card.


© 2017 Association of School and College Leaders