Leader magazine - ASCL (Association of School and College Leaders)

Screen saver


There has been much talk recently about reforming the exam system but so far little action. However, a quiet evolution is starting with the introduction of the first on-screen tests. Julie Nightingale looks at the implications for schools and gets reaction from those who have piloted the new system.

For some time, exam boards have been experimenting with computer-based rather than written exams - now they are ready to unleash them in schools.

From 2008 the new key stage 3 test in ICT will be available only on computer. Marking and administration will be automated. If the testing proves successful, this could signal the beginning of a radical shift in the way assessment is carried out.

The Qualifications and Curriculum Authority (QCA) began piloting the on-screen tests in 2004. Many schools have taken part and around 2,500 are already approved test centres.

"Next year we hope to have 100 per cent participation," says Sue Walton, who is directing the project for the QCA.

It is more than a hope: the QCA is telling schools to sign up for the 2007 pilot or risk floundering when the tests go live in two years' time. "Students' results could be affected if people don't do anything before 2008," Sue warns.

Getting to grips with the technology is the first issue. Schools need to make sure that their networks can cope with the testing software, which is installed from a CD. (It is a common misconception that the tests are online. While pupils use an email system within the test, there is no live internet component.)

Students have to familiarise themselves with the interface and the software, neither of which is standard; both will differ from the packages they may be accustomed to using in class or at home.

Besides the technological adjustments, on-screen testing presents planning and management challenges.

Sue says: "We are talking about a big culture change. Paper testing is very easy and clear and you can almost get anyone to do it. But computer-based is a bit different.

"Who is the best person in school to get involved? Is the assistant head, the head, the exams officer or is someone else the best person to oversee it?

"We can give guidance but in the end schools need to make these decisions themselves. Until schools engage with the project it's quite difficult for them to understand the issues."

Just-in-time testing

In the long term, e-testing could change more than how students are examined. The tests don't require students to sit in halls - they can be done on computer in a classroom and there are no test papers to smuggle out.

The logical conclusion is that tests don't all need to be taken at the same time, which brings assessment one step closer to just-in-time testing - where a student sits an exam at the point he or she is sufficiently proficient in the subject, rather than at a fixed point in the school year.

In the long term, this would have huge ramifications for schools including, potentially, a reorganisation of the whole timetable to accommodate children learning at different paces.

For the moment, schools have a four-week window in which to carry out both the ICT practice tests and the tests proper.

The tests themselves take up two 50-minute sessions. Marking is done by the software: as students work through the test, everything they do is captured by the computer, which checks whether they are demonstrating the skills and knowledge required by the national curriculum.

As part of the pilot, schools receive, in addition to the results, a formative report for each student at the end of the second session, which gives an in-depth picture of their performance and can therefore be used to inform teaching.

Sue says: "The report describes what the students have been asked to do and what they actually did and also what they need to do to improve. So if all the students have problems with data handling, the teachers can see that's something to concentrate on."

A report on the 2005 pilot identified some key issues with the implementation, including teacher confusion over preparatory assessments and pupil difficulties with timings and, in some cases, even with the basic email component of the test. The results were also lower than teachers had predicted.

Around the curve

Chesham Park Community School in Chesham, Buckinghamshire, took part in the 2005 pilot and was pleased on the whole, though it is a "big learning curve", says Margaret Gingell, the network manager and ICT coordinator.

A significant fear for her was whether on-screen testing would lend itself to on-screen cheating.

Margaret says: "I was concerned about this at first as we have desks in clusters of six. But in the test, students get different activities to do at different points. You could have two students sitting side by side doing different parts of the test, so they can't copy from each other."

One problem the school did have was with timings.

"We tried to fit the tests into the lessons which are 50 minutes long - the same as the tests - but it was too tight if there are any technological problems or other set-up problems. In future, we will schedule it like any other exam, otherwise it's quite stressful."

Technical help - both in setting up the system and software and on test days - is essential, as is making sure that students are familiar and comfortable with the non-standard software and interface. The practice tests supplied by QCA can help with this.

The exam puts more pressure on teachers' initial assessments. In 2005, the tests assessed levels 3 to 6 and there were two tiers: levels 3 to 5 and levels 4 to 6. Prior to the tests, teachers carried out assessments gauging the level they expected each student to achieve, and these determined which tier the student was entered for.

It is essential that the assessments are accurate, says Margaret, as the automated test will only deliver a grade either side of the teacher's estimate.

"It means that if you were to put them in for level 6 and they score a level 4, it's possible they could receive 'N' - that is, not be awarded a level."

But overall, she says, the tests are proving helpful in revealing the depth of a student's learning.

"The online test looks at how they are going about every stage of the task. If they did it on paper, you would only have got the end result. But here you know how they arrived at the formulae and whether they used the software, for example, because it is tracking and capturing everything they do."

It's also easier to pinpoint which concepts and ideas are making the students stop and which tasks they are not getting through, so the test is informing future teaching, she adds.

Timing it right

Hreod Parkway School in Swindon also wrestled with difficulties over timing. Graham Offler, director of ICT, says: "I think you need to allow around two hours to do it. If all the pupils log on at the same time, the system can't handle it and there's a one- or two-minute delay. We allowed a double lesson for each test."

Any hold-ups, technical or otherwise, during the test do not necessarily mean that the students lose time, he points out. Once a pupil presses the 'teacher help' button, their screen freezes and the clock stops.

It was fairly straightforward for students to transfer their skills on standard software to the bespoke system, he says. "The package has some similarities with Microsoft but it is different. We gave students five or six hours to get used to the software as part of ICT lessons."

Pupil reaction has been good on the whole, he says.

"The worst situation has been a couple of pupils who have got fed up and turned off their machine - which is the equivalent of sitting in an exam room and just writing your name on the paper. But others have been very enthusiastic. They like trying to beat the clock."

Senior management team involvement in planning is crucial, he adds, as time needs to be allowed in the timetable for staff training, for practice tests and for giving pupils sufficient time to sit the tests themselves.

Stuart Morris is assistant head at The Gilberd School in Colchester and a member of the QCA's focus group for the project.

He says the test has not involved a huge administrative burden for the school, although there is a sheaf of documentation to read and rules to be observed.

"You also need a test administrator, an invigilator and others, such as a network manager involved, which is more than you would have for other tests," he says.

"There's also a problem getting enough students through in a small window to do two tests and have time to practice. We have had up to 270 students go through in some years."

He also finds the security constraints unnecessarily rigorous for the pilot - the QCA uses unique pupil numbers (UPNs) where issuing students with individual passwords might have sufficed. Transferring student data from the PLASC school census files to the test programme was also complex and time-consuming, in part because of compatibility issues between the databases.

Another concern for Stuart is that the tests are proving less effective at measuring more able students' skills. However, he says: "The test is testing their intelligence and thinking skills rather than their ICT skills, though it does test their ICT capability. As a result, we are looking at doing more task-based tests and more independent learning work with the students."

Teachers preparing for the pilot should ensure not only that they are familiar with the test themselves but also that they teach to the style of working it requires, so the children are prepared, he adds.

While this might sound suspiciously like 'teaching to the test', as Sue Walton says, computer-based testing requires a new way of thinking and a willingness to engage with the technology. In this respect, students may already be one step ahead.

Julie Nightingale is a freelance journalist who writes regularly on ICT in education for The Guardian.

For more information on the key stage 3 ICT tests, see www.qca.org.uk/7280.html
