Watching you from the bridge
AT THE far end of an open-plan office on the fourth floor of the Department for Children, Schools and Families sits a collection of display stands and a very large television set.
It may not seem much, but according to Ralph Tabberer, director-general responsible for England’s schools, this ordinary-looking section of office space represents the future of public policy-making. Only time will tell if he is right, but one thing is certain: if you want to understand how the DCSF operates today, then you need to know what happens on “the bridge”.
Introduced late last year, it is the basis of a data-intensive approach to government, designed to make schools policy more evidence-based and less one-size-fits-all, and to bring departmental officials more in touch with each other and with what is happening on the front line.
“The bridge” conjures up a hi-tech vision of a Star Trek-style control deck. But it is only when the TV (or rather, the oversized computer monitor) is switched on that the visual reality comes anywhere near living up to its futuristic title.
The screen reveals a grid of tiny multi-coloured squares, known by some as “the matrix”. Mr Tabberer has dubbed it “Hotspots”. Closer inspection reveals that the squares actually relate to about 15 performance indicators, covering everything from exam results to truancy levels. Details for each of England’s 150 or so local authorities are given on colour-coded indicators: red, amber or green, according to the progress being made, or light blue, mauve or purple according to the level of risk.
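The department's actual thresholds and data model are not public, but the two colour scales described above can be pictured as a pair of simple lookup functions. Everything here (the threshold values, the field meanings) is an invented illustration, not the DCSF's real logic:

```python
# Illustrative sketch only: maps indicator values to the two colour
# scales the article describes. Thresholds are assumptions.

def progress_colour(change: float) -> str:
    """Rate year-on-year change on a red/amber/green scale (hypothetical cut-offs)."""
    if change < 0:
        return "red"
    if change < 2.0:  # assumed threshold, in percentage points
        return "amber"
    return "green"

def risk_colour(risk_score: float) -> str:
    """Map a 0-1 risk score to the article's three risk colours."""
    if risk_score >= 0.66:
        return "light blue"  # high risk
    if risk_score >= 0.33:
        return "mauve"       # medium risk
    return "purple"          # low risk
```

A grid of roughly 15 indicators by 150 authorities would then be about 2,250 such colour-coded cells on the screen.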
Hotspots is used to brief Jim Knight, schools minister, for Parliamentary questions. It is where Lord Adonis, the department’s other schools minister, comes for weekly progress reports on the national strategies. It is also the starting point for a lot of schools policy.
“This allows us to identify where strengths and weaknesses actually lie and to break down the nature of a problem more carefully,” explains Mr Tabberer. “It allows us to understand the difference between what is happening in Barnsley and what is happening in Barnet.”
He stresses that all the information on Hotspots is already in the public domain. And there is nothing particularly new about the technology, either. It is essentially a collection of very large and complex Excel databases.
What is remarkable is the sheer amount of previously disparate information, available for instant comparison and analysis, stored on the one system. Data at local authority level is only the tip of the iceberg. A click of a button takes the viewer to detailed figures on a huge range of indicators for each of England’s 22,918 state schools.
Another click moves you down a level to individualised data on each of England’s 8.2 million pupils. This facility is rarely used, and schools are usually consulted first if it is, but the data is there, on tap, if needed. Officials can also gain access to satellite photographs of the housing that surrounds a particular school, so they can see what sort of area it serves.
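The click-through described here is a three-level drill-down: local authority, then school, then pupil. A minimal sketch of that hierarchy, with invented names and figures standing in for the real records:

```python
# Hypothetical sketch of the drill-down hierarchy the article describes:
# local authority -> school -> pupil. All names and values are invented.

authorities = {
    "Barnsley": {
        "Example Primary": {
            "pupil_0001": {"ks2_maths_level": 4},
        },
    },
}

def drill_down(la, school=None, pupil=None):
    """Return the data at the requested level of the hierarchy."""
    node = authorities[la]
    if school is not None:
        node = node[school]
        if pupil is not None:
            node = node[pupil]
    return node
```

Each "click" in the interface corresponds to supplying one more key, narrowing the view from an authority's schools down to a single pupil's record.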
But local authorities are the most common unit of analysis. Their performance is discussed at formal monthly meetings between senior officials and ministers, held in front of the Hotspots screen.
“We will often pick out authorities where we have concerns and look at them in depth,” said Mr Tabberer. “Where maybe they have got a few reds showing or where they have no greens.
“We will talk about the authority, what we know about it politically, what we know about the performance of its schools and what we know about the area. We will talk about how much confidence we might have in the team there and what we know about them.”
Hotspots can also be used for a quick test of whether the explanation an authority offers for its schools’ performance is valid.
“They might say, ‘The problem is actually in this other county; it is the people across the boundary who have high exclusion rates and are exporting the problem,’” Mr Tabberer said. “It is really interesting to be able to test whether that is the case and see whether the exclusion rates in those schools actually are high. It just helps to keep the conversation honest.”
The system is a powerful research tool. Mr Tabberer gives the example of being able to instantly find the 41 primaries, out of 17,504 in England, that have raised their pupils from the lowest 10 per cent of performance aged seven to the highest aged 11. The system also allows officials to spot problems and solutions that might not otherwise have been obvious.
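A query like the one Mr Tabberer describes amounts to a filter over per-school percentile ranks. The sketch below is in that spirit only; the record layout and field names are invented, not the department's:

```python
# Illustrative query: find schools whose cohort was in the lowest 10%
# at age seven but the top band at age eleven. Records are invented.

schools = [
    {"name": "A", "pct_rank_age7": 5,  "pct_rank_age11": 95},
    {"name": "B", "pct_rank_age7": 50, "pct_rank_age11": 60},
    {"name": "C", "pct_rank_age7": 8,  "pct_rank_age11": 92},
]

def turnaround_schools(records, low=10, high=90):
    """Schools in the bottom decile at age 7 and the top decile at age 11."""
    return [r["name"] for r in records
            if r["pct_rank_age7"] < low and r["pct_rank_age11"] >= high]
```

Run over all 17,504 primaries, a scan like this is what lets the system surface its 41 matches instantly.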
Mr Tabberer selects a couple of neighbouring authorities in the North at random. The colour coding instantly shows that the first is strong in areas where the second is weak and vice versa. In other words, they should be working more closely together.
All this is possible because the tests and teacher assessments that take place throughout every pupil’s school career have given England what Mr Tabberer describes as “probably the best data set on school performance in the world”.
But is it wise to base policy so heavily on data that many would argue is far from copper-bottomed? National test results play a huge role in the data. Yet most secondaries do not trust them as guides to pupil potential and rely instead on the cognitive ability tests they set at their own expense.
Local authorities also have major reservations about the colour-code system being used by the department to make snap judgments on their performance. One gave an example of being given a “red” for its limited rate of improvement on a particular indicator, pointing out that this was because it was already virtually at the top of the scale.
Others are concerned that some ratings are not based on actual figures but on value judgments made by national strategies officials.
“These are very reasonable questions,” said Mr Tabberer. “The point is, we know the basis on which these judgments are made. We never use judgments individually and we never act precipitately just on the basis of the data in front of us.
“If we talk to a local authority, it is with a question not, ‘You are doing X and we are going to do Y to you.’ This is not a command and control centre. This is a place where our view of what is going on can meet other people’s. That is why it is called the bridge.”
The bridge is more than an information system: it represents an entire philosophy of government. It was introduced partly in response to last year’s Cabinet Office review which warned that top officials had a tendency to work in isolated silos.
The solution to this can be found in the comparatively low-tech “scorecards area”. This is a meeting space with a series of posters on each wall relating to one of the department’s four main priorities: raising standards, narrowing the gap between the highest and lowest-achieving pupils, increasing choice and diversity, and providing a broad, balanced and relevant curriculum.
“At a glance” evidence of progress is presented under four sub-headings: “Headlines”; “Are we getting better?”; “How well organised are we to deliver?” and “Frontline feedback”. These are then rated red, amber or green. “We wanted to create a physical space where the trace of what everybody is doing is constantly there, so that as you do your work you find out what is happening in other areas,” said Mr Tabberer. “It means we can ask the questions in joined-up ways.”
So far, the bridge has been confined to the schools directorate. But it has been judged “an immensely successful experiment” and a new bridge for the whole of the new DCSF is now planned.
The aim is to make access to data within the department as democratic as possible. But what about outside the department? If this data really is already in the public domain, could Hotspots be made available online to everyone?
The answer is no. Mr Tabberer insists he is not operating a “secret garden”, as the information is already out there in some form or other. But he fears that making it public through Hotspots would raise the stakes, leading local authorities to demand reassurances and new protocols, which would ultimately limit its usefulness.
“What we are doing here is a very interesting experiment and it is working,” he said. “If we were to push it and pretend we could represent the whole system using this, and that this is what everybody should have access to, then we would have spoilt the very thing that is making us work better.”
WHAT THE COLOUR CODES MEAN
Red: We want to do much better.
Amber: We are doing OK but would like to do better.
Green: We are reasonably happy with progress.
Light blue: High risk.
Mauve: Medium risk.
Purple: Low risk.
(Definitions by Ralph Tabberer, the civil servant in charge of schools)