Four schools took part in the Data Rich Schooling Project in 2000 and 2001 – three from Wales and one from Gloucestershire. The Project was developed to gain knowledge of the effectiveness of data richness, an area where little evidence from research or from practice was available. This knowledge was considered necessary as a basis for intervention, with a focus on generating capacity for improvement within schools. The premise of the Project was that data richness would enable schools to track individual student progress actively, enhance decision making by school management and identify good practice within the school.

The four Project schools used data for measuring performance, potential and behaviour. Each school set up its own system of data management, refining the process as its understanding of the potential of data evolved. This report will show that each of the schools found data richness an essential tool in school improvement: it enhanced management decision making, helped identify good practice, and enabled good communication of information to and about pupils.

In setting up effective means of managing data, various questions arose:

    • what data management system to use and who should manage it;

    • what type of data to use;

    • how to make data accessible, and to whom it should be accessible.

All four Project schools succeeded in solving some of the problems of data management and, in consequence, became more intelligent organisations through the knowledge the data provided. With the relevant data, systems could be developed for monitoring and targeting pupils, especially those shown to be underachieving, as the data highlighted individuals’ potential.

For data to contribute to raising the performance of both staff and students, tracking at an individual student level was imperative. With a system of individual student tracking in place across the whole curriculum, information could be identified quickly and efficiently and acted upon proactively. In order to measure performance, baseline assessment of students was of great importance.

Project schools One and Two found the Key Stage 2 results of little use as baseline data for setting targets for improvement or for value added at Key Stage 3. When Year 7 students were admitted to these schools they were tested with NFER reading tests and CATs, and these results were used as the baseline data. Project Two found a large anomaly in the Key Stage 2 results of its Year 7 intake for the academic year 2001. In English these results showed 65% of students at Level 3 and above, with 59% at Level 4 and above, but when the school evaluated the same students using the tests mentioned above, 69% were below their chronological age in reading and 59% were two years behind.

Project Three used Key Stage 2 results as baseline data for Maths, Science and English. The results were analysed and adjusted to show exactly the level of the students, e.g. 5.45 and not 5. Key Stage 3 targets were then set by adding 1.5 (the government’s expectation of the ‘average’ student’s progression) to the Key Stage 2 level. Project Three was also moving toward using MidYis at Key Stage 3 for baseline assessment, but at the time of this study had not yet used it for target setting or value added.
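The target-setting arithmetic described above can be sketched as follows. The 1.5 progression figure and the decimalised example level of 5.45 come from the text; the function name and the rounding behaviour are illustrative assumptions.

```python
def ks3_target(ks2_level, expected_progress=1.5):
    """Set a Key Stage 3 target by adding the government's expected
    progression for an 'average' student to the adjusted (decimalised)
    Key Stage 2 level."""
    return round(ks2_level + expected_progress, 2)

# An adjusted Key Stage 2 level of 5.45, as in the example above:
print(ks3_target(5.45))  # 6.95
```

Keeping the decimalised level (5.45 rather than 5) matters here: rounding to a whole level before adding the expected progression would shift the resulting target by up to half a level.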

Project schools One and Three wrote baseline assessment tests for Key Stage 3, using Excel spreadsheets to record the Key Stage 3 data. As noted above, Project Three used Key Stage 2 tests as baseline for three subjects, but other subject areas wrote their own baseline tests using National Curriculum levels to enable tracking and target setting. These baseline tests were new for the academic year 2001 and therefore there was no evidence of their effectiveness at the time of this study. Teaching staff would be inputting their own data upon completion of the tests.

Project One, on the other hand, employed someone to input all its data. At the time of the study the school was in its second year of writing baseline assessment for all subject areas at Key Stage 3, but was still attempting to overcome some difficulties. The problems involved maintaining cross-curricular consistency and disseminating the assessment data. Each Department had to work out its own criteria and awarded levels 5-1 for student attainment and effort. The first year of using these tests produced certain anomalies, as some subject areas had used National Curriculum levels and others had not. The use of National Curriculum levels caused a problem because, for example, Welsh had five levels but French had only four. Students, parents and form teachers looking at the results therefore mistakenly believed that a student was performing better in Welsh. The repercussion of this was that when pupils in Year 9 chose their options for GCSE, French was perceived as too difficult and was therefore not chosen. In certain subjects that had used National Curriculum levels, such as PE, students, especially in the lower years, appeared not to be achieving however able they were. English and Maths both set tests as baseline assessment, but one department decided levels before marking the test and the other decided levels after marking. The result was that the former appeared to show students having difficulties with the subject, whilst the latter showed exceptionally good results. The school is actively working on making its assessment procedure consistent.
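The comparability problem described above (Welsh awarded on a five-level scale, French on a four-level scale) can be illustrated with a hypothetical normalisation. The subject names and maximum levels come from the text, but expressing each result as a fraction of its subject's maximum is an assumed reconciliation, not the school's actual method.

```python
# Maximum level awarded per subject, as described in the text:
MAX_LEVEL = {"Welsh": 5, "French": 4}

def normalised(subject, level):
    """Express an awarded level as a fraction of the subject's maximum,
    so results on different scales can be compared fairly."""
    return level / MAX_LEVEL[subject]

# A level 4 in French is the top of its scale; a level 4 in Welsh is not,
# which is why raw levels misled students, parents and form teachers:
print(normalised("French", 4))  # 1.0
print(normalised("Welsh", 4))   # 0.8
```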

Because of these discrepancies in the system, the data was handled only at Middle and Senior Management level and not passed on to Form Tutors, parents and pupils. The data had been distributed to everyone in the previous year, which had led to many difficulties – some due to misconceptions about capability in certain subjects (as above) and some due to staff inability to pass on the knowledge gained in a positive and proactive manner. Withholding the data at Form Tutor and student/parent level meant that Management could use it as a motivational force for both staff and students, disseminating whatever knowledge they believed appropriate for improvement. Both Senior and Middle Management used the data for setting and for monitoring individual students’ academic and behavioural performance, with problems and excellence very quickly picked up, acknowledged and dealt with accordingly. The data also informed Senior Management of excellence and problems at staff and Departmental level.

Project One also found that the YELLIS scores used for GCSE target setting and value added had not generated a positive response in the first year, when all data was shared with teachers, parents and students. In the second year of the Project the school therefore set its targets at Key Stage 4 using GCSE grades (not the internal 5-1 assessment used at Key Stage 3) based on YELLIS, but did not share these scores with parents or students. Again Management used the data to track students and set targets, with provision in the system to mentor students who were underperforming or were borderline on the C/D grading. They also mentored students who were high academic achievers to keep them on task. This meant that Management were actively using data to improve the performance of individuals and to meet, or better, the Local Authority’s GCSE A*-C targets for the school.
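The mentoring selection described above can be sketched as a simple filter over predicted grades. The student names and grades below are invented, and the rule for what counts as "borderline C/D" is an assumption; the text says only that underperforming and C/D borderline students were mentored.

```python
# Hypothetical YELLIS-based predicted GCSE grades for a class:
predicted = {"Student A": "B", "Student B": "C", "Student C": "D", "Student D": "E"}

BORDERLINE = {"C", "D"}  # assumed definition of the C/D borderline group

def mentoring_list(predicted_grades):
    """Pick out students predicted C or D, the group the school mentored
    to push borderline results into the A*-C band."""
    return sorted(name for name, grade in predicted_grades.items()
                  if grade in BORDERLINE)

print(mentoring_list(predicted))  # ['Student B', 'Student C']
```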

The Data Manager in Project One (a Deputy Head) did not use the SIMS Assessment Manager but developed Microsoft Excel spreadsheets for tracking data, believing that the Assessment Manager was too complicated and time consuming. The Data Manager would very much value the development of a dedicated Microsoft Access database. He expressed the view that an Access database would be simpler to use, and therefore also to train people to use; however, it would require an expert in Access software to develop it.

Project Four developed an Access database that they used for student tracking. This database was developed by members of the school’s teaching staff. Development had taken many hours of staff time, in both training and application of acquired knowledge.
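Neither school's actual database design is given, but the kind of student-tracking database described here, whether built in Access, Excel or elsewhere, might be sketched along these lines. The schema, table, column names and example figures are all hypothetical; SQLite stands in for the database engine purely so the sketch is self-contained.

```python
import sqlite3

# An in-memory stand-in for the kind of tracking database the schools built.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE assessment (
    student   TEXT,
    subject   TEXT,
    baseline  REAL,   -- baseline level on entry
    target    REAL,   -- negotiated target level
    latest    REAL    -- most recent tracked level
)""")
conn.execute("INSERT INTO assessment VALUES ('Student A', 'Maths', 4.2, 5.7, 4.5)")
conn.execute("INSERT INTO assessment VALUES ('Student A', 'English', 4.8, 6.3, 6.4)")

# Flag subjects where the latest tracked level is still below target --
# the kind of underachievement the schools used the data to highlight:
below = conn.execute(
    "SELECT subject FROM assessment WHERE latest < target").fetchall()
print([s for (s,) in below])  # ['Maths']
```

The design point is the comparison of baseline, target and latest level per student per subject; that is what makes underachievement visible at a glance, whatever software holds the table.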

Project Two used the SIMS Assessment Manager for data tracking. All staff had laptops and were trained in their use on joining the school. All information was collated and processed by the Data Manager (a member of the teaching staff) and passed on to all staff, students and parents. The data was used for target setting, with parents and students jointly negotiating grades upward with teaching staff throughout all years. All Key Stage 3 assessments were by National Curriculum levels and all Key Stage 4 assessments by GCSE grades. CATs, NFER and YELLIS results were shared with both parents and students for target setting. The Form Tutor took on a very proactive role, using all this knowledge at a pastoral as well as an academic level, allowing even minor problems to be picked up and dealt with quickly. Students were tracked half-termly and setting was by Departmental choice.

A new system was put in place that connected all laptops to the central server from anywhere in the school. This system means that monitoring will be immediate: registration in every class, at every lesson, will be automatically linked to the server, which will pick up any missing students and raise an alert. Through the use of data, the school felt minor problems were being targeted before they escalated. Staff here appreciated the worth of the data for information at an individual as well as a school level.
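The automatic registration check described above amounts to a set difference between the expected roll and the students actually registered for a lesson; a minimal sketch, with invented names:

```python
def missing_students(expected, registered):
    """Return students on the roll with no registration mark for the
    lesson, so an alert can be raised immediately."""
    return sorted(set(expected) - set(registered))

roll = {"Student A", "Student B", "Student C"}
present = {"Student A", "Student C"}
print(missing_students(roll, present))  # ['Student B']
```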

Project Three shared all their knowledge on students with staff, students and parents. In order to do this, they had had to overcome the concerns of teaching staff who feared the information would have a detrimental effect. This concern arose from the perception that the catchment area had low socio-economic status, and an assumption that the data would reinforce a perceived ethos of low aspiration.

With the breadth of knowledge gained through extensive data richness, showing potential as well as weakness, this concern was proved wrong. Target setting with all staff, students and parents had improved the school’s results. Again, the use of data had highlighted potential problems so that they could be handled by in-school systems before they escalated; it also made students’ potential visible to staff, students and parents.

The Data Rich Project has proved that data richness has the potential to be an important tool in school improvement. In order for this to be the case, there are a number of important factors to consider:

    • Baseline assessment across the curriculum needs to be addressed. Key Stage 2 and 3 results are inadequate for two reasons: (1) they do not give information about students’ ability and potential across the curriculum, as only three subjects are tested, and (2) there are certain anomalies in the information they give about students’ ability.

    • If schools are given targets (which also feed into performance related pay) based on National Key Stage testing, there are potentially serious problems: the gap between Key Stage 2 results and student ability, and the belief that it would be better to get poor Key Stage 3 results and excellent GCSE results (a belief emphasised in Wales by the removal of league tables as a measuring device).

    • The time required for data input and for managing the compilation of results needs to be addressed. One Project School had teachers inputting their own data. Whilst this conferred ownership and an understanding of how useful and powerful a tool data can be, it also consumed an enormous amount of teaching staff’s time on administration. This use of expensive, out-of-classroom time was echoed in all the schools, which had middle or senior members of staff compiling and managing the data.

    • Data richness makes an organisation more intelligent through the knowledge it gives, but how that knowledge is used (for example, a Chances Graph from YELLIS) is what enables a school to improve.

    • All the Project Schools admitted there was a problem with staff embracing the importance of data. New systems are hard to implement in any institution, but once the worth of data was demonstrated by improved results, more staff realised how it could help them.

    • The data information is only of use if there are systems in place to act on the knowledge, either positive (reward) or negative (support).

    • The information is also only of use if staff are trained to understand the importance of the knowledge. If the information is not used and disseminated proactively it can cause a negative reaction, as witnessed in the first year of the project in Project School One.

All four Project Schools became data rich and, with imaginative leadership, found methods of using the information suited to each of their contexts. The schools, as intelligent organisations, used the data to enhance management decision making through identification of good practice and weakness throughout the school. All four schools had a strong parental/pupil link, with targets for academic achievement and behaviour being jointly agreed between schools, parents and pupils. The schools improved regardless of their social context (in Projects One and Two there were declining cohorts due to new social housing opening in their catchments, with "problem families being dumped" there; the Head of Project Two estimated that Free School Meals would increase from 11% at the beginning of the 2001/2 academic year to 29% by the end of the academic year).

The Project Schools expressed the view that a cultural paradigm shift had taken place within their respective schools through embracing data and building systems to act on the knowledge gained. The data had given them a wider picture of all the pupils, highlighting strengths and weaknesses. With the wider picture the data provided, more questions were provoked, and preconceptions of students’ ability were challenged. The wider picture proved more accurate than the instinctive ‘gut feelings’ about pupils that had been in evidence previously. As children’s potential was highlighted, the staff were motivated toward school improvement, using the data to challenge underperformance and attitude. Two of the Project Schools saw the knowledge given by data as having enabled staff to work as team players in a learning community with common goals. Challenging staff and pupil expectations had become the school ethos. As a consequence, the children’s results improved and there was a tangible change in the self-belief of the staff, which was passed back to the pupils, with new and positive self-fulfilling prophecies becoming the norm. One member of staff explained, "people are not in any comfort zone here, staff have rising expectations of themselves and students with the good results acting as a motivational force". The use of data to inform generated many questions that resulted in a change of ‘attitude’, and thereby practice, throughout the schools.

Further details of how each school became data-rich are given in the individual project school reports below.

Project 1 (Cwrt Sart School)

Project 2 (Dyffryn Comprehensive)

Project 3 (Pen-y-dre School)

Project 4 (Castle School) - File 1 and File 2

These files explain how Castle School uses its software to collect and analyse data.