
Using Data to Support Teaching and Learning

Robin Lane

Associate Literacy Consultant


Traditionally, data has been used to assess both school and pupil
performance. While this is always likely to be the case, a school can use
its data more creatively, with each cycle of data collection viewed as a
source of information on progress towards a collective goal, helping
everyone in the school to make decisions on future action.

A school can begin by making use of its existing data, undertaking
systematic analysis to identify patterns of strengths and weaknesses
and areas for improvement.

The cautious and reflective use of data underpins the effective work of a
school and is a crucial part of self-assessment and self-evaluation
programmes. The collection of data is worthwhile, however, only if it
produces information that will improve teaching and learning.

What’s it all about?


• enabling teachers and schools to report on what pupils have
achieved at certain points in time
• analysing assessment information, enabling teachers to make
informed decisions; to do this, however, teachers need to understand
what the information means and what it is, or is not, telling them
about their pupils
• informing decision-making at classroom and/or school level

What needs to be done?


• spread data analysis responsibilities throughout the school
• remember to keep in mind the whole picture of pupil performance
• encourage action on data as well as accessibility

What questions need to be asked?


At school level

• how does the school's performance compare with its previous
performance?
• how does the school's performance compare with the performance
of other 'similar' schools?
• does the school's performance match teachers' expectations and
meet any targets set? What does this tell us about the success of
existing improvement strategies?
• does the school's performance show any marked differences in
attainment between curriculum subjects or classes?

• where progress in curriculum subjects is not comparable, are
there any reasons why this might be?
• does the school's performance show that some groups of pupils are
doing better than others, for example by gender or ethnicity? Is this
related to any ability range or to a particular class or set?
• is there evidence to suggest that particular teaching and learning
strategies have been more, or less, successful than others? What
implications are there for professional development?
• what more should the school aim to achieve? What must the school
do to make it happen?

At classroom level

• has the teacher used information gathered from assessments to
inform medium-term planning and pupil progress targets?
• can the teacher identify learning objectives in medium-term
planning that reflect progress towards pupil progress targets?
• is the teacher using assessment information from ongoing teaching
to monitor progress towards the learning objectives in medium-term
plans?
• which pupils achieved the targets set for them at the beginning of
the term or year?
• which pupils have made significantly better or worse progress than
others? Can the reasons for this be identified?
• can the teacher identify issues that need to be addressed in the
coming year?



The Main Sources of School Data

PANDA—an annual report produced by OFSTED (www.ofsted.gov.uk) and
forwarded to schools in the form of a web document (usually in
November). This document:
• summarises the findings of the school's most recent inspection
• offers details of the numbers on roll, special needs, ethnicity and free
school meals
• provides attainment summaries (of SAT results at KS1 and KS2),
comparing the school with all schools nationally and with those of a
'similar' type
• considers trends over time
• presents the school's attendance and pupil mobility, and
• suggests the school's context

INSPECTION REPORT—the report of the school's most recent OFSTED
inspection, usually posted on the web three months after the final
report has been presented to the school. This document:
• summarises the strengths of the school, and those areas where there
is room for improvement
• considers how the school has improved since its last inspection
• indicates the quality of teaching in each Key Stage—Foundation, Key
Stage 1 and Key Stage 2
• provides attainment summaries
• makes judgements about the leadership and management of the
school
• describes the teaching of each curriculum subject

PRIMARY SCHOOL LEAGUE TABLES—published by the DfES
(www.dfes.gov.uk) in the first week of December. Information is provided
on every state Primary and Secondary school. For Primary schools this
includes the:
• number of pupils eligible for assessment
• % of eligible pupils with SEN
• % of pupils gaining Level 4 or above in English, Mathematics and
Science
• average points score
• school's value-added measure
The problem with performance tables presented in this way is that they
encourage competition rather than co-operation between schools.



UNIVERSITY OF DURHAM—the University has developed PIPs
assessments, which provide tests for each Primary school year. While
these tests are recommended by the LEA at Reception, Year 2, Year 4 and
Year 6, it is the school that decides whether or not to subscribe, for which
year groups, and the level of analysis it wishes to be reported. The
reliability of the tests is calculated and communicated each year, and the
University of Durham reports to the LEA on the results of subscribing
schools.
In addition to presenting a number of standardised scores, there are
Chance Tables that estimate how likely a pupil is to achieve each of the
various levels at Key Stage 1 and Key Stage 2. The Grade Tables report
achievements in terms of both academic attainment and relative progress
(value-added). Scatter plots also indicate the relative progress of
individual pupils. (www.pipsproject.org)
A complete set of a school's assessments is maintained on the PIPs website
and can be accessed by the subscribing school using its password.

PRIMARY SCHOOL INFORMATION PROFILE—this is a recent, and very
valuable, document provided by the LEA as part of the School Service
Guarantee. It brings together a range of information about the school from
the PANDA, the most recent Inspection Report, LEA involvement and
support, and Key Stage 2 targets. It also provides a far more accurate
picture of the school's context because, while the PANDA assumes that
pupils live in the immediate area surrounding the school, the SIP gives
detailed information on where the pupils actually live. (01772 531555)

FOUNDATION STAGE PROFILE—this replaces statutory baseline
assessment. It summarises pupils' achievements at the end of the
Foundation Stage by assessing each pupil's development in relation to the
stepping stones and early learning goals that form the Curriculum
Guidance for the Foundation Stage. There are thirteen assessment scales,
each with nine points. For each scale point, the judgement made is the
teacher's assessment of the pupil's typical attainment. (www.qca.org.uk)

OTHER ASSESSMENT INFORMATION used by schools—this might
include QCA tests for Years 3 to 5, NFER/Nelson tests, and reading tests
such as Salford and Schonell. It might also include individual teachers'
own assessments of pupils, e.g. marking against learning intentions,
spelling and tables tests, or tests constructed by the school.



Most schools are now generating a great deal of assessment data from
both national and optional tests. This data is generated at considerable
cost to the school, in terms of both time and money. The results from these
tests provide schools with the information they need to calculate the
progress made by pupils year on year, making it possible to set realistic,
and challenging, targets for individuals, groups of pupils, year group
cohorts and the school as a whole.
Even so, the quality and use of assessment remains the weakest aspect of
teaching, according to OFSTED (2001).

Item or gap analysis


After pupils have taken a test it is possible to analyse the papers, pupil by
pupil, item by item. This allows a detailed picture to be developed of the
strengths and weaknesses of individual pupils, groups of pupils and year
group cohorts. It has implications for both teaching and learning.
Analysis sheets for SATs at both KS1 and KS2, and for the QCA tests for
Years 3-5, can be downloaded from the Literacy website:
www.lancsngfl.ac.uk/nationalstrategy/literacy. These sheets are Excel
spreadsheets that produce running totals as each new piece of data is
entered.
Level threshold tables and age standardisations can be downloaded from
the QCA website: www.qca.org.uk/ages3-14/tests_tasks

Hidden pupils
Item or gap analysis helps identify pupils whose attainment is suffering.
Ruth Sutton (1995) has suggested that in each class there are the good, the
bad and the missing. Even the most conscientious teacher does not notice
some pupils as much as others, and some pupils work quite hard to avoid
the teacher's attention (the average, quiet pupils, often girls). She suggests
that teachers should list all the pupils in their class without reference to the
register. The pupils who come at the end of the list, or who are missing
from it, are the pupils who should be focused upon, because they are
probably making limited progress and their attainment is therefore
suffering.
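Sutton's exercise amounts to a simple comparison between the list a teacher recalls from memory and the class register, which can be sketched as follows (all names are invented):

```python
# Sutton's 'hidden pupils' exercise as a simple list comparison.
# The register and the teacher's recalled list are both invented.

register = ["Amy", "Ben", "Cara", "Dev", "Ella", "Fred"]
recalled = ["Ben", "Amy", "Dev", "Cara"]  # order of recall

# Pupils absent from the recalled list are the 'missing' pupils.
hidden = [p for p in register if p not in recalled]

# Pupils at the tail of the recalled list were also slow to come to mind
# and may deserve a closer look.
tail = recalled[-2:]

print(hidden)  # ['Ella', 'Fred']
print(tail)    # ['Dev', 'Cara']
```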

Observation data
Assessment information can be obtained by observing pupils at work,
discussing their learning or marking completed work alongside the pupil.
This is the basis upon which the Foundation Stage Profile is constructed.
Such systems benefit from moderation where teachers share an
understanding of what constitutes a particular level of competence.
Teachers in schools that have effectively adopted the principles of
Assessment for Learning use assessment information to set work that
better matches pupils’ capabilities.

REMEMBER: the purpose of assessment is to improve standards (of
teaching and learning), not merely to measure them.

Using assessment information to monitor progress


• this information is used to pitch the curriculum, feeding into
medium- and short-term planning by helping to identify key learning
objectives
• tracking occurs across the school, mapping the progress of
individuals, groups, classes and year group cohorts
• year-on-year trends are monitored
• the performance of specific groups of pupils is monitored—for
instance boys' attainment in Writing and girls' attainment in Science
• areas of focus are identified to inform both school improvement
planning and professional development activities
• national and local data is used effectively. Local benchmark
information is useful in identifying similar schools that may be
achieving more than others; a visit to one of these schools provides
the opportunity to learn from them.
The information that is generated by such monitoring helps a school to
make decisions about what to include in subject action plans and School
Improvement Plans.
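The kind of group-level tracking described above can be sketched with a very simple record structure. All names, groups, years and scores below are invented for illustration:

```python
# A sketch of cohort tracking: each pupil's standardised score is
# recorded per year, then average gains are summarised by group.
# Every name, group and score here is invented.

records = [
    {"pupil": "Amy",  "group": "girls", "scores": {2001: 98,  2002: 104}},
    {"pupil": "Ben",  "group": "boys",  "scores": {2001: 95,  2002: 96}},
    {"pupil": "Cara", "group": "girls", "scores": {2001: 110, 2002: 112}},
    {"pupil": "Dev",  "group": "boys",  "scores": {2001: 102, 2002: 101}},
]

def mean_gain(records, group):
    """Average year-on-year change in standardised score for a group."""
    gains = [
        r["scores"][2002] - r["scores"][2001]
        for r in records if r["group"] == group
    ]
    return sum(gains) / len(gains)

# In this invented cohort girls gained more on average than boys --
# the kind of pattern that would prompt closer investigation.
print(mean_gain(records, "girls"), mean_gain(records, "boys"))  # 4.0 0.0
```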

Target setting or target getting


Target setting is often the result of looking at the previous year's results
and adding a few per cent.
Where a school has agreed to collect a set of attainment data on individual
pupils and cohorts, it can be used to track their progress.
Target getting focuses attention on raising the attainment of all pupils:
knowing what pupils are capable of and challenging their teachers to
achieve, or better, that target.



Questions, puzzles and clarifications
Sample size
It is important to know the size of the cohort when considering any data.
Particular care should be taken with the results of schools where cohorts
are small.
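The effect of cohort size can be made concrete with a little arithmetic; the cohort sizes below are purely illustrative:

```python
# How much one pupil's result moves the headline percentage,
# for cohorts of different (invented) sizes.

def points_per_pupil(cohort_size):
    """Percentage points contributed by a single pupil's result."""
    return round(100 / cohort_size, 1)

for n in (10, 30, 60):
    print(n, points_per_pupil(n))

# In a cohort of 10, one pupil moving above or below a level threshold
# shifts the school's percentage by 10 points; in a cohort of 60, by
# under 2 points. Small-cohort results are therefore far more volatile.
```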

Comparing like with like


It should go without saying that it is only reasonable to compare one data
set with another similar set. There is little wisdom in comparing one test
with another, a school in one context with a school in a completely
different one, or one year's results with another's, whether using the
percentage of pupils achieving each level or the average score.

Problems of using levels


If a pupil achieves a Level 3, what does this mean? A perfect 100%, or a
borderline between Level 2a and Level 3? The answer to this question has
significant implications for judging the quality of teaching. Where pupils
are able to attain Level 3 with little or no teacher input, a poor teacher will
appear to have done as well as a good teacher who has stretched her
pupils to attain Level 3. The same is true at the other extreme, Level 1c,
for example. There is a real advantage in using standardised scores rather
than levels.

Regression to the mean


All measurement tools are subject to this. Extreme values are more likely
than average values to have arisen by chance. All teachers have examples
of a pupil doing extremely well or extremely badly in a test, only for their
performance on a subsequent test to return to 'normal': it regresses to the
mean. Thus in any test administered to a class, some pupils can be
expected to do unusually well or badly simply by chance.
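A small simulation illustrates the effect; all the numbers here are invented and are not drawn from any real test data:

```python
import random

# Regression to the mean, simulated: each pupil has a fixed 'true'
# standardised score, and each test adds random error. Pupils who
# score extremely on one test tend to score closer to their true
# level on the next. All parameters are invented.

random.seed(1)
true_scores = [random.gauss(100, 10) for _ in range(500)]
test1 = [t + random.gauss(0, 8) for t in true_scores]
test2 = [t + random.gauss(0, 8) for t in true_scores]

# Take the 25 highest scorers on test 1 ...
top = sorted(range(500), key=lambda i: test1[i], reverse=True)[:25]
mean1 = sum(test1[i] for i in top) / 25
mean2 = sum(test2[i] for i in top) / 25

# ... and their average falls back towards 100 on the second test,
# even though nothing about the pupils has changed.
print(round(mean1), round(mean2))
```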

Unique Pupil Numbers


Since their introduction these have aided the tracking of the progress of
individual pupils at school, County and national levels.

Pupil Progress
What is appropriate pupil progress? The normal expectation is that a
pupil will progress by one complete National Curriculum level every two
years. Many reading tests link a reading age to the pupil’s chronological
age. As a pupil becomes one year older one would expect his or her

reading age to improve by one year too. Where standardised scores are
used, 100 is the median and, crudely, comparing the scores a pupil attains
from one year to the next suggests whether or not appropriate progress
has been made.
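These rules of thumb can be sketched as simple checks; the thresholds follow the expectations stated above, and the example figures are invented:

```python
# Two crude progress checks, following the expectations above:
# one full National Curriculum level every two years, and a stable
# or rising standardised score (median 100) from year to year.
# The example inputs are invented.

def nc_progress_ok(level_start, level_end, years):
    """Expect half a National Curriculum level of progress per year."""
    return (level_end - level_start) >= 0.5 * years

def standardised_progress(score_last_year, score_this_year):
    """Positive change suggests progress ahead of the age group;
    a stable score suggests progress in line with it."""
    return score_this_year - score_last_year

print(nc_progress_ok(2.0, 3.0, 2))     # True: one level in two years
print(standardised_progress(98, 103))  # 5: ahead of age-expected gain
```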

Alignment of tests
It is important to ask the question, to what extent is the test aligned with
the curriculum? The design of SATs, for example, predates the National
Literacy Strategy.

SATs change
SATs change each year and new ‘level’ cut-off scores have to be
determined every year. The design of the tests also changed significantly in
2003.

Value-added measures
In order to assess value-added it is essential to look at the progress of the
same group of pupils, comparing current with prior attainment. This is a
far more valuable approach than considering raw achievement.
Value-added is judged by comparing a school's results at Key Stage 2
against the national progress of pupils from KS1 to KS2; the national
analysis is based upon a representative sample of 35,000 pupils. It is also
possible to calculate value-added for the optional QCA tests (Years 3-5)
using tables available on the QCA website.
As there is no statutory baseline assessment, this is the limit of the
measure. However, schools using PIPs can consider value-added measures
from one year to another throughout the Primary school.
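The principle of a value-added comparison can be sketched as follows. The 'national median' figures below are invented placeholders, not real DfES benchmarks:

```python
# A minimal sketch of value-added: a pupil's KS2 result is compared
# with the typical KS2 outcome of pupils nationally who had the same
# KS1 prior attainment. The medians here are invented placeholders.

national_median_ks2 = {1: 21, 2: 27, 3: 33}  # KS1 level -> KS2 points

def value_added(ks1_level, ks2_points):
    """Positive = better progress than similar pupils nationally."""
    return ks2_points - national_median_ks2[ks1_level]

# A pupil who reached Level 2 at KS1 and scored 29 points at KS2
# made more progress than the (hypothetical) national median.
print(value_added(2, 29))  # 2
```

A school-level figure would then aggregate these pupil-level differences, which is, in outline, how comparing current with prior attainment works.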



References
DfES (2002) Recognising Progress: getting the most from your data.

DfES (2003) Pupil Assessment Tracker 2003: a tool for analysing pupil
performance.

OFSTED (2001) Annual Report of Her Majesty's Chief Inspector of Schools.

Sutton, R. (1995) Assessment for Learning. RS Publications.
