
Saturday, 15 February 2020

Getting Ready For The 2020 KS2 Reading Test

If you're just here for the free resources, then here are the links:

Booklet 1: https://www.tes.com/teaching-resource/booklet-1-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12250306

Mark Scheme for Booklet 1: https://www.tes.com/teaching-resource/mark-scheme-for-booklet-1-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12260913

Booklet 2: https://www.tes.com/teaching-resource/booklet-2-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12257463

However, if you have a little more time, read on for the thought process that went into creating these resources.

But before you start reading my bit, I can't stress enough how important it is that you read Penny Slater's blog series of reflections on the analysis of the 2019 reading test. It is in four parts, and reading it is what prompted me to write this post about how I am hoping to prepare for the 2020 test:

https://www.hertsforlearning.co.uk/blog/reflections-analysis-2019-ks2-reading-sats-part-1

https://www.hertsforlearning.co.uk/blog/reflections-analysis-2019-ks2-reading-sats-part-2

https://www.hertsforlearning.co.uk/blog/reflections-analysis-2019-ks2-reading-sats-part-3

https://www.hertsforlearning.co.uk/blog/reflections-analysis-2019-ks2-reading-sats-part-4

One of our main reflections, having given our year 6 children a go at some of the past papers, is that stamina is a key skill which needs to be developed.

With this in mind, I looked at the wordage breakdowns that Tim Roach and Penny Slater provided:

Given that the 2019 test had the longest reading extracts yet, I decided to use its word count as a benchmark for developing some reading comprehension activities that we could use with the children to build their stamina.

It wasn't just the word count that was the issue: previously, we had no way of checking whether the reading materials we were using in reading lessons were of comparable difficulty to the texts used in the tests.

I used a simple online analysis tool to get some more information: https://datayze.com/readability-analyzer.php
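
If you would rather script this step than paste each text into the website, the headline figures are straightforward to approximate. Below is a minimal Python sketch (not the tool's actual code): it counts words and sentences and applies the standard Flesch Reading Ease formula. The syllable counter is a rough heuristic and the filename is just a placeholder, so expect the numbers to differ slightly from the online analyser.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability_stats(text):
    """Word count, sentence count and Flesch Reading Ease for a text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    # Standard Flesch Reading Ease formula.
    flesch = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    return {
        "words": len(words),
        "sentences": len(sentences),
        "flesch_reading_ease": round(flesch, 2),
    }

# Placeholder filename - one extract saved as a plain-text file.
with open("the_park.txt") as f:
    print(readability_stats(f.read()))
```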

I ran each of the three 2019 reading texts through the tool and got the following information:

The Park: [readability statistics screenshot]

Fact Sheet: About Bumblebees: [readability statistics screenshot]

Music Box: [readability statistics screenshot]
Using this data, I set about finding suitable texts of similar length and readability to use for a series of test-like comprehension activities. The aim of these activities is to replicate the length and readability of the second and third texts in the 2019 paper, so as to provide around 40-45 minutes' worth of reading and answering questions. So far, at my current school, reading lessons have not provided practice at that length, so in the run-up to Easter we have adapted our timetable to allow for longer reading lessons.

To aid me in the creation of these questions, I re-made the questions from texts 2 and 3 of the 2019 paper and used these as a template (click the link to download these from TES). I also did a quick analysis of the question types (e.g. short written answer, complete the table, multiple-choice tick box) and of the content domain coverage (using the information in the mark scheme); the simple percentage calculation is sketched after the breakdowns below:

Fact Sheet: All About Bumblebees:

Content Domains:

2a = 2/19 marks = 11%
2b = 9/19 marks = 47% (2 mark questions)
2c = 6/19 marks = 32%
2d = 3/19 marks = 16% (inferences in NF)
2g = 1/19 marks = 5%

Question types:

Short answer (one line): 14, 17, 18, 21, 26 = 5/15 = 33%
Medium answer (two lines): 19, 22b, 27 = 3/15 = 20%
Complete table: 15, 25 = 2/15 = 13%
Multiple choice tick box: 16, 20, 23 = 3/15 = 20%
Tick table: 22a, 24 = 2/15 = 13%

Music Box:

Content Domains:

2a = 1/17 marks = 6%
2b = 5/17 marks = 29%
2d = 9/17 marks = 53% (3 mark inference questions)
2g = 2/17 marks = 12%

Question types:

Short answer (one line): 31, 34, 35, 36, 38 = 5/12 = 42%
Medium answer (two lines): 28, 30 = 2/12 = 17%
Long answer (3 marks): 39 (32 is also 3 marks) = 1/12 = 8%
Complete table: 32, 33 = 2/12 = 17%
Multiple choice tick box: 29, 37 = 2/12 = 17%
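
For transparency, here is the simple calculation behind those coverage percentages, using my Music Box tally above as the worked example (the dictionary below is just my own count from the published mark scheme, not an official breakdown):

```python
# Marks per content domain for Music Box, as tallied above.
music_box_marks = {"2a": 1, "2b": 5, "2d": 9, "2g": 2}

total_marks = sum(music_box_marks.values())  # 17 marks in total
for domain, marks in music_box_marks.items():
    print(f"{domain} = {marks}/{total_marks} marks = {marks / total_marks:.0%}")
# Prints: 2a = 1/17 marks = 6%, 2b = 5/17 marks = 29%,
#         2d = 9/17 marks = 53%, 2g = 2/17 marks = 12%
```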

So far I have identified several texts for which I have begun to create reading comprehension questions. With the ones I have created so far, I have stuck quite closely to the questions from the 2019 test; however, I will probably deviate more to bring in variety as I create further resources.

Here are the texts I have found so far (texts marked * are from the 2019 test):

Fiction:

Title | Wordage | Ave. Score | Flesch Reading Ease
Jane Eyre | 807 | 3.242 | 98.05
The Wrong Train | 800 | 3.54 | 97.29
Armistice Runner | 774 | 3.634 | 92.19
Music Box* | 908 | 4.414 | 90.64
Louisiana’s Way Home | 803 | 4.436 | 89.78
The City of Secret Rivers | 789 | 4.53 | 90.56
Floodworld | 896 | 4.646 | 90.54
The Park* | 636 | 5.342 | 88.51

Narrative Non-Fiction:

Title | Wordage | Ave. Score | Flesch Reading Ease
Lightning Mary | 495 | 3.52 | 94.32
The Girl Who Fell From The Sky | 908 | 6.066 | 79.15


Non-Fiction:

Title | Wordage | Ave. Score | Flesch Reading Ease
Human Digestive System | 870 | 5.574 | 79.7
Pets in Cold Weather | 650 | 5.932 | 83.59
When You Grow Up | 700 | 6.55 | 84.75
All About Bumblebees* | 632 | 6.87 | 68.48
Henry 8th Wives | 748 | 8.594 | 70.77
All About The Circular Economy | 814 (+diagrams) | 8.754 | 66.51
Dr Jane Goodall Interview | 789 | 8.784 | 67.47
What is a Bushfire? | 657 | 9.218 | 64.46
Tutankhamun | 649 | 9.408 | 61.42


Most texts have been sourced from Nat Geo Kids and LoveReading4Kids.

A note on the Flesch score: the Flesch Reading Ease formula uses sentence length and the number of syllables per word to estimate how easy a sample is to read. A score of 60-70 is generally taken to be plain English and corresponds to an 8th/9th grade (US) reading level. A score between 50 and 60 corresponds to a 10th-12th grade level, and below 30 is college graduate level. To give you a feel for the different levels, most US states require scores of 40 to 50 for insurance documents.

So, looking at the non-fiction texts above, and converting the US grade system to UK year groups, we find that, according to this simple analysis, All About Bumblebees is potentially a year 9/10 level text, better suited to 13-15 year-olds. However, the Flesch Reading Ease score is calculated using only the number of words, the number of sentences and the number of syllables per word.

To get another view of readability, I also averaged the five other readability measures that the analyser provides (Gunning Fog Index, Flesch-Kincaid Grade Level, SMOG Grade, Dale-Chall Score and Fry Readability Grade Level). None of this is an exact science, but I hope it gives a ballpark idea of how difficult the texts need to be in order to match those in the test.
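
For anyone who wants to reproduce that averaging without the website, here is a rough sketch using three of the five measures (Dale-Chall needs a familiar-word list and Fry is read off a graph, so both are left out). It reuses the same crude syllable heuristic as the earlier sketch, so treat the output as ballpark figures only, not the analyser's exact numbers.

```python
import math
import re

def count_syllables(word):
    # Same rough heuristic as before: groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def grade_level_estimates(text):
    """Approximate US grade-level scores and their average for a text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllable_counts = [count_syllables(w) for w in words]
    polysyllabic = sum(1 for n in syllable_counts if n >= 3)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = sum(syllable_counts) / len(words)
    scores = {
        "flesch_kincaid_grade": 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59,
        "gunning_fog": 0.4 * (words_per_sentence + 100 * polysyllabic / len(words)),
        "smog_grade": 1.043 * math.sqrt(polysyllabic * 30 / len(sentences)) + 3.1291,
    }
    scores["average_grade"] = sum(scores.values()) / len(scores)
    return {name: round(value, 3) for name, value in scores.items()}
```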

Interestingly, although The Park is shorter and sits in the first position in the 2019 test, it comes out as a slightly more difficult text than Music Box. In this instance, we must assume that the shortness of the text, combined with simpler questions (a heavier focus on retrieval than inference, for example), makes this an easier part of the test. It also shows that text difficulty according to these scores can vary, so the questions we ask must be complex enough if we want to replicate the difficulty of the test for practice purposes.

When choosing the non-fiction texts I tried to find pieces of a similar interest level to the SATs texts - I also wanted to make sure that there was a variety of subject matter and text type. When choosing the fiction texts I tried to find extracts in which something actually happens - it wasn't just a case of finding a chunk with the right wordage.

With all this in mind, and with the texts I currently have, I suggest the following order of use for the resources I intend to create (a quick check of the combined word counts is sketched after the list):

Lightning Mary + Human Digestive System = 495 + 870 = 1365 words
Pets in Cold Weather + Jane Eyre = 650 + 807 = 1457 words
When You Grow Up + The Wrong Train = 700 + 800 = 1500 words
Henry 8th Wives + Armistice Runner = 748 + 774 = 1522 words
All About The Circular Economy + Louisiana’s Way Home = 814 + 803 = 1617 words
Dr Jane Goodall Interview + The City of Secret Rivers = 789 + 789 = 1578 words
What is a Bushfire? + Floodworld = 657 + 896 = 1553 words
Tutankhamun + The Girl Who Fell From The Sky = 649 + 908 = 1557 words
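
As a sanity check on those pairings, here is a short sketch that totals each pair's word count (figures taken from the tables above) and compares it with the roughly 1,540 words of texts 2 and 3 in the 2019 paper (632 + 908):

```python
# (first text, words, second text, words) - figures from the tables above.
booklets = [
    ("Lightning Mary", 495, "Human Digestive System", 870),
    ("Pets in Cold Weather", 650, "Jane Eyre", 807),
    ("When You Grow Up", 700, "The Wrong Train", 800),
    ("Henry 8th Wives", 748, "Armistice Runner", 774),
    ("All About The Circular Economy", 814, "Louisiana's Way Home", 803),
    ("Dr Jane Goodall Interview", 789, "The City of Secret Rivers", 789),
    ("What is a Bushfire?", 657, "Floodworld", 896),
    ("Tutankhamun", 649, "The Girl Who Fell From The Sky", 908),
]

BENCHMARK = 632 + 908  # All About Bumblebees + Music Box in the 2019 test

for first, w1, second, w2 in booklets:
    total = w1 + w2
    print(f"{first} + {second} = {total} words ({total - BENCHMARK:+d} vs 2019 benchmark)")
```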

A few example pages from the texts and questions can be seen in the booklets themselves, which can be downloaded from the TES website.



Once again, here are the links to download the reading comprehension resources:

Booklet 1: https://www.tes.com/teaching-resource/booklet-1-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12250306

Mark Scheme for Booklet 1: https://www.tes.com/teaching-resource/mark-scheme-for-booklet-1-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12260913

Booklet 2: https://www.tes.com/teaching-resource/booklet-2-year-6-sats-prep-reading-comprehension-based-on-analysis-of-2019-test-12257463

Please do keep checking back on that link as I will keep adding resources as I create them. Even if you don't need to use them as I intend to, hopefully they can be useful beyond my own setting.

Postscript:

I'd just like to make it clear that this isn't the only thing we will be doing in the run-up to SATs - we will still be reading a class novel, doing Reciprocal Reading, Fluency Reads and so on. We will also be soldiering on with teaching the wider curriculum!

Monday, 13 May 2019

From The @TES Blog: Eyes Down, It's Time For SATs Reading Test Bingo


In what must be the article with the shortest shelf life I've ever written, I've made some tongue-in-cheek predictions about the content of the 2019 reading SATs:

Read it here: https://www.tes.com/news/eyes-down-its-time-sats-reading-test-bingo

Friday, 8 February 2019

Times Tables Fluency and the KS2 SATs

How important is times tables fluency for the KS2 SATs? I'd say quite important.

When we are fluent in speaking a language, we can speak it without thinking much about it. That same kind of fluency with times tables facts will be useful for year 6 children when it comes to the tests in May.

I looked through the 2018 SATs papers to see just how many questions required some times tables knowledge.

Here's what I discovered:

  • In Paper 1 (Arithmetic) there are 19 out of 36 questions which definitely require children to have fluent times tables knowledge.
  • In Papers 2 and 3 (Reasoning) there are 18 out of 44 questions which also require children to have fluent times tables knowledge.
But why might fluency be important? Can't children just work out the times tables without knowing them by heart?

Well, yes, they could, but it would cost them time.

For Paper 1, children are given 30 minutes, meaning that they have less than a minute per question. For Papers 2 and 3 there are 40 minutes per paper - this means children have about 2 minutes per question.
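
The time pressure is easy to quantify. Here is the back-of-the-envelope arithmetic, using the timings and the question counts from the 2018 papers noted above:

```python
# Paper 1 (Arithmetic): 30 minutes, 36 questions.
# Papers 2 and 3 (Reasoning): 40 minutes each, 44 questions between them.
print(f"Paper 1: {30 / 36:.2f} minutes per question")               # ~0.83 minutes
print(f"Papers 2 and 3: {(40 * 2) / 44:.2f} minutes per question")  # ~1.82 minutes
```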


In a question where 8 different times tables facts must be recalled (see above), it is obvious that this needs to be done quickly so that children can focus on the procedure of answering the question. In such questions accuracy is essential too: if children are fluent with their times tables facts, they are less likely to make mistakes.

If children spend too much time working out times tables facts, they risk going over that 1 or 2 minutes per question; in turn, they risk not having time to finish the test.

But looking at the 2018 tests revealed something else: most of the times tables facts that children needed in order to answer the questions were fairly easy - the sort of times tables that are learned in years 1, 2 and 3. The times tables grid here shows exactly which facts were required. The hardest facts (such as 7 x 8, 9 x 8 and 11 x 11) weren't needed. The most common facts were below 6 x 6, with the majority of the additional facts coming from the 2, 3 and 4 times tables.

But children need to be able to do more than recall them quickly; they need to be fluent enough to use and apply them. It's not just about remembering the facts, but about being able to recognise relationships between numbers. For example, the questions below require children to spot related facts:



I've put together a PowerPoint presentation which contains all the questions that require some times tables knowledge. I've animated the working out and answers for each question too so that these can be used flexibly. The PowerPoint (which the images used above are taken from) can be downloaded here: https://www.tes.com/teaching-resource/powerpoint-to-demonstrate-the-need-for-times-tables-fluency-in-sats-12065667. I used it with year 5 parents who found it useful to know how important times tables were going to be for SATs.

Tuesday, 3 April 2018

From The @TES Blog: 5 Things To Do Instead Of Revising For SATs

This might come across as idealistic or cynical. It might even sound hypocritical to those who've taught Year 6 alongside me. But there really is more to Year 6 than SATs revision - even in SATs week.

Regardless of your views on key stage 2 testing, it’s the system with which we’re currently lumbered. And I would always advise that children are prepared for them.

But by preparing, I don’t mean drilled to within an inch of their lives: Easter booster classes, daily past papers, hours of homework and the like. There are other ways of helping children to be ready for that week of testing in May – ways that prepare them mentally; ways that ensure they remain emotionally intact.

Here are five suggestions:

Click here to read the whole article: https://www.tes.com/news/school-news/breaking-views/five-things-do-instead-sats-revision

Friday, 16 March 2018

Comprehension Strategies And The KS2 Reading Test - What and How Should We Teach?

In my first blog post in this series I explored the difference between reading comprehension strategies and reading skills. I noted that many of the skills that are tested in the KS2 SATs also have a matching reading comprehension strategy. With the conclusion that the deliberate use of strategies develops and embeds skills, I posed a question to myself:

Is there a way to teach comprehension strategies that prepares children well for the KS2 reading test?

In answering this second question, I had to consider what is different about the reading test. Whereas the commonly-used comprehension strategies do not require children to give written answers to questions they ask or generate themselves, the test does. This is the main difference. In addition, the year 5/6 National Curriculum objectives mention no requirement for children to provide written answers to questions, and many of the objectives aren't tested at all by the SATs. The objectives circled in red aren't tested by the SATs; the ones outlined in blue are.

Without any evidence to back this up, I believe that there are children who, having been taught strategies which have become skills, are able to complete the reading test, confidently giving written answers to the questions it asks. I suspect that these children are also able writers and have probably had a healthy relationship with literacy in general from an early age. There is a potential argument here for a sole focus on teaching comprehension strategies, never asking children to spend time practising written answers to comprehension questions.

But I also think that there are probably children for whom some explicit instruction in how to give written answers to comprehension questions will be useful and necessary (if they are to have a chance of demonstrating their reading skills in a test, which all year 6 children are required to sit). Again, I have no research evidence to back this up, only anecdotal experience. However, there is research evidence to support the idea that particular written activities do support reading comprehension.

I turned to Steve Graham and Michael Hebert's 'Writing to Read' report which states:

"Writing-about-text activities had a positive impact on struggling students’ understanding of a text. An important key to success in using these activities with lower-achieving students was to provide them with ongoing practice and explicit instruction."

The report recommends that students do write in response to things they have read and outlines a series of recommendations of activities. One of the recommendations is that teachers should have students answer questions about a text in writing, or create and answer written questions about a text:

"Answering questions about a text can be done verbally, but there is greater benefit from performing such activities in writing. Writing answers to text questions makes them more memorable, as writing an answer provides a second form of rehearsal. This practice should further enhance the quality of students’ responses, as written answers are available for review, reevaluation, and reconstruction (Emig, 1977).

For generating or responding to questions in writing, students either answered questions about a text in writing; received practice doing so; wrote their own questions about text read; or learned how to locate main ideas in a text, generated written questions for them, and then answered them in writing. These practices had a small but consistently positive impact on improving the reading comprehension of students in grade 6–12 when compared to reading or reading instruction."

Lemov et al's 'Reading Reconsidered' also provides plenty of classroom evidence that writing supports reading comprehension. They summarise:

"...the strategic use of writing made reading and discussions of reading- the other core activities of English class—more rigorous, focused, productive and engaging- ‘better’ in short.  Writing is a deeply valuable endeavor in its own right, but it is also an endeavor that works in synergy with reading in specific ways."

From 'Writing To Read'
Activities other than answering questions include responding to a text through writing personal reactions or analyses/interpretations of the text, writing summaries of a text, taking notes on a text, and creating and/or answering questions about a text in writing. Actually, all of these activities have a greater effect size than answering questions and therefore should be explored further in the primary classroom - another blog post for another time!

What does come through both the 'Writing To Read' report and Lemov et al's 'Reading Reconsidered' chapter entitled 'Writing For Reading' is an emphasis on explicit teaching: if we want children to be able to write well about the things they read in order to develop a better understanding of what they read, we must explicitly teach these skills - they must be modelled well by the teacher.

What I have found is that evidence from both research and successful classroom practice shows that an approach to teaching reading strategies which includes giving children the opportunities to practise giving written answers to comprehension questions (in order to prepare them well for a test) is not something we should avoid, but is something that, if done right, could be beneficial to the children we teach.
From the IES guide
So, is there a way to teach comprehension strategies that prepares children well for the KS2 reading test? Yes, I think so. As long as there is modelling, discussion (book talk) and time for children to practise, a sequence of learning that improves reading skills can (and should) focus both on teaching reading comprehension strategies (as outlined in the EEF and IES guidance) and on the elements of the National Curriculum (as outlined in the content domains of the KS2 test developers' framework), as the two act reciprocally due to the similarities between the skills and the strategies. Reading instruction which includes, amongst other things, asking children to respond in writing to well-written questions based on a manageable amount of text is a good idea when preparing children for the KS2 tests. It shouldn't be the only element of reading instruction, but it should help. Where children lack particular skills, it will be best to focus modelling and practice on those skills.

If children are only given written comprehension activities the comprehension strategies are not likely to be employed or developed. But if the written comprehension activities are backed up with explicit teaching of the supporting strategies (as well as vocabulary, any other necessary background knowledge and how to write answers), then comprehension strategies should be developed. Such explicit teaching (including modelling and discussion) should focus on ensuring that children know what the strategy is, how it is used and why and when to use it. Children can be shown how to use the strategies when completing written comprehension activities.

The York Reading for Meaning Project assessed three reading comprehension interventions delivered by teaching assistants in 20 primary schools. The three interventions were carried out with children who had been identified as having the poor comprehender profile - the three interventions were intended to help children who struggled with reading comprehension to overcome their problems. The three interventions differed:
  • Oral Language Programme: vocabulary, reciprocal teaching with spoken language, figurative language and spoken narrative
  • Text Level Programme: metacognitive strategies, reciprocal teaching with written language, inferencing from text and written narrative
  • Combined Programme: all of the above (vocabulary, reciprocal teaching with spoken language, figurative language, spoken narrative, metacognitive strategies, reciprocal teaching with written language, inferencing from text and written narrative)
Based on the findings, the report concludes that 'the Oral Language intervention overall was the most effective of the three programmes. Theoretically, this finding provides strong support for the theory that the reading comprehension difficulties seen in those who show the poor comprehender profile are a secondary consequence of these children’s oral language weaknesses.'

Here, then, is evidence that children who are struggling with reading comprehension, and are falling behind, will benefit from an oral language programme as an intervention. In the context of this blog post - which focuses on teaching all children (including those who aren't struggling with comprehension but are still learning new skills and strategies) - it is worth questioning whether these research findings bear relevance: should we scrap writing as part of first teaching of reading and focus solely on an oral approach?
Examples of combined programmes from The York Reading for Meaning Project: An Overview


However, the outcomes of the project also show that 'all three interventions (Text Level, Oral Language and Combined) improved children’s reading comprehension skills'. In this blog post I have been suggesting what is essentially a combined programme for everyday classroom-based reading instruction (see the examples above). The question the research doesn't answer is whether, where first teaching of reading comprehension is concerned (i.e. not interventions for poor comprehenders), the benefits of writing discussed above would be outweighed by an oral-only approach.

What is potentially telling is that 'the children who received the Combined programme experienced all components but at half the quantity of the other two intervention programmes'. What if children were given the full quantity of both the oral and the written approaches? Isn't this something that a reading lesson, with an adequate amount of time given over to it, could offer children that an intervention (set at 30 minutes in this study) could not?

It would be interesting to know which approach (oral, text or combined) shows the best results for all learners, rather than as an intervention for poor comprehenders. For teachers working on helping children to be prepared for KS2 testing, it would be good to see research which focuses on first teaching for all learners, with the results taken from SATs performance. Whether you support year 6 testing or not, the tests are currently a feature of England's education system. In order for children to feel prepared (and hopefully not stressed by uncertainty about the tests), and in order for schools to demonstrate accurately the reading ability of their children, most schools will want to let children practise giving written answers to comprehension questions. Would it be too much of a gamble, in this case, for schools to take an oral-only approach?

Expanding on some of the ideas in this blog post, in previous blog posts I have written about...

Friday, 15 September 2017

9 Important Changes to the Primary Maths Curriculum and Assessment

In response to the DfE's latest documents, I wrote this for Third Space Learning. It's a summary of the key changes in the way primary maths will be assessed over the next few years:

On 14th September, just as we were all getting settled into the new school year, the DfE published not one, but two documents of considerable importance: ‘Primary assessment in England: Government consultation response’ and the 2017/2018 ‘Teacher assessment frameworks at the end of KS2’. Both documents reveal changes that will no doubt affect our approach as teachers and leaders.

Whilst the most imminent and significant changes involve writing and reading, there are also some interesting developments in Maths.

Monday, 11 September 2017

KS2 Maths SATs On Reflection: Why We Teach For Mastery In Maths

Here's one I wrote for Third Space Learning: https://www.thirdspacelearning.com/blog/2017/ks2-maths-sats-on-reflection-teaching-for-mastery

‘Without reflection, we go blindly on our way, creating more unintended consequences, and failing to achieve anything useful.’ - Margaret J. Wheatley

Perhaps that’s a little over the top, but there’s something in it. As a teacher it’s always worth reflecting on a year just gone, looking back at what went well and what might need changing for the next year. I spent the year as Maths and UKS2 lead whilst teaching in Year 6.

As such, I have the privilege of being up to date with the changes taking place in primary education, especially with regard to the expected standards in assessment. Now that I’ve got a few weeks of holiday under my belt, my mind is a little fresher. It's only natural, then, that I begin to look back upon the KS2 Maths SATs 2017. Read on for my reflections on the end of July and the ever-present changes to how maths is assessed in UK primary schools...

https://www.thirdspacelearning.com/blog/2017/ks2-maths-sats-on-reflection-teaching-for-mastery

Sunday, 9 July 2017

Changing The Teaching of Reading: Did It Work?


Disclaimer: Although this blog post is all about SATs data, it's not what I'm all about, nor has it been our singular focus this year. This blog post merely serves to analyse test data, however flawed it is turning out to be, in the spirit of transparency.

Regular readers will know I've blogged quite a lot about reading over the last year (slight understatement). I've developed ways of teaching reading (based on research, not just whim) which I've shared with others, and which others have used in their own classrooms. I've been eager to share my approaches because I've been excited about them, but there was always a question in the back of my mind: "What if they don't work?"

So, as results day approached, I was worried for more than just one reason: I hoped I had not let my own school down, and I hoped as well that I hadn't sent those who'd tried some of my ideas down the wrong path.

And because I've shouted loudly about how we've tackled our approach to teaching reading, particularly in year 6, it's only really right that I share something of our results with my readers.

I feel that first off I must signpost an excellent blog post by Mr. Jennings: 29% to 92%, a reading journey! The blog is an in-depth exploration of everything his school did to make a supremely impressive jump in their percentage of children reaching the 100 scaled score in the 2017 key stage 2 reading test. I feel honoured to have been mentioned in the blog post as part of Mr. Jennings' journey, but I now hail him as my new hero! What an achievement! I shall be learning from all the amazing work he has done this year.

Attainment

Well, we didn't quite get a 63 percentage point increase in our reading results - ours was a much more modest 21 percentage point increase, meaning that roughly half of our 60-strong cohort reached the magic 100 mark.

I was pleased with the increase - I've been told that a school working very hard to improve something can expect a 10 percentage point increase in results on average.

However, I was hoping that we would get 60% of children reaching the pass mark. Looking into the data, I found a number of children who were between 1 and 4 marks short of the 26 marks needed - these children would have brought the percentage reaching the 100 scaled score up to my desired 60%. (I am hoping to have 3 of these children's scripts remarked.)

Progress

So, whilst our attainment was low, it wasn't a surprise. Our school's history and context (in short: Inadequate Ofsted December 2013 leading to academy conversion January 2015, 94% EAL, 37% disadvantage) means that our children haven't consistently had good teaching.

A review of historical data shows that only 22.2% of this cohort made average or above average progress during their time in lower key stage 2, and, as a result, only 5.6% of them were working at age related expectations in reading at the end of year 4. This figure was 26% (at ARE) by the end of year 5.

Our online tracking system (SPTO) takes a CTF file from the NCA Tools website and assigns points to the different scaled scores. Using this data, 94.8% of this cohort are shown to have made average or above-average progress; in fact, 91.4% made accelerated progress. Looking at progress from official key stage 1 data shows that only 10% of children didn't make expected progress - at the beginning of the year, the figure was 43% not making expected progress across the key stage.

So, even though our attainment results don't yet reflect the progress being made due to low starting points, there is considerable reason to believe that the approaches we took in the teaching of reading have had a positive impact on the children.

Test to Test

One or two of you may remember that I reported that, before Christmas, almost 50% of the cohort achieved the pass mark on the 2016 reading test. This has caused me quite a bit of consternation: did no progress occur between December and May, given that the same percentage of children passed in May as in December?

So, I spent some time looking into the data. Thankfully, I'd kept the scores from when the children tried the 2016 test as we wanted to see if taking the 2016 test was a good indicator of how well children would do on the actual test so that we could rely on using it as a practice paper in the future.

Positively, I discovered that:

  • two extra children 'passed' in May who hadn't passed in December (one child who had 'passed' in December didn't 'pass' in May).
  • all of the aforementioned children who had scaled scores of 97, 98 or 99 got significantly more marks (between 6 and 12 more) than on the 2016 test, and all but one of them (the one who 'passed' in December but not in May) got a higher scaled score - for example, one child moved from a scaled score of 90 to 97.
  • most of our lowest attaining children, and our SEND children, made the most progress between the two tests, with some of the most vulnerable children getting a double-digit increase in both their scaled score and the number of marks gained.
  • overall, children had made progress, some very impressive, from one test to the next, even if this did not mean that they achieved the 100 scaled score.
Interestingly, I also found that a number of children who 'passed' both tests achieved lower scaled scores in the 2017 test than in the 2016 test, with an average difference of -2 scaled score points. For some children it appeared that the 2017 test, although its texts were easier, was actually harder to pass because of its raised pass mark than the more difficult 2016 test with its lower pass mark. (A rough sketch of this kind of test-to-test comparison follows.)
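
To show the sort of comparison I did without publishing real pupil data, here is a sketch with made-up scaled scores; the real analysis simply did the same per-child subtraction and averaging across the whole cohort.

```python
# Hypothetical scaled scores only - not our children's real results.
scores = {                  # child: (December 2016 test, May 2017 test)
    "Child A": (92, 100),   # newly 'passed' in May
    "Child B": (100, 101),  # 'passed' both tests
    "Child C": (103, 100),  # 'passed' both, but with a lower score in May
    "Child D": (96, 99),    # improved, but still below 100
}

for child, (december, may) in scores.items():
    print(f"{child}: {december} -> {may} ({may - december:+d})")

passed_both = [may - december for december, may in scores.values()
               if december >= 100 and may >= 100]
print("Average difference for children who 'passed' both tests:",
      sum(passed_both) / len(passed_both))
```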

So, is the 2016 test a good indicator of how well a child might do in the 2017 test? Yes, although some children may get an equal, or lower, scaled score on the more recent test as it could be considered harder to pass.

And does the fact that our percentage of children passing stayed the same across the two tests, despite the extra teaching time in between, mean that our approach to teaching reading didn't work? I believe not, as most children made progress between the two tests, gaining both extra marks and higher scaled scores - this was particularly evident for the lower attainers.

However, it draws attention to a group of children who were scoring in the 90s on the 2016 test in December and who would have benefited from some additional input - this is the challenge for next year: what does that intervention look like? What do those children need in order to be more consistent when answering comprehension questions? Are there other factors that meant these particular children struggled to reach the pass mark, despite showing progress?

I hope that this blog post has been read with interest and without judgement - I only seek to be transparent. I am fairly certain that I can conclude that what we have done in reading has been successful on the whole, and that like most new approaches, now just needs some adjustments and additions. However, I do share this hoping that I might gain insight from outsiders on how else I might interpret the data and make conclusions - for example, if it seems to you that my approach hasn't at all worked, I'd prefer to know that and not waste any more time on it!

A small request: I'd be interested if you would share with me, publicly or privately, the increase in percentage you have experienced between the 2016 and 2017 reading results (between last year's cohort and this year's).

Monday, 3 July 2017

To My Brilliant Year Six Teachers

To my brilliant year six teachers,

Thank you and well done for all your incredibly hard work this year. I could not have asked for more commitment and dedication to the children of our school. They have received top-notch teaching and a highly-tailored curriculum this year - you have thought of each and every one, assessing their needs and then working on them meticulously to help them to make, in so many cases, very rapid progress.

You have had the highest of expectations for all the children in your care and have not let anyone get away with anything sub-standard. At the time, that might make you feel like an ogre, but it is absolutely necessary in ensuring that the children have the best possible chance of present and future success.

It has been so encouraging to see how you have worked together, trying out new things and analysing their success. You have really made every effort to be excellent teachers - and it has paid off. Your self-reflectiveness and your desire to always better yourselves have been an absolute gift, both to me as your leader and to the children.

And so, whatever the 2017 KS2 tests results say, I stand with you and support you. Should they be good, we will celebrate. Should they be disappointing, we will look for and celebrate the successes that are sure to be there. And we will optimistically plan for the future, resolutely seeking ways to better our practice from this year. 

Yes, I am now speaking about 'we' and not 'you' because, although your personal commitment is independently commendable, we are a team and we did this together. This is not a case of 'you', it is 'us' - that 'us' includes all the school's leaders and every other member of staff who has touched the lives of our outgoing year sixes. We did this together and we will stand together.

Thank you though for all the times you felt you were on your own, but you kept on going anyway - you truly do put the children at the centre of all you do.

When those results come in, think not of them as the only measure of each child's achievements, no matter how well they have done. They do not measure all the things that you have told me, and that I have seen, throughout the year: the small wins and the big successes. That child who was working on year two objectives who can now successfully demonstrate understanding of many year six objectives. That child who only started with us this year, having not been in school for a good while. That child who has discovered a love of reading, of writing, of maths, of history, of Shakespeare. That child who now speaks up in class. All those children who are raring to go to secondary school, confident that they are learners and that they will be successful as long as they hold high expectations for themselves. You did that. 

They might not thank you for it. But I do. And in years to come they will look back and remember all that you have done and they will wish that they had thanked you. 

But I know you don't do it for the thanks. You do it because you care. There is not enough thanks to cover that.

A vast understatement to finish, because anything attempting to sum it up would sound far too hyperbolic and platitudinous: this has been a great year and you should be proud of what you have achieved with the children.

This was re-blogged on the TES site: https://www.tes.com/news/school-news/breaking-views/sats-year-6-teachers-results-day-there-arent-enough-words-say-thank