NAPLAN Year 3 Numeracy 2017 – Analysis of the results

The annual NAPLAN tests are supposed to help governments and schools gauge whether students are meeting key educational outcomes. The results should help identify strengths and address areas that need improvement.

For each of the last 15 years I have been involved with a select group of colleagues in analysing the NSW Year 3 and Year 5 NAPLAN Numeracy results – initially in my role as Numeracy Project Officer for the Archdiocese of Sydney and more recently in my role as Director, Maths Matters.

When we look at the NSW NAPLAN Numeracy results for Year 3, we double-check every question against the exact curriculum outcome it matches in the NSW Syllabus. Our decisions often differ from the official ones, which means our summary shows a different proportion of Stage 1, 2 or 3 content outcomes. Each year we find that there are too few Stage below questions in both the Year 3 and Year 5 papers, which puzzles us. In Year 3, for example, we would expect more questions to relate to Stage 1 content, and in Year 5 we would expect more questions to relate to Stage 1 and Stage 2 content. Surely governments want to check that most students have achieved the key Stage below content outcomes.

We next examine each Stage below question and decide whether it is one we expect 80% or more students to answer correctly. We call these Core Stage questions; another name could be “key content outcomes”. Notice that our 80% cut-off still allows for one in five students who might be unable to answer correctly, perhaps because they are working at an even earlier stage. Core Stage below questions focus on the key outcomes we expect students to have understood and to be able to demonstrate by answering correctly. When fewer students answer these questions correctly, we are alerted to blockages that hinder student understanding. By analysing the percentage of students selecting each incorrect answer, we can see common error patterns in student thinking. These patterns help teachers focus on specific key content outcomes and clearly identify student blockages. The ideal result would be that all Core Stage below questions have 80% or more students answering correctly.
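To make this concrete, here is a minimal sketch (in Python) of the kind of check we are describing: it flags a Core question that falls short of the 80% target and surfaces the most popular incorrect answer. The answer options and percentages below are hypothetical, not actual 2017 results.

```python
# A minimal sketch of how we look for error patterns in a Core question.
# The answer options and percentages are hypothetical, not actual NAPLAN data.

CORE_TARGET = 80  # we expect at least this percentage of students to answer correctly

# Percentage of students choosing each option for one hypothetical Core question.
responses = {"A": 14, "B": 62, "C": 19, "D": 5}
correct_option = "B"

percent_correct = responses[correct_option]
if percent_correct < CORE_TARGET:
    # Sort the incorrect options by popularity to surface the common error pattern.
    errors = sorted(
        ((option, pct) for option, pct in responses.items() if option != correct_option),
        key=lambda item: item[1],
        reverse=True,
    )
    print(f"Only {percent_correct}% correct - most common error: "
          f"option {errors[0][0]} chosen by {errors[0][1]}% of students")
```

In this made-up example the most popular wrong answer points to the blockage a teacher would investigate first.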

Some questions in this Year 3 paper might be Stage below, but they include twists, so we do not expect 80% or more students to answer them correctly. We call these Advanced questions.

All this information is then put into a one-page grid. At first the grid may seem a little disorienting because of its unfamiliar layout. We put the question number on the left and its matching result on the right. Questions are colour-coded for easy reference so you can see which Stage each one belongs to and whether it is a Core or an Advanced question. An underlined question number means the question is an open one, not multiple choice. An asterisk (*) means the question is repeated somewhere in the Year 5 paper.

On the matching square on the right, we record the percentage of students who answered correctly.

  • If the result has a green background, 80% or more students in NSW schools answered the question correctly.
  • If the number is bold red, the question was a Core Stage below question that did not reach 80% correct.
  • If the result has a yellow background, fewer than 50% of students answered the question correctly.

For example, Question 11 on the left is coded olive green and the number is underlined: it is an Advanced Stage 1 question that is repeated somewhere in the Year 5 paper. The third matching space on the right shows that 67% of students answered Question 11 correctly. Question 15 on the left is coded lime green, meaning it is a Core Stage 1 question, so we expect 80% or more students to answer it correctly. The matching square on the right shows that only 59% of students answered Question 15 correctly. This alerts us to pay attention to the incorrect responses to see what blockages may be revealed. Question 15 is also underlined, meaning it too is repeated somewhere in the Year 5 paper.
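For readers who like to see the conventions spelled out another way, here is a minimal sketch of how one row of the grid and its colour rules could be expressed. The class and function names are our own illustrative choices, and only the Question 11 and Question 15 figures above are real; the actual grid is a hand-built one-page summary, not a program.

```python
# A minimal sketch of one row of the grid and the colour rules for its result square,
# using the conventions described above. Names are illustrative only.

from dataclasses import dataclass

@dataclass
class GridRow:
    number: int            # question number (left-hand side of the grid)
    stage: str             # e.g. "Stage 1"
    is_core: bool          # Core (expected 80%+) or Advanced question
    percent_correct: int   # result recorded in the matching square

def result_codes(row: GridRow) -> list[str]:
    """Apply the colour rules for the result square on the right."""
    codes = []
    if row.percent_correct >= 80:
        codes.append("green background")   # 80% or more answered correctly
    if row.is_core and row.percent_correct < 80:
        codes.append("bold red number")    # Core question below the 80% target
    if row.percent_correct < 50:
        codes.append("yellow background")  # fewer than half answered correctly
    return codes

# The two worked examples discussed above:
q11 = GridRow(11, "Stage 1", is_core=False, percent_correct=67)  # Advanced Stage 1
q15 = GridRow(15, "Stage 1", is_core=True, percent_correct=59)   # Core Stage 1
print("Q11:", result_codes(q11))   # -> []
print("Q15:", result_codes(q15))   # -> ['bold red number']
```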

We then include a second summary page. It shows each of the open (free response) questions and its results, as well as each of the repeated questions with the results for both Year 3 and Year 5.

We then present an overall summary which enables us to make big-picture statements about how Year 3 students in NSW are tracking, and on that basis we provide detailed feedback on relevant questions. Your class or your school may be above, at or below these statistics, but what we look at is the biggest picture we can – the results of every Year 3 student in NSW who sat that test. We show you the major trends in student misunderstanding, and these statements help you identify where students need more support.

These are the major results from our analysis of the 2017 NAPLAN Numeracy test for Year 3:

  • 61% were ES1 or Stage 1 questions (22 out of 36 questions)

  • 44% were Core Stage 1 or ES1 questions (only 16 out of 36 questions)

  • 50% of these 16 Core Stage 1 or ES1 questions did not score 80% or more correct
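As a quick check of the arithmetic behind these figures, the short sketch below reproduces the percentages from the question counts; the figure of 8 Core questions below target is simply 50% of the 16.

```python
# A quick arithmetic check on the summary figures above.

TOTAL_QUESTIONS = 36
stage1_or_es1 = 22        # ES1 or Stage 1 questions
core_stage1_or_es1 = 16   # of which these are Core questions
core_below_target = 8     # 50% of the 16 Core questions did not reach 80% correct

print(f"{stage1_or_es1 / TOTAL_QUESTIONS:.0%} ES1 or Stage 1 questions")            # 61%
print(f"{core_stage1_or_es1 / TOTAL_QUESTIONS:.0%} Core ES1 or Stage 1 questions")  # 44%
print(f"{core_below_target / core_stage1_or_es1:.0%} of Core questions below 80% correct")  # 50%
```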

To see our complete summary for the 2017 Year 3 Numeracy paper, click here. It is in the column marked Term 3/4.