For the last 15 years Bev Dunbar has been analysing the NSW NAPLAN results, looking for the “big picture” information to help all teachers better understand student misconceptions and blockages to understanding. When you look at your NAPLAN results, you are looking at individual students in your class and whether your school achieved at, below or above State averages. This information helps you plan targeted strategies for student improvement.
However, here at Maths Matters we look at the whole-state results, in our case NSW. Fortunately, NSW results are almost identical to the Australian results, so we are able to see how Australian students as a whole are tracking in their mathematical progress. But unfortunately … this is not a pretty picture. It appears that our students are not able to adequately demonstrate competency on key test questions. In other words, they are not able to demonstrate that they have achieved key content outcomes.
We can argue about the nature of the test and whether we agree or disagree about its continuation. That is another story. The fact is that our government collects yearly data and that yearly data tells us something.
We analyse every test question and categorise it by Stage and by Core or Advanced. We call a test item a Core Stage question if we think it is suitable for most students to answer. For example, in Year 3 a Core Question is one that covers Early Stage 1 or Stage 1 concepts that we expect most Year 3 students to answer. We have a cut-off point of 80% for our definition of “most”. So one in five students might still answer a Core Question incorrectly. These may be students who are anxious about the test, who are working at an even earlier Stage, who have language and reading difficulties, and so on. But we believe the question is an effective one to ask this cohort. Other questions in the test paper may be at the same Stage but have twists and turns, or too many steps in their solution. We call these test items Advanced. We do not expect 80% or more of students to answer these questions correctly. These questions help identify the more advanced mathematical thinkers in your class, school, state or country.
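The categorisation rule above can be expressed very simply. Here is an illustrative sketch (not Maths Matters’ actual tooling); the function name and inputs are our own invention for this example.

```python
def classify_item(at_or_below_expected_stage: bool, expected_correct_rate: float) -> str:
    """Classify a test item using the 80% cut-off described above.

    Core: the item sits at or below the expected Stage AND we expect
    "most" students (80% or more) to answer it correctly.
    Advanced: everything else, including same-Stage items with extra
    twists or too many solution steps.
    """
    if at_or_below_expected_stage and expected_correct_rate >= 0.80:
        return "Core"
    return "Advanced"
```

For example, an item at the expected Stage that 90% of students are expected to answer correctly is Core; the same item with a multi-step twist that drops the expected success rate to 60% is Advanced.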
The strange thing is, though, that the questions making up a particular NAPLAN Numeracy test more often than not do NOT test Core content at or below the expected Stage. Some even test our students on content that is two Stages above. We are not sure what this data is supposed to tell us, and we are perplexed as to why these questions are included. If you studied William Shakespeare’s plays at university, for example, you would not expect to be tested on the plays of Anton Chekhov just to see whether you happened to have that knowledge.
In the 2017 Year 3 NAPLAN Numeracy paper, only 44% of the questions were Early Stage 1 or Core Stage 1. That’s 16 out of 36 questions. And only 50% of these 16 questions were answered correctly by 80% or more Year 3 students.
In the 2017 Year 5 NAPLAN Numeracy paper, 50% of the questions were Core Stage 1 or Core Stage 2. That’s 21 out of 42 questions. And only 43% of these 21 questions were answered correctly by 80% or more Year 5 students.
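Figures like these can be computed mechanically from per-question results. The sketch below uses made-up percentages purely for illustration; the real question-by-question data is in our one-page summaries.

```python
# Each entry is (is_core, percent_answered_correctly).
# These five values are invented for illustration only.
results = [(True, 92.0), (True, 75.0), (False, 60.0), (True, 81.0), (False, 88.0)]

core_scores = [pct for is_core, pct in results if is_core]

# Share of the paper made up of Core questions.
core_share = len(core_scores) / len(results)

# How many Core questions were answered correctly by 80% or more students.
mastered = sum(1 for pct in core_scores if pct >= 80.0)
mastered_share = mastered / len(core_scores)
```

With the invented data above, 3 of the 5 questions are Core (a 60% share), and 2 of those 3 Core questions clear the 80% mark.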
If you would like to see our one page summary of this year’s Year 3 results, click here.
If you would like to see our one page summary of this year’s Year 5 results, click here.
The summary is compact, with test question numbers on the left and the NSW test results for each question in matching spaces on the right. We also include an analysis of key concerns, based on these results. These concerns highlight the “big picture” view. Your individual students may have been successful in answering a question, but the overall results might indicate misconceptions you need to be aware of. We then develop resources that help your students overcome these blockages to their understanding.