However, here at Maths Matters we look at the whole state results – in our case, that’s NSW. Fortunately NSW results are almost identical to the Australian results, so we are able to see how Australian students as a whole are tracking in their mathematical progress. But unfortunately … this is not a pretty picture. It appears that our students are not able to adequately demonstrate competency on key test questions. In other words, they are not able to demonstrate that they have achieved key content outcomes.
We can argue about the nature of the test and whether we agree or disagree about its continuation. That is another story. The fact is that our government collects yearly data and that yearly data tells us something.
We analyse every test question and categorise it both by Stage and as Core or Advanced. We call a test item a Core Stage question if we think it is suitable for most students to answer. For example, in Year 3 a Core Question is one that covers Early Stage 1 or Stage 1 concepts that we expect most Year 3 students to answer. We have a cut-off point of 80% for our definition of “most”. So one in five students might still answer this Core Question incorrectly. These may be students who are anxious about the test, may be working at an even earlier Stage, may have language and reading difficulties and so on. But we believe the question is an effective one to ask this cohort. Other questions in the test paper may be at the same Stage but may have twists and turns or too many steps in their solution. We call these test items Advanced. We do not expect 80% or more students to answer these questions correctly. These questions help identify the more advanced mathematical thinkers in your class, school, state or country.
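To make the cut-off concrete, here is a minimal sketch of how flagging could work. The result data is hypothetical, modelled on question examples that appear later in this article; only the 80% cut-off comes from our definition above.

```python
# A sketch with hypothetical data: each tuple is
# (question number, Core/Advanced category, percent answering correctly).
CORE_CUTOFF = 80  # we expect 80% or more to answer a Core question correctly

results = [
    (5, "Core", 90),       # comfortably above the cut-off
    (15, "Core", 59),      # a Core question worth investigating
    (24, "Advanced", 48),  # Advanced questions are not held to the cut-off
]

# Flag only Core questions where fewer than 80% answered correctly.
flagged = [q for q, category, pct in results
           if category == "Core" and pct < CORE_CUTOFF]
print(flagged)  # -> [15]
```

In this sketch only Question 15 is flagged: it is a Core question, yet it fell well short of the 80% we expect.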
The strange thing is though, the questions that make up a particular NAPLAN Numeracy test more often than not do NOT test Core Stage below content. Some even test our students on content that is two Stages above. We are not sure what this data is supposed to tell us. We are perplexed as to why these questions are included. If you study William Shakespeare’s plays at University, for example, you would not expect to be tested on the plays of Anton Chekhov, just to see if someone has that knowledge.
If you would like to see our one page summary of this year’s Year 3 results, click here.
If you would like to see our one page summary of this year’s Year 5 results, click here.
The summary is compact, with test question numbers on the left and the NSW test results for each question in matching spaces on the right. We also include an analysis of key concerns, based on these results. These concerns highlight the “big picture” view. Your individual students may have been successful in answering a question, but the overall results might indicate misconceptions you need to be aware of. We then develop resources that help your students overcome these blockages to their understanding.
Each year for the last 15 years I have been involved with a select group of colleagues in analysing the NSW Year 3 and Year 5 NAPLAN Numeracy results – initially in my role as Numeracy Project Officer for the Archdiocese of Sydney and lately in my role as Director, Maths Matters.
When we look at the NSW NAPLAN Numeracy results for Year 5, we double-check every question for the exact curriculum outcome it matches in the NSW Syllabus. Often we find our decisions differ from the official ones, which means we have a different proportion of Stage 1, 2 or 3 content outcomes in our summary. Each year we find that there are insufficient Stage below questions in both the Year 3 and Year 5 papers. We are puzzled by this. For example, in Year 3 we would expect more questions to relate to Stage 1 content. And in Year 5 we would expect more questions to relate to Stage 1 and 2 content. Surely governments want to check that most students have achieved the key Stage below content outcomes.
We next examine each Stage below question and determine whether it is one we expect 80% or more Year 5 students to answer correctly. We call these Core Stage questions. Another name could be “key content outcome”. Notice our 80% cut-off still allows for the one in five students who might be unable to answer correctly – they may be working at an even earlier Stage, for example. These Core Stage below questions focus on the key outcomes we expect students to have understood and to be able to demonstrate by answering correctly. If fewer students than expected answer these Stage below questions correctly, the results alert us to blockages that hinder student understanding. When we analyse the percentage selecting each incorrect answer, we are able to see common error patterns in student thinking. These then help teachers focus on specific key content outcomes and clearly identify student blockages. The ideal result would be that all Core Stage below questions have 80% or more students answering correctly.
Some questions in this Year 5 paper might be Stage below, but they can include twists that we do not expect 80% or more students to answer correctly. We call these Advanced questions.
All this information is then put into a one-page grid. At first you may find the grid a little disorienting, simply because it is unfamiliar. We put the question number on the left and its matching result on the right. Questions are colour-coded for easy reference so you can see which Stage each one is and whether it is a Core or an Advanced question. An underlined question number means that question is an open one, not multiple choice. A question marked with an asterisk (*) is repeated somewhere in the Year 3 paper.
On the matching square on the right, we record the percentage of students who answered correctly.
For example, Question 5 on the left is coloured pink which means it is a Core Stage 2 question. The matching space on the right shows that 90% of Year 5 students answered correctly. Question 24 on the left is coded purple and the number is underlined. This means it is an Advanced Stage 2 question that is repeated somewhere in the Year 3 paper. The third matching space on the right shows that 48% of students answered Question 24 correctly. It is coloured yellow as this result is less than 50%.
We then include a second page summary. This shows each of the open questions (free response) and the results. We also show each of the repeat questions and the results for both Year 3 and Year 5.
We then show an overall summary which enables us to make big picture statements about how Year 5 students in NSW are tracking. Based on this summary we then provide detailed feedback on relevant questions. Your class or your school may be above, at or below these statistics. But what we look at is the biggest picture we can – the total number of Year 5 students in NSW who sat for that test. We show you the major trends in student misunderstanding. These statements help you identify where students need more support.
These are the major results from our analysis of the 2017 NAPLAN Numeracy test for Year 5:
Each year for the last 15 years I have been involved with a select group of colleagues in analysing the NSW Year 3 and Year 5 NAPLAN Numeracy results – initially in my role as Numeracy Project Officer for the Archdiocese of Sydney and lately in my role as Director, Maths Matters.
When we look at the NSW NAPLAN Numeracy results for Year 3, we double-check every question for the exact curriculum outcome it matches in the NSW Syllabus. Often we find our decisions differ from the official ones, which means we have a different proportion of Stage 1, 2 or 3 content outcomes in our summary. Each year we find that there are insufficient Stage below questions in both the Year 3 and Year 5 papers. We are puzzled by this. For example, in Year 3 we would expect more questions to relate to Stage 1 content. And in Year 5 we would expect more questions to relate to Stage 1 and 2 content. Surely governments want to check that most students have achieved the key Stage below content outcomes.
We next examine each Stage below question and determine whether it is one we expect 80% or more students to answer correctly. We call these Core Stage questions. Another name could be “key content outcome”. Notice our 80% cut-off still allows for the one in five students who might be unable to answer correctly – they may be working at an even earlier Stage, for example. These Core Stage below questions focus on the key outcomes we expect students to have understood and to be able to demonstrate by answering correctly. If fewer students than expected answer these Stage below questions correctly, the results alert us to blockages that hinder student understanding. When we analyse the percentage selecting each incorrect answer, we are able to see common error patterns in student thinking. These then help teachers focus on specific key content outcomes and clearly identify student blockages. The ideal result would be that all Core Stage below questions have 80% or more students answering correctly.
Some questions in this Year 3 paper might be Stage below, but they can include twists that we do not expect 80% or more students to answer correctly. We call these Advanced questions.
All this information is then put into a one-page grid. At first you may find the grid a little disorienting, simply because it is unfamiliar. We put the question number on the left and its matching result on the right. Questions are colour-coded for easy reference so you can see which Stage each one is and whether it is a Core or an Advanced question. An underlined question number means that question is an open one, not multiple choice. A question marked with an asterisk (*) is repeated somewhere in the Year 5 paper.
On the matching square on the right, we record the percentage of students who answered correctly.
For example, Question 11 on the left is coded olive green and the number is underlined. This means it is an Advanced Stage 1 question that is repeated somewhere in the Year 5 paper. The third matching space on the right shows that 67% of students answered Question 11 correctly. Question 15 on the left is coded lime green. This means it is a core Stage 1 question. We expect 80% or more students to answer correctly. If you look at the matching square on the right, you see that only 59% of students answered Question 15 correctly. This alerts us to pay attention to the incorrect responses to see what blockages may be revealed. As Question 15 is also underlined, this means it is repeated somewhere in the Year 5 paper.
We then include a second page summary. This shows each of the open questions (free response) and the results. We also show each of the repeat questions and the results for both Year 3 and Year 5.
We then show an overall summary which enables us to make big picture statements about how Year 3 students in NSW are tracking. Based on this summary we then provide detailed feedback on relevant questions. Your class or your school may be above, at or below these statistics. But what we look at is the biggest picture we can – the total number of Year 3 students in NSW who sat for that test. We show you the major trends in student misunderstanding. These statements help you identify where students need more support.
These are the major results from our analysis of the 2017 NAPLAN Numeracy test for Year 3:
We have also included the blue sharks on our MULTIPLICATION & DIVISION Photographs page as you could also reproduce each image to create as many multiples as you like. Here are 4 groups of 3 sharks, for example.
An object has mirror symmetry if you can divide it into two matching pieces, like looking at a reflection in a mirror. We use the term “reflection” to describe the matching image. This is also called a “flip” in more colloquial language. An object has rotational symmetry if it can be rotated about a fixed point so that its parts match the new position exactly. For example, you can trace the outline of a shape on paper then rotate this shape around the centre point of the paper image. If the new position matches the original shape on the paper then this shape has rotational symmetry. It may match more than once as it turns a full circle.
Not all things have symmetry. It is important to talk about what is NOT symmetrical too, so that your students build a clearer image and remove any blockages.
When I worked at The LEGO Centre in Drummoyne many years ago, we had over 70 000 children visit our LEGO play area. There we were able to observe that almost all the spontaneous LEGO constructions were symmetrical. Without anyone directing the children, they appeared to want their construction to be shape symmetrical, although colours did not always match.
This collection of photographs enables you to discuss a wide variety of objects and images with a special focus on symmetry. Where is the line of symmetry? Is there more than one axis? Are real-life objects always exactly symmetrical? Do we call something symmetrical even though not every single piece matches exactly?
James Turrell is an American installation artist who works directly with light to create his artworks. He was originally a perceptual psychologist. “Turrell’s over eighty Skyspaces, chambers with an aperture in the ceiling open to the sky … (where) … the simple act of witnessing the sky from within a Turrell Skyspace, notably at dawn and dusk, reveals how we internally create the colors we see and thus, our perceived reality.”
“My work has no object, no image and no focus. With no object, no image and no focus, what are you looking at? You are looking at you looking. What is important to me is to create an experience of wordless thought.” (James Turrell)
If you know individual students are travelling along country roads, encourage them to take their own photographs of unusual road signs. Have they seen one about emus? Deer? Kangaroos? Koalas? Use these for related class discussions about the importance of road signs and the information they tell us.
Match my Story provides follow-up examples to discuss together with your Stage 1 students. It is also suitable for high and medium block Stage 2 students. Each set has more than one matching number sentence. It is vital to discuss both the ones that work and those that don’t. Help your students think more deeply about what they are learning.
If you just heft two objects in your hands, as in this picture, it is easy enough to describe one object as lighter than, the same mass as, or heavier than the other.
But it becomes a more challenging task when you look at a diagram and need to read something important in it. For example, in the diagram below, you need to see that the object on the left pan balances the 5 objects on the right pan. That tells you something. It tells you that 5 tomatoes have the same mass as one bottle. Or it may even tell you that one tomato has the same mass as one fifth of the bottle.
If you analyse what you see in the two diagrams below, this time you see that 5 small yellow balls are heavier than a large yellow cube. But the second diagram tells you that those same 5 small yellow balls are actually lighter than the large yellow sphere. Both diagrams need to be looked at to work out which object is the heaviest – the large cube? The large sphere? Or the 5 small balls?
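The transitive step in that comparison can be sketched in a few lines of Python. The mass values here are invented purely so the relations match the two diagrams (the 5 small balls outweigh the cube but not the sphere); only the chain of comparisons matters.

```python
# Invented masses chosen only to satisfy the two diagrams:
# 5 small balls > large cube, and 5 small balls < large sphere.
masses = {"large cube": 3, "5 small balls": 4, "large sphere": 6}

assert masses["5 small balls"] > masses["large cube"]    # diagram 1
assert masses["5 small balls"] < masses["large sphere"]  # diagram 2

# Chaining the two comparisons identifies the heaviest object.
heaviest = max(masses, key=masses.get)
print(heaviest)  # -> large sphere
```

Neither diagram alone answers the question; it is only by chaining the two comparisons that the sphere emerges as heaviest.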
In a recent numeracy test, 70% of Year 3 students were able to correctly identify the least heavy mass from a sequence of diagrams. Yet last year, in 2016, only 67% of Year 5 students could correctly identify the heaviest mass from a similar diagram. After two more years of school. ARRGGGHHH!
Does this mean the Year 3 students are dramatically improving their understandings? It’s so difficult to know how to respond to these results.
The test diagrams show a set of balance pans with only one object in each pan. Some show the objects as equal in mass. Some show the objects as lighter or heavier than each other. You need to use your mass logic to look at each diagram, understand which object is heavier and then transfer this realisation to the second or third diagram. In other words, you need to go beyond your first response, your first diagram.
In this year’s test 11% of Year 3 students incorrectly selected their first response. What they selected would have been correct if there were no second diagram giving new information. In other words, it was their problem-solving skills that let them down. These one in ten students need to practise perseverance, rereading the question, making time for the problem to become real, making time to use their correct understandings of balance pan diagrams. They do NOT need to be taught how to read a balance pan. Their mass thinking is correct but they just didn’t carry it far enough.
In last year’s test 27% of Year 5 students incorrectly selected their first response. Again, it was their problem-solving skills that let them down, not their ability to read meaning from a mass diagram. This is reinforced by the very small number of students who selected totally incorrect answers from the multiple choice. So 67 + 27 = 94. That’s potentially 94% of Year 5 students who could have been correct – a much more reasonable expectation for Year 5.
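The arithmetic behind that claim is simply the sum of the two groups who read the diagrams correctly:

```python
# Year 5 (2016): students who answered correctly, plus students whose
# only error was stopping at the first diagram.
correct = 67
stopped_at_first_diagram = 27
potentially_correct = correct + stopped_at_first_diagram
print(potentially_correct)  # -> 94, the percentage who understood the diagrams
```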
So, as teachers, we need to think carefully about what test results actually reveal. We should place more emphasis on developing problem-solving skills than reteaching known mass concepts. All students need time to solve a multi-step problem (of course not in a test situation …). They need to reread the question to see that they need to go further than their initial reaction. They need to use logic when they think about their answers. They need to develop pride in being an effective problem-solver.
In 2016 a large cohort of Year 5 students had lots of trouble correctly identifying how long a length measure was when the cm tape measure didn’t start at 0. Only 42% of students surveyed answered correctly (5 cm). This year, in 2017, Year 3 students are overwhelmingly successful at correctly identifying a similar measure using informal units. Too many swings and roundabouts for me.
When you analyse the incorrect responses, 46% of these Year 5 students selected the answer which was just one more than the correct answer (6 cm). In other words, they counted the 6 marks on the ruler, counting up from the first mark as 1 rather than 0. What on earth do they think length is about? The ruler marks off units of length; the marks themselves are not the units. Perhaps too many Year 5 students have had only pen and paper experiences? Or too strong a focus on the ruler or tape measure marks rather than the actual length of a unit?
So what do we discover when we analyse the Year 3 response to a similar question? This year’s question was open-ended rather than multiple choice. But in itself this doesn’t explain why so many more students were correct. In fact this year’s question was more difficult as it involved far more units (13 units, where last year there were only 5). We don’t know how many students answered 14 units, but 81% of Year 3 students answered correctly anyway. This is a dramatic change in behaviour and understanding compared with the 2016 Year 5 cohort.
I believe that there is a glaring explanation. This year’s question talks about measuring with blocks placed end-to-end. In other words, students are virtually told how to interpret the ruler in the diagram. They are told to see it as blocks in a line. So of course they don’t focus on counting up the marks (which would include the mark at the start to get 14).
This explanation has major implications for how we discuss length units with all students. We need to continuously reinforce that length represents units like blocks. We need to talk about the misunderstanding that creates the blockage for the majority of older students.
Length is not just a measure of marks on a ruler. When your feet are together at the start of a number line you have not moved. You are at 0. It is only when you jump a unit of length that you record 1. This concept needs reinforcing with all primary age students. A ruler is conveniently marked to show where these length units reach. Whenever we begin to measure though, we always start at 0. Even if the ruler shows an object starting at 27, that starting point should now represent 0 in a new counting sequence. If an object starts at 108 on a ruler, 108 should now be seen as 0 in a new counting sequence. Even if an object starts at 389 on a tape measure, 389 should now be seen as 0 in a new counting sequence.
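The “start counts as 0” idea amounts to a single subtraction: the length is the difference between the end mark and the start mark, never a count of the marks themselves. A small sketch – the 2-to-7 reading is a hypothetical example, while the other starting points are the ones mentioned above:

```python
def length_between(start_mark, end_mark):
    # Length is (end - start): the start mark plays the role of 0.
    return end_mark - start_mark

print(length_between(2, 7))      # -> 5, not 6: counting the marks overshoots by one
print(length_between(27, 32))    # an object starting at 27: 27 acts as 0
print(length_between(108, 113))  # an object starting at 108: 108 acts as 0
```

Counting the marks from 2 to 7 inclusive gives 6 – exactly the off-by-one error the Year 5 students made.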
Let’s hope that we can make these length understandings part of all our students’ spatial visualisation skills.