What if your students created their own book about geometry? What topics would they select? How would they illustrate it? Would they work on one book for the whole class or perhaps create a book in pairs? How many pages would it have?
Part of working mathematically is recording what you know and presenting it in a way that other people can read about your ideas.
Try to create at least one class book about mathematical concepts each term. You could lend your class book to other classes in the school and ask them for feedback too.
Just write the letters of the month in the balloons, then start writing in the numbers for the days of the month.
For more ideas like this see our Time activities, Time graphics or Time photographs.
Start with two identical large equilateral triangles and place one upside down on top of the other. You might like to glue these together. You now have a 6-pointed star, and the points are 6 smaller equilateral triangles.
Cut out 6 more triangles to fit these 6 smaller outer triangles and place them upside down on top of each one. Now you have created 18 tiny equilateral triangles.
Cut out 18 identical small triangles and place these upside down on each small triangle. Got the idea?
Stop whenever you feel the triangles are getting too fiddly to manage.
You have now created one of the most famous FRACTALS, the Von Koch Snowflake.
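If you want to check the cutting-out counts in advance, the pattern above can be sketched in a few lines of Python (the function name is ours): each step you cut out 3 times as many triangles as the step before, starting with 6.

```python
# Count the triangles cut out at each step of the snowflake
# construction described above. Step 1 uses 6 triangles (one for
# each star point), and every following step needs 3 times as many,
# because each new triangle sprouts 3 even smaller points.

def triangles_added(step):
    """Number of small triangles to cut out at a given step (step >= 1)."""
    return 6 * 3 ** (step - 1)

for step in range(1, 5):
    print(f"Step {step}: cut out {triangles_added(step)} triangles")
    # Step 1: 6, Step 2: 18, Step 3: 54, Step 4: 162
```

This matches the activity: 6 triangles for the star points, then 18 tiny ones, then 54, and so on, until they become too fiddly to manage.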
Some interesting mathematical facts about dolphins:
– they are a type of “toothed whale”
– they grow up to 4 m in length
– they can swim for up to 100 km each day
– they get fresh water from their food (squid, fish)
– they can close down one half of their brain to “sleep” for 8 hours a day
– they can travel up to 35 km per hour and breathe through their blowhole
– this breath can help them dive for up to 4 minutes
– they can also hold their breath for up to 20 minutes
– they live for up to 40 years
Imagine all the wonderful classroom discussions with your students about these maths facts. What else can your students discover? Encourage them to work in pairs to create a mathematical challenge based on these facts.
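One possible challenge, sketched here in Python purely as an illustration (the question and variable names are ours, built only from the facts listed above):

```python
# A sample challenge built from the dolphin facts above:
# "If a dolphin swims at its top speed of 35 km per hour,
#  how long would its 100 km daily swim take?"

top_speed_kmh = 35      # top speed from the facts above
daily_distance_km = 100  # daily swimming distance from the facts above

hours = daily_distance_km / top_speed_kmh
print(f"About {hours:.1f} hours of swimming at top speed")  # about 2.9 hours
```

Student pairs could pose similar questions from the other facts, such as how many 20-minute dives fit into 8 hours of "sleep".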
Remember it is vital to talk about what something ISN’T as much as about what it IS. I just saw this ABC online video clip about Quarters. Everything there was fine. Cute female child’s voice-over, clear graphics. All good. BUT … to overcome their biggest problems with fractions, your students need far more than that, far more than just one explanation. What is NOT a quarter? What is NOT an equal part? What happens if there are only 3 equal parts? What happens if there are now 5 equal parts? Talking about “positive” examples is only part of the solution. Talking about “negative” examples is also a vital component.
This video clip will be a fantastic starting point for a discussion with your class. Try to generate as many alternative questions as you can that need answering. Get your students to explain to a partner what is and is not a quarter. Get them to draw pictures of what they think represents quarters and non-quarters. Try to spot misunderstandings that can then be shared with the whole class.
For more targeted fraction activities click here.
For more targeted fraction photographs click here.
For more targeted fraction graphics click here.
For example, leopards:
This Stage 2 activity (Years 3/4), Match my Group (Division Number Sentences), helps students realise that each part of a number sentence has meaning. If you rearrange the position of the digits and symbols, your number sentence may not make any sense. You can’t just put them wherever you like.
It also helps students realise that they need time to think about the actions in a number sentence. If they see the number 42 and the number 6, for example, do they add, subtract, multiply or divide them? They can’t just rush in with the first thing that comes into their head.
Each of the 4 stories in this activity is a single-step story. Your students need to be able to link their story and number sentence before tackling more complex events. You need to make time for your students to talk to each other, explain their thinking strategies and work co-operatively on each activity. You can’t rush their understanding. You need that “aha” moment when they work out that actions matter and the placement of digits and symbols matters. Maths matters!
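A quick Python sketch of the 42-and-6 example above shows just how much the choice and placement of symbols changes a number sentence:

```python
# The same two numbers give four very different number sentences,
# depending on which operation you choose.

a, b = 42, 6
print(f"{a} + {b} = {a + b}")    # 48
print(f"{a} - {b} = {a - b}")    # 36
print(f"{a} x {b} = {a * b}")    # 252
print(f"{a} / {b} = {a // b}")   # 7

# Swap the positions and the division sentence changes completely:
print(f"{b} / {a} = {b / a}")    # 6 / 42 is no longer a whole number
```

The placement matters: 42 ÷ 6 tells a sharing story, while 6 ÷ 42 tells a very different one.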
However, here at Maths Matters we look at the whole-state results, in our case NSW. Fortunately NSW results are almost identical to the Australian results, so we are able to see how Australian students as a whole are tracking in their mathematical progress. But unfortunately … this is not a pretty picture. It appears that our students are not able to adequately demonstrate competency on key test questions. In other words, they are not able to demonstrate that they have achieved key content outcomes.
We can argue about the nature of the test and whether we agree or disagree about its continuation. That is another story. The fact is that our government collects yearly data and that yearly data tells us something.
We analyse every test question and categorise it by Stage and also as Core or Advanced. We call a test item a Core Stage question if we think it is suitable for most students to answer. For example, in Year 3 a Core Question is one that covers Early Stage 1 or Stage 1 concepts that we expect most Year 3 students to answer. We use a cut-off point of 80% for our definition of “most”. So one in five students might answer this Core Question incorrectly. These may be students who are anxious about the test, may be working at an even earlier Stage, may have language and reading difficulties and so on. But we believe the question is an effective one to ask this cohort. Other questions in the test paper may be at the same Stage but may have twists and turns or too many steps in their solution. We call these test items Advanced. We do not expect 80% or more students to answer these questions correctly. These questions help identify more advanced mathematical thinkers in your class, school, state or country.
The strange thing, though, is that the questions that make up a particular NAPLAN Numeracy test more often than not do NOT test Core Stage below content. Some even test our students on content that is two Stages above. We are not sure what this data is supposed to tell us. We are perplexed as to why these questions are included. If you study William Shakespeare’s plays at university, for example, you would not expect to be tested on the plays of Anton Chekhov just to see whether you happen to have that knowledge.
If you would like to see our one page summary of this year’s Year 3 results, click here.
If you would like to see our one page summary of this year’s Year 5 results, click here.
The summary is compact, with test question numbers on the left and the NSW test results for each question in matching spaces on the right. We also include an analysis of key concerns, based on these results. These concerns highlight the “big picture” view. Your individual students may have been successful in answering a question, but the overall results might indicate misconceptions you need to be aware of. We then develop resources that help your students overcome these blockages to their understanding.
Each year for the last 15 years I have been involved with a select group of colleagues in analysing the NSW Year 3 and Year 5 NAPLAN Numeracy results – initially in my role as Numeracy Project Officer for the Archdiocese of Sydney and lately in my role as Director, Maths Matters.
When we look at the NSW NAPLAN Numeracy results for Year 5, we double-check every question for the exact curriculum outcome it matches in the NSW Syllabus. Often we find our decisions differ from the official ones, which means we have a different proportion of Stage 1, 2 or 3 content outcomes in our summary. Each year we find that there are insufficient Stage below questions in both the Year 3 and Year 5 papers. We are puzzled by this. For example, in Year 3 we would expect more questions to relate to Stage 1 content, and in Year 5 we would expect more questions to relate to Stage 1 and 2 content. Surely governments want to check that most students have achieved the key Stage below content outcomes.
We next examine each Stage below question and determine whether it is one we expect 80% or more Year 5 students to answer correctly. We call these Core Stage questions. Another name could be “key content outcome”. Notice our 80% cut-off still allows for one in five students who might be unable to answer correctly; they may be working at an even earlier Stage. These Core Stage below questions focus on the key outcomes we expect students to have understood and to be able to demonstrate by answering correctly. If fewer students answer these Stage below questions correctly, this alerts us to blockages that hinder student understanding. When we analyse the percentage selecting an incorrect answer, we can see common error patterns in student thinking. These then assist teachers to focus on specific key content outcomes and clearly identify student blockages. The ideal result would be that all Core Stage below questions have 80% or more students answering correctly.
Some questions in this Year 5 paper might be Stage below, but they can include twists that we do not expect 80% or more students to answer correctly. We call these Advanced questions.
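The 80% screening rule described above can be sketched in a few lines of Python. The question numbers and percentages below are made up for illustration only; our real summaries use the full NSW results.

```python
# A minimal sketch of the 80% screening rule: any Core question
# falling below the cut-off is flagged as a possible blockage
# in student understanding.

CUT_OFF = 80  # we expect 80% or more students to answer a Core question correctly

# Hypothetical Core question results: question number -> percent correct
core_results = {5: 90, 15: 59, 22: 83}

blockages = {q: pct for q, pct in core_results.items() if pct < CUT_OFF}
print(blockages)  # {15: 59} - this question needs closer analysis
```

In practice we then study the incorrect responses to each flagged question to uncover the common error patterns behind the result.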
All this information is then put into a one-page grid. The grid may seem a little disorienting at first, simply because it is unfamiliar. We put the question number on the left and its matching result on the right. Questions are colour-coded for easy reference so you can see which Stage each question is and whether it is a Core or an Advanced question. A question number that is underlined means that question is an open one, not multiple choice. An asterisk (*) means the question is repeated somewhere in the Year 3 paper.
On the matching square on the right, we record the percentage of students who answered correctly.
For example, Question 5 on the left is coloured pink, which means it is a Core Stage 2 question. The matching space on the right shows that 90% of Year 5 students answered correctly. Question 24 on the left is coded purple and marked with an asterisk. This means it is an Advanced Stage 2 question that is repeated somewhere in the Year 3 paper. The matching space on the right shows that 48% of students answered Question 24 correctly. It is coloured yellow as this result is less than 50%.
We then include a second page summary. This shows each of the open questions (free response) and the results. We also show each of the repeat questions and the results for both Year 3 and Year 5.
We then show an overall summary which enables us to make big picture statements about how Year 5 students in NSW are tracking. Based on this summary we then provide detailed feedback on relevant questions. Your class or your school may be above, at or below these statistics. But what we look at is the biggest picture we can – the total number of Year 5 students in NSW who sat for that test. We show you the major trends in student misunderstanding. These statements help you identify where students need more support.
These are the major results from our analysis of the 2017 NAPLAN Numeracy test for Year 5:
Each year for the last 15 years I have been involved with a select group of colleagues in analysing the NSW Year 3 and Year 5 NAPLAN Numeracy results – initially in my role as Numeracy Project Officer for the Archdiocese of Sydney and lately in my role as Director, Maths Matters.
When we look at the NSW NAPLAN Numeracy results for Year 3, we double-check every question for the exact curriculum outcome it matches in the NSW Syllabus. Often we find our decisions differ from the official ones, which means we have a different proportion of Stage 1, 2 or 3 content outcomes in our summary. Each year we find that there are insufficient Stage below questions in both the Year 3 and Year 5 papers. We are puzzled by this. For example, in Year 3 we would expect more questions to relate to Stage 1 content, and in Year 5 we would expect more questions to relate to Stage 1 and 2 content. Surely governments want to check that most students have achieved the key Stage below content outcomes.
We next examine each Stage below question and determine whether it is one we expect 80% or more students to answer correctly. We call these Core Stage questions. Another name could be “key content outcome”. Notice our 80% cut-off still allows for one in five students who might be unable to answer correctly; they may be working at an even earlier Stage. These Core Stage below questions focus on the key outcomes we expect students to have understood and to be able to demonstrate by answering correctly. If fewer students answer these Stage below questions correctly, this alerts us to blockages that hinder student understanding. When we analyse the percentage selecting an incorrect answer, we can see common error patterns in student thinking. These then assist teachers to focus on specific key content outcomes and clearly identify student blockages. The ideal result would be that all Core Stage below questions have 80% or more students answering correctly.
Some questions in this Year 3 paper might be Stage below, but they can include twists that we do not expect 80% or more students to answer correctly. We call these Advanced questions.
All this information is then put into a one-page grid. The grid may seem a little disorienting at first, simply because it is unfamiliar. We put the question number on the left and its matching result on the right. Questions are colour-coded for easy reference so you can see which Stage each question is and whether it is a Core or an Advanced question. A question number that is underlined means that question is an open one, not multiple choice. An asterisk (*) means the question is repeated somewhere in the Year 5 paper.
On the matching square on the right, we record the percentage of students who answered correctly.
For example, Question 11 on the left is coded olive green and marked with an asterisk. This means it is an Advanced Stage 1 question that is repeated somewhere in the Year 5 paper. The matching space on the right shows that 67% of students answered Question 11 correctly. Question 15 on the left is coded lime green. This means it is a Core Stage 1 question, so we expect 80% or more students to answer correctly. If you look at the matching square on the right, you see that only 59% of students answered Question 15 correctly. This alerts us to pay attention to the incorrect responses to see what blockages may be revealed. As Question 15 is also marked with an asterisk, it is repeated somewhere in the Year 5 paper.
We then include a second page summary. This shows each of the open questions (free response) and the results. We also show each of the repeat questions and the results for both Year 3 and Year 5.
We then show an overall summary which enables us to make big picture statements about how Year 3 students in NSW are tracking. Based on this summary we then provide detailed feedback on relevant questions. Your class or your school may be above, at or below these statistics. But what we look at is the biggest picture we can – the total number of Year 3 students in NSW who sat for that test. We show you the major trends in student misunderstanding. These statements help you identify where students need more support.
These are the major results from our analysis of the 2017 NAPLAN Numeracy test for Year 3: