Teaching isn't just what I do, it's part of who I am. I'm the lady in the supermarket complimenting your kids on their use of a maths strategy or encouraging them to activate their prior knowledge. I'd say sorry, but I'm not! Learning is life. Learn or... Well... What's the alternative?
We are living – and teaching – in a time with what seems to be a never-ending supply of data about our students. Some of it is readily accessible and useful whilst some of it just seems like data for data’s sake. The key for me is whether the data is going to tell me anything about my kids, and about how I can better meet their (or, on occasion, my) needs; and whether the data is timely enough to be helpful. [I’ll be honest though: I AM a bit of a data nerd. I like finding the trends, identifying my outliers and working out how it all fits together.]
Having said that, working out ways to deal with the general day-to-day data we collect is super important. I was reminded of this recently when Brett commented on my post about One Minute Basic Number Facts Tests. In response to his query – and the three or four I fielded in person last year – I thought I’d share how I set up conditional formatting, and then share a couple of examples of how I use the information to inform my practice.
There are a few ways to set up conditional formatting, all with their pros and cons. Personally, I prefer entering my rules manually rather than using any of the data bars, icon sets etc. because I feel it gives me a greater degree of control in setting my levels. You’ll notice I said I feel it gives me more control; entering the rules manually is how I initially learnt, and I’m comfortable doing it. I’ve had a good poke around the other options in both Excel (Microsoft) and Numbers (Mac), and the reality is that those options have ways of entering the same levels too. Maybe next time I need to enter new rules I’ll try something different?
My process starts with identifying the benchmark(s). In the example below I’ve used the norms for the One Minute Basic Number Facts. As you can see, the table provides norms at six-monthly intervals; however, for the purpose of this discussion, I’m only using the norms for 10 years of age. (In a composite class I will make a professional judgement about how many to use, based on the cohort. Typically with this test, I will pick one age for each year level and assess against it for the year in order to measure growth. At the end of the year I will do a much finer and more accurate assessment against actual age in order to pass on the data.)
Creating the conditional formatting rules is, of course, a little different in each programme, but the general premise and language tend to be fairly consistent. (My images are screenshots from Google Sheets.) The goal is to create a custom formatting RULE for each level. You will be asked for the condition upon which to apply a particular format, and what format you would like. If we take the image below as an example: the benchmark for a 10 year old in the addition subtest is 21 correct answers. When asked when to apply the formatting I have chosen ‘is equal to’ and entered 21 as the value. I’ve then chosen the formatting style (which in this case is simply a colour fill). This process is the same for each of the other levels, with different conditions and formatting styles.
In a test like the One Minute Basic Number Facts with multiple subtests, it is simple enough to copy the formatting across tests and simply change the values within the rules.
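For anyone who thinks more easily in code than in spreadsheet dialogs, the same idea can be sketched as a tiny Python script. The 21 is the real addition benchmark mentioned above; the band boundaries, colour names and the non-addition benchmark values are my own hypothetical illustrations, not the published norms.

```python
# Sketch of the conditional-formatting logic: map a raw score to a
# colour band relative to a benchmark. Band widths and the subtraction/
# multiplication benchmarks below are hypothetical, for illustration only.

SUBTEST_BENCHMARKS = {
    "addition": 21,        # the 10-year-old addition norm from the table
    "subtraction": 18,     # hypothetical value
    "multiplication": 16,  # hypothetical value
}

def colour_band(score, benchmark):
    """Return a colour label for a score, relative to the benchmark."""
    if score >= benchmark + 5:
        return "blue"      # well above benchmark
    if score >= benchmark:
        return "green"     # at or above benchmark
    if score >= benchmark - 5:
        return "yellow"    # approaching benchmark
    return "red"           # well below benchmark

# "Copying the formatting across subtests" is just reusing the same
# rule structure with a different benchmark value:
print(colour_band(21, SUBTEST_BENCHMARKS["addition"]))     # green
print(colour_band(12, SUBTEST_BENCHMARKS["subtraction"]))  # red
```

One function, several benchmarks: that’s exactly why copying the rules across subtests and only changing the values is so quick in the spreadsheet.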
The sample I’m working with for this post, to be consistent, is based on some de-identified data using the same One Minute Basic Number Facts norms as in the post on which Brett commented. You can access the Google Sheet with both the norms and the sample conditional formatting here.
From this point, I’ve used this information to inform both whole class and small group instruction. I’ve buddied students who have similar learning profiles based on data and challenged them with personalised learning tasks. I’ve advocated for a particular student with incontrovertible evidence that he was years behind his peer group in basic literacy skills, something that had not been picked up previously because the cohort’s data had never been collated in one place. I’ve shown parents evidence of their child’s growth over a period of time, and similarly been able to show parents their child’s ‘place’ in the class (with confidentiality of the other students maintained).
The challenge with any testing is, of course, the message that it sends to children and the potential reinforcement of growth or fixed mindsets. I’m very conscious of this and use discussions about any data I collect to help children develop growth mindsets (e.g. “look at this growth you’ve made because you’ve been using this strategy” or “tell me more about what you did to improve in this”). Students love seeing their growth over time, and the colours make it super clear to them that they’ve made progress. One little guy I taught was determined to ‘go blue’ across the board, and on more than one occasion I had to explain to classroom visitors that he wasn’t planning an extreme exit strategy!
Late last year I applied to host EduTweetOz, a “rotating curation education Twitter account with Australian educators sharing ideas, experiences & questions.” I’ve followed the account for a number of years and learnt a lot from both the hosts and from others engaged in the discussions they prompted; hosting the account seemed like a natural progression in my learning. End-of-the-year busy-ness started and I quickly put my application out of mind, so it was almost a shock to see a request asking me to be the first host for the 2018 school year.
[The EduTweetOz account changes hosts weekly and so does the account’s name. Any EduTweetOz tweets I share here were sent BY me, however they will show the current host’s name.]
I was going to write a post that shared ALL of my tweets from the week but quickly realised that the volume of tweets would make that somewhat challenging, and probably quite boring to read. Instead I’m going to share my reflections and highlights.
Each of the core topics in my Master of Education course requires us to complete an educational intervention case study of some kind. The case studies have involved us identifying an educational point of difficulty for a learner, proposing a particular intervention based on our ongoing learning in the field of cognitive psychology, engaging with the intervention, taking pre- and post-intervention data, analysing the outcomes and linking it all to existing research.
With the exception of the first topic, we have been free to choose the learner (recognising the reality that most of us are juggling our post-grad studies with our teaching workload). I’ve worked with a gifted high school student (my son), a post-grad tertiary student (my study buddy) and most recently a primary student (the daughter of a friend). I, clearly, like variety!
My most recent case study is the most closely linked to my daily teaching practice in terms of the age of the learner and the learning area.
Primary Case Study
My participant, E, was an 11 year old female student enrolled in a year 6 class in a small government R-7 school in the southern suburbs of Adelaide, South Australia. Early observations uncovered that E had low self-efficacy in the area of Maths, which she attributed to both her own lack of ability and her teachers’ repeated failure (Weiner, 1972). E’s teacher described her as suffering from low self-confidence, while her mother (a pre-service teacher) raised concerns about E’s basic number sense. Further observations exposed significant misconceptions around place value as the source of the ongoing poor performance and achievement, which in turn led to low self-efficacy.
I shared my observations and initial thoughts with another teacher (my study buddy) and then proposed an intervention that would test whether explicit instruction in the development of conceptually correct place value understanding would lead to improved mastery experiences (Bandura, 1977) in Maths learning tasks and subsequently improved self-efficacy in Maths. Based on discussions with my professor and study buddy I made minor changes to both the proposed assessments and intervention progression.
E’s overall Maths performance was assessed (through the PAT-M 6 test) as being well below year level, scoring 13/36. E’s conceptual understanding of place value was assessed, using the Victoria Department of Education and Training’s 2017 Common Misunderstandings – Level 2.4 Renaming and Counting Tool, as being quite limited. She struggled to read numbers of more than two digits and could not represent a number over one hundred using tens and ones concrete materials.
These assessments were supported by the recording of ongoing observations, think-alouds and interviews.
I tested my early observations that E experienced low self-efficacy, specifically in the area of Maths, using both the Children’s Perceived Academic Self-Efficacy: An Inventory Scale (MJSES) (Jinks & Morgan, 1999) and the Mathematics Self-Efficacy and Anxiety Questionnaire (MSEAQ) (Kay, 2009). The first inventory confirmed that E had low general self-efficacy and the latter confirmed it with regard, specifically, to Maths.
Both of these tests/inventories are very easy to administer and analyse, and will, I am sure, be useful in my ongoing teaching practice.
I always dread teaching chance and probability, and then invariably love actually doing it. 2017 was no different.
My concerns this year were based, partly, on the broad range of development: within the main cohort of my class the range in Maths was typically from year 2 to year 9. Based on the school wide PAT-M testing that happened shortly before we tackled this unit, I knew that I’d been granted a reprieve: the range was only year 3 to year 8. (Doesn’t sound like a big difference but trust me, it felt like it!) (I can’t tell you how often I’ve been grateful for the PAT training I did a couple of years ago. The PAT (or Progressive Achievement Tests) aren’t the be all and end all of assessment but they sure do provide broad snapshots of children’s achievement that can be easily compared over time.)
I presented the whole class with four questions and asked them to work with a partner to formulate a response. With the exception of a couple of children who become anxious when required to make social choices, I allowed the children to choose their own partners. I roamed around listening to the children, using three single point rubrics (you can read about those in my post here) to record if they were able to identify the element of chance, use informal language of probability and use mathematical language of probability.
Based on the PAT-M data and their responses to those problems, I knew that (with the exception of one student) the whole class needed to review the basic concepts of chance and probability before we went any further. What does probability mean? Try answering it without using the words probable, likely or chance. It’s hard! We started by brainstorming chance and probability language that we hear all the time. (Yes, I know my writing is super messy. It was done on the fly as children called out words and wasn’t up there long.)
The children worked in heterogeneous groups to order these words. I was quite surprised by the different ways the groups went about this. One group spread out all the words and tried to sort them into four equally likely/certain groups before realising that probably and maybe and certainly are all on the ‘positive’ side while unlikely, impossible and low chance were all on the ‘negative’ side, which meant that four groups weren’t enough. Another group was adamant that there were no gradations of probability: if there is even a small chance of something not happening it is uncertain, otherwise it is certain. Most groups identified similarities between some terms and developed clarity about the gradations of probability. Each group presented their learning product to the class and fielded questions. The process of needing to justify and reconsider choices based on their peers’ feedback resulted in a lot of changes being made. I hung the ‘spectrums’ on our learning wall and invited the children to use sticky notes to ask (and answer) further questions.
One of the things I missed this past year, while teaching year 4/5s, was using mentor sentences to really ramp up writing quality and to teach grammar. I realised, very early in the year, that the process I had previously used (about which you can read here and in the follow up here) wouldn’t work for this cohort of children, nor would it fit with the school’s literacy schedule. I was disappointed but that’s teaching right?
By the end of term 3, the children had amazed me with their writing progress. Their willingness to try new things, take risks, seek – and offer – feedback, revise and to put in the time that leads to quality writing was inspiring. The missing element though, was syntactical flexibility. The ideas were all there, but the ability to create new or different sentence structures wasn’t growing as fast as I hoped. Reflecting on conversations I’d had with various children during our literacy group rotations I realised that some of them weren’t able to manipulate the basic building blocks of sentences (words) because they didn’t have a strong grasp of what the blocks represented. They didn’t understand that swim (the verb) and swim (the noun) mean different things and play different roles in a sentence. I realised also that some of my high flyers, who were starting to take risks with sentence structure, usually fell down when wanting to use a synonym of the same word category in order to achieve cohesion.
I decided to use some of the ideas of my previous work with mentor sentences to build the children’s understanding of syntax and syntactic flexibility.
Back to mentor sentences. I decided that during term 4 we would start each week by focussing on a mentor sentence as a class and then during literacy groups the children would work further with the sentences toward different goals, with different scaffolds. I introduced this on the first day back by having the sentence written on anchor chart paper and posted on our board before the children arrived for the day. They noticed it immediately and began talking amongst themselves about what it meant, why it was there and what I might be planning. I didn’t keep them in suspense long; we got started straight away.