This has been a whirlwind of a quarter, packed with math community days, PARCC PBA prepping, and crazy weather. As those of you at the math extravaganza are aware, this is also the time when I analyze Q3 data to determine Q4 priorities and support. We had a great conversation about the data at the extravaganza, but I wanted to make sure that everyone has a chance to see how we’re doing heading into the last quarter of the year. If you’re interested in understanding where the data below comes from, I recommend taking a look at our Q2 step-back. Otherwise, if you’re good with your lingo (ERC, CoA, PK, and so on), then let me share where our math cohort stands.
Data Point #1: Progress Known
- What is Progress Known (PK)?
PK is basically a “Yes” or “No” answer to the question: “Do we have reliable and complete data on where students stand in this classroom?”
- How do you collect reliable and complete data?
This data comes from our teachers sharing it with their TLD Coach and/or Content Specialist. As long as you have student progress data on ALL of your Metrics – that means data on both content mastery and mathematical thinking (performance tasks) – and we have seen that your assessments are reliable, then your students are PK!
- What does the PK data tell us going into Q4?
Our PK has increased by 5% in mathematics, but it remains frighteningly low (especially going into Q4). Our biggest growth has been in middle school and upper elementary, which leads me to believe that the PARCC PBA now underway encouraged more teachers to prepare students with performance tasks. As we can see, high school math classrooms are where PK is lowest, which could indicate that, without the explicit connection to a modern high-stakes test (outside of Algebra I), there is not as much immediate incentive for performance tasks. My biggest fear is an overreliance on ACT prep that isn’t balanced with the sort of complex tasks students will actually need in college. That balance between the realities of standardized tests and the importance of rich tasks that allow sharing and feedback on mathematical thinking is at the center of many of our Q4 course offerings (more below).
Overall, it does seem to be the case that some classrooms are exploring rich math tasks with students, but we might not be fully sharing and analyzing the data with TLD Coaches.
As we can see to the right, about 40% of our classes are engaging in analysis/application/explaining but are not progress known. As I have said before, if you can give feedback on mathematical thinking and strategy, then a question can be assessed as a performance task. This means that any class that is analyzing, applying, or explaining math is engaging with performance tasks! So those 40% of progress-unknown classes are either not tracking performance task data to give students growth-oriented feedback OR that data has not been shared with TLD Coaches and myself. That is a big gap that is holding our cohort back from PK. Something else I found interesting is the following graph…
Here you can see that classrooms are more than three times as likely to be PK and interested/hardworking than not PK and interested/hardworking, while the majority of our non-PK classrooms (which, again, mostly means we do not have performance task data for mathematical thinking feedback) are stuck at merely compliant and on-task. One theory this correlation suggests is that students are less likely to develop positive math identities in classrooms where they are not getting growth-oriented performance task feedback: if that baseline of content mastery and mathematical thinking growth is not established, students are less likely to show interest in the content. After all, as Julia Aguirre et al. say in The Impact of Identity in K-8 Mathematics, “Feedback (in math) tends to accentuate what students do not know and cannot do, thus leading them to believe that they are ‘not smart,’ lack ability, or cannot learn.” While much of the Q3 content focused on how to conduct lessons conducive to performance task data and how to give empowering feedback on it, Q4 will double down on the instructional side – I want to make sure that teachers themselves feel empowered to take on these pedagogical habits, techniques, and mindsets in their classrooms.
Data Point #2: Culture of Achievement
- What is Culture of Achievement (CoA)?
CoA is the quality of the classroom culture that your students enjoy as they are learning. Some people think immediately about “management” but CoA goes well beyond that: it’s the way in which your students actively maintain and foster a positive environment because of the way they care about their learning.
- How do you collect data around CoA?
CoA is determined by the TLD Coach in collaboration with your thinking after an observation, using the Culture of Achievement Pathways rubric to inform our terminology. This then gets collected in our Program Tracker so we can analyze the data at different levels.
- What does the CoA data tell us going into Q4?
Our math teachers have exploded onto the scene with a huge increase in interested/hard-working classrooms, while destructive classrooms are a thing of the past and apathetic/unruly classrooms are greatly decreasing. To drive home the changes from the end of Q2 to the end of Q3, here’s a look at just how much our CoA changed:
Our middle school teachers (who, again, also had the greatest jump in PK) had the biggest increase in interested/hard-working classrooms, while all lower bands for CoA decreased. However, all of our grade bands are seeing upward shifts in CoA. I can’t wait to hear how teachers take full advantage of this growing positive culture to make gains with their students this last quarter of school.
Data Point #3: Engagement with Rigorous Content (ERC)
- What is Engagement with Rigorous Content (ERC)?
ERC is the level of rigor at which students are engaging with the content. Some people think immediately about “difficulty” of the questions being asked by the teacher, but this goes well beyond that: it’s the depth and sophistication with which students are thinking about and working within the content.
- How do you collect data around ERC?
ERC is determined by the TLD Coach in collaboration with your thinking after an observation, using the Engagement with Rigorous Content rubric to inform our terminology. This then gets collected in our Program Tracker so we can analyze the data at different levels.
- What does the ERC data tell us going into Q4?
Like CoA, ERC is showing a lot of growth, with analysis/application/explaining increasing 18% while all lower bands showed a decrease. I’m hoping for an even greater increase in Q4 as we offer lots of opportunities to plan, instruct, and get feedback on lessons that encourage greater student ownership of mathematical problem-solving.
It’s worth noting that every grade band saw double-digit growth in analysis/application/explaining, while the number of passive or confused classrooms dropped to single digits. I can’t wait to see the incredible work our teachers do with this strength going into Q4.
Forward into Q4
Based on all of the data above, as well as data from the mid-year survey (discussed in more depth at the math extravaganza), observations, and conversations with teachers and coaches, the following are our priorities for Q4 in math.
Head over to the PD Page of our Professional Development website to see what sessions will be driving towards these priorities, and sign up for them!
So what are your thoughts? What resonates with you about this data, these priorities, and these upcoming experiences? What else do you see in the data and in your own classroom? Fire off below!