It came from San Francisco: California's new math standard could spread despite being junk

Stanford mathematician Brian Conrad has written a piece for the Atlantic reminding readers that California’s recently adopted math curriculum is a bad idea and one that could soon spread to other states. The California curriculum was supposedly based on scientific insight into what is best for kids. In reality, it was based on the current concern for greater equity and partly on a failed effort in San Francisco.

I am a professional mathematician, a graduate of the public schools of a middle-class community in New York, and the son of a high-school math teacher. I have been the director of undergraduate studies in math at Stanford University for a decade. When California released a revised draft of the math framework last year, I decided someone should read the whole thing, so I dove in. Sometimes, as I pored over the CMF, I could scarcely believe what I was reading. The document cited research that hadn’t been peer-reviewed; justified sweeping generalizations by referencing small, tightly focused studies or even unrelated research; and described some papers as reaching nearly the opposite conclusions from what they actually say.

The document tried hard to convince readers that it was based on a serious reading of neuroscience research. The first chapter, for example, cited two articles to claim that “the highest achieving people have more interconnected brains,” implying that this has something to do with learning math. But neither paper says anything about math education.

Professor Conrad previously published his findings at length; I wrote about them last May. Here’s a sample:

The CMF contains false or misleading descriptions of many citations from the literature in neuroscience, acceleration, detracking, assessments, and more. (I consulted with three experts in neuroscience about the papers in that field which seemed to be used in the CMF in a concerning way.) Sometimes the papers arrive at conclusions opposite those claimed in the CMF…

A sample misleading quote is “Park and Brannon (2013) found that when students worked with numbers and also saw the numbers as visual objects, brain communication was enhanced and student achievement increased.” This single sentence contains multiple wrong statements: (1) they worked with adults and not students; (2) their experiments involved no brain imaging, and so could not demonstrate brain communication; (3) the paper does not claim that participants saw numbers as visual objects: their focus was on training the approximate number system…

The CMF selectively cites research to make points it wants to make. For example, Siegler and Ramani (2008) is cited to claim that “after four 15-minute sessions of playing a game with a number line, differences in knowledge between students from low-income backgrounds and those from middle-income backgrounds were eliminated”. In fact, the study was specifically for pre-schoolers playing a numerical board game similar to Chutes and Ladders and focused on their numerical knowledge, and at least five subsequent studies by the same authors with more rigorous methods showed smaller positive effects of playing the game that did not eliminate the differences.

In some places, the citation used in the framework actually said the opposite of what the author claimed it did.

In yet another case, the CMF cites Burris et al (2006) for demonstrating “positive outcomes for achievement and longer-term academic success from keeping students in heterogenous groups focused on higher-level content through middle school”. But the CMF never tells the reader that this paper studied the effect of teaching Algebra I for all 8th grade students (getting good outcomes) — precisely the uniform acceleration policy that the CMF argues against in the prior point…

The CMF claims that Sadler and Sonnert (2018) provides evidence in favor of delaying calculus to college, but the paper finds that taking calculus in high-school improved performance in college…

The CMF makes the dramatic claim that (Black et al, 2002) (really 2004) showed that students “are incredibly accurate at assessing their own understanding, and they do not over or under estimate it.” If this claim were true, no exams would be needed to assess student knowledge: we could just ask them. Unfortunately, the paper of Black et al contains nothing of the sort.

The Chronicle of Higher Education also found serious citation problems in the framework, which was written by another Stanford professor, Jo Boaler.

Another questionable claim relates to one of Boaler’s most despised class activities. “For about one third of students the onset of timed testing is the beginning of math anxiety,” she asserts in Mathematical Mindsets and a Youcubed white paper, adding that the tests are “a major cause of this debilitating, often life-long condition.”

But the basis for that causal link is a mystery, Greg Ashman, a math teacher and school administrator in Victoria, Australia, has found. Mathematical Mindsets cites the white paper, which in turn cites a 2014 opinion piece in a journal for math educators, but nowhere within it is a reference to sufficiently back up the claim. Boaler did not identify any such citation when asked, but instead provided me with a list of studies that were not in the article. She also said that “one third” was an estimate that “comes from 30 years of looking at different studies and examples.”

In 2019, when Boaler tweeted that “timed tests are the cause of math anxiety,” Ashman responded, “I disagree — you have not demonstrated a causal link,” and pointed to the blog post he’d written. She did not respond and, later, blocked him.

One of the big ideas adopted as part of the California math framework was something called math de-tracking. This means not separating kids into groups, some of whom take algebra in 8th grade while others do not. This was picked up by state reformers after San Francisco adopted math de-tracking on equity grounds. Here’s how the Washington Post described it in 2021:

Advocates for the new California math guidelines say “de-tracking,” or mixing together students of varying academic performance, can help all students, particularly those who would have languished in lower-level classes. It can also unravel racial segregation inside schools. Almost everywhere, White and Asian American students are more likely to be placed in higher tracks, with Black and Latino students more likely to be placed in lower tracks…

Laying the groundwork for California’s proposed framework was San Francisco. In 2014, the district overhauled its curriculum to group all students with a mix of abilities through 10th grade. Almost everyone is taught Algebra 1 in ninth grade and geometry in 10th. Students who want to take AP Calculus their senior year may accelerate, for instance by combining Algebra 2 and precalculus in 11th grade.

The district was looking to unravel the pervasive classroom racial segregation that came with tracking and rethinking the best way to teach, said Lizzy Hull Barnes, mathematics and computer science supervisor for the district. The result, she said, is that “more students enrolled in advanced math courses, and it’s a more diverse group of students.”

It all sounds great except that San Francisco’s experiment with de-tracking didn’t work. The claims made by advocates didn’t hold up, and the achievement gap was not closed. Here’s how Professor Conrad describes it:

Although school officials in San Francisco had largely ignored parents who questioned the district’s policy against offering Algebra I in middle school, critics refused to give up, and for good reason. A recent working paper from three Stanford researchers indicates that the San Francisco Unified School District’s decade-long experiment was a bust. The percentage of Black and Latino students taking advanced math courses did not increase. Some students who would otherwise have studied calculus as high-school seniors were unable to do so. The kids who succeeded in reaching calculus typically did so through extracurricular measures, such as summer classes. Later CMF drafts quietly removed the mention of the SFUSD policy while still generally endorsing the ideas behind it.

You have to appreciate that last sentence above. It was widely reported that California adopted de-tracking because of its supposed success in San Francisco. But the later drafts of the California math framework simply removed mentions of it because it was increasingly clear it hadn’t actually worked. The state kept pursuing the policy anyway. Failure won’t stand in its way.

Why? Why would California adopt an idea that didn’t work (the new framework was adopted in July)? The answer to that won’t surprise you. Here’s a sample of the plan as it appeared in 2021:

Equity cannot be an afterthought to more traditional mathematics content-centered offerings that do nothing to address the fact that “Black, Latinx, Indigenous, women, and poor students, have experienced long histories of underrepresentation in mathematics and mathematics-related domains” (Martin, 2019; see also Martin, Anderson, & Shah, 2017). Inequities caused by systemic issues means that a “culture of exclusion” persists even in equity-oriented teaching (Louie, 2017). Many of the stories that we use to define mathematics, and to talk about who does or is good at mathematics, are highly racialized and English language-centric, and are experienced that way by students (Lue & Turner, 2020). This means students’ mathematics identities are shaped in part by a culture of societal and institutionalized racism. Professional learning in mathematics can respond to these realities and aim for more than incremental change (which does little to change the framing narratives that drive inequities).

It really doesn’t matter that this plan relies on loads of faulty and misleading citations, or that it is partly based on a similar reform in San Francisco that failed to do what advocates claimed it would. So long as the plan embraces equity and social justice, it’s unstoppable in California. And the ideas in this awful plan could now spread to other states.

In Ohio, for example, a menu of alternative math “pathways” in high school has been touted as providing entry into a variety of appealing and lucrative careers. But the pathways labeled for data science and computer science remove many Algebra II skills; the fine print reveals that the pathways are inadequate for students who might want college degrees in those fields. School officials in Middletown, Connecticut, have proposed to revamp the traditional calculus track by scaling back on preparations for eighth-grade Algebra I and introducing mash-up algebra-and-geometry courses that would magically pack three years of instruction into two.

There are a lot of progressives out there who don’t seem to care what this does to students who could benefit from taking advanced math classes. What they care about is the false promise that the new framework will be more equitable and will help close the achievement gap. It won’t happen, but it could still be a long time before the advocates admit they were wrong.
