K12 Archives - The Hechinger Report
http://hechingerreport.org/tags/k12/
Covering Innovation & Inequality in Education

PROOF POINTS: Some of the $190 billion in pandemic money for schools actually paid off
https://hechingerreport.org/proof-points-190-billion-question-partially-answered/
Mon, 01 Jul 2024

Reports about schools squandering their $190 billion in federal pandemic recovery money have been troubling. Many districts spent that money on things that had nothing to do with academics, particularly building renovations. Less common, but more eye-popping, were stories about new football fields, swimming pool passes, hotel rooms at Caesars Palace in Las Vegas and even the purchase of an ice cream truck. 

So I was surprised that two independent academic analyses released in June 2024 found that some of the money actually trickled down to students and helped them catch up academically.  Though the two studies used different methods, they arrived at strikingly similar numbers for the average growth in math and reading scores during the 2022-23 school year that could be attributed to each dollar of federal aid. 

One of the research teams, which includes Harvard University economist Tom Kane and Stanford University sociologist Sean Reardon, likened the gains to six days of learning in math and three days of learning in reading for every $1,000 in federal pandemic aid per student. Though that gain might seem small, high-poverty districts received an average of $7,700 per student, and those extra “days” of learning for low-income students added up. Still, these neediest children were projected to remain one third of a grade level behind where low-income students stood in 2019, before the pandemic disrupted education.
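
To put those per-$1,000 figures in context, here is a rough back-of-the-envelope sketch in Python. The per-$1,000 gains and the $7,700 average are the numbers cited above; the assumption that gains scale linearly with aid is mine, not the researchers'.

```python
# Back-of-the-envelope sketch of the gains implied by the figures above.
# Assumes the per-$1,000 estimates apply linearly; the studies' actual models are more involved.

aid_per_student = 7_700        # average federal aid per student in high-poverty districts, dollars
math_days_per_1000 = 6         # days of math learning gained per $1,000 of aid
reading_days_per_1000 = 3      # days of reading learning gained per $1,000 of aid

math_days = aid_per_student / 1_000 * math_days_per_1000
reading_days = aid_per_student / 1_000 * reading_days_per_1000

print(f"Implied gains: roughly {math_days:.0f} days of math "
      f"and {reading_days:.0f} days of reading learning")
# Implied gains: roughly 46 days of math and 23 days of reading learning
```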

“Federal funding helped and it helped kids most in need,” wrote Robin Lake, director of the Center on Reinventing Public Education, on X in response to the two studies. Lake was not involved in either report, but has been closely tracking pandemic recovery. “And the spending was worth the gains,” Lake added. “But it will not be enough to do all that is needed.” 

The academic gains per aid dollar were close to what previous researchers had found for increases in school spending. In other words, federal pandemic aid for schools has been just as effective (or ineffective) as other infusions of money for schools. The Harvard-Stanford analysis calculated that the seemingly small academic gains per $1,000 could boost a student’s lifetime earnings by $1,238 – not a dramatic payoff, but not a public policy bust either. And that payoff doesn’t include other societal benefits from higher academic achievement, such as lower rates of arrests and teen motherhood. 

The most interesting nuggets from the two reports, however, were how the academic gains varied wildly across the nation. That’s not only because some schools used the money more effectively than others but also because some schools got much more aid per student.

The poorest districts in the nation, where 80 percent or more of the students live in families whose income is low enough to qualify for the federally funded school lunch program, demonstrated meaningful recovery because they received the most aid. About 6 percent of the 26 million public schoolchildren that the researchers studied are educated in districts this poor. These children had recovered almost half of their pandemic learning losses by the spring of 2023. The very poorest districts, representing 1 percent of the children, were potentially on track for an almost complete recovery in 2024 because they tended to receive the most aid per student. However, these students were far below grade level before the pandemic, so their recovery brings them back to a very low rung.

Some high-poverty school districts received much more aid per student than others. At the top end of the range, students in Detroit received about $26,000 each – $1.3 billion spread among fewer than 49,000 students. One in 10 high-poverty districts received more than $10,700 for each student. An equal number of high-poverty districts received less than $3,700 per student. These surprising differences for places with similar poverty levels occurred because pandemic aid was allocated according to the same byzantine rules that govern federal Title I funding to low-income schools. Those formulas give large minimum grants to small states, and more money to states that spend more per student. 
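
As a quick sanity check of the Detroit figure, using the rounded numbers in the paragraph above:

```python
# Quick check of the Detroit per-student figure, using the rounded numbers cited above.
total_aid = 1_300_000_000   # about $1.3 billion in federal pandemic aid
students = 49_000           # just under 49,000 students

print(f"About ${total_aid / students:,.0f} per student")
# About $26,531 per student
```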

On the other end of the income spectrum are wealthier districts, where 30 percent or fewer students qualify for the lunch program, representing about a quarter of U.S. children. The Harvard-Stanford researchers expect these students to make an almost complete recovery. That’s not because of federal recovery funds; these districts received less than $1,000 per student, on average. Researchers explained that these students are on track to approach 2019 achievement levels because they didn’t suffer as much learning loss. Wealthier families also had the means to hire tutors or the time to help their children at home.

Middle-income districts, where between 30 percent and 80 percent of students are eligible for the lunch program, were caught in between. Roughly seven out of 10 children in this study fall into this category. Their learning losses were sometimes large, but their pandemic aid wasn’t. They tended to receive between $1,000 and $5,000 per student. Many of these students are still struggling to catch up.

In the second study, researchers Dan Goldhaber of the American Institutes for Research and Grace Falken of the University of Washington estimated that schools around the country, on average, would need an additional $13,000 per student for full recovery in reading and math.  That’s more than Congress appropriated.

There were signs that schools targeted interventions to their neediest students. In school districts that separately reported performance for low-income students, these students tended to post greater recovery per dollar of aid than wealthier students, the Goldhaber-Falken analysis shows.

Impact differed more by race, location and school spending. Districts with larger shares of white students tended to make greater achievement gains per dollar of federal aid than districts with larger shares of Black or Hispanic students. Small towns tended to produce more academic gains per dollar of aid than large cities. And school districts that spend less on education per pupil tended to see more academic gains per dollar of aid than high spenders. The latter makes sense: an extra dollar to a small budget makes a bigger difference than an extra dollar to a large budget. 

The most frustrating part of both reports is that we have no idea what schools did to help students catch up. Researchers weren’t able to connect the academic gains to tutoring, summer school or any of the other interventions that schools have been trying. Schools still have until September to decide how to spend their remaining pandemic recovery funds, and, unfortunately, these analyses provide zero guidance.

And maybe some of the non-academic things that schools spent money on weren’t so frivolous after all. A draft paper circulated by the National Bureau of Economic Research in January 2024 calculated that school spending on basic infrastructure, such as air conditioning and heating systems, raised test scores. Spending on athletic facilities did not. 

Meanwhile, the final score on pandemic recovery for students is still to come. I’ll be looking out for it.

This story about federal funding for education was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

PROOF POINTS: Teens are looking to AI for information and answers, two surveys show
https://hechingerreport.org/proof-points-teens-ai-surveys/
Mon, 17 Jun 2024

Two new surveys, both released this month, show how high school and college-age students are embracing artificial intelligence. There are some inconsistencies and many unanswered questions, but what stands out is how much teens are turning to AI for information and to ask questions, not just to do their homework for them. And they’re using it for personal reasons as well as for school. Another big takeaway is that there are different patterns by race and ethnicity, with Black, Hispanic and Asian American students often adopting AI faster than white students.

The first report, released on June 3, was conducted by three nonprofit organizations, Hopelab, Common Sense Media, and the Center for Digital Thriving at the Harvard Graduate School of Education. These organizations surveyed 1,274 teens and young adults aged 14-22 across the U.S. from October to November 2023. At that time, only half the teens and young adults said they had ever used AI, with just 4 percent using it daily or almost every day. 

Emily Weinstein, executive director for the Center for Digital Thriving, a research center that investigates how youth are interacting with technology, said that more teens are “certainly” using AI now that these tools are embedded in more apps and websites, such as Google Search. Last October and November, when this survey was conducted, teens typically had to take the initiative to navigate to an AI site and create an account. An exception was Snapchat, a social media app that had already added an AI chatbot for its users. 

More than half of the early adopters said they had used AI for getting information and for brainstorming, the first and second most popular uses. This survey didn’t ask teens if they were using AI for cheating, such as prompting ChatGPT to write their papers for them. However, among the half of respondents who were already using AI, fewer than half – 46 percent – said they were using it for help with school work. The fourth most common use was for generating pictures.

The survey also asked teens a couple of open-response questions. Some teens told researchers that they are asking AI private questions that they were too embarrassed to ask their parents or their friends. “Teens are telling us I have questions that are easier to ask robots than people,”  said Weinstein.

Weinstein wants to know more about the quality and the accuracy of the answers that AI is giving teens, especially those with mental health struggles, and how privacy is being protected when students share personal information with chatbots.

The second report, released on June 11, was conducted by Impact Research and  commissioned by the Walton Family Foundation. In May 2024, Impact Research surveyed 1,003 teachers, 1,001 students aged 12-18, 1,003 college students, and 1,000 parents about their use and views of AI.

This survey, which took place six months after the Hopelab-Common Sense survey, demonstrated how quickly usage is growing. It found that 49 percent of students, aged 12-18, said they used ChatGPT at least once a week for school, up 26 percentage points since 2023. Forty-nine percent of college undergraduates also said they were using ChatGPT every week for school but there was no comparison data from 2023.

Among 12- to 18-year-olds and college students who had used AI chatbots for school, 56 percent said they had used them for help in writing essays and other writing assignments. Undergraduate students were more than twice as likely as 12- to 18-year-olds to say using AI felt like cheating, 22 percent versus 8 percent. Earlier 2023 surveys of student cheating by scholars at Stanford University did not detect an increase in cheating with ChatGPT and other generative AI tools. But as students use AI more, students’ understanding of what constitutes cheating may also be evolving. 

More than 60 percent of college students who used AI said they were using it to study for tests and quizzes. Half of the college students who used AI said they were using it to deepen their subject knowledge, perhaps treating it as an online encyclopedia. There was no indication from this survey whether students were checking the accuracy of the information.

Both surveys noticed differences by race and ethnicity. The first Hopelab-Common Sense survey found that 7 percent of Black students, aged 14-22, were using AI every day, compared with 5 percent of Hispanic students and 3 percent of white students. In the open-ended questions, one Black teen girl wrote that, with AI, “we can change who we are and become someone else that we want to become.” 

The Walton Foundation survey found that Hispanic and Asian American students were sometimes more likely to use AI than white and Black students, especially for personal purposes. 

These are all early snapshots that are likely to keep shifting. OpenAI is expected to become part of the Apple universe in the fall, including its iPhones, computers and iPads.  “These numbers are going to go up and they’re going to go up really fast,” said Weinstein. “Imagine that we could go back 15 years in time when social media use was just starting with teens. This feels like an opportunity for adults to pay attention.”

This story about ChatGPT in education was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

PROOF POINTS: AI essay grading is already as ‘good as an overburdened’ teacher, but researchers say it needs more work
https://hechingerreport.org/proof-points-ai-essay-grading/
Mon, 20 May 2024

Grading papers is hard work. “I hate it,” a teacher friend confessed to me. And that’s a major reason why middle and high school teachers don’t assign more writing to their students. Even an efficient high school English teacher who can read and evaluate an essay in 20 minutes would spend 3,000 minutes, or 50 hours, grading if she’s teaching six classes of 25 students each. There aren’t enough hours in the day. 
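
The grading-load arithmetic behind that estimate is simple; here is a minimal sketch using the class sizes and per-essay time cited above:

```python
# Grading-load arithmetic from the figures cited above.
classes = 6
students_per_class = 25
minutes_per_essay = 20

essays = classes * students_per_class        # 150 essays per writing assignment
total_minutes = essays * minutes_per_essay   # 3,000 minutes of grading

print(f"{essays} essays -> {total_minutes} minutes, about {total_minutes / 60:.0f} hours")
# 150 essays -> 3000 minutes, about 50 hours
```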

Could ChatGPT relieve teachers of some of the burden of grading papers? Early research is finding that the new artificial intelligence of large language models, also known as generative AI, is approaching the accuracy of a human in scoring essays and is likely to become even better soon. But we still don’t know whether offloading essay grading to ChatGPT will ultimately improve or harm student writing.

Tamara Tate, a researcher at the University of California, Irvine, and an associate director of her university’s Digital Learning Lab, is studying how teachers might use ChatGPT to improve writing instruction. Most recently, Tate and her seven-member research team, which includes writing expert Steve Graham at Arizona State University, compared how ChatGPT stacked up against humans in scoring 1,800 history and English essays written by middle and high school students. 

Tate said ChatGPT was “roughly speaking, probably as good as an average busy teacher” and “certainly as good as an overburdened below-average teacher.” But, she said, ChatGPT isn’t yet accurate enough to be used on a high-stakes test or on an essay that would affect a final grade in a class.

Tate presented her study on ChatGPT essay scoring at the 2024 annual meeting of the American Educational Research Association in Philadelphia in April. (The paper is under peer review for publication and is still undergoing revision.) 

Most remarkably, the researchers obtained these fairly decent essay scores from ChatGPT without training it first with sample essays. That means it is possible for any teacher to use it to grade any essay instantly with minimal expense and effort. “Teachers might have more bandwidth to assign more writing,” said Tate. “You have to be careful how you say that because you never want to take teachers out of the loop.” 

Writing instruction could ultimately suffer, Tate warned, if teachers delegate too much grading to ChatGPT. Seeing students’ incremental progress and common mistakes remains important for deciding what to teach next, she said. For example, seeing loads of run-on sentences in your students’ papers might prompt a lesson on how to break them up. But if you don’t see them, you might not think to teach it. 

In the study, Tate and her research team calculated that ChatGPT’s essay scores were in “fair” to “moderate” agreement with those of well-trained human evaluators. In one batch of 943 essays, ChatGPT was within a point of the human grader 89 percent of the time. On a six-point grading scale that researchers used in the study, ChatGPT often gave an essay a 2 when an expert human evaluator thought it was really a 1. But this level of agreement – within one point – dropped to 83 percent of the time in another batch of 344 English papers and slid even farther to 76 percent of the time in a third batch of 493 history essays.  That means there were more instances where ChatGPT gave an essay a 4, for example, when a teacher marked it a 6. And that’s why Tate says these ChatGPT grades should only be used for low-stakes purposes in a classroom, such as a preliminary grade on a first draft.

Chart: ChatGPT scored an essay within one point of a human grader 89 percent of the time in one batch of essays. Corpus 3 refers to the batch of 943 essays, which represents more than half of the 1,800 essays scored in this study. Source: Tamara Tate, University of California, Irvine (2024).

Still, this level of accuracy was impressive because even teachers disagree on how to score an essay and one-point discrepancies are common. Exact agreement, which only happens half the time between human raters, was worse for AI, which matched the human score exactly only about 40 percent of the time. Humans were far more likely to give a top grade of a 6 or a bottom grade of a 1. ChatGPT tended to cluster grades more in the middle, between 2 and 5. 
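
To make the two agreement measures concrete, here is a small sketch computed on made-up score pairs, not the study's data; it shows how exact agreement and within-one-point agreement are calculated:

```python
# Illustration of the two agreement measures discussed above, using made-up score pairs.
# These are NOT the study's data; they only show how the rates are computed.
human_scores   = [1, 2, 3, 4, 5, 6, 3, 4, 2, 5]
chatgpt_scores = [2, 2, 3, 3, 4, 5, 3, 6, 2, 4]

pairs = list(zip(human_scores, chatgpt_scores))
exact = sum(h == c for h, c in pairs) / len(pairs)
within_one = sum(abs(h - c) <= 1 for h, c in pairs) / len(pairs)

print(f"Exact agreement: {exact:.0%}")        # Exact agreement: 40%
print(f"Within one point: {within_one:.0%}")  # Within one point: 90%
```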

Tate set up ChatGPT for a tough challenge, competing against teachers and experts with PhDs who had received three hours of training in how to properly evaluate essays. “Teachers generally receive very little training in secondary school writing and they’re not going to be this accurate,” said Tate. “This is a gold-standard human evaluator we have here.”

The raters had been paid to score these 1,800 essays as part of three earlier studies on student writing. Researchers fed these same student essays – ungraded – into ChatGPT and asked ChatGPT to score them cold. ChatGPT hadn’t been given any graded examples to calibrate its scores. All the researchers did was copy and paste an excerpt of the same scoring guidelines that the humans used, called a grading rubric, into ChatGPT and tell it to “pretend” it was a teacher and score the essays on a scale of 1 to 6. 

Older robo graders

Earlier versions of automated essay graders have had higher rates of accuracy. But they were expensive and time-consuming to create because scientists had to train the computer with hundreds of human-graded essays for each essay question. That’s economically feasible only in limited situations, such as for a standardized test, where thousands of students answer the same essay question. 

Earlier robo graders could also be gamed, once a student understood the features that the computer system was grading for. In some cases, nonsense essays received high marks if fancy vocabulary words were sprinkled in them. ChatGPT isn’t grading for particular hallmarks, but is analyzing patterns in massive datasets of language. Tate says she hasn’t yet seen ChatGPT give a high score to a nonsense essay. 

Tate expects ChatGPT’s grading accuracy to improve rapidly as new versions are released. Already, the research team has detected that the newer 4.0 version, which requires a paid subscription, is scoring more accurately than the free 3.5 version. Tate suspects that small tweaks to the grading instructions, or prompts, given to ChatGPT could improve existing versions. She is interested in testing whether ChatGPT’s scoring could become more reliable if a teacher trained it with just a few, perhaps five, sample essays that she has already graded. “Your average teacher might be willing to do that,” said Tate.

Many ed tech startups, and even well-known vendors of educational materials, are now marketing new AI essay robo graders to schools. Many of them are powered under the hood by ChatGPT or another large language model, and I learned from this study that accuracy rates can be reported in ways that make the new AI graders seem more accurate than they are. Tate’s team calculated that, on a population level, there was no difference between human and AI scores. ChatGPT can already reliably tell you the average essay score in a school or, say, in the state of California. 

Questions for AI vendors

At this point, it is not as accurate in scoring an individual student. And a teacher wants to know exactly how each student is doing. Tate advises teachers and school leaders who are considering using an AI essay grader to ask specific questions about accuracy rates at the student level: What is the rate of exact agreement between the AI grader and a human rater on each essay? How often are they within one point of each other?

The next step in Tate’s research is to study whether student writing improves after having an essay graded by ChatGPT. She’d like teachers to try using ChatGPT to score a first draft and then see if it encourages revisions, which are critical for improving writing. Tate thinks teachers could make it “almost like a game: how do I get my score up?” 

Of course, it’s unclear if grades alone, without concrete feedback or suggestions for improvement, will motivate students to make revisions. Students may be discouraged by a low score from ChatGPT and give up. Many students might ignore a machine grade and only want to deal with a human they know. Still, Tate says some students are too scared to show their writing to a teacher until it’s in decent shape, and seeing their score improve on ChatGPT might be just the kind of positive feedback they need. 

“We know that a lot of students aren’t doing any revision,” said Tate. “If we can get them to look at their paper again, that is already a win.”

That does give me hope, but I’m also worried that kids will just ask ChatGPT to write the whole essay for them in the first place.

This story about AI essay scoring was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

PROOF POINTS: Many high school math teachers cobble together their own instructional materials from the internet and elsewhere, a survey finds
https://hechingerreport.org/proof-points-many-high-school-math-teachers-cobble-together-their-own-instructional-materials-from-the-internet-and-elsewhere-a-survey-finds/
Mon, 29 Apr 2024

Writing lesson plans has traditionally been a big part of a teacher’s job.  But this doesn’t mean they should be starting from a blank slate. Ideally, teachers are supposed to base their lessons on the textbooks, worksheets and digital materials that school leaders have spent a lot of time reviewing and selecting. 

But a recent national survey of more than 1,000 math teachers reveals that many are rejecting the materials they should be using and cobbling together their own.

“A surprising number of math teachers, particularly at the high school level, simply said we don’t use the district or school-provided materials, or they claimed they didn’t have any,” said William Zahner, an associate professor of mathematics at San Diego State University, who presented the survey at the April 2024 annual meeting of the American Educational Research Association in Philadelphia. Students, he said, are often being taught through a “bricolage” of materials that teachers assemble themselves from colleagues and the internet. 

“What I see happening is a lot of math teachers are rewriting a curriculum that has already been written,” said Zahner. 

The survey results varied by grade level. More than 75 percent of elementary school math teachers said they used their school’s recommended materials, but fewer than 50 percent of high school math teachers said they did. 

Chart: Share of math teachers who use their school’s recommended materials. Source: Zahner et al., Mathematics Teachers’ Perceptions of Their Instructional Materials for English Learners: Results from a National Survey, presented at AERA 2024.

The do-it-yourself approach has two downsides, Zahner said, both of which affect students. One problem is that it’s time consuming. Time spent finding materials is time not spent giving students feedback, tailoring existing lessons for students or giving students one-to-one tutoring help. The hunt for materials is also exhausting and can lead to teacher burnout, Zahner said.

The other problem is that teacher-made materials may sacrifice the thoughtful sequencing of topics planned by curriculum designers. When teachers create materials or take them from various sources, it is hard to maintain a “coherent development” of ideas, Zahner explained. Curriculum designers may weave in a review of previous concepts to reinforce them even as new ideas are introduced. Teacher-curated materials may be disjointed. Separate research has found that some of the most popular materials that teachers grab from internet sites, such as Teachers Pay Teachers, are not high quality.

The national survey was conducted in 2021 by researchers at San Diego State University, including Zahner, who also directs the university’s Center for Research in Mathematics and Science Education, and the English Learners Success Forum, a nonprofit that seeks to improve the quality of instructional materials for English learners. The researchers sought out the views of teachers who worked in school districts where more than 10 percent of the students were classified as English learners, which is the national average. More than 1,000 math teachers, from kindergarten through 12th grade, responded. On average, 30 percent of their students were English learners, but some teachers had zero English learners and others had all English learners in their classrooms.

Teachers were asked about the drawbacks of their assigned curriculum for English learners. Many said that their existing materials weren’t connected to their students’ languages and cultures. Others said that the explanations of how to tailor a lesson to an English learner were too general to be useful.  Zahner says that teachers have a point and that they need more support in how to help English learners develop the language of mathematical reasoning and argumentation.

It was not clear from this survey whether the desire to accommodate English learners was the primary reason that teachers were putting together their own materials or whether they would have done so anyway. 

Related: Most English lessons on Teachers Pay Teachers and other sites are ‘mediocre’ or ‘not worth using,’ study finds

“There are a thousand reasons why this is happening,” said Zahner. One high school teacher in Louisiana who participated in the survey said his students needed a more advanced curriculum. Supervisors inside a school may not like the materials that officials in a central office have chosen. “Sometimes schools have the materials but they’re all hidden in a closet,” Zahner said.

In the midst of a national debate on how best to teach math, this survey is an important reminder of yet another reason why many students aren’t getting the instruction that they need. 

This story about math lessons was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters. 

PROOF POINTS: The myth of the quick learner
https://hechingerreport.org/proof-points-the-myth-of-the-quick-learner/
Mon, 27 Nov 2023

Some kids appear to learn faster than others. A few years ago, a group of scientists at Carnegie Mellon University decided to study these rapid learners to see what they are doing differently and if their strategies could help the rest of us.

But as the scientists began their study, they stumbled upon a fundamental problem:  they could not find faster learners. After analyzing the learning rates of 7,000 children and adults using instructional software or playing educational games, the researchers could find no evidence that some students were progressing faster than others. All needed practice to learn something new, and they learned about the same amount from each practice attempt. On average, it was taking both high and low achievers about seven to eight practice exercises to learn a new concept, a rather tiny increment of learning that the researchers call a “knowledge component.”

“Students are starting in different places and ending in different places,” said Ken Koedinger, a cognitive psychologist and director of Carnegie Mellon’s LearnLab, where this research was conducted. “But they’re making progress at the same rates.” 

Koedinger and his team’s data analysis was published in the Proceedings of the National Academy of Sciences (PNAS), a peer-reviewed journal of the National Academy of Sciences, in March 2023. The study offers the hope that “anyone can learn anything they want” if they get well-designed practice exercises and put some effort into it.  Raw talent, like having a “knack for math” or a “gift for language,” isn’t required.

Koedinger and his colleagues wrote that they were initially “surprised” by the “astonishing amount of regularity in students’ learning rate.” The discovery contradicts our everyday experiences. Some students earn As in algebra, an example mentioned in the paper, and they appear to have learned faster than peers who get Cs.

But as the scientists confirmed their numerical results across 27 datasets, they began to understand that we commonly mistake prior knowledge for learning. Some kids already know a lot about a subject before a teacher begins a lesson. They may have already had exposure to fractions by making pancakes at home using measuring cups. The fact that they mastered a fractions unit faster than their peers doesn’t mean they learned faster; they had a head start. 

Like watching a marathon

Koedinger likens watching children learn to watching a marathon from the finish line. The first people to cross the finish line aren’t necessarily the fastest when there are staggered starts. A runner who finished sooner might have taken five hours, while another runner who finished later might have taken only four hours. You need to know each runner’s start time to measure the pace.

Koedinger and his colleagues measured each student’s baseline achievement and their incremental gains from that initial mark. This would be very difficult to measure in ordinary classrooms, but with educational software, researchers can sort practice exercises by the knowledge components required to do them, see how many problems students get right initially and track how their accuracy improves over time.  

In the LearnLab datasets, students typically used software after some initial instruction in their classrooms, such as a lesson by a teacher or a college reading assignment. The software guided students through practice problems and exercises. Initially, students in the same classrooms had wildly different accuracy rates on the same concepts. The top quarter of students were getting 75 percent of the questions correct, while the bottom quarter of students were getting only 55 percent correct. It’s a gigantic 20 percentage point difference in the starting lines. 

However, as students progressed through the computerized practice work, there was barely even one percentage point difference in learning rates. The fastest quarter of students improved their accuracy on each concept (or knowledge component) by about 2.6 percentage points after each practice attempt, while the slowest quarter of students improved by about 1.7 percentage points. It took seven to eight attempts for nearly all students to go from 65 percent accuracy, the average starting place, to 80 percent accuracy, which is what the researchers defined as mastery.

The advantage of a head start

The head start for the high achievers matters. Above-average students, who begin above 65 percent accuracy, take fewer than four practice attempts to hit the 80 percent threshold. Below-average students tend to require more than 13 attempts to hit the same 80 percent threshold. That difference – four versus 13 – can make it seem like students are learning at different paces. But they’re not. Each student, whether high or low, is learning about the same amount from each practice attempt. (The researchers didn’t study children with disabilities, and it’s unknown if their learning rates are different.)
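
A minimal sketch of that head-start arithmetic, assuming roughly a two-percentage-point gain per practice attempt as reported above; the three starting accuracies are illustrative, not taken from the study:

```python
import math

# Head-start arithmetic from the learning-rate figures above.
# The ~2-point-per-attempt gain is the reported rate; starting accuracies are illustrative.
MASTERY = 80.0          # percent accuracy the researchers defined as mastery
GAIN_PER_ATTEMPT = 2.0  # approximate percentage-point gain per practice attempt

def attempts_to_mastery(start_accuracy: float, gain: float = GAIN_PER_ATTEMPT) -> int:
    """Practice attempts needed to climb from a starting accuracy to mastery."""
    return math.ceil((MASTERY - start_accuracy) / gain)

for label, start in [("average starter", 65), ("head-start student", 73), ("low starter", 53)]:
    print(f"{label} ({start}% to start): about {attempts_to_mastery(start)} attempts")
# average starter (65% to start): about 8 attempts
# head-start student (73% to start): about 4 attempts
# low starter (53% to start): about 14 attempts
```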

The student data that Koedinger studied comes from educational software that is designed to be interactive and gives students multiple attempts to try things, make mistakes, get feedback and try again. Students learn by doing. Some of the feedback was very basic, like an answer key, alerting students if they got the problem right or wrong. But some of the feedback was sophisticated. Intelligent tutoring systems in math provided hints when students got stuck, offered complete explanations and displayed step-by-step examples. 

The conclusion that everyone’s learning rate is similar might apply only to well-designed versions of computerized learning. Koedinger thinks students probably learn at different paces in the analog world of paper and pencil, without the same guided practice and feedback. When students are learning more independently, he says, some might be better at checking their own work and seeking guidance.  

Struggling students might be getting fewer “opportunities” to learn in the analog world, Koedinger speculated. That doesn’t necessarily mean that schools and parents should be putting low-achieving students on computers more often. Many students quickly lose motivation to learn on screens and need more human interaction.

Memory ability varies

Learning rates were especially steady in math and science – the subjects that most of the educational software in this study focused on. But researchers noticed more divergence in learning rates in the six datasets that involved the teaching of English and other languages. One was a program that taught the use of the article “the,” which can be arbitrary. (Here’s an example: I’m swimming in the Atlantic Ocean today but in Lake Ontario tomorrow. There’s no “the” before lakes.) Another program taught Chinese vocabulary. Both relied on students’ memory, and individual memory processing speeds differ. Memory is important in learning math and science too, but Koedinger said students might be able to compensate with other learning strategies, such as pattern recognition, deduction and induction. 

The finding that we all learn at a similar rate is one of the best arguments I’ve seen for not giving up on ourselves when we’re failing and falling behind our peers. Koedinger hopes it will inspire teachers to change their attitudes about low achievers in their classrooms, and instead think of them as students who haven’t had the same number of practice opportunities and exposure to ideas that other kids have had. With the right exercises and feedback, and a bit of effort, they can learn too. Perhaps it’s time to revise the old saw about how to get to Carnegie Hall. Instead of practice, practice, practice, I’m going to start saying practice, listen to feedback and practice again (repeat seven times).

This story was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Hechinger newsletter.

PROOF POINTS: Professors say high school math doesn’t prepare most students for their college majors
https://hechingerreport.org/proof-points-professors-say-high-school-math-doesnt-prepare-most-students-for-their-college-majors/
Mon, 13 Nov 2023

A survey of college professors indicates that most fields of study don’t require many of the math topics that students learn in high school. Credit: Kevin Wolf/ Associated Press

The typical ambitious high school student takes advanced algebra, trigonometry, pre-calculus and calculus. None of that math may be necessary for the vast majority of undergraduates who don’t intend to major in science or another STEM field. 

But those same students don’t have many of the math skills that professors think they actually do need. In a survey, humanities, arts and social science professors say they really want their students to be able to analyze data, create charts and spreadsheets and reason mathematically – skills that high school math courses often skip or rush through.

“We still need the traditional algebra-to-calculus curriculum for students who are intending a STEM major,” said Gary Martin, a professor of mathematics education at Auburn University in Alabama who led the team that conducted this survey of college professors. “But that’s maybe 20 percent. The other 80 percent, what about them?” 

Martin said that the survey showed that high schools should stress “reasoning and critical thinking skills, decrease the emphasis on specific mathematical topics, and increase the focus on data analysis and statistics.”

This damning assessment of the content of high school math comes from a survey of about 300 Alabama college professors who oversee majors and undergraduate degree programs at both two-year and four-year public colleges in the humanities, arts, social sciences and some natural sciences. Majors that require calculus were excluded. 

The 2021 survey prompted Alabama’s public colleges and universities to allow more students to meet their math requirements by taking a statistics course instead of a traditional math class, such as college algebra or calculus. 

Martin and his colleagues later realized that the survey had implications for high school math too, and presented these results at an Oct. 26, 2023 session of the National Council of Teachers of Mathematics annual conference in Washington D.C.  Full survey results are slated to be published in the winter 2024 issue of the MathAMATYC Educator, a peer-reviewed journal of the American Mathematical Association of Two-Year Colleges.

In the survey, professors were asked detailed questions about which mathematical concepts and skills students need in their programs. Many high school math topics were unimportant to college professors. For example, most professors said they wanted students to understand functions, particularly linear and exponential functions, which are used to model trends, population changes or compound interest. But Martin said that non-STEM students didn’t really need to learn trigonometric functions, which are used in satellite navigation or mechanical engineering. 

College professors were more keen on an assortment of what was described as mathematical “practices,” including the ability to “interpret quantitative information,” “strategically infer, evaluate and reason,” “apply the mathematics they know to solve problems arising in everyday life, society and the workplace,” and to “look for patterns and relationships and make generalizations.”

“Teachers are so focused on covering all the topics that they don’t have time to do the practices when the practices are what really matters,” said Martin.

Understanding statistics was high on the list. An overwhelming majority of college professors said students in their programs needed to be familiar with statistics and data analysis, including concepts like correlation, causation and the importance of sample size. They wanted students to be able to “interpret displays of data and statistical analyses to understand the reasonableness of the claims being presented.” Professors say students need to be able to produce bar charts, histograms and line charts. Facility with spreadsheets, such as Excel, is useful too.

“Statistics is what you need,” said Martin. “Yet, in many K-12 classrooms, statistics is the proverbial end-of-the-year unit that you may or may not get to. And if you do, you rush through it, just to say you did it. But there’s not this sense of urgency to get through the statistics, as there is to get through the math topics.”

Though the survey took place only in Alabama and professors in other states might have different thoughts on the math that students need, Martin suspects that there are more similarities than differences.

The mismatch between what students learn in high school and what they need in college isn’t easy to fix. Teachers generally don’t have time for longer statistics units, or the ability to go deeper into math concepts so that students can develop their reasoning skills, because high school math courses have become bloated with too many topics. However, there is no consensus on which algebra topics to jettison.

Encouraging high school students to take statistics classes during their junior and senior years is also fraught. College admissions officers value calculus, almost as a proxy for intelligence. And college admissions tests tend to emphasize math skills that students will practice more on the algebra-to-calculus track. A diversion to data analysis risks putting students at a disadvantage. 

The thorniest problem is that revamping high school math could force students to make big choices in school before they know what they want to study in college. Students who want to enter STEM fields still need calculus and the country needs more people to pursue STEM careers. Taking more students off of the calculus track could close doors to many students and ultimately weaken the U.S. economy.

Martin said it’s also important to remember that vocational training is not the only purpose of math education.  “We don’t have students read Shakespeare because they need it to be effective in whatever they’re going to do later,” he said. “It adds something to your life. I felt that it really gave me breadth as a human being.”  He wants high school students to study some math concepts they will never need because there’s a beauty to them. “Appreciating mathematics is a really intriguing way of looking at the world,” he said.

Martin and his colleagues don’t have any definitive solutions, but their survey is a helpful data point in demonstrating how too few students are getting the mathematical foundations they need for the future. 

This story about high school math was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Hechinger newsletter.

PROOF POINTS: Schools’ mission shifted during the pandemic with healthcare, shelter and adult ed
https://hechingerreport.org/proof-points-with-dental-care-shelter-and-adult-ed-the-pandemic-prompted-a-shift-in-schools-mission/
Mon, 06 Nov 2023

The Buena Vista Horace Mann K-8 Community School in San Francisco opened its gymnasium to homeless students and their families as part of its Stay Over Program in 2022. It is one example of the many community services that a majority of public schools are now providing, according to a federal survey. Credit: Marissa Leshnov for The Hechinger Report

Much attention in the post-pandemic era has been on what students have lost – days of school, psychological health, knowledge and skills. But now we have evidence that they may also have gained something: schools that address more of their needs. A majority of public schools have begun providing services that are far afield from traditional academics, including healthcare, housing assistance, childcare and food aid. 

In a Department of Education survey released in October 2023 of more than 1,300 public schools, 60 percent said they were partnering with community organizations to provide non-educational services. That’s up from 45 percent a year earlier in 2022, the first time the department surveyed schools about their involvement in these services. They include access to medical, dental, and mental health providers as well as social workers. Adult education is also often part of the package; the extras are not just for kids. 

“It is a shift,” said Marguerite Roza, director of the Edunomics Lab at Georgetown University, where she tracks school spending. “We’ve seen partnering with the YMCA and with health groups for medical services and psychological evaluations.”

Deeper involvement in the community started as an emergency response to the coronavirus pandemic. As schools shuttered their classrooms, many became hubs where families obtained food or internet access. Months later, many schools opened their doors to become vaccine centers. 

New community alliances were further fueled by more than $200 billion in federal pandemic recovery funds that have flowed to schools. “Schools have a lot of money now and they’re trying to spend it down,” said Roza. Federal regulations encourage schools to spend recovery funds on nonprofit community services, and unspent funds will eventually be forfeited.

The term “community school” generally refers to schools that provide a cluster of wraparound services under one roof. The hope is that students living in poverty will learn more if their basic needs are met. Schools that provide only one or two services are likely among the 60 percent of schools that said they were using a community school or wraparound services model, but they aren’t necessarily full-fledged community schools, Department of Education officials said.

The wording of the question on the federal School Pulse Panel survey administered in August 2023 allowed for a broad interpretation of what it means to be a community school. The question posed to a sample of schools across all 50 states was this: “Does your school use a ‘community school’ or ‘wraparound services’ model? A community school or wraparound services model is when a school partners with other government agencies and/or local nonprofits to support and engage with the local community (e.g., providing mental and physical health care, nutrition, housing assistance, etc.).” 

The most common service provided was mental health (66 percent of schools) followed by food assistance (55 percent). Less common were medical clinics and adult education, but many more schools said they were providing these services than in the past.

A national survey of more than 1,300 public schools conducted by the National Center for Education Statistics indicates that a majority are providing a range of non-educational wraparound services to the community. Source: PowerPoint slide from an online briefing in October 2023 by the National Center for Education Statistics.

The number of full-fledged community schools is also believed to be growing, according to education officials and researchers. Federal funding for community schools tripled during the pandemic to $75 million in 2021-22 from $25 million in 2019-20. According to the  education department, the federal community schools program now serves more than 700,000 students in about 250 school districts, but there are additional state and private funding sources too. 

Whether it’s a good idea for most schools to expand their mission and adopt aspects of the community school model depends on one’s view of the purpose of school. Some argue that schools are taking on too many functions and should not attempt to create outposts for outside services. Others argue that strong community engagement is an important aspect of education and can improve daily attendance and learning. Research studies conducted before the pandemic have found that academic benefits from full-fledged community schools can take several years to materialize. It’s a big investment without an instant payoff.

Meanwhile, it’s unclear whether schools will continue to embrace their expanded mission after federal pandemic funds expire in March 2026. That’s when the last payments to contractors and outside organizations for services rendered can be made. Contracts must be signed by September 2024.

Edunomics’s Roza thinks many of these community services will be the first to go as schools face future budget cuts. But she also predicts that some will endure as schools raise money from state governments and philanthropies to continue popular programs.

If that happens, it will be an example of another unexpected consequence of the pandemic. Even as pundits decry how the pandemic has eroded support for public education, it may have profoundly transformed the role of schools and made them even more vital.

This story about wraparound services was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Hechinger newsletter.

The post PROOF POINTS: Schools’ mission shifted during the pandemic with healthcare, shelter and adult ed appeared first on The Hechinger Report.

PROOF POINTS: Flashcards prevail over repetition in memorizing multiplication tables https://hechingerreport.org/proof-points-flashcards-prevail-over-repetition-in-memorizing-multiplication-tables/ https://hechingerreport.org/proof-points-flashcards-prevail-over-repetition-in-memorizing-multiplication-tables/#comments Mon, 30 Oct 2023 10:00:00 +0000 https://hechingerreport.org/?p=96854

A study published in 2023 in the journal Applied Cognitive Psychology documented that second graders memorized more multiplication facts when they practiced using flashcards than when they repeated their times tables aloud. Credit: Matt McClain/The Washington Post via Getty Images

Young students around the world struggle to memorize multiplication tables, but the effort pays off. Cognitive scientists say that learning 6 x 7 and 8 x 9 by heart frees up the brain’s working memory so that students can focus on the more demanding aspects of problem solving. 

Math teachers debate the best way to make multiplication automatic. Some educators argue against drills and say fluency will develop with everyday usage. Others insist that schools should devote time to helping children memorize times tables. 

Even among proponents of memorization, it’s unclear which methods are the most effective. Should kids draw their own color-coded tables and study them, or copy their multiplication facts out dozens of times? Should they play multiplication songs and videos? Should they learn mnemonic tricks, like how the digits of the multiples of nine add up to nine (18 → 1+8, 27 → 2+7, 36 → 3+6, and so on)? My daughter’s gym teacher used to make students shout “7 x 5 is 35” and “6 x 8 is 48” as they did jumping jacks. (It was certainly a way to make jumping less monotonous.)
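For readers who want to see the nines pattern spelled out, here is a quick, hypothetical check in Python; it simply runs through 9 x 1 to 9 x 10 and sums the digits of each product. It illustrates the arithmetic behind the mnemonic and is not drawn from any of the studies discussed here.

```python
# Illustrative check of the "nines" mnemonic for the times tables (9 x 1 through 9 x 10):
# the digits of every product sum to nine.
for n in range(1, 11):
    product = 9 * n
    digit_sum = sum(int(digit) for digit in str(product))
    print(f"9 x {n:2d} = {product:2d}  ->  digit sum = {digit_sum}")
    assert digit_sum == 9  # holds for every entry in the times table
```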

To help advise teachers, a team of learning scientists compared two common methods: chanting and flashcards. 

The 2022 experiment took place in four second grade classrooms in the Netherlands. The teachers began by delivering a lesson on multiplying by three. Using the same scripted lesson, they explained multiplication concepts, such as: “If I grab three apples, and I do this only one time, how many apples do I have?” 

After the lesson, half the classrooms practiced by reciting equations displayed on a whiteboard:  “One times three is three, two times three is six…” through to 10. The other half practiced with flashcards. Students had their own personal sets with answers on the reverse side. Both groups spent five minutes practicing three times during the week for a total of 15 minutes. (More details on the experiment’s design here.)

When the teachers moved on to multiplication by fours, the groups switched. The chanters quizzed themselves with flashcards, and the flashcard kids started chanting. All the students practiced memorizing both ways. 

The results added up to a clear winner. 

On a pre-test before the lesson, the second graders got an average of three math facts right. Afterwards, the chanters tended to double their accuracy, answering six facts correctly. But the flashcard users averaged eight correct. Students were tested again a full week later without any additional practice sessions, and the strong advantage for flashcard users didn’t fade. It was a sign that flashcard practice not only produces better short-term memories, but also better long-term ones –  the ultimate goal.

Students scored higher on a multiplication test after practicing with flashcards (retrieval practice) than after chanting aloud (restudy). Source: Figure 1 of “The effect of retrieval practice on fluently retrieving multiplication facts in an authentic elementary school setting” (2023), published in Applied Cognitive Psychology.

The study, “The effect of retrieval practice on fluently retrieving multiplication facts in an authentic elementary school setting,” was published online in October 2023 in the journal Applied Cognitive Psychology.  Though a small study of 48 students, this classroom experiment is a good example of the power of what cognitive scientists call “spaced retrieval practice,” in which the act of remembering consolidates information and helps the brain form long-term memories.  

Retrieval practice can seem counterintuitive. One might think that students should study before being assessed or quizzing themselves. But there’s a growing body of evidence that trying to recall something is itself a powerful tool for learning, particularly when you are given the correct answer immediately after making a stab at it and then get a chance to try again. Testing your memory – even when you draw a blank – is a way to build new memories. 
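To make that mechanic concrete, here is a minimal sketch of flashcard-style retrieval practice in Python. It is an assumption-laden illustration, not the procedure the researchers used: the deck contents, the prompt format and the rule of requeueing missed cards are all invented for this example.

```python
# A minimal sketch of flashcard-style retrieval practice with immediate feedback.
# Hypothetical illustration only; this is not the protocol from the study described above.
from collections import deque

def flashcard_session(cards):
    """Ask each prompt, reveal the answer right away, and requeue any misses."""
    queue = deque(cards.items())
    while queue:
        prompt, answer = queue.popleft()
        guess = input(f"{prompt} = ")        # the retrieval attempt
        print(f"Answer: {answer}")           # immediate feedback, like flipping the card
        if guess.strip() != str(answer):
            queue.append((prompt, answer))   # missed facts come back for another try

if __name__ == "__main__":
    threes = {f"{i} x 3": i * 3 for i in range(1, 11)}  # a "times three" deck
    flashcard_session(threes)
```

The essential feature is that the learner must attempt to retrieve each fact before seeing the answer; that attempt, followed by immediate feedback and another chance, is what distinguishes retrieval practice from rereading or chanting.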

Many experiments have shown that retrieval practice produces better long-term memories than studying. Flashcards are one way to try retrieval practice. Quizzes are another option because they also require students to retrieve new information from memory. Indeed, many teachers opt for speed drills, asking students to race through a page of multiplication problems in a minute. 

Flashcards can be less anxiety-provoking; they give students immediate feedback, with the answers on the reverse side, and allow them to repeat the retrieval practice right away by running through the deck more than once. Still, kids are kids, and they easily drift off task during independent practice time. With a timed quiz, the teacher can be more confident that everyone has benefited from a round of retrieval practice. I’d be curious to see flashcards and quizzes pitted against each other in a future classroom experiment.

As charming as multiplication songs are – I have a soft spot for School House Rock and my editor fondly recalls her Billy Leach multiplication records – they are unlikely to be as effective as flashcards because they don’t involve retrieval practice, according to Gino Camp, a professor of learning sciences at Open University in the Netherlands and one of the researchers on the study.

That doesn’t mean we should jettison the songs or all the other memorization methods just because some aren’t as effective as others. Researchers may eventually find that a combination of techniques is even more powerful. Still, there are limited minutes in the school day, and knowing which learning methods are the most effective can help everyone – teachers, parents and students – use their time wisely.

This story about multiplication flashcards was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Hechinger newsletter.

The post PROOF POINTS: Flashcards prevail over repetition in memorizing multiplication tables appeared first on The Hechinger Report.

PROOF POINTS: Schools staff up as student enrollment drops https://hechingerreport.org/proof-points-schools-staff-up-as-student-enrollment-drops/ https://hechingerreport.org/proof-points-schools-staff-up-as-student-enrollment-drops/#comments Mon, 09 Oct 2023 10:00:00 +0000 https://hechingerreport.org/?p=96349

Georgetown University’s Edunomics Lab documents the divergence between the growth of school staff and students. See detailed graphs below. Retrieved from https://edunomicslab.org/staffing-v-enrollment-trends-2/

The stats on school staffing might seem like a violation of the laws of supply and demand.

In the past decade, the population of elementary, middle and high school students in Massachusetts dropped by 42,000 while the number of school employees grew by 18,000. In Connecticut, public school enrollment fell 7 percent while staffing rose 8 percent. Even in states with growing populations, staffing has been increasing far faster than enrollment. Texas, for example, educates 367,000 more students, a 7 percent increase over the past decade, but the number of employees has surged by more than 107,000, a 16 percent jump. Staffing is up 20 percent in Washington state, but the number of students has risen by less than 3 percent.
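To see what the Texas figures above imply, here is a back-of-envelope sketch in Python. The baseline totals are inferred from the reported increases and percentage changes, so they are rough approximations for illustration only, not official enrollment or staffing counts.

```python
# Back-of-envelope: what the reported Texas changes imply for students per school employee.
# Baselines are inferred from the article's figures (+367,000 students = 7%; +107,000 staff = 16%)
# and are rough approximations, not official counts.
student_increase, student_pct = 367_000, 0.07
staff_increase, staff_pct = 107_000, 0.16

students_before = student_increase / student_pct   # roughly 5.2 million
staff_before = staff_increase / staff_pct          # roughly 670,000

students_after = students_before + student_increase
staff_after = staff_before + staff_increase

print(f"Students per employee, a decade ago: {students_before / staff_before:.1f}")
print(f"Students per employee, now:          {students_after / staff_after:.1f}")
```

Even with both enrollment and staffing growing, the number of students per school employee falls, which is the divergence the data below describe.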

“When kids go to school right now there are more adults in the building of all types than there were in 2013 and more than when I was a kid,” said Marguerite Roza, director of the Edunomics Lab at Georgetown University, where she has been tracking the divergence between students and staff at the nation’s public schools. 

What’s behind the apparent imbalance? Follow the money.

School hiring has taken place in three acts, Roza says. The first act followed the Great Recession of 2008, as schools added back staff that they had been forced to cut in the economic downturn. 

The second act came with seven consecutive years of strong economic growth beginning in 2013. That led to higher state and local tax receipts, which increased school funding and enabled the new hires. “Most of the additions were fueled by a lot of new money,” said Roza. Schools hired more teachers to reduce class sizes. They added art and music teachers, librarians and nurses, as well as special education teachers to help children with disabilities. Schools generally chose to add more slots instead of raising salaries for the teachers they already had, Roza said.

The third act was a pandemic-fueled “hiring bonanza.” Starting in 2020, the federal government sent schools more than $200 billion in pandemic recovery funds. Schools hired additional counselors, interventionists (a fancy name for tutors), and aides, and increased their reserves of substitute teachers. More teachers were hired to further reduce class size, in the hope that students might receive more attention and catch up from pandemic learning losses. By the spring of 2023, school districts had amassed more staff than at any time in history, the Edunomics Lab calculated.

Not every school has increased staffing levels, according to Roza, but she says it’s a widespread national trend. Roza’s organization produced graphs for six states – Connecticut, Massachusetts, Michigan, Texas, Washington and Pennsylvania – that release their staffing and student enrollment data publicly. It could be years before complete national data is available, Roza said. 

The available data doesn’t specify how much of the staff expansion represents new classroom teachers, as opposed to support staff, such as janitors and attendance clerks, or administrators, such as vice principals and math supervisors. 

Roza says there is administrative bloat in the central offices of many school districts. But some of the administrative growth is required to comply with increased federal regulations, such as those that stem from the Individuals with Disabilities Education Act (IDEA). Other administrators are needed to manage federal grants. Central offices also needed more administrators to handle recruitment and human resources because they were hiring for so many new positions.

Meanwhile, the number of students has been dropping in most school districts. That’s because Americans had fewer babies after the 2008 recession. The national population of elementary and middle school students, ages five to 13, peaked in 2013 at 37 million; by 2021 there were 400,000 fewer students. (This includes public, private, charter and homeschooled students.) Student population losses are more dramatic in some regions of the country than others; many school districts in the South are still growing.

Roza says some schools have excess capacity and are only half filled. School budgets, often based on per pupil funding formulas, would normally be cut. But many districts have been insulated from financial realities because of pandemic recovery funds.  Schools are expected to face a reckoning after September 2024 when these federal funds expire. Roza predicts many schools will need to lay off four percent or more of their staff, including teachers. 

This news is confusing because school administrators have been complaining about teacher shortages. And indeed, there are unfilled vacancies at many schools. Some of these vacancies reflect new slots that are hard to fill with a finite supply of teachers. But many vacancies are in high-poverty schools where fewer teachers want to teach. A year from now, as districts are forced to lay off more teachers, high-poverty schools might have even more unfilled positions. And our neediest children will suffer the most.

This story about school staffing was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

The post PROOF POINTS: Schools staff up as student enrollment drops appeared first on The Hechinger Report.

PROOF POINTS: Three views of pandemic learning loss and recovery https://hechingerreport.org/proof-points-three-views-of-pandemic-learning-loss-and-recovery/ https://hechingerreport.org/proof-points-three-views-of-pandemic-learning-loss-and-recovery/#respond Mon, 28 Aug 2023 10:00:00 +0000 https://hechingerreport.org/?p=95421


Kids around the country are still suffering academically from the pandemic. But more than three years after schools shut down, it’s hard to understand exactly how much ground students have lost and which children now need the most attention.

Three new reports offer some insights.  All three were produced by for-profit companies that sell assessments to schools. Unlike annual state tests, these interim assessments are administered at least twice a year and help track student progress, or learning, during the year. These companies may have a business motive in sounding an alarm to sell more of their product, but the reports are produced by well-regarded education statisticians.

The big picture is that kids at every grade are still behind where they would have been without the pandemic. All three reports look at student achievement in the spring of 2019, before the pandemic, and compare it to the spring of 2023. A typical sixth grader, for example, in the spring of 2023 was generally scoring much lower than a typical sixth grader in 2019.

The differences are in the details. One report says that students are still behind by the equivalent of four to five months of school, while another puts it at one to three months. A third doesn’t measure months of lost learning but instead notes an alarming 50 percent increase in the number of students who are still performing significantly below grade level.

Depending on how you slice and dice the data, older students in middle school and beyond seem to be in the most precarious position and younger children seem to be more resilient and recovering better. Yet, under a different spotlight, you can see troubling signs even among younger children. This includes the very youngest children who weren’t school age when the pandemic hit.

The most recent data, released on Aug. 28, 2023, is from Curriculum Associates, which sells i-Ready assessments taken by more than 11 million students across the country and focuses on “grade-level” skills.*  It counts the number of students in third grade, for example, who are able to read at a third-grade level or solve math problems that a third grader ought to be able to solve. The standards for what is grade-level achievement are similar to what most states consider to be “proficient” on their annual assessments.

The report concludes that the percentage of students who met grade-level expectations was “flat” over the past school year. This is one way of noting that there wasn’t much of an academic recovery between spring of 2022 and spring of 2023. Students of every age, on average, lagged behind where students had been in 2019.

For example, 69 percent of fourth graders were demonstrating grade-level skills in math in 2019. That dropped to 55 percent in 2022 and barely improved to 56 percent in 2023. (The drop in grade-level performance isn’t as dramatic for seventh and eighth graders, in part, because so few students were meeting grade-level expectations even before the pandemic.)

“It’s dang hard to catch up,” said Kristen Huff, vice president of assessment and research at Curriculum Associates.

To make up for lost ground, students would have to learn more in a year than they typically do. That generally didn’t happen. Huff said this kind of extra learning is especially hard for students who missed foundational math and reading skills during the pandemic.

While most students learned at a typical pace during the 2022-23 school year, Curriculum Associates noted a starkly different and troubling pattern for children who are two or more grade levels behind. Their numbers spiked during the pandemic and have not gone down. Even worse, these children learned less during the 2022-23 school year than during a typical pre-pandemic year. That means they are continuing to lose ground.

Huff highlighted three groups of children who need extra attention: poor readers in second, third and fourth grades; children in kindergarten and first grade; and middle school math students.

There’s been a stubborn 50 percent increase in the number of third and fourth graders who are two or more grade levels behind in reading, Huff said. For example, 19 percent of third graders were that far behind grade level in 2023, up from 12 percent in 2019.  “I find this alarming news,” said Huff, noting that these children were in kindergarten and first grade when the pandemic first hit. “They’re missing out on phonics and phonemic awareness and now they’re thrust into grades three and four,” she said. “If you’re two or more grade levels below in grade three, you’re in big trouble. You’re in big, big, big trouble. We’re going to be seeing evidence of this for years to come.”

The youngest students, who were just two to four years old at the start of the pandemic, are also behind. Huff said that kindergarteners and first graders started the 2022-23 school year at lower achievement levels than in the past. They may have missed out on social interactions and pre-school. “You can’t say my current kindergartener wasn’t in school during the pandemic so they weren’t affected,” said Huff.

Math achievement slipped the most after schools shuttered and switched to remote learning. And now very high percentages of middle schoolers are below grade level in the subject. Huff speculates that they missed out on foundational math skills, especially fractions and proportional reasoning.

Renaissance administered its Star tests to more than six million students around the country. Its spring 2023 report was released on Aug. 9. Like Curriculum Associates, Renaissance finds that “growth is back, but performance is not,” according to Gene Kerns, Renaissance’s chief academic officer.** That means students are generally learning at a typical pace at school, but not making up for lost ground. Depending on the subject and the grade, students still need to recover between one and three months of instruction.

Bars represent the achievement gaps between student scores in spring 2023 and 2019, before the pandemic. Each point is roughly equal to a week of instruction. First grade students in 2023 scored as high in math as first grade students did in 2019; learning losses had been recovered. (Data source: Renaissance)

Math is rebounding better than reading. “Math went down an alarming amount, but has started to go back up,” Kerns said. “We’ve not seen much rebound to reading.” Reading achievement, however, wasn’t harmed as much by school disruptions in the first place.

Kerns generally sees a sunnier story for younger children and a more troubling picture for older students.

The youngest children in kindergarten and first grade are on par with pre-pandemic history, he said. Middle elementary school grades are a little behind but catching up. 

“The older the student, the more lingering the impact,” said Kerns. “The high school data is very alarming. If you’re a junior in high school, you only have one more year. There’s a time clock on this.” 

Seventh and eighth graders showed tiny decreases in annual learning in math and reading. Kerns says he’s “hesitant” to call it a “downward spiral.”

The third report comes from NWEA, which administers the Measures of Academic Progress (MAP) assessment to more than 6 million students. Its spring 2023 data, released on July 11, showed that students on average need four to five months of extra schooling, on top of the regular school year, to catch up. The graph below is a good summary of how far behind students are, expressed in months of learning.

Spring 2023 achievement gaps and months of schooling required to catch up to pre-COVID achievement levels

Like the Renaissance report, the NWEA report shows a bigger learning loss in math than in reading, and indicates that older students have been more academically harmed by the pandemic. They’ll need more months of extra schooling to catch up to where they would have been had the pandemic never happened. It could take years and years to squeeze these extra months of instruction in and many students may never receive them.

From my perspective, Renaissance and NWEA came to similar conclusions for most students. The main difference is that Renaissance has additional assessment data for younger children in kindergarten through second grade, showing a recovery, and high school data, showing a worse deterioration. The discrepancy in their measurements of months of learning loss, whether it’s four to five months or one to three months, is inconsequential. Both companies admit these assumption-filled estimates are imprecise.

One of the most substantial differences among the reports is that Curriculum Associates is sounding an alarm bell for kindergarteners and first graders while Renaissance is not.

The three reports all conclude that kids are behind where they would have been without the pandemic. But some sub-groups are doing much worse than others. The students who are the most behind and continuing to spiral downward really need our attention. Without extra support, their pandemic slump could be lifelong. 

* Correction: An earlier version of this story incorrectly said that more than 3 million students took i-Ready assessments.

** Correction: An earlier version of this story incorrectly spelled Gene Kerns’s last name.

This story about pandemic recovery was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters. 

The post PROOF POINTS: Three views of pandemic learning loss and recovery appeared first on The Hechinger Report.
