Data and research Archives - The Hechinger Report
Covering Innovation & Inequality in Education

OPINION: School counselors are scarce, but AI could play an important role in helping them reach more students

If we are to believe the current rapturous cheerleading around artificial intelligence, education is about to be transformed. Digital educators, alert and available at all times, will soon replace their human counterparts and feed students with concentrated personalized content.

It’s reminiscent of a troubling experiment from the 1960s, immortalized in one touching image: an infant monkey, clearly scared, clutching a crude cloth replica of the real mother it has been deprived of. Next to it is a roll of metal mesh with a feeding bottle attached. The metal mom supplies milk, while the cloth mom sits inert. And yet, in moments of stress, it is the latter the infant seeks succor from.

Notwithstanding its distressing provenance, this image has bearing on a topical question: What role should AI play in our children’s education? And in school counseling? Here’s one way to think about these questions.

With its detached efficiency, an AI system is like the metal mesh mother — capable of delivering information, but little else. Human educators — the teachers and the school counselors with whom students build emotional bonds and relationships of trust — are like the cloth mom.

It would be folly to replace these educators with digital counterparts. We don't need to look far back to validate this claim. Just over a decade ago, we were gripped by the euphoria around MOOCs — massive open online courses built largely around lecture videos accessible to anyone via the internet.

“The end of classroom education!” “An inflection point!” screamed breathless headlines. The reality turned out to be a lot less impressive.

MOOCs wound up playing a helpful supporting role in education, but the stars of the show remained the human teachers; in-person learning environments turned out to be essential. The failures of remote learning during Covid support the same conclusion. A similar narrative likely will (and we argue, ought to) play out in the context of AI and school counseling.

Guidance for our children must keep caring adults at its core. Counselors play an indispensable role in helping students find their paths through the school maze. Their effectiveness is driven by their expertise, empathy and ability to be confidants to students in moments of doubt and stress.

At least, that is how counseling is supposed to work. In reality, the counseling system is under severe stress.

The American School Counselor Association recommends a student-to-counselor ratio of 250-to-1, yet the actual average was 385-to-1 for the 2022–23 school year, the most recent year for which data is available. In many schools the ratio is far higher.

Even for the most dedicated counselor, such a ratio makes it impossible to spend much time getting to know any one student; the counselor has to focus on administrative work like schedule changes and urgent issues like mental health. This constraint on availability has cascading effects, limiting the counselor’s ability to personalize advice and recommendations.

Students sense that their counselors are rushed or occupied with other crises and feel hesitant to ask for more advice and support from these caring adults. Meanwhile, the counselors are assigned extraneous tasks like lunch duty and attendance support, further scattering their attention.

Against this dispiriting backdrop, it is tempting to turn to AI as a savior. Can’t generative AI systems be deployed as virtual counselors that students can interact with and get recommendations from? As often as they want? On any topic? Costing a fraction of the $60,000 annual salary of a typical human school counselor?

Given the fantastic recent leaps in the capabilities of AI systems, answers to all these questions appear to be a resounding yes: There is a compelling case to be made for having AI play a role in school counseling. But it is not one of replacement.

Related: PROOF POINTS: AI essay grading is already as ‘good as an overburdened’ teacher, but researchers say it needs more work

AI’s ability to process vast amounts of data and offer personalized recommendations makes it well-suited for enhancing the counseling experience. By analyzing data on a student’s personality and interests, AI can facilitate more meaningful interactions between the student and their counselor and lay the groundwork for effective goal setting.

AI also excels at breaking down complex tasks into manageable steps, turning goals into action plans. This work is often time-consuming for human counselors, but it’s easy for AI, making it an invaluable ally in counseling sessions.

By leveraging AI to augment traditional approaches, counselors can allocate more time to providing critical social and emotional support and fostering stronger mentorship relationships with students.

Incorporating AI into counseling services also brings long-term benefits: AI systems can track recommendations and student outcomes, and thus continuously improve system performance over time. Additionally, AI can stay abreast of emerging trends in the job market so that counselors can offer students cutting-edge guidance on future opportunities.

And AI add-ons are well-suited to provide context-specific suggestions and information — such as for courses and local internships — on an as-needed basis and to adapt to a student’s changing interests and goals over time.

As schools grapple with declining budgets and chronic absenteeism, the integration of AI into counseling services offers a remarkable opportunity to optimize counseling sessions and establish support systems beyond traditional methods.

Still, it is an opportunity we must approach with caution. Human counselors serve an essential and irreplaceable role in helping students learn about themselves and explore college and career options. By harnessing the power of AI alongside human strengths, counseling services can evolve to meet the diverse needs of students in a highly personalized, engaging and goal-oriented manner.

Izzat Jarudi is co-founder and CEO of Edifii, a startup offering digital guidance assistance for high school students and counselors supported by the U.S. Department of Education’s SBIR program. Pawan Sinha is a professor of neuroscience and AI at MIT and Edifii’s co-founder and chief scientist. Carolyn Stone, past president of the American School Counselor Association, contributed to this piece.

This story about AI and school counselors was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.

PROOF POINTS: Asian American students lose more points in an AI essay grading study — but researchers don’t know why

When ChatGPT was released to the public in November 2022, advocates and watchdogs warned about the potential for racial bias. The new large language model was created by harvesting 300 billion words from books, articles and online writing, which include racist falsehoods and reflect writers’ implicit biases. Biased training data is likely to generate biased advice, answers and essays. Garbage in, garbage out. 

Researchers are starting to document how AI bias manifests in unexpected ways. Inside the research and development arm of the giant testing organization ETS, which administers the SAT, a pair of investigators pitted man against machine in evaluating more than 13,000 essays written by students in grades 8 to 12. They discovered that the AI model that powers ChatGPT penalized Asian American students more than other races and ethnicities in grading the essays. This was purely a research exercise and these essays and machine scores weren’t used in any of ETS’s assessments. But the organization shared its analysis with me to warn schools and teachers about the potential for racial bias when using ChatGPT or other AI apps in the classroom.

AI and humans scored essays differently by race and ethnicity

“Diff” is the difference between the average score given by humans and GPT-4o in this experiment. “Adj. Diff” adjusts this raw number for the randomness of human ratings. Source: Table from Matt Johnson & Mo Zhang “Using GPT-4o to Score Persuade 2.0 Independent Items” ETS (June 2024 draft)

“Take a little bit of caution and do some evaluation of the scores before presenting them to students,” said Mo Zhang, one of the ETS researchers who conducted the analysis. “There are methods for doing this and you don’t want to take people who specialize in educational measurement out of the equation.”

That might sound self-serving for an employee of a company that specializes in educational measurement. But Zhang’s advice is worth heeding in the excitement to try new AI technology. There are potential dangers as teachers save time by offloading grading work to a robot.

In ETS’s analysis, Zhang and her colleague Matt Johnson fed 13,121 essays into one of the latest versions of the AI model that powers ChatGPT, called GPT-4 Omni, or simply GPT-4o. (This version was added to ChatGPT in May 2024, but when the researchers conducted this experiment they used the latest AI model through a different portal.)

A little background about this large bundle of essays: students across the nation had originally written these essays between 2015 and 2019 as part of state standardized exams or classroom assessments. Their assignment had been to write an argumentative essay, such as “Should students be allowed to use cell phones in school?” The essays were collected to help scientists develop and test automated writing evaluation.

Each of the essays had been graded by expert raters of writing on a 1-to-6 point scale with 6 being the highest score. ETS asked GPT-4o to score them on the same six-point scale using the same scoring guide that the humans used. Neither man nor machine was told the race or ethnicity of the student, but researchers could see students’ demographic information in the datasets that accompany these essays.
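The report doesn’t spell out ETS’s prompts or tooling, but the basic setup — handing the model an essay plus the raters’ scoring guide and asking for a 1-to-6 score — can be sketched roughly. The snippet below is a hypothetical illustration using the public OpenAI Python client; the rubric wording and the `score_essay` helper are assumptions, not ETS’s actual pipeline.

```python
# Hypothetical sketch of rubric-based essay scoring with GPT-4o.
# The rubric text, prompt wording and helper name are illustrative
# assumptions; they are not ETS's actual prompts or pipeline.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

RUBRIC = (
    "Score the following argumentative essay on a 1-6 scale, "
    "where 6 is the highest. Consider organization, evidence and "
    "command of language. Reply with the integer score only."
)

def score_essay(essay_text: str) -> int:
    """Ask the model for a single holistic score from 1 to 6."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": RUBRIC},
            {"role": "user", "content": essay_text},
        ],
        temperature=0,  # make repeated scoring as deterministic as possible
    )
    return int(response.choices[0].message.content.strip())
```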

GPT-4o marked the essays almost a point lower than the humans did. The average score across the 13,121 essays was 2.8 for GPT-4o and 3.7 for the humans. But Asian Americans were docked by an additional quarter point. Human evaluators gave Asian Americans a 4.3, on average, while GPT-4o gave them only a 3.2 – roughly a 1.1 point deduction. By contrast, the score difference between humans and GPT-4o was only about 0.9 points for white, Black and Hispanic students. Imagine an ice cream truck that kept shaving off an extra quarter scoop only from the cones of Asian American kids. 
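For readers who want to see how such a gap would be computed, here is a minimal pandas sketch of the group-level comparison described above. The file name and column names are assumptions for illustration; they do not reflect the actual ETS dataset.

```python
# Minimal sketch: average human-vs-GPT score gap, overall and by group.
# The file and column names are illustrative assumptions only.
import pandas as pd

scores = pd.read_csv("essay_scores.csv")  # hypothetical file of paired scores

# Per-essay difference between the human rating and the model rating.
scores["diff"] = scores["human_score"] - scores["gpt_score"]

overall_gap = scores["diff"].mean()
group_gaps = scores.groupby("race_ethnicity")["diff"].mean().sort_values()

print(f"Overall human - GPT gap: {overall_gap:.2f}")
print(group_gaps)  # a larger positive gap means the model scored that group lower
```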

“Clearly, this doesn’t seem fair,” wrote Johnson and Zhang in an unpublished report they shared with me. Though the extra penalty for Asian Americans wasn’t terribly large, they said, it’s substantial enough that it shouldn’t be ignored. 

The researchers don’t know why GPT-4o issued lower grades than humans, and why it gave an extra penalty to Asian Americans. Zhang and Johnson described the AI system as a “huge black box” of algorithms that operate in ways “not fully understood by their own developers.” That inability to explain a student’s grade on a writing assignment makes the systems especially frustrating to use in schools.

This table compares GPT-4o scores with human scores on the same batch of 13,121 student essays, which were scored on a 1-to-6 scale. Numbers highlighted in green show exact score matches between GPT-4o and humans. Unhighlighted numbers show discrepancies. For example, there were 1,221 essays where humans awarded a 5 and GPT awarded 3. Data source: Matt Johnson & Mo Zhang “Using GPT-4o to Score Persuade 2.0 Independent Items” ETS (June 2024 draft)

This one study isn’t proof that AI is consistently underrating essays or biased against Asian Americans. Other versions of AI sometimes produce different results. A separate analysis of essay scoring by researchers from University of California, Irvine and Arizona State University found that AI essay grades were just as frequently too high as they were too low. That study, which used the 3.5 version of ChatGPT, did not scrutinize results by race and ethnicity.

I wondered if AI bias against Asian Americans was somehow connected to high achievement. Just as Asian Americans tend to score high on math and reading tests, Asian Americans, on average, were the strongest writers in this bundle of 13,000 essays. Even with the penalty, Asian Americans still had the highest essay scores, well above those of white, Black, Hispanic, Native American or multi-racial students. 

In both the ETS and UC-ASU essay studies, AI awarded far fewer perfect scores than humans did. For example, in this ETS study, humans awarded 732 perfect 6s, while GPT-4o gave out a grand total of only three. GPT’s stinginess with perfect scores might have affected a lot of Asian Americans who had received 6s from human raters.

ETS’s researchers had asked GPT-4o to score the essays cold, without showing the chatbot any graded examples to calibrate its scores. It’s possible that a few sample essays or small tweaks to the grading instructions, or prompts, given to ChatGPT could reduce or eliminate the bias against Asian Americans. Perhaps the robot would be fairer to Asian Americans if it were explicitly prompted to “give out more perfect 6s.” 

The ETS researchers told me this wasn’t the first time they’ve noticed Asian students being treated differently by a robo-grader. Older automated essay graders, which used different algorithms, have sometimes done the opposite, giving Asian students higher marks than human raters did. For example, an ETS automated scoring system developed more than a decade ago, called e-rater, tended to inflate scores for students from Korea, China, Taiwan and Hong Kong on their essays for the Test of English as a Foreign Language (TOEFL), according to a study published in 2012. That may have been because some Asian students had memorized well-structured paragraphs, while humans easily noticed that the essays were off-topic. (The ETS website says it relies on the e-rater score alone only for practice tests, and uses it in conjunction with human scores for actual exams.)

Asian Americans also garnered higher marks from an automated scoring system created during a coding competition in 2021 and powered by BERT, which had been the most advanced algorithm before the current generation of large language models, such as GPT. Computer scientists put their experimental robo-grader through a series of tests and discovered that it gave higher scores than humans did to Asian Americans’ open-response answers on a reading comprehension test. 

It was also unclear why BERT sometimes treated Asian Americans differently. But it illustrates how important it is to test these systems before we unleash them in schools. Based on educator enthusiasm, however, I fear this train has already left the station. In recent webinars, I’ve seen many teachers post in the chat window that they’re already using ChatGPT, Claude and other AI-powered apps to grade writing. That might be a time saver for teachers, but it could also be harming students. 

This story about AI bias was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

TEACHER VOICE: Everything I learned about how to teach reading turned out to be wrong

Cody Beck reads a book that was assigned by his teacher at Grenada Middle School. Since April, Cody has been on a “homebound” program due to behavior, where he does his work at home and meets with a teacher for four hours each week for instruction. (Photo by Jackie Mader)

When I first started teaching middle school, I did everything my university prep program told me to do in what’s known as the “workshop model.”

I let kids choose their books. I determined their independent reading levels and organized my classroom library according to reading difficulty.

I then modeled various reading skills, like noticing the details of the imagery in a text, and asked my students to practice doing likewise during independent reading time.

It was an utter failure.

Kids slipped their phones between the pages of the books they selected. Reading scores stagnated. I’m pretty sure my students learned nothing that year.

Yet one aspect of this model functioned seamlessly: when I sat on a desk in front of the room and read out loud from a shared classroom novel.

Kids listened, discussions arose naturally and everything seemed to click.

Slowly, the reason for these episodic successes became clear to me: Shared experiences and teacher direction are necessary for high-quality instruction and a well-run classroom.

Over time, I pieced together the idea that my students would benefit most from a teaching model that emphasized shared readings of challenging works of literature; memorization of poetry; explicit grammar instruction; contextual knowledge, including history; and teacher direction — not time practicing skills.

But even as I made changes and saw improvements, doubts nagged at me. By abandoning student choice, and asking kids to dust off Chaucer, would I snuff out their joy of reading? Is Shakespearean English simply too difficult for middle schoolers?

To set my doubts aside, I surveyed the relevant research and found that many of the assumptions upon which the workshop model was founded are simply false — starting with the assumption that reading comprehension depends on “reading comprehension skills.”

There is evidence that teaching such skills has some benefit, but what students really need in order to read with understanding is knowledge about history, geography, science, music, the arts and the world more broadly.

Perhaps the most famous piece of evidence for this knowledge-centered theory of reading comprehension is the “baseball study,” in which researchers gave children an excerpt about baseball and then tested their comprehension. At the outset of the study, researchers noted the children’s reading levels and baseball knowledge; they varied considerably.

Ultimately, the researchers found that it was each child’s prior baseball knowledge and not their predetermined reading ability that predicted their comprehension and recall of the passage.

That shouldn’t be surprising. Embedded within any newspaper article or novel is a vast amount of assumed knowledge that authors take for granted — from the fall of the Soviet Union to the importance of 1776.

Just about any student can decode the words “Berlin Wall,” but they need a knowledge of basic geography (where is Berlin?), history (why was the Berlin Wall built?) and political philosophy (what qualities of the Communist regime caused people to flee from East to West?) to grasp the full meaning of an essay or story involving the Berlin Wall.

Of course, students aren’t born with this knowledge, which is why effective teachers build students’ capacity for reading comprehension by relentlessly exposing them to content-rich texts.

My research confirmed what I had concluded from my classroom experiences: The workshop model’s text-leveling and independent reading have a weak evidence base.

Rather than obsessing over the difficulty of texts, educators would better serve students by asking themselves other questions, such as: Does our curriculum expose children to topics they might not encounter outside of school? Does it offer opportunities to discuss related historical events? Does it include significant works of literature or nonfiction that are important for understanding modern society?

Related: PROOF POINTS: Slightly higher reading scores when students delve into social studies, study finds

In my classroom, I began to choose many books simply because of their historical significance or instructional opportunities. Reading the memoirs of Frederick Douglass with my students allowed me to discuss supplementary nonfiction texts about chattel slavery, fugitive slave laws and the Emancipation Proclamation.

Reading “The Magician’s Nephew” by C. S. Lewis prompted teaching about allusions to the Christian creation story and the myth of Narcissus, knowledge they could use to analyze future stories and characters.

Proponents of the workshop model claim that letting students choose the books they read will make them more motivated readers, increase the amount of time they spend reading and improve their literacy. The claim is widely believed.

However, it’s unclear to me why choice would necessarily foster a love of reading. To me, it seems more likely that a shared reading of a classic work with an impassioned teacher, engaged classmates and a thoughtfully designed final project are more motivating than reading a self-selected book in a lonely corner. That was certainly my experience.

After my classes acted out “Romeo and Juliet,” with rulers trimmed and painted to resemble swords, and read “To Kill a Mockingbird” aloud, countless students (and their parents) told me it was the first time they’d ever enjoyed reading.

They said these classics were the first books that made them think — and the first ones that they’d ever connected with.

Students don’t need hours wasted on finding a text’s main idea or noticing details. They don’t need time cloistered off with another book about basketball.

They need to experience art, literature and history that might not immediately interest them but will expand their perspective and knowledge of the world.

They need a teacher to guide them through and inspire a love and interest in this content. The workshop model doesn’t offer students what they need, but teachers still can.

Daniel Buck is an editorial and policy associate at the Thomas B. Fordham Institute and the author of “What Is Wrong with Our Schools?”

This story about teaching reading was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.

OPINION: Colleges have to do a better job helping students navigate what comes next

Higher education has finally come around to the idea that college should better help prepare students for careers.

It’s about time: Recognizing that students do not always understand the connection between their coursework and potential careers is a long-standing problem that must be addressed.

Over 20 years ago, I co-authored the best-selling “Quarterlife Crisis,” one of the first books to explore the transition from college to the workforce. We found, anecdotally, that recent college graduates felt inadequately prepared to choose a career or transition to life in the workforce. At that time, liberal arts institutions in particular did not view career preparation as part of their role.

While some progress has been made since then, institutions can still do a better job connecting their educational and economic mobility missions; recent research indicates that college graduates are having a hard time putting their degrees to work.

Importantly, improving career preparation can help not only with employment but also with student retention and completion.

I believe that if students have a career plan in mind, and if they better understand how coursework will help them succeed in the workforce, they will be more likely to complete that coursework, persist, graduate and succeed in their job search.

First-generation students, in particular, whose parents often lack college experience, may not understand why they need to take a course such as calculus, which, on the surface, does not appear to help prepare them for most jobs in the workforce.

They will benefit deeply from a clearer understanding of how such required courses connect to their career choices and skills.

Acknowledging the need for higher education to better demonstrate course-to-career linkages — and its role in workforce preparation — is an important first step.

Taking action to improve these connections will better position students and institutions. Better preparing students for the workforce will increase their success rates and, in turn, will improve college rankings on student success measures.

This might require a cultural shift in some cases, but given the soaring cost of tuition, it is necessary for institutions to think about return on investment for students and their parents, not only in intellectual terms but also monetarily.

Such a shift could help facilitate much-needed social and economic mobility, particularly for students who borrow money to attend college.

Related: OPINION: Post-pandemic, let’s develop true education-to-workforce pathways to secure a better future

Recent articles and research about low job placement rates for college graduates often posit that internships provide the needed connection between college and careers. Real-world experience is important, but there are other ways to make a college degree more career relevant.

1. Spell out the connections for students. The class syllabus is one opportunity to make this connection for students. Faculty can explain how different coursework topics and texts translate to career skills and provide real-life examples of those skills at work. In some cases, however, this might be a tough sell for faculty who have spent their careers in the academy and do not see career counseling as part of their job.

But providing this additional information for students does not need to be a big lift and can be done in partnership with campus staff, such as career services counselors. These connections can also be made in course catalogs, on department websites and through student seminars.

2. Raise awareness of realistic careers. Many students start college with the goal of entering a commonly known profession — doctor, lawyer or teacher, to name a few. However, there are hundreds of jobs, such as public policy research and advocacy, with which students may not be as familiar. Colleges should provide more detailed information on a wide range of careers that students may never have thought of — and how coursework can help them enter those fields. Experiential learning can provide good opportunities to sample careers that match students’ interests, to help further determine the right fit.

Increased awareness of job options can also serve as motivation for students as they formulate their goals and plans. Jobs can be described through the same information avenues as the career-coursework connections listed above, along with examples of how coursework is used in each job.

3. Make coursework-career connections a campuswide priority. College leaders must stress to faculty the importance of better preparing students for careers. Economic mobility is of increasing importance to institutions and the general public, and consumers now rely on information about employment outcomes when selecting colleges (e.g., see College Scorecard).

Faculty can be assured that adding career preparation to a college degree does not diminish its educational value — quite the contrary; critical thinking and analytical skills, for example, are of utmost importance to liberal arts programs and prospective employers. Simply demonstrating those links does not change coursework content or objectives.

4. Help students translate their coursework for the job market. Beyond understanding the coursework-to-career linkages, students must know how to articulate them. Job interviews are unnatural for anyone, especially for students new to the workforce — and even more so for those who are the first in their families to graduate from college.

Career centers often provide interview tips to students — again, if the students seek out that help — but special emphasis should be placed on helping students reflect on their coursework and translate the skills and knowledge they have gained for employers.

A portfolio can help them accomplish this, and it can be developed at regular intervals throughout a student’s time on campus, since reflecting on several years of coursework all at once can be challenging. A Senior Year Seminar can further promote workforce readiness and tie together the career skills gained throughout one’s time on campus.

By making these simple changes, institutions can take the lead in making students and the public more aware of the benefits of higher education.

Abby Miller, founding partner at ASA Research, has been researching higher education and workforce development for over 20 years.

This story about college and careers was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.

PROOF POINTS: Some of the $190 billion in pandemic money for schools actually paid off

This image shows a conceptual illustration with a figure standing amidst a variety of floating U.S. dollar bill fragments on a teal background. The pieces of currency are scattered in different orientations, creating a sense of disarray and abstraction.

Reports about schools squandering their $190 billion in federal pandemic recovery money have been troubling. Many districts spent that money on things that had nothing to do with academics, particularly building renovations. Less common, but more eye-popping, were stories about new football fields, swimming pool passes, hotel rooms at Caesars Palace in Las Vegas and even the purchase of an ice cream truck.

So I was surprised that two independent academic analyses released in June 2024 found that some of the money actually trickled down to students and helped them catch up academically.  Though the two studies used different methods, they arrived at strikingly similar numbers for the average growth in math and reading scores during the 2022-23 school year that could be attributed to each dollar of federal aid. 

One of the research teams, which includes Harvard University economist Tom Kane and Stanford University sociologist Sean Reardon, likened the gains to six days of learning in math and three days of learning in reading for every $1,000 in federal pandemic aid per student. Though that gain might seem small, high-poverty districts received an average of $7,700 per student, and those extra “days” of learning for low-income students added up. Still, these neediest children were projected to remain one-third of a grade level behind where low-income students stood in 2019, before the pandemic disrupted education.
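To make the per-dollar figures concrete, here is a quick back-of-the-envelope calculation using the numbers in the paragraph above. It assumes gains scale linearly with aid, which is a simplification of the studies’ actual models.

```python
# Back-of-the-envelope illustration of the per-dollar estimate quoted above.
# Assumes gains scale linearly with aid, which is a simplification.
math_days_per_1k = 6          # days of math learning per $1,000 of aid per student
reading_days_per_1k = 3       # days of reading learning per $1,000 of aid per student
avg_aid_high_poverty = 7700   # average aid per student in high-poverty districts

math_days = math_days_per_1k * avg_aid_high_poverty / 1000
reading_days = reading_days_per_1k * avg_aid_high_poverty / 1000
print(f"~{math_days:.0f} days of math and ~{reading_days:.0f} days of reading recovered")
# -> roughly 46 days of math and 23 days of reading for the average high-poverty district
```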

“Federal funding helped and it helped kids most in need,” wrote Robin Lake, director of the Center on Reinventing Public Education, on X in response to the two studies. Lake was not involved in either report, but has been closely tracking pandemic recovery. “And the spending was worth the gains,” Lake added. “But it will not be enough to do all that is needed.” 

The academic gains per aid dollar were close to what previous researchers had found for increases in school spending. In other words, federal pandemic aid for schools has been just as effective (or ineffective) as other infusions of money for schools. The Harvard-Stanford analysis calculated that the seemingly small academic gains per $1,000 could boost a student’s lifetime earnings by $1,238 – not a dramatic payoff, but not a public policy bust either. And that payoff doesn’t include other societal benefits from higher academic achievement, such as lower rates of arrests and teen motherhood. 

The most interesting nuggets from the two reports, however, were how the academic gains varied wildly across the nation. That’s not only because some schools used the money more effectively than others but also because some schools got much more aid per student.

The poorest districts in the nation, where 80 percent or more of the students live in families whose income is low enough to qualify for the federally funded school lunch program, demonstrated meaningful recovery because they received the most aid. About 6 percent of the 26 million public schoolchildren that the researchers studied are educated in districts this poor. These children had recovered almost half of their pandemic learning losses by the spring of 2023. The very poorest districts, representing 1 percent of the children, were potentially on track for an almost complete recovery in 2024 because they tended to receive the most aid per student. However, these students were far below grade level before the pandemic, so their recovery brings them back to a very low rung.

Some high-poverty school districts received much more aid per student than others. At the top end of the range, students in Detroit received about $26,000 each – $1.3 billion spread among fewer than 49,000 students. One in 10 high-poverty districts received more than $10,700 for each student. An equal number of high-poverty districts received less than $3,700 per student. These surprising differences for places with similar poverty levels occurred because pandemic aid was allocated according to the same byzantine rules that govern federal Title I funding to low-income schools. Those formulas give large minimum grants to small states, and more money to states that spend more per student. 
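As a quick sanity check on the Detroit figure, the per-student amount is simply the total aid divided by enrollment, using the rounded numbers in the paragraph above.

```python
# Sanity check on the Detroit figure, using the rounded numbers above.
total_aid = 1.3e9        # about $1.3 billion in federal pandemic aid
students = 49_000        # "fewer than 49,000 students"
print(round(total_aid / students))  # ~26,531, i.e., "about $26,000 each"
```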

On the other end of the income spectrum are wealthier districts, where 30 percent or fewer students qualify for the lunch program, representing about a quarter of U.S. children. The Harvard-Stanford researchers expect these students to make an almost complete recovery. That’s not because of federal recovery funds; these districts received less than $1,000 per student, on average. Researchers explained that these students are on track to approach 2019 achievement levels because they didn’t suffer as much learning loss.  Wealthier families also had the means to hire tutors or time to help their children at home.

Middle-income districts, where between 30 percent and 80 percent of students are eligible for the lunch program, were caught in between. Roughly seven out of 10 children in this study fall into this category. Their learning losses were sometimes large, but their pandemic aid wasn’t. They tended to receive between $1,000 and $5,000 per student. Many of these students are still struggling to catch up.

In the second study, researchers Dan Goldhaber of the American Institutes for Research and Grace Falken of the University of Washington estimated that schools around the country, on average, would need an additional $13,000 per student for full recovery in reading and math.  That’s more than Congress appropriated.

There were signs that schools targeted interventions to their neediest students. In school districts that separately reported performance for low-income students, these students tended to post greater recovery per dollar of aid than wealthier students, the Goldhaber-Falken analysis shows.

Impact differed more by race, location and school spending. Districts with larger shares of white students tended to make greater achievement gains per dollar of federal aid than districts with larger shares of Black or Hispanic students. Small towns tended to produce more academic gains per dollar of aid than large cities. And school districts that spend less on education per pupil tended to see more academic gains per dollar of aid than high spenders. The latter makes sense: an extra dollar to a small budget makes a bigger difference than an extra dollar to a large budget. 

The most frustrating part of both reports is that we have no idea what schools did to help students catch up. Researchers weren’t able to connect the academic gains to tutoring, summer school or any of the other interventions that schools have been trying. Schools still have until September to decide how to spend their remaining pandemic recovery funds, and, unfortunately, these analyses provide zero guidance.

And maybe some of the non-academic things that schools spent money on weren’t so frivolous after all. A draft paper circulated by the National Bureau of Economic Research in January 2024 calculated that school spending on basic infrastructure, such as air conditioning and heating systems, raised test scores. Spending on athletic facilities did not. 

Meanwhile, the final score on pandemic recovery for students is still to come. I’ll be looking out for it.

This story about federal funding for education was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

PROOF POINTS: This is your brain. This is your brain on screens

One brain study, published in May 2024, detected different electrical activity in the brain after students had read a passage on paper, compared with screens. Credit: Getty Images

Studies show that students of all ages, from elementary school to college, tend to absorb more when they’re reading on paper rather than screens. The advantage for paper is a small one, but it’s been replicated in dozens of laboratory experiments, particularly when students are reading about science or other nonfiction texts.

Experts debate why comprehension is worse on screens. Some think the glare and flicker of screens tax the brain more than ink on paper. Others conjecture that students have a tendency to skim online but read with more attention and effort on paper. Digital distraction is an obvious downside to screens. But internet browsing, texting or TikTok breaks aren’t allowed in the controlled conditions of these laboratory studies.

Neuroscientists around the world are trying to peer inside the brain to solve the mystery. Recent studies have begun to document salient differences in brain activity when reading on paper versus screens. None of the studies I discuss below is definitive or perfect, but together they raise interesting questions for future researchers to explore. 

One Korean research team documented that young adults had lower concentrations of oxygenated hemoglobin in a section of the brain called the prefrontal cortex when reading on paper compared with screens. The prefrontal cortex is associated with working memory and that could mean the brain is more efficient in absorbing and memorizing new information on paper, according to a study published in January 2024 in the journal Brain Sciences. An experiment in Japan, published in 2020, also noticed less blood flow in the prefrontal cortex when readers were recalling words in a passage that they had read on paper, and more blood flow with screens.

But it’s not clear what that increased blood flow means. The brain needs to be activated in order to learn and one could also argue that the extra brain activation during screen reading could be good for learning. 

Instead of looking at blood flow, a team of Israeli scientists analyzed electrical activity in the brains of 6- to 8-year-olds. When the children read on paper, there was more power in high-frequency brainwaves. When the children read from screens, there was more energy in low-frequency bands. 

The Israeli scientists interpreted these frequency differences as a sign of better concentration and attention when reading on paper. In their 2023 paper, they noted that attention difficulties and mind wandering have been associated with lower frequency bands – exactly the bands that were elevated during screen reading. However, it was a tiny study of 15 children and the researchers could not confirm whether the children’s minds were actually wandering when they were reading on screens. 

Another group of neuroscientists in New York City has also been looking at electrical activity in the brain. But instead of documenting what happens inside the brain while reading, they looked at what happens in the brain just after reading, when students are responding to questions about a text. 

The study, published in the peer-reviewed journal PLOS ONE in May 2024, was conducted by neuroscientists at Teachers College, Columbia University, where The Hechinger Report is also based. My news organization is an independent unit of the college, but I am covering this study just like I cover other educational research. 

In the study, 59 children, aged 10 to 12, read short passages, half on screens and half on paper. After reading the passage, the children were shown new words, one at a time, and asked whether they were related to the passage they had just read. The children wore stretchy hair nets embedded with electrodes. More than a hundred sensors measured electrical currents inside their brains a split second after each new word was revealed.

For most words, there was no difference in brain activity between screens and paper. There was more positive voltage when the word was obviously related to the text, such as the word “flow” after reading a passage about volcanoes. There was more negative voltage with an unrelated word like “bucket,” which the researchers said was an indication of surprise and additional brain processing. These brainwaves were similar regardless of whether the child had read the passage on paper or on screens. 

However, there were stark differences between paper and screens when it came to ambiguous words, ones where you could make a creative argument that the word was tangentially related to the reading passage or just as easily explain why it was unrelated. Take for example, the word “roar” after reading about volcanoes. Children who had read the passage on paper showed more positive voltage, just as they had for clearly related words like “flow.” Yet, those who had read the passage on screens showed more negative activity, just as they had for unrelated words like “bucket.”

For the researchers, the brainwave difference for ambiguous words was a sign that students were engaging in “deeper” reading on paper. According to this theory, the more deeply information is processed, the more associations the brain makes. The electrical activity the neuroscientists detected reveals the traces of these associations and connections. 

Despite this indication of deeper reading, the researchers didn’t detect any differences in basic comprehension. The children in this experiment did just as well on a simple comprehension test after reading a passage on paper as they did on screens. The neuroscientists told me that the comprehension test they administered was only to verify that the children had actually read the passage and wasn’t designed to detect deeper reading. I wish, however, the children had been asked to do something involving more analysis to buttress their argument that students had engaged in deeper reading on paper.

Virginia Clinton-Lisell, a reading researcher at the University of North Dakota who was not involved in this study, said she was “skeptical” of its conclusions, in part because the word-association exercise the neuroscientists created hasn’t been validated by outside researchers. Brain activation during a word association exercise may not be proof that we process language more thoroughly or deeply on paper.

One noteworthy result from this experiment is speed. Many reading experts have believed that comprehension is often worse on screens because students are skimming rather than reading. But in the controlled conditions of this laboratory experiment, there were no differences in reading speed: 57 seconds on the laptop compared to 58 seconds on paper –  statistically equivalent in a small experiment like this. And so that raises more questions about why the brain is acting differently between the two media. 

“I’m not sure why one would process some visual images more deeply than others if the subjects spent similar amounts of time looking at them,” said Timothy Shanahan, a reading research expert and a professor emeritus at the University of Illinois at Chicago. 

None of this work settles the debate over reading on screens versus paper. All of these studies ignore the promise of interactive features, such as glossaries and games, which can swing the advantage to electronic texts. Early research can be messy, and that’s a normal part of the scientific process. But so far, the evidence seems to corroborate conventional reading research suggesting that something different is going on when kids log in rather than turn a page.

This story about reading on screens vs. paper was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

Math ends the education careers of thousands of community college students. A few schools are trying something new

ALBANY, Ore. – It’s 7:15 on a cold gray Monday morning in May at Linn-Benton Community College in northwestern Oregon. Math professor Michael Lopez, in a hoodie and jeans, a tape measure on his belt, paces in front of the 14 students in his “math for welders” class. “I’m your OSHA inspector,” he says. “Three sixteenths of an inch difference, you’re in violation. You’re going to get a fine.”

He’s just given them a project they might have to do on the job: figure out the rung spacing on an external steel ladder that attaches to a wall. Thousands of dollars are at stake in such builds, and they’re complicated: Some clients want the fewest possible rungs to save money, others a specific distance between steps. To pass inspection, rungs must be evenly spaced to within one sixteenth of an inch, the top rung exactly flush with the top of the wall.

The exercise could be an algebra problem, but Lopez gives them a six-step algorithm that doesn’t use algebraic letters and symbols. Instead, they get real-world industry variables: tolerances, basic rung spacing, wall height.

Lopez breaks the class into five teams. Each team is assigned different wall heights and client specs, and they get to work calculating where to place the rungs. Lopez will inspect each team’s work and pass or fail the job.
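The article doesn’t reproduce Lopez’s six-step algorithm, but the core calculation — choosing a whole number of equal gaps so the spacing stays near the client’s target and the top rung lands flush with the top of the wall — can be sketched in a few lines. The function below is a hypothetical illustration; the variable names, the 12-inch default and the rounding choices are assumptions, not the course’s worksheet.

```python
# Hypothetical sketch of the ladder-rung layout problem described above.
# Variable names, the 12-inch default spacing and the rounding choice are
# assumptions, not Lopez's actual six-step algorithm.

def rung_positions(wall_height_in: float, target_spacing_in: float = 12.0) -> list[float]:
    """Return evenly spaced rung heights (inches from the ground),
    with the top rung flush with the top of the wall."""
    # Pick the whole number of equal gaps closest to the client's target spacing.
    num_gaps = max(1, round(wall_height_in / target_spacing_in))
    actual_spacing = wall_height_in / num_gaps
    return [round(actual_spacing * i, 4) for i in range(1, num_gaps + 1)]

heights = rung_positions(wall_height_in=110.0)
print(heights)                                        # last value equals the wall height
print(f"spacing = {heights[1] - heights[0]:.4f} in")  # compare against the 1/16-inch tolerance
```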

Math is a giant hurdle for most community college students pursuing welding and other career and technical degrees. About a dozen years ago, Linn-Benton’s administration looked at their data and found that many students in career and technical education, or CTE, were getting most of the way toward a degree but were stopped by a math course, said the college’s president, Lisa Avery. That’s not unusual: Up to 60 percent of students entering community college are unprepared for college-level work, and the subject they most often need help with is math.

The college asked the math department to design courses tailored to those students, starting with its welding, culinary arts and criminal justice programs. The first of those, math for welders, rolled out in 2013.

Math professor Michael Lopez helps a student work through an algorithm for calculating ladder rung placement in his math for welders class. Credit: Jan Sonnenmair for The Hechinger Report

More than a decade later, welding department instructors say that math for welders has had a huge impact on student performance. Since 2017, 93 percent of students taking it have passed, and 83 percent have achieved all the course’s learning goals, including the ability to use arithmetic, geometry, algebra and trigonometry to solve welding problems, school data show. Two years ago, Linn-Benton asked Lopez to design a similar course for its automotive technology program; they began to offer that course last fall.

Math for welders changed student Zane Azmane’s view of what he could do. “I absolutely hated math in high school. It didn’t apply to anything I needed at the moment,” said Azmane, 20, who failed several semesters of math early in high school but last year got a B in the Linn-Benton course. “We actually learned equations I’m going to use, like setting ladder rungs,” he said.

Linn-Benton’s aim is to change how students pursuing technical degrees learn math by making it directly applicable to their technical specialties.

Some researchers think these small-scale efforts to teach math in context could transform how it’s taught more broadly.

Among strategies to help college students who struggle with math, giving them contextual curriculums seems to have “the strongest theoretical base and perhaps the strongest empirical support,” according to a 2011 paper by Columbia University Teachers College professor emerita Dolores Perin. (The Hechinger Report is an independent unit of Teachers College.)

Perin’s paper echoed the results of a 2006 study of math in CTE involving 131 CTE high school teachers and almost 3,000 students. Students in the study who were taught math through an applied approach performed significantly better on two of three standardized tests than those taught math in a more traditional way. (The applied math students also performed better on the third test, though the results didn’t reach the statistical significance threshold.)

Robert Van Etta, a student in Linn-Benton Community College’s math for welders class, marks out the spacing for ladder rungs, part of a lesson in using algebraic concepts to solve real-world challenges. Credit: Jan Sonnenmair for The Hechinger Report

So far, there haven’t been systematic studies of math in CTE at the college level, said James Stone, director of the National Research Center for Career and Technical Education at the Southern Regional Education Board, who ran the 2006 study.

Stone explained how math in context works. Students start with a practical problem and learn a math principle for solving it. Next, they use the principle to solve a similar practical problem, to see that it applies generally. Finally, they apply the principle on paper, such as on a standardized test.

“I like to say math is just like a wrench: It’s another tool in the toolbox to solve a workplace problem,” said Stone. “People learn almost anything better in context because then it has meaning.”

Linn-Benton dean Steve Schilling offers an example. Carpenters use a well-known 3-4-5 rule to get a square corner — lay out two boards at a square angle and mark one board at 3 feet and the other at 4 feet. Now a straight line joining the two marks should measure exactly 5 feet—if it doesn’t, the boards are out of square.

The rule is based on the Pythagorean theorem, a method for calculating the lengths of a right triangle’s sides: a² + b² = c². When explaining to students why the theorem describes the rule, the instructor uses math terms — “adjacent side,” “opposite side,” “hypotenuse” — that they’ll need to use on a math test, said Schilling. When using practical skills like the 3-4-5 rule on a project, “at first, they don’t even realize they’re doing math,” he said.
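For readers who want the arithmetic spelled out: 3² + 4² = 9 + 16 = 25 = 5², which is why a 5-foot diagonal means the corner is square. The snippet below generalizes the same check to any pair of leg measurements; it is a minimal sketch, and the 1/16-inch tolerance is an assumption for illustration, not a trade standard cited in the story.

```python
from math import hypot, isclose

def corner_is_square(leg_a_ft, leg_b_ft, diagonal_ft, tolerance_ft=1/192):
    """3-4-5 check: the measured diagonal should match the hypotenuse the
    Pythagorean theorem predicts (a² + b² = c²). The 1/16-inch tolerance
    (1/192 of a foot) is an illustrative assumption."""
    return isclose(diagonal_ft, hypot(leg_a_ft, leg_b_ft), abs_tol=tolerance_ft)

print(corner_is_square(3, 4, 5))      # True: 9 + 16 = 25, the corner is square
print(corner_is_square(6, 8, 10.2))   # False: the boards are out of square
```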

Related: Federal relief money boosted community colleges, but now it’s going away

Oregon appears to be one of the few places where this approach is spreading, if slowly.

Three hours south of Linn-Benton, Doug Gardner, an instructor in the Rogue Community College math department, had long struggled with a persistent question from students: “Why do we need to know this?” The answer couldn’t just be that they needed it for their next, higher-level math class, said Gardner, now the department’s chair. “It became my life’s work to have an answer to that question.”

Meanwhile, Algebra I was a huge barrier for many Rogue students. About a third of those taking the course or a lower-level math course failed or withdrew. That meant they had to retake the class and likely stay another term to graduate; since many were older students with families and obligations, hundreds dropped out, school administrators said.

Math proficiency is critical to jobs in welding and other technical fields, but a huge hurdle for most community college students pursuing career and technical degrees. Some colleges have succeeded in improving math learning by tailoring instruction to those technical fields. Credit: Jan Sonnenmair for The Hechinger Report

For those who stayed, lack of math knowledge hurt their job skills. Pipe fitters, for example, are among the higher-paid welders, said welding department chair Todd Giesbrecht, but they need a solid understanding of the math involved. “Whether they’re making elbows, whether they’re making dump truck bodies, they’re installing steam pipe, all of those things involve math,” he said.

So, in 2010, Gardner applied for and got a National Science Foundation grant to create two new applied algebra courses. Instead of abstract formulas, students would learn practical ones: how to calculate the volume of a wheelbarrow of gravel and the number of wheelbarrows needed to cover an area, or how much a beam of a certain size and type will bend under a certain load.
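To make the gravel example concrete, here is a minimal sketch of that calculation. The numbers used below, a 6-cubic-foot wheelbarrow and a 3-inch gravel depth, are assumptions for illustration, not figures from Gardner’s course.

```python
from math import ceil

def wheelbarrow_loads(area_sq_ft, depth_in, barrow_cu_ft=6.0):
    """How many wheelbarrow loads of gravel cover a given area to a given depth?
    The 6-cubic-foot barrow capacity is an illustrative assumption."""
    volume_cu_ft = area_sq_ft * (depth_in / 12)    # convert depth to feet
    return ceil(volume_cu_ft / barrow_cu_ft)       # round up to whole loads

# For example, a 10-by-20-foot pad covered 3 inches deep:
print(wheelbarrow_loads(10 * 20, 3))   # 200 sq ft * 0.25 ft = 50 cu ft -> 9 loads
```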

Since then, the pass rate in the applied algebra class has averaged 73 percent while that of the traditional course has continued to hover around 59 percent, according to Gardner. Even modest gains like that are hard to achieve, said Navarro Chandler, a dean at the college. “Any move over 2 percent, we call that a win,” he said.

Linn-Benton Community College asked its math department to design specialized courses for students getting degrees in its welding, automotive technology and other career and technical programs. Tyrese Unger, rear, using a protractor, is in one of the welding program’s applied math courses. Credit: Jan Sonnenmair for The Hechinger Report

One day in May, math professor Kathleen Foster was teaching applied algebra in a sun-drenched classroom on Rogue’s wooded campus when she launched into a lesson on the Pythagorean theorem and why it’s an essential tool for building home interiors and steel structures.

She presented the formula, then jumped to illustrated exercises: What’s the right length for diagonal braces in a lookout tower to ensure that the structure will hold? What length does the diagonal top plate for a stair wall need to be to ensure that the wall’s corners are perfectly square?

James Butler-Kyniston, 30, who is pursuing a degree as a machinist, said that the exercises covered in Foster’s class are directly applicable to his future career. One exercise had them calculate how large a metal sheet would be needed to manufacture a certain number of parts at one time, a skill he’s used in the lab. “Algebraic formulas apply to a lot of things, but since you don’t have any examples to tie them to, you end up thinking they’re useless,” he said.

Related: Proof Points: Shop class sometimes boosts college going, Massachusetts study finds

Unlike at Linn-Benton, students at Rogue in any degree field can take this course, so some of the applied examples don’t work for everyone. Butler-Kyniston said he thinks applied math works better if it’s tailored to a specific set of majors.

Still, Foster’s class could rescue the college plans of at least one student. Kayla LeMaster, 41, is on her second try at a two-year degree. She had to drop out in 2012 after getting injured in a house fire. She’s going for a degree that will let her transfer to the University of Oregon to major in psychology; she hopes to eventually work as a school counselor or in some other job supporting kids.

But her graduation from Rogue hangs by a thread because she needs a math credit. She struggled in the traditional algebra class and had to withdraw, and the same happened in a statistics course. Applied algebra is her last chance. “When you add the alphabet to math, it doesn’t make sense,” she said. By contrast, in the examples in Foster’s class, “you get into that work mode, a job site somewhere, and you can see the problem in your head.” She got an A on her first test. “I’m getting it,” she said.

Professor Michael Lopez, who has a strong background in technical careers himself, introduces an exercise on using math to calculate the spacing when building ladder rungs, a project his welding students might one day have to do on the job. Credit: Jan Sonnenmair for The Hechinger Report

Gardner worries about the consequences of the traditional abstract approach to teaching math. When he was in college, “nobody ever showed me one formula that calculated anything really interesting,” he said. “I just think we’re doing a terrible job. Applied math is so fun.”

Oregon’s leaders appear to see merit in teaching math in context. In 2021, state legislators passed a law requiring all four-year colleges to accept an applied math community-college course called Math in Society as satisfying the math requirement for a four-year degree. In that course, instead of studying theoretical algebra, students learn how to use probability and statistics to interpret the results in scientific papers and how political rules like apportionment and gerrymandering affect elections, said Kathy Smith, a math professor at Central Oregon Community College.

“If I had my way, this is how algebra would be taught to every student, the applied version,” said Gardner. “And then if a student says, ‘This is great, but I want to go further,’ then you sign up for the theoretical version.”

At the level of individual schools, a lack of money and time constrains the spread of applied math. Stone’s team works with high schools around the country to design contextual math courses for career and technical students. They tried to work with a few community colleges, but those colleges’ CTE faculty, many of whom are part-timers on contract, didn’t have time to partner with their math departments to come up with a new curriculum, a yearlong process, Stone said.

Linn-Benton was able to invest the time and money because its math department was big enough to take on the task, said Avery. And both Linn-Benton and Rogue may be outliers because they have math faculty with technical backgrounds: Lopez worked as a carpenter and sheriff’s deputy and served three tours as a machine gunner in Iraq, and Gardner was a construction contractor who still designs houses. “I have up to 16 house plans in the works during construction season,” he said.

Back in Lopez’s class, on a sunny Wednesday, students are done calculating where their ladder rungs should go and now must mark them on the wall. One team struggles. “I don’t understand any of this,” says Keith Perkins, 40, who’s going for a welding degree and wants to get into the local pipe fitters union.

“I know, but you’re not doing the steps in the right order,” says Lopez. “Walk me through it. Tell me what you did, starting with step 1.”

As teams finish up, Lopez inspects their work. “That’s one thirty-second shy. But I wouldn’t worry too much about it,” he tells one group. “OSHA’s not going to knock you down for that.”

Three teams pass, two fail — but this is the place to make mistakes, not out on the job, Lopez tells them.

“This stuff is hard,” said Perkins. “I hated math in school. Still hate it. But we use it every day.”

This story about math in CTE courses was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for our higher education newsletter. Listen to our higher education podcast.

The post Math ends the education careers of thousands of community college students. A few schools are trying something new appeared first on The Hechinger Report.

OPINION: There’s a promising path to get students back on track to graduation https://hechingerreport.org/opinion-theres-a-promising-path-to-get-students-back-on-track-to-graduation/ https://hechingerreport.org/opinion-theres-a-promising-path-to-get-students-back-on-track-to-graduation/#respond Tue, 18 Jun 2024 05:00:00 +0000 https://hechingerreport.org/?p=101558

Rates of chronic absenteeism are at record-high levels. More than 1 in 4 students missed 10 percent or more of the 2021-22 school year. That means millions of students missed out on regular instruction, not to mention the social and emotional benefits of interacting with peers and trusted adults.

Moreover, two-thirds of the nation’s students attended a school where chronic absence rates reached at least 20 percent. Such levels disrupt entire school communities, including the students who are regularly attending.

The scope and scale of this absenteeism crisis demand the next generation of student support.

Fortunately, a recent study suggests a promising path for getting students back in school and back on track to graduation. A group of nearly 50 middle and high schools saw reductions in chronic absenteeism and course failure rates after one year of harnessing the twin powers of data and relationships.

From the 2021-22 to 2022-23 school years, the schools’ chronic absenteeism rates dropped by 5.4 percentage points, and the share of students failing one or more courses went from 25.5 percent to 20.5 percent. In the crucial ninth grade, course failure rates declined by 9.2 percentage points.

These encouraging results come from the first cohort of rural and urban schools and communities partnering with the GRAD Partnership, a collective of nine organizations, to make “student success systems” a common practice.

Student success systems take an evidence-based approach to organizing school communities to better support the academic progress and well-being of all students.

They were developed with input from hundreds of educators and build on the successes of earlier student support efforts — like early warning systems and on-track initiatives — to meet students’ post-pandemic needs.

Related: Widen your perspective. Our free biweekly newsletter consults critical voices on innovation in education.

Importantly, student success systems offer schools a way to identify school, grade-level and classroom factors that impact attendance; they then deliver timely supports to meet individual students’ needs. They do this, in part, by explicitly valuing supportive relationships and responding to the insights that students and the adults who know them bring to the table.

Valuable relationships include not only those between students and teachers, and schools and families, but also those among peer groups and within the entire school community. Schools cannot address the attendance crisis without rebuilding and fostering these relationships.

When students feel a sense of connection to school, they are more likely to show up.

For some students, this connection comes through extracurricular activities like athletics, robotics or band. For others, it may come from somewhere else in school life.

Schools haven’t always focused on connections in a concrete way, partly because relationships can feel fuzzy and hard to track. We’re much better at tracking things like grades and attendance.

Still, schools in the GRAD Partnership cohort show that it can be done.

These schools established “student success teams” of teachers, counselors and others. The teams meet regularly to look at up-to-date student data and identify and address the root causes of absenteeism with insight and input from families and communities, as well as the students themselves.

The teams often use low-tech relationship-mapping tools to help identify students who are disconnected from activities or mentors. One school’s student success team used these tools to ensure that all students were connected to at least one activity — and even created new clubs for students with unique interests. Their method was one that any school could replicate: collaborating on a Google spreadsheet.

Another school identified students who would benefit from a new student mentoring program focused on building trusting relationships.

Related: PROOF POINTS: The chronic absenteeism puzzle

Some schools have used surveys of student well-being to gain insight on how students feel about school, themselves and life in general — and have then used the information to develop supports.

And in an example of building supportive community relationships, one of the GRAD Partnership schools worked with local community organizations to host a resource night event at which families were connected on the spot to local providers who could help them overcome obstacles to regular attendance — such as medical and food needs, transportation and housing issues and unemployment.

There is no one-and-done solution to the current absenteeism crisis. Turning the tide will require ongoing, collaborative efforts guided by data and grounded in relationships that take time to build.

Without these efforts, the consequences will be severe both for individual students and our country as a whole.

Robert Balfanz is a research professor at the Center for Social Organization of Schools at Johns Hopkins University School of Education, where he is the director of the Everyone Graduates Center.

This story about post-pandemic education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.

The post OPINION: There’s a promising path to get students back on track to graduation appeared first on The Hechinger Report.

PROOF POINTS: Teens are looking to AI for information and answers, two surveys show https://hechingerreport.org/proof-points-teens-ai-surveys/ https://hechingerreport.org/proof-points-teens-ai-surveys/#respond Mon, 17 Jun 2024 10:00:00 +0000 https://hechingerreport.org/?p=101528

Two new surveys, both released this month, show how high school and college-age students are embracing artificial intelligence. There are some inconsistencies and many unanswered questions, but what stands out is how much teens are turning to AI for information and to ask questions, not just to do their homework for them. And they’re using it for personal reasons as well as for school. Another big takeaway is that there are different patterns by race and ethnicity, with Black, Hispanic and Asian American students often adopting AI faster than white students.

The first report, released on June 3, came from three nonprofit organizations: Hopelab, Common Sense Media and the Center for Digital Thriving at the Harvard Graduate School of Education. These organizations surveyed 1,274 teens and young adults aged 14-22 across the U.S. from October to November 2023. At that time, only half the teens and young adults said they had ever used AI, with just 4 percent using it daily or almost every day.

Emily Weinstein, executive director for the Center for Digital Thriving, a research center that investigates how youth are interacting with technology, said that more teens are “certainly” using AI now that these tools are embedded in more apps and websites, such as Google Search. Last October and November, when this survey was conducted, teens typically had to take the initiative to navigate to an AI site and create an account. An exception was Snapchat, a social media app that had already added an AI chatbot for its users. 

More than half of the early adopters said they had used AI for getting information and for brainstorming, the first and second most popular uses. This survey didn’t ask teens if they were using AI for cheating, such as prompting ChatGPT to write their papers for them. However, among the half of respondents who were already using AI, fewer than half – 46 percent – said they were using it for help with school work. The fourth most common use was for generating pictures.

The survey also asked teens a couple of open-response questions. Some teens told researchers that they ask AI private questions they would be too embarrassed to ask their parents or friends. “Teens are telling us, ‘I have questions that are easier to ask robots than people,’” said Weinstein.

Weinstein wants to know more about the quality and the accuracy of the answers that AI is giving teens, especially those with mental health struggles, and how privacy is being protected when students share personal information with chatbots.

The second report, released on June 11, was based on a survey conducted by Impact Research and commissioned by the Walton Family Foundation. In May 2024, Impact Research surveyed 1,003 teachers, 1,001 students aged 12-18, 1,003 college students, and 1,000 parents about their use and views of AI.

This survey, which took place six months after the Hopelab-Common Sense survey, demonstrated how quickly usage is growing. It found that 49 percent of students aged 12-18 said they used ChatGPT at least once a week for school, up 26 percentage points since 2023. Forty-nine percent of college undergraduates also said they were using ChatGPT every week for school, but there was no comparison data from 2023.

Among 12- to 18-year-olds and college students who had used AI chatbots for school, 56 percent said they had used them for help in writing essays and other writing assignments. Undergraduate students were more than twice as likely as 12- to 18-year-olds to say using AI felt like cheating, 22 percent versus 8 percent. Earlier 2023 surveys of student cheating by scholars at Stanford University did not detect an increase in cheating with ChatGPT and other generative AI tools. But as students use AI more, their understanding of what constitutes cheating may also be evolving.

More than 60 percent of college students who used AI said they were using it to study for tests and quizzes. Half of the college students who used AI said they were using it to deepen their subject knowledge, perhaps treating it like an online encyclopedia. There was no indication from this survey whether students were checking the accuracy of the information.

Both surveys found differences by race and ethnicity. In the Hopelab-Common Sense survey, 7 percent of Black students, aged 14-22, were using AI every day, compared with 5 percent of Hispanic students and 3 percent of white students. In the open-ended questions, one Black teen girl wrote that, with AI, “we can change who we are and become someone else that we want to become.”

The Walton Foundation survey found that Hispanic and Asian American students were sometimes more likely to use AI than white and Black students, especially for personal purposes. 

These are all early snapshots that are likely to keep shifting. OpenAI is expected to become part of the Apple universe in the fall, including its iPhones, computers and iPads.  “These numbers are going to go up and they’re going to go up really fast,” said Weinstein. “Imagine that we could go back 15 years in time when social media use was just starting with teens. This feels like an opportunity for adults to pay attention.”

This story about ChatGPT in education was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

The post PROOF POINTS: Teens are looking to AI for information and answers, two surveys show appeared first on The Hechinger Report.

PROOF POINTS: As teacher layoffs loom, research evidence mounts that seniority protections hurt kids in poverty https://hechingerreport.org/proof-points-teacher-layoffs-seniority-protections/ https://hechingerreport.org/proof-points-teacher-layoffs-seniority-protections/#comments Mon, 10 Jun 2024 10:00:00 +0000 https://hechingerreport.org/?p=101445

Teacher layoffs are likely this fall as $190 billion in federal pandemic aid expires. By one estimate, schools spent a fifth of their temporary funds on hiring new people, most of them teachers. Those jobs may soon be cut, with many less experienced teachers losing theirs first. The education world describes this policy with a business acronym used in inventory accounting: LIFO, or “Last In, First Out.”

Intuitively, LIFO seems smart. It not only rewards teachers for their years of service, but there’s also good evidence that teachers improve with experience. Not every seasoned teacher is great, but on average, veterans are better than rookies. Keeping them in classrooms is generally best for students.

The problem is that senior teachers aren’t evenly distributed across schools. Wealthier and whiter schools tend to have more experienced teachers. By contrast, high-poverty schools, often populated by Black and Hispanic students, are staffed by more junior teachers. That’s because stressful working conditions at low-income schools prompt many teachers to leave after a short stint. Each year, they’re replaced with a fresh crop of young teachers and the turnover repeats. 

When school districts lay teachers off by seniority, high-poverty schools end up bearing the brunt of the job cuts. The policy exacerbates the teacher churn at these schools. And that churn alone harms student achievement, especially when a large share of teachers are going through the rocky period of adjusting to a new workplace. 

“LIFO is not very good for kids,” said Dan Goldhaber, a labor economist at the American Institutes for Research, speaking to journalists about expected teacher layoffs at the 2024 annual meeting of the Education Writers Association in Las Vegas.

Source: TNTP and Educators for Excellence (2023) “So All Students Thrive: Rethinking Layoff Policy To Protect Teacher Diversity.” A more detailed list of teacher layoff laws by state is in the appendix.

The last time there were mass teacher layoffs was after the 2008 recession. Economists estimate that 120,000 elementary, middle and high school teachers lost their jobs between 2008 and 2012. The vast majority of school districts used seniority as the sole criterion for determining which teachers were laid off, according to a 2022 policy brief published in the journal Education Finance and Policy. In some cases, state law mandated that teacher layoffs had to be done by seniority. LIFO rules were also written into teachers union contracts. In other cases, school leaders simply decided to carry out layoffs this way.

Economists haven’t been able to conclusively prove that student achievement suffered more under LIFO layoffs than under other ways of reducing the teacher workforce. But the evidence points in that direction for children in poverty and for Black and Hispanic students, according to two research briefs by separate groups of scholars that reviewed dozens of studies. For example, in the first two years after the 2008 recession, Black and Hispanic elementary students in Los Angeles Unified School District had 72 percent and 25 percent greater odds, respectively, of having their teacher laid off compared to their white peers, according to one study.

Districts with higher rates of poverty and larger shares of Black and Hispanic students were more likely to have seniority-based layoff policies, according to another study. “LIFO layoff policies end up removing less experienced teachers, sometimes in mass, from a small handful of schools,” wrote Matthew Kraft and Joshua Bleiberg in their 2022 policy brief for the journal, Education Finance and Policy.

Budget cuts can create some messy situations. Terry Grier, a retired superintendent who ran the San Diego school district following the 2008 recession, remembers that his district cut costs by eliminating jobs in the central office and reassigning these bureaucrats, many of whom had teacher certifications, to fill classroom vacancies. To avoid additional layoffs, his school board forced him to transfer teachers from overstaffed schools to fill classroom vacancies elsewhere, Grier said. The union contract specified that forced transfers had to begin with teachers who had the least seniority. That exacerbated teacher turnover at his poorest schools, and the loss of some very good teachers, he said.

“Despite being relatively new to the profession, many of these teachers were highly skilled,” said Grier. 

Losing promising new talent is painful. Raúl Gastón, the principal of a predominantly Hispanic and low-income middle school in Villa Park, Ill., still regrets not having the discretion to lay off a teacher whose poor performance was under review, and being forced instead to let go of an “excellent” rookie teacher in 2015.

“It was a gut punch,” Gastón said. “She had just received a great rating on her evaluation. I was looking forward to what she could do to bring up our scores and help our students.”

The loss of excellent early-career teachers was made stark in Minnesota, where Qorsho Hassan lost her job in the spring of 2020 because of her district’s adherence to LIFO rules. After her layoff, Hassan was named the state’s Teacher of the Year.

Hassan was also a Black teacher, which highlights another unintended consequence of layoff policies that protect veteran teachers: they disproportionately eliminate Black and Hispanic faculty. That undermines efforts to diversify the teacher workforce, which is 80 percent white, while the U.S. public school student population is less than half white. In recent years, districts have had some success in recruiting more Black and Hispanic teachers, but many of them are still early in their careers. 

The unfairness of LIFO layoffs became evident after the 2008 recession. Since then, 20 states have enacted laws to restrict the use of seniority as the main criterion for deciding who gets laid off. But many states still permit it, including Texas. State laws in California and New York still require that layoffs be carried out by seniority, according to TNTP, a nonprofit focused on improving K-12 education, and Educators for Excellence.

While there is a consensus among researchers that LIFO layoffs have unintended consequences that harm both students and teachers, there’s debate about what should replace this policy. One approach would be to lay off less effective teachers, regardless of seniority. But teacher effectiveness ratings, based on student test scores, are controversial and unpopular with teachers. Observational ratings can be subjective and, in practice, these evaluations tend to rate most teachers highly, making it hard to use them to distinguish teacher quality.

Others have suggested keeping a seniority system in place but adding additional protections for certain kinds of teachers, such as those who teach in hard-to-staff, high-poverty schools. Oregon keeps LIFO in place, but in 2021 carved out an exception for teachers with “cultural and linguistic expertise.” In 2022, Minneapolis schools decided that “underrepresented” teachers would be skipped during seniority-based layoffs. Still another idea is to make layoffs proportional to school size so that poor schools don’t suffer more than others.

This story about teacher layoffs was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

The post PROOF POINTS: As teacher layoffs loom, research evidence mounts that seniority protections hurt kids in poverty appeared first on The Hechinger Report.
