Technology Archives - The Hechinger Report
https://hechingerreport.org/tags/technology/
Covering Innovation & Inequality in Education

Schools were just supposed to block porn. Instead they sabotaged homework and censored suicide prevention sites
https://hechingerreport.org/schools-were-just-supposed-to-block-porn-instead-they-sabotaged-homework-and-censored-suicide-prevention-sites/
Thu, 25 Apr 2024


This article was originally published by The Markup, a nonprofit, investigative newsroom that challenges technology to serve the public good.

WILDWOOD, Missouri — A middle school student in Missouri had trouble collecting images of people’s eyes for an art project. An elementary schooler in the same district couldn’t access a picture of record-breaking sprinter Florence Griffith Joyner to add to a writing assignment. A high school junior couldn’t read analyses of the Greek classic “The Odyssey” for her language arts class. An eighth grader was blocked repeatedly while researching trans rights.

All of these students saw the same message in their web browsers as they tried to complete their work: “The site you have requested has been blocked because it does not comply with the filtering requirements as described by the Children’s Internet Protection Act (CIPA) or Rockwood School District.”

CIPA, a federal law passed in 2000, requires schools seeking subsidized internet access to keep students from seeing obscene or harmful images online—especially porn. 

School districts all over the country, like Rockwood in the western suburbs of St. Louis, go much further, limiting not only what images students can see but what words they can read. Records obtained from 16 districts in 11 different states show just how broadly schools block content, forcing students to jump through hoops to complete assignments and keeping them from resources that could support their health and safety.

Some of the censorship inhibits the ability to do basic research on sites like Wikipedia and Quora. Students have been blocked from going to websites that web-filtering software categorizes as “education,” “news,” or “informational.” But even more concerning, especially for some students who spoke with The Markup, are blocks against sex education, abortion information, and resources for LGBTQ+ teens—including suicide prevention.


Virtually all school districts buy web filters from companies that sort the internet into categories. Districts decide which categories to block, often making those selections without a complete understanding of the universe of websites under each label—information that the filtering companies consider proprietary. This necessarily leads to overblocking, and The Markup found that districts routinely have to create new, custom categories to allow certain websites on a case-by-case basis. Students and teachers, meanwhile, suffer the consequences of overzealous filtering.
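To make the mechanics concrete, here is a minimal sketch, in Python, of how category-based filtering of this kind generally works. The category names, example domains, and data structures are illustrative assumptions for this article, not the actual categories, data, or policy engine of any vendor or district mentioned here.

```python
# Illustrative sketch of category-based web filtering. All domains,
# categories, and policy choices below are hypothetical examples.

# Vendor-maintained mapping of domains to categories (proprietary in practice,
# so districts can't see the full universe of sites under each label).
DOMAIN_CATEGORIES = {
    "en.wikipedia.org": "education",
    "example-news-site.org": "news",
    "example-games-site.com": "games",
    "example-streaming-site.com": "audio/video",
}

# A district's policy: block whole categories, then carve out exceptions
# one site at a time in a custom allow list.
BLOCKED_CATEGORIES = {"games", "audio/video", "news"}
CUSTOM_ALLOW_LIST = {"example-news-site.org"}  # e.g., unblocked after a teacher's request

def is_blocked(domain: str) -> bool:
    """Return True if this hypothetical district's filter would block the domain."""
    if domain in CUSTOM_ALLOW_LIST:
        return False
    category = DOMAIN_CATEGORIES.get(domain, "uncategorized")
    return category in BLOCKED_CATEGORIES

if __name__ == "__main__":
    for site in DOMAIN_CATEGORIES:
        print(site, "-> blocked" if is_blocked(site) else "-> allowed")
```

Because a district never sees the vendor's full domain-to-category mapping, the only practical fix for a wrongly blocked site is to add it to the custom allow list after someone notices and complains.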

The filters did sometimes keep students from seeing pornographic images, but far more often they kept students from playing online games, browsing social media, and using the internet for legitimate academic work. Records from the 16 districts include blocks that students wouldn’t necessarily notice, representing just elements of a page, like an ad or an image, rather than the entire site, but they reveal that districts’ filters collectively logged over 1.9 billion blocks in just a month.

Related: How AI could transform the way schools test kids

“We’re basically trapped in this bubble, and they’re deciding what we can and can’t see,” said 18-year-old Ali Siddiqui, a senior at a San Francisco Bay Area high school.

The Markup requested records from 26 school districts. Many we selected because they had made headlines for banning library books; others we chose because government records showed they had purchased web filters or because they were mentioned by students interviewed for this article. Although 10 districts did not release the records—almost all claiming it would compromise their cybersecurity—we were still able to compile one of the most comprehensive datasets yet showing how U.S. schools censor the internet.

The blocks raise questions about whether schools’ online censorship runs afoul of constitutional law and federal guidance. The Markup’s investigation revealed that some districts, including Rockwood, continue to block content that’s supportive of LGBTQ+ teens while leaving anti-LGBTQ+ content accessible, something a Missouri court ruled was unconstitutional over a decade ago. What’s more, many districts completely block social media sites, something the Federal Communications Commission said in 2011 was inconsistent with CIPA.

The districts examined by The Markup varied significantly in what they blocked. While many districts blocked YouTube and most blocked social media, only a handful blocked sex education websites.

Catherine Ross, professor emeritus of law at George Washington University and author of a book on school censorship, called the blocks “a very serious concern—particularly for those whose only access is through sites that are controlled by the school,” whether that access is limited because they can’t afford it at home or simply can’t get it.

“We’re setting up a system in which students, by the accident of geography, are getting very different kinds of education,” Ross said. “Do we really want that to be the case? Is that fair?”

Related: Why schools’ efforts to block the Internet are so laughably lame

Survey data show how these inequities play out. The Center for Democracy and Technology asked teachers last year whether internet filtering and blocking can make it harder for students to complete assignments. Among teachers in schools with high rates of poverty, 62 percent said yes; among teachers in schools with lower rates, 50 percent said the same.

Though banned books get more attention than blocked websites in schools, some groups are fighting back. Students in Texas are supporting a state law that would limit what schools can censor, and the American Library Association hosts Banned Websites Awareness Day each fall. The ACLU continues to fight the issue at the local level more than a decade after wrapping up its national “Don’t Filter Me” campaign against school web blocks of resources for the LGBTQ+ community. Yet as the culture wars play out in U.S. schools, Brian Klosterboer, an attorney with the ACLU of Texas, said there are signs the problem is getting worse. “I’m worried there’s a lot more content filtering reemerging.”

“Human sexuality”

When Grace Steldt was in eighth grade in the Rockwood School District, she had to do a research project and decided to study trans rights. As someone who identifies as queer, she was particularly interested in the transgender community’s battle for civil rights. Steldt, now a sophomore, remembers having to do much of her research on her phone to get around the district’s web filter.

She also remembers that one of her teachers that year had a poster on her wall about The Trevor Project, whose site offers suicide prevention resources specifically for LGBTQ+ young people. The teacher wanted students to know her room was a safe space and that there was help available.

But the Rockwood web filter blocks The Trevor Project for middle schoolers, meaning that Steldt couldn’t have accessed it on the school network. Same for It Gets Better, a global nonprofit that aims to uplift and empower LGBTQ+ youth, and The LGBTQ+ Victory Fund, which supports openly LGBTQ+ candidates for public office nationwide. At the same time, the filter allows Rockwood students to see anti-LGBTQ+ information online from fundamentalist Christian group Focus on the Family and the Alliance Defending Freedom, a legal nonprofit the Southern Poverty Law Center labeled an anti-LGBTQ+ hate group in 2016.

Bob Deneau, the school district’s chief information officer, said his department works with teachers to determine the curricular benefit of unblocking certain categories. “When we look at it, we say, ‘Is there educational purpose?’” he explained.

Related: Is early childhood education ready for AI?

The policy is to block first and only unblock in the face of a compelling case.

Rockwood did unblock some LGBTQ+ sites for high schoolers, including The Trevor Project and It Gets Better, in response to individual requests, but they remain blocked for middle and elementary schoolers, and the district records listed some thwarted attempts to visit the sites.

Rockwood School District gets its web-filtering platform, ContentKeeper, from a company called Impero; as of 2021, the platform was reportedly in use in over 300 U.S. school districts. One of its filter categories is called “human sexuality,” and it captures informational resources, support websites, and entertainment news designed for the LGBTQ+ community.

Even though the ACLU’s “Don’t Filter Me” campaign, launched in 2011, urged filtering companies to get rid of LGBTQ+ categories, The Markup investigation found that ContentKeeper and a filter from a company called Securly both still use them. Securly is one of the most popular web filters, used in more than 20,000 schools, and its “sexual content” category covers “websites about sexual health and LGBTQ+ advocacy websites.” Despite the category name, it is not designed to include porn.

Two other filtering companies represented in The Markup’s dataset, iboss and Lightspeed, removed similar categories in response to the ACLU campaign. Lightspeed says it serves 28,000 schools globally; while iboss doesn’t offer school-specific numbers, it works with more than 4,000 organizations worldwide.

The ACLU campaign didn’t focus only on filtering companies. It also pressured districts to unblock the categories themselves. Missouri’s Camdenton R-III School District refused, and the ACLU took it to court. Attorneys argued the district’s filter amounted to viewpoint discrimination, blocking access to supportive LGBTQ+ information while allowing access to anti-LGBTQ+ sites. They won.

Yet complaints have continued. Cameron Samuels first encountered blocks to LGBTQ+ web pages during the 2018–19 school year while working on a class project as a ninth grader in Texas’ Katy Independent School District. Like Rockwood, Katy uses ContentKeeper to filter the web; to Samuels, the LGBTQ+ category of blocks felt like a personal attack. Not only did Samuels find that the LGBTQ+ news source The Advocate was blocked, the teen also couldn’t visit The Trevor Project.

“The district was blocking access to potentially lifesaving resources for me and my LGBT identity,” Samuels said.

By senior year, Samuels was ready to challenge the whole filter category, having gained confidence and experience in community organizing. The ACLU of Texas got involved, helping Samuels file a grievance with Katy ISD. District administrators ruled against them, but the school board ruled in Samuels’ favor on appeal, unblocking the entire “human sexuality” category for high schoolers.

Related: How flawed IQ tests prevent kids from getting help in school

Still, the category remains blocked for younger students, and Anne Russey wants to change that. A mom of two elementary schoolers in Katy ISD and a professional therapist for LGBTQ+ adults, Russey first filed tech support tickets to ask for individual websites to be unblocked. After being denied, she escalated her fight through the same grievance process Samuels took, but the school board would not unblock The Trevor Project in its elementary schools. Seeing no further recourse locally, Russey also filed a discrimination complaint with the U.S. Department of Education’s Office for Civil Rights, and that case remains open.

“My biggest fear is that we lose a student as a result of this filter,” she said. The Trevor Project estimates that at least one LGBTQ+ person between the ages of 13 and 24 attempts suicide every 45 seconds.

“On a less catastrophic level,” Russey said, “kids do start to figure out who they are attracted to in these upper elementary grades.” If kids want to explore LGBTQ+ information, thinking they might identify as part of that community, they would only be able to access negative information on school computers.

Representatives from Impero did not return repeated calls and emails requesting comment about ContentKeeper for this story.

Securly’s vice president of marketing, Joshua Mukai, said only that “the Sexual Content category helps schools avoid overblocking websites related to reproductive health or sexual orientation by enabling them to create policies that specifically allow sites discussing sexual topics for age-appropriate groups.” He offered no comment on the idea that blocking LGBTQ+ advocacy websites through the “sexual content” category is discriminatory.

Reproductive health

Maya Perez, a senior in Fort Worth, Texas, is the president of her high school’s Feminist Club, and she and her peers create presentations to drive their discussions. But research often proves nearly impossible on her school computer. She recently sought out information for a presentation about health care disparities and abortion access.

“Page after page was just blocked, blocked, blocked,” Perez said. “It’s challenging to find accurate information a lot of times.”

She resorted to looking things up on her phone and then typing notes into her computer, which was “really inefficient,” she said. “I just wish I had access to more news sites and informational sites.”

In response to a request for records of blocked websites through November, the Fort Worth Independent School District released only two days’ worth of blocking, showing the five most frequently blocked domains (Spotify, Facebook, TikTok, Roku, and Instagram) as well as a list of categories blocked. “Abortion” did not show up as a blocked category, but search engines were blocked more than 4,500 times, education websites were blocked about 3,800 times, and news websites were blocked 648 times.

Planned Parenthood affiliates around the country end up negotiating directly with local school districts to unblock their website, according to Julia Bennett, the nonprofit’s senior director of digital education and learning strategy. Some schools say yes, some no. 

Alison Macklin spent almost 20 years as a sex educator in Colorado; at the end of her lessons she would tell students that they could find more information and resources on plannedparenthood.org. “Kids would say, ‘No, I can’t, miss,’” she remembered. She now serves as the policy and advocacy director for SIECUS, a national nonprofit advocating for sex education.

Only 29 states and the District of Columbia require sex education, according to SIECUS’ legislative tracking. Missouri is not one of them. The Rockwood and Wentzville school districts in Missouri were among those The Markup found to be blocking sex education websites. The Markup also identified blocks to sex education websites, including Planned Parenthood, in Florida, Utah, Texas, and South Carolina.

In Manatee County, Florida, students aren’t the only ones who can’t access these sites — district records show teachers are blocked from sex education websites too.

The breadth of the internet

Like Perez, Rockwood School District sophomore Brooke O’Dell most frequently runs into blocked websites when doing homework. Sometimes she can’t access PDFs she wants to read. Her workaround is to pull out her phone, find the webpage using her own cellular data, navigate to the file she wants, email it to herself, and then go back to her school-issued Chromebook to open it. When it’s website text she’s interested in, O’Dell uses the Google Drive app on her phone to copy-and-paste text into a Google Doc that she can later access from her Chromebook. She recently had to do this while working on a literary criticism project about the book “Jane Eyre.”

Recounting her frustration, O’Dell bristled at the need for any web filter at all.

“While you’re in school, they are in charge of you,” she said, “but that doesn’t mean they need to control everything you’re doing.”

In Forsyth County Schools in Georgia, which blocks a relatively narrow set of categories, records obtained by The Markup reveal a spate of blocked YouTube videos: One video shows a person reading a novel about Pablo Picasso. Another, a clip of Picasso himself painting. A third is an analysis of the painting “Guernica,” and a final one describes Picasso’s life and impact. Besides inhibiting Picasso research, the filter kept other internet users in the district from viewing history videos, a physics lesson, videos of zoo animals, and children’s songs about the seasons and days of the week.

Among the 16 districts that released records about their blocked websites, 13 shared the categories tied to the blocks. Games and social media were the most frequently blocked categories, along with ads, entertainment, audio and video content, and search engines.

Sites labeled “porn” or “nudity” didn’t crack the top 10 categories blocked in any district. Only in Palm Beach County, Florida, and Seattle were they even in the top 20.

The School District of Manatee County blocks its internet more broadly than almost any other district The Markup analyzed. Internet users in Manatee were blocked from accessing dictionary websites, Google Scholar, academic journals, church websites, and a range of news outlets, including Teen Vogue, Fox News, and a Tampa Bay TV station, according to the records. Manatee’s chief technology officer, Scott Hansen, said many of those websites are available to students and staffers but not guests on the district’s network, such as outside students working on homework during downtime over long sports tournaments or other events. Still, Manatee students can’t access the local public library catalog; most social media platforms; or sites with audio and video content including Fox Nation, Spotify, and SoundCloud.

Like Deneau in the Rockwood School District, Hansen described a filtering policy in Manatee that errs on the side of blocking: if a category isn’t seen as having an explicit educational purpose, it is blocked.

Hansen started working in school district IT before CIPA required filters. “In the early days, they were all terrible,” he said. “They created lots of challenges, but their intent was good and they were needed.” Now, by contrast, Hansen said the most widely used filters do a good job of properly categorizing the internet, which limits the complaints he hears from teachers; few instructors actually request that sites get unblocked.

While that may be true, interviews with students and teachers around the country indicate many of them have simply resigned themselves to being kept from much of the internet. Students don’t necessarily know they can ask that sites get unblocked, and many who do make the request have been denied. The overarching rationale for the filters—keeping students safe—seems unimpeachable, so few people try to fight them. And schools, after all, have the right to limit what they make available online. CIPA lets the FCC refuse internet subsidies to school districts that don’t filter out porn, but the law doesn’t identify any consequence for excessive filtering, giving districts wide latitude to make their own decisions.

In the Center for Democracy and Technology’s survey, nearly three-quarters of students said web filters make it hard to complete assignments. Even accounting for youthful exaggeration, 57 percent of teachers said the same was true for their students.

Kristin Woelfel, a policy counsel at CDT, said she and her colleagues started to think of the web filters as a “digital book ban,” an act of censorship that’s as troubling as a physical book ban but far less visible. “You can see whether a book is on a shelf,” she said. By contrast, decisions about which websites or categories to block happen under the radar.

When Rockwood started using ContentKeeper a few years ago, O’Dell noticed that the filtering became more restrictive. While she recognizes that the blocking prevents students from playing games on their computers, she doesn’t believe technology should play that role.

“It’s not really teaching kids the responsibility of when to pay attention in class,” she said. “It kind of just takes that entire part of learning completely away.”

A stubborn status quo

The American Library Association has been calling for a more nuanced approach to filtering the internet in schools and libraries since 2003, when it failed to convince the Supreme Court that CIPA is unconstitutional. In that case, the ALA argued that the filters violate public library patrons’ right to receive information, a constitutional protection legal scholars trace back to the 1940s. The Supreme Court has upheld the concept multiple times since then, arguing that the First Amendment protects not only the right to speak but the right to receive information and ideas. In the 2003 case, however, the Supreme Court ruled that, as long as people 17 and older could request a website be unblocked, the filters did not unduly limit internet users’ constitutional rights.

Though CIPA makes clear that school districts only have to block a narrow sliver of the internet, it does leave schools with the power to determine what else is inappropriate for their students. In 2010, the U.S. Department of Education lamented that filters put up barriers “to the rich learning experiences that in-school Internet access should afford students.” Shortly after the Department of Education complained about the law’s impact, the FCC emphasized that school districts should not set up blanket blocks on social media websites.

Yet in the more than a decade since, districts have received no additional federal guidance about what they owe students online. The Markup’s investigation showed that many districts are flouting the limited existing guidelines: almost all of them blocked some social media sites in their entirety, and only three of the 16 districts analyzed let students directly request that sites be unblocked. Deborah Caldwell-Stone, director of the ALA’s Office for Intellectual Freedom, said schools that refuse to field such requests are potentially infringing on students’ constitutional rights.

Caldwell-Stone called CIPA “a handy crutch” for censorship that is not justified by the law. “The FCC makes it clear that it’s not [justified], but there’s no remedy for the kind of activity other than going to court,” she said, which is too expensive and time-consuming for many families.

Lawsuits also have limited reach, often changing behavior in only one small part of the country at a time. Rockwood School District’s filter is doing the very thing the ACLU sued Camdenton for more than a decade ago, and the two districts are in the same state, just 150 miles apart. Battling discrimination carried out via web filters is like a game of whack-a-mole in a nation where much of the decision-making is left to more than 13,000 individual school districts.

Bob Deneau, the chief information officer at Rockwood, said he wasn’t aware of the Camdenton case or that the district’s filter policies might be a legal liability.

And besides the cases where filters explicitly block one viewpoint while allowing another—as with LGBTQ+-related content in Rockwood and Katy—the question of what students have a right to see is only getting murkier. In 2023 alone, the American Library Association tracked challenges to more than 9,000 books in school libraries nationwide.

But it doesn’t have to be that way. Schools could use the wide latitude the FCC leaves them to take a more hands-off approach to web filtering. In Georgia’s Forsyth County, where books have been banned from school libraries, Mike Evans, the district’s chief technology and information officer, said websites have not been involved in the controversy.

“We’ll always have different families on one side or another,” Evans said. “Some would rather have things more restricted if they don’t agree with any LGBTQ-type material or video that might be available, but we try to stay away from that type of [filtering] altogether.”

Forsyth County Schools does not have a block category for LGBTQ+ resources.

In Texas, meanwhile, Katy ISD grad Cameron Samuels co-founded Students Engaged in Advancing Texas to fight for open access to information statewide. The group supported a bill, introduced by state Rep. Jon Rosenthal last year, that would prohibit schools from blocking websites with resources for students about human trafficking, interpersonal or domestic violence, sexual assault, or mental health and suicide prevention for LGBTQ+ individuals. It didn’t go anywhere, but Samuels hopes it will in the future—especially because new board members in Katy ISD could mean the websites Samuels fought so hard to unblock get blocked once again.

“Censorship,” Samuels said grimly, “is a winning issue right now.”

How AI could transform the way schools test kids
https://hechingerreport.org/how-ai-could-transform-the-way-schools-test-kids/
Thu, 11 Apr 2024


Imagine interacting with an avatar that dissolves into tears – and being assessed on how intelligently and empathetically you respond to its emotional display.

Or taking a math test that is created for you on the spot, the questions written to be responsive to the strengths and weaknesses you’ve displayed in prior answers. Picture being evaluated on your scientific knowledge and getting instantaneous feedback on your answers, in ways that help you better understand and respond to other questions.

These are just a few of the types of scenarios that could become reality as generative artificial intelligence advances, according to Mario Piacentini, a senior analyst of innovative assessments with the Programme for International Student Assessment, known as PISA.

He and others argue that AI has the potential to shake up the student testing industry, which has evolved little for decades and which critics say too often falls short of evaluating students’ true knowledge. But they also warn that the use of AI in assessments carries risks.

“AI is going to eat assessments for lunch,” said Ulrich Boser, a senior fellow at the Center for American Progress, where he co-authored a research series on the future of assessments. He said that standardized testing may one day become a thing of the past, because AI has the potential to personalize testing to individual students.

PISA, the influential international test, expects to integrate AI into the design of its 2029 test. Piacentini said the Organization for Economic Cooperation and Development, which runs PISA, is exploring the possible use of AI in several realms.

  • It plans to evaluate students on their ability to use AI tools and to recognize AI-generated information.
  • It’s evaluating whether AI could help write test questions, which could potentially be a major money and time saver for test creators. (Big test makers like Pearson are already doing this, he said.)
  • It’s considering whether AI could score tests. According to Piacentini, there’s promising evidence that AI can accurately and effectively score even relatively complex student work.  
  • Perhaps most significantly, the organization is exploring how AI could help create tests that are “much more interesting and much more authentic,” as Piacentini puts it.

When it comes to using AI to design tests, there are all sorts of opportunities. Career and tech students could be assessed on their practical skills via AI-driven simulations: For example, automotive students could participate in a simulation testing their ability to fix a car, Piacentini said.

Right now those hands-on tests are incredibly intensive and costly – “it’s almost like shooting a movie,” Piacentini said. But AI could help put such tests within reach for students and schools around the world.

AI-driven tests could also do a better job of assessing students’ problem-solving abilities and other skills, he said. It might prompt students when they’d made a mistake and nudge them toward a better way of approaching a problem. AI-powered tests could evaluate students on their ability to craft an argument and persuade a chatbot. And they could help tailor tests to a student’s specific cultural and educational context.

“One of the biggest problems that PISA has is when we’re testing students in Singapore, in sub-Saharan Africa, it’s a completely different universe. It’s very hard to build a single test that actually works for those two very different populations,” said Piacentini. But AI opens the door to “construct tests that are really made specifically for every single student.”

That said, the technology isn’t there yet, and educators and test designers need to tread carefully, experts warn. During a recent panel Javeria moderated, Nicol Turner Lee, director of the Center for Technology Innovation at the Brookings Institution, said any conversation about AI’s role in assessments must first acknowledge disparities in access to these new tools.

Many schools still use paper products and struggle with spotty broadband and limited digital tools, she said: The digital divide is “very much part of this conversation.” Before schools begin to use AI for assessments, teachers will need professional development on how to use AI effectively and wisely, Turner Lee said.

There’s also the issue of bias embedded in many AI tools. AI is often sold as if it’s “magic,”  Amelia Kelly, chief technology officer at SoapBox Labs, a software company that develops AI voice technology, said during the panel. But it’s really “a set of decisions made by human beings, and unfortunately human beings have their own biases and they have their own cultural norms that are inbuilt.”

With AI at the moment, she added, you’ll get “a different answer depending on the color of your skin, or depending on the wealth of your neighbors, or depending on the native language of your parents.”  

But the potential benefits for students and learning excite experts such as Kristen Huff, vice president of assessment and research at Curriculum Associates, where she helps develop online assessments. Huff, who also spoke on the panel, said AI tools could eventually not only improve testing but also “accelerate learning” in areas like early literacy, phonemic awareness and early numeracy skills. Huff said that teachers could integrate AI-driven assessments, especially AI voice tools, into their instruction in ways that are seamless and even “invisible,” allowing educators to continually update their understanding of where students are struggling and how to provide accurate feedback.

PISA’s Piacentini said that while we’re just beginning to see the impact of AI on testing, the potential is great and the risks can be managed.  

“I am very optimistic that it is more an opportunity than a risk,” said Piacentini. “There’s always this risk of bias, but I think we can quantify it, we can analyze it, in a better way than we can analyze bias in humans.”

This story about AI testing was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.

OPINION: Artificial intelligence can be game-changing for students with special needs
https://hechingerreport.org/opinion-artificial-intelligence-can-be-game-changing-for-students-with-special-needs/
Mon, 01 Apr 2024


Much has been made of artificial intelligence’s potential to revolutionize education. AI is making it increasingly possible to break down barriers so that no student is ever left behind.

This potential is real, but only if we are ensuring that all learners benefit.

Far too many students, especially those with special needs, do not progress as well as their peers do academically. Meanwhile, digital media, heavily reliant on visuals and text, with audio often secondary, is playing an increasing role in education.

For a typical user in most cases, this is fine. But not for blind or deaf students, whose sensory limitations frequently impede their access to quality education. The stakes are much higher for these students, and digital media often underserves them.

That’s why the development of AI-powered tools that can accommodate all learners must be a priority for policymakers, districts and the education technology industry.

Related: ‘We’re going to have to be a little more nimble’: How school districts are responding to AI

Good instruction is not a one-way street where students simply absorb information passively. For learning content to be most effective, the student must be able to interact with it. But doing so can be especially challenging for students with special needs working with traditional digital interfaces.

A mouse, trackpad, keyboard or even a touch screen may not always be appropriate for a student’s sensory or developmental capabilities. AI-driven tools can enable more students to interact in ways that are natural and accessible for them.

For blind and low-vision students

For blind and low-vision students, digital classroom materials have historically been difficult to use independently. Digital media is visual, and to broaden access, developers usually have to manually code descriptive information into every interface.

These technologies also often impose a rigid information hierarchy that the user must tab through with keys or gestures. The result is a landscape of digital experiences that blind and low-vision students either cannot access at all or experience in a form that lacks the richness of the original.

For these students, AI-powered computer vision offers a solution — it can scan documents, scenes and apps and then describe visual elements aloud through speech synthesis. Coupled with speech recognition, this allows seamless conversational navigation without rigid menus or keyboard commands.

Free tools like Ask Envision and Be My Eyes demonstrate this potential. Using just an AI-enabled camera and microphone, these apps can capture and explain anything the user points them toward, and then answer follow-up questions.
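As a rough illustration of the document-reading half of that pipeline, and not the actual implementation of Ask Envision, Be My Eyes, or any other product, the sketch below chains off-the-shelf optical character recognition, text-to-speech, and speech recognition in Python. The specific libraries and the filename are assumptions made for the example.

```python
# Rough sketch: photograph a document, read it aloud, then listen for a
# spoken follow-up question. Library choices (pytesseract, pyttsx3,
# SpeechRecognition) are illustrative assumptions, not what the apps
# mentioned above actually use.
from PIL import Image
import pytesseract               # OCR: image -> text
import pyttsx3                   # offline text-to-speech
import speech_recognition as sr  # microphone -> text

def describe_document(image_path: str) -> str:
    """Extract the text from a photographed page and speak it aloud."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.say(text or "No readable text found on this page.")
    engine.runAndWait()
    return text

def listen_for_question() -> str:
    """Capture one spoken follow-up question from the microphone."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)  # cloud speech-to-text

if __name__ == "__main__":
    page_text = describe_document("worksheet.jpg")  # hypothetical file
    question = listen_for_question()
    # A production assistive tool would add image captioning for scenes and
    # app interfaces, and would pass page_text and question to a language or
    # vision model to answer the follow-up; that step is omitted here.
    print("Student asked:", question)
```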

These technologies have the potential to allow blind and low-vision students to get the full benefit of the same engaging, personalized ed tech experiences that their peers have been using for years.

For deaf and hard-of-hearing students

In some ways, the visually oriented world of digital media is an ideal fit for deaf and hard-of-hearing students. Audio is often a secondary consideration, particularly once users can read.

In cases in which audio is required for comprehension, like with video, the accommodation most digital developers provide is text-based captioning. Unfortunately, this means that a user must already be a proficient reader.

For younger learners, or any learner who does not read fluently or quickly, translation into sign language is a preferable solution. AI can be of service here, translating speech and text into animated signs while computer vision reads the user’s gestures and translates them into text or commands.

There are some early developments in this area, but more work is needed to create a fully sign language-enabled solution.

For the youngest learners

For young learners, even those without diagnosed disabilities, developmentally appropriate interactions with conventional desktop/mobile apps remain a challenge. A young child cannot read or write, which makes most text-based interfaces impossible for them. And their fine motor control is not fully developed, which makes using a mouse or keyboard or trackpad more difficult.

AI voice controls address these problems by enabling students to simply speak requests or responses, a more natural interaction for these pre-readers and -writers. Allowing a child to simply ask for what they want and verbally answer questions gives them a more active role in their learning.

Voice control may also enable a more reliable assessment of their knowledge, as there are fewer confounding variables when the student is not trying to translate what they understand into an input that a computer will understand.

Computer vision can smooth over text-based methods of interaction. For example, username/password login forms can be replaced with QR codes; many school-oriented systems have already done so.

Computer vision can also be used to enable interactions between the physical and digital world. A student can complete a task by writing or drawing on paper or constructing something from objects, and a computer can “see” and interpret their work.

Using physical objects can be more developmentally appropriate for teaching certain concepts. For example, having a child count with actual objects is often better than using digital representations. Traditional methods can also be more accurate in some cases, such as practicing handwriting with pencil and paper instead of a mouse or trackpad.

Even without physical objects, computer vision can enable the assessment of kinesthetic learning, like calculating on fingers or clapping to indicate syllables in a word.

Related: STUDENT VOICE: Teachers assign us work that relies on rote memorization, then tell us not to use artificial intelligence

A major hurdle in education is that although every student is unique, we have not had the tools or resources to truly tailor their learning to their individualized strengths and needs. AI technology has the potential for transformative change.

The responsibility falls on all of us — districts, policymakers and the ed tech industry — to collaborate and ensure that AI-powered accessibility becomes the norm, not the exception.

We must share knowledge and urgently advocate for policies that prioritize and fund the swift deployment of these game-changing tools to all learners. Accessibility can’t be an afterthought; it must be a top priority baked into every program, policy and initiative.

Only through concerted efforts can we bring the full potential of accessible AI to every classroom.

Diana Hughes is the vice president of Product Innovation and AI at Age of Learning.

This story about AI and special needs students was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.
