Friday, October 2, 2020

On Tech: Tech isn’t the answer for test taking

Taking tests remotely is a problem. That doesn't mean technology is the solution.


Illustration: Kiel Mutschelknaus

Dear readers, please be extra careful online today. The news that President Trump has tested positive for the coronavirus created the kind of fast-moving information environment in which we might be inclined to read and share false or emotionally manipulative material online. It’s happening already.

I found guides from The Verge and The Washington Post helpful for avoiding contributing to online confusion, unhelpful arguments and false information. A good rule of thumb: If you have a strong emotional reaction to something, step away from your screen.

Technology is not more fair or more capable than people. Sometimes we shouldn’t use it at all.

That’s the message from Meredith Broussard, a computer scientist, artificial intelligence researcher and data journalism professor at New York University.

We discussed the recent surge in schools relying on technology to monitor students taking tests remotely. Broussard told me this is an example of people using technology all wrong.

My colleagues reported this week on software designed to flag students cheating on tests by doing things like tracking eye movements via a webcam. Students told my colleagues and other journalists that it felt callous and unfair to be suspected of cheating because they read test questions aloud, had snacks on their desks or did other things that the software deemed suspicious.


Monitoring test taking is never going to be flawless, and the pandemic has forced many schools into imperfect accommodations for virtual education. But Broussard said the underlying problem is that people too often misapply technology as a solution when they should be approaching the problem differently.

Instead of relying on invasive, imperfect software to keep the test-taking process as normal as possible in wildly abnormal times, she suggested, why not ditch closed-book tests during a pandemic?

“Remote education needs to look a little bit different, and we can all adapt,” Broussard told me.

Broussard, who wrote about the misuse of software to assign student grades for The New York Times’s Opinion section, also said that schools need the option to try software for test proctoring and other uses, assess whether it’s helping students and ditch it without financial penalty if it isn’t.


Broussard’s ways of looking at the world go far beyond education. She wants us all to reimagine how we use technology, period.

There are two ways to think about using software or digital data to help make decisions in education and beyond. One view is that imperfect outcomes call for improving the technology or giving it better data. Some technologists say this about software that tries to identify criminal suspects from photos or video footage and has proved flawed, particularly for darker-skinned people.

Broussard takes the second view. There is no effective way to design software to make social decisions, she said. Education isn’t a computer equation, and neither is law enforcement. Social inputs like racial and class bias are part of these systems, and software will only amplify those biases.

Fixing the computer code is not the answer in those circumstances, Broussard said. Just don’t use computers.


Talking to Broussard flipped a switch in my brain, but it took a while. I kept asking her, “But what about …” until I absorbed her message.

She isn’t saying don’t use software to spot suspicious credit card transactions or screen medical scans for possible cancerous lesions. But Broussard starts with the premise that we need to be selective and careful about when and how we use technology.

We need to be more aware of when we’re trying to apply technology in areas that are inherently social and human. Tech fails at that.

“The fantasy is we can use computers to build a system to have a machine liberate us from all the messiness of human interaction and human decision making. That is a profoundly antisocial fantasy,” Broussard said. “There is no way to build a machine that gets us out of the essential problems of humanity.”

If you don’t already get this newsletter in your inbox, please sign up here.

Facebook can’t quit its bad habits

Everyone is telling Facebook to do one thing. It is doing the opposite.

Those concerned about the spread of false conspiracy theories and misinformation online have singled out the dangers of Facebook’s groups, the gatherings of people with shared interests. Groups, particularly those that are by invitation only, have become places where people can push false health treatments and wild ideas, and plan violent plots.

Facebook recommends groups — including those that discuss extremist ideas — to people as they’re scrolling through their feeds. My colleague Sheera Frenkel told me that almost every expert she knew said that Facebook should stop automated recommendations for groups devoted to false and harmful ideas like the QAnon conspiracy. This is tricky because groups focused on dangerous ideas sometimes hide their focus.

Facebook knows about the problems with group recommendations, and it’s responding by … making even MORE recommendations for groups open to everyone. That was among the changes Facebook announced on Thursday. The company said it would give people who oversee groups more authority to block certain people or topics in posts.

That is Facebook’s answer. Make group administrators responsible for the bad stuff. Not Facebook. This infuriates me. (To be fair, Facebook is doing more to emphasize public groups, not private ones, in which outsiders are less likely to see and report dangerous activities.) But Facebook isn’t fully adopting a safety measure that everyone has been shouting about from the rooftops.

Why? Because it’s hard for people and companies to change.

Like most internet companies, Facebook has always focused on getting bigger. It wants more people in more countries using Facebook more and more avidly. Recommending people join groups is a way to get people to find more reasons to spend time on Facebook.

My colleague Mike Isaac told me that growth can overrule all other imperatives at Facebook. The company says it has a responsibility to protect people and not contribute to the flow of dangerous information. But when protecting people conflicts with Facebook’s growth mandate, growth tends to win.

Before we go …

  • When our tax dollars are spent fighting the wrong problem: My colleague Patricia Cohen reported that some efforts to root out fraud in U.S. state unemployment insurance programs have been misdirected at uncovering people who misstate their eligibility instead of targeting the networks of criminals who steal people’s identities to swindle the government out of money.
  • The pros and cons of pay-advance apps: Apps like Earnin that give people an advance on their paychecks have been lifelines to many people during the pandemic. My colleague Tara Siegel Bernard also writes that the apps come with some of the same concerns as conventional payday lenders: excessive fees or misleading business practices that can trap people in expensive cycles of debt.
  • Seriously, things are bonkers. Please watch something nice: I personally am going to wallow in YouTube videos from the cooking rock star Sohla El-Waylly. Check out that and other recommendations from The New York Times Watching newsletter.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

