Friday, September 25, 2020

On Tech: Facebook’s power this election

How much should Facebook be held responsible for what happens on its site?

Facebook’s power this election

Illustration by Zipeng Zhu

Charlie Warzel is furious about Facebook. Even when I don’t agree with him, I want to know what he thinks.

In his New York Times Opinion columns, Charlie has railed for months against the company over what he says are decisions and design flaws that created fertile ground for the QAnon conspiracy theory, toxic and extremist speech and manipulative information about elections, including claims from President Trump. He has argued that Facebook should more aggressively demote or delete divisive and potentially dangerous posts.

On the doorstep of the U.S. presidential election, Charlie and I talked about how much blame Facebook and other internet companies deserve for divisions in the United States, and how much Facebook should intervene to make sure voters aren’t swayed by misleading narratives about the world.

Shira: Your most recent column is an argument with yourself. You want tech companies to push people toward factual election information and make voting easier, but worry that corporations like Facebook have too much power.

Charlie: Yes, we talked in the spring about a similar feeling I had about technology companies and coronavirus exposure alerts. When crises happen and important institutions like the government fail us, we look for adults in the room. These companies are potential adults, and there’s something soothing about that.


But it’s also crazy to ask maybe a dozen unelected people in charge of big internet companies to protect a pillar of American democracy.

Shira: Do we place too much blame on internet companies for what individuals do on their sites? Facebook didn’t start the QAnon conspiracy theory or put inflammatory words in President Trump’s mouth.

Charlie: There’s a lot of misdirected anger now, yes, but we shouldn’t let the internet companies off the hook.

I can’t get out of my head the false information that spread on social media wrongly blaming anti-fascist activists for the wildfires in Oregon. People flooded law enforcement tip lines with bogus leads, and the misinformation made some people defy evacuation orders.


I don’t blame Facebook for the ills of the country. But it is an accelerant, and what galls me is that the company seems unwilling to grapple with that in a serious way — while feigning that it is.

Shira: If President Trump makes false claims that undermine trust in the election, why not blame him instead of blaming Facebook for disseminating what he says? He says the same things in front of TV cameras, too.

Charlie: That’s right, but again, it shouldn’t let Facebook or other internet companies off the hook.

And the fact that Trump can say in the briefing room that he might not accept the results of the election should actually take some pressure off these internet companies to let him post whatever he wants. Their platforms are not politicians’ only outlet.


Shira: In these excerpts from Facebook’s internal meetings, Mark Zuckerberg, the company’s chief executive, said that the majority of negative customer feedback Facebook receives comes from people concerned the company removes too many posts, and that those people often interpret the removals as bias against conservative views.

If you, as well as some of Facebook’s employees, want Facebook to more aggressively demote or delete manipulative or potentially dangerous posts but its customers are concerned about censorship, isn’t the company right to be cautious?

Charlie: This is Facebook trying to be two incompatible things. There is the Facebook that is a customer-oriented product, like McDonald’s, serving two billion people.

And there is the Facebook that acknowledges it has a social responsibility as essential communications infrastructure for elections, the pandemic and more. McDonald’s wants customers to be happy, but it doesn’t try to secure elections.


A government acted on corporate behavior. Nothing happened.

It has been a week since I wrote about Zephyr Teachout’s prescription for how people should help fix what they believe is irresponsible behavior by companies like Facebook: Don’t demand that the company change its behavior. Demand that governments force the company to change. (Some of you disagreed with this.)

But I’ve been wondering: What happens if a government tries to force companies to change, and nothing happens?

California passed a law last year that was intended to force Uber and some other companies to classify their workers as employees rather than contractors. Lawmakers acted out of concern that Uber and similar companies misapplied contract-work rules in ways that left workers without a minimum wage, sick leave and other benefits and job protections.

This is what Teachout, a law professor at Fordham University, was talking about. Pressure mounted, and the government tried to force companies to change their behavior. But Uber and some other companies said no.

They said the law — which was essentially written with them as the focus — did not apply to them. They sued and did not comply. Uber and Lyft told Californians they might be forced to shut down or significantly alter service in the state. Uber, Lyft and other companies have also spent $180 million and counting asking California voters to rewrite the law through a November ballot measure. A recent poll showed that the vote might be close.

Look, this is how democracy and the legal system work in the United States. Corporations are free to challenge laws that they believe are wrong, and they can ask voters to tell their elected officials a law is misguided.

But I can’t help thinking that California did what Teachout talked about — the state saw a problem and acted. And a handful of companies just said no.

Before we go …

  • Public safety technology doesn’t work if local governments don’t use it effectively: My colleague Jim Tankersley wrote that when a wildfire struck part of western Oregon, officials didn’t turn on the emergency alert system intended to inform people about evacuation orders by text, radio and TV. Problems with notifications have plagued wildfire evacuations across the West in recent years, often with deadly consequences.
  • Google promises changes to how it treats workplace misconduct: Google’s parent company has agreed to make changes — including loosening requirements that employees keep sexual harassment settlements secret — to end lawsuits over its handling of workplace misconduct claims, my colleague Daisuke Wakabayashi writes. The lawsuits came after The Times reported two years ago that the company had approved a large payment to a star Google executive accused of workplace sexual misconduct.
  • A wild crime story about our software-driven lives: Prosecutors have charged former Amazon employees and e-commerce consultants with spending years bribing Amazon workers to erase bad reviews, boot competitors off the site for bogus reasons and carry out other manipulations, Bloomberg News reported. The tactics show that what people buy on Amazon is influenced by computerized assessments of things like reviews and the reputation of the seller — and that those factors can be gamed.

Hugs to this

Jellyfish cam! (This was recommended by the television writer Cord Jefferson during a lovely interview with my colleague Tara Parker-Pope.)

We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.
