Thursday, October 8, 2020

On Tech: False rumors often start at the top

Powerful people must now anticipate how their words might be twisted into weapons online.


We know that false information spreads online like the world’s worst game of telephone.

But we don’t talk enough about the role of people in charge who say too little or the wrong things at important moments, creating conditions for misinformation to flourish.

Think about the recent rumors and outrage that flew around President Trump’s health, the wildfires in Oregon and the message of a Netflix film. Ill-considered communication from those at the top — including the president himself — made cycles of bogus information and misplaced anger even worse.

Every word that powerful people say matters. It may not be fair, but they must now anticipate how their words might be twisted — intentionally or not — into weapons in the online information war.

For one example, look at Oregon, where a tweet and other poorly communicated information from the police contributed to bogus rumors that left-wing activists deliberately started wildfires.

“We ask you to demonstrate peacefully and without the use of fire,” the police in Portland posted. There was no evidence that protesters were setting fires, but people seized on this and other odd or ambiguous official information as evidence that left-wing provocateurs at the Portland protests were responsible for wildfires.


Local officials, including the Chamber of Commerce in Sioux Falls, S.D., also spread false rumors over the summer that left-wing protesters were headed to their town to start trouble.

None of this was true, but truth doesn’t matter in internet information soup. Wrong or ill-considered official statements can confirm what people already suspected.

The same thing happened when Netflix unleashed a clueless marketing campaign to promote a film called “Cuties.” My colleague described the movie as a nuanced exploration of gender and race, and of how society dangerously blurs the line between girl empowerment and sexual exploitation. But Netflix’s promotional materials, including an image of tween girls posing in dance clothes, gave the false impression that the movie sexualized children.

In short, Netflix’s communication projected the idea that its own movie was the opposite of what it really was. Some politicians, parents and a Texas prosecutor called the film child pornography and pressed Netflix to pull it. Outcry about the movie has been amplified by supporters of the QAnon conspiracy theory, the false idea that top Democrats and celebrities are behind a global child-trafficking ring.


I want to be clear: There are always people who twist information to their own ends. People might have misplaced blame for the wildfires or dumbed down the complexities of “Cuties” even if official communications had been perfectly clear from the jump. But by not choosing their words and images carefully, the people in charge provided fuel for misinformation.

We see over and over that unclear, wrong or insufficient information at the outset can be hard to overcome.

Conspiracy theories about President Trump’s coronavirus diagnosis and health condition in the last week were fueled by people close to the president misspeaking or obfuscating what was happening. And the White House’s history of spreading false information contributed to a lack of trust in the official line. (My colleague Kevin Roose also wrote about this fueling wild speculation about the president’s health.)

Nature abhors a vacuum, and the internet turns a vacuum into conspiracies. All of us have a role to play in not contributing to misinformation, but experts and people in positions of power shoulder even more responsibility for not creating the conditions for bogus information to go wild.

If you don’t already get this newsletter in your inbox, please sign up here.


Facebook is afraid. That’s good.

Facebook is expanding its blackout period for political and issue-related ads in the United States to cover days or longer after Election Day — a period in which officials might still be counting votes in the presidential election and other contests.

I want to make two points. First, Facebook’s ad blackout might be smart or it might be ineffectual, but either way it is small potatoes.

Look at your Facebook feed. A lot of the overheated and manipulative garbage you see did not pay to be there. Those posts are there because they make people angry or happy, and Facebook’s computer systems circulate the stuff that generates an emotional reaction.

Yes, it’s extra galling if Facebook makes money directly from lies and manipulations. That’s a big reason some civil rights groups and company employees have called on internet companies to take a hard line against political ads or to ban them. But I suspect that most of the stuff that might rile people up if votes are still being counted after Election Day will be unpaid posts, including from President Trump — not ads.

Second, I am going to say something nice about Facebook. With the company’s ban this week on groups and pages that identify with the QAnon conspiracy theory, and its gradually broadening crackdown on attempted voter intimidation and premature declarations of election victory, Facebook is showing the courage of its convictions.

This is different. Too often the company myopically fixates on technical rules, not principles, and caves to its self-interest.

Facebook is taking a different tack in part because it doesn’t want to be blamed — as the company was four years ago — if there is confusion or chaos around the election. I love that Facebook is a little bit afraid.

It’s healthy for the company to ask itself: What if things go wrong? That’s something Facebook has often failed to do, with disastrous consequences.

Before we go …

  • We are all conspiracists now: Kevin Roose, a technology columnist for The New York Times, writes that conspiracy theories are a symptom of the broader erosion of authority in the internet age. “How easily the conspiracist’s creed — that the official narrative is always a lie, and that the truth is out there for those willing to dig for it themselves — has penetrated our national psyche,” Kevin writes.
  • LinkedIn contains multitudes: During the pandemic and protests against racial injustice, the typically blah workplace social network has become a thriving outlet for Black professionals to express both fun stuff and grief about racial discrimination and alienation on the job, Ashanti M. Martin wrote for The Times. Some LinkedIn users said the company didn’t know how to handle it.
  • Raining cash on internet video stars: A small app called Triller is trying to steal stars from TikTok by paying them for just about anything, including a helicopter for a video shoot and a leased Rolls-Royce with a “TRILLER” vanity plate, my colleague Taylor Lorenz writes. My question: How long can Triller keep spending like this?


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.
