Friday, April 16, 2021

On Tech: The race for attention on YouTube

What drives us to get into heated fights online and watch hateful videos?

The race for attention on YouTube

Illustration by Daniel Zender

When we get caught up in heated arguments with our neighbors on Facebook or in politically charged YouTube videos, why do we do it? That's the question my colleague Cade Metz wants us to ask ourselves and the companies behind our favorite apps.

Cade's most recent article is about Caolan Robertson, a filmmaker who for more than two years helped make videos with far-right YouTube personalities that he says were intentionally provocative and confrontational — and often deceptively edited.

Cade's reporting is an opportunity to ask ourselves hard questions: Do the rewards of internet attention encourage people to post the most incendiary material? How much should we trust what we see online? And are we inclined to seek out ideas that stoke our anger?

Shira: How much blame does YouTube deserve for people like Robertson making videos that emphasized conflict and social divisions — and in some cases were manipulated?

Cade: It's tricky. In many cases these videos became popular because they confirmed some people's prejudices against immigrants or Muslims.

But Caolan and the YouTube personalities he worked with also learned how to play up or invent conflict. They could see that those kinds of videos got them attention on YouTube and other websites. And YouTube's automated recommendations sent a lot of people to those videos, too, encouraging Caolan to do more of the same.


One of Facebook's executives recently wrote that his company mostly isn't to blame for pushing people toward provocative and polarizing material, and that it's just what people want. What do you think?

There are all sorts of things that amplify our inclination toward what is sensational or outrageous, including talk radio, cable television and social media. But it's irresponsible for anyone to say that's just how some people are. We all have a role to play in not stoking the worst of human nature, and that includes the companies behind the apps and websites where we spend our time.

I've been thinking about this a lot in my reporting on artificial intelligence technologies. People try to distinguish between what people do and what computers do, as though they are completely separate. They're not. Humans decide what computers do, and using computers in turn changes what humans do. That's one reason I wanted to write about Caolan. He is taking us behind the curtain to see the forces, both of human nature and of tech design, that influence what we do and how we think.

What should we do about this?

I think the most important thing is to think about what we're really watching and doing online. Where I get scared is thinking about emerging technologies including deepfakes that will be able to generate forged, misleading or outrageous material on a much larger scale than people like Caolan ever could. It's going to get even harder to know what's real and what's not.

Isn't it also dangerous if we learn to mistrust anything that we see?

Yes. Some people in technology believe that the real risk of deepfakes is people learning to disbelieve everything — even what is real.


How does Robertson feel about making YouTube videos that he now believes polarized and misled people?

On some level he regrets what he did, or at the very least wants to distance himself from it. But he is now essentially using the tactics he once deployed to make extreme right-wing videos to make extreme left-wing ones, doing on one political side what he used to do on the other.

If you don't already get this newsletter in your inbox, please sign up here.

If you've found this newsletter helpful, please consider subscribing to The New York Times — with this special offer. Your support makes our work possible.


Before we go …

  • Why Amazon workers voted no on a union: My colleagues Karen Weise and Noam Scheiber talked to some Amazon workers at the Alabama warehouse where employees overwhelmingly voted against unionization. The workers said that Amazon's pay and health benefits were a powerful incentive to side with the company.
  • The recent police killings of Adam Toledo in Chicago and Daunte Wright in Minnesota were both recorded by police body cameras. In a conversation last year, my colleague Ashley Southall discussed the benefits and limits of law enforcement body cameras. It's also worth reading this Twitter thread from Omar Wasow, a Princeton University professor, about the public witnessing state violence.
  • Documenting the (un)friendly skies: What happens when an often silly Instagram account about people on airplanes meets a pandemic? "You can watch the trajectory of the account going from the crazy stuff that people do that makes us giggle and laugh to the increase of physical and verbal abuse," the woman behind Passenger Shaming told The Washington Post.

Hugs to this

Listen to a Bach chorale prelude for the organ, re-created on a 1980s-era Commodore 64 computer. (Here is more information on the technology behind that eerily beautiful computer music.)

We want to hear from you. Tell us what you think of this newsletter and what else you'd like us to explore. You can reach us at ontech@nytimes.com.

