August 18, 2021
Companies' Afghanistan foreign policy

As we have seen with the fall of Kabul, a handful of unelected tech executives play a big role in high-stakes global affairs.

Daniel Zender
Almost as soon as the Taliban retook power in Afghanistan, Facebook, YouTube, Twitter and other large internet companies confronted an uncomfortable decision: What should they do about online accounts that the Taliban began to use to spread their message and establish their legitimacy?

I want us to stop and sit with the discomfort of internet powers that are functioning like largely unaccountable state departments. They don't do this entirely alone, and they don't really have a choice. It's still wild that a handful of unelected tech executives play a role in high-stakes global affairs.

One way for the Taliban to try to gain Afghans' trust is to appear to be a legitimate government on social media, and the internet companies are trying to figure out how to handle it.

Facebook has for years banned Taliban-related accounts as part of its three-tiered policy for "dangerous organizations," and the company said this week that it would continue to remove Taliban accounts and posts that support the group. That includes a help line for Afghan citizens on WhatsApp, which Facebook owns. (The Taliban now control a country, but they aren't allowed to start a Facebook group.)

Citing U.S. sanctions on the Afghan Taliban, YouTube said it would also remove accounts it believes are operated by the group. Twitter doesn't have a blanket ban but told CNN that any posts or videos must comply with rules that prohibit what it considers hate speech or incitements to violence. My colleagues Sheera Frenkel and Ben Decker found examples of pro-Taliban social media accounts and posts that sprang up despite those bans, including a Facebook page that called itself a grocery store but posted pro-Taliban messages in recent days.

Those U.S. internet companies are guided by the laws of their home country and those of the countries in which they operate, and they take their cues from the international community. But ultimately, these are private companies that must make their own choices.

It was Facebook, YouTube and Twitter that decided in January that the words of President Donald J. Trump might inspire additional violence if they were blared on their sites. Twitter had to make a choice when the government of India ordered it to wipe away what the country's leadership considered subversive speech and others believed was essential free expression in a democracy. Facebook opted (by neglect rather than an active decision) not to intervene when Myanmar military personnel turned the social network into a tool for ethnic cleansing.

In each case, unelected technology executives mostly in the United States had to make consequential decisions that reverberated for citizens and elected leaders. And unlike governments, internet companies face virtually no accountability to the public if people disagree with their decisions. Citizens can't vote Mark Zuckerberg out of office.

There is a long and often ugly history of American companies' influencing what happens far from home to protect their interests. Media tycoons have helped start wars and elect their preferred candidates. The position of Facebook, YouTube and other U.S. internet companies feels different. Their products have become so widely used that their influence is not really a choice. They must act as diplomats whether they like it or not.

I almost feel a little sorry for the U.S. internet companies. (Almost.) They wanted to change the world, and they did. Now they have become so powerful they must make hard decisions about an imperfect world. They and we live with the consequences.
- Well-meaning technology has downsides, too: My colleague Jack Nicas writes that Apple's plans to scan iPhones to root out child sexual abuse images ran into criticism from security and privacy experts. Jack explains the uncomfortable reality that technology to go after criminals can hurt ordinary people, and technology that protects ordinary people can also help criminals.
- Self-driving cars are really, really difficult: Bloomberg News says that some employees at Waymo, the driverless-car sibling of Google, lost faith in the progress of computer-piloted cars. Lots of big and small things, including a misplaced wire in a car or traffic cones on the roads, can trip up the technology. (My colleague Cade Metz wrote recently about why driverless cars have progressed greatly but still have a long way to go.)
- The latest internet phenomenon that will pass in five minutes: Vox explains why videos of University of Alabama sorority recruitment are all over TikTok. It seems that videos by people who are confused or angry that they're seeing sorority videos help circulate those sorority videos more on TikTok. The 2021 internet is fun?!?!
We want to hear from you. Tell us what you think of this newsletter and what else you'd like us to explore. You can reach us at ontech@nytimes.com. |