Tuesday, July 25, 2023

Opinion Today: Not everyone is against A.I. weapons research. Here’s why.

As we face our own "Oppenheimer moment," it's worth looking at perspectives on A.I. that you may disagree with.

By Jeremy Ashkenas

Graphics Director, Opinion

On Friday, I joined a crowd of popcorn-crunching neighbors to watch a late-night screening of Christopher Nolan's "Oppenheimer." I haven't seen a movie theater so packed — with some polite reshuffling as ticket holders requested their assigned seats from movie hoppers who had sneaked in — since before the pandemic.

The film left me a little cold. Although it is undeniably a technical achievement and a cleverly constructed movie, the way its characters sped through overly clever, self-aware dialogue dealing with the weighty ethics of atomic weapons felt artificial and hollow. People don't talk that way in real life, do they?

Workday conversations, even among scientists and engineers, don't usually swirl around the potentially civilization-altering consequences of the work. And yet somehow, this summer, it feels that they do. We're now grappling with how a new technology may shape the future, and arguing over the perceived dangers of proceeding with what we have recently learned is possible.

In many ways, the question of the year is: "What should be done about A.I.?" In an interview with The Guardian, Nolan said that leading A.I. researchers describe this as their "Oppenheimer moment." But the discussion that's arisen from the advances in A.I. is more deliberate and complex than any movie dialogue.

A raft of opinions has been offered on the race for artificial intelligence, most often calls for restraint, for caution, for a halt, for a slowdown or for a complete pause. But whom are we arguing against? In a guest essay, Alexander Karp, the chief executive of Palantir Technologies, which develops software with military applications, provides us with a clear version of the other side of the debate. He argues for embracing the battlefield-changing potential of artificially intelligent weapons. And he makes the case that, as with the development of the atomic bomb, it would be irresponsible for the United States not to lead this effort.


If you look at the wars around the world without rose-tinted glasses, it's hard to imagine A.I.-powered weaponry not soon playing a decisive role on the battlefield. In recent years, drone combat has changed the nature of infantry and mechanized warfare in the conflicts in Nagorno-Karabakh and Ukraine. It's also not hard to imagine that, given the recent progress in A.I., more fully autonomous and lethal packs of battle drones might be unleashed to cloud the skies in whatever strife flares up in the years to come.

It's not that I want to hear this argument because I agree with it. If the question is whether we should build creative and intelligent deadly machines reminiscent of "The Terminator," the answer to me is clear: We shouldn't, and we should build consensus with our adversaries on that point.

There's also a second similarity between today and when Oppenheimer's team of scientists raced to develop the bomb. As underscored in Nolan's film, there was an element of uncertainty before the first test explosion in New Mexico, known as Trinity: The physicist Edward Teller had calculated that the blast might unleash an out-of-control chain reaction, setting the atmosphere on fire in a nuclear blaze and ending life on earth.

Although Teller's scenario was determined to be exceedingly unlikely and the test continued, a parallel fear lurks in the hearts of those developing artificial intelligence: that a test system, learning from our global public archive of open-source code about machine learning, may be able to gain agency and begin to improve its own design. If allowed to do so, it might rapidly create a form of genuinely alien intelligence that feels no further need for humanity. This is Teller's atmospheric blaze scenario but for A.I.


Karp is not a disinterested party here. His company stands to benefit from increased investment in military A.I. At the same time, in his writing and public speaking over the years, he has maintained a consistent perspective and vigorous critique of the culture of Silicon Valley that considers itself above the grubby problems of national defense. His consistency makes him an excellent candidate to make this case.

I hope you'll read Karp's guest essay to understand better the type of argument that will most likely prevail in the halls of military power unless there's a strong, immediate global effort to defuse it.

If current dreams of artificial intelligence are realized, this technology will soon become a disruptive force throughout society and an autonomous, fearsome weapon for those who would use it to dominate the battlefield. Without credible evidence that countries like China, Russia, Iran and North Korea would sit out this arms race, I find it hard to imagine a version of Karp's view not ultimately becoming our reality.


Here's what we're focusing on today:

More From Opinion

JAMELLE BOUIE

What the Joe Manchin-No Labels Fantasy Gets Wrong About America

For as long as Americans have had partisan political competition, they have hated partisanship itself.

By Jamelle Bouie


PAUL KRUGMAN

An Act of Vehicular NIMBYism

Don't sabotage New York's congestion charge.

By Paul Krugman


MICHELLE GOLDBERG

The Hunger Fed by 'Barbie' and Taylor Swift

Women are longing for communal joy and catharsis.

By Michelle Goldberg


What We're Forced to Leave Behind When It Floods

Dream Hampton explores water as a force of harmony and devastation as climate change affects her home city of Detroit.

By Dream Hampton


GUEST ESSAY

Undoing Health System Monopolies May Be a Lost Cause

With patients' bills out of control, price regulations may be on the horizon.

By Elisabeth Rosenthal


Biden, Psychedelics, Twitter, My New Book — and So Much More

Ezra Klein answers listeners' questions.

By 'The Ezra Klein Show'


PETER COY

Sorry, but I Still Think a Recession Is Coming

A look at history shows that the burden of proof is on the optimists.

By Peter Coy


LETTERS

A Trump-Biden Rematch That Many Are Dreading

Readers discuss a column by Pamela Paul that lamented the prospect. Also: The perils of A.I., and limits on its development.


Subscribe Today

New York Times Opinion curates a wide range of views, inviting rich discussion and debate that help readers analyze the world. This work is made possible with the support of subscribers. Please consider subscribing to The Times with this special offer.

Games Here are today's Mini Crossword, Wordle and Spelling Bee. If you're in the mood to play more, find all our games here.

Forward this newsletter to friends to share ideas and perspectives that will help inform their lives. They can sign up here. Do you have feedback? Email us at opiniontoday@nytimes.com.
