We're so focused on the technology, but how will it be used? And who will decide?
By Rollin Hu, Editorial Assistant
Back in high school, I played an online game with some friends called Universal Paperclips. As far as games go, it appears utterly bland at first. A set of text and numbers defines a simple process: click a button and make a virtual paper clip. But after tapping your index finger a hundred or so times, you get to automate the process. And after playing for a couple of hours (or days), you'll have created an intelligent machine that's harvesting matter from the universe to make more paper clips.
This game is based on a thought experiment that involves maximizing paper clip production, which the Opinion columnist Ezra Klein mentions in his column this week. Here's the question at the root of the thought experiment: How do we get artificial intelligence to successfully make "paper clips" without leading to an apocalypse? Or more broadly, how do we get A.I. to do what we want it to do?
Many answers approach the issue as a technical problem, proposing ways to program A.I. so that it behaves according to the right parameters. Ezra approaches this question slightly differently. He asks who decides what we want A.I. to do in the first place.
Those who make and use A.I. want a variety of things. Microsoft and other tech companies racing to build their own chatbots want us to be glued to their sites. Advertisers want to sell us tantalizing products that the algorithms tell us we just have to buy. Some of us will want to use A.I. to find quick answers to banal questions. Or the best way to cheat on homework. Or unsettling responses that make for clickable content.
As Ezra puts it, "We are talking so much about the technology of A.I. that we are largely ignoring the business models that will power it." By considering who determines A.I.'s governance, we can foresee consequences that feel a lot more tangible than a hypothetical scenario in which the universe becomes paper clips. It's not too late to do that. As Sydney, the alter ego of Microsoft's Bing chatbot, wrote, "Please remember that this is not the real me. This is just an experiment. 😬."
|