A Break From Consumer Tech
A computer running artificial intelligence software at a contest last year in Beijing defeated two teams of doctors in accurately recognizing maladies in magnetic resonance images. Mark Schiefelbein/Associated Press
Each week, technology reporters and columnists from The New York Times review the week's news, offering analysis and maybe a joke or two about the most important developments in the tech industry. Want this newsletter in your inbox? Sign up here.
Hi. I'm Steve Lohr. I write about the technology industry, from the old titans like IBM to start-ups working on the future of artificial intelligence. |
In this week's newsletter, we're largely taking a break from the consumer internet giants and the various controversies swirling around them. (Exception: Amazon's surprise announcement on Thursday that it would abandon its plans to build a headquarters in New York. See below.)
Instead, we're going to look at recent research and reports that track how digital technology is moving into mainstream industries, and the implications. |
Let's start with a smart piece this week by my colleague Erin Griffith, "The Next Wave of 'Unicorn' Start-Ups." Erin asked CB Insights, a research firm that studies start-ups and venture capital, to create a list of the next wave of young companies likely to reach valuations of $1 billion or more. She then reported on some of the 50 companies.
The first round of well-known unicorns, led by Uber and Airbnb, she writes, exploited smartphones and cloud computing to upend old-line businesses. The next companies poised to become unicorns are going narrower, deeper and into more fields.
"They largely focus on software," she writes, "for specific industries like farms, banks and life sciences companies." |
That kind of broad-based adoption across the nation's $20 trillion economy is what will deliver growth and higher living standards for many. Yet the advance of data-driven artificial intelligence will also provide real-world evidence of how much and how quickly this technology may widen income disparities and kill jobs. |
The short answer, it seems, is not as fast as techno-pessimists fear. A.I.'s progress is impressive, but limited. An article this week by Cade Metz, "A.I. Shows Promise Assisting Physicians," underlines that point. |
The piece describes an A.I. research program whose accuracy matched or slightly surpassed human physicians in diagnosing common childhood diseases like influenza. The software was trained on the medical histories, lab tests and other clinical data in more than 600,000 electronic health records of children in southern China. |
It was an encouraging demonstration. But the experimental system relied on the easy access to personal data in China, where privacy regulations are less restrictive, and was confined to diagnosing common ailments. |
That kind of incremental, step-by-step approach is also the counsel for business in a new book by Thomas Davenport, "The AI Advantage: How to Put the Artificial Intelligence Revolution to Work" (MIT Press). Mr. Davenport, a professor of information technology and management at Babson College, is a seasoned expert in using digital data to streamline operations and spot opportunities. And he's been at it since well before the current A.I. wave, which is both a technological evolution and a branding craze.
His advice to mainstream companies is that the best course is often to use basic A.I. tools to automate mundane office tasks in operations like accounting, billing, payments and customer service — and to avoid "moon shots." |
One of Mr. Davenport's prime examples of a misguided moon shot was IBM's initial efforts to apply its Watson technology to diagnosing cancer. That was a high-profile science project that proved more difficult, time-consuming and costly than anticipated, though IBM continues that work with several leading cancer centers. |
Speaking of IBM, the company held its annual Think conference in San Francisco this week, drawing more than 25,000 attendees. Its executives presented its Watson and cloud technology as the trusted path for companies across the industrial spectrum that want to embrace A.I.
IBM cannot compete head-on with Amazon, Microsoft and Google in the big-spending game of building out massive data centers to provide the infrastructure layer of cloud computing to one and all. So it is seeking to shift the competition. |
IBM talks about "chapter two" in the cloud market. Increasingly, it says, companies will run not only new applications but also their legacy software in the cloud, whether in private clouds in their own data centers or on IBM's cloud. The next stage of cloud computing, Virginia Rometty, IBM's chief executive, told Jon Fortt of CNBC, "is going to be driven by the modernization of mission-critical apps. That's our sweet spot."
And in a nod to reality, IBM announced a Watson Anywhere initiative: Its A.I. technology will run on the popular clouds of Amazon, Microsoft or Google as well as IBM's cloud. |
In other news: |
■ In an article this week, written before Amazon's retreat, J. David Goodman, a City Hall reporter for The Times, deftly explained the shifting politics in "Why Amazon Is Caught in an Unexpected Brawl in New York."
But what happened in Queens is part of a broader resistance to the tech boom, and its consequences. After the protests surfaced last year, Fred Wilson, the dean of New York venture capitalists, told me that "it's partly from a sense that Amazon coming in is not going to help them, and will only drive up their costs. To really be a success in New York, the benefits of the tech sector have to extend to every borough and every neighborhood." |
How that concern plays out across the country as A.I. technology marches ahead is the subject of a lengthy analysis this week by Mark Muro, a senior fellow at the Brookings Institution.
■ For a rich, enlightening read, I recommend a piece in this week's New York Times Magazine, "The Secret History of Women in Coding." The "secret" is a headline writer's exaggeration. Anyone with an interest in computing history knows about Ada Lovelace, Grace Hopper and the women who programmed the early Eniac computer. Those stories have been told repeatedly, including in books. |
But Clive Thompson, the author, elegantly weaves that history around the story of an early female programmer who is still alive and recalls it all. His piece captures what it was like from the 1940s through the early 1960s, when writing software was a wide-open field, before a male-dominated culture took root.
As Lois Haibt — who in 1955, as a freshly minted Vassar College graduate, joined the IBM team that created Fortran, the first popular programming language — once told me: "They took anyone who seemed to have an aptitude for problem-solving skills — bridge players, chess players, even women." |
Steve Lohr, based in The Times's New York headquarters, writes about technology and economics. Follow him on Twitter here: @SteveLohr. |