Artificial Intelligence: Separating the Hype from Reality

Like bees to honey, hype swarms to tech trends. Merely appending the word “dotcom” to a company’s name drove up stock prices in the Internet’s salad days. Cloud computing, big data, and cryptocurrencies have each taken their turn in the hype cycle in recent years. Every trend brings genuinely promising technological developments, befuddling buzzwords, enthusiastic investors, and reassuring consultants offering enlightenment, for a fee, naturally.

Now the catchall phrase of artificial intelligence is shaping up as the defining technological trend of the moment. And yet, because the claims of what it will achieve are so grand, businesses risk raising their hopes for A.I. too high—and wasting money by trying to apply the technology to problems it can’t solve.

Consider the bubbly warning signs. Venture capitalists are beyond eager to fund A.I. They staked 1,028 A.I.-related startups last year, up from 291 in 2013, says researcher PitchBook. Twenty-six of those companies had “A.I.” in their names, compared with one five years earlier. Then there’s the profusion of conferences promising to explain A.I. to the benighted manager. At the annual meeting of the World Economic Forum in Davos, Switzerland, the agenda this year included no fewer than 11 panels referencing A.I., with names like “Designing Your A.I. Strategy” and “Setting Rules for the A.I. Race.” (Fortune has gotten into the act too: its 2018 Global Tech Forum in Guangzhou, China, was dominated by A.I. discussions.)

The result is a serious subject running the risk of jumping the shark. “If advocates are not careful, they will have successfully Bitcoinized A.I.,” says Michael Schrage, a researcher at MIT’s Initiative on the Digital Economy.

Make no mistake: artificial intelligence is more than a fad. It represents a whole new way of doing business by turbocharging the existing trends of automation, sensor-based industrial monitoring, and algorithmic analysis of business processes. Computer science was already helping machines perform routine tasks more quickly than humans. The new techniques of A.I., combined with ever-faster computing power and years of accumulated digitized data, mean that for the first time computers can learn the tasks humans require of them rather than merely doing as they’re told.
