An excellent example of the magic of adding AI to X can be seen in photography. In the 1970s I was a travel photographer hauling around a heavy bag of gear. In addition to a backpack with 500 rolls of film, I carried two brass Nikon bodies, a flash, and five extremely heavy glass lenses that weighed over a pound each. Photography needed “big glass” to capture photons in low light; it needed light-sealed cameras with intricate marvels of mechanical engineering to focus, measure, and bend light in thousandths of a second.

What has happened since then? Today my point-and-shoot Nikon weighs almost nothing, shoots in almost no light, and can zoom from my nose to infinity. Of course, the camera in my phone is even tinier, always present, and capable of pictures as good as those from my old heavy clunkers. The new cameras are smaller, quicker, quieter, and cheaper not just because of advances in miniaturization, but because much of the traditional camera has been replaced by smartness. The X of photography has been cognified. Contemporary phone cameras eliminated the layers of heavy glass by adding algorithms, computation, and intelligence to do the work that physical lenses once did. They use intangible smartness to substitute for a physical shutter. And the darkroom and film itself have been replaced by more computation and optical intelligence. There are even designs for a completely flat camera with no lens at all. Instead of any glass, a perfectly flat light sensor uses insane amounts of computational cognition to compute a picture from the different light rays falling on the unfocused sensor.

Cognifying photography has revolutionized it because intelligence enables cameras to slip into anything (in a sunglass frame, in a color on clothes, in a pen) and do more, including calculate 3-D, HD, and many other options that earlier would have taken $100,000 and a van full of equipment to do. Now cognified photography is something almost any device can do as a side job.
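To get a feel for how a lensless camera can “compute a picture,” here is a minimal sketch of the underlying idea, assuming the sensor’s scrambled reading can be modeled as a known linear mixing of the scene. The mixing matrix and the ridge-regression reconstruction below are illustrative stand-ins; real lensless designs use carefully calibrated optics and far more sophisticated solvers.

```python
# A toy of lensless imaging: a flat sensor records a scrambled mixture of
# light rays, and computation "un-mixes" it back into a picture.
import numpy as np

rng = np.random.default_rng(0)

n = 16 * 16                     # a tiny 16x16 scene, flattened to a vector
scene = rng.random(n)           # ground truth the camera never sees directly

# Hypothetical, pre-calibrated sensor response: every sensor pixel sums
# contributions from rays across the whole scene.
A = rng.normal(size=(n, n))
reading = A @ scene + rng.normal(scale=0.01, size=n)   # raw unfocused data

# Reconstruction: solve the linear inverse problem with ridge regularization.
lam = 1e-3
picture = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ reading)

print("relative error:", np.linalg.norm(picture - scene) / np.linalg.norm(scene))
```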
A similar transformation is about to happen for every other X. Take chemistry, another physical endeavor requiring laboratories of glassware and bottles brimming with solutions. Moving atoms—what could be more physical? By adding AI to chemistry, scientists can perform virtual chemical experiments. They can smartly search through astronomical numbers of chemical combinations to reduce them to a few promising compounds worth examining in a lab. The X might be something low-tech, like interior design. Add a utility AI to a system that gauges clients’ level of interest as they walk through simulations of interiors. The design details are altered and tweaked by the pattern-finding AI based on customer response, then inserted back into new interiors for further testing. Through constant iterations, optimal personal designs emerge from the AI. You could also apply AI to law, using it to uncover evidence from mountains of paper, discern inconsistencies between cases, and then suggest lines of legal argument.
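As one concrete illustration of that chemical search, here is a minimal sketch of virtual screening: score an enormous candidate library with a cheap learned model and keep only the top handful for the wet lab. The `predicted_activity` function is a hypothetical placeholder for whatever model the AI has learned; here it returns reproducible random scores just so the example runs.

```python
# Toy virtual screening: rank a huge candidate space in software so the
# lab only ever sees the most promising few compounds.
import heapq
import random

random.seed(1)

def predicted_activity(compound_id: int) -> float:
    """Hypothetical stand-in for a learned model scoring a compound's
    promise from its structure; here, just a reproducible random number."""
    return random.random()

def screen(candidates, keep=5):
    # Stream through the candidates, retaining only the `keep` top scorers.
    return heapq.nlargest(keep, candidates, key=predicted_activity)

# A million virtual compounds in; five lab-worthy candidates out.
shortlist = screen(range(1_000_000), keep=5)
print("compounds worth a real experiment:", shortlist)
```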
The list of Xs is endless. The more unlikely the field, the more powerful adding AI will be. Cognified investments? Already happening with companies such as Betterment or Wealthfront. They add artificial intelligence to managed stock indexes in order to optimize tax strategies or balance holdings between portfolios. These are the kinds of things a professional money manager might do once a year, but that the AI will do every day, or every hour.
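Here is a minimal sketch of the kind of daily rebalancing such services automate, assuming a simple target-weight portfolio; real robo-advisors layer tax-loss harvesting and many other rules on top.

```python
# Toy daily rebalancing: compute the trades that pull drifted holdings
# back toward the portfolio's target allocation.
def rebalance(holdings: dict, prices: dict, targets: dict) -> dict:
    """Return share adjustments per ticker (positive = buy, negative = sell).

    holdings: shares held; prices: price per share;
    targets: desired fraction of total value per ticker (sums to 1).
    """
    total = sum(holdings[t] * prices[t] for t in holdings)
    trades = {}
    for t in holdings:
        gap = targets[t] * total - holdings[t] * prices[t]
        trades[t] = round(gap / prices[t], 2)
    return trades

# Stocks have rallied past their 60 percent target, so the AI trims them.
print(rebalance(
    holdings={"STOCKS": 130, "BONDS": 400},
    prices={"STOCKS": 100.0, "BONDS": 20.0},
    targets={"STOCKS": 0.6, "BONDS": 0.4},
))
# -> {'STOCKS': -4.0, 'BONDS': 20.0}
```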
Here are other unlikely realms waiting to be cognitively enhanced:
Cognified music—Music can be created in real time from algorithms, employed as the soundtrack for a video game or a virtual world. Depending on your actions, the music changes. Hundreds of hours of new personal music can be written by the AI for every player.
Cognified laundry—Clothes that tell the washing machine how they want to be washed. The wash cycle would adjust itself to the contents of each load as directed by the smart clothes.
Cognified marketing—The amount of attention an individual reader or watcher spends on an advertisement can be multiplied by their social influence (how many people follow them and how influential those followers are) in order to optimize attention and influence per dollar. Done at the scale of millions, this is a job for AI (see the sketch after this list).
Cognified real estate—Matching buyers and sellers via an AI that can prompt “renters who liked this apartment also liked these . . .” It could then generate a financing package that worked for your particular circumstances.
Cognified nursing—Patients outfitted with sensors that track their biomarkers 24 hours a day can generate highly personalized treatments that are adjusted and refined daily.
Cognified construction—Imagine project management software that is smart enough to take into account weather forecasts, port traffic delays, currency exchange rates, and accidents, in addition to design changes.
Cognified ethics—Robo cars need to be taught priorities and behavior guidelines. The safety of pedestrians may precede the safety of drivers. Anything with some real autonomy that depends on code will require smart ethical code as well.
Cognified toys—Toys more like pets. Furbies were primitive compared with the intense attraction that a smart petlike toy will evoke in children. Toys that can converse are lovable. Dolls may be the first really popular robots.
Cognified sports—Smart sensors and AI can create new ways to score and referee sporting games by tracking and interpreting subtle movements and collisions. Also, highly refined statistics can be extracted from every second of each athlete’s activity to create elite fantasy sports leagues.
Cognified knitting—Who knows? But it will come!
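To make the marketing item above concrete, here is a minimal sketch of the attention-times-influence-per-dollar score such a system might maximize. The field names and the multiplicative formula are illustrative assumptions, not any ad platform’s actual model.

```python
# Toy scoring of ad placements by attention x influence per dollar.
from dataclasses import dataclass

@dataclass
class Viewer:
    seconds_watched: float   # measured attention on the advertisement
    followers: int           # reach of this viewer's network
    engagement_rate: float   # fraction of followers who act on their posts

def influence(v: Viewer) -> float:
    # Illustrative: reach weighted by how responsive that audience is.
    return v.followers * v.engagement_rate

def value_per_dollar(v: Viewer, cost: float) -> float:
    # The quantity the AI would maximize across millions of viewers.
    return v.seconds_watched * influence(v) / cost

# An attentive micro-influencer can outscore a distracted celebrity.
engaged = Viewer(seconds_watched=12.0, followers=50_000, engagement_rate=0.02)
famous = Viewer(seconds_watched=3.0, followers=2_000_000, engagement_rate=0.001)
for v in (engaged, famous):
    print(round(value_per_dollar(v, cost=0.05)))
# -> 240000, then 120000
```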
Cognifying our world is a very big deal, and it’s happening now.
• • •
Around 2002 I attended a private party for Google—before its IPO, when it was a small company focused only on search. I struck up a conversation with Larry Page, Google’s brilliant cofounder. “Larry, I still don’t get it. There are so many search companies. Web search, for free? Where does that get you?” My unimaginative blindness is solid evidence that predicting is hard, especially about the future, but in my defense this was before Google had ramped up its ad auction scheme to generate real income, long before YouTube or any other major acquisitions. I was not the only avid user of its search site who thought it would not last long. But Page’s reply has always stuck with me: “Oh, we’re really making an AI.”
I’ve thought a lot about that conversation over the past few years as Google has bought 13 other AI and robotics companies in addition to DeepMind. At first glance, you might think that Google is beefing up its AI portfolio to improve its search capabilities, since search constitutes 80 percent of its revenue. But I think that’s backward. Rather than use AI to make its search better, Google is using search to make its AI better. Every time you type a query, click on a search-generated link, or create a link on the web, you are training the Google AI. When you type “Easter Bunny” into the image search bar and then click on the most Easter Bunny–looking image, you are teaching the AI what an Easter Bunny looks like. Each of the 3 billion queries that Google conducts each day tutors the deep-learning AI over and over again. With another 10 years of steady improvements to its AI algorithms, plus a thousandfold more data and a hundred times more computing resources, Google will have an unrivaled AI. In a quarterly earnings conference call in the fall of 2015, Google CEO Sundar Pichai stated that AI was going to be “a core transformative way by which we are rethinking everything we are doing. . . . We are applying it across all our products, be it search, be it YouTube and Play, etc.” My prediction: By 2026, Google’s main product will not be search but AI.
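Here is a minimal sketch of the feedback loop Page was describing: every click becomes a free training label. The pairwise nudge below is a generic illustration of learning from clicks, not Google’s actual ranking system.

```python
# Toy click-feedback loop: each search click is a labeled training example.
from collections import defaultdict

relevance = defaultdict(float)   # (query, result) -> learned score
LEARNING_RATE = 0.1

def record_click(query: str, shown: list, clicked: str) -> None:
    """Treat a click as a preference: nudge the clicked result up and
    every skipped result on the same results page down."""
    skipped = len(shown) - 1
    for result in shown:
        delta = 1.0 if result == clicked else -1.0 / max(skipped, 1)
        relevance[(query, result)] += LEARNING_RATE * delta

# Billions of these a day, and the ranking teaches itself.
record_click("easter bunny", ["rabbit.jpg", "egg.jpg", "basket.jpg"], "rabbit.jpg")
record_click("easter bunny", ["rabbit.jpg", "chick.jpg"], "rabbit.jpg")

print(max(relevance, key=relevance.get))   # -> ('easter bunny', 'rabbit.jpg')
```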