Yes, these glasses look dorky, as Google Glass proved. It will take a while before their form factor is worked out and they look fashionable and feel comfortable. But last year alone, five quintillion (10 to the power of 18) transistors were embedded into objects other than computers. Very soon most manufactured items, from shoes to cans of soup, will contain a small sliver of dim intelligence, and screens will be the tool we use to interact with this ubiquitous cognification. We will want to watch them.
More important, our screens will also watch us. They will be our mirrors, the wells into which we look to find out about ourselves. Not to see our faces, but our selves. Already millions of people use pocketable screens to input their location, what they eat, how much they weigh, their mood, their sleep patterns, and what they see. A few pioneers have begun lifelogging: recording every single detail, conversation, picture, and activity. A screen both records and displays this database of activities. The result of this constant self-tracking is an impeccable “memory” of their lives and an unexpectedly objective and quantifiable view of themselves, one that no book can provide. The screen becomes part of our identity.
We are screening at all scales and sizes—from the IMAX to the Apple Watch. In the near future we will never be far from a screen of some sort. Screens will be the first place we’ll look for answers, for friends, for news, for meaning, for our sense of who we are and who we can be.
• • •
Someday in the near future my day will be like this:
In the morning I begin my screening while still in bed. I check the screen on my wrist for the time, my wake-up alarm, and also to see what urgent news and weather scroll by. I screen the tiny panel near the bed that shows messages from my friends. I wipe the messages away with my thumb. I walk to the bathroom. I screen my new artworks—cool photos taken by friends—on the wall; these are more cheerful and sunny than the ones yesterday. I get dressed and screen my outfit in the closet. It shows me that the red socks would look better with my shirt.
In the kitchen I screen the full news. I like the display lying flat, horizontal on the table. I wave my arms over the table to direct the stream of text. I turn to the screens on my cabinets, searching for my favorite cereal; the door screens reveal what is behind them. A screen floating above the refrigerator indicates fresh milk inside. I reach inside and take out the milk. The screen on the side of the milk carton tries to get me to play a game, but I quiet it. I screen the bowl to be sure it is approved as clean by the dishwasher. As I eat my cereal, I query the screen on the box to see if it is still fresh and whether the cereal has the genetic markers a friend said it did. I nod toward the table and the news stories advance. When I pay close attention, the screen notices and the news gets more detailed. As I screen deeper, the text generates more links, denser illustrations. I begin screening a very long investigative piece on the local mayor, but I need to take my son to school.
I dash to the car. In the car, my story continues where I left off in the kitchen. My car screens the story for me, reading it aloud as I ride. The buildings we pass along the highway are screens themselves. They usually show advertisements that are aimed at only me, since they recognize my car. These are laser-projected screens, which means they can custom-focus images that only I see; other commuters see different images on the same screen. I usually ignore them, except when they show an illustration or diagram from the story I am screening in the car. I screen the traffic to see what route is least jammed this morning. Since the car’s navigation learns from other drivers’ routes, it mostly chooses the best route, but it is not foolproof yet, so I like to screen where the traffic flows.
At my son’s school, I check one of the public wall displays in the side hallway. I raise my palm, say my name, and the screen recognizes me from my face, eyes, fingerprints, and voice. It switches to my personal interface. I can screen my messages if I don’t mind the lack of privacy in the hall. I can also use the tiny screen on my wrist. I glance at the messages I want to screen in detail and it expands those. I wave some forward and others I swoosh to the archives. One is urgent. I pinch the air and I am screening a virtual conference. My partner in India is speaking to me. She is screening me in Bangalore. She feels pretty real.
I finally make it to the office. When I touch my chair, my room knows me, and all the screens in the room and on the table are ready for me, picking up from where I left off. The eyes of the screens follow me closely as I conduct my day. The screens watch my hands and eyes a lot. I’ve become very good at using the new hand-sign commands in addition to typing. After 16 years of watching me work, they can anticipate a lot of what I do. The sequence of symbols on the screens makes no sense to anyone else, just as my colleagues’ sequences baffle me. When we are working together, we screen in an entirely different environment. We gaze and grab different tools as we hop and dance around the room. I am a bit old-fashioned and still like to hold smaller screens in my hands. My favorite one is the same leather-cased screen I had in college (the screen is new; just the case is old). It is the same screen I used to create the documentary I did after graduation about the migrants sleeping in the mall. My hands are used to it and it is used to my gestures.
After work I put on augmentation glasses while I jog outside. My running route is clearly marked out in front of me. Overlaid on it I see all my exercise metrics, such as my heart rate and metabolism stats, displayed in real time, and I can also screen the latest annotation notes posted virtually on the places I pass. I see the virtual notes in my glasses about an alternative detour left by one of my friends when he jogged this same route an hour earlier, and I see some historical notes stuck to a couple of familiar landmarks left by my local history club (I am a member). One day I may try out the bird identification app that pins bird names on the birds in my glasses when I run through the park.
At home during dinner, we don’t allow personal screens at our table, though we screen ambient mood colors in the room. After our meal I will screen to relax. I’ll put a VR headset on and explore a new alien city created by an amazing world builder I follow. Or I’ll jump into a 3-D movie, or join a realie. Like most students, my son screens his homework, especially the tutorials. Although he likes to screen adventure games, we limit it to one hour during the school week. He can screen a realie in about an hour, speed-screening the whole way, while also scanning messages and photos on three other screens at the same time. On the other hand, I try to slow down. Sometimes I’ll screen a book on my lap pad while slow, affirming vistas generated from my archives screen on the walls. My spouse likes nothing better than to lie in bed and screen a favorite story on the ceiling till sleep. As I lie down, I set the screen on my wrist for 6 a.m. For eight hours I stop screening.
5 ACCESSING
A reporter for TechCrunch recently observed, “Uber, the world’s largest taxi company, owns no vehicles. Facebook, the world’s most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world’s largest accommodation provider, owns no real estate. Something interesting is happening.”
Indeed, digital media exhibits a similar absence. Netflix, the world’s largest video hub, allows me to watch a movie without owning it. Spotify, the largest music streaming company, lets me listen to whatever music I want without owning any of it. Amazon’s Kindle Unlimited enables me to read any book in its 800,000-volume library without owning books, and PlayStation Now lets me play games without purchasing them. Every year I own less of what I use.