The first Apple Watch reviews came out this week, and they weren't great.
Reviewers did praise the design and the way it lets you leave your phone in your pocket more, and blogger John Gruber had a great non-cynical take on the "taptic" communications, which lets you send little taps and a representation of your heartbeat to other watch wearers — "Non-verbal, non-visual, physical communication across any distance. This could be something big."
But beneath headline phrases like "magical" and "bliss," reviewers had a lot of complaints. Most of them revolved around two points.
First, it wasn't clear exactly what problem any smart watch is supposed to solve, and the Apple Watch made it no clearer.
Second, the Watch itself seemed fussy to use. The controls were hard to figure out and had a "steep learning curve." It was too easy to hit the wrong icon. It sent way too many notifications.
All in all, it seems like reviewers were confused.
But this could be because the reviewers — and probably the people making apps for the watch, and maybe even Apple itself — are still stuck in the current mindset of how we use computers. Call it the smartphone mindset.
For the last eight years or so, we've carried computers in our pockets more powerful than the ones NASA used to send men to the moon. They're connected to the Internet all the time.
And the way we interact with them is mostly by looking at the screen and tapping with our fingers to make something happen. We take a picture, we send an email, we text, Facebook, Snapchat.
If you apply these same habits to a tiny computer on your wrist, of course you're going to be disappointed. It's harder to read. It's harder to control. It's a more awkward way of doing the same things you're already doing quite easily with your smartphone.
But the future of computing is probably going to look quite different. If you look at what big tech companies like Google, Microsoft, Facebook, Cisco, Intel, IBM, and (yes) Apple are focusing on, you see a few common themes:
- "Internet of things." It's a dopey term, but it basically means there will be little tiny computerized sensors everywhere, and those sensors will be able to connect to each other, to local private networks, and to the Internet. Or some combination of the three. Suddenly, inanimate objects will be able to do more than just sit there — they'll exchange information with each other, or send and receive simple signals that trigger events.
- Artificial intelligence. Apple's Siri, Google Now, and Microsoft Cortana are all early examples of how artificial intelligence can help answer fairly simple questions. But the more interesting part comes when AI can help you anticipate and answer questions before you even know you have those questions. This is sometimes called anticipatory computing.
- Passive interfaces. Virtual reality devices like Facebook's Oculus and augmented reality devices like Microsoft's Hololens and Google Glass have a very important difference from earlier computing devices like smartphones and PCs. You don't have to do anything with the actual device to get something out of it. You don't have to type, or move a mouse, or pick it up and touch the screen. You just put it on — and things happen.
This is where computing is going after the smartphone era. It will be everywhere, it will know what you want, and it won't require you to do anything to get something in return.
Ubiquitous, anticipatory, and passive.
The Apple Watch is a small step forward in all three categories.
It's a tiny device that can communicate with all kinds of networks — Wi-Fi, Bluetooth, and short-range NFC for Apple Pay. It uses Siri to understand voice commands. It requires less attention than your phone, especially when you're just receiving simple messages like a tap or a heartbeat with it.
So look ahead a couple of iterations and think of it this way:
- You walk up to doors — your house, your car, your office, your hotel room — and they automatically unlock.
- You get out of bed and the coffee machine automatically turns on. The lights turn on as you walk around the house.
- You're within a block of a person you've highlighted, and your watch tells you they're nearby and guides you so you can accidentally-on-purpose "run into" them (or, if you're so inclined, avoid them).
- You walk into the lunch spot where you always order the same thing and it's already on the counter for you to pick up when you get in. You go to the grocery store and load your shopping cart up with items and walk out the door, and your credit card is automatically debited.
- Your heartbeat gets irregular or stops for more than a second or two and the watch automatically calls your doctor, an ambulance, and your emergency contact.
Love it or hate it, this is where personal technology is going. The underlying plumbing is almost there. The human desire to make tasks easier and more convenient will create the market.
Apple could very well be the company that gets there first. If it sells millions of Apple Watches, as it almost certainly will, it'll have a head start on one critical part of the equation — the thing that identifies you to all these sensors and devices that are just waiting for something to trigger them.
And don't forget that last year Apple introduced HomeKit, which is a set of technologies for app makers to connect to devices in the home, and CarPlay, which is the same thing for cars. All the pieces are falling into place.
Or maybe the Apple Watch won't take us there. The leaders of one generation of computing are seldom the leaders of the next, even if they get there first — just look at the early Windows Mobile phones, which really tried to take the PC desktop and shrink it down to a tiny screen. Right direction, terrible implementation.
But this is where the entire computer industry is going, and Apple is once again trying to lead the way.