Apple Vision Pro
Thoughts on the long-term implications of Apple's latest product
We carry supercomputers in our pocket!
It has become a cliché to point out the amazing processing power of smartphones compared to computers that took up entire rooms a half century ago. However, this doesn’t make it any less remarkable that our phones are dramatically faster than the computers used to send astronauts to the moon. The processing power and functional capabilities of computers have advanced so dramatically that we take them for granted.
As important as raw processing power can be, we should not ignore key advances in how human beings interact with computers. Early computers had no visual interface and were controlled by punch cards. Creating a computer program was laborious and error-prone. A single error in a series of punch cards could require a programmer to start from scratch. Combined with slow processing times, this made it difficult to translate the thoughts of human beings into instructions that a computer could act on. Since much human knowledge is gained through iteration, the latency inherent in translating human thoughts into actionable code hindered progress.
The introduction of computer terminals used to control mainframes represented a major advance because programmers had the ability to type code using a keyboard and see the results on a screen. Rather than using punch cards, programs could be entered using an interface humans were already familiar with — keyboards, which had been in use since the commercialization of typewriters a century earlier. Over time, computer programs were built for users without technical backgrounds and accomplished many tasks, albeit in a highly prescribed procedural manner.
While Apple did not invent the mouse, the original Macintosh computer released in 1984 was the first to popularize its use. The mouse became another way in which users could interact with a computer, but it represented more than that. Rather than running procedural programs in which a user responded to prescribed prompts, the mouse allowed a user to interact with windows on the screen in a multitude of ways. The graphical user interface controlled with a mouse and keyboard made possible an array of event-driven programs that revolutionized how people interact with machines.
Fast forward over two decades to 2007. Much changed in computing between the release of the original Macintosh and the first iPhone. Microsoft introduced its own graphical user interface, the internet went mainstream, cell phones became ubiquitous, and Apple’s iPod changed the music industry forever. However, in the mid-2000s, the primary means of interacting with a computer was still the keyboard and mouse, while cell phones relied on physical buttons for entry of alphanumeric text.
The introduction of the iPhone represented a step-change in how humans interact with computers. Rather than typing instructions on physical buttons or a keyboard and pointing with a mouse, we began to use our fingers. A rectangular piece of glass became our window to the world, with its appearance changing automatically, almost magically, based on the context of our actions. When a keyboard was needed, one would appear, but it would remain active only while it was needed. Human-computer interaction changed in a revolutionary way that would be carried forward on devices like the iPad and other tablets over the next fifteen years.
We are in the early days of another step-change in how we interact with computers. Much of the coverage of Apple Vision Pro, which begins shipping to customers this week, has missed the mark. Too much attention has been devoted to the functional capabilities of the product rather than considering the long-term implications of advances in the human-computer interface. Spatial computing brings the interface into our physical environment and responds to our body in transformational ways.
From Thoughts to Actions
In traditional computing, we translate our thoughts into actions through a physical or virtual keyboard and a pointing device, whether it is a mouse, trackpad, or our fingers. What if computers could read our thoughts and take action without any physical movement on the part of the user? This might seem like science fiction, but it is the basis of Elon Musk’s Neuralink, which performed its first human brain implant earlier this week. Neuralink’s immediate goal is to help severely disabled people, but the potential applications of brain implants could be far broader.
We are likely decades away from a mass market for brain implants, but we are on the verge of having computers that are capable of tracking eye movements.
Eyes have long been thought to be “windows to the soul” because so much communication is expressed through eye contact. Everything from emotion to intent can be inferred through the eyes. If humans can glean important information through eye contact, why can’t computers determine our intent in the same way?
This is precisely the advance that is most exciting about Vision Pro.
Apple has developed technology that is constantly observing and tracking the user’s eyes, with intent inferred from eye movement alone. Visual gestures coupled with natural hand movements are used to control the system. The current tradeoff is that the Vision Pro is bulky and heavy with limited battery life and a very high price tag. A recent Wall Street Journal review of Vision Pro does a good job of illustrating how the product works. For a more detailed review, John Gruber’s recent article is excellent.
Translating thoughts into intentions via the eyes is much faster than doing so through the fingers acting on a physical device. We cannot think of a sentence to write and have it magically appear on the screen, but eyes can substitute for a pointing device and replace cumbersome physical gestures. By dispensing with physical controllers, Vision Pro represents a major advance over competing headsets.
The initial applications of Vision Pro appear to focus mostly on passive consumption of content and gaming. It will be interesting to see whether productivity applications are usable with Vision Pro. Initial reviews indicate that physical keyboards, connected via Bluetooth, remain more productive than virtual keyboards for extended writing, but it is certain that Apple will improve the virtual interfaces dramatically over time.
Form Factors
History shows that the “1.0” version of a product will improve, becoming lighter, more compact, and easier to use. Based on initial reviews, Vision Pro is bulky and quirky. While the headset envelops the user, it provides a view of the outside world, making it an “augmented reality” device rather than a “virtual reality” product.
Apple has spent significant resources on technology that presents an image of the user’s eyes on the outside of the headset so that others in a room will have a sense of the headset user being “present” rather than in his or her own world. While this feature has been derided as a gimmick in some reviews, it makes it more likely that Vision Pro will be used for extended periods of time without alienating the people around the user. Rather than disappearing into a void, the user seems more present, even though the eyes are merely images rather than reality.
Vision Pro is a mobile device in the sense that it is portable and can be used in multiple settings, but it is not mobile like a cell phone. Users are not expected to put on the headset while engaging in activities like walking, exercising, and driving. It seems to be intended for use while stationary. At some point, a smaller version of Vision Pro will emerge that allows for truly mobile use and will likely displace today’s cell phones. Eye movement and gestures will replace using our fingers to interact with the device. The question will be whether Apple can retain the immersive experience provided by Vision Pro while letting in more of the outside world.
Timing and Price
Apple is highly secretive when it comes to product development and we do not know precisely how long Vision Pro has been in development or the scale of the investment that has been made up to this point.
Why is Apple introducing the product now?
As human-computer interfaces have advanced, complexity has skyrocketed. A procedural program controlled from a dumb terminal has a limited number of pathways for execution. The introduction of the graphical user interface and event-driven programs vastly multiplied the number of potential actions a user could take and the orders in which they could be taken. As more possibilities open up, users take strange and unexpected actions, known as “edge cases,” that must be accounted for. I learned this lesson over many years of migrating users from dumb terminals to Microsoft Windows-based software.
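The distinction can be sketched in a few lines of illustrative Python (a hypothetical example, not Apple’s code): the procedural program dictates the sequence and has essentially one path, while the event-driven program must be ready for any action in any order — including orders the designer never anticipated.

```python
# Procedural style: the program dictates the sequence.
# There is exactly one path through the interaction.
def procedural_order():
    name = input("Name: ")      # step 1, always first
    qty = input("Quantity: ")   # step 2, always second
    return f"Order for {name}: {qty}"

# Event-driven style: the user dictates the sequence.
# Handlers must cope with events arriving in any order,
# including edge cases like submitting an incomplete form.
def make_event_handler():
    state = {"name": None, "qty": None}

    def handle(event, value=None):
        if event == "set_name":
            state["name"] = value
        elif event == "set_qty":
            state["qty"] = value
        elif event == "submit":
            # Edge case: submit before all fields are filled.
            if state["name"] is None or state["qty"] is None:
                return "error: incomplete"
            return f"Order for {state['name']}: {state['qty']}"
        return "ok"

    return handle
```

Even with only two fields and a submit action, the event-driven version must handle many possible orderings; every additional control multiplies them, and the unexpected orderings are where edge cases hide.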
No matter how much quality assurance and testing Apple conducts internally, there is nothing like releasing a product “into the wild” to see how people will use it.
A consumer electronics product that has a price tag starting at $3,500 is not intended to be a mass market product. In fact, I believe that Apple intentionally priced the product at a point where only the earliest adopters of technology would be tempted to make a purchase. In many ways, Vision Pro 1.0 looks like a public beta, albeit one that Apple will be paid for. By releasing the product to an audience of tech-savvy, wealthy consumers, Apple is able to include premium touches that provide differentiation from mass market headsets and effectively get customers to fund extended testing.
I am not suggesting that this is a “public beta” to insult Apple. I am sure the product is as polished as it can be given the current state of technology and the ability of Apple to internally test the system. Apple could have opted to keep Vision Pro in development for several more years to further refine the software and hardware, but the current approach will provide much better feedback to improve the product.
Releasing a “public beta” of a very expensive product is something few companies can get away with. However, Apple has a fanatically loyal following and a very rich core customer demographic. As I noted in a recent issue of The Digest, the Vision Pro is actually more affordable than the original Macintosh computer when compared to median household income. It is likely to sell well. As long as it provides reasonable utility and has enough of a “wow” factor, Apple will get the feedback it needs without risking customer dissatisfaction. Apple will learn, adapt, and improve the next version.
Conclusion
Computing becomes more powerful every year, but long periods often pass in which the fundamental way humans and computers interact remains static. It is not often that a major step-change occurs that could transform the industry forever.
Apple Vision Pro has many features that seem especially promising for immersive entertainment and gaming. Initial reviews place the product in its own category in terms of polish and capabilities which, of course, is reflected in the price tag. While I am not likely to purchase a Vision Pro anytime soon, I am excited about the promise of a major change in how we interact with computers.
With the iPhone continuing to represent Apple’s most important product, it is critical for the company to be the leader in any technology that risks making the smartphone obsolete. Apple cannot be afraid to cannibalize itself, since doing so is far better than running the risk of a competitor disrupting the landscape.
If you found this article interesting, please click on the ❤️️ button and consider sharing this issue with your friends and colleagues.
Thanks for reading!
Copyright, Disclosures, and Privacy Information
Nothing in this article constitutes investment advice and all content is subject to the copyright and disclaimer policy of The Rational Walk LLC.
Your privacy is taken very seriously. No email addresses or any other subscriber information is ever sold or provided to third parties. If you choose to unsubscribe at any time, you will no longer receive any further communications of any kind.
The Rational Walk is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees by advertising and linking to Amazon.com.
Individuals associated with The Rational Walk LLC own a very significant indirect stake in Apple via direct ownership of shares of Berkshire Hathaway.