“What is real? How do you define ‘real’? If you’re talking about what you can feel, what you can smell, what you can taste and see, then ‘real’ is simply electrical signals interpreted by your brain.”
Morpheus (and Immanuel Kant for that matter) didn’t need the Apple Vision Pro to change their views about the “real” world. Our senses serve as the primary interface through which we understand and navigate the world. Yet these sensory inputs undergo a complex process of interpretation and construction by our brains, crafting the reality we experience daily.
I’m hearing impaired, which creates a different reality for me. I can’t tell the direction of sound, can’t hear certain tones, and can’t follow conversations at a dinner table (reading lips and “faking it” get me through most of the time). My reality is different from yours – it’s mediated by my physical limitations.
Are the immersive, digital experiences created with VR any less “real” than those mediated by our natural senses? We might argue that all of our experiences, whether natural or technologically enhanced, are forms of “mediated” reality.
The launch of the Apple Vision Pro may push us closer to fully realizing the concept of mediated experiences. In Casey Neistat’s video below, he says the device “isn’t the future of AR or VR, this is the future interface of all computing.” I don’t disagree. Once the hardware starts looking less like oversized ski goggles and more like Ray-Bans, the potential is staggering.
But don’t be fooled – this isn’t a technology story, it’s a biology story. This is a story about the fundamental aspects of cognition and human perception. As we continue to explore and integrate these technologies into our lives, we are not just changing how we interact with the world; we are also expanding our understanding of what it means to live and perceive within it.