Purple Pill - How to Mix Real with Virtual to Create your New World
What is the Metaverse? This has been a standing question for as long as I can recall.
We are not thinking big enough. As the concepts of Virtual, Augmented, and Mixed reality enter our daily experiences, the range of expectations sits on an unnecessarily narrow spectrum. On one end you have the usual haters and Luddites saying that we’ve been here before and this technology will never reach mass adoption. On the other, we have the champions, evangelists, and entrepreneurs who believe everyone on the planet will have a VR headset or AR glasses within the next few years.
But let’s think BIGGER. The use of mixed reality allows us, for the first time, to trick the human brain into thinking a 3D image is actually a physical object.
Let that sink in.
When you walk down your block to get a coffee on your way to the subway, how often do you ask yourself whether the people you walked by were really physically there or just holograms?
When you look up at the sky on a starry night do you find yourself wondering how many of those stars no longer exist?
We are familiar with the idea that a dead star can still be seen despite having exploded into oblivion millions of years ago — it takes light quite a while to travel through the galaxy. Yet we are not used to questioning the physical nature of objects in our immediate vicinity. Mixed reality is going to change this expectation.
Let’s think about what makes something real. It must react to our senses and to the laws of physics the way we expect.
TOUCH: When we touch the object, we don’t expect our hands to just glide past the edges and into the center, especially if it is made of a metal like titanium.
VISION: Similarly, we expect this titanium object, let’s say a shiny sphere, to occlude our vision of objects behind it.
WEIGHT: It should have weight, of course.
TASTE: And if we ventured to taste it, we would expect something not so pleasant.
SOUND: If we drop it from the roof of our house, it should make a very loud boom
FORCE: and cause our driveway to buckle.
The combination of these expectations makes the titanium sphere a physically real object.
How many of those expectations would we have to emulate through technical trickery for our brains to be fooled into thinking we were interacting with a titanium sphere, when in fact, we were just waving our hands in mid-air?
Vision is the first, and perhaps easiest, problem to tackle. Anyone who has myopia and wears glasses all day can tell you that basic optics can do magical things. Add a projection to that from behind the glasses (AR) or from two stereoscopic lenses in front of the eyes (VR) and you have yourself a genuine trick. I’m optimistic about AR projections that don’t require glasses, but let’s skip that for now. Now that we have vision worked out, we need to explore the next hot area of AR/VR research: haptics.
If you walked up to a car and, as you reached out to open the door, you felt nothing instead of the grip of the cold metallic handle, you would quickly realize that the car is not real. The illusion breaks immediately when there is no haptic feedback. In solving this problem, we can start with feedback that is semi-realistic, which can be accomplished with haptic gloves or suits. If you open the car door, you feel a rumbling in your glove that tells you, “hey, there’s something here!” To enhance that experience, what kind of magical elements could we add to a suit that would make it impossible for you to enter the car, as opposed to floating right past the image? I can imagine a suit with air-pressure units that inflate and deflate based on the presence of physical barriers. Or we can imagine a material lining the suit that becomes alternately rigid and soft depending on your interaction with a virtual object or wall. But would that convince you? Would you really think the object was real, or would you still be fully aware that it is simulated? I wouldn’t be convinced.
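To make that idea a bit more concrete, here is a rough C# sketch of how a suit controller might map a hand pushing into a virtual barrier onto a pressure command for one of those imagined inflatable units. Everything here is hypothetical; there is no real haptic-suit API behind it, just the geometry of the idea.

```csharp
using System;

// A minimal sketch, not a real device API: map how far a hand has pushed
// "into" a virtual barrier onto a pressure command for an imagined
// inflatable actuator. All of the names here are hypothetical.
public static class HapticSketch
{
    // The virtual barrier is modeled as a sphere: center (cx, cy, cz) plus a radius, in meters.
    public static double PressureFor(
        double handX, double handY, double handZ,
        double cx, double cy, double cz, double radius)
    {
        double dx = handX - cx, dy = handY - cy, dz = handZ - cz;
        double distance = Math.Sqrt(dx * dx + dy * dy + dz * dz);

        // Outside the sphere: no contact, so the actuator stays deflated.
        if (distance >= radius) return 0.0;

        // Inside: inflate in proportion to penetration depth,
        // clamped to the actuator's 0..1 command range.
        double penetration = (radius - distance) / radius;
        return Math.Clamp(penetration, 0.0, 1.0);
    }

    public static void Main()
    {
        // A hand 5 cm "inside" a 50 cm virtual sphere centered at the origin.
        Console.WriteLine(PressureFor(0.45, 0, 0, 0, 0, 0, 0.5)); // ~0.1
    }
}
```

The point isn’t the math; it’s that the suit has to know, frame by frame, whether your body has crossed a boundary that exists only in software.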
Faced with this seemingly insurmountable technical challenge, how do we navigate toward a believable alternate reality? How do we think BIGGER and conjure up the holodeck, the Matrix, the OASIS, or the metaverse of the science fiction masters?
First, I think we need to accept that, unless we have chips implanted directly into our brains, we will not be fully convinced that a simulation is 100% real. That being said, I think simulated realities will still have an important place in our daily lives in the near future.
Let’s think about other platforms where simulations matter even when they don’t replicate reality in exact detail. Think of Van Gogh’s “Starry Night”, a personal favorite. When you look at those wide circling dashes of paint that outline stars and trees, you know there are no physical stars or trees that look exactly like that. That doesn’t mean the painting fails to represent our perception of reality. Many of us wish we could jump into a painting like that and fly from one floating cloud to the next. It’s a fantasy based on physical reality.
Mixed reality will make us feel the same way. When we are riding on the train, looking out at the passing suburbs, and with a click of our fingers we change the color of the sky from a grayish blue to a bright pink, that will feel awesome. We will know it’s not real, but we will want to experience it just the same.
Mixed Reality Ocean
I’ve always believed that, when it comes to smart glasses or contact lenses, augmented reality will sit on the translucent side of the reality spectrum and virtual reality on the opaque side. There is no way the geniuses in Silicon Valley and Shanghai will make us wear two different pairs of glasses to experience reality overlays that are so similar in construction and form. Similar in construction because the same Unity or Unreal engine is used to develop the 3D visuals, and similar in form because the same kinds of programming languages, such as C# in Unity or C++ in Unreal, give these visuals their interactive properties.
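To show what I mean by shared form, here is a toy Unity script in C#. It is my own sketch, not code from any shipping headset, and the component and field names are invented; the only claim is that the same few lines of C# could drive a sky-tinting effect whether the camera belongs to an AR scene or a VR one.

```csharp
using UnityEngine;

// A toy Unity component (my own sketch, not from any real AR/VR product).
// The same C# drives the effect regardless of whether the assigned camera
// sits in an AR passthrough rig or a fully opaque VR rig.
public class SkyPainter : MonoBehaviour
{
    public Camera targetCamera;                           // assign the AR or VR camera in the editor
    public Color paintedSky = new Color(1f, 0.4f, 0.8f);  // the bright pink

    void Update()
    {
        // On a tap (mouse click in the editor), repaint the sky.
        if (Input.GetMouseButtonDown(0))
        {
            targetCamera.clearFlags = CameraClearFlags.SolidColor;
            targetCamera.backgroundColor = paintedSky;
        }
    }
}
```

Attach it to a camera rig in either kind of scene and a tap repaints the sky; the engine doesn’t care which side of the translucency spectrum you’re on.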
Go back to our pink sky on the train vignette. Now imagine clicking another button so that the only thing you see is the pink sky; the rest of the world is pitch black. You’ve made the leap from augmented to virtual. Now it gets complicated. You’re sitting on the train (try not to fall over), staring at the pink sky with a few unicorns. You hear the screeching of the track and the announcement reminding you to refrain from playing loud music. The doors open and close. You see nothing but the pink sky. Reach out in front of your glasses and tap. Suddenly the train car comes into view, but the houses and power lines and streets and trees and cell towers that rush past you remain completely hidden in a coat of darkness. What reality are you in now?
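If you want to picture how the software might keep track of which reality you are in, here is a minimal C# sketch that models the vignette as a set of visible layers. The layer names, GoOpaque(), and Tap() are all my own inventions, not any headset’s real API.

```csharp
using System;
using System.Collections.Generic;

// A minimal sketch of the train vignette as a set of visible layers. The layer
// names and the Tap() gesture are assumptions of mine, not a real headset API.
public enum RealityLayer { PhysicalWorld, NearbyInterior, VirtualSky }

public class RealityMixer
{
    private readonly HashSet<RealityLayer> visible = new HashSet<RealityLayer>
    {
        RealityLayer.PhysicalWorld,   // start in plain augmented reality:
        RealityLayer.NearbyInterior,  // the world and the train car are visible,
        RealityLayer.VirtualSky       // with the pink sky layered on top
    };

    // The second click from the vignette: hide everything physical.
    public void GoOpaque()
    {
        visible.Remove(RealityLayer.PhysicalWorld);
        visible.Remove(RealityLayer.NearbyInterior);
    }

    // The tap in front of the glasses: bring just the train car back into view
    // while the passing suburbs stay hidden.
    public void Tap() => visible.Add(RealityLayer.NearbyInterior);

    public bool IsVisible(RealityLayer layer) => visible.Contains(layer);

    public static void Main()
    {
        var mixer = new RealityMixer();
        mixer.GoOpaque();                                                 // only the pink sky remains
        mixer.Tap();                                                      // the train car reappears
        Console.WriteLine(mixer.IsVisible(RealityLayer.NearbyInterior));  // True
        Console.WriteLine(mixer.IsVisible(RealityLayer.PhysicalWorld));   // False: the suburbs stay dark
    }
}
```

Seen this way, “what reality are you in?” is nothing more exotic than the question of which layers happen to be switched on.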
You can use an app called “Color Effects” by The Othernet, LLC (goo.gl/jOn5zs to download on the App Store) to create a static version of this thought experiment. If you upload a picture, like the Chicago skyline or the rocky Florida beach shown in this article, you can choose which portions to keep in their original color and which portions to “paint” with whatever color you desire. The result is a combination of “real color” elements and “painted” elements. In many cases, it’s very difficult to discern which colors are real, even if you were the one who took the photograph.
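Under the hood, the general trick is simpler than it looks. Here is a rough C# sketch of that kind of selective recoloring (to be clear, this is not the app’s actual code): unpainted pixels are kept as they are, while painted pixels keep their original brightness and take on the chosen tint, which is a big part of why the result is so hard to tell apart from the real colors.

```csharp
using System;

// A rough sketch of selective recoloring. This is not the Color Effects app's
// code; it just shows the general idea of keeping some pixels "real" and
// tinting the rest while preserving their original brightness.
public static class SelectiveColor
{
    public struct Rgb { public byte R, G, B; }

    // pixels: the photo; mask: true where the user "paints"; paint: the chosen color.
    public static Rgb[] Paint(Rgb[] pixels, bool[] mask, Rgb paint)
    {
        var result = new Rgb[pixels.Length];
        for (int i = 0; i < pixels.Length; i++)
        {
            if (!mask[i]) { result[i] = pixels[i]; continue; } // keep the real color

            // Perceived brightness of the original pixel (Rec. 601 weights), 0..1.
            double luma = (0.299 * pixels[i].R + 0.587 * pixels[i].G + 0.114 * pixels[i].B) / 255.0;

            // Scale the paint color by that brightness so the original shading
            // survives, which is what makes the fake color look plausible.
            result[i] = new Rgb
            {
                R = (byte)(paint.R * luma),
                G = (byte)(paint.G * luma),
                B = (byte)(paint.B * luma)
            };
        }
        return result;
    }

    public static void Main()
    {
        var photo = new[] { new Rgb { R = 200, G = 180, B = 160 },   // a "real" pixel to keep
                            new Rgb { R = 90,  G = 110, B = 200 } }; // a pixel to repaint
        var mask  = new[] { false, true };
        var pink  = new Rgb { R = 255, G = 105, B = 180 };

        var result = Paint(photo, mask, pink);
        Console.WriteLine($"{result[1].R},{result[1].G},{result[1].B}"); // a pink dimmed by the original brightness
    }
}
```

Run that over a whole photo with a hand-drawn mask and you get exactly the kind of half-real, half-painted image described above.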
In a similar way, we will paint the contours of our personal customized realities. We might not be in a completely believable immersive simulation, but we will not be in plain old simple reality either. For the first time in human history, what is physically there will no longer be what we perceive. We will be the magicians of our own world of fantasy and wonder. I hope you’ll join me on this journey to a new phase in human perception. I’m not sure exactly what it will be like, but I’m sure it’s going to be awesome.