
The Metaverse is Coming and GridRaster Is Ready to Take Us There

Before we plunge headfirst into the fray with gusto and abandon and — of course — aplomb, I’d like to mention that I am a huge fan of virtual reality and augmented reality. We should also remind ourselves that, as I discussed in my What the FAQ are VR, MR, AR, DR, AV, and HR? column, virtual reality and augmented reality form only part of what I call the Reality-Virtuality Continuum.

As illustrated in the incredibly complex diagram below, the Reality-Virtuality Continuum is bounded at one end by 100% physical reality (PR) in which everything is real, and at the other end by 100% virtual reality (VR) in which everything is generated by computer.

When people talk about augmented reality (AR), many of them think in terms of augmenting real-world scenes with additional textual or graphical information only. To quote myself (and who deserves to quote me more?), a better description might be as follows: “AR refers to an interactive experience of a real-world environment in which the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory.”

As an aside, with respect to the haptic devices alluded to in the previous paragraph, as reported by WIRED and myriad other online sources, earlier this week on 16 November, Meta (formerly Facebook, as discussed below) unveiled a prototype high-tech haptic glove. Using this little scamp, if you touch an object in the virtual world, you’ll feel that object with your fingers. If you grasp something, the finger actuators will stiffen up, providing a sensation of resistance and giving the impression that you are really holding something.

Meta’s prototype haptic glove (Image source: Facebook Reality Labs)
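
For those who like to peek under the hood, the sketch below shows — in Python, and in grossly simplified form — the sort of feedback loop such a glove might run. I hasten to add that this is purely my own illustrative concoction (Meta hasn’t published its control code), so every name and number here is an assumption.

```python
from dataclasses import dataclass

@dataclass
class VirtualSphere:
    center: tuple      # (x, y, z) position in meters
    radius: float      # meters
    compliance: float  # penetration depth (m) at which resistance saturates

    def penetration_depth(self, p):
        """How far point p has sunk below the sphere's surface (0 if outside)."""
        dist = sum((a - b) ** 2 for a, b in zip(p, self.center)) ** 0.5
        return max(0.0, self.radius - dist)

def actuator_stiffness(fingertip, obj, max_stiffness=1.0):
    """Map fingertip penetration into an actuator stiffness command in [0, 1]."""
    depth = obj.penetration_depth(fingertip)
    if depth == 0.0:
        return 0.0  # free space: leave the actuator relaxed
    # The deeper the virtual finger sinks, the stiffer the actuator gets.
    return min(max_stiffness, depth / obj.compliance)

ball = VirtualSphere(center=(0.0, 0.0, 0.0), radius=0.05, compliance=0.01)
print(actuator_stiffness((0.0, 0.0, 0.045), ball))  # light touch -> 0.5
```

The real glove, of course, runs loops like this for many actuators at very high rates, but the principle — resistance rising as your fingers close on a virtual object — is as described above.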

However, AR is only one side of the coin. The counterpart to AR is diminished reality (DR), in which information or stimuli are reduced or removed from the real-world scene. Examples would be fading down (or out) extraneous voices or other sounds when you are conversing with someone in a noisy environment, fading or blurring portions of the scene you are viewing, removing colors from all or part of a scene (perhaps to highlight a group of friends in a crowd), or removing objects or people entirely from the reality with which you are engaging.
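
Just for giggles and grins, here’s a tiny NumPy sketch of that color-removal example — again, purely my own illustration rather than anyone’s production DR pipeline — in which everything outside a region of interest is reduced to grayscale:

```python
import numpy as np

def diminish_outside_region(rgb, keep_mask):
    """Desaturate everything outside keep_mask, leaving the masked
    region (say, a group of friends in a crowd) in full color.

    rgb       : H x W x 3 uint8 image
    keep_mask : H x W boolean array, True where color is preserved
    """
    # Luminance-weighted grayscale (ITU-R BT.601 coefficients).
    gray = (rgb @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)
    out = np.repeat(gray[..., None], 3, axis=2)  # grayscale as 3 channels
    out[keep_mask] = rgb[keep_mask]              # restore color inside the mask
    return out

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # stand-in camera frame
friends = np.zeros((480, 640), dtype=bool)
friends[200:320, 250:420] = True  # pretend this is where our friends are standing
diminished = diminish_outside_region(frame, friends)
```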

There’s also the concept of augmented virtuality (AV). As opposed to AR, in which objects and scenes in the real world are augmented with computer-generated information, AV refers to augmenting virtual environments with real-world objects and/or people.

I’ve also heard the terms mediated reality (MR) and mixed reality (MR), but I tend not to use these anymore because different people define them in different — often contradictory — ways. Having said this, I’m increasingly seeing the XR appellation being used to encompass the Reality-Virtuality Continuum in the form of AR/DR, AV, and VR. Actually, this may be a good place to note that, in the fullness of time, we will almost certainly use a single headset — which may well look like a stylish pair of wrap-around spectacles — to satisfy all our XR needs, switching between PR, AR/DR, AV, and VR views of the world as required.

Moving on, the term “metaverse” is a portmanteau of “meta-” and “universe.” It was coined by author Neal Stephenson in his 1992 science-fiction novel Snow Crash to describe a VR-based successor to the internet.

A 2D version of a metaverse appeared on the scene in 2003 in the form of Second Life, which is developed and owned by the San Francisco-based firm Linden Lab. Based on the Havok physics engine, this application — which runs on Windows and macOS (development on Linux is paused) — allows people to create an avatar for themselves and have a second life in an online virtual world (i.e., a metaverse). Users (a.k.a. residents) are able to interact with places, objects, and other avatars, participate in group activities, and build, create, shop, and trade virtual property and services with one another. Second Life even has its own virtual currency called the Linden Dollar that is exchangeable with real-world currency. Several years ago, for example, I heard of someone who had opened a store in Second Life to sell custom-designed clothes that could be worn by other avatars. This person was making enough money in Second Life to live well in the real world, and this is just a 2D metaverse!

Did you ever read Ready Player One, Ernest Cline’s 2011 debut science-fiction novel? There was also a 2018 movie of the same name directed by Steven Spielberg (the movie was good, but the book was better). This somewhat dystopian tale is set in 2045. As summarized by Wikipedia: “The world is gripped by an energy crisis and global warming, causing widespread social problems and economic stagnation. The primary escape for most people is a virtual universe called the OASIS, which is accessed with a visor and haptic gloves. It functions both as a massively multiplayer online role-playing game (MMORPG) and as a virtual society.”

When Snow Crash crashed onto the scene in 1992, the idea of a 3D VR metaverse truly seemed to be the stuff of science fiction. Nineteen years later, when Ready Player One made its debut in 2011, the concept of us all spending significant amounts of time in a 3D VR metaverse still seemed somewhat improbable, and certainly not something that was coming our way anytime soon. Of course, another 5-year hop forward in time brought us to 2016 and the introduction of the Oculus Rift. It may surprise you to hear that I’m rarely an early adopter of new technology (I prefer to let other people pay the premium and work the bugs out first), but I made an exception in the case of the Oculus, and I splashed the cash for the first unit I could lay my sticky fingers on.

One more 5-year hop through time brings us to today, 2021, when talk of metaverses is rife. Take Victoria VR, for example, which is based on the Unreal Engine (see also Want to Join Me in a Photorealistic VR Metaverse?). Although I don’t believe Victoria VR’s creators have opened their virtual doors yet, they exuberantly promise that in their metaverse we will be able to “work, make money, have fun, network and make new friends, and maybe even find love.”

On Thursday 28 October 2021, just three weeks ago as I pen these words, Mark Zuckerberg told attendees at the company’s annual Connect conference that, although Facebook, WhatsApp, and Instagram will all be keeping their current names, the parent company that owns and controls them will henceforth be known as Meta. Not surprisingly, many see this as an attempt to eventually become synonymous with the metaverse concept.

As an additional tidbit of trivia and nugget of knowledge, according to The Verge, by the spring of this year, there were already 10,000 Facebook employees working in the Reality Labs Division, which is about 20% of Facebook’s 50,000 worldwide workforce. Also, you may find The Metauniverse is Taking Over the Physical World article on Interesting Engineering to be… well… interesting (especially the embedded videos).

When I first started playing with my Oculus Rift, I was transported to alien worlds. Unfortunately, I was also tethered to the real world by the cables linking the headset to the monster computer powering the graphics. I now have an untethered Oculus Quest headset, which is an amazing piece of equipment and a lot of fun, but the images it displays are not of sufficiently high resolution to be photorealistic, which is what we all really, really want.

The problem is that mobile devices simply don’t have enough computational horsepower and battery life to render these scenes at the fidelity we desire. This may change with the introduction of as-yet-unknown technologies, but it’s probably not going to happen anytime soon.

The alternative is for all of the heavy lifting to be performed in the cloud or, in the case of enterprise-level users like aerospace companies, in a local on-site fog (in the same way that fog is closer to the ground than clouds, fog computing employs on-site servers that are closer to the end-users, thereby bringing cloud computing capabilities closer to the edge — see also What the FAQ is the Edge vs. the Far Edge?).
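
To make this division of labor concrete, the toy Python sketch below captures the shape of a split-rendering loop: the headset merely tracks the user’s pose and displays pixels, while the cloud or fog server does all the rendering. (The class and numbers are my own illustrative assumptions; real systems stream compressed video over 5G or Wi-Fi, but the structure of the loop is the point.)

```python
import time

class FogServer:
    """Stand-in for an on-site rendering server. In a real deployment, this
    would be a GPU box on the premises, reached over Wi-Fi or 5G."""
    def render(self, head_pose):
        # Pretend to rasterize a photorealistic frame for this pose.
        return f"frame for pose {head_pose}"

def headset_loop(server, frames=3, target_fps=72):
    """The headset's job in miniature: track pose, fetch pixels, display."""
    budget_ms = 1000.0 / target_fps  # per-frame latency budget
    for i in range(frames):
        pose = (0.0, 1.6, 0.0, i * 5.0)  # x, y, z, yaw -- fake tracking data
        start = time.monotonic()
        frame = server.render(pose)      # a network round trip in real life
        elapsed_ms = (time.monotonic() - start) * 1000.0
        print(f"{frame}: {elapsed_ms:.3f} ms (budget {budget_ms:.1f} ms)")

headset_loop(FogServer())
```

The reason latency matters so much is that each rendered frame must land back on the headset within the per-frame budget (roughly 14 ms at 72 frames per second), which is exactly why moving the servers from a distant cloud to a nearby fog helps.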

The reason I’m waffling on about all of this here is that I was just chatting to Dijam Panigrahi, who is a co-founder and COO of GridRaster.

The metaverse is heading our way (Image source: GridRaster)

Dijam explained that, when the original specifications for 5G were defined, one of the key applications discussed was XR (predominantly in the form of AR and VR). Dijam and his co-founders had experience in things like mobile devices, wireless networks, and cloud computing, and they knew that something special was going to be required to bring high-end immersive experiences to mobile devices.

They founded GridRaster in 2015. Initially, they targeted gaming, entertainment, and education applications, but they quickly realized that a lot of variables needed to fall into place — including content, infrastructure, and user adoption — for the medium to take off with mainstream users. By comparison, enterprise-level organizations in aerospace, defense, and automotive already had sophisticated 3D CAD models, along with the resources and well-defined infrastructures to match. These organizations were ideal customers for GridRaster’s technologies, which include low-latency remote rendering and the ability to overlay virtual objects on physical assets with millimeter precision, such that the virtual objects are almost indistinguishable from real objects in the physical world. Furthermore, GridRaster has developed the software infrastructure to scale these capabilities from tens and hundreds of users to millions and — eventually — billions.
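
GridRaster hasn’t published its registration code, of course, but the geometric heart of overlaying a virtual object on a tracked physical asset boils down to applying a rigid transform to the model’s vertices, along the lines of the following NumPy sketch (all names and numbers are my own illustrative assumptions):

```python
import numpy as np

def anchor_model_to_asset(model_vertices, asset_pose):
    """Express a virtual model's vertices in world coordinates so the
    model sits on its tracked physical counterpart.

    model_vertices : N x 3 points in the model's own coordinate frame
    asset_pose     : 4 x 4 rigid transform (rotation + translation)
                     reported by the tracking system for the real asset
    """
    n = model_vertices.shape[0]
    homogeneous = np.hstack([model_vertices, np.ones((n, 1))])  # N x 4
    return (asset_pose @ homogeneous.T).T[:, :3]

# Toy example: asset detected 2 m in front of us, rotated 90 degrees about Z.
theta = np.pi / 2
pose = np.array([[np.cos(theta), -np.sin(theta), 0.0, 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0, 2.0],
                 [0.0,            0.0,           1.0, 0.0],
                 [0.0,            0.0,           0.0, 1.0]])
corners = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.1]])
print(anchor_model_to_asset(corners, pose))
```

The hard part — and presumably GridRaster’s secret sauce — is estimating that pose accurately and continuously enough to hold the overlay to millimeter precision.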

Of particular interest to enterprises that wish to future-proof their investments is the fact that GridRaster is agnostic with regard to which XR headsets are used and which off-site cloud or on-site fog computational resources are employed.

As a result of all this, the guys and gals at GridRaster are currently working with a number of enterprise-level clients. These clients include one of the world’s largest aerospace companies, which uses XR not only for conceptualizing the design of airplanes and spacecraft, but also for assembly, training, and aftersales activities like maintenance and servicing. Furthermore, just a couple of months ago at the time of this writing, GridRaster was awarded an SBIR Phase II contract by the U.S. Air Force (USAF).

Now, I like enterprises as much as the next nerd, but I’m more interested in when I can get to use these capabilities myself. If you’d asked me as little as a year ago, I would have said that having access to high-fidelity XR or being able to roam around a photorealistic metaverse was still a long way out. If pressed, I might have guessed in the 20-year timeframe. By comparison, Dijam is more bullish, saying that he thinks things are now moving so fast that we may well see spectacular developments in a three-to-five-year window.

There are a couple of things that haunt me about all this. The first was triggered by the Hyper-Reality video on YouTube. Created by Keiichi Matsuda, this video presents a provocative and kaleidoscopic vision of a future where physical and virtual realities have merged and the user is saturated by media.


The part I’m thinking of is around the 3:50 mark, when the user is in a brightly lit supermarket and her system crashes, at which point we get to see the dingy, grungy real world. Then the system reboots and everything returns to exciting and vibrant colors and sounds. So, now we have two things to worry about: (a) information overload and (b) growing to prefer the augmented world to the real one.

The other thought that keeps on rattling around (what I laughingly call) my mind is, “What happens if we get used to all this and then something bad happens and it goes away and we don’t know how to live without it?” Just a few days ago, I lost access to the internet in my office (some plonker had cut through the cable a mile or so up the road) and I was “dead in the water,” as it were. I’ve had shorter months than the two hours it took for service to be restored. Most people associate the English author E. M. Forster with his 1924 novel A Passage to India, which is set against the backdrop of the British Raj and the Indian independence movement in the 1920s. Less well known is his 1909 short story The Machine Stops, which is set in a future world where everyone lives in isolation in individual rooms in vast underground cities. They communicate with each other via a kind of instant messaging/video conferencing machine (remember that this tale was originally told in 1909), all their bodily needs are met by an omnipotent, global machine, and then the machine stops…

Amazingly, more than 100 years after its first publication, this book is still available on Amazon (I have a copy here in my office). It’s pretty grim stuff and it certainly makes you think. Having said this, I can’t wait to see what happens next. How about you? What do you think about all of this?
