Published December 28, 2021

What the concept of the Metaverse means to someone who has been looking at AR and VR for decades.

As a child of the ’70s who was consumed by technology, I got to watch the user interface evolve, starting from computers that provided only a hex keypad for input, eight individual LEDs for output, and well under 1K of RAM.

I was delighted by what you could do with a keyboard and a teletype printer. I remember the green screens, acoustic couplers, bulletin boards, and Perkin Elmer minis that didn’t come with floating-point arithmetic.

I remember working out how to set individual pixels to create graphical images, and how to modulate the beep from an internal speaker into musical notes and then more complex sounds.

I remember CP/M and the launches of PC DOS, MS-DOS, Linux, Windows, the Macintosh, SCO Unix, and dozens of operating systems that are now far less well known.

I remember the movement towards structured languages and code libraries, and the voices of that “older generation” of coders who loved the challenge of building their code from the ground up, and who sounded just like the audiophiles who complained that CDs didn’t sound as good as vinyl.

Today code is social: we build new applications on the work of millions without having to re-solve problems that have already been solved.

Amazingly enough, Moore’s law still seems to hold.

Virtualized platforms have freed us from the cost and complexity of building a computing environment to support each need.

And the user interface has moved to touchscreens and Alexa, with machine learning able to recognize our questions through pattern recognition.

VR and AR headsets allow us to experience the user interface in ways that match our senses. It makes sense that this will only continue to get better, to the point where we can engage more of our senses, allowing virtuality to mimic reality.

We all use Zoom-like apps to attend meetings and events. But every Zoom call starts with most of the attendees asking again and again whether you can hear them, and ends with people who are distracted by an email they’re reading in the background and can’t find the unmute button.

VR has a significant advantage: quite simply, by being so deeply immersive, it makes it hard to mix your realities. You are consumed by VR and forced to ignore reality while participating.

The possibilities are endless.

But the challenge is that the quality of the user experience in VR has to be exceptionally high: because it mimics reality, we spot errors much more easily, and small errors become impossible to tolerate.

  • If the perspective of an image is “off,” you will feel seasick.
  • If the sound is not perfect, you will get a headache.
  • If the quality of the representation of a person you are interacting with is poor, you will disengage immediately.

We are used to poor performance or a bad user experience in a screen/keyboard or touchscreen app. We may find it annoying or tedious, but it is unlikely to stop us from using it.

With VR, performance issues or a poor user experience will make an app entirely unusable, to the point of actually making the user unwell.

This places a much higher quality bar on development and operations teams.

VR and mixed reality (AR) can create health issues, both physical and mental.

What does this mean for the development process? Will doctors, psychologists, and ethicists be required as part of the dev and QA process?

When a bank builds an app for the Metaverse, where the engagement will be so deeply immersive, what regulation, testing, and certification will be required?

This is truly a new world, and I’m not convinced that the people who can’t effectively manage the ethics of social media or search today have the skills to define the boundaries of this platform.

What kinds of standards and tests need to be in place to ensure that, as VR and AR become the UI of the Metaverse, the problems we have seen with each previous UI don’t become exponentially more difficult to manage?

I’m a regular user of VR and I love the concept, but I recognize the problems it could unleash on society if it isn’t handled with the care given to a newly developed drug.