Designing for the Metaverse

Anticipating Approaches and Challenges

Humanity has tangibly witnessed the digitalization of everyday life over the last 30 years, and the velocity of this transformation is increasing exponentially. The COVID-19 pandemic increased the amount of time we spend in the digital world through websites, apps, social networks, and now even work. This immersion in an increasingly digital realm of existence has served as a catalyst for leveling up, and the metaverse is our first foray into an existence that, until this point, has been conceivable only through science fiction.

Books, television, movies, and video games have provided a context for what this brave new world will look like. But how do we, as designers, shape the user experience? What types of approaches should we consider, and what challenges should we anticipate when entering this new layer of digital interaction?


History has shown us that there is a recognizable transition pattern to adoption during evolutionary leaps forward, be they social, economic, and/or technological. New paradigms of communication (new mediums, if you will) have employed, and in reality must employ, existing practices in order to move users from the innovation phase into early adoption. The reason is that innovators, recognizing the intrinsic value of the new medium, will champion the evolutionary jump, but early adopters need a nudge: effectively, a transitionary hybrid between the medium's singularities and elements from mainstream mediums.


There are countless historical, real-world examples of this process, but the most relevant is the introduction of touchscreen smartphones in the mid-2000s. The early interfaces presented clickable (tappable) elements that looked like physical objects with volume and depth. Since on-screen buttons could not feel like the physical buttons cell-phone users were accustomed to, designers guided users with visually accurate representations of buttons, which gave users a sense of familiarity, control, and savvy, thereby easing the transition. This aesthetic, known as skeuomorphism, was then gradually replaced by minimal, flat design (such as Microsoft's Metro) as the user base of touchscreen smartphones expanded from the innovators and early adopters into the early majority.

Using the adoption path of touchscreen devices as an example of what is in store for the metaverse, we can expect that critical interactions and engagements spanning a broad range of user demographics will probably emulate "2D" experience dynamics. Where possible, designers will take advantage of the spatial properties of the 3D environment in ways the user can relate to (e.g., "taking" an object/avatar/asset as one would in physical space, an experience that, admittedly, is increasingly left behind by the digital evolution itself). Additionally, we can posit that in its first stages, the metaverse will rely heavily on the imitation of real-world objects as direct visual metaphors. In time, new objects will arise from the internal dynamics of this new medium, objects that will not necessarily represent any concrete real-world counterpart.


As with all (r)evolutionary jumps, the benefits are counterbalanced by both expected and novel risks and challenges. The metaverse's untapped potential for expanding people's interactions in a virtual landscape (a world within a world) via social networks, or even as an extension of them, means it will undoubtedly inherit many of the current privacy and intrusiveness issues of social networks, as well as their lack of transparency in the manipulation of personal data. None of these issues is solved on the current playing field, and there is a risk of them becoming much worse in the metaverse.

Let's consider the systems that track users' facial expressions and reactions, and how these systems train algorithms to interpret those expressions as a means of funneling targeted marketing and product placements to the user. Louis Rosenberg makes a case for aggressive regulation of the metaverse by identifying potentially dangerous outcomes and manipulations of users that may arise from this additional layer of tracking. Rosenberg is not wrong in his assessment, and it should not be downplayed, considering that social network algorithms have been shown to swing support for political movements and parties during election cycles.

Another challenge is that today's user interface designers come with baggage: patterns and layout strategies (e.g., digital dashboards, websites, and applications) that work in the 2D realm but will be of less use in the metaverse. Think of the HUD elements and contextual menus in any contemporary 3D game; they persist, but they are not the core component of the experience and interaction. This is because the concept is entirely different: by assuming a digital persona in a virtual space that aspires to resemble and represent many of the structures we know from the "real world", the user is encouraged to inhabit and explore the metaverse in a more immersive way. In this respect, the goal of quickly and effectively finding information or performing actions retains its intrinsic value, but it is no longer the central point. The experience of the new medium is one of exploring and living in (perhaps even owning) the virtual space. This shift transforms designers into something closer to architects and urbanists, touching on sociology and philosophy, and potentially extending into law, politics, and economics, because what is at stake is the genesis of a "meta society".

Last, but certainly not least: if you are experiencing the metaverse through, say, a VR headset, what is the analog of the navigation we currently do in 2D, with its constant hyperlinking and jumping between tabs? In virtual reality environments, transitions between scenes must be carefully designed so that the user does not become disoriented. The "natural" way out of this is simple movement between the different meta-places, just as one would move in the physical world. But of course, new modes of navigation will be refined as the metaverse develops, allowing for natural multitasking, or even multi-staging, without leaving the user feeling disjointed or dislocated.
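One common technique for avoiding that disorientation is the "blink" transition: fade the view to black, swap the scene while nothing is visible, then fade back in. As a minimal sketch (all names, timings, and the function shape here are illustrative assumptions, not any particular engine's API), it can be modeled as a pure function of elapsed time:

```typescript
// Illustrative "blink" transition for VR scene changes.
// The view fades to black, the destination scene is swapped in while the
// screen is fully dark, then the view fades back in. Timings are examples.

interface BlinkFrame {
  opacity: number;          // 0 = scene fully visible, 1 = fully black
  showDestination: boolean; // false = origin scene, true = destination scene
}

function blinkTransition(elapsedMs: number, fadeMs: number = 300): BlinkFrame {
  const t = Math.max(0, elapsedMs);
  if (t < fadeMs) {
    // First half: fading to black, still rendering the origin scene.
    return { opacity: t / fadeMs, showDestination: false };
  }
  if (t < 2 * fadeMs) {
    // Midpoint onward: the scene has been swapped behind full black,
    // and the destination fades back in.
    return { opacity: 1 - (t - fadeMs) / fadeMs, showDestination: true };
  }
  // Transition complete.
  return { opacity: 0, showDestination: true };
}
```

A render loop would call this each frame with the time since the user triggered a jump and draw a black overlay at the returned opacity; because the swap happens only when the screen is fully dark, the user never sees an abrupt cut.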

Takeaways

The metaverse presents us with the opportunity to create meaningful experiences and visual languages for a medium that puts the user in a "player" position. We will transition from putting our pages and products in front of the user to putting the user inside the product. The potential we have as designers to use products, services, and ideas to inspire users through immersive experiences that maximize connection and engagement is unprecedented. The dystopian specter is obviously lurking around this topic, and, as designers and developers, it is up to us to circumvent these challenges and mitigate the risks of these new interactions.
With one foot still on the ground, knee spring-loaded, poised and ready for the leap, we, as designers, must be prepared to launch into the unknown possibilities of transforming science fiction into technological reality. As innovators, the birth of true VR will be our legacy, but only IF we understand how to introduce and evolve it such that early adopters give way to the early majority, and IF the early majority feels safe enough for VR to succeed through the innovation lifecycle.

With more than a decade of experience designing innovative solutions across the private and federal landscape, Mobomo, LLC places a premium on thought leadership within our ranks. For further insight into the cutting-edge solutions we have developed for our customers, visit us or reach out directly and discover the ways we can help put our proven dedication to excellence to work for your organization.