Metaverse and UI: how will we interface in virtual worlds?
The metaverse has become a viral topic, and many companies have decided to invest resources and technology in it. For this reason, one of the key issues to analyze, for companies and users alike, is the interface.
After all, even the best technology is unusable if its interface is not intuitive and efficient.
In the field of video games, a screen and a controller are all you need. But what about professional applications?
The best interface is invisible
The first step toward virtual reality is a functional user experience. The ultimate goal is to offer an experience as close to reality as possible. With this vision, anything that 'lightens' the load on the user is seen in a positive light. Compared to a screen, voice control is clearly preferred. In the future, it will be possible to control objects and actions through our gaze and even our thoughts. That future is not so far away, and some companies are already working on it.
Present and prospects
Experimentation and research are focusing on improving hand-held devices, such as Valve's Index Controller. These devices are characterized by numerous sensors that track the natural movements people make to grasp, throw, or release an object within virtual worlds.
At the moment, valid alternatives include the VRfree and VRfree Haptic Glove by Sensoryx AG. However, devices are not limited to replicating hand movements. Some can replicate the feeling of walking or even running, such as Cybershoes, or VRfree Sole, a new technology used especially in the medical field.
We can't forget platforms either. However, for reasons of cost and size, they are not suitable for home use, which is why they are mainly intended for amusement arcades.
At the moment, two main companies stand out in their experimentation with the interfaces of the future.
The first is Meta, Facebook's evolution toward social connection within the metaverse. Zuckerberg has decided to focus on eye tracking. At first, eye tracking was mainly used as a tool to analyze user behavior and maximize marketing results. Nowadays, it allows Oculus users to execute commands simply through their gaze and eye movements, freeing them from additional media or devices.
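Gaze-driven commands like these are commonly built on dwell-based selection: a UI target counts as "clicked" when the gaze rests inside it for a set time. The sketch below is a minimal illustration of that technique in Python; the class name, coordinate layout, and 0.8-second threshold are assumptions for the example, not any Oculus or Meta API.

```python
class DwellSelector:
    """Minimal sketch of dwell-based gaze selection: a rectangular UI
    target is 'selected' once the gaze stays inside it for dwell_s seconds."""

    def __init__(self, targets, dwell_s=0.8):
        # targets: dict of name -> (x, y, width, height) in screen coordinates
        self.targets = targets
        self.dwell_s = dwell_s
        self._current = None  # target the gaze is currently inside
        self._since = None    # timestamp when the gaze entered it

    def _hit(self, x, y):
        # Return the name of the target containing the gaze point, if any
        for name, (tx, ty, w, h) in self.targets.items():
            if tx <= x <= tx + w and ty <= y <= ty + h:
                return name
        return None

    def update(self, x, y, t):
        """Feed one gaze sample (x, y) at time t (seconds).
        Returns the selected target's name, or None."""
        target = self._hit(x, y)
        if target != self._current:
            # Gaze moved to a different target (or off all targets): restart dwell
            self._current, self._since = target, t
            return None
        if target is not None and t - self._since >= self.dwell_s:
            self._since = t  # reset so the target does not re-fire every frame
            return target
        return None
```

A typical frame loop would call `update` with each new gaze sample: the first samples inside a target return `None` while the dwell timer runs, and the target's name is returned once the threshold is reached.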
Elon Musk is also turning his gaze to the future, studying an even more invisible type of interface, the neural interface, through Neuralink. In this way, Musk aims to free the user completely from any external device, with the mind as the only tool. The implant is placed in the brain and, through it, it will be possible to command interfaces.
The best interface of the metaverse could be the most common one.
Realistically, it is unlikely that everyone will be able to access the metaverse through the latest generation of headsets or interfaces any time soon.
The first access to virtual worlds will certainly be mediated by a screen that almost everyone already owns, such as a computer or mobile phone screen. Even now, with Web3D, we can access virtual reality.
Full article: https://www.anothereality.io/blog/