Hello everyone! 

 

It’s a pleasure to be here today to share with you the latest news from Enoverse, and please believe me when I say that I’m truly excited about it. 

Our goal has always been to enable our clients to open their own metaverse to the world, setting themselves apart through quality and through the possibilities for interaction with other avatars, and even with the NPCs present in the space.

 

In line with this goal, the first innovation we are introducing, which we’ve named “i-direction,” allows space administrators to stream content created directly in the metaverse, in 4K resolution, to any social media platform they choose. The tool supports “n” cameras, giving all VR creators access to true live direction within the metaverse.
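
To make the idea a little more concrete, here is a minimal TypeScript sketch of what an i-direction-style configuration boils down to: a resolution, a set of cameras, and one or more stream targets. All of the type and field names below are hypothetical illustrations, not part of any published Enoverse API.

```typescript
// Hypothetical sketch of an "i-direction"-style configuration: a target
// resolution, a set of cameras, and the platforms the feed is pushed to.
// These names do not come from a published Enoverse API; they are only
// here to illustrate the pieces involved.

interface CameraSetup {
  id: string;
  label: string; // e.g. "Stage close-up", "Audience wide"
}

interface StreamTarget {
  platform: "linkedin" | "facebook" | "youtube" | "website";
  ingestUrl: string; // live platforms generally ingest via RTMP(S)
  streamKey: string;
}

interface IDirectionConfig {
  resolution: { width: number; height: number }; // 3840x2160 for 4K
  frameRate: number;
  cameras: CameraSetup[]; // "n" cameras, no fixed upper bound
  targets: StreamTarget[]; // the same feed can go to several platforms
}

const config: IDirectionConfig = {
  resolution: { width: 3840, height: 2160 },
  frameRate: 60,
  cameras: [
    { id: "cam-1", label: "Stage close-up" },
    { id: "cam-2", label: "Audience wide" },
  ],
  targets: [
    { platform: "linkedin", ingestUrl: "rtmps://example-ingest/live", streamKey: "<key>" },
  ],
};

console.log(`Streaming ${config.cameras.length} cameras at 4K to ${config.targets.length} target(s)`);
```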

 

Imagine the scenic, engaging effect of a video presentation, a webinar, or a virtual press conference shared on LinkedIn, Facebook, your website, and anywhere else you like, with audio-video quality that was previously impossible and direction that focuses on the audience as well as on the speakers and the content being presented: an impeccable streaming experience of extraordinary quality, offered to the whole world.

 

The challenge was a tough one, but in the end we found our way through the maze: some of our spaces have been completely re-engineered and packaged as an executable application that bypasses the bottleneck represented by WebGL, eliminating any compromise on quality, with 4K textures, impeccable lighting, even more realistic avatars, and our “dynamic ecosystem” more surprisingly vibrant than ever.

 

But we didn’t stop there. 

 

The insight was to place not one but multiple cameras into the re-engineered space, along with a control panel for managing the transition from one to another both intuitively and quickly. The result? Broadcasts that can emphasize a pivotal moment of a presentation with a close-up, show the audience at the event with a wider shot, get creative with an angled view, or zoom in to frame the screen behind the speakers. No more monotonous broadcasts, blurry images, distorted voices, or videos of uncertain quality.
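
Purely as an illustration of the logic behind such a control panel (the names below are hypothetical sketches, not our actual implementation), a live director essentially keeps track of which camera is live and cuts between predefined shots:

```typescript
// Minimal sketch of a camera-switching "director": one camera is live at a
// time, and the operator cuts between named shots from the control panel.
// All names here are hypothetical and only illustrate the idea.

type ShotName = "close-up" | "wide" | "angled" | "screen-zoom";

class LiveDirector {
  private activeCamera: string;

  constructor(private shots: Record<ShotName, string>) {
    // Start on the wide shot so the audience is visible by default.
    this.activeCamera = shots["wide"];
  }

  // Cut instantly to the camera assigned to the requested shot.
  cutTo(shot: ShotName): void {
    this.activeCamera = this.shots[shot];
    console.log(`Now live: ${shot} (${this.activeCamera})`);
  }
}

// Example: emphasize a key moment, show the audience, then frame the screen.
const director = new LiveDirector({
  "close-up": "cam-1",
  "wide": "cam-2",
  "angled": "cam-3",
  "screen-zoom": "cam-4",
});

director.cutTo("close-up");    // pivotal moment of the presentation
director.cutTo("wide");        // audience present at the event
director.cutTo("screen-zoom"); // screen behind the speakers
```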

The second significant innovation I want to tell you about, also introduced recently, is a tool for managing audio/video playlists. The need came from one of our clients, who wanted to broadcast YouTube videos in their space without necessarily having to be present.

 

Challenge accepted. 

 

The first step was to create, in Enoverse’s backoffice, a repository for multimedia files in which to store audio, video, images, and documents of all kinds. Still in the backoffice, we then implemented a tool for creating playlists: something simple that only requires the playlist’s name, when to start publishing it in the metaverse and when to stop, and a selection of the files that make it up.
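
To give a rough idea of how little information is needed (the shape below is only an illustrative sketch, not the real backoffice schema), a playlist is essentially a name, a publication window, and the list of repository files that compose it:

```typescript
// Hypothetical shape of a playlist as described above: a name, a publication
// window, and the repository files that compose it. This is an illustration,
// not the real Enoverse backoffice schema.

interface MediaFile {
  id: string;
  kind: "audio" | "video" | "image" | "document";
  title: string;
  url: string;
}

interface Playlist {
  name: string;
  startAt: Date;      // when publishing in the metaverse begins
  endAt: Date;        // when publishing stops
  items: MediaFile[]; // files selected from the media repository
}

const demoPlaylist: Playlist = {
  name: "Product launch loop",
  startAt: new Date("2024-06-01T09:00:00Z"),
  endAt: new Date("2024-06-01T18:00:00Z"),
  items: [
    { id: "vid-1", kind: "video", title: "Teaser", url: "https://example.com/teaser.mp4" },
    { id: "vid-2", kind: "video", title: "Keynote recap", url: "https://example.com/recap.mp4" },
  ],
};
```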

 

Once you click “save,” the playlist starts working autonomously at the indicated time, rotating through its audio and video items until the closing time.
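
The rotation itself can be pictured as a simple check against the publication window plus an index that wraps around; again, this is just a sketch of the behavior described, not the actual scheduler:

```typescript
// Sketch of the autonomous rotation described above (not the real scheduler):
// the playlist is active only inside its publication window, and the item
// index wraps around so the content keeps rotating until the closing time.

interface ScheduledPlaylist {
  startAt: Date;
  endAt: Date;
  items: string[]; // item titles or file ids, kept simple for the sketch
}

function isActive(p: ScheduledPlaylist, now: Date): boolean {
  const t = now.getTime();
  return t >= p.startAt.getTime() && t <= p.endAt.getTime();
}

function nextIndex(p: ScheduledPlaylist, lastIndex: number): number {
  return (lastIndex + 1) % p.items.length; // wrap around to keep rotating
}

// Example: advance one step at a given moment inside the window.
const playlist: ScheduledPlaylist = {
  startAt: new Date("2024-06-01T09:00:00Z"),
  endAt: new Date("2024-06-01T18:00:00Z"),
  items: ["Teaser", "Keynote recap", "Q&A highlights"],
};

const now = new Date("2024-06-01T10:00:00Z");
if (isActive(playlist, now)) {
  const i = nextIndex(playlist, -1);
  console.log(`Now playing: ${playlist.items[i]}`);
}
```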

 

But we didn’t stop there. 

 

To leave nothing out for our clients, we also needed a tool, this time working directly in the metaverse, to broadcast music files or documents of all kinds in real time on the virtual devices present in the space. From within the metaverse, space administrators (and only they) can now click on the virtual device they want to play an audio-video file on, or the screen they want to project a document onto, open a modal to select one or more files, and launch the transmission, which starts immediately.
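
To make the flow concrete, here is a hypothetical sketch of what that interaction boils down to: an admin-only check, a selection of files, and an immediate broadcast to the chosen virtual device. None of the names below belong to a real Enoverse API.

```typescript
// Hypothetical sketch of the in-world broadcast flow described above:
// only a space administrator can pick a virtual device, choose files from
// the repository, and start the transmission immediately. Not a real API.

interface User { id: string; isAdmin: boolean; }
interface VirtualDevice { id: string; kind: "screen" | "speaker"; }
interface RepositoryFile { id: string; title: string; url: string; }

function broadcastToDevice(user: User, device: VirtualDevice, files: RepositoryFile[]): void {
  if (!user.isAdmin) {
    throw new Error("Only space administrators can start a broadcast.");
  }
  if (files.length === 0) {
    throw new Error("Select at least one file from the modal.");
  }
  // In this sketch, "broadcasting" is just logging; in a real space this is
  // where the selected files would start playing on the chosen device.
  for (const file of files) {
    console.log(`Broadcasting "${file.title}" on ${device.kind} ${device.id}`);
  }
}

// Example: an admin projects a document onto a screen in the space.
broadcastToDevice(
  { id: "admin-1", isAdmin: true },
  { id: "screen-main", kind: "screen" },
  [{ id: "doc-1", title: "Quarterly deck", url: "https://example.com/deck.pdf" }],
);
```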

 

I admit that we are now really curious to see how the Enoverse community will use these incredible resources to share ideas, information, and passions with the whole world. It will be really exciting.

 

Until the next Dev Diary.