Nvidia, an American multinational technology company, is stepping up its efforts to establish a presence in the metaverse. On 9 August, the company unveiled a new set of metaverse-focused developer tools, including new AI capabilities, simulations, and other creative assets.
According to Nvidia, one of the primary functions of the tools will be to aid in the creation of “accurate digital twins and realistic avatars.” Simply put, that means the new features will make life in the metaverse more realistic.
The new features will be available to creators who use the Omniverse Kit, as well as apps like Nucleus, Audio2Face, and Machinima. Nvidia Omniverse is the company’s real-time graphics collaboration platform, which developers use to create 3D simulations that can be added to the metaverse.
Other additions include the Omniverse Avatar Cloud Engine (ACE), which Nvidia says will make it easier to build “virtual assistants and digital humans.”
Nvidia PhysX, an “advanced real-time engine for simulating realistic physics,” is another tool being introduced. With it, developers can make objects in the metaverse respond to interactions in ways that obey physical laws.
So far, Nvidia’s AI technology has played an important role in creating spaces for social interaction in the digital universe. The new releases suggest Nvidia is on track with its plans to make the metaverse more realistic, and the metaverse market could well expand to surpass $50 billion within the next four years.