Nvidia, Meta CEOs Discuss AI Research Fundamentals at SIGGRAPH 2024 

At the SIGGRAPH 2024 event, Nvidia’s series of AI announcements revealed new tools and technologies, several of them in collaboration with Meta.

The newly forged partnership will combine Nvidia’s and Meta’s expertise in building innovative technology to push the AI industry further.

Nvidia’s CEO, Jensen Huang, kicked off the event by announcing a set of new AI tools that integrate AI capabilities with simulation technologies.

Among the highlights were seven key innovations introduced by Nvidia, setting the stage for what could be a major leap in global AI advancements.  

  • James, the Digital Human Assistant

James, a digital human assistant powered by Nvidia’s ACE technology – a suite of technologies for bringing digital humans to life with generative AI – acts as a marketing expert within the company.

Acting as an open-source virtual assistant, James answers questions about Nvidia’s data centers and gaming technologies. James can also speak, maintain eye contact, keep users engaged, and integrate with chatbots like GPT.

In a trial of the assistant, James demonstrated a high level of realism, even going as far as interrupting users during conversations.

  • Creative AI with Edify 

The second AI feature introduced by Nvidia is Edify, a multimodal AI architecture designed to generate digital content such as images, videos, 3D models, and 360-degree HDR environment maps.  

Edify gives companies the ability to train generative AI models on their privately licensed data and generate content from simple text prompts, improving efficiency by producing high-quality content with fewer training images.

The architecture can fine-tune models to match specific styles or learn unique characters, making it a versatile tool for digital content creation. Edify powers services like Getty Images’ generative AI, allowing users to model, manipulate, and master digital assets while respecting copyright.
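
For illustration only, here is a minimal sketch of how a team might request a brand-styled image from an Edify-based service. The endpoint URL, payload fields, and response format are hypothetical stand-ins, since the announcement does not document the API.

```python
# Illustrative sketch only: the endpoint URL, payload fields, and response
# shape below are hypothetical stand-ins, not Edify's documented API.
import os
import requests

API_URL = "https://example.com/v1/edify/generate"  # hypothetical endpoint
API_KEY = os.environ["EDIFY_API_KEY"]              # hypothetical credential

payload = {
    "prompt": "studio photo of a ceramic teapot, soft lighting",
    "style": "brand-style-v2",   # assumed reference to a fine-tuned style
    "output": "image/png",
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()

with open("teapot.png", "wb") as f:
    f.write(resp.content)  # assumes the service returns raw image bytes
```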

  • Meta, Nvidia, and AI Together 

The highlight of the SIGGRAPH 2024 event was Meta CEO Mark Zuckerberg announcing Meta’s AI Studio during his discussion with Nvidia’s Jensen Huang. Zuckerberg explained that AI Studio was developed to encourage creators to build their own AI agents and improve how they manage their time.

Throughout the interview, Zuckerberg frequently mentioned the collaboration with Nvidia in developing AI Studio and Llama 3.1, praising the speed and efficiency made possible by Nvidia’s H100 Graphics Processing Units (GPUs). Referring to the critical role Nvidia played, Zuckerberg remarked to Huang, “You kind of made this.” 

Also, back in July, Meta announced the launch of its open-source AI model Llama 3.1, trained using Nvidia’s GPUs.

  • AI Assistant for Everyone 

In a conversation with Wired during the event, Huang shared his insights on the future of AI-enhanced human productivity, the energy efficiency of accelerated computing, and the convergence of graphics and AI.

Huang mostly emphasized the growing integration of AI into everyday life, predicting that AI will increasingly serve as a personal assistant. “Every single company, every single job within the company, will have AI assistance,” he stated. 

  • L40S and fVDB from Another Digital World 

As part of the event’s announcements, Nvidia introduced the L40S, a powerful GPU designed to handle demanding tasks, such as AI training and inference, large-scale simulations, and advanced graphics rendering.  

The L40S is particularly adept at managing complex workloads, making it an ideal solution for 3D design, video production, and AI-driven applications. 

Nvidia also introduced fVDB, a new deep learning framework designed to create AI-driven virtual representations for autonomous vehicles, climate science, and smart cities.  

The framework can build larger and more detailed digital models of the world by converting data from various reality-capture techniques into intricate 3D environments that can be interpreted in real time during AI training.
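
The announcement does not show fVDB’s programming interface; the snippet below only illustrates the underlying idea of sparse voxelization – bucketing raw 3D points, such as a lidar scan, into only the grid cells that actually contain data – using plain NumPy rather than fVDB itself.

```python
# Conceptual illustration of sparse voxelization (not the fVDB API):
# bucket raw 3D points (e.g., from a lidar scan) into occupied voxels,
# storing only the cells that contain data.
import numpy as np

def voxelize(points: np.ndarray, voxel_size: float = 0.5) -> dict:
    """Map an (N, 3) array of XYZ points to a sparse {voxel_index: count} dict."""
    indices = np.floor(points / voxel_size).astype(np.int64)
    grid = {}
    for ijk in map(tuple, indices):
        grid[ijk] = grid.get(ijk, 0) + 1
    return grid

# Example: 100,000 random points standing in for a scan
points = np.random.uniform(-50.0, 50.0, size=(100_000, 3))
sparse_grid = voxelize(points)
print(f"{len(sparse_grid)} occupied voxels out of a 200^3 dense grid")
```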

  • AI-Driven Digital Marketing 

Nvidia’s latest announcement at the SIGGRAPH 2024 event will also expand the current horizons of digital marketing, offering advanced control over generative AI through the OpenUSD framework and Nvidia NIM microservices that produce brand-accurate visuals.

OpenUSD (Universal Scene Description) is a framework for collaboration and the interchange of 3D data between source applications, used for creating and manipulating complex 3D scenes.
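
As a small example of the kind of 3D-data interchange OpenUSD enables, the sketch below builds a trivial scene with the open-source `pxr` Python bindings (installable as `usd-core`); it is independent of Nvidia’s tooling and meant purely as an illustration.

```python
# Build a minimal USD scene: a transform containing a sphere,
# saved to a .usda file that any USD-aware application can open.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("scene.usda")
world = UsdGeom.Xform.Define(stage, "/World")
sphere = UsdGeom.Sphere.Define(stage, "/World/Sphere")
sphere.GetRadiusAttr().Set(2.0)

stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()
```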

Nvidia NIM microservices provide the infrastructure needed to build and deploy AI-driven applications such as digital humans and virtual assistants. By integrating AI models with user interfaces, they support real-time multimodal AI interactions and improve the performance and functionality of virtual assistants and interactive digital characters, giving them a more human touch.
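
As a rough sketch, a locally deployed NIM language-model microservice can typically be called through an OpenAI-compatible endpoint; the host, port, and model name below are placeholders for an actual deployment.

```python
# Sketch of calling a NIM-hosted language model, assuming the microservice
# exposes an OpenAI-compatible chat endpoint (host, port, and model name
# below are placeholders for a real deployment).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example model identifier
    messages=[
        {"role": "system", "content": "You are a brand-safe product assistant."},
        {"role": "user", "content": "Describe our new headset in two sentences."},
    ],
)
print(response.choices[0].message.content)
```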

