SIGGRAPH 2024: highlights from day one of this year's conference
SIGGRAPH 2024 kicked off yesterday (28 July) at the Colorado Convention Center in Denver with big-name talks and demonstrations on computer graphics, production, animation, gaming and new technologies.
Highlights so far have included wearable tech for mixed reality experiences in the Emerging Technologies area, Pixar wigs, and glimpses of research into everything from GPU-accelerated rendering of vector brush strokes to directable fractal self-similarity.
Sunday afternoon saw motion capture company Vicon present advances such as its Wand, an LED-equipped accessory that helps calibrate the motion capture system and the cameras used for reference video. Meanwhile, a dedicated session on Bodies, Skin, and Hair saw Roblox demonstrate its system for turning a 3D model into a clothed, animated avatar. In the same session, DreamWorks Animation's CFX team showed its Skin Wrinkles tool, and Pixar discussed its wig refitting process for Inside Out 2, through which it created a scalp that allows hair to be reused on different characters.
In Adobe's Papers Fast Forward forum on Sunday afternoon, Ph.D. students in computer graphics presented their research in three-minute videos. Viewers were apparently impressed by a presentation from Yale's Alexa Schor on a novel method for creating self-similar fractals from arbitrary input shapes. The method introduces "portals" into an iterated map, letting users place self-similarities directly and bridging the aesthetics of iterated maps with the fine-grained control of iterated function systems (IFS), in both 2D and 3D.
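For readers unfamiliar with iterated function systems, the sketch below shows a classic, minimal IFS: the Sierpinski triangle generated with the so-called chaos game. It is purely illustrative background, with hypothetical names and parameters, and is not the method presented in Schor's paper.

import random

# Illustrative only: a classic iterated function system (IFS), the Sierpinski
# triangle built with the "chaos game". Names and parameters are hypothetical
# examples, not taken from the SIGGRAPH paper discussed above.

VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]  # triangle corners

def sierpinski_points(n_points=50_000, start=(0.5, 0.25)):
    """Return points that settle onto the Sierpinski triangle attractor."""
    x, y = start
    points = []
    for _ in range(n_points):
        vx, vy = random.choice(VERTICES)        # pick one of three contraction maps
        x, y = (x + vx) / 2.0, (y + vy) / 2.0   # move halfway toward that vertex
        points.append((x, y))
    return points

if __name__ == "__main__":
    pts = sierpinski_points()
    print(f"Generated {len(pts)} points; last point: {pts[-1]}")

Each of the three maps contracts the plane toward one vertex, and repeatedly applying maps chosen at random traces out the fractal attractor; the research presented at SIGGRAPH builds far more controllable self-similarity than this simple example.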
In the Emerging Technologies area at SIGGRAPH, the University of Maryland's Wearable Computing department is showcasing its exoskeleton, while FEEL TECH Wear is showing how it's enhancing the mixed reality experience with wrist-to-finger haptic attribution.
At the Oracle booth, Beamr Imaging is demonstrating an optimised workflow for producing large, high-resolution videos rendered from 3D designs on Oracle Cloud Infrastructure (OCI), reducing an extremely large video file to a quarter of its original size. The streamlined video comes from an OpenUSD-based scene developed in NVIDIA Omniverse.
Day two of SIGGRAPH 2024 will see the opening Keynote session by Dr. Mark Sagar, co-founder and former Chief Science Officer of Soul Machines and director of the Laboratory for Animate Technologies at the Auckland Bioengineering Institute. There will also be a rare joint public appearance by Nvidia founder and CEO Jensen Huang and Meta founder and CEO Mark Zuckerberg on Monday 29 July at 4pm MDT. They'll discuss the future of AI and simulation, the role of research in AI breakthroughs, and how open-source generative AI can empower developers and creators. They'll also reflect on the role of generative AI in building virtual worlds, and how virtual worlds can help build the next wave of AI and robotics.
AI is likely to feature prominently elsewhere. As well as this fireside chat, there will be keynote presentations from Dr. Dava Newman of the MIT Media Lab, Dr. Mark Sagar of Soul Machines and Manu Prakash of Stanford University, sharing perspectives on the future of technology and computer graphics. See the full schedule on the SIGGRAPH website.