Closing the Gap in Neural Implicit Representations With ‘Spelunking the Deep’
Image by Nicholas Sharp and Alec Jacobson, 2022. Original 3D model by sketchfab user ncassab, used under CC Attribution.
A recipient of a SIGGRAPH 2022 Technical Papers Best Paper Award, “Spelunking the Deep: Guaranteed Queries on General Neural Implicit Surfaces via Range Analysis” introduces a new technique to work with neural implicit shape representations. SIGGRAPH caught up with contributors Nicholas Sharp and Alec Jacobson to learn more about this award-winning research, including how they achieved specific outcomes and how they envision it being used in the future. Plus, as we count down to SIGGRAPH 2023 Technical Papers submissions opening, find out what advice Sharp and Jacobson have for future contributors.
SIGGRAPH: Congratulations on receiving one of the SIGGRAPH 2022 Technical Papers Best Paper Awards! Share some background about “Spelunking the Deep: Guaranteed Queries on General Neural Implicit Surfaces via Range Analysis.” What inspired this research?
Nicholas Sharp (NS): Neural implicit representations — where a shape is represented by a neural network in space — have been a bit of a revolution in computer graphics over the last few years. They’re very exciting and work amazingly well for problems like 3D reconstruction. But there were still some basic gaps in our ability to do low-level geometry operations on these representations, like testing if two shapes overlap. We wanted to close some of those gaps.
SIGGRAPH: Tell us how you developed this work. What problems does it solve?
NS: This project grew out of wanting more techniques to work with neural implicit shape representations. If they’re here to stay in computer graphics, we need to build our toolbox to work with them just like we would with a triangle mesh, for example. This research gives more tools to scientists, engineers, and artists who want to build higher-level applications like simulations or animated characters using neural implicit representations.
SIGGRAPH: What challenges did you face in its development? What surprised you most during the process?
NS: The main technique we used in this project is a computational method called “range analysis.” Once we hit on the idea of using range analysis, we thought the problem was all solved! But then we discovered that basic range analysis didn’t actually work well in practice for many of the tasks we wanted to tackle — we had to dig a bit deeper to a special variant of range analysis called “affine arithmetic” to really make it work.
Alec Jacobson (AJ): Nick is an exceptionally tenacious researcher. We had some ups and downs in the project. Our first attempts using standard interval arithmetic worked on some tasks but failed miserably on others. Rather than giving up or redefining our problem to avoid the hard cases, Nick dug deep into the range analysis literature and found that our problems echoed some of those encountered by others in really different domains. Our paper’s contribution is the direct result of Nick’s dissection of this literature and distillation into techniques that work really well for the tasks people in our community care about.
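The gap between plain interval arithmetic and affine arithmetic that Sharp and Jacobson describe can be seen in a tiny example. The sketch below is illustrative only and not the paper’s implementation (which also handles nonlinear activations): it pushes a box through two linear layers that compose to the zero map. Interval arithmetic forgets that intermediate values are correlated and its bounds balloon, while affine arithmetic carries shared noise symbols through the layers and cancels them exactly.

```python
import numpy as np

def interval_linear(W, b, lo, hi):
    """Bound y = W @ x + b over the box x in [lo, hi] with interval arithmetic."""
    c, r = (lo + hi) / 2.0, (hi - lo) / 2.0
    yc = W @ c + b
    yr = np.abs(W) @ r        # worst case per output; correlations are discarded
    return yc - yr, yc + yr

def affine_linear(W, b, x0, E):
    """Push an affine form x = x0 + E @ eps, eps in [-1, 1]^k, through the
    same layer. Linear maps are exact: correlations survive in E."""
    return W @ x0 + b, W @ E

def affine_to_interval(x0, E):
    """Collapse an affine form to its enclosing interval."""
    r = np.abs(E).sum(axis=1)
    return x0 - r, x0 + r

# Two linear layers that compose to the zero map: g(x) = x - x.
W1, b1 = np.array([[1.0], [1.0]]), np.zeros(2)   # duplicate the input
W2, b2 = np.array([[1.0, -1.0]]), np.zeros(1)    # subtract the two copies

# Interval arithmetic treats the two copies as independent:
lo, hi = interval_linear(W1, b1, np.array([-1.0]), np.array([1.0]))
lo, hi = interval_linear(W2, b2, lo, hi)
print(lo, hi)                                    # [-2.] [2.] -- loose

# Affine arithmetic tracks the shared noise symbol and cancels it exactly:
x0, E = affine_linear(W1, b1, np.zeros(1), np.eye(1))
x0, E = affine_linear(W2, b2, x0, E)
print(affine_to_interval(x0, E))                 # tight: exactly [0, 0]
```

Deep networks apply many correlated affine maps in sequence, which is why this cancellation effect compounds and makes affine arithmetic so much tighter in practice.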
SIGGRAPH: With “Spelunking the Deep,” resulting queries have guaranteed accuracy, even on randomly initialized networks. How did you achieve this outcome?
NS: This property is really important — many neural network-based techniques only “mostly” work when the neural network can be fit well enough to data, but the queries we develop in this work always give the right answer, up to some predefined tolerance, no matter how crazy your data or your neural network is. This property follows from range analysis — our algorithms keep refining the solution until they can be sure the answer is correct.
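The "keep refining until the answer is certain" recipe can be sketched in a few lines. The code below is a simplified illustration under assumed names, not the paper’s algorithm: given any sound range bound `bound_f` (interval, affine, or otherwise), it decides whether the level set {f = 0} meets a box, discarding regions the bound provably rules out and subdividing the rest down to a tolerance. An exact bound for a unit-sphere implicit function stands in for a neural network’s range analysis.

```python
import numpy as np

def box_hits_surface(bound_f, lo, hi, tol=1e-3):
    """Conservatively decide whether the level set {f = 0} meets the box
    [lo, hi]. `bound_f(lo, hi)` must return (fmin, fmax) that provably
    enclose f over the box -- any sound range analysis will do."""
    fmin, fmax = bound_f(lo, hi)
    if fmin > 0.0 or fmax < 0.0:      # bound certifies: no zero crossing here
        return False
    if np.max(hi - lo) < tol:         # cannot rule it out at the tolerance
        return True
    d = int(np.argmax(hi - lo))       # otherwise split the longest axis
    mid = 0.5 * (lo[d] + hi[d])
    hi_left, lo_right = hi.copy(), lo.copy()
    hi_left[d] = mid
    lo_right[d] = mid
    return (box_hits_surface(bound_f, lo, hi_left, tol)
            or box_hits_surface(bound_f, lo_right, hi, tol))

def sphere_bound(lo, hi):
    # Exact range of f(x) = ||x|| - 1 over a box (a stand-in for a network bound):
    near = np.linalg.norm(np.clip(0.0, lo, hi))               # closest point to origin
    far = np.linalg.norm(np.maximum(np.abs(lo), np.abs(hi)))  # farthest corner
    return near - 1.0, far - 1.0

print(box_hits_surface(sphere_bound, np.array([0.5, 0.5]), np.array([0.9, 0.9])))  # True
print(box_hits_surface(sphere_bound, np.array([2.0, 2.0]), np.array([3.0, 3.0])))  # False
```

The guarantee comes entirely from the soundness of `bound_f`: a "no" answer is backed by a certified bound, so it holds for any network, fit to data or not.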
SIGGRAPH: How do you envision this research being used in the future?
NS: We hope that these techniques will become a common tool for manipulating 3D neural implicit shapes. If integrated into codebases and engines, they could allow the graphics community to seamlessly leverage the latest AI-based shape representations in new applications.
SIGGRAPH: SIGGRAPH 2022 was excited to introduce roundtable sessions for Technical Papers. Tell us about your SIGGRAPH 2022 experience. Do any key moments or interactions with SIGGRAPH participants stand out?
NS: I really enjoyed SIGGRAPH 2022. The best part of the conference is interacting one-on-one with the amazing folks in our field, and the roundtable sessions provided even more opportunities to do so.
AJ: Wow. What a relief to be back to attending SIGGRAPH in person! While I appreciated some aspects of the virtual conferences we had earlier in the pandemic, it was so valuable to meet in person with other researchers and finally meet the new cohorts of student researchers.
SIGGRAPH: What advice do you have for someone who wants to submit to Technical Papers for a future SIGGRAPH conference?
NS: Good visualizations and diagrams are key to explaining your work! People sometimes mistake them as merely eye candy, but in reality one good picture can explain your paper better than columns and columns of text. It’s worth investing significant effort to create the visuals that will tell the story of your research.
AJ: My strongest advice is to find collaborators who you personally enjoy working with. The challenges of research are far easier to endure if you’re genuinely looking forward to each opportunity to work on them with your colleagues.
Source: SIGGRAPH Conferences