📢 Our lab has been exploring 3D world models for years — and we're thrilled to share **PhysTwin**: a milestone that reconstructs object appearance, geometry, and dynamics from just a few seconds of interaction! Led by the amazing Hanxiao Jiang: https://lnkd.in/ePg-nUYR

PhysTwin combines **Gaussian splatting** with **inverse dynamics optimization** based on simple **spring-mass** systems. ⚙️ The result? Real-time, action-conditioned 3D video prediction under novel interactions (i.e., 3D world models).

🔑 A few key takeaways:
1. Having the right structure (e.g., particles/masses) helps navigate the trade-off between sample efficiency, generalization, and broad applicability.
2. Visual foundation models (VFMs) have matured to the point where they can provide rich supervision for world modeling (e.g., tracking, shape completion).
3. Beyond VFMs, many crucial components have come together in recent years: Gaussian splats for rendering, NVIDIA Warp for high-performance simulation, and scene/asset generation from a wide range of labs and companies. The future of 3D world models is looking bright! ✨
4. The resulting digital twin supports a wide range of downstream applications, especially in data generation and policy evaluation, thanks to its realistic rendering and simulation capabilities.

🎥 All code and data to reproduce the results, along with interactive demos, are available on the website. Check the following visualizations of: (1) observations, (2) reconstructed state/actions, and (3) interactive digital twins.
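To make the spring-mass idea concrete, here is a minimal sketch of the kind of explicit particle update such a simulator builds on. This is an illustrative toy in NumPy, not PhysTwin's actual implementation (which uses NVIDIA Warp); the function name and parameters are hypothetical.

```python
import numpy as np

def spring_mass_step(x, v, springs, rest_len, k, masses, dt, damping=0.99):
    """One explicit-Euler step of a particle spring-mass system.

    x: (N, 3) positions; v: (N, 3) velocities;
    springs: (S, 2) particle index pairs; rest_len: (S,) rest lengths;
    k: stiffness; masses: (N,) particle masses; dt: time step.
    """
    f = np.zeros_like(x)
    d = x[springs[:, 1]] - x[springs[:, 0]]            # spring vectors
    length = np.linalg.norm(d, axis=1, keepdims=True)
    direction = d / np.clip(length, 1e-8, None)
    # Hooke's law: force proportional to extension beyond rest length
    fs = k * (length - rest_len[:, None]) * direction
    np.add.at(f, springs[:, 0], fs)                    # pull endpoints together
    np.add.at(f, springs[:, 1], -fs)
    f[:, 2] -= 9.8 * masses                            # gravity along z
    v = damping * (v + dt * f / masses[:, None])
    return x + dt * v, v
```

In an inverse-dynamics setting, the stiffness and rest lengths of such a system would be the parameters optimized so that simulated motion matches the observed video.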
Innovations in Real-Time Rendering
-
Here's my 2024 LinkedIn Rewind, by Coauthor:

2024 proved that 3D Gaussian splatting isn't just another tech trend - it's transforming how we capture and understand the world around us. From real-time architectural visualization to autonomous vehicle training, we're seeing practical implementations I could only dream about a year ago.

Through my "100 Days of Splats" project, I witnessed this technology evolve from research papers to real-world applications. We saw:
→ Large-scale scene reconstruction becoming practical
→ Real-time rendering reaching 60+ FPS
→ Integration with game engines and VFX pipelines
→ Adoption by major companies like Meta, Nvidia, and Varjo

Three posts that captured pivotal developments:

"VastGaussians - First Method for High-Quality Large Scene Reconstruction"
Finally bridging the gap between research and AEC industry needs
"This research is specifically tailored for visualization of large scenes such as commercial and industrial buildings, quarries, and landscapes."
https://lnkd.in/gvgpqMNe

"2D Gaussian Splatting vs Photogrammetry"
The first radiance fields project producing truly accurate geometry
"All in one pipeline I can generate a radiance field, textured mesh, and fly renderings - all in less than an hour"
https://lnkd.in/geprBw6j

"HybridNeRF Development"
Pushing rendering speeds while maintaining quality
"HybridNeRF looks better than 3DGS and can achieve over 60 FPS framerate"
https://lnkd.in/gcqdE4iD

Speaking at Geo Week showed me how hungry the industry is for practical applications of these technologies. We're no longer asking if Gaussian splatting will be useful - we're discovering new uses every day.

2025 will be about scaling practical applications - from AEC to geospatial to virtual production. The foundation is laid; now it's time to build.
To everyone exploring and pushing the boundaries of 3D visualization - your experiments today are tomorrow's innovations. Keep building, keep sharing, keep pushing what's possible. #ComputerVision #3D #AI #GaussianSplatting #LinkedInRewind
-
AI-generated Minecraft opens new possibilities for real-time video game creation and customization. AI companies Decart and Etched have developed a version of Minecraft entirely generated by artificial intelligence. This technology uses next-frame prediction to create a playable game environment without traditional coding, potentially revolutionizing game development and virtual world interactions.

🎮 AI generates Minecraft-like games using next-frame prediction: The system creates each frame of the game in real-time based on previous frames and player inputs, mimicking Minecraft's physics and gameplay without pre-programmed rules.
🧠 Extensive training on gameplay footage: The AI model learned game mechanics and physics by analyzing millions of hours of Minecraft gameplay, enabling it to replicate complex interactions and environmental behaviors.
🌈 Voice command potential for instant environment changes: Future iterations aim to allow players to verbally request changes to the game world, such as adding new structures or altering landscapes, which the AI would generate in real time.
💻 Current hardware limitations affect gameplay: The computational demands of real-time AI generation restrict smooth gameplay on consumer hardware, but custom AI chips are being developed to address this challenge.
🚀 Broader applications beyond gaming: Researchers envision using similar technology to create AI-powered virtual entities like doctors or tutors, capable of real-time interaction and adaptation in various fields.

#AIGaming #MinecraftAI #NextFramePrediction #RealTimeGeneration #GameDevelopment #AIInnovation #VirtualWorlds #FutureOfGaming #AITechnology #GameAI
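The loop described above is a standard autoregressive rollout: each frame is predicted from recent frames plus the player's input, with the prediction fed back in as context. A minimal sketch, where `model` and `get_action` are hypothetical placeholders for the learned frame predictor and the input device:

```python
def autoregressive_rollout(model, frames, get_action, horizon, context=8):
    """Generic next-frame-prediction game loop.

    model(past_frames, action) -> next_frame is assumed; game rules are
    implicit in the learned model rather than hand-coded.
    """
    history = list(frames)                    # seed with observed frames
    for _ in range(horizon):
        action = get_action()                 # e.g. keyboard/mouse state
        nxt = model(history[-context:], action)  # condition on recent context
        history.append(nxt)                   # feed the prediction back in
    return history
```

The context window is what makes this expensive on consumer hardware: every rendered frame requires a full forward pass of the model, which is why custom inference chips are being developed.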
-
3D Gaussian Splatting (3DGS) is becoming the new standard for 3D reconstruction, rapidly transforming fields like robotics, autonomous vehicles, AR/VR, gaming, and VFX. With its ability to deliver photorealistic, real-time rendering while capturing large-scale scenes with minimal artifacts, 3DGS is solving complex problems across industries, all without relying on neural networks.

In this article, we break down the Gaussian Splatting paper, explore the key equations, and explain how it achieves its performance. We also guide you through training on your own data using Nerfstudio's gsplat and share tips to get the best results. If you're working on 3D reconstruction or visual computing, this is a resource worth reading: https://buff.ly/3ZKhzrG
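One of the key equations any 3DGS walkthrough covers is the EWA splatting projection of a 3D Gaussian's covariance into screen space, Σ' = J W Σ Wᵀ Jᵀ, where W is the world-to-camera rotation and J is the Jacobian of the perspective projection at the Gaussian's mean. A minimal NumPy sketch with a toy pinhole camera (not gsplat's actual API):

```python
import numpy as np

def project_covariance(cov3d, W, t, fx, fy):
    """Project a 3D Gaussian covariance to a 2D screen-space covariance.

    cov3d: (3, 3) world-space covariance; W: (3, 3) world-to-camera rotation;
    t: (3,) camera-space mean of the Gaussian; fx, fy: focal lengths.
    """
    x, y, z = t
    # Jacobian of the perspective projection, linearized at the mean
    J = np.array([[fx / z, 0.0, -fx * x / z**2],
                  [0.0, fy / z, -fy * y / z**2]])
    cov_cam = W @ cov3d @ W.T        # rotate covariance into the camera frame
    return J @ cov_cam @ J.T         # (2, 2) screen-space covariance
```

The resulting 2×2 covariance defines the elliptical footprint of the splat, which the rasterizer then alpha-blends front to back.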
-
🔍 Sungkyunkwan University and Hanwha Vision bring forth Deblurring 3D Gaussian Splatting, a novel approach addressing the challenge of blurry input in 3D scene reconstructions. This technique empowers a 3D Gaussian splatting-based method to reconstruct fine and sharp details from blurry images, a significant advancement for real-time rendering applications. A multi-layer perceptron (MLP) manipulates the covariance of each 3D Gaussian to model the blur present in the training images, so the optimized scene itself stays sharp. The result is a flexible and effective deblurring capability within the 3D-GS framework, setting new standards for visual quality in dynamic environments.

🔗 Discover their project page: https://lnkd.in/evy6Wvjs
📚 Dive into their research: https://lnkd.in/eJ3ewHzS
💻 Explore the code on GitHub: https://lnkd.in/en7cuepa (coming soon)

For more cutting-edge AI and real-time rendering insights ⤵
👉 Follow Orbis Tabula

#3DRendering #Deblurring #ComputerVision
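A toy sketch of the core idea, with an illustrative architecture (layer sizes, inputs, and the 10% adjustment cap are assumptions, not the paper's exact design): a small MLP predicts per-Gaussian multiplicative adjustments to rotation and scale that enlarge covariances to imitate the training images' blur; at test time the adjustments are dropped, leaving the sharp scene.

```python
import numpy as np

class CovarianceDeblurMLP:
    """Illustrative per-Gaussian blur model: position, rotation quaternion,
    and scale go in; small multiplicative deltas for quaternion and scale
    come out. Dropping the deltas at render time yields the sharp scene."""

    def __init__(self, hidden=64, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (10, hidden))  # in: 3 pos + 4 quat + 3 scale
        self.W2 = rng.normal(0, 0.1, (hidden, 7))   # out: 4 quat + 3 scale deltas

    def __call__(self, pos, quat, scale):
        h = np.maximum(np.concatenate([pos, quat, scale], axis=-1) @ self.W1, 0)
        delta = np.tanh(h @ self.W2)
        # keep adjustments near 1 so removing them at test time is benign
        return quat * (1 + 0.1 * delta[..., :4]), scale * (1 + 0.1 * delta[..., 4:])
```

During optimization the MLP's output would be composed with each Gaussian's covariance so the rendered image matches the blurry photo, while the base Gaussians converge to the sharp geometry.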
-
Gaussian Frosting: Editable Complex Radiance Fields with Real-Time Rendering

We propose Gaussian Frosting, a novel mesh-based representation for high-quality rendering and editing of complex 3D effects in real-time. Our approach builds on the recent 3D Gaussian Splatting framework, which optimizes a set of 3D Gaussians to approximate a radiance field from images.

We propose first extracting a base mesh from Gaussians during optimization, then building and refining an adaptive layer of Gaussians with a variable thickness around the mesh to better capture the fine details and volumetric effects near the surface, such as hair or grass. We call this layer Gaussian Frosting, as it resembles a coating of frosting on a cake. The fuzzier the material, the thicker the frosting. We also introduce a parameterization of the Gaussians to enforce them to stay inside the frosting layer and automatically adjust their parameters when deforming, rescaling, editing or animating the mesh.

Our representation allows for efficient rendering using Gaussian splatting, as well as editing and animation by modifying the base mesh. We demonstrate the effectiveness of our method on various synthetic and real scenes, and show that it outperforms existing surface-based approaches. We will release our code and a web-based viewer as additional contributions.
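The "stay inside the frosting layer" parameterization can be sketched as follows: instead of a free-floating center, each Gaussian is placed by barycentric coordinates on a mesh triangle plus a bounded normal offset. This toy version (hypothetical function name, simplified to a single triangle) shows why mesh edits propagate to the Gaussians for free.

```python
import numpy as np

def frosting_center(tri, bary, u, thickness):
    """Place a Gaussian center inside a 'frosting' layer around a base mesh.

    tri: (3, 3) triangle vertex positions; bary: (3,) barycentric coords;
    u in [-1, 1]: normalized offset along the normal; thickness: local
    layer half-thickness. Because the center is expressed relative to the
    surface, deforming the mesh moves the Gaussian with it automatically.
    """
    p = bary @ tri                                    # point on the surface
    n = np.cross(tri[1] - tri[0], tri[2] - tri[0])    # triangle normal
    n /= np.linalg.norm(n)
    return p + np.clip(u, -1.0, 1.0) * thickness * n  # clamped inside the layer
```

Editing or animating the mesh only changes `tri`; the barycentric coordinates and normalized offset are the optimized per-Gaussian parameters and stay fixed.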
-
Found an exciting new study on 3D modeling, AI and robotics. I'll explain the tech, but first... a story:

Imagine pointing a camera at your factory floor or a complex assembly line. Instantly, on your screen, you see a live, interactive 3D model of that entire space – not just the machinery, but also your workers moving within it, all updated continuously in real-time. Think of it like having a perfect, living dynamic dollhouse version of your operations that mirrors reality second-by-second. Rather than a recording of something that already happened, it's live spatial understanding. That's what this new research potentially makes possible.

It introduces a framework for simultaneously tracking camera movement, estimating human poses, and reconstructing both the human and the surrounding scene in 3D, all in real-time. Using 3D Gaussian Splatting, it efficiently models dynamic elements. This sets a precedent for creating live, detailed digital twins of humans interacting with environments, which will be crucial for advancements in robotics (real-time perception), virtual and augmented reality, and human-computer interaction.

Eventually, this means a lot of positive knock-on effects:
- Smarter robots: Robots could use this live 3D view to navigate complex, changing environments and work much more safely and effectively alongside your human workforce.
- Hyper-realistic training: You could drop trainees into virtual or AR simulations that perfectly replicate live operational conditions for unparalleled realism.
- Remote expertise: Remote experts could literally "walk through" the live digital twin to troubleshoot issues or guide on-site staff with complete, real-time context.

This bridges the gap between the physical world and digital systems instantly, enabling much smarter automation, collaboration, and analysis.

Paper: https://lnkd.in/eH6VmmCg
-
GenAI + Gaussian splats in game development. Can it become the new procedural level generation?

With Gaussian splats, a single picture can become a full explorable world. While generating such 3D scenes is not new, doing it anywhere near real-time is so far unprecedented.

#WonderWorld is an interesting project that is gradually getting closer to real-time. Based on their paper, it takes about 10 seconds to generate a complete scene on an A6000 GPU. Although this is not ultra-fast, it's a considerable step in optimization.

Based on their paper, the method is a mix of fast Gaussian splats, depth maps, and outpainting. The process involves taking an initial image and extracting a depth map from it. Then, similar to standard outpainting in ControlNet, a world is generated around the initial image. The difference is that the depth map is simplified and uses a limited number of "depths" for optimization. The models are trained on the image, and we can then explore the newly created scene.

Adding PhysDreamer (https://lnkd.in/gm5f3TGS), which allows physical interaction with splats, will make it even more impressive. How far do you all think we are from 30-60-120 FPS?

The project page (https://lnkd.in/gX7ZGAqE) even has a demo for exploration. However, the scene rendering is done directly in the browser, which might take a while to load.
Paper: https://lnkd.in/gwCggnjY

#GameDevelopment #3DScenes #GaussianSplatting #RealTimeRendering #AIGeneratedWorlds #PhysDreamer #NeuralRendering #TechInnovation #GamingFuture #AIInGames