Product Release: Depthkit Studio Setup is Now 10x Faster
A new Depthkit Studio release is available now, featuring enhancements for fast hardware configuration and multi-camera calibration.
January 3, 2023
Depthkit Studio is the world’s most widely used system for creating high-quality volumetric video captures and livestreams. Today’s updates make Depthkit the easiest professional-grade volumetric video software to set up, making capture hardware configuration and calibration ten times faster. Reduced setup time cuts production costs and raises customer ROI, freeing creators to do more on-location capture and make more content.
Depthkit Studio's new features are available now to all Depthkit Studio subscribers at no additional cost. If you’re not a subscriber and would like to see the new features in action, email firstname.lastname@example.org to schedule a live demo with our team.
James George, Co-Founder and CEO at Scatter — “Scatter has responded directly to the Depthkit user community’s feedback and sped up the first mile of Depthkit Studio by an order of magnitude for new and experienced users alike. Today’s release is a major step forward for the creative community making interactive 3D and XR projects. Now that Depthkit Studio is ten times faster to set up, it will allow even more incredible creators to make volumetric video projects.”
Throughout the setup and calibration process, Depthkit Studio now boasts easier-to-use interfaces and improved visual feedback to get you up and running faster. Features and benefits of the new Depthkit Studio release include:
- Completely new algorithm for real-time volumetric video visualization and diagnostics, ensuring quality calibration.
- Smartphone-powered remote control interface enabling solo operators to set up quickly on location.
- Hardware status display to instantly identify misconfigurations and eliminate hours of potential hardware debugging.
Read on for a deep dive into all the details of the release.
Visualizing Calibration Quality
You’ll find a new toolbar in the top-right of the viewport which unlocks a host of new visualization features aimed at ensuring your setup is solid during multi-camera calibration and capture.
New Preview Rendering: Fusion Lite
The most significant feature in this release is a totally new depth data visualization algorithm, perfect for visualizing the data streams during the setup and calibration process. We call it Fusion Lite. It is now the default visualization option during Depthkit Studio calibration and capture, giving you a solid alternative to the raw point cloud view. The elegance of the Fusion Lite rendering system is that it creates a unified 3D textured mesh from all camera depth perspectives without requiring a perfect calibration to already be in place. This lets you visualize how well the data from the various perspectives is aligned while the calibration is being created.
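Conceptually, combining multiple depth perspectives starts with transforming each camera’s points into a shared world frame using its estimated pose. The sketch below illustrates only that first step with numpy; it is not Depthkit’s Fusion Lite algorithm, and the point and pose values are made up for illustration.

```python
import numpy as np

def transform_points(points, extrinsic):
    # Lift (N, 3) points to homogeneous coordinates and apply a 4x4
    # camera-to-world extrinsic matrix.
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (extrinsic @ homogeneous.T).T[:, :3]

def fuse_perspectives(clouds, extrinsics):
    # Merge per-camera point clouds into one cloud in a shared world frame.
    return np.vstack([transform_points(p, e)
                      for p, e in zip(clouds, extrinsics)])

# Two single-point clouds: camera 0 at the origin, camera 1 shifted 1 m on x.
cam0_points = np.array([[0.0, 0.0, 2.0]])
cam1_points = np.array([[0.0, 0.0, 2.0]])
cam0_to_world = np.eye(4)
cam1_to_world = np.eye(4); cam1_to_world[0, 3] = 1.0

fused = fuse_perspectives([cam0_points, cam1_points],
                          [cam0_to_world, cam1_to_world])
```

Because the transform only needs each camera’s current pose estimate, a unified preview like this can be rendered even while the calibration is still being refined.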
Blended Edge Compare with both IR and Color Textures
Building on the availability of Fusion Lite, we’ve also introduced a way to analyze calibration quality visually. A new edge compare mode is available during calibration that exaggerates any disparities in the textures between two cameras. Using an edge detection filter on each perspective in combination with an additive blend, misalignments stand out as ghosted edges, while well-aligned features overlap and pop as solid lines.
Furthermore, you can now switch between visualizing the IR or the Color streams during calibration. By switching to IR view, you can isolate issues of internal color misalignment in the Azure Kinect (read more about this here) from multi-camera calibration divergence.
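The principle behind edge compare can be demonstrated in a few lines: detect edges in each view, then add the edge maps together so that aligned edges reinforce and misaligned ones stay faint. This numpy sketch uses a basic gradient-magnitude filter as a stand-in; Depthkit’s actual edge filter and blend are not published here.

```python
import numpy as np

def edge_map(gray):
    # Gradient-magnitude edges -- a simple stand-in for any edge filter.
    g = gray.astype(np.float32)
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))
    return gx + gy

def edge_compare(view_a, view_b):
    # Additive blend: overlapping edges reinforce into bright solid lines,
    # while misaligned edges stay at single-view intensity and look ghosted.
    return edge_map(view_a) + edge_map(view_b)

# Synthetic views: a vertical line, once aligned and once shifted by 2 px.
a = np.zeros((8, 8), dtype=np.uint8); a[:, 4] = 255
b_shifted = np.zeros_like(a); b_shifted[:, 6] = 255

aligned = edge_compare(a, a.copy())
misaligned = edge_compare(a, b_shifted)
```

The aligned pair produces a peak twice as bright as the misaligned pair, which is exactly what makes calibration drift easy to spot at a glance.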
Visualizing Camera Placement and Link Chain During Calibration
The goal of the calibration process is to determine the placement of each camera in 3D space relative to one another and to the ground. Depthkit Studio solves this by creating a chain of links between the cameras based on shared visibility of the sampled calibration markers. Conducting precise and efficient calibrations requires an intuitive understanding of the calibration process, which guides the marker sampling strategy. That intuition can be hard to form because the process has been invisible to the user. Until now.
Depthkit 0.6.1 introduces both camera placement and link chain visualization during calibration. This lets users see where cameras are placed and how the links have been built.
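The link chain described above can be thought of as a graph: cameras are nodes, and any two cameras that observed the same marker sample can be linked, since their relative pose becomes solvable. This sketch is a conceptual illustration of that idea, not Depthkit’s calibration solver, and the camera and marker identifiers are invented for the example.

```python
from itertools import combinations

def build_link_chain(marker_views):
    # marker_views: {camera_id: set of marker samples seen by that camera}.
    # Two cameras that share at least one sample can be linked.
    return [
        (a, b) for a, b in combinations(marker_views, 2)
        if marker_views[a] & marker_views[b]
    ]

def fully_linked(marker_views, links):
    # Every camera must be reachable through the link chain to solve the
    # full calibration; otherwise some camera poses stay unknown.
    cams = list(marker_views)
    if not cams:
        return True
    seen, frontier = {cams[0]}, [cams[0]]
    while frontier:
        cam = frontier.pop()
        for a, b in links:
            if cam in (a, b):
                other = b if cam == a else a
                if other not in seen:
                    seen.add(other)
                    frontier.append(other)
    return seen == set(cams)
```

Visualizing this graph as you sample markers tells you immediately which camera pairs still need a shared view before the chain closes.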
Remote Control for Calibration & Recording
We are introducing a network-based remote control interface that uses OSC to trigger calibration and recording in Depthkit Studio.
Depthkit is portable, which means our users often find themselves setting up their capture stage in chaotic physical environments, with limited time and without a lot of helping hands. Empowering our operators to set up quickly and autonomously is a big priority. The new remote control interface enables a single operator to calibrate from within the volume using their phone or tablet over the network. No more running back and forth to a workstation keyboard just to hit a button!
The network API interface also allows your own custom apps to trigger recording automatically. This is especially useful for creating automatic workflows in experiential installations and trade show booths. Read on to learn more about the remote control commands, and download our starter templates for iOS.
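Since the remote control interface is OSC-based, a custom trigger can be as small as a UDP packet carrying an OSC message. The sketch below encodes a minimal argument-less OSC 1.0 message with only the standard library; the command addresses, host, and port shown are hypothetical placeholders, so consult the Depthkit documentation for the actual command list.

```python
import socket

def osc_message(address: str) -> bytes:
    # Encode a minimal OSC 1.0 message: address pattern + empty type tag
    # string (","), each null-terminated and padded to a 4-byte boundary.
    def pad(s: bytes) -> bytes:
        return s + b"\x00" * (4 - len(s) % 4)
    return pad(address.encode()) + pad(b",")

def send_trigger(address: str, host: str = "192.168.1.10", port: int = 8000):
    # Fire an OSC trigger over UDP at the workstation running Depthkit.
    # Host and port here are example values, not Depthkit defaults.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(address), (host, port))
    sock.close()

# Hypothetical command addresses -- check the Depthkit docs for real ones:
# send_trigger("/depthkit/record/start")
# send_trigger("/depthkit/record/stop")
```

Wrapping `send_trigger` behind a button in an installation’s show-control app is all it takes to start and stop recordings automatically.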
Validating Hardware with Sensor Status Indicators
The Azure Kinect sensors used in Depthkit Studio have a lot of moving parts: synchronization cables, USB extension chipset compatibility, and differing firmware versions, to name a few. Each variable introduces a potential gotcha when setting up a capture array, and an error or mismatch in just one of them can stop the whole system from working smoothly.
To aid in fast and effective hardware debugging, we’ve added a host of new feedback systems to give you confidence that your hardware is set up correctly and to help diagnose problems when they do come up.
The new version of Depthkit Studio introduces a feedback system for sensor and synchronization status at a global and sensor level.