class ARDepthManager (Niantic.ARDK.Extensions.ARDepthManager)
Overview
class ARDepthManager: Niantic.ARDK.Rendering.ARRenderFeatureProvider
{
public:
    // enums
    enum OcclusionMode;

    // properties
    Camera Camera;
    IDepthBuffer CPUDepth;
    IDepthBufferProcessor DepthBufferProcessor;
    Matrix4x4 DepthTransform;
    Texture GPUDepth;
    InterpolationMode Interpolation;
    float InterpolationPreference;
    bool IsDepthNormalized;
    uint KeyFrameFrequency;
    DepthMeshOccluder MeshOccluder;
    OcclusionMode OcclusionTechnique;
    bool PreferSmoothEdges;
    bool StabilizeOcclusions;

    // events
    event DepthBufferInitialized();
    event DepthBufferUpdated();

    // methods
    virtual override void ApplyARConfigurationChange(ARSessionChangesCollector.ARSessionRunProperties properties);
    void ToggleDebugVisualization(bool isEnabled);
    virtual override void UpdateRenderState(Material material);
};
Inherited Members
public:
    // properties
    bool AreFeaturesEnabled;
    bool CanInitialize;
    bool Initialized;
    ISet<string> Features;
    RenderTarget? Target;
    ArdkEventHandler<RenderFeaturesChangedArgs> ActiveFeaturesChanged;

    // events
    event ActiveFeaturesChanged();

    // methods
    void Deinitialize();
    void DisableFeatures();
    void EnableFeatures();
    void Initialize();
    virtual abstract void ApplyARConfigurationChange(ARSessionChangesCollector.ARSessionRunProperties properties) = 0;
    void UpdateRenderState(Material material);
    virtual abstract void UpdateRenderState(Material material) = 0;
Detailed Documentation
Properties
Camera Camera
Returns a reference to the scene camera used to render AR content, if present.
IDepthBuffer CPUDepth
Returns the latest depth buffer in CPU memory. This buffer is not display aligned and needs to be sampled using the DepthTransform property.
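A minimal sketch of CPU-side sampling follows. It assumes the buffer exposes Width, Height and a flat, row-major Data array of metric depth values, and that DepthTransform maps a normalized viewport coordinate into buffer coordinates via a homogeneous multiply of (x, y, 1, 1) followed by a divide by z; verify these conventions against your ARDK version.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class DepthSampleExample : MonoBehaviour
{
    [SerializeField]
    private ARDepthManager _depthManager;

    // Hypothetical helper: returns the metric depth under a normalized viewport
    // point. The warp convention and the Width/Height/Data member names are the
    // assumptions noted in the text above.
    private float SampleDepthAtViewportPoint(Vector2 viewportUV)
    {
        var depth = _depthManager.CPUDepth;
        if (depth == null)
            return float.PositiveInfinity;

        // Warp the viewport UV into the (non display aligned) buffer's space.
        Vector4 uv = _depthManager.DepthTransform * new Vector4(viewportUV.x, viewportUV.y, 1f, 1f);
        float u = uv.x / uv.z;
        float v = uv.y / uv.z;

        // Read the nearest texel from the CPU buffer (row-major layout assumed).
        int width = (int)depth.Width;
        int height = (int)depth.Height;
        int x = Mathf.Clamp(Mathf.RoundToInt(u * width), 0, width - 1);
        int y = Mathf.Clamp(Mathf.RoundToInt(v * height), 0, height - 1);
        return depth.Data[x + y * width];
    }
}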
IDepthBufferProcessor DepthBufferProcessor
Returns the underlying context awareness processor.
Matrix4x4 DepthTransform
Returns a transformation that fits the depth buffer to the target viewport.
Texture GPUDepth
Returns the latest depth buffer in GPU memory. The resulting texture is not display aligned and needs to be used together with the DepthTransform property.
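As a hedged sketch of GPU-side use, the texture and transform can be pushed to a custom material each frame. The shader property names _Depth and _DepthTransform are placeholders that must match whatever your own shader declares, and the shader is expected to warp its UVs with the transform before sampling.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class DepthTextureBinder : MonoBehaviour
{
    [SerializeField]
    private ARDepthManager _depthManager;

    [SerializeField]
    private Material _customMaterial; // material whose shader samples depth

    private void Update()
    {
        Texture gpuDepth = _depthManager.GPUDepth;
        if (gpuDepth == null)
            return;

        // Placeholder property names; the transform is required because the
        // texture is not display aligned.
        _customMaterial.SetTexture("_Depth", gpuDepth);
        _customMaterial.SetMatrix("_DepthTransform", _depthManager.DepthTransform);
    }
}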
InterpolationMode Interpolation
The value specifying whether the depth buffer should synchronize with the camera pose.
float InterpolationPreference
The value specifying whether to align depth pixels with closer (0.1) or more distant (1.0) pixels in the color image (i.e. the back-projection distance).
uint KeyFrameFrequency
The target number of times per second the depth generation routine should run.
OcclusionMode OcclusionTechnique
The value specifying how to render occlusions.
bool PreferSmoothEdges
If true, bilinear filtering is used instead of point filtering on the depth texture.
bool StabilizeOcclusions
When enabled, depth values sampled from the fused mesh are blended into the depth texture to fill in regions that would otherwise contain flickering data during occlusion.
Note
This is an experimental feature. Experimental features should not be used in production projects, as they are subject to breaking changes, are not officially supported, and may be deprecated without notice.
Note
To use this feature, the scene needs to include an ARMeshManager component configured with a mesh chunk that is set to the ‘ARDK_FusedMesh’ layer (see the configuration sketch below).
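A minimal configuration sketch, assuming these properties expose writable setters at runtime; the values shown are illustrative only.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class DepthManagerSetup : MonoBehaviour
{
    [SerializeField]
    private ARDepthManager _depthManager;

    private void Start()
    {
        // Target roughly 20 depth key frames per second.
        _depthManager.KeyFrameFrequency = 20;

        // Favor aligning depth pixels with distant pixels in the color image.
        _depthManager.InterpolationPreference = 1.0f;

        // Smooth occluder edges with bilinear filtering on the depth texture.
        _depthManager.PreferSmoothEdges = true;

        // Experimental: blend in depth from the fused mesh to reduce flicker.
        // Requires an ARMeshManager in the scene with its mesh chunk on the
        // ARDK_FusedMesh layer.
        _depthManager.StabilizeOcclusions = true;
    }
}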
Events
event DepthBufferInitialized()
Event for when the first depth buffer is received.
event DepthBufferUpdated()
Event for when the contents of the depth buffer or its affine transform have been updated.
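A small sketch of subscribing to these events. The handlers ignore the event argument (the assumption here is that the events follow ARDK's one-argument event handler pattern, whose exact argument type varies by version) and instead read the latest state off the manager.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class DepthEventListener : MonoBehaviour
{
    [SerializeField]
    private ARDepthManager _depthManager;

    private void Start()
    {
        // Lambdas ignore the argument, so its exact type does not matter here.
        _depthManager.DepthBufferInitialized += args => Debug.Log("First depth buffer received.");
        _depthManager.DepthBufferUpdated += args => OnDepthUpdated();
    }

    private void OnDepthUpdated()
    {
        // Read the buffer and its transform together after an update.
        var depth = _depthManager.CPUDepth;
        var depthToViewport = _depthManager.DepthTransform;
        Debug.LogFormat("Depth updated: {0}x{1}, transform: {2}", depth.Width, depth.Height, depthToViewport);
    }
}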
Methods
virtual override void ApplyARConfigurationChange(ARSessionChangesCollector.ARSessionRunProperties properties)
Inheritors should override this to modify session configuration settings based on their script’s needs.
Note
This is executed as a result of the ARSession being run, which may or may not be triggered by a call to RaiseConfigurationChanged().
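A sketch of hooking into this configuration pass from a subclass, assuming ARDepthManager is not sealed; the using directive required for ARSessionChangesCollector depends on your ARDK version.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class VerboseDepthManager : ARDepthManager
{
    public override void ApplyARConfigurationChange(
        ARSessionChangesCollector.ARSessionRunProperties properties)
    {
        // Let ARDepthManager apply its own depth-related settings first.
        base.ApplyARConfigurationChange(properties);

        // Then inspect or layer on additional settings when the session is run.
        Debug.Log("AR configuration changed; depth key frame target: " + KeyFrameFrequency);
    }
}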
virtual override void UpdateRenderState(Material material)
Called when it is time to copy the current render state to the main rendering material.
Parameters:
material - Material used to render the frame.
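A sketch of extending the render state hand-off from a subclass, again assuming ARDepthManager is not sealed; the _OcclusionTint shader property is a placeholder that the frame shader would have to declare.

using UnityEngine;
using Niantic.ARDK.Extensions;

public class TintedDepthManager : ARDepthManager
{
    public override void UpdateRenderState(Material material)
    {
        // Copy the depth texture, transform and related state to the frame material.
        base.UpdateRenderState(material);

        // Push any extra per-frame values this feature owns.
        material.SetColor("_OcclusionTint", Color.white);
    }
}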