FIcontent.Gaming.Enabler.RealityMixer.CameraArtifactRendering  1.0.0
In see-through augmented reality applications, a camera, typically on the back of a mobile device, captures video. The video is played live on the screen while virtual objects are rendered and overlaid in real time. In general, the captured images and the rendered objects do not blend together well visually. For example, the camera sensor adds noise and the lens adds optical distortion to the captured images, but the rendered virtual objects exhibit neither noise nor distortion. However, these image effects, or artifacts, can be created artificially and applied. This project demonstrates how motion blur can be generated from the camera and applied to the virtual content.
TargetTextureDelayBehaviour Class Reference

Maintains an array of camera image texture frames. An older frame than the current one can be accessed. This component can be used to simulate camera image delay effects.

Inherits MonoBehaviour.

Public Member Functions

RenderTexture OutTexture ()
 Provides the texture given the current delay.
 

Properties

int Delay [get, set]
 Used to access the delay publicly.
 

Detailed Description

Maintains an array of camera image texture frames. An older frame than the current one can be accessed. This component can be used to simulate camera image delay effects.
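
The class implementation is not shown on this page, but the idea can be sketched as a ring buffer of RenderTextures: the current camera image is copied into the buffer every frame, and a request for the output steps back a number of slots given by the delay. The following C# sketch is illustrative only; the class name SimpleTextureDelayBuffer, the sourceTexture field and the maxDelay size are assumptions, not the actual FIcontent implementation.

    using UnityEngine;

    // Illustrative sketch: buffers recent camera frames so an older one can be read back.
    public class SimpleTextureDelayBuffer : MonoBehaviour
    {
        public Texture sourceTexture;   // live camera texture (assumption: assigned in the inspector)
        public int maxDelay = 8;        // how many past frames to keep (assumed buffer size)

        private RenderTexture[] frames;
        private int writeIndex;

        // How many frames back OutTexture() should look.
        public int Delay { get; set; }

        void Start()
        {
            frames = new RenderTexture[maxDelay];
            for (int i = 0; i < maxDelay; i++)
                frames[i] = new RenderTexture(Screen.width, Screen.height, 0);
        }

        void Update()
        {
            // Copy the current camera image into the next ring buffer slot.
            Graphics.Blit(sourceTexture, frames[writeIndex]);
            writeIndex = (writeIndex + 1) % maxDelay;
        }

        public RenderTexture OutTexture()
        {
            // Step back Delay frames from the most recently written slot.
            int clamped = Mathf.Clamp(Delay, 0, maxDelay - 1);
            int index = (writeIndex - 1 - clamped + maxDelay * 2) % maxDelay;
            return frames[index];
        }
    }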

Member Function Documentation

RenderTexture TargetTextureDelayBehaviour.OutTexture ( ) [inline]

Provides the texture given the current delay.

Returns

The camera image texture from Delay frames in the past.
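
As a usage sketch (not part of the original documentation), the delayed texture returned by OutTexture() could be fetched each frame and assigned to a material; the consumer class and the inspector-assigned reference below are assumptions.

    using UnityEngine;

    // Illustrative consumer: feeds the delayed camera frame to a renderer's material every frame.
    public class DelayedTextureConsumer : MonoBehaviour
    {
        public TargetTextureDelayBehaviour delayBehaviour;   // assumed to be assigned in the inspector

        void Update()
        {
            // OutTexture() returns the frame recorded Delay frames ago.
            RenderTexture delayed = delayBehaviour.OutTexture();
            GetComponent<Renderer>().material.mainTexture = delayed;
        }
    }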
Property Documentation

int TargetTextureDelayBehaviour.Delay [get, set]

Used to access the delay publicly.

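As a usage sketch (assuming the delay is counted in frames, which this page does not state explicitly), the property can be set from another script to control how far back OutTexture() looks; the DelayTuner class below is hypothetical.

    using UnityEngine;

    // Illustrative example: increases the simulated camera latency at runtime.
    public class DelayTuner : MonoBehaviour
    {
        public TargetTextureDelayBehaviour delayBehaviour;   // assumed to be assigned in the inspector

        void Start()
        {
            // Ask OutTexture() for the frame captured three updates earlier.
            delayBehaviour.Delay = 3;
        }
    }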
