FIcontent.Gaming.Enabler.RealityMixer.CameraArtifactRendering  1.0.0
In see-through augmented reality applications, a camera, typically on the back of a mobile device, captures video. The video is played live on the screen while virtual objects are rendered and overlaid in real time. In general, the captured images and the rendered objects do not blend together well visually: the camera sensor adds noise and the lens adds optical distortion to the captured images, but the rendered virtual objects exhibit neither noise nor distortion. These image effects, or artifacts, can however be created artificially and applied to the rendering. This project demonstrates how motion blur can be derived from the camera's movement and applied to the virtual content.
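The core idea above can be sketched independently of Unity: project a point with the previous and the current camera pose, and use the screen-space displacement as the direction and length of the linear blur. This is a minimal illustrative sketch in Python with a simplified pinhole camera (yaw-only rotation); the function names and pose representation are assumptions for illustration, not the project's actual API.

```python
import math

def project(point, cam_pos, cam_yaw, focal=1.0):
    """Project a 3D point into normalized image coordinates for a
    pinhole camera at cam_pos, rotated by cam_yaw (radians) about Y."""
    # Move the point into camera space (inverse translation, inverse rotation).
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x = c * dx + s * dz
    z = -s * dx + c * dz
    return (focal * x / z, focal * dy / z)

def blur_vector(point, prev_pose, curr_pose):
    """Screen-space displacement of `point` between two camera poses.
    Its direction and length would drive a linear motion blur shader."""
    u0, v0 = project(point, *prev_pose)
    u1, v1 = project(point, *curr_pose)
    return (u1 - u0, v1 - v0)

# A point 5 m in front of a camera that translated 0.1 m to the right:
# the point appears to move left on screen, so the blur is horizontal.
v = blur_vector((0.0, 0.0, 5.0),
                ((0.0, 0.0, 0.0), 0.0),
                ((0.1, 0.0, 0.0), 0.0))
```

In the actual component, the pose change comes from the tracked `Camera` between consecutive frames, and the resulting vector is passed to the blur shader as a uniform.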
Class List
Here are the classes, structs, unions and interfaces with brief descriptions:
 CARLinearBlur: Applies the linear motion blur shader to the camera it is attached to. The amount and direction of blur are calculated from the movement of the public `Camera cam` parameter.
 MergeCameraDelayTextures: Calls the merge shader to overlay a foreground and a background camera image (with alpha).
 TargetTextureDelayBehaviour: Maintains an array of past camera image frames. An older frame than the current one can be accessed. This component can be used to simulate camera image delay effects.
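The merge step described for MergeCameraDelayTextures is standard "over" compositing: the foreground is weighted by its alpha and the background by the remainder. A minimal per-pixel sketch in Python (the shader itself does this on the GPU; the function name here is hypothetical):

```python
def alpha_merge(fg, bg, alpha):
    """Blend a foreground RGB value over a background RGB value using
    the foreground alpha. All components are floats in [0, 1]."""
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

# Half-transparent red composited over blue:
px = alpha_merge((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5)
```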
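The frame-history mechanism behind TargetTextureDelayBehaviour can be sketched as a fixed-size ring buffer: each new frame is pushed in, and a reader can look a configurable number of steps into the past. This is an assumed simplification in Python; the real component stores Unity render textures rather than arbitrary objects, and the class and method names here are illustrative only.

```python
from collections import deque

class FrameDelayBuffer:
    """Keeps the last `capacity` frames so that an older frame than
    the current one can be read back, simulating camera image delay."""

    def __init__(self, capacity):
        # deque with maxlen discards the oldest frame automatically.
        self.frames = deque(maxlen=capacity)

    def push(self, frame):
        self.frames.append(frame)

    def get_delayed(self, delay):
        """Return the frame `delay` steps in the past (0 = newest).
        Falls back to the oldest stored frame if history is short."""
        idx = max(0, len(self.frames) - 1 - delay)
        return self.frames[idx]

buf = FrameDelayBuffer(capacity=4)
for f in ["f0", "f1", "f2", "f3", "f4"]:
    buf.push(f)
latest = buf.get_delayed(0)  # newest frame
older = buf.get_delayed(2)   # two frames in the past
```

Overlaying such a delayed background frame behind an undelayed foreground (via the merge shader above it in the class list) produces the camera-delay artifact on the virtual content.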