AR Unity Editor Previz

Tools to get an idea of how Effects would look in the app

Related to the Neural Segmentation apps, we also felt the need to develop an Execute in Edit Mode sandbox to improve the Effect creation pipeline for current and future AR apps. The idea was to simulate various pseudo-random yet realistic capture conditions in an Editor Scene (different lighting, skin tones, camera angles, clothing, occlusion by obstacles, etc.), so that the segmentation and everything else would behave as closely as possible to the final result on the device. This improved on the otherwise clumsy build-to-test pipeline that AR apps (and mobile apps in general) impose on the dev and design teams.
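To give a rough idea of the mechanism (not the actual tool), here is a minimal sketch of such a component: an [ExecuteInEditMode] MonoBehaviour that re-rolls pseudo-random capture conditions directly in the Editor. All class, field, and menu names below are hypothetical illustrations.

```csharp
// Minimal sketch, assuming a scene with one key light and a camera rig.
// Randomizes "capture conditions" in Edit Mode so an Effect can be
// previewed without a device build. Names are illustrative, not the
// real tool's API.
using UnityEngine;

[ExecuteInEditMode]
public class PrevizConditionRandomizer : MonoBehaviour
{
    public Light sceneLight;      // key light standing in for real-world lighting
    public Transform cameraRig;   // camera pivot standing in for device pose

    [Range(0f, 1f)] public float intensityJitter = 0.5f;

    // Re-roll a pseudo-random condition from the component's context menu.
    [ContextMenu("Randomize Conditions")]
    public void RandomizeConditions()
    {
        if (sceneLight != null)
        {
            // Vary brightness, warmth, and light direction.
            sceneLight.intensity = Random.Range(1f - intensityJitter, 1f + intensityJitter);
            sceneLight.color = Color.Lerp(Color.white, new Color(1f, 0.85f, 0.7f), Random.value);
            sceneLight.transform.rotation =
                Quaternion.Euler(Random.Range(20f, 70f), Random.Range(0f, 360f), 0f);
        }

        if (cameraRig != null)
        {
            // Simulate different handheld camera angles toward the subject.
            cameraRig.rotation =
                Quaternion.Euler(Random.Range(-15f, 15f), Random.Range(-30f, 30f), 0f);
        }
    }
}
```

A setup along these lines lets the dev and design teams iterate on an Effect in seconds instead of waiting on a device build for every tweak; the real sandbox would also feed these conditions into the segmentation preview itself.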

At the time, Unity was pursuing something along similar lines (Project MARS), so some work was done for them in the run-up to the Unite Berlin 2018 conference.

See how happy Juan was about one of the early tests:


Some early examples of using Head Tracking for AR:
