Patchy Track is a utility app for iPhones and iPads that records camera motion in augmented reality and exports this data together with video and reference geometry.

This data can then be imported into 3D applications (such as Maya or Blender) to add CGI elements. The app was created to assist in the offline development of augmented reality content.


Features:

  1. Capture augmented reality camera motion.
  2. Capture video in sync with the camera motion.
  3. Capture scene geometry on devices with LiDAR (devices without LiDAR detect reference plane geometry instead).
  4. Native iOS file management and sharing (use the Files app to transfer data).


Requirements:

  1. A modern iOS device capable of augmented reality.



Contents:

  1. Quick start guide.
  2. Maya and Blender import scripts.
  3. Example files.


Quick start guide:

Move the device around to initialize the augmented reality session; a notification will appear once tracking is ready.

Once the augmented reality session is initialized, the device will try to find reference planes or geometry. This geometry comes in handy in post-production: it can be used to place CGI content correctly relative to the camera.
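
As a rough illustration of that placement step, the sketch below expresses a world-space point (for example, the centre of an exported reference plane) in a camera's local frame. All pose values here are made-up placeholders, and the rotation is simplified to yaw only; a real importer would use the full rotation from the tracking data.

```python
import math

def world_to_camera(point, cam_pos, cam_yaw_deg):
    """Express a world-space point in a camera's local frame.

    Simplified to a yaw-only camera rotation (rotation about the
    vertical axis); real tracking data carries a full 3D rotation.
    """
    # Translate so the camera sits at the origin.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    # Rotate by the inverse of the camera's yaw.
    a = math.radians(-cam_yaw_deg)
    x = dx * math.cos(a) - dz * math.sin(a)
    z = dx * math.sin(a) + dz * math.cos(a)
    return (x, dy, z)

# With the camera at the origin turned 90 degrees, a point at world +X
# lands on the camera's local -Z axis (directly in front of it, in a
# -Z-forward convention):
world_to_camera((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 90.0)
# → approximately (0.0, 0.0, -1.0)
```

The same idea, extended to a full rotation matrix, is what the import scripts use to line up CGI content with the recorded camera.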

Now the 'record button' can be pressed to capture the camera's motion together with video. Augmented reality tracking is not perfectly accurate, so there will be some jitter and glitches in the data. This, however, feels authentic when simulating augmented reality content offline.
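
If the jitter is ever unwanted, one common post-production fix is a small moving-average smooth over the recorded positions. A minimal sketch, assuming the motion is available as a list of (x, y, z) samples (the window size and data layout here are illustrative, not the app's actual format):

```python
def smooth(positions, window=5):
    """Moving-average smooth over a list of (x, y, z) samples.

    window is the number of neighbouring samples averaged per axis;
    near the start and end, whatever samples exist are used.
    """
    half = window // 2
    out = []
    for i in range(len(positions)):
        lo = max(0, i - half)
        hi = min(len(positions), i + half + 1)
        chunk = positions[lo:hi]
        out.append(tuple(sum(p[axis] for p in chunk) / len(chunk)
                         for axis in range(3)))
    return out

# A single spike gets averaged down:
smooth([(0, 0, 0), (3, 0, 0), (0, 0, 0)], window=3)[1]
# → (1.0, 0.0, 0.0)
```

Larger windows give smoother but less responsive motion; keeping the raw data preserves the authentic hand-held feel described above.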

With a recording made, the 'preview button' can be pressed to preview the camera's motion through the world. Take a step to the side and point the device in the direction of filming to see the camera object move through the world.

The 'export button' will export all the available data to the device. The camera motion is saved in a .txt format, and a script is required to import it into Maya or Blender. Links to these scripts are provided below.
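
As a rough illustration of what such an import script does, the sketch below parses a hypothetical per-frame text layout: a frame number followed by position and rotation values, whitespace-separated, one sample per line. The actual column order and units in the exported .txt file may differ, so treat the field names here as assumptions.

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    frame: int
    position: tuple   # (x, y, z) — assumed order
    rotation: tuple   # (rx, ry, rz) Euler angles — assumed order

def parse_camera_txt(text):
    """Parse whitespace-separated camera samples, one per line.

    Assumed line layout: frame x y z rx ry rz.
    Blank lines and '#' comment lines are skipped.
    A real importer would then key-frame a camera object per sample.
    """
    samples = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split()
        frame = int(parts[0])
        x, y, z, rx, ry, rz = map(float, parts[1:7])
        samples.append(CameraSample(frame, (x, y, z), (rx, ry, rz)))
    return samples
```

In Maya or Blender, each parsed sample would become one key frame on a camera object, so the virtual camera retraces the recorded motion.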


Maya and Blender import scripts:

Under construction.


Example files:

Under construction.