Designing the SmartVCS

  • Intelligent
  • Mobile
  • Virtual Camera System

Premise

In the current production environment, previsualizing shots with a virtual camera system requires expensive hardware and large motion capture spaces available only to major studios. By leveraging consumer-level technologies such as tablets and motion gaming controllers, and by merging the cinematic techniques of film with the real-time benefits of game engines, it is possible to develop a hybrid interface that lowers the barrier of entry for virtual production. Utilizing affordable hardware, an intuitive user interface, and an intelligent camera system, the SmartVCS is a new virtual cinematography platform that gives professional directors, as well as a new market of amateur filmmakers, the ability to previsualize their films or game cinematics with familiar and accessible technology. This system has potential applications in other areas, including game level design, real-time compositing & post-production, and architectural visualization. In addition, it has the ability to expand as a human-computer interface for video games, robotics, and medicine as a functional hybrid freespace input device.

System Architecture

The SmartVCS is a mixed media device, combining several pieces of consumer-accessible hardware for the purpose of virtual cinematography. Each component serves a unique and specific purpose. At the time of development, the hardware chosen for the system proved to be affordable, easy to develop for, and capable of supporting a low-latency feedback loop.
The following expands upon the specific function of each component:

  • PlayStation Move Controller records the orientation of the virtual camera system
  • PlayStation Eye Camera captures the system’s spatial position via the glowing controller
  • PlayStation Navigation Controller enables custom commands to be scripted
  • PlayStation 3 System interprets the positional & rotational input into a data string
  • Data is streamed wirelessly from the Move.Me server application to the Unity3D Engine
  • Native software developed for the touchscreen controls the virtual cameras and scene
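The pose data arriving from the PlayStation 3 can be thought of as a stream of per-sample records. As a minimal sketch, the snippet below parses one comma-separated sample into a position and an orientation; the real Move.Me wire format is not documented here, so the field layout (x, y, z, qx, qy, qz, qw) is purely an illustrative assumption.

```python
# Hypothetical sketch: parse one pose sample streamed from the
# Move.Me server into position and orientation tuples.
# The assumed field order is x, y, z, qx, qy, qz, qw.

def parse_pose_sample(line):
    """Split a comma-separated sample into (position, quaternion)."""
    values = [float(v) for v in line.split(",")]
    if len(values) != 7:
        raise ValueError("expected 7 fields: x, y, z, qx, qy, qz, qw")
    position = tuple(values[:3])    # location in the capture volume
    quaternion = tuple(values[3:])  # orientation from the Move controller
    return position, quaternion
```

Inside Unity, each parsed sample would then drive the virtual camera's transform for that frame.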

Constructing the System

Input Layout

Features

  • Native iPad app running the Unity game engine, connecting directly to the PlayStation Move.Me server.
  • Freespace, real-time control of the virtual camera position and orientation via the PlayStation 3, Move controller, and Eye camera.
  • Camera controls for tracking, dollying, panning, and booming via the PlayStation Navigation controller’s analog sticks.
  • Custom Shotcode setup for saving recorded camera takes.
  • Cinema Prime Lens adjustments through quick-toggle face buttons on the Navigation controller. Zoom is accomplished using the controller’s soft-touch triggers.
  • On-Screen menu for setting the Virtual Camera’s recording aspect ratio.
  • Bi-Planar Tracking Sensitivity Adjustment through the Navigation controller’s face buttons. This enables a user to translate a greater or lesser distance in the virtual world relative to their physical motion in the capture volume.
  • Scene Modification through touch-dragging of on-screen objects in camera space.
  • Record & Playback a Virtual Camera shot using the aforementioned freespace and camera controls. Replay controls include standard playback functions as well as the ability to scrub across a time slider to pin-point a specific replay position. Capture playback includes camera and animation data.
  • Bake & Save camera takes directly on the iPad as a text file containing a comma-separated series of position and orientation values. Data can then be retrieved by connecting the iPad to iTunes and visiting the app’s File Sharing menu, where the user can download the text files. From there, a Python parser developed for Maya converts the data into a baked camera within the animation package. The resulting camera contains all the information that was captured using the SmartVCS.
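The take-file workflow above can be sketched in a few lines of Python. This is not the actual parser; it assumes each line of a saved take holds one frame as six comma-separated values ("tx,ty,tz,rx,ry,rz"), which is an illustrative guess at the field order, and it only hints at the Maya side in comments.

```python
# Illustrative sketch of the take-file parser, assuming each line of
# a saved take holds one frame as "tx,ty,tz,rx,ry,rz" (the exact
# field order of the real format is an assumption).

def load_take(text):
    """Return a list of per-frame camera transforms from a saved take."""
    frames = []
    for line in text.strip().splitlines():
        tx, ty, tz, rx, ry, rz = (float(v) for v in line.split(","))
        frames.append({"translate": (tx, ty, tz), "rotate": (rx, ry, rz)})
    return frames

# Inside Maya, each frame could then be keyed onto a camera, e.g.:
#   for i, f in enumerate(load_take(data)):
#       cmds.setKeyframe("smartVCS_cam", at="translateX", t=i,
#                        v=f["translate"][0])
#       # ...and likewise for the remaining translate/rotate channels,
#       # producing the baked camera described above.
```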
Additional tests included:

  • Collaboration using multiple SmartVCS systems and an actor captured live with the Microsoft Kinect. This “tele-capture” demonstration allowed several camera operators to interact in a live virtual set with a remote actor performing in real-time.
  • Match moving & 3D integration of virtual objects over a live video backplate using the iPad’s rear-facing camera. This test showcased the ability to accurately track, with low latency, 3D objects in a real-world environment without the need for Augmented Reality markers.
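The match-moving test rests on a simple idea: once the Move-tracked pose drives the virtual camera, any world-space object can be projected into the live video frame. The sketch below shows that projection with a basic pinhole model; the function name, the camera-looking-down-negative-Z convention, and the focal/image values are all illustrative assumptions, not the system's actual math.

```python
# Sketch of the match-moving projection: a world-space point is
# projected through a pinhole camera (looking down -Z) into pixel
# coordinates, so a rendered object can be overlaid on the iPad's
# live video backplate. All parameter values here are illustrative.

def project_point(point, cam_pos, focal, width, height):
    """Project a 3D point to (u, v) pixels, or None if behind camera."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        return None  # point is behind the camera plane
    u = width / 2 + focal * x / -z
    v = height / 2 - focal * y / -z
    return (u, v)
```

A point two units straight ahead of the camera lands at the image center, e.g. `project_point((0, 0, -2), (0, 0, 0), 100, 640, 480)` gives `(320.0, 240.0)`.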

Recognition

Since its inception in 2012, the SmartVCS, through its various iterations, has received recognition from key personnel in both film & game production studios. Leveraging their experience in the industry, these individuals provided insight and feedback throughout the system’s development, ensuring the SmartVCS would be a production-accurate virtual camera tool.

Unite 2013: Unity Awards

Presenting at SIGGRAPH 2013

Awards & Honors

Unite Conference 2013: Best Student Project Award (2013)

SIGGRAPH 2013 Studio Research Talk & Booth Presenter (2013)

Drexel University Masters Research Award (2013)

Academy of Interactive Arts: Randy Pausch Award (2012)

Talks & Presentations

NAB 2014: Panel Speaker (Tentative)

GDC 2014 with Side Effects Software (Tentative)

LA Animation & VFX Pipeline Developers Meetup

LA Unity3D User Group

SIGGRAPH 2013: Studio Research Talk & Booth

Want to learn more?