• Intel’s Perceptual Computing Camera

    Intel - Perceptual Computing

    At GDC 2013 I spoke at the “Natural, Intuitive, and Immersive Gaming With Intel Perceptual Computing” session and at the Intel Booth, introducing the emerging field of Physical Technical Art, an area that focuses on human-computer and human-human interactions and on physical workflows during game development and play. One of the main interaction devices I covered in my talk was the Intel Perceptual Computing Camera. In this article I’m going to talk about my experiences working with this hardware and how I used the provided SDK to port an iOS game to its perceptual interface.

    What is Perceptual Computing?

    Traditionally, when interacting with a computer you had only a handful of basic input devices: keyboard, mouse / trackball, joystick or digitising tablet. Over the years this has exploded to include speech recognition, touch / multi-touch interfaces, eye and face tracking, motion / gesture tracking and even scratch detection, to name a few.

    Intel’s Perceptual Computing initiative focuses on the research and development of software and hardware encompassing speech recognition, face identification, finger and hand gesture tracking and augmented reality for consumers, in turn improving the Natural User Interface, or human-machine interface.

    GDC Intel Robert Butterworth

    Intel Developers Session

    Intel Booth Talk on porting “The Bowling Dead”


    The Camera

    Intel Perceptual Camera

    The Intel Perceptual Computing camera is actually a Creative camera: a USB 2.0 HD webcam with dual-array microphones and a depth sensor designed for close-range interactions. Unlike the Kinect, it doesn’t require any additional power, and it’s quite small, as you can see in the first image. The camera does some image and sound processing on board, but most of the magic is in Intel’s SDK. It’s great for laptops and desktops, but not so accessible to lower-power systems such as the Raspberry Pi or Arduino.

    How it works

    The Creative camera is built from a number of parts, including an HD camera that captures 24-bit color at 30 fps, as well as stereo microphones; essentially what you would find in a standard HD webcam. The other camera is an 8-bit infrared camera, with an infrared-emitting light right next to it, which can capture 320×240 pixels at up to 60 fps.

    The HD color camera works as a normal camera would, although it has an infrared filter over the lens to stop any light produced by the IR emitter from flooding the color image. The infrared camera is a little different; it works in tandem with the emitter using a concept called Time of Flight, or ToF. This uses the known speed of light to measure distance, which requires some precision electronics to time an IR flash to each pixel on the IR sensor. With this timing, an image can be created with a gradient of distance, as you can see in the grayscale image below.
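    To put a number on the precision involved, here is a sketch of the ToF relationship. This is just the underlying physics, not the sensor's actual electronics, which measure timing per pixel with specialised circuitry rather than a literal stopwatch:

```python
# Time-of-flight depth sketch: distance is recovered from the round-trip
# time of an IR pulse reflecting off a surface.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the surface: the pulse travels out and back, so halve it."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface half a metre away returns the pulse in roughly 3.3 nanoseconds,
# which is why the camera needs precision timing electronics.
round_trip = 2 * 0.5 / SPEED_OF_LIGHT
print(distance_from_round_trip(round_trip))  # ~0.5
```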

    Anomalies can appear in the image if the surface you are illuminating doesn’t reflect the light correctly (or at all), or sits outside the sensor’s timing range. Glass is a great example: it is virtually invisible to the sensor, and we can use this to our advantage; I’ll go into a little more “depth” later. The IR light is similar to the light used in a remote control, and as with a remote you can see it with a mobile phone camera or any other camera that doesn’t have an IR filter on it.

    The dual microphones not only offer high-quality audio capture at 48 kHz, they also provide positional information that can be used to identify who is talking in the image, or to pick up other gestural cues such as clicking or tapping. The PCSDK also includes a speech recognition system for voice commands and dictation.
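    The positional trick rests on the tiny time difference between the sound arriving at each microphone. A minimal sketch of the idea, assuming a far-field source and an illustrative mic spacing (the SDK does all of this for you):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def direction_of_arrival(delay_seconds: float, mic_spacing_m: float) -> float:
    """Bearing in degrees from centre, from the inter-microphone delay.
    Positive delay means the sound reached one mic first, i.e. an off-centre source."""
    # Far-field approximation: path difference = spacing * sin(angle)
    sin_theta = SPEED_OF_SOUND * delay_seconds / mic_spacing_m
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp numeric noise
    return math.degrees(math.asin(sin_theta))

# A speaker dead ahead produces zero delay, hence a zero-degree bearing:
print(direction_of_arrival(0.0, 0.07))  # 0.0
```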

    Parts of the Camera

    Color and Depth Images

    With the information collected via the color and infrared sensors, the Intel PCSDK can algorithmically detect hand positions and gestures, detect faces and facial motion, and produce 3D point clouds of the viewed scene, as you can see in the images below.

    The PCSDK is a simple-to-use but very powerful API that takes the hard work out of the computation and image manipulation required to track and detect these gestures. Information about each finger, palm or gesture is easily queried whether you are using it in a standalone app, incorporated into a presentation system such as Cinder, in a web app, or in a game engine as I did with The Bowling Dead port.


    Hand & Face Tracking and Point Cloud Images

    Comparable Technologies

    There are a number of hardware devices similar to the Intel camera, each with its own strengths and weaknesses. Here are some of the major devices that are currently popular and in use.

    Kinect 1 & 2
    Kinect for Xbox 360 and Kinect for Xbox One

    The Kinect for PC has been a staple for hackers for a number of years now, and the applications built on it range from full-body motion capture systems to 3D scanners. The resolution of the image it provides is quite low, however, and the hardware’s age is starting to tell. The Kinect 1 camera uses a slightly different technique for constructing its 3D scene, namely structured light: the system projects an IR pattern (the structured light) over the scene and then reconstructs depth from the distortion of that pattern.

    With the second generation of Kinect currently shipping with the Xbox One, a PC version will soon be available. This new hardware switches to the ToF system with an HD sensor, which should greatly improve its depth quality and overall tracking. One of the major downsides to the Kinect is its form factor: the hardware is designed as a tabletop device that requires external power in addition to the USB connection, not as an ultra-portable device. Access to the Kinect 2 API is still restricted but will be opened up soon.

    Leap Motion


    The Leap Motion is constructed from two cameras and three IR LEDs, and runs completely off USB. Measuring only 3” long, this unit is extremely responsive, fits nicely in front of your keyboard, and will soon be integrated within one. With its cameras pointing directly up it can capture fingertips and hand motion smoothly and at a high frame rate; it is best suited to interaction with palms facing down towards the camera. The recent SDK 2.0 brings improvements to hand and finger tracking, with better occluded-finger tracking and hand and finger labeling.

    Intel Camera

    Creative Camera

    Overall, the Intel PCSDK is the powerhouse behind this developer hardware. It provides a great deal of information about the subject, with hand, face and gesture tracking along with voice recognition. Its medium form factor makes it easy to take wherever you go, and it runs completely off USB.

    Now that the developer hardware has been out for a year, Intel has announced, and demoed at CES 2014 and IDF Shenzhen 2014, that within the year we will see a greatly miniaturised device embedded in laptops and tablets. This is going to have a phenomenal effect on how we interact with our hardware, and I’ll certainly be looking forward to testing out the new incarnation of this device.

    Compact PC Camera


    Over the months leading up to GDC 2013 I was able to test out a couple of features of the SDK. I was particularly interested in seeing how a touch-based game would adapt to the Intel camera.


    The Bowling Dead – iOS to Perceptual Computing SDK conversion

    The Bowling Dead is an iOS game that uses a touch-swipe mechanism to bowl various bowling balls, some explosive, at the ever-increasing zombie horde marching towards you down an alleyway. If a zombie reaches you, the game switches to a frantic melee and you must fight the zombie off before it devours you.


    Hand Tracking in Maya

    Using the SDK, I decided to see whether hand tracking would work within Maya. It was relatively easy to get it all hooked up and operational: I used a combination of C# with Python.Net, and it was surprisingly fast. This could be extended to allow the manipulation of other objects or brushes within the scene, or used as a general motion capture interface.
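    The heart of a rig like this is just remapping camera-space hand coordinates into Maya's world space and moving a locator. A minimal sketch, assuming a normalised 0..1 camera coordinate space and an illustrative scale; the locator name and ranges are my own, not the camera's real output format:

```python
# Sketch of feeding tracked hand positions into a Maya scene. The mapping
# function is the interesting part; the maya.cmds call only runs inside Maya,
# so it is guarded for use outside the application.

def hand_to_world(nx: float, ny: float, scale: float = 10.0):
    """Map normalised camera coords (0..1, origin top-left) to Maya XY,
    centred on the origin with Y up."""
    return ((nx - 0.5) * scale, (0.5 - ny) * scale)

def drive_locator(nx, ny, locator="handTracker_loc"):
    """Move a (hypothetically named) locator to the tracked hand position."""
    x, y = hand_to_world(nx, ny)
    try:
        import maya.cmds as cmds  # only available inside Maya
        cmds.move(x, y, 0, locator, absolute=True)
    except ImportError:
        pass  # running outside Maya; just return the mapped position
    return (x, y)

print(drive_locator(0.75, 0.25))  # (2.5, 2.5)
```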


    Head Tracking in Maya (Perspective coupled Head Tracking)

    In this video you can see perspective-coupled head tracking. This system drives the camera in planar space based upon the movement of the user’s head, producing perspective parallax, which is a component of depth perception. In addition to this technique the user could wear anaglyph glasses (typically red/cyan) or polarized glasses, adding stereo vision, which is key to producing immersion.
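    The usual way to turn a tracked head position into a convincing "window" effect is an off-axis (asymmetric) projection frustum: the screen stays fixed in space and the frustum is skewed towards the eye. A sketch of the frustum bounds under screen-centred coordinates; the screen dimensions below are illustrative assumptions:

```python
# Off-axis frustum sketch for head-coupled perspective: the screen is a
# fixed window at z = 0, the eye sits at (x, y, z) with z > 0 in front of
# it, and the frustum edges pass through the screen corners.

def off_axis_frustum(eye, screen_half_w, screen_half_h, near):
    """Asymmetric frustum bounds (left, right, bottom, top) at the near
    plane, suitable for a glFrustum-style projection."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: shrink screen extents to the near plane
    left = (-screen_half_w - ex) * scale
    right = (screen_half_w - ex) * scale
    bottom = (-screen_half_h - ey) * scale
    top = (screen_half_h - ey) * scale
    return left, right, bottom, top

# A centred eye 0.6 m from a ~52 cm x 32 cm screen gives a symmetric frustum;
# moving the head skews it, producing the parallax seen in the video.
print(off_axis_frustum((0.0, 0.0, 0.6), 0.26, 0.16, 0.1))
```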



    In later posts I’ll be delving a little more into the details of how I ported the game from iOS to the Intel Perceptual Computing SDK.


    Stay Tuned!

  • Super Ultra Deadrising 3 Display

    Over the past 6 months I’ve been working on an expansion pack, “Super Ultra Deadrising 3’ Arcade Remix Hyper Edition EX Plus Alpha”, and as part of the project I helped Jason Buchwitz build an in-house display for it. Here’s a little sample of what the sign can do: starting with just an idle animation, every minute or so it enters an attract mode where the sign lives up to the name of the game… massively over the top and completely awesome! Check out the video below, and afterwards you can read how the display was put together.



    The sign is built mainly of foam core with graphics printed and glued to its surface. To illuminate it I used some 36 mm square 12 V digital RGB LED pixels.


    These lights are individually addressable (4 per square) and can produce the full RGB gamut. I chained a couple of these strings together and fed them around the display, starting around the logo and then behind the flames. The lights are controlled via a controller chip (WS2801) and take data via SPI, or Serial Peripheral Interface. This data feed is sent from an Arduino Due.
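    The data format the WS2801 expects is pleasingly simple: three bytes per pixel shifted out in chain order, with the chip latching the frame once the clock idles. A sketch of the frame packing (in Python for illustration; on the Due this is just a loop over SPI writes), assuming the common R, G, B channel order, which can vary between strips:

```python
# Sketch of the byte stream shifted out over SPI to a WS2801 string:
# 3 bytes (R, G, B) per pixel, in chain order. The WS2801 latches the
# frame when the clock line idles for >= 500 microseconds.

def ws2801_frame(pixels):
    """pixels: list of (r, g, b) tuples, 0-255 each, in chain order.
    Returns the raw bytes to clock out over SPI."""
    out = bytearray()
    for r, g, b in pixels:
        out += bytes((r, g, b))
    return bytes(out)

# Two pixels: full red, then half-brightness blue.
frame = ws2801_frame([(255, 0, 0), (0, 0, 128)])
print(frame.hex())  # ff0000000080
```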



    This 84 MHz ARM microcontroller looks after the sequencing of the display and can also control audio playback. The whole display is powered by two 12 V power supplies for the lights and a 5 V power adapter for the Arduino. I’ll be going into a little more detail about the code and wiring setup in future posts… Stay Tuned!


  • Physical Technical Art

    Over the next couple of posts I’m going to be writing mainly on the topic of the emerging field of Physical Technical Art. So… what does that mean, you say? Game development is for the most part a software exercise; once concept art and motion capture have been digitized, the remainder of production is mainly dedicated to DCCs, asset pipelines and code. Development can be fraught with unforeseen hurdles and challenges, and some of these problems require a team to think outside the software box; this is where Physical Technical Art provides a solution. In addition to solving hardware problems, Physical Technical Art also aims to improve the human-computer interface (NUI), the human-to-human interface and game interaction methods through the use of technology.

    A great example of a hardware solution is the Xbox 360 Controller Monitor. This custom hardware captures the controller’s inputs (buttons and joysticks) and displays the values on a small board. The board is then framed alongside the game’s video output and filmed, giving the reviewer a clear connection between an input and the time it takes to be displayed on screen.

    The following video is a Tear Down of this hardware, a brief overview of the electronics used and how it captures and displays its data:

    Xbox 360 Controller Monitor (Tear Down)

    Latency is an important concern when developing a game: it changes how the game is played, and if it’s too high the user may get frustrated, ruining the experience. The Xbox 360 Controller Monitor is a practical and simple solution to this problem. Yes, input latency can be calculated within the devkit, but a hardware solution adds no overhead and enables testing of games that can’t be profiled.
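    The electronics involved are modest: button states are sampled through an 8-bit parallel-in, serial-out shift register and clocked into the microcontroller one bit at a time. A simulation of that read, with a hypothetical button-to-pin mapping of my own:

```python
# Simulation of reading eight button states through a parallel-in,
# serial-out shift register: latch the inputs, then clock them out
# one bit per pulse, MSB first, assembling a byte on the way in.

def shift_register_read(inputs):
    """inputs: eight booleans (pressed = True), in pin order.
    Returns the byte the microcontroller assembles as it clocks the register."""
    assert len(inputs) == 8
    value = 0
    for bit in inputs:  # one clock pulse per bit, MSB first
        value = (value << 1) | int(bit)
    return value

# Two buttons held on a hypothetical pin mapping:
print(bin(shift_register_read([True, False, False, False, True, False, False, False])))
```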

    Ben Heck’s Controller Monitor Intro Video

    Ben Heck’s Website

    Uno32 (Arduino) Code: Macro_Controller.Zip

    Fairchild 8-Bit Shift Register Datasheet

    If you use something like this or any other physical solutions in your studios, I would love to hear about it. You can contact me via the contacts tab or just leave a comment.