Wonderful Projects I Did With Microsoft’s Kinect


Microsoft has ceased manufacturing the Kinect. Here are some projects, spanning art and research, academic and industrial, that I could only have done with the Kinect.

The Role of Physical Controllers in Motion Video Gaming, 2011:
Paper here.
Video here.
A study where I empirically compared the cognitive and physiological effects of Wii-style versus Kinect-style versus hypothetical 6-DOF controller-style (cough, VR) motion video gaming.
Inside scoop: Even though this work was at Microsoft Research, I did it before a Windows SDK for the Kinect was available, even internally. So, my first week of work was spent building a system that streamed Kinect data from developer Xbox hardware to a PC. Thus began my long history of hacking novel hardware and software to make interesting demos before they should have been feasible. I even had the Kinect for Windows SDK team ask me later how I did it, assuming I’d somehow done their work for them already!
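The core of that hack was simpler than it sounds: ship length-prefixed raw depth frames over a socket. Here’s a minimal sketch of the idea in Python (the original wasn’t Python, and the function names and the 320×240 frame size are my assumptions, not the actual internal tooling):

```python
# Sketch: stream raw uint16 depth frames over TCP with a 4-byte
# length prefix. The real system read frames from devkit hardware;
# here the frame source is whatever array you hand in.
import socket
import struct

import numpy as np

FRAME_W, FRAME_H = 320, 240  # assumed resolution; Kinect v1 also did 640x480

def send_frame(sock: socket.socket, depth: np.ndarray) -> None:
    """Send one uint16 depth frame, prefixed with its byte length."""
    payload = depth.astype("<u2").tobytes()
    sock.sendall(struct.pack("<I", len(payload)) + payload)

def recv_frame(sock: socket.socket) -> np.ndarray:
    """Read one length-prefixed frame and reshape it back into an image."""
    (length,) = struct.unpack("<I", _recv_exact(sock, 4))
    payload = _recv_exact(sock, length)
    return np.frombuffer(payload, dtype="<u2").reshape(FRAME_H, FRAME_W)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Loop until exactly n bytes arrive (TCP may deliver partial reads)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf
```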

KinectFusion, some of the first real-time dense SLAM, 2011:

I’m in the video demoing some of its capabilities from 6:00 to 6:40.

Tweetris, 2011:

A two-player video game where players formed shapes from the game Tetris with their bodies. This is what got me interested in building experiences that turn players into actors in a shared public setting: basically the origin of my interest in digitally-augmented participatory theatre games for novice performers.
The team for this was pretty big. We did the original Tweetris installation at a Toronto-based arts festival called Nuit Blanche. Derek Reilly later remounted it several times in Halifax at a similar festival called Nocturne.
My collaborators and I wrote a pretty in-depth study of player behaviour, which won an award (pardon my horrendous beard, etc. etc.; grad school).
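For a flavour of the underlying problem: the game has to decide how well a body silhouette matches a tetromino. Here’s a minimal sketch of one plausible approach (my assumption, not necessarily what Tweetris actually shipped): downsample the depth-camera silhouette into a coarse occupancy grid and score it against each piece’s mask.

```python
# Sketch: match a binary player silhouette against tetromino masks.
import numpy as np

TETROMINOES = {
    "I": np.array([[1, 1, 1, 1]]),
    "O": np.array([[1, 1], [1, 1]]),
    "T": np.array([[1, 1, 1], [0, 1, 0]]),
    "L": np.array([[1, 0], [1, 0], [1, 1]]),
    "S": np.array([[0, 1, 1], [1, 1, 0]]),
}

def occupancy_grid(silhouette: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Downsample a binary silhouette into a rows x cols grid of fill ratios."""
    h, w = silhouette.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = silhouette[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
            grid[r, c] = cell.mean()  # fraction of filled pixels in the cell
    return grid

def best_match(silhouette: np.ndarray) -> tuple[str, float]:
    """Score the silhouette against each piece; higher is better."""
    scores = {}
    for name, mask in TETROMINOES.items():
        grid = occupancy_grid(silhouette, *mask.shape)
        # Reward filled cells where the piece is, punish spill elsewhere.
        scores[name] = (grid * mask).sum() - (grid * (1 - mask)).sum()
    return max(scores.items(), key=lambda kv: kv[1])
```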

Kinect Body Paint, 2012:

Built at a Microsoft-sponsored jam session at the Pervasive 2012 conference.

Shapeshifter, inspired by Tweetris, which I worked on with Kyle Duffield until late 2013:

While it worked well as an art installation, most people’s living rooms were too small for Shapeshifter’s intense kinetic movements, so we shelved it. I’ve dreamed of bringing it back some day! Sidebar: this was from an era when I was a wayyyy worse programmer. I was a good research programmer but a bad video game programmer, since I easily got bogged down in continually reinventing the wheel/building a better mousetrap. I’ve since become way better, but that’s a topic for another day.

Background Activity, a Kinect-captured dataset of typical living room gestures, published 2015:

This felt like the most serious, useful research I ever did. Details here: http://www.dgp.toronto.edu/~dustin/backgroundactivity/

Improv Remix, my PhD thesis, published 2016:

Where I built a system for improv performers to record and replay their movement. Sort of like a loop pedal for improv comedy, with some Tupac hologram thrown in there. Credit to theatre lighting and projection designer Montgomery C. Martin for making the holographic part work. One of the curious constraints of this project was that I had to write a library to detect whole-body gestures from behind the users. For this, I invented a UX paradigm called the Vitruvian Menu, which you can read about, among other things like theories of interactive theatre, in my PhD thesis.
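The full Vitruvian Menu design is in the thesis; here’s a hedged sketch of the core trick, with the joint names and sector layout being my assumptions. Selection is computed in the body’s own coordinate frame, built from the shoulders and spine, which is what lets detection work whether the performer faces toward or away from the camera:

```python
# Sketch: body-relative radial menu selection from skeleton joints.
import math

import numpy as np

def menu_sector(joints: dict[str, np.ndarray], n_sectors: int = 8) -> int:
    """Map the right wrist's direction from the shoulder center to a sector.

    `joints` maps names to 3D camera-space positions, e.g. from a
    Kinect skeleton frame (joint naming here is assumed).
    """
    right_axis = joints["shoulder_right"] - joints["shoulder_left"]
    right_axis /= np.linalg.norm(right_axis)
    up_axis = joints["shoulder_center"] - joints["spine_base"]
    up_axis /= np.linalg.norm(up_axis)

    arm = joints["wrist_right"] - joints["shoulder_center"]
    # Project the arm onto the body's frontal plane; the camera-facing
    # direction cancels out, so this reads the same from behind.
    angle = math.atan2(float(arm @ up_axis), float(arm @ right_axis))
    return int(((angle + math.pi) / (2 * math.pi)) * n_sectors) % n_sectors
```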

Replacing live performers with virtual avatars, 2016:

Here, combined with a Vive, I can swap out someone’s body for a live motion-captured avatar.
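The retargeting behind this kind of swap is conceptually small. A minimal sketch, assuming joints arrive as 3D positions and the avatar stores a bind-pose direction per bone (all names here are mine, not from the actual project): rotate each bind-pose bone direction onto the live direction between the corresponding Kinect joints.

```python
# Sketch: drive avatar bone rotations from captured joint positions.
import numpy as np

def shortest_arc(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Quaternion (w, x, y, z) rotating unit vector a onto unit vector b."""
    w = 1.0 + float(a @ b)
    if w < 1e-8:  # vectors nearly opposite: pick any perpendicular axis
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        q = np.array([0.0, *axis])
    else:
        q = np.array([w, *np.cross(a, b)])
    return q / np.linalg.norm(q)

def bone_rotation(joints, bind_dirs, parent: str, child: str) -> np.ndarray:
    """Rotation taking one bone from the avatar's bind pose to the live pose.

    `joints` maps joint names to captured 3D positions; `bind_dirs` maps
    (parent, child) pairs to the avatar's unit bone directions at bind time.
    """
    live = joints[child] - joints[parent]
    live /= np.linalg.norm(live)
    return shortest_arc(bind_dirs[(parent, child)], live)
```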

Live motion capture in virtual reality theatre (Raktor), 2017:

As part of Raktor, I implemented a way for actors to puppet virtual avatars during a live stage show. You can see me puppeting a wizard in the demo reel, and later Jasper puppeting a fairy. Unlike other mocap systems, even really cheap ones like Vive trackers, the Kinect requires nothing to put on or take off, which made it perfect for quick, spontaneous running around on stage, taking control of virtual characters.
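One practical detail of live puppeting: raw Kinect skeletons jitter, and jitter reads as trembling on a big projected character. A standard fix, of which the Kinect SDK’s built-in skeleton smoothing was (to my understanding) a variant, is Holt double exponential smoothing. Here’s a minimal per-joint sketch; the parameter values are illustrative, not tuned:

```python
# Sketch: Holt double exponential smoothing of one joint's 3D position.
import numpy as np

class HoltSmoother:
    """Smooth a stream of 3D positions while tracking their trend."""

    def __init__(self, alpha: float = 0.5, beta: float = 0.3):
        self.alpha, self.beta = alpha, beta  # position / trend gains
        self.level = None  # smoothed position estimate
        self.trend = None  # smoothed velocity estimate

    def update(self, raw: np.ndarray) -> np.ndarray:
        if self.level is None:  # first frame: nothing to smooth against yet
            self.level = raw.astype(float)
            self.trend = np.zeros_like(self.level)
            return raw
        prev = self.level
        # Blend the new sample with where the trend predicted we'd be.
        self.level = self.alpha * raw + (1 - self.alpha) * (self.level + self.trend)
        self.trend = self.beta * (self.level - prev) + (1 - self.beta) * self.trend
        return self.level
```

Because the trend term predicts forward, this stays far more responsive than a plain moving average; raising alpha makes it snappier, lowering beta damps overshoot.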
