15″ 2015 MacBook Pro (the last one before USB-C)
2x Lightning cables. The second one is 2 metres long, which is helpful for AR debugging.
Wired mouse (helpful for 3D modelling or level design work)
Various chargers, including a battery charger
3D printed phone stand (critical for AR work, thanks @philnelson)
Bridge HMD + Structure Sensor + Charger
Bridge 6dof Controller
Gear VR + Samsung Galaxy S6
Game Controller (for Minecraft in Gear VR, for playing on airplanes)
iPhone 6S (my normal/ARKit dev work phone)
2x Bluetooth iBeacons (Estimote, XYFindIt)
Wallet (includes credit-card size bottle opener/multi-tool)
Proof that I’m a resident of Canada (crossing into the US has been weird in the past):
– Hard copies of ongoing contracts
– Sublet agreement for a place in Toronto
Notebook, blank pages
D20/D&D Dice Set
Graphic Novel “Ingress: Origins”
Toiletries case, including folding toothbrush
Sugar pills (just in case)
Advil (just in case)
Emergency migraine medicine (expensive, prescription, just in case)
Deodorant (Old Spice)
Lock (for gyms)
Yellow REI computer bag with padded laptop section (so useful)
Biking-oriented, sweat-wicking jeans (Duer L2X)
3 pairs of underwear
3 pairs of socks
Various wizardy shirts and jackets
Various leather things
Yesterday, I saw/experienced the show “D&D Yoga”, in the Toronto Fringe Festival.
This is a literal combination of these two things, which worked in some surprising ways and didn’t work in others. It is a yoga class, run by real-life yoga instructor Christine Desrochers, during which you go on an actual Dungeons and Dragons adventure, with dice rolling and character sheets and hit points and an inventory. Here’s my character sheet:
This is immersive theatre at its most experimental and Fringiest. You should see it.
Other people have written reviews of the show, so I’m specifically going to cover its immersiveness. I haven’t done much yoga, but the meditation/yoga I’ve done has always been about bringing your mind to a place of suggestibility and relaxation that allows you to feel, temporarily, somewhere else. There’s the typical “imagine yourself on a beach” or “imagine yourself in a forest, with birds chirping”. These are mainstays of meditative/yogic experiences. In a typical D&D setup, by contrast, this immersive, imagination-provoking information is delivered while everyone is sitting with their eyes open, probably with a beer in hand, while at least one person is looking over their character sheet doing needless math, and another is requesting the volume go up on a Skype connection.
The most immersive parts of the experience were when the yoga instructor was guiding us through poses related to our dungeon crawl: pushing a boulder out of the way, climbing a wall, shimmying through a crack in the rock, hiding against a wall from goblins, trying to escape from flesh-eating vines (I rolled a Nat 20 and got to notify the party early). This made me wish for more narrative exercise experiences, where my imagination is engaged while my body is being used.
When I run or play D&D sessions, it’s highly banter-y. People interrupt with insane ideas, there are jokes, the tone is that people are encouraged to try to do things and it’s the job of the DM to let them know if they shouldn’t. You’re encouraged not to self-censor. This felt at-odds with an experience where a Dungeonmaster/Yoga Instructor is guiding you through poses as you’re doing a dungeon crawl. In typical D&D, you could say: “Instead of rappelling down, I’m going to use my bag as a parachute”, and it’s up to the DM to come up with a roll for that, while everyone laughs. For this type of show, it would be an un-yogic experience to pause and have the Dungeonyogamaster come up with a suitable roll and pose for you.
Just by itself, the one-shot campaign designed for the show is quite clever. Like I said above, it has to feel interactive without too much player intervention, and it even includes a clever twist at the end, at least in my playthrough. It makes me want more immersive theatre exercise. More! I demand anyone who reads this make more in that genre!
We were given a class at the beginning (Warrior, Rogue, or Mage). Each had a separate attack yoga pose, as well as out-of-combat yogic abilities. (I’m not joking; this was tightly done and I’m giggling at how literally hilarious this design is.) One awkward thing was that we had to roll our dice on the ground before doing our attack pose. I kept forgetting to roll the dice beforehand, and when we went through to see how everyone did, I had to reach down and roll quickly. It makes me wish for some sort of in-hand dice.
I’m currently putting together a grant application. This means I need to get together documentation and press for what I’ve done. I need to submit this all on printed physical paper. URLs or pdf files by themselves are not enough.
I’m on macOS. Normally, you can just print a webpage to pdf from the browser. However, for some crazily-made websites, this doesn’t work and produces broken formatting. I had to use a plugin (which I now forget, because I’ve since switched computers and it was a few months ago) to print these broken webpages. The plugin outputs the page as one large, single-page pdf, like so:
Unfortunately, this is not “ready to print” yet.
I have several single-page pdfs with page sizes that are much larger than 8.5″ x 11″. I want to split these into pages so that they’re ready to print on 8.5″ x 11″ paper. Fortunately, the layout of the big pdf is such that we can treat it as a single column; we don’t have to split the page left-to-right, only vertically.
Trying to find a solution:
Nothing seems to do this out of the box. In fact, most of the current pdf management tools seem unprepared for non-standard pdf page sizes.
This is a hard-to-Google problem, in that all the ways to specify it are ambiguous for the purposes of a search engine. Is there a general term for these kinds of problems? When you search for “splitting pdf” or “crop pdf” or “separate pdf pages” the results assume you want to take a pdf document with several pages in it, and remove some of the pages, without affecting individual pages themselves.
macOS’s Preview app’s print dialogue won’t let me print a single page pdf onto multiple pages. When I try to adjust scale of a single pdf page, it crops the page rather than lets it overflow:
I thought I could maybe use the lovely command-line tool ImageMagick’s convert to take each single-page pdf and crop it into a series of 8.5″x11″ pages. Unfortunately, it looks like my source pdfs all have varying widths. While it may be possible, there’s no immediately convenient way to measure the width of a source pdf and split it into semi-overlapping tiles of a given ratio.
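In hindsight, the vertical-splitting arithmetic itself is simple; the hard part was finding tooling that applies it. Here’s a minimal sketch of the crop-box math in Python (the function name is my own, and feeding these boxes to a PDF library’s mediabox is an assumption, not a tool I actually used at the time):

```python
import math

LETTER_W, LETTER_H = 612.0, 792.0  # 8.5" x 11" in PDF points (72 points per inch)

def tile_boxes(page_w, page_h, tile_h=LETTER_H):
    """Split one tall page into top-to-bottom crop boxes.

    PDF coordinates put the origin at the bottom-left, so the first tile
    (the top of the page) has the largest y values. Returns a list of
    (left, bottom, right, top) rectangles."""
    n_tiles = math.ceil(page_h / tile_h)
    boxes = []
    for i in range(n_tiles):
        top = page_h - i * tile_h
        bottom = max(top - tile_h, 0.0)
        boxes.append((0.0, bottom, page_w, top))
    return boxes

# The "8.5 x 33 inches" document Acrobat reported would split into 3 pages:
print(tile_boxes(612.0, 33 * 72.0))
```

Each tuple is a rectangle in PDF coordinates; with a library like pypdf you could, in principle, assign each one to a copy of the source page’s mediabox to get printable pages.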
I heard that Adobe Acrobat would let me crop pdfs. I installed it via homebrew cask, which is how I install any application whenever possible. After installing it, I found that the pdf-cropping feature was only available in Acrobat Pro. Cropping, a pro feature!? Seems crazy, but given how arduous my search had been so far, maybe that’s not actually that crazy.
Fortunately, I’m already paying for Adobe Creative Cloud at $50 USD/month.
I found the pdf splitting interface in the print dialogue (more on that later). However, Adobe Acrobat won’t print to pdf. And, Adobe won’t print to macOS’s print dialogue – trying to do this crashes Adobe Acrobat. Adobe Acrobat would only print to real printers that were connected to my computer, which I couldn’t do because I’m sending pdfs to a print shop via email.
So, I needed to make a virtual printer that showed up in my print dialogue, but actually printed to pdf. I found one called VipRiser. Here’s the virtual printer:
With VipRiser, you can choose where the pdf goes. I tried setting it to Desktop, and then Downloads, which VipRiser accepted, yet when I tried to print, it would hang for a bit then told me it “couldn’t find the folder”. So, I selected the “Open in Preview” option instead. Then, after the resulting pdf opened, I could save it to the desired location. This yak is a Matryoshka doll.
VipRiser worked fine after that, but it froze if my Mac ever slept. It also hung for a shockingly long time, around 30 seconds, while printing documents of only a few pages.
Now that actually outputting a document is solved, let’s go back to the Adobe Acrobat print interface. Under the “Page Size & Handling” tab, select “Poster” to choose your tiling options.
On the right side, you can see the dotted lines cut the big single page into 3 pages. However, the top of the first page isn’t aligned with the top of the original page. I couldn’t see how to fix this. I just decided to accept this and hoped it wouldn’t make my application look too weird.
You can see I set the “Tile Scale” to 60%; I found this value by trial and error. Note that in the page visualization on the right, it tells you the document size is “8.5 x 33 inches”. If you make the Tile Scale one notch bigger, 61%, it changes the page layout so it’s 11 inches wide, ignoring the Orientation setting:
But then it worked. Holy shit.
I come across these sorts of “I just want to do a simple thing” deep dives more often than I’d like, so I’ve started a new category: yak-shaving posts. If you aren’t familiar with the definition of yak shaving:
[MIT AI Lab, after 2000: orig. probably from a Ren & Stimpy episode.] Any seemingly pointless activity which is actually necessary to solve a problem which solves a problem which, several levels of recursion later, solves the real problem you’re working on.
Here’s a story of my struggles with version control at Raktor as I push it to the limit for a variety of projects in the Unity engine. Pour a drink and commiserate with me.
I love git. My background is in handling large, complex codebases that go all the way down to the metal, so distributed version control with branching and rebasing is essential. As I juggle many different third-party libraries and projects while pumping out MVPs, a robust, well-documented repo history is important for diagnosing when bugs appeared and why.
For our large repo, I’ve found git + git-lfs to be “good enough but still terrible at handling large binary assets”, so here’s my experience over the past year and a half. It’s important to emphasize that this repo is intentionally messy; we’re moving at the speed of prototyping, and I’m not taking the time to worry about whether we’ll use an asset frequently before we add it. I’m also not taking time to cull assets that we haven’t used in a while, as we are often remounting old projects. We’re not worrying about a “shippable” state, we’re worrying about a “runnable” state as we move fast and break things for demos we are running ourselves.
Here’s a look into the repo, a total of 13 GB and 504 commits to date:
Partway through development, as the repo started to get very heavy, I reorganized it so that any asset content that was updated infrequently was moved to the “Dressing Room” folder, which weighs in at 10.7 GB. I fantasized that, at some point, I’d move this content out of git and manage it separately. This is mostly Unity Asset Store downloads.
This repo is used to ship to 4 separate platforms (macOS, Windows, Android, iOS) and we use third party libraries with inconsistently and naively documented compatibility with different versions of Unity (at the moment, 5.6.0, 5.4.3xEditorVR-p3, 5.4.2, 5.3.4).
Since even individual projects need to be compiled on multiple platforms to run, I need to switch back and forth between these quickly to build and test. Often, switching platform or Unity version triggers an asset re-import. Unity Cache Server helps a bit with this. However, whether the asset is being imported from scratch, or “downloaded” from the cache server (I only ever used localhost), this can take up to 10 minutes on my faster Windows machine, or up to half an hour on my slower Macbook.
I switch back and forth between programming on macOS and Windows, and MonoDevelop and Visual Studio have different default attitudes toward whitespace. I haven’t dumped enough time into figuring out the smoothest way to do things. Also, I haven’t been able to get a handle on git’s autocrlf settings in a way that “just works”. One time a bunch of ^M characters showed up in my .gitignore file and I had no idea why, and didn’t want to touch it.
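For what it’s worth, the conventional setup (a sketch of the standard advice; I haven’t verified it cures my particular ^M haunting) is to commit LF everywhere and let each OS decide what to check out:

```ini
# ~/.gitconfig on macOS (check out as-is, convert CRLF to LF on commit):
[core]
    autocrlf = input

# ~/.gitconfig on Windows (check out CRLF, convert to LF on commit):
[core]
    autocrlf = true
```

A `* text=auto` line in .gitattributes would pin the normalization per-repo instead of per-machine, so every clone behaves the same regardless of local config.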
Unity Cache Server
While Cache Server has been great, a different version ships with each version of Unity. It’s not clear to me what a given Cache Server version’s compatibility is going backwards and forwards. Note that since my machines move around physically, I’m only ever using a localhost cache server, and haven’t shared one between machines.
Scary Anecdote: I once had two copies of the big repo on the same computer, for two separate versions of Unity. (This was to handle another problem I’ll get to later.) Unity Cache Server was running, and both versions of Unity had been linked to it. I opened repo A with Unity version A, then closed it with no changes. Then, I opened repo B with Unity version B, then closed it with no changes. Then, I opened repo A with Unity version A again, and Unity downloaded changes from the cache server! What’s going on there!? I wish it was more transparent what the Cache Server was doing.
Special characters that appeared in an asset downloaded from the Unity store have been the bane of my existence and will not die.
These files show as modified even when they haven’t been changed, and re-appear every time I have to git clone, or navigate forwards or backwards over the commit where I made changes to them. I don’t know how to fix this problem, or how much effort I should put into it. I’m guessing it’s a macOS <-> Windows compatibility issue, but to solve it once and for all, I think I’d need to go and edit git history to excise them from ever existing, right? For all I know, the special characters that refuse to die may also persist in the Library or Cache Server cache and resurrect themselves after I naively believe they are gone, like some cyberpunk version of The Thing. I’d love advice on this.
Git LFS: Large File Storage
Git-lfs is, in principle, a great idea: for big binary files that aren’t going to change often, keep them outside of the regular git tree and only download them as needed. Don’t store the entire binary files’ history in the .git directory. GitHub charges a small premium for Git-lfs bandwidth, and if it worked 100%, it would be totally worth it ($5 per month for 50 GB of bandwidth). Git-lfs is open-source and managed by GitHub themselves, and clearly aimed at keeping git-familiar devs like me using git instead of switching to a more game-tailored version control system.
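For context, the day-to-day setup is only a few commands; my .gitattributes further down was generated by lines like these (a sketch of the standard workflow, nothing exotic):

```shell
git lfs install          # one-time per machine: wires the lfs clean/smudge filters into git config
git lfs track "*.psd"    # appends a filter line to .gitattributes
git add .gitattributes   # commit the tracking rules like any other file
```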
Installing and running git and git-lfs on Windows is fucked. By way of explanation, I’m used to Unix-based systems where there seems to be one agreed-on method to install and access programs. On Windows, I had to resort to using the GUI app GitHub for Windows to install git because it sets up GitHub’s 2FA right, and I couldn’t get the keys (via Putty, etc.) working without it.
When uploading or downloading large assets, sometimes the network would hang, or the git operation would fail for some other reason. This appeared to leave the repo in a corrupt state. While git status would finish execution, files would show as changed even if they hadn’t been, and git checkout . would hang indefinitely, even if the files were relatively small, like a jpg. Poking around in the git-lfs issues, it appears that this is due to smudge errors (smudging is the checkout-time process where a git-lfs pointer is replaced by the actual file content; the commit-time reverse, replacing the file with a pointer, is the clean filter). I would end up with a repo that was corrupt due to an unrecoverable smudge error. Hey, take a look at how many corrupt repos I have, each of which is ~13 GB and required me to freshly download all of those hot gigabytes!
To avoid having to freshly re-download, I tried “backing up” my repo periodically by zipping it, but this seemed to cause even more problems with OS-specific files getting added on unzip. Zipping itself took ~15 minutes due to the sheer number of files (29,542) and folders (1,460).
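A possibly-better backup route I didn’t try at the time: git can pack an entire repository’s refs and history into a single file, sidestepping the thousands-of-small-files zip problem (though I don’t know how gracefully bundles interact with lfs pointers):

```shell
git bundle create ../backup.bundle --all   # one file containing all refs and history
git clone ../backup.bundle restored-repo   # restore by cloning the bundle like a remote
```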
On further investigation, git-lfs 2.0 supposedly handled smudge error recovery much better. However, git lfs version showed I was on 1.5.5. I upgraded to git-lfs 2.0 and then continued to diagnose issues, but kept having them. Imagine my gaslight-y horror when git lfs version revealed I’d been reverted to 1.5.5 somehow! Imagine how horrifying it was to discover this when I was also trying to diagnose other reasons why the repo was corrupt, and everything I tried had processing times from 15 minutes to an hour!

Turns out that the shell launched from GitHub for Windows uses the git-lfs installed at %UserProfile%/AppData/Local/GitHub/lfs-amd64_1.5.5/git-lfs, and if you update it to a later version, like I did, it reverts! So there’s no way to update the git-lfs version used by GitHub for Windows to a more stable one.
Next, I installed git-lfs via the terminal offered through Sourcetree. Somehow, first installing Github for Windows, and letting it make 2FA settings, and then installing Sourcetree, and then installing git-lfs 2.0 via Sourcetree’s terminal, made it work. Before, when I’d straight installed Sourcetree, I couldn’t get it to work without GitHub for Windows setting up 2FA right. Yes, I know about GitHub’s auth tokens and I know Sourcetree 1.8 and 1.9 sometimes cached server passwords in a buggy way.
(Let’s take a breath and remind ourselves that my goal in all this is to get to work, not diagnose git issues.)
As a final git-lfs puzzle, periodically, git-lfs seems to “discover” files that were already in commit history that should have been added to lfs a long time ago, but somehow have not been yet. Is there some git-lfs-doctor I can run? I’d love to know.
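The closest thing to a git-lfs-doctor that I’m aware of (hedging: I haven’t confirmed it recovers from smudge corruption, only that the commands exist) is the built-in integrity checker, plus forcing a re-download of the pointers’ content:

```shell
git lfs fsck          # verify local lfs objects against their expected hashes
git lfs fetch --all   # re-download lfs objects from the remote
git lfs checkout      # replace any pointer files in the working tree with real content
```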
FYI, here’s my .gitattributes:
$ cat .gitattributes
*.psd filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
*.jpg filter=lfs diff=lfs merge=lfs -text
*.tga filter=lfs diff=lfs merge=lfs -text
*.tif filter=lfs diff=lfs merge=lfs -text
*.tiff filter=lfs diff=lfs merge=lfs -text
*.mp3 filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.xcf filter=lfs diff=lfs merge=lfs -text
*.bytes filter=lfs diff=lfs merge=lfs -text
*.dll filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.7z filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*LightingData.asset filter=lfs diff=lfs merge=lfs -text
*.exr filter=lfs diff=lfs merge=lfs -text
Alternatives: Plastic SCM
I know there are game-dev-oriented version control systems like Perforce, but I’ve been resistant because git has been so powerful, and anything I read about the others indicates that they aren’t as capable.
I’ve had Plastic SCM strongly recommended by a developer I trust, so I gave it a shot over a game jam, taking a copy of my existing big repo and making 98 commits over 72 hours, as a solo dev.
– The output of cm diff is not helpful.
– Files are often labelled as “changed” even if they’ve only been “checked out”, and there are no actual changes, not even whitespace.
– I don’t like that commit labels are incrementing numbers, not hashes. I didn’t try branching and merging, but this doesn’t make me optimistic that the results will be easy to read.
– Pre-commit, there’s no git-like concept of staging. While I’m working in git, I use staging to indicate to myself which parts of a current chunk of work are “good to go” versus “still messy/working on it”.
– The Plastic SCM client I used, as far as I can tell, allowed for only one “active workspace”, aka a repository, at once. This limitation is pretty insane. While I’m all for big mono-repos, when I’m diagnosing behaviour of external libraries who have their own git history, I need to be able to examine and operate on multiple histories at once.
– Plastic SCM’s ignore format is not as regex-friendly as git’s .gitignore, so I could not simply rename my existing .gitignore and get going.
Even Other Alternatives
I refuse to make my own version control system like Jon Blow. My needs as a developer can’t be that insane, right?
Other Question: What shell should I be using on Windows?
Like I said, I’m used to using macOS or *nix systems, which have a one-stop-shopping shell. On Windows, we have: cmd, Powershell, Powershell opened via GitHub for Windows (which adds GitHub for Windows’ git to its path), and the MINGW64 terminal launched by SourceTree (which, oddly, is missing fundamentals like which). Finally, there’s Bash for Windows, which installs its own Unix environment. However, anecdotally, I’ve found any git operations via Bash for Windows take about 5x longer than via Powershell. I’m not sure if this is due to some level of abstraction, but it makes it pretty unusable. Also, none of these shells support copy-and-paste as elegantly as macOS does, so I automatically feel disdain towards them.
Back to Git-LFS: as I was trying out different Windows shells, I once ran git checkout . on a repo using lfs in a git environment that didn’t have lfs. This corrupted the repo unrecoverably, so I had to download all 13 GB from scratch yet again. Please: I’d love a command like git-lfs-doctor or git-lfs-unbreak that can diagnose and repair repos.
For the last few years, I’ve been a “VR tech professional”, which means I have, on my desk, various pieces of Virtual/Augmented/Mixed Reality equipment. These will get cheaper, but at the moment are dubious buys for the average person. Despite companies’ best efforts, set-up is still a confusing pain.
VR still has so many goddamn boxes. When can I get it in syringe form? pic.twitter.com/xNXWy16bNv
— Dustin Freeman🏡TO (@dustinfreeman) April 25, 2017
Some room-scale Virtual Reality hardware like the Vive or Oculus Touch requires a dedicated, calibrated space. These don’t exist in a lot of homes. When I was making a multiplayer Kinect game a few years ago, one of the limiting problems was that the clear space in the average person’s living room was just not enough.
So let’s talk about ticketed Virtual Reality Venues, both temporary and permanent, using specific examples in Toronto. VR venues can serve a few purposes:
– equipment rental
– a dedicated, professional space setup
– a night out away from your living space
Vivid VR ran a pop-up Virtual Reality Cinema on Dundas Street West in the Summer of 2016. $20, 1 hour. I was really excited to see their approach; having incorporated virtual reality into a couple performances with Raktor so far, I wanted to see what a company that called itself a “VR Cinema” would do. The space was filled with swivel chairs. They handed out Gear VRs preloaded with some 360 films, plus over-ear headphones. They told us to put the headset on, and then only one of the earpieces, so we could hear their instructions. The organizer then started a countdown, and we were all to point at the same movie icon in the Oculus Video app and press the Gear VR touchpad to start it at the same time, then put the last earphone on. So we put the earphones on and watched “together”, but actually isolated.
I was disappointed by this – I was expecting bespoke multi-headset-synced-playback software. I was there with a friend and I wanted us to be able to talk to each other during the movie and call attention when we noticed interesting stuff going on. I can’t recall the movies themselves, but they were guilty of needlessly incorporating movement; in the first scene we were mounted on top of a car.
I had already owned a GearVR at this point for at least 6 months, so the value proposition (for me) of this cinema was negligible. The tickets were totally sold out, even though it felt like merely equipment rental.
The Toronto International Film Festival (TIFF) ran a summer 2016 VR event series called POP which has, so far, been the best-run VR event I’ve ever seen. They based it out of a gallery space in their big King Street East building. Cleverly, each Vive or Oculus station used a ceiling-mounted cable with a spring-loaded dog leash to keep the cable out of the way. Entry price was $23.75 for a few dozen VR experiences, and an attendant at each station ensured visitors did not get disoriented and things ran smoothly. Raktor premiered our asymmetric multiplayer storytelling experience Inverse Dollhouse here, and having an attendant I could train to run the experience and smooth any bumps for new people was great. TIFF POP was very successful ticket-sales-wise, each event in the series selling out weeks in advance.
I currently live in Kensington Market, within 500 m of two (TWO!) VR-only arcades.
Toronto VR Games, at 55 Kensington Ave., has a genuinely cyberpunk-y feel. It’s a former Chinese fruit market that kept some of the signage and added some sick dragon art.
Inside, it’s lit like a submarine (dark red) and the VR station dividers are black curtains. It feels exactly like the place I’d go if I wanted myself to become emaciated in VR while I did some 96 hour hack to steal a corp’s info in a William Gibson novel. There’s a fridge with sugar drinks I’m sure I could pay the staff to pour in my mouth so I wouldn’t have to take my headset off. This place has mostly Vive stations, and some Oculus stations in the back, but no Oculus Touch yet. Currently $28.25/hour.
To contrast, VRPlayIn at 294 College Street is well-lit and feels like somewhere I could take a risk-averse suit-wearing person, or the kind of person who brings their kids around in a van. VRPlayIn opened quietly a couple of weeks ago.
VRPlayIn only has Vive stations, and is currently $29-$39/hour depending on day/time of the week, though considering the place is so nice I feel they could charge much more. They even have a large private room that legitimately feels like a private karaoke room. I’ve dreamed of that “bookable holodeck” setup for a few years now and this is the first time I’ve seen it.
Apparently, VRPlayIn is a wing of VNovus, a VR software studio. VNovus has made an in-headset VR app launcher and intro experience. This seems redundant with the work that Steam and Oculus have done, but I suppose everyone has their own ideas of what users’ first contact with VR should be.
Toronto VR Games vs. VRPlayIn: VRPlayIn deserves your money more because they are a genuinely nicer space, but if you want to be confronted by cyberpunk aesthetic realness, Toronto VR Games is for you. They both have a very large selection of experiences, though if you want to play something specific, you can check in advance.
There’s another upcoming VR Venue: House of VR opening May 6th. An article claimed they were Toronto’s first VR lounge; tbd if they’re lounge-y enough to not count as an arcade. They do promise to have at least one Mixed Reality green screen area – though probably not as good as the state of the art: LIV.
Raiders’ e-Sports Centre is a surprise: it looks, sounds, smells and feels like a Sports Bar, but it’s e-Sports, not, like, actual sports. Big-screen TVs on almost every wall show mostly League of Legends, but also various Twitch streamers’ channels. A big area with leather booths serves standard beer and pub food. To break from a normal sports bar, there are a few dozen bookable desks with PCs, just like an internet cafe. There are a few more bookable booths with multiplayer game consoles. There are a couple VR stations with Vives, called the Atomic District. Pricing is $25/hour. Unlike the other current VR arcades in Toronto, you can actually get food and alcohol here, so it’s approaching a real party venue. Here, at least one of the Vive setups is surrounded on 3 sides with open space, unlike being in a booth at VRPlayIn or Toronto VR Games, so if you want to be performative, this is the spot.
Electric Perfume is a “studio and event space” near Pape Station that, full disclosure, I’ve run multiple events out of and taught workshops at. With a projector, wraparound white walls, and a single well-constructed Vive setup it’s the most holodeck-y of any setup I’ve seen so far. If you want to book out a space to exhibit something beautiful, this is the spot. In the land of traditional theatre, “black box theatre” is a space with totally black walls and drapes that you can make look like any environment with lighting. Electric Perfume is a perfect “white box theatre” space, if you bring your own projectors for the other walls. In the future, I hope for “green box theatre” spaces for wraparound mixed reality.
Professional VR Developer Post-Note: If I want to run VR events or playtests with custom or pre-release software, I need to be able to install my own executable or bring my own machine(s) and plug it into their VR rigs. So far, I’ve asked VRPlayIn about this, and they were a little resistant about me installing my own software on their machines. I’m hoping that House of VR or another venue is less so – this would enable VR release parties and other special events. I or someone else shouldn’t have to set up an entire temporary exhibit like TIFF POP when we want to show off something non-standard.
See this compilation:
Paxton Number: Bill Paxton has a Paxton Number of zero. Actors who have been killed on-screen by a monster that has also killed Bill Paxton have a Paxton Number of 1. More generally, an actor’s number is m+1, where m is the lowest Paxton Number among other actors killed by the same monster. Every actor with no such chain has a Paxton Number of infinity.
Bill Paxton‘s Paxton Number is zero, and he was killed by the Predator in Predator 2.
Jesse Ventura‘s Paxton Number is 1, as he was killed by the Predator in Predator. He was also killed by Poison Ivy, as played by Uma Thurman, in Batman & Robin.
Ralf Moeller‘s Paxton Number is 2, as he was also killed by Poison Ivy in Batman & Robin. He was also killed by The Scorpion King, as played by Dwayne “The Rock” Johnson in The Scorpion King.
Randy Couture‘s Paxton Number is 3, as he was also killed by The Scorpion King, this time played by Michael Copon, in The Scorpion King 2: Rise of a Warrior.
The rule is based on the monster in the story, not the actor playing it. So we can’t count kills by Arnold Schwarzenegger as the Terminator equivalently with kills by him as Conan the Barbarian – those are separate monsters.
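Since the Paxton Number is just shortest-path distance in a graph where two actors are linked whenever the same monster killed both, it can be computed with a breadth-first search. A sketch in Python, using only the kills listed above:

```python
from collections import deque

# The kill graph from this post: actor -> set of monsters that killed them on-screen.
killed_by = {
    "Bill Paxton":   {"Predator"},
    "Jesse Ventura": {"Predator", "Poison Ivy"},
    "Ralf Moeller":  {"Poison Ivy", "The Scorpion King"},
    "Randy Couture": {"The Scorpion King"},
}

def paxton_numbers(root="Bill Paxton"):
    """Breadth-first search over actors who share a killer.

    Actors never reached keep an implicitly infinite Paxton Number
    (they simply don't appear in the returned dict)."""
    dist = {root: 0}
    queue = deque([root])
    while queue:
        actor = queue.popleft()
        for other, monsters in killed_by.items():
            if other not in dist and monsters & killed_by[actor]:
                dist[other] = dist[actor] + 1
                queue.append(other)
    return dist

print(paxton_numbers())
```

Feeding the rest of the film canon into killed_by would assign finite numbers to everyone reachable from Paxton, which is exactly the "whole swaths of people" move below.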
There are several murderously prolific villains. Let’s find a finite Paxton Number for one of their victims, and then we can give finite Paxton Numbers to whole swaths of people. They are:
– Darth Vader
Collab credit: Cian Cruise & Jan Streekska
Possibly useful resource: cinemorgue.wikia.com
*Fact: My Erdős Number is 4. Co-authorship via Ravin Balakrishnan; Michael Chi Hung Wu; Maria M. Klawe; Paul Erdős (see the co-authorship graph).
Or, How I Fixed A Real-Time Image Transmission Protocol For A Live Event By Making A Numbers Station
As we got deeper in beers, we started telling war stories of the craziest things that went well or poorly with past projects in the wild. Here is one thing I did that went exceptionally well, but shouldn’t have.
It was autumn of 2011. I was part of a team of professors and students at the University of Toronto and OCAD doing a project for the all-night art event Nuit Blanche. Tweetris was a multiplayer game where two players raced to match a shape from the game Tetris, as judged by a Kinect:
The first person to match the shape, and hold it stably for a period of time, had their picture taken and tweeted to our live feed, @TweetrisTO, which you can still see! Then, anyone during the event could go to a now-defunct URL and actually play Tetris with the bodies of players. You can see what this looks like in the middle of this video.
So, instead of cropping images web-app-side, we built a numbers station with a secret Twitter account. A numbers station is one of a series of odd shortwave radio stations that have transmitted numbers, spoken aloud, for decades, presumably to transmit intelligence information to spies. You should really read the Wikipedia article.
The Tweetris team had already built the capability to download images from Twitter into a webapp, so we changed our backend to add a secondary, secret Twitter account, to which the Kinect app tweeted pre-cropped square images, each captioned with a small formatted CSV identifying which Tetris piece it corresponded to, along with the sub-piece coordinate and rotation. Here is what that looked like:
Unfortunately, all of the twitpic links for this account appear to have been cleared out. But, if you can imagine, these were all random portions of grainy images of human bodies; half of a face, a hand and a sleeve, a shoe, etc. All somewhat off-centre and with small amounts of motion blur. Miraculously, even though this account received 4x the image tweets of the main TweetrisTO account, it did not get tagged as spam during the live 12-hour event. We obviously didn’t announce this secret account or encourage anyone to follow it.
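The payload itself was tiny. Here’s a hypothetical sketch of what the caption format might have looked like – the field names, ordering, and types are my invention; only “piece, sub-piece coordinate, rotation” comes from the description above:

```python
# Hypothetical sketch of the secret-account tweet payload: each cropped
# square image is captioned with a small CSV naming the Tetris piece,
# which cell of the piece this crop is, and the piece's rotation.
from dataclasses import dataclass

@dataclass
class CropPayload:
    piece: str      # one of the seven tetrominoes: I, O, T, S, Z, J, L
    cell: int       # index of the sub-piece within the tetromino (0-3)
    rotation: int   # rotation in degrees (0, 90, 180, 270)

    def to_csv(self) -> str:
        """Serialize to the caption tweeted alongside the cropped image."""
        return f"{self.piece},{self.cell},{self.rotation}"

    @classmethod
    def from_csv(cls, caption: str) -> "CropPayload":
        """Parse a caption back into its fields on the web-app side."""
        piece, cell, rotation = caption.split(",")
        return cls(piece=piece, cell=int(cell), rotation=int(rotation))

# A tetromino has four cells, so one winning photo yields four tweets:
captions = [CropPayload("T", cell, 90).to_csv() for cell in range(4)]
# The web app polls the secret account and reassembles the board:
decoded = [CropPayload.from_csv(c) for c in captions]
```

The point of the scheme is that Twitter itself becomes the transport: the Kinect side only tweets, and the web app only reads a public (if unadvertised) feed.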
So, there you go. Your backend not working? No worries – just make a numbers station in plain sight!
Is this at 90% of being amazing and needs to be pushed/polished just a little or is it actually at 20% and there’s a ton more work?
If it’s a long way from being good, is the path to success clear or unclear?
If the extra work to make it good could be put in, does it then become uneconomical?
Do you think that this experience as it stands is sufficient for most audience members, and thus there’s no need to address your concerns?
If it’s underwhelming, there are two options:
1. It’s great but not for you
2. It’s not great for anyone
(#2 means it’s a failure for better or worse)
Could you salvage some parts of the experience into a more contained version that is higher quality?
Were your expectations harmfully different from what you experienced? If so, did these unhelpful expectations come from part of the intended preparation for the experience (e.g. marketing) or from yourself?
Is the essence of the experience (narrative, dynamic) still interesting despite the execution? Could this compelling core idea be implemented in a better way?
Is there a chance that your opinion could be influenced by an ‘off-night’ performance by any of the live actors or implementers?
Was there an arc (beginning, middle, and end) to your experience? Do you feel the experience could have been improved by the addition/sharpening of such?
Did you feel snubbed or irritated by the subject matter? Do you feel like the topic was irrelevant or impersonal to you? Is it possible that other individuals might agree/disagree?
Do you feel distanced from the experience? Could it be made better by making you feel more involved?
What message/effect/phenomenon were they trying to convey? Could it be rephrased in a more effective and articulate way? How many versions do you think they considered?
Did you have fun doing what you did? Did you feel yourself come through in this work? Did you reach any new heights or cover new territory?
What mark were they trying to hit with their (desired?) audience? Does this relate to them well?
Perhaps something about comprehending the material – was it clear what the performance was trying to communicate? Was the plot / meaning / theme ever opaque in a way that seemed unintentional or didn’t add to the work?
Tied to the subject matter – is this a genre / medium that you particularly dislike?
– Please suggest additions –
Contributors: Joy, Randy, Patrick, Dat, Katy
I finally set aside time over New Year’s to see all the immersive theatre in New York City that people have been bugging me to see. Here’s a terse listing of them all. NOTE: all of these shows are great and worth seeing. With my comments, I’m not trying to convince anyone whether or not a show is worth seeing, or even to provide a helpful summary of any show. These are primarily reflections on what I care about and what I felt experiencing them.
The primary reason I went to New York. I saw it twice. I enjoyed that, after the first show, I felt I had experienced enough content to be worth the ticket price, yet also had the feeling that there was so much more to experience that it was worth going again. That is a difficult balance to strike. The first show, I mostly avoided following performers, as most audience members do, and instead wandered the building. In the last 10 minutes of the show, I came across a floor I had not seen and frantically tried to explore it as quickly as possible. Very cool to be struck with the immensity of the content. The second time I saw the show, I mostly explored that floor, and stayed stationary as scenes flew past me. I even managed to get to a private scene (see cheeky lipstick smudge above). Sleep No More is very sparse when it comes to spoken language, which I suppose helps when you come across scenes in the middle of them. With no spoken language, you don’t feel that you’re missing out on any factual narrative. I don’t think that’s the kind of puzzle-box-y show I’d ever want to make, but it is a clever hack. I don’t particularly care for innovative dance, though watching hotties use their bodies in interesting ways is nice, so the joy of Sleep No More for me was treating it as a clever content choreography puzzle. I would definitely go again.
I was told that if I didn’t like Sleep No More, then I would like this one better. Then She Fell has gorgeous costume and set design. The density of production quality is so much higher because you’re guided on much more personal, intimate journeys. However, it somehow didn’t feel like theatre for me. All the small scenes were about intimacy of quirk, and I did not find myself caring enough about the characters and their arcs for it to feel like actual theatre – just static representations of well-dressed characters in a pretty time and space. The coordination of the audience moving through the space is interesting; a nice contrast to Sleep No More’s more chaotic approach.
Grand Paradise has my favourite structure but my least favourite plot. It’s set in a tropical paradise in the 70s, with several proxy audience members also on vacation. It’s about a decadent time away from the worries of our yuppie lives. I had drinks poured for me several times, had actors invite me to slow dance, and spooned with one in a wooden cabin on a beach set. Which is all nice and interesting, but again the static-ness of the dramatic experience was frustrating to me. It just feels like set and moment design, and a string of interesting moments does not add up to a plot that, you know, should really be smacking me around psychologically. The soft touch of the plot made it feel like I was at a theatre spa.
However, the structure was quite clever. The audience had a free wander that was controlled a little more tightly than Sleep No More (in Third Rail shows, you aren’t allowed to open doors). However, you’d get pulled aside for more intimate scenes. The leis we were innocuously given at the start served as markers for the actors for whether we had received an intimate scene or not. The intimate scene to audience member ratio was roughly 4:1, but using the leis as markers ensured that at least everyone had an intimate scene near the beginning, without it coming off as too controlled.
This used a binaural microphone head on stage, so the solo performer could whisper in your ear, and do other interesting effects. Everyone in the audience wore earphones. The effect was amazing, and I’m frustrated that I don’t see this used everywhere. Will probably try to use them in my own project. It’s simply magic.
This is actually “VR Theatre”. Like actually! Sort of. You go inside a little “back stage” area, put on a Vive headset, and in front of you in VR a virtual curtain opens into a large Broadway-like theatre with an audience applauding you. You make grandiose gestures with the Vive controllers, which lead to different types of cheering responses from the adoring audience. As the VR curtain opens, a real red curtain opens to the gallery space the installation is set in, exposing you to whoever else is in there, either just hanging out or waiting for their own turn. When I visited midday, there were about a dozen people. However, you don’t see or hear these people while playing the game, yet they cheer for you anyway. Since you are immersed in the game, and the VR headset takes on the properties of a mask, you end up acting more confidently absurd to these real audience members because you think you’re acting to fake AI audience members. A sort of Ender’s Game of acting. The game seemed to be about guessing which gestures led to certain cheers. The screenshot after my run-through shows which gestures I “found”. CVRTAIN definitely follows one of the themes Raktor cares about, which is tricking non-performers into being performers.
A lovely touring mystery theatre. I’ve done a few of these before, and this felt like one of the first ones I’ve seen with a really good production value. We were hilariously hampered by snow.
I saw the closing show. The effects of multiple generations of poverty and abuse are intense. I was reminded of reading Octavia Butler’s Wild Seed from earlier this year. By total surprise, Bill and Hillary Clinton were in the audience like 40 people away from me.
An absurdly polished, well coordinated funny show. Like seriously the level of polish is insane. The show is 100% on rails, but there’s some very clever audience interaction mindfucks. These folks are absurdly talented.
This is sort of like a higher-end more specific Chuck E Cheese with the strong belief that ninjas are what the internet believes ninjas are. There’s constant jump scares and yelling, which for the first 15 minutes is eye-rollingly terrible but then rotates around to being hilarious. I highly recommend for the absurdity, unless you’re like, too cool for fun or something.
I saw this as it’s a higher-budget, less narrative-driven version of Playlines’ work. They made some quite clever choices and obviously have spent a lot more engineering time on debugging tools, but it’s nice to see that the finickiness of Bluetooth beacons is just as hard for them as it is for us. They also implemented some ideas that we were thinking of doing, but having seen them done in person I don’t think they make sense for us, which is perfect. Good field trip.