Kesif Augmented Reality Mobile App

Technologies: Unity, C#, Vuforia Augmented Reality SDK, Maya

This was a mobile app I worked on as part of our team's entry in the 2014 Imaginations Design Competition, hosted by Walt Disney Imagineering. Our project was selected as one of the 6 finalists out of 231 total entries. My teammates were John Nappo, Sade Oba, and Gabrielle Patterson.

Note: This project was conceived by the University of Pennsylvania Team Keşif and created for Walt Disney Imagineering's 2014 Imaginations Design Competition. The project is the sole property of Walt Disney Imagineering, and all rights to use these ideas are exclusive to Walt Disney Imagineering. The competition is a way for students and recent graduates to showcase their talents and for Walt Disney Imagineering to identify new talent.
The purpose of the Kesif app is to bring all of the different components of our experience together in one place. If you want to learn more about the experience in general, you can check out the page for our animation. In the app, you can learn about the current week's snapshot, read the story of the Maiden, see how to interact with the puppet platforms, claim your token at the ferry dock, and track your progress and current token collection. But I think the most exciting part of the app is that it brings your tokens to life: you can watch the snapshot on your token animate, and interact with it.

I loved working on this app, even though it was quite a journey: I didn't really know how I was going to build it, so I ended up learning a lot in the process. I'm planning to write a more extensive blog post about how I made it, but here's a brief description.

After a failed attempt with PhoneGap and the Wikitude Augmented Reality plugin for PhoneGap, I changed technologies and switched to Vuforia. I wasn't sure whether the Vuforia Unity SDK was going to work, but I tested it with a sample app, and after solving a couple of problems it worked beautifully. Our initial goal was to have the AR recognize the distinct shapes on the tokens, but looking back, that would probably have required a different approach (such as edge detection with OpenCV). Because of that, we decided to keep it simple and stick with the sample target images that Vuforia provides (the stones and the chips).

Next, I had to figure out how to import my custom models and animations from Maya into Unity. I learned a bit about FBX export from Maya and was able to get a dervish to appear on top of the image target successfully. The step after that was to trigger the animation on a specific interaction. I went for simple again and decided to check the number of touches on the screen: on detecting a touch, the animation starts playing. For the seagull, I added a second animation and linked it to a two-finger touch.
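The touch-count trigger described above can be sketched roughly like this. This is a minimal, hypothetical version, not the project's actual script: the clip names are illustrative, and it assumes the model uses Unity's legacy Animation component with the clips already assigned.

```csharp
using UnityEngine;

// Illustrative sketch: attach to the model shown on the image target.
// One touch plays the main animation; a two-finger touch plays a second one.
public class TokenAnimationTrigger : MonoBehaviour
{
    // Hypothetical clip names, set on the Animation component in the Inspector.
    public string primaryClip = "dance";   // e.g. the dervish's animation
    public string secondaryClip = "fly";   // e.g. the seagull's second animation

    void Update()
    {
        // Input.touchCount reports the number of active touches this frame.
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            GetComponent<Animation>().Play(primaryClip);
        }
        else if (Input.touchCount == 2)
        {
            GetComponent<Animation>().Play(secondaryClip);
        }
    }
}
```

Checking only the touch count keeps the interaction model dead simple, which fit the time constraints: no gesture recognition, just "any tap animates the model."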

The user interface itself is “faked” in Unity using textured planes. As I mentioned earlier, we were in a time crunch, and that method worked pretty well, with almost no delays, so I decided to go with it. I had been testing the app on an Android tablet up to that point, but after an unreasonably long wait to sign up for an iOS Developer Account and several attempts to run the Unity project on the iPad, it finally worked!
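A textured-plane UI like this can be made tappable with a physics raycast. Again, a hypothetical sketch rather than the project's actual code: it assumes each "button" plane has a collider and that the pre-rendered UI image is already applied as its material's texture.

```csharp
using UnityEngine;

// Illustrative sketch: attach to a textured quad acting as a fake UI button.
// A raycast from the touch point decides whether this plane was tapped.
public class FakeUIButton : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
            RaycastHit hit;
            // The plane needs a collider for the raycast to register the tap.
            if (Physics.Raycast(ray, out hit) && hit.collider.gameObject == gameObject)
            {
                // React to the tap here: swap the plane's texture,
                // load the next "screen," and so on.
                Debug.Log("UI plane tapped: " + gameObject.name);
            }
        }
    }
}
```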

We wanted the judges to be able to check out the app's static content on the web if they wanted to, so I made a basic HTML version of the app's contents.