UX Designer, Full Stack Dev
Sept 2021 - June 2022
Links: You can check out the initial version here and the final version here! It's also featured on the Snap Spectacles Instagram page. Or, read the official project paper.
With the implications of XR among tech's biggest hot topics, it has never been a more exciting time to work with mixed reality. Snap Inc. released the next generation of AR technology in the form of the 2021 Snap Spectacles, their first pair of true augmented reality glasses.
Lots of my colleagues at the Brown HCI Lab have been working on AR projects, but none with AR glasses, so I was eager to be the first to dive into this technology. We're specifically interested in partial object manipulation and naturalistic throwing interactions, and that work has taken the form of Pokemon CatchAR, a Pokemon Go demo with live hand tracking and waypoint navigation. I'm leading this project as a full stack prototyper, defining the end-to-end UX and gathering feedback along the way.
I developed the application (lens) to work on both Spectacles and smartphones, with full hand tracking on each, and it has earned unique plays and views on Snapchat.
The interface of the lens, from concept to reality
Our design goals:
Implement an intuitive throwing algorithm
Seamlessly embed objects in the user's world
Make it fun by giving the user a goal
After acquiring the Spectacles for the lab, we soon found that lenses (the term used by Snapchat for experiences running on their tech) can only be developed using Lens Studio, an environment made by Snap Inc. to rival Unity and Spark AR.
This meant I spent my first couple of weeks on the project learning the ins and outs of this software. I wanted to explore cool utility features in AR, and eventually started testing out a navigation system. Think about it: Google Maps routing, projected right in your glasses!
Unfortunately, there were clear roadblocks: a GPS API isn't possible with this tech stack yet, and even if it were, granular navigation would be very time-consuming to implement. So we set this idea aside and thought about what excites people most about AR: games! While raw utility features will be a major topic of research in the coming years, we figured we could still explore AR paradigms while creating an immersive gaming experience.
I decided to tinker with Snap's hand tracking framework, which we ultimately used to implement our own throwing algorithm. Because Lens Studio had no physics engine, we had to calculate the velocity and trajectory of the ball ourselves.
To do this, we created a ring buffer of the latest 10 hand positions and computed a velocity vector from the average displacement between them, which we then fed into a simple kinematics routine to make the ball move. After refining the formula, we added some Pokemon into the scene and had a first draft of the project! A sketch of this approach is below.
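Here's a minimal sketch of that ring-buffer velocity estimate and the hand-rolled ballistic step, written in TypeScript for illustration. The Vec3 type, the constants, and all of the names are my own assumptions, not the actual Lens Studio script:

```typescript
// Sketch only: the real lens runs in Lens Studio's scripting runtime.
type Vec3 = { x: number; y: number; z: number };

const BUFFER_SIZE = 10; // latest 10 hand positions, as described above
const GRAVITY = -9.8;   // m/s^2 on the y axis (assumed value)

class HandVelocityEstimator {
  private samples: { pos: Vec3; t: number }[] = [];
  private head = 0; // index of the oldest sample once the buffer is full

  // Record one tracked hand position per frame, overwriting the oldest
  // sample once the ring buffer reaches capacity.
  push(pos: Vec3, t: number): void {
    if (this.samples.length < BUFFER_SIZE) {
      this.samples.push({ pos, t });
    } else {
      this.samples[this.head] = { pos, t };
      this.head = (this.head + 1) % BUFFER_SIZE;
    }
  }

  // Average displacement over the buffered window approximates the
  // hand's velocity at the moment of release.
  velocity(): Vec3 {
    const n = this.samples.length;
    if (n < 2) return { x: 0, y: 0, z: 0 };
    const oldest = this.samples[this.head % n];
    const newest = this.samples[(this.head + n - 1) % n];
    const dt = newest.t - oldest.t;
    return {
      x: (newest.pos.x - oldest.pos.x) / dt,
      y: (newest.pos.y - oldest.pos.y) / dt,
      z: (newest.pos.z - oldest.pos.z) / dt,
    };
  }
}

// Per-frame ballistic update for the thrown ball, needed before
// Lens Studio shipped its own physics engine.
function stepBall(pos: Vec3, vel: Vec3, dt: number): void {
  vel.y += GRAVITY * dt; // gravity pulls the vertical component down
  pos.x += vel.x * dt;
  pos.y += vel.y * dt;
  pos.z += vel.z * dt;
}
```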
From user feedback, we learned that we needed a tutorial screen to educate users about hand tracking (a fairly new interaction method for most people), as well as some additional game mechanics to make the experience more exciting.
Remember that navigation idea from earlier? We decided to revisit it in the context of Pokemon Go. In the mobile game, users view Pokemon on a map on their phones. Of course, having a map of that quality wouldn't be possible on Spectacles, so after some ideation, we created a radar-like mini-map showing locations where Pokemon will spawn if the user travels close enough to them. These locations are randomized within a fixed distance of the user so that there are always 3 spawn locations in the player's immediate environment, roughly as sketched below.
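The spawn placement could look something like the following; the radius constants are guesses on my part (the real values were tuned through playtesting), and Vec3 is reused from the sketch above:

```typescript
const SPAWN_COUNT = 3;          // always 3 spawns around the player
const MAX_SPAWN_DISTANCE = 15;  // meters; assumed outer radius
const MIN_SPAWN_DISTANCE = 3;   // meters; keep spawns out of arm's reach

// Scatter SPAWN_COUNT points at random angles and distances
// around the player, on the same ground plane.
function randomSpawns(player: Vec3): Vec3[] {
  const spawns: Vec3[] = [];
  for (let i = 0; i < SPAWN_COUNT; i++) {
    const angle = Math.random() * 2 * Math.PI;
    const dist =
      MIN_SPAWN_DISTANCE +
      Math.random() * (MAX_SPAWN_DISTANCE - MIN_SPAWN_DISTANCE);
    spawns.push({
      x: player.x + Math.cos(angle) * dist,
      y: player.y, // assume flat ground at the player's height
      z: player.z + Math.sin(angle) * dist,
    });
  }
  return spawns;
}
```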
While the mini-map shows the relative location of the Pokemon in 2D and updates in real time, it is a fairly small signifier. To remedy this, we adapted our path-generating navigation system into an arrow: as the user moves through 3D space, the arrow points toward the nearest Pokemon.
To maximize efficiency, navigation mode turns off when the user enters "throwable" distance, which is denoted by the white circle around the Pokemon (see the sketch below). While I was working on these features, Snap added a physics engine to Lens Studio, which we also integrated to make the ball feel more grounded in the environment. These changes were informed by user feedback and observational studies.
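For illustration, the arrow guidance and the throwable-distance cutoff might look like this; THROWABLE_DISTANCE and the scene hooks declared at the bottom are hypothetical stand-ins for the real lens bindings:

```typescript
const THROWABLE_DISTANCE = 2.5; // meters; assumed radius of the white circle

function dist(a: Vec3, b: Vec3): number {
  return Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z);
}

// Pick the closest remaining spawn (assumes at least one exists).
function nearest(player: Vec3, spawns: Vec3[]): Vec3 {
  return spawns.reduce((best, s) =>
    dist(player, s) < dist(player, best) ? s : best);
}

// Yaw angle (radians) the arrow should face to point at the target.
function arrowYaw(player: Vec3, target: Vec3): number {
  return Math.atan2(target.z - player.z, target.x - player.x);
}

// Per-frame update: hide the arrow once the player is in throwing range.
function updateNavigation(player: Vec3, spawns: Vec3[]): void {
  const target = nearest(player, spawns);
  const inRange = dist(player, target) <= THROWABLE_DISTANCE;
  setArrowVisible(!inRange);                            // hypothetical hook
  if (!inRange) setArrowYaw(arrowYaw(player, target));  // hypothetical hook
}

declare function setArrowVisible(visible: boolean): void;
declare function setArrowYaw(yaw: number): void;
```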
While you can view our initial lens on Snapchat here, marketing the lens more widely meant removing the Pokémon IP due to legal issues.
Here's a list of the changes we made to the lens:
So far, this has been a super fun project, and I'm happy to be working on a milestone in AR research. Here are some things I learned: