Disclaimer: This project is still a work in progress!
With tech's biggest topic being the implications of the metaverse, it has never been more exciting to work with mixed reality. Snap Inc. released the next generation of AR technology in the form of the 2021 Snap Spectacles, their first pair of true augmented reality glasses.
Lots of my colleagues at the Brown HCI Lab have been working on AR projects, but none with AR glasses, so I was eager to be the first to dive into this technology. We're specifically interested in incorporating partial object manipulation and naturalistic interaction, which has taken the form of Pokemon CatchAR, a Pokemon Go demo with live tracking and waypoint navigation. I'm leading this project as a full-stack developer, defining the end-to-end UX and gathering feedback along the way.
I developed the application (lens) to work on both Spectacles and smartphones.
Currently, augmented reality applications aren't fully immersive; most rely on the use of a phone screen. For example, the most popular AR game of all time, Pokemon Go, requires users to throw a ball by swiping. User feedback shows that current methods for interacting with AR objects are not nearly as immersive as users would like.
What if we leverage hand tracking to create a more
immersive and accessible experience for
interacting with virtual objects?
[Stats: unique plays and views on Snapchat, with full hand tracking on Spectacles and smartphones]
[Figure: The interface of the lens, from concept to final design]
How do we leverage hand tracking to create a more
immersive and accessible experience for interacting with virtual objects?
Implement intuitive throwing algorithm
Seamlessly embed objects in the user's world
Make it fun by giving the user a goal
Making an Augmented Reality Game
After acquiring the Spectacles for the lab, we soon found that lenses (the
term used by Snapchat for experiences running on their tech) can only be developed using
Lens Studio, an environment made by Snap Inc. to rival Unity and Spark AR.
This meant I spent my first couple of weeks on the project
learning the ins
and outs of this software. I wanted to explore cool utility features in AR, and eventually
started testing out a navigation system. Think about it: Google Maps routing, projected right into your field of view. Unfortunately, however, there were clear roadblocks: a GPS API isn't quite possible with this tech stack yet, and even if it were, granular navigation would be very time-consuming to
implement. So we ultimately set this idea to the side and thought about what
excites people most about AR. Games! While raw utility
features will be a main topic of research in the coming years, we figured we could still
explore AR paradigms while creating
an immersive gaming experience.
I decided to tinker with Snap's hand tracking framework, which we ultimately used to implement our own throwing algorithm. Because Lens Studio had no physics engine at the time, we had to calculate the velocity and trajectory of the ball ourselves. To do this, we created a ring buffer of the latest 10 hand positions and calculated a velocity vector based on the average of these positions, which was then fed into our physics algorithm to make the ball move. After refining the formula, we added some Pokemon into the scene and had a first draft of the project!
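To make that throwing mechanic concrete, here is a minimal TypeScript-style sketch of the idea. This is not our actual Lens Studio script: the names (`recordHandSample`, `estimateThrowVelocity`, `stepBall`), the gravity constant, and the smoothing choice are illustrative assumptions; only the 10-sample buffer comes from the description above.

```typescript
// Sketch of the hand-velocity throw estimate. Illustrative only.
type Vec3 = { x: number; y: number; z: number };

const HAND_HISTORY_SIZE = 10; // keep the latest 10 hand positions, as described above
const GRAVITY = -9.8;         // assumed gravity in world units per second^2

interface HandSample {
  position: Vec3;
  time: number; // seconds
}

const samples: HandSample[] = [];

// Called once per frame with the latest tracked hand position (ring-buffer behavior).
function recordHandSample(position: Vec3, time: number): void {
  samples.push({ position, time });
  if (samples.length > HAND_HISTORY_SIZE) {
    samples.shift(); // drop the oldest sample so only the most recent 10 remain
  }
}

// Smooth the hand motion over the buffer into a single release velocity.
function estimateThrowVelocity(): Vec3 {
  if (samples.length < 2) return { x: 0, y: 0, z: 0 };
  const first = samples[0];
  const last = samples[samples.length - 1];
  const dt = last.time - first.time;
  if (dt <= 0) return { x: 0, y: 0, z: 0 };
  return {
    x: (last.position.x - first.position.x) / dt,
    y: (last.position.y - first.position.y) / dt,
    z: (last.position.z - first.position.z) / dt,
  };
}

// Hand-rolled ballistic update, standing in for the physics engine we didn't have.
function stepBall(position: Vec3, velocity: Vec3, dt: number): { position: Vec3; velocity: Vec3 } {
  const v = { x: velocity.x, y: velocity.y + GRAVITY * dt, z: velocity.z };
  const p = { x: position.x + v.x * dt, y: position.y + v.y * dt, z: position.z + v.z * dt };
  return { position: p, velocity: v };
}
```

In the real lens, the per-frame hand position would come from Snap's hand tracking framework, and the ball would be launched with the estimated velocity the moment the user releases it.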
From user feedback, we learned that we needed to implement a tutorial to educate users about hand tracking (a fairly new interaction method for most people), as well as some other game mechanics to make the experience more exciting.
Remember that navigation idea from earlier? We decided to revisit it
in the context
of Pokemon Go. In the mobile game, users view Pokemon on a map on their phones. Of
course, having a map
of that quality wouldn't be possible on Spectacles, so after some ideation, we created a radar-like mini-map
showing locations where Pokemon will spawn if the user travels close enough to
them. These locations are randomized within
a fixed distance of the user such that there are always 3 spawn locations in the
player's immediate environment.
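As a rough illustration, the spawn logic could look something like the sketch below. This is a hedged sketch rather than our production code: `SPAWN_COUNT` matches the "always 3" behavior described above, but the distance bounds and function names are assumptions.

```typescript
// Illustrative sketch of randomized spawn points around the player.
type Vec3 = { x: number; y: number; z: number };

const SPAWN_COUNT = 3;          // always 3 spawn locations near the player
const MIN_SPAWN_DISTANCE = 2;   // assumed lower bound (meters)
const MAX_SPAWN_DISTANCE = 10;  // assumed "fixed distance" upper bound (meters)

// Pick a random point on a ring around the player, at ground level.
function randomSpawnPoint(player: Vec3): Vec3 {
  const angle = Math.random() * 2 * Math.PI;
  const dist =
    MIN_SPAWN_DISTANCE + Math.random() * (MAX_SPAWN_DISTANCE - MIN_SPAWN_DISTANCE);
  return {
    x: player.x + Math.cos(angle) * dist,
    y: player.y, // keep spawns at the player's ground level
    z: player.z + Math.sin(angle) * dist,
  };
}

// Top up the active spawn set so the player always has SPAWN_COUNT nearby targets.
function refillSpawns(player: Vec3, activeSpawns: Vec3[]): void {
  while (activeSpawns.length < SPAWN_COUNT) {
    activeSpawns.push(randomSpawnPoint(player));
  }
}
```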
While the mini-map shows the relative location of the Pokemon in 2D and is updated in real time, it is a fairly small signifier. To remedy this, we adapted our path-generating navigation system to take the form of an arrow. As the user navigates 3D space, the arrow shows them which direction the nearest Pokemon is in. To maximize efficiency, navigation mode turns off when the user enters "throwable" distance, which is denoted by the white circle around the Pokemon. While I was working on these features, Snap added a physics engine to Lens Studio, which we also integrated to make the ball feel more grounded in the environment. These changes were informed by user feedback.
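A minimal sketch of that arrow behavior is below. The `THROWABLE_DISTANCE` threshold and the function names are illustrative assumptions, not Lens Studio APIs; the only behavior taken from the description above is "point at the nearest Pokemon, and hide the arrow once the player is close enough to throw."

```typescript
// Illustrative sketch of the navigation arrow and its "throwable" cutoff.
type Vec3 = { x: number; y: number; z: number };

const THROWABLE_DISTANCE = 1.5; // assumed radius of the white "throwable" circle (meters)

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Returns a unit vector for the arrow to point along, or null when the player
// is within throwable distance and navigation mode should switch off.
function navigationDirection(player: Vec3, spawns: Vec3[]): Vec3 | null {
  if (spawns.length === 0) return null;
  const nearest = spawns.reduce((best, s) =>
    distance(player, s) < distance(player, best) ? s : best
  );
  const d = distance(player, nearest);
  if (d <= THROWABLE_DISTANCE || d === 0) {
    return null; // hide the arrow once the Pokemon is close enough to throw at
  }
  return {
    x: (nearest.x - player.x) / d,
    y: (nearest.y - player.y) / d,
    z: (nearest.z - player.z) / d,
  };
}
```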
While we are well underway on v2, this is still a work in progress, meaning our final version of the lens is not yet publicly available. In the meantime, you can try the initial lens on Snapchat here!
Here's a list of what we have planned for the remainder of the project:
Add scores based on Pokemon caught. This will help gamify the experience.
Use the framework for educational purposes. We are in talks with a professor at the University of Oregon to use our framework to create an educational lens about exploration.
Randomize spawn points and add catching logic. Currently, not much
happens when a user
catches a Pokemon. We want to reward their efforts in a way that's reflected in the UI.
So far, this has been a super fun project to work on, and I'm happy
to be working on a milestone in AR research. Here are some things I've learned so far:
Understand the problem deeply / Even when prototyping, it's important to establish exactly what problem you want to solve early on. If you can't convince someone else of your solution, you probably need to explore it some more before diving into implementation.
Gather constant feedback from different sources / The beauty of collaboration is that everyone
has a unique perspective. I've gathered feedback from experts in computer science, design,
perceptual psychology, and gaming (including some prototypers over at Snap); every person has offered something new for me to consider.
Set realistic goals / It can be hard to navigate ambiguity, especially in academia.
Communicating with mentors regularly about goals and adjusting them as needed is imperative to
creating a final product you're proud of. Having stretch goals is great, but they shouldn't distract
you from the main objective of the project.