Hey world, Jenna here.

I’m an XR developer and my goal is to use XR to make us safer, more efficient, and empowered to shape the future.

 


A Mixed Reality Story for Your Space

 

MoonBloom is a Mixed Reality adventure developed for Magic Leap and iOS ARKit.  Players solve puzzles in their space with natural gestures to help Luna the fox reassemble the shattered moon.

 
 

Key Features

Gestural Interactions

MoonBloom is a digital adventure in your space, and the app pushes what that means from an interaction standpoint.  Fostering a sense of presence in the user is important for XR, so interactions in MoonBloom happen through natural gestures.  On the Magic Leap, MoonBloom tracks the user’s hands and key feature points, letting users directly grab and move digital objects. In the iOS ARKit version, the phone acts as a digital hand, and players use its position to interact with the environment.  In both versions users move logs and mushrooms out of Luna’s way, stick their hands into digital lakes to find orbs, and trace the stars.  Getting users physically invested in their actions strengthens the story’s presence in our reality and empowers the users’ efficacy in this digital land.
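As an illustrative sketch only (the actual Magic Leap build uses the platform’s native hand tracking, not Python), a pinch-style grab like the ones described above can be reduced to a fingertip distance check between two tracked key points:

```python
def is_pinching(thumb_tip, index_tip, threshold=0.025):
    """Treat two fingertips closer than ~2.5 cm as a pinch/grab.

    thumb_tip and index_tip are (x, y, z) positions in meters.
    The key-point names and threshold are illustrative assumptions;
    hand-tracking SDKs expose similar per-finger key points.
    """
    distance = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
    return distance < threshold
```

Once a pinch is detected, the nearest grabbable object within reach of the pinch point can be attached to the hand until the fingers separate again.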

 

Mixed Reality Narrative

MoonBloom is designed to explore how to tell a digital narrative in a physical space.  On a base level, MoonBloom understands the user’s room, allowing environmental elements like waterfalls and boulders to interact with the physical world. Luna similarly understands the player’s location and looks at them during key moments of the app.  On the Magic Leap, however, we’ve taken this further and integrated physical props into the experience.  These props create custom occlusion moments and directly drive the player’s goals in the digital world.  Users see Luna emerge from a handcrafted den, trace constellations on walls, and place a digital piece of the moon into a physical outline secured to a window.

 

Responsibilities

Immersive Designer

Experience Design: Developed gameplay mechanics, puzzle design, and gestural interactions to align with project goals and UX considerations

Rapid Prototyping/Iterative Testing: Expressed design ideas through rapid prototypes for play testing and iterated based on user feedback.

Design Documentation: Created gameplay loops, flow charts, and feature overviews to foster a shared team vision and maintain an optimal level of user flow.

Conceptualization and Pitch: Ideated the initial concept and goals, expressed through both a playable prototype and a pitch deck presented to stakeholders for project green light.

 

Lead Engineer

Mixed Reality Guru: Utilized Lumin OS and ARKit SDKs to achieve world anchoring, user tracking, environmental understanding, and light estimation.

Gameplay implementation: Implemented user gestures, AI pathfinding, narrative cutscenes, animation systems, and general gameplay utilities.

Optimization: Optimized art assets, draw calls, physics, and script execution for 60fps mobile platform performance.  

Shader Writing: Created and edited shaders to facilitate lighting conditions, blend masks, and world texture coordinates.

Developed at a.newreality

Location-Based Team Building

 

The Hunt is an AR geolocation scavenger hunt developed using Mapbox and ARKit.  We divided 70 users into six teams and let them loose in Old Town Pasadena with an AR map of their surroundings.  AR clues drove them to different locations where they searched for image tags and participated in team challenges, culminating in a celebration at a local bar.

 
 

Key Features

AR Map

Each team on The Hunt had a team flag that doubled as their map for the experience.  We took location data and generated 3D extrusions of the buildings of Old Town Pasadena.  Clues were delivered by pinning 3D objects to different locations on the map; once users deciphered a clue, they would travel to that location and find the relevant statue or signage to scan.
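The math behind pinning a clue to a tabletop map can be sketched roughly as follows. This is a hedged approximation, not the production Mapbox code: an equirectangular projection converts GPS deltas into meters from a map origin, which are then scaled down to map units. The function names and scale factor are illustrative assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def gps_to_map_offset(origin_lat, origin_lon, lat, lon):
    """Approximate meters east/north of a map origin (equirectangular).

    Accurate enough over the few blocks of a scavenger-hunt map; a real
    Mapbox integration would use its own tile/projection utilities.
    """
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = d_lon * math.cos(math.radians(origin_lat)) * EARTH_RADIUS_M
    north = d_lat * EARTH_RADIUS_M
    return east, north

def pin_to_tabletop(east, north, map_scale=0.0005):
    """Scale real-world meters down to tabletop map units."""
    return east * map_scale, north * map_scale
```

With this, a clue’s GPS coordinates resolve to a position on the miniature map, where a 3D object can be anchored relative to the team flag.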

 

Team Collaboration

The Hunt’s goal was team building, so we placed user engagement at the forefront of the experience.  We developed a strong experience loop: users completed a couple of location-based clues, then we broke the flow by putting them into team challenges.  The most successful challenge had our teams play giant Jenga with the goal of using the pieces to assemble an AR image tag.

 

Responsibilities

Immersive Designer

Experiential Design: Designed clues, core gameplay loops, and team challenges.

Optimized Routes: Developed an optimized loop per team to create the most user friendly route and minimize overlap between teams.

UX: Developed and iterated on an AR map UX as the primary vehicle for the experience.

 

Lead Engineer

Geolocation/AR Integration: Combined Mapbox and ARKit SDKs to create AR objects geo-positioned to users’ digital maps.

App Development: Developed app UI functionality and interaction systems.

Data Management: Integrated Gamesparks cloud servers to manage clues and GPS locations.

Developed at a.newreality

XR Efficiency for Building Inspectors

 

Jurisdiction Unlimited is a Hololens proof of concept developed for a leading insurance technology company, Praeses. JU is a holographic tool that empowers building inspectors to use voice, gaze, and gesture to log multimedia data during an onsite inspection.  The primary goal of this tool is to optimize the entire process from inspection to review, ensuring inspectors are heads up, hands free, and safe.

 
 

Key Features

Spatial History

Inspectors can use spatial scanning to capture a holographic representation of their space.  This creates a sort of minimap that enables users to navigate an unfamiliar location with confidence.  Data captured during inspections is tied to points both in the real space and on the minimap, creating a living spatial inspection history for a location and giving captured data an extra layer of spatial context to help inform smarter decisions.

 

Gestural UI

We developed a non-intrusive, quick-access radial menu that can be opened through simple pinch and drag gestures.  From this menu system users can complete state-required inspection forms, log code violations, or simply capture comments.  Each form of data capture can include text, video, or photo attachments, and this data is stored in a JSON structure that directly mirrors Praeses's existing library.
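One way to sketch the selection logic of a radial menu like this: map the pinch-drag vector’s angle to a menu sector, with a dead zone so a stray pinch doesn’t trigger a selection. The option names echo the data-capture types above; everything else here is an assumption, not the shipped Hololens code:

```python
import math

def radial_selection(options, dx, dy, dead_zone=0.05):
    """Map a pinch-drag vector (dx, dy) to a radial menu option.

    Returns None inside the dead zone. Sector 0 is centered straight
    up (+y) and sectors proceed clockwise around the circle.
    """
    if math.hypot(dx, dy) < dead_zone:
        return None
    # angle from the +y (up) axis, measured clockwise, in [0, 2*pi)
    angle = math.atan2(dx, dy) % (2 * math.pi)
    sector = 2 * math.pi / len(options)
    # offset by half a sector so each option is centered on its direction
    index = int(((angle + sector / 2) % (2 * math.pi)) / sector)
    return options[index]
```

For a four-item menu, dragging up selects the first option, right the second, and so on; the dead zone keeps accidental micro-drags from committing anything.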

 

Responsibilities

Immersive Designer

Interaction Design: Developed and iterated upon radial menu, scroll, and reposition gestures.  Created user feedback patterns for these gestures. 

Paper Prototyping and Envisioning: Led an envisioning workshop with the team and client to evangelize the Hololens's capabilities.  Took initial design decisions and blocked them out with physical primitives to create a rough physical outline of the project.

Product Documentation: Worked with client and Project Manager to draft project goals and identify specific problems our app would target.  

 

Lead Engineer

Hololens Development: Developed spatial minimap and avatar tracking tool. Utilized native Hololens input systems to allow unique gestures and voice to text dictation.  Integrated with Spectator View technology to capture stable and high resolution 3rd person video of the application.

App functionality: Created user interactions and systems for location setup, content pinning, data entry, and violation logging.

Multimedia Capture: Implemented photo/video capture, storage, and playback from Hololens and third party plugins.

Developed at a.newreality

Persistent ARt

 

CollabARt is a geolocation-based collaborative AR stacking application.  Developed as a proof of concept, CollabARt lets users stack totems and leave them at real world locations using a combination of Mapbox and iOS ARKit.

 
 

Key Features

Persistence and Collaboration

The primary goal of CollabARt was to empower users to create digital art that lasts.  When a user leaves a stack, the app saves its GPS coordinates and stores them in the cloud using a combination of ARKit, Mapbox, and Gamesparks. Users can travel to a location to discover stacks that other users have left, or go back to augment stacks they’ve previously left. CollabARt lets users add to or change stacks they’ve discovered asynchronously, or join a shared session and stack in real time with friends.
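A minimal sketch of what a persisted stack record might look like before it goes to the cloud. The field names and structure are illustrative assumptions, not the actual Gamesparks schema CollabARt used:

```python
import json
import time
import uuid

def make_stack_record(lat, lon, totems, author):
    """Build a cloud payload for a saved stack.

    totems is a list of dicts, each holding a shape id plus its
    local transform within the stack. All field names here are
    hypothetical, chosen only to illustrate the idea.
    """
    return {
        "stackId": str(uuid.uuid4()),
        "author": author,
        "gps": {"lat": lat, "lon": lon},
        "totems": totems,
        "updatedAt": int(time.time()),
    }

record = make_stack_record(
    34.1458, -118.1391,
    [{"shape": "cube", "pos": [0, 0, 0]},
     {"shape": "cone", "pos": [0, 0.2, 0]}],
    author="jenna",
)
payload = json.dumps(record)  # what would be uploaded to the cloud
```

On discovery, the app would query the cloud for records near the user’s GPS position and rebuild each stack from its totem list.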

 

Physical Stacking

We wanted to create a tactile experience in CollabARt, so we required players to use physical movements with their phones to stack objects. As opposed to taps and drags, users actually move their phone to lift objects. We transformed the mobile device into a digital hand with 6DOF to create a mixed reality interaction system that closely mirrors how users stack objects in our reality.
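The phone-as-hand idea can be sketched as a simple grab/move pose mapping: on grab, cache the object’s offset in the phone’s local frame; every frame after, re-derive its world position from the phone pose. For brevity this toy version handles only yaw rotation and bare lists as vectors; the real 6DOF system would use ARKit’s full camera pose. All names here are illustrative:

```python
import math

def yaw_matrix(yaw):
    """Rotation about the vertical axis; a full 6DOF version would
    use the phone's complete orientation from ARKit."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def on_grab(phone_pos, phone_yaw, object_pos):
    """Cache the grabbed object's offset in the phone's local frame."""
    delta = [object_pos[i] - phone_pos[i] for i in range(3)]
    return apply(yaw_matrix(-phone_yaw), delta)

def on_move(phone_pos, phone_yaw, local_offset):
    """Re-derive the object's world position from the phone pose,
    so physically moving the phone carries the object with it."""
    rotated = apply(yaw_matrix(phone_yaw), local_offset)
    return [phone_pos[i] + rotated[i] for i in range(3)]
```

Because the offset lives in the phone’s frame, both translating and turning the device move the held totem the way a real hand would.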

 

Responsibilities

Lead Engineer

Location Persistence: Integrated Mapbox and Gamesparks to log the GPS coordinates of stacks and store them in the cloud.

Networking: Implemented Photon networking to sync transforms, physics, and interactions with ARKit tag alignment to establish a shared world coordinate system.

App Development: Developed app UI functionality and interaction systems. 

Immersive Designer

Interaction Design: Designed physical interaction system where users stack objects by moving their phone through space.

Developed at a.newreality

Holo Sanic

Sanic is a short Hololens experience developed for the Sonic the Hedgehog social channels.  We took a fan favorite crudely drawn meme named Sanic and created a lo-fi story where he comes to life to wreak havoc on social media.  This video generated over a million views across all Sonic the Hedgehog social channels.

 

Microsoft Build 2017

In 2017 we were invited to give a brief tech talk at Microsoft Build about our Jurisdiction Unlimited Hololens application.  The talk focused on how to introduce new XR technology to an industry where clipboards are still standard.  We covered how to represent data with spatial context, gestural UX, and user onboarding.

 

Microsoft Agency Readiness Program 

When the Hololens first released, our agency was invited to participate in a hands-on training program with the creators of the device across a six-month period.  During this time we learned best practices for Mixed Reality, from the envision phase all the way to gold master. The program culminated in us developing our Jurisdiction Unlimited application.
