Three months after we shipped Meta accounts, and shortly after Connect, we saw a sharp increase in new users trying Workrooms. With all these new users, however, came a new problem. Our existing user base was predominantly VR enthusiasts (think self-taught coders and curious nerds with a high tolerance for novelty), but we were now attracting young families, mums and slightly older CEOs curious about trying the latest tech on the market. These people were much less confident with new tech and, whilst we had diversified our user base, our retention rates were incredibly low. New users would land in the virtual space, bounce and never come back.
As a team, we asked: how could we help new users feel comfortable in a strange virtual space? The solution was a new user experience (NUX)… but not just any NUX. This was virtual reality, after all.
1 Designer
1 Prototyper
1 Researcher
1 Product Manager
3 Engineers
3 Technical Artists
1 Localisation Manager
3 Video Artists
Understanding our users better
In H1, I had worked with UXR to create our team’s first VR user personas. Prior to Connect, our user base was predominantly The Tech Dreamer and The Quality Optimiser. These users had a high tolerance for things going wrong, were highly motivated to teach themselves, and were energised by being on the frontier of nascent technology. Our more recent users, however, fell into The Practical Realist and The Efficiency Fanatic brackets: they had much less tolerance for thrash and needed hand-holding. How could we go about appealing to them?
Mapping existing onboarding flows
We started by mapping the E2E user onboarding experience with UXR, PM and PMM, tagging pain points and moments of delight along the way.
Key pain points:
Key moments of delight:
These would be incorporated into the learning materials for the NUX.
Creating learning principles
I worked with user research to understand how people learn best, then translated those findings into product principles. I also interviewed friends who were teachers and asked how their students pick up new ideas. It turned out that people learn best when they experience something in real time and then get to play about with that thing on their own terms.
Slide from my XR Content Strategy, referencing Kolb’s seminal paper on best practices for teaching and learning. Here he lays out the 4 phases: “Experience”, “Reflect”, “Conceptualise” and “Experiment”.
We’d try to apply these to the NUX.
My notes as I was establishing storytelling principles. “Things they need to know”; “Things they care about” and “things they need to find out” became guides I used to define key moments in the story.
Principles: Interactive, Social, Educational, Show don’t tell, Human
At the time, our VR NUX relied on a bog-standard feature carousel and a 2D video of the virtual workroom layered over your virtual computer. Neither was contextual, and the video had a 1% CTR, with only 10% of viewers watching to the end.
At this point I was really getting into VR films like Wolves in the Walls, which used a branched narrative that gave the viewer limited degrees of interaction with the space around them whilst retaining a fixed story. This has the benefit of adding constraints that limit build complexity, but it also allows for a more natural, human experience.
I suggested that rather than reuse the same 2D NUX and video, we should consider an embodied VR guide. Virtual reality, for most people, felt cold and removed from their everyday lives. A virtual guide would complement the space and make VR feel more tangible and lifelike.
If our job as content designers is to communicate, that might also mean rethinking the tools and means we use to communicate. After all, there are no tooltips in the real world; so when designing for immersive spaces, what tools can we use to communicate beyond just the written word?
To make VR more compelling, we generally followed a principle that there should be some parallels with the real world. I wanted our NUX to feel like you were coming into the office for your first day induction and this person was here to show you around the office.
At a kick-off with our Art team and a few engineers with video-game backgrounds, we came up with a proposal for a character who would act as your virtual guide. Now, rather than you needing to orient yourself, “Sam” (working title) would show you around the space and teach you how to use key tooling.
Based on my research, I derived some values we could keep coming back to as we built the NUX. These were:
The user should be able to play and explore with the space during the NUX.
The user should be able to experience the space with others during the NUX.
The user should have learnt how to use key features after the NUX.
The user should be shown key features, not told about them.
The NUX should feel as relatable as possible.
Working with user research, we started breaking down what we wanted the NUX to look like, how long it should be, and which features we wanted to highlight. These would become our hero moments.
We also discussed what tools users found confusing and how might we unblock them. These would become practical tutorials.
The storyboard
Storyboarding
I started by creating a narrative board broken down into three key moments in the user journey: Before, during and after joining the virtual room for the first time. These were then broken down into actions the user would be taking at each moment: Enter, Transition, Arrival, Explore.
🚪 Enter was critical. It governed ease of setup and entering VR, and actually started on the web and in real life. We famously saw high drop-off here, and our web team had worked hard on automated headset syncing: the headset would wake up before your scheduled meeting started and jump you into the correct workroom.
➡️ Transition governed the latency and speed of putting on the headset and joining the virtual space. In theory, this was also a critical moment. How might we add elements of delight here to get people excited about, and prepared for, joining virtual reality?
👋 Arrival was about finding a balance between allowing people to acclimatise to the new world at their own pace, whilst also trying to ground them with a helping hand. This was a critical part of teaching and learning. Some people learn best on their own, while others need assistance, or learn better when being guided initially.
🛝 Explore was the fun part. This was when the NUX would really kick in and we empowered users to learn new and existing features, but also try things out for themselves.
The Story
I then mapped The Story to each of these moments; it should have equal moments of delight and address the genuine educational and activation frustrations users had.
The Story was simple. The user enters the workroom and triggers the NUX manually. At this point “Sam”, the guide, joins and walks you through a series of tutorials derived from the pain points and moments of delight mapped in our earlier brainstorm. These included setting up your virtual screen, using the pen tool, using the whiteboard and moving seats. We’d end with our crowd-pleaser: a virtual high five between the guide and the user.
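Under the hood, a guided tour like this amounts to a fixed, ordered list of steps, each traceable back to a mapped pain point or moment of delight. A minimal sketch of that idea (step names and structure are illustrative assumptions, not the shipped implementation):

```python
# Hypothetical sketch of the NUX tutorial sequence as ordered steps.
# Step names and flags are illustrative, not the actual Workrooms code.
from dataclasses import dataclass
from typing import Optional, Set


@dataclass
class Step:
    name: str
    pain_point: bool  # addresses a mapped pain point
    delight: bool     # delivers a mapped moment of delight


NUX_STEPS = [
    Step("setup_virtual_screen", pain_point=True, delight=False),
    Step("use_pen_tool", pain_point=True, delight=False),
    Step("use_whiteboard", pain_point=True, delight=True),
    Step("move_seats", pain_point=True, delight=False),
    Step("high_five_guide", pain_point=False, delight=True),  # crowd-pleaser finale
]


def next_step(completed: Set[str]) -> Optional[Step]:
    """Return the first step not yet completed; steps run in a fixed order."""
    for step in NUX_STEPS:
        if step.name not in completed:
            return step
    return None  # tour finished
```

Keeping the sequence as data rather than hard-coded logic also makes the 6-8 month refresh cycle cheaper: a new feature launch becomes a new entry in the list.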
Constraints
Thinking beyond frames
It goes without saying that virtual reality is immersive (duh!). In practice, this means we cannot control the user’s attention or assume we know where they’re looking. Unlike on the web, we cannot drive the user’s attention with pop-ups, modals or dialogs; we cannot control the order in which they experience something from beginning to end; and we cannot fix their gaze at a single point on a screen. In VR, users can move their heads freely, move seats, close their virtual screen, look out at the virtual beach…
This means that when we’re writing for virtual spaces, we need to think about all the other factors going on in the space around the user. In that sense, it more closely resembles writing for a movie. For this reason, we also write for World and Object Interactions and Social Interactions.
World and Object Interactions are the NPC aspects going on in the background. For example, we wanted to demonstrate to users what a video call looked like in VR (moment of delight). For this, we had a fake guest user join and wave at the user in the virtual room. This is an example of a World and Object interaction.
Social Interactions constitute “Experiments” (Kolb, 1984): the degrees of movement or amount of object interaction we afford the user in the moment. Because we had limited engineering resources, we had to limit the number of things the user could interact with at any one time. Plus, this was a NUX, after all. Abiding by Kolb’s learning principles, we wanted users to (1) Experience, (2) Reflect and finally (3) Experiment, meaning we had to guide them through a set of steps in a particular order.
This also included how we signalled to the user that this was a moment they could freely interact with the space, and how we prompted people to move on if they didn’t interact when we wanted them to.
Lastly, there was dialogue. This governed what the guide said.
Once the dialogue was written, my PD and I mocked up what each frame would look like in the workroom to pass to the Art team and engineering, ready to build.
Adding a visual layer
Once we started building, we ran into technical limitations. For instance:
Using our pen tool was a hero feature and setup was a pain point we wanted to include in the NUX. However, using the pen freely required real-time calibration, which was often janky and slow. As a solution, we limited the pen tool to a sticky note on the desk, rather than the whiteboard.
Because the NUX was spoken by a “person”, Localisation would need to “localise the mouth” of the guide to accommodate the various other spoken languages. The video team were happy to support this work; however, we had to deprioritise a spoken-word segment from a virtual guest joining on a video call. To still demonstrate the value of video calling in VR, as a workaround we made the guest wave rather than speak.
Initially we wanted the user to be able to move seats, or navigate freely to the whiteboard. This was deprioritised in favour of moving the user to a seat of our choice, i.e. the experience was ‘on rails’.
The NUX was a huge investment of time, and once we shipped it, we’d be committing to refreshing it every 6-8 months in line with new feature launches. We knew that in the future there’d come a time when we’d have to make trade-offs between more important P0s and updating the NUX for hygiene purposes. This did eventually happen: after a particularly egregious SEV, the team had to unship a number of features, including the virtual NUX. It never returned.