Headspace Uses Dscout to Safely Build Their First GenAI Product
Design Lead Priyanka Marawar discusses how Headspace built a GenAI experience that centers on ethics, member safety, and well-being.
Headspace used Dscout for diary studies and usability studies.
Headspace started as a mindfulness and meditation app but has grown to become much more. The company offers coaching, therapy, psychiatry services, and digital content to support people's mental health journeys.
Through generative research with Dscout, including diary studies and usability and naming tests, see how the team brought Headspace to the forefront of innovation and well-being.
Priyanka Marawar, Design Lead at Headspace, shares a first-person account below on how her team applied GenAI to product building with a safe and thoughtful approach.
How Headspace approached their first member-facing GenAI experience
For the last four years, I've been building health interventions—helping people stress less, sleep better, and much more. Recently, I have been working on a tool that uses generative AI.
This project is our first member-facing AI experience. As a mission-driven healthcare company, we knew that using AI within the sensitive context of mental health could make many people uncomfortable.
✔ Address skepticism
As we were getting started, there were a lot of strong feelings about using GenAI. On one hand, stakeholders were very excited about GenAI and how it could scale access to mental health tools globally.
On the other hand, there was a lot of skepticism and nervousness about how AI could be detrimental to someone's mental health if not designed right, because mental health is such an inherently vulnerable space.
We learned that this nervous sentiment was also present in our members. We conducted a large-scale study to understand the desirability and attitudes towards AI, and while there was a high degree of trust in Headspace, there was also a lot of skepticism around using AI in mental health.
One interesting insight was that with many AI features, people needed to interact with the technology to better understand it.
After using it, people trusted it more. We saw this specifically with ChatGPT: people who had used it held more positive attitudes toward it. So we knew we had to encourage people to try the experience in order to trust it and reap its health benefits.
It all came down to what we as designers could do to help our members feel safe interacting with AI. And, as a company passionate about and invested in improving people's mental health, how do we keep our members safe from any unintended consequences of AI?
The first step was to find a focused and valuable use case that balanced all of these factors. This might look similar to the desirability, viability, and feasibility matrix Nicole shared. Ours has a few additions.
✔ Find the right use case
This had to be something that people wanted and could help them in their lives. Through a lot of generative research that we've done with Dscout over the years, one of the things that kept coming up is how people have a reflection practice outside of Headspace. That's a capability that our app did not support.
Through clinical best practices, we know that reflection can be beneficial and supportive of one's mental health. It gives people clarity, increases their sense of self-efficacy, and empowers them to take more agency in their own journey. Overall, it benefits one's well-being.
We had to also consider the needs of our ecosystem of services as a whole. In our ecosystem, as I mentioned, we also have human care providers. So how does reflection support a member's experience with their care provider? It turns out reflection is very supportive and complementary to seeing a human provider and is often recommended in the form of activities within a member care plan.
This use case around reflection became a safe container in the larger context of mental health for us to leverage the power of generative AI and specifically conversational AI to understand someone's unique context and encourage deep reflection.
This led us to build Ebb, a guided reflection experience that uses conversational AI. Ebb is your compassionate companion. It creates a safe space for you to express yourself. It asks you open-ended, deep, reflective questions with the sole purpose of helping you be more self-aware and offers non-judgmental support along the way.
This [GenAI tool] had to be something that people wanted and could help them in their lives. Through a lot of generative research that we've done with Dscout over the years, one of the things that kept coming up is how people have a reflection practice outside of Headspace. That's a capability that our app did not support.
Design Lead at Headspace
✔ Encourage adoption
We had a solid use case people would likely benefit from, but we had to address people's barriers around not feeling safe engaging with AI.
This is where (through the upfront experience and research that we had done) we realized that we needed to emphasize that Ebb is built by clinical experts. The more we elevated that early in the experience, the more trust people felt.
People were concerned about their privacy—so this one was delicate. How much do you talk about data and privacy up front? You don't want to overwhelm people. We had to strike the right balance.
This use case around reflection became a safe container in the larger context of mental health for us to leverage the power of generative AI and specifically conversational AI to understand someone's unique context and encourage deep reflection.
Design Lead at Headspace
✔ Nail the experience with the right methodology
The next step was to really nail that first reflective conversation and interaction with Ebb, so people keep coming back. We adopted an iterative approach to refining the conversational experience, and research had a big role to play. A huge shout-out to Co-Lab last year and the folks from Duolingo who helped influence our approach.
We conducted…
An upfront desirability and attitudinal study on AI and reflection
Diary studies combined with voice flow prototyping and then extensive in-house testing
Usability + naming tests
The feedback loops between conversation design and these methods (the desirability and attitudinal study on AI, the diary studies with voice flow prototyping, and several rounds of internal red-teaming sessions) helped our conversation designers hone the structure and cadence of the conversation and fine-tune Ebb's voice and tone.
We also used two rounds of concept testing and usability tests to inform the design of the conversational interface and Ebb’s identity, all towards supporting safe and deep reflection for our members.
How did this all come together in the experience? Let’s come back to the notion of safety.
✔ Cultivate a sense of safety to maximize health benefits
One of the ways we helped members feel safe within the experience was to elevate their choice, control, and autonomy.
In order to encourage quality reflection and free expression, we designed the conversational experience to feel open and calm, while also giving people clear boundaries to keep them safe. For example, we give people the chance to end or delete a conversation at any time if things get sensitive.
It’s also important to note that Ebb never offers you advice as a coach or a therapist would, and it waits for your signal before suggesting activities from our content library. Ebb is powered by safety detection mechanisms that constantly sense whether your needs have escalated in any way; if so, it gently guides you to crisis resources or a provider.
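To make the safety-routing idea above concrete, here is a purely illustrative sketch of how an escalation check might gate a conversational turn. Everything here is hypothetical (the function names, the keyword list, and the resource message are not Headspace's implementation); a production system would rely on clinically validated detection models rather than a simple keyword match.

```python
# Hypothetical sketch: gate each conversational turn on an escalation
# check, and surface crisis resources instead of the normal reply when
# the message suggests escalated needs. Illustrative only.
CRISIS_KEYWORDS = {"hurt myself", "can't go on", "no way out"}

CRISIS_RESOURCE_MESSAGE = (
    "It sounds like you're going through a lot right now. "
    "Would you like me to share crisis resources or connect "
    "you with a care provider?"
)

def route_turn(user_message: str, reflective_reply: str) -> str:
    """Return the normal reflective reply unless the user's message
    suggests escalation, in which case offer crisis resources."""
    lowered = user_message.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return CRISIS_RESOURCE_MESSAGE
    return reflective_reply
```

The design point this sketch captures is that the safety check runs on every turn and takes precedence over the reflective conversation, so escalated needs are never answered with a generic prompt.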
Last but not least, shaping the persona of Ebb was important to create that sense of safety. We married psychology best practices and brand guidelines to shape the persona, voice, and approach of Ebb, and how it communicates with you.
We also shaped its visual presence to be gentle, hopeful, and compassionate. There’s lots of power here for design to evoke a certain feeling with AI.
Wrapping it up
The team at Headspace knew they needed to look at all angles when merging wellness and GenAI. This project required a conscientious approach.
By finding the right use case and blending a variety of methodologies on Dscout—from attitudinal studies to diary studies and usability and naming tests—the team was able to share their new offering with both pride and peace of mind.
Ensure that your GenAI products not only meet user needs but are safe for them to use
Use Dscout to recruit the right participants, concept or prototype test, analyze with ease, and set your products up for success.
Schedule a demo to see how our platform can easily integrate with your product roadmap.