Beyond Emotional Junk Food: Building Mental Health Tools That Actually Help

How a concerning meme about AI relationships sparked our mission to build evidence-based emotional tools for people who want real help during moments of overwhelm, not just temporary relief.

Role

UX/UI Designer

Industry

Health & Fitness

Duration

Jan 2025 - Now

It all started with a meme. At first glance it's funny, but think about it for a moment and it's quite sad: we're out here comparing human relationships to relationships with AI, and in many cases AI is winning.

Whenever people feel emotional discomfort or confusion, they turn to these tools for support almost by reflex. You ask for something and it delivers the perfect response instantly. It feels like genuine help in the moment, but it's really just temporary relief that never builds lasting resilience. You feel satisfied for a while, then find yourself automatically returning the next time things get tough. The problem is that these tools feed you emotional junk food.

What really concerned me was seeing people use AI to validate their thoughts and feelings. This becomes particularly dangerous for anyone navigating mental health challenges. AI doesn't understand context or nuance. Instead of challenging unhelpful patterns, it often enables them and reinforces the exact behaviors you're trying to break free from.

Therapy works so differently. There are no instant answers or quick fixes. You're not told what's right or wrong. Instead, you're given space to discover your own solutions and build real tools for independence, not dependence.


Going Down the Rabbit Hole


Getting people to talk about their emotional struggles isn't easy. Even when they're willing to open up, you have to be incredibly careful not to say the wrong thing or come across as insensitive. When we tried talking to people directly about how they handle their emotions, we didn't get many reliable answers. So we turned to existing research and online forums where people were already discussing their mental health struggles and their experiences with AI tools anonymously.


We also looked into existing mental health check-in apps, but most weren't very effective. The apps built specifically for panic attacks were poorly designed, and many would build trust with users only to push them into expensive subscription plans.

The pattern was clear: most people were using these tools to distract themselves or combat loneliness. They weren't looking for therapy replacements initially, but the convenience and availability were turning casual use into something much deeper. The fact that 31% specifically turned to AI during acute stress and 35% used it to combat loneliness showed us there were real emotional needs being met, even if temporarily.




Who could we help?


We knew from the start it wouldn’t be possible to help everyone. Some users are too dependent on tools like GPT, making effective interventions difficult, and those who don’t recognize the problem are unlikely to change.


We wanted to focus on people who already know they need support and are actively seeking solutions. We planned to build for users who want to vent safely, find practical tools for stability, practice therapy skills, or explore getting professional help.


These would be solvable problems with clear parameters: users with intent, self-awareness, and readiness for change. By focusing on this specific group, we could provide meaningful help without risking harm to unprepared users.



Our Design Principles


Before building anything, we established three non-negotiable rules that would guide every decision:

No Playing Therapist - We wouldn't attempt diagnosis or pretend to replace professional mental health care. Our role was to provide support tools, not clinical treatment.

Privacy and Data Safety - Protecting user data and privacy would be paramount. People sharing emotional struggles deserve complete security and control over their personal information.

Empower, Don't Entrap - We wanted to help users build their own coping skills and independence, not create another tool they'd become dependent on. The goal was to make ourselves less necessary over time, not more.



The Path Forward


We identified the most common emotional struggles from our research and mapped them to evidence-based therapeutic approaches. Instead of trying to solve everything, we focused on specific emotion clusters and grounded each solution in clinically approved methodologies like cognitive behavioral therapy (CBT), problem-solving therapy (PST), solution-focused brief therapy (SFBT), and several other established therapeutic frameworks.

Anxiety and Overwhelm - CBT techniques like deep breathing, progressive muscle relaxation, and mindful journaling. Shame-resilience strategies addressed a major component we found in user struggles.

Disconnection and Numbness - Activation strategies and grounding techniques. Small accomplishment tasks and sensory exercises help users reconnect with their present experience.

Emotional Intensity - Immediate regulation tools: safe physical releases, pause-and-probe reflection, and assertiveness training for managing intense emotions in real-time.

Crisis Support - DBT crisis skills and safety planning techniques, including immediate outreach resources and structured distraction toolkits to bridge critical moments until professional help is accessed.

Each solution empowered users with proven coping strategies while maintaining clear boundaries around our role as support, not therapeutic replacement.
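The cluster-to-technique mapping above can be sketched as a simple lookup. This is a minimal illustration, assuming hypothetical cluster and technique identifiers, not the app's actual data model:

```python
# Hypothetical mapping of emotion clusters to evidence-based techniques.
# All identifiers here are illustrative placeholders.
TOOLKITS = {
    "anxiety_overwhelm": ["deep_breathing", "progressive_muscle_relaxation", "mindful_journaling"],
    "disconnection_numbness": ["behavioral_activation", "grounding_54321", "small_accomplishments"],
    "emotional_intensity": ["safe_physical_release", "pause_and_probe", "assertiveness_practice"],
    "crisis": ["dbt_crisis_skills", "safety_plan", "distraction_toolkit"],
}

def suggest_tools(cluster: str) -> list[str]:
    """Return the techniques mapped to an emotion cluster, or an empty list."""
    return TOOLKITS.get(cluster, [])
```

Keeping the mapping declarative like this makes it easy for clinicians to review and amend without touching application logic.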






Our Core Features


We built four evidence-based tools, each targeting specific emotional struggles our users face. Every feature is grounded in proven therapeutic approaches and designed for quick, self-guided relief.


Memory Box & Mindfulness Tools


Memory Box - A personal collection of photos and cherished memories that users can save and revisit during difficult times. Veterans and others use this feature to recall fond memories during dire moments, helping them stay functioning and connected to positive experiences.


Mindfulness & Grounding Tools - A collection of techniques including breathwork and sensory exercises like the 5-4-3-2-1 method. These practices help calm overwhelming emotions, reduce panic, and anchor users to the present moment when they feel anxious or dissociated.
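The 5-4-3-2-1 exercise walks the user through their five senses in descending counts: five things they can see, four they can touch, three they can hear, two they can smell, and one they can taste. A minimal sketch of the prompt sequence:

```python
# The 5-4-3-2-1 sensory grounding exercise: each step prompts the user
# to name a decreasing number of things perceived through one sense.
STEPS = [
    (5, "see"),
    (4, "touch"),
    (3, "hear"),
    (2, "smell"),
    (1, "taste"),
]

def grounding_prompts() -> list[str]:
    """Generate the user-facing prompt for each step of the exercise, in order."""
    return [
        f"Name {count} thing{'s' if count != 1 else ''} you can {sense}."
        for count, sense in STEPS
    ]
```

In the app, each prompt would be shown one at a time, pacing the user through the exercise rather than presenting it as a checklist.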



Challenge Beliefs


Uses cognitive restructuring and Socratic questioning to test anxious or shame-based thoughts. CBT techniques are proven effective for anxiety and depression by helping users examine evidence and reframe unhelpful thinking patterns.



Understand Your Thoughts


Teaches users to notice thoughts as mental events, not facts. This metacognitive approach reduces emotional reactivity and helps with racing thoughts, worry loops, and post-argument replaying through self-distancing techniques.




Book a Therapist


Connects users with qualified therapists in their area through our curated network. We're planning to build partnerships with mental health professionals to serve as a bridge between our app users and professional care, making the transition to therapy smoother and more accessible.



Each tool works as a standalone solution but becomes more powerful when combined. At the heart of our app is Whelma, our AI guide and the most integral part of our MVP. She's trained on extensive therapy transcript data to provide authentic, ethical support while maintaining clear boundaries.

Whelma actively listens to understand your emotional state, gently challenges unhelpful patterns, and guides you to the right tools without making decisions for you. She doesn't give advice but helps you discover your own insights. As our core feature, Whelma creates a supportive space where users feel heard and understood before being empowered to use our evidence-based tools. She continuously learns and updates to provide better support over time.




Smart Onboarding


Our onboarding process screens users to understand their therapy experience, current emotional state, and specific needs. This helps us personalize their experience from day one and ensures Whelma can provide appropriate support that fits where they are in their mental health journey.
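The screening logic can be sketched as a small routing function. Everything here is illustrative, assuming a hypothetical profile shape and mode names, not the real onboarding implementation:

```python
# Hypothetical onboarding screener: fields, thresholds, and mode names
# are illustrative assumptions, not the app's actual API.
from dataclasses import dataclass, field

@dataclass
class OnboardingProfile:
    has_therapy_experience: bool
    current_distress: int            # self-reported, 0 (calm) to 10 (crisis)
    goals: list = field(default_factory=list)  # e.g. ["vent_safely", "find_therapist"]

def starting_mode(profile: OnboardingProfile) -> str:
    """Pick an initial experience based on the screening answers."""
    if profile.current_distress >= 8:
        return "crisis_support"      # surface crisis resources before anything else
    if "find_therapist" in profile.goals:
        return "book_therapist"
    return "guided_check_in"         # default: a gentle conversation with Whelma
```

Routing acute distress ahead of every other goal reflects the design principle of not playing therapist: the app's first job is to get struggling users to appropriate support.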






Crisis Support


We introduced a crisis support feature that allows users to quickly connect with their designated emergency contact in urgent situations.
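One way to implement the quick connect is a pre-written alert sent to the designated contact. This sketch is purely illustrative; the wording and function name are assumptions:

```python
# Hypothetical crisis-contact alert composer; names and copy are illustrative.
def crisis_message(contact_name: str, user_name: str) -> str:
    """Compose the pre-written text sent to the user's designated emergency contact."""
    return (
        f"Hi {contact_name}, this is an automated alert from {user_name}'s "
        "support app. They asked to reach you because they're having a hard "
        "moment right now. Please check in with them when you can."
    )
```

A pre-composed message matters here: in an acute moment, the user should only have to tap once, not find words.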







Looking Forward

Building a mental health app that truly helps people is technically complicated, but what's wrong with trying? Writing conversational scripts that feel authentic and therapeutic is an absolute nightmare, and we're still figuring out how to navigate that challenge. But it's worth it, and it has been a rewarding ongoing process.

We're not measuring success by engagement metrics or session length. We're tracking how many people build genuine emotional resilience, whether that's transitioning to professional therapy or simply having a reliable toolkit when they need it most.


We know we're being idealistic, but that's exactly what's needed for something this important. Real healing happens in the real world with real people and real growth. We're just here to help bridge that gap and provide the tools people need along the way.











Other projects


Designing a Scenario Simulator for Emission Reduction

Simplifying complex environmental data into an intuitive tool that helps organizations model and compare emission reduction strategies through scenario-based simulations.

Designing a Better Way to Manage Auctions in Self-Storage

Building an integrated auction management system for storage facilities to handle the legal compliance, financial tracking, and operational workflows that were causing financial issues and data inconsistencies when done manually.


Copyright 2025 by Anusha Shetty
