  • Building Psychological Safety: A Data-Driven Approach to Workplace Mental Health

    Our recent workplace survey revealed a crucial insight: the gap between providing mental health support and creating an environment where people feel safe using it. While 84% of organisations have increased mental health resources, our data shows that creating genuine psychological safety requires a more nuanced approach.

    The concept of psychological safety, first introduced by Harvard’s Amy Edmondson, describes an environment where people feel safe taking interpersonal risks. Our research suggests this is precisely what’s missing in many workplace mental health initiatives.

    When we analysed responses across different organisational levels, three key elements emerged as essential for building genuine psychological safety:

    1. Trust Architecture – Our data shows that 73% of employees prefer sharing mental health challenges within small, trusted groups rather than through traditional organisational channels. This mirrors findings from Google’s Project Aristotle, which identified psychological safety as the primary factor in high-performing teams.
    2. Privacy By Design – The most striking finding was that 92% of respondents cited privacy control as crucial for engagement. This wasn’t just about confidentiality – it was about agency. As one senior manager noted: “The ability to control who sees what is non-negotiable.”
    3. Data-Driven Support – Perhaps most intriguingly, 72% of respondents expressed interest in anonymous aggregate insights. This suggests a desire to understand personal experiences within a broader context while maintaining individual privacy.

    These findings point toward a new framework for workplace mental health support:

    Structured Trust Networks

    Rather than broad, organisation-wide programmes, our data suggests creating infrastructure for small, trusted support circles. These should be employee-driven but organisationally supported.

    Granular Privacy Controls

    The future of workplace mental health tools must offer sophisticated privacy settings that give employees complete control over their information sharing.

    Anonymous Intelligence

    Organisations should leverage aggregate data to understand trends and patterns while maintaining individual privacy. This enables proactive support without compromising personal boundaries.
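
    To make the ‘anonymous intelligence’ idea concrete, here is a rough TypeScript sketch of how aggregate insights could be surfaced without exposing individuals. It is purely illustrative – the record shape, the team grouping and the minimum group size of five are assumptions rather than a description of any particular product – but it captures the core rule: no aggregate is shown unless enough people have contributed that no single response can be identified.

    interface CheckIn {
      teamId: string; // the group the response belongs to
      score: number;  // e.g. a daily well-being score from 0 to 10
    }

    // Assumed threshold: below this, no aggregate is shown at all.
    const MIN_GROUP_SIZE = 5;

    // Average score per team, suppressing any team too small for individual
    // responses to remain unidentifiable.
    function anonymousTeamAverages(checkIns: CheckIn[]): Map<string, number> {
      const byTeam = new Map<string, number[]>();
      for (const c of checkIns) {
        const scores = byTeam.get(c.teamId) ?? [];
        scores.push(c.score);
        byTeam.set(c.teamId, scores);
      }

      const averages = new Map<string, number>();
      for (const [teamId, scores] of byTeam) {
        if (scores.length >= MIN_GROUP_SIZE) {
          averages.set(teamId, scores.reduce((a, b) => a + b, 0) / scores.length);
        }
        // Teams below the threshold are omitted entirely rather than shown as small samples.
      }
      return averages;
    }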

    The implications are significant. When McKinsey studied psychological safety in 2024, they found organisations with high psychological safety scores were 76% more likely to see increased innovation and 67% more likely to retain top talent.

    Our research suggests that building psychological safety isn’t just about having the right policies – it’s about creating the right infrastructure. This means developing frameworks that support intimate, trusted connections while maintaining professional boundaries.

    As we look to the future, it’s clear that workplace mental health support must evolve beyond traditional wellness programmes. The organisations that succeed will be those that create environments where employees feel genuinely safe discussing mental health – not because they’re told to, but because they choose to.

    What has your experience been with psychological safety at work? Have you seen examples of organisations successfully building trust through structure?

    Our full findings and detailed recommendations will be available in our upcoming whitepaper: ‘The 2025 Workplace Mental Health Trust Report’.

  • The Privacy-Support Balance: What Employees Really Want from Workplace Mental Health Support

    Our recent workplace survey revealed something fascinating about mental health support: it’s not just about having resources available – it’s about having control over how they’re used.

    When we analysed responses across different organisational levels, from individual contributors to executives, a clear pattern emerged. 92% of respondents cited ‘control over who sees the information’ as a crucial factor in their willingness to engage with mental health support. This wasn’t just a preference – it was a prerequisite.

    What’s particularly interesting is how this desire for control manifests differently across organisational hierarchies. Senior leaders, while generally more open to discussing mental health, still emphasised the need for bounded sharing. As one executive put it: “Clear privacy guarantees and control over information flow are non-negotiable.”

    The data reveals three key elements employees want from workplace mental health support:

    1. Granular Privacy Control – Our survey showed that even in organisations rated as having ‘excellent’ mental health support, employees want fine-grained control over their information. 76% expressed interest in tools that allow them to selectively share with trusted colleagues while maintaining privacy from broader management (a minimal sketch of what this could look like follows this list).
    2. Trust-Based Networks – The concept of ‘trusted circles’ emerged strongly in our data. While only 15% of respondents were comfortable discussing mental health broadly at work, 68% were open to sharing within small, trusted groups. This suggests a move away from organisation-wide programmes toward more intimate support structures.
    3. Anonymous Insights – A surprising finding was the strong interest in anonymous aggregate insights. 72% of respondents indicated they would engage more with mental health support if they could see anonymised patterns and trends across their organisation – suggesting a desire to understand their experiences in context without compromising privacy.
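
    To picture what this kind of granular control could look like in practice, here is the rough TypeScript sketch referenced in point 1 above. It is illustrative only – the field names and the idea of a per-entry list of trusted colleagues are assumptions rather than a description of any existing tool – but it shows the essential rule: an entry is visible only to the people its author has explicitly chosen.

    interface SharedCheckIn {
      authorId: string;
      note: string;
      visibleTo: Set<string>; // user IDs the author has explicitly trusted
    }

    // Returns only the entries the viewer is allowed to see: their own,
    // plus entries whose authors explicitly included them.
    function visibleCheckIns(viewerId: string, checkIns: SharedCheckIn[]): SharedCheckIn[] {
      return checkIns.filter(
        (c) => c.authorId === viewerId || c.visibleTo.has(viewerId)
      );
    }

    // Example: a manager who is not in the visibleTo set sees nothing.
    const entries: SharedCheckIn[] = [
      { authorId: 'alex', note: 'Rough week', visibleTo: new Set(['sam']) },
    ];
    console.log(visibleCheckIns('sam', entries).length);     // 1
    console.log(visibleCheckIns('manager', entries).length); // 0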

    This presents a clear challenge for organisations: how to provide comprehensive mental health support while maintaining the level of privacy and control employees demand? Traditional top-down wellness programmes may need to evolve into more nuanced frameworks that enable trusted connections while preserving professional boundaries.

    The implications are significant. When MIT Sloan Management Review studied successful mental health initiatives in 2024, they found that programmes emphasising user control and privacy saw engagement rates three times higher than traditional approaches.

    As we continue to analyse our data, one thing becomes clear: the future of workplace mental health support lies not in broader rollouts of existing programmes, but in reimagining how these programmes can work within employees’ trust and privacy requirements.

    What control features would make you more likely to engage with workplace mental health support? Have you seen examples of organisations successfully balancing support with privacy?

    Watch for the final part of this series tomorrow, where we’ll explore how organisations can build genuine psychological safety using a data-driven approach.

  • The Trust Paradox in Workplace Mental Health: New Data Reveals Disconnect

    A fascinating paradox has emerged in workplace mental health support: as organisations invest more in mental health initiatives, employees remain deeply hesitant to engage with them. Our recent survey of workplace mental health attitudes reveals that even in companies rated as having ‘good’ or ‘excellent’ mental health support, 73% of respondents would only share mental health challenges with select trusted colleagues – or not at all.

    This mirrors findings from the 2023 Mental Health and Wellbeing Survey by CIPD, which found that while 84% of UK employers have increased mental health support, only 49% of employees feel comfortable discussing mental health at work. The disconnect is stark.

    Our data suggests the core issue isn’t the availability of support, but trust and control. Every senior leader in our survey cited ‘control over who sees the information’ as a crucial factor in their willingness to engage with workplace mental health initiatives. This aligns with research published in BMC Public Health highlighting that trust and perceived effectiveness are significant factors in employee engagement with mental health programmes.

    Perhaps most telling is the consistent concern about career impact. Across all seniority levels, from individual contributors to executives, respondents emphasised the need for mental health support to be ‘separate from performance reviews.’ One senior manager noted poignantly: ‘I know for a fact that all mental health issues discussed will be in the back of their mind when deciding on promotions.’

    This creates a challenging dynamic: organisations aim to support mental health but may inadvertently foster environments where employees feel hesitant to utilise that support due to concerns about their professional standing.

    The implications are significant. According to the 2024 NAMI Workplace Mental Health Poll, while 74% of full-time employees in the U.S. believe it’s appropriate to discuss mental health concerns at work, only 58% feel comfortable doing so.

    This disconnect underscores the importance of not only providing mental health resources but also cultivating a workplace culture that encourages open dialogue and psychological safety.

    Our data suggests a potential way forward. While respondents were hesitant about broad workplace disclosure, there was strong interest in systems that enable sharing within small, trusted groups. 68% of respondents indicated they would engage with mental health support if they had granular control over information sharing.

    This points to an evolution in workplace mental health support: moving from broad, organisation-wide programmes to frameworks that enable intimate, trusted connections while maintaining professional boundaries.

    As we continue analysing this data, one thing is clear: the future of workplace mental health support isn’t just about providing more resources – it’s about creating environments where people feel safe enough to use them.

    What has been your experience with mental health support in the workplace? Have you observed similar patterns in your organisation?

    The next article in this series will be published tomorrow: The Privacy-Support Balance: What Employees Really Want

  • The January Paradox: Why Digital Habits Matter More Than Resolutions

    As we emerge from the festive blur into January’s stark reality, an interesting pattern emerges in our digital behaviour. While many of us instinctively retreat into endless scrolling to escape the winter blues, research suggests this might be exactly when we need to reshape our relationship with technology.

    The conventional wisdom about January is fascinating. We’re told to make grand resolutions, start ambitious programmes, transform ourselves entirely. Yet psychological research reveals something counterintuitive: it’s precisely when we’re at our most vulnerable that tiny, consistent actions have the most profound impact.

    Through our work at Duck Score, we’ve observed something remarkable. Users who maintain simple daily check-ins during January report significantly different experiences than those who attempt dramatic lifestyle changes. It’s not about the magnitude of the action, but its consistency. A two-minute reflection each day creates more lasting change than hours of sporadic self-improvement efforts.

    This insight aligns with recent neuroscience findings about habit formation in challenging conditions. When we’re stressed or depleted – as many are in January – our brains actually become more receptive to micro-habits. It’s as if the mind, seeking stability, latches onto small, manageable patterns.

    But here’s where it gets really interesting: the role of AI in this process isn’t what you might expect. While many tech companies push AI as a replacement for human connection during difficult times, our research suggests technology works best when it creates bridges rather than substitutes. AI can help us recognise patterns in our emotional landscape, but it’s the small, human connections that actually shift our experience.

    Consider this: in our beta testing, users who combined daily personal check-ins with small group sharing showed remarkable resilience during challenging periods. The AI provided insights, but the human connections, even minimal ones, provided the emotional anchor.

    This challenges the traditional January narrative. Instead of grand transformations, what if we focused on tiny, consistent digital habits that support our well-being? Instead of trying to overhaul our lives, what if we simply committed to better understanding ourselves through small, daily practices?

    As we navigate these cold, dark days, perhaps the answer isn’t in dramatic changes or digital escapism, but in small, intentional moments of reflection and connection. Technology, when thoughtfully designed, can help create these moments without demanding more energy than we have to give.

    This January, consider this: what if the path through winter’s challenges isn’t about changing everything, but about understanding yourself a little better each day? What if the most powerful resolution isn’t a dramatic transformation, but a commitment to small, daily moments of self-awareness?

    In a world that demands constant evolution, sometimes the most revolutionary act is simply paying attention to where you are right now.

    Would love to hear your thoughts on how you’re approaching January. Are you focusing on grand changes, or finding power in smaller habits?

  • The Uncanny Valley Paradox: When AI Gets Too Close to Human

    Something strange is happening in our relationship with AI. Roboticist Masahiro Mori described it back in 1970 as the ‘uncanny valley’ – that unsettling feeling when something appears almost, but not quite, human. Think of ultra-realistic CGI characters whose slight imperfections make them more disturbing than cartoons. As AI becomes more sophisticated, you might expect this effect to disappear. Yet something unexpected is happening: our sensitivity to artificial humanity seems to be increasing.

    I first encountered this concept in Barcelona back in 2016. I was invited to do some work with Oliver Harrison, MD MPH, on a health moonshot at Telefonica, trying to imagine what the future of healthcare might look like. Mental health emerged as our greatest challenge, but we couldn’t have predicted how AI would transform this landscape.

    A personal tragedy during that time drove home the desperate need for better mental health support. It wasn’t just about accessibility anymore – it was about creating solutions that could genuinely help people while feeling authentic and trustworthy.

    Fast forward to today, and recent research reveals something fascinating. Studies from early 2024 show that even as we casually chat with ChatGPT and let AI help write our emails, we’re becoming more, not less, sensitive to artificial attempts at human connection. AI news anchors still trigger feelings of eeriness, while AI-generated forecasts face persistent scepticism compared to human predictions.

    This creates a fascinating tension in mental health technology. How do we harness AI’s remarkable capability for pattern recognition and insight generation without triggering that uncanny valley response? The answer might lie in not trying to make AI more human-like at all.

    Through our work at Duck Score, we’re discovering that technology works best when it knows its role. AI excels at spotting patterns and generating insights from data – things the human brain isn’t optimised for. Meanwhile, real human connection happens best in small, trusted groups where people can be their authentic selves. It’s about creating distinct spaces where each can play to its strengths.

    It’s like we’ve been trying to teach fish to climb trees when they could be showing us new ways to swim. The real opportunity isn’t in making AI more human-like, but in using it to enhance genuine human connections. Perhaps the uncanny valley isn’t an obstacle to overcome, but a guide showing us where to draw the line between artificial and human interaction.

    Think about a therapist’s office. The value isn’t just in the therapist’s insights, but in the human connection – the empathy, understanding, and trust built over time. Similarly, our closest relationships aren’t built on pattern recognition or data analysis, but on shared experiences and emotional bonds. What if technology could support both these elements without trying to replicate them?

    This shift in thinking has profound implications. Instead of asking “How do we make AI feel more human?”, we should be asking “How do we use AI to support and enhance human connection?” It’s about creating digital spaces where AI can provide valuable insights while preserving authentic human interaction.

    Looking back at those early discussions in Barcelona, I realise we were asking the wrong questions. The future of digital mental health support doesn’t lie in crossing the uncanny valley, but in building bridges over it – creating environments where AI and human interaction each serve their distinct purpose in supporting mental wellbeing.

    The uncanny valley might just be showing us the way forward – not by trying to make technology more human, but by letting it be brilliantly, unashamedly artificial in service of genuine human connection.

  • When Social Media Meets AI: The Case for Digital Authenticity

    As we hurtle towards 2025, an intriguing paradox has emerged: despite unprecedented digital connectivity and rapidly advancing AI, many of us feel more disconnected than ever. Our children scroll endlessly through carefully curated highlights of others’ lives, while AI chatbots offer companionship without true connection.

    The original promise of social media was beautiful – a digital town square where people could share, connect, and support each other. Instead, we’ve created performance spaces where authenticity is sacrificed for likes, and genuine vulnerability is met with judgment rather than understanding.

    Now, as AI enters the equation, we face a critical juncture. Will we use this powerful technology to deepen the artifice, or can we harness it to foster genuine human connection?

    Through our work at Duck Score, we’ve discovered something fascinating. When given the choice, people naturally gravitate towards smaller, more intimate digital spaces. Our beta testing revealed two distinct patterns: those who create small, trusted groups (what we call ‘nests’) for mutual support, and those who prefer private reflection enhanced by AI insights. Both groups share a common thread – they’re seeking authenticity in their digital interactions.

    The data tells an interesting story. In our beta community of 250+ users, the average ‘nest’ size is just five people. When one person shares their emotional state, others often reciprocate within hours. This isn’t the viral spread of social media; it’s the natural rhythm of genuine human connection.

    This has led us to a radical conclusion: maybe the future of digital connection isn’t about scaling up, but scaling down. Instead of broadcasting to hundreds of followers, what if we focused on meaningful connections with those who matter most? What if AI served not as a replacement for human interaction, but as a tool for deeper self-understanding and more meaningful connection?

    We’ve seen the impact firsthand. One parent shared how their teenager used our platform to signal they needed support, preventing a potential crisis. Another user noted how seeing others share their ups and downs helped reset their notion that “everyone else has it all figured out.”

    The path forward isn’t about rejecting technology – it’s about reimagining its purpose. We need digital spaces that celebrate authenticity over performance, that use AI to illuminate rather than mask our true selves, and that prioritise genuine connection over endless growth.

    As we enter 2025, we’re faced with a choice. We can continue down the path of performative social media and artificial engagement, or we can build something different – something that uses technology to enhance rather than replace human connection.

    The future of digital interaction doesn’t have to be more disconnected. By combining thoughtful AI implementation with intentional design focused on genuine connection, we can create digital spaces that actually strengthen our relationships and support our mental wellbeing.

    It’s time to put the ‘social’ back in social media – one authentic connection at a time.

  • The Quiet Revolution in Mental Health Apps 🦆

    In a digital world screaming for our attention, Duck Score is intentionally different.

    We’re not here to maximise your screen time or hijack your attention. Instead, we’re building something that requires a different kind of engagement – one that combines thoughtful design with personal commitment.

    Yes, we’ve carefully crafted moments that make Duck Score sticky: the satisfying feeling of entering your score, the anticipation of AI-generated insights, the connection within your nests. But we’ve also intentionally included friction where it matters.

    When you enter your Duck Score, there’s no slider to quickly swipe. Instead, you manually type your number, creating a moment of genuine reflection. When you receive a supportive message from your nest, it floats across your screen rather than instantly appearing. These aren’t bugs – they’re features.

    The truth is, developing better mental health awareness isn’t about passive consumption. It’s about active participation. While we can design an experience that supports and encourages this journey, the real change comes from your commitment to understanding yourself better.

    Duck Score isn’t competing for your attention in the digital noise. We’re creating a quiet space for something more meaningful – a daily practice of self-awareness and connection.

    Because sometimes, the most powerful digital experiences aren’t the ones shouting the loudest, but the ones giving you space to hear yourself think.

    If you’re ready to gift yourself some peace and quiet in this noisy digital world, Duck Score is currently in beta. Join our early flock before our Boxing Day launch and be part of a different kind of digital experience – one that values your mental well-being over your attention span.

    Download now and give yourself the gift of daily moments of calm and clarity. Because in a world that’s constantly shouting, sometimes the most revolutionary act is taking a moment to check in with yourself. 🦆✨

    Android https://tinyurl.com/rcfs2s88

    iOS https://tinyurl.com/bdepwtux

  • The Surprising Stability of Our Mental Health: Insights from Duck Score

    At Duck Score, we’ve observed an intriguing trend among our users. Many people begin using our app with the impression that they’re constantly anxious or stressed. However, after consistently tracking their mental state, they’re often surprised to discover a much more stable and positive picture than they initially perceived.

    This phenomenon speaks to a fundamental aspect of human psychology. Research in cognitive psychology has long established that our brains are wired to pay more attention to negative experiences. This ‘negativity bias’ can skew our overall perception of our mental state, making us feel more unstable than we actually are.

    The simple act of daily mental health tracking can be revelatory. Consistent tracking creates a habit of self-reflection, allowing users to see patterns over time rather than focusing on momentary lows. This insight aligns with broader trends in mental health research. Studies have shown that people often overestimate the impact of negative events on their overall well-being.

    By providing a daily check-in, Duck Score offers users a more balanced view of their mental health journey. Quantifying our experiences can lead to surprising insights. It’s not about ignoring negative feelings, but about putting them in perspective.

    Of course, this doesn’t mean that mental health challenges aren’t real or significant. Rather, it underscores the importance of awareness and perspective in managing our mental well-being. Sometimes we don’t realise how resilient we are until we step back and look at the bigger picture.

    The simplicity of tracking one’s mental state daily can have profound effects. Just as people are often surprised by how simple it is to maintain physical health habits, they’re amazed at how impactful daily mental health tracking can be.

    At Duck Score, we believe that mental health awareness is a powerful tool, and it doesn’t have to be complicated. By providing a simple, daily check-in, we’re helping users gain a more accurate picture of their mental state over time.

    This trend highlights the gap between our perception and reality. By bridging this gap, we can empower people to better understand and manage their mental health.

    In a world that often feels chaotic and stressful, it’s heartening to discover that we might be more stable and resilient than we give ourselves credit for. And sometimes, all it takes to uncover this truth is a simple daily practice of self-reflection.

    Have you ever been surprised by the stability of your own mental health? We’d love to hear your thoughts and experiences in the comments below.

  • Embracing ‘Good Friction’: How Duck Score is Rethinking App Design for Mental Health


    Last week I had a really interesting conversation with Dave Birss – a friend since his Ogilvy days, my absolute hero of an AI teacher, and a person with a brilliant mind. I love talking to him; I always learn something. This time he told me about Friction, a book he wrote a few years ago about why we need more of it in our lives. I ordered it immediately, of course, and have been reading it. It struck me that many of our design decisions in Duck Score are built around tiny pauses, added deliberately to create space for awareness – and that got me thinking. Here are some thoughts…

    In the tech world, ‘frictionless’ has long been the gold standard for user experience. However, Dave’s book challenges this notion, arguing that some friction can actually enhance user engagement and create greater value. At Duck Score, we’ve found this principle particularly relevant in designing our mental health app.

    For instance, when users input their daily ‘duck score’, we don’t provide easy sliders or pre-set options. Instead, we present a clean interface with two boxes separated by a decimal point. This intentional friction point encourages users to pause and reflect on their current state of mind.

    Similarly, we’ve incorporated a brief waiting period while our AI generates personalised insights. This moment of anticipation can trigger the release of dopamine, enhancing the user’s engagement with the forthcoming information.

    Perhaps our most charming use of ‘good friction’ is our ‘floating feathers’ feature. Users can send supportive notes to others in their ‘nest’, and these notes float across the screen before opening. This brief delay builds anticipation and, we believe, enhances the emotional impact of the supportive message.
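
    For the technically curious, here is a tiny TypeScript sketch of the kind of deliberate delay described above. It is not Duck Score’s actual code – the function names and the 1.5-second pause are assumptions for illustration – but it shows how little is needed to turn an instant notification into a small moment of anticipation.

    const REVEAL_DELAY_MS = 1500; // assumed length of the deliberate pause

    function delay(ms: number): Promise<void> {
      return new Promise((resolve) => setTimeout(resolve, ms));
    }

    // Waits briefly before handing the note to the UI, so the supportive
    // message arrives with a small build-up rather than appearing instantly.
    async function revealSupportiveNote(note: string, render: (text: string) => void): Promise<void> {
      await delay(REVEAL_DELAY_MS);
      render(note);
    }

    // Example usage: logs the note after the deliberate pause.
    revealSupportiveNote('Someone in your nest is thinking of you', (text) => console.log(text));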

    As Dave says in his book, ‘Friction isn’t something to be avoided. It can generate adrenaline, dopamine, oxytocin, serotonin, and endorphins to drive increased engagement, meaning, belonging, rapport, assurance, competence, and exclusivity.’ Our design choices aim to leverage these neurochemical responses to deepen your engagement and promote mental health awareness.

    In the words of Adam Grant, ‘If you think eliminating effort is the key to good design, get ready to think again.’ At Duck Score, we’re rethinking app design for mental health, embracing ‘good friction’ to create more meaningful, impactful user experiences.

    By intentionally incorporating these moments of pause, reflection, and anticipation, we’re crafting an experience that encourages mindfulness, deepens emotional connections, and ultimately supports our users’ mental well-being.

    In a world that often prioritises speed and convenience above all else, we believe there’s value in occasionally asking our users to slow down, to think, to feel. After all, isn’t that what mental health is all about?

    iOS version https://lnkd.in/eVGrhyra
    Android version https://lnkd.in/eeuxYVjr