Woebot, the virtual therapist
“Woebot is one step forward to stopping the negative spiral in someone’s head so they can get back on with their lives.” — Alison Darcy
Transcript: Woebot, the virtual therapist
ALISON DARCY: The stuff of life is inescapable. The burden of depression has doubled in the last 25 years in every region of the world.
It affects pretty much everything in your life. There’s a running narrative in your mind — and it is loud. Those thoughts are the things that are creating your misery and disappointment and shame and embarrassment.
You can have a super deep conversation with your therapist, and that is magic. I think every clinician has this thought: How many patients can you really see in a day, or a week, or even a year? What we do just does not scale very well.
By removing the other person, we’re willing to disclose more and have deeper conversations. Technology can help you develop a more intimate relationship with yourself. You’re free to really examine your own thinking without the additional noise of “Oh my gosh, how am I coming across right now? What does the other person think of me?” Without having to impression manage.
It’s not a replacement for human connection, but it’s one step forward to just stopping the negative spiral, so they can just get back on with their lives.
CATERINA FAKE: That was psychologist Alison Darcy. She invented a mobile app called “Woebot” where you can get one-on-one therapy. But here’s the thing: replies don’t come from a human. It’s driven entirely by AI.
I can imagine a world where Woebot makes entire populations happier and healthier. But it could also make us more emotionally distant and isolated from each other, playing into our worst habits of turning to our phones instead of connecting with others. Alison is on the show because she wants to guide Woebot towards a future that benefits all of us.
[THEME MUSIC]
FAKE: I’m Caterina Fake, and I believe that the boldest new technologies can help us flourish as human beings. Or destroy the very thing that makes us human. It’s our choice.
A bit about me: I co-founded Flickr – and helped build companies like Etsy and Kickstarter from when they first started. I’m now a partner at Yes VC, and your host on Should This Exist?.
On today’s show, we meet Woebot, a mobile app that’s like a therapist on your phone. It uses AI to guide you through steps that can change your negative thoughts. And I have to admit, I was skeptical about that at first.
The epidemic of anxiety and depression correlates with the increase of technology in our lives. Solving a problem possibly caused by technology, with technology, seems… misguided.
ESTHER PEREL: You know, AI stands for artificial intelligence, but AI also stands for artificial intimacy.
FAKE: That was Esther Perel, the renowned couples therapist with two blockbuster TED Talks, two bestselling books, and the podcast “Where Should We Begin?”. She’ll join us later in the show to help us future-cast, along with a few other experts.
But we’ll start with Woebot founder Alison Darcy. As I talked to Alison, I was drawn in by her vision for how and when AI can play a role in our emotional lives.
With Woebot, you’ll always have a therapist with you, right on your phone. Whenever you have those sinking, dark feelings – what I call the “black cloud” – Woebot is there. Morning, noon or night, whenever it hits you, you have somebody to talk to.
And a lot of people are talking to Woebot. It fields one to two million messages each week, from people in 130 countries.
DARCY: Woebot, in his first day of launch, had more conversations with people than a therapist could see in a lifetime.
FAKE: The first question I always ask about any invention is: Who created it? And what problem are they trying to solve?
DARCY: The burden of illness of depression has doubled in the last 25 years in every region in the world. We’re watching this great increase in people’s needs for mental health services – and the systems that we have just can’t support that need.
I think every clinician has this thought: How many patients can [you] really see in a day, or a week, or even a year? What we do just does not scale very well.
FAKE: Don’t confuse Alison for the kind of technologist who’s looking to take humans out of the equation. She learned from an early age the benefit of a human support system.
DARCY: My father went through AA before I was born, probably 50 years ago, and was successful. He was quite active in that community and was a sponsor for lots of people. And as a result, I grew up in a house where my mom and dad were helping people in their moment of need – whether it was in the middle of the night or people would come over to the house and my mom would be making them dinner.
FAKE: The dinners and late-night visitors stuck with Alison.
DARCY: That idea of helping people in the moment – which 12-step programs have through the idea of a sponsor – is structurally quite different from the other way therapy works, where you go and see somebody, once a week, at a specific time and place. Woebot is much more of the sponsor model: somebody or something there when you need it, in that moment.
FAKE: Before Woebot, Alison was a research psychologist at Stanford, where she blended technology and psychology in new ways. Her most influential therapy helped families of teenagers with anorexia, using video chat. There were a lot of skeptics.
DARCY: The field, at the time, would say, “You can’t treat anorexia over the internet. That is contraindicated.” And the results we had were really promising. We were able to, in lots of cases, help parents re-nourish their child back to a normal weight within a few weeks.
FAKE: This experience flipped Alison’s thinking about digital therapy.
DARCY: The common wisdom is you should at least meet the family face-to-face for the first few sessions, but what I found was that, actually, the opposite was true. I was basically showing up as a therapist in their kitchen – the video camera sitting on their kitchen table. And it occurred to me: this is actually more intimate, in a certain way.
And then it occurred to me that technology can also help develop a more intimate relationship with yourself by virtue of removing the other person – because then you’re freed to really examine your own thinking without the additional noise of “Oh my gosh, how am I coming across right now? Like what does the other person think of me?” Without having to impression manage.
FAKE: Oh, that is so interesting. I do think a lot of people go to their therapists with the sense of: “What do they think? Who do they think I am? How do I appear to them?” That’s a very interesting way of thinking about it – that they no longer have the burden of self-presentation.
DARCY: That’s right. You can see that interactions with Woebot are intensely personal, because none of that noise is necessary.
FAKE: Before we go further, let’s talk about how Woebot works.
After you download the app, you enter a chatroom that looks like any texting interface. You’re greeted by Woebot, a friendly, animated robot that’s smiling at you. And waving. Woebot begins by texting you ground rules, like: “The information you share is private. There will be daily check-ins.”
Then, Woebot digs in. It asks how you’re feeling. You respond by texting back one of 10 emojis, including happy, okay, depressed, anxious, and tired. I texted back “tired,” and Woebot responded, “Got it. You just completed your first check-in.”
From there, the Woebot relationship begins…
DARCY: Woebot will then invite you to work on that and say, “Is this something that you want my help with? Or did you just want me to understand? Are you just getting that off your chest?”
And if someone accepts that invitation, Woebot will then proceed to say, “Okay, I can challenge your thinking around this.”
FAKE: The conversation may go something like one you’d have with your hairdresser, bartender, or friend – but one that gives particularly good advice:
CUSTOMER: I’ll have a whiskey sour. In fact, put a double shot in there. It’s been a tough couple of days.
WOEBOT BARTENDER: I’m sorry to hear that you’re feeling so down.
CUSTOMER: I guess I just feel this constant sense of impending failure. Like… what if I’m a fraud, and at any moment I’m about to be found out? Unmasked for the incompetent phoney that I am?
BARTENDER: Is this something you would like my help with or did you just want to let me know? Both are okay.
CUSTOMER: Huh… what are you …a bartender or a therapist?
BARTENDER: Take a minute to tell me what’s got you feeling this way.
CUSTOMER: I dunno, nothing in particular. The truth is, I feel this way more often than not.
BARTENDER: Gosh, that sucks. Anxiety can be a beast! Imagine what your negative thoughts would be saying if they had a voice. Can you do that?
CUSTOMER: They’d be… they’d be saying… “Hey. HEY. You big dummy. Look at you. Trying to fool everyone else. Well. I SEE THROUGH YOU. And soon they will too. And then you’ll be exposed.”
BARTENDER: Okay, do you think there are any cognitive distortions in this?
CUSTOMER: Distortions?
BARTENDER: Cognitive distortions are unhelpful tricks that we play on ourselves. They’re errors in thinking that feel 100 percent true to us, and they give rise to feeling bad. Learning to identify them is the first step on the road to feeling better.
CUSTOMER: Well, I guess it is kind of over the top. Perhaps at times I’m too in my head.
BARTENDER: Maybe you concentrate too much blame on yourself. This kind of distorted thinking shuts down opportunities to grow and learn.
CUSTOMER: Hmm… I like your way of looking at things. How did you get so good at this?
BARTENDER: Oh, I spend most of my time listening to all sorts of people.
FAKE: Okay, so we took a few small liberties in that conversation. But the bartender’s replies were the actual responses we got from Woebot.
Woebot is based on a specific approach to therapy called Cognitive Behavioral Therapy, or CBT. It’s a highly structured question-and-answer exchange between a patient and a practitioner.
DARCY: What was different with CBT was that it definitely changed the landscape, away from the endless talking to being very problem-focused and data-driven and also evidence-based. So for the first time, arguably, we had a model of therapy that could be evaluated, because it was time-limited. By the way, a lot of people will argue with me on this.
FAKE: The driving philosophy of CBT is that our thoughts directly affect our feelings. And our thoughts are often negative, and often misguided: For example, when we think about how we’re bad, wrong, and flawed, we often feel bad, wrong and flawed.
DARCY: CBT teaches you to tune into those voices and then write them down. There’s something very impactful in actually externalizing those thoughts and writing them down – or in this case typing them, telling Woebot.
FAKE: Is there a reason you chose CBT rather than other therapies as the basis for the chatbot?
DARCY: Yeah, there is. It’s because CBT is so empowering, and that matches really well with what we’re trying to create with Woebot, which is not therapy. It’s actually DIY therapy – and that’s the really important nuance.
FAKE: I think there’s a utopia built into Alison’s technology – and in that utopia, mental health care is readily accessible to anyone, anywhere in the world who seeks it out.
But I also think there’s potential for Alison’s technology to lead us to a dystopia – to a place where we become overly dependent on bots to take care of us, and we lose incentive to foster our most important relationships.
I asked Alison if Woebot would ever replace human therapists, and what we might lose if it did.
DARCY: You can have a super deep conversation with your therapist, and that is magic. That human connection is magic. But that’s not what Woebot is, or is trying to be. You’re never going to be able to have a super deep conversation with Woebot. That is absolutely true.
FAKE: Not possible.
DARCY: It is just not possible. But Woebot can actually get you out of a negative thought spiral when it’s 2:00 am and the world seems very dark and weird and there’s no one else to turn to, you know? You have people who can’t even turn to their partner, and instead of saying, “Well, that’s just wrong. You should,” I think we can honor that and say, “Well, there’s actually this little tool that can get people back on track.”
FAKE: What’s your vision for Woebot – your ultimate hopes and dreams for the technology – and where do you see it in 10 years?
DARCY: I think Woebot will improve in three ways. Woebot will be better at understanding English. Right now, he’s not really trained to understand a lot of what people are saying. Woebot will also broaden the repertoire of things that he can deal with. And finally, I think Woebot’s real core intelligence will get better. Most of our AI is really around getting the person the right tool at the right time. That’s what we call “precision psychology” – similar to the concept of precision medicine: all people are not created the same.
The way we’ve had to develop treatment before has been a one-size-fits-all kind of model, so you’re only ever getting average results for the average person. Getting to real precision is about understanding what the right technique is to deliver to the right person at the right time.
FAKE: On Should This Exist?, we’re creating a new kind of conversation – a conversation between the entrepreneur and the world.
When you’re the inventor, it’s easy to get stuck in your head and assume that your vision of a future is the only future that will exist. But it’s not.
Welcome to the Should This Exist? workshop. Here, Alison and I will respond to ideas I’ve gathered from super smart, creative experts. We asked each of them to throw unexpected things at us — both possibilities and pitfalls that Alison might otherwise miss.
Our first expert is Esther Perel. She’s a renowned couples therapist and relationship expert based in New York City. She is the author of The State of Affairs and Mating in Captivity, and the host of the hit podcast “Where Should We Begin?”.
PEREL: The robot will help you remember to write down your five sentences or to take your medicine, but it will not have a conversation with you about whether you have had a meaningful life – so far.
FAKE: Baratunde Thurston is an Emmy-nominated comedian, writer, and cultural critic who helped re-launch “The Daily Show with Trevor Noah”. As you’ll soon hear, Baratunde is always thinking about the unexpected ways technology sneaks into our social lives.
BARATUNDE THURSTON: We’re dating, but then our bots are kind of dating, and we’re having conversations, but our bots are having conversations. It’s a four-way relationship.
FAKE: Kevin Delaney is a veteran journalist who has covered technology for over 20 years. He’s the co-founder and editor-in-chief of Quartz. And he brings to us both his journalistic rigor – and his healthy skepticism.
KEVIN DELANEY: The caution here is that we can’t outsource our emotional lives and the support of them to machines.
FAKE: Woebot really challenges some of our deeply held ideas about human relationships. So I wanted to start with the personal. Here’s what I asked Baratunde:
FAKE: Would you feel excluded if your girlfriend started spending more time with Woebot and not with you?
THURSTON: I think she and I both might feel the opposite, which is, “Why don’t you tell the Woebot about this, babe? I don’t need to hear everything.” In fact, what I’m sure my girlfriend would appreciate is not even a cognitive behavioral bot, just a bot that I could talk about what I did today with, because she’s like, “I don’t really need to hear all the details of the meetings or the shows or the articles you read.” Or my latest kick, which is, “I just saw this great YouTube video about X.” She’s like, “I get it. You’re really into YouTube now.” If I could offload 30% of that to a bot, I actually think it would enhance my relationship.
FAKE: Not the answer I was expecting. I asked Alison if she thought Baratunde’s plan was sound.
DARCY: I think where it plays out really well in relationships is, again, because Woebot’s not real, the questions it’s asking you about your relationships aren’t coming from any other place – they’re just good questions for you to think about and process through.
But I don’t know. My husband’s really into woodwork. He’s also really into YouTube. I have to sit through some very technical YouTube videos about specific joints and things for woodworking. I want him to watch gardening shows. And so we have this back and forth. But I don’t know if I really want to offload that to Woebot either, you know? That’s the stuff of comedy in your relationship.
FAKE: Baratunde was going for comedy. But that scenario in relationships – it’s a real thing. I was super curious to hear what Esther Perel had to say. And it turns out, Alison was too.
DARCY: Oh, wow.
FAKE: …who you’re familiar with, it sounds like – the renowned couples therapist.
DARCY: She’s so amazing.
FAKE: Esther has always embraced technology in her work – through social media, video chat, her podcast. But she sees a lot of red flags here. We pushed her to share.
PEREL: AI stands for artificial intelligence, but it also stands for artificial intimacy. And this idea that you can create robotics, machines, apps, bots that will talk to you and answer you the way that you would want to be answered – that you can suspend the idea that it has actually been programmed, that it is pretend, that you are free from the iteration and reiteration of relationships – that the fake will be as good as the real. Like Las Vegas.
Our thoughts and our distortions are often overgeneralizations, but a bot itself is an artifact of overgeneralization, because it goes for the most common denominator. It simplifies. Otherwise, you can’t scale stuff. All of these things will, on some level, reduce our expectations of relationships.
Everything about AI is about systematizing, simplifying, and about preventing risk, minimizing risk, and giving you a false sense of control over your life, rather than understanding that your life comes with negative thoughts and negative emotions and that often they are actually the right emotions to have for the circumstances you’re in.
DARCY: But I disagree with the notion that AI oversimplifies, because it’s not trying to simplify human relationships. This argument is based on thinking of it as an overgeneralization of an end-to-end human relationship. And that is not true in this case. I think any AI that does try to position itself that way is mistaken and is bound to that kind of criticism. I actually think it opens the doors to human intimacy.
And maybe this is just because I’m a pragmatist, but it’s like the tennis ball machine shooting tennis balls at you so that you can practice your swing. Right? That has a purpose. But does it, just because it exists, diminish the experience of playing a tennis game with another person? It doesn’t at all. It has zero impact, because they’re very different things. All it does is allow somebody to swing better. It allows them to practice their skills and possibly even enjoy that tennis game more, because of that practice.
FAKE: When you think about the role Woebot could play in our modern lives – there are so many different metaphors you can choose. Is it the therapist? The AA sponsor? The ball machine? One other analogy came to my mind….
FAKE: I was raised Catholic, and I’m thinking: Catholicism famously has confession. You go into this booth and you tell your sins, and then you’re given instructions for how to get absolved, which is usually five Hail Marys and ten Our Fathers, et cetera. I always thought this was a very interesting psychological outlet for people’s pent-up emotions – to have an anonymous person in a booth who’s not part of your life, who you’re not in an intimate relationship with. And this tradition from long ago – obviously a medieval strategy – has some similarities to Woebot.
DARCY: That’s interesting. Woebot as the sort of modern confessional. That’s interesting.
FAKE: Is Woebot a modern confessional? Maybe. But here’s the thing: Traditional confessionals were rooted inside a very structured community, where everyone was deeply connected to everyone else. In our modern lives: It’s just not the case. And Esther asks us to consider: Will Woebot drive people who are already isolated farther apart?
PEREL: Many social people will use technologies like this, and it will not isolate them further because they’re using it as part of their life. It’s integrated in a broader context. But if you are an isolated person to begin with, why am I going to give you self-help? The whole point of your being isolated is that you need the help from others and you need to be able to reach out to others – and I’m going to help you with that?
You know, the overblown notion of self-help is part of an individualistic mentality and part of a society that privatizes problems: “They’re your problems. I’ll help you deal with them on your own.” And that is a worldview.
What do you do in parts of the world where the word “self” does not exist? It’s not part of the language because there is no self that exists separately from how you are embedded in your relationships to others. It’s that way that often the West has gone in to all kinds of places saying “I know what you need.” Because: “I have a solution for you.”
FAKE: Esther makes a lot of important points there. I’ll pick up on just one, because I have some reservations about the self-help movement myself.
So much of the self-help industry comes from advertising and social media, which create FOMO and tell us that we’re insufficient and inadequate, and then market us mood enhancers, meditation and exercise programs, and endless makeup videos. I was curious what Alison would say about Woebot in relation to this movement.
DARCY: Oh, yeah, I think we’re in total agreement here. I often say that self-help, or Woebot, is not going to be the solution for somebody who would benefit from a human therapist and where there is a relational component. If somebody is feeling really isolated, or even lonely, or there are broken relationships or broken attachments, then the therapeutic component is the therapeutic relationship, for sure. Which you’re not going to get with Woebot. I totally agree with that.
I think where we might not see eye-to-eye is this idea that self-help is very individualistic. I think there’s a whole industry around it that’s individualistic.
FAKE: A lot of self-help, I think, comes from the impress of advertising and social media, and everything telling us that we’re insufficient and inadequate.
DARCY: Like “I’m not X enough” – there’s no way to fix that. There isn’t suddenly a bell that goes off and says, “Congratulations.”
FAKE: “Congratulations.”
DARCY: “You’re good enough now.” That never actually happens.
FAKE: But it’s interesting that this has come up, because I’ve heard it said that self-help is the true religion of America.
DARCY: Oh wow. The other way to look at that, though, is that the pursuit of perfection gets exploited by an industry that has become known as self-help. And I think it’s sad that self-help has such a bad name. Self-care is also a forgotten art, and I think it can be fundamentally empowering in a way that other things don’t necessarily offer.
FAKE: I also think it’s very interesting that you called it DIY and not self-help. Because I do think there’s terminology that can promote or inhibit what you’re trying to do.
DARCY: Right. Because I’m just so pragmatic and I think that’s what DIY is doing. You know, it’s like sometimes you just need a wrench to fix your leaky faucet – and then sometimes you need a plumber. It’s about knowing the difference.
FAKE: We need to spend more time with this technology to know when we need a robot and when we need a human. In this case, Kevin’s money is on Woebot. His reason is one I hadn’t heard before.
DELANEY: You want to deploy technology in the places where it’s least helpful for humans to be involved. It turns out that we, as human beings, are actually not especially good at completing repeated tasks. The classic example is that people who work in the medical profession should remember to, and know how to, wash their hands. This is something they should be doing dozens, hundreds of times a day, but it turns out that they don’t. Human beings are flawed – and it’s not just the people who work in the medical profession.
I was actually at the doctor this week, and I saw, next to the sink, this 10-point checklist that explained how to wash your hands: One, run the water. Two, rub your hands. Three, get the soap, and so forth.
So to bring this back to Woebot, what’s interesting is that I would bet that Woebot, because it’s literally following a list, is actually better and more reliable and has better outcomes than the therapists who are bored out of their brains, repeating these same questions over and over again.
DARCY: Yeah, there’s a phenomenon in intervention research known as therapist drift. When a therapist learns a particular therapeutic technique, they start out implementing it really, really well, and then over time they implement it less and less well. About 50% of therapists who believe they are delivering CBT are actually not.
FAKE: Woebot will never do that. There is no therapist drift.
DARCY: No. And Woebot will never be able to deliver CBT fully, right? Because it’s not able to step outside the normal bounds of what we might consider CBT drills and really respond to somebody when they are having an existential crisis.
FAKE: Woebot may not handle existential crises with the elegance of a psychoanalyst, but I couldn’t help wondering: Could Woebot’s constant availability increase the chances of people getting overly attached to it? Kevin forecasts that it could:
DELANEY: The caution is that we can’t outsource our emotional lives and the support of them to machines. Research has shown that people do actually develop emotional connections to machines. You have millions of people, particularly in Asia, who rate their interactions with fully automated chatbots at the same level as some of their friends or family. How do you make sure that things that are really gratifying, and exist only virtually, don’t become the sole focus of our existence?
FAKE: Alison acknowledged that she does see this danger in AI.
DARCY: There’s a creepiness, or a danger, where somebody is really mistaking the tech for being real, right? Or where it’s replacing what would otherwise be a human connection, or a space that would otherwise be taken up by a human.
FAKE: But she believes this won’t happen with Woebot.
DARCY: It’s really hard for me to get my head around the idea that somebody would ever be attached – or as attached – to Woebot as they could be to a family member, because the interactions are so different and so limited. Woebot will actually cut off an interaction; we don’t optimize for keeping people in the conversation as long as possible. Although I do see that some chatbots are designed that way, and that does feel a little weird to me, if you’re trying to just keep somebody talking.
FAKE: What are you designing for? A long interaction or a good interaction? These are the kinds of decisions every technologist faces. And most try to “gamify” their systems, so people are motivated to stay longer.
But when Baratunde forecast where this technology could go deeply, darkly wrong – he didn’t worry as much about our potential attachment to Woebot, but rather the fallout from it. He described a dystopia straight out of a horror movie.
THURSTON: [Imagine a leak of the] worst fears and feelings of people all across the planet – by time of day and geolocation and what their commuting pattern was when they were feeling the thing – and someone could dox the world with, like, an emotional data dump. That’s one of the worst things.
Or, more frighteningly, someone could target individuals who have certain influence in the world and blackmail them and say, “We know you were having these feelings of shame or self-doubt or concern” – a leader of an organization, a public figure of significant interest, vulnerable members of society with accountability issues. So there’s a huge, huge blackmail industry that might even be the biggest business plan Woebot has: just blackmailing wealthy people of influence who happen to have installed this app and shared their deepest, darkest moments with an AI.
FAKE: A lot of technology has been designed without these risks in mind, and the trust that you have to feel in a technology like this is super, super important, because there are a lot of things here that could be hurtful to people. This can have real-world consequences: You can lose your job. You can be denied insurance. You can lose out on opportunities. Here’s what Alison had to say:
DARCY: Oh yeah. I mean, that is the absolute darkest awfulness that could come from this technology, no doubt. It’s interesting – I just recently wrote a blog post about why we would never sell someone’s data for commercial reasons. Obviously we’re bound as psychologists to an ethical framework, but here’s the business argument for why selling data would be a shockingly bad decision: Woebot has to function on trust. Trust is the most important thing. So we could never risk that kind of a breach, that kind of a violation of that trust.
But actually, if you really had malicious intent, you could probably shut down the business now and then blackmail people. The only thing is, we don’t know who anyone is. The individual data points are not identifiable. That’s really important.
FAKE: But it’s on… Is it still using Facebook?
DARCY: Oh, it is. I have to say, since the Cambridge Analytica scandal, Facebook usage has absolutely nose-dived. Most people are using our apps. However, we do keep Facebook Messenger open because the majority of people using it are using it from developing countries. It aligns with our mission of access, you know?
FAKE: But it doesn’t align with your mission of data privacy.
DARCY: Well, right. I mean, as a company, we don’t know who these people are. But the…
FAKE: Facebook does.
DARCY: … but Facebook does. But we make sure that every user actually knows that and actively opts in and says, “Yes, I understand that.”
FAKE: There’s a distinction between self-help as it’s currently marketed to us and Woebot, which is a tool that gives us the ability to help ourselves. But when you’re an entrepreneur building a technology like this, you have to be super sensitive to potential outcomes, and think through every one of them from the perspective of the most vulnerable.
As a teenager, I went through a dark time and used CBT myself to get out of it. Woebot would have helped me, for sure. But I have mixed feelings about the idea of spending even more time on my phone. Woebot brings up different things for different people – and sometimes even different things for the same person.
We want to know what you think about Woebot. Tweet at us using the hashtag #ShouldThisExist.