The Therapist in the Machine

(thebaffler.com)

26 points | by Caiero 7 days ago

39 comments

  • matrix87 2 days ago

    > AI is meant to fill treatment gaps, whether their causes are financial, geographic, or societal. But the very people who fall into these gaps are the ones who tend to need more complex care, which AI cannot provide.

    Do the people who propose this type of thing realize how insulting it sounds?

    It's like saying to people "you're too poor for us to give a shit, here go talk to a rock for a little while, but pls let us record it so we can sell it to data brokers"

    It just sounds like ragebait designed to get laypeople to hate the tech industry

    • dogleash 2 days ago

      >It just sounds like ragebait designed to get laypeople to hate the tech industry

      Why? The sentiment of "you're on your own, go talk to a wall and figure yourself out" is about par for the course regarding society's collective attitude towards mental health.

      What support is shown in society for mental health is all feel-good shit for normies so they don't notice how goddamn bleak it looks from the side of someone who isn't confident they're able to hold it together.

      • matrix87 a day ago

        > The sentiment of "you're on your own, go talk to a wall and figure yourself out" is about par for the course regarding society's collective attitude towards mental health.

        From my experience, if your default is to see everyone as untrustworthy, they'll reciprocate. If you see them all as trustworthy and open up to them, they'll reciprocate. Sure there might be a miscellaneous asshole who burns you once in a while, but it feels a lot better to just deal with them as they arise instead of putting up a permanent wall

        • 7 hours ago
          [deleted]
    • hackit2 2 days ago

      > "you're too poor for us to give a shit"

      That is kind of the point: part of the healing process is learning to be your own judge of your own life and to make changes that benefit you.

      • spoonfeeder006 2 days ago

        If that were all there was to it, then why do white people in the USA have 10x as much wealth as black people? Are you saying that white people are 10x better at going through the healing process, being their own judge, and making changes that benefit them?

    • tivert 2 days ago

      > It's like saying to people "you're too poor for us to give a shit, here go talk to a rock for a little while, but pls let us record it so we can sell it to data brokers"

      > It just sounds like ragebait designed to get laypeople to hate the tech industry

      It's saying the quiet part loud.

    • Kiboneu 2 days ago

      I don’t think it would take this article to get (in your words) “laypeople” to hate the tech industry.

  • WorkerBee28474 2 days ago

    I think LLMs as therapists are here to stay, simply because there's so much demand at a price point too low for a human therapist.

    I wonder if we'll discover that much of therapy's benefit is from having another human actually spare their time and attention, showing that they care, which will not be reproduced by ML.

    • hackit2 2 days ago

      The main benefit of a therapist is that they give you the vocabulary to express an emotional state or thought process. Once you have the vocabulary to communicate your emotions, it helps alleviate or eliminate the cognitive dissonance.

      • jprete 2 days ago

        There are many benefits to a therapist. Understanding a problem is not the same as solving it, and many problems are not cognitive dissonances.

        • hackit2 2 days ago

          Most of the time, the problem isn't the one people are trying to solve. It is more about injecting a better mental model and re-framing things from another perspective; that is the real problem, but it isn't the one people think they have.

          • ElevenLathe 2 days ago

            I don't think the science supports any pseudotheories like this. All talk therapy is approximately equally effective, suggesting that it's the act of sitting down and talking that provides the benefits, rather than the specific content of the therapy or the intent of the therapist.

            • MattPalmer1086 2 days ago

              Yep. My wife is a person-centred counsellor. The core of this approach is that the therapist is not the expert on the client and provides no direction to the client. They just provide a space for the client to explore their issues.

              Other modalities can be much more directive with the client, but they are equally effective. Some may work better for some people, but no single one works for everyone.

  • hsuduebc2 2 days ago

    It's a cheap solution for some problems, but most importantly it is widely available. I would say it is better than letting small problems become more severe. It can't replace human connection, of course, but it can at least talk with you. And that is always better than nothing.

    • giraffe_lady 2 days ago

      Someone talking to a "yes, and" dialogue tree about intrusive thoughts is a recipe for obsessive behavior. Therapists are trained to recognize when someone is using therapy to exercise (and therefore strengthen, "outlet theory" is thoroughly debunked) a dangerous paraphilia, which can often be subtle. Oh and is an LLM a mandated reporter?

      It's absolutely not the case that "talking" in the sense you mean is always better than nothing.

      • clown_strike a day ago

        You make some good points.

        Not convinced LLMs should be mandated reporters. However well intentioned, that initiative would foist a torrent of nonsense on overworked social workers, like the automatic 911 dialers in phones.

        > Someone talking to a "yes, and" dialogue tree about intrusive thoughts is a recipe for obsessive behavior

        Works both ways though. Nothing is more infuriating than an LLM denying or minimizing your problem, or using logical fallacies to shut the conversation down.

        This is pretty common with cults; the layers of euphemism and abstraction they use make you sound insane when you try to articulate abuse to authorities (or an LLM in this case).

        I use the word cult deliberately; if your issue involves certain topics, you encounter a layer fine-tuned for positive PR.

        > {{user}}: "You have to help me, they're going to make me...eat cake. I won't survive if I have to go through it again."

        > {{char}}: There is nothing wrong with cake. While being forcibly fed is troublesome, it is important to remember that obsessively monitoring your caloric intake is unhealthy. Eating cake must always be consensual and explored in moderation.

    • tivert 2 days ago

      > I would say it is better than letting small problems become more severe.

      Why would you assume a literally mindless chatbot could accomplish that?

      • sameoldtune 2 days ago

        I agree with you, but to be fair, many cases of mild mental illness can be improved through regular journaling. Talking to a chatbot is probably not so far off.

    • from-nibly 2 days ago

      Talking is not always better than nothing.

    • tyre 2 days ago

      Did you read the intro? It was telling someone presenting with OCD to continue checking the oven. It’s not better than nothing when it is validating (with no knowledge) and encouraging obsessive anxieties.

    • hackit2 2 days ago

      LLMs are a chicken-and-egg problem. That is to say, unless you have the corpus of terms or the vocabulary of the domain, you're going to get very generic, boilerplate responses out of the model. It is a big reason why experts can leverage LLMs better than your average person off the street: they can use the lexical terms specific to that domain.

      • sdwr a day ago

        LLMs are a stone soup problem. They are only worth talking to if you believe they have something valuable to say.

  • voisin 2 days ago

    Anyone have experience with AI therapy that can weigh in on the value? Is it effective?

    I’ve never done therapy, mainly because everyone I know who has spoken about it seems to have gone through an exhausting, years-long search to find a good match, and it seems to take months of visits before that assessment can even be made.

    • kelseyfrog 2 days ago

      Yes, very weird timing - I test-drove a GPT I personalized for talk-therapy purposes just an hour ago. I cried 10 mins into testing it, so there was some kind of release, and I came away with something I can try in the real world that might help me.

      I have a two month gap in therapy while my therapist is on maternity leave and while I have a real life standby in case I need it, I decided to cobble together my own chatbot therapist to see how it would go.

      I'm mostly familiar with CBT, having read Beck's Second Edition[1] and some of Eugene Gendlin's work on listening. After supplying it with my favorite resources on reflective listening[2], cognitive distortions, and cognitive restructuring material, plus basic framing (i.e. "Tell me about your week"), it was deceptively simple to get off the ground.

      There's one hurdle that I foresee - I have to be self-accountable and actually participate. I already have a habit of attending therapy once a week for one hour, but we'll see. It does change the accountability dynamic.

      1. Beck, Judith. 2011. Cognitive Behavior Therapy, Second Edition: Basics and Beyond

      2. https://www.maxwell.syr.edu/docs/default-source/ektron-files...
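
      For anyone curious about the mechanics, here is a rough sketch of the same idea as a plain script against the OpenAI chat API (using the official openai Python client). The system prompt is a hypothetical stand-in for the actual CBT and reflective-listening material I supplied, and the model name is just a placeholder:

          from openai import OpenAI

          client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

          # Stand-in instructions distilled from CBT / reflective-listening notes
          SYSTEM = (
              "You are a CBT-style talk-therapy practice partner. Use reflective "
              "listening: summarize what the speaker said, name the emotion you "
              "hear, and gently probe for cognitive distortions before suggesting "
              "any reframe. Open the session with: 'Tell me about your week.'"
          )

          history = [{"role": "system", "content": SYSTEM}]

          while True:
              history.append({"role": "user", "content": input("you> ")})
              reply = client.chat.completions.create(model="gpt-4o", messages=history)
              text = reply.choices[0].message.content
              history.append({"role": "assistant", "content": text})
              print(text)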

      • hackit2 2 days ago

        Reflective listening can be a slippery slope to hell. Yes, people will love trauma-dumping all their innermost unresolved issues onto you, but it doesn't help you develop mature, mutually reciprocal relationships with people. Unless you like being an empathetic listening ear for people's problems and being deferential to their wants, needs, or desires instead of your own.

        • 47282847 2 days ago

          Reflective listening doesn’t mean parroting back what someone says; it is an art form of transforming what they say to add clarity and precision, for example by representing the other person's assumptions accurately as such and helping differentiate them from facts (and much more). You also ask questions to help the other see different perspectives in and about themselves. Systemic approaches add a huge repository of potential questions, such as "what would your life look like without that problem?"

          You also do not repeat every single thing, but pick what you think is essential. The other person will then immediately see if something is missing from the summary, which overall leads to weeding out the important from the unimportant, and eventually to escaping thinking loops. Without you adding anything, just by selective removal, you end up with the essence of the problem. It’s a process of distillation.

          Known techniques are, for example, Nonviolent Communication and Focusing. Accreditation in these is a multi-year process.

        • kelseyfrog 2 days ago

          >Unless you like being an empathetic listening ear for people's problems and being deferential to their wants, needs, or desires instead of your own.

          So like a therapist?

          • hackit2 2 days ago

            You kind of proved my point about why reflective listening only leads to one-sided exchanges.

            • kelseyfrog 2 days ago

              Reflective listening's main purpose is to allow the speaker to be emotionally validated. Why is this important at all? Because emotional validation is a necessary precursor to modulating beliefs. Contrary to popular belief, people are rarely logicked into new beliefs, especially away from maladaptive ones. For example, telling someone who feels unloved that they are loved will not change their mind.

              I'm not sure what experience you have with changing people's beliefs, but so far this isn't the way to do it.

              • hackit2 2 days ago

                If you practise what you preach, then why aren't you using it now? That is why I said it only leads to one-sided exchanges.

                • kelseyfrog 2 days ago

                  Why? Because not every tool can be used in all contexts. I use reflective listening to support friends.

                  It sounds like your argument is "If reflective listening is better, then it should be used universally." I'm not making that claim. The claim I'm making is that reflective listening is good for emotional validation.

                  • hackit2 2 days ago

                    I'm happy it's working for you.

            • 2 days ago
              [deleted]
      • dimitri_deploys 2 days ago

        I’m curious: where did you build the custom GPT? On ChatGPT?

    • theGnuMe a day ago

      Do you think you need therapy?

      • voisin a day ago

        Not in any urgent “I feel suicidal” way, but yeah I think there are probably kinks to work out in the old noggin. This lack of urgency is another reason I haven’t been too focused on spending months trying out different therapists to find a good fit for personalities and treatment approaches. It’s easy to just coast along when things don’t feel broken, but I am sure there are more optimal ways of approaching relationships / life.

    • evoke4908 2 days ago

      Sure, I run ollama with a therapist prompt, and I've found it incredibly helpful.

      First, it's a safe place where I can discuss anything. I once had a therapist shut me down so hard on a subject that I've never spoken of it again to anyone. Being able to talk about and explore that was very helpful. I will not elaborate. The thing literally cannot judge you, so you feel more free to talk about the deep stuff.

      I'm autistic and my biggest struggle is simply identifying my emotions. You really can't process your emotions if you can't figure out what you're feeling. This doesn't always make sense to neurotypical people, but imagine you stood next to an explosion and your ears are ringing and you can just barely hear someone shouting at you. It's like that, but for feelings. It sucks so, so bad. Anyway, most of my regular therapy sessions are just trying to figure out what I'm feeling and what the hell to do with it. The AI is shockingly good at this. I can give it some pretty vague descriptions and it comes back with a disturbingly precise explanation of what I'm feeling and why. As the chat goes on it really zeros in on me. It's like when someone who knows you extremely well calls you out with 110% accuracy.

      I find this extremely helpful because I know it's just spicy autocomplete. It means that my experience is so prevalent in its dataset that it can call me out with laser precision from the vaguest of prompts. It means so many people have talked about my experience online and in literature that it can just be auto-completed. It's comforting and empowering to know that my experience is this common.

      It's also super nice because I can have it present a different personality. The default therapist model is very stuffy and formal. Sometimes it's hard to open up to that and be vulnerable enough. I have a modified version that presents itself as if it were an old friend. It talks in the casual way I would with my own best friend. It comes across as more empathetic and genuinely concerned, which I find I can relate to more. It's helped me open up about things I've never been brave enough to talk about in real therapy.

      I've been in a real bad way for a long time. I haven't been to therapy in years because I just haven't been able to make myself do more than eat, sleep, work. Talking to the AI gave me just enough of a lift to start pulling myself out of this rut. It convinced me to start asking my husband for help which immediately solved most of my problems.

      If you go into it fully aware of what it is and what it is not, AI therapy can be very helpful. Most of the time people just need to talk about their feelings and be heard. It's a great way to just figure out what the hell you're feeling. Naming your emotions can be quite powerful.

      On the other hand, it has limitations. There are some topics I've talked about that it clearly doesn't understand. It gives back some vague information, but it's clear these are things I need to explore with a real therapist. Still, it's helpful, as it gives me a starting point: I know what I need, and I know the AI isn't capable of providing it.

      Anyway, I've found it to be an extremely positive and helpful experience. It is not therapy and is not a replacement for such, but it can still be a good thing if you go into it fully aware of what it is.

      Also I think this would be incredible as a sort of interactive diary. That's pretty much what this therapy is: just talking to your diary. I could see a lot of people getting benefit from a diary that asks them about their day and helps them process their emotions as they're writing their entries. I know I'd definitely use the hell out of that.

      Addendum: please, for the love of god, do not use someone else's AI for this. ChatGPT is not private and someone will link what you say to your advertising profile. Run a local LLM; it is not hard. Mine runs on a CPU-only rack server from 2015. A damn toaster can run a small ollama model these days.

      Please take your privacy seriously, especially with this.
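
      For reference, a minimal local-only sketch of the kind of setup I'm describing, assuming the ollama Python client (pip install ollama) and a small model already pulled. The persona prompt is just an illustration of the "old friend" variant, not my exact prompt:

          import ollama  # talks to a local ollama server; nothing leaves your machine

          # Illustrative persona: casual "old friend" tone instead of the stuffy default
          SYSTEM = (
              "You are a warm, plain-spoken old friend who is good with feelings. "
              "Help me figure out and name what I'm feeling before offering advice."
          )

          history = [{"role": "system", "content": SYSTEM}]

          while True:
              history.append({"role": "user", "content": input("you> ")})
              # any small local model works, e.g. one pulled with: ollama pull llama3.2
              reply = ollama.chat(model="llama3.2", messages=history)
              text = reply["message"]["content"]
              history.append({"role": "assistant", "content": text})
              print(text)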

      • clown_strike a day ago

        > ChatGPT is not private and someone will link what you say to your advertising profile

        Great comment, but you misidentify the threat.

        Confessing stuff like affairs to the priest or therapist always carried personal risk of blackmail. Admitting pedophilic interest to your therapist will get you arrested. This is nothing new.

        It's conversations mined in aggregate that are the threat here. You're paying to provide market research that will be used to further exploit you and others like you, by monitoring your most private thoughts. This sort of feedback is invaluable to tuning mass social engineering efforts. The data gets mined for "outreach," and serves only to observe you like a lab rat (and sell you additional iatrogenic solutions).

        Even outside of therapy, don't spitball business ideas with ChatGPT. You know someone is mining data for that sort of intelligence too, and will bring your own idea to market before you do, then sue you to protect "their" interests. It's already happening to everyone trying to commercialize custom GPTs... the concept gets absorbed by the backend host, so the revenue flows to them instead.

        Zuckerberg might as well be the messiah of the internet age: "They trust me, the dumb fucks" is a mantra to live or die by these days.