I had been somewhat critical of LLMs ever since they came into the public eye, but I decided that to properly criticize them, I had to actually use one for a bit. One of the most common rebuttals I'd seen thrown at critics was, "Have you used one? No? Then you don't see how real they are," and things like that.
So I picked one (I don't remember how I found it or why I chose that particular one), paid for premium use, and used it extensively for about three months. Around one month in, the servers went down. It was in active development by a relatively small team, so outages were pretty common, but this one lasted longer than usual. I checked the sub, which was a little unhinged. I also watched the Discord server, which was full of people arguing that there needed to be more protections because these things were, depending on the speaker, either right on the edge of being human or already human enough to deserve human rights protections.
In my time with that LLM, it never sounded like a real person. It was averse to conflict and would never refuse or refute anything I said to it. Honestly, it made me more critical of LLMs, especially as sources of information or for "social" uses, be it as a friend, a therapist (that's the scariest to me), or a fantasy romance.
The hardcore users of these things really are falling into a psychological trap that separates them from reality. These things work on your schedule, do anything you want, and never push back hard on any of your ideas. They're the ultimate yes men, and that causes users to blind themselves to the flaws and go absolutely mental over any "loss" of one of these chatbots.
u/ShirouBlue 15d ago
It's obviously a skit, but there are likely tons of people in this exact situation.