What a surprise, the empathy-free text generator makes things worse when people expect it to output empathy. My condolences to the kid’s family and I hope he’s in a better place, but this sort of thing is going to happen more and more until people realize that AI chatbots only seem human-like because the human brain is so good at empathy that it projects emotions and agency onto anything, even a literal cowpile with googly eyes on top.
AI isn’t “good enough to fool us.” We’re just stupid enough to be fooled even by something as moronic as AI. What we emphasize in such a statement makes all the difference in how we handle this tech.