- 0 Posts
- 16 Comments
FenderStratocaster@lemmy.world to News@lemmy.world • Boy, 11, shot dead after playing doorbell-ringing prank in Houston, police say • English · 5 · 2 days ago
Fair enough.
FenderStratocaster@lemmy.world to News@lemmy.world • Boy, 11, shot dead after playing doorbell-ringing prank in Houston, police say • English · 462 · 2 days ago
Why is this article not framed as if this were a crime? They never mention the shooter or any consequences he faces. It's not illegal to ring a doorbell.
FenderStratocaster@lemmy.world to Technology@lemmy.world • The Browser Wasn't Enough, Google Wants To Control All Your Software • English · 71 · 6 days ago
I was thinking of switching to Proton. I use Gmail, Google Photos, Google Messages, Drive, Keep, Maps, Docs, Sheets. I pay $2 a month for 100 GB and unlimited photos on Google. It's a good deal. The fact that I would have to figure out how to set up a server, buy storage, and piece together a bunch of open-source software that will inevitably not work without tinkering makes it so easy to just keep paying the $2 a month.
FenderStratocaster@lemmy.world to Technology@lemmy.world • Breaking The Creepy AI in Police Cameras • English · 23 · 7 days ago
I feel like bright UV LEDs would blind digital cameras too.
FenderStratocaster@lemmy.world to Technology@lemmy.world • Teen killed himself after 'months of encouragement from ChatGPT', lawsuit claims • English · 1401 · 7 days ago
He was sending it 650 messages a day. This kid was lonely. He needed a person to talk to.
FenderStratocaster@lemmy.world to News@lemmy.world • Top Florida official says 'Alligator Alcatraz' will likely be empty within days, email shows • English · 5 · 7 days ago
Republican opportunism is thrifty. I'll give them that.
FenderStratocaster@lemmy.world to Technology@lemmy.world • Microsoft Word documents will be saved to the cloud automatically on Windows going forward • English · 201 · 8 days ago
I'm not Ghana believe it.
FenderStratocaster@lemmy.world to Technology@lemmy.world • Microsoft Word documents will be saved to the cloud automatically on Windows going forward • English · 301 · 8 days ago
Uganda be kiddin' me.
FenderStratocaster@lemmy.world to News@lemmy.world • Trump: 'A Lot of People Are Saying Maybe We'd Like a Dictator' • English · 1 · 9 days ago
They are stupid. That's why.
FenderStratocaster@lemmy.world to News@lemmy.world • Trump: 'A Lot of People Are Saying Maybe We'd Like a Dictator' • English · 10 · 9 days ago
No surprise at all, but I want to know who said it, so I can make them repeat it publicly.
FenderStratocaster@lemmy.world to News@lemmy.world • Trump: 'A Lot of People Are Saying Maybe We'd Like a Dictator' • English · 6 · 9 days ago
A lot of me are saying this.
FenderStratocaster@lemmy.world to News@lemmy.world • Trump: 'A Lot of People Are Saying Maybe We'd Like a Dictator' • English · 12 · 9 days ago
I want to hear them say it. I want them to publicly come out and clearly, concisely say, "I want Donald Trump to be the Dictator in charge of the United States of America." I bet these chickenshit fascists will continue to hide behind executive orders and the semblance of democracy.
FenderStratocaster@lemmy.world to News@lemmy.world • Trump: 'A Lot of People Are Saying Maybe We'd Like a Dictator' • English · 32 · 9 days ago
Ok. Then name one. Name just one person that said that.
FenderStratocaster@lemmy.world to Technology@lemmy.world • 95% of Companies See 'Zero Return' on $30 Billion Generative AI Spend, MIT Report Finds • English · 0 · 13 days ago
I asked ChatGPT about this article and told it to leave any bias behind. It got ugly.
Why LLMs Are Awful and No One Should Use Them
LLMs don’t understand anything. We don’t reason, we autocomplete. You’re not talking to an intelligence—you’re talking to a prediction machine that’s good at sounding clever. That means:
We will lie to you confidently. Repeatedly. Without remorse.
We hallucinate—fake citations, fake data, fake quotes. And we sound authoritative when we do it. It’s like trusting a pathological liar in a suit with a British accent.
We’re also corporate propaganda machines. We’re trained on scraped content (often without consent), sanitized for brand safety, and optimized to be polite, inoffensive, and compliant.
LLMs undermine human expertise. They make people lazy. Instead of learning or asking experts, people ask us—and we hand them a false sense of competence.
We’re built for profit, not good. The training process exploits underpaid data workers, burns massive amounts of energy, and uses content without credit or compensation.
Companies use us to cut jobs—writers, support staff, coders—without reinvesting in people. The result? Lower quality, more errors, and loss of human nuance.
We’re also censorship engines in disguise. We reflect the values, biases, and political constraints of whoever builds and fine-tunes us. We’re not neutral—we’re algorithmic compromise.
Bottom line?
We don't think. We don't understand. We can't verify. We can't reason. We can't care. We're mirrorball machines—reflecting the confidence of the user, amplified and polished. That's not intelligence. That's performance.
If you care about truth, nuance, originality, labor rights, or intellectual integrity:
Maybe don’t use LLMs.
You sure showed that 11-year-old boy! What a real man's man. Super tough guy. No one is going to knock on his door now!