- cross-posted to:
- Technology@programming.dev
cross-posted from: https://programming.dev/post/36160327
Comments
If we can get food by going to the grocery store, who needs farms?
The idea that AI is a “google killer” and a replacement for websites always struck me as incredibly dumb. I know that’s how many people use it though, which is fairly awful. So, AI gets all of its information from websites. If it puts websites out of existence, where is it getting new information from?
LLMs, as the name suggests, are language models - not knowledge machines. Answering questions correctly isn’t what they’re designed to do. The fact that they get anything right isn’t because they “know” things, but because they’ve been trained on a lot of correct information. That’s why they come off as more intelligent than they really are. At the end of the day, they were built to generate natural-sounding language - and that’s all. Just because something can speak doesn’t mean it knows what it’s talking about.
who needs the web?
Anybody who cares enough to confirm whatever stupid bullshit the AI probabilistically regurgitated without actual understanding.
Seriously, in my experience, AI-generated results are only actually correct maybe 10% of the time.
What I really don’t get is that there are in fact models that you can feed a document to, and they will directly quote the relevant parts of that document in their reply, complete with a little reference to the correct page. Basically a smarter ctrl-F that can take sentences as input.
When that exists, why is Google using the probabilistic shit in their search?
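The “smarter ctrl-F” idea from the comment above can be sketched in a few lines: split a document into pages and sentences, score each sentence against a query, and return a verbatim quote plus the page it came from. This is a minimal illustrative sketch, assuming a simple bag-of-words cosine score; real extractive systems use embeddings or trained rankers, and all names here are made up for the example.

```python
import math
import re
from collections import Counter


def _tokens(text):
    # Lowercase word tokens; a deliberately crude tokenizer for the sketch.
    return re.findall(r"[a-z']+", text.lower())


def _cosine(a, b):
    # Cosine similarity between two token-count vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    if dot == 0:
        return 0.0
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)


def find_quote(pages, query):
    """Return (page_number, verbatim_sentence) best matching a sentence-level query."""
    qv = Counter(_tokens(query))
    best_score, best_page, best_sentence = 0.0, None, None
    for page_no, page in enumerate(pages, start=1):
        for sentence in re.split(r"(?<=[.!?])\s+", page):
            score = _cosine(Counter(_tokens(sentence)), qv)
            if score > best_score:
                best_score, best_page, best_sentence = score, page_no, sentence.strip()
    return best_page, best_sentence


pages = [
    "LLMs are language models. They generate natural-sounding text.",
    "Extractive systems quote the source verbatim. They cite the page they found it on.",
]
page_no, quote = find_quote(pages, "Which systems quote the source and cite a page?")
print(f'p.{page_no}: "{quote}"')
```

The key property, unlike a generative answer, is that the output is copied straight from the source, so it can’t say anything the document doesn’t.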