Given how much it hallucinates, that's one very scary echo chamber to be in: trusting info reinterpreted by an algorithm instead of just reading it yourself. Yikes.
It rarely hallucinates for me. But you need to know what it's capable of and how to work with it effectively. You can't use it well if you think it's an all-knowing sentient AI.
How would you know how frequently it hallucinates if it has "mostly replaced Google" for you? Are you fact-checking all your queries? This is a very strange self-inflicted echo chamber.
Because the code it generates works? The recipes it gives me are also delicious, and I don't even have to read someone's life story on a blog before getting to the recipe.
Not really sure why you are so fixated on the echo chamber thing. We are on the internet! The biggest echo chamber humanity has ever built.
Functional or delicious ≠ accurate to the source material.
Because it's a layer of abstraction, mate. One known to get things wrong, because it's an LLM. If I write a post about Richard Stallman's opinions on paedophilia and Jeffrey Epstein (https://en.wikipedia.org/wiki/Richard_Stallman#Controversies), and it incorrectly tells you that Stallman associated directly with Epstein, or is a paedophile himself, that would not be accurate to the source.
At least with a Google search result you can go more directly to the source. If the point of the scientific method is getting to the truth, why on Earth would you put an obstacle in front of it?
If someone tells me X, am I just going to believe them? No. So why would I believe an LLM when it doesn't present sources that match, 1:1, the information it gives me?
Yup, and it hallucinates plenty with dev work too, even over basic stuff like an NGINX config.
Given that it hallucinates in particular over measurements/specs/stats, I'd be extremely sceptical of taking a recipe from it, whether that's an original generation or something reproduced from a known source.
Baking requires very specific measurements; the slightest mistake and it won't turn out well in most cases. Again, why go via an LLM and not a search engine to the actual source? It makes zero sense, especially since the LLM only returns text: if the recipe actually exists somewhere, you can't even see what it's supposed to produce.
> Again, why go via an LLM and not a search engine to the actual source?
I believe the argument presented (possibly in a separate thread) was that search engines have degraded to the point where what they show you is worse than LLM output.