I know, but it’s a ridiculous term. It’s so bad it must have been invented or chosen to mislead and make people think it has a mind, which seems to have been successful, as evidenced by the OP
ChatGPT does not “hallucinate” or “lie”. It does not perceive, so it can’t hallucinate. It has no intent, so it can’t lie. It generates text without any regard to whether said text is true or false.
We don’t have a way to do this. I don’t think we ever will. Wish the answer was different.
The one thing I will say is that logical argument is extremely ineffective for changing people’s views. Personal, emotional stories work best. The issue is that war and the draft are already highly emotionally charged topics, so it’s gonna be hard to find something that will strike a nerve with someone who hasn’t already come around on it.
Executives believe nearly half of the skills that exist in today’s workforce won’t be relevant just two years from now, thanks to artificial intelligence.
Executives are such dumbasses
That is literally all this “study” did: ask people how many of their skills they think will become obsolete. This headline is ridiculous.
OP clearly expects LLMs to exhibit mind-like behaviors. Lying absolutely implies agency, but even if you don’t agree, OP is clearly confused about what LLMs actually do.
The whole point of the post is that OP is upset that LLMs generate falsehoods and parrot input back into their output. No one with a basic understanding of LLMs would be surprised by this. If someone said their phone’s autocorrect was “lying”, you’d be correct in assuming they didn’t understand the basics of what autocorrect is, and you’d be completely justified in pointing out that that’s nonsense.