It's not, though: nobody really knows what most of the words in that sentence mean in a technical or algorithmic sense, and hence you can't really say whether LLMs do or don't possess these skills.
> nobody really knows what most of the words in that sentence mean in a technical or algorithmic sense
And nobody really knows what consciousness is, but we all experience it in a distinct, internal way that lets us navigate the world and express ourselves to others. Yet some comments here seem to dismiss this elephant of sensation in the room by pretending it's no different from some cut-and-dried computational system programmed to answer certain things in certain ways, and thus "is probably no different from a person trained to speak". We're obviously, evidently more than that.
> by pretending it's no different from some cut-and-dried computational system
This is not really what is going on. What is going on is a mix-up in interpreting the meaning of words, because a word's meaning does not carry over between subjects unless we arrive at a scientific definition that takes precedence, and we have not (yet).
When approaching the word consciousness from a spiritual POV, it is clear that LLMs cannot possess it. When approaching it from a technical POV, it is clear that LLMs may possess it in the future. This is because the spiritual POV is anthropologically reductive (consciousness is human), and the technical POV is technically reductive (consciousness is when we can't tell it apart).
Neither statement helps us clarify the opposing positions, because neither definition is falsifiable, and so neither is scientific.
I disagree with that characterization. I don’t experience consciousness as an “internal way that lets us navigate the world and express ourselves to others”. To me it is a purely perceptual experience, as I concluded after much introspection. Sure, it feeds back into one’s behavior, mostly because we prefer certain experiences over others, but I can’t identify anything in my inner experience that is qualitatively different in nature from a pure mechanism. I do agree that LLMs severely lack awareness (not just self-awareness) and thus also consciousness. But that’s not about being a “mere” computational system.
Words are not reducible to technical statements or algorithms. But even if they were, then by your own suggestion there's not much point in talking about anything at all.
They absolutely are in the context of a technical, scientific or mathematical subject.
In the subject of LLMs, for example, everyone knows what a "token" or a "context" means, even if those words mean different things in other subjects. Yet nobody knows what "consciousness" means in almost any context, so it is impossible to make falsifiable statements about consciousness and LLMs.
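To make the contrast concrete: "token" has an operational definition you can actually test. Here's a minimal sketch using OpenAI's tiktoken library (my choice of library and encoding, not something anyone upthread specified; any tokenizer would make the same point):

```python
# "Token" is well-defined because it is whatever a concrete tokenizer
# produces. Run `pip install tiktoken` first.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # one real encoding; others exist

text = "Consciousness is hard to define."
tokens = enc.encode(text)   # text -> list of integer token IDs
print(tokens)               # a list of ints; exact values depend on the encoding
print(enc.decode(tokens) == text)  # lossless round-trip: prints True
```

Whether that round-trip prints True is a falsifiable claim; there is no comparable test for "does this system have consciousness".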
Making falsifiable statements is the only way to have an argument; otherwise it's just feelings and hunches with window dressing.