Researchers in ML and neuroscience disagree with you.
You have a superficial grasp of the topic. Your refusal to engage with the literature suggests an underlying insecurity regarding machine intelligence.
Good luck navigating this topic with such a mental block; it's a great way to remain befuddled.
> in 2020 neuroscientists introduced the Tolman-Eichenbaum Machine (TEM) [1], a mathematical model of the hippocampus that bears a striking resemblance to transformer architecture.
...what? Underlying insecurity? You think I'm afraid of computers being smarter than me? Sorry, but that ship sailed a long time ago; I can't even beat a chess bot from the 90s.
The fact that someone created a mathematical model does not mean it is accurate, and even if a small piece of our brain conceptually resembles an ML model, that does not mean the two are equivalent.
It is an indisputable fact that our brains are completely, fundamentally different from computers. A CPU is just a bunch of transistors; our brains use both electrical and chemical signals. They are alive, and they can form new structures as they need them.
You can link fancy papers and write condescending replies all you want; the fact is that ChatGPT fails at extremely basic tasks precisely because it has absolutely no understanding of the text it spits out, even when its training data contains all the knowledge necessary to solve those tasks and much more.
I'm not saying we'll never make AGI; I'm simply saying LLMs are not it, not on their own anyway. I don't understand why you people are so opposed to that simple fact when the evidence is staring you in the face.