Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

The turing test is still a thing. No llm could pass for a person for more than a couple minutes of chatting. That’s a world of difference compared to a decade ago, but I would emphatically not call that “passing the turing test”

Also, none of the other things you mentioned have actually happened. Don’t really know why I bother responding to this stuff





Ironically the main tell of LLMs is that are too smart and write too well. No human can discuss the depth of topics they can and no humans writes like a author/journalist all the time.

i.e. the tell that it's not human is that it is too perfectly human.

However if we could transport people from 2012 to today to run the test on them, none would guess the LLM output was from a computer.


That’s not the Turing Test; it’s just vaguely related. The Turing Test is an interactive party game of persuasion and deception, sort of like playing a werewolves versus villagers game. Almost nobody actually plays the game.

Also, the skill of the human opponents matters. There’s a difference between testing a chess bot against randomly selected college undergrads versus chess grandmasters.

Just like jailbreaks are not hard to find, figuring out exploits to get LLM’s to reveal themselves probably wouldn’t be that hard? But to even play the game at all, someone would need to train LLM’s that don’t immediately admit that they’re bots.


Yesterday I stumbled onto a well written comment on reddit, it was a bit contrarian, but good. Then I was curious and looked at their comment history and found it was a one month old account with many comments of similar length and structure. I put a LLM to read that feed and they spotted LLM writing, and the argument? it was displaying too broad a knowledge across topics. Yes, it gave itself up by being too smart. Does that count as Turing test fail?

> No llm could pass for a person for more than a couple minutes of chatting

I strongly doubt this. If you gave it an appropriate system prompt with instructions and examples on how to speak in a certain way (something different from typical slop, like the way a teenager chats on discord or something), I'm quite sure it could fool the majority of people




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: