> The problem is that without any philosophical backup, the neural correlate is entirely uncompelling: it's merely a clinical definition of a measure of... nothing meaningful.
Gravity is arguably a more compelling concept to someone falling off a ladder than to a student in a high school physics class. But so what? Does gravity lose meaning at a scale too large or too small to fit human experience?
The physics has obvious relevance to the experience and leaves no experiential questions unanswered. There's no problem with gravity.
I'm not looking for a "why" (the common complaint about gravity); I just want all the "whats" to match up. The meaninglessness is relative to the question of consciousness: if you supply the neural correlates to the asker, they could rightly reply, "Sure, but what about my question?" Perhaps there is no reconciliation, and you might want to object that the question is nonsense; that objection would make a better answer than a collection of irrelevant facts.
You are right; I should have gone straight to the heart of the matter: an intelligence arising in machines or in an exobiological system (or, for that matter, in a sufficiently different animal) seems very unlikely to function like human intelligence. Does that mean the only philosophically valid intelligence is human intelligence, or perhaps that of a genetically manipulated primate?