
Well, that's interesting, because there's no good scientific definition of consciousness.


There is no philosophical definition of consciousness, but there is none for gravity either, in the sense of "what gravity really is". Science doesn't examine what things "really are"; it tries to make useful predictions, and the definitions it employs are only a means to that end. In fact there are operational definitions of consciousness, and I think they certainly deserve to be called "scientific". See, for example:

https://en.wikipedia.org/wiki/Consciousness#Neural_correlate...

https://en.wikipedia.org/wiki/Consciousness#Defining_conscio...


The problem is without any philosophical backup the neural correlate is entirely uncompelling—it's merely a clinical definition of a measure of... nothing meaningful. One that can be just as easily applied to a recording device, modulo the explicit mention of neurons.

It gets worse. When you look at it closely there's no useful reason to restrict any of these measures with arbitrary criteria to the things we consider candidates for consciousness, other than to reaffirm our prejudices.

In other words, there's no such thing as a good scientific definition without the philosophical context. You can have as accurate an arbitrary measure as you like, but that doesn't imbue it equally arbitrarily with your desired meaning.


The problem is that multiple times in the history of science, the search for philosophically sound underpinnings led to stagnation rather than progress. That was to a large extent what Aristotle did, and it was the main obstacle to the development of science altogether. A more recent example is the search for the aether. In the end, the way science has progressed is by dropping too many "whys" and sticking to the "hows".

There might never be a philosophically satisfactory definition of consciousness. The subjective experience we call consciousness might be an amalgam of very different things happening at the brain level, and a single definition might just not do. Meanwhile, operational definitions like the ones I mentioned allow us to make useful predictions and draw conclusions - for example, that the brain activity that occurs when we experience states most people would describe as consciousness also occurs in certain animals. It is not philosophically perfect, but again, you could have a long philosophical debate about what gravity is and never get anywhere scientifically - that's precisely what people did before the scientific revolution.


I think you're confused. I'm not asking for philosophical proofs end-to-end, I'm asking for the framing.

If you look at the formulation, it starts out as a magical-seeming property, yet the definitions and processes do nothing to demystify that property. Look at that process carefully and you'll find the deceit: this is not an answer to the question of consciousness as asked. With the right philosophical treatment there might be hope of reconciliation, but without it the concept is just going to remain magical, without meaningful conclusions.

Aether is a great example, one I had in mind. Consciousness is very much like aether in some ways. It may well be that the only useful scientific thing to say about it at the end of the day is: it is not a useful concept to science. Much better than the contrivances offered up with no compelling connection to the subject.

The last word in TFA is 'qualia'. This is the problem. Canonical definitions of that term describe it as impossible to measure, or simply ineffable, which effectively puts it off the table for a scientific treatment. Regardless of what you think of what should constitute a scientific concept, the implication that these measurements alone elucidate an ineffable phenomenon is exactly the kind of thing that stinks of bullshit.


The problem is that the answers you are looking for are philosophical in nature, not scientific (you talk about "demystifying" consciousness, "elucidating" it). The neural correlates might not clarify what consciousness is, but they might yield answers to precise scientific questions, such as: what brain activity is necessary and sufficient for a person to be able to demonstrate self- or world-awareness? We might need to answer such questions before we gain any new insights of a more philosophical nature. I am not saying this method is a silver bullet, but I certainly cannot agree that the results are meaningless or "only reaffirm our prejudices".

My only point is really that the lack of a great definition of consciousness doesn't diminish the value of research like the one cited here.


> The neural correlates might not clarify what consciousness is, but they might yield answers to precise scientific questions

I don't deny that at all, in fact it is quite precisely what I endorse. Notice that the term consciousness lies on the left side, the excluded part. On the right side, you use the better-defined term, awareness. A definition that comes with better philosophical understanding. TFA, however, talks of qualia.

> We might need to first answer to such questions before we gain any new insights of more philosophical nature.

I would say "different" rather than "more." Philosophy doesn't mean "weird stuff we don't really understand" and it can often be as boring as the implications of simple arithmetic or even the logic used in scientific endeavors. I wouldn't want to throw that out in the name of progress either.

> I cannot agree the results are meaningless or "only reaffirm our prejudices".

The results described by the article and supported here are of the form "consciousness is X" where no question was asked that is answerable directly in terms of X, and no reconciliation has been made. That is the sense of meaninglessness I'm talking about. If you're still in doubt, or think that's somehow unimportant, grab the bull by the horns and deal with the implication that this is somehow ultimately a measure of qualia.

> My only point is really that the lack of a great definition of consciousness doesn't diminish the value of research like the one cited here.

In that phrasing I am almost in agreement, if it weren't for some of the claims made. Some very interesting things are being measured, but going from these measurements to things like qualia is a leap I can't justify. Correlations between these measures and alertness, intelligence, and kinds of awareness are easy to establish or contradict, and better yet: given those connections, who is going to say "yeah, but what are these results over here? It looks like manifest experience!"?


What's "philosophical backup"? Scientists normally get by without any help from philosophers, so why do they need to get philosophers' imprimatur before they can use the concept of 'consciousness'?

Why doesn't the concept of, say, 'the gene' also need this backup? Or maybe all of those geneticists are also stumbling around in the dark without philosophers to guide them...


The gene does, just like everything else, but we don't ascribe magical properties to the gene that are not readily accessible and coherent with the philosophical context. If you tried to use the gene to define God, then you would have the philosophical disconnect I'm talking about.


> The problem is without any philosophical backup the neural correlate is entirely uncompelling—it's merely a clinical definition of a measure of... nothing meaningful.

Gravity is arguably a more compelling concept to someone falling off a ladder than to a student in a high school physics class. But, so? Does gravity lose meaning on a scale that is too large or too small to fit human experience?


The physics has obvious relevance to the experience and leaves no experiential questions unanswered. There's no problem with gravity.

I'm not looking for a why—the common complaint about gravity—I just want all the whats to match up. The meaninglessness is in reference to the question of consciousness. If you supply this to the asker, they could rightly ask, "sure, but what about my question?" Perhaps there is no reconciliation and you might want to complain that the question is nonsense—that complaint would make a better answer than a collection of irrelevant facts.


You are right. I should have gone straight to the heart of the matter: An intelligence arising in machines or in an exobiological system (or, for that matter, in a sufficiently different animal) seems very unlikely to function like human intelligence. Does that mean the only philosophically valid intelligence is human, or maybe in a genetically manipulated primate?


There's no definition of consciousness that 1) is not subjective (i.e., is scientific), 2) matches our intuitive understanding of consciousness, and 3) doesn't contain logical inconsistencies or circular reasoning.

And there are good arguments suggesting that there will never be such a definition (in summary: consciousness is inherently subjective).


No, they are not. Omitting qualia disqualifies almost all existing models of consciousness. One of the few I know of that attempts an explanation of this phenomenon is Tononi's IIT [1], but it is of course criticised for being too broad a definition.

[1] http://en.wikipedia.org/wiki/Integrated_Information_Theory


There aren't any scientific definitions of consciousness that don't derive entirely from the empirical study of people who have been declared conscious (philosophically).




