• What is consciousness?
Plus a zombie could present emotions and deep thoughts; he just wouldn't have the subjective experience of them. The only difference would probably be him not understanding what consciousness is. However, if the theory that consciousness arises from a very complex nervous system's processing and storing of information is correct, then a p-zombie is an impossibility, because having a nervous system complex enough to mimic human behavior completely and to deduce things would itself create consciousness. If this theory is correct, it is rather promising for the concept of conscious robots, since the brain is similar to a computer, and perhaps at the right level of complexity we will be able to create a robot that is conscious. [editline]27th November 2013[/editline] But I still can't quite comprehend how it goes from physical matter and energy interacting to a subjective experience like consciousness. This seems like one of the true mysteries of the universe, and one that suggests the possibility of something "supernatural" (but it'll probably be explained by some physical process in the future, and then I won't have any hope of a god :( )
Well if you follow my side of the discussion, you can see that I'm claiming that there isn't really any such thing as being able to subjectively experience things as some sort of special metaphysical concept. You just have the illusion of experiencing things because memories are formed of you experiencing things.
I generally agree with Ziks here. I don't think experience can be separated from the brain's functions; it's an emergent property of them. One of the reasons I think this is what our experiences [I]are[/I]. When certain things happen to us (e.g. eating chocolate, making love), we experience positive sensations, and when other things happen to us (e.g. accidentally cutting ourselves, being rejected by our social group) we experience negative sensations. It's no accident that the things we experience in a positive way are those that are evolutionarily advantageous to seek (eating energy-filled food makes us more likely to survive, reproducing propagates the species), and the things we experience in a negative way are those that are evolutionarily advantageous to avoid (being injured reduces our chances of survival, being outcast from the social group reduces our chance of survival as well as of reproducing). Experience, then, is very much related to the evolutionary context, and is shaped by what makes us most likely to survive and/or propagate the species. It's a process that emerged to enhance our survival in the same way that everything else about us did, not something that just "is" without explanation. And since evolution is a physical process, the effects of that process must also be physical, so our experience must be a result of something in our physical bodies; i.e. the brain. Just as a muscle is shaped by evolution to move mass around, so is the structure of the brain shaped by evolution to have experience emerge. Emergence is really the key thing here, and it's something that's difficult to wrap your head around. A simple example that I find helpful is [URL="http://en.wikipedia.org/wiki/Langton's_ant"]Langton's Ant[/URL].
You have an "ant" which moves around on an infinite grid of white squares, according to this set of rules:

- At a white square, turn 90° right, flip the color of the square, and move forward one unit
- At a black square, turn 90° left, flip the color of the square, and move forward one unit

This can be done on a computer extremely fast, and to begin with you just get a mess of white and black squares with no particular pattern. Eventually, though, it creates a "highway": a regular set of squares that moves out of the mess and continues forever: [IMG]http://upload.wikimedia.org/wikipedia/commons/d/d3/LangtonsAnt.png[/IMG] This will always happen, and if you put multiple ants on the grid or otherwise disrupt the highway in some way, eventually they'll get back to building another one. Importantly, the highway is emergent; that is, the rules weren't written in order to create the highway, and it wouldn't be possible for anyone to work out just by reading the rules that this orderly pattern would emerge; you have to actually run the system and watch it emerge by itself. But nevertheless, the highway is an intrinsic result of that set of rules - you can't have one without the other. The reason I bring this up is that I think it's the same kind of thing for experience, and by extension consciousness. Langton's Ant is relatively simple, but the brain is an extremely complex thing, probably one of the most complex things in the universe. It's like a vast 3-dimensional jigsaw puzzle, with ~20,000,000,000 neurons fitting together without gaps, all of them communicating with each other. With our current technology it'd take us 4,000,000 years to map the location of every neuron in a human brain (even a mouse's would take 4,000 years), let alone understand what they're all doing at any one time.
But as with Langton's Ant, from a complex system based on simple rules, something entirely unexpected can emerge in a way that you wouldn't predict just from looking at its component parts, be they the ant's rules or the individual neurons in the brain. To sum up, it doesn't make sense to me for a chair to have a sense of personal experience, any more than it makes sense for a chair to have a digestive system; a chair isn't affected by evolution, nor would it have an evolutionary reason for experiencing things if it were, nor does it have something as unimaginably complex as the brain from which experience could emerge. Assuming it's possible for a chair to have personal experience is us anthropomorphising and projecting ourselves onto it - it's not based in any kind of reality.
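For anyone who wants to watch the highway appear for themselves, the two rules above fit in a few lines of Python. This is just my own minimal sketch (the set-of-black-squares representation, step count, and starting heading are my choices, not part of the original definition):

```python
# Langton's Ant: the two rules above, run on an unbounded grid.
# Only the black squares are stored; every other square is white.

def langtons_ant(steps):
    black = set()               # coordinates of black squares
    x, y = 0, 0                 # ant position
    dx, dy = 0, -1              # ant heading (starts facing "up")
    for _ in range(steps):
        if (x, y) in black:     # black square: turn 90° left, flip to white
            dx, dy = dy, -dx
            black.discard((x, y))
        else:                   # white square: turn 90° right, flip to black
            dx, dy = -dy, dx
            black.add((x, y))
        x, y = x + dx, y + dy   # move forward one unit
    return black

# The chaotic phase lasts roughly 10,000 steps; after that the
# regular "highway" pattern repeats forever.
cells = langtons_ant(11000)
print(len(cells), "black squares")
```

Nothing in those rules mentions a highway; you only see it by running the system, which is exactly the point about emergence.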
I'd say something like being able to consider the meaning of consciousness would be a prerequisite for the highest level of consciousness, and then work your way down from there to babies, animals, etc.
Great answers! I am going to keep my responses short so as not to put us on a track of exponentially increasing mass of text. Just to clarify things: I heartily believe that everything regarding the mind resides within the confines of the central nervous system. Just so we are on the same page. [QUOTE=Ziks;42997900]Have you presented the thought experiment correctly? In your scenario consciousness arises from the individual in the room. You say that they use the literature to understand the symbols, so they would use their understanding combined with their social capabilities in their native language to decide on the best output. During the experience they are able to learn which input-output pairs were most successful (assuming the input stream isn't independent from previous input-output pairs). The usual formulation of the Chinese Room states that the individual in the room has no method of deducing the meaning of the symbols, but uses a library of books to follow instructions to produce a convincing output. The books represent an algorithm that passes the Turing Test. In this scenario the individual in the room takes the form of an information transmission medium, arithmetic logic unit, and whatever basic logical components are required by the algorithm. That they themselves are conscious is irrelevant. For the program to produce convincing outputs, it should be able to recall previous outputs (otherwise it would produce the same output for a given input each time, unless there was a non-deterministic factor). The algorithm would require working memory also, otherwise it would essentially be a lookup table (which wouldn't produce convincing outputs). 
So now we have a system that can receive stimuli (input slips), has a memory store (the original library of books, but also the record of previous inputs/outputs and working memory), and a combined generator / fitness evaluator in the form of the algorithm (it performs the same tasks as the generator and evaluator; using the memory store it produces a concept sequence considered to be fit). I would therefore be happy to call the overall system self aware and conscious (using my model of consciousness of course). It would evidently be a very limited form of consciousness, and very different to our own, but it has the ability to reflect on previous deliberative states as well as previous inputs (experiences). The individual is irrelevant, his consciousness is entirely distinct from the one emerging from the input / memory / algorithm / output system.[/QUOTE] Sorry, I just kept it very basic. It is sometimes hard to know how familiar people are with philosophical concepts. According to Searle, the process that takes place inside the Chinese Room does [I]not[/I] constitute a consciousness. The functions are all there but no emergent perceptions are made. It is all just information. [QUOTE=Ziks;42997900] If they can recall memories, crucially the memories of previous deliberative states, and form new ones using that recalled information, then they would be conscious using my definition of consciousness. You allude to some special extra phenomena, but could you describe it? If they produce the memories of being self aware then surely they are self aware? When I say "consciousness simply emerges" from a system with memory and a method of producing concepts from previous memories that produce new ones, I meant that as being what my definition of consciousness was. An entity that forms memories of deliberative states and can recall them for use in the production of new states. 
How complex and human-like this entity would be depends on your memory storage structure, deliberation method and the nature of the percepts it may receive and the responses it may produce. But I feel any system with the aforementioned properties is conscious and self aware, as that was my definition of consciousness. [/QUOTE] I am going to explain what I mean by "phenomenal consciousness", but first I'm going to borrow another quote from you because it is very related. [QUOTE=Ziks;42998984]Well if you follow my side of the discussion, you can see that I'm claiming that there isn't really any such thing as being able to subjectively experience things as some sort of special metaphysical concept. You just have the illusion of experiencing things because memories are formed of you experiencing things.[/QUOTE] We know that we process information, stimuli, emotions etc. at different cortical regions. But where the actual [I]experience[/I] is experienced is much more elusive and a lot harder to explain. I have a passport. It looks like any other passport, nothing special. Looking at it I can see that it sits on top of my black desk, I can see its reddish-brown colour, and if I lift it I can feel its texture and its weight. But all these properties are worked on in different areas of the brain. Every little detail of a percept has its own specialised area. But there is no unification of all the processed data. There is no nexus or central hub where it all converges. We can say that consciousness sits in the brain, but the closer we look, all we find are local functions in each particular area. You can visualise things without seeing them, you can hear songs that you remember. Memories are all well and good, but where are the memories actually experienced? In the brain, you say? Yes, certain neural regions are activated when we imagine things. But that is not what I meant. [I]Where[/I] do you see the car you are imagining? 
From [I]where[/I] do you hear the song you are silently humming? Inside your head? Yes. But can you point to it? Can you point to where you see the car, or to where you hear the song from? Can you point to where you dream? There is information emerging, yes. But where? And how? The same goes for the things that we actually see and hear. We can point to their direction of origin, but that is not [I]where[/I] the experience is actually taking place. I see my passport on my desk, but my consciousness is not out there on/inside the passport. The percept is [I]perceived[/I] inside my mind. What do we actually mean when we say that the mind emerges from neural substrates? We cannot point to where it emerges/transcends. And what is it that is emerging? It is all a massive biological clockwork. The signals are just signals. An electrochemical current flows along the axons, and neurotransmitter chemicals are released which excite or inhibit the neurons next in line. Where inside that mechanism is an active subjective consciousness capable of reason formed? If we assume that consciousness emerges from brain activity, consciousness itself is metaphysical. To build on your ideas about dreams (which I cut away to trim down the text a little bit): the dreams take place inside our heads, but when we sleep our consciousness is not "present" in our body per se. I'm not saying our minds leave the body to go somewhere else. But the dreams we experience are not located around the body lying in its comfortable bed (except during hallucinations etc.); the subjective perception of our dreams takes place elsewhere, like at a public place or a relative's house. But we aren't there. We are in our bed fast asleep, yet we experience things. As for your point about memories, it is true. Our dreams are nothing but memories. We can't, for example, dream of a person or a face that we haven't seen before. 
This is my argument for the phenomenal nature of the subjective consciousness including memories, the location of which is metaphysical (not the process but the experience). [QUOTE=Ziks;42997900]Did I claim my model required memory to be of a certain form? It only requires the ability to store concepts and percepts to be deliberated on later. The quality and duration of the storage would obviously affect the performance of the system, and an important aspect of a useful consciousness would be to attempt to store only the "useful" percepts, as otherwise our environment would be totally overwhelming. I'm not sure why you brought up the fact that the information recorded by the brain is often incorrect, but that is comfortably explained using my model of consciousness. The model makes no assumptions about the quality of the concept selection method, and some unfit concept sequences may be committed to memory (and so later recalled as being experienced) if whatever region of the brain evaluating the fitness of that concept sequence is simply over-exerted or under the influence of some psycho-active drug. Basically you experience hallucinations.[/QUOTE] Sorry, it was more of a clarifying statement than a counter argument. Good to see that your model is covering all its bases though! [QUOTE=Ziks;42997900]You not only retrieve memories of past external experiences, but of [i]previous deliberation[/i]. This is the point my model is entirely based upon. Self awareness is the ability to perceive your own thoughts as if they were any other external input. Imagine you look at a tree. [b]Sequence Generator[/b]: That is a car! [b]Sequence Evaluator[/b]: Nope, I'm not storing that concept so you will never experience it. [b]Sequence Generator[/b]: That is a tree! [b]Sequence Evaluator[/b]: That could be useful, I'll store that concept in working memory so that you will remember it later. The statement "That is a tree!" is now a thought that you will remember experiencing. 
[b]Sequence Generator[/b]: I just thought "That is a tree"! [b]Sequence Evaluator[/b]: That could be useful, I'll store that concept so you will recall it as being a thought. [b]Sequence Generator[/b]: I just thought "I just thought "That is a tree"!"! ... etc. I've omitted the huge number of unfit concept sequences generated, except the first. Also be aware that concepts may be from any sensory domain, so they could be images or sounds instead of just the natural-language sequences demonstrated here. But as you can see, being able to recall prior thoughts leads to self awareness. You will have the memories of thoughts related to self awareness stored, so you will have the belief that you experienced self awareness. [/QUOTE] Patients with associative visual agnosia usually have fully functional eyes. What makes them interesting, though, is that they are unable to bind the visual information they receive into a coherent object! They can sometimes describe the individual parts. If presented with a picture of a car they can even copy it, but they have no idea what they actually just drew. This is not a counter argument to your model or anything, because we know the neural correlates of this condition. I just wanted to share it because it was sort of related. Anyway. The model you have presented is a solid theory that is fundamentally based on computationalism. The theory was largely accepted in philosophical circles all over the world from the 1950s to the 1970s, but it (together with functionalism) lost traction in the late '80s and early '90s. Most of the philosophical theories of mind we have today touch on the phenomenological (metaphysical) nature of our perceived experience, which computationalism and functionalism can't account for. This is not to say that they have been abandoned! Not at all, they are alive and kicking with a substantial following within computer science circles. 
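Ziks's generator/evaluator exchange above can be written out as a toy loop. To be clear, this is only a sketch of the description in the quote: the candidate concepts are made up, and the `evaluate` function is a crude stand-in for whatever the real fitness evaluation would be, not a claim about how such a system would actually be built:

```python
import random

# Toy version of the sequence generator / sequence evaluator loop:
# only concepts the evaluator accepts are committed to memory, and
# only stored concepts count as "experienced" thoughts.

def generate(percept, memory):
    """Propose one candidate concept; most proposals are unfit."""
    candidates = [f"That is a {word}!" for word in ("car", "tree", "dog")]
    # Stored thoughts can be re-perceived, which is where the
    # self-referential 'I just thought ...' concepts come from.
    candidates += [f'I just thought "{m}"' for m in memory]
    return random.choice(candidates)

def evaluate(concept, percept):
    """Crude stand-in fitness test: keep concepts mentioning the percept."""
    return percept in concept

def deliberate(percept, memory, attempts=200):
    """Generate until the evaluator accepts a concept, then store it."""
    for _ in range(attempts):
        concept = generate(percept, memory)
        if evaluate(concept, percept):
            memory.append(concept)   # committed to memory = "experienced"
            return concept
    return None

memory = []
deliberate("tree", memory)   # stores "That is a tree!"
deliberate("tree", memory)   # may now store the thought about that thought
print(memory)
```

The point the sketch makes concrete is the recursion: once a thought is in memory, it becomes an input like any other, so thoughts about thoughts fall out of the same loop.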
There is a text written by Antti Revonsuo (2000) called [I]Prospects for a Scientific Research Program on Consciousness[/I]. Spanning a mere 18 pages (19 with sources), in it he talks about how to focus the attention of philosophers and scientists alike so that they can work alongside each other towards a common goal. His text is very easy to follow, and he takes care to define clearly what he is talking about. If you are interested (and I urge you and everyone else in this thread to read it) [URL=https://dl-web.dropbox.com/get/Skola/HiS/2012-2013/VT2013/Teorier%20om%20medvetandet%20KU320G/Revonsuo%2C%20A.%20%282000%29.pdf?w=AAC2dS-4RRWcTBhmbf-g0mfpDe4nao6msuuLE2bmzyUHHQ]you can find a .pdf of it on my dropbox here.[/URL] [editline]28th November 2013[/editline] [QUOTE=Tweevle;43000052]I generally agree with Ziks here. I don't think experience can be separated from the brain's functions; it's an emergent property of it. One of the reasons I think this is because of what our experiences [I]are[/I]. [...] Emergence is really the key thing here, and it's something that's difficult to wrap your head around. [...] [/QUOTE] This is the main problem. Emergence is assumed but not described. Nobody can work out how the neurons interpreting the flower I'm watching go on to reconstruct it inside a subjective mind. There is a massive gap in between. No doubt the brain does all the work, but how the results are presented is the big question and the real problem. [QUOTE=cis.joshb;42998858]Plus a zombie could present emotions and deep thoughts, he just wouldn't have the subjective experience of them. The only difference would probably be him not understanding what consciousness is. 
However, if the theory that consciousness arises from a very complex nervous system's processing and storing of information is correct, then a p-zombie is an impossibility, because having a nervous system complex enough to mimic human behavior completely and to deduce things would itself create consciousness. If this theory is correct, it is rather promising for the concept of conscious robots, since the brain is similar to a computer, and perhaps at the right level of complexity we will be able to create a robot that is conscious. [editline]27th November 2013[/editline] [B]But I still can't quite comprehend how it goes from physical matter and energy interacting to a subjective experience like consciousness.[/B] This seems like one of the true mysteries of the universe, and one that suggests the possibility of something "supernatural" (but it'll probably be explained by some physical process in the future, and then I won't have any hope of a god :( )[/QUOTE] This guy gets it! I can also add that I share your sentiments. [editline]28th November 2013[/editline] I would go as far as to say that it is the greatest mystery of our time. Nothing is closer to ourselves than our own mind, and to know so little about it is both frustrating and exhilarating. I am very content with my choice to study cognitive neuroscience, to say the least.
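Going back to the Chinese Room for a second: the reason a bare lookup table wouldn't produce convincing outputs, as Ziks noted, is that it answers a repeated input identically every time, while an algorithm with a record of past exchanges doesn't have to. A toy sketch of just that structural difference (the messages and replies are invented for illustration):

```python
# A memoryless lookup table answers a repeated input identically every
# time, which is exactly what would give it away in a long conversation.
TABLE = {"how are you?": "fine, thanks"}

def lookup_room(message):
    return TABLE.get(message, "I don't understand")

# A (still trivial) room with a record of past exchanges: its reply can
# depend on the whole conversation so far, not just the current input.
class StatefulRoom:
    def __init__(self):
        self.history = []            # record of previous inputs

    def reply(self, message):
        seen = self.history.count(message)
        self.history.append(message)
        if seen == 0:
            return "fine, thanks"
        return f"you already asked me that {seen} time(s)"

room = StatefulRoom()
print(lookup_room("how are you?"), "/", lookup_room("how are you?"))
print(room.reply("how are you?"), "/", room.reply("how are you?"))
```

Neither version is anywhere near passing a Turing Test, of course; the sketch only shows why the thought experiment has to give the room memory before the behavioural question even gets interesting.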
Maybe you are describing some property of mind that I do not possess. You describe this mysterious and wonderful but unexplained "subjective experience" as some extra layer on top of pure self awareness. Something that is purely internal and cannot be observed from outside your mind, something that can't be determined from your behaviour. Something that must exist because you know it exists, although it can't be described to or imagined by any entity that does not possess it. The only things I am "aware" of are the memories of things I perceive (or at least interpretations of percepts) and memories of thoughts I produce. Because these percepts and concepts are in memory, I believe I have experienced them and can produce phrases such as "I am aware". But I apparently do not possess the special and extraordinary consciousness that you allude to; I am just a philosophical zombie. I may be able to produce the same behaviour as something that is truly conscious, but my actions are just the mechanical productions of a machine that cannot truly "experience" anything. I may say that I can see images inside my mind of memories I recall, but that is purely through additional memories being stored of viewing those images. I may claim to be self aware, but that is just because my memory encodes the definition of self awareness, recognises that it applies to myself, and decides that vocalising a lingual representation of that recognition may be beneficial. I envy those fortunate individuals who are truly conscious; I wonder what it would be like to truly experience life and not just have the illusion of experiencing it.
[QUOTE=onebit;43007568]Sorry this is offtopic, but I disagree with your point. I think duality plays a role in truth, as there are two truths. For example, we are gods on the molecular level, while we practically don't exist on a universal level / scale / perspective.[/QUOTE] I phrased that pretty badly too, I meant to say that if one of the mutually exclusive ideas is correct the other cannot be. Your two statements aren't mutually exclusive.
[QUOTE=onebit;43007857]Yes they are. It's true that I am a god on a molecular level as I control it and the molecule simply is too small to perceive me, while at the universal level I'm too small to make any lasting impact.[/QUOTE] So are these the two statements? 1. "I am a god at the molecular level" 2. "I am insignificant at a universal level" How are they mutually exclusive? These two statements [I]are[/I] mutually exclusive: 1. "I am a god at the molecular level" 2. "I am insignificant at the molecular level"
[QUOTE=onebit;43007937]it's the same world[/QUOTE] Your two statements are asserting the values of completely different attributes. Would you say these two statements are mutually exclusive? 1. "My hair is brown" 2. "My eyes are blue"
[QUOTE=onebit;43008050]In this universe I am God for this molecule. In this universe I am insignificant for this sun. Therefore I am God and insignificant. Both are true at the same time.[/QUOTE] In this room my shirt is red. In this room my trousers are blue. Therefore my clothes are red and my clothes are blue. Both are true at the same time.
[QUOTE=onebit;43009465]Your arguments don't contradict.[/QUOTE] That's what I'm saying, your two statements don't contradict either. Your first statement says that you are a god over some aspect of the universe, and your second statement says you are insignificant compared to some [i]different[/i] aspect of the universe. Your statements aren't mutually exclusive.
[QUOTE=SilverBullet;42991290]You are thinking of everything I say in too much of a linear, rational sense. Of course it's going to sound like a bunch of crazy talk if you place a huge amount of importance on everything you think, say, and do. Self-importance is our worst enemy on the path of knowledge. It makes us place our views of reality above others'. Which makes us fixed, and stagnant. In a way I think you know exactly what I'm talking about.[/QUOTE] Maybe... But I would say we have a pretty good view of things, or sense of reality. Maybe not all people, but science has given us a lot of answers, almost to the point where it's mind-blowing in itself, but we're just used to it like it's nothing.