Consciousness
Examples of the range of descriptions, definitions or explanations are: ordered distinction between
self and environment, simple wakefulness, one's sense of selfhood or soul explored by "looking
within"; being a metaphorical "stream" of contents, or being a mental state, mental event, or mental
process of the brain.
Etymology
The words "conscious" and "consciousness" in the English language date to the 17th century, and
the first recorded use of "conscious" as a simple adjective was applied figuratively to inanimate
objects ("the conscious Groves", 1643).[6]: 175 It derived from the Latin conscius (con- "together" and
scio "to know") which meant "knowing with" or "having joint or common knowledge with another",
especially as in sharing a secret.[7] Thomas Hobbes in Leviathan (1651) wrote: "Where two, or more
men, know of one and the same fact, they are said to be Conscious of it one to another".[8] There
were also many occurrences in Latin writings of the phrase conscius sibi, which translates literally
as "knowing with oneself", or in other words "sharing knowledge with oneself about something".
This phrase has the figurative sense of "knowing that one knows", which is something like the
modern English word "conscious", but it was rendered into English as "conscious to oneself" or
"conscious unto oneself". For example, Archbishop Ussher wrote in 1613 of "being so conscious
unto myself of my great weakness".[9]
The Latin conscientia, literally 'knowledge-with', first appears in Roman juridical texts by writers such
as Cicero. It means a kind of shared knowledge with moral value, specifically what a witness knows
of someone else's deeds.[10][11] Although René Descartes (1596–1650), writing in Latin, is generally
taken to be the first philosopher to use conscientia in a way less like the traditional meaning and
more like the way modern English speakers would use "conscience", his meaning is nowhere
defined.[12] In Search after Truth (Regulæ ad directionem ingenii ut et inquisitio veritatis per lumen
naturale, Amsterdam 1701) he wrote the word with a gloss: conscientiâ, vel interno testimonio
(translatable as "conscience, or internal testimony").[13][14] It might mean the knowledge of the value
of one's own thoughts.[12]
The French term conscience is defined roughly like English "consciousness" in the 1753 volume of
Diderot and d'Alembert's Encyclopédie as "the opinion or internal feeling that we ourselves have
from what we do".[18]
Problem of definition
Scholars are divided as to whether Aristotle had a concept of consciousness. He does not use any
single word or terminology that is clearly similar to the phenomenon or concept defined by John
Locke. Victor Caston contends that Aristotle did have a concept more clearly similar to
perception.[19]
Modern dictionary definitions of the word consciousness evolved over several centuries and reflect a
range of seemingly related meanings, with some differences that have been controversial, such as
the distinction between inward awareness and perception of the physical world, or the distinction
between conscious and unconscious, or the notion of a mental entity or mental activity that is not
physical.
One representative dictionary entry, for example, distinguishes several related senses (the first quoted here only in part):
1. [...] concerned awareness; INTEREST, CONCERN—often used with an attributive noun [e.g. class consciousness]
2. the state or activity that is characterized by sensation, emotion, volition, or thought; mind in the
broadest possible sense; something in nature that is distinguished from the physical
3. the totality in psychology of sensations, perceptions, ideas, attitudes, and feelings of which an
individual or a group is aware at any given time or within a particular time span—
4. waking life (as that to which one returns after sleep, trance, fever) wherein all one's mental
powers have returned . . .
5. the part of mental life or psychic content in psychoanalysis that is immediately available to the
ego—
The Cambridge English Dictionary defines consciousness as "the state of being awake, thinking, and
knowing what is happening around you", as well as "the state of understanding and realizing
something".[20] The Oxford Living Dictionary defines consciousness as "[t]he state of being aware of
and responsive to one's surroundings", "[a] person's awareness or perception of something", and "
[t]he fact of awareness by the mind of itself and the world".[21]
Philosophers have attempted to clarify technical distinctions by using a jargon of their own. The
corresponding entry in the Routledge Encyclopedia of Philosophy (1998) reads:
Consciousness
Philosophers have used the term consciousness for four main topics: knowledge in general,
intentionality, introspection (and the knowledge it specifically generates) and phenomenal
experience... Something within one's mind is 'introspectively conscious' just in case one
introspects it (or is poised to do so). Introspection is often thought to deliver one's primary
knowledge of one's mental life. An experience or other mental entity is 'phenomenally conscious'
just in case there is 'something it is like' for one to have it. The clearest examples are: perceptual
experience, such as tastings and seeings; bodily-sensational experiences, such as those of pains,
tickles and itches; imaginative experiences, such as those of one's own actions or perceptions;
and streams of thought, as in the experience of thinking 'in words' or 'in images'. Introspection
and phenomenality seem independent, or dissociable, although this is controversial.[22]
During the early 19th century, the emerging field of geology inspired a popular metaphor that the
mind likewise had hidden layers "which recorded the past of the individual".[23]: 3 By 1875, most
psychologists believed that "consciousness was but a small part of mental life",[23]: 3 and this idea
underlies the goal of Freudian therapy, to expose the unconscious layer of the mind.
Other metaphors from various sciences inspired other analyses of the mind, for example: Johann
Friedrich Herbart described ideas as being attracted and repulsed like magnets; John Stuart Mill
developed the idea of "mental chemistry" and "mental compounds", and Edward B. Titchener sought
the "structure" of the mind by analyzing its "elements". The abstract idea of states of consciousness
mirrored the concept of states of matter.
In 1892, William James noted that the "ambiguous word 'content' has been recently invented instead
of 'object'" and that the metaphor of mind as a container seemed to minimize the dualistic problem
of how "states of consciousness can know" things, or objects;[24]: 465 by 1899 psychologists were
busily studying the "contents of conscious experience by introspection and experiment".[25]: 365
Another popular metaphor was James's doctrine of the stream of consciousness, with continuity,
fringes, and transitions.[24]: vii [a]
James discussed the difficulties of describing and studying psychological phenomena, recognizing
that commonly used terminology was a necessary and acceptable starting point towards more
precise, scientifically justified language. Prime examples were phrases like inner experience and
personal consciousness:
The first and foremost concrete fact which every one will affirm to belong to his
inner experience is the fact that consciousness of some sort goes on. 'States of mind'
succeed each other in him. [...] But everyone knows what the terms mean [only] in a
rough way; [...] When I say every 'state' or 'thought' is part of a personal
consciousness, 'personal consciousness' is one of the terms in question. Its meaning
we know so long as no one asks us to define it, but to give an accurate account of it
is the most difficult of philosophic tasks. [...] The only states of consciousness that
we naturally deal with are found in personal consciousnesses, minds, selves,
concrete particular I's and you's.[24]: 152–153
Prior to the 20th century, philosophers treated the phenomenon of consciousness as the "inner
world [of] one's own mind", and introspection was the mind "attending to" itself,[b] an activity
seemingly distinct from that of perceiving the 'outer world' and its physical phenomena. In 1892
William James noted the distinction along with doubts about the inward character of the mind:
'Things' have been doubted, but thoughts and feelings have never been doubted.
The outer world, but never the inner world, has been denied. Everyone assumes
that we have direct introspective acquaintance with our thinking activity as such,
with our consciousness as something inward and contrasted with the outer objects
which it knows. Yet I must confess that for my part I cannot feel sure of this
conclusion. [...] It seems as if consciousness as an inner activity were rather a
postulate than a sensibly given fact...[24]: 467
By the 1960s, for many philosophers and psychologists who talked about consciousness, the word
no longer meant the 'inner world' but an indefinite, large category called awareness, as in the
following example:
It is difficult for modern Western man to grasp that the Greeks really had no
concept of consciousness in that they did not class together phenomena as varied as
problem solving, remembering, imagining, perceiving, feeling pain, dreaming, and
acting on the grounds that all these are manifestations of being aware or being
conscious.[27]: 4
Many philosophers and scientists have been unhappy about the difficulty of producing a definition
that does not involve circularity or fuzziness.[28] In The Macmillan Dictionary of Psychology (1989
edition), Stuart Sutherland emphasized external awareness, and expressed a skeptical attitude more
than a definition.
Influence on research
Many philosophers have argued that consciousness is a unitary concept that is understood by the
majority of people despite the difficulty philosophers have had defining it.[32] The term 'subjective
experience', following Nagel, is ambiguous, as philosophers seem to differ from non-philosophers
in their intuitions about its meaning.[33] Max Velmans proposed that the "everyday understanding of
consciousness" uncontroversially "refers to experience itself rather than any particular thing that we
observe or experience" and he added that consciousness "is [therefore] exemplified by all the things
that we observe or experience",[34]: 4 whether thoughts, feelings, or perceptions. Velmans noted
however, as of 2009, that there was a deep level of "confusion and internal division"[34] among
experts about the phenomenon of consciousness, because researchers lacked "a sufficiently well-
specified use of the term...to agree that they are investigating the same thing".[34]: 3 He argued
additionally that "pre-existing theoretical commitments" to competing explanations of
consciousness might be a source of bias.
Within the "modern consciousness studies" community the technical phrase 'phenomenal
consciousness' is a common synonym for all forms of awareness, or simply 'experience',[34]: 4
without differentiating between inner and outer, or between higher and lower types. With advances
in brain research, "the presence or absence of experienced phenomena"[34]: 3 of any kind underlies
the work of those neuroscientists who seek "to analyze the precise relation of conscious
phenomenology to its associated information processing" in the brain.[34]: 10 This neuroscientific
goal is to find the "neural correlates of consciousness" (NCC). One criticism of this goal is that it
begins with a theoretical commitment to the neurological origin of all "experienced phenomena"
whether inner or outer.[c] Also, the fact that the easiest 'content of consciousness' to be so analyzed
is "the experienced three-dimensional world (the phenomenal world) beyond the body surface"[34]: 4
invites another criticism, that most consciousness research since the 1990s, perhaps because of
bias, has focused on processes of external perception.[36]
From a history of psychology perspective, Julian Jaynes rejected popular but "superficial views of
consciousness"[2]: 447 especially those which equate it with "that vaguest of terms, experience".[23]: 8
In 1976 he insisted that if not for introspection, which for decades had been ignored or taken for
granted rather than explained, there could be no "conception of what consciousness is"[23]: 18 and in
1990, he reaffirmed the traditional idea of the phenomenon called 'consciousness', writing that "its
denotative definition is, as it was for René Descartes, John Locke, and David Hume, what is
introspectable".[2]: 450 Jaynes saw consciousness as an important but small part of human
mentality, and he asserted: "there can be no progress in the science of consciousness until ... what
is introspectable [is] sharply distinguished"[2]: 447 from the unconscious processes of cognition such
as perception, reactive awareness and attention, and automatic forms of learning, problem-solving,
and decision-making.[23]: 21-47
The cognitive science point of view—with an inter-disciplinary perspective involving fields such as
psychology, linguistics and anthropology[37]—requires no agreed definition of "consciousness" but
studies the interaction of many processes besides perception. For some researchers,
consciousness is linked to some kind of "selfhood", for example to certain pragmatic issues such as
the feeling of agency and the effects of regret[36] and action on experience of one's own body or
social identity.[38] Similarly, Daniel Kahneman, who focused on systematic errors in perception,
memory and decision-making, has differentiated between two kinds of mental processes, or
cognitive "systems":[39] the "fast" activities that are primary, automatic and "cannot be turned
off",[39]: 22 and the "slow", deliberate, effortful activities of a secondary system "often associated with
the subjective experience of agency, choice, and concentration".[39]: 13 Kahneman's two systems
have been described as "roughly corresponding to unconscious and conscious processes".[40]: 8 The
two systems can interact, for example in sharing the control of attention.[39]: 22 While System 1 can
be impulsive, "System 2 is in charge of self-control",[39]: 26 and "When we think of ourselves, we
identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides
what to think about and what to do".[39]: 21
Some have argued that we should eliminate the concept from our understanding of the mind, a
position known as consciousness semanticism.[41]
Philosophy of mind
While historically philosophers have defended various views on consciousness, surveys indicate
that physicalism is now the dominant position among contemporary philosophers of mind.[44] Overviews of the field often combine historical perspectives (e.g., Descartes, Locke, Kant) with an organization around key issues in contemporary debates; an alternative is to focus primarily on current philosophical stances and empirical findings.
Philosophers differ from non-philosophers in their intuitions about what consciousness is.[45] While
most people have a strong intuition for the existence of what they refer to as consciousness,[32]
skeptics argue that this intuition is mistaken, either because the concept of consciousness is itself incoherent, or because our intuitions about it are based in illusions. Gilbert Ryle, for example, argued that
traditional understanding of consciousness depends on a Cartesian dualist outlook that improperly
distinguishes between mind and body, or between mind and world. He proposed that we speak not
of minds, bodies, and the world, but of entities, or identities, acting in the world. Thus, by speaking of
"consciousness" we end up leading ourselves by thinking that there is any sort of thing as
consciousness separated from behavioral and linguistic understandings.[46]
Types
Ned Block argues that discussions on consciousness have often failed properly to distinguish
phenomenal consciousness from access consciousness. The terms had been used before Block
used them, but he adopted the short forms P-consciousness and A-consciousness.[47] According to
Block:
A-consciousness is the phenomenon whereby information in our minds is accessible for verbal
report, reasoning, and the control of behavior. So, when we perceive, information about what we
perceive is access conscious; when we introspect, information about our thoughts is access
conscious; when we remember, information about the past is access conscious, and so on.
Block adds that P-consciousness does not allow of easy definition: he admits that he "cannot define
P-consciousness in any remotely noncircular way".[47]
Although some philosophers, such as Daniel Dennett, have disputed the validity of this
distinction,[48] others have broadly accepted it. David Chalmers has argued that A-consciousness
can in principle be understood in mechanistic terms, but that understanding P-consciousness is
much more challenging: he calls this the hard problem of consciousness.[49]
Some philosophers believe that Block's two types of consciousness are not the end of the story.
William Lycan, for example, argued in his book Consciousness and Experience that at least eight
clearly distinct types of consciousness can be identified (organism consciousness; control
consciousness; consciousness of; state/event consciousness; reportability; introspective
consciousness; subjective consciousness; self-consciousness)—and that even this list omits
several more obscure forms.[50]
There is also debate over whether or not A-consciousness and P-consciousness always coexist or if
they can exist separately. Although P-consciousness without A-consciousness is more widely
accepted, there have been some hypothetical examples of A without P. Block, for instance, suggests
the case of a "zombie" that is computationally identical to a person but without any subjectivity.
However, he remains somewhat skeptical, concluding: "I don't know whether there are any actual
cases of A-consciousness without P-consciousness, but I hope I have illustrated their conceptual
possibility".[51]
Sam Harris observes: "At the level of your experience, you are not a body of cells, organelles, and
atoms; you are consciousness and its ever-changing contents".[52] Seen in this way, consciousness
is a subjectively experienced, ever-present field in which things (the contents of consciousness)
come and go.
Christopher Tricker argues that this field of consciousness is symbolized by the mythical bird that
opens the Daoist classic the Zhuangzi. This bird's name is Of a Flock (peng 鵬), yet its back is
countless thousands of miles across and its wings are like clouds arcing across the heavens. "Like
Of a Flock, whose wings arc across the heavens, the wings of your consciousness span to the
horizon. At the same time, the wings of every other being's consciousness span to the horizon. You
are of a flock, one bird among kin."[53]
Mind–body problem
Mental processes (such as consciousness) and physical processes (such as brain events) seem to be correlated; however, the specific nature of the connection is unknown.
The first influential philosopher to discuss this question specifically was Descartes, and the answer
he gave is known as mind–body dualism. Descartes proposed that consciousness resides within an
immaterial domain he called res cogitans (the realm of thought), in contrast to the domain of
material things, which he called res extensa (the realm of extension).[54] He suggested that the
interaction between these two domains occurs inside the brain, perhaps in a small midline structure
called the pineal gland.[55]
Although it is widely accepted that Descartes explained the problem cogently, few later
philosophers have been happy with his solution, and his ideas about the pineal gland have
especially been ridiculed.[55] However, no alternative solution has gained general acceptance.
Proposed solutions can be divided broadly into two categories: dualist solutions that maintain
Descartes's rigid distinction between the realm of consciousness and the realm of matter but give
different answers for how the two realms relate to each other; and monist solutions that maintain
that there is really only one realm of being, of which consciousness and matter are both aspects.
Each of these categories itself contains numerous variants. The two main types of dualism are
substance dualism (which holds that the mind is formed of a distinct type of substance not
governed by the laws of physics), and property dualism (which holds that the laws of physics are
universally valid but cannot be used to explain the mind). The three main types of monism are
physicalism (which holds that the mind is made out of matter), idealism (which holds that only
thought or experience truly exists, and matter is merely an illusion), and neutral monism (which
holds that both mind and matter are aspects of a distinct essence that is itself identical to neither of
them). There are also, however, a large number of idiosyncratic theories that cannot cleanly be
assigned to any of these schools of thought.[56]
Since the dawn of Newtonian science with its vision of simple mechanical principles governing the
entire universe, some philosophers have been tempted by the idea that consciousness could be
explained in purely physical terms. The first influential writer to propose such an idea explicitly was
Julien Offray de La Mettrie, in his book Man a Machine (L'homme machine). His arguments, however,
were very abstract.[57] The most influential modern physical theories of consciousness are based on
psychology and neuroscience. Theories proposed by neuroscientists such as Gerald Edelman[58]
and Antonio Damasio,[59] and by philosophers such as Daniel Dennett,[60] seek to explain
consciousness in terms of neural events occurring within the brain. Many other neuroscientists,
such as Christof Koch,[61] have explored the neural basis of consciousness without attempting to
frame all-encompassing global theories. At the same time, computer scientists working in the field
of artificial intelligence have pursued the goal of creating digital computer programs that can
simulate or embody consciousness.[62]
A few theoretical physicists have argued that classical physics is intrinsically incapable of
explaining the holistic aspects of consciousness, but that quantum theory may provide the missing
ingredients. Several theorists have therefore proposed quantum mind (QM) theories of
consciousness.[63] Notable theories falling into this category include the holonomic brain theory of
Karl Pribram and David Bohm, and the Orch-OR theory formulated by Stuart Hameroff and Roger
Penrose. Some of these QM theories offer descriptions of phenomenal consciousness, as well as
QM interpretations of access consciousness. None of the quantum mechanical theories have been
confirmed by experiment. Recent publications by G. Guerreschi, J. Cai, S. Popescu, and H. Briegel[64] could falsify proposals such as those of Hameroff, which rely on quantum entanglement in proteins.
At the present time many scientists and philosophers consider the arguments for an important role of quantum phenomena to be unconvincing.[65] Empirical evidence also weighs against the notion of quantum consciousness: an experiment on wave function collapse led by Catalina Curceanu in 2022 suggests that quantum consciousness, as proposed by Roger Penrose and Stuart Hameroff, is highly implausible.[66]
Apart from the general question of the "hard problem" of consciousness (which is, roughly speaking,
the question of how mental experience can arise from a physical basis[67]), a more specialized
question is how to square the subjective notion that we are in control of our decisions (at least in
some small measure) with the customary view of causality that subsequent events are caused by
prior events. The topic of free will is the philosophical and scientific examination of this conundrum.
Many philosophers consider experience to be the essence of consciousness, and believe that
experience can only fully be known from the inside, subjectively. The problem of other minds is a
philosophical problem traditionally stated as the following epistemological question: Given that I
can only observe the behavior of others, how can I know that others have minds?[68] The problem of
other minds is particularly acute for people who believe in the possibility of philosophical zombies,
that is, people who think it is possible in principle to have an entity that is physically
indistinguishable from a human being and behaves like a human being in every way but
nevertheless lacks consciousness.[69] Related issues have also been studied extensively by Greg
Littmann of the University of Illinois,[70] and by Colin Allen (a professor at the University of
Pittsburgh) regarding the literature and research studying artificial intelligence in androids.[71]
The most commonly given answer is that we attribute consciousness to other people because we
see that they resemble us in appearance and behavior; we reason that if they look like us and act
like us, they must be like us in other ways, including having experiences of the sort that we do.[72]
There are, however, a variety of problems with that explanation. For one thing, it seems to violate the
principle of parsimony, by postulating an invisible entity that is not necessary to explain what we
observe.[72] Some philosophers, such as Daniel Dennett in a research paper titled "The Unimagined
Preposterousness of Zombies", argue that people who give this explanation do not really
understand what they are saying.[73] More broadly, philosophers who do not accept the possibility of
zombies generally believe that consciousness is reflected in behavior (including verbal behavior),
and that we attribute consciousness on the basis of behavior. A more straightforward way of saying
this is that we attribute experiences to people because of what they can do, including the fact that
they can tell us about their experiences.[74]
Qualia
The term "qualia" was introduced in philosophical literature by C. I. Lewis. The word is derived from
Latin and means "of what sort". It is basically a quantity or property of something as perceived or
experienced by an individual, like the scent of rose, the taste of wine, or the pain of a headache.
They are difficult to articulate or describe. The philosopher and scientist Daniel Dennett describes
them as "the way things seem to us", while philosopher and cognitive scientist David Chalmers
expanded on qualia as the "hard problem of consciousness" in the 1990s. When qualia is
experienced, activity is simulated in the brain, and these processes are called neural correlates of
consciousness (NCCs). Many scientific studies have been done to attempt to link particular brain
regions with emotions or experiences.[75][76][77]
Species which experience qualia are said to have sentience, which is central to the animal rights
movement, because it includes the ability to experience pain and suffering.[75]
Identity
An unsolved problem in the philosophy of consciousness is how it relates to the nature of personal
identity.[78] This includes questions regarding whether someone is the "same person" from moment
to moment. If that is the case, another question is what exactly the "identity carrier" is that makes a
conscious being "the same" being from one moment to the next. The problem of determining
personal identity also includes questions such as Benj Hellie's vertiginous question, which can be
summarized as "Why am I me and not someone else?".[79] The philosophical problems regarding the
nature of personal identity have been extensively discussed by Thomas Nagel in his book The View
from Nowhere.
A common view of personal identity is that an individual has a continuous identity that persists from moment to moment, forming a line segment stretching across time from birth to death. In the case of an afterlife as described in Abrahamic religions, one's personal identity is believed to stretch infinitely into the future, forming a ray or line.
This notion of identity is similar to the form of dualism advocated by René Descartes. However,
some philosophers argue that this common notion of personal identity is unfounded. Daniel Kolak
has argued extensively against it in his book I am You.[80] Kolak refers to the aforementioned notion
of personal identity being linear as "Closed individualism". Another view of personal identity
according to Kolak is "Empty individualism", in which one's personal identity only exists for a single
moment of time. However, Kolak advocates for a view of personal identity called Open
individualism, in which all consciousness is in reality a single being and individual personal identity
in reality does not exist at all. Another philosopher who has contested the notion of personal identity
is Derek Parfit. In his book Reasons and Persons,[81] he describes a thought experiment known as
the teletransportation paradox. In Buddhist philosophy, the concept of anattā refers to the idea that
the self is an illusion.
Other philosophers have argued that Hellie's vertiginous question has a number of philosophical
implications relating to the metaphysical nature of consciousness. Christian List argues that the
vertiginous question and the existence of first-personal facts is evidence against physicalism, and
evidence against other third-personal metaphysical pictures, including standard versions of
dualism.[82] List also argues that the vertiginous question implies a "quadrilemma" for theories of
consciousness. He claims that at most three of the following metaphysical claims can be true: 'first-
person realism', 'non-solipsism', 'non-fragmentation', and 'one world' – and at least one of these four
must be false.[83] List has proposed a model he calls the "many-worlds theory of consciousness" in
order to reconcile the subjective nature of consciousness without lapsing into solipsism.[84] Vincent
Conitzer argues that the nature of identity is connected to A series and B series theories of time,
and that A-theory being true implies that the "I" is metaphysically distinguished from other
perspectives.[85] Other philosophical theories regarding the metaphysical nature of self are Caspar
Hare's theories of perspectival realism,[86] in which things within perceptual awareness have a
defining intrinsic property that exists absolutely and not relative to anything, and egocentric
presentism, in which the experiences of other individuals are not present in the way that one's
current perspective is.[87][88]
Scientific study
For many decades, consciousness as a research topic was avoided by the majority of mainstream
scientists, because of a general feeling that a phenomenon defined in subjective terms could not
properly be studied using objective experimental methods.[89] In 1975 George Mandler published an
influential psychological study which distinguished between slow, serial, and limited conscious
processes and fast, parallel and extensive unconscious ones.[90] The Science and Religion Forum's[91] 1984 annual conference, 'From Artificial Intelligence to Human Consciousness', identified the nature of consciousness as a matter for investigation; Donald Michie was a keynote speaker. Starting in
the 1980s, an expanding community of neuroscientists and psychologists have associated
themselves with a field called Consciousness Studies, giving rise to a stream of experimental work
published in books,[92] journals such as Consciousness and Cognition, Frontiers in Consciousness
Research, Psyche, and the Journal of Consciousness Studies, along with regular conferences
organized by groups such as the Association for the Scientific Study of Consciousness[93] and the
Society for Consciousness Studies.
Modern medical and psychological investigations into consciousness are based on psychological
experiments (including, for example, the investigation of priming effects using subliminal
stimuli),[94] and on case studies of alterations in consciousness produced by trauma, illness, or
drugs. Broadly viewed, scientific approaches are based on two core concepts. The first identifies the
content of consciousness with the experiences that are reported by human subjects; the second
makes use of the concept of consciousness that has been developed by neurologists and other
medical professionals who deal with patients whose behavior is impaired. In either case, the
ultimate goals are to develop techniques for assessing consciousness objectively in humans as
well as other animals, and to understand the neural and psychological mechanisms that underlie
it.[61]
For example, subjects who stare continuously at a Necker cube usually report that they experience it
"flipping" between two 3D configurations, even though the stimulus itself remains the same.[96] The
objective is to understand the relationship between the conscious awareness of stimuli (as
indicated by verbal report) and the effects the stimuli have on brain activity and behavior. In several
paradigms, such as the technique of response priming, the behavior of subjects is clearly influenced
by stimuli for which they report no awareness, and suitable experimental manipulations can lead to
increasing priming effects despite decreasing prime identification (double dissociation).[97]
Verbal report is widely considered to be the most reliable indicator of consciousness, but it raises a
number of issues.[98] For one thing, if verbal reports are treated as observations, akin to
observations in other branches of science, then the possibility arises that they may contain errors—
but it is difficult to make sense of the idea that subjects could be wrong about their own
experiences, and even more difficult to see how such an error could be detected.[99] Daniel Dennett
has argued for an approach he calls heterophenomenology, which means treating verbal reports as
stories that may or may not be true, but his ideas about how to do this have not been widely
adopted.[100] Another issue with verbal report as a criterion is that it restricts the field of study to
humans who have language: this approach cannot be used to study consciousness in other species,
pre-linguistic children, or people with types of brain damage that impair language. As a third issue,
philosophers who dispute the validity of the Turing test may feel that it is possible, at least in
principle, for verbal report to be dissociated from consciousness entirely: a philosophical zombie
may give detailed verbal reports of awareness in the absence of any genuine awareness.[101]
Although verbal report is in practice the "gold standard" for ascribing consciousness, it is not the
only possible criterion.[98] In medicine, consciousness is assessed as a combination of verbal
behavior, arousal, brain activity, and purposeful movement. The last three of these can be used as
indicators of consciousness when verbal behavior is absent.[102][103] The scientific literature
regarding the neural bases of arousal and purposeful movement is very extensive. Their reliability as
indicators of consciousness is disputed, however, due to numerous studies showing that alert
human subjects can be induced to behave purposefully in a variety of ways in spite of reporting a
complete lack of awareness.[97] Studies related to the neuroscience of free will have also shown
that the influence consciousness has on decision-making is not always straightforward.[104]
Another approach applies specifically to the study of self-awareness, that is, the ability to
distinguish oneself from others. In the 1970s Gordon Gallup developed an operational test for self-
awareness, known as the mirror test. The test examines whether animals are able to differentiate
between seeing themselves in a mirror versus seeing other animals. The classic example involves
placing a spot of coloring on the skin or fur near the individual's forehead and seeing if they attempt
to remove it or at least touch the spot, thus indicating that they recognize that the individual they are
seeing in the mirror is themselves.[105] Humans (older than 18 months) and other great apes,
bottlenose dolphins, orcas, pigeons, European magpies and elephants have all been observed to
pass this test.[106] Some other animals, such as pigs, have been shown to use a mirror to find food.[107]
Contingency awareness is another such approach; it is the conscious understanding of one's actions and their effects on one's environment.[108] It is recognized as a factor in self-recognition. The brain processing underlying contingency awareness and learning is believed to depend on an intact medial temporal lobe and on age. A study done in 2020 involving transcranial direct current stimulation, magnetic resonance imaging (MRI) and eyeblink classical conditioning supported the idea that the parietal cortex serves as a substrate for contingency awareness and that age-related disruption of this region is sufficient to impair awareness.[109]
Neural correlates
A major part of the scientific literature on consciousness consists of studies that examine the
relationship between the experiences reported by subjects and the activity that simultaneously
takes place in their brains—that is, studies of the neural correlates of consciousness. The hope is to find activity in a particular part of the brain, or a particular pattern of global brain activity, that is strongly predictive of conscious awareness. Several brain imaging techniques, such as EEG
and fMRI, have been used for physical measures of brain activity in these studies.[110]
Another idea that has drawn attention for several decades is that consciousness is associated with
high-frequency (gamma band) oscillations in brain activity. This idea arose from proposals in the
1980s, by Christof von der Malsburg and Wolf Singer, that gamma oscillations could solve the so-
called binding problem, by linking information represented in different parts of the brain into a
unified experience.[111] Rodolfo Llinás, for example, proposed that consciousness results from
recurrent thalamo-cortical resonance where the specific thalamocortical systems (content) and the
non-specific (centromedial thalamus) thalamocortical systems (context) interact in the gamma
band frequency via synchronous oscillations.[112]
A number of studies have shown that activity in primary sensory areas of the brain is not sufficient
to produce consciousness: it is possible for subjects to report a lack of awareness even when areas
such as the primary visual cortex (V1) show clear electrical responses to a stimulus.[113] Higher
brain areas are seen as more promising, especially the prefrontal cortex, which is involved in a range
of higher cognitive functions collectively known as executive functions.[114] There is substantial
evidence that a "top-down" flow of neural activity (i.e., activity propagating from the frontal cortex to
sensory areas) is more predictive of conscious awareness than a "bottom-up" flow of activity.[115]
The prefrontal cortex is not the only candidate area, however: studies by Nikos Logothetis and his
colleagues have shown, for example, that visually responsive neurons in parts of the temporal lobe
reflect the visual perception in the situation when conflicting visual images are presented to
different eyes (i.e., bistable percepts during binocular rivalry).[116] Furthermore, top-down feedback
from higher to lower visual brain areas may be weaker or absent in the peripheral visual field, as
suggested by some experimental data and theoretical arguments;[117] nevertheless humans can
perceive visual inputs in the peripheral visual field arising from bottom-up V1 neural
activities.[117][118] Meanwhile, bottom-up V1 activities for the central visual fields can be vetoed, and
thus made invisible to perception, by the top-down feedback, when these bottom-up signals are
inconsistent with the brain's internal model of the visual world.[117][118]
Modulation of neural responses may correlate with phenomenal experiences. In contrast to the raw
electrical responses that do not correlate with consciousness, the modulation of these responses
by other stimuli correlates surprisingly well with an important aspect of consciousness: namely with
the phenomenal experience of stimulus intensity (brightness, contrast). In the research group of
Danko Nikolić it has been shown that some of the changes in the subjectively perceived brightness
correlated with the modulation of firing rates while others correlated with the modulation of neural
synchrony.[119] An fMRI investigation suggested that these findings were strictly limited to the
primary visual areas.[120] This indicates that, in the primary visual areas, changes in firing rates and
synchrony can be considered as neural correlates of qualia—at least for some type of qualia.
In 2013, the perturbational complexity index (PCI) was proposed, a measure of the algorithmic
complexity of the electrophysiological response of the cortex to transcranial magnetic stimulation.
This measure was shown to be higher in individuals that are awake, in REM sleep or in a locked-in
state than in those who are in deep sleep or in a vegetative state,[121] making it potentially useful as
a quantitative assessment of consciousness states.
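The core computation behind such a measure can be sketched briefly. The Python snippet below illustrates the general idea only, not the published analysis pipeline: the evoked cortical response is binarized and its spatiotemporal diversity summarized with a Lempel-Ziv-style compression count, normalized so that larger values indicate responses that are both widespread and differentiated. The function names, the data layout (channels by time), and the fixed threshold are assumptions made for the example.

```python
# Illustrative sketch of a PCI-like quantity (not the published pipeline).
import numpy as np

def lempel_ziv_complexity(binary_string: str) -> int:
    """Approximate LZ76 parsing: count the distinct phrases encountered."""
    phrases, i = set(), 0
    while i < len(binary_string):
        j = i + 1
        while binary_string[i:j] in phrases and j <= len(binary_string):
            j += 1
        phrases.add(binary_string[i:j])
        i = j
    return len(phrases)

def pci_like_index(response: np.ndarray, threshold: float) -> float:
    """response: channels x time array of evoked activity (hypothetical data).
    Binarize against a threshold, flatten, and normalize the Lempel-Ziv count
    by the value expected for a maximally random string of the same length."""
    binary = (np.abs(response) > threshold).astype(int)
    s = "".join(map(str, binary.flatten()))
    n = len(s)
    p = s.count("1") / n if n else 0.0
    if p in (0.0, 1.0):
        return 0.0  # no (or saturated) evoked activity: zero complexity
    source_entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return lempel_ziv_complexity(s) * np.log2(n) / (n * source_entropy)
```

In the published measure the binarization threshold is derived from the statistical significance of the TMS-evoked response rather than passed in as a constant; the sketch keeps it as a parameter for simplicity.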
Assuming that not only humans but even some non-mammalian species are conscious, a number of
evolutionary approaches to the problem of neural correlates of consciousness open up. For
example, assuming that birds are conscious—a common assumption among neuroscientists and
ethologists due to the extensive cognitive repertoire of birds—there are comparative
neuroanatomical ways to validate some of the principal, currently competing, mammalian
consciousness–brain theories. The rationale for such a comparative study is that the avian brain
deviates structurally from the mammalian brain. So how similar are they? What homologs can be
identified? The general conclusion from the study by Butler, et al.[122] is that some of the major
theories for the mammalian brain[123][124][125] also appear to be valid for the avian brain. The
structures assumed to be critical for consciousness in mammalian brains have homologous
counterparts in avian brains. Thus the main portions of the theories of Crick and Koch,[123] Edelman
and Tononi,[124] and Cotterill[125] seem to be compatible with the assumption that birds are
conscious. Edelman also differentiates between what he calls primary consciousness (which is a
trait shared by humans and non-human animals) and higher-order consciousness as it appears in
humans alone along with human language capacity.[124] Certain aspects of the three theories,
however, seem less easy to apply to the hypothesis of avian consciousness. For instance, the
suggestion by Crick and Koch that layer 5 neurons of the mammalian brain have a special role,
seems difficult to apply to the avian brain, since the avian homologs have a different morphology.
Likewise, the theory of Eccles[126][127] seems incompatible, since a structural homolog/analogue to
the dendron has not been found in avian brains. The assumption of an avian consciousness also
brings the reptilian brain into focus. The reason is the structural continuity between avian and
reptilian brains, meaning that the phylogenetic origin of consciousness may be earlier than
suggested by many leading neuroscientists.
Joaquin Fuster of UCLA has argued that the prefrontal cortex, along with Wernicke's and Broca's areas, is of particular importance to the development of the human language capacities that are neuro-anatomically necessary for the emergence of higher-order consciousness.[128]
A study in 2016 looked at lesions in specific areas of the brainstem that were associated with coma
and vegetative states. A small region of the rostral dorsolateral pontine tegmentum in the brainstem
was suggested to drive consciousness through functional connectivity with two cortical regions, the
left ventral anterior insular cortex, and the pregenual anterior cingulate cortex. These three regions
may work together as a triad to maintain consciousness.[129]
Krista and Tatiana Hogan have a unique thalamic connection that may provide insight into the
philosophical and neurological foundations of consciousness. It has been argued that there's no
empirical test that can conclusively establish that for some sensations, the twins share one token
experience rather than two exactly matching token experiences. Yet background considerations
about the way the brain has specific locations for conscious contents, combined with the evident
overlapping pathways in the twins' brains, arguably imply that the twins share some conscious
experiences. If this is true, then the twins may offer a proof of concept for how experiences in
general could be shared between brains.[130][131][132]
Academic definitions of consciousness
Clear definitions of consciousness are rare in the academic literature; David Chalmers famously framed the difficulty of explaining it as the hard problem of consciousness. Academic definitions do nonetheless exist, for example from Tononi's integrated information theory, from Craig MacKenzie, and from Cleeremans and Jimenez, whose definition of learning bears a remarkable similarity to both Tononi's and MacKenzie's definitions. Both Bernard Baars and Igor Aleksander have worked out the aspects they consider necessary for consciousness. One such definition is notable for its similarity to the theatre analogy of global workspace theory (GWT).
Models
A wide range of empirical theories of consciousness have been proposed.[136][137][138] Adrian Doerig
and colleagues list 13 notable theories,[138] while Anil Seth and Tim Bayne list 22 notable
theories.[137]
Global workspace theory (GWT) is a cognitive architecture and theory of consciousness proposed
by the cognitive psychologist Bernard Baars in 1988. Baars explains the theory with the metaphor of
a theater, with conscious processes represented by an illuminated stage. This theater integrates
inputs from a variety of unconscious and otherwise autonomous networks in the brain and then
broadcasts them to unconscious networks (represented in the metaphor by a broad, unlit
"audience"). The theory has since been expanded upon by other scientists including cognitive
neuroscientist Stanislas Dehaene and Lionel Naccache.[139][140] See also the Dehaene–Changeux
model.
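Baars's theatre metaphor can be read as a simple computational architecture: many specialist processes run unconsciously in parallel, one content wins the competition for the "stage", and that content is then broadcast back to all of the specialists. The toy sketch below illustrates this reading; the class names, the salience scheme, and the single-winner rule are invented for the example and are not drawn from Baars's or Dehaene's models.

```python
# Toy illustration of a global-workspace "broadcast" cycle (hypothetical API).
from dataclasses import dataclass, field

@dataclass
class Specialist:
    name: str
    received: list = field(default_factory=list)

    def bid(self, stimulus: dict) -> tuple:
        """Return (salience, content): what this process wants to put on stage."""
        salience = stimulus.get(self.name, 0.0)
        return salience, f"{self.name}: {salience:.2f}"

    def receive(self, content: str) -> None:
        """The broadcast reaches the unconscious 'audience' of processes."""
        self.received.append(content)

def workspace_cycle(specialists: list, stimulus: dict) -> str:
    # Competition: the most salient content wins access to the stage.
    salience, content = max(sp.bid(stimulus) for sp in specialists)
    # Broadcast: every specialist receives the winning content.
    for sp in specialists:
        sp.receive(content)
    return content

specialists = [Specialist("vision"), Specialist("audition"), Specialist("touch")]
print(workspace_cycle(specialists, {"vision": 0.9, "audition": 0.4}))
```

Admitting only one winning content per cycle is what gives the workspace its serial, limited-capacity character, in contrast to the parallel operation of the specialists.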
Integrated information theory (IIT), pioneered by neuroscientist Giulio Tononi in 2004, postulates
that consciousness resides in the information being processed and arises once the information
reaches a certain level of complexity. IIT proposes a 1:1 mapping between conscious states and
precise, formal mathematical descriptions of those mental states. Proponents of this model
suggest that it may provide a physical grounding for consciousness in neurons, as they provide the
mechanism by which information is integrated. This also relates to the "hard problem of
consciousness" proposed by David Chalmers.[141][75] In 2023, 124 scholars signed a letter saying
that IIT gets disproportionate media attention relative to its supporting empirical evidence, and
called it "pseudoscience", arguing that its core assumptions are not adequately testable. This led to
academic debate, as some other researchers objected to the "pseudoscience" characterization.[142]
Orchestrated objective reduction (Orch-OR), or the quantum theory of mind, was proposed by
scientists Roger Penrose and Stuart Hameroff, and states that consciousness originates at the
quantum level inside neurons. The mechanism is held to be a quantum process called objective reduction that is orchestrated by cellular structures called microtubules, which form part of the neuronal cytoskeleton. The duo proposed that these quantum processes
accounted for creativity, innovation, and problem-solving abilities. Penrose published his views in
the book The Emperor's New Mind. In 2014, the discovery of quantum vibrations inside microtubules
gave new life to the argument.[75]
However, scientists and philosophers have criticized Penrose's interpretation of Gödel's theorem
and his conclusion that quantum phenomena play a role in human cognition.[143]
Attention schema theory
In 2011, Michael Graziano and Kastner[144] proposed the "attention schema" theory of awareness.
Graziano went on to publish an expanded discussion of this theory in his book "Consciousness and
the Social Brain".[145] In that theory, specific cortical areas, notably in the superior temporal sulcus
and the temporo-parietal junction, are used to build the construct of awareness and attribute it to
other people. The same cortical machinery is also used to attribute awareness to oneself. Damage
to these cortical regions can lead to deficits in consciousness such as hemispatial neglect. In the
attention schema theory, the value of explaining the feature of awareness and attributing it to a
person is to gain a useful predictive model of that person's attentional processing. Attention is a
style of information processing in which a brain focuses its resources on a limited set of interrelated
signals. Awareness, in this theory, is a useful, simplified schema that represents attentional states.
To be aware of X is explained by constructing a model of one's attentional focus on X.
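Read as an information-processing claim, this can be sketched in a few lines: attention is a detailed allocation of resources over incoming signals, and awareness corresponds to a coarse internal model (the schema) of that allocation, which is what the system reports. The sketch below is purely illustrative; the functions and the dictionary-based representation are assumptions for the example, not Graziano's formalism.

```python
# Toy illustration of an "attention schema": a simplified model of attention.
def attend(signals: dict) -> dict:
    """Attention: normalize raw signal strengths into resource weights."""
    total = sum(signals.values())
    return {name: value / total for name, value in signals.items()}

def attention_schema(weights: dict) -> str:
    """Awareness as a simplified self-model: report only the dominant focus."""
    focus = max(weights, key=weights.get)
    return f"I am attending to {focus}"

weights = attend({"red apple": 3.0, "background hum": 1.0, "itch": 0.5})
print(attention_schema(weights))  # -> "I am attending to red apple"
```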
The entropic brain is a theory of conscious states informed by neuroimaging research with
psychedelic drugs. The theory suggests that the brain in primary states such as rapid eye
movement (REM) sleep, early psychosis and under the influence of psychedelic drugs, is in a
disordered state; normal waking consciousness constrains some of this freedom and makes
possible metacognitive functions such as internal self-administered reality testing and self-
awareness.[146][147][148][149] Criticism has included questioning whether the theory has been
adequately tested.[150]
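The contrast the theory draws can be illustrated with the Shannon entropy of a repertoire of brain states: a "primary" state that visits many states with comparable frequency has higher entropy than constrained waking consciousness, which concentrates on a few states. The state probabilities below are invented solely to show the calculation.

```python
# Entropy comparison of two hypothetical repertoires of brain states.
from math import log2

def shannon_entropy(probabilities):
    return -sum(p * log2(p) for p in probabilities if p > 0)

waking  = [0.70, 0.20, 0.05, 0.05]   # dominated by a few states
primary = [0.30, 0.25, 0.25, 0.20]   # flatter repertoire (e.g. psychedelic state)
print(shannon_entropy(waking), shannon_entropy(primary))  # ~1.26 vs ~1.99 bits
```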
In 2017, work by David Rudrauf and colleagues, including Karl Friston, applied the active inference
paradigm to consciousness, leading to the projective consciousness model (PCM), a model of how
sensory data is integrated with priors in a process of projective transformation. The authors argue
that, while their model identifies a key relationship between computation and phenomenology, it
does not completely solve the hard problem of consciousness or completely close the explanatory
gap.[151]
In 2004, the molecular biologist Francis Crick (co-discoverer of the double helix) proposed that binding together an individual's experience requires something like the conductor of an orchestra. Together with neuroscientist Christof Koch, he proposed that this conductor would have to collate information rapidly from various regions of the brain. The two reckoned that the claustrum was well suited for the task. Crick died while still working on the idea.[75]
The proposal is backed by a study done in 2014, where a team at the George Washington University
induced unconsciousness in a 54-year-old woman suffering from intractable epilepsy by stimulating
her claustrum. The woman underwent depth electrode implantation and electrical stimulation
mapping. The electrode between the left claustrum and anterior-dorsal insula was the one which
induced unconsciousness. Correlation for interactions affecting medial parietal and posterior
frontal channels during stimulation increased significantly as well. Their findings suggested that the
left claustrum or anterior insula is an important part of a network that subserves consciousness,
and that disruption of consciousness is related to increased EEG signal synchrony within frontal-
parietal networks. However, this remains an isolated, hence inconclusive study.[75][152]
A study published in 2022 opposed the idea that the claustrum is the seat of consciousness, instead concluding that it acts more like a "router" transferring commands and information across the brain.[153][154] The study showed that when the claustrum was disabled, complex tasks could not be performed.
The emergence of consciousness during biological evolution remains a topic of ongoing scientific
inquiry. The survival value of consciousness is still a matter of exploration and understanding. While
consciousness appears to play a crucial role in human cognition, decision-making, and self-
awareness, its adaptive significance across different species remains a subject of debate.
Some people question whether consciousness has any survival value. Some argue that
consciousness is a by-product of evolution. Thomas Henry Huxley, for example, defends in an essay titled "On the Hypothesis that Animals are Automata, and its History" an epiphenomenalist theory of consciousness, according to which consciousness is a causally inert effect of neural activity—"as the steam-whistle which accompanies the work of a locomotive engine is without influence upon its machinery".[155] To this William James objects in his essay Are We Automata?, offering an evolutionary argument for mind–brain interaction: if the preservation and development of consciousness in biological evolution is a result of natural selection, it is plausible that consciousness has not only been influenced by neural processes but has had a survival value itself;
and it could only have had this if it had been efficacious.[156][157] Karl Popper develops a similar
evolutionary argument in the book The Self and Its Brain.[158]
Opinions are divided on when and how consciousness first arose. It has been argued that
consciousness emerged (i) exclusively with the first humans, (ii) exclusively with the first mammals,
(iii) independently in mammals and birds, or (iv) with the first reptiles.[159] Other authors date the
origins of consciousness to the first animals with nervous systems or early vertebrates in the
Cambrian over 500 million years ago.[160] Donald Griffin suggests in his book Animal Minds a
gradual evolution of consciousness.[161] Further exploration of the origins of consciousness,
particularly in molluscs, has been done by Peter Godfrey-Smith in his book Metazoa.[162]
Regarding the primary function of conscious processing, a recurring idea in recent theories is that
phenomenal states somehow integrate neural activities and information-processing that would
otherwise be independent.[163] This has been called the integration consensus. Another example is Gerald Edelman's dynamic core hypothesis, which emphasizes reentrant connections that reciprocally link areas of the brain in a massively parallel manner.[164]
Edelman also stresses the importance of the evolutionary emergence of higher-order
consciousness in humans from the historically older trait of primary consciousness which humans
share with non-human animals (see Neural correlates section above). These theories of integrative
function present solutions to two classic problems associated with consciousness: differentiation
and unity. They show how our conscious experience can discriminate between a virtually unlimited
number of different possible scenes and details (differentiation) because it integrates those details
from our sensory systems, while the integrative nature of consciousness in this view easily explains
how our experience can seem unified as one whole despite all of these individual parts. However, it
remains unspecified which kinds of information are integrated in a conscious manner and which
kinds can be integrated without consciousness. Nor is it explained what specific causal role
conscious integration plays, nor why the same functionality cannot be achieved without
consciousness. Not all kinds of information are capable of being disseminated consciously (e.g.,
neural activity related to vegetative functions, reflexes, unconscious motor programs, low-level perceptual analyses, etc.), and many kinds of information can be disseminated and combined with
other kinds without consciousness, as in intersensory interactions such as the ventriloquism
effect.[165] Hence it remains unclear why any of it is conscious. For a review of the differences
between conscious and unconscious integrations, see the article of Ezequiel Morsella.[165]
As noted earlier, even among writers who consider consciousness to be well-defined, there is
widespread dispute about which animals other than humans can be said to possess it.[166] Edelman
has described this distinction as that of humans possessing higher-order consciousness while
sharing the trait of primary consciousness with non-human animals (see previous paragraph). Thus,
any examination of the evolution of consciousness is faced with great difficulties. Nevertheless,
some writers have argued that consciousness can be viewed from the standpoint of evolutionary
biology as an adaptation in the sense of a trait that increases fitness.[167] In his article "Evolution of
consciousness", John Eccles argued that special anatomical and physical properties of the
mammalian cerebral cortex gave rise to consciousness ("[a] psychon ... linked to [a] dendron
through quantum physics").[168] Bernard Baars proposed that once in place, this "recursive" circuitry
may have provided a basis for the subsequent development of many of the functions that
consciousness facilitates in higher organisms.[169] Peter Carruthers has put forth one such potential
adaptive advantage gained by conscious creatures by suggesting that consciousness allows an
individual to make distinctions between appearance and reality.[170] This ability would enable a
creature to recognize the possibility that its perceptions are deceiving it (e.g. that water in the
distance may be a mirage) and behave accordingly, and it could also facilitate the manipulation of
others, for both cooperative and devious ends, by recognizing how things appear to them.
Other philosophers, however, have suggested that consciousness would not be necessary for any
functional advantage in evolutionary processes.[171][172] No one has given a causal explanation, they
argue, of why it would not be possible for a functionally equivalent non-conscious organism (i.e., a
philosophical zombie) to achieve the very same survival advantages as a conscious organism. If
evolutionary processes are blind to the difference between function F being performed by conscious
organism O and non-conscious organism O*, it is unclear what adaptive advantage consciousness
could provide.[173] As a result, an exaptive explanation of consciousness has gained favor with
some theorists who posit that consciousness did not evolve as an adaptation but was an exaptation
arising as a consequence of other developments such as increases in brain size or cortical
rearrangement.[160] Consciousness in this sense has been compared to the blind spot in the retina,
where it is not an adaptation of the retina but simply a by-product of the way the retinal axons
were wired.[174] Several scholars, including Pinker, Chomsky, Edelman, and Luria, have pointed to
the emergence of human language as an important regulative mechanism of learning and memory
in the context of the development of higher-order consciousness (see Neural correlates section
above).
Altered states
There are some brain states in which consciousness seems to be absent, including dreamless sleep
and coma. There are also a variety of circumstances that can change the relationship between the
mind and the world in less drastic ways, producing what are known as altered states of
consciousness. Some altered states occur naturally; others can be produced by drugs or brain
damage.[175] Altered states can be accompanied by changes in thinking, disturbances in the sense
of time, feelings of loss of control, changes in emotional expression, alterations in body image, and
changes in meaning or significance.[176]
The two most widely accepted altered states are sleep and dreaming. Although dream sleep and
non-dream sleep appear very similar to an outside observer, each is associated with a distinct
pattern of brain activity, metabolic activity, and eye movement; each is also associated with a
distinct pattern of experience and cognition. During ordinary non-dream sleep, people who are
awakened report only vague and sketchy thoughts, and their experiences do not cohere into a
continuous narrative. During dream sleep, in contrast, people who are awakened report rich and
detailed experiences in which events form a continuous progression, which may however be
interrupted by bizarre or fantastic intrusions.[177] Thought processes during the dream state
frequently show a high level of irrationality. Both dream and non-dream states are associated with
severe disruption of memory: the content usually disappears within seconds during the non-dream
state, and within minutes after awakening from a dream unless actively refreshed.[178]
Research on the effects of partial epileptic seizures on consciousness found that patients who have
partial epileptic seizures experience altered states of consciousness.[179][180] In partial epileptic
seizures, consciousness is impaired or lost while some aspects of consciousness, often automated
behaviors, remain intact. Studies measuring the qualitative features of partial epileptic seizures
found that patients exhibited an increase in arousal and became absorbed in the experience of the
seizure, followed by difficulty in focusing and shifting attention.
There has been some research into physiological changes in yogis and people who practice various
techniques of meditation. Some research with brain waves during meditation has reported
differences between those corresponding to ordinary relaxation and those corresponding to
meditation. It has been disputed, however, whether there is enough evidence to count these as
physiologically distinct states of consciousness.[183]
The most extensive study of the characteristics of altered states of consciousness was made by
psychologist Charles Tart in the 1960s and 1970s. Tart analyzed a state of consciousness as made
up of a number of component processes, including exteroception (sensing the external world);
interoception (sensing the body); input-processing (seeing meaning); emotions; memory; time
sense; sense of identity; evaluation and cognitive processing; motor output; and interaction with the
environment.[184] Each of these, in his view, could be altered in multiple ways by drugs or other
manipulations. The components that Tart identified have not, however, been validated by empirical
studies. Research in this area has not yet reached firm conclusions, but a recent questionnaire-
based study identified eleven significant factors contributing to drug-induced states of
consciousness: experience of unity; spiritual experience; blissful state; insightfulness;
disembodiment; impaired control and cognition; anxiety; complex imagery; elementary imagery;
audio-visual synesthesia; and changed meaning of percepts.[185]
Medical aspects
The medical approach to consciousness is scientifically oriented. It derives from a need to treat
people whose brain function has been impaired as a result of disease, brain damage, toxins, or
drugs. In medicine, conceptual distinctions are considered useful to the degree that they can help to
guide treatments. The medical approach mainly focuses on the amount of consciousness a person
has: in medicine, consciousness is assessed as a "level" ranging from coma and brain death at the
low end, to full alertness and purposeful responsiveness at the high end.[186]
Assessment
A more complex assessment procedure is the neurological examination, which is usually carried out
by a neurologist in a hospital setting. A formal neurological examination runs through a precisely
delineated series of tests, beginning with tests for basic sensorimotor reflexes, and culminating
with tests for sophisticated use of language. The outcome may be summarized using the Glasgow
Coma Scale, which yields a number in the range 3–15, with a score of 3 to 8 indicating coma, and 15
indicating full consciousness. The Glasgow Coma Scale has three subscales, measuring the best
motor response (ranging from "no motor response" to "obeys commands"), the best eye response
(ranging from "no eye opening" to "eyes opening spontaneously") and the best verbal response
(ranging from "no verbal response" to "fully oriented"). There is also a simpler pediatric version of
the scale, for children too young to be able to use language.[186]
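Since the total score is simply the sum of the three subscale scores, the arithmetic can be shown in a brief sketch. The following Python snippet is illustrative only, not clinical software; the function name and example values are hypothetical, and the subscale ranges used are the standard adult ones (eye 1–4, verbal 1–5, motor 1–6), which combine into the 3–15 total described above.

```python
# Illustrative sketch of Glasgow Coma Scale arithmetic (not for clinical use).
# The total is simply the sum of the three subscale scores:
# eye opening 1-4, verbal response 1-5, motor response 1-6, giving a total of 3-15.

def glasgow_coma_score(eye: int, verbal: int, motor: int) -> int:
    """Return the GCS total (3-15) as the sum of the three subscale scores."""
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("subscale score out of range")
    return eye + verbal + motor

# Hypothetical example: eyes open to pain (2), incomprehensible sounds (2),
# withdrawal from pain (4) gives a total of 8, within the 3-8 range indicating coma.
print(glasgow_coma_score(eye=2, verbal=2, motor=4))  # 8
```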
Disorders
Medical conditions that inhibit consciousness are considered disorders of consciousness.[191] This
category generally includes minimally conscious state and persistent vegetative state, but
sometimes also includes the less severe locked-in syndrome and more severe chronic
coma.[191][192] Differential diagnosis of these disorders is an active area of biomedical
research.[193][194][195] Finally, brain death results in an irreversible disruption of
consciousness.[191] While other conditions may cause a moderate deterioration (e.g., dementia and
delirium) or transient interruption (e.g., grand mal and petit mal seizures) of consciousness, they are
not included in this category.
Locked-in syndrome: The patient has awareness, sleep-wake cycles, and meaningful behavior (viz., eye-movement), but is isolated due to quadriplegia and pseudobulbar palsy.
Minimally conscious state: The patient has intermittent periods of awareness and wakefulness and displays some meaningful behavior.
Persistent vegetative state: The patient has sleep-wake cycles, but lacks awareness and only displays reflexive and non-purposeful behavior.
Chronic coma: The patient lacks awareness and sleep-wake cycles and only displays reflexive behavior.
Brain death: The patient lacks awareness, sleep-wake cycles, and brain-mediated reflexive behavior.
In children
Of the eight types of consciousness in the Lycan classification, some are detectable in utero and
others develop years after birth. Psychologist and educator William Foulkes studied children's
dreams and concluded that prior to the shift in cognitive maturation that humans experience during
ages five to seven,[199] children lack the Lockean consciousness that Lycan had labeled
"introspective consciousness" and that Foulkes labels "self-reflection".[200] In a 2020 paper,
Katherine Nelson and Robyn Fivush use "autobiographical consciousness" to label essentially the
same faculty, and agree with Foulkes on the timing of this faculty's acquisition. Nelson and Fivush
contend that "language is the tool by which humans create a new, uniquely human form of
consciousness, namely, autobiographical consciousness".[201] Julian Jaynes had staked out these
positions decades earlier.[202][203] Citing the developmental steps that lead the infant to
autobiographical consciousness, Nelson and Fivush point to the acquisition of "theory of mind",
calling theory of mind "necessary for autobiographical consciousness" and defining it as
"understanding differences between one's own mind and others' minds in terms of beliefs, desires,
emotions and thoughts". They write, "The hallmark of theory of mind, the understanding of false
belief, occurs ... at five to six years of age".[204]
In animals
The topic of animal consciousness is beset by a number of difficulties. It poses the problem of other
minds in an especially severe form, because non-human animals, lacking the ability to express
human language, cannot tell humans about their experiences.[205] Also, it is difficult to reason
objectively about the question, because a denial that an animal is conscious is often taken to imply
that it does not feel, that its life has no value, and that harming it is not morally wrong. Descartes, for
example, has sometimes been blamed for mistreatment of animals because he believed that
only humans have a non-physical mind.[206] Most people have a strong intuition that some animals,
such as cats and dogs, are conscious, while others, such as insects, are not; but the sources of this
intuition are not obvious, and are often based on personal interactions with pets and other animals
they have observed.[205]
Thomas Nagel argues that while a human might be able to imagine what it is like to be a bat by taking "the bat's point of view", it would still be impossible "to know what it is like for a bat to be a bat". (Townsend's big-eared bat pictured.)
Philosophers who consider subjective experience the essence of consciousness also generally
believe, as a correlate, that the existence and nature of animal consciousness can never rigorously
be known. Thomas Nagel spelled out this point of view in an influential essay titled "What Is It Like
to Be a Bat?". He said that an organism is conscious "if and only if there is something that it is like to
be that organism—something it is like for the organism"; and he argued that no matter how much we
know about an animal's brain and behavior, we can never really put ourselves into the mind of the
animal and experience its world in the way it does itself.[207] Other thinkers, such as Douglas
Hofstadter, dismiss this argument as incoherent.[208] Several psychologists and ethologists have
argued for the existence of animal consciousness by describing a range of behaviors that appear to
show animals holding beliefs about things they cannot directly perceive—Donald Griffin's 2001 book
Animal Minds reviews a substantial portion of the evidence.[161]
On July 7, 2012, eminent scientists from different branches of neuroscience gathered at the
University of Cambridge for the Francis Crick Memorial Conference, which dealt with
consciousness in humans and pre-linguistic consciousness in nonhuman animals. After the
conference, they signed, in the presence of Stephen Hawking, the 'Cambridge Declaration on
Consciousness', which summarizes the conference's most important findings:
"We decided to reach a consensus and make a statement directed to the public that is not scientific.
It's obvious to everyone in this room that animals have consciousness, but it is not obvious to the
rest of the world. It is not obvious to the rest of the Western world or the Far East. It is not obvious
to the society."[209]
"Convergent evidence indicates that non-human animals ..., including all mammals and birds, and
other creatures, ... have the necessary neural substrates of consciousness and the capacity to
exhibit intentional behaviors."[210]
In artificial intelligence
The idea of an artifact made conscious is an ancient theme of mythology, appearing for example in
the Greek myth of Pygmalion, who carved a statue that was magically brought to life, and in
medieval Jewish stories of the Golem, a magically animated homunculus built of clay.[211] However,
the possibility of actually constructing a conscious machine was probably first discussed by Ada
Lovelace, in a set of notes written in 1842 about the Analytical Engine invented by Charles Babbage,
a precursor (never built) to modern electronic computers. Lovelace was essentially dismissive of
the idea that a machine such as the Analytical Engine could think in a humanlike way. She wrote:
It is desirable to guard against the possibility of exaggerated ideas that might arise
as to the powers of the Analytical Engine. ... The Analytical Engine has no
pretensions whatever to originate anything. It can do whatever we know how to
order it to perform. It can follow analysis; but it has no power of anticipating any
analytical relations or truths. Its province is to assist us in making available what
we are already acquainted with.[212]
One of the most influential contributions to this question was an essay written in 1950 by pioneering
computer scientist Alan Turing, titled Computing Machinery and Intelligence. Turing disavowed any
interest in terminology, saying that even "Can machines think?" is too loaded with spurious
connotations to be meaningful; but he proposed to replace all such questions with a specific
operational test, which has become known as the Turing test.[213] To pass the test, a computer must
be able to imitate a human well enough to fool interrogators. In his essay Turing discussed a variety
of possible objections, and presented a counterargument to each of them. The Turing test is
commonly cited in discussions of artificial intelligence as a proposed criterion for machine
consciousness; it has provoked a great deal of philosophical debate. For example, Daniel Dennett
and Douglas Hofstadter argue that anything capable of passing the Turing test is necessarily
conscious,[214] while David Chalmers argues that a philosophical zombie could pass the test, yet fail
to be conscious.[215] A third group of scholars have argued that, with technological growth, once
machines begin to display any substantial signs of human-like behavior, the dichotomy (of
human consciousness compared to human-like consciousness) becomes passé and issues of
machine autonomy begin to prevail, as can already be observed in nascent form within contemporary
industry and technology.[70][71] Jürgen Schmidhuber argues that consciousness is the result of
compression.[216] As an agent sees representations of itself recurring in the environment, the
compression of these representations can be called consciousness.
In a lively exchange over what has come to be referred to as "the Chinese room argument", John
Searle sought to refute the claim of proponents of what he calls "strong artificial intelligence (AI)"
that a computer program can be conscious, though he does agree with advocates of "weak AI" that
computer programs can be formatted to "simulate" conscious states. His own view is that
consciousness has subjective, first-person causal powers by being essentially intentional due to the
way human brains function biologically; conscious persons can perform computations, but
consciousness is not inherently computational the way computer programs are. To make a Turing
machine that speaks Chinese, Searle imagines a room with one monolingual English speaker
(Searle himself, in fact), a book that designates a combination of Chinese symbols to be output
paired with Chinese symbol input, and boxes filled with Chinese symbols. In this case, the English
speaker is acting as a computer and the rulebook as a program. Searle argues that with such a
machine, he would be able to process the inputs to outputs perfectly without having any
understanding of Chinese, nor having any idea what the questions and answers could possibly
mean. If the experiment were done in English, since Searle knows English, he would be able to take
questions and give answers without any algorithms for English questions, and he would be
effectively aware of what was being said and the purposes it might serve. Searle would pass the
Turing test of answering the questions in both languages, but he is only conscious of what he is
doing when he speaks English. Another way of putting the argument is to say that computer
programs can pass the Turing test for processing the syntax of a language, but that the syntax
cannot lead to semantic meaning in the way strong AI advocates hoped.[217][218]
In the literature concerning artificial intelligence, Searle's essay has been second only to Turing's in
the volume of debate it has generated.[219] Searle himself was vague about what extra ingredients it
would take to make a machine conscious: all he proposed was that what was needed was "causal
powers" of the sort that the brain has and that computers lack. But other thinkers sympathetic to his
basic argument have suggested that the necessary (though perhaps still not sufficient) extra
conditions may include the ability to pass not just the verbal version of the Turing test, but the
robotic version,[220] which requires grounding the robot's words in the robot's sensorimotor capacity
to categorize and interact with the things in the world that its words are about, Turing-
indistinguishably from a real person. Turing-scale robotics is an empirical branch of research on
embodied cognition and situated cognition.[221]
In 2014, Victor Argonov suggested a non-Turing test for machine consciousness based on a
machine's ability to produce philosophical judgments.[222] He argues that a deterministic machine
must be regarded as conscious if it is able to produce judgments on all problematic properties of
consciousness (such as qualia or binding) while having no innate (preloaded) philosophical
knowledge on these issues, no philosophical discussions while learning, and no informational
models of other creatures in its memory (such models may implicitly or explicitly contain knowledge
about these creatures' consciousness). However, this test can be used only to detect, but not to
refute, the existence of consciousness. A positive result proves that a machine is conscious, but a
negative result proves nothing. For example, the absence of philosophical judgments may be caused
by a lack of intellect, not by an absence of consciousness.
Nick Bostrom argued in 2023 that being very sure that large language models (LLMs) are not
conscious would require unwarranted confidence about which theory of consciousness is correct
and how it applies to machines.[223] He views consciousness as a matter of degree,[224] and argued
that machines could in theory be much more conscious than humans.[225][226]
Stream of consciousness
William James is usually credited with popularizing the idea that human consciousness flows like a
stream, in his Principles of Psychology of 1890.
A similar concept appears in Buddhist philosophy, expressed by the Sanskrit term Citta-saṃtāna,
which is usually translated as mindstream or "mental continuum". Buddhist teachings describe that
consciousness manifests moment to moment as sense impressions and mental phenomena that
are continuously changing.[228] The teachings list six triggers that can result in the generation of
different mental events.[228] These triggers are inputs from the five senses (seeing, hearing, smelling,
tasting, or touch sensations) or a thought (relating to the past, present, or future) that happens to
arise in the mind. The mental events generated as a result of these triggers are feelings,
perceptions, and intentions/behaviour. The moment-by-moment manifestation of the mind-stream is
said to happen in every person all the time. It even happens in a scientist who analyzes various
phenomena in the world, or analyzes the material body, including the brain.[228] The
manifestation of the mindstream is also described as being influenced by physical laws, biological
laws, psychological laws, volitional laws, and universal laws.[228] The purpose of the Buddhist
practice of mindfulness is to understand the inherent nature of consciousness and its
characteristics.[229]
Narrative form
In the West, the primary impact of the idea has been on literature rather than science: "stream of
consciousness as a narrative mode" means writing in a way that attempts to portray the moment-to-
moment thoughts and experiences of a character. This technique perhaps had its beginnings in the
monologues of Shakespeare's plays and reached its fullest development in the novels of James
Joyce and Virginia Woolf, although it has also been used by many other noted writers.[230]
Here, for example, is a passage from Joyce's Ulysses about the thoughts of Molly Bloom:
Yes because he never did a thing like that before as ask to get his breakfast in bed
with a couple of eggs since the City Arms hotel when he used to be pretending to be
laid up with a sick voice doing his highness to make himself interesting for that old
faggot Mrs Riordan that he thought he had a great leg of and she never left us a
farthing all for masses for herself and her soul greatest miser ever was actually
afraid to lay out 4d for her methylated spirit telling me all her ailments she had too
much old chat in her about politics and earthquakes and the end of the world let us
have a bit of fun first God help the world if all the women were her sort down on
bathingsuits and lownecks of course nobody wanted her to wear them I suppose
she was pious because no man would look at her twice I hope Ill never be like her a
wonder she didnt want us to cover our faces but she was a well-educated woman
certainly and her gabby talk about Mr Riordan here and Mr Riordan there I suppose
he was glad to get shut of her.[231]
Spiritual approaches
The Upanishads hold the oldest recorded map of consciousness, as explored by sages through
meditation.[232]
The Canadian psychiatrist Richard Maurice Bucke, author of the 1901 book Cosmic Consciousness:
A Study in the Evolution of the Human Mind, distinguished between three types of consciousness:
'Simple Consciousness', awareness of the body, possessed by many animals; 'Self Consciousness',
awareness of being aware, possessed only by humans; and 'Cosmic Consciousness', awareness of
the life and order of the universe, possessed only by humans who have attained "intellectual
enlightenment or illumination".[233]
Another thorough account of the spiritual approach is Ken Wilber's 1977 book The Spectrum of
Consciousness, a comparison of western and eastern ways of thinking about the mind. Wilber
described consciousness as a spectrum with ordinary awareness at one end, and more profound
types of awareness at higher levels.[234]
Other examples include the various levels of spiritual consciousness presented by Prem Saran
Satsangi and Stuart Hameroff.[235]
Notes
b. From the Macmillan Encyclopedia of Philosophy (1967): "Locke's use of 'consciousness' was
widely adopted in British philosophy. In the late nineteenth century the term 'introspection'
began to be used. G. F. Stout's definition is typical: "To introspect is to attend to the workings of
one's own mind" [... (1899)]".[26]: 191–192
c. "Investigating "how experience ensues from the brain", rather than exploring a factual claim,
betrays a philosophical commitment".[35]
References
2. Jaynes J (2000) [1976]. The Origin of Consciousness in the Breakdown of the Bicameral Mind.
Houghton Mifflin. ISBN 0-618-05707-2.
3. Rochat P (2003). "Five levels of self-awareness as they unfold early in life" (http://psychology.e
mory.edu/cognition/rochat/Five%20levels%20.pdf) (PDF). Consciousness and Cognition. 12
(4): 717–731. doi:10.1016/s1053-8100(03)00081-3 (https://doi.org/10.1016%2Fs1053-8100%
2803%2900081-3) . PMID 14656513 (https://pubmed.ncbi.nlm.nih.gov/14656513) .
S2CID 10241157 (https://api.semanticscholar.org/CorpusID:10241157) . Archived (https://gh
ostarchive.org/archive/20221009/http://psychology.emory.edu/cognition/rochat/Five%20level
s%20.pdf) (PDF) from the original on 2022-10-09.
4. P.A. Guertin (2019). "A novel concept introducing the idea of continuously changing levels of
consciousness" (https://jcer.com/index.php/jcj/article/view/829/825) . Journal of
Consciousness Exploration & Research. 10 (6): 406–412. Archived (https://web.archive.org/we
b/20211215112848/https://jcer.com/index.php/jcj/article/view/829/825) from the original
on 2021-12-15. Retrieved 2021-08-19.
5. Hacker P (2012). "The Sad and Sorry History of Consciousness: being, among other things, a
challenge to the "consciousness-studies community" " (http://info.sjc.ox.ac.uk/scr/hacker/doc
s/ConsciousnessAChallenge.pdf) (PDF). Royal Institute of Philosophy. supplementary volume
70. Archived (https://ghostarchive.org/archive/20221009/http://info.sjc.ox.ac.uk/scr/hacker/d
ocs/ConsciousnessAChallenge.pdf) (PDF) from the original on 2022-10-09.
6. Barfield O (1962) [1926]. History in English Words (239 pgs. paper covered ed.). London: Faber
and Faber Limited.
7. C. S. Lewis (1990). "Ch. 8: Conscience and conscious". Studies in words. Cambridge University
Press. ISBN 978-0-521-39831-2.
8. Thomas Hobbes (1904). Leviathan: or, The Matter, Forme & Power of a Commonwealth,
Ecclesiasticall and Civill (https://archive.org/details/leviathan00hobbgoog) . University Press.
p. 39 (https://archive.org/details/leviathan00hobbgoog/page/n62) .
9. James Ussher, Charles Richard Elrington (1613). The whole works, Volume 2. Hodges and
Smith. p. 417.
11. G. Molenaar (1969). "Seneca's Use of the Term Conscientia". Mnemosyne. 22 (2): 170–180.
doi:10.1163/156852569x00670 (https://doi.org/10.1163%2F156852569x00670) .
12. Boris Hennig (2007). "Cartesian Conscientia". British Journal for the History of Philosophy. 15
(3): 455–484. doi:10.1080/09608780701444915 (https://doi.org/10.1080%2F0960878070144
4915) . S2CID 218603781 (https://api.semanticscholar.org/CorpusID:218603781) .
13. Charles Adam, Paul Tannery (eds.), Oeuvres de Descartes X, 524 (https://archive.org/stream/oe
uvresdedescar10desc#page/524/mode/2up) (1908).
14. Sara Heinämaa, Vili Lähteenmäki, Pauliina Remes, eds. (2007). Consciousness: from perception
to reflection in the history of philosophy. Springer. pp. 205–206. ISBN 978-1-4020-6081-6.
15. Locke J. "An Essay Concerning Human Understanding (Chapter XXVII)" (https://web.archive.or
g/web/20180508053707/https://ebooks.adelaide.edu.au/l/locke/john/l81u/B2.27.html) .
Australia: University of Adelaide. Archived from the original (https://ebooks.adelaide.edu.au/l/l
ocke/john/l81u/B2.27.html) on May 8, 2018. Retrieved August 20, 2010.
23. Jaynes J (1976). The Origin of Consciousness in the Breakdown of the Bicameral Mind (https://a
rchive.org/details/originofconsciou0000unse) . Houghton Mifflin. ISBN 0-395-20729-0.
24. James W (1948) [1892]. Psychology. Cleveland: Fine Editions Press, World Publishing Co.
27. Peters RS, Mace CA (1967). "Psychology". In Edwards P (ed.). The Encyclopedia of Philosophy.
Vol. 7 (Reprint 1972 ed.). Macmillan, Inc. pp. 1–27.
32. Michael V. Antony (2001). "Is consciousness ambiguous?". Journal of Consciousness Studies. 8:
19–44.
33. Justin Sytsma, Edouard Machery (2010). "Two conceptions of subjective experience" (http://ph
ilsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf)
(PDF). Philosophical Studies. 151 (2): 299–327. doi:10.1007/s11098-009-9439-x (https://doi.or
g/10.1007%2Fs11098-009-9439-x) . S2CID 2444730 (https://api.semanticscholar.org/CorpusI
D:2444730) . Archived (https://ghostarchive.org/archive/20221009/http://philsci-archive.pitt.
edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf) (PDF) from the
original on 2022-10-09.
34. Max Velmans (2009). "How to define consciousness—and how not to define consciousness".
Journal of Consciousness Studies. 16: 139–156.
36. Frith C, Metzinger T (March 2016). "What's the Use of Consciousness? How the Stab of
Conscience Made Us Really Conscious". In Engel AK (ed.). The Pragmatic Turn: Toward Action-
Oriented Views in Cognitive Science (https://www.researchgate.net/publication/304657860) .
pp. 193–214. doi:10.7551/mitpress/9780262034326.003.0012 (https://doi.org/10.7551%2Fmi
tpress%2F9780262034326.003.0012) . ISBN 9780262034326.
38. Seth A (March 2016). "Action-Oriented Understanding of Consciousness and the Structure of
Experience". In Engel AK (ed.). The Pragmatic Turn: Toward Action-Oriented Views in Cognitive
Science. pp. 261–282. doi:10.7551/mitpress/9780262034326.003.0012 (https://doi.org/10.75
51%2Fmitpress%2F9780262034326.003.0012) . ISBN 978-0-262-03432-6.
42. Güzeldere G (1997). Block N, Flanagan O, Güzeldere G (eds.). The Nature of Consciousness:
Philosophical Debates. Cambridge, MA: MIT Press. pp. 1–67.
43. Fins JJ, Schiff ND, Foley KM (2007). "Late recovery from the minimally conscious state: ethical
and policy implications". Neurology. 68 (4): 304–307.
doi:10.1212/01.wnl.0000252376.43779.96 (https://doi.org/10.1212%2F01.wnl.0000252376.43
779.96) . PMID 17242341 (https://pubmed.ncbi.nlm.nih.gov/17242341) . S2CID 32561349
(https://api.semanticscholar.org/CorpusID:32561349) .
45. Justin Sytsma, Edouard Machery (2010). "Two conceptions of subjective experience" (http://ph
ilsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf)
(PDF). Philosophical Studies. 151 (2): 299–327. doi:10.1007/s11098-009-9439-x (https://doi.or
g/10.1007%2Fs11098-009-9439-x) . S2CID 2444730 (https://api.semanticscholar.org/CorpusI
D:2444730) . Archived (https://ghostarchive.org/archive/20221009/http://philsci-archive.pitt.
edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf) (PDF) from the
original on 2022-10-09.
46. Gilbert Ryle (2000) [1949]. The Concept of Mind. University of Chicago Press. pp. 156–163.
ISBN 978-0-226-73296-1.
47. Ned Block (1998). "On a confusion about a function of consciousness" (http://cogprints.org/23
1/1/199712004.html) . In N. Block, O. Flanagan, G. Guzeldere (eds.). The Nature of
Consciousness: Philosophical Debates. MIT Press. pp. 375–415. ISBN 978-0-262-52210-6.
Archived (https://web.archive.org/web/20111103034117/http://cogprints.org/231/1/1997120
04.html) from the original on 2011-11-03. Retrieved 2011-09-10. Pages 230 and 231 in the
version on the author's own website (https://www.nedblock.us/papers/1995_Function.pdf) .
48. Daniel Dennett (2004). Consciousness Explained. Penguin. p. 375. ISBN 978-0-7139-9037-9.
50. William Lycan (1996). Consciousness and Experience. MIT Press. pp. 1–4. ISBN 978-0-262-
12197-2.
52. Harris, S. (12 October 2011). The mystery of consciousness. Sam Harris.
https://www.samharris.org/blog/the-mystery-of-consciousness Archived (https://web.archiv
e.org/web/20230423061921/https://www.samharris.org/blog/the-mystery-of-consciousnes
s) 2023-04-23 at the Wayback Machine
53. Tricker, C. (2022). The cicada and the bird. The usefulness of a useless philosophy. Chuang
Tzu's ancient wisdom translated for modern life. (https://thecicadaandthebird.com) Archived
(https://web.archive.org/web/20230421032929/https://thecicadaandthebird.com/) 2023-04-
21 at the Wayback Machine Page 52. (Google Books) (https://books.google.com/books?id=Yn
CaEAAAQBAJ) Archived (https://web.archive.org/web/20230608153319/https://books.googl
e.com/books?id=YnCaEAAAQBAJ) 2023-06-08 at the Wayback Machine
54. Dy MB Jr (2001). Philosophy of Man: selected readings. Goodwill Trading Co. p. 97. ISBN 978-
971-12-0245-3.
56. William Jaworski (2011). Philosophy of Mind: A Comprehensive Introduction. John Wiley and
Sons. pp. 5–11. ISBN 978-1-4443-3367-1.
57. Julien Offray de La Mettrie (1996). Ann Thomson (ed.). Machine man and other writings.
Cambridge University Press. ISBN 978-0-521-47849-6.
58. Gerald Edelman (1993). Bright Air, Brilliant Fire: On the Matter of the Mind (https://archive.org/de
tails/brightairbrillia00gera) . Basic Books. ISBN 978-0-465-00764-6.
59. Antonio Damasio (1999). The Feeling of What Happens: Body and Emotion in the Making of
Consciousness (https://archive.org/details/feelingofwhathap00dama_0) . New York: Harcourt
Press. ISBN 978-0-15-601075-7.
61. Christof Koch (2004). The Quest for Consciousness. Englewood, CO: Roberts & Company.
ISBN 978-0-9747077-0-9.
62. Ron Sun and Stan Franklin, Computational models of consciousness: A taxonomy and some
examples. In: P.D. Zelazo, M. Moscovitch, and E. Thompson (eds.), The Cambridge Handbook of
Consciousness, pp. 151–174. Cambridge University Press, New York. 2007
64. Cai J, Popescu S, Briegel H (2010). "Persistent dynamic entanglement from classical motion:
How bio-molecular machines can generate non-trivial quantum states". Physical Review E. 82
(2): 021921. arXiv:0809.4906 (https://arxiv.org/abs/0809.4906) .
Bibcode:2010PhRvE..82b1921C (https://ui.adsabs.harvard.edu/abs/2010PhRvE..82b1921C) .
doi:10.1103/PhysRevE.82.021921 (https://doi.org/10.1103%2FPhysRevE.82.021921) .
PMID 20866851 (https://pubmed.ncbi.nlm.nih.gov/20866851) . S2CID 23336691 (https://api.
semanticscholar.org/CorpusID:23336691) .
65. John Searle (1997). The Mystery of Consciousness. The New York Review of Books. pp. 53–88.
ISBN 978-0-940322-06-6.
66. Derakhshani M, Diósi L, Laubenstein M, Piscicchia K, Curceanu C (September 2022). "At the
crossroad of the search for spontaneous radiation and the Orch OR consciousness theory" (htt
ps://www.sciencedirect.com/science/article/abs/pii/S1571064522000197) . Physics of Life
Reviews. 42: 8–14. Bibcode:2022PhLRv..42....8D (https://ui.adsabs.harvard.edu/abs/2022PhLR
v..42....8D) . doi:10.1016/j.plrev.2022.05.004 (https://doi.org/10.1016%2Fj.plrev.2022.05.00
4) . PMID 35617922 (https://pubmed.ncbi.nlm.nih.gov/35617922) .
67. Rocco J. Gennaro (2011). "§4.4 The hard problem of consciousness" (https://books.google.co
m/books?id=t-XgKMgzwk4C&pg=PA75) . The Consciousness Paradox: Consciousness,
Concepts, and Higher-Order Thoughts. MIT Press. p. 75. ISBN 978-0-262-01660-5.
68. Hyslop A (14 January 2014). Zalta EN, Nodelman U (eds.). "Other minds" (http://plato.stanford.
edu/entries/other-minds/) . Stanford Encyclopedia of Philosophy. Metaphysics Research Lab,
Center for the Study of Language and Information, Stanford University. ISSN 1095-5054 (http
s://search.worldcat.org/issn/1095-5054) . Retrieved May 26, 2015.
70. The Culture and Philosophy of Ridley Scott, Greg Littmann, pp. 133–144, Lexington Books
(2013).
71. Moral Machines, Wendell Wallach and Colin Allen, 288 pages, Oxford University Press, USA
(June 3, 2010), ISBN 0-19-973797-5.
72. Alec Hyslop (1995). "The analogical inference to other minds". Other Minds. Springer. pp. 41–
70. ISBN 978-0-7923-3245-9.
74. Stevan Harnad (1995). "Why and how we are not zombies". Journal of Consciousness Studies.
1: 164–167.
75. Parsons P, Dixon G (2016). 50 Ideas You Really Need to Know: Science. London: Quercus.
pp. 141–143. ISBN 978-1-78429-614-8.
76. Oxford English Dictionary, "qualia", 3rd ed., Oxford University Press, 2010. Accessed October 3,
2024. https://www.oed.com/search/dictionary/?scope=Entries&q=qualia .
80. Kolak D (2007-11-03). I Am You: The Metaphysical Foundations for Global Ethics (https://digitalp
hysics.ru/pdf/Kaminskii_A_V/Kolak_I_Am_You.pdf) (PDF). Springer Science & Business
Media. ISBN 978-1-4020-3014-7. Archived (https://web.archive.org/web/20240906163443/htt
ps://digitalphysics.ru/pdf/Kaminskii_A_V/Kolak_I_Am_You.pdf) (PDF) from the original on
2024-09-06.
85. Conitzer V (30 Aug 2020). "The Personalized A-Theory of Time and Perspective".
arXiv:2008.13207v1 (https://arxiv.org/abs/2008.13207v1) [physics.hist-ph (https://arxiv.org/
archive/physics.hist-ph) ].
86. Hare C (September 2010). "Realism About Tense and Perspective" (http://web.mit.edu/~caspa
rh/www/Papers/CJHarePerspectivalRealism.pdf) (PDF). Philosophy Compass. 5 (9): 760–
769. doi:10.1111/j.1747-9991.2010.00325.x (https://doi.org/10.1111%2Fj.1747-9991.2010.003
25.x) . hdl:1721.1/115229 (https://hdl.handle.net/1721.1%2F115229) .
87. Hare C (July 2007). "Self-Bias, Time-Bias, and the Metaphysics of Self and Time" (http://web.mi
t.edu/~casparh/www/Papers/CJHareSelfBias2.pdf) (PDF). The Journal of Philosophy. 104
(7): 350–373. doi:10.5840/jphil2007104717 (https://doi.org/10.5840%2Fjphil2007104717) .
88. Hare C (2009). On Myself, and Other, Less Important Subjects (http://press.princeton.edu/titles/
8921.html) . Princeton University Press. ISBN 9780691135311.
89. Horst Hendriks-Jansen (1996). Catching ourselves in the act: situated activity, interactive
emergence, evolution, and human thought. Massachusetts Institute of Technology. p. 114.
ISBN 978-0-262-08246-4.
90. Mandler, G. "Consciousness: Respectable, useful, and probably necessary". In R. Solso (Ed.),
Information processing and cognition. NJ: LEA.
91. "Science and Religion Forum" (https://www.srforum.org/about) . 2021. Archived (https://web.
archive.org/web/20161103075415/http://srforum.org/about/) from the original on 2016-11-
03.
93. Stuart Hameroff, Alfred Kaszniak, David Chalmers (1999). "Preface". Toward a Science of
Consciousness III: The Third Tucson Discussions and Debates. MIT Press. pp. xix–xx. ISBN 978-
0-262-58181-3.
94. Lucido, R. J. (2023). Testing the consciousness causing collapse interpretation of quantum
mechanics using subliminal primes derived from random fluctuations in radioactive decay.
Journal of Consciousness Exploration & Research, 14(3), 185-194.
https://doi.org/10.13140/RG.2.2.20344.72969
95. Bernard Baars (1993). A Cognitive Theory of Consciousness. Cambridge University Press.
pp. 15–18. ISBN 978-0-521-42743-2.
96. Paul Rooks, Jane Wilson (2000). Perception: Theory, Development, and Organization.
Psychology Press. pp. 25–26. ISBN 978-0-415-19094-7.
97. Thomas Schmidt, Dirk Vorberg (2006). "Criteria for unconscious cognition: Three types of
dissociation" (https://doi.org/10.3758%2Fbf03193692) . Perception and Psychophysics. 68
(3): 489–504. doi:10.3758/bf03193692 (https://doi.org/10.3758%2Fbf03193692) .
PMID 16900839 (https://pubmed.ncbi.nlm.nih.gov/16900839) .
98. Arnaud Destrebecqz, Philippe Peigneux (2006). "Methods for studying unconscious learning".
In Steven Laureys (ed.). The Boundaries of Consciousness: Neurobiology and Neuropathology.
Elsevier. pp. 69–80. ISBN 978-0-444-52876-6.
101. David Chalmers (1996). "Ch. 3: Can consciousness be reductively explained?" (https://archive.o
rg/details/consciousmindins00chal) . The Conscious Mind. Oxford University Press.
ISBN 978-0-19-511789-9.
102. J.T. Giacino, C.M. Smart (2007). "Recent advances in behavioral assessment of individuals with
disorders of consciousness". Current Opinion in Neurology. 20 (6): 614–619.
doi:10.1097/WCO.0b013e3282f189ef (https://doi.org/10.1097%2FWCO.0b013e3282f189ef) .
PMID 17992078 (https://pubmed.ncbi.nlm.nih.gov/17992078) . S2CID 7097163 (https://api.s
emanticscholar.org/CorpusID:7097163) .
103. Christof Koch (October 2017). "How to Make a Consciousness Meter". Scientific American. 317
(5): 28–33. Bibcode:2017SciAm.317e..28K (https://ui.adsabs.harvard.edu/abs/2017SciAm.31
7e..28K) . doi:10.1038/scientificamerican1117-28 (https://doi.org/10.1038%2Fscientificameri
can1117-28) . PMID 29565878 (https://pubmed.ncbi.nlm.nih.gov/29565878) .
104. Patrick Haggard (2008). "Human volition: towards a neuroscience of will". Nature Reviews
Neuroscience. 9 (12): 934–946. doi:10.1038/nrn2497 (https://doi.org/10.1038%2Fnrn2497) .
PMID 19020512 (https://pubmed.ncbi.nlm.nih.gov/19020512) . S2CID 1495720 (https://api.s
emanticscholar.org/CorpusID:1495720) .
105. Gordon Gallup (1970). "Chimpanzees: Self recognition". Science. 167 (3914): 86–87.
Bibcode:1970Sci...167...86G (https://ui.adsabs.harvard.edu/abs/1970Sci...167...86G) .
doi:10.1126/science.167.3914.86 (https://doi.org/10.1126%2Fscience.167.3914.86) .
PMID 4982211 (https://pubmed.ncbi.nlm.nih.gov/4982211) . S2CID 145295899 (https://api.s
emanticscholar.org/CorpusID:145295899) .
106. David Edelman, Anil Seth (2009). "Animal consciousness: a synthetic approach". Trends in
Neurosciences. 32 (9): 476–484. doi:10.1016/j.tins.2009.05.008 (https://doi.org/10.1016%2Fj.t
ins.2009.05.008) . PMID 19716185 (https://pubmed.ncbi.nlm.nih.gov/19716185) .
S2CID 13323524 (https://api.semanticscholar.org/CorpusID:13323524) .
107. Broom DM, Sena H, Moynihan KL (2009). "Pigs learn what a mirror image represents and use it
to obtain information" (https://linkinghub.elsevier.com/retrieve/pii/S0003347209003571) .
Animal Behaviour. 78 (5): 1037–1041. doi:10.1016/j.anbehav.2009.07.027 (https://doi.org/10.1
016%2Fj.anbehav.2009.07.027) .
110. Christof Koch (2004). The Quest for Consciousness. Englewood, CO: Roberts & Company.
pp. 16–19. ISBN 978-0-9747077-0-9.
112. Rodolfo Llinás (2002). I of the vortex: from neurons to self. MIT Press. ISBN 978-0-262-62163-2.
115. Francis Crick, Christof Koch (2003). "A framework for consciousness" (https://web.archive.org/
web/20120522054447/http://papers.klab.caltech.edu/29/1/438.pdf) (PDF). Nature
Neuroscience. 6 (2): 119–126. doi:10.1038/nn0203-119 (https://doi.org/10.1038%2Fnn0203-1
19) . PMID 12555104 (https://pubmed.ncbi.nlm.nih.gov/12555104) . S2CID 13960489 (http
s://api.semanticscholar.org/CorpusID:13960489) . Archived from the original (http://papers.kl
ab.caltech.edu/29/1/438.pdf) (PDF) on 2012-05-22.
118. Zhaoping L (2020-07-30). "The Flip Tilt Illusion: Visible in Peripheral Vision as Predicted by the
Central-Peripheral Dichotomy" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7401056) . i-
Perception. 11 (4) 2041669520938408. doi:10.1177/2041669520938408 (https://doi.org/10.1
177%2F2041669520938408) . ISSN 2041-6695 (https://search.worldcat.org/issn/2041-669
5) . PMC 7401056 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7401056) .
PMID 32782769 (https://pubmed.ncbi.nlm.nih.gov/32782769) .
119. Biederlack J., Castelo-Branco M., Neuenschwander S., Wheeler D.W., Singer W., Nikolić D.
(2006). "Brightness induction: Rate enhancement and neuronal synchronization as
complementary codes" (https://doi.org/10.1016%2Fj.neuron.2006.11.012) . Neuron. 52 (6):
1073–1083. doi:10.1016/j.neuron.2006.11.012 (https://doi.org/10.1016%2Fj.neuron.2006.11.0
12) . PMID 17178409 (https://pubmed.ncbi.nlm.nih.gov/17178409) . S2CID 16732916 (http
s://api.semanticscholar.org/CorpusID:16732916) .
120. Williams Adrian L., Singh Krishna D., Smith Andrew T. (2003). "Surround modulation measured
with functional MRI in the human visual cortex". Journal of Neurophysiology. 89 (1): 525–533.
CiteSeerX 10.1.1.137.1066 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.137.1
066) . doi:10.1152/jn.00048.2002 (https://doi.org/10.1152%2Fjn.00048.2002) .
PMID 12522199 (https://pubmed.ncbi.nlm.nih.gov/12522199) .
121. Adenauer G. Casali, Olivia Gosseries, Mario Rosanova, Mélanie Boly, Simone Sarasso, Karina R.
Casali, Silvia Casarotto, Marie-Aurélie Bruno, Steven Laureys, Giulio Tononi, Marcello Massimini
(14 August 2013). "A Theoretically based index of consciousness independent of sensory
processing and behavior" (http://orbi.ulg.ac.be/jspui/handle/2268/171542) . Science
Translational Medicine. 5 (198): 198ra105. doi:10.1126/scitranslmed.3006294 (https://doi.org/
10.1126%2Fscitranslmed.3006294) . hdl:2268/171542 (https://hdl.handle.net/2268%2F1715
42) . PMID 23946194 (https://pubmed.ncbi.nlm.nih.gov/23946194) . S2CID 8686961 (http
s://api.semanticscholar.org/CorpusID:8686961) .
122. Ann B. Butler, Paul R. Manger, B.I.B Lindahl, Peter Århem (2005). "Evolution of the neural basis
of consciousness: a bird-mammal comparison". BioEssays. 27 (9): 923–936.
doi:10.1002/bies.20280 (https://doi.org/10.1002%2Fbies.20280) . PMID 16108067 (https://p
ubmed.ncbi.nlm.nih.gov/16108067) .
123. Francis Crick and Christof Koch (1995). "Are we aware of neural activity in primary visual
cortex?". Nature. 375 (6527): 121–123. Bibcode:1995Natur.375..121C (https://ui.adsabs.harvar
d.edu/abs/1995Natur.375..121C) . doi:10.1038/375121a0 (https://doi.org/10.1038%2F37512
1a0) . PMID 7753166 (https://pubmed.ncbi.nlm.nih.gov/7753166) . S2CID 4262990 (https://
api.semanticscholar.org/CorpusID:4262990) .
124. Gerald M. Edelman and Giulio Tononi (2000). A Universe of Consciousness: How Matter
Becomes Imagination. Basic Books. ISBN 978-0-465-01376-0.
125. Rodney M.J. Cotterill (2001). "Cooperation of the basal ganglia, cerebellum, sensory cerebrum
and hippocampus: possible implications for cognition, consciousness, intelligence and
creativity". Progress in Neurobiology. 64 (1): 1–33. doi:10.1016/s0301-0082(00)00058-7 (http
s://doi.org/10.1016%2Fs0301-0082%2800%2900058-7) . PMID 11250060 (https://pubmed.nc
bi.nlm.nih.gov/11250060) . S2CID 206054149 (https://api.semanticscholar.org/CorpusID:206
054149) .
126. J.C. Eccles (1982). "Animal consciousness and human self-consciousness". Experientia. 38
(12): 1384–1391. doi:10.1007/bf01955747 (https://doi.org/10.1007%2Fbf01955747) .
PMID 7151952 (https://pubmed.ncbi.nlm.nih.gov/7151952) . S2CID 35174442 (https://api.se
manticscholar.org/CorpusID:35174442) .
127. John Eccles (1990). "A unitary hypothesis of mind-brain interaction in the cerebral cortex".
Proceedings of the Royal Society of London B. 240 (1299): 433–451.
Bibcode:1990RSPSB.240..433E (https://ui.adsabs.harvard.edu/abs/1990RSPSB.240..433E) .
doi:10.1098/rspb.1990.0047 (https://doi.org/10.1098%2Frspb.1990.0047) . PMID 2165613 (h
ttps://pubmed.ncbi.nlm.nih.gov/2165613) . S2CID 23188208 (https://api.semanticscholar.or
g/CorpusID:23188208) .
132. Roelofs L, Sebo J (2024). "Overlapping minds and the hedonic calculus" (https://doi.org/10.100
7%2Fs11098-024-02167-x) . Philosophical Studies. 181 (6–7): 1487–1506.
doi:10.1007/s11098-024-02167-x (https://doi.org/10.1007%2Fs11098-024-02167-x) .
134. McKenzie C (2024-06-01). "Consciousness defined: requirements for biological and artificial
general intelligence" ". arXiv:2406.01648 (https://arxiv.org/abs/2406.01648) [q-bio.NC (http
s://arxiv.org/archive/q-bio.NC) ].
135. Cleeremans, Jiménez (2002). "Implicit Learning and Consciousness: A Graded, Dynamic
Perspective" (https://philpapers.org/rec/CLEILA) . Psychology Press.
136. Northoff G, Lamme V (2020). "Neural signs and mechanisms of consciousness: Is there a
potential convergence of theories of consciousness in sight?". Neuroscience and Biobehavioral
Reviews. 118: 568–587. doi:10.1016/j.neubiorev.2020.07.019 (https://doi.org/10.1016%2Fj.ne
ubiorev.2020.07.019) . PMID 32783969 (https://pubmed.ncbi.nlm.nih.gov/32783969) .
S2CID 221084519 (https://api.semanticscholar.org/CorpusID:221084519) .
137. Seth AK, Bayne T (2022). "Theories of consciousness" (http://sro.sussex.ac.uk/id/eprint/10503
0/1/SethBayne_NRN_accepted.pdf) (PDF). Nature Reviews Neuroscience. 23 (7): 439–452.
doi:10.1038/s41583-022-00587-4 (https://doi.org/10.1038%2Fs41583-022-00587-4) .
PMID 35505255 (https://pubmed.ncbi.nlm.nih.gov/35505255) . S2CID 242810797 (https://ap
i.semanticscholar.org/CorpusID:242810797) . Archived (https://web.archive.org/web/202301
21221104/http://sro.sussex.ac.uk/id/eprint/105030/1/SethBayne_NRN_accepted.pdf) (PDF)
from the original on 2023-01-21. Retrieved 2023-01-17.
138. Doerig A, Schurger A, Herzog MH (2021). "Hard criteria for empirical theories of
consciousness" (https://doi.org/10.1080%2F17588928.2020.1772214) . Cognitive
Neuroscience. 12 (2): 41–62. doi:10.1080/17588928.2020.1772214 (https://doi.org/10.1080%
2F17588928.2020.1772214) . hdl:2066/228876 (https://hdl.handle.net/2066%2F228876) .
PMID 32663056 (https://pubmed.ncbi.nlm.nih.gov/32663056) . S2CID 220529998 (https://ap
i.semanticscholar.org/CorpusID:220529998) .
141. Tononi G, Boly M, Massimini M, Koch C (July 2016). "Integrated information theory: from
consciousness to its physical substrate" (https://www.nature.com/articles/nrn.2016.44) .
Nature Reviews Neuroscience. 17 (7): 450–461. doi:10.1038/nrn.2016.44 (https://doi.org/10.10
38%2Fnrn.2016.44) . ISSN 1471-0048 (https://search.worldcat.org/issn/1471-0048) .
PMID 27225071 (https://pubmed.ncbi.nlm.nih.gov/27225071) . S2CID 21347087 (https://api.
semanticscholar.org/CorpusID:21347087) . Archived (https://web.archive.org/web/20230504
082713/https://www.nature.com/articles/nrn.2016.44) from the original on 2023-05-04.
Retrieved 2023-05-21.
142. Lenharo M (2023-09-20). "Consciousness theory slammed as 'pseudoscience' — sparking
uproar" (https://www.nature.com/articles/d41586-023-02971-1) . Nature.
doi:10.1038/d41586-023-02971-1 (https://doi.org/10.1038%2Fd41586-023-02971-1) .
PMID 37730789 (https://pubmed.ncbi.nlm.nih.gov/37730789) .
144. Graziano, M.S.A., Kastner, S (2011). "Human consciousness and its relationship to social
neuroscience: A novel hypothesis" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC322302
5) . Cog. Neurosci. 2 (2): 98–113. doi:10.1080/17588928.2011.565121 (https://doi.org/10.10
80%2F17588928.2011.565121) . PMC 3223025 (https://www.ncbi.nlm.nih.gov/pmc/articles/
PMC3223025) . PMID 22121395 (https://pubmed.ncbi.nlm.nih.gov/22121395) .
145. Graziano M (2013). Consciousness and the Social Brain. Oxford University Press. ISBN 978-
0199928644.
146. Carhart-Harris RL, Friston KJ, Barker EL (20 June 2019). "REBUS and the Anarchic Brain:
Toward a Unified Model of the Brain Action of Psychedelics" (https://www.ncbi.nlm.nih.gov/pm
c/articles/PMC6588209) . Pharmacological Reviews. 71 (3): 316–344.
doi:10.1124/pr.118.017160 (https://doi.org/10.1124%2Fpr.118.017160) . PMC 6588209 (http
s://www.ncbi.nlm.nih.gov/pmc/articles/PMC6588209) . PMID 31221820 (https://pubmed.ncb
i.nlm.nih.gov/31221820) .
147. Carhart-Harris RL (November 2018). "The entropic brain – revisited". Neuropharmacology. 142:
167–178. doi:10.1016/j.neuropharm.2018.03.010 (https://doi.org/10.1016%2Fj.neuropharm.2
018.03.010) . PMID 29548884 (https://pubmed.ncbi.nlm.nih.gov/29548884) .
S2CID 4483591 (https://api.semanticscholar.org/CorpusID:4483591) .
148. Carhart-Harris RL, Leech R, Hellyer PJ, Shanahan M, Feilding A, Tagliazucchi E, Chialvo DR, Nutt
D (2014). "The entropic brain: a theory of conscious states informed by neuroimaging research
with psychedelic drugs" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3909994) .
Frontiers in Human Neuroscience. 8: 20. doi:10.3389/fnhum.2014.00020 (https://doi.org/10.33
89%2Ffnhum.2014.00020) . PMC 3909994 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC
3909994) . PMID 24550805 (https://pubmed.ncbi.nlm.nih.gov/24550805) .
149. "Entropy as More than Chaos in the Brain: Expanding Field, Expanding Minds" (https://mind-fou
ndation.org/entropy-as-more-than-chaos/) . 2018-06-22. Archived (https://web.archive.org/we
b/20190705111205/https://mind-foundation.org/entropy-as-more-than-chaos/) from the
original on 2019-07-05. Retrieved 2019-07-05.
150. Papo D (30 August 2016). "Commentary: The entropic brain: a theory of conscious states
informed by neuroimaging research with psychedelic drugs" (https://www.ncbi.nlm.nih.gov/pm
c/articles/PMC5004455) . Frontiers in Human Neuroscience. 10: 423.
doi:10.3389/fnhum.2016.00423 (https://doi.org/10.3389%2Ffnhum.2016.00423) .
PMC 5004455 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5004455) . PMID 27624312
(https://pubmed.ncbi.nlm.nih.gov/27624312) .
151. David Rudrauf, Daniel Bennequin, Isabela Granic, Gregory Landini, Karl Friston, Kenneth
Williford (2017). "A Mathematical Model of Embodied Consciousness" (https://discovery.ucl.a
c.uk/id/eprint/10057795/) . Journal of Theoretical Biology. 428 (1): 106–131.
Bibcode:2017JThBi.428..106R (https://ui.adsabs.harvard.edu/abs/2017JThBi.428..106R) .
doi:10.1016/j.jtbi.2017.05.032 (https://doi.org/10.1016%2Fj.jtbi.2017.05.032) .
hdl:2066/175365 (https://hdl.handle.net/2066%2F175365) . PMID 28554611 (https://pubme
d.ncbi.nlm.nih.gov/28554611) . S2CID 4476538 (https://api.semanticscholar.org/CorpusID:44
76538) .
152. Koubeissi MZ, Bartolomei F, Beltagy A, Picard F (2014). "Electrical stimulation of a small brain
area reversibly disrupts consciousness" (https://linkinghub.elsevier.com/retrieve/pii/S1525505
014002017) . Epilepsy & Behavior. 37: 32–35. doi:10.1016/j.yebeh.2014.05.027 (https://doi.or
g/10.1016%2Fj.yebeh.2014.05.027) . PMID 24967698 (https://pubmed.ncbi.nlm.nih.gov/2496
7698) .
153. "A Brain Area Thought to Impart Consciousness Instead Behaves Like an Internet Router" (http
s://neurosciencenews.com/claustrum-cognition-21834/) . Neuroscience News. 2022-11-14.
Retrieved 2025-06-22.
154. Madden MB, Stewart BW, White MG, Krimmel SR, Qadir H, Barrett FS, ... Mathur BN (2022). "A
role for the claustrum in cognitive control". Trends in Cognitive Sciences. 26 (12): 1133–1152.
doi:10.1016/j.tics.2022.09.006 (https://doi.org/10.1016%2Fj.tics.2022.09.006) .
155. T.H. Huxley (1874). "On the hypothesis that animals are automata, and its history" (https://doi.o
rg/10.1038%2F010362a0) . The Fortnightly Review. 16 (253): 555–580.
Bibcode:1874Natur..10..362. (https://ui.adsabs.harvard.edu/abs/1874Natur..10..362.) .
doi:10.1038/010362a0 (https://doi.org/10.1038%2F010362a0) .
158. Karl R. Popper, John C. Eccles (1977). The Self and Its Brain (https://archive.org/details/selfitsb
rain0000popp) . Springer International. ISBN 978-0-387-08307-0.
159. Peter Århem, B.I.B. Lindahl, Paul R. Manger, Ann B. Butler (2008). "On the origin of
consciousness—some amniote scenarios" (https://books.google.com/books?id=OQGJz1DVQ
NMC&pg=PA77) . In Hans Liljenström, Peter Århem (eds.). Consciousness Transitions:
Phylogenetic, Ontogenetic, and Physiological Aspects. Elsevier. ISBN 978-0-444-52977-0.
160. Feinberg TE, Mallatt J (October 2013). "The evolutionary and genetic origins of consciousness
in the Cambrian Period over 500 million years ago" (https://www.ncbi.nlm.nih.gov/pmc/article
s/PMC3790330) . Frontiers in Psychology. 4: 667. doi:10.3389/fpsyg.2013.00667 (https://doi.
org/10.3389%2Ffpsyg.2013.00667) . PMC 3790330 (https://www.ncbi.nlm.nih.gov/pmc/articl
es/PMC3790330) . PMID 24109460 (https://pubmed.ncbi.nlm.nih.gov/24109460) .
161. Donald Griffin (2001). Animal Minds: Beyond Cognition to Consciousness. University of Chicago
Press. ISBN 978-0-226-30865-4.
163. Bernard Baars (January 2002). "The conscious access hypothesis: Origins and recent
evidence". Trends in Cognitive Sciences. 6 (1): 47–52. doi:10.1016/S1364-6613(00)01819-2 (htt
ps://doi.org/10.1016%2FS1364-6613%2800%2901819-2) . PMID 11849615 (https://pubmed.n
cbi.nlm.nih.gov/11849615) . S2CID 6386902 (https://api.semanticscholar.org/CorpusID:6386
902) .
164. Seth A, Eugene Izhikevich, George Reeke, Gerald Edelman (2006). "Theories and measures of
consciousness: An extended framework" (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC148
7169) . Proceedings of the National Academy of Sciences. 103 (28): 10799–10804.
Bibcode:2006PNAS..10310799S (https://ui.adsabs.harvard.edu/abs/2006PNAS..10310799
S) . doi:10.1073/pnas.0604347103 (https://doi.org/10.1073%2Fpnas.0604347103) .
PMC 1487169 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1487169) . PMID 16818879
(https://pubmed.ncbi.nlm.nih.gov/16818879) .
165. Ezequiel Morsella (2005). "The function of phenomenal states: Supramodular Interaction
Theory" (https://web.archive.org/web/20201118022838/http://pdfs.semanticscholar.org/fdd
7/81a15d0405a888abe4584a99ed9cbc6fb3ff.pdf) (PDF). Psychological Review. 112 (4):
1000–1021. doi:10.1037/0033-295X.112.4.1000 (https://doi.org/10.1037%2F0033-295X.112.
4.1000) . PMID 16262477 (https://pubmed.ncbi.nlm.nih.gov/16262477) . S2CID 2298524 (ht
tps://api.semanticscholar.org/CorpusID:2298524) . Archived from the original (http://pdfs.se
manticscholar.org/fdd7/81a15d0405a888abe4584a99ed9cbc6fb3ff.pdf) (PDF) on 2020-11-
18.
166. S. Budiansky (1998). If a Lion Could Talk: Animal Intelligence and the Evolution of Consciousness
(https://archive.org/details/iflioncouldtalka00budi) . The Free Press. ISBN 978-0-684-83710-9.
167. S. Nichols, T. Grantham (2000). "Adaptive Complexity and Phenomenal Consciousness" (http
s://web.archive.org/web/20170813055023/http://dingo.sbs.arizona.edu/~snichols/Papers/ev
olcons(final).pdf) (PDF). Philosophy of Science. 67 (4): 648–670. CiteSeerX 10.1.1.515.9722
(https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.515.9722) . doi:10.1086/392859
(https://doi.org/10.1086%2F392859) . S2CID 16484193 (https://api.semanticscholar.org/Cor
pusID:16484193) . Archived from the original (http://dingo.sbs.arizona.edu/~snichols/Paper
s/evolcons(final).pdf) (PDF) on 2017-08-13. Retrieved 2017-10-25.
169. Bernard Baars (1993). A Cognitive Theory of Consciousness. Cambridge University Press.
ISBN 978-0-521-42743-2.
171. Owen Flanagan, T.W. Polger (1995). "Zombies and the function of consciousness". Journal of
Consciousness Studies. 2: 313–321.
172. Rosenthal D (2008). "Consciousness and its function". Neuropsychologia. 46 (3): 829–840.
doi:10.1016/j.neuropsychologia.2007.11.012 (https://doi.org/10.1016%2Fj.neuropsychologia.2
007.11.012) . PMID 18164042 (https://pubmed.ncbi.nlm.nih.gov/18164042) .
S2CID 7791431 (https://api.semanticscholar.org/CorpusID:7791431) .
173. Stevan Harnad (2002). "Turing indistinguishability and the Blind Watchmaker" (http://cogprints.
org/1615) . In J.H. Fetzer (ed.). Consciousness Evolving. John Benjamins. Archived (https://we
b.archive.org/web/20111028162407/http://cogprints.org/1615/) from the original on 2011-
10-28. Retrieved 2011-10-26.
174. Zack Robinson, Corey J. Maley, Gualtiero Piccinini (2015). "Is Consciousness a Spandrel?".
Journal of the American Philosophical Association. 1 (2): 365–383. doi:10.1017/apa.2014.10 (h
ttps://doi.org/10.1017%2Fapa.2014.10) . S2CID 170892645 (https://api.semanticscholar.org/
CorpusID:170892645) .
177. Coenen A (2010). "Subconscious Stimulus Recognition and Processing During Sleep" (http://jo
urnalpsyche.org/files/0xbb10.pdf) (PDF). Psyche: An Interdisciplinary Journal of Research on
Consciousness. 16–2. Archived (https://web.archive.org/web/20170611115233/http://journalp
syche.org/files/0xbb10.pdf) (PDF) from the original on 2017-06-11.
178. Hobson JA, Pace-Schott EF, Stickgold R (2003). "Dreaming and the brain: Toward a cognitive
neuroscience of conscious states" (https://www.researchgate.net/publication/2599957) . In
Pace-Schott EF, Solms M, Blagrove M, Harnad S (eds.). Sleep and Dreaming: Scientific Advances
and Reconsiderations. Cambridge University Press. ISBN 978-0-521-00869-3. Archived (https://
web.archive.org/web/20210810234114/https://www.researchgate.net/profile/Edward-Pace-Sc
hott/publication/2599957_Dreaming_and_the_Brain_Toward_a_Cognitive_Neuroscience_of_Co
nscious_States/links/02e7e52f240372e115000000/Dreaming-and-the-Brain-Toward-a-Cognitiv
e-Neuroscience-of-Conscious-States.pdf) (PDF) from the original on 2021-08-10.
179. Johanson M., Valli K., Revonsuo A., Wedlund J. (2008). "Content analysis of subjective
experiences in partial epileptic seizures". Epilepsy & Behavior. 12 (1): 170–182.
doi:10.1016/j.yebeh.2007.10.002 (https://doi.org/10.1016%2Fj.yebeh.2007.10.002) .
PMID 18086461 (https://pubmed.ncbi.nlm.nih.gov/18086461) . S2CID 28276470 (https://api.
semanticscholar.org/CorpusID:28276470) .
180. Johanson M., Valli K., Revonsuo A., et al. (2008). "Alterations in the contents of consciousness
in partial epileptic seizures". Epilepsy & Behavior. 13 (2): 366–371.
doi:10.1016/j.yebeh.2008.04.014 (https://doi.org/10.1016%2Fj.yebeh.2008.04.014) .
PMID 18522873 (https://pubmed.ncbi.nlm.nih.gov/18522873) . S2CID 24473529 (https://api.
semanticscholar.org/CorpusID:24473529) .
183. M. Murphy, S. Donovan, E. Taylor (1997). The Physical and Psychological Effects of Meditation:
A Review of Contemporary Research With a Comprehensive Bibliography, 1931–1996. Institute of
Noetic Sciences.
185. Studerus E, Gamma A, Vollenweider FX (2010). Bell V (ed.). "Psychometric evaluation of the
altered states of consciousness rating scale (OAV)" (https://www.ncbi.nlm.nih.gov/pmc/article
s/PMC2930851) . PLOS ONE. 5 (8): e12412. Bibcode:2010PLoSO...512412S (https://ui.adsab
s.harvard.edu/abs/2010PLoSO...512412S) . doi:10.1371/journal.pone.0012412 (https://doi.or
g/10.1371%2Fjournal.pone.0012412) . PMC 2930851 (https://www.ncbi.nlm.nih.gov/pmc/arti
cles/PMC2930851) . PMID 20824211 (https://pubmed.ncbi.nlm.nih.gov/20824211) .
186. Hal Blumenfeld (2009). "The neurological examination of consciousness". In Steven Laureys,
Giulio Tononi (eds.). The Neurology of Consciousness: Cognitive Neuroscience and
Neuropathology. Academic Press. ISBN 978-0-12-374168-4.
187. Kinney HC, Korein J, Panigrahy A, Dikkes P, Goode R (26 May 1994). "Neuropathological
findings in the brain of Karen Ann Quinlan – the role of the thalamus in the persistent
vegetative state" (https://web.archive.org/web/20201118022837/http://pdfs.semanticscholar.
org/44a2/3798f5dc002a79512bfa9bff974bdbb611e1.pdf) (PDF). N Engl J Med. 330 (21):
1469–1475. doi:10.1056/NEJM199405263302101 (https://doi.org/10.1056%2FNEJM1994052
63302101) . PMID 8164698 (https://pubmed.ncbi.nlm.nih.gov/8164698) . S2CID 5112573 (h
ttps://api.semanticscholar.org/CorpusID:5112573) . Archived from the original (http://pdfs.se
manticscholar.org/44a2/3798f5dc002a79512bfa9bff974bdbb611e1.pdf) (PDF) on 18
November 2020.
189. V. Mark Durand, David H. Barlow (2009). Essentials of Abnormal Psychology (https://archive.or
g/details/isbn_9780495806134) . Cengage Learning. pp. 74–75 (https://archive.org/details/is
bn_9780495806134/page/74) . ISBN 978-0-495-59982-1. Note: A patient who can additionally
describe the current situation may be referred to as "oriented times four".
190. Neergaard L (August 14, 2013). "New tool peeks into brain to measure consciousness" (https://
web.archive.org/web/20130816144320/https://www.nbcnews.com/health/new-tool-peeks-bra
in-measure-consciousness-6C10919906) . Associated Press through NBC News. Archived
from the original (https://www.nbcnews.com/healthmain/new-tool-peeks-brain-measure-consc
iousness-6c10919906) on August 16, 2013. Retrieved March 2, 2022.
191. Bernat JL (8 Apr 2006). "Chronic disorders of consciousness". Lancet. 367 (9517): 1181–1192.
doi:10.1016/S0140-6736(06)68508-5 (https://doi.org/10.1016%2FS0140-6736%2806%296850
8-5) . PMID 16616561 (https://pubmed.ncbi.nlm.nih.gov/16616561) . S2CID 13550675 (http
s://api.semanticscholar.org/CorpusID:13550675) .
192. Bernat JL (20 Jul 2010). "The natural history of chronic disorders of consciousness".
Neurology. 75 (3): 206–207. doi:10.1212/WNL.0b013e3181e8e960 (https://doi.org/10.1212%2
FWNL.0b013e3181e8e960) . PMID 20554939 (https://pubmed.ncbi.nlm.nih.gov/2055493
9) . S2CID 30959964 (https://api.semanticscholar.org/CorpusID:30959964) .
193. Coleman MR, Davis MH, Rodd JM, Robson T, Ali A, Owen AM, Pickard JD (September 2009).
"Towards the routine use of brain imaging to aid the clinical diagnosis of disorders of
consciousness" (https://doi.org/10.1093%2Fbrain%2Fawp183) . Brain. 132 (9): 2541–2552.
doi:10.1093/brain/awp183 (https://doi.org/10.1093%2Fbrain%2Fawp183) . PMID 19710182
(https://pubmed.ncbi.nlm.nih.gov/19710182) .
194. Monti MM, Vanhaudenhuyse A, Coleman MR, Boly M, Pickard JD, Tshibanda L, Owen AM,
Laureys S (18 Feb 2010). "Willful modulation of brain activity in disorders of consciousness" (ht
tps://web.archive.org/web/20190224091809/http://pdfs.semanticscholar.org/560f/d2dd08c0
532dcf5c61668690dd88d19d7114.pdf) (PDF). N Engl J Med. 362 (7): 579–589.
doi:10.1056/NEJMoa0905370 (https://doi.org/10.1056%2FNEJMoa0905370) .
PMID 20130250 (https://pubmed.ncbi.nlm.nih.gov/20130250) . S2CID 13358991 (https://api.
semanticscholar.org/CorpusID:13358991) . Archived from the original (http://pdfs.semantics
cholar.org/560f/d2dd08c0532dcf5c61668690dd88d19d7114.pdf) (PDF) on 24 February
2019.
195. Seel RT, Sherer M, Whyte J, Katz DI, Giacino JT, Rosenbaum AM, Hammond FM, Kalmar K, Pape
TL, et al. (December 2010). "Assessment scales for disorders of consciousness: evidence-
based recommendations for clinical practice and research". Arch Phys Med Rehabil. 91 (12):
1795–1813. doi:10.1016/j.apmr.2010.07.218 (https://doi.org/10.1016%2Fj.apmr.2010.07.21
8) . PMID 21112421 (https://pubmed.ncbi.nlm.nih.gov/21112421) .
196. Prigatano GP (2009). "Anosognosia: clinical and ethical considerations". Current Opinion in
Neurology. 22 (6): 606–611. doi:10.1097/WCO.0b013e328332a1e7 (https://doi.org/10.1097%2
FWCO.0b013e328332a1e7) . PMID 19809315 (https://pubmed.ncbi.nlm.nih.gov/1980931
5) . S2CID 40751848 (https://api.semanticscholar.org/CorpusID:40751848) .
197. George P. Prigatano, Daniel Schacter (1991). "Introduction". In George Prigatano, Daniel
Schacter (eds.). Awareness of Deficit After Brain Injury: Clinical and Theoretical Issues. Oxford
University Press. pp. 3–16. ISBN 978-0-19-505941-0.
199. Arnold J. Sameroff, Marshall M. Haith, eds. (1996). The Five to Seven Year Shift: The Age of
Reason and Responsibility. Chicago: University of Chicago Press.
200. Foulkes D (1999). Children's Dreaming and the Development of Consciousness. Cambridge,
Massachusetts: Harvard University Press. p. 13. "In defining 'consciousness' as a self-reflective
act, psychology loses much of the glamour and mystery of other areas of consciousness-
study, but it also can proceed on a workaday basis without becoming paralyzed in pure
abstraction."
201. Nelson K, Fivush R (2020). "The Development of Autobiographical Memory, Autobiographical
Narratives, and Autobiographical Consciousness" (https://doi.org/10.1177%2F0033294119852
574) . Psychological Reports. 123 (1): 74. doi:10.1177/0033294119852574 (https://doi.org/1
0.1177%2F0033294119852574) . PMID 31142189 (https://pubmed.ncbi.nlm.nih.gov/311421
89) . S2CID 169038149 (https://api.semanticscholar.org/CorpusID:169038149) .
202. Jaynes J (2000) [1976]. The Origin of Consciousness in the Breakdown of the Bicameral Mind.
Houghton Mifflin. p. 447. ISBN 0-618-05707-2. "Consciousness is based on language....
Consciousness is not the same as cognition and should be sharply distinguished from it."
203. Jaynes J (2000) [1976]. The Origin of Consciousness in the Breakdown of the Bicameral Mind.
Houghton Mifflin. p. 450. ISBN 0-618-05707-2. "The basic connotative definition of
consciousness is thus an analog 'I' narratizing in a functional mind-space. The denotative
definition is, as it was for Descartes, Locke, and Hume, what is introspectable."
206. Peter Carruthers (1999). "Sympathy and subjectivity". Australasian Journal of Philosophy. 77
(4): 465–482. doi:10.1080/00048409912349231 (https://doi.org/10.1080%2F0004840991234
9231) .
207. Thomas Nagel (1991). "Ch. 12 What is it like to be a bat?". Mortal Questions. Cambridge
University Press. ISBN 978-0-521-40676-5.
208. Douglas Hofstadter (1981). "Reflections on What Is It Like to Be a Bat?". In Douglas Hofstadter,
Daniel Dennett (eds.). The Mind's I. Basic Books. pp. 403–414 (https://archive.org/details/mind
sifantasiesr00hofs/page/403) . ISBN 978-0-7108-0352-8.
211. Moshe Idel (1990). Golem: Jewish Magical and Mystical Traditions on the Artificial Anthropoid.
SUNY Press. ISBN 978-0-7914-0160-6. Note: In many stories the Golem was mindless, but
some gave it emotions or thoughts.
212. Ada Lovelace. "Sketch of The Analytical Engine, Note G" (http://www.fourmilab.ch/babbage/sk
etch.html) . Archived (https://web.archive.org/web/20100913042032/http://www.fourmilab.c
h/babbage/sketch.html) from the original on 2010-09-13. Retrieved 2011-09-10.
213. Stuart Shieber (2004). The Turing Test : Verbal Behavior as the Hallmark of Intelligence. MIT
Press. ISBN 978-0-262-69293-9.
215. David Chalmers (1997). The Conscious Mind: In Search of a Fundamental Theory. Oxford
University Press. ISBN 978-0-19-511789-9.
216. Jürgen Schmidhuber (2009). Driven by Compression Progress: A Simple Principle Explains
Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity,
Creativity, Art, Science, Music, Jokes (https://archive.org/details/arxiv-0812.4360) .
arXiv:0812.4360 (https://arxiv.org/abs/0812.4360) . Bibcode:2008arXiv0812.4360S (https://u
i.adsabs.harvard.edu/abs/2008arXiv0812.4360S) .
217. John R. Searle (1990). "Is the brain's mind a computer program?" (http://www.cs.princeton.edu/
courses/archive/spr06/cos116/Is_The_Brains_Mind_A_Computer_Program.pdf) (PDF).
Scientific American. 262 (1): 26–31. Bibcode:1990SciAm.262a..26S (https://ui.adsabs.harvard.
edu/abs/1990SciAm.262a..26S) . doi:10.1038/scientificamerican0190-26 (https://doi.org/10.
1038%2Fscientificamerican0190-26) . PMID 2294583 (https://pubmed.ncbi.nlm.nih.gov/2294
583) . Archived (https://ghostarchive.org/archive/20221009/http://www.cs.princeton.edu/co
urses/archive/spr06/cos116/Is_The_Brains_Mind_A_Computer_Program.pdf) (PDF) from the
original on 2022-10-09.
220. Graham Oppy, David Dowe (2011). "The Turing test" (http://plato.stanford.edu/archives/spr201
1/entries/turing-test) . Stanford Encyclopedia of Philosophy (Spring 2011 Edition). Archived (h
ttps://web.archive.org/web/20131202073948/http://plato.stanford.edu/archives/spr2011/entr
ies/turing-test/) from the original on 2013-12-02. Retrieved 2011-10-26.
222. Victor Argonov (2014). "Experimental Methods for Unraveling the Mind-body Problem: The
Phenomenal Judgment Approach" (http://philpapers.org/rec/ARGMAA-2) . Journal of Mind
and Behavior. 35: 51–70. Archived (https://web.archive.org/web/20161020014221/http://philp
apers.org/rec/ARGMAA-2) from the original on 2016-10-20. Retrieved 2016-12-06.
223. Leith S (2022-07-07). "Nick Bostrom: How can we be certain a machine isn't conscious?" (http
s://www.spectator.co.uk/article/nick-bostrom-how-can-we-be-certain-a-machine-isnt-consciou
s/) . The Spectator. Retrieved 2025-08-09.
225. Shulman C, Bostrom N (August 2021). "Sharing the World with Digital Minds" (https://www.rese
archgate.net/publication/353967146_Sharing_the_World_with_Digital_Minds) . Rethinking
Moral Status. doi:10.1093/oso/9780192894076.003.0018 (https://doi.org/10.1093%2Foso%2F
9780192894076.003.0018) .
226. "The intelligent monster that you should let eat you" (https://www.bbc.com/future/article/2020
1111-philosophy-of-utility-monsters-and-artificial-intelligence) . BBC. 2020-11-13. Retrieved
2025-08-09.
227. William James (1890). The Principles of Psychology, Volume 1. H. Holt. p. 225.
228. Karunamuni ND (May 2015). "The Five-Aggregate Model of the Mind"
(https://doi.org/10.1177%2F2158244015583860) . SAGE Open. 5 (2): 2158244015583860.
doi:10.1177/2158244015583860 (https://doi.org/10.1177%2F2158244015583860) .
229. Dzogchen Rinpoche (2007). "Taming the mindstream" (https://archive.org/details/losingclouds
gain0000unse/page/81) . In Doris Wolter (ed.). Losing the Clouds, Gaining the Sky: Buddhism
and the Natural Mind. Wisdom Publications. pp. 81–92 (https://archive.org/details/losingcloud
sgain0000unse/page/81) . ISBN 978-0-86171-359-2.
230. Robert Humphrey (1992) [1954]. Stream of Consciousness in the Modern Novel. University of
California Press. pp. 23–49. ISBN 978-0-520-00585-3.
232. Thompson E (2014-11-18). Waking, Dreaming, Being: Self and Consciousness in Neuroscience,
Meditation, and Philosophy (https://books.google.com/books?id=q_vpBAAAQBAJ) . Columbia
University Press. p. 19. ISBN 978-0-231-53831-2.
233. Richard Maurice Bucke (1905). Cosmic Consciousness: A Study in the Evolution of the Human
Mind (https://archive.org/details/cosmicconsciousn01buck) . Innes & Sons. pp. 1 (https://arc
hive.org/details/cosmicconsciousn01buck/page/n19) –2.
234. Ken Wilber (2002). The Spectrum of Consciousness. Motilal Banarsidass. pp. 3–16. ISBN 978-
81-208-1848-4.
235. Satsangi PS, Hameroff S, eds. (2016). Consciousness: Integrating Eastern and Western
Perspectives. New Age Books. ISBN 978-81-7822-493-0.
Further reading
Dehaene S (2014). Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts.
Viking Press. ISBN 978-0-670-02543-5.
Harley T (2021). The Science of Consciousness: Waking, Sleeping, and Dreaming. Cambridge
University Press. doi:10.1017/9781316408889 (https://doi.org/10.1017%2F9781316408889) .
ISBN 978-1-107-56330-8. S2CID 233977060 (https://api.semanticscholar.org/CorpusID:23397706
0) .
Koch C (2019). The Feeling of Life Itself: Why Consciousness Is Widespread but Can't Be Computed.
MIT Press. ISBN 978-0-262-04281-9.
Overgaard M, Mogensen J, Kirkeby-Hinrup A, eds. (2021). Beyond Neural Correlates of
Consciousness. Routledge. ISBN 978-1-138-63798-6.
Prinz J (2012). The Conscious Brain: How Attention Engenders Experience. Oxford University Press.
doi:10.1093/acprof:oso/9780195314595.001.0001 (https://doi.org/10.1093%2Facprof%3Aoso%2
F9780195314595.001.0001) . ISBN 9780195314595.
Schneider S, Velmans M, eds. (2017). The Blackwell Companion to Consciousness (2nd ed.). Wiley-
Blackwell. ISBN 978-0-470-67406-2.
Seth A (2021). Being You: A New Science of Consciousness. Penguin Random House. ISBN 978-1-
5247-4287-4.
Thompson E (2014). Waking, Dreaming, Being: Self and Consciousness in Neuroscience, Meditation,
and Philosophy. Columbia University Press. ISBN 978-0-231-13695-2.
Zelazo PD, Moscovitch M, Thompson E, eds. (2007). The Cambridge Handbook of Consciousness.
Cambridge University Press. doi:10.1017/CBO9780511816789 (https://doi.org/10.1017%2FCBO9
780511816789) . ISBN 978-0-521-67412-6.