Is Consciousness A Mystery? A Simplified Approach to Pinpoint the Basic Nature of Consciousness

Article Information

Wolfgang Kromer*

Specialist in Clinical Pharmacology and Toxicology, Konstanz, Germany

*Corresponding Author: Wolfgang Kromer, Specialist in Clinical Pharmacology and Toxicology, Konstanz, Germany

Received: 07 June 2022; Accepted: 14 June 2022; Published: 16 June 2022

Citation: Wolfgang Kromer. Is Consciousness A Mystery? A Simplified Approach to Pinpoint the Basic Nature of Consciousness. Journal of Psychiatry and Psychiatric Disorders 6 (2022): 196-202.


Abstract

The biophysical boundaries between the body and its environment demarcate the individual's inner world against the outer world. Although the aspect of self-world distinction has already been addressed in the literature, it has remained largely obscure how this distinction might be achieved and how it would lead to the experience of one's "self". This paper attempts to answer this question in as simplified a way as possible by reducing the concept to just one essential point: comparative processing and representation of neuronal stimuli from outside and inside the body is discussed as the crucial point in the genesis of consciousness, and is equated with subjective experience. In contrast to theories based on axiomatic constructs, the present concept rests solely on neuronal mechanisms and is therefore accessible to both experimental and clinical research, which may focus on defining the neuronal interface in the brain's representation of the outside versus the inside world.

Keywords

Consciousness; Self; Sensation; Perception



Introduction

Any attempt to answer the title question by resorting to the endless flow of contributions from philosophy, psychology and the neurosciences must inevitably end in frustration. The approaches to this complex subject, which may even appear unsolvable at first glance, are too speculative and contradictory. Moreover, many scientific contributions address the preconditions rather than the essence of consciousness, as exemplified by a flood of publications [1,2]. One example is the focus on synchronous oscillations of neurons at about 40 Hz, a phenomenon considered to be a precondition of "binding" between groups of cortical neurons [3-5]. Such synchronization may facilitate the teamwork between different parts of the brain in generating a complex, conscious percept, but it does not by itself explain the subjective, phenomenal experience of sensation. Even a cursory look at the "hard problem of consciousness" entry in Wikipedia sufficiently demonstrates the divergent speculation and confusion in the relevant literature. Prof. Michael Graziano of Princeton University went even further when he wrote: "Most popular theories of consciousness are worse than wrong. They play to our intuitions, but don't explain anything" [6].

Despite all this, Searle [7] rightly noted that "consciousness is entirely caused by neurobiological processes and is realized in brain structures". However, theories on the basic nature of consciousness remain experimentally unproven; they are more or less based on thought experiments. Although "Gedanken experiments are useful devices for generating new ideas", as Crick and Koch [8] remark, "they do not lead, in general, to trustworthy conclusions". Nevertheless, in order to circumvent the obvious confusion in the literature, we might return to square one and perform such a thought experiment, albeit one as simple and uncompounded as possible. This is always the best option when no proof of more complex explanations is available. The present concept reduces the matter to just one central aspect.

A simplified thought experiment

Unicellular organisms like the amoeba react to external stimuli without any hint of conscious behavior. The same holds true for the earthworm with its rope-ladder-like nervous system. By contrast, the situation is quite different with the dog, a good friend of ours, which recognizes its master, picks up on its master's mood, and responds as an individual with a specific character. What constitutes the difference between these three examples?

In the animal kingdom, humans included, the difference in question lies on a continuum and relates to neuronal organisation. The amoeba lacks it completely, while the earthworm has only a quite rudimentary nervous system. The dog, by contrast, already commands a highly specialised central nervous system, equipped with sensory organs such as eyes, ears, nose and the organ of equilibrium, together with somatovisceral sensitivity. This enables the dog to continuously perceive and discriminate stimuli from inside and outside the body. The ongoing flow of information is transmitted by functionally organised nerve tracts to specialised neuronal networks, where it is stored as engrams composed of different kinds of stimuli and with differing impact.

Still the question remains: what is the basic nature of consciousness? Obviously, there must be an individual to whom something can become conscious. As exemplified by the above comparison of the three species, as long as stimuli from outside the body cannot be distinguished from and compared with those from inside the body, the stimuli may excite the responsive (neuronal) cells, but they will not become conscious. The situation is comparable to an automatic camera taking a picture with nobody there to look at it. Only if the organism can distinguish between its own body and the outside world, by comparatively processing internal and external stimuli, can it experience itself as an individual. Only then can consciousness be attributed. This is the crucial point.

It is undisputed that no distinctive features of single neurons have been identified which could serve as a specific substrate for consciousness independently of the comparative processing and representation of internal and external stimuli in neuronal networks. According to the Integrated Information Theory [9,10], complex processing and interconnection of neuronal stimuli is an essential precondition of consciousness. However, although necessary, this is by no means sufficient to create consciousness. It is the comparative processing of internal (bodily) versus external (environmental) stimuli that counts.

Stimuli originating within one's own body have inherent features of continuity and mutual dependency. Any part of the body is experienced in the context of the other parts of the body, which guarantees, over time, a robust experience of the body as "own" even in illness, when one or another part may fail. This is a fundamental difference from environmental stimuli and enables the organism to distinguish internal from external stimuli. Only then can the organism, as an individual, experience the outside world as opposed to its inner life.
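
Purely as an illustration of this point, and not as a model proposed in this paper, the statistical signature of such continuity and mutual dependency can be sketched in a few lines of Python: toy "internal" channels share a slow common bodily source, toy "external" channels do not, and even a naive comparator can then separate the two streams by their mutual coupling alone. All channel counts, noise levels and the 0.3 threshold below are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)
T = 2000                        # time steps
n_internal, n_external = 6, 6   # arbitrary channel counts

# "Internal" channels: continuous and mutually dependent, because all of them
# ride on one slow shared bodily state (a random walk) plus small private noise.
body_state = np.cumsum(rng.normal(0.0, 0.05, T))
internal = body_state[:, None] + rng.normal(0.0, 0.2, (T, n_internal))

# "External" channels: independent stimuli that change abruptly and share no common source.
external = np.repeat(rng.normal(0.0, 1.0, (T // 50, n_external)), 50, axis=0)
external = external + rng.normal(0.0, 0.2, (T, n_external))

signals = np.hstack([internal, external])

# Naive comparator: a channel strongly coupled to the rest of the ensemble is
# tagged as originating inside the body; the 0.3 threshold is arbitrary.
for ch in range(signals.shape[1]):
    others = [o for o in range(signals.shape[1]) if o != ch]
    coupling = np.mean([abs(np.corrcoef(signals[:, ch], signals[:, o])[0, 1]) for o in others])
    print(f"channel {ch:2d}: coupling = {coupling:.2f} -> "
          f"{'internal' if coupling > 0.3 else 'external'}")

In this caricature, the self-world boundary is nothing over and above a regularity in how the signals hang together over time, which is the point made above in biological terms.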

The potentially provocative thesis, therefore, is: comparative neuronal processing and representation of stimuli from inside and outside the body is to be EQUATED with consciousness! There is at least no indication that it is, in principle, anything more than that, regardless of the enormous complexity of the underlying neuronal interactions. The NCC, the neuronal correlates of consciousness [8,11], relate only to the functional details of the underlying neuronal network. Consequently, and in accordance with the degree of neuronal organisation, consciousness increases gradually in phylogenesis, from a timid outset to human consciousness.

The present theory refers, in the first place, to bodily sensations in the real, waking situation; this is what provides us with the vivid experience of "self". As in the awake state, the dreamer's experience of "self" is based on the memory, imagination or even limited sensation of one's own body versus what is imagined, remembered or even fabricated as the outside world. The brain's critical task is then to identify the source of such memories, fantasized imaginations or sensations, as reviewed by Johnson et al. [12]. Regardless of the dream state, the present theory relates not only to consciousness of one's own existence but also to consciousness on a lower evolutionary level, which Humphrey [13] has discussed as feeling rather than thinking.

An "embodied self" and the experience of your own body as an important factor involved in the constitution of "self" had already been addressed by several authors (e.g.: Humphrey [13]; Newen [14]; Riva [15]). Newen [14] argued that "the self can best be described as a flexible embodied self which is the result of an integration of typical features into a minimal pattern". His concept is in some respect close to the concept offered here but still much more complex. In support of an "embodied self" it may be realized that, by fertilization, two cells without any experience of "self" fuse, divide and grow up to a body that only then and in the course of time develops consciousness. The body is first, the 'mind' second, matching with the subjective experience that 'my self' is caught in this very body instead of a different one. However, EQUATING the brain's comparative representation of internal versus external stimuli with consciousness goes one step beyond the mere notion of an "embodied self".

On the basis of this concept, the emotionality of subjective experience may best be explained by the individual mixture of such stimuli and their varying, positive or negative, consequences for the body. Accordingly, Wiens [16] has already summarized that "centrally integrated feedback from the whole body plays a role in emotional experience".

Some examples of comparator models in the literature

The present concept is based on a comparative neuronal process. Comparator functions of this general kind operate at numerous levels in the nervous system, quite apart from the fundamental role in consciousness proposed here, and the two should not be confused. For example, Vallortigara [17] refers to the "efference copy", which may nullify sensory signals arising as perception-disturbing by-products of an organism's movement. Another example is the "intention-to-act" that generates a motor program followed by proprioceptive or visual reafferences, which may contribute to the feeling that we are the agents who act [18]. Further, Humphrey [13] envisaged a "sensory loop" from the organism's boundary to the sensory cortex and back to the site of the stimulus' reception; he also proposed a comparative process that either accepts or rejects the interpretation (perception) of a sensation, writing that "the 'perceptual center' might well send its reconstruction of the stimulus straight over to the 'sensory center' where a comparison could take place". However, none of these comparator examples relates specifically to the distinction between the body and its environment that the present theory holds central to self-consciousness.

In contrast to the above examples, Kunzendorf [19] introduced a comparator function with direct reference to consciousness. In his theory of source monitoring he argues: "While physically monitoring either the peripheral or central source of sensations, the brain's monitoring mechanism is subjectively paralleled by the generic knowledge that all monitored sensations should be treated as 'belonging to oneself' and by the resulting illusion of a 'self' as subject having these sensations". Kunzendorf's "source monitoring mechanism" identifies the brain ("central") versus the environment ("peripheral") as the two relevant sources of sensations, i.e., sensations based on, for example, "visually imagined" contents versus sensations "externally generated or perceived". By contrast, the present theory is based on the comparison of bodily sensations with sensations originating in the environment.

Furthermore, in Kunzendorf's theory the brain's "source monitoring mechanism" has to distinguish between two sources that both yield, to a considerable degree, arbitrary contents of sensations and percepts, which can hardly serve as a reliable basis of "self". This is true for the environment with its constantly changing stimuli, and for the brain with its great variety of uncontrollable factors such as psychotic states. By contrast, the present theory is based on the brain's comparison of stimuli originating in the environment with stimuli originating within one's own body. This crucial point deserves repetition: bodily sensations are perceived by any individual as "own" because any part of the body is experienced in the context of the rest of the body. Any change over time or during illness still takes place embedded in this continuity of the complex scenario of bodily sensations, which can be considered the most reliable basis of "self". As is evident from introspection, conscious perception of the environment is always paralleled by sensation of one's own body.

Most importantly, in either theory the brain must have some function that allows for a decision about "self". In Kunzendorf's theory, this is an axiomatic construct termed "generic knowledge", which results in the "illusion" of self and which escapes any empirical proof. By contrast, the present theory simply refers to a neuronal mechanism, namely the comparative representation of external versus internal sensory impressions. Such a neuronal mechanism can be approached empirically, for example by techniques for neurobehavioral assessment combined with neuroimaging [20]. No theoretical construct is required. This may be the most plausible explanation for the generation of "self", and it is certainly the most uncompounded one.

Closest to the present concept may be the one outlined by Newen [14], who noted that "for all cognitive tasks in which we interact with the world, we receive sensorimotor and affective input and construct information about the world and information about ourselves at the same time", which results in the "self-world distinction". However, this still does not specify the basic mechanism by which neuronal input of external versus internal stimuli results in self-world distinction and conscious experience. Is anything mysterious involved? This explanatory gap is probably illusory, and it is closed by equating COMPARATIVE REPRESENTATION of internal versus external stimuli with consciousness.

Self-consciousness and artificial intelligence

With regard to artificial intelligence (AI), Cunha [21] also asked how self-consciousness evolves. She concluded that, "in order to develop conscious and creative AI, machines must be self-aware" and, to this end, must distinguish themselves from the environment. This again matches the statement central to the present publication. However, Cunha's proposal that the machine's "neural networks" may somehow "self-organize" to achieve this goal may fail, because a computer program per se might not allow for the constitution of any personal, embodied "self". Given an arbitrary flow of information received from a constantly changing environment, self-awareness requires, as a reliable and authentic basis, a steady flow of information from the machine's own body. A sensory infrastructure is therefore essential. In the living organism, this bodily information is then processed comparatively with that from the environment.

Whereas a somatovisceral sensorium (to mention just one example) is operative and adaptable in all its neuronal components in the living organism, structures of comparable complexity are missing in a machine's dead 'body', the hardware, and would be hard (if at all possible) to establish there. For the purpose in question, such functions might even be meaningless in dead material, although this assessment is largely based on intuition. In any case, the enormous task would be not only to measure environmental parameters in parallel with a large number of parameters taken from the 'body' (say, the temperature of the machine's metal), but also to interconnect all those 'bodily' signals such that the computer 'experiences' them as belonging to one and the same 'personal' entity. A huge task indeed! Apart from that, a computer's ability to respond to its environment in an apparently intelligent manner must not be equated with self-awareness and consciousness; it may merely appear that way. As already outlined, there is every reason to assume that a steady flow of information from the body is an essential requirement for the constitution of self-consciousness. While this requirement is met by a living organism, it is (so far?) not met by a computerized machine.
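
To make the bookkeeping problem concrete, the following sketch, again purely hypothetical and with all sensor names invented, shows only the minimum such a machine would need: a registry that keeps the 'bodily' readings, such as the temperature of the machine's metal, bound together as one entity and separate from the environmental readings, so that the two streams could be processed comparatively. Nothing in it implies that such a registry would ever be 'experienced'.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Reading:
    channel: str   # e.g. "chassis_temperature" (invented name)
    value: float
    source: str    # "body" or "environment" -- known here only by construction

@dataclass
class ToyMachineSelfModel:
    """Keeps every 'bodily' channel bound into one record, separate from the
    environment, so both streams can be compared; nothing here implies experience."""
    body: Dict[str, List[float]] = field(default_factory=dict)
    world: Dict[str, List[float]] = field(default_factory=dict)

    def ingest(self, r: Reading) -> None:
        store = self.body if r.source == "body" else self.world
        store.setdefault(r.channel, []).append(r.value)

    def is_continuous(self, channel: str, tolerance: float = 1.0) -> bool:
        """Crude 'own-body' check: a bodily channel should change only gradually,
        mirroring the continuity argument made above for living organisms."""
        history = self.body.get(channel, [])
        return len(history) < 2 or abs(history[-1] - history[-2]) <= tolerance

machine = ToyMachineSelfModel()
machine.ingest(Reading("chassis_temperature", 41.2, "body"))
machine.ingest(Reading("chassis_temperature", 41.4, "body"))
machine.ingest(Reading("ambient_light", 0.35, "environment"))
print(machine.is_continuous("chassis_temperature"))   # True: reads as 'own'

Even this trivial registry presupposes, by hand-coded fiat, the very body-environment partition that a living organism derives from its own sensorium; that is exactly the difficulty outlined above.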

Despite these objections, Blum and Blum [22] addressed, in the context of their highly sophisticated theory based on theoretical computer science, the "big question: Will CTM [Conscious Turing Machine] have the 'feeling' that it is conscious?" and continued: "while we believe that the answer is YES, at least for 'sufficiently complex' CTM, we cannot prove anything mathematically without a definition of the 'feeling of consciousness', which we do not have (yet)." However the answer to this "big question" may eventually turn out, Krauss and Maier [23] summarized in their review on "conscious machines" and on theories of consciousness in general: "In all theories that we touched in this article, the notion of self is fundamental and the emergence of consciousness crucially requires embodiment".

Conclusion

The (small) step forward which the present concept offers is to EQUATE the comparative neuronal representation of internal versus external stimuli with (self-)consciousness. This is considered sufficient to explain the self-world distinction and, thereby, the imagination of one's "self". Indeed, apart from the complex neuronal processing underlying any comparative representation of internal versus external stimuli, no specific neuronal features have been identified which might otherwise explain (self-)consciousness and subjective experience. This leads me back to the title question: is consciousness a mystery? As far as its basic nature is concerned, the answer to me seems to be: No! However, as to the structural and functional details of the underlying neuronal networks, no doubt a huge amount of work still lies ahead. In this respect, the answer may be: Yes!

References

  1. Zeman AZJ, Grayling AC, Cowey A. Contemporary theories of consciousness. J Neurol Neurosurg Psychiatry 62 (1997): 549-552.
  2. Tassi P, Muzet A. Defining the states of consciousness. Neurosci Biobehav Rev 25 (2001): 175-191.
  3. Crick F, Koch C. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences 2 (1990): 263-275.
  4. Damasio AR. Synchronous activation in multiple cortical regions: a mechanism for recall. Seminars in the Neurosciences 2 (1990): 287-296.
  5. Crick F. The Astonishing Hypothesis: The Scientific Search for the Soul. Simon and Schuster, London (1994).
  6. Graziano M. Most popular theories of consciousness are worse than wrong. The Atlantic, March 2016.
  7. Searle JR. Consciousness. Annu Rev Neurosci 23 (2000): 557-578.
  8. Crick F, Koch C. The unconscious homunculus. In: Metzinger T (ed). Neural Correlates of Consciousness. MIT Press (2000): 103-110.
  9. Tononi G, Boly M, Massimini M, Koch C. Integrated information theory: from consciousness to its physical substrate. Nat Rev Neurosci 17 (2016): 450-461.
  10. Maillé S, Lynn M. Reconciling current theories of consciousness. J Neurosci 40 (2020): 1994-1996.
  11. Koch C. The Quest for Consciousness: A Neurobiological Approach. Roberts & Company Publishers (2004).
  12. Johnson MK, Hashtroudi S, Lindsay DS. Source monitoring. Psychological Bulletin 114 (1993): 3-28.
  13. Humphrey N. A History of the Mind. Chatto & Windus, London (1992).
  14. Newen A. The embodied self, the pattern theory of self, and the predictive mind. Frontiers in Psychology 9 (2018): 2270.
  15. Riva G. The neuroscience of body memory: from the self through the space to the others. Cortex 104 (2018): 241-260.
  16. Wiens S. Interoception in emotional experience. Current Opinion in Neurology 18 (2005): 442-447.
  17. Vallortigara G. The rose and the fly. A conjecture on the origin of consciousness. Biochem Biophys Res Commun 564 (2021): 170-174.
  18. David N, Newen A, Vogeley K. The "sense of agency" and its underlying cognitive and neural mechanisms. Consciousness and Cognition 17 (2008): 523-534.
  19. Kunzendorf RG. Source monitoring as an explanation for the illusion of "self as subject". Psychology of Consciousness: Theory, Research, and Practice 9 (2022): 64-77.
  20. Bigler ED. The lesion(s) in traumatic brain injury: implications for clinical neuropsychology. Arch Clin Neuropsychol 16 (2001): 95-131.
  21. Cunha C. Creating a truly conscious AI – a novel PsyArXiv (2019).
  22. Blum M, Blum L. A theoretical computer science perspective on consciousness. J Artif Intell Conscious 8 (2021): 1-42.
  23. Krauss P, Maier A. Will we ever have conscious machines? Front Comput Neurosci 14 (2020).
