You can't find the evidence.
“no formal system is able to generate anything even remotely mind-like. The asymmetry between the brain and the computer is complete, all comparisons are flawed, and the idea of a computer-generated consciousness is nonsense.” (emphasis added)
Torey, Z. (2009). The Crucible of Consciousness: An Integrated Theory of Mind and Brain. Cambridge, MA: MIT Press.
Living systems are all non-computable:
Louie, A. H. (2005). Any material realization of the (M, R)-systems must have noncomputable models. Journal of Integrative Neuroscience, 4(4), 423-436.
"the computer metaphor is incomplete, since we are seldom told how such “computations” are carried out – for instance, what are the algorithms for feeling surprise or fear, for falling in love or down the stairs, for asking questions or criticizing answers. Such imprecision is characteristic of an immature science. It is similar to the molecular biologist who assured us that DNA molecules “specify” proteins (or “instruct” about their synthesis), instead of exhibiting the corresponding chemical reactions." p. 232
Bunge, M. (2010). Matter and Mind: A Philosophical Inquiry (Boston Studies in the Philosophy of Science Vol. 287). Springer.
“The brain is not a computer, nor is the world an unambiguous piece of tape defining an effective procedure and constituting “symbolic information.” Such a selectional brain system is endlessly more responsive and plastic than a coded system.”
Edelman, G. M. (1999). Building a Picture of the Brain. Annals of the New York Academy of Sciences, 882(1), 68-89.
“We have demonstrated, for the first time to our knowledge, that computations performed and shaped by the dynamics of charges are radically different than computations in digital computers.”
Aur, D., & Jog, M. S. (2010). Neuroelectrodynamics: Understanding the Brain Language (Biomedical and Health Research Vol. 74). IOS Press.
“To understand why neurons and computers are fundamentally different, we must bear in mind that modern computers are algorithmic, whereas the brain and neurons are not.”
Tse, P. (2013). The Neural Basis of Free Will: Criterial Causation. MIT Press.
“The free will theorem supports a powerful challenge to the scientific credentials of determinism, by showing, on certain well-supported assumptions, that two cornerstones of contemporary science, namely (1) acceptance of the scientific method as a reliable way of finding out about the world, and (2) relativity theory’s exclusion of faster-than-light transmission of information, taken together, conflict with determinism in both its versions. Belief in determinism may thus come to be seen as notably unscientific.”
Hodgson, D. (2012). Rationality + Consciousness = Free Will (Philosophy of Mind). Oxford University Press.
"In order to establish whether minds are or operate on the basis of the same principles that govern computing machines, however, it is necessary to accomplish three tasks. First, discover the principles that govern computing machines. Second, discover the principles that govern human minds. And, third, compare them to ascertain whether they are similar or the same. That much should be obvious. But while leading computationalists have shown considerable ingenuity in elaborating and defending the conception of minds as computers, they have not always been attentive to the study of thought processes themselves. Their underlying attitude has been that no theoretical alternative is possible...The essays collected here are intended to demonstrate that this attitude is no longer justified."
Fetzer, J. H. (2001). Computers and Cognition: Why Minds Are Not Machines (Studies in Cognitive Systems Vol. 25). Springer.
“The view that the brain does not compute Turing-computable-functions is still a form of wide mechanism in Copeland’s sense, but it is more encompassing than Copeland’s, because it includes both Copeland’s hypercomputationalism and the view that mental capacities are not explained by neural computations but by neural processes that are not computational. Perhaps brains are simply not computing mechanisms but some other kinds of mechanisms. This view fits well with contemporary theoretical neuroscience, where much of the most rigorous and sophisticated work assigns no explanatory role to computation”
Piccinini, G. (2007). Computationalism, the Church–Turing thesis, and the Church–Turing fallacy. Synthese, 154(1), 97-120.
“Referring to the ‘widespread belief ... in many scientific circles ... that the brain is a computer,’ neurobiologist Gerald Edelman (2006) insists that ‘this belief is mistaken,’ for a number of reasons, principal among which are that ‘the brain does not operate by logical rules’ (p. 21). Jerome Bruner (1996), a founder of cognitive science itself, yet, coincidentally, a key figure in the emergence of narrative psychology, challenges the ability of ‘information processing’ to account for ‘the messy, ambiguous, and context-sensitive processes of meaning-making’ (p. 5). Psychologist Daniel Goleman (1995), author of the popular book Emotional Intelligence, asserts that cognitive scientists have been so ‘seduced by the computer as the operative model of mind’ (pp. 40f.) that they have forgotten that, ‘in reality, the brain’s wetware is awash in a messy, pulsating puddle of neurochemicals’ (p. 40f.) which is ‘nothing like the sanitized, orderly silicon that has spawned the guiding metaphor for mind’ (pp. 40–41).”
Randall, W. L. (2007). From Computer to Compost: Rethinking Our Metaphors for Memory. Theory & Psychology, 17(5), 611-633.
“Semantic ambiguity exists in real-world processes of life and mind...Thus, it is feasible to rationally investigate a real-world semantic process, such as the interaction between synaptic communication and NDN, by placing the process into a modeling relation with an impredicative model, such as a hyperset process, and learn novel (albeit qualitative rather than quantitative) things about the real-world process by asking questions about the model.
What is not feasible is serious investigation of such processes by algorithmic computation. Algorithms disallow internal semantics, and specifically prohibit ambiguity. In other words, in a fundamental manner, the entailment structures of algorithms differ from the entailment structures of processes of life and mind. Thus, algorithmic descriptions of such processes are superficial, capturing the incidental syntax but not the essential semantics...
No computer program, no matter how cleverly designed, has an entailment structure like a mind, or even a prion.” (emphasis added)
Kercel, S. W. (2003, June). Softer than soft computing. In Proceedings of the 2003 IEEE International Workshop on Soft Computing in Industrial Applications (SMCia/03) (pp. 27-32). IEEE.
“Today’s programs—at best—solve specific problems. Where humans have broad and flexible capabilities, computers do not.
Perhaps we’ve been going about it in the wrong way. For 50 years, computer scientists have been trying to make computers intelligent while mostly ignoring the one thing that is intelligent: the human brain. Even so-called neural network programming techniques take as their starting point a highly simplistic view of how the brain operates.”
Hawkins, J. (2007). Why Can't a Computer Be More Like a Brain? IEEE Spectrum, 44(4), 21-26.
“there is no evidence for a computer program consisting of effective procedures that would control a brain’s input, output, and behavior. Artificial intelligence doesn’t work in real brains. There is no logic and no precise clock governing the outputs of our brains no matter how regular they may appear.”
Edelman, G. M. (2006). Second Nature: Brain Science and Human Knowledge. Yale University Press.
"the brain is not a computer, yet it manipulates information...while von Neumann and others invented computers with mimicking the brain in mind (von Neumann 1958), the brain does not appear to behave as a Turing Machine "
Danchin, A. (2009). Information of the chassis and information of the program in synthetic cells. Systems and Synthetic Biology, 3(1-4), 125-134.
“Why has the traditional separation of grammar/syntax and semantics proven to be so troublesome? The problem lies in another general fallacy in thinking. Maybe the fast advances in science and technology have taught us too much of a mechanistic approach. It is like the viewpoint that ‘a machine is defined by the sum total of its parts’…A meal or the definition of a soup, for instance, cannot be the sum total of its ingredients. We could not drink a cup of water, consume raw vegetables and some meat, then ingest salt, peppercorns, pimento, parseley, etc. and claim to have eaten a soup. The processing that creates a new definition of each ingredient in their relationship to others results in a completely different meaning from the sum total. The recipe explains some of the interrelationships of processing and ingredients, but it can neither be identified with the soup nor with the experience of cooking it, nor with its consumption. The interrelationship of human mind, concepts and language can be similarly defined, except that it is incomparably more complicated, and we don't really have a compendium of recipes yet. We don't really know how our mind works when it thinks and/or creates language, especially since our mind has to read itself, while it is working.”
Schmidt, K. M. (2014). Concepts and Grammar: Thoughts about an Integrated System. In N. Dershowitz & E. Nissan (Eds.), Language, Culture, Computation: Computational Linguistics and Linguistics: Essays Dedicated to Yaacov Choueka on the Occasion of His 75th Birthday, Part III (Lecture Notes in Computer Science Vol. 8003). Springer.
"...we do not use brains as we use computers. Indeed it makes no more sense to talk of storing information in the brain than it does to talk of having dictionaries or filing cards in the brain as opposed to having them in a bookcase or filing cabinet. (Hacker, 1987, p. 493)
Hacker, P. M. S. (1987). Languages, minds and brains. In C. Blakemore & S. Greenfield (Eds.), Mindwaves: Thoughts on Intelligence, Identity and Consciousness (pp. 485–505). Blackwell.
See also the paper I attached to a previous post on how software supports the idea of an immaterial, non-reductive reality.