
Free will?

LegionOnomaMoi

Veteran Member
Premium Member
Legion, can you explain how functional processes happen in a simpler example, such as the n-body gravitational problem?

1) There is no way to explain it through the n-body problem because it doesn't relate.
2) Emergent properties (functional or otherwise) are necessarily the products of complex systems. In other words, I can give you many simpler examples, but it wouldn't really matter, because the entirety of functional processes as emergent properties of systems belongs in the domain of complex systems. Ant colonies, schools of fish, cells, and any other system out of which emerge functional processes are necessarily complex. Moreover, at the moment, many of these cannot be solved through computable models (and arguably have been proven to be noncomputable), and are widely considered irreducible.

(because reading thousands of words either way is getting tiresome, and I suspect it's not actually getting us anywhere useful.)

An unfortunate but inevitable component of any discussion about such things. We're dealing with the most complex system we know of (the human brain) and how mental causation is either an illusion (contrary to our personal experience) or is somehow realized through as yet unknown dynamics. In short, if you [I'm using the generalized "you" here, not you specifically] want a simple answer, it's either going to be one which you will accept because you already believe it, or one which you reject because it lacks the necessarily nuanced arguments.

Relating the mind (concepts, self-awareness, consciousness, etc.) to the realm of the physical has always been a problem. The 20th and 21st centuries didn't make this easier, but pushed understanding further back, as we found that so much of what we had deemed "simple" was beyond our ability to model.

And all that is without the philosophical issues of causality itself. Explaining complex systems (the only systems which can have emergent functional processes) may have simple examples, but will not have any simple explanations (unless these are basically useless, or are very lengthy).
 

PolyHedral

Superabacus Mystic
1) There is no way to explain it through the n-body problem because it doesn't relate.
Assuming that no radioactive atoms are involved, all of a cell's workings are essentially an n-body problem in QED. (The two interatomic forces are too small to notice, and gravity is too large to notice.)

2) Emergent properties (functional or otherwise) are necessarily the products of complex systems. In other words, I can give you many simpler examples, but it wouldn't really matter, because the entirety of functional processes as emergent properties of systems belongs in the domain of complex systems. Ant colonies, schools of fish, cells, and any other system out of which emerge functional processes are necessarily complex. Moreover, at the moment, many of these cannot be solved through computable models (and arguably have been proven to be noncomputable), and are widely considered irreducible.
In terms of Kolmogorov complexity, I can make the problem arbitrarily complex simply by adding more bodies. (Every body added requires a new describing equation, and all of the other equations must gain one more term.) Your examples of ant colonies and fish can almost certainly be modelled in the same way, since the fish/ants (appear to) have very little state of their own, and evolve mostly from the state of their neighbours.
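The fish/ant picture described above (little individual state, evolution driven mostly by neighbours) can be sketched with a toy alignment model. This is my own minimal illustration, not anyone's published model: each agent on a line repeatedly nudges its velocity toward the mean of its immediate neighbours' velocities, and global alignment emerges from the purely local rule.

```python
import random

def step(velocities, radius=1, rate=0.5):
    """Each agent nudges its velocity toward the mean velocity of its
    immediate neighbours on a line; no agent sees the whole group."""
    n = len(velocities)
    new = []
    for i, v in enumerate(velocities):
        nbrs = [velocities[j]
                for j in range(max(0, i - radius), min(n, i + radius + 1))
                if j != i]
        new.append(v + rate * (sum(nbrs) / len(nbrs) - v))
    return new

random.seed(0)
vels = [random.uniform(-1.0, 1.0) for _ in range(20)]
spread0 = max(vels) - min(vels)
for _ in range(2000):
    vels = step(vels)
spread = max(vels) - min(vels)
print(spread < spread0 / 100)   # near-global alignment from local rules only
```

Whether "whole-group alignment" here is genuinely emergent or merely the deductive consequence of the update rule is, of course, exactly the point under dispute in this thread.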

If you mean something else by "complexity," please define it.
 

LegionOnomaMoi

Veteran Member
Premium Member
Assuming that there is no radioactive atoms involved, all of a cell's workings are essentially an n-body problem in QED. (The two interatomic forces are too small to notice, and gravity is too large to notice.)

So, quantum electrodynamics and the n-body (classical mechanics) problem ensure that this is an n-body, not many-body, system? Why exactly should we approach this using Hamiltonian dynamics? Or rather, why would that clarify things?

Far more importantly, after I stated that emergent functional processes governed system dynamics, you want me to rephrase this in terms of classical equations of motion (or Hamiltonian) which
1) Are approximations and
2) rather completely contradict the entire idea behind emergent functional processes, as the properties of these are the models for internal dynamics which cannot even in principle be understood through any known equations of motion.

"a living system as a clearly distinguishable network of processes for production of elements constituting and re-activating the network that produces these elements...Any formal description of such a living network is impossible with current mathematics." from Simeonov's paper "Integral biomathics", Progress in Biophysics and Molecular Biology 102 (2010) 85-121

The methods for approximating molecular kinetics are just that: approximations. In Microbial Systems Biology: Methods and Protocols (Methods in Molecular Biology 881), part VI is on kinetic modelling techniques for cells. The paper by Singharoy, Joshi, Cheluvaraja, Miao, Brown, & Ortoleva, "Simulating Microbial Systems: Addressing Model Uncertainty/Incompleteness via Multiscale and Entropy Methods", specifically discusses the problems and challenges with using the "Newtonian dynamics of an N-atom system" for anything other than predictive approximation ("coarse-grained") models of system parameters.

More importantly, "Developing a general platform for simulating microbial systems requires a broad understanding of self-organization phenomena since it may not be otherwise feasible to have sufficient data to impose intracellular or other structure. In contrast, if one uses a model that incorporates self-organization mechanisms, then many of the spontaneously emerging structures within a microbe can be predicted."

In other words, the emergent properties you want explained in terms of classical or even QED equations of motion involve two fundamentally contradictory approaches. One is about the motion of bodies in terms of classical (or Hamiltonian) dynamics; the other concerns how systems produce processes which govern system dynamics. It's like asking someone to explain quantum entanglement in terms of Newtonian physics.

However, if you insist: in terms of the n-body problem, biological systems (and in particular the brain), like nonlinear systems with emergent properties in general, are treated in one of two ways: either the component parts are plugged into nonlinear equations to create probabilistic models in which (hopefully) over a short period of time internal (motion) dynamics follow a trajectory within a certain range (or space), or the system is represented in non-formal (or semi-formal) holistic diagrams (usually using graph theory or some hybrid). At the moment, models using the former method type are very, very poor.
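The first method type (plugging components into nonlinear equations and hoping the trajectory stays within a certain range for a short time) can be illustrated with a standard toy nonlinear system. The Lorenz equations below are my own illustrative choice, not a biological model: two initial conditions differing by one part in 10^8 track each other briefly, then diverge to the scale of the whole attractor, which is why such models are only predictive over short horizons.

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step of the Lorenz system (a standard toy
    nonlinear system, chosen purely for illustration)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)      # identical except for one part in 10^8
dist_early, dist_late = 0.0, 0.0
for i in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    d = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    if i == 100:
        dist_early = d                   # separation after a short time
    if i > 2500:
        dist_late = max(dist_late, d)    # separation much later
print(dist_late > 1000 * dist_early)     # tiny uncertainty, huge divergence
```

The divergence is the generic behaviour of nonlinear systems with sensitive dependence on initial conditions, which is one reason the probabilistic trajectory models mentioned above remain so poor.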

If you mean something else by "complexity," please define it.
"Complexity is a scientific theory that asserts that some systems display behavioural phenomena completely inexplicable by any conventional analysis of the systems’ constituent parts.
Besides, emergence refers to the appearance of higher-level properties and behaviours of a system that while obviously originating from the collective dynamics of that system’s components -are neither to be found in nor are directly deducible from the lower-level properties of that system. Emergent properties are properties of the ‘whole’ that are not possessed by any of the individual parts making up that whole. For example, an air molecule is not a cyclone, an isolated species doesn’t form a food chain and an ‘isolated’ neuron is not conscious: emergent behaviours are typically novel and unanticipated.
Moreover, it is becoming a commonplace that, if the 20th was the century of physics, the 21st will be the century of biology, and, more specifically, mathematical biology" from p. 130 of Alaoui's "Complex Emergent Properties and Chaos (De)synchronization" in the edited volume Emergent Properties in Natural and Artificial Dynamical Systems (from Springer's series Understanding Complex Systems; Springer, 2006)
 

PolyHedral

Superabacus Mystic
So, quantum electrodynamics and the n-body (classical mechanics) problem ensure that this is an n-body, not many-body, system?
The distinction is lost on me.

Why exactly should we approach this using Hamiltonian dynamics? Or rather, why would that clarify things?
See in a moment.
Far more importantly, after I stated that emergent functional processes governed system dynamics, you want me to rephrase this in terms of classical equations of motion (or Hamiltonian) which
1) Are approximations
AFAIK, according to known physics, they are only approximations in regimes where gravity or interatomic forces become significant. A living cell has neither of those, and therefore, QED should be "exact" for all useful purposes. (Up to errors on fundamental constants and input data.)

2) rather completely contradict the entire idea behind emergent functional processes, as the properties of these are the models for internal dynamics which cannot even in principle be understood through any known equations of motion.
I'll get back to this.

"a living system as a clearly distinguishable network of processes for production of elements constituting and re-activating the network that produces these elements...Any formal description of such a living network is impossible with current mathematics." from Simeonov's paper "Integral biomathics", Progress in Biophysics and Molecular Biology 102 (2010) 85-121
Ditto.

The methods for approximating molecular kinetics are just that: approximations. In Microbial Systems Biology: Methods and Protocols (Methods in Molecular Biology 881), part VI is on kinetic modelling techniques for cells. The paper by Singharoy, Joshi, Cheluvaraja, Miao, Brown, & Ortoleva, "Simulating Microbial Systems: Addressing Model Uncertainty/Incompleteness via Multiscale and Entropy Methods", specifically discusses the problems and challenges with using the "Newtonian dynamics of an N-atom system" for anything other than predictive approximation ("coarse-grained") models of system parameters.
I did say quantum. :p

More importantly, "Developing a general platform for simulating microbial systems requires a broad understanding of self-organization phenomena since it may not be otherwise feasible to have sufficient data to impose intracellular or other structure. In contrast, if one uses a model that incorporates self-organization mechanisms, then many of the spontaneously emerging structures within a microbe can be predicted."

In other words, the emergent properties you want explained in terms of classical or even QED equations of motion involve two fundamentally contradictory approaches. One is about the motion of bodies in terms of classical (or Hamiltonian) dynamics; the other concerns how systems produce processes which govern system dynamics. It's like asking someone to explain quantum entanglement in terms of Newtonian physics.

However, if you insist: in terms of the n-body problem, biological systems (and in particular the brain), like nonlinear systems with emergent properties in general, are treated in one of two ways: either the component parts are plugged into nonlinear equations to create probabilistic models in which (hopefully) over a short period of time internal (motion) dynamics follow a trajectory within a certain range (or space), or the system is represented in non-formal (or semi-formal) holistic diagrams (usually using graph theory or some hybrid). At the moment, models using the former method type are very, very poor.
You are declaring quantum physics, and reductionist physics as a whole, wrong. Why should I believe you? :p What possible experiment could one do that could demonstrate that the laws of physics care about scale, and change depending on how big you are, as all of this seems to imply?

In fact, that's what you're doing. In order for you to be right, physics has to change behaviour depending on scale, and therefore the entire Standard Model is wrong. Go collect your Nobel. :p

Besides, emergence refers to the appearance of higher-level properties and behaviours of a system that while obviously originating from the collective dynamics of that system’s components -are neither to be found in nor are directly deducible from the lower-level properties of that system. Emergent properties are properties of the ‘whole’ that are not possessed by any of the individual parts making up that whole. For example, an air molecule is not a cyclone, an isolated species doesn’t form a food chain and an ‘isolated’ neuron is not conscious: emergent behaviours are typically novel and unanticipated.
I agree with all of this, but don't see how it supports your point. If these are the phenomena you're talking about, then these things can be modelled computationally.
 

idav

Being
Premium Member
The brain is reducible because you can cut half of it out and it could still function. It is redundant, so it just reroutes and reprograms each cell one at a time as it is added to the network.
 

The Wizard

Active Member
Not too many of us suggest we have no free will in our everyday life.

However it does get kind of mystical when we wonder about what is happening, after a bout of idle mind, upstream from the underlying causes of our next thought.

Any thoughts anybody?
:human:
We, as a conscious species, definitely have conscious free will. But the level of it is dictated by various influences, as well as our own level of disciplined self-awareness and, quite simply, awareness... imho.
 

LegionOnomaMoi

Veteran Member
Premium Member
The distinction is lost on me.

Quantum "many-body" problems fall into two "classes": those that (like their classical counterparts) become very difficult very quickly simply because of the number of "bodies", but in which all the "bodies" are distinguishable (i.e., for all particles n such that n is a component of the system (Hn), there exists a physical property which allows us to describe the system using QM formalism).

This isn't always true, thanks to the principle of indistinguishability. In quantum systems, it is not always the case that mutually interacting particles in some system are even in principle distinguishable.
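The difference indistinguishability makes can be seen in elementary state-counting. The following is my own illustration (two particles distributed over three single-particle states), not an example from the texts cited: labelled particles, identical bosons, and identical fermions give three different counts of the same configuration space.

```python
from itertools import product, combinations_with_replacement, combinations

# Two particles distributed over three single-particle states.
states, k = range(3), 2

# Distinguishable (classical, labelled) particles: ordered assignments.
distinguishable = len(list(product(states, repeat=k)))

# Identical bosons: unordered, state-sharing allowed.
bosons = len(list(combinations_with_replacement(states, k)))

# Identical fermions: unordered, no sharing (Pauli exclusion).
fermions = len(list(combinations(states, k)))

print(distinguishable, bosons, fermions)   # 9 6 3
```

Even with no interactions at all, the particles "influence" each other simply through which joint states are countable, which is the point of the Dickhoff & Neck passage quoted below.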
AFAIK, according to known physics, they are only approximations in regimes where gravity or interatomic forces become significant.

1) The "n-body problem" as understood classically is pretty irrelevant, simply because it was formulated to describe celestial orbits. At the beginning of Greenspan's N-Body Problems And Models (World Scientific, 2004), he has a section "Problem Statement" which defines the N-body problem. Here's a scan from the page:

[scan of Greenspan's "Problem Statement" section defining the N-body problem]



The "classical" n-body problem was about "collision" in space. In general, neither QED nor other physical sciences talk about N-body problems much. Usually, textbooks, sources for active researchers, and published research either do not mention the N-body problem at all, or they mention it in passing or implicitly by e.g. the "two-body" problem.


2) It is the incorporation of the N-body problem into the dynamical systems approach which is really behind the irrelevancy of the N-body problem. Texts like Introduction to Hamiltonian Dynamical Systems and the N-Body Problem are written for pedagogical reasons: "The main example developed in the text is the classical N-body problem; i.e., the Hamiltonian system of differential equations that describes the motion of N point masses moving under the influence of their mutual gravitational attraction...But this is not a book about the N-body problem for its own sake. Very few of the special results that only apply to the N-body problem are given." p. vii
Useful work on this relic of Newtonian physics was incorporated into the various methods (ODEs, multidimensional scaling, fractal analysis, etc.) used to "simplify" or approximate the dynamics of nonlinear systems.

A living cell has neither of those, and therefore, QED should be "exact" for all useful purposes. (Up to errors on fundamental constants and input data.)

The issue here is not just the relationship between QED (or quantum theories in general) and biological systems (including cells), but a common misconception about the relationship between quantum theory, classical theory, and the physical world. From the edited volume Advances in Quantum Theory:

"There are some popular but insufficient ideas about the distinctions of quantum physics and classical physics. One misleading distinction concerns the scope of application to microphysics and macrophysics, respectively...Popular but false is also the distinction between a "fuzzy" quantum theory and a "sharp" classical physics. It ignores that quantum theory provides for the most accurate description of nature we ever had. Classical physics nourishes the illusion of exactness. Its mathematical structure is based on the assumption of “arbitrarily smooth changes” of any variable. While this is a precondition for calculus, it is by no means always afforded by nature."
(sect. 4 of "Quantum Theory as Universal Theory of Structures – Essentially from Cosmos to Consciousness").

From the same (p. 13) "Life is characterized as control and timing, enabled by quantum information...In the self-regulation of organisms – extending even to consciousness in the later stages of the biological evolution – quantum effects can become operational at the macroscopic level."

There are plenty of journals, research papers, volumes, etc., on the inadequacy of the "classical physics" approach to biology. I've referenced many, and works like
Quantum Biochemistry: Electronic Structure and Biological Activity (Wiley VCH; 2010), or the contributions in Quantum Aspects of Life (Imperial College Press, 2008) represent current, cutting edge research in multiple fields.


You are declaring quantum physics, and reductionist physics as a whole, wrong. Why should I believe you?

1) There is no "quantum physics" which exists as a broadly, clearly understood and accepted model, nor is there a standard methodological paradigm for physics research or experimental design/interpretation in either the life or natural sciences.
2) I'm providing you with references to scientific literature across fields and their results. You seem to think that quantum physics and its relation to classical mechanics (as well as the relation of both to the life sciences) are generally understood, and that I'm giving you some "fringe" research or publications. Neither is true.
We can go as far back as Schrödinger's What is Life? to see this. He argued that QM plays "a dominating role in the very orderly and lawful events within a living organism" and that they "determine important characteristics of its functioning." He even predicted that as yet unknown physical laws/theories would be discovered which would be "integral" in understanding "living matter".

As for this:

What possible experiment could one do that could demonstrate that that laws of physics care for scale, and change depending on how big you are, as all of this seems to imply?

1) What "laws" of physics? Apart from what I said above, look at Westerhoff & Kell's point in "The methodologies of systems biology" (in Systems Biology: Philosophical Foundations; Elsevier, 2007): the "reduction of molecular biology and biochemistry to the underlying physics and chemistry is rare, and not even an aim of these disciplines anymore; both disciplines are entirely successful on the basis of their own concepts and laws, immaterial whether these are reducible to physics and chemistry or not" (p. 37).

Although things are changing (thanks largely to work in complex systems), a great deal of work in biology is only achieved by quietly "ignoring" physics and chemistry, as everything from transmembrane ion dynamics to enzyme catalysis to something as basic as "the pathways of processes that make living cells operate" (ibid) can't seem to be fit into our current understanding of physics.

That's just cellular stuff. It gets worse for more complex biosystems. See, e.g., "Nonlocal mechanism for cluster synchronization in neural circuits", where "nearly zero-lag synchronization (ZLS)" activity "among two or more cortical areas which do not share the same input" is "governed by a nonlocal quantity" (an emergent functional property they explain in terms of GCD-clusters).

The evidence that our understanding of, or approach to, modern physics is flawed (perhaps seriously) ranges from implicit evidence (in that biological mechanisms, structures, and functions can't be reduced to known physical laws), to theoretical proofs (similar to EPR and Bell's work) that biological systems operate through closed, circular causality, to experimental data we can't explain through modern physics (including basic electrodynamics of cellular interactions). And all of that, all the problems complex systems (especially biological ones) pose for modern physics, mentioned and unmentioned but there in the literature, has nothing to do with consciousness. It's the current state of research in a variety of scientific disciplines creating the problems, not just some mind-body problem.

Maybe we just lack whatever is needed to explain all this with a reductionist physics validated through experimental results. But that isn't the way things are heading at the moment, nor does "irreducible" entail (the naive version of) anti-materialism.

Go collect your Nobel. :p

It's already been awarded. In the free online book I linked to is a paper "Quantum Transition State for Peptide Bond Formation in the Ribosome". The use of quantum crystallography to understand ribosome structure and function was the reason that one of the authors was awarded the Nobel prize.

these things can be modelled computationally.

How?
 

PolyHedral

Superabacus Mystic
Quantum "many-body" problems fall into two "classes": those that (like their classical counterparts) become very difficult very quickly simply because of the number of "bodies", but in which all the "bodies" are distinguishable (i.e., for all particles n such that n is a component of the system (Hn), there exists a physical property which allows us to describe the system using QM formalism)
I wasn't aware there was a functional difference. AFAIK, the modelling worked the same whether or not the particles were distinguishable - it just gave different answers.
(According to Wiki, "n-body" means the classical version, and "many-body" is the quantum version, and no other distinction is alluded to. I'll mention what I specifically mean further on.)
1) The "n-body problem" as understood classically is pretty irrelevant, simply because it was formulated to describe celestial orbits. At the beginning of Greenspan's N-Body Problems And Models (World Scientific, 2004), he has a section "Problem Statement" which defines the N-body problem. Here's a scan from the page:
You're explaining what the n-body problem is to me when I brought it up in the first place. :sarcastic
The "classical" n-body problem was about "collision" in space. In general, neither QED nor other physical sciences talk about N-body problems much. Usually, textbooks, sources for active researchers, and published research either do not mention the N-body problem at all, or they mention it in passing or implicitly by e.g. the "two-body" problem.

2) It is the incorporation of the N-body problem into the dynamical systems approach which is really behind the irrelevancy of the N-body problem. Texts like Introduction to Hamiltonian Dynamical Systems and the N-Body Problem are written for pedagogical reasons: "The main example developed in the text is the classical N-body problem; i.e., the Hamiltonian system of differential equations that describes the motion of N point masses moving under the influence of their mutual gravitational attraction...But this is not a book about the N-body problem for its own sake. Very few of the special results that only apply to the N-body problem are given." p. vii
So the n-body problem isn't used generally by physicists... and? I'm talking about it, so unless there's a good reason not to, I'll continue.

Specifically, I meant the example of modelling n particles obeying an inverse-square law between each other. The setup can be done quite easily classically, and I suspect the equation(s) describing the quantum version can also be constructed fairly easily. I'm aware that it's currently impossible to solve even classically through pure analytics.

However, since I deliberately picked the problem to be non-linear and arbitrarily scalable, why doesn't it count as a "dynamic" system?
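The setup described above really is easy to write down. Here is a minimal sketch (masses, units, and initial conditions are arbitrary choices of mine): n point masses under a softened inverse-square force, stepped numerically with semi-implicit Euler precisely because no general analytic solution exists for n ≥ 3.

```python
def accelerations(pos, masses, G=1.0, eps=1e-3):
    """Pairwise inverse-square accelerations, softened by eps so a
    close encounter can't divide by zero."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
            f = G * masses[j] / (dx * dx + dy * dy + eps * eps) ** 1.5
            acc[i][0] += f * dx
            acc[i][1] += f * dy
    return acc

# Three bodies: the smallest n with no general closed-form solution.
masses = [1.0, 1.0, 1.0]
pos = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
vel = [[0.0, 0.3], [0.0, -0.3], [0.3, 0.0]]
dt = 0.001
for _ in range(1000):                 # semi-implicit (symplectic) Euler
    acc = accelerations(pos, masses)
    for i in range(3):
        vel[i][0] += dt * acc[i][0]
        vel[i][1] += dt * acc[i][1]
        pos[i][0] += dt * vel[i][0]
        pos[i][1] += dt * vel[i][1]

# Newton's third law makes total momentum a conserved quantity,
# so it serves as a sanity check on the integration.
px = sum(m * v[0] for m, v in zip(masses, vel))
py = sum(m * v[1] for m, v in zip(masses, vel))
print(abs(px - 0.3) < 1e-6 and abs(py) < 1e-6)
```

Note that setting the equations up and integrating them numerically is the easy part; the dispute in this thread is over whether such an integration captures, or can even represent, emergent functional processes.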
The issue here is not just the relationship between QED (or quantum theories in general) and biological systems (including cells), but a common misconception about the relationship between Quantum theory, Classical theory, and the physical world. From the edited volume Advanced in Quantum Theory :
"There are some popular but insufficient ideas about the distinctions of quantum physics and classical physics. One misleading distinction concerns the scope of application to microphysics and macrophysics, respectively...Popular but false is also the distinction between a "fuzzy" quantum theory and a "sharp" classical physics. It ignores that quantum theory provides for the most accurate description of nature we ever had. Classical physics nourishes the illusion of exactness. Its mathematical structure is based on the assumption of “arbitrarily smooth changes” of any variable. While this is a precondition for calculus, it is by no means always afforded by nature."
(sect. 4 of "Quantum Theory as Universal Theory of Structures – Essentially from Cosmos to Consciousness").
I don't see your point. Are you saying that quantum field theory does not give accurate answers?
From the same (p. 13) "Life is characterized as control and timing, enabled by quantum information...In the self-regulation of organisms – extending even to consciousness in the later stages of the biological evolution – quantum effects can become operational at the macroscopic level."
So, what experiment have you (or anyone) done to show this? Because macroscopic quantum phenomena are a pretty big claim.
1) There is no "quantum physics" which exists either as a broadly, clearly understood and accepted model
So what do we call this, then? (I'm aware there is fraying around the edges, but it's mostly correct.) Are you saying the EU spent $9bn on an experiment to find the Higgs particle without knowing that the rest of the theory worked?
2) I'm providing you with references to scientific literature across fields and their results. You seem to think that quantum physics and its relation classical mechanics (as well as these to the life sciences) are generally understood and that I'm giving you some "fringe" research or publications. Neither is true.
I have to - you are contradicting science itself. You're not even contradicting a specific result, but saying "All science ever done has been fundamentally flawed." Unless you can build me a device as revolutionary as a transistor out of this, it's not rational for me to believe you. However, even that's hard, because a consequence of the "theory" is that no further predictions are possible. That's not science - it has no explanatory power!
1) What "laws" of physics?
# The universe is made of... #
(See also the link to the Standard Model earlier.)
Apart from what I said above, look at Westerhoff & Kell's point in "The methodologies of systems biology" (in Systems Biology: Philosophical Foundations; Elsevier, 2007): the "reduction of molecular biology and biochemistry to the underlying physics and chemistry is rare, and not even an aim of these disciplines anymore; both disciplines are entirely successful on the basis of their own concepts and laws, immaterial whether these are reducible to physics and chemistry or not" (p. 37).
So is Catholicism.
Although things are changing (thanks largely to work in complex systems), a great deal of work in biology is only achieved by quietly "ignoring" physics and chemistry, as everything from transmembrane ion dynamics to enzyme catalysis to something as basic as "the pathways of processes that make living cells operate" (ibid) can't seem to be fit into our current understanding of physics.
So why isn't literally every physicist in the world scrambling to rewrite the textbooks? You are implying that physics, as a whole, collectively, is wrong. Why isn't anyone responding to that?
That's just cellular stuff. It gets worse for more complex biosystems. See, e.g., "Nonlocal mechanism for cluster synchronization in neural circuits", where "nearly zero-lag synchronization (ZLS)" activity "among two or more cortical areas which do not share the same input" is "governed by a nonlocal quantity" (an emergent functional property they explain in terms of GCD-clusters).
How on earth do you measure "non-local quantities" on things so small? The brain is on the order of a few centimetres across - the time it takes an EM field to propagate through there is minuscule.
theoretical proofs (similar to EPR and Bell's work)
These are perfectly solvable if you drop the bloody stupid Copenhagen interpretation.
that biological systems operate through closed, circular causality,
This, meanwhile, would produce a thermodynamics violation.
and finally experimental data we can't explain through modern physics (including basic electrodynamics of cellular interactions).
I'm surprised you can measure them well enough for that conclusion to be valid at all.
Simulate the small bits, and the interaction laws, and watch.
 

LegionOnomaMoi

Veteran Member
Premium Member
I wasn't aware there was a functional difference.
It's kind of a basic, fundamental part of the many-body problem, such that textbooks which have a section on it lead with it. In fact, Dickhoff & Neck's textbook Many-body theory exposed! Propagator description of quantum mechanics in many-body systems (World Scientific, 2005) does this for the whole book, beginning in sect. 1.1 with "In a quantum many-body system, particles of the same species are completely indistinguishable. Moreover, even in the absence of mutual interactions they still have a profound influence on each other, since the number of ways in which the same quantum state can be occupied by two or more particles is severely restricted." Perhaps even more to the point is Christian Lubich's discussion (sect. 1.5) of the issue in From Quantum to Classical Molecular Dynamics: Reduced Models and Numerical Analysis (Zurich Lectures in Advanced Mathematics; European Mathematical Society, 2008). Rather than stopping at distinguishable/indistinguishable, Lubich moves immediately to the "severe problems" one faces with "any attempt to 'solve' numerically the molecular Schrödinger equation", thanks to high dimensionality even for simple molecules, oscillations, etc. Perhaps most important, here again we find the fuzzy quantum/classical relationship (italics in original; emphasis added): "To obtain satisfactory results in spite of these difficulties, one requires a combination of model reduction, based on physical insight and/or asymptotic analysis, and numerical techniques used on the reduced models that are intermediate between classical and full quantum dynamics."

The many-body problem can present an entirely different set of issues; hence the enormous amount of work in quantum chemistry developing techniques to approximate solutions. One set of techniques involves Monte Carlo simulations, but the problem here is again the distinction between the "Newtonian" N-body problem and that of quantum theory. First, there "is no unique extension of the Monte Carlo method as applied in classical statistical mechanics to quantum statistical mechanics that could deal well with all these problems", and not only do classical methods fail, "For some quantum mechanical many-body problems even understanding the groundstate is a challenge" (both quotes are from A Guide to Monte Carlo Simulations in Statistical Physics by Landau & Binder; Cambridge University Press, 2009).
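For contrast, here is a minimal sketch (my own, not from Landau & Binder) of the Metropolis Monte Carlo method as it is applied in classical statistical mechanics: sampling a single coordinate from a Boltzmann distribution over a double-well potential. The potential, temperature, and step size are arbitrary illustrative choices. It is this kind of classical sampling that has no unique quantum extension.

```python
import random, math

# Classical Metropolis sampling of exp(-E(x)/kT) for the
# double-well potential E(x) = (x^2 - 1)^2 (illustrative values).
def energy(x):
    return (x * x - 1.0) ** 2

def metropolis(steps=20000, kT=0.3, step=0.5, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(steps):
        trial = x + rng.uniform(-step, step)
        dE = energy(trial) - energy(x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if dE <= 0 or rng.random() < math.exp(-dE / kT):
            x = trial
        samples.append(x)
    return samples

samples = metropolis()
# At this temperature the walker spends most of its time near the
# two minima at x = +1 and x = -1.
print(sum(abs(s) for s in samples) / len(samples))
```

The quantum analogue cannot simply reuse this recipe, because the "energy of a configuration" is no longer a classical function of particle coordinates.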
AFAIK, the modelling worked the same whether or not the particles were distinguishable - it just gave different answers.

Different techniques, different problems, and not always any answer. "Classical" N-body problems deal with the motion of "bodies". In order to begin to approach any many-body problem, it's rather important to know how many "bodies" there are, but in the quantum extension of N-body problems "[t]he occupation probabilities of mutually-interacting identical particles overlap, which makes their identification impossible. Every physical problem which requires the observation of single particles is physically meaningless for systems of identical particles!" p. 3 of Nolting's Fundamentals of Many-body Physics (Springer, 2009).

You're explaining what the n-body problem is to me when I brought it up in the first place. :sarcastic

I'm trying to understand exactly how you understand both the N-body problems and the many-body extensions. For example, you stated:
AFAIK, according to known physics, they are only approximations in regimes where gravity or interatomic forces become significant. A living cell has neither of those, and therefore, QED should be "exact" for all useful purposes.


According to "known physics", whether or not one thinks that a living cell is ultimately reducible or not, the idea that "QED should be 'exact' for all useful purposes" is counter to, well, QED, biology, physics, and chemistry. The following is from the Proceedings of the 14th International Conference on Recent Progress in Many-Body Theories (vol. 11 of the Series on Advances in Quantum Many-Body Theory): "At the heart of quantitative chemistry is the formidable mathematical task consisting in finding accurate eigensolutions of the N-body electronic Schrödinger equation. This task is particularly difficult for several reasons. First, the precision required to meet the "chemical" accuracy in realistic applications is very high. For example, in the case of small organic molecules the calculation of atomization energies (the energy needed to break apart a molecule into separated atoms) requires a relative error on the total ground-state energies of at least 10^-3...For intermolecular forces (hydrogen bonds, van der Waals interaction, etc.), the accuracy needed is at least 10^-6"

The authors (Caffarell & Ramirez-Solis) continue with the other issues in their contribution to the volume, but their paper, the conference itself, and the series exist because this:
AFAIK, according to known physics, they are only approximations in regimes where gravity or interatomic forces become significant. A living cell has neither of those, and therefore, QED should be "exact" for all useful purposes.

is inaccurate at every level. First, interatomic forces are significant. Second, gravity and interatomic forces can be trivial and we still wouldn't be able to get close to this "exact" level you refer to. Finally, if it were true that QED would allow us to exactly describe cellular dynamics, isn't it a bit suspicious that no one in the fields of biology, chemistry, or physics seems to know this?

So the n-body problem isn't used generally by physicists... and? I'm talking about it, so unless there's a good reason not to, I'll continue.

I don't mind talking about it, as for me it's mostly just some different terms but all the same equations. However, if you want to frame molecular, biological, or other dynamical systems in terms of the N-body (or many-body) problems, then you have to approach these problems as they are dealt with in these fields today, rather than the 17th century celestial problem. It doesn't matter to me if you say many-body, N-body, nonlinear system, chaotic system, etc. But you appear to think that QED makes cells a no-brainer, when in reality things like sand and other "granular media" are the subject of intensive study because of "strongly non-Newtonian" flows, structures, bonds, bridges, etc. (see e.g., Mehta's Granular Physics or the edited volume Micromechanics of Granular Materials).

There's a reason why the continual stream of unifying theories generally makes reference to the applicability of the theory to biology: it's because at the moment, we are very much in the dark. Nottale's Scale Relativity and Fractal Space-Time: A New Approach to Unifying Relativity and Quantum Mechanics (Imperial College Press, 2011), for example, ends with how his approach can enable an understanding of biological processes, in particular through the incorporation of his method with that of systems biology.

Specifically, I meant the example of modelling n particles obeying an inverse-square law between each other.

Which is totally inapplicable to even molecular or chemical physics, let alone how these work in biological systems. In molecular/chemical physics, the "many-body" problem shows up in "many-body perturbation theory", but again as an approximation. Chemical kinetics involve both quantum and classical approximations when we're dealing with molecular dynamics (including simple reactions). We don't know exactly what's going on, and once again there is an increasing amount of literature on the need to stop ignoring or bypassing (via bad approximations) physics (see e.g., Ab Initio Molecular Dynamics: Basic Theory and Advanced Methods by Marx & Hutter; Cambridge University Press, 2009). However, as complex as chemical & molecular physics are, the approaches so far have not yielded much progress when applied to biological systems. Cells don't just have spontaneous structures and complex reactions, they are self-organizing dynamic complexes with emergent functional processes affecting constituent parts.

In other words, the "inverse-square law" barely applies to physics at the molecular level at all, and is woefully inadequate to understand cellular processes.

I don't see your point. Are you saying that quantum field theory does not give accurate answers?

That depends on what you mean. I think MacKinnon put it as concisely as possible in the preface to Interpreting Physics: Language and the Classical/Quantum Divide (Boston Studies in the Philosophy of Science, Vol. 289): "In contemporary particle physics conclusions from a theory are never tested against observations. They are tested against inferences based on observations and a network of presuppositions supporting the inferential process of experimental physics."

For a more detailed account of the measurement problem, see Measurements in Quantum Mechanics, or even just the first chapter (it's free).
 

LegionOnomaMoi

Veteran Member
Premium Member
(cont. from above)

I'm making several statements:

1) QM has performed wondrously, but the continued experimental success and constant accuracy is, well, intricately tied to the fact that the experiments determine the result. That's the measurement problem (simplified).
2) QM can only be considered complete mathematically. However, these mathematics are supposed to represent physical processes, and the fact that the formalisms are interpreted so as to be consistent makes the whole "completeness" a matter of circular reasoning. That this is so can be gleaned simply by surveying the various working models attempting to unify relativity and QM. Why are there so many? Mostly because they can't be verified. Instead, they are partly products of macroscopic observations and theories and partly syntactic manipulations. For example, superluminal signaling isn't "impossible" because of any empirical findings (quite the opposite). Instead, it is made impossible through interpreting the QM formalism.
3) The life sciences have a tenuous relationship with modern physics because it doesn't work. Theoretical physicists can argue all day about wavefunction collapses, but when one is trying to model cellular processes, and there are as yet no known physical laws which can be used to reduce the processes to the constituent elements, we have a problem.
4) The idea that quantum mechanics is complete was never fully accepted, and is increasingly challenged, both for the continued failure to unite quantum theory with our knowledge of the macroscopic world, and because there are an increasing number of phenomena which don't seem to fit into either classical or quantum mechanics. Biological processes (and even molecular non-biological self-organization and other complex phenomena) continue to create difficulties for classical reductionism, into which quantum theory was more or less forced.
5) It is possible that what appears to be an increasing gap in our ability to explain natural and biological systems using modern physics will end up being resolved somehow. However, there is no convincing reason to think that this is so, other than an unjustified faith in classical deterministic reductionism, which crept into the scientific enterprise not through philosophical or even theoretical means (not primarily, anyway) but through repeated experiments. An implicit assumption in experimental design, and its subsequent success, ensured the survival of deterministic reductionism for a very long time. However, the same reasons for which it became dogma are now a serious problem for it.

So, what experiment have you (or anyone) done to show this? Because macroscopic quantum phenomena are a pretty big claim.

I'm going to bring up your response to one below:

How on earth do you measure "non-local quantities" on things so small? The brain is on the order of a few centimetres across - the time it takes an EM field to propagate through there is minuscule.

The kind of measurements you're talking about were possible 20 years ago (see here). We can measure the activity of a single neuron, and advances in BOLD imaging have allowed not simply for higher spatial resolution and arbitrary voxel selection, but for increasing temporal resolution too. However, a lot of work also goes into ensuring that the measuring and the math are done correctly.

But here's the odd part: you are talking about the "impossibility" of actual measurements like the ones repeatedly found which I referenced, yet you have no problem accepting the results from experiments in which hypotheses about physical reality, properties, and processes are "tested" without actually measuring them at all?
So what do we call this, then?
Wikipedia. "Ask not if quantum mechanics is true, ask rather what the theory implies" (the opening line of Simon Saunders' contribution to the edited volume Many Worlds? Everett, Quantum Theory, and Reality; Oxford University Press, 2010).

I have to - you are contradicting science itself. You're not even contradicting a specific result, but saying "All science ever done has been fundamentally flawed."

No, I'm not. Nor do I think it is.


Unless you can build me a device as revolutionary as a transistor out of this, it's not rational for me to believe you. However, even that's hard, because a consequence of the "theory" is that no further predictions are possible. That's not science - it has no explanatory power!

1) You have fundamentally confused what I stated. There is a difference between saying that at the moment there is not any agreement about the precise relationship between classical and quantum mechanics, and that our understanding of physics may be incomplete, and saying "science is flawed".

2) You don't have to believe me. You can start reading publications by academic institutions instead of relying on wikipedia.

3) For any claim I have made, I'm more than happy to provide you with references from scientific research.

4) If "explanatory power" is such an issue for you, how do you resolve the measurement problem? And if the "standard model" as described by wikipedia is really all that standard, why isn't this reflected in the literature?

So why isn't literally every physicist in the world scrambling to rewrite the textbooks?

With what? There are plenty of texts I've cited, some of them textbooks, which you have had issues with. These were written by physicists or other scientists, some for graduate students in physics, some for graduate students in other disciplines. What exactly are you basing your understanding of theoretical physics, the life sciences, and the relationship between the two on?

The reason they aren't scrambling is because "they" don't exist as a collective. We have a standard model that most physicists don't believe to be accurate, but nothing to replace it with.

You are implying that physics, as a whole, collectively, is wrong. Why isn't anyone responding to that?
I'm not. I'm saying it's incomplete, and that it may be that certain assumptions by certain scientists are wrong. This is necessarily true, because nothing I have expressed isn't also expressed in the academic literature, so if I'm wrong, so are they.

More importantly, people are responding. More than ever. It took time because from Einstein to Penrose and Dirac there has been too much back and forth within the theoretical physics community, while applied physicists, along with those in the natural and life sciences, continued to conduct research which was either explainable through classical physics, or wasn't, but would be. That has changed for a number of reasons. Most importantly, armed with the finally sufficiently formalized nonlinear mathematics, scientists set out to model life, the universe, and everything. Only it turned out that complexity was everywhere, and by complexity I mean in particular the failure of classical reductionism to account for processes and the very nature of both biological and natural systems. Not a complete failure, to be sure. But just as the entire Newtonian paradigm was turned on its head at the beginning of the 20th century, the end saw a similar process in the other sciences. Our increasing computational abilities and other sophisticated tools increased our understanding, but tended to show us we knew less than we thought. Now, however, in response to these problems we see things like the following:
"In May 2002 a number of about 20 scientists from various disciplines were invited by the Berlin-Brandenburg Academy of Sciences and Humanities to participate in an interdisciplinary workshop on structures and structure generating processes. The site was the beautiful little castle of Blankensee, south of Berlin. The disciplines represented ranged from mathematics and information theory, over various fields of engineering, biochemistry and biology, to the economic and social sciences. All participants presented talks explaining the nature of structures considered in their fields and the associated procedures of analysis."

That was in the preface to the edited volume Emergence, Analysis and Evolution of Structures: Concepts and Strategies Across Disciplines (Understanding Complex Systems).

We have systems biology, increasingly advanced mathematical tools to model complexity, and finally a great increase in work flowing from theoretical physics into the other natural and life sciences.

These are perfectly solvable if you drop the bloody stupid Copenhagen interpretation.
They aren't. Apart from anything else, the number of physicists who have dropped it and who still don't agree tells us that.

This, meanwhile, would produce a thermodynamics violation.

Creating a problem: we observe system dynamics we can't reduce or explain without emergent properties which entail circular (or nonlinear) causality, but these emergent properties appear to violate "laws" grounded in our other observations. At the moment, this is an unsolved problem.

I'm surprised you can measure them well enough for that conclusion to be valid at all.
I'm surprised someone can be so sure about the results of physics, which consist of interpretations of measurements of things never directly measured at all (and, by the way, if we want to just say that the formalism doesn't correspond to physical reality, then we can say nothing about the existence of quantum reality or much about reality at all).

Simulate the small bits, and the interaction laws, and watch.

Done. Didn't work. Why? Because so far as we can tell, there is no way to reduce the system to "interaction laws" and components.
 

LegionOnomaMoi

Veteran Member
Premium Member
I believe things have gotten more than a little side-tracked by the discussion of the "N-body Problem", such that the more important issues receive less attention. An example of such an issue would be the state of research within the physical/natural sciences and the life sciences, and in particular whether this research supports a view of reality that is ontologically reductionist and deterministic.
My hope is that if I can suitably address the "N-body" problem, I wouldn't do what I did above: devote at least half my response to that issue, and then having spent so much time on it, fail to suitably respond to the issues directly related to the possibility of conscious, self-determining systems. With that in mind:

Here is the initial request which began the N-body discussion component of this thread:

Legion, can you explain how functional processes happens in a simpler example? Such as the n-body gravitational problem?

(because reading thousands of words either way is getting tiresome, and I suspect it's not actually getting us anywhere useful.)
(the second statement after the request could not be more true, but is also rather ironic)

This above request was followed by further clarification about what was meant and why it was relevant:
Assuming that there is no radioactive atoms involved, all of a cell's workings are essentially an n-body problem in QED. (The two interatomic forces are too small to notice, and gravity is too large to notice.)
Specifically, I meant the example of modelling n particles obeying an inverse-square law between each other

Now that both the request and clarifications are organized in a single place, I hope to construct a single organized response.
[image: statement of the classical N-body problem]

(from N-Body Problems And Models (World Scientific, 2004) by Donald Greenspan, p. xi under the section title "Problem Statement")

The above description, and indeed the N-body problem itself, is an extension of Newtonian physics and the problems Newton and others had understanding the orbits of the planets around the sun. However, it can be useful with objects in general:

Imagine that a remote control car strikes me at 30 km/h. It might hurt, but it's nothing compared to the effect of a truck hitting me at the same speed. In other words, "force" is related to mass. However, for Newton, "mass" was a whole different ball game because he was dealing with planets. So for him, things like mass and velocity were complicated by the sheer amount of mass involved, because mass means gravitational force, and planets have enough mass for this to really matter.
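What "modelling n particles obeying an inverse-square law" amounts to in practice can be sketched with a direct-summation integrator. This is a toy of my own in G = 1 units with made-up initial conditions, not anything from Greenspan:

```python
import math

# Toy direct-summation N-body integrator (G = 1 units, illustrative
# initial conditions): every pair interacts via an inverse-square force.
def accelerations(masses, pos):
    n = len(masses)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r = math.hypot(dx, dy)
            a = masses[j] / r ** 2      # inverse-square magnitude
            acc[i][0] += a * dx / r     # directed toward body j
            acc[i][1] += a * dy / r
    return acc

def step(masses, pos, vel, dt):
    acc = accelerations(masses, pos)
    for i in range(len(masses)):        # semi-implicit Euler update
        vel[i][0] += acc[i][0] * dt
        vel[i][1] += acc[i][1] * dt
        pos[i][0] += vel[i][0] * dt
        pos[i][1] += vel[i][1] * dt

# A light body circling a heavy one; circular-orbit speed is sqrt(M/r).
masses = [1000.0, 1.0]
pos = [[0.0, 0.0], [10.0, 0.0]]
vel = [[0.0, 0.0], [0.0, math.sqrt(1000.0 / 10.0)]]
for _ in range(1000):
    step(masses, pos, vel, 0.001)
```

Note that this only simulates a particular set of initial conditions; it says nothing about a general, closed-form solution, which is exactly where the classical problem runs aground.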

However, changed around a bit, an "n-body" problem can become much more interesting, even when n is 1: Imagine a pendulum hanging at rest, perpendicular to but above the ground. Let's also say that we know the mass of the pendulum, its center, how long it is, the angles we create (relative to an imagined set of right angles on either side) when we raise it, and that we simply release it.

Knowing all this, assuming an isolated system w/o friction, and armed with Newton's F=ma, this "1-body" problem would appear easy to model, i.e., to calculate the future motion.

As it turns out, this "one-body" problem has no general solution in terms of elementary functions. It's the simplest type of nonlinear systems problem: a 1d oscillating system.
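To see this concretely, the full pendulum equation theta'' = -(g/L)·sin(theta) has to be integrated numerically. A quick sketch of my own (g, L, and the step size are illustrative choices) compares the numerically obtained period with the small-angle closed form 2π√(L/g):

```python
import math

# The familiar period 2*pi*sqrt(L/g) comes only from the small-angle
# linearization sin(theta) ~ theta; the full equation must be integrated.
def quarter_period(theta0, g=9.81, L=1.0, dt=1e-5):
    """Integrate from rest at theta0 until the pendulum passes bottom."""
    theta, omega, t = theta0, 0.0, 0.0
    while theta > 0.0:
        omega += -(g / L) * math.sin(theta) * dt  # semi-implicit Euler
        theta += omega * dt
        t += dt
    return t

small = 4 * quarter_period(0.05)   # tiny swing: matches the linear formula
large = 4 * quarter_period(2.5)    # large swing: noticeably longer period
linear = 2 * math.pi * math.sqrt(1.0 / 9.81)
print(small, large, linear)
```

The period itself depends on the amplitude, which is precisely the nonlinearity the small-angle formula hides.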

Notice that we haven't had to deal with thermodynamics, quantum processes, molecular reactions, fields, and most of basic physics. What happens when we do?

I mentioned in other posts the (thankfully increasingly addressed) problem within the life sciences when it comes to physics. Even the extremely complex, sophisticated computational approaches used in molecular and chemical physics are typically inadequate. However, as usual, there are exceptions to the general rule: Dr. Hans Frauenfelder was a pioneer in the field of biological physics, and the volume The Physics of Proteins: An Introduction to Biological Physics and Molecular Biophysics (from the series Biological and Medical Physics, Biomedical Engineering; Springer, 2010) was put together by the editors in an attempt not just for use by students & researchers, but also to "capture" Frauenfelder's ingenuity.
However, as is typical of such volumes, some material is the work of other specialists to fill certain gaps. So chap. 3 is R. H. Austin's "Biomolecules, Spin Glasses, Glasses, and Solids", which begins by comparing the complexity of "a few many-body systems" to biological systems. For example, Austin notes "Solids or glasses cannot be modified on an atomic or molecular scale at a particular point: modifications are either periodic or random. In contrast, a protein can be changed at any desired place at the molecular level: Through genetic engineering, the primary sequence is modified at the desired location, and this modification leads to the corresponding change in the protein." (p. 14). The various molecular dynamics Austin compares to biomolecular dynamics are qualitatively different.

Thus, for example, "the energy surface of a crystal, that is, its energy as a function of its conformation, is nondegenerate and has a single minimum ...We can, in principle, describe the conformation of a system by giving the coordinates of all atoms. The energy hypersurface then is a function of all of these coordinates"

This does not mean these many-body systems are simple to reduce in order to understand the emergent structure through its constituents. One barrier is the phenomenon (aptly) termed "frustration":

"Frustration generally implies that there is no global ground energy state but rather a large number of nearly isoenergetic states separated by large energy barriers. In other words: imagine that you take a large interacting system and split it into two parts. Minimize the energy of the two separate parts. Now bring the two parts back together. A frustrated system will not be at the global energy minimum because of the interactions across the surface of the two systems.
There are a number of physical consequences that arise from frustration. The primary one that we wish to stress here is the presence of a dense multitude of nearly isoenergetic states separated by a distribution of energy barriers between the states. This is due to the presence of frustrated loops that cannot be easily broken by any simple symmetry transformation. This complexity inherently leads to distributions of relaxation times." (italics in original; emphasis added)
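Frustration can be illustrated with the standard toy case (my example, not Austin's): three Ising spins on a triangle with antiferromagnetic couplings. No configuration can satisfy all three bonds at once, so the minimum energy is shared by many states.

```python
from itertools import product

# Three Ising spins on a triangle, antiferromagnetic coupling J > 0:
# each bond "wants" its two spins anti-aligned, but on a triangle at
# least one bond is always unsatisfied.
J = 1.0

def energy(s):
    s1, s2, s3 = s
    return J * (s1 * s2 + s2 * s3 + s3 * s1)

configs = list(product([-1, 1], repeat=3))
energies = [energy(c) for c in configs]
ground = min(energies)
degenerate = sum(1 for e in energies if e == ground)
print(ground, degenerate)  # -1.0 6
```

Six of the eight configurations share the minimum energy, a miniature version of the "dense multitude of nearly isoenergetic states" described above.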

However, as complex as things like spin glass dynamics can be, they are nothing compared to biomolecules. Biomolecules "are complex many-body systems. Their size indicates that they lie at the border between classical and quantum systems. Since motion is essential for their biological function, collective phenomena play an important role. Moreover, we can expect that many of the features involve nonlinear processes. Function, from storing information, energy, charge, and matter, to transport and catalysis, is an integral characteristic of biomolecules. The physics of biomolecules stands now where nuclear, particle, and condensed matter physics were around 1930." (emphasis added).

We can further illustrate the difficulties biomolecular processes present with one found in the book: ligand binding to hemoglobin. Hill, who attempted to simplify the process for modelling, "introduced in 1913 a hypothetical nonlinear equation that is unphysical in that it assumes an n-body interaction...it implies that n ligands compete simultaneously for one binding site" (p. 87). Using the ol' n-body approach failed miserably. The model had no "physical interpretation" but rather was "a way to parameterize the data we don’t understand in terms of the basic physics of proteins" (p. 89).
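For reference, the Hill equation itself is trivial to write down; what it lacks is physical interpretation. A quick sketch of my own (the constant K is an arbitrary illustrative value; n ≈ 2.8 is in the range typically fitted for hemoglobin):

```python
# The Hill equation as a pure parameterization: fractional saturation
# theta = p**n / (K + p**n). The fitted n is not a literal count of
# simultaneously competing ligands - that is the "unphysical" n-body
# assumption noted above. K = 26.0 is an arbitrary illustrative value.
def hill(p, n=2.8, K=26.0):
    return p ** n / (K + p ** n)

# Saturation rises sigmoidally with ligand concentration/pressure p.
curve = [round(hill(p), 3) for p in (1, 2, 3, 5, 10)]
print(curve)
```

The curve fits binding data well, which is exactly why the equation survived despite describing no actual physical mechanism.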

The reductionist approach (including the extended "n-body" of the early 20th century), which succeeded in the case of some nonbiological chemical reactions, stands in stark contrast with biomolecules, as "biomolecules provide a complex but highly organized environment that can affect the course of the reaction" (p. 125), rendering the reduction of the reaction process to its constituents ineffective.

To return again to this:
Legion, can you explain how functional processes happens in a simpler example? Such as the n-body gravitational problem?

The answer is (again) "no", but this time with an example of just such an attempt in the early 20th century, before it was widely recognized that the extension of 17th century methods presented problems in general, and was replaced by or incorporated into the dynamical systems approach. Additionally, while the complex, nonlinear dynamics of molecules present more than their share of challenges, they are qualitatively different from the complexity inherent in biomolecular processes, structures, and functioning.

"n-body" isn't used not simply because it has been replaced by "many-body" problems. The pioneers of "chaos theory" incorporated equations of motion into their models to start with, and it was the suprising complexity of these equations to work even in "1-body" cases like the pendulum which rendered so thoroughly irrelevent the "n-body problem", even the extended form (which was not the "n-body gravitational problem").
 

PolyHedral

Superabacus Mystic
..."In a quantum many-body system, particles of the same species are completely indistinguishable. Moreover, even in the absence of mutual interactions they still have a profound influence on each other, since the number of ways in which the same quantum state can be occupied by two or more particles is severely restricted."
I would not have taken the bold statement to imply what he used it to imply. The fact that there are fewer ways to arrange indistinguishable objects than distinguishable ones isn't really an "influence" quantum has - any more than arithmetic being demonstrated when we combine objects is an influence classical mechanics has.

..."To obtain satisfactory results in spite of these difficulties, one requires a combination of model reduction, based on physical insight and/or asymptotic analysis, and numerical techniques used on the reduced models that are intermediate between classical and full quantum dynamics."
IOW, it's very very hard to do, and therefore we need an approximation if we want an answer within a limited amount of time/memory? Fine - the important bit is the theoretical possibility of calculating the answer without any reduction or approximation.

Different techniques, different problems, and not always any answer. "Classical" N-body problems deal with the motion of "bodies". In order to begin to approach any many-body problem, it's rather important to know how many "bodies" there are, but in the quantum extension of N-body problems "[t]he occupation probabilities of mutually-interacting identical particles overlap, which makes their identification impossible. Every physical problem which requires the observation of single particles is physically meaningless for systems of identical particles!" p. 3 of Nolting's Fundamentals of Many-body Physics (Springer, 2009).
I don't follow the logic. Of course its not meaningful to look for "that" particle, if there are multiple identical ones?

...First, the precision required to meet the "chemical" accuracy in realistic applications is very high. For example, in the case of small organic molecules the calculation of atomization energies (the energy needed to break apart a molecule into separated atoms) requires a relative error on the total ground-state energies of at least 10^-3...For intermolecular forces (hydrogen bonds, van der Waals interaction, etc.), the accuracy needed is at least 10^-6"
The masses, charges and electric constant are known to an error of 10^-8. (IIRC, from having looked it up in the past)

The authors (Caffarell & Ramirez-Solis) continue with the other issues in their contribution to the volume, but their paper, the conference itself, and the series exist because this:
is inaccurate at every level.
Notice that "exact" is in quote marks. The quote marks are there because it isn't going to be exact - I know how measurement uncertainty works. The point I was making was that QED will deliver - in a specific domain of energy/length scales and objects - a result whose accuracy is limited by the uncertainty in input data. Newtonian mechanics does the same, but its domain of usefulness is a lot smaller.

First, interatomic forces are significant
In a cell involving no unstable nuclei? Why?

Finally, if it were true that QED would allow us exactly describe cellular dynamics, isn't it a bit suspicious that no one in the fields of biology, chemistry, or physics seems to know this?
I don't see them denying that QED is accurate - the sources you've cited say that solving complex QED problems is infeasible. Infeasibility is not the same as uncomputability.

It doesn't matter to me if you say many-body, N-body, nonlinear system, chaotic system, etc. But you appear to think that QED makes cells a no-brainer, when in reality things like sand and other "granular media" are the subject of intensive study because of "strongly non-Newtonian" flows, structures, bonds, bridges, etc. (see e.g., Mehta's Granular Physics or the edited volume Micromechanics of Granular Materials).
"A solution can theoretically be computed with arbitrarily powerful and fast hardware" is not equivalent to anything being a "no-brainer".

There's a reason why the continual stream of unifying theories generally makes reference to the applicability of the theory to biology: it's because at the moment, we are very much in the dark. Nottale's Scale Relativity and Fractal Space-Time: A New Approach to Unifying Relativity and Quantum Mechanics (Imperial College Press, 2011), for example, ends with how his approach can enable an understanding of biological processes, in particular through the incorporation of his method with that of systems biology.
Having looked at the premise of Scale Relativity, it seems to contradict what you were saying earlier about biology being uncomputable in principle. (Since there are very few measures of "complexity" you can use in relative scale theory.)


Cells don't just have spontaneous structures and complex reactions, they are self-organizing dynamic complexes with emergent functional processes affecting constituent parts.
You can't explain to me what a "functional process" is, so how do you know one when you see it?

That depends on what you mean. I think MacKinnon put it as concisely as possible in the preface to Interpreting Physics: Language and the Classical/Quantum Divide (Boston Studies in the Philosophy of Science, Vol. 289): "In contemporary particle physics conclusions from a theory are never tested against observations. They are tested against inferences based on observations and a network of presuppositions supporting the inferential process of experimental physics."
...What?
The sentence you quoted appears to either misunderstand how inferential science works, or is suggesting that physics is not science. The more advanced theories cannot be directly tested against observation, because our squishy biology does not have the equipment to observe the results. We can't observe that quarks or muons or even individual atoms exist directly; they're inferred to exist from theory - or from "a network of presuppositions."

1) QM has performed wondrously, but the continued experimental success and constant accuracy is, well, intricately tied to the fact that the experiments determine the result. That's the measurement problem (simplified).
The measurement problem is that the experimental measurement disturbs the system being measured. What does that have to do with "determining the result?"

3) The life sciences have a tenuous relationship with modern physics because it doesn't work. Theoretical physicists can argue all day about wavefunction collapses, but when one is trying to model cellular processes, and there are as yet no known physical laws which can be used to reduce the processes to the constituent elements, we have a problem.
I don't think it's possible, in practical terms, for you to be able to justify that claim, let alone to have actually done so. The only way I can think of to demonstrate that no known physical laws reduce cell processes to their constituent elements is to actually do the quantum field theory calculation and show that the answer you get is wrong.

4) The idea that quantum mechanics is complete was never fully accepted, and is increasingly challenged, both for the continued failure to unite quantum theory with our knowledge of the macroscopic world, and because there are an increasing number of phenomena which don't seem to fit into either classical or quantum mechanics. Biological processes (and even molecular non-biological self-organization and other complex phenomena) continue to create difficulties for classical reductionism, into which quantum theory was more or less forced.
Has anyone ever shown what happens when you take quantum field theory in the limit as h -> 0?

An implicit assumption in experimental design, and its subsequent success, ensured the survival of deterministic reductionism for a very long time. However, the same reasons for which it became dogma are now a serious problem for it.
IOW... we think that way because the experiments work when we do? And you're saying this is not scientific?

But here's the odd part: you are talking about the "impossibility" of actual measurements like the ones repeatedly found which I referenced, yet you have no problem accepting the results from experiments in which hypotheses about physical reality, properties, and processes are "tested" without actually measuring them at all?
I didn't say it was impossible. I said it was unbelievable that your experiment was so precise as to determine the presence of a non-local quantity in a space so fantastically small. I bring up the small aspect specifically because to demonstrate a non-local phenomenon, you'd have to show that information was communicated faster than any carrier medium, and in this case the carrier is electromagnetism, which travels phenomenally fast on this scale. How do you know you're not looking at an illusory instance, instead of the real thing?
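To put a rough number on "phenomenally fast on this scale," here's a throwaway back-of-the-envelope sketch. The distance (one micrometer) is my own illustrative figure, not taken from the experiment under discussion:

```python
# Back-of-the-envelope: time for an electromagnetic signal to cross a
# microscopic region. Illustrative numbers only.
C = 299_792_458.0  # speed of light in vacuum, m/s

def crossing_time(distance_m: float) -> float:
    """Seconds for light to traverse the given distance."""
    return distance_m / C

# One micrometer -- roughly the scale of small cellular structures:
print(f"{crossing_time(1e-6):.2e} s")  # on the order of femtoseconds
```

Timing resolution would have to be far finer than that crossing time to rule out an electromagnetic carrier directly, which is the point of the worry above.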
 

PolyHedral

Superabacus Mystic
Wikipedia. "Ask not if quantum mechanics is true, ask rather what the theory implies" (the opening line of Simon Saunders' contribution to the edited volume Many Worlds? Everett, Quantum Theory, and Reality; Oxford University Press, 2010)
How rational of Saunders. Specifically, I believe he is alluding to the fact that "right" is a free-floating belief that, in itself, does not imply anything about the observable data we will find. What is "right" is not a scientific question; what is a scientific question, and one that quantum mechanics can answer, is "What will this experiment do?"

No, I'm not. Nor do I think it is.
All physical modelling I've ever heard of was mathematical and reductionist.

1) You have fundamentally confused what I stated. There is a difference between saying that at the moment there is not any agreement about the precise relationship between classical and quantum mechanics, and that our understanding of physics may be incomplete, and saying "science is flawed".
The precise relationship between classical mechanics, quantum mechanics and reality is irrelevant as alluded to earlier - the relevant part is the predictions made, and those have been precise and relatively accurate where they're capable of being made.

4) If "explanatory power" is such an issue for you, how do you resolve the measurement problem?
What is there to resolve?

And if the "standard model" as described by wikipedia is really all that standard, why isn't this reflected in the literature?
Do you mean modern literature, or the entire literature? Perhaps it's so well established that there is no discussion. After all, the entire theory has been validated by experiment. Every single one of the 21 particles has been found to have the predicted properties. (Although it is not a complete theory of everything, but neither is anything else we've found yet.)

The reason they aren't scrambling is because "they" don't exist as a collective. We have a standard model that most physicists don't believe to be accurate, but nothing to replace it with.
Compare the idea of non-computable biology to that of FTL particles. They both imply, quite directly, that classically accepted physics is wrong - yet the latter was all over the pop science feeds, and the former has not been mentioned even by the graduate-level physicists I've watched.

They aren't. Apart from anything else, the number of physicists who have dropped it and who still don't agree tells us that.
But the only EPR paradox I've seen mentioned is the absurdity of FTL entanglement. In a non-Copenhagen interpretation of quantum mechanics, the entire idea is nonsensical. In MWI, there is no transmission at all, because the information is stored in what is effectively a non-local hidden variable.

I'm surprised someone so sure about the results of physics, which consists of interpretations of measurements of things never measured at all
Please cite a concrete example.

(and, by the way, if we want to just say that the formalism doesn't correspond to physical reality, then we can say nothing about the existence of quantum reality or much about reality at all).
The formalism's structural correspondence to reality is irrelevant - but the answers it produces must, and mostly do, correspond to reality.

Done. Didn't work. Why? Because so far as we can tell, there is no way to reduce the system to "interaction laws" and components.
That's very strange, considering it seems to work fine with other systems. (Which is, apparently, derived from the biological systems you keep mentioning.)

...
Knowing all this, applying an isolated system w/o friction, and armed with Newton's F=ma, this "1-body" problem would appear easy to model, i.e., to calculate the future motion.
In an actual 1-body problem, calculating the future motion can be done exactly. (You might not get it as an elementary function, but it can be done.)

As it turns out, this "one-body" problem has no (known) general solution method. It's the simplest type of nonlinear systems problem: a 1d oscillating system.
A pendulum doesn't really count as a 1-body problem except in the massless-rod approximation. Complex pendulums especially, since even with point-like masses there are multiple bodies moving semi-independently.
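For what it's worth, the "no general solution" point is easy to see numerically: the full pendulum equation theta'' = -sin(theta) has no elementary closed form, but it integrates easily, and the small-angle solution theta0*cos(t) only tracks it for small amplitudes. A pure-Python toy of my own (units with g/L = 1):

```python
import math

# Frictionless pendulum theta'' = -sin(theta). The small-angle
# linearization theta'' = -theta gives theta(t) = theta0*cos(t).
# Classic RK4 integration of the full nonlinear equation shows the
# linear "solution" fails badly once the amplitude is large, because
# the true period lengthens with amplitude.

def pendulum(theta0: float, t_end: float, dt: float = 1e-3) -> float:
    """Angle at t_end for a pendulum released from rest at theta0."""
    def f(th, om):
        return om, -math.sin(th)
    theta, omega, t = theta0, 0.0, 0.0
    while t < t_end - 1e-12:
        k1 = f(theta, omega)
        k2 = f(theta + 0.5*dt*k1[0], omega + 0.5*dt*k1[1])
        k3 = f(theta + 0.5*dt*k2[0], omega + 0.5*dt*k2[1])
        k4 = f(theta + dt*k3[0], omega + dt*k3[1])
        theta += dt * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0]) / 6
        omega += dt * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1]) / 6
        t += dt
    return theta

for theta0 in (0.1, 2.0):
    print(theta0, pendulum(theta0, 6.0), theta0 * math.cos(6.0))
```

At theta0 = 0.1 the numeric and small-angle columns agree closely; at theta0 = 2.0 they diverge by a large margin, which is the nonlinearity doing its work.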
Thus, for example, "the energy surface of a crystal, that is, its energy as a function of its conformation, is nondegenerate and has a single minimum ...We can, in principle, describe the conformation of a system by giving the coordinates of all atoms. The energy hypersurface then is a function of all of these coordinates"
This is the sort of thing I've been trying to say all along.

However, as complex as things like spin glass dynamics can be, they are nothing compared to biomolecules. Biomolecules "are complex many-body systems. Their size indicates that they lie at the border between classical and quantum systems.
There is, AFAIK, no such border. QM should in principle describe macroscopic physics.

The reductionist approach (including the extended "n-body" of the early 20th century), which succeeded in the case of some nonbiological chemical reactions, stands in stark contrast with biomolecules, as "biomolecules provide a complex but highly organized environment that can affect the course of the reaction" (p. 125), rendering the reduction of the reaction process to its constituents ineffective.
Obviously, to model a reaction in a non-inert environment, you've got to have the environment be part of your model as well.

Additionally, while the complex, nonlinear dynamics of molecules present more than their share of challenges, they are qualitatively different from the complexity inherent in biomolecular processes, structures, and functioning.
How? And how do you show that?
 

LegionOnomaMoi

Veteran Member
Premium Member
Has anyone ever shown what happens when you take quantum field theory in the limit as h -> 0?
Perfect example (link): "It is generally believed that classical mechanics is the contraction of quantum mechanics in some appropriate limit of vanishing h. Thus in principle every classical observable...is the contraction of some quantum observable. However, quantum observables are generally constructed by the quantization of classical observables...Obviously this introduces circularity when one invokes the correspondence principle. This is unsatisfactory if quantum mechanics were to be internally coherent and autonomous from classical mechanics." This study also gives an intro to the quantum-classical correspondence problem and quasiclassical systems, but see here for more detail
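To be clear about what the "easy" half of the h -> 0 story looks like (and what it doesn't settle): in the standard textbook picture, harmonic oscillator levels E_n = hbar*omega*(n + 1/2) crowd into a continuum as hbar shrinks. A toy sketch of my own; note that this kind of numeric demonstration leaves the circularity the quoted study describes completely untouched:

```python
# Quantum harmonic oscillator: E_n = hbar*omega*(n + 1/2), so adjacent
# levels are separated by hbar*omega. At a fixed macroscopic energy the
# spectrum looks continuous as hbar -> 0. (Heuristic illustration only;
# it says nothing about the conceptual circularity of quantization.)

def relative_spacing(hbar: float, omega: float, energy: float) -> float:
    """Level spacing hbar*omega as a fraction of a fixed total energy."""
    return (hbar * omega) / energy

for hbar in (1.0, 1e-3, 1e-6):
    print(hbar, relative_spacing(hbar, omega=1.0, energy=1.0))
```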


How rational of Saunders. Specifically, I believe he is alluding to the fact that "right" is a free-floating belief that, in itself, does not imply anything about the observable data we will find.

No. To give more context:
"Ask not if quantum mechanics is true, ask rather what the theory implies. What does realism about the quantum state imply? What follows then, when quantum theory is applied without restriction, if need be to the whole universe?

This is the question that this book addresses. The answers vary widely. According to one view, ‘what follows’ is a detailed and realistic picture of reality that provides a unified description of micro- and macroworlds. But according to another, the result is nonsense—there is no physically meaningful theory at all, or not in the sense of a realist theory, a theory supposed to give an intelligible picture of a reality existing independently of our thoughts and beliefs. According to the latter view, the formalism of quantum mechanics, if applied unrestrictedly, is at best a fragment of such a theory, in need of substantive additional assumptions and equations.

So sharp a division about what appears to be a reasonably well-defined question is all the more striking given how much agreement there is otherwise, for all parties to the debate in this book are agreed on realism, and on the need, or the aspiration, for a theory that unites micro- and macroworlds, at least in principle."
The precise relationship between classical mechanics, quantum mechanics and reality is irrelevant as alluded to earlier - the relevant part is the predictions made, and those have been precise and relatively accurate where they're capable of being made.


So the QCC problem is trivial? Or perhaps you mean trivial for something like the brain? In which case you might want to tell the researchers who published the study Quantum-classical correspondence in the brain: scaling, action distances and predictability behind neural signals

Or perhaps this study:
"The boundary between quantum theory and classical physics is still largely unknown. Quantum theory obviously applies on length scales smaller than atomic radii but beyond that it is not entirely clear where it should be superseded by Newtonian mechanics. Here we argue that recent criticisms of the use of quantum mechanics in biology are not very convincing since they ignore the already existing evidence for quantum effects in biological systems. In this paper we investigate three particular systems of special importance: the cell membrane, microtubules (MTs), and ion channels which are some of the most important parts of a neuron in the human brain. We argue that these subsystems are the best candidates for possible sites of quantum effects." (Journal of Physics: Conference Series, 2011)
...What?
The sentence you quoted appears to either misunderstand how inferential science works, or is suggesting that physics is not science.
It's another way of saying this: "In classical physics, the notion of the “state” of a physical system is quite intuitive...there exists a one-to-one correspondence between the physical properties of the object (and thus the entities of the physical world) and their formal and mathematical representation in the theory...With the advent of quantum theory in the early twentieth century, this straightforward bijectivism between the physical world and its mathematical representation in the theory came to a sudden end. Instead of describing the state of a physical system by means of intuitive symbols that corresponded directly to the “objectively existing” physical properties of our experience, in quantum mechanics we have at our disposal only an abstract quantum state that is defined as a vector (or, more generally, as a ray) in a similarly abstract Hilbert vector space.
The conceptual leap associated with this abstraction is hard to overestimate. In fact, the discussions regarding the 'interpretation of quantum mechanics' that have occupied countless physicists and philosophers since the early years of quantum theory are to a large part rooted precisely in the question of how to relate the abstract quantum state to the 'physical reality out there.' (pp. 14-15)
from Schlosshauer's Decoherence and the Quantum-to-Classical Transition (from Springer's monograph series The Frontiers Collection; 2007):
All physical modelling I've ever heard of was mathematical and reductionist.
Actually, QM is arguably not reductionist at all, or at least it is fundamentally irreducible for all intents and purposes: "Unlike those of classical statistical physics, quantum objects and processes become irreducibly inaccessible to all our knowledge and conception, and hence are beyond any possibility of explaining the physical nature of the processes, beyond any possible specific ontology, apart from the fact they exist, which is all we can say about them." from Plotnitsky's "Prediction and Repetition in Quantum Mechanics: The EPR Experiment and Quantum Probability" (AIP Conf. Proc. 889).

It's hard to claim that QM is reductionist, when the mathematical models have no agreed-upon relation to reality. It's akin to claiming conscious experience is reductionist because I can reduce it to Psi. What is Psi? It's conscious experience. Reduction complete.

In fact, the problem with the classical and quantum divide and its relation to reductionism is not some new issue:
"The question can be simply stated as 'What is the relation between classical and quantum mechanics?' The simplicity of the question, however, belies the complexity of the answer. Classical mechanics and quantum mechanics are two of the most successful scientific theories ever developed, and yet how these two very different theories can successfully describe one and the same world – the world we live in – is far from clear. One theory is deterministic, the other indeterministic; one theory describes a world in which chaotic behavior is pervasive, and the other a world in which it is almost entirely absent. Did quantum mechanics simply replace classical mechanics as the new universal theory? Do they each describe their own distinct domains of phenomena? Or is one theory really just a continuation of the other?
In the philosophy literature, this sort of issue is known as the problem of intertheoretic relations. Currently, there are two accepted philosophical frameworks for thinking about intertheoretic relations: the first is reductionism, and the second, pluralism. As we shall see, these labels each actually describe a family of related views." from the introduction of Bokulich's Reexamining the Quantum-Classical Relation: Beyond Reductionism and Pluralism (Cambridge University Press, 2008).
 

LegionOnomaMoi

Veteran Member
Premium Member
What is there to resolve?

Apart from the QCC issue (see the links in my post above):"the problem that quantum mechanics faces — the 'measurement problem' — is that it sometimes assigns the wrong state to some systems. (As we shall see, the name 'measurement problem' is misleading, because it suggests that the problem occurs only when one makes a measurement, whereas the problem is, in fact, generic.)" (from W. M. Dickson's Quantum Chance and Non-locality: Probability and Non-locality in the Interpretations of Quantum Mechanics).
In a cell involving no unstable nuclei? Why?
Let's go back a bit:
Assuming that there is no radioactive atoms involved, all of a cell's workings are essentially an n-body problem in QED. (The two interatomic forces are too small to notice, and gravity is too large to notice.)

"Chemical and biological systems are often large and have a complex structure so that the investigation of their dynamical properties poses challenges for theory and simulation. A complete description of their dynamics must be based on quantum-mechanical time evolution equations" Kapral, R. (2006). Progress in the theory of mixed quantum-classical dynamics. Annu. Rev. Phys. Chem., 57, 129-157.

It's already been awarded. In the free online book I linked to is a paper "Quantum Transition State for Peptide Bond Formation in the Ribosome". The use of quantum crystallography to understand ribosome structure and function was the reason that one of the authors was awarded the Nobel prize.
Call me crazy, but if they are giving out Nobel prizes to scientists who use quantum mechanics to understand fundamentals of biology, then even if one knew next to nothing about physics, the idea that quantum processes are at play here seems a likely conclusion.



Do you mean modern literature, or the entire literature?

There is discussion. A great deal of it. In two sentences: "This formulation of the measurement process is known as the collapse of the wavefunction, and it is a clearly non-unitary process which formally completes the Copenhagen interpretation. The problem with this description is that [it] does not solve anything." (from p. 12 of Jasper van Wezel's Quantum Mechanics and the Big World; Leiden University Press, 2007). It's the reason why books like Quantum Enigma (Oxford University Press, 2006) can make incredibly outlandish claims: "This is a controversial book. But nothing we say about quantum mechanics is controversial. The experimental results we report and our explanation of them with quantum theory are completely undisputed. It is the mystery these results imply beyond physics that is hotly disputed. For many physicists, this mystery, the quantum enigma, is best not talked about. It displays physics’ encounter with consciousness. It’s the skeleton in our closet."
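The "clearly non-unitary" part of the van Wezel quote can be checked with two lines of arithmetic: unitary Schrödinger evolution preserves the norm of a state, while projection onto a measured outcome does not. A minimal two-state sketch of my own, in pure Python:

```python
import math

# Collapse as projection vs. unitary evolution, on a 2-state system.
def norm(state):
    """Euclidean norm of a state vector."""
    return math.sqrt(sum(abs(a) ** 2 for a in state))

# Superposition state (|0> + |1>)/sqrt(2):
psi = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# "Collapse" onto outcome |0>: apply the projector P = |0><0|.
collapsed = [psi[0], 0.0]

# A unitary map (here a real rotation) preserves the norm:
theta = 0.7
rotated = [math.cos(theta) * psi[0] - math.sin(theta) * psi[1],
           math.sin(theta) * psi[0] + math.cos(theta) * psi[1]]

# psi and rotated have norm 1; collapsed has norm 1/sqrt(2), so the
# projection cannot be unitary. That is the formal tension between
# collapse and Schrodinger dynamics.
print(norm(psi), norm(rotated), norm(collapsed))
```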


Please cite a concrete example.
The double-slit experiment and all variants. See the response below.
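For what it's worth, the arithmetic that makes the double-slit the standard example is tiny: amplitudes for the two paths are added and then squared, producing a cross term; when which-path information is available, the probabilities add instead and the cross term vanishes. A toy sketch with made-up phases (not a real-geometry model):

```python
import cmath, math

# Two equal-amplitude paths with a relative phase between them.
def with_interference(phase: float) -> float:
    """|a1 + a2|^2: amplitudes add first, then get squared."""
    a1 = 1 / math.sqrt(2)
    a2 = cmath.exp(1j * phase) / math.sqrt(2)
    return abs(a1 + a2) ** 2

def without_interference() -> float:
    """|a1|^2 + |a2|^2: which-path information kills the cross term."""
    return 0.5 + 0.5

print(with_interference(0.0))      # constructive, ~2.0
print(with_interference(math.pi))  # destructive, ~0.0
print(without_interference())      # always 1.0
```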

After all, the entire theory has been validated by experiment

Do you know what these experiments involve?
"Quantum theory is a procedure by which scientists predict probabilities that measurements of specified kinds will yield results of specified kinds in situations of specified kinds. It is applied in circumstances that are described by saying that a certain physical system is first prepared in a specified manner and is later examined in a specified manner. And this examination, called a measurement, is moreover such that it can yield, or not yield, various possible specified results...
First [the experimental physicist] transforms his information about the preparation of the system into an initial wave function. Then he applies to it some linear transformation, calculated perhaps from the Schrödinger equation, which converts the initial wave function into a final wave function. This final wave function, which is built on the degrees of freedom of the measured system, is then folded into the wave function corresponding to a possible result. This gives the transition amplitude, which is multiplied by its complex conjugate to give the predicted transition probability...
The above account describes how quantum theory is used in practice. The essential points are that attention is focused on some system that is first prepared in a specified manner and later examined in a specified manner. Quantum theory is a procedure for calculating the predicted probability that the specified type of examination will yield some specified result.
The wave functions used in these calculations are functions of a set of variables characteristic of the prepared and measured systems. These systems are often microscopic and not directly observable. No wave functions of the preparing and measuring devices enter into the calculation. These devices are described operationally. They are described in terms of things that can be recognized and/or acted upon by technicians. These descriptions refer to the macroscopic properties of the preparing and measuring devices.
The crucial question is: How does one determine the transformations A → ΨA and B → ΨB? These transformations transcribe procedural descriptions of the manner in which technicians prepare macroscopic objects, and recognize macroscopic responses, into mathematical functions built on the degrees of freedom of the (microscopic) prepared and measured systems. The problem of constructing this mapping is the famous “problem of measurement” in quantum theory." pp. 53-54 of Mind, Matter and Quantum Mechanics (Springer, 2009)
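The recipe described in that passage (prepare an initial wave function, apply a linear transformation, fold it into a possible result, square the amplitude) can be written out in a few lines for a two-state toy system. The particular rotation U below is my own arbitrary choice; the point is only the shape of the calculation:

```python
import math

# |<Psi_B| U |Psi_A>|^2 for a 2-state system with real amplitudes.
def transition_probability(psi_a, U, psi_b) -> float:
    """Evolve psi_a by the linear map U, fold into psi_b, square."""
    evolved = [U[0][0] * psi_a[0] + U[0][1] * psi_a[1],
               U[1][0] * psi_a[0] + U[1][1] * psi_a[1]]
    amplitude = psi_b[0] * evolved[0] + psi_b[1] * evolved[1]
    return amplitude ** 2

# An arbitrary unitary (rotation by pi/3) standing in for the dynamics:
theta = math.pi / 3
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

psi_a = [1.0, 0.0]  # prepared state
p_same = transition_probability(psi_a, U, [1.0, 0.0])
p_flip = transition_probability(psi_a, U, [0.0, 1.0])
print(p_same, p_flip, p_same + p_flip)  # probabilities sum to 1
```

Note that the preparing and measuring devices appear nowhere in the calculation, which is exactly the point the quoted passage is making.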


In a non-Copenhagen interpretation of quantum, the entire idea is nonsensical.
That's simply not true. Either part, actually (that the idea is nonsensical and that it is somehow different in a "non-Copenhagen" interpretation). I went into this in great detail right before:

http://www.religiousforums.com/forum/3138784-post173.html
http://www.religiousforums.com/forum/3138787-post174.html
http://www.religiousforums.com/forum/3139548-post175.html

You never responded to any of the research cited. I can add more: "Non-locality is another of the various criticisms that has been laid at the feet of the Causal Theory. The experimental tests of the various Bell Inequalities have come down ‘fair and square’ on the side of non-locality, i.e. experiments continue to confirm that the Bell Inequalities are indeed violated, as predicted by the formalism of quantum mechanics. Such criticism of the Causal Theory is completely misdirected, as Maudlin explains: 'Violations of Bell’s Inequality show that the world is non-local. It can be no criticism of a theory that it displays this feature of the world in an obvious way'" p. 72 of Riggs' Quantum Causality: Conceptual Issues in the Causal Theory of Quantum Mechanics (Springer Science, 2009)
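Concretely, the numbers behind those Bell-inequality experiments are standard textbook values: for the singlet state, quantum mechanics predicts the correlation E(a,b) = -cos(a - b) between analyzer settings, and at the usual CHSH angles that gives |S| = 2*sqrt(2), above the local-hidden-variable bound of 2. A sketch (not tied to any one experiment):

```python
import math

def E(a: float, b: float) -> float:
    """Quantum prediction: singlet-state correlation for angles a, b."""
    return -math.cos(a - b)

# Standard CHSH analyzer settings:
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# Any local hidden-variable theory obeys |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2), violating the classical bound
```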

I don't follow the logic. Of course it's not meaningful to look for "that" particle, if there are multiple identical ones?

Not multiple "identical" so much as "indistinguishable". The "identical" bit was for illustrative purposes. Nonlinear dynamics is incredibly difficult even when you can describe the "bodies" of your system. That's "distinguishable". You can't do this for numerous many-body problems.

Having looked at the premise of Scale Relativity
Having looked where?

You can't explain to me what a "functional process" is, so how do you know one when you see it?

I did explain. You didn't like the explanation because it contradicted computer science definitions of models. I would be happy to describe them again, and in greater detail, if you would point out what part of the previous explanations you did not understand.

The more advanced theories cannot be directly tested against observation, because our squishy biology does not have the equipment to observe the results.

It has nothing to do with that. In fact, we aren't exactly sure what it has to do with because there is no agreement as to whether the wavefunction is actually a probability function of finding a particle or is the superpositioned particle which has a physical reality contrary to all reality as we know it.

The measurement problem is that the experimental measurement disturbs the system being measured. What does that have to do with "determining the result?"

The fundamental difference between classical and quantum theory is that of measurement: in the former, you describe a system through measurements of variables which correspond to physical reality. In the latter, you can't. Any measurement changes the physical reality. It determines it.
 

idav

Being
Premium Member
The fundamental difference between classical and quantum theory is that of measurement: in the former, you describe a system through measurements of variables which correspond to physical reality. In the latter, you can't. Any measurement changes the physical reality. It determines it.

If we could actually measure it we would likely find a classical type of cause. Randomness is simply something too complex to calculate, but complex doesn't mean impossible to calculate, except for our own human limitations.
 

FunctionalAtheist

Hammer of Reason
Not too many of us suggest we have no free will in our everyday life.

However it does get kind of mystical when we wonder about what is happening, after a bout of idle mind, upstream from the underlying causes of our next thought.

Any thoughts anybody?
:human:

It may have been helpful if you defined exactly what you mean by free will. It seems the arguments thus far have been rather ambiguous. One thing is for sure, free will does not exist outside of the willful mind.

Will is a property of the mind, and an attribute of acts intentionally performed. Free will is the ability of agents to make choices free from certain kinds of constraints. The question of free will is an illusion. The real question is: free from what? What kinds of constraints? For most, the question is “Is reality deterministic?” For the compatibilist school of thought, though, determinism is compatible with the idea of free will, and constraints such as physical, emotional, and social ones are more relevant. So the argument should be: 1) Do we agree that we are talking about determinism? and 2) Is reality deterministic?

Since the argument moves around whichever definition of the problem we like to choose, I state the problem this way: “Do I choose my actions?”

Of course I do. Even in a deterministic world, I am the cause of my choice. Even if over-analysis forces me to view a macro problem at a quantum level, the mechanics are no more than a reductionist view of all that which makes “I”! Even if the deterministic, quantum view insists that choice is an illusion, that there is really only one option, it is still “I” that, through my previous nature and nurture, even if at the quantum level, have eliminated the other options, at least in part. Even if (not that I believe it) every single decision in my entire life has been predetermined, it is still the sum total of “I,” my atoms, my neurons, the mechanics of my subatomic particles, that makes each one of these decisions. I choose my actions!

Since it is all a question of defining the problem anyway, I'm allowed to see it any way I wish (with my free will).
 

PolyHedral

Superabacus Mystic
If we could actually measure it we would likely find a classical type cause. Random is simply something too complex to calculate but complex doesn't mean impossible to calculate except for our own human limitations.
Quantum mechanics is, genuinely, unmeasurable. However, this is not connected with what Legion is saying. (Which is more connected to quantum theory's relation to reality.)
 

idav

Being
Premium Member
Quantum mechanics is, genuinely, unmeasurable. However, this is not connected with what Legion is saying. (Which is more connected to quantum theory's relation to reality.)
Because the brain is quantum it gets too complex and magically becomes immeasurable? :shrug:

Breaking it down, we are talking about atoms and chemicals and cells, all reducible.
 

LegionOnomaMoi

Veteran Member
Premium Member
If we could actually measure it we would likely find a classical type cause.

We can, and do, measure quantum systems. The problem is that this makes them classical. As soon as you perform any measurement, you change it. Additionally, the issue isn't one of cause, but is ontological. Any quantum experiment involves quantum-level activity. We can "set up" the experiment, and describe the end result, and do this all with mathematical precision, but I can only do it if I don't actually allow my mathematical descriptions to correspond to any physical reality. If I'm working with crash test dummies and measuring what happens to them when a certain car hits a brick wall at a certain speed, then all my calculations either directly correspond to physical reality (i.e., the size of the vehicle, its mass, its structure, etc.) or describe a characteristic of its dynamics (its velocity, angle, etc.). In other words, whatever equation(s) I use to model what happens to the car, the crash test dummies, and the wall, these equations contain variables which correspond to actual, observable things I measured (mass, velocity, etc.).

That's classical mechanics. In quantum mechanics, it's a bit like doing all of the above, only you never saw the car, the dummies, or how fast the car was going and the angle at which it hit the brick wall. The difference is that in classical mechanics, if you looked at the wreck after the test, you could do some sophisticated mathematics after investigating the remains, but now your models are educated guesses. The more complicated the experiment (e.g., the car had a frame built mostly out of one material, but in certain key points it had another, and only some of the dummies had seat belts, two of which failed, and in the 'unobservable' part the car hit a puddle and shortly after a spike strip) and the more exact you want your "models" to describe what happened, the less likely you are going to be able to do either. And we know that.

In quantum mechanics, this is ignored (at least in practice). We have very exact specifications and measurements corresponding to the "before" and "after", but in the end we take a symbol and just say "this is the system" because as soon as we measure, we've changed everything. A wavefunction is a probability function. Under a certain interpretation, it tells us the probability that a particle will end up in a certain location. Under another, it describes a "spread out" or "smeared" nebulous superpositioned "particle" which ends up in the certain location (or not) because of the way we measured it. And we cannot, even in principle, know which (if either) it is.

But the problems do not end there. Because these quantum processes, which we can't measure, are what make up everything. So when I set up a lab experiment, all of my instruments, my fellow scientists, the room, etc., are all made up of the very things I'm trying to do experiments with. Yet I don't treat other things the way I treat my quantum systems. And they don't act according to the same principles as far as anyone can tell. In theory, I should be able to use QM alone, without any classical physics. But it isn't clear how exactly one does this. The line between quantum and classical (especially in recent years, as our computing power and measurement technologies have increased) has grown fuzzier and fuzzier.
 