
Any Defenses of Materialism?

Koldo

Outstanding Member
Prove it. Provide a non-vacuous definition.

And if you become able to use logic at some point, then state an argument by which to conclude that the thesis of physicalism is true. That was the challenge of the OP. After more than 400 posts, you still haven't met the challenge of the OP.

That is pretty much what I have been doing most of the time when talking to you. Just read my posts again.
 

Koldo

Outstanding Member
So you can't identify even a single law that accounts for the ability of an individual to choose, among the available options, what acts he will or won't perform?

I have never heard anyone say there is a specific law of physics that accounts for choices. That would be a novelty to me.
If, however, what you want to know is why one would regard the laws of physics, taken collectively, as the basis for your choices, then I would credit it to the explanatory power the natural sciences have achieved.
 

LegionOnomaMoi

Veteran Member
Premium Member
Some of your points aren't relevant, while the others only deal with reductionism.
Your pictorial/graphical representation is fundamentally reductionistic. It suggests that macroscopic phenomena (from molecular to social) can be reduced to the dynamics of fundamental/elementary particles and the laws which govern them (at least in principle). This is wrong. Even fundamental particles and their dynamics can't be reduced to these, as we have to arbitrarily decide which entities are to be taken as fundamental and which composite.
Your graphic, in which higher levels of analysis are produced by the dynamics governing lower levels, is simply and fundamentally wrong:
"At the rather basic level of life, and perhaps even in chemistry, there is no reduction: perhaps the simplest proof of this is that while the bases of DNA each obey the laws of physics, the juxtaposition of bases in the nucleotides is physically contingent, so the information content of DNA and the way it serves to encode instructions for constructing proteins is not governed merely by the laws of physics."
Simons, P. (2002). Candidate General Ontologies for Situating Quantum Field Theory. In Kuhlmann, M., Lyre, H., Wayne, A. (Eds.). Ontological Aspects of Quantum Field Theory. World Scientific.

As for determining what is "physical" in physics, this is highly non-trivial. Virtual particles are a necessary component of all fundamental physical processes, but nobody knows if or how they are supposed to exist, nor how exactly they are supposed to be "virtual" as opposed to what (presumably) are "real" particles:

"It might be objected that we can observe virtual particles, namely if we make a measurement while the interaction is taking place we will find some of the particles indicated by the Feynman diagrams...it does not follow that the particles we detect when we interrupt an interaction would have been there if we had not made the measurement, or were there just before the measurement."
Weingard, R. (1988). “Virtual Particles and Quantum Field Theory” in H. R. Brown & R. Harré (Eds.) Philosophical Foundations of Quantum Field Theory (pp. 43-58). Oxford University Press.

Modern physics has fundamentally dematerialized matter, making physical entities non-physical.

"The notion of Physical Object is Untenable”
D’Ariano, G. M. (2015). It from Qubit. In It From Bit or Bit From It? (pp. 25-35). Springer.

"In a sense every real photon is actually virtual if one look over sufficiently long time scales." p. 56
Feynman, R. P. (1998). The Theory of Fundamental Processes (Advanced Book Classics). Westview Press.
 

Koldo

Outstanding Member
Your pictorial/graphical representation is fundamentally reductionistic. It suggests that macroscopic phenomena (from molecular to social) can be reduced to the dynamics of fundamental/elementary particles and the laws which govern them (at least in principle). This is wrong. Even fundamental particles and their dynamics can't be reduced to these, as we have to arbitrarily decide which entities are to be taken as fundamental and which composite.

Not 'reduced to', but rather 'derived from'.
This is an essential distinction.

Your graphic, in which higher levels of analysis are produced by the dynamics governing lower levels, is simply and fundamentally wrong:
"At the rather basic level of life, and perhaps even in chemistry, there is no reduction: perhaps the simplest proof of this is that while the bases of DNA each obey the laws of physics, the juxtaposition of bases in the nucleotides is physically contingent, so the information content of DNA and the way it serves to encode instructions for constructing proteins is not governed merely by the laws of physics."
Simons, P. (2002). Candidate General Ontologies for Situating Quantum Field Theory. In Kuhlmann, M., Lyre, H., Wayne, A. (Eds.). Ontological Aspects of Quantum Field Theory. World Scientific.

You will have to elaborate on this quote.
What do you propose to be responsible for this contingency?


As for determining what is "physical" in physics, this is highly non-trivial. Virtual particles are a necessary component of all fundamental physical processes, but nobody knows if or how they are supposed to exist, nor how exactly they are supposed to be "virtual" as opposed to what (presumably) are "real" particles:

"It might be objected that we can observe virtual particles, namely if we make a measurement while the interaction is taking place we will find some of the particles indicated by the Feynman diagrams...it does not follow that the particles we detect when we interrupt an interaction would have been there if we had not made the measurement, or were there just before the measurement."
Weingard, R. (1988). “Virtual Particles and Quantum Field Theory” in H. R. Brown & R. Harré (Eds.) Philosophical Foundations of Quantum Field Theory (pp. 43-58). Oxford University Press.

I have never said it is trivial. Or, at least, I hope I haven't since that would be definitely wrong on my part. It is well-known that defining the boundaries of what we are to regard as 'physical' in Physicalism is troublesome, to say the least.
This is, however, not the same as saying that the notion of 'physical' is vacuous.

Modern physics has fundamentally dematerialized matter, making physical entities non-physical.

"The notion of Physical Object is Untenable”
D’Ariano, G. M. (2015). It from Qubit. In It From Bit or Bit From It? (pp. 25-35). Springer.

"In a sense every real photon is actually virtual if one look over sufficiently long time scales." p. 56
Feynman, R. P. (1998). The Theory of Fundamental Processes (Advanced Book Classics). Westview Press.

Physicalism forgoes the notion of 'matter' as being primary anyway.
 

LegionOnomaMoi

Veteran Member
Premium Member
Not 'reduced to', but rather 'derived from'.
This is an essential distinction.
Ok, then we'll go with "derived from". Molecular structures are not derived from atomic physics (which is itself not derived from elementary particle physics, but defined in terms of elementary particles), and we can't even understand cellular dynamics in terms of molecular physics, let alone somehow "derive" any science of living systems from our understanding of molecules. Hell, we have to assume that gases and liquids and other systems can, in principle, be understood in terms of the dynamics of their constituent elements (i.e., that gases and so forth are made up of molecules and atoms that obey the laws of physics governing the dynamics of atoms and molecules), because even in the Lagrangian or Hamiltonian formulations of classical (analytical) dynamics the equations for tiny amounts of gases require integrals in such high-dimensional spaces as to be forever unsolvable even by approximation methods. We rely instead on statistical descriptions that we assume are descriptions of averages of the behaviors of the molecules making up gases and similar systems, and we know that in general these assumptions are wrong but often approximately correct.
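To give a rough sense of the scale problem, here is a toy sketch of my own (not taken from any source cited here): even a nanomole of gas corresponds to hundreds of trillions of coupled phase-space coordinates, whereas a statistical description needs only a few thousand sampled velocities to recover the macroscopic temperature, which is itself defined only as an average.

```python
# Minimal sketch (not from the cited sources): why we settle for statistical
# descriptions of gases. Integrating the exact equations of motion for even a
# nanomole of argon would mean ~3.6e15 coupled phase-space coordinates; sampling
# a few thousand molecular velocities already pins down the macroscopic
# temperature, which is only defined as an average in the first place.
import numpy as np

k_B = 1.380649e-23          # Boltzmann constant, J/K
m_argon = 6.63e-26          # mass of an argon atom, kg
T_true = 300.0              # "true" temperature of the ensemble, K

n_molecules_nanomole = 6.022e14
print(f"Phase-space coordinates for a nanomole: {6 * n_molecules_nanomole:.2e}")

# Each velocity component of an ideal-gas molecule is Gaussian with
# variance k_B * T / m (Maxwell-Boltzmann distribution).
rng = np.random.default_rng(0)
sigma = np.sqrt(k_B * T_true / m_argon)
v = rng.normal(0.0, sigma, size=(5000, 3))   # 5000 sampled molecules

# Recover temperature from the mean kinetic energy: <KE> = (3/2) k_B T
mean_ke = 0.5 * m_argon * np.mean(np.sum(v**2, axis=1))
T_est = 2.0 * mean_ke / (3.0 * k_B)
print(f"Temperature recovered from 5000 samples: {T_est:.1f} K")
```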
Indeed, it is in condensed matter physics, continuum mechanics, and similar levels of analysis that we find physicists and research in physics reflecting the autonomy (i.e., non-derivability) of the levels of analysis that your chart assumes to be non-autonomous (i.e., it assumes higher levels are derived from lower). Even at the level of solid state physics, condensed matter physics, and other areas of physics that require the use of quantum theory (sometimes beyond quantum mechanics) we find the emergence of structure, dynamics, and phenomena that cannot be derived from atomic or elementary particle physics. Certain collective, irreducible states begin to emerge at very low-level analyses.
See e.g.,
Falkenburg, B., & Morrison, M. (Eds.). (2015). Why more is different: Philosophical issues in condensed matter physics and complex systems. Springer.
Chibbaro, S., Rondoni, L., & Vulpiani, A. (2014). Reductionism, emergence and levels of reality. Springer.
And Anderson's groundbreaking study on the autonomy of different levels of analysis and the non-derivability you assume: More is Different
Although Anderson doesn't go far enough, you will find that the ability to derive higher levels from lower is a stronger condition than reductionism.

You will have to elaborate on this quote.
I've uploaded a scan of the paper from the volume it was published in.
What do you propose to be responsible for this contingency?
Higher level structures can at best be in principle reduced to accord with the laws governing their constituent (lower level) parts. But they cannot be constructed from these laws. For example, in granular media the initial configuration state of a system that undergoes a transition, such as the formation of crystalline structure or that of a sandpile, does not violate any known laws of physics but the final structure cannot be "derived" from the initial. It is contingent upon the initial state and the collective (emergent) dynamics that are produced by forces acting on a system too interconnected to be understood solely in terms of its constituents.
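As a toy illustration of this kind of contingency (my own sketch, not drawn from the cited literature), consider the Bak-Tang-Wiesenfeld sandpile: every site obeys the same simple local toppling rule, yet the final stable configuration depends entirely on the history of where the grains happened to fall.

```python
# Toy sketch of the Bak-Tang-Wiesenfeld sandpile (my own illustration):
# every site obeys one local rule -- topple when it holds 4 or more grains,
# sending one grain to each neighbour -- yet the final stable configuration
# is contingent on the history of where grains were dropped.
import numpy as np

def relax(grid):
    """Topple until every site holds fewer than 4 grains."""
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return grid
        for r, c in unstable:
            grid[r, c] -= 4
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < grid.shape[0] and 0 <= cc < grid.shape[1]:
                    grid[rr, cc] += 1   # grains falling off the edge are lost

def build_pile(seed, size=11, n_grains=2000):
    """Drop grains one at a time at random sites, relaxing after each drop."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)
    for _ in range(n_grains):
        r, c = rng.integers(0, size, 2)
        grid[r, c] += 1
        relax(grid)
    return grid

pile_a = build_pile(seed=1)
pile_b = build_pile(seed=2)
# Same local law, same number of grains, different histories:
print("Configurations identical?", np.array_equal(pile_a, pile_b))  # False (virtually certain)
print("Sites that differ:", int(np.sum(pile_a != pile_b)))
```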
The emergence of such collective structures that are irreducible and non-derivable from the dynamics of lower levels takes on a qualitative difference for living systems, even for intracellular structures such as DNA. For living systems, functional emergence (which is non-physical) becomes necessary. Consider a model, simulation, or similar "realization" of a cell and the process of metabolism-repair, and let f: A→B be a function
"where f is the process that takes input A and output B...The system Rosen uses for an example is the Metabolism-Repair or [M,R] system. The process, f, in this case stands for the entire metabolism goin on in an organism...The transition, f, which is being called metabolism, is a mapping taking some set of metabolites, A, into some set of products, B. What are the members of A? Really everything in the organism has to be included in A, and there has to be an implicit agreement that at least some of the members of A can enter the organism from its environment. What are the members of B? Many, if not all, of the memebers of A since the transitions in the reduced system are all strung together in the many intricate patterns or networks that make up the organism's metabolism. It also must be true that some members of B leave the organism as products of metabolism...In the context developed so far, the mapping, f, has a very special nature. It is a functional component of the system we are developing. A functional component has many interesting attributes. First of all, it exists independent of the material parts that make it possible. Reductionism has taught us that every thing in a real system can be expressed as a collection of material parts. This is not so in the case of functional components...Fragmentability is the aspect of systems that can be reduced to their material parts leaving recognizable material entities as the result. A system is not fragmentable is reducing it to its parts destroys something essential about that system. Since the crux of understanding a complex system had to do with identifying the context dependent functional components, they are by definition, not fragmentable". (emphasis added; italics in original)
Mikulecky, D. C. (2005). The Circle That Never Ends: Can Complexity be Made Simple? In D. Bonchev & D. H. Rouvray (Eds.). Complexity in Chemistry, Biology, and Ecology (Mathematical and Computational Chemistry). Springer.
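To make the notation above slightly more concrete, here is a deliberately schematic sketch (my own gloss, not code from Mikulecky or Rosen): metabolism is treated as a mapping f: A→B, and repair as a higher-order mapping that takes metabolic products and returns a new metabolism map. The names and "rules" below are hypothetical; the only point being exhibited is that the functional component is specified independently of any material substrate.

```python
# Schematic gloss (my own, not from Mikulecky or Rosen) on the [M,R] notation:
# metabolism is a map f: A -> B, and repair is a higher-order map that takes
# metabolic products and returns a (possibly rebuilt) metabolism map. The code
# exhibits only the *functional* organization -- nothing here fixes the
# material parts that would realize f in an actual cell.
from typing import Callable, FrozenSet

A = FrozenSet[str]   # metabolites (including material taken in from outside)
B = FrozenSet[str]   # products of metabolism

def make_metabolism(rules: dict[str, str]) -> Callable[[A], B]:
    """Return a metabolism map f: A -> B defined only by its input/output behaviour."""
    def f(metabolites: A) -> B:
        return frozenset(rules.get(m, m) for m in metabolites)
    return f

def repair(products: B) -> Callable[[A], B]:
    """Repair map: from products, reconstruct a metabolism map (B -> H(A, B))."""
    # Hypothetical convention: any product tagged "x->y" re-establishes the
    # transformation it catalyses.
    rules = {p.split("->")[0]: p.split("->")[1] for p in products if "->" in p}
    return make_metabolism(rules)

# A toy run: the identity of the functional component f lies in the mapping,
# not in any particular material substrate that happens to realize it.
f = make_metabolism({"glucose": "pyruvate"})
products = f(frozenset({"glucose", "glucose->pyruvate"}))  # the "enzyme" tag passes through
f_rebuilt = repair(products)
print(f_rebuilt(frozenset({"glucose"})))  # frozenset({'pyruvate'})
```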



"systems biology is concerned with the relationship between molecules and cells; it treats cells as organized, or organizing, molecular systems having both molecular and cellular properties. It is concerned with how life or the functional properties thereof that are not yet in the molecules, emerge from the particular organization of and interactions between its molecular processes. It uses models to describe particular cells and generalizes over various cell types and organisms to arrive at new theories of cells as molecular systems. It is concerned with explaining and predicting cellular behaviour on the basis of molecular behaviour. It refers to function in ways that would not be permitted in physics. It addresses an essential minimum complexity exceeding that of any physical chemical system understood until now. It shies away from reduction of the system under study to a collection of elementary particles. Indeed, it seems to violate many of the philosophical foundations of physics, often in ways unprecedented even by modern physics." (emphases added)
Boogerd, F., Bruggeman, F. J., Hofmeyr, J. H. S., & Westerhoff, H. V. (Eds.). (2007). Systems biology: philosophical foundations. Elsevier.

Physicalism forgoes the notion of 'matter' as being primary anyway.
So what is the ontological nature, according to physicalism, of the "virtual" processes and particles that are involved in fundamental interactions, either before or after renormalization?
 

Attachments

  • Candidate General Ontologies for Situating Quantum Field Theory.pdf
    850.8 KB · Views: 137

Yerda

Veteran Member
So, accordingly, the thesis that everything that exists is physical would mean that everything that exists "pertains to the physical sciences, especially physics". Is that right? Is that what you believe--that everything that exists "pertains to the physical sciences, especially physics"?
As I said, it's from dictionary.com. It's usually good practice to start simple.

For the record, I don't know. Everything that exists might be the kind of thing you can study in physics or depend on the kind of thing you can study in physics but that seems like a matter that we could solve by finding clear examples of something that isn't. I'm open to both possibilities or even scenarios where neither statement makes any sense. I'd say the mind is a possible candidate and I've no idea of how we could categorise the truths of mathematics in such a scheme.

This is not to say that adults with an interest in metaphysics can't have a sensible discussion on the ways of making a defence of materialism without getting our y-fronts all in a tangle.

Nous said:
Does volition or free will--the ability of an individual to choose among available options--pertain to the physical sciences, especially physics? If so, in what way? Can you provide any evidence by which to draw such a conclusion?
A physicalist account could maybe proceed as follows:

Free will comes from somewhere. We can all agree that we are/have bodies. We can all agree that nothing without a body has ever been seen to have had free will. It is not a great leap to suppose that free will comes with being/having the particular kinds of body we are/have.

Whether or not there is evidence for this depends on the standard of evidence you are willing to accept. I'm willing to agree it is not a given that free will is a result of the kind of things we can discuss in relation to physics, but all the same it isn't beyond the realms of possibility. Unless, perhaps, you think you know a reason that it is logically or physically impossible.

Nous said:
Yes, I will not be making an argument that either materialism or physicalism is true. I've done enough reading on the topic to know that no philosopher has made such an argument. We already know that the classical definition of materialism is false, and physicalism is invariably gored on one of the two horns of Hempel's dilemma.
What would you say is the most relevant reading on the matter?

Nous said:
So, you should definitely expect nothing but rocks from me. I assure you I am well armed.
That hardly seems like the behaviour required to engage with others in good faith.
 

LegionOnomaMoi

Veteran Member
Premium Member
What do you propose to be responsible for this contingency?

The more I’ve thought about my previous answer to this, the more dissatisfied I’ve become. There is a more elegant (and in general more accurate, or at least more fundamental) answer. The contingency is due to the development and formulation of physical laws. By “development” I refer to the historical manner in which these laws came to be, and by “formulation” I mean the manner of their exposition in modern physics (including the modern way classical laws of physics, now known to be incorrect, are expressed).

First, physical laws (and the idea that such things existed) were developed by considering physical systems (whether celestial bodies or rocks dropped from ship masts) in an idealized manner. In particular, it was assumed that we could sufficiently isolate systems such that we could study their dynamics while treating our role as observers as negligible and our role as designers (in the case of most experiments, in which systems are set up by humans rather than simply “observed” as celestial bodies were) as negligible too. Generalizations from the results of such studies of “isolated” systems were considered to hold true universally (or at least far more broadly).

Second, the mechanical nature of the instruments and experimental designs used to codify laws and the ontological status of the objects of their descriptions were based fundamentally on the development and examination of mechanical devices:
“If we look back at the history of physics, with a view to understanding why it could develop as rapidly as it did, we find two relevant sets of circumstances. First, there was the patiently accumulated data on astronomy, collected and tabulated for millennia. Second, an apparently unrelated circumstance: one could experiment directly with simply mechanical contrivances, with inclined planes, pulleys, and springs, and express the resulting data in the form of simple mathematical rules, revealed by simple laboratory experiments with simple bulk systems, held good throughout all nature, at every level, from the greatest to the very smallest…
This tacit belief in the unlimited uniformity of mechanical behavior, and the corresponding universality of mechanical laws, provided the absolutely essential nutrient that permitted theoretical physics to develop as it did...
We now know, after three centuries, that this assumption of uniformity was entirely, hopelessly false.”
Rosen, R. (1991). Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life (Complexity in Ecological Systems). Columbia University Press.

By generalizing from the observation of the dynamics of simple, mechanical systems (or those systems whose behavior was akin to simple, mechanical systems) considered in isolation laws such as the law of conservation of energy or of momentum were derived. But long before it became clear that this generalization procedure was hopelessly flawed, these laws were often elevated to a definitional status. For example, the law of the conservation of energy only holds true in the case of isolated systems (as do most “laws” of physics). In reality, systems are never truly isolated (quantum physics taught us that), but in classical physics and with laws derived via the observation of simple, mechanical systems, it seemed possible to isolate systems. However, even for Newton and certainly for physicists in the 1800s it became clear that seemingly isolated systems were not, in fact, “isolated”. So circular reasoning via contingency was employed: the laws of physics held true for isolated systems, but for the laws to hold true it was necessary that these systems be isolated, and so systems were rightfully considered to be isolated if and only if the laws held:

“It is often said that energy and momentum are conserved for closed systems. One ordinarily thinks of this claim as merely empirical: if a system is closed, then it is an empirical fact that the total momentum and energy of that system is conserved. Thus, it is imagined that, had the empirical facts been different, a closed system could turn out to have total momentum and energy that is not conserved. But do we have a criterion for determining whether a system is closed, independently of the conservation laws? In other words, do we have a criterion for causally determining that a system is isolated from the environment? Is this criterion independent of momentum-energy conservation laws? A short reflection would demonstrate that there is none. And so a system is defined as closed whenever the total momentum and energy of that system is conserved. In other words, the conservation laws themselves provide us with a criterion of isolation, or the criterion by which a system is shown to be causally isolated from the rest of the world.”
Belkind, O. (2012). Physical Systems: Conceptual Pathways between Flat Space-time and Matter (Boston Studies in the Philosophy of Science, Vol. 264). Springer.
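A toy numerical version of the circularity Belkind describes (my own example, not from the book): simulate an oscillator twice, once genuinely undisturbed and once with a weak hidden external push, and then "decide" which run counts as a closed system by checking whether its total energy is conserved. The conservation law is doing the work of the isolation criterion.

```python
# Toy numerical illustration (mine, not Belkind's) of using the conservation
# law itself as the criterion of isolation: we "detect" whether a simulated
# oscillator is a closed system by checking whether its total energy stays
# constant, which is exactly the circularity described in the quote.
import numpy as np

def simulate(drive_amplitude, m=1.0, k=1.0, dt=1e-3, t_max=20.0):
    """Velocity-Verlet integration of m x'' = -k x + F_ext(t)."""
    steps = int(t_max / dt)
    x, v = 1.0, 0.0
    energies = np.empty(steps)
    for i in range(steps):
        t = i * dt
        a = (-k * x + drive_amplitude * np.sin(0.9 * t)) / m
        x_new = x + v * dt + 0.5 * a * dt**2
        a_new = (-k * x_new + drive_amplitude * np.sin(0.9 * (t + dt))) / m
        v = v + 0.5 * (a + a_new) * dt
        x = x_new
        energies[i] = 0.5 * m * v**2 + 0.5 * k * x**2
    return energies

def looks_closed(energies, tolerance=0.01):
    """Declare the system 'closed' iff total energy is conserved to within 1%."""
    return np.ptp(energies) / energies[0] < tolerance

undisturbed = simulate(drive_amplitude=0.0)
secretly_driven = simulate(drive_amplitude=0.05)
print("Undisturbed run classified as closed?   ", looks_closed(undisturbed))       # True
print("Secretly driven run classified as closed?", looks_closed(secretly_driven))  # False
```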

In short, the laws of physics are contingent because they hold only for idealized systems that don’t exist, but hold therefore also for systems that approximate idealization. This failure has come to be recognized as drastic in several ways, both within physics and with respect to biology:
“From the physical point of view, even the simplest system one would want to call an organism is already inconceivably complicated. There are no biological counterparts of the inclined plane or pulley, the simple systems that manifests in itself the general laws we want to study. We cannot thus study organisms by inorganic proxy, at least not experimentally”
(Rosen, p. 18)

Even when considering would-be simple classical systems, it turns out that a large degree of faith was put into the validity of “laws” of physics because of our ability to treat systems that could be understood in terms of their parts with these “laws”. In physics, laws are in general codified in mathematical form, most particularly via differential equations. The problem is that most differential equations cannot be solved analytically (that is, there exists no method for solving them exactly, even in principle, but computational procedures may be used to approximate answers under suitable conditions). It turns out that the systems whence came our basis for asserting there to be the “laws” of reductionist physics were those systems that are “separable” (i.e., modeled by differential equations which could be solved exactly via separation of the “parts” of the system by treating the infinitesimal displacements and/or differential forms which appeared in the system(s) of equation(s) independently). This is, in general, not possible, and is certainly impossible with living systems or any non-living system that behaves chaotically.
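For a concrete (and deliberately simple) contrast, here is a sketch of my own: a separable equation like dx/dt = -kx has an exact closed-form solution, while the Lorenz system has no known closed form and can only be marched forward numerically, with nearby initial conditions diverging.

```python
# A deliberately simple contrast (my own illustration): the separable equation
# dx/dt = -k x has an exact closed-form solution, while the Lorenz system has
# no known closed form -- we can only march it forward numerically, and nearby
# initial conditions diverge (the hallmark of chaos).
import numpy as np

def rk4_step(f, y, t, dt):
    """One classical Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# 1) Separable: dx/dt = -k x, solved exactly by x(t) = x0 * exp(-k t).
k, x0, dt = 0.5, 1.0, 0.01
x = np.array([x0])
for i in range(1000):                         # integrate to t = 10
    x = rk4_step(lambda t, y: -k * y, x, i * dt, dt)
print("numerical:", float(x[0]), " exact:", x0 * np.exp(-k * 10.0))

# 2) Lorenz system: no analytic solution; sensitive dependence on initial data.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])            # perturb by one part in 10^8
for i in range(4000):                         # integrate to t = 40
    a = rk4_step(lorenz, a, i * dt, dt)
    b = rk4_step(lorenz, b, i * dt, dt)
print("separation after t = 40:", float(np.linalg.norm(a - b)))  # typically macroscopic, nothing like 1e-8
```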

In even shorter form, the laws of physics are contingent because they were formulated using contingency, and were only held to be true more generally by assumptions which were false.
 

Nous

Well-Known Member
Premium Member
Prove it. Provide a non-vacuous definition.

And if you become able to use logic at some point, then state an argument by which to conclude that the thesis of physicalism is true. That was the challenge of the OP. After more than 400 posts, you still haven't met the challenge of the OP.
That is pretty much what I have been doing most of the time when talking to you.
Then just quote your non-circular definition of “physical” and use it in your argument that concludes that the thesis of physicalism is true. That's the challenge of this thread. I don't know what you're waiting on--an invitation to come in the mail?

P1: [. . .]
P2: [. . .]
C: Therefore, everything that exists is [your non-circular definition of “physical”].

Fill in the blanks.
 

Nous

Well-Known Member
Premium Member
For the record, I don't know. Everything that exists might be the kind of thing you can study in physics or depend on the kind of thing you can study in physics but that seems like a matter that we could solve by finding clear examples of something that isn't.
How about free will or volition--the ability of an individual to choose between available options? You don't know of any occasion where that phenomenon has been "stud[ied] in physics," do you?

I'm open to both possibilities or even scenarios where neither statement makes any sense. I'd say the mind is a possible candidate and I've no idea of how we could categorise the truths of mathematics in such a scheme.
So, in other words, you do not have or know of an argument that concludes that the thesis of physicalism--that everything that exists is "physical"--is true. Correct?

A physicalist account could maybe proceed as follows:

Free will comes from somewhere. We can all agree that we are/have bodies. We can all agree that nothing without a body has ever been seen to have had free will. It is not a great leap to suppose that free will comes with being/having the particular kinds of body we are/have.
It is a great leap unless you can identify some process happening in the body that can plausibly result in an individual being able to choose between available options. No?

Let's say you are confronted with two arguments that use the same premises but have different conclusions. One argument is of the form: "If P, then Q. Not Q. Therefore not P." The other argument is of the form: "If P, then Q. Not Q. Therefore R." What is the process in the body that allows a person to accept, agree with, believe and assert the truth of the conclusion of the first argument rather than the conclusion of the second argument?
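For what it's worth, the purely formal part of this is mechanically checkable. Here is a small sketch (a toy of my own, not anyone's position in this thread) that enumerates truth assignments and confirms the first form is valid and the second is not:

```python
# Small illustrative sketch (mine): brute-force truth tables showing that
# modus tollens is valid while the second argument form is not -- an argument
# is valid iff no assignment makes all premises true and the conclusion false.
from itertools import product

def valid(premises, conclusion, variables):
    """Check validity over all truth assignments of the given variables."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False    # found a counterexample
    return True

implies = lambda a, b: (not a) or b

# Form 1: If P then Q. Not Q. Therefore not P.   (modus tollens)
premises = [lambda e: implies(e["P"], e["Q"]), lambda e: not e["Q"]]
print(valid(premises, lambda e: not e["P"], ["P", "Q"]))        # True

# Form 2: If P then Q. Not Q. Therefore R.       (non sequitur)
print(valid(premises, lambda e: e["R"], ["P", "Q", "R"]))       # False
```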

Yes, I will not be making an argument that either materialism or physicalism is true. I've done enough reading on the topic to know that no philosopher has made such an argument. We already know that the classical definition of materialism is false, and physicalism is invariably gored on one of the two horns of Hempel's dilemma.
What would you say is the most relevant reading on the matter?
My conclusion is that the thesis of materialism has been proven false by the findings and theories of modern physics, and that the thesis of physicalism is vacuous and indefensible because the concept of "physical" is vacuous and unscientific.
 

Nous

Well-Known Member
Premium Member
What would you say is the most relevant reading on the matter?
It only now occurred to me that you must be asking about what I've read on the topic, not what I have concluded.

Honestly I haven't done a lot of reading on metaphysics (or philosophy generally) lately. I've read some of the older 20th century standards, such as Rorty. I've read a couple of Dennett's books, where he didn't really attempt to make any arguments for materialism. I've read Jaegwon Kim's Mind in a Physical World, and some of his papers (I especially like Kim--but he needs to study physics a little). I know I've read a paper by Stoljar where, as I recall, he articulated something like an argument; offhand I don't recall the name of this paper. I've read papers by the Churchlands, but not their books.

I've also read books and papers by others who argue for other metaphysical theses or at least against materialism/physicalism, such as Chalmers, Sprigge, Hartshorne, Searle, Strawson, Bergson, and a hell of a lot of the pre-20th century idealists.
 

Nous

Well-Known Member
Premium Member
And, I should say that the authors who've most influenced me on metaphysics are the physicists who've broached the topic--Paul Davies, Henry Stapp, Wheeler, Eddington, Kuttner & Rosenblum, Heisenberg, Schrödinger, Bohr.
 

idav

Being
Premium Member
The more I’ve thought about my previous answer to this, the more dissatisfied I’ve become. There is a more elegant (and in general more accurate, or at least more fundamental) answer. The contingency is due to the development and formulation of physical laws. By “development” I refer to the historical manner in which these laws came to be, and by “formulation” I mean the manner of their exposition in modern physics (including the modern way classical laws of physics, now known to be incorrect, are expressed).

First, physical laws (and the idea that such things existed) were developed by considering physical systems (whether celestial bodies or rocks dropped from ship masts) in an idealized manner. In particular, it was assumed that we could sufficiently isolate systems such that we could study their dynamics while treating our role as observers as negligible and our role as designers (in the case of most experiments, in which systems are set up by humans rather than simply “observed” as celestial bodies were) as negligible too. Generalizations from the results of such studies of “isolated” systems were considered to hold true universally (or at least far more broadly).

Second, the mechanical nature of the instruments and experimental designs used to codify laws and the ontological status of the objects of their descriptions were based fundamentally on the development and examination of mechanical devices:
“If we look back at the history of physics, with a view to understanding why it could develop as rapidly as it did, we find two relevant sets of circumstances. First, there was the patiently accumulated data on astronomy, collected and tabulated for millennia. Second, an apparently unrelated circumstance: one could experiment directly with simply mechanical contrivances, with inclined planes, pulleys, and springs, and express the resulting data in the form of simple mathematical rules, revealed by simple laboratory experiments with simple bulk systems, held good throughout all nature, at every level, from the greatest to the very smallest…
This tacit belief in the unlimited uniformity of mechanical behavior, and the corresponding universality of mechanical laws, provided the absolutely essential nutrient that permitted theoretical physics to develop as it did...
We now know, after three centuries, that this assumption of uniformity was entirely, hopelessly false.”
Rosen, R. (1991). Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life (Complexity in Ecological Systems). Columbia University Press.

By generalizing from the observation of the dynamics of simple, mechanical systems (or those systems whose behavior was akin to simple, mechanical systems) considered in isolation laws such as the law of conservation of energy or of momentum were derived. But long before it became clear that this generalization procedure was hopelessly flawed, these laws were often elevated to a definitional status. For example, the law of the conservation of energy only holds true in the case of isolated systems (as do most “laws” of physics). In reality, systems are never truly isolated (quantum physics taught us that), but in classical physics and with laws derived via the observation of simple, mechanical systems, it seemed possible to isolate systems. However, even for Newton and certainly for physicists in the 1800s it became clear that seemingly isolated systems were not, in fact, “isolated”. So circular reasoning via contingency was employed: the laws of physics held true for isolated systems, but for the laws to hold true it was necessary that these systems be isolated, and so systems were rightfully considered to be isolated if and only if the laws held:

“It is often said that energy and momentum are conserved for closed systems. One ordinarily thinks of this claim as merely empirical: if a system is closed, then it is an empirical fact that the total momentum and energy of that system is conserved. Thus, it is imagined that, had the empirical facts been different, a closed system could turn out to have total momentum and energy that is not conserved. But do we have a criterion for determining whether a system is closed, independently of the conservation laws? In other words, do we have a criterion for causally determining that a system is isolated from the environment? Is this criterion independent of momentum-energy conservation laws? A short reflection would demonstrate that there is none. And so a system is defined as closed whenever the total momentum and energy of that system is conserved. In other words, the conservation laws themselves provide us with a criterion of isolation, or the criterion by which a system is shown to be causally isolated from the rest of the world.”
Belkind, O. (2012). Physical Systems: Conceptual Pathways between Flat Space-time and Matter (Boston Studies in the Philosophy of Science, Vol. 264). Springer.

In short, the laws of physics are contingent because they hold only for idealized systems that don’t exist, but hold therefore also for systems that approximate idealization. This failure has come to be recognized as drastic in several ways, both within physics and with respect to biology:
“From the physical point of view, even the simplest system one would want to call an organism is already inconceivably complicated. There are no biological counterparts of the inclined plane or pulley, the simple systems that manifests in itself the general laws we want to study. We cannot thus study organisms by inorganic proxy, at least not experimentally”
(Rosen, p. 18)

Even when considering would-be simple classical systems, it turns out that a large degree of faith was put into the validity of “laws” of physics because of our ability to treat systems that could be understood in terms of their parts with these “laws”. In physics, laws are in general codified in mathematical form, most particularly via differential equations. The problem is that most differential equations cannot be solved analytically (that is, there exists no method for solving them exactly, even in principle, but computational procedures may be used to approximate answers under suitable conditions). It turns out that the systems whence came our basis for asserting there to be the “laws” of reductionist physics were those systems that are “separable” (i.e., modeled by differential equations which could be solved exactly via separation of the “parts” of the system by treating the infinitesimal displacements and/or differential forms which appeared in the system(s) of equation(s) independently). This is, in general, not possible, and is certainly impossible with living systems or any non-living system that behaves chaotically.

In even shorter form, the laws of physics are contingent because they were formulated using contingency, and were only held to be true more generally by assumptions which were false.
With all this in mind, aren't things still reducible, as you alluded to in the previous post, down to energies, atoms and such? For example, you used several examples from biology in which even the simplest organisms are highly complex. However, as the organisms die, do they not reduce back to said basic components, and can it not be quantified? Sure, emergence is fascinating, but even something like H2O, which becomes something so different from its components, can also be easily reduced to the H and the O via heat.
 

LegionOnomaMoi

Veteran Member
Premium Member
With all this in mind, aren't things still reducible, as you alluded to in the previous post, down to energies, atoms and such?
No. Heat, for example, is fundamentally an emergent property. The properties of most molecules, such as water, can't be understood in terms of their atoms (at best they merely fail to violate any "laws" governing the dynamics of atoms). Solitons, phonons, strongly synchronized systems (particularly non-separable synchronized systems), etc., all involve the emergence of structures, functions, and so forth that cannot be "derived" from any laws governing their constituent parts. Living systems cannot, in general, even be understood in terms of their parts at all (let alone the dynamical laws which are supposed to characterize these).
However, as the organisms die, do they not reduce back to said basic components, and can it not be quantified?
Actually, there is a certain degree of equivalence between death in living systems and reductionism. Because living systems are determined not only by the dynamics of their parts but by the emergent functions (such as metabolism-repair) which cannot be reduced to constituent parts yet cannot be ignored, both death and reductionism remove something fundamental from the systems in question. One cannot understand even a single cell without incorporating metabolic-repair, yet a dead cell is not determined by this function (nor does it emerge from cellular dynamics in a dead cell).

Sure, emergence is fascinating, but even something like H2O, which becomes something so different from its components, can also be easily reduced to the H and the O via heat.
Heat cannot be understood apart from statistical complexes. It is irreducible. Also, the properties of water cannot be understood by any study of hydrogen and oxygen atoms.
 

LegionOnomaMoi

Veteran Member
Premium Member
even something like H2O, which becomes something so different from its components, can also be easily reduced to the H and the O via heat.
I noted in my previous post that this isn't true at all. But given that you invoked heat, I thought it appropriate to concentrate momentarily on how "via heat" means "via irreducible properties". Specifically:
"one can assign an 'objective temperature' to a body only on the basis of evidence concerning the average velocity of its constituent particles, some of which escape from the object (altering thereby 'the body') and are recorded by a detector whose own physical properties are involved essentially in the 'reading'. If one had a LaPlacean knowledge of all the particles involved, that is, a complete tabulation of all the component microevents and their interrelations, then one could predict the time and nature of every actual recording by the detector. But then, in such a case, one could no longer assign an objective temperature to the body; because the very concepts of temperature and entropy presuppose statistical disorder in the phenomena." p. 81-81
Hanson, N. R. (1963). The Concept of the Positron: A Philosophical Analysis. Cambridge University Press.
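A small sketch of my own (not Hanson's) to illustrate the statistical point: the "temperature" estimated from molecular velocities only becomes well defined as the sample grows; a single particle carries no temperature at all.

```python
# Illustrative sketch (mine, not Hanson's): "temperature" is a property of a
# statistical ensemble, not of any single particle. In reduced units where
# k_B = m = 1, each velocity component is Gaussian with variance T, and the
# estimator T_hat = <v^2> only becomes sharp as the sample grows.
import numpy as np

rng = np.random.default_rng(42)
T_true = 2.0

for n in (1, 10, 1000, 100_000):
    # n molecules, one velocity component each; T_hat = mean squared velocity
    v = rng.normal(0.0, np.sqrt(T_true), size=(200, n))
    T_hat = np.mean(v**2, axis=1)          # 200 independent "readings"
    print(f"n = {n:>6}: mean T_hat = {T_hat.mean():.3f}, "
          f"spread (std) = {T_hat.std():.3f}")
# The spread collapses roughly like 1/sqrt(n): a single molecule carries no
# well-defined temperature; only the disordered ensemble does.
```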
 

LegionOnomaMoi

Veteran Member
Premium Member
My conclusion is that the thesis of materialism has been proven false by the findings and theories of modern physics, and that the thesis of physicalism is vacuous and indefensible because the concept of "physical" is vacuous and unscientific.
I largely agree. I would, however, say that one can define "physical" circularly without having such a circular definition be worthless. In fact, "unscientific" is similarly "vacuous" in that the scientific endeavor concerns the formulation of models and theories abstracted by empirical investigations which assume that there exists a "physical" reality that is subject to such investigations. "Unscientific" theories are those that posit phenomena that cannot be subjected to the empirical methods employed by scientists. Of course, there are no hard and fast rules separating "science" from other fields of inquiry (e.g., some consider history to be a "science", and until recently the most exact science was generally considered to be mathematics, which doesn't concern either the physical or the empirical). But I would not wholly abandon the notion of "physical" as I believe that in order for physics to be scientific, it must be capable of at least a fuzzy distinction between physical and non-physical, else it cannot be considered even a sensible knowledge domain.
 

Yerda

Veteran Member
How about free will or volition--the ability of an individual to choose between available options? You don't know of any occasion where that phenomenon has been "stud[ied] in physics," do you?
No. But it might be the case that it depends on things like matter/fields/energy etc. It surely depends on something.

Nous said:
So, in other words, you do not have or know of an argument that concludes that the thesis of physicalism--that everything that exists is "physical"--is true. Correct?
Correct. I don't. Not in so many words, at least. I've actually never seen anyone argue that everything is physical, only that such and such is physical (usually the mind).

Nous said:
It is a great leap unless you can identify some process happening in the body that can plausibly result in an individual being able to choose between available options. No?
Perhaps. I see choice as an aspect of consciousness and I see consciousness as a biological process like digestion or photosynthesis (see John Searle). That would mean it depends on matter for the most part. Which is physical.

Nous said:
Let's say you are confronted with two arguments that use the same premises but have different conclusions. One argument is of the form: "If P, then Q. Not Q. Therefore not P." The other argument is of the form: "If P, then Q. Not Q. Therefore R." What is the process in the body that allows a person to accept, agree with, believe and assert the truth of the conclusion of the first argument rather than the conclusion of the second argument?
The process of checking for logical consistency?

Nous said:
My conclusion is that the thesis of materialism has been proven false by the findings and theories of modern physics, and that the thesis of physicalism is vacuous and indefensible because the concept of "physical" is vacuous and unscientific.
I see. I take it the concept "non-physical" is also vacuous and unscientific?
 

Koldo

Outstanding Member
Ok, then we'll go with "derived from". Molecular structures are not derived from atomic physics (which is itself not derived from elementary particle physics, but defined in terms of elementary particles), and we can't even understand cellular dynamics in terms of molecular physics, let alone somehow "derive" any science of living systems from our understanding of molecules. Hell, we have to assume that gases and liquids and other systems can, in principle, be understood in terms of the dynamics of their constituent elements (i.e., that gases and so forth are made up of molecules and atoms that obey the laws of physics governing the dynamics of atoms and molecules), because even in the Lagrangian or Hamiltonian formulations of classical (analytical) dynamics the equations for tiny amounts of gases require integrals in such high-dimensional spaces as to be forever unsolvable even by approximation methods. We rely instead on statistical descriptions that we assume are descriptions of averages of the behaviors of the molecules making up gases and similar systems, and we know that in general these assumptions are wrong but often approximately correct.
Indeed, it is in condensed matter physics, continuum mechanics, and similar levels of analysis that we find physicists and research in physics reflecting the autonomy (i.e., non-derivability) of the levels of analysis that your chart assumes to be non-autonomous (i.e., it assumes higher levels are derived from lower). Even at the level of solid state physics, condensed matter physics, and other areas of physics that require the use of quantum theory (sometimes beyond quantum mechanics) we find the emergence of structure, dynamics, and phenomena that cannot be derived from atomic or elementary particle physics. Certain collective, irreducible states begin to emerge at very low-level analyses.
See e.g.,
Falkenburg, B., & Morrison, M. (Eds.). (2015). Why more is different: Philosophical issues in condensed matter physics and complex systems. Springer.
Chibbaro, S., Rondoni, L., & Vulpiani, A. (2014). Reductionism, emergence and levels of reality. Springer.
And Anderson's groundbreaking study on the autonomy of different levels of analysis and the non-derivability you assume: More is Different
Although Anderson doesn't go far enough, you will find that the ability to derive higher levels from lower is a stronger condition than reductionism.


I've uploaded a scan of the paper from the volume it was published in.

Higher level structures can at best be in principle reduced to accord with the laws governing their constituent (lower level) parts. But they cannot be constructed from these laws. For example, in granular media the initial configuration state of a system that undergoes a transition, such as the formation of crystalline structure or that of a sandpile, does not violate any known laws of physics but the final structure cannot be "derived" from the initial. It is contingent upon the initial state and the collective (emergent) dynamics that are produced by forces acting on a system too interconnected to be understood solely in terms of its constituents.
The emergence of such collective structures that are irreducible and non-derivable from the dynamics of lower levels takes on a qualitative difference for living systems, even for intracellular structures such as DNA. For living systems, functional emergence (which is non-physical) becomes necessary. Consider a model, simulation, or similar "realization" of a cell and the process of metabolism-repair, and let f: A→B be a function
"where f is the process that takes input A and output B...The system Rosen uses for an example is the Metabolism-Repair or [M,R] system. The process, f, in this case stands for the entire metabolism goin on in an organism...The transition, f, which is being called metabolism, is a mapping taking some set of metabolites, A, into some set of products, B. What are the members of A? Really everything in the organism has to be included in A, and there has to be an implicit agreement that at least some of the members of A can enter the organism from its environment. What are the members of B? Many, if not all, of the memebers of A since the transitions in the reduced system are all strung together in the many intricate patterns or networks that make up the organism's metabolism. It also must be true that some members of B leave the organism as products of metabolism...In the context developed so far, the mapping, f, has a very special nature. It is a functional component of the system we are developing. A functional component has many interesting attributes. First of all, it exists independent of the material parts that make it possible. Reductionism has taught us that every thing in a real system can be expressed as a collection of material parts. This is not so in the case of functional components...Fragmentability is the aspect of systems that can be reduced to their material parts leaving recognizable material entities as the result. A system is not fragmentable is reducing it to its parts destroys something essential about that system. Since the crux of understanding a complex system had to do with identifying the context dependent functional components, they are by definition, not fragmentable". (emphasis added; italics in original)
Mikulecky, D. C. (2005). The Circle That Never Ends: Can Complexity be Made Simple? In D. Bonchev & D. H. Rouvray (Eds.). Complexity in Chemistry, Biology, and Ecology (Mathematical and Computational Chemistry). Springer.



"systems biology is concerned with the relationship between molecules and cells; it treats cells as organized, or organizing, molecular systems having both molecular and cellular properties. It is concerned with how life or the functional properties thereof that are not yet in the molecules, emerge from the particular organization of and interactions between its molecular processes. It uses models to describe particular cells and generalizes over various cell types and organisms to arrive at new theories of cells as molecular systems. It is concerned with explaining and predicting cellular behaviour on the basis of molecular behaviour. It refers to function in ways that would not be permitted in physics. It addresses an essential minimum complexity exceeding that of any physical chemical system understood until now. It shies away from reduction of the system under study to a collection of elementary particles. Indeed, it seems to violate many of the philosophical foundations of physics, often in ways unprecedented even by modern physics." (emphases added)
Boogerd, F., Bruggeman, F. J., Hofmeyr, J. H. S., & Westerhoff, H. V. (Eds.). (2007). Systems biology: philosophical foundations. Elsevier.

It seems you have taken my post to mean that supervenience entails higher-level explanations being produced by knowledge of the lower level (events, objects and/or laws).
However, I wasn't presenting supervenience as an epistemological approach (in other words, a way to derive knowledge), but rather as a metaphysical statement.

Taking that into consideration, emergence precludes neither supervenience nor physicalism. It comes down to the specifics of how you are using the term.

Further, considering you have said that "It is contingent upon the initial state and the collective (emergent) dynamics that are produced by forces acting on a system too interconnected to be understood solely in terms of its constituents", what do you hold to be the relationship between the lower level and the higher level? How do they relate to each other?

So what is the ontological nature, according to physicalism, of the "virtual" processes and particles that are involved in fundamental interactions, either before or after renormalization?

So, what is the ontological nature, according to physicalism, of ...?

If the rest of this question is filled with something that exists in the actual world (which does vary depending on who you ask), then the only viable answer is to ultimately say: 'It is physical'.
 

Koldo

Outstanding Member
The more I’ve thought about my previous answer to this, the more dissatisfied I’ve become. There is a more elegant (and in general more accurate, or at least more fundamental) answer. The contingency is due to the development and formulation of physical laws. By “development” I refer to the historical manner in which these laws came to be, and by “formulation” I mean the manner of their exposition in modern physics (including the modern way classical laws of physics, now known to be incorrect, are expressed).

First, physical laws (and the idea that such things existed) were developed by considering physical systems (whether celestial bodies or rocks dropped from ship masts) in an idealized manner. In particular, it was assumed that we could sufficiently isolate systems such that we could study their dynamics while treating our role as observers as negligible and our role as designers (in the case of most experiments, in which systems are set up by humans rather than simply “observed” as celestial bodies were) as negligible too. Generalizations from the results of such studies of “isolated” systems were considered to hold true universally (or at least far more broadly).

Second, the mechanical nature of the instruments and experimental designs used to codify laws and the ontological status of the objects of their descriptions were based fundamentally on the development and examination of mechanical devices:
“If we look back at the history of physics, with a view to understanding why it could develop as rapidly as it did, we find two relevant sets of circumstances. First, there was the patiently accumulated data on astronomy, collected and tabulated for millennia. Second, an apparently unrelated circumstance: one could experiment directly with simply mechanical contrivances, with inclined planes, pulleys, and springs, and express the resulting data in the form of simple mathematical rules, revealed by simple laboratory experiments with simple bulk systems, held good throughout all nature, at every level, from the greatest to the very smallest…
This tacit belief in the unlimited uniformity of mechanical behavior, and the corresponding universality of mechanical laws, provided the absolutely essential nutrient that permitted theoretical physics to develop as it did...
We now know, after three centuries, that this assumption of uniformity was entirely, hopelessly false.”
Rosen, R. (1991). Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life (Complexity in Ecological Systems). Columbia University Press.

By generalizing from the observation of the dynamics of simple, mechanical systems (or those systems whose behavior was akin to simple, mechanical systems) considered in isolation laws such as the law of conservation of energy or of momentum were derived. But long before it became clear that this generalization procedure was hopelessly flawed, these laws were often elevated to a definitional status. For example, the law of the conservation of energy only holds true in the case of isolated systems (as do most “laws” of physics). In reality, systems are never truly isolated (quantum physics taught us that), but in classical physics and with laws derived via the observation of simple, mechanical systems, it seemed possible to isolate systems. However, even for Newton and certainly for physicists in the 1800s it became clear that seemingly isolated systems were not, in fact, “isolated”. So circular reasoning via contingency was employed: the laws of physics held true for isolated systems, but for the laws to hold true it was necessary that these systems be isolated, and so systems were rightfully considered to be isolated if and only if the laws held:

“It is often said that energy and momentum are conserved for closed systems. One ordinarily thinks of this claim as merely empirical: if a system is closed, then it is an empirical fact that the total momentum and energy of that system is conserved. Thus, it is imagined that, had the empirical facts been different, a closed system could turn out to have total momentum and energy that is not conserved. But do we have a criterion for determining whether a system is closed, independently of the conservation laws? In other words, do we have a criterion for causally determining that a system is isolated from the environment? Is this criterion independent of momentum-energy conservation laws? A short reflection would demonstrate that there is none. And so a system is defined as closed whenever the total momentum and energy of that system is conserved. In other words, the conservation laws themselves provide us with a criterion of isolation, or the criterion by which a system is shown to be causally isolated from the rest of the world.”
Belkind, O. (2012). Physical Systems: Conceptual Pathways between Flat Space-time and Matter (Boston Studies in the Philosophy of Science, Vol. 264). Springer.

In short, the laws of physics are contingent because they hold only for idealized systems that don’t exist, but hold therefore also for systems that approximate idealization. This failure has come to be recognized as drastic in several ways, both within physics and with respect to biology:
“From the physical point of view, even the simplest system one would want to call an organism is already inconceivably complicated. There are no biological counterparts of the inclined plane or pulley, the simple systems that manifests in itself the general laws we want to study. We cannot thus study organisms by inorganic proxy, at least not experimentally”
(Rosen, p. 18)

Even when considering would-be simple classical systems, it turns out that a large degree of faith was put into the validity of “laws” of physics because of our ability to treat systems that could be understood in terms of their parts with these “laws”. In physics, laws are in general codified in mathematical form, most particularly via differential equations. The problem is that most differential equations cannot be solved analytically (that is, there exists no method for solving them exactly, even in principle, but computational procedures may be used to approximate answers under suitable conditions). It turns out that the systems whence came our basis for asserting there to be the “laws” of reductionist physics were those systems that are “separable” (i.e., modeled by differential equations which could be solved exactly via separation of the “parts” of the system by treating the infinitesimal displacements and/or differential forms which appeared in the system(s) of equation(s) independently). This is, in general, not possible, and is certainly impossible with living systems or any non-living system that behaves chaotically.

In even shorter form, the laws of physics are contingent because they were formulated using contingency, and were only held to be true more generally by assumptions which were false.

I was asking that question on regards to this quote:

"Your graphics in which higher levels of analysis are produced by the dynamics governing lower levels is simply and fundamentally wrong:
"At the rather basic level of life, and perhaps even in chemistry, there is no reduction: perhaps the simplest proof of this is that while the bases of DNA each obey the laws of physics, the juxtaposition of bases in the nucleotides is physically contingent, so the information content of DNA and the way it serves to encode instructions for constructing proteins is not governed merely by the laws of physics."
Simons, P. (2002). Candidate General Ontologies for Situating Quantum Field Theory. In Kuhlmann, M., Lyre, H., Wayne, A. (Eds.). Ontological Aspects of Quantum Field Theory. World Scientific."

Now, correct me if I am wrong, but you are saying that the relationship between the laws of physics and the juxtaposition of bases in the nucleotides is contingent.
And my question was: What (if anything) do you propose to be responsible for this contingency?
 

Koldo

Outstanding Member
Then just quote your non-circular definition of “physical” and use it in your argument that concludes that the thesis of physicalism is true. That's the challenge of this thread. I don't know what you're waiting on--an invitation to come in the mail?

P1: [. . .]
P2: [. . .]
C: Therefore, everything that exists is [your non-circular definition of “physical”].

Fill in the blanks.

It only now occurred to me that you must be asking about what I've read on the topic, not what I have concluded.

Honestly I haven't done a lot of reading on metaphysics (or philosophy generally) lately. I've read some of the older 20th century standards, such as Rorty. I've read a couple of Dennett's books, where he didn't really attempt to make any arguments for materialism. I've read Jaegwon Kim's Mind in a Physical World, and some of his papers (I especially like Kim--but he needs to study physics a little). I know I've read a paper by Stoljar where, as I recall, he articulated something like an argument; offhand I don't recall the name of this paper. I've read papers by the Churchlands, but not their books.

I've also read books and papers by others who argue for other metaphysical theses or at least against materialism/physicalism, such as Chalmers, Sprigge, Hartshorne, Searle, Strawson, Bergson, and a hell of a lot of the pre-20th century idealists.

I am honestly surprised by the second quote.
Do you mean to say you have read several books about physicalism and yet your refutation of it rests on semantic grounds?
 

Nous

Well-Known Member
Premium Member
I largely agree. I would, however, say that one can define "physical" circularly without having such a circular definition be worthless.
What would be a circular definition of "physical" that is not worthless?

Can we also use a circular but non-vacuous definition of the adjective “deific” in order to argue that the thesis of pantheism is true?

In fact, "unscientific" is similarly "vacuous" in that the scientific endeavor concerns the formulation of models and theories abstracted by empirical investigations which assume that there exists a "physical" reality that is subject to such investigations. "Unscientific" theories are those that posit phenomena that cannot be subjected to the empirical methods employed by scientists. Of course, there are no hard and fast rules separating "science" from other fields of inquiry (e.g., some consider history to be a "science", and until recently the most exact science was generally considered to be mathematics, which doesn't concern either the physical or the empirical). But I would not wholly abandon the notion of "physical" as I believe that in order for physics to be scientific, it must be capable of at least a fuzzy distinction between physical and non-physical, else it cannot be considered even a sensible knowledge domain.
All true. Despite tacking “unscientific” onto the end of my sentence, I didn't mean to imply that the word “physical” is inherently unscientific in some way, only that no scientific discipline defines or traffics in that adjective. As I've said several times here, I suspect the ancient concept of “physical” denotes something that humans can at least theoretically acquire some kind of sense datum from--that's generally my concept of “physical,” and even Stoljar said something similar. By this definition, the banana that I shall eat soon is physical, but the antagonist in the novel that I have not written the first word of is not physical. Energy is not physical; countably infinite sets are not physical. Etc., etc.
 