
Entropy is NOT disorder (it really is not)

Revoltingest

Pragmatic Libertarian
Premium Member
I am not sure I agree. If we consider the entropy of a black hole, for instance, it seems to reduce to the number of bits mapped on its surface. This is also why throwing a body into a black hole will increase its horizon by exactly the amount required to store the system state of that body.

And you are partially right: that machine won't work, with a very high probability proportional to the amount of time it is supposed to work.
I'm not versed in the properties of black holes.
But I do know....
- They're not a closed system (since matter & energy fall into them).
- That some speculate that they're also not a closed system because
they're a gateway to another universe. Entertaining idea, eh.
- I've seen no evidence they don't obey the 2nd Law.
- There's no evidence that matter & energy which create & contribute to
them are converted to bits of information...whatever that form would take
(see the sketch below).
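
For scale, here is a minimal sketch of what viole's "bits mapped on the surface" claim works out to under the standard Bekenstein-Hawking formula. It shows the theoretical numbers only, not evidence, and the solar-mass example is an assumption chosen purely for illustration.

```python
# Minimal sketch: Bekenstein-Hawking entropy of a Schwarzschild black hole,
# expressed in bits. The solar-mass example is an assumed illustration.
import math

G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s
k_B  = 1.381e-23   # Boltzmann constant, J/K
M    = 1.989e30    # one solar mass, kg (assumed example)

r_s  = 2 * G * M / c**2                    # Schwarzschild radius, m
area = 4 * math.pi * r_s**2                # horizon area, m^2
S    = k_B * area * c**3 / (4 * G * hbar)  # Bekenstein-Hawking entropy, J/K
bits = S / (k_B * math.log(2))             # the same entropy counted in bits

print(f"Horizon radius: {r_s / 1e3:.2f} km")
print(f"Entropy: {S:.3e} J/K  (~{bits:.3e} bits)")
```
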
So, since you believe entropy cannot possibly decrease: can you show me the fallacy in the Poincaré recurrence theorem?
It's above my pay grade to address that.

We should note that all the fancy statistical mechanics analysis is really just a theoretical
particle physics approach to macroscopic properties of temperature, pressure & heat.
This whole "information" approach is dysfunctional, leading to misunderstanding.
And creationists are having a field day with it.
 

viole

Ontological Naturalist
Premium Member
I'm not versed in the properties of black holes.
But I do know....
- They're not a closed system (since matter & energy fall into them).
- That some speculate that they're also not a closed system because
they're a gateway to another universe. Entertaining idea, eh.
- I've seen no evidence they don't obey the 2nd Law.
- There's no evidence that matter & energy which create & contribute to
them are converted to bits of information...whatever that form would take.

It's above my pay grade to address that.

We should note that all the fancy statistical mechanics analysis is really just a theoretical
particle physics approach to macroscopic properties of temperature, pressure & heat.
This whole "information" approach is dysfunctional, leading to misunderstanding.
And creationists are having a field day with it.

I personally don't care about creationists. And in my experience, they like to claim that information is not reducible to physics. Which is, in my opinion, wrong.

Fact is, all resolutions of Maxwell's little demon paradox seem to go in the same direction. Work in information technology also seems to confirm that information processing is strictly related to thermodynamics. You cannot possibly manipulate information without affecting physics. That is why your computer needs fans. Landauer's work in this area presents interesting links between information and physics.

In other words, bits and joules per kelvin are basically different units for the same thing.
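
To make the unit conversion concrete, here is a minimal sketch assuming the standard identification of k_B ln 2 joules per kelvin per bit; the 1 GB size is just an example.

```python
# Minimal sketch: converting between bits and J/K via S = k_B * ln(2) per bit.
# The 1 GB example size is an assumption for illustration.
import math

K_B = 1.381e-23                 # Boltzmann constant, J/K
S_PER_BIT = K_B * math.log(2)   # entropy of one bit, J/K

bits = 8e9                      # 1 GB of storage, in bits (assumed example)
print(f"1 GB of information = {bits * S_PER_BIT:.3e} J/K")
print(f"1 J/K = {1 / S_PER_BIT:.3e} bits")
```
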

Ciao

- viole
 

Fool

ALL in all
Premium Member
I personally don't care about creationists. And in my experience, they like to claim that information is not reducible to physics. Which is, in my opinion, wrong.

Fact is, all resolutions of Maxwell's little demon paradox seem to go in the same direction. Work in information technology also seems to confirm that information processing is strictly related to thermodynamics. You cannot possibly manipulate information without affecting physics. That is why your computer needs fans. Landauer's work in this area presents interesting links between information and physics.

In other words, bits and joules per kelvin are basically different units for the same thing.

Ciao

- viole


panpsychism


https://phys.org/news/2011-03-quantum-no-hiding-theorem-experimentally.html
 

Revoltingest

Pragmatic Libertarian
Premium Member
I personally don't care about creationists. And in my experience, they like to claim that information is not reducible to physics. Which is, in my opinion, wrong.

Fact is, all resolutions of Maxwell's little demon paradox seem to go in the same direction. Work in information technology also seems to confirm that information processing is strictly related to thermodynamics. You cannot possibly manipulate information without affecting physics. That is why your computer needs fans. Landauer's work in this area presents interesting links between information and physics.

In other words, bits and joules per kelvin are basically different units for the same thing.

Ciao

- viole
Not all computers need fans.
We'll agree to disagree about everything being information.
 

viole

Ontological Naturalist
Premium Member
Not all computers need fans.

True. But they need at least passive coolers. The heat produced by a computation is always higher than what is needed merely to operate the machine. This extra heat has to do with information manipulation only. Every time a bit is deleted, no matter how, an absolute minimal amount of heat is produced. Ergo, you cannot possibly manipulate information without physical impact.

That is what Landauer discovered. I think this is pretty strong evidence that information is physical. "Information is Physical" is also the title of his article.
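
A minimal sketch of that minimum, assuming room temperature and an example 1 GB erasure:

```python
# Minimal sketch: the Landauer limit, Q = k_B * T * ln(2), the smallest heat
# that erasing one bit can release. Temperature and memory size are assumed
# example values.
import math

K_B = 1.381e-23   # Boltzmann constant, J/K
T = 300.0         # assumed room temperature, K

q_min = K_B * T * math.log(2)   # minimum heat per erased bit, J
print(f"At {T:.0f} K: at least {q_min:.3e} J per erased bit")
print(f"Erasing 1 GB (8e9 bits): at least {8e9 * q_min:.3e} J")
```

The floor is tiny compared with what real hardware dissipates, but it is a floor no technology can get under.
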

We'll agree to disagree about everything being information.

Ok. But I never said that everything is information.

Ciao

- viole
 

A Vestigial Mote

Well-Known Member
It's one of the most perversely propagated scientific misunderstandings in the world. The concept of entropy has nothing to do with order or disorder whatsoever.

I think a lot of people take "disorder" to mean chaotic, or out of control, when all it really means as applied to entropy is "less orderly" - and even then, only in the sense that the parts within the system are less constrained by any specific "order" of said parts. Energy-wise, this "disorder" is a by-product of dispersal or dissemination... the idea being that the energy follows fewer patterns, separations or focal points, even as the "share" of energy per part of the system becomes more evenly distributed.
 

Revoltingest

Pragmatic Libertarian
Premium Member
True. But they need at least passive coolers. The heat produced by a computation is always higher than what is needed merely to operate the machine. This extra heat has to do with information manipulation only. Every time a bit is deleted, no matter how, an absolute minimal amount of heat is produced. Ergo, you cannot possibly manipulate information without physical impact.

That is what Landauer discovered. I think this is pretty strong evidence that information is physical. "Information is Physical" is also the title of his article.



Ok. But I never said that everything is information.

Ciao

- viole
Processing information (e.g., by computers, by brains, by printing) is always a
physical process of some kind, so energy will be expended. But this use of
energy does not mean that energy is information.

Silly analogy time....
If the fact that processing information uses energy means that energy is information,
then the fact that all drivers use vehicles means that all vehicles are drivers.
 

viole

Ontological Naturalist
Premium Member
Processing information (e.g., by computers, by brains, by printing) is always a
physical process of some kind, so energy will be expended. But this use of
energy does not mean that energy is information.

Silly analogy time....
If the fact that processing information uses energy means that energy is information,
then the fact that all drivers use vehicles means that all vehicles are drivers.

Maybe I was not sufficiently clear. I think.

Cancelling a bit on a computer, or anywhere else, produces intrinsic entropy. It is a minimal amount of entropy that is generated independently of any technology you want to deploy. It is a physical, inviolable and intrinsic limit. It has only to do with the bit being removed, not with the means used to delete it. And it does not matter how that bit is stored. That minimal amount is the same. It is entropy contained in that "abstract" bit, so to speak.

It has nothing to do with the energy used to operate transistors and stuff. That comes on top of that.

For all practical purposes, deleting a bit without generating entropy is equivalent to a miracle.

The logical conclusion is obvious. Information is, independently of the means used to store it and process it, intrinsically physical.

Ciao

- viole
 

Polymath257

Think & Care
Staff member
Premium Member
Are you saying that once a scientific word becomes commonly used it loses its standing in science? That in science "atom" no longer refers to "the smallest constituent unit of ordinary matter that has the properties of a chemical element"?
Source: various sites

Or that in science "calorie" no longer refers to "the amount of heat required at a pressure of one atmosphere to raise the temperature of one gram of water one degree Celsius that is equal to about 4.19 joules"?
Source: Merriam-Webster Dictionary


Don't forget that the *original* definition of an atom was 'the smallest possible piece of matter', which is quite distinct from 'the smallest constituent of ordinary matter that has the properties of a chemical element'. To use the old definition would lead to considerable confusion. Be careful when reading Lucretius!
 

sayak83

Veteran Member
Staff member
Premium Member
I agree, in principle. At bottom, it all depends on the definition of disorder.

Entropy is information. Ergo, information is physical. Basically, it is the minimal number of bits that can completely specify the state of a system. Maximal entropy is the maximum number of bits that can specify the state of the system.

In other words: maximum entropy is the storage size (in bits) that can be used to specify any state of the system. The current entropy is the number of bits of that storage actually used.

Therefore, it is entirely possible that a system with a small storage and in equilibrium (maximum storage used) uses fewer bits than another system, very far from equilibrium, with a much higher storage capacity. The latter (more bits, and therefore more entropy, even if far from equilibrium) would then be more "disordered" than the former, which seems absurd.

Another mistake popular science makes is to declare that entropy can only increase or remain constant. This is simply not true. There is no law of physics that prohibits a system from spontaneously losing entropy. It is just very unlikely if the system is far from equilibrium. But when it is in equilibrium, for instance, entropy could very easily decrease (even if not by much, probably).

And if we wait long enough, any initial configuration will be repeated. The Poincaré recurrence theorem shows why that must happen.

Now, science says that while entropy increases, information remains constant. The solution of this apparent paradox is left as a simple exercise to the reader.

Ciao

- viole

My understanding of Shannon entropy is poor. It has some similarities with thermodynamic entropy, but an explicit equivalence has never been shown. The concept of information is inherently observer-based, and Shannon entropy is as well. However, thermodynamic entropy can be defined in terms of physical states alone, without recourse to any observer. This difference makes it difficult to find a formal and generalizable equivalence that is more than hand-wavy.
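
For what it is worth, the formal parallel is easy to state: Shannon's H and Gibbs's S are the same functional of a probability distribution, differing only by the constant factor k_B ln 2. A minimal sketch (the four-state distribution is an arbitrary assumption):

```python
# Minimal sketch: Shannon entropy and Gibbs entropy computed from the same
# probability distribution differ only by the constant factor k_B * ln(2).
import math

K_B = 1.381e-23   # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy in J/K: S = -k_B * sum(p * ln(p))."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]   # assumed example distribution over four states
H = shannon_entropy(p)
S = gibbs_entropy(p)
print(f"H = {H:.3f} bits, S = {S:.3e} J/K")
print(f"S / (k_B ln 2) = {S / (K_B * math.log(2)):.3f}  (equals H)")
```

The observer-dependence question is then about where the distribution comes from, and no calculation settles that.
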
 

Polymath257

Think & Care
Staff member
Premium Member
Entropy isn't quite as easy to define as even this suggests.

For a classical entropy paradox, suppose we consider a container divided in two by a separator. We have three scenarios:

1. Both sides have the same gas inside. We remove the separator and the entropy doesn't change.

2. The sides have *different* gases inside. We remove the separator, the gases mix and the entropy increases.

3. The sides have different gases inside, but we don't currently have the technology to detect the difference. The remarkable thing is that it is perfectly consistent to calculate the entropy in two ways: one in which there is only one gas and the other where there are two gases. ALL subsequent calculations will be consistent with observations as long as we are consistent about the distinguishability of the gases.

The point is that entropy isn't just a calculation of the number of quantum states. It is a calculation of the number of *distinguishable* quantum states. The difference shows up in a number of the calculations of entropy in practice (there are factorials involved that are required for consistency with observations).
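
A minimal sketch of scenarios 1 and 2, using the classical ideal-gas mixing formula; one mole per side is an assumed example, and the zero answer for the same-gas case is exactly the factorial (Gibbs) correction at work:

```python
# Minimal sketch: ideal entropy of mixing on removing the separator,
# dS = -n*R*(x1*ln(x1) + x2*ln(x2)), applied only when the gases are
# treated as distinguishable. Mole amounts are assumed examples.
import math

R = 8.314   # gas constant, J/(mol K)

def mixing_entropy(n1, n2, distinguishable):
    """Entropy change on removing the separator, J/K."""
    if not distinguishable:
        return 0.0   # same gas on both sides: no change (scenario 1)
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

# One mole on each side (assumed example amounts):
print(f"Different gases: dS = {mixing_entropy(1.0, 1.0, True):.2f} J/K")
print(f"Same gas:        dS = {mixing_entropy(1.0, 1.0, False):.2f} J/K")
```
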
 

Revoltingest

Pragmatic Libertarian
Premium Member
Maybe I was not sufficiently clear. I think.

Cancelling a bit on a computer, or anywhere else, produces intrinsic entropy. It is a minimal amount of entropy that is generated independently of any technology you want to deploy. It is a physical, inviolable and intrinsic limit. It has only to do with the bit being removed, not with the means used to delete it. And it does not matter how that bit is stored. That minimal amount is the same. It is entropy contained in that "abstract" bit, so to speak.
Entropy is increased (not "expended") when any work is done in the physical world.
The amount of energy transferred doesn't matter. Even a minuscule change in an
electronic, pneumatic, mechanical, or photonic computing device is work done.
(It takes energy to change the state.)
This is so whether it's to store, change or retrieve a bit or more of information.
Electrons & photons don't care what purpose they serve, be it about information,
laundering a kilt, charging a battery, tanning a fanny, or heating up haggis.
(I once worked a bit with pneumatic logic circuits to control machinery....weird, eh.)
It has nothing to do with the energy used to operate transistors and stuff. That comes on top of that.
The information aspect is irrelevant, as I explained above.
It's all about an irreversible physical process...both on the
quantum & the macro (ie, classical thermodynamic) level.
The logical conclusion is obvious. Information is, independently of the means used to store it and process it, intrinsically physical.

Ciao

- viole
The concept of information is a human-created one, much like beauty.
Before humans or any other critter experienced a thought, irreversible
physical processes in any system were increasing entropy.
 

Revoltingest

Pragmatic Libertarian
Premium Member
Entropy isn't quite as easy to define as even this suggests.

For a classical entropy paradox, suppose we consider a container divided in two by a separator. We have three scenarios:

1. Both sides have the same gas inside. We remove the separator and the entropy doesn't change.

2. The sides have *different* gases inside. We remove the separator, the gases mix and the entropy increases.

3. The sides have different gases inside, but we don't currently have the technology to detect the difference. The remarkable thing is that it is perfectly consistent to calculate the entropy in two ways: one in which there is only one gas and the other where there are two gases. ALL subsequent calculations will be consistent with observations as long as we are consistent about the distinguishability of the gases.

The point is that entropy isn't just a calculation of the number of quantum states. It is a calculation of the number of *distinguishable* quantum states. The difference shows up in a number of the calculations of entropy in practice (there are factorials involved that are required for consistency with observations).
I was never comfortable with being taught that mixing the gases increases entropy.
The explanation I heard was that it would require work to separate them.
But I never bothered to investigate it further to become comfortable.
 

Polymath257

Think & Care
Staff member
Premium Member
My understanding of Shannon entropy is poor. It has some similarities with thermodynamic entropy, but an explicit equivalence has never been shown. The concept of information is inherently observer-based, and Shannon entropy is as well. However, thermodynamic entropy can be defined in terms of physical states alone, without recourse to any observer. This difference makes it difficult to find a formal and generalizable equivalence that is more than hand-wavy.

Well, Shannon entropy is also linked to the number and probabilities of available states. And, as viole has been saying, there are thought experiments with Maxwell's demon that suggest a stronger link between the two. There is even a way of interpreting entropy as the change in information between the quantum-level description and the macroscopic description of a situation.

The thermodynamic entropy uses the number of quantum states that are 'equivalent' to a given macroscopic situation. It is that number that is manipulated to get the thermodynamic entropy from statistical mechanics. The difficulty is the term 'equivalent'. That is where distinguishability of macroscopic states comes into play. This is related to questions of the different thermodynamic 'ensembles' for a physical situation.
 

Revoltingest

Pragmatic Libertarian
Premium Member
Well, Shannon entropy is also linked to the number and probabilities of available states. And, as viole has been saying, there are thought experiments with Maxwell's demon that suggest a stronger link between the two. There is even a way of interpreting entropy as the change in information between the quantum-level description and the macroscopic description of a situation.

The thermodynamic entropy uses the number of quantum states that are 'equivalent' to a given macroscopic situation. It is that number that is manipulated to get the thermodynamic entropy from statistical mechanics. The difficulty is the term 'equivalent'. That is where distinguishability of macroscopic states comes into play. This is related to questions of the different thermodynamic 'ensembles' for a physical situation.
I have some steam engines from the time of Clausius.
Watching one run, we observe entropy increasing.
(Entropy smells like oil & humid air. And it makes a hissing sound.)
But no information is deleted, stored, or added.
Shannon entropy is just an analog of thermodynamic entropy....like electricity & hydraulic fluid.
 

Polymath257

Think & Care
Staff member
Premium Member
I have some steam engines from the time of Clausius.
Watching one run, we observe entropy increasing.
(Entropy smells like oil & humid air. And it makes a hissing sound.)
But no information is deleted, stored, or added.
Shannon entropy is just an analog of thermodynamic entropy....like electricity & hydraulic fluid.

Well, thermodynamic entropy *does* have a description at the molecular level. It isn't *just* the integral of dQ_rev/T. That is a decent macroscopic description, but there is also a good microscopic description in terms of counting quantum states.
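
As a toy version of that microscopic counting, here is a minimal sketch with N two-level "spins": the macrostate "k spins up" is compatible with Omega = C(N, k) microstates, and its Boltzmann entropy is S = k_B ln Omega (N = 100 is an arbitrary choice):

```python
# Minimal sketch: counting states for a toy two-level system of N spins.
# The number of microstates with k spins up is C(N, k), and the Boltzmann
# entropy of that macrostate is S = k_B * ln(Omega). N is an assumed example.
import math

K_B = 1.381e-23   # Boltzmann constant, J/K
N = 100           # number of two-level particles (arbitrary toy size)

for k in (0, 25, 50):
    omega = math.comb(N, k)     # microstates consistent with "k spins up"
    S = K_B * math.log(omega)   # Boltzmann entropy, S = k_B * ln(Omega)
    print(f"k = {k:3d}: Omega = {omega:.3e}, S = {S:.3e} J/K")
```
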
 

Revoltingest

Pragmatic Libertarian
Premium Member
Well, thermodynamic entropy *does* have a description at the molecular level. It isn't *just* the integral of dQ_rev/T. That is a decent macroscopic description, but there is also a good microscopic description in terms of counting quantum states.
Too many for me to count!
 