Short Rant about Entropy and the Universe

One of the most important laws of Physics is one most of us have heard of at some point – the second law of thermodynamics.

This law states that the entropy of a closed system – of which the Universe is the ultimate example – will always increase (or, at best, stay the same).

A common misconception is that entropy is a measure of disorder. A “disordered” state does not necessarily have high entropy, and vice versa. Entropy is rather a measure of the number of ways the particles of a system can be arranged. We can take tea and milk as an example, as many people do. At the instant you pour milk into tea, the system has low entropy, because the milk molecules are virtually sitting on top of the tea molecules. Wait a second or two for the milk to blend into the tea, and the entropy of the system increases, because there are vastly more ways for the milk and tea molecules to arrange themselves once mixed than there are with the milk stacked on top of the tea.

Boltzmann’s equation gives the entropy S as Boltzmann’s constant k_B multiplied by the natural logarithm of W, the number of ways the particles can be arranged:

S = k_B ln W

In other words:

Fewer ways to arrange the particles = low entropy.

More ways to arrange the particles = high entropy.
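As a minimal Python sketch of this relation (the values of W below are invented purely for illustration; only the formula itself comes from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin


def entropy(w: int) -> float:
    """Entropy S = k_B * ln(W) for a system with W possible arrangements."""
    return K_B * math.log(w)


# Fewer arrangements -> lower entropy; more arrangements -> higher entropy.
s_low = entropy(10)       # only 10 ways to arrange the particles
s_high = entropy(10**23)  # astronomically many ways
print(s_low < s_high)     # True
```

Note that a system with exactly one possible arrangement (W = 1) has zero entropy, since ln 1 = 0.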

Physics tells us that the entropy of the Universe must always stay the same or increase. If I have a tidy room and move out for a week, I know that the room’s entropy will have increased when I return. Why? Because my little brother will have come in and ruined my perfectionist ideals. That, however, does not stop me from tidying it all again. When the room is tidy again, has entropy decreased? Not overall – you can’t violate the laws of physics. Tidying the room takes energy from my muscles and brain, and dissipating that energy (mostly as heat) increases the entropy of my surroundings. Even though entropy may decrease in one area, it increases by the same amount or more in another, ensuring that the law is not violated.

You might ask: why must entropy increase? The answer to this question hints at why some people like to use “disorder” to describe entropy to a layperson. High entropy is simply statistically more likely than low entropy, as with the milk and tea.
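This statistical bias can be made concrete with a toy counting model (the numbers – 10 milk molecules spread over 20 positions in the cup – are my own invention, not from the article):

```python
import math

# Toy model: 10 indistinguishable milk molecules distributed
# over 20 positions in the cup.
positions, milk = 20, 10

# Exactly one arrangement has all the milk "sitting on top":
stacked = 1

# Total number of ways to place the milk molecules anywhere:
total = math.comb(positions, milk)

print(total)            # 184756 possible arrangements
print(stacked / total)  # chance of the stacked state, about 5.4e-6
```

Even with just 10 molecules the “milk on top” state is one arrangement in roughly 185,000; with realistic molecule counts (~10^23) the mixed states outnumber it beyond all imagining.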


Entropy is, in effect, the spreading out of energy. In a thermodynamic sense, where we interpret the energy as heat, increasing entropy spreads heat out. This makes sense, right? In the milk and tea analogy, the milk is cold and the tea is hot, so there is a temperature difference. When the milk is poured into the hot tea, heat energy from the tea spreads through the milk, and after a couple of seconds the liquid reaches a fairly uniform temperature – energy spreading out and entropy increasing. You never perceive the opposite happening – the mixture separating back into milk and plain tea. It is possible in principle for the molecules to return to their previous configuration, but the statistical likelihood is so incredibly low that it would not happen in our lifetime, or in the lifetime of the Universe.
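As a rough back-of-the-envelope sketch of the heat spreading out (the masses and temperatures here are my own assumptions, and both liquids are approximated with water’s specific heat capacity):

```python
C_WATER = 4.186  # specific heat of water, J / (g * K)


def final_temp(m1: float, t1: float, m2: float, t2: float,
               c: float = C_WATER) -> float:
    """Equilibrium temperature when two liquids mix and heat spreads out.

    Heat lost by the hot liquid equals heat gained by the cold one,
    so the result is a mass-weighted average of the two temperatures.
    """
    return (m1 * c * t1 + m2 * c * t2) / ((m1 + m2) * c)


# 200 g of tea at 80 C mixed with 30 g of cold milk at 5 C:
t = final_temp(200, 80.0, 30, 5.0)
print(round(t, 1))  # 70.2 -> one uniform temperature, not two
```

The single uniform temperature is the high-entropy state; two separate temperatures in the same cup would be the low-entropy one.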


Many physicists like the idea of the arrow of time, and constantly use entropy to explain it. The forward arrow of time corresponds to a Universe of increasing entropy. The microscopic laws of physics are (to an excellent approximation) time-reversible: run the clock backwards or forwards and you notice nothing different, because at that level there simply is no such distinction – much as an object in deep space, far from the sight of Earth or any other planet, has no way to tell which direction is up. In the macroscopic world, however, there is a clear distinction between the two: an egg cannot (or rather, is overwhelmingly unlikely to) un-crack itself and return to the shelf. This is the arrow of time as described by entropy. We don’t yet know why the arrow of time behaves this way; to find out, physicists will have to figure out why the Universe was in such a low-entropy state at the Big Bang.

Author – Susan Chen

Susan is a 6th year high school student currently studying three STEM subjects at Scottish Advanced Higher level – Mathematics, Physics and Chemistry. She particularly loves ideas in cosmology and hopes to embark on an academic journey in the area of Physics.

3 thoughts on “Short Rant about Entropy and the Universe”

  1. Shantanu August 23, 2017 / 2:34 pm

    I didn’t quite understand why entropy isn’t related to disorder. I have compared them in my posts, and would like to know if I was really wrong. In my second year of my Bachelor’s, thermodynamics was just the study of the three laws, and our texts did describe entropy as a measure of disorder. I don’t dispute the interpretation that it is a measure of the number of possible arrangements, just as it is also a measure of useful information. Also, as entropy increases, the useful information available decreases, and that too could be called a sort of disorder. Would love to hear your side on it. Nice article, by the way.

    • Susan Chen August 23, 2017 / 6:39 pm

      Hi Shantanu,
      Thank you very much!
      Many people do use disorder when talking about entropy – I’m aware it is also used quite frequently at undergraduate level – but this equation of entropy with disorder is indeed misleading. An example is the formation of crystals: crystal lattices are precise, very orderly arrangements of atoms that formed from a disordered past. Entropy is really a measure of the number of ways to arrange particles, or, I’d say, of the spreading out of energy, rather than of disorder itself. Disorder is more of a common accompaniment, in that when there are more ways of arranging something it tends to jumble, but essentially that is not entropy. Again, this is all from what I have read, and I would direct you to Frank Lambert’s “Disorder – A Cracked Crutch for Supporting Entropy” if you are interested: https://www.researchgate.net/publication/231265843_Disorder_-_A_Cracked_Crutch_for_Supporting_Entropy_Discussions

      • Shantanu August 23, 2017 / 6:46 pm

        I just read the abstract. Honestly, it’s hard for me to get hold of – or more accurately, let go of – the idea of disorder as entropy. All these authors, doctors, and scientists are perhaps wrong. But I will definitely talk about this with my peers and research it more. If I am indeed wrong, it is better to avoid further mistakes. Thanks 😊
