From the big bang to the heat death of the universe
You are probably already familiar with the concepts of entropy and the second law of thermodynamics. These are key concepts in thermodynamics classes, but entropy is a term we all struggle with at some point in our studies. As my statistical physics professor used to say:
There are only 4 or 5 people in this world who really understand entropy and I'm not one of them.
—Nicolas Sator—
In fact, unlike other physical quantities like mass or energy, entropy seems to be a subjective quantity that depends on the macroscopic variables that the observer has selected or has access to. However, the second law appears to be a fundamental law that governs our universe, on a par with the conservation of energy. And every time I delved into these things, I got more and more confused...
Entropy was originally introduced by Clausius in the early 1850s to describe the energy lost in irreversible processes, which was very useful for predicting the spontaneous evolution of systems (e.g. chemical reactions, phase transitions, etc.). At the time, however, it was more of an abstract mathematical artifact, lacking any formalism that could explain what entropy fundamentally represents. In 1877, Boltzmann, the founder of statistical thermodynamics, proposed an elegant formalization of entropy. In short, he defined the entropy S as a measure of the number of possible microscopic arrangements (microstates) of a system that are compatible with the system's observed macroscopic condition (its macrostate), e.g. its temperature, pressure, or energy.
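This is captured by Boltzmann's famous entropy formula:

```latex
S = k_B \ln \Omega
```

where k_B is the Boltzmann constant and Ω is the number of microstates compatible with the observed macrostate.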
In other words, the Boltzmann entropy represents the hidden information of a system: the larger Ω, the less we know about its true microstate. For example, black holes are among the highest-entropy objects in the universe, as their macrostate is defined only by their mass, charge, and spin. Since these are the only variables we have access to, there is an enormous number of ways to arrange the matter inside them.
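For a sense of scale, black hole entropy follows the Bekenstein-Hawking formula (a standard result, stated here without derivation):

```latex
S_{BH} = \frac{k_B \, c^3 A}{4 G \hbar}
```

where A is the area of the event horizon. For a black hole of one solar mass, this already amounts to roughly 10⁷⁷ k_B, many orders of magnitude more than the entropy of an ordinary star of the same mass.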
Since the mid-20th century, the concept of entropy has also found applications in information theory and quantum mechanics, but this article focuses on entropy in the context of statistical thermodynamics.
One of the most popular beliefs about entropy is that it represents disorder. This comes from the intuition that what we perceive as "disordered" systems can generally realize many more possible configurations than ordered systems, which therefore have lower entropy. However, as I will show below, the notion of order is subjective, and there are several counter-intuitive examples that show why describing entropy as disorder can lead to confusion:
- We generally consider the crystalline form of a system to be more ordered than its liquid form. Nevertheless, there are a few systems whose crystalline form has a higher entropy than their liquid phase under the same thermodynamic conditions, which seems paradoxical at first (see the crystalline phases of densely packed tetrahedra [1]).
- Contrary to popular belief, evenly distributed matter (which is generally perceived as disordered) is unstable when the interactions are dominated by gravity (Jeans instability) and is actually the most improbable state, i.e. with very low entropy. The most likely states with high entropy are those in which matter clumps together into massive objects.
A popular statement of the second law of thermodynamics is "The entropy of a closed system can only increase." But that is not quite what the second law says. A more accurate wording would be:
When a thermally isolated system transitions from a state of thermodynamic equilibrium A to another state of thermodynamic equilibrium B, the increase in its thermodynamic entropy S is greater than or equal to zero.
The main takeaway here is that entropy is not a properly defined quantity outside of thermodynamic equilibrium, a topic that is still actively debated among researchers [2]. Nonetheless, a non-equilibrium entropy can be defined for specific systems where we have a set of macroscopic variables that we can monitor continuously at any point in time. The entropy is then defined from the number of possible microstates compatible with these macrovariables, and the second law follows naturally from probabilistic reasoning (the H-theorem) [3].
A system will spend most of its time in its most probable states, that is, those compatible with the largest number of microstates.
For example, for a simple system of particles bouncing around a box, these macrovariables could be the number of particles in each "cell" of a grid covering the box. The most likely configurations are then those with homogeneously distributed particles, i.e. with the same number of particles in each cell. To visualize this, I ran a simple simulation with 100 particles starting in a corner and spreading across space (interacting via the Lennard-Jones potential).
When the particles are initialized in the corner, the system starts from a very unlikely state (p = 1/4¹⁰⁰ ≈ 1/10⁶⁰) with low entropy. The particles then scatter through space and the entropy increases rapidly. But sometimes, by sheer coincidence, more particles than expected end up in one corner, temporarily decreasing the entropy (see the fluctuation theorem for a quantitative formalism of this phenomenon). Hence my earlier point about the second law.
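As a minimal sketch of how such a coarse-grained entropy can be computed (the 2x2 grid, unit box, and k_B = 1 units are my illustrative assumptions, not necessarily the exact setup used for the animation), one can count how many particles fall into each cell and take S = ln Ω, where Ω is the multinomial number of ways the particles could be assigned to cells while reproducing those counts:

```python
import numpy as np
from math import lgamma

def coarse_grained_entropy(positions, box_size=1.0, cells_per_side=2):
    """Boltzmann entropy S = ln(Omega) (in units of k_B) of a 2D particle
    configuration, coarse-grained on a cells_per_side x cells_per_side grid.
    Omega = N! / (n_1! n_2! ... n_k!) counts the particle-to-cell assignments
    compatible with the observed cell occupation numbers."""
    n = len(positions)
    # Map each (x, y) position to a grid cell index.
    idx = np.clip((positions / box_size * cells_per_side).astype(int),
                  0, cells_per_side - 1)
    cell_ids = idx[:, 0] * cells_per_side + idx[:, 1]
    counts = np.bincount(cell_ids, minlength=cells_per_side ** 2)
    # ln(N!) - sum_i ln(n_i!), using log-gamma to avoid overflow.
    return lgamma(n + 1) - sum(lgamma(c + 1) for c in counts)

rng = np.random.default_rng(0)

corner = rng.uniform(0.0, 0.5, size=(100, 2))  # all 100 particles in one quadrant
spread = rng.uniform(0.0, 1.0, size=(100, 2))  # particles spread over the whole box

print(coarse_grained_entropy(corner))  # 0: only one cell occupied, Omega = 1
print(coarse_grained_entropy(spread))  # ~130, near the 2x2 maximum ln(100!/(25!)^4) ~ 131.7
print(0.25 ** 100)                     # ~6e-61: chance of all particles in one given quadrant
```

In the spread-out case, this recovers the familiar statement above: the homogeneous macrostate is the one compatible with the largest number of microstates.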
Here I defined some arbitrary "cells" to calculate the entropy. This aspect is actually much more subtle: what happens when you shrink the grid cells? How does this change the calculated entropy? And how can this be generalized to any physical system? Check the second part of this article for further discussion.
From the second law of thermodynamics it follows that the entropy of the universe has increased and will continue to increase until it reaches thermodynamic equilibrium, its state of maximum entropy. Interestingly, this irreversibility generates an asymmetry in the flow of time (in contrast to the three spatial dimensions, where both directions are symmetrical). If we trace the universe back to its beginnings, we can conclude that it started in a state of extraordinarily and surprisingly low entropy [4]. So how can we explain this?
Part of the answer is cosmic inflation. In its earliest moments, the universe entered an exponential growth phase dominated by a very high cosmological constant (~20 orders of magnitude higher than its current value). During this phase, matter was being diluted too rapidly for gravity to play much of a role, and so the state of maximum entropy during cosmic inflation was an even distribution of matter. However, after about 10⁻³⁵ seconds of expansion, during which the universe grew by a factor of 10²⁶, the cosmological constant suddenly dropped to its present value and the exponential expansion stopped [5].
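As a quick consistency check on these numbers (my own back-of-the-envelope arithmetic, not taken from [5]), an expansion factor of 10²⁶ corresponds to

```latex
N = \ln\!\left(10^{26}\right) \approx 60 \ \text{e-folds}
```

which is roughly the figure usually quoted as the minimum amount of inflation needed to explain the observed flatness and homogeneity of the universe.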
After this event, gravity could finally start to act, and the universe was no longer in thermodynamic equilibrium. Matter began to clump together into what would later become stars, galaxies, and black holes, which greatly increased the entropy of the universe (remember that evenly distributed matter is unstable when gravity dominates the interactions, and therefore generally has low entropy).
As for the future of our universe, it is only a matter of time before all matter collapses into black holes, which then evaporate through Hawking radiation. The last black holes should have evaporated after ~10¹⁰⁰ years. After this point, the universe will be composed mostly of photons and neutrinos, very close to its state of maximum entropy, usually described as heat death.
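The ~10¹⁰⁰-year figure can be recovered from the standard estimate of a black hole's Hawking evaporation time (an order-of-magnitude sketch; the assumption that the largest black holes reach roughly 10¹¹ solar masses is mine):

```latex
t_{\mathrm{evap}} \approx \frac{5120 \, \pi \, G^2 M^3}{\hbar c^4}
\approx 2 \times 10^{67} \,\text{yr} \times \left(\frac{M}{M_\odot}\right)^{3}
```

For a supermassive black hole of ~10¹¹ solar masses, this gives on the order of 10¹⁰⁰ years, consistent with the figure quoted above.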
There is an important subtlety in the second law of thermodynamics. As we discussed above, on unimaginably large timescales, entropy can spontaneously decrease to a low value simply for statistical reasons. For example, the particles in your room could spontaneously end up in one corner if you waited a very, very, very long time.
Formally, this is known as the Poincaré recurrence theorem, which states that certain dynamical systems always return arbitrarily close to their initial (low-entropy) state after a finite time. This does not violate the second law, which simply states that "a system spends most of its time in its most likely states." In other words, the system will only spend a tiny fraction of its time in one of these low-entropy states. So the second law is about statistics, not deterministic predictions. However, since the timescales involved in these Poincaré recurrences are typically much larger than the age of the universe, the second law becomes deterministic in practice, and we recover the formalism of classical thermodynamics introduced by Clausius and Callen.
Interestingly, some string theory formalisms that follow the same idea suggest that any macroscopic object of any size could spontaneously appear in a vacuum through quantum fluctuation. For example:
- A human body could appear somewhere in our observable universe ~10^(10⁶⁹) years after heat death is reached (cf. the Boltzmann brain). The same calculation applies to any human-sized combination of atoms.
- A new universe could appear after ~10^(10^(10⁵⁶)) years [6]. By new universe, I mean about 10⁸⁰ atoms packed into a very small volume, creating a state in that region similar to our early observable universe (a rough intuition for such exponential-tower numbers follows below).
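A heuristic way to see where these exponential-tower timescales come from (a rough statistical estimate, not the detailed calculation of [6]): in equilibrium, the probability of a fluctuation that lowers the entropy by ΔS is exponentially suppressed, so the expected waiting time scales as

```latex
t_{\mathrm{wait}} \sim \tau \, e^{\Delta S / k_B}
```

where τ is some microscopic timescale. Once ΔS itself is an astronomically large number, the value of τ barely matters and the waiting time takes the form of a tower of exponents.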
Boltzmann himself envisioned that our universe might have reached thermodynamic equilibrium and its maximum-entropy state a long time ago, and that, for statistical reasons, after an extremely long time, a spontaneous decrease in entropy down to the level of our early universe occurred. However, the timescales involved in these calculations are so absurdly large and abstract that one might wonder whether they make any sense at all.
Here I have tried to provide an intuitive understanding of entropy and the second law. However, some issues have not been addressed in this article. For example, when we calculated the entropy of the particles expanding in a box, we arbitrarily defined a 2x2 grid to calculate the entropy. We would have obtained different results if we had used other grids, or indeed any other quantity we chose to track. Hence, entropy appears to be a subjective quantity that depends on the macroscopic variables that the observer decided to monitor. So how could we objectively define a set of macrovariables to compute it?
In fact, digging deeper into this question is quite complicated, and we are only scratching the surface here. Check out the second part of this article to go deeper!
I am grateful to Bastien Marguet, Hugo Belleza, Clement Quintard, and Janos Madar for their valuable comments, corrections, and suggestions, which led to several major revisions of this article.
I am a PhD student in AI and health, currently working at IBM Research, and I hold a master's degree in quantum physics. In my spare time, I am a blockchain developer (Web3) and CTO at Peer2Panel, a blockchain startup focused on renewable energy.
I enjoy writing in-depth articles on various topics such as the universe, blockchain, and AI. Unfortunately, writing these kinds of articles is quite demanding, and I wish I had the time to write more. If you are thinking of subscribing to Medium and enjoy reading these kinds of articles, please consider using my referral link! You would be supporting me directly with a part of your subscription. If so, thank you so much!
https://medium.com/membership/@aurelien-pelissier
[1] Haji-Akbari, Amir, et al. "Disordered, quasicrystalline and crystalline phases of densely packed tetrahedra." Nature 462.7274 (2009): 773–777. (https://www.nature.com/articles/nature08641)
[2] Šafránek, Dominik, Anthony Aguirre, and J. M. Deutsch. "Classical dynamical coarse-grained entropy and comparison with the quantum version." Physical Review E 102.3 (2020): 032106. (https://arxiv.org/pdf/1905.03841.pdf)
[3] Jaynes, Edwin T. "Gibbs vs Boltzmann entropies." American Journal of Physics 33.5 (1965): 391–398. (https://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf)
[4] Egan, Chas A., and Charles H. Lineweaver. "A larger estimate of the entropy of the universe." The Astrophysical Journal 710.2 (2010): 1825. (https://arxiv.org/pdf/0909.3983.pdf)
[5] Patel, Vihan M., and Charles H. Lineweaver. "Solutions to the cosmic initial entropy problem without equilibrium initial conditions." Entropy 19.8 (2017): 411. (https://arxiv.org/ftp/arxiv/papers/1708/1708.03677.pdf)
[6] Carroll, Sean M., and Jennifer Chen. "Spontaneous inflation and the origin of the arrow of time." arXiv preprint hep-th/0410270 (2004). (https://arxiv.org/abs/hep-th/0410270)