Thermodynamics Entropy Quotes

We've searched our database for all the quotes and captions related to Thermodynamics Entropy. Here they are! All 91 of them:

Once I got home, I sulked for a while. All my brilliant plans foiled by thermodynamics. Damn you, Entropy!
Andy Weir (The Martian)
Some say Karma is a bitch. It all comes back to you, eventually. If this world can be boiled down to two truths, it would be the second law of thermodynamics (entropy) and the law of action and reaction (Karma). Well, if Karma really makes up the fabric of this Universe, if it is really a bitch, then prove it in this life itself! Why wait for reincarnation? Do you think I would be me once I die and transition to another body? No! What makes me, me, are my memories.
Abhaidev (The World's Most Frustrated Man)
The fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat. 1. The energy of the universe is constant. 2. The entropy of the universe tends to a maximum.
Rudolf Clausius (The Mechanical Theory of Heat)
The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation - well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
Arthur Stanley Eddington (New Pathways in Science)
The Second Law of Thermodynamics defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order. An underappreciation of the inherent tendency toward disorder, and a failure to appreciate the precious niches of order we carve out, are a major source of human folly.
Steven Pinker
The universe tended towards chaos and entropy. That was basic thermodynamics. Maybe it was basic existence too.
Matt Haig (The Midnight Library)
We all behave like Maxwell’s demon. Organisms organize. In everyday experience lies the reason sober physicists across two centuries kept this cartoon fantasy alive. We sort the mail, build sand castles, solve jigsaw puzzles, separate wheat from chaff, rearrange chess pieces, collect stamps, alphabetize books, create symmetry, compose sonnets and sonatas, and put our rooms in order, and all this we do requires no great energy, as long as we can apply intelligence. We propagate structure (not just we humans but we who are alive). We disturb the tendency toward equilibrium. It would be absurd to attempt a thermodynamic accounting for such processes, but it is not absurd to say we are reducing entropy, piece by piece. Bit by bit. The original demon, discerning one molecule at a time, distinguishing fast from slow, and operating his little gateway, is sometimes described as “superintelligent,” but compared to a real organism it is an idiot savant. Not only do living things lessen the disorder in their environments; they are in themselves, their skeletons and their flesh, vesicles and membranes, shells and carapaces, leaves and blossoms, circulatory systems and metabolic pathways - miracles of pattern and structure. It sometimes seems as if curbing entropy is our quixotic purpose in the universe.
James Gleick (The Information: A History, a Theory, a Flood)
Though I felt dissatisfied, at least I felt like somebody, a person, rather than a thing exemplifying the second law of thermodynamics (all order tends toward entropy, decay, etc.).
Paul Kalanithi
a hallmark of a living system is that it maintains or reduces its entropy by increasing the entropy around it. In other words, the second law of thermodynamics has a life loophole: although the total entropy must increase, it’s allowed to decrease in some places as long as it increases even more elsewhere. So life maintains or increases its complexity by making its environment messier.
Max Tegmark (Life 3.0: Being Human in the Age of Artificial Intelligence)
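
A compact way to write the loophole Tegmark describes (a standard textbook restatement, not a formula from the book itself):

    \[
      \Delta S_{\text{total}} \;=\; \Delta S_{\text{life}} + \Delta S_{\text{environment}} \;\ge\; 0 ,
    \]

so a local decrease in the entropy of the living system is allowed whenever the surroundings pick up at least as much entropy as the organism sheds.
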
A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’. Practically, a state of this kind is usually reached very rapidly. Theoretically, it is very often not yet an absolute equilibrium, not yet the true maximum of entropy. But then the final approach to equilibrium is very slow. It could take anything between hours, years, centuries,
Erwin Schrödinger (What is Life? (Canto Classics))
I know this may sound like an excuse," he said. "But tensor functions in higher differential topology, as exemplified by application of the Gauss-Bonnett Theorem to Todd Polynomials, indicate that cohometric axial rotation in nonadiabatic thermal upwelling can, by random inference derived from translational equilibrium aggregates, array in obverse transitional order the thermodynamic characteristics of a transactional plasma undergoing negative entropy conversions." "Why don't you just shut up," said Hardesty.
Mark Helprin (Winter's Tale)
The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced.
Jeremy Rifkin (The Third Industrial Revolution: How Lateral Power Is Transforming Energy, the Economy, and the World)
Our subjective sense of the direction of time, the psychological arrow of time, is therefore determined within our brain by the thermodynamic arrow of time. Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases. You can’t have a safer bet than that!
Stephen Hawking
Entropy—already a difficult and poorly understood concept—is a measure of disorder in thermodynamics, the science of heat and energy.
James Gleick (The Information: A History, a Theory, a Flood)
The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy which exists in plenty in any body in the form of heat, but a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.
Ludwig Boltzmann (The Second Law of Thermodynamics (Theoretical Physics and Philosophical Problems))
The second law of thermodynamics says that the entropy of a system increases with time i.e. disorder rises as you age, leading to frantic chaos at times. Be aware of it. Do not fight it.
Rajesh
Love takes its meaning from the manifold ways in which it is used, which are indefinite in number. You can never understand its meaning fully because you can never experience love in all its contexts. And so it is with energy and entropy.
Craig F. Bohren (Atmospheric Thermodynamics)
Clausius summarized his application of entropy to thermodynamics in two dramatic phrases that had a big impact at the time. They were (1) First Law: The energy of the universe is constant, and (2) Second Law: The entropy of the universe tends to a maximum.
Gino Segrè (A Matter of Degrees: What Temperature Reveals about the Past and Future of Our Species, Planet, and Universe)
The law that entropy always increases, holds, I think, the supreme position among the laws of Nature. … if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
Arthur Stanley Eddington (New Pathways in Science)
mathematician Steven Strogatz puts it . . . In every case, these feats of synchrony occur spontaneously, almost as if nature has an eerie yearning for order. And that raises a profound mystery: Scientists have long been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures that have somehow managed to assemble themselves. This enigma bedevils all of science today.2
Stephen Harrod Buhner (Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth)
Entropy is the loyal servant of the second law of thermodynamics. So, if we think of entropy as a character in our story, we should imagine it as dissolute, lurking, careless of others’ pain and suffering, not interested in looking you in the eye. Entropy is also very, very dangerous, and in the end it will get us all.
David Christian (Origin Story: A Big History of Everything)
It is very desirable to have a word to express the Availability for work of the heat in a given magazine; a term for that possession, the waste of which is called Dissipation. Unfortunately the excellent word Entropy, which Clausius has introduced in this connexion, is applied by him to the negative of the idea we most naturally wish to express. It would only confuse the student if we were to endeavour to invent another term for our purpose. But the necessity for some such term will be obvious from the beautiful examples which follow. And we take the liberty of using the term Entropy in this altered sense ... The entropy of the universe tends continually to zero.
Peter Guthrie Tait (Sketch Of Thermodynamics)
"THE WORLD IS APPROACHING TOWARDS ANARCHY": The second law of thermodynamics concludes that the entropy of the universe is increasing with time, which means that, with time, randomness or disorder is also increasing. Anarchy is the condition of excessive social entropy. Thus, we can conclude that the universe will finally reach the condition of anarchy.
Anup Joshi
This “Hawking temperature” of a black hole and its “Hawking radiation” (as they came to be called) were truly radical—perhaps the most radical theoretical physics discovery in the second half of the twentieth century. They opened our eyes to profound connections between general relativity (black holes), thermodynamics (the physics of heat) and quantum physics (the creation of particles where before there were none). For example, they led Stephen to prove that a black hole has entropy, which means that somewhere inside or around the black hole there is enormous randomness. He deduced that the amount of entropy (the logarithm of the hole’s amount of randomness) is proportional to the hole’s surface area. His formula for the entropy is engraved on Stephen’s memorial stone at Gonville and Caius College in Cambridge, where he worked. For the past forty-five years, Stephen and hundreds of other physicists have struggled to understand the precise nature of a black hole’s randomness. It is a question that keeps on generating new insights about the marriage of quantum theory with general relativity—that is, about the ill-understood laws of quantum gravity.
Stephen Hawking (Brief Answers to the Big Questions)
How would we express in terms of the statistical theory the marvellous faculty of a living organism, by which it delays the decay into thermodynamical equilibrium (death)? We said before: ‘It feeds upon negative entropy’, attracting, as it were, a stream of negative entropy upon itself, to compensate the entropy increase it produces by living and thus to maintain itself on a stationary and fairly low entropy level.
Erwin Schrödinger (What is Life? (Canto Classics))
There are at least three different arrows of time. First, there is the thermodynamic arrow of time, the direction of time in which disorder or entropy increases. Then, there is the psychological arrow of time. This is the direction in which we feel time passes, the direction in which we remember the past but not the future. Finally, there is the cosmological arrow of time. This is the direction of time in which the universe is expanding rather than contracting.
Stephen Hawking (A Brief History of Time)
The universal laws of nature including the thermodynamic principles of entropy govern the relationships between interconnected organisms. The notion of internal thermodynamic equilibrium assures us that the powerful energy reserves of one person will always rush in to fill the void or vacuum in another person. Thus I will always register your mystical presence in my quiescent mind, your hallow echo fills the hollow space of my very being. You are the external reflection of my innermost want, the personification of a world that lies outside my conscious reach, ethereal substance of the soul, the guiding hand that my unconscious mind instinctually gropes for in order to make me complete.
Kilroy J. Oldster (Dead Toad Scrolls)
This was a golden age, in which we solved most of the major problems in black hole theory even before there was any observational evidence for black holes. In fact, we were so successful with the classical general theory of relativity that I was at a bit of a loose end in 1973 after the publication with George Ellis of our book The Large Scale Structure of Space–Time. My work with Penrose had shown that general relativity broke down at singularities, so the obvious next step would be to combine general relativity—the theory of the very large—with quantum theory—the theory of the very small. In particular, I wondered, can one have atoms in which the nucleus is a tiny primordial black hole, formed in the early universe? My investigations revealed a deep and previously unsuspected relationship between gravity and thermodynamics, the science of heat, and resolved a paradox that had been argued over for thirty years without much progress: how could the radiation left over from a shrinking black hole carry all of the information about what made the black hole? I discovered that information is not lost, but it is not returned in a useful way—like burning an encyclopedia but retaining the smoke and ashes. To answer this, I studied how quantum fields or particles would scatter off a black hole. I was expecting that part of an incident wave would be absorbed, and the remainder scattered. But to my great surprise I found there seemed to be emission from the black hole itself. At first, I thought this must be a mistake in my calculation. But what persuaded me that it was real was that the emission was exactly what was required to identify the area of the horizon with the entropy of a black hole. This entropy, a measure of the disorder of a system, is summed up in this simple formula which expresses the entropy in terms of the area of the horizon, and the three fundamental constants of nature, c, the speed of light, G, Newton’s constant of gravitation, and ħ, Planck’s constant. The emission of this thermal radiation from the black hole is now called Hawking radiation and I’m proud to have discovered it.
Stephen Hawking (Brief Answers to the Big Questions)
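
The "simple formula" Hawking refers to is the Bekenstein-Hawking entropy; the excerpt does not print it, but in conventional notation (with k_B Boltzmann's constant and A the horizon area) it reads:

    \[
      S_{\mathrm{BH}} \;=\; \frac{k_B\, c^{3} A}{4\, G\, \hbar} .
    \]
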
It will be noticed that the fundamental theorem proved above bears some remarkable resemblances to the second law of thermodynamics. Both are properties of populations, or aggregates, true irrespective of the nature of the units which compose them; both are statistical laws; each requires the constant increase of a measurable quantity, in the one case the entropy of a physical system and in the other the fitness, measured by m, of a biological population. As in the physical world we can conceive the theoretical systems in which dissipative forces are wholly absent, and in which the entropy consequently remains constant, so we can conceive, though we need not expect to find, biological populations in which the genetic variance is absolutely zero, and in which fitness does not increase. Professor Eddington has recently remarked that 'The law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of nature'. It is not a little instructive that so similar a law should hold the supreme position among the biological sciences. While it is possible that both may ultimately be absorbed by some more general principle, for the present we should note that the laws as they stand present profound differences—-(1) The systems considered in thermodynamics are permanent; species on the contrary are liable to extinction, although biological improvement must be expected to occur up to the end of their existence. (2) Fitness, although measured by a uniform method, is qualitatively different for every different organism, whereas entropy, like temperature, is taken to have the same meaning for all physical systems. (3) Fitness may be increased or decreased by changes in the environment, without reacting quantitatively upon that environment. (4) Entropy changes are exceptional in the physical world in being irreversible, while irreversible evolutionary changes form no exception among biological phenomena. Finally, (5) entropy changes lead to a progressive disorganization of the physical world, at least from the human standpoint of the utilization of energy, while evolutionary changes are generally recognized as producing progressively higher organization in the organic world.
Ronald A. Fisher (The Genetical Theory of Natural Selection)
Loschmidt’s paradox: Yet if, as is widely assumed, a thermodynamic system is composed of many fundamental particles and a thermodynamic process is composed of many fundamental interactions, why are not all thermodynamic processes reversible? Johann Loschmidt (1821–1895) asked this question in 1876. We still have no fully satisfactory answer. That many reversible fundamental processes do not necessarily compose a reversible thermodynamic process is known as Loschmidt’s paradox or the reversibility paradox. Our failure to resolve Loschmidt’s paradox suggests that the laws governing the interactions of fundamental particles do not form a complete picture of nature and need to be supplemented with additional physics equivalent to the second law of thermodynamics.
Don S. Lemons (A Student's Guide to Entropy (Student's Guides))
The universe seeks equilibriums; it prefers to disperse energy, disrupt organisation, and maximise chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organise chemicals into compartments; we sort laundry on Wednesdays. "It sometimes seems as if curbing entropy is our quixotic purpose in the universe," James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses. The laws of nature still mark the outer boundaries of permissibility – but life, in all its idiosyncratic, mad weirdness, flourishes by reading between the lines. Even the elephant cannot violate the law of thermodynamics – although its trunk, surely, must rank as one of the most peculiar means of moving matter using energy.
Siddhartha Mukherjee
Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy.
John D. Barrow (Theories of Everything: The Quest for Ultimate Explanation)
Dear Dr. Schrodinger, In What Is Life? you say that in all of nature only man hesitates to cause pain. As destruction is the master-method by which evolution produces new types, the reluctance to cause pain may express a human will to obstruct natural law. Christianity and its parent religion, a few short millennia, with frightful reverses … The train had stopped, the door was already shutting when Herzog roused himself and squeezed through. He caught a strap. The express flew uptown. It emptied and refilled at Times Square, but he did not sit down. It was too hard to fight your way out again from a seat. Now, where were we? In your remarks on entropy … How the organism maintains itself against death—in your words, against thermodynamic equilibrium … Being an unstable organization of matter, the body threatens to rush away from us. It leaves. It is real. It! Not we! Not I! This organism, while it has the power to hold its own form and suck what it needs from its environment, attracting a negative stream of entropy, the being of other things which it uses, returning the residue to the world in simpler form. Dung. Nitrogenous wastes. Ammonia. But reluctance to cause pain coupled with the necessity to devour … a peculiar human trick is the result, which consists in admitting and denying evils at the same time. To have a human life, and also an inhuman life. In fact, to have everything, to combine all elements with immense ingenuity and greed. To bite, to swallow. At the same time to pity your food. To have sentiment. At the same time to behave brutally. It has been suggested (and why not!) that reluctance to cause pain is actually an extreme form, a delicious form of sensuality, and that we increase the luxuries of pain by the injection of a moral pathos. Thus working both sides of the street.
Saul Bellow (Herzog)
The success of discovering a thermodynamic principle associated with the gravitational field of a black hole has led to a speculation that there might exist some thermodynamic aspect to the gravitational field of the whole Universe. The simplest assumption to make, following the black hole case, would be that it is the surface area of the boundary of the visible universe. As the Universe expands, this boundary increases and the information available to us about the Universe increases. But this does not seem promising. It would appear to tell us only that the Universe must continue expanding forever, for if it were ever to begin to recollapse the entropy would fall and violate the second law of thermodynamics. The universe can expand in all sorts of different ways and still have the increasing area. What we really want is some principle that tells us why the organization of the Universe changes in the way that it does: why it now expands so uniformly and isotropically.
John D. Barrow (Theories of Everything: The Quest for Ultimate Explanation)
Then, in 1974, Stephen Hawking made a dramatic discovery. He decided to examine for the first time what occurs when one applies the notions of quantum mechanics to black holes. What he discovered was that black holes are not completely black. When quantum mechanics is included in the discussion of their properties, it is possible for energy to escape from the surface of the black hole and be recorded by an outside observer. The variation in the strength of the gravitational field near the horizon surface is strong enough to create pairs of particles and antiparticles spontaneously. The energy necessary to do this is extracted from the source of the gravitational field, and as the process continues, so the mass of the black hole ebbs away. If one waits long enough, it should disappear completely unless some unknown physics intervenes in the final stages. Such a discovery was exciting enough, but its most satisfying aspect was the fact that the particles radiated away from the surface of the black hole were found to have all the characteristics of heat radiation, with a temperature precisely equal to the gravitational field at the horizon and an entropy given by its surface area, just as the analogy had suggested. Black holes did possess a non-zero temperature and obeyed the laws of thermodynamics, but only when quantum mechanics was included in their description.
John D. Barrow (Theories of Everything: The Quest for Ultimate Explanation)
Perplexed about entropy? You are not alone. Josiah Willard Gibbs (1839–1903) understood this confusion all too well, almost 150 years ago, “ . . . a method involving the notion of entropy, the very existence of which depends upon the second law of thermodynamics, will doubtless seem to many far-fetched, and may repel beginners as obscure and difficult of comprehension. This inconvenience is perhaps more than counter-balanced by the advantages of a method which makes the second law of thermodynamics so prominent, and gives it so clear and elementary an expression. . . . (1).” Gibbs profoundly altered our understanding of chemistry with his insights. At a time when it was mostly a philosophical concept, Gibbs went straight for application and made entropy relevant. Rapid advancements and heralded achievements in the chemical sciences ensued. Enthalpy (H) is a measure of the internal energy of a system, but this energy has an availability issue; some of that energy is useful, some is not. Enthalpy also provides no information about the spontaneity of energy exchange. Entropy (S) does indicate the probability of energy exchange (i.e., spontaneous, −∆S, or nonspontaneous, +∆S), but it is not useful energy and so it provides little information on the quantity of energy that is available to perform work. Energy that is available to perform useful work is known as Gibbs energy, symbolized as G. Gibbs energy has also been termed free energy. Yet energy is anything but “free” and so that term will not be used here
Anonymous
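
For readers untangling the passage above, the standard relation between the three quantities it names, enthalpy H, entropy S, and Gibbs energy G, is (textbook form, not part of the quoted text):

    \[
      G = H - TS , \qquad
      \Delta G = \Delta H - T\,\Delta S \quad (\text{constant } T,\ P) , \qquad
      \Delta G < 0 \;\Rightarrow\; \text{spontaneous} .
    \]
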
This, in turn, has given us a “unified theory of aging” that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of “junk” molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die. But there is a small but important loophole in the second law that states total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it’s possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde’s famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.) The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back. As Nobel laureate Richard Feynman once said, “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human’s body will be cured.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
A law in physics, called the second law of thermodynamics, says that entropy, or chaos (the opposite of growth…a winding-down process), increases over time. You can readily see this in life, and we have already talked about it. Anything left to its own is naturally dying, getting more disorganized, rusting, etc. Even the universe itself is subject to that process.
Henry Cloud (Integrity: The Courage to Face the Demands of Reality)
This is a fundamentally insane notion, which developed in my own mind from an idea of Buckminster Fuller's. Every so often I try to encourage other writers by telling them this cheerful set of thoughts; always they gaze at me absolutely appalled. Fuller's assertion was roughly to this effect: the purpose of people on earth is to counteract the tide of entropy described in the Second Law of Thermodynamics.
Annie Dillard (Living by Fiction)
However in 1934, Richard Tolman pointed out an inconsistency with the cyclic model due to the second law of thermodynamics-that entropy will always increase as time progresses. As the universe goes from cycle to cycle, the entropy will increase and the cycles will get longer. Extrapolating back, the cycles shorten, leading to a big bang singularity-no eternal cycles. It's back to the drawing board.
Stephon Alexander (The Jazz of Physics: The Secret Link Between Music and the Structure of the Universe)
Indeed, I once did a little exercise: I took about half a dozen economics books, the big fat ones like Samuelson’s, and so on, and looked up in the index: do the words “energy,” “entropy,” or “thermodynamics” ever occur? Not once in any of them. Energy! You can’t even have a f——king dream at night without energy. [Quoting physicist Geoffrey West.]
W. Brian Arthur (Complexity Economics: Proceedings of the Santa Fe Institute's 2019 Fall Symposium)
If a subset of the universe is special in this sense, then for this subset the entropy of the universe is low in the past, the second law of thermodynamics obtains; memories exist, traces are left—and there can be evolution, life, and thought. In other words, if in the universe there is something like this—and it seems natural to me that there could be—then we belong to that something. Here, “we” refers to that collection of physical variables to which we commonly have access and by means of which we describe the universe.
Carlo Rovelli (The Order of Time)
CLASSICAL PHYSICS RELIES upon the three laws of thermodynamics. These are laws about energy that tell us how energy functions and, therefore, what we can (and cannot) do with it. As practical as they might be for the Western medical practitioner, they are stretched by quantum occurrences. The three laws are as follows: First law: Energy likes to be conserved; therefore it cannot be created or destroyed, merely transformed. Second law: Entropy (a measure of information) tends to increase. This means that the longer a system exists, the more disorder or unavailable information it contains. Third law: As temperature approaches absolute zero, the entropy or chaos becomes more constant. These laws govern the macrocosmos, but are not consistently true in the microuniverse of quanta. According to the second law, for instance, energy (or information that vibrates) gradually reduces in availability until it reaches absolute zero. Science cannot yet achieve absolute zero, but it can approach it. At this point, energy supposedly stands still. According to the first law, however, energy cannot be destroyed, which means the unavailable information has to go somewhere. Atoms and mass can only store a limited amount of information, so this missing data is not hiding in a coffee cup. It is possible, however, that it is stored in anti- or parallel worlds, or perhaps in the subtle energy domains explored by Dr. Tiller in “A Model of Subtle Energy”.
Cyndi Dale (The Subtle Body: An Encyclopedia of Your Energetic Anatomy)
MIT physicist Seth Lloyd supports the idea of other worldly portals in his book Programming the Universe. Quantum mechanics has proven that an electron is not only allowed to be in two places at once—it is required to be. Certain particles not only spin in two directions at the same time, but have to do so.21 At really high speeds, atoms require more information to describe their movements, and therefore they have more entropy.22 However, an observer affects the outcome of whatever he or she is observing. As explained in the book The Orb Project, the effect of the observer on the quantum field causes reality to reorganize according to the observation. This means that a newly observed reality descends through the frequency levels below the quantum, becoming dense in material reality.23 The nonobserved information becomes “lost” if it doesn’t qualify as “real” or desirable to the observer. It is not eliminated; instead, the not-selected potential slips into a pocket of “elsewhere.” Conceivably, we can get it back. As Lloyd explains, we can access lost data by “flipping a qubit,” a code phrase that means we can apply a magnetic field to force energy to shift from one state to another.24 We have established that the subtle layer is atop the physical and that the etheric layer of subtle energies is magnetic in nature. Could it be that the information we cannot find—perhaps, the data that could make a sick person well—is lingering a plane above us? We’ve one more law to face: the third law of thermodynamics. Experiments with absolute zero provide a new perspective on it, one that coaxes an understanding of subtle energy. Absolute zero is the point at which particles have minimum energy, called zero-point energy. Researchers including Dr. Hal Puthoff have identified this zero-point energy with zero-point field, a mesh of light that encompasses all of reality. (This field is further explained in Part III.) This field of light is a vacuum state, but it is not empty; rather, it is a sea of electromagnetic energy, and possibly, virtual particles—ideas that can become real. Conceivably, energy should stand completely still at absolute zero, which would mean that information would become permanently imprisoned. Research on zero-point energy, however, reveals that nearing zero-point, atomic motion stops, but energy continues. This means that “lost information” is not really lost. Even when frozen, it continues to “vibrate” in the background. The pertinent questions are these: How do we “read” this background information? How do we apply it? These queries are similar to those we might ask about “hidden” information. How do we access suppressed but desirable data? The answers lie in learning about subtle structures, for these dwell at the interfaces between the concrete and the higher planes. Operate within the subtle structures, and you can shift a negative reality to a positive one, without losing energy in the process.
Cyndi Dale (The Subtle Body: An Encyclopedia of Your Energetic Anatomy)
He married military history with science, building his theory upon Gödel, Heisenberg, Popper, Kuhn, Piaget and Polanyi, who highlighted the unavoidable feature of uncertainty in any system of thought (as well as the limits of the Newtonian paradigm). Cybernetics and systems theory offered him the concept of feedback, the combination of analysis–synthesis as well as the Second Law of Thermodynamics and entropy, the distinction between open and closed systems, the importance of interactions and relations, and the need for a holistic approach. The cognitive revolution, combined with neo-Darwinist studies, showed him the role of schemata formed by genetics, culture and experience. Chaos theory highlighted non-linear behavior.
Frans P.B. Osinga (Science, Strategy and War: The Strategic Theory of John Boyd (Strategy and History))
Unfortunately, the laws of thermodynamics guarantee that the entropy in the universe tends toward a maximum.
Andrew Hunt (The Pragmatic Programmer)
Ironically, the modern era of molecular biology, and all the extraordinary DNA technology that it entails, arguably began with a physicist, specifically with the publication of Erwin Schrödinger’s book What is Life? in 1944. Schrödinger made two key points: first, that life somehow resists the universal tendency to decay, the increase in entropy (disorder) that is stipulated by the second law of thermodynamics; and second, that the trick to life’s local evasion of entropy lies in the genes. He proposed that the genetic material is an ‘aperiodic’ crystal, which does not have a strictly repeating structure, hence could act as a ‘code-script’ – reputedly the first use of the term in the biological literature. Schrödinger himself assumed, along with most biologists at the time, that the quasicrystal in question must be a protein; but within a frenzied decade, Crick and Watson had inferred the crystal structure of DNA itself.
Nick Lane (The Vital Question: Why is life the way it is?)
At first, one might think that something like thermodynamics is a rather restrictive concept because it concerns itself with temperature and heat. But its application is not just restricted to all things thermal. It is possible to relate the notion of entropy, which is a measure of disorder, to the more general and fruitful notions of 'information', of which we have already made use in discussing the richness of certain systems of axioms and rules of reasoning. We can think of the entropy of a large object like a black hole as being equal to the number of different ways in which its most elementary constituents can be rearranged in order to give the same large-scale state. This tells us the number of binary digits ('bits') that are needed to specify in every detail the internal configuration of the constituents out of which the black hole is composed. Moreover, we can also appreciate that a certain amount of information is forever lost to the outside observer when a horizon forms around a region of the universe to create a black hole.
John D. Barrow (Theories of Everything: The Quest for Ultimate Explanation)
Professor Boltzmann, the discoverer of the Boltzmann constant and the first to formulate the equation that describes entropy, committed suicide in 1906, overwhelmed by the lack of recognition of his ideas and by the virulent persecution to which he was subjected by Herr Mach, Herr Haeckel, and others who saw the Second Law of Thermodynamics as an attack on their atheist, scientistic dogma.
José Carlos González-Hurtado (New Scientific Evidence for the Existence of God)
After that the whole system fades away into a dead, inert lump of matter. A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’.
Erwin Schrödinger (What is Life? (Canto Classics))
The grandfather of this brilliant madness was Austrian physicist Ludwig Boltzmann, whose proof that gas molecules disperse in proportion to their temperature was foundational to all that followed. Boltzmann showed that molecular movement is simply determined by probability, which results in concentrations of molecules dispersing until they reach equilibrium with their environment. If you pour a potful of boiling water into a cold bath, hot water molecules spread out until they are evenly distributed and have slightly raised the overall bath temperature. Time cannot go backward for the same reason that boiling water can’t re-form in one corner of a cold bath and the dead cannot return to life: random probability will never re-concentrate those molecules back into their original form. The branch of physics pioneered by Boltzmann was called statistical mechanics, and it explained one of the fundamental laws of nature: the Second Law of Thermodynamics, also known as entropy.
Sebastian Junger (In My Time of Dying: How I Came Face to Face with the Idea of an Afterlife)
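
The statistical-mechanics result Junger summarizes is usually compressed into Boltzmann's entropy formula, which ties entropy to the number W of microscopic arrangements compatible with a macrostate (a standard statement, not a quotation from the book):

    \[
      S = k_B \ln W .
    \]

Dispersal toward equilibrium is then nothing more than drift toward the macrostate with the largest W, and therefore the largest S.
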
Even in the equations that had been formulated to describe electromagnetism, there is no natural directionality to the interactions of particles; the equations look the same going both directions. If you looked at a video of atoms interacting, you could play it backward and you wouldn’t be able to tell which was correct. It is only in the macroworld of objects, people, planets, and so on, the world governed by entropy, that causation appears to unfold in a single direction. The second law of thermodynamics describes the increasing disorder in the universe at macroscales and is often seen as equivalent to the one-way arrow of time. More and more physicists over the past few decades, sensitive to the nondirectionality that seems to rule at the micro or quantum level, have begun to question the no-teleology rule. Recall that the tiny particles making up the matter and energy of the physical universe are really like worms or strings snaking through the block universe of Minkowski spacetime. Their interactions, which look to us a bit like tiny balls colliding on a billiard table, are from a four-dimensional perspective more like threads intertwining; the twists and turns where they wrap around each other are what we see as collisions, interactions, and “measurements” (in the physicists’ preferred idiom). Each interaction changes information associated with those threads—their trajectory through the block universe (position and momentum) as well as qualities like “spin” that influence that trajectory. According to some recent theories, a portion of the information particles carry with them actually might propagate backward rather than forward across their world lines. For instance, an experiment at the University of Rochester in 2009 found that photons in a laser beam could be amplified in their past when interacted with a certain way during a subsequent measurement—true backward causation, in other words.8 The Israeli-American physicist Yakir Aharonov and some of his students are now arguing that the famous uncertainty principle—the extent to which the outcome of an interaction is random and unpredictable—may actually be a measure of the portion of future influence on a particle’s behavior.9 In other words, the notorious randomness of quantum mechanics—those statistical laws that captured Jung’s imagination—may be where retrocausation was hiding all along. And it would mean Einstein was right: God doesn’t play dice.*23 If the new physics of retrocausation is correct, past and future cocreate the pattern of reality built up from the threads of the material world. The world is really woven like a tapestry on a four-dimensional loom. It makes little sense to think of a tapestry as caused by one side only;
Eric Wargo (Precognitive Dreamwork and the Long Self: Interpreting Messages from Your Future (A Sacred Planet Book))
Hyrum’s Law If you are maintaining a project that is used by other engineers, the most important lesson about “it works” versus “it is maintainable” is what we’ve come to call Hyrum’s Law: With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody. In our experience, this axiom is a dominant factor in any discussion of changing software over time. It is conceptually akin to entropy: discussions of change and maintenance over time must be aware of Hyrum’s Law8 just as discussions of efficiency or thermodynamics must be mindful of entropy. Just because entropy never decreases doesn’t mean we shouldn’t try to be efficient. Just because Hyrum’s Law will apply when maintaining software doesn’t mean we can’t plan for it or try to better understand it. We can mitigate it, but we know that it can never be eradicated. Hyrum’s Law represents the practical knowledge that — even with the best of intentions, the best engineers, and solid practices for code review — we cannot assume perfect adherence to published contracts or best practices. As an API owner, you will gain some flexibility and freedom by being clear about interface promises, but in practice, the complexity and difficulty of a given change also depends on how useful a user finds some observable behavior of your API. If users cannot depend on such things, your API will be easy to change. Given enough time and enough users, even the most innocuous change will break something;9 your analysis of the value of that change must incorporate the difficulty in investigating, identifying, and resolving those breakages.
Titus Winters (Software Engineering at Google: Lessons Learned from Programming Over Time)
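
A minimal sketch of Hyrum's Law in code (a hypothetical function and caller, not an example from the book): the docstring contract only promises which names come back, but a caller quietly depends on the incidental, observable ordering, so a later change that still honors the contract breaks that caller.

    # Hypothetical API: the docstring is the contract; ordering is unspecified.
    def list_active_users(db):
        """Return the names of active users (order not promised)."""
        # Implementation detail: happens to preserve insertion order.
        return [name for name, active in db.items() if active]

    db = {"carol": True, "alice": True, "bob": False}

    # A caller that depends on the observable (but unpromised) ordering:
    assert list_active_users(db)[0] == "carol"     # works today

    # A later "harmless" change still honors the contract but alters the
    # observable behavior; the caller above would now break.
    def list_active_users_v2(db):
        return sorted(name for name, active in db.items() if active)

    assert list_active_users_v2(db)[0] == "alice"  # same contract, new behavior
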
Despite its supposed universality, the second law appears to be constantly violated by living organisms, whose conception and growth (as individuals) and whose evolution (as species and ecosystems) produces distinctly more ordered, more complex forms of life. But there is really no conflict: the second law applies only to closed systems under thermodynamic equilibrium. The Earth’s biosphere is an open system, which incessantly imports solar energy and uses its photosynthetic conversion to new plant mass as the foundation for greater order and organization (a reduction of entropy).
Vaclav Smil (Energy: A Beginner's Guide (Beginner's Guides))
As a mathematician Fantappiè could not accept that half of the solutions of the fundamental equations where rejected and in 1941, while listing the properties of the forward and backward in time energy, Fantappiè discovered that forward in time energy is governed by the law of entropy, whereas backward in time energy is governed by a complementary law that he named syntropy, combining the Greek words syn which means converging and tropos which means tendency. Listing the mathematical properties of syntropy Fantappiè discovered: energy concentration, increase in differentiation, complexity and structures: the mysterious properties of life! In 1944 he published the book “Principi di una Teoria Unitaria del Mondo Fisico e Biologico”[5] (Unitary Theory of the Physical and Biological World) in which he suggests that the physical-material world is governed by the law of entropy and causality, whereas the biological world is governed by the law of syntropy and retrocausality. We cannot see the future and therefore retrocausality is invisible! The dual energy solution suggests the existence of a visible reality (causal and entropic) and an invisible reality (retrocausal and syntropic). The first law of thermodynamics states that energy is a constant, a unity that cannot be created or destroyed but only transformed, and the energy-momentum-mass equation suggests that this unity has two components: entropy and syntropy. We can therefore write: 1=Entropy+Syntropy which shows that syntropy is the complement of entropy. Syntropy is often mistaken with negentropy. However, it is fundamentally different since negentropy does not take into account the direction of time, but considers time only in the classical way: flowing forward. Life lies between these two components: one entropic and the other syntropic, one visible and the other invisible, and this can be portrayed using a seesaw with entropy and syntropy playing at the opposite sides, and life at the center. This suggests that entropy and syntropy are constantly interacting and that all the manifestations of reality are dual: emitters and absorbers, particles and waves, matter and anti-matter, causality and retrocausality
Ulisse Di Corpo (Syntropy, Precognition and Retrocausality)
that raises a profound mystery: Scientists have long been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures—galaxies, cells, ecosystems, human beings—that have somehow managed to assemble themselves.
Steven H. Strogatz (Sync: How Order Emerges From Chaos In the Universe, Nature, and Daily Life)
Time’s arrow is irreversible, because entropy cannot decrease of its own accord without violating the second law of thermodynamics. A reversible arrow would be like a movie run backward. The scenes in the movie are not impossible by the laws of classical mechanics, but they are patently absurd.
Jeremy Campbell (GRAMMATICAL MAN: Information, Entropy,Language and Life)
Although it may not be immediately apparent, we have now come to an intriguing point. The second law of thermodynamics seems to have given us an arrow of time, one that emerges when physical systems have a large number of constituents. For things with many constituents, going from lower to higher entropy—from order to disorder—is easy, so it happens all the time. Going from higher to lower entropy—from disorder to order—is harder, so it happens rarely, at best.
Brian Greene (The Fabric of the Cosmos: Space, Time, and the Texture of Reality)
Clausius also crisply formulated the second law of thermodynamics: entropy of the universe tends to maximum. In practical terms this means that in a closed system (one without any external supply of energy) the availability of useful energy can only decline.
Vaclav Smil (Energy: A Beginner's Guide (Beginner's Guides))
Finally, the third law of thermodynamics, initially formulated in 1906 as Walther Nernst’s (1864–1941) heat theorem, states that all processes come to a stop (and entropy shows no change) only when the temperature nears absolute zero (–273°C).
Vaclav Smil (Energy: A Beginner's Guide (Beginner's Guides))
I was especially fascinated by the second law of thermodynamics, which holds that entropy virtually always increases in a closed system. Entropy is a measure of disorder or uselessness. In lay terms, this means that progress stalls or declines when something is walled off from the outside world. Usually
Charles G. Koch (Believe in People: Bottom-Up Solutions for a Top-Down World)
He asked: “You are aware of the Second Law of Thermodynamics, right?” “‘You Do Not Talk About Thermodynamics?’” Rudy said nothing. “The currency of the universe, Entropy. Okay and…?” “A candle that burns twice as bright burns half as long.” “That seems unrelated, but I’ll allow it. Is that supposed to be comforting?” “I like either the lavender or cinnamon-scented ones.” “This isn’t the advice I was asking for and you know that.” “Isn’t it? You know, it doesn’t take a master’s in behavioral psychology to see you’ve some unresolved issues.” “And the universe has a tendency to devolve into chaos, so why bother controlling it, just control myself?” Rudy just continued to shoot glances at Danny’s arm. Danny kept it face down, pretending not to notice. “Rudy: Sigmund Freud meets Dr. Seuss. Thank you.
Kyle St Germain (Dysfunction)
Chaos will rot your plants and kill your dog and rust your bike. It will decay your most precious memories, topple your favorite cities, wreck any sanctuary you can ever build. It’s not if, it’s when. Chaos is the only sure thing in this world. The master that rules us all. My scientist father taught me early that there is no escaping the Second Law of Thermodynamics: entropy is only growing; it can never be diminished, no matter what we do.
Lulu Miller (Why Fish Don't Exist: A Story of Loss, Love, and the Hidden Order of Life)
IT POPS, A new discovery, by: Jonathan Roy Mckinney IT POPS is relative to space time in that it has two major components that follow the 3 laws of thermodynamics. The PIrandom creator injects an object with the relative XYZ coordinates in a model view projection that are derived by utilizing the correct Matrix math 11,10,10,10. Once Injected, the two way hash coin creator in time will create the entropy required to derive random shapes and reverse or abstract them into full objects for a complete wire frame like Michelangelo could carve an Angel out of marble.
Jonathan Roy Mckinney
Clausius’s entropy, indicated by the letter S, is a measurable and calculable quantity15 that increases or remains the same but never decreases, in an isolated process. In order to indicate that it never decreases, we write: ΔS ≥ 0 This reads: “Delta S is always greater than or equal to zero,” and we call this “the second principle of thermodynamics” (the first being the conservation of energy). Its nub is the fact that heat passes only from hot bodies to cold, never the other way around.
Carlo Rovelli (The Order of Time)
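
A short worked illustration of why the inequality encodes one-way heat flow (standard textbook reasoning, not part of Rovelli's text): using Clausius's definition of entropy change, let a quantity of heat Q pass from a hot body at temperature T_h to a cold body at T_c, with T_c < T_h. Then

    \[
      dS = \frac{\delta Q}{T}, \qquad
      \Delta S_{\text{total}} \;=\; \frac{Q}{T_c} - \frac{Q}{T_h} \;>\; 0 ,
    \]

while the reverse transfer would make the total change negative, which the second principle forbids.
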
What's in a name? In the case of Shannon's measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: “My greatest concern was what to call it. I thought of calling it ‘information,’ but the word was overly used, so I decided to call it ‘uncertainty.’ When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.
Arieh Ben-Naim (Farewell To Entropy, A: Statistical Thermodynamics Based On Information)
In the over-a-hundred-years of the history of the Second Law, people were puzzled by the apparent conflict between the reversibility of the equations of motion, and the irreversibility associated with the Second Law. Boltzmann was the first to attempt to derive the Second Law from the dynamics of the particles. In my opinion, this, as well as other attempts, will inevitably fail in principle. First, because it is impractical to solve the equations of motion for some 10²³ particles. Second, because one cannot get probabilities from the deterministic equations of motion. Third, and perhaps most important, because of the indistinguishability of the particles. It is well known that whenever we write the equation of motions of any number of particles, we must first label the particles. This is true for classical as well as for the quantum mechanical equations of motion. However, the very act of labeling the particles violates the principle of ID of the particles.
Arieh Ben-Naim (Farewell To Entropy, A: Statistical Thermodynamics Based On Information)
Prigogine and Stengers also undermine conventional views of thermodynamics by showing that, under nonequilibrium conditions, at least, entropy may produce, rather than degrade, order, organization—and therefore life. If this is so, then entropy, too, loses its either/or character. While certain systems run down, other systems simultaneously evolve and grow more coherent. This mutualistic, nonexclusive view makes it possible for biology and physics to coexist rather than merely contradict one another.
Ilya Prigogine (Order Out of Chaos: Man's New Dialogue with Nature (Radical Thinkers))
Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system—which is a measure of the system’s disorder, or ‘randomness’—is greater (or at least not smaller) at later times than it was at earlier times.
Roger Penrose (Cycles of Time: An Extraordinary New View of the Universe)
Software is like entropy: It is difficult to grasp, weighs nothing, and obeys the Second Law of Thermodynamics; i.e., it always increases.
Norman Augustine
Matter (Becoming) is entropic. Pure mind (Being) has zero entropy. The Second Law of Thermodynamics is said to predict the Heat Death of the universe. In fact, this is false. What it actually predicts is the death of matter, space and time – via the expansion of the physical universe until it flatlines and thus ceases to be. Evolution is about eliminating matter, and this is accomplished – exactly – at the end of the universe, in readiness for the creation of the next universe. Only at the end of the universe is Becoming not operating in conjunction with Being. However, as soon as Being reaches exclusivity (Becoming has ceased to exist), the first act of Being is to initiate Becoming again, and this is none other than the Big Bang, the cosmic eruption of Becoming across all monadic nodes, and the origin of the “splitting” of all monadic minds into separate centers of agency.
Thomas Stark (Extra Scientiam Nulla Salus: How Science Undermines Reason (The Truth Series Book 8))
Were organizational inertia the whole story, a well-adapted corporation would remain healthy and efficient as long as the outside world remained unchanged. But, another force, entropy, is also at work. In science, entropy measures a physical system’s degree of disorder, and the second law of thermodynamics states that entropy always increases in an isolated physical system. Similarly, weakly managed organizations tend to become less organized and focused. Entropy makes it necessary for leaders to constantly work on maintaining an organization’s purpose, form, and methods even if there are no changes in strategy or competition.
Richard P. Rumelt (Good Strategy Bad Strategy: The Difference and Why It Matters)
It’s natural to wonder whether faster sorting is even possible. The question sounds like it’s about productivity. But talk to a computer scientist and it turns out to be closer to metaphysics—akin to thinking about the speed of light, time travel, superconductors, or thermodynamic entropy. What are the universe’s fundamental rules and limits? What is possible? What is allowed? In this way computer scientists are glimpsing God’s blueprints every bit as much as the particle physicists and cosmologists. What is the minimum effort required to make order?
Brian Christian
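The “minimum effort required to make order” that Christian gestures at has a sharp answer for comparison sorting: any algorithm that sorts n items by pairwise comparisons needs at least log₂(n!) ≈ n log₂ n comparisons in the worst case. A minimal Python sketch of that bound (the function name is ours, not from the quoted text):

```python
import math

def min_comparisons(n: int) -> int:
    """Information-theoretic lower bound on worst-case comparisons
    needed to sort n distinct items by pairwise comparison: ceil(log2(n!))."""
    return math.ceil(math.log2(math.factorial(n)))

# e.g. sorting a 52-card deck needs at least this many comparisons
print(min_comparisons(52))   # 226; compare n*log2(n) ≈ 296
```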
THE DEVIL TEACHES THERMODYNAMICS My second law, your second law, ordains that local order, structures in space and time, be crafted in ever-so-losing contention with proximal disorder in this neat but getting messier universe. And we, in the intricate machinery of our healthy bodies and life-support systems, in the written and televised word do declare the majesty of the zoning ordinances of this Law. But oh so smart, we think that we are not things, like weeds, or rust, or plain boulders, and so invent a reason for an eternal subsidy of our perfection, or at least perfectibility, give it the names of God or the immortal soul. And while we allow the dissipations that cannot be hid, like death, and — in literary stances — even the end of love, we make the others just plain evil: anger, lust, pride — the whole lot of pimples of the spirit. Diseases need vectors, so the old call goes out for me. But the kicker is that the struts of God's stave church, those nice seven, they're such a tense and compressed support group that when they get through you're really ready to let off some magma. Faith serves up passing certitude to weak minds, recruits for the cults, and too much of her is going to play hell with that other grand invention of yours, the social contract. Boring Prudence hangs around with conservatives, and Love, love you say! Love one, leave out the others. Love them all, none will love you. I tell you, friends, love is the greatest entropy-increasing device invented by God. Love is my law's sweet man. And for God himself, well, his oneness seems too much for natural man to love, so he comes up with Northern Irelands and Lebanons... The argument to be made is not for your run-of-the-mill degeneracy, my stereotype. No, I want us to awake, join the imperfect universe at peace with the disorder that orders. For the cold death sets in slowly, and there is time, so much time, for the stars' light to scatter off the eddies of chance, into our minds, there to build ever more perfect loves, invisible cities, our own constellation.
Roald Hoffmann
In an essay summarizing the results of this research, Baumeister captured what I am trying to convey about the purpose of life, the laws of nature, and the cosmos as it relates to finding meaning, particularly in the context of our search for immortality, the afterlife, and utopia: Meaning is a powerful tool in human life. To understand what that tool is used for, it helps to appreciate something else about life as a process of ongoing change. A living thing might always be in flux, but life cannot be at peace with endless change. Living things yearn for stability, seeking to establish harmonious relationships with their environment. They want to know how to get food, water, shelter and the like. They find or create places where they can rest and be safe … Life, in other words, is change accompanied by a constant striving to slow or stop the process of change, which leads ultimately to death. If only change could stop, especially at some perfect point: that was the theme of the profound story of Faust’s bet with the devil. Faust lost his soul because he could not resist the wish that a wonderful moment would last forever. Such dreams are futile. Life cannot stop changing until it ends.14 That a meaningful, purposeful life comes from struggle and challenge against the vicissitudes of nature more than it does a homeostatic balance of extropic pushback against entropy reinforces the point that the Second Law of Thermodynamics is the First Law of Life. We must act in the world. The thermostat is always being adjusted, balance sought but never achieved. There is no Faustian bargain to be made in life. We may strive for immortality while never reaching it, as we may seek utopian bliss while never finding it, for it is the striving and the seeking that matter, not the attainment of the unattainable. We are volitional beings, so the choice to act is ours, and our sense of purpose is defined by reaching for the upper limits of our natural abilities and learned skills, and by facing challenges with courage and conviction.
Michael Shermer (Heavens on Earth: The Scientific Search for the Afterlife, Immortality, and Utopia)
Gibbs’s insight was to find a way showing how the two laws of thermodynamics drive all chemical reactions. He chose to start his argument with a restatement of those laws, so let’s follow his lead: First law: The energy in the universe is constant. Second law: The entropy of the universe tends to increase. Gibbs then showed how all processes of change can be judged by these two laws. He did this, essentially, by turning the two laws into one new law we can call Gibbs’s law: The flow of energy is the means by which the entropy of the universe is increased.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
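One standard way Gibbs's criterion is written today, added here for reference rather than quoted from Sen's book: at constant temperature and pressure, a process can proceed spontaneously only if it lowers the Gibbs free energy G.

```latex
% Gibbs free energy; spontaneous change at constant T and P lowers G
G = H - TS, \qquad \Delta G = \Delta H - T\,\Delta S \;\le\; 0
\quad \text{(constant } T, P\text{)}.
```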
Is Szilard, with his thought experiment, suggesting that information can do the opposite, overcoming the second law of thermodynamics and turning warm air at a constant temperature into useful work? Such a system would reduce the entropy of the universe because this “free” work could be used to force heat to flow the “wrong” way from cold to hot.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
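The stakes of Szilard's question can be made concrete. The now-standard resolution ties one bit of information to at most k_B·T·ln 2 of extractable work, so the measurement, not the gas, pays the entropy bill. A back-of-the-envelope check at room temperature, with illustrative values we supply:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature in kelvin (illustrative)

# Maximum work associated with one bit in a Szilard-type engine
work_per_bit = k_B * T * math.log(2)
print(f"{work_per_bit:.2e} J per bit")   # ≈ 2.87e-21 J
```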
Hawking and Bekenstein had shown that the three great ideas of modern physics—general relativity, quantum mechanics, and thermodynamics—work in harmony. For these reasons, black hole entropy and radiation have come to dominate contemporary physics as scientists search for a so-called grand unified theory, the Holy Grail of a single principle that explains nature—the world, the universe, everything—at its most fundamental level.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
Because the formula he derived for measuring the average number of bits needed to encode a piece of information looked almost exactly like Ludwig Boltzmann and Josiah Willard Gibbs’s formula for calculating entropy in thermodynamics. Here’s Shannon’s equation for calculating the size of any given piece of information: H = −Σ_i p_i log_b p_i. And here’s one way of stating Boltzmann’s equation for calculating the entropy of any given system: S = −k_B Σ_i p_i ln p_i. These two equations don’t just look similar; they’re effectively the same. Shortly after deriving his equation, Shannon pointed the similarity out to John von Neumann, then widely considered the world’s best mathematician. Von Neumann shrugged, suggesting that Shannon call his measure of the number of bits needed to carry a piece of information “information entropy” on the grounds that no one really understood thermodynamic entropy either.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
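Because Sen writes the two formulas inline, here is a small Python sketch (variable and function names are ours) that evaluates both on the same probability distribution; they differ only in the constant out front and the base of the logarithm:

```python
import math

def shannon_entropy(p, base=2):
    """H = -sum_i p_i * log_b(p_i), in bits when base=2."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def boltzmann_gibbs_entropy(p, k_B=1.380649e-23):
    """S = -k_B * sum_i p_i * ln(p_i), in J/K."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]             # an example distribution
print(shannon_entropy(p))          # 1.5 bits
print(boltzmann_gibbs_entropy(p))  # the same sum, scaled by k_B and using ln instead of log2
```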
Imagine a box full of hot gas, which, of course, has entropy. Now, let’s drop the box past the event horizon of a black hole. Because nothing can come back from across the event horizon, the box has crossed a point of no return and is thus no longer part of our universe. Both the box of gas and the entropy associated with it have disappeared from our universe. But that means that the entropy of our universe has gone down, which directly contradicts the second law of thermodynamics.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
Entropy increases the energy content of the black hole, increasing both its mass and the size of its event horizon. Bekenstein’s argument? Whenever the entropy of a black hole increases, so does the area of its event horizon. In other words, the area of the event horizon of a black hole was not an analogy for entropy, it was a direct measure of its entropy. In Bekenstein’s view, this saved the universal applicability of the second law of thermodynamics. The entropy of the universe always increases, even when things fall into black holes, because the entropy lost from the space outside the event horizon is made up for by an increase in the surface area of the event horizon. Bekenstein called this the generalized second law of thermodynamics, or GSL.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
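The relation Bekenstein proposed, and Hawking later made quantitative, is usually written with the black hole's entropy proportional to its horizon area A; the standard form is added here for reference and is not quoted from the book:

```latex
% Bekenstein–Hawking black-hole entropy, proportional to horizon area A
S_{\mathrm{BH}} = \frac{k_B\, c^{3} A}{4 G \hbar},
\qquad
% Generalized Second Law: outside entropy plus horizon entropy never decreases
\Delta\!\left(S_{\text{outside}} + S_{\mathrm{BH}}\right) \;\ge\; 0 .
```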
Put another way, heat will always dissipate away from a hot region because after a period of random collisions the odds are stacked overwhelmingly in favor of that result. Entropy, by Boltzmann’s reasoning, is simply the number of indistinguishable ways the constituent parts of a system can be arranged. To say entropy increases in any given system is another way of saying that any given system evolves into ever-more-likely distributions or configurations. The second law of thermodynamics is true for the same reason that when a pack of cards arranged in suits is shuffled, it will end up jumbled. There are many more indistinguishable ways for the pack to be disordered than there are for it to end up ordered, and so shuffling takes it in that direction.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
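Boltzmann's counting and Sen's card analogy can be made concrete in a few lines of Python: S = k_B ln W, where W counts arrangements, and the shuffled orderings of a deck dwarf the ordered ones. The toy definition of "ordered" below (each suit sorted internally, only the order of the four suits varying) is ours:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(W) -> float:
    """S = k_B * ln(W) for W equally likely microstates."""
    return k_B * math.log(W)

total_orderings = math.factorial(52)   # every possible order of a 52-card deck
ordered_decks = math.factorial(4)      # toy count: suits in some order, each suit sorted

print(total_orderings // ordered_decks)    # ≈ 3.4e66 jumbled arrangements per ordered one
print(boltzmann_entropy(total_orderings))  # entropy if each ordering counted as a microstate
```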
A thermodynamic chart reveals these to be transitions at a constant temperature at which the entropy of a material changes dramatically. In other words, paradoxically, during phase transitions materials can absorb heat without getting hotter and can reject heat without getting colder.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
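A worked example of the constant-temperature entropy jump Sen describes: ice absorbs its latent heat at 0°C without getting any warmer, and the entropy change is simply that heat divided by the fixed temperature. The handbook values below are supplied by us for illustration:

```python
latent_heat_fusion = 334e3   # J per kg, approximate heat of fusion of water
T_melt = 273.15              # K, melting point at atmospheric pressure

# Entropy gained by 1 kg of ice as it melts at constant temperature
delta_S = latent_heat_fusion / T_melt
print(f"{delta_S:.0f} J/(kg*K)")   # ≈ 1223 J/(kg·K)
```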
Archaeology suggests our ancestors figured out how to make fire around a million years ago. Making ice was much harder, and the power to cool is the unglamorous but indispensable technology of the modern age. Refrigeration is the most obviously thermodynamic of all human inventions, and the most defiant of the universal tendency for entropy to increase. These devices force heat to pass from a cold interior to a warm exterior, which is in the opposite direction to the one in which heat flows spontaneously. The purpose of this is to create a space where the relentless increase of entropy is slowed down. Although ostensibly a refrigerator is a cool box, that’s a means to an end. Its ultimate purpose is to slow down decay and putrefaction, which are both examples of entropy increasing. Think of a refrigerator as a device inside which time slows down.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
In the condenser, as it releases heat, another phase change occurs as the coolant turns back into a liquid. But this liquid is still quite warm; it’s the same temperature as the room. For the refrigeration process to continue, the coolant’s temperature must fall back to 4°C before it can reenter the evaporator. To achieve this, the coolant liquid passes through a tiny nozzle called an expansion valve. As the coolant is forced through it, its pressure drops and it cools, leaving it ready to enter the evaporator once again. The compressor ensures the refrigerator complies with the second law of thermodynamics. Heat flows out of the refrigerator interior, lowering its entropy. But the total heat that flows out of the condenser raises the room’s entropy to compensate.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
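The bookkeeping in that passage can be checked with the Clausius inequality: the entropy the cold interior loses, Q_c/T_c, must be repaid with interest by the entropy dumped into the room, Q_h/T_h, which also caps the coefficient of performance at the Carnot value T_c/(T_h − T_c). The 4°C comes from the quote; the 20°C room and the heat and work figures below are illustrative:

```python
T_cold = 277.15   # K, ~4 °C interior (the quote's coolant temperature)
T_hot = 293.15    # K, ~20 °C room (our illustrative choice)

cop_max = T_cold / (T_hot - T_cold)    # Carnot limit on the coefficient of performance
print(f"ideal COP ~ {cop_max:.1f}")    # ≈ 17.3

# A real fridge does worse: say it pumps 100 J out of the interior using 10 J of work
Q_cold, W = 100.0, 10.0                # illustrative numbers
Q_hot = Q_cold + W                     # heat rejected into the room via the condenser

# Entropy lost by the interior is outweighed by entropy gained by the room
dS_total = Q_hot / T_hot - Q_cold / T_cold
print(f"net entropy change ~ {dS_total:.3f} J/K (> 0, as the second law requires)")
```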
Gordon Van Wylen, Chairman of the Dept. of Mechanical Engineering at the University of Michigan, commented that “The question that arises is how the universe got into the state of reduced entropy in the first place, since all natural processes known to us tend to increase entropy?” (Gordon Van Wylen and Richard Edwin Sonntag, Fundamentals of Classical Thermodynamics, 1973). He concludes by saying, “The author has found that the second law [of thermodynamics] tends to increase conviction that there is a Creator.”
Gordon J. Van Wylen
A tangent that departs from the real to the imaginary: pure consciousness does and does not transcend the body, and I believe this after hearing that my mother felt suicidal after she took her medicines for weight loss and her biggest regrets in life came crashing down on her for three days in a row. This is the best of what I have learnt in my years of fascination for science and knowledge, and to make you grasp this takes fullness of life: in hydrology, the wet and the dry, and the hot and the cold always co-exist, but they are also in flux and are also stable: all depending on the reference point of analysis. Consciousness beyond matter, and consciousness tied to matter co-exist in everyplace at different scales, and sometimes even in the same scale. The Tao Te Ching (the way and its power) that fascinated Lao Tzu; the calculus of infinitesimals; the wonderful infinity of the number line and fractals that fascinated Ramanujan and Mandelbrot; the horn of the rhinoceros that fascinated Dali; thermodynamic and hydrodynamic equilibriums that fascinate all scientists; the surety of a fading perfume smell or the permanence of a shattered mirror that is easy to understand to anyone; the concepts of anti-fragility, entropy, volatility, randomness, disorder are all intimately tied to this. Consciousness is constantly attained and broken all around us all the time, and we rarely stop to think about this because it infinitesimally evades us. Here is where I begin to stretch this and I can't understand it and it is very discouraging -- prudence, temperance and courage -- some of the highest virtues may also be related to this. When you are prepared, it is consciousness. When we are unprepared for it, and this hits you without hurting you, it is magic and strength. Else, perhaps death.
Solomon Vimal
The universe tended towards chaos and entropy. That was basic thermodynamics. Maybe it was basic existence too. You lose your job, then more shit happens. The wind whispered through the trees. It began to rain. She headed towards the shelter of a newsagent's, with the deep — and, as it happened, correct — sense that things were about to get worse.
Matt Haig (The Midnight Library)