Entropy Related Quotes

We've searched our database for all the quotes and captions related to entropy. Here they are! All 33 of them:

People come into our lives and then they go out again. The entropy law, as applied to human relations. Sometimes in their passing, though, they register an unimagined and far-reaching influence, as I suspect Hughes Rudd did upon me. There is no scientific way to discern such effects, but memory believes before knowing remembers. And the past lives coiled within the present, beyond sight, beyond revocation, lifting us up or weighting us down, sealed away--almost completely--behind walls of pearl.
David Quammen (The Flight of the Iguana: A Sidelong View of Science and Nature)
The entropy of a system depends explicitly on blurring. It depends on what I do not register, because it depends on the number of indistinguishable configurations. The same microscopic configuration may be of high entropy with regard to one blurring and of low in relation to another.
Carlo Rovelli (The Order of Time)
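Rovelli's point about blurring can be made concrete with a toy model. The sketch below is an editorial illustration, not something from the book (the four-coin setup and the blur_* functions are invented for the example): the very same microstate gets a different entropy depending on which coarse-graining, or blurring, the observer applies.

```python
# Toy illustration of entropy depending on the chosen blurring:
# entropy = log2 of the number of microstates indistinguishable from a
# given microstate under a particular coarse-graining.
from itertools import product
from math import log2

microstates = list(product("HT", repeat=4))   # all 16 four-coin configurations

def entropy_bits(microstate, blur):
    """log2 of how many microstates look identical to `microstate` under `blur`."""
    macro = blur(microstate)
    indistinguishable = sum(1 for m in microstates if blur(m) == macro)
    return log2(indistinguishable)

def blur_by_heads(m):   # observer registers only the total number of heads
    return m.count("H")

def blur_by_first(m):   # observer registers only the first coin
    return m[0]

state = ("H", "T", "H", "T")
print(entropy_bits(state, blur_by_heads))   # log2(6) ≈ 2.58 bits
print(entropy_bits(state, blur_by_first))   # log2(8) = 3.0 bits
```

Under the heads-count blurring the state is one of six indistinguishable configurations; under the first-coin blurring it is one of eight, so the same microscopic configuration carries a different entropy for each observer.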
The behavior of spacetime in general relativity can be thought of as simply the natural tendency of systems to move toward configurations of higher entropy.
Sean Carroll (Something Deeply Hidden: Quantum Worlds and the Emergence of Spacetime)
We focus our attention on impending catastrophes, while the true catastrophes are already here, under our noses, with the degeneration of social practices, with the mass media's numbing effect, with a collective will blinded by the ideology of the 'market', in other words, succumbing to the law of the masses, to entropy, to the loss of singularity, to a general and collective infantilization. The old types of social relations, the old relations with sex, with time, with the cosmos, with human finitude have been rattled, not to say devastated, by the 'progress' generated by industrial firms.
Félix Guattari
How do fields express their principles? Physicists use terms like photons, electrons, quarks, quantum wave functions, relativity, and energy conservation. Astronomers use terms like planets, stars, galaxies, Hubble shift, and black holes. Thermodynamicists use terms like entropy, first law, second law, and Carnot cycle. Biologists use terms like phylogeny, ontogeny, DNA, and enzymes. Each of these terms can be considered to be the thread of a story. The principles of a field are actually a set of interwoven stories about the structure and behavior of field elements, the fabric of the multiverse.
Peter J. Denning
This “Hawking temperature” of a black hole and its “Hawking radiation” (as they came to be called) were truly radical—perhaps the most radical theoretical physics discovery in the second half of the twentieth century. They opened our eyes to profound connections between general relativity (black holes), thermodynamics (the physics of heat) and quantum physics (the creation of particles where before there were none). For example, they led Stephen to prove that a black hole has entropy, which means that somewhere inside or around the black hole there is enormous randomness. He deduced that the amount of entropy (the logarithm of the hole’s amount of randomness) is proportional to the hole’s surface area. His formula for the entropy is engraved on Stephen’s memorial stone at Gonville and Caius College in Cambridge, where he worked. For the past forty-five years, Stephen and hundreds of other physicists have struggled to understand the precise nature of a black hole’s randomness. It is a question that keeps on generating new insights about the marriage of quantum theory with general relativity—that is, about the ill-understood laws of quantum gravity.
Stephen Hawking (Brief Answers to the Big Questions)
Technology, I said before, is most powerful when it enables transitions—between linear and circular motion (the wheel), or between real and virtual space (the Internet). Science, in contrast, is most powerful when it elucidates rules of organization—laws—that act as lenses through which to view and organize the world. Technologists seek to liberate us from the constraints of our current realities through those transitions. Science defines those constraints, drawing the outer limits of the boundaries of possibility. Our greatest technological innovations thus carry names that claim our prowess over the world: the engine (from ingenium, or “ingenuity”) or the computer (from computare, or “reckoning together”). Our deepest scientific laws, in contrast, are often named after the limits of human knowledge: uncertainty, relativity, incompleteness, impossibility. Of all the sciences, biology is the most lawless; there are few rules to begin with, and even fewer rules that are universal. Living beings must, of course, obey the fundamental rules of physics and chemistry, but life often exists on the margins and interstices of these laws, bending them to their near-breaking limit. The universe seeks equilibriums; it prefers to disperse energy, disrupt organization, and maximize chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organize chemicals into compartments; we sort laundry on Wednesdays. “It sometimes seems as if curbing entropy is our quixotic purpose in the universe,” James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses.
Siddhartha Mukherjee (The Gene: An Intimate History)
The entropy of a system is related to the number of indistinguishable rearrangements of its constituents, but properly speaking is not equal to the number itself. The relationship is expressed by a mathematical operation called a logarithm; don't be put off if this brings back bad memories of high school math class. In our coin example, it simply means that you pick out the exponent in the number of rearrangements--that is, the entropy is defined as 1,000 rather than 2^1000.
Brian Greene (The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos)
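Greene's logarithm step can be checked in a couple of lines. This is a quick editorial sketch (not Greene's own code), using his 1,000-coin example: the number of rearrangements is 2^1000, and the base-2 logarithm picks out the exponent, which is what gets called the entropy.

```python
# Entropy as the logarithm of the number of rearrangements:
# for 1,000 coins there are 2**1000 head/tail patterns, and the
# entropy is the exponent, 1,000, not the astronomically large count.
from math import log2

n_coins = 1000
rearrangements = 2 ** n_coins
entropy_bits = log2(rearrangements)

print(len(str(rearrangements)))   # 302 -- the count itself runs to 302 digits
print(entropy_bits)               # 1000.0
```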
Melancholy is that a scrambled egg can't be unscrambled--entropy increases--experience is subject to the arrow of time. And the infinite sadness of my life consists in that I only recognize the beauty of simple arrangements from the relative vantage of the scrambled; memory, not experience, is my only access to it. Anxiety is the progression toward equilibrium. Despair is the inescapability. Insanity is the rationalizing of it all. Sanity is the irrational acceptance of it all. Indifference is just detached therapy. And progression--activity / toil / tasks / success / failure--just coping distraction and procrastination, just ill-placed deferment--my preferred route. And crisis--
Jack Foster (Fresh Fruit: A Preface)
This was a golden age, in which we solved most of the major problems in black hole theory even before there was any observational evidence for black holes. In fact, we were so successful with the classical general theory of relativity that I was at a bit of a loose end in 1973 after the publication with George Ellis of our book The Large Scale Structure of Space–Time. My work with Penrose had shown that general relativity broke down at singularities, so the obvious next step would be to combine general relativity—the theory of the very large—with quantum theory—the theory of the very small. In particular, I wondered, can one have atoms in which the nucleus is a tiny primordial black hole, formed in the early universe? My investigations revealed a deep and previously unsuspected relationship between gravity and thermodynamics, the science of heat, and resolved a paradox that had been argued over for thirty years without much progress: how could the radiation left over from a shrinking black hole carry all of the information about what made the black hole? I discovered that information is not lost, but it is not returned in a useful way—like burning an encyclopedia but retaining the smoke and ashes. To answer this, I studied how quantum fields or particles would scatter off a black hole. I was expecting that part of an incident wave would be absorbed, and the remainder scattered. But to my great surprise I found there seemed to be emission from the black hole itself. At first, I thought this must be a mistake in my calculation. But what persuaded me that it was real was that the emission was exactly what was required to identify the area of the horizon with the entropy of a black hole. This entropy, a measure of the disorder of a system, is summed up in this simple formula which expresses the entropy in terms of the area of the horizon, and the three fundamental constants of nature, c, the speed of light, G, Newton’s constant of gravitation, and ħ, Planck’s constant. The emission of this thermal radiation from the black hole is now called Hawking radiation and I’m proud to have discovered it.
Stephen Hawking (Brief Answers to the Big Questions)
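For reference, the "simple formula" this passage describes is the standard Bekenstein-Hawking entropy. Written out (with Boltzmann's constant k_B included to supply the thermodynamic units), it reads:

```latex
% Bekenstein-Hawking black hole entropy: proportional to the horizon
% area A and built from the constants c, G, and hbar named in the text,
% plus Boltzmann's constant k_B.
S_{\mathrm{BH}} = \frac{k_{\mathrm{B}}\, c^{3} A}{4\, G \hbar}
```

Here A is the area of the event horizon, so the entropy grows with the hole's surface area rather than its volume, which is exactly the identification of horizon area with entropy described above.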
Because for Everettians, the explanation of the quantum arrow of time is the same as that of the entropic arrow of time: the initial conditions of the universe. Branching happens when systems become entangled with the environment and decohere, which unfolds as time moves toward the future, not the past. The number of branches of the wave function, just like the entropy, only increases with time. That means that the number of branches was relatively small to begin with. In other words, that there was a relatively low amount of entanglement between various systems and the environment in the far past. As with entropy, this is an initial condition we impose on the state of the universe, and at the present
Sean Carroll (Something Deeply Hidden: Quantum Worlds and the Emergence of Spacetime)
Many of the important principles in twentieth century physics are expressed as limitations on what we can know. Einstein's principle of relativity (which was an extension of a principle of Galileo's) says that we cannot do any experiment that would distinguish being at rest from moving at a constant velocity. Heisenberg's uncertainty principle tells us that we cannot know both the position and momentum of a particle to arbitrary accuracy. This new limitation tells us there is an absolute bound to the information available to us about what is contained on the other side of a horizon. It is known as Bekenstein's bound, as it was discussed in papers Jacob Bekenstein wrote in the 1970s shortly after he discovered the entropy of black holes.
Lee Smolin (Three Roads To Quantum Gravity)
The assessment will be guided by insights from research in particle physics, astrophysics, and cosmology that allow us to predict how the universe will unfold over epochs that dwarf the timeline back to the bang. There are significant uncertainties, of course, and like most scientists I live for the possibility that nature will slap down our hubris and reveal surprises we can’t yet fathom. But focusing on what we’ve measured, on what we’ve observed, and on what we’ve calculated, what we’ll find, as laid out in chapters 9 and 10, is not heartening. Planets and stars and solar systems and galaxies and even black holes are transitory. The end of each is driven by its own distinctive combination of physical processes, spanning quantum mechanics through general relativity, ultimately yielding a mist of particles drifting through a cold and quiet cosmos. How will conscious thought fare in a universe experiencing such transformation? The language for asking and answering this question is provided once again by entropy. And by following the entropic trail we will encounter the all-too-real possibility that the very act of thinking, undertaken by any entity of any kind anywhere, may be thwarted by an unavoidable buildup of environmental waste: in the distant future, anything that thinks may burn up in the heat generated by its own thoughts. Thought itself may become physically impossible. While the case against endless thought will be based on a conservative set of assumptions, we will also consider alternatives, possible futures more conducive to life and thinking. But the most straightforward reading suggests that life, and intelligent life in particular, is ephemeral. The interval on the cosmic timeline in which conditions allow for the existence of self-reflective beings may well be extremely narrow. Take a cursory glance at the whole shebang, and you might miss life entirely. Nabokov’s description of a human life as a “brief crack of light between two eternities of darkness”6 may apply to the phenomenon of life itself. We mourn our transience and take comfort in a symbolic transcendence, the legacy of having participated in the journey at all. You and I won’t be here, but others will, and what you and I do, what you and I create, what you and I leave behind contributes to what will be and how future life will live. But in a universe that will ultimately be devoid of life and consciousness, even a symbolic legacy—a whisper intended for our distant descendants—will disappear into the void. Where, then, does that leave us?
Brian Greene (Until the End of Time: Mind, Matter, and Our Search for Meaning in an Evolving Universe)
Setting the bar high in our approach to hiring has been, and will continue to be, the single most important element of Amazon.com’s success. During our hiring meetings, we ask people to consider three questions before making a decision: Will you admire this person? If you think about the people you’ve admired in your life, they are probably people you’ve been able to learn from or take an example from. For myself, I’ve always tried hard to work only with people I admire, and I encourage folks here to be just as demanding. Life is definitely too short to do otherwise. Will this person raise the average level of effectiveness of the group they’re entering? We want to fight entropy. The bar has to continuously go up. I ask people to visualize the company five years from now. At that point, each of us should look around and say, “The standards are so high now—boy, I’m glad I got in when I did!” Along what dimension might this person be a superstar? Many people have unique skills, interests, and perspectives that enrich the work environment for all of us. It’s often something that’s not even related to their jobs. One person here is a National Spelling Bee champion (1978, I believe). I suspect it doesn’t help her in her everyday work, but it does make working here more fun if you can occasionally snag her in the
Jeff Bezos (Invent and Wander: The Collected Writings of Jeff Bezos)
Instead, he ended up discovering nothing less than a brand new physical quantity, an entity that appears to be as fundamental to the way the universe works as energy or temperature. Because when the time came to try and understand entropy on the scale of atoms, an astonishing discovery was made. Entropy is directly related to information
Ben Miller (The Aliens Are Coming!: The Extraordinary Science Behind Our Search for Life in the Universe)
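The entropy-information link Miller alludes to is usually made through Shannon's formula. Below is a minimal editorial sketch (the example distributions are made up): the information in a probability distribution is H = -sum(p * log2 p), and for W equally likely microstates this reduces to log2(W), the same counting of configurations that underlies thermodynamic entropy.

```python
# Shannon entropy in bits; for a uniform distribution over W outcomes it
# reduces to log2(W), mirroring the Boltzmann-style count of microstates.
from math import log2

def shannon_entropy(probs):
    """Information content of a probability distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

W = 16
print(shannon_entropy([1 / W] * W))                # 4.0 == log2(16)
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits: more predictable, less information
```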
A sender of information creates order and so reverses the sign of entropy by assessing the largest possible array of informatic options, then choosing what fits the communicational situation. Systems that lack options are relatively predictable, low in information. Extreme order is as destructive of information as extreme chaos.
Bruce Clarke (From Energy to Information: Representation in Science and Technology, Art, and Literature (Writing Science))
impugn,
Elliot McGucken (The Physics of Time: Time & Its Arrows in Quantum Mechanics, Relativity, The Second Law of Thermodynamics, Entropy, The Twin Paradox, & Cosmology Explained via LTD Theory's Expanding Fourth Dimension)
So small changes in the ambient temperature over relatively short periods of time that are not sufficiently long for adaptive processes to develop can lead to huge ecological and climatological effects. Some of these may be positive, but many will be catastrophic. Regardless, however, of the sign of the effect, significant changes are upon us, and we desperately need to understand their origins and consequences and forge strategies for adaptation and mitigation. The crucial question is not whether these effects are anthropogenic in origin because they almost certainly are, but rather to what extent they can be minimized without leading to rapid discontinuous changes in our physical and economic environment and ultimately to the potential collapse of the global socioeconomic fabric. Hence my bewilderment at those in the general public including political and corporate leaders who reject the cautionary exhortations of scientists, environmentalists, and others, and why I am continually baffled by their lack of action. Yes, we should all delight in and promote the huge successes and fruits of the free market system and of the role of human ingenuity and innovation, but we should also recognize the critical roles of energy and entropy and together act strategically to find
Geoffrey West (Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life, in Organisms, Cities, Economies, and Companies)
Just like a wine glass breaking, or a car rusting, the entropy of a closed system will always increase with time.
Andrew Thomas (Hidden In Plain Sight: The simple link between relativity and quantum mechanics)
At first, one might think that something like thermodynamics is a rather restrictive concept because it concerns itself with temperature and heat. But its application is not just restricted to all things thermal. It is possible to relate the notion of entropy, which is a measure of disorder, to the more general and fruitful notions of 'information', of which we have already made use in discussing the richness of certain systems of axioms and rules of reasoning. We can think of the entropy of a large object like a black hole as being equal to the number of different ways in which its most elementary constituents can be rearranged in order to give the same large-scale state. This tells us the number of binary digits ('bits') that are needed to specify in every detail the internal configuration of the constituents out of which the black hole is composed. Moreover, we can also appreciate that a certain amount of information is forever lost to the outside observer when a horizon forms around a region of the universe to create a black hole.
John D. Barrow (Theories of Everything: The Quest for Ultimate Explanation)
These are the intricacies of probability as the hallmark of Nature and we, observers, can be very easily misled when looking at partial snapshots of the whole Reality -- particularly with our innate desire to find rock-solid patterns and laws. But… then… is there any relation at all between the new notion of Statistical Entropy and the Anisotropy of Time for which we presented so much evidence in Records of the Future? Clearly, as we just saw, not when we rely on the Statistical Entropy of a single perfectly isolated system!
Felix Alba-Juez (Aiming at REALITY: Statistical Entropy, Disorder, and the Quantum (Quantum Physics free of Folklore, Book 2))
Mike Rother says that it almost doesn’t matter what you improve, as long as you’re improving something. Why? Because if you are not improving, entropy guarantees that you are actually getting worse, which ensures that there is no path to zero errors, zero work-related accidents, and zero loss.
Gene Kim (The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win)
He married military history with science, building his theory upon Gödel, Heisenberg, Popper, Kuhn, Piaget and Polanyi, who highlighted the unavoidable feature of uncertainty in any system of thought (as well as the limits of the Newtonian paradigm). Cybernetics and systems theory offered him the concept of feedback, the combination of analysis–synthesis as well as the Second Law of Thermodynamics and entropy, the distinction between open and closed systems, the importance of interactions and relations, and the need for a holistic approach. The cognitive revolution, combined with neo-Darwinist studies, showed him the role of schemata formed by genetics, culture and experience. Chaos theory highlighted non-linear behavior.
Frans P.B. Osinga (Science, Strategy and War: The Strategic Theory of John Boyd (Strategy and History))
IT POPS, A new discovery, by: Jonathan Roy Mckinney IT POPS is relative to space time in that it has two major components that follow the 3 laws of thermodynamics. The PIrandom creator injects an object with the relative XYZ coordinates in a model view projection that are derived by utilizing the correct Matrix math 11,10,10,10. Once Injected, the two way hash coin creator in time will create the entropy required to derive random shapes and reverse or abstract them into full objects for a complete wire frame like Michaelangelo could carve an Angel out of marble.
Jonathan Roy Mckinney
We observe the universe from within it, interacting with a minuscule portion of the innumerable variables of the cosmos. What we see is a blurred image. This blurring suggests that the dynamic of the universe with which we interact is governed by entropy, which measures the amount of blurring. It measures something that relates to us more than to the cosmos.
Carlo Rovelli (The Order of Time)
hypothesis that psychically sensitive individuals may somehow, through some as-yet-undiscovered “psychic retina,” be detecting large, rapid changes in entropy as bright beacons on the landscape ahead in time.24 May’s argument makes a certain amount of sense given the classical equivalence of time’s arrow with entropy. Things that are very rapidly dissipating heat, such as stars and nuclear reactors and houses on fire, or even just a living body making the ultimate transition to the state of disorder called death, could perhaps be seen as concentrated time. But steep entropy gradients also represent a category of information that is intrinsically interesting and meaningful to humans and toward which we are particularly vigilant, whatever the sensory channel through which we receive it. An attentional bias to entropy gradients has been shown for the conventional senses of sight and hearing, not just psi phenomena. Stimuli involving sudden, rapid motion, and especially fire and heat, as well as others’ deaths and illness, are signals that carry important information related to our survival, so we tend to notice and remember them.25 Thus, an alternative explanation for the link between psi accuracy and entropy is the perverse pleasure—that is, jouissance—aroused in people by signs of destruction. Some vigilant part of us needs to be constantly scanning the environment for indications of threats to our life and health, which means we need on some level to find that search rewarding. If we were not rewarded, we would not keep our guard up. Entropic signals like smoke from an advancing fire, or screams or cries from a nearby victim of violence or illness, or the grief of a neighbor for their family member are all signifiers, part of what could be called the “natural language of peril.” We find it “enjoyable,” albeit in an ambivalent or repellent way, to engage with such signifiers because, again, their meaning, their signified, is our own survival. The heightened accuracy toward entropic targets that May observed could reflect a heightened fascination with fire, heat, and chaotic situations more generally, an attentional bias to survival-relevant stimuli. Our particular psychic fascination with fire may also reflect its central role as perhaps the most decisive technology in our evolutionary development as well as the most dangerous, always able to turn on its user in an unlucky instant.26 The same primitive threat-vigilance orientation accounts for the unique allure of artworks depicting destruction or the evidence of past destruction. In the 18th century, the sublime entered the vocabulary of art critics and philosophers like Edmund Burke and Immanuel Kant to describe the aesthetic appeal of ruins, impenetrable wilderness, thunderstorms and storms at sea, and other visual signals of potential or past peril, including the slow entropy of erosion and decay. Another definition of the sublime would be the semiotic of entropy.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
Hawking and Bekenstein had shown that the three great ideas of modern physics—general relativity, quantum mechanics, and thermodynamics—work in harmony. For these reasons, black hole entropy and radiation have come to dominate contemporary physics as scientists search for a so-called grand unified theory, the Holy Grail of a single principle that explains nature—the world, the universe, everything—at its most fundamental level.
Paul Sen (Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe)
In an essay summarizing the results of this research, Baumeister captured what I am trying to convey about the purpose of life, the laws of nature, and the cosmos as it relates to finding meaning, particularly in the context of our search for immortality, the afterlife, and utopia: Meaning is a powerful tool in human life. To understand what that tool is used for, it helps to appreciate something else about life as a process of ongoing change. A living thing might always be in flux, but life cannot be at peace with endless change. Living things yearn for stability, seeking to establish harmonious relationships with their environment. They want to know how to get food, water, shelter and the like. They find or create places where they can rest and be safe … Life, in other words, is change accompanied by a constant striving to slow or stop the process of change, which leads ultimately to death. If only change could stop, especially at some perfect point: that was the theme of the profound story of Faust’s bet with the devil. Faust lost his soul because he could not resist the wish that a wonderful moment would last forever. Such dreams are futile. Life cannot stop changing until it ends.14 That a meaningful, purposeful life comes from struggle and challenge against the vicissitudes of nature more than it does a homeostatic balance of extropic pushback against entropy reinforces the point that the Second Law of Thermodynamics is the First Law of Life. We must act in the world. The thermostat is always being adjusted, balance sought but never achieved. There is no Faustian bargain to be made in life. We may strive for immortality while never reaching it, as we may seek utopian bliss while never finding it, for it is the striving and the seeking that matter, not the attainment of the unattainable. We are volitional beings, so the choice to act is ours, and our sense of purpose is defined by reaching for the upper limits of our natural abilities and learned skills, and by facing challenges with courage and conviction.
Michael Shermer (Heavens on Earth: The Scientific Search for the Afterlife, Immortality, and Utopia)
Your luck surface area relates to the natural concept of entropy, which measures the amount of disorder in a system.
Gabriel Weinberg (Super Thinking: The Big Book of Mental Models)
We are fighting that force in the universe that nudges everything toward chaos. I mean that we are at war with time; we are enemies of entropy; we seek to snatch back those things that have been taken from us by the years—the childhood toys, the friends and relatives who are gone, the events of the past—everything, we struggle to recapture everything, back to the beginning of creation, out of this need not to let anything slip away.
Robert Silverberg (Across a Billion Years)
Society is neither an organism nor a machine; it is--like organisms and machines--a system. It is composed of components that are related in such a way that the whole is greater than, and essentially different from, the sum of the parts. This is so because relations between the parts are maintained by mechanisms of communication and control that depend on the flow of information, on "feedback," for effective operation. Cybernetic theory informs social analysis in a variety of ways: by focusing attention on system properties such as entropy and redundancy and on the values that function as operating rules; by emphasizing the extent to which the meaning and function of any part of the system is determined by context; and so on. Above all else, it reminds us that it is the context--a set of relationships, rather than any single component in isolation--that evolves.18 The focus of this book is on the evolving context of ideas in twentieth-century Vietnam. Vietnamese Society as a System of Yin and Yang In traditional Vietnamese culture we can find, in every domain of society, two different sets of operating principles, or values. These two sets can be used as the basis for a model of society and culture. One set can be seen as yang in nature; the other, as yin. Yang is defined by a tendency toward male dominance, high redundancy, low entropy, complex and rigid hierarchy, competition, and strict orthodoxy focused on rules for behavior based on social roles. Yin is defined by a tendency toward greater egalitarianism and flexibility, more female participation, mechanisms to dampen competition and conflict, high entropy, low redundancy, and more emphasis on feeling, empathy, and spontaneity. Much of traditional Vietnamese culture, social organization, and behavior expressed the balanced opposition between yin and yang as interlocking sets of ideas (including values, conceptual categories, operating rules, etc.). At a high level of abstraction, a great deal of persistence may be detected in the
Neil L. Jamieson (Understanding Vietnam (Philip E. Lilienthal Book.))
During our hiring meetings, we ask people to consider three questions before making a decision: Will you admire this person? If you think about the people you’ve admired in your life, they are probably people you’ve been able to learn from or take an example from. For myself, I’ve always tried hard to work only with people I admire, and I encourage folks here to be just as demanding. Life is definitely too short to do otherwise. Will this person raise the average level of effectiveness of the group they’re entering? We want to fight entropy. The bar has to continuously go up. I ask people to visualize the company five years from now. At that point, each of us should look around and say, “The standards are so high now—boy, I’m glad I got in when I did!” Along what dimension might this person be a superstar? Many people have unique skills, interests, and perspectives that enrich the work environment for all of us. It’s often something that’s not even related to their jobs.
Jeff Bezos (Invent and Wander: The Collected Writings of Jeff Bezos)
A tangent that departs from the real to the imaginary: pure consciousness does and does not transcend the body, and I believe this after hearing that my mother felt suicidal after she took her medicines for weight loss and her biggest regrets in life came crushing down on her for three days in a row. This is the best of what I have learnt in my years of fascination for science and knowledge, and to make you grasp this takes fullness of life: in hydrology, the wet and the dry, and the hot and the cold always co-exist, but they are also in flux and are also stable: all depending on the reference point of analysis. Consciousness beyond matter, and consciousness tied to matter co-exist in everyplace at different scales, and sometimes even in the same scale. Tao te ching (the way and its power) that fascinated Lao Tzu; the calculus of infinitesimals; the wonderful infinity of the number line and fractals that fascinated Ramanujan and Mandelbrot; the horn of the rhinoceros that fascinated Dali, thermodynamic and hydrodynamic equilibriums that fascinate all scientists, the surety of a fading perfume smell or the permanence of a shattered mirror that is easy to understand to anyone; the concepts of anti-fragility, entropy, volatility, randomness, disorder are all intimately tied to this. Consciousness is constantly attained and broken all around us all the time, and we rarely stop to think about this because it infinitesimally evades us. Here is where I begin to stretch this and I can't understand it and it is very discouraging -- prudence, temperance and courage -- some of the highest virtues may also be related to this. When you are prepared, it is consciousness. When we are unprepared for it, and this hits you without hurting you, it is magic and strength. Else, perhaps death.
Solomon Vimal