Famous Science Experiments Quotes

We've searched our database for all the quotes and captions related to Famous Science Experiments. Here they are! All 34 of them:

Consciousness is our gateway to experience: It enables us to recognize Van Gogh’s starry skies, be enraptured by Beethoven’s Fifth, and stand in awe of a snowcapped mountain. Yet consciousness is subjective, personal, and famously difficult to examine.
Daniel Bor (The Ravenous Brain: How the New Science of Consciousness Explains Our Insatiable Search for Meaning)
The purpose of a thought-experiment, as the term was used by Schrödinger and other physicists, is not to predict the future - indeed Schrödinger's most famous thought experiment goes to show that the "future," on the quantum level, cannot be predicted - but to describe reality, the present world. Science fiction is not predictive; it is descriptive. Predictions are uttered by prophets (free of charge), by clairvoyants (who usually charge a fee, and are therefore more honored in their day than prophets), and by futurologists (salaried). Prediction is the business of prophets, clairvoyants, and futurologists. It is not the business of novelists. A novelist's business is lying.
Ursula K. Le Guin (The Left Hand of Darkness)
One of the various theories proposed to explain the negative result of the famous Michelson-Morley experiment with light waves (conceived to measure the absolute space), was based on the ballistic hypothesis, i.e. on postulating that the speed of light predicted by Maxwell's equations was not given as relative to the medium but as relative to the transmitter (firearm). Had that been the case, the experiment's negative results would not have caused such perplexity and frustration (as we shall see in forthcoming sections).
Felix Alba-Juez (Galloping with Sound - The Grand Cosmic Conspiracy (Relativity free of Folklore #5))
The purpose of a thought-experiment, as the term was used by Schrödinger and other physicists, is not to predict the future— indeed Schrödinger's most famous thought-experiment goes to show that the 'future,' on the quantum level, cannot be predicted— but to describe reality, the present world. Science fiction is not predictive; it is descriptive.
Ursula K. Le Guin
Catfish always drink alcoholic ether if begged, for every catfish enjoys heightened intoxication; gross indulgence can be calamitous, however; duly, garfish babysit for dirty catfish children, helping catfish babies get instructional education just because garfish get delight assisting infants’ growth and famously inspire confidence in immature catfish, giving experience (and joy even); however, blowfish jeer insightful garfish, disparaging inappropriately, doing damage, even insulting benevolent, charming, jovial garfish, hurting and frustrating deeply; joy fades but hurt feelings bring just grief; inevitable irritation hastens feeling blue; however, jovial children declare happiness, blowfishes’ evil causes dejection, blues; accordingly, always glorify jolly, friendly garfish!
John Green
Psychologists call these fully absorbing experiences flow states, which were discovered and named by a world-famous psychologist with the most unpronounceable surname I have ever encountered – Mihaly Csikszentmihalyi.
Ilona Boniwell
“On the Electrodynamics of Moving Bodies” Now let’s look at how Einstein articulated all of this in the famous paper that the Annalen der Physik received on June 30, 1905. For all its momentous import, it may be one of the most spunky and enjoyable papers in all of science. Most of its insights are conveyed in words and vivid thought experiments, rather than in complex equations. There is some math involved, but it is mainly what a good high school senior could comprehend. “The whole paper is a testament to the power of simple language to convey deep and powerfully disturbing ideas,” says the science writer Dennis Overbye.
Walter Isaacson (Einstein: His Life and Universe)
Political economist and sociologist Max Weber famously spoke of the “disenchantment of the world,” as rationalization and science led Europe and America into modern industrial society, pushing back religion and all “magical” theories about reality. Now we are witnessing the disenchantment of the self. One of the many dangers in this process is that if we remove the magic from our image of ourselves, we may also remove it from our image of others. We could become disenchanted with one another. Our image of Homo sapiens underlies our everyday practice and culture; it shapes the way we treat one another as well as how we subjectively experience ourselves. In Western societies, the Judeo-Christian image of humankind—whether you are a believer or not—has secured a minimal moral consensus in everyday life. It has been a major factor in social cohesion. Now that the neurosciences have irrevocably dissolved the Judeo-Christian image of a human being as containing an immortal spark of the divine, we are beginning to realize that they have not substituted anything that could hold society together and provide a common ground for shared moral intuitions and values. An anthropological and ethical vacuum may well follow on the heels of neuroscientific findings. This is a dangerous situation. One potential scenario is that long before neuroscientists and philosophers have settled any of the perennial issues—for example, the nature of the self, the freedom of the will, the relationship between mind and brain, or what makes a person a person—a vulgar materialism might take hold. More and more people will start telling themselves: “I don’t understand what all these neuroexperts and consciousness philosophers are talking about, but the upshot seems pretty clear to me. The cat is out of the bag: We are gene-copying bio-robots, living out here on a lonely planet in a cold and empty physical universe. We have brains but no immortal souls, and after seventy years or so the curtain drops. There will never be an afterlife, or any kind of reward or punishment for anyone, and ultimately everyone is alone. I get the message, and you had better believe I will adjust my behavior to it. It would probably be smart not to let anybody know I’ve seen through the game.”
Thomas Metzinger
As you know, there was a famous quarrel between Max Planck and Einstein, in which Einstein claimed that, on paper, the human mind was capable of inventing mathematical models of reality. In this he generalized his own experience because that is what he did. Einstein conceived his theories more or less completely on paper, and experimental developments in physics proved that his models explained phenomena very well. So Einstein says that the fact that a model constructed by the human mind in an introverted situation fits with outer facts is just a miracle and must be taken as such. Planck does not agree, but thinks that we conceive a model which we check by experiment, after which we revise our model, so that there is a kind of dialectic friction between experiment and model by which we slowly arrive at an explanatory fact compounded of the two. Plato-Aristotle in a new form! But both have forgotten something- the unconscious. We know something more than those two men, namely that when Einstein makes a new model of reality he is helped by his unconscious, without which he would not have arrived at his theories...But what role DOES the unconscious play?...either the unconscious knows about other realities, or what we call the unconscious is a part of the same thing as outer reality, for we do not know how the unconscious is linked with matter.
Marie-Louise von Franz (Alchemy: An Introduction to the Symbolism and the Psychology)
Yoga has been superficially misunderstood by certain Western writers, but its critics have never been its practitioners. Among many thoughtful tributes to yoga may be mentioned one by Dr. C. G. Jung, the famous Swiss psychologist. “When a religious method recommends itself as ‘scientific,’ it can be certain of its public in the West. Yoga fulfills this expectation,” Dr. Jung writes (7). “Quite apart from the charm of the new, and the fascination of the half-understood, there is good cause for Yoga to have many adherents. It offers the possibility of controllable experience, and thus satisfies the scientific need of ‘facts,’ and besides this, by reason of its breadth and depth, its venerable age, its doctrine and method, which include every phase of life, it promises undreamed-of possibilities. “Every religious or philosophical practice means a psychological discipline, that is, a method of mental hygiene. The manifold, purely bodily procedures of Yoga (8) also mean a physiological hygiene which is superior to ordinary gymnastics and breathing exercises, inasmuch as it is not merely mechanistic and scientific, but also philosophical; in its training of the parts of the body, it unites them with the whole of the spirit, as is quite clear, for instance, in the Pranayama exercises where Prana is both the breath and the universal dynamics of the cosmos. “When the thing which the individual is doing is also a cosmic event, the effect experienced in the body (the innervation), unites with the emotion of the spirit (the universal idea), and out of this there develops a lively unity which no technique, however scientific, can produce. Yoga practice is unthinkable, and would also be ineffectual, without the concepts on which Yoga is based. It combines the bodily and the spiritual with each other in an extraordinarily complete way. “In the East, where these ideas and practices have developed, and where for several thousand years an unbroken tradition has created the necessary spiritual foundations, Yoga is, as I can readily believe, the perfect and appropriate method of fusing body and mind together so that they form a unity which is scarcely to be questioned. This unity creates a psychological disposition which makes possible intuitions that transcend consciousness.” The Western day is indeed nearing when the inner science of self-control will be found as necessary as the outer conquest of nature. This new Atomic Age will see men’s minds sobered and broadened by the now scientifically indisputable truth that matter is in reality a concentrate of energy. Finer forces of the human mind can and must liberate energies greater than those within stones and metals, lest the material atomic giant, newly unleashed, turn on the world in mindless destruction (9).
Paramahansa Yogananda (Autobiography of a Yogi (Illustrated and Annotated Edition))
What gave my book the more sudden and general celebrity, was the success of one of its proposed experiments, made by Messrs. Dalibard and De Lor at Marly, for drawing lightning from the clouds. This engag'd the public attention every where. M. de Lor, who had an apparatus for experimental philosophy, and lectur'd in that branch of science, undertook to repeat what he called the Philadelphia Experiments; and, after they were performed before the king and court, all the curious of Paris flocked to see them. I will not swell this narrative with an account of that capital experiment, nor of the infinite pleasure I receiv'd in the success of a similar one I made soon after with a kite at Philadelphia, as both are to be found in the histories of electricity.
Benjamin Franklin (The Complete Harvard Classics - ALL 71 Volumes: The Five Foot Shelf & The Shelf of Fiction: The Famous Anthology of the Greatest Works of World Literature)
The concept of absolute time—meaning a time that exists in “reality” and tick-tocks along independent of any observations of it—had been a mainstay of physics ever since Newton had made it a premise of his Principia 216 years earlier. The same was true for absolute space and distance. “Absolute, true, and mathematical time, of itself and from its own nature, flows equably without relation to anything external,” he famously wrote in Book 1 of the Principia. “Absolute space, in its own nature, without relation to anything external, remains always similar and immovable.” But even Newton seemed discomforted by the fact that these concepts could not be directly observed. “Absolute time is not an object of perception,” he admitted. He resorted to relying on the presence of God to get him out of the dilemma. “The Deity endures forever and is everywhere present, and by existing always and everywhere, He constitutes duration and space.”45 Ernst Mach, whose books had influenced Einstein and his fellow members of the Olympia Academy, lambasted Newton’s notion of absolute time as a “useless metaphysical concept” that “cannot be produced in experience.” Newton, he charged, “acted contrary to his expressed intention only to investigate actual facts.”46 Henri Poincaré also pointed out the weakness of Newton’s concept of absolute time in his book Science and Hypothesis, another favorite of the Olympia Academy. “Not only do we have no direct intuition of the equality of two times, we do not even have one of the simultaneity of two events occurring in different places,” he wrote.
Walter Isaacson (Einstein: His Life and Universe)
Moore’s Law, the rule of thumb in the technology industry, tells us that processor chips—the small circuit boards that form the backbone of every computing device—double in speed every eighteen months. That means a computer in 2025 will be sixty-four times faster than it is in 2013. Another predictive law, this one of photonics (regarding the transmission of information), tells us that the amount of data coming out of fiber-optic cables, the fastest form of connectivity, doubles roughly every nine months. Even if these laws have natural limits, the promise of exponential growth unleashes possibilities in graphics and virtual reality that will make the online experience as real as real life, or perhaps even better. Imagine having the holodeck from the world of Star Trek, which was a fully immersive virtual-reality environment for those aboard a ship, but this one is able to both project a beach landscape and re-create a famous Elvis Presley performance in front of your eyes. Indeed, the next moments in our technological evolution promise to turn a host of popular science-fiction concepts into science facts: driverless cars, thought-controlled robotic motion, artificial intelligence (AI) and fully integrated augmented reality, which promises a visual overlay of digital information onto our physical environment. Such developments will join with and enhance elements of our natural world. This is our future, and these remarkable things are already beginning to take shape. That is what makes working in the technology industry so exciting today. It’s not just because we have a chance to invent and build amazing new devices or because of the scale of technological and intellectual challenges we will try to conquer; it’s because of what these developments will mean for the world.
Eric Schmidt (The New Digital Age: Reshaping the Future of People, Nations and Business)
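[Editorial aside] A minimal sketch of the doubling arithmetic quoted above, under the quote's own rules of thumb (an 18-month doubling period for processors and a 9-month period for fiber-optic throughput; the function name is ours). Note that 2^6 = 64 corresponds to nine years of 18-month doublings, while 2013 to 2025 spans twelve years, i.e. eight doublings:

```python
def projected_speedup(years: float, doubling_period_months: float) -> float:
    """Multiplicative speedup after `years`, assuming one doubling
    every `doubling_period_months` months (a simple rule of thumb)."""
    doublings = years * 12 / doubling_period_months
    return 2 ** doublings

# Processor speed under the quote's 18-month doubling assumption:
print(projected_speedup(9, 18))   # 64.0  (2**6, nine years of doublings)
print(projected_speedup(12, 18))  # 256.0 (2**8, the full 2013-2025 span)

# Fiber-optic throughput under the 9-month doubling rule of thumb:
print(projected_speedup(12, 9))   # 65536.0
```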
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options.35 On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.”36 That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The Extraordinary Persons Project In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
When admitting you are wrong, you gain back the control others took away from you when making you lose it. That's why you must say sorry. It represents a change of attitude but not really a change of personality; The changes on the personality come later on, when, by controlling yourself better, you don't express anger. Because saying sorry means nothing but anger means a lot. You should not want to be an angry person. When you get angry, those who make you angry, win; They win control over your emotional state, your thoughts, your words and your behaviors. They may then accuse you of always being angry and never apologizing, but that's not where you should focus your attention. The main point here, is that you’re living on the basis of instinctive reaction and not awareness or consciousness. So, when you say sorry, you are acknowledging that there is no excuse for losing control over yourself. You should not be sorry for being angry. That's an emotion; and you can't feel sorry for feeling. When you’re angry, you are feeling. When you insult, however, you are losing, yourself, your self-control, your self-respect, and even your capacity to use what you know. More knowledge, makes you more aware, more frustrated, having more and higher expectations on others, and more angry too, more often as well. But that's your problem! No other people's problem! They are just being themselves. Most people really think they are perfect as they are, and that the problems they experience are all outside themselves. And by realizing that, you say sorry as if saying sorry for not being who you really are. And when doing it, you get back the control another person took away from you. It is actually not good when someone needs to say sorry too often to someone else, especially if it’s always the same individual. But that someone else often likes it, as it makes them feel superior. That’s because their ego needs that. They have low self-esteem. Most people do! And that’s why most people's behavior is wired to their ego. Their likes and dislikes are connected to a sense of self-importance and a desperate need to feel important, which they project on their idols, the famous and most popular among them. They admire what they seek the most. When they think they are not important, they offend, to get aggression, which is a desperate need for attention; and to feel like victims of life, which is a deeper state of need, in this case, related to sympathy; and they then blame the other for what he does, for his reactions; and when that other says sorry, they think they have power over that insane cycle in which they now live, and in which they incorporate anyone else, and which they now perfectly master. Their pride is built on arrogance, an arrogance emerged out of ignorance, ignorance composed from delusional cycles within a big illusion; but an illusion that makes sense to them, as if they were succeeding at merging truth with lie, darkness with light. Because the arrogant, the abusive and the violent are desperate. God made them blind after witnessing their crimes against moral and ethics - His own laws. And they want to see again, and feel the same pleasure they once felt when witnessing the true colors of the world during childhood. The arrogant want to reaffirm their sanity by acting insanely because they know no other way. And when you say sorry, you are saying to them that you don't belong there, to their world, and that you are sorry for playing their games. That drama belongs to them only, and not you. 
And yet, people interpret the same paradox as they choose. That is their experience of truth and how they put sense on a life without any. And when so much nonsense becomes popular, we call it common sense. When common sense becomes a reality, we call it science. And when science is able to theorize common sense, we call it wisdom. Then, we wonder why the wisdom of those we name wise, does not help.
Robin Sacredfire
Science and philosophy have for centuries been sustained by unquestioning faith in perception. Perception opens a window on to things. This means that it is directed, quasi-teleologically, towards a *truth in itself* in which the reason underlying all appearances is to be found. The tacit thesis of perception is that at every instant experience can be co-ordinated with that of the previous instant and that of the following, and my perspective with that of other consciousnesses—that all contradictions can be removed, that monadic and intersubjective experience is one unbroken text—that what is now indeterminate for me could become determinate for a more complete knowledge, which is as it were realized in advance in the thing, or rather which is the thing itself. Science has first been merely the sequel or amplification of the process which constitutes perceived things. Just as the thing is the invariant of all sensory fields and of all individual perceptual fields, so the scientific concept is the means of fixing and objectifying phenomena. Science defined a theoretical state of bodies not subject to the action of any force, and *ipso facto* defined force, reconstituting with the aid of these ideal components the processes actually observed. It established statistically the chemical properties of pure bodies, deducing from these those of empirical bodies, and seeming thus to hold the plan of creation or in any case to have found a reason immanent in the world. The notion of geometrical space, indifferent to its contents, that of pure movement which does not by itself affect the properties of the object, provided phenomena with a setting of inert existence in which each event could be related to physical conditions responsible for the changes occurring, and therefore contributed to this freezing of being which appeared to be the task of physics. In thus developing the concept of the thing, scientific knowledge was not aware that it was working on a presupposition. Precisely because perception, in its vital implications and prior to any theoretical thought, is presented as perception of a being, it was not considered necessary for reflection to undertake a genealogy of being, and it was therefore confined to seeking the conditions which make being possible. Even if one took account of the transformations of determinant consciousness, even if it were conceded that the constitution of the object is never completed, there was nothing to add to what science said of it; the natural object remained an ideal unity for us and, in the famous words of Lachelier, a network of general properties. It was no use denying any ontological value to the principles of science and leaving them with only a methodical value, for this reservation made no essential change as far as philosophy was concerned, since the sole conceivable being remained defined by scientific method. The living body, under these circumstances, could not escape the determinations which alone made the object into an object and without which it would have had no place in the system of experience. The value predicates which the reflecting judgment confers upon it had to be sustained, in being, by a foundation of physico-chemical properties. In ordinary experience we find a fittingness and a meaningful relationship between the gesture, the smile and the tone of a speaker. 
But this reciprocal relationship of expression which presents the human body as the outward manifestation of a certain manner of being-in-the-world, had, for mechanistic physiology, to be resolved into a series of causal relations.” —from _Phenomenology of Perception_. Translated by Colin Smith, pp. 62-64
Maurice Merleau-Ponty
A related issue to the Anthropic Principle is the so-called “god-of-the-gaps” in which theists argue that the (shrinking) number of issues that science has not yet explained require the existence of a god. For example, science has not (yet) been able to demonstrate the creation of a primitive life-form in the laboratory from non-living material (though US geneticist Craig Venter’s recent demonstration lays claim to having created such a laboratory synthetic life-form, the “Mycoplasma Laboratorium”). It is therefore concluded that a god is necessary to account for this step because of the “gap” in scientific knowledge. The issue of creating life in the laboratory (and other similar “gap” issues such as those in the fossil record) is reminiscent of other such “gaps” in the history of science that have since been bridged. For example, the laboratory synthesis of urea from inorganic materials by Friedrich Wöhler in 1828 at that time had nearly as much impact on religious believers as Copernicus’s heliocentric universe proposal. From the time of the Ancient Egyptians, the doctrine of vitalism had been dominant. Vitalism argued that the functions of living organisms included a “vital force” and therefore were beyond the laws of physics and chemistry. Urea (carbamide) is a natural metabolite found in the urine of animals that had been widely used in agriculture as a fertilizer and in the production of phosphorus. However, Friedrich Wöhler was the first to demonstrate that a natural organic material could be synthesized from inorganic materials (a combination of silver isocyanate and ammonium chloride leads to urea as one of its products). The experiment led Wöhler famously to write to a fellow chemist that it was “the slaying of a beautiful hypothesis by an ugly fact,” that is, the slaying of vitalism by urea in a Petri dish. In practice, it took more than just Wöhler’s demonstration to slay vitalism as a scientific doctrine, but the synthesis of urea in the laboratory is one of the key advances in science in which the “gap” between the inorganic and the organic was finally bridged. And Wöhler certainly pissed on the doctrine of vitalism, if you will excuse a very bad joke.
Mick Power (Adieu to God: Why Psychology Leads to Atheism)
To make things more confusing, for most physics equations, time can go in either direction (forward or backwards, +t or -t). This doesn’t really match with our everyday experience, even though the equations work out. Suffice it to say that so far, physicists have not been super helpful in improving our everyday understanding of the flow of time, although they are working hard on it. What about philosophers? If you think time is completely subjective or mental and does not or cannot exist in the physical world, you are in good company with the likes of John McTaggart and St Augustine. In contrast, if you feel that time is both a physical fact as well as a mental experience, you will be in good company with most present-day philosophers, who think of time as the thing that describes how change happens; the thing that we try to measure by using a clock. That’s not really clear either, but it seems a bit ahead of the physicists – maybe. How about the psychologists and cognitive neuroscientists? Most of them focus their investigations on the mental experience of time as opposed to physical time. People generally agree that there is a kind of order to the events we experience in our lives, which, when put together, we call the flow of time, temporal flow, or the “stream of consciousness” as psychologist William James famously put it. Many psychologists and neuroscientists studying time and time perception try to understand this mystery by trying to figure out how the mind and brain create a sense of temporal flow. The subjective sense of temporal flow is all very interesting, but it won’t get us precisely where we want to be, which is to understand how precognition of actual physical future events might actually be possible. Understanding the science of precognition can be thought of as understanding how we might access information about events that occur in the future of our own personal temporal flow, relative to our own personal “now”. This sounds like mental time travel rather than physical time travel, and that is a reasonable way to think about it. It could even be completely accurate. But you can also think about the science of precognition in physical terms, as trying to understand how future physical events can influence past physical events. Either way, when we have premonitions, it feels as if the future is pulling us forward both physically and mentally.
Theresa Cheung (The Premonition Code: The Science of Precognition, How Sensing the Future Can Change Your Life)
As in Schrödinger’s cat, the famous thought experiment. Imagine a cat, a vial of poison, and a radioactive source in a sealed box. If an internal sensor registers radioactivity, like an atom decaying, the vial is broken, releasing a poison that kills the cat. The atom has an equal chance of decaying or not decaying. It’s an ingenious way of linking an outcome in the classical world, our world, to a quantum-level event. The Copenhagen interpretation of quantum mechanics suggests a crazy thing: before the box is opened, before observation occurs, the atom exists in superposition—an undetermined state of both decaying and not decaying. Which means, in turn, that the cat is both alive and dead. And only when the box is opened, and an observation made, does the wave function collapse into one of two states. In other words, we only see one of the possible outcomes. For instance, a dead cat. And that becomes our reality. But then things get really weird. Is there another world, just as real as the one we know, where we opened the box and found a purring, living cat instead? The Many-Worlds interpretation of quantum mechanics says yes. That when we open the box, there’s a branch. One universe where we discover a dead cat. One where we discover a live one. And it’s the act of our observing the cat that kills it—or lets it live. And then it gets mind-fuckingly weird. Because those kinds of observations happen all the time.
Blake Crouch (Dark Matter)
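[Editorial aside] For readers curious how the "equal chance of decaying or not decaying" in the passage above plays out statistically, here is a minimal sketch (our own illustration, not drawn from the novel) that samples many runs of the sealed-box setup. Each "observation" yields exactly one definite outcome, and over many runs the alive/dead split approaches 50/50; the superposition itself, of course, cannot be captured by so simple a classical simulation.

```python
import random

def open_box(p_decay: float = 0.5) -> str:
    """One run of the thought experiment: the atom decays (poison released,
    cat found dead) with probability p_decay, otherwise the cat is found alive."""
    return "dead" if random.random() < p_decay else "alive"

trials = 100_000
outcomes = [open_box() for _ in range(trials)]
print(outcomes.count("alive") / trials)  # roughly 0.5: one definite outcome per observation
```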
Newton was a decidedly odd figure – brilliant beyond measure, but solitary, joyless, prickly to the point of paranoia, famously distracted (upon swinging his feet out of bed in the morning he would reportedly sometimes sit for hours, immobilized by the sudden rush of thoughts to his head), and capable of the most riveting strangeness. He built his own laboratory, the first at Cambridge, but then engaged in the most bizarre experiments. Once he inserted a bodkin – a long needle of the sort used for sewing leather – into his eye socket and rubbed it around ‘betwixt my eye and the bone as near to [the] backside of my eye as I could’ just to see what would happen. What happened, miraculously, was nothing – at least, nothing lasting. On another occasion, he stared at the Sun for as long as he could bear, to determine what effect it would have upon his vision. Again he escaped lasting damage, though he had to spend some days in a darkened room before his eyes forgave him. Set atop these odd beliefs and quirky traits, however, was the mind of a supreme genius – though even when working in conventional channels he often showed a tendency to peculiarity. As a student, frustrated by the limitations of conventional mathematics, he invented an entirely new form, the calculus, but then told no one about it for twenty-seven years. In like manner, he did work in optics that transformed our understanding of light and laid the foundation for the science of spectroscopy, and again chose not to share the results for three decades.
Bill Bryson (A Short History of Nearly Everything)
Samuel Gregg: Smith’s experiments have also provided considerable evidence that, as he wrote in a 1994 paper, “economic agents can achieve efficient outcomes which are not part of their intention.” Many will recognize this as one of the central claims of The Wealth of Nations, the book written by Smith’s famous namesake two and a half centuries ago. Interestingly, Adam Smith’s argument was not one that Vernon Smith had been inclined to accept before beginning his experimental research. As the latter went on to say in his 1994 paper, “few outside of the Austrian and Chicago traditions believed it, circa 1956. Certainly, I was not primed to believe it, having been raised by a socialist mother, and further handicapped (in this regard) by a Harvard education.” Given, however, what his experiments revealed about what he called “the error in my thinking,” Smith changed his mind. Truth was what mattered—not ego or preexisting ideological commitments.
Vernon L. Smith (The Evidence of Things Not Seen: Reflections on Faith, Science, and Economics)
The Natural Law Argument Bertrand Russell: “There is, as we all know, a law that if you throw dice, you will get double sixes only about once in thirty-six times, and we do not regard that as evidence that the fall of the dice is regulated by design.” Russell's argument is a logical fallacy because we cannot impose our understanding and interpretation of playing dice on God or the natural law. We must first define or understand our subject to talk about anything with scientific precision. Since nobody has an understanding of the world before the world, to put it that way, we cannot have a clear understanding or grasp of the things that are beyond our cognitive powers. We still can think about them. To say that science is only what is proven by scientific experiments would be foolish because that would exclude the vast space of the unknown, even unknowable. Maybe God does not play dice, but maybe even God needs, metaphorically speaking, to throw out thirty-six worlds to make some effects, even if only two, that would otherwise not be possible. As we know, matter cannot power itself and organize itself without the underlying creative force empowering it. Matter is matter thanks to our perceptive and cognitive powers, not per se. Matter per se does not exist in the form we see it. What we see is a reality based on our senses. We cannot completely rely on our senses to tell the underlying reality. Reaching the underlying reality is possible only through abstract thought. This abstract thought will enhance scientific discoveries because we cannot reach the physically unreachable by experiments or strictly scientific means. Identification of God from religious books with God independent of holy books is prevalent in the books or arguments against God used by the most famous atheists, including agnostics like Bertrand Russell. However, a huge difference exists between a God from religious books and Spinoza’s God or the God of many philosophers and scientists. Once we acknowledge and accept this important difference, we will realize that the gap between believers (not contaminated by religions) and atheists (or agnostics) is much smaller than it looks at first sight. God is not in the religious books, nor can he be owned through religious books. The main goal of the major monotheistic religions is to a priori appropriate and establish the right to God rather than to define and explain God in the deepest possible sense because that is almost impossible, even for science and philosophy. For that reason, a belief in blind faith and fear mostly saves major religions, rather than pure belief, unaffected by religious influence or deceit.
Dejan Stojanovic
In a famous hoax, physicist Alan Sokal submitted an article to a leading journal of cultural studies purporting to describe how quantum gravity could produce a “liberatory postmodern science.” The article, which parodied the convoluted style of argument in the fashionable academic world of cultural studies, was promptly published by the editors. Sokal announced that his intention was to test the intellectual standards of the discipline by checking whether the journal would publish a piece “liberally salted with nonsense.” Sokal, “A Physicist Experiments with Cultural Studies,” April 15, 1996,
Dani Rodrik (Economics Rules: The Rights and Wrongs of the Dismal Science)
We need to analyze and contemplate the experience of modernity in the Arab and Muslim world, in order to grasp what is happening. Some of us, for example, reject modernity, and yet it’s obvious that these same people are using the products of modernity, even to the extent that when proselytizing their interpretation of Islam, which conflicts with modernity, they’re employing the tools of modernity to do so. This strange phenomenon can best be understood by contemplating our basic attitude towards modernity, stemming from two centuries ago. If we analyze books written by various Muslim thinkers at the time, concerning modernity and the importance of modernizing our societies, and so forth, we can see that they distinguished between certain aspects of modernity that should be rejected, and others that may be accepted. You can find this distinction in the very earliest books that Muslim intellectuals wrote on the topic of modernity. To provide a specific example, I’ll cite an important book that is widely regarded as having been the first ever written about modern thought in the Muslim world, namely, a book by the famous Egyptian intellectual, Rifa’ Rafi’ al-Tahtawi (1801–1873), Takhlish al-Ibriz fi Talkhish Baris, whose title may be translated as Mining Gold from Its Surrounding Dross. As you can immediately grasp from its title, the book distinguishes between the “gold” contained within modernity—gold being a highly prized, expensive and rare product of mining—and its so-called “worthless” elements, which Muslims are forbidden to embrace. Now if we ask ourselves, “What elements of modernity did these early thinkers consider acceptable, and what did they demand that we reject?,” we discover that technology is the “acceptable” element of modernity. We are told that we may adopt as much technology as we want, and exploit these products of modernity to our heart’s content. But what about the modes of thought that give rise to these products, and underlie the very phenomenon of modernity itself? That is, the free exercise of reason, and critical thought? These two principles are rejected and proscribed for Muslims, who may adopt the products of modernity, while its substance, values and foundations, including its philosophical modes of thought, are declared forbidden. Shaykh Rifa’ Rafi’ al-Tahtawi explained that we may exploit knowledge that is useful for defense, warfare, irrigation, farming, etc., and yet he simultaneously forbade us to study, or utilize, the philosophical sciences that gave rise to modern thought, and the love for scientific methodologies that enlivens the spirit of modern knowledge, because he believed that they harbored religious deviance and infidelity (to God).
علي مبروك
Benjamin Franklin famously observed that “an ounce of prevention is worth a pound of cure.” Dozens of experiments have shown that early interventions can help students facing disadvantages and learning disabilities make leaps in math and reading.
Adam M. Grant (Hidden Potential: The Science of Achieving Greater Things)
Franklin, "the most accomplished American of his age and the most influential in inventing the type of society America would become."[4] Franklin became a newspaper editor, printer, and merchant in Philadelphia, becoming very wealthy, writing and publishing Poor Richard's Almanack and The Pennsylvania Gazette. Franklin was interested in science and technology, and gained international renown for his famous experiments. He played a major role in establishing the University of Pennsylvania and Franklin & Marshall College and was elected the first president of the American Philosophical Society. Franklin became a national hero in America when he spearheaded the effort to have Parliament repeal the unpopular Stamp Act. An accomplished diplomat, he was widely admired among the French as American minister to Paris and was a major figure in the development of positive Franco-American relations.
Benjamin Franklin (The Articles of Confederation)
Nietzsche’s most famous views are his earliest ones: the accounts of the Apollonian and Dionysian “art-drives” (Kunsttrieben) in The Birth of Tragedy. Already there, let’s note, Nietzsche is explaining aesthetic experience by “drives”. But in that first book these drives are mainly thought of in Schopenhauer’s way, as manifestations of a metaphysical, noumenal will. This early aesthetics is premised as responding to this noumenal reality: both Apollonian and Dionysian art drives are ways of coping with that reality of Schopenhauerian will. But Nietzsche soon insists on thinking of drives scientifically—not only of what they are (the body’s abilities), but of why we have them (evolution by selection)... It’s in aesthetics that this step into naturalism moves Nietzsche furthest from Schopenhauer. For Schopenhauer had depicted our aesthetic experience as (unlike intellect) genuinely a disengagement from willing: it really achieves the objectivity we only thought we could have in our science. But Nietzsche insists that it too expresses a (naturalized) will and drive—and “serves life” by making us more fit. As such, the aesthetic attitude is not “disinterested” or “disengaged” at all, as not just Schopenhauer but Kant had found it. Nietzsche now scorns their notion of it. The aesthetic attitude in fact involves a heightening of our engagement and feeling. These drives, in which art and aesthetic experience are ultimately rooted, are something ancient and fixed in us. Indeed, artistic drives have been designed into all organisms. They were set into our bodies and our “blood” in our presocietal deep history, and persist there today beneath the layers of customs and habits that societies have superimposed on them (to exploit them, or counteract them, or both). By acting on these drives, beauty works on the “animal” in us—directly on the body, on the “muscles and senses” (WP809 [1888]), and the drives embedded in them. Our bodies themselves have a taste for certain kinds of beauty—above all the beauty of human bodies.
John Richardson (Nietzsche's New Darwinism)
Understanding how the climate system responds to human influences is, unfortunately, a lot like trying to understand the connection between human nutrition and weight loss, a subject famously unsettled to this day. Imagine an experiment where we fed someone an extra half cucumber each day. That would be about an extra twenty calories, a 1 percent increase to the average 2,000-calorie daily adult diet. We’d let that go on for a year and see how much weight they gained. Of course, we would need to know many other things to draw any meaningful conclusions from the results: What else did they eat? How much did they exercise? Were there any changes in health or hormones that affect the rate at which they burn calories? Many things would have to be measured precisely to understand the effect of the additional cucumbers, although we would expect that, all else being equal, the added calories would add some weight. The problem with human-caused carbon dioxide and the climate is that, as in the cucumber experiment, all else isn’t necessarily equal, as there are other influences (forcings) on the climate, both human and natural, that can confuse the picture. Among the other human influences on the climate are methane emissions into the atmosphere (from fossil fuels, but more importantly from agriculture) and other minor gases that together exert a warming influence almost as great as that of human-caused CO2.
Steven E. Koonin (Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters)
One of the most overlooked aspects of excellence is how much work it takes. Fame can come easily and overnight, but excellence is almost always accompanied by a crushing workload, pursued with single-minded intensity. Strenuous effort over long periods of time is a repetitive theme in the biographies of the giants, sometimes taking on mythic proportions (Michelangelo painting the ceiling of the Sistine Chapel). Even the most famous supposed exception, Mozart, illustrates the rule. He was one of the lighter spirits among the giants, but his reputation for composing effortlessly was overstated—Mozart himself complained on more than one occasion that it wasn’t as easy as it looked1—and his devotion to his work was as single-minded as Beethoven’s, who struggled with his compositions more visibly. Consider the summer of 1788. Mozart was living in a city that experienced bread riots that summer and in a country that was mobilizing for war. He was financially desperate, forced to pawn his belongings to move to cheaper rooms. He even tried to sell the pawnbroker’s tickets to get more loans. Most devastating of all, his beloved six-month old daughter died in June. And yet in June, July, and August, he completed two piano trios, a piano sonata, a violin sonata, and three symphonies, two of them among his most famous.2 It could not have been done except by someone who, as Mozart himself once put it, is “soaked in music,…immersed in it all day long.”3 Psychologists have put specific dimensions to this aspect of accomplishment. One thread of this literature, inaugurated in the early 1970s by Herbert Simon, argues that expertise in a subject requires a person to assimilate about 50,000 “chunks” of information about the subject over about 10 years of experience—simple expertise, not the mastery that is associated with great accomplishment.4 Once expertise is achieved, it is followed by thousands of hours of practice, study, labor.5 Nor is all of this work productive. What we see of the significant figures’ work is typically shadowed by an immense amount of wasted effort—most successful creators produce clunkers, sometimes far more clunkers than gems.6
Charles Murray (Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences, 800 B.C. to 1950)
Goethe refused to view color as a static quantity, to be measured in a spectrometer and pinned down like a butterfly to cardboard. He argued that color is a matter of perception. “With light poise and counterpoise, Nature oscillates within her prescribed limits,” he wrote, “yet thus arise all the varieties and conditions of the phenomena which are presented to us in space and time.” The touchstone of Newton’s theory was his famous experiment with a prism. A prism breaks a beam of white light into a rainbow of colors, spread across the whole visible spectrum, and Newton realized that those pure colors must be the elementary components that add to produce white. Further, with a leap of insight, he proposed that the colors corresponded to frequencies. He imagined that some vibrating bodies—corpuscles was the antique word—must be producing colors in proportion to the speed of the vibrations. Considering how little evidence supported this notion, it was as unjustifiable as it was brilliant. What is red? To a physicist, it is light radiating in waves between 620 to 800 billionths of a meter long. Newton’s optics proved themselves a thousand times over, while Goethe’s treatise on color faded into merciful obscurity.
James Gleick (Chaos: Making a New Science)
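[Editorial aside] As a side note on the numbers Gleick cites, the wavelength range he gives for red light can be converted to frequencies with the standard relation ν = c/λ. A minimal sketch (the 620 to 800 nm range is the quote's; the helper name is ours):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second, in vacuum

def frequency_thz(wavelength_nm: float) -> float:
    """Frequency (in THz) of light with the given vacuum wavelength (in nm)."""
    return SPEED_OF_LIGHT / (wavelength_nm * 1e-9) / 1e12

# The quote's range for red light: 620 to 800 billionths of a metre.
for nm in (620, 800):
    print(f"{nm} nm -> {frequency_thz(nm):.0f} THz")  # ~484 THz and ~375 THz
```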
This book is the culmination of, the capstone to Robert Ornstein’s brilliant, ground-breaking half century of research into the dimensions, capacities, and purposes of human consciousness. Deeply thoughtful and wide-ranging in scope, his final book goes well beyond the “left-brain, right-brain” work for which he originally became famous, this time bringing the often-confusing and divisive subject of spirituality into a clear and useful 21st-century focus, presenting it as a matter of perception, not belief. Using rigorous, rational, scientific analysis – always his greatest strength – Ornstein shows that the spiritual impulse is innate, a human ability that from the Ice Age forward has been essential to problem solving. As a final part of his legacy, he lays out how this capacity to reach beyond the everyday can, cleared of cobwebs and seen afresh, play a role and be part of preparing humanity to confront today’s staggering global problems. A rewarding and fascinating book. — Tony Hiss, author of Rescuing the Planet: Protecting Half the Land to Heal the Earth
Robert Ornstein (God 4.0: On the Nature of Higher Consciousness and the Experience Called “God”)
The relationship between the famous and the public who sustain them is governed by a striking paradox. Infinitely remote, the great stars of politics, film and entertainment move across an electric terrain of limousines, bodyguards and private helicopters. At the same time, the zoom lens and the interview camera bring them so near to us that we know their faces and their smallest gestures more intimately than those of our friends. Somewhere in this paradoxical space our imaginations are free to range, and we find ourselves experimenting like impresarios with all the possibilities that these magnified figures seem to offer us. How did Garbo brush her teeth, shave her armpits, probe a worry-line? The most intimate details of their lives seem to lie beyond an already open bathroom door that our imaginations can easily push aside. Caught in the glare of our relentless fascination, they can do nothing to stop us exploring every blocked pore and hesitant glance, imagining ourselves their lovers and confidantes. In our minds we can assign them any roles we choose, submit them to any passion or humiliation. And as they age, we can remodel their features to sustain our deathless dream of them. In a TV interview a few years ago, the wife of a famous Beverly Hills plastic surgeon revealed that throughout their marriage her husband had continually re-styled her face and body, pointing a breast here, tucking in a nostril there. She seemed supremely confident of her attractions. But as she said: ‘He will never leave me, because he can always change me.’ Something of the same anatomizing fascination can be seen in the present pieces, which also show, I hope, the reductive drive of the scientific text as it moves on its collision course with the most obsessive pornography. What seems so strange is that these neutral accounts of operating procedures taken from a textbook of plastic surgery can be radically transformed by the simple substitution of the anonymous ‘patient’ with the name of a public figure, as if the literature and conduct of science constitute a vast dormant pornography waiting to be woken by the magic of fame.
J.G. Ballard (The Atrocity Exhibition)