Famous Scientific Quotes


In the history of science, ever since the famous trial of Galileo, it has repeatedly been claimed that scientific truth cannot be reconciled with the religious interpretation of the world. Although I am now convinced that scientific truth is unassailable in its own field, I have never found it possible to dismiss the content of religious thinking as simply part of an outmoded phase in the consciousness of mankind, a part we shall have to give up from now on. Thus in the course of my life I have repeatedly been compelled to ponder on the relationship of these two regions of thought, for I have never been able to doubt the reality of that to which they point.
Werner Heisenberg
Dearest creature in creation, Study English pronunciation. I will teach you in my verse Sounds like corpse, corps, horse, and worse. I will keep you, Suzy, busy, Make your head with heat grow dizzy. Tear in eye, your dress will tear. So shall I! Oh hear my prayer. Just compare heart, beard, and heard, Dies and diet, lord and word, Sword and sward, retain and Britain. (Mind the latter, how it’s written.) Now I surely will not plague you With such words as plaque and ague. But be careful how you speak: Say break and steak, but bleak and streak; Cloven, oven, how and low, Script, receipt, show, poem, and toe. Hear me say, devoid of trickery, Daughter, laughter, and Terpsichore, Typhoid, measles, topsails, aisles, Exiles, similes, and reviles; Scholar, vicar, and cigar, Solar, mica, war and far; One, anemone, Balmoral, Kitchen, lichen, laundry, laurel; Gertrude, German, wind and mind, Scene, Melpomene, mankind. Billet does not rhyme with ballet, Bouquet, wallet, mallet, chalet. Blood and flood are not like food, Nor is mould like should and would. Viscous, viscount, load and broad, Toward, to forward, to reward. And your pronunciation’s OK When you correctly say croquet, Rounded, wounded, grieve and sieve, Friend and fiend, alive and live. Ivy, privy, famous; clamour And enamour rhyme with hammer. River, rival, tomb, bomb, comb, Doll and roll and some and home. Stranger does not rhyme with anger, Neither does devour with clangour. Souls but foul, haunt but aunt, Font, front, wont, want, grand, and grant, Shoes, goes, does. Now first say finger, And then singer, ginger, linger, Real, zeal, mauve, gauze, gouge and gauge, Marriage, foliage, mirage, and age. Query does not rhyme with very, Nor does fury sound like bury. Dost, lost, post and doth, cloth, loth. Job, nob, bosom, transom, oath. Though the differences seem little, We say actual but victual. Refer does not rhyme with deafer. Foeffer does, and zephyr, heifer. 
Mint, pint, senate and sedate; Dull, bull, and George ate late. Scenic, Arabic, Pacific, Science, conscience, scientific. Liberty, library, heave and heaven, Rachel, ache, moustache, eleven. We say hallowed, but allowed, People, leopard, towed, but vowed. Mark the differences, moreover, Between mover, cover, clover; Leeches, breeches, wise, precise, Chalice, but police and lice; Camel, constable, unstable, Principle, disciple, label. Petal, panel, and canal, Wait, surprise, plait, promise, pal. Worm and storm, chaise, chaos, chair, Senator, spectator, mayor. Tour, but our and succour, four. Gas, alas, and Arkansas. Sea, idea, Korea, area, Psalm, Maria, but malaria. Youth, south, southern, cleanse and clean. Doctrine, turpentine, marine. Compare alien with Italian, Dandelion and battalion. Sally with ally, yea, ye, Eye, I, ay, aye, whey, and key. Say aver, but ever, fever, Neither, leisure, skein, deceiver. Heron, granary, canary. Crevice and device and aerie. Face, but preface, not efface. Phlegm, phlegmatic, ass, glass, bass. Large, but target, gin, give, verging, Ought, out, joust and scour, scourging. Ear, but earn and wear and tear Do not rhyme with here but ere. Seven is right, but so is even, Hyphen, roughen, nephew Stephen, Monkey, donkey, Turk and jerk, Ask, grasp, wasp, and cork and work. Pronunciation (think of Psyche!) Is a paling stout and spikey? Won’t it make you lose your wits, Writing groats and saying grits? It’s a dark abyss or tunnel: Strewn with stones, stowed, solace, gunwale, Islington and Isle of Wight, Housewife, verdict and indict. Finally, which rhymes with enough, Though, through, plough, or dough, or cough? Hiccough has the sound of cup. My advice is to give up!!!
Gerard Nolst Trenité (Drop Your Foreign Accent)
It is sometimes said that we should never believe a scientific theory until it is verified by experiment. But a famous astronomer has also stated that we should never believe an observation until it is confirmed by a theory.
João Magueijo (Faster Than the Speed of Light: The Story of a Scientific Speculation)
“You have heard me speak of Professor Moriarty?” “The famous scientific criminal, as famous among crooks as—” “My blushes, Watson!” Holmes murmured in a deprecating voice. “I was about to say, as he is unknown to the public.” “A touch! A distinct touch!” cried Holmes. “You are developing a certain unexpected vein of pawky humour, Watson, against which I must learn to guard myself.
Arthur Conan Doyle (The Complete Sherlock Holmes: All 4 Novels & 56 Short Stories)
Even Scientific American entered the fray with an article proposing that the person portrayed in the famous Martin Droeshout engraving might actually be—I weep to say it—Elizabeth I.
Bill Bryson (Shakespeare)
We preach and practice brotherhood — not only of man but of all living beings — not on Sundays only but on all the days of the week. We believe in the law of universal justice — that our present condition is the result of our past actions and that we are not subjected to the freaks of an irresponsible governor, who is prosecutor and judge at the same time; we depend for our salvation on our own acts and deeds and not on the sacrificial death of an attorney.
Virchand Gandhi (The Monist)
Since Einstein derived his famous equation, literally millions of experiments have confirmed his revolutionary ideas.
Michio Kaku (Physics of the Impossible: A Scientific Exploration of the World of Phasers, Force Fields, Teleportation, and Time Travel)
I think a strong claim can be made that the process of scientific discovery may be regarded as a form of art. This is best seen in the theoretical aspects of Physical Science. The mathematical theorist builds up on certain assumptions and according to well understood logical rules, step by step, a stately edifice, while his imaginative power brings out clearly the hidden relations between its parts. A well constructed theory is in some respects undoubtedly an artistic production. A fine example is the famous Kinetic Theory of Maxwell. ... The theory of relativity by Einstein, quite apart from any question of its validity, cannot but be regarded as a magnificent work of art.
Ernest Rutherford
Why does the universe go to all the bother of existing? Is the unified theory so compelling that it brings about its own existence? Or does it need a creator, and, if so, does he have any other effect on the universe? And who created him? Up to now, most scientists have been too occupied with the development of new theories that describe what the universe is to ask the question why. On the other hand, the people whose business it is to ask why, the philosophers, have not been able to keep up with the advance of scientific theories. In the eighteenth century, philosophers considered the whole of human knowledge, including science, to be their field and discussed questions such as: Did the universe have a beginning? However, in the nineteenth and twentieth centuries, science became too technical and mathematical for the philosophers, or anyone else except a few specialists. Philosophers reduced the scope of their inquiries so much that Wittgenstein, the most famous philosopher of this century, said, 'The sole remaining task for philosophy is the analysis of language.' What a comedown from the great tradition of philosophy from Aristotle to Kant! However, if we do discover a complete theory, it should in time be understandable in broad principle by everyone, not just a few scientists. Then we shall all, philosophers, scientists, and just ordinary people, be able to take part in the discussion of the question of why it is that we and the universe exist. If we find the answer to that, it would be the ultimate triumph of human reason--for then we would know the mind of God.
Stephen Hawking (A Brief History of Time)
Yoga has been superficially misunderstood by certain Western writers, but its critics have never been its practitioners. Among many thoughtful tributes to yoga may be mentioned one by Dr. C. G. Jung, the famous Swiss psychologist. “When a religious method recommends itself as ‘scientific,’ it can be certain of its public in the West. Yoga fulfills this expectation,” Dr. Jung writes. “Quite apart from the charm of the new and the fascination of the half-understood, there is good cause for Yoga to have many adherents. It offers the possibility of controllable experience and thus satisfies the scientific need for ‘facts’; and, besides this, by reason of its breadth and depth, its venerable age, its doctrine and method, which include every phase of life, it promises undreamed-of possibilities. “Every religious or philosophical practice means a psychological discipline, that is, a method of mental hygiene. The manifold, purely bodily procedures of Yoga also mean a physiological hygiene which is superior to ordinary gymnastics and breathing exercises, inasmuch as it is not merely mechanistic and scientific, but also philosophical; in its training of the parts of the body, it unites them with the whole of the spirit, as is quite clear, for instance, in the Pranayama exercises where Prana is both the breath and the universal dynamics of the cosmos…. “Yoga practice...would be ineffectual without the concepts on which Yoga is based. It combines the bodily and the spiritual in an extraordinarily complete way. “In the East, where these ideas and practices have developed, and where for several thousand years an unbroken tradition has created the necessary spiritual foundations, Yoga is, as I can readily believe, the perfect and appropriate method of fusing body and mind together so that they form a unity which is scarcely to be questioned. This unity creates a psychological disposition which makes possible intuitions that transcend consciousness.
Paramahansa Yogananda (Autobiography of a Yogi (Self-Realization Fellowship))
Joe’s scientific life is defined by these significant near misses… He was Shackleton many times, almost the first: almost the first to see the big bang, almost the first to patent the laser, almost the first to detect gravitational waves. Famous for nearly getting there.
Janna Levin (Black Hole Blues and Other Songs from Outer Space)
After Darwin, human morality became a scientific mystery. Natural selection could explain how intelligent, upright, linguistic, not so hairy, bipedal primates could evolve, but where did our morals come from? Darwin himself was absorbed by this question. Natural selection, it was thought, promotes ruthless self-interest. Individuals who grab up all the resources and destroy the competition will survive better, reproduce more often, and thus populate the world with their ruthlessly selfish offspring. How, then, could morality evolve in a world that Tennyson famously described as “red in tooth and claw”? We now have an answer. Morality evolved as a solution to the problem of cooperation, as a way of averting the Tragedy of the Commons: Morality is a set of psychological adaptations that allow otherwise selfish individuals to reap the benefits of cooperation.
Joshua Greene (Moral Tribes: Emotion, Reason, and the Gap Between Us and Them)
She knew for a fact that being left-handed automatically made you special. Marie Curie, Albert Einstein, Linus Pauling, and Albert Schweitzer were all left-handed. Of course, no believable scientific theory could rest on such a small group of people. When Lindsay probed further, however, more proof emerged. Michelangelo, Leonardo da Vinci, M.C. Escher, Mark Twain, Hans Christian Andersen, Lewis Carroll, H.G. Wells, Eudora Welty, and Jessamyn West—all lefties. The lack of women in her research had initially bothered her until she mentioned it to Allegra. "Chalk that up to male chauvinism," she said. "Lots of left-handed women were geniuses. Janis Joplin was. All it means is that the macho-man researchers didn't bother asking.
Jo-Ann Mapson (The Owl & Moon Cafe)
Sir Isaac Newton famously said that he had achieved everything by standing on the shoulders of giants—the scientific men whose findings he built upon. The same might be said about silicon. After germanium did all the work, silicon became an icon, and germanium was banished to periodic table obscurity.
Sam Kean (The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements)
famous in some scientific circles. Considering their specialties,
Dean Koontz (The House at the End of the World)
Although it is not as famous as Kuhn's SSR, Bas van Fraassen's book The Scientific Image (1980) has certainly had a profound effect on the philosophy of science.
Howard Margolis (It Started With Copernicus: How Turning the World Inside Out Led to the Scientific Revolution)
The practice of making claims that appear to be scientific, but do not actually follow the scientific method of testability and falsification of hypotheses, is usually called pseudoscience.
Daniel Loxton (Abominable Science!: Origins of the Yeti, Nessie, and Other Famous Cryptids)
Oliver, my professor, was a scientific bounder, a journalist by instinct, a thief of ideas,—he was always prying! And you know the knavish system of the scientific world. I simply would not publish, and let him share my credit. I went on working, I got nearer and nearer making my formula into an experiment, a reality. I told no living soul, because I meant to flash my work upon the world with crushing effect and become famous at a blow. I took up the question of pigments to fill up certain gaps. And suddenly, not by design but by accident, I made a discovery in physiology.
H.G. Wells (The Invisible Man)
In the scientific world, the syndrome known as 'great man's disease' happens when a famous researcher in one field develops strong opinions about another field that he or she does not understand, such as a chemist who decides that he is an expert in medicine or a physicist who decides that he is an expert in cognitive science. They have trouble accepting that they must go back to school before they can make pronouncements in a new field.
Paul Krugman (A Country Is Not a Company (Harvard Business Review Classics))
Scientific studies suggest that only about 25 percent of how long we live is dictated by genes, according to famous studies of Danish twins. The other 75 percent is determined by our lifestyles and the everyday choices we make.
Dan Buettner (The Blue Zones: 9 Lessons for Living Longer From the People Who've Lived the Longest)
Our famous scientific reality does not afford us the slightest protection against the so-called irreality of the unconscious. Something works behind the veil of fantastic images, whether we give this something a good name or a bad. It is something real, and for this reason its manifestations must be taken seriously. But first the tendency to concretization must be overcome; in other words, we must not take the fantasies literally when we approach the question of interpreting them. While we are in the grip of the actual experience, the fantasies cannot be taken literally enough.
C.G. Jung (The Red Book: Liber Novus)
Dr. Julian Huxley, famous English biologist and director of UNESCO, recently stated that Western scientists should “learn the Oriental techniques” for entering the trance state and for control of breathing. “What happens? How is it possible?” he said. An Associated Press dispatch from London, dated Aug. 21, 1948, reported: “Dr. Huxley told the new World Federation for Mental Health it might well look into the mystic lore of the East. If this lore could be investigated scientifically, he advised mental specialists, ‘then I think an immense step forward could be made in your field.
Paramahansa Yogananda (Autobiography of a Yogi (Self-Realization Fellowship))
Tell me, what kind of functions does pain have when one is convicted to 100 whippings in Saudi Arabia? You claim pain has a function, I claim that's scientific rubbish. The only thing pain really does is cause an instant reaction that is not rational and usually quite erratic. The famous example of the hand in boiling water, for example. You say it proves pain has a function. But exactly because of the spasmic reaction lots and lots of people will drop the bowl with boiling water over their entire bodies causing serious burns. So what was the 'function' of this pain? Pain and fear cause confusion and trauma. If pain actually did have a rational function, chronic pain would not exist.
Martijn Benders
Frederick II of Prussia (known as “the Great,” reigned 1740–1786) famously ran his Berlin court—and the associated Academy of Sciences—in French. When Voltaire visited in 1750, he wrote to the Marquis de Thibouville that “I find myself here in France. One speaks only our language. German is for the soldiers and for the horses; it is only necessary on the road.”
Michael D. Gordin (Scientific Babel: How Science Was Done Before and After Global English)
There is now a great emphasis on cause and effect, with earlier forms of thought focusing on resemblances, such as famously through the Greek and then Medieval notion of the Humours, in which Earth, Air, Fire and Water were linked to the constitution and condition of the human body. Those who embrace a scientific attitude stand apart from the universe, attempting to understand it in an abstract manner in order to then manipulate it.
Chris Gosden (Magic: A History: From Alchemy to Witchcraft, from the Ice Age to the Present)
The smartest person to ever walk this Earth in all probability lived and died herding goats on a mountain somewhere, with no way to disseminate their work globally even if they had realised they were super smart and had the means to do something with their abilities. I am not keen on 'who are the smartest' lists and websites because, as Scott Barry Kaufman points out, the concept of genius privileges the few who had the opportunity to see through and promote their life’s work, while excluding others who may have had equal or greater raw potential but lacked the practical and financial support, and the communication platform that famous names clearly had. This is why I am keen to develop, through my research work, a definition of genius from a cognitive neuroscience and psychometric point of view, so that whatever we decide that is and how it should be measured, only focuses on clearly measurable factors within the individual’s mind, regardless of their external achievements, eminence, popularity, wealth, public platform etc. In my view this would be both more equitable and more scientific.
Gwyneth Wesley Rolph
Christopher’s anti-God campaign was based on a fundamental error reflected in the subtitle of his book: How Religion Poisons Everything. On the contrary, since religion, as practiced, is a human activity, the reverse is true. Human beings poison religion, imposing their prejudices, superstitions, and corruptions onto its rituals and texts, not the other way around. “Pascal Is a Fraud!” When I first became acquainted with Christopher’s crusade, I immediately thought of the seventeenth-century scientist and mathematician, Blaise Pascal. In addition to major contributions to scientific knowledge, Pascal produced exquisite reflections on religious themes: When I consider the short duration of my life, swallowed up in the eternity before and after, the space which I fill, and even can see, engulfed in the infinite immensity of spaces of which I am ignorant and which know me not, I am frightened and astonished at being here rather than there; for there is no reason why here rather than there, why now rather than then. Who has put me here? These are the questions that only a religious faith can attempt to answer. There is no science of the why of our existence, no scientific counsel or solace for our human longings, loneliness, and fear. Without a God to make sense of our existence, Pascal wrote, human life is intolerable: This is what I see and what troubles me. I look on all sides, and I see only darkness everywhere. Nature presents to me nothing which is not a matter of doubt and concern. If I saw nothing there that revealed a Divinity, I would come to a negative conclusion; if I saw everywhere the signs of a Creator, I would remain peacefully in faith. But seeing too much to deny and too little to be sure, I am in a state to be pitied. . . . To resolve this dilemma, Pascal devised his famous “wager,” which, simply stated, is that since we cannot know whether there is a God or not, it is better to wager that there is one, rather than that there is not.
David Horowitz (Dark Agenda: The War to Destroy Christian America)
Yoga has been superficially misunderstood by certain Western writers, but its critics have never been its practitioners. Among many thoughtful tributes to yoga may be mentioned one by Dr. C. G. Jung, the famous Swiss psychologist. “When a religious method recommends itself as ‘scientific,’ it can be certain of its public in the West. Yoga fulfills this expectation,” Dr. Jung writes (7). “Quite apart from the charm of the new, and the fascination of the half-understood, there is good cause for Yoga to have many adherents. It offers the possibility of controllable experience, and thus satisfies the scientific need of ‘facts,’ and besides this, by reason of its breadth and depth, its venerable age, its doctrine and method, which include every phase of life, it promises undreamed-of possibilities. “Every religious or philosophical practice means a psychological discipline, that is, a method of mental hygiene. The manifold, purely bodily procedures of Yoga (8) also mean a physiological hygiene which is superior to ordinary gymnastics and breathing exercises, inasmuch as it is not merely mechanistic and scientific, but also philosophical; in its training of the parts of the body, it unites them with the whole of the spirit, as is quite clear, for instance, in the Pranayama exercises where Prana is both the breath and the universal dynamics of the cosmos. “When the thing which the individual is doing is also a cosmic event, the effect experienced in the body (the innervation), unites with the emotion of the spirit (the universal idea), and out of this there develops a lively unity which no technique, however scientific, can produce. Yoga practice is unthinkable, and would also be ineffectual, without the concepts on which Yoga is based. It combines the bodily and the spiritual with each other in an extraordinarily complete way. 
“In the East, where these ideas and practices have developed, and where for several thousand years an unbroken tradition has created the necessary spiritual foundations, Yoga is, as I can readily believe, the perfect and appropriate method of fusing body and mind together so that they form a unity which is scarcely to be questioned. This unity creates a psychological disposition which makes possible intuitions that transcend consciousness.” The Western day is indeed nearing when the inner science of self- control will be found as necessary as the outer conquest of nature. This new Atomic Age will see men’s minds sobered and broadened by the now scientifically indisputable truth that matter is in reality a concentrate of energy. Finer forces of the human mind can and must liberate energies greater than those within stones and metals, lest the material atomic giant, newly unleashed, turn on the world in mindless destruction (9).
Paramahansa Yogananda (Autobiography of a Yogi (Illustrated and Annotated Edition))
Things reached such a pitch that at one conference Bohr remarked of a new theory that the question was not whether it was crazy, but whether it was crazy enough. To illustrate the non-intuitive nature of the quantum world, Schrödinger offered a famous thought experiment in which a hypothetical cat was placed in a box with one atom of a radioactive substance attached to a vial of hydrocyanic acid. If the particle degraded within an hour, it would trigger a mechanism that would break the vial and poison the cat. If not, the cat would live. But we could not know which was the case, so there was no choice, scientifically, but to regard the cat as 100 per cent alive and 100 per cent dead at the same time.
Bill Bryson (A Short History of Nearly Everything)
Even if there is only one possible unified theory, it is just a set of rules and equations. What is it that breathes fire into the equations and makes a universe for them to describe? The usual approach of science of constructing a mathematical model cannot answer the questions of why there should be a universe for the model to describe. Why does the universe go to all the bother of existing? Is the unified theory so compelling that it brings about its own existence? Or does it need a creator, and, if so, does he have any other effect on the universe? And who created him? Up to now, most scientists have been too occupied with the development of new theories that describe what the universe is to ask the question why. On the other hand, the people whose business it is to ask why, the philosophers, have not been able to keep up with the advance of scientific theories. In the eighteenth century, philosophers considered the whole of human knowledge, including science, to be their field and discussed questions such as: did the universe have a beginning? However, in the nineteenth and twentieth centuries, science became too technical and mathematical for the philosophers, or anyone else except a few specialists. Philosophers reduced the scope of their inquiries so much that Wittgenstein, the most famous philosopher of this century, said, “The sole remaining task for philosophy is the analysis of language.” What a comedown from the great tradition of philosophy from Aristotle to Kant! However, if we do discover a complete theory, it should in time be understandable in broad principle by everyone, not just a few scientists. Then we shall all, philosophers, scientists, and just ordinary people, be able to take part in the discussion of the question of why it is that we and the universe exist. If we find the answer to that, it would be the ultimate triumph of human reason – for then we would know the mind of God.
Stephen Hawking (A Brief History of Time)
It has been noted in various quarters that the half-illiterate Italian violin maker Antonio Stradivari never recorded the exact plans or dimensions for how to make one of his famous instruments. This might have been a commercial decision (during the earliest years of the 1700s, Stradivari’s violins were in high demand and open to being copied by other luthiers). But it might also have been because, well, Stradivari didn’t know exactly how to record its dimensions, its weight, and its balance. I mean, he knew how to create a violin with his hands and his fingers but maybe not in figures he kept in his head. Today, those violins, named after the Latinized form of his name, Stradivarius, are considered priceless. It is believed there are only around five hundred of them still in existence, some of which have been submitted to the most intense scientific examination in an attempt to reproduce their extraordinary sound quality. But no one has been able to replicate Stradivari’s craftsmanship. They’ve worked out that he used spruce for the top, willow for the internal blocks and linings, and maple for the back, ribs, and neck. They’ve figured out that he also treated the wood with several types of minerals, including potassium borate, sodium and potassium silicate, as well as a handmade varnish that appears to have been composed of gum arabic, honey, and egg white. But they still can’t replicate a Stradivarius. The genius craftsman never once recorded his technique for posterity. Instead, he passed on his knowledge to a number of his apprentices through what the philosopher Michael Polanyi called “elbow learning.” This is the process where a protégé is trained in a new art or skill by sitting at the elbow of a master and by learning the craft through doing it, copying it, not simply by reading about it.
The apprentices of the great Stradivari didn’t learn their craft from books or manuals but by sitting at his elbow and feeling the wood as he felt it to assess its length, its balance, and its timbre right there in their fingertips. All the learning happened at his elbow, and all the knowledge was contained in his fingers. In his book Personal Knowledge, Polanyi wrote, “Practical wisdom is more truly embodied in action than expressed in rules of action.” By that he meant that we learn as Stradivari’s protégés did, by feeling the weight of a piece of wood, not by reading the prescribed measurements in a manual. Polanyi continues, To learn by example is to submit to authority. You follow your master because you trust his manner of doing things even when you cannot analyze and account in detail for its effectiveness. By watching the master and emulating his efforts in the presence of his example, the apprentice unconsciously picks up the rules of the art, including those which are not explicitly known to the master himself. These hidden rules can be assimilated only by a person who surrenders himself to that extent uncritically to the imitation of another.
Lance Ford (UnLeader: Reimagining Leadership…and Why We Must)
JULIAN HUXLEY’S “EUGENICS MANIFESTO”: “Eugenics Manifesto” was the name given to an article supporting eugenics. The document, which appeared in Nature, September 16, 1939, was a joint statement issued by America’s and Britain’s most prominent biologists, and was widely referred to as the “Eugenics Manifesto.” The manifesto was a response to a request from Science Service, of Washington, D.C. for a reply to the question “How could the world’s population be improved most effectively genetically?” Two of the main signatories and authors were Hermann J. Muller and Julian Huxley. Julian Huxley, as this book documents, was the founding director of UNESCO from the famous Huxley family. Muller was an American geneticist, educator and Nobel laureate best known for his work on the physiological and genetic effects of radiation. Put into the context of the timeline, this document was published 15 years after “Mein Kampf” and a year after the highly publicized violence of Kristallnacht. In other words, there is no way either Muller or Huxley were unaware at the moment of publication of the historical implications of eugenic agendas.
A.E. Samaan (From a "Race of Masters" to a "Master Race": 1948 to 1848)
If Thomas represents an epistemology of faith, which transcends but also includes historical and scientific knowing, we might suggest that Paul represents at this point an epistemology of hope. In 1 Corinthians 15 he sketches his argument that there will be a future resurrection as part of God’s new creation, the redemption of the entire cosmos as in Romans 8. Hope, for the Christian, is not wishful thinking or mere blind optimism. It is a mode of knowing, a mode within which new things are possible, options are not shut down, new creation can happen. There is more to be said about this, but not here. All of which brings us to Peter. Epistemologies of faith and hope, both transcending and including historical and scientific knowing, point on to an epistemology of love—an idea I first met in Bernard Lonergan but that was hardly new with him. The story of John 21 sharpens it up. Peter, famously, has denied Jesus. He has chosen to live within the normal world, where the tyrants win in the end and where it’s better to dissociate yourself from people who get on the wrong side of them. But now, with Easter, Peter is called to live in a new and different world. Where Thomas is called to a new kind of faith and Paul to a radically renewed hope, Peter is called to a new kind of love.
N.T. Wright (Surprised by Hope: Rethinking Heaven, the Resurrection, and the Mission of the Church)
For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past. The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information. A darker view of the information-dominated universe was described in the famous story “The Library of Babel,” written by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe. Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning.
The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
Freeman Dyson (Dreams of Earth and Sky)
[THE DAILY BREATH] Blaise Pascal, the famous mathematician, once said: "To those who wish to see, God gives them sufficient light. To those who do not wish to see, God gives them sufficient darkness." Seeing the Truth is a choice. Listening to my words is a choice. Healing is a choice. If you want scientific evidence about the existence of God, there is a wealth of data to support it. Dr. Jeffrey Long used the best scientific techniques available today to study more than 4,000 people who had near-death experiences and found themselves face to face with our Heavenly Father. Read the book "God and the Afterlife" and you will find it. If you want scientific evidence about Jesus being the Son of God, Lee Strobel, an atheist investigative journalist, discovered it. Read the book "The Case for Christ" and you will find it. If you want scientific evidence about Jesus still healing today, study the ministries of Dr. Charles Ndifon, T.L. Osborn, and Kathryn Kuhlman, among others, and you will find it. But most importantly, if you want to fill the emptiness within you, and experience the perfect love, mercy and forgiveness, if you want to live in the peace of our Heavenly Father, give your body, your mind and your heart to Christ. Give your life to Jesus. The empty place you feel in your heart is reserved only for the spirit of Christ and nothing from this world will fill it. Look up to heaven, behold Jesus and Live.
Dragos Bratasanu
But the basis of Freud's ideas isn't accepted by all philosophers, though many accept that he was right about the possibility of unconscious thought. Some have claimed that Freud's theories are unscientific. Most famously, Karl Popper (whose ideas are more fully discussed in Chapter 36) described many of the ideas of psychoanalysis as ‘unfalsifiable’. This wasn't a compliment, but a criticism. For Popper, the essence of scientific research was that it could be tested; that is, there could be some possible observation that would show that it was false. In Popper's example, the actions of a man who pushed a child into a river, and a man who dived in to save a drowning child were, like all human behaviour, equally open to Freudian explanation. Whether someone tried to drown or save a child, Freud's theory could explain it. He would probably say that the first man was repressing some aspect of his Oedipal conflict, and that led to his violent behaviour, whereas the second man had ‘sublimated’ his unconscious desires, that is, managed to steer them into socially useful actions. If every possible observation is taken as further evidence that the theory is true, whatever that observation is, and no imaginable evidence could show that it was false, Popper believed, the theory couldn't be scientific at all. Freud, on the other hand, might have argued that Popper had some kind of repressed desire that made him so aggressive towards psychoanalysis.
Nigel Warburton (A Little History of Philosophy (Little Histories))
Despite his earthbound approach and his preoccupation with scientific fact, Aristotle had an acute understanding of the nature and importance of religion and mythology. He pointed out that people who had become initiates in the various mystery religions were not required to learn any facts “but to experience certain emotions and to be put in a certain disposition.”35 Hence his famous literary theory that tragedy effected a purification (katharsis) of the emotions of terror and pity that amounted to an experience of rebirth. The Greek tragedies, which originally formed part of a religious festival, did not necessarily present a factual account of historical events but were attempting to reveal a more serious truth. Indeed, history was more trivial than poetry and myth: “The one describes what has happened, the other what might. Hence poetry is something more philosophic and serious than history; for poetry speaks of what is universal, history of what is particular.”36 There may or may not have been a historical Achilles or Oedipus, but the facts of their lives were irrelevant to the characters we have experienced in Homer and Sophocles, which express a different but more profound truth about the human condition. Aristotle’s account of the katharsis of tragedy was a philosophic presentation of a truth that Homo religiosus had always understood intuitively: a symbolic, mythical or ritual presentation of events that would be unendurable in daily life can redeem and transform them into something pure and even pleasurable.
Karen Armstrong (A History of God: The 4,000-Year Quest of Judaism, Christianity and Islam)
I quickly learned that the congressional delegation from Alaska was deeply committed to the oil industry and other commercial interests, and senatorial courtesy prevented other members from disputing with Senators Ted Stevens (Republican) and Mike Gravel (Democrat) over a matter involving their home state. Former Idaho governor Cecil Andrus, my secretary of interior, and I began to study the history of the controversy and maps of the disputed areas, and I flew over some of them a few times. Environmental groups and most indigenous natives were my allies, but professional hunters, loggers, fishers, and the Chambers of Commerce were aligned with the oil companies. All the odds were against us until Cecil discovered an ancient law, the Antiquities Act of 1906, which permitted a president to set aside an area for “the protection of objects of historic and scientific interest,” such as Indian burial grounds, artifacts, or perhaps an ancient church building or the site of a famous battle. We decided to use this authority to set aside for preservation large areas of Alaska as national monuments, and eventually we had included more than 56 million acres (larger than the state of Minnesota). This gave me the bargaining chip I needed, and I was able to prevail in the subsequent debates. My efforts were extremely unpopular in Alaska, and I had to have extra security on my visits. I remember that there was a state fair where people threw baseballs at two targets to plunge a clown into a tank of water. My face was on one target and Iran’s Ayatollah Khomeini’s on the other, and few people threw at the Ayatollah’s.
Jimmy Carter (A Full Life: Reflections at Ninety)
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options.35 On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.”36 That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
There are two famous quips of Stalin which are both grounded in this logic. When Stalin answered the question "Which deviation is worse, the Rightist or the Leftist one?" by "They are both worse!", the underlying premise is that the Leftist deviation is REALLY ("objectively," as Stalinists liked to put it) not leftist at all, but a concealed Rightist one! When Stalin wrote, in a report on a party congress, that the delegates, with the majority of votes, unanimously approved the CC resolution, the underlying premise is, again, that there was really no minority within the party: those who voted against thereby excluded themselves from the party... In all these cases, the genus repeatedly overlaps (fully coincides) with one of its species. This is also what allows Stalin to read history retroactively, so that things "become clear" retroactively: it was not that Trotsky was first fighting for the revolution with Lenin and Stalin and then, at a certain stage, opted for a different strategy than the one advocated by Stalin; this last opposition (Trotsky/Stalin) "makes it clear" how, "objectively," Trotsky was against revolution all along. We find the same procedure in the classificatory impasse the Stalinist ideologists and political activists faced in their struggle for collectivization in the years 1928-1933. In their attempt to account for their effort to crush the peasants' resistance in "scientific" Marxist terms, they divided peasants into three categories (classes): the poor peasants (no land or minimal land, working for others), natural allies of the workers; the autonomous middle peasants, oscillating between the exploited and exploiters; the rich peasants, "kulaks" (employing other workers, lending them money or seeds, etc.), the exploiting "class enemy" which, as such, has to be "liquidated."
However, in practice, this classification became more and more blurred and inoperative: in the generalized poverty, clear criteria no longer applied, and the other two categories often joined the kulaks in their resistance to forced collectivization. An additional category was thus introduced, that of the subkulak, a peasant who, although too poor with regard to his economic situation to be considered a kulak proper, nonetheless shared the kulak "counter-revolutionary" attitude.
Slavoj Žižek
The Extraordinary Persons Project. In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person.
On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed: 1. Australopithecus 2. Homo habilis 3. Homo erectus 4. Homo sapiens Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans. Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation." By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. 
However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived in different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neanderthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution, although he was himself one of the leading advocates of evolution in the twentieth century: What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth. Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
The scientific origin of toughened glass is to be found in a famous curiosity of the 1640s known as Prince Rupert’s drops. These are teardrop-shaped pieces of glass that can withstand intense pressure at their rounded end, but if they incur the slightest damage at the tail end they will explode. Prince Rupert’s drops are very simple to make: all you need to do is drop a small piece of molten glass into water.
Mark Miodownik (Stuff Matters: Exploring the Marvelous Materials That Shape Our Man-Made World)
“Correlation is enough,” then-Wired editor in chief Chris Anderson famously declared in 2008. We can, he implied, solve innovation problems by the sheer brute force of the data deluge. Ever since Michael Lewis chronicled the Oakland A’s unlikely success in Moneyball (who knew on-base percentage was a better indicator of offensive success than batting averages?), organizations have been trying to find the Moneyball equivalent of customer data that will lead to innovation success. Yet few have. Innovation processes in many companies are structured and disciplined, and the talent applying them is highly skilled. There are careful stage-gates, rapid iterations, and checks and balances built into most organizations’ innovation processes. Risks are carefully calculated and mitigated. Principles like six-sigma have pervaded innovation process design so we now have precise measurements and strict requirements for new products to meet at each stage of their development. From the outside, it looks like companies have mastered an awfully precise, scientific process. But for most of them, innovation is still painfully hit or miss. And worst of all, all this activity gives the illusion of progress, without actually causing it. Companies are spending exponentially more to achieve only modest incremental innovations while completely missing the mark on the breakthrough innovations critical to long-term, sustainable growth. As Yogi Berra famously observed: “We’re lost, but we’re making good time!” What’s gone so wrong? Here is the fundamental problem: the masses and masses of data that companies accumulate are not organized in a way that enables them to reliably predict which ideas will succeed.
Instead the data is along the lines of “this customer looks like that one,” “this product has similar performance attributes as that one,” and “these people behaved the same way in the past,” or “68 percent of customers say they prefer version A over version B.” None of that data, however, actually tells you why customers make the choices that they do.
Clayton M. Christensen (Competing Against Luck: The Story of Innovation and Customer Choice)
The scientific method is famous for requiring objectivity and emotional detachment on the part of the investigator. Scientific experimentation also involves extensive manipulation of conditions. When dealing with nature, this is fine, but with people it becomes problematic. “Being objective” can easily be taken to mean “treating people as objects,” emotional detachment can translate into indifference to human suffering, and manipulation can take the form of dominance and control.
Joseph Heath (Enlightenment 2.0)
Alma wrote in depth about laurel, mimosa, and verbena. She wrote about grapes and camellias, about the myrtle orange, about the cosseting of figs. She published under the name "A. Whittaker." Neither she nor George Hawkes believed that it would much benefit Alma to announce herself in print as female. In the scientific world of the day, there was still a strict division between "botany" (the study of plants by men) and "polite botany" (the study of plants by women). "Polite botany" was often indistinguishable from "botany", except that one field was regarded with respect and the other was not, but still, Alma did not wish to be shrugged off as a mere polite botanist. Of course, the Whittaker name was famous in the world of plants and science, so a good number of botanists already knew precisely who "A. Whittaker" was. Not all of them, however. In response to her articles, then, Alma sometimes received letters from botanists around the world, sent to her in care of George Hawkes's print shop. Some of these letters began, "My dear Sir." Other letters were written to "Mr. A. Whittaker." One quite memorable missive even came addressed to "Dr. A. Whittaker." (Alma kept that letter for a long time, tickled by the unexpected honorific.)
Elizabeth Gilbert (The Signature of All Things)
Science and philosophy have for centuries been sustained by unquestioning faith in perception. Perception opens a window on to things. This means that it is directed, quasi-teleologically, towards a *truth in itself* in which the reason underlying all appearances is to be found. The tacit thesis of perception is that at every instant experience can be co-ordinated with that of the previous instant and that of the following, and my perspective with that of other consciousnesses—that all contradictions can be removed, that monadic and intersubjective experience is one unbroken text—that what is now indeterminate for me could become determinate for a more complete knowledge, which is as it were realized in advance in the thing, or rather which is the thing itself. Science has first been merely the sequel or amplification of the process which constitutes perceived things. Just as the thing is the invariant of all sensory fields and of all individual perceptual fields, so the scientific concept is the means of fixing and objectifying phenomena. Science defined a theoretical state of bodies not subject to the action of any force, and *ipso facto* defined force, reconstituting with the aid of these ideal components the processes actually observed. It established statistically the chemical properties of pure bodies, deducing from these those of empirical bodies, and seeming thus to hold the plan of creation or in any case to have found a reason immanent in the world. The notion of geometrical space, indifferent to its contents, that of pure movement which does not by itself affect the properties of the object, provided phenomena with a setting of inert existence in which each event could be related to physical conditions responsible for the changes occurring, and therefore contributed to this freezing of being which appeared to be the task of physics. In thus developing the concept of the thing, scientific knowledge was not aware that it was working on a presupposition. 
Precisely because perception, in its vital implications and prior to any theoretical thought, is presented as perception of a being, it was not considered necessary for reflection to undertake a genealogy of being, and it was therefore confined to seeking the conditions which make being possible. Even if one took account of the transformations of determinant consciousness, even if it were conceded that the constitution of the object is never completed, there was nothing to add to what science said of it; the natural object remained an ideal unity for us and, in the famous words of Lachelier, a network of general properties. It was no use denying any ontological value to the principles of science and leaving them with only a methodical value, for this reservation made no essential change as far as philosophy was concerned, since the sole conceivable being remained defined by scientific method. The living body, under these circumstances, could not escape the determinations which alone made the object into an object and without which it would have had no place in the system of experience. The value predicates which the reflecting judgment confers upon it had to be sustained, in being, by a foundation of physico-chemical properties. In ordinary experience we find a fittingness and a meaningful relationship between the gesture, the smile and the tone of a speaker. But this reciprocal relationship of expression which presents the human body as the outward manifestation of a certain manner of being-in-the-world, had, for mechanistic physiology, to be resolved into a series of causal relations.” —from _Phenomenology of Perception_, translated by Colin Smith, pp. 62-64
Maurice Merleau-Ponty
A related issue to the Anthropic Principle is the so-called “god-of-the-gaps” in which theists argue that the (shrinking) number of issues that science has not yet explained require the existence of a god. For example, science has not (yet) been able to demonstrate the creation of a primitive life-form in the laboratory from non-living material (though US geneticist Craig Venter’s recent demonstration lays claim to having created such a laboratory synthetic life-form, the “Mycoplasma Laboratorium”). It is therefore concluded that a god is necessary to account for this step because of the “gap” in scientific knowledge. The issue of creating life in the laboratory (and other similar “gap” issues such as those in the fossil record) is reminiscent of other such “gaps” in the history of science that have since been bridged. For example, the laboratory synthesis of urea from inorganic materials by Friedrich Wöhler in 1828 at that time had nearly as much impact on religious believers as Copernicus’s heliocentric universe proposal. From the time of the Ancient Egyptians, the doctrine of vitalism had been dominant. Vitalism argued that the functions of living organisms included a “vital force” and therefore were beyond the laws of physics and chemistry. Urea (carbamide) is a natural metabolite found in the urine of animals that had been widely used in agriculture as a fertilizer and in the production of phosphorus. However, Friedrich Wöhler was the first to demonstrate that a natural organic material could be synthesized from inorganic materials (a combination of silver isocyanate and ammonium chloride leads to urea as one of its products). The experiment led Wöhler famously to write to a fellow chemist that it was “the slaying of a beautiful hypothesis by an ugly fact,” that is, the slaying of vitalism by urea in a Petri dish. 
In practice, it took more than just Wöhler’s demonstration to slay vitalism as a scientific doctrine, but the synthesis of urea in the laboratory is one of the key advances in science in which the “gap” between the inorganic and the organic was finally bridged. And Wöhler certainly pissed on the doctrine of vitalism, if you will excuse a very bad joke.
Mick Power (Adieu to God: Why Psychology Leads to Atheism)
Another of our large ambitions here is to demonstrate that our new understanding of the relationship between parts and wholes in physical reality can serve as the basis for a renewed dialogue between the two cultures of humanists-social scientists and scientists-engineers. When C. P. Snow recognized the growing gap between these two cultures in his now famous Rede Lecture in 1959, his primary concern was that the culture of humanists-social scientists might become so scientifically illiterate that it would not be able to meaningfully evaluate the uses of new technologies
Robert L. Nadeau (The Non-Local Universe: The New Physics and Matters of the Mind)
At the beginning of his famous TV series, Cosmos, the American astronomer and cosmologist Carl Sagan said, “The cosmos is all that is, or ever was, or ever will be”. That is not a statement of science, to be put in the same category as, for example, the scientific statement that gravity obeys an inverse-square law. Sagan’s statement is simply an expression of his atheistic belief. The problem is, many people give to all statements by scientists the authority rightly due to science, simply because they are stated by a scientist.
John C. Lennox (Can Science Explain Everything?)
[Lucas] was most famous for his short, best-selling book on fossils, "Animals of the Past: An Account of Some of the Creatures of the Ancient World", in which he showed his gift for enlivening the driest science. Apologizing for using Latin scientific names, he wrote: 'The reader may perhaps sympathize with the old lady who said the discovery of all these strange animals did not surprise her so much as the fact that anyone should know their names when they were found.
Michael Capuzzo (Close to Shore: The Terrifying Shark Attacks of 1916)
Frankly, I see it as a “social force.” Before coming to this country (in naive pursuit of scientific freedom . . .) Dr. Reich had successively gotten expelled from the International Psychoanalytical Association for sounding too Marxist, expelled from the Communist Party for sounding too Freudian, expelled from the Socialist Party for sounding too anarchistic, fled Hitler for having known Jewish genes, and then got driven out of Sweden by a campaign of slander in the sensational press (for doing the kind of sex research that later made Masters and Johnson famous.) I would say Dr. Reich touched on a lot of hot issues and annoyed a lot of dogmatic people of Left and Right, and this created a truly international “social force” for the suppression of his ideas.
Robert Anton Wilson (Cosmic Trigger III: My Life After Death)
The second law has a reputation for being recondite, notoriously difficult, and a litmus test of scientific literacy. Indeed, the novelist and former chemist C. P. Snow is famous for having asserted in his The Two Cultures that not knowing the second law of thermodynamics is equivalent to never having read a work by Shakespeare. I actually have serious doubts about whether Snow understood the law himself, but I concur with his sentiments. The second law is of central importance in the whole of science, and hence in our rational understanding of the universe, because it provides a foundation for understanding why any change occurs. Thus, not only is it a basis for understanding why engines run and chemical reactions occur, but it is also a foundation for understanding those most exquisite consequences of chemical reactions, the acts of literary, artistic, and musical creativity that enhance our culture.
Peter Atkins (The Laws of Thermodynamics: A Very Short Introduction)
As a chief ingredient in the mythology of science, the accumulation of objective facts supposedly controls the history of conceptual change–as logical and self-effacing scientists bow before the dictates of nature and willingly change their views to accommodate the growth of conceptual knowledge. The paradigm for such an idealistic notion remains Huxley’s famous remark about “a beautiful theory killed by a nasty, ugly little fact.” But single facts almost never slay worldviews, at least not right away (and properly so, for the majority of deeply anomalous observations turn out to be wrong)... Anomalous facts get incorporated into existing theories, often with a bit of forced stretching to be sure, but usually with decent fit because most worldviews contain considerable flexibility. (How else could they last so long, or be so recalcitrant to overthrow?)
Stephen Jay Gould (Leonardo's Mountain of Clams and the Diet of Worms: Essays on Natural History)
Supernatural has several meanings; the usual is “miraculous; ascribed to agencies or powers above or beyond nature; divine.” Because science is commonly regarded as a method of studying the natural world, a supernatural phenomenon is by this definition unexplainable by, and therefore totally incompatible with, science. Today, a few religious traditions continue to maintain that psi is supernatural and therefore not amenable to scientific study. But a few hundred years ago virtually all natural phenomena were thought to be manifestations of supernatural agencies and spirits. Through years of systematic investigation, many of these phenomena are now understood in quite ordinary terms. Thus, it is entirely reasonable to expect that so-called miracles are simply indicators of our present ignorance. Any such events may be more properly labeled first as paranormal, then as normal once we have developed an acceptable scientific explanation. As astronaut Edgar Mitchell put it: “There are no unnatural or supernatural phenomena, only very large gaps in our knowledge of what is natural, particularly regarding relatively rare occurrences.” Mystical refers to the direct perception of reality; knowledge derived directly rather than indirectly. In many respects, mysticism is surprisingly similar to science in that it is a systematic method of exploring the nature of the world. Science concentrates on outer, objective phenomena, and mysticism concentrates on inner, subjective phenomena. It is interesting that numerous scientists, scholars, and sages over the years have revealed deep, underlying similarities between the goals, practices, and findings of science and mysticism. Some of the most famous scientists wrote in terms that are practically indistinguishable from the writings of mystics.
Dean Radin (The Conscious Universe: The Scientific Truth of Psychic Phenomena)
Take for example the most famous scientific equation of all time, E=mc², where 'c' denotes the speed of light, 'E' equals energy, and 'm' equals mass. If time is different or not existent in the quantum world or the event horizon of a black hole, then speed must also be different because speed is a measure of the rate in which time passes when an object travels over a distance between two points. Hypothetically, if time doesn't exist in these places, either E=m alone or the whole equation no longer applies.
Vera Percepio (The Philosophy of Vera Percepio)
Scientific studies suggest that only about 25 percent of how long we live is dictated by genes, according to famous studies of Danish twins. The other 75 percent is determined by our lifestyles and the everyday choices we make. It follows that if we optimize our lifestyles, we can maximize our life expectancies within our biological limits.
Dan Buettner (The Blue Zones: 9 Lessons for Living Longer From the People Who've Lived the Longest)
John Maynard Keynes’s famous view about long-term forecasts: ‘About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know.
Tim Harford (How to Make the World Add Up : Ten Rules for Thinking Differently About Numbers)
The grand idea was an atlas. A collection of maps, both of real places and of imagined ones, but reversed. She and Daniel had come up with a list of books, fantasy novels famous for the beautiful maps created just for them—Tolkien’s The Lord of the Rings; Le Guin’s Earthsea series; Lewis’s The Chronicles of Narnia books; Dragt’s De brief voor de koning, The Letter for the King; Pratchett’s Discworld novels—and another list of maps from our real world, famous for their cartographic significance. We would painstakingly research all of them, studying them from historical, scientific, and artistic angles, and then redraw them in the opposite style. Our recreations of the fantasy maps would be rigidly detailed and precise, and our re-creations of the realistic maps would be embellished, expanded, and dreamlike, like their fictional cousins. Once complete, we planned to publish it in one giant volume. Readers would open it, expecting the same old type of atlas, but instead, they’d find previously familiar lands rendered in a completely unexpected manner, opening their imaginations to an entirely new way of looking at maps.
Peng Shepherd (The Cartographers)
The Natural Law Argument Bertrand Russell: “There is, as we all know, a law that if you throw dice, you will get double sixes only about once in thirty-six times, and we do not regard that as evidence that the fall of the dice is regulated by design.” Russell's argument is a logical fallacy because we cannot impose our understanding and interpretation of playing dice on God or the natural law. We must first define or understand our subject to talk about anything with scientific precision. Since nobody has an understanding of the world before the world, to put it that way, we cannot have a clear understanding or grasp of the things that are beyond our cognitive powers. We still can think about them. To say that science is only what is proven by scientific experiments would be foolish because that would exclude the vast space of the unknown, even unknowable. Maybe God does not play dice, but maybe even God needs, metaphorically speaking, to throw out thirty-six worlds to make some effects, even if only two, that would otherwise not be possible. As we know, matter cannot power itself and organize itself without the underlying creative force empowering it. Matter is matter thanks to our perceptive and cognitive powers, not per se. Matter per se does not exist in the form we see it. What we see is a reality based on our senses. We cannot completely rely on our senses to tell the underlying reality. Reaching the underlying reality is possible only through abstract thought. This abstract thought will enhance scientific discoveries because we cannot reach the physically unreachable by experiments or strictly scientific means. Identification of God from religious books with God independent of holy books is prevalent in the books or arguments against God used by the most famous atheists, including agnostics like Bertrand Russell. However, a huge difference exists between a God from religious books and Spinoza’s God or the God of many philosophers and scientists. 
Once we acknowledge and accept this important difference, we will realize that the gap between believers (not contaminated by religions) and atheists (or agnostics) is much smaller than it looks at first sight. God is not in the religious books, nor can he be owned through religious books. The main goal of the major monotheistic religions is to a priori appropriate and establish the right to God rather than to define and explain God in the deepest possible sense because that is almost impossible, even for science and philosophy. For that reason, a belief in blind faith and fear mostly saves major religions, rather than pure belief, unaffected by religious influence or deceit.
Dejan Stojanovic
Consider Edgar Allan Poe’s famous poem, “The Raven.” Here we have a first-person narrator whose wife or lover, Lenore, has recently died. He is in his library searching through his books to find a way to make her death meaningful—or even understandable. When a raven enters the library, the narrator takes it as a sign and asks a series of increasingly desperate questions. The raven, of course, has long been a symbol for death, and the questions that the narrator asks the raven are all really questions about death. Is there a heaven? Does death come from God or the Devil? Will he ever get over her death? Will he see her again? These are likely the same things he was trying to find out from his books. But while the books may have tried to give answers, the raven—death itself—says only one word: “Nevermore.” So this is a poem that makes claims—or, more specifically, it is a poem that rejects claims. It rejects the notion that anyone can know anything about death, or what happens after death, except that a person who has died no longer exists. All that death “says” to us is “Nevermore.” If we try to go beyond this, we will eventually suffer the narrator’s fate and become insane. Many people would disagree vigorously with this premise. Some people believe that the spirits of the dead become ghosts that we can still communicate with. Others believe in heaven, hell, reincarnation, Nirvana, or some knowable final destination for the soul. I can imagine a number of different ways that one might go about rebutting Poe’s metaphysical truth claims. But it makes no difference whether or not ravens can talk. Nothing about Poe’s poem can be supported, or refuted, by scientific knowledge about the vocalization mechanisms of the Corvus corax. Nor does it matter whether or not Edgar Allan Poe ever knew anybody named Lenore, or owned a “bust of Pallas,” or did or said any of the things described in the poem. “The Raven” makes metaphysical truth claims that we can isolate and evaluate.
But these claims do not depend on either the history or the science of the poem turning out to be true.
Michael Austin (Re-reading Job: Understanding the Ancient World’s Greatest Poem (Contemporary Studies in Scripture))
That’s the beauty of the famous scientific method. You observe your subject, ask questions, and then research before establishing a hypothesis.
Claudia Y. Burgoa (Undefeated (Unexpected #5))
DARWIN’S “SACRED CAUSE”? Much ink has been dedicated to determining Charles Darwin’s role in “scientific racism.” The only way to empirically and scientifically determine his role is to organize the events as a timeline, thus placing them in the context of historical events. Political analysis without historical context is all sail and no rudder. In America we are constantly made aware that both Abraham Lincoln and Charles Darwin were born on the same day, in the same year, February 12, 1809. Adrian Desmond and James Moore’s famous 2009 book, “Darwin’s Sacred Cause,” leverages this factoid in an effort to place Charles Darwin at par with Abraham Lincoln in the abolition of slavery. This fraudulently steals away credit from Abraham Lincoln, who took a bullet to the head for the cause, and transfers it by inference to an aristocrat who remained in his plush abode throughout the conflict and never lifted a finger for the cause.
A.E. Samaan (From a "Race of Masters" to a "Master Race": 1948 to 1848)
Many American boys who fought in WWII had been sterilized under eugenic laws upheld by the United States Supreme Court in the 1927 case of Buck v. Bell. Over 80,000 Americans would be forcibly sterilized under that legal precedent. Coincidentally, Buck v. Bell is also the legal precedent cited in Roe v. Wade, the famous abortion rights case.
A.E. Samaan (H.H. Laughlin: American Scientist, American Progressive, Nazi Collaborator (History of Eugenics, Vol. 2))
An LDL around 70 mg/dL corresponds to a total cholesterol reading of about 150, the level below which no deaths from coronary heart disease were reported in the famous Framingham Heart Study, a generations-long project to identify risk factors for heart disease.
Michael Greger (How Not to Die: Discover the Foods Scientifically Proven to Prevent and Reverse Disease)
The only way to argue this in a comprehensive way would be to adopt a universal skepticism about method. The most famous case for such skepticism is the 1975 book Against Method by maverick philosopher of science Paul Feyerabend. Feyerabend appeals to the history of science to argue that no methodological prescription has ever been consistently followed in science
Howard Margolis (It Started With Copernicus: How Turning the World Inside Out Led to the Scientific Revolution)
Mysteries are powerful, Cialdini says, because they create a need for closure. “You’ve heard of the famous Aha! experience, right?” he says. “Well, the Aha! experience is much more satisfying when it is preceded by the Huh? experience.” By creating a mystery, the writer-astronomer made dust interesting. He sustained attention, not just for the span of a punch line but for the span of a twenty-page article dense with information on scientific theories and experimentation.
Chip Heath (Made to Stick: Why some ideas take hold and others come unstuck)
During the Middle Ages, the Catholic Church actively supported a great deal of science, but it also decided that philosophical speculation should not impinge on theology. Ironically, by keeping philosophers focused on nature instead of metaphysics, the limitations set by the Church may even have benefited science in the long term. Furthermore and contrary to popular belief, the Church never supported the idea that the earth is flat, never banned human dissection, never banned zero and certainly never burnt anyone at the stake for scientific ideas. The most famous clash between science and religion was the trial of Galileo Galilei (1564–1642) in 1633. Academic historians are now convinced that this had as much to do with politics and the Pope’s self-esteem as it did with science. The trial is fully explained in the last chapter of this book, in which we will also see how much Galileo himself owed to his medieval predecessors.
James Hannam (God's Philosophers)
Bill Wilson would never have another drink. For the next thirty-six years, until he died of emphysema in 1971, he would devote himself to founding, building, and spreading Alcoholics Anonymous, until it became the largest, most well-known and successful habit-changing organization in the world. An estimated 2.1 million people seek help from AA each year, and as many as 10 million alcoholics may have achieved sobriety through the group. AA doesn’t work for everyone—success rates are difficult to measure, because of participants’ anonymity—but millions credit the program with saving their lives. AA’s foundational credo, the famous twelve steps, have become cultural lodestones incorporated into treatment programs for overeating, gambling, debt, sex, drugs, hoarding, self-mutilation, smoking, video game addictions, emotional dependency, and dozens of other destructive behaviors. The group’s techniques offer, in many respects, one of the most powerful formulas for change. All of which is somewhat unexpected, because AA has almost no grounding in science or most accepted therapeutic methods. Alcoholism, of course, is more than a habit. It’s a physical addiction with psychological and perhaps genetic roots. What’s interesting about AA, however, is that the program doesn’t directly attack many of the psychiatric or biochemical issues that researchers say are often at the core of why alcoholics drink. In fact, AA’s methods seem to sidestep scientific and medical findings altogether, as well as the types of intervention many psychiatrists say alcoholics really need. What AA provides instead is a method for attacking the habits that surround alcohol use. AA, in essence, is a giant machine for changing habit loops. And though the habits associated with alcoholism are extreme, the lessons AA provides demonstrate how almost any habit—even the most obstinate—can be changed.
Charles Duhigg (The Power Of Habit: Why We Do What We Do In Life And Business)
Even if it were right, third-degree doesn’t work as well as scientific methods.” His approach was to get the suspect to talk about anything. “It doesn’t matter what. Ask him about the weather, his favorite food, anything. If you get him to say something, you can pry the truth out of him eventually.
Jason Lucky Morrow (Famous Crimes the World Forgot: Ten Vintage True Crime Stories Rescued from Obscurity)
I propose that a scientifically sound set of hygiene rules could explain why the Jewish God was a jealous, monotheistic God: He could brook no compromise with the filthy practices of rival gods. Famously faceless, abstract, and unsuperstitious, the Jewish God was science.
John Durant (The Paleo Manifesto: Ancient Wisdom for Lifelong Health)
Karl Popper famously suggested the criterion of “falsifiability”: A theory is scientific if it makes clear predictions that can be unambiguously falsified.
John Brockman (This Idea Must Die: Scientific Theories That Are Blocking Progress (Edge Question))
We need to analyze and contemplate the experience of modernity in the Arab and Muslim world, in order to grasp what is happening. Some of us, for example, reject modernity, and yet it’s obvious that these same people are using the products of modernity, even to the extent that when proselytizing their interpretation of Islam, which conflicts with modernity, they’re employing the tools of modernity to do so. This strange phenomenon can best be understood by contemplating our basic attitude towards modernity, stemming from two centuries ago. If we analyze books written by various Muslim thinkers at the time, concerning modernity and the importance of modernizing our societies, and so forth, we can see that they distinguished between certain aspects of modernity that should be rejected, and others that may be accepted. You can find this distinction in the very earliest books that Muslim intellectuals wrote on the topic of modernity. To provide a specific example, I’ll cite an important book that is widely regarded as having been the first ever written about modern thought in the Muslim world, namely, a book by the famous Egyptian intellectual, Rifa’ Rafi’ al-Tahtawi (1801–1873), Takhlis al-Ibriz fi Talkhis Bariz, whose title may be translated as Mining Gold from Its Surrounding Dross. As you can immediately grasp from its title, the book distinguishes between the “gold” contained within modernity—gold being a highly prized, expensive and rare product of mining—and its so-called “worthless” elements, which Muslims are forbidden to embrace. Now if we ask ourselves, “What elements of modernity did these early thinkers consider acceptable, and what did they demand that we reject?,” we discover that technology is the “acceptable” element of modernity. We are told that we may adopt as much technology as we want, and exploit these products of modernity to our heart’s content.
But what about the modes of thought that give rise to these products, and underlie the very phenomenon of modernity itself? That is, the free exercise of reason, and critical thought? These two principles are rejected and proscribed for Muslims, who may adopt the products of modernity, while its substance, values and foundations, including its philosophical modes of thought, are declared forbidden. Shaykh Rifa’ Rafi’ al-Tahtawi explained that we may exploit knowledge that is useful for defense, warfare, irrigation, farming, etc., and yet he simultaneously forbade us to study, or utilize, the philosophical sciences that gave rise to modern thought, and the love for scientific methodologies that enlivens the spirit of modern knowledge, because he believed that they harbored religious deviance and infidelity (to God).
Ali Mabrouk (علي مبروك)
the Harveys’ most famous son. An experimental physician famous for his discovery of the circulation of the blood, he had been the personal physician to Charles I and had been present with him at the Battle of Edgehill in 1642. Research in the Harvey family papers has also revealed that he was responsible for the only known scientific examination of a witch’s familiar. Personally ordered by Charles I to examine a lady suspected of witchcraft who lived on the outskirts of Newmarket, the dubious Harvey visited her in the guise of a wizard. He succeeded in capturing and dissecting her pet toad. The animal, Harvey concluded dryly, was a toad.
Sam Willis (The Fighting Temeraire: The Battle of Trafalgar and the Ship that Inspired J.M.W. Turner's Most Beloved Painting)
Think of Sir Isaac Newton. He spent two years working on what became Principia Mathematica, his famous writings on universal gravitation and the three laws of motion. This period of almost solitary confinement proved critical in what became a true breakthrough that shaped scientific thinking for the next three hundred years.
Greg McKeown (Essentialism: The Disciplined Pursuit of Less)
THE “AUSTRALOPITHS” At 4.2 million years ago, in northern Kenya, we find the first evidence of a hominid species called Australopithecus anamensis. This is the first member of our family whose fossil leg and foot bones speak directly of upright bipedality. Its jaws and teeth were also comfortingly similar to the next-in-time Australopithecus afarensis, a hominid whose fossils are widely known in eastern Africa between about 3.6 and 3.0 million years ago. Most famously represented by the 3.2-million-year-old partial skeleton “Lucy,” from Hadar in Ethiopia,
Ian Tattersall (Race?: Debunking a Scientific Myth (Texas A&M University Anthropology Series Book 15))
The RAF used scientific advancements to detect U-boats. They used ASV radars and Leigh Searchlights that made detection of U-boats at night possible. Once a U-boat was located, an attack would be carried out using conventional weapons and torpedoes. The RAF did not have to worry about never seeing a U-boat, as U-boats had to surface in order to recharge their batteries. The aerial depth charge made it very difficult for the U-boats to stay in one place for a long period of time.
Ryan Jenkins (World War 2 Air Battles: The Famous Air Combats that Defined WWII)
What a jerk. How dare he tell Stanley that he was wasting the gift? Famous for being famous. Yeah, right. He was the first human being ever brought back to life by scientific means. Famous for being famous. Jesus Christ.
Jeff Strand (The Sinister Mr. Corpse)
Snake tongues come in shades of lipstick red, electric blue, and inky black. Outstretched and splayed, they can be longer and wider than their owners’ heads. Kurt Schwenk has been fascinated by them for decades, and he often finds that he’s alone in that. In the second year of his PhD, he told a fellow student what he was working on, eager to revel in the joys of scientific pursuits with a like-minded soul. The student (who is now a famous ecologist) burst out laughing. “That would have been enough to hurt my feelings, but this was a guy who studied the mites that hang out in the nostrils of hummingbirds,” Schwenk tells me, still slightly outraged.
Ed Yong (An Immense World: How Animal Senses Reveal the Hidden Realms Around Us)
Celera had achieved nothing short of a scientific miracle. So why did the stock crash? The likeliest explanation is simply that the fires of anticipation are so easily quenched by the cold water of reality. Once the good news that investors have awaited for so long is out, the thrill is gone. The resulting emotional vacuum almost instantly fills up with a painful awareness that the future will not be nearly as exciting as the past. (As Yogi Berra famously said, “The future ain’t what it used to be.”) Getting exactly what they wished for leaves investors with nothing to look forward to, so they get out and the stock crashes.
Jason Zweig (Your Money and Your Brain)
in 1931 Edwin Hubble invited Einstein to the observatory of the Hooker Telescope near Pasadena, California, and showed him that, in fact, the universe was expanding. Einstein then pronounced one of his most famous sentences: “Now I see the necessity of a beginning,” a lofty phrase followed by another equally famous but more earthly remark, “That was the biggest blunder of my whole life,” referring to the cosmological constant that he had devised in order to adjust the Theory of Relativity. In an ironic twist, the cosmological constant then proved to exist, although not in the magnitude that Einstein attributed to it. We will see this further on.
José Carlos González-Hurtado (New Scientific Evidence for the Existence of God)
Francis Crick was a British molecular biologist and co-discoverer with James Watson of the structure of DNA, for which he received the 1962 Nobel Prize in Physiology and Medicine. Mr. Crick was a militant atheist, a Christianophobe, and in favor of eugenics, an idea that he blamed religion for delaying (and on that point, he may have been right). He recognized the impossibility of DNA being produced by chance, and since he considered some intelligent cause necessary for it, he proposed his famous hypothesis of “panspermia,” which came to mean that life on Earth was sown by intelligent extraterrestrials. Yes, you read that correctly, by extraterrestrials.
José Carlos González-Hurtado (New Scientific Evidence for the Existence of God)
So when we look at our cell phones—as almost everyone does countless times every day—we stare directly into the face of a scientific invention made, in part, upon Hedy Lamarr’s invention. It is a tangible reminder of her life, beyond the films for which she is more famous. And who knows whether the cell phone as we know it today would have been constructed without her work?
Marie Benedict (The Only Woman in the Room)
This hardened response to those on the “other team” is not an invention of modern American politics. It seems to be hardwired into the circuitry of our brains. The Old Testament is filled with stories of sometimes deadly tribalism, and scientific data gives us insight into why that happens. In 1968, elementary school teacher Jane Elliott conducted a famous experiment with her students in the days after the assassination of Dr. Martin Luther King Jr. She divided the class by eye color. The brown-eyed children were told they were better. They were the “in-group.” The blue-eyed children were told they were less than the brown-eyed children—hence becoming the “out-group.” Suddenly, former classmates who had once played happily side by side were taunting and torturing one another on the playground. Lest we assign greater morality to the “out-group,” the blue-eyed children were just as quick to attack the brown-eyed children once the roles were reversed.
Sarah Stewart Holland (I Think You're Wrong (But I'm Listening): A Guide to Grace-Filled Political Conversations)
In the same way that Firestone’s embrace of scientific and technological progress as manifest destiny tips its hat to Marx and Engels, so also it resembles (perhaps even more closely) the Marxist-inspired biofuturism of the interwar period, particularly in Britain, in the work of writers such as H. G. Wells, J. B. S. Haldane, J. D. Bernal, Julian Huxley, Conrad Waddington, and their contemporaries (including Gregory Bateson and Joseph Needham, the latter of whose embryological interests led to his enduring fascination with the history of technology in China). Interestingly, it is also in these early twentieth century writings that ideas about artificial reproduction, cybernation, space travel, genetic modification, and ectogenesis abound. As cultural theorist Susan Squier has demonstrated, debates about ectogenesis were crucial to both the scientific ambitions and futuristic narratives of many of the United Kingdom’s most eminent biologists from the 1920s and the 1930s onward. As John Burdon Sanderson (“Jack”) Haldane speculated in his famous 1923 paper “Daedalus, or Science and the Future” (originally read to the Heretics society in Cambridge) ectogenesis could provide a more efficient and rational basis for human reproduction in the future: [W]e can take an ovary from a woman, and keep it growing in a suitable fluid for as long as twenty years, producing a fresh ovum each month, of which 90 per cent can be fertilized, and the embryos grown successfully for nine months, and then brought out into the air.
Mandy Merck (Further Adventures of The Dialectic of Sex: Critical Essays on Shulamith Firestone (Breaking Feminist Waves))
without the scientific knowledge and machines that enable current mind-control
Kathryn Harkup (Vampirology: The Science of Horror's Most Famous Fiend)
Sir,’ I commented, ‘I have been thinking of the scientific men of the West, greater by far in intelligence than most people congregated here, living in distant Europe and America, professing different creeds, and ignorant of the real values of such melas as the present one. They are the men who could benefit greatly by meetings with India’s masters. But, although high in intellectual attainments, many Westerners are wedded to rank materialism. Others, famous in science and philosophy, do not recognize the essential unity in religion. Their creeds serve as insurmountable barriers that threaten to separate them from us forever.’ ‘I saw that you are interested in the West, as well as the East.’ Babaji’s face beamed with approval. ‘I felt the pangs of your heart, broad enough for all men, whether Oriental or Occidental. That is why I summoned you here. ‘East and West must establish a golden middle path of activity and spirituality combined,’ he continued. ‘India has much to learn from the West in material development; in return, India can teach the universal methods by which the West will be able to base its religious beliefs on the unshakable foundations of yogic science.
Paramahansa Yogananda (Autobiography of a Yogi)
identity politics,” a contemporary tendency to align without question with categorical divisions like political parties, sexual orientation, skin color, class, et cetera, intolerantly, in the name of virtue. The famous psychologists Carl Jung and Dr. Jordan B. Peterson share this scientifically minded perspective. According to Dr.
Richard L Haight (The Genesis Code: Revealing the Ancient Path to Inner Freedom)
The superstitions of today are the scientific facts of tomorrow.
John L. Balderston (Dracula: The Ultimate, Illustrated Edition of the World-Famous Vampire Play)
The climate change debate resembles the famous tale of a group of blind men touching various parts of an elephant, each arriving at a very different idea of what it is like.
Craig D. Idso (Why Scientists Disagree About Global Warming: The NIPCC Report on Scientific Consensus)
The intrusion of entertainment in worship today can trace its roots back to the work of revivalist minister Charles G. Finney (1792–1875). An American Presbyterian minister, Finney became famous for the methods employed at his meetings, later known as the “new measures,” which were carefully designed to manipulate an emotional response from the crowd. For Finney, there was a formula that, employed correctly, would guarantee interest in the things of God. He said so himself: “A revival is not a miracle, or dependent on a miracle in any sense. It is a purely philosophic [i.e., scientific] result of the right use of the constituted means.”2 It was this sort of ministry that caused Charles Spurgeon (1834–1892) to remark in the 1800s that “the devil has seldom done a cleverer thing than hinting to the church that part of their mission is to provide entertainment for the people, with a view to winning them.”3 These words are just as true today.
Jonathan Landry Cruse (What Happens When We Worship)
Celia Farber, whose 2006 Harper’s article, “Out of Control: AIDS and the Destruction of Medical Science,” laid bare the culture of squalor, corruption, and violence at the vendetta-driven Division of AIDS (DAIDS). “The latter [genuine scientists] are the minority. They look, sound, and behave like scientists. And to varying degrees, they all live in a climate of both economic and reputational persecution. Peter Duesberg is one very famous example but there are others. Fauci’s vendetta system has many ways of crushing the natural scientific impulse—to question and to demand proof. Breathtakingly, because of Fauci’s impact since 1984, this tradition has been all but snuffed out in the US. ‘Everybody is afraid.’ How many times have I heard that line?
Robert F. Kennedy Jr. (The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health)
Evolutionary biologist Stephen Jay Gould was diagnosed with a form of cancer that had a median survival time of eight months; he died of a different and unrelated kind of cancer twenty years later.3 Gould subsequently wrote a famous article called “The Median Isn’t the Message,” in which he argued that his scientific knowledge of statistics saved him from the erroneous conclusion that he would necessarily be dead in eight months.
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
Nietzsche’s most famous views are his earliest ones: the accounts of the Apollonian and Dionysian “art-drives” (Kunsttrieben) in The Birth of Tragedy. Already there, let’s note, Nietzsche is explaining aesthetic experience by “drives”. But in that first book these drives are mainly thought of in Schopenhauer’s way, as manifestations of a metaphysical, noumenal will. This early aesthetics is premised as responding to this noumenal reality: both Apollonian and Dionysian art drives are ways of coping with that reality of Schopenhauerian will. But Nietzsche soon insists on thinking of drives scientifically—not only of what they are (the body’s abilities), but of why we have them (evolution by selection)... It’s in aesthetics that this step into naturalism moves Nietzsche furthest from Schopenhauer. For Schopenhauer had depicted our aesthetic experience as (unlike intellect) genuinely a disengagement from willing: it really achieves the objectivity we only thought we could have in our science. But Nietzsche insists that it too expresses a (naturalized) will and drive—and “serves life” by making us more fit. As such, the aesthetic attitude is not “disinterested” or “disengaged” at all, as not just Schopenhauer but Kant had found it. Nietzsche now scorns their notion of it. The aesthetic attitude in fact involves a heightening of our engagement and feeling. These drives, in which art and aesthetic experience are ultimately rooted, are something ancient and fixed in us. Indeed, artistic drives have been designed into all organisms. They were set into our bodies and our “blood” in our presocietal deep history, and persist there today beneath the layers of customs and habits that societies have superimposed on them (to exploit them, or counteract them, or both). By acting on these drives, beauty works on the “animal” in us—directly on the body, on the “muscles and senses” (WP809 [1888]), and the drives embedded in them. 
Our bodies themselves have a taste for certain kinds of beauty—above all the beauty of human bodies.
John Richardson (Nietzsche's New Darwinism)
Death and life are two sides of the same coin; you cannot have one without the other. Each time you surrender, each time you trust the dying, your faith is led to a deeper level and you discover a Larger Self underneath. You decide not to push yourself to the front of the line, and something much better happens in the back of the line. You let go of your narcissistic anger, and you find that you start feeling much happier. You surrender your need to control your partner, and finally the relationship blossoms or ends. Yet each time it is a choice—and each time it is a kind of dying. It seems we only know what life is when we know what death is. The mystics and great saints were those who had learned to trust and allow this pattern, and often said in effect, “What did I ever lose by dying?” Or try Paul’s famous one-liner: “For me to live is Christ and to die is gain” (Philippians 1:21). Now even scientific studies, including those of near-death experiences, reveal the same universal pattern. Things change and grow by dying to their present state, but each time it is a risk. We always wonder, “Will it work this time?” So many academic disciplines are coming together, each in their own way, to say that there’s a constant movement of loss and renewal at work in this world at every level. It seems to be the pattern of all growth and evolution. To be alive means to surrender to this inevitable flow. It’s the same pattern in every atom, in every human relationship, and in every galaxy. Indigenous peoples, Hindu gurus, Buddha, Moses, Muhammad, and Jesus all saw it clearly in human history and named it as a kind of “necessary dying.” If this pattern is true, it has been true all the time and everywhere. Such seeing did not just start two thousand years ago. All of us have to eventually learn to let go of something smaller so something bigger can happen. But that’s not a religion—it’s highly visible truth. It is the Way Reality Works. 
Yes, I am saying that the way things work and Christ are one and the same. This is not a religion to be either fervently joined or angrily rejected. It is a train ride already in motion. The tracks are visible everywhere. You can be a willing and happy traveler. Or not.
Richard Rohr
The famous Canadian physician William Osler once wrote, “In science the credit goes to the man who convinced the world, not to the man to whom the idea first occurs.”
John Brockman (This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking)
Whether the real setting and dating of the Hermetic tradition in late antiquity are, in fact, irrelevant to its reception in the Renaissance is an interesting hermeneutic question that cannot be answered here. In any case and for many other reasons, Yates’s views on the Hermetica became famous for some, notorious for others, especially when, in a 1968 article, she made Hermes a major figure in the preliminaries to the scientific revolution, just two years after J.E. McGuire and P.M. Rattansi had connected Newton’s physics with the ancient theology theme so closely associated with Hermes.
Hermes Trismegistus (Hermetica: The Greek Corpus Hermeticum and the Latin Asclepius in a New English Translation, with Notes and Introduction)