Famous Scientists Quotes

We've searched our database for all the quotes and captions related to Famous Scientists. Here they are! All 100 of them:

Isn’t every human being both a scientist and an artist; and in writing of human experience, isn’t there a good deal to be said for recognizing that fact and for using both methods?
James Agee (Let Us Now Praise Famous Men)
Nonsense remains nonsense, even when talked by world-famous scientists.
John C. Lennox
Anyway. I’m not allowed to watch TV, although I am allowed to rent documentaries that are approved for me, and I can read anything I want. My favorite book is A Brief History of Time, even though I haven’t actually finished it, because the math is incredibly hard and Mom isn’t good at helping me. One of my favorite parts is the beginning of the first chapter, where Stephen Hawking tells about a famous scientist who was giving a lecture about how the earth orbits the sun, and the sun orbits the solar system, and whatever. Then a woman in the back of the room raised her hand and said, “What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.” So the scientist asked her what the tortoise was standing on. And she said, “But it’s turtles all the way down!” I love that story, because it shows how ignorant people can be. And also because I love tortoises.
Jonathan Safran Foer (Extremely Loud & Incredibly Close)
When you are famous it is hard to work on small problems. This is what did Shannon in. After information theory, what do you do for an encore? The great scientists often make this error. They fail to continue to plant the little acorns from which the mighty oak trees grow. They try to get the big thing right off. And that isn't the way things go. So that is another reason why you find that when you get early recognition it seems to sterilize you.
Richard Hamming
It is a well-known established fact throughout the many-dimensional worlds of the multiverse that most really great discoveries are owed to one brief moment of inspiration. There's a lot of spadework first, of course, but what clinches the whole thing is the sight of, say, a falling apple or a boiling kettle or the water slipping over the edge of the bath. Something goes click inside the observer's head and then everything falls into place. The shape of DNA, it is popularly said, owes its discovery to the chance sight of a spiral staircase when the scientist’s mind was just at the right receptive temperature. Had he used the elevator, the whole science of genetics might have been a good deal different. This is thought of as somehow wonderful. It isn't. It is tragic. Little particles of inspiration sleet through the universe all the time traveling through the densest matter in the same way that a neutrino passes through a candyfloss haystack, and most of them miss. Even worse, most of the ones that hit the exact cerebral target, hit the wrong one. For example, the weird dream about a lead doughnut on a mile-high gantry, which in the right mind would have been the catalyst for the invention of repressed-gravitational electricity generation (a cheap and inexhaustible and totally non-polluting form of power which the world in question had been seeking for centuries, and for the lack of which it was plunged into a terrible and pointless war) was in fact had by a small and bewildered duck. By another stroke of bad luck, the sight of a herd of wild horses galloping through a field of wild hyacinths would have led a struggling composer to write the famous Flying God Suite, bringing succor and balm to the souls of millions, had he not been at home in bed with shingles. The inspiration thereby fell to a nearby frog, who was not in much of a position to make a startling contribution to the field of tone poetry. Many civilizations have recognized this shocking waste and tried various methods to prevent it, most of them involving enjoyable but illegal attempts to tune the mind into the right wavelength by the use of exotic herbage or yeast products. It never works properly.
Terry Pratchett (Sourcery (Discworld, #5; Rincewind, #3))
Pierre Curie, a brilliant scientist, happened to marry a still more brilliant one—Marie, the famous Madame Curie—and is the only great scientist in history who is consistently identified as the husband of someone else.
Isaac Asimov (Views From a Height: A Brilliant Overview of the Exciting Realms of Science)
For her next birthday she'd asked for a telescope. Her mother had been alive then, and had suggested a pony, but her father had laughed and bought her a beautiful telescope, saying: "Of course she should watch the stars! Any girl who cannot identify the constellation of Orion just isn't paying attention!" And when she started asking him complicated questions, he took her along to lectures at the Royal Society, where it turned out that a nine-year-old girl who had blond hair and knew what the precession of the equinoxes was could ask hugely bearded famous scientists anything she liked. Who'd want a pony when you could have the whole universe?
Terry Pratchett
Even if you are not a religious person by nature or training—even if you are an out-and-out skeptic—prayer can help you much more than you believe, for it is a practical thing. What do I mean, practical? I mean that prayer fulfills these three very basic psychological needs which all people share, whether they believe in God or not: 1. Prayer helps us to put into words exactly what is troubling us. We saw in Chapter 4 that it is almost impossible to deal with a problem while it remains vague and nebulous. Praying, in a way, is very much like writing our problems down on paper. If we ask help for a problem—even from God—we must put it into words. 2. Prayer gives us a sense of sharing our burdens, of not being alone. Few of us are so strong that we can bear our heaviest burdens, our most agonizing troubles, all by ourselves. Sometimes our worries are of so ultimate a nature that we cannot discuss them even with our closest relatives or friends. Then prayer is the answer. Any psychiatrist will tell us that when we are pent-up and tense, and in an agony of spirit, it is therapeutically good to tell someone our troubles. When we can’t tell anyone else—we can always tell God. 3. Prayer puts into force an active principle of doing. It’s a first step toward action. I doubt if anyone can pray for some fulfillment, day after day, without benefiting from it—in other words, without taking some steps to bring it to pass. The world-famous scientist, Dr. Alexis Carrel, said: “Prayer is the most powerful form of energy one can generate.” So why not make use of it? Call it God or Allah or Spirit—why quarrel with definitions as long as the mysterious powers of nature take us in hand?
Dale Carnegie (How To Stop Worrying & Start Living)
This picture of a hot early stage of the universe was first put forward by the scientist George Gamow in a famous paper written in 1948 with a student of his, Ralph Alpher. Gamow had quite a sense of humor—he persuaded the nuclear scientist Hans Bethe to add his name to the paper to make the list of authors “Alpher, Bethe, Gamow.”
Stephen Hawking (A Brief History of Time)
Noam Chomsky has famously argued that a Martian scientist would conclude that all earthlings speak dialects of the same language.
Guy Deutscher (Through the Language Glass: Why the World Looks Different in Other Languages)
wouldn’t mind—I mean, I know he was a famous scientist, but that’s all I know. Could you tell me how you knew him? Maybe supply an anecdote? Did you know him long?
Bonnie Garmus (Lessons in Chemistry)
It is easy for a famous scientist to have lots of students doing the dirty work for him,” said one colleague. “But Opje helps people with their problems and then gives them the credit.
Kai Bird (American Prometheus)
What this all goes to show is that nonsense remains nonsense, even when talked by world-famous scientists. What serves to obscure the illogicality of such statements is the fact that they are made by scientists; and the general public, not surprisingly, assumes that they are statements of science and takes them on authority. That is why it is important to point out that they are not statements of science, and any statement, whether made by a scientist or not, should be open to logical analysis. Immense prestige and authority does not compensate for faulty logic.
John C. Lennox (God and Stephen Hawking)
A separate, international team analyzed more than a half million research articles, and classified a paper as “novel” if it cited two other journals that had never before appeared together. Just one in ten papers made a new combination, and only one in twenty made multiple new combinations. The group tracked the impact of research papers over time. They saw that papers with new knowledge combinations were more likely to be published in less prestigious journals, and also much more likely to be ignored upon publication. They got off to a slow start in the world, but after three years, the papers with new knowledge combos surpassed the conventional papers, and began accumulating more citations from other scientists. Fifteen years after publication, studies that made multiple new knowledge combinations were way more likely to be in the top 1 percent of most-cited papers. To recap: work that builds bridges between disparate pieces of knowledge is less likely to be funded, less likely to appear in famous journals, more likely to be ignored upon publication, and then more likely in the long run to be a smash hit in the library of human knowledge.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Harvard Square looked both new and familiar. I felt like I would have been able to tell just from looking that this configuration of buildings and streets was familiar and meaningful to lots of people, not just me. It was weird to visit a suburb that nobody else ever visited or went to, and then to return to these widely known halls and buildings where famous statesmen and writers and scientists had been coming for hundreds of years.
Elif Batuman (The Idiot)
New Rule: Stop pretending your drugs are morally superior to my drugs because you get yours at a store. This week, they released the autopsy report on Anna Nicole Smith, and the cause of death was what I always thought it was: mad cow. No, it turns out she had nine different prescription drugs in her—which, in the medical field, is known as the “full Limbaugh.” They opened her up, and a Walgreens jumped out. Antidepressants, anti-anxiety pills, sleeping pills, sedatives, Valium, methadone—this woman was killed by her doctor, who is a glorified bartender. I’m not going to say his name, but only because (a) I don’t want to get sued, and (b) my back is killing me. This month marks the thirty-fifth anniversary of a famous government report. I was sixteen in 1972, and I remember how excited we were when Nixon’s much ballyhooed National Commission on Drug Abuse came out and said pot should be legalized. It was a moment of great hope for common sense—and then, just like Bush did with the Iraq Study Group, Nixon took the report and threw it in the garbage, and from there the ’70s went right into disco and colored underpants. This week American Scientist, a magazine George Bush wouldn’t read if he got food poisoning in Mexico and it was the only thing he could reach from the toilet, described a study done in England that measured the lethality of various drugs, and found tobacco and alcohol far worse than pot, LSD, or Ecstasy—which pretty much mirrors my own experiments in this same area. The Beatles took LSD and wrote Sgt. Pepper—Anna Nicole Smith took legal drugs and couldn’t remember the number for nine-one-one. I wish I had more time to go into the fact that the drug war has always been about keeping black men from voting by finding out what they’re addicted to and making it illegal—it’s a miracle our government hasn’t outlawed fat white women yet—but I leave with one request: Would someone please just make a bumper sticker that says, “I’m a stoner, and I vote.
Bill Maher (The New New Rules: A Funny Look At How Everybody But Me Has Their Head Up Their Ass)
This [discovery of a cell-free yeast extract] will make him famous, even though he has no talent for chemistry. {Comment on German scientist Eduard Buchner who later ironically won a Nobel Prize in Chemistry for this discovery}
Adolf von Baeyer
Ending up with that gigantic outsized brain must have taken some sort of runaway evolutionary process, something that would push and push without limits. And today's scientists had a pretty good guess at what that runaway evolutionary process had been. Harry had once read a famous book called Chimpanzee Politics. The book had described how an adult chimpanzee named Luit had confronted the aging alpha, Yeroen, with the help of a young, recently matured chimpanzee named Nikkie. Nikkie had not intervened directly in the fights between Luit and Yeroen, but had prevented Yeroen's other supporters in the tribe from coming to his aid, distracting them whenever a confrontation developed between Luit and Yeroen. And in time Luit had won, and become the new alpha, with Nikkie as the second most powerful... ...though it hadn't taken very long after that for Nikkie to form an alliance with the defeated Yeroen, overthrow Luit, and become the new new alpha. It really made you appreciate what millions of years of hominids trying to outwit each other - an evolutionary arms race without limit - had led to in the way of increased mental capacity. 'Cause, y'know, a human would have totally seen that one coming.
Eliezer Yudkowsky (Harry Potter and the Methods of Rationality)
Echoing a famous argument by the philosopher Karl Popper, most scientists today insist that the dividing line between science and pseudoscience is whether advocates of a hypothesis deliberately search for evidence that could falsify it and accept the hypothesis only if it survives.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Gene patents are the point of greatest concern in the debate over ownership of human biological materials, and how that ownership might interfere with science. As of 2005—the most recent year figures were available—the U.S. government had issued patents relating to the use of about 20 percent of known human genes, including genes for Alzheimer’s, asthma, colon cancer, and, most famously, breast cancer. This means pharmaceutical companies, scientists, and universities control what research can be done on those genes, and how much resulting therapies and diagnostic tests will cost. And some enforce their patents aggressively: Myriad Genetics, which holds the patents on the BRCA1 and BRCA2 genes responsible for most cases of hereditary breast and ovarian cancer, charges $3,000 to test for the genes. Myriad has been accused of creating a monopoly, since no one else can offer the test, and researchers can’t develop cheaper tests or new therapies without getting permission from Myriad and paying steep licensing fees. Scientists who’ve gone ahead with research involving the breast-cancer genes without Myriad’s permission have found themselves on the receiving end of cease-and-desist letters and threats of litigation.
Rebecca Skloot
Who's more interesting: A famous scientist, or the famous scientist who plays the cello and whittles marionettes in a lighthouse at the edge of the world where he sometimes writes poetry by the light of passing ships? Exactly. Follow your weird impulses and do all sorts of things. Getting sidetracked can lead you to exactly where you belong.
Jessica Hagy (How to Be Interesting: In 10 Simple Steps)
when another German scientist, Werner Heisenberg, formulated his famous uncertainty principle. In order to predict the future position and velocity of a particle, one has to be able to measure its present position and velocity accurately. The obvious way to do this is to shine light on the particle. Some of the waves of light will be scattered by the particle and this will indicate its position. However, one will not be able to determine the position of the particle more accurately than the distance between the wave crests of light, so one needs to use light of a short wavelength in order to measure the position of the particle precisely. Now, by Planck’s quantum hypothesis, one cannot use an arbitrarily small amount of light; one has to use at least one quantum. This quantum will disturb the particle and change its velocity in a way that cannot be predicted. Moreover, the more accurately one measures the position, the shorter the wavelength of the light that one needs and hence the higher the energy of a single quantum. So the velocity of the particle will be disturbed by a larger amount. In other words, the more accurately you try to measure the position of the particle, the less accurately you can measure its speed, and vice versa.
Stephen Hawking (A Brief History of Time)
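An editorial aside on the passage above: the trade-off Hawking describes has a compact standard form. The relations below are a sketch of the textbook argument, set down here for reference; the notation is ours, not Hawking's.

```latex
% Sketch of the standard relations behind the passage above (not Hawking's notation).
% A photon of wavelength \lambda carries momentum p = h/\lambda, so resolving a
% position to within \Delta x \sim \lambda imparts a momentum kick \Delta p \sim h/\lambda:
\Delta x \, \Delta p \;\gtrsim\; h,
% which Heisenberg's principle makes precise as
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar \equiv \frac{h}{2\pi}.
```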
The trouble with Oppenheimer, the famous but uninvolved scientist Einstein remarked, was that he loved a woman who did not love him back: the U.S. government.
TaraShea Nesbit (The Wives of Los Alamos)
Once upon a time an academic scientist went to visit a Zen Master, famous for being very wise. After greeting the scholar, the master offered him tea. As they sat together, the monk began to pour the tea into the scholar's cup. He poured until the tea overflowed onto the saucer, then the table and finally onto the floor. When the scholar could not stand it any more, he blurted out: "Stop, stop, can't you see the cup is full?" To which the Zen Master replied: "Yes, I can, and until your mind is empty, you will not hear what I have to say.
Jeffrey Armstrong (God the Astrologer: Soul, Karma, and Reincarnation--How We Continually Create Our Own Destiny)
The successful ideas survive scrutiny. The bad ideas get discarded. Conformity is also laughable to scientists attempting to advance their careers. The best way to get famous in your own lifetime is to pose an idea that counters prevailing research and that earns a consistency of observations and experiment. Healthy disagreement is a natural state on the bleeding edge of discovery.
Neil deGrasse Tyson (Starry Messenger: Cosmic Perspectives on Civilization)
Some of the more sought-after signers are, in no particular order, presidents, military heroes, sports icons, actors, singers, artists, religious and social leaders, scientists, astronauts, authors, and Kardashians.
Carrie Fisher (The Princess Diarist)
Turing’s vision was shared by his fellow computer scientists in America, who codified their curiosity in 1956 with a now famous Dartmouth College research proposal in which the term “artificial intelligence” was coined.
Fei-Fei Li (The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI)
Abraham Maslow once famously said, “When all you’ve got is a hammer, every problem looks like a nail.” What he meant was, when it comes to problem-solving, we tend to get locked into using familiar tools in expected ways.
Steven Kotler (Stealing Fire: How Silicon Valley, the Navy SEALs, and Maverick Scientists Are Revolutionizing the Way We Live and Work)
I found considerably more studies about women’s scent preferences than men’s. I don’t know if that’s because male scientists are particularly curious about What Women Want. Among studies on men, there’s the now-famous bit about men tipping strippers more if they’re ovulating—they do, the effects are reproducible, and they go away if the woman is on birth control—but that may or may not be scent related. (It’s hard to say what you’re smelling, exactly, in a strip club.) Men also prefer the smelly T-shirts of ovulating women, don’t like the pit smells of menstruating women and women who are less immuno-compatible as much, and almost universally dislike the smell of a woman’s tears, regardless of her reproductive status.
Cat Bohannon (Eve: How the Female Body Drove 200 Million Years of Human Evolution)
Singer cited the famous essay “The Tragedy of the Commons,” in which biologist Garrett Hardin argued that individuals acting in their rational self-interest may undermine the common good, and warned against assuming that technology would save us from ourselves. “If we ignore the present warning signs and wait for an ecological disaster to strike, it will probably be too late,” Singer noted. He imagined what it must have been like to be Noah, surrounded by “complacent compatriots,” saying, “‘Don’t worry about the rising waters, Noah; our advanced technology will surely discover a substitute for breathing.’ If it was wisdom that enabled Noah to believe in the ‘never-yet-happened,’ we could use some of that wisdom now,” Singer concluded.
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
For years the physicist Donna Strickland was not deemed notable enough for an entry. She finally got her place in Wikipedia on the day she won the Nobel Prize. Surely that cannot be what it takes to be remembered? No man is held to such a standard.
Sandi Toksvig (Between the Stops: The View of My Life from the Top of the Number 12 Bus)
Einstein never accepted that the universe was governed by chance; his feelings were summed up in his famous statement, ‘God does not play dice.’ Most other scientists, however, were willing to accept quantum mechanics because it agreed perfectly with experiment.
Stephen Hawking (A Brief History of Time)
The subject of human races is so explosive that Darwin excised all discussion of it from his famous 1859 book On the Origin of Species. Even today, few scientists dare to study racial origins, lest they be branded racists simply for being interested in the problem.
Jared Diamond
Dr. Julian Huxley, famous English biologist and director of UNESCO, recently stated that Western scientists should “learn the Oriental techniques” for entering the trance state and for control of breathing. “What happens? How is it possible?” he said. An Associated Press dispatch from London, dated Aug. 21, 1948, reported: “Dr. Huxley told the new World Federation for Mental Health it might well look into the mystic lore of the East. If this lore could be investigated scientifically, he advised mental specialists, ‘then I think an immense step forward could be made in your field.
Paramahansa Yogananda (Autobiography of a Yogi (Self-Realization Fellowship))
The famous computer scientist Melvin Conway coined an adage that is often referred to as Conway's Law. It states that any organization that designs a system will produce a design whose structure mirrors the organization's structure. Another way to say this is to beware of shipping your org chart.
Marty Cagan (Empowered: Ordinary People, Extraordinary Products)
teachers do not hold bombs or knives, they are still dangerous enemies. They fill us with insidious revisionist ideas. They teach us that scholars are superior to workers. They promote personal ambition by encouraging competition for the highest grades. All these things are intended to change good young socialists into corrupt revisionists. They are invisible knives that are even more dangerous than real knives or guns. For example, a student from Yu-cai High School killed himself because he failed the university entrance examination. Brainwashed by his teachers, he believed his sole aim in life was to enter a famous university and become a scientist—
Ji-li Jiang (Red Scarf Girl)
The interpretation of a result is an example. To take a trivial instance, there is a famous joke about a man who complains to a friend of a mysterious phenomenon. The white horses on his farm eat more than the black horses. He worries about this and cannot understand it, until his friend suggests that maybe he has more white horses than black ones.
Richard P. Feynman (The Meaning of It All: Thoughts of a Citizen-Scientist)
As the Reverend Sally Bingham, an Episcopalian preacher and renewables advocate, put it to me: “We believe that Mary was a virgin, that Jesus rose from the dead, that we might go to heaven. So why is it that two thousand years later, we still believe this story? And how can we believe that and not believe what the world’s most famous climate scientists tell us?
George Marshall (Don't Even Think About It: Why Our Brains Are Wired to Ignore Climate Change)
He was a big, rather clumsy man, with a substantial bay window that started in the middle of the chest. I should guess that he was less muscular than at first sight he looked. He had large staring blue eyes and a damp and pendulous lower lip. He didn't look in the least like an intellectual. Creative people of his abundant kind never do, of course, but all the talk of Rutherford looking like a farmer was unperceptive nonsense. His was really the kind of face and physique that often goes with great weight of character and gifts. It could easily have been the soma of a great writer. As he talked to his companions in the streets, his voice was three times as loud as any of theirs, and his accent was bizarre…. It was part of his nature that, stupendous as his work was, he should consider it 10 per cent more so. It was also part of his nature that, quite without acting, he should behave constantly as though he were 10 per cent larger than life. Worldly success? He loved every minute of it: flattery, titles, the company of the high official world... But there was that mysterious diffidence behind it all. He hated the faintest suspicion of being patronized, even when he was a world figure. Archbishop Lang was once tactless enough to suggest that he supposed a famous scientist had no time for reading. Rutherford immediately felt that he was being regarded as an ignorant roughneck. He produced a formidable list of his last month’s reading. Then, half innocently, half malevolently: "And what do you manage to read, your Grice?" "I am afraid", said the Archbishop, somewhat out of his depth, "that a man in my position doesn't really have the leisure..." "Ah yes, your Grice," said Rutherford in triumph, "it must be a dog's life! It must be a dog's life!"
C.P. Snow
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Christopher’s anti-God campaign was based on a fundamental error reflected in the subtitle of his book: How Religion Poisons Everything. On the contrary, since religion, as practiced, is a human activity, the reverse is true. Human beings poison religion, imposing their prejudices, superstitions, and corruptions onto its rituals and texts, not the other way around. When I first became acquainted with Christopher’s crusade, I immediately thought of the seventeenth-century scientist and mathematician, Blaise Pascal. In addition to major contributions to scientific knowledge, Pascal produced exquisite reflections on religious themes: When I consider the short duration of my life, swallowed up in the eternity before and after, the space which I fill, and even can see, engulfed in the infinite immensity of spaces of which I am ignorant and which know me not, I am frightened and astonished at being here rather than there; for there is no reason why here rather than there, why now rather than then. Who has put me here? These are the questions that only a religious faith can attempt to answer. There is no science of the why of our existence, no scientific counsel or solace for our human longings, loneliness, and fear. Without a God to make sense of our existence, Pascal wrote, human life is intolerable: This is what I see and what troubles me. I look on all sides, and I see only darkness everywhere. Nature presents to me nothing which is not a matter of doubt and concern. If I saw nothing there that revealed a Divinity, I would come to a negative conclusion; if I saw everywhere the signs of a Creator, I would remain peacefully in faith. But seeing too much to deny and too little to be sure, I am in a state to be pitied. . . . To resolve this dilemma, Pascal devised his famous “wager,” which, simply stated, is that since we cannot know whether there is a God or not, it is better to wager that there is one, rather than that there is not.
David Horowitz (Dark Agenda: The War to Destroy Christian America)
I think it is almost impossible that he [Prophet Muhammad (saas)] could have known about things like the common origin of the universe, because scientists have only found out within the last few years with very complicated and advanced technological methods that this is the case. Somebody who did not know something about nuclear physics 1400 years ago could not, I think, be in a position to find out from his own mind for instance that the earth and the heavens had the same origin, or many others of the questions that we have discussed here. (Alfred Kroner, Professor of the Department of Geosciences, University of Mainz, Germany. One of the world's most famous geologists)
Harun Yahya (Allah's Miracles in the Qur'an)
Albert Einstein, considered the most influential person of the 20th century, was four years old before he could speak and seven before he could read. His parents thought he was retarded. He spoke haltingly until age nine. He was advised by a teacher to drop out of grade school: “You’ll never amount to anything, Einstein.” Isaac Newton, the scientist who invented modern-day physics, did poorly in math. Patricia Polacco, a prolific children’s author and illustrator, didn’t learn to read until she was 14. Henry Ford, who developed the famous Model-T car and started Ford Motor Company, barely made it through high school. Lucille Ball, famous comedian and star of I Love Lucy, was once dismissed from drama school for being too quiet and shy. Pablo Picasso, one of the great artists of all time, was pulled out of school at age 10 because he was doing so poorly. A tutor hired by Pablo’s father gave up on Pablo. Ludwig van Beethoven was one of the world’s great composers. His music teacher once said of him, “As a composer, he is hopeless.” Wernher von Braun, the world-renowned mathematician, flunked ninth-grade algebra. Agatha Christie, the world’s best-known mystery writer and all-time bestselling author other than William Shakespeare of any genre, struggled to learn to read because of dyslexia. Winston Churchill, famous English prime minister, failed the sixth grade.
Sean Covey (The 6 Most Important Decisions You'll Ever Make: A Guide for Teens)
Despite the complexity and variety of the universe, it turns out that to make one you need just three ingredients. Let’s imagine that we could list them in some kind of cosmic cookbook. So what are the three ingredients we need to cook up a universe? The first is matter—stuff that has mass. Matter is all around us, in the ground beneath our feet and out in space. Dust, rock, ice, liquids. Vast clouds of gas, massive spirals of stars, each containing billions of suns, stretching away for incredible distances. The second thing you need is energy. Even if you’ve never thought about it, we all know what energy is. Something we encounter every day. Look up at the Sun and you can feel it on your face: energy produced by a star ninety-three million miles away. Energy permeates the universe, driving the processes that keep it a dynamic, endlessly changing place. So we have matter and we have energy. The third thing we need to build a universe is space. Lots of space. You can call the universe many things—awesome, beautiful, violent—but one thing you can’t call it is cramped. Wherever we look we see space, more space and even more space. Stretching in all directions. It’s enough to make your head spin. So where could all this matter, energy and space come from? We had no idea until the twentieth century. The answer came from the insights of one man, probably the most remarkable scientist who has ever lived. His name was Albert Einstein. Sadly I never got to meet him, since I was only thirteen when he died. Einstein realised something quite extraordinary: that two of the main ingredients needed to make a universe—mass and energy—are basically the same thing, two sides of the same coin if you like. His famous equation E = mc2 simply means that mass can be thought of as a kind of energy, and vice versa. So instead of three ingredients, we can now say that the universe has just two: energy and space. So where did all this energy and space come from? The answer was found after decades of work by scientists: space and energy were spontaneously invented in an event we now call the Big Bang.
Stephen Hawking (Brief Answers to the Big Questions)
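A quick editorial gloss on the equation quoted above: the conversion factor c² is enormous, which is why a tiny amount of mass corresponds to a vast amount of energy. The arithmetic below is a standard back-of-envelope illustration, not part of Hawking's text.

```latex
% E = mc^2 for one kilogram of mass, taking c \approx 3.0 \times 10^8 m/s:
E = m c^{2} = (1\,\mathrm{kg}) \times (3.0 \times 10^{8}\,\mathrm{m/s})^{2}
  = 9.0 \times 10^{16}\,\mathrm{J}
% -- roughly the energy released by some twenty megatons of TNT.
```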
Newton had conceived of light as primarily a stream of emitted particles. But by Einstein’s day, most scientists accepted the rival theory, propounded by Newton’s contemporary Christiaan Huygens, that light should be considered a wave. A wide variety of experiments had confirmed the wave theory by the late nineteenth century. For example, Thomas Young did a famous experiment, now replicated by high school students, showing how light passing through two slits produces an interference pattern that resembles that of water waves going through two slits. In each case, the crests and troughs of the waves emanating from each slit reinforce each other in some places and cancel each other out in some places.
Walter Isaacson (Einstein: His Life and Universe)
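For readers who want the arithmetic behind "reinforce" and "cancel" in the passage above: the fringe positions follow from the path difference between the two slits. This is the idealized textbook result (ignoring the single-slit envelope), not a formula drawn from Isaacson's text.

```latex
% Idealized two-slit interference: slit separation d, wavelength \lambda,
% viewing angle \theta. Crests reinforce where the path difference d\sin\theta
% is a whole number of wavelengths:
d \sin\theta = m \lambda, \qquad m = 0, \pm 1, \pm 2, \dots
% and the intensity pattern on the screen varies as
I(\theta) \propto \cos^{2}\!\left( \frac{\pi d \sin\theta}{\lambda} \right).
```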
Eventually he stood and pulled a slim volume off his bookshelf. About halfway through the thin leather journal he found the most often cited quote of the Third Age Imperial omnimancer Salam Abdus. Note, dear reader, that destiny is like a cat that you wish to call to you. Give it your attention, try to coax it into place, and it shall have naught to do with you. Play coy as a maiden, and it shall surely come running. Yet turn your back on the bastard at your deepest peril. Jynn took a deep breath. Regrettably little remained of Abdus’ teachings; he was most famous for this observation being quoted in Nove’s Lex Infortunii, wherein the great philosopher-scientist noted that shortly after writing the quote, Abdus was eaten by a Dire Ocelot.
J. Zachary Pike (Dragonfired (The Dark Profit Saga #3))
Speaking before a joint session of Congress, President Johnson said: “This generation has altered the composition of the atmosphere on a global scale through . . . a steady increase in carbon dioxide from the burning of fossil fuels.” It’s remarkable to note that, more than fifty years ago, an American president was already aware of, and acknowledging, human-created climate change. Johnson had been briefed on the dangers of CO2 increases by the famous climate scientists Charles Keeling and Roger Revelle, among others. So, not only was Johnson aware of the issue, but he was already concerned enough to raise it before Congress. That single sentence in his address gives the lie to the claims of so many climate-change deniers that global warming is some kind of recent hoax.
Adam Frank (Light of the Stars: Alien Worlds and the Fate of the Earth)
Sixty years ago, Austin Ranney, an eminent political scientist, wrote a prophetic dissent to a famous report by an American Political Science Association committee entitled “Toward a More Responsible Two-Party System.” The report, by prominent political scientists frustrated with the role of conservative Southern Democrats in blocking civil rights and other social policy, issued a clarion call for more ideologically coherent, internally unified, and adversarial parties in the fashion of a Westminster-style parliamentary democracy like Britain or Canada. Ranney powerfully argued that such parties would be a disaster within the American constitutional system, given our separation of powers, separately elected institutions, and constraints on majority rule that favor cross-party coalitions and compromise. Time has proven Ranney dead right—we now have the kinds of parties the report desired, and it is disastrous.
Thomas E. Mann (It's Even Worse Than It Looks: How the American Constitutional System Collided with the New Politics of Extremism)
Bertrand Russell famously said: “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” [but] Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset. We care about whether our creation story, our founding legends, our theories of invisible nutrients and germs and forces, our conceptions of the powerful, our suspicions about our enemies, are true or false. That’s because we have the tools to get answers to these questions, or at least to assign them warranted degrees of credence. And we have a technocratic state that should, in theory, put these beliefs into practice. But as desirable as that creed is, it is not the natural human way of believing. In granting an imperialistic mandate to the reality mindset to conquer the universe of belief and push mythology to the margins, we are the weird ones—or, as evolutionary social scientists like to say, the WEIRD ones: Western, Educated, Industrialized, Rich, Democratic. At least, the highly educated among us are, in our best moments. The human mind is adapted to understanding remote spheres of existence through a mythology mindset. It’s not because we descended from Pleistocene hunter-gatherers specifically, but because we descended from people who could not or did not sign on to the Enlightenment ideal of universal realism. Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Even if there is only one possible unified theory, it is just a set of rules and equations. What is it that breathes fire into the equations and makes a universe for them to describe? The usual approach of science of constructing a mathematical model cannot answer the questions of why there should be a universe for the model to describe. Why does the universe go to all the bother of existing? Is the unified theory so compelling that it brings about its own existence? Or does it need a creator, and, if so, does he have any other effect on the universe? And who created him? Up to now, most scientists have been too occupied with the development of new theories that describe what the universe is to ask the question why. On the other hand, the people whose business it is to ask why, the philosophers, have not been able to keep up with the advance of scientific theories. In the eighteenth century, philosophers considered the whole of human knowledge, including science, to be their field and discussed questions such as: did the universe have a beginning? However, in the nineteenth and twentieth centuries, science became too technical and mathematical for the philosophers, or anyone else except a few specialists. Philosophers reduced the scope of their inquiries so much that Wittgenstein, the most famous philosopher of this century, said, “The sole remaining task for philosophy is the analysis of language.” What a comedown from the great tradition of philosophy from Aristotle to Kant! However, if we do discover a complete theory, it should in time be understandable in broad principle by everyone, not just a few scientists. Then we shall all, philosophers, scientists, and just ordinary people, be able to take part in the discussion of the question of why it is that we and the universe exist. If we find the answer to that, it would be the ultimate triumph of human reason – for then we would know the mind of God.
Stephen Hawking (A Brief History of Time)
The person who discovered the answer was a retiring, self-funded scientist named Peter Mitchell who in the early 1960s inherited a fortune from the Wimpey house-building company and used it to set up a research center in a stately home in Cornwall. Mitchell was something of an eccentric. He wore shoulder-length hair and an earring at a time when that was especially unusual among serious scientists. He was also famously forgetful. At his daughter’s wedding, he approached another guest and confessed that she looked familiar, though he couldn’t quite place her. “I was your first wife,” she answered. Mitchell’s ideas were universally dismissed, not altogether surprisingly. As one chronicler has noted, “At the time that Mitchell proposed his hypothesis there was not a shred of evidence in support of it.” But he was eventually vindicated and in 1978 was awarded the Nobel Prize in Chemistry—an extraordinary accomplishment for someone who worked from a home lab.
Bill Bryson (The Body: A Guide for Occupants)
When I was a kid, my mother thought spinach was the healthiest food in the world because it contained so much iron. Getting enough iron was a big deal then because we didn't have 'iron-fortified' bread. Turns out that spinach is an okay source of iron, but no better than pizza, pistachio nuts, cooked lentils, or dried peaches. The spinach-iron myth grew out of a simple mathematical miscalculation: A researcher accidentally moved a decimal point one space, so he thought spinach had 10 times more iron than it did. The press reported it, and I had to eat spinach. Moving the decimal point was an honest mistake--but it's seldom that simple. If it happened today I'd suspect a spinach lobby was behind it. Businesses often twist science to make money. Lawyers do it to win cases. Political activists distort science to fit their agenda, bureaucrats to protect their turf. Reporters keep falling for it. Scientists sometimes go along with it because they like being famous.
John Stossel (Give Me a Break: How I Exposed Hucksters, Cheats, and Scam Artists and Became the Scourge of the Liberal Media...)
THE FOUNDING PROPHET of modern antihumanism was Thomas Malthus (1766–1834). For three decades a professor at the British East India Company’s East India College, Malthus was a political economist who famously argued that human reproduction always outruns available resources. This doctrine served to rationalize the starvation of millions caused by his employer’s policy of brutal oppression of the peasants of the Indian subcontinent. The British Empire’s colonial helots, however, were not Malthus’s only targets. Rather, his Essay on the Principle of Population (first published in 1798 and later expanded in numerous further editions) was initially penned as a direct attack on such Enlightenment revolutionaries as William Godwin and the Marquis de Condorcet, who advanced the notion that human liberty, expanding knowledge, and technological progress could ultimately make possible a decent life for all mankind. Malthus prescribed specific policies to keep population down by raising the death rate:
Robert Zubrin (Merchants of Despair: Radical Environmentalists, Criminal Pseudo-Scientists, and the Fatal Cult of Antihumanism)
Knowledge about the nutritious properties and growth cycles of what would later become staple crops, feeding vast populations – wheat, rice, corn – was initially maintained through ritual play farming of exactly this sort. Nor was this pattern of discovery limited to crops. Ceramics were first invented, long before the Neolithic, to make figurines, miniature models of animals and other subjects, and only later cooking and storage vessels. Mining is first attested as a way of obtaining minerals to be used as pigments, with the extraction of metals for industrial use coming only much later. Mesoamerican societies never employed wheeled transport; but we know they were familiar with spokes, wheels and axles since they made toy versions of them for children. Greek scientists famously came up with the principle of the steam engine, but only employed it to make temple doors that appeared to open of their own accord, or similar theatrical illusions. Chinese scientists, equally famously, first employed gunpowder for fireworks.
David Graeber (The Dawn of Everything: A New History of Humanity)
Never give up on yourself. Everyone may give up on you but never give up on yourself, because if you do, it will also become the end. Believe that anything can be achieved with effort. Most important of all, we must understand that dyslexia is not just a hindrance to learning; it may also be considered a gift. Multiple studies have proven that dyslexic people are highly creative and intuitive. Not to mention the long list of dyslexic people who have succeeded in their chosen fields: known scientist and inventor of the telephone, Alexander Graham Bell; the inventor of the telescope, Galileo Galilei; painter and polymath, Leonardo da Vinci; mathematician and writer, Lewis Carroll; American journalist, Anderson Cooper; famous actor, Tom Cruise; director of our all-time favorites Indiana Jones and Jurassic Park, Steven Spielberg; musician, Paul Frappier; entrepreneur and Apple founder, Steve Jobs; and maybe the person who is reading this book right now. We must always remember, everything can be learned and anyone can learn how to read!
Craig Donovan (Dyslexia: For Beginners - Dyslexia Cure and Solutions - Dyslexia Advantage (Dyslexic Advantage - Dyslexia Treatment - Dyslexia Therapy Book 1))
Look at the telephone; it would remind you of a unique scientist, Alexander Graham Bell. He, besides being a great inventor, was also a man of great compassion and service. In fact, much of the research which led to the development of the telephone was directed at finding solutions to the challenges of hearing impaired people and helping them to be able to listen and communicate. Bell’s mother and wife were both hearing impaired and it profoundly changed Bell’s outlook to science. He aimed to make devices which would help the hearing impaired. He started a special school in Boston to teach hearing impaired people in novel ways. It was these lessons which inspired him to work with sound and led to the invention of the telephone. Can you guess the name of the most famous student of Alexander Graham Bell? It was Helen Keller, the great author, activist and poet who was hearing and visually impaired. About her teacher, she once said that Bell dedicated his life to the penetration of that ‘inhuman silence which separates and estranges’.
A.P.J. Abdul Kalam (Learning How to Fly: Life Lessons for the Youth)
As arrogant as I may be in general, I am not sufficiently doltish or vainglorious to imagine that I can meaningfully address the deep philosophical questions embedded within this general inquiry of our intellectual ages—that is, fruitful modes of analysis for the history of human thought. I shall therefore take refuge in an escape route that has traditionally been granted to scientists: the liberty to act as a practical philistine. Instead of suggesting a principled and general solution, I shall ask whether I can specify an operational way to define “Darwinism” (and other intellectual entities) in a manner specific enough to win shared agreement and understanding among readers, but broad enough to avoid the doctrinal quarrels about membership and allegiance that always seem to arise when we define intellectual commitments as pledges of fealty to lists of dogmata (not to mention initiation rites, secret handshakes and membership cards—in short, the intellectual paraphernalia that led Karl Marx to make his famous comment to a French journalist: “je ne suis pas marxiste” [“I am not a Marxist”]).
Stephen Jay Gould (The Structure of Evolutionary Theory)
Benjamin Libet, a scientist in the physiology department of the University of California, San Francisco, was a pioneering researcher into the nature of human consciousness. In one famous experiment he asked a study group to move their hands at a moment of their choosing while their brain activity was being monitored. Libet was seeking to identify what came first — the brain’s electrical activity to make the hand move or the person’s conscious intention to make their hand move. It had to be the second one, surely? But no. Brain activity to move the hand was triggered a full half a second before any conscious intention to move it…. John-Dylan Haynes, a neuroscientist at the Max Planck Institute for Human Cognitive and Brain Studies in Leipzig, Germany, led a later study that was able to predict an action ten seconds before people had a conscious intention to do it. What was all the stuff about free will? Frank Tong, a neuroscientist at Vanderbilt University in Nashville, Tennessee, said: “Ten seconds is a lifetime in terms of brain activity.” So where is it coming from if not ‘us,’ the conscious mind?
David Icke
The lives of scientists, considered as Lives, almost always make dull reading. For one thing, the careers of the famous and the merely ordinary fall into much the same pattern, give or take an honorary degree or two, or (in European countries) an honorific order. It could be hardly otherwise. Academics can only seldom lead lives that are spacious or exciting in a worldly sense. They need laboratories or libraries and the company of other academics. Their work is in no way made deeper or more cogent by privation, distress or worldly buffetings. Their private lives may be unhappy, strangely mixed up or comic, but not in ways that tell us anything special about the nature or direction of their work. Academics lie outside the devastation area of the literary convention according to which the lives of artists and men of letters are intrinsically interesting, a source of cultural insight in themselves. If a scientist were to cut his ear off, no one would take it as evidence of a heightened sensibility; if a historian were to fail (as Ruskin did) to consummate his marriage, we should not suppose that our understanding of historical scholarship had somehow been enriched.
Peter Medawar
For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past. The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information. A darker view of the information-dominated universe was described in the famous story “The Library of Babel,” written by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe. Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
Freeman Dyson (Dreams of Earth and Sky)
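Dyson's claim that gravitation reverses the usual relation between energy and temperature follows from the virial theorem for bound self-gravitating systems. Here is a minimal sketch, assuming a virialized system and using our own notation (K for kinetic energy, U for gravitational potential energy, E for total energy, T for temperature), not Dyson's:

% Virial theorem for a bound, self-gravitating system: 2K + U = 0.
\begin{align*}
  E &= K + U = -K \\
  T &\propto K \qquad \text{(temperature tracks mean kinetic energy)} \\
  \frac{dE}{dT} &\propto -\frac{dK}{dT} < 0
\end{align*}
% The heat capacity dE/dT is negative: a star that radiates energy
% away (dE < 0) must gain kinetic energy (dK > 0), so it gets hotter,
% which is exactly the reversal Dyson describes.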
In about 1980, he says, at a time when he was still struggling to articulate his own vision of a dynamic, evolving economy, he happened to read a book by the geneticist Richard Lewontin. And he was struck by a passage in which Lewontin said that scientists come in two types. Scientists of the first type see the world as being basically in equilibrium. And if untidy forces sometimes push a system slightly out of equilibrium, then they feel the whole trick is to push it back again. Lewontin called these scientists "Platonists," after the renowned Athenian philosopher who declared that the messy, imperfect objects we see around us are merely the reflections of perfect "archetypes." Scientists of the second type, however, see the world as a process of flow and change, with the same material constantly going around and around in endless combinations. Lewontin called these scientists "Heraclitans," after the Ionian philosopher who passionately and poetically argued that the world is in a constant state of flux. Heraclitus, who lived nearly a century before Plato, is famous for observing that "Upon those who step into the same rivers flow other and yet other waters," a statement that Plato himself paraphrased as "You can never step into the same river twice." "When I read what Lewontin said," says Arthur, "it was a moment of revelation. That's when it finally became clear to me what was going on. I thought to myself, 'Yes! We're finally beginning to recover from Newton.'"
M. Mitchell Waldrop (Complexity: The Emerging Science at the Edge of Order and Chaos)
What would have happened, I wondered, if Clover and Jotter never ran the river—if they had listened to the critics and doomsayers, or to their own doubts? They brought knowledge, energy, and passion to their botanical work, but also a new perspective. Before them, men had gone down the Colorado to sketch dams, plot railroads, dig gold, and daydream little Swiss chalets stuck up on the cliffs. They saw the river for what it could be, harnessed for human use. Clover and Jotter saw it as it was, a living system made up of flower, leaf, and thorn, lovely in its fierceness, worthy of study for its own sake. They knew every saltbush twig and stickery cactus was, in its own way, as much a marvel as Boulder Dam—shaped to survive against all the odds. In the United States, half of all bachelor’s degrees in science, engineering, and mathematics go to women, yet these women go on to earn only 74 percent of a man’s salary in those fields. A recent study found that it will be another two decades before women and men publish papers at equal rates in the field of botany, a field traditionally welcoming to women. It may take four decades for chemistry, and three centuries for physics. Stereotypes linger of scientists as white-coated, wild-haired men, and they limit the ways in which young people envision their futures. In a famous, oft-replicated study, 70 percent of six-year-old girls, asked to draw a picture of a scientist, draw a woman, but only 25 percent do so at the age of sixteen.
Melissa L. Sevigny (Brave the Wild River: The Untold Story of Two Women Who Mapped the Botany of the Grand Canyon)
The dinosaurs, built of concrete, were a kind of bonus attraction. On New Year’s Eve 1853 a famous dinner for twenty-one prominent scientists was held inside the unfinished iguanodon. Gideon Mantell, the man who had found and identified the iguanodon, was not among them. The person at the head of the table was the greatest star of the young science of palaeontology. His name was Richard Owen and by this time he had already devoted several productive years to making Gideon Mantell’s life hell. [Figure caption: A double-tailed lizard, part of the vast collection of natural wonders and anatomical specimens collected by the Scottish-born surgeon John Hunter in the eighteenth century. After Hunter’s death in 1793, the collection passed to the Royal College of Surgeons.] Owen had grown up in Lancaster, in the north of England, where he had trained as a doctor. He was a born anatomist and so devoted to his studies that he sometimes illicitly borrowed limbs, organs and other parts from corpses and took them home for leisurely dissection. Once, while carrying a sack containing the head of a black African sailor that he had just removed, Owen slipped on a wet cobble and watched in horror as the head bounced away from him down the lane and through the open doorway of a cottage, where it came to rest in the front parlour. What the occupants had to say upon finding an unattached head rolling to a halt at their feet can only be imagined. One assumes that they had not formed any terribly advanced conclusions when, an instant later, a fraught-looking young man rushed in, wordlessly retrieved the head and rushed out again.
Bill Bryson (A Short History of Nearly Everything)
Every time we sit down to breakfast, we are likely to be benefiting from a dozen such prehistoric inventions. Who was the first person to figure out that you could make bread rise by the addition of those microorganisms we call yeasts? We have no idea, but we can be almost certain she was a woman and would most likely not be considered ‘white’ if she tried to immigrate to a European country today; and we definitely know her achievement continues to enrich the lives of billions of people. What we also know is that such discoveries were, again, based on centuries of accumulated knowledge and experimentation – recall how the basic principles of agriculture were known long before anyone applied them systematically – and that the results of such experiments were often preserved and transmitted through ritual, games and forms of play (or even more, perhaps, at the point where ritual, games and play shade into each other). ‘Gardens of Adonis’ are a fitting symbol here. Knowledge about the nutritious properties and growth cycles of what would later become staple crops, feeding vast populations – wheat, rice, corn – was initially maintained through ritual play farming of exactly this sort. Nor was this pattern of discovery limited to crops. Ceramics were first invented, long before the Neolithic, to make figurines, miniature models of animals and other subjects, and only later cooking and storage vessels. Mining is first attested as a way of obtaining minerals to be used as pigments, with the extraction of metals for industrial use coming only much later. Mesoamerican societies never employed wheeled transport; but we know they were familiar with spokes, wheels and axles since they made toy versions of them for children. Greek scientists famously came up with the principle of the steam engine, but only employed it to make temple doors that appeared to open of their own accord, or similar theatrical illusions. Chinese scientists, equally famously, first employed gunpowder for fireworks.
David Graeber (The Dawn of Everything: A New History of Humanity)
No nation influenced American thinking more profoundly than Germany. W.E.B. DuBois, Charles Beard, Walter Weyl, Richard Ely, Nicholas Murray Butler, and countless other founders of modern American liberalism were among the nine thousand Americans who studied in German universities during the nineteenth century. When the American Economic Association was formed, five of the six first officers had studied in Germany. At least twenty of its first twenty-six presidents had as well. In 1906 a professor at Yale polled the top 116 economists and social scientists in America; more than half had studied in Germany for at least a year. By their own testimony, these intellectuals felt "liberated" by the experience of studying in an intellectual environment predicated on the assumption that experts could mold society like clay. No European statesman loomed larger in the minds and hearts of American progressives than Otto von Bismarck. As inconvenient as it may be for those who have been taught "the continuity between Bismarck and Hitler", writes Eric Goldman, Bismarck's Germany was "a catalytic of American progressive thought". Bismarck's "top-down socialism", which delivered the eight-hour workday, healthcare, social insurance, and the like, was the gold standard for enlightened social policy. "Give the working-man the right to work as long as he is healthy; assure him care when he is sick; assure him maintenance when he is old", he famously told the Reichstag in 1862. Bismarck was the original "Third Way" figure who triangulated between both ends of the ideological spectrum. "A government must not waver once it has chosen its course. It must not look to the left or right but go forward", he proclaimed. Teddy Roosevelt's 1912 national Progressive Party platform conspicuously borrowed from the Prussian model. Twenty-five years earlier, the political scientist Woodrow Wilson wrote that Bismarck's welfare state was an "admirable system . . . the most studied and most nearly perfected" in the world.
Jonah Goldberg (Liberal Fascism: The Secret History of the American Left from Mussolini to the Politics of Meaning)
This, in turn, has given us a “unified theory of aging” that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of “junk” molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die. But there is a small but important loophole in the second law, which states that total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it’s possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde’s famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.) The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back. As Nobel laureate Richard Feynman once said, “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human’s body will be cured.”
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
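The refrigerator example can be made quantitative with standard second-law bookkeeping. A minimal sketch, with Q, W, T_c, and T_h as our own illustrative labels (heat Q pumped out of the cold interior at temperature T_c using work W, with Q + W rejected into the warmer room at T_h):

% Second-law bookkeeping for the refrigerator example.
\begin{align*}
  \Delta S_{\mathrm{inside}}  &= -\frac{Q}{T_c} < 0 \\
  \Delta S_{\mathrm{outside}} &= +\frac{Q + W}{T_h} \\
  \Delta S_{\mathrm{total}}   &= \frac{Q + W}{T_h} - \frac{Q}{T_c} \ge 0
  \quad\Longrightarrow\quad
  W \ge Q\left(\frac{T_h}{T_c} - 1\right)
\end{align*}
% Entropy can fall locally (inside the refrigerator) only if at least
% as much entropy is produced elsewhere (the hot coils in the back),
% which is the loophole the passage describes.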
Every human being with normal mental and emotional faculties longs for more. People typically associate their longing for more with a desire to somehow improve their lot in life—to get a better job, a nicer house, a more loving spouse, become famous, and so on. If only this, that, or some other thing were different, we say to ourselves, then we’d feel complete and happy. Some chase this “if only” all their lives. For others, the “if only” turns into resentment when they lose hope of ever acquiring completeness. But even if we get lucky and acquire our “if only,” it never quite satisfies. Acquiring the better job, the bigger house, the new spouse, or world fame we longed for may provide a temporary sense of happiness and completeness, but it never lasts. Sooner or later, the hunger returns. The best word in any language that captures this vague, unquenchable yearning, according to C. S. Lewis and other writers, is the German word Sehnsucht (pronounced “zane-zookt”).[9] It’s an unusual word that is hard to translate, for it expresses a deep longing or craving for something that you can’t quite identify and that always feels just out of reach. Some have described Sehnsucht as a vague and bittersweet nostalgia and/or longing for a distant country, but one that cannot be found on earth. Others have described it as a quasi-mystical sense that we (and our present world) are incomplete, combined with an unattainable yearning for whatever it is that would complete it. Scientists have offered several different explanations for this puzzling phenomenon—puzzling, because it’s hard to understand how natural processes alone could have evolved beings that hunger for something nature itself doesn’t provide.[10] But this longing is not puzzling from a biblical perspective, for Scripture teaches us that humans and the entire creation are fallen and estranged from God. Lewis saw Sehnsucht as reflective of our “pilgrim status.” It indicates that we are not where we were meant to be, where we are destined to be; we are not home. Lewis once wrote to a friend that “our best havings are wantings,” for our “wantings” are reminders that humans are meant for a different and better state.[11] In another place he wrote: Our lifelong nostalgia, our longing to be reunited with something in the universe from which we now feel cut off, to be on the inside of some door which we have always seen from the outside is . . . the truest index of our real situation.[12] With Lewis, Christians have always identified this Sehnsucht that resides in the human heart as a yearning for God. As St. Augustine famously prayed, “You have made us for yourself, and our hearts are restless till they find their rest in you.”[13] In this light, we might think of Sehnsucht as a sort of homing device placed in us by our Creator to lead us into a passionate relationship with him.
Gregory A. Boyd (Benefit of the Doubt: Breaking the Idol of Certainty)
The Extraordinary Persons Project
In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution
The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed:
1. Australopithecus
2. Homo habilis
3. Homo erectus
4. Homo sapiens
Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world-famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans. Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation." By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived at different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neanderthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution although he was himself one of the leading advocates of evolution in the twentieth century: What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth. Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
What are we to do at any given moment, when we cannot say which of our current claims will be sustained and which will be rejected? This is one of the central questions that I have raised. Because we cannot know which of current claims will be sustained, the best we can do is to consider the weight of scientific evidence, the fulcrum of scientific opinion, and the trajectory of scientific knowledge. This is why consensus matters: If scientists are still debating a matter, then we may well be wise to “wait and see,” if conditions permit. If the available empirical evidence is thin, we may want to do more research. But the uncertainty of future scientific knowledge should not be used as an excuse for delay. As the epidemiologist Sir Austin Bradford Hill famously argued, “All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time.” At any given moment, it makes sense to make decisions on the information we have, and be prepared to alter our plans if future evidence warrants.
Naomi Oreskes (Why Trust Science? (The University Center for Human Values Series))
One other named Kinsey pedophile was Rex King, an American serial child rapist also known as “Mr. Braun,” “Mr. Green,” and “Mr. X.” The “king” of child molesters is on record as raping at least eight hundred children, the youngest two months of age. Kinsey met King in about 1943 when King demonstrated his instant-orgasm ability for Kinsey and Pomeroy. Kinsey’s mentor, the famous sexologist, Robert Dickenson, MD, had “trained” King to keep child sex-abuse records.
Judith Reisman (Sexual Sabotage: How One Mad Scientist Unleashed a Plague of Corruption and Contagion on America)
Speaking at the Chaos Communication Congress, an annual computer hacker conference held in Berlin, Germany, Tobias Engel, founder of Sternraute, and Karsten Nohl, chief scientist for Security Research Labs, explained that they could not only locate cell-phone callers anywhere in the world, they could also listen in on their phone conversations. And if they couldn’t listen in real time, they could record the encrypted calls and texts for later decryption.
Kevin D. Mitnick (The Art of Invisibility: The World's Most Famous Hacker Teaches You How to Be Safe in the Age of Big Brother and Big Data)
Rachel Carson’s Silent Spring is based on Humboldt’s concept of interconnectedness, and scientist James Lovelock’s famous Gaia theory of the earth as a living organism bears remarkable similarities.
Andrea Wulf (The Invention of Nature: Alexander von Humboldt's New World)
"Belle is planning to host a series of salons," said Lio, appearing out of nowhere to fill her silence. It had been his first promise to her, in those wild days right after they broke the curse, when they talked feverishly about their most cherished dreams and whispered their deepest fears to each other. Back then, Belle's only fear had been her own ignorance. She had told him of her wish to travel to Paris and attend a salon herself, perhaps one that counted some of her favorite philosophes and encyclopédistes among its members. He had said her dream was too small and that she herself should host one. The Mademoiselle de Vignerot smiled politely. "What will the subject be?" "Oh, everything," said Belle. Her enthusiasm elicited laughter, but she was entirely serious. The comte de Chamfort cleared his throat, his lips curling into a sneer. "That is very broad, madame. Surely you have a more specific interest? My parents used to attend the famous Bout-du-Banc literary salon in Paris, but that was a very long time ago." Belle gave him her best patient smile. "I don't wish to be limited, monsieur. My salons will invite scientists, philosophers, inventors, novelists, really anyone in possession of a good idea." The comte guffawed. "Why on earth would you do such a thing?" "To learn from them, monsieur. I would have thought the reason obvious." Marguerite snorted into her glass. Belle sipped her drink as Lio placed his hand on the small of her back. She didn't know if it was meant to calm her down or encourage her. "Whatever for?" the comte asked with the menacing air of a man discovering he was the butt of a joke. "Everything that is worth learning is already taught." "To whom?" Belle felt the heat rising in her cheeks. "Strictly the wealthy sons of wealthier fathers?" Some of Bastien's guests gasped, they themselves being the children of France's aristocracy, but Belle was heartened when she saw Marguerite smile encouragingly. "I believe that education is a right, monsieur, and one that has long been reserved exclusively for the most privileged among us. My salons will reflect the true reality." "Which is what, madame?" Marguerite prompted eagerly. Belle's heart rattled in her chest. "That scholarship is the province of any who would pursue it."
Emma Theriault (Rebel Rose (The Queen's Council, #1))
The 1918 influenza pandemic famously had a W-shaped curve. The very young and the very old were at increased risk, but there was also elevated risk in the middle of the age distribution, spiking in patients around twenty-five years old. Scientists have been studying this for decades but are still unsure why it happened.
Nicholas A. Christakis (Apollo's Arrow: The Profound and Enduring Impact of Coronavirus on the Way We Live)
Sometimes brands make a stand more quietly. Deep inside one of the world’s most famous factories, located in the tiny town of Billund, Denmark, more than a hundred engineers and scientists are collaborating to redesign a product that has worked perfectly for more than eighty years. The LEGO Sustainable Materials Centre, a well-funded group within LEGO, is dedicated to finding more sustainable materials within the next decade to make the company’s iconic bricks. In 2018 the group launched its first innovation, making flexible pieces such as leaves and palm trees from a plant-based plastic sourced from sugar cane. This sense of commitment to the environment is deeply felt at LEGO. Its efforts may inspire more such initiatives across the toy industry, especially if consumers take note of LEGO’s efforts and demand similar forward-looking commitments from other companies as well.
Rohit Bhargava (Non Obvious Megatrends: How to See What Others Miss and Predict the Future (Non-Obvious Trends Series))
As a chief ingredient in the mythology of science, the accumulation of objective facts supposedly controls the history of conceptual change–as logical and self-effacing scientists bow before the dictates of nature and willingly change their views to accommodate the growth of conceptual knowledge. The paradigm for such an idealistic notion remains Huxley’s famous remark about “a beautiful theory killed by a nasty, ugly little fact.” But single facts almost never slay worldviews, at least not right away (and properly so, for the majority of deeply anomalous observations turn out to be wrong)... Anomalous facts get incorporated into existing theories, often with a bit of forced stretching to be sure, but usually with decent fit because most worldviews contain considerable flexibility. (How else could they last so long, or be so recalcitrant to overthrow?)
Stephen Jay Gould (Leonardo's Mountain of Clams and the Diet of Worms: Essays on Natural History)
Otto captured this sacred sixth sense, at once subject and object, in a famous Latin sound bite: the sacred is the mysterium tremendum et fascinans, that is, the mystical (mysterium) as both fucking scary (tremendum) and utterly fascinating (fascinans) (page 9). With the sacred viewed within this gripping, emotionally charged sense, it is hardly surprising that these topics are too disturbing to be studied either by religious scholarship or by science. The presence of real siddhis, real psychic effects lurking in the dark boundaries between mind and matter, are so frightening and disorienting that defense mechanisms immediately snap into place to protect our psyches from these disturbing thoughts. We become blind to personal psychic episodes and to the supportive scientific evidence, we conveniently forget mind-shattering synchronicities, and if the intensity of the mysterium tremendum becomes too hot, we angrily deny any interest in the topic while backing away and vigorously making the sign of the cross. Within science this sort of behavior is understandable; science doesn’t like what it can’t explain because it makes scientists feel stupid. But the same resistance is also endemic in comparative religion scholarship, which is supposed to be the discipline that studies the sacred. As Kripal says, scholars of religion “simply ignore … or brush their data aside as ‘primitive,’ ‘mistaken,’ and so on. Now the dismissing word in vogue is ‘anecdotal’ ” (pp. 17–18). One reason for this odd state of affairs is that real psi and real siddhis powerfully refute Descartes’s dualism, the very idea that led to the split between science, which deals with matter, and the humanities, which deal with mind. This distinction has carved up the world so successfully that when phenomena appear that harshly illuminate the artificial nature of the split, the resulting glare, says Kripal, “can only violate and offend our present order of knowledge and possibility” (page 24). From this analysis, Kripal arrives at his central argument: Psychic phenomena may be thought of as symbols that indicate “the irruption [a bursting in] of meaning in the physical world via the radical collapse of the subject-object structure itself. They are not simply physical events. They are also meaning events” (page 25). In other words, where objective and subjective meet, the fabric of reality itself blurs. This is a place that is not quite physical, and not quite mental, but a limbo that somehow contains and creates both.
Dean Radin (Supernormal: Science, Yoga and the Evidence for Extraordinary Psychic Abilities)
The climate change debate resembles the famous tale of a group of blind men touching various parts of an elephant, each arriving at a very different idea of what it is like:
Craig D. Idso (Why Scientists Disagree About Global Warming: The NIPCC Report on Scientific Consensus)
Don’t mind him, he’s a joker,” I said. “Yes, these are the Piglin Brutes from the Triad Bastion. Now, we need to take them to the station for everyone’s safety.” “Yes, of course. Good job, detective.” Aaron said. “Farewell, sailors. Thanks once again for your help,” I said. “Goodbye. We’ll be here if you need us.” Henry replied. We took the prisoners to the police station. One by one, we put them in their individual cells and locked them up. The Piglin Brutes, as expected, didn’t say a word and obeyed our orders with no resistance. “I wish all prisoners acted like that,” Officer Barry said. “Don’t wish that. We still need them to talk,” I corrected him. Officers Zimmer and Sal were waiting for us at the main hall. “There they are, the travelling officers and detectives,” Officer Zimmer said. “Welcome back. We took care of the station while you guys were away,” Sal said. “Hello officers, I believe we haven’t been introduced yet because you were on your day off when I arrived. I’m CalvinPignes, nice to meet you.” “No need to introduce yourself, Detective. You’re too famous for that! It’s an honor to meet you. Officer Zimmer at your disposal.” “Officer Sal, I’m a huge fan of your work,” Sal said.
Mark Mulle (Diary of a Piglin Book 4: The Secret Scientist)
Henri Fehr, the famous Swiss scientist, said that practically all his good ideas came to him when he was not actively engaged in work on a problem, and that most of the discoveries of his contemporaries were made when they were away from their workbench, so to speak.
Maxwell Maltz (Psycho-Cybernetics: Updated and Expanded (The Psycho-Cybernetics Series))
Never forget,’ he said, ‘how proud we are that the first Jewish prime minister in Europe was Italian. Luigi Luzzatti’s name should never be forgotten. And remember all the famous Italian Jews who contributed to the importance of this nation: those who introduced printing to Italy and all our famous Jewish authors, poets, scientists and inventors. Each and every day we should strive to be the best people we can be, and remain proud of our heritage.
Angela Petch (The Tuscan House)
My work in my late twenties involved a box much like this one. Only it was a one-inch cube designed to put a macroscopic object into superposition. Into what we physicists sometimes call, in what passes for humor among scientists, cat state. As in Schrödinger’s cat, the famous thought experiment.
Blake Crouch (Dark Matter)
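For readers meeting the term for the first time: a "cat state" is an equal superposition of two macroscopically distinct states. A minimal sketch in standard bra-ket notation (our illustration, not the novel's):

% Schrödinger's cat as an equal superposition of two
% macroscopically distinct states:
\[
  \lvert \psi \rangle = \frac{1}{\sqrt{2}}
    \bigl( \lvert \text{alive} \rangle + \lvert \text{dead} \rangle \bigr)
\]
% Until a measurement is made, the system is in neither definite state;
% measuring it yields each outcome with probability (1/sqrt(2))^2 = 1/2.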
“Carbon Dioxide: They Call It Pollution, We Call It Life.” Another, the Heartland Institute, which Exxon had helped found back in the 1990s, erected billboards comparing climate scientists to famous serial killers.
Bill McKibben (Falter: Has the Human Game Begun to Play Itself Out?)
Only years later would scientists again need to harness the power of multiple processors at once, when massively parallel processing would become an integral part of supercomputing. Years later, too, the genealogy of Shoch’s worm would come full circle. Soon after he published a paper about the worm citing The Shockwave Rider, he received a letter from John Brunner himself. It seemed that most science fiction writers harbored an unspoken ambition to write a book that actually predicted the future. Their model was Arthur C. Clarke, the prolific author of 2001: A Space Odyssey, who had become world-famous for forecasting the invention of the geosynchronous communications satellite in an earlier short story. “Apparently they’re all jealous of Arthur Clarke,” Shoch reflected. “Brunner wrote that his editor had sent him my paper. He said he was ‘really delighted to learn, that like Arthur C. Clarke, I predicted an event of the future.’” Shoch briefly considered replying that he had only borrowed the tapeworm’s name but that the concept was his own and that, unfortunately, Brunner did not really invent the worm. But he let it pass.
Michael A. Hiltzik (Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age)
Rene Descartes, the famous French philosopher, scientist, and mathematician, was born in 1596, in the village of La Haye.
Michael H Hart (The 100: A Ranking Of The Most Influential Persons In History)
The fraction of the mass of two hydrogen atoms that is released as energy when they fuse to produce helium is 0.007 (0.7%). That is the source of the heat produced in the sun and in a hydrogen bomb. It is the amount of mass (m) that is converted to energy (E) in the famous Einstein formula E = mc2, and it is a direct measure of the strong nuclear force. If the strong force had a value of 0.006 or less, the universe would consist only of hydrogen—not very conducive to the complexities of life. If the value were greater than 0.008, all the hydrogen would have been fused shortly after the big bang, and there could be no stars, no solar heat—again, no life. As Stephen Hawking and Leonard Mlodinow put it in their book The Grand Design, “Our universe and its laws appear to have a design that both is tailor-made to support us and, if we are to exist, leaves little room for alteration.
Sy Garte (The Works of His Hands: A Scientist’s Journey from Atheism to Faith)
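Garte's 0.007 figure is easy to sanity-check against Einstein's formula. A back-of-envelope calculation (our arithmetic, not a quotation from the book):

% Fusing 1 kg of hydrogen converts 0.7% of its mass to energy.
\begin{align*}
  E &= mc^2 = (0.007 \times 1\,\mathrm{kg}) \times (3.0 \times 10^{8}\,\mathrm{m/s})^2 \\
    &\approx 6.3 \times 10^{14}\,\mathrm{J}
\end{align*}
% Roughly the yield of 150 kilotons of TNT (1 kt is about 4.2e12 J)
% from a single kilogram of fuel.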
Michelet has done a good deal, it is true, to make Jeanne d’Arc popular and famous; but it was as the spokesman for the national sense of the people, not as a mystic or a saint, that she interested him. “What legend is more beautiful,” he writes, “than this incontestable story? But one must be careful not to make it into a legend. One must piously preserve all its circumstances, even the most human; one must respect its touching and terrible humanity…However deeply the historian may have been moved in writing this gospel, he has kept a firm hold on the real and never yielded to the temptation of idealism.” And he insisted that Jeanne d’Arc had established the modern type of hero of action, “contrary to passive Christianity.” His approach was thus entirely rational, based squarely on the philosophy of the eighteenth century – anti-clerical, democratic. And for this reason, the History of the Middle Ages, important as it is, and for all its acute insight and its passages of marvelous eloquence, seems to me less satisfactory than the other parts of Michelet’s history. What Michelet admires are not the virtues which the chivalrous and Christian centuries cultivated, but the heroisms of the scientist and the artist, the Protestant in religion and politics, the efforts of man to understand his situation and rationally to control his development. Throughout the Middle Ages, Michelet is impatient for the Renaissance.
Edmund Wilson (To the Finland Station)
Hardin in a famous article published in 1968, the concept has been well known by social scientists for much longer. In economics, Paul Samuelson, a giant in the field, wrote about what he called
Richard H. Thaler (Nudge: The Final Edition)
One of the first things Tessa asked me was whether I had seen the whole Metz speech, not just the famous quote, which she repeated word for word: We are living in a computer-programmed reality, and the only clue we have to it is when some variable is changed, and some alteration in our reality occurs.
Rizwan Virk (The Simulated Multiverse: An MIT Computer Scientist Explores Parallel Universes, The Simulation Hypothesis, Quantum Computing and the Mandela Effect)
Von Neumann too wondered about the mystery of his and his compatriots’ origins. His friend and biographer, the Polish mathematician Stanislaw Ulam, remembers their discussions of the primitive rural foothills on both sides of the Carpathians, encompassing parts of Hungary, Czechoslovakia and Poland, populated thickly with impoverished Orthodox villages. “Johnny used to say that all the famous Jewish scientists, artists and writers who emigrated from Hungary around the time of the first World War came, either directly or indirectly, from those little Carpathian communities, moving up to Budapest as their material conditions improved.”
Richard Rhodes (The Making of the Atomic Bomb: 25th Anniversary Edition)
Some imaginative Rebels play with their idea of their identity. One Rebel reported: “When I need to do repetitive chores, everything in me screams ‘Noooo.’ So I play a game I call ‘As If.’ I enact being somebody else or doing stuff while being filmed: e.g., I enact being a perfect butler, cook, interior designer, famous poet, cool scientist…sounds cheesy, but it works.” One Rebel combined the strategy of identity with the Rebel love of challenge: “To get things done, I trick my mind with a dare. I tell myself, ‘I’m a Rebel who can stick to a routine and follow through.’ This challenge excites me. It’s rebellious to be a Rebel who can do disciplined things that you don’t expect.”
Gretchen Rubin (The Four Tendencies: The Indispensable Personality Profiles That Reveal How to Make Your Life Better (and Other People's Lives Better, Too))
She remembered reading a quote from a famous scientist, who was speculating about intelligent life in the universe, that was particularly apt to this situation: Sometimes I think we’re alone. Sometimes I think we’re not. In either case, the thought is staggering.
Douglas E. Richards (The Cure)
Meanwhile, in the early 1970s Donald Johanson, a brash young scientist from the United States, joined a field expedition at a site in Ethiopia called Hadar. The team found hominin fossils, including a partial skeleton soon to become the most famous in the world, nicknamed “Lucy.” Geological work dated Lucy and associated fossils back more than three million years.
Lee Berger (Almost Human: The Astonishing Tale of Homo Naledi and the Discovery That Changed Our Human Story)
The Natural Law Argument Bertrand Russell: “There is, as we all know, a law that if you throw dice, you will get double sixes only about once in thirty-six times, and we do not regard that as evidence that the fall of the dice is regulated by design.” Russell's argument is a logical fallacy because we cannot impose our understanding and interpretation of playing dice on God or the natural law. We must first define or understand our subject to talk about anything with scientific precision. Since nobody has an understanding of the world before the world, to put it that way, we cannot have a clear understanding or grasp of the things that are beyond our cognitive powers. We still can think about them. To say that science is only what is proven by scientific experiments would be foolish because that would exclude the vast space of the unknown, even unknowable. Maybe God does not play dice, but maybe even God needs, metaphorically speaking, to throw out thirty-six worlds to make some effects, even if only two, that would otherwise not be possible. As we know, matter cannot power itself and organize itself without the underlying creative force empowering it. Matter is matter thanks to our perceptive and cognitive powers, not per se. Matter per se does not exist in the form we see it. What we see is a reality based on our senses. We cannot completely rely on our senses to tell the underlying reality. Reaching the underlying reality is possible only through abstract thought. This abstract thought will enhance scientific discoveries because we cannot reach the physically unreachable by experiments or strictly scientific means. Identification of God from religious books with God independent of holy books is prevalent in the books or arguments against God used by the most famous atheists, including agnostics like Bertrand Russell. However, a huge difference exists between a God from religious books and Spinoza’s God or the God of many philosophers and scientists. Once we acknowledge and accept this important difference, we will realize that the gap between believers (not contaminated by religions) and atheists (or agnostics) is much smaller than it looks at first sight. God is not in the religious books, nor can he be owned through religious books. The main goal of the major monotheistic religions is to a priori appropriate and establish the right to God rather than to define and explain God in the deepest possible sense because that is almost impossible, even for science and philosophy. For that reason, a belief in blind faith and fear mostly saves major religions, rather than pure belief, unaffected by religious influence or deceit.
Dejan Stojanovic
Amblyopsis hoosieri
Type of animal: Eyeless cavefish
Description: Completely colorless; 2 to 3 inches long; anus on underside of neck
Home: Southern Indiana
Fun fact: Unlike others of its kind, A. hoosieri lacks a debilitating mutation in the rhodopsin gene, which is an important gene for vision. That means it could see just fine … if it had eyes. Researchers named the fish after the Indiana Hoosiers basketball team — but not to imply the players might be visually challenged. The name honors several famous fish scientists who worked at Indiana University, as well as the species’s proximity to the university. Plus, the lead author is a Hoosier fan.
Anonymous