Famous Scientist Quotes

Isn’t every human being both a scientist and an artist; and in writing of human experience, isn’t there a good deal to be said for recognizing that fact and for using both methods?
James Agee (Let Us Now Praise Famous Men)
Anyway. I’m not allowed to watch TV, although I am allowed to rent documentaries that are approved for me, and I can read anything I want. My favorite book is A Brief History of Time, even though I haven’t actually finished it, because the math is incredibly hard and Mom isn’t good at helping me. One of my favorite parts is the beginning of the first chapter, where Stephen Hawking tells about a famous scientist who was giving a lecture about how the earth orbits the sun, and the sun orbits the solar system, and whatever. Then a woman in the back of the room raised her hand and said, “What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.” So the scientist asked her what the tortoise was standing on. And she said, “But it’s turtles all the way down!” I love that story, because it shows how ignorant people can be. And also because I love tortoises.
Jonathan Safran Foer (Extremely Loud & Incredibly Close)
When you are famous it is hard to work on small problems. This is what did Shannon in. After information theory, what do you do for an encore? The great scientists often make this error. They fail to continue to plant the little acorns from which the mighty oak trees grow. They try to get the big thing right off. And that isn't the way things go. So that is another reason why you find that when you get early recognition it seems to sterilize you.
Richard Hamming
Pierre Curie, a brilliant scientist, happened to marry a still more brilliant one—Marie, the famous Madame Curie—and is the only great scientist in history who is consistently identified as the husband of someone else.
Isaac Asimov (Views From a Height: A Brilliant Overview of the Exciting Realms of Science)
It is a well-known established fact throughout the many-dimensional worlds of the multiverse that most really great discoveries are owed to one brief moment of inspiration. There's a lot of spadework first, of course, but what clinches the whole thing is the sight of, say, a falling apple or a boiling kettle or the water slipping over the edge of the bath. Something goes click inside the observer's head and then everything falls into place. The shape of DNA, it is popularly said, owes its discovery to the chance sight of a spiral staircase when the scientist's mind was just at the right receptive temperature. Had he used the elevator, the whole science of genetics might have been a good deal different. This is thought of as somehow wonderful. It isn't. It is tragic. Little particles of inspiration sleet through the universe all the time traveling through the densest matter in the same way that a neutrino passes through a candyfloss haystack, and most of them miss. Even worse, most of the ones that hit the exact cerebral target, hit the wrong one. For example, the weird dream about a lead doughnut on a mile-high gantry, which in the right mind would have been the catalyst for the invention of repressed-gravitational electricity generation (a cheap and inexhaustible and totally non-polluting form of power which the world in question had been seeking for centuries, and for the lack of which it was plunged into a terrible and pointless war) was in fact had by a small and bewildered duck. By another stroke of bad luck, the sight of a herd of wild horses galloping through a field of wild hyacinths would have led a struggling composer to write the famous Flying God Suite, bringing succor and balm to the souls of millions, had he not been at home in bed with shingles. The inspiration thereby fell to a nearby frog, who was not in much of a position to make a startling contribution to the field of tone poetry. Many civilizations have recognized this shocking waste and tried various methods to prevent it, most of them involving enjoyable but illegal attempts to tune the mind into the right wavelength by the use of exotic herbage or yeast products. It never works properly.
Terry Pratchett (Sourcery (Discworld, #5; Rincewind, #3))
For her next birthday she'd asked for a telescope. Her mother had been alive then, and had suggested a pony, but her father had laughed and bought her a beautiful telescope, saying: "Of course she should watch the stars! Any girl who cannot identify the constellation of Orion just isn't paying attention!" And when she started asking him complicated questions, he took her along to lectures at the Royal Society, where it turned out that a nine-year-old girl who had blond hair and knew what the precession of the equinoxes was could ask hugely bearded famous scientists anything she liked. Who'd want a pony when you could have the whole universe?
Terry Pratchett
This picture of a hot early stage of the universe was first put forward by the scientist George Gamow in a famous paper written in 1948 with a student of his, Ralph Alpher. Gamow had quite a sense of humor—he persuaded the nuclear scientist Hans Bethe to add his name to the paper to make the list of authors “Alpher, Bethe, Gamow.”
Stephen Hawking (A Brief History of Time)
Harvard Square looked both new and familiar. I felt like I would have been able to tell just from looking that this configuration of buildings and streets was familiar and meaningful to lots of people, not just me. It was weird to visit a suburb that nobody else ever visited or went to, and then to return to these widely known halls and buildings where famous statesmen and writers and scientists had been coming for hundreds of years.
Elif Batuman (The Idiot)
New Rule: Stop pretending your drugs are morally superior to my drugs because you get yours at a store. This week, they released the autopsy report on Anna Nicole Smith, and the cause of death was what I always thought it was: mad cow. No, it turns out she had nine different prescription drugs in her—which, in the medical field, is known as the “full Limbaugh.” They opened her up, and a Walgreens jumped out. Antidepressants, anti-anxiety pills, sleeping pills, sedatives, Valium, methadone—this woman was killed by her doctor, who is a glorified bartender. I’m not going to say his name, but only because (a) I don’t want to get sued, and (b) my back is killing me. This month marks the thirty-fifth anniversary of a famous government report. I was sixteen in 1972, and I remember how excited we were when Nixon’s much ballyhooed National Commission on Drug Abuse came out and said pot should be legalized. It was a moment of great hope for common sense—and then, just like Bush did with the Iraq Study Group, Nixon took the report and threw it in the garbage, and from there the ’70s went right into disco and colored underpants. This week American Scientist, a magazine George Bush wouldn’t read if he got food poisoning in Mexico and it was the only thing he could reach from the toilet, described a study done in England that measured the lethality of various drugs, and found tobacco and alcohol far worse than pot, LSD, or Ecstasy—which pretty much mirrors my own experiments in this same area. The Beatles took LSD and wrote Sgt. Pepper—Anna Nicole Smith took legal drugs and couldn’t remember the number for nine-one-one. I wish I had more time to go into the fact that the drug war has always been about keeping black men from voting by finding out what they’re addicted to and making it illegal—it’s a miracle our government hasn’t outlawed fat white women yet—but I leave with one request: Would someone please just make a bumper sticker that says, “I’m a stoner, and I vote.”
Bill Maher (The New New Rules: A Funny Look At How Everybody But Me Has Their Head Up Their Ass)
This [discovery of a cell-free yeast extract] will make him famous, even though he has no talent for chemistry. {Comment on German scientist Eduard Buchner who later ironically won a Nobel Prize in Chemistry for this discovery}
Adolf von Baeyer
A separate, international team analyzed more than a half million research articles, and classified a paper as “novel” if it cited two other journals that had never before appeared together. Just one in ten papers made a new combination, and only one in twenty made multiple new combinations. The group tracked the impact of research papers over time. They saw that papers with new knowledge combinations were more likely to be published in less prestigious journals, and also much more likely to be ignored upon publication. They got off to a slow start in the world, but after three years, the papers with new knowledge combos surpassed the conventional papers, and began accumulating more citations from other scientists. Fifteen years after publication, studies that made multiple new knowledge combinations were way more likely to be in the top 1 percent of most-cited papers. To recap: work that builds bridges between disparate pieces of knowledge is less likely to be funded, less likely to appear in famous journals, more likely to be ignored upon publication, and then more likely in the long run to be a smash hit in the library of human knowledge.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Echoing a famous argument by the philosopher Karl Popper, most scientists today insist that the dividing line between science and pseudoscience is whether advocates of a hypothesis deliberately search for evidence that could falsify it and accept the hypothesis only if it survives.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Ending up with that gigantic outsized brain must have taken some sort of runaway evolutionary process, something that would push and push without limits. And today's scientists had a pretty good guess at what that runaway evolutionary process had been. Harry had once read a famous book called Chimpanzee Politics. The book had described how an adult chimpanzee named Luit had confronted the aging alpha, Yeroen, with the help of a young, recently matured chimpanzee named Nikkie. Nikkie had not intervened directly in the fights between Luit and Yeroen, but had prevented Yeroen's other supporters in the tribe from coming to his aid, distracting them whenever a confrontation developed between Luit and Yeroen. And in time Luit had won, and become the new alpha, with Nikkie as the second most powerful... ...though it hadn't taken very long after that for Nikkie to form an alliance with the defeated Yeroen, overthrow Luit, and become the new new alpha. It really made you appreciate what millions of years of hominids trying to outwit each other - an evolutionary arms race without limit - had led to in the way of increased mental capacity. 'Cause, y'know, a human would have totally seen that one coming.
Eliezer Yudkowsky (Harry Potter and the Methods of Rationality)
Even if you are not a religious person by nature or training—even if you are an out-and-out skeptic—prayer can help you much more than you believe, for it is a practical thing. What do I mean, practical? I mean that prayer fulfills these three very basic psychological needs which all people share, whether they believe in God or not: 1. Prayer helps us to put into words exactly what is troubling us. We saw in Chapter 4 that it is almost impossible to deal with a problem while it remains vague and nebulous. Praying, in a way, is very much like writing our problems down on paper. If we ask help for a problem—even from God—we must put it into words. 2. Prayer gives us a sense of sharing our burdens, of not being alone. Few of us are so strong that we can bear our heaviest burdens, our most agonizing troubles, all by ourselves. Sometimes our worries are of so ultimate a nature that we cannot discuss them even with our closest relatives or friends. Then prayer is the answer. Any psychiatrist will tell us that when we are pent-up and tense, and in an agony of spirit, it is therapeutically good to tell someone our troubles. When we can’t tell anyone else—we can always tell God. 3. Prayer puts into force an active principle of doing. It’s a first step toward action. I doubt if anyone can pray for some fulfillment, day after day, without benefiting from it—in other words, without taking some steps to bring it to pass. The world-famous scientist, Dr. Alexis Carrel, said: “Prayer is the most powerful form of energy one can generate.” So why not make use of it? Call it God or Allah or Spirit—why quarrel with definitions as long as the mysterious powers of nature take us in hand?
Dale Carnegie (How To Stop Worrying & Start Living)
The trouble with Oppenheimer, the famous but uninvolved scientist Einstein remarked, was that he loved a woman who did not love him back: the U.S. government.
TaraShea Nesbit (The Wives of Los Alamos)
Noam Chomsky has famously argued that a Martian scientist would conclude that all earthlings speak dialects of the same language.
Guy Deutscher (Through the Language Glass: Why the World Looks Different in Other Languages)
wouldn’t mind—I mean, I know he was a famous scientist, but that’s all I know. Could you tell me how you knew him? Maybe supply an anecdote? Did you know him long?
Bonnie Garmus (Lessons in Chemistry)
Once upon a time an academic scientist went to visit a Zen Master, famous for being very wise. After greeting the scholar, the master offered him tea. As they sat together, the monk began to pour the tea into the scholar's cup. He poured until the tea overflowed onto the saucer, then the table and finally onto the floor. When the scholar could not stand it any more, he blurted out: "Stop, stop, can't you see the cup is full?" To which the Zen Master replied: "Yes, I can, and until your mind is empty, you will not hear what I have to say."
Jeffrey Armstrong (God the Astrologer: Soul, Karma, and Reincarnation--How We Continually Create Our Own Destiny)
“It is easy for a famous scientist to have lots of students doing the dirty work for him,” said one colleague. “But Opje helps people with their problems and then gives them the credit.”
Kai Bird (American Prometheus)
What this all goes to show is that nonsense remains nonsense, even when talked by world-famous scientists. What serves to obscure the illogicality of such statements is the fact that they are made by scientists; and the general public, not surprisingly, assumes that they are statements of science and takes them on authority. That is why it is important to point out that they are not statements of science, and any statement, whether made by a scientist or not, should be open to logical analysis. Immense prestige and authority do not compensate for faulty logic.
John C. Lennox (God and Stephen Hawking)
Gene patents are the point of greatest concern in the debate over ownership of human biological materials, and how that ownership might interfere with science. As of 2005—the most recent year figures were available—the U.S. government had issued patents relating to the use of about 20 percent of known human genes, including genes for Alzheimer’s, asthma, colon cancer, and, most famously, breast cancer. This means pharmaceutical companies, scientists, and universities control what research can be done on those genes, and how much resulting therapies and diagnostic tests will cost. And some enforce their patents aggressively: Myriad Genetics, which holds the patents on the BRCA1 and BRCA2 genes responsible for most cases of hereditary breast and ovarian cancer, charges $3,000 to test for the genes. Myriad has been accused of creating a monopoly, since no one else can offer the test, and researchers can’t develop cheaper tests or new therapies without getting permission from Myriad and paying steep licensing fees. Scientists who’ve gone ahead with research involving the breast-cancer genes without Myriad’s permission have found themselves on the receiving end of cease-and-desist letters and threats of litigation.
Rebecca Skloot
when another German scientist, Werner Heisenberg, formulated his famous uncertainty principle. In order to predict the future position and velocity of a particle, one has to be able to measure its present position and velocity accurately. The obvious way to do this is to shine light on the particle. Some of the waves of light will be scattered by the particle and this will indicate its position. However, one will not be able to determine the position of the particle more accurately than the distance between the wave crests of light, so one needs to use light of a short wavelength in order to measure the position of the particle precisely. Now, by Planck’s quantum hypothesis, one cannot use an arbitrarily small amount of light; one has to use at least one quantum. This quantum will disturb the particle and change its velocity in a way that cannot be predicted. Moreover, the more accurately one measures the position, the shorter the wavelength of the light that one needs and hence the higher the energy of a single quantum. So the velocity of the particle will be disturbed by a larger amount. In other words, the more accurately you try to measure the position of the particle, the less accurately you can measure its speed, and vice versa.
Stephen Hawking (A Brief History of Time)
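Hawking's verbal statement of this trade-off is usually written, in standard textbook notation (the formula itself is not part of the quote), as Heisenberg's inequality:

    \Delta x \, \Delta p \ge \frac{\hbar}{2}

where \Delta x and \Delta p are the uncertainties in the particle's position and momentum, and \hbar is the reduced Planck constant: squeezing either factor forces the other to grow.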
Some of the more sought-after signers are, in no particular order, presidents, military heroes, sports icons, actors, singers, artists, religious and social leaders, scientists, astronauts, authors, and Kardashians.
Carrie Fisher (The Princess Diarist)
Abraham Maslow once famously said, “When all you’ve got is a hammer, every problem looks like a nail.” What he meant was, when it comes to problem-solving, we tend to get locked into using familiar tools in expected ways.
Steven Kotler (Stealing Fire: How Silicon Valley, the Navy SEALs, and Maverick Scientists Are Revolutionizing the Way We Live and Work)
Singer cited the famous essay “The Tragedy of the Commons,” in which biologist Garrett Hardin argued that individuals acting in their rational self-interest may undermine the common good, and warned against assuming that technology would save us from ourselves. “If we ignore the present warning signs and wait for an ecological disaster to strike, it will probably be too late,” Singer noted. He imagined what it must have been like to be Noah, surrounded by “complacent compatriots,” saying, “‘Don’t worry about the rising waters, Noah; our advanced technology will surely discover a substitute for breathing.’ If it was wisdom that enabled Noah to believe in the ‘never-yet-happened,’ we could use some of that wisdom now,” Singer concluded.
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
For years the physicist Donna Strickland was not deemed notable enough for an entry. She finally got her place in Wikipedia on the day she won the Nobel Prize. Surely that cannot be what it takes to be remembered? No man is held to such a standard.
Sandi Toksvig (Between the Stops: The View of My Life from the Top of the Number 12 Bus)
Einstein never accepted that the universe was governed by chance; his feelings were summed up in his famous statement, ‘God does not play dice.’ Most other scientists, however, were willing to accept quantum mechanics because it agreed perfectly with experiment.
Stephen Hawking (A Brief History of Time)
Dr. Julian Huxley, famous English biologist and director of UNESCO, recently stated that Western scientists should “learn the Oriental techniques” for entering the trance state and for control of breathing. “What happens? How is it possible?” he said. An Associated Press dispatch from London, dated Aug. 21, 1948, reported: “Dr. Huxley told the new World Federation for Mental Health it might well look into the mystic lore of the East. If this lore could be investigated scientifically, he advised mental specialists, ‘then I think an immense step forward could be made in your field.’”
Paramahansa Yogananda (Autobiography of a Yogi (Self-Realization Fellowship))
The famous computer scientist Melvin Conway coined an adage that is often referred to as Conway's Law. It states that any organization that designs a system will produce a design whose structure mirrors the organization's structure. Another way to say this is to beware of shipping your org chart.
Marty Cagan (Empowered: Ordinary People, Extraordinary Products)
teachers do not hold bombs or knives, they are still dangerous enemies. They fill us with insidious revisionist ideas. They teach us that scholars are superior to workers. They promote personal ambition by encouraging competition for the highest grades. All these things are intended to change good young socialists into corrupt revisionists. They are invisible knives that are even more dangerous than real knives or guns. For example, a student from Yu-cai High School killed himself because he failed the university entrance examination. Brainwashed by his teachers, he believed his sole aim in life was to enter a famous university and become a scientist—
Ji-li Jiang (Red Scarf Girl)
Who's more interesting: A famous scientist, or the famous scientist who plays the cello and whittles marionettes in a lighthouse at the edge of the world where he sometimes writes poetry by the light of passing ships? Exactly. Follow your weird impulses and do all sorts of things. Getting sidetracked can lead you to exactly where you belong.
Jessica Hagy (How to Be Interesting: In 10 Simple Steps)
The interpretation of a result is an example. To take a trivial instance, there is a famous joke about a man who complains to a friend of a mysterious phenomenon. The white horses on his farm eat more than the black horses. He worries about this and cannot understand it, until his friend suggests that maybe he has more white horses than black ones.
Richard P. Feynman (The Meaning of It All: Thoughts of a Citizen-Scientist)
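The friend's diagnosis is just normalization by group size. A minimal sketch in Python, with made-up herd numbers, to make the arithmetic concrete:

    # Hypothetical numbers, for illustration only.
    herds = {
        "white": {"count": 20, "feed_kg": 200},
        "black": {"count": 5, "feed_kg": 60},
    }
    for color, h in herds.items():
        per_horse = h["feed_kg"] / h["count"]
        print(f"{color}: total {h['feed_kg']} kg, per horse {per_horse:.1f} kg")
    # White horses eat more in total (200 > 60) but less per horse (10.0 < 12.0),
    # so the "mystery" disappears once you divide by how many horses there are.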
As the Reverend Sally Bingham, an Episcopalian preacher and renewables advocate, put it to me: “We believe that Mary was a virgin, that Jesus rose from the dead, that we might go to heaven. So why is it that two thousand years later, we still believe this story? And how can we believe that and not believe what the world’s most famous climate scientists tell us?”
George Marshall (Don't Even Think About It: Why Our Brains Are Wired to Ignore Climate Change)
The successful ideas survive scrutiny. The bad ideas get discarded. Conformity is also laughable to scientists attempting to advance their careers. The best way to get famous in your own lifetime is to pose an idea that counters prevailing research and that earns a consistency of observations and experiment. Healthy disagreement is a natural state on the bleeding edge of discovery.
Neil deGrasse Tyson (Starry Messenger: Cosmic Perspectives on Civilization)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Christopher’s anti-God campaign was based on a fundamental error reflected in the subtitle of his book: How Religion Poisons Everything. On the contrary, since religion, as practiced, is a human activity, the reverse is true. Human beings poison religion, imposing their prejudices, superstitions, and corruptions onto its rituals and texts, not the other way around. “Pascal Is a Fraud!” When I first became acquainted with Christopher’s crusade, I immediately thought of the seventeenth-century scientist and mathematician, Blaise Pascal. In addition to major contributions to scientific knowledge, Pascal produced exquisite reflections on religious themes: When I consider the short duration of my life, swallowed up in the eternity before and after, the space which I fill, and even can see, engulfed in the infinite immensity of spaces of which I am ignorant and which know me not, I am frightened and astonished at being here rather than there; for there is no reason why here rather than there, why now rather than then. Who has put me here? These are the questions that only a religious faith can attempt to answer. There is no science of the why of our existence, no scientific counsel or solace for our human longings, loneliness, and fear. Without a God to make sense of our existence, Pascal wrote, human life is intolerable: This is what I see and what troubles me. I look on all sides, and I see only darkness everywhere. Nature presents to me nothing which is not a matter of doubt and concern. If I saw nothing there that revealed a Divinity, I would come to a negative conclusion; if I saw everywhere the signs of a Creator, I would remain peacefully in faith. But seeing too much to deny and too little to be sure, I am in a state to be pitied. . . . To resolve this dilemma, Pascal devised his famous “wager,” which, simply stated, is that since we cannot know whether there is a God or not, it is better to wager that there is one, rather than that there is not.
David Horowitz (Dark Agenda: The War to Destroy Christian America)
I think it is almost impossible that he [Prophet Muhammad (saas)] could have known about things like the common origin of the universe, because scientists have only found out within the last few years with very complicated and advanced technological methods that this is the case. Somebody who did not know something about nuclear physics 1400 years ago could not, I think, be in a position to find out from his own mind for instance that the earth and the heavens had the same origin, or many others of the questions that we have discussed here. (Alfred Kroner, Professor of the Department of Geosciences, University of Mainz, Germany. One of the world's most famous geologists)
Harun Yahya (Allah's Miracles in the Qur'an)
Albert Einstein, considered the most influential person of the 20th century, was four years old before he could speak and seven before he could read. His parents thought he was retarded. He spoke haltingly until age nine. He was advised by a teacher to drop out of grade school: “You’ll never amount to anything, Einstein.” Isaac Newton, the scientist who invented modern-day physics, did poorly in math. Patricia Polacco, a prolific children’s author and illustrator, didn’t learn to read until she was 14. Henry Ford, who developed the famous Model-T car and started Ford Motor Company, barely made it through high school. Lucille Ball, famous comedian and star of I Love Lucy, was once dismissed from drama school for being too quiet and shy. Pablo Picasso, one of the great artists of all time, was pulled out of school at age 10 because he was doing so poorly. A tutor hired by Pablo’s father gave up on Pablo. Ludwig van Beethoven was one of the world’s great composers. His music teacher once said of him, “As a composer, he is hopeless.” Wernher von Braun, the world-renowned mathematician, flunked ninth-grade algebra. Agatha Christie, the world’s best-known mystery writer and all-time bestselling author other than William Shakespeare of any genre, struggled to learn to read because of dyslexia. Winston Churchill, famous English prime minister, failed the sixth grade.
Sean Covey (The 6 Most Important Decisions You'll Ever Make: A Guide for Teens)
Despite the complexity and variety of the universe, it turns out that to make one you need just three ingredients. Let’s imagine that we could list them in some kind of cosmic cookbook. So what are the three ingredients we need to cook up a universe? The first is matter—stuff that has mass. Matter is all around us, in the ground beneath our feet and out in space. Dust, rock, ice, liquids. Vast clouds of gas, massive spirals of stars, each containing billions of suns, stretching away for incredible distances. The second thing you need is energy. Even if you’ve never thought about it, we all know what energy is. Something we encounter every day. Look up at the Sun and you can feel it on your face: energy produced by a star ninety-three million miles away. Energy permeates the universe, driving the processes that keep it a dynamic, endlessly changing place. So we have matter and we have energy. The third thing we need to build a universe is space. Lots of space. You can call the universe many things—awesome, beautiful, violent—but one thing you can’t call it is cramped. Wherever we look we see space, more space and even more space. Stretching in all directions. It’s enough to make your head spin. So where could all this matter, energy and space come from? We had no idea until the twentieth century. The answer came from the insights of one man, probably the most remarkable scientist who has ever lived. His name was Albert Einstein. Sadly I never got to meet him, since I was only thirteen when he died. Einstein realised something quite extraordinary: that two of the main ingredients needed to make a universe—mass and energy—are basically the same thing, two sides of the same coin if you like. His famous equation E = mc2 simply means that mass can be thought of as a kind of energy, and vice versa. So instead of three ingredients, we can now say that the universe has just two: energy and space. So where did all this energy and space come from? The answer was found after decades of work by scientists: space and energy were spontaneously invented in an event we now call the Big Bang.
Stephen Hawking (Brief Answers to the Big Questions)
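As a quick numerical companion to the equivalence Hawking describes, a sketch in Python (c is the standard speed of light; the megaton comparison is approximate):

    # Energy equivalent of one kilogram of mass, E = m * c**2.
    c = 2.998e8    # speed of light, m/s
    m = 1.0        # mass, kg
    E = m * c**2   # energy, joules
    print(f"E = {E:.3e} J")  # ~8.99e16 J, on the order of 20 megatons of TNT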
Newton had conceived of light as primarily a stream of emitted particles. But by Einstein’s day, most scientists accepted the rival theory, propounded by Newton’s contemporary Christiaan Huygens, that light should be considered a wave. A wide variety of experiments had confirmed the wave theory by the late nineteenth century. For example, Thomas Young did a famous experiment, now replicated by high school students, showing how light passing through two slits produces an interference pattern that resembles that of water waves going through two slits. In each case, the crests and troughs of the waves emanating from each slit reinforce each other in some places and cancel each other out in some places.
Walter Isaacson (Einstein: His Life and Universe)
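The reinforcement-and-cancellation pattern Isaacson describes follows a standard textbook condition (not part of the quote itself): for slit separation d, wavelength \lambda, and angle \theta from the slits to a point on the screen, bright fringes appear where

    d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots

and dark fringes appear where the path difference equals an odd multiple of \lambda/2.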
I found considerably more studies about women’s scent preferences than men’s. I don’t know if that’s because male scientists are particularly curious about What Women Want. Among studies on men, there’s the now-famous bit about men tipping strippers more if they’re ovulating—they do, the effects are reproducible, and they go away if the woman is on birth control—but that may or may not be scent related. (It’s hard to say what you’re smelling, exactly, in a strip club.) Men also prefer the smelly T-shirts of ovulating women, don’t like the pit smells of menstruating women and women who are less immuno-compatible as much, and almost universally dislike the smell of a woman’s tears, regardless of her reproductive status.
Cat Bohannon (Eve: How the Female Body Drove 200 Million Years of Human Evolution)
Eventually he stood and pulled a slim volume off his bookshelf. About halfway through the thin leather journal he found the most often cited quote of the Third Age Imperial omnimancer Salam Abdus. Note, dear reader, that destiny is like a cat that you wish to call to you. Give it your attention, try to coax it into place, and it shall have naught to do with you. Play coy as a maiden, and it shall surely come running. Yet turn your back on the bastard at your deepest peril. Jynn took a deep breath. Regrettably little remained of Abdus’ teachings; he was most famous for this observation being quoted in Nove’s Lex Infortunii, wherein the great philosopher-scientist noted that shortly after writing the quote, Abdus was eaten by a Dire Ocelot.
J. Zachary Pike (Dragonfired (The Dark Profit Saga #3))
Speaking before a joint session of Congress, President Johnson said: “This generation has altered the composition of the atmosphere on a global scale through . . . a steady increase in carbon dioxide from the burning of fossil fuels.” It’s remarkable to note that, more than fifty years ago, an American president was already aware of, and acknowledging, human-created climate change. Johnson had been briefed on the dangers of CO2 increases by the famous climate scientists Charles Keeling and Roger Revelle, among others. So, not only was Johnson aware of the issue, but he was already concerned enough to raise it before Congress. That single sentence in his address gives the lie to the claims of so many climate-change deniers that global warming is some kind of recent hoax.
Adam Frank (Light of the Stars: Alien Worlds and the Fate of the Earth)
He was a big, rather clumsy man, with a substantial bay window that started in the middle of the chest. I should guess that he was less muscular than at first sight he looked. He had large staring blue eyes and a damp and pendulous lower lip. He didn't look in the least like an intellectual. Creative people of his abundant kind never do, of course, but all the talk of Rutherford looking like a farmer was unperceptive nonsense. His was really the kind of face and physique that often goes with great weight of character and gifts. It could easily have been the soma of a great writer. As he talked to his companions in the streets, his voice was three times as loud as any of theirs, and his accent was bizarre…. It was part of his nature that, stupendous as his work was, he should consider it 10 per cent more so. It was also part of his nature that, quite without acting, he should behave constantly as though he were 10 per cent larger than life. Worldly success? He loved every minute of it: flattery, titles, the company of the high official world...But there was that mysterious diffidence behind it all. He hated the faintest suspicion of being patronized, even when he was a world figure. Archbishop Lang was once tactless enough to suggest that he supposed a famous scientist had no time for reading. Rutherford immediately felt that he was being regarded as an ignorant roughneck. He produced a formidable list of his last month’s reading. Then, half innocently, half malevolently: "And what do you manage to read, your Grice?" "I am afraid," said the Archbishop, somewhat out of his depth, "that a man in my position doesn't really have the leisure..." "Ah yes, your Grice," said Rutherford in triumph, "it must be a dog's life! It must be a dog's life!"
C.P. Snow
Sixty years ago, Austin Ranney, an eminent political scientist, wrote a prophetic dissent to a famous report by an American Political Science Association committee entitled “Toward a More Responsible Two-Party System.” The report, by prominent political scientists frustrated with the role of conservative Southern Democrats in blocking civil rights and other social policy, issued a clarion call for more ideologically coherent, internally unified, and adversarial parties in the fashion of a Westminster-style parliamentary democracy like Britain or Canada. Ranney powerfully argued that such parties would be a disaster within the American constitutional system, given our separation of powers, separately elected institutions, and constraints on majority rule that favor cross-party coalitions and compromise. Time has proven Ranney dead right—we now have the kinds of parties the report desired, and it is disastrous.
Thomas E. Mann (It's Even Worse Than It Looks: How the American Constitutional System Collided with the New Politics of Extremism)
Bertrand Russell famously said: “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” [but] Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset. We care about whether our creation story, our founding legends, our theories of invisible nutrients and germs and forces, our conceptions of the powerful, our suspicions about our enemies, are true or false. That’s because we have the tools to get answers to these questions, or at least to assign them warranted degrees of credence. And we have a technocratic state that should, in theory, put these beliefs into practice. But as desirable as that creed is, it is not the natural human way of believing. In granting an imperialistic mandate to the reality mindset to conquer the universe of belief and push mythology to the margins, we are the weird ones—or, as evolutionary social scientists like to say, the WEIRD ones: Western, Educated, Industrialized, Rich, Democratic. At least, the highly educated among us are, in our best moments. The human mind is adapted to understanding remote spheres of existence through a mythology mindset. It’s not because we descended from Pleistocene hunter-gatherers specifically, but because we descended from people who could not or did not sign on to the Enlightenment ideal of universal realism. Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Even if there is only one possible unified theory, it is just a set of rules and equations. What is it that breathes fire into the equations and makes a universe for them to describe? The usual approach of science of constructing a mathematical model cannot answer the questions of why there should be a universe for the model to describe. Why does the universe go to all the bother of existing? Is the unified theory so compelling that it brings about its own existence? Or does it need a creator, and, if so, does he have any other effect on the universe? And who created him? Up to now, most scientists have been too occupied with the development of new theories that describe what the universe is to ask the question why. On the other hand, the people whose business it is to ask why, the philosophers, have not been able to keep up with the advance of scientific theories. In the eighteenth century, philosophers considered the whole of human knowledge, including science, to be their field and discussed questions such as: did the universe have a beginning? However, in the nineteenth and twentieth centuries, science became too technical and mathematical for the philosophers, or anyone else except a few specialists. Philosophers reduced the scope of their inquiries so much that Wittgenstein, the most famous philosopher of this century, said, “The sole remaining task for philosophy is the analysis of language.” What a comedown from the great tradition of philosophy from Aristotle to Kant! However, if we do discover a complete theory, it should in time be understandable in broad principle by everyone, not just a few scientists. Then we shall all, philosophers, scientists, and just ordinary people, be able to take part in the discussion of the question of why it is that we and the universe exist. If we find the answer to that, it would be the ultimate triumph of human reason – for then we would know the mind of God.
Stephen Hawking (A Brief History of Time)
The person who discovered the answer was a retiring, self-funded scientist named Peter Mitchell who in the early 1960s inherited a fortune from the Wimpey house-building company and used it to set up a research center in a stately home in Cornwall. Mitchell was something of an eccentric. He wore shoulder-length hair and an earring at a time when that was especially unusual among serious scientists. He was also famously forgetful. At his daughter’s wedding, he approached another guest and confessed that she looked familiar, though he couldn’t quite place her. “I was your first wife,” she answered. Mitchell’s ideas were universally dismissed, not altogether surprisingly. As one chronicler has noted, “At the time that Mitchell proposed his hypothesis there was not a shred of evidence in support of it.” But he was eventually vindicated and in 1978 was awarded the Nobel Prize in Chemistry—an extraordinary accomplishment for someone who worked from a home lab.
Bill Bryson (The Body: A Guide for Occupants)
When I was a kid, my mother thought spinach was the healthiest food in the world because it contained so much iron. Getting enough iron was a big deal then because we didn't have 'iron-fortified' bread. Turns out that spinach is an okay source of iron, but no better than pizza, pistachio nuts, cooked lentils, or dried peaches. The spinach-iron myth grew out of a simple mathematical miscalculation: A researcher accidentally moved a decimal point one space, so he thought spinach had 10 times more iron than it did. The press reported it, and I had to eat spinach. Moving the decimal point was an honest mistake--but it's seldom that simple. If it happened today I'd suspect a spinach lobby was behind it. Businesses often twist science to make money. Lawyers do it to win cases. Political activists distort science to fit their agenda, bureaucrats to protect their turf. Reporters keep falling for it. Scientists sometimes go along with it because they like being famous.
John Stossel (Give Me a Break: How I Exposed Hucksters, Cheats, and Scam Artists and Became the Scourge of the Liberal Media...)
THE FOUNDING PROPHET of modern antihumanism was Thomas Malthus (1766–1834). For three decades a professor at the British East India Company’s East India College, Malthus was a political economist who famously argued that human reproduction always outruns available resources. This doctrine served to rationalize the starvation of millions caused by his employer’s policy of brutal oppression of the peasants of the Indian subcontinent. The British Empire’s colonial helots, however, were not Malthus’s only targets. Rather, his Essay on the Principle of Population (first published in 1798 and later expanded in numerous further editions) was initially penned as a direct attack on such Enlightenment revolutionaries as William Godwin and the Marquis de Condorcet, who advanced the notion that human liberty, expanding knowledge, and technological progress could ultimately make possible a decent life for all mankind. Malthus prescribed specific policies to keep population down by raising the death rate:
Robert Zubrin (Merchants of Despair: Radical Environmentalists, Criminal Pseudo-Scientists, and the Fatal Cult of Antihumanism)
11. Never give up on yourself Everyone may give up on you but never give up on yourself, because if you do, it will also become the end. Believe that anything can be achieved with effort. Most important of all, we must understand that dyslexia is not just a hindrance to learning; it may also be considered a gift. Multiple studies have proven that dyslexic people are highly creative and intuitive. Not to mention the long list of dyslexic people who have succeeded in their chosen fields; Known scientist and the inventor of telephone, Alexander Graham Bell; The inventor of telescope, Galileo Galilei; Painter and polymath, Leonardo da Vinci; Mathematician and writer Lewis Carroll; American journalist, Anderson Cooper; Famous actor, Tom Cruise; Director of our all time favorites Indiana Jones and Jurassic Park, Steven Spielberg; Musician Paul Frappier; Entrepreneur and Apple founder, Steve Jobs; and maybe the person who is reading this book right now. We must always remember, everything can be learned and anyone can learn how to read!  
Craig Donovan (Dyslexia: For Beginners - Dyslexia Cure and Solutions - Dyslexia Advantage (Dyslexic Advantage - Dyslexia Treatment - Dyslexia Therapy Book 1))
Look at the telephone; it would remind you of a unique scientist, Alexander Graham Bell. He, besides being a great inventor, was also a man of great compassion and service. In fact, much of the research which led to the development of the telephone was directed at finding solutions to the challenges of hearing impaired people and helping them to be able to listen and communicate. Bell’s mother and wife were both hearing impaired and it profoundly changed Bell’s outlook to science. He aimed to make devices which would help the hearing impaired. He started a special school in Boston to teach hearing impaired people in novel ways. It was these lessons which inspired him to work with sound and led to the invention of the telephone. Can you guess the name of the most famous student of Alexander Graham Bell? It was Helen Keller, the great author, activist and poet who was hearing and visually impaired. About her teacher, she once said that Bell dedicated his life to the penetration of that ‘inhuman silence which separates and estranges’.
A.P.J. Abdul Kalam (Learning How to Fly: Life Lessons for the Youth)
As arrogant as I may be in general, I am not sufficiently doltish or vainglorious to imagine that I can meaningfully address the deep philosophical questions embedded within this general inquiry of our intellectual ages—that is, fruitful modes of analysis for the history of human thought. I shall therefore take refuge in an escape route that has traditionally been granted to scientists: the liberty to act as a practical philistine. Instead of suggesting a principled and general solution, I shall ask whether I can specify an operational way to define “Darwinism” (and other intellectual entities) in a manner specific enough to win shared agreement and understanding among readers, but broad enough to avoid the doctrinal quarrels about membership and allegiance that always seem to arise when we define intellectual commitments as pledges of fealty to lists of dogmata (not to mention initiation rites, secret handshakes and membership cards—in short, the intellectual paraphernalia that led Karl Marx to make his famous comment to a French journalist: “je ne suis pas marxiste” [“I am not a Marxist”]).
Stephen Jay Gould (The Structure of Evolutionary Theory)
Benjamin Libet, a scientist in the physiology department of the University of California, San Francisco, was a pioneering researcher into the nature of human consciousness. In one famous experiment he asked a study group to move their hands at a moment of their choosing while their brain activity was being monitored. Libet was seeking to identify what came first — the brain’s electrical activity to make the hand move or the person’s conscious intention to make their hand move. It had to be the second one, surely? But no. Brain activity to move the hand was triggered a full half a second before any conscious intention to move it…. John-Dylan Haynes, a neuroscientist at the Max Planck Institute for Human Cognitive and Brain Studies in Leipzig, Germany, led a later study that was able to predict an action ten seconds before people had a conscious intention to do it. What was all the stuff about free will? Frank Tong, a neuroscientist at Vanderbilt University in Nashville, Tennessee, said: “Ten seconds is a lifetime in terms of brain activity.” So where is it coming from if not ‘us,’ the conscious mind?
David Icke
The lives of scientists, considered as Lives, almost always make dull reading. For one thing, the careers of the famous and the merely ordinary fall into much the same pattern, give or take an honorary degree or two, or (in European countries) an honorific order. It could be hardly otherwise. Academics can only seldom lead lives that are spacious or exciting in a worldly sense. They need laboratories or libraries and the company of other academics. Their work is in no way made deeper or more cogent by privation, distress or worldly buffetings. Their private lives may be unhappy, strangely mixed up or comic, but not in ways that tell us anything special about the nature or direction of their work. Academics lie outside the devastation area of the literary convention according to which the lives of artists and men of letters are intrinsically interesting, a source of cultural insight in themselves. If a scientist were to cut his ear off, no one would take it as evidence of a heightened sensibility; if a historian were to fail (as Ruskin did) to consummate his marriage, we should not suppose that our understanding of historical scholarship had somehow been enriched.
Peter Medawar
For almost all astronomical objects, gravitation dominates, and they have the same unexpected behavior. Gravitation reverses the usual relation between energy and temperature. In the domain of astronomy, when heat flows from hotter to cooler objects, the hot objects get hotter and the cool objects get cooler. As a result, temperature differences in the astronomical universe tend to increase rather than decrease as time goes on. There is no final state of uniform temperature, and there is no heat death. Gravitation gives us a universe hospitable to life. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past. The vision of the future as an infinite playground, with an unending sequence of mysteries to be understood by an unending sequence of players exploring an unending supply of information, is a glorious vision for scientists. Scientists find the vision attractive, since it gives them a purpose for their existence and an unending supply of jobs. The vision is less attractive to artists and writers and ordinary people. Ordinary people are more interested in friends and family than in science. Ordinary people may not welcome a future spent swimming in an unending flood of information. A darker view of the information-dominated universe was described in the famous story “The Library of Babel,” written by Jorge Luis Borges in 1941. Borges imagined his library, with an infinite array of books and shelves and mirrors, as a metaphor for the universe. Gleick’s book has an epilogue entitled “The Return of Meaning,” expressing the concerns of people who feel alienated from the prevailing scientific culture. The enormous success of information theory came from Shannon’s decision to separate information from meaning. His central dogma, “Meaning is irrelevant,” declared that information could be handled with greater freedom if it was treated as a mathematical abstraction independent of meaning. The consequence of this freedom is the flood of information in which we are drowning. The immense size of modern databases gives us a feeling of meaninglessness. Information in such quantities reminds us of Borges’s library extending infinitely in all directions. It is our task as humans to bring meaning back into this wasteland. As finite creatures who think and feel, we can create islands of meaning in the sea of information. Gleick ends his book with Borges’s image of the human condition: We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.
Freeman Dyson (Dreams of Earth and Sky)
In about 1980, he says, at a time when he was still struggling to articulate his own vision of a dynamic, evolving economy, he happened to read a book by the geneticist Richard Lewontin. And he was struck by a passage in which Lewontin said that scientists come in two types. Scientists of the first type see the world as being basically in equilibrium. And if untidy forces sometimes push a system slightly out of equilibrium, then they feel the whole trick is to push it back again. Lewontin called these scientists "Platonists," after the renowned Athenian philosopher who declared that the messy, imperfect objects we see around us are merely the reflections of perfect "archetypes." Scientists of the second type, however, see the world as a process of flow and change, with the same material constantly going around and around in endless combinations. Lewontin called these scientists "Heraclitans," after the Ionian philosopher who passionately and poetically argued that the world is in a constant state of flux. Heraclitus, who lived nearly a century before Plato, is famous for observing that "Upon those who step into the same rivers flow other and yet other waters," a statement that Plato himself paraphrased as "You can never step into the same river twice." "When I read what Lewontin said," says Arthur, "it was a moment of revelation. That's when it finally became clear to me what was going on. I thought to myself, 'Yes! We're finally beginning to recover from Newton.'"
M. Mitchell Waldrop (Complexity: The Emerging Science at the Edge of Order and Chaos)
The dinosaurs, built of concrete, were a kind of bonus attraction. On New Year's Eve 1853 a famous dinner for twenty-one prominent scientists was held inside the unfinished iguanodon. Gideon Mantell, the man who had found and identified the iguanodon, was not among them. The person at the head of the table was the greatest star of the young science of palaeontology. His name was Richard Owen and by this time he had already devoted several productive years to making Gideon Mantell's life hell.

Owen had grown up in Lancaster, in the north of England, where he had trained as a doctor. He was a born anatomist and so devoted to his studies that he sometimes illicitly borrowed limbs, organs and other parts from corpses and took them home for leisurely dissection. Once, while carrying a sack containing the head of a black African sailor that he had just removed, Owen slipped on a wet cobble and watched in horror as the head bounced away from him down the lane and through the open doorway of a cottage, where it came to rest in the front parlour. What the occupants had to say upon finding an unattached head rolling to a halt at their feet can only be imagined. One assumes that they had not formed any terribly advanced conclusions when, an instant later, a fraught-looking young man rushed in, wordlessly retrieved the head and rushed out again.
Bill Bryson (A Short History of Nearly Everything)
This, in turn, has given us a "unified theory of aging" that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of "junk" molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die.

But there is a small but important loophole in the second law, which states that total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it's possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde's famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.)

The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back.

As Nobel laureate Richard Feynman once said, "There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human's body will be cured."
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
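Kaku's refrigerator bookkeeping can be made quantitative. A minimal sketch under the usual idealized assumptions (heat \(Q_c\) drawn per cycle from the cold interior at temperature \(T_c\), heat \(Q_h\) rejected out the back at room temperature \(T_h > T_c\)): the second law requires

\[ \Delta S_{\text{total}} = -\frac{Q_c}{T_c} + \frac{Q_h}{T_h} \;\ge\; 0 \quad\Longrightarrow\quad Q_h \;\ge\; \frac{T_h}{T_c}\,Q_c \;>\; Q_c , \]

so the entropy removed from inside is always paid for by a strictly larger entropy dump outside, which is why the coils behind the machine run hot.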
Every human being with normal mental and emotional faculties longs for more. People typically associate their longing for more with a desire to somehow improve their lot in life—to get a better job, a nicer house, a more loving spouse, become famous, and so on. If only this, that, or some other thing were different, we say to ourselves, then we'd feel complete and happy. Some chase this "if only" all their lives. For others, the "if only" turns into resentment when they lose hope of ever acquiring completeness. But even if we get lucky and acquire our "if only," it never quite satisfies. Acquiring the better job, the bigger house, the new spouse, or world fame we longed for may provide a temporary sense of happiness and completeness, but it never lasts. Sooner or later, the hunger returns.

The best word in any language that captures this vague, unquenchable yearning, according to C. S. Lewis and other writers, is the German word Sehnsucht (pronounced "zane-zookt").[9] It's an unusual word that is hard to translate, for it expresses a deep longing or craving for something that you can't quite identify and that always feels just out of reach. Some have described Sehnsucht as a vague and bittersweet nostalgia and/or longing for a distant country, but one that cannot be found on earth. Others have described it as a quasi-mystical sense that we (and our present world) are incomplete, combined with an unattainable yearning for whatever it is that would complete it.

Scientists have offered several different explanations for this puzzling phenomenon—puzzling, because it's hard to understand how natural processes alone could have evolved beings that hunger for something nature itself doesn't provide.[10] But this longing is not puzzling from a biblical perspective, for Scripture teaches us that humans and the entire creation are fallen and estranged from God. Lewis saw Sehnsucht as reflective of our "pilgrim status." It indicates that we are not where we were meant to be, where we are destined to be; we are not home. Lewis once wrote to a friend that "our best havings are wantings," for our "wantings" are reminders that humans are meant for a different and better state.[11] In another place he wrote:

Our lifelong nostalgia, our longing to be reunited with something in the universe from which we now feel cut off, to be on the inside of some door which we have always seen from the outside is . . . the truest index of our real situation.[12]

With Lewis, Christians have always identified this Sehnsucht that resides in the human heart as a yearning for God. As St. Augustine famously prayed, "You have made us for yourself, and our hearts are restless till they find their rest in you."[13] In this light, we might think of Sehnsucht as a sort of homing device placed in us by our Creator to lead us into a passionate relationship with him.
Gregory A. Boyd (Benefit of the Doubt: Breaking the Idol of Certainty)
The Extraordinary Persons Project

In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be "extraordinary."

This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before.

In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called "positive psychology"—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity.

Ever the scientist, Ekman became quite specific about what was meant by "extraordinary." For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities.

The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people "there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones."

A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, "from the psychological viewpoint, is remarkable."

Third is a compelling personal presence that others find nourishing. "People want to be around them because it feels good—though they can't explain why," said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not "Dalai Lama" but rather "Kundun," which in Tibetan means "presence."

Finally, such extraordinary individuals have "amazing powers of attentiveness and concentration."
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution

The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed:

1. Australopithecus
2. Homo habilis
3. Homo erectus
4. Homo sapiens

Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans.

Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation."

By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived at different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neanderthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution, although he was himself one of the leading advocates of evolution in the twentieth century:

What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth.

Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
Humans are better equipped for sight than for smell. We process visual input ten times faster than olfactory. Visual and cognitive cues handily trump olfactory ones, a fact famously demonstrated in a 2001 collaboration between a sensory scientist and a team of oenologists (wine scientists) at the University of Bordeaux in Talence, France.
Anonymous
Scientists expected that the Super, a fusion or "thermonuclear" weapon, would be an awesomely destructive horror that could unleash the equivalent of several million tons of TNT. This was hundreds of times more powerful than atomic bombs. A few well-placed hydrogen bombs could kill millions of people. Among the foes of development were famous scientists who had supported atomic development during World War II. One was Albert Einstein, who took to the radio to say that "general annihilation beckons."
James T. Patterson (Grand Expectations: The United States, 1945-1974 (Oxford History of the United States Book 10))
More daring, though, was Jung’s uninhibited interest in spiritualism, which by this time had become a controversial topic on both sides of the Atlantic, ever since 1848, when the Fox sisters of Hydesville, New York, discovered they could communicate with the spirit of a dead man. Soon after this, mediums, table turning, floating tambourines, ectoplasmic limbs, and a variety of other otherworldly phenomena became the focus of an international craze; the flood of disincarnate appearances led one investigator to speak of an “invasion of the spirit people.”7 Colorful characters like the Russian medium and mystic Helena Petrovna Blavatsky were involved, but also scientists and philosophers like William James, Oliver Lodge, William Crookes, and Frederick Myers. It is difficult for us today to realize that at the time, many of the most famous men and women in the world were involved in spiritualism, to one degree or another. Thomas Edison, for example, who joined Blavatsky’s Theosophical Society, hoped to be able to record spirits on his “Spirit Phone.” Yet, for all this, the reductionist thought that dominates the academic world today was already securely in place, and Jung was risking his future career by openly advocating the unbiased study of the paranormal.
Gary Lachman (Jung the Mystic: The Esoteric Dimensions of Carl Jung's Life & Teachings)
Amblyopsis hoosieri

Type of animal: Eyeless cavefish
Description: Completely colorless; 2 to 3 inches long; anus on underside of neck
Home: Southern Indiana
Fun fact: Unlike others of its kind, A. hoosieri lacks a debilitating mutation in the rhodopsin gene, which is an important gene for vision. That means it could see just fine … if it had eyes. Researchers named the fish after the Indiana Hoosiers basketball team — but not to imply the players might be visually challenged. The name honors several famous fish scientists who worked at Indiana University, as well as the species's proximity to the university. Plus, the lead author is a Hoosier fan.
Anonymous
There’s a quote from the famous physicist Niels Bohr, who posits that the way you become an expert in a field is to make every mistake possible in that field.
Sebastian Gutiérrez (Data Scientists at Work)
Singer cited the famous essay "The Tragedy of the Commons," in which biologist Garrett Hardin argued that individuals acting in their rational self-interest may undermine the common good, and warned against assuming that technology would save us from ourselves. "If we ignore the present warning signs and wait for an ecological disaster to strike, it will probably be too late," Singer noted. He imagined what it must have been like to be Noah, surrounded by "complacent compatriots," saying, "'Don't worry about the rising waters, Noah; our advanced technology will surely discover a substitute for breathing.'"
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
This is the dilemma of science-think and yet again a situation in which scientists simply shouldn't be such scientists. Bring in the professionals, and trust them when they tell you to invest in communication. It may be frustrating and seem like a frivolous waste of resources, but what's the alternative strategy—to assume that people are rational, thinking beings? There's a famous quote by Democratic presidential candidate Adlai Stevenson, who heard a woman shout to him that all the thinking people of America were with him. He replied, "That's not going to be enough, Madam; I need a majority of the public."
Randy Olson (Don't Be Such a Scientist: Talking Substance in an Age of Style)
Nierenberg was a man of strong will and even stronger opinions—a good talker but not always a good listener. Some colleagues said that the old adage about famous physicists definitely applied to him: he was sometimes in error but never in doubt. And he was fiercely competitive, often debating until his adversaries simply gave up.
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
The Higgs Mass Hysteria
If Anything, the Hype of the Century

On July 4, 2012, at the famous CERN seminar, scientists applauded, cheered, celebrated. The news spread quickly all over the world that the Higgs had been discovered (nobody cared about the subtleties of "the Higgs" and "a Higgs"), allegedly the verification of an almost 50-year-old idea formulated by a Scottish theoretician. The nonsense starts right here.
Alexander Unzicker (The Higgs Fake - How Particle Physicists Fooled the Nobel Committee)
Barry Goldwater famously argued that extremism in the defense of liberty was no vice. Our story will show that it is.
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
zygotes by definition are rather limited in number and most scientists working in very early development use cells from a bit later, the famous embryonic stem (ES) cells.
Nessa Carey (The Epigenetics Revolution: How Modern Biology is Rewriting our Understanding of Genetics, Disease and Inheritance)
According to string theory, which Professor Tamashi and other scientists have been using to try to solve the Big Bang, in addition to the four dimensions of spacetime we know, there are six of these very small, curled-up dimensions, making ten all told. And the strings, which are little strands of energy, wiggle around vibrating in these ten dimensions.’ ‘Like Dennis’s mother,’ Mario, seeking vengeance for the ant slur, interjects, ‘wiggling around vibrating with her vibrator, because she is a famous slut, and also, she has ten dimensions because she is a fat bitch.
Paul Murray (Skippy Dies)
It is no good getting furious if you get stuck. What I do is keep thinking about the problem but work on something else. Sometimes it is years before I see the way forward. In the case of information loss and black holes, it was 29 years."
Charles River Editors (Stephen Hawking: The Life of the World’s Most Famous Scientist)
Many American boys that fought in WWII had been sterilized under eugenic laws upheld by the United States Supreme Court in the 1927 case of Buck v. Bell. Over 80,000 Americans would be forcibly sterilized under that legal precedent. Coincidentally, Buck v. Bell is also the legal precedent cited in Roe v. Wade, the famous abortion rights case.
A.E. Samaan (H.H. Laughlin: American Scientist, American Progressive, Nazi Collaborator (History of Eugenics, Vol. 2))
As with Lawrence, these other competitors in the field tended to be young, wholly untrained for the missions they were given, and largely unsupervised. And just as with their more famous British counterpart, to capitalize on their extraordinary freedom of action, these men drew upon a very particular set of personality traits—cleverness, bravery, a talent for treachery—to both forge their own destiny and alter the course of history. Among them was a fallen American aristocrat in his twenties who, as the only American field intelligence officer in the Middle East during World War I, would strongly influence his nation’s postwar policy in the region, even as he remained on the payroll of Standard Oil of New York. There was the young German scholar who, donning the camouflage of Arab robes, would seek to foment an Islamic jihad against the Western colonial powers, and who would carry his “war by revolution” ideas into the Nazi era. Along with them was a Jewish scientist who, under the cover of working for the Ottoman government, would establish an elaborate anti-Ottoman spy ring and play a crucial role in creating a Jewish homeland in Palestine. If little remembered today, these men shared something else with their British counterpart. Like Lawrence, they were not the senior generals who charted battlefield campaigns in the Middle East, nor the elder statesmen who drew lines on maps in the war’s aftermath. Instead, their roles were perhaps even more profound: it was they who created the conditions on the ground that brought those campaigns to fruition, who made those postwar policies and boundaries possible. History is always a collaborative effort, and in the case of World War I an effort that involved literally millions of players, but to a surprising degree, the subterranean and complex game these four men played, their hidden loyalties and personal duels, helped create the modern Middle East and, by inevitable extension, the world we live in today.
Scott Anderson (Lawrence in Arabia: War, Deceit, Imperial Folly, and the Making of the Modern Middle East)
Of Sir Isaac Newton’s momentous decipherment of the laws of the universe, the French scientist Pierre-Simon de Laplace famously told Napoleon, in his philosophical euphoria, that he no longer had need of God to make sense of creation. Secular science could henceforth exile God from his universe. In Joseph Smith’s conception, by contrast, naturalism and God co-exist.
Terryl L. Givens (Wrestling the Angel: The Foundations of Mormon Thought: Cosmos, God, Humanity)
My father had been a mathematician. A famous one. At least as far as any mathematician or scientist not named Einstein can be famous. Other mathematicians in his field knew his name. Nobody else did.
Mark Lawrence (One Word Kill (Impossible Times, #1))
Despite all the knowledge gained by scientists in the last few decades, this emotional realm remains much harder to reach. An obvious example on the southern coast is the Nazca, famous for the huge patterns they set into the ground. Figures of animals and plants, almost a thousand geometric symbols, arrow-straight lines many miles long—what were they for?
Charles C. Mann (1491: New Revelations of the Americas Before Columbus)
Curiously, we are the rare animal that actually likes the bitter taste of radicchio or black tea. I fear, however, that Americans raised on sugary things are losing the taste for things savory, sour, and bitter. It’s pitiful that commercial salad dressings contain sugar, and even sweet corn hybrids are much sweeter than when I was little. We’re not alone. In Britain, plant scientists are breeding sweeter hybrids of the brussels sprout, famous for its dour presence at Christmas lunch, but the more palatable sprouts may lack the healthy, bitter compounds.
Nina Planck (Real Food: What to Eat and Why)
Charles Babbage was a well-known scientist and inventor of the time. He had spent years working on his Difference Engine, a revolutionary mechanical calculator. Babbage was also known for his extravagant parties, which he called "gatherings of the mind" and hosted for the upper class, the well-known, and the very intelligent.4 Many of the most famous people from Victorian England would be there—from Charles Darwin to Florence Nightingale to Charles Dickens.

It was at one of these parties in 1833 that Ada glimpsed Babbage's half-built Difference Engine. The teenager's mathematical mind buzzed with possibilities, and Babbage recognized her genius immediately. They became fast friends. (The US Department of Defense uses a computer language named Ada in her honor.)

Babbage sent Ada home with thirty of his lab books filled with notes on his next invention: the Analytical Engine. It would be much faster and more accurate than the Difference Engine, and Ada was thrilled to learn of this more advanced calculating machine. She understood that it could solve even harder, more complex problems and could even make decisions by itself. It was a true "thinking machine."5 It had memory, a processor, and hardware and software just like computers today—but it was made from cogs and levers, and powered by steam.

For months, Ada worked furiously creating algorithms (math instructions) for Babbage's not-yet-built machine. She wrote countless lines of computations that would instruct the machine in how to solve complex math problems. These algorithms were the world's first computer program.

In 1840, Babbage gave a lecture in Italy about the Analytical Engine, which was written up in French. Ada translated the lecture, adding a set of her own notes to explain how the machine worked and including her own computations for it. These notes took Ada nine months to write and were three times longer than the article itself!

Ada had some awesome nicknames. She called herself "the Bride of Science" because of her desire to devote her life to science; Babbage called her "the Enchantress of Numbers" because of her seemingly magical math
Michelle R. McCann (More Girls Who Rocked the World: Heroines from Ada Lovelace to Misty Copeland)
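Lovelace's notes culminated, in her famous Note G, in a step-by-step plan for computing the Bernoulli numbers on the Analytical Engine. As a purely illustrative sketch of the same computation in a modern language (Python here; the recurrence below is a standard textbook one, not a transcription of her operation table):

from fractions import Fraction
from math import comb  # requires Python 3.8+

def bernoulli(n):
    """Bernoulli numbers B_0..B_n (convention B_1 = -1/2), via the
    recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0, solved for B_m."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))  # exact rational arithmetic throughout
    return B

print(bernoulli(8))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30

Note G worked tables like this out entirely by hand, for a machine that was never built.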
In 1950, he was accorded the dubious honor of being the first prominent scientist to appear on the earliest of Senator Joseph McCarthy’s famous lists of crypto-communists.
Sylvia Nasar (A Beautiful Mind)
NEXT day was fine and warm. 'We can go across to the island this morning,' said Aunt Fanny. 'We'll take our own food, because I'm sure Uncle Quentin will have forgotten we're coming.'

'Has he a boat there?' asked George. 'Mother, he hasn't taken my boat, has he?'

'No, dear,' said her mother. 'He's got another boat. I was afraid he would never be able to get it in and out of all those dangerous rocks round the island, but he got one of the fishermen to take him, and had his own boat towed behind, with all its stuff in.'

'Who built the tower?' asked Julian.

'Oh, he made out the plans himself, and some men were sent down from the Ministry of Research to put the tower up for him,' said Aunt Fanny. 'It was all rather hush-hush really. The people here were most curious about it, but they don't know any more than I do! No local man helped in the building, but one or two fishermen were hired to take the material to the island, and to land the men and so on.'

'It's all very mysterious,' said Julian. 'Uncle Quentin leads rather an exciting life, really, doesn't he? I wouldn't mind being a scientist myself. I want to be something really worthwhile when I grow up. I'm not just
Enid Blyton (Five On Kirrin Island Again (Famous Five Book 6))
What are we to do at any given moment, when we cannot say which of our current claims will be sustained and which will be rejected? This is one of the central questions that I have raised. Because we cannot know which of current claims will be sustained, the best we can do is to consider the weight of scientific evidence, the fulcrum of scientific opinion, and the trajectory of scientific knowledge. This is why consensus matters: If scientists are still debating a matter, then we may well be wise to "wait and see," if conditions permit.26 If the available empirical evidence is thin, we may want to do more research. But the uncertainty of future scientific knowledge should not be used as an excuse for delay. As the epidemiologist Sir Austin Bradford Hill famously argued, "All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time."27 At any given moment, it makes sense to make decisions on the information we have, and be prepared to alter our plans if future evidence warrants.
Naomi Oreskes (Why Trust Science? (The University Center for Human Values Series))
The most famous of these scientists was Wernher von Braun.
Eric Lichtblau (The Nazis Next Door: How America Became a Safe Haven for Hitler's Men)
Iron Man's success more than made up for that July's Incredible Hulk. The result of Marvel's most difficult production right up to the present, the second Hulk film starred Ed Norton, who proved a terrible fit for Maisel and Feige's philosophy that studio executives should be the ultimate creative authority. Undeniably one of the best actors of his generation, Norton is also famous in Hollywood for being "difficult" and highly opinionated, refusing to allow artistic choices he disagrees with and seeking to rewrite scripts he doesn't like, which is what he did on The Incredible Hulk. The clashes intensified in post-production, and the director, Louis Leterrier, sided with Norton over the studio. They both learned who has the ultimate power at Marvel, though, when Feige took control of editing. He excised many of the darkest scenes, including a suicide attempt meant to portray how much the scientist Bruce Banner wants to rid himself of the curse of transforming into the Hulk when he's mad. The resulting movie was still darker and more dramatic than any other Marvel Studios production and not different enough from the Hulk movie of 2003. It grossed only $263 million at the box office and barely broke even, the worst performance for any Marvel Studios film to date. The Incredible Hulk never got a sequel, but the character has returned in Avengers films, played by the easygoing Mark Ruffalo. The usually cheerful Feige stated that the decision to recast the role was "rooted in the need for an actor who embodies the creativity and collaborative spirit of our other talented cast members."
Ben Fritz (The Big Picture: The Fight for the Future of Movies)
We remember the famous curve in the shape of a hockey stick… However, no serious scientist still gives it the least credit.
Mark Steyn ("A Disgrace to the Profession")
The computer scientist Gerald Weinberg is famous for saying, “No matter what the problem is, it’s a people problem.
Mark Richards (Fundamentals of Software Architecture: An Engineering Approach)
Did anyone get a paper in the village?’ asked Dick. ‘Oh, you did, Julian. Good. Let’s have a look at the weather forecast. If it’s good we might go for a long walk this afternoon. The sea is not really very far off.’ Julian took the folded paper from his pocket and threw it over to Dick. He sat down on the steps of the caravan and opened it. He was looking for the paragraph giving the weather forecast when headlines caught his eye. He gave an exclamation. ‘Hallo! Here’s a bit more about those two vanished scientists, Julian!’ ‘Oh!’ said George, remembering Julian’s
Enid Blyton (Famous Five: 11: Five Have A Wonderful Time (Famous Five series))
But Nowlis, likely protecting his career, left quietly and, in fact, became a famous psychologist. Disgusted by what he called Kinsey's "outrageous" child sex abuse protocol, Nowlis stayed silent about these crimes against children until Jones interviewed him165 half a century thereafter.166 Why?
Judith Reisman (Sexual Sabotage: How One Mad Scientist Unleashed a Plague of Corruption and Contagion on America)
One other named Kinsey pedophile was Rex King, an American serial child rapist also known as "Mr. Braun," "Mr. Green," and "Mr. X." The "king" of child molesters is on record as raping at least eight hundred children, the youngest two months of age. Kinsey met King in about 1943 when King demonstrated his instant-orgasm ability for Kinsey and Pomeroy.84 Kinsey's mentor, the famous sexologist Robert Dickinson, MD, had "trained" King to keep child sex-abuse records.
Judith Reisman (Sexual Sabotage: How One Mad Scientist Unleashed a Plague of Corruption and Contagion on America)
The first out of the gate was The Ethics of Sexual Acts by Kinsey’s friend, René Guyon, a closet French pedophile jurist. The second was American Sexual Behavior and the Kinsey Report, by author/historian David Loth and Kinsey’s lawyer, Morris Ernst, the ACLU attorney. The third book was Sex Habits of American Men, a collection of essays, edited by journalist Albert Deutsch and written by world famous and stunningly foolish academicians.
Judith Reisman (Sexual Sabotage: How One Mad Scientist Unleashed a Plague of Corruption and Contagion on America)
Speaking at the Chaos Communication Congress, an annual computer hacker conference held in Berlin, Germany, Tobias Engel, founder of Sternraute, and Karsten Nohl, chief scientist for Security Research Labs, explained that they could not only locate cell-phone callers anywhere in the world, they could also listen in on their phone conversations. And if they couldn’t listen in real time, they could record the encrypted calls and texts for later decryption.
Kevin D. Mitnick (The Art of Invisibility: The World's Most Famous Hacker Teaches You How to Be Safe in the Age of Big Brother and Big Data)
The third quality that is needed for a scientist to become a public icon is wisdom. Besides being a famous joker and a famous genius, Feynman was also a wise human being whose answers to serious questions made sense.
Freeman Dyson (The Scientist as Rebel)
For the man on the street, science and math sound too abstract and soulless. It is hard to appreciate their significance. Most of us are just aware of Newton's apple trivia and Einstein's famous E = mc². Science, like philosophy, remains obscure and detached, playing little role in our daily lives. There is a general perception that science is hard to grasp and has no direct relevance to what we do. After all, how often do we discuss Dante or Descartes over dinner anyway? Some feel it to be too academic and leave it to the intellectuals or scientists to sort out, while others feel that such topics are good only for academic debate. The great physicist, Rutherford, once quipped that "if you can't explain a complex theory to a bartender, the theory is not worth it." Well, it could be easier said than done (applications of tools
Sharad Nalawade (The Speed Of Time)
donated skeletal collection; one more skull was just a final drop in the bucket. Megan and Todd Malone, a CT technician in the Radiology Department at UT Medical Center, ran skull 05-01 through the scanner, faceup, in a box that was packed with foam peanuts to hold it steady. Megan FedExed the scans to Quantico, where Diana and Phil Williams ran them through the experimental software. It was with high hopes, shortly after the scan, that I studied the computer screen showing the features ReFace had overlaid, with mathematical precision, atop the CT scan of Maybe-Leoma’s skull. Surely this image, I thought—the fruit of several years of collaboration by computer scientists, forensic artists, and anthropologists—would clearly settle the question of 05-01’s identity: Was she Leoma or was she Not-Leoma? Instead, the image merely amplified the question. The flesh-toned image on the screen—eyes closed, the features impassive—could have been a department-store mannequin, or a sphinx. There was nothing in the image, no matter how I rotated it in three dimensions, that said, “I am Leoma.” Nor was there anything that said, “I am not Leoma.” To borrow Winston Churchill’s famous description of Russia, the masklike face on the screen was “a riddle wrapped in a mystery inside an enigma.” Between the scan, the software, and the tissue-depth data that the software merged with the
Jefferson Bass (Identity Crisis: The Murder, the Mystery, and the Missing DNA (Kindle Single))