Famous Research Quotes

We've searched our database for all the quotes and captions related to Famous Research. Here they are! All 100 of them:

I began to come into close contact with poverty, with hunger, with disease, with the inability to cure a child because of a lack of resources… And I began to see there was something that, at that time, seemed to me almost as important as being a famous researcher or making some substantial contribution to medical science, and this was helping those people.
Ernesto Che Guevara (The Motorcycle Diaries: Notes on a Latin American Journey)
The trouble with Goodreads is that they never authenticate these quotations of famous people.
Aristotle (Physics)
life expectancy among working-class white Americans had been decreasing since the early 2000s. In modern history the only obvious parallel was with Russia in the desperate aftermath of the fall of the Soviet Union. One journalistic essay and academic research paper after another confirmed the disaster, until the narrative was capped in 2015 by Anne Case and Angus Deaton’s famous account of “deaths of despair.”
Adam Tooze (Crashed: How a Decade of Financial Crises Changed the World)
We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
It’s super-important to have a strong social media presence, and Jane’s always going, When interviewers ask you about your Twitter, say you love reaching out directly to your fans, and I’m like, I don’t even know how to use Twitter or what the password is because you disabled my laptop’s wireless and only let me go on the Internet to do homework research or email Nadine assignments, and she says, I’m doing you a big favor, it’s for nobodies who want to pretend like they’re famous and for self-promoting hacks without PR machines, and adults act like teenagers passing notes and everyone’s IQ drops thirty points on it.
Teddy Wayne (The Love Song of Jonny Valentine)
A separate, international team analyzed more than a half million research articles, and classified a paper as “novel” if it cited two other journals that had never before appeared together. Just one in ten papers made a new combination, and only one in twenty made multiple new combinations. The group tracked the impact of research papers over time. They saw that papers with new knowledge combinations were more likely to be published in less prestigious journals, and also much more likely to be ignored upon publication. They got off to a slow start in the world, but after three years, the papers with new knowledge combos surpassed the conventional papers, and began accumulating more citations from other scientists. Fifteen years after publication, studies that made multiple new knowledge combinations were way more likely to be in the top 1 percent of most-cited papers. To recap: work that builds bridges between disparate pieces of knowledge is less likely to be funded, less likely to appear in famous journals, more likely to be ignored upon publication, and then more likely in the long run to be a smash hit in the library of human knowledge.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
By the end of the day, we determined that we could provide chocolate therapy three times a day and research a chocolate protocol at the world-famous Hershey's Hospital. Do you think they provide it in IV formula?
Keith Desserich
In the scientific world, the syndrome known as 'great man's disease' happens when a famous researcher in one field develops strong opinions about another field that he or she does not understand, such as a chemist who decides that he is an expert in medicine or a physicist who decides that he is an expert in cognitive science. They have trouble accepting that they must go back to school before they can make pronouncements in a new field.
Paul Krugman (A Country Is Not a Company (Harvard Business Review Classics))
In a famous experiment conducted by NASA in the 1990s, researchers fed a variety of psychoactive substances to spiders to see how they would affect their web-making skills. The caffeinated spider spun a strangely cubist and utterly ineffective web, with oblique angles, openings big enough to let small birds through, and completely lacking in symmetry or a center. (The web was far more fanciful than the ones spun by spiders given cannabis or LSD.)
Michael Pollan (This Is Your Mind on Plants)
But when he instructed his staff to give the injections without telling patients they contained cancer cells, three young Jewish doctors refused, saying they wouldn’t conduct research on patients without their consent. All three knew about the research Nazis had done on Jewish prisoners. They also knew about the famous Nuremberg Trials.
Rebecca Skloot (The Immortal Life of Henrietta Lacks)
Lincoln is not the only famous leader to have battled depression. Winston Churchill lived with the ‘black dog’ for much of his life too. Watching a fire, he once remarked to a young researcher he was employing: ‘I know why logs spit. I know what it is to be consumed.’
Matt Haig (Reasons To Stay Alive)
She knew for a fact that being left-handed automatically made you special. Marie Curie, Albert Einstein, Linus Pauling, and Albert Schweitzer were all left-handed. Of course, no believable scientific theory could rest on such a small group of people. When Lindsay probed further, however, more proof emerged. Michelangelo, Leonardo da Vinci, M.C. Escher, Mark Twain, Hans Christian Andersen, Lewis Carroll, H.G. Wells, Eudora Welty, and Jessamyn West- all lefties. The lack of women in her research had initially bothered her until she mentioned it to Allegra. "Chalk that up to male chauvinism," she said. "Lots of left-handed women were geniuses. Janis Joplin was. All it means is that the macho-man researchers didn't bother asking."
Jo-Ann Mapson (The Owl & Moon Cafe)
Gene patents are the point of greatest concern in the debate over ownership of human biological materials, and how that ownership might interfere with science. As of 2005—the most recent year figures were available—the U.S. government had issued patents relating to the use of about 20 percent of known human genes, including genes for Alzheimer’s, asthma, colon cancer, and, most famously, breast cancer. This means pharmaceutical companies, scientists, and universities control what research can be done on those genes, and how much resulting therapies and diagnostic tests will cost. And some enforce their patents aggressively: Myriad Genetics, which holds the patents on the BRCA1 and BRCA2 genes responsible for most cases of hereditary breast and ovarian cancer, charges $3,000 to test for the genes. Myriad has been accused of creating a monopoly, since no one else can offer the test, and researchers can’t develop cheaper tests or new therapies without getting permission from Myriad and paying steep licensing fees. Scientists who’ve gone ahead with research involving the breast-cancer genes without Myriad’s permission have found themselves on the receiving end of cease-and-desist letters and threats of litigation.
Rebecca Skloot
Research shows that those who believe in a wrathful God are more likely to suffer from depression and anxiety disorders than those who believe in a loving, merciful God.
Tony Jones (Did God Kill Jesus?: Searching for Love in History's Most Famous Execution)
Malcolm Gladwell puts the "pop" in pop psychology, and although revered in lay circles, is roundly dismissed by experts - even by the researchers he makes famous.
Paul Gibbons (The Science of Successful Organizational Change: How Leaders Set Strategy, Change Behavior, and Create an Agile Culture)
As the famous saying goes, “Great minds discuss ideas; average minds discuss events; small minds discuss people.” But research suggests it deserves more credit than that.
Jamil Zaki (Hope for Cynics: The Surprising Science of Human Goodness)
There was some awareness back then about hidden gender bias, particularly because of research like the famous “Howard and Heidi” study. Two Columbia Business School professors had taken an HBS case study about a female venture capitalist named Heidi Roizen and, in half the classes they taught, presented exactly the same stories and qualifications but called her Howard. In surveys of the students, they came away believing that Howard was beloved—so competent! such a go-getter!—whereas Heidi was a power-hungry egomaniac. Same person, just a different name.
Ellen Pao (Reset: My Fight for Inclusion and Lasting Change)
Frankly, the overwhelming majority of academics have ignored the data explosion caused by the digital age. The world’s most famous sex researchers stick with the tried and true. They ask a few hundred subjects about their desires; they don’t ask sites like PornHub for their data. The world’s most famous linguists analyze individual texts; they largely ignore the patterns revealed in billions of books. The methodologies taught to graduate students in psychology, political science, and sociology have been, for the most part, untouched by the digital revolution. The broad, mostly unexplored terrain opened by the data explosion has been left to a small number of forward-thinking professors, rebellious grad students, and hobbyists. That will change.
Seth Stephens-Davidowitz (Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are)
He was nice enough, an old guy who got famous in the 1960s for doing drugs and getting high and calling it research, so you have to figure he was a bit of a flake and probably pretty immature, too.
Ruth Ozeki (A Tale for the Time Being)
Neurologically speaking, though, there are reasons we develop a confused sense of priorities when we’re in front of our computer screens. For one thing, email comes at unpredictable intervals, which, as B. F. Skinner famously showed with rats seeking pellets, is the most seductive and habit-forming reward pattern to the mammalian brain. (Think about it: would slot machines be half as thrilling if you knew when, and how often, you were going to get three cherries?) Jessie would later say as much to me when I asked her why she was “obsessed”—her word—with her email: “It’s like fishing. You just never know what you’re going to get.” More to the point, our nervous systems can become dysregulated when we sit in front of a screen. This, at least, is the theory of Linda Stone, formerly a researcher and senior executive at Microsoft Corporation. She notes that we often hold our breath or breathe shallowly when we’re working at our computers. She calls this phenomenon “email apnea” or “screen apnea.” “The result,” writes Stone in an email, “is a stress response. We become more agitated and impulsive than we’d ordinarily be.
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
As a firstborn I also had a duty to succeed my father and look after my mother and siblings. Although school taught me that this was an outdated practice and that I would have been better off focusing on inheriting my father's assets for my own benefit, it was the strong emphasis on family values that ultimately prevailed. This was not because they had sounded good on paper or had been presented by a world-famous researcher, but because I saw they worked through my experience.
Salatiso Lonwabo Mdeni (The Homeschooling Father, How and Why I got started.: Traditional Schooling to Online Learning until Homeschooling)
Religion has used ritual forever. I remember a famous study led by psychologist Alfred Tomatis of a group of clinically depressed monks. After much examination, researchers concluded that the group’s depression stemmed from their abandoning a twice-daily ritual of gathering to sing Gregorian chants. They had lost the sense of community and the comfort of singing together in harmony. Creating beautiful music together was a formal recognition of their connection and a shared moment of joy.
Sue Johnson (Hold Me Tight: Seven Conversations for a Lifetime of Love (The Dr. Sue Johnson Collection Book 1))
Maria Orsic, a stunning beauty and an unusual medium was not an obscure personality. She was known to many celebrities of the era and had a fleet of very powerful admirers and friends both in Germany and abroad; famous, brilliant and influential people like Charles Lindbergh, Nikola Tesla, Marshal Tito of Yugoslavia, Henry Ford, Eva Peron, and the most illustrious figures in the spiritualism, parapsychological and psychical research in Great Britain. This was reported by Allies intelligence and documented by OSS operatives in Europe.
Maximillien de Lafayette (Volume I. UFOs: MARIA ORSIC, THE WOMAN WHO ORIGINATED AND CREATED EARTH’S FIRST UFOS (Extraterrestrial and Man-Made UFOs & Flying Saucers Book 1))
As a professional philosopher, I very rarely hyperventilate while doing research, but Peirce was a notorious recluse. Most of his books had been sold or carried off to Harvard at the end of his life, but somehow this little treasure—Peirce’s own copy of his first and most famous publication—had ended up here.
John Kaag (American Philosophy: A Love Story)
For the benefit of your research people, I would like to mention (so as to avoid any duplication of labor): that the planet is very like Mars; that at least seventeen states have Pinedales; that the end of the top paragraph Galley 3 is an allusion to the famous "canals" (or, more correctly, "channels") of Schiaparelli (and Percival Lowell); that I have thoroughly studied the habits of chinchillas; that Charrete is old French and should have one "t"; that Boke's source on Galley 9 is accurate; that "Lancelotik" is not a Celtic diminutive but a Slavic one; that "Betelgeuze" is correctly spelled with a "z", not an "s" as some dictionaries have it; that the "Indigo" Knight is the result of some of my own research; that Sir Grummore, mentioned both in Le Morte Darthur and in Amadis de Gaul, was a Scotsman; that L'Eau Grise is a scholarly pun; and that neither bludgeons nor blandishments will make me give up the word "hobnailnobbing".
Vladimir Nabokov
The basic concept of microdosing is nothing new. Albert Hofmann, who first synthesized LSD in 1938, considered it one of the drug’s most promising, and least researched, applications. He was among the first to realize its antidepressant and cognition-enhancing potential,[vi] famously taking between 10 and 20 μg himself, twice a week, for the last few decades of his life.[vii]
Paul Austin (Microdosing Psychedelics: A Practical Guide to Upgrade Your Life)
The successful ideas survive scrutiny. The bad ideas get discarded. Conformity is also laughable to scientists attempting to advance their careers. The best way to get famous in your own lifetime is to pose an idea that counters prevailing research and that earns a consistency of observations and experiment. Healthy disagreement is a natural state on the bleeding edge of discovery.
Neil deGrasse Tyson (Starry Messenger: Cosmic Perspectives on Civilization)
So I did some research,” she went on. “The good thing about being a famous model is that you can call anyone and they’ll talk to you. So I called this illusionist I’d seen on Broadway a couple of years ago. He heard the story and then he laughed. I said what’s so funny. He asked me a question: Did this guru do this after dinner? I was surprised. What the hell could that have to do with it? But I said yes, how did you know? He asked if we had coffee. Again I said yes. Did he take his black? One more time I said yes.” Shauna was smiling now. “Do you know how he did it, Beck?” I shook my head. “No clue.” “When he passed the card to Wendy, it went over his coffee cup. Black coffee, Beck. It reflects like a mirror. That’s how he saw what I’d written. It was just a dumb parlor trick.
Harlan Coben (Tell No One)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Working hard is important. But more effort does not necessarily yield more results. “Less but better” does. Ferran Adrià, arguably the world’s greatest chef, who has led El Bulli to become the world’s most famous restaurant, epitomizes the principle of “less but better” in at least two ways. First, his specialty is reducing traditional dishes to their absolute essence and then re-imagining them in ways people have never thought of before. Second, while El Bulli has somewhere in the range of 2 million requests for dinner reservations each year, it serves only fifty people per night and closes for six months of the year. In fact, at the time of writing, Ferran had stopped serving food altogether and had instead turned El Bulli into a full-time food laboratory of sorts where he was continuing to pursue nothing but the essence of his craft.1 Getting used to the idea of “less but better” may prove harder than it sounds, especially when we have been rewarded in the past for doing more … and more and more. Yet at a certain point, more effort causes our progress to plateau and even stall. It’s true that the idea of a direct correlation between results and effort is appealing. It seems fair. Yet research across many fields paints a very different picture. Most people have heard of the “Pareto Principle,” the idea, introduced as far back as the 1790s by Vilfredo Pareto, that 20 percent of our efforts produce 80 percent of results. Much later, in 1951, in his Quality-Control Handbook, Joseph Moses Juran, one of the fathers of the quality movement, expanded on this idea and called it “the Law of the Vital Few.”2 His observation was that you could massively improve the quality of a product by resolving a tiny fraction of the problems. He found a willing test audience for this idea in Japan, which at the time had developed a rather poor reputation for producing low-cost, low-quality goods. By adopting a process in which a high percentage of effort and attention was channeled toward improving just those few things that were truly vital, he made the phrase “made in Japan” take on a totally new meaning. And gradually, the quality revolution led to Japan’s rise as a global economic power.3
Greg McKeown (Essentialism: The Disciplined Pursuit of Less)
Lederman is also a charismatic personality, famous among his colleagues for his humor and storytelling ability. One of his favorite anecdotes relates the time when, as a graduate student, he arranged to bump into Albert Einstein while walking the grounds at the Institute for Advanced Study at Princeton. The great man listened patiently as the eager youngster explained the particle-physics research he was doing at Columbia, and then said with a smile, “That is not interesting.”
Sean Carroll (The Particle at the End of the Universe: The Hunt for the Higgs Boson and the Discovery of a New World)
Starting something new in middle age might look that way too. Mark Zuckerberg famously noted that “young people are just smarter.” And yet a tech founder who is fifty years old is nearly twice as likely to start a blockbuster company as one who is thirty, and the thirty-year-old has a better shot than a twenty-year-old. Researchers at Northwestern, MIT, and the U.S. Census Bureau studied new tech companies and showed that among the fastest-growing start-ups, the average age of a founder was forty-five when the company was launched.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
The smartest person to ever walk this Earth in all probability lived and died herding goats on a mountain somewhere, with no way to disseminate their work globally even if they had realised they were super smart and had the means to do something with their abilities. I am not keen on 'who are the smartest' lists and websites because, as Scott Barry Kaufman points out, the concept of genius privileges the few who had the opportunity to see through and promote their life’s work, while excluding others who may have had equal or greater raw potential but lacked the practical and financial support, and the communication platform that famous names clearly had. This is why I am keen to develop, through my research work, a definition of genius from a cognitive neuroscience and psychometric point of view, so that whatever we decide that is and how it should be measured, only focuses on clearly measurable factors within the individual’s mind, regardless of their external achievements, eminence, popularity, wealth, public platform etc. In my view this would be both more equitable and more scientific.
Gwyneth Wesley Rolph
THREE FAMOUS ENGRAVINGS depict Alexis St. Martin in his youth. I’ve seen them many times, in biographies of his surgeon William Beaumont, in Beaumont’s own book, in journal articles about the pair. As detailed as the artworks are, you can’t tell what St. Martin looked like from examining them. All three woodcuts are of the lower portion of his left breast, and the famous hole. I could pick St. Martin’s nipple out of a lineup before I could his eyes. I suppose this makes sense; Beaumont was a researcher and St. Martin his subject—more a body than a man.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
By 1952, the University of Minnesota nutritionist Ancel Keys was arguing that high blood levels of cholesterol caused heart disease, and that it was the fat in our diets that drove up cholesterol levels. Keys had a conflict of interest: his research had been funded by the sugar industry—the Sugar Research Foundation and then the Sugar Association—since 1944, if not earlier, and the K-rations he had famously developed for the military during the war (the “K” is said to have stood for “Keys”) were loaded with sugar. This might have naturally led him to perceive something other than sugar as the problem. We can only guess.
Gary Taubes (The Case Against Sugar)
When I am asked to summarize the fundamental message from research on self-control, I recall Descartes’s famous dictum cogito, ergo sum—“I think, therefore I am.” What has been discovered about mind, brain, and self-control lets us move from his proposition to “I think, therefore I can change what I am.” Because by changing how we think, we can change what we feel, do, and become. If that leads to the question “But can I really change?,” I reply with what George Kelly said to his therapy clients when they kept asking him if they could get control of their lives. He looked straight into their eyes and said, “Would you like to?”
Walter Mischel
Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes.
Mariana Mazzucato
Some researchers, such as psychologist Jean Twenge, say this new world where compliments are better than sex and pizza, in which the self-enhancing bias has been unchained and allowed to gorge unfettered, has led to a new normal in which the positive illusions of several generations have now mutated into full-blown narcissism. In her book The Narcissism Epidemic, Twenge says her research shows that since the mid-1980s, clinically defined narcissism rates in the United States have increased in the population at the same rate as obesity. She used the same test used by psychiatrists to test for narcissism in patients and found that, in 2006, one in four U.S. college students tested positive. That’s real narcissism, the kind that leads to diagnoses of personality disorders. In her estimation, this is a dangerous trend, and it shows signs of acceleration. Narcissistic overconfidence crosses a line, says Twenge, and taints those things improved by a skosh of confidence. Over that line, you become less concerned with the well-being of others, more materialistic, and obsessed with status in addition to losing all the restraint normally preventing you from tragically overestimating your ability to manage or even survive risky situations. In her book, Twenge connects this trend to the housing market crash of the mid-2000s and the stark increase in reality programming during that same decade. According to Twenge, the drive to be famous for nothing went from being strange to predictable thanks to a generation or two of people raised by parents who artificially boosted self-esteem to ’roidtastic levels and then released them into a culture filled with new technologies that emerged right when those people needed them most to prop up their self-enhancement biases. By the time Twenge’s research was published, reality programming had spent twenty years perfecting itself, and the modern stars of those shows represent a tiny portion of the population who not only want to be on those shows, but who also know what they are getting into and still want to participate. Producers with the experience to know who will provide the best television entertainment to millions then cull that small group. The result is a new generation of celebrities with positive illusions so robust and potent that the narcissistic overconfidence of the modern American teenager by comparison is now much easier to see as normal.
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)
The person who discovered the answer was a retiring, self-funded scientist named Peter Mitchell who in the early 1960s inherited a fortune from the Wimpey house-building company and used it to set up a research center in a stately home in Cornwall. Mitchell was something of an eccentric. He wore shoulder-length hair and an earring at a time when that was especially unusual among serious scientists. He was also famously forgetful. At his daughter’s wedding, he approached another guest and confessed that she looked familiar, though he couldn’t quite place her. “I was your first wife,” she answered. Mitchell’s ideas were universally dismissed, not altogether surprisingly. As one chronicler has noted, “At the time that Mitchell proposed his hypothesis there was not a shred of evidence in support of it.” But he was eventually vindicated and in 1978 was awarded the Nobel Prize in Chemistry—an extraordinary accomplishment for someone who worked from a home lab.
Bill Bryson (The Body: A Guide for Occupants)
When I was a kid, my mother thought spinach was the healthiest food in the world because it contained so much iron. Getting enough iron was a big deal then because we didn't have 'iron-fortified' bread. Turns out that spinach is an okay source of iron, but no better than pizza, pistachio nuts, cooked lentils, or dried peaches. The spinach-iron myth grew out of a simple mathematical miscalculation: A researcher accidentally moved a decimal point one space, so he thought spinach had 10 times more iron than it did. The press reported it, and I had to eat spinach. Moving the decimal point was an honest mistake--but it's seldom that simple. If it happened today I'd suspect a spinach lobby was behind it. Businesses often twist science to make money. Lawyers do it to win cases. Political activists distort science to fit their agenda, bureaucrats to protect their turf. Reporters keep falling for it. Scientists sometimes go along with it because they like being famous.
John Stossel (Give Me a Break: How I Exposed Hucksters, Cheats, and Scam Artists and Became the Scourge of the Liberal Media...)
However that may be, after prolonged research on myself, I brought out the fundamental duplicity of the human being. Then I realized, as a result of delving in my memory, that modesty helped me to shine, humility to conquer, and virtue to oppress. I used to wage war by peaceful means and eventually used to achieve, through disinterested means, everything I desired. For instance, I never complained that my birthday was overlooked; people were even surprised, with a touch of admiration, by my discretion on this subject. But the reason for my disinterestedness was even more discreet: I longed to be forgotten in order to be able to complain to myself. Several days before the famous date (which I knew very well) I was on the alert, eager to let nothing slip that might arouse the attention and memory of those on whose lapse I was counting (didn’t I once go so far as to contemplate falsifying a friend’s calendar?). Once my solitude was thoroughly proved, I could surrender to the charms of a virile self-pity.
Albert Camus (The Fall)
Four thousand miles away in France, the old boys from the Haute-Loire Resistance wrote to each other to share the devastating news. They had enjoyed nearly forty years of freedom since spending a mere couple of months in Virginia’s presence in 1944. But the warrior they called La Madone had shown them hope, comradeship, courage, and the way to be the best version of themselves, and they had never forgotten. In the midst of hardship and fear, she had shared with them a fleeting but glorious state of happiness and the most vivid moment of their lives. The last of those famous Diane Irregulars—the ever-boyish Gabriel Eyraud, her chouchou—passed away in 2017 while I was researching Virginia’s story. Until the end of his days, he and the others who had known Virginia on the plateau liked to pause now and then to think of the woman in khaki who never, ever gave up on freedom. When they talked with awe and affection of her incredible exploits, they smiled and looked up at the wide, open skies with “les étoiles dans les yeux.”
Sonia Purnell (A Woman of No Importance: The Untold Story of the American Spy Who Helped Win World War II)
Look at the telephone; it would remind you of a unique scientist, Alexander Graham Bell. He, besides being a great inventor, was also a man of great compassion and service. In fact, much of the research which led to the development of the telephone was directed at finding solutions to the challenges of hearing impaired people and helping them to be able to listen and communicate. Bell’s mother and wife were both hearing impaired and it profoundly changed Bell’s outlook to science. He aimed to make devices which would help the hearing impaired. He started a special school in Boston to teach hearing impaired people in novel ways. It was these lessons which inspired him to work with sound and led to the invention of the telephone. Can you guess the name of the most famous student of Alexander Graham Bell? It was Helen Keller, the great author, activist and poet who was hearing and visually impaired. About her teacher, she once said that Bell dedicated his life to the penetration of that ‘inhuman silence which separates and estranges’.
A.P.J. Abdul Kalam (Learning How to Fly: Life Lessons for the Youth)
And then, as slowly as the light fades on a calm winter evening, something went out of our relationship. I say that selfishly. Perhaps I started to look for something which had never been there in the first place: passion, romance. I daresay that as I entered my forties I had a sense that somehow life was going past me. I had hardly experienced those emotions which for me have mostly come from reading books or watching television. I suppose that if there was anything unsatisfactory in our marriage, it was in my perception of it—the reality was unchanged. Perhaps I grew up from childhood to manhood too quickly. One minute I was cutting up frogs in the science lab at school, the next I was working for the National Centre for Fisheries Excellence and counting freshwater mussel populations on riverbeds. Somewhere in between, something had passed me by: adolescence, perhaps? Something immature, foolish yet intensely emotive, like those favourite songs I had recalled dimly as if being played on a distant radio, almost too far away to make out the words. I had doubts, yearnings, but I did not know why or what for. Whenever I tried to analyse our lives, and talk about it with Mary, she would say, ‘Darling, you are on the way to becoming one of the leading authorities in the world on caddis fly larvae. Don’t allow anything to deflect you from that. You may be rather inadequately paid, certainly compared with me you are, but excellence in any field is an achievement beyond value.’ I don’t know when we started drifting apart. When I told Mary about the project—I mean about researching the possibility of a salmon fishery in the Yemen—something changed. If there was a defining moment in our marriage, then that was it. It was ironical, in a sense. For the first time in my life I was doing something which might bring me international recognition and certainly would make me considerably better off—I could live for years off the lecture circuit alone, if the project was even half successful. Mary didn’t like it. I don’t know what part she didn’t like: the fact I might become more famous than her, the fact I might even become better paid than her. That makes her sound carping.
Paul Torday (Salmon Fishing in the Yemen)
Benjamin Libet, a scientist in the physiology department of the University of California, San Francisco, was a pioneering researcher into the nature of human consciousness. In one famous experiment he asked a study group to move their hands at a moment of their choosing while their brain activity was being monitored. Libet was seeking to identify what came first — the brain’s electrical activity to make the hand move or the person’s conscious intention to make their hand move. It had to be the second one, surely? But no. Brain activity to move the hand was triggered a full half a second before any conscious intention to move it…. John-Dylan Haynes, a neuroscientist at the Max Planck Institute for Human Cognitive and Brain Studies in Leipzig, Germany, led a later study that was able to predict an action ten seconds before people had a conscious intention to do it. What was all the stuff about free will? Frank Tong, a neuroscientist at Vanderbilt University in Nashville, Tennessee, said: “Ten seconds is a lifetime in terms of brain activity.” So where is it coming from if not ‘us,’ the conscious mind?
David Icke
Nartok shows me an example of Arctic “greens”: cutout number 13, Caribou Stomach Contents. Moss and lichen are tough to digest, unless, like caribou, you have a multichambered stomach in which to ferment them. So the Inuit let the caribou have a go at it first. I thought of Pat Moeller and what he’d said about wild dogs and other predators eating the stomachs and stomach contents of their prey first. “And wouldn’t we all,” he’d said, “be better off.” If we could strip away the influences of modern Western culture and media and the high-fructose, high-salt temptations of the junk-food sellers, would we all be eating like Inuit elders, instinctively gravitating to the most healthful, nutrient-diverse foods? Perhaps. It’s hard to say. There is a famous study from the 1930s involving a group of orphanage babies who, at mealtimes, were presented with a smorgasbord of thirty-four whole, healthy foods. Nothing was processed or prepared beyond mincing or mashing. Among the more standard offerings—fresh fruits and vegetables, eggs, milk, chicken, beef—the researcher, Clara Davis, included liver, kidney, brains, sweetbreads, and bone marrow. The babies shunned liver and kidney (as well as all ten vegetables, haddock, and pineapple), but brains and sweetbreads did not turn up among the low-preference foods she listed. And the most popular item of all? Bone marrow.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
Example: a famous-to-economists finding in behavioral economics concerns pricing, and the fact that people have a provable bias towards the middle of three prices. It was first demonstrated with an experiment in beer pricing: when there were two beers, a third of people chose the cheaper; adding an even cheaper beer made the share of that beer go up, because it was now in the middle of three prices; adding an even more expensive beer at the top, and dropping the cheapest beer, made the share of the new beer in the middle (which had previously been the most expensive) go up from two-thirds to 90 percent. Having a price above and a price below makes the price in the middle seem more appealing. This experiment has been repeated with other consumer goods, such as ovens, and is now a much-used strategy in the corporate world. Basically, if you have two prices for something, and want to make more people pay the higher price, you add a third, even higher price; that makes the formerly highest price more attractive. Watch out for this strategy. (The research paper about beer pricing, written by a trio of economists at Duke University in 1982, was published in the Journal of Consumer Research. It’s called “Adding Asymmetrically Dominated Alternatives: Violations of Regularity and the Similarity Hypothesis”—which must surely be the least engaging title ever given to an article about beer.)
John Lanchester (How to Speak Money: What the Money People Say-And What It Really Means)
But the basis of Freud's ideas aren't accepted by all philosophers, though many accept that he was right about the possibility of unconscious thought. Some have claimed that Freud's theories are unscientific. Most famously, Karl Popper (whose ideas are more fully discussed in Chapter 36) described many of the ideas of psychoanalysis as ‘unfalsifiable’. This wasn't a compliment, but a criticism. For Popper, the essence of scientific research was that it could be tested; that is, there could be some possible observation that would show that it was false. In Popper's example, the actions of a man who pushed a child into a river, and a man who dived in to save a drowning child were, like all human behaviour, equally open to Freudian explanation. Whether someone tried to drown or save a child, Freud's theory could explain it. He would probably say that the first man was repressing some aspect of his Oedipal conflict, and that led to his violent behaviour, whereas the second man had ‘sublimated’ his unconscious desires, that is, managed to steer them into socially useful actions. If every possible observation is taken as further evidence that the theory is true, whatever that observation is, and no imaginable evidence could show that it was false, Popper believed, the theory couldn't be scientific at all. Freud, on the other hand, might have argued that Popper had some kind of repressed desire that made him so aggressive towards psychoanalysis.
Nigel Warburton (A Little History of Philosophy (Little Histories))
For years I’ve been asking myself (and my readers) whether these propagandists—commonly called corporate or capitalist journalists—are evil or stupid. I vacillate day by day. Most often I think both. But today I’m thinking evil. Here’s why. You may have heard of John Stossel. He’s a long-term analyst, now anchor, on a television program called 20/20, and is most famous for his segment called “Give Me A Break,” in which, to use his language, he debunks commonly held myths. Most of the rest of us would call what he does “lying to serve corporations.” For example, in one of his segments, he claimed that “buying organic [vegetables] could kill you.” He stated that specially commissioned studies had found no pesticide residues on either organically grown or pesticide-grown fruits and vegetables, and had found further that organic foods are covered with dangerous strains of E. coli. But the researchers Stossel cited later stated he misrepresented their research. The reason they didn’t find any pesticides is because they never tested for them (they were never asked to). Further, they said Stossel misrepresented the tests on E. coli. Stossel refused to issue a retraction. Worse, the network aired the piece two more times. And still worse, it came out later that 20/20’s executive director Victor Neufeld knew about the test results and knew that Stossel was lying a full three months before the original broadcast.391 This is not unusual for Stossel and company.
Derrick Jensen (Endgame, Vol. 1: The Problem of Civilization)
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis,
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options.35 On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.”36 That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Take the famous slogan on the atheist bus in London … “There’s probably no God. Now stop worrying and enjoy your life.” … The word that offends against realism here is “enjoy.” I’m sorry—enjoy your life? Enjoy your life? I’m not making some kind of neo-puritan objection to enjoyment. Enjoyment is lovely. Enjoyment is great. The more enjoyment the better. But enjoyment is one emotion … Only sometimes, when you’re being lucky, will you stand in a relationship to what’s happening to you where you’ll gaze at it with warm, approving satisfaction. The rest of the time, you’ll be busy feeling hope, boredom, curiosity, anxiety, irritation, fear, joy, bewilderment, hate, tenderness, despair, relief, exhaustion … This really is a bizarre category error. But not necessarily an innocent one … The implication of the bus slogan is that enjoyment would be your natural state if you weren’t being “worried” by us believer … Take away the malignant threat of God-talk, and you would revert to continuous pleasure, under cloudless skies. What’s so wrong with this, apart from it being total bollocks? … Suppose, as the atheist bus goes by, that you are the fifty-something woman with the Tesco bags, trudging home to find out whether your dementing lover has smeared the walls of the flat with her own shit again. Yesterday when she did it, you hit her, and she mewled till her face was a mess of tears and mucus which you also had to clean up. The only thing that would ease the weight on your heart would be to tell the funniest, sharpest-tongued person you know about it: but that person no longer inhabits the creature who will meet you when you unlock the door. Respite care would help, but nothing will restore your sweetheart, your true love, your darling, your joy. Or suppose you’re that boy in the wheelchair, the one with the spasming corkscrew limbs and the funny-looking head. You’ve never been able to talk, but one of your hands has been enough under your control to tap out messages. Now the electrical storm in your nervous system is spreading there too, and your fingers tap more errors than readable words. Soon your narrow channel to the world will close altogether, and you’ll be left all alone in the hulk of your body. Research into the genetics of your disease may abolish it altogether in later generations, but it won’t rescue you. Or suppose you’re that skanky-looking woman in the doorway, the one with the rat’s nest of dreadlocks. Two days ago you skedaddled from rehab. The first couple of hits were great: your tolerance had gone right down, over two weeks of abstinence and square meals, and the rush of bliss was the way it used to be when you began. But now you’re back in the grind, and the news is trickling through you that you’ve fucked up big time. Always before you’ve had this story you tell yourself about getting clean, but now you see it isn’t true, now you know you haven’t the strength. Social services will be keeping your little boy. And in about half an hour you’ll be giving someone a blowjob for a fiver behind the bus station. Better drugs policy might help, but it won’t ease the need, and the shame over the need, and the need to wipe away the shame. So when the atheist bus comes by, and tells you that there’s probably no God so you should stop worrying and enjoy your life, the slogan is not just bitterly inappropriate in mood. What it means, if it’s true, is that anyone who isn’t enjoying themselves is entirely on their own. 
The three of you are, for instance; you’re all three locked in your unshareable situations, banged up for good in cells no other human being can enter. What the atheist bus says is: there’s no help coming … But let’s be clear about the emotional logic of the bus’s message. It amounts to a denial of hope or consolation, on any but the most chirpy, squeaky, bubble-gummy reading of the human situation. St Augustine called this kind of thing “cruel optimism” fifteen hundred years ago, and it’s still cruel.
Francis Spufford
I’ve worn Niki’s pants for two days now. I thought a third day in the same clothes might be pushing it.” Ian shrugged with indifference. “It might send Derian through the roof, but it doesn’t bother me. Wear what you want to wear.” Eena wrinkled her nose at him. “Do you really feel that way or are you trying to appear more laissez-faire than Derian?” “More laissez-faire?” “Yes. That’s a real word.” “Two words actually,” he grinned. “Laissez faire et laissez passer, le monde va de lui même!" He coated the words with a heavy French accent. Eena gawked at him. “Since when do you speak French?” “I don’t.” Ian chuckled. “But I did do some research in world history the year I followed you around on Earth. Physics was a joke, but history—that I found fascinating.” Slapping a hand against her chest, Eena exclaimed, “I can’t believe it! Unbeknownst to me, Ian actually studied something in high school other than the library’s collection of sci-fi paperbacks!” He grimaced at her exaggerated performance before defending his preferred choice of reading material. “Hey, popular literature is a valuable and enlightening form of world history. You would know that if you read a book or two.” She ignored his reproach and asked with curiosity, “What exactly did you say?” “In French?” “Duh, yes.” “Don’t ‘duh’ me, you could easily have been referring to my remark about enlightening literature. I know the value of a good book is hard for you to comprehend.” He grinned crookedly at her look of offense and then moved into an English translation of his French quote. “Let it do and let it pass, the world goes on by itself.” “Hmm. And where did that saying come from?” Ian delivered his answer with a surprisingly straight face. “That is what the French Monarch said when his queen began dressing casually. The French revolution started one week following that famous declaration, right after the queen was beheaded by the rest of the aristocracy in her favorite pair of scroungy jeans.” “You are such a brazen-tongued liar!
Richelle E. Goodrich (Eena, The Companionship of the Dragon's Soul (The Harrowbethian Saga #6))
This, in turn, has given us a “unified theory of aging” that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of “junk” molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die. But there is a small but important loophole in the second law that states total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it’s possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde’s famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.) The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back. As Nobel laureate Richard Feynman once said, “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human’s body will be cured.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
The Extraordinary Persons Project
In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed: 1. Australopithecus 2. Homo habilis 3. Homo erectus 4. Homo sapiens Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans. Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation." By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived at different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neandarthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution although he was himself one of the leading advocates of evolution in the twentieth century: What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth. Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
Research tells us that brainstorming becomes more productive when it’s focused. As jazz great Charles Mingus famously said, “You can’t improvise on nothing, man; you’ve gotta improvise on something.”
Chip Heath (The Myth of the Garage: And Other Minor Surprises)
In 1968, elementary school teacher Jane Elliott conducted a famous experiment with her students in the days after the assassination of Dr. Martin Luther King Jr. She divided the class by eye color. The brown-eyed children were told they were better. They were the “in-group.” The blue-eyed children were told they were less than the brown-eyed children—hence becoming the “out-group.” Suddenly, former classmates who had once played happily side by side were taunting and torturing one another on the playground. Lest we assign greater morality to the “out-group,” the blue-eyed children were just as quick to attack the brown-eyed children once the roles were reversed.6 Since Elliott’s experiment, researchers have conducted thousands of studies to understand the in-group/out-group response. Now, with fMRI scans, these researchers can actually see which parts of our brains fire up when perceiving a member of an out-group. In a phenomenon called the out-group homogeneity effect, we are more likely to see members of our groups as unique and individually motivated—and more likely to see a member of the out-group as the same as everyone else in that group. When we encounter this out-group member, our amygdala—the part of our brain that processes anger and fear—is more likely to become active. The more we perceive this person outside our group as a threat, the more willing we are to treat them badly.
Sarah Stewart Holland (I Think You're Wrong (But I'm Listening): A Guide to Grace-Filled Political Conversations)
Research from Brunel University shows that chess students who trained with coaches increased on average 168 points in their national ratings versus those who didn’t. Though long hours of deliberate practice are unavoidable in the cognitively complex arena of chess, the presence of a coach for mentorship gives players a clear advantage. Chess prodigy Joshua Waitzkin (the subject of the film Searching for Bobby Fischer) for example, accelerated his career when national chess master Bruce Pandolfini discovered him playing chess in Washington Square Park in New York as a boy. Pandolfini coached young Waitzkin one on one, and the boy won a slew of chess championships, setting a world record at an implausibly young age. Business research backs this up, too. Analysis shows that entrepreneurs who have mentors end up raising seven times as much capital for their businesses, and experience 3.5 times faster growth than those without mentors. And in fact, of the companies surveyed, few managed to scale a profitable business model without a mentor’s aid. Even Steve Jobs, the famously visionary and dictatorial founder of Apple, relied on mentors, such as former football coach and Intuit CEO Bill Campbell, to keep himself sharp. SO, DATA INDICATES THAT those who train with successful people who’ve “been there” tend to achieve success faster. The winning formula, it seems, is to seek out the world’s best and convince them to coach us. Except there’s one small wrinkle. That’s not quite true. We just held up Justin Bieber as an example of great, rapid-mentorship success. But since his rapid rise, he’s gotten into an increasing amount of trouble. Fights. DUIs. Resisting arrest. Drugs. At least one story about egging someone’s house. It appears that Bieber started unraveling nearly as quickly as he rocketed to Billboard number one. OK, first of all, Bieber’s young. He’s acting like the rock star he is. But his mentor, Usher, also got to Billboard number one at age 18, and he managed to dominate pop music for a decade without DUIs or egg-vandalism incidents. Could it be that Bieber missed something in the mentorship process? History, it turns out, is full of people who’ve been lucky enough to have amazing mentors and have stumbled anyway.
Shane Snow (Smartcuts: The Breakthrough Power of Lateral Thinking)
The dementia that is caused by the same vascular problems that lead to stroke is clearly affected by diet. In a publication from the famous Framingham Study, researchers conclude that for every three additional servings of fruits and vegetables a day, the risk of stroke will be reduced by 22%.73
T. Colin Campbell (The China Study: The Most Comprehensive Study of Nutrition Ever Conducted and the Startling Implications for Diet, Weight Loss, and Long-term Health)
Minsky was an ardent supporter of the Cyc project, the most notorious failure in the history of AI. The goal of Cyc was to solve AI by entering into a computer all the necessary knowledge. When the project began in the 1980s, its leader, Doug Lenat, confidently predicted success within a decade. Thirty years later, Cyc continues to grow without end in sight, and commonsense reasoning still eludes it. Ironically, Lenat has belatedly embraced populating Cyc by mining the web, not because Cyc can read, but because there’s no other way. Even if by some miracle we managed to finish coding up all the necessary pieces, our troubles would be just beginning. Over the years, a number of research groups have attempted to build complete intelligent agents by putting together algorithms for vision, speech recognition, language understanding, reasoning, planning, navigation, manipulation, and so on. Without a unifying framework, these attempts soon hit an insurmountable wall of complexity: too many moving parts, too many interactions, too many bugs for poor human software engineers to cope with. Knowledge engineers believe AI is just an engineering problem, but we have not yet reached the point where engineering can take us the rest of the way. In 1962, when Kennedy gave his famous moon-shot speech, going to the moon was an engineering problem. In 1662, it wasn’t, and that’s closer to where AI is today. In industry, there’s no sign that knowledge engineering will ever be able to compete with machine learning outside of a few niche areas. Why pay experts to slowly and painfully encode knowledge into a form computers can understand, when you can extract it from data at a fraction of the cost? What about all the things the experts don’t know but you can discover from data? And when data is not available, the cost of knowledge engineering seldom exceeds the benefit. Imagine if farmers had to engineer each cornstalk in turn, instead of sowing the seeds and letting them grow: we would all starve.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
The first eye-opener came in the 1970s, when DARPA, the Pentagon’s research arm, organized the first large-scale speech recognition project. To everyone’s surprise, a simple sequential learner of the type Chomsky derided handily beat a sophisticated knowledge-based system. Learners like it are now used in just about every speech recognizer, including Siri. Fred Jelinek, head of the speech group at IBM, famously quipped that “every time I fire a linguist, the recognizer’s performance goes up.” Stuck in the knowledge-engineering mire, computational linguistics had a near-death experience in the late 1980s. Since then, learning-based methods have swept the field, to the point where it’s hard to find a paper devoid of learning in a computational linguistics conference. Statistical parsers analyze language with accuracy close to that of humans, where hand-coded ones lagged far behind. Machine translation, spelling correction, part-of-speech tagging, word sense disambiguation, question answering, dialogue, summarization: the best systems in these areas all use learning. Watson, the Jeopardy! computer champion, would not have been possible without it.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
In a famous 1987 study, researchers Michael Diehl and Wolfgang Stroebe from Tübingen University in Germany concluded that brainstorming groups have never outperformed virtual groups.7
Frans Johansson (Medici Effect: What Elephants and Epidemics Can Teach Us About Innovation)
Experiments published in 1983 clearly showed that subjects could choose not to perform a movement that was on the cusp of occurring (that is, that their brain was preparing to make) and that was preceded by a large readiness potential. In this view, although the physical sensation of an urge to move is initiated unconsciously, will can still control the outcome by vetoing the action. Later researchers, in fact, reported readiness potentials that precede a planned foot movement not by mere milliseconds but by almost two full seconds, leaving free won’t an even larger window of opportunity. “Conscious will could thus affect the outcome of the volitional process even though the latter was initiated by unconscious cerebral processes,” Libet says. “Conscious will might block or veto the process, so that no act occurs.” Everyone, Libet continues, has had the experience of “vetoing a spontaneous urge to perform some act. This often occurs when the urge to act involves some socially unacceptable consequence, like an urge to shout some obscenity at the professor.” Volunteers report something quite consistent with this view of the will as wielding veto power. Sometimes, they told Libet, a conscious urge to move seemed to bubble up from somewhere, but they suppressed it. Although the possibility of moving gets under way some 350 milliseconds before the subject experiences the will to move, that sense of will nevertheless kicks in 150 to 200 milliseconds before the muscle moves—and with it the power to call a halt to the proceedings. Libet’s findings suggest that free will operates not to initiate a voluntary act but to allow or suppress it. “We may view the unconscious initiatives for voluntary actions as ‘bubbling up’ in the brain,” he explains. “The conscious will then selects which of these initiatives may go forward to an action or which ones to veto and abort…. This kind of role for free will is actually in accord with religious and ethical strictures. These commonly advocate that you ‘control yourself.’ Most of the Ten Commandments are ‘do not’ orders.” And all five of the basic moral precepts of Buddhism are restraints: refraining from killing, from lying, from stealing, from sexual misconduct, from intoxicants. In the Buddha’s famous dictum, “Restraint everywhere is excellent.
Jeffrey M. Schwartz (The Mind & The Brain: Neuroplasticity and the Power of Mental Force)
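Taking Libet's figures at face value gives a rough timeline; the midpoint values used below are an interpolation for illustration, not numbers from the book. With the muscle movement at \(t = 0\):

\[
t_{\text{readiness potential}} \approx -(350 + 175)\,\text{ms} \approx -525\,\text{ms}, \qquad t_{\text{conscious urge}} \approx -175\,\text{ms}, \qquad t_{\text{movement}} = 0,
\]

which leaves the 150 to 200 ms between the felt urge and the movement as the window in which the "veto" can operate.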
Researchers may have some conscious or unconscious bias, either because of a strongly held prior belief or because a positive finding would be better for their career. (No one ever gets rich or famous by proving what doesn't cause cancer.)
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
Lisa Brooks PhD was one of just 175 people to receive a fellowship from the famous John Simon Guggenheim Memorial Foundation. Lisa Brooks PhD received the award based on her prior accomplishments in history, geography and literature and the future promise of her research.
Lisa Brooks PhD
The phrase Daring Greatly is from Theodore Roosevelt's speech "Citizenship in a Republic." The speech, sometimes referred to as "The Man in the Arena," was delivered at the Sorbonne in Paris, France, on April 23, 1910. This is the passage that made the speech famous: "It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows the great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least he fails while daring greatly..." The first time I read this quote, I thought, This is vulnerability. Everything I've learned from over a decade of research on vulnerability has taught me this exact lesson. Vulnerability is not knowing victory or defeat, it's understanding the necessity of both; it's engaging. It's being all in.
Brené Brown (Daring Greatly: How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead)
Third, the idea that venture capitalists get into deals on the strength of their brands can be exaggerated. A deal seen by a partner at Sequoia will also be seen by rivals at other firms: in a fragmented cottage industry, there is no lack of competition. Often, winning the deal depends on skill as much as brand: it’s about understanding the business model well enough to impress the entrepreneur; it’s about judging what valuation might be reasonable. One careful tally concluded that new or emerging venture partnerships capture around half the gains in the top deals, and there are myriad examples of famous VCs having a chance to invest and then flubbing it.[6] Andreessen Horowitz passed on Uber. Its brand could not save it. Peter Thiel was an early investor in Stripe. He lacked the conviction to invest as much as Sequoia. As to the idea that branded venture partnerships have the “privilege” of participating in supposedly less risky late-stage investment rounds, this depends from deal to deal. A unicorn’s momentum usually translates into an extremely high price for its shares. In the cases of Uber and especially WeWork, some late-stage investors lost millions. Fourth, the anti-skill thesis underplays venture capitalists’ contributions to portfolio companies. Admittedly, these contributions can be difficult to pin down. Starting with Arthur Rock, who chaired the board of Intel for thirty-three years, most venture capitalists have avoided the limelight. They are the coaches, not the athletes. But this book has excavated multiple cases in which VC coaching made all the difference. Don Valentine rescued Atari and then Cisco from chaos. Peter Barris of NEA saw how UUNET could become the new GE Information Services. John Doerr persuaded the Googlers to work with Eric Schmidt. Ben Horowitz steered Nicira and Okta through their formative moments. To be sure, stories of venture capitalists guiding portfolio companies may exaggerate VCs’ importance: in at least some of these cases, the founders might have solved their own problems without advice from their investors. But quantitative research suggests that venture capitalists do make a positive impact: studies repeatedly find that startups backed by high-quality VCs are more likely to succeed than others.[7] A quirky contribution to this literature looks at what happens when airline routes make it easier for a venture capitalist to visit a startup. When the trip becomes simpler, the startup performs better.[8]
Sebastian Mallaby (The Power Law: Venture Capital and the Making of the New Future)
Frankly, I see it as a “social force.” Before coming to this country (in naive pursuit of scientific freedom . . .) Dr. Reich had successively gotten expelled from the International Psychoanalytical Association for sounding too Marxist, expelled from the Communist Party for sounding too Freudian, expelled from the Socialist Party for sounding too anarchistic, fled Hitler for having known Jewish genes, and then got driven out of Sweden by a campaign of slander in the sensational press (for doing the kind of sex research that later made Masters and Johnson famous.) I would say Dr. Reich touched on a lot of hot issues and annoyed a lot of dogmatic people of Left and Right, and this created a truly international “social force” for the suppression of his ideas.
Robert Anton Wilson (Cosmic Trigger III: My Life After Death)
The properties of the renewal tissues enabled the original definition of stem cell behaviour in terms of the ability to self-renew and to generate differentiated progeny. But the most famous stem cell of them all is now the embryonic stem cell (ES cell). In one sense, the ES cell is the iconic stem cell. It is the type of stem cell that has attracted all of the ethical controversy, and it is what lay people are thinking of when they refer to ‘stem cell research’. But ironically, the embryonic stem cell does not exist in nature. It is a creature that has been created by mankind and exists only in the world of tissue culture: the growth of cells in flasks in the laboratory, kept in temperature-controlled incubators, exposed to controlled concentrations of oxygen and carbon dioxide, and nourished by complex artificial media. Cells grown in culture are often referred to by the Latin phrase in vitro (in glass, since the relevant containers used to be made of glass) and distinguished from in vivo (inside the living body).
Jonathan M.W. Slack (Stem Cells: A Very Short Introduction)
As a chief ingredient in the mythology of science, the accumulation of objective facts supposedly controls the history of conceptual change–as logical and self-effacing scientists bow before the dictates of nature and willingly change their views to accommodate the growth of conceptual knowledge. The paradigm for such an idealistic notion remains Huxley’s famous remark about “a beautiful theory killed by a nasty, ugly little fact.” But single facts almost never slay worldviews, at least not right away (and properly so, for the majority of deeply anomalous observations turn out to be wrong)... Anomalous facts get incorporated into existing theories, often with a bit of forced stretching to be sure, but usually with decent fit because most worldviews contain considerable flexibility. (How else could they last so long, or be so recalcitrant to overthrow?)
Stephen Jay Gould (Leonardo's Mountain of Clams and the Diet of Worms: Essays on Natural History)
So interesting that Shore decided there might be a book in it. He set out to find fertile pairs—people who had been together for at least five years and produced interesting work. By the time he was done he had interviewed a comedy duo; two concert pianists who had started performing together because one of them had stage fright; two women who wrote mysteries under the name “Emma Lathen”; and a famous pair of British nutritionists, McCance and Widdowson, who were so tightly linked that they’d dropped their first names from the jackets of their books. “They were very huffy about the idea that dark bread was more nutritious than white bread,” recalled Shore. “They had produced the research that it wasn’t so in 1934—so why didn’t people stop fooling around with the idea?” Just about every work couple that Shore called were intrigued enough by their own relationships to want to talk about them. The only exceptions were “a mean pair of physicists” and, after flirting with participating, the British ice dancers Torvill and Dean. Among those who agreed to sit down with Miles Shore were Amos Tversky and Daniel Kahneman.
Michael Lewis (The Undoing Project: A Friendship That Changed Our Minds)
“Innovation amateurs talk good ideas,” he says. “Innovation experts talk testable hypotheses.” A hypothesis, embodied as a prototype, beats market research because it can be tested. The word prototype comes from the Greek words protos and typos, meaning “first form.” Customers don’t have to imagine how they would feel when they see a prototype. They’re already feeling it. Steve Jobs, although famous for rejecting market research, insisted that Apple designers make and test hundreds of prototypes before deciding on the final form of a new product.
Marty Neumeier (Brand Flip, The: Why customers now run companies and how to profit from it (Voices That Matter))
Science has an established tradition when it comes to aspects of nature that defy logical explanation and resist experimentation. It ignores them. This is actually a proper response, since no one wants researchers offering fudgy guesses. Official silence may not be helpful, but it is respectable. However, as a result, the very word “consciousness” may seem out of place in science books or articles, despite the fact that, as we’ll see, most famous names in quantum mechanics regarded it as central to the understanding of the cosmos.
Robert Lanza (The Grand Biocentric Design: How Life Creates Reality)
Harvard Business Review wrote an article[2] stating that the ideal praise to criticism ratio in relationships is 5 to 1—five positive comments for every negative one. John Gottman, the famous researcher from the Gottman Institute who started studying couples in the 1970s in his research lab, found that for people who end up in divorce, the ratio is 0.77 to 1. This means three positive comments for every four negative ones.[3]
Brian Keephimattracted (F*CK Him! - Nice Girls Always Finish Single)
In November of 1997, the New Jersey–based independent radio station WFMU broadcast a live forty-seven-minute interview with Ronald Thomas Clontle, the author of an upcoming book titled Rock, Rot & Rule. The book, billed as “the ultimate argument settler,” was (theoretically) a listing of almost every musical artist of the past fifty years, with each act designated as “rocking,” “rotting,” or “ruling” (with most of the research conducted in a coffeehouse in Lawrence, Kansas). The interview was, of course, a now semi-famous hoax. The book is not real and “Ronald Thomas Clontle” was actually Jon Wurster, the drummer for indie bands like Superchunk and (later) the Mountain Goats. Rock, Rot & Rule is a signature example of what’s now awkwardly classified as “late-nineties alt comedy,” performed at the highest possible level—the tone is understated, the sensibility is committed and absurd, and the unrehearsed chemistry between Wurster and the program’s host (comedian Tom Scharpling) is otherworldly. The sketch would seem like the ideal comedic offering for the insular audience of WFMU, a self-selecting group of sophisticated music obsessives from the New York metropolitan area. Yet when one relistens to the original Rock, Rot & Rule broadcast, the most salient element is not the comedy. It’s the apoplectic phone calls from random WFMU listeners. The callers do not recognize this interview as a hoax, and they’re definitely not “ironic” or “apathetic.” They display none of the savvy characteristics now associated with nineties culture. Their anger is almost innocent.
Chuck Klosterman (The Nineties: A Book)
The grand idea was an atlas. A collection of maps, both of real places and of imagined ones, but reversed. She and Daniel had come up with a list of books, fantasy novels famous for the beautiful maps created just for them—Tolkien’s The Lord of the Rings; Le Guin’s Earthsea series; Lewis’s The Chronicles of Narnia books; Dragt’s De brief voor de koning, The Letter for the King; Pratchett’s Discworld novels—and another list of maps from our real world, famous for their cartographic significance. We would painstakingly research all of them, studying them from historical, scientific, and artistic angles, and then redraw them in the opposite style. Our recreations of the fantasy maps would be rigidly detailed and precise, and our re-creations of the realistic maps would be embellished, expanded, and dreamlike, like their fictional cousins. Once complete, we planned to publish it in one giant volume. Readers would open it, expecting the same old type of atlas, but instead, they’d find previously familiar lands rendered in a completely unexpected manner, opening their imaginations to an entirely new way of looking at maps.
Peng Shepherd (The Cartographers)
RAND proved formative. Some of its employees joked that it stood for “Research And No Development,” and its intellectualism was inspiring to the young economist. The think tank’s ethos was to work on problems so hard that they might actually be unsolvable.9 Four days of the week were dedicated to RAND projects, but the fifth was free for freewheeling personal research. Ken Arrow, a famous economist, and John Nash, the game theorist immortalized in the film A Beautiful Mind, both consulted for RAND around the time Sharpe was there. The eclecticism of RAND’s research community is reflected in his first published works, which were a proposal for a smog tax and a review of aircraft compartment design criteria for Army deployments.
Robin Wigglesworth (Trillions: How a Band of Wall Street Renegades Invented the Index Fund and Changed Finance Forever)
Samuel Gregg: Smith’s experiments have also provided considerable evidence that, as he wrote in a 1994 paper, “economic agents can achieve efficient outcomes which are not part of their intention.” Many will recognize this as one of the central claims of The Wealth of Nations, the book written by Smith’s famous namesake two and a half centuries ago. Interestingly, Adam Smith’s argument was not one that Vernon Smith had been inclined to accept before beginning his experimental research. As the latter went on to say in his 1994 paper, “few outside of the Austrian and Chicago traditions believed it, circa 1956. Certainly, I was not primed to believe it, having been raised by a socialist mother, and further handicapped (in this regard) by a Harvard education.” Given, however, what his experiments revealed about what he called “the error in my thinking,” Smith changed his mind. Truth was what mattered—not ego or preexisting ideological commitments.
Vernon L. Smith (The Evidence of Things Not Seen: Reflections on Faith, Science, and Economics)
That’s the beauty of the famous scientific method. You observe your subject, ask questions, and then research before establishing a hypothesis.
Claudia Y. Burgoa (Undefeated (Unexpected #5))
Initially working out of our home in Northern California, with a garage-based lab, I wrote a one page letter introducing myself and what we had and posted it to the CEOs of twenty-two Fortune 500 companies. Within a couple of weeks, we had received seventeen responses, with invitations to meetings and referrals to heads of engineering departments. I met with those CEOs or their deputies and received an enthusiastic response from almost every individual. There was also strong interest from engineers given the task of interfacing with us. However, support from their senior engineering and product development managers was less forthcoming. We learned that many of the big companies we had approached were no longer manufacturers themselves but assemblers of components or were value-added reseller companies, who put their famous names on systems that other original equipment manufacturers (OEMs) had built. That didn't daunt us, though when helpful VPs of engineering at top-of-the-food-chain companies referred us to their suppliers, we found that many had little or no R & D capacity, were unwilling to take a risk on outside ideas, or had no room in their already stripped-down budgets for innovation. Our designs found nowhere to land. It became clear that we needed to build actual products and create an apples-to-apples comparison before we could interest potential manufacturing customers. Where to start? We created a matrix of the product areas that we believed PAX could impact and identified more than five hundred distinct market sectors-with potentially hundreds of thousands of products that we could improve. We had to focus. After analysis that included the size of the addressable market, ease of access, the cost and time it would take to develop working prototypes, the certifications and metrics of the various industries, the need for energy efficiency in the sector, and so on, we prioritized the list to fans, mixers, pumps, and propellers. We began hand-making prototypes as comparisons to existing, leading products. By this time, we were raising working capital from angel investors. It's important to note that this was during the first half of the last decade. The tragedy of September 11, 2001, and ensuing military actions had the world's attention. Clean tech and green tech were just emerging as terms, and energy efficiency was still more of a slogan than a driver for industry. The dot-com boom had busted. We'd researched venture capital firms in the late 1990s and found only seven in the United States investing in mechanical engineering inventions. These tended to be expansion-stage investors that didn't match our phase of development. Still, we were close to the famous Silicon Valley and had a few comical conversations with venture capitalists who said they'd be interested in investing-if we could turn our technology into a website. Instead, every six months or so, we drew up a budget for the following six months. Via a growing network of forward-thinking private investors who could see the looming need for dramatic changes in energy efficiency and the performance results of our prototypes compared to currently marketed products, we funded the next phase of research and business development.
Jay Harman (The Shark's Paintbrush: Biomimicry and How Nature is Inspiring Innovation)
Before Wonder Woman, Marston was best known for helping to invent the lie detector test, or polygraph, which was based on his research in systolic blood pressure.
Tim Hanley (Wonder Woman Unbound: The Curious History of the World's Most Famous Heroine)
The fracas was frequently portrayed in the media as two world-famous Harvard professors brought low by a graduate student from a lesser-known, unorthodox department. This is largely hyperbole. But the clash did illustrate an important aspect of economics—something that the profession shares with other sciences: Ultimately, what determines the standing of a piece of research is not the affiliation, status, or network of the author; it is how well it stacks up to the research criteria of the profession itself. The authority of the work derives from its internal properties—how well it is put together, how convincing the evidence is—not from the identity, connections, or ideology of the researcher. And because these standards are shared within the profession, anyone can point to shoddy work and say it is shoddy.¶¶ This may not seem particularly impressive, unless you consider how unusual it is compared to many other social sciences or much of the humanities.## It would be truly rare in those other fields for a graduate student to get much mileage challenging a senior scholar’s work, as happens with some frequency in economics. But because models enable the highlighting of error, in economics anyone can do it.
Dani Rodrik (Economics Rules: The Rights and Wrongs of the Dismal Science)
Fortunately, making friends in law school is easy because of the psychological bonding effects of group terror. In a famous social psychology experiment, researchers put a group of monkeys in the same cage with a group of lions. Monkeys and lions usually don’t socialize because the lions eat the monkeys, which causes hard feelings. Early in the experiment, it appeared events would follow this customary pattern as the lions began chasing the monkeys and the monkeys began bonking the lions on the heads with coconuts. At this point, the researchers inserted a Contracts professor into the cage who began conducting a Socratic dialogue about the doctrine of promissory estoppel. An amazing transformation occurred. The lions and monkeys immediately locked paws and began singing pub songs. Within a few minutes, the lions were giving the monkeys foot massages and the monkeys were encouraging the lions to get in touch with their inner cubs. Okay, that wasn’t a real experiment, but I’m confident it would work out that way. That’s what
Andrew J. McClurg (McClurg's 1L of a Ride: A Well-Traveled Professor's Roadmap to Success in the First Year of Law School, 2d: A Well-Traveled Professor's Roadmap to Success ... the First Year of Law Schoo (Career Guides))
It was only after World War II that Stanford began to emerge as a center of technical excellence, owing largely to the campaigns of Frederick Terman, dean of the School of Engineering and architect-of-record of the military-industrial-academic complex that is Silicon Valley. During World War II Terman had been tapped by his own mentor, presidential science advisor Vannevar Bush, to run the secret Radio Research Lab at Harvard and was determined to capture a share of the defense funding the federal government was preparing to redirect toward postwar academic research. Within a decade he had succeeded in turning the governor’s stud farm into the Stanford Industrial Park, instituted a lucrative honors cooperative program that provided a camino real for local companies to put selected employees through a master’s degree program, and overseen major investments in the most promising areas of research. Enrollments rose by 20 percent, and over one-third of the entering class of 1957 started in the School of Engineering—more than double the national average.4 As he rose from chairman to dean to provost, Terman was unwavering in his belief that engineering formed the heart of a liberal education and labored to erect his famous “steeples of excellence” with strategic appointments in areas such as semiconductors, microwave electronics, and aeronautics. Design, to the extent that it was a recognized field at all, remained on the margins, the province of an older generation of draftsmen and machine builders who were more at home in the shop than the research laboratory—a situation Terman hoped to remedy with a promising new hire from MIT: “The world has heard very little, if anything, of engineering design at Stanford,” he reported to President Wallace Sterling, “but they will be hearing about it in the future.”
Barry M. Katz (Make It New: A History of Silicon Valley Design (The MIT Press))
To prove the existence of a worldwide conspiracy one needs to bring up facts that cannot be denied by opponents of such a principle. The imminence of such a worldwide conspiracy is, along with other facts, confirmed by the existence of organizations that rank above the separate states. These organizations have been operating behind the scenes of official world politics for several decades. Whoever wants to understand how and why political decisions come about needs to study these organizations and their objectives. The real answers cannot be found with the government of the United States or other political powers of this world. In reality the politics of countries are not determined by democratically chosen representatives, but by these powerful organizations and our invisible elite. Many investigators have tried to uncover this worldwide conspiracy. These investigators stem from all ranks of society. In spite of this, they all agree on the existence of this conspiracy. Sooner or later every investigator that researches this matter will come across the secret Brotherhood of the Illuminati. This organization was officially founded in 1530 in Spain. Their goals are based on the famous Constantinople Letter of December 22, 1489, in which plans were made to conquer the leadership of the world.[33] In 1773 the plans stipulated in the Constantinople Letter were restored, modernized and developed further in consultation
Robin de Ruiter (Worldwide Evil and Misery - The Legacy of the 13 Satanic Bloodlines)
Schools, in a noble effort to interest more girls in math and science, often try to combat stereotypes by showing children images of famous female scientists. “See, they did it. You can do it, too!” Unfortunately, these attempts rarely work, according to the research. Girls are more likely to remember the women as lab assistants. This is frustrating for those of us who try to combat gender stereotypes in children.
Christia Spears Brown (Parenting Beyond Pink & Blue: How to Raise Your Kids Free of Gender Stereotypes)
In 1794, Lavoisier was arrested with the rest of the association and quickly sentenced to death. Ever the dedicated scientist, he requested time to complete some of his research so that it would be available to posterity. To that the presiding judge famously replied, “The republic has no need of scientists.
Leonard Mlodinow (The Drunkard's Walk: How Randomness Rules Our Lives)
Rick smiled as he watched the waves roll toward their feet. He turned to her and said, “Since we’re going to Louisiana, I did some research and learned a few things. Did you know it’s famous for its gumbo and bayous?” Amelia’s eyes brightened. “Really? I’ve seen pictures of a bayou in a magazine. It’s so mysterious looking.” “It’s also the crawdad capital of the world.” “Crawdad? What’s that?” Rick’s eyes widened with surprise. “You don’t know what crawdads are?” She shook her head. “They’re a freshwater crayfish, similar to shrimp… only better.
Linda Weaver Clarke (Mystery on the Bayou (Amelia Moore Detective Series #6))
with you, as your date?” Liam asks me. “Yes,” I say quietly. “I’m so sorry. What can I do for you in return?” “Well, since you offered,” Liam responds, “I would like some information.” “Information?” I ask with a frown. “Yes,” Liam says. “Remember all those deep, dark secrets I said I’d extract from you? Well, if you share them with us, then I’ll be your date for your sister’s wedding.” This is probably the worst thing he could have requested. My mouth feels suddenly very dry. “Um. Isn’t there anything else you might want? Maybe I could dedicate my next book to you?” He laughs lightly. “You’re going to do that anyway once I get your sight back.” I rack my brain, searching for something I could give him. “I’ll have my publisher put out a press release,” I offer, “or maybe schedule an event, like a book launch. We can publicly declare that you’re the hero who helped the semi-famous blind author Winter Rose to see. Even if it doesn’t work, and I can’t see, I’ll pretend like I can, and you’ll probably get tons of research grants and stuff.” “I’m pretty sure that you’re going to do that anyway,” Liam tells me, “because it’s a good story that will sell books.” “Okay,” I mumble, getting desperate. “How about I name a character after you?” “That would be nice,” Liam says. “I’ll take all of the above, but I’ll still need one additional thing to sweeten the pot. Information.” “Why?” I moan in protest. “Because I’m curious,” he answers in a good-natured way. “Come on. It can’t be that bad. Tell me your deepest, darkest secrets.” I sigh. “Are you sure?” “Yes.” “Really? Right here. Right now? In front of Owen?” “Yeah, why not?” Liam says cheerfully. “He’s been telling us way more than we need to know for a while.” “I want to hear, too,” Owen chimes in.  “Entertain us, storyteller!” I spend a moment gathering my composure. I smooth my hands over my legs, and look around uneasily. Taking a deep breath, I try to mentally prepare myself for what I’m about to say to two complete strangers. “Well... three years ago, I was raped.” A hush falls over the car. I can feel the men looking
Loretta Lost (Clarity (Clarity, #1))
One would expect to find a comparatively high proportion of carbon 13 [the carbon from corn] in the flesh of people whose staple food of choice is corn - Mexicans, most famously. Americans eat much more wheat than corn - 114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people, though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn't lie, and researchers who compared the carbon isotopes in the flesh or hair of Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. 'When you look at the isotope ratios,' Todd Dawson, a Berkeley biologist who's done this sort of research, told me, 'we North Americans look like corn chips with legs.' Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar. So that's us: processed corn, walking.
Michael Pollan (The Omnivore's Dilemma: A Natural History of Four Meals)
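The carbon-13 signal Pollan describes is conventionally reported as a delta value, expressed in per mil (‰). The formula below is the standard isotope-ratio definition, not something taken from the book:

\[
\delta^{13}\mathrm{C} = \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\text{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\text{standard}}} - 1 \right) \times 1000
\]

Corn is a C4 plant and is relatively enriched in carbon 13 compared with C3 plants such as wheat, so a diet built on corn (and corn-fed meat and corn sweeteners) shifts the delta value of hair and flesh toward less negative numbers, which is what lets researchers like Dawson tell "corn people" from "wheat people."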
Bill Wilson would never have another drink. For the next thirty-six years, until he died of emphysema in 1971, he would devote himself to founding, building, and spreading Alcoholics Anonymous, until it became the largest, most well-known and successful habit-changing organization in the world. An estimated 2.1 million people seek help from AA each year, and as many as 10 million alcoholics may have achieved sobriety through the group.3.12,3.13 AA doesn’t work for everyone—success rates are difficult to measure, because of participants’ anonymity—but millions credit the program with saving their lives. AA’s foundational credo, the famous twelve steps, have become cultural lodestones incorporated into treatment programs for overeating, gambling, debt, sex, drugs, hoarding, self-mutilation, smoking, video game addictions, emotional dependency, and dozens of other destructive behaviors. The group’s techniques offer, in many respects, one of the most powerful formulas for change. All of which is somewhat unexpected, because AA has almost no grounding in science or most accepted therapeutic methods. Alcoholism, of course, is more than a habit. It’s a physical addiction with psychological and perhaps genetic roots. What’s interesting about AA, however, is that the program doesn’t directly attack many of the psychiatric or biochemical issues that researchers say are often at the core of why alcoholics drink.3.14 In fact, AA’s methods seem to sidestep scientific and medical findings altogether, as well as the types of intervention many psychiatrists say alcoholics really need.1 What AA provides instead is a method for attacking the habits that surround alcohol use.3.15 AA, in essence, is a giant machine for changing habit loops. And though the habits associated with alcoholism are extreme, the lessons AA provides demonstrate how almost any habit—even the most obstinate—can be changed.
Charles Duhigg (The Power Of Habit: Why We Do What We Do In Life And Business)
practice power posing. Popularized by Amy Cuddy in her famous TED Talk, power posing is a simple 1-2 minute exercise that has incredible results on your confidence, happiness, and even cognitive functioning. I highly recommend that you check out her TED Talk, but if you don’t have time, here is a quick primer on how to power pose. Before an event that you’re feeling nervous about, simply go somewhere quiet (like a bathroom stall), then strike and hold a power pose. A power pose is any standing position that represents a powerful stance; a classic example is the superhero pose – hands on your hips, chest out, head held high, and a feeling of dominance. This may sound ridiculous, but the research behind it is outstanding. Try it just once, it only takes 1-2 minutes, and you will feel the difference instantly. The physical space you occupy also plays a role in the impression you signal to people. You’re going to want to pay particular attention to personal space and touching. In a business setting, most people are fine with a handshake and not much more than that.
Andy Arnott (Effortless Small Talk: Learn How to Talk to Anyone, Anytime, Anywhere... Even If You're Painfully Shy)
THE CHASM – THE DIFFUSION MODEL WHY EVERYBODY HAS AN IPOD Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye? Sociologists describe the way in which a catchy idea or product becomes popular as ‘diffusion’. One of the most famous diffusion studies is an analysis by Bruce Ryan and Neal Gross of the diffusion of hybrid corn in the 1930s in Greene County, Iowa. The new type of corn was better than the old sort in every way, yet it took twenty-two years for it to become widely accepted. The diffusion researchers called the farmers who switched to the new corn as early as 1928 ‘innovators’, and the somewhat bigger group that was infected by them ‘early adaptors’. They were the opinion leaders in the communities, respected people who observed the experiments of the innovators and then joined them. They were followed at the end of the 1930s by the ‘sceptical masses’, those who would never change anything before it had been tried out by the successful farmers. But at some point even they were infected by the ‘hybrid corn virus’, and eventually transmitted it to the die-hard conservatives, the ‘stragglers’. Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adaptors to the sceptics, for at this point there is a ‘chasm’. According to the US sociologist Morton Grodzins, if the early adaptors succeed in getting the innovation across the chasm to the sceptical masses, the epidemic cycle reaches the tipping point. From there, the curve rises sharply when the masses accept the product, and sinks again when only the stragglers remain. With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adaptors turn away from the product as soon as the critical masses have accepted it, in search of the next new thing. The chasm model was introduced by the American consultant and author Geoffrey Moore. First they ignore you, then they laugh at you, then they fight you, then you win. Mahatma Gandhi
Mikael Krogerus (The Decision Book: 50 Models for Strategic Thinking)
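The S-shaped adoption curve and the adopter categories in this passage can be sketched with a plain logistic function. The code below is only an illustration: the growth rate, the time window, and the cumulative cut-offs for each category (2.5%, 16%, 50%, 84%) follow the usual Rogers-style convention and are assumptions, not figures from The Decision Book.

import numpy as np

def adoption_share(t, k=0.6, t_mid=0.0):
    """Cumulative share of the population that has adopted by time t:
    a plain logistic S-curve, the shape the diffusion model describes."""
    return 1.0 / (1.0 + np.exp(-k * (t - t_mid)))

# Rogers-style cumulative cut-offs for the adopter categories (assumed convention,
# not taken from the book).
CATEGORIES = [(0.025, "innovators"), (0.16, "early adopters"),
              (0.50, "early majority"), (0.84, "late majority"), (1.01, "laggards")]

def category(share):
    """Label a cumulative adoption share with the category it falls into."""
    return next(name for cutoff, name in CATEGORIES if share <= cutoff)

# Sample a 22-"year" window, echoing the hybrid-corn study's time span.
for t in np.linspace(-11, 11, 23):
    share = adoption_share(t)
    print(f"year {t:+6.1f}: {share:6.1%} adopted ({category(share)})")

# The "chasm" sits roughly where the curve crosses ~16%, as the early adopters hand off to
# the sceptical majority; the tipping point is the steepest part of the curve, at t = t_mid.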
the people who are best at telling jokes tend to have more health problems than the people laughing at them. A study of Finnish police officers found that those who were seen as funniest smoked more, weighed more, and were at greater risk of cardiovascular disease than their peers [10]. Entertainers typically die earlier than other famous people [11], and comedians exhibit more “psychotic traits” than others [12]. So just as there’s research to back up the conventional wisdom on laughter’s curative powers, there also seems to be truth to the stereotype that funny people aren’t always having much fun. It might feel good to crack others up now and then, but apparently the audience gets the last laugh.
Anonymous
Our ability to tap into the senses of others is not limited to hypnotic states. In a now famous series of experiments physicists Harold Puthoff and Russell Targ of the Stanford Research Institute in California found that just about everyone they tested had a capacity they call “remote viewing,” the ability to describe accurately what a distant test subject is seeing. They found that individual after individual could remote-view simply by relaxing and describing whatever images came into their minds. Puthoff and Targ's findings have been duplicated by dozens of laboratories around the world, indicating that remote viewing is probably a widespread latent ability in all of us.
Anonymous
Knowing what you’re aiming for is essential. In a famous study of Yale University students, researchers found that only 3% had written goals with plans for their achievement. Twenty years later researchers interviewed the surviving graduates and found that those 3% were worth more financially than the other 97% combined.
Karen McCreadie (Think and Grow Rich (Infinite Success))
the Harveys’ most famous son. An experimental physician famous for his discovery of the circulation of the blood, he had been the personal physician to Charles I and had been present with him at the Battle of Edgehill in 1642. Research in the Harvey family papers has also revealed that he was responsible for the only known scientific examination of a witch’s familiar. Personally ordered by Charles I to examine a lady suspected of witchcraft who lived on the outskirts of Newmarket, the dubious Harvey visited her in the guise of a wizard. He succeeded in capturing and dissecting her pet toad. The animal, Harvey concluded dryly, was a toad.
Sam Willis (The Fighting Temeraire: The Battle of Trafalgar and the Ship that Inspired J.M.W. Turner's Most Beloved Painting)
Education was still considered a privilege in England. At Oxford you took responsibility for your efforts and for your performance. No one coddled, and no one uproariously encouraged. British respect for the individual, both learner and teacher, reigned. If you wanted to learn, you applied yourself and did it. Grades were posted publicly by your name after exams. People failed regularly. These realities never ceased to bewilder those used to “democracy” without any of the responsibility. For me, however, my expectations were rattled in another way. I arrived anticipating to be snubbed by a culture of privilege, but when looked at from a British angle, I actually found North American students owned a far greater sense of entitlement when it came to a college education. I did not realize just how much expectations fetter—these “mind-forged manacles,”2 as Blake wrote. Oxford upholds something larger than self as a reference point, embedded in the deep respect for all that a community of learning entails. At my very first tutorial, for instance, an American student entered wearing a baseball cap on backward. The professor quietly asked him to remove it. The student froze, stunned. In the United States such a request would be fodder for a laundry list of wrongs done against the student, followed by threatening the teacher’s job and suing the university. But Oxford sits unruffled: if you don’t like it, you can simply leave. A handy formula since, of course, no one wants to leave. “No caps in my classroom,” the professor repeated, adding, “Men and women have died for your education.” Instead of being disgruntled, the student nodded thoughtfully as he removed his hat and joined us. With its expanses of beautiful architecture, quads (or walled lawns) spilling into lush gardens, mist rising from rivers, cows lowing in meadows, spires reaching high into skies, Oxford remained unapologetically absolute. And did I mention? Practically every college within the university has its own pub. Pubs, as I came to learn, represented far more for the Brits than merely a place where alcohol was served. They were important gathering places, overflowing with good conversation over comforting food: vital humming hubs of community in communication. So faced with a thousand-year-old institution, I learned to pick my battles. Rather than resist, for instance, the archaic book-ordering system in the Bodleian Library with technological mortification, I discovered the treasure in embracing its seeming quirkiness. Often, when the wrong book came up from the annals after my order, I found it to be right in some way after all. Oxford often works such. After one particularly serendipitous day of research, I asked Robert, the usual morning porter on duty at the Bodleian Library, about the lack of any kind of sophisticated security system, especially in one of the world’s most famous libraries. The Bodleian was not a loaning library, though you were allowed to work freely amid priceless artifacts. Individual college libraries entrusted you to simply sign a book out and then return it when you were done. “It’s funny; Americans ask me about that all the time,” Robert said as he stirred his tea. “But then again, they’re not used to having u in honour,” he said with a shrug.
Carolyn Weber (Surprised by Oxford)