Research Famous Quotes

We've searched our database for all the quotes and captions related to Research Famous. Here they are! All 100 of them:

I began to come into close contact with poverty, with hunger, with disease, with the inability to cure a child because of a lack of resources… And I began to see there was something that, at that time, seemed to me almost as important as being a famous researcher or making some substantial contribution to medical science, and this was helping those people.
Ernesto Che Guevara (The Motorcycle Diaries: Notes on a Latin American Journey)
The trouble with Goodreads is that they never authenticate these quotations of famous people.
Aristotle (Physics)
life expectancy among working-class white Americans had been decreasing since the early 2000s. In modern history the only obvious parallel was with Russia in the desperate aftermath of the fall of the Soviet Union. One journalistic essay and academic research paper after another confirmed the disaster, until the narrative was capped in 2015 by Anne Case and Angus Deaton’s famous account of “deaths of despair.”
Adam Tooze (Crashed: How a Decade of Financial Crises Changed the World)
We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
It’s super-important to have a strong social media presence, and Jane’s always going, When interviewers ask you about your Twitter, say you love reaching out directly to your fans, and I’m like, I don’t even know how to use Twitter or what the password is because you disabled my laptop’s wireless and only let me go on the Internet to do homework research or email Nadine assignments, and she says, I’m doing you a big favor, it’s for nobodies who want to pretend like they’re famous and for self-promoting hacks without PR machines, and adults act like teenagers passing notes and everyone’s IQ drops thirty points on it.
Teddy Wayne (The Love Song of Jonny Valentine)
By the end of the day, we determined that we could provide chocolate therapy three times a day and research a chocolate protocol at the world-famous Hershey's Hospital. Do you think they provide it in IV formula?
Keith Desserich
In a famous experiment conducted by NASA in the 1990s, researchers fed a variety of psychoactive substances to spiders to see how they would affect their web-making skills. The caffeinated spider spun a strangely cubist and utterly ineffective web, with oblique angles, openings big enough to let small birds through, and completely lacking in symmetry or a center. (The web was far more fanciful than the ones spun by spiders given cannabis or LSD.)
Michael Pollan (This Is Your Mind on Plants)
There is a famous study from the 1930s involving a group of orphanage babies who, at mealtimes, were presented with a smorgasbord of thirty-four whole, healthy foods. Nothing was processed or prepared beyond mincing or mashing. Among the more standard offerings—fresh fruits and vegetables, eggs, milk, chicken, beef—the researcher, Clara Davis, included liver, kidney, brains, sweetbreads, and bone marrow. The babies shunned liver and kidney (as well as all ten vegetables, haddock, and pineapple), but brains and sweetbreads did not turn up among the low-preference foods she listed. And the most popular item of all? Bone marrow.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
A separate, international team analyzed more than a half million research articles, and classified a paper as “novel” if it cited two other journals that had never before appeared together. Just one in ten papers made a new combination, and only one in twenty made multiple new combinations. The group tracked the impact of research papers over time. They saw that papers with new knowledge combinations were more likely to be published in less prestigious journals, and also much more likely to be ignored upon publication. They got off to a slow start in the world, but after three years, the papers with new knowledge combos surpassed the conventional papers, and began accumulating more citations from other scientists. Fifteen years after publication, studies that made multiple new knowledge combinations were way more likely to be in the top 1 percent of most-cited papers. To recap: work that builds bridges between disparate pieces of knowledge is less likely to be funded, less likely to appear in famous journals, more likely to be ignored upon publication, and then more likely in the long run to be a smash hit in the library of human knowledge.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Lincoln is not the only famous leader to have battled depression. Winston Churchill lived with the ‘black dog’ for much of his life too. Watching a fire, he once remarked to a young researcher he was employing: ‘I know why logs spit. I know what it is to be consumed.’
Matt Haig (Reasons To Stay Alive: A Novel)
But when he instructed his staff to give the injections without telling patients they contained cancer cells, three young Jewish doctors refused, saying they wouldn’t conduct research on patients without their consent. All three knew about the research Nazis had done on Jewish prisoners. They also knew about the famous Nuremberg Trials.
Rebecca Skloot (The Immortal Life of Henrietta Lacks)
Malcolm Gladwell puts the "pop" in pop psychology, and although revered in lay circles, is roundly dismissed by experts - even by the researchers he makes famous.
Paul Gibbons (The Science of Successful Organizational Change: How Leaders Set Strategy, Change Behavior, and Create an Agile Culture)
Research shows that those who believe in a wrathful God are more likely to suffer from depression and anxiety disorders than those who believe in a loving, merciful God.
Tony Jones (Did God Kill Jesus?: Searching for Love in History's Most Famous Execution)
She knew for a fact that being left-handed automatically made you special. Marie Curie, Albert Einstein, Linus Pauling, and Albert Schweitzer were all left-handed. Of course, no believable scientific theory could rest on such a small group of people. When Lindsay probed further, however, more proof emerged. Michelangelo, Leonardo da Vinci, M.C. Escher, Mark Twain, Hans Christian Andersen, Lewis Carroll, H.G. Wells, Eudora Welty, and Jessamyn West- all lefties. The lack of women in her research had initially bothered her until she mentioned it to Allegra. "Chalk that up to male chauvinism," she said. "Lots of left-handed women were geniuses. Janis Joplin was. All it means is that the macho-man researchers didn't bother asking."
Jo-Ann Mapson (The Owl & Moon Cafe)
There was some awareness back then about hidden gender bias, particularly because of research like the famous “Howard and Heidi” study. Two Columbia Business School professors had taken an HBS case study about a female venture capitalist named Heidi Roizen and, in half the classes they taught, presented exactly the same stories and qualifications but called her Howard. In surveys of the students, they came away believing that Howard was beloved—so competent! such a go-getter!—whereas Heidi was a power-hungry egomaniac. Same person, just a different name.
Ellen Pao (Reset: My Fight for Inclusion and Lasting Change)
Frankly, the overwhelming majority of academics have ignored the data explosion caused by the digital age. The world’s most famous sex researchers stick with the tried and true. They ask a few hundred subjects about their desires; they don’t ask sites like PornHub for their data. The world’s most famous linguists analyze individual texts; they largely ignore the patterns revealed in billions of books. The methodologies taught to graduate students in psychology, political science, and sociology have been, for the most part, untouched by the digital revolution. The broad, mostly unexplored terrain opened by the data explosion has been left to a small number of forward-thinking professors, rebellious grad students, and hobbyists. That will change.
Seth Stephens-Davidowitz (Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are)
He was nice enough, an old guy who got famous in the 1960s for doing drugs and getting high and calling it research, so you have to figure he was a bit of a flake and probably pretty immature, too.
Ruth Ozeki (A Tale for the Time Being)
Gene patents are the point of greatest concern in the debate over ownership of human biological materials, and how that ownership might interfere with science. As of 2005—the most recent year figures were available—the U.S. government had issued patents relating to the use of about 20 percent of known human genes, including genes for Alzheimer’s, asthma, colon cancer, and, most famously, breast cancer. This means pharmaceutical companies, scientists, and universities control what research can be done on those genes, and how much resulting therapies and diagnostic tests will cost. And some enforce their patents aggressively: Myriad Genetics, which holds the patents on the BRCA1 and BRCA2 genes responsible for most cases of hereditary breast and ovarian cancer, charges $3,000 to test for the genes. Myriad has been accused of creating a monopoly, since no one else can offer the test, and researchers can’t develop cheaper tests or new therapies without getting permission from Myriad and paying steep licensing fees. Scientists who’ve gone ahead with research involving the breast-cancer genes without Myriad’s permission have found themselves on the receiving end of cease-and-desist letters and threats of litigation.
Rebecca Skloot
Neurologically speaking, though, there are reasons we develop a confused sense of priorities when we’re in front of our computer screens. For one thing, email comes at unpredictable intervals, which, as B. F. Skinner famously showed with rats seeking pellets, is the most seductive and habit-forming reward pattern to the mammalian brain. (Think about it: would slot machines be half as thrilling if you knew when, and how often, you were going to get three cherries?) Jessie would later say as much to me when I asked her why she was “obsessed”—her word—with her email: “It’s like fishing. You just never know what you’re going to get.” More to the point, our nervous systems can become dysregulated when we sit in front of a screen. This, at least, is the theory of Linda Stone, formerly a researcher and senior executive at Microsoft Corporation. She notes that we often hold our breath or breathe shallowly when we’re working at our computers. She calls this phenomenon “email apnea” or “screen apnea.” “The result,” writes Stone in an email, “is a stress response. We become more agitated and impulsive than we’d ordinarily be.”
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
Religion has used ritual forever. I remember a famous study led by psychologist Alfred Tomatis of a group of clinically depressed monks. After much examination, researchers concluded that the group’s depression stemmed from their abandoning a twice-daily ritual of gathering to sing Gregorian chants. They had lost the sense of community and the comfort of singing together in harmony. Creating beautiful music together was a formal recognition of their connection and a shared moment of joy.
Sue Johnson (Hold Me Tight: Seven Conversations for a Lifetime of Love (The Dr. Sue Johnson Collection Book 1))
Maria Orsic, a stunning beauty and an unusual medium, was not an obscure personality. She was known to many celebrities of the era and had a fleet of very powerful admirers and friends both in Germany and abroad; famous, brilliant and influential people like Charles Lindbergh, Nikola Tesla, Marshal Tito of Yugoslavia, Henry Ford, Eva Peron, and the most illustrious figures in the spiritualism, parapsychological and psychical research in Great Britain. This was reported by Allied intelligence and documented by OSS operatives in Europe.
Jean-Maximillien De La Croix de Lafayette (Volume I. UFOs: MARIA ORSIC, THE WOMAN WHO ORIGINATED AND CREATED EARTH’S FIRST UFOS (Extraterrestrial and Man-Made UFOs & Flying Saucers Book 1))
As a professional philosopher, I very rarely hyperventilate while doing research, but Peirce was a notorious recluse. Most of his books had been sold or carried off to Harvard at the end of his life, but somehow this little treasure—Peirce’s own copy of his first and most famous publication—had ended up here. *
John Kaag (American Philosophy: A Love Story)
For the benefit of your research people, I would like to mention (so as to avoid any duplication of labor): that the planet is very like Mars; that at least seventeen states have Pinedales; that the end of the top paragraph Galley 3 is an allusion to the famous "canals" (or, more correctly, "channels") of Schiaparelli (and Percival Lowell); that I have thoroughly studied the habits of chinchillas; that Charrete is old French and should have one "t"; that Boke's source on Galley 9 is accurate; that "Lancelotik" is not a Celtic diminutive but a Slavic one; that "Betelgeuze" is correctly spelled with a "z", not an "s" as some dictionaries have it; that the "Indigo" Knight is the result of some of my own research; that Sir Grummore, mentioned both in Le Morte Darthur and in Amadis de Gaul, was a Scotsman; that L'Eau Grise is a scholarly pun; and that neither bludgeons nor blandishments will make me give up the word "hobnailnobbing".
Vladimir Nabokov
The basic concept of microdosing is nothing new. Albert Hofmann, who first synthesized LSD in 1938, considered it one of the drug’s most promising, and least researched, applications. He was among the first to realize its antidepressant and cognition-enhancing potential,[vi] famously taking between 10 and 20 μg himself, twice a week, for the last few decades of his life.[vii]
Paul Austin (Microdosing Psychedelics: A Practical Guide to Upgrade Your Life)
The successful ideas survive scrutiny. The bad ideas get discarded. Conformity is also laughable to scientists attempting to advance their careers. The best way to get famous in your own lifetime is to pose an idea that counters prevailing research and that earns a consistency of observations and experiment. Healthy disagreement is a natural state on the bleeding edge of discovery.
Neil deGrasse Tyson (Starry Messenger: Cosmic Perspectives on Civilization)
So I did some research,” she went on. “The good thing about being a famous model is that you can call anyone and they’ll talk to you. So I called this illusionist I’d seen on Broadway a couple of years ago. He heard the story and then he laughed. I said what’s so funny. He asked me a question: Did this guru do this after dinner? I was surprised. What the hell could that have to do with it? But I said yes, how did you know? He asked if we had coffee. Again I said yes. Did he take his black? One more time I said yes.” Shauna was smiling now. “Do you know how he did it, Beck?” I shook my head. “No clue.” “When he passed the card to Wendy, it went over his coffee cup. Black coffee, Beck. It reflects like a mirror. That’s how he saw what I’d written. It was just a dumb parlor trick.
Harlan Coben (Tell No One)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
In the scientific world, the syndrome known as 'great man's disease' happens when a famous researcher in one field develops strong opinions about another field that he or she does not understand, such as a chemist who decides that he is an expert in medicine or a physicist who decides that he is an expert in cognitive science. They have trouble accepting that they must go back to school before they can make pronouncements in a new field.
Paul Krugman (A Country Is Not a Company (Harvard Business Review Classics))
Lederman is also a charismatic personality, famous among his colleagues for his humor and storytelling ability. One of his favorite anecdotes relates the time when, as a graduate student, he arranged to bump into Albert Einstein while walking the grounds at the Institute for Advanced Study at Princeton. The great man listened patiently as the eager youngster explained the particle-physics research he was doing at Columbia, and then said with a smile, “That is not interesting.”
Sean Carroll (The Particle at the End of the Universe)
As a firstborn I also had a duty to succeed my father and look after my mother and siblings. Although school taught me that this was an outdated practice and that I would have been better off focusing on inheriting my father's assets for my own benefit, it was the strong emphasis on family values that ultimately prevailed. This was not because they had sounded good on paper or had been presented by a world-famous researcher, but because I saw they worked through my experience.
Salatiso Lonwabo Mdeni (The Homeschooling Father, How and Why I got started.: Traditional Schooling to Online Learning until Homeschooling)
Starting something new in middle age might look that way too. Mark Zuckerberg famously noted that “young people are just smarter.” And yet a tech founder who is fifty years old is nearly twice as likely to start a blockbuster company as one who is thirty, and the thirty-year-old has a better shot than a twenty-year-old. Researchers at Northwestern, MIT, and the U.S. Census Bureau studied new tech companies and showed that among the fastest-growing start-ups, the average age of a founder was forty-five when the company was launched.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Working hard is important. But more effort does not necessarily yield more results. “Less but better” does. Ferran Adrià, arguably the world’s greatest chef, who has led El Bulli to become the world’s most famous restaurant, epitomizes the principle of “less but better” in at least two ways. First, his specialty is reducing traditional dishes to their absolute essence and then re-imagining them in ways people have never thought of before. Second, while El Bulli has somewhere in the range of 2 million requests for dinner reservations each year, it serves only fifty people per night and closes for six months of the year. In fact, at the time of writing, Ferran had stopped serving food altogether and had instead turned El Bulli into a full-time food laboratory of sorts where he was continuing to pursue nothing but the essence of his craft.1 Getting used to the idea of “less but better” may prove harder than it sounds, especially when we have been rewarded in the past for doing more … and more and more. Yet at a certain point, more effort causes our progress to plateau and even stall. It’s true that the idea of a direct correlation between results and effort is appealing. It seems fair. Yet research across many fields paints a very different picture. Most people have heard of the “Pareto Principle,” the idea, introduced as far back as the 1790s by Vilfredo Pareto, that 20 percent of our efforts produce 80 percent of results. Much later, in 1951, in his Quality-Control Handbook, Joseph Moses Juran, one of the fathers of the quality movement, expanded on this idea and called it “the Law of the Vital Few.”2 His observation was that you could massively improve the quality of a product by resolving a tiny fraction of the problems. He found a willing test audience for this idea in Japan, which at the time had developed a rather poor reputation for producing low-cost, low-quality goods. By adopting a process in which a high percentage of effort and attention was channeled toward improving just those few things that were truly vital, he made the phrase “made in Japan” take on a totally new meaning. And gradually, the quality revolution led to Japan’s rise as a global economic power.3
Greg McKeown (Essentialism: The Disciplined Pursuit of Less)
THREE FAMOUS ENGRAVINGS depict Alexis St. Martin in his youth. I’ve seen them many times, in biographies of his surgeon William Beaumont, in Beaumont’s own book, in journal articles about the pair. As detailed as the artworks are, you can’t tell what St. Martin looked like from examining them. All three woodcuts are of the lower portion of his left breast, and the famous hole. I could pick St. Martin’s nipple out of a lineup before I could his eyes. I suppose this makes sense; Beaumont was a researcher and St. Martin his subject—more a body than a man.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
By 1952, the University of Minnesota nutritionist Ancel Keys was arguing that high blood levels of cholesterol caused heart disease, and that it was the fat in our diets that drove up cholesterol levels. Keys had a conflict of interest: his research had been funded by the sugar industry—the Sugar Research Foundation and then the Sugar Association—since 1944, if not earlier, and the K-rations he had famously developed for the military during the war (the “K” is said to have stood for “Keys”) were loaded with sugar. This might have naturally led him to perceive something other than sugar as the problem. We can only guess.
Gary Taubes (The Case Against Sugar)
When I am asked to summarize the fundamental message from research on self-control, I recall Descartes’s famous dictum cogito, ergo sum—“I think, therefore I am.” What has been discovered about mind, brain, and self-control lets us move from his proposition to “I think, therefore I can change what I am.” Because by changing how we think, we can change what we feel, do, and become. If that leads to the question “But can I really change?,” I reply with what George Kelly said to his therapy clients when they kept asking him if they could get control of their lives. He looked straight into their eyes and said, “Would you like to?”
Walter Mischel
Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes.
Mariana Mazzucato
Some researchers, such as psychologist Jean Twenge, say this new world where compliments are better than sex and pizza, in which the self-enhancing bias has been unchained and allowed to gorge unfettered, has led to a new normal in which the positive illusions of several generations have now mutated into full-blown narcissism. In her book The Narcissism Epidemic, Twenge says her research shows that since the mid-1980s, clinically defined narcissism rates in the United States have increased in the population at the same rate as obesity. She used the same test used by psychiatrists to test for narcissism in patients and found that, in 2006, one in four U.S. college students tested positive. That’s real narcissism, the kind that leads to diagnoses of personality disorders. In her estimation, this is a dangerous trend, and it shows signs of acceleration. Narcissistic overconfidence crosses a line, says Twenge, and taints those things improved by a skosh of confidence. Over that line, you become less concerned with the well-being of others, more materialistic, and obsessed with status in addition to losing all the restraint normally preventing you from tragically overestimating your ability to manage or even survive risky situations. In her book, Twenge connects this trend to the housing market crash of the mid-2000s and the stark increase in reality programming during that same decade. According to Twenge, the drive to be famous for nothing went from being strange to predictable thanks to a generation or two of people raised by parents who artificially boosted self-esteem to ’roidtastic levels and then released them into a culture filled with new technologies that emerged right when those people needed them most to prop up their self-enhancement biases. By the time Twenge’s research was published, reality programming had spent twenty years perfecting itself, and the modern stars of those shows represent a tiny portion of the population who not only want to be on those shows, but who also know what they are getting into and still want to participate. Producers with the experience to know who will provide the best television entertainment to millions then cull that small group. The result is a new generation of celebrities with positive illusions so robust and potent that the narcissistic overconfidence of the modern American teenager by comparison is now much easier to see as normal.
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)
The person who discovered the answer was a retiring, self-funded scientist named Peter Mitchell who in the early 1960s inherited a fortune from the Wimpey house-building company and used it to set up a research center in a stately home in Cornwall. Mitchell was something of an eccentric. He wore shoulder-length hair and an earring at a time when that was especially unusual among serious scientists. He was also famously forgetful. At his daughter’s wedding, he approached another guest and confessed that she looked familiar, though he couldn’t quite place her. “I was your first wife,” she answered. Mitchell’s ideas were universally dismissed, not altogether surprisingly. As one chronicler has noted, “At the time that Mitchell proposed his hypothesis there was not a shred of evidence in support of it.” But he was eventually vindicated and in 1978 was awarded the Nobel Prize in Chemistry—an extraordinary accomplishment for someone who worked from a home lab.
Bill Bryson (The Body: A Guide for Occupants)
When I was a kid, my mother thought spinach was the healthiest food in the world because it contained so much iron. Getting enough iron was a big deal then because we didn't have 'iron-fortified' bread. Turns out that spinach is an okay source of iron, but no better than pizza, pistachio nuts, cooked lentils, or dried peaches. The spinach-iron myth grew out of a simple mathematical miscalculation: A researcher accidentally moved a decimal point one space, so he thought spinach had 10 times more iron than it did. The press reported it, and I had to eat spinach. Moving the decimal point was an honest mistake--but it's seldom that simple. If it happened today I'd suspect a spinach lobby was behind it. Businesses often twist science to make money. Lawyers do it to win cases. Political activists distort science to fit their agenda, bureaucrats to protect their turf. Reporters keep falling for it. Scientists sometimes go along with it because they like being famous.
John Stossel (Give Me a Break: How I Exposed Hucksters, Cheats, and Scam Artists and Became the Scourge of the Liberal Media...)
Four thousand miles away in France, the old boys from the Haute-Loire Resistance wrote to each other to share the devastating news. They had enjoyed nearly forty years of freedom since spending a mere couple of months in Virginia’s presence in 1944. But the warrior they called La Madone had shown them hope, comradeship, courage, and the way to be the best version of themselves, and they had never forgotten. In the midst of hardship and fear, she had shared with them a fleeting but glorious state of happiness and the most vivid moment of their lives. The last of those famous Diane Irregulars—the ever-boyish Gabriel Eyraud, her chouchou—passed away in 2017 while I was researching Virginia’s story. Until the end of his days, he and the others who had known Virginia on the plateau liked to pause now and then to think of the woman in khaki who never, ever gave up on freedom. When they talked with awe and affection of her incredible exploits, they smiled and looked up at the wide, open skies with “les étoiles dans les yeux.
Sonia Purnell (A Woman of No Importance: The Untold Story of the American Spy Who Helped Win World War II)
Look at the telephone; it would remind you of a unique scientist, Alexander Graham Bell. He, besides being a great inventor, was also a man of great compassion and service. In fact, much of the research which led to the development of the telephone was directed at finding solutions to the challenges of hearing impaired people and helping them to be able to listen and communicate. Bell’s mother and wife were both hearing impaired and it profoundly changed Bell’s outlook to science. He aimed to make devices which would help the hearing impaired. He started a special school in Boston to teach hearing impaired people in novel ways. It was these lessons which inspired him to work with sound and led to the invention of the telephone. Can you guess the name of the most famous student of Alexander Graham Bell? It was Helen Keller, the great author, activist and poet who was hearing and visually impaired. About her teacher, she once said that Bell dedicated his life to the penetration of that ‘inhuman silence which separates and estranges’.
A.P.J. Abdul Kalam (Learning How to Fly: Life Lessons for the Youth)
And then, as slowly as the light fades on a calm winter evening, something went out of our relationship. I say that selfishly. Perhaps I started to look for something which had never been there in the first place: passion, romance. I daresay that as I entered my forties I had a sense that somehow life was going past me. I had hardly experienced those emotions which for me have mostly come from reading books or watching television. I suppose that if there was anything unsatisfactory in our marriage, it was in my perception of it—the reality was unchanged. Perhaps I grew up from childhood to manhood too quickly. One minute I was cutting up frogs in the science lab at school, the next I was working for the National Centre for Fisheries Excellence and counting freshwater mussel populations on riverbeds. Somewhere in between, something had passed me by: adolescence, perhaps? Something immature, foolish yet intensely emotive, like those favourite songs I had recalled dimly as if being played on a distant radio, almost too far away to make out the words. I had doubts, yearnings, but I did not know why or what for. Whenever I tried to analyse our lives, and talk about it with Mary, she would say, ‘Darling, you are on the way to becoming one of the leading authorities in the world on caddis fly larvae. Don’t allow anything to deflect you from that. You may be rather inadequately paid, certainly compared with me you are, but excellence in any field is an achievement beyond value.’ I don’t know when we started drifting apart. When I told Mary about the project—I mean about researching the possibility of a salmon fishery in the Yemen—something changed. If there was a defining moment in our marriage, then that was it. It was ironical, in a sense. For the first time in my life I was doing something which might bring me international recognition and certainly would make me considerably better off—I could live for years off the lecture circuit alone, if the project was even half successful. Mary didn’t like it. I don’t know what part she didn’t like: the fact I might become more famous than her, the fact I might even become better paid than her. That makes her sound carping.
Paul Torday (Salmon Fishing in the Yemen)
The smartest person to ever walk this Earth in all probability lived and died herding goats on a mountain somewhere, with no way to disseminate their work globally even if they had realised they were super smart and had the means to do something with their abilities. I am not keen on 'who are the smartest' lists and websites because, as Scott Barry Kaufman points out, the concept of genius privileges the few who had the opportunity to see through and promote their life’s work, while excluding others who may have had equal or greater raw potential but lacked the practical and financial support, and the communication platform that famous names clearly had. This is why I am keen to develop, through my research work, a definition of genius from a cognitive neuroscience and psychometric point of view, so that whatever we decide that is and how it should be measured, only focuses on clearly measurable factors within the individual’s mind, regardless of their external achievements, eminence, popularity, wealth, public platform etc. In my view this would be both more equitable and more scientific.
Gwyneth Wesley Rolph
Benjamin Libet, a scientist in the physiology department of the University of California, San Francisco, was a pioneering researcher into the nature of human consciousness. In one famous experiment he asked a study group to move their hands at a moment of their choosing while their brain activity was being monitored. Libet was seeking to identify what came first — the brain’s electrical activity to make the hand move or the person’s conscious intention to make their hand move. It had to be the second one, surely? But no. Brain activity to move the hand was triggered a full half a second before any conscious intention to move it…. John-Dylan Haynes, a neuroscientist at the Max Planck Institute for Human Cognitive and Brain Studies in Leipzig, Germany, led a later study that was able to predict an action ten seconds before people had a conscious intention to do it. What was all the stuff about free will? Frank Tong, a neuroscientist at Vanderbilt University in Nashville, Tennessee, said: “Ten seconds is a lifetime in terms of brain activity.” So where is it coming from if not ‘us,’ the conscious mind?
David Icke
Nartok shows me an example of Arctic “greens”: cutout number 13, Caribou Stomach Contents. Moss and lichen are tough to digest, unless, like caribou, you have a multichambered stomach in which to ferment them. So the Inuit let the caribou have a go at it first. I thought of Pat Moeller and what he’d said about wild dogs and other predators eating the stomachs and stomach contents of their prey first. “And wouldn’t we all,” he’d said, “be better off.” If we could strip away the influences of modern Western culture and media and the high-fructose, high-salt temptations of the junk-food sellers, would we all be eating like Inuit elders, instinctively gravitating to the most healthful, nutrient-diverse foods? Perhaps. It’s hard to say. There is a famous study from the 1930s involving a group of orphanage babies who, at mealtimes, were presented with a smorgasbord of thirty-four whole, healthy foods. Nothing was processed or prepared beyond mincing or mashing. Among the more standard offerings—fresh fruits and vegetables, eggs, milk, chicken, beef—the researcher, Clara Davis, included liver, kidney, brains, sweetbreads, and bone marrow. The babies shunned liver and kidney (as well as all ten vegetables, haddock, and pineapple), but brains and sweetbreads did not turn up among the low-preference foods she listed. And the most popular item of all? Bone marrow.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
Example: a famous-to-economists finding in behavioral economics concerns pricing, and the fact that people have a provable bias towards the middle of three prices. It was first demonstrated with an experiment in beer pricing: when there were two beers, a third of people chose the cheaper; adding an even cheaper beer made the share of that beer go up, because it was now in the middle of three prices; adding an even more expensive beer at the top, and dropping the cheapest beer, made the share of the new beer in the middle (which had previously been the most expensive) go up from two-thirds to 90 percent. Having a price above and a price below makes the price in the middle seem more appealing. This experiment has been repeated with other consumer goods, such as ovens, and is now a much-used strategy in the corporate world. Basically, if you have two prices for something, and want to make more people pay the higher price, you add a third, even higher price; that makes the formerly highest price more attractive. Watch out for this strategy. (The research paper about beer pricing, written by a trio of economists at Duke University in 1982, was published in the Journal of Consumer Research. It’s called “Adding Asymmetrically Dominated Alternatives: Violations of Regularity and the Similarity Hypothesis”—which must surely be the least engaging title ever given to an article about beer.)
John Lanchester (How to Speak Money: What the Money People Say-And What It Really Means: What the Money People Say―And What It Really Means)
But the basis of Freud's ideas aren't accepted by all philosophers, though many accept that he was right about the possibility of unconscious thought. Some have claimed that Freud's theories are unscientific. Most famously, Karl Popper (whose ideas are more fully discussed in Chapter 36) described many of the ideas of psychoanalysis as ‘unfalsifiable’. This wasn't a compliment, but a criticism. For Popper, the essence of scientific research was that it could be tested; that is, there could be some possible observation that would show that it was false. In Popper's example, the actions of a man who pushed a child into a river, and a man who dived in to save a drowning child were, like all human behaviour, equally open to Freudian explanation. Whether someone tried to drown or save a child, Freud's theory could explain it. He would probably say that the first man was repressing some aspect of his Oedipal conflict, and that led to his violent behaviour, whereas the second man had ‘sublimated’ his unconscious desires, that is, managed to steer them into socially useful actions. If every possible observation is taken as further evidence that the theory is true, whatever that observation is, and no imaginable evidence could show that it was false, Popper believed, the theory couldn't be scientific at all. Freud, on the other hand, might have argued that Popper had some kind of repressed desire that made him so aggressive towards psychoanalysis.
Nigel Warburton (A Little History of Philosophy (Little Histories))
For years I’ve been asking myself (and my readers) whether these propagandists—commonly called corporate or capitalist journalists—are evil or stupid. I vacillate day by day. Most often I think both. But today I’m thinking evil. Here’s why. You may have heard of John Stossel. He’s a long-term analyst, now anchor, on a television program called 20/20, and is most famous for his segment called “Give Me A Break,” in which, to use his language, he debunks commonly held myths. Most of the rest of us would call what he does “lying to serve corporations.” For example, in one of his segments, he claimed that “buying organic [vegetables] could kill you.” He stated that specially commissioned studies had found no pesticide residues on either organically grown or pesticide-grown fruits and vegetables, and had found further that organic foods are covered with dangerous strains of E. coli. But the researchers Stossel cited later stated he misrepresented their research. The reason they didn’t find any pesticides is because they never tested for them (they were never asked to). Further, they said Stossel misrepresented the tests on E. coli. Stossel refused to issue a retraction. Worse, the network aired the piece two more times. And still worse, it came out later that 20/20’s executive director Victor Neufeld knew about the test results and knew that Stossel was lying a full three months before the original broadcast. This is not unusual for Stossel and company.
Derrick Jensen (Endgame, Vol. 1: The Problem of Civilization)
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis,
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options. On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.” That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Take the famous slogan on the atheist bus in London … “There’s probably no God. Now stop worrying and enjoy your life.” … The word that offends against realism here is “enjoy.” I’m sorry—enjoy your life? Enjoy your life? I’m not making some kind of neo-puritan objection to enjoyment. Enjoyment is lovely. Enjoyment is great. The more enjoyment the better. But enjoyment is one emotion … Only sometimes, when you’re being lucky, will you stand in a relationship to what’s happening to you where you’ll gaze at it with warm, approving satisfaction. The rest of the time, you’ll be busy feeling hope, boredom, curiosity, anxiety, irritation, fear, joy, bewilderment, hate, tenderness, despair, relief, exhaustion … This really is a bizarre category error. But not necessarily an innocent one … The implication of the bus slogan is that enjoyment would be your natural state if you weren’t being “worried” by us believer … Take away the malignant threat of God-talk, and you would revert to continuous pleasure, under cloudless skies. What’s so wrong with this, apart from it being total bollocks? … Suppose, as the atheist bus goes by, that you are the fifty-something woman with the Tesco bags, trudging home to find out whether your dementing lover has smeared the walls of the flat with her own shit again. Yesterday when she did it, you hit her, and she mewled till her face was a mess of tears and mucus which you also had to clean up. The only thing that would ease the weight on your heart would be to tell the funniest, sharpest-tongued person you know about it: but that person no longer inhabits the creature who will meet you when you unlock the door. Respite care would help, but nothing will restore your sweetheart, your true love, your darling, your joy. Or suppose you’re that boy in the wheelchair, the one with the spasming corkscrew limbs and the funny-looking head. You’ve never been able to talk, but one of your hands has been enough under your control to tap out messages. Now the electrical storm in your nervous system is spreading there too, and your fingers tap more errors than readable words. Soon your narrow channel to the world will close altogether, and you’ll be left all alone in the hulk of your body. Research into the genetics of your disease may abolish it altogether in later generations, but it won’t rescue you. Or suppose you’re that skanky-looking woman in the doorway, the one with the rat’s nest of dreadlocks. Two days ago you skedaddled from rehab. The first couple of hits were great: your tolerance had gone right down, over two weeks of abstinence and square meals, and the rush of bliss was the way it used to be when you began. But now you’re back in the grind, and the news is trickling through you that you’ve fucked up big time. Always before you’ve had this story you tell yourself about getting clean, but now you see it isn’t true, now you know you haven’t the strength. Social services will be keeping your little boy. And in about half an hour you’ll be giving someone a blowjob for a fiver behind the bus station. Better drugs policy might help, but it won’t ease the need, and the shame over the need, and the need to wipe away the shame. So when the atheist bus comes by, and tells you that there’s probably no God so you should stop worrying and enjoy your life, the slogan is not just bitterly inappropriate in mood. What it means, if it’s true, is that anyone who isn’t enjoying themselves is entirely on their own. 
The three of you are, for instance; you’re all three locked in your unshareable situations, banged up for good in cells no other human being can enter. What the atheist bus says is: there’s no help coming … But let’s be clear about the emotional logic of the bus’s message. It amounts to a denial of hope or consolation, on any but the most chirpy, squeaky, bubble-gummy reading of the human situation. St Augustine called this kind of thing “cruel optimism” fifteen hundred years ago, and it’s still cruel.
Francis Spufford
I’ve worn Niki’s pants for two days now. I thought a third day in the same clothes might be pushing it.” Ian shrugged with indifference. “It might send Derian through the roof, but it doesn’t bother me. Wear what you want to wear.” Eena wrinkled her nose at him. “Do you really feel that way or are you trying to appear more laissez-faire than Derian?” “More laissez-faire?” “Yes. That’s a real word.” “Two words actually,” he grinned. “Laissez faire et laissez passer, le monde va de lui même!" He coated the words with a heavy French accent. Eena gawked at him. “Since when do you speak French?” “I don’t.” Ian chuckled. “But I did do some research in world history the year I followed you around on Earth. Physics was a joke, but history—that I found fascinating.” Slapping a hand against her chest, Eena exclaimed, “I can’t believe it! Unbeknownst to me, Ian actually studied something in high school other than the library’s collection of sci-fi paperbacks!” He grimaced at her exaggerated performance before defending his preferred choice of reading material. “Hey, popular literature is a valuable and enlightening form of world history. You would know that if you read a book or two.” She ignored his reproach and asked with curiosity, “What exactly did you say?” “In French?” “Duh, yes.” “Don’t ‘duh’ me, you could easily have been referring to my remark about enlightening literature. I know the value of a good book is hard for you to comprehend.” He grinned crookedly at her look of offense and then moved into an English translation of his French quote. “Let it do and let it pass, the world goes on by itself.” “Hmm. And where did that saying come from?” Ian delivered his answer with a surprisingly straight face. “That is what the French Monarch said when his queen began dressing casually. The French revolution started one week following that famous declaration, right after the queen was beheaded by the rest of the aristocracy in her favorite pair of scroungy jeans.” “You are such a brazen-tongued liar!
Richelle E. Goodrich (Eena, The Companionship of the Dragon's Soul (The Harrowbethian Saga #6))
This, in turn, has given us a “unified theory of aging” that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of “junk” molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die. But there is a small but important loophole in the second law that states total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it’s possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde’s famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.) The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back. As Nobel laureate Richard Feynman once said, “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human’s body will be cured.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
The Extraordinary Persons Project In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed: 1. Australopithecus 2. Homo habilis 3. Homo erectus 4. Homo sapiens Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans. Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation." By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived at different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neandarthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution although he was himself one of the leading advocates of evolution in the twentieth century: What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth. Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
Our ability to tap into the senses of others is not limited to hypnotic states. In a now famous series of experiments physicists Harold Puthoff and Russell Targ of the Stanford Research Institute in California found that just about everyone they tested had a capacity they call “remote viewing,” the ability to describe accurately what a distant test subject is seeing. They found that individual after individual could remote-view simply by relaxing and describing whatever images came into their minds. Puthoff and Targ's findings have been duplicated by dozens of laboratories around the world, indicating that remote viewing is probably a widespread latent ability in all of us.
Anonymous
Research tells us that brainstorming becomes more productive when it’s focused. As jazz great Charles Mingus famously said, “You can’t improvise on nothing, man; you’ve gotta improvise on something.”
Chip Heath (The Myth of the Garage: And Other Minor Surprises)
Knowing what you’re aiming for is essential. In a famous study of Yale University students, researchers found that only 3% had written goals with plans for their achievement. Twenty years later researchers interviewed the surviving graduates and found that those 3% were worth more financially than the other 97% combined.
Karen McCreadie (Think and Grow Rich (Infinite Success))
the people who are best at telling jokes tend to have more health problems than the people laughing at them. A study of Finnish police officers found that those who were seen as funniest smoked more, weighed more, and were at greater risk of cardiovascular disease than their peers [10]. Entertainers typically die earlier than other famous people [11], and comedians exhibit more “psychotic traits” than others [12]. So just as there’s research to back up the conventional wisdom on laughter’s curative powers, there also seems to be truth to the stereotype that funny people aren’t always having much fun. It might feel good to crack others up now and then, but apparently the audience gets the last laugh.
Anonymous
THE CHASM – THE DIFFUSION MODEL WHY EVERYBODY HAS AN IPOD Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye? Sociologists describe the way in which a catchy idea or product becomes popular as ‘diffusion’. One of the most famous diffusion studies is an analysis by Bruce Ryan and Neal Gross of the diffusion of hybrid corn in the 1930s in Greene County, Iowa. The new type of corn was better than the old sort in every way, yet it took twenty-two years for it to become widely accepted. The diffusion researchers called the farmers who switched to the new corn as early as 1928 ‘innovators’, and the somewhat bigger group that was infected by them ‘early adaptors’. They were the opinion leaders in the communities, respected people who observed the experiments of the innovators and then joined them. They were followed at the end of the 1930s by the ‘sceptical masses’, those who would never change anything before it had been tried out by the successful farmers. But at some point even they were infected by the ‘hybrid corn virus’, and eventually transmitted it to the die-hard conservatives, the ‘stragglers’. Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adaptors to the sceptics, for at this point there is a ‘chasm’. According to the US sociologist Morton Grodzins, if the early adaptors succeed in getting the innovation across the chasm to the sceptical masses, the epidemic cycle reaches the tipping point. From there, the curve rises sharply when the masses accept the product, and sinks again when only the stragglers remain. With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adaptors turn away from the product as soon as the critical masses have accepted it, in search of the next new thing. The chasm model was introduced by the American consultant and author Geoffrey Moore. First they ignore you, then they laugh at you, then they fight you, then you win. Mahatma Gandhi
Mikael Krogerus (The Decision Book: 50 Models for Strategic Thinking)
Amblyopsis hoosieri Type of animal: Eyeless cavefish Description: Completely colorless; 2 to 3 inches long; anus on underside of neck Home: Southern Indiana Fun fact: Unlike others of its kind, A. hoosieri lacks a debilitating mutation in the rhodopsin gene, which is an important gene for vision. That means it could see just fine … if it had eyes. Researchers named the fish after the Indiana Hoosiers basketball team — but not to imply the players might be visually challenged. The name honors several famous fish scientists who worked at Indiana University, as well as the species’s proximity to the university. Plus, the lead author is a Hoosier fan.
Anonymous
the Harveys’ most famous son. An experimental physician famous for his discovery of the circulation of the blood, he had been the personal physician to Charles I and had been present with him at the Battle of Edgehill in 1642. Research in the Harvey family papers has also revealed that he was responsible for the only known scientific examination of a witch’s familiar. Personally ordered by Charles I to examine a lady suspected of witchcraft who lived on the outskirts of Newmarket, the dubious Harvey visited her in the guise of a wizard. He succeeded in capturing and dissecting her pet toad. The animal, Harvey concluded dryly, was a toad.
Sam Willis (The Fighting Temeraire: The Battle of Trafalgar and the Ship that Inspired J.M.W. Turner's Most Beloved Painting)
practice power posing. Popularized by Amy Cuddy in her famous TED Talk, power posing is a simple 1-2 minute exercise that has incredible results on your confidence, happiness, and even cognitive functioning. I highly recommend that you check out her TED Talk, but if you don’t have time here is a quick primer on how to power pose. Before an event that you’re feeling nervous about, simply go somewhere quiet (like a bathroom stall) then strike and hold a power pose. A power pose is any standing position that represents a powerful stance, a classic example is the superhero pose – hands on your hips, chest out, head held high and a feeling of dominance. This may sound ridiculous, but the research behind it is outstanding. Try it just once, it only takes 1-2 minutes, and you will feel the difference instantly. The physical space you occupy also plays a role in the impression you signal to people. You’re going to want to pay particular attention to personal space and touching. In a business setting, most people are fine with a handshake and not much more than that.
Andy Arnott (Effortless Small Talk: Learn How to Talk to Anyone, Anytime, Anywhere... Even If You're Painfully Shy)
That’s the beauty of the famous scientific method. You observe your subject, ask questions, and then research before establishing a hypothesis.
Claudia Y. Burgoa (Undefeated (Unexpected #5))
Before Wonder Woman, Marston was best known for helping to invent the lie detector test, or polygraph, which was based on his research in systolic blood pressure.
Tim Hanley (Wonder Woman Unbound: The Curious History of the World's Most Famous Heroine)
with you, as your date?” Liam asks me. “Yes,” I say quietly. “I’m so sorry. What can I do for you in return?” “Well, since you offered,” Liam responds, “I would like some information.” “Information?” I ask with a frown. “Yes,” Liam says. “Remember all those deep, dark secrets I said I’d extract from you? Well, if you share them with us, then I’ll be your date for your sister’s wedding.” This is probably the worst thing he could have requested. My mouth feels suddenly very dry. “Um. Isn’t there anything else you might want? Maybe I could dedicate my next book to you?” He laughs lightly. “You’re going to do that anyway once I get your sight back.” I rack my brain, searching for something I could give him. “I’ll have my publisher put out a press release,” I offer, “or maybe schedule an event, like a book launch. We can publicly declare that you’re the hero who helped the semi-famous blind author Winter Rose to see. Even if it doesn’t work, and I can’t see, I’ll pretend like I can, and you’ll probably get tons of research grants and stuff.” “I’m pretty sure that you’re going to do that anyway,” Liam tells me, “because it’s a good story that will sell books.” “Okay,” I mumble, getting desperate. “How about I name a character after you?” “That would be nice,” Liam says. “I’ll take all of the above, but I’ll still need one additional thing to sweeten the pot. Information.” “Why?” I moan in protest. “Because I’m curious,” he answers in a good-natured way. “Come on. It can’t be that bad. Tell me your deepest, darkest secrets.” I sigh. “Are you sure?” “Yes.” “Really? Right here. Right now? In front of Owen?” “Yeah, why not?” Liam says cheerfully. “He’s been telling us way more than we need to know for a while.” “I want to hear, too,” Owen chimes in.  “Entertain us, storyteller!” I spend a moment gathering my composure. I smooth my hands over my legs, and look around uneasily. Taking a deep breath, I try to mentally prepare myself for what I’m about to say to two complete strangers. “Well... three years ago, I was raped.” A hush falls over the car. I can feel the men looking
Loretta Lost (Clarity (Clarity, #1))
Minsky was an ardent supporter of the Cyc project, the most notorious failure in the history of AI. The goal of Cyc was to solve AI by entering into a computer all the necessary knowledge. When the project began in the 1980s, its leader, Doug Lenat, confidently predicted success within a decade. Thirty years later, Cyc continues to grow without end in sight, and commonsense reasoning still eludes it. Ironically, Lenat has belatedly embraced populating Cyc by mining the web, not because Cyc can read, but because there’s no other way. Even if by some miracle we managed to finish coding up all the necessary pieces, our troubles would be just beginning. Over the years, a number of research groups have attempted to build complete intelligent agents by putting together algorithms for vision, speech recognition, language understanding, reasoning, planning, navigation, manipulation, and so on. Without a unifying framework, these attempts soon hit an insurmountable wall of complexity: too many moving parts, too many interactions, too many bugs for poor human software engineers to cope with. Knowledge engineers believe AI is just an engineering problem, but we have not yet reached the point where engineering can take us the rest of the way. In 1962, when Kennedy gave his famous moon-shot speech, going to the moon was an engineering problem. In 1662, it wasn’t, and that’s closer to where AI is today. In industry, there’s no sign that knowledge engineering will ever be able to compete with machine learning outside of a few niche areas. Why pay experts to slowly and painfully encode knowledge into a form computers can understand, when you can extract it from data at a fraction of the cost? What about all the things the experts don’t know but you can discover from data? And when data is not available, the cost of knowledge engineering seldom exceeds the benefit. Imagine if farmers had to engineer each cornstalk in turn, instead of sowing the seeds and letting them grow: we would all starve.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
The first eye-opener came in the 1970s, when DARPA, the Pentagon’s research arm, organized the first large-scale speech recognition project. To everyone’s surprise, a simple sequential learner of the type Chomsky derided handily beat a sophisticated knowledge-based system. Learners like it are now used in just about every speech recognizer, including Siri. Fred Jelinek, head of the speech group at IBM, famously quipped that “every time I fire a linguist, the recognizer’s performance goes up.” Stuck in the knowledge-engineering mire, computational linguistics had a near-death experience in the late 1980s. Since then, learning-based methods have swept the field, to the point where it’s hard to find a paper devoid of learning in a computational linguistics conference. Statistical parsers analyze language with accuracy close to that of humans, where hand-coded ones lagged far behind. Machine translation, spelling correction, part-of-speech tagging, word sense disambiguation, question answering, dialogue, summarization: the best systems in these areas all use learning. Watson, the Jeopardy! computer champion, would not have been possible without it.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
It was only after World War II that Stanford began to emerge as a center of technical excellence, owing largely to the campaigns of Frederick Terman, dean of the School of Engineering and architect-of-record of the military-industrial-academic complex that is Silicon Valley. During World War II Terman had been tapped by his own mentor, presidential science advisor Vannevar Bush, to run the secret Radio Research Lab at Harvard and was determined to capture a share of the defense funding the federal government was preparing to redirect toward postwar academic research. Within a decade he had succeeded in turning the governor’s stud farm into the Stanford Industrial Park, instituted a lucrative honors cooperative program that provided a camino real for local companies to put selected employees through a master’s degree program, and overseen major investments in the most promising areas of research. Enrollments rose by 20 percent, and over one-third of entering class of 1957 started in the School of Engineering—more than double the national average.4 As he rose from chairman to dean to provost, Terman was unwavering in his belief that engineering formed the heart of a liberal education and labored to erect his famous “steeples of excellence” with strategic appointments in areas such as semiconductors, microwave electronics, and aeronautics. Design, to the extent that it was a recognized field at all, remained on the margins, the province of an older generation of draftsmen and machine builders who were more at home in the shop than the research laboratory—a situation Terman hoped to remedy with a promising new hire from MIT: “The world has heard very little, if anything, of engineering design at Stanford,” he reported to President Wallace Sterling, “but they will be hearing about it in the future.
Barry M. Katz (Make It New: A History of Silicon Valley Design (The MIT Press))
Fortunately, making friends in law school is easy because of the psychological bonding effects of group terror. In a famous social psychology experiment, researchers put a group of monkeys in the same cage with a group of lions. Monkeys and lions usually don’t socialize because the lions eat the monkeys, which causes hard feelings. Early in the experiment, it appeared events would follow this customary pattern as the lions began chasing the monkeys and the monkeys began bonking the lions on the heads with coconuts. At this point, the researchers inserted a Contracts professor into the cage who began conducting a Socratic dialogue about the doctrine of promissory estoppel. An amazing transformation occurred. The lions and monkeys immediately locked paws and began singing pub songs. Within a few minutes, the lions were giving the monkeys foot massages and the monkeys were encouraging the lions to get in touch with their inner cubs. Okay, that wasn’t a real experiment, but I’m confident it would work out that way. That’s what
Andrew J. McClurg (McClurg's 1L of a Ride: A Well-Traveled Professor's Roadmap to Success in the First Year of Law School, 2d: A Well-Traveled Professor's Roadmap to Success ... the First Year of Law Schoo (Career Guides))
The truth was that Newton’s biblical research was central to his entire scientific career. They form the essential backdrop for his most famous work, the Principia Mathematica.
Arthur Herman (The Cave and the Light: Plato Versus Aristotle, and the Struggle for the Soul of Western Civilization)
the ideal inductive approach, you will not have any prior beliefs about gender and its effects. In Strauss and Corbin’s famous description, “The researcher begins with an area of study and allows the theory to emerge from the data
Sam Ladner (Mixed Methods: A short guide to applied mixed methods research)
Another middle-sized (six thousand) Southern university on the move. Conservative by national standards, although it severed its ties to the ultraconservative Southern Baptists in 1986. ACC athletics and Greek parties shape the social scene. The strategic central North Carolina location is accessible to mountains, beaches, and the famous research triangle. (Rising Stars - Wake Forest University)
Fiske Guide To Colleges (Fiske Guide to Colleges 2005)
In 1950, he was accorded the dubious honor of being the first prominent scientist to appear on the earliest of Senator Joseph McCarthy’s famous lists of crypto-communists.
Sylvia Nasar (A Beautiful Mind)
NEXT day was fine and warm. 'We can go across to the island this morning,' said Aunt Fanny. 'We'll take our own food, because I'm sure Uncle Quentin will have forgotten we're coming.' 'Has he a boat there?' asked George. 'Father hasn't taken my boat, has he?' 'No, dear,' said her mother. 'He's got another boat. I was afraid he would never be able to get it in and out of all those dangerous rocks round the island, but he got one of the fishermen to take him, and had his own boat towed behind, with all its stuff in.' 'Who built the tower?' asked Julian. 'Oh, he made out the plans himself, and some men were sent down from the Ministry of Research to put the tower up for him,' said Aunt Fanny. 'It was all rather hush-hush really. The people here were most curious about it, but they don't know any more than I do! No local man helped in the building, but one or two fishermen were hired to take the material to the island, and to land the men and so on.' 'It's all very mysterious,' said Julian. 'Uncle Quentin leads rather an exciting life, really, doesn't he? I wouldn't mind being a scientist myself. I want to be something really worthwhile when I grow up. I'm not just
Enid Blyton (Five On Kirrin Island Again (Famous Five Book 6))
What are we to do at any given moment, when we cannot say which of our current claims will be sustained and which will be rejected? This is one of the central questions that I have raised. Because we cannot know which of current claims will be sustained, the best we can do is to consider the weight of scientific evidence, the fulcrum of scientific opinion, and the trajectory of scientific knowledge. This is why consensus matters: If scientists are still debating a matter, then we may well be wise to “wait and see,” if conditions permit. If the available empirical evidence is thin, we may want to do more research. But the uncertainty of future scientific knowledge should not be used as an excuse for delay. As the epidemiologist Sir Austin Bradford Hill famously argued, “All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time.” At any given moment, it makes sense to make decisions on the information we have, and be prepared to alter our plans if future evidence warrants.
Naomi Oreskes (Why Trust Science? (The University Center for Human Values Series))
prime example is the Japanese herb ashitaba, which is available as a tea or powder and helps prevent zombie cells. It is traditionally used to treat high blood pressure, hay fever, gout, and digestive issues, but researchers recently discovered a compound in the plant called dimethoxychalcone (DMC—no relation to the famous rappers), which slows senescence. In worms and fruit flies, DMC increases life-span by 20 percent.
Dave Asprey (Super Human: The Bulletproof Plan to Age Backward and Maybe Even Live Forever)
Blog,,cheifagboladedayoire.blogspot.com gmail..com,,, adayoire@gmail.com What's app,,,+2348168965161 CHEIF DAYOIRE is the only best powerful traditional spiritual herbalist healer, Lost Love Spells, Sangoma, LOTTO Winning Spells, Marriage Spells Caster, AZUUA Magic Ring for wealth, AZUUA Magic Wallet for money, Get Money into your Account Spells, Penis Enlargement Medicine, Back pains Medicine, Hips and Bums Enlargement, Breasts Enlargement, Short boys for money, Black Magic Spells, Voodoo Spells, Binding Spells and many more. I use the miracle black magic spells and strong herbal medicine to heal and cure all people’s complications in life. I inherited this job from my ancestors of my family. For so long my family has been famous as the best traditional spiritual healer family. CHEIF DAYOIRE can read your fate and destiny accurately by using the ancient methods of checking through water, mirror, your hands and many other enabling me to tell you all your problems, AM the current leader and Fore teller of the grand ancestral shrine which has been in existence since the beginning of the world as a source of the most powerful unseen forces, I have solved many mysterious problems by using the invisible powers. Am regarded by many as the greatest powerful spiritual healer on the planet today” The Gods of my fore ‘father’s ancestral powers anointed me when I was two months old to inherit, heal and solve most of the problems and ailments that are failed to be healed by other doctors. Education background: I hold a bachelor’s degree in medicine but ancestors forced me to do the work they anointed me for: THE PROBLEMS THAT I CAN HEAL AND SOLVE THROUGH THE POWERFUL SPIRITUAL ANCESTORS AND HERBAL MEDICINAL RESEARCHES INCLUDE; 1) Do you want Supernatural Luck into your life, 2) See your Enemies Using a Mirror, 3) Get back LOST LOVER in 1–2 days, …..BEST LOVE SPELL CASTER……. 4) Do you spend sleepless nights thinking and dreaming about that lover of your life but your lover’s mind is elsewhere (A shortest Time & Seal Up Marriage with eternal Love & Happiness is here.) ……BEST MARRIAGE SPELLS….. Call chief dayoire on +2348168965161
Adayoire
Reintroducing history into evolutionary thinking has already begun at other biological scales. The cell, once an emblem of replicable units, turns out to be the historical product of symbiosis among free-living bacteria. Even DNA turns out to have more history in its amino-acid sequences than once thought. Human DNA is part virus; viral encounters mark historical moments in making us human. Genome research has taken up the challenge of identifying encounter in the making of DNA. Population science cannot avoid history for much longer. Fungi are ideal guides. Fungi have always been recalcitrant to the iron cage of self-replication. Like bacteria, some are given to exchanging genes in nonreproductive encounters (“horizontal gene transfer”); many also seem averse to keeping their genetic material sorted out as “individuals” and “species,” not to speak of “populations.” When researchers studied the fruiting bodies of what they thought of as a species, the expensive Tibetan “caterpillar fungus,” they found many species entangled together. When they looked into the filaments of Armillaria root rot, they found genetic mosaics that confused the identification of an individual. Meanwhile, fungi are famous for their symbiotic attachments. Lichen are fungi living together with algae and cyanobacteria. I have been discussing fungal collaborations with plants, but fungi live with animals as well. For example, Macrotermes termites digest their food only through the help of fungi. The termites chew up wood, but they cannot digest it. Instead, they build “fungus gardens” in which the chewed-up wood is digested by Termitomyces fungi, producing edible nutrients. Researcher Scott Turner points out that, while you might say that the termites farm the fungus, you could equally say that the fungus farms the termites. Termitomyces uses the environment of the termite mound to outcompete other fungi; meanwhile, the fungus regulates the mound, keeping it open, by throwing up mushrooms annually, creating a colony-saving disturbance in termite mound-building.
Anna Lowenhaupt Tsing
Speaking at the Chaos Communication Congress, an annual computer hacker conference held in Berlin, Germany, Tobias Engel, founder of Sternraute, and Karsten Nohl, chief scientist for Security Research Labs, explained that they could not only locate cell-phone callers anywhere in the world, they could also listen in on their phone conversations. And if they couldn’t listen in real time, they could record the encrypted calls and texts for later decryption.
Kevin D. Mitnick (The Art of Invisibility: The World's Most Famous Hacker Teaches You How to Be Safe in the Age of Big Brother and Big Data)
Education was still considered a privilege in England. At Oxford you took responsibility for your efforts and for your performance. No one coddled, and no one uproariously encouraged. British respect for the individual, both learner and teacher, reigned. If you wanted to learn, you applied yourself and did it. Grades were posted publicly by your name after exams. People failed regularly. These realities never ceased to bewilder those used to “democracy” without any of the responsibility. For me, however, my expectations were rattled in another way. I arrived anticipating to be snubbed by a culture of privilege, but when looked at from a British angle, I actually found North American students owned a far greater sense of entitlement when it came to a college education. I did not realize just how much expectations fetter—these “mind-forged manacles,”2 as Blake wrote. Oxford upholds something larger than self as a reference point, embedded in the deep respect for all that a community of learning entails. At my very first tutorial, for instance, an American student entered wearing a baseball cap on backward. The professor quietly asked him to remove it. The student froze, stunned. In the United States such a request would be fodder for a laundry list of wrongs done against the student, followed by threatening the teacher’s job and suing the university. But Oxford sits unruffled: if you don’t like it, you can simply leave. A handy formula since, of course, no one wants to leave. “No caps in my classroom,” the professor repeated, adding, “Men and women have died for your education.” Instead of being disgruntled, the student nodded thoughtfully as he removed his hat and joined us. With its expanses of beautiful architecture, quads (or walled lawns) spilling into lush gardens, mist rising from rivers, cows lowing in meadows, spires reaching high into skies, Oxford remained unapologetically absolute. And did I mention? Practically every college within the university has its own pub. Pubs, as I came to learn, represented far more for the Brits than merely a place where alcohol was served. They were important gathering places, overflowing with good conversation over comforting food: vital humming hubs of community in communication. So faced with a thousand-year-old institution, I learned to pick my battles. Rather than resist, for instance, the archaic book-ordering system in the Bodleian Library with technological mortification, I discovered the treasure in embracing its seeming quirkiness. Often, when the wrong book came up from the annals after my order, I found it to be right in some way after all. Oxford often works such. After one particularly serendipitous day of research, I asked Robert, the usual morning porter on duty at the Bodleian Library, about the lack of any kind of sophisticated security system, especially in one of the world’s most famous libraries. The Bodleian was not a loaning library, though you were allowed to work freely amid priceless artifacts. Individual college libraries entrusted you to simply sign a book out and then return it when you were done. “It’s funny; Americans ask me about that all the time,” Robert said as he stirred his tea. “But then again, they’re not used to having u in honour,” he said with a shrug.
Carolyn Weber (Surprised by Oxford)
Bill Wilson would never have another drink. For the next thirty-six years, until he died of emphysema in 1971, he would devote himself to founding, building, and spreading Alcoholics Anonymous, until it became the largest, most well-known and successful habit-changing organization in the world. An estimated 2.1 million people seek help from AA each year, and as many as 10 million alcoholics may have achieved sobriety through the group. AA doesn’t work for everyone—success rates are difficult to measure, because of participants’ anonymity—but millions credit the program with saving their lives. AA’s foundational credo, the famous twelve steps, have become cultural lodestones incorporated into treatment programs for overeating, gambling, debt, sex, drugs, hoarding, self-mutilation, smoking, video game addictions, emotional dependency, and dozens of other destructive behaviors. The group’s techniques offer, in many respects, one of the most powerful formulas for change. All of which is somewhat unexpected, because AA has almost no grounding in science or most accepted therapeutic methods. Alcoholism, of course, is more than a habit. It’s a physical addiction with psychological and perhaps genetic roots. What’s interesting about AA, however, is that the program doesn’t directly attack many of the psychiatric or biochemical issues that researchers say are often at the core of why alcoholics drink. In fact, AA’s methods seem to sidestep scientific and medical findings altogether, as well as the types of intervention many psychiatrists say alcoholics really need. What AA provides instead is a method for attacking the habits that surround alcohol use. AA, in essence, is a giant machine for changing habit loops. And though the habits associated with alcoholism are extreme, the lessons AA provides demonstrate how almost any habit—even the most obstinate—can be changed.
Charles Duhigg (The Power Of Habit: Why We Do What We Do In Life And Business)
One would expect to find a comparatively high proportion of carbon 13 [the carbon from corn] in the flesh of people whose staple food of choice is corn - Mexicans, most famously. Americans eat much more wheat than corn - 114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people, though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn't lie, and researchers who compared the carbon isotopes in the flesh or hair of Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. 'When you look at the isotope ratios,' Todd Dawson, a Berkeley biologist who's done this sort of research, told me, 'we North Americans look like corn chips with legs.' Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar. So that's us: processed corn, walking.
Michael Pollan (The Omnivore's Dilemma: A Natural History of Four Meals)
Schools, in a noble effort to interest more girls in math and science, often try to combat stereotypes by showing children images of famous female scientists. “See, they did it. You can do it, too!” Unfortunately, these attempts rarely work, according to the research. Girls are more likely to remember the women as lab assistants. This is frustrating for those of us who try to combat gender stereotypes in children.
Christia Spears Brown (Parenting Beyond Pink & Blue: How to Raise Your Kids Free of Gender Stereotypes)
In a famous 1987 study, researchers Michael Diehl and Wolfgang Stroebe from Tubingen University in Germany concluded that brainstorming groups have never outperformed virtual groups. Of
Frans Johansson (Medici Effect: What You Can Learn from Elephants and Epidemics)
Rick smiled as he watched the waves roll toward their feet. He turned to her and said, “Since we’re going to Louisiana, I did some research and learned a few things. Did you know it’s famous for its gumbo and bayous?” Amelia’s eyes brightened. “Really? I’ve seen pictures of a bayou in a magazine. It’s so mysterious looking.” “It’s also the crawdad capital of the world.” “Crawdad? What’s that?” Rick’s eyes widened with surprise. “You don’t know what crawdads are?” She shook her head. “They’re a freshwater crayfish, similar to shrimp… only better.
Linda Weaver Clarke (Mystery on the Bayou (Amelia Moore Detective Series #6))
I became well known for researching what the corporate government did not want researched.
Steven Magee
I’m reminded of the famous attorney who was asked if luck played any part in success at trial and he said yes, and it usually comes at three in the morning when I’m in the library doing research.
Bill Fitzhugh (A Perfect Harvest (The Transplant Tetralogy Book 4))
Frankly, I see it as a “social force.” Before coming to this country (in naive pursuit of scientific freedom . . .) Dr. Reich had successively gotten expelled from the International Psychoanalytical Association for sounding too Marxist, expelled from the Communist Party for sounding too Freudian, expelled from the Socialist Party for sounding too anarchistic, fled Hitler for having known Jewish genes, and then got driven out of Sweden by a campaign of slander in the sensational press (for doing the kind of sex research that later made Masters and Johnson famous.) I would say Dr. Reich touched on a lot of hot issues and annoyed a lot of dogmatic people of Left and Right, and this created a truly international “social force” for the suppression of his ideas.
Robert Anton Wilson (Cosmic Trigger III: My Life After Death)
As a chief ingredient in the mythology of science, the accumulation of objective facts supposedly controls the history of conceptual change–as logical and self-effacing scientists bow before the dictates of nature and willingly change their views to accommodate the growth of conceptual knowledge. The paradigm for such an idealistic notion remains Huxley’s famous remark about “a beautiful theory killed by a nasty, ugly little fact.” But single facts almost never slay worldviews, at least not right away (and properly so, for the majority of deeply anomalous observations turn out to be wrong)... Anomalous facts get incorporated into existing theories, often with a bit of forced stretching to be sure, but usually with decent fit because most worldviews contain considerable flexibility. (How else could they last so long, or be so recalcitrant to overthrow?)
Stephen Jay Gould (Leonardo's Mountain of Clams and the Diet of Worms: Essays on Natural History)
However that may be, after prolonged research on myself, I brought out the fundamental duplicity of the human being. Then I realized, as a result of delving in my memory, that modesty helped me to shine, humility to conquer, and virtue to oppress. I used to wage war by peaceful means and eventually used to achieve, through disinterested means, everything I desired. For instance, I never complained that my birthday was overlooked; people were even surprised, with a touch of admiration, by my discretion on this subject. But the reason for my disinterestedness was even more discreet: I longed to be forgotten in order to be able to complain to myself. Several days before the famous date (which I knew very well) I was on the alert, eager to let nothing slip that might arouse the attention and memory of those on whose lapse I was counting (didn’t I once go so far as to contemplate falsifying a friend’s calendar?). Once my solitude was thoroughly proved, I could surrender to the charms of a virile self-pity.
Albert Camus (The Fall)
What is so important about Engelbart’s legacy is that he saw the computer as primarily a tool to augment—not replace—human capability. In our current era, by contrast, much of the financing flowing out of Silicon Valley is aimed at building machines that can replace humans. In a famous encounter in 1953 at MIT, Marvin Minsky, the father of research on artificial intelligence, declared: “We’re going to make machines intelligent. We are going to make them conscious!” To which Doug Engelbart replied: “You’re going to do all that for the machines? What are you going to do for the people?
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
The properties of the renewal tissues enabled the original definition of stem cell behaviour in terms of the ability to self-renew and to generate differentiated progeny. But the most famous stem cell of them all is now the embryonic stem cell (ES cell). In one sense, the ES cell is the iconic stem cell. It is the type of stem cell that has attracted all of the ethical controversy, and it is what lay people are thinking of when they refer to ‘stem cell research’. But ironically, the embryonic stem cell does not exist in nature. It is a creature that has been created by mankind and exists only in the world of tissue culture: the growth of cells in flasks in the laboratory, kept in temperature-controlled incubators, exposed to controlled concentrations of oxygen and carbon dioxide, and nourished by complex artificial media. Cells grown in culture are often referred to by the Latin phrase in vitro (in glass, since the relevant containers used to be made of glass) and distinguished from in vivo (inside the living body).
Jonathan M.W. Slack (Stem Cells: A Very Short Introduction)
the roughly $800 billion in available stimulus, we directed more than $90 billion toward clean energy initiatives across the country. Within a year, an Iowa Maytag plant I’d visited during the campaign that had been shuttered because of the recession was humming again, with workers producing state-of-the-art wind turbines. We funded construction of one of the world’s largest wind farms. We underwrote the development of new battery storage systems and primed the market for electric and hybrid trucks, buses, and cars. We financed programs to make buildings and businesses more energy efficient, and collaborated with Treasury to temporarily convert the existing federal clean energy tax credit into a direct-payments program. Within the Department of Energy, we used Recovery Act money to launch the Advanced Research Projects Agency–Energy (ARPA-E), a high-risk, high-reward research program modeled after DARPA, the famous Defense Department effort launched after Sputnik that helped develop not only advanced weapons systems like stealth technology but also an early iteration of the internet, automated voice activation, and GPS.
Barack Obama (A Promised Land)
Just because someone does something well doesn’t mean it’s easy for them,” I said. “Elsa’s sermons are researched and very carefully written. They are not first-draft productions. And the same goes for Doris’s.” “I know, I know.” Jennie scrunched up her face. “They’re the great older women of color in the association: they’ve both preached the big important sermons at General Assembly. Everyone knows they’re amazing.” “And that’s a bad thing?” “No. But there are also young people who have new and important ways of seeing and saying things. They maybe aren’t as experienced as these great older women—and I know, it’s unbelievable that big famous ministers like Elsa and Doris are even interested in us, but to me they’re a little, I don’t know, been there, done that.
Michelle Huneven (Search)
Lisa Brooks PhD was one of just 175 people to receive a fellowship from the famous John Simon Guggenheim Memorial Foundation. Lisa Brooks PhD received the award based on her prior accomplishments in history, geography and literature and the future promise of her research.
Lisa Brooks PhD
The reason archaeological evidence from Europe is so rich is that European governments tend to be rich; and that European professional institutions, learned societies and university departments have been pursuing prehistory far longer on their own doorstep than in other parts of the world. With each year that passes, new evidence accumulates for early behavioural complexity elsewhere: not just Africa, but also the Arabian Peninsula, Southeast Asia and the Indian subcontinent. Even as we write, a cave site on the coast of Kenya called Panga ya Saidi is yielding evidence of shell beads and worked pigments stretching back 60,000 years; and research on the islands of Borneo and Sulawesi is opening vistas on to an unsuspected world of cave art, many thousands of years older than the famous images of Lascaux and Altamira, on the other side of Eurasia.
David Graeber (The Dawn of Everything: A New History of Humanity)
The phrase Daring Greatly is from Theodore Roosevelt's speech "Citizenship in a Republic." The speech, sometimes referred to as "The Man in the Arena," was delivered at the Sorbonne in Paris, France, on April 23, 1910. This is the passage that made the speech famous: "It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least he fails while daring greatly..." The first time I read this quote, I thought, This is vulnerability. Everything I've learned from over a decade of research on vulnerability has taught me this exact lesson. Vulnerability is not knowing victory or defeat, it's understanding the necessity of both; it's engaging. It's being all in.
Brené Brown (Daring Greatly: How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead)
Science has an established tradition when it comes to aspects of nature that defy logical explanation and resist experimentation. It ignores them. This is actually a proper response, since no one wants researchers offering fudgy guesses. Official silence may not be helpful, but it is respectable. However, as a result, the very word “consciousness” may seem out of place in science books or articles, despite the fact that, as we’ll see, most famous names in quantum mechanics regarded it as central to the understanding of the cosmos.
Robert Lanza (The Grand Biocentric Design: How Life Creates Reality)
In November of 1997, the New Jersey–based independent radio station WFMU broadcast a live forty-seven-minute interview with Ronald Thomas Clontle, the author of an upcoming book titled Rock, Rot & Rule. The book, billed as “the ultimate argument settler,” was (theoretically) a listing of almost every musical artist of the past fifty years, with each act designated as “rocking,” “rotting,” or “ruling” (with most of the research conducted in a coffeehouse in Lawrence, Kansas). The interview was, of course, a now semi-famous hoax. The book is not real and “Ronald Thomas Clontle” was actually Jon Wurster, the drummer for indie bands like Superchunk and (later) the Mountain Goats. Rock, Rot & Rule is a signature example of what’s now awkwardly classified as “late-nineties alt comedy,” performed at the highest possible level—the tone is understated, the sensibility is committed and absurd, and the unrehearsed chemistry between Wurster and the program’s host (comedian Tom Scharpling) is otherworldly. The sketch would seem like the ideal comedic offering for the insular audience of WFMU, a self-selecting group of sophisticated music obsessives from the New York metropolitan area. Yet when one relistens to the original Rock, Rot & Rule broadcast, the most salient element is not the comedy. It’s the apoplectic phone calls from random WFMU listeners. The callers do not recognize this interview as a hoax, and they’re definitely not “ironic” or “apathetic.” They display none of the savvy characteristics now associated with nineties culture. Their anger is almost innocent.
Chuck Klosterman (The Nineties: A Book)