Famous Researcher Quotes

I began to come into close contact with poverty, with hunger, with disease, with the inability to cure a child because of a lack of resources… And I began to see there was something that, at that time, seemed to me almost as important as being a famous researcher or making some substantial contribution to medical science, and this was helping those people.
Ernesto Che Guevara (The Motorcycle Diaries: Notes on a Latin American Journey)
The trouble with Goodreads is that they never authenticate these quotations of famous people.
Aristotle (Physics)
life expectancy among working-class white Americans had been decreasing since the early 2000s. In modern history the only obvious parallel was with Russia in the desperate aftermath of the fall of the Soviet Union. One journalistic essay and academic research paper after another confirmed the disaster, until the narrative was capped in 2015 by Anne Case and Angus Deaton’s famous account of “deaths of despair.”
Adam Tooze (Crashed: How a Decade of Financial Crises Changed the World)
It’s super-important to have a strong social media presence, and Jane’s always going, When interviewers ask you about your Twitter, say you love reaching out directly to your fans, and I’m like, I don’t even know how to use Twitter or what the password is because you disabled my laptop’s wireless and only let me go on the Internet to do homework research or email Nadine assignments, and she says, I’m doing you a big favor, it’s for nobodies who want to pretend like they’re famous and for self-promoting hacks without PR machines, and adults act like teenagers passing notes and everyone’s IQ drops thirty points on it.
Teddy Wayne (The Love Song of Jonny Valentine)
A separate, international team analyzed more than a half million research articles, and classified a paper as “novel” if it cited two other journals that had never before appeared together. Just one in ten papers made a new combination, and only one in twenty made multiple new combinations. The group tracked the impact of research papers over time. They saw that papers with new knowledge combinations were more likely to be published in less prestigious journals, and also much more likely to be ignored upon publication. They got off to a slow start in the world, but after three years, the papers with new knowledge combos surpassed the conventional papers, and began accumulating more citations from other scientists. Fifteen years after publication, studies that made multiple new knowledge combinations were way more likely to be in the top 1 percent of most-cited papers. To recap: work that builds bridges between disparate pieces of knowledge is less likely to be funded, less likely to appear in famous journals, more likely to be ignored upon publication, and then more likely in the long run to be a smash hit in the library of human knowledge.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
By the end of the day, we determined that we could provide chocolate therapy three times a day and research a chocolate protocol at the world-famous Hershey's Hospital. Do you think they provide it in IV formula?
Keith Desserich
In the scientific world, the syndrome known as 'great man's disease' happens when a famous researcher in one field develops strong opinions about another field that he or she does not understand, such as a chemist who decides that he is an expert in medicine or a physicist who decides that he is an expert in cognitive science. They have trouble accepting that they must go back to school before they can make pronouncements in a new field.
Paul Krugman (A Country Is Not a Company (Harvard Business Review Classics))
In a famous experiment conducted by NASA in the 1990s, researchers fed a variety of psychoactive substances to spiders to see how they would affect their web-making skills. The caffeinated spider spun a strangely cubist and utterly ineffective web, with oblique angles, openings big enough to let small birds through, and completely lacking in symmetry or a center. (The web was far more fanciful than the ones spun by spiders given cannabis or LSD.)
Michael Pollan (This Is Your Mind on Plants)
But when he instructed his staff to give the injections without telling patients they contained cancer cells, three young Jewish doctors refused, saying they wouldn’t conduct research on patients without their consent. All three knew about the research Nazis had done on Jewish prisoners. They also knew about the famous Nuremberg Trials.
Rebecca Skloot (The Immortal Life of Henrietta Lacks)
Lincoln is not the only famous leader to have battled depression. Winston Churchill lived with the ‘black dog’ for much of his life too. Watching a fire, he once remarked to a young researcher he was employing: ‘I know why logs spit. I know what it is to be consumed.’
Matt Haig (Reasons To Stay Alive)
Gene patents are the point of greatest concern in the debate over ownership of human biological materials, and how that ownership might interfere with science. As of 2005—the most recent year figures were available—the U.S. government had issued patents relating to the use of about 20 percent of known human genes, including genes for Alzheimer’s, asthma, colon cancer, and, most famously, breast cancer. This means pharmaceutical companies, scientists, and universities control what research can be done on those genes, and how much resulting therapies and diagnostic tests will cost. And some enforce their patents aggressively: Myriad Genetics, which holds the patents on the BRCA1 and BRCA2 genes responsible for most cases of hereditary breast and ovarian cancer, charges $3,000 to test for the genes. Myriad has been accused of creating a monopoly, since no one else can offer the test, and researchers can’t develop cheaper tests or new therapies without getting permission from Myriad and paying steep licensing fees. Scientists who’ve gone ahead with research involving the breast-cancer genes without Myriad’s permission have found themselves on the receiving end of cease-and-desist letters and threats of litigation.
Rebecca Skloot
Malcolm Gladwell puts the "pop" in pop psychology, and although revered in lay circles, is roundly dismissed by experts - even by the researchers he makes famous.
Paul Gibbons (The Science of Successful Organizational Change: How Leaders Set Strategy, Change Behavior, and Create an Agile Culture)
Research shows that those who believe in a wrathful God are more likely to suffer from depression and anxiety disorders than those who believe in a loving, merciful God.
Tony Jones (Did God Kill Jesus?: Searching for Love in History's Most Famous Execution)
As the famous saying goes, “Great minds discuss ideas; average minds discuss events; small minds discuss people.” But research suggests it deserves more credit than that.
Jamil Zaki (Hope for Cynics: The Surprising Science of Human Goodness)
She knew for a fact that being left-handed automatically made you special. Marie Curie, Albert Einstein, Linus Pauling, and Albert Schweitzer were all left-handed. Of course, no believable scientific theory could rest on such a small group of people. When Lindsay probed further, however, more proof emerged. Michelangelo, Leonardo da Vinci, M.C. Escher, Mark Twain, Hans Christian Andersen, Lewis Carroll, H.G. Wells, Eudora Welty, and Jessamyn West—all lefties. The lack of women in her research had initially bothered her until she mentioned it to Allegra. "Chalk that up to male chauvinism," she said. "Lots of left-handed women were geniuses. Janis Joplin was. All it means is that the macho-man researchers didn't bother asking."
Jo-Ann Mapson (The Owl & Moon Cafe)
There was some awareness back then about hidden gender bias, particularly because of research like the famous “Howard and Heidi” study. Two Columbia Business School professors had taken an HBS case study about a female venture capitalist named Heidi Roizen and, in half the classes they taught, presented exactly the same stories and qualifications but called her Howard. In surveys of the students, they came away believing that Howard was beloved—so competent! such a go-getter!—whereas Heidi was a power-hungry egomaniac. Same person, just a different name.
Ellen Pao (Reset: My Fight for Inclusion and Lasting Change)
Frankly, the overwhelming majority of academics have ignored the data explosion caused by the digital age. The world’s most famous sex researchers stick with the tried and true. They ask a few hundred subjects about their desires; they don’t ask sites like PornHub for their data. The world’s most famous linguists analyze individual texts; they largely ignore the patterns revealed in billions of books. The methodologies taught to graduate students in psychology, political science, and sociology have been, for the most part, untouched by the digital revolution. The broad, mostly unexplored terrain opened by the data explosion has been left to a small number of forward-thinking professors, rebellious grad students, and hobbyists. That will change.
Seth Stephens-Davidowitz (Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are)
The successful ideas survive scrutiny. The bad ideas get discarded. Conformity is also laughable to scientists attempting to advance their careers. The best way to get famous in your own lifetime is to pose an idea that counters prevailing research and that earns a consistency of observations and experiment. Healthy disagreement is a natural state on the bleeding edge of discovery.
Neil deGrasse Tyson (Starry Messenger: Cosmic Perspectives on Civilization)
He was nice enough, an old guy who got famous in the 1960s for doing drugs and getting high and calling it research, so you have to figure he was a bit of a flake and probably pretty immature, too.
Ruth Ozeki (A Tale for the Time Being)
Turing’s vision was shared by his fellow computer scientists in America, who codified their curiosity in 1956 with a now famous Dartmouth College research proposal in which the term “artificial intelligence” was coined.
Fei-Fei Li (The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI)
Neurologically speaking, though, there are reasons we develop a confused sense of priorities when we’re in front of our computer screens. For one thing, email comes at unpredictable intervals, which, as B. F. Skinner famously showed with rats seeking pellets, is the most seductive and habit-forming reward pattern to the mammalian brain. (Think about it: would slot machines be half as thrilling if you knew when, and how often, you were going to get three cherries?) Jessie would later say as much to me when I asked her why she was “obsessed”—her word—with her email: “It’s like fishing. You just never know what you’re going to get.” More to the point, our nervous systems can become dysregulated when we sit in front of a screen. This, at least, is the theory of Linda Stone, formerly a researcher and senior executive at Microsoft Corporation. She notes that we often hold our breath or breathe shallowly when we’re working at our computers. She calls this phenomenon “email apnea” or “screen apnea.” “The result,” writes Stone in an email, “is a stress response. We become more agitated and impulsive than we’d ordinarily be.”
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
As a firstborn I also had a duty to succeed my father and look after my mother and siblings. Although school taught me that this was an outdated practice and that I would have been better off focusing on inheriting my father's assets for my own benefit, it was the strong emphasis on family values that ultimately prevailed. This was not because they had sounded good on paper or had been presented by a world-famous researcher, but because I saw they worked through my experience.
Salatiso Lonwabo Mdeni (The Homeschooling Father, How and Why I got started.: Traditional Schooling to Online Learning until Homeschooling)
Religion has used ritual forever. I remember a famous study led by psychologist Alfred Tomatis of a group of clinically depressed monks. After much examination, researchers concluded that the group’s depression stemmed from their abandoning a twice-daily ritual of gathering to sing Gregorian chants. They had lost the sense of community and the comfort of singing together in harmony. Creating beautiful music together was a formal recognition of their connection and a shared moment of joy.
Sue Johnson (Hold Me Tight: Seven Conversations for a Lifetime of Love (The Dr. Sue Johnson Collection Book 1))
Maria Orsic, a stunning beauty and an unusual medium, was not an obscure personality. She was known to many celebrities of the era and had a fleet of very powerful admirers and friends both in Germany and abroad; famous, brilliant and influential people like Charles Lindbergh, Nikola Tesla, Marshal Tito of Yugoslavia, Henry Ford, Eva Peron, and the most illustrious figures in spiritualism, parapsychological and psychical research in Great Britain. This was reported by Allied intelligence and documented by OSS operatives in Europe.
Maximillien de Lafayette (Volume I. UFOs: MARIA ORSIC, THE WOMAN WHO ORIGINATED AND CREATED EARTH’S FIRST UFOS (Extraterrestrial and Man-Made UFOs & Flying Saucers Book 1))
As a professional philosopher, I very rarely hyperventilate while doing research, but Peirce was a notorious recluse. Most of his books had been sold or carried off to Harvard at the end of his life, but somehow this little treasure—Peirce’s own copy of his first and most famous publication—had ended up here.
John Kaag (American Philosophy: A Love Story)
For the benefit of your research people, I would like to mention (so as to avoid any duplication of labor): that the planet is very like Mars; that at least seventeen states have Pinedales; that the end of the top paragraph Galley 3 is an allusion to the famous "canals" (or, more correctly, "channels") of Schiaparelli (and Percival Lowell); that I have thoroughly studied the habits of chinchillas; that Charrete is old French and should have one "t"; that Boke's source on Galley 9 is accurate; that "Lancelotik" is not a Celtic diminutive but a Slavic one; that "Betelgeuze" is correctly spelled with a "z", not an "s" as some dictionaries have it; that the "Indigo" Knight is the result of some of my own research; that Sir Grummore, mentioned both in Le Morte Darthur and in Amadis de Gaul, was a Scotsman; that L'Eau Grise is a scholarly pun; and that neither bludgeons nor blandishments will make me give up the word "hobnailnobbing".
Vladimir Nabokov
The basic concept of microdosing is nothing new. Albert Hofmann, who first synthesized LSD in 1938, considered it one of the drug’s most promising, and least researched, applications. He was among the first to realize its antidepressant and cognition-enhancing potential,[vi] famously taking between 10 and 20 μg himself, twice a week, for the last few decades of his life.[vii]
Paul Austin (Microdosing Psychedelics: A Practical Guide to Upgrade Your Life)
“So I did some research,” she went on. “The good thing about being a famous model is that you can call anyone and they’ll talk to you. So I called this illusionist I’d seen on Broadway a couple of years ago. He heard the story and then he laughed. I said what’s so funny. He asked me a question: Did this guru do this after dinner? I was surprised. What the hell could that have to do with it? But I said yes, how did you know? He asked if we had coffee. Again I said yes. Did he take his black? One more time I said yes.” Shauna was smiling now. “Do you know how he did it, Beck?” I shook my head. “No clue.” “When he passed the card to Wendy, it went over his coffee cup. Black coffee, Beck. It reflects like a mirror. That’s how he saw what I’d written. It was just a dumb parlor trick.”
Harlan Coben (Tell No One)
Working hard is important. But more effort does not necessarily yield more results. “Less but better” does. Ferran Adrià, arguably the world’s greatest chef, who has led El Bulli to become the world’s most famous restaurant, epitomizes the principle of “less but better” in at least two ways. First, his specialty is reducing traditional dishes to their absolute essence and then re-imagining them in ways people have never thought of before. Second, while El Bulli has somewhere in the range of 2 million requests for dinner reservations each year, it serves only fifty people per night and closes for six months of the year. In fact, at the time of writing, Ferran had stopped serving food altogether and had instead turned El Bulli into a full-time food laboratory of sorts where he was continuing to pursue nothing but the essence of his craft.1 Getting used to the idea of “less but better” may prove harder than it sounds, especially when we have been rewarded in the past for doing more … and more and more. Yet at a certain point, more effort causes our progress to plateau and even stall. It’s true that the idea of a direct correlation between results and effort is appealing. It seems fair. Yet research across many fields paints a very different picture. Most people have heard of the “Pareto Principle,” the idea, introduced as far back as the 1790s by Vilfredo Pareto, that 20 percent of our efforts produce 80 percent of results. Much later, in 1951, in his Quality-Control Handbook, Joseph Moses Juran, one of the fathers of the quality movement, expanded on this idea and called it “the Law of the Vital Few.”2 His observation was that you could massively improve the quality of a product by resolving a tiny fraction of the problems. He found a willing test audience for this idea in Japan, which at the time had developed a rather poor reputation for producing low-cost, low-quality goods. By adopting a process in which a high percentage of effort and attention was channeled toward improving just those few things that were truly vital, he made the phrase “made in Japan” take on a totally new meaning. And gradually, the quality revolution led to Japan’s rise as a global economic power.3
Greg McKeown (Essentialism: The Disciplined Pursuit of Less)
Lederman is also a charismatic personality, famous among his colleagues for his humor and storytelling ability. One of his favorite anecdotes relates the time when, as a graduate student, he arranged to bump into Albert Einstein while walking the grounds at the Institute for Advanced Study at Princeton. The great man listened patiently as the eager youngster explained the particle-physics research he was doing at Columbia, and then said with a smile, “That is not interesting.”
Sean Carroll (The Particle at the End of the Universe: The Hunt for the Higgs Boson and the Discovery of a New World)
Four thousand miles away in France, the old boys from the Haute-Loire Resistance wrote to each other to share the devastating news. They had enjoyed nearly forty years of freedom since spending a mere couple of months in Virginia’s presence in 1944. But the warrior they called La Madone had shown them hope, comradeship, courage, and the way to be the best version of themselves, and they had never forgotten. In the midst of hardship and fear, she had shared with them a fleeting but glorious state of happiness and the most vivid moment of their lives. The last of those famous Diane Irregulars—the ever-boyish Gabriel Eyraud, her chouchou—passed away in 2017 while I was researching Virginia’s story. Until the end of his days, he and the others who had known Virginia on the plateau liked to pause now and then to think of the woman in khaki who never, ever gave up on freedom. When they talked with awe and affection of her incredible exploits, they smiled and looked up at the wide, open skies with “les étoiles dans les yeux.”
Sonia Purnell (A Woman of No Importance: The Untold Story of the American Spy Who Helped Win World War II)
Starting something new in middle age might look that way too. Mark Zuckerberg famously noted that “young people are just smarter.” And yet a tech founder who is fifty years old is nearly twice as likely to start a blockbuster company as one who is thirty, and the thirty-year-old has a better shot than a twenty-year-old. Researchers at Northwestern, MIT, and the U.S. Census Bureau studied new tech companies and showed that among the fastest-growing start-ups, the average age of a founder was forty-five when the company was launched.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
The smartest person to ever walk this Earth in all probability lived and died herding goats on a mountain somewhere, with no way to disseminate their work globally even if they had realised they were super smart and had the means to do something with their abilities. I am not keen on 'who are the smartest' lists and websites because, as Scott Barry Kaufman points out, the concept of genius privileges the few who had the opportunity to see through and promote their life’s work, while excluding others who may have had equal or greater raw potential but lacked the practical and financial support, and the communication platform that famous names clearly had. This is why I am keen to develop, through my research work, a definition of genius from a cognitive neuroscience and psychometric point of view, so that whatever we decide that is and how it should be measured, only focuses on clearly measurable factors within the individual’s mind, regardless of their external achievements, eminence, popularity, wealth, public platform etc. In my view this would be both more equitable and more scientific.
Gwyneth Wesley Rolph
THREE FAMOUS ENGRAVINGS depict Alexis St. Martin in his youth. I’ve seen them many times, in biographies of his surgeon William Beaumont, in Beaumont’s own book, in journal articles about the pair. As detailed as the artworks are, you can’t tell what St. Martin looked like from examining them. All three woodcuts are of the lower portion of his left breast, and the famous hole. I could pick St. Martin’s nipple out of a lineup before I could his eyes. I suppose this makes sense; Beaumont was a researcher and St. Martin his subject—more a body than a man.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
By 1952, the University of Minnesota nutritionist Ancel Keys was arguing that high blood levels of cholesterol caused heart disease, and that it was the fat in our diets that drove up cholesterol levels. Keys had a conflict of interest: his research had been funded by the sugar industry—the Sugar Research Foundation and then the Sugar Association—since 1944, if not earlier, and the K-rations he had famously developed for the military during the war (the “K” is said to have stood for “Keys”) were loaded with sugar. This might have naturally led him to perceive something other than sugar as the problem. We can only guess.
Gary Taubes (The Case Against Sugar)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
When I am asked to summarize the fundamental message from research on self-control, I recall Descartes’s famous dictum cogito, ergo sum—“I think, therefore I am.” What has been discovered about mind, brain, and self-control lets us move from his proposition to “I think, therefore I can change what I am.” Because by changing how we think, we can change what we feel, do, and become. If that leads to the question “But can I really change?,” I reply with what George Kelly said to his therapy clients when they kept asking him if they could get control of their lives. He looked straight into their eyes and said, “Would you like to?”
Walter Mischel
Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes.
Mariana Mazzucato
Research made famous by Kent Berridge at the University of Michigan shows that dopamine is released when something new and potentially useful triggers the brain. We often think dopamine is the stuff of pleasure, but Berridge’s research shows that dopamine is related to pleasure, but not pleasure itself. It’s a chemical message that says, “Give me more!” And it’s activated by sex, many drugs, chocolate, and novelty. The buzz of the phone in your pocket, wondering if it’s good news or bad, the endless potential of what you could learn from the next Instagram story you swipe through, triggers dopamine release in a way similar to methamphetamine and lust. This, as I’m sure you have noticed, is very distracting.
Jedidiah Jenkins (Like Streams to the Ocean: Notes on Ego, Love, and the Things That Make Us Who We Are: Essays)
Some researchers, such as psychologist Jean Twenge, say this new world where compliments are better than sex and pizza, in which the self-enhancing bias has been unchained and allowed to gorge unfettered, has led to a new normal in which the positive illusions of several generations have now mutated into full-blown narcissism. In her book The Narcissism Epidemic, Twenge says her research shows that since the mid-1980s, clinically defined narcissism rates in the United States have increased in the population at the same rate as obesity. She used the same test used by psychiatrists to test for narcissism in patients and found that, in 2006, one in four U.S. college students tested positive. That’s real narcissism, the kind that leads to diagnoses of personality disorders. In her estimation, this is a dangerous trend, and it shows signs of acceleration. Narcissistic overconfidence crosses a line, says Twenge, and taints those things improved by a skosh of confidence. Over that line, you become less concerned with the well-being of others, more materialistic, and obsessed with status in addition to losing all the restraint normally preventing you from tragically overestimating your ability to manage or even survive risky situations. In her book, Twenge connects this trend to the housing market crash of the mid-2000s and the stark increase in reality programming during that same decade. According to Twenge, the drive to be famous for nothing went from being strange to predictable thanks to a generation or two of people raised by parents who artificially boosted self-esteem to ’roidtastic levels and then released them into a culture filled with new technologies that emerged right when those people needed them most to prop up their self-enhancement biases. By the time Twenge’s research was published, reality programming had spent twenty years perfecting itself, and the modern stars of those shows represent a tiny portion of the population who not only want to be on those shows, but who also know what they are getting into and still want to participate. Producers with the experience to know who will provide the best television entertainment to millions then cull that small group. The result is a new generation of celebrities with positive illusions so robust and potent that the narcissistic overconfidence of the modern American teenager by comparison is now much easier to see as normal.
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)
The person who discovered the answer was a retiring, self-funded scientist named Peter Mitchell who in the early 1960s inherited a fortune from the Wimpey house-building company and used it to set up a research center in a stately home in Cornwall. Mitchell was something of an eccentric. He wore shoulder-length hair and an earring at a time when that was especially unusual among serious scientists. He was also famously forgetful. At his daughter’s wedding, he approached another guest and confessed that she looked familiar, though he couldn’t quite place her. “I was your first wife,” she answered. Mitchell’s ideas were universally dismissed, not altogether surprisingly. As one chronicler has noted, “At the time that Mitchell proposed his hypothesis there was not a shred of evidence in support of it.” But he was eventually vindicated and in 1978 was awarded the Nobel Prize in Chemistry—an extraordinary accomplishment for someone who worked from a home lab.
Bill Bryson (The Body: A Guide for Occupants)
When I was a kid, my mother thought spinach was the healthiest food in the world because it contained so much iron. Getting enough iron was a big deal then because we didn't have 'iron-fortified' bread. Turns out that spinach is an okay source of iron, but no better than pizza, pistachio nuts, cooked lentils, or dried peaches. The spinach-iron myth grew out of a simple mathematical miscalculation: A researcher accidentally moved a decimal point one space, so he thought spinach had 10 times more iron than it did. The press reported it, and I had to eat spinach. Moving the decimal point was an honest mistake--but it's seldom that simple. If it happened today I'd suspect a spinach lobby was behind it. Businesses often twist science to make money. Lawyers do it to win cases. Political activists distort science to fit their agenda, bureaucrats to protect their turf. Reporters keep falling for it. Scientists sometimes go along with it because they like being famous.
John Stossel (Give Me a Break: How I Exposed Hucksters, Cheats, and Scam Artists and Became the Scourge of the Liberal Media...)
However that may be, after prolonged research on myself, I brought out the fundamental duplicity of the human being. Then I realized, as a result of delving in my memory, that modesty helped me to shine, humility to conquer, and virtue to oppress. I used to wage war by peaceful means and eventually used to achieve, through disinterested means, everything I desired. For instance, I never complained that my birthday was overlooked; people were even surprised, with a touch of admiration, by my discretion on this subject. But the reason for my disinterestedness was even more discreet: I longed to be forgotten in order to be able to complain to myself. Several days before the famous date (which I knew very well) I was on the alert, eager to let nothing slip that might arouse the attention and memory of those on whose lapse I was counting (didn’t I once go so far as to contemplate falsifying a friend’s calendar?). Once my solitude was thoroughly proved, I could surrender to the charms of a virile self-pity.
Albert Camus (The Fall)
Look at the telephone; it would remind you of a unique scientist, Alexander Graham Bell. He, besides being a great inventor, was also a man of great compassion and service. In fact, much of the research which led to the development of the telephone was directed at finding solutions to the challenges of hearing impaired people and helping them to be able to listen and communicate. Bell’s mother and wife were both hearing impaired and it profoundly changed Bell’s outlook to science. He aimed to make devices which would help the hearing impaired. He started a special school in Boston to teach hearing impaired people in novel ways. It was these lessons which inspired him to work with sound and led to the invention of the telephone. Can you guess the name of the most famous student of Alexander Graham Bell? It was Helen Keller, the great author, activist and poet who was hearing and visually impaired. About her teacher, she once said that Bell dedicated his life to the penetration of that ‘inhuman silence which separates and estranges’.
A.P.J. Abdul Kalam (Learning How to Fly: Life Lessons for the Youth)
And then, as slowly as the light fades on a calm winter evening, something went out of our relationship. I say that selfishly. Perhaps I started to look for something which had never been there in the first place: passion, romance. I daresay that as I entered my forties I had a sense that somehow life was going past me. I had hardly experienced those emotions which for me have mostly come from reading books or watching television. I suppose that if there was anything unsatisfactory in our marriage, it was in my perception of it—the reality was unchanged. Perhaps I grew up from childhood to manhood too quickly. One minute I was cutting up frogs in the science lab at school, the next I was working for the National Centre for Fisheries Excellence and counting freshwater mussel populations on riverbeds. Somewhere in between, something had passed me by: adolescence, perhaps? Something immature, foolish yet intensely emotive, like those favourite songs I had recalled dimly as if being played on a distant radio, almost too far away to make out the words. I had doubts, yearnings, but I did not know why or what for. Whenever I tried to analyse our lives, and talk about it with Mary, she would say, ‘Darling, you are on the way to becoming one of the leading authorities in the world on caddis fly larvae. Don’t allow anything to deflect you from that. You may be rather inadequately paid, certainly compared with me you are, but excellence in any field is an achievement beyond value.’ I don’t know when we started drifting apart. When I told Mary about the project—I mean about researching the possibility of a salmon fishery in the Yemen—something changed. If there was a defining moment in our marriage, then that was it. It was ironical, in a sense. For the first time in my life I was doing something which might bring me international recognition and certainly would make me considerably better off—I could live for years off the lecture circuit alone, if the project was even half successful. Mary didn’t like it. I don’t know what part she didn’t like: the fact I might become more famous than her, the fact I might even become better paid than her. That makes her sound carping.
Paul Torday (Salmon Fishing in the Yemen)
Benjamin Libet, a scientist in the physiology department of the University of California, San Francisco, was a pioneering researcher into the nature of human consciousness. In one famous experiment he asked a study group to move their hands at a moment of their choosing while their brain activity was being monitored. Libet was seeking to identify what came first — the brain’s electrical activity to make the hand move or the person’s conscious intention to make their hand move. It had to be the second one, surely? But no. Brain activity to move the hand was triggered a full half a second before any conscious intention to move it…. John-Dylan Haynes, a neuroscientist at the Max Planck Institute for Human Cognitive and Brain Studies in Leipzig, Germany, led a later study that was able to predict an action ten seconds before people had a conscious intention to do it. What was all the stuff about free will? Frank Tong, a neuroscientist at Vanderbilt University in Nashville, Tennessee, said: “Ten seconds is a lifetime in terms of brain activity.” So where is it coming from if not ‘us,’ the conscious mind?
David Icke
Nartok shows me an example of Arctic “greens”: cutout number 13, Caribou Stomach Contents. Moss and lichen are tough to digest, unless, like caribou, you have a multichambered stomach in which to ferment them. So the Inuit let the caribou have a go at it first. I thought of Pat Moeller and what he’d said about wild dogs and other predators eating the stomachs and stomach contents of their prey first. “And wouldn’t we all,” he’d said, “be better off.” If we could strip away the influences of modern Western culture and media and the high-fructose, high-salt temptations of the junk-food sellers, would we all be eating like Inuit elders, instinctively gravitating to the most healthful, nutrient-diverse foods? Perhaps. It’s hard to say. There is a famous study from the 1930s involving a group of orphanage babies who, at mealtimes, were presented with a smorgasbord of thirty-four whole, healthy foods. Nothing was processed or prepared beyond mincing or mashing. Among the more standard offerings—fresh fruits and vegetables, eggs, milk, chicken, beef—the researcher, Clara Davis, included liver, kidney, brains, sweetbreads, and bone marrow. The babies shunned liver and kidney (as well as all ten vegetables, haddock, and pineapple), but brains and sweetbreads did not turn up among the low-preference foods she listed. And the most popular item of all? Bone marrow.
Mary Roach (Gulp: Adventures on the Alimentary Canal)
Example: a famous-to-economists finding in behavioral economics concerns pricing, and the fact that people have a provable bias towards the middle of three prices. It was first demonstrated with an experiment in beer pricing: when there were two beers, a third of people chose the cheaper; adding an even cheaper beer made the share of that beer go up, because it was now in the middle of three prices; adding an even more expensive beer at the top, and dropping the cheapest beer, made the share of the new beer in the middle (which had previously been the most expensive) go up from two-thirds to 90 percent. Having a price above and a price below makes the price in the middle seem more appealing. This experiment has been repeated with other consumer goods, such as ovens, and is now a much-used strategy in the corporate world. Basically, if you have two prices for something, and want to make more people pay the higher price, you add a third, even higher price; that makes the formerly highest price more attractive. Watch out for this strategy. (The research paper about beer pricing, written by a trio of economists at Duke University in 1982, was published in the Journal of Consumer Research. It’s called “Adding Asymmetrically Dominated Alternatives: Violations of Regularity and the Similarity Hypothesis”—which must surely be the least engaging title ever given to an article about beer.)
John Lanchester (How to Speak Money: What the Money People Say-And What It Really Means: What the Money People Say―And What It Really Means)
But the basis of Freud's ideas isn't accepted by all philosophers, though many accept that he was right about the possibility of unconscious thought. Some have claimed that Freud's theories are unscientific. Most famously, Karl Popper (whose ideas are more fully discussed in Chapter 36) described many of the ideas of psychoanalysis as ‘unfalsifiable’. This wasn't a compliment, but a criticism. For Popper, the essence of scientific research was that it could be tested; that is, there could be some possible observation that would show that it was false. In Popper's example, the actions of a man who pushed a child into a river, and a man who dived in to save a drowning child were, like all human behaviour, equally open to Freudian explanation. Whether someone tried to drown or save a child, Freud's theory could explain it. He would probably say that the first man was repressing some aspect of his Oedipal conflict, and that led to his violent behaviour, whereas the second man had ‘sublimated’ his unconscious desires, that is, managed to steer them into socially useful actions. If every possible observation is taken as further evidence that the theory is true, whatever that observation is, and no imaginable evidence could show that it was false, Popper believed, the theory couldn't be scientific at all. Freud, on the other hand, might have argued that Popper had some kind of repressed desire that made him so aggressive towards psychoanalysis.
Nigel Warburton (A Little History of Philosophy (Little Histories))
For years I’ve been asking myself (and my readers) whether these propagandists—commonly called corporate or capitalist journalists—are evil or stupid. I vacillate day by day. Most often I think both. But today I’m thinking evil. Here’s why. You may have heard of John Stossel. He’s a long-term analyst, now anchor, on a television program called 20/20, and is most famous for his segment called “Give Me A Break,” in which, to use his language, he debunks commonly held myths. Most of the rest of us would call what he does “lying to serve corporations.” For example, in one of his segments, he claimed that “buying organic [vegetables] could kill you.” He stated that specially commissioned studies had found no pesticide residues on either organically grown or pesticide-grown fruits and vegetables, and had found further that organic foods are covered with dangerous strains of E. coli. But the researchers Stossel cited later stated he misrepresented their research. The reason they didn’t find any pesticides is because they never tested for them (they were never asked to). Further, they said Stossel misrepresented the tests on E. coli. Stossel refused to issue a retraction. Worse, the network aired the piece two more times. And still worse, it came out later that 20/20’s executive director Victor Neufeld knew about the test results and knew that Stossel was lying a full three months before the original broadcast.391 This is not unusual for Stossel and company.
Derrick Jensen (Endgame, Vol. 1: The Problem of Civilization)
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis,
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Let's define the Karikó Problem like this: American science funding has become biased against young scientists and risky ideas. What is most obvious is that American science is getting older. In the early 1900s, some of the most famous scientists – Einstein, Heisenberg, Schrödinger – did their breakthrough work in their twenties and thirties. Indeed, their youth may have been critical to their paradigm-busting genius. But these days the twentysomething scientist is an endangered species. The share of NIH-funded scientists who are thirty-five years old or younger declined from 22 percent in 1980 to less than 2 percent by the 2010s.54 American science also seems to produce far too many papers that don't create new knowledge while overlooking researchers with promising new ideas. A 2023 study titled "Papers and Patents Are Becoming Less Disruptive Over Time" found that any given paper today is much less likely to become influential than a paper from the same field decades ago. This could be because too many papers are essentially worthless. Or it could mean that scientists feel pressured to herd around the same few safe ideas that will keep them in good standing with their peers. "When you look at the diminishing returns in medicine, you can say, well, maybe all the easy drugs have been discovered," said James Evans, a sociologist at the University of Chicago. But the more compelling possibility, he said, is that "the very organization of modern science is leading us astray." In Evans's interpretation, the low-hanging fruit hasn't been plucked. The problem is that too many scientists are all looking at the same few trees.
Ezra Klein (Abundance)
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options.35 On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.”36 That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Take the famous slogan on the atheist bus in London … “There’s probably no God. Now stop worrying and enjoy your life.” … The word that offends against realism here is “enjoy.” I’m sorry—enjoy your life? Enjoy your life? I’m not making some kind of neo-puritan objection to enjoyment. Enjoyment is lovely. Enjoyment is great. The more enjoyment the better. But enjoyment is one emotion … Only sometimes, when you’re being lucky, will you stand in a relationship to what’s happening to you where you’ll gaze at it with warm, approving satisfaction. The rest of the time, you’ll be busy feeling hope, boredom, curiosity, anxiety, irritation, fear, joy, bewilderment, hate, tenderness, despair, relief, exhaustion … This really is a bizarre category error. But not necessarily an innocent one … The implication of the bus slogan is that enjoyment would be your natural state if you weren’t being “worried” by us believers … Take away the malignant threat of God-talk, and you would revert to continuous pleasure, under cloudless skies. What’s so wrong with this, apart from it being total bollocks? … Suppose, as the atheist bus goes by, that you are the fifty-something woman with the Tesco bags, trudging home to find out whether your dementing lover has smeared the walls of the flat with her own shit again. Yesterday when she did it, you hit her, and she mewled till her face was a mess of tears and mucus which you also had to clean up. The only thing that would ease the weight on your heart would be to tell the funniest, sharpest-tongued person you know about it: but that person no longer inhabits the creature who will meet you when you unlock the door. Respite care would help, but nothing will restore your sweetheart, your true love, your darling, your joy. Or suppose you’re that boy in the wheelchair, the one with the spasming corkscrew limbs and the funny-looking head. You’ve never been able to talk, but one of your hands has been enough under your control to tap out messages. Now the electrical storm in your nervous system is spreading there too, and your fingers tap more errors than readable words. Soon your narrow channel to the world will close altogether, and you’ll be left all alone in the hulk of your body. Research into the genetics of your disease may abolish it altogether in later generations, but it won’t rescue you. Or suppose you’re that skanky-looking woman in the doorway, the one with the rat’s nest of dreadlocks. Two days ago you skedaddled from rehab. The first couple of hits were great: your tolerance had gone right down, over two weeks of abstinence and square meals, and the rush of bliss was the way it used to be when you began. But now you’re back in the grind, and the news is trickling through you that you’ve fucked up big time. Always before you’ve had this story you tell yourself about getting clean, but now you see it isn’t true, now you know you haven’t the strength. Social services will be keeping your little boy. And in about half an hour you’ll be giving someone a blowjob for a fiver behind the bus station. Better drugs policy might help, but it won’t ease the need, and the shame over the need, and the need to wipe away the shame. So when the atheist bus comes by, and tells you that there’s probably no God so you should stop worrying and enjoy your life, the slogan is not just bitterly inappropriate in mood. What it means, if it’s true, is that anyone who isn’t enjoying themselves is entirely on their own.
The three of you are, for instance; you’re all three locked in your unshareable situations, banged up for good in cells no other human being can enter. What the atheist bus says is: there’s no help coming … But let’s be clear about the emotional logic of the bus’s message. It amounts to a denial of hope or consolation, on any but the most chirpy, squeaky, bubble-gummy reading of the human situation. St Augustine called this kind of thing “cruel optimism” fifteen hundred years ago, and it’s still cruel.
Francis Spufford
“I’ve worn Niki’s pants for two days now. I thought a third day in the same clothes might be pushing it.” Ian shrugged with indifference. “It might send Derian through the roof, but it doesn’t bother me. Wear what you want to wear.” Eena wrinkled her nose at him. “Do you really feel that way or are you trying to appear more laissez-faire than Derian?” “More laissez-faire?” “Yes. That’s a real word.” “Two words actually,” he grinned. “Laissez faire et laissez passer, le monde va de lui même!” He coated the words with a heavy French accent. Eena gawked at him. “Since when do you speak French?” “I don’t.” Ian chuckled. “But I did do some research in world history the year I followed you around on Earth. Physics was a joke, but history—that I found fascinating.” Slapping a hand against her chest, Eena exclaimed, “I can’t believe it! Unbeknownst to me, Ian actually studied something in high school other than the library’s collection of sci-fi paperbacks!” He grimaced at her exaggerated performance before defending his preferred choice of reading material. “Hey, popular literature is a valuable and enlightening form of world history. You would know that if you read a book or two.” She ignored his reproach and asked with curiosity, “What exactly did you say?” “In French?” “Duh, yes.” “Don’t ‘duh’ me, you could easily have been referring to my remark about enlightening literature. I know the value of a good book is hard for you to comprehend.” He grinned crookedly at her look of offense and then moved into an English translation of his French quote. “Let it do and let it pass, the world goes on by itself.” “Hmm. And where did that saying come from?” Ian delivered his answer with a surprisingly straight face. “That is what the French Monarch said when his queen began dressing casually. The French revolution started one week following that famous declaration, right after the queen was beheaded by the rest of the aristocracy in her favorite pair of scroungy jeans.” “You are such a brazen-tongued liar!”
Richelle E. Goodrich (Eena, The Companionship of the Dragon's Soul (The Harrowbethian Saga #6))
This, in turn, has given us a “unified theory of aging” that brings the various strands of research into a single, coherent tapestry. Scientists now know what aging is. It is the accumulation of errors at the genetic and cellular level. These errors can build up in various ways. For example, metabolism creates free radicals and oxidation, which damage the delicate molecular machinery of our cells, causing them to age; errors can build up in the form of “junk” molecular debris accumulating inside and outside the cells. The buildup of these genetic errors is a by-product of the second law of thermodynamics: total entropy (that is, chaos) always increases. This is why rusting, rotting, decaying, etc., are universal features of life. The second law is inescapable. Everything, from the flowers in the field to our bodies and even the universe itself, is doomed to wither and die. But there is a small but important loophole in the second law that states total entropy always increases. This means that you can actually reduce entropy in one place and reverse aging, as long as you increase entropy somewhere else. So it’s possible to get younger, at the expense of wreaking havoc elsewhere. (This was alluded to in Oscar Wilde’s famous novel The Picture of Dorian Gray. Mr. Gray was mysteriously eternally young. But his secret was the painting of himself that aged horribly. So the total amount of aging still increased.) The principle of entropy can also be seen by looking behind a refrigerator. Inside the refrigerator, entropy decreases as the temperature drops. But to lower the entropy, you have to have a motor, which increases the heat generated behind the refrigerator, increasing the entropy outside the machine. That is why refrigerators are always hot in the back. As Nobel laureate Richard Feynman once said, “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human’s body will be cured.”
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
The Extraordinary Persons Project
In fact, Ekman had been so moved personally—and intrigued scientifically—by his experiments with Öser that he announced at the meeting he was planning on pursuing a systematic program of research studies with others as unusual as Öser. The single criterion for selecting apt subjects was that they be “extraordinary.” This announcement was, for modern psychology, an extraordinary moment in itself. Psychology has almost entirely dwelt on the problematic, the abnormal, and the ordinary in its focus. Very rarely have psychologists—particularly ones as eminent as Paul Ekman—shifted their scientific lens to focus on people who were in some sense (other than intellectually) far above normal. And yet Ekman now was proposing to study people who excel in a range of admirable human qualities. His announcement makes one wonder why psychology hasn't done this before. In fact, only in very recent years has psychology explicitly begun a program to study the positive in human nature. Sparked by Martin Seligman, a psychologist at the University of Pennsylvania long famous for his research on optimism, a budding movement has finally begun in what is being called “positive psychology”—the scientific study of well-being and positive human qualities. But even within positive psychology, Ekman's proposed research would stretch science's vision of human goodness by assaying the limits of human positivity. Ever the scientist, Ekman became quite specific about what was meant by “extraordinary.” For one, he expects that such people exist in every culture and religious tradition, perhaps most often as contemplatives. But no matter what religion they practice, they share four qualities. The first is that they emanate a sense of goodness, a palpable quality of being that others notice and agree on. This goodness goes beyond some fuzzy, warm aura and reflects with integrity the true person. On this count Ekman proposed a test to weed out charlatans: In extraordinary people “there is a transparency between their personal and public life, unlike many charismatics, who have wonderful public lives and rather deplorable personal ones.” A second quality: selflessness. Such extraordinary people are inspiring in their lack of concern about status, fame, or ego. They are totally unconcerned with whether their position or importance is recognized. Such a lack of egoism, Ekman added, “from the psychological viewpoint, is remarkable.” Third is a compelling personal presence that others find nourishing. “People want to be around them because it feels good—though they can't explain why,” said Ekman. Indeed, the Dalai Lama himself offers an obvious example (though Ekman did not say so to him); the standard Tibetan title is not “Dalai Lama” but rather “Kundun,” which in Tibetan means “presence.” Finally, such extraordinary individuals have “amazing powers of attentiveness and concentration.”
Daniel Goleman (Destructive Emotions: A Scientific Dialogue with the Dalai Lama)
The Tale of Human Evolution
The subject most often brought up by advocates of the theory of evolution is the subject of the origin of man. The Darwinist claim holds that modern man evolved from ape-like creatures. During this alleged evolutionary process, which is supposed to have started 4-5 million years ago, some "transitional forms" between modern man and his ancestors are supposed to have existed. According to this completely imaginary scenario, four basic "categories" are listed:
1. Australopithecus
2. Homo habilis
3. Homo erectus
4. Homo sapiens
Evolutionists call man's so-called first ape-like ancestors Australopithecus, which means "South African ape." These living beings are actually nothing but an old ape species that has become extinct. Extensive research done on various Australopithecus specimens by two world famous anatomists from England and the USA, namely, Lord Solly Zuckerman and Prof. Charles Oxnard, shows that these apes belonged to an ordinary ape species that became extinct and bore no resemblance to humans. Evolutionists classify the next stage of human evolution as "homo," that is "man." According to their claim, the living beings in the Homo series are more developed than Australopithecus. Evolutionists devise a fanciful evolution scheme by arranging different fossils of these creatures in a particular order. This scheme is imaginary because it has never been proved that there is an evolutionary relation between these different classes. Ernst Mayr, one of the twentieth century's most important evolutionists, contends in his book One Long Argument that "particularly historical [puzzles] such as the origin of life or of Homo sapiens, are extremely difficult and may even resist a final, satisfying explanation." By outlining the link chain as Australopithecus > Homo habilis > Homo erectus > Homo sapiens, evolutionists imply that each of these species is one another's ancestor. However, recent findings of paleoanthropologists have revealed that Australopithecus, Homo habilis, and Homo erectus lived at different parts of the world at the same time. Moreover, a certain segment of humans classified as Homo erectus have lived up until very modern times. Homo sapiens neanderthalensis and Homo sapiens sapiens (modern man) co-existed in the same region. This situation apparently indicates the invalidity of the claim that they are ancestors of one another. Stephen Jay Gould explained this deadlock of the theory of evolution although he was himself one of the leading advocates of evolution in the twentieth century: What has become of our ladder if there are three coexisting lineages of hominids (A. africanus, the robust australopithecines, and H. habilis), none clearly derived from another? Moreover, none of the three display any evolutionary trends during their tenure on earth. Put briefly, the scenario of human evolution, which is "upheld" with the help of various drawings of some "half ape, half human" creatures appearing in the media and course books, that is, frankly, by means of propaganda, is nothing but a tale with no scientific foundation. Lord Solly Zuckerman, one of the most famous and respected scientists in the U.K., who carried out research on this subject for years and studied Australopithecus fossils for 15 years, finally concluded, despite being an evolutionist himself, that there is, in fact, no such family tree branching out from ape-like creatures to man.
Harun Yahya (Those Who Exhaust All Their Pleasures In This Life)
A prime example is the Japanese herb ashitaba, which is available as a tea or powder and helps prevent zombie cells. It is traditionally used to treat high blood pressure, hay fever, gout, and digestive issues, but researchers recently discovered a compound in the plant called dimethoxychalcone (DMC—no relation to the famous rappers), which slows senescence. In worms and fruit flies, DMC increases life-span by 20 percent.
Dave Asprey (Super Human: The Bulletproof Plan to Age Backward and Maybe Even Live Forever)
What are we to do at any given moment, when we cannot say which of our current claims will be sustained and which will be rejected? This is one of the central questions that I have raised. Because we cannot know which of current claims will be sustained, the best we can do is to consider the weight of scientific evidence, the fulcrum of scientific opinion, and the trajectory of scientific knowledge. This is why consensus matters: If scientists are still debating a matter, then we may well be wise to “wait and see,” if conditions permit.26 If the available empirical evidence is thin, we may want to do more research. But the uncertainty of future scientific knowledge should not be used as an excuse for delay. As the epidemiologist Sir Austin Bradford Hill famously argued, “All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time.”27 At any given moment, it makes sense to make decisions on the information we have, and be prepared to alter our plans if future evidence warrants.
Naomi Oreskes (Why Trust Science? (The University Center for Human Values Series))
The truth was that Newton’s biblical research was central to his entire scientific career. It forms the essential backdrop for his most famous work, the Principia Mathematica.
Arthur Herman (The Cave and the Light: Plato Versus Aristotle, and the Struggle for the Soul of Western Civilization)
Reintroducing history into evolutionary thinking has already begun at other biological scales. The cell, once an emblem of replicable units, turns out to be the historical product of symbiosis among free-living bacteria. Even DNA turns out to have more history in its amino-acid sequences than once thought. Human DNA is part virus; viral encounters mark historical moments in making us human. Genome research has taken up the challenge of identifying encounter in the making of DNA. Population science cannot avoid history for much longer. Fungi are ideal guides. Fungi have always been recalcitrant to the iron cage of self-replication. Like bacteria, some are given to exchanging genes in nonreproductive encounters (“horizontal gene transfer”); many also seem averse to keeping their genetic material sorted out as “individuals” and “species,” not to speak of “populations.” When researchers studied the fruiting bodies of what they thought of as a species, the expensive Tibetan “caterpillar fungus,” they found many species entangled together. When they looked into the filaments of Armillaria root rot, they found genetic mosaics that confused the identification of an individual. Meanwhile, fungi are famous for their symbiotic attachments. Lichen are fungi living together with algae and cyanobacteria. I have been discussing fungal collaborations with plants, but fungi live with animals as well. For example, Macrotermes termites digest their food only through the help of fungi. The termites chew up wood, but they cannot digest it. Instead, they build “fungus gardens” in which the chewed-up wood is digested by Termitomyces fungi, producing edible nutrients. Researcher Scott Turner points out that, while you might say that the termites farm the fungus, you could equally say that the fungus farms the termites. Termitomyces uses the environment of the termite mound to outcompete other fungi; meanwhile, the fungus regulates the mound, keeping it open, by throwing up mushrooms annually, creating a colony-saving disturbance in termite mound-building.
Anna Lowenhaupt Tsing
Speaking at the Chaos Communication Congress, an annual computer hacker conference held in Berlin, Germany, Tobias Engel, founder of Sternraute, and Karsten Nohl, chief scientist for Security Research Labs, explained that they could not only locate cell-phone callers anywhere in the world, they could also listen in on their phone conversations. And if they couldn’t listen in real time, they could record the encrypted calls and texts for later decryption.
Kevin D. Mitnick (The Art of Invisibility: The World's Most Famous Hacker Teaches You How to Be Safe in the Age of Big Brother and Big Data)
One reason for this “dirty little secret” is the positive publication bias described in Chapter 7. If researchers and medical journals pay attention to positive findings and ignore negative findings, then they may well publish the one study that finds a drug effective and ignore the nineteen in which it has no effect. Some clinical trials may also have small samples (such as for a rare disease), which magnifies the chances that random variation in the data will get more attention than it deserves. On top of that, researchers may have some conscious or unconscious bias, either because of a strongly held prior belief or because a positive finding would be better for their career. (No one ever gets rich or famous by proving what doesn’t cure cancer.)
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
In a now-famous experiment, he and his colleagues compared three groups of expert violinists at the elite Music Academy in West Berlin. The researchers asked the professors to divide the students into three groups: the “best violinists,” who had the potential for careers as international soloists; the “good violinists”; and a third group training to be violin teachers rather than performers. Then they interviewed the musicians and asked them to keep detailed diaries of their time. They found a striking difference among the groups. All three groups spent the same amount of time—over fifty hours a week—participating in music-related activities. All three had similar classroom requirements making demands on their time. But the two best groups spent most of their music-related time practicing in solitude: 24.3 hours a week, or 3.5 hours a day, for the best group, compared with only 9.3 hours a week, or 1.3 hours a day, for the worst group. The best violinists rated “practice alone” as the most important of all their music-related activities. Elite musicians—even those who perform in groups—describe practice sessions with their chamber group as “leisure” compared with solo practice, where the real work gets done. Ericsson and his cohorts found similar effects of solitude when they studied other kinds of expert performers. “Serious study alone” is the strongest predictor of skill for tournament-rated chess players, for example; grandmasters typically spend a whopping five thousand hours—almost five times as many hours as intermediate-level players—studying the game by themselves during their first ten years of learning to play. College students who tend to study alone learn more over time than those who work in groups. Even elite athletes in team sports often spend unusual amounts of time in solitary practice. What’s so magical about solitude? In many fields, Ericsson told me, it’s only when you’re alone that you can engage in Deliberate Practice, which he has identified as the key to exceptional achievement. When you practice deliberately, you identify the tasks or knowledge that are just out of your reach, strive to upgrade your performance, monitor your progress, and revise accordingly. Practice sessions that fall short of this standard are not only less useful—they’re counterproductive. They reinforce existing cognitive mechanisms instead of improving them. Deliberate Practice is best conducted alone for several reasons. It takes intense concentration, and other people can be distracting. It requires deep motivation, often self-generated. But most important, it involves working on the task that’s most challenging to you personally. Only when you’re alone, Ericsson told me, can you “go directly to the part that’s challenging to you. If you want to improve what you’re doing, you have to be the one who generates the move. Imagine a group class—you’re the one generating the move only a small percentage of the time.” To see Deliberate Practice in action, we need look no further than the story of Stephen Wozniak. The Homebrew meeting was the catalyst that inspired him to build that first PC, but the knowledge base and work habits that made it possible came from another place entirely: Woz had deliberately practiced engineering ever since he was a little kid. (Ericsson says that it takes approximately ten thousand hours of Deliberate Practice to gain true expertise, so it helps to start young.)
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Of the roughly $800 billion in available stimulus, we directed more than $90 billion toward clean energy initiatives across the country. Within a year, an Iowa Maytag plant I’d visited during the campaign that had been shuttered because of the recession was humming again, with workers producing state-of-the-art wind turbines. We funded construction of one of the world’s largest wind farms. We underwrote the development of new battery storage systems and primed the market for electric and hybrid trucks, buses, and cars. We financed programs to make buildings and businesses more energy efficient, and collaborated with Treasury to temporarily convert the existing federal clean energy tax credit into a direct-payments program. Within the Department of Energy, we used Recovery Act money to launch the Advanced Research Projects Agency–Energy (ARPA-E), a high-risk, high-reward research program modeled after DARPA, the famous Defense Department effort launched after Sputnik that helped develop not only advanced weapons systems like stealth technology but also an early iteration of the internet, automated voice activation, and GPS.
Barack Obama (A Promised Land)
As a chief ingredient in the mythology of science, the accumulation of objective facts supposedly controls the history of conceptual change–as logical and self-effacing scientists bow before the dictates of nature and willingly change their views to accommodate the growth of conceptual knowledge. The paradigm for such an idealistic notion remains Huxley’s famous remark about “a beautiful theory killed by a nasty, ugly little fact.” But single facts almost never slay worldviews, at least not right away (and properly so, for the majority of deeply anomalous observations turn out to be wrong)... Anomalous facts get incorporated into existing theories, often with a bit of forced stretching to be sure, but usually with decent fit because most worldviews contain considerable flexibility. (How else could they last so long, or be so recalcitrant to overthrow?)
Stephen Jay Gould (Leonardo's Mountain of Clams and the Diet of Worms: Essays on Natural History)
First published in 2020, this book contains over 560 easily readable compact entries in systematic order, augmented by an extensive bibliography, an alphabetical list of countries and locations of individuals’ final resting places (where known), and a day-and-month list in consecutive order of when an individual died. It details the deaths of individuals from film, literature, music, theatre, and television who died too early and often in tragic circumstances, and the achievements they left behind. It does not matter if they were famous or just celebrated by a few individuals; all the people in this book left behind family, friends, and in some instances devotees who idolised them. Our heartfelt thoughts and sympathies go out to all those affected by each person’s death. Whether you are concerned about yourself, a loved one, a friend, or a work colleague, there are many helplines and support groups that offer confidential, non-judgemental help, guidance and advice on mental health problems (such as anxiety, bereavement, depression, despair, distress, stress, substance abuse, suicidal feelings, and trauma). Support can be by phone, email, face-to-face counselling, courses, and self-help groups. Details can be found online or at your local health care organisation. There are many conspiracy theories, rumours, cover-ups, allegations, sensationalism, and myths about the causes of some individuals’ deaths. Only the facts known at the time of writing are included in this book. Some important information is deliberately kept secret or undisclosed. Sometimes not until 20 or even 30 years later are full details of an accident or incident released, or in some cases found during extensive research. Similarly, unsolved murders can be reinvestigated years later if new information becomes known. In some cases, 50 years on, there are those who continue to investigate what they consider to be cover-ups. The first name in an entry is that by which a person was generally known. Where relevant, their real name is included in brackets. Date of Death | In the entry detailing the date an individual died, their age at the time of their death is recorded in brackets. Final Resting Place | Where known, details of a person’s final resting place are included. “Unknown” | Used when there is insufficient evidence available to the authorities to establish whether an individual’s death was due to suicide, accident, or was caused by another. Statistics | The following statistics are derived from the 579 individual “cause of death” entries included in this publication. The top five causes of death are: Heart attack/failure 88 (15.2%); Cancer 55 (9.5%); Fatal injuries (plane crash) 43 (7.4%); Fatal injuries (vehicle crash/collision) 39 (6.7%); Asphyxiation (suicide) 23 (4%). extract from 'Untimely and Tragic Deaths of the Renowned, The Celebrated, The Iconic
B.H. McKechnie
This book is the culmination of, the capstone to Robert Ornstein’s brilliant, ground-breaking half century of research into the dimensions, capacities, and purposes of human consciousness. Deeply thoughtful and wide-ranging in scope, his final book goes well beyond the “left-brain, right-brain” work for which he originally became famous, this time bringing the often-confusing and divisive subject of spirituality into a clear and useful 21st-century focus, presenting it as a matter of perception, not belief. Using rigorous, rational, scientific analysis – always his greatest strength – Ornstein shows that the spiritual impulse is innate, a human ability that from the Ice Age forward has been essential to problem solving. As a final part of his legacy, he lays out how this capacity to reach beyond the everyday can, cleared of cobwebs and seen afresh, play a role and be part of preparing humanity to confront today’s staggering global problems. A rewarding and fascinating book. — Tony Hiss, author of Rescuing the Planet: Protecting Half the Land to Heal the Earth
Robert Ornstein (God 4.0: On the Nature of Higher Consciousness and the Experience Called “God”)
After the Marxist revolution failed to topple capitalism in the early twentieth century, many Marxists went back to the drawing board, modifying and adapting Marx’s ideas. Perhaps the most famous was a group associated with the Institute for Social Research in Frankfurt, Germany, which applied Marxism to a radical interdisciplinary social theory. The group included Max Horkheimer, T.W. Adorno, Erich Fromm, Herbert Marcuse, Georg Lukács, and Walter Benjamin and came to be known as the Frankfurt School. These men developed Critical Theory as an expansion of Conflict Theory and applied it more broadly, including other social sciences and philosophy. Their main goal was to address structural issues causing inequity. They worked from the assumption that current social reality was broken, and they needed to identify the people and institutions that could make changes and provide practical goals for social transformation.
Voddie T. Baucham Jr. (Fault Lines: The Social Justice Movement and Evangelicalism's Looming Catastrophe)
What is so important about Engelbart’s legacy is that he saw the computer as primarily a tool to augment—not replace—human capability. In our current era, by contrast, much of the financing flowing out of Silicon Valley is aimed at building machines that can replace humans. In a famous encounter in 1953 at MIT, Marvin Minsky, the father of research on artificial intelligence, declared: “We’re going to make machines intelligent. We are going to make them conscious!” To which Doug Engelbart replied: “You’re going to do all that for the machines? What are you going to do for the people?”
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
I’m reminded of the famous attorney who was asked if luck played any part in success at trial and he said yes, and it usually comes at three in the morning when I’m in the library doing research.
Bill Fitzhugh (A Perfect Harvest (The Transplant Tetralogy Book 4))
The grand idea was an atlas. A collection of maps, both of real places and of imagined ones, but reversed. She and Daniel had come up with a list of books, fantasy novels famous for the beautiful maps created just for them—Tolkien’s The Lord of the Rings; Le Guin’s Earthsea series; Lewis’s The Chronicles of Narnia books; Dragt’s De brief voor de koning, The Letter for the King; Pratchett’s Discworld novels—and another list of maps from our real world, famous for their cartographic significance. We would painstakingly research all of them, studying them from historical, scientific, and artistic angles, and then redraw them in the opposite style. Our recreations of the fantasy maps would be rigidly detailed and precise, and our re-creations of the realistic maps would be embellished, expanded, and dreamlike, like their fictional cousins. Once complete, we planned to publish it in one giant volume. Readers would open it, expecting the same old type of atlas, but instead, they’d find previously familiar lands rendered in a completely unexpected manner, opening their imaginations to an entirely new way of looking at maps.
Peng Shepherd (The Cartographers)
RAND proved formative. Some of its employees joked that it stood for “Research And No Development,” and its intellectualism was inspiring to the young economist. The think tank’s ethos was to work on problems so hard that they might actually be unsolvable.9 Four days of the week were dedicated to RAND projects, but the fifth was free for freewheeling personal research. Ken Arrow, a famous economist, and John Nash, the game theorist immortalized in the film A Beautiful Mind, both consulted for RAND around the time Sharpe was there. The eclecticism of RAND’s research community is reflected in his first published works, which were a proposal for a smog tax and a review of aircraft compartment design criteria for Army deployments.
Robin Wigglesworth (Trillions: How a Band of Wall Street Renegades Invented the Index Fund and Changed Finance Forever)
Amblyopsis hoosieri
Type of animal: Eyeless cavefish
Description: Completely colorless; 2 to 3 inches long; anus on underside of neck
Home: Southern Indiana
Fun fact: Unlike others of its kind, A. hoosieri lacks a debilitating mutation in the rhodopsin gene, which is an important gene for vision. That means it could see just fine … if it had eyes. Researchers named the fish after the Indiana Hoosiers basketball team — but not to imply the players might be visually challenged. The name honors several famous fish scientists who worked at Indiana University, as well as the species’s proximity to the university. Plus, the lead author is a Hoosier fan.
(Brenda Poppy, “Can You See Me Now?”, Discover Magazine)
Anonymous
The Harveys’ most famous son. An experimental physician famous for his discovery of the circulation of the blood, he had been the personal physician to Charles I and had been present with him at the Battle of Edgehill in 1642. Research in the Harvey family papers has also revealed that he was responsible for the only known scientific examination of a witch’s familiar. Personally ordered by Charles I to examine a lady suspected of witchcraft who lived on the outskirts of Newmarket, the dubious Harvey visited her in the guise of a wizard. He succeeded in capturing and dissecting her pet toad. The animal, Harvey concluded dryly, was a toad.
Sam Willis (The Fighting Temeraire: The Battle of Trafalgar and the Ship that Inspired J.M.W. Turner's Most Beloved Painting)
Education was still considered a privilege in England. At Oxford you took responsibility for your efforts and for your performance. No one coddled, and no one uproariously encouraged. British respect for the individual, both learner and teacher, reigned. If you wanted to learn, you applied yourself and did it. Grades were posted publicly by your name after exams. People failed regularly. These realities never ceased to bewilder those used to “democracy” without any of the responsibility. For me, however, my expectations were rattled in another way. I arrived anticipating to be snubbed by a culture of privilege, but when looked at from a British angle, I actually found North American students owned a far greater sense of entitlement when it came to a college education. I did not realize just how much expectations fetter—these “mind-forged manacles,”2 as Blake wrote. Oxford upholds something larger than self as a reference point, embedded in the deep respect for all that a community of learning entails. At my very first tutorial, for instance, an American student entered wearing a baseball cap on backward. The professor quietly asked him to remove it. The student froze, stunned. In the United States such a request would be fodder for a laundry list of wrongs done against the student, followed by threatening the teacher’s job and suing the university. But Oxford sits unruffled: if you don’t like it, you can simply leave. A handy formula since, of course, no one wants to leave. “No caps in my classroom,” the professor repeated, adding, “Men and women have died for your education.” Instead of being disgruntled, the student nodded thoughtfully as he removed his hat and joined us. With its expanses of beautiful architecture, quads (or walled lawns) spilling into lush gardens, mist rising from rivers, cows lowing in meadows, spires reaching high into skies, Oxford remained unapologetically absolute. And did I mention? Practically every college within the university has its own pub. Pubs, as I came to learn, represented far more for the Brits than merely a place where alcohol was served. They were important gathering places, overflowing with good conversation over comforting food: vital humming hubs of community in communication. So faced with a thousand-year-old institution, I learned to pick my battles. Rather than resist, for instance, the archaic book-ordering system in the Bodleian Library with technological mortification, I discovered the treasure in embracing its seeming quirkiness. Often, when the wrong book came up from the annals after my order, I found it to be right in some way after all. Oxford often works such. After one particularly serendipitous day of research, I asked Robert, the usual morning porter on duty at the Bodleian Library, about the lack of any kind of sophisticated security system, especially in one of the world’s most famous libraries. The Bodleian was not a loaning library, though you were allowed to work freely amid priceless artifacts. Individual college libraries entrusted you to simply sign a book out and then return it when you were done. “It’s funny; Americans ask me about that all the time,” Robert said as he stirred his tea. “But then again, they’re not used to having u in honour,” he said with a shrug.
Carolyn Weber (Surprised by Oxford)
Rick smiled as he watched the waves roll toward their feet. He turned to her and said, “Since we’re going to Louisiana, I did some research and learned a few things. Did you know it’s famous for its gumbo and bayous?” Amelia’s eyes brightened. “Really? I’ve seen pictures of a bayou in a magazine. It’s so mysterious looking.” “It’s also the crawdad capital of the world.” “Crawdad? What’s that?” Rick’s eyes widened with surprise. “You don’t know what crawdads are?” She shook her head. “They’re a freshwater crayfish, similar to shrimp… only better.
Linda Weaver Clarke (Mystery on the Bayou (Amelia Moore Detective Series #6))
One would expect to find a comparatively high proportion of carbon 13 [the carbon from corn] in the flesh of people whose staple food of choice is corn - Mexicans, most famously. Americans eat much more wheat than corn - 114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people, though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn't lie, and researchers who compared the carbon isotopes in the flesh or hair of Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. 'When you look at the isotope ratios,' Todd Dawson, a Berkeley biologist who's done this sort of research, told me, 'we North Americans look like corn chips with legs.' Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar. So that's us: processed corn, walking.
Michael Pollan (The Omnivore's Dilemma: A Natural History of Four Meals)
In a famous 1987 study, researchers Michael Diehl and Wolfgang Stroebe from Tubingen University in Germany concluded that brainstorming groups have never outperformed virtual groups.7
Frans Johansson (Medici Effect: What Elephants and Epidemics Can Teach Us About Innovation)
Research from Brunel University shows that chess students who trained with coaches increased on average 168 points in their national ratings versus those who didn’t. Though long hours of deliberate practice are unavoidable in the cognitively complex arena of chess, the presence of a coach for mentorship gives players a clear advantage. Chess prodigy Joshua Waitzkin (the subject of the film Searching for Bobby Fischer), for example, accelerated his career when national chess master Bruce Pandolfini discovered him playing chess in Washington Square Park in New York as a boy. Pandolfini coached young Waitzkin one on one, and the boy won a slew of chess championships, setting a world record at an implausibly young age. Business research backs this up, too. Analysis shows that entrepreneurs who have mentors end up raising seven times as much capital for their businesses, and experience 3.5 times faster growth than those without mentors. And in fact, of the companies surveyed, few managed to scale a profitable business model without a mentor’s aid. Even Steve Jobs, the famously visionary and dictatorial founder of Apple, relied on mentors, such as former football coach and Intuit CEO Bill Campbell, to keep himself sharp. So, data indicates that those who train with successful people who’ve “been there” tend to achieve success faster. The winning formula, it seems, is to seek out the world’s best and convince them to coach us. Except there’s one small wrinkle. That’s not quite true. We just held up Justin Bieber as an example of great, rapid-mentorship success. But since his rapid rise, he’s gotten into an increasing amount of trouble. Fights. DUIs. Resisting arrest. Drugs. At least one story about egging someone’s house. It appears that Bieber started unraveling nearly as quickly as he rocketed to Billboard number one. OK, first of all, Bieber’s young. He’s acting like the rock star he is. But his mentor, Usher, also got to Billboard number one at age 18, and he managed to dominate pop music for a decade without DUIs or egg-vandalism incidents. Could it be that Bieber missed something in the mentorship process? History, it turns out, is full of people who’ve been lucky enough to have amazing mentors and have stumbled anyway.
Shane Snow (Smartcuts: The Breakthrough Power of Lateral Thinking)
The dementia that is caused by the same vascular problems that lead to stroke is clearly affected by diet. In a publication from the famous Framingham Study, researchers conclude that for every three additional servings of fruits and vegetables a day, the risk of stroke will be reduced by 22%.73
T. Colin Campbell (The China Study: The Most Comprehensive Study of Nutrition Ever Conducted and the Startling Implications for Diet, Weight Loss, and Long-term Health)
In 1794, Lavoisier was arrested with the rest of the association and quickly sentenced to death. Ever the dedicated scientist, he requested time to complete some of his research so that it would be available to posterity. To that the presiding judge famously replied, “The republic has no need of scientists.
Leonard Mlodinow (The Drunkard's Walk: How Randomness Rules Our Lives)
Schools, in a noble effort to interest more girls in math and science, often try to combat stereotypes by showing children images of famous female scientists. “See, they did it. You can do it, too!” Unfortunately, these attempts rarely work, according to the research. Girls are more likely to remember the women as lab assistants. This is frustrating for those of us who try to combat gender stereotypes in children.
Christia Spears Brown (Parenting Beyond Pink & Blue: How to Raise Your Kids Free of Gender Stereotypes)
Bill Wilson would never have another drink. For the next thirty-six years, until he died of emphysema in 1971, he would devote himself to founding, building, and spreading Alcoholics Anonymous, until it became the largest, most well-known and successful habit-changing organization in the world. An estimated 2.1 million people seek help from AA each year, and as many as 10 million alcoholics may have achieved sobriety through the group.3.12,3.13 AA doesn’t work for everyone—success rates are difficult to measure, because of participants’ anonymity—but millions credit the program with saving their lives. AA’s foundational credo, the famous twelve steps, have become cultural lodestones incorporated into treatment programs for overeating, gambling, debt, sex, drugs, hoarding, self-mutilation, smoking, video game addictions, emotional dependency, and dozens of other destructive behaviors. The group’s techniques offer, in many respects, one of the most powerful formulas for change. All of which is somewhat unexpected, because AA has almost no grounding in science or most accepted therapeutic methods. Alcoholism, of course, is more than a habit. It’s a physical addiction with psychological and perhaps genetic roots. What’s interesting about AA, however, is that the program doesn’t directly attack many of the psychiatric or biochemical issues that researchers say are often at the core of why alcoholics drink.3.14 In fact, AA’s methods seem to sidestep scientific and medical findings altogether, as well as the types of intervention many psychiatrists say alcoholics really need.1 What AA provides instead is a method for attacking the habits that surround alcohol use.3.15 AA, in essence, is a giant machine for changing habit loops. And though the habits associated with alcoholism are extreme, the lessons AA provides demonstrate how almost any habit—even the most obstinate—can be changed.
Charles Duhigg (The Power Of Habit: Why We Do What We Do In Life And Business)
I became well known for researching what the corporate government did not want researched.
Steven Magee
THE CHASM – THE DIFFUSION MODEL
WHY EVERYBODY HAS AN IPOD
Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye? Sociologists describe the way in which a catchy idea or product becomes popular as ‘diffusion’. One of the most famous diffusion studies is an analysis by Bruce Ryan and Neal Gross of the diffusion of hybrid corn in the 1930s in Greene County, Iowa. The new type of corn was better than the old sort in every way, yet it took twenty-two years for it to become widely accepted. The diffusion researchers called the farmers who switched to the new corn as early as 1928 ‘innovators’, and the somewhat bigger group that was infected by them ‘early adaptors’. They were the opinion leaders in the communities, respected people who observed the experiments of the innovators and then joined them. They were followed at the end of the 1930s by the ‘sceptical masses’, those who would never change anything before it had been tried out by the successful farmers. But at some point even they were infected by the ‘hybrid corn virus’, and eventually transmitted it to the die-hard conservatives, the ‘stragglers’. Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adaptors to the sceptics, for at this point there is a ‘chasm’. According to the US sociologist Morton Grodzins, if the early adaptors succeed in getting the innovation across the chasm to the sceptical masses, the epidemic cycle reaches the tipping point. From there, the curve rises sharply when the masses accept the product, and sinks again when only the stragglers remain. With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adaptors turn away from the product as soon as the critical masses have accepted it, in search of the next new thing. The chasm model was introduced by the American consultant and author Geoffrey Moore.
‘First they ignore you, then they laugh at you, then they fight you, then you win.’ (Mahatma Gandhi)
Mikael Krogerus (The Decision Book: 50 Models for Strategic Thinking)
Research tells us that brainstorming becomes more productive when it’s focused. As jazz great Charles Mingus famously said, “You can’t improvise on nothing, man; you’ve gotta improvise on something.
Chip Heath (The Myth of the Garage: And Other Minor Surprises)
Knowing what you’re aiming for is essential. In a famous study of Yale University students, researchers found that only 3% had written goals with plans for their achievement. Twenty years later researchers interviewed the surviving graduates and found that those 3% were worth more financially than the other 97% combined.
Karen McCreadie (Think and Grow Rich (Infinite Success))
The people who are best at telling jokes tend to have more health problems than the people laughing at them. A study of Finnish police officers found that those who were seen as funniest smoked more, weighed more, and were at greater risk of cardiovascular disease than their peers [10]. Entertainers typically die earlier than other famous people [11], and comedians exhibit more “psychotic traits” than others [12]. So just as there’s research to back up the conventional wisdom on laughter’s curative powers, there also seems to be truth to the stereotype that funny people aren’t always having much fun. It might feel good to crack others up now and then, but apparently the audience gets the last laugh.
Anonymous
Our ability to tap into the senses of others is not limited to hypnotic states. In a now famous series of experiments physicists Harold Puthoff and Russell Targ of the Stanford Research Institute in California found that just about everyone they tested had a capacity they call “remote viewing,” the ability to describe accurately what a distant test subject is seeing. They found that individual after individual could remote-view simply by relaxing and describing whatever images came into their minds. Puthoff and Targ's findings have been duplicated by dozens of laboratories around the world, indicating that remote viewing is probably a widespread latent ability in all of us.
Anonymous
with you, as your date?” Liam asks me. “Yes,” I say quietly. “I’m so sorry. What can I do for you in return?” “Well, since you offered,” Liam responds, “I would like some information.” “Information?” I ask with a frown. “Yes,” Liam says. “Remember all those deep, dark secrets I said I’d extract from you? Well, if you share them with us, then I’ll be your date for your sister’s wedding.” This is probably the worst thing he could have requested. My mouth feels suddenly very dry. “Um. Isn’t there anything else you might want? Maybe I could dedicate my next book to you?” He laughs lightly. “You’re going to do that anyway once I get your sight back.” I rack my brain, searching for something I could give him. “I’ll have my publisher put out a press release,” I offer, “or maybe schedule an event, like a book launch. We can publicly declare that you’re the hero who helped the semi-famous blind author Winter Rose to see. Even if it doesn’t work, and I can’t see, I’ll pretend like I can, and you’ll probably get tons of research grants and stuff.” “I’m pretty sure that you’re going to do that anyway,” Liam tells me, “because it’s a good story that will sell books.” “Okay,” I mumble, getting desperate. “How about I name a character after you?” “That would be nice,” Liam says. “I’ll take all of the above, but I’ll still need one additional thing to sweeten the pot. Information.” “Why?” I moan in protest. “Because I’m curious,” he answers in a good-natured way. “Come on. It can’t be that bad. Tell me your deepest, darkest secrets.” I sigh. “Are you sure?” “Yes.” “Really? Right here. Right now? In front of Owen?” “Yeah, why not?” Liam says cheerfully. “He’s been telling us way more than we need to know for a while.” “I want to hear, too,” Owen chimes in.  “Entertain us, storyteller!” I spend a moment gathering my composure. I smooth my hands over my legs, and look around uneasily. Taking a deep breath, I try to mentally prepare myself for what I’m about to say to two complete strangers. “Well... three years ago, I was raped.” A hush falls over the car. I can feel the men looking
Loretta Lost (Clarity (Clarity, #1))
Minsky was an ardent supporter of the Cyc project, the most notorious failure in the history of AI. The goal of Cyc was to solve AI by entering into a computer all the necessary knowledge. When the project began in the 1980s, its leader, Doug Lenat, confidently predicted success within a decade. Thirty years later, Cyc continues to grow without end in sight, and commonsense reasoning still eludes it. Ironically, Lenat has belatedly embraced populating Cyc by mining the web, not because Cyc can read, but because there’s no other way. Even if by some miracle we managed to finish coding up all the necessary pieces, our troubles would be just beginning. Over the years, a number of research groups have attempted to build complete intelligent agents by putting together algorithms for vision, speech recognition, language understanding, reasoning, planning, navigation, manipulation, and so on. Without a unifying framework, these attempts soon hit an insurmountable wall of complexity: too many moving parts, too many interactions, too many bugs for poor human software engineers to cope with. Knowledge engineers believe AI is just an engineering problem, but we have not yet reached the point where engineering can take us the rest of the way. In 1962, when Kennedy gave his famous moon-shot speech, going to the moon was an engineering problem. In 1662, it wasn’t, and that’s closer to where AI is today. In industry, there’s no sign that knowledge engineering will ever be able to compete with machine learning outside of a few niche areas. Why pay experts to slowly and painfully encode knowledge into a form computers can understand, when you can extract it from data at a fraction of the cost? What about all the things the experts don’t know but you can discover from data? And when data is not available, the cost of knowledge engineering seldom exceeds the benefit. Imagine if farmers had to engineer each cornstalk in turn, instead of sowing the seeds and letting them grow: we would all starve.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
The first eye-opener came in the 1970s, when DARPA, the Pentagon’s research arm, organized the first large-scale speech recognition project. To everyone’s surprise, a simple sequential learner of the type Chomsky derided handily beat a sophisticated knowledge-based system. Learners like it are now used in just about every speech recognizer, including Siri. Fred Jelinek, head of the speech group at IBM, famously quipped that “every time I fire a linguist, the recognizer’s performance goes up.” Stuck in the knowledge-engineering mire, computational linguistics had a near-death experience in the late 1980s. Since then, learning-based methods have swept the field, to the point where it’s hard to find a paper devoid of learning in a computational linguistics conference. Statistical parsers analyze language with accuracy close to that of humans, where hand-coded ones lagged far behind. Machine translation, spelling correction, part-of-speech tagging, word sense disambiguation, question answering, dialogue, summarization: the best systems in these areas all use learning. Watson, the Jeopardy! computer champion, would not have been possible without it.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
That’s the beauty of the famous scientific method. You observe your subject, ask questions, and then research before establishing a hypothesis.
Claudia Y. Burgoa (Undefeated (Unexpected #5))
To prove the existence of a worldwide conspiracy, one needs to bring up facts that cannot be denied by opponents of such a principle. The imminence of such a worldwide conspiracy is, along with other facts, confirmed by the existence of organizations that rank above the separate states. These organizations have been operating behind the scenes of official world politics for several decades. Whoever wants to understand how and why political decisions come about needs to study these organizations and their objectives. The real answers cannot be found with the government of the United States or other political powers of this world. In reality the politics of countries are not determined by democratically chosen representatives, but by these powerful organizations and our invisible elite. Many investigators have tried to uncover this worldwide conspiracy. These investigators stem from all ranks of society. In spite of this, they all agree on the existence of this conspiracy. Sooner or later every investigator that researches this matter will come across the secret Brotherhood of the Illuminati. This organization was officially founded in 1530 in Spain. Their goals are based on the famous Constantinople Letter of December 22, 1489, in which plans were made to conquer the leadership of the world.[33] In 1773 the plans stipulated in the Constantinople Letter were restored, modernized and developed further in consultation
Robin de Ruiter (Worldwide Evil and Misery - The Legacy of the 13 Satanic Bloodlines)