Applied Research Quotes

We've searched our database for all the quotes and captions related to Applied Research. Here they are! All 100 of them:

Let's quit our jobs and fuck all day." "Works for me. Think somebody will subsidize that? Maybe we could apply for some kind of research grant," she said.
Cara McKenna (Willing Victim (Flynn and Laurel, #1))
Research on one kind of skincare product can’t be completely applied to all skincare products.
Pooja Agnihotri (17 Reasons Why Businesses Fail: Unscrew Yourself From Business Failure)
Nothing can illustrate these observations more forcibly, than a recollection of the happy conjuncture of times and circumstances, under which our Republic assumed its rank among the Nations; The foundation of our Empire was not laid in the gloomy age of Ignorance and Superstition, but at an Epoch when the rights of mankind were better understood and more clearly defined, than at any former period, the researches of the human mind, after social happiness, have been carried to a great extent, the Treasures of knowledge, acquired by the labours of Philosophers, Sages and Legislatures, through a long succession of years, are laid open for our use, and their collected wisdom may be happily applied in the Establishment of our forms of Government; the free cultivation of Letters, the unbounded extension of Commerce, the progressive refinement of Manners, the growing liberality of sentiment... have had a meliorating influence on mankind and increased the blessings of Society. At this auspicious period, the United States came into existence as a Nation, and if their Citizens should not be completely free and happy, the fault will be entirely their own. [Circular to the States, 8 June 1783 - Writings 26:484--89]
George Washington (Writings)
(P)sychologists at the New School for Social Research found that fiction books improve our ability to register and read others' emotions and, according to an article in the Journal of Applied Social Psychology, research also shows that literary fiction enhances our ability to reflect on our problems through reading about characters who are facing similar issues and problems.
Meik Wiking (The Little Book of Lykke: The Danish Search for the World's Happiest People)
If a mathematician wishes to disparage the work of one of his colleagues, say, A, the most effective method he finds for doing this is to ask where the results can be applied. The hard pressed man, with his back against the wall, finally unearths the researches of another mathematician B as the locus of the application of his own results. If next B is plagued with a similar question, he will refer to another mathematician C. After a few steps of this kind we find ourselves referred back to the researches of A, and in this way the chain closes.
Alfred Tarski
As he spoke, he whipped a tape measure and a large round magnifying glass from his pocket. With these two implements he trotted noiselessly about the room, sometimes stopping, occasionally kneeling, and once lying flat upon his face... As I watched him I was irresistibly reminded of a pure-blooded well-trained foxhound as it dashes backwards and forwards through the covert, whining in its eagerness, until it comes across the lost scent. For twenty minutes or more he continued his researches, measuring with the most exact care the distance between marks which were entirely invisible to me, and occasionally applying his tape to the walls in an equally incomprehensible manner.
Arthur Conan Doyle (A Study in Scarlet (Sherlock Holmes, #1))
You can tell if a discipline is BS if the degree depends severely on the prestige of the school granting it. I remember when I applied to MBA programs being told that anything outside the top ten or twenty would be a waste of time. On the other hand a degree in mathematics is much less dependent on the school (conditional on being above a certain level, so the heuristic would apply to the difference between top ten and top two thousand schools). The same applies to research papers. In math and physics, a result posted on the repository site arXiv (with a minimum hurdle) is fine. In low-quality fields like academic finance (where papers are usually some form of complicated storytelling), the “prestige” of the journal is the sole criterion.
Nassim Nicholas Taleb (Skin in the Game: Hidden Asymmetries in Daily Life)
Many presume that integrating more advanced automation will directly translate into productivity gains. But research reveals that lower-performing algorithms often elicit greater human effort and diligence. When automation makes obvious mistakes, people stay attentive to compensate. Yet flawless performance prompts blind reliance, causing costly disengagement. Workers overly dependent on accurate automation sleepwalk through responsibilities rather than apply their own judgment.
I. Almeida (Introduction to Large Language Models for Business Leaders: Responsible AI Strategy Beyond Fear and Hype (Byte-sized Learning Book 2))
By 1991, for instance, epidemiologist surveys in populations had revealed that high cholesterol was NOT associated with heart disease or premature death in women. Rather, the higher the cholesterol in women, the longer they lived, a finding that was so consistent across populations and surveys that it prompted an editorial in the American Heart Association's journal, Circulation: "We are coming to realize," the three authors, led by UC San Francisco epidemiologist Stephen Hulley, wrote, "that the results of cardiovascular research in men, which represents the great majority of the effort thus far, may not apply to women.
Gary Taubes (Rethinking Diabetes: What Science Reveals about Diet, Insulin and Successful Treatments)
... we can define concept as a logical, mental construction of one or more relationships. [...] It is purely mental, is logical, and can be described; it has been reasoned through sufficiently and presented with clarity. As such, a concept is inherently abstract (takes some things as given or assumed)
Don E. Ethridge (Research Methodology in Applied Economics)
The direction of research . . .should be: - toward non-violence rather than violence, - towards a harmonious cooperation with nature rather than with warfare against nature; - towards the noiseless, low-energy, elegant, and economical solutions normally applied in nature rather than [our often] noisy, high-energy, brutal, wasteful, and clumsy solutions.
Ernst F. Schumacher
State philosophy reposes on a double identity: of the thinking subject, and of the concepts it creates and to which it lends its own presumed attributes of sameness and constancy. The subjects, its concepts, and also the objects in the world to which the concepts are applied have a shared, internal essence: the self-resemblance at the basis of identity. Representational thought is analogical; its concern is to establish a correspondence between these symmetrically structured domains. The faculty of judgment is the policeman of analogy, assuring that each of these terms is honestly itself, and that the proper correspondences obtain. In thought its end is truth, in action justice. The weapons it wields in their pursuit are limitive distribution (the determination of the exclusive set of properties possessed by each term in contradistinction to the others: logos, law) and hierarchical ranking (the measurement of the degree of perfection of a term’s self-resemblance in relation to a supreme standard, man, god, or gold: value, morality). The modus operandi is negation: x = x = not y. Identity, resemblance, truth, justice, and negation. The rational foundation for order. The established order, of course: philosophers have traditionally been employees of the State. The collusion between philosophy and the State was most explicitly enacted in the first decade of the nineteenth century with the foundation of the University of Berlin, which was to become the model of higher learning throughout Europe and in the United States. The goal laid out for it by Wilhelm von Humboldt (based on proposals by Fichte and Schleiermacher) was the ‘spiritual and moral training of the nation,’ to be achieved by ‘deriving everything from an original principle’ (truth), by ‘relating everything to an ideal’ (justice), and by ‘unifying this principle and this ideal to a single Idea’ (the State). The end product would be ‘a fully legitimated subject of knowledge and society’ – each mind an analogously organized mini-State morally unified in the supermind of the State. More insidious than the well-known practical cooperation between university and government (the burgeoning military funding of research) is its philosophical role in the propagation of the form of representational thinking itself, that ‘properly spiritual absolute State’ endlessly reproduced and disseminated at every level of the social fabric.
Gilles Deleuze (A Thousand Plateaus: Capitalism and Schizophrenia)
Designers possess more than simply an ability to style products; they are practitioners of an applied process of creative skills: identifying problems, researching, analysing, evaluating, synthesising and then conceptualising, testing and communicating solutions.
Marc Stickdorn (This is Service Design Thinking: Basics - Tools - Cases)
Another subtle but worrisome effect television has on its viewers is its tendency to promote passivity and a lack of creativity. Watching television requires little mental activity on the viewer’s part. You simply sit and let the images flow by. Some research suggests that this sort of nonparticipatory viewing fosters a short attention span, making it hard for children to apply themselves in school.
Benjamin Spock (Dr. Spock's Baby and Child Care)
The evidence that women are being let down by the medical establishment is overwhelming. The bodies, symptoms and diseases that affect half the world’s population are being dismissed, disbelieved and ignored. And it’s all a result of the data gap combined with the still prevalent belief, in the face of all the evidence that we do have, that men are the default humans. They are not. They are, to state the obvious, just men. And data collected on them does not, cannot, and should not, apply to women. We need a revolution in the research and the practice of medicine, and we need it yesterday. We need to train doctors to listen to women, and to recognise that their inability to diagnose a woman may not be because she is lying or being hysterical: the problem may be the gender data gaps in their knowledge. It’s time to stop dismissing women, and start saving them.
Caroline Criado Pérez (Invisible Women: Exposing Data Bias in a World Designed for Men)
I saw, during the midterm campaign of 2006, how difficult it was for opponents of stem cell research to run against hope. And so it was in the 2008 presidential contest. This was hope in the collective, a definition that should always apply to the expression of a people's political will. Christopher Reeve had believed in a formula: optimism + information = hope. In this case, the informing agent was us. Granted, it may all look different in six months to a year, but it is hard not to be buoyed by the desire for positive change as articulated and advanced by Barack Obama. It is okay to hope. This time the aspiration of many will not be derided as desperation by a few, as it was during the stem cell debate of '06. By the time you read this book, President Obama and the 111th Congress will have established federal funding for stem cell research. The dam has broken. Just as I'd hoped.
Michael J. Fox (Always Looking Up: The Adventures of an Incurable Optimist)
Research has shown that simply imagining yourself in a situation where the rules of regular life don’t apply can greatly increase your creativity.
Tanner Christensen (The Creativity Challenge: Design, Experiment, Test, Innovate, Build, Create, Inspire, and Unleash Your Genius)
Research is formalized curiosity. It is poking and prying with a purpose.
Jeff Gothelf (Lean UX: Applying Lean Principles to Improve User Experience)
The lessons he learned were diligently applied to modify his methods, resulting in a continuous research loop,
Wendy Moore (The Knife Man: Blood, Body Snatching, and the Birth of Modern Surgery)
If a faithful account was rendered of man's ideas upon the Divinity, he would be obliged to acknowledge, that for the most part the word Gods has been used to express the concealed, remote, unknown causes of the effects he witnessed; that he applies this term when the spring of natural, the source of known causes ceases to be visible: as soon as he loses the thread of these causes, or as soon as his mind can no longer follow the chain, he solves the difficulty, terminates his research, by ascribing it to his gods; thus giving a vague definition to an unknown cause, at which either his idleness, or his limited knowledge, obliges him to stop. When, therefore, he ascribes to his gods the production of some phenomenon, the novelty or the extent of which strikes him with wonder, but of which his ignorance precludes him from unravelling the true cause, or which he believes the natural powers with which he is acquainted are inadequate to bring forth; does he, in fact, do any thing more than substitute for the darkness of his own mind, a sound to which he has been accustomed to listen with reverential awe?
Paul-Henri Thiry (System of Nature)
Were we dealing with a spectrum-based system that described male and female sexuality with equal accuracy, data taken from gay males would look similar to data taken from straight females—and yet this is not what we see in practice. Instead, the data associated with gay male sexuality presents a mirror image of data associated with straight males: Most gay men are as likely to find the female form aversive as straight men are likely to find the male form aversive. In gay females we observe a similar phenomenon, in which they mirror straight females instead of appearing in the same position on the spectrum as straight men—in other words, gay women are just as unlikely to find the male form aversive as straight females are to find the female form aversive. Some of the research highlighting these trends has been conducted with technology like laser doppler imaging (LDI), which measures genital blood flow when individuals are presented with pornographic images. The findings can, therefore, not be written off as a product of men lying to hide middling positions on the Kinsey scale due to a higher social stigma against what is thought of in the vernacular as male bisexuality/pansexuality. We should, however, note that laser Doppler imaging systems are hardly perfect, especially when measuring arousal in females. It is difficult to attribute these patterns to socialization, as they are observed across cultures and even within the earliest of gay communities that emerged in America, which had to overcome a huge amount of systemic oppression to exist. It’s a little crazy to argue that the socially oppressed sexuality of the early American gay community was largely a product of socialization given how much they had overcome just to come out. If, however, one works off the assumptions of our model, this pattern makes perfect sense. There must be a stage in male brain development that determines which set of gendered stimuli is dominant, then applies a negative modifier to stimuli associated with other genders. This stage does not apparently take place during female sexual development. 
Simone Collins (The Pragmatist’s Guide to Sexuality: What Turns People On, Why, and What That Tells Us About Our Species (The Pragmatist's Guide))
John P. Ioannidis published a controversial paper titled “Why Most Published Research Findings Are False.” The paper studied positive findings documented in peer-reviewed journals: descriptions of successful predictions of medical hypotheses carried out in laboratory experiments. It concluded that most of these findings were likely to fail when applied in the real world. Bayer Laboratories recently confirmed Ioannidis’s hypothesis. They could not replicate about two-thirds of the positive findings claimed in medical journals when they attempted the experiments themselves.
Nate Silver (The Signal and the Noise: Why So Many Predictions Fail-but Some Don't)
Talkative people are rated as smarter, better-looking, more interesting, and more desirable as friends. Velocity of speech counts as well as volume: we rank fast-talkers as more competent and likable than slow ones. The same dynamics apply in groups, where research shows that the voluble are considered smarter than the reticent—even though there’s zero correlation between the gift of gab and good ideas.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
The commercialization of molecular biology is the most stunning ethical event in the history of science, and it has happened with astonishing speed. For four hundred years since Galileo, science has always proceeded as a free and open inquiry into the workings of nature. Scientists have always ignored national boundaries, holding themselves above the transitory concerns of politics and even wars. Scientists have always rebelled against secrecy in research, and have even frowned on the idea of patenting their discoveries, seeing themselves as working to the benefit of all mankind. And for many generations, the discoveries of scientists did indeed have a peculiarly selfless quality... Suddenly it seemed as if everyone wanted to become rich. New companies were announced almost weekly, and scientists flocked to exploit genetic research... It is necessary to emphasize how significant this shift in attitude actually was. In the past, pure scientists took a snobbish view of business. They saw the pursuit of money as intellectually uninteresting, suited only to shopkeepers. And to do research for industry, even at the prestigious Bell or IBM labs, was only for those who couldn't get a university appointment. Thus the attitude of pure scientists was fundamentally critical toward the work of applied scientists, and to industry in general. Their long-standing antagonism kept university scientists free of contaminating industry ties, and whenever debate arose about technological matters, disinterested scientists were available to discuss the issues at the highest levels. But that is no longer true. There are very few molecular biologists and very few research institutions without commercial affiliations. The old days are gone. Genetic research continues, at a more furious pace than ever. But it is done in secret, and in haste, and for profit.
Michael Crichton (Jurassic Park (Jurassic Park, #1))
If there is a hallmark for this age, perhaps it will be our ability to take the complex findings of scientific research and apply them smoothly and effectively in our everyday lives, to better understand ourselves and to love more fully.
Stan Tatkin (Wired for Love: How Understanding Your Partner's Brain and Attachment Style Can Help You Defuse Conflict and Build a Secure Relationship)
It has seemed to me that if I had the genius to found the jet propulsion field in the US, and found a multimillion dollar corporation and a world renowned research laboratory, then I should also be able to apply this genius in the magical field.
George Pendle (Strange Angel: The Otherworldly Life of Rocket Scientist John Whiteside Parsons)
The Human Genome Project, the full sequence of the normal human genome, was completed in 2003. In its wake comes a far less publicized but vastly more complex project: fully sequencing the genomes of several human cancer cells. Once completed, this effort, called the Cancer Genome Atlas, will dwarf the Human Genome Project in its scope. The sequencing effort involves dozens of teams of researchers across the world. The initial list of cancers to be sequenced includes brain, lung, pancreatic, and ovarian cancer. The Human Genome Project will provide the normal genome, against which cancer’s abnormal genome can be juxtaposed and contrasted. The result, as Francis Collins, the leader of the Human Genome Project describes it, will be a “colossal atlas” of cancer—a compendium of every gene mutated in the most common forms of cancer: “When applied to the 50 most common types of cancer, this effort could ultimately prove to be the equivalent of more than 10,000 Human Genome Projects in terms of the sheer volume of DNA to be sequenced.
Siddhartha Mukherjee (The Emperor of All Maladies)
George Bernard Shaw, in a toast at a dinner feting Albert Einstein, proclaimed, “Science is always wrong. It never solves a problem without creating 10 more.” Isn’t that glorious? Science (and I think this applies to all kinds of research and scholarship) produces ignorance, possibly at a faster rate than it produces knowledge. Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how
Stuart Firestein (Ignorance: How It Drives Science)
Other than chemistry, rowing was the only thing Calvin had true passion for. In fact, rowing is why Calvin applied to Harvard in the first place: to row for Harvard was, in 1945, to row for the best. Or actually second best. University of Washington was the best, but University of Washington was in Seattle and Seattle had a reputation for rain. Calvin hated rain. Therefore, he looked further afield—to the other Cambridge, the one in England, thus exposing one of the biggest myths about scientists: that they’re any good at research.
Bonnie Garmus (Lessons in Chemistry)
A burst of high-frequency TMS pulses applied over Broca’s area on the left side would shut down the ability to speak, Shirley told me. This wasn’t what they were doing in the autism study—what they proposed was a much subtler tweaking. But I was intrigued by her comment and didn’t let it go. “Did you actually try it yourself?” I asked her. It turned out that she had—in fact, quite a few of the researchers, as part of their training to work in the lab, had experienced the speech-suppression TMS. They offered to show me what it felt like.
John Elder Robison (Switched On: A Memoir of Brain Change and Emotional Awakening)
However, in the fall of 2013, Ball Canning announced that according to its research, it did not find that warming the lids made any discernible difference in the quality of the seal and so were recommending that home canners apply clean, room-temperature lids to their jars.
Marisa McClellan (Naturally Sweet Food in Jars: 100 Preserves Made with Coconut, Maple, Honey, and More)
Malaria prevention and eradication should be inspired by General George Patton’s advice: “A good plan executed violently today is better than a perfect plan in a week.” In this war of attrition, millions of people will be lost while waiting on researchers to finally emerge triumphant from their labs with the perfect malaria cure; yet meanwhile, there are plenty of time-proven, practical actions that individuals, families and communities can do today with what is already in hand that can decisively defeat malaria transmission if applied with vigor and disciplined consistency.
T.K. Naliaka
Researchers have applied diverse methods to examine the connection between thinking and self-control. Some have addressed it by asking the correlation question: If people were ranked by their self-control and by their cognitive aptitude, would individuals have similar positions in the two rankings?
Daniel Kahneman (Thinking, Fast and Slow)
I would rather understand one cause than be King of Persia. —Democritus of Abdera If a faithful account was rendered of Man’s ideas upon Divinity, he would be obliged to acknowledge, that for the most part the word “gods” has been used to express the concealed, remote, unknown causes of the effects he witnessed; that he applies this term when the spring of the natural, the source of known causes, ceases to be visible: as soon as he loses the thread of these causes, or as soon as his mind can no longer follow the chain, he solves the difficulty, terminates his research, by ascribing it to his gods … When, therefore, he ascribes to his gods the production of some phenomenon … does he, in fact, do any thing more than substitute for the darkness of his own mind, a sound to which he has been accustomed to listen with reverential awe? —Paul Heinrich Dietrich, Baron von Holbach, Système de la Nature, London, 1770
Carl Sagan (Cosmos)
To limit the time resource applied to any one company, he reminds himself of psychological research which suggests that in many contexts decisions are best made with no more than five to seven points of information. Any more information beyond that does not significantly improve decisions, and may even degrade them.
Guy Thomas (Free Capital: How 12 private investors made millions in the stock market)
From a very early age Edison became used to doing things for himself, by necessity. His family was poor, and by the age of twelve he had to earn money to help his parents. He sold newspapers on trains, and traveling around his native Michigan for his job, he developed an ardent curiosity about everything he saw. He wanted to know how things worked—machines, gadgets, anything with moving parts. With no schools or teachers in his life, he turned to books, particularly anything he could find on science. He began to conduct his own experiments in the basement of his family home, and he taught himself how to take apart and fix any kind of watch. At the age of fifteen he apprenticed as a telegraph operator, then spent years traveling across the country plying his trade. He had no chance for a formal education, and nobody crossed his path who could serve as a teacher or mentor. And so in lieu of that, in every city he spent time in, he frequented the public library. One book that crossed his path played a decisive role in his life: Michael Faraday’s two-volume Experimental Researches in Electricity. This book became for Edison what The Improvement of the Mind had been for Faraday. It gave him a systematic approach to science and a program for how to educate himself in the field that now obsessed him—electricity. He could follow the experiments laid out by the great Master of the field and absorb as well his philosophical approach to science. For the rest of his life, Faraday would remain his role model. Through books, experiments, and practical experience at various jobs, Edison gave himself a rigorous education that lasted about ten years, up until the time he became an inventor. What made this successful was his relentless desire to learn through whatever crossed his path, as well as his self-discipline. He had developed the habit of overcoming his lack of an organized education by sheer determination and persistence. He worked harder than anyone else. Because he was a consummate outsider and his mind had not been indoctrinated in any school of thought, he brought a fresh perspective to every problem he tackled. He turned his lack of formal direction into an advantage. If you are forced onto this path, you must follow Edison’s example by developing extreme self-reliance. Under these circumstances, you become your own teacher and mentor. You push yourself to learn from every possible source. You read more books than those who have a formal education, developing this into a lifelong habit. As much as possible, you try to apply your knowledge in some form of experiment or practice. You find for yourself second-degree mentors in the form of public figures who can serve as role models. Reading and reflecting on their experiences, you can gain some guidance. You try to make their ideas come to life, internalizing their voice. As someone self-taught, you will maintain a pristine vision, completely distilled through your own experiences—giving you a distinctive power and path to mastery.
Robert Greene (Mastery (The Modern Machiavellian Robert Greene Book 1))
Harvard psychologist Daniel Gilbert talks about this phenomenon in his 2006 book, Stumbling on Happiness. “The greatest achievement of the human brain is its ability to imagine objects and episodes that do not exist in the realm of the real,” he writes. “The frontal lobe—the last part of the human brain to evolve, the slowest to mature, and the first to deteriorate in old age—is a time machine that allows each of us to vacate the present and experience the future before it happens.” This time travel into the future—otherwise known as anticipation—accounts for a big chunk of the happiness gleaned from any event. As you look forward to something good that is about to happen, you experience some of the same joy you would in the moment. The major difference is that the joy can last much longer. Consider that ritual of opening presents on Christmas morning. The reality of it seldom takes more than an hour, but the anticipation of seeing the presents under the tree can stretch out the joy for weeks. One study by several Dutch researchers, published in the journal Applied Research in Quality of Life in 2010, found that vacationers were happier than people who didn’t take holiday trips. That finding is hardly surprising. What is surprising is the timing of the happiness boost. It didn’t come after the vacations, with tourists bathing in their post-trip glow. It didn’t even come through that strongly during the trips, as the joy of travel mingled with the stress of travel: jet lag, stomach woes, and train conductors giving garbled instructions over the loudspeaker. The happiness boost came before the trips, stretching out for as much as two months beforehand as the holiday goers imagined their excursions. A vision of little umbrella-sporting drinks can create the happiness rush of a mini vacation even in the midst of a rainy commute. On some level, people instinctively know this. In one study that Gilbert writes about, people were told they’d won a free dinner at a fancy French restaurant. When asked when they’d like to schedule the dinner, most people didn’t want to head over right then. They wanted to wait, on average, over a week—to savor the anticipation of their fine fare and to optimize their pleasure. The experiencing self seldom encounters pure bliss, but the anticipating self never has to go to the bathroom in the middle of a favorite band’s concert and is never cold from too much air conditioning in that theater showing the sequel to a favorite flick. Planning a few anchor events for a weekend guarantees you pleasure because—even if all goes wrong in the moment—you still will have derived some pleasure from the anticipation. I love spontaneity and embrace it when it happens, but I cannot bank my pleasure solely on it. If you wait until Saturday morning to make your plans for the weekend, you will spend a chunk of your Saturday working on such plans, rather than anticipating your fun. Hitting the weekend without a plan means you may not get to do what you want. You’ll use up energy in negotiations with other family members. You’ll start late and the museum will close when you’ve only been there an hour. Your favorite restaurant will be booked up—and even if, miraculously, you score a table, think of how much more you would have enjoyed the last few days knowing that you’d be eating those seared scallops on Saturday night!
Laura Vanderkam (What the Most Successful People Do on the Weekend: A Short Guide to Making the Most of Your Days Off (A Penguin Special from Portfolio))
Remember when you discovered your father owned a book called "How To Disappear and Never Be Found?" You're sure it was just research for new and creative ways of thinking, for concepts that might apply to his work, but it raised the distinct possibility that there is something very upsetting that people you love could do instead of dying.
Lena Dunham (Not That Kind of Girl: A Young Woman Tells You What She's "Learned")
Dr. Norman Shealy found while researching magnesium oil that magnesium applied to the skin on a regular basis naturally enhances the level of a vitally important hormone, DHEA. DHEA is normally produced in the adrenal glands, but production slows down as we age. Apparently as magnesium is absorbed through the skin and the underlying fatty tissues of the body it sets off many chain reactions, one of which ends in the production of DHEA. Increasing DHEA levels by taking supplements of the hormone is recommended by some antiaging specialists, but others caution about side effects. To increase it naturally by improving your magnesium balance may be a safe way to turn back the clock.
Carolyn Dean (The Magnesium Miracle (Revised and Updated))
I remember a woman, speaking at a ceremony when Anne was given an award for National Women’s Health Week. She said, “women need to work in medical research, and in applied medicine, because too many men treat women’s bodies like they are just men’s bodies with female parts, but our bodies are fundamentally different and need to be treated that way.
Wil Wheaton (Still Just a Geek: An Annotated Memoir)
Richard Charnin is an author and quantitative software developer with advanced degrees in applied mathematics and operations research. He paints a very clear portrait of the JFK witness deaths in the context of the mathematical landscape: I have proved mathematically what many have long suspected: The scores of convenient JFK unnatural witness deaths cannot be coincidental.
Richard Belzer (Hit List: An In-Depth Investigation Into the Mysterious Deaths of Witnesses to the JFK Assassination)
As a rule, I found that each person is especially good at one of these three phases and especially bad at one. Each of us has a transition superpower, if you will, and a transition kryptonite. Our research suggests that people gravitate to the phase they’re naturally adept at and bog down in the one they’re weakest at. If you’re comfortable saying goodbye, you might knock that off quickly and move on to the next challenge; but if you’re conflict averse and don’t like to disappoint people, you might remain in a situation that’s toxic far longer than you should. The same applies to the messy middle: Some people thrive in chaos; others are paralyzed by it. As for new beginnings, some people embrace the novelty; others dread it—they like things the way they were.
Bruce Feiler (Life Is in the Transitions: Mastering Change at Any Age)
the Institute of Applied Economic and Social Research at the University of Melbourne published the results of an extensive study of international money laundering. The authors compared the banking systems of two hundred countries. The Vatican ranked in the top ten money laundering havens, behind Luxembourg, Switzerland, the Cayman Islands, and Liechtenstein, but ahead of Singapore.
Gerald Posner (God's Bankers: A History of Money and Power at the Vatican)
The idea that an understanding of the genocide, that a memory of the holocausts, can only lead people to want to dismantle the system, is erroneous. The continuing appeal of nationalism suggests that the opposite is truer, namely that an understanding of genocide has led people to mobilize genocidal armies, that the memory of holocausts has led people to perpetrate holocausts. The sensitive poets who remembered the loss, the researchers who documented it, have been like the pure scientists who discovered the structure of the atom. Applied scientists used the discovery to split the atom’s nucleus, to produce weapons which can split every atom’s nucleus; Nationalists used the poetry to split and fuse human populations, to mobilize genocidal armies, to perpetrate new holocausts.
Fredy Perlman (The Continuing Appeal of Nationalism)
No one can understand history without continually relating the long periods which are constantly mentioned to the experiences of our own short lives. Five years is a lot. Twenty years is the horizon to most people. Fifty years is antiquity. To understand how the impact of destiny fell upon any generation of men one must first imagine their position and then apply the time-scale of our own lives. Thus nearly all changes were far less perceptible to those who lived through them from day to day than appears when the salient features of an epoch are extracted by the chronicler. We peer at these scenes through dim telescopes of research across a gulf of nearly two thousand years. We cannot doubt that the second and to some extent the third century of the Christian era, in contrast with all that had gone before and most that was to follow, were a Golden Age for Britain. But by the early part of the fourth century shadows had fallen upon this imperfect yet none the less tolerable society. By steady, persistent steps the sense of security departed from Roman Britain. Its citizens felt by daily experience a sense that the world-wide system of which they formed a partner province was in decline.
Winston S. Churchill (The Birth of Britain (A History of the English Speaking Peoples #1))
Most lecture-based courses contribute nothing to real learning. Consequential and retained learning comes from applying knowledge to new situations or problems, research on questions and issues that students consider important, peer interaction, activities, and projects. Experiences, rather than short-term memorization, help students develop the skills and motivation that transforms lives." [p. 7-8]
Tony Wagner (Most Likely to Succeed: Preparing Our Kids for the Innovation Era)
Hence I most seriously believe that one does people the best service by giving them some elevating work to do and thus indirectly elevating them. This applies most of all to the great artist, but also in a lesser degree to the scientist. To be sure, it is not the fruits of scientific research that elevate a man and enrich his nature, but the urge to understand, the intellectual work, creative or receptive.
Albert Einstein (Ideas and Opinions)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Mis-information is rampant in this great age of mass-information. While we have more access to learning than ever before in the history of the world, we’re actually getting dumber it seems. The amount of (mis)information at everyone's fingertips has lured us into a false sense of knowing. Whether it be information about science, politics, or theology, our society is suffering from an inability to research, process, filter, and apply. At the same time we seem entirely oblivious to the zeitgeist (spirit of the age) that is nihilistic and libertine, making everything relative and subjective. And Satan himself rushes to blur our vision, stirring up the dust of confusion. The church must respond by teaching the critical faculties of logic and spiritual discernment, embedded in a cohesive framework of fides quaerens intellectum (faith seeking understanding). We must obtain a reasonable faith that is consistent with historic Christianity and relevant for our post-modern age. Otherwise, those rejecting the blatant errors of religious fundamentalism will be susceptible to every wind of false doctrine and repackaged heresy imaginable. They will leave the orthodox faith and accept something that vaguely resembles Christianity, but in reality is a vile concoction of demonic lies.
David D. Flowers
In totality, the picture is in line with a classic research finding that is not specific to music: breadth of training predicts breadth of transfer. That is, the more contexts in which something is learned, the more the learner creates abstract models, and the less they rely on any particular example. Learners become better at applying their knowledge to a situation they’ve never seen before, which is the essence of creativity.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
The field of scientific abstraction encompasses independent kingdoms of ideas and of experiments and within these, rulers whose fame outlasts the centuries. But they are not the only kings in science. He also is a king who guides the spirit of his contemporaries by knowledge and creative work, by teaching and research in the field of applied science, and who conquers for science provinces which have only been raided by craftsmen.
Fritz Haber
[ Dr. Lois Jolyon West was cleared at Top Secret for his work on MKULTRA. ] Dr. Michael Persinger [235], another FSMF Board Member, is the author of a paper entitled “Elicitation of 'Childhood Memories' in Hypnosis-Like Settings Is Associated With Complex Partial Epileptic-Like Signs For Women But Not for Men: the False Memory Syndrome.” In the paper, published in Perceptual and Motor Skills, Dr. Persinger writes: On the day of the experiment each subject (not more than two were tested per day) was asked to sit quietly in an acoustic chamber and was told that the procedure was an experiment in relaxation. The subject wore goggles and a modified motorcycle helmet through which 10-milligauss (1 microTesla) magnetic fields were applied through the temporal plane. Except for a weak red (photographic developing) light, the room was dark. Dr. Persinger's research on the ability of magnetic fields to facilitate the creation of false memories and altered states of consciousness is apparently funded by the Defense Intelligence Agency through the project cryptonym SLEEPING BEAUTY. Freedom of Information Act requests concerning SLEEPING BEAUTY with a number of different intelligence agencies including the CIA and DEA has yielded denial that such a program exists. Certainly, such work would be of direct interest to BLUEBIRD, ARTICHOKE, MKULTRA and other non-lethal weapons programs. Schnabel [280] lists Dr. Persinger as an Interview Source in his book on remote viewing operations conducted under Stargate, Grill Flame and other cryptonyms at Fort Meade and on contract to the Stanford Research Institute. Schnabel states (p. 220) that, “As one of the Pentagon's top scientists, Vorona was privy to some of the strangest, most secret research projects ever conceived. Grill Flame was just one. Another was code-named Sleeping Beauty; it was a Defense Department study of remote microwave mind-influencing techniques ... [...] It appears from Schnabel's well-documented investigations that Sleeping Beauty is a real, but still classified mind control program. Schnabel [280] lists Dr. West as an Interview Source and says that West was a, “Member of medical oversight board for Science Applications International Corp. remote-viewing research in early 1990s.
Colin A. Ross (The CIA Doctors: Human Rights Violations by American Psychiatrists)
Susan’s and Jennifer’s job searches are likely made harder by the color of their skin. In the early 2000s, researchers in Chicago and Boston mailed out fake résumés to hundreds of employers, varying only the names of the applicants, but choosing names that would be seen as identifiably black or white. Strikingly, “Emily” and “Brendan” were 50 percent more likely to get called for an interview than “Lakisha” and “Jamal.” A few years later, a researcher at the University of Wisconsin conducted a similar study in Milwaukee, but with a unique twist. She recruited two black and two white actors (college students, posing as high school graduates) who were as similar as possible in every way. She sent these “job applicants” out in pairs, with virtually identical fake résumés, to apply for entry-level jobs. Her twist was to instruct one of the white and one of the black applicants to tell employers that they had a felony conviction and had just been released from prison the month before. Even the researcher was surprised by what she found: the white applicant with a felony conviction was more likely to get a positive response from a prospective employer than the black applicant with no criminal record. When the study was replicated in New York City a few years later, she and her colleagues saw similar results for Latino applicants relative to whites.
Kathryn J. Edin ($2.00 a Day: Living on Almost Nothing in America)
another obstacle to educating innovators in universities is the lack of respect for interdisciplinary inquiry, practical knowledge, and applied learning. Discipline-based, in-depth knowledge is important, and basic research makes significant contributions to innovation. It is essential to our future that we continue to support this kind of inquiry, but this cannot—and must not—be the only kind of knowledge that is valued by our universities and our society.
Tony Wagner (Creating Innovators: The Making of Young People Who Will Change the World)
Amos and I called our first joint article “Belief in the Law of Small Numbers.” We explained, tongue-in-cheek, that “intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.” We also included a strongly worded recommendation that researchers regard their “statistical intuitions with proper suspicion and replace impression formation by computation whenever possible.
Daniel Kahneman (Thinking, Fast and Slow)
But when it has been shown by the researches of Pasteur that the septic property of the atmosphere depended not on the oxygen, or any gaseous constituent, but on minute organisms suspended in it, which owed their energy to their vitality, it occurred to me that decomposition in the injured part might be avoided without excluding the air, by applying as a dressing some material capable of destroying the life of the floating particles. Upon this principle I have based a practice.
Joseph Lister (On the Antiseptic Principle of the Practice of Surgery)
Say the ape, the Eurasian magpie, or the elephant looks in the mirror and recognizes the paint smeared on her body by the researcher. The animal who passes the mirror test then investigates her own body for the offending mark. Say she finds nothing. How long before she trusts the reflection over her own body? Say the mark on her reflection is confirmed by all the other elephants. How long before her reflection replaces herself? Say the mark is not of paint but instead a word applied to her.
Melissa Febos (Girlhood)
The Matrix is a system, Neo. That system is our enemy. But when you're inside, you look around, what do you see? Businessmen, teachers, lawyers, carpenters. The very minds of the people we are trying to save. But until we do, these people are still a part of that system and that makes them our enemy. You have to understand, most of these people are not ready to be unplugged. And many of them are so inured, so hopelessly dependent on the system, that they will fight to protect it. The Matrix, 1999
Sam Ladner (Mixed Methods: A short guide to applied mixed methods research)
Critical Thinking: Why Is It So Hard to Teach? By Daniel T. Willingham SUMMER 2007 AMERICAN FEDERATION OF TEACHERS pp. 8-1 Can critical thinking actually be taught? Decades of cognitive research point to a disappointing answer: not really. People who have sought to teach critical thinking have assumed that it is a skill, like riding a bicycle, and that, like other skills, once you learn it, you can apply it in any situation. Research from cognitive science shows that thinking is not that sort of skill. The processes of thinking are intertwined with the content of thought (that is, domain knowledge). Thus, if you remind a student to “look at an issue from multiple perspectives” often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives. You can teach students maxims about how they ought to think, but without background knowledge and practice, they probably will not be able to implement the advice they memorize.
Daniel T. Willingham
Frequently, I have been asked if an experiment I have planned is pure or applied science; to me it is more important to know if the experiment will yield new and probably enduring knowledge about nature. If it is likely to yield such knowledge, it is, in my opinion, good fundamental research; and this is more important than whether the motivation is purely aesthetic satisfaction on the part of the experimenter on the one hand or the improvement of the stability of a high-power transistor on the other.
William Shockley
PERFORMANCE PRACTICES Apply the components of perfect practice each time you set out to do meaningful work: • Define a purpose and concrete objectives for each working session. • Ask yourself: What do I want to learn or get done? • Focus and concentrate deeply, even if doing so isn’t always enjoyable. • Single-task: The next time you feel like multitasking, remind yourself that research shows it’s not effective. Keep in mind Dr. Bob’s secret: “Do only one thing at a time.” • Remember that quality trumps quantity.
Brad Stulberg (Peak Performance: Elevate Your Game, Avoid Burnout, and Thrive with the New Science of Success)
An interesting question in the research on feedback is how quick it should be. Should you get immediate information about your mistakes or wait some period of time? In general, research has pointed to immediate feedback being superior in settings outside of the laboratory. James A. Kulik and Chen-Lin C. Kulik review the literature on feedback timing and suggest that “Applied studies using actual classroom quizzes and real learning materials have usually found immediate feedback to be more effective than delay.
Scott H. Young (Ultralearning: Master Hard Skills, Outsmart the Competition, and Accelerate Your Career)
In the course of an extended investigation into the nature of inflammation, and the healthy and morbid conditions of the blood in relation to it, I arrived several years ago at the conclusion that the essential cause of suppuration in wounds is decomposition brought about by the influence of the atmosphere upon blood or serum retained within them, and, in the case of contused wounds, upon portions of tissue destroyed by the violence of the injury. To prevent the occurrence of suppuration with all its attendant risks was an object manifestly desirable, but till lately apparently unattainable, since it seemed hopeless to attempt to exclude the oxygen which was universally regarded as the agent by which putrefaction was effected. But when it had been shown by the researches of Pasteur that the septic properties of the atmosphere depended not on the oxygen, or any gaseous constituent, but on minute organisms suspended in it, which owed their energy to their vitality, it occurred to me that decomposition in the injured part might be avoided without excluding the air, by applying as a dressing some material capable of destroying the life of the floating particles.
Joseph Lister (On the Antiseptic Principle of the Practice of Surgery)
In full disclosure, this diet—like all other diets—hasn’t been fully proven. The pilot project didn’t include a control group and wasn’t intended as scientific research. We can’t be sure how these outcomes would apply to the general public. But the ideas presented in this book culminate a century of research questioning the calorie balance model of obesity, and represent a fundamentally different way to understand why we gain weight and what we can do about it. For those of you with a scientific bent, I’ve included hundreds of supporting studies from many research teams among the references.
David Ludwig (Always Hungry?: Conquer Cravings, Retrain Your Fat Cells, and Lose Weight Permanently)
Research and practice are clear. Stress inoculation doesn’t work unless you have acquired the skills to navigate the environment you will encounter. As sports psychologist Brian Zuleger told me, “Telling people to relax doesn’t work unless you’ve taught people how to actually relax. The same goes for mental strength. The historical way to develop toughness was to do something physically challenging, and you’d have a fifty-fifty shot if they thrived. You have to teach the skill before it can be applied.” Throwing people in the deep end doesn’t work unless they’ve been taught the basics of how to swim.
Steve Magness (Do Hard Things: Why We Get Resilience Wrong and the Surprising Science of Real Toughness)
For a while, every smart and shy eccentric from Bobby Fischer to Bill Gates was hastily fitted with this label, and many were more or less believably retrofitted, including Isaac Newton, Edgar Allan Poe, Michelangelo, and Virginia Woolf. Newton had great trouble forming friendships and probably remained celibate. In Poe's poem Alone, he wrote that "All I lov'd - I lov'd alone." Michelangelo is said to have written "I have no friends of any sort and I don't want any." Woolf killed herself. Asperger's disorder, once considered a sub-type of autism, was named after the Austrian pediatrician Hans Asperger, a pioneer, in the 1940s, in identifying and describing autism. Unlike other early researchers, according to the neurologist and author Oliver Sacks, Asperger felt that autistic people could have beneficial talents, especially what he called a "particular originality of thought" that was often beautiful and pure, unfiltered by culture of discretion, unafraid to grasp at extremely unconventional ideas. Nearly every autistic person that Sacks observed appeared happiest when alone. The word "autism" is derived from autos, the Greek word for "self." "The cure for Asperger's syndrome is very simple," wrote Tony Attwood, a psychologist and Asperger's expert who lives in Australia. The solution is to leave the person alone. "You cannot have a social deficit when you are alone. You cannot have a communication problem when you are alone. All the diagnostic criteria dissolve in solitude." Officially, Asperger's disorder no longer exists as a diagnostic category. The diagnosis, having been inconsistently applied, was replaced, with clarified criteria, in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders; Asperger's is now grouped under the umbrella term Autism Spectrum Disorder, or ASD.
Michael Finkel (The Stranger in the Woods: The Extraordinary Story of the Last True Hermit)
But we don’t correct for the difference in science, medicine, and mathematics, for the same reasons we didn’t pay attention to iatrogenics. We are suckers for the sophisticated. In institutional research, one can selectively report facts that confirm one’s story, without revealing facts that disprove it or don’t apply to it—so the public perception of science is biased into believing in the necessity of the highly conceptualized, crisp, and purified Harvardized methods. And statistical research tends to be marred with this one-sidedness. Another reason one should trust the disconfirmatory more than the confirmatory.
Nassim Nicholas Taleb (Antifragile: Things that Gain from Disorder)
From the centre of the “perfect man” flows the ocean (where, as we have said, the god dwells). The “perfect” man is, as Jesus says, the “true door,” through which the “perfect” man must go in order to be reborn. Here the problem of how to translate “teleios” becomes crucial; for—we must ask—why should anyone who is “perfect” need renewal through rebirth? One can only conclude that the perfect man was not so perfected that no further improvement was possible. We encounter a similar difficulty in Philippians 3:12, where Paul says: “Not that I … am already perfect” (τετελείωμαι). But three verses further on he writes: “Let us then, as many as are perfect (τέλειοι) be of this mind.” The Gnostic use of τέλειος obviously agrees with Paul’s. The word has only an approximate meaning and amounts to much the same thing as πνευματικός, ‘spiritual,’ which is not connected with any conception of a definite degree of perfection or spirituality. The word “perfect” gives the sense of the Greek τέλειος correctly only when it refers to God. But when it applies to a man, who in addition is in need of rebirth, it can at most mean “whole” or “complete,” especially if, as our text says, the complete man cannot even be saved unless he passes through this door.
C.G. Jung (Aion: Researches into the Phenomenology of the Self (Collected Works, Vol 9ii))
Forgetting herself entirely, Pandora let her head loll back against Gabriel's shoulder. "What kind of glue does Ivo use?" she asked languidly. "Glue?" he echoed after a moment, his mouth close to her temple, grazing softly. "For his kites." "Ah." He paused while a wave retreated. "Joiner's glue, I believe." "That's not strong enough," Pandora said, relaxed and pensive. "He should use chrome glue." "Where would he find that?" One of his hands caressed her side gently. "A druggist can make it. One part acid chromate of lime to five parts gelatin." Amusement filtered through his voice. "Does your mind ever slow down, sweetheart?" "Not even for sleeping," she said. Gabriel steadied her against another wave. "How do you know so much about glue?" The agreeable trance began to fade as Pandora considered how to answer him. After her long hesitation, Gabriel tilted his head and gave her a questioning sideways glance. "The subject of glue is complicated, I gather." I'm going to have to tell him at some point, Pandora thought. It might as well be now. After taking a deep breath, she blurted out, "I design and construct board games. I've researched every possible kind of glue required for manufacturing them. Not just for the construction of the boxes, but the best kind to adhere lithographs to the boards and lids. I've registered a patent for the first game, and soon I intend to apply for two more." Gabriel absorbed the information in remarkably short order. "Have you considered selling the patents to a publisher?" "No, I want to make the games at my own factory. I have a production schedule. The first one will be out by Christmas. My brother-in-law, Mr. Winterborne, helped me to write a business plan. The market in board games is quite new, and he thinks my company will be successful." "I'm sure it will be. But a young woman in your position has no need of a livelihood." "I do if I want to be self-supporting." "Surely the safety of marriage is preferable to the burdens of being a business proprietor." Pandora turned to face him fully. "Not if 'safety' means being owned. As things stand now, I have the freedom to work and keep my earnings. But if I marry you, everything I have, including my company, would immediately become yours. You would have complete authority over me. Every shilling I made would go directly to you- it wouldn't even pass through my hands. I'd never be able to sign a contract, or hire employees, or buy property. In the eyes of the law, a husband and wife are one person, and that person is the husband. I can't bear the thought of it. It's why I never want to marry.
Lisa Kleypas (Devil in Spring (The Ravenels, #3))
I doubt many reading this would decide to re-plaster a ceiling that keeps leaking every time it rains, knowing the real leak is on the roof of the building. Yet our localized view of the human condition is still plastering away. To stop the leak, we need to seek out and resolve root causes that continue to lead to social oppression, ecological disregard, and other influences that reduce human well-being. It has only been in the modern age that sociological research has provided powerful evidence of what’s needed to resolve these problems. These new frameworks or models for understanding society must be applied if we expect to see true social progress.
Peter Joseph (The New Human Rights Movement: Reinventing the Economy to End Oppression)
Talking on a cell phone makes us four times as likely to have an accident—the same as a driver who has a blood alcohol content of .08 percent, which qualifies as intoxicated in most states. The risk is equal for drivers holding their phones to their ears and for those speaking through a hands-free device. In both cases, researchers suggest, the drivers generate mental images of the unseen person at the other end of the line, which conflicts with their capacity for spatial processing. “It’s not that your hands aren’t on the wheel,” says David Strayer, the director of the Applied Cognition Laboratory at the University of Utah, “it’s that your mind is not on the road.
Tony Schwartz (The Way We're Working Isn't Working: The Four Forgotten Needs That Energize Great Performance)
But as a Puerto Rican woman, she belonged to not one but two minority groups. New research suggests that her double minority status may have amplified the costs and the benefits of speaking up. Management researcher Ashleigh Rosette, who is African American, noticed that she was treated differently when she led assertively than were both white women and black men. Working with colleagues, she found that double minority group members faced double jeopardy. When black women failed, they were evaluated much more harshly than black men and white leaders of both sexes. They didn’t fit the stereotype of leaders as black or as female, and they shouldered an unfair share of the blame for mistakes. For double minorities, Rosette’s team pointed out, failure is not an option. Interestingly, though, Rosette and her colleagues found that when black women acted dominantly, they didn’t face the same penalties as white women and black men. As double minorities, black women defy categories. Because people don’t know which stereotypes to apply to them, they have greater flexibility to act “black” or “female” without violating stereotypes. But this only holds true when there’s clear evidence of their competence. For minority-group members, it’s particularly important to earn status before exercising power. By quietly advancing the agenda of putting intelligence online as part of her job, Carmen Medina was able to build up successes without attracting too much attention. “I was able to fly under the radar,” she says. “Nobody really noticed what I was doing, and I was making headway by iterating to make us more of a publish-when-ready organization. It was almost like a backyard experiment. I pretty much proceeded unfettered.” Once Medina had accumulated enough wins, she started speaking up again—and this time, people were ready to listen. Rosette has discovered that when women climb to the top and it’s clear that they’re in the driver’s seat, people recognize that since they’ve overcome prejudice and double standards, they must be unusually motivated and talented. But what happens when voice falls on deaf ears?
Adam M. Grant (Originals: How Non-Conformists Move the World)
If Paul brought the first generation of Christians the useful skills of a trained theologian, Origen was the first great philosopher to rethink the new religion from first principles. As his philosophical enemy, the anti-Christian Porphyry, summed it up, he 'introduced Greek ideas to foreign fables' -- that is, gave a barbarous eastern religion the intellectual respectability of a philosophical defense. Origen was also a phenomenon. As Eusebius put it admiringly, 'even the facts from his cradle are worth mentioning'. Origen came from Alexandria, the second city of the empire and then its intellectual centre; his father's martyrdom left him an orphan at seventeen with six younger brothers. He was a hard working prodigy, at eighteen head of the Catechetical School, and already trained as a literary scholar and teacher. But at this point, probably in 203, he became a religious fanatic and remained one for the next fifty years. He gave up his job and sold his books to concentrate on religion. He slept on the floor, ate no meat, drank no wine, had only one coat and no shoes. He almost certainly castrated himself, in obedience to the notorious text, Matthew 19:12, 'there are some who have made themselves eunuchs for the kingdom of heaven's sake.' Origen's learning was massive and it was of a highly original kind: he always went back to the sources and thought through the whole process himself. Thus he learned Hebrew and, according to Eusebius, 'got into his possession the original writings extant among the Jews in the actual Hebrew character'. These included the discovery of lost texts; in the case of the psalms, Origen collected not only the four known texts but three others unearthed, including 'one he found at Jericho in a jar'. The result was an enormous tome, the Hexapla, which probably existed in only one manuscript now lost, setting out the seven alternative texts in parallel columns. He applied the same principles of original research to every aspect of Christianity and sacred literature. He seems to have worked all day and through most of the night, and was a compulsive writer. Even the hardy Jerome later complained: 'Has anyone read everything Origen wrote?'
Paul Johnson (A History of Christianity)
The same trend is noticeable in the scientific realm: research here is for its own sake far more than for the partial and fragmentary results it achieves; here we see an ever more rapid succession of unfounded theories and hypotheses, no sooner set up than crumbling to give way to others that will have an even shorter life - a veritable chaos amid which one would search in vain for anything definitive, unless it be a monstrous accumulation of facts and details incapable of proving or signifying anything. We refer here of course to speculative science, insofar as this still exists; in applied science there are on the contrary undeniable results, and this is easily understandable since these results bear directly on the domain of matter.
René Guénon (The Crisis of the Modern World)
Even if men and women in America spoke the same language, they would still live by much different standards. For example, if a man in a movie researches a woman’s schedule, finds out where she lives and works, even goes to her work uninvited, it shows his commitment, proves his love. When Robert Redford does this to Demi Moore in Indecent Proposal, it’s adorable. But when she shows up at his work unannounced, interrupting a business lunch, it’s alarming and disruptive. If a man in the movies wants a sexual encounter or applies persistence, he’s a regular everyday guy, but if a woman does the same thing, she’s a maniac or a killer. Just recall Fatal Attraction, King of Comedy, Single White Female, Play Misty for Me, Hand That Rocks the Cradle, and Basic Instinct.
Gavin de Becker (The Gift of Fear: Survival Signals That Protect Us from Violence)
I'd attended a selective liberal arts college, trained at respectable research institutions, and even completed a dissertation for a doctoral degree. In our shared office, I'd tell new hires I was ABD, so they wouldn't feel their own situation was so bleak. If they saw a ten-year veteran adjunct with a PhD, they might lose hope of securing a permanent job. It was the least I could do, as a good American, to remind the young we were an innocent and optimistic country where everyone was entitled to a fulfilling career. To make sure they understood that PhD stood not for "piled higher and deeper" or "Pop has dough," but in fact the degree meant "professional happiness desired," and at the altruistic colleges of democratic America only the angry or sad ones need not apply.
Alex Kudera (Auggie's Revenge)
We have to show the world a society in which all relationships, fundamental principles and laws flow directly from ethics, and from them alone. Ethical demands must determine all considerations: how to bring up children, what to train them for, to what end the work of grown-ups should be directed, and how their leisure should be occupied. As for scientific research, it should only be conducted where it doesn't damage morality, in the first instance where it doesn't damage the researchers themselves. The same should apply to foreign policy. Whenever the question of frontiers arises, we should think not of how much richer or stronger this or that course of action will make us, or of how it will raise our prestige. We should consider one criterion only: how far is it ethical?
Aleksandr Solzhenitsyn (Cancer Ward)
When we clean ourselves, we at least temporarily alter the microscopic populations—either by removing them or by altering the resources available to them. Even if we do not use cleaning products that specifically say they are “antimicrobial,” any chemistry applied to the skin will have some effect on the environment in which the microbes grow. Soaps and astringents meant to make us drier and less oily also remove the sebum on which microbes feed. Because scientists and doctors didn’t have the technology to fully understand the number or importance of these microbes until recently, very little is known about what exactly they’re doing there. But as this new research elucidates the interplay of microbes and skin, it is challenging long-held beliefs about what is good and bad.
James Hamblin (Clean: The New Science of Skin and the Beauty of Doing Less)
A somewhat longer deferment was available, and totally legal, for college students. Bobby had dropped out of high school, but the New School for Social Research, a progressive college in New York City, was willing to accept his extraordinary chess accomplishments in lieu of traditional schoolwork. Alfred Landa, then assistant to the president, said that Fischer would not only be allowed to matriculate into the college, but be given a full scholarship. Bobby thought long and hard about the offer. One afternoon he started to walk to the New School to put in his application—and then stopped. His experience with schools had been distasteful, and perhaps that caused forebodings. Without giving an explanation, he refused to enter the school building, and he refused to apply for a student deferment.
Frank Brady (Endgame: Bobby Fischer's Remarkable Rise and Fall - from America's Brightest Prodigy to the Edge of Madness)
Throughout college, my monastic, scholarly study of human meaning would conflict with my urge to forge and strengthen the human relationships that formed that meaning. If the unexamined life was not worth living, was the unlived life worth examining? Heading into my sophomore summer, I applied for two jobs: as an intern at the highly scientific Yerkes Primate Research Center, in Atlanta, and as a prep chef at Sierra Camp, a family vacation spot for Stanford alumni on the pristine shores of Fallen Leaf Lake, abutting the stark beauty of Desolation Wilderness in Eldorado National Forest. The camp’s literature promised, simply, the best summer of your life. I was surprised and flattered to be accepted. Yet I had just learned that macaques had a rudimentary form of culture, and I was eager to go to Yerkes and see what could be the natural origin of meaning itself. In other words, I could either study meaning or I could experience it.
Paul Kalanithi (When Breath Becomes Air)
Introversion—along with its cousins sensitivity, seriousness, and shyness—is now a second-class personality trait, somewhere between a disappointment and a pathology. Introverts living under the Extrovert Ideal are like women in a man’s world, discounted because of a trait that goes to the core of who they are. Extroversion is an enormously appealing personality style, but we’ve turned it into an oppressive standard to which most of us feel we must conform. The Extrovert Ideal has been documented in many studies, though this research has never been grouped under a single name. Talkative people, for example, are rated as smarter, better-looking, more interesting, and more desirable as friends. Velocity of speech counts as well as volume: we rank fast talkers as more competent and likable than slow ones. The same dynamics apply in groups, where research shows that the voluble are considered smarter than the reticent—even though there’s zero correlation between the gift of gab and good ideas.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
A 2011 study done by Alan Krueger, a Princeton economics professor who served for two years as the chairman of President Obama’s Council of Economic Advisers, and Stacy Dale, an analyst with Mathematica Policy Research, tried to adjust for that sort of thing. Krueger and Dale examined sets of students who had started college in 1976 and in 1989; that way, they could get a sense of incomes both earlier and later in careers. And they determined that the graduates of more selective colleges could expect earnings 7 percent greater than graduates of less selective colleges, even if the graduates in that latter group had SAT scores and high school GPAs identical to those of their peers at more exclusive institutions. But then Krueger and Dale made their adjustment. They looked specifically at graduates of less selective colleges who had applied to more exclusive ones even though they hadn’t gone there. And they discovered that the difference in earnings pretty much disappeared. Someone with a given SAT score who had gone to Penn State but had also applied to the University of Pennsylvania, an Ivy League school with a much lower acceptance rate, generally made the same amount of money later on as someone with an equivalent SAT score who was an alumnus of UPenn. It was a fascinating conclusion, suggesting that at a certain level of intelligence and competence, what drives earnings isn’t the luster of the diploma but the type of person in possession of it. If he or she came from a background and a mindset that made an elite institution seem desirable and within reach, then he or she was more likely to have the tools and temperament for a high income down the road, whether an elite institution ultimately came into play or not. This was powerfully reflected in a related determination that Krueger and Dale made in their 2011 study: “The average SAT score of schools that rejected a student is more than twice as strong a predictor of the student’s subsequent earnings as the average SAT score of the school the student attended.
Frank Bruni (Where You Go Is Not Who You'll Be: An Antidote to the College Admissions Mania)
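The logic of Krueger and Dale's adjustment can be mimicked with synthetic data. In the hypothetical sketch below (all numbers and variable names are invented for illustration and are not taken from their study), an unobserved trait drives both the decision to apply to a selective school and later earnings, while attendance itself adds nothing; the naive attended-versus-not comparison shows a large gap, but the gap vanishes once the comparison is restricted to applicants.

    import numpy as np

    # Hypothetical simulation of selection bias, in the spirit of the
    # Krueger-Dale comparison described above (all parameters are invented).
    rng = np.random.default_rng(42)
    n = 100_000
    drive = rng.normal(size=n)                                   # unobserved trait
    applied = rng.random(n) < 1 / (1 + np.exp(-2 * drive))       # driven students apply more often
    attended = applied & (rng.random(n) < 0.5)                   # coin-flip admissions among applicants
    earnings = 50_000 + 10_000 * drive + rng.normal(scale=5_000, size=n)

    naive_gap = earnings[attended].mean() - earnings[~attended].mean()
    adjusted_gap = earnings[attended].mean() - earnings[applied & ~attended].mean()

    print(f"Naive gap, attended vs. everyone else:      {naive_gap:9,.0f}")
    print(f"Gap among applicants only (the adjustment): {adjusted_gap:9,.0f}")   # close to zero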
What would fly researchers discover at the tips of the reproductive structures? In the immediate environment in which the GSC (germline stem cells) sit? Shangri-La. [...] The experimental biologist J.J. Trentin proposed in 1970 that within the bone marrow and other home locations, there exists a 'hematopoietic inductive microenvironment' with the unique ability to serve as a home location for blood stem cells. In the later 1970s, another blood cell expert, R. Schofield, referred to this specialized microenvironment as a 'niche', introducing the term that would stick and eventually become widely applied to describe the microenvironment surrounding any type of stem cell. Fly biologist H. Lin describes a stem cell niche as 'the Shangri-La, the idyllic hideaway' in which these cells reside. Nestled in the niche, Lin states, stem cells 'thrive to self-renew and to produce numerous daughter cells that will differentiate and age as they leave the paradise'. In other words, the niche is the place that a stem cell is granted its two wishes - allowing it both to remain and to become something else.
Stephanie Elizabeth Mohr (First in Fly: Drosophila Research and Biological Discovery)
This principle is sometimes known as Price’s law, after Derek J. de Solla Price, the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal. The basic principle had been discovered much earlier. Vilfredo Pareto (1848–1923), an Italian polymath, noticed its applicability to wealth distribution in the early twentieth century, and it appears true for every society ever studied, regardless of governmental form. It also applies to the population of cities (a very small number have almost all the people), the mass of heavenly bodies (a very small number hoard all the matter), and the frequency of words in a language (90 percent of communication occurs using just 500 words), among many other things. Sometimes it is known as the Matthew Principle (Matthew 25:29), derived from what might be the harshest statement ever attributed to Christ: “to those who have everything, more will be given; from those who have nothing, everything will be taken.”
Jordan B. Peterson (12 Rules for Life: An Antidote to Chaos)
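The concentration Peterson describes can be sketched numerically. The short Python snippet below is an illustration only, not from the book; the Pareto shape parameter is an assumption chosen to give roughly an 80/20 split. It draws per-person output from a power-law distribution and reports what small fraction of people account for half the total.

    import numpy as np

    # Illustrative only: per-person output drawn from a Pareto (power-law)
    # distribution, the kind of "L-shaped" curve described in the quote above.
    rng = np.random.default_rng(0)
    n_people = 10_000
    output = rng.pareto(a=1.16, size=n_people)   # a ~ 1.16 is an assumed shape

    output_sorted = np.sort(output)[::-1]                        # most productive first
    cumulative = np.cumsum(output_sorted) / output_sorted.sum()
    top_share = (np.searchsorted(cumulative, 0.5) + 1) / n_people

    print(f"Fraction of people producing half of all output: {top_share:.1%}")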
Only years later—as an investigative journalist writing about poor scientific research—did I realize that I had committed statistical malpractice in one section of the thesis that earned me a master’s degree from Columbia University. Like many a grad student, I had a big database and hit a computer button to run a common statistical analysis, never having been taught to think deeply (or at all) about how that statistical analysis even worked. The stat program spit out a number summarily deemed “statistically significant.” Unfortunately, it was almost certainly a false positive, because I did not understand the limitations of the statistical test in the context in which I applied it. Nor did the scientists who reviewed the work. As statistician Doug Altman put it, “Everyone is so busy doing research they don’t have time to stop and think about the way they’re doing it.” I rushed into extremely specialized scientific research without having learned scientific reasoning. (And then I was rewarded for it, with a master’s degree, which made for a very wicked learning environment.) As backward as it sounds, I only began to think broadly about how science should work years after I left it.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
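The kind of false positive Epstein describes is easy to reproduce. The sketch below is a generic illustration, not his actual thesis analysis: it runs an off-the-shelf t-test on one hundred pure-noise comparisons, and at the conventional 0.05 threshold roughly five of them come back "statistically significant" even though no real effect exists anywhere.

    import numpy as np
    from scipy import stats

    # Generic illustration: "significant" results appear in pure noise when a
    # test is run mechanically, without thinking about context or multiplicity.
    rng = np.random.default_rng(1)
    n_tests, n_samples = 100, 50
    false_positives = 0
    for _ in range(n_tests):
        a = rng.normal(size=n_samples)
        b = rng.normal(size=n_samples)          # same distribution: no true difference
        p_value = stats.ttest_ind(a, b).pvalue
        if p_value < 0.05:
            false_positives += 1

    print(f"'Significant' findings in pure noise: {false_positives} out of {n_tests}")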
The essence of Roosevelt’s leadership, I soon became convinced, lay in his enterprising use of the “bully pulpit,” a phrase he himself coined to describe the national platform the presidency provides to shape public sentiment and mobilize action. Early in Roosevelt’s tenure, Lyman Abbott, editor of The Outlook, joined a small group of friends in the president’s library to offer advice and criticism on a draft of his upcoming message to Congress. “He had just finished a paragraph of a distinctly ethical character,” Abbott recalled, “when he suddenly stopped, swung round in his swivel chair, and said, ‘I suppose my critics will call that preaching, but I have got such a bully pulpit.’ ” From this bully pulpit, Roosevelt would focus the charge of a national movement to apply an ethical framework, through government action, to the untrammeled growth of modern America. Roosevelt understood from the outset that this task hinged upon the need to develop powerfully reciprocal relationships with members of the national press. He called them by their first names, invited them to meals, took questions during his midday shave, welcomed their company at day’s end while he signed correspondence, and designated, for the first time, a special room for them in the West Wing. He brought them aboard his private railroad car during his regular swings around the country. At every village station, he reached the hearts of the gathered crowds with homespun language, aphorisms, and direct moral appeals. Accompanying reporters then extended the reach of Roosevelt’s words in national publications. Such extraordinary rapport with the press did not stem from calculation alone. Long before and after he was president, Roosevelt was an author and historian. From an early age, he read as he breathed. He knew and revered writers, and his relationship with journalists was authentically collegial. In a sense, he was one of them. While exploring Roosevelt’s relationship with the press, I was especially drawn to the remarkably rich connections he developed with a team of journalists—including Ida Tarbell, Ray Stannard Baker, Lincoln Steffens, and William Allen White—all working at McClure’s magazine, the most influential contemporary progressive publication. The restless enthusiasm and manic energy of their publisher and editor, S. S. McClure, infused the magazine with “a spark of genius,” even as he suffered from periodic nervous breakdowns. “The story is the thing,” Sam McClure responded when asked to account for the methodology behind his publication. He wanted his writers to begin their research without preconceived notions, to carry their readers through their own process of discovery. As they educated themselves about the social and economic inequities rampant in the wake of teeming industrialization, so they educated the entire country. Together, these investigative journalists, who would later appropriate Roosevelt’s derogatory term “muckraker” as “a badge of honor,” produced a series of exposés that uncovered the invisible web of corruption linking politics to business. McClure’s formula—giving his writers the time and resources they needed to produce extended, intensively researched articles—was soon adopted by rival magazines, creating what many considered a golden age of journalism. Collectively, this generation of gifted writers ushered in a new mode of investigative reporting that provided the necessary conditions to make a genuine bully pulpit of the American presidency. 
“It is hardly an exaggeration to say that the progressive mind was characteristically a journalistic mind,” the historian Richard Hofstadter observed, “and that its characteristic contribution was that of the socially responsible reporter-reformer.
Doris Kearns Goodwin (The Bully Pulpit: Theodore Roosevelt, William Howard Taft, and the Golden Age of Journalism)
It would be a mistake to imagine that drug companies are the only people applying pressure for fast approvals. Patients can also feel they are being deprived of access to drugs, especially if they are desperate. In fact, in the 1980s and 1990s the key public drive for faster approvals came from an alliance forged between drug companies and AIDS activists such as ACT UP. At the time, HIV and AIDS had suddenly appeared out of nowhere, and young, previously healthy gay men were falling ill and dying in terrifying numbers, with no treatment available. We don’t care, they explained, if the drugs that are currently being researched for effectiveness might kill us: we want them, because we’re dying anyway. Losing a couple of months of life because a currently unapproved drug turned out to be dangerous was nothing, compared to a shot at a normal lifespan. In an extreme form, the HIV-positive community was exemplifying the very best motivations that drive people to participate in clinical trials: they were prepared to take a risk, in the hope of finding better treatments for themselves or others like them in the future. To achieve this goal they blocked traffic on Wall Street, marched on the FDA headquarters in Rockville, Maryland, and campaigned tirelessly for faster approvals.
Ben Goldacre (Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients)
This man is someone for whom the world isn’t a mystery. The world is a boulder, but it has levers and he knows when and where and how to apply just the right amount of force, and it moves for him, while my father and I, pushing up against it, don’t have any angle, any torque, no grip or traction or leverage. My father thinks success must be in direct proportion to effort exerted. He doesn’t know where or how to exert the least amount for the most gain, doesn’t know where the secret buttons are, the hidden doors, the golden keys. He thinks that, even if you have a great idea, there have to be trials and tribulations, errors and failures, a dark night of the soul, a slog, a time in the desert, a fallow period, a period of quiet, a period of silent and earnest and frustrated toiling before emerging, victorious, into the sunshine and acclaim. My father makes to-do lists, makes plans, makes business plans. This is how he starts, always with a blank sheet of graph paper. We make bullet points. We identify the key areas we need to research further. We try to figure out how to research those areas. We work in a vacuum. We work in his study. We ponder. We stare at our feet. We stare at the ceiling. We talk to each other, create a world, create a tiny, artificial, formal space, on a blank sheet of paper, where we can imagine rules and principles and categories and ideas, all of which have absolutely nothing to do with the actual world out there.
Charles Yu (How to Live Safely in a Science Fictional Universe)
If life has accelerated, and we have become overwhelmed by information to the point that we are less and less able to focus on any of it, why has there been so little pushback? Why haven’t we tried to slow things down to a pace where we can think clearly? I was able to find the first part of an answer to this—and it’s only the first part—when I went to interview Professor Earl Miller. He has won some of the top awards in neuroscience in the world, and he was working at the cutting edge of brain research when I went to see him in his office at the Massachusetts Institute of Technology (MIT). He told me bluntly that instead of acknowledging our limitations and trying to live within them, we have—en masse—fallen for an enormous delusion. There’s one key fact, he said, that every human being needs to understand—and everything else he was going to explain flows from that. “Your brain can only produce one or two thoughts” in your conscious mind at once. That’s it. “We’re very, very single-minded.” We have “very limited cognitive capacity.” This is because of the “fundamental structure of the brain,” and it’s not going to change. But rather than acknowledge this, Earl told me, we invented a myth. The myth is that we can actually think about three, five, ten things at the same time. To pretend this was the case, we took a term that was never meant to be applied to human beings at all. In the 1960s, computer scientists invented machines with more than one processor, so they really could do two things (or more) simultaneously. They called this machine-power “multitasking.” Then we took the concept and applied it to ourselves.
Johann Hari (Stolen Focus: Why You Can't Pay Attention - and How to Think Deeply Again)
Two days ago, I was lunching at the Writers Union with the eminent historian Tomashevski. That's the sort of man you should know. Respected, charming, hasn't produced a piece of work in ten years. He has a system, which he explained to me. First, he submits an outline for a biography to the Academy to be absolutely sure his approach is consistent with Party policy. A crucial first step, as you'll see later. Now, the person he studies is always an important figure - that is, someone from Moscow - hence Tomashevski must do his Russian research close to home for two years. But this historical character also traveled, yes, lived for some years in Paris or London; hence Tomashevski must do the same, apply for and receive permission for foreign residence. Four years have passed. The Academy and the Party are rubbing their hands in anticipation of this seminal study of the important figure by the eminent Tomashevski. And now Tomashevski must retire to the solitude of a dacha outside Moscow to tend his garden and creatively brood over his cartons of research. Two more years pass in seminal thought. And just as Tomashevski is about to commit himself to paper, he checks with the Academy again only to learn that Party policy has totally about-faced; his hero is a traitor, and with regrets all around, Tomashevski must sacrifice his years of labor for the greater good. Naturally, they are only too happy to urge Tomashevski to start a new project, to plow under his grief with fresh labor. Tomashevski is now studying a very important historical figure who lived for some time in the South of France. He says there is always a bright future for Soviet historians, and I believe him.
Martin Cruz Smith (Gorky Park (Arkady Renko, #1))
Yet the deepest and most enduring forms of cultural change nearly always occur from the “top down.” In other words, the work of world-making and world-changing is, by and large, the work of elites: gatekeepers who provide creative direction and management within spheres of social life. Even where the impetus for change draws from popular agitation, it does not gain traction until it is embraced and propagated by elites. The reason for this, as I have said, is that culture is about how societies define reality—what is good, bad, right, wrong, real, unreal, important, unimportant, and so on. This capacity is not evenly distributed in a society, but is concentrated in certain institutions and among certain leadership groups who have a lopsided access to the means of cultural production. These elites operate in well-developed networks and powerful institutions. Over time, cultural innovation is translated and diffused. Deep-rooted cultural change tends to begin with those whose work is most conceptual and invisible and it moves through to those whose work is most concrete and visible. In a very crude formulation, the process begins with theorists who generate ideas and knowledge; moves to researchers who explore, revise, expand, and validate ideas; moves on to teachers and educators who pass those ideas on to others, then passes on to popularizers who simplify ideas and practitioners who apply those ideas. All of this, of course, transpires through networks and structures of cultural production. Cultural change is most enduring when it penetrates the structure of our imagination, frameworks of knowledge and discussion, the perception of everyday reality. This rarely if ever happens through grassroots political mobilization though grassroots mobilization can be a manifestation of deeper cultural transformation.
James Davison Hunter (To Change the World: The Irony, Tragedy, and Possibility of Christianity in the Late Modern World)
we were disposed, when starting out, to use them as the basis for our practice in this new context. In our early work with professors, however, we were not in a position to impose rules. We soon discovered the folly of making any pronouncements about the amount of writing, for instance, which might distinguish a writing-intensive approach, and took it as our goal to find professors willing to experiment and to take some steps toward engaging more with writing. We were consultants with expertise in how to use and teach writing, and we suggested strategies and provided materials. The fact that those strategies were grounded in research and theory, and could be signaled as criteria emerged in our process of discussion with faculty about the rationale for adopting particular teaching strategies. The emphasis was on alternative pedagogies, not on a list of rules requiring compliance. Just as one does not need to know that a particular word in a sentence is functioning as an adjective in order to use adjectives, faculty also neither needed, nor were necessarily concerned, to associate a practice like revision with criteria for an as yet non-existent W-course. What eventually became official criteria were initially the elements that we encouraged according to what an individual faculty member was able and willing to accommodate. The early pilot courses overall represented all the elements that we would identify as foundational to effective practice in teaching writing, but in very few courses were all of the criteria present. Faculty members positioned themselves across a spectrum of starting points in their views of writing and its role in their courses. For many professors in the Arts and social sciences, the “writing-intensive” label simply acknowledged that their courses included substantial amounts of writing. For many in the Physical Sciences, the concept as applied to their courses could, at first glance,
Wendy Strachan (Writing-Intensive: Becoming W-Faculty in a New Writing Curriculum)
Read the following chain of events and see whether a similar pattern might apply to other toxic products that were reported in the news during your lifetime: 1. Workers were told that the paint was nontoxic, although there was no factual basis for this declaration. The employers discounted scientists. The workers believed their superiors. 2. Health complaints were made in ever-increasing frequency. It became obvious that something was seriously wrong. 3. U.S. Radium and other watch-dial companies began a campaign of disinformation and bogus medical tests - some of which involved X-rays and may even have made the condition worse. 4. Doctors, dentists, and researchers complied with U.S. Radium's and other companies' requests and refused to release their data to the public. 5. Medical professionals also aided the companies by attributing worker deaths to other causes. Syphilis was often cited as the diagnosis, which had the added benefit to management of being a smear on the victims' reputations. 6. One worker, Grace Fryer, decided to sue U.S. Radium. It took Fryer two years to find a lawyer who was willing to take on U.S. Radium. Only four other workers joined her suit; they became known as the "Radium Girls." 7. In 1928, the case was settled in the middle of the trial before it went to the jury for deliberation. The settlement for each of the five "Radium Girls" was $10,000 (the equivalent of $124,000 in 2009 dollars), plus $600 a year while the victim lived and all medical expenses. Remember the general outline of this scenario because you will see it over and over again: The company denies everything while the doctors and researchers (and even the industrial hygienists) in the company's employ support the company's distorted version of the facts. Perhaps one worker in a hundred will finally pursue justice, one lawyer out of the hundreds of thousands in the United States will finally step up to the plate, and the case will be settled for chump change.
Monona Rossol
the scientific rulers will provide one kind of education for ordinary men and women, and another for those who are to become holders of scientific power. Ordinary men and women will be expected to be docile, industrious, punctual, thoughtless, and contented. Of these qualities, probably contentment will be considered the most important. In order to produce it, all the researches of psycho-analysis, behaviourism, and biochemistry will be brought into play…. All the boys and girls will learn from an early age to be what is called “co-operative”, i.e., to do exactly what everybody is doing. Initiative will be discouraged in these children, and insubordination, without being punished, will be scientifically trained out of them…. Except for the one matter of loyalty to the world State and to their own order, members of the governing class will be encouraged to be adventurous and full of initiative. It will be recognized that it is their business to improve scientific technique, and to keep the manual workers contented by means of continual new amusements…. In normal cases, children of sufficient heredity will be admitted to the governing class from the moment of conception. I start with this moment rather than birth since it is from this moment and not merely the moment of birth that the treatment of the two classes will be different. If, however, by the time the child reaches the age of three it is fairly clear that he does not attain the required standard, he will be degraded at that point. [T]here would be a very strong tendency for the governing classes to become hereditary, and that after a few generations not many children would be moved from either class into the other. This is especially likely to be the case if embryological methods of improving the breed are applied to the governing class, but not to the others. In this way the gulf between the two classes as regards native intelligence will become continually wider and wider…. Assuming that both kinds of breeding are scientifically carried out, there will come to be an increasing divergence between the two types, making them in the end almost different species. (pp. 181–188, emphasis added)
Jasun Horsley (The Vice of Kings: How Socialism, Occultism, and the Sexual Revolution Engineered a Culture of Abuse)
RAIN IN MEASURED AMOUNTS Another item of information provided in the Qur'an about rain is that it is sent down to Earth in "due measure." This is mentioned in Surat az-Zukhruf as follows: It is He Who sends down water in measured amounts from the sky by which We bring a dead land back to life. That is how you too will be raised [from the dead]. (Qur'an, 43:11) This measured quantity in rain has again been discovered by modern research. It is estimated that in one second, approximately 16 million tons of water evaporates from the Earth. This figure amounts to 513 trillion tons of water in one year. This number is equal to the amount of rain that falls on the Earth in a year. Therefore, water continuously circulates in a balanced cycle, according to a "measure." Life on Earth depends on this water cycle. Even if all the available technology in the world were to be employed for this purpose, this cycle could not be reproduced artificially. Even a minor deviation in this equilibrium would soon give rise to a major ecological imbalance that would bring about the end of life on Earth. Yet, it never happens, and rain continues to fall every year in exactly the same measure, just as revealed in the Qur'an. The proportion of rain does not merely apply to its quantity, but also to the speed of the falling raindrops. The speed of raindrops, regardless of their size, does not exceed a certain limit. Philipp Lenard, a German physicist who received the Nobel Prize in physics in 1905, found that the fall speed increased with drop diameter until a size of 4.5 mm (0.18 inch). For larger drops, however, the fall speed did not increase beyond 8 metres per second (26 ft/sec). He attributed this to the changes in drop shape caused by the air flow as the drop size increased. The change in shape thus increased the air resistance of the drop and slowed its fall rate. As can be seen, the Qur'an may also be drawing our attention to the subtle adjustment in rain which could not have been known 1,400 years ago. Every year, the amount of water that evaporates and that falls back to the Earth in the form of rain is "constant": 513 trillion tons. This constant amount is declared in the Qur'an by the expression "sending down water in due measure from the sky." The constancy of this quantity is very important for the continuity of the ecological balance, and therefore, life.
Harun Yahya (Allah's Miracles in the Qur'an)
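The figures in the passage can be checked with back-of-the-envelope arithmetic. The short calculation below is an independent check, not taken from the book: multiplying the quoted evaporation rate of 16 million tons per second by the number of seconds in a year lands near the quoted annual total of 513 trillion tons, with rounding accounting for the gap.

    # Back-of-the-envelope check of the quoted figures.
    tons_per_second = 16e6                      # quoted evaporation rate
    seconds_per_year = 365.25 * 24 * 3600       # ~3.16e7 seconds
    annual_tons = tons_per_second * seconds_per_year

    print(f"{annual_tons:.3e} tons per year")   # ~5.05e14, i.e. roughly 500 trillion tons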
This symbolism may well have been based, originally, on some visionary experience, such as happens not uncommonly today during psychological treatment. For the medical psychologist there is nothing very lurid about it. The context itself points the way to the right interpretation. The image expresses a psychologem that can hardly be formulated in rational terms and has, therefore, to make use of a concrete symbol, just as a dream must when a more or less “abstract” thought comes up during the abaissement du niveau mental that occurs in sleep. These “shocking” surprises, of which there is certainly no lack in dreams, should always be taken “as-if,” even though they clothe themselves in sensual imagery that stops at no scurrility and no obscenity. They are unconcerned with offensiveness, because they do not really mean it. It is as if they were stammering in their efforts to express the elusive meaning that grips the dreamer’s attention. [316] The context of the vision (John 3:12) makes it clear that the image should be taken not concretistically but symbolically; for Christ speaks not of earthly things but of a heavenly or spiritual mystery—a “mystery” not because he is hiding something or making a secret of it (indeed, nothing could be more blatant than the naked obscenity of the vision!) but because its meaning is still hidden from consciousness. The modern method of dream-analysis and interpretation follows this heuristic rule. If we apply it to the vision, we arrive at the following result: [317] 1. The MOUNTAIN means ascent, particularly the mystical, spiritual ascent to the heights, to the place of revelation where the spirit is present. This motif is so well known that there is no need to document it. [318] 2. The central significance of the CHRIST-FIGURE for that epoch has been abundantly proved. In Christian Gnosticism it was a visualization of God as the Archanthropos (Original Man = Adam), and therefore the epitome of man as such: “Man and the Son of Man.” Christ is the inner man who is reached by the path of self-knowledge, “the kingdom of heaven within you.” As the Anthropos he corresponds to what is empirically the most important archetype and, as judge of the living and the dead and king of glory, to the real organizing principle of the unconscious, the quaternity, or squared circle of the self. In saying this I have not done violence to anything; my views are based on the experience that mandala structures have the meaning and function of a centre of the unconscious personality. The quaternity of Christ, which must be borne in mind in this vision, is exemplified by the cross symbol, the rex gloriae, and Christ as the year.
C.G. Jung (Aion: Researches into the Phenomenology of the Self (Collected Works, Vol 9ii))
In many fields—literature, music, architecture—the label ‘Modern’ stretches back to the early 20th century. Philosophy is odd in starting its Modern period almost 400 years earlier. This oddity is explained in large measure by a radical 16th century shift in our understanding of nature, a shift that also transformed our understanding of knowledge itself. On our Modern side of this line, thinkers as far back as Galileo Galilei (1564–1642) are engaged in research projects recognizably similar to our own. If we look back to the Pre-Modern era, we see something alien: this era features very different ways of thinking about how nature worked, and how it could be known. To sample the strange flavour of pre-Modern thinking, try the following passage from the Renaissance thinker Paracelsus (1493–1541): The whole world surrounds man as a circle surrounds one point. From this it follows that all things are related to this one point, no differently from an apple seed which is surrounded and preserved by the fruit … Everything that astronomical theory has profoundly fathomed by studying the planetary aspects and the stars … can also be applied to the firmament of the body. Thinkers in this tradition took the universe to revolve around humanity, and sought to gain knowledge of nature by finding parallels between us and the heavens, seeing reality as a symbolic work of art composed with us in mind (see Figure 3). By the 16th century, the idea that everything revolved around and reflected humanity was in danger, threatened by a number of unsettling discoveries, not least the proposal, advanced by Nicolaus Copernicus (1473–1543), that the earth was not actually at the centre of the universe. The old tradition struggled against the rise of the new. Faced with the news that Galileo’s telescopes had detected moons orbiting Jupiter, the traditionally minded scholar Francesco Sizzi argued that such observations were obviously mistaken. According to Sizzi, there could not possibly be more than seven ‘roving planets’ (or heavenly bodies other than the stars), given that there are seven holes in an animal’s head (two eyes, two ears, two nostrils and a mouth), seven metals, and seven days in a week. Sizzi didn’t win that battle. It’s not just that we agree with Galileo that there are more than seven things moving around in the solar system. More fundamentally, we have a different way of thinking about nature and knowledge. We no longer expect there to be any special human significance to natural facts (‘Why seven planets as opposed to eight or 15?’) and we think knowledge will be gained by systematic and open-minded observations of nature rather than the sorts of analogies and patterns to which Sizzi appeals. However, the transition into the Modern era was not an easy one. The pattern-oriented ways of thinking characteristic of pre-Modern thought naturally appeal to meaning-hungry creatures like us. These ways of thinking are found in a great variety of cultures: in classical Chinese thought, for example, the five traditional elements (wood, water, fire, earth, and metal) are matched up with the five senses in a similar correspondence between the inner and the outer. As a further attraction, pre-Modern views often fit more smoothly with our everyday sense experience: naively, the earth looks to be stable and fixed while the sun moves across the sky, and it takes some serious discipline to convince oneself that the mathematically more simple models (like the sun-centred model of the solar system) are right.
Jennifer Nagel (Knowledge: A Very Short Introduction)