Presenting Data Quotes

We've searched our database for all the quotes and captions related to Presenting Data. Here they are! All 100 of them:

Cognitive robotics can integrate information from pre-operation medical records with real-time operating metrics to guide and enhance the precision of physicians’ instruments. By processing data from genuine surgical experiences, they’re able to provide new and improved insights and techniques. These kinds of advances can improve patient outcomes and boost trust in AI throughout the surgery. Robotics can lead to a 21% reduction in length of stay.
Ronald M. Razmi (AI Doctor: The Rise of Artificial Intelligence in Healthcare - A Guide for Users, Buyers, Builders, and Investors)
We ought to regard the present state of the universe as the effect of its antecedent state and as the cause of the state that is to follow. An intelligence knowing all the forces acting in nature at a given instant, as well as the momentary positions of all things in the universe, would be able to comprehend in one single formula the motions of the largest bodies as well as the lightest atoms in the world, provided that its intellect were sufficiently powerful to subject all data to analysis; to it nothing would be uncertain, the future as well as the past would be present to its eyes. The perfection that the human mind has been able to give to astronomy affords but a feeble outline of such an intelligence.
Pierre-Simon Laplace
My hair would continue to gray, and then one day, it would fall out entirely, and then, on a day meaninglessly close to the present one, meaninglessly like the present one, I would disappear from the earth. And all these emotions, all these yearnings, all these data, if that helps to clinch the enormity of what I'm talking about, would be gone. And that's what immortality means. It means selfishness. My generation's belief that each one of us matters more than you or anyone else would think.
Gary Shteyngart (Super Sad True Love Story)
Patient endurance and self-discipline mean we submit to the present moment as it is, not as we would like it to be.
Amos Smith (Holistic Mysticism: The Integrated Spiritual Path of the Quakers)
Past Data should not be the basis of Present Truth. Data from a prior time or experience should always and only be the basis for new questions. Always the treasure should be in the question, not in the answer.
Neale Donald Walsch (The Complete Conversations with God)
We teach brilliance bias to children from an early age. A recent US study found that when girls start primary school at the age of five, they are as likely as five-year-old boys to think women could be 'really really smart'. But by the time they turn six, something changes. They start doubting their gender. So much so, in fact, that they start limiting themselves: if a game is presented to them as intended for 'children who are really, really smart', five-year-old girls are as likely to want to play it as boys - but six-year-old girls are suddenly uninterested. Schools are teaching little girls that brilliance doesn't belong to them. No wonder that by the time they're filling out university evaluation forms, students are primed to see their female teachers as less qualified.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
Effective consulting is about gathering and presenting actionable data, and helping businesses solve problems, implement solutions, and innovate.
Hendrith Vanlon Smith Jr.
Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect. It is about choice, and having the power to control how you present yourself to the world.
Bruce Schneier (Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World)
Precisely because technology is now moving so fast, and parliaments and dictators alike are overwhelmed by data they cannot process quickly enough, present-day politicians are thinking on a far smaller scale than their predecessors a century ago. Consequently, in the early twenty-first century politics is bereft of grand visions. Government has become mere administration. It manages the country, but it no longer leads it.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
When the tragedies of others become for us diversions, sad stories with which to enthrall our friends, interesting bits of data to toss out at cocktail parties, a means of presenting a pose of political concern, or whatever…when this happens we commit the gravest of sins, condemn ourselves to ignominy, and consign the world to a dangerous course. We begin to justify our casual overview of pain and suffering by portraying ourselves as do-gooders incapacitated by the inexorable forces of poverty, famine, and war. “What can I do?” we say, “I’m only one person, and these things are beyond my control. I care about the world’s trouble, but there are no solutions.” Yet no matter how accurate this assessment, most of us are relying on it to be true, using it to mask our indulgence, our deep-seated lack of concern, our pathological self-involvement.
Lucius Shepard (The Best of Lucius Shepard)
Temporality is obviously an organised structure, and these three so-called elements of time: past, present, future, must not be envisaged as a collection of 'data' to be added together...but as the structured moments of an original synthesis. Otherwise we shall immediately meet with this paradox: the past is no longer, the future is not yet, as for the instantaneous present, everyone knows that it is not at all: it is the limit of infinite division, like the dimensionless point.
Jean-Paul Sartre (Being and Nothingness)
It is clear from the evidence presented in this book that the pharmaceutical industry does a biased job of disseminating evidence - to be surprised by this would be absurd - whether it is through advertising, drug reps, ghostwriting, hiding data, bribing people, or running educational programmes for doctors.
Ben Goldacre (Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients)
At this point, godless materialists might be cheering. If humans evolved strictly by mutation and natural selection, who needs God to explain us? To this, I reply: I do. The comparison of chimp and human sequences, interesting as it is, does not tell us what it means to be human. In my view, DNA sequence alone, even if accompanied by a vast trove of data on biological function, will never explain certain special human attributes, such as the knowledge of the Moral Law and the universal search for God. Freeing God from the burden of special acts of creation does not remove Him as the source of the things that make humanity special, and of the universe itself. It merely shows us something of how He operates.
Francis S. Collins (The Language of God: A Scientist Presents Evidence for Belief)
Thriving through change requires clear strategy. But ironically, it also requires the willingness to toss all of your existing plans out the window if the business is presented with new data or new circumstances that delegitimize the clear strategy.
Hendrith Vanlon Smith Jr.
Something as superfluous as "play" is also an essential feature of our consciousness. If you ask children why they like to play, they will say, "Because it's fun." But that invites the next question: What is fun? Actually, when children play, they are often trying to reenact complex human interactions in simplified form. Human society is extremely sophisticated, much too involved for the developing brains of young children, so children run simplified simulations of adult society, playing games such as doctor, cops and robbers, and school. Each game is a model that allows children to experiment with a small segment of adult behavior and then run simulations into the future. (Similarly, when adults engage in play, such as a game of poker, the brain constantly creates a model of what cards the various players possess, and then projects that model into the future, using previous data about people's personality, ability to bluff, etc. The key to games like chess, cards, and gambling is the ability to simulate the future. Animals, which live largely in the present, are not as good at games as humans are, especially if they involve planning. Infant mammals do engage in a form of play, but this is more for exercise, testing one another, practicing future battles, and establishing the coming social pecking order rather than simulating the future.)
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)
Habits are undeniably useful tools, relieving us of the need to run a complex mental operation every time we’re confronted with a new task or situation. Yet they also relieve us of the need to stay awake to the world: to attend, feel, think, and then act in a deliberate manner. (That is, from freedom rather than compulsion.) If you need to be reminded how completely mental habit blinds us to experience, just take a trip to an unfamiliar country. Suddenly you wake up! And the algorithms of everyday life all but start over, as if from scratch. This is why the various travel metaphors for the psychedelic experience are so apt. The efficiencies of the adult mind, useful as they are, blind us to the present moment. We’re constantly jumping ahead to the next thing. We approach experience much as an artificial intelligence (AI) program does, with our brains continually translating the data of the present into the terms of the past, reaching back in time for the relevant experience, and then using that to make its best guess as to how to predict and navigate the future. One of the things that commends travel, art, nature, work, and certain drugs to us is the way these experiences, at their best, block every mental path forward and back, immersing us in the flow of a present that is literally wonderful—wonder being the by-product of precisely the kind of unencumbered first sight, or virginal noticing, to which the adult brain has closed itself. (It’s so inefficient!) Alas, most of the time I inhabit a near-future tense, my psychic thermostat set to a low simmer of anticipation and, too often, worry. The good thing is I’m seldom surprised. The bad thing is I’m seldom surprised.
Michael Pollan (How to Change Your Mind: What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression, and Transcendence)
Arno Penzias, the Nobel Prize–winning scientist who codiscovered the cosmic microwave background radiation that provided strong support for the Big Bang in the first place, states, “The best data we have are exactly what I would have predicted, had I nothing to go on but the five Books of Moses, the Psalms, the Bible as a whole.”
Francis S. Collins (The Language of God: A Scientist Presents Evidence for Belief)
The flash opened up into something larger, an even more blasphemous notion that her brain contained too much. That the volume of information, of data, of judgments, of measurements, was too much, and there were too many people, and too many desires of too many people, and too many opinions of too many people, and too much pain from too many people, and having all of it constantly collated, collected, added and aggregated, and presented to her as if that all made it tidier and more manageable--it was too much.
Dave Eggers (The Circle (The Circle, #1))
The viewer of television, the listener to radio, the reader of magazines, is presented with a whole complex of elements—all the way from ingenious rhetoric to carefully selected data and statistics—to make it easy for him to “make up his own mind” with the minimum of difficulty and effort. But the packaging is often done so effectively that the viewer, listener, or reader does not make up his own mind at all. Instead, he inserts a packaged opinion into his mind, somewhat like inserting a cassette into a cassette player. He then pushes a button and “plays back” the opinion whenever it seems appropriate to do so. He has performed acceptably without having had to think.
Mortimer J. Adler (How to Read a Book)
Anthropologists have invented the ingenious, convenient, fictional notion of the “true Negro,” which allows them to consider, if need be, all the real Negroes on earth as fake Negroes, more or less approaching a kind of Platonic archetype, without ever attaining it. Thus, African history is full of “Negroids,” Hamites, semi-Hamites, Nilo-Hamitics, Ethiopoids, Sabaeans, even Caucasoids! Yet, if one stuck strictly to scientific data and archeological facts, the prototype of the White race would be sought in vain throughout the earliest years of present-day humanity.
Cheikh Anta Diop (The African Origin of Civilization: Myth or Reality)
Men (women were not found to exhibit this bias) who believe that they are objective in hiring decisions are more likely to hire a male applicant than an identically described female applicant. And in organisations which are explicitly presented as meritocratic, managers favour male employees over equally qualified female employees.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
What I have always liked best is when he talks about having no memory. No memory of things he'd done just a second before. Good or bad. Because memory is time folding back on itself. To remember is to disengage from the present. In order to reach any success in automobile racing, a driver must never remember. Which is why drivers compulsively record their every move, their every race, with cockpit cameras, in-car video, data mapping; a driver cannot be a witness to his own greatness. This is what Danny says. He says racing is doing. It is being a part of a moment and being aware of nothing else but that moment. Reflection must come at a later time. The great champion Julian Sabella Rosa has said: “When I am racing, my mind and my body are working so quickly and so well together, I must be sure not to think, or else I will definitely make a mistake.”
Garth Stein (The Art of Racing in the Rain)
Even rational, data-driven scientists could be sent into prolonged states of hysteria when presented with evidence that their favorite foods might be killing them.
T. Colin Campbell (Whole: Rethinking the Science of Nutrition)
In April 2017, a confidential document is leaked that reveals Facebook is offering advertisers the opportunity to target thirteen-to-seventeen-year-olds across its platforms, including Instagram, during moments of psychological vulnerability when they feel “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” “stupid,” “useless,” and “like a failure.” Or to target them when they’re worried about their bodies and thinking of losing weight. Basically, when a teen is in a fragile emotional state. Facebook’s advertising team had made this presentation for an Australian client that explains that Instagram and Facebook monitor teenagers’ posts, photos, interactions, conversations with friends, visual communications, and internet activity on and off Facebook’s platforms and use this data to target young people when they’re vulnerable. In addition to the moments of vulnerability listed, Facebook finds moments when teenagers are concerned with “body confidence” and “working out & losing weight.”
Sarah Wynn-Williams (Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism)
The economist Robin Hanson estimates, based on historical economic and population data, a characteristic world economy doubling time for Pleistocene hunter–gatherer society of 224,000 years; for farming society, 909 years; and for industrial society, 6.3 years. (In Hanson’s model, the present epoch is a mixture of the farming and the industrial growth modes—the world economy as a whole is not yet growing at the 6.3-year doubling rate.)
Nick Bostrom (Superintelligence: Paths, Dangers, Strategies)
The Web is no longer just about the present: that crazy driver or this delicious meal. As we share messages, photos and updates, we're building a data trail about our lives and histories online. We can now tell stories not just about what is happening today, but where we've been, what we've shared, and what might happen in the future.
Keith Ferrazzi
Yet like many other human traits that made sense in past ages but cause trouble in the modern age, the knowledge illusion has its downside. The world is becoming ever more complex, and people fail to realise just how ignorant they are of what’s going on. Consequently some who know next to nothing about meteorology or biology nevertheless propose policies regarding climate change and genetically modified crops, while others hold extremely strong views about what should be done in Iraq or Ukraine without being able to locate these countries on a map. People rarely appreciate their ignorance, because they lock themselves inside an echo chamber of like-minded friends and self-confirming newsfeeds, where their beliefs are constantly reinforced and seldom challenged. Providing people with more and better information is unlikely to improve matters. Scientists hope to dispel wrong views by better science education, and pundits hope to sway public opinion on issues such as Obamacare or global warming by presenting the public with accurate facts and expert reports. Such hopes are grounded in a misunderstanding of how humans actually think. Most of our views are shaped by communal groupthink rather than individual rationality, and we hold on to these views out of group loyalty. Bombarding people with facts and exposing their individual ignorance is likely to backfire. Most people don’t like too many facts, and they certainly don’t like to feel stupid. Don’t be so sure that you can convince Tea Party supporters of the truth of global warming by presenting them with sheets of statistical data.
Yuval Noah Harari (21 Lessons for the 21st Century)
If you summarily rule out any single sensation and do not make a distinction between the element of belief that is superimposed on a percept that awaits verification and what is actually present in sensation or in the feelings or some percept of the mind itself, you will cast doubt on all other sensations by your unfounded interpretation and consequently abandon all the criteria of truth. On the other hand, in cases of interpreted data, if you accept as true those that need verification as well as those that do not, you will still be in error, since the whole question at issue in every judgment of what is true or not true will be left intact.
Epicurus (Lettera sulla felicità)
After presenting the data and making my arguments about the unconstitutional exclusion of African Americans, the judge complained loudly. “I’m going to grant your motion, Mr. Stevenson, but I’ll be honest. I’m pretty fed up with people always talking about minority rights. African Americans, Mexican Americans, Asian Americans, Native Americans … When is someone going to come to my courtroom and protect the rights of Confederate Americans?”
Bryan Stevenson (Just Mercy: A Story of Justice and Redemption)
As CEO, you should have an opinion on absolutely everything. You should have an opinion on every forecast, every product plan, every presentation, and even every comment. Let people know what you think. If you like someone’s comment, give her the feedback. If you disagree, give her the feedback. Say what you think. Express yourself. This will have two critically important positive effects:   Feedback won’t be personal in your company. If the CEO constantly gives feedback, then everyone she interacts with will just get used to it. Nobody will think, “Gee, what did she really mean by that comment? Does she not like me?” Everybody will naturally focus on the issues, not an implicit random performance evaluation.   People will become comfortable discussing bad news. If people get comfortable talking about what each other are doing wrong, then it will be very easy to talk about what the company is doing wrong. High-quality company cultures get their cue from data networking routing protocols: Bad news travels fast and good news travels slowly. Low-quality company cultures take on the personality of the Wicked Witch of the West in The Wiz: “Don’t nobody bring me no bad news.”
Ben Horowitz (The Hard Thing About Hard Things: Building a Business When There Are No Easy Answers)
The first time I heard Robert Anda present the results of the ACE study, he could not hold back his tears. In his career at the CDC he had previously worked in several major risk areas, including tobacco research and cardiovascular health. But when the ACE study data started to appear on his computer screen, he realized that they had stumbled upon the gravest and most costly public health issue in the United States: child abuse. He had calculated that its overall costs exceeded those of cancer or heart disease and that eradicating child abuse in America would reduce the overall rate of depression by more than half, alcoholism by two-thirds, and suicide, IV drug use, and domestic violence by three-quarters. 20 It would also have a dramatic effect on workplace performance and vastly decrease the need for incarceration.
Bessel van der Kolk (The Body Keeps the Score: Brain, Mind, and Body in the Healing of Trauma)
countries with gender-inflected languages, which have strong ideas of masculine and feminine present in almost every utterance, are the most unequal in terms of gender
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
When we see new ways of displaying data, we should be careful not to confuse presentation with substance
David Sumpter (Soccermatics: Mathematical Adventures in the Beautiful Game (Bloomsbury Sigma))
Every time a scientific paper presents a bit of data, it's accompanied by an error bar – a quiet but insistent reminder that no knowledge is complete or perfect. It's a calibration of how much we trust what we think we know. If the error bars are small, the accuracy of our empirical knowledge is high; if the error bars are large, then so is the uncertainty in our knowledge.
Carl Sagan (The Demon-Haunted World: Science as a Candle in the Dark)
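Sagan's error bars can be made concrete with a minimal sketch (hypothetical measurements, assuming the common convention of reporting the mean plus or minus its standard error): the bar shrinks as the number of measurements grows, even when the spread of individual readings stays the same.

```python
import math

# A minimal sketch of the error-bar idea: report a mean with its standard
# error, which shrinks as 1/sqrt(n). The numbers below are made up.
def mean_with_error(samples):
    """Return (mean, standard error of the mean) for a list of readings."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
    stderr = math.sqrt(variance / n)  # error bar narrows as n grows
    return mean, stderr

few = [9.9, 10.3, 9.6, 10.4]   # four readings of the same quantity
many = few * 25                # same spread, 100 readings

m1, e1 = mean_with_error(few)
m2, e2 = mean_with_error(many)
print(f"n={len(few):3d}: {m1:.2f} ± {e1:.2f}")
print(f"n={len(many):3d}: {m2:.2f} ± {e2:.2f}")
```

With the same scatter of individual readings, the 100-sample error bar is far narrower than the 4-sample one, which is exactly the calibration of trust the quote describes.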
Were we dealing with a spectrum-based system that described male and female sexuality with equal accuracy, data taken from gay males would look similar to data taken from straight females—and yet this is not what we see in practice. Instead, the data associated with gay male sexuality presents a mirror image of data associated with straight males: Most gay men are as likely to find the female form aversive as straight men are likely to find the male form aversive. In gay females we observe a similar phenomenon, in which they mirror straight females instead of appearing in the same position on the spectrum as straight men—in other words, gay women are just as unlikely to find the male form aversive as straight females are to find the female form aversive. Some of the research highlighting these trends has been conducted with technology like laser doppler imaging (LDI), which measures genital blood flow when individuals are presented with pornographic images. The findings can, therefore, not be written off as a product of men lying to hide middling positions on the Kinsey scale due to a higher social stigma against what is thought of in the vernacular as male bisexuality/pansexuality. We should, however, note that laser Doppler imaging systems are hardly perfect, especially when measuring arousal in females. It is difficult to attribute these patterns to socialization, as they are observed across cultures and even within the earliest of gay communities that emerged in America, which had to overcome a huge amount of systemic oppression to exist. It’s a little crazy to argue that the socially oppressed sexuality of the early American gay community was largely a product of socialization given how much they had overcome just to come out. If, however, one works off the assumptions of our model, this pattern makes perfect sense. 
There must be a stage in male brain development that determines which set of gendered stimuli is dominant, then applies a negative modifier to stimuli associated with other genders. This stage does not apparently take place during female sexual development. 
Simone Collins (The Pragmatist's Guide to Sexuality)
Ordinary humans will find it very difficult to resist this process. At present, people are happy to give away their most valuable asset—their personal data—in exchange for free email services and funny cat videos. It’s a bit like African and Native American tribes who unwittingly sold entire countries to European imperialists in exchange for colorful beads and cheap trinkets. If, later on, ordinary people decide to try to block the flow of data, they might find it increasingly difficult, especially as they might come to rely on the network for all their decisions, and even for their healthcare and physical survival.
Yuval Noah Harari (21 Lessons for the 21st Century)
The brain, he writes, is like Kublai Khan, the great Mongol emperor of the thirteenth century. It sits enthroned in its skull, "encased in darkness and silence," at a lofty remove from brute reality. Messengers stream in from every corner of the sensory kingdom, bringing word of distant sights, sounds, and smells. Their reports arrive at different rates, often long out of date, yet the details are all stitched together into a seamless chronology. The difference is that Kublai Khan was piecing together the past. The brain is describing the present—processing reams of disjointed data on the fly, editing everything down to an instantaneous now. How does it manage it?
Burkhard Bilger
As it happens, there’s a way of presenting data, called the funnel plot, that indicates whether or not the scientific literature is biased in this way.15 (If statistics don’t excite you, feel free to skip straight to the probably unsurprising conclusion in the last sentence of this paragraph.) You plot the data points from all your studies according to the effect sizes, running along the horizontal axis, and the sample size (roughly)16 running up the vertical axis. Why do this? The results from very large studies, being more “precise,” should tend to cluster close to the “true” size of the effect. Smaller studies by contrast, being subject to more random error because of their small, idiosyncratic samples, will be scattered over a wider range of effect sizes. Some small studies will greatly overestimate a difference; others will greatly underestimate it (or even “flip” it in the wrong direction). The next part is simple but brilliant. If there isn’t publication bias toward reports of greater male risk taking, these over- and underestimates of the sex difference should be symmetrical around the “true” value indicated by the very large studies. This, with quite a bit of imagination, will make the plot of the data look like an upside-down funnel. (Personally, my vote would have been to call it the candlestick plot, but I wasn’t consulted.) But if there is bias, then there will be an empty area in the plot where the smaller samples that underestimated the difference, found no differences, or yielded greater female risk taking should be. In other words, the overestimates of male risk taking get published, but various kinds of “underestimates” do not. When Nelson plotted the data she’d been examining, this is exactly what she found: “Confirmation bias is strongly indicated.”17
Cordelia Fine (Testosterone Rex: Myths of Sex, Science, and Society)
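The funnel-shape logic Fine describes can be simulated in a few lines. This is a sketch with invented data, not her analysis: studies of one assumed "true" effect are simulated, and small samples scatter symmetrically around it while large samples cluster tightly. Publication bias would show up as one missing wing of the small-sample scatter.

```python
import random

# A minimal sketch of the funnel-plot idea: simulate many unbiased studies
# of a single effect. Small studies spread widely around the true value,
# large studies cluster near it. All numbers here are hypothetical.
random.seed(0)
TRUE_EFFECT = 0.3  # assumed "true" effect size

def simulate_study(n):
    """Observed effect in a study of n subjects (noise shrinks as 1/sqrt(n))."""
    return TRUE_EFFECT + random.gauss(0, 1.0 / n ** 0.5)

small = [simulate_study(20) for _ in range(500)]
large = [simulate_study(2000) for _ in range(500)]

spread = lambda xs: max(xs) - min(xs)
print(f"spread of small-study estimates: {spread(small):.2f}")
print(f"spread of large-study estimates: {spread(large):.2f}")
```

Plotting effect size against sample size for these simulated studies would give the symmetric upside-down funnel; an empty region among the small-study underestimates would be the signature of publication bias.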
Theorists of propaganda have identified five basic rules: 1. The rule of simplification: reducing all data to a simple confrontation between ‘Good and Bad’, ‘Friend and Foe’. 2. The rule of disfiguration: discrediting the opposition by crude smears and parodies. 3. The rule of transfusion: manipulating the consensus values of the target audience for one’s own ends. 4. The rule of unanimity: presenting one’s viewpoint as if it were the unanimous opinion of all right-thinking people: drawing the doubting individual into agreement by the appeal of star-performers, by social pressure, and by ‘psychological contagion’. 5. The rule of orchestration: endlessly repeating the same messages in different variations and combinations.
Norman Davies (Europe: A History)
A humorous treatment of the rigid uniformitarian view came from Mark Twain. Although the shortening of the Mississippi River he referred to was the result of engineering projects eliminating many of the bends in the river, it is a thought-provoking spoof: The Mississippi between Cairo and New Orleans was twelve hundred and fifteen miles long one hundred and seventy-six years ago. . . . Its length is only nine hundred and seventy-three miles at present. Now, if I wanted to be one of those ponderous scientific people, and “let on” to prove what had occurred in the remote past by what had occurred in a given time in the recent past . . . what an opportunity is here! Geology never had such a chance, nor such exact data to argue from! . . . In the space of one hundred and seventy-six years the Lower Mississippi has shortened itself two hundred and forty-two miles. That is an average of a trifle over one mile and a third per year. Therefore, any calm person, who is not blind or idiotic, can see that in the Old Oolitic Silurian Period, just a million years ago next November, the Lower Mississippi River was upwards of one million three hundred thousand miles long, and stuck out over the Gulf of Mexico like a fishing-rod. And by the same token any person can see that seven hundred and forty-two years from now the lower Mississippi will be only a mile and three-quarters long. . . . There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.
Mark Twain (Life on the Mississippi)
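Twain's spoof is also a tidy lesson in blind linear extrapolation, and his arithmetic checks out. A short sketch using only the figures quoted above (nothing else is assumed):

```python
# Twain's linear extrapolation, reproduced from the quote's own numbers.
miles_176_years_ago = 1215   # river length, 176 years before Twain wrote
miles_now = 973              # river length "at present"
years = 176

rate = (miles_176_years_ago - miles_now) / years   # miles shed per year
assert abs(rate - 1.375) < 1e-9   # "a trifle over one mile and a third per year"

# Run the same rate backward a million years, as Twain does.
length_then = miles_now + 1_000_000 * rate
assert length_then > 1_300_000    # "upwards of one million three hundred thousand miles"
```

The joke lands precisely because each step of the arithmetic is correct; only the assumption that the rate holds forever is absurd.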
Consumers of news should be aware of its built-in bias and adjust their information diet to include sources that present the bigger statistical picture: less Facebook News Feed, more Our World in Data. Journalists should put lurid events in context. A killing or plane crash or shark attack should be accompanied by the annual rate, which takes into account the denominator of the probability, not just the numerator. A setback or spate of misfortunes should be put into the context of the longer-term trend.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Memory is the enemy of wonder, which abides nowhere else but in the present. This is why, unless you are a child, wonder depends on forgetting—on a process, that is, of subtraction. Ordinarily we think of drug experiences as additive—it’s often said that drugs “distort” normal perceptions and augment the data of the senses (adding hallucinations, say), but it may be that the very opposite is true—that they work by subtracting some of the filters that consciousness normally interposes between us and the world.
Michael Pollan (The Botany of Desire: A Plant's-Eye View of the World)
From this point of view, the laws of science represent data compression in action. A theoretical physicist acts like a very clever coding algorithm. “The laws of science that have been discovered can be viewed as summaries of large amounts of empirical data about the universe,” wrote Solomonoff. “In the present context, each such law can be transformed into a method of compactly coding the empirical data that gave rise to that law.” A good scientific theory is economical. This was yet another way of saying so.
James Gleick (The Information: A History, a Theory, a Flood)
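Solomonoff's point can be made concrete with a toy example (the data set and the "law" below are invented for illustration): a short rule that regenerates a large data set exactly is a compressed encoding of that data.

```python
# The raw "empirical data": the first 1,000 perfect squares, stored explicitly.
data = [n * n for n in range(1000)]

# The "law" that generated them, stored as a short program text.
law = "[n * n for n in range(1000)]"

# The law reproduces the data exactly...
assert eval(law) == data

# ...while taking a tiny fraction of the storage.
assert len(law) < len(repr(data))
```

In this sense a good theory is economical: keeping the 28-character rule is strictly cheaper than keeping the thousands of characters of data it summarizes.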
most of us are rarely inside the present moment. We spend a disproportionate amount of time plotting the future or revisiting past events. But when we swim, or shower, or take a bath, we have little choice but to position ourselves in the present, giving our thoughts room to float and wander.
Martin Lindstrom (Small Data: The Tiny Clues That Uncover Huge Trends)
Bullshit involves language, statistical figures, data graphics, and other forms of presentation intended to persuade or impress an audience by distracting, overwhelming, or intimidating them with a blatant disregard for truth, logical coherence, or what information is actually being conveyed. The key elements of this definition are that bullshit bears no allegiance to conveying the truth, and that the bullshitter attempts to conceal this fact behind some type of rhetorical veil. Sigmund Freud illustrated the concept about as well as one could imagine in a letter he wrote his fiancée, Martha Bernays, in 1884: So I gave my lecture yesterday.
Carl T. Bergstrom (Calling Bullshit: The Art of Skepticism in a Data-Driven World)
The other terror that scares us from self-trust is our consistency; a reverence for our past act or word, because the eyes of others have no other data for computing our orbit than our past acts, and we are loath to disappoint them. But why should you keep your head over your shoulder? Why drag about this corpse of your memory, lest you contradict somewhat you have stated in this or that public place? Suppose you should contradict yourself; what then? It seems to be a rule of wisdom never to rely on your memory alone, scarcely even in acts of pure memory, but to bring the past for judgment into the thousand-eyed present, and live ever in a new day.
Ralph Waldo Emerson (Self-Reliance and Other Essays)
Precisely because technology is now moving so fast, and parliaments and dictators alike are overwhelmed by data they cannot process quickly enough, present-day politicians are thinking on a far smaller scale than their predecessors a century ago. Consequently, in the early twenty-first century politics is bereft of grand visions.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Present-day democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics is losing control of events, and is failing to present us with meaningful visions of the future.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Meanwhile, the extraordinary measures we take to stay abreast of each minuscule change to the data stream end up magnifying the relative importance of these blips to the real scheme of things. Investors trade, politicians respond, and friends judge based on the micromovements of virtual needles. By dividing our attention between our digital extensions, we sacrifice our connection to the truer present in which we are living. The tension between the faux present of digital bombardment and the true now of a coherently living human generated the second kind of present shock, what we're calling digiphrenia—digi for "digital," and phrenia for "disordered condition of mental activity."
Douglas Rushkoff (Present Shock: When Everything Happens Now)
Ordinary humans will find it very difficult to resist this process. At present, people are happy to give away their most valuable asset—their personal data—in exchange for free email services and funny cat videos. It’s a bit like African and Native American tribes who unwittingly sold entire countries to European imperialists in exchange for colorful beads and cheap trinkets.
Yuval Noah Harari (21 Lessons for the 21st Century)
There is another issue with the largely cognitive approach to management, which we had big-time at Google. Smart, analytical people, especially ones steeped in computer science and mathematics as we were, will tend to assume that data and other empirical evidence can solve all problems. Quants or techies with this worldview tend to see the inherently messy, emotional tension that’s always present in teams of humans as inconvenient and irrational—an irritant that will surely be resolved in the course of a data-driven decision process. Of course, humans don’t always work that way. Things come up, tensions arise, and they don’t naturally go away. People do their best to avoid talking about these situations, because they’re awkward. Which makes it worse.
Eric Schmidt (Trillion Dollar Coach: The Leadership Playbook of Silicon Valley's Bill Campbell)
(…) it may be seriously questioned whether the advent of modern communications media has much enhanced our understanding of the world in which we live. (…) Perhaps we know more about the world than we used to, and insofar as knowledge is prerequisite to understanding, that is all to the good. But knowledge is not as much a prerequisite to understanding as is commonly supposed. We do not have to know everything about something in order to understand it; too many facts are often as much of an obstacle to understanding as too few. There is a sense in which we moderns are inundated with facts to the detriment of understanding. (…) One of the reasons for this situation is that the very media we have mentioned are so designed as to make thinking seem unnecessary (though this is only an appearance). The packaging of intellectual positions and views is one of the most active enterprises of some of the best minds of our day. The viewer of television, the listener to radio, the reader of magazines, is presented with a whole complex of elements—all the way from ingenious rhetoric to carefully selected data and statistics—to make it easy for him to “make up his own mind” with the minimum of difficulty and effort. But the packaging is often done so effectively that the viewer, listener, or reader does not make up his own mind at all. Instead, he inserts a packaged opinion into his mind, somewhat like inserting a cassette into a cassette player. He then pushes a button and “plays back” the opinion whenever it seems appropriate to do so. He has performed acceptably without having had to think.
Mortimer J. Adler (How to Read a Book: The Classic Guide to Intelligent Reading)
With each passing year, do our religious beliefs conserve more and more of the data of human experience? If religion addresses a genuine sphere of understanding and human necessity, then it should be susceptible to progress; its doctrines should become more useful, rather than less. Progress in religion, as in other fields, would have to be a matter of present inquiry, not the mere reiteration of past doctrine.
Sam Harris (The End of Faith: Religion, Terror, and the Future of Reason)
When Bruhn and Wolf first presented their findings to the medical community, you can imagine the kind of skepticism they faced. They went to conferences where their peers were presenting long rows of data arrayed in complex charts and referring to this kind of gene or that kind of physiological process, and they themselves were talking instead about the mysterious and magical benefits of people stopping to talk to one another on the street and of having three generations under one roof. Living a long life, the conventional wisdom at the time said, depended to a great extent on who we were—that is, our genes. It depended on the decisions we made—on what we chose to eat, and how much we chose to exercise, and how effectively we were treated by the medical system. No one was used to thinking about health in terms of community.
Malcolm Gladwell (Outliers: The Story of Success)
In 2012, a World Economic Forum analysis found that countries with gender-inflected languages, which have strong ideas of masculine and feminine present in almost every utterance, are the most unequal in terms of gender. But here’s an interesting quirk: countries with genderless languages (such as Hungarian and Finnish) are not the most equal. Instead, that honour belongs to a third group, countries with ‘natural gender languages’ such as English. These languages allow gender to be marked (female teacher, male nurse) but largely don’t encode it into the words themselves. The study authors suggested that if you can’t mark gender in any way you can’t ‘correct’ the hidden bias in a language by emphasising ‘women’s presence in the world’. In short: because men go without saying, it matters when women literally can’t get said at all.
Caroline Criado Pérez (Invisible Women: Exposing Data Bias in a World Designed for Men)
The story presents a curious pattern. Adam and Eve both go beyond requested facts to offer explanations. What’s the significance? Early on, says Pearl, “we humans realized the world is not made up only of dry facts (what we might call data today); rather, those facts are glued together by an intricate web of cause-effect relationships.” What’s more, he says, “causal explanations, not dry facts, make up the bulk of our knowledge.”
Michael Hyatt (Mind Your Mindset: The Science That Shows Success Starts with Your Thinking)
When a process is rescheduled to run on a multiprocessor system, it doesn’t necessarily run on the same CPU on which it last executed. The usual reason it may run on another CPU is that the original CPU is already busy. When a process changes CPUs, there is a performance impact: in order for a line of the process’s data to be loaded into the cache of the new CPU, it must first be invalidated (i.e., either discarded if it is unmodified, or flushed to main memory if it was modified), if present in the cache of the old CPU. (To prevent cache inconsistencies, multiprocessor architectures allow data to be kept in only one CPU cache at a time.) This invalidation costs execution time. Because of this performance impact, the Linux (2.6) kernel tries to ensure soft CPU affinity for a process — wherever possible, the process is rescheduled to run on the same CPU.
Michael Kerrisk (The Linux Programming Interface: A Linux and UNIX System Programming Handbook)
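The soft affinity Kerrisk describes is the scheduler's automatic, cache-preserving preference; a process can also request hard affinity explicitly. A minimal sketch of the latter, assuming a Linux host (Python's `os.sched_setaffinity` wraps the same sched_setaffinity(2) syscall and exists only there):

```python
import os

# Linux-only: pin the calling process (pid 0 = self) to CPU 0, then read
# the mask back to confirm the kernel applied it. Guarded so the snippet
# is a no-op on platforms without the syscall wrapper.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, {0})             # restrict to CPU 0 only
    assert os.sched_getaffinity(0) == {0}    # kernel reports the new mask
```

Hard pinning like this trades the scheduler's flexibility for guaranteed cache warmth, which is why the kernel's default is merely the soft preference described above.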
What will happen once we have genome-wide data from a thousand European farmers living shortly after the transition to agriculture? Comparing the results of a scan for recent natural selection in these individuals to the same scan performed in present-day Europeans should make it possible to understand whether the pace and nature of human adaptation has changed between preagricultural times and the time since the transition to agriculture.
David Reich (Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past)
The master propagandist, like the advertising expert, avoids obvious emotional appeals and strives for a tone that is consistent with the prosaic quality of modern life—a dry, bland matter-of-factness. Nor does the propagandist circulate "intentionally biased" information. He knows that partial truths serve as more effective instruments of deception than lies. Thus he tries to impress the public with statistics of economic growth that neglect to give the base year from which growth is calculated, with accurate but meaningless facts about the standard of living—with raw and uninterpreted data, in other words, from which the audience is invited to draw the inescapable conclusion that things are getting better and the present régime therefore deserves the people's confidence, or on the other hand that things are getting worse so rapidly that the present régime should be given emergency powers to deal with the developing crisis.
Christopher Lasch (The Culture of Narcissism: American Life in An Age of Diminishing Expectations)
But young children, whose prefrontal cortexes have barely begun to ripen, can’t conceive of a future, which means they spend their lives in the permanent present, a forever feeling of right now. At times, this is a desirable state of consciousness; indeed, for meditators, it’s the ultimate aspiration. But living in the permanent present is not a practical parenting strategy. “Everybody would like to be in the present,” says Daniel Gilbert, a social psychologist at Harvard and author of the 2006 best-seller Stumbling on Happiness. “Certainly it’s true that there is an important role for being present in our lives. All the data say that. My own research says that.” The difference is that children, by definition, only live in the present, which means that you, as a parent, don’t get much of a chance. “Everyone is moving at the same speed toward the future,” he says. “But your children are moving at that same speed with their eyes closed. So you’re the ones who’ve got to steer.” He thinks about this for a moment. “You know, back in the early seventies, I hung out with a lot of people who wanted to live in the present. And it meant that no one paid the rent.” In effect, parents and small children have two completely different temporal outlooks. Parents can project into the future; their young children, anchored in the present, have a much harder time of it. This difference can be a formula for heartbreak for a small child.
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
In trying to comprehend and judge moral dilemmas of this scale, people often resort to one of four methods. The first is to downsize the issue. To understand the Syrian civil war as though it were occurring between two foragers, for example, one imagines the Assad regime as a lone person and the rebels as another person; one of them is bad and one of them is good. The historical complexity of the conflict is replaced by a simple, clear plot. The second method is to focus on a touching human story that ostensibly stands for the whole conflict. When you try to explain to people the true complexity of the conflict by means of statistics and precise data, you lose them, but a personal story about the fate of one child activates the tear ducts, makes the blood boil, and generates false moral certainty. This is something that many charities have understood for a long time. In one noteworthy experiment, people were asked to donate money to help a poor seven-year-old girl from Mali named Rokia. Many were moved by her story and opened their hearts and purses. However, when in addition to Rokia’s personal story the researchers also presented people with statistics about the broader problem of poverty in Africa, respondents suddenly became less willing to help. In another study, scholars solicited donations to help either one sick child or eight sick children. People gave more money to the single child than to the group of eight.
Yuval Noah Harari (21 Lessons for the 21st Century)
At present, computers are spectacular at number crunching and data processing. We can code programmes that feel as though computers are interacting with us, and that's fun, but in fact they aren't interacting in a way that we expect a human being to interact. But what will happen when a programme that has self-developed, that has its own version of what we call consciousness - realises, in the human sense of the verb 'to realise', exactly what/who is on the other side of the screen?
Jeanette Winterson (Frankissstein: A Love Story)
We brought you fire, dumbed it down so you could understand it, but now you think it's magic, you're taking us for granted, you used the very tech and platforms that we created, to spit disgust and spread distrust of everything we've ever stated, now we can research for years present our data in reams, but if your mothers plumber's meth dealer sends you some memes, despite the lack of proof you think they're just as valid as fact so now we're forming a pact. We're going to take the fire back.
Mur Lafferty (Chaos Terminal (The Midsolar Murders, #2))
At the first trans health conference I ever attended, a parent asked about long-term health risks for people taking hormones. The doctor gave a full assessment of issues that trans men face; many of them mimic the risks that would be inherited from father to son if they'd been born male, now that testosterone is a factor. "What about trans women?" another parent asked. The doctor took a deep breath. "Those outcomes are murkier. Because trans women are so discriminated against, they're at far greater risk for issues like alcoholism, poverty, homelessness, and lack of access to good healthcare. All of these issues impact their overall health so much that it's hard to gather data on what their health outcomes would be if these issues weren't present." This was stunning: a group of people is treated so badly by our culture that we can't clearly study their health. The burden of this abuse is that substantial and pervasive. Your generation will be healthier. The signs are already there.
Carolyn Hays (A Girlhood: Letter to My Transgender Daughter)
Quantum physics tells us that no matter how thorough our observation of the present, the (unobserved) past, like the future, is indefinite and exists only as a spectrum of possibilities. The universe, according to quantum physics, has no single past, or history. The fact that the past takes no definite form means that observations you make on a system in the present affect its past. That is underlined rather dramatically by a type of experiment thought up by physicist John Wheeler, called a delayed-choice experiment. Schematically, a delayed-choice experiment is like the double-slit experiment we just described, in which you have the option of observing the path that the particle takes, except in the delayed-choice experiment you postpone your decision about whether or not to observe the path until just before the particle hits the detection screen. Delayed-choice experiments result in data identical to those we get when we choose to observe (or not observe) the which-path information by watching the slits themselves. But in this case the path each particle takes—that is, its past—is determined long after it passed through the slits and presumably had to “decide” whether to travel through just one slit, which does not produce interference, or both slits, which does. Wheeler even considered a cosmic version of the experiment, in which the particles involved are photons emitted by powerful quasars billions of light-years away. Such light could be split into two paths and refocused toward earth by the gravitational lensing of an intervening galaxy. Though the experiment is beyond the reach of current technology, if we could collect enough photons from this light, they ought to form an interference pattern. Yet if we place a device to measure which-path information shortly before detection, that pattern should disappear. 
The choice whether to take one or both paths in this case would have been made billions of years ago, before the earth or perhaps even our sun was formed, and yet with our observation in the laboratory we will be affecting that choice.
Stephen Hawking (The Grand Design)
We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past could be present before its eyes.
Pierre Simon Laplace, A Philosophical Essay on Probabilities
Precisely because technology is now moving so fast, and parliaments and dictators alike are overwhelmed by data they cannot process quickly enough, present-day politicians are thinking on a far smaller scale than their predecessors a century ago. In the early twenty-first century, politics is consequently bereft of grand visions. Government has become mere administration. It manages the country, but it no longer leads it. It makes sure teachers are paid on time and sewage systems don’t overflow, but it has no idea where the country will be in twenty years.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
I am little acquainted with the phenomenon as it shows itself at sea; but at Monticello it is familiar. There is a solitary mountain about forty miles off in the South, whose natural shape, as presented to view there, is a regular cone; but by the effect of looming, it sometimes subsides almost totally in the horizon; sometimes it rises more acute and more elevated; sometimes it is hemispherical; and sometimes its sides are perpendicular, its top flat, and as broad as its base. In short, it assumes at times the most whimsical shapes, and all these perhaps successively in the same morning.
Thomas Jefferson (Notes on the State of Virginia: A Compilation of Data About the State's Natural Resources, Economy and the Nature of the Good Society)
In fact this desire for consonance in the apocalyptic data, and our tendency to be derisive about it, seem to me equally interesting. Each manifests itself, in the presence of the other, in most of our minds. We are all ready to be sceptical about Father Marystone, but we are most of us given to some form of 'centurial mysticism,' and even to more extravagant apocalyptic practices: a point I shall be taking up in my fourth talk. What it seems to come to is this. Men in the middest make considerable imaginative investments in coherent patterns which, by the provision of an end, make possible a satisfying consonance with the origins and with the middle. That is why the image of the end can never be permanently falsified. But they also, when awake and sane, feel the need to show a marked respect for things as they are; so that there is a recurring need for adjustments in the interest of reality as well as of control. This has relevance to literary plots, images of the grand temporal consonance; and we may notice that there is the same co-existence of naïve acceptance and scepticism here as there is in apocalyptic. Broadly speaking, it is the popular story that sticks most closely to established conventions; novels the clerisy calls 'major' tend to vary them, and to vary them more and more as time goes by. I shall be talking about this in some detail later, but a few brief illustrations might be useful now. I shall refer chiefly to one aspect of the matter, the falsification of one's expectation of the end. The story that proceeded very simply to its obviously predestined end would be nearer myth than novel or drama. Peripeteia, which has been called the equivalent, in narrative, of irony in rhetoric, is present in every story of the least structural sophistication. 
Now peripeteia depends on our confidence of the end; it is a disconfirmation followed by a consonance; the interest of having our expectations falsified is obviously related to our wish to reach the discovery or recognition by an unexpected and instructive route. It has nothing whatever to do with any reluctance on our part to get there at all. So that in assimilating the peripeteia we are enacting that readjustment of expectations in regard to an end which is so notable a feature of naïve apocalyptic. And we are doing rather more than that; we are, to look at the matter in another way, re-enacting the familiar dialogue between credulity and scepticism. The more daring the peripeteia, the more we may feel that the work respects our sense of reality; and the more certainly we shall feel that the fiction under consideration is one of those which, by upsetting the ordinary balance of our naïve expectations, is finding something out for us, something real. The falsification of an expectation can be terrible, as in the death of Cordelia; it is a way of finding something out that we should, on our more conventional way to the end, have closed our eyes to. Obviously it could not work if there were not a certain rigidity in the set of our expectations.
Frank Kermode (The Sense of an Ending: Studies in the Theory of Fiction)
A disturbing demonstration of depletion effects in judgment was recently reported in the Proceedings of the National Academy of Sciences. The unwitting participants in the study were eight parole judges in Israel. They spend entire days reviewing applications for parole. The cases are presented in random order, and the judges spend little time on each one, an average of 6 minutes. (The default decision is denial of parole; only 35% of requests are approved. The exact time of each decision is recorded, and the times of the judges’ three food breaks—morning break, lunch, and afternoon break—during the day are recorded as well.) The authors of the study plotted the proportion of approved requests against the time since the last food break. The proportion spikes after each meal, when about 65% of requests are granted. During the two hours or so until the judges’ next feeding, the approval rate drops steadily, to about zero just before the meal. As you might expect, this is an unwelcome result and the authors carefully checked many alternative explanations. The best possible account of the data provides bad news: tired and hungry judges tend to fall back on the easier default position of denying requests for parole.
Daniel Kahneman (Thinking, Fast and Slow)
I know very little with anything approaching certainty. I know that I was born, that I exist, and that I will die. For the most part, I can trust my brain's interpretation of the data presented to my senses: this is a rose, that is a car, she is my wife. I do not doubt the reality of the thoughts and emotions and impulses I experience in response to these things. . . . Yet apart from these primary perceptions, intuitions, inferences, and bits of information, the views that I hold about the things that really matter to me--meaning, truth, happiness, goodness, beauty--are finely woven tissues of belief and opinion.
Stephen Batchelor (Confession of a Buddhist Atheist)
I try to be rational (or at least my imaginary facsimile of what rationality is supposed to be). I try to look at the available data objectively (fully aware that this is impossible). I try to extrapolate what may be happening now into what will be happening later. And this, of course, is where naïve realism punches me in the throat. There's simply no way around the limited ceiling of my own mind. It's flat-out impossible to speculate on the future without (a) consciously focusing on the most obvious aspects of what we already know and (b) unconsciously excluding all the things we don't have the intellectual potential to grasp.
Chuck Klosterman (But What If We're Wrong? Thinking About the Present As If It Were the Past)
We can begin to understand what this means by taking up the fourth principle: whenever possible, we should take measures to re-embody the information we think about. The pursuit of knowledge has frequently sought to disengage thinking from the body, to elevate ideas to a cerebral sphere separate from our grubby animal anatomy. Research on the extended mind counsels the opposite approach: we should be seeking to draw the body back into the thinking process. That may take the form of allowing our choices to be influenced by our interoceptive signals—a source of guidance we’ve often ignored in our focus on data-driven decisions. It might take the form of enacting, with bodily movements, the academic concepts that have become abstracted, detached from their origin in the physical world. Or it might take the form of attending to our own and others’ gestures, tuning back in to what was humanity’s first language, present long before speech. As we’ve seen from research on embodied cognition, at a deep level the brain still understands abstract concepts in terms of physical action, a fact reflected in the words we use (“reaching for a goal,” “running behind schedule”); we can assist the brain in its efforts by bringing the literal body back into the act of thinking.
Annie Murphy Paul (The Extended Mind: The Power of Thinking Outside the Brain)
These stories are real, the dreams are real, yet the dilemmas each person faces are founded on the presences that haunt from their past. We see again the twin mechanisms present in all relationships: projection and transference. Each of them, meeting any stranger, reflexively scans the data of history for clues, expectations, possibilities. This scanning mechanism is instantaneous, mostly unconscious, and then the lens of history slips over one's eyes. This refractive lens alters the reality of the other and brings to consciousness a necessarily distorted picture. Attached to that particular lens is a particular history, the dynamics, the script, the outcomes of which are part of the transferred package. Freud once humorously speculated that when a couple goes to bed there are six people jammed together because the spectral presences of the parents are unavoidable. One would have to add to this analogy the reminder that those parents also import their own relational complexes from their parents, so we quickly have fourteen underfoot, not to mention the persistence of even more ancestral influences. How could intimate relationships not be congested arenas? As shopworn as the idea seems, we cannot overemphasize the importance of primal imagoes playing a domineering role in our relational patterns. They may be unconscious, which grants them inordinate power, or we may flee them, but they are always present. Thus, for example, wherever the parent is stuck—such as Damon's mother who only equates sexuality with the perverse and the unappealing, and his father who stands de-potentiated and co-opted—so the child will feel similarly constrained or spend his or her life trying to break away (“anything but that”) and still be defined by someone else's journey. How could Damon not feel depressed, then, at his own stuckness, and how could he not approach intimacy with such debilitating ambivalence?
James Hollis (Hauntings: Dispelling the Ghosts Who Run Our Lives)
Later, in the late twentieth century and the early twenty-first, this thinking informed theories of AI and machine learning. Such theories posited that AI’s potential lay partly in its ability to scan large data sets to learn types and patterns—e.g., groupings of words often found together, or features most often present in an image when that image was of a cat—and then to make sense of reality by identifying networks of similarities and likenesses with what the AI already knew. Even if AI would never know something in the way a human mind could, an accumulation of matches with the patterns of reality could approximate and sometimes exceed the performance of human perception and reason.
Henry Kissinger (The Age of A.I. and Our Human Future)
It has often been claimed that there has been very little change in the average real income of American households over a period of decades. It is an undisputed fact that the average real income—that is, money income adjusted for inflation—of American households rose by only 6 percent over the entire period from 1969 to 1996. That might well be considered to qualify as stagnation. But it is an equally undisputed fact that the average real income per person in the United States rose by 51 percent over that very same period.3 How can both these statistics be true? Because the average number of individuals per household has been declining over the years. Half the households in the United States contained six or more people in 1900, as did 21 percent in 1950. But, by 1998, only ten percent of American households had that many people.4 The average number of persons per household not only varies over time, it also varies from one racial or ethnic group to another at a given time, and varies from one income bracket to another. As of 2007, for example, black household income was lower than Hispanic household income, even though black per capita income was higher than Hispanic per capita income, because black households average fewer people than Hispanic households. Similarly, Asian American household income was higher than white household income, even though white per capita income was higher than Asian American per capita income, because Asian American households average more people.5 Income comparisons using household statistics are far less reliable indicators of standards of living than are individual income data because households vary in size while an individual always means one person. Studies of what people actually consume—that is, their standard of living—show substantial increases over the years, even among the poor,6 which is more in keeping with a 51 percent increase in real per capita income than with a 6 percent increase in real household income. 
But household income statistics present golden opportunities for fallacies to flourish, and those opportunities have been seized by many in the media, in politics, and in academia.
Thomas Sowell (Economic Facts and Fallacies)
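The arithmetic behind Sowell's point can be checked directly. The sketch below uses the quote's 6% and 51% figures; the 1969 household size of 3.2 persons is an assumed, illustrative value (not from the quote), chosen only to show how much households must shrink for both statistics to hold at once:

```python
# How per-capita income can rise 51% while household income rises
# only 6%: households got smaller. 51% and 6% are from the quote;
# the starting household size of 3.2 persons is an assumption.

percap_1969 = 100.0                     # index value, 1969 = 100
percap_1996 = percap_1969 * 1.51        # +51% real per-capita growth

size_1969 = 3.2                         # assumed persons per household, 1969
household_1969 = percap_1969 * size_1969

household_1996 = household_1969 * 1.06  # +6% real household-income growth
size_1996 = household_1996 / percap_1996

print(round(size_1996, 2))              # implied 1996 household size, ~2.25
print(round(size_1996 / size_1969, 2))  # households ~30% smaller
```

Whatever starting size is assumed, the implied shrinkage ratio is the same (1.06/1.51 ≈ 0.70), which is the heart of Sowell's argument.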
Even so, putting all exaggerations aside, sound neuroscience really is providing us with an ever richer picture of the brain and its operations, and in some far distant epoch may actually achieve something like a comprehensive survey of what is perhaps the single most complex physical object in the universe. That is all entirely irrelevant to my argument, however. My claim here is that, whatever we may learn about the brain in the future, it will remain in principle impossible to produce any entirely mechanistic account of the conscious mind, for a great many reasons (many of which I shall soon address), and that therefore consciousness is a reality that defeats mechanistic or materialist thinking. For the intuitions of folk psychology are in fact perfectly accurate; they are not merely some theory about the mind that is either corrigible or dispensable. They constitute nothing less than a full and coherent phenomenological description of the life of the mind, and they are absolutely “primordial data,” which cannot be abandoned in favor of some alternative description without producing logical nonsense. Simply said, consciousness as we commonly conceive of it is quite real (as all of us, apart from a few cognitive scientists and philosophers, already know—and they know it too, really). And this presents a problem for materialism, because consciousness as we commonly conceive of it is also almost certainly irreconcilable with a materialist view of reality.
David Bentley Hart (The Experience of God: Being, Consciousness, Bliss)
In the longer term, by bringing together enough data and enough computing power, the data giants could hack the deepest secrets of life, and then use this knowledge not just to make choices for us or manipulate us but also to reengineer organic life and create inorganic life-forms. Selling advertisements may be necessary to sustain the giants in the short term, but tech companies often evaluate apps, products, and other companies according to the data they harvest rather than according to the money they generate. A popular app may lack a business model and may even lose money in the short term, but as long as it sucks data, it could be worth billions.4 Even if you don’t know how to cash in on the data today, it is worth having it because it might hold the key to controlling and shaping life in the future. I don’t know for certain that the data giants explicitly think about this in such terms, but their actions indicate that they value the accumulation of data in terms beyond those of mere dollars and cents. Ordinary humans will find it very difficult to resist this process. At present, people are happy to give away their most valuable asset—their personal data—in exchange for free email services and funny cat videos. It’s a bit like African and Native American tribes who unwittingly sold entire countries to European imperialists in exchange for colorful beads and cheap trinkets. If, later on, ordinary people decide to try to block the flow of data, they might find it increasingly difficult, especially as they might come to rely on the network for all their decisions, and even for their healthcare and physical survival.
Yuval Noah Harari (21 Lessons for the 21st Century)
The pursuit of the past makes you aware, whether you are novelist or historian, of the dangers of your own fallibility and inbuilt bias. The writer of history is a walking anachronism, a displaced person, using today’s techniques to try to know things about yesterday that yesterday didn’t know itself. He must try to work authentically, hearing the words of the past, but communicating in a language the present understands. The historian, the biographer, the writer of fiction work within different constraints, but in a way that is complementary, not opposite. The novelist’s trade is never just about making things up. The historian’s trade is never simply about stockpiling facts. Even the driest, most data-driven research involves an element of interpretation. Deep research in the archives can be reported in tabular form and lists, by historians talking to each other. But to talk to their public, they use the same devices as all storytellers – selection, elision, artful arrangement.
Hilary Mantel
Here are some practical Dataist guidelines for you: 'You want to know who you really are?' asks Dataism. 'Then forget about mountains and museums. Have you had your DNA sequenced? No?! What are you waiting for? Go and do it today. And convince your grandparents, parents and siblings to have their DNA sequenced too – their data is very valuable for you. And have you heard about these wearable biometric devices that measure your blood pressure and heart rate twenty-four hours a day? Good – so buy one of those, put it on and connect it to your smartphone. And while you are shopping, buy a mobile camera and microphone, record everything you do, and put it online. And allow Google and Facebook to read all your emails, monitor all your chats and messages, and keep a record of all your Likes and clicks. If you do all that, then the great algorithms of the Internet-of-All-Things will tell you whom to marry, which career to pursue and whether to start a war.' But where do these great algorithms come from? This is the mystery of Dataism. Just as according to Christianity we humans cannot understand God and His plan, so Dataism declares that the human brain cannot fathom the new master algorithms. At present, of course, the algorithms are mostly written by human hackers. Yet the really important algorithms – such as the Google search algorithm – are developed by huge teams. Each member understands just one part of the puzzle, and nobody really understands the algorithm as a whole. Moreover, with the rise of machine learning and artificial neural networks, more and more algorithms evolve independently, improving themselves and learning from their own mistakes. They analyse astronomical amounts of data that no human can possibly encompass, and learn to recognise patterns and adopt strategies that escape the human mind.
The seed algorithm may initially be developed by humans, but as it grows it follows its own path, going where no human has gone before – and where no human can follow.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
In the twenty-first century it sounds childish to compare the human psyche to a steam engine. Today we know of a far more sophisticated technology – the computer – so we explain the human psyche as if it were a computer processing data rather than a steam engine regulating pressure. But this new analogy may turn out to be just as naïve. After all, computers have no minds. They don’t crave anything even when they have a bug, and the Internet doesn’t feel pain even when authoritarian regimes sever entire countries from the Web. So why use computers as a model for understanding the mind? Well, are we really sure that computers have no sensations or desires? And even if they haven’t got any at present, perhaps once they become complex enough they might develop consciousness? If that were to happen, how could we ascertain it? When computers replace our bus driver, our teacher and our shrink, how could we determine whether they have feelings or whether they are just a collection of mindless algorithms? When
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
In the quantum theory, you start with a symmetry, and then you break it with the Higgs boson to get the universe that we see all around us. Similarly, Guth then theorized that maybe there was a new type of Higgs boson (called the inflaton) that made inflation possible. As with the original Higgs boson, the universe started out in the false vacuum that gave us the era of rapid inflation. But then quantum bubbles occurred within the inflaton field. Inside the bubble, the true vacuum emerged, where the inflation had stopped. Our universe emerged as one of these bubbles. The universe slowed down within the bubble, giving us the present-day expansion. So far, inflation seems to fit the astronomical data. It is currently the leading theory. But it has unexpected consequences. If we invoke the quantum theory, it means that the Big Bang can happen again and again. New universes may be being born out of our universe all the time. This means that our universe is actually a single bubble in a bubble bath of universes. This creates a multiverse of parallel universes.
Michio Kaku (The God Equation: The Quest for a Theory of Everything)
On May 21, 1941, Camp de Schirmeck, Natzweiler-Struthof, located 31 miles southwest of Strasbourg in the Vosges Mountains, was opened as the only Nazi concentration camp established on present-day French territory. Intended to be a transit labor camp, it held about 52,000 detainees during the three and a half years of its existence. It is estimated that about 22,000 people died of malnutrition and exertion while at the concentration camp during those years. Natzweiler-Struthof was the location of the infamous Jewish skeleton collection used in the documentary movie “Le nom des 86” made from data provided by the notorious Hauptsturmführer August Hirt. On November 23, 1944, the camp was liberated by the French First Army under the command of the U.S. Sixth Army Group. It is presently preserved as a museum. Boris Pahor, the noted author, was interned in Natzweiler-Struthof for having been a Slovene Partisan, and wrote his novel “Necropolis,” named for a large, ancient Greek cemetery. His story is based on his Holocaust experiences while incarcerated at Camp de Schirmeck.
Hank Bracker
As we’ve seen, one of the most frequently pursued paths for achievement-minded college seniors is to spend several years advancing professionally and getting trained and paid by an investment bank, consulting firm, or law firm. Then, the thought process goes, they can set out to do something else with some exposure and experience under their belts. People are generally not making lifelong commitments to the field in their own minds. They’re “getting some skills” and making some connections before figuring out what they really want to do. I subscribed to a version of this mind-set when I graduated from Brown. In my case, I went to law school thinking I’d practice for a few years (and pay down my law school debt) before lining up another opportunity. It’s clear why this is such an attractive approach. There are some immensely constructive things about spending several years in professional services after graduating from college. Professional service firms are designed to train large groups of recruits annually, and they do so very successfully. After even just a year or two in a high-level bank or consulting firm, you emerge with a set of skills that can be applied in other contexts (financial modeling in Excel if you’re a financial analyst, PowerPoint and data organization and presentation if you’re a consultant, and editing and issue spotting if you’re a lawyer). This is very appealing to most any recent graduate who may not yet feel equipped with practical skills coming right out of college. Even more than the professional skill you gain, if you spend time at a bank, consultancy, or law firm, you will become excellent at producing world-class work. Every model, report, presentation, or contract needs to be sophisticated, well done, and error free, in large part because that’s one of the core value propositions of your organization. 
The people above you will push you to become more rigorous and disciplined, and your work product will improve across the board as a result. You’ll get used to dressing professionally, preparing for meetings, speaking appropriately, showing up on time, writing official correspondence, and so forth. You will be able to speak the corporate language. You’ll become accustomed to working very long hours doing detail-intensive work. These attributes are transferable to and helpful in many other contexts.
Andrew Yang (Smart People Should Build Things: How to Restore Our Culture of Achievement, Build a Path for Entrepreneurs, and Create New Jobs in America)
Two observations take us across the finish line. The Second Law ensures that entropy increases throughout the entire process, and so the information hidden within the hard drives, Kindles, old-fashioned paper books, and everything else you packed into the region is less than that hidden in the black hole. From the results of Bekenstein and Hawking, we know that the black hole's hidden information content is given by the area of its event horizon. Moreover, because you were careful not to overspill the original region of space, the black hole's event horizon coincides with the region's boundary, so the black hole's entropy equals the area of this surrounding surface. We thus learn an important lesson. The amount of information contained within a region of space, stored in any objects of any design, is always less than the area of the surface that surrounds the region (measured in square Planck units). This is the conclusion we've been chasing. Notice that although black holes are central to the reasoning, the analysis applies to any region of space, whether or not a black hole is actually present. If you max out a region's storage capacity, you'll create a black hole, but as long as you stay under the limit, no black hole will form. I hasten to add that in any practical sense, the information storage limit is of no concern. Compared with today's rudimentary storage devices, the potential storage capacity on the surface of a spatial region is humongous. A stack of five off-the-shelf terabyte hard drives fits comfortably within a sphere of radius 50 centimeters, whose surface is covered by about 10^70 Planck cells. The surface's storage capacity is thus about 10^70 bits, which is about a billion, trillion, trillion, trillion, trillion terabytes, and so enormously exceeds anything you can buy. No one in Silicon Valley cares much about these theoretical constraints. Yet as a guide to how the universe works, the storage limitations are telling.
Think of any region of space, such as the room in which I'm writing or the one in which you're reading. Take a Wheelerian perspective and imagine that whatever happens in the region amounts to information processing: information regarding how things are right now is transformed by the laws of physics into information regarding how they will be in a second or a minute or an hour. Since the physical processes we witness, as well as those by which we're governed, seemingly take place within the region, it's natural to expect that the information those processes carry is also found within the region. But the results just derived suggest an alternative view. For black holes, we found that the link between information and surface area goes beyond mere numerical accounting; there's a concrete sense in which information is stored on their surfaces. Susskind and 't Hooft stressed that the lesson should be general: since the information required to describe physical phenomena within any given region of space can be fully encoded by data on a surface that surrounds the region, then there's reason to think that the surface is where the fundamental physical processes actually happen. Our familiar three-dimensional reality, these bold thinkers suggested, would then be likened to a holographic projection of those distant two-dimensional physical processes. If this line of reasoning is correct, then there are physical processes taking place on some distant surface that, much like a puppeteer pulls strings, are fully linked to the processes taking place in my fingers, arms, and brain as I type these words at my desk. Our experiences here, and that distant reality there, would form the most interlocked of parallel worlds. Phenomena in the two (I'll call them Holographic Parallel Universes) would be so fully joined that their respective evolutions would be as connected as me and my shadow.
Brian Greene (The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos)
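Greene's figure of roughly 10^70 Planck cells on a 50-centimeter sphere is easy to verify. The short sketch below uses the standard CODATA value for the Planck length; everything else follows directly from the quote:

```python
import math

# Check the quote's claim: a sphere of radius 50 cm has ~10^70
# Planck cells on its surface (one cell = one Planck length squared).

planck_length = 1.616e-35         # metres (standard CODATA value)
planck_area = planck_length ** 2  # one "Planck cell", ~2.6e-70 m^2

r = 0.5                           # 50 cm radius, in metres
surface_area = 4 * math.pi * r**2 # ~3.14 m^2

cells = surface_area / planck_area
print(f"{cells:.1e}")             # ~1.2e+70, i.e. about 10^70 bits
```

The result, about 1.2 × 10^70, matches the order of magnitude Greene quotes for the holographic storage bound.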
Among the objections to the reality of objects of sense, there is one which is derived from the apparent difference between matter as it appears in physics and things as they appear in sensation. Men of science, for the most part, are willing to condemn immediate data as "merely subjective," while yet maintaining the truth of the physics inferred from those data. But such an attitude, though it may be *capable* of justification, obviously stands in need of it; and the only justification possible must be one which exhibits matter as a logical construction from sense-data―unless, indeed, there were some wholly *a priori* principle by which unknown entities could be inferred from such as are known. It is therefore necessary to find some way of bridging the gulf between the world of physics and the world of sense, and it is this problem which will occupy us in the present lecture. Physicists appear to be unconscious of the gulf, while psychologists, who are conscious of it, have not the mathematical knowledge required for spanning it. The problem is difficult, and I do not know its solution in detail. All that I can hope to do is to make the problem felt, and to indicate the kind of methods by which a solution is to be sought." ―from _Our Knowledge of the External World_, p. 107.
Bertrand Russell
It was common knowledge at one prominent women’s brand I worked for that the reason they didn’t have more women of color, specifically Black women, on their legacy magazine covers was because they didn’t sell as well. For a business enterprise, and a financially struggling one at that, the editorial strategy to routinely flood the covers with normatively sized straight white women was presented as necessary business, and not a deeply racist lens. But this is where I’ve encountered capitalism to be at its most damaging: it provides an all-encompassing language to code racism, heterosexism, and classism as something else—to establish distance between these deeply coursing prejudices and the unavoidable realities of running a business. This distance insulates. It establishes an alternative reality in which testimonials, diversity reports, investigations, and data analysis on representation don’t resonate because making money is the ultimate objective above all else. But that’s all the more reason why the impetus to drive profits also needs to be aligned and analyzed in endeavors against oppression. Because the drive to make money, more money, more money than your competitors, more money than you made last year, more money than projected for the following year is an enduring vehicle for suppression.
Koa Beck (White Feminism: From the Suffragettes to Influencers and Who They Leave Behind)
Under the direction of General Westmoreland, significantly himself a graduate of the Harvard Business School in which McNamara had at one time taught, the computers zestfully went to work. Fed on forms that had to be filled in by the troops, they digested data on everything from the amount of rice brought to local markets to the number of incidents that had taken place in a given region in a given period of time. They then spewed forth a mighty stream of tables and graphs which purported to measure “progress” week by week and day by day. So long as the tables looked neat, few people bothered to question the accuracy, let alone the relevance, of the data on which they were based. So long as they looked neat, too, the illusion of having a grip on the war helped prevent people from attempting to gain a real understanding of its nature. This is not to say that the Vietnam War was lost simply because the American defense establishment’s management of the conflict depended heavily on computers. Rather, it proves that there is, in war and presumably in peace as well, no field so esoteric or so intangible as to be completely beyond the reach of technology. The technology in use helps condition tactics, strategy, organization, logistics, intelligence, command, control, and communication. Now, however, we are faced with an additional reality. Not only the conduct of war, but the very framework our brains employ in order to think about it, are partly conditioned by the technical instruments at our disposal.
Martin van Creveld (Technology and War: From 2000 B.C. to the Present)
Thought is measured by a different rule, and puts us in mind, rather, of those souls whose number, according to certain ancient myths, is limited. There was in that time a limited contingent of souls or spiritual substance, redistributed from one living creature to the next as successive deaths occurred. With the result that some bodies were sometimes waiting for a soul (like present-day heart patients waiting for an organ donor). On this hypothesis, it is clear that the more human beings there are, the rarer will be those who have a soul. Not a very democratic situation and one which might be translated today into: the more intelligent beings there are (and, by the grace of information technology, they are virtually all intelligent), the rarer thought will be. Christianity was first to institute a kind of democracy and generalized right to a personal soul (it wavered for a long time where women were concerned). The production of souls increased substantially as a result, like the production of banknotes in an inflationary period, and the concept of soul was greatly devalued. It no longer really has any currency today and it has ceased to be traded on the exchanges. There are too many souls on the market today. That is to say, recycling the metaphor, there is too much information, too much meaning, too much immaterial data for the bodies that are left, too much grey matter for the living substance that remains. To the point where the situation is no longer that of bodies in search of a soul, as in the archaic liturgies, but of innumerable souls in search of a body. Or an incalculable knowledge in search of a knowing subject.
Jean Baudrillard (The Intelligence of Evil or the Lucidity Pact (Talking Images))
GCHQ has traveled a long and winding road. That road stretches from the wooden huts of Bletchley Park, past the domes and dishes of the Cold War, and on towards what some suggest will be the omniscient state of the Brave New World. As we look to the future, the docile and passive state described by Aldous Huxley in his Brave New World is perhaps a more appropriate analogy than the strictly totalitarian predictions offered by George Orwell's Nineteen Eighty-Four. Bizarrely, many British citizens are quite content in this new climate of hyper-surveillance, since it's their own lifestyle choices that helped to create the 'wired world' - or even wish for it, for as we have seen, the new torrents of data have been a source of endless trouble for the overstretched secret agencies. As Ken Macdonald rightly points out, the real drivers of our wired world have been private companies looking for growth, and private individuals in search of luxury and convenience at the click of a mouse. The sigint agencies have merely been handed the impossible task of making an interconnected society perfectly secure and risk-free, against the background of a globalized world that presents many unprecedented threats, and now has few boundaries or borders to protect us. Who, then, is to blame for the rapid intensification of electronic surveillance? Instinctively, many might reply Osama bin Laden, or perhaps Pablo Escobar. Others might respond that governments have used these villains as a convenient excuse to extend state control. At first glance, the massive growth of security, which includes not only eavesdropping but also biometric monitoring, face recognition, universal fingerprinting and the gathering of DNA, looks like a sad response to new kinds of miscreants. However, the sad reality is that the Brave New World that looms ahead of us is ultimately a reflection of ourselves.
It is driven by technologies such as text messaging and customer loyalty cards that we are free to accept or reject as we choose. The public debate on surveillance is often cast in terms of a trade-off between security and privacy. The truth is that luxury and convenience have been pre-eminent themes in the last decade, and we have given them a much higher priority than either security or privacy. We have all embraced the world of surveillance with remarkable eagerness, surfing the Internet in a global search for a better bargain, better friends, even a better partner. GCHQ's vast new circular headquarters is sometimes represented as a 'ring of power', exercising unparalleled levels of surveillance over citizens at home and abroad, collecting every email, every telephone call and every instance of internet access. It has even been asserted that GCHQ is engaged in nothing short of 'algorithmic warfare' as part of a battle for control of global communications. By contrast, the occupants of 'Cheltenham's Doughnut' claim that in reality they are increasingly weak, having been left behind by the unstoppable electronic communications that they cannot hope to listen to, still less analyse or make sense of. In fact, the frightening truth is that no one is in control. No person, no intelligence agency and no government is steering the accelerating electronic processes that may eventually enslave us. Most of the devices that cause us to leave a continual digital trail of everything we think or do were not devised by the state, but are merely symptoms of modernity. GCHQ is simply a vast mirror, and it reflects the spirit of the age.
Richard J. Aldrich (GCHQ)
Well before the end of the 20th century, however, print had lost its former dominance. This resulted in, among other things, a different kind of person getting elected as leader: one who can present himself and his programs in a polished way, as Lee Kuan Yew observed in 2000, adding, "Satellite television has allowed me to follow the American presidential campaign. I am amazed at the way media professionals can give a candidate a new image and transform him, at least superficially, into a different personality. Winning an election becomes, in large measure, a contest in packaging and advertising." Just as the benefits of the printed era were inextricable from its costs, so it is with the visual age. With screens in every home, entertainment is omnipresent and boredom a rarity. More substantively, injustice visualized is more visceral than injustice described. Television played a crucial role in the American civil rights movement, yet the costs of television are substantial, privileging emotional display over self-command, changing the kinds of people and arguments that are taken seriously in public life. The shift from print to visual culture continues with the contemporary entrenchment of the Internet and social media, which bring with them four biases that make it more difficult for leaders to develop their capabilities than in the age of print. These are immediacy, intensity, polarity, and conformity. Although the Internet makes news and data more immediately accessible than ever, this surfeit of information has hardly made us individually more knowledgeable, let alone wiser. As the cost of accessing information becomes negligible, as with the Internet, the incentives to remember it seem to weaken. While forgetting any one fact may not matter, the systematic failure to internalize information brings about a change in perception, and a weakening of analytical ability.
Facts are rarely self-explanatory; their significance and interpretation depend on context and relevance. For information to be transmuted into something approaching wisdom, it must be placed within a broader context of history and experience. As a general rule, images speak at a more emotional register of intensity than do words. Television and social media rely on images that inflame the passions, threatening to overwhelm leadership with the combination of personal and mass emotion. Social media, in particular, have encouraged users to become image-conscious spin doctors. All this engenders a more populist politics that celebrates utterances perceived to be authentic over the polished sound bites of the television era, not to mention the more analytical output of print. The architects of the Internet thought of their invention as an ingenious means of connecting the world. In reality, it has also yielded a new way to divide humanity into warring tribes. Polarity and conformity rely upon, and reinforce, each other. One is shunted into a group, and then the group polices one's thinking. Small wonder that on many contemporary social media platforms, users are divided into followers and influencers. There are no leaders. What are the consequences for leadership? In our present circumstances, Lee's gloomy assessment of visual media's effects is relevant: "From such a process, I doubt if a Churchill or Roosevelt or a de Gaulle can emerge." It is not that changes in communications technology have made inspired leadership and deep thinking about world order impossible, but that in an age dominated by television and the Internet, thoughtful leaders must struggle against the tide.
Henry Kissinger (Leadership : Six Studies in World Strategy)
Again you must learn the point which comes next. Every circle, of those which are by the act of man drawn or even turned on a lathe, is full of that which is opposite to the fifth thing. For everywhere it has contact with the straight. But the circle itself, we say, has nothing in either smaller or greater, of that which is its opposite. We say also that the name is not a thing of permanence for any of them, and that nothing prevents the things now called round from being called straight, and the straight things round; for those who make changes and call things by opposite names, nothing will be less permanent (than a name). Again with regard to the definition, if it is made up of names and verbal forms, the same remark holds that there is no sufficiently durable permanence in it. And there is no end to the instances of the ambiguity from which each of the four suffers; but the greatest of them is that which we mentioned a little earlier, that, whereas there are two things, that which has real being, and that which is only a quality, when the soul is seeking to know, not the quality, but the essence, each of the four, presenting to the soul by word and in act that which it is not seeking (i.e., the quality), a thing open to refutation by the senses, being merely the thing presented to the soul in each particular case whether by statement or the act of showing, fills, one may say, every man with puzzlement and perplexity. [...] But in subjects where we try to compel a man to give a clear answer about the fifth, any one of those who are capable of overthrowing an antagonist gets the better of us, and makes the man, who gives an exposition in speech or writing or in replies to questions, appear to most of his hearers to know nothing of the things on which he is attempting to write or speak; for they are sometimes not aware that it is not the mind of the writer or speaker which is proved to be at fault, but the defective nature of each of the four instruments. 
The process however of dealing with all of these, as the mind moves up and down to each in turn, does after much effort give birth in a well-constituted mind to knowledge of that which is well constituted. [...] Therefore, if men are not by nature kinship allied to justice and all other things that are honourable, though they may be good at learning and remembering other knowledge of various kinds-or if they have the kinship but are slow learners and have no memory-none of all these will ever learn to the full the truth about virtue and vice. For both must be learnt together; and together also must be learnt, by complete and long continued study, as I said at the beginning, the true and the false about all that has real being. After much effort, as names, definitions, sights, and other data of sense, are brought into contact and friction one with another, in the course of scrutiny and kindly testing by men who proceed by question and answer without ill will, with a sudden flash there shines forth understanding about every problem, and an intelligence whose efforts reach the furthest limits of human powers. Therefore every man of worth, when dealing with matters of worth, will be far from exposing them to ill feeling and misunderstanding among men by committing them to writing. In one word, then, it may be known from this that, if one sees written treatises composed by anyone, either the laws of a lawgiver, or in any other form whatever, these are not for that man the things of most worth, if he is a man of worth, but that his treasures are laid up in the fairest spot that he possesses. But if these things were worked at by him as things of real worth, and committed to writing, then surely, not gods, but men "have themselves bereft him of his wits".
Plato (The Letters)
...one of the most powerful examples of group feeling and belief affecting a broad geographic area was documented as a daring experiment during the war between Lebanon and Israel that began in 1982. It was during that time that researchers trained a group of people to "feel" peace in their bodies while believing that it was already present within them, rather than simply thinking about it in their minds or praying "for" it to occur. For this particular experiment, those involved used a form of meditation known as TM (Transcendental Meditation) to achieve that feeling. At appointed times on specific days of the month, these people were positioned throughout the war-torn areas of the Middle East. During the window of time when they were feeling peace, terrorist activities ceased, the rate of crimes against people went down, the number of emergency-room visits declined, and the incidence of traffic accidents dropped. When the participants' feelings changed, the statistics were reversed. This study confirmed the earlier findings: When a small percentage of the population achieved peace within themselves, it was reflected in the world around them. The experiments took into account the days of the week, holidays, and even lunar cycles; and the data was so consistent that the researchers were able to identify how many people are needed to share the experience of peace before it's mirrored in their world. The number is the square root of one percent of the population. This formula produces figures that are smaller than we might expect. For example, in a city of one million people, the number is about 100. In a world of 6 billion, it's just under 8,000. This calculation represents only the minimum needed to begin the process. The more people involved in feeling peace, the faster the effect is created. The study became known as the International Peace Project in the Middle East...
Gregg Braden (The Spontaneous Healing of Belief: Shattering the Paradigm of False Limits)
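The "square root of one percent" figure in the quote above is easy to check arithmetically. The following is a minimal sketch of the calculation; the function name `threshold` is my own label, and the code simply restates the quote's claimed formula rather than endorsing the underlying study:

```python
import math

def threshold(population: int) -> float:
    # Square root of one percent of the population --
    # the minimum group size claimed in the quote above.
    return math.sqrt(0.01 * population)

print(round(threshold(1_000_000)))      # city of one million -> 100
print(round(threshold(6_000_000_000)))  # world of 6 billion -> 7746, "just under 8,000"
```

Both worked examples in the quote match this formula: 1% of one million is 10,000, whose square root is exactly 100, and 1% of six billion is 60 million, whose square root is about 7,746.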
To give you a sense of the sheer volume of unprocessed information that comes up the spinal cord into the thalamus, let’s consider just one aspect: vision, since many of our memories are encoded this way. There are roughly 130 million cells in the eye’s retina, called cones and rods; they process and record 100 million bits of information from the landscape at any time. This vast amount of data is then collected and sent down the optic nerve, which transports 9 million bits of information per second, and on to the thalamus. From there, the information reaches the occipital lobe, at the very back of the brain. This visual cortex, in turn, begins the arduous process of analyzing this mountain of data. The visual cortex consists of several patches at the back of the brain, each of which is designed for a specific task. They are labeled V1 to V8. Remarkably, the area called V1 is like a screen; it actually creates a pattern on the back of your brain very similar in shape and form to the original image. This image bears a striking resemblance to the original, except that the very center of your eye, the fovea, occupies a much larger area in V1 (since the fovea has the highest concentration of neurons). The image cast on V1 is therefore not a perfect replica of the landscape but is distorted, with the central region of the image taking up most of the space. Besides V1, other areas of the occipital lobe process different aspects of the image, including:
•  Stereo vision. These neurons compare the images coming in from each eye. This is done in area V2.
•  Distance. These neurons calculate the distance to an object, using shadows and other information from both eyes. This is done in area V3.
•  Colors are processed in area V4.
•  Motion. Different circuits can pick out different classes of motion, including straight-line, spiral, and expanding motion. This is done in area V5.
More than thirty different neural circuits involved with vision have been identified, but there are probably many more. From the occipital lobe, the information is sent to the prefrontal cortex, where you finally “see” the image and form your short-term memory. The information is then sent to the hippocampus, which processes it and stores it for up to twenty-four hours. The memory is then chopped up and scattered among the various cortices. The point here is that vision, which we think happens effortlessly, requires billions of neurons firing in sequence, transmitting millions of bits of information per second. And remember that we have signals from five sense organs, plus emotions associated with each image. All this information is processed by the hippocampus to create a simple memory of an image. At present, no machine can match the sophistication of this process, so replicating it presents an enormous challenge for scientists who want to create an artificial hippocampus for the human brain.
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)
A famous British writer is revealed to be the author of an obscure mystery novel. An immigrant is granted asylum when authorities verify he wrote anonymous articles critical of his home country. And a man is convicted of murder when he’s connected to messages painted at the crime scene. The common element in these seemingly disparate cases is “forensic linguistics”—an investigative technique that helps experts determine authorship by identifying quirks in a writer’s style. Advances in computer technology can now parse text with ever-finer accuracy. Consider the recent outing of Harry Potter author J.K. Rowling as the writer of The Cuckoo’s Calling, a crime novel she published under the pen name Robert Galbraith. England’s Sunday Times, responding to an anonymous tip that Rowling was the book’s real author, hired Duquesne University’s Patrick Juola to analyze the text of Cuckoo, using software that he had spent over a decade refining. One of Juola’s tests examined sequences of adjacent words, while another zoomed in on sequences of characters; a third test tallied the most common words, while a fourth examined the author’s preference for long or short words. Juola wound up with a linguistic fingerprint—hard data on the author’s stylistic quirks. He then ran the same tests on four other books: The Casual Vacancy, Rowling’s first post-Harry Potter novel, plus three stylistically similar crime novels by other female writers. Juola concluded that Rowling was the most likely author of The Cuckoo’s Calling, since she was the only one whose writing style showed up as the closest or second-closest match in each of the tests. After consulting an Oxford linguist and receiving a concurring opinion, the newspaper confronted Rowling, who confessed. Juola completed his analysis in about half an hour.
By contrast, in the early 1960s, it had taken a team of two statisticians—using what was then a state-of-the-art, high-speed computer at MIT—three years to complete a project to reveal who wrote 12 unsigned Federalist Papers. Robert Leonard, who heads the forensic linguistics program at Hofstra University, has also made a career out of determining authorship. Certified to serve as an expert witness in 13 states, he has presented evidence in cases such as that of Christopher Coleman, who was arrested in 2009 for murdering his family in Waterloo, Illinois. Leonard testified that Coleman’s writing style matched threats spray-painted at his family’s home. Coleman was convicted and is serving a life sentence. Since forensic linguists deal in probabilities, not certainties, it is all the more essential to further refine this field of study, experts say. “There have been cases where it was my impression that the evidence on which people were freed or convicted was iffy in one way or another,” says Edward Finegan, president of the International Association of Forensic Linguists. Vanderbilt law professor Edward Cheng, an expert on the reliability of forensic evidence, says that linguistic analysis is best used when only a handful of people could have written a given text. As forensic linguistics continues to make headlines, criminals may realize the importance of choosing their words carefully. And some worry that software also can be used to obscure distinctive written styles. “Anything that you can identify to analyze,” says Juola, “I can identify and try to hide.”
Anonymous
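One of the four tests described in the excerpt above, tallying an author's most common words, can be sketched in a few lines of Python. This is a toy illustration of the general idea only, not Juola's actual software; the function names and the choice of cosine similarity as the comparison measure are my own assumptions:

```python
from collections import Counter
import math

def word_profile(text: str, top_n: int = 50) -> dict:
    # Relative frequencies of the text's most common words --
    # a crude stand-in for the "most common words" test.
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return {w: c / total for w, c in counts.most_common(top_n)}

def cosine_similarity(p: dict, q: dict) -> float:
    # Cosine similarity between two frequency profiles;
    # values near 1.0 suggest more similar word-usage habits.
    shared = set(p) | set(q)
    dot = sum(p.get(w, 0.0) * q.get(w, 0.0) for w in shared)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0
```

Real stylometric tools tokenize far more carefully (punctuation, casing, function words) and combine several independent tests, which is exactly how the article describes Juola narrowing the field to Rowling.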
Our patients predict the culture by living out consciously what the masses of people are able to keep unconscious for the time being. The neurotic is cast by destiny into a Cassandra role. In vain does Cassandra, sitting on the steps of the palace at Mycenae when Agamemnon brings her back from Troy, cry, “Oh for the nightingale’s pure song and a fate like hers!” She knows, in her ill-starred life, that “the pain flooding the song of sorrow is [hers] alone,” and that she must predict the doom she sees will occur there. The Mycenaeans speak of her as mad, but they also believe she does speak the truth, and that she has a special power to anticipate events. Today, the person with psychological problems bears the burdens of the conflicts of the times in his blood, and is fated to predict in his actions and struggles the issues which will later erupt on all sides in the society. The first and clearest demonstration of this thesis is seen in the sexual problems which Freud found in his Victorian patients in the two decades before World War I. These sexual topics‒even down to the words‒were entirely denied and repressed by the accepted society at the time. But the problems burst violently forth into endemic form two decades later after World War II. In the 1920's, everybody was preoccupied with sex and its functions. Not by the furthest stretch of the imagination can anyone argue that Freud "caused" this emergence. He rather reflected and interpreted, through the data revealed by his patients, the underlying conflicts of the society, which the “normal” members could and did succeed in repressing for the time being. Neurotic problems are the language of the unconscious emerging into social awareness. A second, more minor example is seen in the great amount of hostility which was found in patients in the 1930's. This was written about by Horney, among others, and it emerged more broadly and openly as a conscious phenomenon in our society a decade later. 
A third major example may be seen in the problem of anxiety. In the late 1930's and early 1940's, some therapists, including myself, were impressed by the fact that in many of our patients anxiety was appearing not merely as a symptom of repression or pathology, but as a generalized character state. My research on anxiety, and that of Hobart Mowrer and others, began in the early 1940's. In those days very little concern had been shown in this country for anxiety other than as a symptom of pathology. I recall arguing in the late 1940's, in my doctoral orals, for the concept of normal anxiety, and my professors heard me with respectful silence but with considerable frowning. Predictive as the artists are, the poet W. H. Auden published his Age of Anxiety in 1947, and just after that Bernstein wrote his symphony on that theme. Camus was then writing (1947) about this “century of fear,” and Kafka already had created powerful vignettes of the coming age of anxiety in his novels, most of them as yet untranslated. The formulations of the scientific establishment, as is normal, lagged behind what our patients were trying to tell us. Thus, at the annual convention of the American Psychopathological Association in 1949 on the theme “Anxiety,” the concept of normal anxiety, presented in a paper by me, was still denied by most of the psychiatrists and psychologists present. But in the 1950's a radical change became evident; everyone was talking about anxiety and there were conferences on the problem on every hand. Now the concept of "normal" anxiety gradually became accepted in the psychiatric literature. Everybody, normal as well as neurotic, seemed aware that he was living in the “age of anxiety.” What had been presented by the artists and had appeared in our patients in the late 30's and 40's was now endemic in the land.
Rollo May (Love and Will)