Data Structures Quotes

We've searched our database for all the quotes and captions related to Data Structures. Here they are!

“For the human brain,” Edmond explained, “any answer is better than no answer. We feel enormous discomfort when faced with ‘insufficient data,’ and so our brains invent the data—offering us, at the very least, the illusion of order—creating myriad philosophies, mythologies, and religions to reassure us that there is indeed an order and structure to the unseen world.”
Dan Brown (Origin (Robert Langdon, #5))
I will, in fact, claim that the difference between a bad programmer and a good one is whether he considers his code or his data structures more important. Bad programmers worry about the code. Good programmers worry about data structures and their relationships.
Linus Torvalds
Generally, the craft of programming is the factoring of a set of requirements into a set of functions and data structures.
Douglas Crockford (JavaScript: The Good Parts)
It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures.
Alan J. Perlis (Structure and Interpretation of Computer Programs)
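Perlis's aphorism is about API design: generic functions over one shared structure compose, while bespoke structures fragment. A minimal Python sketch of the idea (the record fields and function names here are illustrative, not from any quoted source):

```python
# Many small, composable functions all operating on ONE generic structure
# (a list of dicts), rather than a bespoke class per concept.

records = [
    {"name": "alice", "age": 34},
    {"name": "bob", "age": 27},
]

def project(rows, key):
    """Extract one field from every row."""
    return [row[key] for row in rows]

def where(rows, pred):
    """Keep only rows matching a predicate."""
    return [row for row in rows if pred(row)]

def order_by(rows, key):
    """Sort rows by a field."""
    return sorted(rows, key=lambda row: row[key])

# Because every function speaks the same structure, they compose freely:
names = project(order_by(where(records, lambda r: r["age"] > 25), "age"), "name")
print(names)  # ['bob', 'alice']
```

Each new function added to this style multiplies what all the existing ones can do; a function tied to its own private structure composes with nothing.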
Compassion is what you're good at. I'm better at complex searches through organized data structures.
Orson Scott Card (Speaker for the Dead (Ender's Saga, #2))
Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells.
Harold Abelson (Structure and Interpretation of Computer Programs)
Companies can learn a lot from biological systems. The human immune system for example is adaptive, redundant, diverse, modular, data-driven and network collaborative. A company that desires not just short term profit but also long term resilience should apply these features of the human immune system to its business models and company structure.
Hendrith Vanlon Smith Jr.
The string is a stark data structure and everywhere it is passed there is much duplication of process. It is a perfect vehicle for hiding information.
Alan J. Perlis
Temporality is obviously an organised structure, and these three so-called elements of time: past, present, future, must not be envisaged as a collection of 'data' to be added together...but as the structured moments of an original synthesis. Otherwise we shall immediately meet with this paradox: the past is no longer, the future is not yet, as for the instantaneous present, everyone knows that it is not at all: it is the limit of infinite division, like the dimensionless point.
Jean-Paul Sartre (Being and Nothingness)
Programming is a science dressed up as art, because most of us don’t understand the physics of software and it’s rarely, if ever, taught. The physics of software is not algorithms, data structures, languages, and abstractions. These are just tools we make, use, and throw away. The real physics of software is the physics of people. Specifically, it’s about our limitations when it comes to complexity and our desire to work together to solve large problems in pieces. This is the science of programming: make building blocks that people can understand and use easily, and people will work together to solve the very largest problems.
Pieter Hintjens (ZeroMQ: Messaging for Many Applications)
All so-called ‘quantitative’ data, when scrutinized, turn out to be composites of ‘qualitative’ – i.e., contextually located and indexical – interpretations produced by situated researchers, coders, government officials and others.
Anthony Giddens (The Constitution of Society: Outline of the Theory of Structuration)
In the coming decades, it is likely that we will see more Internet-like revolutions, in which technology steals a march on politics. Artificial intelligence and biotechnology might soon overhaul our societies and economies – and our bodies and minds too – but they are hardly a blip on our political radar. Our current democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics loses control of events, and fails to provide us with meaningful visions for the future.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
He liked to start sentences with okay, so. It was a habit he had picked up from the engineers. He thought it made him sound smarter, thought it made him sound like them, those code jockeys, standing by the coffee machine, talking faster than he could think, talking not so much in sentences as in data structures, dense clumps of logic with the occasional inside joke. He liked to stand near them, pretending to stir sugar into his coffee, listening in on them as if they were speaking a different language. A language of knowing something, a language of being an expert at something. A language of being something more than an hourly unit.
Charles Yu (Sorry Please Thank You)
Not all roots are buried down in the ground, some are at the top of a tree.
Jinvirle
Purpose gives meaning to action in the same way that structure gives meaning to data.
David Amerland (Intentional: How To Live, Love, Work and Play Meaningfully)
Until now, I've been writing about "now" as if it were literally an instant of time, but of course human faculties are not infinitely precise. It is simplistic to suppose that physical events and mental events march along exactly in step, with the stream of "actual moments" in the outside world and the stream of conscious awareness of them perfectly synchronized. The cinema industry depends on the phenomenon that what seems to us a movie is really a succession of still pictures, running at twenty-five [sic] frames per second. We don't notice the joins. Evidently the "now" of our conscious awareness stretches over at least 1/25 of a second. In fact, psychologists are convinced it can last a lot longer than that. Take the familiar "tick-tock" of the clock. Well, the clock doesn't go "tick-tock" at all; it goes "tick-tick," every tick producing the same sound. It's just that our consciousness runs two successive ticks into a single "tick-tock" experience—but only if the duration between ticks is less than about three seconds. A really big pendulum clock just goes "tock . . . tock . . . tock," whereas a bedside clock chatters away: "ticktockticktock..." Two to three seconds seems to be the duration over which our minds integrate sense data into a unitary experience, a fact reflected in the structure of human music and poetry.
Paul C.W. Davies (About Time: Einstein's Unfinished Revolution)
First identified by academic psychologist Leon Festinger, cognitive dissonance occurs when we are confronted with empirical data at odds with the way we “know” the world to work. To resolve this discrepancy, we choose to ignore data or try to fit the data into our preconceived belief structure. Sometimes, there is a crisis and the belief structure eventually crumbles.
David N. Schwartz (The Last Man Who Knew Everything: The Life and Times of Enrico Fermi, Father of the Nuclear Age)
All programs transform data, converting an input into an output. And yet when we think about design, we rarely think about creating transformations. Instead we worry about classes and modules, data structures and algorithms, languages and frameworks.
Andrew Hunt (The Pragmatic Programmer: Your Journey to Mastery, 20th Anniversary Edition)
Quote may seem a bit of a foreign concept, because few other languages have anything like it. It's closely tied to one of the most distinctive features of Lisp: code and data are made out of the same data structures, and the quote operator is the way we distinguish between them.
Paul Graham
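Lisp itself isn't shown in Graham's quote, but the code-is-data idea it describes can be loosely illustrated in Python with the standard-library `ast` module, where source text becomes an ordinary tree of objects (a rough analogue only; Python, unlike Lisp, keeps code and data in separate representations):

```python
import ast

# Parse a program into a data structure: a tree of plain Python objects.
tree = ast.parse("1 + 2", mode="eval")

# The program is now data we can inspect like any other structure...
print(ast.dump(tree.body))  # e.g. BinOp(left=Constant(value=1), op=Add(), right=Constant(value=2))

# ...and data we can turn back into running code.
print(eval(compile(tree, "<quoted>", "eval")))  # 3
```

In Lisp the two representations coincide exactly, which is why `quote` is needed to mark which one you mean.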
What the ethnographer is in fact faced with—except when (as, of course, he must do) he is pursuing the more automatized routines of data collection—is a multiplicity of complex conceptual structures, many of them superimposed upon or knotted into one another, which are at once strange, irregular, and inexplicit, and which he must contrive somehow first to grasp and then to render. And this is true at the most down-to-earth, jungle field work levels of his activity; interviewing informants, observing rituals, eliciting kin terms, tracing property lines, censusing households … writing his journal. Doing ethnography is like trying to read (in the sense of “construct a reading of”) a manuscript—foreign, faded, full of ellipses, incoherencies, suspicious emendations, and tendentious commentaries, but written not in conventionalized graphs of sound but in transient examples of shaped behavior.
Clifford Geertz (The Interpretation of Cultures)
Sherrie described atheism as a positive system of belief—one based on data, exploration and observation rather than scripture, creed and prayer. Atheists believe that human life is a chemical phenomenon, that our first parents were super-novas that happened billions of years ago—that humans are inexplicable miracles in a universe of structured chaos. Atheists believe that when we die, we will turn into organic debris which will continue cycling for billions of years in various incarnations. Sherrie explained that atheists appreciate life unfathomably because it is going to end. No one who takes atheism seriously dies without hope.
Israel Morrow (Gods of the Flesh: A Skeptic's Journey Through Sex, Politics and Religion)
Database Management System [Origin: Data + Latin basus "low, mean, vile, menial, degrading, counterfeit."] A complex set of interrelational data structures allowing data to be lost in many convenient sequences while retaining a complete record of the logical relations between the missing items. -- From The Devil's DP Dictionary
Stan Kelly-Bootle
It is very easy to grow tired at collecting; the period of a low tide is about all men can endure. At first the rocks are bright and every moving animal makes his mark on the attention. The picture is wide and colored and beautiful. But after an hour and a half the attention centers weary, the color fades, and the field is likely to narrow to an individual animal. Here one may observe his own world narrowed down until interest and, with it, observation, flicker and go out. And what if with age this weariness becomes permanent and observation dim out and not recover? Can this be what happens to so many men of science? Enthusiasm, interest, sharpness, dulled with a weariness until finally they retire into easy didacticism? With this weariness, this stultification of attention centers, perhaps there comes the pained and sad memory of what the old excitement was like, and regret might turn to envy of the men who still have it. Then out of the shell of didacticism, such a used-up man might attack the unwearied, and he would have in his hands proper weapons of attack. It does seem certain that to a wearied man an error in a mass of correct data wipes out all the correctness and is a focus for attack; whereas the unwearied man, in his energy and receptivity, might consider the little dross of error a by-product of his effort. These two may balance and produce a purer thing than either in the end. These two may be the stresses which hold up the structure, but it is a sad thing to see the interest in interested men thin out and weaken and die. We have known so many professors who once carried their listeners high on their single enthusiasm, and have seen these same men finally settle back comfortably into lectures prepared years before and never vary them again. Perhaps this is the same narrowing we observe in relation to ourselves and the tide pool—a man looking at reality brings his own limitations to the world. 
If he has strength and energy of mind the tide pool stretches both ways, digs back to electrons and leaps space into the universe and fights out of the moment into non-conceptual time. Then ecology has a synonym which is ALL.
John Steinbeck (The Log from the Sea of Cortez)
Now he was…dust. To an outside observer, these ten seconds had been ground up into ten thousand uncorrelated moments and scattered throughout real time - and in model time, the outside world had suffered an equivalent fate. Yet the pattern of his awareness remained perfectly intact: somehow he found himself, “assembled himself” from these scrambled fragments. He’d been taken apart like a jigsaw puzzle - but his dissection and shuffling were transparent to him. Somehow - on their own terms - the pieces remained connected. Imagine a universe entirely without structure, without shape, without connections. A cloud of microscopic events, like fragments of space-time … except that there is no space or time. What characterizes one point in space, for one instant? Just the values of the fundamental particle fields, just a handful of numbers. Now, take away all notions of position, arrangement, order, and what’s left? A cloud of random numbers. But if the pattern that is me could pick itself out from all the other events taking place on this planet, why shouldn’t the pattern we think of as ‘the universe’ assemble itself, find itself, in exactly the same way? If I can piece together my own coherent space and time from data scattered so widely that it might as well be part of some giant cloud of random numbers, then what makes you think that you’re not doing the very same thing?
Greg Egan (Permutation City)
Huge volumes of data may be compelling at first glance, but without an interpretive structure they are meaningless.
Tom Boellstorff (Ethnography and Virtual Worlds: A Handbook of Method)
It’s only because the data force us into corners that we are inspired to create the highly counterintuitive structures that form the basis for modern physics.
Sean Carroll (The Particle at the End of the Universe: How the Hunt for the Higgs Boson Leads Us to the Edge of a New World)
The interface theory says that space and time are not fundamental aspects of objective reality, but simply a data format for messages about fitness, a format evolved to compress and correct such messages. Objects in spacetime are not aspects of objective reality, but simply messages about fitness coded in a format of icons that is specific to the needs of Homo sapiens. In particular, our bodies are not aspects of objective reality, and our actions don’t give us direct access to preexisting objects in spacetime. Our bodies are messages about fitness that are coded as icons in a format specific to our species. When you perceive yourself sitting inside space and enduring through time, you’re actually seeing yourself as an icon inside your own data structure.
Donald D. Hoffman (The Case Against Reality: Why Evolution Hid the Truth from Our Eyes)
But when you look at the CMB map, you also see that the structure that is observed, is in fact, in a weird way, correlated with the plane of the earth around the sun. Is this Copernicus coming back to haunt us? That's crazy. We're looking out at the whole universe. There's no way there should be a correlation of structure with our motion of the earth around the sun - the plane of the earth around the sun - the ecliptic. That would say we are truly the center of the universe. The new results are either telling us that all of science is wrong and we're the center of the universe, or maybe the data is simply incorrect, or maybe it's telling us there's something weird about the microwave background results and that maybe, maybe there's something wrong with our theories on the larger scales.
Lawrence M. Krauss
According to the sex role and structural powerlessness hypothesis, women who have a lot of personal access to resources are predicted not to value resources in a mate as much as women lacking resources. This hypothesis receives no support from the existing empirical data, however. Indeed, women with high incomes value a potential mate’s income and education more, not less, than women with lower incomes.
David M. Buss (Evolutionary Psychology: The New Science of the Mind)
Data about your thoughts goes into a database owned by Google, what you buy into Amazon or Walmart, and what you owe into Experian or Equifax. You live in a world structured by concentrated corporate power.
Matt Stoller (Goliath: The 100-Year War Between Monopoly Power and Democracy)
Will you be encountering each other for the first time through this communication, or do you have an established relationship? Do they already trust you as an expert, or do you need to work to establish credibility? These are important considerations when it comes to determining how to structure your communication and whether and when to use data, and may impact the order and flow of the overall story you aim to tell.
Cole Nussbaumer Knaflic (Storytelling with Data: A Data Visualization Guide for Business Professionals)
Amidst all this organic plasticity and compromise, though, the infrastructure fields could still stake out territory for a few standardized subsystems, identical from citizen to citizen. Two of these were channels for incoming data—one for gestalt, and one for linear, the two primary modalities of all Konishi citizens, distant descendants of vision and hearing. By the orphan's two-hundredth iteration, the channels themselves were fully formed, but the inner structures to which they fed their data, the networks for classifying and making sense of it, were still undeveloped, still unrehearsed. Konishi polis itself was buried two hundred meters beneath the Siberian tundra, but via fiber and satellite links the input channels could bring in data from any forum in the Coalition of Polises, from probes orbiting every planet and moon in the solar system, from drones wandering the forests and oceans of Earth, from ten million kinds of scape or abstract sensorium. The first problem of perception was learning how to choose from this superabundance.
Greg Egan (Diaspora)
I was so struck by Flow’s negative implications for parents that I decided I wanted to speak to Csikszentmihalyi, just to make sure I wasn’t misreading him. And eventually I did, at a conference in Philadelphia where he was one of the marquee speakers. As we sat down to chat, the first thing I asked was why he talked so little about family life in Flow. He devotes only ten pages to it. “Let me tell you a couple of things that may be relevant to you,” he said. And then he told a personal story. When Csikszentmihalyi first developed the Experience Sampling Method, one of the first people he tried it out on was himself. “And at the end of the week,” he said, “I looked at my responses, and one thing that suddenly was very strange to me was that every time I was with my two sons, my moods were always very, very negative.” His sons weren’t toddlers at that point either. They were older. “And I said, ‘This doesn’t make any sense to me, because I’m very proud of them, and we have a good relationship.’ ” But then he started to look at what, specifically, he was doing with his sons that made his feelings so negative. “And what was I doing?” he asked. “I was saying, ‘It’s time to get up, or you will be late for school.’ Or, ‘You haven’t put away your cereal dish from breakfast.’ ” He was nagging, in other words, and nagging is not a flow activity. “I realized,” he said, “that being a parent consists, in large part, of correcting the growth pattern of a person who is not necessarily ready to live in a civilized society.” I asked if, in that same data set, he had any numbers about flow in family life. None were in his book. He said he did. “They were low. Family life is organized in a way that flow is very difficult to achieve, because we assume that family life is supposed to relax us and to make us happy. But instead of being happy, people get bored.” Or enervated, as he’d said before, when talking about disciplining his sons. 
And because children are constantly changing, the “rules” of handling them change too, which can further confound a family’s ability to flow. “And then we get into these spirals of conflict and so forth,” he continued. “That’s why I’m saying it’s easier to get into flow at work. Work is more structured. It’s structured more like a game. It has clear goals, you get feedback, you know what has to be done, there are limits.” He thought about this. “Partly, the lack of structure in family life, which seems to give people freedom, is actually a kind of an impediment.
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
Persinger and Lafreniere: We, as a species, exist in a world in which exist a myriad of data points. Upon these matrices of points we superimpose a structure and the world makes sense to us. The pattern of the structure originates within our biological and sociological properties.
Robert Anton Wilson (Prometheus Rising)
Despite all their surface diversity, most jokes and funny incidents have the following logical structure: Typically you lead the listener along a garden path of expectation, slowly building up tension. At the very end, you introduce an unexpected twist that entails a complete reinterpretation of all the preceding data, and moreover, it's critical that the new interpretation, though wholly unexpected, makes as much "sense" of the entire set of facts as did the originally "expected" interpretation. In this regard, jokes have much in common with scientific creativity, with what Thomas Kuhn calls a "paradigm shift" in response to a single "anomaly." (It's probably not coincidence that many of the most creative scientists have a great sense of humor.) Of course, the anomaly in the joke is the traditional punch line and the joke is "funny" only if the listener gets the punch line by seeing in a flash of insight how a completely new interpretation of the same set of facts can incorporate the anomalous ending. The longer and more tortuous the garden path of expectation, the "funnier" the punch line when finally delivered.
V.S. Ramachandran
Use # as an introducer for comments. It is good to have a way to embed annotations and comments in data files. It’s best if they’re actually part of the file structure, and so will be preserved by tools that know its format. For comments that are not preserved during parsing, # is the conventional start character.
Eric S. Raymond (Art of UNIX Programming, The (Addison-Wesley Professional Computing Series))
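Raymond's `#` convention is easy to see in practice. A minimal sketch of the "not preserved during parsing" case he mentions: a key=value data file where a reader simply discards everything after `#` (the file format and function names here are illustrative, not from the book):

```python
# A toy data file using '#' as the comment introducer.
SAMPLE = """\
# server settings (this whole line is a comment)
host = example.org   # trailing comments work too
port = 8080
"""

def parse(text):
    """Parse key=value lines, discarding '#' comments and blank lines."""
    settings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop the comment, then whitespace
        if not line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

print(parse(SAMPLE))  # {'host': 'example.org', 'port': '8080'}
```

A format-aware tool that wanted to *preserve* comments, as Raymond prefers, would instead keep them as nodes in its parsed structure rather than throwing them away here.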
Conspiracy theories—feverishly creative, lovingly plotted—are in fact fictional stories that some people believe. Conspiracy theorists connect real data points and imagined data points into a coherent, emotionally satisfying version of reality. Conspiracy theories exert a powerful hold on the human imagination—yes, perhaps even your imagination—not despite structural parallels with fiction, but in large part because of them. They fascinate us because they are ripping good yarns, showcasing classic problem structure and sharply defined good guys and villains. They offer vivid, lurid plots that translate with telling ease into wildly popular entertainment.
Jonathan Gottschall (The Storytelling Animal: How Stories Make Us Human)
Present-day democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics is losing control of events, and is failing to present us with meaningful visions of the future.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Today most of the debate on the cutting edge in macroeconomics would not call itself “Keynesian” or “monetarist” or any other label relating to a school of thought. The data are considered the ruling principle, and it is considered suspect to have too strong a loyalty to any particular model about the underlying structure of the economy.
Tyler Cowen (Average Is Over: Powering America Beyond the Age of the Great Stagnation)
As a thought experiment, von Neumann's analysis was simplicity itself. He was saying that the genetic material of any self-reproducing system, whether natural or artificial, must function very much like a stored program in a computer: on the one hand, it had to serve as live, executable machine code, a kind of algorithm that could be carried out to guide the construction of the system's offspring; on the other hand, it had to serve as passive data, a description that could be duplicated and passed along to the offspring. As a scientific prediction, that same analysis was breathtaking: in 1953, when James Watson and Francis Crick finally determined the molecular structure of DNA, it would fulfill von Neumann's two requirements exactly. As a genetic program, DNA encodes the instructions for making all the enzymes and structural proteins that the cell needs in order to function. And as a repository of genetic data, the DNA double helix unwinds and makes a copy of itself every time the cell divides in two. Nature thus built the dual role of the genetic material into the structure of the DNA molecule itself.
M. Mitchell Waldrop (The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal)
Ray Bradbury said that thinking is the enemy of creativity because it’s self-conscious. When you think you sit calmly and try to reason through something in a structured, logical way. Creativity dances to a different tune. Once you flip that switch, things get a bit chaotic. Ideas start buzzing. Images start popping into your head. Fragments of all kinds of data find their way into orbit.
Sean Patrick (Nikola Tesla: Imagination and the Man That Invented the 20th Century)
Technology is a powerful force in our society. Data, software, and communication can be used for bad: to entrench unfair power structures, to undermine human rights, and to protect vested interests. But they can also be used for good: to make underrepresented people’s voices heard, to create opportunities for everyone, and to avert disasters. This book is dedicated to everyone working toward the good.
Martin Kleppmann (Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems)
Imagine an alien, Fox said, who's come here to identify the planet's dominant form of intelligence. The alien has a look, then chooses. What do you think he picks? I probably shrugged. The zaibatsus, Fox said, the multinationals. The blood of a zaibatsu is information, not people. The structure is independent of the individual lives that comprise it. Corporation as life form. Not the Edge lecture again, I said.
William Gibson (Burning Chrome (Sprawl, #0))
Philosophers of science have repeatedly demonstrated that more than one theoretical construction can always be placed upon a given collection of data. History of science indicates that, particularly in the early developmental stages of a new paradigm, it is not even very difficult to invent such alternates. But that invention of alternates is just what scientists seldom undertake except during the pre-paradigm stage of their science's development and at very special occasions during its subsequent evolution. So long as the tools a paradigm supplies continue to prove capable of solving the problems it defines, science moves fastest and penetrates most deeply through confident employment of those tools. The reason is clear. As in manufacture so in science-retooling is an extravagance to be reserved for the occasion that demands it. The significance of crises is the indication they provide that an occasion for retooling has arrived.
Thomas S. Kuhn (The Structure of Scientific Revolutions)
Power vacuums seldom last long. If in the twenty-first century traditional political structures can no longer process the data fast enough to produce meaningful visions, then new and more efficient structures will evolve to take their place. These new structures may be very different from any previous political institutions, whether democratic or authoritarian. The only question is who will build and control these structures. If humankind is no longer up to the task, perhaps it might give somebody else a try.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
A more ambitious route would be changing the structure of governance altogether: away from majority-based, and towards unanimous decision-making. This has been shown to boost women’s speech participation and to mitigate against their minority position. A 2012 US study found that women only participate at an equal rate in discussions when they are in ‘a large majority’ – interestingly while individual women speak less when they are in the minority, individual men speak the same amount no matter what the gender proportion of the group.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
And yet, despite the horror it caused, the plague turned out to be the catalyst for social and economic change that was so profound that far from marking the death of Europe, it served as its making. The transformation provided an important pillar in the rise—and the triumph—of the west. It did so in several phases. First was the top-to-bottom reconfiguration of how social structures functioned. Chronic depopulation in the wake of the Black Death had the effect of sharply increasing wages because of the accentuated value of labour. So many died before the plague finally began to peter out in the early 1350s that one source noted a “shortage of servants, craftsmen, and workmen, and agricultural workers and labourers.” This gave considerable negotiating powers to those who had previously been at the lower end of the social and economic spectrum. Some simply “turned their noses up at employment, and could scarcely be persuaded to serve the eminent unless for triple wages.” This was hardly an exaggeration: empirical data shows that urban wages rose dramatically in the decades after the Black Death.
Peter Frankopan (The Silk Roads: A New History of the World)
the hippocampus is perhaps the central organ in the brain responsible for processing sensory data and acting as a gateway for inflows. As Kisley et al. note: “The cholinergic innervation of the hippocampus, which is crucial for intact sensory gating, exhibits extensive remodeling during pre- and early postnatal development.”4 It displays a great deal of plasticity in response to incoming sensory flows; the more it works with meaning and the more sensory input it is sensitive to, the more it shifts its neural structure as it sensorally interacts with the world.
Stephen Harrod Buhner (Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth)
In fact this desire for consonance in the apocalyptic data, and our tendency to be derisive about it, seem to me equally interesting. Each manifests itself, in the presence of the other, in most of our minds. We are all ready to be sceptical about Father Marystone, but we are most of us given to some form of 'centurial mysticism,' and even to more extravagant apocalyptic practices: a point I shall be taking up in my fourth talk. What it seems to come to is this. Men in the middest make considerable imaginative investments in coherent patterns which, by the provision of an end, make possible a satisfying consonance with the origins and with the middle. That is why the image of the end can never be permanently falsified. But they also, when awake and sane, feel the need to show a marked respect for things as they are; so that there is a recurring need for adjustments in the interest of reality as well as of control. This has relevance to literary plots, images of the grand temporal consonance; and we may notice that there is the same co-existence of naïve acceptance and scepticism here as there is in apocalyptic. Broadly speaking, it is the popular story that sticks most closely to established conventions; novels the clerisy calls 'major' tend to vary them, and to vary them more and more as time goes by. I shall be talking about this in some detail later, but a few brief illustrations might be useful now. I shall refer chiefly to one aspect of the matter, the falsification of one's expectation of the end. The story that proceeded very simply to its obviously predestined end would be nearer myth than novel or drama. Peripeteia, which has been called the equivalent, in narrative, of irony in rhetoric, is present in every story of the least structural sophistication. 
Now peripeteia depends on our confidence of the end; it is a disconfirmation followed by a consonance; the interest of having our expectations falsified is obviously related to our wish to reach the discovery or recognition by an unexpected and instructive route. It has nothing whatever to do with any reluctance on our part to get there at all. So that in assimilating the peripeteia we are enacting that readjustment of expectations in regard to an end which is so notable a feature of naïve apocalyptic. And we are doing rather more than that; we are, to look at the matter in another way, re-enacting the familiar dialogue between credulity and scepticism. The more daring the peripeteia, the more we may feel that the work respects our sense of reality; and the more certainly we shall feel that the fiction under consideration is one of those which, by upsetting the ordinary balance of our naïve expectations, is finding something out for us, something real. The falsification of an expectation can be terrible, as in the death of Cordelia; it is a way of finding something out that we should, on our more conventional way to the end, have closed our eyes to. Obviously it could not work if there were not a certain rigidity in the set of our expectations.
Frank Kermode (The Sense of an Ending: Studies in the Theory of Fiction)
We are about to study the idea of a computational process. Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells. A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer's spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform. A computational process, in a correctly working computer, executes programs precisely and accurately. Thus, like the sorcerer's apprentice, novice programmers must learn to understand and to anticipate the consequences of their conjuring. Even small errors (usually called bugs or glitches) in programs can have complex and unanticipated consequences.
Harold Abelson (Structure and Interpretation of Computer Programs)
In a remarkable letter to the director of the Vatican Observatory, John Paul II wrote: The church does not propose that science should become religion or religion science. On the contrary, unity always presupposes the diversity and integrity of its elements. Each of these members should become not less itself but more itself in a dynamic interchange, for a unity in which one of the elements is reduced to the other is destructive, false in its promises of harmony, and ruinous of the integrity of its components. We are asked to become one. We are not asked to become each other. . . . Unity involves the drive of the human mind towards understanding and the desire of the human spirit for love. When human beings seek to understand the multiplicities that surround them, when they seek to make sense of experience, they do so by bringing many factors into a common vision. Understanding is achieved when many data are unified by a common structure. The one illuminates the many: it makes sense of the whole. . . . We move towards unity as we move towards meaning in our lives. Unity is also the consequence of love. If love is genuine, it moves not towards the assimilation of the other but towards union with the other. Human community begins in desire when that union has not been achieved, and it is completed in joy when those who have been apart are now united.10
Ilia Delio (Making All Things New: Catholicity, Cosmology, Consciousness (Catholicity in an Evolving Universe Series))
In the coming decades, it is likely that we will see more Internet-like revolutions, in which technology steals a march on politics. Artificial intelligence and biotechnology might soon overhaul our societies and economies – and our bodies and minds too – but they are hardly a blip on our political radar. Our current democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics loses control of events, and fails to provide us with meaningful visions for the future. That doesn’t mean we will go back to twentieth-century-style dictatorships. Authoritarian regimes seem to be equally overwhelmed by the pace of technological development and the speed and volume of the data flow. In the twentieth century, dictators had grand visions for the future. Communists and fascists alike sought to completely destroy the old world and build a new world in its place. Whatever you think about Lenin, Hitler or Mao, you cannot accuse them of lacking vision. Today it seems that leaders have a chance to pursue even grander visions. While communists and Nazis tried to create a new society and a new human with the help of steam engines and typewriters, today’s prophets could rely on biotechnology and super-computers.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
The Scientific Revolution proposed a very different formula for knowledge: Knowledge = Empirical Data × Mathematics. If we want to know the answer to some question, we need to gather relevant empirical data, and then use mathematical tools to analyse the data. For example, in order to gauge the true shape of the earth, we can observe the sun, the moon and the planets from various locations across the world. Once we have amassed enough observations, we can use trigonometry to deduce not only the shape of the earth, but also the structure of the entire solar system. In practice, that means that scientists seek knowledge by spending years in observatories, laboratories and research expeditions, gathering more and more empirical data, and sharpening their mathematical tools so they could interpret the data correctly. The scientific formula for knowledge led to astounding breakthroughs in astronomy, physics, medicine and countless other disciplines. But it had one huge drawback: it could not deal with questions of value and meaning. Medieval pundits could determine with absolute certainty that it is wrong to murder and steal, and that the purpose of human life is to do God’s bidding, because scriptures said so. Scientists could not come up with such ethical judgements. No amount of data and no mathematical wizardry can prove that it is wrong to murder. Yet human societies cannot survive without such value judgements.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
It has become fashionable for modern workplaces to relax what are often seen as outmoded relics of a less egalitarian age: out with stuffy hierarchies, in with flat organisational structures. But the problem with the absence of a formal hierarchy is that it doesn’t actually result in an absence of a hierarchy altogether. It just means that the unspoken, implicit, profoundly non-egalitarian structure reasserts itself, with white men at the top and the rest of us fighting for a piece of the small space left for everyone else. Group-discussion approaches like brainstorming, explains female leadership trainer Gayna Williams, are ‘well known to be loaded with challenges for diverse representation’, because already-dominant voices dominate.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
Sound waves, regardless of their frequency or intensity, can only be detected by the Mole Fly’s acute sense of smell—it is a little known fact that the Mole Fly’s auditory receptors do not, in fact, have a corresponding center in the brain designated for the purposes of processing sensory stimuli and so, these stimuli, instead of being siphoned out as noise, bypass the filters to be translated, oddly enough, by the part of the brain that processes smell. Consequently, the Mole Fly’s brain, in its inevitable confusion, understands sound as an aroma, rendering the boundary line between the auditory and olfactory sense indistinguishable. Sounds, thus, come in a variety of scents with an intensity proportional to its frequency. Sounds of shorter wavelength, for example, are particularly pungent. What results is a species of creature that cannot conceptualize the possibility that sound and smell are separate entities, despite its ability to discriminate between the exactitudes of pitch, timbre, tone, scent, and flavor to an alarming degree of precision. Yet, despite this ability to hyper-analyze, they lack the cognitive skill to laterally link successions of either sound or smell into a meaningful context, resulting in the equivalent of a data overflow. And this may be the most defining element of the Mole Fly’s behavior: a blatant disregard for the context of perception, in favor of analyzing those remote and diminutive properties that distinguish one element from another. While sensory continuity seems logical to their visual perception, as things are subject to change from moment-to-moment, such is not the case with their olfactory sense, as delays in sensing new smells are granted a degree of normality by the brain. Thus, the Mole Fly’s olfactory-auditory complex seems to be deprived of the sensory continuity otherwise afforded in the auditory senses of other species. 
And so, instead of sensing aromas and sounds continuously over a period of time—for example, instead of sensing them 24-30 times per second, as would be the case with their visual perception—they tend to process changes in sound and smell much more slowly, thereby preventing them from effectively plotting the variations thereof into an array or any kind of meaningful framework that would allow the information provided by their olfactory and auditory stimuli to be lasting in their usefulness. The Mole flies, themselves, being the structurally-obsessed and compulsive creatures that they are, in all their habitual collecting, organizing, and re-organizing of found objects into mammoth installations of optimal functional value, are remarkably easy to control, especially as they are given to a rather false and arbitrary sense of hierarchy, ascribing positions—that are otherwise trivial, yet necessarily mundane if only to obscure their true purpose—with an unfathomable amount of honor, to the logical extreme that the few chosen to serve in their most esteemed ranks are imbued with a kind of obligatory arrogance that begins in the pupal stages and extends indefinitely, as they are further nurtured well into adulthood by a society that infuses its heroes of middle management with an immeasurable sense of importance—a kind of celebrity status recognized by the masses as a living embodiment of their ideals. And yet, despite this culture of celebrity worship and vicarious living, all whims and impulses fall subservient, dropping humbly to the knees—yes, Mole Flies do, in fact, have knees!—before the grace of the merciful Queen, who is, in actuality, just a puppet dictator installed by the Melic papacy, using an old recycled Damsel fly-fishing lure. The dummy is crude, but convincing, as the Mole flies treat it as they would their true-born queen.
Ashim Shanker (Don't Forget to Breathe (Migrations, Volume I))
Now, describe, in a single written sentence, your intended successful outcome for this problem or situation. In other words, what would need to happen for you to check this project off as “done”? It could be as simple as “Take the Hawaii vacation,” “Handle situation with customer X,” “Resolve college situation with Susan,” “Clarify new divisional management structure,” “Implement new investment strategy,” or “Research options for dealing with Manuel’s reading issue.” All clear? Great. Now write down the very next physical action required to move the situation forward. If you had nothing else to do in your life but get closure on this, what visible action would you take right now? Would you call or text someone? Write an e-mail? Take pen and paper and brainstorm about it? Surf the Web for data? Buy nails at the hardware store? Talk about it face-to-face with your partner, your assistant, your attorney, or your boss? What? Got the answer to that?
David Allen (Getting Things Done: The Art of Stress-Free Productivity)
Gell-Mann and Ne'eman discovered that one such simple Lie group, called "special unitary group of degree 3," or SU(3), was particularly well suited for the "eightfold way"-the family structure the particles were found to obey. The beauty of the SU(3) symmetry was revealed in full glory via its predictive power. Gell-Mann and Ne'eman showed that if the theory were to hold true, a previously unknown tenth member of a particular family of nine particles had to be found. The extensive hunt for the missing particle was conducted in an accelerator experiment in 1964 at Brookhaven National Lab on Long Island. Yuval Ne'eman told me some years later that, upon hearing that half of the data had already been scrutinized without discovering the anticipated particle, he was contemplating leaving physics altogether. Symmetry triumphed at the end-the missing particle (called the omega minus) was found, and it had precisely the properties predicted by the theory.
Mario Livio (The Equation That Couldn't Be Solved: How Mathematical Genius Discovered the Language of Symmetry)
Judging types are in a hurry to make decisions. Perceiving types are not. This is why science doesn’t make any serious attempt to reach a final theory of everything. It always says, “Let’s do another experiment. And another. And another.” When will the experimentation ever end? When will scientists conclude that they have now collected easily enough data to now draw definitive conclusions? But they don’t want to draw any such conclusions. That’s not how they roll. Their method has no such requirement. That’s why many of them openly say that they do not want a final theory. It will stop them, they say, from “discovering” new things. Judging types like order and structure. They like decisions, conclusions, getting things done and reaching objectives. Perceiving types are doubtful and skeptical about all of that. They frequently refer to judging types as “judgmental”, which is literally perceived as a bad thing, “authoritarian”, “totalitarian”, “fascist”, “Nazi”, and so on. Perceiving types always want to have an open road ahead of them. They never want to actually arrive. Judging types cannot see the point of not wanting to reach your destination.
Thomas Stark (Extra Scientiam Nulla Salus: How Science Undermines Reason (The Truth Series Book 8))
Saint John Paul II wrote, “when its concepts and conclusions can be integrated into the wider human culture and its concerns for ultimate meaning and value.”7 Religion, too, develops best when its doctrines are not abstract and fixed in an ancient past but integrated into the wider stream of life. Albert Einstein once said that “science without religion is lame and religion without science is blind.”8 So too, John Paul II wrote: “Science can purify religion from error and superstition; religion can purify science from idolatry and false absolutes. Each can draw the other into a wider world, a world in which both can flourish.”9 Teilhard de Chardin saw that dialogue alone between the disciplines is insufficient; what we need is a new synthesis of science and religion, drawing insights from each discipline into a new unity. In a remarkable letter to the director of the Vatican Observatory, John Paul II wrote: The church does not propose that science should become religion or religion science. On the contrary, unity always presupposes the diversity and integrity of its elements. Each of these members should become not less itself but more itself in a dynamic interchange, for a unity in which one of the elements is reduced to the other is destructive, false in its promises of harmony, and ruinous of the integrity of its components. We are asked to become one. We are not asked to become each other. . . . Unity involves the drive of the human mind towards understanding and the desire of the human spirit for love. When human beings seek to understand the multiplicities that surround them, when they seek to make sense of experience, they do so by bringing many factors into a common vision. Understanding is achieved when many data are unified by a common structure. The one illuminates the many: it makes sense of the whole. . . . We move towards unity as we move towards meaning in our lives. Unity is also the consequence of love. 
If love is genuine, it moves not towards the assimilation of the other but towards union with the other. Human community begins in desire when that union has not been achieved, and it is completed in joy when those who have been apart are now united.10 The words of the late pope highlight the core of catholicity: consciousness of belonging to a whole and unity as a consequence of love.
Ilia Delio (Making All Things New: Catholicity, Cosmology, Consciousness (Catholicity in an Evolving Universe Series))
Christianity and other traditional religions are still important players in the world. Yet their role is now largely reactive. In the past, they were a creative force. Christianity, for example, spread the hitherto heretical notion that all humans are equal before God, thereby changing human political structures, social hierarchies and even gender relations. In his Sermon on the Mount Jesus went further, insisting that the meek and oppressed are God’s favourite people, thus turning the pyramid of power on its head, and providing ammunition for generations of revolutionaries. In addition to social and ethical reforms, Christianity was responsible for important economic and technological innovations. The Catholic Church established medieval Europe’s most sophisticated administrative system, and pioneered the use of archives, catalogues, timetables and other techniques of data processing. The Vatican was the closest thing twelfth-century Europe had to Silicon Valley. The Church established Europe’s first economic corporations – the monasteries – which for 1,000 years spearheaded the European economy and introduced advanced agricultural and administrative methods.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
The main ones are the symbolists, connectionists, evolutionaries, Bayesians, and analogizers. Each tribe has a set of core beliefs, and a particular problem that it cares most about. It has found a solution to that problem, based on ideas from its allied fields of science, and it has a master algorithm that embodies it. For symbolists, all intelligence can be reduced to manipulating symbols, in the same way that a mathematician solves equations by replacing expressions by other expressions. Symbolists understand that you can’t learn from scratch: you need some initial knowledge to go with the data. They’ve figured out how to incorporate preexisting knowledge into learning, and how to combine different pieces of knowledge on the fly in order to solve new problems. Their master algorithm is inverse deduction, which figures out what knowledge is missing in order to make a deduction go through, and then makes it as general as possible. For connectionists, learning is what the brain does, and so what we need to do is reverse engineer it. The brain learns by adjusting the strengths of connections between neurons, and the crucial problem is figuring out which connections are to blame for which errors and changing them accordingly. The connectionists’ master algorithm is backpropagation, which compares a system’s output with the desired one and then successively changes the connections in layer after layer of neurons so as to bring the output closer to what it should be. Evolutionaries believe that the mother of all learning is natural selection. If it made us, it can make anything, and all we need to do is simulate it on the computer. The key problem that evolutionaries solve is learning structure: not just adjusting parameters, like backpropagation does, but creating the brain that those adjustments can then fine-tune. 
The evolutionaries’ master algorithm is genetic programming, which mates and evolves computer programs in the same way that nature mates and evolves organisms. Bayesians are concerned above all with uncertainty. All learned knowledge is uncertain, and learning itself is a form of uncertain inference. The problem then becomes how to deal with noisy, incomplete, and even contradictory information without falling apart. The solution is probabilistic inference, and the master algorithm is Bayes’ theorem and its derivates. Bayes’ theorem tells us how to incorporate new evidence into our beliefs, and probabilistic inference algorithms do that as efficiently as possible. For analogizers, the key to learning is recognizing similarities between situations and thereby inferring other similarities. If two patients have similar symptoms, perhaps they have the same disease. The key problem is judging how similar two things are. The analogizers’ master algorithm is the support vector machine, which figures out which experiences to remember and how to combine them to make new predictions.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
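The Bayesians' master algorithm described in the excerpt above, Bayes' theorem as a rule for updating belief in the light of new evidence, can be sketched in a few lines of Python. This is an illustrative sketch only, not code from the book; the disease/symptom scenario and all the probabilities in it are invented for the example:

```python
# A minimal sketch of Bayes' theorem as a belief-update rule:
#   P(H | E) = P(E | H) * P(H) / P(E)
# All priors and likelihoods below are invented for illustration.

def bayes_update(prior, likelihood, evidence_prob):
    """Return the posterior P(H|E) given P(H), P(E|H), and P(E)."""
    return likelihood * prior / evidence_prob

# Hypothesis H: the patient has disease D. Evidence E: a particular symptom.
prior = 0.01                 # P(D): 1% of patients have the disease
p_symptom_given_d = 0.9      # P(symptom | D)
p_symptom_given_not_d = 0.1  # P(symptom | not D)

# Total probability of observing the symptom (law of total probability).
p_symptom = p_symptom_given_d * prior + p_symptom_given_not_d * (1 - prior)

posterior = bayes_update(prior, p_symptom_given_d, p_symptom)
print(round(posterior, 3))  # prints 0.083
```

Note how the posterior (about 8.3%) is far below the 90% likelihood: the low prior dominates, which is exactly the kind of principled handling of uncertainty the excerpt attributes to the Bayesian tribe.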
The last refuge of the Self, perhaps, is “physical continuity.” Despite the body’s mercurial nature, it feels like a badge of identity we have carried since the time of our earliest childhood memories. A thought experiment dreamed up in the 1980s by British philosopher Derek Parfit illustrates how important—yet deceiving—this sense of physical continuity is to us.15 He invites us to imagine a future in which the limitations of conventional space travel—of transporting the frail human body to another planet at relatively slow speeds—have been solved by beaming radio waves encoding all the data needed to assemble the passenger to their chosen destination. You step into a machine resembling a photo booth, called a teletransporter, which logs every atom in your body then sends the information at the speed of light to a replicator on Mars, say. This rebuilds your body atom by atom using local stocks of carbon, oxygen, hydrogen, and so on. Unfortunately, the high energies needed to scan your body with the required precision vaporize it—but that’s okay because the replicator on Mars faithfully reproduces the structure of your brain nerve by nerve, synapse by synapse. You step into the teletransporter, press the green button, and an instant later materialize on Mars and can continue your existence where you left off. The person who steps out of the machine at the other end not only looks just like you, but etched into his or her brain are all your personality traits and memories, right down to the memory of eating breakfast that morning and your last thought before you pressed the green button. If you are a fan of Star Trek, you may be perfectly happy to use this new mode of space travel, since this is more or less what the USS Enterprise’s transporter does when it beams its crew down to alien planets and back up again. 
But now Parfit asks us to imagine that a few years after you first use the teletransporter comes the announcement that it has been upgraded in such a way that your original body can be scanned without destroying it. You decide to give it a go. You pay the fare, step into the booth, and press the button. Nothing seems to happen, apart from a slight tingling sensation, but you wait patiently and sure enough, forty-five minutes later, an image of your new self pops up on the video link and you spend the next few minutes having a surreal conversation with yourself on Mars. Then comes some bad news. A technician cheerfully informs you that there have been some teething problems with the upgraded teletransporter. The scanning process has irreparably damaged your internal organs, so whereas your replica on Mars is absolutely fine and will carry on your life where you left off, this body here on Earth will die within a few hours. Would you care to accompany her to the mortuary? Now how do you feel? There is no difference in outcome between this scenario and what happened in the old scanner—there will still be one surviving “you”—but now it somehow feels as though it’s the real you facing the horror of imminent annihilation. Parfit nevertheless uses this thought experiment to argue that the only criterion that can rationally be used to judge whether a person has survived is not the physical continuity of a body but “psychological continuity”—having the same memories and personality traits as the most recent version of yourself. Buddhists
James Kingsland (Siddhartha's Brain: Unlocking the Ancient Science of Enlightenment)
In 1942, Merton set out four scientific values, now known as the ‘Mertonian Norms’. None of them have snappy names, but all of them are good aspirations for scientists. First, universalism: scientific knowledge is scientific knowledge, no matter who comes up with it – so long as their methods for finding that knowledge are sound. The race, sex, age, gender, sexuality, income, social background, nationality, popularity, or any other status of a scientist should have no bearing on how their factual claims are assessed. You also can’t judge someone’s research based on what a pleasant or unpleasant person they are – which should come as a relief for some of my more disagreeable colleagues. Second, and relatedly, disinterestedness: scientists aren’t in it for the money, for political or ideological reasons, or to enhance their own ego or reputation (or the reputation of their university, country, or anything else). They’re in it to advance our understanding of the universe by discovering things and making things – full stop.20 As Charles Darwin once wrote, a scientist ‘ought to have no wishes, no affections, – a mere heart of stone.’ The next two norms remind us of the social nature of science. The third is communality: scientists should share knowledge with each other. This principle underlies the whole idea of publishing your results in a journal for others to see – we’re all in this together; we have to know the details of other scientists’ work so that we can assess and build on it. Lastly, there’s organised scepticism: nothing is sacred, and a scientific claim should never be accepted at face value. We should suspend judgement on any given finding until we’ve properly checked all the data and methodology. The most obvious embodiment of the norm of organised scepticism is peer review itself.
20. Robert K. Merton, ‘The Normative Structure of Science’ (1942), The Sociology of Science: Empirical and Theoretical Investigations (Chicago and London: University of Chicago Press, 1973): pp. 267–278.
Stuart Ritchie (Science Fictions)
Isaac Asimov’s short story “The Fun They Had” describes a school of the future that uses advanced technology to revolutionize the educational experience, enhancing individualized learning and providing students with personalized instruction and robot teachers. Such science fiction has gone on to inspire very real innovation. In a 1984 Newsweek interview, Apple’s co-founder Steve Jobs predicted computers were going to be a bicycle for our minds, extending our capabilities, knowledge, and creativity, much the way a ten-speed amplifies our physical abilities. For decades, we have been fascinated by the idea that we can use computers to help educate people. What connects these science fiction narratives is that they all imagined computers might eventually emulate what we view as intelligence. Real-life researchers have been working for more than sixty years to make this AI vision a reality. In 1962, the checkers master Robert Nealey played the game against an IBM 7094 computer, and the computer beat him. A few years prior, in 1957, the psychologist Frank Rosenblatt created Perceptron, the first artificial neural network, a computer simulation of a collection of neurons and synapses trained to perform certain tasks. In the decades following such innovations in early AI, we had the computation power to tackle systems only as complex as the brain of an earthworm or insect. We also had limited techniques and data to train these networks. The technology has come a long way in the ensuing decades, driving some of the most common products and apps today, from the recommendation engines on movie streaming services to voice-controlled personal assistants such as Siri and Alexa. AI has gotten so good at mimicking human behavior that oftentimes we cannot distinguish between human and machine responses. 
Meanwhile, not only has the computation power developed enough to tackle systems approaching the complexity of the human brain, but there have been significant breakthroughs in structuring and training these neural networks.
Salman Khan (Brave New Words: How AI Will Revolutionize Education (and Why That’s a Good Thing))
In 1950, a thirty-year-old scientist named Rosalind Franklin arrived at King’s College London to study the shape of DNA. She and a graduate student named Raymond Gosling created crystals of DNA, which they bombarded with X-rays. The beams bounced off the crystals and struck photographic film, creating telltale lines, spots, and curves. Other scientists had tried to take pictures of DNA, but no one had created pictures as good as Franklin had. Looking at the pictures, she suspected that DNA was a spiral-shaped molecule—a helix. But Franklin was relentlessly methodical, refusing to indulge in flights of fancy before the hard work of collecting data was done. She kept taking pictures. Two other scientists, Francis Crick and James Watson, did not want to wait. Up in Cambridge, they were toying with metal rods and clamps, searching for plausible arrangements of DNA. Based on hasty notes Watson had written during a talk by Franklin, he and Crick put together a new model. Franklin and her colleagues from King’s paid a visit to Cambridge to inspect it, and she bluntly told Crick and Watson they had gotten the chemistry all wrong. Franklin went on working on her X-ray photographs and growing increasingly unhappy with King’s. The assistant lab chief, Maurice Wilkins, was under the impression that Franklin was hired to work directly for him. She would have none of it, bruising Wilkins’s ego and leaving him to grumble to Crick about “our dark lady.” Eventually a truce was struck, with Wilkins and Franklin working separately on DNA. But Wilkins was still Franklin’s boss, which meant that he got copies of her photographs. In January 1953, he showed one particularly telling image to Watson. Now Watson could immediately see in those images how DNA was shaped. He and Crick also got hold of a summary of Franklin’s unpublished research she wrote up for the Medical Research Council, which guided them further to their solution. 
Neither bothered to consult Franklin about using her hard-earned pictures. The Cambridge and King’s teams then negotiated a plan to publish a set of papers in Nature on April 25, 1953. Crick and Watson unveiled their model in a paper that grabbed most of the attention. Franklin and Gosling published their X-ray data in another paper, which seemed to readers to be a “me-too” effort. Franklin died of cancer five years later, while Crick, Watson, and Wilkins went on to share the Nobel prize in 1962. In his 1968 book, The Double Helix, Watson would cruelly caricature Franklin as a belligerent, badly dressed woman who couldn’t appreciate what was in her pictures. That bitter fallout is a shame, because these scientists had together discovered something of exceptional beauty. They had found a molecular structure that could make heredity possible.
Carl Zimmer (She Has Her Mother's Laugh: What Heredity Is, Is Not, and May Become)
Months later, Time magazine would run its now infamous article bragging about how it had been done. Without irony or shame, the magazine reported that “[t]here was a conspiracy unfolding behind the scenes” creating “an extraordinary shadow effort” by a “well-funded cabal of powerful people” to oppose Trump.112 Corporate CEOs, organized labor, left-wing activists, and Democrats all worked together in secret to secure a Biden victory. For Trump, these groups represented a powerful Washington and Democratic establishment that saw an unremarkable career politician like Biden as merely a vessel for protecting their self-interests. Accordingly, when Trump was asked whom he blames for the rigging of the 2020 election, he quickly responded, “Least of all Biden.” Time would, of course, disingenuously frame this effort as an attempt to “oppose Trump’s assault on democracy,” even as Time reporter Molly Ball noted this shadow campaign “touched every aspect of the election. They got states to change voting systems and laws and helped secure hundreds of millions in public and private funding.” The funding enabled the country’s sudden rush to mail-in balloting, which Ball described as “a revolution in how people vote.”113 The funding from Democratic donors to public election administrators was revolutionary. The Democrats’ network of nonprofit activist groups embedded into the nation’s electoral structure through generous grants from Democratic donors. They helped accomplish the Democrats’ vote-by-mail strategy from the inside of the election process. It was as if the Dallas Cowboys were paying the National Football League’s referee staff and conducting all of their support operations. No one would feel confident in games won by the Cowboys in such a scenario. 
Ball also reported that this shadowy cabal “successfully pressured social media companies to take a harder line against disinformation and used data-driven strategies to fight viral smears.” And yet, Time magazine made this characterization months after it was revealed that the New York Post’s reporting on Hunter Biden’s corrupt deal-making with Chinese and other foreign officials—deals that alleged direct involvement from Joe Biden, resulting in the reporting’s being overtly censored by social media—was substantially true. Twitter CEO Jack Dorsey would eventually tell Congress that censoring the New York Post and locking it out of its Twitter account over the story was “a mistake.” And the Hunter Biden story was hardly the only egregious mistake, to say nothing of the media’s willful dishonesty, in the 2020 election. Republicans read the Time article with horror and as an admission of guilt. It confirmed many voters’ suspicions that the election wasn’t entirely fair. Trump knew the article helped his case, calling it “the only good article I’ve read in Time magazine in a long time—that was actually just a piece of the truth because it was much deeper than that.
Mollie Ziegler Hemingway (Rigged: How the Media, Big Tech, and the Democrats Seized Our Elections)
Structured Application Design with MVC

MVC defines a clean separation between the critical components of our apps. Consistent with its name, MVC defines three parts of an application:

• A model provides the underlying data and methods that offer information to the rest of the application. The model does not define how the application will look or how it will act.

• One or more views make up the user interface. A view consists of the different onscreen widgets (buttons, fields, switches, and so forth) that a user can interact with.

• A controller is typically paired with a view. The controller is responsible for receiving user input and acting accordingly. Controllers may access and update a view using information from the model and update the model using the results of user interactions in the view. In short, it bridges the MVC components.
John Ray (Sams Teach Yourself iOS 5 Application Development in 24 Hours (3rd Edition))
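The separation Ray describes is not iOS-specific. Here is a minimal, language-agnostic sketch in Python (the class names are invented for illustration, not from the book) showing the three roles and how the controller bridges them:

```python
class CounterModel:
    """Model: holds the underlying data; knows nothing about display."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1


class CounterView:
    """View: renders model data for the user; contains no business logic."""
    def render(self, count):
        return f"Count: {count}"


class CounterController:
    """Controller: receives user input, updates the model, refreshes the view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_tap(self):
        self.model.increment()                      # act on the model
        return self.view.render(self.model.count)   # refresh the view


controller = CounterController(CounterModel(), CounterView())
print(controller.handle_tap())  # prints "Count: 1"
```

Note that the model and view never reference each other; only the controller knows about both, which is the bridging role the quote describes.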
The fundamental problem in the U.S. health care system is that the structure of health care delivery is broken. This is what all the data about rising costs and alarming quality are telling us. And the structure of health care delivery is broken because competition is broken. All of the well-intended reform movements have failed because they did not address the underlying nature of competition.
Michael E. Porter (Redefining Health Care: Creating Value-based Competition on Results)
A configuration is the structure of architectural relationships among components, connectors, and data during a period of system run-time.
Anonymous
Product development has become a faster, more flexible process, where radically better products don’t stand on the shoulders of giants, but on the shoulders of lots of iterations. The basis for success then, and for continual product excellence, is speed. Unfortunately, like Jonathan’s failed gate-based product development framework, most management processes in place at companies today are designed with something else in mind. They were devised over a century ago, at a time when mistakes were expensive and only the top executives had comprehensive information, and their primary objectives are lowering risk and ensuring that decisions are made only by the few executives with lots of information. In this traditional command-and-control structure, data flows up to the executives from all over the organization, and decisions subsequently flow down. This approach is designed to slow things down, and it accomplishes the task very well. Meaning that at the very moment when businesses must permanently accelerate, their architecture is working against them.
Eric Schmidt (How Google Works)
Big Data refers to the huge and rapidly growing volume of data, for example high-volume sensor data and social networking data from sites such as Facebook and Twitter, to name a few. Many organizations are keen to capture this data and analyze it, since doing so can help them make sound strategic decisions. It is important to note, however, that such big data comes in many structures. To learn more, visit 361online.com/bigdata/beginners.php
361online.com
blockchain is a data structure that makes it possible to create a digital ledger of data and share it among a network of independent parties.
Tiana Laurence (Blockchain for Dummies)
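Laurence's one-line definition can be sketched as code: each block records the hash of its predecessor, so the shared ledger becomes tamper-evident. A minimal illustration (the field names and the SHA-256 choice are assumptions for this sketch, not Bitcoin's actual block format):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """A block records its payload plus the hash of the previous block."""
    body = {"data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    return body

def is_valid(chain):
    """Every block must point at the hash of the block before it."""
    return all(chain[i]["prev_hash"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("alice pays bob 5", genesis["hash"])]
print(is_valid(chain))   # prints "True"

chain[0] = make_block("rewritten history", "0" * 64)  # tamper with block 0
print(is_valid(chain))   # prints "False": block 1's prev_hash no longer matches
```

Because each independent party can recompute the hashes, any one of them can detect that the ledger was altered, which is what makes the structure usable among parties that do not trust each other.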
Turing showed that just as the uncertainties of physics stem from using electrons and photons to measure themselves, the limitations of computers stem from recursive self-reference. Just as quantum theory fell into self-referential loops of uncertainty because it measured atoms and electrons using instruments composed of atoms and electrons, computer logic could not escape self-referential loops as its own logical structures informed its own algorithms.12
George Gilder (Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy)
So where do we get these plausibility structures? They come from three main sources: (1) community, (2) experience, and (3) facts, evidence, and data.
Sam Chan (Evangelism in a Skeptical World: How to Make the Unbelievable News about Jesus More Believable)
The reasons for cooperatives’ success should be obvious by now, but they are worth reiterating: “The major basis for cooperative success…has been superior labor productivity. Studies comparing square-foot output have repeatedly shown higher physical volume of output per hour, and others…show higher quality of product and also economy of material use.”118 Hendrik Thomas concludes from an analysis of Mondragon that “Productivity and profitability are higher for cooperatives than for capitalist firms. It makes little difference whether the Mondragon group is compared with the largest 500 companies, or with small- or medium-scale industries; in both comparisons the Mondragon group is more productive and more profitable.”119 As we have seen, recent research has arrived at the same conclusions. It is a truism by now that worker participation tends to increase productivity and profitability. Research conducted by Henk Thomas and Chris Logan corroborates these conclusions. “A frequent but unfounded criticism,” they observe, “of self-managed firms is that workers prefer to enjoy a high take-home pay rather than to invest in their own enterprises. This has been proven invalid…in the Mondragon case… A comparison of gross investment figures shows that the cooperatives invest on average four times as much as private enterprises.” After a detailed analysis they also conclude that “there can be no doubt that the [Mondragon] cooperatives have been more profitable than capitalist enterprises.”120 Recent data indicate the same thing.121 One particularly successful company, Irizar, which was mentioned earlier, has been awarded prizes for being the most efficient company in its sector; in Spain it has ten competitors, but its market share is 40 percent. The same level of achievement is true of its subsidiaries, for instance in Mexico, where it had a 45 percent market share in 2005, six years after entering the market. 
An author comments that “the basis for this increased efficiency appears to be linked directly to the organization’s unique participatory and democratic management structure.”122 A major reason for all these successes is Mondragon’s federated structure: the group of cooperatives has its own supply of banking, education, and technical support services. The enormous funds of the central credit union, the Caja Laboral Popular, have likewise been crucial to Mondragon’s expansion. It proves that if cooperatives have access to credit they are perfectly capable of being far more successful than private enterprises.
Chris Wright (Worker Cooperatives and Revolution: History and Possibilities in the United States)
A few books that I've read....

Pascal, an Introduction to the Art and Science of Programming by Walter Savitch

Programming algorithms
Introduction to Algorithms, 3rd Edition (The MIT Press)
Data Structures and Algorithms in Java
Author: Michael T. Goodrich - Roberto Tamassia - Michael H. Goldwasser
The Algorithm Design Manual
Author: Steven S Skiena
Algorithm Design
Author: Jon Kleinberg - Éva Tardos
Algorithms + Data Structures = Programs
Book by Niklaus Wirth

Discrete Math
Discrete Mathematics and Its Applications
Author: Kenneth H Rosen

Computer Org
Structured Computer Organization
Author: Andrew S. Tanenbaum
Introduction to Assembly Language Programming: From 8086 to Pentium Processors (Undergraduate Texts in Computer Science)
Author: Sivarama P. Dandamudi

Distributed Systems
Distributed Systems: Concepts and Design
Author: George Coulouris - Jean Dollimore - Tim Kindberg - Gordon Blair
Distributed Systems: An Algorithmic Approach, Second Edition (Chapman & Hall/CRC Computer and Information Science Series)
Author: Sukumar Ghosh

Mathematical Reasoning
Mathematical Reasoning: Writing and Proof Version 2.1
Author: Ted Sundstrom
An Introduction to Mathematical Reasoning: Numbers, Sets and Functions
Author: Peter J. Eccles

Differential Equations
Differential Equations (with DE Tools Printed Access Card)
Author: Paul Blanchard - Robert L. Devaney - Glen R. Hall

Calculus
Calculus: Early Transcendentals
Author: James Stewart

And more....
Michael Gitabaum
So which country will lead in the broader category of business AI? Today, the United States enjoys a commanding lead (90–10) in this wave, but I believe in five years China will close that gap somewhat (70–30), and the Chinese government has a better shot at putting the power of business AI to good use. The United States has a clear advantage in the most immediate and profitable implementations of the technology: optimizations within banking, insurance, or any industry with lots of structured data that can be mined for better decision-making. Its companies have the raw material and corporate willpower to apply business AI to the problem of maximizing their bottom line.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
Optimizations like this work well in industries with large amounts of structured data on meaningful business outcomes. In this case, “structured” refers to data that has been categorized, labeled, and made searchable. Prime examples of well-structured corporate data sets include historic stock prices, credit-card usage, and mortgage defaults.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
“Correlation is enough,”2 then-Wired editor in chief Chris Anderson famously declared in 2008. We can, he implied, solve innovation problems by the sheer brute force of the data deluge. Ever since Michael Lewis chronicled the Oakland A’s unlikely success in Moneyball (who knew on-base percentage was a better indicator of offensive success than batting averages?), organizations have been trying to find the Moneyball equivalent of customer data that will lead to innovation success. Yet few have. Innovation processes in many companies are structured and disciplined, and the talent applying them is highly skilled. There are careful stage-gates, rapid iterations, and checks and balances built into most organizations’ innovation processes. Risks are carefully calculated and mitigated. Principles like six-sigma have pervaded innovation process design so we now have precise measurements and strict requirements for new products to meet at each stage of their development. From the outside, it looks like companies have mastered an awfully precise, scientific process. But for most of them, innovation is still painfully hit or miss. And worst of all, all this activity gives the illusion of progress, without actually causing it. Companies are spending exponentially more to achieve only modest incremental innovations while completely missing the mark on the breakthrough innovations critical to long-term, sustainable growth. As Yogi Berra famously observed: “We’re lost, but we’re making good time!” What’s gone so wrong? Here is the fundamental problem: the masses and masses of data that companies accumulate are not organized in a way that enables them to reliably predict which ideas will succeed. 
Instead the data is along the lines of “this customer looks like that one,” “this product has similar performance attributes as that one,” and “these people behaved the same way in the past,” or “68 percent of customers say they prefer version A over version B.” None of that data, however, actually tells you why customers make the choices that they do.
Clayton M. Christensen (Competing Against Luck: The Story of Innovation and Customer Choice)
I took 17 computer science classes and made an A in 11 of them, was 1 point away from an A in 3 of them, and the rest of them didn't matter. Math is a tool for physics, chemistry, biology/basic computation and nothing else.

CS I (Pascal Vax), CS II (Pascal Vax), Sr. Software Engineering, Sr. Distributed Systems, Sr. Research, Sr. Operating Systems, Sr. Unix Operating Systems, Data Structures, Sr. Object Oriented A&D, CS (perl/linux), Sr. Java Programming, Information Systems Design, Jr. Unix Operating Systems, Microprocessors, Programming Algorithms, Calculus I, II, III, B Differential Equations, TI-89 Mathematical Reasoning, 92 C++ Programming, Assembly 8086, Digital Computer Organization, Discrete Math I, II, B Statistics for the Engineering & Sciences (w/ permutations & combinatorics)

A - American Literature
A - United States History 1865
CLEP - full year english
CLEP - full year biology
A - Psychology
A - Environmental Ethics
Michael Gitabaum
The collapse of startups should be no surprise. Ever since antitrust enforcement was changed under Ronald Reagan in the early 1980s, small was bad and big was considered beautiful. Murray Weidenbaum, the first chair of Reagan's Council of Economic Advisors, argued that economic growth, not competition, should be policymakers' primary goal. In his words, “It is not the small businesses that created the jobs,” he concluded, “but the economic growth.” And small businesses were sacrificed for the sake of bigger businesses.34 Ryan Decker, an economist at the Federal Reserve, found that the decline is even infecting the high technology sector. Americans look at startups over the years like PayPal and Uber and conclude the tech scene is thriving, but Decker points out that in the post-2000 period, we have seen a decline even in areas of great innovation like technology. Over the past 15 years, there are not only fewer technology startups, but these young firms are slower growing than they were before. Given the importance of technology to growth and productivity, his findings should be extremely troubling. The decline in firm entries is a mystery to many economists, but the cause is clear: greater industrial concentration has been choking the economy, leading to fewer startups. Firms are getting bigger and older. In a comprehensive study, Professor Gustavo Grullon showed that the disappearance of small firms is directly related to increasing industrial concentration. In real terms, the average firm in the economy has become three times larger over the past 20 years. The proportion of people employed by firms with 10,000 employees or more has been growing steadily. The share started to increase in the 1990s, and has recently exceeded previous historical peaks. 
Grullon concluded that when you look at all the evidence, it points “to a structural change in the US labor market, where most jobs are being created by large and established firms, rather than by entrepreneurial activity.”35 The employment data of small firms supports Grullon's conclusions; from 1978 to 2011, the number of jobs created by new firms fell from 3.4% of total business employment to 2% (Figure 3.2).36
Jonathan Tepper (The Myth of Capitalism: Monopolies and the Death of Competition)
As prevalent as disks once were, they are now a dying breed. Soon they will have gone the way of tape drives, floppy drives, and CDs. They are being replaced by RAM. Ask yourself this question: When all the disks are gone, and all your data is stored in RAM, how will you organize that data? Will you organize it into tables and access it with SQL? Will you organize it into files and access it through a directory? Of course not. You’ll organize it into linked lists, trees, hash tables, stacks, queues, or any of the other myriad data structures, and you’ll access it using pointers or references—because that’s what programmers do.
Robert C. Martin (Clean Architecture)
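The structures Martin lists are navigated through references rather than queries. As a rough illustration of the contrast with tables and SQL, here is a minimal singly linked list in Python (a generic example, not from the book):

```python
class Node:
    """One cell of a singly linked list: a value plus a reference onward."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def to_list(head):
    """Walk the chain of references and collect the values in order."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

# Build 1 -> 2 -> 3 by linking nodes through references, not rows and keys.
head = Node(1, Node(2, Node(3)))
print(to_list(head))  # prints "[1, 2, 3]"
```

Access here is by following a pointer, not by issuing a query against a directory or a table, which is exactly the distinction the quote draws.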
The use case class accepts simple request data structures for its input, and returns simple response data structures as its output. These data structures are not dependent on anything.
Robert C. Martin (Clean Architecture)
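One way to read that passage in Python: the request and response are plain data holders with no framework or database imports, so the use case that consumes them depends on nothing external. A sketch (all names here are invented for illustration, not from the book):

```python
from dataclasses import dataclass

@dataclass
class RegisterUserRequest:
    """Simple input structure: plain fields, no framework dependencies."""
    username: str
    email: str

@dataclass
class RegisterUserResponse:
    """Simple output structure, equally dependency-free."""
    ok: bool
    message: str

def register_user(request: RegisterUserRequest) -> RegisterUserResponse:
    """The use case accepts a request structure and returns a response structure."""
    if "@" not in request.email:
        return RegisterUserResponse(ok=False, message="invalid email")
    return RegisterUserResponse(ok=True, message=f"registered {request.username}")

result = register_user(RegisterUserRequest("ada", "ada@example.com"))
print(result.ok)  # prints "True"
```

Because both structures are plain data, the use case can be exercised in a unit test with no web server or database present, which is the architectural payoff Martin is pointing at.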
4. Sketch Out Possible Dynamic Behaviors—Given the structure you have identified, what possible dynamic behaviors can be produced? Compare those to what has been observed to build support for your initial model. Does the model support the observed data? If not, there may be unidentified loops or delays present.
Rich Jolly (Systems Thinking for Business: Capitalize on Structures Hidden in Plain Sight)
at its core, the Black Swan is a modeling problem. The observer has deemed the event improbable based on their existing experiences which serve as their current data set.
Rich Jolly (Systems Thinking for Business: Capitalize on Structures Hidden in Plain Sight)
The scientific method asserts that one cannot prove a hypothesis, only disprove it. Seeing more white swans may strengthen the hypothesis, but seeing a black swan can disprove it. The confirmation bias, the search for data that confirms existing beliefs (versus alternatives), accentuates this effect in humans.
Rich Jolly (Systems Thinking for Business: Capitalize on Structures Hidden in Plain Sight)
In the early phases, the structure exhibits exponential growth. This identifies the reinforcing loop. However, the balancing loop is effectively dormant. It's only later in time that the balancing loop reveals itself in the data set.
Rich Jolly (Systems Thinking for Business: Capitalize on Structures Hidden in Plain Sight)
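Jolly's point about a dormant balancing loop can be sketched numerically. In this toy limits-to-growth model (the parameter values are arbitrary), the balancing term is negligible early on, so the stock grows near-exponentially; only later does the balancing loop reveal itself in the data:

```python
def simulate(steps, rate=0.5, capacity=1000.0, stock=1.0):
    """Discrete logistic growth: reinforcing and balancing loops coupled."""
    history = []
    for _ in range(steps):
        # Reinforcing loop: growth proportional to the current stock.
        # Balancing loop: the (1 - stock/capacity) factor is ~1 while the
        # stock is small (dormant) and only bites as capacity is approached.
        stock += rate * stock * (1 - stock / capacity)
        history.append(stock)
    return history

h = simulate(40)
print(h[1] / h[0])    # early step-to-step ratio ~1.5: near-exponential growth
print(h[-1] / h[-2])  # late ratio ~1.0: the balancing loop now dominates
```

Fitting only the early portion of such a data set would suggest a pure reinforcing loop, which is exactly the modeling trap the quote warns about.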
If, as I believe, the conceptual structures we construct today are too complicated to be accurately specified in advance, and too complex to be built faultlessly, then we must take a radically different approach. Let us turn to nature and study complexity in living things, instead of just the dead works of man. Here we find constructs whose complexities thrill us with awe. The brain alone is intricate beyond mapping, powerful beyond imitation, rich in diversity, self-protecting, and self-renewing. The secret is that it is grown, not built. So it must be with our software systems. Some years ago Harlan Mills proposed that any software system should be grown by incremental development.[11] That is, the system should first be made to run, even though it does nothing useful except call the proper set of dummy subprograms. Then, bit by bit it is fleshed out, with the subprograms in turn being developed into actions or calls to empty stubs in the level below. I have seen the most dramatic results since I began urging this technique on the project builders in my software engineering laboratory class. Nothing in the past decade has so radically changed my own practice, or its effectiveness. The approach necessitates top-down design, for it is a top-down growing of the software. It allows easy backtracking. It lends itself to early prototypes. Each added function and new provision for more complex data or circumstances grows organically out of what is already there. The morale effects are startling. Enthusiasm jumps when there is a running system, even a simple one. Efforts redouble when the first picture from a new graphics software system appears on the screen, even if it is only a rectangle. One always has, at every stage in the process, a working system. I find that teams can grow much more complex entities in four months than they can build.
Frederick P. Brooks Jr. (The Mythical Man-Month: Essays on Software Engineering)
Having studied both the possible risks and the likely rewards, the Guardian’s managers decided both to “open in” the website, by bringing in more data and applications from the outside, and to “open out” the site, by enabling partners to create products using Guardian content and services on other digital platforms. To work toward the “open out” goal, the Guardian created a set of APIs that made its content easily available to external parties. These interfaces include three different levels of access. The lowest access tier, which the paper calls Keyless, allows anyone to use Guardian headlines, metadata, and information architecture (that is, the software and design elements that structure Guardian data and make it easier to access, analyze, and use) without requesting permission and without any requirement to share revenues that might be generated. The second access tier, Approved, allows registered developers to reprint entire Guardian articles, with certain time and usage restrictions. Advertising revenues are shared between the newspaper and the developers. The third and highest access tier, Bespoke, is a customized support package that provides unlimited use of Guardian content—for a fee.
Geoffrey G. Parker (Platform Revolution: How Networked Markets Are Transforming the Economy and How to Make Them Work for You: How Networked Markets Are Transforming the Economy―and How to Make Them Work for You)
In using the notion of self, I am in no way suggesting that all the contents of our minds are inspected by a single central knower and owner, and even less that such an entity would reside in a single brain place. I am saying, though, that our experiences tend to have a consistent perspective, as if there were indeed an owner and knower for most, though not all, contents. I imagine this perspective to be rooted in a relatively stable, endlessly repeated biological state. The source of the stability is the predominantly invariant structure and operation of the organism, and the slowly evolving elements of autobiographical data.
António Damásio (Descartes' Error: Emotion, Reason and the Human Brain)
Logic. Rationality. Reasoning. Thought. Analysis. Calculation. Decision-making. All this is within the mind of a human being, correct? Humanity
Code Well Academy (Javascript Artificial Intelligence: Made Easy, w/ Essential Programming; Create your * Problem Solving * Algorithms! TODAY! w/ Machine Learning & Data Structures (Artificial Intelligence Series))
Each tribe’s solution to its central problem is a brilliant, hard-won advance. But the true Master Algorithm must solve all five problems, not just one. For example, to cure cancer we need to understand the metabolic networks in the cell: which genes regulate which others, which chemical reactions the resulting proteins control, and how adding a new molecule to the mix would affect the network. It would be silly to try to learn all of this from scratch, ignoring all the knowledge that biologists have painstakingly accumulated over the decades. Symbolists know how to combine this knowledge with data from DNA sequencers, gene expression microarrays, and so on, to produce results that you couldn’t get with either alone. But the knowledge we obtain by inverse deduction is purely qualitative; we need to learn not just who interacts with whom, but how much, and backpropagation can do that. Nevertheless, both inverse deduction and backpropagation would be lost in space without some basic structure on which to hang the interactions and parameters they find, and genetic programming can discover it. At this point, if we had complete knowledge of the metabolism and all the data relevant to a given patient, we could figure out a treatment for her. But in reality the information we have is always very incomplete, and even incorrect in places; we need to make headway despite that, and that’s what probabilistic inference is for. In the hardest cases, the patient’s cancer looks very different from previous ones, and all our learned knowledge fails. Similarity-based algorithms can save the day by seeing analogies between superficially very different situations, zeroing in on their essential similarities and ignoring the rest. In this book we will synthesize a single algorithm with all these capabilities:
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
For the hardest problems—the ones we really want to solve but haven’t been able to, like curing cancer—pure nature-inspired approaches are probably too uninformed to succeed, even given massive amounts of data. We can in principle learn a complete model of a cell’s metabolic networks by a combination of structure search, with or without crossover, and parameter learning via backpropagation, but there are too many bad local optima to get stuck in. We need to reason with larger chunks, assembling and reassembling them as needed and using inverse deduction to fill in the gaps. And we need our learning to be guided by the goal of optimally diagnosing cancer and finding the best drugs to cure it. Optimal learning is the Bayesians’ central goal, and they are in no doubt that they’ve figured out how to reach it. This way, please …
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
The rate of time flow perceived by an observer in the simulated universe is completely independent of the rate at which a computer runs the simulation, a point emphasized in Greg Egan's science-fiction novel Permutation City. Moreover, as we discussed in the last chapter and as stressed by Einstein, it's arguably more natural to view our Universe not from the frog perspective as a three-dimensional space where things happen, but from the bird perspective as a four-dimensional spacetime that merely is. There should therefore be no need for the computer to compute anything at all-it could simply store all the four-dimensional data, that is, encode all properties of the mathematical structure that is our Universe. Individual time slices could then be read out sequentially if desired, and the "simulated" world should still feel as real to its inhabitants as in the case where only three-dimensional data is stored and evolved. In conclusion: the role of the simulating computer isn't to compute the history of our Universe, but to specify it. How specify it? The way in which the data are stored (the type of computer, the data format, etc.) should be irrelevant, so the extent to which the inhabitants of the simulated universe perceive themselves as real should be independent of whatever method is used for data compression. The physical laws that we've discovered provide great means of data compression, since they make it sufficient to store the initial data at some time together with the equations and a program computing the future from these initial data. 
As emphasized on pages 340-344, the initial data might be extremely simple: popular initial states from quantum field theory with intimidating names such as the Hawking-Hartle wavefunction or the inflationary Bunch-Davies vacuum have very low algorithmic complexity, since they can be defined in brief physics papers, yet simulating their time evolution would simulate not merely one universe like ours, but a vast decohering collection of parallel ones. It's therefore plausible that our Universe (and even the whole Level III multiverse) could be simulated by quite a short computer program.
Max Tegmark (Our Mathematical Universe: My Quest for the Ultimate Nature of Reality)
Serialization is the process of converting the internal representation of a data structure into a format that can be transmitted one byte at a time, also known as a byte stream. Serialization
Andreas M. Antonopoulos (Mastering Bitcoin: Programming the Open Blockchain)
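Antonopoulos's definition, converting an internal representation into a byte stream, can be illustrated with Python's struct module (a generic example; the two-field layout here is invented for the sketch and is not Bitcoin's actual serialization format):

```python
import struct

# Internal representation: a small in-memory record
# (a version number and an amount, as a Python tuple).
record = (1, 50_000)

# Serialize: pack the fields into a defined byte layout.
# "<IQ" = little-endian unsigned 32-bit int followed by unsigned 64-bit int.
stream = struct.pack("<IQ", *record)
print(stream.hex())  # prints "0100000050c3000000000000"

# Deserialize: recover the internal representation from the byte stream.
assert struct.unpack("<IQ", stream) == record
```

The byte stream can be written to disk or sent over a socket one byte at a time, while the in-memory tuple cannot; that round trip is the whole point of serialization.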
Our understanding of the sociology of knowledge leads to the conclusion that the sociologies of language and religion cannot be considered peripheral specialties of little interest to sociological theory as such, but have essential contributions to make to it. This insight is not new. Durkheim and his school had it, but it was lost for a variety of theoretically irrelevant reasons. We hope we have made it clear that the sociology of knowledge presupposes a sociology of language, and that a sociology of knowledge without a sociology of religion is impossible (and vice versa). Furthermore, we believe that we have shown how the theoretical positions of Weber and Durkheim can be combined in a comprehensive theory of social action that does not lose the inner logic of either. Finally, we would contend that the linkage we have been led to make here between the sociology of knowledge and the theoretical core of the thought of Mead and his school suggests an interesting possibility for what might be called a sociological psychology, that is, a psychology that derives its fundamental perspectives from a sociological understanding of the human condition. The observations made here point to a program that seems to carry theoretical promise. More generally, we would contend that the analysis of the role of knowledge in the dialectic of individual and society, of personal identity and social structure, provides a crucial complementary perspective for all areas of sociology. This is certainly not to deny that purely structural analyses of social phenomena are fully adequate for wide areas of sociological inquiry, ranging from the study of small groups to that of large institutional complexes, such as the economy or politics. Nothing is further from our intentions than the suggestion that a sociology-of-knowledge “angle” ought somehow to be injected into all such analyses. In many cases this would be unnecessary for the cognitive goal at which these studies aim. 
We are suggesting, however, that the integration of the findings of such analyses into the body of sociological theory requires more than the casual obeisance that might be paid to the “human factor” behind the uncovered structural data. Such integration requires a systematic accounting of the dialectical relation between the structural realities and the human enterprise of constructing reality—in history.
Peter L. Berger (The Social Construction of Reality: A Treatise in the Sociology of Knowledge)
One of the most surprising findings to emerge from neuroscience in recent years is that rather than responding in real time to the vast amount of incoming sensory data, the brain tries to keep one step ahead by constantly predicting what will happen next. It simulates a model of the immediate future based on what has just happened. When its predictions turn out to be wrong—for example, we’re feeling just fine then suddenly experience a stab of anxiety about a romantic date—this mismatch creates an unpleasant sense of dissatisfaction that we can either try to resolve by ruminating and then doing something to alleviate the anxiety (canceling the date, perhaps) or by updating the brain’s model of reality (investigating and accepting the new sensation). These alternative strategies employ the “narrative” and “being” modes of thought I described earlier in this chapter. Of course, both strategies have their place according to the situation, but an overreliance on avoidance behavior rather than acceptance stores up problems for the future because there are many things in life that cannot be changed and therefore need to be faced.

Mindfulness through interoception is all about accepting the way things are. When we are mindful, the insula continually updates its representation of our internal world to improve its accuracy by reducing discrepancies between expectation and reality. As we’ve seen in previous chapters, this reality check—the focusing of dispassionate attention on unpleasant sensations such as pain or anxiety—loosens the hold that they have over us. So the structural changes in the brains of highly experienced meditators of Siddhārtha’s caliber, in particular in their insula and ACC, may be responsible for the imperturbable calm and acceptance that is the ultimate goal of contemplative practice, sometimes described as enlightenment or nirvana.
James Kingsland (Siddhartha's Brain: Unlocking the Ancient Science of Enlightenment)
The separation of mind and body that informs medical practice is also the dominant ideology in our culture. We do not often think of socio-economic structures and practices as determinants of illness or well-being. They are not usually “part of the equation.” Yet the scientific data is beyond dispute: socio-economic relationships have a profound influence on health. For example, although the media and the medical profession — inspired by pharmaceutical research — tirelessly promote the idea that next to hypertension and smoking, high cholesterol poses the greatest risk for heart disease, the evidence is that job strain is more important than all the other risk factors combined. Further, stress in general and job strain in particular are significant contributors both to high blood pressure and to elevated cholesterol levels.

Economic relationships influence health because, most obviously, people with higher incomes are better able to afford healthier diets, living and working conditions and stress-reducing pursuits. Dennis Raphael, associate professor at the School of Health Policy and Management at York University in Toronto has recently published a study of the societal influences on heart disease in Canada and elsewhere. His conclusion: “One of the most important life conditions that determine whether individuals stay healthy or become ill is their income. In addition, the overall health of North American society may be more determined by the distribution of income among its members rather than the overall wealth of the society…. Many studies find that socioeconomic circumstances, rather than medical and lifestyle risk factors, are the main causes of cardiovascular disease, and that conditions during early life are especially important.” The element of control is the less obvious but equally important aspect of social and job status as a health factor.
Since stress escalates as the sense of control diminishes, people who exercise greater control over their work and lives enjoy better health. This principle was demonstrated in the British Whitehall study showing that second-tier civil servants were at greater risk for heart disease than their superiors, despite nearly comparable incomes. Recognizing the multigenerational template for behaviour and for illness, and recognizing, too, the social influences that shape families and human lives, we dispense with the unhelpful and unscientific attitude of blame. Discarding blame leaves us free to move toward the necessary adoption of responsibility, a matter to be taken up when we come in the final chapters to consider healing.
Gabor Maté (When the Body Says No: The Cost of Hidden Stress)