Data Structures Quotes

We've searched our database for all the quotes and captions related to Data Structures. Here they are! All 100 of them:

“For the human brain,” Edmond explained, “any answer is better than no answer. We feel enormous discomfort when faced with ‘insufficient data,’ and so our brains invent the data—offering us, at the very least, the illusion of order—creating myriad philosophies, mythologies, and religions to reassure us that there is indeed an order and structure to the unseen world.”
Dan Brown (Origin (Robert Langdon, #5))
I will, in fact, claim that the difference between a bad programmer and a good one is whether he considers his code or his data structures more important. Bad programmers worry about the code. Good programmers worry about data structures and their relationships.
Linus Torvalds
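An editorial aside on the Torvalds quote above, not from Torvalds himself: a minimal Python sketch of what "worrying about data structures" buys you. Choosing the right structure (a mapping from word to count) makes the code nearly vanish, while the code-first version carries the burden the structure would have carried.

```python
from collections import Counter

# Data-structure-first: pick a mapping from word -> count,
# and the algorithm almost disappears.
def word_counts(text: str) -> Counter:
    return Counter(text.split())

# Code-first: same result, but the looping logic now does the work
# that the right data structure would have done for us.
def word_counts_manual(text: str) -> dict:
    counts = {}
    for word in text.split():
        found = False
        for key in counts:
            if key == word:
                found = True
                break
        if found:
            counts[word] += 1
        else:
            counts[word] = 1
    return counts
```

Both produce the same counts; the difference is where the complexity lives.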
Generally, the craft of programming is the factoring of a set of requirements into a set of functions and data structures.
Douglas Crockford (JavaScript: The Good Parts)
It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures.
Alan J. Perlis (Structure and Interpretation of Computer Programs)
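A small Python illustration of Perlis's aphorism, added here for clarity and not drawn from SICP: when many small functions all speak one plain structure (a list of dicts), they compose freely, which ten bespoke structures with ten bespoke functions would not allow.

```python
# One data structure: a list of plain dict records.
people = [
    {"name": "Ada", "age": 36},
    {"name": "Alan", "age": 41},
    {"name": "Grace", "age": 85},
]

# Many functions, all operating on the same structure, so they compose.
def where(rows, pred):
    return [r for r in rows if pred(r)]

def pluck(rows, key):
    return [r[key] for r in rows]

def order_by(rows, key):
    return sorted(rows, key=lambda r: r[key])

# Because every function consumes and produces the same shape,
# pipelines like this need no glue code.
names_over_40 = pluck(order_by(where(people, lambda r: r["age"] > 40), "age"), "name")
```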
Compassion is what you're good at. I'm better at complex searches through organized data structures.
Orson Scott Card (Speaker for the Dead (Ender's Saga, #2))
Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells.
Harold Abelson (Structure and Interpretation of Computer Programs)
Companies can learn a lot from biological systems. The human immune system, for example, is adaptive, redundant, diverse, modular, data-driven and network collaborative. A company that desires not just short term profit but also long term resilience should apply these features of the human immune system to its business models and company structure.
Hendrith Vanlon Smith Jr.
The string is a stark data structure and everywhere it is passed there is much duplication of process. It is a perfect vehicle for hiding information.
Alan J. Perlis
Temporality is obviously an organised structure, and these three so-called elements of time: past, present, future, must not be envisaged as a collection of 'data' to be added together...but as the structured moments of an original synthesis. Otherwise we shall immediately meet with this paradox: the past is no longer, the future is not yet, as for the instantaneous present, everyone knows that it is not at all: it is the limit of infinite division, like the dimensionless point.
Jean-Paul Sartre (Being and Nothingness)
Programming is a science dressed up as art, because most of us don’t understand the physics of software and it’s rarely, if ever, taught. The physics of software is not algorithms, data structures, languages, and abstractions. These are just tools we make, use, and throw away. The real physics of software is the physics of people. Specifically, it’s about our limitations when it comes to complexity and our desire to work together to solve large problems in pieces. This is the science of programming: make building blocks that people can understand and use easily, and people will work together to solve the very largest problems.
Pieter Hintjens (ZeroMQ: Messaging for Many Applications)
All so-called ‘quantitative’ data, when scrutinized, turn out to be composites of ‘qualitative’ – i.e., contextually located and indexical – interpretations produced by situated researchers, coders, government officials and others.
Anthony Giddens (The Constitution of Society: Outline of the Theory of Structuration)
He liked to start sentences with okay, so. It was a habit he had picked up from the engineers. He thought it made him sound smarter, thought it made him sound like them, those code jockeys, standing by the coffee machine, talking faster than he could think, talking not so much in sentences as in data structures, dense clumps of logic with the occasional inside joke. He liked to stand near them, pretending to stir sugar into his coffee, listening in on them as if they were speaking a different language. A language of knowing something, a language of being an expert at something. A language of being something more than an hourly unit.
Charles Yu (Sorry Please Thank You)
Not all roots are buried down in the ground, some are at the top of a tree.
Jinvirle
Purpose gives meaning to action in the same way that structure gives meaning to data.
David Amerland (Intentional: How To Live, Love, Work and Play Meaningfully)
First identified by academic psychologist Leon Festinger, cognitive dissonance occurs when we are confronted with empirical data at odds with the way we “know” the world to work. To resolve this discrepancy, we choose to ignore data or try to fit the data into our preconceived belief structure. Sometimes, there is a crisis and the belief structure eventually crumbles.
David N. Schwartz (The Last Man Who Knew Everything: The Life and Times of Enrico Fermi, Father of the Nuclear Age)
All programs transform data, converting an input into an output. And yet when we think about design, we rarely think about creating transformations. Instead we worry about classes and modules, data structures and algorithms, languages and frameworks.
Andrew Hunt (The Pragmatic Programmer: Your Journey to Mastery)
Until now, I've been writing about "now" as if it were literally an instant of time, but of course human faculties are not infinitely precise. It is simplistic to suppose that physical events and mental events march along exactly in step, with the stream of "actual moments" in the outside world and the stream of conscious awareness of them perfectly synchronized. The cinema industry depends on the phenomenon that what seems to us a movie is really a succession of still pictures, running at twenty-five [sic] frames per second. We don't notice the joins. Evidently the "now" of our conscious awareness stretches over at least 1/25 of a second. In fact, psychologists are convinced it can last a lot longer than that. Take the familiar "tick-tock" of the clock. Well, the clock doesn't go "tick-tock" at all; it goes "tick-tick," every tick producing the same sound. It's just that our consciousness runs two successive ticks into a single "tick-tock" experience—but only if the duration between ticks is less than about three seconds. A really big pendulum clock just goes "tock . . . tock . . . tock," whereas a bedside clock chatters away: "ticktockticktock..." Two to three seconds seems to be the duration over which our minds integrate sense data into a unitary experience, a fact reflected in the structure of human music and poetry.
Paul C.W. Davies (About Time: Einstein's Unfinished Revolution)
Quote may seem a bit of a foreign concept, because few other languages have anything like it. It's closely tied to one of the most distinctive features of Lisp: code and data are made out of the same data structures, and the quote operator is the way we distinguish between them.
Paul Graham
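Python has nothing quite like Lisp's quote operator, but the code-is-data idea Graham describes can be gestured at with the standard ast module: source text, once parsed, becomes a tree of ordinary objects that a program can walk like any other data structure. This is an editorial analogy added for illustration, not Graham's example.

```python
import ast

# The string "1 + 2" is data; ast.parse turns it into a tree of
# ordinary Python objects that a program can inspect and traverse
# like any other data structure -- code as data.
tree = ast.parse("1 + 2", mode="eval")
node = tree.body                 # a BinOp node: left <op> right
left = node.left.value           # the constant 1
right = node.right.value         # the constant 2
```

In Lisp the analogy is tighter, since quoted code already *is* the list structure the language manipulates; Python needs the parser as an intermediary.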
Sherrie described atheism as a positive system of belief—one based on data, exploration and observation rather than scripture, creed and prayer. Atheists believe that human life is a chemical phenomenon, that our first parents were super-novas that happened billions of years ago—that humans are inexplicable miracles in a universe of structured chaos. Atheists believe that when we die, we will turn into organic debris which will continue cycling for billions of years in various incarnations. Sherrie explained that atheists appreciate life unfathomably because it is going to end. No one who takes atheism seriously dies without hope.
Israel Morrow (Gods of the Flesh: A Skeptic's Journey Through Sex, Politics and Religion)
Database Management System [Origin: Data + Latin basus "low, mean, vile, menial, degrading, counterfeit."] A complex set of interrelational data structures allowing data to be lost in many convenient sequences while retaining a complete record of the logical relations between the missing items. -- From The Devil's DP Dictionary
Stan Kelly-Bootle
It is very easy to grow tired at collecting; the period of a low tide is about all men can endure. At first the rocks are bright and every moving animal makes his mark on the attention. The picture is wide and colored and beautiful. But after an hour and a half the attention centers weary, the color fades, and the field is likely to narrow to an individual animal. Here one may observe his own world narrowed down until interest and, with it, observation, flicker and go out. And what if with age this weariness becomes permanent and observation dim out and not recover? Can this be what happens to so many men of science? Enthusiasm, interest, sharpness, dulled with a weariness until finally they retire into easy didacticism? With this weariness, this stultification of attention centers, perhaps there comes the pained and sad memory of what the old excitement was like, and regret might turn to envy of the men who still have it. Then out of the shell of didacticism, such a used-up man might attack the unwearied, and he would have in his hands proper weapons of attack. It does seem certain that to a wearied man an error in a mass of correct data wipes out all the correctness and is a focus for attack; whereas the unwearied man, in his energy and receptivity, might consider the little dross of error a by-product of his effort. These two may balance and produce a purer thing than either in the end. These two may be the stresses which hold up the structure, but it is a sad thing to see the interest in interested men thin out and weaken and die. We have known so many professors who once carried their listeners high on their single enthusiasm, and have seen these same men finally settle back comfortably into lectures prepared years before and never vary them again. Perhaps this is the same narrowing we observe in relation to ourselves and the tide pool—a man looking at reality brings his own limitations to the world. 
If he has strength and energy of mind the tide pool stretches both ways, digs back to electrons and leaps space into the universe and fights out of the moment into non-conceptual time. Then ecology has a synonym which is ALL.
John Steinbeck (The Log from the Sea of Cortez)
What the ethnographer is in fact faced with—except when (as, of course, he must do) he is pursuing the more automatized routines of data collection—is a multiplicity of complex conceptual structures, many of them superimposed upon or knotted into one another, which are at once strange, irregular, and inexplicit, and which he must contrive somehow first to grasp and then to render. And this is true at the most down-to-earth, jungle field work levels of his activity; interviewing informants, observing rituals, eliciting kin terms, tracing property lines, censusing households … writing his journal. Doing ethnography is like trying to read (in the sense of “construct a reading of”) a manuscript—foreign, faded, full of ellipses, incoherencies, suspicious emendations, and tendentious commentaries, but written not in conventionalized graphs of sound but in transient examples of shaped behavior.
Clifford Geertz (The Interpretation of Cultures)
Huge volumes of data may be compelling at first glance, but without an interpretive structure they are meaningless.
Tom Boellstorff (Ethnography and Virtual Worlds: A Handbook of Method)
It’s only because the data force us into corners that we are inspired to create the highly counterintuitive structures that form the basis for modern physics.
Sean Carroll (The Particle at the End of the Universe: How the Hunt for the Higgs Boson Leads Us to the Edge of a New World)
Now he was…dust. To an outside observer, these ten seconds had been ground up into ten thousand uncorrelated moments and scattered throughout real time - and in model time, the outside world had suffered an equivalent fate. Yet the pattern of his awareness remained perfectly intact: somehow he found himself, “assembled himself” from these scrambled fragments. He’d been taken apart like a jigsaw puzzle - but his dissection and shuffling were transparent to him. Somehow - on their own terms - the pieces remained connected. Imagine a universe entirely without structure, without shape, without connections. A cloud of microscopic events, like fragments of space-time … except that there is no space or time. What characterizes one point in space, for one instant? Just the values of the fundamental particle fields, just a handful of numbers. Now, take away all notions of position, arrangement, order, and what’s left? A cloud of random numbers. But if the pattern that is me could pick itself out from all the other events taking place on this planet, why shouldn’t the pattern we think of as ‘the universe’ assemble itself, find itself, in exactly the same way? If I can piece together my own coherent space and time from data scattered so widely that it might as well be part of some giant cloud of random numbers, then what makes you think that you’re not doing the very same thing?
Greg Egan (Permutation City)
Will you be encountering each other for the first time through this communication, or do you have an established relationship? Do they already trust you as an expert, or do you need to work to establish credibility? These are important considerations when it comes to determining how to structure your communication and whether and when to use data, and may impact the order and flow of the overall story you aim to tell.
Cole Nussbaumer Knaflic (Storytelling with Data: A Data Visualization Guide for Business Professionals)
Amidst all this organic plasticity and compromise, though, the infrastructure fields could still stake out territory for a few standardized subsystems, identical from citizen to citizen. Two of these were channels for incoming data—one for gestalt, and one for linear, the two primary modalities of all Konishi citizens, distant descendants of vision and hearing. By the orphan's two-hundredth iteration, the channels themselves were fully formed, but the inner structures to which they fed their data, the networks for classifying and making sense of it, were still undeveloped, still unrehearsed. Konishi polis itself was buried two hundred meters beneath the Siberian tundra, but via fiber and satellite links the input channels could bring in data from any forum in the Coalition of Polises, from probes orbiting every planet and moon in the solar system, from drones wandering the forests and oceans of Earth, from ten million kinds of scape or abstract sensorium. The first problem of perception was learning how to choose from this superabundance.
Greg Egan (Diaspora)
But when you look at the CMB map, you also see that the structure that is observed is, in fact, in a weird way, correlated with the plane of the earth around the sun. Is this Copernicus coming back to haunt us? That's crazy. We're looking out at the whole universe. There's no way there should be a correlation of structure with our motion of the earth around the sun - the plane of the earth around the sun - the ecliptic. That would say we are truly the center of the universe. The new results are either telling us that all of science is wrong and we're the center of the universe, or maybe the data is simply incorrect, or maybe it's telling us there's something weird about the microwave background results and that maybe, maybe there's something wrong with our theories on the larger scales.
Lawrence M. Krauss
I was so struck by Flow’s negative implications for parents that I decided I wanted to speak to Csikszentmihalyi, just to make sure I wasn’t misreading him. And eventually I did, at a conference in Philadelphia where he was one of the marquee speakers. As we sat down to chat, the first thing I asked was why he talked so little about family life in Flow. He devotes only ten pages to it. “Let me tell you a couple of things that may be relevant to you,” he said. And then he told a personal story. When Csikszentmihalyi first developed the Experience Sampling Method, one of the first people he tried it out on was himself. “And at the end of the week,” he said, “I looked at my responses, and one thing that suddenly was very strange to me was that every time I was with my two sons, my moods were always very, very negative.” His sons weren’t toddlers at that point either. They were older. “And I said, ‘This doesn’t make any sense to me, because I’m very proud of them, and we have a good relationship.’ ” But then he started to look at what, specifically, he was doing with his sons that made his feelings so negative. “And what was I doing?” he asked. “I was saying, ‘It’s time to get up, or you will be late for school.’ Or, ‘You haven’t put away your cereal dish from breakfast.’ ” He was nagging, in other words, and nagging is not a flow activity. “I realized,” he said, “that being a parent consists, in large part, of correcting the growth pattern of a person who is not necessarily ready to live in a civilized society.” I asked if, in that same data set, he had any numbers about flow in family life. None were in his book. He said he did. “They were low. Family life is organized in a way that flow is very difficult to achieve, because we assume that family life is supposed to relax us and to make us happy. But instead of being happy, people get bored.” Or enervated, as he’d said before, when talking about disciplining his sons. 
And because children are constantly changing, the “rules” of handling them change too, which can further confound a family’s ability to flow. “And then we get into these spirals of conflict and so forth,” he continued. “That’s why I’m saying it’s easier to get into flow at work. Work is more structured. It’s structured more like a game. It has clear goals, you get feedback, you know what has to be done, there are limits.” He thought about this. “Partly, the lack of structure in family life, which seems to give people freedom, is actually a kind of an impediment.
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
Persinger and Lafreniere: We, as a species, exist in a world in which exist a myriad of data points. Upon these matrices of points we superimpose a structure and the world makes sense to us. The pattern of the structure originates within our biological and sociological properties.
Robert Anton Wilson (Prometheus Rising)
Despite all their surface diversity, most jokes and funny incidents have the following logical structure: Typically you lead the listener along a garden path of expectation, slowly building up tension. At the very end, you introduce an unexpected twist that entails a complete reinterpretation of all the preceding data, and moreover, it's critical that the new interpretation, though wholly unexpected, makes as much "sense" of the entire set of facts as did the originally "expected" interpretation. In this regard, jokes have much in common with scientific creativity, with what Thomas Kuhn calls a "paradigm shift" in response to a single "anomaly." (It's probably not coincidence that many of the most creative scientists have a great sense of humor.) Of course, the anomaly in the joke is the traditional punch line and the joke is "funny" only if the listener gets the punch line by seeing in a flash of insight how a completely new interpretation of the same set of facts can incorporate the anomalous ending. The longer and more tortuous the garden path of expectation, the "funnier" the punch line when finally delivered.
V.S. Ramachandran
Use # as an introducer for comments. It is good to have a way to embed annotations and comments in data files. It’s best if they’re actually part of the file structure, and so will be preserved by tools that know its format. For comments that are not preserved during parsing, # is the conventional start character.
Eric S. Raymond (Art of UNIX Programming, The, Portable Documents)
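A minimal Python sketch (an editorial addition, not Raymond's code) of a reader that honors the convention the quote describes: everything from the first # to the end of the line is a comment, and blank lines carry no data.

```python
def parse_data_lines(text: str) -> list[str]:
    """Return the meaningful lines of a #-commented data file."""
    records = []
    for line in text.splitlines():
        # Drop everything from the first '#' onward, then surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line:  # skip blank lines and pure-comment lines
            records.append(line)
    return records

# A hosts-style data file using '#' as the comment introducer.
sample = """\
# hosts file, comments introduced by '#'
127.0.0.1 localhost   # loopback
192.168.0.10 fileserver
"""
```

Calling `parse_data_lines(sample)` yields just the two host entries, with both the full-line comment and the trailing comment stripped.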
Conspiracy theories—feverishly creative, lovingly plotted—are in fact fictional stories that some people believe. Conspiracy theorists connect real data points and imagined data points into a coherent, emotionally satisfying version of reality. Conspiracy theories exert a powerful hold on the human imagination—yes, perhaps even your imagination—not despite structural parallels with fiction, but in large part because of them. They fascinate us because they are ripping good yarns, showcasing classic problem structure and sharply defined good guys and villains. They offer vivid, lurid plots that translate with telling ease into wildly popular entertainment.
Jonathan Gottschall (The Storytelling Animal: How Stories Make Us Human)
Present-day democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics is losing control of events, and is failing to present us with meaningful visions of the future.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Today most of the debate on the cutting edge in macroeconomics would not call itself “Keynesian” or “monetarist” or any other label relating to a school of thought. The data are considered the ruling principle, and it is considered suspect to have too strong a loyalty to any particular model about the underlying structure of the economy.
Tyler Cowen (Average Is Over: Powering America Beyond the Age of the Great Stagnation)
As a thought experiment, von Neumann's analysis was simplicity itself. He was saying that the genetic material of any self-reproducing system, whether natural or artificial, must function very much like a stored program in a computer: on the one hand, it had to serve as live, executable machine code, a kind of algorithm that could be carried out to guide the construction of the system's offspring; on the other hand, it had to serve as passive data, a description that could be duplicated and passed along to the offspring. As a scientific prediction, that same analysis was breathtaking: in 1953, when James Watson and Francis Crick finally determined the molecular structure of DNA, it would fulfill von Neumann's two requirements exactly. As a genetic program, DNA encodes the instructions for making all the enzymes and structural proteins that the cell needs in order to function. And as a repository of genetic data, the DNA double helix unwinds and makes a copy of itself every time the cell divides in two. Nature thus built the dual role of the genetic material into the structure of the DNA molecule itself.
M. Mitchell Waldrop (The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal)
Ray Bradbury said that thinking is the enemy of creativity because it’s self-conscious. When you think you sit calmly and try to reason through something in a structured, logical way. Creativity dances to a different tune. Once you flip that switch, things get a bit chaotic. Ideas start buzzing. Images start popping into your head. Fragments of all kinds of data find their way into orbit.
Sean Patrick (Nikola Tesla: Imagination and the Man That Invented the 20th Century)
Technology is a powerful force in our society. Data, software, and communication can be used for bad: to entrench unfair power structures, to undermine human rights, and to protect vested interests. But they can also be used for good: to make underrepresented people’s voices heard, to create opportunities for everyone, and to avert disasters. This book is dedicated to everyone working toward the good.
Martin Kleppmann (Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems)
Imagine an alien, Fox said, who's come here to identify the planet's dominant form of intelligence. The alien has a look, then chooses. What do you think he picks? I probably shrugged. The zaibatsus, Fox said, the multinationals. The blood of a zaibatsu is information, not people. The structure is independent of the individual lives that comprise it. Corporation as life form. Not the Edge lecture again, I said.
William Gibson (Burning Chrome (Sprawl, #0))
Power vacuums seldom last long. If in the twenty-first century traditional political structures can no longer process the data fast enough to produce meaningful visions, then new and more efficient structures will evolve to take their place.These new structures may be very different from any previous political institutions, whether democratic or authoritarian. The only question is who will build and control these structures. If humankind is no longer up to the task, perhaps it might give somebody else a try.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
A more ambitious route would be changing the structure of governance altogether: away from majority-based, and towards unanimous decision-making. This has been shown to boost women’s speech participation and to mitigate against their minority position. A 2012 US study found that women only participate at an equal rate in discussions when they are in ‘a large majority’ – interestingly while individual women speak less when they are in the minority, individual men speak the same amount no matter what the gender proportion of the group.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
And yet, despite the horror it caused, the plague turned out to be the catalyst for social and economic change that was so profound that far from marking the death of Europe, it served as its making. The transformation provided an important pillar in the rise—and the triumph—of the west. It did so in several phases. First was the top-to-bottom reconfiguration of how social structures functioned. Chronic depopulation in the wake of the Black Death had the effect of sharply increasing wages because of the accentuated value of labour. So many died before the plague finally began to peter out in the early 1350s that one source noted a “shortage of servants, craftsmen, and workmen, and agricultural workers and labourers.” This gave considerable negotiating powers to those who had previously been at the lower end of the social and economic spectrum. Some simply “turned their noses up at employment, and could scarcely be persuaded to serve the eminent unless for triple wages.”66 This was hardly an exaggeration: empirical data shows that urban wages rose dramatically in the decades after the Black Death.
Peter Frankopan (The Silk Roads: A New History of the World)
the hippocampus is perhaps the central organ in the brain responsible for processing sensory data and acting as a gateway for inflows. As Kisley et al. note: “The cholinergic innervation of the hippocampus, which is crucial for intact sensory gating, exhibits extensive remodeling during pre- and early postnatal development.”4 It displays a great deal of plasticity in response to incoming sensory flows; the more it works with meaning and the more sensory input it is sensitive to, the more it shifts its neural structure as it sensorally interacts with the world.
Stephen Harrod Buhner (Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth)
In fact this desire for consonance in the apocalyptic data, and our tendency to be derisive about it, seem to me equally interesting. Each manifests itself, in the presence of the other, in most of our minds. We are all ready to be sceptical about Father Marystone, but we are most of us given to some form of 'centurial mysticism,' and even to more extravagant apocalyptic practices: a point I shall be taking up in my fourth talk. What it seems to come to is this. Men in the middest make considerable imaginative investments in coherent patterns which, by the provision of an end, make possible a satisfying consonance with the origins and with the middle. That is why the image of the end can never be permanently falsified. But they also, when awake and sane, feel the need to show a marked respect for things as they are; so that there is a recurring need for adjustments in the interest of reality as well as of control. This has relevance to literary plots, images of the grand temporal consonance; and we may notice that there is the same co-existence of naïve acceptance and scepticism here as there is in apocalyptic. Broadly speaking, it is the popular story that sticks most closely to established conventions; novels the clerisy calls 'major' tend to vary them, and to vary them more and more as time goes by. I shall be talking about this in some detail later, but a few brief illustrations might be useful now. I shall refer chiefly to one aspect of the matter, the falsification of one's expectation of the end. The story that proceeded very simply to its obviously predestined end would be nearer myth than novel or drama. Peripeteia, which has been called the equivalent, in narrative, of irony in rhetoric, is present in every story of the least structural sophistication. 
Now peripeteia depends on our confidence of the end; it is a disconfirmation followed by a consonance; the interest of having our expectations falsified is obviously related to our wish to reach the discovery or recognition by an unexpected and instructive route. It has nothing whatever to do with any reluctance on our part to get there at all. So that in assimilating the peripeteia we are enacting that readjustment of expectations in regard to an end which is so notable a feature of naïve apocalyptic. And we are doing rather more than that; we are, to look at the matter in another way, re-enacting the familiar dialogue between credulity and scepticism. The more daring the peripeteia, the more we may feel that the work respects our sense of reality; and the more certainly we shall feel that the fiction under consideration is one of those which, by upsetting the ordinary balance of our naïve expectations, is finding something out for us, something real. The falsification of an expectation can be terrible, as in the death of Cordelia; it is a way of finding something out that we should, on our more conventional way to the end, have closed our eyes to. Obviously it could not work if there were not a certain rigidity in the set of our expectations.
Frank Kermode (The Sense of an Ending: Studies in the Theory of Fiction)
In a remarkable letter to the director of the Vatican Observatory, John Paul II wrote: The church does not propose that science should become religion or religion science. On the contrary, unity always presupposes the diversity and integrity of its elements. Each of these members should become not less itself but more itself in a dynamic interchange, for a unity in which one of the elements is reduced to the other is destructive, false in its promises of harmony, and ruinous of the integrity of its components. We are asked to become one. We are not asked to become each other. . . . Unity involves the drive of the human mind towards understanding and the desire of the human spirit for love. When human beings seek to understand the multiplicities that surround them, when they seek to make sense of experience, they do so by bringing many factors into a common vision. Understanding is achieved when many data are unified by a common structure. The one illuminates the many: it makes sense of the whole. . . . We move towards unity as we move towards meaning in our lives. Unity is also the consequence of love. If love is genuine, it moves not towards the assimilation of the other but towards union with the other. Human community begins in desire when that union has not been achieved, and it is completed in joy when those who have been apart are now united.10
Ilia Delio (Making All Things New: Catholicity, Cosmology, Consciousness (Catholicity in an Evolving Universe Series))
In the coming decades, it is likely that we will see more Internet-like revolutions, in which technology steals a march on politics. Artificial intelligence and biotechnology might soon overhaul our societies and economies – and our bodies and minds too – but they are hardly a blip on our political radar. Our current democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics loses control of events, and fails to provide us with meaningful visions for the future. That doesn’t mean we will go back to twentieth-century-style dictatorships. Authoritarian regimes seem to be equally overwhelmed by the pace of technological development and the speed and volume of the data flow. In the twentieth century, dictators had grand visions for the future. Communists and fascists alike sought to completely destroy the old world and build a new world in its place. Whatever you think about Lenin, Hitler or Mao, you cannot accuse them of lacking vision. Today it seems that leaders have a chance to pursue even grander visions. While communists and Nazis tried to create a new society and a new human with the help of steam engines and typewriters, today’s prophets could rely on biotechnology and super-computers.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
The Scientific Revolution proposed a very different formula for knowledge: Knowledge = Empirical Data × Mathematics. If we want to know the answer to some question, we need to gather relevant empirical data, and then use mathematical tools to analyse the data. For example, in order to gauge the true shape of the earth, we can observe the sun, the moon and the planets from various locations across the world. Once we have amassed enough observations, we can use trigonometry to deduce not only the shape of the earth, but also the structure of the entire solar system. In practice, that means that scientists seek knowledge by spending years in observatories, laboratories and research expeditions, gathering more and more empirical data, and sharpening their mathematical tools so they could interpret the data correctly. The scientific formula for knowledge led to astounding breakthroughs in astronomy, physics, medicine and countless other disciplines. But it had one huge drawback: it could not deal with questions of value and meaning. Medieval pundits could determine with absolute certainty that it is wrong to murder and steal, and that the purpose of human life is to do God’s bidding, because scriptures said so. Scientists could not come up with such ethical judgements. No amount of data and no mathematical wizardry can prove that it is wrong to murder. Yet human societies cannot survive without such value judgements.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Sound waves, regardless of their frequency or intensity, can only be detected by the Mole Fly’s acute sense of smell—it is a little known fact that the Mole Fly’s auditory receptors do not, in fact, have a corresponding center in the brain designated for the purposes of processing sensory stimuli and so, these stimuli, instead of being siphoned out as noise, bypass the filters to be translated, oddly enough, by the part of the brain that processes smell. Consequently, the Mole Fly’s brain, in its inevitable confusion, understands sound as an aroma, rendering the boundary line between the auditory and olfactory sense indistinguishable. Sounds, thus, come in a variety of scents with an intensity proportional to its frequency. Sounds of shorter wavelength, for example, are particularly pungent. What results is a species of creature that cannot conceptualize the possibility that sound and smell are separate entities, despite its ability to discriminate between the exactitudes of pitch, timbre, tone, scent, and flavor to an alarming degree of precision. Yet, despite this ability to hyper-analyze, they lack the cognitive skill to laterally link successions of either sound or smell into a meaningful context, resulting in the equivalent of a data overflow. And this may be the most defining element of the Mole Fly’s behavior: a blatant disregard for the context of perception, in favor of analyzing those remote and diminutive properties that distinguish one element from another. While sensory continuity seems logical to their visual perception, as things are subject to change from moment-to-moment, such is not the case with their olfactory sense, as delays in sensing new smells are granted a degree of normality by the brain. Thus, the Mole Fly’s olfactory-auditory complex seems to be deprived of the sensory continuity otherwise afforded in the auditory senses of other species. 
And so, instead of sensing aromas and sounds continuously over a period of time—for example, instead of sensing them 24-30 times per second, as would be the case with their visual perception—they tend to process changes in sound and smell much more slowly, thereby preventing them from effectively plotting the variations thereof into an array or any kind of meaningful framework that would allow the information provided by their olfactory and auditory stimuli to be lasting in their usefulness. The Mole flies, themselves, being the structurally-obsessed and compulsive creatures that they are, in all their habitual collecting, organizing, and re-organizing of found objects into mammoth installations of optimal functional value, are remarkably easy to control, especially as they are given to a rather false and arbitrary sense of hierarchy, ascribing positions—that are otherwise trivial, yet necessarily mundane if only to obscure their true purpose—with an unfathomable amount of honor, to the logical extreme that the few chosen to serve in their most esteemed ranks are imbued with a kind of obligatory arrogance that begins in the pupal stages and extends indefinitely, as they are further nurtured well into adulthood by a society that infuses its heroes of middle management with an immeasurable sense of importance—a kind of celebrity status recognized by the masses as a living embodiment of their ideals. And yet, despite this culture of celebrity worship and vicarious living, all whims and impulses fall subservient, dropping humbly to the knees—yes, Mole Flies do, in fact, have knees!—before the grace of the merciful Queen, who is, in actuality, just a puppet dictator installed by the Melic papacy, using an old recycled Damsel fly-fishing lure. The dummy is crude, but convincing, as the Mole flies treat it as they would their true-born queen.
Ashim Shanker (Don't Forget to Breathe (Migrations, Volume I))
Philosophers of science have repeatedly demonstrated that more than one theoretical construction can always be placed upon a given collection of data. History of science indicates that, particularly in the early developmental stages of a new paradigm, it is not even very difficult to invent such alternates. But that invention of alternates is just what scientists seldom undertake except during the pre-paradigm stage of their science's development and at very special occasions during its subsequent evolution. So long as the tools a paradigm supplies continue to prove capable of solving the problems it defines, science moves fastest and penetrates most deeply through confident employment of those tools. The reason is clear. As in manufacture so in science-retooling is an extravagance to be reserved for the occasion that demands it. The significance of crises is the indication they provide that an occasion for retooling has arrived.
Thomas S. Kuhn (The Structure of Scientific Revolutions)
Gell-Mann and Ne'eman discovered that one such simple Lie group, called "special unitary group of degree 3," or SU(3), was particularly well suited for the "eightfold way"-the family structure the particles were found to obey. The beauty of the SU(3) symmetry was revealed in full glory via its predictive power. Gell-Mann and Ne'eman showed that if the theory were to hold true, a previously unknown tenth member of a particular family of nine particles had to be found. The extensive hunt for the missing particle was conducted in an accelerator experiment in 1964 at Brookhaven National Lab on Long Island. Yuval Ne'eman told me some years later that, upon hearing that half of the data had already been scrutinized without discovering the anticipated particle, he was contemplating leaving physics altogether. Symmetry triumphed at the end-the missing particle (called the omega minus) was found, and it had precisely the properties predicted by the theory.
Mario Livio (The Equation That Couldn't Be Solved: How Mathematical Genius Discovered the Language of Symmetry)
Judging types are in a hurry to make decisions. Perceiving types are not. This is why science doesn’t make any serious attempt to reach a final theory of everything. It always says, “Let’s do another experiment. And another. And another.” When will the experimentation ever end? When will scientists conclude that they have now collected easily enough data to now draw definitive conclusions? But they don’t want to draw any such conclusions. That’s not how they roll. Their method has no such requirement. That’s why many of them openly say that they do not want a final theory. It will stop them, they say, from “discovering” new things. Judging types like order and structure. They like decisions, conclusions, getting things done and reaching objectives. Perceiving types are doubtful and skeptical about all of that. They frequently refer to judging types as “judgmental”, which is literally perceived as a bad thing, “authoritarian”, “totalitarian”, “fascist”, “Nazi”, and so on. Perceiving types always want to have an open road ahead of them. They never want to actually arrive. Judging types cannot see the point of not wanting to reach your destination.
Thomas Stark (Extra Scientiam Nulla Salus: How Science Undermines Reason (The Truth Series Book 8))
Saint John Paul II wrote, “when its concepts and conclusions can be integrated into the wider human culture and its concerns for ultimate meaning and value.”7 Religion, too, develops best when its doctrines are not abstract and fixed in an ancient past but integrated into the wider stream of life. Albert Einstein once said that “science without religion is lame and religion without science is blind.”8 So too, John Paul II wrote: “Science can purify religion from error and superstition; religion can purify science from idolatry and false absolutes. Each can draw the other into a wider world, a world in which both can flourish.”9 Teilhard de Chardin saw that dialogue alone between the disciplines is insufficient; what we need is a new synthesis of science and religion, drawing insights from each discipline into a new unity. In a remarkable letter to the director of the Vatican Observatory, John Paul II wrote: The church does not propose that science should become religion or religion science. On the contrary, unity always presupposes the diversity and integrity of its elements. Each of these members should become not less itself but more itself in a dynamic interchange, for a unity in which one of the elements is reduced to the other is destructive, false in its promises of harmony, and ruinous of the integrity of its components. We are asked to become one. We are not asked to become each other. . . . Unity involves the drive of the human mind towards understanding and the desire of the human spirit for love. When human beings seek to understand the multiplicities that surround them, when they seek to make sense of experience, they do so by bringing many factors into a common vision. Understanding is achieved when many data are unified by a common structure. The one illuminates the many: it makes sense of the whole. . . . We move towards unity as we move towards meaning in our lives. Unity is also the consequence of love. 
If love is genuine, it moves not towards the assimilation of the other but towards union with the other. Human community begins in desire when that union has not been achieved, and it is completed in joy when those who have been apart are now united.10 The words of the late pope highlight the core of catholicity: consciousness of belonging to a whole and unity as a consequence of love.
Ilia Delio (Making All Things New: Catholicity, Cosmology, Consciousness (Catholicity in an Evolving Universe Series))
Christianity and other traditional religions are still important players in the world. Yet their role is now largely reactive. In the past, they were a creative force. Christianity, for example, spread the hitherto heretical notion that all humans are equal before God, thereby changing human political structures, social hierarchies and even gender relations. In his Sermon on the Mount Jesus went further, insisting that the meek and oppressed are God’s favourite people, thus turning the pyramid of power on its head, and providing ammunition for generations of revolutionaries. In addition to social and ethical reforms, Christianity was responsible for important economic and technological innovations. The Catholic Church established medieval Europe’s most sophisticated administrative system, and pioneered the use of archives, catalogues, timetables and other techniques of data processing. The Vatican was the closest thing twelfth-century Europe had to Silicon Valley. The Church established Europe’s first economic corporations – the monasteries – which for 1,000 years spearheaded the European economy and introduced advanced agricultural and administrative methods.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
We are about to study the idea of a computational process. Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells. A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer's spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform. A computational process, in a correctly working computer, executes programs precisely and accurately. Thus, like the sorcerer's apprentice, novice programmers must learn to understand and to anticipate the consequences of their conjuring. Even small errors (usually called bugs or glitches) in programs can have complex and unanticipated consequences.
Harold Abelson (Structure and Interpretation of Computer Programs)
The main ones are the symbolists, connectionists, evolutionaries, Bayesians, and analogizers. Each tribe has a set of core beliefs, and a particular problem that it cares most about. It has found a solution to that problem, based on ideas from its allied fields of science, and it has a master algorithm that embodies it. For symbolists, all intelligence can be reduced to manipulating symbols, in the same way that a mathematician solves equations by replacing expressions by other expressions. Symbolists understand that you can’t learn from scratch: you need some initial knowledge to go with the data. They’ve figured out how to incorporate preexisting knowledge into learning, and how to combine different pieces of knowledge on the fly in order to solve new problems. Their master algorithm is inverse deduction, which figures out what knowledge is missing in order to make a deduction go through, and then makes it as general as possible. For connectionists, learning is what the brain does, and so what we need to do is reverse engineer it. The brain learns by adjusting the strengths of connections between neurons, and the crucial problem is figuring out which connections are to blame for which errors and changing them accordingly. The connectionists’ master algorithm is backpropagation, which compares a system’s output with the desired one and then successively changes the connections in layer after layer of neurons so as to bring the output closer to what it should be. Evolutionaries believe that the mother of all learning is natural selection. If it made us, it can make anything, and all we need to do is simulate it on the computer. The key problem that evolutionaries solve is learning structure: not just adjusting parameters, like backpropagation does, but creating the brain that those adjustments can then fine-tune. 
The evolutionaries’ master algorithm is genetic programming, which mates and evolves computer programs in the same way that nature mates and evolves organisms. Bayesians are concerned above all with uncertainty. All learned knowledge is uncertain, and learning itself is a form of uncertain inference. The problem then becomes how to deal with noisy, incomplete, and even contradictory information without falling apart. The solution is probabilistic inference, and the master algorithm is Bayes’ theorem and its derivates. Bayes’ theorem tells us how to incorporate new evidence into our beliefs, and probabilistic inference algorithms do that as efficiently as possible. For analogizers, the key to learning is recognizing similarities between situations and thereby inferring other similarities. If two patients have similar symptoms, perhaps they have the same disease. The key problem is judging how similar two things are. The analogizers’ master algorithm is the support vector machine, which figures out which experiences to remember and how to combine them to make new predictions.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
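The Bayesian update Domingos describes — folding new evidence into a prior belief — can be sketched in a few lines. This is a minimal illustration, not code from the book; the disease/symptom probabilities below are invented for the example.

```python
def bayes_update(prior, likelihood, evidence_prob):
    """Bayes' theorem: posterior = P(evidence|H) * P(H) / P(evidence)."""
    return likelihood * prior / evidence_prob

# Hypothetical numbers: P(disease) = 0.01, P(symptom|disease) = 0.9,
# and P(symptom) = 0.08 across all patients.
posterior = bayes_update(prior=0.01, likelihood=0.9, evidence_prob=0.08)
print(round(posterior, 4))  # 0.1125
```

Even strong evidence (a 90% likelihood) leaves the posterior near 11% here, because the prior was so low — the quantitative point behind "incorporating new evidence into our beliefs."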
The last refuge of the Self, perhaps, is “physical continuity.” Despite the body’s mercurial nature, it feels like a badge of identity we have carried since the time of our earliest childhood memories. A thought experiment dreamed up in the 1980s by British philosopher Derek Parfit illustrates how important—yet deceiving—this sense of physical continuity is to us.15 He invites us to imagine a future in which the limitations of conventional space travel—of transporting the frail human body to another planet at relatively slow speeds—have been solved by beaming radio waves encoding all the data needed to assemble the passenger to their chosen destination. You step into a machine resembling a photo booth, called a teletransporter, which logs every atom in your body then sends the information at the speed of light to a replicator on Mars, say. This rebuilds your body atom by atom using local stocks of carbon, oxygen, hydrogen, and so on. Unfortunately, the high energies needed to scan your body with the required precision vaporize it—but that’s okay because the replicator on Mars faithfully reproduces the structure of your brain nerve by nerve, synapse by synapse. You step into the teletransporter, press the green button, and an instant later materialize on Mars and can continue your existence where you left off. The person who steps out of the machine at the other end not only looks just like you, but etched into his or her brain are all your personality traits and memories, right down to the memory of eating breakfast that morning and your last thought before you pressed the green button. If you are a fan of Star Trek, you may be perfectly happy to use this new mode of space travel, since this is more or less what the USS Enterprise’s transporter does when it beams its crew down to alien planets and back up again. 
But now Parfit asks us to imagine that a few years after you first use the teletransporter comes the announcement that it has been upgraded in such a way that your original body can be scanned without destroying it. You decide to give it a go. You pay the fare, step into the booth, and press the button. Nothing seems to happen, apart from a slight tingling sensation, but you wait patiently and sure enough, forty-five minutes later, an image of your new self pops up on the video link and you spend the next few minutes having a surreal conversation with yourself on Mars. Then comes some bad news. A technician cheerfully informs you that there have been some teething problems with the upgraded teletransporter. The scanning process has irreparably damaged your internal organs, so whereas your replica on Mars is absolutely fine and will carry on your life where you left off, this body here on Earth will die within a few hours. Would you care to accompany her to the mortuary? Now how do you feel? There is no difference in outcome between this scenario and what happened in the old scanner—there will still be one surviving “you”—but now it somehow feels as though it’s the real you facing the horror of imminent annihilation. Parfit nevertheless uses this thought experiment to argue that the only criterion that can rationally be used to judge whether a person has survived is not the physical continuity of a body but “psychological continuity”—having the same memories and personality traits as the most recent version of yourself. Buddhists
James Kingsland (Siddhartha's Brain: Unlocking the Ancient Science of Enlightenment)
In 1942, Merton set out four scientific values, now known as the ‘Mertonian Norms’. None of them have snappy names, but all of them are good aspirations for scientists. First, universalism: scientific knowledge is scientific knowledge, no matter who comes up with it – so long as their methods for finding that knowledge are sound. The race, sex, age, gender, sexuality, income, social background, nationality, popularity, or any other status of a scientist should have no bearing on how their factual claims are assessed. You also can’t judge someone’s research based on what a pleasant or unpleasant person they are – which should come as a relief for some of my more disagreeable colleagues. Second, and relatedly, disinterestedness: scientists aren’t in it for the money, for political or ideological reasons, or to enhance their own ego or reputation (or the reputation of their university, country, or anything else). They’re in it to advance our understanding of the universe by discovering things and making things – full stop.20 As Charles Darwin once wrote, a scientist ‘ought to have no wishes, no affections, – a mere heart of stone.’ The next two norms remind us of the social nature of science. The third is communality: scientists should share knowledge with each other. This principle underlies the whole idea of publishing your results in a journal for others to see – we’re all in this together; we have to know the details of other scientists’ work so that we can assess and build on it. Lastly, there’s organised scepticism: nothing is sacred, and a scientific claim should never be accepted at face value. We should suspend judgement on any given finding until we’ve properly checked all the data and methodology. The most obvious embodiment of the norm of organised scepticism is peer review itself.
20. Robert K. Merton, ‘The Normative Structure of Science’ (1942), in The Sociology of Science: Empirical and Theoretical Investigations (Chicago and London: University of Chicago Press, 1973): pp. 267–278.
Stuart Ritchie (Science Fictions)
In 1950, a thirty-year-old scientist named Rosalind Franklin arrived at King’s College London to study the shape of DNA. She and a graduate student named Raymond Gosling created crystals of DNA, which they bombarded with X-rays. The beams bounced off the crystals and struck photographic film, creating telltale lines, spots, and curves. Other scientists had tried to take pictures of DNA, but no one had created pictures as good as Franklin had. Looking at the pictures, she suspected that DNA was a spiral-shaped molecule—a helix. But Franklin was relentlessly methodical, refusing to indulge in flights of fancy before the hard work of collecting data was done. She kept taking pictures. Two other scientists, Francis Crick and James Watson, did not want to wait. Up in Cambridge, they were toying with metal rods and clamps, searching for plausible arrangements of DNA. Based on hasty notes Watson had written during a talk by Franklin, he and Crick put together a new model. Franklin and her colleagues from King’s paid a visit to Cambridge to inspect it, and she bluntly told Crick and Watson they had gotten the chemistry all wrong. Franklin went on working on her X-ray photographs and growing increasingly unhappy with King’s. The assistant lab chief, Maurice Wilkins, was under the impression that Franklin was hired to work directly for him. She would have none of it, bruising Wilkins’s ego and leaving him to grumble to Crick about “our dark lady.” Eventually a truce was struck, with Wilkins and Franklin working separately on DNA. But Wilkins was still Franklin’s boss, which meant that he got copies of her photographs. In January 1953, he showed one particularly telling image to Watson. Now Watson could immediately see in those images how DNA was shaped. He and Crick also got hold of a summary of Franklin’s unpublished research she wrote up for the Medical Research Council, which guided them further to their solution. 
Neither bothered to consult Franklin about using her hard-earned pictures. The Cambridge and King’s teams then negotiated a plan to publish a set of papers in Nature on April 25, 1953. Crick and Watson unveiled their model in a paper that grabbed most of the attention. Franklin and Gosling published their X-ray data in another paper, which seemed to readers to be a “me-too” effort. Franklin died of cancer five years later, while Crick, Watson, and Wilkins went on to share the Nobel prize in 1962. In his 1968 book, The Double Helix, Watson would cruelly caricature Franklin as a belligerent, badly dressed woman who couldn’t appreciate what was in her pictures. That bitter fallout is a shame, because these scientists had together discovered something of exceptional beauty. They had found a molecular structure that could make heredity possible.
Carl Zimmer (She Has Her Mother's Laugh: What Heredity Is, Is Not, and May Become)
Months later, Time magazine would run its now infamous article bragging about how it had been done. Without irony or shame, the magazine reported that “[t]here was a conspiracy unfolding behind the scenes” creating “an extraordinary shadow effort” by a “well-funded cabal of powerful people” to oppose Trump.112 Corporate CEOs, organized labor, left-wing activists, and Democrats all worked together in secret to secure a Biden victory. For Trump, these groups represented a powerful Washington and Democratic establishment that saw an unremarkable career politician like Biden as merely a vessel for protecting their self-interests. Accordingly, when Trump was asked whom he blames for the rigging of the 2020 election, he quickly responded, “Least of all Biden.” Time would, of course, disingenuously frame this effort as an attempt to “oppose Trump’s assault on democracy,” even as Time reporter Molly Ball noted this shadow campaign “touched every aspect of the election. They got states to change voting systems and laws and helped secure hundreds of millions in public and private funding.” The funding enabled the country’s sudden rush to mail-in balloting, which Ball described as “a revolution in how people vote.”113 The funding from Democratic donors to public election administrators was revolutionary. The Democrats’ network of nonprofit activist groups embedded into the nation’s electoral structure through generous grants from Democratic donors. They helped accomplish the Democrats’ vote-by-mail strategy from the inside of the election process. It was as if the Dallas Cowboys were paying the National Football League’s referee staff and conducting all of their support operations. No one would feel confident in games won by the Cowboys in such a scenario. 
Ball also reported that this shadowy cabal “successfully pressured social media companies to take a harder line against disinformation and used data-driven strategies to fight viral smears.” And yet, Time magazine made this characterization months after it was revealed that the New York Post’s reporting on Hunter Biden’s corrupt deal-making with Chinese and other foreign officials—deals that alleged direct involvement from Joe Biden, resulting in the reporting’s being overtly censored by social media—was substantially true. Twitter CEO Jack Dorsey would eventually tell Congress that censoring the New York Post and locking it out of its Twitter account over the story was “a mistake.” And the Hunter Biden story was hardly the only egregious mistake, to say nothing of the media’s willful dishonesty, in the 2020 election. Republicans read the Time article with horror and as an admission of guilt. It confirmed many voters’ suspicions that the election wasn’t entirely fair. Trump knew the article helped his case, calling it “the only good article I’ve read in Time magazine in a long time—that was actually just a piece of the truth because it was much deeper than that.
Mollie Ziegler Hemingway (Rigged: How the Media, Big Tech, and the Democrats Seized Our Elections)
Product development has become a faster, more flexible process, where radically better products don’t stand on the shoulders of giants, but on the shoulders of lots of iterations. The basis for success then, and for continual product excellence, is speed. Unfortunately, like Jonathan’s failed gate-based product development framework, most management processes in place at companies today are designed with something else in mind. They were devised over a century ago, at a time when mistakes were expensive and only the top executives had comprehensive information, and their primary objectives are lowering risk and ensuring that decisions are made only by the few executives with lots of information. In this traditional command-and-control structure, data flows up to the executives from all over the organization, and decisions subsequently flow down. This approach is designed to slow things down, and it accomplishes the task very well. Meaning that at the very moment when businesses must permanently accelerate, their architecture is working against them.
Eric Schmidt (How Google Works)
A configuration is the structure of architectural relationships among components, connectors, and data during a period of system run-time.
Roy T. Fielding (Architectural Styles and the Design of Network-based Software Architectures)
Amortization allows for occasional operations to have actual costs that exceed their amortized costs. Such operations are called expensive. Operations whose actual costs are less than their amortized costs are called cheap. Expensive operations decrease the accumulated savings and cheap operations increase it. The key to proving amortized bounds is to show that expensive operations occur only when the accumulated savings are sufficient to cover the remaining cost.
Chris Okasaki (Purely Functional Data Structures)
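Okasaki's expensive/cheap bookkeeping is the banker's method of amortized analysis. A minimal sketch of the idea, using the classic doubling array rather than any structure from the book: each append banks two credits, and the assertion checks the key invariant that an expensive resize happens only when the accumulated savings cover it.

```python
# A dynamic array where each append "banks" credits to pay for future
# resizes. Appends into spare capacity are cheap; a resize (which copies
# every element) is expensive, but is always covered by the savings
# accumulated since the last resize.
class DynamicArray:
    def __init__(self):
        self.items = [None]
        self.capacity = 1
        self.size = 0
        self.savings = 0              # banked credits, tracked for illustration

    def append(self, x):
        self.savings += 2             # amortized cost 3: 1 actual + 2 banked
        if self.size == self.capacity:        # expensive operation: copy all
            assert self.savings >= self.size  # savings cover the copy
            self.items = self.items + [None] * self.capacity
            self.capacity *= 2
            self.savings -= self.size         # spend the savings
        self.items[self.size] = x
        self.size += 1

arr = DynamicArray()
for i in range(100):
    arr.append(i)   # the assertion never fires: savings always suffice
```

The two banked credits per append pay for eventually copying that element and re-copying one element left over from the previous resize, which is why the savings never run dry.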
The methodological benefits of functional languages are well known [Bac78, Hug89, HJ94], but still the vast majority of programs are written in imperative languages such as C. This apparent contradiction is easily explained by the fact that functional languages have historically been slower than their more traditional cousins, but this gap is narrowing.
Chris Okasaki (Purely Functional Data Structures)
A distinctive property of functional data structures is that they are always persistent—updating a functional data structure does not destroy the existing version, but rather creates a new version that coexists with the old one. Persistence is achieved by copying the affected nodes of a data structure and making all changes in the copy rather than in the original.
Chris Okasaki (Purely Functional Data Structures)
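The path-copying idea can be sketched in Python (the book's code is Standard ML; `Node` and `update` here are hypothetical names). Updating index i copies only the nodes on the path to i and shares the untouched tail with the old version:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Node:
    value: int
    rest: "Optional[Node]" = None

def update(xs: Optional[Node], i: int, v: int) -> Node:
    """Return a new list with element i replaced by v.

    Only the nodes on the path to index i are copied; everything past
    index i is shared with the original, which remains intact.
    """
    assert xs is not None, "index out of range"
    if i == 0:
        return Node(v, xs.rest)
    return Node(xs.value, update(xs.rest, i - 1, v))

old = Node(1, Node(2, Node(3)))
new = update(old, 1, 99)
assert old.rest.value == 2              # old version still coexists
assert new.rest.value == 99             # new version sees the change
assert new.rest.rest is old.rest.rest   # the unchanged tail is shared
```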
The notion of amortization arises from the following observation. Given a sequence of operations, we may wish to know the running time of the entire sequence, but not care about the running time of any individual operation. For instance, given a sequence of n operations, we may wish to bound the total running time of the sequence by O(n) without insisting that every individual operation run in O(1) time. We might be satisfied if a few operations run in O(log n) or even O(n) time, provided the total cost of the sequence is only O(n). This freedom opens up a wide design space of possible solutions, and often yields new solutions that are simpler and faster than worst-case solutions with equivalent bounds.
Chris Okasaki (Purely Functional Data Structures)
The fundamental problem in the U.S. health care system is that the structure of health care delivery is broken. This is what all the data about rising costs and alarming quality are telling us. And the structure of health care delivery is broken because competition is broken. All of the well-intended reform movements have failed because they did not address the underlying nature of competition.
Michael E. Porter (Redefining Health Care: Creating Value-Based Competition on Results)
Structured Application Design with MVC
MVC defines a clean separation between the critical components of our apps. Consistent with its name, MVC defines three parts of an application:
• A model provides the underlying data and methods that offer information to the rest of the application. The model does not define how the application will look or how it will act.
• One or more views make up the user interface. A view consists of the different onscreen widgets (buttons, fields, switches, and so forth) that a user can interact with.
• A controller is typically paired with a view. The controller is responsible for receiving user input and acting accordingly. Controllers may access and update a view using information from the model and update the model using the results of user interactions in the view. In short, it bridges the MVC components.
John Ray (Sams Teach Yourself iOS 5 Application Development in 24 Hours (3rd Edition))
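The book's examples are iOS/Objective-C, but the three roles are language-agnostic; here is a minimal Python sketch of the same division of labor (all class and method names are hypothetical):

```python
class Model:
    """Holds the underlying data; knows nothing about presentation."""
    def __init__(self):
        self.count = 0

class View:
    """Renders the user interface from model data."""
    def render(self, model: Model) -> str:
        return f"Count: {model.count}"

class Controller:
    """Bridges the two: receives input, updates the model, refreshes the view."""
    def __init__(self, model: Model, view: View):
        self.model, self.view = model, view

    def on_button_tap(self) -> str:
        self.model.count += 1                 # act on user input
        return self.view.render(self.model)   # reflect it in the view

c = Controller(Model(), View())
assert c.on_button_tap() == "Count: 1"
```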
An object is not a data structure. In fact, if you are the consumer of an object, you aren't allowed to see any data that might be inside it. And, in fact, the object might have no data inside it at all.
Anonymous
In general, organisms that share very similar morphologies or similar DNA sequences are likely to be more closely related than organisms with vastly different structures or sequences. In some cases, however, the morphological divergence between related species can be great and their genetic divergence small (or vice versa). Consider the Hawaiian silversword plants discussed in Chapter 25. These species vary dramatically in appearance throughout the islands. Some are tall, twiggy trees, and others are dense, ground-hugging shrubs (see Figure 25.20). But despite these striking phenotypic differences, the silverswords’ genes are very similar. Based on these small molecular divergences, scientists estimate that the silversword group began to diverge 5 million years ago, which is also about the time when the oldest of the current islands formed. We’ll discuss how scientists use molecular data to estimate such divergence times later in this chapter.
Jane B. Reece (Campbell Biology)
Rules for Building High-Performance Code
We’ve got the following rules for creating high-performance software:
1. Know where you’re going (understand the objective of the software).
2. Make a big map (have an overall program design firmly in mind, so the various parts of the program and the data structures work well together).
3. Make lots of little maps (design an algorithm for each separate part of the overall design).
4. Know the territory (understand exactly how the computer carries out each task).
5. Know when it matters (identify the portions of your programs where performance matters, and don’t waste your time optimizing the rest).
6. Always consider the alternatives (don’t get stuck on a single approach; odds are there’s a better way, if you’re clever and inventive enough).
7. Know how to turn on the juice (optimize the code as best you know how when it does matter).
Anonymous
One way programming languages avoid the issue of data being modified by concurrently running threads is by providing immutable data structures or collection classes. Clearly data that cannot change doesn't need to be protected. It is often desirable to be able to create new data structures that are similar to existing ones, for example, a list with a new item added at one end or a hash map with a new key/value pair added.
Anonymous
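Python's built-in tuples and dicts make the same point in miniature: instead of mutating shared data, build a new collection that reuses the old one, so nothing concurrent readers hold can change underneath them.

```python
# Immutable updates: create similar new structures rather than modifying
# existing ones, so shared data never needs protection.
base = (1, 2, 3)
extended = base + (4,)                 # "a list with a new item added at one end"
assert base == (1, 2, 3)               # the original is untouched

settings = {"host": "localhost", "port": 8080}
updated = {**settings, "port": 9090}   # "a hash map with a new key/value pair"
assert settings["port"] == 8080        # readers of the old map are unaffected
assert updated["port"] == 9090
```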
Objects hide their data behind abstractions and expose functions that operate on that data. Data structures expose their data and have no meaningful functions.
Robert C. Martin (Clean Code: A Handbook of Agile Software Craftsmanship (Robert C. Martin Series))
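A small Python sketch of the distinction (the `Point` examples are in the spirit of the book's, with hypothetical names):

```python
import math
from dataclasses import dataclass

@dataclass
class PointData:
    """Data structure: exposes its fields, offers no behavior."""
    x: float
    y: float

class Point:
    """Object: hides its representation behind an abstraction."""
    def __init__(self, x: float, y: float):
        self._x, self._y = x, y

    def distance_to(self, other: "Point") -> float:
        return math.hypot(self._x - other._x, self._y - other._y)

# Callers of PointData reach into its fields; callers of Point can only
# ask it to do something.
assert PointData(1.0, 2.0).x == 1.0
assert Point(0, 0).distance_to(Point(3, 4)) == 5.0
```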
Table 9-1. Chapter Summary
Problem | Solution | Listing
Create an AngularJS module. | Use the angular.module method. | 1, 2
Set the scope of a module. | Use the ng-app attribute. | 3
Define a controller. | Use the Module.controller method. | 4, 8
Apply a controller to a view. | Use the ng-controller attribute. | 5, 7
Pass data from a controller to a view. | Use the $scope service. | 6
Define a directive. | Use the Module.directive method. | 9
Define a filter. | Use the Module.filter method. | 10
Use a filter programmatically. | Use the $filter service. | 11
Define a service. | Use the Module.service, Module.factory, or Module.provider method. | 12
Define a service from an existing object or value. | Use the Module.value method. | 13
Add structure to the code in an application. | Create multiple modules and declare dependencies from the module referenced by the ng-app attribute. | 14–16
Register functions that are called when modules are loaded. | Use the Module.config and Module.run methods. | 17
Adam Freeman (Pro AngularJS (Expert's Voice in Web Development))
Dimensional models implemented in relational database management systems are referred to as star schemas because of their resemblance to a star-like structure. Dimensional models implemented in multidimensional database environments are referred to as online analytical processing (OLAP) cubes, as illustrated in Figure 1.1. Figure 1.1 Star schema versus OLAP cube. If your DW/BI environment includes either star schemas or OLAP cubes, it leverages dimensional concepts. Both stars and cubes have a common logical design with recognizable dimensions; however, the physical implementation differs. When data is loaded into an OLAP cube, it is stored and indexed using formats and techniques that are designed for dimensional data. Performance aggregations or precalculated summary tables are often created and managed by the OLAP cube engine. Consequently, cubes deliver superior query performance because of the precalculations, indexing strategies, and other optimizations. Business users can drill down or up by adding or removing attributes from their analyses with excellent performance without issuing new queries. OLAP cubes also provide more analytically robust functions that exceed those available with SQL. The downside is that you pay a load performance price for these capabilities, especially with large data sets.
Ralph Kimball (The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling)
Each business process is represented by a dimensional model that consists of a fact table containing the event's numeric measurements surrounded by a halo of dimension tables that contain the textual context that was true at the moment the event occurred. This characteristic star-like structure is often called a star join, a term dating back to the earliest days of relational databases. Figure 1.5 Fact and dimension tables in a dimensional model. The first thing to notice about the dimensional schema is its simplicity and symmetry. Obviously, business users benefit from the simplicity because the data is easier to understand and navigate. The charm of the design in Figure 1.5 is that it is highly recognizable to business users. We have observed literally hundreds of instances in which users immediately agree that the dimensional model is their business. Furthermore, the reduced number of tables and use of meaningful business descriptors make it easy to navigate and less likely that mistakes will occur. The simplicity of a dimensional model also has performance benefits. Database optimizers process these simple schemas with fewer joins more efficiently. A database engine can make strong assumptions about first constraining the heavily indexed dimension tables, and then attacking the fact table all at once with the Cartesian product of the dimension table keys satisfying the user's constraints. Amazingly, using this approach, the optimizer can evaluate arbitrary n-way joins to a fact table in a single pass through the fact table's index. Finally, dimensional models are gracefully extensible to accommodate change. The predictable framework of a dimensional model withstands unexpected changes in user behavior. Every dimension is equivalent; all dimensions are symmetrically-equal entry points into the fact table. The dimensional model has no built-in bias regarding expected query patterns. 
There are no preferences for the business questions asked this month versus the questions asked next month. You certainly don't want to adjust schemas if business users suggest new ways to analyze their business.
Ralph Kimball (The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling)
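The star join described above can be sketched with an in-memory SQLite database; the table and column names below are hypothetical illustrations, not Kimball's schemas. The query constrains the small dimension tables first, then aggregates over the fact table:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- one fact table ringed by dimension tables
    CREATE TABLE date_dim    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE product_dim (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE sales_fact  (date_key INTEGER, product_key INTEGER,
                              amount REAL);
    INSERT INTO date_dim VALUES (1, 'Jan'), (2, 'Feb');
    INSERT INTO product_dim VALUES (10, 'Widget');
    INSERT INTO sales_fact VALUES (1, 10, 100.0), (2, 10, 250.0);
""")

# The classic star join: every dimension is a symmetric entry point
# into the fact table.
rows = db.execute("""
    SELECT d.month, p.name, SUM(f.amount)
    FROM sales_fact f
    JOIN date_dim d    ON f.date_key = d.date_key
    JOIN product_dim p ON f.product_key = p.product_key
    GROUP BY d.month, p.name
    ORDER BY d.month
""").fetchall()
assert rows == [('Feb', 'Widget', 250.0), ('Jan', 'Widget', 100.0)]
```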
The final step of the ETL process is the physical structuring and loading of data into the presentation area's target dimensional models. Because the primary mission of the ETL system is to hand off the dimension and fact tables in the delivery step, these subsystems are critical. Many of these defined subsystems focus on dimension table processing, such as surrogate key assignments, code lookups to provide appropriate descriptions, splitting, or combining columns to present the appropriate data values, or joining underlying third normal form table structures into flattened denormalized dimensions. In contrast, fact tables are typically large and time consuming to load, but preparing them for the presentation area is typically straightforward. When the dimension and fact tables in a dimensional model have been updated, indexed, supplied with appropriate aggregates, and further quality assured, the business community is notified that the new data has been published.
Ralph Kimball (The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling)
The structure of de Prony’s computing office cannot be easily seen in Smith’s example. His computing staff had two distinct classes of workers. The larger of these was a staff of nearly ninety computers. These workers were quite different from Smith’s pin makers or even from the computers at the British Nautical Almanac and the Connaissance des Temps. Many of de Prony’s computers were former servants or wig dressers, who had lost their jobs when the Revolution rendered the elegant styles of Louis XVI unfashionable or even treasonous.35 They were not trained in mathematics and held no special interest in science. De Prony reported that most of them “had no knowledge of arithmetic beyond the two first rules [of addition and subtraction].”36 They were little different from manual workers and could not discern whether they were computing trigonometric functions, logarithms, or the orbit of Halley’s comet. One labor historian has described them as intellectual machines, “grasping and releasing a single piece of ‘data’ over and over again.”37 The second class of workers prepared instructions for the computation and oversaw the actual calculations. De Prony had no special title for this group of workers, but subsequent computing organizations came to use the term “planning committee” or merely “planners,” as they were the ones who actually planned the calculations. There were eight planners in de Prony’s organization. Most of them were experienced computers who had worked for either the Bureau du Cadastre or the Paris Observatory. A few had made interesting contributions to mathematical theory, but the majority had dealt only with the problems of practical mathematics.38 They took the basic equations for the trigonometric functions and reduced them to the fundamental operations of addition and subtraction. From this reduction, they prepared worksheets for the computers. 
Unlike Nevil Maskelyne’s worksheets, which gave general equations to the computers, these sheets identified every operation of the calculation and left nothing for the workers to interpret. Each step of the calculation was followed by a blank space for the computers to fill with a number. Each table required hundreds of these sheets, all identical except for a single unique starting value at the top of the page. Once the computers had completed their sheets, they returned their results to the planners. The planners assembled the tables and checked the final values. The task of checking the results was a substantial burden in itself. The group did not double-compute, as that would have obviously doubled the workload. Instead the planners checked the final values by taking differences between adjacent values in order to identify miscalculated numbers. This procedure, known as “differencing,” was an important innovation for human computers. As one observer noted, differencing removed the “necessity of repeating, or even of examining, the whole of the work done by the [computing] section.”39 The entire operation was overseen by a handful of accomplished scientists, who “had little or nothing to do with the actual numerical work.” This group included some of France’s most accomplished mathematicians, such as Adrien-Marie Legendre (1752–1833) and Lazare-Nicolas-Marguerite Carnot (1753–1823).40 These scientists researched the appropriate formulas for the calculations and identified potential problems. Each formula was an approximation, as no trigonometric function can be written as an exact combination of additions and subtractions. The mathematicians analyzed the quality of the approximations and verified that all the formulas produced values adequately close to the true values of the trigonometric functions.
David Alan Grier (When Computers Were Human)
Integration databases—don’t do it! Seriously! Not even with views. Not even with stored procedures. Take it up a level, and wrap a web service around the database. Then make the web service redundant and accessed through a virtual IP. Build a test harness to verify what happens when the web service is down. That’s an enterprise integration technology. Reaching into another system’s database is just…icky. Nothing hobbles a system’s ability to adapt quite like having other systems poking into its guts. Database “integrations” are pure evil. They violate encapsulation and information hiding by exposing the most intimate details about a system’s inner workings. They encourage inappropriate coupling at both the structural and semantic levels. Even worse, the system that hangs its database out for the world cannot trust the data in the database at all. Rows can be added or modified by other entities even while the owner has objects in memory mapped from those rows. Vital application logic can be bypassed, resulting in illegal or unreachable states.[119]
Anonymous
The Engineer began dismantling the helmet at a breakneck speed, stacking the components—faceplate, lining, mikes, data processor, even microfans—on the nearest flat surface, a hydroplane-like structure on a small vessel.
Karen Traviss (Glasslands (Halo, #8))
Available data indicate that the blood and saliva normally carry defensive factors which when present control the growth of the acid producing organisms and the local reactions at tooth surfaces. When these defensive factors are not present the acid producing organisms multiply and produce an acid which dissolves tooth structure. The origin of this protective factor is provided in nutrition and is directly related to the mineral content of the foods and to known and unknown vitamins particularly the fat-soluble. Clinical data demonstrate that by following the program outlined dental caries can be prevented or controlled when active in practically all individuals. This does not require either permission or prescription but it is the inherent right of every individual. A properly balanced diet is good for the entire body
Anonymous
Social conservatives do have a pretty decent predictive track record, including in many cases where their fears were dismissed as wild and apocalyptic, their projections as sky-is-falling nonsense, their theories of how society and human nature works as evidence-free fantasies. . . . If you look at the post-1960s trend data — whether it’s on family structure and social capital, fertility and marriage rates, patterns of sexual behavior and their links to flourishing relationships, or just trends in marital contentment and personal happiness more generally — the basic social conservative analysis has turned out to have more predictive power than my rigorously empirical liberal friends are inclined to admit. . . . In the late 1960s and early ’70s, the pro-choice side of the abortion debate frequently predicted that legal abortion would reduce single parenthood and make marriages more stable, while the pro-life side made the allegedly-counterintuitive claim that it would have roughly the opposite effect; overall, it’s fair to say that post-Roe trends were considerably kinder to Roe’s critics than to the “every child a wanted child” conceit. Conservatives (and not only conservatives) also made various “dystopian” predictions about eugenics and the commodification of human life as reproductive science advanced in the ’70s, while many liberals argued that these fears were overblown; today, from “selective reduction” to the culling of Down’s Syndrome fetuses to worldwide trends in sex-selective abortion, from our fertility industry’s “embryo glut” to the global market in paid surrogacy, the dystopian predictions are basically just the status quo. No-fault divorce was pitched as an escape hatch for the miserable and desperate that wouldn’t affect the average marriage, but of course divorce turned out to have social-contagion effects as well. Religious fears that population control would turn coercive and tyrannical were scoffed at and then vindicated.
Dan Quayle was laughed at until the data suggested that basically he had it right. The fairly-ancient conservative premise that social permissiveness is better for the rich than for the poor persistently bemuses the left; it also persistently describes reality. And if you dropped some of the documentation from today’s college rape crisis through a wormhole into the 1960s-era debates over shifting to coed living arrangements on campuses, I’m pretty sure that even many of the conservatives in that era would assume that someone was pranking them, that even in their worst fears it couldn’t possibly end up like this. More broadly, over the last few decades social conservatives have frequently offered “both/and” cultural analyses that liberals have found strange or incredible — arguing (as noted above) that a sexually-permissive society can easily end up with a high abortion rate and a high out-of-wedlock birthrate; or that permissive societies can end up with more births to single parents and fewer births (not only fewer than replacement, but fewer than women actually desire) overall; or that expressive individualism could lead to fewer marriages and greater unhappiness for people who do get hitched. Social liberals, on the other hand, have tended to take a view of human nature that’s a little more positivist and consumerist, in which the assumption is that some kind of “perfectly-liberated decision making” is possible and that such liberation leads to optimal outcomes overall. Hence that 1970s-era assumption that unrestricted abortion would be good for children’s family situations, hence the persistent assumption that marriages must be happier when there’s more sexual experimentation beforehand, etc.
Ross Douthat
In analytics, it’s more important for individuals to be able to formulate problems well, to prototype solutions quickly, to make reasonable assumptions in the face of ill-structured problems, to design experiments that represent good investments, and to analyze results.
Foster Provost (Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking)
A scripting language is a programming language that provides you with the ability to write scripts that are evaluated (or interpreted) by a runtime environment called a script engine (or an interpreter). A script is a sequence of characters that is written using the syntax of a scripting language and used as the source for a program executed by an interpreter. The interpreter parses the scripts, produces intermediate code, which is an internal representation of the program, and executes the intermediate code. The interpreter stores the variables used in a script in data structures called symbol tables. Typically, unlike in a compiled programming language, the source code (called a script) in a scripting language is not compiled but is interpreted at runtime. However, scripts written in some scripting languages may be compiled into Java bytecode that can be run by the JVM. Java 6 added scripting support to the Java platform that lets a Java application execute scripts written in scripting languages such as Rhino JavaScript, Groovy, Jython, JRuby, Nashorn JavaScript, and so on. Two-way communication is supported. It also lets scripts access Java objects created by the host application. The Java runtime and a scripting language runtime can communicate and make use of each other’s features. Support for scripting languages in Java comes through the Java Scripting API. All classes and interfaces in the Java Scripting API are in the javax.script package. Using a scripting language in a Java application provides several advantages: Most scripting languages are dynamically typed, which makes it simpler to write programs. They provide a quicker way to develop and test small applications. Customization by end users is possible. A scripting language may provide domain-specific features that are not available in Java. Scripting languages have some disadvantages as well. 
For example, dynamic typing is good to write simpler code; however, it turns into a disadvantage when a type is interpreted incorrectly and you have to spend a lot of time debugging it. Scripting support in Java lets you take advantage of both worlds: it allows you to use the Java programming language for developing statically typed, scalable, and high-performance parts of the application and use a scripting language that fits the domain-specific needs for other parts. I will use the term script engine frequently in this book. A script engine is a software component that executes programs written in a particular scripting language. Typically, but not necessarily, a script engine is an implementation of an interpreter for a scripting language. Interpreters for several scripting languages have been implemented in Java. They expose programming interfaces so a Java program may interact with them.
Kishori Sharan (Scripting in Java: Integrating with Groovy and JavaScript)
[Figure 1-01. The Pyramid of Knowledge: raw data is summarized into information, information combined with experience becomes knowledge, and knowledge plus judgment matures into the wisdom that supports decisions.]
At Toyota, this begins with genchi genbutsu, or gemba, which means literally “go see it for yourself.” Taiichi Ohno, a founding father of Lean, once said, “Data is of course important in manufacturing, but I place the greatest emphasis on facts.”2 A direct and intuitive understanding of a situation is far more useful than mountains of data. The raw data stored in a database adds value for decision-making only if the right information is presented in the right format, to the right people, at the right time. A tall stack of printout may contain the right data, but it’s certainly not in an accessible format. Massive weekly batch printouts do not enable timely and proactive decisions. Raw data must be summarized, structured, and presented as digestible information. Once information is combined with direct experience, then the incredible human mind can extract and develop useful knowledge. Over time, as knowledge is accumulated and combined with direct experience and judgment, wisdom develops. This evolution is described by the classic pyramid of knowledge shown in Figure 1-01.
BACK TO CHICAGO
So what happened in Chicago? We can speculate upon several possible perspectives for why the team and its change leader were far from a true Lean system, yet they refused any help from IT providers:
1. They feared wasteful IT systems and procedures would be foisted on them.
Anonymous
Encapsulation is almost always a good thing to do, but sometimes information can be hidden in the wrong place. This makes the code difficult to understand, to integrate, or to build behavior from by composing objects. The best defense is to be clear about the difference between the two concepts when discussing a design. For example, we might say: • “Encapsulate the data structure for the cache in the CachingAuctionLoader class.” • “Encapsulate the name of the application’s log file in the PricingPolicy class.” These sound reasonable until we recast them in terms of information hiding: • “Hide the data structure used for the cache in the CachingAuctionLoader class.” • “Hide the name of the application’s log file in the PricingPolicy class.” Context independence tells us that we have no business hiding details of the log file in the PricingPolicy class—they’re concepts from different levels in the “Russian doll” structure of nested domains. If the log file name is necessary, it should be packaged up and passed in from a level that understands external configuration.
Steve Freeman (Growing Object-Oriented Software, Guided by Tests (Addison-Wesley Signature Series (Beck)))
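A minimal Python sketch of the recommendation, with hypothetical names (`PricingPolicy` is the book's example class, but this logging collaborator and its wiring are illustrative): instead of the class hiding a log file name it has no business knowing, the level that understands configuration passes the logging behavior in:

```python
class PricingPolicy:
    """Knows how to price; logging is a collaborator injected from outside."""
    def __init__(self, log):
        self._log = log          # no file names hidden in here

    def price(self, base: float) -> float:
        price = round(base * 1.2, 2)
        self._log(f"priced {base} -> {price}")
        return price

# The configuring level decides where log output goes; here, a list.
messages = []
policy = PricingPolicy(messages.append)
assert policy.price(10.0) == 12.0
assert messages == ["priced 10.0 -> 12.0"]
```

Swapping the destination (a file, a real logger) now requires no change to `PricingPolicy` at all.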
Such revolutions in formal learning and felt experience needed new modes to express their understanding, beyond sonorous Ciceronian periods and the rigid structure of heroic couplets. It needed something looser, longer, and above all historical, which could not only link events, data, ideas, and context through time, but in which history could itself serve as an informing principle. The age craved creation stories in which the logic and moral order were manifest in and through the unfolding of the story.
Lydia Pyne (The Last Lost World: Ice Ages, Human Origins, and the Invention of the Pleistocene)
Figure 3.35 shows examples of nonstandard trend lines: FIGURE 3.35 Nonstandard Trend Lines in XLF A is drawn between lows in a downtrend instead of between highs in a downtrend. B is also drawn between lows in a downtrend. Furthermore, it ignores a large price spike in an effort to fit the line to later data. C is more of a best-fit line drawn through the center of a price area. These may be drawn freehand or via a procedure like linear regression. D is drawn between highs in an uptrend. E raises a critical point about trend lines: They are lines drawn between successive swings in the market. If there are no swings, there should be no trend line. It would be hard to argue that the market was showing any swings at E, at least on this time frame. This trend line may be valid on a lower time frame, but it is nonstandard on this time frame. In general, trend lines are tools to define the relationship between swings, and are a complement to the simple length of swing analysis. As such, one of the requirements for drawing trend lines is that there must actually be swings in the market. We see many cases where markets are flat, and it is possible to draw trend lines that touch the tops or bottoms of many consecutive price bars. With one important exception later in this chapter, these types of trend lines do not tend to be very significant. They are penetrated easily by the smallest motions in the market, and there is no reliable price action after the penetration. Avoid drawing these trend lines in flat markets with no definable swings.
Adam H. Grimes (The Art and Science of Technical Analysis: Market Structure, Price Action, and Trading Strategies (Wiley Trading Book 547))
The Indivo system resolves the Problem of Mutual Accommodation of Interdependent Systems summarized earlier by inserting a layer of virtualization between two interdependent structures. It makes the data open, modular, and conformable, so that the applications using the data can be optimized. By being modular (open source), the data in PHRs are commoditized—it is no longer a strategic asset, nor where money can be made. Instead, profit in the industry will be made by firms that build applications that use the data. Some
Clayton M. Christensen (The Innovator's Prescription: A Disruptive Solution for Health Care)
Procedural code (code using data structures) makes it easy to add new functions without changing the existing data structures. OO code, on the other hand, makes it easy to add new classes without changing existing functions.
Robert C. Martin
The complement is also true: Procedural code makes it hard to add new data structures because all the functions must change. OO code makes it hard to add new functions because all the classes must change.
Robert C. Martin
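This trade-off (often called the expression problem) can be sketched in Python; the shape types below are illustrative, not Martin's:

```python
import math
from dataclasses import dataclass

# Procedural style: one set of data structures, many functions.
# A new operation (say, perimeter) is one new function; a new shape
# means editing every function's type dispatch.
@dataclass
class Circle:
    r: float

@dataclass
class Square:
    side: float

def area(shape) -> float:
    if isinstance(shape, Circle):
        return math.pi * shape.r ** 2
    if isinstance(shape, Square):
        return shape.side ** 2
    raise TypeError(shape)

# OO style: a new shape is one new class; a new operation means
# touching every existing class.
class CircleObj:
    def __init__(self, r: float):
        self.r = r

    def area(self) -> float:
        return math.pi * self.r ** 2

class SquareObj:
    def __init__(self, side: float):
        self.side = side

    def area(self) -> float:
        return self.side ** 2

assert area(Square(3)) == 9
assert SquareObj(3).area() == 9
```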
Stale data can cause serious and confusing failures such as unexpected exceptions, corrupted data structures, inaccurate computations, and infinite loops. [2]
Brian Goetz (Java Concurrency in Practice)
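Goetz's examples are Java; the same hazard can be sketched in Python, where an unguarded read-modify-write on shared state can interleave across threads. Holding a lock around the compound operation keeps every thread's view consistent:

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:          # without this, increments can be lost:
            counter += 1    # read-modify-write is not atomic

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter == 40_000    # no updates lost to stale reads
```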
However, there are actually few good data, or much theory, as to why relative brain size is the best indicator of cognitive ability, other than a general feeling that large animals need large brains. Instead, there is increasing evidence from structural analyses of brains, as well as from attempts to test species with different-sized brains on comparable tasks, that absolute size may be a better general measure of cognitive ability.
Hal Whitehead (The Cultural Lives of Whales and Dolphins)
Logic. Rationality. Reasoning. Thought. Analysis. Calculation. Decision-making. All this is within the mind of a human being, correct? Humanity
Code Well Academy (Javascript Artificial Intelligence: Made Easy, w/ Essential Programming; Create your * Problem Solving * Algorithms! TODAY! w/ Machine Learning & Data Structures (Artificial Intelligence Series))
Following the path of earlier unificationists, one of Eddington's aims was to reduce the contingencies in the description of nature, for example, by explaining the fundamental constants of physics rather than accepting them as merely experimental data. One of these constants was the fine-structure constant ..., which entered prominently in Dirac's theory and was known to be about 1/137.
Helge Kragh (Quantum Generations: A History of Physics in the Twentieth Century)
Each tribe’s solution to its central problem is a brilliant, hard-won advance. But the true Master Algorithm must solve all five problems, not just one. For example, to cure cancer we need to understand the metabolic networks in the cell: which genes regulate which others, which chemical reactions the resulting proteins control, and how adding a new molecule to the mix would affect the network. It would be silly to try to learn all of this from scratch, ignoring all the knowledge that biologists have painstakingly accumulated over the decades. Symbolists know how to combine this knowledge with data from DNA sequencers, gene expression microarrays, and so on, to produce results that you couldn’t get with either alone. But the knowledge we obtain by inverse deduction is purely qualitative; we need to learn not just who interacts with whom, but how much, and backpropagation can do that. Nevertheless, both inverse deduction and backpropagation would be lost in space without some basic structure on which to hang the interactions and parameters they find, and genetic programming can discover it. At this point, if we had complete knowledge of the metabolism and all the data relevant to a given patient, we could figure out a treatment for her. But in reality the information we have is always very incomplete, and even incorrect in places; we need to make headway despite that, and that’s what probabilistic inference is for. In the hardest cases, the patient’s cancer looks very different from previous ones, and all our learned knowledge fails. Similarity-based algorithms can save the day by seeing analogies between superficially very different situations, zeroing in on their essential similarities and ignoring the rest. In this book we will synthesize a single algorithm with all these capabilities:
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)