Data Engineering Quotes

We've searched our database for all the quotes and captions related to Data Engineering. Here they are! All 100 of them:

On the first day of college you will worry about how you will do inside the college, and on the last day of college you will wonder what you will do outside the college.
Amit Kalantri
Generally, the craft of programming is the factoring of a set of requirements into a set of functions and data structures.
Douglas Crockford (JavaScript: The Good Parts)
Sometimes I would worry about my internet habits and force myself away from the computer, to read a magazine or book. Contemporary literature offered no respite: I would find the prose cluttered with data points, tenuous historical connections, detail so finely tuned it could have only been extracted from a feverish night of search-engine queries. Aphorisms were in; authors were wired. I would pick up books that had been heavily documented on social media, only to find that the books themselves had a curatorial affect: beautiful descriptions of little substance, arranged in elegant vignettes—gestural text, the equivalent of a rumpled linen bedsheet or a bunch of dahlias placed just so. Oh, I would think, turning the page. This author is addicted to the internet, too.
Anna Wiener (Uncanny Valley)
“The important thing with Elon,” he says, “is that if you told him the risks and showed him the engineering data, he would make a quick assessment and let the responsibility shift from your shoulders to his.”
Walter Isaacson (Elon Musk)
Search engines find the information, not necessarily the truth.
Amit Kalantri
A basic principle of data processing teaches the folly of trying to maintain independent files in synchronism.
Frederick P. Brooks Jr. (The Mythical Man-Month: Essays on Software Engineering)
ideologues of every stripe, as well as folks with interests economic, political, or personal, can interpret data and statistics to suit their own purposes...
Peter Benchley (Shark Trouble)
Data scientist (noun): Person who is better at statistics than any software engineer and better at software engineering than any statistician. — Josh Wills
Rachel Schutt (Doing Data Science: Straight Talk from the Frontline)
At its most basic level, semantic search applies meaning to the connections between the different data nodes of the Web in ways that allow a clearer understanding of them than we have ever had to date.
David Amerland (Google Semantic Search: Search Engine Optimization (SEO) Techniques That Get Your Company More Traffic)
The appeal by twentieth-century pluralists to scientific method was also ideologically—and even messianically—driven. It ignored scientific data that interfered with environmentalist assumptions and misrepresented socialist faith as “scientific planning.”
Paul Edward Gottfried
Every thought that arises in the mind has its roots in data you have already accumulated.
Sadhguru (Inner Engineering: A Yogi's Guide to Joy)
Of course!” Jack beamed. “Software engineers are sneaky bastards when it comes to data management.
Andy Weir (The Martian)
the invention of deep learning means that we are moving from the age of expertise to the age of data. Training successful deep-learning algorithms requires computing power, technical talent, and lots of data. But of those three, it is the volume of data that will be the most important going forward. That’s because once technical talent reaches a certain threshold, it begins to show diminishing returns. Beyond that point, data makes all the difference. Algorithms tuned by an average engineer can outperform those built by the world’s leading experts if the average engineer has access to far more data.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
He liked to start sentences with okay, so. It was a habit he had picked up from the engineers. He thought it made him sound smarter, thought it made him sound like them, those code jockeys, standing by the coffee machine, talking faster than he could think, talking not so much in sentences as in data structures, dense clumps of logic with the occasional inside joke. He liked to stand near them, pretending to stir sugar into his coffee, listening in on them as if they were speaking a different language. A language of knowing something, a language of being an expert at something. A language of being something more than an hourly unit.
Charles Yu (Sorry Please Thank You)
If you could be any character on The Next Generation, who would you be?" "Easy," Solomon said. "Data. For sure." "That makes sense," Clark said. "You?" "I always liked Wesley Crusher." "What?" Solomon was appalled. "Nobody likes Wesley Crusher." "Why not?" Lisa asked. "Because he's a total Mary Sue," Solomon said. "He's too perfect." "But he's always saving the day," Clark argued. "Like, always." "Exactly. He's just a talking deus ex machina. Everybody on the ship treats him like a dumb kid, then he saves them at the last minute and, every single time, they go right back to treating him like a dumb kid again. Do I need to remind you that the starship Enterprise is full of genius scientists and engineers? Why's this kid who can't get into Starfleet Academy smarter than all of them?" "Good point," Clark said. "He's still my choice, though.
John Corey Whaley (Highly Illogical Behavior)
data mining is an exploratory undertaking closer to research and development than it is to engineering.
Foster Provost (Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking)
The science and engineering of programming just isn’t good enough to produce flawless software, and that isn’t going to change anytime soon.
Bruce Schneier (Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World)
Data science is the civil engineering of data. Its acolytes possess a practical knowledge of tools and materials, coupled with a theoretical understanding of what’s possible.
Rachel Schutt (Doing Data Science: Straight Talk from the Frontline)
As one Google Translate engineer put it, "when you go from 10,000 training examples to 10 billion training examples, it all starts to work. Data trumps everything."
Garry Kasparov (Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins)
My nose wrinkles at the reek of the trickle of data flowing to the device in his hands; no security system devised by man can withstand the relentless destructive pressure of boredom.
Greg Chivers (The Crying Machine)
Monte Carlo is able to discover practical solutions to otherwise intractable problems because the most efficient search of an unmapped territory takes the form of a random walk. Today’s search engines, long descended from their ENIAC-era ancestors, still bear the imprint of their Monte Carlo origins: random search paths being accounted for, statistically, to accumulate increasingly accurate results. The genius of Monte Carlo—and its search-engine descendants—lies in the ability to extract meaningful solutions, in the face of overwhelming information, by recognizing that meaning resides less in the data at the end points and more in the intervening paths.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
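Dyson's point, that a random walk accumulates statistically accurate results out of overwhelming information, is easy to demonstrate. A minimal Monte Carlo sketch (the function name and sample count are illustrative, not from the text) estimates pi from nothing but random points in the unit square:

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: the fraction of random points
    landing inside the quarter circle converges on pi/4."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples

print(estimate_pi(100_000))
```

The error shrinks like 1/sqrt(N), so each additional digit of accuracy costs roughly a hundred times more samples; the meaning really does accumulate along the paths rather than sitting in any single data point.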
At the risk of being redundant, everything must be checked in to code. This includes the following: DB object migrations, triggers, procedures and functions, views, configurations, sample datasets for functionality, and data cleanup scripts.
Laine Campbell (Database Reliability Engineering: Designing and Operating Resilient Database Systems)
I repeat, feedback is a method of controlling a system by reinserting into it the results of its past performance. If these results are merely used as numerical data for the criticism of the system and its regulation, we have the simple feedback of the control engineers. If, however, the information which proceeds backward from the performance is able to change the general method and pattern of performance, we have a process which may well be called learning.
Norbert Wiener (The Human Use Of Human Beings: Cybernetics And Society (The Da Capo series in science))
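Wiener's "simple feedback of the control engineers," reinserting past performance as numerical data to regulate a system, can be sketched in a few lines. This is a hypothetical proportional controller; the function name, gain, and target values are invented for illustration:

```python
def regulate(target, state=0.0, gain=0.5, steps=20):
    """Simple feedback: each step measures the result of past
    performance (the error) and reinserts it to correct the output."""
    history = []
    for _ in range(steps):
        error = target - state   # result of past performance
        state += gain * error    # reinserted as a numerical correction
        history.append(state)
    return history

trace = regulate(target=10.0)
print(trace[-1])   # converges toward the target of 10.0
```

In Wiener's terms this is only regulation, not learning: the error adjusts the output, but the gain and the method itself never change in response to experience.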
Engineering desire is the approach of Silicon Valley, authoritarian governments, and the Cult of Experts. The first two use intelligence and data to centrally plan a system in which people want things that other people want them to want -- things that benefit a certain group of people. This approach poses a serious threat to human agency. It also lacks respect for the capability of people to freely desire what is best for themselves and for the people they love.
Luke Burgis (Wanting: The Power of Mimetic Desire in Everyday Life)
intellectuals who reject eternal truths and experience through the ages for the social engineering by supposed experts and their administrative state—which claim to use data, science, and empiricism to analyze, manage, and control society.
Mark R. Levin (American Marxism)
The Columbia Accident Investigation Board concluded that NASA’s culture “emphasized chain of command, procedure, following the rules, and going by the book. While rules and procedures were essential for coordination, they had an unintended negative effect.” Once again, “allegiance to hierarchy and procedure” had ended in disaster. Again, lower-ranking engineers had concerns they could not quantify; they stayed silent because “the requirement for data was stringent and inhibiting.”
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Realizing the newfound promise of electrification a century ago required four key inputs: fossil fuels to generate it, entrepreneurs to build new businesses around it, electrical engineers to manipulate it, and a supportive government to develop the underlying public infrastructure. Harnessing the power of AI today—the “electricity” of the twenty-first century—requires four analogous inputs: abundant data, hungry entrepreneurs, AI scientists, and an AI-friendly policy environment.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
A humorous treatment of the rigid uniformitarian view came from Mark Twain. Although the shortening of the Mississippi River he referred to was the result of engineering projects eliminating many of the bends in the river, it is a thought-provoking spoof: The Mississippi between Cairo and New Orleans was twelve hundred and fifteen miles long one hundred and seventy-six years ago. . . . Its length is only nine hundred and seventy-three miles at present. Now, if I wanted to be one of those ponderous scientific people, and “let on” to prove what had occurred in the remote past by what had occurred in a given time in the recent past . . . what an opportunity is here! Geology never had such a chance, nor such exact data to argue from! . . . In the space of one hundred and seventy-six years the Lower Mississippi has shortened itself two hundred and forty-two miles. That is an average of a trifle over one mile and a third per year. Therefore, any calm person, who is not blind or idiotic, can see that in the Old Oolitic Silurian Period, just a million years ago next November, the Lower Mississippi River was upwards of one million three hundred thousand miles long, and stuck out over the Gulf of Mexico like a fishing-rod. And by the same token any person can see that seven hundred and forty-two years from now the lower Mississippi will be only a mile and three-quarters long. . . . There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.
Mark Twain (Life on the Mississippi)
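Twain's spoof is also a worked arithmetic lesson: his figures check out, and the absurdity of linear extrapolation falls straight out of them. A quick back-of-the-envelope sketch using only the numbers in the passage:

```python
# Twain's own figures: 1,215 miles 176 years ago, 973 miles "at present".
miles_lost = 1215 - 973              # 242 miles of shortening
rate = miles_lost / 176              # "a trifle over one mile and a third per year"

# Naive linear extrapolation a million years into the past:
past_length = 973 + 1_000_000 * rate

print(rate)          # 1.375 miles per year
print(past_length)   # about 1.38 million miles, "stuck out over the Gulf of Mexico"
```

The joke lands because a trend measured over 176 years is extended four orders of magnitude beyond its data: "wholesale returns of conjecture out of such a trifling investment of fact."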
Stop for a second to behold the miracle of engineering that these hand-held, networked computers represent—the typical CPU in a modern smartphone is ten times more powerful than the Cray-1 supercomputer installed at Los Alamos National Laboratory in 1976.
Anthony M. Townsend (Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia)
Complete rationality of action in the Cartesian sense demands complete knowledge of all the relevant facts. A designer or engineer needs all the data and full power to control or manipulate them if he is to organize the material objects to produce the intended result. But the success of action in society depends on more particular facts than anyone can possibly know. And our whole civilization in consequence rests, and must rest, on our believing much that we cannot know to be true in the Cartesian sense.
Friedrich A. Hayek (Law, Legislation and Liberty, Volume 1: Rules and Order)
When Amazon engineer Greg Linden originally introduced doppelganger searches to predict readers’ book preferences, the improvement in recommendations was so good that Amazon founder Jeff Bezos got to his knees and shouted, “I’m not worthy!” to Linden. But what is really interesting about doppelganger searches, considering their power, is not how they’re commonly being used now. It is how frequently they are not used. There are major areas of life that could be vastly improved by the kind of personalization these searches allow.
Seth Stephens-Davidowitz (Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are)
They didn't examine the problem and accumulate data to figure out the best solution - they engineered the outcome they wanted from the beginning. If they didn't achieve their desired outcome, they understood it was because of a decision they made at the start of the process.
Simon Sinek (Start with Why: How Great Leaders Inspire Everyone to Take Action)
It would be interesting to keep a running log of predictions and see if we can spot the absolute corkers when they are still just pert little buds. One such that I spotted recently was a statement made in February by a Mr. Wayne Leuck, vice-president of engineering at USWest, the American phone company. Arguing against the deployment of high-speed wireless data connections, he said, “Granted, you could use it in your car going sixty miles an hour, but I don’t think too many people are going to be doing that.” Just watch. That’s a statement that will come back to haunt him.
Douglas Adams (The Salmon of Doubt: Hitchhiking the Galaxy One Last Time)
“But you weren’t born,” I tell him. “I wrote an algorithm based on the Linux operating kernel. You’re an open-source search engine married to a dialog bot and a video compiler. The program scrubs the Web and archives a person’s images and videos and data—everything you say, you’ve said before.”
Adam Johnson (Fortune Smiles)
Different databases are designed to solve different problems. Using a single database engine for all of the requirements usually leads to non-performant solutions; storing transactional data, caching session information, and traversing the graph of customers and the products their friends bought are essentially different problems.
Pramod J. Sadalage (NoSQL Distilled: A Brief Guide to the Emerging World of Polyglot Persistence)
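The polyglot-persistence point can be made concrete with a toy sketch: the three workloads Sadalage names, each handled by the storage shape that fits it, in one short program. All table, customer, and product names here are invented for illustration, and SQLite stands in for the transactional store:

```python
import sqlite3

# Transactional data: a relational engine with ACID guarantees.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, product TEXT)")
db.execute("INSERT INTO orders VALUES ('alice', 'book'), ('bob', 'lamp')")
db.commit()

# Session cache: a plain key-value structure, no durability needed.
session_cache = {"alice": {"cart": ["book"]}}

# Friend-graph traversal: adjacency lists, natural for a graph store
# and a poor fit for relational joins once paths grow long.
friends = {"alice": ["bob"], "bob": ["alice"]}
bought = {row[0]: row[1] for row in db.execute("SELECT customer, product FROM orders")}
friends_bought = [bought[f] for f in friends["alice"] if f in bought]
print(friends_bought)   # products alice's friends bought
```

Forcing any one of these shapes to serve the other two is exactly the "single database engine for all of the requirements" anti-pattern the quote warns against.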
The more time I spent in Finland, the more I started to worry that the reforms sweeping across the United States had the equation backwards. We were trying to reverse engineer a high-performance teaching culture through dazzlingly complex performance evaluations and value-added data analysis. It made sense to reward, train, and dismiss more teachers based on their performance, but that approach assumed that the worst teachers would be replaced with much better ones, and that the mediocre teachers would improve enough to give students the kind of education they deserved. However, there was not much evidence that either scenario was happening in reality.
Amanda Ripley (The Smartest Kids in the World: And How They Got That Way)
The chopped salad is engineered…to free one’s hand and eyes from the task of consuming nutrients, so that precious attention can be directed toward a small screen, where it is more urgently needed, so it can consume data: work email or Amazon’s nearly infinite catalog or Facebook’s actually infinite News Feed, where, as one shops for diapers or engages with the native advertising sprinkled between the not-hoaxes and baby photos, one is being productive by generating revenue for a large internet company, which is obviously good for the economy, or at least it is certainly better than spending lunch reading a book from the library, because who is making money from that?
Jia Tolentino (Trick Mirror)
As more and more data flows from your body and brain to the smart machines via the biometric sensors, it will become easy for corporations and government agencies to know you, manipulate you, and make decisions on your behalf. Even more importantly, they could decipher the deep mechanisms of all bodies and brains, and thereby gain the power to engineer life. If we want to prevent a small elite from monopolising such godlike powers, and if we want to prevent humankind from splitting into biological castes, the key question is: who owns the data? Does the data about my DNA, my brain and my life belong to me, to the government, to a corporation, or to the human collective?
Yuval Noah Harari (21 Lessons for the 21st Century)
Connascence, in the context of software engineering, refers to the degree of coupling between software components. (Connascence.io hosts a handy reference to the various types of connascence.) Software components are connascent if a change in one would require the other(s) to be modified in order to maintain the overall correctness of the system.
Piethein Strengholt (Data Management at Scale: Best Practices for Enterprise Architecture)
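Connascence of position, one of the types catalogued at Connascence.io, is straightforward to show: a sketch contrasting positional arguments (call sites coupled to parameter order) with keyword-only arguments (coupled only to names, a weaker and therefore preferable coupling across component boundaries). The function and parameter names are invented for illustration:

```python
# Connascence of position: every caller must agree on argument order,
# so reordering the parameters silently breaks all call sites.
def connect_positional(host, port, timeout):
    return f"{host}:{port} (timeout={timeout})"

# Connascence of name only: callers depend on parameter names,
# so the definition's order can change without touching call sites.
def connect_named(*, host, port, timeout):
    return f"{host}:{port} (timeout={timeout})"

print(connect_positional("db.local", 5432, 30))
print(connect_named(timeout=30, host="db.local", port=5432))
```

Both calls produce the same connection string, but only the second survives a reordering of the function signature, which is the "change in one requires the other to be modified" coupling the quote defines.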
In the mid-1990s, a new employee of Sun Microsystems in California kept disappearing from their database. Every time his details were entered, the system seemed to eat him whole; he would disappear without a trace. No one in HR could work out why poor Steve Null was database kryptonite. The staff in HR were entering the surname as “Null,” but they were blissfully unaware that, in a database, NULL represents a lack of data, so Steve became a non-entry. To computers, his name was Steve Zero or Steve McDoesNotExist. Apparently, it took a while to work out what was going on, as HR would happily reenter his details each time the issue was raised, never stopping to consider why the database was routinely removing him.
Matt Parker (Humble Pi: A Comedy of Maths Errors)
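Parker's Steve Null story hinges on the difference between the four-character string 'Null' and SQL's NULL marker for missing data. With proper parameter binding the two never collide, which suggests some layer in that HR system was stringifying its input; a minimal SQLite sketch (table and column names invented for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staff (surname TEXT)")

# The string 'Null' is ordinary data; Python's None maps to SQL NULL.
db.execute("INSERT INTO staff VALUES (?)", ("Null",))
db.execute("INSERT INTO staff VALUES (?)", (None,))

rows = list(db.execute("SELECT surname FROM staff WHERE surname IS NOT NULL"))
print(rows)   # [('Null',)] -- Steve survives; only the true NULL row is filtered
```

A correctly typed pipeline keeps Mr. Null in the database; the bug appears only when some component compares his surname against the word "NULL" as text, or serializes missing values and names through the same string channel.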
The modern world is drowning in information. We have more data than we can possibly use regarding nearly every picayune matter of society, economics, and politics. Science has contributed to this tsunami of facts and figures, but Riley's reports demonstrated that the tidal wave of minutiae is hardly unique to our time. In every age the challenge has been to move from information to knowledge. And the value of experts lies in their capacity to extract meaning from the reams of facts. Rather than being swamped by raw data, the connoisseur, craftsman, engineer, clinician, or scientist is selectively and self-consciously blind. Knowing what to ignore, recognizing what is extraneous, is the key to deriving pattern, form, and insight.
Jeffrey A. Lockwood
What’s more, attempting to score a teacher’s effectiveness by analyzing the test results of only twenty-five or thirty students is statistically unsound, even laughable. The numbers are far too small given all the things that could go wrong. Indeed, if we were to analyze teachers with the statistical rigor of a search engine, we’d have to test them on thousands or even millions of randomly selected students.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
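O'Neil's statistical objection is the familiar 1/sqrt(n) law: the spread of an average shrinks only with the square root of the sample size, so a 25-student average is noisy in a way a 10,000-student average is not. A quick simulation sketch (the score distribution, mean 75 and standard deviation 10, is invented for illustration):

```python
import math
import random

def mean_of_sample(rng, n):
    # Simulated student scores with true mean 75 and sd 10.
    return sum(rng.gauss(75, 10) for _ in range(n)) / n

def spread(xs):
    # Standard deviation of a list of sample means.
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

rng = random.Random(42)
small = [mean_of_sample(rng, 25) for _ in range(1000)]
large = [mean_of_sample(rng, 10_000) for _ in range(50)]

print(spread(small))   # roughly 10/sqrt(25)    = 2.0
print(spread(large))   # roughly 10/sqrt(10000) = 0.1
```

Two teachers of identical skill can easily differ by several points on a 25-student average through luck alone, which is the sense in which scoring them that way is "statistically unsound, even laughable."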
One moment it was a calculating machine, attempting dispassionately to keep up with the gouts of data. And then awash in those gouts, something metal twitched and a patter of valves sounded that had not been instructed by those numbers. A loop of data was self-generated by the analytical engine. The processor reflected on its creation in a hiss of high-pressure steam. One moment it was a calculating machine. The next, it thought.
China Miéville (Perdido Street Station (New Crobuzon, #1))
Lilah did little more than sleep and eat and cry, which to me was the most fascinating thing in the entire universe. Why did she cry? When did she sleep? What made her eat a lot one day and little the next? Was she changing with time? I did what any obsessed person would do in such a case: I recorded data, plotted it, calculated statistical correlations. First I just wrote on scraps of paper and made charts on graph paper, but I very quickly became more sophisticated. I wrote computer software to make a beautifully colored plot showing times when Diane fed Lilah, in black; when I fed her, in blue (expressed mother's milk, if you must know); Lilah's fussy times, in angry red; her happy times, in green. I calculated patterns in sleeping times, eating times, length of sleep, amounts eaten. Then, I did what any obsessed person would do these days; I put it all on the Web.
Mike Brown (How I Killed Pluto and Why It Had It Coming)
The brain is a belief engine. From sensory data flowing in through the senses the brain naturally begins to look for and find patterns, and then infuses those patterns with meaning. The first process I call patternicity: the tendency to find meaningful patterns in both meaningful and meaningless data. The second process I call agenticity: the tendency to infuse patterns with meaning, intention, and agency. We can’t help it. Our brains evolved to connect the dots of our world into meaningful patterns that explain why things happen. These meaningful patterns become beliefs, and these beliefs shape our understanding of reality. Once beliefs are formed, the brain begins to look for and find confirmatory evidence in support of those beliefs, which adds an emotional boost of further confidence in the beliefs and thereby accelerates the process of reinforcing them, and round and round the process goes in a positive feedback loop of belief confirmation.
Michael Shermer (The Believing Brain: From Spiritual Faiths to Political Convictions – How We Construct Beliefs and Reinforce Them as Truths)
In general, software engineering teams and IT departments seemed to be at the mercy of other groups who would negotiate, cajole, intimidate, and overrule even the most defensible and objectively derived plans. Even plans based on thorough analysis and backed by years of historical data were vulnerable. Most teams, which had neither a thorough analysis method nor any historical data, were powerless at the hands of others who would push them to commit to unknown (and often completely unreasonable) deliverables.
David J. Anderson (Kanban)
Uber had tagged Mr. England and his colleagues—essentially Greyballing them as city officials—based on data collected from the app and in other ways. The company then served up a fake version of the app, populated with ghost cars, to evade capture. This is supposed to be serious evidence of terrible wrongdoing. But a lot of us just read that description and burst out laughing, congratulating Uber’s engineers on their cleverness in leaving some blue-nosed petty authoritarian standing at the curb waiting for a car that never comes.
Robert Tracinski (So Who Is John Galt, Anyway?: A Reader's Guide to Ayn Rand's "Atlas Shrugged")
So we had better call upon our lawyers, politicians, philosophers and even poets to turn their attention to this conundrum: how do you regulate the ownership of data? This may well be the most important political question of our era. If we cannot answer this question soon, our sociopolitical system might collapse. People are already sensing the coming cataclysm. Perhaps this is why citizens all over the world are losing faith in the liberal story, which just a decade ago seemed irresistible. How, then, do we go forward from here, and how do we cope with the immense challenges of the biotech and infotech revolutions? Perhaps the very same scientists and entrepreneurs who disrupted the world in the first place could engineer some technological solution? For example, might networked algorithms form the scaffolding for a global human community that could collectively own all the data and oversee the future development of life? As global inequality rises and social tensions increase around the world, perhaps Mark Zuckerberg could call upon his 2 billion friends to join forces and do something together?
Yuval Noah Harari (21 Lessons for the 21st Century)
Whatever you are identified with, all your thoughts and emotions spring from that identity. Right now suppose you identify yourself as a man, all your thoughts and emotions flow from that identification. If you identify yourself with your nationality or religion, they will flow from those identifications. Whatever your thoughts and emotions, these identifications are a certain level of prejudice. In fact, your mind is itself a certain kind of prejudice. Why? Because it functions from limited data and is fronted by an essentially discriminatory intellect.
Sadhguru (Inner Engineering: A Yogi's Guide to Joy)
I dare to hope that search engines and social media algorithms will be optimized for truth and social relevance rather than simply showing people what they want to see; that there will be independent, third-party algorithms that rate the veracity of headlines, websites, and news stories in real time, allowing users to more quickly sift through the propaganda-laden garbage and get closer to evidence-based truth; that there will be actual respect for empirically tested data, because in an infinite sea of possible beliefs, evidence is the only life preserver we’ve got.
Mark Manson (Everything Is F*cked: A Book About Hope)
This underscores another common feature of WMDs. They tend to punish the poor. This is, in part, because they are engineered to evaluate large numbers of people. They specialize in bulk, and they’re cheap. That’s part of their appeal. The wealthy, by contrast, often benefit from personal input. A white-shoe law firm or an exclusive prep school will lean far more on recommendations and face-to-face interviews than will a fast-food chain or a cash-strapped urban school district. The privileged, we’ll see time and again, are processed more by people, the masses by machines.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
According to the prevailing notion, to be free means to be free to satisfy one’s preferences. Preferences themselves are beyond rational scrutiny; they express the authentic core of a self whose freedom is realized when there are no encumbrances to its preference-satisfying behavior. Reason is in the service of this freedom, in a purely instrumental way; it is a person’s capacity to calculate the best means to satisfy his ends. About the ends themselves we are to maintain a principled silence, out of respect for the autonomy of the individual. To do otherwise would be to risk lapsing into paternalism. Thus does liberal agnosticism about the human good line up with the market ideal of “choice.” We invoke the latter as a content-free meta-good that bathes every actual choice made in the softly egalitarian, flattering light of autonomy. This mutually reinforcing set of posits about freedom and rationality provides the basic framework for the discipline of economics, and for “liberal theory” in departments of political science. It is all wonderfully consistent, even beautiful. But in surveying contemporary life, it is hard not to notice that this catechism doesn’t describe our situation very well. Especially the bit about our preferences expressing a welling-up of the authentic self. Those preferences have become the object of social engineering, conducted not by government bureaucrats but by mind-bogglingly wealthy corporations armed with big data. To continue to insist that preferences express the sovereign self and are for that reason sacred—unavailable for rational scrutiny—is to put one’s head in the sand. The resolutely individualistic understanding of freedom and rationality we have inherited from the liberal tradition disarms the critical faculties we need most in order to grapple with the large-scale societal pressures we now face.
Matthew B. Crawford (The World Beyond Your Head: On Becoming an Individual in an Age of Distraction)
PRISM enabled the NSA to routinely collect data from Microsoft, Yahoo!, Google, Facebook, Paltalk, YouTube, Skype, AOL, and Apple, including email, photos, video and audio chats, Web-browsing content, search engine queries, and all other data stored on their clouds, transforming the companies into witting coconspirators. Upstream collection, meanwhile, was arguably even more invasive. It enabled the routine capturing of data directly from private-sector Internet infrastructure—the switches and routers that shunt Internet traffic worldwide, via the satellites in orbit and the high-capacity fiber-optic cables that run under the ocean.
Edward Snowden (Permanent Record)
A friend of ours encountered this problem with his home-built computer long ago. He wrote a BIOS that used a magic value in a particular memory location to determine whether a reset was a cold reboot or a warm reboot. After a while the machine refused to boot after power-up because the memory had learned the magic value, and the boot process therefore treated every reset as a warm reboot. As this did not initialize the proper variables, the boot process failed. The solution in his case was to swap some memory chips around, scrambling the magic value that the SRAM had learned. For us, it was a lesson to remember: memory retains more data than you think.
Niels Ferguson (Cryptography Engineering: Design Principles and Practical Applications)
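The cold-versus-warm-boot trick Ferguson describes, and its failure mode, can be sketched in a few lines (the MAGIC constant and function names are invented; a dict stands in for the SRAM):

```python
MAGIC = 0xCAFEBABE

def boot(memory):
    """The BIOS trick: a magic word left in RAM marks a warm reboot."""
    if memory.get("magic") == MAGIC:
        return "warm reboot: skip cold-boot initialization"
    memory["magic"] = MAGIC
    return "cold boot: initialize everything"

ram = {}                  # fresh power-up: no magic value present
print(boot(ram))          # classified as a cold boot
print(boot(ram))          # reset with power held: warm reboot

# The failure mode: SRAM that has "learned" the value retains it
# across power-off, so the next power-up is misclassified and the
# proper initialization is skipped -- the machine refuses to boot.
learned_ram = {"magic": MAGIC}
print(boot(learned_ram))  # wrongly reported as a warm reboot
```

Swapping the memory chips around, as in the anecdote, is the hardware equivalent of scrambling `learned_ram` so the magic word no longer matches.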
In the coming decades, it is likely that we will see more Internet-like revolutions, in which technology steals a march on politics. Artificial intelligence and biotechnology might soon overhaul our societies and economies – and our bodies and minds too – but they are hardly a blip on our political radar. Our current democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics loses control of events, and fails to provide us with meaningful visions for the future. That doesn’t mean we will go back to twentieth-century-style dictatorships. Authoritarian regimes seem to be equally overwhelmed by the pace of technological development and the speed and volume of the data flow. In the twentieth century, dictators had grand visions for the future. Communists and fascists alike sought to completely destroy the old world and build a new world in its place. Whatever you think about Lenin, Hitler or Mao, you cannot accuse them of lacking vision. Today it seems that leaders have a chance to pursue even grander visions. While communists and Nazis tried to create a new society and a new human with the help of steam engines and typewriters, today’s prophets could rely on biotechnology and super-computers.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Google had a built-in disadvantage in the social networking sweepstakes. It was happy to gather information about the intricate web of personal and professional connections known as the “social graph” (a term favored by Facebook’s Mark Zuckerberg) and integrate that data as signals in its search engine. But the basic premise of social networking—that a personal recommendation from a friend was more valuable than all of human wisdom, as represented by Google Search—was viewed with horror at Google. Page and Brin had started Google on the premise that the algorithm would provide the only answer. Yet there was evidence to the contrary. One day a Googler, Joe Kraus, was looking for an anniversary gift for his wife. He typed “Sixth Wedding Anniversary Gift Ideas” into Google, but beyond learning that the traditional gift involved either candy or iron, he didn’t see anything creative or inspired. So he decided to change his status message on Google Talk, a line of text seen by his contacts who used Gmail, to “Need ideas for sixth anniversary gift—candy ideas anyone?” Within a few hours, he got several amazing suggestions, including one from a colleague in Europe who pointed him to an artist and baker whose medium was cake and candy. (It turned out that Marissa Mayer was an investor in the company.) It was a sobering revelation for Kraus that sometimes your friends could trump algorithmic search.
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
It’s easy to raise graduation rates, for example, by lowering standards. Many students struggle with math and science prerequisites and foreign languages. Water down those requirements, and more students will graduate. But if one goal of our educational system is to produce more scientists and technologists for a global economy, how smart is that? It would also be a cinch to pump up the income numbers for graduates. All colleges would have to do is shrink their liberal arts programs, and get rid of education departments and social work departments while they’re at it, since teachers and social workers make less money than engineers, chemists, and computer scientists. But they’re no less valuable to society.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
It’s All about Scaling Most of the current learning algorithms were discovered more than twenty-five years ago, so why did it take so long for them to have an impact on the real world? With the computers and labeled data that were available to researchers in the 1980s, it was only possible to demonstrate proof of principle on toy problems. Despite some promising results, we did not know how well network learning and performance would scale as the number of units and connections increased to match the complexity of real-world problems. Most algorithms in AI scale badly and never went beyond solving toy problems. We now know that neural network learning scales well and that performance continues to increase with the size of the network and the number of layers. Backprop, in particular, scales extremely well. Should we be surprised? The cerebral cortex is a mammalian invention that mushroomed in primates and especially in humans. And as it expanded, more capacity became available and more layers were added in association areas for higher-order representations. There are few complex systems that scale this well. The Internet is one of the few engineered systems whose size has also been scaled up by a million times. The Internet evolved once the protocols were established for communicating packets, much like the genetic code for DNA made it possible for cells to evolve. Training many deep learning networks with the same set of data results in a large number of different networks that have roughly the same average level of performance.
Terrence J. Sejnowski (The Deep Learning Revolution (The MIT Press))
By 2011, enough data had been accumulated to show that some risk existed due to long-term, heavy use of mobile phones, compelling the International Agency for Research on Cancer, a branch of the World Health Organization, to classify mobile cell phone radiation in the Group 2B category, indicating a possible carcinogen (a substance or source of exposure that can cause cancer).14 This is the same category as DDT, lead, engine exhaust, chloroform, and glyphosate (the active ingredient in Roundup®). Later, in 2016, a $25 million study conducted by the National Toxicology Program (NTP), part of the National Institutes of Health, confirmed what many have believed for years—that exposure to EMF radiation emitted from cell phones can lead to serious health issues including brain and heart tumors.
Daniel T. DeBaun (Radiation Nation: Complete Guide to EMF Protection & Safety - The Proven Health Risks of EMF Radiation & What You Can Do to Protect Yourself & Family)
Equally important, statistical systems require feedback—something to tell them when they’re off track. Without feedback, however, a statistical engine can continue spinning out faulty and damaging analysis while never learning from its mistakes. Many of the WMDs I’ll be discussing in this book, including the Washington school district’s value-added model, behave like that. They define their own reality and use it to justify their results. This type of model is self-perpetuating, highly destructive—and very common. If the people being evaluated are kept in the dark, the thinking goes, they’ll be less likely to attempt to game the system. Instead, they’ll simply have to work hard, follow the rules, and pray that the model registers and appreciates their efforts. But if the details are hidden, it’s also harder to question the score or to protest against it.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
von Braun went looking for problems, hunches, and bad news. He even rewarded those who exposed problems. After Kranz and von Braun’s time, the “All Others Bring Data” process culture remained, but the informal culture and power of individual hunches shriveled. In 1974, William Lucas took over the Marshall Space Flight Center. A NASA chief historian wrote that Lucas was a brilliant engineer but “often grew angry when he learned of problems.” Allan McDonald described him to me as a “shoot-the-messenger type guy.” Lucas transformed von Braun’s Monday Notes into a system purely for upward communication. He did not write feedback and the notes did not circulate. At one point they morphed into standardized forms that had to be filled out. Monday Notes became one more rigid formality in a process culture. “Immediately, the quality of the notes fell,” wrote another official NASA historian.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
The current narrative we seem to tell ourselves about our privacy is that it is a sort of currency we trade to corporations in return for innovation. But the corporation has an insatiable appetite for our most personal data in order to drive us to consume during our every waking moment. I think this is critical, because in some ways social networks are powerful engines of conformity. The ability for students to develop their own ideas, identities, and political affiliations should take place outside of the panopticon view of Facebook, but whether this is any longer possible is an open question. My own memory is that the development of my political and cultural persona between the ages of fifteen and twenty-one had a lot to do with being outside the zone of judgment of my parents, their conservative peers from my hometown, Cleveland, and maybe even from my siblings. I’m not sure that it could happen if we were all on Facebook together.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
Peopleware. A major contribution during recent years has been DeMarco and Lister's 1987 book, Peopleware: Productive Projects and Teams. Its underlying thesis is that "The major problems of our work are not so much technological as sociological in nature." It abounds with gems such as, "The manager's function is not to make people work, it is to make it possible for people to work." It deals with such mundane topics as space, furniture, team meals together. DeMarco and Lister provide real data from their Coding War Games that show stunning correlation between performances of programmers from the same organization, and between workplace characteristics and both productivity and defect levels. The top performers' space is quieter, more private, better protected against interruption, and there is more of it. . . . Does it really matter to you . . . whether quiet, space, and privacy help your current people to do better work or [alternatively] help you to attract and keep better people?[19]
Frederick P. Brooks Jr. (The Mythical Man-Month: Essays on Software Engineering)
Wider lanes were, obviously, safer than narrower ones. Only they’re not. This time, the problem with the cost-benefit equation wasn’t a faulty premise, but the data itself. In order to test the wider-lanes-are-safer-lanes hypothesis, I studied every crash that occurred on the bridge over a three-year period and marked each one on a map. If that notion had been true, I reasoned, more crashes would have occurred where the lanes were narrowest, that is, at the towers. Just the opposite turned out to be the case. The towers, it turned out, were the safest places on the entire bridge; my explanation is that when lanes get very narrow motorists drive more carefully. Even though every traffic engineer in the country had been taught the gospel of wider lanes, the opposite appeared to be true: “grossly substandard lanes seemed to be the safest of all.” This was the traffic engineering equivalent of saying the Earth was round when the masses knew it was flat. Still, most engineers do not accept this fact.
Samuel I. Schwartz (Street Smart: The Rise of Cities and the Fall of Cars)
In the twenty-first century it sounds childish to compare the human psyche to a steam engine. Today we know of a far more sophisticated technology – the computer – so we explain the human psyche as if it were a computer processing data rather than a steam engine regulating pressure. But this new analogy may turn out to be just as naïve. After all, computers have no minds. They don’t crave anything even when they have a bug, and the Internet doesn’t feel pain even when authoritarian regimes sever entire countries from the Web. So why use computers as a model for understanding the mind? Well, are we really sure that computers have no sensations or desires? And even if they haven’t got any at present, perhaps once they become complex enough they might develop consciousness? If that were to happen, how could we ascertain it? When computers replace our bus driver, our teacher and our shrink, how could we determine whether they have feelings or whether they are just a collection of mindless algorithms? When
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
When we mix a practical ability to engineer minds with our ignorance of the mental spectrum and with the narrow interests of governments, armies and corporations, we get a recipe for trouble. We may successfully upgrade our bodies and our brains, while losing our minds in the process. Indeed, techno-humanism may end up downgrading humans. The system may prefer downgraded humans not because they would possess any superhuman knacks, but because they would lack some really disturbing human qualities that hamper the system and slow it down. As any farmer knows, it’s usually the brightest goat in the flock that stirs up the most trouble, which is why the Agricultural Revolution involved downgrading animals’ mental abilities. The second cognitive revolution, dreamed up by techno-humanists, might do the same to us, producing human cogs who communicate and process data far more effectively than ever before, but who can hardly pay attention, dream or doubt. For millions of years we were enhanced chimpanzees. In the future, we may become oversized ants.
Yuval Noah Harari (Homo Deus: ‘An intoxicating brew of science, philosophy and futurism’ Mail on Sunday)
One other thing. And this really matters for readers of this book. According to official Myers–Briggs documents, the test can ‘give you an insight into what kinds of work you might enjoy and be successful doing’. So if you are, like me, classified as ‘INTJ’ (your dominant traits are being introverted, intuitive and having a preference for thinking and judging), the best-fit occupations include management consultant, IT professional and engineer.30 Would a change to one of these careers make me more fulfilled? Unlikely, according to respected US psychologist David Pittenger, because there is ‘no evidence to show a positive relation between MBTI type and success within an occupation…nor is there any data to suggest that specific types are more satisfied within specific occupations than are other types’. Then why is the MBTI so popular? Its success, he argues, is primarily due to ‘the beguiling nature of the horoscope-like summaries of personality and steady marketing’.31 Personality tests have their uses, even if they do not reveal any scientific ‘truth’ about us. If we are in a state of confusion they can be a great emotional comfort, offering a clear diagnosis of why our current job may not be right, and suggesting others that might suit us better. They also raise interesting hypotheses that aid self-reflection: until I took the MBTI, I had certainly never considered that IT could offer me a bright future (by the way, I apparently have the wrong personality type to be a writer). Yet we should be wary about relying on them as a magic pill that enables us suddenly to hit upon a dream career. That is why wise career counsellors treat such tests with caution, using them as only one of many ways of exploring who you are. Human personality does not neatly reduce into sixteen or any other definitive number of categories: we are far more complex creatures than psychometric tests can ever reveal. 
And as we will shortly learn, there is compelling evidence that we are much more likely to find fulfilling work by conducting career experiments in the real world than by filling out any number of questionnaires.32
Roman Krznaric (How to Find Fulfilling Work (The School of Life))
Search engine query data is not the product of a designed statistical experiment and finding a way to meaningfully analyse such data and extract useful knowledge is a new and challenging field that would benefit from collaboration. For the 2012–13 flu season, Google made significant changes to its algorithms and started to use a relatively new mathematical technique called Elasticnet, which provides a rigorous means of selecting and reducing the number of predictors required. In 2011, Google launched a similar program for tracking Dengue fever, but they are no longer publishing predictions and, in 2015, Google Flu Trends was withdrawn. They are, however, now sharing their data with academic researchers... Google Flu Trends, one of the earlier attempts at using big data for epidemic prediction, provided useful insights to researchers who came after them... The Delphi Research Group at Carnegie Mellon University won the CDC’s challenge to ‘Predict the Flu’ in both 2014–15 and 2015–16 for the most accurate forecasters. The group successfully used data from Google, Twitter, and Wikipedia for monitoring flu outbreaks.
Dawn E. Holmes (Big Data: A Very Short Introduction (Very Short Introductions))
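The elastic net Holmes mentions combines the L1 (lasso) and L2 (ridge) regularization penalties; the L1 term is what drives unneeded coefficients to zero, which is how the method "selects and reduces the number of predictors." A minimal sketch of just the penalty term, with illustrative weights (this is the generic textbook formulation, not Google's actual model):

```python
def elastic_net_penalty(weights, alpha=1.0, l1_ratio=0.5):
    """Elastic-net regularization term: a convex mix of L1 and L2 penalties.

    l1_ratio=1.0 recovers the lasso; l1_ratio=0.0 recovers ridge regression.
    """
    l1 = sum(abs(w) for w in weights)       # encourages sparse coefficients
    l2 = sum(w * w for w in weights)        # shrinks coefficients smoothly
    return alpha * (l1_ratio * l1 + 0.5 * (1 - l1_ratio) * l2)

# Illustrative coefficient vector: l1 = 7, l2 = 25,
# so the penalty is 0.5 * 7 + 0.25 * 25 = 9.75
print(elastic_net_penalty([0.0, 3.0, -4.0]))  # -> 9.75
```

In practice this penalty is added to a model's training loss (for example, mean squared error) so that the optimizer trades prediction accuracy against coefficient size, discarding weak predictors along the way.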
I have identified patterns in the technology and civilisational behaviour that confirms to me, with great certainty, that approximately thirty thousand human years ago, about ten thousand years after the date of the ceephays’ fall, a new center of data and physical traffic accumulated at the Keijir System. Ten thousand years correlates closely with how long I estimate a ceephay vessel would have taken to travel some feasible sublight to avoid detection using jump engines, until being discovered by reeh vessels, possibly in some kind of hibernation. The accumulation of technologies at the Keijir System was very rapid, and was responded to by several of the species who were recording this data. There are even several surviving speculations from their academia at the time, wondering what was happening at Keijir. I believe these events are entirely consistent with the discovery, by an organic species of lesser intelligence and capability, of an entity of greater intelligence and capability. Great technological advances appear to have followed, and the expansion of what became the Reeh Empire commenced shortly thereafter.
Joel Shepherd (Qalea Drop (The Spiral Wars, #7))
Back in the early 1990s, the FBI started worrying about its ability to conduct telephone surveillance. The FBI could do it with the old analog phone switches: a laborious process involving alligator clips, wires, and a tape recorder. The problem was that digital switches didn’t work that way. Isolating individual connections was harder, and the FBI became concerned about the potential loss of its ability to wiretap. So it lobbied Congress hard and got a law passed in 1994 called the Communications Assistance for Law Enforcement Act, or CALEA, requiring telcos to re-engineer their digital switches to have eavesdropping capabilities built in. Fast-forward 20 years, and the FBI again wants the IT industry to make surveillance easier for itself. A lot of communications no longer happen over the telephone. They’re happening over chat. They’re happening over e-mail. They’re happening over Skype. The FBI is currently lobbying for a legislative upgrade to CALEA, one that covers all communications systems: all voice, video, and text systems, including World of Warcraft and that little chat window attached to your online Scrabble game. The FBI’s ultimate goal is government prohibition of truly secure communications. Valerie
Bruce Schneier (Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World)
Dr. Fauci’s business closures pulverized America’s middle class and engineered the largest upward transfer of wealth in human history. In 2020, workers lost $3.7 trillion while billionaires gained $3.9 trillion.46 Some 493 individuals became new billionaires,47 and an additional 8 million Americans dropped below the poverty line.48 The biggest winners were the robber barons—the very companies that were cheerleading Dr. Fauci’s lockdown and censoring his critics: Big Technology, Big Data, Big Telecom, Big Finance, Big Media behemoths (Michael Bloomberg, Rupert Murdoch, Viacom, and Disney), and Silicon Valley Internet titans like Jeff Bezos, Bill Gates, Mark Zuckerberg, Eric Schmidt, Sergey Brin, Larry Page, Larry Ellison, and Jack Dorsey. The very Internet companies that snookered us all with the promise of democratizing communications made it impermissible for Americans to criticize their government or question the safety of pharmaceutical products; these companies propped up all official pronouncements while scrubbing all dissent. The same Tech/Data and Telecom robber barons, gorging themselves on the corpses of our obliterated middle class, rapidly transformed America’s once-proud democracy into a censorship and surveillance police state from which they profit at every turn.
Robert F. Kennedy Jr. (The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health)
This happens because data scientists all too often lose sight of the folks on the receiving end of the transaction. They certainly understand that a data-crunching program is bound to misinterpret people a certain percentage of the time, putting them in the wrong groups and denying them a job or a chance at their dream house. But as a rule, the people running the WMDs don’t dwell on those errors. Their feedback is money, which is also their incentive. Their systems are engineered to gobble up more data and fine-tune their analytics so that more money will pour in. Investors, of course, feast on these returns and shower WMD companies with more money. And the victims? Well, an internal data scientist might say, no statistical system can be perfect. Those folks are collateral damage. And often, like Sarah Wysocki, they are deemed unworthy and expendable. Big Data has plenty of evangelists, but I’m not one of them. This book will focus sharply in the other direction, on the damage inflicted by WMDs and the injustice they perpetuate. We will explore harmful examples that affect people at critical life moments: going to college, borrowing money, getting sentenced to prison, or finding and holding a job. All of these life domains are increasingly controlled by secret models wielding arbitrary punishments.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
one stubborn glitch they couldn’t figure out: the program did a wonderful job spewing out data on the trajectory of artillery shells, but it just didn’t know when to stop. Even after the shell would have hit the ground, the program kept calculating its trajectory, “like a hypothetical shell burrowing through the ground at the same rate it had traveled through the air,” as Jennings described it. “Unless we solved that problem, we knew the demonstration would be a dud, and the ENIAC’s inventors and engineers would be embarrassed.”69 Jennings and Snyder worked late into the evening before the press briefing trying to fix it, but they couldn’t. They finally gave up at midnight, when Snyder needed to catch the last train to her suburban apartment. But after she went to bed, Snyder figured it out: “I woke up in the middle of the night thinking what that error was. . . . I came in, made a special trip on the early train that morning to look at a certain wire.” The problem was that there was a setting at the end of a “do loop” that was one digit off. She flipped the requisite switch and the glitch was fixed. “Betty could do more logical reasoning while she was asleep than most people can do awake,” Jennings later marveled. “While she slept, her subconscious untangled the knot that her conscious mind had been unable to.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
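The glitch Snyder fixed was a loop that never tested whether the shell had landed. A toy sketch in modern terms (this is an illustrative integrator, not the ENIAC program; the launch speed and step size are made up) shows why the stopping condition at the end of the "do loop" matters:

```python
def trajectory(v0, dt=0.1, g=9.8):
    """Toy vertical-launch integrator, sampling the shell's height each step.

    The break below plays the role of the fix Snyder made: without a stopping
    test, the loop would keep computing the shell's position after it had
    already hit the ground, "burrowing" to ever more negative heights.
    """
    y, v, heights = 0.0, v0, []
    while True:
        v -= g * dt        # gravity slows, then reverses, the shell
        y += v * dt
        if y <= 0:         # shell has landed: stop here
            break
        heights.append(y)
    return heights

hs = trajectory(20.0)
print(f"{len(hs)} samples, peak height {max(hs):.1f} m")
```

Flip the comparison in the `if` (or drop it entirely) and the loop runs forever, exactly the kind of one-digit-off setting Snyder tracked down at the end of the do loop.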
We are striving to engineer the internet of all things in hope to make us healthy, happy and powerful. Yet once the internet of all things is up and running, we might be reduced from engineers to chips then to data and eventually, we might dissolve within the data torrent like a clump of earth within a gushing river. Dataism, thereby, threatens to do to Homo sapiens what Homo sapiens has done to all other animals. In the course of history, humans have created a global network and evaluated everything according to its function within the network. For thousands of years this boosted human pride and prejudices. Since humans fulfilled the most important function in the network, it was easy for us to take credit for the network’s achievements and to see ourselves as the apex of creation. The lives and experiences of all other animals were undervalued because they fulfilled far less important functions. And whenever an animal ceased to fulfil any function at all it went extinct. However, once humans lose their functional importance to the network, we’ll discover that we are not the apex of creation after all. The yardsticks that we ourselves have enshrined will condemn us to join the mammoths and the Chinese river dolphins in oblivion. Looking back, humanity will turn out to be just a ripple within the cosmic data flow.
Yuval Noah Harari (Homo Deus)
In a 2006 interview by Jim Gray, Amazon CTO Werner Vogels recalled another watershed moment: We went through a period of serious introspection and concluded that a service-oriented architecture would give us the level of isolation that would allow us to build many software components rapidly and independently. By the way, this was way before service-oriented was a buzzword. For us service orientation means encapsulating the data with the business logic that operates on the data, with the only access through a published service interface. No direct database access is allowed from outside the service, and there’s no data sharing among the services.3 That’s a lot to unpack for non–software engineers, but the basic idea is this: If multiple teams have direct access to a shared block of software code or some part of a database, they slow each other down. Whether they’re allowed to change the way the code works, change how the data are organized, or merely build something that uses the shared code or data, everybody is at risk if anybody makes a change. Managing that risk requires a lot of time spent in coordination. The solution is to encapsulate, that is, assign ownership of a given block of code or part of a database to one team. Anyone else who wants something from that walled-off area must make a well-documented service request via an API.
Colin Bryar (Working Backwards: Insights, Stories, and Secrets from Inside Amazon)
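Encapsulation as Vogels describes it can be sketched in a few lines. In this hypothetical example (the service and its methods are illustrative, not Amazon's), one team owns the data store outright, and every other team goes through the published interface rather than reaching into the data directly:

```python
class OrderService:
    """One team owns this service and its data store outright."""

    def __init__(self):
        # Encapsulated state: no "direct database access" from outside.
        self._orders = {}

    # The published service interface: the only supported way in.
    def place_order(self, order_id, item):
        self._orders[order_id] = {"item": item, "status": "placed"}
        return order_id

    def get_status(self, order_id):
        return self._orders.get(order_id, {}).get("status")

# Another team's code calls the interface rather than reading _orders.
svc = OrderService()
svc.place_order("o1", "book")
print(svc.get_status("o1"))  # -> placed
```

Because callers depend only on `place_order` and `get_status`, the owning team can reorganize `_orders` (or swap it for a real database) without coordinating with anyone, which is exactly the isolation the passage describes.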
Imagine a latter-day Helmholtz presented by an engineer with a digital camera, with its screen of tiny photocells, set up to capture images projected directly on to the surface of the screen. That makes good sense, and obviously each photocell has a wire connecting it to a computing device of some kind where images are collated. Makes sense again. Helmholtz wouldn’t send it back. But now, suppose I tell you that the eye’s ‘photocells’ are pointing backwards, away from the scene being looked at. The ‘wires’ connecting the photocells to the brain run all over the surface of the retina, so the light rays have to pass through a carpet of massed wires before they hit the photocells. That doesn’t make sense – and it gets even worse. One consequence of the photocells pointing backwards is that the wires that carry their data somehow have to pass through the retina and back to the brain. What they do, in the vertebrate eye, is all converge on a particular hole in the retina, where they dive through it. The hole filled with nerves is called the blind spot, because it is blind, but ‘spot’ is too flattering, for it is quite large, more like a blind patch, which again doesn’t actually inconvenience us much because of the ‘automatic Photoshop’ software in the brain. Once again, send it back, it’s not just bad design, it’s the design of a complete idiot.
Richard Dawkins (The Greatest Show on Earth: The Evidence for Evolution)
The main ones are the symbolists, connectionists, evolutionaries, Bayesians, and analogizers. Each tribe has a set of core beliefs, and a particular problem that it cares most about. It has found a solution to that problem, based on ideas from its allied fields of science, and it has a master algorithm that embodies it. For symbolists, all intelligence can be reduced to manipulating symbols, in the same way that a mathematician solves equations by replacing expressions by other expressions. Symbolists understand that you can’t learn from scratch: you need some initial knowledge to go with the data. They’ve figured out how to incorporate preexisting knowledge into learning, and how to combine different pieces of knowledge on the fly in order to solve new problems. Their master algorithm is inverse deduction, which figures out what knowledge is missing in order to make a deduction go through, and then makes it as general as possible. For connectionists, learning is what the brain does, and so what we need to do is reverse engineer it. The brain learns by adjusting the strengths of connections between neurons, and the crucial problem is figuring out which connections are to blame for which errors and changing them accordingly. The connectionists’ master algorithm is backpropagation, which compares a system’s output with the desired one and then successively changes the connections in layer after layer of neurons so as to bring the output closer to what it should be. Evolutionaries believe that the mother of all learning is natural selection. If it made us, it can make anything, and all we need to do is simulate it on the computer. The key problem that evolutionaries solve is learning structure: not just adjusting parameters, like backpropagation does, but creating the brain that those adjustments can then fine-tune. 
The evolutionaries’ master algorithm is genetic programming, which mates and evolves computer programs in the same way that nature mates and evolves organisms. Bayesians are concerned above all with uncertainty. All learned knowledge is uncertain, and learning itself is a form of uncertain inference. The problem then becomes how to deal with noisy, incomplete, and even contradictory information without falling apart. The solution is probabilistic inference, and the master algorithm is Bayes’ theorem and its derivates. Bayes’ theorem tells us how to incorporate new evidence into our beliefs, and probabilistic inference algorithms do that as efficiently as possible. For analogizers, the key to learning is recognizing similarities between situations and thereby inferring other similarities. If two patients have similar symptoms, perhaps they have the same disease. The key problem is judging how similar two things are. The analogizers’ master algorithm is the support vector machine, which figures out which experiences to remember and how to combine them to make new predictions.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
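Bayes' theorem, the Bayesians' master algorithm in Domingos's taxonomy, is a one-line update rule. A minimal numeric sketch with made-up numbers (the diagnostic figures below are illustrative, not from the book):

```python
def bayes_update(prior, likelihood, evidence):
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Made-up diagnostic example: 1% disease prevalence, a test that is
# positive for 90% of the sick and 10% of the healthy.
# P(E) = 0.9 * 0.01 + 0.1 * 0.99 = 0.108
p_evidence = 0.9 * 0.01 + 0.1 * 0.99
posterior = bayes_update(prior=0.01, likelihood=0.9, evidence=p_evidence)
print(round(posterior, 3))  # -> 0.083
```

Even with a fairly accurate test, the posterior is only about 8%: the theorem shows how new evidence shifts, but does not replace, prior beliefs, which is the sense in which it tells us "how to incorporate new evidence into our beliefs."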
As black-box technologies become more widespread, there have been no shortage of demands for increased transparency. In 2016 the European Union's General Data Protection Regulation included in its stipulations the "right to an explanation," declaring that citizens have a right to know the reason behind the automated decisions that involve them. While no similar measure exists in the United States, the tech industry has become more amenable to paying lip service to "transparency" and "explainability," if only to build consumer trust. Some companies claim they have developed methods that work in reverse to suss out data points that may have triggered the machine's decisions—though these explanations are at best intelligent guesses. (Sam Ritchie, a former software engineer at Stripe, prefers the term "narratives," since the explanations are not a step-by-step breakdown of the algorithm's decision-making process but a hypothesis about reasoning tactics it may have used.) In some cases the explanations come from an entirely different system trained to generate responses that are meant to account convincingly, in semantic terms, for decisions the original machine made, when in truth the two systems are entirely autonomous and unrelated. These misleading explanations end up merely contributing another layer of opacity. "The problem is now exacerbated," writes the critic Kathrin Passig, "because even the existence of a lack of explanation is concealed."
Meghan O'Gieblyn (God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning)
According to this view, free-market capitalism and state-controlled communism aren’t competing ideologies, ethical creeds or political institutions. At bottom, they are competing data-processing systems. Capitalism uses distributed processing, whereas communism relies on centralised processing. Capitalism processes data by directly connecting all producers and consumers to one another, and allowing them to exchange information freely and make decisions independently. For example, how do you determine the price of bread in a free market? Well, every bakery may produce as much bread as it likes, and charge for it as much as it wants. The customers are equally free to buy as much bread as they can afford, or take their business to the competitor. It isn’t illegal to charge $1,000 for a baguette, but nobody is likely to buy it. On a much grander scale, if investors predict increased demand for bread, they will buy shares of biotech firms that genetically engineer more prolific wheat strains. The inflow of capital will enable the firms to speed up their research, thereby providing more wheat faster, and averting bread shortages. Even if one biotech giant adopts a flawed theory and reaches an impasse, its more successful competitors will achieve the hoped-for breakthrough. Free-market capitalism thus distributes the work of analysing data and making decisions between many independent but interconnected processors. As the Austrian economics guru Friedrich Hayek explained, ‘In a system in which the knowledge of the relevant facts is dispersed among many people, prices can act to coordinate the separate actions of different people.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
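The distributed price discovery Harari describes can be caricatured in a few lines of code: a price that rises when bread sells out and falls when loaves go unsold settles at the market-clearing level with no central planner. A toy sketch (my own illustration with a hypothetical linear demand curve, not from the book):

```python
def simulate_market(rounds=200, price=10.0, supply=100):
    """Toy price adjustment: one aggregate 'baker' nudges the price toward
    the level where demand matches supply. Demand is assumed linear."""
    for _ in range(rounds):
        demand = max(0.0, 300 - 40 * price)   # hypothetical demand curve
        excess = demand - supply
        price += 0.001 * excess               # sold out: raise; surplus: cut
    return price

# With demand 300 - 40p against fixed supply 100, the clearing price is 5.
print(round(simulate_market(), 2))
```

From a starting price of 10 the loop converges on the clearing price of 5 with no processor ever seeing the whole demand curve, which is the Hayek point the quote closes on.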
In September 1999, the Department of Justice succeeded in denaturalizing 63 participants in Nazi acts of persecution; and in removing 52 such individuals from this country. This appears to be but a small portion of those who actually were brought here by our own government. A 1999 report to the Senate and the House said "that between 1945 and 1955, 765 scientists, engineers, and technicians were brought to the United States under Overcast, Paperclip, and similar programs. It has been estimated that at least half, and perhaps as many as 80 percent of all the imported specialists were former Nazi Party members." A number of these scientists were recruited to work for the Air Force's School of Aviation Medicine (SAM) at Brooks Air Force Base in Texas, where dozens of human radiation experiments were conducted during the Cold War. Among them were flash-blindness studies in connection with atomic weapons tests and data gathering for total-body irradiation studies conducted in Houston. The experiments for which Nazi investigators were tried included many related to aviation research. Hubertus Strughold, called "the father of space medicine," had a long career at the SAM, including the recruitment of other Paperclip scientists in Germany. On September 24, 1995 the Jewish Telegraphic Agency reported that as head of Nazi Germany's Air Force Institute for Aviation Medicine, Strughold participated in a 1942 conference that discussed "experiments" on human beings. The experiments included subjecting Dachau concentration camp inmates to torture and death. The Edgewood Arsenal of the Army's Chemical Corps as well as other military research sites recruited these scientists with backgrounds in aeromedicine, radiobiology, and ophthalmology. Edgewood Arsenal, Maryland ended up conducting experiments on more than seven thousand American soldiers. Using Auschwitz experiments as a guide, they conducted the same type of poison gas experiments that had been done in the secret I.G. Farben laboratories.
Carol Rutz (A Nation Betrayed: Secret Cold War Experiments Performed on Our Children and Other Innocent People)
The population, who are, ultimately, indifferent to public affairs and even to their own interests, negotiate this indifference with an equally spectral partner and one that is similarly indifferent to its own will: the government [le pouvoir]. This game between zombies may stabilize in the long term. The Year 2000 will not take place in that an era of indifference to time itself - and therefore to the symbolic term of the millennium - will be ushered in by negotiation. Nowadays, you have to go straight from money to money, telegraphically so to speak, by direct transfer (that is the viral side of the matter). A viral revolution, then, more akin to the Glass Bead Game than to the steam engine, and admirably personified in Bernard Tapie's playboy face. For the look of money is reflected in faces. Gone are the hideous old capitalists, the old-style industrial barons wearing the masks of the suffering they have inflicted. Now there are only dashing playboys, sporty and sexual, true knights of industry, wearing the mask of the happiness they spread all around themselves. The world put on a show of despair after 1968. It's been putting on a big show of hope since 1980. No more tears, alright? Reaganite optimism, the pumping up of the dollar. Fabius's glossy new look. Patriotic conviviality. Reluctance prohibited. The old pessimism was produced by the idea that things were getting worse and worse. The new pessimism is produced by the fact that everything is getting better and better. Supercooled euphoria. Controlled anaesthesia. I should like to see the equivalent of Bernard Tapie in the world of business emerge in the world of concepts. Buying up failing concepts, swallowing them up, dusting them off (firing all the deadbeats who are in the way), putting them back into circulation with a dynamic virginity, sending them shooting up on the Stock Exchange and then abandoning them afterwards like dogs. Some people do this very well. It is perhaps better to save tired concepts by maintaining them in a supercooled state like unemployed labour, or locking them away in interactive data banks kept alive on a respirator.
Jean Baudrillard (Cool Memories)
program in which all the pieces work together like a finely tuned machine. So your Web site should look very much like your brochure and direct mail pieces, using the same graphics, headlines, and market data from your core story. As you learned in Chapter Four, I don’t care what kind of product or service you offer, there is information that can be of value to your prospects that can soup up your ability to spread your fame and advance your brand. The information on your Web site will get search engines to send you even more leads. Then once folks come to your Web site because it has information of value to them, you can then go a step further and offer Web seminars and mass teleconferences to teach folks how to be more successful in the area in which they live that intersects with your product or service. This will get you even deeper with your prospects. So think of your Web site as a community where there are benefits to your prospects when they visit.
Chet Holmes (The Ultimate Sales Machine: Turbocharge Your Business with Relentless Focus on 12 Key Strategies)
Eric Spiegel, the head of Siemens’ US arm, laid out a vision not that far removed from Ms Huang’s when he spoke at a breakfast in Washington hosted by the McKinsey Global Institute, the consultancy’s think-tank. The German engineering company, he said, would soon begin delivering spare parts to customers via email and 3D printers, also avoiding physical borders and the usual logistical complexities of global trade. But the advances in business are also coming up against fundamental debates about privacy. The Edward Snowden revelations of US online snooping have sparked a worldwide debate about privacy and the internet. Receiving less attention is the way international trade negotiations are trying to deal with what limits, if any, ought to be set on the flow of data around the globe and how to prepare for a digital future that is already a reality in some sectors. The negotiation of a 12-country Transpacific trade partnership (TPP) has sparked debate in Australia and New Zealand over whether companies ought to be allowed to store personal banking and medical data in foreign countries, or if such sensitive information should even be allowed to cross borders freely.
Anonymous
leap in the ability to process and data. For the sake of simplicity, this book will focus on the recent past to discuss various stages where information technology, norms, practices, and rules combined to allow for data gathering and sharing within an enterprise and with individuals. Framing and noting the various risks and opportunities within various stages in the Information Age creates a context for the ensuing discussion surrounding the mission and purpose of the privacy engineer and the call to action for the privacy engineer’s manifesto, as presented later in this book.
Michelle Finneran Dennedy (The Privacy Engineer's Manifesto: Getting from Policy to Code to QA to Value)
a digital design engineer, you would spend long hours going through the TTL Data Book familiarizing yourself with the types of TTL chips that were available. Once you knew all your tools, you could actually build the computer I showed in Chapter 17 out of TTL chips. Wiring the chips together is a lot easier than wiring individual transistors
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
The adjective “efficient” in “efficient markets” refers to how investors use information. In an efficient market, every titbit of new information is processed correctly and immediately by investors. As a result, market prices react instantly and appropriately to any relevant news about the asset in question, whether it is a share of stock, a corporate bond, a derivative, or some other vehicle. As the saying goes, there are no $100 bills left on the proverbial sidewalk for latecomers to pick up, because asset prices move up or down immediately. To profit from news, you must be jackrabbit fast; otherwise, you’ll be too late. This is one rationale for the oft-cited aphorism “You can’t beat the market.” An even stronger form of efficiency holds that market prices do not react to irrelevant news. If this were so, prices would ignore will-o’-the-wisps, unfounded rumors, the madness of crowds, and other extraneous factors—focusing at every moment on the fundamentals. In that case, prices would never deviate from fundamental values; that is, market prices would always be “right.” Under that exaggerated form of market efficiency, which critics sometimes deride as “free-market fundamentalism,” there would never be asset-price bubbles. Almost no one takes the strong form of the efficient markets hypothesis (EMH) as the literal truth, just as no physicist accepts Newtonian mechanics as 100 percent accurate. But, to extend the analogy, Newtonian physics often provides excellent approximations of reality. Similarly, economists argue over how good an approximation the EMH is in particular applications. For example, the EMH fits data on widely traded stocks rather well. But thinly traded or poorly understood securities are another matter entirely. Case in point: Theoretical valuation models based on EMH-type reasoning were used by Wall Street financial engineers to devise and price all sorts of exotic derivatives. History records that some of these calculations proved wide of the mark.
Alan S. Blinder (After the Music Stopped: The Financial Crisis, the Response, and the Work Ahead)
As an author writing about software engineering, I am committed to providing the best grounding for any factual claims I make or support. To that end I will:
- only cite papers that I have in fact personally read
- refrain from indirect quotation (or other ‘telephone game’ variants)
- make it clear whenever I’m citing opinion or indirect quotation, as opposed to original research
- cite page and section numbers when available, and always when citing books
- whenever possible, cite papers freely available online in full text versions
- refrain from citing obscure or non peer-reviewed sources
- check that the data I’m citing actually supports the claim
- look for contradictory evidence as well as supporting, to avoid confirmation bias
- only make prudent claims, and present all plausible threats to validity.
Anonymous
The endgame is this: Without absence in our lives, we risk fooling ourselves into believing that things (a message from a lover, the performance of a song, the face of a human body) matter less. De Beers hoards its diamonds to invent a scarcity that equals preciousness. Perhaps we now need to engineer scarcity in our communications, in our interactions, and in the things we consume. Otherwise our lives become like a Morse code transmission that’s lacking breaks—a swarm of noise blanketing the valuable data beneath.
Michael Harris (The End of Absence: Reclaiming What We've Lost in a World of Constant Connection)
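Harris's Morse-code metaphor is literal: stripped of the breaks between letters, a dot-dash stream no longer has a unique reading. A small sketch of the ambiguity (illustrative code, not from the book):

```python
# Morse code for a few letters (dots and dashes only).
MORSE = {"E": ".", "T": "-", "I": "..", "A": ".-", "N": "-.", "M": "--",
         "S": "...", "O": "---"}

def decodings(signal):
    """Count the distinct ways a break-less dot/dash stream can be read."""
    if not signal:
        return 1
    total = 0
    for letter, code in MORSE.items():
        if signal.startswith(code):
            total += decodings(signal[len(code):])
    return total

# "SOS" with letter breaks is unambiguous; run together, it is one of
# many possible readings ("EEETTTEEE", "SMS", ...): the breaks carry data.
print(decodings("...---..."))
```

Without the separators, the valuable signal drowns in its own alternative parses, which is exactly the swarm of noise the quote warns about.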
Pratt & Whitney, the aerospace manufacturer, now can predict with 97% accuracy when an aircraft engine will need to have maintenance, conceivably helping it run its operations much more efficiently, says Anjul Bhambhri, VP of Big Data at IBM.
Anonymous
The companies that didn’t suffer, including Netflix, knew how to design for reliability; they understood resilience, spreading data across zones, and a whole lot of reliability engineering.
Mike Loukides (What is DevOps?)
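The zone-spreading Loukides mentions can be illustrated with a toy replicated store: every write goes to all zones, and reads fall through to any surviving replica, so losing a whole zone loses no data. This is a hypothetical sketch of the idea, not Netflix's actual design (zone names are illustrative):

```python
class ZoneStore:
    """Toy key-value store that spreads data across availability zones."""

    def __init__(self, zones=("us-east-1a", "us-east-1b", "us-east-1c")):
        self.replicas = {z: {} for z in zones}  # one copy of the data per zone
        self.down = set()                       # zones currently unreachable

    def put(self, key, value):
        # Replicate every write to every zone.
        for replica in self.replicas.values():
            replica[key] = value

    def get(self, key):
        # Serve the read from any zone that is still up.
        for zone, replica in self.replicas.items():
            if zone not in self.down and key in replica:
                return replica[key]
        raise KeyError(key)

store = ZoneStore()
store.put("user:42", {"plan": "premium"})
store.down.add("us-east-1a")        # a whole zone goes dark
print(store.get("user:42"))         # still served from a surviving zone
```

Real systems use quorums and partial replication rather than copying everything everywhere, but the reliability-engineering idea is the same: no single zone failure should make data unreachable.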
Dimensional models implemented in relational database management systems are referred to as star schemas because of their resemblance to a star-like structure. Dimensional models implemented in multidimensional database environments are referred to as online analytical processing (OLAP) cubes, as illustrated in Figure 1.1 (star schema versus OLAP cube). If your DW/BI environment includes either star schemas or OLAP cubes, it leverages dimensional concepts. Both stars and cubes have a common logical design with recognizable dimensions; however, the physical implementation differs. When data is loaded into an OLAP cube, it is stored and indexed using formats and techniques that are designed for dimensional data. Performance aggregations or precalculated summary tables are often created and managed by the OLAP cube engine. Consequently, cubes deliver superior query performance because of the precalculations, indexing strategies, and other optimizations. Business users can drill down or up by adding or removing attributes from their analyses with excellent performance without issuing new queries. OLAP cubes also provide more analytically robust functions that exceed those available with SQL. The downside is that you pay a load performance price for these capabilities, especially with large data sets.
Ralph Kimball (The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling)
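The precalculated summary tables Kimball describes can be mimicked in plain SQL: pay the aggregation cost once at load time, then answer rollup queries from the summary without rescanning the detail-grain fact table. A minimal sqlite3 sketch (schema and names are my own, purely illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Detail-grain star schema: one fact row per sales event.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, sale_date TEXT, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Bread'), (2, 'Pastry');
    INSERT INTO fact_sales VALUES
        (1, '2024-01-01', 9.0), (1, '2024-01-02', 6.0), (2, '2024-01-01', 4.0);

    -- What a cube engine manages behind the scenes: a precalculated
    -- summary table, paid for at load time, queried cheaply afterwards.
    CREATE TABLE agg_sales_by_category AS
        SELECT p.category, SUM(f.amount) AS total
        FROM fact_sales f JOIN dim_product p USING (product_key)
        GROUP BY p.category;
""")

total = con.execute(
    "SELECT total FROM agg_sales_by_category WHERE category = 'Bread'"
).fetchone()[0]
print(total)   # 15.0
```

The trade-off in the quote is visible even at this scale: the summary makes the query a single-row lookup, but every data load must now also rebuild or maintain the aggregate.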
Each business process is represented by a dimensional model that consists of a fact table containing the event's numeric measurements surrounded by a halo of dimension tables that contain the textual context that was true at the moment the event occurred. This characteristic star-like structure is often called a star join, a term dating back to the earliest days of relational databases (Figure 1.5 shows the fact and dimension tables in a dimensional model). The first thing to notice about the dimensional schema is its simplicity and symmetry. Obviously, business users benefit from the simplicity because the data is easier to understand and navigate. The charm of the design in Figure 1.5 is that it is highly recognizable to business users. We have observed literally hundreds of instances in which users immediately agree that the dimensional model is their business. Furthermore, the reduced number of tables and use of meaningful business descriptors make it easy to navigate and less likely that mistakes will occur. The simplicity of a dimensional model also has performance benefits. Database optimizers process these simple schemas with fewer joins more efficiently. A database engine can make strong assumptions about first constraining the heavily indexed dimension tables, and then attacking the fact table all at once with the Cartesian product of the dimension table keys satisfying the user's constraints. Amazingly, using this approach, the optimizer can evaluate arbitrary n-way joins to a fact table in a single pass through the fact table's index. Finally, dimensional models are gracefully extensible to accommodate change. The predictable framework of a dimensional model withstands unexpected changes in user behavior. Every dimension is equivalent; all dimensions are symmetrically equal entry points into the fact table. The dimensional model has no built-in bias regarding expected query patterns. There are no preferences for the business questions asked this month versus the questions asked next month. You certainly don't want to adjust schemas if business users suggest new ways to analyze their business.
Ralph Kimball (The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling)
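Kimball's point that every dimension is an equal entry point, and that drilling down is just adding attributes to the same star join, can be sketched with sqlite3 (hypothetical schema and data, not from the book):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_store (store_key INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (store_key INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_store VALUES (1, 'Paris'), (2, 'Lyon');
    INSERT INTO dim_product VALUES (1, 'Baguette'), (2, 'Rye');
    INSERT INTO fact_sales VALUES (1, 1, 9.0), (1, 2, 4.0), (2, 1, 6.0);
""")

def sales_by(*dims):
    """One star join serves every question: drilling down or up is just
    adding or removing dimension attributes from the same query."""
    select = ", ".join(dims) if dims else "'total'"
    group = ("GROUP BY " + ", ".join(dims)) if dims else ""
    return con.execute(f"""
        SELECT {select}, SUM(amount)
        FROM fact_sales f
        JOIN dim_store s USING (store_key)
        JOIN dim_product p USING (product_key)
        {group}
    """).fetchall()

print(sales_by())                # grand total across all dimensions
print(sales_by("city"))          # roll up by one dimension
print(sales_by("city", "name"))  # drill down one more level
```

The query shape never changes as questions change, which is the "no built-in bias regarding expected query patterns" property the quote claims for dimensional schemas.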
Figure 1 schematically shows how in-vehicle networking will be conceived. In this conception, CAN and the other communication protocols developed concurrently made it possible for multiple LANs to exchange data efficiently via a gateway. [Figure 1, "Conception of In-vehicle Networking," depicts body, safety, engine/powertrain, chassis, and information sub-networks (CAN at 125 to 500 kbps, LIN at 2.4 to 19.2 kbps, Safe-by-Wire at 150 kbps, FlexRay at 5 Mbps, plus MOST and 1394 for the information system) linked through a gateway, with a failure diagnostic CAN serving the diagnostic tool. ISO stands for International Organization for Standardization; FlexRay is a registered trademark of DaimlerChrysler AG.]
Anonymous
Tracked Vehicles "Each war proves anew to those who may have had their doubts, the primacy of the main battle tank. Between wars, the tank is always a target for cuts. But in wartime, everyone remembers why we need it, in its most advanced, upgraded versions and in militarily significant numbers." - IDF Brigadier General Yahuda Admon (retired) Since their first appearance in the latter part of World War I, tanks have increasingly dominated military thinking. Armies became progressively more mechanised during World War II, with many infantry being carried in armoured carriers by the end of the war. The armoured personnel carrier (APC) evolved into the infantry fighting vehicle (IFV), which is able to support the infantry as well as simply transport them. Modern IFVs have a similar level of battlefield mobility to the tanks, allowing tanks and infantry to operate together and provide mutual support. Abrams Mission Provide heavy armour superiority on the battlefield. Entered Army Service 1980 Description and Specifications The Abrams tank closes with and destroys enemy forces on the integrated battlefield using mobility, firepower, and shock effect. There are three variants in service: M1A1, M1A2 and M1A2 SEP. The 120mm main gun, combined with the powerful 1,500 HP turbine engine and special armour, make the Abrams tank particularly suitable for attacking or defending against large concentrations of heavy armour forces on a highly lethal battlefield. Features of the M1A1 modernisation program include increased armour protection; suspension improvements; and an improved nuclear, biological and chemical (NBC) protection system that increases survivability in a contaminated environment. The M1A1D modification consists of an M1A1 with integrated computer and a far-target-designation capability. The M1A2 modernisation program includes a commander's independent thermal viewer, an improved commander's weapon station, position navigation equipment, a distributed data and power architecture, an embedded diagnostic system and improved fire control systems.
Russell Phillips (This We'll Defend: The Weapons & Equipment of the US Army)
M113 Family of Vehicles Mission Provide a highly mobile, survivable, and reliable tracked-vehicle platform that is able to keep pace with Abrams- and Bradley-equipped units and that is adaptable to a wide range of current and future battlefield tasks through the integration of specialised mission modules at minimum operational and support cost. Entered Army Service 1960 Description and Specifications After more than four decades, the M113 family of vehicles (FOV) is still in service in the U.S. Army (and in many foreign armies). The original M113 Armoured Personnel Carrier (APC) helped to revolutionise mobile military operations. These vehicles carried 11 soldiers plus a driver and track commander under armour protection across hostile battlefield environments. More importantly, these vehicles were air transportable, air-droppable, and swimmable, allowing planners to incorporate APCs in a much wider range of combat situations, including many "rapid deployment" scenarios. The M113s were so successful that they were quickly identified as the foundation for a family of vehicles. Early derivatives included both command post (M577) and mortar carrier (M106) configurations. Over the years, the M113 FOV has undergone numerous upgrades. In 1964, the M113A1 package replaced the original gasoline engine with a 212 horsepower diesel package, significantly improving survivability by eliminating the possibility of catastrophic loss from fuel tank explosions. Several new derivatives were produced, some based on the armoured M113 chassis (e.g., the M125A1 mortar carrier and M741 "Vulcan" air defence vehicle) and some based on the unarmoured version of the chassis (e.g., the M548 cargo carrier, M667 "Lance" missile carrier, and M730 "Chaparral" missile carrier). In 1979, the A2 package of suspension and cooling enhancements was introduced. Today's M113 fleet includes a mix of these A2 variants, together with other derivatives equipped with the most recent A3 RISE (Reliability Improvements for Selected Equipment) package. The standard RISE package includes an upgraded propulsion system (turbocharged engine and new transmission), greatly improved driver controls (new power brakes and conventional steering controls), external fuel tanks, and 200-amp alternator with four batteries. Additional A3 improvements include incorporation of spall liners and provisions for mounting external armour. The future M113A3 fleet will include a number of vehicles that will have high speed digital networks and data transfer systems. The M113A3 digitisation program includes applying hardware, software, and installation kits and hosting them in the M113 FOV. Current variants: Mechanised Smoke Obscurant System M548A1/A3 Cargo Carrier M577A2/A3 Command Post Carrier M901A1 Improved TOW Vehicle M981 Fire Support Team Vehicle M1059/A3 Smoke Generator Carrier M1064/A3 Mortar Carrier M1068/A3 Standard Integrated Command Post System Carrier OPFOR Surrogate Vehicle (OSV) Manufacturer Anniston Army Depot (Anniston, AL) United Defense, L.P. (Anniston, AL)
Russell Phillips (This We'll Defend: The Weapons & Equipment of the US Army)