Computer Engineering Quotes

We've searched our database for all the quotes and captions related to Computer Engineering. Here they are! All 100 of them:

Indeed, the ratio of time spent reading versus writing is well over 10 to 1. We are constantly reading old code as part of the effort to write new code. ...[Therefore,] making it easy to read makes it easier to write.
Robert C. Martin (Clean Code: A Handbook of Agile Software Craftsmanship)
An algorithm must be seen to be believed.
Donald Ervin Knuth (Leaders in Computing: Changing the digital world)
But the main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources. Only in storybooks do inventions come like a thunderbolt, or a lightbulb popping out of the head of a lone individual in a basement or garret or garage.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Algorithms are not arbiters of objective truth and fairness simply because they're math.
Zoe Quinn (Crash Override: How Gamergate (Nearly) Destroyed My Life, and How We Can Win the Fight Against Online Hate)
On the first day of college you will worry about how you will do inside the college, and on the last day of college you will wonder what you will do outside the college.
Amit Kalantri
Software testing is a sport like hunting; it's bug hunting.
Amit Kalantri
If I were king, I would redress an abuse which cuts back, as it were, one half of human kind. I would have women participate in all human rights, especially those of the mind.
Émilie du Châtelet (Selected Philosophical and Scientific Writings (The Other Voice in Early Modern Europe))
Schwartz said that several of the early computer engineers relied on LSD in designing circuit chips, especially in the years before they could be designed on computers. “You had to be able to visualize a staggering complexity in three dimensions, hold it all in your head. They found that LSD could help.
Michael Pollan (How to Change Your Mind: What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression, and Transcendence)
The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths. Its province is to assist us to making available what we are already acquainted with. [Describing Charles Babbage's machine.]
Ada Lovelace
In the 1930s, with no computers to precisely calculate tolerances of construction materials, cautious engineers simply heaped on excess mass and redundancy. “We’re living off the overcapacity of our forefathers.
Alan Weisman (The World Without Us)
Curran lunged at a silver Bentley. The hood went flying. He thrust his hand into the car. Metal screamed, and Curran jerked a twisted clump out of the hood and smashed it into the nearest car like a club. “Did he just rip out the engine?” I asked. “Yes,” Saiman said. “And now he’s demolishing the Maserati with it.” Ten seconds later Curran hurled the twisted wreck of black and orange that used to be the Maserati into the wall. The first melodic notes of an old song came from the computer. I glanced at Saiman. He shrugged. “It begged for a soundtrack.
Ilona Andrews (Magic Slays (Kate Daniels, #5))
Why give a robot an order to obey orders—why aren't the original orders enough? Why command a robot not to do harm—wouldn't it be easier never to command it to do harm in the first place? Does the universe contain a mysterious force pulling entities toward malevolence, so that a positronic brain must be programmed to withstand it? Do intelligent beings inevitably develop an attitude problem? (…) Now that computers really have become smarter and more powerful, the anxiety has waned. Today's ubiquitous, networked computers have an unprecedented ability to do mischief should they ever go to the bad. But the only mayhem comes from unpredictable chaos or from human malice in the form of viruses. We no longer worry about electronic serial killers or subversive silicon cabals because we are beginning to appreciate that malevolence—like vision, motor coordination, and common sense—does not come free with computation but has to be programmed in. (…) Aggression, like every other part of human behavior we take for granted, is a challenging engineering problem!
Steven Pinker (How the Mind Works)
People know, or dimly feel, that if thinking is not kept pure and keen, and if respect for the world of mind is no longer operative, ships and automobiles will soon cease to run right, the engineer's slide rule and the computations of banks and stock exchanges will forfeit validity and authority, and chaos will ensue.
Hermann Hesse (The Glass Bead Game)
Memory has always been social. Now we’re using search engines and computers to augment our memories, too.
Clive Thompson
The business we're in is more sociological than technological, more dependent on workers' abilities to communicate with each other than their abilities to communicate with machines.
Tom DeMarco (Peopleware: Productive Projects and Teams)
Sometimes I would worry about my internet habits and force myself away from the computer, to read a magazine or book. Contemporary literature offered no respite: I would find the prose cluttered with data points, tenuous historical connections, detail so finely tuned it could have only been extracted from a feverish night of search-engine queries. Aphorisms were in; authors were wired. I would pick up books that had been heavily documented on social media, only to find that the books themselves had a curatorial affect: beautiful descriptions of little substance, arranged in elegant vignettes—gestural text, the equivalent of a rumpled linen bedsheet or a bunch of dahlias placed just so. Oh, I would think, turning the page. This author is addicted to the internet, too.
Anna Wiener (Uncanny Valley)
But the Turing test cuts both ways. You can't tell if a machine has gotten smarter or if you've just lowered your own standards of intelligence to such a degree that the machine seems smart. If you can have a conversation with a simulated person presented by an AI program, can you tell how far you've let your sense of personhood degrade in order to make the illusion work for you? People degrade themselves in order to make machines seem smart all the time. Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species' bottomless ability to lower our standards to make information technology look good. Every instance of intelligence in a machine is ambiguous. The same ambiguity that motivated dubious academic AI projects in the past has been repackaged as mass culture today. Did that search engine really know what you want, or are you playing along, lowering your standards to make it seem clever? While it's to be expected that the human perspective will be changed by encounters with profound new technologies, the exercise of treating machine intelligence as real requires people to reduce their mooring to reality.
Jaron Lanier (You Are Not a Gadget)
A YOUNG COMPUTER ENGINEER, known to be one of the most skillful in Westborough’s basement, said he had a fantasy about a better job than his. In it, he goes to work as a janitor for a computer company whose designs leave much to be desired. There, at night, disguised by mop and broom, he sneaks into the offices of the company’s engineers and corrects the designs on their blackboards and desks.
Tracy Kidder (The Soul of A New Machine)
...artificial intelligence will become a major human rights issue in the twenty-first century.
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
Above all, Rasala wanted around him engineers who took an interest in the entire computer, not just in the parts that they had designed.
Tracy Kidder (The Soul of a New Machine)
Every night I went into Hannah's room and sat with her stuff. The thing I couldn't get was how her clothes and her books and her drawings were still there, but she wasn't. It just didn't compute. Her room was like a car without an engine, everything where it should be, except all it was was potential. None of it was going to get used again.
J.R. Ward (Lover Unbound (Black Dagger Brotherhood, #5))
My guess is (it will be) about 300 years until computers are as good as, say, your local reference library in search.
Craig Silverstein
Search engines find the information, not necessarily the truth.
Amit Kalantri
Let me tell you as a brain scientist and a computer engineering dropout - transhumanism is to brain computer interface, what nuclear weapons are to nuclear physics.
Abhijit Naskar (Amantes Assemble: 100 Sonnets of Servant Sultans)
Now comes the second machine age. Computers and other digital advances are doing for mental power—the ability to use our brains to understand and shape our environments—what the steam engine and its descendants did for muscle power.
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
All these beefy Caucasians with guns. Get enough of them together, looking for the America they always believed they'd grow up in, and they glom together like overcooked rice, form integral, starchy little units. With their power tools, portable generators, weapons, four-wheel-drive vehicles, and personal computers, they are like beavers hyped up on crystal meth, manic engineers without a blueprint, chewing through the wilderness, building things and abandoning them, altering the flow of mighty rivers and then moving on because the place ain't what it used to be. The byproduct of the lifestyle is polluted rivers, greenhouse effect, spouse abuse, televangelists, and serial killers. But as long as you have that four-wheel-drive vehicle and can keep driving north, you can sustain it, keep moving just quickly enough to stay one step ahead of your own waste stream. In twenty years, ten million white people will converge on the north pole and park their bagos there. The low-grade waste heat of their thermodynamically intense lifestyle will turn the crystalline icescape pliable and treacherous. It will melt a hole through the polar icecap, and all that metal will sink to the bottom, sucking the biomass down with it.
Neal Stephenson (Snow Crash)
You may know how to operate computers. You may know a lot about aliens or robots. You may be a doctor, lawyer, engineer, teacher, specialist… BUT if you don’t know how you operate, why your life is the way it is and how to increase fulfillment, love and peace in your life then all the knowledge and degrees aren’t much worth having!
Maddy Malhotra (How to Build Self-Esteem and Be Confident: Overcome Fears, Break Habits, Be Successful and Happy)
Real arms races are run by highly intelligent, bespectacled engineers in glass offices thoughtfully designing shiny weapons on modern computers. But there's no thinking in the mud and cold of nature's trenches. At best, weapons thrown together amidst the explosions and confusion of smoky battlefields are tiny variations on old ones, held together by chewing gum. If they don't work, then something else is thrown at the enemy, including the kitchen sink - there's nothing "progressive" about that. At its usual worst, trench warfare is fought by attrition. If the enemy can be stopped or slowed by burning your own bridges and bombing your own radio towers and oil refineries, then away they go. Darwinian trench warfare does not lead to progress - it leads back to the Stone Age.
Michael J. Behe (The Edge of Evolution: The Search for the Limits of Darwinism)
There are no inherent barriers to our being able to reverse engineer the operating principles of human intelligence and replicate these capabilities in the more powerful computational substrates that will become available in the decades ahead. The human brain is a complex hierarchy of complex systems, but it does not represent a level of complexity beyond what we are already capable of handling.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
And yet the city is not dead: the machines, the engines, the turbines continue to hum and vibrate, every Wheel's cogs are caught in the cogs of other wheels, trains run on tracks and signals on wires; and no human is there any longer to send or receive, to charge or discharge. The machines, which have long known they could do without men, have finally driven them out; and after a long exile, the wild animals have come back to occupy the territory wrested from the forest: foxes and martens wave their soft tails over the control panels starred with manometers and levers and gauges and diagrams; badgers and dormice luxuriate on batteries and magnetos. Man was necessary; now he is useless. For the world to receive information from the world and enjoy it, now computers and butterflies suffice.
Italo Calvino (The Castle of Crossed Destinies)
My wife, Rohini, visits a lot of government schools as part of her NGO reach-out. One of the questions she most likes to ask the kids is what they would like to be when they grow up. The answers are varied--'engineer,' 'teacher,' 'policeman' and, increasingly, 'computer' [sic]. But even in the rural schools, one aspiration that they never express is 'farmer
Nandan Nilekani
Much of the engineering of computers takes place in silence, while engineers pace in hallways or sit alone and gaze at blank pages.
Tracy Kidder (The Soul of a New Machine)
Without requirements and design, programming is the art of adding bugs to an empty text file.
Louis Srygley
She tilts the computer screen toward Drew "A boy," she says. "Luke." .... "Luke," Drew repeats. "Bible or Star Wars?" "Star Wars," Vanessa says, thinking of Teri's engineer husband.
Meg Donohue (All the Summer Girls)
Apple Computers is a famous example: it was founded by (mostly Republican) computer engineers who broke from IBM in Silicon Valley in the 1980s, forming little democratic circles of twenty to forty people with their laptops in each other's garages.
David Graeber (Debt: The First 5,000 Years)
A digital computer is essentially a huge army of clerks, equipped with rule books, pencil and paper, all stupid and entirely without initiative, but able to follow millions of precisely defined operations. The difficulty lies in handing over the rule book.
Christopher W. Alexander
Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution)
Thanks to the very best in Microsoft/Intel engineering, it crashes every time you exhale too hard in its general vicinity. --Fanboy on his computer
Barry Lyga (The Astonishing Adventures of Fanboy and Goth Girl (The Astonishing Adventures of Fanboy and Goth Girl, #1))
People are always saying these things about how there's no need to read literature anymore-that it won't help the world. Everyone should apparently learn to speak Mandarin, and learn how to write code for computers. More young people should go into STEM fields: science, technology, engineering, and math. And that all sounds to be true and reasonable. But you can't say that what you learn in English class doesn't matter. That great writing doesn't make a difference. I'm different. It's hard to put into words, but it's true. Words matter.
Meg Wolitzer
The mind has greater power than a computer. The heart has greater power than an engine. The soul has greater power than a reactor. The tongue has greater power than a sword. The eye has greater power than a camera. The ear has greater power than a recorder. The feet are a greater invention than the car. The hands are a greater invention than the carriage. The nose is a greater invention than the vacuum. The mouth is a greater invention than the megaphone. The stomach is a greater invention than the refrigerator. The skin is a greater invention than clothes.
Matshona Dhliwayo
The third way to change the laws of life is to engineer completely inorganic beings. The most obvious examples are computer programs and computer viruses that can undergo independent evolution.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
The value system at Intel is completely the reverse. The Ph.D. in computer science who knows an answer in the abstract, yet does not apply it to create some tangible output, gets little recognition, but a junior engineer who produces results is highly valued and esteemed. And that is how it should be.
Andrew S. Grove (High Output Management)
On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite -- just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consist of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.
Theodore John Kaczynski
Engineers want to produce something,” said Wallach. “I didn’t go to school for six years just to get a paycheck. I thought that if this is what engineering’s all about, the hell with it.” He went to night school, to get a master’s in business administration. “I was always looking for the buck. I’d get the M.B.A., go back to New York, and make some money,” he figured. But he didn’t really want to do that. He wanted to build computers.
Tracy Kidder (The Soul of A New Machine)
Edwin Land of Polaroid talked about the intersection of the humanities and science. I like that intersection. There's something magical about that place. There are a lot of people innovating, and that's not the main distinction of my career. The reason Apple resonates with people is that there's a deep current of humanity in our innovation. I think great artists and great engineers are similar in that they both have a desire to express themselves. In fact some of the best people working on the original Mac were poets and musicians on the side. In the seventies computers became a way for people to express their creativity. Great artists like Leonardo da Vinci and Michelangelo were also great at science. Michelangelo knew a lot about how to quarry stone, not just how to be a sculptor.
Walter Isaacson (Steve Jobs)
They're trying to breed a nation of techno-peasants. Educated just enough to keep things going, but not enough to ask tough questions. They encourage any meme that downplays thoughtful analysis or encourages docility or self indulgence or uniformity. In what other society do people use "smart" and "wise" as insults? We tell people "don't get smart." Those who try, those who really like to learn, we call "nerds." Look at television or the press or the trivia that passes for political debate. When a candidate DOES try to talk about the issues, the newspapers talk about his sex life. Look at Saturday morning cartoon shows. Peasants, whether they're tilling fields or stuffing circuit boards, are easier to manipulate. Don't question; just believe. Turn off your computer and Trust the Force. Or turn your computer on and treat it like the Oracle of Delphi. That's right. They've made education superficial and specialized. Science classes for art majors? Forget it! And how many business or engineering students get a really good grounding in the humanities? When did universities become little more than white collar vocational schools?
Michael Flynn (In the Country of the Blind)
For her first summer vacation, my sister went to California with a couple of friends on a package tour put together by her agency. One of the members of the tour group was a computer engineer a year her senior, and she started dating him when they came back to Japan. This kind of thing happens all the time, but it's not for me. First of all, I hate package tours, and the thought of getting serious about somebody you meet in a group like that makes me sick.
Haruki Murakami (The Elephant Vanishes)
Still another factor is compatibility with vested interests. This book, like probably every other typed document you have ever read, was typed with a QWERTY keyboard, named for the left-most six letters in its upper row. Unbelievable as it may now sound, that keyboard layout was designed in 1873 as a feat of anti-engineering. It employs a whole series of perverse tricks designed to force typists to type as slowly as possible, such as scattering the commonest letters over all keyboard rows and concentrating them on the left side (where right-handed people have to use their weaker hand). The reason behind all of those seemingly counterproductive features is that the typewriters of 1873 jammed if adjacent keys were struck in quick succession, so that manufacturers had to slow down typists. When improvements in typewriters eliminated the problem of jamming, trials in 1932 with an efficiently laid-out keyboard showed that it would let us double our typing speed and reduce our typing effort by 95 percent. But QWERTY keyboards were solidly entrenched by then. The vested interests of hundreds of millions of QWERTY typists, typing teachers, typewriter and computer salespeople, and manufacturers have crushed all moves toward keyboard efficiency for over 60 years.
Jared Diamond (Guns, Germs, and Steel: The Fates of Human Societies (20th Anniversary Edition))
We cannot compound the ideas of others into a singular meaning for ourselves unless we’re given a private mental workshop in which to hammer at them. (Will I ever be able to write my book, I worry, if I can’t build such a workshop for myself?) Without daydreams our minds are only parrots—or, worse, computers. Daydreams are the engineers of new worlds.
Michael Harris (Solitude: In Pursuit of a Singular Life in a Crowded World)
It is no longer just engineers who dominate our technology leadership, because it is no longer the case that computers are so mysterious that only engineers can understand what they are capable of. There is an industry-wide shift toward more "product thinking" in leadership--leaders who understand the social and cultural contexts in which our technologies are deployed. Products must appeal to human beings, and a rigorously cultivated humanistic sensibility is a valued asset for this challenge. That is perhaps why a technology leader of the highest status--Steve Jobs--recently credited an appreciation for the liberal arts as key to his company's tremendous success with their various i-gadgets.
Damon Horowitz
Adopting a remote, managerial point of view, you could say that the Eagle project was a case where a local system of management worked as it should: competition for resources creating within a team inside a company an entrepreneurial spirit, which was channeled in the right direction by constraints sent down from the top. But it seems more accurate to say that a group of engineers got excited about building a computer.
Tracy Kidder (The Soul of a New Machine)
Impact minus twenty seconds, guys …” said the computer. “Then turn the bloody engines back on!” bawled Zaphod. “Oh, sure thing, guys,” said the computer. With a subtle roar the engines cut back in, the ship smoothly flattened out of its dive and headed back toward the missiles again.
Douglas Adams (The Hitchhiker's Guide to the Galaxy (Hitchhiker's Guide, #1))
Nerds are used to transparency. They add value by becoming expert at a technical skill like computer programming. In engineering disciplines, a solution either works or it fails. You can evaluate someone else's work with relative ease, as surface appearances don't matter much. Sales is the opposite: an orchestrated campaign to change surface appearances without changing the underlying reality. This strikes engineers as trivial if not fundamentally dishonest. They know their own jobs are hard, so when they look at salespeople laughing on the phone with a customer or going to two-hour lunches, they suspect that no real work is being done. If anything, people overestimate the relative difficulty of science and engineering, because the challenges of those fields are obvious. What nerds miss is that it takes hard work to make sales look easy.
Peter Thiel (Zero to One: Notes on Startups, or How to Build the Future)
Our brains replay every painful memory from the past and every possible scary scenario from the future over and over, just like a complex computer simulation, in an attempt to scare us away from threats before they can happen and regardless of the probability of their happening at all.
Mo Gawdat (Solve For Happy: Engineer Your Path to Joy)
the invention of deep learning means that we are moving from the age of expertise to the age of data. Training successful deep-learning algorithms requires computing power, technical talent, and lots of data. But of those three, it is the volume of data that will be the most important going forward. That’s because once technical talent reaches a certain threshold, it begins to show diminishing returns. Beyond that point, data makes all the difference. Algorithms tuned by an average engineer can outperform those built by the world’s leading experts if the average engineer has access to far more data.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
Today, most people in industrial societies don’t need to know much about the natural world in order to survive. What do you really need to know in order to get by as a computer engineer, an insurance agent, a history teacher or a factory worker? You need to know a lot about your own tiny field of expertise, but for the vast majority of life’s necessities you rely blindly on the help of other experts, whose own knowledge is also limited to a tiny field of expertise. The human collective knows far more today than did the ancient bands. But at the individual level, ancient foragers were the most knowledgeable and skilful people in history.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
The Universe is a quantum computer, and over time, it is simply more likely that structure comes out of it than noise. That means rules, patterns. That means a game. But spend long enough poking at it, and you start to see the game engine, the labyrinth of the quantum circuit, wires looping around each other, forwards and backwards.
Hannu Rajaniemi (The Causal Angel (Jean le Flambeur #3))
Women, on the other hand, had to wield their intellects like a scythe, hacking away against the stubborn underbrush of low expectations. A woman who worked in the central computing pools was one step removed from the research, and the engineers’ assignments sometimes lacked the context to give the computer much knowledge about the afterlife of the numbers that bedeviled her days. She might spend weeks calculating a pressure distribution without knowing what kind of plane was being tested or whether the analysis that depended on her math had resulted in significant conclusions. The work of most of the women, like that of the Friden, Marchant, or Monroe computing machines they used, was anonymous. Even a woman who had worked closely with an engineer on the content of a research report was rarely rewarded by seeing her name alongside his on the final publication. Why would the computers have the same desire for recognition that they did? many engineers figured. They were women, after all.
Margot Lee Shetterly (Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race)
Today I am more convinced than ever. Conceptual integrity is central to product quality. Having a system architect is the most important single step toward conceptual integrity. These principles are by no means limited to software systems, but to the design of any complex construct, whether a computer, an airplane, a Strategic Defense Initiative, a Global Positioning System. After teaching a software engineering laboratory more than 20 times, I came to insist that student teams as small as four people choose a manager and a separate architect. Defining distinct roles in such small teams may be a little extreme, but I have observed it to work well and to contribute to design success even for small teams.
Frederick P. Brooks Jr. (The Mythical Man-Month: Essays on Software Engineering)
Tinkerers built America. Benjamin Franklin, Thomas Edison, Henry Ford, all were tinkerers in their childhood. Everything from the airplane to the computer started in somebody's garage. Go back even further: the Industrial Revolution was a revolution of tinkerers. The great scientific thinkers of eighteenth-century England couldn't have been less interested in cotton spinning and weaving. Why would you be? It was left to a bloke on the shop floor who happened to glance at a one-thread wheel that had toppled over and noticed that both the wheel and the spindle were still turning. So James Hargreaves invented the spinning jenny, and there followed other artful gins and mules and frames and looms, and Britain and the world were transformed. By tinkerers rather than thinkerers. "Technological change came from tinkerers," wrote Professor J.R. McNeill of Georgetown, "people with little or no scientific education but with plenty of hands-on experience." John Ratzenberger likes to paraphrase a Stanford University study: "Engineers who are great in physics and calculus but can't think in new ways about old objects are doomed to think in old ways about new objects." That's the lesson of the spinning jenny: an old object fell over and someone looked at it in a new way.
Mark Steyn (After America: Get Ready for Armageddon)
(talking about when he tells his wife he’s going out to buy an envelope) Oh, she says well, you’re not a poor man. You know, why don’t you go online and buy a hundred envelopes and put them in the closet? And so I pretend not to hear her. And go out to get an envelope because I’m going to have a hell of a good time in the process of buying one envelope. I meet a lot of people. And, see some great looking babes. And a fire engine goes by. And I give them the thumbs up. And, and ask a woman what kind of dog that is. And, and I don’t know. The moral of the story is, is we’re here on Earth to fart around. And, of course, the computers will do us out of that. And, what the computer people don’t realize, or they don’t care, is we’re dancing animals.
Kurt Vonnegut Jr.
All these beefy Caucasians with guns! Get enough of them together, looking for the America they always believed they'd grow up in, and they glom together like overcooked rice, form integral, starchy little units. With their power tools, portable generators, weapons, four-wheel-drive vehicles, and personal computers, they are like beavers hyped up on crystal meth, manic engineers without a blueprint, chewing through the wilderness, building things and abandoning them, altering the flow of mighty rivers and then moving on because the place ain't what it used to be.
Neal Stephenson (Snow Crash)
The distinction that only sciences are useful and only arts are spirit-enhancing is a nonsensical one. I couldn't write much without scientists designing my computer. And some of them must want to read about Greek myth after a long day at work. These Muses always remind me that scientists and artists should disregard the idiotic attempts to separate us. We are all nerds, in the end.
Natalie Haynes (Divine Might: Goddesses in Greek Myth)
the average forager had wider, deeper and more varied knowledge of her immediate surroundings than most of her modern descendants. Today, most people in industrial societies don’t need to know much about the natural world in order to survive. What do you really need to know in order to get by as a computer engineer, an insurance agent, a history teacher or a factory worker? You need to know a lot about your own tiny field of expertise, but for the vast majority of life’s necessities you rely blindly on the help of other experts, whose own knowledge is also limited to a tiny field of expertise. The human collective knows far more today than did the ancient bands. But at the individual level, ancient foragers were the most knowledgeable and skilful people in history.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
If three hundred years of chainsaws, CFCs, depleted uranium, automobiles, genetic engineering, airplanes, routine international trade, computers, plastics, endocrine disruptors, pesticides, vivisection, internal combustion engines, feller bunchers, dragline excavators, televisions, cell phones, and nuclear (and conventional) bombs are not enough to convey the picture, then that picture will never be conveyed.
Derrick Jensen (Dreams)
In the early twenty-first century the train of progress is again pulling out of the station – and this will probably be the last train ever to leave the station called Homo sapiens. Those who miss this train will never get a second chance. In order to get a seat on it you need to understand twenty-first-century technology, and in particular the powers of biotechnology and computer algorithms. These powers are far more potent than steam and the telegraph, and they will not be used merely for the production of food, textiles, vehicles and weapons. The main products of the twenty-first century will be bodies, brains and minds, and the gap between those who know how to engineer bodies and brains and those who do not will be far bigger than the gap between Dickens’s Britain and the Mahdi’s Sudan. Indeed, it will be bigger than the gap between Sapiens and Neanderthals. In the twenty-first century, those who ride the train of progress will acquire divine abilities of creation and destruction, while those left behind will face extinction.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
The phrase, “technology and education” usually means inventing new gadgets to teach the same old stuff in a thinly disguised version of the same old way. Moreover, if the gadgets are computers, the same old teaching becomes incredibly more expensive and biased towards its dumbest parts, namely the kind of rote learning in which measurable results can be obtained by treating the children like pigeons in a Skinner box. (Papert, 1972a)
Sylvia Libow Martinez (Invent To Learn: Making, Tinkering, and Engineering in the Classroom)
We’ve known since 2007 that there’s superposition in chlorophyll, for instance. Photosynthesis has a ninety-five percent energy-transfer efficiency rate, which is better than anything we can engineer. Plants achieve that by using superposition to simultaneously try all the possible pathways between their light-collecting molecules and their reaction-center proteins so that energy is always sent down the most efficient route; it’s a form of biological quantum computing.
Robert J. Sawyer (Quantum Night)
If you’ve ever wondered what we’re missing by sitting at computers in cubicles all day, follow Jessica DuLong when she loses her desk job and embarks on this unlikely but fantastic voyage. Deeply original, riveting to read, and soul-bearingly honest, "My River Chronicles" is a surprisingly infectious romance about a young woman falling in love with a muscle-y old boat. As DuLong learns to navigate her way through a man’s world of tools and engines, and across the swirling currents of a temperamental river, her book also becomes a love letter to a nation. In tune with the challenges of our times, DuLong reminds us of the skills and dedication that built America, and inspires us to renew ourselves once again.
Trevor Corson (The Story of Sushi: An Unlikely Saga of Raw Fish and Rice)
Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Jobs said. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.”34 Another lesson of the digital age is as old as Aristotle: “Man is a social animal.” What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose: to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare. Machines, by contrast, are not social animals. They don’t join Facebook of their own volition nor seek companionship for its own sake. When Alan Turing asserted that machines would someday behave like humans, his critics countered that they would never be able to show affection or crave intimacy. To indulge Turing, perhaps we could program a machine to feign affection and pretend to seek intimacy, just as humans sometimes do. But Turing, more than almost anyone, would probably know the difference. According to the second part of Aristotle’s quote, the nonsocial nature of computers suggests that they are “either a beast or a god.” Actually, they are neither. 
Despite all of the proclamations of artificial intelligence engineers and Internet sociologists, digital tools have no personalities, intentions, or desires. They are what we make of them.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Adopting a remote, managerial point of view, you could say that the Eagle project was a case where a local system of management worked as it should: competition for resources creating within a team inside a company an entrepreneurial spirit, which was channeled in the right direction by constraints sent down from the top. But it seems more accurate to say that a group of engineers got excited about building a computer. Whether it arose by corporate bungling or by design, the opportunity had to be grasped.
Tracy Kidder (The Soul of A New Machine)
But the “jobs of the future” do not need scientists who have memorized the periodic table. In fact, business leaders say they are looking for creative, independent problem solvers in every field, not just math and science. Yet in most schools, STEM subjects are taught as a series of memorized procedures and vocabulary words, when they are taught at all. In 2009, only 3% of high school graduates had any credits in an engineering course. (National Science Board, 2012) Technology is increasingly being relegated to using computers for Internet research and test taking.
Sylvia Libow Martinez (Invent To Learn: Making, Tinkering, and Engineering in the Classroom)
40. Be Defiant In our opinion, most search engine optimization (SEO) is bullshit. It involves trying to read Google’s mind and then gaming the system to make Google find crap. There are three thousand computer science PhDs at Google trying to make each search relevant, and then there’s you trying to fool them. Who’s going to win? Tricking Google is futile. Instead, you should let Google do what it does best: find great content. So defy all the SEO witchcraft out there and focus on creating, curating, and sharing great content. This is what’s called SMO: social-media optimization.
Guy Kawasaki (The Art of Social Media: Power Tips for Power Users)
The nuclear arms race is over, but the ethical problems raised by nonmilitary technology remain. The ethical problems arise from three "new ages" flooding over human society like tsunamis. First is the Information Age, already arrived and here to stay, driven by computers and digital memory. Second is the Biotechnology Age, due to arrive in full force early in the next century, driven by DNA sequencing and genetic engineering. Third is the Neurotechnology Age, likely to arrive later in the next century, driven by neural sensors and exposing the inner workings of human emotion and personality to manipulation.
Freeman Dyson (The Scientist as Rebel)
The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines. Then, beginning in the textile industry, inventors found ways to mechanize steps so that they could be performed by machines, many of them powered by steam engines. Babbage, building on ideas from Pascal and Leibniz, tried to apply these two processes to the production of computations, creating a mechanical precursor to the modern computer. His most significant conceptual leap was that such machines did not have to be set to do only one process, but instead could be programmed and reprogrammed through the use of punch cards. Ada saw the beauty and significance of that enchanting notion, and she also described an even more exciting idea that derived from it: such machines could process not only numbers but anything that could be notated in symbols.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution)
In the mid-1990s, a new employee of Sun Microsystems in California kept disappearing from their database. Every time his details were entered, the system seemed to eat him whole; he would disappear without a trace. No one in HR could work out why poor Steve Null was database kryptonite. The staff in HR were entering the surname as “Null,” but they were blissfully unaware that, in a database, NULL represents a lack of data, so Steve became a non-entry. To computers, his name was Steve Zero or Steve McDoesNotExist. Apparently, it took a while to work out what was going on, as HR would happily reenter his details each time the issue was raised, never stopping to consider why the database was routinely removing him.
Matt Parker (Humble Pi: A Comedy of Maths Errors)
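The failure mode Parker describes can be sketched in a few lines: an ingestion step that conflates the literal surname "Null" with a missing value silently drops the record. (The function and record names here are hypothetical, invented for illustration; this is a minimal sketch of the bug, not the actual Sun Microsystems system.)

```python
def normalize_surname(raw):
    """Buggy ingestion step: treats the literal surname 'Null'
    as a missing value, so the record is silently dropped."""
    if raw is None or raw.strip().upper() == "NULL":
        return None  # interpreted as "no data"
    return raw.strip()

records = [("Steve", "Null"), ("Ada", "Lovelace")]
kept = [(first, last) for first, last in records
        if normalize_surname(last) is not None]
print(kept)  # Steve Null vanishes: [('Ada', 'Lovelace')]
```

The fix, in any such system, is to keep "no value supplied" (database NULL) and the two-character string "Null" as distinct things, which every serious database already does; the bug lives in application code that collapses the two.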
Entrepreneurs who kept their day jobs had 33 percent lower odds of failure than those who quit. If you’re risk averse and have some doubts about the feasibility of your ideas, it’s likely that your business will be built to last. If you’re a freewheeling gambler, your startup is far more fragile. Like the Warby Parker crew, the entrepreneurs whose companies topped Fast Company’s recent most innovative lists typically stayed in their day jobs even after they launched. Former track star Phil Knight started selling running shoes out of the trunk of his car in 1964, yet kept working as an accountant until 1969. After inventing the original Apple I computer, Steve Wozniak started the company with Steve Jobs in 1976 but continued working full time in his engineering job at Hewlett-Packard until 1977. And although Google founders Larry Page and Sergey Brin figured out how to dramatically improve internet searches in 1996, they didn’t go on leave from their graduate studies at Stanford until 1998. “We almost didn’t start Google,” Page says, because we “were too worried about dropping out of our Ph.D. program.” In 1997, concerned that their fledgling search engine was distracting them from their research, they tried to sell Google for less than $2 million in cash and stock. Luckily for them, the potential buyer rejected the offer. This habit of keeping one’s day job isn’t limited to successful entrepreneurs. Many influential creative minds have stayed in full-time employment or education even after earning income from major projects. Selma director Ava DuVernay made her first three films while working in her day job as a publicist, only pursuing filmmaking full time after working at it for four years and winning multiple awards. Brian May was in the middle of doctoral studies in astrophysics when he started playing guitar in a new band, but he didn’t drop out until several years later to go all in with Queen. 
Soon thereafter he wrote “We Will Rock You.” Grammy winner John Legend released his first album in 2000 but kept working as a management consultant until 2002, preparing PowerPoint presentations by day while performing at night. Thriller master Stephen King worked as a teacher, janitor, and gas station attendant for seven years after writing his first story, only quitting a year after his first novel, Carrie, was published. Dilbert author Scott Adams worked at Pacific Bell for seven years after his first comic strip hit newspapers. Why did all these originals play it safe instead of risking it all?
Adam M. Grant (Originals: How Non-Conformists Move the World)
How do we learn? Is there a better way? What can we predict? Can we trust what we’ve learned? Rival schools of thought within machine learning have very different answers to these questions. The main ones are five in number, and we’ll devote a chapter to each. Symbolists view learning as the inverse of deduction and take ideas from philosophy, psychology, and logic. Connectionists reverse engineer the brain and are inspired by neuroscience and physics. Evolutionaries simulate evolution on the computer and draw on genetics and evolutionary biology. Bayesians believe learning is a form of probabilistic inference and have their roots in statistics. Analogizers learn by extrapolating from similarity judgments and are influenced by psychology and mathematical optimization.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
Historically, noted James Manyika, one of the authors of the McKinsey report, companies kept their eyes on competitors “who looked like them, were in their sector and in their geography.” Not anymore. Google started as a search engine and is now also becoming a car company and a home energy management system. Apple is a computer manufacturer that is now the biggest music seller and is also going into the car business, but in the meantime, with Apple Pay, it’s also becoming a bank. Amazon, a retailer, came out of nowhere to steal a march on both IBM and HP in cloud computing. Ten years ago neither company would have listed Amazon as a competitor. But Amazon needed more cloud computing power to run its own business and then decided that cloud computing was a business! And now Amazon is also a Hollywood studio.
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
The world's greatest computer is the brain. The world's greatest engine is the heart. The world's greatest generator is the soul. The world's greatest television is the mind. The world's greatest radio is the tongue. The world's greatest camera is the eye. The world's greatest ladder is faith. The world's greatest hammer is courage. The world's greatest sword is accuracy. The world's greatest photographer is sight. The world's greatest knife is fate. The world's greatest spear is intelligence. The world's greatest submarine is a fish. The world's greatest aeroplane is a bird. The world's greatest jet is a fly. The world's greatest bicycle is a camel. The world's greatest motorbike is a horse. The world's greatest train is a centipede. The world's greatest sniper is a cobra. The world's greatest schemer is a fox. The world's greatest builder is an ant. The world's greatest tailor is a spider. The world's greatest assassin is a wolf. The world's greatest ruler is a lion. The world's greatest judge is karma. The world's greatest preacher is nature. The world's greatest philosopher is truth. The world's greatest mirror is reality. The world's greatest curtain is darkness. The world's greatest author is destiny.
Matshona Dhliwayo
Technology, I said before, is most powerful when it enables transitions—between linear and circular motion (the wheel), or between real and virtual space (the Internet). Science, in contrast, is most powerful when it elucidates rules of organization—laws—that act as lenses through which to view and organize the world. Technologists seek to liberate us from the constraints of our current realities through those transitions. Science defines those constraints, drawing the outer limits of the boundaries of possibility. Our greatest technological innovations thus carry names that claim our prowess over the world: the engine (from ingenium, or “ingenuity”) or the computer (from computare, or “reckoning together”). Our deepest scientific laws, in contrast, are often named after the limits of human knowledge: uncertainty, relativity, incompleteness, impossibility. Of all the sciences, biology is the most lawless; there are few rules to begin with, and even fewer rules that are universal. Living beings must, of course, obey the fundamental rules of physics and chemistry, but life often exists on the margins and interstices of these laws, bending them to their near-breaking limit. The universe seeks equilibriums; it prefers to disperse energy, disrupt organization, and maximize chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organize chemicals into compartments; we sort laundry on Wednesdays. “It sometimes seems as if curbing entropy is our quixotic purpose in the universe,” James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses.
Siddhartha Mukherjee (The Gene: An Intimate History)
Gene Berdichevsky, one of the members of the solar-powered-car team, lit up the second he heard from Straubel. An undergraduate, Berdichevsky volunteered to quit school, work for free, and sweep the floors at Tesla if that’s what it took to get a job. The founders were impressed with his spirit and hired Berdichevsky after one meeting. This left Berdichevsky in the uncomfortable position of calling his Russian immigrant parents, a pair of nuclear submarine engineers, to tell them that he was giving up on Stanford to join an electric car start-up. As employee No. 7, he spent part of the workday in the Menlo Park office and the rest in Straubel’s living room designing three-dimensional models of the car’s powertrain on a computer and building battery pack prototypes in the garage. “Only now do I realize how insane it was,” Berdichevsky said.
Ashlee Vance (Elon Musk: Inventing the Future)
Lilah did little more than sleep and eat and cry, which to me was the most fascinating thing in the entire universe. Why did she cry? When did she sleep? What made her eat a lot one day and little the next? Was she changing with time? I did what any obsessed person would do in such a case: I recorded data, plotted it, calculated statistical correlations. First I just wrote on scraps of paper and made charts on graph paper, but I very quickly became more sophisticated. I wrote computer software to make a beautifully colored plot showing times when Diane fed Lilah, in black; when I fed her, in blue (expressed mother's milk, if you must know); Lilah's fussy times, in angry red; her happy times, in green. I calculated patterns in sleeping times, eating times, length of sleep, amounts eaten. Then, I did what any obsessed person would do these days; I put it all on the Web.
Mike Brown (How I Killed Pluto and Why It Had It Coming)
I want economists to quit concerning themselves with allocation problems, per se, with the problem, as it has been traditionally defined. The vocabulary of science is important here, and as T. D. Weldon once suggested, the very word "problem" in and of itself implies the presence of "solution." Once the format has been established in allocation terms, some solution is more or less automatically suggested. Our whole study becomes one of applied maximization of a relatively simple computational sort. Once the ends to be maximized are provided by the social welfare function, everything becomes computational, as my colleague, Rutledge Vining, has properly noted. If there is really nothing more to economics than this, we had as well turn it all over to the applied mathematicians. This does, in fact, seem to be the direction in which we are moving, professionally, and developments of note, or notoriety, during the past two decades consist largely in improvements in what are essentially computing techniques, in the mathematics of social engineering. What I am saying is that we should keep these contributions in perspective; I am urging that they be recognized for what they are, contributions to applied mathematics, to managerial science if you will, but not to our chosen subject field which we, for better or for worse, call "economics.
James M. Buchanan
Historians are wont to name technological advances as the great milestones of culture, among them the development of the plow, the discovery of smelting and metalworking, the invention of the clock, printing press, steam power, electric engine, lightbulb, semiconductor, and computer. But possibly even more transforming than any of these was the recognition by Greek philosophers and their intellectual descendants that human beings could examine, comprehend, and eventually even guide or control their own thought process, emotions, and resulting behavior. With that realization we became something new and different on earth: the only animal that, by examining its own cerebration and behavior, could alter them. This, surely, was a giant step in evolution. Although we are physically little different from the people of three thousand years ago, we are culturally a different species. We are the psychologizing animal.
Morton Hunt (The Story of Psychology)
The world has been changing even faster as people, devices and information are increasingly connected to each other. Computational power is growing and quantum computing is quickly being realised. This will revolutionise artificial intelligence with exponentially faster speeds. It will advance encryption. Quantum computers will change everything, even human biology. There is already one technique to edit DNA precisely, called CRISPR. The basis of this genome-editing technology is a bacterial defence system. It can accurately target and edit stretches of genetic code. The best intention of genetic manipulation is that modifying genes would allow scientists to treat genetic causes of disease by correcting gene mutations. There are, however, less noble possibilities for manipulating DNA. How far we can go with genetic engineering will become an increasingly urgent question. We can’t see the possibilities of curing motor neurone diseases—like my ALS—without also glimpsing its dangers. Intelligence is characterised as the ability to adapt to change. Human intelligence is the result of generations of natural selection of those with the ability to adapt to changed circumstances. We must not fear change. We need to make it work to our advantage. We all have a role to play in making sure that we, and the next generation, have not just the opportunity but the determination to engage fully with the study of science at an early level, so that we can go on to fulfil our potential and create a better world for the whole human race. We need to take learning beyond a theoretical discussion of how AI should be and to make sure we plan for how it can be. We all have the potential to push the boundaries of what is accepted, or expected, and to think big. We stand on the threshold of a brave new world. It is an exciting, if precarious, place to be, and we are the pioneers. When we invented fire, we messed up repeatedly, then invented the fire extinguisher. 
With more powerful technologies such as nuclear weapons, synthetic biology and strong artificial intelligence, we should instead plan ahead and aim to get things right the first time, because it may be the only chance we will get. Our future is a race between the growing power of our technology and the wisdom with which we use it. Let’s make sure that wisdom wins.
Stephen Hawking (Brief Answers to the Big Questions)
REINHOLD JOBS. Wisconsin-born Coast Guard seaman who, with his wife, Clara, adopted Steve in 1955. REED JOBS. Oldest child of Steve Jobs and Laurene Powell. RON JOHNSON. Hired by Jobs in 2000 to develop Apple’s stores. JEFFREY KATZENBERG. Head of Disney Studios, clashed with Eisner and resigned in 1994 to cofound DreamWorks SKG. ALAN KAY. Creative and colorful computer pioneer who envisioned early personal computers, helped arrange Jobs’s Xerox PARC visit and his purchase of Pixar. DANIEL KOTTKE. Jobs’s closest friend at Reed, fellow pilgrim to India, early Apple employee. JOHN LASSETER. Cofounder and creative force at Pixar. DAN’L LEWIN. Marketing exec with Jobs at Apple and then NeXT. MIKE MARKKULA. First big Apple investor and chairman, a father figure to Jobs. REGIS MCKENNA. Publicity whiz who guided Jobs early on and remained a trusted advisor. MIKE MURRAY. Early Macintosh marketing director. PAUL OTELLINI. CEO of Intel who helped switch the Macintosh to Intel chips but did not get the iPhone business. LAURENE POWELL. Savvy and good-humored Penn graduate, went to Goldman Sachs and then Stanford Business School, married Steve Jobs in 1991. GEORGE RILEY. Jobs’s Memphis-born friend and lawyer. ARTHUR ROCK. Legendary tech investor, early Apple board member, Jobs’s father figure. JONATHAN “RUBY” RUBINSTEIN. Worked with Jobs at NeXT, became chief hardware engineer at Apple in 1997. MIKE SCOTT. Brought in by Markkula to be Apple’s president in 1977 to try to manage Jobs.
Walter Isaacson (Steve Jobs)
Most people think the Lego corporation assembled a crack team of world-class experts to engineer Mini-Florida on a computer, but I’m not buying it.” “You aren’t?” asked Coleman. “It’s way too good.” Serge pointed at a two-story building in Key West. “Examine the meticulous green shutters on Hemingway’s house. No, my money is on a lone-wolf manic type like the famous Latvian Edward Leedskalnin, who single-handedly built the Coral Castle back in the twenties. He operated in secret, moving multi-ton hewn boulders south of Miami, and nobody knows how he did it. Probably happened here as well: The Lego people conducting an exhaustive nationwide search among the obsessive-compulsive community. But they had to be selective and stay away from the ones whose entire houses are filled to the ceiling with garbage bags of their own hair. Then they most likely found some cult guru living in a remote Lego ashram south of Pueblo with nineteen wives, offered him unlimited plastic blocks and said, ‘Knock yourself out.’”
Tim Dorsey (Tiger Shrimp Tango (Serge Storms #17))
In this section I have tried to demonstrate that Darwinian thinking does live up to its billing as universal acid: it turns the whole traditional world upside down, challenging the top-down image of designs flowing from that genius of geniuses, the Intelligent Designer, and replacing it with the bubble-up image of mindless, motiveless cyclical processes churning out ever-more robust combinations until they start replicating on their own, speeding up the design process by reusing all the best bits over and over. Some of these earliest offspring eventually join forces (one major crane, symbiosis), which leads to multicellularity (another major crane), which leads to the more effective exploration vehicles made possible by sexual reproduction (another major crane), which eventually leads in one species to language and cultural evolution (cranes again), which provide the medium for literature and science and engineering, the latest cranes to emerge, which in turn permits us to “go meta” in a way no other life form can do, reflecting in many ways on who and what we are and how we got here, modeling these processes in plays and novels, theories and computer simulations, and ever-more thinking tools to add to our impressive toolbox. This perspective is so widely unifying and at the same time so generous with detailed insights that one might say it’s a power tool, all on its own. Those who are still strangely repelled by Darwinian thinking must consider the likelihood that if they try to go it alone with only the hand tools of tradition, they will find themselves laboring far from the cutting edge of research on important phenomena as diverse as epidemics and epistemology, biofuels and brain architecture, molecular genetics, music, and morality.
Daniel C. Dennett (Intuition Pumps And Other Tools for Thinking)
Take for example job applications. In the 21st century the decision whether to hire somebody for a job will increasingly be made by algorithms. We cannot rely on the machines to set the relevant ethical standards; humans will still need to do that. But once we decide on an ethical standard in the job market, that it is wrong to discriminate against blacks or against women for example, we can rely on machines to implement and maintain these standards better than humans. A human manager may know and even agree that it is unethical to discriminate against blacks and women, but then, when a black woman applies for a job, the manager subconsciously discriminates against her and decides not to hire her. If we allow a computer to evaluate job applications and program the computer to completely ignore race and gender, we can be certain that the computer will indeed ignore these factors, because computers do not have a subconscious. Of course it won't be easy to write code for evaluating job applications, and there is always the danger that the engineers will somehow program their own subconscious biases into the software. Yet once we discover such mistakes, it would probably be far easier to debug the software than to rid humans of their racist and misogynist biases.
Yuval Noah Harari (21 Lessons for the 21st Century)
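Harari's point that software can be made to "completely ignore race and gender" can be shown in a few lines: strip the protected attributes before any scoring happens, so the score is provably invariant to them. This is only a minimal sketch of that idea; the field names and scoring weights below are hypothetical, invented for illustration, not any real hiring system.

```python
# A protected attribute never reaches the scoring logic, so the result
# cannot depend on it -- the "no subconscious" property Harari describes.
PROTECTED_FIELDS = {"race", "gender"}

def strip_protected(application: dict) -> dict:
    """Remove protected attributes before any scoring takes place."""
    return {k: v for k, v in application.items() if k not in PROTECTED_FIELDS}

def score(application: dict) -> float:
    """Toy scoring rule over the remaining, non-protected fields."""
    cleaned = strip_protected(application)
    return cleaned.get("years_experience", 0) * 1.0 + cleaned.get("test_score", 0) * 0.5

applicant = {"race": "X", "gender": "F", "years_experience": 4, "test_score": 80}
print(score(applicant))  # 44.0 -- identical for any value of the protected fields
```

His caveat survives the sketch, too: bias can still leak in through the remaining fields (proxies), which is exactly the kind of mistake he says is easier to find and debug in code than in a manager's head.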
Astounding, really, that Michel could consider psychology any kind of science at all. So much of it consisted of throwing together. Of thinking of the mind as a steam engine, the mechanical analogy most ready to hand during the birth of modern psychology. People had always done that when they thought about the mind: clockwork for Descartes, geological changes for the early Victorians, computers or holography for the twentieth century, AIs for the twenty-first…and for the Freudian traditionalists, steam engines. Application of heat, pressure buildup, pressure displacement, venting, all shifted into repression, sublimation, the return of the repressed. Sax thought it unlikely steam engines were an adequate model for the human mind. The mind was more like—what?—an ecology—a fellfield—or else a jungle, populated by all manner of strange beasts. Or a universe, filled with stars and quasars and black holes. Well—a bit grandiose, that—really it was more like a complex collection of synapses and axons, chemical energies surging hither and yon, like weather in an atmosphere. That was better—weather—storm fronts of thought, high-pressure zones, low-pressure cells, hurricanes—the jet streams of biological desires, always making their swift powerful rounds…life in the wind. Well. Throwing together. In fact the mind was poorly understood.
Kim Stanley Robinson (Blue Mars (Mars Trilogy, #3))
In the coming decades, it is likely that we will see more Internet-like revolutions, in which technology steals a march on politics. Artificial intelligence and biotechnology might soon overhaul our societies and economies – and our bodies and minds too – but they are hardly a blip on our political radar. Our current democratic structures just cannot collect and process the relevant data fast enough, and most voters don’t understand biology and cybernetics well enough to form any pertinent opinions. Hence traditional democratic politics loses control of events, and fails to provide us with meaningful visions for the future. That doesn’t mean we will go back to twentieth-century-style dictatorships. Authoritarian regimes seem to be equally overwhelmed by the pace of technological development and the speed and volume of the data flow. In the twentieth century, dictators had grand visions for the future. Communists and fascists alike sought to completely destroy the old world and build a new world in its place. Whatever you think about Lenin, Hitler or Mao, you cannot accuse them of lacking vision. Today it seems that leaders have a chance to pursue even grander visions. While communists and Nazis tried to create a new society and a new human with the help of steam engines and typewriters, today’s prophets could rely on biotechnology and super-computers.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Facebook’s own North American marketing director, Michelle Klein, who told an audience in 2016 that while the average adult checks his or her phone 30 times a day, the average millennial, she enthusiastically reported, checks more than 157 times daily. Generation Z, we now know, exceeds this pace. Klein described Facebook’s engineering feat: “a sensory experience of communication that helps us connect to others, without having to look away,” noting with satisfaction that this condition is a boon to marketers. She underscored the design characteristics that produce this mesmerizing effect: design is narrative, engrossing, immediate, expressive, immersive, adaptive, and dynamic.11 If you are over the age of thirty, you know that Klein is not describing your adolescence, or that of your parents, and certainly not that of your grandparents. Adolescence and emerging adulthood in the hive are a human first, meticulously crafted by the science of behavioral engineering; institutionalized in the vast and complex architectures of computer-mediated means of behavior modification; overseen by Big Other; directed toward economies of scale, scope, and action in the capture of behavioral surplus; and funded by the surveillance capital that accrues from unprecedented concentrations of knowledge and power. Our children endeavor to come of age in a hive that is owned and operated by the applied utopianists of surveillance capitalism and is continuously monitored and shaped by the gathering force of instrumentarian power. Is this the life that we want for the most open, pliable, eager, self-conscious, and promising members of our society?
Shoshana Zuboff (The Age of Surveillance Capitalism)
As I became older, I was given many masks to wear. I could be a laborer laying railroad tracks across the continent, with long hair in a queue to be pulled by pranksters; a gardener trimming the shrubs while secretly planting a bomb; a saboteur before the day of infamy at Pearl Harbor, signaling the Imperial Fleet; a kamikaze pilot donning his headband somberly, screaming 'Banzai' on my way to my death; a peasant with a broad-brimmed straw hat in a rice paddy on the other side of the world, stooped over to toil in the water; an obedient servant in the parlor, a houseboy too dignified for my own good; a washerman in the basement laundry, removing stains using an ancient secret; a tyrant intent on imposing my despotism on the democratic world, opposed by the free and the brave; a party cadre alongside many others, all of us clad in coordinated Mao jackets; a sniper camouflaged in the trees of the jungle, training my gunsights on G.I. Joe; a child running with a body burning from napalm, captured in an unforgettable photo; an enemy shot in the head or slaughtered by the villageful; one of the grooms in a mass wedding of couples, having met my mate the day before through our cult leader; an orphan in the last airlift out of a collapsed capital, ready to be adopted into the good life; a black belt martial artist breaking cinderblocks with his head, in an advertisement for Ginsu brand knives with the slogan 'but wait--there's more' as the commercial segued to show another free gift; a chef serving up dog stew, a trick on the unsuspecting diner; a bad driver swerving into the next lane, exactly as could be expected; a horny exchange student here for a year, eager to date the blonde cheerleader; a tourist visiting, clicking away with his camera, posing my family in front of the monuments and statues; a ping pong champion, wearing white tube socks pulled up too high and batting the ball with a wicked spin; a violin prodigy impressing the audience at Carnegie Hall, before 
taking a polite bow; a teen computer scientist, ready to make millions on an initial public offering before the company stock crashes; a gangster in sunglasses and a tight suit, embroiled in a turf war with the Sicilian mob; an urban greengrocer selling lunch by the pound, rudely returning change over the counter to the black patrons; a businessman with a briefcase of cash bribing a congressman, a corrupting influence on the electoral process; a salaryman on my way to work, crammed into the commuter train and loyal to the company; a shady doctor, trained in a foreign tradition with anatomical diagrams of the human body mapping the flow of life energy through a multitude of colored points; a calculus graduate student with thick glasses and a bad haircut, serving as a teaching assistant with an incomprehensible accent, scribbling on the chalkboard; an automobile enthusiast who customizes an imported car with a supercharged engine and Japanese decals in the rear window, cruising the boulevard looking for a drag race; an illegal alien crowded into the cargo hold of a smuggler's ship, defying death only to crowd into a New York City tenement and work as a slave in a sweatshop. My mother and my girl cousins were Madame Butterfly from the mail order bride catalog, dying in their service to the masculinity of the West, and the dragon lady in a kimono, taking vengeance for her sisters. They became the television newscaster, look-alikes with their flawlessly permed hair. Through these indelible images, I grew up. But when I looked in the mirror, I could not believe my own reflection because it was not like what I saw around me. Over the years, the world opened up. It has become a dizzying kaleidoscope of cultural fragments, arranged and rearranged without plan or order.
Frank H. Wu (Yellow)
I will give technology three definitions that we will use throughout the book. The first and most basic one is that a technology is a means to fulfill a human purpose. For some technologies (oil refining), the purpose is explicit. For others (the computer), the purpose may be hazy, multiple, and changing. As a means, a technology may be a method or process or device: a particular speech recognition algorithm, or a filtration process in chemical engineering, or a diesel engine. It may be simple: a roller bearing. Or it may be complicated: a wavelength division multiplexer. It may be material: an electrical generator. Or it may be nonmaterial: a digital compression algorithm. Whichever it is, it is always a means to carry out a human purpose. The second definition I will allow is a plural one: technology as an assemblage of practices and components. This covers technologies such as electronics or biotechnology that are collections or toolboxes of individual technologies and practices. Strictly speaking, we should call these bodies of technology. But this plural usage is widespread, so I will allow it here. I will also allow a third meaning. This is technology as the entire collection of devices and engineering practices available to a culture. Here we are back to the Oxford's collection of mechanical arts, or as Webster's puts it, "The totality of the means employed by a people to provide itself with the objects of material culture." We use this collective meaning when we blame "technology" for speeding up our lives, or talk of "technology" as a hope for mankind. Sometimes this meaning shades off into technology as a collective activity, as in "technology is what Silicon Valley is all about." I will allow this too as a variant of technology's collective meaning. The technology thinker Kevin Kelly calls this totality the "technium," and I like this word. But in this book I prefer to simply use "technology" for this because that reflects common use.
The reason we need three meanings is that each points to technology in a different sense, a different category, from the others. Each category comes into being differently and evolves differently. A technology in the singular (the steam engine) originates as a new concept and develops by modifying its internal parts. A technology in the plural (electronics) comes into being by building around certain phenomena and components, and develops by changing its parts and practices. And technology in general, the whole collection of all technologies that have ever existed, past and present, originates from the use of natural phenomena and builds up organically, with new elements forming by combination from old ones.
W. Brian Arthur (The Nature of Technology: What It Is and How It Evolves)
When General Genius built the first mentar [Artificial Intelligence] mind in the last half of the twenty-first century, it based its design on the only proven conscious material then known, namely, our brains. Specifically, the complex structure of our synaptic network. Scientists substituted an electrochemical substrate for our slower, messier biological one. Our brains are an evolutionary hodgepodge of newer structures built on top of more ancient ones, a jury-rigged system that has gotten us this far, despite its inefficiency, but was crying out for a top-to-bottom overhaul. Or so the General Genius engineers presumed. One of their chief goals was to make minds as portable as possible, to be easily transferred, stored, and active in multiple media: electronic, chemical, photonic, you name it. Thus there didn't seem to be a need for a mentar body, only for interchangeable containers. They designed the mentar mind to be as fungible as a bank transfer. And so they eliminated our most ancient brain structures for regulating metabolic functions, and they adapted our sensory/motor networks to the control of peripherals. As it turns out, intelligence is not limited to neural networks, Merrill. Indeed, half of human intelligence resides in our bodies outside our skulls. This was intelligence the mentars never inherited from us. ... The genius of the irrational... ... We gave them only rational functions -- the ability to think and feel, but no irrational functions... Have you ever been in a tight situation where you relied on your 'gut instinct'? This is the body's intelligence, not the mind's. Every living cell possesses it. The mentar substrate has no indomitable will to survive, but ours does. Likewise, mentars have no 'fire in the belly,' but we do. They don't experience pure avarice or greed or pride. They're not very curious, or playful, or proud. They lack a sense of wonder and spirit of adventure. They have little initiative. 
Granted, their cognition is miraculous, but their personalities are rather pedantic. But probably their chief shortcoming is the lack of intuition. Of all the irrational faculties, intuition is the most powerful. Some say intuition transcends space-time. Have you ever heard of a mentar having a lucky hunch? They can bring incredible amounts of cognitive and computational power to bear on a seemingly intractable problem, only to see a dumb human with a lucky hunch walk away with the prize every time. Then there's luck itself. Some people have it, most don't, and no mentar does. So this makes them want our bodies... Our bodies, ape bodies, dog bodies, jellyfish bodies. They've tried them all. Every cell knows some neat tricks for survival, but the problem with cellular knowledge is that it's not at all fungible; nor are our memories. We're pretty much trapped in our containers.
David Marusek (Mind Over Ship)
The main ones are the symbolists, connectionists, evolutionaries, Bayesians, and analogizers. Each tribe has a set of core beliefs, and a particular problem that it cares most about. It has found a solution to that problem, based on ideas from its allied fields of science, and it has a master algorithm that embodies it. For symbolists, all intelligence can be reduced to manipulating symbols, in the same way that a mathematician solves equations by replacing expressions by other expressions. Symbolists understand that you can’t learn from scratch: you need some initial knowledge to go with the data. They’ve figured out how to incorporate preexisting knowledge into learning, and how to combine different pieces of knowledge on the fly in order to solve new problems. Their master algorithm is inverse deduction, which figures out what knowledge is missing in order to make a deduction go through, and then makes it as general as possible. For connectionists, learning is what the brain does, and so what we need to do is reverse engineer it. The brain learns by adjusting the strengths of connections between neurons, and the crucial problem is figuring out which connections are to blame for which errors and changing them accordingly. The connectionists’ master algorithm is backpropagation, which compares a system’s output with the desired one and then successively changes the connections in layer after layer of neurons so as to bring the output closer to what it should be. Evolutionaries believe that the mother of all learning is natural selection. If it made us, it can make anything, and all we need to do is simulate it on the computer. The key problem that evolutionaries solve is learning structure: not just adjusting parameters, like backpropagation does, but creating the brain that those adjustments can then fine-tune. 
The evolutionaries’ master algorithm is genetic programming, which mates and evolves computer programs in the same way that nature mates and evolves organisms. Bayesians are concerned above all with uncertainty. All learned knowledge is uncertain, and learning itself is a form of uncertain inference. The problem then becomes how to deal with noisy, incomplete, and even contradictory information without falling apart. The solution is probabilistic inference, and the master algorithm is Bayes’ theorem and its derivatives. Bayes’ theorem tells us how to incorporate new evidence into our beliefs, and probabilistic inference algorithms do that as efficiently as possible. For analogizers, the key to learning is recognizing similarities between situations and thereby inferring other similarities. If two patients have similar symptoms, perhaps they have the same disease. The key problem is judging how similar two things are. The analogizers’ master algorithm is the support vector machine, which figures out which experiences to remember and how to combine them to make new predictions.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
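Of the five master algorithms Domingos names, the Bayesians' is the simplest to show in full: Bayes' theorem updating a belief when new evidence arrives. A minimal sketch, using his own "similar symptoms, same disease" medical framing; all the probabilities below are invented for illustration.

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E).
# Here H = "patient has the disease", E = "test came back positive".

def bayes_update(prior: float, likelihood: float, evidence_prob: float) -> float:
    """Return the posterior P(H | E) given the prior, P(E | H), and P(E)."""
    return likelihood * prior / evidence_prob

# Hypothetical numbers: 1% base rate, the test catches 90% of true cases,
# and gives a false positive 5% of the time among the healthy.
prior = 0.01
p_pos_given_disease = 0.9
p_pos = p_pos_given_disease * prior + 0.05 * (1 - prior)  # law of total probability

posterior = bayes_update(prior, p_pos_given_disease, p_pos)
print(round(posterior, 3))  # 0.154
```

Even with a fairly accurate test, the posterior is only about 15%, because the prior is so low; this is exactly the "incorporate new evidence into our beliefs" step that, as the quote says, probabilistic inference algorithms try to carry out at scale.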