History Of Computers Quotes

We've searched our database for all the quotes and captions related to History Of Computers. Here they are! All 100 of them:

The paradox of our time in history is that we have taller buildings but shorter tempers, wider Freeways, but narrower viewpoints. We spend more, but have less, we buy more, but enjoy less. We have bigger houses and smaller families, more conveniences, but less time. We have more degrees but less sense, more knowledge, but less judgment, more experts, yet more problems, more medicine, but less wellness. We drink too much, smoke too much, spend too recklessly, laugh too little, drive too fast, get too angry, stay up too late, get up too tired, read too little, watch TV too much, and pray too seldom. We have multiplied our possessions, but reduced our values. We talk too much, love too seldom, and hate too often. We've learned how to make a living, but not a life. We've added years to life not life to years. We've been all the way to the moon and back, but have trouble crossing the street to meet a new neighbor. We conquered outer space but not inner space. We've done larger things, but not better things. We've cleaned up the air, but polluted the soul. We've conquered the atom, but not our prejudice. We write more, but learn less. We plan more, but accomplish less. We've learned to rush, but not to wait. We build more computers to hold more information, to produce more copies than ever, but we communicate less and less. These are the times of fast foods and slow digestion, big men and small character, steep profits and shallow relationships. These are the days of two incomes but more divorce, fancier houses, but broken homes. These are days of quick trips, disposable diapers, throwaway morality, one night stands, overweight bodies, and pills that do everything from cheer, to quiet, to kill. It is a time when there is much in the showroom window and nothing in the stockroom. A time when technology can bring this letter to you, and a time when you can choose either to share this insight, or to just hit delete... 
Remember, to spend some time with your loved ones, because they are not going to be around forever. Remember, say a kind word to someone who looks up to you in awe, because that little person soon will grow up and leave your side. Remember, to give a warm hug to the one next to you, because that is the only treasure you can give with your heart and it doesn't cost a cent. Remember, to say, "I love you" to your partner and your loved ones, but most of all mean it. A kiss and an embrace will mend hurt when it comes from deep inside of you. Remember to hold hands and cherish the moment for someday that person might not be there again. Give time to love, give time to speak! And give time to share the precious thoughts in your mind.
Bob Moorehead (Words Aptly Spoken)
An integral part of any best friend's job is to immediately clear your computer history if you die.
Darynda Jones (Third Grave Dead Ahead (Charley Davidson, #3))
The upshot of all this is that we live in a universe whose age we can't quite compute, surrounded by stars whose distances we don't altogether know, filled with matter we can't identify, operating in conformance with physical laws whose properties we don’t truly understand.
Bill Bryson (A Short History of Nearly Everything)
One of history’s few iron laws is that luxuries tend to become necessities and to spawn new obligations. Once people get used to a certain luxury, they take it for granted. Then they begin to count on it. Finally they reach a point where they can’t live without it. Over the last few decades, we have invented countless time-saving machines that are supposed to make life more relaxed - washing machines, vacuum cleaners, dishwashers, telephones, mobile phones, computers, email. We thought we were saving time; instead we revved up the treadmill of life to ten times its former speed and made our days more anxious and agitated.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
We build our computers the way we build our cities—over time, without a plan, on top of ruins.
Ellen Ullman (Life in Code: A Personal History of Technology)
To be human is to be 'a' human, a specific person with a life history and idiosyncrasy and point of view; artificial intelligence suggests that the line between intelligent machines and people blurs most when a puree is made of that identity.
Brian Christian (The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive)
You Can't Write Perfect Software. Did that hurt? It shouldn't. Accept it as an axiom of life. Embrace it. Celebrate it. Because perfect software doesn't exist. No one in the brief history of computing has ever written a piece of perfect software. It's unlikely that you'll be the first. And unless you accept this as a fact, you'll end up wasting time and energy chasing an impossible dream.
Andrew Hunt (The Pragmatic Programmer: From Journeyman to Master)
People that hold onto hate for so long do so because they want to avoid dealing with their pain. They falsely believe if they forgive they are letting their enemy believe they are a doormat. What they don’t understand is hatred can’t be isolated or turned off. It manifests in their health, choices and belief systems. Their values and religious beliefs make adjustments to justify their negative emotions. Not unlike malware infesting a hard drive, their spirit slowly becomes corrupted and they make choices that don’t make logical sense to others. Hatred left unaddressed will crash a person’s spirit. The only thing he or she can do is to reboot, by fixing him or herself, not others. This might require installing a firewall of boundaries or parental controls on their emotions. Regardless of the approach, we are all connected on this "network of life" and each of us is responsible for cleaning up our spiritual registry.
Shannon L. Alder
If one person sits down at their computer one day and types one word, dose that affect the future? If that one person didn't type that one word, would the future's history be changed? Dose their one word even mean anything? Dose my one (times a lot) word mean anything? Dose that one person's one word even get read-once? If I wasn't sitting here writing my words, would my future be different?
Esther Earl (This Star Won't Go Out: The Life and Words of Esther Grace Earl)
The universe is computing its own destiny.
James Gleick (The Information: A History, a Theory, a Flood)
I long ago became convinced that the most reliable source for arcane and obscure and seemingly unobtainable information does not lie with the government or law enforcement agencies. Apparently neither the CIA nor the military intelligence apparatus inside the Pentagon had even a slight inkling of the Soviet Union's impending collapse, right up to the moment the Kremlin's leaders were trying to cut deals for their memoirs with New York publishers. Or, if a person really wishes a lesson in the subjective nature of official information, he can always call the IRS and ask for help with his tax forms, then call back a half hour later and ask the same questions to a different representative. So where do you go to find a researcher who is intelligent, imaginative, skilled in the use of computers, devoted to discovering the truth, and knowledgeable about science, technology, history, and literature, and who usually works for dirt and gets credit for nothing? After lunch I drove to the city library on Main and asked the reference librarian to find what she could on Junior Crudup.
James Lee Burke (Last Car to Elysian Fields (Dave Robicheaux, #13))
We have met the Devil of Information Overload and his impish underlings, the computer virus, the busy signal, the dead link, and the PowerPoint presentation.
James Gleick (The Information: A History, a Theory, a Flood)
Friedrich Nietzsche, who famously gave us the ‘God is dead’ phrase, was interested in the sources of morality. He warned that the emergence of something (whether an organ, a legal institution, or a religious ritual) is never to be confused with its acquired purpose: ‘Anything in existence, having somehow come about, is continually interpreted anew, requisitioned anew, transformed and redirected to a new purpose.’ This is a liberating thought, which teaches us never to hold the history of something against its possible applications. Even if computers started out as calculators, that doesn’t prevent us from playing games on them. (47) (quoting Nietzsche, The Genealogy of Morals)
Frans de Waal (The Bonobo and the Atheist: In Search of Humanism Among the Primates)
When economists insist that they too are scientists because they use mathematics, they are no different from astrologists protesting that they are just as scientific as astronomers because they also use computers and complicated charts.
Yanis Varoufakis (Talking to My Daughter About the Economy: A Brief History of Capitalism)
"Sometimes in studying Ramanujan's work," [George Andrews] said at another time, "I have wondered how much Ramanujan could have done if he had had MACSYMA or SCRATCHPAD or some other symbolic algebra package."
Robert Kanigel (The Man Who Knew Infinity: A Life of the Genius Ramanujan)
A hundred and fifty years before, when the parochial disagreements between Earth and Mars had been on the verge of war, the Belt had been a far horizon of tremendous mineral wealth beyond viable economic reach, and the outer planets had been beyond even the most unrealistic corporate dream. Then Solomon Epstein had built his little modified fusion drive, popped it on the back of his three-man yacht, and turned it on. With a good scope, you could still see his ship going at a marginal percentage of the speed of light, heading out into the big empty. The best, longest funeral in the history of mankind. Fortunately, he’d left the plans on his home computer. The Epstein Drive hadn’t given humanity the stars, but it had delivered the planets.
James S.A. Corey (Leviathan Wakes (The Expanse, #1))
A computer lets you make more mistakes faster than any invention in human history, with the possible exceptions of handguns and tequila.
Mitch Ratcliffe
To use a computer analogy, we are running twenty-first-century software on hardware last upgraded 50,000 years ago or more. This may explain quite a lot of what we see in the news.
Ronald Wright (A Short History Of Progress)
It would all be done with keys on alphanumeric keyboards that stood for weightless, invisible chains of electronic presence or absence. If patterns of ones and zeroes were "like" patterns of human lives and deaths, if everything about an individual could be represented in a computer record by a long string of ones and zeroes, then what kind of creature could be represented by a long string of lives and deaths? It would have to be up one level, at least -- an angel, a minor god, something in a UFO. It would take eight human lives and deaths just to form one character in this being's name -- its complete dossier might take up a considerable piece of the history of the world. We are digits in God's computer, she not so much thought as hummed to herself to sort of a standard gospel tune, And the only thing we're good for, to be dead or to be living, is the only thing He sees. What we cry, what we contend for, in our world of toil and blood, it all lies beneath the notice of the hacker we call God.
Thomas Pynchon (Vineland)
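Pynchon's arithmetic here is the ordinary byte arithmetic of his era: eight binary digits encode one character. A minimal illustration (the example byte is arbitrary):

```python
# Eight ones and zeroes -- one byte -- form a single character,
# which is the ratio Pynchon's narrator is playing with.
bits = "01000111"        # an arbitrary example byte
code = int(bits, 2)      # interpret the eight digits as a number: 71
char = chr(code)         # the character at that code point
print(char)              # G
```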
Epidemiologists have computed that measles requires an unvaccinated population of at least half a million people living in fairly close contact to continue to exist.
John M. Barry (The Great Influenza: The Story of the Deadliest Pandemic in History)
Babbage had most of this system sketched out by 1837, but the first true computer to use this programmable architecture didn’t appear for more than a hundred years.
Steven Johnson (Where Good Ideas Come From: The Natural History of Innovation)
If the automobile had followed the same development as the computer, a Rolls Royce would today cost $100 and get a million miles per gallon, and explode once a year killing everyone inside. —Robert X. Cringely, InfoWorld magazine
Robert J. Gordon (The Rise and Fall of American Growth: The U.S. Standard of Living since the Civil War (The Princeton Economic History of the Western World Book 60))
But before a computer became an inanimate object, and before Mission Control landed in Houston; before Sputnik changed the course of history, and before the NACA became NASA; before the Supreme Court case Brown v. Board of Education of Topeka established that separate was in fact not equal, and before the poetry of Martin Luther King Jr.’s “I Have a Dream” speech rang out over the steps of the Lincoln Memorial, Langley’s West Computers were helping America dominate aeronautics, space research, and computer technology, carving out a place for themselves as female mathematicians who were also black, black mathematicians who were also female.
Margot Lee Shetterly (Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race)
The computer on the desk of the student in school knows no history. It is not like a book, worn at the edges by human hands. No little child has written a note in it, long ago. It will not be passed down to the children of the children who use it. Its “meaning” is that there is no enduring meaning.
Anthony Esolen (Life Under Compulsion: Ten Ways to Destroy the Humanity of Your Child)
In fact, as time goes by, it becomes easier and easier to replace humans with computer algorithms, not merely because the algorithms are getting smarter, but also because humans are professionalising. Ancient hunter-gatherers mastered a very wide variety of skills in order to survive, which is why it would be immensely difficult to design a robotic hunter-gatherer. Such a robot would have to know how to prepare spear points from flint stones, how to find edible mushrooms in a forest, how to use medicinal herbs to bandage a wound, how to track down a mammoth and how to coordinate a charge with a dozen other hunters. However, over the last few thousand years we humans have been specialising. A taxi driver or a cardiologist specialises in a much narrower niche than a hunter-gatherer, which makes it easier to replace them with AI.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
History cannot be explained deterministically and it cannot be predicted because it is chaotic. So many forces are at work and their interactions are so complex that extremely small variations in the strength of the forces and the way they interact produce huge differences in outcomes. Not only that, but history is what is called a ‘level two’ chaotic system. Chaotic systems come in two shapes. Level one chaos is chaos that does not react to predictions about it. The weather, for example, is a level one chaotic system. Though it is influenced by myriad factors, we can build computer models that take more and more of them into consideration, and produce better and better weather forecasts. Level two chaos is chaos that reacts to predictions about it, and therefore can never be predicted accurately. Markets, for example, are a level two chaotic system. What will happen if we develop a computer program that forecasts with 100 per cent accuracy the price of oil tomorrow? The price of oil will immediately react to the forecast, which would consequently fail to materialise. If the current price of oil is $90 a barrel, and the infallible computer program predicts that tomorrow it will be $100, traders will rush to buy oil so that they can profit from the predicted price rise. As a result, the price will shoot up to $100 a barrel today rather than tomorrow. Then what will happen tomorrow? Nobody knows.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
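Harari's oil example can be rendered as a toy feedback loop. This is only a sketch of the mechanism he describes; the prices and the trader behavior are invented for illustration:

```python
# A "level two" chaotic system in miniature: publishing the forecast
# changes the quantity being forecast, so the forecast defeats itself.
def publish_forecast(price_today: float, predicted_tomorrow: float) -> float:
    """Return today's price after traders react to a published forecast."""
    if predicted_tomorrow > price_today:
        # Traders buy immediately to capture the predicted rise,
        # pushing today's price up to the forecast level at once.
        return predicted_tomorrow
    return price_today

price = publish_forecast(price_today=90.0, predicted_tomorrow=100.0)
print(price)  # 100.0 -- the predicted rise happened today, not tomorrow
```

Once the price jumps to 100 today, the original prediction about tomorrow can no longer come true as stated, which is exactly why Harari says such systems "can never be predicted accurately."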
I wish it were different. I wish that we privileged knowledge in politicians, that the ones who know things didn't have to hide it behind brown pants, and that the know-not-enoughs were laughed all the way to the Maine border on their first New Hampshire meet and greet. I wish that in order to secure his party's nomination, a presidential candidate would be required to point at the sky and name all the stars; have the periodic table of the elements memorized; rattle off the kings and queens of Spain; define the significance of the Gatling gun; joke around in Latin; interpret the symbolism in seventeenth-century Dutch painting; explain photosynthesis to a six-year-old; recite Emily Dickinson; bake a perfect popover; build a shortwave radio out of a coconut; and know all the words to Hoagy Carmichael's "Two Sleepy People," Johnny Cash's "Five Feet High and Rising," and "You Got the Silver" by the Rolling Stones. After all, the United States is the greatest country on earth dealing with the most complicated problems in the history of the world--poverty, pollution, justice, Jerusalem. What we need is a president who is at least twelve kinds of nerd, a nerd messiah to come along every four years, acquire the Secret Service code name Poindexter, install a Revenge of the Nerds screen saver on the Oval Office computer, and one by one decrypt our woes.
Sarah Vowell (The Partly Cloudy Patriot)
My computer is a tool not a way of life. The real world is outside.
Kirk Ward Robinson (Hiking Through History: Hannibal, Highlanders & Joan of Arc)
The bookcase was filled with computer games, history books, and sci-fi novels in about equal proportions. Odd reading choices, maybe, but I just thought of it as past and future history.
Mike Mullin (Ashfall (Ashfall, #1))
Memes can replicate with impressive virulence while leaving swaths of collateral damage—patent medicines and psychic surgery, astrology and satanism, racist myths, superstitions, and (a special case) computer viruses. In a way, these are the most interesting—the memes that thrive to their hosts’ detriment, such as the idea that suicide bombers will find their reward in heaven.
James Gleick (The Information: A History, a Theory, a Flood)
And there is one disconcerting thing about working with a computer – it's likely to talk back to you. You make some tiny mistake in your FORTRAN language – putting a letter in the wrong column, say, or omitting a comma – and the 360 comes to a screeching halt and prints out rude remarks, like "ILLEGAL FORMAT," or "UNKNOWN PROBLEM," or, if the man who wrote the program was really feeling nasty that morning, "WHAT'S THE MATTER STUPID? CAN'T YOU READ?" Everyone who uses a computer frequently has had, from time to time, a mad desire to attack the precocious abacus with an axe.
John Drury Clark (Ignition!: An informal history of liquid rocket propellants)
Deep Blue didn't win by being smarter than a human; it won by being millions of times faster than a human. Deep Blue had no intuition. An expert human player looks at a board position and immediately sees what areas of play are most likely to be fruitful or dangerous, whereas a computer has no innate sense of what is important and must explore many more options. Deep Blue also had no sense of the history of the game, and didn't know anything about its opponent. It played chess yet didn't understand chess, in the same way a calculator performs arithmetic but doesn't understand mathematics.
Jeff Hawkins (On Intelligence)
We're at a crucial point in history. We cannot have fast cars, computers the size of credit cards, and modern conveniences, whilst simultaneously having clean air, abundant rainforests, fresh drinking water and a stable climate. This generation can have one or the other but not both. Humanity must make a choice. Both have an opportunity cost. Gadgetry or nature? Pick the wrong one and the next generations may have neither.
Mark Boyle (The Moneyless Man: A Year of Freeconomic Living)
Many of my all-time favorite movies are almost entirely verbal. The entire plot of My Dinner with Andre is “Wallace Shawn and Andre Gregory eat dinner.” The entire plot of Before Sunrise is “Ethan Hawke and Julie Delpy walk around Vienna.” But the dialogue takes us everywhere, and as Roger Ebert notes of My Dinner with Andre, these films may be paradoxically among the most visually stimulating in the history of the cinema.
Brian Christian (The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive)
Second by second, the Qeng Ho counted from the instant that a human had first set foot on Old Earth's moon. But if you looked at it still more closely ... the starting instant was actually about fifteen million seconds later, the 0-second of one of Humankind's first computer operating systems.
Vernor Vinge (A Deepness in the Sky)
It might not be immediately obvious to some readers why the ability to perform 10^85 computational operations is a big deal. So it's useful to put it in context. [I]t may take about 10^31-10^44 operations to simulate all neuronal operations that have occurred in the history of life on Earth. Alternatively, let us suppose that the computers are used to run human whole brain emulations that live rich and happy lives while interacting with one another in virtual environments. A typical estimate of the computational requirements for running one emulation is 10^18 operations per second. To run an emulation for 100 subjective years would then require some 10^27 operations. This would mean that at least 10^58 human lives could be created in emulation even with quite conservative assumptions about the efficiency of computronium. In other words, assuming that the observable universe is void of extraterrestrial civilizations, then what hangs in the balance is at least 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 human lives. If we represent all the happiness experienced during one entire such life with a single teardrop of joy, then the happiness of these souls could fill and refill the Earth's oceans every second, and keep doing so for a hundred billion billion millennia. It is really important that we make sure these truly are tears of joy.
Nick Bostrom (Superintelligence: Paths, Dangers, Strategies)
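Bostrom's 10^58 figure follows from simple division of his stated assumptions. A quick check of the arithmetic (note he rounds a century to ~10^9 seconds; the exact figure of ~3.15e9 seconds only shifts the answer by a factor of three, which is negligible at this scale):

```python
# Reproducing the order-of-magnitude arithmetic in the passage above.
total_ops = 10 ** 85             # assumed total computational capacity
ops_per_second = 10 ** 18        # assumed cost of one whole-brain emulation
seconds_per_life = 10 ** 9       # Bostrom's rounding of 100 subjective years
ops_per_life = ops_per_second * seconds_per_life   # 10^27 operations
lives = total_ops // ops_per_life
print(lives == 10 ** 58)         # True
```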
Empirical studies show that New Zealanders are the most widely traveled people on the planet. The computer and the Internet have made a major difference. Insularity, distance, and isolation may have been important in an earlier period of New Zealand’s history, but not today. The rapid progress of communications has wrought a revolution in the spatial condition of New Zealand, and yet its culture remains very distinctive. This fact suggests that distance itself is not the key.
David Hackett Fischer (Fairness and Freedom: A History of Two Open Societies: New Zealand and the United States)
Like Ada Lovelace, Turing was a programmer, looking inward to the step-by-step logic of his own mind. He imagined himself as a computer. He distilled mental procedures into their smallest constituent parts, the atoms of information processing.
James Gleick (The Information: A History, a Theory, a Flood)
If Henry Adams, whom you knew slightly, could make a theory of history by applying the second law of thermodynamics to human affairs, I ought to be entitled to base one on the angle of repose, and may yet. There is another physical law that teases me, too: the Doppler Effect. The sound of anything coming at you -- a train, say, or the future -- has a higher pitch than the sound of the same thing going away. If you have perfect pitch and a head for mathematics you can compute the speed of the object by the interval between its arriving and departing sounds. I have neither perfect pitch nor a head for mathematics, and anyway who wants to compute the speed of history? Like all falling bodies, it constantly accelerates. But I would like to hear your life as you heard it, coming at you, instead of hearing it as I do, a sober sound of expectations reduced, desires blunted, hopes deferred or abandoned, chances lost, defeats accepted, griefs borne. I don't find your life uninteresting, as Rodman does. I would like to hear it as it sounded while it was passing. Having no future of my own, why shouldn't I look forward to yours.
Wallace Stegner
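The computation Stegner's narrator declines to do is a standard physics exercise. A sketch, assuming the textbook Doppler formulas for a moving source; the whistle frequencies below are made up for illustration:

```python
# Recovering a source's speed from "the interval between its
# arriving and departing sounds," as Stegner's narrator puts it.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C (assumed)

def source_speed(freq_approaching: float, freq_receding: float) -> float:
    """Speed of a sound source from the pitch heard before and after it passes.

    For a source moving at speed v toward/away from a stationary listener:
        f_approach = f0 * c / (c - v)
        f_recede   = f0 * c / (c + v)
    so the ratio r = f_approach / f_recede = (c + v) / (c - v),
    giving v = c * (r - 1) / (r + 1).
    """
    r = freq_approaching / freq_receding
    return SPEED_OF_SOUND * (r - 1.0) / (r + 1.0)

# A train whistle heard at 500 Hz approaching and 420 Hz receding:
print(round(source_speed(500.0, 420.0), 1))  # 29.8 m/s, about 107 km/h
```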
I wonder if we are seeing a return to the object in the science-based museum. Since any visitor can go to a film like Jurassic Park and see dinosaurs reawakened more graphically than any museum could emulate, maybe a museum should be the place to have an encounter with the bony truth. Maybe some children have overdosed on simulations on their computers at home and just want to see something solid--a fact of life.
Richard Fortey (Dry Store Room No. 1: The Secret Life of the Natural History Museum)
In history, and in evolution, progress is always a futile, Sisyphean struggle to stay in the same relative place by getting ever better at things. Cars move through the congested streets of London no faster than horse-drawn carriages did a century ago. Computers have no effect on productivity because people learn to complicate and repeat tasks that have been made easier.
Matt Ridley
Every set of phenomena, whether cultural totality or sequence of events, has to be fragmented, disjointed, so that it can be sent down the circuits; every kind of language has to be resolved into a binary formulation so that it can circulate not, any longer, in our memories, but in the luminous, electronic memory of the computers. No human language can withstand the speed of light. No event can withstand being beamed across the whole planet. No meaning can withstand acceleration. No history can withstand the centrifugation of facts or their being short-circuited in real time (to pursue the same train of thought: no sexuality can withstand being liberated, no culture can withstand being hyped, no truth can withstand being verified, etc.).
Jean Baudrillard (The Illusion of the End)
Unix is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic: a living body of narrative that many people know by heart, and tell over and over again—making their own personal embellishments whenever it strikes their fancy. The bad embellishments are shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. […] Thus Unix has slowly accreted around a simple kernel and acquired a kind of complexity and asymmetry about it that is organic, like the roots of a tree, or the branchings of a coronary artery. Understanding it is more like anatomy than physics.
Neal Stephenson
Yet the possibility of information storage, beyond what men and governments ever had before, can make available at the touch of a button a man's total history (including remarks put on his record by his kindergarten teacher about his ability and character). And with the computer must be placed the modern scientific technical capability which exists for wholesale monitoring of telephone, cable, Telex and microwave transmissions which carry much of today's spoken and written communications. The combined use of the technical capability of listening in on all these forms of communications with the high-speed computer literally leaves no place to hide and little room for privacy.
Francis A. Schaeffer (How Should We Then Live? The Rise and Decline of Western Thought and Culture)
In the pentagram, the Pythagoreans found all proportions well-known in antiquity: arithmetic, geometric, harmonic, and also the well-known golden proportion, or the golden ratio. ... Probably owing to the perfect form and the wealth of mathematical forms, the pentagram was chosen by the Pythagoreans as their secret symbol and a symbol of health. - Alexander Voloshinov [As quoted in Stakhov]
Alexey Stakhov (MATHEMATICS OF HARMONY: FROM EUCLID TO CONTEMPORARY MATHEMATICS AND COMPUTER SCIENCE (Series in Knots and Everything, 22))
Once upon a time in the land of Shinar, God came down to see the city and the tower. People were united and spoke in one language. Then God confounded their language and scattered them all over the planet earth. I believe, because of our technology, there will be one computer-based language on earth. Then God will come back again and scatter us all over the star constellations.
Toba Beta
The steampunk genre often works as a form of alternate history, showing us how small changes to what actually happened might have resulted in momentous differences: clockwork Victorian-era computers, commercial transcontinental dirigible lines, and a host of other wonders. This is that kind of book.
Nisi Shawl (Everfair (Everfair #1))
When I was an activist in the 1980s, ninety-eight percent of my time was spent stuffing envelopes and writing addresses on them. The remaining two percent was the time we spent figuring out what to put in the envelopes. Today, we get those envelopes and stamps and address books for free. This is so fantastically, hugely different and weird that we haven’t even begun to feel the first tendrils of it.
Cory Doctorow (In Real Life)
[Dialogue between Solon and an Egyptian Priest] In the Egyptian Delta, at the head of which the river Nile divides, there is a certain district which is called the district of Sais [...] To this city came Solon, and was received there with great honour; he asked the priests who were most skilful in such matters, about antiquity, and made the discovery that neither he nor any other Hellene knew anything worth mentioning about the times of old. On one occasion, wishing to draw them on to speak of antiquity, he began to tell about the most ancient things in our part of the world-about Phoroneus, who is called "the first man," and about Niobe; and after the Deluge, of the survival of Deucalion and Pyrrha; and he traced the genealogy of their descendants, and reckoning up the dates, tried to compute how many years ago the events of which he was speaking happened. Thereupon one of the priests, who was of a very great age, said: O Solon, Solon, you Hellenes are never anything but children, and there is not an old man among you. Solon in return asked him what he meant. I mean to say, he replied, that in mind you are all young; there is no old opinion handed down among you by ancient tradition, nor any science which is hoary with age.
Plato (Timaeus and Critias)
By the early 1960’s America had reluctantly come to realize that it possessed, as a nation, the most potent scientific complex in the history of the world. Eighty per cent of all scientific discoveries in the preceding three decades had been made by Americans. The United States had 75 per cent of the world’s computers, and 90 per cent of the world’s lasers. The United States had three and a half times as many scientists as the Soviet Union and spent three and a half times as much money on research; the U.S. had four times as many scientists as the European Economic Community and spent seven times as much on research.
Michael Crichton (The Andromeda Strain)
In accordance with the law of accelerating returns, paradigm shift (also called innovation) turns the S-curve of any specific paradigm into a continuing exponential. A new paradigm, such as three-dimensional circuits, takes over when the old paradigm approaches its natural limit, which has already happened at least four times in the history of computation. In such nonhuman species as apes, the mastery of a toolmaking or -using skill by each animal is characterized by an S-shaped learning curve that ends abruptly; human-created technology, in contrast, has followed an exponential pattern of growth and acceleration since its inception.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
A computer destroys the sense of historical succession, just as do other forms of mechanization...Certain farms contain hospitably the remnants and reminders of the forest or prairie that preceded them. It is possible even for towns and cities to remember farms and forests or prairies. All good human work remembers its history.
Wendell Berry (Why I am not Going to Buy a Computer)
There are too many famous Steve Jobs anecdotes to count, but several of them revolve around one theme: his unwillingness to leave well enough alone. His products had to be perfect; they had to do what they promised, and then some. And even though deadlines loomed and people would have to work around the clock, he would regularly demand more from his teams than they thought they could provide. The result? The most successful company in the history of the world and products that inspire devotion that is truly unusual for a personal computer or cell phone.
Ryan Holiday (Perennial Seller: The Art of Making and Marketing Work that Lasts)
When a printed book—whether a recently published scholarly history or a two-hundred-year-old Victorian novel—is transferred to an electronic device connected to the Internet, it turns into something very like a Web site. Its words become wrapped in all the distractions of the networked computer. Its links and other digital enhancements propel the reader hither and yon. It loses what the late John Updike called its “edges” and dissolves into the vast, rolling waters of the Net. The linearity of the printed book is shattered, along with the calm attentiveness it encourages in the reader.
Nicholas Carr (The Shallows: What the Internet Is Doing to Our Brains)
Wise people throughout history have been those who saw that while life is real, life’s problems are an illusion, they are thought-created. These people know that we manufacture and blow problems way out of proportion through our own ability to think. They also know that if we can step outside the boundaries of our own thinking, we can find the answer we are looking for. This, in a nutshell, is wisdom: the ability to see an answer without having to think of an answer. Wisdom is the ‘ah ha, that’s so obvious’ experience most of us have had many times. Few people seem to understand that this voice is always available to us. Wisdom is indeed your inner sense of knowing. It is true mental health, a peaceful state of mind where answers to questions are as plentiful as the problems you see when you aren’t experiencing wisdom. It’s as if wisdom lies in the space between your thoughts, in those quiet moments when your ‘biological computer’ is turned off.
Richard Carlson (Stop Thinking, Start Living: Discover Lifelong Happiness)
You know, Bo, there is a feeling, in that instant following some life-changing tragedy, that you can actually step back over that sliver of time and stop the horror from coming. But that feeling is a lie, because in the tiniest microminisecond after any event occurs, it is as safe in history as Julius Caesar. Data in the universal computer is backed up as it happens.
Chris Crutcher (Ironman)
The sum total of money in the world is about $60 trillion, yet the sum total of coins and banknotes is less than $6 trillion. More than 90 per cent of all money – more than $50 trillion appearing in our accounts – exists only on computer servers.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
Google gets $59 billion, and you get free search and e-mail. A study published by the Wall Street Journal in advance of Facebook’s initial public offering estimated the value of each long-term Facebook user to be $80.95 to the company. Your friendships were worth sixty-two cents each and your profile page $1,800. A business Web page and its associated ad revenue were worth approximately $3.1 million to the social network. Viewed another way, Facebook’s billion-plus users, each dutifully typing in status updates, detailing his biography, and uploading photograph after photograph, have become the largest unpaid workforce in history. As a result of their free labor, Facebook has a market cap of $182 billion, and its founder, Mark Zuckerberg, has a personal net worth of $33 billion. What did you get out of the deal? As the computer scientist Jaron Lanier reminds us, a company such as Instagram—which Facebook bought in 2012—was not valued at $1 billion because its thirteen employees were so “extraordinary. Instead, its value comes from the millions of users who contribute to the network without being paid for it.” Its inventory is personal data—yours and mine—which it sells over and over again to parties unknown around the world. In short, you’re a cheap date.
Marc Goodman (Future Crimes)
Greek philosophers looked upon the past and the future as the primary evils weighing upon human life, and as the source of all the anxieties which blight the present. The present moment is the only dimension of existence worth inhabiting, because it is the only one available to us. The past is no longer and the future has yet to come, they liked to remind us; yet we live virtually all of our lives somewhere between memories and aspirations, nostalgia and expectation. We imagine we would be much happier with new shoes, a faster computer, a bigger house, more exotic holidays, different friends … But by regretting the past or guessing the future, we end up missing the only life worth living: the one which proceeds from the here and now and deserves to be savoured.
Luc Ferry (A Brief History of Thought: A Philosophical Guide to Living (Learning to Live))
Throw in the valley’s rich history of computer science breakthroughs, and you’ve set the stage for the geeky-hippie hybrid ideology that has long defined Silicon Valley. Central to that ideology is a wide-eyed techno-optimism, a belief that every person and company can truly change the world through innovative thinking. Copying ideas or product features is frowned upon as a betrayal of the zeitgeist and an act that is beneath the moral code of a true entrepreneur. It’s all about “pure” innovation, creating a totally original product that generates what Steve Jobs called a “dent in the universe.” Startups that grow up in this kind of environment tend to be mission-driven. They start with a novel idea or idealistic goal, and they build a company around that. Company mission statements are clean and lofty, detached from earthly concerns or financial motivations.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
The upshot of all this is that we live in a universe whose age we can’t quite compute, surrounded by stars whose distances from us and each other we don’t altogether know, filled with matter we can’t identify, operating in conformance with physical laws whose properties we don’t truly understand.
Bill Bryson (A Short History of Nearly Everything)
Writing was born as the maidservant of human consciousness, but is increasingly becoming its master. Our computers have trouble understanding how Homo sapiens talks, feels and dreams. So we are teaching Homo sapiens to talk, feel and dream in the language of numbers, which can be understood by computers.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
Not only are the new technologies exponential, digital, and combinatorial, but most of the gains are still ahead of us. In the next twenty-four months, the planet will add more computer power than it did in all previous history. Over the next twenty-four years, the increase will likely be over a thousand-fold.
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
Every moment of our existence is linked by a peculiar triple thread to our past—the most recent and the most distant—by memory. Our present swarms with traces of our past. We are histories of ourselves, narratives. I am not this momentary mass of flesh reclined on the sofa typing the letter a on my laptop; I am my thoughts full of the traces of the phrases that I am writing; I am my mother’s caresses, and the serene kindness with which my father calmly guided me; I am my adolescent travels; I am what my reading has deposited in layers in my mind; I am my loves, my moments of despair, my friendships, what I’ve written, what I’ve heard; the faces engraved on my memory. I am, above all, the one who a minute ago made a cup of tea for himself. The one who a moment ago typed the word “memory” into his computer. The one who just composed the sentence that I am now completing. If all this disappeared, would I still exist? I am this long, ongoing novel. My life consists of it.
Carlo Rovelli (The Order of Time)
We are suddenly showing unprecedented interest in the fate of so-called lower life forms, perhaps because we are about to become one. If and when computer programs attain superhuman intelligence and unprecedented power, should we begin valuing these programs more than we value humans? Would it be okay, for example, for an artificial intelligence to exploit humans and even kill them to further its own needs and desires? If it should never be allowed to do that, despite its superior intelligence
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
As we're told that 10 percent of all high school education will be computer-based by 2014 and rise to 50 percent by 2019, and as the PowerPoint throws up aphoristic bromides by the corporate heroes of the digitally driven 'global economy' -- the implication being that 'great companies' know what they're doing, while most schools don't -- and as we're goaded mercilessly to the conclusion that everything we are, know, and do is bound for the dustbin of history, I want to ask what kind of schooling Bill Gates and Steve Jobs had. Wasn't it at bottom the very sort of book-based, content-driven education that we declare obsolete in the name of their achievements?
Garret Keizer (Getting Schooled: The Reeducation of an American Teacher)
Turing knew from personal experience that it didn’t matter who you really were – it mattered only what others thought about you. According to Turing, in the future computers would be just like gay men in the 1950s. It won’t matter whether computers will actually be conscious or not. It will matter only what people think about it. The
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Then I thought of the federal government discovering they could send special ops into the past to change whatever they wanted. I didn’t know if that were possible, but if so, the folks who gave us fun stuff like bio-weapons and computer-guided smart bombs were the last folks I’d want carrying their various agendas into living, unarmored history. The
Stephen King (11/22/63)
I neither oblige the belief of other person, nor overhastily subscribe mine own. Nor have I stood with others computing or collating years and chronologies, lest I should be vainly curious about the time and circumstance of things, whereof the substance is so much in doubt. By this time, like one who had set out on his way by night, and travelled through a region of smooth or idle dreams, our history now arrives on the confines, where daylight and truth meet us with a clear dawn, representing to our view, though at a far distance, true colours and shapes.
John Milton (The History of Britain; That Part Especially Now Called England, from the First Traditional Beginning Continued to the Norman Conquest)
In the history of ideas, it's repeatedly happened that an idea, developed in one area for one purpose, finds an unexpected application elsewhere. Concepts developed purely for philosophy of mathematics turned out to be just what you needed to build a computer. Statistical formulae for understanding genetic change in biology are now applied in both economics and programming.
Patrick Grim
It’s also probably fair to say she was probably too young at thirteen to innocently open the drawer under his bed and come across a leather gas mask type thing with a leather dick attached where she presumed a nose should be, along with associated whips, gels, handcuffs and other unexplainable objects Unfortunately, once seen, never unseen and it was a lesson for her at a young age that you never know people until you’ve been through their drawers and computer history
Bernardine Evaristo (Girl, Woman, Other)
Like I said last time, the world our parents grew up in is history. All the old rules, we've thrown them out. We're the ones making the future. We're the founding fathers. Hand us universal Wi-Fi and soup dumplings and we'll fix the world. So how do you fit in? What if you can't code? What if you've never been able to build anything more than a birdhouse? It doesn't matter. You've got skills that you probably dismiss as tricks. That dance you can do, that song you can sing, the painting hanging in your room, those are all skills we need. See there's a reason my status online is recruiting for the future. We broke some eggs and we baked a cake. It was delicious, really amazing cream cheese frosting. I saved you a piece, but I don't want to give it to you. I want to teach you how to bake your own cake from scratch. Only, instead of flour and water and eggs, I want you to make something with oil paints, yarn, peptides, or computer parts. The revolution is now. Welcome aboard. And, uh, get ready to create...
Leopoldo Gout (Genius: The Revolution (Genius, 3))
In the early twenty-first century the train of progress is again pulling out of the station – and this will probably be the last train ever to leave the station called Homo sapiens. Those who miss this train will never get a second chance. In order to get a seat on it you need to understand twenty-first-century technology, and in particular the powers of biotechnology and computer algorithms. These powers are far more potent than steam and the telegraph, and they will not be used merely for the production of food, textiles, vehicles and weapons. The main products of the twenty-first century will be bodies, brains and minds, and the gap between those who know how to engineer bodies and brains and those who do not will be far bigger than the gap between Dickens’s Britain and the Mahdi’s Sudan. Indeed, it will be bigger than the gap between Sapiens and Neanderthals. In the twenty-first century, those who ride the train of progress will acquire divine abilities of creation and destruction, while those left behind will face extinction.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Homo sapiens is not going to be exterminated by a robot revolt. Rather, Homo sapiens is likely to upgrade itself step by step, merging with robots and computers in the process, until our descendants will look back and realise that they are no longer the kind of animal that wrote the Bible, built the Great Wall of China and laughed at Charlie Chaplin’s antics. This will not happen in a day, or a year. Indeed, it is already happening right now, through innumerable mundane actions. Every day millions of people decide to grant their smartphone a bit more control over their lives or try a new and more effective antidepressant drug. In pursuit of health, happiness and power, humans will gradually change first one of their features and then another, and another, until they will no longer be human.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
For as to what we have heard you affirm, that there are other kingdoms and states in the world inhabited by human creatures as large as yourself, our philosophers are in much doubt, and would rather conjecture that you dropped from the moon, or one of the stars; because it is certain, that a hundred mortals of your bulk would in a short time destroy all the fruits and cattle of his majesty’s dominions: besides, our histories of six thousand moons make no mention of any other regions than the two great empires of Lilliput and Blefuscu. Which two mighty powers have, as I was going to tell you, been engaged in a most obstinate war for six-and-thirty moons past. It began upon the following occasion. It is allowed on all hands, that the primitive way of breaking eggs, before we eat them, was upon the larger end; but his present majesty’s grandfather, while he was a boy, going to eat an egg, and breaking it according to the ancient practice, happened to cut one of his fingers. Whereupon the emperor his father published an edict, commanding all his subjects, upon great penalties, to break the smaller end of their eggs. The people so highly resented this law, that our histories tell us, there have been six rebellions raised on that account; wherein one emperor lost his life, and another his crown. These civil commotions were constantly fomented by the monarchs of Blefuscu; and when they were quelled, the exiles always fled for refuge to that empire. It is computed that eleven thousand persons have at several times suffered death, rather than submit to break their eggs at the smaller end. Many hundred large volumes have been published upon this controversy: but the books of the Big-endians have been long forbidden, and the whole party rendered incapable by law of holding employments. 
During the course of these troubles, the emperors of Blefusca did frequently expostulate by their ambassadors, accusing us of making a schism in religion, by offending against a fundamental doctrine of our great prophet Lustrog, in the fifty-fourth chapter of the Blundecral (which is their Alcoran). This, however, is thought to be a mere strain upon the text; for the words are these: ‘that all true believers break their eggs at the convenient end.’ And which is the convenient end, seems, in my humble opinion to be left to every man’s conscience, or at least in the power of the chief magistrate to determine.
Jonathan Swift (Gulliver's Travels)
In the economic sphere too, the ability to hold a hammer or press a button is becoming less valuable than before. In the past, there were many things only humans could do. But now robots and computers are catching up, and may soon outperform humans in most tasks. True, computers function very differently from humans, and it seems unlikely that computers will become humanlike any time soon. In particular, it doesn’t seem that computers are about to gain consciousness, and to start experiencing emotions and sensations. Over the last decades there has been an immense advance in computer intelligence, but there has been exactly zero advance in computer consciousness. As far as we know, computers in 2016 are no more conscious than their prototypes in the 1950s. However, we are on the brink of a momentous revolution. Humans are in danger of losing their value, because intelligence is decoupling from consciousness. Until today, high intelligence always went hand in hand with a developed consciousness. Only conscious beings could perform tasks that required a lot of intelligence, such as playing chess, driving cars, diagnosing diseases or identifying terrorists. However, we are now developing new types of non-conscious intelligence that can perform such tasks far better than humans. For all these tasks are based on pattern recognition, and non-conscious algorithms may soon excel human consciousness in recognising patterns. This raises a novel question: which of the two is really important, intelligence or consciousness? As long as they went hand in hand, debating their relative value was just a pastime for philosophers. But in the twenty-first century, this is becoming an urgent political and economic issue. And it is sobering to realise that, at least for armies and corporations, the answer is straightforward: intelligence is mandatory but consciousness is optional.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
God was dead: to begin with. And romance was dead. Chivalry was dead. Poetry, the novel, painting, they were all dead, and art was dead. Theatre and cinema were both dead. Literature was dead. The book was dead. Modernism, postmodernism, realism and surrealism were all dead. Jazz was dead, pop music, disco, rap, classical music, dead. Culture was dead. Decency, society, family values were dead. The past was dead. History was dead. The welfare state was dead. Politics was dead. Democracy was dead. Communism, fascism, neoliberalism, capitalism, all dead, and marxism, dead, feminism, also dead. Political correctness, dead. Racism was dead. Religion was dead. Thought was dead. Hope was dead. Truth and fiction were both dead. The media was dead. The internet was dead. Twitter, instagram, facebook, google, dead. Love was dead. Death was dead. A great many things were dead. Some, though, weren’t, or weren’t dead yet. Life wasn’t yet dead. Revolution wasn’t dead. Racial equality wasn’t dead. Hatred wasn’t dead. But the computer? Dead. TV? Dead. Radio? Dead. Mobiles were dead. Batteries were dead. Marriages were dead, sex
Ali Smith (Winter (Seasonal #2))
Dataism adopts a strictly functional approach to humanity, appraising the value of human experiences according to their function in data-processing mechanisms. If we develop an algorithm that fulfils the same function better, human experiences will lose their value. Thus if we can replace not just taxi drivers and doctors but also lawyers, poets and musicians with superior computer programs, why should we care if these programs have no consciousness and no subjective experiences? If some humanist starts adulating the sacredness of human experience, Dataists would dismiss such sentimental humbug. ‘The experience you praise is just an outdated biochemical algorithm. In the African savannah 70,000 years ago, that algorithm was state-of-the-art. Even in the twentieth century it was vital for the army and for the economy. But soon we will have much better algorithms.’
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
In Maya Schenwar and Victoria Law’s book Prison by Any Other Name: The Harmful Consequences of Popular Reforms, we’ve learned how to send people into outer space and how to shrink a powerful computer into a device that fits into the palm of our hand, yet we haven’t yet learned how to face our racial history or how to tell the truth about the devastation wrought by colonialism, militarism, and global capitalism. We’ve learned how to develop powerful surveillance systems and how to build missiles that can reach halfway around the globe. But what have we learned about the true meaning of justice?
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
The novel had a framework made by thinking. The thought was that to divide off and compartmentalize living was dangerous and led to nothing but trouble. Old, young; black, white; men, women; capitalism, socialism; these dichotomies undo us, force us into unreal categorisation, make us look for what separates us rather than what we have in common. That was the thought, which made the shape or pattern of 'The Golden Notebook'. But the emotions were stronger than the thought. This is why I have always seen TGN as a failure: a failure in my terms, of what I had meant. For has this book changed by an iota our tendency to think like computers set to sort everything - people, ideas, history - into boxes? No, it has not. Yet why should I have such a hubristic thought? But I was in the grip of discovery, of revelation. I had only just seen this Truth: I was watching my own mind working like a sorting machine, and I was appalled.
Doris Lessing
Perhaps we are all living inside a giant computer simulation, Matrix-style. That would contradict all our national, religious and ideological stories. But our mental experiences would still be real. If it turns out that human history is an elaborate simulation run on a super-computer by rat scientists from the planet Zircon, that would be rather embarrassing for Karl Marx and the Islamic State. But these rat scientists would still have to answer for the Armenian genocide and for Auschwitz. How did they get that one past the Zircon University’s ethics committee? Even if the gas chambers were just electric signals in silicon chips, the experiences of pain, fear and despair were not one iota less excruciating for that. Pain is pain, fear is fear, and love is love – even in the matrix. It doesn’t matter if the fear you feel is inspired by a collection of atoms in the outside world or by electrical signals manipulated by a computer. The fear is still real. So if you want to explore the reality of your mind, you can do that inside the matrix as well as outside it.
Yuval Noah Harari (21 Lessons for the 21st Century)
April 26th, 2014 is not only the day of the Alamogordo dig, it’s also my mother’s 78th birthday. How perfect is that? Without her, I wouldn’t be here. Of course, with her I might not be here either. She didn’t want me to go to Atari. When I announced I was leaving Hewlett-Packard to go make games, she told me I was throwing my life away. She told me I wasn’t her son, because no child of hers would do such a stupid thing. She came around though. After I made several million-sellers and put an addition on her home, she told me it was a good thing I had listened to her and gone into computers. This may shed some light on how my background prepared me for becoming a therapist, and before that a client. After all, if it weren’t for families, there would be no therapists.
Howard Scott Warshaw (Once Upon Atari: How I made history by killing an industry)
the average forager had wider, deeper and more varied knowledge of her immediate surroundings than most of her modern descendants. Today, most people in industrial societies don’t need to know much about the natural world in order to survive. What do you really need to know in order to get by as a computer engineer, an insurance agent, a history teacher or a factory worker? You need to know a lot about your own tiny field of expertise, but for the vast majority of life’s necessities you rely blindly on the help of other experts, whose own knowledge is also limited to a tiny field of expertise. The human collective knows far more today than did the ancient bands. But at the individual level, ancient foragers were the most knowledgeable and skilful people in history. There
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
Since then, several other conjectures have been resolved with the aid of computers (notably, in 1988, the nonexistence of a projective plane of order 10). Meanwhile, mathematicians have tidied up the Haken-Appel argument so that the computer part is much shorter, and some still hope that a traditional, elegant, and illuminating proof of the four-color theorem will someday be found. It was the desire for illumination, after all, that motivated so many to work on the problem, even to devote their lives to it, during its long history. (One mathematician had his bride color maps on their honeymoon.) Even if the four-color theorem is itself mathematically otiose, a lot of useful mathematics got created in failed attempts to prove it, and it has certainly made grist for philosophers in the last few decades. As for its having wider repercussions, I’m not so sure. When I looked at the map of the United States in the back of a huge dictionary that I once won in a spelling bee for New York journalists, I noticed with mild surprise that it was colored with precisely four colors. Sadly, though, the states of Arkansas and Louisiana, which share a border, were both blue.
Jim Holt (When Einstein Walked with Gödel: Excursions to the Edge of Thought)
Even mighty states and kingdoms are not exempted. If we look into history, we shall find some nations rising from contemptible beginnings and spreading their influence, until the whole globe is subjected to their ways. When they have reached the summit of grandeur, some minute and unsuspected cause commonly affects their ruin, and the empire of the world is transferred to some other place. Immortal Rome was at first but an insignificant village, inhabited only by a few abandoned ruffians, but by degrees it rose to a stupendous height, and excelled in arts and arms all the nations that preceded it. But the demolition of Carthage (what one should think should have established it in supreme dominion) by removing all danger, suffered it to sink into debauchery, and made it at length an easy prey to Barbarians. England immediately upon this began to increase (the particular and minute cause of which I am not historian enough to trace) in power and magnificence, and is now the greatest nation upon the globe. Soon after the reformation a few people came over into the new world for conscience sake. Perhaps this (apparently) trivial incident may transfer the great seat of empire into America. It looks likely to me. For if we can remove the turbulent Gallics, our people according to exactest computations, will in another century, become more numerous than England itself. Should this be the case, since we have (I may say) all the naval stores of the nation in our hands, it will be easy to obtain the mastery of the seas, and then the united force of all Europe will not be able to subdue us. The only way to keep us from setting up for ourselves is to disunite us. Divide et impera. Keep us in distinct colonies, and then, some great men from each colony, desiring the monarchy of the whole, they will destroy each others' influence and keep the country in equilibrio. Be not surprised that I am turned into politician. The whole town is immersed in politics.
John Adams
Technology, I said before, is most powerful when it enables transitions—between linear and circular motion (the wheel), or between real and virtual space (the Internet). Science, in contrast, is most powerful when it elucidates rules of organization—laws—that act as lenses through which to view and organize the world. Technologists seek to liberate us from the constraints of our current realities through those transitions. Science defines those constraints, drawing the outer limits of the boundaries of possibility. Our greatest technological innovations thus carry names that claim our prowess over the world: the engine (from ingenium, or “ingenuity”) or the computer (from computare, or “reckoning together”). Our deepest scientific laws, in contrast, are often named after the limits of human knowledge: uncertainty, relativity, incompleteness, impossibility. Of all the sciences, biology is the most lawless; there are few rules to begin with, and even fewer rules that are universal. Living beings must, of course, obey the fundamental rules of physics and chemistry, but life often exists on the margins and interstices of these laws, bending them to their near-breaking limit. The universe seeks equilibriums; it prefers to disperse energy, disrupt organization, and maximize chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organize chemicals into compartments; we sort laundry on Wednesdays. “It sometimes seems as if curbing entropy is our quixotic purpose in the universe,” James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses.
Siddhartha Mukherjee (The Gene: An Intimate History)
If patterns of ones and zeros were “like” patterns of human lives and deaths, if everything about an individual could be represented in a computer record by a long string of ones and zeros, then what kind of creature would be represented by a long string of lives and deaths? It would have to be up one level at least—an angel, a minor god, something in a UFO. It would take eight human lives and deaths just to form one character in this being’s name—its complete dossier might take up a considerable piece of the history of the world. We are digits in God’s computer, she not so much thought as hummed to herself to a sort of standard gospel tune, And the only thing we’re good for, to be dead or to be living, is the only thing He sees. What we cry, what we contend for, in our world of toil and blood, it all lies beneath the notice of the hacker we call God.
Thomas Pynchon (Vineland)
Historians are wont to name technological advances as the great milestones of culture, among them the development of the plow, the discovery of smelting and metalworking, the invention of the clock, printing press, steam power, electric engine, lightbulb, semiconductor, and computer. But possibly even more transforming than any of these was the recognition by Greek philosophers and their intellectual descendants that human beings could examine, comprehend, and eventually even guide or control their own thought process, emotions, and resulting behavior. With that realization we became something new and different on earth: the only animal that, by examining its own cerebration and behavior, could alter them. This, surely, was a giant step in evolution. Although we are physically little different from the people of three thousand years ago, we are culturally a different species. We are the psychologizing animal.
Morton Hunt (The Story of Psychology)
In history and in evolution, progress is always a futile, Sisyphean struggle to stay in the same relative place by getting ever better at things. Cars move through the congested streets of London no faster than horse-drawn carriages did a century ago. Computers have no effect on productivity because people learn to complicate and repeat tasks that have been made easier.13 This concept, that all progress is relative, has come to be known in biology by the name of the Red Queen, after a chess piece that Alice meets in Through the Looking-Glass, who perpetually runs without getting very far because the landscape moves with her. It is an increasingly influential idea in evolutionary theory, and one that will recur throughout the book. The faster you run, the more the world moves with you and the less you make progress. Life is a chess tournament in which if you win a game, you start the next game with the handicap of a missing pawn.
Matt Ridley (The Red Queen: Sex and the Evolution of Human Nature)
Yet, ironically, the most tech-cautious parents are the people who invented our iCulture. People are shocked to find out that tech god Steve Jobs was a low-tech parent; in 2010, when a reporter suggested that his children must love the just-released iPad, he replied: “They haven’t used it. We limit how much technology our kids use at home.” In a September 10, 2014, New York Times article, his biographer Walter Isaacson revealed: “Every evening Steve made a point of having dinner at the big long table in their kitchen, discussing books and history and a variety of things. No one ever pulled out an iPad or computer.” Years earlier, in an interview for Wired magazine, Jobs expressed a very clear anti-tech-in-the-classroom opinion as well—after having once believed that technology was the educational panacea: “I’ve probably spearheaded giving away more computer equipment to schools than anybody on the planet. But I’ve come to the conclusion that the problem is not one that technology can hope to solve. What’s wrong with education cannot be fixed with technology. No amount of technology will make a dent.”
Nicholas Kardaras (Glow Kids: How Screen Addiction Is Hijacking Our Kids - and How to Break the Trance)
...Now let's set the record straight. There's no argument over the choice between peace and war, but there's only one guaranteed way you can have peace—and you can have it in the next second—surrender. Admittedly, there's a risk in any course we follow other than this, but every lesson of history tells us that the greater risk lies in appeasement, and this is the specter our well-meaning liberal friends refuse to face—that their policy of accommodation is appeasement, and it gives no choice between peace and war, only between fight or surrender. If we continue to accommodate, continue to back and retreat, eventually we have to face the final demand—the ultimatum. And what then—when Nikita Khrushchev has told his people he knows what our answer will be? He has told them that we're retreating under the pressure of the Cold War, and someday when the time comes to deliver the final ultimatum, our surrender will be voluntary, because by that time we will have been weakened from within spiritually, morally, and economically. He believes this because from our side he's heard voices pleading for "peace at any price" or "better Red than dead," or as one commentator put it, he'd rather "live on his knees than die on his feet." And therein lies the road to war, because those voices don't speak for the rest of us. You and I know and do not believe that life is so dear and peace so sweet as to be purchased at the price of chains and slavery. If nothing in life is worth dying for, when did this begin—just in the face of this enemy? Or should Moses have told the children of Israel to live in slavery under the pharaohs? Should Christ have refused the cross? Should the patriots at Concord Bridge have thrown down their guns and refused to fire the shot heard 'round the world? The martyrs of history were not fools, and our honored dead who gave their lives to stop the advance of the Nazis didn't die in vain. Where, then, is the road to peace? Well it's a simple answer after all. 
You and I have the courage to say to our enemies, "There is a price we will not pay." "There is a point beyond which they must not advance." And this—this is the meaning in the phrase of Barry Goldwater's "peace through strength." Winston Churchill said, "The destiny of man is not measured by material computations. When great forces are on the move in the world, we learn we're spirits—not animals." And he said, "There's something going on in time and space, and beyond time and space, which, whether we like it or not, spells duty." You and I have a rendezvous with destiny. We'll preserve for our children this, the last best hope of man on earth, or we'll sentence them to take the last step into a thousand years of darkness...
Ronald Reagan (Speaking My Mind: Selected Speeches)
a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.”111 Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
But the period I studied -- the rollicking eighteenth century engraved by Hogarth -- was the one that saw the birth of America, of women's rights, and of the novel. The novel started as a low-class form, fit only to be read by serving maids, and it is the only literary form where women have distinguished themselves so early and with such excellence that even the rampant misogyny of literary history cannot erase them. Ever wonder about women and the novel? Women, like any underclass, depend for their survival on self-definition. The novel permitted this -- and pages could still be hidden under the embroidery hoop. From the writer's mind to the reader's there was only the intervention of printing presses. You could stay at home, yet send your book abroad to London -- the perfect situation for women. In a world where women are still the second sex, many still dream of becoming writers so they can work at home, make their own hours, nurse the baby. Writing still seems to fit into the interstices of a woman's life. Through the medium of words, we have hopes of changing our class. Perhaps the pen will not always be equated with the penis. In a world of computers, our swift fingers may yet win us the world. One of these days we'll have class. And so we write as feverishly as only the dispossessed can. We write to come into our own, to build our houses and plant our gardens, to give ourselves names and histories, inventing ourselves as we go along.
Erica Jong (Fear of Fifty: A Midlife Memoir)
The eyes have been used to signify a perverse capacity - honed to perfection in the history of science tied to militarism, capitalism, colonialism, and male supremacy - to distance the knowing subject from everybody and everything in the interests of unfettered power. The instruments of visualization in multinationalist, postmodernist culture have compounded these meanings of dis-embodiment. The visualizing technologies are without apparent limit; the eye of any ordinary primate like us can be endlessly enhanced by sonography systems, magnetic resonance imaging, artificial intelligence-linked graphic manipulation systems, scanning electron microscopes, computer-aided tomography scanners, colour enhancement techniques, satellite surveillance systems, home and office VDTs, cameras for every purpose from filming the mucous membrane lining the gut cavity of a marine worm living in the vent gases on a fault between continental plates to mapping a planetary hemisphere elsewhere in the solar system. Vision in this technological feast becomes unregulated gluttony; all perspective gives way to infinitely mobile vision, which no longer seems just mythically about the god-trick of seeing everything from nowhere, but to have put the myth into ordinary practice. And like the god-trick, this eye fucks the world to make techno-monsters. Zoe Sofoulis (1988) calls this the cannibal-eye of masculinist extra-terrestrial projects for excremental second birthing.
Donna J. Haraway (Simians, Cyborgs, and Women: The Reinvention of Nature)
There’s an old phrase,” Matthew says. “Knowledge is power. Power to do evil, like Jeanine…or power to do good, like what we’re doing. Power itself is not evil. So knowledge itself is not evil.” “I guess I grew up suspicious of both. Power and knowledge,” I say. “To the Abnegation, power should only be given to people who don’t want it.” “There’s something to that,” Matthew says. “But maybe it’s time to grow out of that suspicion.” He reaches under the desk and takes out a book. It is thick, with a worn cover and frayed edges. On it is printed HUMAN BIOLOGY. “It’s a little rudimentary, but this book helped to teach me what it is to be human,” he says. “To be such a complicated, mysterious piece of biological machinery, and more amazing still, to have the capacity to analyze that machinery! That is a special thing, unprecedented in all of evolutionary history. Our ability to know about ourselves and the world is what makes us human.” He hands me the book and turns back to the computer. I look down at the worn cover and run my fingers along the edge of the pages. He makes the acquisition of knowledge feel like a secret, beautiful thing, and an ancient thing. I feel like, if I read this book, I can reach backward through all the generations of humanity to the very first one, whenever it was--that I can participate in something many times larger and older than myself. “Thank you,” I say, and it’s not for the book. It’s for giving something back to me, something I lost before I was able to really have it.
Veronica Roth (Allegiant (Divergent, #3))
In theory, if some holy book misrepresented reality, its disciples would sooner or later discover this, and the text’s authority would be undermined. Abraham Lincoln said you cannot deceive everybody all the time. Well, that’s wishful thinking. In practice, the power of human cooperation networks depends on a delicate balance between truth and fiction. If you distort reality too much, it will weaken you, and you will not be able to compete against more clear-sighted rivals. On the other hand, you cannot organise masses of people effectively without relying on some fictional myths. So if you stick to unalloyed reality, without mixing any fiction with it, few people will follow you. If you used a time machine to send a modern scientist to ancient Egypt, she would not be able to seize power by exposing the fictions of the local priests and lecturing the peasants on evolution, relativity and quantum physics. Of course, if our scientist could use her knowledge in order to produce a few rifles and artillery pieces, she could gain a huge advantage over pharaoh and the crocodile god Sobek. Yet in order to mine iron ore, build blast furnaces and manufacture gunpowder the scientist would need a lot of hard-working peasants. Do you really think she could inspire them by explaining that energy divided by mass equals the speed of light squared? If you happen to think so, you are welcome to travel to present-day Afghanistan or Syria and try your luck. Really powerful human organisations – such as pharaonic Egypt, the European empires and the modern school system – are not necessarily clear-sighted. Much of their power rests on their ability to force their fictional beliefs on a submissive reality. That’s the whole idea of money, for example. The government makes worthless pieces of paper, declares them to be valuable and then uses them to compute the value of everything else. 
The government has the power to force citizens to pay taxes using these pieces of paper, so the citizens have no choice but to get their hands on at least some of them. Consequently, these bills really do become valuable, the government officials are vindicated in their beliefs, and since the government controls the issuing of paper money, its power grows. If somebody protests that ‘These are just worthless pieces of paper!’ and behaves as if they are only pieces of paper, he won’t get very far in life.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.”28 This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.”29 It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. 
There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone?
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)