Modern Computers Quotes

We've searched our database for all the quotes and captions related to Modern Computers. Here they are! All 100 of them:

Now, 75 years [after To Kill a Mockingbird], in an abundant society where people have laptops, cell phones, iPods, and minds like empty rooms, I still plod along with books. [Open Letter, O Magazine, July 2006]
Harper Lee
Tess realized one of the great modern dating sadnesses: everyone is so used to the comforting glow of the computer screen that no one can go so far as to say "good morning" in public without being liquored up.
Amelia Gray (AM/PM)
Faith does not protect you. Medicine and airbags... Those are the things that protect you. God does not protect you. Intelligence protects you. Enlightenment. Put your faith in something with tangible results. How long has it been since someone walked on water? Modern miracles belong to science... Computers, vaccines, space stations... Even the divine miracle of creation. Matter from nothing... In a lab. Who needs God? No! Science is God!
Dan Brown (Angels & Demons (Robert Langdon, #1))
But what is the way forward? I know what it isn't. It's not, as we once believed, plenty to eat and a home with all the modern conveniences. It's not a 2,000-mile-long wall to keep Mexicans out or more accurate weapons to kill them. It's not a better low-fat meal or a faster computer speed. It's not a deodorant, a car, a soft drink, a skin cream. The way forward is found on a path through the wilderness of the head and heart---reason and emotion. Thinking, knowing, understanding.
Laurence Gonzales (Everyday Survival: Why Smart People Do Stupid Things)
Roman influence seeds itself, sprouting mighty oaks right through the modern forest of computers, digital disks, microviruses and space satellites.
Anne Rice (Pandora (New Tales of the Vampires, #1))
On the first day of college you will worry about how you will do inside the college, and on the last day of college you will wonder what you will do outside the college.
Amit Kalantri
The problem with being an author in this modern world is such: computers break often; books don't.
Emma Iadanza
The spectacular thing about Johnny [von Neumann] was not his power as a mathematician, which was great, or his insight and his clarity, but his rapidity; he was very, very fast. And like the modern computer, which no longer bothers to retrieve the logarithm of 11 from its memory (but, instead, computes the logarithm of 11 each time it is needed), Johnny didn't bother to remember things. He computed them. You asked him a question, and if he didn't know the answer, he thought for three seconds and would produce an answer.
Paul R. Halmos
Computers bootstrap their own offspring, grow so wise and incomprehensible that their communiqués assume the hallmarks of dementia: unfocused and irrelevant to the barely-intelligent creatures left behind. And when your surpassing creations find the answers you asked for, you can't understand their analysis and you can't verify their answers. You have to take their word on faith.
Peter Watts (Blindsight (Firefall, #1))
Real arms races are run by highly intelligent, bespectacled engineers in glass offices thoughtfully designing shiny weapons on modern computers. But there's no thinking in the mud and cold of nature's trenches. At best, weapons thrown together amidst the explosions and confusion of smoky battlefields are tiny variations on old ones, held together by chewing gum. If they don't work, then something else is thrown at the enemy, including the kitchen sink - there's nothing "progressive" about that. At its usual worst, trench warfare is fought by attrition. If the enemy can be stopped or slowed by burning your own bridges and bombing your own radio towers and oil refineries, then away they go. Darwinian trench warfare does not lead to progress - it leads back to the Stone Age.
Michael J. Behe (The Edge of Evolution: The Search for the Limits of Darwinism)
Brains are like representative democracies. They are built of multiple, overlapping experts who weigh in and compete over different choices. As Walt Whitman correctly surmised, we are large and we harbor multitudes within us. And those multitudes are locked in chronic battle. There is an ongoing conversation among the different factions in your brain, each competing to control the single output channel of your behavior. As a result, you can accomplish the strange feats of arguing with yourself, cursing at yourself, and cajoling yourself to do something – feats that modern computers simply do not do.
David Eagleman (Incognito: The Secret Lives of the Brain)
The Air Force’s demand for self-contained, inertial guidance systems played a leading role in the miniaturization of computers and the development of integrated circuits, the building blocks of the modern electronics industry.
Eric Schlosser (Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety)
I live in nature where everything is connected, circular. The seasons are circular. The planet is circular, and so is the planet around the sun. The course of water over the earth is circular coming down from the sky and circulating through the world to spread life and then evaporating up again. I live in a circular teepee and build my fire in a circle. The life cycles of plants and animals are circular. I live outside where I can see this. The ancient people understood that our world is a circle, but we modern people have lost sight of that. I don’t live inside buildings because buildings are dead places where nothing grows, where water doesn’t flow, and where life stops. I don’t want to live in a dead place. People say that I don’t live in a real world, but it’s modern Americans who live in a fake world, because they have stepped outside the natural circle of life. Do people live in circles today? No. They live in boxes. They wake up every morning in a box of their bedrooms because a box next to them started making beeping noises to tell them it was time to get up. They eat their breakfast out of a box and then they throw that box away into another box. Then they leave the box where they live and get into another box with wheels and drive to work, which is just another big box broken into little cubicle boxes where a bunch of people spend their days sitting and staring at the computer boxes in front of them. When the day is over, everyone gets into the box with wheels again and goes home to the house boxes and spends the evening staring at the television boxes for entertainment. They get their music from a box, they get their food from a box, they keep their clothing in a box, they live their lives in a box. Break out of the box! This is not the way humanity lived for thousands of years.
Elizabeth Gilbert (The Last American Man)
Rife's key realization was that there's no difference between modern culture and Sumerian. We have a huge workforce that is illiterate or aliterate and relies on TV-which is sort of an oral tradition. And we have a small, extremely literate power elite-the people who go into the Metaverse, basically-who understand that information is power, and who control society because they have this semimystical ability to speak magic computer languages.
Neal Stephenson (Snow Crash)
Search engines find the information, not necessarily the truth.
Amit Kalantri
When outsiders claim that we are unchristian, it is a reflection of this jumbled (and predominantly negative) set of perceptions. When they see Christians not acting like Jesus, they quickly conclude that the group deserves an unchristian label. Like a corrupted computer file or a bad photocopy, Christianity, they say, is no longer in pure form, and so they reject it. One quarter of outsiders say their foremost perception of Christianity is that the faith has changed for the worse. It has gotten off-track and is not what Christ intended. Modern-day Christianity no longer seems Christian.
David Kinnaman (unChristian: What a New Generation Really Thinks about Christianity... and Why It Matters)
My life has taught me that true spiritual insight can come about only through direct experience, the way a severe burn can be attained only by putting your hand in the fire. Faith is nothing more than a watered-down attempt to accept someone else's insight as your own. Belief is the psychic equivalent of an article of secondhand clothing, worn-out and passed down. I equate true spiritual insight with wisdom, which is different from knowledge. Knowledge can be obtained through many sources: books, stories, songs, legends, myths, and, in modern times, computers and television programs. On the other hand, there's only one real source of wisdom - pain. Any experience that provides a person with wisdom will also usually provide them with a scar. The greater the pain, the greater the realization. Faith is spiritual rigor mortis.
Damien Echols (Life After Death)
I know it's common for old people to complain about the modern moment, and lament the passing of a golden age when children were polite and you could buy a kilo of meat for pennies, but in our case, my boy, I think I am not mistaken when I say that something fundamental has changed about the world in which we live. We have reached a state of constant reinvention. Revolutions have moved off the battlefield and on to home computers.
G. Willow Wilson (Alif the Unseen)
virus writers lack the basic social and moral values and the “well-formed consciousness” that are the hallmarks of civilized modern societies.
Peter H. Gregory (Computer Viruses For Dummies)
It had as many immoralities as the machine of today has virtues. After a year or two I found that it was degrading my character, so I thought I would give it to Howells.
Mark Twain (The $30,000 Bequest and Other Stories)
The I.B.M. machine has no ethic of its own; what it does is enable one or two people to do the computing work that formerly required many more people. If people often use it stupidly, it's their stupidity, not the machine's, and a return to the abacus would not exorcise the failing. People can be treated as drudges just as effectively without modern machines.
William H. Whyte (The Organization Man)
You think this is the first time such a thing has happened? Don’t you know about oxygen?” “I know it’s necessary for life.” “It is now,” Malcolm said. “But oxygen is actually a metabolic poison. It’s a corrosive gas, like fluorine, which is used to etch glass. And when oxygen was first produced as a waste product by certain plant cells—say, around three billion years ago—it created a crisis for all other life on our planet. Those plant cells were polluting the environment with a deadly poison. They were exhaling a lethal gas, and building up its concentration. A planet like Venus has less than one percent oxygen. On earth, the concentration of oxygen was going up rapidly—five, ten, eventually twenty-one percent! Earth had an atmosphere of pure poison! Incompatible with life!” Hammond looked irritated. “So what is your point? That modern pollutants will be incorporated, too?” “No,” Malcolm said. “My point is that life on earth can take care of itself. In the thinking of a human being, a hundred years is a long time. A hundred years ago, we didn’t have cars and airplanes and computers and vaccines.… It was a whole different world. But to the earth, a hundred years is nothing. A million years is nothing. This planet lives and breathes on a much vaster scale. We can’t imagine its slow and powerful rhythms, and we haven’t got the humility to try. We have been residents here for the blink of an eye. If we are gone tomorrow, the earth will not miss us.
Michael Crichton (Jurassic Park (Jurassic Park, #1))
The moderns, carrying little baggage of the kind that Shelley called "merely cultural," not even living in the traditional air, but breathing into their space helmets a scientific mixture of synthetic gases (and polluted at that) are the true pioneers. Their circuitry seems to include no atavistic domestic sentiment, they have suffered empathectomy, their computers hum no ghostly feedback of Home, Sweet Home. How marvelously free they are! How unutterably deprived!
Wallace Stegner (Angle of Repose)
We're at a crucial point in history. We cannot have fast cars, computers the size of credit cards, and modern conveniences, whilst simultaneously having clean air, abundant rainforests, fresh drinking water and a stable climate. This generation can have one or the other but not both. Humanity must make a choice. Both have an opportunity cost. Gadgetry or nature? Pick the wrong one and the next generations may have neither.
Mark Boyle (The Moneyless Man: A Year of Freeconomic Living)
As the old computer-science joke goes: “Let’s say you have a problem, and you decide to solve it with regular expressions. Well, now you have two problems.”
Ryan Mitchell (Web Scraping with Python: Collecting Data from the Modern Web)
Higher-order functions allow us to abstract over actions, not just values.
Marijn Haverbeke (Eloquent JavaScript: A Modern Introduction to Programming)
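The line above is the one genuinely technical aphorism in this collection, so a minimal sketch in the book's own language may help: a higher-order function takes another function as an argument, letting you abstract over an *action* (what to do) rather than just a *value* (what to do it to). The `repeat` helper below is an illustrative example, not a built-in.

```javascript
// Higher-order function: "repeat" fixes how many times to act,
// but leaves the action itself as a parameter.
function repeat(n, action) {
  for (let i = 0; i < n; i++) {
    action(i);
  }
}

// The same machinery handles very different actions:
const squares = [];
repeat(4, i => squares.push(i * i));

console.log(squares); // [0, 1, 4, 9]
```

Passing a plain value would only parameterize the data; passing `i => squares.push(i * i)` parameterizes the behavior, which is what the quote means by abstracting over actions.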
Google attracts the best talents of the world; why would their cloud computing not be the most secure?
Enamul Haque (The Ultimate Modern Guide to Digital Transformation: The "Evolve or Die" thing clarified in a simpler way)
Cloud computing is like tap water; you only use it when you need it, but it is available 24/7.
Enamul Haque (The Ultimate Modern Guide to Cloud Computing)
Yet the possibility of information storage, beyond what men and governments ever had before, can make available at the touch of a button a man's total history (including remarks put on his record by his kindergarten teacher about his ability and character). And with the computer must be placed the modern scientific technical capability which exists for wholesale monitoring of telephone, cable, Telex and microwave transmissions which carry much of today's spoken and written communications. The combined use of the technical capability of listening in on all these forms of communications with the high-speed computer literally leaves no place to hide and little room for privacy.
Francis A. Schaeffer (How Should We Then Live? The Rise and Decline of Western Thought and Culture)
How much computing power could we achieve if the entire world population stopped whatever we are doing right now and started doing calculations? How would it compare to a modern-day computer or smartphone?
Randall Munroe (What If?: Serious Scientific Answers to Absurd Hypothetical Questions)
The suppression of ecstasy and condemnation of pleasure by patriarchal religion have left us in a deep, festering morass. The pleasures people seek in modern times are superficial, venal, and corrupt. This is deeply unfortunate, for it justifies the patriarchal condemnation of pleasure that rotted out our hedonistic capacities in the first place! Narcissism is rampant, having reached a truly global scale. It now appears to have entered the terminal phase known as “cocooning,” the ultimate state of isolation. Dissociation from the natural world verges on complete disembodiment, represented in Archontic ploys such as “transhumanism,” cloning, virtual reality, and the uploading of human consciousness into cyberspace. The computer looks due to replace the cross as the primary image of salvation. It is already the altar where millions worship daily. If the technocrats prevail, artificial intelligence and artificial life will soon overrule the natural order of the planet.
John Lamb Lash
Nonetheless, the appeal of Copenhagen makes some sense, seen in this light. Quantum physics drove much of the technological and scientific progress of the past ninety years: nuclear power, modern computers, the Internet. Quantum-driven medical imaging changed the face of health care; quantum imaging techniques at smaller scales have revolutionized biology and kicked off the entirely new field of molecular genetics. The list goes on. Make some kind of personal peace with Copenhagen, and contribute to this amazing revolution in science . . . or take quantum physics seriously, and come face-to-face with a problem that even Einstein couldn't solve. Shutting up never looked so good.
Adam Becker (What Is Real?: The Unfinished Quest for the Meaning of Quantum Physics)
One of the great challenges in healthcare technology is that medicine is at once an enormous business and an exquisitely human endeavor; it requires the ruthless efficiency of the modern manufacturing plant and the gentle hand-holding of the parish priest; it is about science, but also about art; it is eminently quantifiable and yet stubbornly not.
Robert M. Wachter (The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age)
In many schools today, the phrase "computer-aided instruction" means making the computer teach the child. One might say the computer is being used to program the child. In my vision, the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building.
Seymour Papert (Mindstorms: Children, Computers, And Powerful Ideas)
the Maya knew the time taken by the moon to orbit the earth. Their estimate of this period was 29.528395 days – extremely close to the true figure of 29.530588 days computed by the finest modern methods. The Mayan priests also had in their possession very accurate tables for the prediction of solar and lunar eclipses and were aware that these could occur only within plus or minus eighteen days of the node (when the moon’s path crosses the apparent path of the sun).
Graham Hancock (Fingerprints of the Gods: The Evidence of Earth's Lost Civilization)
Individually, the experience of most people was of accelerating impotence and incomprehension. They lived in a world of superstition. They relied on voodoo - charms, fetishes, and crystal balls whose caprices they were helpless to govern, yet without which the conduct of daily life came to a standstill. Faith that the computer would switch on one more time and do as it was asked had more a religious than a rational cast. When the screen went black, the gods were angry.
Lionel Shriver (So Much for That)
In the modern computer, software has developed in such a way as to fill this role of go-between. On one end you have the so-called end user who wants to be able to order up a piece of long division, say, simply by supplying two numbers to the machine and ordering it to divide them. At the other end stands the actual computer, which for all its complexity is something of a brute. It can perform only several hundred basic operations, and long division may not be one of them. The machine may have to be instructed to perform a sequence of several of its basic operations in order to accomplish a piece of long division. Software—a series of what are known as programs—translates the end user’s wish into specific, functional commands for the machine.
Tracy Kidder (The Soul of a New Machine)
A billion hours ago, modern Homo sapiens emerged. A billion minutes ago, Christianity began. A billion seconds ago, the IBM personal computer was released. A billion Google searches ago… was this morning. —HAL VARIAN, GOOGLE’S CHIEF ECONOMIST, DECEMBER 20, 2013
Laszlo Bock (Work Rules!: Insights from Inside Google That Will Transform How You Live and Lead)
It’s like we've been flung back in time," he said. "Here we are in the Stone Age, knowing all these great things after centuries of progress but what can we do to make life easier for the Stone Agers? Can we make a refrigerator? Can we even explain how it works? What is electricity? What is light? We experience these things every day of our lives but what good does it do if we find ourselves hurled back in time and we can’t even tell people the basic principles much less actually make something that would improve conditions. Name one thing you could make. Could you make a simple wooden match that you could strike on a rock to make a flame? We think we’re so great and modern. Moon landings, artificial hearts. But what if you were hurled into a time warp and came face to face with the ancient Greeks. The Greeks invented trigonometry. They did autopsies and dissections. What could you tell an ancient Greek that he couldn’t say, ‘Big Deal.’ Could you tell him about the atom? Atom is a Greek word. The Greeks knew that the major events in the universe can’t be seen by the eye of man. It’s waves, it’s rays, it’s particles." “We’re doing all right.” “We’re sitting in this huge moldy room. It’s like we’re flung back.” “We have heat, we have light.” “These are Stone Age things. They had heat and light. They had fire. They rubbed flints together and made sparks. Could you rub flints together? Would you know a flint if you saw one? If a Stone Ager asked you what a nucleotide is, could you tell him? How do we make carbon paper? What is glass? If you came awake tomorrow in the Middle Ages and there was an epidemic raging, what could you do to stop it, knowing what you know about the progress of medicines and diseases? Here it is practically the twenty-first century and you’ve read hundreds of books and magazines and seen a hundred TV shows about science and medicine. 
Could you tell those people one little crucial thing that might save a million and a half lives?” “‘Boil your water,’ I’d tell them.” “Sure. What about ‘Wash behind your ears.’ That’s about as good.” “I still think we’re doing fairly well. There was no warning. We have food, we have radios.” “What is a radio? What is the principle of a radio? Go ahead, explain. You’re sitting in the middle of this circle of people. They use pebble tools. They eat grubs. Explain a radio.” “There’s no mystery. Powerful transmitters send signals. They travel through the air, to be picked up by receivers.” “They travel through the air. What, like birds? Why not tell them magic? They travel through the air in magic waves. What is a nucleotide? You don’t know, do you? Yet these are the building blocks of life. What good is knowledge if it just floats in the air? It goes from computer to computer. It changes and grows every second of every day. But nobody actually knows anything.
Don DeLillo (White Noise)
In the modern culture of speed, we seem to not do anything fully. We are half watching television and half using the computer; we are driving while talking on the phone; we have a hard time having even one conversation; when we sit down to eat, we are reading a newspaper and watching television, and even when we watch television, we are flipping through channels. This quality of speed gives life a superficial feeling: we never experience anything fully. We engage ourselves in these activities in order to live a full life, but being speedy
Sakyong Mipham (Running with the Mind of Meditation: Lessons for Training Body and Mind)
It is our glory to use artificial intelligence, swarm drones, quantum computing and other modern technologies for removing the pain of humanity. But it is a big disaster for scientists and humanity to use these technologies for developing mass destruction weapons.
Amit Ray (Compassionate Artificial Intelligence: Frameworks and Algorithms)
We spend so much of our time in car seats, in desk chairs, at computers, and peering at our various devices that modern life sometimes seems like an all-out assault on the integrity of our spine. The spine has three parts: lumbar (lower back), thoracic (midback), and cervical (neck) spine. Radiologists see so much degeneration in the cervical spine, brought on by years of hunching forward to look at phones, that they have a name for it: “tech neck.
Peter Attia (Outlive: The Science and Art of Longevity)
It is too easy to think that ‘science’ is what happens now, that modernity and scientific thought are inseparable. Yet as Laura Snyder so brilliantly shows in this riveting picture of the first heroic age, the nineteenth century saw the invention of the computer, of electrical impulses, the harnessing of the power of steam – the birth of railways, statistics and technology. In ‘The Philosophical Breakfast Club’ she draws an endearing – almost domestic – picture of four scientific titans, and shows how – through their very ‘clubbability’ – they created the scientific basis on which the modern world stands.
Judith Flanders (Inside the Victorian Home: A Portrait of Domestic Life in Victorian England)
The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines. Then, beginning in the textile industry, inventors found ways to mechanize steps so that they could be performed by machines, many of them powered by steam engines. Babbage, building on ideas from Pascal and Leibniz, tried to apply these two processes to the production of computations, creating a mechanical precursor to the modern computer. His most significant conceptual leap was that such machines did not have to be set to do only one process, but instead could be programmed and reprogrammed through the use of punch cards. Ada saw the beauty and significance of that enchanting notion, and she also described an even more exciting idea that derived from it: such machines could process not only numbers but anything that could be notated in symbols.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution)
Within the next couple of years, whatever your job, you will be able to consult an on-demand expert, ask it about your latest ad campaign or product design, quiz it on the specifics of a legal dilemma, isolate the most effective elements of a pitch, solve a thorny logistical question, get a second opinion on a diagnosis, keep probing and testing, getting ever more detailed answers grounded in the very cutting edge of knowledge, delivered with exceptional nuance. All of the world’s knowledge, best practices, precedent, and computational power will be available, tailored to you, to your specific needs and circumstances, instantaneously and effortlessly. It is a leap in cognitive potential at least as great as the introduction of the internet. And that is before you even get into the implications of something like ACI and the Modern Turing Test.
Mustafa Suleyman (The Coming Wave: Technology, Power, and the Twenty-first Century's Greatest Dilemma)
Though a modern woman’s computer is more intimate than her bed...
Elizaveta Mikhailichenko (Preemptive Revenge)
Computer science is no more about computers than astronomy is about telescopes.” —Edsger W. Dijkstra
Peter Gottschling (Discovering Modern C++: An Intensive Course for Scientists, Engineers, and Programmers (C++ In-Depth))
Computer science is like learning to speak a new language, but instead of talking to people, you're talking to computers.
Enamul Haque (The Ultimate Modern Guide to Artificial Intelligence: Including Machine Learning, Deep Learning, IoT, Data Science, Robotics, The Future of Jobs, Required Upskilling and Intelligent Industries)
In the happy land of elegant code and pretty rainbows, there lives a spoil-sport monster called inefficiency.
Marijn Haverbeke (Eloquent JavaScript: A Modern Introduction to Programming)
When all is said and done, the invention of writing must be reckoned not only as a brilliant innovation but as a surpassing good for humanity. And assuming that we survive long enough to use their inventions wisely, I believe the same will be said of the modern Thoths and Prometheuses who are today devising computers and programs at the edge of machine intelligence.
Carl Sagan (The Dragons of Eden: Speculations on the Evolution of Human Intelligence)
… it had almost nothing to do with computers, the modernity I was trying to understand. Computers were the bones, but imagination, ambition and possibility were the blood. These kids, they simply did not accept that the world as it is has any special gravity, any hold upon us. If something was wrong, if it was bad, then that something was to be fixed, not endured. Where my generation reached for philosophy and the virtue of suffering, they reached instead for science and technology and they actually did something about the beggar in the street, the woman in the wheelchair. They got on with it. It wasn’t that they had no sense of spirit or depth. Rather they reserved it for the truly wondrous, and for everything else they made tools.
Nick Harkaway (Gnomon)
Imagine someone sitting alone in a room without television, radio, computer or phone and with the door closed and the blinds down. This person must be a dangerous lunatic or a prisoner sentenced to solitary confinement. If a free agent, then a panty-sniffing loser shunned by society, or a psycho planning to return to college with an automatic weapon and a backpack full of ammo.
Michael Foley (The Age of Absurdity: Why Modern Life makes it Hard to be Happy)
members of the Athenian assemblies were chosen by lot, a method meant to protect the system from degeneracy. Luckily, this effect has been investigated with modern political systems. In a computer simulation, Alessandro Pluchino and his colleagues showed how adding a certain number of randomly selected politicians to the process can improve the functioning of the parliamentary system.
Nassim Nicholas Taleb (Antifragile: Things That Gain from Disorder)
He lamented the attitude of his younger students, who “no longer noticed that their heads had been turned into relays in a telephone network for communicating and distributing sensational physics messages” without realizing that, like almost all modern developments, mathematics was hostile to life: “It is inhuman, like every truly diabolic machine, and it kills everyone whose spinal marrow isn’t conditioned to fit the movement of its wheels.” His already excruciating self-criticism and inferiority complex became truly unbearable, for although he knew mathematics, it was not simple for him. He was not a computer.
Benjamín Labatut (The Maniac)
Yes. You do understand, you do. I knew you would. It was that analogy you made to the Quran that got me thinking in the first place. Metaphors: knowledge existing in several states simultaneously and without contradiction. The stag and the doe and the trap. Instead of working with linear strings of ones and zeroes, the computer could work with bundles that were one and zero and every point in between, all at once. If, if, if you could teach it to overcome its binary nature." "That sounds very complicated indeed." "It should be impossible, but it isn't." Alif began typing furiously. "All modern computers are pedants. To them the world is divided into black and white, off and on, right and wrong. But I will teach yours to recognize multiple origin points, interrelated geneses, systems of multivalent cause and effect.
G. Willow Wilson (Alif the Unseen)
A paper by Gordon Moore (of Moore’s law fame) gives figures for the total number of transistors manufactured per year since the 1950s. It looks something like this: Using our ratio, we can convert the number of transistors to a total amount of computing power. This tells us that a typical modern laptop, which has a benchmark score in the tens of thousands of MIPS, has more computing power than existed in the entire world in 1965.
Randall Munroe (What If?: Serious Scientific Answers to Absurd Hypothetical Questions)
Neurologically speaking, though, there are reasons we develop a confused sense of priorities when we’re in front of our computer screens. For one thing, email comes at unpredictable intervals, which, as B. F. Skinner famously showed with rats seeking pellets, is the most seductive and habit-forming reward pattern to the mammalian brain. (Think about it: would slot machines be half as thrilling if you knew when, and how often, you were going to get three cherries?) Jessie would later say as much to me when I asked her why she was “obsessed”—her word—with her email: “It’s like fishing. You just never know what you’re going to get.” More to the point, our nervous systems can become dysregulated when we sit in front of a screen. This, at least, is the theory of Linda Stone, formerly a researcher and senior executive at Microsoft Corporation. She notes that we often hold our breath or breathe shallowly when we’re working at our computers. She calls this phenomenon “email apnea” or “screen apnea.” “The result,” writes Stone in an email, “is a stress response. We become more agitated and impulsive than we’d ordinarily be.”
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
Multitasking is a highly overrated thing if you ask me. Today, we do everything in twos and threes. There is no longer time to stand and gaze, to look and ponder and to let your mind harmlessly wander. Everything goes around at such a breakneck pace. I almost envy my grandparents. Their lives were mostly devoid of Televisions and completely Computer free! I cannot imagine what my life would be today without the internet – forget even the computer – but I’d love to know how it would feel.
Anirudh Arun (The Steadfast Tin Soldier?)
There's a hardness I'm seeing in modern people. Those little moments of goofiness that used to make the day pass seem to have gone. Life's so serious now. Maybe it's just because I'm with an older gang now.[...]I mean nobody even has hobbies these days. Not that I can see. Husbands and wives both work. Kids are farmed out to schools and video games. Nobody seems able to endure simply being themselves, either - but at the same time they're isolated. People work much more, only go home and surf the Internet and send e-mail rather than calling or writing a note or visiting each other. They work, watch TV, and sleep. I see these things. The world is only about work: work work work get get get...racing ahead...getting sacked from work...going online...knowing computer languages...winning contracts. I mean, it's just not what I would have imagined the world might be if you'd asked me seventeen years ago. People are frazzled and angry, desperate about money, and, at best, indifferent to the future.
Douglas Coupland (Girlfriend in a Coma)
Stop for a second to behold the miracle of engineering that these hand-held, networked computers represent—the typical CPU in a modern smartphone is ten times more powerful than the Cray-1 supercomputer installed at Los Alamos National Laboratory in 1976.
Anthony M. Townsend (Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia)
The problem is going to be, "Let's see, I spent all day staring at a computer screen and then at night my most meaningful relationships are with the two-dimensional characters who aren't in fact two-dimensional characters . . . Gee, I wonder why I'm lonely and doing a lot of drugs? Could there be any connection between the fact that I've got nothing to do with other people, that I don't really have a fucking clue what it is to have a real life, and the fact that most of my existence is mediated by entertainment that I passively choose to receive?
David Foster Wallace (David Foster Wallace: The Last Interview and Other Conversations)
For Barth, and for us, Nazi Germany was the supreme test for modern theology. There we experienced the “modern world,” which we had so labored to understand and to become credible to, as the world, not only of the Copernican world view, computers, and the dynamo, but also of the Nazis.
William H. Willimon (Resident Aliens: Life in the Christian Colony)
The externalization of memory [via the use of external symbolic storage systems] has altered the actual memory architecture within which humans think, which is changing the role of biological memory, the way in which the human brain deploys its resources, and the form of modern culture.
Merlin Donald
There are not many places left in the United States where people can get off the computer, stop filing tax returns, and in effect become invisible. The rain forests in the Cascades and parts of West Montana come to mind, and perhaps the ’Glades still offer hope to those who wish to resign from modern times. The other place is the Atchafalaya Basin.
James Lee Burke (Crusader's Cross (Dave Robicheaux, #14))
the average forager had wider, deeper and more varied knowledge of her immediate surroundings than most of her modern descendants. Today, most people in industrial societies don’t need to know much about the natural world in order to survive. What do you really need to know in order to get by as a computer engineer, an insurance agent, a history teacher or a factory worker? You need to know a lot about your own tiny field of expertise, but for the vast majority of life’s necessities you rely blindly on the help of other experts, whose own knowledge is also limited to a tiny field of expertise. The human collective knows far more today than did the ancient bands. But at the individual level, ancient foragers were the most knowledgeable and skilful people in history.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
Rife's key realization was that there's no difference between modern culture and Sumerian. We have a huge workforce that is illiterate or alliterate and relies on TV—which is sort of an oral tradition. And we have a small, extremely literate power elite—the people who go into the Metaverse, basically—who understand that information is power, and who control society because they have this semimystical ability to speak magic computer languages.
Neal Stephenson (Snow Crash)
I equate true spiritual insight with wisdom, which is different from knowledge. Knowledge can be obtained through many sources: books, stories, songs, legends, myths, and, in modern times, computers and television programs. On the other hand, there’s only one real source of wisdom—pain. Any experience that provides a person with wisdom will also usually provide them with a scar. The greater the pain, the greater the realization. Faith is spiritual rigor mortis.
Damien Echols (Life After Death)
Logically enough, the office and the nunnery have been singularly popular in the imaginations of pornographers. We should not be surprised to learn that the erotic novels of the early modern period were overwhelmingly focused on debauchery and flagellation amongst clergy in vespers and chapels, just as contemporary Internet pornography is inordinately concerned with fellatios and sodomies performed by office workers against a backdrop of work stations and computer equipment.
Alain de Botton (The Pleasures and Sorrows of Work)
As psychologist Bruce Hood writes in his book The Self Illusion, you have an origin story and a sense that you’ve traveled from youth to now along a linear path, with ups and downs that ultimately made you who you are today. Babies don’t have that. That sense is built around events that you can recall and place in time. Babies and small children have what Hood calls “unconscious knowledge,” which is to say they simply recognize patterns and make associations with stimuli. Without episodic memories, there is no narrative; and without any narrative, there is no self. Somewhere between ages two and three, according to Hood, that sense of self begins to come online, and that awakening corresponds with the ability to tell a story about yourself based on memories. He points to a study by Alison Gopnik and Janet Astington in 1988 in which researchers presented to three-year-olds a box of candy, but the children were then surprised to find pencils inside instead of sweets. When they asked each child what the next kid would think was in the box when he or she went through the same experiment, the answer was usually pencils. The children didn’t yet know that other people have minds, so they assumed everyone knew what they knew. Once you gain the ability to assume others have their own thoughts, the concept of other minds is so powerful that you project it into everything: plants, glitchy computers, boats with names, anything that makes more sense to you when you can assume, even jokingly, it has a sort of self. That sense of agency is so powerful that people throughout time have assumed a consciousness at the helm of the sun, the moon, the winds, and the seas. Out of that sense of self and other selves come the narratives that have kept whole societies together. The great mythologies of the ancients and moderns are stories made up to make sense of things on a grand scale. 
So strong is the narrative bias that people live and die for such stories and devote whole lives to them (as well as take lives for them).
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)
Another reason we know that language could not determine thought is that when a language isn't up to the conceptual demands of its speakers, they don't scratch their heads dumbfounded (at least not for long); they simply change the language. They stretch it with metaphors and metonyms, borrow words and phrases from other languages, or coin new slang and jargon. (When you think about it, how else could it be? If people had trouble thinking without language, where would their language have come from-a committee of Martians?) Unstoppable change is the great given in linguistics, which is not what you would expect from "a prisonhouse of thought." That is why linguists roll their eyes at common claims such as that German is the optimal language of science, that only French allows for truly logical expression, and that indigenous languages are not appropriate for the modern world. As Ray Harlow put it, it's like saying, "Computers were not discussed in Old English; therefore computers cannot be discussed in Modern English.
Steven Pinker (The Stuff of Thought: Language as a Window into Human Nature)
These conservative critics call for a return to “family values,” to a world in which prohibition kept us safe from outbreaks of enjoyment. This desire for a return to the past, however, is rarely genuine. Which is to say, such proclamations don’t really want the return to the past that they claim to want. Instead, they want the best of both worlds—the “benefits” of modernity (computers, cars, televisions) without their effects (isolation, enjoyment, narcissism)—and fail to grasp the interdependence of the benefits and the effects
Todd McGowan (The End of Dissatisfaction: Jacques Lacan and the Emerging Society of Enjoyment (Psychoanalysis and Culture))
In theory, if some holy book misrepresented reality, its disciples would sooner or later discover this, and the text’s authority would be undermined. Abraham Lincoln said you cannot deceive everybody all the time. Well, that’s wishful thinking. In practice, the power of human cooperation networks depends on a delicate balance between truth and fiction. If you distort reality too much, it will weaken you, and you will not be able to compete against more clear-sighted rivals. On the other hand, you cannot organise masses of people effectively without relying on some fictional myths. So if you stick to unalloyed reality, without mixing any fiction with it, few people will follow you. If you used a time machine to send a modern scientist to ancient Egypt, she would not be able to seize power by exposing the fictions of the local priests and lecturing the peasants on evolution, relativity and quantum physics. Of course, if our scientist could use her knowledge in order to produce a few rifles and artillery pieces, she could gain a huge advantage over pharaoh and the crocodile god Sobek. Yet in order to mine iron ore, build blast furnaces and manufacture gunpowder the scientist would need a lot of hard-working peasants. Do you really think she could inspire them by explaining that energy divided by mass equals the speed of light squared? If you happen to think so, you are welcome to travel to present-day Afghanistan or Syria and try your luck. Really powerful human organisations – such as pharaonic Egypt, the European empires and the modern school system – are not necessarily clear-sighted. Much of their power rests on their ability to force their fictional beliefs on a submissive reality. That’s the whole idea of money, for example. The government makes worthless pieces of paper, declares them to be valuable and then uses them to compute the value of everything else. 
The government has the power to force citizens to pay taxes using these pieces of paper, so the citizens have no choice but to get their hands on at least some of them. Consequently, these bills really do become valuable, the government officials are vindicated in their beliefs, and since the government controls the issuing of paper money, its power grows. If somebody protests that ‘These are just worthless pieces of paper!’ and behaves as if they are only pieces of paper, he won’t get very far in life.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
The story of the “exquisite cadavers” is as follows. In the aftermath of the First World War, a collection of surrealist poets—which included André Breton, their pope, Paul Eluard, and others—got together in cafés and tried the following exercise (modern literary critics attribute the exercise to the depressed mood after the war and the need to escape reality). On a folded piece of paper, in turn, each one of them would write a predetermined part of a sentence, not knowing the others’ choice. The first would pick an adjective, the second a noun, the third a verb, the fourth an adjective, and the fifth a noun. The first publicized exercise of such random (and collective) arrangement produced the following poetic sentence: The exquisite cadavers shall drink the new wine. (Les cadavres exquis boiront le vin nouveau.) Impressive? It sounds even more poetic in the native French. Quite impressive poetry has been produced in such a manner, sometimes with the aid of a computer. But poetry has never been truly taken seriously outside of the beauty of its associations, whether they have been produced by the random ranting of one or more disorganized brains, or the more elaborate constructions of one conscious creator.
Nassim Nicholas Taleb (Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets (Incerto Book 1))
The mind cannot be a blank slate, because blank slates don’t do anything. As long as people had only the haziest concept of what a mind was or how it might work, the metaphor of a blank slate inscribed by the environment did not seem too outrageous. But as soon as one starts to think seriously about what kind of computation enables a system to see, think, speak, and plan, the problem with blank slates becomes all too obvious: they don’t do anything. The inscriptions will sit there forever unless something notices patterns in them, combines them with patterns learned at other times, uses the combinations to scribble new thoughts onto the slate, and reads the results to guide behavior toward goals. Locke recognized this problem and alluded to something called “the understanding,” which looked at the inscriptions on the white paper and carried out the recognizing, reflecting, and associating. But of course explaining how the mind understands by invoking something called “the understanding” is circular. This argument against the Blank Slate was stated pithily by Gottfried Wilhelm Leibniz (1646-1716) in a reply to Locke. Leibniz repeated the empiricist motto “There is nothing in the intellect that was not first in the senses,” then added, “except the intellect itself.”8
Steven Pinker (The Blank Slate: The Modern Denial of Human Nature)
Astounding, really, that Michel could consider psychology any kind of science at all. So much of it consisted of throwing together. Of thinking of the mind as a steam engine, the mechanical analogy most ready to hand during the birth of modern psychology. People had always done that when they thought about the mind: clockwork for Descartes, geological changes for the early Victorians, computers or holography for the twentieth century, AIs for the twenty-first…and for the Freudian traditionalists, steam engines. Application of heat, pressure buildup, pressure displacement, venting, all shifted into repression, sublimation, the return of the repressed. Sax thought it unlikely steam engines were an adequate model for the human mind. The mind was more like—what?—an ecology—a fellfield—or else a jungle, populated by all manner of strange beasts. Or a universe, filled with stars and quasars and black holes. Well—a bit grandiose, that—really it was more like a complex collection of synapses and axons, chemical energies surging hither and yon, like weather in an atmosphere. That was better—weather—storm fronts of thought, high-pressure zones, low-pressure cells, hurricanes—the jet streams of biological desires, always making their swift powerful rounds…life in the wind. Well. Throwing together. In fact the mind was poorly understood.
Kim Stanley Robinson (Blue Mars (Mars Trilogy, #3))
When modern humans first invented computer ray tracing, they generated thousands if not millions of images of reflective chrome spheres hovering above checkerboard tiles, just to show off how gorgeously ray tracing rendered those reflections. When they invented lens flares in Photoshop, we all had to endure years of lens flares being added to everything, because the artists involved were super excited about a new tool they’d just figured out how to use. The invention of perspective was no different, and since it coincided with the Renaissance going on in Europe at the same time, some of the greatest art in the European canon is dripping with the 1400s CE equivalent of lens flares and hovering chrome spheres.
Ryan North (How to Invent Everything: A Survival Guide for the Stranded Time Traveler)
As technology enables us to upgrade humans, overcome old age and find the key to happiness, won’t people care less about fictional gods, nations and corporations, and focus instead on deciphering the physical and biological reality? It might seem so, but in fact things are far more complicated. Modern science certainly changed the rules of the game, yet it did not simply replace myths with facts. Myths continue to dominate humankind, and science only makes these myths stronger. Instead of destroying the intersubjective reality, science will enable it to control the objective and subjective realities more completely than ever before. Thanks to computers and bioengineering, the difference between fiction and reality will blur, as people reshape reality to match their pet fictions.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
As the physicist Richard Feynman once observed, “[Quantum mechanics] describes nature as absurd from the point of view of common sense. And it fully agrees with experiment. So I hope you can accept nature as She is— absurd.” Quantum mechanics seems to study that which doesn’t exist—but nevertheless proves true. It works. In the decades to come, quantum physics would open the door to a host of practical inventions that now define the digital age, including the modern personal computer, nuclear power, genetic engineering, and laser technology (from which we get such consumer products as the CD player and the bar-code reader commonly used in supermarkets). If the youthful Oppenheimer loved quantum mechanics for the sheer beauty of its abstractions, it was nevertheless a theory that would soon spawn a revolution in how human beings relate to the world.
Kai Bird (American Prometheus)
And the lights are everywhere. They are so pervasive in modern life we’ve stopped seeing them. In turning them off, it’s hard to know where to begin. There are house lights and garage lights, fluorescent lights and halogen lights. There are streetlights and stoplights, headlights, taillights, dashboard lights, and billboard lights. There are night-lights to stand sentinel against the dark of our rooms and hallways, and reading lights for feeding our addiction to words and images and information, even in the middle of the night. There are warning lights and safety lights, and the lights of our cell phones and televisions and computer screens. No wonder our larger towns and cities are so bright you can see them from space. Nor does that urban and suburban light stay put. It seeps into the nearby plains and hills and mountains, casting shadows from trees and telephone poles. It throws off the rhythms of insects and animals and confuses the migrations of birds.
Clark Strand (Waking Up to the Dark: Ancient Wisdom for a Sleepless Age)
God was dead: to begin with. And romance was dead. Chivalry was dead. Poetry, the novel, painting, they were all dead, and art was dead. Theatre and cinema were both dead. Literature was dead. The book was dead. Modernism, postmodernism, realism and surrealism were all dead. Jazz was dead, pop music, disco, rap, classical music, dead. Culture was dead. Decency, society, family values were dead. The past was dead. History was dead. The welfare state was dead. Politics was dead. Democracy was dead. Communism, fascism, neoliberalism, capitalism, all dead, and marxism, dead, feminism, also dead. Political correctness, dead. Racism was dead. Religion was dead. Thought was dead. Hope was dead. Truth and fiction were both dead. The media was dead. The internet was dead. Twitter, instagram, facebook, google, dead. Love was dead. Death was dead. A great many things were dead. Some, though, weren’t, or weren’t dead yet. Life wasn’t yet dead. Revolution wasn’t dead. Racial equality wasn’t dead. Hatred wasn’t dead. But the computer? Dead. TV? Dead. Radio? Dead. Mobiles were dead. Batteries were dead. Marriages were dead, sex lives were dead, conversation was dead. Leaves were dead. Flowers were dead, dead in their water. Imagine being haunted by the ghosts of all these dead things. Imagine being haunted by the ghost of a flower. No, imagine being haunted (if there were such a thing as being haunted, rather than just neurosis or psychosis) by the ghost (if there were such a thing as ghosts, rather than just imagination) of a flower. Ghosts themselves weren’t dead, not exactly. 
Instead, the following questions came up: “are ghosts dead are ghosts dead or alive are ghosts deadly” but in any case forget ghosts, put them out of your mind because this isn’t a ghost story, though it’s the dead of winter when it happens, a bright sunny post-millennial global-warming Christmas Eve morning (Christmas, too, dead), and it’s about real things really happening in the real world involving real people in real time on the real earth (uh huh, earth, also dead):
Ali Smith (Winter (Seasonal, #2))
What are we to make of the existence of historical patterns? It is often said that history repeats itself, sometimes as tragedy, sometimes as farce, sometimes with special flourishes and variations, but this notion stands at odds with our modern understanding of history as an arc of progress. As Weber pointed out, modernity hinges on the collective belief that history is an ongoing process, one in which we steadily increase our knowledge and technical mastery of the world. Unlike the ancient Hebrews and Greeks, who believed that history was cyclical, the modern standpoint is that time is going somewhere, that we are gaining knowledge and understanding of the world, that our inventions and discoveries build on one another in a cumulative fashion. But then why do the same problems—and even the same metaphors—keep appearing century after century in new form? More specifically, how is it that the computer metaphor—an analogy that was expressly designed to avoid the notion of a metaphysical soul—has returned to us these ancient religious ideas about physical transcendence and the disembodied spirit?
Meghan O'Gieblyn (God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning)
The phrase “slow reading” goes back at least as far as the philosopher Friedrich Nietzsche, who in 1887 described himself as a “teacher of slow reading.” The way he phrased it, you know he thought he was bucking the tide. That makes sense, because the modern world, i.e., a world built upon the concept that fast is good and faster is better, was just getting up a full head of steam. In the century and a quarter since he wrote, we have seen the world fall in love with speed in all its guises, including reading—part of President John F. Kennedy’s legend was his ability to speed read through four or five newspapers every morning. And this was all long before computers became household gadgets and our BFFs. Now and then the Nietzsches of the world have fought back. Exponents of New Criticism captured the flag in the halls of academe around the middle of the last century and made “close reading” all the rage. Then came Slow Food, then Slow Travel, then Slow Money. And now there is Slow Reading. In all these initiatives, people have fought against the velocity of modern life by doing … less and doing it slower.
Malcolm Jones
It’s true that in the 1950s many women felt they had to choose between children and career—and for good reason. Birth control was not a surefire thing, for one thing. And technology hadn’t advanced enough to offer women the gift of time. The reason modern women have a better shot at “having it all” isn’t because feminists made it happen. Life simply changed. Technological advances, along with The Pill, did more for the work/family conflict than ten boatloads of feminists could ever hope to do. The effects of The Pill are obvious: safe, reliable birth control means those who want smaller families can have them. And fewer children means more time for women to focus on other things they want to do. The effects of technology are also obvious: they made life at home less taxing. Laborsaving devices, the mechanization of housework, and the tech boom—via electricity, the sewing machine, the frozen food process, the automobile, the washing machine and dryer, the dishwasher, the vacuum cleaner, computers, and the Internet—allowed women, generation by generation, to turn their attention away from the home and onto the marketplace.
Suzanne Venker (The War On Men)
I was once shown the script of a film based on a parable of a city completely ruled by randomness—very Borgesian. At set intervals, the ruler randomly assigns to the denizens a new role in the city. Say the butcher would now become a baker, and the baker a prisoner, etc. At the end, people end up rebelling against the ruler, asking for stability as their inalienable right. I immediately thought that perhaps the opposite parable should be written: instead of having the rulers randomize the jobs of citizens, we should have citizens randomize the jobs of rulers, naming them by raffles and removing them at random as well. That is similar to simulated annealing—and it happens to be no less effective. It turned out that the ancients—again, those ancients!—were aware of it: the members of the Athenian assemblies were chosen by lot, a method meant to protect the system from degeneracy. Luckily, this effect has been investigated with modern political systems. In a computer simulation, Alessandro Pluchino and his colleagues showed how adding a certain number of randomly selected politicians to the process can improve the functioning of the parliamentary system.
Nassim Nicholas Taleb (Antifragile: Things that Gain from Disorder)
...[I]f the goal is a realistic sustainable future, then it’s necessary to take a look at what we can do to lengthen the lives of the products we’re going to buy anyway. So my ... answer to the question of how we can boost recycling rates is this: Demand that companies start designing products for repair, reuse, and recycling. Take, for example, the super-thin MacBook Air, a wonder of modern design packed into an aluminum case that’s barely bigger than a handful of documents in a manila envelope. At first glance, it would seem to be a sustainable wonder that uses fewer raw materials to do more. But that’s just the gloss; the reality is that the MacBook Air’s thin profile means that its components—memory chips, solid state drive, and processor—are packed so tightly in the case that there’s no room for upgrades (a point driven home by the unusual screws used to hold the case together, thus making home repair even more difficult). Even worse, from the perspective of recycling, the thin profile (and the tightly packed innards) means that the computer is exceptionally difficult to break down into individual components when it comes time to recycle it. In effect, the MacBook Air is a machine built to be shredded, not repaired, upgraded, and reused.
Adam Minter (Junkyard Planet: Travels in the Billion-Dollar Trash Trade)
You could tell the quality of his thinking by what he chose to ask (questions being the true measure of a man), and after I successfully explained my thesis on symbiogenesis, we began conversing more openly and freely, and I got the chance to peer inside his head. He asked me if I’d heard of Turing’s oracle machines. In time, I have come to regard that simple question as a test. Luckily for me, I knew that Turing had written about oracle machines in his PhD thesis when he was just twenty-six years old: these were regular computers that worked, like all modern devices, following a precise set of sequential instructions. But Turing knew—from his study of Gödel and the halting problem—that all such devices would suffer from inescapable limitations, and that many problems would forever remain beyond their ability to solve. That weakness tortured the grandfather of computers: Turing longed for something different, a machine that could look beyond logic and behave in a manner more akin to humans, who possess not only intelligence but also intuition. So he dreamed up a computer capable of taking the machine equivalent of a wild guess: just like the Sibyl in her ecstatic drunkenness, his device would, at a certain point in its operations, make a nondeterministic leap.
Benjamín Labatut (The MANIAC)
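Turing's "nondeterministic leap" has a standard modern echo: a machine that may guess a candidate solution and only needs to verify it cheaply. A toy sketch (subset-sum is my own example, not Labatut's or Turing's): the deterministic routine grinds through every case in a fixed instruction order, while the guessing routine imitates the oracle's leap with random guesses plus a check.

```python
import itertools
import random

def subset_sum_deterministic(nums, target):
    """Exhaustively check every subset, the way an ordinary Turing
    machine follows its fixed sequence of instructions."""
    for r in range(len(nums) + 1):
        for combo in itertools.combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None

def subset_sum_guessing(nums, target, tries=100_000, seed=1):
    """Imitate the nondeterministic leap: guess a subset at random and
    merely *verify* it.  A true nondeterministic machine would guess
    right in a single step; here we simply retry."""
    rng = random.Random(seed)
    for _ in range(tries):
        guess = [x for x in nums if rng.random() < 0.5]
        if sum(guess) == target:
            return guess
    return None

nums = [3, 34, 4, 12, 5, 2]
print(subset_sum_deterministic(nums, 9))
print(subset_sum_guessing(nums, 9))
```

Verification is easy in both cases; it is only the *finding* that differs, which is exactly the gap between deterministic computation and the guessing machine Turing imagined.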
Humans are cognitive misers because their basic tendency is to default to Type I processing mechanisms of low computational expense. Using less computational capacity for one task means that there is more left over for another task if they both must be completed simultaneously. This would seem to be adaptive. Nevertheless, this strong bias to default to the simplest cognitive mechanism-to be a cognitive miser-means that humans are often less than rational. Increasingly, in the modern world we are presented with decisions and problems that require more accurate responses than those generated by heuristic processing. Type I processes often provide a quick solution that is a first approximation to an optimal response. But modern life often requires more precise thought than this. Modern technological societies are in fact hostile environments for people reliant on only the most easily computed automatic response. Think of the multi-million-dollar advertising industry that has been designed to exploit just this tendency. Modern society keeps proliferating situations where shallow processing is not sufficient for maximizing personal happiness-precisely because many structures of market-based societies have been designed explicitly to exploit such tendencies. Being cognitive misers will seriously impede people from achieving their goals.
Keith E. Stanovich (What Intelligence Tests Miss)
In 1997 an IBM computer called Deep Blue defeated the world chess champion Garry Kasparov, and unlike its predecessors, it did not just evaluate trillions of moves by brute force but was fitted with strategies that intelligently responded to patterns in the game. [Y]ou might still object that chess is an artificial world with discrete moves and a clear winner, perfectly suited to the rule-crunching of a computer. People, on the other hand, live in a messy world offering unlimited moves and nebulous goals. Surely this requires human creativity and intuition — which is why everyone knows that computers will never compose a symphony, write a story, or paint a picture. But everyone may be wrong. Recent artificial intelligence systems have written credible short stories, composed convincing Mozart-like symphonies, drawn appealing pictures of people and landscapes, and conceived clever ideas for advertisements. None of this is to say that the brain works like a digital computer, that artificial intelligence will ever duplicate the human mind, or that computers are conscious in the sense of having first-person subjective experience. But it does suggest that reasoning, intelligence, imagination, and creativity are forms of information processing, a well-understood physical process. Cognitive science, with the help of the computational theory of mind, has exorcised at least one ghost from the machine.
Steven Pinker (The Blank Slate: The Modern Denial of Human Nature)
Many aspects of the modern financial system are designed to give an impression of overwhelming urgency: the endless ‘news’ feeds, the constantly changing screens of traders, the office lights blazing late into the night, the young analysts who find themselves required to work thirty hours at a stretch. But very little that happens in the finance sector has genuine need for this constant appearance of excitement and activity. Only its most boring part—the payments system—is an essential utility on whose continuous functioning the modern economy depends. No terrible consequence would follow if the stock market closed for a week (as it did in the wake of 9/11)—or longer, or if a merger were delayed or large investment project postponed for a few weeks, or if an initial public offering happened next month rather than this. The millisecond improvement in data transmission between New York and Chicago has no significance whatever outside the absurd world of computers trading with each other. The tight coupling is simply unnecessary: the perpetual flow of ‘information’ part of a game that traders play which has no wider relevance, the excessive hours worked by many employees a tournament in which individuals compete to display their alpha qualities in return for large prizes. The traditional bank manager’s culture of long lunches and afternoons on the golf course may have yielded more useful information about business than the Bloomberg terminal.
John Kay (Other People's Money: The Real Business of Finance)
If we’re not careful, the automation of mental labor, by changing the nature and focus of intellectual endeavor, may end up eroding one of the foundations of culture itself: our desire to understand the world. Predictive algorithms may be supernaturally skilled at discovering correlations, but they’re indifferent to the underlying causes of traits and phenomena. Yet it’s the deciphering of causation—the meticulous untangling of how and why things work the way they do—that extends the reach of human understanding and ultimately gives meaning to our search for knowledge. If we come to see automated calculations of probability as sufficient for our professional and social purposes, we risk losing or at least weakening our desire and motivation to seek explanations, to venture down the circuitous paths that lead toward wisdom and wonder. Why bother, if a computer can spit out “the answer” in a millisecond or two? In his 1947 essay “Rationalism in Politics,” the British philosopher Michael Oakeshott provided a vivid description of the modern rationalist: “His mind has no atmosphere, no changes of season and temperature; his intellectual processes, so far as possible, are insulated from all external influence and go on in the void.” The rationalist has no concern for culture or history; he neither cultivates nor displays a personal perspective. His thinking is notable only for “the rapidity with which he reduces the tangle and variety of experience” into “a formula.”54 Oakeshott’s words also provide us with a perfect description of computer intelligence: eminently practical and productive and entirely lacking in curiosity,
Nicholas Carr (The Glass Cage: Where Automation is Taking Us)
A Code of Nature must accommodate a mixture of individually different behavioral tendencies. The human race plays a mixed strategy in the game of life. People are not molecules, all alike and behaving differently only because of random interactions. People just differ, dancing to their own personal drummer. The merger of economic game theory with neuroscience promises more precise understanding of those individual differences and how they contribute to the totality of human social interactions. It's understanding those differences, Camerer says, that will make such a break with old schools of economic thought. "A lot of economic theory uses what is called the representative agent model," Camerer told me. In an economy with millions of people, everybody is clearly not going to be completely alike in behavior. Maybe 10 percent will be of some type, 14 percent another type, 6 percent something else. A real mix. "It's often really hard, mathematically, to add all that up," he said. "It's much easier to say that there's one kind of person and there's a million of them. And you can add things up rather easily." So for the sake of computational simplicity, economists would operate as though the world was populated by millions of one generic type of person, using assumptions about how that generic person would behave. "It's not that we don't think people are different—of course they are, but that wasn't the focus of analysis," Camerer said. "It was, well, let's just stick to one type of person. But I think the brain evidence, as well as genetics, is just going to force us to think about individual differences." And in a way, that is a very natural thing for economists to want to do.
Tom Siegfried (A Beautiful Math: John Nash, Game Theory, and the Modern Quest for a Code of Nature (Mathematics))
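Camerer's point about adding up heterogeneous types is easy to see in miniature. In this hedged toy model (the shares and the "a deal succeeds only if the same agent cooperates twice" rule are my illustration, not from the book), averaging the people first and computing once gives a different answer than computing per type and then averaging, because the behavior enters nonlinearly:

```python
# Shares of four behavioral types and each type's cooperation rate,
# echoing the "10 percent of some type, 14 percent another" mix.
mix = [(0.10, 0.9), (0.14, 0.6), (0.06, 0.1), (0.70, 0.5)]
assert abs(sum(share for share, _ in mix) - 1.0) < 1e-9

# Representative-agent shortcut: one generic person of average type.
avg_p = sum(share * p for share, p in mix)
rep_prediction = avg_p ** 2   # chance the generic agent cooperates twice

# Mixed-population answer: compute per type, then add up the types.
mix_prediction = sum(share * p ** 2 for share, p in mix)

print(f"representative agent: {rep_prediction:.4f}")
print(f"true mixture:         {mix_prediction:.4f}")
```

The mixture's prediction is strictly larger here, a gap (Jensen's inequality, in this case) that the single-generic-person model cannot see no matter how its parameters are tuned.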
Back in 2015, a volunteer group called Bitnation set up something called the Blockchain Emergency ID. There’s not a lot of data on the project now. BE-ID used public-key cryptography to generate unique IDs for people without their documents. People could verify their relations, that these people belonged to their family, and so on. It was a very modern way of maintaining an ID; secure, fast, and easy to use. Using the Bitcoin blockchain, the group published all these IDs on to a globally distributed public ledger, spread across the computers of every single Bitcoin user online - hundreds of thousands of users, in those times. Once published, no government could undo it; the identities would float around in the recesses of the Internet. As long as the network remained alive, every person's identity would remain intact, forever floating as bits and bytes between the nations: no single country, government or company could ever deny them this. “That was, and I don't say this often, the fucking bomb,” said Common. In one fell swoop, identities were taken outside government control. BE-ID, progressing in stages, became the refugees' gateway to social assistance and financial services. First it became compliant with UN guidelines. Then it was linked to a VISA card. And thus out of the Syrian war was something that looked like it could solve global identification forever. Experts wrote on its potential. No more passports. No more national IDs. Sounds familiar? Yes, that’s the United Nations Identity in a nutshell. Julius Common’s first hit - the global identity revolution that he sold first to the UN, and then to almost every government in the world - was conceived of when he was a teenager.
Yudhanjaya Wijeratne (Numbercaste)
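The ledger mechanics behind a scheme like BE-ID can be sketched with nothing but a hash function. This is a toy chain (the field names and records are invented for illustration, and real systems add digital signatures and distributed consensus on top): each entry's fingerprint covers the previous entry's fingerprint, so no single party can quietly rewrite history once others hold copies.

```python
import hashlib
import json

def entry_hash(record, prev_hash):
    """Fingerprint a record together with the previous entry's hash,
    forming an append-only chain: altering any past record changes
    every later hash, which copies held elsewhere would reject."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

ledger = []
prev = "0" * 64  # genesis value
for record in [{"id": "person-1", "relation": "parent-of person-2"},
               {"id": "person-2", "relation": "child-of person-1"}]:
    prev = entry_hash(record, prev)
    ledger.append((record, prev))

# Verify the chain: recompute every hash from the genesis value.
check = "0" * 64
for record, stored in ledger:
    check = entry_hash(record, check)
    assert check == stored
print("chain valid, tip =", ledger[-1][1][:12])
```

Publishing such fingerprints on a widely replicated ledger is what made the identities in the quote effectively impossible for any one government to revoke.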
A confidential report delivered in June 1965 by Abel Aganbegyan, director of the Novosibirsk Institute of Economics, highlighted the difficulties. Aganbegyan noted that the growth rate of the Soviet economy was beginning to decline, just as the rival US economy seemed particularly buoyant; at the same time, some sectors of the Soviet economy - housing, agriculture, services, retail trade - remained very backward, and were failing to develop at an adequate rate. The root causes of this poor performance he saw in the enormous commitment of resources to defense (in human terms, 30-40 million people out of a working population of 100 million, he reckoned), and the 'extreme centralism and lack of democracy in economic matters' which had survived from the past. In a complex modern society, he argued, not everything could be planned, since it was impossible to foresee all possible contingencies and their potential effects. So the plan amounted to central command, and even that could not be properly implemented for lack of information and of modern data-processing equipment. 'The Central Statistical Administration ... does not have a single computer, and is not planning to acquire any,' he commented acidly. Economic administration was also impeded by excessive secrecy: 'We obtain many figures... from American journals sooner than they are released by the Central Statistical Administration.' Hence the economy suffered from inbuilt distortions: the hoarding of goods and labour to provide for unforeseen contingencies, the production of shoddy goods to fulfill planning targets expressed in crude quantitative terms, the accumulation of unused money by a public reluctant to buy substandard products, with resultant inflation and a flourishing black market.
Geoffrey Hosking (The First Socialist Society: A History of the Soviet Union from Within)
Computers speak machine language," Hiro says. "It's written in ones and zeroes -- binary code. At the lowest level, all computers are programmed with strings of ones and zeroes. When you program in machine language, you are controlling the computer at its brainstem, the root of its existence. It's the tongue of Eden. But it's very difficult to work in machine language because you go crazy after a while, working at such a minute level. So a whole Babel of computer languages has been created for programmers: FORTRAN, BASIC, COBOL, LISP, Pascal, C, PROLOG, FORTH. You talk to the computer in one of these languages, and a piece of software called a compiler converts it into machine language. But you never can tell exactly what the compiler is doing. It doesn't always come out the way you want. Like a dusty pane or warped mirror. A really advanced hacker comes to understand the true inner workings of the machine -- he sees through the language he's working in and glimpses the secret functioning of the binary code -- becomes a Ba'al Shem of sorts." "Lagos believed that the legends about the tongue of Eden were exaggerated versions of true events," the Librarian says. "These legends reflected nostalgia for a time when people spoke Sumerian, a tongue that was superior to anything that came afterward." "Is Sumerian really that good?" "Not as far as modern-day linguists can tell," the Librarian says. "As I mentioned, it is largely impossible for us to grasp. Lagos suspected that words worked differently in those days. If one's native tongue influences the physical structure of the developing brain, then it is fair to say that the Sumerians -- who spoke a language radically different from anything in existence today -- had fundamentally different brains from yours. Lagos believed that for this reason, Sumerian was a language ideally suited to the creation and propagation of viruses. 
That a virus, once released into Sumer, would spread rapidly and virulently, until it had infected everyone." "Maybe Enki knew that also," Hiro says. "Maybe the nam-shub of Enki wasn't such a bad thing. Maybe Babel was the best thing that ever happened to us.
Neal Stephenson (Snow Crash)
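Hiro's layering - high-level language, compiler, strings of ones and zeroes - can be peeked at directly from Python. A small sketch (Python bytecode stands in for native machine language here, and the "ones and zeroes" view is simply the same compiled bytes printed in binary):

```python
import dis

# A one-line program written in one of the 'Babel' of languages...
def add(a, b):
    return a + b

# ...and the compiled form the interpreter actually executes: raw
# bytes, i.e. the ones and zeroes Hiro describes.
raw = add.__code__.co_code
bits = " ".join(f"{byte:08b}" for byte in raw)
print(bits)

# The human-readable disassembly of those same bytes - the view a
# hacker uses to 'see through' the language to the machine beneath.
dis.dis(add)
```

The exact bytes vary between Python versions, which is itself Hiro's point about compilers: you never can tell exactly what the compiler is doing until you look underneath.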
With the introduction of radio, we now had a superfast, convenient, and wireless way of communicating over long distances. Historically, the lack of a fast and reliable communication system was one of the great obstacles to the march of history. (In 490 BCE, after the Battle of Marathon between the Greeks and the Persians, a poor runner was ordered to spread the news of the Greek victory as fast as he could. Bravely, he ran 26 miles to Athens after previously running 147 miles to Sparta, and then, according to legend, dropped dead of sheer exhaustion. His heroism, in the age before telecommunication, is now celebrated in the modern marathon.) Today, we take for granted that we can send messages and information effortlessly across the globe, utilizing the fact that energy can be transformed in many ways. For example, when speaking on a cell phone, the energy of the sound of your voice converts to mechanical energy in a vibrating diaphragm. The diaphragm is attached to a magnet that relies on the interchangeability of electricity and magnetism to create an electrical impulse, the kind that can be transported and read by a computer. This electrical impulse is then translated into electromagnetic waves that are picked up by a nearby microwave tower. There, the message is amplified and sent across the globe. But Maxwell's equations not only gave us nearly instantaneous communication via radio, cell phone, and fiber-optic cables, they also opened up the entire electromagnetic spectrum, of which visible light and radio were just two members. In the 1660s, Newton had shown that white light, when sent through a prism, can be broken up into the colors of the rainbow. In 1800, William Herschel had asked himself a simple question: What lies beyond the colors of the rainbow, which extend from red to violet? He took a prism, which created a rainbow in his lab, and placed a thermometer below the color red, where there was no color at all. 
Much to his surprise, the temperature of this blank area began to rise. In other words, there was a "color" below red that was invisible to the naked eye but contained energy. It was called infrared light. Today, we realize that there is an entire spectrum of electromagnetic radiation, most of which is invisible, and each has a distinct wavelength. The wavelength of radio and TV, for example, is longer than that of visible light. The wavelength of the colors of the rainbow, in turn, is longer than that of ultraviolet and X-rays. This also meant that the reality we see all around us is only the tiniest sliver of the complete EM spectrum, the smallest approximation of a much larger universe
Michio Kaku (The God Equation: The Quest for a Theory of Everything)
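The spectrum Kaku describes is ordered by a single relation, wavelength = c / frequency: radio waves sit at low frequencies and long wavelengths, visible light at far higher frequencies and far shorter ones. A quick sketch (the band frequencies are rounded illustrative values):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength(freq_hz):
    """lambda = c / f: the one relation that orders the whole
    electromagnetic spectrum, from radio down to X-rays."""
    return C / freq_hz

bands = {
    "FM radio (100 MHz)": 100e6,
    "microwave oven (2.45 GHz)": 2.45e9,
    "green light (540 THz)": 540e12,
}
for name, f in bands.items():
    print(f"{name}: wavelength = {wavelength(f):.3e} m")
```

Running this shows FM radio waves around 3 m against green light around half a micrometer, which is why the visible band is only "the tiniest sliver" of the full spectrum.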