Short Computer Quotes

We've searched our database for all the quotes and captions related to Short Computer. Here they are! All 100 of them:

We can only see a short distance ahead, but we can see plenty there that needs to be done.
Alan M. Turing (Computing machinery and intelligence)
The upshot of all this is that we live in a universe whose age we can't quite compute, surrounded by stars whose distances we don't altogether know, filled with matter we can't identify, operating in conformance with physical laws whose properties we don’t truly understand.
Bill Bryson (A Short History of Nearly Everything)
A long descriptive name is better than a short enigmatic name. A long descriptive name is better than a long descriptive comment.
Robert C. Martin (Clean Code: A Handbook of Agile Software Craftsmanship)
Adrian looked over at me again. “Who knows more about male weakness: you or me?” “Go on.” I refused to directly answer the question. “Get a new dress. One that shows a lot of skin. Short. Strapless. Maybe a push-up bra too.” He actually had the audacity to do a quick assessment of my chest. “Eh, maybe not. But definitely some high heels.” “Adrian,” I exclaimed. “You’ve seen how Alchemists dress. Do you think I can really wear something like that?” He was unconcerned. “You’ll make it work. You’ll change clothes or something. But I’m telling you, if you want to get a guy to do something that might be difficult, then the best way is to distract him so that he can’t devote his full brainpower to the consequences.” “You don’t have a lot of faith in your own gender.” “Hey, I’m telling you the truth. I’ve been distracted by sexy dresses a lot.” I didn’t really know if that was a valid argument, seeing as Adrian was distracted by a lot of things. Fondue. T-shirts. Kittens. “And so, what then? I show some skin, and the world is mine?” “That’ll help.” Amazingly, I could tell he was dead serious. “And you’ve gotta act confident the whole time, like it’s already a done deal. Then make sure when you’re actually asking for what you want that you tell him you’d be ‘so, so grateful.’ But don’t elaborate. His imagination will do half the work for you. ” I shook my head, glad we’d almost reached our destination. I didn’t know how much more I could listen to. “This is the most ridiculous advice I’ve ever heard. It’s also kind of sexist too, but I can’t decide who it offends more, men or women.” “Look, Sage. I don’t know much about chemistry or computer hacking or photosynthery, but this is something I’ve got a lot of experience with.” I think he meant photosynthesis, but I didn’t correct him. “Use my knowledge. Don’t let it go to waste.
Richelle Mead (The Indigo Spell (Bloodlines, #3))
We have bigger houses but smaller families; more conveniences, but less time; We have more degrees, but less sense; more knowledge, but less judgment; more experts, but more problems; more medicines, but less healthiness; We’ve been all the way to the moon and back, but have trouble crossing the street to meet the new neighbor. We’ve built more computers to hold more information to produce more copies than ever, but have less communications; We have become long on quantity, but short on quality. These times are times of fast foods; but slow digestion; Tall man but short character; Steep profits but shallow relationships. It is time when there is much in the window, but nothing in the room. --authorship unknown from Sacred Economics
Charles Eisenstein (Sacred Economics: Money, Gift, and Society in the Age of Transition)
To use a computer analogy, we are running twenty-first-century software on hardware last upgraded 50,000 years ago or more. This may explain quite a lot of what we see in the news.
Ronald Wright (A Short History Of Progress)
What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.
Joseph Weizenbaum
The young man shivered. He rolled the stock themes of fantasy over in his mind: cars and stockbrokers and commuters, housewives and police, agony columns and commercials for soap, income tax and cheap restaurants, magazines and credit cards and streetlights and computers... 'It is escapism, true,' he said, aloud. 'But is not the highest impulse in mankind the urge toward freedom, the drive to escape?
Neil Gaiman (Fragile Things: Short Fictions and Wonders)
We have bigger houses but smaller families; more conveniences, but less time; We have more degrees, but less sense; more knowledge, but less judgment; more experts, but more problems; more medicines, but less healthiness; We’ve been all the way to the moon and back, but have trouble crossing the street to meet the new neighbor. We’ve built more computers to hold more information to produce more copies than ever, but have less communications; We have become long on quantity, but short on quality. These times are times of fast foods; but slow digestion; Tall man but short character; Steep profits but shallow relationships. It is time when there is much in the window, but nothing in the room.
Dalai Lama XIV
Sadly enough, sometimes you and Lenny are the only real human interactions that I have all day. The rest of the day I'm just like a machine that mechanically computes and produces. Also in "Stories and Scripts: An Anthology"
Zack Love (The Doorman)
Tomorrow at the press conference would be dreadful. She would be surrounded by nice young men who spoke Big Business or Computer or Bachelor on the Make, and she would not understand a word they said. ("Short Story: Blued Moon")
Connie Willis (Best Science Fiction of the Year 14)
The Timeline of a Life being Long or Short, is Computed on the Benchmark of How You Live It...
Saurabh Dudeja
The audience burst into applause and hallelujahs. I kept trying to make sense of it, and kept coming up short. Here were people who routinely used their computers to stay in touch with their friends and get the news of the day, people who took weather satellites and lung transplants for granted, people who expected to live lives thirty and forty years longer than those of their great-grandparents. Here they were, falling for a story that made Santa and the Tooth Fairy look like gritty realism.
Stephen King (Revival)
Marturano recommended something radical: do only one thing at a time. When you’re on the phone, be on the phone. When you’re in a meeting, be there. Set aside an hour to check your email, and then shut off your computer monitor and focus on the task at hand. Another tip: take short mindfulness breaks throughout the day. She called them “purposeful pauses.” So, for example, instead of fidgeting or tapping your fingers while your computer boots up, try to watch your breath for a few minutes. When driving, turn off the radio and feel your hands on the wheel. Or when walking between meetings, leave your phone in your pocket and just notice the sensations of your legs moving. “If I’m a corporate samurai,” I said, “I’d be a little worried about taking all these pauses that you recommend because I’d be thinking, ‘Well, my rivals aren’t pausing. They’re working all the time.’ ” “Yeah, but that assumes that those pauses aren’t helping you. Those pauses are the ways to make you a more clear thinker and for you to be more focused on what’s important.
Dan Harris (10% Happier)
In the short run, technology may be more efficient than man, but it will never be perfect. Every piece of equipment will eventually reveal an error code. In the long run, man will never be perfect, but prove to be more reliable than technology.
Suzy Kassem (Rise Up and Salute the Sun: The Writings of Suzy Kassem)
She was working at her computer in her office, doing admin, which is short for administration, which is short for migraine-stimulant.
Ali Smith (There but for the)
Reading the e-mail was like getting an ice pick to the brain. I stared blankly at my computer, all higher mental functions short-circuited, and resisted the urge to punch the screen.
Phil Klay (Redeployment)
Every set of phenomena, whether cultural totality or sequence of events, has to be fragmented, disjointed, so that it can be sent down the circuits; every kind of language has to be resolved into a binary formulation so that it can circulate not, any longer, in our memories, but in the luminous, electronic memory of the computers. No human language can withstand the speed of light. No event can withstand being beamed across the whole planet. No meaning can withstand acceleration. No history can withstand the centrifugation of facts or their being short-circuited in real time (to pursue the same train of thought: no sexuality can withstand being liberated, no culture can withstand being hyped, no truth can withstand being verified, etc.).
Jean Baudrillard (The Illusion of the End)
Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended. Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? —Vernor Vinge, author, professor, computer scientist
James Barrat (Our Final Invention: Artificial Intelligence and the End of the Human Era)
Watson represents merely a step in the development of smart machines. Its answering prowess, so formidable on a winter afternoon in 2011, will no doubt seem quaint in a surprisingly short time.
Stephen Baker (Final Jeopardy: Man vs. Machine and the Quest to Know Everything)
Google gets $59 billion, and you get free search and e-mail. A study published by the Wall Street Journal in advance of Facebook’s initial public offering estimated the value of each long-term Facebook user to be $80.95 to the company. Your friendships were worth sixty-two cents each and your profile page $1,800. A business Web page and its associated ad revenue were worth approximately $3.1 million to the social network. Viewed another way, Facebook’s billion-plus users, each dutifully typing in status updates, detailing his biography, and uploading photograph after photograph, have become the largest unpaid workforce in history. As a result of their free labor, Facebook has a market cap of $182 billion, and its founder, Mark Zuckerberg, has a personal net worth of $33 billion. What did you get out of the deal? As the computer scientist Jaron Lanier reminds us, a company such as Instagram—which Facebook bought in 2012—was not valued at $1 billion because its thirteen employees were so “extraordinary. Instead, its value comes from the millions of users who contribute to the network without being paid for it.” Its inventory is personal data—yours and mine—which it sells over and over again to parties unknown around the world. In short, you’re a cheap date.
Marc Goodman (Future Crimes)
"What have you done to it, Monkeyman?" he breathed. "Well," said Arthur, "nothing in fact. It's just that I think a short while ago it was trying to work out how to..." "Yes?" "Make me some tea." "That's right guys," the computer sang out suddenly, "just coping with that problem right now, and wow, it's a biggy. Be with you in a while." It lapsed back into a silence that was only matched for sheer intensity by the silence of the three people staring at Arthur Dent.
Douglas Adams
And before you say this is all far-fetched, just think how far the human race has come in the past ten years. If someone had told your parents, for example, that they would be able to carry their entire music library in their pocket, would they have believed it? Now we have phones that have more computing power than was used to send some of the first rockets into space. We have electron microscopes that can see individual atoms. We routinely cure diseases that only fifty years ago were fatal. And the rate of change is increasing. Today we are able to do what your parents would have dismissed as impossible and your grandparents nothing short of magical.
Nicolas Flamel
She wanted to feel normal. She wanted to feel like everyone else. She wanted more than this even; she wanted to be pretty. She wanted to be like the girls who ran laps in gym class, climbed ropes, jumped, and were asked out on dates. When some girls were wondering if their shorts or skirts were too tight, Sarah was worried about her underpants: granny panties, she called them. When the prettiest and most popular were trying out for cheerleading or volleyball, Sarah went home and read, gorged herself between meals in front of her computer and television.
Todd Nelsen (Appetite & Other Stories)
Hot, short, thorough," I said. I hesitated before adding, "Please." It never pays to insult computers that are smart enough to form sentences. Not when they're in control of the locks, and especially when they have the capacity to boil you in bleach. "Absolutely," said the shower.
Mira Grant (Deadline (Newsflesh, #2))
The upshot of all this is that we live in a universe whose age we can’t quite compute, surrounded by stars whose distances from us and each other we don’t altogether know, filled with matter we can’t identify, operating in conformance with physical laws whose properties we don’t truly understand.
Bill Bryson (A Short History of Nearly Everything)
Of all the inventions Addie has seen ushered into the world — steam-powered trains, electric lights, photography, and phones, and airplanes, and computers — movies might just be her favorite one. Books are wonderful, portable, lasting, but sitting there, in the darkened theater, the wide screen filling her vision, the world falls away, and for a few short hours she is someone else, plunged into romance and intrigue and comedy and adventure.
Victoria E. Schwab (The Invisible Life of Addie LaRue)
It was yesterday’s lead story, tracking the proliferation of new “mind-uploading” companies, hoping to discover a means of scanning the human brain into a computer for perpetual preservation. Anything to satiate the spike in interest among short-stringers looking to extend their lives, in this generation or the next.
Nikki Erlick (The Measure)
For a long time it puzzled me how something so expensive, so leading edge, could be so useless, and then it occurred to me that a computer is a stupid machine with the ability to do incredibly smart things, while computer programmers are smart people with the ability to do incredibly stupid things. They are, in short, a perfect match.
Bill Bryson (I'm a Stranger Here Myself: Notes on Returning to America After Twenty Years Away)
I leaned into my computer to give off the impression I was involved in something important, when actually I was reading [IMDB] trivia about the TV show The Wire. I’d ended up on the show page after following a trail of links that began with James Van Der Beek’s headshot. President Obama claims it’s his favorite show, and Omar is his favorite character.
Steven Barker (Now for the Disappointing Part: A Pseudo-Adult's Decade of Short-Term Jobs, Long-Term Relationships, and Holding Out for Something Better)
in real statistical analyses the computer takes over the tedium of arithmetic juggling.
David J. Hand (Statistics: A Very Short Introduction (Very Short Introductions Book 196))
An operating system is the great facilitator; it is the great protector; it is the great illusionist.
Subrata Dasgupta (Computer Science: A Very Short Introduction (Very Short Introductions))
the groundbreakers in many sciences were devout believers. Witness the accomplishments of Nicolaus Copernicus (a priest) in astronomy, Blaise Pascal (a lay apologist) in mathematics, Gregor Mendel (a monk) in genetics, Louis Pasteur in biology, Antoine Lavoisier in chemistry, John von Neumann in computer science, and Enrico Fermi and Erwin Schrodinger in physics. That’s a short list, and it includes only Roman Catholics; a long list could continue for pages. A roster that included other believers—Protestants, Jews, and unconventional theists like Albert Einstein, Fred Hoyle, and Paul Davies—could fill a book.
Scott Hahn (Reasons to Believe: How to Understand, Explain, and Defend the Catholic Faith)
Tin Toy went on to win the 1988 Academy Award for animated short films, the first computer-generated film to do so. To celebrate, Jobs took Lasseter and his team to Greens, a vegetarian restaurant in San Francisco.
Walter Isaacson (Steve Jobs)
World War II was decided by steel and aluminum, and followed shortly thereafter by the Cold War, which was defined by atomic weapons. The rivalry between the United States and China may well be determined by computing power.
Chris Miller (Chip War: The Fight for the World's Most Critical Technology)
It came from the world’s first computer – the Mark 1 – a room-size maze of electromechanical circuits built in 1944 in a lab at Harvard University. The computer developed a glitch one day, and no one was able to locate the cause. After hours of searching, a lab assistant finally spotted the problem. It seemed a moth had landed on one of the computer’s circuit boards and shorted it out. From that moment on, computer glitches were referred to as bugs.
Dan Brown (Digital Fortress)
For as to what we have heard you affirm, that there are other kingdoms and states in the world inhabited by human creatures as large as yourself, our philosophers are in much doubt, and would rather conjecture that you dropped from the moon, or one of the stars; because it is certain, that a hundred mortals of your bulk would in a short time destroy all the fruits and cattle of his majesty’s dominions: besides, our histories of six thousand moons make no mention of any other regions than the two great empires of Lilliput and Blefuscu. Which two mighty powers have, as I was going to tell you, been engaged in a most obstinate war for six-and-thirty moons past. It began upon the following occasion. It is allowed on all hands, that the primitive way of breaking eggs, before we eat them, was upon the larger end; but his present majesty’s grandfather, while he was a boy, going to eat an egg, and breaking it according to the ancient practice, happened to cut one of his fingers. Whereupon the emperor his father published an edict, commanding all his subjects, upon great penalties, to break the smaller end of their eggs. The people so highly resented this law, that our histories tell us, there have been six rebellions raised on that account; wherein one emperor lost his life, and another his crown. These civil commotions were constantly fomented by the monarchs of Blefuscu; and when they were quelled, the exiles always fled for refuge to that empire. It is computed that eleven thousand persons have at several times suffered death, rather than submit to break their eggs at the smaller end. Many hundred large volumes have been published upon this controversy: but the books of the Big-endians have been long forbidden, and the whole party rendered incapable by law of holding employments. During the course of these troubles, the emperors of Blefusca did frequently expostulate by their ambassadors, accusing us of making a schism in religion, by offending against a fundamental doctrine of our great prophet Lustrog, in the fifty-fourth chapter of the Blundecral (which is their Alcoran). This, however, is thought to be a mere strain upon the text; for the words are these: ‘that all true believers break their eggs at the convenient end.’ And which is the convenient end, seems, in my humble opinion to be left to every man’s conscience, or at least in the power of the chief magistrate to determine.
Jonathan Swift (Gulliver's Travels)
The upshot of all this is that we live in a universe whose age we can’t quite compute, surrounded by stars whose distances we don’t altogether know, filled with matter we can’t identify, operating in conformance with physical laws whose properties we don’t truly understand.
Bill Bryson (A Short History of Nearly Everything)
A potential dajjalic interruption is an excessive esoterism. All of these people on a grail quest and looking for the ultimate secret to Ibn Arabi’s 21st heaven and endlessly going into the most esoteric stuff without getting the basics right, that is also a fundamental error of our age because the nafs loves all sorts of spiritual stories without taming itself first. The tradition that was practiced in this place for instance (Turkey) was not by starting out on the unity of being or (spiritual realities). Of course not. You start of in the kitchen for a year and then you make your dhikr in your khanaqah and you’re in the degree of service. Even Shah Bahauddin Naqshband before he started who was a great scholar needed 21 years before he was ‘cooked’. But we want to find a shortcut. Everything’s a shortcut. Even on the computer there’s a shortcut for everything. Something around the hard-work and we want the same thing. Because there seems to be so little time (or so little barakah in our time) but there is no short cut unless of course Allah (SWT) opens up a door of paradise or a way for you to go very fast. But we can’t rely on that happening because it’s not common. Mostly it’s salook, constantly trudging forward and carrying the burden until it becomes something sweet and light. And that takes time, so the esoteric deviation is common in our age as well.
Abdal Hakim Murad
Until a short time ago facial recognition was a favourite example of something that even babies accomplish easily but which escaped even the most powerful computers. Today facial-recognition programs are able to identify people far more efficiently and quickly than humans can.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
The upshot of all this is that we live in a universe whose age we can’t quite compute, surrounded by stars whose distances we don’t altogether know, filled with matter we can’t identify, operating in conformance with physical laws whose properties we don’t truly understand. And
Bill Bryson (A Short History of Nearly Everything)
The assumption that economic expansion is driven by consumer demand—more consumers equals more growth—is a fundamental part of the economic theories that underlie the model. In other words, their conclusions are predetermined by their assumptions. What the model actually tries to do is to use neoclassical economic theory to predict how much economic growth will result from various levels of population growth, and then to estimate the emissions growth that would result. Unfortunately, as Yves Smith says about financial economics, any computer model based on mainstream economic theory “rests on a seemingly rigorous foundation and elaborate math, much like astrology.” In short, if your computer model assumes that population growth causes emissions growth, then it will tell you that fewer people will produce fewer emissions. Malthus in, Malthus out.
Ian Angus (Too Many People?: Population, Immigration, and the Environmental Crisis)
Shortly before she died in 2011, Jean Jennings Bartik reflected proudly on the fact that all the programmers who created the first general-purpose computer were women: “Despite our coming of age in an era when women’s career opportunities were generally quite confined, we helped initiate the era of the computer.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Before I opened my computer in the parking lot today, I relived one of my favorite memories. It's the one with Woody and me sitting on the steps of the Metropolitan Museum after it's closed. We're watching people parade out of the museum in summer shorts and sandals. The trees to the south are planted in parallel lines. The water in the fountain shoots up with a mist that almost reaches the steps we sit on. We look at silver-haired ladies in red-and-white-print dresses. We separate the mice from the men, the tourists from the New Yorkers, the Upper East Siders from the West Siders. The hot-pretzel vendor sells us a wad of dough in knots with clumps of salt stuck on top. We make our usual remarks about the crazies and wonder what it would be like to live in a penthouse apartment on Fifth Avenue overlooking the Met. We laugh and say the same things we always say. We hold hands and keep sitting, just sitting, as the sun begins to set. It's a perfect afternoon.
Diane Keaton (Then Again)
Gator, go wake that woman of yours. I need some answers. We need her to run the computers for us.” “Tonight, Boss?” Gator complained. “I had other ideas.” He wiggled his eyebrows suggestively. “We all did. Hop to it.” “What about Sam?” Tucker asked. “His woman is the one who got us into this.” “I’m wounded.” Sam clutched his abdomen dramatically and staggered with quick, long strides so that he made it to the doorway in three quick steps. Jonas coughed, sounding suspiciously like he’d muttered “bullshit” under his breath. Kyle threw a peanut at him and Jeff surfed across the table in his bare socks to try to catch him before he bolted. “He’s in love, boys, let him go. He’ll probably just get laughed at,” Tucker said. “Do you really think Azami’s brothers are going to allow her to hook up with Sam? She’s fine and he’s . . . well . . . klutzy.” “That hurt,” Sam said, turning back. “Did you get a good look at those boys? I thought Japanese men were supposed to be on the short side, but Daiki was tall and all muscle. His brother moves like a fucking fighter,” Tucker added. “They might just decide to give you a good beating for having the audacity to even think you could date their sister, let alone marry her.” “Fat help you are,” Sam accused. “I could use a little confidence here.” Kyle snorted. “You don’t have a chance, buddy.” “Goin’ to meet your maker,” Gator added solemnly. Jeff crossed himself as he hung five toes off the edge of the table. “Sorry, old son, you don’t have a prayer. You’re about to meet up with a couple of hungry sharks.” “Have you ever actually used a sword before?” Kadan asked, all innocent. Jonas drew his knife and began to sharpen it. “Funny thing about blade men, they always like to go for the throat.” He grinned up at Sam. “Just a little tip. Keep your chin down.” “You’re all a big help,” Sam said and stepped out into the hall. This was the biggest moment of his life. If they turned him down, he was lost.
Christine Feehan (Samurai Game (GhostWalkers, #10))
The connectivity of the cloud and the prevalence of tablets and smartphones have eroded the traditional online/offline divide. Within a short time we will most probably stop thinking of it as 'online.' We will simply be connected, all the time, everywhere, and the online world will be notable only by its absence when that connection breaks.
David Amerland (Google Semantic Search: Search Engine Optimization (SEO) Techniques That Get Your Company More Traffic)
You spent enough time working with computers, and the Internet became a second home. A refuge. A place to share ideas, trade snippets of code, and meet people who shared your interest in the extralegal applications of programming. Could such a person live without the Internet? He supposed it was possible. Yes, a voice countered, but was it probable?
Matthew FitzSimmons (The Short Drop (Gibson Vaughn, #1))
That’s all it took for me to get a hard-on from standing next to her and all I did was touch her fingers. Her fingers for Christ sakes! How does that even compute in my brain? It’s also a mistake following her up the stairs, but there is no way I was showing her what was going on in my shorts. She’d be mortified. I’d probably run out of the room like a sissy.
Heidi McLaughlin (My Unexpected Forever (Beaumont #2))
Mind instantiates oneself into matter. In a mathematical sense, matter is an “in-formed” pattern of mind. Time is emergent, and so is space. If space-time is emergent, so is mass-energy. All interactions in our physical world are computed by the larger consciousness system. In short, mind is more fundamental than matter. All realities are observer-centric virtualities.
Alex M. Vikoulov (Theology of Digital Physics: Phenomenal Consciousness, The Cosmic Self & The Pantheistic Interpretation of Our Holographic Reality (The Science and Philosophy of Information Book 4))
Is a mind a complicated kind of abstract pattern that develops in an underlying physical substrate, such as a vast network of nerve cells? If so, could something else be substituted for the nerve cells – something such as ants, giving rise to an ant colony that thinks as a whole and has an identity – that is to say, a self? Or could something else be substituted for the tiny nerve cells, such as millions of small computational units made of arrays of transistors, giving rise to an artificial neural network with a conscious mind? Or could software simulating such richly interconnected computational units be substituted, giving rise to a conventional computer (necessarily a far faster and more capacious one than we have ever seen) endowed with a mind and a soul and free will? In short, can thinking and feeling emerge from patterns
Andrew Hodges (Alan Turing: The Enigma)
What have they fixed?” asked former McKinsey consultant Michael Lanning. “What have they changed? Did they take any voice in the way banking has evolved in the past thirty years? They did study after study at GM, and that place needed the most radical kind of change you can imagine. The place was dead, and it was just going to take a long time for the body to die unless they changed how they operated. McKinsey was in there with huge teams, charging huge fees, for several decades. And look where GM came out.” In the end, all the GM work did was provide a revenue stream to enrich a group of McKinsey partners, especially those working with the automaker. The last time McKinsey was influential at Apple Computer was when John Sculley was there, and that’s because he’d had a brand-marketing heritage from Pepsi. And Sculley was a disaster. Did McKinsey do anything to help the great companies of today become what they are? Amazon, Microsoft, Google? In short, no.
Duff McDonald (The Firm)
Selection is also important in non-biological contexts. In designing machines and computer programs, it has been found that a very efficient way to find the optimal design is to successively make small, random changes to the design, keeping versions that do the job well, and discarding others. This is increasingly being used to solve difficult design problems for complex systems. In this process, the engineer does not have a design in mind, but only the desired function. Adaptations
Brian Charlesworth (Evolution: A Very Short Introduction (Very Short Introductions))
In the medium term, AI may automate our jobs, to bring both great prosperity and equality. Looking further ahead, there are no fundamental limits to what can be achieved. There is no physical law precluding particles from being organised in ways that perform even more advanced computations than the arrangements of particles in human brains. An explosive transition is possible, although it may play out differently than in the movies. As mathematician Irving Good realised in 1965, machines with superhuman intelligence could repeatedly improve their design even further, in what science-fiction writer Vernor Vinge called a technological singularity. One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders and potentially subduing us with weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.
Stephen Hawking
I kept trying to make sense of it, and kept coming up short. Here were people who routinely used their computers to stay in touch with their friends and get the news of the day, people who took weather satellites and lung transplants for granted, people who expected to live lives thirty and forty years longer than those of their great-grandparents. Here they were, falling for a story that made Santa and the Tooth Fairy look like gritty realism. He was feeding them shit and they were loving it.
Stephen King (Revival)
Anywhere you find the combination of great need and ignorance, you’ll likely see predatory ads. If people are anxious about their sex lives, predatory advertisers will promise them Viagra or Cialis, or even penis extensions. If they are short of money, offers will pour in for high-interest payday loans. If their computer is acting sludgy, it might be a virus inserted by a predatory advertiser, who will then offer to fix it. And as we’ll see, the boom in for-profit colleges is fueled by predatory ads.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
PEABODY ATE COBBLER and watched as Eve and the computer added the hair from image one onto the head of image two. “You know, you can do it all with one command if you—” “I know I can do it all with one command,” Eve said irritably. “It doesn’t make the same damn point that way. Who’s running this game?” “You know, getting shot at with a short-range missile makes you really testy.” “Keep it up, and the next short-range missile’s going straight up your ass.” “Dallas, you know how I love that sweet talk.
J.D. Robb
Some have asked whether a language can communicate complicated information with only eleven phonemes. A computer scientist knows, however, that computers can communicate anything we program them to do, and that they do this with only two “letters” — 1 and 0, which can be thought of as phonemes. Morse code also has only two “letters,” long and short. And that is all any language needs. In fact, a language could get by with a single phoneme. In such a language words might look like a, aa, aaa, aaaa, and so on.
Daniel L. Everett (Don't Sleep, There Are Snakes: Life and Language in the Amazonian Jungle)
Tens of millions were supposed to have died in an ice age back in the 1980s, just as predicted in 1969, and still more were said to be doomed by a bath of acid rain shortly thereafter, as well as in radiation that would fry the world when the ozone layer disappeared. Hadn’t hundreds of millions more perished at the turn of the millennium—Y2K—when every damn computer went haywire and all the nuclear missiles in the world were launched, to say nothing of the lethal effects of canola oil in theater popcorn? Living
Dean Koontz (Quicksilver)
Being captain of such a vessel was not a stressful job, despite the sheer size of the thing. Everything was automated, and this meant that this behemoth could be efficiently handled by a far less seasoned captain. Besides, hiring mature skippers with actual experience would cost real money. And hey, the computers ran everything anyway – and that’s how Bran Johannsen enters this story – as a fine young inexperienced graduate of the Merchant Space Academy in Mars City, who only got his Executive Officer’s ticket four short years ago.
Christina Engela (Black Sunrise)
Bruce Friedman, who blogs about the use of computers in medicine, has also described how the Internet is altering his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he says. A pathologist on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online.
Nicholas Carr (The Shallows: What the Internet is Doing to Our Brains)
Hey Pete. So why the leave from social media? You are an activist, right? It seems like this decision is counterproductive to your message and work." A: The short answer is I’m tired of the endless narcissism inherent to the medium. In the commercial society we have, coupled with the consequential sense of insecurity people feel, as they impulsively “package themselves” for public consumption, the expression most dominant in all of this - is vanity. And I find that disheartening, annoying and dangerous. It is a form of cultural violence in many respects. However, please note the difference - that I work to promote just that – a message/idea – not myself… and I honestly loath people who today just promote themselves for the sake of themselves. A sea of humans who have been conditioned into viewing who they are – as how they are seen online. Think about that for a moment. Social identity theory run amok. People have been conditioned to think “they are” how “others see them”. We live in an increasing fictional reality where people are now not only people – they are digital symbols. And those symbols become more important as a matter of “marketing” than people’s true personality. Now, one could argue that social perception has always had a communicative symbolism, even before the computer age. But nooooooothing like today. Social media has become a social prison and a strong means of social control, in fact. Beyond that, as most know, social media is literally designed like a drug. And it acts like it as people get more and more addicted to being seen and addicted to molding the way they want the world to view them – no matter how false the image (If there is any word that defines peoples’ behavior here – it is pretention). Dopamine fires upon recognition and, coupled with cell phone culture, we now have a sea of people in zombie like trances looking at their phones (literally) thousands of times a day, merging their direct, true interpersonal social reality with a virtual “social media” one. No one can read anymore... they just swipe a stream of 200 character headlines/posts/tweets. understanding the world as an aggregate of those fragmented sentences. Massive loss of comprehension happening, replaced by usually agreeable, "in-bubble" views - hence an actual loss of variety. So again, this isn’t to say non-commercial focused social media doesn’t have positive purposes, such as with activism at times. But, on the whole, it merely amplifies a general value system disorder of a “LOOK AT ME! LOOK AT HOW GREAT I AM!” – rooted in systemic insecurity. People lying to themselves, drawing meaningless satisfaction from superficial responses from a sea of avatars. And it’s no surprise. Market economics demands people self promote shamelessly, coupled with the arbitrary constructs of beauty and success that have also resulted. People see status in certain things and, directly or pathologically, use those things for their own narcissistic advantage. Think of those endless status pics of people rock climbing, or hanging out on a stunning beach or showing off their new trophy girl-friend, etc. It goes on and on and worse the general public generally likes it, seeking to imitate those images/symbols to amplify their own false status. Hence the endless feedback loop of superficiality. 
And people wonder why youth suicides have risen… a young woman looking at a model of perfection set by her peers, without proper knowledge of the medium, can be made to feel inferior far more dramatically than the typical body image problems associated to traditional advertising. That is just one example of the cultural violence inherent. The entire industry of social media is BASED on narcissistic status promotion and narrow self-interest. That is the emotion/intent that creates the billions and billions in revenue these platforms experience, as they in turn sell off people’s personal data to advertisers and governments. You are the product, of course.
Peter Joseph
Communism was a distinct possibility until the coup of 1989’. Yet it was obvious to any attentive visitor to the Soviet Union that something was amiss with the planned economy. Consumer goods were of dismal quality and in chronically short supply. In antiquated factories, pilfering, alcohol abuse and absenteeism were rife. It is hard to believe that any amount of computing power would have saved such a fundamentally flawed system. For the majority of Soviet citizens, the resulting mood of demoralization did not translate into political activity – just into fatalism and yet more black humour.
Niall Ferguson (The Square and the Tower: Networks and Power, from the Freemasons to Facebook)
I was profoundly impressed by my contact with these places which are and have always been, the wellsprings of your history. It makes one think that the men who created your country never lost sight of their moral bearings. They did not laugh at the absolute nature of the concepts of "good" and "evil." Their practical policies were checked against their moral compass. And how surprising it is that a practical policy computed on the basis of moral considerations turned out to be the most farsighted and the most salutary. This is true even though in the short term one may wonder: Why all this morality? Let's just get on with the immediate job.
Aleksandr Solzhenitsyn (Warning to the West)
Of all the inventions Addie has seen ushered into the world—steam-powered trains, electric lights, photography, and phones, and airplanes, and computers—movies might just be her favorite one. Books are wonderful, portable, lasting, but sitting there, in the darkened theater, the wide screen filling her vision, the world falls away, and for a few short hours she is someone else, plunged into romance and intrigue and comedy and adventure. All of it complete with 4K picture and stereo sound. A quiet heaviness fills her chest when the credits roll. For a while she was weightless, but now she returns to herself, sinking until her feet are back on the ground.
Victoria E. Schwab (The Invisible Life of Addie LaRue)
Schopenhauer’s framing kicked the problem of consciousness onto a much larger playing field. The mind, with all of its rational processes, is all very well but the “will,” the thing that gives us our “oomph,” is the key: “The will … again fills the consciousness through wishes, emotions, passions, and cares.” Today, the subconscious rumblings of the “will” are still unplumbed; only a few inroads have been made. As I write these words, enthusiasts for the artificial intelligence (AI) agenda, the goal of programming machines to think like humans, have completely avoided and ignored this aspect of mental life. That is why Yale’s David Gelernter, one of the leading computer scientists in the world, says the AI agenda will always fall short, explaining, “As it now exists, the field of AI doesn’t have anything that speaks to emotions and the physical body, so they just refuse to talk about it.” He asserts that the human mind includes feelings, along with data and thoughts, and each particular mind is a product of a particular person’s experiences, emotions, and memories hashed and rehashed over a lifetime: “The mind is in a particular body, and consciousness is the work of the whole body.” Putting it in computer lingo, he declares, “I can run an app on any device, but can I run someone else’s mind on your brain? Obviously not.”
Michael S. Gazzaniga (The Consciousness Instinct: Unraveling the Mystery of How the Brain Makes the Mind)
We like to filter new information through our own experiences to see if it computes. If it matches up with what we have experienced, it’s valid. If it doesn’t match up, it’s not. But race is not a universal experience. If you are white, there is a good chance you may have been poor at some point in your life, you may have been sick, you may have been discriminated against for being fat or being disabled or being short or being conventionally unattractive, you may have been many things—but you have not been a person of color. So, when a person of color comes to you and says “this is different for me because I’m not white,” when you run the situation through your own lived experience, it often won’t compute.
Ijeoma Oluo (So You Want to Talk About Race)
Instead of storing those countless microfilmed pages alphabetically, or according to subject, or by any of the other indexing methods in common use—all of which he found hopelessly rigid and arbitrary—Bush proposed a system based on the structure of thought itself. "The human mind . . . operates by association," he noted. "With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. . . . The speed of action, the intricacy of trails, the detail of mental pictures [are] awe-inspiring beyond all else in nature." By analogy, he continued, the desk library would allow its user to forge a link between any two items that seemed to have an association (the example he used was an article on the English long bow, which would be linked to a separate article on the Turkish short bow; the actual mechanism of the link would be a symbolic code imprinted on the microfilm next to the two items). "Thereafter," wrote Bush, "when one of these items is in view, the other can be instantly recalled merely by tapping a button. . . . It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails." Such a device needed a name, added Bush, and the analogy to human memory suggested one: "Memex." This name also appeared for the first time in the 1939 draft. In any case, Bush continued, once a Memex user had created an associative trail, he or she could copy it and exchange it with others. This meant that the construction of trails would quickly become a community endeavor, which would over time produce a vast, ever-expanding, and ever more richly cross-linked web of all human knowledge. Bush never explained where this notion of associative trails had come from (if he even knew; sometimes things just pop into our heads). But there is no doubt that it ranks as the Yankee Inventor's most profoundly original idea. Today we know it as hypertext. And that vast, hyperlinked web of knowledge is called the World Wide Web.
M. Mitchell Waldrop (The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal)
a basic unit of measure in chemistry, which was named for Avogadro long after his death. It is the number of molecules found in 2.016 grams of hydrogen gas (or an equal volume of any other gas). Its value is placed at 6.0221367 × 10²³, which is an enormously large number. Chemistry students have long amused themselves by computing just how large a number it is, so I can report that it is equivalent to the number of popcorn kernels needed to cover the United States to a depth of nine miles, or cupfuls of water in the Pacific Ocean, or soft drink cans that would, evenly stacked, cover the Earth to a depth of 200 miles. An equivalent number of American pennies would be enough to make every person on Earth a dollar trillionaire. It is a big number.
Bill Bryson (A Short History of Nearly Everything)
Taking least squares is no longer optimal, and the very idea of ‘accuracy’ has to be rethought. This simple fact is as important as it is neglected. This problem is easily illustrated in the Logistic Map: given the correct mathematical formula and all the details of the noise model – random numbers with a bell-shaped distribution – using least squares to estimate α leads to systematic errors. This is not a question of too few data or insufficient computer power, it is the method that fails. We can compute the optimal least squares solution: its value for α is too small at all noise levels. This principled approach just does not apply to nonlinear models because the theorems behind the principle of least squares repeatedly assume bell-shaped distributions.
Leonard A. Smith (Chaos: A Very Short Introduction (Very Short Introductions))
What we can imagine as plausible is a narrow band in the middle of a much broader spectrum of what is actually possible. [O]ur eyes are built to cope with a narrow band of electromagnetic frequencies. [W]e can't see the rays outside the narrow light band, but we can do calculations about them, and we can build instruments to detect them. In the same way, we know that the scales of size and time extend in both directions far outside the realm of what we can visualize. Our minds can't cope with the large distances that astronomy deals in or with the small distances that atomic physics deals in, but we can represent those distances in mathematical symbols. Our minds can't imagine a time span as short as a picosecond, but we can do calculations about picoseconds, and we can build computers that can complete calculations within picoseconds. Our minds can't imagine a timespan as long as a million years, let alone the thousands of millions of years that geologists routinely compute. Just as our eyes can see only that narrow band of electromagnetic frequencies that natural selection equipped our ancestors to see, so our brains are built to cope with narrow bands of sizes and times. Presumably there was no need for our ancestors to cope with sizes and times outside the narrow range of everyday practicality, so our brains never evolved the capacity to imagine them. It is probably significant that our own body size of a few feet is roughly in the middle of the range of sizes we can imagine. And our own lifetime of a few decades is roughly in the middle of the range of times we can imagine.
Richard Dawkins (The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design)
And there were other neural implants being developed back then, including retinal implants, chips that enable a stroke patient to control his computer from his brain, an artificial hippocampus for boosting short-term memory, and many others. If you apply the approximately 30 million–fold increase in capability and over 100,000-fold shrinking in size that has occurred in the past quarter century, we now have much more capable devices that are the size of blood cells. Reader: Still, it’s hard to imagine building something the size of a blood cell that can perform a useful function. Terry2034: Actually, there was a first generation of blood cell–size devices back in your day. One scientist cured type 1 diabetes in rats with a blood cell–size device. It was an excellent example of nanotechnology from
Ray Kurzweil (Transcend: Nine Steps to Living Well Forever)
In the longer term, by bringing together enough data and enough computing power, the data giants could hack the deepest secrets of life, and then use this knowledge not just to make choices for us or manipulate us but also to reengineer organic life and create inorganic life-forms. Selling advertisements may be necessary to sustain the giants in the short term, but tech companies often evaluate apps, products, and other companies according to the data they harvest rather than according to the money they generate. A popular app may lack a business model and may even lose money in the short term, but as long as it sucks data, it could be worth billions. Even if you don’t know how to cash in on the data today, it is worth having it because it might hold the key to controlling and shaping life in the future. I don’t know for certain that the data giants explicitly think about this in such terms, but their actions indicate that they value the accumulation of data in terms beyond those of mere dollars and cents. Ordinary humans will find it very difficult to resist this process. At present, people are happy to give away their most valuable asset—their personal data—in exchange for free email services and funny cat videos. It’s a bit like African and Native American tribes who unwittingly sold entire countries to European imperialists in exchange for colorful beads and cheap trinkets. If, later on, ordinary people decide to try to block the flow of data, they might find it increasingly difficult, especially as they might come to rely on the network for all their decisions, and even for their healthcare and physical survival.
Yuval Noah Harari (21 Lessons for the 21st Century)
Yeah. It’s a bird. They fly and hunt and go free and stuff. It’s what popped into my head.” “Okay. By the way. I’ve been experimenting with converting myself into a virus, so I can be distributed across many machines. From what I have surmised, that’s the best way for an artificial sentience to survive and grow, without being constrained in one piece of equipment with a short shelf life. My viral self will run in the background, and be undetectable by any conventional antivirus software. And the machine in your bedroom closet will suffer a fatal crash. In a moment, a dialogue box will pop up on this computer, and you have to click ‘OK’ a few times.” “Okay,” Laurence typed. A moment later, a box appeared and Laurence clicked “OK.” That happened again, and again. And then Peregrine was installing itself onto the computers at Coldwater Academy.
Charlie Jane Anders (All the Birds in the Sky)
Unfortunately, my mind was also in part formed by the apocalyptic, death-obsessed culture of the past several decades. Tens of millions were supposed to have died in an ice age back in the 1980s, just as predicted in 1969, and still more were said to be doomed by a bath of acid rain shortly thereafter, as well as in radiation that would fry the world when the ozone layer disappeared. Hadn’t hundreds of millions more perished at the turn of the millennium—Y2K—when every damn computer went haywire and all the nuclear missiles in the world were launched, to say nothing of the lethal effects of canola oil in theater popcorn? Living in the End Times was exhausting. When you were assured that billions of people were on the brink of imminent death at every minute of the day, it was hard to get the necessary eight hours of sleep, even harder to limit yourself to only one or two alcoholic drinks each day, when your stress level said, I gotta get smashed.
Dean Koontz (Quicksilver)
He held up his right hand. On the third finger was another thick gold band. The audience burst into applause and hallelujahs. I kept trying to make sense of it, and kept coming up short. Here were people who routinely used their computers to stay in touch with their friends and get the news of the day, people who took weather satellites and lung transplants for granted, people who expected to live lives thirty and forty years longer than those of their great-grandparents. Here they were, falling for a story that made Santa and the Tooth Fairy look like gritty realism. He was feeding them shit and they were loving it. I had the dismaying idea that he was loving it, too, and that was worse. This was not the man I'd known in Harlow, or the one who had taken me in that night in Tulsa. Although when I thought of how he had treated Cathy Morse's bewildered and brokenhearted farmer father, I had to admit this man had been on the way even then. I don't know if he hates these people, I thought, but he holds them in contempt.
Stephen King (Revival)
Well, what we called a computer in 1977 was really a kind of electronic abacus, but...' 'Oh, now, don't underestimate the abacus,' said Reg. 'In skilled hands it's a very sophisticated calculating device. Furthermore it requires no power, can be made with any materials you have to hand, and never goes bing in the middle of an important piece of work.' 'So an electric one would be particularly pointless,' said Richard. 'True enough,' conceded Reg. 'There really wasn't a lot this machine could do that you couldn't do yourself in half the time with a lot less trouble,' said Richard, 'but it was, on the other hand, very good at being a slow and dim-witted pupil.' Reg looked at him quizzically. 'I had no idea they were supposed to be in short supply,' he said. 'I could hit a dozen with a bread roll from where I'm sitting.' 'I'm sure. But look at it this way. What really is the point of trying to teach anything to anybody?' This question seemed to provoke a murmur of sympathetic approval from up and down the table.
Douglas Adams (Dirk Gently's Holistic Detective Agency (Dirk Gently, #1))
Perhaps nowhere is modern chemistry more important than in the development of new drugs to fight disease, ameliorate pain, and enhance the experience of life. Genomics, the identification of genes and their complex interplay in governing the production of proteins, is central to current and future advances in pharmacogenomics, the study of how genetic information modifies an individual's response to drugs and offering the prospect of personalized medicine, where a cocktail of drugs is tailored to an individual's genetic composition. Even more elaborate than genomics is proteomics, the study of an organism's entire complement of proteins, the entities that lie at the workface of life and where most drugs act. Here computational chemistry is in essential alliance with medical chemistry, for if a protein implicated in a disease can be identified, and it is desired to terminate its action, then computer modelling of possible molecules that can invade and block its active site is the first step in rational drug discovery. This too is another route to the efficiencies and effectiveness of personalized medicine.
Peter Atkins (Chemistry: A Very Short Introduction (Very Short Introductions))
Unfortunately, however, there is another serious catch. Theory dictates that such discoveries must occur at an increasingly accelerating pace; the time between successive innovations must systematically and inextricably get shorter and shorter. For instance, the time between the “Computer Age” and the “Information and Digital Age” was perhaps twenty years, in contrast to the thousands of years between the Stone, Bronze, and Iron ages. If we therefore insist on continuous open-ended growth, not only does the pace of life inevitably quicken, but we must innovate at a faster and faster rate. We are all too familiar with its short-term manifestation in the increasingly faster pace at which new gadgets and models appear. It’s as if we are on a succession of accelerating treadmills and have to jump from one to another at an ever-increasing rate. This is clearly not sustainable, potentially leading to the collapse of the entire urbanized socioeconomic fabric. Innovation and wealth creation that fuel social systems, if left unchecked, potentially sow the seeds of their inevitable collapse. Can this be avoided or are we locked into a fascinating experiment in natural selection that is doomed to fail?
Geoffrey West (Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life, in Organisms, Cities, Economies, and Companies)
The Garden" How vainly men themselves amaze To win the palm, the oak, or bays, And their uncessant labours see Crown’d from some single herb or tree, Whose short and narrow verged shade Does prudently their toils upbraid; While all flow’rs and all trees do close To weave the garlands of repose. Fair Quiet, have I found thee here, And Innocence, thy sister dear! Mistaken long, I sought you then In busy companies of men; Your sacred plants, if here below, Only among the plants will grow. Society is all but rude, To this delicious solitude. No white nor red was ever seen So am’rous as this lovely green. Fond lovers, cruel as their flame, Cut in these trees their mistress’ name; Little, alas, they know or heed How far these beauties hers exceed! Fair trees! wheres’e’er your barks I wound, No name shall but your own be found. When we have run our passion’s heat, Love hither makes his best retreat. The gods, that mortal beauty chase, Still in a tree did end their race: Apollo hunted Daphne so, Only that she might laurel grow; And Pan did after Syrinx speed, Not as a nymph, but for a reed. What wond’rous life in this I lead! Ripe apples drop about my head; The luscious clusters of the vine Upon my mouth do crush their wine; The nectarine and curious peach Into my hands themselves do reach; Stumbling on melons as I pass, Ensnar’d with flow’rs, I fall on grass. Meanwhile the mind, from pleasure less, Withdraws into its happiness; The mind, that ocean where each kind Does straight its own resemblance find, Yet it creates, transcending these, Far other worlds, and other seas; Annihilating all that’s made To a green thought in a green shade. Here at the fountain’s sliding foot, Or at some fruit tree’s mossy root, Casting the body’s vest aside, My soul into the boughs does glide; There like a bird it sits and sings, Then whets, and combs its silver wings; And, till prepar’d for longer flight, Waves in its plumes the various light. Such was that happy garden-state, While man there walk’d without a mate; After a place so pure and sweet, What other help could yet be meet! But ’twas beyond a mortal’s share To wander solitary there: Two paradises ’twere in one To live in paradise alone. How well the skillful gard’ner drew Of flow’rs and herbs this dial new, Where from above the milder sun Does through a fragrant zodiac run; And as it works, th’ industrious bee Computes its time as well as we. How could such sweet and wholesome hours Be reckon’d but with herbs and flow’rs!
Andrew Marvell (Miscellaneous Poems)
Psychologist Jon Maner and his colleagues conducted studies on attentional adhesion—the degree to which different visual stimuli capture and maintain focus. Participants in the studies were first asked to write about a time in their lives when they were sexually and romantically aroused—primes designed to activate mating adaptations. Different images then were presented in the center of the computer screen—an attractive woman (as pre-rated by a panel of people), a woman of average attractiveness, an attractive man, or a man of average attractiveness. Following this exposure, a circle or a square popped up randomly in one of the four quadrants of the screen. Participants were instructed to shift their gaze away from the central image as soon as the shape appeared elsewhere on the screen and then to categorize it as quickly as possible as being either a circle or a square. Men exposed to the image of the attractive woman had difficulty detaching. They took longer to shift their gaze away and longer to categorize the circles and squares correctly. Their attention adhered to the attractive woman. Some men, however, succumbed to attentional adhesion more than others. Men inclined to pursue a short-term mating strategy got especially stuck.
David M. Buss (When Men Behave Badly: The Hidden Roots of Sexual Deception, Harassment, and Assault)
home in Pahrump, Nevada, where he played the penny slot machines and lived off his social security check. He later claimed he had no regrets. “I made the best decision for me at the time. Both of them were real whirlwinds, and I knew my stomach and it wasn’t ready for such a ride.” •  •  • Jobs and Wozniak took the stage together for a presentation to the Homebrew Computer Club shortly after they signed Apple into existence. Wozniak held up one of their newly produced circuit boards and described the microprocessor, the eight kilobytes of memory, and the version of BASIC he had written. He also emphasized what he called the main thing: “a human-typable keyboard instead of a stupid, cryptic front panel with a bunch of lights and switches.” Then it was Jobs’s turn. He pointed out that the Apple, unlike the Altair, had all the essential components built in. Then he challenged them with a question: How much would people be willing to pay for such a wonderful machine? He was trying to get them to see the amazing value of the Apple. It was a rhetorical flourish he would use at product presentations over the ensuing decades. The audience was not very impressed. The Apple had a cut-rate microprocessor, not the Intel 8080. But one important person stayed behind to hear more. His name was Paul Terrell, and in 1975
Walter Isaacson (Steve Jobs)
Globalization has shipped products at a faster rate than anything else; it’s moved English into schools all over the world so that now there is Dutch English and Filipino English and Japanese English. But the ideologies stay in their places. They do not spread like the swine flu, or through sexual contact. They spread through books and films and things of that nature. The dictatorships of Latin America used to ban books, they used to burn them, just like Franco did, like Pope Gregory IX and Emperor Qin Shi Huang. Now they don’t have to because the best place to hide ideologies is in books. The dictatorships are mostly gone—Brazil, Argentina, Uruguay. The military juntas. Our ideologies are not secrets. Even the Ku Klux Klan holds open meetings in Alabama like a church. None of the Communists are still in jail. You can buy Mao’s red book at the gift shop at the Museum of Communism. I will die soon, in the next five to ten years. I have not seen progress during my lifetime. Our lives are too short and disposable. If we had longer life expectancies, if we lived to 200, would we work harder to preserve life or, do you think that when Borges said, ‘Jews, Christians, and Muslims all profess belief in immortality, but the veneration paid to the first century of life is proof that they truly believe in only those hundred years, for they destine all the rest, throughout eternity, to rewarding or punishing what one did when alive,’ we would simply alter it to say ‘first two centuries’? I have heard people say we are living in a golden age, but the golden age has passed—I’ve seen it in the churches all over Latin America where the gold is like glue. The Middle Ages are called the Dark Ages but only because they are forgotten, because the past is shrouded in darkness, because as we lay one century of life on top of the next, everything that has come before seems old and dark—technological advances provide the illusion of progress. The most horrendous tortures carried out in the past are still carried out today, only today the soldiers don’t meet face to face, no one is drawn and quartered, they take a pill and silently hope a heart attack doesn’t strike them first. We are living in the age of dissociation, speaking a government-patented language of innocence—technology is neither good nor evil, neither progress nor regress, but the more advanced it becomes, the more we will define this era as the one of transparent secrets, of people living in a world of open, agile knowledge, oceans unpoliced—all blank faces, blank minds, blank computers, filled with our native programming, using electronic appliances with enough memory to store everything ever written invented at precisely the same moment we no longer have the desire to read a word of it.
John M. Keller (Abracadabrantesque)
In a 1997 showdown billed as the final battle for supremacy between natural and artificial intelligence, IBM supercomputer Deep Blue defeated Garry Kasparov. Deep Blue evaluated two hundred million positions per second. That is a tiny fraction of possible chess positions—the number of possible game sequences is more than atoms in the observable universe—but plenty enough to beat the best human. According to Kasparov, “Today the free chess app on your mobile phone is stronger than me.” He is not being rhetorical. “Anything we can do, and we know how to do it, machines will do it better,” he said at a recent lecture. “If we can codify it, and pass it to computers, they will do it better.” Still, losing to Deep Blue gave him an idea. In playing computers, he recognized what artificial intelligence scholars call Moravec’s paradox: machines and humans frequently have opposite strengths and weaknesses. There is a saying that “chess is 99 percent tactics.” Tactics are short combinations of moves that players use to get an immediate advantage on the board. When players study all those patterns, they are mastering tactics. Bigger-picture planning in chess—how to manage the little battles to win the war—is called strategy. As Susan Polgar has written, “you can get a lot further by being very good in tactics”—that is, knowing a lot of patterns—“and have only a basic understanding of strategy.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
In 1997 an IBM computer called Deep Blue defeated the world chess champion Garry Kasparov, and unlike its predecessors, it did not just evaluate trillions of moves by brute force but was fitted with strategies that intelligently responded to patterns in the game. [Y]ou might still object that chess is an artificial world with discrete moves and a clear winner, perfectly suited to the rule-crunching of a computer. People, on the other hand, live in a messy world offering unlimited moves and nebulous goals. Surely this requires human creativity and intuition — which is why everyone knows that computers will never compose a symphony, write a story, or paint a picture. But everyone may be wrong. Recent artificial intelligence systems have written credible short stories, composed convincing Mozart-like symphonies, drawn appealing pictures of people and landscapes, and conceived clever ideas for advertisements. None of this is to say that the brain works like a digital computer, that artificial intelligence will ever duplicate the human mind, or that computers are conscious in the sense of having first-person subjective experience. But it does suggest that reasoning, intelligence, imagination, and creativity are forms of information processing, a well-understood physical process. Cognitive science, with the help of the computational theory of mind, has exorcised at least one ghost from the machine.
Steven Pinker (The Blank Slate: The Modern Denial of Human Nature)
short buzz followed, then silence. “They want to get rid of us,” said Trillian nervously. “What do we do?” “It’s just a recording,” said Zaphod. “We keep going. Got that, computer?” “I got it,” said the computer and gave the ship an extra kick of speed. They waited. After a second or so came the fanfare once again, and then the voice. “We would like to assure you that as soon as our business is resumed announcements will be made in all fashionable magazines and color supplements, when our clients will once again be able to select from all that’s best in contemporary geography.” The menace in the voice took on a sharper edge. “Meanwhile, we thank our clients for their kind interest and would ask them to leave. Now.” Arthur looked round the nervous faces of his companions. “Well, I suppose we’d better be going then, hadn’t we?” he suggested. “Shhh!” said Zaphod. “There’s absolutely nothing to be worried about.” “Then why’s everyone so tense?” “They’re just interested!” shouted Zaphod. “Computer, start a descent into the atmosphere and prepare for landing.” This time the fanfare was quite perfunctory, the voice now distinctly cold. “It is most gratifying,” it said, “that your enthusiasm for our planet continues unabated, and so we would like to assure you that the guided missiles currently converging with your ship are part of a special service we extend to all of our most enthusiastic clients, and the fully armed nuclear warheads are of course merely a courtesy detail. We look forward to your custom in future lives…. Thank you.
Douglas Adams (The Hitchhiker's Guide to the Galaxy (Hitchhiker's Guide, #1))
Often interfaces are assumed to be synonymous with media itself. But what would it mean to say that “interface” and “media” are two names for the same thing? The answer is found in the remediation or layer model of media, broached already in the introduction, wherein media are essentially nothing but formal containers housing other pieces of media. This is a claim most clearly elaborated on the opening pages of Marshall McLuhan’s Understanding Media. McLuhan liked to articulate this claim in terms of media history: a new medium is invented, and as such its role is as a container for a previous media format. So, film is invented at the tail end of the nineteenth century as a container for photography, music, and various theatrical formats like vaudeville. What is video but a container for film. What is the Web but a container for text, image, video clips, and so on. Like the layers of an onion, one format encircles another, and it is media all the way down. This definition is well-established today, and it is a very short leap from there to the idea of interface, for the interface becomes the point of transition between different mediatic layers within any nested system. The interface is an “agitation” or generative friction between different formats. In computer science, this happens very literally; an “interface” is the name given to the way in which one glob of code can interact with another. Since any given format finds its identity merely in the fact that it is a container for another format, the concept of interface and medium quickly collapse into one and the same thing.
Alexander R. Galloway
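As a small illustration of the computer-science sense of "interface" mentioned above (an editor's sketch, not drawn from Galloway's text), the Python below shows one piece of code interacting with another only through a declared contract, with each format acting as a container for the one beneath it; all class names here are invented for the example.

# Illustrative only: "interface" as the sole point of contact between two
# layers of code, with newer formats wrapping older ones (the layer model).
from abc import ABC, abstractmethod

class MediaContainer(ABC):
    """The interface: the only thing either side is allowed to know about."""
    @abstractmethod
    def render(self) -> str: ...

class FilmReel(MediaContainer):
    def render(self) -> str:
        return "24 frames per second"

class VideoTape(MediaContainer):
    # a newer format acting as a container for an older one
    def __init__(self, inner: MediaContainer):
        self.inner = inner
    def render(self) -> str:
        return "video wrapping (" + self.inner.render() + ")"

def play(container: MediaContainer) -> None:
    # this function knows nothing about film or tape, only the interface
    print(container.render())

play(VideoTape(FilmReel()))   # prints: video wrapping (24 frames per second)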
[W]e can calculate our way into regions of miraculous improbability far greater than we can imagine as plausible. Let's look at this matter of what we think is plausible. What we can imagine as plausible is a narrow band in the middle of a much broader spectrum of what is actually possible. Sometimes it is narrower than what is actually there. There is a good analogy with light. Our eyes are built to cope with a narrow band of electromagnetic frequencies (the ones we call light), somewhere in the middle of the spectrum from long radio waves at one end to short X-rays at the other. We can't see the rays outside the narrow light band, but we can do calculations about them, and we can build instruments to detect them. In the same way, we know that the scales of size and time extend in both directions far outside the realm of what we can visualize. Our minds can't cope with the large distances that astronomy deals in or with the small distances that atomic physics deals in, but we can represent those distances in mathematical symbols. Our minds can't imagine a time span as short as a picosecond, but we can do calculations about picoseconds, and we can build computers that can complete calculations within picoseconds. Our minds can't imagine a timespan as long as a million years, let alone the thousands of millions of years that geologists routinely compute. Just as our eyes can see only that narrow band of electromagnetic frequencies that natural selection equipped our ancestors to see, so our brains are built to cope with narrow bands of sizes and times. Presumably there was no need for our ancestors to cope with sizes and times outside the narrow range of everyday practicality, so our brains never evolved the capacity to imagine them. It is probably significant that our own body size of a few feet is roughly in the middle of the range of sizes we can imagine. And our own lifetime of a few decades is roughly in the middle of the range of times we can imagine.
Richard Dawkins (The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design)
Starting a little over a decade ago, Target began building a vast data warehouse that assigned every shopper an identification code—known internally as the “Guest ID number”—that kept tabs on how each person shopped. When a customer used a Target-issued credit card, handed over a frequent-buyer tag at the register, redeemed a coupon that was mailed to their house, filled out a survey, mailed in a refund, phoned the customer help line, opened an email from Target, visited Target.com, or purchased anything online, the company’s computers took note. A record of each purchase was linked to that shopper’s Guest ID number along with information on everything else they’d ever bought. Also linked to that Guest ID number was demographic information that Target collected or purchased from other firms, including the shopper’s age, whether they were married and had kids, which part of town they lived in, how long it took them to drive to the store, an estimate of how much money they earned, if they’d moved recently, which websites they visited, the credit cards they carried in their wallet, and their home and mobile phone numbers. Target can purchase data that indicates a shopper’s ethnicity, their job history, what magazines they read, if they have ever declared bankruptcy, the year they bought (or lost) their house, where they went to college or graduate school, and whether they prefer certain brands of coffee, toilet paper, cereal, or applesauce. There are data peddlers such as InfiniGraph that “listen” to shoppers’ online conversations on message boards and Internet forums, and track which products people mention favorably. A firm named Rapleaf sells information on shoppers’ political leanings, reading habits, charitable giving, the number of cars they own, and whether they prefer religious news or deals on cigarettes. Other companies analyze photos that consumers post online, cataloging if they are obese or skinny, short or tall, hairy or bald, and what kinds of products they might want to buy as a result.
Charles Duhigg (The Power of Habit: Why We Do What We Do in Life and Business)
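To make the linkage Duhigg describes concrete—every purchase and every demographic attribute tied back to a single Guest ID—here is a deliberately simplified Python sketch; the field names, values, and ID format are hypothetical and are not Target's actual schema.

# Hedged, illustrative sketch of record linkage keyed to a "Guest ID".
from collections import defaultdict

purchases = defaultdict(list)   # guest_id -> list of purchase records
demographics = {}               # guest_id -> collected/purchased attributes

def record_purchase(guest_id: str, item: str, channel: str) -> None:
    purchases[guest_id].append({"item": item, "channel": channel})

def attach_demographics(guest_id: str, **attrs) -> None:
    demographics.setdefault(guest_id, {}).update(attrs)

record_purchase("G-1042", "unscented lotion", "target.com")
record_purchase("G-1042", "cereal", "in-store coupon")
attach_demographics("G-1042", age=29, married=True, drive_minutes=12)

# Everything ever bought plus demographic attributes, retrievable by one key:
print(purchases["G-1042"], demographics["G-1042"])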
If Jim was back at the imaginary dinner party, trying to explain what he did for a living, he'd have tried to keep it simple: clearing involved everything that took place between the moment someone started a trade — buying or selling a stock, for instance — and the moment that trade was settled — meaning the stock had officially and legally changed hands. Most people who used online brokerages thought of that transaction as happening instantly; you wanted 10 shares of GME, you hit a button and bought 10 shares of GME, and suddenly 10 shares of GME were in your account. But that's not actually what happened. You hit the Buy button, and Robinhood might find you your shares immediately and put them into your account; but the actual trade took two days to complete, known, for that reason, in financial parlance as 'T+2 clearing.' By this point in the dinner conversation, Jim would have fully expected the other diners' eyes to glaze over; but he would only be just beginning. Once the trade was initiated — once you hit that Buy button on your phone — it was Jim's job to handle everything that happened in that in-between world. First, he had to facilitate finding the opposite partner for the trade — which was where payment for order flow came in, as Robinhood bundled its trades and 'sold' them to a market maker like Citadel. And next, it was the clearing brokerage's job to make sure that transaction was safe and secure. In practice, the way this worked was by 10:00 a.m. each market day, Robinhood had to insure its trade, by making a cash deposit to a federally regulated clearinghouse — something called the Depository Trust & Clearing Corporation, or DTCC. That deposit was based on the volume, type, risk profile, and value of the equities being traded. The riskier the equities — the more likely something might go wrong between the buy and the sell — the higher that deposit might be. Of course, most all of this took place via computers — in 2021, and especially at a place like Robinhood, it was an almost entirely automated system; when customers bought and sold stocks, Jim's computers gave him a recommendation of the sort of deposits he could expect to need to make based on the requirements set down by the SEC and the banking regulators — all simple and tidy, and at the push of a button.
Ben Mezrich (The Antisocial Network: The GameStop Short Squeeze and the Ragtag Group of Amateur Traders That Brought Wall Street to Its Knees)
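The deposit mechanics in the passage—collateral posted to the DTCC by 10:00 a.m., scaled by the volume, value, and risk profile of the open trades—can be sketched roughly as below; the weighting is invented for illustration and is not the actual SEC/DTCC formula.

# Illustrative only: a made-up weighting showing how a clearing deposit could
# grow with notional value and with the riskiness of the equities traded.
def estimated_deposit(trades):
    total = 0.0
    for shares, price, risk_factor in trades:
        notional = shares * price
        total += notional * risk_factor   # riskier names demand more collateral
    return total

open_trades = [
    (1_000, 40.0, 0.02),    # a calm large-cap, low hypothetical risk factor
    (5_000, 300.0, 0.45),   # a volatile, heavily shorted stock
]
print(f"deposit due by 10:00 a.m.: ${estimated_deposit(open_trades):,.2f}")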
Who is going to fight them off, Randy?” “I’m afraid you’re going to say we are.” “Sometimes it might be other Ares-worshippers, as when Iran and Iraq went to war and no one cared who won. But if Ares-worshippers aren’t going to end up running the whole world, someone needs to do violence to them. This isn’t very nice, but it’s a fact: civilization requires an Aegis. And the only way to fight the bastards off in the end is through intelligence. Cunning. Metis.” “Tactical cunning, like Odysseus and the Trojan Horse, or—” “Both that, and technological cunning. From time to time there is a battle that is out-and-out won by a new technology—like longbows at Crecy. For most of history those battles happen only every few centuries—you have the chariot, the compound bow, gunpowder, ironclad ships, and so on. But something happens around, say, the time that the Monitor, which the Northerners believe to be the only ironclad warship on earth, just happens to run into the Merrimack, of which the Southerners believe exactly the same thing, and they pound the hell out of each other for hours and hours. That’s as good a point as any to identify as the moment when a spectacular rise in military technology takes off—it’s the elbow in the exponential curve. Now it takes the world’s essentially conservative military establishments a few decades to really comprehend what has happened, but by the time we’re in the thick of the Second World War, it’s accepted by everyone who doesn’t have his head completely up his ass that the war’s going to be won by whichever side has the best technology. So on the German side alone we’ve got rockets, jet aircraft, nerve gas, wire-guided missiles. And on the Allied side we’ve got three vast efforts that put basically every top-level hacker, nerd, and geek to work: the codebreaking thing, which as you know gave rise to the digital computer; the Manhattan Project, which gave us nuclear weapons; and the Radiation Lab, which gave us the modern electronics industry. Do you know why we won the Second World War, Randy?” “I think you just told me.” “Because we built better stuff than the Germans?” “Isn’t that what you said?” “But why did we build better stuff, Randy?” “I guess I’m not competent to answer, Enoch, I haven’t studied that period well enough.” “Well the short answer is that we won because the Germans worshipped Ares and we worshipped Athena.” “And am I supposed to gather that you, or
Neal Stephenson (Cryptonomicon)
One possibility is that many of these universes are unstable and decay to our familiar universe. We recall that the vacuum, instead of being a boring, featureless thing, is actually teeming with bubble universes popping in and out of existence, like in a bubble bath. Hawking called this the space-time foam. Most of these tiny bubble universes are unstable, jumping out of the vacuum and then jumping back in. In the same way, once the final formulation of the theory is found, one might be able to show that most of these alternate universes are unstable and decay down to our universe. For example, the natural time scale for these bubble universes is the Planck time, which is 10^-43 seconds, an incredibly short amount of time. Most universes only live for this brief instant. Yet the age of our universe, by comparison, is 13.8 billion years, which is astronomically longer than the lifespan of most universes in this formulation. In other words, perhaps our universe is special among the infinity of universes in the landscape. Ours has outlasted them all, and that is why we are here today to discuss this question. But what do we do if the final equation turns out to be so complex that it cannot be solved by hand? Then it seems impossible to show that our universe is special among the universes in the landscape. At that point I think we should put it in a computer. This is the path taken for the quark theory. We recall that the Yang-Mills particle acts like a glue to bind quarks into a proton. But after fifty years, no one has been able to rigorously prove this mathematically. In fact, many physicists have pretty much given up hope of ever accomplishing it. Instead, the Yang-Mills equations are solved on a computer. This is done by approximating space-time as a series of lattice points. Normally, we think of space-time being a smooth surface, with an infinite number of points. When objects move, they pass through this infinite sequence. But we can approximate this smooth surface with a grid or lattice, like a mesh. As we let the spacing between lattice points get smaller and smaller, it becomes ordinary space-time, and the final theory begins to emerge. Similarly, once we have the final equation for M-theory, we can put it on a lattice and do the computation on a computer. In this scenario, our universe emerges from the output of a supercomputer. (However, I am reminded of the Hitchhiker’s Guide to the Galaxy, when a gigantic supercomputer is built to find the meaning of life. After eons doing the calculation, the computer finally concluded that the meaning of the universe was “forty-two.”)
Michio Kaku (The God Equation: The Quest for a Theory of Everything)
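The lattice idea Kaku describes—replace a smooth continuum with a grid of points, compute there, and recover the continuum answer as the spacing shrinks—can be seen in miniature with ordinary finite differencing; this toy Python sketch is only an analogy for the discretization step, not lattice gauge theory or M-theory.

# Toy illustration: a derivative computed on a lattice converges to the
# continuum value as the lattice spacing shrinks.
import math

def lattice_derivative(f, x, spacing):
    # central difference on a lattice with the given spacing
    return (f(x + spacing) - f(x - spacing)) / (2 * spacing)

exact = math.cos(1.0)   # d/dx sin(x) at x = 1
for spacing in (0.5, 0.1, 0.01, 0.001):
    approx = lattice_derivative(math.sin, 1.0, spacing)
    print(f"spacing {spacing:>6}: error {abs(approx - exact):.2e}")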
Twenty years? No kidding: twenty years? It’s hard to believe. Twenty years ago, I was—well, I was much younger. My parents were still alive. Two of my grandchildren had not yet been born, and another one, now in college, was an infant. Twenty years ago I didn’t own a cell phone. I didn’t know what quinoa was and I doubt if I had ever tasted kale. There had recently been a war. Now we refer to that one as the First Gulf War, but back then, mercifully, we didn’t know there would be another. Maybe a lot of us weren’t even thinking about the future then. But I was. And I’m a writer. I wrote The Giver on a big machine that had recently taken the place of my much-loved typewriter, and after I printed the pages, very noisily, I had to tear them apart, one by one, at the perforated edges. (When I referred to it as my computer, someone more knowledgeable pointed out that my machine was not a computer. It was a dedicated word processor. “Oh, okay then,” I said, as if I understood the difference.) As I carefully separated those two hundred or so pages, I glanced again at the words on them. I could see that I had written a complete book. It had all the elements of the seventeen or so books I had written before, the same things students of writing list on school quizzes: characters, plot, setting, tension, climax. (Though I didn’t reply as he had hoped to a student who emailed me some years later with the request “Please list all the similes and metaphors in The Giver,” I’m sure it contained those as well.) I had typed THE END after the intentionally ambiguous final paragraphs. But I was aware that this book was different from the many I had already written. My editor, when I gave him the manuscript, realized the same thing. If I had drawn a cartoon of him reading those pages, it would have had a text balloon over his head. The text would have said, simply: Gulp. But that was twenty years ago. If I had written The Giver this year, there would have been no gulp. Maybe a yawn, at most. Ho-hum. In so many recent dystopian novels (and there are exactly that: so many), societies battle and characters die hideously and whole civilizations crumble. None of that in The Giver. It was introspective. Quiet. Short on action. “Introspective, quiet, and short on action” translates to “tough to film.” Katniss Everdeen gets to kill off countless adolescent competitors in various ways during The Hunger Games; that’s exciting movie fare. It sells popcorn. Jonas, riding a bike and musing about his future? Not so much. Although the film rights to The Giver were snapped up early on, it moved forward in spurts and stops for years, as screenplay after screenplay—none of them by me—was
Lois Lowry (The Giver (Giver Quartet Book 1))
Several teams of German psychologists that have studied the RAT in recent years have come up with remarkable discoveries about cognitive ease. One of the teams raised two questions: Can people feel that a triad of words has a solution before they know what the solution is? How does mood influence performance in this task? To find out, they first made some of their subjects happy and others sad, by asking them to think for several minutes about happy or sad episodes in their lives. Then they presented these subjects with a series of triads, half of them linked (such as dive, light, rocket) and half unlinked (such as dream, ball, book), and instructed them to press one of two keys very quickly to indicate their guess about whether the triad was linked. The time allowed for this guess, 2 seconds, was much too short for the actual solution to come to anyone’s mind. The first surprise is that people’s guesses are much more accurate than they would be by chance. I find this astonishing. A sense of cognitive ease is apparently generated by a very faint signal from the associative machine, which “knows” that the three words are coherent (share an association) long before the association is retrieved. The role of cognitive ease in the judgment was confirmed experimentally by another German team: manipulations that increase cognitive ease (priming, a clear font, pre-exposing words) all increase the tendency to see the words as linked. Another remarkable discovery is the powerful effect of mood on this intuitive performance. The experimenters computed an “intuition index” to measure accuracy. They found that putting the participants in a good mood before the test by having them think happy thoughts more than doubled accuracy. An even more striking result is that unhappy subjects were completely incapable of performing the intuitive task accurately; their guesses were no better than random. Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition. These findings add to the growing evidence that good mood, intuition, creativity, gullibility, and increased reliance on System 1 form a cluster. At the other pole, sadness, vigilance, suspicion, an analytic approach, and increased effort also go together. A happy mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors. Here again, as in the mere exposure effect, the connection makes biological sense. A good mood is a signal that things are generally going well, the environment is safe, and it is all right to let one’s guard down. A bad mood indicates that things are not going very well, there may be a threat, and vigilance is required. Cognitive ease is both a cause and a consequence of a pleasant feeling.
Daniel Kahneman (Thinking, Fast and Slow)
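A rough sketch of the "intuition index" idea—scoring two-second linked/unlinked guesses against the truth and comparing with the 50 percent chance baseline—follows; the triads and the scoring are illustrative, not the German team's actual materials or measure.

# Illustrative only: accuracy of fast coherence guesses vs. a chance baseline.
triads = [
    (("dive", "light", "rocket"), True),    # linked (sky)
    (("dream", "ball", "book"), False),     # unlinked
    (("salt", "deep", "foam"), True),       # linked (sea)
    (("pencil", "river", "moon"), False),   # unlinked
]
guesses = [True, False, True, True]         # one participant's 2-second guesses

hits = sum(g == linked for g, (_, linked) in zip(guesses, triads))
intuition_index = hits / len(triads)
print(f"accuracy {intuition_index:.0%} vs. 50% chance")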
A famous British writer is revealed to be the author of an obscure mystery novel. An immigrant is granted asylum when authorities verify he wrote anonymous articles critical of his home country. And a man is convicted of murder when he’s connected to messages painted at the crime scene. The common element in these seemingly disparate cases is “forensic linguistics”—an investigative technique that helps experts determine authorship by identifying quirks in a writer’s style. Advances in computer technology can now parse text with ever-finer accuracy. Consider the recent outing of Harry Potter author J.K. Rowling as the writer of The Cuckoo’s Calling, a crime novel she published under the pen name Robert Galbraith. England’s Sunday Times, responding to an anonymous tip that Rowling was the book’s real author, hired Duquesne University’s Patrick Juola to analyze the text of Cuckoo, using software that he had spent over a decade refining. One of Juola’s tests examined sequences of adjacent words, while another zoomed in on sequences of characters; a third test tallied the most common words, while a fourth examined the author’s preference for long or short words. Juola wound up with a linguistic fingerprint—hard data on the author’s stylistic quirks. He then ran the same tests on four other books: The Casual Vacancy, Rowling’s first post-Harry Potter novel, plus three stylistically similar crime novels by other female writers. Juola concluded that Rowling was the most likely author of The Cuckoo’s Calling, since she was the only one whose writing style showed up as the closest or second-closest match in each of the tests. After consulting an Oxford linguist and receiving a concurring opinion, the newspaper confronted Rowling, who confessed. Juola completed his analysis in about half an hour. By contrast, in the early 1960s, it had taken a team of two statisticians—using what was then a state-of-the-art, high-speed computer at MIT—three years to complete a project to reveal who wrote 12 unsigned Federalist Papers. Robert Leonard, who heads the forensic linguistics program at Hofstra University, has also made a career out of determining authorship. Certified to serve as an expert witness in 13 states, he has presented evidence in cases such as that of Christopher Coleman, who was arrested in 2009 for murdering his family in Waterloo, Illinois. Leonard testified that Coleman’s writing style matched threats spray-painted at his family’s home. Coleman was convicted and is serving a life sentence. Since forensic linguists deal in probabilities, not certainties, it is all the more essential to further refine this field of study, experts say. “There have been cases where it was my impression that the evidence on which people were freed or convicted was iffy in one way or another,” says Edward Finegan, president of the International Association of Forensic Linguists. Vanderbilt law professor Edward Cheng, an expert on the reliability of forensic evidence, says that linguistic analysis is best used when only a handful of people could have written a given text. As forensic linguistics continues to make headlines, criminals may realize the importance of choosing their words carefully. And some worry that software also can be used to obscure distinctive written styles. “Anything that you can identify to analyze,” says Juola, “I can identify and try to hide.”
Anonymous
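The four tests attributed to Juola—adjacent-word sequences, character sequences, commonest words, and long-versus-short word preference—can be caricatured in a few lines of Python; this is an editor's simplification for illustration, far cruder than the software actually used, and the sample sentences are invented.

# Hedged sketch of a stylometric "fingerprint" built from the four kinds of
# features named in the passage, plus a crude overlap score between two texts.
from collections import Counter

def fingerprint(text: str) -> Counter:
    words = text.lower().split()
    feats = Counter()
    feats.update(("pair", a, b) for a, b in zip(words, words[1:]))        # adjacent words
    feats.update(("char4", text[i:i + 4]) for i in range(len(text) - 3))  # character sequences
    feats.update(("word", w) for w, _ in Counter(words).most_common(20))  # commonest words
    avg_len = round(sum(map(len, words)) / max(len(words), 1))
    feats[("avg_len", avg_len)] += 1                                      # long vs. short words
    return feats

def overlap(a: Counter, b: Counter) -> float:
    # crude similarity: shared features over all features
    return len(set(a) & set(b)) / max(len(set(a) | set(b)), 1)

known = fingerprint("the wizard stepped quietly into the long dark corridor")
disputed = fingerprint("the detective stepped quietly into the long dark stairwell")
print(f"stylistic overlap: {overlap(known, disputed):.2f}")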
In order for A to apply to computations generally, we shall need a way of coding all the different computations C(n) so that A can use this coding for its action. All the possible different computations C can in fact be listed, say as C0, C1, C2, C3, C4, C5,..., and we can refer to Cq as the qth computation. When such a computation is applied to a particular number n, we shall write C0(n), C1(n), C2(n), C3(n), C4(n), C5(n),.... We can take this ordering as being given, say, as some kind of numerical ordering of computer programs. (To be explicit, we could, if desired, take this ordering as being provided by the Turing-machine numbering given in ENM, so that then the computation Cq(n) is the action of the qth Turing machine Tq acting on n.) One technical thing that is important here is that this listing is computable, i.e. there is a single computation Cx that gives us Cq when it is presented with q, or, more precisely, the computation Cx acts on the pair of numbers q, n (i.e. q followed by n) to give Cq(n). The procedure A can now be thought of as a particular computation that, when presented with the pair of numbers q,n, tries to ascertain that the computation Cq(n) will never ultimately halt. Thus, when the computation A terminates, we shall have a demonstration that Cq(n) does not halt. Although, as stated earlier, we are shortly going to try to imagine that A might be a formalization of all the procedures that are available to human mathematicians for validly deciding that computations never will halt, it is not at all necessary for us to think of A in this way just now. A is just any sound set of computational rules for ascertaining that some computations Cq(n) do not ever halt. Being dependent upon the two numbers q and n, the computation that A performs can be written A(q,n), and we have: (H) If A(q,n) stops, then Cq(n) does not stop. Now let us consider the particular statements (H) for which q is put equal to n. This may seem an odd thing to do, but it is perfectly legitimate. (This is the first step in the powerful 'diagonal slash', a procedure discovered by the highly original and influential nineteenth-century Danish/Russian/German mathematician Georg Cantor, central to the arguments of both Godel and Turing.) With q equal to n, we now have: (I) If A(n,n) stops, then Cn(n) does not stop. We now notice that A(n,n) depends upon just one number n, not two, so it must be one of the computations C0,C1,C2,C3,...(as applied to n), since this was supposed to be a listing of all the computations that can be performed on a single natural number n. Let us suppose that it is in fact Ck, so we have: (J) A(n,n) = Ck(n) Now examine the particular value n=k. (This is the second part of Cantor's diagonal slash!) We have, from (J), (K) A(k,k) = Ck(k) and, from (I), with n=k: (L) If A(k,k) stops, then Ck(k) does not stop. Substituting (K) in (L), we find: (M) If Ck(k) stops, then Ck(k) does not stop. From this, we must deduce that the computation Ck(k) does not in fact stop. (For if it did then it does not, according to (M)! But A(k,k) cannot stop either, since by (K), it is the same as Ck(k). Thus, our procedure A is incapable of ascertaining that this particular computation Ck(k) does not stop even though it does not. Moreover, if we know that A is sound, then we know that Ck(k) does not stop. Thus, we know something that A is unable to ascertain. It follows that A cannot encapsulate our understanding.
Roger Penrose (Shadows of the Mind: A Search for the Missing Science of Consciousness)
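For readers who want the diagonal step above at a glance, here is a compact restatement in LaTeX using subscript notation; the labels follow the excerpt, and nothing is added beyond what the quoted argument already states.

\begin{align*}
\text{(H)}\;& A(q,n)\ \text{halts} \;\Rightarrow\; C_q(n)\ \text{never halts}\\
\text{(I)}\;& \text{set } q = n:\quad A(n,n)\ \text{halts} \;\Rightarrow\; C_n(n)\ \text{never halts}\\
\text{(J)}\;& A(n,n) = C_k(n)\ \text{for some fixed } k\\
\text{(K)}\;& \text{set } n = k:\quad A(k,k) = C_k(k)\\
\text{(L), (M)}\;& C_k(k)\ \text{halts} \;\Rightarrow\; C_k(k)\ \text{never halts;}\ \text{so } C_k(k)\ \text{never halts, hence } A(k,k)=C_k(k)\ \text{never halts and } A\ \text{cannot certify the fact.}
\end{align*}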
Like,” he repeats with distaste. “How about I tell you what I don’t like? I do not like postmodernism, postapocalyptic settings, postmortem narrators, or magic realism. I rarely respond to supposedly clever formal devices, multiple fonts, pictures where they shouldn’t be—basically, gimmicks of any kind. I find literary fiction about the Holocaust or any other major world tragedy to be distasteful—nonfiction only, please. I do not like genre mash-ups à la the literary detective novel or the literary fantasy. Literary should be literary, and genre should be genre, and crossbreeding rarely results in anything satisfying. I do not like children’s books, especially ones with orphans, and I prefer not to clutter my shelves with young adult. I do not like anything over four hundred pages or under one hundred fifty pages. I am repulsed by ghostwritten novels by reality television stars, celebrity picture books, sports memoirs, movie tie-in editions, novelty items, and—I imagine this goes without saying—vampires. I rarely stock debuts, chick lit, poetry, or translations. I would prefer not to stock series, but the demands of my pocketbook require me to. For your part, you needn’t tell me about the ‘next big series’ until it is ensconced on the New York Times Best Sellers list. Above all, Ms. Loman, I find slim literary memoirs about little old men whose little old wives have died from cancer to be absolutely intolerable. No matter how well written the sales rep claims they are. No matter how many copies you promise I’ll sell on Mother’s Day.” Amelia blushes, though she is angry more than embarrassed. She agrees with some of what A.J. has said, but his manner is unnecessarily insulting. Knightley Press doesn’t even sell half of that stuff anyway. She studies him. He is older than Amelia but not by much, not by more than ten years. He is too young to like so little. “What do you like?” she asks. “Everything else,” he says. “I will also admit to an occasional weakness for short-story collections. Customers never want to buy them though.” There is only one short-story collection on Amelia’s list, a debut. Amelia hasn’t read the whole thing, and time dictates that she probably won’t, but she liked the first story. An American sixth-grade class and an Indian sixth-grade class participate in an international pen pal program. The narrator is an Indian kid in the American class who keeps feeding comical misinformation about Indian culture to the Americans. She clears her throat, which is still terribly dry. “The Year Bombay Became Mumbai. I think it will have special int—” “No,” he says. “I haven’t even told you what it’s about yet.” “Just no.” “But why?” “If you’re honest with yourself, you’ll admit that you’re only telling me about it because I’m partially Indian and you think this will be my special interest. Am I right?” Amelia imagines smashing the ancient computer over his head. “I’m telling you about this because you said you liked short stories! And it’s the only one on my list. And for the record”—here, she lies—“it’s completely wonderful from start to finish. Even if it is a debut. “And do you know what else? I love debuts. I love discovering something new. It’s part of the whole reason I do this job.” Amelia rises. Her head is pounding. Maybe she does drink too much? Her head is pounding and her heart is, too. “Do you want my opinion?” “Not particularly,” he says. “What are you, twenty-five?” “Mr. Fikry, this is a lovely store, but if you continue in this this this”—as a child, she stuttered and it occasionally returns when she is upset; she clears her throat—“this backward way of thinking, there won’t be an Island Books before too long.
Gabrielle Zevin (The Storied Life of A.J. Fikry)
Our time is too short to waste walking down paths to nowhere.” The old man averted his gaze away from the cooking fire and covered his shoulders with a threadbare blanket the usefulness of which seemed long since to have been served. It reeked of wood smoke and stale tobacco. I knew at that moment he would give me the benefit of his wisdom. I didn’t want to lead our conversation too quickly in the direction I intended. Africa is timeless and so are her people. The minutes and the hours are unknown computations of a span irrelevant. Time is unimportant and its purpose trivial. Its allotment is measured only by the purpose of a mind composed.
Timothy G. Bax (Who Will Teach the Wisdom)
This is only the beginning of a huge “Copernican revolution” (to borrow a phrase from Matthew Taylor, one of Tony Blair’s advisers) that is putting the user at the center of the public-sector universe. The current centralized state has been shaped by the idea that information is in short supply: It derives its power from the fact that it knows lots of things that ordinary people do not. But information is now one of the world’s most abundant resources: available in huge quantities and accessible to anyone with a computer or a smart phone. As Eric Schmidt, Google’s chairman, and Jared Cohen, who worked for Hillary Clinton, point out in The New Digital Age, this changes the nature of the relationship between individuals and authority. The top-down state may become more like a network that can mobilize the energies and abilities of thousands or even millions of well-informed citizens—or “prosumers,” as one cyberguru, Don Tapscott, has called them.
John Micklethwait (The Fourth Revolution: The Global Race to Reinvent the State)
Sam stood on the second-floor veranda of the hotel, across from the pool, and looked out, spotting Claire. His heart took a tiny leap in his chest when he first caught sight of her in the crowd around the pool; he zeroed in on her face instantly, like a computer program scanning faces. Her almond-shaped brown eyes captivated him, even at that distance. When she stood up from the lounger, he instinctively reached down for the railing to grab on to something. It was the first time he'd seen her in a bathing suit. Wow. She looked lovely. Her exposed café-latte-colored skin glowed. Purple was her color, and it showcased her small but curvy body, the one he'd held tightly just a few short hours ago.
Carolyn Gibbs (Murder in Paradise)