Computer Geek Quotes

We've searched our database for all the quotes and captions related to Computer Geek. Here they are! All 100 of them:

No one messes around with a nerd’s computer and escapes unscathed.
E.A. Bucchianeri (Brushstrokes of a Gadfly (Gadfly Saga, #1))
Hey, I'm a computer geek, not a hero. ~Jack Farley
Vicki Lewis Thompson (Nerd in Shining Armor (Nerds, #1))
I knew she’d have an ASCII table in there somewhere. All computer geeks do.
Andy Weir (The Martian)
The word "geek" today does not mean what it used to mean. A geek isn't the skinny kid with a pocket protector and acne. There can be computer geeks, video game geeks, car geeks, military geeks, and sports geeks. Being a geek just means that you're passionate about something.
Olivia Munn (Suck It, Wonder Woman!: The Misadventures of a Hollywood Geek)
Sign by elevator put up by computer geeks in office building: REMEMBER: FIRST YOU PILLAGE, THEN YOU BURN. THOSE WHO DO NOT COMPLY WILL BE SUSPENDED FROM THE RAIDING TEAM.
Linda Howard (Mr. Perfect)
I’ve often wished that I had some suave and socially acceptable hobby that I could fall back on in times like this. You know, play the violin (or was it the viola) like Sherlock Holmes, or maybe twiddle away on the pipe organ like the Disney version of Captain Nemo. But I don’t. I’m sort of the arcane equivalent of a classic computer geek. I do magic, in one form or another, and that’s pretty much it. I really need to get a life, one of these days
Jim Butcher (Storm Front (The Dresden Files, #1))
But the main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources. Only in storybooks do inventions come like a thunderbolt, or a lightbulb popping out of the head of a lone individual in a basement or garret or garage.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Just like the notion of "Internet natives", who have never known a world without Internet access, we, who have lived our entire lives with video games, can be known as "video game natives.
Alexei Maxim Russell (The Classic Gamer's Bible)
Software testing is a sport like hunting, it's bughunting.
Amit Kalantri
For most people, home was represented by four walls and a roof. Not for Noa. She preferred a motherboard to a mother, a keyboard to house keys. Nothing was more comforting than the hum of a spinning hard drive.
Michelle Gagnon (Don't Turn Around (Persef0ne, #1))
Why can't you summon a command line and search your real-world home for 'Honda car keys,' and specify rooms in your house to search instead of folders or paths in your computer's home directory? It's a crippling design flaw in the real-world interface.
Richard Dooling (Rapture for the Geeks: When AI Outsmarts IQ)
Twitter Terrorist, billionaire heir, ex-con, computer geek, bad boy—none of those terms came close to describing Kyle Rhodes. He was, simply, a good person, and a confident, intelligent man to boot, and she found that combination absolutely irresistible.
Julie James (About That Night (FBI/US Attorney, #3))
And thanks to all those science projects, I acquired a central ability that was to help me through my entire career: patience.
Steve Wozniak (I, Woz: Computer Geek to Cult Icon - Getting to the Core of Apple's Inventor)
The computer and the Internet are among the most important inventions of our era, but few people know who created them.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Wonderful craftsmanship, Simon decided with the expert eye of one who had played enough computer games to know art when he saw it.
Sorin Suciu (The Scriptlings)
this early learning of how to do things one tiny little step at a time. I learned to not worry so much about the outcome, but to concentrate on the step I was on and to try to do it as perfectly as I could when I was doing it.
Steve Wozniak (I, Woz: Computer Geek to Cult Icon - Getting to the Core of Apple's Inventor)
The Prophet database (of course, the whole IT—computer geek—world called it the For-Profit database) was well written, but all the programs the mother company tried to sell with it were garbage.
Patricia Briggs (Shifting Shadows: Stories from the World of Mercy Thompson)
The role had been spawned by the widespread belief that traders didn’t know how to talk to computer geeks and that computer geeks did not respond rationally to big, hairy traders hollering at them.
Michael Lewis (Flash Boys: A Wall Street Revolt)
I pat the brand new twenty-seven inch Macintosh computers Mr. Foley brought us. 'These boxes alone should make both of us scream like it's Christmas morning! Snap out of it. Santa came! Now we get to play with all of our toys!
Anne Eliot
A mind forever voyaging through strange seas of thought…alone.
Steve Wozniak (iWoz: Computer Geek to Cult Icon)
In other words, the future might belong to people who can best partner and collaborate with computers.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
This is what Grandma was worried about, you know.' 'Me eating a whole chocolate cake practically all by myself in a single sitting?' 'You falling in love with a computer geek. Sure, they have good stock options and smokin' hot bods, but what about that dark side of genius that reanimates the dead?
Laurie Frankel (Goodbye for Now)
Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
We were there too, the other geeks and weird kids whose lives were hellish at school, who escaped into books and computers, who stayed up all night scanning obscure forums, looking for transcendence, dreaming of elsewhere. We were there too, but you didn’t see us, because we were girls. And the costs of being the geek were the same for us, right down to the sexual frustration, the yearning, the being laughed at, the loneliness. […] We had to fight the same battles you did, only harder, because we were women and we also had to fight sexism, some of it from you, and when we went looking for other weird kids to join our gang, we were told we weren’t ‘real geeks’ because we were girls.
Laurie Penny (Cybersexism: Sex, Gender and Power on the Internet)
One day ladies will take their computers for walks in the park and tell each other ‘My little computer said such a funny thing this morning!’ ” he japed in 1951.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
On a scale ranging from very little to too much, Merkin could just about categorize the amount of personal data stored in Master Loo’s computer as a shitload.
Sorin Suciu (The Scriptlings)
TO ERR IS HUMAN; TO REALLY SCREW UP REQUIRES THE ROOT PASSWORD.” —COMPUTER GEEK TRUISM
Stephen H. Segal (Geek Wisdom: The Sacred Teachings of Nerd Culture)
They think I'm at a special school for computer geeks and homosexuals.
Lev Grossman (The Magicians, #1)
Geeks = Know more about computers than their computer teacher, so everyone comes to them for computer problems. Nerds = Have no life and only worry about school, no one talks to them. Jocks = Know a lot about sports but not much else. Geek's Wife: Completely depends on the geek for tech support. Tends to be pretty good looking. Nerd's Wife: nonexistent. Jock's Wife: Only there for money, most likely having an affair with another jock. See, Geeks are the best!
Hamza Charlemagne
The ark was like a portable computer hard drive and Noah was a one-man Geek Squad, and he dumped God's most important files onto it before he zorched the virus-ridden computer that was the world.
BikeSnobNYC (The Enlightened Cyclist: Commuter Angst, Dangerous Drivers, and Other Obstacles on the Path to Two-Wheeled Transcendence)
Now kids get a MacBook and regard it as an appliance. They treat it like a refrigerator and expect it to be filled with good things, but they don’t know how it works. They don’t fully understand what I knew, and my parents knew, which was what you could do with a computer was limited only by your imagination.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Mauchly and Eckert should be at the top of the list of people who deserve credit for inventing the computer, not because the ideas were all their own but because they had the ability to draw ideas from multiple sources, add their own innovations, execute their vision by building a competent team, and have the most influence on the course of subsequent developments. The machine they built was the first general-purpose electronic computer.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
... there was one new metallic monstrosity stacked in one corner that she hadn’t seen the last time she was a visitor to his strange chamber, it appeared to be a mass of hard drives all fused together, but they looked too sophisticated to be merely hard drives. “What on earth is that?” “That’s my Kung Fu,” he said proudly, patting the top of the futuristic-looking stack. “Is that what you wanted to show me?” “No, but it’s impressive, isn’t it?” “If you say so.” Steves sighed and shook his head, so few people could appreciate the intellectual complexity of an almost untraceable hacking device.
E.A. Bucchianeri (Brushstrokes of a Gadfly (Gadfly Saga, #1))
Email did more than facilitate the exchange of messages between two computer users. It led to the creation of virtual communities, ones that, as predicted in 1968 by Licklider and Taylor, were “selected more by commonality of interests and goals than by accidents of proximity.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In my experience, a lot of the boys who are real computer-heads tend to be pale-skinned sci-fi fans. Not that there’s anything wrong with that—I happen to be a pale-skinned sci-fi fan myself—but that doesn’t mean I have to fancy other see-through geeks. So let’s hope Adam isn’t as picky as me.
Kate le Vann (Things I Know About Love)
Life will be happier for the on-line individual because the people with whom one interacts most strongly will be selected more by commonality of interests and goals than by accidents of proximity.” J. C. R. Licklider and Robert Taylor, “The Computer as a Communication Device,” Science and Technology, Apr. 1968.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The following year, students at Duke University and the University of North Carolina, which were not yet connected to the Internet, developed another system, hosted on personal computers, which featured threaded message-and-reply discussion forums. It became known as “Usenet,” and the categories of postings on it were called “newsgroups.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Grace Hopper develops first computer compiler.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
An invention, especially one as complex as the computer, usually comes not from an individual brainstorm but from a collaboratively woven tapestry of creativity.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Power to the people was a romantic lie,” he later said. “Computers did more than politics did to change society.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Engelbart showed, back in 1968, nearly everything that a networked personal computer does today.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The digital age could not become truly transformational until computers became truly personal.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Computers today are brilliant idiots,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Jobs was blown away by bitmapping. “It was like a veil being lifted from my eyes,” he recalled. “I could see what the future of computing was destined to be.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Computer innovators, like other pioneers, can find themselves left behind if they get stuck in their ways.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
rise of computers could mean that “man will become a passive, purposeless, machine-conditioned animal.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The Bush-Licklider approach was given a friendly interface by Engelbart, who in 1968 demonstrated a networked computer system with an intuitive graphical display and a mouse.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The Internet and the personal computer were both born in the 1970s, but they grew up apart from one another.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
A program called SNDMSG allowed a user of a big central computer to send a message to the personal folder of another user who was sharing the same computer.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
PDP-1 was the first computer to be designed for direct interaction with the user.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
ARPA should not force the research computers at each site to handle the routing of data, Clark argued. Instead ARPA should design and give each site a standardized minicomputer that would do the routing.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
McCarthy’s vision was prescient, but it differed in one major way from Kay’s vision, and from the networked world that we have today. It was not based on personal computers with their own memory and processing power.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
He was able (this being the Depression) to buy used desk calculators cheaply from ailing banks and to hire a group of young people, through the New Deal’s National Youth Administration, to do computations at fifty cents an hour.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The roots of the personal computer can be found in the Free Speech Movement that arose at Berkeley in 1964 and in the Whole Earth Catalog, which did the marketing for the do-it-yourself ideals behind the personal computer movement.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
who most deserves to be dubbed the inventor of the electronic digital computer: John Atanasoff, a professor who worked almost alone at Iowa State, or the team led by John Mauchly and Presper Eckert at the University of Pennsylvania.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Do you want this marriage to work or not?’ she said. ‘My spreadsheet identified –’ I interpreted Sonia’s expression as I don’t want to hear about your fucking spreadsheet. Do you, emotionally, as a whole mature person, want to live the rest of your life with Rosie and the Baby Under Development or are you going to let a computer make that decision for you, you pathetic geek? ‘Work. But I don’t think –’ ‘You think too much. Take her out to dinner and talk it over.
Graeme Simsion (The Rosie Effect (Don Tillman, #2))
Let us change our traditional attitude to the construction of programs: Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do.
Vikram Chandra (Geek Sublime: The Beauty of Code, the Code of Beauty)
I can’t see any reason that anyone would want a computer of his own,” DEC president Ken Olsen declared at a May 1974 meeting where his operations committee was debating whether to create a smaller version of its PDP-8 for personal consumers.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Taylor recalled that he ran into a brick wall every time he tried to deal with the suits back east. As the head of a Xerox research facility in Webster, New York, explained to him, “The computer will never be as important to society as the copier.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Jobs said. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.” Another lesson of the digital age is as old as Aristotle: “Man is a social animal.” What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose: to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare. Machines, by contrast, are not social animals. They don’t join Facebook of their own volition nor seek companionship for its own sake. When Alan Turing asserted that machines would someday behave like humans, his critics countered that they would never be able to show affection or crave intimacy. To indulge Turing, perhaps we could program a machine to feign affection and pretend to seek intimacy, just as humans sometimes do. But Turing, more than almost anyone, would probably know the difference. According to the second part of Aristotle’s quote, the nonsocial nature of computers suggests that they are “either a beast or a god.” Actually, they are neither. Despite all of the proclamations of artificial intelligence engineers and Internet sociologists, digital tools have no personalities, intentions, or desires. They are what we make of them.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
But Jobs was the first to become obsessed with the idea of incorporating PARC’s interface ideas into a simple, inexpensive, personal computer. Once again, the greatest innovation would come not from the people who created the breakthroughs but from the people who applied them usefully.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
a table and diagram showing exactly how the algorithm would be fed into the computer, step by step, including two recursive loops. It was a numbered list of coding instructions that included destination registers, operations, and commentary—something that would be familiar to any C++ coder today.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Shortly before she died in 2011, Jean Jennings Bartik reflected proudly on the fact that all the programmers who created the first general-purpose computer were women: “Despite our coming of age in an era when women’s career opportunities were generally quite confined, we helped initiate the era of the computer.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
If you read the same things as others and say the same things they say, then you're perceived as intelligent. I'm a bit more independent and radical and consider intelligence the ability to think about matters on your own and ask a lot of skeptical questions to get at the real truth, not just what you're told it is.
Steve Wozniak (iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It)
Machines such as these emerged in the 1950s, and during the subsequent thirty years there were two historic innovations that caused them to revolutionize how we live: microchips allowed computers to become small enough to be personal appliances, and packet-switched networks allowed them to be connected as nodes on a web.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
singularity, a term that von Neumann coined and the futurist Ray Kurzweil and the science fiction writer Vernor Vinge popularized, which is sometimes used to describe the moment when computers are not only smarter than humans but also can design themselves to be even supersmarter, and will thus no longer need us mortals.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines. Then, beginning in the textile industry, inventors found ways to mechanize steps so that they could be performed by machines, many of them powered by steam engines. Babbage, building on ideas from Pascal and Leibniz, tried to apply these two processes to the production of computations, creating a mechanical precursor to the modern computer. His most significant conceptual leap was that such machines did not have to be set to do only one process, but instead could be programmed and reprogrammed through the use of punch cards. Ada saw the beauty and significance of that enchanting notion, and she also described an even more exciting idea that derived from it: such machines could process not only numbers but anything that could be notated in symbols.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
late 1980s, when it became possible for ordinary people at home or in the office to dial up and go online. This would launch a new phase of the Digital Revolution, one that would fulfill the vision of Bush, Licklider, and Engelbart that computers would augment human intelligence by being tools both for personal creativity and for collaborating.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Licklider helped chart that course back in 1960 in his paper “Man-Computer Symbiosis,” which proclaimed: “Human brains and computing machines will be coupled together very tightly, and the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In the seventy years since von Neumann effectively placed his “Draft Report” on the EDVAC into the public domain, the trend for computers has been, with a few notable exceptions, toward a more proprietary approach. In 2011 a milestone was reached: Apple and Google spent more on lawsuits and payments involving patents than they did on research and development of new products.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The fact that we are still sitting on and depending on technical protocols nearly a half century old is a testament to the genius of those who invented everything from such inventions, protocols and standards like Ethernet to personal computers that were more than just circuit boards for geeks, but actually had small GUI interfaces, as well as connected devices such as a mouse and keyboard.
Scott C. Holstad
As Licklider explained, the sensible goal was to create an environment in which humans and machines “cooperate in making decisions.” In other words, they would augment each other. “Men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations. Computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Gates was, after all, a serial stealer of computer time, and he had manipulated passwords to hack into accounts from eighth grade through his sophomore year at Harvard. Indeed, when he claimed in his letter that he and Allen had used more than $40,000 worth of computer time to make BASIC, he omitted the fact that he had never actually paid for that time and that much of it was on Harvard’s military-supplied computer, funded by American taxpayers.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
This entailed switching around by hand ENIAC’s rat’s nest of cables and resetting its switches. At first the programming seemed to be a routine, perhaps even menial task, which may have been why it was relegated to women, who back then were not encouraged to become engineers. But what the women of ENIAC soon showed, and the men later came to understand, was that the programming of a computer could be just as significant as the design of its hardware.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Wiener believed that the most promising path for computer science was to devise machines that would work well with human minds rather than try to replace them. “Many people suppose that computing machines are replacements for intelligence and have cut down the need for original thought,” Wiener wrote. “This is not the case.” The more powerful the computer, the greater the premium that will be placed on connecting it with imaginative, creative, high-level human thinking.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Eventually, computers might outperform humans in the very fields that made Homo sapiens the ruler of the world: intelligence and communication. The process that began in the Euphrates valley 5,000 years ago, when Sumerian geeks outsourced data-processing from the human brain to a clay tablet, would culminate in Silicon Valley with the victory of the tablet. Humans might still be around, but they would no longer be able to make sense of the world. The new ruler of the world would be a long line of zeros and ones.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
Cade studied her for a moment, then sat forward in his chair. “Seriously, what is it about this guy? He’s just a rich computer geek with good hair.” Rylann smiled. “I think there’s a little more to it than that.” “Christ, you are smitten.” He threw up his hands. “What is going on with everyone these days? Sam Wilkins is babbling about a meet-cute, Cameron’s sneaking off to get hitched, and now you’re all starry-eyed over the Twitter Terrorist. Has everyone been sneaking happy pills out of the evidence room when I’m not looking?” "No, just some really good pot.” Cade laughed out loud at that. “You are a funny one, Pierce. I’ll say that.” “So does that mean we’re still on for Starbucks later today?” He studied her suspiciously. “You’re not going to want to talk about Kyle Rhodes the whole time, are you?” “Actually, yes. And then we’ll go shoe shopping together and get mani-pedis.” She threw him a get-real look. “We’ll talk about the same stuff we always talk about.” With a grin, he finally nodded. “Fine. Three o’clock, Pierce. I’ll swing by your office
Julie James (About That Night (FBI/US Attorney, #3))
On October 29 the connection was ready to be made. The event was appropriately casual. It had none of the drama of the “one small step for man, one giant leap for mankind” that had occurred on the moon a few weeks earlier, with a half billion people watching on television. Instead it was an undergraduate named Charley Kline, under the eye of Crocker and Cerf, who put on a telephone headset to coordinate with a researcher at SRI while typing in a login sequence that he hoped would allow his terminal at UCLA to connect through the network to the computer 354 miles away in Palo Alto.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
It was thus that in the second half of 1969—amid the static of Woodstock, Chappaquiddick, Vietnam War protests, Charles Manson, the Chicago Eight trial, and Altamont—the culmination was reached for three historic enterprises, each in the making for almost a decade. NASA was able to send a man to the moon. Engineers in Silicon Valley were able to devise a way to put a programmable computer on a chip called a microprocessor. And ARPA created a network that could connect distant computers. Only the first of these (perhaps the least historically significant of them?) made headlines.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Kyle worked in technology; he knew it would only be a matter of time before the video of Daniela and the A-list actor went viral and spread everywhere. So he did what any pissed-off, red-blooded computer geek would do after catching his girlfriend giving an underwater blowjob to another man: he hacked into Twitter and deleted both the video and her earlier tweet from the site. Then, raging at the world that had devolved so much in civility that 140-character breakups had become acceptable, he shut down the entire network in a denial-of-service attack that lasted two days. And so began the Great Twitter Outage of 2011. The Earth nearly stopped on its axis. Panic and mayhem ensued as Twitter unsuccessfully attempted to counteract what it deemed the most sophisticated hijacking they’d ever experienced. Meanwhile, the FBI waited for either a ransom demand or political statement from the so-called “Twitter Terrorist.” But neither was forthcoming, as the Twitter Terrorist had no political agenda, already was worth millions, and had most inconveniently taken off to Tijuana, Mexico to get shit-faced drunk on cheap tequila being served by an eight-fingered bartender named Esteban.
Julie James (A Lot like Love (FBI/US Attorney, #2))
Spacewar highlighted three aspects of the hacker culture that became themes of the digital age. First, it was created collaboratively. “We were able to build it together, working as a team, which is how we liked to do things,” Russell said. Second, it was free and open-source software. “People asked for copies of the source code, and of course we gave them out.” Of course—that was in a time and place when software yearned to be free. Third, it was based on the belief that computers should be personal and interactive. “It allowed us to get our hands on a computer and make it respond to us in real time,” said Russell.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
A lot of the credit, too, should go to Turing, for developing the concept of a universal computer and then being part of a hands-on team at Bletchley Park. How you rank the historic contributions of the others depends partly on the criteria you value. If you are enticed by the romance of lone inventors and care less about who most influenced the progress of the field, you might put Atanasoff and Zuse high. But the main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Being a brash entrepreneur, Roberts responded to the crisis by deciding to launch a whole new business. He had always been fascinated by computers, and he assumed that other hobbyists felt the same. His goal, he enthused to a friend, was building a computer for the masses that would eliminate the Computer Priesthood once and for all. After studying the instruction set for the Intel 8080, Roberts concluded that MITS could make a do-it-yourself kit for a rudimentary computer that would be so cheap, under $400, that every enthusiast would buy it. “We thought he was off the deep end,” a colleague later confessed.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.” Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
I rest my elbows on my knees, watching Paco make a complete fool of himself. Paco takes a little white golf ball and places it on top of a rubber circle inserted into the fake grass. When he swings the golf club, I wince. The club misses the ball and connects with the fake grass instead. Paco swears. The guy next to Paco takes one look at him and moves to another section. Paco tries again. This time the club connects, but his ball only rolls along the grass in front of him. He keeps trying, but each time Paco swings, he makes a complete ass out of himself. Does he think he’s hitting a hockey puck? “You done?” I ask once he’s gone through half the basket. “Alex,” Paco says, leaning on the golf club like it’s a cane. “Do ya think I was meant to play golf?” Looking Paco straight in the eye, I answer, “No.” “I heard you talkin’ to Hector. I don’t think you were meant to deal, either.” “Is that why we’re here? You’re tryin’ to make a point?” “Hear me out,” Paco insists. “I’ve got the keys to the car in my pocket and I’m not goin’ nowhere until I finish hittin’ all of these balls, so you might as well listen. I’m not smart like you. I don’t have choices in life, but you, you’re smart enough to go to college and be a doctor or computer geek or somethin’ like that. Just like I wasn’t meant to hit golf balls, you weren’t meant to deal drugs. Let me do the drop for you.” “No way, man. I appreciate you makin’ an ass out of yourself to prove a point, but I know what I need to do,” I tell him.
Simone Elkeles (Perfect Chemistry (Perfect Chemistry, #1))
then “man-computer symbiosis,” as Licklider called it, will remain triumphant. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities—to forge a partnership in which we let the machines do what they do best, and they let us do what we do best. SOME LESSONS FROM THE JOURNEY Like all historical narratives, the story of the innovations that created the digital age has many strands. So what lessons, in addition to the power of human-machine symbiosis just discussed, might be drawn from the tale? First and foremost is that creativity is a collaborative process. Innovation comes from teams more often than from the lightbulb moments of lone geniuses.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Instead it was an undergraduate named Charley Kline, under the eye of Crocker and Cerf, who put on a telephone headset to coordinate with a researcher at SRI while typing in a login sequence that he hoped would allow his terminal at UCLA to connect through the network to the computer 354 miles away in Palo Alto. He typed in “L.” The guy at SRI told him that it had been received. Then he typed in “O.” That, too, was confirmed. When he typed in “G,” the system hit a memory snag because of an auto-complete feature and crashed. Nevertheless, the first message had been sent across the ARPANET, and if it wasn’t as eloquent as “The Eagle has landed” or “What has God wrought,” it was suitable in its understated way: “Lo.” As in “Lo and behold.” In his logbook, Kline recorded, in a memorably minimalist notation, “22:30. Talked to SRI Host to Host. CSK.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
C. P. Snow was right about the need to respect both of “the two cultures,” science and the humanities. But even more important today is understanding how they intersect. Those who helped lead the technology revolution were people in the tradition of Ada, who could combine science and the humanities. From her father came a poetic streak and from her mother a mathematical one, and it instilled in her a love for what she called “poetical science.” Her father defended the Luddites who smashed mechanical looms, but Ada loved how punch cards instructed those looms to weave beautiful patterns, and she envisioned how this wondrous combination of art and technology could be manifest in computers. (...) This innovation will come from people who are able to link beauty to engineering, humanity to technology, and poetry to processors. In other words, it will come from the spiritual heirs of Ada Lovelace, creators who can flourish where the arts intersect with the sciences and who have a rebellious sense of wonder that opens them to the beauty of both.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.” But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.” This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.” It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone?
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.” Based on this report, Congress established the National Science Foundation.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
values of commons-based sharing and of private enterprise often conflict, most notably over the extent to which innovations should be patent-protected. The commons crowd had its roots in the hacker ethic that emanated from the MIT Tech Model Railroad Club and the Homebrew Computer Club. Steve Wozniak was an exemplar. He went to Homebrew meetings to show off the computer circuit he built, and he handed out freely the schematics so that others could use and improve it. But his neighborhood pal Steve Jobs, who began accompanying him to the meetings, convinced him that they should quit sharing the invention and instead build and sell it. Thus Apple was born, and for the subsequent forty years it has been at the forefront of aggressively patenting and profiting from its innovations. The instincts of both Steves were useful in creating the digital age. Innovation is most vibrant in the realms where open-source systems compete with proprietary ones. Sometimes people advocate one of these modes of production over the others based on ideological sentiments. They prefer a greater government role, or exalt private enterprise, or romanticize peer sharing. In the 2012 election, President Barack Obama stirred up controversy by saying to people who owned businesses, “You didn’t build that.” His critics saw it as a denigration of the role of private enterprise. Obama’s point was that any business benefits from government and peer-based community support: “If you were successful, somebody along the line gave you some help.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Although thrilled that the era of the personal computer had arrived, he was afraid that he was going to miss the party. Slapping down seventy-five cents, he grabbed the issue and trotted through the slushy snow to the Harvard dorm room of Bill Gates, his high school buddy and fellow computer fanatic from Seattle, who had convinced him to drop out of college and move to Cambridge. “Hey, this thing is happening without us,” Allen declared. Gates began to rock back and forth, as he often did during moments of intensity. When he finished the article, he realized that Allen was right. For the next eight weeks, the two of them embarked on a frenzy of code writing that would change the nature of the computer business. Unlike the computer pioneers before him, Gates, who was born in 1955, had not grown up caring much about the hardware. He had never gotten his thrills by building Heathkit radios or soldering circuit boards. A high school physics teacher, annoyed by the arrogance Gates sometimes displayed while jockeying at the school’s timesharing terminal, had once assigned him the project of assembling a Radio Shack electronics kit. When Gates finally turned it in, the teacher recalled, “solder was dripping all over the back” and it didn’t work. For Gates, the magic of computers was not in their hardware circuits but in their software code. “We’re not hardware gurus, Paul,” he repeatedly pronounced whenever Allen proposed building a machine. “What we know is software.” Even his slightly older friend Allen, who had built shortwave radios, knew that the future belonged to the coders. “Hardware,” he admitted, “was not our area of expertise.” What Gates and Allen set out to do on that December day in 1974 when they first saw the Popular Electronics cover was to create the software for personal computers. More than that, they wanted to shift the balance in the emerging industry so that the hardware would become an interchangeable commodity, while those who created the operating system and application software would capture most of the profits.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
recalled Stephen Crocker, a graduate student on the UCLA team who had driven up with his best friend and colleague, Vint Cerf. So they decided to meet regularly, rotating among their sites. The polite and deferential Crocker, with his big face and bigger smile, had just the right personality to be the coordinator of what became one of the digital age’s archetypical collaborative processes. Unlike Kleinrock, Crocker rarely used the pronoun I; he was more interested in distributing credit than claiming it. His sensitivity toward others gave him an intuitive feel for how to coordinate a group without trying to centralize control or authority, which was well suited to the network model they were trying to invent. Months passed, and the graduate students kept meeting and sharing ideas while they waited for some Powerful Official to descend upon them and give them marching orders. They assumed that at some point the authorities from the East Coast would appear with the rules and regulations and protocols engraved on tablets to be obeyed by the mere managers of the host computer sites. “We were nothing more than a self-appointed bunch of graduate students, and I was convinced that a corps of authority figures or grownups from Washington or Cambridge would descend at any moment and tell us what the rules were,” Crocker recalled. But this was a new age. The network was supposed to be distributed, and so was the authority over it. Its invention and rules would be user-generated. The process would be open. Though it was funded partly to facilitate military command and control, it would do so by being resistant to centralized command and control. The colonels had ceded authority to the hackers and academics. So after an especially fun gathering in Utah in early April 1967, this gaggle of graduate students, having named itself the Network Working Group, decided that it would be useful to write down some of what they had conjured up. And Crocker, who with his polite lack of pretense could charm a herd of hackers into consensus, was tapped for the task. He was anxious to find an approach that did not seem presumptuous. “I realized that the mere act of writing down what we were talking about could be seen as a presumption of authority and someone was going to come and yell at us—presumably some adult out of the east.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Who is going to fight them off, Randy?” “I’m afraid you’re going to say we are.” “Sometimes it might be other Ares-worshippers, as when Iran and Iraq went to war and no one cared who won. But if Ares-worshippers aren’t going to end up running the whole world, someone needs to do violence to them. This isn’t very nice, but it’s a fact: civilization requires an Aegis. And the only way to fight the bastards off in the end is through intelligence. Cunning. Metis.” “Tactical cunning, like Odysseus and the Trojan Horse, or—” “Both that, and technological cunning. From time to time there is a battle that is out-and-out won by a new technology—like longbows at Crecy. For most of history those battles happen only every few centuries—you have the chariot, the compound bow, gunpowder, ironclad ships, and so on. But something happens around, say, the time that the Monitor, which the Northerners believe to be the only ironclad warship on earth, just happens to run into the Merrimack, of which the Southerners believe exactly the same thing, and they pound the hell out of each other for hours and hours. That’s as good a point as any to identify as the moment when a spectacular rise in military technology takes off—it’s the elbow in the exponential curve. Now it takes the world’s essentially conservative military establishments a few decades to really comprehend what has happened, but by the time we’re in the thick of the Second World War, it’s accepted by everyone who doesn’t have his head completely up his ass that the war’s going to be won by whichever side has the best technology. So on the German side alone we’ve got rockets, jet aircraft, nerve gas, wire-guided missiles. And on the Allied side we’ve got three vast efforts that put basically every top-level hacker, nerd, and geek to work: the codebreaking thing, which as you know gave rise to the digital computer; the Manhattan Project, which gave us nuclear weapons; and the Radiation Lab, which gave us the modern electronics industry. Do you know why we won the Second World War, Randy?” “I think you just told me.” “Because we built better stuff than the Germans?” “Isn’t that what you said?” “But why did we build better stuff, Randy?” “I guess I’m not competent to answer, Enoch, I haven’t studied that period well enough.” “Well the short answer is that we won because the Germans worshipped Ares and we worshipped Athena.” “And am I supposed to gather that you, or
Neal Stephenson (Cryptonomicon)