Computer Hardware Quotes

We've searched our database for all the quotes and captions related to Computer Hardware. Here they are! All 100 of them:

My first impulse, when presented with any spanking-new piece of computer hardware, is to imagine how it will look in ten years’ time, gathering dust under a card table in a thrift shop.
William Gibson (Distrust That Particular Flavor)
People who are really serious about software should make their own hardware.
Alan Kay
Code is not like other how-computers-work books. It doesn't have big color illustrations of disk drives with arrows showing how the data sweeps into the computer. Code has no drawings of trains carrying a cargo of zeros and ones. Metaphors and similes are wonderful literary devices but they do nothing but obscure the beauty of technology.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
The transformation can only be accomplished by man, not by hardware (computers, gadgets, automation, new machinery). A company cannot buy its way into quality.
W. Edwards Deming (Out of the Crisis)
The hardware of a computer is useless without the right software. Similarly, in an organization the hardware (strategy and structure) is inert without the software (beliefs and behaviors).
Larry Bossidy (Execution: The Discipline of Getting Things Done)
To use a computer analogy, we are running twenty-first-century software on hardware last upgraded 50,000 years ago or more. This may explain quite a lot of what we see in the news.
Ronald Wright (A Short History Of Progress)
Programming in machine code is like eating with a toothpick.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
Quantum Machine Learning is defined as the branch of science and technology that is concerned with the application of quantum mechanical phenomena such as superposition, entanglement and tunneling for designing software and hardware to provide machines the ability to learn insights and patterns from data and the environment, and the ability to adapt automatically to changing situations with high precision, accuracy and speed. 
Amit Ray (Quantum Computing Algorithms for Artificial Intelligence)
Windows 10 has been a nightmare for computer hardware manufacturers, as it can make perfectly good hardware appear to be faulty.
Steven Magee
We could just as reasonably base our number system on eight (if we were cartoon characters) or four (if we were lobsters) or even two (if we were dolphins).
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
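Petzold's point is easy to make concrete: the same quantity can be written in any base. A minimal Python sketch (an illustration, not code from the book):

```python
def to_base(n, base):
    """Render a non-negative integer as digits in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append("0123456789"[n % base])
        n //= base
    return "".join(reversed(digits))

# The same quantity as written by ten-fingered humans, cartoon characters,
# lobsters, and dolphins respectively
for base in (10, 8, 4, 2):
    print(f"base {base:>2}: {to_base(100, base)}")
# base 10: 100
# base  8: 144
# base  4: 1210
# base  2: 1100100
```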
Her computer’s fan whirred to life, blowing warm air onto her fingers. Two flame-red slits glowed from the monitor. The speakers boomed. “I lived! I died! I live again!” Olivie had dealt with blue screens, frozen hourglasses, and even the odd hardware conflict back in the day. This was new.
Choong JayVee (In Memory: A Tribute to Sir Terry Pratchett)
She noted the lack of female hardware hackers, and was enraged at the male hacker obsession with technological play and power.
Steven Levy (Hackers: Heroes of the Computer Revolution)
How many computer programmers does it take to change a light bulb? Are you kidding? That's a hardware problem!
Various (101 Best Jokes)
The NSA employs more mathematicians, buys more computer hardware, and intercepts more messages than any other organization in the world.
Simon Singh (The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography)
In a recent analysis, Martin Grötschel of the Zuse Institute in Berlin found that, using the computers and software that existed in 1982, it would have taken a full eighty-two years to solve a particularly complex production planning problem. As of 2003, the same problem could be solved in about a minute—an improvement by a factor of around 43 million. Computer hardware became about 1,000 times faster over the same period, which means that improvements in the algorithms used accounted for approximately a 43,000-fold increase in performance.
Martin Ford (Rise of the Robots: Technology and the Threat of a Jobless Future)
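The arithmetic in this passage can be replayed directly, using only the figures quoted above: 82 years is roughly 43 million minutes, and dividing the overall gain by the hardware gain leaves the algorithmic share. A quick check in Python:

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60          # 525,960

total_speedup = 82 * MINUTES_PER_YEAR        # 82 years reduced to ~1 minute
hardware_speedup = 1_000                     # hardware alone, per the quote
algorithm_speedup = total_speedup / hardware_speedup

print(f"total:      {total_speedup:,.0f}x")      # ~43,128,720x -- the quote's ~43 million
print(f"algorithms: {algorithm_speedup:,.0f}x")  # ~43,129x -- the quote's ~43,000
```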
When the veterans in the group were growing up, computers were quite rare and expensive, but Veres went to school in the age when anyone with a little money and skill could make up a small personal system. Veres says that what he does at home is different enough from what he does at work to serve as recreation for him. At work he deals with hardware; when he’s at home, he focuses on software—reading programming manuals and creating new software for his own computer.
Tracy Kidder (The Soul of A New Machine)
“I really can’t keep it up anymore,” Huahua said. “No one’s doing any better,” Specs said lightly. “It’s not the same. This is impossible!” “Think of yourself as a computer. You’re just cold hardware, and reality is just data. Accept your input and perform your calculations. That’s how you keep it up.”
Liu Cixin (Supernova Era)
According to the Hebrew understanding of the human being, we are more than just a body. We are also a soul and spirit. The “spirit” of a person can be considered the breath of life, while the soul is the eternal part of us. If our body is our computer hardware, then the soul is our software, and the spirit is the electricity that gives the whole thing life.
Chuck Missler (The Physics of Immortality)
The flip side of this is that any information that can be reduced to a choice among two or more possibilities can be expressed using bits. Needless
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
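Behind this claim is a tidy rule: n bits distinguish 2^n possibilities, so a choice among k options needs ceil(log2(k)) bits. A small sketch (an illustration, not code from the book):

```python
import math

def bits_needed(options):
    """Minimum number of bits to encode one choice among `options` possibilities."""
    return max(1, math.ceil(math.log2(options)))

print(bits_needed(2))    # a coin flip: 1 bit
print(bits_needed(26))   # a letter of the alphabet: 5 bits
print(bits_needed(365))  # a day of the year: 9 bits
```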
Some people believe that Moore’s Law will continue to be accurate until about 2015.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory. Information theory is concerned with transmitting digital information in the presence of noise (which usually prevents all the information from getting through) and how to compensate for that. In 1949, he wrote the first article about programming a computer to play chess, and in 1952 he designed a mechanical mouse controlled by relays that could learn its way around a maze. Shannon was also well known at Bell Labs for riding a unicycle and juggling simultaneously.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
The relentless acceleration of computer hardware over decades suggests that we’ve somehow managed to remain on the steep part of the S-curve for far longer than has been possible in other spheres of technology. The reality, however, is that Moore’s Law has involved successfully climbing a staircase of cascading S-curves, each representing a specific semiconductor fabrication technology.
Martin Ford (Rise of the Robots: Technology and the Threat of a Jobless Future)
Our mind is nothing but accumulated thoughts, good or evil, recorded from the day the child is born. For memory or thought to work, a brain is needed. Software cannot work without hardware. When a computer is damaged, can we believe that its software is still somewhere in the sky? How can memory or the thinking faculty exist outside the brain? The neurotransmitters are responsible for the thought process and for memory retention and retrieval. All are electrochemical impulses which cannot travel to the sky. Our personality, individuality etc. are the result of the accumulated thoughts in our brain. It is the quality and nature of accumulated thoughts which decide if one is to become a scientist, poet or a terrorist. A guitar in the hands of a layman does not make any sense. If it is in the hands of a musician, melodious tunes can come out. A child in the hands of lovable and intelligent parents goes to heights.
V.A. Menon
We have in our head a remarkably powerful computer, not vast by conventional hardware standards, but able to represent the structure of our world by various types of associative links in a vast network of various types of ideas.
Daniel Kahneman (Thinking, Fast and Slow)
The truth is that anxiety is at once a function of biology and philosophy, body and mind, instinct and reason, personality and culture. Even as anxiety is experienced at a spiritual and psychological level, it is scientifically measurable at the molecular level and the physiological level. It is produced by nature and it is produced by nurture. It’s a psychological phenomenon and a sociological phenomenon. In computer terms, it’s both a hardware problem (I’m wired badly) and a software problem (I run faulty logic programs that make me think anxious thoughts). The origins of a temperament are many faceted; emotional dispositions that may seem to have a simple, single source—a bad gene, say, or a childhood trauma—may not.
Scott Stossel (My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind)
Our minds have the incredible capacity to both alter the strength of connections among neurons, essentially rewiring them, and create entirely new pathways. (It makes a computer, which cannot create new hardware when its system crashes, seem fixed and helpless).
Susannah Cahalan (Brain on Fire: My Month of Madness)
Under a $652-million clandestine program code named GENIE, the NSA, CIA, and special military operatives have planted covert digital bugs in tens of thousands of computers, routers, and firewalls around the world to conduct computer network exploitation, or CNE. Some are planted remotely, but others require physical access to install through so-called interdiction—the CIA or FBI intercepts shipments of hardware from manufacturers and retailers in order to plant malware in them or install doctored chips before they reach the customer.
Kim Zetter (Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon)
If men create intelligent machines, or fantasize about them, it is either because they secretly despair of their own intelligence or because they are in danger of succumbing to the weight of a monstrous and useless intelligence which they seek to exorcize by transferring it to machines, where they can play with it and make fun of it. By entrusting this burdensome intelligence to machines we are released from any responsibility to knowledge, much as entrusting power to politicians allows us to disdain any aspiration of our own to power. If men dream of machines that are unique, that are endowed with genius, it is because they despair of their own uniqueness, or because they prefer to do without it - to enjoy it by proxy, so to speak, thanks to machines. What such machines offer is the spectacle of thought, and in manipulating them people devote themselves more to the spectacle of thought than to thought itself. It is not for nothing that they are described as 'virtual', for they put thought on hold indefinitely, tying its emergence to the achievement of a complete knowledge. The act of thinking itself is thus put off for ever. Indeed, the question of thought can no more be raised than the question of the freedom of future generations, who will pass through life as we travel through the air, strapped into their seats. These Men of Artificial Intelligence will traverse their own mental space bound hand and foot to their computers. Immobile in front of his computer, Virtual Man makes love via the screen and gives lessons by means of the teleconference. He is a physical - and no doubt also a mental cripple. That is the price he pays for being operational. Just as eyeglasses and contact lenses will arguably one day evolve into implanted prostheses for a species that has lost its sight, it is similarly to be feared that artificial intelligence and the hardware that supports it will become a mental prosthesis for a species without the capacity for thought. Artificial intelligence is devoid of intelligence because it is devoid of artifice.
Jean Baudrillard (The Transparency of Evil: Essays in Extreme Phenomena)
the modern relationship between software and hardware is essentially the same as that between music and the instrument or voice that brings it to life. A single computer can transform itself into the cockpit of a fighter jet, a budget projection, a chapter of a novel, or whatever else you want, just as a single piano can be used to play Bach or funky blues.
M. Mitchell Waldrop (The Dream Machine)
Based on the above analyses, it is reasonable to expect the hardware that can emulate human-brain functionality to be available for approximately one thousand dollars by around 2020. As we will discuss in chapter 4, the software that will replicate that functionality will take about a decade longer. However, the exponential growth of the price-performance, capacity, and speed of our hardware technology will continue during that period, so by 2030 it will take a village of human brains (around one thousand) to match a thousand dollars’ worth of computing. By 2050, one thousand dollars of computing will exceed the processing power of all human brains on Earth. Of course, this figure includes those brains still using only biological neurons.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
Nothing's immortal on a road trip of a billion years. The universe runs down in stop-motion around you, your backups' backups' backups need backups. Not even the error-correcting replication strategies cadged from biology can keep the mutations at bay forever. It was true for us meatsicles cycling through mayfly moments every thousand years; it was just as true for the hardware.
Peter Watts (The Freeze-Frame Revolution)
Pham Nuwen spent years learning to program/explore. Programming went back to the beginning of time. It was a little like the midden out back of his father’s castle. Where the creek had worn that away, ten meters down, there were the crumpled hulks of machines—flying machines, the peasants said—from the great days of Canberra’s original colonial era. But the castle midden was clean and fresh compared to what lay within the Reprise’s local net. There were programs here that had been written five thousand years ago, before Humankind ever left Earth. The wonder of it—the horror of it, Sura said—was that unlike the useless wrecks of Canberra’s past, these programs still worked! And via a million million circuitous threads of inheritance, many of the oldest programs still ran in the bowels of the Qeng Ho system. Take the Traders’ method of timekeeping. The frame corrections were incredibly complex—and down at the very bottom of it was a little program that ran a counter. Second by second, the Qeng Ho counted from the instant that a human had first set foot on Old Earth’s moon. But if you looked at it still more closely. . .the starting instant was actually some hundred million seconds later, the 0-second of one of Humankind’s first computer operating systems. So behind all the top-level interfaces was layer under layer of support. Some of that software had been designed for wildly different situations. Every so often, the inconsistencies caused fatal accidents. Despite the romance of spaceflight, the most common accidents were simply caused by ancient, misused programs finally getting their revenge. “We should rewrite it all,” said Pham. “It’s been done,” said Sura, not looking up. She was preparing to go off-Watch, and had spent the last four days trying to root a problem out of the coldsleep automation. “It’s been tried,” corrected Bret, just back from the freezers. “But even the top levels of fleet system code are enormous. You and a thousand of your friends would have to work for a century or so to reproduce it.” Trinli grinned evilly. “And guess what—even if you did, by the time you finished, you’d have your own set of inconsistencies. And you still wouldn’t be consistent with all the applications that might be needed now and then.” Sura gave up on her debugging for the moment. “The word for all this is ‘mature programming environment.’ Basically, when hardware performance has been pushed to its final limit, and programmers have had several centuries to code, you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy—take the situation I have here.” She waved at the dependency chart she had been working on. “We are low on working fluid for the coffins. Like a million other things, there was none for sale on dear old Canberra. Well, the obvious thing is to move the coffins near the aft hull, and cool by direct radiation. We don’t have the proper equipment to support this—so lately, I’ve been doing my share of archeology. It seems that five hundred years ago, a similar thing happened after an in-system war at Torma. They hacked together a temperature maintenance package that is precisely what we need.” “Almost precisely.
Vernor Vinge (A Deepness in the Sky (Zones of Thought, #2))
This entailed switching around by hand ENIAC’s rat’s nest of cables and resetting its switches. At first the programming seemed to be a routine, perhaps even menial task, which may have been why it was relegated to women, who back then were not encouraged to become engineers. But what the women of ENIAC soon showed, and the men later came to understand, was that the programming of a computer could be just as significant as the design of its hardware.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
REINHOLD JOBS. Wisconsin-born Coast Guard seaman who, with his wife, Clara, adopted Steve in 1955.
REED JOBS. Oldest child of Steve Jobs and Laurene Powell.
RON JOHNSON. Hired by Jobs in 2000 to develop Apple’s stores.
JEFFREY KATZENBERG. Head of Disney Studios, clashed with Eisner and resigned in 1994 to cofound DreamWorks SKG.
ALAN KAY. Creative and colorful computer pioneer who envisioned early personal computers, helped arrange Jobs’s Xerox PARC visit and his purchase of Pixar.
DANIEL KOTTKE. Jobs’s closest friend at Reed, fellow pilgrim to India, early Apple employee.
JOHN LASSETER. Cofounder and creative force at Pixar.
DAN’L LEWIN. Marketing exec with Jobs at Apple and then NeXT.
MIKE MARKKULA. First big Apple investor and chairman, a father figure to Jobs.
REGIS MCKENNA. Publicity whiz who guided Jobs early on and remained a trusted advisor.
MIKE MURRAY. Early Macintosh marketing director.
PAUL OTELLINI. CEO of Intel who helped switch the Macintosh to Intel chips but did not get the iPhone business.
LAURENE POWELL. Savvy and good-humored Penn graduate, went to Goldman Sachs and then Stanford Business School, married Steve Jobs in 1991.
GEORGE RILEY. Jobs’s Memphis-born friend and lawyer.
ARTHUR ROCK. Legendary tech investor, early Apple board member, Jobs’s father figure.
JONATHAN “RUBY” RUBINSTEIN. Worked with Jobs at NeXT, became chief hardware engineer at Apple in 1997.
MIKE SCOTT. Brought in by Markkula to be Apple’s president in 1977 to try to manage Jobs.
Walter Isaacson (Steve Jobs)
Jon Rubinstein, who was in charge of hardware, adapted the microprocessor and guts of the PowerMac G3, Apple’s high-end professional computer, for use in the proposed new machine. It would have a hard drive and a tray for compact disks, but in a rather bold move, Jobs and Rubinstein decided not to include the usual floppy disk drive. Jobs quoted the hockey star Wayne Gretzky’s maxim, “Skate where the puck’s going, not where it’s been.” He was a bit ahead of his time, but eventually most computers eliminated floppy disks.
Walter Isaacson (Steve Jobs)
Tiny hardware malfunctions can produce outsized risks. In 1980 a single faulty computer chip costing forty-six cents almost triggered a major nuclear incident over the Pacific. And in perhaps the most well-known case, nuclear catastrophe was only avoided during the Cuban missile crisis when one man, the acting Russian commodore, Vasili Arkhipov, refused to give an order to fire nuclear torpedoes. The two other officers on the submarine, convinced they were under attack, had brought the world within a split second of full-scale nuclear war.
Mustafa Suleyman (The Coming Wave: AI, Power, and Our Future)
The clarity offered by software as metaphor - and the empowerment allegedly offered to us who know software - should make us pause, because software also engenders a sense of profound ignorance. Software is extremely difficult to comprehend. Who really knows what lurks behind our smiling interfaces, behind the objects we click and manipulate? Who completely understands what one’s computer is actually doing at any given moment? Software as a metaphor for metaphor troubles the usual functioning of metaphor, that is, the clarification of an unknown concept through a known one. For, if software illuminates an unknown, it does so through an unknowable (software). This paradox - this drive to grasp what we do not know through what we do not entirely understand… does not undermine, but rather grounds software’s appeal. Its combination of what can be seen and not seen, can be known and not known - its separation of interface from algorithm, of software from hardware - makes it a powerful metaphor for everything we believe is invisible yet generates visible effects, from genetics to the invisible hand of the market, from ideology to culture. Every use entails an act of faith.
Wendy Hui Kyong Chun (Programmed Visions: Software and Memory (Software Studies))
There is no such thing as a computational person, whose mind is like computer software, able to work on any suitable computer or neural hardware, whose mind somehow derives meaning from taking meaningless symbols as input, manipulating them by rule, and giving meaningless symbols as output. Real people have embodied minds whose conceptual systems arise from, are shaped by, and are given meaning through living human bodies. The neural structures of our brains produce conceptual systems and linguistic structures that cannot be adequately accounted for by formal systems that only manipulate symbols.
George Lakoff (Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought)
The creative imitator looks at products or services from the viewpoint of the customer. IBM’s personal computer is practically indistinguishable from the Apple in its technical features, but IBM from the beginning offered the customer programs and software. Apple maintained traditional computer distribution through specialty stores. IBM—in a radical break with its own traditions—developed all kinds of distribution channels, specialty stores, major retailers like Sears, Roebuck, its own retail stores, and so on. It made it easy for the consumer to buy and it made it easy for the consumer to use the product. These, rather than hardware features, were the “innovations” that gave IBM the personal computer market.
Peter F. Drucker (Innovation and Entrepreneurship)
This kind of pragmatism has become a hallmark of our psychological culture. In the mid-1990s, I described how it was commonplace for people to “cycle through” different ideas of the human mind as (to name only a few images) mechanism, spirit, chemistry, and vessel for the soul.14 These days, the cycling through intensifies. We are in much more direct contact with the machine side of mind. People are fitted with a computer chip to help with Parkinson’s. They learn to see their minds as program and hardware. They take antidepressants prescribed by their psychotherapists, confident that the biochemical and oedipal self can be treated in one room. They look for signs of emotion in a brain scan. Old jokes about couples needing “chemistry” turn out not to be jokes at all.
Sherry Turkle (Alone Together: Why We Expect More from Technology and Less from Each Other)
Even in the 1950s, computers were described in the popular press as “super-brains” that were “faster than Einstein.” So can we say now, finally, that computers are as powerful as the human brain? No. Focusing on raw computing power misses the point entirely. Speed alone won’t give us AI. Running a poorly designed algorithm on a faster computer doesn’t make the algorithm better; it just means you get the wrong answer more quickly. (And with more data there are more opportunities for wrong answers!) The principal effect of faster machines has been to make the time for experimentation shorter, so that research can progress more quickly. It’s not hardware that is holding AI back; it’s software. We don’t yet know how to make a machine really intelligent—even if it were the size of the universe.
Stuart Russell (Human Compatible: Artificial Intelligence and the Problem of Control)
Jobs’s reluctance to make the Mac compatible with the architecture of the Lisa was motivated by more than rivalry or revenge. There was a philosophical component, one that was related to his penchant for control. He believed that for a computer to be truly great, its hardware and its software had to be tightly linked. When a computer was open to running software that also worked on other computers, it would end up sacrificing some functionality. The best products, he believed, were “whole widgets” that were designed end-to-end, with the software closely tailored to the hardware and vice versa. This is what would distinguish the Macintosh, which had an operating system that worked only on its own hardware, from the environment that Microsoft was creating, in which its operating system could be used on hardware made by many different companies.
Walter Isaacson (Steve Jobs)
About sixty thousand different thoughts are said to go through a person’s mind over the course of a day. Ninety-five percent of that is made up of the same things we’d been thinking about the day before, and 80 percent of those thoughts are believed to be negative. In my days as a maximalist, I lived in fear of my future, constantly worrying about my career and how others saw me. Forget about that 80 percent I mentioned a moment earlier—practically all my thoughts were negative. So, how do you make a slow computer like that work properly? Since our fifty-thousand-year-old hardware isn’t going to change, we need to get rid of the extra load that isn’t needed. Rather than trying to add more and more, running out of disk space and exhausting ourselves in the process, I think it’s time we started thinking about subtracting and refining to enhance the truly important things that might be buried deep down underneath all that excess.
Fumio Sasaki (Goodbye, Things: The New Japanese Minimalism)
So which theory did Lagos believe in? The relativist or the universalist?" "He did not seem to think there was much of a difference. In the end, they are both somewhat mystical. Lagos believed that both schools of thought had essentially arrived at the same place by different lines of reasoning." "But it seems to me there is a key difference," Hiro says. "The universalists think that we are determined by the prepatterned structure of our brains -- the pathways in the cortex. The relativists don't believe that we have any limits." "Lagos modified the strict Chomskyan theory by supposing that learning a language is like blowing code into PROMs -- an analogy that I cannot interpret." "The analogy is clear. PROMs are Programmable Read-Only Memory chips," Hiro says. "When they come from the factory, they have no content. Once and only once, you can place information into those chips and then freeze it -- the information, the software, becomes frozen into the chip -- it transmutes into hardware. After you have blown the code into the PROMs, you can read it out, but you can't write to them anymore. So Lagos was trying to say that the newborn human brain has no structure -- as the relativists would have it -- and that as the child learns a language, the developing brain structures itself accordingly, the language gets 'blown into' the hardware and becomes a permanent part of the brain's deep structure -- as the universalists would have it." "Yes. This was his interpretation." "Okay. So when he talked about Enki being a real person with magical powers, what he meant was that Enki somehow understood the connection between language and the brain, knew how to manipulate it. The same way that a hacker, knowing the secrets of a computer system, can write code to control it -- digital namshubs?" "Lagos said that Enki had the ability to ascend into the universe of language and see it before his eyes. Much as humans go into the Metaverse. That gave him power to create nam-shubs. And nam-shubs had the power to alter the functioning of the brain and of the body." "Why isn't anyone doing this kind of thing nowadays? Why aren't there any namshubs in English?" "Not all languages are the same, as Steiner points out. Some languages are better at metaphor than others. Hebrew, Aramaic, Greek, and Chinese lend themselves to word play and have achieved a lasting grip on reality: Palestine had Qiryat Sefer, the 'City of the Letter,' and Syria had Byblos, the 'Town of the Book.' By contrast other civilizations seem 'speechless' or at least, as may have been the case in Egypt, not entirely cognizant of the creative and transformational powers of language. Lagos believed that Sumerian was an extraordinarily powerful language -- at least it was in Sumer five thousand years ago." "A language that lent itself to Enki's neurolinguistic hacking." "Early linguists, as well as the Kabbalists, believed in a fictional language called the tongue of Eden, the language of Adam. It enabled all men to understand each other, to communicate without misunderstanding. It was the language of the Logos, the moment when God created the world by speaking a word. In the tongue of Eden, naming a thing was the same as creating it. To quote Steiner again, 'Our speech interposes itself between apprehension and truth like a dusty pane or warped mirror. The tongue of Eden was like a flawless glass; a light of total understanding streamed through it. Thus Babel was a second Fall.'
And Isaac the Blind, an early Kabbalist, said that, to quote Gershom Scholem's translation, 'The speech of men is connected with divine speech and all language whether heavenly or human derives from one source: the Divine Name.' The practical Kabbalists, the sorcerers, bore the title Ba'al Shem, meaning 'master of the divine name.'" "The machine language of the world," Hiro says.
Neal Stephenson (Snow Crash)
The Xerox Corporation’s Palo Alto Research Center, known as Xerox PARC, had been established in 1970 to create a spawning ground for digital ideas. It was safely located, for better and for worse, three thousand miles from the commercial pressures of Xerox corporate headquarters in Connecticut. Among its visionaries was the scientist Alan Kay, who had two great maxims that Jobs embraced: “The best way to predict the future is to invent it” and “People who are serious about software should make their own hardware.” Kay pushed the vision of a small personal computer, dubbed the “Dynabook,” that would be easy enough for children to use. So Xerox PARC’s engineers began to develop user-friendly graphics that could replace all of the command lines and DOS prompts that made computer screens intimidating. The metaphor they came up with was that of a desktop. The screen could have many documents and folders on it, and you could use a mouse to point and click on the one you wanted to use.
Walter Isaacson (Steve Jobs)
Bitcoin was in theory and in practice inseparable from the process of computation run on cheap, powerful hardware: the system could not have existed without markets for digital moving images, especially video games, driving down the price of microchips that could handle the onerous business of guessing. It also had a voracious appetite for electricity, which had to come from somewhere - burning coal or natural gas, spinning turbines, decaying uranium - and which wasn't being used for something arguably more constructive than this discovery of meaningless hashes. The whole apparatus of the early twenty-first century's most complex and refined infrastructures and technologies was turned to the conquest of the useless. It resembled John Maynard Keynes's satirical response to criticisms of his capital injection proposal by proponents of the gold standard: just put banknotes in bottles, he suggested, and bury them in disused coal mines for people to dig up - a useless task to slow the dispersal of the new money and get people to work for it. 'It would, indeed, be more sensible to build houses and the like; but if there are political and practical difficulties in the way of this, the above would be better than nothing.'
Finn Brunton (Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency)
The collapse, for example, of IBM’s legendary 80-year-old hardware business in the 1990s sounds like a classic P-type story. New technology (personal computers) displaces old (mainframes) and wipes out incumbent (IBM). But it wasn’t. IBM, unlike all its mainframe competitors, mastered the new technology. Within three years of launching its first PC, in 1981, IBM achieved $5 billion in sales and the #1 position, with everyone else either far behind or out of the business entirely (Apple, Tandy, Commodore, DEC, Honeywell, Sperry, etc.). For decades, IBM dominated computers like Pan Am dominated international travel. Its $13 billion in sales in 1981 was more than its next seven competitors combined (the computer industry was referred to as “IBM and the Seven Dwarfs”). IBM jumped on the new PC like Trippe jumped on the new jet engines. IBM owned the computer world, so it outsourced two of the PC components, software and microprocessors, to two tiny companies: Microsoft and Intel. Microsoft had all of 32 employees. Intel desperately needed a cash infusion to survive. IBM soon discovered, however, that individual buyers care more about exchanging files with friends than the brand of their box. And to exchange files easily, what matters is the software and the microprocessor inside that box, not the logo of the company that assembled the box. IBM missed an S-type shift—a change in what customers care about. PC clones using Intel chips and Microsoft software drained IBM’s market share. In 1993, IBM lost $8.1 billion, its largest-ever loss. That year it let go over 100,000 employees, the largest layoff in corporate history. Ten years later, IBM sold what was left of its PC business to Lenovo. Today, the combined market value of Microsoft and Intel, the two tiny vendors IBM hired, is close to $1.5 trillion, more than ten times the value of IBM. IBM correctly anticipated a P-type loonshot and won the battle. But it missed a critical S-type loonshot, a software standard, and lost the war.
Safi Bahcall (Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries)
Although thrilled that the era of the personal computer had arrived, he was afraid that he was going to miss the party. Slapping down seventy-five cents, he grabbed the issue and trotted through the slushy snow to the Harvard dorm room of Bill Gates, his high school buddy and fellow computer fanatic from Seattle, who had convinced him to drop out of college and move to Cambridge. “Hey, this thing is happening without us,” Allen declared. Gates began to rock back and forth, as he often did during moments of intensity. When he finished the article, he realized that Allen was right. For the next eight weeks, the two of them embarked on a frenzy of code writing that would change the nature of the computer business.1 Unlike the computer pioneers before him, Gates, who was born in 1955, had not grown up caring much about the hardware. He had never gotten his thrills by building Heathkit radios or soldering circuit boards. A high school physics teacher, annoyed by the arrogance Gates sometimes displayed while jockeying at the school’s timesharing terminal, had once assigned him the project of assembling a Radio Shack electronics kit. When Gates finally turned it in, the teacher recalled, “solder was dripping all over the back” and it didn’t work.2 For Gates, the magic of computers was not in their hardware circuits but in their software code. “We’re not hardware gurus, Paul,” he repeatedly pronounced whenever Allen proposed building a machine. “What we know is software.” Even his slightly older friend Allen, who had built shortwave radios, knew that the future belonged to the coders. “Hardware,” he admitted, “was not our area of expertise.”3 What Gates and Allen set out to do on that December day in 1974 when they first saw the Popular Electronics cover was to create the software for personal computers. More than that, they wanted to shift the balance in the emerging industry so that the hardware would become an interchangeable commodity, while those who created the operating system and application software would capture most of the profits.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
At that precise moment, Karla came into the room. She turned off the television, looked Todd straight in the eye and said: "Todd, you exist not only as a member of a family or a company or a nation, but as a member of a species... you are a human being. You are part of humanity. Right now our species has deep problems, and we are trying to dream up a way out of them, and we are using computers to do it. The building of hardware and software is the field in which the species has decided to invest its energies for its survival, and that building requires zones of peace, children born of peace, and the absence of distractions that interfere with the code. We cannot gain knowledge through computing, but we will manage to use it to keep ourselves out of the shit. What you perceive as an emptiness is an earthly paradise: literally, line by line, the freedom to keep humanity from becoming nonlinear." She sat down on the couch, and there was the sound of the rain drumming on the roof, and I realized that there wasn't enough light in the room and that we were all silent. Karla said: "We've had decent lives. None of us, as far as I know, has ever been mistreated. We never wanted for anything, nor did we ever want to own anything. Our parents are all still together, except Susan's. They treated us well, but the real morality here, Todd, is whether their hands were wasted on uncreative lives, or whether these hands are being used to carry forward humanity's dream." It kept raining. "It is no coincidence that as a species we invented the middle class. Without the middle class, we couldn't have had that particular cast of mind that contributes so consistently to spitting out computer systems, and our species could never have made it to the next evolutionary stage, whatever that may be. There is a good chance the middle class won't fit even partly into the next evolutionary phase. But that's neither here nor there. Whether you like it or not, Todd, you, me, Dan, Abe, Bug, and Susan... all of us are makers of the next REM cycle of the human dream. Everyone else will be drawn to it. Don't question it, Todd, and don't wallow in it, but never allow yourself to forget it."
Douglas Coupland (Microserfs)
me to be honest about his failings as well as his strengths. She is one of the smartest and most grounded people I have ever met. “There are parts of his life and personality that are extremely messy, and that’s the truth,” she told me early on. “You shouldn’t whitewash it. He’s good at spin, but he also has a remarkable story, and I’d like to see that it’s all told truthfully.” I leave it to the reader to assess whether I have succeeded in this mission. I’m sure there are players in this drama who will remember some of the events differently or think that I sometimes got trapped in Jobs’s distortion field. As happened when I wrote a book about Henry Kissinger, which in some ways was good preparation for this project, I found that people had such strong positive and negative emotions about Jobs that the Rashomon effect was often evident. But I’ve done the best I can to balance conflicting accounts fairly and be transparent about the sources I used. This is a book about the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing. You might even add a seventh, retail stores, which Jobs did not quite revolutionize but did reimagine. In addition, he opened the way for a new market for digital content based on apps rather than just websites. Along the way he produced not only transforming products but also, on his second try, a lasting company, endowed with his DNA, that is filled with creative designers and daredevil engineers who could carry forward his vision. In August 2011, right before he stepped down as CEO, the enterprise he started in his parents’ garage became the world’s most valuable company. This is also, I hope, a book about innovation. At a time when the United States is seeking ways to sustain its innovative edge, and when societies around the world are trying to build creative digital-age economies, Jobs stands as the ultimate icon of inventiveness, imagination, and sustained innovation. He knew that the best way to create value in the twenty-first century was to connect creativity with technology, so he built a company where leaps of the imagination were combined with remarkable feats of engineering. He and his colleagues at Apple were able to think differently: They developed not merely modest product advances based on focus groups, but whole new devices and services that consumers did not yet know they needed. He was not a model boss or human being, tidily packaged for emulation. Driven by demons, he could drive those around him to fury and despair. But his personality and passions and products were all interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is thus both instructive and cautionary, filled with lessons about innovation, character, leadership, and values.
Walter Isaacson (Steve Jobs)
Similarly, the computers used to run the software on the ground for the mission were borrowed from a previous mission. These machines were so out of date that Bowman had to shop on eBay to find replacement parts to get the machines working. As systems have gone obsolete, JPL no longer uses the software, but Bowman told me that the people on her team continue to use software built by JPL in the 1990s, because they are familiar with it. She said, “Instead of upgrading to the next thing we decided that it was working just fine for us and we would stay on the platform.” They have developed so much over such a long period of time with the old software that they don’t want to switch to a newer system. They must adapt to using these outdated systems for the latest scientific work. Working within these constraints may seem limiting. However, building tools with specific constraints—from outdated technologies and low bitrate radio antennas—can enlighten us. For example, as scientists started to explore what they could learn from the wait times while communicating with deep space probes, they discovered that the time lag was extraordinarily useful information. Wait times, they realized, constitute an essential component for locating a probe in space, calculating its trajectory, and accurately locating a target like Pluto in space. There is no GPS for spacecraft (they aren’t on the globe, after all), so scientists had to find a way to locate the spacecraft in the vast expanse. Before 1960, the location of planets and objects in deep space was established through astronomical observation, placing an object like Pluto against a background of stars to determine its position.15 In 1961, an experiment at the Goldstone Deep Space Communications Complex in California used radar to more accurately define an “astronomical unit” and help measure distances in space much more accurately.16 NASA used this new data as part of creating the trajectories for missions in the following years. Using the data from radio signals across a wide range of missions over the decades, the Deep Space Network maintained an ongoing database that helped further refine the definition of an astronomical unit—a kind of longitudinal study of space distances that now allows missions like New Horizons to create accurate flight trajectories. The Deep Space Network continued to find inventive ways of using the time lag of radio waves to locate objects in space, ultimately finding that certain ways of waiting for a downlink signal from the spacecraft were less accurate than others. It turned to using the antennas from multiple locations, such as Goldstone in California and the antennas in Canberra, Australia, or Madrid, Spain, to time how long the signal took to hit these different locations on Earth. The time it takes to receive these signals from the spacecraft works as a way to locate the probes as they are journeying to their destination. Latency—or the different time lag of receiving radio signals on different locations of Earth—is the key way that deep space objects are located as they journey through space. This discovery was made possible during the wait times for communicating with these craft alongside the decades of data gathered from each space mission. Without the constraint of waiting, the notion of using time as a locating feature wouldn’t have been possible.
Jason Farman (Delayed Response: The Art of Waiting from the Ancient to the Instant World)
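The technique Farman describes rests on one physical fact: radio signals travel at the speed of light, so a measured wait is also a measured distance. A toy ranging sketch (the nine-hour round trip is an illustrative figure, roughly the Earth-Pluto light time at the New Horizons flyby, not a number from the book):

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def one_way_distance_km(round_trip_seconds):
    """Distance to a spacecraft implied by a reply's round-trip light time."""
    return C_KM_PER_S * round_trip_seconds / 2

# A probe whose replies take ~9 hours to come back is ~4.9 billion km out
print(f"{one_way_distance_km(9 * 3600):,.0f} km")
```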
a digital design engineer, you would spend long hours going through the TTL Data Book familiarizing yourself with the types of TTL chips that were available. Once you knew all your tools, you could actually build the computer I showed in Chapter 17 out of TTL chips. Wiring the chips together is a lot easier than wiring individual transistors
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
might be just a set of equations and eye-blearing numbers disembodied from all physical significance. She might not hear another word about the work until a piece appeared in Air Scoop or Aviation or Air Trails. Or never. For many men, a computer was a piece of living hardware, an appliance that inhaled one set of figures and exhaled another. Once a girl finished a particular job, the calculations were whisked away into the shadowy kingdom of the engineers. “Woe unto thee if they shall make thee a computer,” joked a column in Air Scoop. “For the Project Engineer will take credit for whatsoever thou doth that is clever and full of glory. But if he slippeth up, and maketh a wrong calculation, or pulleth a boner of any kind whatsoever, he shall lay the mistake at thy door when he is called to account and he shall say, ‘What can you
Margot Lee Shetterly (Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race)
And this is why MapReduce is designed to tolerate frequent unexpected task termination: it’s not because the hardware is particularly unreliable, it’s because the freedom to arbitrarily terminate processes enables better resource utilization in a computing cluster.
Martin Kleppmann (Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems)
Ben-Ari frowned and stared around the galley. "We'll need more than computers. The Brendan has sensors too. Telescopes. Radio dishes. Lots of hardware. We might be able to move some of it over. The navigational systems I can probably move. We might have to duct tape them onto the Anansi's hull, but . . ." She nodded. "We might just be able to navigate on that thing."
Daniel Arenson (Earth Shadows (Earthrise, #5))
It’s natural for us to rate the difficulty of tasks relative to how hard it is for us humans to perform them, as in figure 2.1. But this can give a misleading picture of how hard they are for computers. It feels much harder to multiply 314,159 by 271,828 than to recognize a friend in a photo, yet computers creamed us at arithmetic long before I was born, while human-level image recognition has only recently become possible. This fact that low-level sensorimotor tasks seem easy despite requiring enormous computational resources is known as Moravec’s paradox, and is explained by the fact that our brain makes such tasks feel easy by dedicating massive amounts of customized hardware to them—more than a quarter of our brains, in fact.
Max Tegmark (Life 3.0: Being Human in the Age of Artificial Intelligence)
Your laptop is a note in a symphony currently being played by an orchestra of incalculable size. It’s a very small part of a much greater whole. Most of its capacity resides beyond its hard shell. It maintains its function only because a vast array of other technologies are currently and harmoniously at play. It is fed, for example, by a power grid whose function is invisibly dependent on the stability of a myriad of complex physical, biological, economic and interpersonal systems. The factories that make its parts are still in operation. The operating system that enables its function is based on those parts, and not on others yet to be created. Its video hardware runs the technology expected by the creative people who post their content on the web. Your laptop is in communication with a certain, specified ecosystem of other devices and web servers. And, finally, all this is made possible by an even less visible element: the social contract of trust—the interconnected and fundamentally honest political and economic systems that make the reliable electrical grid a reality. This interdependency of part on whole, invisible in systems that work, becomes starkly evident in systems that don’t. The higher-order, surrounding systems that enable personal computing hardly exist at all in corrupt, third-world countries, so that the power lines, electrical switches, outlets, and all the other entities so hopefully and concretely indicative of such a grid are absent or compromised, and in fact make little contribution to the practical delivery of electricity to people’s homes and factories. This makes perceiving the electronic and other devices that electricity theoretically enables as separate, functional units frustrating, at minimum, and impossible, at worst. This is partly because of technical insufficiency: the systems simply don’t work. But it is also in no small part because of the lack of trust characteristic of systemically corrupt societies. To put it another way: What you perceive as your computer is like a single leaf, on a tree, in a forest—or, even more accurately, like your fingers rubbing briefly across that leaf. A single leaf can be plucked from a branch. It can be perceived, briefly, as a single, self-contained entity—but that perception misleads more than clarifies. In a few weeks, the leaf will crumble and dissolve. It would not have been there at all, without the tree. It cannot continue to exist, in the absence of the tree. This is the position of our laptops in relation to the world. So much of what they are resides outside their boundaries that the screened devices we hold on our laps can only maintain their computer-like façade for a few short years. Almost everything we see and hold is like that, although often not so evidently
Jordan B. Peterson (12 Rules for Life: An Antidote to Chaos)
I had a great deal of valuable knowledge - about genetics, computers, aikido, karate, hardware, chess, wine, cocktails, dancing, sexual positions, social protocols, and the probability of a fifty-six-game hitting streak occurring in the history of baseball. I knew so much shit and I still couldn't fix myself.
Graeme Simsion (The Rosie Project (Don Tillman, #1))
The big change has been in the hardware/software cost ratio. The buyer of a $2-million machine in 1960 felt that he could afford $250,000 more for a customized payroll program, one that slipped easily and nondisruptively into the computer-hostile social environment. Buyers of $50,000 office machines today cannot conceivably afford customized payroll programs; so they adapt their payroll procedures to the packages available.
Frederick P. Brooks Jr. (The Mythical Man-Month: Essays on Software Engineering)
In 2016, Tesla announced that every new vehicle would be equipped with all the hardware it needs to drive autonomously, including a bevy of sensors and an onboard computer running a neural network.2 The kicker: the autonomous AI software won’t be fully deployed. As it turns out, Tesla will test drivers against software simulations running in the background on the car’s computer. Only when the background program consistently simulates moves more safely than the driver does will the autonomous software be ready for prime time. At that point, Tesla will release the program through remote software updates. What this all means is that Tesla drivers will, in aggregate, be teaching the fleet of cars how to drive.
Paul R. Daugherty (Human + Machine: Reimagining Work in the Age of AI)
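The deployment gate described here amounts to scoring two policies, the human's and the simulation's, on the same events. The sketch below is a hypothetical reconstruction: the function, field names, and threshold are invented for illustration, and Tesla's actual criteria are not public in this form.

```python
def shadow_mode_gate(events, threshold=0.9999):
    """Release the autonomous software only when the background simulation's
    chosen maneuver scores at least as safe as the human driver's,
    consistently across a large sample of events."""
    at_least_as_safe = sum(e["sim_score"] >= e["human_score"] for e in events)
    return at_least_as_safe / len(events) >= threshold

# Hypothetical records: in the second event the human out-drove the simulation
events = [{"sim_score": 0.98, "human_score": 0.95},
          {"sim_score": 0.97, "human_score": 0.99}]
print(shadow_mode_gate(events))  # False -- not ready for release yet
```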
Computer power has become a service instead of a hardware investment. Anyone with a credit card and some know-how can rent a virtual supercomputer.
James Barrat (Our Final Invention: Artificial Intelligence and the End of the Human Era)
Unstable internet messages on Zoom are sometimes being generated by poorly performing computer hardware.
Steven Magee
Ah, so you’re an astronomer?” said Twoflower. “Oh no,” said Belafon, as the rock drifted gently around the curve of a mountain, “I’m a computer hardware consultant.
Terry Pratchett (The Light Fantastic (Discworld, #2; Rincewind, #2))
John Hennessy and David Patterson: they are titled Computer Organization and Design: The Hardware/Software Interface and Computer Architecture: A Quantitative Approach (both published by Morgan Kaufmann).
Gian-Paolo D. Musumeci (System Performance Tuning: Help for Unix Administrators)
The oldest computer hardware I worked on was in professional astronomy.
Steven Magee
With both the hardware and software needed to fully emulate human intelligence, we can expect computers to pass the Turing test, indicating intelligence indistinguishable from that of biological humans, by the end of the 2020s.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
Although the parallel truism, Wirth’s Law, states that software becomes exponentially slower to run as computer power increases, so the net gain from upgrading your hardware is much smaller than you’d expect.
Helen Arney (The Element in the Room: Science-y Stuff Staring You in the Face)
IT infrastructure encompasses the hardware, software, networks, and services required to operate an organization's information technology environment, supporting its computing needs, data storage, networking, and other essential operations.
Education Transforming mental health and substance abuse systems of care: community integration and
hard·ware n. The part of a computer system that can be kicked.
Michael Barr (Programming Embedded Systems: With C and GNU Development Tools)
More than that, to the extent computer hardware and software had security holes, the NSA’s managers were reluctant to patch them. Much of this hardware and software was used (or copied) in countries worldwide, including the targets of NSA surveillance; if it could easily be hacked, so much the better for surveillance.
Fred Kaplan (Dark Territory: The Secret History of Cyber War)
In the same way that Morse code reduces written language to dots and dashes, the spoken version of the code reduces speech to just two vowel sounds. The key word here is two. Two types of blinks, two vowel sounds, two different anything, really, can with suitable combinations convey all types of information.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
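Petzold's "two different anything" is precisely the trick Morse code plays. A sketch that spells words with only two symbols, short and long (the table holds real Morse codings; the rest is an illustration):

```python
# Real Morse codings; "." and "-" (short and long) are our two symbols
MORSE = {"S": "...", "O": "---", "H": "....", "I": ".."}

def blink(word):
    """Render a word as a sequence of short/long blinks."""
    return " / ".join(MORSE[ch] for ch in word)

print(blink("SOS"))  # ... / --- / ...
print(blink("HI"))   # .... / ..
```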
Just as Morse code provides a good introduction to the nature of codes, the telegraph provides a good introduction to the hardware of the computer.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
In the days before the Internet, collaboration depended on the physical sharing and exchange of computer tapes and disks on which the code was recorded. In current Internet days, rapid technological advances in computer hardware, software, and networking technologies have made it much easier to create and sustain a communal development style on ever-larger scales. Also, implementing new projects is becoming progressively easier as effective project design becomes better understood, and as prepackaged infrastructural support for such projects becomes available on the Web.

Today, an open source software development project is typically initiated by an individual or a small group seeking a solution to an individual's or a firm's need. Raymond (1999, p. 32) suggests that "every good work of software starts by scratching a developer's personal itch" and that "too often software developers spend their days grinding away for pay at programs they neither need nor love. But not in the (open source) world...." A project's initiators also generally become the project's "owners" or "maintainers," who take on responsibility for project management.

Early on, this individual or group generally develops a first, rough version of the code that outlines the functionality envisioned. The source code for this initial version is then made freely available to all via downloading from an Internet website established by the project. The project founders also set up infrastructure for the project that those interested in using or further developing the code can use to seek help, provide information, or contribute new open source code for others to discuss and test.

In the case of projects that are successful in attracting interest, others do download, use, and "play with" the code, and some of these go on to create new and modified code. Most then post what they have done on the project website for use and critique by anyone who is interested. New and modified code that is deemed to be of sufficient quality and of general interest by the project maintainers is then added to the authorized version of the code. In many projects the privilege of adding to the authorized code is restricted to only a few trusted developers, who then serve as gatekeepers for code written by contributors who do not have such access (von Krogh and Spaeth 2002).

Critical tools and infrastructure available to open source software project participants include email lists for specialized purposes that are open to all. Thus, there is a list where code users can report software failures ("bugs") that they encounter during field use of the software. There is also a list where those developing the code can share ideas about what would be good next steps for the project, good features to add, and so on. All of these lists are open to all and are publicly archived.
Eric von Hippel (Democratizing Innovation)
Bits also play a part in logic, that strange blend of philosophy and mathematics for which a primary goal is to determine whether certain statements are true or false.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
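To make the bits-and-logic connection concrete: with true as 1 and false as 0, the logical connectives reduce to small arithmetic and bitwise operations. A one-example sketch (the rainy-day proposition is mine, not Petzold's):

    # Boole's insight in miniature: with true = 1 and false = 0,
    # AND is &, OR is |, and NOT is subtraction from 1.
    raining, have_umbrella = 1, 0
    stay_dry = (1 - raining) | (raining & have_umbrella)
    print(bool(stay_dry))  # False: it is raining and there is no umbrella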
Today’s computers use transistors. When used in computers, transistors basically function the same way relays do, but (as we’ll see) they’re much faster and much smaller and much quieter and use much less power and are much cheaper. Building an 8-Bit Adder still requires 144 transistors (more if you replace the ripple carry with a look-ahead carry), but the circuit is microscopic.

After you’ve convinced yourself that relays can indeed be wired together to add binary numbers, you might ask, “But what about subtraction?” Rest assured that you’re not making a nuisance of yourself by asking questions like this; you’re actually being quite perceptive. Addition and subtraction complement each other in some ways, but the mechanics of the two operations are different. An addition marches consistently from the rightmost column of digits to the leftmost column. Each carry from one column is added to the next column. We don’t carry in subtraction, however; we borrow, and that involves an intrinsically different mechanism—a messy back-and-forth kind of thing.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
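Petzold's relay adder translates directly into code. The sketch below mirrors the structure he describes, a full adder built from two half adders plus an OR gate, rippling the carry from the rightmost column leftward; expressing the gates as Python operators rather than relays is my simplification.

    # Gates as operators: ^ is XOR, & is AND, | is OR.
    def half_adder(a, b):
        return a ^ b, a & b                 # sum bit, carry bit

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2                  # two half adders plus an OR

    def add_8bit(x, y):
        result, carry = 0, 0
        for i in range(8):                  # rightmost column to leftmost
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result, carry                # 8-bit sum and the carry out

    print(add_8bit(150, 120))               # (14, 1): 270 overflows 8 bits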
Last year, I was teaching a group of executives who were arguing about whether it was possible to do creative work with people who had poor social skills and who preferred to work alone. One executive from a computer hardware firm squirmed and turned red, finally blurting out, “These are exactly the kind of people I manage.” He went on to say: They hide in their offices, and don’t come out. We divide the work so they each have a separate part. We slide their assignment under the door and run away. They ignore us when we tell them it is good enough—they won’t let us build it until it meets their standards for elegant designs—they don’t care what we think.
Robert I. Sutton (Weird Ideas That Work: 11 1/2 Practices for Promoting, Managing, and Sustaining Innovation)
Virtualization in computing often refers to the abstraction of some physical component into a logical object. By virtualizing an object, you can obtain some greater measure of utility from the resource the object provides. For example, Virtual LANs (local area networks), or VLANs, provide greater network performance and improved manageability by being separated from the physical hardware.
Matthew Portnoy (Virtualization Essentials)
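A minimal illustration of Portnoy's definition, with all names invented: one physical resource is carved into logical objects, each of which behaves like an independent device with its own address space.

    # All names invented: one physical disk presented as two virtual disks.
    class PhysicalDisk:
        def __init__(self, size):
            self.blocks = bytearray(size)   # the real, shared storage

    class VirtualDisk:
        # A logical object with its own zero-based address space,
        # mapped onto a slice of physical storage it never sees directly.
        def __init__(self, physical, offset, size):
            self.physical, self.offset, self.size = physical, offset, size

        def write(self, addr, data):
            assert addr + len(data) <= self.size   # stay inside the slice
            start = self.offset + addr
            self.physical.blocks[start:start + len(data)] = data

    hw = PhysicalDisk(1024)
    vd_a = VirtualDisk(hw, 0, 512)
    vd_b = VirtualDisk(hw, 512, 512)
    vd_a.write(0, b"tenant A")   # each tenant sees "its own" disk
    vd_b.write(0, b"tenant B")

The greater utility Portnoy mentions comes from exactly this indirection: the slice boundaries, not the hardware, define what each logical object can touch.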
The U.S. military is no more capable of operating without the Internet than Amazon.com would be. Logistics, command and control, fleet positioning, everything down to targeting, all rely on software and other Internet-related technologies. And all of it is just as insecure as your home computer, because it is all based on the same flawed underlying technologies and uses the same insecure software and hardware.
Richard A. Clarke (Cyberwar: The Next Threat to National Security & What to Do About It)
The first is that we’re living in a time of astonishing progress with digital technologies—those that have computer hardware, software, and networks at their core.
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
Association of dissimilar ideas

“I had earlier devised an arrangement for beam steering on the two-mile accelerator which reduced the amount of hardware necessary by a factor of two…. Two weeks ago it was pointed out to me that this scheme would steer the beam into the wall and therefore was unacceptable. During the session, I looked at the schematic and asked myself how could we retain the factor of two but avoid steering into the wall. Again a flash of inspiration, in which I thought of the word ‘alternate.’ I followed this to its logical conclusion, which was to alternate polarities sector by sector so the steering bias would not add but cancel. I was extremely impressed with this solution and the way it came to me.”

“Most of the insights come by association.”

“It was the last idea that I thought was remarkable because of the way in which it developed. This idea was the result of a fantasy that occurred during Wagner…. [The participant had earlier listened to Wagner’s ‘Ride of the Valkyries.’] I put down a line which seemed to embody this…. I later made the handle which my sketches suggested and it had exactly the quality I was looking for…. I was very amused at the ease with which all of this was done.”

10. Heightened motivation to obtain closure

“Had tremendous desire to obtain an elegant solution (the most for the least).”

“All known constraints about the problem were simultaneously imposed as I hunted for possible solutions. It was like an analog computer whose output could not deviate from what was desired and whose input was continually perturbed with the inclination toward achieving the output.”

“It was almost an awareness of the ‘degree of perfection’ of whatever I was doing.”

“In what seemed like ten minutes, I had completed the problem, having what I considered (and still consider) a classic solution.”

11. Visualizing the completed solution

“I looked at the paper I was to draw on. I was completely blank. I knew that I would work with a property three hundred feet square. I drew the property lines (at a scale of one inch to forty feet), and I looked at the outlines. I was blank…. Suddenly I saw the finished project. [The project was a shopping center specializing in arts and crafts.] I did some quick calculations …it would fit on the property and not only that …it would meet the cost and income requirements …it would park enough cars …it met all the requirements. It was contemporary architecture with the richness of a cultural heritage …it used history and experience but did not copy it.”

“I visualized the result I wanted and subsequently brought the variables into play which could bring that result about. I had great visual (mental) perceptibility; I could imagine what was wanted, needed, or not possible with almost no effort. I was amazed at my idealism, my visual perception, and the rapidity with which I could operate.”
James Fadiman (The Psychedelic Explorer's Guide: Safe, Therapeutic, and Sacred Journeys)
Traditional theory said that hardware was a limit,” Ericsson said. “But if people are able to transform the mechanism that mediates performance by training, then we're in an entirely new space. This is a biological system, not a computer. It can construct itself.
Daniel Coyle (The Talent Code: Unlocking the Secret of Skill in Sports, Art, Music, Math, and Just About Everything Else)
“You have to be an optimist to believe in the Singularity,” she says, “and that’s harder than it seems. Have you ever played Maximum Happy Imagination?”

“Sounds like a Japanese game show.”

Kat straightens her shoulders. “Okay, we’re going to play. To start, imagine the future. The good future. No nuclear bombs. Pretend you’re a science fiction writer.”

Okay: “World government … no cancer … hover-boards.”

“Go further. What’s the good future after that?”

“Spaceships. Party on Mars.”

“Further.”

“Star Trek. Transporters. You can go anywhere.”

“Further.”

I pause a moment, then realize: “I can’t.”

Kat shakes her head. “It’s really hard. And that’s, what, a thousand years? What comes after that? What could possibly come after that? Imagination runs out. But it makes sense, right? We probably just imagine things based on what we already know, and we run out of analogies in the thirty-first century.”

I’m trying hard to imagine an average day in the year 3012. I can’t even come up with a half-decent scene. Will people live in buildings? Will they wear clothes? My imagination is almost physically straining. Fingers of thought are raking the space behind the cushions, looking for loose ideas, finding nothing.

“Personally, I think the big change is going to be our brains,” Kat says, tapping just above her ear, which is pink and cute. “I think we’re going to find different ways to think, thanks to computers. You expect me to say that”—yes—“but it’s happened before. It’s not like we have the same brains as people a thousand years ago.”

Wait: “Yes we do.”

“We have the same hardware, but not the same software. Did you know that the concept of privacy is, like, totally recent? And so is the idea of romance, of course.”

Yes, as a matter of fact, I think the idea of romance just occurred to me last night. (I don’t say that out loud.)

“Each big idea like that is an operating system upgrade,” she says, smiling. Comfortable territory. “Writers are responsible for some of it. They say Shakespeare invented the internal monologue.”

Oh, I am very familiar with the internal monologue.

“But I think the writers had their turn,” she says, “and now it’s programmers who get to upgrade the human operating system.”

I am definitely talking to a girl from Google.

“So what’s the next upgrade?”

“It’s already happening,” she says. “There are all these things you can do, and it’s like you’re in more than one place at one time, and it’s totally normal. I mean, look around.”

I swivel my head, and I see what she wants me to see: dozens of people sitting at tiny tables, all leaning into phones showing them places that don’t exist and yet are somehow more interesting than the Gourmet Grotto.

“And it’s not weird, it’s not science fiction at all, it’s…” She slows down a little and her eyes dim. I think she thinks she’s getting too intense. (How do I know that? Does my brain have an app for that?) Her cheeks are flushed and she looks great with all her blood right there at the surface of her skin. “Well,” she says finally, “it’s just that I think the Singularity is totally reasonable to imagine.”
Robin Sloan (Mr. Penumbra's 24-Hour Bookstore (Mr. Penumbra's 24-Hour Bookstore, #1))
IBM’s Watson draws on a plethora of clever algorithms, but it would be uncompetitive without computer hardware that is about one hundred times more powerful than Deep Blue, its chess-playing predecessor that beat the human world champion, Garry Kasparov, in a 1997 match.
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
Although earlier computers existed in isolation from the world, requiring their visuals and sound to be generated and live only within their memory, the Amiga was of the world, able to interface with it in all its rich analog glory. It was the first PC with a sufficient screen resolution and color palette as well as memory and processing power to practically store and display full-color photographic representations of the real world, whether they be scanned in from photographs, captured from film or video, or snapped live by a digitizer connected to the machine. It could be used to manipulate video, adding titles, special effects, or other postproduction tricks. And it was also among the first to make practical use of recordings of real-world sound. The seeds of the digital-media future, of digital cameras and Photoshop and MP3 players, are here.

The Amiga was the first aesthetically satisfying PC. Although the generation of machines that preceded it were made to do many remarkable things, works produced on them always carried an implied asterisk; “Remarkable,” we say, “. . . for existing on such an absurdly limited platform.” Even the Macintosh, a dramatic leap forward in many ways, nevertheless remained sharply limited by its black-and-white display and its lack of fast animation capabilities. Visuals produced on the Amiga, however, were in full color and could often stand on their own terms, not as art produced under huge technological constraints, but simply as art. And in allowing game programmers to move beyond blocky, garish graphics and crude sound, the Amiga redefined the medium of interactive entertainment as being capable of adult sophistication and artistry. The seeds of the aesthetic future, of computers as everyday artistic tools, ever more attractive computer desktops, and audiovisually rich virtual worlds, are here.

The Amiga empowered amateur creators by giving them access to tools heretofore available only to the professional. The platform’s most successful and sustained professional niche was as a video-production workstation, where an Amiga, accompanied by some relatively inexpensive software and hardware peripherals, could give the hobbyist amateur or the frugal professional editing and postproduction capabilities equivalent to equipment costing tens or hundreds of thousands. And much of the graphical and musical creation software available for the machine was truly remarkable. The seeds of the participatory-culture future, of YouTube and Flickr and even the blogosphere, are here.
Jimmy Maher (The Future Was Here: The Commodore Amiga (Platform Studies))
What is an operating system, really? What did Cutler’s team wish to create? Picture a wealthy English household in the early 1900s. Think of a computer—the hardware—as a big house, the family’s residence. The house consists of plumbing and lighting, bricks and mortar, windows and doors—all manner of physical things and processes.

Next, imagine computer software as the people in the house. The household staff, living downstairs, provide a whole range of services at once. The butler stands by the door, the driver washes the car, the housekeeper presses the linen, the cook provides meals and bakes cakes, the gardener rakes the leaves from the lawn. And this activity, which seemingly happens of its own accord, is coordinated by the head of the household staff. Such is the life of the downstairs dwellers, who in a certain sense exist in the background.

Then consider the people upstairs. They are the whole reason for the toil of the people downstairs. The husband desires a driver not simply for peace of mind but because he wishes to travel. The wife employs a cook, so her family can eat well. The children benefit from the work of the gardener, who clears the yard of debris, enabling them to play outdoors safely.

The picture of the family upstairs and their faithful downstairs servants neatly illustrates the great divide in the world of software. The people upstairs are the applications: the word-processing, electronic ledger, database, publishing and numerous other programs that satisfy human needs and wants. The people downstairs collectively perform the functions of an operating system. Theirs is a realm of services, some automatic, some requiring a special request. These services lay the basis for the good stuff of life.
G. Pascal Zachary (Showstopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft)
In the formative years of digital computing, following World War II, both the operating system and applications were considered afterthoughts by designers. The “hardware” of electronics, as distinct from the “software” of programs, was so difficult that engineers could hardly see past it. The most important type of hardware was the circuitry or processors that actually carried out the instructions given the computer. A second set of devices made it possible to get data into and out of a computer. A third class stored information. A fourth class allowed one computer to send information to another, over special cable or telephone lines. The question of software generally arose only after the hardware pieces fell into place.
G. Pascal Zachary (Showstopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft)
In 1951, Grace Murray Hopper, a mathematician with the U.S. Navy’s Bureau of Ordnance Naval Reserve, conceived of a program called a compiler, which translated a programmer’s instructions into the strings of ones and zeroes, or machine language, that ultimately controlled the computer. In principle, compilers seemed just the thing to free programmers from the tyranny of hardware and the mind-numbing binary code.
G. Pascal Zachary (Showstopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft)
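What Hopper's compiler did can be suggested in a few lines. The instruction set, opcode values, and 12-bit word format below are invented purely for illustration; the point is the translation from human-readable mnemonics into the ones and zeroes that control the machine.

    # Opcodes and word format invented for this sketch: a 4-bit opcode
    # packed with an 8-bit operand into a 12-bit machine word.
    OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

    def compile_line(line):
        op, operand = line.split()
        return (OPCODES[op] << 8) | int(operand)

    for word in map(compile_line, ["LOAD 7", "ADD 5", "STORE 12"]):
        print(format(word, "012b"))
    # 000100000111
    # 001000000101
    # 001100001100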
It wasn’t until five years after the first 360 hardware was introduced in 1964 that all of its software ran well. By then, IBM had spent nearly as much writing the software as designing the hardware. This astonished the company’s managers and vividly highlighted “the greatest impediment to advances in computer technology,” the problem of managing large software projects.
G. Pascal Zachary (Showstopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft)
Imagine a computer. The monitor, keyboard, and processor are the hardware. Without any software to run it, your computer would be worthless. Your body is your hardware and your mindset is your operating system. It gives you access to the power of the hardware, and determines what software you can run. It lets you get the most out of your computer, allowing you to balance your checkbook and even create 3-D designs. Your mindset determines how you perceive and interact with the world.
Mike Cernovich (Gorilla Mindset)
An example of the extent of the FSB and GRU covert cyber collection and exploitation was the exposure of what was most likely a Russian State Security & Navy Intelligence covert operation to monitor, exploit and hack targets within the central United States from Russian merchant ships equipped with advanced hacking hardware and tools. The US Coast Guard boarded the merchant ship SS Chem Hydra and found in it wireless intercept equipment associated with Russian hacking teams. Apparently the vessel had personnel on board who were tasked to collect intelligence on wireless networks and attempt hacks on regional computer networks in the heartland of America.
Malcolm W. Nance (The Plot to Hack America: How Putin's Cyberspies and WikiLeaks Tried to Steal the 2016 Election)
You can still use your Kindle while it is connected to your computer and charging via USB. To do so, unmount or eject it so that your Kindle exits USB drive mode.

Windows: Right-click on the "Safely remove hardware" icon in the lower right-hand corner of the task bar and follow the onscreen instructions to remove your Kindle.

Mac OS X: Click the Eject button next to the Kindle in any Finder window, or drag it from the Desktop to the Trash.
Amazon (Kindle User's Guide)
To get a sense of how powerful Musk’s work may end up being for the American economy, have a think about the dominant mechatronic machine of the past several years: the smartphone. Pre-iPhone, the United States was the laggard in the telecommunications industry. All of the exciting cell phones and mobile services were in Europe and Asia, while American consumers bumbled along with dated equipment. When the iPhone arrived in 2007, it changed everything. Apple’s device mimicked many of the functions of a computer and then added new abilities with its apps, sensors, and location awareness. Google charged to market with its Android software and related handsets, and the United States suddenly emerged as the driving force in the mobile industry. Smartphones were revolutionary because of the ways they allowed hardware, software, and services to work in unison. This was a mix that favored the skills of Silicon Valley. The rise of the smartphone led to a massive industrial boom in which Apple became the most valuable company in the country, and billions of its clever devices were spread all over the world.
Ashlee Vance (Elon Musk: Inventing the Future)
Tas led us down the hall to his cold, dark, Gigeresque computer room. Network cables and power cords covered with thick metal shielding dangled from the ceiling, forming a web over our heads as they trailed back and forth to the racks of computer hardware. They looked like tendrils spreading out from the belly of some hideous alien-cyborg creature. Red LED lights flashed here and there in the darkness, and racks of computer equipment loomed obelisk-like around us. “This place is creepy,” Butch said. “Do you sacrifice virgins in here when you’re not hacking?” Tas smirked.
Jamie Sedgwick (Death in the Hallows (Hank Mossberg, Private Ogre #2))
Terrorism suspects aren’t the NSA’s only targets, however. Operations against nation-state adversaries have exploded in recent years as well. In 2011, the NSA mounted 231 offensive cyber operations against other countries, according to the documents, three-fourths of which focused on “top-priority” targets like Iran, Russia, China, and North Korea. Under a $652-million clandestine program code named GENIE, the NSA, CIA, and special military operatives have planted covert digital bugs in tens of thousands of computers, routers, and firewalls around the world to conduct computer network exploitation, or CNE. Some are planted remotely, but others require physical access to install through so-called interdiction—the CIA or FBI intercepts shipments of hardware from manufacturers and retailers in order to plant malware in them or install doctored chips before they reach the customer.
Anonymous
I said early on in this chapter that we would need 144 relays for our adding machine. Here’s how I figured that out.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
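The quote is cut off before the arithmetic, but the figure is straightforward to reconstruct from the gate constructions Petzold uses, two relays each for AND, OR, and NAND; treat the breakdown below as my reading rather than the book's verbatim derivation.

    # My reading of the count (the quote is truncated): Petzold builds
    # AND, OR, and NAND gates from 2 relays each.
    RELAYS_PER_GATE = 2
    xor = 3 * RELAYS_PER_GATE              # OR + NAND feeding an AND = 6
    half_adder = xor + RELAYS_PER_GATE     # XOR (sum) + AND (carry) = 8
    full_adder = 2 * half_adder + RELAYS_PER_GATE   # + OR for carries = 18
    print(8 * full_adder)                  # 144 relays for the 8-bit adder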
Back in the early days of the personal computer, IBM was the industry leader. In the early eighties they had just released the personal computer, and it was taking the business world by storm. They were so sure the profit was all in the hardware that they treated software as a minor accessory. In their carelessness, they licensed the operating system that ran their computers from an unknown Harvard dropout named Bill Gates, letting him keep the rights to it, and they learned the hard way that the computer business is driven by applications, not by the hardware.
Nicholas L Vulich (Manage Like Abraham Lincoln)