Hardware 1990 Quotes

We've searched our database for all the quotes and captions related to Hardware 1990. Here they are! All 9 of them:

Apple has always insisted on having a hardware monopoly, except for a brief period in the mid-1990s when they allowed clone-makers to compete with them, before subsequently putting them out of business. Macintosh hardware was, consequently, expensive. You didn’t open it up and fool around with it because doing so would void the warranty. In fact, the first Mac was specifically designed to be difficult to open—you needed a kit of exotic tools, which you could buy through little ads that began to appear in the back pages of magazines a few months after the Mac came out on the market. These ads always had a certain disreputable air about them, like pitches for lock-picking tools in the backs of lurid detective magazines.
Neal Stephenson (In the Beginning...Was the Command Line)
This kind of pragmatism has become a hallmark of our psychological culture. In the mid-1990s, I described how it was commonplace for people to “cycle through” different ideas of the human mind as (to name only a few images) mechanism, spirit, chemistry, and vessel for the soul.14 These days, the cycling through intensifies. We are in much more direct contact with the machine side of mind. People are fitted with a computer chip to help with Parkinson’s. They learn to see their minds as program and hardware. They take antidepressants prescribed by their psychotherapists, confident that the biochemical and oedipal self can be treated in one room. They look for signs of emotion in a brain scan. Old jokes about couples needing “chemistry” turn out not to be jokes at all.
Sherry Turkle (Alone Together: Why We Expect More from Technology and Less from Each Other)
The collapse, for example, of IBM’s legendary 80-year-old hardware business in the 1990s sounds like a classic P-type story. New technology (personal computers) displaces old (mainframes) and wipes out incumbent (IBM). But it wasn’t. IBM, unlike all its mainframe competitors, mastered the new technology. Within three years of launching its first PC, in 1981, IBM achieved $5 billion in sales and the #1 position, with everyone else either far behind or out of the business entirely (Apple, Tandy, Commodore, DEC, Honeywell, Sperry, etc.).

For decades, IBM dominated computers like Pan Am dominated international travel. Its $13 billion in sales in 1981 was more than its next seven competitors combined (the computer industry was referred to as “IBM and the Seven Dwarfs”). IBM jumped on the new PC like Trippe jumped on the new jet engines. IBM owned the computer world, so it outsourced two of the PC components, software and microprocessors, to two tiny companies: Microsoft and Intel. Microsoft had all of 32 employees. Intel desperately needed a cash infusion to survive.

IBM soon discovered, however, that individual buyers care more about exchanging files with friends than the brand of their box. And to exchange files easily, what matters is the software and the microprocessor inside that box, not the logo of the company that assembled the box. IBM missed an S-type shift—a change in what customers care about. PC clones using Intel chips and Microsoft software drained IBM’s market share.

In 1993, IBM lost $8.1 billion, its largest-ever loss. That year it let go over 100,000 employees, the largest layoff in corporate history. Ten years later, IBM sold what was left of its PC business to Lenovo. Today, the combined market value of Microsoft and Intel, the two tiny vendors IBM hired, is close to $1.5 trillion, more than ten times the value of IBM. IBM correctly anticipated a P-type loonshot and won the battle. But it missed a critical S-type loonshot, a software standard, and lost the war.
Safi Bahcall (Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries)
SCULLEY. Pepsi executive recruited by Jobs in 1983 to be Apple’s CEO, clashed with and ousted Jobs in 1985.
JOANNE SCHIEBLE JANDALI SIMPSON. Wisconsin-born biological mother of Steve Jobs, whom she put up for adoption, and Mona Simpson, whom she raised.
MONA SIMPSON. Biological full sister of Jobs; they discovered their relationship in 1986 and became close. She wrote novels loosely based on her mother Joanne (Anywhere but Here), Jobs and his daughter Lisa (A Regular Guy), and her father Abdulfattah Jandali (The Lost Father).
ALVY RAY SMITH. A cofounder of Pixar who clashed with Jobs.
BURRELL SMITH. Brilliant, troubled hardware designer on the original Mac team, afflicted with schizophrenia in the 1990s.
AVADIS “AVIE” TEVANIAN. Worked with Jobs and Rubinstein at NeXT, became chief software engineer at Apple in 1997.
JAMES VINCENT. A music-loving Brit, the younger partner with Lee Clow and Duncan Milner at the ad agency Apple hired.
RON WAYNE. Met Jobs at Atari, became first partner with Jobs and Wozniak at fledgling Apple, but unwisely decided to forgo his equity stake.
STEPHEN WOZNIAK. The star electronics geek at Homestead High; Jobs figured out how to package and market his amazing circuit boards and became his partner in founding Apple.
DEL YOCAM. Early Apple employee who became the General Manager of the Apple II Group and later Apple’s Chief Operating Officer.

INTRODUCTION
How This Book Came to Be

In the early summer of 2004, I got a phone call from Steve Jobs. He had been scattershot friendly to me over the years, with occasional bursts of intensity, especially when he was launching a new product that he wanted on the cover of Time or featured on CNN, places where I’d worked. But now that I was no longer at either of those places, I hadn’t heard from him much. We talked a bit about the Aspen Institute, which I had recently joined, and I invited him to speak at our summer campus in Colorado. He’d be happy to come, he said, but not to be onstage. He wanted instead to take a walk so that we could talk. That seemed a bit odd. I didn’t yet
Walter Isaacson (Steve Jobs)
Similarly, the computers used to run the software on the ground for the mission were borrowed from a previous mission. These machines were so out of date that Bowman had to shop on eBay to find replacement parts to get the machines working. As systems have gone obsolete, JPL no longer uses the software, but Bowman told me that the people on her team continue to use software built by JPL in the 1990s, because they are familiar with it. She said, “Instead of upgrading to the next thing we decided that it was working just fine for us and we would stay on the platform.” They have developed so much over such a long period of time with the old software that they don’t want to switch to a newer system. They must adapt to using these outdated systems for the latest scientific work.

Working within these constraints may seem limiting. However, building tools with specific constraints—from outdated technologies and low bitrate radio antennas—can enlighten us. For example, as scientists started to explore what they could learn from the wait times while communicating with deep space probes, they discovered that the time lag was extraordinarily useful information. Wait times, they realized, constitute an essential component for locating a probe in space, calculating its trajectory, and accurately locating a target like Pluto in space.

There is no GPS for spacecraft (they aren’t on the globe, after all), so scientists had to find a way to locate the spacecraft in the vast expanse. Before 1960, the location of planets and objects in deep space was established through astronomical observation, placing an object like Pluto against a background of stars to determine its position.15 In 1961, an experiment at the Goldstone Deep Space Communications Complex in California used radar to more accurately define an “astronomical unit” and help measure distances in space much more accurately.16 NASA used this new data as part of creating the trajectories for missions in the following years.

Using the data from radio signals across a wide range of missions over the decades, the Deep Space Network maintained an ongoing database that helped further refine the definition of an astronomical unit—a kind of longitudinal study of space distances that now allows missions like New Horizons to create accurate flight trajectories.

The Deep Space Network continued to find inventive ways of using the time lag of radio waves to locate objects in space, ultimately finding that certain ways of waiting for a downlink signal from the spacecraft were less accurate than others. It turned to using the antennas from multiple locations, such as Goldstone in California and the antennas in Canberra, Australia, or Madrid, Spain, to time how long the signal took to hit these different locations on Earth. The time it takes to receive these signals from the spacecraft works as a way to locate the probes as they are journeying to their destination. Latency—or the different time lag of receiving radio signals on different locations of Earth—is the key way that deep space objects are located as they journey through space.

This discovery was made possible during the wait times for communicating with these craft alongside the decades of data gathered from each space mission. Without the constraint of waiting, the notion of using time as a locating feature wouldn’t have been possible.
Jason Farman (Delayed Response: The Art of Waiting from the Ancient to the Instant World)
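
The localization technique Farman describes (timing how long the same downlink signal takes to arrive at widely separated antennas) reduces to simple geometry. The short Python sketch below is a minimal illustration, not anything from the book or from JPL; the round-trip time, baseline length, and delay are assumed example values. It converts a round-trip light time into a range, and converts the difference in arrival time at two hypothetical ground stations into the angle between the spacecraft's direction and the baseline joining those stations.

    import math

    C = 299_792_458.0       # speed of light, m/s
    AU_M = 1.495978707e11   # one astronomical unit, in meters

    def range_from_round_trip(rtt_seconds):
        """Distance to the spacecraft implied by the round-trip light time."""
        return C * rtt_seconds / 2.0

    def angle_from_arrival_delay(delay_seconds, baseline_m):
        """Angle (radians) between the station-to-station baseline and the
        direction to a very distant source. For a far-away source the extra
        path to the second antenna is baseline * cos(theta), so
        delay = baseline * cos(theta) / c."""
        cos_theta = C * delay_seconds / baseline_m
        cos_theta = max(-1.0, min(1.0, cos_theta))  # clamp numerical noise
        return math.acos(cos_theta)

    if __name__ == "__main__":
        # Assumed example: a probe roughly at Pluto's distance, where the
        # round trip for a radio signal takes on the order of 9.4 hours.
        rtt = 9.4 * 3600.0
        print(f"range ~ {range_from_round_trip(rtt) / AU_M:.1f} AU")

        # Assumed example: a 10,000 km baseline (antennas on different
        # continents) and a 20 ms difference in arrival time.
        theta = angle_from_arrival_delay(0.020, 10_000e3)
        print(f"angle to baseline ~ {math.degrees(theta):.1f} degrees")

Combining the range with angles measured against two different baselines is, in spirit, how a position in deep space can be pinned down from waiting times alone.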
Department stores in particular had difficulty surviving the onslaught of low-price competitors. Their inability to adapt to changing consumer tastes and the emergence of new retail channels that targeted specific market segments—the so-called category killers in hardware, toys, and furniture—deeply eroded their market share. While in the 1960s and 1970s most clothing was sold in full-service department stores, by 1990 such stores accounted for only 29 percent of sales.
Ellen Ruppel Shell (Cheap: The High Cost of Discount Culture)
The insatiable need for more processing power -- ideally, located as close as possible to the user but, at the very least, in nearby industrial server farms -- invariably leads to a third option: decentralized computing. With so many powerful and often inactive devices in the homes and hands of consumers, near other homes and hands, it feels inevitable that we'd develop systems to share in their mostly idle processing power.

Culturally, at least, the idea of collectively shared but privately owned infrastructure is already well understood. Anyone who installs solar panels at their home can sell excess power to their local grid (and, indirectly, to their neighbor). Elon Musk touts a future in which your Tesla earns you rent as a self-driving car when you're not using it yourself -- better than just being parked in your garage for 99% of its life.

As early as the 1990s programs emerged for distributed computing using everyday consumer hardware. One of the most famous examples is the University of California, Berkeley's SETI@HOME, wherein consumers would volunteer use of their home computers to power the search for alien life. Sweeney has highlighted that one of the items on his 'to-do list' for the first-person shooter Unreal Tournament 1, which shipped in 1998, was 'to enable game servers to talk to each other so we can just have an unbounded number of players in a single game session.' Nearly 20 years later, however, Sweeney admitted that goal 'seems to still be on our wish list.'

Although the technology to split GPUs and share non-data center CPUs is nascent, some believe that blockchains provide both the technological mechanism for decentralized computing as well as its economic model. The idea is that owners of underutilized CPUs and GPUs would be 'paid' in some cryptocurrency for the use of their processing capabilities. There might even be a live auction for access to these resources, either those with 'jobs' bidding for access or those with capacity bidding on jobs.

Could such a marketplace provide some of the massive amounts of processing capacity that will be required by the Metaverse? Imagine, as you navigate immersive spaces, your account continuously bidding out the necessary computing tasks to mobile devices held but unused by people near you, perhaps people walking down the street next to you, to render or animate the experiences you encounter. Later, when you're not using your own devices, you would be earning tokens as they return the favor. Proponents of this crypto-exchange concept see it as an inevitable feature of all future microchips. Every computer, no matter how small, would be designed to be auctioning off any spare cycles at all times. Billions of dynamically arrayed processors will power the deep compute cycles of even the largest industrial customers and provide the ultimate and infinite computing mesh that enables the Metaverse.
Matthew Ball
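
The marketplace Ball imagines (devices auctioning off spare cycles for tokens) is, at its core, a matching problem between jobs and providers. The toy Python sketch below is a loose illustration under assumed names, prices, and a deliberately simple matching rule, with no blockchain or real devices involved: providers advertise idle capacity at an asking price per unit of work, a job states its budget, and the cheapest provider that can run the job wins and is credited with tokens.

    from dataclasses import dataclass

    @dataclass
    class Provider:
        name: str
        idle_units: int          # spare compute units currently on offer
        ask_per_unit: float      # tokens asked per unit of work
        earned_tokens: float = 0.0

    @dataclass
    class Job:
        name: str
        units_needed: int
        max_bid_per_unit: float  # most the job will pay per unit

    def match(job, providers):
        """Give the job to the cheapest provider with enough capacity within budget."""
        candidates = [
            p for p in providers
            if p.idle_units >= job.units_needed and p.ask_per_unit <= job.max_bid_per_unit
        ]
        if not candidates:
            return None
        winner = min(candidates, key=lambda p: p.ask_per_unit)
        winner.idle_units -= job.units_needed
        winner.earned_tokens += job.units_needed * winner.ask_per_unit
        return winner

    if __name__ == "__main__":
        pool = [
            Provider("phone-in-pocket", idle_units=2, ask_per_unit=0.05),
            Provider("idle-desktop", idle_units=10, ask_per_unit=0.03),
            Provider("parked-car", idle_units=6, ask_per_unit=0.04),
        ]
        job = Job("render-scene", units_needed=5, max_bid_per_unit=0.06)
        winner = match(job, pool)
        print("winner:", winner.name if winner else "no capacity available")
        print([(p.name, p.idle_units, p.earned_tokens) for p in pool])

A production version would need everything this sketch omits: verification that the work was actually done, settlement of the token payments, and latency-aware placement so that nearby devices are preferred.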
Like NeXT, like Polavision, like the Boeing 747, the PIC was a beautiful, turbo-powered, wildly expensive machine—with no customers. Once again, love of loonshots had triumphed over strength of strategy, just as it had with Juan Trippe and Edwin Land. Only Jobs, unlike the other two, had doubled down on the Moses Trap. After two more years and over $50 million invested, Jobs finally pulled the plug on the PIC. In April 1990, Pixar sold its hardware business to a California-based technology company, Vicom Systems.
Safi Bahcall (Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries)
Some will say that this is merely a matter of software, which is intrinsically more adaptable than hardware like televisions or cellular phones. But before the Web became mainstream in the mid-1990s, the pace of software innovation followed the exact same 10/10 pattern of development that we saw in the spread of other twentieth-century technologies.
Steven Johnson (Where Good Ideas Come From)