IBM Think Quotes

We've searched our database for all the quotes and captions related to Ibm Think. Here they are! All 38 of them:

One reason alone is enough for today, and that reason lies in the national misconception of what constitutes education. All of your lives you have been trained to believe that your mental equipment consisted of learning how to memorize a multitude of facts. This is what I call parroting a man. To my mind, this inadequate concept of education is the crime of the age.
The two directions of thinking are the outward direction toward your material equipment which gives you your resources, and the inward direction toward your mental equipment, which gives you your resourcefulness.
And Thomas Watson, chairman of IBM, said in 1943, “I think there is a world market for maybe five computers.”
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
All great masters in any line think positive thoughts, which they put into action with intensive desire tested and tempered by balanced judgement.
If you look at economics textbooks, you will learn that homo economicus can think like Albert Einstein, store as much memory as IBM’s Big Blue, and exercise the willpower of Mahatma Gandhi. Really. But the folks that we know are not like that. Real people have trouble with long division if they don’t have a calculator, sometimes forget their spouse’s birthday, and have a hangover on New Year’s Day. They are not homo economicus; they are homo sapiens.
Richard H. Thaler (Nudge: The Final Edition)
You were never to say you weren't "fine, thank you — and yourself?" You were supposed to be Heidi. You were supposed to lug goat milk up the hills and not think twice. Heidi did not complain. Heidi did not do things like stand in front of the new IBM photocopier saying, "If this fucking Xerox machine breaks on me one more time, I'm going to slit my wrists."
Lorrie Moore (Like Life)
“Think different” isn’t just a slogan. It’s a credo, one that made Apple the most profitable company in human history. People accused Steve Jobs of creating a “reality distortion field,” but he understood that reality is already distorted. Apple would never win by trying to build a better mainframe computer. That would have been playing by IBM’s rules. Instead, Apple created a personal computer because that was what it wanted the future to look like.
Chase Jarvis (Creative Calling: Establish a Daily Practice, Infuse Your World with Meaning, and Succeed in Work + Life)
In this office right now I am thinking about how long it would take a corpse to disintegrate right in this office. In this office these are the things I fantasize about while dreaming: Eating ribs at Red, Hot and Blue in Washington, D.C. If I should switch shampoos. What really is the best dry beer? Is Bill Robinson an overrated designer? What’s wrong with IBM? Ultimate luxury. Is the term “playing hardball” an adverb? The fragile peace of Assisi. Electric light. The epitome of luxury. Of ultimate luxury. The bastard’s wearing the same damn Armani linen suit I’ve got on.
Bret Easton Ellis (American Psycho (Vintage Contemporaries))
I think we're all just doing our best to survive the inevitable pain and suffering that walks alongside us through life. Long ago, it was wild animals and deadly poxes and harsh terrain. I learned about it playing The Oregon Trail on an old IBM in my computer class in the fourth grade. The nature of the trail has changed, but we keep trekking along. We trek through the death of a sibling, a child, a parent, a partner, a spouse; the failed marriage, the crippling debt, the necessary abortion, the paralyzing infertility, the permanent disability, the job you can't seem to land; the assault, the robbery, the break-in, the accident, the flood, the fire; the sickness, the anxiety, the depression, the loneliness, the betrayal, the disappointment, and the heartbreak. There are these moments in life where you change instantly. In one moment, you're the way you were, and in the next, you're someone else. Like becoming a parent: you're adding, of course, instead of subtracting, as it is when someone dies, and the tone of the occasion is obviously different, but the principle is the same. Birth is an inciting incident, a point of no return, that changes one's circumstances forever. The second that beautiful baby onto whom you have projected all your hopes and dreams comes out of your body, you will never again do anything for yourself. It changes you suddenly and entirely. Birth and death are the same in that way.
Stephanie Wittels Wachs (Everything is Horrible and Wonderful: A Tragicomic Memoir of Genius, Heroin, Love and Loss)
What’s the best thing you’ve done in your work and career? In business decision-making, certainly one of your highlights was licensing your computer operating system to IBM for almost no money, provided you could retain the right to license the system to other computer manufacturers as well. IBM was happy to agree because, after all, nobody would possibly want to compete with the most powerful company in the world, right? With that one decision, your system and your company became dominant throughout the world, and you, Bill Gates, were on your way to a net worth of more than $60 billion. Or maybe you’d like to look at your greatest career achievement from a different angle. Instead of focusing on the decision that helped you make so much money, maybe you’d like to look at the decision to give so much of it away. After all, no other person in history has become a philanthropist on the scale of Bill Gates. Nations in Africa and Asia are receiving billions of dollars in medical and educational support. This may not be as well publicized as your big house on Lake Washington with its digitalized works of art, but it’s certainly something to be proud of. Determining your greatest career achievement is a personal decision. It can be something obvious or something subtle. But it should make you proud of yourself when you think of it. So take a moment, then make your choice.
Dale Carnegie (Make Yourself Unforgettable: How to Become the Person Everyone Remembers and No One Can Resist (Dale Carnegie))
Quanta. On Yom Kippur Eve, the quanta went to ask Einstein for his forgiveness. “I'm not home,” Einstein yelled at them from behind his locked door. On their way back, people swore loudly at them through the windows, and someone even threw a can. The quanta pretended not to care, but deep in their hearts they were really hurt. Nobody understands the quanta, everybody hates them. “You parasites,” people would shout at them as they walked down the road. “Go serve in the army.” “We wanted to, actually,” the quanta would try to explain, “but the army wouldn't take us because we're so tiny.” Not that anyone listened. Nobody listens to the quanta when they try to defend themselves, but when they say something that can be interpreted negatively, well, then everyone's all ears. The quanta can make the most innocent statement, like “Look, there's a cat!” and right away they're saying on the news how the quanta were stirring up trouble and they rush off to interview Schrödinger. All in all, the media hated the quanta worse than anybody, because once the quanta had spoken at an IBM press conference about how the very act of viewing had an effect on an event, and all the journalists thought the quanta were lobbying to keep them from covering the Intifada. The quanta could insist as much as they wanted that this wasn't at all what they meant and that they had no political agenda whatsoever, but nobody would believe them anyway. Everyone knew they were friends of the government's Chief Scientist. Loads of people think the quanta are indifferent, that they have no feelings, but it simply isn't true. On Friday, after the program about the bombing of Hiroshima, they were interviewed in the studio in Jerusalem. They could barely talk. 
They just sat there facing the open mike and sniffling, and all the viewers at home, who didn't know the quanta very well, thought they were avoiding the question and didn't realize the quanta were crying. What's sad is that even if the quanta were to write dozens of letters to the editors of all the scientific journals in the world and prove beyond a doubt that people had taken advantage of their naiveté, and that they'd never ever imagined it would end that way, it wouldn't do them any good, because nobody understands the quanta. The physicists least of all.
Etgar Keret (The Bus Driver Who Wanted to be God and Other Stories)
Since we built such sophisticated business machines, people tended to think of IBM as a model of order and logic—a totally streamlined organization in which we developed plans rationally and carried them out with utter precision. I never thought for a minute that was really the case.
Thomas J. Watson Jr. (Father, Son & Co.: My Life at IBM and Beyond)
The five-digit numbers tattooed into the forearms of Nazi concentration-camp prisoners initially corresponded to IBM Hollerith punch-card numbers;
Viktor Mayer-Schönberger (Big Data: A Revolution That Will Transform How We Live, Work, and Think)
IBM was granted a U.S. patent in 2012 on “Securing premises using surface-based computing technology.” That’s intellectual-property-lawyer-speak for a touch-sensitive floor covering, somewhat like a giant smartphone screen. The potential uses are plentiful. It would be able to identify the objects on it. In basic form, it could know to turn on lights in a room or open doors when a person enters. More important, however, it might identify individuals by their weight or the way they stand and walk. It could tell if someone fell and did not get back up, an important feature for the elderly. Retailers could learn the flow of traffic through their stores. When the floor is datafied, there is no ceiling to its possible uses.
Viktor Mayer-Schönberger (Big Data: A Revolution That Will Transform How We Live, Work, and Think)
Computers already have enough power to outperform people in activities we used to think of as distinctively human. In 1997, IBM’s Deep Blue defeated world chess champion Garry Kasparov. Jeopardy!’s best-ever contestant, Ken Jennings, succumbed to IBM’s Watson in 2011. And Google’s self-driving cars are already on California roads today. Dale Earnhardt Jr. needn’t feel threatened by them, but the Guardian worries (on behalf of the millions of chauffeurs and cabbies in the world) that self-driving cars “could drive the next wave of unemployment.”
Peter Thiel (Zero to One: Notes on Startups, or How to Build the Future)
They were so happy to be relieved of those strictures that they very quickly lapsed. Not everyone, of course, but the majority. We built the company a little too fast, and consequently the last 50 percent of the people hired really didn't have much commitment to the corporate culture. There were some warning signs. Consider McKinsey, which holds itself out as one of the world's leading repositories of knowledge on how to manage a business. They say they'll never grow their company by more than 25 percent per year, because otherwise it's just too hard to transmit the corporate culture. So if you're growing faster than 25 percent a year, you have to ask yourself, "What do I know about management that McKinsey doesn't know?" I still think it's more efficient—this is just an old Lisp programmer's standard way of thinking—if you have two really good people and a very powerful tool. That's better than having 20 mediocre people and inefficient tools. ArsDigita demonstrated that pretty well. We were able to get projects done in about 1/5th the time and probably at about 1/10th or 1/20th the cost of people using other tools. Of course, we would do it at 1/20th of the cost and we would charge 1/10th of the cost. So the customer would have a big consumer surplus. They would pay 1/10th of what they would have paid with IBM Global Services or Broadvision or something, but we would have a massive profit margin because we'd be spending less than half of what they paid us to do the job.
Jessica Livingston (Founders at Work: Stories of Startups' Early Days)
Outsourcing requires a tight integration of suppliers, making sure that all pieces arrive just in time. Therefore, when some suppliers were unable to deliver certain basic components like capacitors and flash memory, Compaq's network was paralyzed. The company was looking at 600,000 to 700,000 unfilled orders in handheld devices. The $499 Pocket PCs were selling for $700 to $800 at auctions on eBay. Cisco experienced a different but equally damaging problem: When orders dried up, Cisco neglected to turn off its supply chain, resulting in a 300 percent ballooning of its raw materials inventory. The final numbers are frightening: The aggregate market value loss between March 2000 and March 2001 of the twelve major companies that adopted outsourcing (Cisco, Dell, Compaq, Gateway, Apple, IBM, Lucent, Hewlett-Packard, Motorola, Ericsson, Nokia, and Nortel) exceeded $1.2 trillion. The painful experience of these companies and their investors is a vivid demonstration of the consequences of ignoring network effects. A me attitude, where the company's immediate financial balance is the only factor, limits network thinking. Not understanding how the actions of one node affect other nodes easily cripples whole segments of the network. Experts agree that such rippling losses are not an inevitable downside of the network economy. Rather, these companies failed because they outsourced their manufacturing without fully understanding the changes required in their business models. Hierarchical thinking does not fit a network economy. In traditional organizations, rapid shifts can be made within the organization, with any resulting losses being offset by gains in other parts of the hierarchy. In a network economy each node must be profitable. Failing to understand this, the big players of the network game exposed themselves to the risks of connectedness without benefiting from its advantages.
When problems arose, they failed to make the right, tough decisions, such as shutting down the supply line in Cisco's case, and got into even bigger trouble. At both the macro- and the microeconomic level, the network economy is here to stay. Despite some high-profile losses, outsourcing will be increasingly common. Financial interdependencies, ignoring national and continental boundaries, will only be strengthened with globalization. A revolution in management is in the making. It will take a new, network-oriented view of the economy and an understanding of the consequences of interconnectedness to smooth the way.
Albert-László Barabási (Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life)
“Would you like me to give you a formula for success? It’s quite simple, really. Double your rate of failure. You are thinking of failure as the enemy of success. But it isn’t at all. You can be discouraged by failure, or you can learn from it. So go ahead and make mistakes. Make all you can. Because remember, that’s where you will find success.”   Thomas Watson, former chairman and CEO of IBM
Calvert Cazier (The Resiliency Toolkit: A Busy Parent's Guide to Raising Happy, Confident, Successful Children)
As I reflect upon some of the exceptional leaders I’ve studied in my research, I’m struck by how Covey’s principles are manifested in many of their stories. Let me focus on one of my favorite cases, Bill Gates. It’s become fashionable in recent years to attribute the outsize success of someone like Bill Gates to luck, to being in the right place at the right time. But if you think about it, this argument falls apart. When Popular Electronics put the Altair computer on its cover, announcing the advent of the first-ever personal computer, Bill Gates teamed up with Paul Allen to launch a software company and write the BASIC programming language for the Altair. Yes, Gates was at just the right moment with programming skills, but so were other people—students in computer science and electrical engineering at schools like Cal Tech, MIT, and Stanford; seasoned engineers at technology companies like IBM, Xerox, and HP; and scientists in government research laboratories. Thousands of people could’ve done what Bill Gates did at that moment, but they didn’t. Gates acted upon the moment. He dropped out of Harvard, moved to Albuquerque (where the Altair was based), and wrote computer code day and night. It was not the luck of being at the right moment in history that separated Bill Gates, but his proactive response to being at the right moment (Habit 1: Be Proactive).
Stephen R. Covey (The 7 Habits of Highly Effective People: Powerful Lessons in Personal Change)
It’s difficult to imagine that Artificial Intelligence will take the place of people, but many believe that it’s only a short time before computers will outthink us. They already can beat our best chess players and have been able to out-calculate us since calculators first came onto the scene. IBM’s Watson is on the cutting edge of cognitive computing, being used to outthink our physicians, but closer to home, for the greatest part, our cars are no longer assembled by people but rather by robots. Our automobiles can be considered among our first robots, since they took the place of horses. Just after the turn of the last century, when the population in the United States crossed the 100 M mark, the number of horses came to 20 M. Now we have a population of 325 M but only 9 M horses. You might ask what happened. Well, back in 1915 there were 2.4 M cars, but this jumped to 3.6 M in just one year. Although horses still outnumbered cars, the handwriting was on the wall! You might think that this doesn’t apply to us, but why not? The number of robots increases, taking the place of first our workers on the assembly line and then workers in the food industry, and this takes us from tractors and combines on the farms to the cooking and serving of hamburgers at your favorite burger joint. People are becoming redundant! That’s right, we are becoming superfluous! Worldwide only 7 out of 100 people have college degrees, and here in the United States only 40% of our working population possesses a sheepskin, although mine is printed on ordinary paper. With education becoming ever more expensive, we as a population are becoming ever more uneducated. A growing problem is that as computers and robots become smarter, as they are, we are no longer needed to be anything more than consumers, and where will the money come from for that? I recently read that this death spiral will run its course within 40 years! Nice statistics that we’re looking at….
Looking at the bright side of things, you can now buy an anatomically correct, life-sized doll, as perhaps a robotic, non-complaining companion, for under $120. In time these robotic beings will be able to talk back, but hopefully there will be an off switch. As interesting as this sounds, it will most likely not be for everyone; however, it may appeal to some of our less capable not to have to actually interface with real live people. The fact is that most people will soon outlive their usefulness! We as a society are being challenged, and there will soon be little reason for our being. When machines make machines that can outthink us, when we become dumb and superfluous, then what? Are we ready for this transition? It’s scary, but if nothing else, it’s something to think about….
Hank Bracker
Electrical signals require electrons, which generate heat, which limits the amount of work a chip can perform and requires a lot of power for cooling. Light has neither limitation. If IBM’s estimations are correct, over the next eight years, its new chip design will accelerate supercomputer performance a thousandfold, taking us from our current 2.6 petaflops to an exaflop (that’s 10 to the 18th, or a quintillion operations per second)—or one hundred times faster than the human brain.
Peter H. Diamandis (Abundance: The Future is Better Than You Think)
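The performance figures in the passage above can be sanity-checked with a few lines of arithmetic. The 2.6-petaflop baseline and thousandfold speedup are taken from the quote; everything else here is just unit conversion, sketched minimally:

```python
# Sanity-check the quoted supercomputer figures: does a thousandfold
# speedup from 2.6 petaflops actually reach exaflop scale?
PETAFLOP = 10**15  # 10^15 operations per second
EXAFLOP = 10**18   # 10^18 operations per second

current = 2.6 * PETAFLOP     # baseline quoted in the passage
projected = current * 1_000  # "a thousandfold" acceleration

print(projected / EXAFLOP)   # roughly 2.6, i.e. a couple of exaflops
```

So the quote's numbers are internally consistent: 1,000 × 2.6 petaflops lands a little above one exaflop, the 10^18 figure Diamandis cites.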
The fashionable term now is “Big Data.” IBM estimates that we are generating 2.5 quintillion bytes of data each day, more than 90 percent of which was created in the last two years. This exponential growth in information is sometimes seen as a cure-all, as computers were in the 1970s. Chris Anderson, the editor of Wired magazine, wrote in 2008 that the sheer volume of data would obviate the need for theory, and even the scientific method. This is an emphatically pro-science and pro-technology book, and I think of it as a very optimistic one. But it argues that these views are badly mistaken. The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. Like Caesar, we may construe them in self-serving ways that are detached from their objective reality. Data-driven predictions can succeed—and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.
Nate Silver (The Signal and the Noise: Why So Many Predictions Fail-but Some Don't)
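As a rough illustration of how fast data volume must grow for the IBM estimate above to hold (“more than 90 percent created in the last two years”), one can back out the implied annual growth factor. The steady-exponential-growth model below is an illustrative assumption of this sketch, not a claim from IBM or Silver:

```python
# If data production grows by a constant factor g per year, what g makes
# the last two years account for 90% of everything ever produced?
# (The steady-growth model is an illustrative assumption.)
def last_two_years_share(g, years=60):
    yearly = [g**n for n in range(years)]  # volume produced in year n
    return sum(yearly[-2:]) / sum(yearly)

# The share rises monotonically with g, so bisect for the root.
lo, hi = 1.001, 100.0
for _ in range(80):
    mid = (lo + hi) / 2
    if last_two_years_share(mid) < 0.90:
        lo = mid
    else:
        hi = mid

print(round(lo, 2))  # about 3.16 -- data volume roughly tripling each year
```

The closed-form check agrees: for long horizons the last-two-years share is about 1 - 1/g², so a 90% share implies g = √10 ≈ 3.16, i.e. data volume more than tripling every year.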
My dad called this imprinting. In the 1980s he’d have NASA leaders, IBM executives, educators, and athletes write down specific memories of strong performances on note cards and put a plan in place to review and relive them. He understood that words trigger pictures, which impact emotions, which lead to performances. He’d teach me this process as flick back/flick up—flick back to a past moment, and then flick up and apply that past behavior to a future moment.
Trevor Moawad (It Takes What It Takes: How to Think Neutrally and Gain Control of Your Life)
The Apple Ad to IBM when the IBM computer launched. Welcome to the most exciting and important marketplace since the computer revolution began 35 years ago. Read the opening sentence of Apple's Ad. Putting real computer power in the hands of the individual is already improving the way people work, think, learn, communicate, and spend their leisure hours. Over the course of the next decade the growth of the personal computer will continue in algorithmic leaps. We look forward to responsible competition in the massive effort to distribute this American technology to the world. And we appreciate the magnitude of your commitment, because what we are doing is increasing social capital by enhancing individual productivity. Apple signed the letter to their new rival with the words, Welcome to the task. Apple was trying to advance a just cause and IBM was going to help them. IBM accepted the challenge.
Simon Sinek (The Infinite Game)
Kurzweil cites numerous quotations from prominent people in history who completely underestimated the progress and impact of technology. Here are a few examples. IBM’s chairman, Thomas J. Watson, in 1943: ‘I think there is a world market for maybe five computers.’ Digital Equipment Corporation’s co-founder Ken Olsen in 1977: ‘There’s no reason for individuals to have a computer in their home.’ Bill Gates in 1981: ‘640,000 bytes of memory ought to be enough for anybody.’
Melanie Mitchell (Artificial Intelligence: A Guide for Thinking Humans)
In the late 1940s there were still only a few devices. Early in that decade IBM’s president, Thomas J. Watson, had allegedly (and notoriously) said, “I think there is a world market for about five computers.” Popular Mechanics magazine made a forecast typical of its time in 1949: “Computers in the future may have only 1000 vacuum tubes,” it argued, “and perhaps weigh only 1½ tons.” A decade after Bletchley, there were still only hundreds of computers around the world.
Mustafa Suleyman (The Coming Wave: Technology, Power, and the Twenty-first Century's Greatest Dilemma)
A good chief executive is essentially a hard-to-automate decision engine, not unlike IBM’s Jeopardy!-playing Watson system. They have built up a hard-won repository of experience and have honed and proved an instinct for their market. They’re then presented inputs throughout the day—in the form of e-mails, meetings, site visits, and the like—that they must process and act on. To ask a CEO to spend four hours thinking deeply about a single problem is a waste of what makes him or her valuable. It’s better to hire three smart subordinates to think deeply about the problem and then bring their solutions to the executive for a final decision.
Cal Newport (Deep Work: Rules for Focused Success in a Distracted World)
I tell him that such stories make me nervous for the future in the face of advances in AI. Garry shakes his head. “I’m more optimistic about the future of humanity,” he says. “You’re not worried about AI taking over?” “Why should I be? I was the first knowledge worker whose job was threatened by machines,” he says. I laugh. It’s true. In 1997, Garry was famously beaten in a chess match by IBM’s supercomputer Deep Blue. “I think it’s wrong to cry about progress,” he says. “The future is not humans fighting machines. The future is humans collaborating with machines. Every technology in history destroyed jobs, but it also created new ones.”
A.J. Jacobs (The Puzzler: One Man's Quest to Solve the Most Baffling Puzzles Ever, from Crosswords to Jigsaws to the Meaning of Life)
In 1996 or so, I bought my first home computer. It was some sort of IBM product. If I was some weird computer nerd, I would be able to tell you all about the ROM and RAM this machine had. All I know is that it was black when every other model was off-white. When I was perusing models with the sales guy who was blathering on and on about what it could do, all I could think was how much better the black would look in my home office than the ugly off-white. I’m that kind of nerd.
Jen Mann (People I Want to Punch in the Throat: Competitive Crafters, Drop-Off Despots, and Other Suburban Scourges)
I think there is a world wide market for maybe five computers. (Thomas Watson, Chairman IBM, 1943)
Briony J. Oates (Researching Information Systems and Computing)
I think there is a world wide market for maybe five computers. (Thomas Watson, Chairman IBM, 1943) This telephone has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us. (Western Union internal memo, 1876) But what is it good for? (Engineer at the Advanced Computing Systems Division of IBM, 1968, commenting on the microchip) There is no reason why anyone would want a computer in their home. (Ken Olson, president, chairman and founder of Digital Equipment Corporation, 1977) Computers in the future may weigh no more than 1.5 tons. (Popular Mechanics, 1949)
Briony J. Oates (Researching Information Systems and Computing)
I made up my mind I was going to learn something about IBM computers. So I enrolled in an IBM school for retailers in Poughkeepsie, New York. One of the speakers was a guy from the National Mass Retailers’ Institute (NMRI), the discounters’ trade association, a guy named Abe Marks. ABE MARKS, HEAD OF HARTFIELD ZODY’S, AND FIRST PRESIDENT, NMRI: “I was sitting there at the conference reading the paper, and I had a feeling somebody was standing over me, so I look up and there’s this grayish gentleman standing there in a black suit carrying an attaché case. And I said to myself, ‘Who is this guy? He looks like an undertaker.’ “He asks me if I’m Abe Marks and I say, ‘Yes, I am.’ “ ‘Let me introduce myself, my name is Sam Walton,’ he says. ‘I’m only a little fellow from Bentonville, Arkansas, and I’m in the retail business.’ “I say, ‘You’ll have to pardon me, Sam, I thought I knew everybody and every company in the retail business, but I never heard of Sam Walton. What did you say the name of your company is again?’ “ ‘Wal-Mart Stores,’ he says. “So I say, ‘Well, welcome to the fraternity of discount merchants. I’m sure you’ll enjoy the conference and getting acquainted socially with everyone.’ “ ‘Well, to be perfectly honest with you, Mr. Marks, I didn’t come here to socialize, I came here to meet you. I know you’re a CPA and you’re able to keep confidences, and I really wanted your opinion on what I am doing now.’ So he opens up this attaché case, and, I swear, he had every article I had ever written and every speech I had ever given in there. I’m thinking, ‘This is a very thorough man.’ Then he hands me an accountant’s working column sheet, showing all his operating categories all written out by hand. “Then he says: ‘Tell me what’s wrong. What am I doing wrong?’ “I look at these numbers—this was in 1966—and I don’t believe what I’m seeing. He’s got a handful of stores and he’s doing about $10 million a year with some incredible margin. An unbelievable performance!
“So I look at it, and I say, ‘What are you doing wrong? Sam—if I may call you Sam—I’ll tell you what you are doing wrong.’ I handed back his papers and I closed his attaché case, and I said to him, ‘Being here is wrong, Sam. Don’t unpack your bags. Go down, catch a cab, go back to the airport and go back to where you came from and keep doing exactly what you are doing. There is nothing that can possibly improve what you are doing. You are a genius.’ That’s how I met Sam Walton.” Abe
Sam Walton (Sam Walton: Made In America)
I often suggest they think about making a limited investment, putting aside maybe only 1 percent of their budget for special projects to test out a new idea. In this way, risk stops being scary and becomes R&D. Talk to private sector CEOs and they will be quick to point out that R&D is the lifeblood of innovative companies. Yes, some things will fail as you discover what works and what doesn’t. But as Einstein reportedly said, “You never fail until you stop trying.” This is true whether you’re launching a program, developing a product, or starting a movement. I’ve often heard people from the social sector protest, “But we don’t have funding for R&D!” My response is to remind them of the words of one of our greatest modern-day innovators, Steve Jobs: “Innovation has nothing to do with how many R&D dollars you have. When Apple came up with the Mac, IBM was spending at least one hundred times more on R&D. It’s not about money. It’s about the people you have, how you’re led, and how much you get it.” You don’t need a big budget in order to experiment. Realistically, budgets are often stretched and funding for programs “locked.” I see this especially with foundations or government programs, which can have rigid protocols. When nonprofits or governments experiment and fail, those failures are often labeled as waste or fraud or abuse, which discourages more risk taking.
Jean Case (Be Fearless: 5 Principles for a Life of Breakthroughs and Purpose)
The five most highly correlated factors are:
Organizational culture. Strong feelings of burnout are found in organizations with a pathological, power-oriented culture. Managers are ultimately responsible for fostering a supportive and respectful work environment, and they can do so by creating a blame-free environment, striving to learn from failures, and communicating a shared sense of purpose. Managers should also watch for other contributing factors and remember that human error is never the root cause of failure in systems.
Deployment pain. Complex, painful deployments that must be performed outside of business hours contribute to high stress and feelings of lack of control. With the right practices in place, deployments don’t have to be painful events. Managers and leaders should ask their teams how painful their deployments are and fix the things that hurt the most.
Effectiveness of leaders. Responsibilities of a team leader include limiting work in process and eliminating roadblocks for the team so they can get their work done. It’s not surprising that respondents with effective team leaders reported lower levels of burnout.
Organizational investments in DevOps. Organizations that invest in developing the skills and capabilities of their teams get better outcomes. Investing in training and providing people with the necessary support and resources (including time) to acquire new skills are critical to the successful adoption of DevOps.
Organizational performance. Our data shows that Lean management and continuous delivery practices help improve software delivery performance, which in turn improves organizational performance. At the heart of Lean management is giving employees the necessary time and resources to improve their own work. This means creating a work environment that supports experimentation, failure, and learning, and allows employees to make decisions that affect their jobs.
This also means creating space for employees to do new, creative, value-add work during the work week—and not just expecting them to devote extra time after hours. A good example of this is Google’s 20% time policy, where the company allows employees 20% of their week to work on new projects, or IBM’s “THINK Friday” program, where Friday afternoons are designated for time without meetings and employees are encouraged to work on new and exciting projects they normally don’t have time for.
Nicole Forsgren (Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations)
Tablebases [databases of endgame positions solved backwards from the end-state of checkmate] are the clearest case of human chess vs. alien chess. A decade of trying to teach computers how to play endgames was rendered obsolete in an instant thanks to a new tool. This is a pattern we see over and over again in everything related to intelligent machines. It's wonderful if we can teach machines to think like we do, but why settle for thinking like a human if you can be a god? (jm3: Frustratingly for the humans, it was not disclosed whether IBM's Deep Blue stored and consulted endgame tablebases during competition.)
Garry Kasparov (Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins)
In Tsai's go-go years, high-flying stocks with positive momentum were all the rage. Polaroid, Xerox, and IBM all traded at price-to-earnings ratios of more than 50. These expensive stocks were supported by explosively high growth rates. From 1964 to 1968, IBM, Polaroid, and Xerox grew their earnings per share at 88%, 22%, and 171%, respectively. Others like University Computing, Mohawk Data, and Fairchild Camera traded at several hundred times their trailing 12-month earnings. The latter three and many others like them would go on to lose more than 80% in the 1969–1970 bear market. The Manhattan Fund was up almost 40% in 1967, more than double the Dow. But in 1968, he was down 7% and was ranked 299th out of 305 funds tracked by Arthur Lipper. When the market crash came, the people responsible were entirely unprepared. By 1969, half of the salesmen on Wall Street had come into the business only since 1962 and had seen nothing but a rising market. And when stocks turned, the highfliers that went up the fastest also came down the fastest. For example, National Student Marketing, of which Tsai bought 122,000 shares for $5 million, crashed from $143 in December 1969 to $3.50 in July 1970. Between September and November 1929, $30 billion worth of stock value vanished; in the 1969–1970 crash, the loss was $300 billion! The gunslingers of the 1960s were thinking only about return and paid little attention to risk. This carefree attitude was a result of the market they were playing in. From 1950 through the end of 1965, the Dow was within 5% of its highs 66% of the time, and within 10% of its highs 87% of the time. There was virtually no turbulence at all. From 1950 to 1965, the only bear market was "The Kennedy Slide," which chopped 27% off the S&P 500, and recovered in just over a year.
Michael Batnick (Big Mistakes: The Best Investors and Their Worst Investments (Bloomberg))
One day during a pep rally to the troops, Watson scrawled the word THINK on a piece of paper. Patterson saw the note and ordered THINK signs distributed throughout the company.
Edwin Black (IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation)