Claude Shannon Quotes

We've searched our database for all the quotes and captions related to Claude Shannon. Here they are! All 40 of them:

When you are famous it is hard to work on small problems. This is what did Shannon in. After information theory, what do you do for an encore? The great scientists often make this error. They fail to continue to plant the little acorns from which the mighty oak trees grow. They try to get the big thing right off. And that isn't the way things go. So that is another reason why you find that when you get early recognition it seems to sterilize you.
Richard Hamming
Claude Shannon, the father of information theory, once declared, “I visualize a time when we will be to robots what dogs are to humans, and I’m rooting for the machines.”
Michio Kaku (The Future of Humanity: Terraforming Mars, Interstellar Travel, Immortality, and Our Destiny Beyond)
The enemy knows the system
Claude Shannon
The second simplest algorithm is: combine two bits. Claude Shannon, better known as the father of information theory, was the first to realize that what transistors are doing, as they switch on and off in response to other transistors, is reasoning. (That was his master’s thesis at MIT—the most important master’s thesis of all time.)
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
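A minimal sketch, assuming only standard Python, of what “combining two bits” means in practice: Shannon’s thesis showed that switch circuits compute Boolean algebra, so series wiring is AND, parallel wiring is OR, and logical inference can be composed from them. The function names below are illustrative.

    def AND(a, b):
        # Two switches in series: current flows only if both are closed.
        return a & b

    def OR(a, b):
        # Two switches in parallel: current flows if either is closed.
        return a | b

    def NOT(a):
        # A normally closed relay contact: output is the inverse of the input.
        return 1 - a

    def IMPLIES(a, b):
        # Reasoning built from the primitives above: "a implies b".
        return OR(NOT(a), b)

    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} implies {b}: {IMPLIES(a, b)}")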
For Wiener, entropy was a measure of disorder; for Shannon, of uncertainty. Fundamentally, as they were realizing, these were the same.
James Gleick (The Information: A History, a Theory, a Flood)
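Gleick’s point can be stated exactly: Wiener’s disorder and Shannon’s uncertainty reduce to the same formula, the entropy of a probability distribution (standard notation, not quoted from the book):

    H(X) = -\sum_i p_i \log_2 p_i

For a source emitting symbol i with probability p_i, H is measured in bits and is largest when the distribution is most disordered, which is exactly when the next symbol is most uncertain.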
Two revolutions coincided in the 1950s. Mathematicians, including Claude Shannon and Alan Turing, showed that all information could be encoded by binary digits, known as bits. This led to a digital revolution powered by circuits with on-off switches that processed information. Simultaneously, Watson and Crick discovered how instructions for building every cell in every form of life were encoded by the four-letter sequences of DNA. Thus was born an information age based on digital coding (0100110111001…) and genetic coding (ACTGGTAGATTACA…). The flow of history is accelerated when two rivers converge.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
In these days, when there is a tendency to specialize so closely, it is well for us to be reminded that the possibilities of being at once broad and deep did not pass with Leonardo da Vinci or even Benjamin Franklin.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
The first [method] I might speak about is simplification. Suppose that you are given a problem to solve, I don't care what kind of problem (a machine to design, or a physical theory to develop, or a mathematical theorem to prove, or something of that kind); probably a very powerful approach to this is to attempt to eliminate everything from the problem except the essentials; that is, cut it down to size. Almost every problem that you come across is befuddled with all kinds of extraneous data of one sort or another; and if you can bring this problem down into the main issues, you can see more clearly what you are trying to do and perhaps find a solution. Now in so doing you may have stripped away the problem you're after. You may have simplified it to the point that it doesn't even resemble the problem that you started with; but very often if you can solve this simple problem, you can add refinements to the solution of this until you get back to the solution of the one you started with.
Claude Shannon
How much did a degree and a neat resume really prepare someone for a job? The Alan Turings and Claude Shannons of the world had been eccentric, inventive, forceful people. Rule breakers. The people at Bletchley Park and Room 40 didn’t stop to check boxes; they got the job done no matter what the cost.
David Walton (The Genius Plague)
Robert said: ‘Oh my God!’ and Joe calmly replied, ‘Please don’t exaggerate, just call me Professor.’
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
I don’t think I was ever motivated by the notion of winning prizes, although I have a couple of dozen of them in the other room. I was more motivated by curiosity. Never by the desire for financial gain. I just wondered how things were put together. Or what laws or rules govern a situation, or if there are theorems about what one can’t or can do. Mainly because I wanted to know myself.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
There were two kinds of researchers at Bell Labs: those who are being paid for what they used to do, and those who are being paid for what they were going to do. Nobody was paid for what they were doing now.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Having outgrown its Manhattan headquarters, most of Bell Labs moved to two hundred rolling acres in Murray Hill, New Jersey. Mervin Kelly and his colleagues wanted their new home to feel like an academic campus, but without the segregation of various disciplines into different buildings. They knew that creativity came through chance encounters. “All buildings have been connected so as to avoid fixed geographical delineation between departments and to encourage free interchange and close contact among them,” an executive wrote. The corridors were extremely long, more than the length of two football fields, and designed to promote random meetings among people with different talents and specialties, a strategy that Steve Jobs replicated in designing Apple’s new headquarters seventy years later. Anyone walking around Bell Labs might be bombarded with random ideas, soaking them up like a solar cell. Claude Shannon, the eccentric information theorist, would sometimes ride a unicycle up and down the long red terrazzo corridors while juggling three balls and nodding at colleagues. It was a wacky metaphor for the balls-in-the-air ferment in the halls.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In one sense, the world seen through such eyes looks starkly unequal. “A very small percentage of the population produces the greatest proportion of the important ideas,” Shannon began, gesturing toward a rough graph of the distribution of intelligence. “There are some people if you shoot one idea into the brain, you will get a half an idea out. There are other people who are beyond this point at which they produce two ideas for each idea sent in. Those are the people beyond the knee of the curve.” He was not, he quickly added, claiming membership for himself in the mental aristocracy—he was talking about history’s limited supply of Newtons and Einsteins.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Geniuses are the luckiest of mortals because what they must do is the same as what they most want to do and, even if their genius is unrecognized in their lifetime, the essential earthly reward is always theirs, the certainty that their work is good and will stand the test of time. One suspects that the geniuses will be least in the Kingdom of Heaven—if, indeed, they ever make it; they have had their reward.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Modern man lives isolated in his artificial environment, not because the artificial is evil as such, but because of his lack of comprehension of the forces which make it work—of the principles which relate his gadgets to the forces of nature, to the universal order. It is not central heating which makes his existence “unnatural,” but his refusal to take an interest in the principles behind it. By being entirely dependent on science, yet closing his mind to it, he leads the life of an urban barbarian.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Claude E. Shannon, who first enunciated the theory, “was able to define channel capacity for continuous signals such as music and speech.”
Ingo Swann (Reality Boxes: And Other Black Holes in Human Consciousness)
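The result Swann alludes to is the Shannon–Hartley theorem for a band-limited continuous channel: C = B log2(1 + S/N). A small sketch in standard Python; the telephone-line numbers are illustrative, not from the source:

    import math

    def channel_capacity(bandwidth_hz, signal_power, noise_power):
        # Shannon-Hartley: capacity in bits per second of a noisy continuous
        # channel of bandwidth B with signal-to-noise ratio S/N.
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # Illustrative: a 3 kHz line at 30 dB SNR (S/N = 1000)
    print(channel_capacity(3000, 1000, 1))  # ~29,900 bits per second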
How is logic like a machine? Here is how one logician explained it around the turn of the twentieth century: "As a material machine is an instrument for economising the exertion of force, so a symbolic calculus is an instrument for economising the exertion of intelligence." Logic, just like a machine, was a tool for democratizing force: built with enough precision and skill, it could multiply the power of the gifted and the average alike.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Encoding and decoding messages was a mathematical problem that was too interesting to be abandoned as the war dwindled. Mathematicians continued to formalize the idea of information, but they framed their efforts in the context of communication technologies, transcending the efforts to decipher intercepted messages. The mathematicians who triumphed became known as the world’s first information theorists or cyberneticists. These pioneers included Claude Shannon, Warren Weaver, Alan Turing, and Norbert Wiener.
Cesar A. Hidalgo (Why Information Grows: The Evolution of Order, from Atoms to Economies)
At Bush’s MIT, math and engineering were an extension of the metal shop and the woodshop, and students who were skilled with the planimeter and the slide rule had to be skilled as well with the soldering iron and the saw.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
When I was a digital system designer in the 1960s, I didn’t worry very much about minimization. If a simplification was obvious, then of course I used it. But I didn’t try to do anything “clever.” In college, minimization was a big deal, and textbook writers generally made minimization a central topic. In real life, however, the fact is that right up until the project is actually accepted, shipped, and paid for by the client, designers are ever alert for changes, that is, memos (called engineering change orders) from the client saying something like “we didn’t really mean what we originally asked for in paragraph 19 on page 73, but instead we now want …” To make such changes on a machine that is already 80% wired on the production floor meant that you’d better have some spare logic gates in your design (typically, 10% of the nominal design). Minimization was actually counterproductive, something I didn’t learn until I actually designed for a paycheck and not for homework points.
Paul J. Nahin (The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age)
Another completely different way that the contacts of a relay could fail was if dirt or an insect got trapped in the spacing between contacts. If a fly or a moth, for example, happened to be sitting on the make contact when the coil was energized, then it could be squashed and, after its smashed little body dried, the contacts would be covered with a very disgusting but quite effective insulator. To clean up such a disabled relay was called debugging, a term that has survived in the vocabulary of modern computer users trying to fix their faulty programs. This is not a joke—I heard it as a quite serious story in a lecture at the Naval Postgraduate School in 1982 from a legend in computer science, Rear Admiral Grace Hopper (1906–1992), a Yale PhD mathematician who worked during the Second World War with Harvard’s five ton, 800 cubic foot Mark I relay computer, which when operating was described as sounding like a “roomful of ladies knitting.” To debug such a machine must have been an “interesting” job for someone;
Paul J. Nahin (The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age)
Analog data are superior for this job because they can be changed back and forth with relative ease whenever the environment within or outside the cell demands it, and they can store an almost unlimited number of possible values, even in response to conditions that have never been encountered before.25 The unlimited number of possible values is why many audiophiles still prefer the rich sounds of analog storage systems. But even though analog devices have their advantages, they have a major disadvantage. In fact, it’s the reason we’ve moved from analog to digital. Unlike digital, analog information degrades over time—falling victim to the conspiring forces of magnetic fields, gravity, cosmic rays, and oxygen. Worse still, information is lost as it’s copied. No one was more acutely disturbed by the problem of information loss than Claude Shannon, an electrical engineer from the Massachusetts Institute of Technology (MIT) in Boston.
David A. Sinclair (Lifespan: Why We Age – and Why We Don’t Have To)
Claude Shannon. “He couldn’t have been in any other department successfully,” Brock McMillan recalls. “But then, there weren’t many other departments where people just sat and thought.”
Jon Gertner (The Idea Factory: Bell Labs and the Great Age of American Innovation)
I’ve often noticed that many of the notes people take are of ideas they already know, already agree with, or could have guessed. We have a natural bias as humans to seek evidence that confirms what we already believe, a well-studied phenomenon known as “confirmation bias.” That isn’t what a Second Brain is for. The renowned information theorist Claude Shannon, whose discoveries paved the way for modern technology, had a simple definition for “information”: that which surprises you. If you’re not surprised, then you already knew it at some level, so why take note of it? Surprise is an excellent barometer for information that doesn’t fit neatly into our existing understanding, which means it has the potential to change how we think.
Tiago Forte (Building a Second Brain: A Proven Method to Organize Your Digital Life and Unlock Your Creative Potential)
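Shannon’s “that which surprises you” has a precise measure: the surprisal of an event is -log2 of its probability. A minimal sketch in standard Python; the example probabilities are illustrative:

    import math

    def surprisal_bits(p):
        # Information content, in bits, of observing an event of probability p.
        # Rare events are surprising and carry many bits; certainties carry none.
        return math.log2(1 / p)

    print(surprisal_bits(0.5))   # a fair coin flip: 1.0 bit
    print(surprisal_bits(1.0))   # something you already knew: 0.0 bits
    print(surprisal_bits(0.01))  # a rare surprise: ~6.6 bits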
Claude Shannon, the father of information theory, had defined information as “the resolution of uncertainty.”
Meghan O'Gieblyn (God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning)
“A Symbolic Analysis of Relay and Switching Circuits” is arguably the most influential master’s thesis of the twentieth century: in it Claude Shannon laid the theoretical foundation for all of modern computer design, nearly a decade before such computers even existed.
M. Mitchell Waldrop (The Dream Machine)
The cause-effect information is defined as the smaller (minimum) of the cause-information and the effect-information. If either one is zero, the cause-effect information is likewise zero. That is, the mechanism's past must be able to determine its present, which, in turn, must be able to determine its future. The more the past and the future are specified by the present state, the higher the mechanism's cause-effect power. Note that this usage of 'information' is very different from its customary meaning in engineering and science introduced by Claude Shannon. Shannon information, which is always assessed from the external perspective of an observer, quantifies how accurately signals transmitted over some noisy communication channel, such as a radio link or an optical cable, can be decoded. Data that distinguishes between two possibilities, OFF and ON, carries 1 bit of information. What that information is, though - the result of a critical blood test or the least significant bit in a pixel in the corner of a holiday photo - completely depends on the context. The meaning of Shannon information is in the eye of the beholder, not in the signals themselves. Shannon information is observational and extrinsic. Information in the sense of integrated information theory reflects a much older Aristotelian usage, derived from the Latin in-formare, 'to give form or shape to.' Integrated information gives rise to the cause-effect structure, a form. Integrated information is causal, intrinsic, and qualitative: it is assessed from the inner perspective of a system, based on how its mechanisms and its present state shape its own past and future. How the system constrains its past and future states determines whether the experience feels like azure blue or the smell of wet dog.
Christof Koch (The Feeling of Life Itself: Why Consciousness Is Widespread but Can't Be Computed (Mit Press))
I’m a machine and you’re a machine, and we both think, don’t we? —Claude Shannon
Jimmy Soni (A Mind at Play: The Brilliant Life of Claude Shannon, Inventor of the Information Age)
Bets on horses you think will lose are a valuable "insurance policy." When rare disaster strikes, you'll be glad you had the insurance. (71)
The exponential growth of wealth in the Kelly system is also a consequence of proportional betting. As the bankroll grows, make larger bets. (98)
[Two questions are central to John Kelly's analysis:] What level of risk will lead to the highest long-run return? What is the chance of losing everything? (286)
As Fred Schwed, Jr., author of Where Are the Customers' Yachts?, put it back in 1940, "Like all of life's rich emotional experiences, the full flavor of losing important money cannot be conveyed in literature." (304)
Claude Shannon: A smart investor should understand where he has an edge and invest only in those opportunities. (308)
The longer you hold a stock, the harder it is to beat the market by much. (316)
William Poundstone (Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street)
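The Kelly rule in Poundstone’s notes has a closed form: for a bet paying b-to-1 that wins with probability p, the growth-optimal stake is the fraction f* = (bp - q)/b of the bankroll, with q = 1 - p. A minimal sketch in standard Python; the example odds are illustrative:

    def kelly_fraction(p, b):
        # Fraction of bankroll to wager on a bet paying b-to-1 that wins
        # with probability p. A negative result means the edge is negative:
        # don't bet. Staking this fixed fraction as the bankroll grows is the
        # proportional betting that produces exponential growth of wealth.
        q = 1 - p
        return (b * p - q) / b

    # A 55% chance of winning at even money (b = 1):
    print(kelly_fraction(0.55, 1))  # 0.10, i.e., stake 10% of the bankroll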
The few steps that connect strangers explain how rumors spread rapidly and widely. If you have a good investment idea, you might want to keep it secret. In 1998 a New York Times Science Times article said that mathematicians had discovered how networks might “make a big world small” using the equivalent of the famous person idea, and attributed the concept of six degrees of separation to a sociologist in 1967. Yet all this was known to Claude Shannon in 1960.
Edward O. Thorp (A Man for All Markets: From Las Vegas to Wall Street, How I Beat the Dealer and the Market)
This equivalence wasn’t discovered until the 1930s, most notably by Claude Elwood Shannon (born 1916), whose famous 1938 M.I.T. master’s thesis was entitled “A Symbolic Analysis of Relay and Switching Circuits.”
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
The messages that resolve the greatest amount of uncertainty—that are picked from the widest range of symbols with the fairest odds—are the richest in information.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
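“The widest range of symbols with the fairest odds” is precisely the condition that maximizes Shannon entropy. A quick check in standard Python; the distributions are illustrative:

    import math

    def entropy_bits(probs):
        # Shannon entropy of a source, in bits per symbol.
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    print(entropy_bits([0.25] * 4))            # four symbols, fair odds: 2.0 bits
    print(entropy_bits([0.7, 0.1, 0.1, 0.1]))  # four symbols, skewed odds: ~1.36 bits
    print(entropy_bits([1.0]))                 # one certain symbol: 0.0 bits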
Curiosity in extremis runs the risk of becoming dilettantism, a tendency to sample everything and finish nothing.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Shannon was an engineer—a man more attuned to practicality than most—and yet he was drawn to the idea that knowledge was valuable for its own sake and that discovery was pleasurable in its own right.
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
As the story goes, the manuscript that formed the outlines of Wiener’s contributions to information theory was nearly lost to humanity. Wiener had entrusted the manuscript to Walter Pitts, a graduate student, who had checked it as baggage for a trip from New York’s Grand Central Terminal to Boston. Pitts forgot to retrieve the baggage. Realizing his mistake, he asked two friends to pick up the bag. They either ignored or forgot the request. Only five months later was the manuscript finally tracked down; it had been labeled “unclaimed property” and cast aside in a coatroom. Wiener was, understandably, blind with rage. “Under these circumstances please consider me as completely dissociated from your future career,” he wrote to Pitts. He complained to one administrator of the “total irresponsibleness of the boys” and to another faculty member that the missing parcel meant that he had “lost priority on some important work.” “One of my competitors, Shannon of the Bell Telephone Company, is coming out with a paper before mine,” he fumed. Wiener wasn’t being needlessly paranoid: Shannon had, by that point, previewed his still-unpublished work at 1947 conferences at Harvard and Columbia. In April 1947, Wiener and Shannon shared the same stage, and both had the opportunity to present early versions of their thoughts. Wiener, in a moment of excessive self-regard, would write to a colleague, “The Bell people are fully accepting my thesis concerning statistics and communications engineering.”
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
Thomson’s tidal solution was something like the inverse of Bush’s lawnmower. The surveying machine would read the land’s data of hills and dips and even manhole covers and output a graph; the tide machine invented by Thomson and his brother, which they christened the harmonic analyzer, took a graph as input. The operator stood before a long, open wooden box resting on eight legs, a steel pointer and a hand crank protruding from its innards. With his right hand, he took hold of the pointer and traced a graph of water levels, months’ data on high tides and low; with his left, he steadily turned the crank that turned the oiled gears in the casket. Inside, eleven little cranks rotated at their own speeds, each isolating one of the simple functions that added up to the chaotic tide. At the end, their gauges displayed eleven little numbers—the average water level, the pull of the moon, the pull of the sun, and so on—that together filled in the equation to state the tides. All of it, in principle, could be ground out by human hands on a notepad—but, said Thomson, this was “calculation of so methodical a kind that a machine ought to be found to do it.”
Jimmy Soni (A Mind at Play: How Claude Shannon Invented the Information Age)
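The mathematics Thomson’s machine ground out is harmonic synthesis: the tide as a mean water level plus a sum of sinusoids, one per crank. A toy reconstruction in standard Python; the amplitudes, frequencies, and phases are invented for illustration, not real tidal constituents:

    import math

    def tide_height(t, mean_level, components):
        # Each component (amplitude, angular_frequency, phase) plays the role
        # of one crank inside the harmonic analyzer's casket.
        return mean_level + sum(
            a * math.cos(w * t + phi) for (a, w, phi) in components
        )

    # Invented constituents standing in for the moon, the sun, and so on:
    constituents = [(1.2, 0.51, 0.3), (0.6, 0.52, 1.1), (0.3, 1.02, 2.0)]
    for hour in range(0, 24, 6):
        print(hour, round(tide_height(hour, 5.0, constituents), 2))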
The father of information theory, Claude Shannon (1916–2001),
Luciano Floridi (Information: A Very Short Introduction (Very Short Introductions))
No matter how well or loudly a signal is transmitted, it may not be effectively received. We suspect Dr. Claude Shannon, who created information theory, would agree with author Simon Sinek, who wrote, “Communication is not about speaking what we think. It’s about ensuring others hear what we mean.”
Gene Kim (Wiring the Winning Organization: Liberating Our Collective Greatness through Slowification, Simplification, and Amplification)
Information is the resolution of uncertainty.
Claude Shannon