Famous Computer Science Quotes

We've searched our database for all the quotes and captions related to Famous Computer Science. Here they are! All 12 of them:

Moore’s Law, the rule of thumb in the technology industry, tells us that processor chips—the small circuit boards that form the backbone of every computing device—double in speed every eighteen months. That means a computer in 2025 will be sixty-four times faster than it is in 2013. Another predictive law, this one of photonics (regarding the transmission of information), tells us that the amount of data coming out of fiber-optic cables, the fastest form of connectivity, doubles roughly every nine months. Even if these laws have natural limits, the promise of exponential growth unleashes possibilities in graphics and virtual reality that will make the online experience as real as real life, or perhaps even better. Imagine having the holodeck from the world of Star Trek, which was a fully immersive virtual-reality environment for those aboard a ship, but this one is able to both project a beach landscape and re-create a famous Elvis Presley performance in front of your eyes. Indeed, the next moments in our technological evolution promise to turn a host of popular science-fiction concepts into science facts: driverless cars, thought-controlled robotic motion, artificial intelligence (AI) and fully integrated augmented reality, which promises a visual overlay of digital information onto our physical environment. Such developments will join with and enhance elements of our natural world. This is our future, and these remarkable things are already beginning to take shape. That is what makes working in the technology industry so exciting today. It’s not just because we have a chance to invent and build amazing new devices or because of the scale of technological and intellectual challenges we will try to conquer; it’s because of what these developments will mean for the world.
Eric Schmidt (The New Digital Age: Reshaping the Future of People, Nations and Business)
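
A quick arithmetic sketch (ours, not Schmidt's) makes the exponential claim concrete: a doubling every eighteen months gives eight doublings over the twelve years from 2013 to 2025, a 256x speedup, while the sixty-four-times figure in the quote corresponds to six doublings, i.e. a two-year doubling period.

```python
def speedup(years: float, doubling_months: float) -> float:
    """Speedup factor after `years` of steady exponential doubling."""
    doublings = years * 12 / doubling_months
    return 2 ** doublings

# 2013 -> 2025 spans twelve years.
print(speedup(12, 18))  # 18-month doubling: 2**8 = 256x
print(speedup(12, 24))  # 24-month doubling: 2**6 = 64x, the factor in the quote
```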
In fact, the same basic ingredients can easily be found in numerous start-up clusters in the United States and around the world: Austin, Boston, New York, Seattle, Shanghai, Bangalore, Istanbul, Stockholm, Tel Aviv, and Dubai. To discover the secret to Silicon Valley’s success, you need to look beyond the standard origin story. When people think of Silicon Valley, the first things that spring to mind—after the HBO television show, of course—are the names of famous start-ups and their equally glamorized founders: Apple, Google, Facebook; Jobs/Wozniak, Page/Brin, Zuckerberg. The success narrative of these hallowed names has become so universally familiar that people from countries around the world can tell it just as well as Sand Hill Road venture capitalists. It goes something like this: A brilliant entrepreneur discovers an incredible opportunity. After dropping out of college, he or she gathers a small team who are happy to work for equity, sets up shop in a humble garage, plays foosball, raises money from sage venture capitalists, and proceeds to change the world—after which, of course, the founders and early employees live happily ever after, using the wealth they’ve amassed to fund both a new generation of entrepreneurs and a set of eponymous buildings for Stanford University’s Computer Science Department. It’s an exciting and inspiring story. We get the appeal. There’s only one problem. It’s incomplete and deceptive in several important ways. First, while “Silicon Valley” and “start-ups” are used almost synonymously these days, only a tiny fraction of the world’s start-ups actually originate in Silicon Valley, and this fraction has been getting smaller as start-up knowledge spreads around the globe. Thanks to the Internet, entrepreneurs everywhere have access to the same information. Moreover, as other markets have matured, smart founders from around the globe are electing to build companies in start-up hubs in their home countries rather than immigrating to Silicon Valley.
Reid Hoffman (Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies)
One thing that we conclude from all this is that the 'learning robot' procedure for doing mathematics is not the procedure that actually underlies human understanding of mathematics. In any case, such a bottom-up-dominated procedure would appear to be hopelessly bad for any practical proposal for the construction of a mathematics-performing robot, even one having no pretensions whatever for simulating the actual understandings possessed by a human mathematician. As stated earlier, bottom-up learning procedures by themselves are not effective for the unassailable establishing of mathematical truths. If one is to envisage some computational system for producing unassailable mathematical results, it would be far more efficient to have the system constructed according to top-down principles (at least as regards the 'unassailable' aspects of its assertions; for exploratory purposes, bottom-up procedures might well be appropriate). The soundness and effectiveness of these top-down procedures would have to be part of the initial human input, where human understanding and insight provide the necessary additional ingredients that pure computation is unable to achieve. In fact, computers are not infrequently employed in mathematical arguments, nowadays, in this kind of way. The most famous example was the computer-assisted proof, by Kenneth Appel and Wolfgang Haken, of the four-colour theorem, as referred to above. The role of the computer, in this case, was to carry out a clearly specified computation that ran through a very large but finite number of alternative possibilities, the elimination of which had been shown (by the human mathematicians) to lead to a general proof of the needed result. There are other examples of such computer-assisted proofs, and nowadays complicated algebra, in addition to numerical computation, is frequently carried out by computer. Again it is human understanding that has supplied the rules and it is a strictly top-down action that governs the computer's activity.
Roger Penrose (Shadows of the Mind: A Search for the Missing Science of Consciousness)
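
Penrose's division of labour, in which top-down human insight reduces a problem to finitely many cases and the computer merely exhausts them, can be shown in miniature. The toy claim and case-check below are our own illustration of that Appel-Haken pattern, not an example from the book.

```python
# Claim (infinite): for every integer n, n**2 % 8 is 0, 1, or 4.
# Human (top-down) step: n**2 % 8 depends only on n % 8,
# so it suffices to check the eight residues 0..7.
# Computer (bottom-level) step: exhaust the finite case list.
assert all((r * r) % 8 in {0, 1, 4} for r in range(8))
print("All 8 cases verified; the infinite claim follows.")
```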
I imagine that if I had been a male student my name might have been mentioned in class or that the professor might have encouraged my career in computer science, or perhaps offered me an opportunity in his or a colleague’s lab. This is why I get deeply angry when famous men (like Larry Summers, whom I will come to below) espouse the idea that women as a group are innately less good at science than men but say that of course they do not discriminate against individual talented women. They miss the basic point that in the face of pervasive negative stereotyping, talented women will not be recognized. Such negative stereotyping is not supported by any data and is deeply harmful to all women.
Ben Barres (The Autobiography of a Transgender Scientist)
Where people were once dazzled to be online, now their expectations had soared, and they did not bother to hide their contempt for those who sought to curtail their freedom on the Web. Nobody was more despised than a computer science professor in his fifties named Fang Binxing. Fang had played a central role in designing the architecture of censorship, and the state media wrote admiringly of him as the “father of the Great Firewall.” But when Fang opened his own social media account, a user exhorted others, “Quick, throw bricks at Fang Binxing!” Another chimed in, “Enemies of the people will eventually face trial.” Censors removed the insults as fast as possible, but they couldn’t keep up, and the lacerating comments poured in. People called Fang a “eunuch” and a “running dog.” Someone Photoshopped his head onto a voodoo doll with a pin in its forehead. In digital terms, Fang had stepped into the hands of a frenzied mob. Less than three hours after Web users spotted him, the Father of the Great Firewall shut down his account and recoiled from the digital world that he had helped create. A few months later, in May 2011, Fang was lecturing at Wuhan University when a student threw an egg at him, followed by a shoe, hitting the professor in the chest. Teachers tried to detain the shoe thrower, a science student from a nearby college, but other students shielded him and led him to safety. He was instantly famous online. People offered him cash and vacations in Hong Kong and Singapore. A female blogger offered to sleep with him.
Evan Osnos (Age of Ambition: Chasing Fortune, Truth, and Faith in the New China)
I suppose that this viewpoint—that physical systems are to be regarded as merely computational entities—stems partly from the powerful and increasing role that computational simulations play in modern twentieth-century science, and also partly from a belief that physical objects are themselves merely 'patterns of information', in some sense, that are subject to computational mathematical laws. Most of the material of our bodies and brains, after all, is being continuously replaced, and it is just its pattern that persists. Moreover, matter itself seems to have merely a transient existence since it can be converted from one form into another. Even the mass of a material body, which provides a precise physical measure of the quantity of matter that the body contains, can in appropriate circumstances be converted into pure energy (according to Einstein's famous E=mc^2)—so even material substance seems to be able to convert itself into something with a theoretical mathematical actuality. Furthermore, quantum theory seems to tell us that material particles are merely 'waves' of information. (We shall examine these issues more thoroughly in Part II.) Thus, matter itself is nebulous and transient; and it is not at all unreasonable to suppose that the persistence of 'self' might have more to do with the preservation of patterns than of actual material particles.
Roger Penrose (Shadows of the Mind: A Search for the Missing Science of Consciousness)
The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting. It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned Coke bottles for the 5¢ deposits to buy food with, and I would walk the seven miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example: Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating. None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later. Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something—your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life. The narrator of this story is Steve Jobs, the legendary CEO of Apple. The story was part of his famous Stanford commencement speech in 2005.[23] It’s a perfect illustration of how passion and purpose drive success, not the crossing of an imaginary finish line in the future. Forget the finish line. It doesn’t exist. Instead, look for passion and purpose directly in front of you. The dots will connect later, I promise—and so does Steve.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
Only years later would scientists again need to harness the power of multiple processors at once, when massively parallel processing would become an integral part of supercomputing. Years later, too, the genealogy of Shoch’s worm would come full circle. Soon after he published a paper about the worm citing The Shockwave Rider, he received a letter from John Brunner himself. It seemed that most science fiction writers harbored an unspoken ambition to write a book that actually predicted the future. Their model was Arthur C. Clarke, the prolific author of 2001: A Space Odyssey, who had become world-famous for forecasting the invention of the geosynchronous communications satellite in an earlier short story. “Apparently they’re all jealous of Arthur Clarke,” Shoch reflected. “Brunner wrote that his editor had sent him my paper. He said he was ‘really delighted to learn, that like Arthur C. Clarke, I predicted an event of the future.’” Shoch briefly considered replying that he had only borrowed the tapeworm’s name but that the concept was his own and that, unfortunately, Brunner did not really invent the worm. But he let it pass.
Michael A. Hiltzik (Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age)
glory, at the Science Museum of London. Charles Babbage was a well-known scientist and inventor of the time. He had spent years working on his Difference Engine, a revolutionary mechanical calculator. Babbage was also known for his extravagant parties, which he called “gatherings of the mind” and hosted for the upper class, the well-known, and the very intelligent. Many of the most famous people from Victorian England would be there—from Charles Darwin to Florence Nightingale to Charles Dickens. It was at one of these parties in 1833 that Ada glimpsed Babbage’s half-built Difference Engine. The teenager’s mathematical mind buzzed with possibilities, and Babbage recognized her genius immediately. They became fast friends. The US Department of Defense uses a computer language named Ada in her honor. Babbage sent Ada home with thirty of his lab books filled with notes on his next invention: the Analytic Engine. It would be much faster and more accurate than the Difference Engine, and Ada was thrilled to learn of this more advanced calculating machine. She understood that it could solve even harder, more complex problems and could even make decisions by itself. It was a true “thinking machine.” It had memory, a processor, and hardware and software just like computers today—but it was made from cogs and levers, and powered by steam. For months, Ada worked furiously creating algorithms (math instructions) for Babbage’s not-yet-built machine. She wrote countless lines of computations that would instruct the machine in how to solve complex math problems. These algorithms were the world’s first computer program. In 1840, Babbage gave a lecture in Italy about the Analytic Engine, which was written up in French. Ada translated the lecture, adding a set of her own notes to explain how the machine worked and including her own computations for it. These notes took Ada nine months to write and were three times longer than the article itself! Ada had some awesome nicknames. She called herself “the Bride of Science” because of her desire to devote her life to science; Babbage called her “the Enchantress of Numbers” because of her seemingly magical math
Michelle R. McCann (More Girls Who Rocked the World: Heroines from Ada Lovelace to Misty Copeland)
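
Lovelace's Note G famously laid out a stepwise computation of the Bernoulli numbers for the engine. As a modern illustration of the kind of "math instructions" the passage describes, here is a short Python sketch of that computation using the standard recurrence; it is our own rendering, not a transcription of her program.

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list[Fraction]:
    """First n+1 Bernoulli numbers from the standard recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30]
```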
The University of Michigan opened its new Computer Center in 1971, in a brand-new building on Beal Avenue in Ann Arbor, with beige-brick exterior walls and a dark-glass front. The university’s enormous mainframe computers stood in the middle of a vast white room, looking, as one faculty member remembers, “like one of the last scenes in the movie 2001: A Space Odyssey.” Off to the side were dozens of keypunch machines—what passed in those days for computer terminals. In 1971, this was state of the art. The University of Michigan had one of the most advanced computer science programs in the world, and over the course of the Computer Center’s life, thousands of students passed through that white room, the most famous of whom was a gawky teenager named Bill Joy.
Malcolm Gladwell (Outliers: The Story of Success)
John von Neumann, one of the founding fathers of computer science, famously said that “with four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
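
Von Neumann's quip is about the danger of free parameters, and a small sketch (ours; it uses plain polynomial interpolation rather than an actual elephant outline) makes it concrete: a model with k parameters can pass exactly through k arbitrary data points, so a close fit alone proves nothing.

```python
import numpy as np

# Five arbitrary data points -- any shape you like.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, -2.0, 0.5, 3.0, -1.0])

# A degree-4 polynomial has five parameters: enough to pass
# through all five points exactly, whatever they are.
coeffs = np.polyfit(x, y, deg=4)
print(np.allclose(np.polyval(coeffs, x), y))  # True: a perfect "fit"
```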
These are inspired by the famous Schrödinger's cat thought experiment. This setup uses superpositions of coherent states (states in which the wave phases are the same). When errors occur, they shift the phase. These shifts become detectable due to the unique properties of the superpositions, enabling timely corrections.
Pantheon Space Academy (Quantum Computing Explained for Beginners: The Science, Technology, and Impact)
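
The passage is compressed; the standard construction it appears to describe is the bosonic "cat code," sketched below in our own notation rather than the book's.

```latex
% Sketch of the standard cat-code construction (our reconstruction,
% not taken from the book). The logical states are superpositions of
% the coherent states |alpha> and |-alpha>:
\[
  \lvert \mathcal{C}^{\pm}_{\alpha} \rangle
    = \frac{\lvert \alpha \rangle \pm \lvert -\alpha \rangle}{\mathcal{N}_{\pm}} .
\]
% A photon-loss error acts as the annihilation operator \hat{a},
% and coherent states are its eigenstates:
\[
  \hat{a}\,\lvert \pm\alpha \rangle = \pm\alpha\,\lvert \pm\alpha \rangle
  \quad\Longrightarrow\quad
  \hat{a}\,\lvert \mathcal{C}^{+}_{\alpha} \rangle \propto \lvert \mathcal{C}^{-}_{\alpha} \rangle .
\]
% The error flips the photon-number parity (even <-> odd), so it can be
% detected by a parity measurement and corrected in time.
```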