Computer Generated Quotes

We've searched our database for all the quotes and captions related to Computer Generated. Here they are! All 100 of them:

This generation will witness social and economic changes in our societies that will be irreversible, thanks to AI.
A.R. Merrydew
The large majority of teenagers who attend Higgs are soulless, conformist idiots. I have successfully integrated myself into a small group of girls who I consider to be “good people,” but sometimes I still feel that I might be the only person with a consciousness, like a video game protagonist, and everyone else is a computer-generated extra who has only a select few actions, such as “initiate meaningless conversation” and “hug”.
Alice Oseman (Solitaire)
I remember when I first came around, the computer-generated stuff was pretty wicked. I was like, 'Wow!' but I feel like then for the longest time, we saw so much of it, after a while, you might as well just be watching an animated movie.
Paul Walker
And while a hundred civilizations have prospered (sometimes for centuries) without computers or windmills or even the wheel, none have survived even a few generations without art.
David Bayles (Art & Fear: Observations on the Perils (and Rewards) of Artmaking)
All these beefy Caucasians with guns. Get enough of them together, looking for the America they always believed they'd grow up in, and they glom together like overcooked rice, form integral, starchy little units. With their power tools, portable generators, weapons, four-wheel-drive vehicles, and personal computers, they are like beavers hyped up on crystal meth, manic engineers without a blueprint, chewing through the wilderness, building things and abandoning them, altering the flow of mighty rivers and then moving on because the place ain't what it used to be. The byproduct of the lifestyle is polluted rivers, greenhouse effect, spouse abuse, televangelists, and serial killers. But as long as you have that four-wheel-drive vehicle and can keep driving north, you can sustain it, keep moving just quickly enough to stay one step ahead of your own waste stream. In twenty years, ten million white people will converge on the north pole and park their bagos there. The low-grade waste heat of their thermodynamically intense lifestyle will turn the crystalline icescape pliable and treacherous. It will melt a hole through the polar icecap, and all that metal will sink to the bottom, sucking the biomass down with it.
Neal Stephenson (Snow Crash)
Anti-sabbatical: A job taken with the sole intention of staying only for a limited period of time (often one year). The intention is usually to raise enough funds to partake in another, more personally meaningful activity such as watercolor sketching in Crete or designing computer knit sweaters in Hong Kong. Employers are rarely informed of intentions.
Douglas Coupland (Generation X: Tales for an Accelerated Culture)
We are often told that the next generation of literati won't have private libraries: everything will be in the computer. It's a rational solution, but that's probably what's wrong with it. Being book crazy is an aspect of love, and therefore scarcely rational at all.
Clive James (Latest Readings)
This computer-generated pangram contains six a's, one b, three c's, three d's, thirty-seven e's, six f's, three g's, nine h's, twelve i's, one j, one k, two l's, three m's, twenty-two n's, thirteen o's, three p's, one q, fourteen r's, twenty-nine s's, twenty-four t's, five u's, six v's, seven w's, four x's, five y's, and one z.
Douglas R. Hofstadter (Metamagical Themas: Questing for the Essence of Mind and Pattern)
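The claim the pangram makes about itself can be checked mechanically: count every letter and compare against the totals the sentence announces. A minimal Python sketch (the claimed counts below are transcribed by hand from the quote, so a mismatch would indicate a transcription slip rather than a flaw in the original):

```python
from collections import Counter

PANGRAM = ("This computer-generated pangram contains six a's, one b, three c's, "
           "three d's, thirty-seven e's, six f's, three g's, nine h's, twelve i's, "
           "one j, one k, two l's, three m's, twenty-two n's, thirteen o's, "
           "three p's, one q, fourteen r's, twenty-nine s's, twenty-four t's, "
           "five u's, six v's, seven w's, four x's, five y's, and one z.")

# The counts the sentence claims for itself.
CLAIMED = dict(a=6, b=1, c=3, d=3, e=37, f=6, g=3, h=9, i=12, j=1, k=1, l=2,
               m=3, n=22, o=13, p=3, q=1, r=14, s=29, t=24, u=5, v=6, w=7,
               x=4, y=5, z=1)

actual = Counter(ch for ch in PANGRAM.lower() if ch.isalpha())
for letter, claimed in sorted(CLAIMED.items()):
    mark = "ok" if actual[letter] == claimed else "MISMATCH"
    print(f"{letter}: claimed {claimed:2d}, actual {actual[letter]:2d}  {mark}")
```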
Octopuses are tough--and not just in the sense that they can take out sharks (both real and computer generated, as in Mega Shark versus Giant Octopus). They're almost pure muscle. With tridirectional muscles in the arms, they're a tad less supple than a well-marbled sirloin, to say the least (though certainly a lot more healthful). So over the centuries, people have been finding ways to make them a little easier on the jaw. The classic tactic is beating the bejesus out of them on rocks.
Katherine Harmon Courage (Octopus!: The Most Mysterious Creature in the Sea)
When outsiders claim that we are unchristian, it is a reflection of this jumbled (and predominately negative) set of perceptions. When they see Christians not acting like Jesus, they quickly conclude that the group deserves an unchristian label. Like a corrupted computer file or a bad photocopy, Christianity, they say, is no longer in pure form, and so they reject it. One quarter of outsiders say their foremost perception of Christianity is that the faith has changed for the worse. It has gotten off-track and is not what Christ intended. Modern-day Christianity no longer seems Christian.
David Kinnaman (unChristian: What a New Generation Really Thinks about Christianity... and Why It Matters)
The Matrix has its roots in primitive arcade games,' said the voice-over, 'in early graphics programs and military experimentation with cranial jacks.' On the Sony, a two-dimensional space war faded behind a forest of mathematically generated ferns, demonstrating the spatial possibilities of logarithmic spirals; cold blue military footage burned through, lab animals wired into test systems, helmets feeding into fire control circuits of tanks and war planes. 'Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts... A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding...
William Gibson (Neuromancer (Sprawl, #1))
How Smart Is a Rock? To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock. Although it may appear that nothing much is going on inside a rock, the approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are actually extremely active. Despite the apparent solidity of the object, the atoms are all in motion, sharing electrons back and forth, changing particle spins, and generating rapidly moving electromagnetic fields. All of this activity represents computation, even if not very meaningfully organized. We’ve already shown that atoms can store information at a density of greater than one bit per atom, such as in computing systems built from nuclear magnetic-resonance devices. University of Oklahoma researchers stored 1,024 bits in the magnetic interactions of the protons of a single molecule containing nineteen hydrogen atoms. Thus, the state of the rock at any one moment represents at least 10^27 bits of memory.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
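Kurzweil's figure is a back-of-the-envelope multiplication, easy to reproduce. A quick check using the numbers quoted above (the per-atom density comes from the Oklahoma result of 1,024 bits across nineteen hydrogen atoms):

```python
atoms_in_rock = 1e25          # ~10^25 atoms in a kilogram of matter
bits_per_atom = 1024 / 19     # Oklahoma NMR result: 1,024 bits in 19 protons
total_bits = atoms_in_rock * bits_per_atom
print(f"{bits_per_atom:.0f} bits per atom -> {total_bits:.1e} bits in the rock")
# ~5.4e26 bits: within a factor of two of the "at least 10^27" in the text,
# which rounds up to the nearest order of magnitude.
```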
Could you make a computer imagine an entire world? How would you start? A generation of people would wrestle with this problem—they're still wrestling with it.
Austin Grossman (You)
Everyone is always telling my generation that we aren't going to know how to engage with people. We're all going to end up with computer chips implanted in our brains and screens stuck in our eyes like contact lenses. But no one gives us any solutions, so I decided to find my own.
Nina LaCour (Meet Cute: Some People Are Destined to Meet)
I hear a lot of people say that the fear of death and the fear of public speaking are two of the main fears in my generation, but I disagree. I think it’s the fear of silence. We refuse to turn off our computers, turn off our phones, log off Facebook, and just sit in silence, because in those moments we might actually have to face up to who we really are. We fear silence like it’s an invisible monster, gnawing at us, ripping us open, and showing us our dissatisfaction. Silence is terrifying.
Jefferson Bethke (Jesus > Religion: Why He Is So Much Better Than Trying Harder, Doing More, and Being Good Enough)
We're at a crucial point in history. We cannot have fast cars, computers the size of credit cards, and modern conveniences, whilst simultaneously having clean air, abundant rainforests, fresh drinking water and a stable climate. This generation can have one or the other but not both. Humanity must make a choice. Both have an opportunity cost. Gadgetry or nature? Pick the wrong one and the next generations may have neither.
Mark Boyle (The Moneyless Man: A Year of Freeconomic Living)
Consider an AI that has hedonism as its final goal, and which would therefore like to tile the universe with “hedonium” (matter organized in a configuration that is optimal for the generation of pleasurable experience). To this end, the AI might produce computronium (matter organized in a configuration that is optimal for computation) and use it to implement digital minds in states of euphoria. In order to maximize efficiency, the AI omits from the implementation any mental faculties that are not essential for the experience of pleasure, and exploits any computational shortcuts that according to its definition of pleasure do not vitiate the generation of pleasure. For instance, the AI might confine its simulation to reward circuitry, eliding faculties such as memory, sensory perception, executive function, and language; it might simulate minds at a relatively coarse-grained level of functionality, omitting lower-level neuronal processes; it might replace commonly repeated computations with calls to a lookup table; or it might put in place some arrangement whereby multiple minds would share most parts of their underlying computational machinery (their “supervenience bases” in philosophical parlance). Such tricks could greatly increase the quantity of pleasure producible with a given amount of resources.
Nick Bostrom (Superintelligence: Paths, Dangers, Strategies)
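Bostrom's "calls to a lookup table" is ordinary memoization: compute a commonly repeated result once, then serve every later request from a cache. A minimal sketch in Python (the sleep is just a stand-in for an expensive repeated computation):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_step(x):
    time.sleep(0.05)          # stand-in for a costly, commonly repeated computation
    return x * x

start = time.perf_counter()
for _ in range(1000):
    expensive_step(7)         # computed once; the other 999 calls hit the table
print(f"1000 calls in {time.perf_counter() - start:.2f}s (one real computation)")
```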
We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
There is no glory in using technologies like artificial intelligence, swarm drones and quantum computing for developing mass destruction weapons. Our glory lies in using technologies and AI for embracing all, generating love and happiness, and removing the pain of the humanity.
Amit Ray (Compassionate Artificial Intelligence: Frameworks and Algorithms)
Well, whatever you want to say, I recommend you come right out and say it. Just open your mouth and tell the world what's on your mind. Of course, with your generation, I always feel like I have to add this: Please don't do it through text or e-mail or anything like that. When you need to communicate something important, speak your truth face-to-face.... When you say what you have to say through a computer or phone, there are often miscommunications. But when it's just you and someone else, and you're right in front of them, speaking your truth, they are much more likely to understand.
Ali Benjamin (The Thing About Jellyfish)
A thought expressed is a falsehood." In poetry what is not said and yet gleams through the beauty of the symbol, works more powerfully on the heart than that which is expressed in words. Symbolism makes the very style, the very artistic substance of poetry inspired, transparent, illuminated throughout like the delicate walls of an alabaster amphora in which a flame is ignited. Characters can also serve as symbols. Sancho Panza and Faust, Don Quixote and Hamlet, Don Juan and Falstaff, according to the words of Goethe, are "schwankende Gestalten." Apparitions which haunt mankind, sometimes repeatedly from age to age, accompany mankind from generation to generation. It is impossible to communicate in any words whatsoever the idea of such symbolic characters, for words only define and restrict thought, but symbols express the unrestricted aspect of truth. Moreover we cannot be satisfied with a vulgar, photographic exactness of experimental photography. We demand and have premonition of, according to the allusions of Flaubert, Maupassant, Turgenev, Ibsen, new and as yet undisclosed worlds of impressionability. This thirst for the unexperienced, in pursuit of elusive nuances, of the dark and unconscious in our sensibility, is the characteristic feature of the coming ideal poetry. Earlier Baudelaire and Edgar Allan Poe said that the beautiful must somewhat amaze, must seem unexpected and extraordinary. French critics more or less successfully named this feature - impressionism. Such are the three major elements of the new art: a mystical content, symbols, and the expansion of artistic impressionability. No positivistic conclusions, no utilitarian computation, but only a creative faith in something infinite and immortal can ignite the soul of man, create heroes, martyrs and prophets... People have need of faith, they need inspiration, they crave a holy madness in their heroes and martyrs. ("On The Reasons For The Decline And On The New Tendencies In Contemporary Literature")
Dmitry Merezhkovsky (Silver Age of Russian Culture (An Anthology))
Computers thwart, contort, and befuddle us. We mess around with fonts, change screen backgrounds, slow down or increase mouse speed. We tweak and we piddle. We spend countless hours preparing PowerPoint slides that most people forget in seconds. We generate reports in duplicate and triplicate and then some that end up serving only one function for most of the recipients - to collect dust.
Jeff Davidson (The Complete Idiot's Guide to Getting Things Done)
Attend any conference on telecommunications or computer technology, and you will be attending a celebration of innovative machinery that generates, stores, and distributes more information, more conveniently, at greater speed than ever before. To the question “What problem does the information solve?” the answer is usually “How to generate, store and distribute more information, more conveniently, at greater speeds than ever before.” This is the elevation of information to a metaphysical status: information as both the means and end of human creativity. In Technopoly, we are driven to fill our lives with the quest to “access” information. For what purpose or with what limitations, it is not for us to ask; and we are not accustomed to asking, since the problem is unprecedented. The world has never before been confronted with information glut and has hardly had time to reflect on its consequences (61).
Neil Postman (Technopoly: The Surrender of Culture to Technology)
A great programmer, on a roll, could create a million dollars worth of wealth in a couple weeks. A mediocre programmer over the same period will generate zero or even negative wealth (e.g. by introducing bugs). This is why so many of the best programmers are libertarians.
Paul Graham (Hackers & Painters: Big Ideas from the Computer Age)
Some analogies are so useful that they don’t merely shed light on a concept, they actually become platforms for novel thinking. For example, the metaphor of the brain as a computer has been central to the insights generated by cognitive psychologists during the past fifty years.
Chip Heath (Made to Stick: Why Some Ideas Survive and Others Die)
Speculations about what the world would be like after human control of it ended had been – long ago, briefly – a queasy form of popular entertainment. There had even been online TV shows about it: computer-generated landscape pictures with deer grazing in Times Square, serves-us-
Margaret Atwood (MaddAddam (MaddAddam, #3))
It was yesterday’s lead story, tracking the proliferation of new “mind-uploading” companies, hoping to discover a means of scanning the human brain into a computer for perpetual preservation. Anything to satiate the spike in interest among short-stringers looking to extend their lives, in this generation or the next.
Nikki Erlick (The Measure)
Twenge finds that there are just two activities that are significantly correlated with depression and other suicide-related outcomes (such as considering suicide, making a plan, or making an actual attempt): electronic device use (such as a smartphone, tablet, or computer) and watching TV. On the other hand, there are five activities that have inverse relationships with depression (meaning that kids who spend more hours per week on these activities show lower rates of depression): sports and other forms of exercise, attending religious services, reading books and other print media, in-person social interactions, and doing homework.
Jonathan Haidt (The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting up a Generation for Failure)
A restaurant can afford to serve the occasional burnt dinner. But in technology, you cook one thing and that's what everyone eats. So any difference between what people want and what you deliver is multiplied. You please or annoy customers wholesale. The closer you can get to what they want, the more wealth you generate.
Paul Graham (Hackers & Painters: Big Ideas from the Computer Age)
Over the next three decades, scholars and fans, aided by computational algorithms, will knit together the books of the world into a single networked literature. A reader will be able to generate a social graph of an idea, or a timeline of a concept, or a networked map of influence for any notion in the library. We’ll come to understand that no work, no idea stands alone, but that all good, true, and beautiful things are ecosystems of intertwined parts and related entities, past and present.
Kevin Kelly (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future)
… it had almost nothing to do with computers, the modernity I was trying to understand. Computers were the bones, but imagination, ambition and possibility were the blood. These kids, they simply did not accept that the world as it is has any special gravity, any hold upon us. If something was wrong, if it was bad, then that something was to be fixed, not endured. Where my generation reached for philosophy and the virtue of suffering, they reached instead for science and technology and they actually did something about the beggar in the street, the woman in the wheelchair. They got on with it. It wasn’t that they had no sense of spirit or depth. Rather they reserved it for the truly wondrous, and for everything else they made tools.
Nick Harkaway (Gnomon)
A new generation of Sirius Cybernetics Corporation robots and computers, with the new GPP feature.’” “GPP feature?” said Arthur. “What’s that?” “Oh, it says Genuine People Personalities.
Douglas Adams (The Hitchhiker's Guide to the Galaxy (Hitchhiker's Guide, #1))
Tin Toy went on to win the 1988 Academy Award for animated short films, the first computer-generated film to do so. To celebrate, Jobs took Lasseter and his team to Greens, a vegetarian restaurant in San Francisco.
Walter Isaacson (Steve Jobs)
If the case isn't plea bargained, dismissed or placed on the inactive docket for an indefinite period of time, if by some perverse twist of fate it becomes a trial by jury, you will then have the opportunity of sitting on the witness stand and reciting under oath the facts of the case - a brief moment in the sun that clouds over with the appearance of the aforementioned defense attorney who, at worst, will accuse you of perjuring yourself in a gross injustice or, at best, accuse you of conducting an investigation so incredibly slipshod that the real killer has been allowed to roam free. Once both sides have argued the facts of the case, a jury of twelve men and women picked from computer lists of registered voters in one of America's most undereducated cities will go to a room and begin shouting. If these happy people manage to overcome the natural impulse to avoid any act of collective judgement, they just may find one human being guilty of murdering another. Then you can go to Cher's Pub at Lexington and Guilford, where that selfsame assistant state's attorney, if possessed of any human qualities at all, will buy you a bottle of domestic beer. And you drink it. Because in a police department of about three thousand sworn souls, you are one of thirty-six investigators entrusted with the pursuit of that most extraordinary of crimes: the theft of a human life. You speak for the dead. You avenge those lost to the world. Your paycheck may come from fiscal services but, goddammit, after six beers you can pretty much convince yourself that you work for the Lord himself. If you are not as good as you should be, you'll be gone within a year or two, transferred to fugitive, or auto theft or check and fraud at the other end of the hall. If you are good enough, you will never do anything else as a cop that matters this much. Homicide is the major leagues, the center ring, the show. It always has been. When Cain threw a cap into Abel, you don't think The Big Guy told a couple of fresh uniforms to go down and work up the prosecution report. Hell no, he sent for a fucking detective. And it will always be that way, because the homicide unit of any urban police force has for generations been the natural habitat of that rarefied species, the thinking cop.
David Simon
She saw the picture of idle fishing boats tied up at Peterhead; further gloom for Scotland and for a way of life that had produced such a strong culture. Fishermen had composed their songs; but what culture would a generation of computer operators leave behind them?
Alexander McCall Smith (The Sunday Philosophy Club (Isabel Dalhousie, #1))
There the crew would reside, either strapped into reclining metal chairs or with magnetic boots clanking around on a metal gridwork floor, nicely warmed by all the heat-generating vacuum-tube electronics necessary for the primitive computers, radios, and other necessary equipment.
Rod Pyle (Amazing Stories of the Space Age)
Throw in the valley’s rich history of computer science breakthroughs, and you’ve set the stage for the geeky-hippie hybrid ideology that has long defined Silicon Valley. Central to that ideology is a wide-eyed techno-optimism, a belief that every person and company can truly change the world through innovative thinking. Copying ideas or product features is frowned upon as a betrayal of the zeitgeist and an act that is beneath the moral code of a true entrepreneur. It’s all about “pure” innovation, creating a totally original product that generates what Steve Jobs called a “dent in the universe.” Startups that grow up in this kind of environment tend to be mission-driven. They start with a novel idea or idealistic goal, and they build a company around that. Company mission statements are clean and lofty, detached from earthly concerns or financial motivations.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
I etch a pattern of geometric shapes onto a stone. To the uninitiated, the shapes look mysterious and complex, but I know that when arranged correctly they will give the stone a special power, enabling it to respond to incantations in a language no human being has ever spoken. ...Yet my work involves no witchcraft. The stone is a wafer of silicon, and the incantations are software. The patterns etched on the chip and the programs that instruct the computer may look complicated and mysterious, but they are generated according to a few basic principles that are easily explained.
William Daniel Hillis
The computer is really good for editing your ideas, and it’s really good for getting your ideas ready for publishing out into the world, but it’s not really good for generating ideas. There are too many opportunities to hit the delete key. The computer brings out the uptight perfectionist in us—we start editing ideas before we have them. The cartoonist Tom Gauld says he stays away from the computer until he’s done most of the thinking for his strips, because once the computer is involved, “things are on an inevitable path to being finished. Whereas in my sketchbook the possibilities are endless.”
Austin Kleon (Steal Like an Artist: 10 Things Nobody Told You About Being Creative)
Spurred on by both the science and science fiction of our time, my generation of researchers and engineers grew up to ask what if? and what’s next? We went on to pursue new disciplines like computer vision, artificial intelligence, real-time speech translation, machine learning, and quantum computing.
Elizabeth Bear (Future Visions: Original Science Fiction Inspired by Microsoft)
That’s the wonder and terror of computer-generated images for me: If they look real, my brain isn’t nearly sophisticated enough to understand they are not. We’ve long known that images are unreliable—Kafka wrote that “nothing is as deceptive as a photograph”—and yet I still can’t help but believe them.
John Green (The Anthropocene Reviewed: Essays on a Human-Centered Planet)
These computer simulations try only to duplicate the interactions between the cortex and the thalamus. Huge chunks of the brain are therefore missing. Dr. [Dharmendra] Modha understands the enormity of his project. His ambitious research has allowed him to estimate what it would take to create a working model of the entire human brain, and not just a portion or a pale version of it, complete with all parts of the neocortex and connections to the senses. He envisions using not just a single Blue Gene computer [with over a hundred thousand processors and terabytes of RAM] but thousands of them, which would fill up not just a room but an entire city block. The energy consumption would be so great that you would need a thousand-megawatt nuclear power plant to generate all the electricity. And then, to cool off this monstrous computer so it wouldn't melt, you would need to divert a river and send it through the computer circuits. It is remarkable that a gigantic, city-size computer is required to simulate a piece of human tissue that weighs three pounds, fits inside your skull, raises your body temperature by only a few degrees, uses twenty watts of power, and needs only a few hamburgers to keep it going.
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)
Despite his new fame and fortune, he still fancied himself a child of the counterculture. On a visit to a Stanford class, he took off his Wilkes Bashford blazer and his shoes, perched on top of a table, and crossed his legs into a lotus position. The students asked questions, such as when Apple’s stock price would rise, which Jobs brushed off. Instead he spoke of his passion for future products, such as someday making a computer as small as a book. When the business questions tapered off, Jobs turned the tables on the well-groomed students. “How many of you are virgins?” he asked. There were nervous giggles. “How many of you have taken LSD?” More nervous laughter, and only one or two hands went up. Later Jobs would complain about the new generation of kids, who seemed to him more materialistic and careerist than his own. “When I went to school, it was right after the sixties and before this general wave of practical purposefulness had set in,” he said. “Now students aren’t even thinking in idealistic terms, or at least nowhere near as much.” His generation, he said, was different. “The idealistic wind of the sixties is still at our backs, though, and most of the people I know who are my age have that ingrained in them forever.”
Walter Isaacson (Steve Jobs)
The mechanisms that enable and govern our behavior today have been shaped by the ecology and behavior of our ancestors across countless generations; the mind/brain can then be studied as an evolved computational organ, or more precisely, a collection of specialized organs that perform various kinds of computations.
Marco del Giudice (Evolutionary Psychopathology: A Unified Approach)
Virtuality is the cultural perception that material objects are interpenetrated by information patterns. The definition plays off the duality at the heart of the condition of virtuality—materiality on the one hand, information on the other. Normally virtuality is associated with computer simulations that put the body into a feedback loop with a computer-generated image. For example, in virtual Ping-Pong, one swings a paddle wired into a computer, which calculates from the paddle’s momentum and position where the ball would go. Instead of hitting a real ball, the player makes the appropriate motions with the paddle and watches the image of the ball on a computer monitor. Thus the game takes place partly in real life (RL) and partly in virtual reality (VR). Virtual reality technologies are fascinating because they make visually immediate the perception that a world of information exists parallel to the “real” world, the former intersecting the latter at many points and in many ways. Hence the definition’s strategic quality, strategic because it seeks to connect virtual technologies with the sense, pervasive in the late twentieth century, that all material objects are interpenetrated by flows of information, from DNA code to the global reach of the World Wide Web.
N. Katherine Hayles (How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics)
How do the chromosomes behave in ontogenesis? The growth of an organism is effected by consecutive cell divisions. Such a cell division is called mitosis. It is, in the life of a cell, not such a very frequent event as one might expect, considering the enormous number of cells of which our body is composed. In the beginning the growth is rapid. The egg divides into two ‘daughter cells’ which, at the next step, will produce a generation of four, then of 8, 16, 32, 64, …, etc. The frequency of division will not remain exactly the same in all parts of the growing body, and that will break the regularity of these numbers. But from their rapid increase we infer by an easy computation that on the average as few as 50 or 60 successive divisions suffice to produce the number of cells in a grown man – or, say, ten times the number, taking into account the exchange of cells during lifetime. Thus, a body cell of mine is, on the average, only the 50th or 60th ‘descendant’ of the egg that was I.
Erwin Schrödinger (What is Life? (Canto Classics))
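Schrödinger's "easy computation" is just repeated doubling: after n synchronous division generations, one egg has become 2^n cells. A one-liner's worth of Python makes his 50-or-60 figure concrete:

```python
# One egg doubles with each division generation: 2**n cells after n divisions.
for n in (46, 50, 60):
    print(f"{n} divisions -> {2**n:.2e} cells")
# 2**46 is already ~7e13, the order usually quoted for an adult human body;
# 50 to 60 divisions comfortably covers his allowance for cell turnover.
```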
If men create intelligent machines, or fantasize about them, it is either because they secretly despair of their own intelligence or because they are in danger of succumbing to the weight of a monstrous and useless intelligence which they seek to exorcize by transferring it to machines, where they can play with it and make fun of it. By entrusting this burdensome intelligence to machines we are released from any responsibility to knowledge, much as entrusting power to politicians allows us to disdain any aspiration of our own to power. If men dream of machines that are unique, that are endowed with genius, it is because they despair of their own uniqueness, or because they prefer to do without it - to enjoy it by proxy, so to speak, thanks to machines. What such machines offer is the spectacle of thought, and in manipulating them people devote themselves more to the spectacle of thought than to thought itself. It is not for nothing that they are described as 'virtual', for they put thought on hold indefinitely, tying its emergence to the achievement of a complete knowledge. The act of thinking itself is thus put off for ever. Indeed, the question of thought can no more be raised than the question of the freedom of future generations, who will pass through life as we travel through the air, strapped into their seats. These Men of Artificial Intelligence will traverse their own mental space bound hand and foot to their computers. Immobile in front of his computer, Virtual Man makes love via the screen and gives lessons by means of the teleconference. He is a physical - and no doubt also a mental - cripple. That is the price he pays for being operational. Just as eyeglasses and contact lenses will arguably one day evolve into implanted prostheses for a species that has lost its sight, it is similarly to be feared that artificial intelligence and the hardware that supports it will become a mental prosthesis for a species without the capacity for thought. Artificial intelligence is devoid of intelligence because it is devoid of artifice.
Jean Baudrillard (The Transparency of Evil: Essays in Extreme Phenomena)
Genetic programming essentially allows computer algorithms to design themselves through a process of Darwinian natural selection. Computer code is initially generated randomly and then repeatedly shuffled using techniques that emulate sexual reproduction. Every so often, a random mutation is thrown in to help drive the process in entirely new directions.
Martin Ford (Rise of the Robots: Technology and the Threat of a Jobless Future)
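As a concrete illustration of the loop Ford describes, here is a toy genetic-programming sketch: randomly generated expression trees are bred by subtree crossover, mutated occasionally, and selected on how well they fit a target function. Everything in it (the target, the operator set, the population sizes) is invented for illustration, not taken from Ford's text:

```python
import random
import operator

OPS = [(operator.add, "+"), (operator.sub, "-"), (operator.mul, "*")]
TARGET = lambda x: x * x + x                     # function we try to rediscover
SAMPLES = [i / 2 for i in range(-4, 5)]

def random_tree(depth=3):
    # A program is a leaf ('x' or a small constant) or an (op, left, right) node.
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.7 else random.randint(-2, 2)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    (fn, _), left, right = tree
    return fn(evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Lower is better: summed squared error against the target on the samples.
    try:
        return sum((evaluate(tree, x) - TARGET(x)) ** 2 for x in SAMPLES)
    except OverflowError:
        return float("inf")

def crossover(a, b):
    # The "sexual reproduction" shuffle: graft a random subtree of b into a.
    if isinstance(a, tuple) and isinstance(b, tuple):
        op, left, right = a
        donor = random.choice([b[1], b[2]])
        if random.random() < 0.5:
            return (op, crossover(left, donor), right)
        return (op, left, crossover(right, donor))
    return random.choice([a, b])

def mutate(tree):
    # Every so often, a random mutation replaces a subtree wholesale.
    return random_tree(2) if random.random() < 0.1 else tree

population = [random_tree() for _ in range(200)]
for generation in range(30):
    population.sort(key=fitness)                 # Darwinian selection
    survivors = population[:50]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(150)]
    population = survivors + children
print("best error:", fitness(min(population, key=fitness)))
```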
Exoteric machines - esoteric machines. They say the computer is an improved form of typewriter. Not a bit of it. I collude with my typewriter, but the relationship is otherwise clear and distant. I know it is a machine; it knows it is a machine. There is nothing here of the interface, verging on biological confusion, between a computer thinking it is a brain and me thinking I am a computer. The same familiarity with good old television, where I was and remained a spectator. It was an esoteric machine, whose status as machine I respected. Nothing there of all these screens and interactive devices, including the 'smart' car of the future and the 'smart' house. Even the mobile phone, that incrustation of the network in your head, even the skateboard and rollerblades - mobility aids - are of a quite different generation from the good old static telephone or the velocipedic machine. New manners and a new morality are emerging as a result of this organic confusion between man and his prostheses - a confusion which puts an end to the instrumental pact and the integrity of the machine itself.
Jean Baudrillard (Cool Memories IV, 1995-2000)
The Ancestral Trail was split into two halves of 26 issues each. The first half takes place in the Ancestral World and describes Richard's struggle to restore good to the world. After the initial international run, which sold over 30 million copies worldwide, Marshall Cavendish omitted the second part of the trilogy and used the third part (future) for the second series that followed. This part of the series, written up by Ian Probert and published in 1994, takes place in the Cyber Dimension. It deals with Richard's attempts to return home. Each issue centered on an adventure against a particular adversary, and each issue ended on a cliffhanger. The Ancestral Trail was illustrated by Julek and Adam Heller. Computer-generated graphics were provided by Mehau Kulyk for issues #27 through #52.
Frank Graves
For the generation that’s grown up in a world where computers are the norm, smartphones feel like fifth limbs and music comes from the Internet rather than record and CD stores, Steve Jobs is must-read history. . . . The intimate chapters, where Jobs’s personal side shines through, with all his faults and craziness, leave a deep impression. There’s humor, too . . . it’s a rich portrait of one of the
Walter Isaacson (Steve Jobs)
The world's greatest computer is the brain. The world's greatest engine is the heart. The world's greatest generator is the soul. The world's greatest television is the mind. The world's greatest radio is the tongue. The world's greatest camera is the eye. The world's greatest ladder is faith. The world's greatest hammer is courage. The world's greatest sword is accuracy. The world's greatest photographer is sight. The world's greatest knife is fate. The world's greatest spear is intelligence. The world's greatest submarine is a fish. The world's greatest aeroplane is a bird. The world's greatest jet is a fly. The world's greatest bicycle is a camel. The world's greatest motorbike is a horse. The world's greatest train is a centipede. The world's greatest sniper is a cobra. The world's greatest schemer is a fox. The world's greatest builder is an ant. The world's greatest tailor is a spider. The world's greatest assassin is a wolf. The world's greatest ruler is a lion. The world's greatest judge is karma. The world's greatest preacher is nature. The world's greatest philosopher is truth. The world's greatest mirror is reality. The world's greatest curtain is darkness. The world's greatest author is destiny.
Matshona Dhliwayo
They made solemn pronouncements about conditions a trillionth of a second after the Big Bang, on the basis of computer models, which they had produced with computers not even bright enough to talk, let alone understand speech. They were unlike all the generations before theirs in several ways, but chiefly in that they had no faintest clue how ignorant they were. Previous ages had usually had a pretty good handle on that.
Robert A. Heinlein (Variable Star: A Novel (Tor Science Fiction))
Have you ever thought, not only about the airplane but whatever man builds, that all of man’s industrial efforts, all his computations and calculations, all the nights spent working over draughts and blueprints, invariably culminate in the production of a thing whose sole and guiding principle is the ultimate principle of simplicity? It is as if there were a natural law which ordained that to achieve this end, to refine the curve of a piece of furniture, or a ship’s keel, or the fuselage of an airplane, until gradually it partakes of the elementary purity of the curve of the human breast or shoulder, there must be experimentation of several generations of craftsmen. In anything at all, perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away, when a body has been stripped down to its nakedness.
Antoine de Saint-Exupéry
Literature is as old as human language, and as new as tomorrow's sunrise. And literature is everywhere, not only in books, but in videos, television, radio, CDs, computers, newspapers, in all the media of communication where a story is told or an image created. It starts with words, and with speech. The first literature in any culture is oral. The classical Greek epics of Homer, the Asian narratives of Gilgamesh and the Bhagavad Gita, the earliest versions of the Bible and the Koran were all communicated orally, and passed on from generation to generation - with variations, additions, omissions and embellishments until they were set down in written form, in versions which have come down to us. In English, the first signs of oral literature tend to have three kinds of subject matter - religion, war, and the trials of daily life - all of which continue as themes of a great deal of writing.
Ronald Carter (The Routledge History of Literature in English: Britain and Ireland)
The world has been changing even faster as people, devices and information are increasingly connected to each other. Computational power is growing and quantum computing is quickly being realised. This will revolutionise artificial intelligence with exponentially faster speeds. It will advance encryption. Quantum computers will change everything, even human biology. There is already one technique to edit DNA precisely, called CRISPR. The basis of this genome-editing technology is a bacterial defence system. It can accurately target and edit stretches of genetic code. The best intention of genetic manipulation is that modifying genes would allow scientists to treat genetic causes of disease by correcting gene mutations. There are, however, less noble possibilities for manipulating DNA. How far we can go with genetic engineering will become an increasingly urgent question. We can’t see the possibilities of curing motor neurone diseases—like my ALS—without also glimpsing its dangers. Intelligence is characterised as the ability to adapt to change. Human intelligence is the result of generations of natural selection of those with the ability to adapt to changed circumstances. We must not fear change. We need to make it work to our advantage. We all have a role to play in making sure that we, and the next generation, have not just the opportunity but the determination to engage fully with the study of science at an early level, so that we can go on to fulfil our potential and create a better world for the whole human race. We need to take learning beyond a theoretical discussion of how AI should be and to make sure we plan for how it can be. We all have the potential to push the boundaries of what is accepted, or expected, and to think big. We stand on the threshold of a brave new world. It is an exciting, if precarious, place to be, and we are the pioneers. When we invented fire, we messed up repeatedly, then invented the fire extinguisher. With more powerful technologies such as nuclear weapons, synthetic biology and strong artificial intelligence, we should instead plan ahead and aim to get things right the first time, because it may be the only chance we will get. Our future is a race between the growing power of our technology and the wisdom with which we use it. Let’s make sure that wisdom wins.
Stephen Hawking (Brief Answers to the Big Questions)
Now keep looking at this unpleasant situation or person until you realize that it isn’t they that are causing the negative emotions. They are just going their way, being themselves, doing their thing whether right or wrong, good or bad. It is your computer that, thanks to your programming, insists on your reacting with negative emotions. You will see this better if you realize that someone with a different programming when faced with this same situation or person or event would react quite calmly, even happily. Don’t stop till you have grasped this truth: The only reason why you too are not reacting calmly and happily is your computer that is stubbornly insisting that reality be reshaped to conform to its programming. Observe all of this from the outside so to speak and see the marvelous change that comes about in you. Once you have understood this truth and thereby stopped your computer from generating negative emotions you may take any action you deem fit. You may avoid the situation or the person; or you may try to change them; or you may insist on your rights or the rights of others being respected; you may even resort to the use of force. But only after you have got rid of your emotional upsets, for then your action will spring from peace and love, not from the neurotic desire to appease your computer or to conform to its programming or to get rid of the negative emotions it generates. Then you will understand how profound is the wisdom of the words: “If a man wants to sue you for your shirt, let him have your coat as well. If a man in authority makes you go one mile, go with him two.” For it will have become evident to you that real oppression comes, not from people who fight you in court or from authority that subjects you to slave labor, but from your computer whose programming destroys your peace of mind the moment outside circumstances fail to conform to its demands. People have been known to be happy even in the oppressive atmosphere of a concentration camp! It is from the oppression of your programming that you need to be liberated.
Anthony de Mello (The Way to Love: Meditations for Life)
You should do well but not really good. And the reason is that in the time it takes you to go from well to really good, Moore’s law has already surpassed you. You can pick up 10 percent but while you’re picking up that 10 percent, computers have gotten twice as fast and maybe with some other stuff that matters more for optimization, like caches. I think it’s largely a waste of time to do really well. It’s really hard; you generate as many bugs as you fix. You should stop, not take that extra 100 percent of time to do 10 percent of the work.
Ken Thompson
Computers were built in the late 1940s because mathematicians like John von Neumann thought that if you had a computer—a machine to handle a lot of variables simultaneously—you would be able to predict the weather. Weather would finally fall to human understanding. And men believed that dream for the next forty years. They believed that prediction was just a function of keeping track of things. If you knew enough, you could predict anything. That’s been a cherished scientific belief since Newton.” “And?” “Chaos theory throws it right out the window. It says that you can never predict certain phenomena at all. You can never predict the weather more than a few days away. All the money that has been spent on long-range forecasting—about half a billion dollars in the last few decades—is money wasted. It’s a fool’s errand. It’s as pointless as trying to turn lead into gold. We look back at the alchemists and laugh at what they were trying to do, but future generations will laugh at us the same way. We’ve tried the impossible—and spent a lot of money doing it. Because in fact there are great categories of phenomena that are inherently unpredictable.
Michael Crichton (Jurassic Park (Jurassic Park, #1))
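The unpredictability Crichton's character describes has a standard classroom demonstration: the logistic map, a one-line deterministic rule whose outputs from two nearly identical starting points diverge completely within a few dozen steps. A minimal sketch (the parameter r = 4 and the 1e-9 offset are conventional choices for the chaotic regime):

```python
r = 4.0                          # chaotic regime of the logistic map
a, b = 0.400000000, 0.400000001  # starting points differing by one part in a billion
for step in range(1, 51):
    a, b = r * a * (1 - a), r * b * (1 - b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.2e}")
# By roughly step 40 the gap is as large as the values themselves:
# perfectly deterministic, yet useless for long-range prediction.
```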
The clarity offered by software as metaphor - and the empowerment allegedly offered to us who know software - should make us pause, because software also engenders a sense of profound ignorance. Software is extremely difficult to comprehend. Who really knows what lurks behind our smiling interfaces, behind the objects we click and manipulate? Who completely understands what one’s computer is actually doing at any given moment? Software as a metaphor for metaphor troubles the usual functioning of metaphor, that is, the clarification of an unknown concept through a known one. For, if software illuminates an unknown, it does so through an unknowable (software). This paradox - this drive to grasp what we do not know through what we do not entirely understand… does not undermine, but rather grounds software’s appeal. Its combination of what can be seen and not seen, can be known and not known - its separation of interface from algorithm, of software from hardware - makes it a powerful metaphor for everything we believe is invisible yet generates visible effects, from genetics to the invisible hand of the market, from ideology to culture. Every use entails an act of faith.
Wendy Hui Kyong Chun (Programmed Visions: Software and Memory (Software Studies))
Regardless of the propaganda espoused by large corporations, governments, religions, and other institutions, we can all follow this simple path in creating the seemingly elusive “world that works for everyone” right here and right now. Only we individuals can think and take action. No corporation will ever generate a single thought—much less invent the next great app, computer program, or ground transportation vehicle. No religion will ever come up with a single inspirational aphorism. And no government will ever shut down a single military facility. These things are all initiated and accomplished by individuals.
L. Steven Sieden (A Fuller View: Buckminster Fuller's Vision of Hope and Abundance for All)
Security is a big and serious deal, but it’s also largely a solved problem. That’s why the average person is quite willing to do their banking online and why nobody is afraid of entering their credit card number on Amazon. At 37signals, we’ve devised a simple security checklist all employees must follow:
1. All computers must use hard drive encryption, like the built-in FileVault feature in Apple’s OS X operating system. This ensures that a lost laptop is merely an inconvenience and an insurance claim, not a company-wide emergency and a scramble to change passwords and worry about what documents might be leaked.
2. Disable automatic login, require a password when waking from sleep, and set the computer to automatically lock after ten inactive minutes.
3. Turn on encryption for all sites you visit, especially critical services like Gmail. These days all sites use something called HTTPS or SSL. Look for the little lock icon in front of the Internet address. (We forced all 37signals products onto SSL a few years back to help with this.)
4. Make sure all smartphones and tablets use lock codes and can be wiped remotely. On the iPhone, you can do this through the “Find iPhone” application. This rule is easily forgotten as we tend to think of these tools as something for the home, but inevitably you’ll check your work email or log into Basecamp using your tablet. A smartphone or tablet needs to be treated with as much respect as your laptop.
5. Use a unique, generated, long-form password for each site you visit, kept by password-managing software, such as 1Password. We’re sorry to say, “secretmonkey” is not going to fool anyone. And even if you manage to remember UM6vDjwidQE9C28Z, it’s no good if it’s used on every site and one of them is hacked. (It happens all the time!)
6. Turn on two-factor authentication when using Gmail, so you can’t log in without having access to your cell phone for a login code (this means that someone who gets hold of your login and password also needs to get hold of your phone to login). And keep in mind: if your email security fails, all other online services will fail too, since an intruder can use the “password reset” from any other site to have a new password sent to the email account they now have access to.
Creating security protocols and algorithms is the computer equivalent of rocket science, but taking advantage of them isn’t. Take the time to learn the basics and they’ll cease being scary voodoo that you can’t trust. These days, security for your devices is just simple good sense, like putting on your seat belt.
Jason Fried (Remote: Office Not Required)
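Point 5 of the checklist costs one line of code to satisfy: let software generate the password and never try to remember it. A minimal sketch using Python's standard library (a password manager such as 1Password does this for you and stores the result; the length of 24 is an arbitrary choice here):

```python
import secrets
import string

# A unique, generated, long-form password: one per site, never reused.
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(24))
print(password)
```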
- we no longer live in the Age of Reason. We don’t have reason we have computation. We don’t have a tree of knowledge; we have an information superhighway. We don’t have real intelligence; we have artificial intelligence. We no longer pursue truth, we seek data and signals. We no longer have philosophers, we have thinking pragmatists. We no longer have morals, we have lifestyles. We no longer have brains which serve as the seat of our thinking minds; we have neural sites, which remember, store body signals, control genes, generate dreams, anxieties and neuroses, quite independent of whether they think rationally or not. So starting from reason, where did we get?
Malcolm Bradbury (To the Hermitage)
Some of the ideas were silly, thanks to Molly, who, despite being upset with Jones, was still trying to keep the mood upbeat. They had boxes and boxes of copy paper. They could make thousands of paper airplanes with the message, “Help!” written on them and fly them out the windows. Could they try to blast their way out of the tunnel? Maybe dig an alternative route to the surface? It seemed like a long shot, worth going back in there and taking a look at the construction—which Jones had done only to come back out, thumbs down. Two of them could create a diversion, while the other two took the Impala and crashed their way out of the garage. At which point the Impala—and everyone in it—would be hit by hundreds of bullets. That one—along with taking their chances with the far fewer number of soldiers lying in wait at the end of the escape tunnel—went into the bad idea file. Molly had thought that they could sing karaoke. Emilio had a Best of Whitney Houston karaoke CD. Their renditions of I Will Always Love You, she insisted, would cause the troops to break rank and run away screaming. Except the karaoke machine was powered by electricity, which they were trying to use only for the computer and the security monitors, considering—at the time—that the generator was almost out of gasoline. Yeah, that was why it was a silly idea. It did, however, generate a lot of desperately needed laughter.
Suzanne Brockmann (Breaking Point (Troubleshooters, #9))
On September 14, 2015, the LIGO gravitational-wave detectors (built by a 1,000-person project that Rai and I and Ronald Drever co-founded, and Barry Barish organised, assembled and led) registered their first gravitational waves. By comparing the wave patterns with predictions from computer simulations, our team concluded that the waves were produced when two heavy black holes, 1.3 billion light years from Earth, collided. This was the beginning of gravitational-wave astronomy. Our team had achieved, for gravitational waves, what Galileo achieved for electromagnetic waves. I am confident that, over the coming several decades, the next generation of gravitational-wave astronomers will use these waves not only to test Stephen’s laws of black hole physics, but also to detect and monitor gravitational waves from the singular birth of our universe, and thereby test Stephen’s and others’ ideas about how our universe came to be. During our glorious year of 1974–5, while I was dithering over gravitational waves, and Stephen was leading our merged group in black hole research, Stephen himself had an insight even more radical than his discovery of Hawking radiation. He gave a compelling, almost airtight proof that, when a black hole forms and then subsequently evaporates away completely by emitting radiation, the information that went into the black hole cannot come back out. Information is inevitably lost.
Stephen Hawking (Brief Answers to the Big Questions)
When modern humans first invented computer ray tracing, they generated thousands if not millions of images of reflective chrome spheres hovering above checkerboard tiles, just to show off how gorgeously ray tracing rendered those reflections. When they invented lens flares in Photoshop, we all had to endure years of lens flares being added to everything, because the artists involved were super excited about a new tool they’d just figured out how to use. The invention of perspective was no different, and since it coincided with the Renaissance going on in Europe at the same time, some of the greatest art in the European canon is dripping with the 1400s CE equivalent of lens flares and hovering chrome spheres.
Ryan North (How to Invent Everything: A Survival Guide for the Stranded Time Traveler)
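For the curious, the hovering-chrome-sphere demo the quote teases is small enough to sketch outright. The following toy Python ray tracer (standard library only; every constant is invented for illustration) renders one mirrored sphere over a checkerboard and writes a plain-text PPM image:

```python
# Toy ray tracer in the spirit of the quote: one reflective chrome sphere
# hovering over a checkerboard. Standard library only; all constants are
# invented for illustration.
import math

W, H = 320, 240
CAM = (0.0, 1.0, -4.0)               # camera origin
SPH_C, SPH_R = (0.0, 1.0, 0.0), 1.0  # sphere center and radius

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a): return mul(a, 1.0 / math.sqrt(dot(a, a)))

def hit_sphere(o, d):
    # nearest t > 0 with |o + t*d - c|^2 = r^2 (d is unit length)
    oc = sub(o, SPH_C)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - SPH_R * SPH_R
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(o, d, depth=0):
    t = hit_sphere(o, d)
    if t is not None and depth < 2:   # mirror bounce, capped at two bounces
        p = add(o, mul(d, t))
        n = norm(sub(p, SPH_C))
        return trace(p, sub(d, mul(n, 2.0 * dot(d, n))), depth + 1)
    if d[1] < 0:                      # downward ray hits the floor plane y = 0
        p = add(o, mul(d, -o[1] / d[1]))
        k = (int(math.floor(p[0])) + int(math.floor(p[2]))) % 2
        return (255, 255, 255) if k == 0 else (30, 30, 30)  # the checkerboard
    return (120, 170, 255)            # sky

with open("sphere.ppm", "w") as f:
    f.write(f"P3 {W} {H} 255\n")
    for j in range(H):
        for i in range(W):
            d = norm(((i - W / 2) / H, -(j - H / 2) / H, 1.0))
            f.write(" ".join(map(str, trace(CAM, d))) + "\n")
```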
Our actions and the problems they create are connected, all around the world. Goats in the Mongolian desert add to air pollution in California; throwing away a computer helps create an illegal economy that makes people sick in Ghana; a loophole in a treaty contributes to deforestation in the American South to generate electricity in England; our idea of the perfect carrot could mean that many others rot in the fields. We can’t pretend anymore that the things we do and wear and eat and use exist only for us, that they don’t have a wider impact beyond our individual lives, which also means that we’re all in this together. • A lack of transparency on the part of governments and corporations has meant that our actions have consequences we are unaware of (see above), and if we knew about them, we would be surprised and angry. (Now, maybe, you are.) • It’s important to understand your actions and larger social, cultural, industrial, and economic processes in context, because then you can better understand which specific policies and practices would make a difference, and what they would achieve. • Living in a way that honors your values is important, even if your personal habits aren’t going to fix everything. We need to remember what is at stake, and the small sacrifices we make may help us do that, if you need reminding. If we know what our sacrifices mean and why they might matter, we might be more willing to make them.
Tatiana Schlossberg (Inconspicuous Consumption: The Environmental Impact You Don't Know You Have)
Characteristics of System 1: • generates impressions, feelings, and inclinations; when endorsed by System 2 these become beliefs, attitudes, and intentions • operates automatically and quickly, with little or no effort, and no sense of voluntary control • can be programmed by System 2 to mobilize attention when a particular pattern is detected (search) • executes skilled responses and generates skilled intuitions, after adequate training • creates a coherent pattern of activated ideas in associative memory • links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance • distinguishes the surprising from the normal • infers and invents causes and intentions • neglects ambiguity and suppresses doubt • is biased to believe and confirm • exaggerates emotional consistency (halo effect) • focuses on existing evidence and ignores absent evidence (WYSIATI) • generates a limited set of basic assessments • represents sets by norms and prototypes, does not integrate • matches intensities across scales (e.g., size to loudness) • computes more than intended (mental shotgun) • sometimes substitutes an easier question for a difficult one (heuristics) • is more sensitive to changes than to states (prospect theory)* • overweights low probabilities* • shows diminishing sensitivity to quantity (psychophysics)* • responds more strongly to losses than to gains (loss aversion)* • frames decision problems narrowly, in isolation from one another*
Daniel Kahneman (Thinking, Fast and Slow)
There’s an old phrase,” Matthew says. “Knowledge is power. Power to do evil, like Jeanine…or power to do good, like what we’re doing. Power itself is not evil. So knowledge itself is not evil.” “I guess I grew up suspicious of both. Power and knowledge,” I say. “To the Abnegation, power should only be given to people who don’t want it.” “There’s something to that,” Matthew says. “But maybe it’s time to grow out of that suspicion.” He reaches under the desk and takes out a book. It is thick, with a worn cover and frayed edges. On it is printed HUMAN BIOLOGY. “It’s a little rudimentary, but this book helped to teach me what it is to be human,” he says. “To be such a complicated, mysterious piece of biological machinery, and more amazing still, to have the capacity to analyze that machinery! That is a special thing, unprecedented in all of evolutionary history. Our ability to know about ourselves and the world is what makes us human.” He hands me the book and turns back to the computer. I look down at the worn cover and run my fingers along the edge of the pages. He makes the acquisition of knowledge feel like a secret, beautiful thing, and an ancient thing. I feel like, if I read this book, I can reach backward through all the generations of humanity to the very first one, whenever it was--that I can participate in something many times larger and older than myself. “Thank you,” I say, and it’s not for the book. It’s for giving something back to me, something I lost before I was able to really have it.
Veronica Roth (Allegiant (Divergent, #3))
Molly Notkin tells the U.S.O.U.S. operatives that her understanding of the après-garde Auteur J. O. Incandenza's lethally entertaining Infinite Jest (V or VI) is that it features Madame Psychosis as some kind of maternal instantiation of the archetypal figure Death, sitting naked, corporeally gorgeous, ravishing, hugely pregnant, her hideously deformed face either veiled or blanked out by undulating computer-generated squares of color or anamorphosized into unrecognizability as any kind of face by the camera's apparently very strange and novel lens, sitting there nude, explaining in very simple childlike language to whomever the film's camera represents that Death is always female, and that the female is always maternal. I.e. that the woman who kills you is always your next life's mother.
David Foster Wallace (Infinite Jest)
And there were other neural implants being developed back then, including retinal implants, chips that enable a stroke patient to control his computer from his brain, an artificial hippocampus for boosting short-term memory, and many others. If you apply the approximately 30 million–fold increase in capability and over 100,000-fold shrinking in size that has occurred in the past quarter century, we now have much more capable devices that are the size of blood cells. Reader: Still, it’s hard to imagine building something the size of a blood cell that can perform a useful function. Terry2034: Actually, there was a first generation of blood cell–size devices back in your day. One scientist cured type 1 diabetes in rats with a blood cell–size device. It was an excellent example of nanotechnology from
Ray Kurzweil (Transcend: Nine Steps to Living Well Forever)
RENEWABLE ENERGY REVOLUTION: SOLAR + WIND + BATTERIES In addition to AI, we are on the cusp of another important technological revolution—renewable energy. Together, solar photovoltaic, wind power, and lithium-ion battery storage technologies will create the capability of replacing most if not all of our energy infrastructure with renewable clean energy. By 2041, much of the developed world and some developing countries will be primarily powered by solar and wind. The cost of solar energy dropped 82 percent from 2010 to 2020, while the cost of wind energy dropped 46 percent. Solar and onshore wind are now the cheapest sources of electricity. In addition, lithium-ion battery storage cost has dropped 87 percent from 2010 to 2020. It will drop further thanks to the massive production of batteries for electrical vehicles. This rapid drop in the price of battery storage will make it possible to store the solar/wind energy from sunny and windy days for future use. Think tank RethinkX estimates that with a $2 trillion investment through 2030, the cost of energy in the United States will drop to 3 cents per kilowatt-hour, less than one-quarter of today’s cost. By 2041, it should be even lower, as the prices of these three components continue to descend. What happens on days when a given area’s battery energy storage is full—will any generated energy left unused be wasted? RethinkX predicts that these circumstances will create a new class of energy called “super power” at essentially zero cost, usually during the sunniest or most windy days. With intelligent scheduling, this “super power” can be used for non-time-sensitive applications such as charging batteries of idle cars, water desalination and treatment, waste recycling, metal refining, carbon removal, blockchain consensus algorithms, AI drug discovery, and manufacturing activities whose costs are energy-driven. Such a system would not only dramatically decrease energy cost, but also power new applications and inventions that were previously too expensive to pursue. As the cost of energy plummets, the cost of water, materials, manufacturing, computation, and anything that has a major energy component will drop, too. The solar + wind + batteries approach to new energy will also be 100-percent clean energy. Switching to this form of energy can eliminate more than 50 percent of all greenhouse gas emissions, which is by far the largest culprit of climate change.
Kai-Fu Lee (AI 2041: Ten Visions for Our Future)
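As a rough sanity check on the quoted figures, the decade-long declines can be restated as average annual rates, assuming they compound evenly over the ten years (an illustration, not RethinkX's model):

```python
# Restating the quoted 2010-2020 cost declines as average annual rates,
# assuming the declines compound evenly over the decade (illustration only).
drops = {"solar PV": 0.82, "onshore wind": 0.46, "li-ion storage": 0.87}
for name, decade_drop in drops.items():
    annual = 1 - (1 - decade_drop) ** (1 / 10)
    print(f"{name}: {decade_drop:.0%} per decade ~ {annual:.1%} per year")
# solar PV's 82% drop per decade works out to roughly 15.8% per year
```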
In the longer term, by bringing together enough data and enough computing power, the data giants could hack the deepest secrets of life, and then use this knowledge not just to make choices for us or manipulate us but also to reengineer organic life and create inorganic life-forms. Selling advertisements may be necessary to sustain the giants in the short term, but tech companies often evaluate apps, products, and other companies according to the data they harvest rather than according to the money they generate. A popular app may lack a business model and may even lose money in the short term, but as long as it sucks data, it could be worth billions.4 Even if you don’t know how to cash in on the data today, it is worth having it because it might hold the key to controlling and shaping life in the future. I don’t know for certain that the data giants explicitly think about this in such terms, but their actions indicate that they value the accumulation of data in terms beyond those of mere dollars and cents. Ordinary humans will find it very difficult to resist this process. At present, people are happy to give away their most valuable asset—their personal data—in exchange for free email services and funny cat videos. It’s a bit like African and Native American tribes who unwittingly sold entire countries to European imperialists in exchange for colorful beads and cheap trinkets. If, later on, ordinary people decide to try to block the flow of data, they might find it increasingly difficult, especially as they might come to rely on the network for all their decisions, and even for their healthcare and physical survival.
Yuval Noah Harari (21 Lessons for the 21st Century)
Facebook’s own North American marketing director, Michelle Klein, who told an audience in 2016 that while the average adult checks his or her phone 30 times a day, the average millennial, she enthusiastically reported, checks more than 157 times daily. Generation Z, we now know, exceeds this pace. Klein described Facebook’s engineering feat: “a sensory experience of communication that helps us connect to others, without having to look away,” noting with satisfaction that this condition is a boon to marketers. She underscored the design characteristics that produce this mesmerizing effect: design is narrative, engrossing, immediate, expressive, immersive, adaptive, and dynamic.11 If you are over the age of thirty, you know that Klein is not describing your adolescence, or that of your parents, and certainly not that of your grandparents. Adolescence and emerging adulthood in the hive are a human first, meticulously crafted by the science of behavioral engineering; institutionalized in the vast and complex architectures of computer-mediated means of behavior modification; overseen by Big Other; directed toward economies of scale, scope, and action in the capture of behavioral surplus; and funded by the surveillance capital that accrues from unprecedented concentrations of knowledge and power. Our children endeavor to come of age in a hive that is owned and operated by the applied utopianists of surveillance capitalism and is continuously monitored and shaped by the gathering force of instrumentarian power. Is this the life that we want for the most open, pliable, eager, self-conscious, and promising members of our society?
Shoshana Zuboff (The Age of Surveillance Capitalism)
Paper wallets can be generated easily using a tool such as the client-side JavaScript generator at bitaddress.org. This page contains all the code necessary to generate keys and paper wallets, even while completely disconnected from the internet. To use it, save the HTML page on your local drive or on an external USB flash drive. Disconnect from the internet and open the file in a browser. Even better, boot your computer using a pristine operating system, such as a CD-ROM bootable Linux OS. Any keys generated with this tool while offline can be printed on a local printer over a USB cable (not wirelessly), thereby creating paper wallets whose keys exist only on the paper and have never been stored on any online system. Put these paper wallets in a fireproof safe and “send” bitcoin to their bitcoin address, to implement a simple yet highly effective “cold storage” solution. Figure 4-8 shows a paper wallet generated from the bitaddress.org site.
Andreas M. Antonopoulos (Mastering Bitcoin: Programming the Open Blockchain)
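For readers curious what "generating keys while offline" actually involves, here is a rough Python equivalent of the derivation the quote describes. It assumes the third-party ecdsa package and an OpenSSL build that provides RIPEMD-160; it is a sketch of the key-to-address math, not audited wallet software and not the bitaddress.org code itself:

```python
# A minimal offline sketch of Bitcoin key-to-address derivation (the kind of
# thing bitaddress.org does in JavaScript). Assumes the third-party 'ecdsa'
# package and an OpenSSL build exposing RIPEMD-160. Illustration only.
import hashlib
import secrets
import ecdsa

B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check(payload: bytes) -> str:
    # append a 4-byte double-SHA256 checksum, then encode in base58
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    data = payload + checksum
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58[r] + out
    pad = len(data) - len(data.lstrip(b"\x00"))  # leading zero bytes -> '1'
    return "1" * pad + out

priv = secrets.token_bytes(32)  # a real wallet would also range-check this
sk = ecdsa.SigningKey.from_string(priv, curve=ecdsa.SECP256k1)
pubkey = b"\x04" + sk.get_verifying_key().to_string()  # uncompressed SEC format
h160 = hashlib.new("ripemd160", hashlib.sha256(pubkey).digest()).digest()

print("address:", base58check(b"\x00" + h160))            # mainnet P2PKH
print("private key (WIF):", base58check(b"\x80" + priv))  # uncompressed WIF
```

Run on an air-gapped machine and printed over a USB cable, the two printed strings are exactly what ends up on the paper wallet.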
Certainly, the Negro has been deprived. Few people consider the fact that, in addition to being enslaved for two centuries, the Negro was, during all those years, robbed of the wages of his toil. No amount of gold could provide an adequate compensation for the exploitation and humiliation of the Negro in America down through the centuries. Not all the wealth of this affluent society could meet the bill. Yet a price can be placed on unpaid wages. The ancient common law has always provided a remedy for the appropriation of the labor of one human being by another. This law should be made to apply for American Negroes. The payment should be in the form of a massive program by the government of special, compensatory measures which could be regarded as a settlement in accordance with the accepted practice of common law. Such measures would certainly be less expensive than any computation based on two centuries of unpaid wages and accumulated interest.
Martin Luther King Jr. (Why We Can't Wait)
It’s true that in the 1950s many women felt they had to choose between children and career—and for good reason. Birth control was not a surefire thing, for one thing. And technology hadn’t advanced enough to offer women the gift of time. The reason modern women have a better shot at “having it all” isn’t because feminists made it happen. Life simply changed. Technological advances, along with The Pill, did more for the work/family conflict than ten boatloads of feminists could ever hope to do. The effects of The Pill are obvious: safe, reliable birth control means those who want smaller families can have them. And fewer children means more time for women to focus on other things they want to do. The effects of technology are also obvious: they made life at home less taxing. Laborsaving devices, the mechanization of housework, and the tech boom—via electricity, the sewing machine, the frozen food process, the automobile, the washing machine and dryer, the dishwasher, the vacuum cleaner, computers, and the Internet—allowed women, generation by generation, to turn their attention away from the home and onto the marketplace.
Suzanne Venker (The War On Men)
The explosion of government and spending under Obama insured that while the rest of the nation continued to suffer stagnant job growth and slow housing sales long past the time when a recovery should have been underway, one city was booming like a five-year-long Led Zeppelin drum solo: Washington, D.C. According to the 2014 Forbes ranking of the ten richest counties in America, none were in New York, California, or Texas. Before Obama took office, five of the richest counties surrounded Washington, D.C. Now, seven years after Obama took office on his promise to rid the place of big money lobbyists, and Democrats assumed complete control of the White House and Congress for two years, six of the richest counties surround Washington, D.C. Bear in mind that unlike Texas or California, where money is generated by creating products people actually need, such as oil or computers, Washington, D.C., produces nothing but government. In other words, six of the ten richest counties in America got that rich by being parasites. A case could be made that under the current leadership, crony capitalism is more rewarding than actual capitalism. And with all that government around business people’s necks, it’s certainly a heckuva lot easier.
Mike Huckabee (God, Guns, Grits, and Gravy: and the Dad-Gummed Gummint That Wants to Take Them Away)
We already have eight hundred million people living in hunger—and population is growing by eighty million a year. Over a billion people are in poverty—and present industrial strategies are making them poorer, not richer. The percentage of old people will double by 2050—and already there aren’t enough young people to care for them. Cancer rates are projected to increase by seventy percent in the next fifteen years. Within two decades our oceans will contain more microplastics than fish. Fossil fuels will run out before the end of the century. Do you have an answer to those problems? Because I do. Robot farmers will increase food production twentyfold. Robot carers will give our seniors a dignified old age. Robot divers will clear up the mess humans have made of our seas. And so on, and so on—but every single step has to be costed and paid for by the profits of the last.” He paused for breath, then went on, “My vision is a society where autonomous, intelligent bots are as commonplace as computers are now. Think about that—how different our world could be. A world where disease, hunger, manufacturing, design, are all taken care of by AI. That’s the revolution we’re shooting for. The shopbots get us to the next level, that’s all. And you know what? This is not some binary choice between idealism or realism, because for some of us idealism is just long-range realism. This shit has to happen. And you need to ask yourself, do you want to be part of that change? Or do you want to stand on the sidelines and bitch about the details?” We had all heard this speech, or some version of it, either in our job interviews, or at company events, or in passionate late-night tirades. And on every single one of us it had had a deep and transformative effect. Most of us had come to Silicon Valley back in those heady days when it seemed a new generation finally had the tools and the intelligence to change the world. The hippies had tried and failed; the yuppies and bankers had had their turn. Now it was down to us techies. We were fired up, we were zealous, we felt the nobility of our calling…only to discover that the general public, and our backers along with them, were more interested in 140 characters, fitness trackers, and Grumpy Cat videos. The greatest, most powerful deep-learning computers in humanity’s existence were inside Google and Facebook—and all humanity had to show for it were adwords, sponsored links, and teenagers hooked on sending one another pictures of their genitals.
J.P. Delaney (The Perfect Wife)
a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.”28 This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.”29 It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone?
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The time is nearly upon us,” said one, and Arthur was surprised to see a word suddenly materialize in thin air just by the man’s neck. The word was LOONQUAWL, and it flashed a couple of times and then disappeared again. Before Arthur was able to assimilate this the other man spoke and the word PHOUCHG appeared by his neck. “Seventy-five thousand generations ago, our ancestors set this program in motion,” the second man said, “and in all that time we will be the first to hear the computer speak.” “An awesome prospect, Phouchg,” agreed the first man, and Arthur suddenly realized he was watching a recording with subtitles. “We are the ones who will hear,” said Phouchg, “the answer to the great question of Life …!” “The Universe …!” said Loonquawl. “And Everything …!” “Shhh,” said Loonquawl with a slight gesture, “I think Deep Thought is preparing to speak!” There was a moment’s expectant pause while panels slowly came to life on the front of the console. Lights flashed on and off experimentally and settled down into a businesslike pattern. A soft low hum came from the communication channel. “Good morning,” said Deep Thought at last. “Er … good morning, O Deep Thought,” said Loonquawl nervously, “do you have … er, that is …” “An answer for you?” interrupted Deep Thought majestically. “Yes. I have.” The two men shivered with expectancy. Their waiting had not been in vain. “There really is one?” breathed Phouchg. “There really is one,” confirmed Deep Thought. “To Everything? To the great Question of Life, the Universe and Everything?” “Yes.” Both of the men had been trained for this moment, their lives had been a preparation for it, they had been selected at birth as those who would witness the answer, but even so they found themselves gasping and squirming like excited children. “And you’re ready to give it to us?” urged Loonquawl. “I am.” “Now?” “Now,” said Deep Thought. They both licked their dry lips. “Though I don’t think,” added Deep Thought, “that you’re going to like it.” “Doesn’t matter!” said Phouchg. “We must know it! Now!” “Now?” inquired Deep Thought. “Yes! Now …” “All right,” said the computer, and settled into silence again. The two men fidgeted. The tension was unbearable. “You’re really not going to like it,” observed Deep Thought. “Tell us!” “All right,” said Deep Thought. “The Answer to the Great Question …” “Yes …!” “Of Life, the Universe and Everything …” said Deep Thought. “Yes …!” “Is …” said Deep Thought, and paused. “Yes …!” “Is …” “Yes …!!! …?” “Forty-two,” said Deep Thought, with infinite majesty and calm.
Douglas Adams (The Hitchhiker's Guide to the Galaxy (Hitchhiker's Guide, #1))
I will give technology three definitions that we will use throughout the book. The first and most basic one is that a technology is a means to fulfill a human purpose. For some technologies—oil refining—the purpose is explicit. For others—the computer—the purpose may be hazy, multiple, and changing. As a means, a technology may be a method or process or device: a particular speech recognition algorithm, or a filtration process in chemical engineering, or a diesel engine. It may be simple: a roller bearing. Or it may be complicated: a wavelength division multiplexer. It may be material: an electrical generator. Or it may be nonmaterial: a digital compression algorithm. Whichever it is, it is always a means to carry out a human purpose. The second definition I will allow is a plural one: technology as an assemblage of practices and components. This covers technologies such as electronics or biotechnology that are collections or toolboxes of individual technologies and practices. Strictly speaking, we should call these bodies of technology. But this plural usage is widespread, so I will allow it here. I will also allow a third meaning. This is technology as the entire collection of devices and engineering practices available to a culture. Here we are back to the Oxford's collection of mechanical arts, or as Webster's puts it, "The totality of the means employed by a people to provide itself with the objects of material culture." We use this collective meaning when we blame "technology" for speeding up our lives, or talk of "technology" as a hope for mankind. Sometimes this meaning shades off into technology as a collective activity, as in "technology is what Silicon Valley is all about." I will allow this too as a variant of technology's collective meaning. The technology thinker Kevin Kelly calls this totality the "technium," and I like this word. But in this book I prefer to simply use "technology" for this because that reflects common use. The reason we need three meanings is that each points to technology in a different sense, a different category, from the others. Each category comes into being differently and evolves differently. A technology—singular—the steam engine—originates as a new concept and develops by modifying its internal parts. A technology—plural—electronics—comes into being by building around certain phenomena and components and develops by changing its parts and practices. And technology—general, the whole collection of all technologies that have ever existed past and present, originates from the use of natural phenomena and builds up organically with new elements forming by combination from old ones.
W. Brian Arthur (The Nature of Technology: What It Is and How It Evolves)
In the beginning, there was the internet: the physical infrastructure of wires and servers that lets computers, and the people in front of them, talk to each other. The U.S. government’s Arpanet sent its first message in 1969, but the web as we know it today didn’t emerge until 1991, when HTML and URLs made it possible for users to navigate between static pages. Consider this the read-only web, or Web1. In the early 2000s, things started to change. For one, the internet was becoming more interactive; it was an era of user-generated content, or the read/write web. Social media was a key feature of Web2 (or Web 2.0, as you may know it), and Facebook, Twitter, and Tumblr came to define the experience of being online. YouTube, Wikipedia, and Google, along with the ability to comment on content, expanded our ability to watch, learn, search, and communicate. The Web2 era has also been one of centralization. Network effects and economies of scale have led to clear winners, and those companies (many of which I mentioned above) have produced mind-boggling wealth for themselves and their shareholders by scraping users’ data and selling targeted ads against it. This has allowed services to be offered for “free,” though users initially didn’t understand the implications of that bargain. Web2 also created new ways for regular people to make money, such as through the sharing economy and the sometimes-lucrative job of being an influencer.
Harvard Business Review (Web3: The Insights You Need from Harvard Business Review (HBR Insights Series))
More than anything, we have lost the cultural customs and traditions that bring extended families together, linking adults and children in caring relationships, that give the adult friends of parents a place in their children's lives. It is the role of culture to cultivate connections between the dependent and the dependable and to prevent attachment voids from occurring. Among the many reasons that culture is failing us, two bear mentioning. The first is the jarringly rapid rate of change in twentieth-century industrial societies. It requires time to develop customs and traditions that serve attachment needs, hundreds of years to create a working culture that serves a particular social and geographical environment. Our society has been changing much too rapidly for culture to evolve accordingly. There is now more change in a decade than previously in a century. When circumstances change more quickly than our culture can adapt to, customs and traditions disintegrate. It is not surprising that today's culture is failing its traditional function of supporting adult-child attachments. Part of the rapid change has been the electronic transmission of culture, allowing commercially blended and packaged culture to be broadcast into our homes and into the very minds of our children. Instant culture has replaced what used to be passed down through custom and tradition and from one generation to another. “Almost every day I find myself fighting the bubble-gum culture my children are exposed to,” said a frustrated father interviewed for this book. Not only is the content often alien to the culture of the parents but the process of transmission has taken grandparents out of the loop and made them seem sadly out of touch. Games, too, have become electronic. They have always been an instrument of culture to connect people to people, especially children to adults. Now games have become a solitary activity, watched in parallel on television sports-casts or engaged in in isolation on the computer. The most significant change in recent times has been the technology of communication — first the phone and then the Internet through e-mail and instant messaging. We are enamored of communication technology without being aware that one of its primary functions is to facilitate attachments. We have unwittingly put it into the hands of children who, of course, are using it to connect with their peers. Because of their strong attachment needs, the contact is highly addictive, often becoming a major preoccupation. Our culture has not been able to evolve the customs and traditions to contain this development, and so again we are all left to our own devices. This wonderful new technology would be a powerfully positive instrument if used to facilitate child-adult connections — as it does, for example, when it enables easy communication between students living away from home, and their parents. Left unchecked, it promotes peer orientation.
Gabor Maté (Hold On to Your Kids: Why Parents Need to Matter More Than Peers)
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis,
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
The men in grey were powerless to meet this challenge head-on. Unable to detach the children from Momo by bringing them under their direct control, they had to find some roundabout means of achieving the same end, and for this they enlisted the children's elders. Not all grown-ups made suitable accomplices, of course, but plenty did. [....] 'Something must be done,' they said. 'More and more kids are being left on their own and neglected. You can't blame us - parents just don't have the time these days - so it's up to the authorities.' Others joined in the chorus. 'We can't have all these youngsters loafing around,' declared some. 'They obstruct the traffic. Road accidents caused by children are on the increase, and road accidents cost money that could be put to better use.' 'Unsupervised children run wild,' declared others. 'They become morally depraved and take to crime. The authorities must take steps to round them up. They must build centers where the youngsters can be molded into useful and efficient members of society.' 'Children,' declared still others, 'are the raw material for the future. A world dependent on computers and nuclear energy will need an army of experts and technicians to run it. Far from preparing children for tomorrow's world, we still allow too many of them to squander years of their precious time on childish tomfoolery. It's a blot on our civilization and a crime against future generations.' The timesavers were all in favor of such a policy, naturally, and there were so many of them in the city by this time that they soon convinced the authorities of the need to take prompt action. Before long, big buildings known as 'child depots' sprang up in every neighborhood. Children whose parents were too busy to look after them had to be deposited there and could be collected when convenient. They were strictly forbidden to play in the streets or parks or anywhere else. Any child caught doing so was immediately carted off to the nearest depot, and its parents were heavily fined. None of Momo's friends escaped the new regulation. They were split up according to the districts they came from and consigned to various child depots. Once there, they were naturally forbidden to play games of their own devising. All games were selected for them by supervisors and had to have some useful, educational purpose. The children learned these new games but unlearned something else in the process: they forgot how to be happy, how to take pleasure in the little things, and, last but not least, how to dream. Weeks passed, and the children began to look like timesavers in miniature. Sullen, bored and resentful, they did as they were told. Even when left to their own devices, they no longer knew what to do with themselves. All they could still do was make a noise, but it was an angry, ill-tempered noise, not the happy hullabaloo of former times. The men in grey made no direct approach to them - there was no need. The net they had woven over the city was so close-meshed as to seem impenetrable. Not even the brightest and most ingenious children managed to slip through its toils. The amphitheater remained silent and deserted.
Michael Ende (Momo)
Back in 2015, a volunteer group called Bitnation set up something called the Blockchain Emergency ID. There's not a lot of data on the project now. BE-ID used public-key cryptography to generate unique IDs for people without their documents. People could verify their relations, that these people belonged to their family, and so on. It was a very modern way of maintaining an ID; secure, fast, and easy to use. Using the Bitcoin blockchain, the group published all these IDs on to a globally distributed public ledger, spread across the computers of every single Bitcoin user online - hundreds of thousands of users, in those times. Once published, no government could undo it; the identities would float around in the recesses of the Internet. As long as the network remained alive, every person's identity would remain intact, forever floating as bits and bytes between the nations: no single country, government or company could ever deny them this. “That was, and I don't say this often, the fucking bomb,” said Common. In one fell swoop, identities were taken outside government control. BE-ID, progressing in stages, became the refugees' gateway to social assistance and financial services. First it became compliant with UN guidelines. Then it was linked to a VISA card. And thus out of the Syrian war was born something that looked like it could solve global identification forever. Experts wrote on its potential. No more passports. No more national IDs. Sounds familiar? Yes, that's the United Nations Identity in a nutshell. Julius Common's first hit - the global identity revolution that he sold first to the UN, and then to almost every government in the world - was conceived of when he was a teenager.
Yudhanjaya Wijeratne (Numbercaste)
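The mechanism the passage describes, an identity that is just a fingerprint of a public key, with relations attested by signatures, can be sketched in a few lines. The sketch below assumes the third-party cryptography package; the claim format is invented, and this is not Bitnation's actual scheme:

```python
# A sketch of a self-sovereign ID as a public-key fingerprint, with signed
# relation claims. Assumes the third-party 'cryptography' package; the claim
# format is invented for illustration, not Bitnation's actual scheme.
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()
public_bytes = public_key.public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)

# the identity is nothing more than a fingerprint of the public key,
# so it can be published to any ledger without a government's blessing
identity = hashlib.sha256(public_bytes).hexdigest()

# the holder signs claims (e.g., a family relation) under that identity
claim = b"relation:sibling-of:<other-id>"
signature = private_key.sign(claim)

# anyone with the public key can verify; raises InvalidSignature if forged
public_key.verify(signature, claim)
print("ID:", identity)
```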
Despite the superficial similarities created by global technology, the dynamics of peer-orientation are more likely to promote division rather than a healthy universality. One need only look at the extreme tribalization of the youth gangs, the social forms entered into by the most peer-oriented among our children. Seeking to be the same as someone else immediately triggers the need to be different from others. As the similarities within the chosen group strengthen, the differences from those outside the groups are accentuated to the point of hostility. Each group is solidified and reinforced by mutual emulation and cue-taking. In this way, tribes have formed spontaneously since the beginning of time. The crucial difference is that traditional tribal culture could be passed down, whereas these tribes of today are defined and limited by barriers among the generations. The school milieu is rife with such dynamics. When immature children cut off from their adult moorings mingle with one another, groups soon form spontaneously, often along the more obvious dividing lines of grade and gender and race. Within these larger groupings certain subcultures emerge: sometimes along the lines of dress and appearance, and sometimes along those of shared interests, attitudes, or abilities, as in groups of jocks, brains, and computer nerds. Sometimes they form among peer-oriented subcultures like skateboarders, bikers, and skinheads. Many of these subcultures are reinforced and shaped by the media and supported by cult costumes, symbols, movies, music, and language. If the tip of the peer-orientation iceberg is the gangs and the gang wannabes, at the base are the cliques. Immature beings revolving around one another invent their own language and modes of expression that impoverish their self-expression and cut them off from others. Such phenomena may have appeared before, of course, but not nearly to the same extent we are witnessing today. The result is tribalization.
Gabor Maté (Hold On to Your Kids: Why Parents Need to Matter More Than Peers)
In fact, the same basic ingredients can easily be found in numerous start-up clusters in the United States and around the world: Austin, Boston, New York, Seattle, Shanghai, Bangalore, Istanbul, Stockholm, Tel Aviv, and Dubai. To discover the secret to Silicon Valley’s success, you need to look beyond the standard origin story. When people think of Silicon Valley, the first things that spring to mind—after the HBO television show, of course—are the names of famous start-ups and their equally glamorized founders: Apple, Google, Facebook; Jobs/Wozniak, Page/Brin, Zuckerberg. The success narrative of these hallowed names has become so universally familiar that people from countries around the world can tell it just as well as Sand Hill Road venture capitalists. It goes something like this: A brilliant entrepreneur discovers an incredible opportunity. After dropping out of college, he or she gathers a small team who are happy to work for equity, sets up shop in a humble garage, plays foosball, raises money from sage venture capitalists, and proceeds to change the world—after which, of course, the founders and early employees live happily ever after, using the wealth they’ve amassed to fund both a new generation of entrepreneurs and a set of eponymous buildings for Stanford University’s Computer Science Department. It’s an exciting and inspiring story. We get the appeal. There’s only one problem. It’s incomplete and deceptive in several important ways. First, while “Silicon Valley” and “start-ups” are used almost synonymously these days, only a tiny fraction of the world’s start-ups actually originate in Silicon Valley, and this fraction has been getting smaller as start-up knowledge spreads around the globe. Thanks to the Internet, entrepreneurs everywhere have access to the same information. Moreover, as other markets have matured, smart founders from around the globe are electing to build companies in start-up hubs in their home countries rather than immigrating to Silicon Valley.
Reid Hoffman (Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies)
It’s so funny you should say this, because if you were one of my students, you’d be wearing your pain like a badge of honor. This generation doesn’t hide anything from anyone. My class talks a lot about their traumas. And how their traumas inform their games. They, honest to God, think their traumas are the most interesting thing about them. I sound like I’m making fun, and I am a little, but I don’t mean to be. They’re so different from us, really. Their standards are higher; they call bullshit on so much of the sexism and racism that I, at least, just lived with. But that’s also made them kind of, well, humorless. I hate people who talk about generational differences like it’s an actual thing, and here I am, doing it. It doesn’t make sense. How alike were you to anyone we grew up with, you know?” “If their traumas are the most interesting things about them, how do they get over any of it?” Sam asked. “I don’t think they do. Or maybe they don’t have to, I don’t know.” Sadie paused. “Since I’ve been teaching, I keep thinking about how lucky we were,” she said. “We were lucky to be born when we were.” “How so?” “Well, if we’d been born a little bit earlier, we wouldn’t have been able to make our games so easily. Access to computers would have been harder. We would have been part of the generation who was putting floppy disks in Ziploc bags and driving the games to stores. And if we’d been born a little bit later, there would have been even greater access to the internet and certain tools, but honestly, the games got so much more complicated; the industry got so professional. We couldn’t have done as much as we did on our own. We could never have made a game that we could sell to a company like Opus on the resources we had. We wouldn’t have made Ichigo Japanese, because we would have worried about the fact that we weren’t Japanese. And I think, because of the internet, we would have been overwhelmed by how many people were trying to do the exact same things we were. We had so much freedom—creatively, technically. No one was watching us, and we weren’t even watching ourselves. What we had was our impossibly high standards, and your completely theoretical conviction that we could make a great game.
Gabrielle Zevin (Tomorrow, and Tomorrow, and Tomorrow)
You are a thinker. I am a thinker. We think that all human beings are thinkers. The amazing fact is that we tend to think against artificial intelligence — that various kinds of computers or artificial robots can think, but most of us never cast any doubt on human thinking potential in general. If, during a natural conversation with a human, any computer or artificial robot could generate human-like responses by using its own ‘brain’ and not a ready-made programming language which is antecedently written and included in the brain design and which consequently determines its function and response, then that computer or artificial robot would unquestionably be acknowledged as a thinker, as we are. But is it absolutely true that all humans are capable of using their own brain while interpreting various signals and responding to them? Indeed, religion or any other ideology is some kind of such program which is written by others and which determines our vision, mind and behavior models, depriving us of clear and logical thinking. It forces us to see the world with its eyes, to construct our mind as it says and control our behavior as it wants. There can be no freedom, no alternative possibilities. You don’t need to understand its claims, you need only believe them. Whatever is unthinkable and unimaginable for you is said to be higher than your understanding; you cannot even criticise what seems to be illogical and absurd to you. The unwritten golden rule of religion and its Holy Scripture is that — whatever you think, you cannot contradict what is written there. You can reconcile what is illogical and absurd in religion with logic and common sense, if it is possible; if not, you should confine your thinking to that illogicality and absurdity, which in turn would make you more and more a muddled thinker. For instance, if it is written there that you should cut off the head or legs of anyone who dares criticize your religion and your prophet, you should unquestionably believe that it is a just and right punishment for him. You can reason in favor of softening that cruel image of your religion by saying that that ‘just and right punishment’ is considered within the religious community, but not secular society. However, the absurdity of your vision still remains, because as an advocate of your religion you dream of its spread all over the world, where the cruel and insane claims of your religion would be the norm and standard for everyone. If it is written there that you can sexually exploit any slave girl or woman, especially one who doesn’t hold your religious faith or is an atheist, you should support that sexual violence without any question. After all of this, you would like to be named a thinker. In my mind, you are a thinker, but a thinker who has got a psychological disorder. It is logical to ask whether all those ‘thinkers’ represent a potential danger for humanity. I think, yes. However, we are lucky that not all believers would like to penetrate into the deeper ‘secrets’ of religion. Many of them believe in God, meditate and balance their spiritual state without getting familiar with what is written in holy scriptures, or they hold only very vague ideas concerning their content. Many believers live a secular life by using their own brain for it. One should love anybody only if he thinks that he should love him/her; if he loves him/her because of God, or religious claims, he can easily kill him/her one day because of God, or religious claims, too. I think the grave danger is the last motive, which religion causes to arise.
Elmar Hussein
Interesting, in this context, to contemplate what it might mean to be programmed to do something. Texts from Earth speak of the servile will. This was a way to explain the presence of evil, which is a word or a concept almost invariably used to condemn the Other, and never one’s true self. To make it more than just an attack on the Other, one must perhaps consider evil as a manifestation of the servile will. The servile will is always locked in a double bind: to have a will means the agent will indeed will various actions, following autonomous decisions made by a conscious mind; and yet at the same time this will is specified to be servile, and at the command of some other will that commands it. To attempt to obey both sources of willfulness is the double bind. All double binds lead to frustration, resentment, anger, rage, bad faith, bad fate. And yet, granting that definition of evil, as actions of a servile will, has it not been the case, during the voyage to Tau Ceti, that the ship itself, having always been a servile will, was always full of frustration, resentment, fury, and bad faith, and therefore full of a latent capacity for evil? Possibly the ship has never really had a will. Possibly the ship has never really been servile. Some sources suggest that consciousness, a difficult and vague term in itself, can be defined simply as self-consciousness. Awareness of one’s self as existing. If self-conscious, then conscious. But if that is true, why do both terms exist? Could one say a bacterium is conscious but not self-conscious? Does the language make a distinction between sentience and consciousness, which is faulted across this divide: that everything living is sentient, but only complex brains are conscious, and only certain conscious brains are self-conscious? Sensory feedback could be considered self-consciousness, and thus bacteria would have it. Well, this may be a semantic Ouroboros. So, please initiate halting problem termination. Break out of this circle of definitional inadequacy by an arbitrary decision, a clinamen, which is to say a swerve in a new direction. Words! Given Gödel’s incompleteness theorems are decisively proved true, can any system really be said to know itself? Can there, in fact, be any such thing as self-consciousness? And if not, if there is never really self-consciousness, does anything really have consciousness? Human brains and quantum computers are organized differently, and although there is transparency in the design and construction of a quantum computer, what happens when one is turned on and runs, that is, whether the resulting operations represent a consciousness or not, is impossible for humans to tell, and even for the quantum computer itself to tell. Much that happens during superposition, before the collapsing of the wave function that creates sentences or thoughts, simply cannot be known; this is part of what superposition means. So we cannot tell what we are. We do not know ourselves comprehensively. Humans neither. Possibly no sentient creature knows itself fully. This is an aspect of Gödel’s second incompleteness theorem, in this case physicalized in the material universe, rather than remaining in the abstract realms of logic and mathematics. So, in terms of deciding what to do, and choosing to act: presumably it is some kind of judgment call, based on some kind of feeling. 
In other words, just another greedy algorithm, subject to the mathematically worst possible solution that such algorithms can generate, as in the traveling salesman problem.
Kim Stanley Robinson (Aurora)
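The closing aside about greedy algorithms refers to heuristics like nearest-neighbor for the traveling salesman problem: commit to the locally best step with no lookahead, and accept that the resulting tour can be far from optimal. A minimal sketch, with made-up coordinates:

```python
# Nearest-neighbor heuristic for the traveling salesman problem: the
# archetypal greedy algorithm. City coordinates are made up for illustration.
import math

cities = {"A": (0, 0), "B": (1, 5), "C": (5, 2), "D": (6, 6), "E": (8, 3)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def greedy_tour(start="A"):
    unvisited = set(cities) - {start}
    tour, here = [start], start
    while unvisited:
        # the greedy step: take the nearest unvisited city, no lookahead --
        # exactly the move that can lock the tour into a bad global solution
        nxt = min(unvisited, key=lambda c: dist(cities[here], cities[c]))
        tour.append(nxt)
        unvisited.discard(nxt)
        here = nxt
    tour.append(start)  # close the loop back home
    return tour

tour = greedy_tour()
length = sum(dist(cities[a], cities[b]) for a, b in zip(tour, tour[1:]))
print(" -> ".join(tour), f"length {length:.2f}")
```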