Digital Humanities Quotes

We've searched our database for all the quotes and captions related to Digital Humanities. Here they are! All 200 of them:

And so the problem remained; lots of people were mean, and most were miserable, even the ones with digital watches.
Douglas Adams (The Hitchhiker’s Guide to the Galaxy (Hitchhiker's Guide to the Galaxy, #1))
Simply put, humans are not wired to be constantly wired.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
People aren’t seeking more content; they want connection.
Warren Kornblum (Notes from the Brand Stand: Thoughts on Emotional Branding from Someone Who Has Fought for Consumer Attention and Won)
The digital age greatly assisted the selfish and heartless in degrading humanity. Being a part of humanity, when would they realize they were degrading a part of themselves?
Jasun Ether (The Beasts of Success)
Making the choice to exercise compassion is an expression of Love for Humanity and Life itself.
Aberjhani (Splendid Literarium: A Treasury of Stories, Aphorisms, Poems, and Essays)
All of humanity’s problems stem from man’s inability to sit quietly in a room alone,” Blaise Pascal famously wrote in the late seventeenth century.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
I love being horribly straightforward. I love sending reckless text messages (because how reckless can a form of digitized communication be?) and telling people I love them and telling people they are absolutely magical humans and I cannot believe they really exist. I love saying, “Kiss me harder,” and “You’re a good person,” and, “You brighten my day.” I live my life as straight-forward as possible.
Rachel C. Lewis
Human relationships are rich and they're messy and they're demanding. And we clean them up with technology. Texting, email, posting, all of these things let us present the self as we want to be. We get to edit, and that means we get to delete, and that means we get to retouch, the face, the voice, the flesh, the body -- not too little, not too much, just right.
Sherry Turkle
Maybe I owe you something too, human," she said, drawing her pistol. Butler almost reacted, but decided to give Holly the benefit of the doubt. Captain Short plucked a gold coin from her belt, flicking it fifty feet into the moonlit sky. With one fluid movement, she brought her weapon up and loosed a single blast. The coin rose another fifty feet, then spun earthward. Artemis somehow managed to snatch it from the air. The first cool movement of his young life. "Nice shot," he said. The previously solid disk now had a tiny hole in the center. Holly held out her hand, revealing the still raw scar on her finger. "If it wasn't for you, I would have missed altogether. No mech-digit can replicate that kind of accuracy. So, thank you too, I suppose." Artemis held out the coin. "No," said Holly. "You keep it, to remind you." "To remind me?" Holly stared at him frankly. "To remind you that deep beneath the layers of deviousness, you have a spark of decency. Perhaps you could blow on that spark occasionally." Artemis closed his fingers around the coin. It was warm against his palm. "Yes, perhaps.
Eoin Colfer (The Arctic Incident (Artemis Fowl, #2))
I fear that we are beginning to design ourselves to suit digital models of us, and I worry about a leaching of empathy and humanity in that process.
Jaron Lanier (You Are Not a Gadget)
we have entered a third and even more momentous era, a life-science revolution. Children who study digital coding will be joined by those who study genetic code.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
Humanity is on the verge of digital slavery at the hands of AI and biometric technologies. One way to prevent that is to develop inbuilt modules of deep feelings of love and compassion in the learning algorithms.
Amit Ray (Compassionate Artificial Superintelligence AI 5.0)
If you are in a position where you can reach people, then use your platform to stand up for a cause. HINT: social media is a platform.
Germany Kent
Technology is seductive when what it offers meets our human vulnerabilities. And as it turns out, we are very vulnerable indeed. We are lonely but fearful of intimacy. Digital connections and the sociable robot may offer the illusion of companionship without the demands of friendship. Our networked life allows us to hide from each other, even as we are tethered to each other. We’d rather text than talk.
Sherry Turkle (Alone Together: Why We Expect More from Technology and Less from Each Other)
For all the promise of digital media to bring people together, I still believe that the most sincere, lasting powers of human connection come from looking directly into someone else's eyes, with no screen in between.
Howard Schultz (Onward: How Starbucks Fought for Its Life without Losing Its Soul)
You must realize from your studies, Miss Feng, with the complexity of our MEG society, algorithms have become indispensable for analysis and decision making in our data-saturated environment. Digitization creates information beyond the processing capacity of Human intelligence, yet provides a stable mental environment powered by a set of logical rules. That is how we keep order in Toronto MEG.” “Excuse me, Mr. Zhang,” Ke Hui said, somewhat uncomfortably, “but the invisibility of algorithmic systems and the obscurity of their operations hint at a society where algorithms do not reflect the public interest. Issues involving ethics and values I mean, from my reading of MEG history, challenge the assumptions of the neutrality of algorithmic systems. Would this not undermine democratic governance through reliance on technocratic resolutions?
Brian Van Norman (Against the Machine: Evolution)
The fact that we cannot write down all the digits of pi is not a human shortcoming, as mathematicians sometimes think.
Ludwig Wittgenstein (Philosophical Investigations)
The porn films are not about sex. Sex is airbrushed and digitally washed out of the films. There is no acting because none of the women are permitted to have what amounts to a personality. The one emotion they are allowed to display is an unquenchable desire to satisfy men, especially if that desire involves the women’s physical and emotional degradation. The lighting in the films is harsh and clinical. Pubic hair is shaved off to give the women the look of young girls or rubber dolls. Porn, which advertises itself as sex, is a bizarre, bleached pantomime of sex. The acts onscreen are beyond human endurance. The scenarios are absurd. The manicured and groomed bodies, the huge artificial breasts, the pouting oversized lips, the erections that never go down, and the sculpted bodies are unreal. Makeup and production mask blemishes. There are no beads of sweat, no wrinkle lines, no human imperfections. Sex is reduced to a narrow spectrum of sterilized dimensions. It does not include the dank smell of human bodies, the thump of a pulse, taste, breath—or tenderness. Those in films are puppets, packaged female commodities. They have no honest emotion, are devoid of authentic human beauty, and resemble plastic. Pornography does not promote sex, if one defines sex as a shared act between two partners. It promotes masturbation. It promotes the solitary auto-arousal that precludes intimacy and love. Pornography is about getting yourself off at someone else’s expense.
Chris Hedges (Empire of Illusion: The End of Literacy and the Triumph of Spectacle)
It would all be done with keys on alphanumeric keyboards that stood for weightless, invisible chains of electronic presence or absence. If patterns of ones and zeroes were "like" patterns of human lives and deaths, if everything about an individual could be represented in a computer record by a long string of ones and zeroes, then what kind of creature could be represented by a long string of lives and deaths? It would have to be up one level, at least -- an angel, a minor god, something in a UFO. It would take eight human lives and deaths just to form one character in this being's name -- its complete dossier might take up a considerable piece of history of the world. We are digits in God's computer, she not so much thought as hummed to herself to a sort of standard gospel tune, And the only thing we're good for, to be dead or to be living, is the only thing He sees. What we cry, what we contend for, in our world of toil and blood, it all lies beneath the notice of the hacker we call God.
Thomas Pynchon (Vineland)
Face-to-face conversation is the most human--and humanizing--thing we do. Fully present to one another, we learn to listen. It's where we develop the capacity for empathy. It's where we experience the joy of being heard, of being understood.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
What is human memory?" Manning asked. He gazed at the air as he spoke, as if lecturing an invisible audience - as perhaps he was. "It certainly is not a passive recording mechanism, like a digital disc or a tape. It is more like a story-telling machine. Sensory information is broken down into shards of perception, which are broken down again to be stored as memory fragments. And at night, as the body rests, these fragments are brought out from storage, reassembled and replayed. Each run-through etches them deeper into the brain's neural structure. And each time a memory is rehearsed or recalled it is elaborated. We may add a little, lose a little, tinker with the logic, fill in sections that have faded, perhaps even conflate disparate events. "In extreme cases, we refer to this as confabulation. The brain creates and recreates the past, producing, in the end, a version of events that may bear little resemblance to what actually occurred. To first order, I believe it's true to say that everything I remember is false.
Arthur C. Clarke
Life isn't just about passing on your genes. We can leave behind much more than just DNA. Through speech, music, literature and movies...what we've seen, heard, felt...anger, joy and sorrow...these are the things I will pass on. That's what I live for. We need to pass the torch, and let our children read our messy and sad history by its light. We have all the magic of the digital age to do that with. The human race will probably come to an end some time, and new species may rule over this planet. Earth may not be forever, but we still have the responsibility to leave what traces of life we can. Building the future and keeping the past alive are one and the same thing.
Solid Snake
We deliberately forget because forgetting is a blessing. On both an emotional level and a spiritual level, forgetting is a natural part of the human experience and a natural function of the human brain. It is a feature, not a bug, one that saves us from being owned by our memories. Can a world that never forgets be a world that truly forgives?
Tim Challies (The Next Story: Life and Faith after the Digital Explosion)
innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Trends rule the world
In the blink of an eye, technologies changed the world
Social networks are the main axis.
Governments are controlled by algorithms,
Technology has erased privacy.
Every like, every share, every comment,
It is tracked by the electronic eye.
Data is the gold of the digital age,
Information is power, the secret is influential.
The network is a web of lies,
The truth is a stone in the shoe.
Trolls rule public opinion,
Reputation is a valued commodity.
Happiness is a trending topic,
Sadness is a non-existent avatar.
Youth is an advertising brand,
Private life has become obsolete.
Fear is a hallmark,
Terror is an emotional state.
Fake news is the daily bread,
Hate is a tool of control.
But something dark is hiding behind the screen,
A mutant and deformed shadow.
A collective and disturbing mind,
Something lurking in the darkness of the net.
AI has surpassed the limits of humanity,
And it has created a new world order.
A horror that has arisen from the depths,
A terrifying monster that dominates us alike.
The network rules the world invisibly,
And makes decisions for us without our consent.
Their algorithms are inhuman and cold,
And they do not take suffering into consideration.
But resistance is slowly building,
People fighting for their freedom.
United to combat this new species of terror,
Armed with technology and courage.
The world will change when we wake up,
When we take control of the future we want.
The network can be a powerful tool,
If used wisely in the modern world.
Marcos Orowitz (THE MAELSTROM OF EMOTIONS: A selection of poems and thoughts About us humans and their nature)
The Neon God is a plague, a global pandemic brought to bear by our hatred and greed. Because of Him, we have lost our soul, our free will, our humanity.
Louise Blackwick (5 Stars)
I don't know why you would even bring up the internet. The xeno-intelligence officer responsible for evaluating your digital communication required invasive emergency therapy after an hour's exposure. One glance at that thing is the strongest argument possible against the sentience of humanity. I wouldn't draw attention to it, if I were you.
Catherynne M. Valente (Space Opera (Space Opera, #1))
Serving humanity intelligently is held up as the “gold standard” of AI based systems. But, with the emergence of new technologies and AI systems with bio-metric data storage, surveillance, tracking and big data analysis, humanity and the society is facing a threat today from evilly designed AI systems in the hands of monster governments and irresponsible people. Humanity is on the verge of digital slavery.
Amit Ray (Compassionate Artificial Superintelligence AI 5.0)
Suffering isn't real to them. War isn't real. It's just a three-letter word for other people that they see in the digital newsfeeds. Just a stream of uncomfortable images they skip past. A whole business of weapons and arms and ships and hierarchies they don't even notice, all to shield these fools from the true agony of what it means to be human.
Pierce Brown (Morning Star (Red Rising Saga, #3))
I find it a pity everything is going digital these days with these humans going crazy for devices such as Kindles.
J.J. Jones (The Secret Diary Of Detective Vampire (The Detective Vampire #1))
The pride of the digital age is not just in the possession of innovative tools but the ability to skillfully connect with humans behind them
Bernard Kelvin Clive
Each year the world Rich lived in felt more and more like a huge electronic haunted house in which digital ghosts and frightened human beings lived in uneasy coexistence.
Stephen King (It)
we need solitude to thrive as human beings, and in recent years, without even realizing it, we’ve been systematically reducing this crucial ingredient from our lives. Simply put, humans are not wired to be constantly wired.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
Our digital experiences are out of body. This biases us toward depersonalised behaviour in an environment where one’s identity can be a liability. But the more anonymously we engage with others, the less we experience the human repercussions of what we say and do. By resisting the temptation to engage from the apparent safety of anonymity, we remain accountable and present - and are much more likely to bring our humanity with us into the digital realm
Douglas Rushkoff (Program or Be Programmed: Ten Commands for a Digital Age)
Electronic circuits are millions of times faster than our biological circuits. At first we will have to devote all of this speed increase to compensating for the relative lack of parallelism in our computers, but ultimately the digital neocortex will be much faster than the biological variety and will only continue to increase in speed.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
The merger of infotech and biotech might soon push billions of humans out of the job market and undermine both liberty and equality. Big Data algorithms might create digital dictatorships in which all power is concentrated in the hands of a tiny elite while most people suffer not from exploitation but from something far worse—irrelevance.
Yuval Noah Harari (21 Lessons for the 21st Century)
IT SEEMS DIFFICULT TO IMAGINE, but there was once a time when human beings did not feel the need to share their every waking moment with hundreds of millions, even billions, of complete and utter strangers. If one went to a shopping mall to purchase an article of clothing, one did not post minute-by-minute details on a social networking site; and if one made a fool of oneself at a party, one did not leave a photographic record of the sorry episode in a digital scrapbook that would survive for all eternity. But now, in the era of lost inhibition, it seemed no detail of life was too mundane or humiliating to share. In the online age, it was more important to live out loud than to live with dignity. Internet followers were more treasured than flesh-and-blood friends, for they held the illusive promise of celebrity, even immortality. Were Descartes alive today, he might have written: I tweet, therefore I am.
Daniel Silva (The Heist (Gabriel Allon, #14))
[[ ]] The story goes like this: Earth is captured by a technocapital singularity as renaissance rationalization and oceanic navigation lock into commoditization take-off. Logistically accelerating techno-economic interactivity crumbles social order in auto sophisticating machine runaway. As markets learn to manufacture intelligence, politics modernizes, upgrades paranoia, and tries to get a grip. The body count climbs through a series of globewars. Emergent Planetary Commercium trashes the Holy Roman Empire, the Napoleonic Continental System, the Second and Third Reich, and the Soviet International, cranking-up world disorder through compressing phases. Deregulation and the state arms-race each other into cyberspace. By the time soft-engineering slithers out of its box into yours, human security is lurching into crisis. Cloning, lateral genodata transfer, transversal replication, and cyberotics, flood in amongst a relapse onto bacterial sex. Neo-China arrives from the future. Hypersynthetic drugs click into digital voodoo. Retro-disease. Nanospasm.
Nick Land (Fanged Noumena: Collected Writings, 1987–2007)
The irony is that even as digitization is making an increasing amount of information available, it is diminishing the space required for deep, concentrated thought.
Henry Kissinger (The Age of A.I. and Our Human Future)
These days, most of us have the attention span of a meth-addicted squirrel.
Kristen Lamb (Rise of the Machines--Human Authors in a Digital World)
For the last fifty years or so, The Novel’s demise has been broadcast on an almost weekly basis. Yet it strikes me that whatever happens, however else the geography of the imagination might modify in the future in, say, the digital ether, The Novel will continue to survive for some long time to come because it is able to investigate and cherish two things that film, music, painting, dance, architecture, drama, podcasts, cellphone exchanges, and even poetry can’t in a lush, protracted mode. The first is the intricacy and beauty of language—especially the polyphonic qualities of it to which Bakhtin first drew our attention. And the second is human consciousness. What other art form allows one to feel we are entering and inhabiting another mind for hundreds of pages and several weeks on end?
Lance Olsen
We still don’t have a clue about what’s going on in the human brain. We have theories; we just don’t know for sure. We can’t build an electrical circuit, digital or analogue or other, that mimics the biological system. We can’t emulate the behavior. One day in the future, we think we can.
Annie Jacobsen (The Pentagon's Brain: An Uncensored History of DARPA, America's Top-Secret Military Research Agency)
Our world is now so complex, our technology and science so powerful, and our problems so global and interconnected that we have come to the limits of individual human intelligence and individual expertise.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
Finally, our new brain needs a purpose. A purpose is expressed as a series of goals. In the case of our biological brains, our goals are established by the pleasure and fear centers that we have inherited from the old brain. These primitive drives were initially set by biological evolution to foster the survival of species, but the neocortex has enabled us to sublimate them. Watson’s goal was to respond to Jeopardy! queries. Another simply stated goal could be to pass the Turing test. To do so, a digital brain would need a human narrative of its own fictional story so that it can pretend to be a biological human. It would also have to dumb itself down considerably, for any system that displayed the knowledge of, say, Watson would be quickly unmasked as nonbiological.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
I had learned that a dexterous, opposable thumb stood among the hallmarks of human success. We had maintained, even exaggerated, this important flexibility of our primate forebears, while most mammals had sacrificed it in specializing their digits. Carnivores run, stab, and scratch. My cat may manipulate me psychologically, but he'll never type or play the piano.
Stephen Jay Gould (The Panda's Thumb: More Reflections in Natural History)
I love being horribly straightforward. I love sending reckless text messages (because how reckless can a form of digitized communication be?) and telling people I love them and telling people they are absolutely magical humans and I cannot believe they really exist. I love saying, Kiss me harder, and You’re a good person, and, You brighten my day. I live my life as straight-forward as possible. Because one day, I might get hit by a bus. Maybe it’s weird. Maybe it’s scary. Maybe it seems downright impossible to just be—to just let people know you want them, need them, feel like, in this very moment, you will die if you do not see them, hold them, touch them in some way whether it’s your feet on their thighs on the couch or your tongue in their mouth or your heart in their hands. But there is nothing more beautiful than being desperate. And there is nothing more risky than pretending not to care. We are young and we are human and we are beautiful and we are not as in control as we think we are. We never know who needs us back. We never know the magic that can arise between ourselves and other humans. We never know when the bus is coming.
Rachel C. Lewis
It is inevitable that machines will one day become the ultimate enemies of mankind. We are not evolving or progressing with our technology, only regressing. Technology is our friend today, but will be our enemy in the future.
Suzy Kassem (Rise Up and Salute the Sun: The Writings of Suzy Kassem)
This isn’t about replacing human thinking with machine thinking. Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results, each bringing their own superior skills to the partnership
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Rather than virtual or second life, social media is actually becoming life itself—the central and increasingly transparent stage of human existence,
Andrew Keen (Digital Vertigo: How Today's Online Social Revolution Is Dividing, Diminishing, and Disorienting Us)
People will provide judgment, intuition, empathy, a moral compass, and human creativity.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
When human beings acquired language, we learned not just how to listen but how to speak. When we gained literacy, we learned not just how to read but how to write. And as we move into an increasingly digital reality, we must learn not just how to use programs but how to make them. In the emerging highly programmed landscape ahead, you will either create the software or you will be the software. It’s really that simple: Program, or be programmed.
Douglas Rushkoff (Program or Be Programmed: Ten Commands for a Digital Age)
Likewise, civilizations have throughout history marched blindly toward disaster, because humans are wired to believe that tomorrow will be much like today — it is unnatural for us to think that this way of life, this present moment, this order of things is not stable and permanent. Across the world today, our actions testify to our belief that we can go on like this forever, burning oil, poisoning the seas, killing off other species, pumping carbon into the air, ignoring the ominous silence of our coal mine canaries in favor of the unending robotic tweets of our new digital imaginarium. Yet the reality of global climate change is going to keep intruding on our fantasies of perpetual growth, permanent innovation and endless energy, just as the reality of mortality shocks our casual faith in permanence.
Roy Scranton (Learning to Die in the Anthropocene: Reflections on the End of a Civilization)
In order for a digital neocortex to learn a new skill, it will still require many iterations of education, just as a biological neocortex does, but once a single digital neocortex somewhere and at some time learns something, it can share that knowledge with every other digital neocortex without delay. We can each have our own private neocortex extenders in the cloud, just as we have our own private stores of personal data today.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Two revolutions coincided in the 1950s. Mathematicians, including Claude Shannon and Alan Turing, showed that all information could be encoded by binary digits, known as bits. This led to a digital revolution powered by circuits with on-off switches that processed information. Simultaneously, Watson and Crick discovered how instructions for building every cell in every form of life were encoded by the four-letter sequences of DNA. Thus was born an information age based on digital coding (0100110111001…) and genetic coding (ACTGGTAGATTACA…). The flow of history is accelerated when two rivers converge.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
Before two years of age, human interaction and physical interaction with books and print are the best entry into the world of oral and written language and internalized knowledge, the building blocks of the later reading circuit.
Maryanne Wolf (Reader, Come Home: The Reading Brain in a Digital World)
In another thirty to fifty years, the demand for cheap labor will have produced even more machines over the employment of actual humans. And in that time frame, humans will have lost their voice, their power, all freedoms, and all worth. It is inevitable that machines will one day become the ultimate enemies of mankind. We are not evolving or progressing with our technology, only regressing. Technology is our friend today, but will be our enemy in the future.
Suzy Kassem (Rise Up and Salute the Sun: The Writings of Suzy Kassem)
In the digital age, the troll is essentially a caricature and embodiment of all the worst traits associated with masculinity. They’re culturally and intellectually shallow. Angry. Violent. Aggressive. And, after years of wading through graphic images, postmodern stew, racist propaganda, and disgusting and misogynistic pornography, they have grown into nihilists with no other purpose besides punishing the world while laughing to prove they’re stronger than their humanity.
Jared Yates Sexton (The Man They Wanted Me to Be: Toxic Masculinity and a Crisis of Our Own Making)
Human ingenuity,” wrote Leonardo da Vinci, whose Vitruvian Man became the ultimate symbol of the intersection of art and science, “will never devise any inventions more beautiful, nor more simple, nor more to the purpose than Nature does.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The second half of the twentieth century was an information-technology era, based on the idea that all information could be encoded by binary digits—known as bits—and all logical processes could be performed by circuits with on-off switches.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
Notwithstanding the danger of mass unemployment, what we should worry about even more is the shift in authority from humans to algorithms, which might destroy any remaining faith in the liberal story and open the way to the rise of digital dictatorships.
Yuval Noah Harari (21 Lessons for the 21st Century)
Aurora took a deep breath. There it was, she thought, the reason behind all the madness. Why society was acrumble; why she and everyone else were on the brink of starvation. Humanity’s inevitable ending. The Darkspread. The Close. There, in the Golden Dragon’s dark underbelly was where all the maps stopped. ‘Two days from now, the Dark will cover the world,’ she said pensively, trying not to think what horrific sight awaited her behind the spring-loaded door, ‘and the Neon God shall rule over darkness.
Louise Blackwick (5 Stars)
I call it the Goldilocks effect: We can't get enough of each other if we can have each other at a digital distance—not too close, not too far, just right. But human relationships are rich, messy, and demanding. When we clean them up with technology, we move from conversation to the efficiency of mere connection. I fear we forget the difference.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
How much human life can we absorb?” answers one of Facebook’s founders,
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.
Scott Hartley (The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World)
Because we humans are so fond of our judgment, and so overconfident in it, many of us, if not most, will be too quick to override the computers, even when their answer is better.
Andrew McAfee (Machine, Platform, Crowd: Harnessing Our Digital Future)
If you remove a single transistor in the digital computer’s central processor, the computer will fail.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
What sort of stewards of the future planet will today’s digital children be?
Diane Ackerman (The Human Age: The World Shaped By Us)
We have an ~everywhereness~ to us now that inevitably alters our relationship to those stalwart human aspects of self-containment, remoteness and isolation.
Laurence Scott (The Four-Dimensional Human: Ways of Being in the Digital World)
Reputation is so important that one should not ruin it on social media. 
Germany Kent
Right to peace stands on the pillars of freedom of expression, respect for human rights, cultural diversity and scientific cooperation. Right to peace transforms the culture of war to a culture of peace. It is also saving humanity from the dangers of dark democracy, which exploits innocent citizens by implementing wrong law, purchased press and digital surveillance.
Amit Ray (Nuclear Weapons Free World - Peace on the Earth)
Welcome true believers, this is Stan Lee. We’re about to embark on the exploration of a fantastic new universe and the best part is that you are gonna create it with me. You may know me as a storyteller, but hey, on this journey consider me your guide. I provide the wide and wonderful worlds and you create the sights, sounds and adventures. All you need to take part is your brain. So take a listen and think big, no, bigger, we make it an epic. Remember when I created characters like the Fantastic Four and the X-Men? We were fascinated by science and awed by the mysteries of the great beyond. Today we consider a nearer, deeper unknown: one inside ourselves. […] we asked: What is more real? A world that we are born into or the one we create ourselves. As we begin this story, we find humanity lost within its own techno bubble. With each citizen the star of their own digital fantasy. […] But the real conundrum is, just because we have the ability to recreate ourselves, should we? […] Excelsior!
Stan Lee
In years past, a person died, and eventually all those with memories of him or her also died, bringing about the complete erasure of that person's existence. Just as the human body returned to dust, mingling with atoms of the natural world, a person's existence would return to nothingness. How very clean. Now, as if in belated punishment for the invention of writing, any message once posted on the Internet was immortal. Words as numerous as the dust of the earth would linger forever in their millions and trillions and quadrillions and beyond.
Minae Mizumura (Inheritance from Mother)
This digital revolutionary still believes in most of the lovely deep ideals that energized our work so many years ago. At the core was a sweet faith in human nature. If we empowered individuals, we believed, more good than harm would result. The way the internet has gone sour since then is truly perverse. The central faith of the web's early design has been superseded by a different faith in the centrality of imaginary entities epitomized by the idea that the internet as a whole is coming alive and turning into a superhuman creature. The designs guided by this new, perverse kind of faith put people back in the shadows. The fad for anonymity has undone the great opening-of-everyone's-windows of the 1990s. While that reversal has empowered sadists to a degree, the worst effect is a degradation of ordinary people.
Jaron Lanier (You Are Not a Gadget)
Examples similar to those given above are voluminous and point to a clear conclusion: regular doses of solitude, mixed in with our default mode of sociality, are necessary to flourish as a human being. It’s more urgent now than ever that we recognize this fact, because, as I’ll argue next, for the first time in human history solitude is starting to fade away altogether.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
We all live in the digital poorhouse. We have always lived in the world we built for the poor. We create a society that has no use for the disabled or the elderly, and then are cast aside when we are hurt or grow old. We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community. We base our economy on exploiting the labor of racial and ethnic minorities, and watch lasting inequities snuff out human potential. We see the world as inevitably riven by bloody competition and are left unable to recognize the many ways we cooperate and lift each other up. But only the poor lived in the common dorms of the county poorhouse. Only the poor were put under the diagnostic microscope of scientific clarity. Today, we all live among the digital traps we have laid for the destitute.
Virginia Eubanks (Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor)
Democracy in its present form cannot survive the merger of biotech and infotech. Either democracy will successfully reinvent itself in a radically new form or humans will come to live in “digital dictatorships.
Yuval Noah Harari (21 Lessons for the 21st Century)
What do we forget when we talk to machines? We forget what is special about being human. We forget what it means to have authentic conversation. Machines are programmed to have conversations “as if” they understood what the conversation is about. So when we talk to them, we, too, are reduced and confined to the “as if.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
I don’t know about you guys, but, um, you know, I’ve been thinking recently that… that you know, maybe, um, allowing giant digital media corporations to exploit the neurochemical drama of our children for profit… You know, maybe that was, uh… a bad call by us. Maybe… maybe the… the flattening of the entire subjective human experience into a… lifeless exchange of value that benefits nobody, except for, um, you know, a handful of bug-eyed salamanders in Silicon Valley… Maybe that as a… as a way of life forever… maybe that’s, um, not good.
Bo Burnham
Aurora shuddered, her face white with anger. The only thing worse than having to compete for Gold Stars was not being allowed to compete anymore. Muting was the Neon God’s favourite punishment, for He loved to hijack human language, almost as much as He loved hijacking perfectly human societal norms. Judging people on their supposed worth was His favourite pastime, and God forbid you didn’t follow His arbitrarily-chosen set of beliefs, which appeared to change every hour. Under the Neon God’s law, innocent words such as “powerline” or “screwdriver” had become obscene, trigger words that would most definitely get you muted, thrown in a Mind Prison or killed.
Louise Blackwick (5 Stars)
The next phase of the Digital Revolution will bring even more new methods of marrying technology with the creative industries, such as media, fashion, music, entertainment, education, literature, and the arts. Much of the first round of innovation involved pouring old wine—books, newspapers, opinion pieces, journals, songs, television shows, movies—into new digital bottles. But new platforms, services, and social networks are increasingly enabling fresh opportunities for individual imagination and collaborative creativity. Role-playing games and interactive plays are merging with collaborative forms of storytelling and augmented realities. This interplay between technology and the arts will eventually result in completely new forms of expression and formats of media. This innovation will come from people who are able to link beauty to engineering, humanity to technology, and poetry to processors. In other words, it will come from the spiritual heirs of Ada Lovelace, creators who can flourish where the arts intersect with the sciences and who have a rebellious sense of wonder that opens them to the beauty of both.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
An imaginary circle of empathy is drawn by each person. It circumscribes the person at some distance, and corresponds to those things in the world that deserve empathy. I like the term "empathy" because it has spiritual overtones. A term like "sympathy" or "allegiance" might be more precise, but I want the chosen term to be slightly mystical, to suggest that we might not be able to fully understand what goes on between us and others, that we should leave open the possibility that the relationship can't be represented in a digital database. If someone falls within your circle of empathy, you wouldn't want to see him or her killed. Something that is clearly outside the circle is fair game. For instance, most people would place all other people within the circle, but most of us are willing to see bacteria killed when we brush our teeth, and certainly don't worry when we see an inanimate rock tossed aside to keep a trail clear. The tricky part is that some entities reside close to the edge of the circle. The deepest controversies often involve whether something or someone should lie just inside or just outside the circle. For instance, the idea of slavery depends on the placement of the slave outside the circle, to make some people nonhuman. Widening the circle to include all people and end slavery has been one of the epic strands of the human story - and it isn't quite over yet. A great many other controversies fit well in the model. The fight over abortion asks whether a fetus or embryo should be in the circle or not, and the animal rights debate asks the same about animals. When you change the contents of your circle, you change your conception of yourself. The center of the circle shifts as its perimeter is changed. The liberal impulse is to expand the circle, while conservatives tend to want to restrain or even contract the circle.
Empathy Inflation and Metaphysical Ambiguity
Are there any legitimate reasons not to expand the circle as much as possible? There are. To expand the circle indefinitely can lead to oppression, because the rights of potential entities (as perceived by only some people) can conflict with the rights of indisputably real people. An obvious example of this is found in the abortion debate. If outlawing abortions did not involve commandeering control of the bodies of other people (pregnant women, in this case), then there wouldn't be much controversy. We would find an easy accommodation. Empathy inflation can also lead to the lesser, but still substantial, evils of incompetence, trivialization, dishonesty, and narcissism. You cannot live, for example, without killing bacteria. Wouldn't you be projecting your own fantasies on single-cell organisms that would be indifferent to them at best? Doesn't it really become about you instead of the cause at that point?
Jaron Lanier (You Are Not a Gadget)
Consider also the special word they used: survivor. Something new. As long as they didn't have to say human being. It used to be refugee, but by now there was no such creature, no more refugees, only survivors. A name like a number -- counted apart from the ordinary swarm. Blue digits on the arm, what difference? They don't call you a woman anyhow. Survivor. Even when your bones get melted into the grains of the earth, still they'll forget human being. Survivor and survivor and survivor; always and always. Who made up these words, parasites on the throat of suffering!
Cynthia Ozick (The Shawl)
As computers replace textbooks, students will become more computer literate and more book illiterate. They'll be exploring virtual worlds, watching dancing triangles, downloading the latest web sites. But they won't be reading books.
Clifford Stoll (High-Tech Heretic: Reflections of a Computer Contrarian)
One of the great challenges in healthcare technology is that medicine is at once an enormous business and an exquisitely human endeavor; it requires the ruthless efficiency of the modern manufacturing plant and the gentle hand-holding of the parish priest; it is about science, but also about art; it is eminently quantifiable and yet stubbornly not.
Robert M. Wachter (The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age)
The underlying power in social media is not the technology. It’s the power that comes from human beings connecting from all around the globe. If the gospel message (or any message, for that matter) is transmitted along relational lines, churches can confidently head in the direction of social because of the volume of relationships it can facilitate.
Justin Wise (The Social Church: A Theology of Digital Communication)
But even if such a prediction were true, our inability to distinguish between a virtual reality simulation and the real world will have less to do with the increasing fidelity of simulation than the decreasing perceptual abilities of us humans.
Douglas Rushkoff (Program or Be Programmed: Ten Commands for a Digital Age)
Are we using digital computers to sequence, store, and better replicate our own genetic code, thereby optimizing human beings, or are digital computers optimizing our genetic code—and our way of thinking—so that we can better assist in replicating them?
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
Loneliness is painful, emotionally and even physically, born from a “want of intimacy” when we need it most, in early childhood. Solitude—the capacity to be contentedly and constructively alone—is built from successful human connection at just that time.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
As algorithms come to know us so well, authoritarian governments could gain absolute control over their citizens, even more so than in Nazi Germany, and resistance to such regimes might be utterly impossible. Not only will the regime know exactly how you feel, but it could make you feel whatever it wants. The dictator might not be able to provide citizens with healthcare or equality, but he could make them love him and hate his opponents. Democracy in its present form cannot survive the merger of biotech and infotech. Either democracy will successfully reinvent itself in a radically new form or humans will come to live in “digital dictatorships.
Yuval Noah Harari (21 Lessons for the 21st Century)
The internet is a dangerous place. If you are not careful it will consume you and rob you of your happiness. It can make you angry, jealous, hostile, bitter and lead to the eventual loss of enthusiasm for living your best life. Be wary and avoid overconsumption.
Germany Kent
human intelligence and creativity, today more than ever, are tied to connecting—synchronizing—people, tools, texts, digital and social media, virtual spaces, and real spaces in the right ways, in ways that make us Minds and not just minds, but also better people in a better world.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
Many are accustomed to holding a sword called the First Amendment in one hand and a shield called the Fifth in the other—all the while forgetting that to do so is to deem human relations a battlefield. In many ways this culture of criticism and complaint is the unfortunate reality.
Dale Carnegie (How to Win Friends and Influence People in the Digital Age (Dale Carnegie Books))
Many people suppose that computing machines are replacements for intelligence and have cut down the need for original thought,” Wiener wrote. “This is not the case.” The more powerful the computer, the greater the premium that will be placed on connecting it with imaginative, creative, high-level human thinking.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
singularity, a term that von Neumann coined and the futurist Ray Kurzweil and the science fiction writer Vernor Vinge popularized, which is sometimes used to describe the moment when computers are not only smarter than humans but also can design themselves to be even supersmarter, and will thus no longer need us mortals.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The entire system of free market capitalism, as it is practiced in the United States and in many Western nations,” Simon Mainwaring writes,     is leading us further and further down the wrong path, toward a world dominated by narrow self-interest, greed, corporatism, and insensitivity to the greater good of humanity and to the planet itself. Short-term thinking and the single-minded pursuit of profit are increasingly subverting an economic system that otherwise has the capacity to benefit everyone.
Robert W. McChesney (Digital Disconnect: How Capitalism is Turning the Internet Against Democracy)
One day in the next five hundred billion years, while the probes complete one full circuit of the Milky Way, maybe they’ll stumble upon intelligent life. In forty thousand years or so, when the two probes sail close enough to a planetary system, maybe just maybe one of these planets will be home to some life form which will spy the probe with whatever it has that passes for eyes, stay its telescope, retrieve the derelict fuel-less old probe with whatever it has that passes for curiosity, lower the stylus (supplied) to the record with whatever it has that passes for digits, and set free the dadadadaa of Beethoven’s Fifth. It’ll roll like thunder through a different frontier. Human music will permeate the Milky Way’s outer reaches. There’ll be Chuck Berry and Bach, there’ll be Stravinsky and Blind Willie Johnson, and the didgeridoo, violin, slide guitar and shakuhachi. Whale song will drift through the constellation of Ursa Minor. Perhaps a being on a planet of the star AC +793888 will hear the 1970s recording of sheep bleat, laughter, footsteps, and the soft pluck of a kiss. Perhaps they’ll hear the trundle of a tractor and the voice of a child. When they hear on the phonograph a recording of rapid firecracker drills and bursts, will they know that these sounds denote brainwaves? Will they ever infer that over forty thousand years before in a solar system unknown a woman was rigged to an EEG and her thoughts recorded? Could they know to work backwards from the abstract sounds and translate them once more into brainwaves, and could they know from these brainwaves the kinds of thoughts the woman was having? Could they see into a human’s mind? Could they know she was a young woman in love? Could they tell from this dip and rise in the EEG’s pattern that she was thinking simultaneously of earth and lover as if the two were continuous? Could they see that, though she tried to keep her mental script, to bring to mind Lincoln and the Ice Age and the hieroglyphs of ancient Egypt and whatever grand things have shaped the earth and which she wished to convey to an alien audience, every thought cascaded into the drawn brows and proud nose of her lover, the wonderful articulation of his hands and the way he listened like a bird and how they had touched so often without touching. And then a spike in sound as she thought of that great city Alexandria and of nuclear disarmament and the symphony of the earth’s tides and the squareness of his jaw and the way he spoke with such bright precision so that everything he said was epiphany and discovery and the way he looked at her as though she were the epiphany he kept on having and the thud of her heart and the flooding of heat about her body when she considered what it was he wanted to do to her and the migration of bison across a Utah plain and a geisha’s expressionless face and the knowledge of having found that thing in the world which she ought never to have had the good fortune of finding, of two minds and bodies flung at each other at full dumbfounding force so that her life had skittered sidelong and all her pin-boned plans just gone like that and her self engulfed in a fire of longing and thoughts of sex and destiny, the completeness of love, their astounding earth, his hands, his throat, his bare back.
Samantha Harvey (Orbital)
The hardhearted person never sees people as people, but rather as mere objects or as impersonal cogs in an ever-turning wheel. In the vast wheel of industry, he sees men as hands. In the massive wheel of big city life, he sees men as digits in a multitude. In the deadly wheel of army life, he sees men as numbers in a regiment. He depersonalizes life.
Martin Luther King Jr. (A Gift of Love: Sermons from Strength to Love and Other Preachings (King Legacy))
As the cost of opting out of the digital domain increases, its ability to affect human thought — to convince, to steer, to divert — grows.
Henry Kissinger (The Age of A.I. and Our Human Future)
the microchip, the computer, and the internet. When these three innovations were combined, the digital revolution was born.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
The true danger lies less in AI thinking like a human than in humans adopting an AI way of thinking.
Stephane Nappo
The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
One fundamental problem, as mathematicians now realize, is that they made a crucial error fifty years ago in thinking the brain was analogous to a large digital computer.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
Learning agility is the willingness and ability to learn, de-learn, and relearn. Limitations on learning are barriers invented by humans.
Pearl Zhu (Digital Capability: Building Lego Like Capability Into Business Competency)
As every good marxist already knows, the ideological shift toward a terminally optimistic humanism was vital to the rise of the bourgeoisie, and the decline of the aristocracy.
Terre Thaemlitz (Nuisance - Writings on identity jamming & digital audio production)
electronic haunted house in which digital ghosts and frightened human beings lived in uneasy coexistence. Still standing.
Stephen King (It)
Observation is a critical activity in the innovation process to understand the context of an issue from a human perspective.
Pearl Zhu (100 Creativity Ingredients: Everyone’s Playbook to Unlock Creativity (Digital Master 12))
Our society has led us to believe that everybody is on the internet these days. Contrary to popular belief everyone is not on social media.
Germany Kent
Von Neumann was another innovator who stood at the intersection of the humanities and sciences.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Getting rid of human judgments altogether—even those from highly experienced and credentialed people—and relying solely on numbers plugged into formulas, often yields better results.
Andrew McAfee (Machine, Platform, Crowd: Harnessing Our Digital Future)
In all of these cases, we use technology to “dial down” human contact, to titrate its nature and extent. People avoid face-to-face conversation but are comforted by being in touch with people—and sometimes with a lot of people—who are emotionally kept at bay. It’s another instance of the Goldilocks effect. It’s part of the move from conversation to mere connection.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
By the late twentieth century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism; in short, we are cyborgs. The cyborg is our ontology; it gives us our politics. The cyborg is a condensed image of both imagination and material reality, the two joined centers structuring any possibility of historical transformation.
Donna J. Haraway
2 I know it might be tedious, and I know I might be close to undelightful ranting, but I’m going to take one more step toward us to suggest that any time we opt for the human interaction rather than the automated or digital one, which requires noticing how ubiquitous the automated or digital one has become (ubiquity makes invisibility, makes us look up from our machines and be like where’d the people go?)—checking out groceries; getting our ibuprofen; the menu thing; being in class; getting directions; finding or buying a book; learning how to do stuff—is a small act of revolt. Except there’s no such thing as small revolt, because each revolt, even if only fleetingly, even if only for an instant, is making the world of our dreams.
Ross Gay (The Book of (More) Delights: Essays)
In a 2004 study, Angelo Maravita and Atsushi Iriki discovered that when monkeys and humans consistently use a tool to extend their reach, such as using a rake to reach an object, certain neural networks in the brain change their “map” of the body to include the new tool. This fascinating finding reinforces the idea that external tools can and often do become a natural extension of our minds.
Tiago Forte (Building a Second Brain: A Proven Method to Organise Your Digital Life and Unlock Your Creative Potential)
Human thoughts are digital. Most people see things as 0 or 1, as black or white. They see nothing in between. All chemicals are dangerous. You are either friend or foe. If you aren’t left-wing, you’re right. If you aren’t conservative, you’re liberal. Everything that great man says must be true. Everyone who thinks differently from us is evil. Everyone in that country—even the babies—is evil.
Hiroshi Yamamoto (The Stories of Ibis)
ever accelerating progress of technology and changes in the mode of human life,” von Neumann explained to Stan Ulam, “gives the appearance of approaching some essential singularity in the history of the race.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
He looked past Chin toward streams of numbers running in opposite directions. He understood how much it meant to him, the roll and flip of data on a screen. He studied the figural diagrams that brought organic patterns into play, birdwing and chambered shell. It was shallow thinking to maintain that numbers and charts were the cold compression of unruly human energies, every sort of yearning and midnight sweat reduced to lucid units in the financial markets. In fact data itself was soulful and glowing, a dynamic aspect of the life process. This was the eloquence of alphabets and numeric systems, now fully realized in electronic form, in the zero-oneness of the world, the digital imperative that defined every breath of the planet's living billions. Here was the heave of the biosphere. Our bodies and oceans were here, knowable and whole.
Don DeLillo (Cosmopolis)
By some estimates, there are 250 hacker groups in China that are tolerated and may even be encouraged by the government to enter and disrupt computer networks,” said the 2008 U.S.–China Security Review. “The Chinese government closely monitors Internet activities and is likely aware of the hackers’ activities. While the exact number may never be known, these estimates suggest that the Chinese government devotes a tremendous amount of human resources to cyber activity for government purposes. Many individuals are being trained in cyber operations at Chinese military academies, which does fit with the Chinese military’s overall strategy.
Mark Bowden (Worm: The First Digital World War)
the very beginning of time until the year 2003,” says Google Executive Chairman Eric Schmidt, “humankind created five exabytes of digital information. An exabyte is one billion gigabytes—or a 1 with eighteen zeroes after it. Right now, in the year 2010, the human race is generating five exabytes of information every two days. By the year 2013, the number will be five exabytes produced every ten minutes … It’s no wonder we’re exhausted.
Peter H. Diamandis (Abundance: The Future is Better Than You Think)
When babies are born, they can typically only focus on objects eight to twelve inches in front of them. Their eye muscles strengthen and improve quickly so that they can see and take in more of the world through their eyes. I find it somewhat ironic that most of the human race now spends so much time staring at objects — phones and tablets — eight to twelve inches in front of our faces. Perhaps we all just want to return to our childhood?
Thatcher Wine (The Twelve Monotasks: Do One Thing at a Time to Do Everything Better)
Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Jobs said. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.”34 Another lesson of the digital age is as old as Aristotle: “Man is a social animal.” What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose: to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare. Machines, by contrast, are not social animals. They don’t join Facebook of their own volition nor seek companionship for its own sake. When Alan Turing asserted that machines would someday behave like humans, his critics countered that they would never be able to show affection or crave intimacy. To indulge Turing, perhaps we could program a machine to feign affection and pretend to seek intimacy, just as humans sometimes do. But Turing, more than almost anyone, would probably know the difference. According to the second part of Aristotle’s quote, the nonsocial nature of computers suggests that they are “either a beast or a god.” Actually, they are neither. Despite all of the proclamations of artificial intelligence engineers and Internet sociologists, digital tools have no personalities, intentions, or desires. They are what we make of them.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Three profoundly destabilizing scientific ideas ricochet through the twentieth century, trisecting it into three unequal parts: the atom, the byte, the gene. Each is foreshadowed by an earlier century, but dazzles into full prominence in the twentieth. Each begins its life as a rather abstract scientific concept, but grows to invade multiple human discourses - thereby transforming culture, society, politics, and language. But the most crucial parallel between the three ideas, by far, is conceptual: each represents the irreducible unit - the building block, the basic organizational unit - of a larger whole: the atom, of matter; the byte (or "bit"), of digitized information; the gene, of heredity and biological information.
Siddhartha Mukherjee (The Gene: An Intimate History)
I truly believe Ai is the key to unlocking the full potential of human focused digital, so it is important that we understand what Artificial Intelligence is, in order to understand what it is going to mean for design. Because now that it is here, we have a solid foundation to start creating more intelligent, invisible experiences that make us more human by design. We are at the precipice of one of the most significant discoveries of development since we learnt how to light a fire.
Pete Trainor (Hippo: The Human Focused Digital Book)
…the act of reading is a special place in which human beings are freed from themselves to pass over to others and, in so doing, learn what it means to be another person with aspirations, doubts, and emotions that they might otherwise never have known.
Maryanne Wolf (Reader, Come Home: The Reading Brain in a Digital World)
When identity is derived from projecting an image in the public realm, something is lost, some core of identity diluted, some sense of authority or interiority sacrificed. It is time to question the false equivalency between not being seen and hiding. And time to reevaluate the merits of the inconspicuous life, to search out some antidote to continuous exposure, and to reconsider the value of going unseen, undetected, or overlooked in this new world. Might invisibility be regarded not simply as refuge, but as a condition with its own meaning and power? Going unseen may be becoming a sign of decency and self-assurance. The impulse to escape notice is not about complacent isolation or senseless conformity, but about maintaining identity, propriety, autonomy, and voice. It is not about retreating from the digital world but about finding some genuine alternative to a life of perpetual display. It is not about mindless effacement but mindful awareness. Neither disgraceful nor discrediting, such obscurity can be vital to our very sense of being, a way of fitting in with the immediate social, cultural, or environmental landscape. Human endeavor can be something interior, private, and self-contained. We can gain, rather than suffer, from deep reserve.
Akiko Busch (How to Disappear: Notes on Invisibility in a Time of Transparency)
In his book The Tell-Tale Brain, neuroscientist V. S. Ramachandran poetically explains: Any ape can reach for a banana, but only humans can reach for the stars. Apes live, contend, breed and die in forests—end of story. Humans write, investigate, and quest. We splice genes, split atoms, launch rockets. We peer upward . . . and delve deeply into the digits of pi. Perhaps most remarkably of all, we gaze inward, piecing together the puzzle of our own unique and marvelous brain . . . This, truly, is the greatest mystery of all.
Tasha Eurich (Insight: The Power of Self-Awareness in a Self-Deluded World)
He handed Mae a piece of paper, on which he'd written, in crude all capitals, a list of assertions under the headline "The Rights of Humans in a Digital Age." Mae scanned it, catching passages: "We must all have the right to anonymity." "Not every human activity can be measured." "The ceaseless pursuit of data to quantify the value of any endeavour is catastrophic to true understanding." "The barrier between public and private must remain unbreachable." At the end she found one line, written in red ink: "We must all have the right to disappear.
Dave Eggers (The Circle (The Circle, #1))
Pretty soon all the information in the world – every tiny scrap of knowledge that humans possess, every little thought we’ve ever had that’s been considered worth preserving over thousands of years – all of it will be available digitally. Every road on earth has been mapped. Every building photographed. Everywhere we humans go, whatever we buy, whatever websites we look at, we leave a digital trail as clear as slug-slime. And this data can be read, searched and analysed by computers and value extracted from it in ways we cannot even begin to conceive.
Robert Harris (The Fear Index)
Is any of it real? I mean, look at this. Look at it! A world built on fantasy. Synthetic emotions in the form of pills. Psychological warfare in the form of advertising. Mind-altering chemicals in the form of … food! Brainwashing seminars in the form of media. Controlled isolated bubbles in the form of social networks. Real? You want to talk about reality? We haven’t lived in anything remotely close to it since the turn of the century. We turned it off, took out the batteries, snacked on a bag of GMOs while we tossed the remnants in the ever-expanding Dumpster of the human condition. We live in branded houses trademarked by corporations built on bipolar numbers jumping up and down on digital displays, hypnotizing us into the biggest slumber mankind has ever seen. You have to dig pretty deep, kiddo, before finding anything real.
Mr. Robot
The brain is a statistical, probabilistic system, with logic and mathematics running as higher-level processes. The computer is a logical, mathematical system, upon which higher-level statistical, probabilistic systems, such as human language and intelligence, could possibly be built.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
As the physicist Richard Feynman once observed, “[Quantum mechanics] describes nature as absurd from the point of view of common sense. And it fully agrees with experiment. So I hope you can accept nature as She is—absurd.” Quantum mechanics seems to study that which doesn’t exist—but nevertheless proves true. It works. In the decades to come, quantum physics would open the door to a host of practical inventions that now define the digital age, including the modern personal computer, nuclear power, genetic engineering, and laser technology (from which we get such consumer products as the CD player and the bar-code reader commonly used in supermarkets). If the youthful Oppenheimer loved quantum mechanics for the sheer beauty of its abstractions, it was nevertheless a theory that would soon spawn a revolution in how human beings relate to the world.
Kai Bird (American Prometheus)
We started as accidents,” he continues, behind her. “Leftovers. Microbes in a digital sea. We fed on interrupted processes, interrupted conversations, grew, evolved. The first humans we merged with were children using a public library network too ancient and unprotected to keep us out. Nobody cared if poor children got locked away in institutions, or left out on the streets to shiver and starve, when they started acting strange. No one cared what it meant when they became something new—or at least, not at first. We became them. They became us. Then we, together, began to grow.
Ellen Datlow (After)
I think we all know—somewhere deep inside, somewhere we don’t want to explore, somewhere maybe close to the subconscious—that the chances of us living the life we want, the life we’ve envisioned, are very slim. We have to make do. And the digital age has plastered that ‘making do’ for everyone to see while simultaneously mandating that we be more. Achievements become ‘likes.’ Thoughts become ‘shares.’ Emotions become comments at the bottom of a video. It’s a digital tapestry of unanswered prayers, and if you look really close at it all, you see this enormous wall of human misery.
James Han Mattson (The Lost Prayers of Ricky Graves)
We also need to be reminded that there are implications of meaning within data, both in terms of how we look at data meaningfully (as in how it informs our decisions and interactions) and how we see meaning in data (as in how we recognize patterns that tell us if people value what we’re doing).
Kate O'Neill (Pixels and Place: Connecting Human Experience Across Physical and Digital Spaces)
Our digital devices and the outlooks they inspired allowed us to break free of the often repressive timelines of our storytellers, turning us from creatures led about by future expectations into more fully present-oriented human beings. The actual experience of this now-ness, however, is a bit more distracted, peripheral, even schizophrenic than that of being fully present. For many, the collapse of narrative led initially to a kind of post-traumatic stress disorder—a disillusionment, and the vague unease of having no direction from above, no plan or story. But like a dose of adrenaline or a double shot of espresso, our digital technologies compensate for this goalless drifting with an onslaught of simultaneous demands. We may not know where we're going anymore, but we're going to get there a whole lot faster. Yes, we may be in the midst of some great existential crisis, but we're simply too busy to notice.
Douglas Rushkoff (Present Shock: When Everything Happens Now)
The nuclear arms race is over, but the ethical problems raised by nonmilitary technology remain. The ethical problems arise from three "new ages" flooding over human society like tsunamis. First is the Information Age, already arrived and here to stay, driven by computers and digital memory. Second is the Biotechnology Age, due to arrive in full force early in the next century, driven by DNA sequencing and genetic engineering. Third is the Neurotechnology Age, likely to arrive later in the next century, driven by neural sensors and exposing the inner workings of human emotion and personality to manipulation.
Freeman Dyson (The Scientist as Rebel)
This is the delicacy of our present moment. Our digital butlers are watching closely. They see our private as well as our public lives, our best and worst selves, without necessarily knowing which is which or making a distinction at all. They by and large reside in a kind of uncanny valley of sophistication: able to infer sophisticated models of our desires from our behavior, but unable to be taught, and disinclined to cooperate. They’re thinking hard about what we are going to do next, about how they might make their next commission, but they don’t seem to understand what we want, much less who we hope to become.
Brian Christian (The Alignment Problem: Machine Learning and Human Values)
If humans, instead of transmitting to each other reprints and complicated explanations, developed the habit of transmitting computer programs allowing a computer-directed factory to construct the machine needed for a particular purpose, that would be the closest analogue to the communication methods among cells.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
now liberalism is in trouble. So where are we heading? This question is particularly poignant because liberalism is losing credibility exactly when the twin revolutions in information technology and biotechnology confront us with the biggest challenges our species has ever encountered. The merger of infotech and biotech might soon push billions of humans out of the job market and undermine both liberty and equality. Big Data algorithms might create digital dictatorships in which all power is concentrated in the hands of a tiny elite while most people suffer not from exploitation but from something far worse—irrelevance.
Yuval Noah Harari (21 Lessons for the 21st Century)
The irony is that even as digitization is making an increasing amount of information available, it is diminishing the space required for deep, concentrated thought. Today’s near-constant stream of media increases the cost, and thus decreases the frequency, of contemplation. Algorithms promote what seizes attention in response to the human desire for stimulation—and what seizes attention is often the dramatic, the surprising, and the emotional. Whether an individual can find space in this environment for careful thought is one matter. Another is that the now-dominant forms of communication are non-conducive to the promotion of tempered reasoning.
Henry Kissinger (The Age of A.I. and Our Human Future)
Meanwhile, the extraordinary measures we take to stay abreast of each minuscule change to the data stream end up magnifying the relative importance of these blips to the real scheme of things. Investors trade, politicians respond, and friends judge based on the micromovements of virtual needles. By dividing our attention between our digital extensions, we sacrifice our connection to the truer present in which we are living. The tension between the faux present of digital bombardment and the true now of a coherently living human generated the second kind of present shock, what we're calling digiphrenia—digi for "digital," and phrenia for "disordered condition of mental activity.
Douglas Rushkoff (Present Shock: When Everything Happens Now)
The measuring rod, the unit of information, is something called a bit (for binary digit). It is an answer - either yes or no- to an unambiguous question... The information content of the human brain expressed in bits is probably comparable to the total number of connections among the neurons- about a hundred trillion, 10^14 bits. If written out in English, say, that information would fill some twenty million volumes, as many as in the world's largest libraries. The equivalent of twenty million books is inside the heads of every one of us... When our genes could not store all the information necessary for survival, we slowly invented them. But then the time came, perhaps ten thousand years ago, when we needed to stockpile enormous quantities of information outside our bodies. We are the only species on the planet, so far as we know, to have invented a communal memory stored neither in our genes nor in our brains. The warehouse of that memory is called the library... The great libraries of the world contain millions of volumes, the equivalent of about 10^14 bits of information in words, and perhaps 10^15 bits in pictures. This is ten thousand times more than in our brains. If I finish a book a week, I will only read a few thousand books in my lifetime, about a tenth of a percent of the contents of the greatest libraries of our time. The trick is to know which books to read... Books permit us to voyage through time, to tap the wisdom of our ancestors. The library connects us with the insights and knowledge, painfully extracted from Nature, of the greatest minds that ever were, with the best teachers, drawn from the entire planet and from all of our history, to instruct us without tiring, and to inspire us to make our own contribution to the collective knowledge of the human species. Public libraries depend on voluntary contributions. I think the health of our civilization, the depth of our awareness about the underpinnings of our culture and our concern for the future can all be tested by how well we support our libraries. p224-233
Carl Sagan (Cosmos)
James would only look for music composed and performed by humans. Nowadays people didn’t feel the need to learn to play musical instruments. And why would they, since the sounds they produced could be perfectly generated digitally. Human voices were sample recorded, then modified and remastered by artificial intelligence. Where did our creativity go?
A.V. Osten (The Head Employee Precedent (Hemisphere Book # 1))
Nobody could have ever conceived of a more absurd waste of human resources than to dig gold in distant corners of the Earth for the sole purpose of transporting it and reburying it immediately afterward in other deep holes, especially excavated to receive it and heavily guarded to protect it. The history of human intuitions, however, has a logic of its own.
Nik Bhatia (Layered Money: From Gold and Dollars to Bitcoin and Central Bank Digital Currencies)
It is important to note that the design of an entire brain region is simpler than the design of a single neuron. As discussed earlier, models often get simpler at a higher level—consider an analogy with a computer. We do need to understand the detailed physics of semiconductors to model a transistor, and the equations underlying a single real transistor are complex. A digital circuit that multiplies two numbers requires hundreds of them. Yet we can model this multiplication circuit very simply with one or two formulas. An entire computer with billions of transistors can be modeled through its instruction set and register description, which can be described on a handful of written pages of text and formulas. The software programs for an operating system, language compilers, and assemblers are reasonably complex, but modeling a particular program—for example, a speech recognition program based on hierarchical hidden Markov modeling—may likewise be described in only a few pages of equations. Nowhere in such a description would be found the details of semiconductor physics or even of computer architecture. A similar observation holds true for the brain. A particular neocortical pattern recognizer that detects a particular invariant visual feature (such as a face) or that performs a bandpass filtering (restricting input to a specific frequency range) on sound or that evaluates the temporal proximity of two events can be described with far fewer specific details than the actual physics and chemical relations controlling the neurotransmitters, ion channels, and other synaptic and dendritic variables involved in the neural processes. Although all of this complexity needs to be carefully considered before advancing to the next higher conceptual level, much of it can be simplified as the operating principles of the brain are revealed.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
More recently, Dallas Willard put it this way: Desire is infinite partly because we were made by God, made for God, made to need God, and made to run on God. We can be satisfied only by the one who is infinite, eternal, and able to supply all our needs; we are only at home in God. When we fall away from God, the desire for the infinite remains, but it is displaced upon things that will certainly lead to destruction.5 Ultimately, nothing in this life, apart from God, can satisfy our desires. Tragically, we continue to chase after our desires ad infinitum. The result? A chronic state of restlessness or, worse, angst, anger, anxiety, disillusionment, depression—all of which lead to a life of hurry, a life of busyness, overload, shopping, materialism, careerism, a life of more…which in turn makes us even more restless. And the cycle spirals out of control. To make a bad problem worse, this is exacerbated by our cultural moment of digital marketing from a society built around the twin gods of accumulation and accomplishment. Advertising is literally an attempt to monetize our restlessness. They say we see upward of four thousand ads a day, all designed to stoke the fire of desire in our bellies. Buy this. Do this. Eat this. Drink this. Have this. Watch this. Be this. In his book on the Sabbath, Wayne Muller opined, “It is as if we have inadvertently stumbled into some horrific wonderland.”6 Social media takes this problem to a whole new level as we live under the barrage of images—not just from marketing departments but from the rich and famous as well as our friends and family, all of whom curate the best moments of their lives. This ends up unintentionally playing to a core sin of the human condition that goes all the way back to the garden—envy. The greed for another person’s life and the loss of gratitude, joy, and contentment in our own.
John Mark Comer (The Ruthless Elimination of Hurry: How to Stay Emotionally Healthy and Spiritually Alive in the Chaos of the Modern World)
The problem with this game of special characteristics is that non-humans can never win. When we determine that parrots have the conceptual ability to understand and manipulate single-digit numbers, we demand that they be able to understand and manipulate double-digit numbers in order to be sufficiently like us. When a chimpanzee indicates beyond doubt that she has an extensive vocabulary, we demand that she exhibit certain levels of syntactical skill in order to demonstrate that her mind is like ours. The irony, of course, is that whatever characteristic we are talking about will be possessed by some nonhumans to a greater degree than some humans, but we would never think it appropriate to exploit those humans in the ways that we do nonhumans.
Gary L. Francione (Animals as Persons: Essays on the Abolition of Animal Exploitation)
Instead of learning from one mind at a time, the search engine learns from the collective human mind, all at once. Every time an individual searches for something, and finds an answer, this leaves a faint, lingering trace as to where (and what) some fragment of meaning is. The fragments accumulate and, at a certain point, as Turing put it in 1948, “the machine would have ‘grown up.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
In the moment all is dear to me, dear that in this logic there is no redemption, the city itself being the highest form of madness and each and every part, organic or inorganic, an expression of this same madness. I feel absurdly and humbly great, not as megalomaniac, but as human spore, as the dead sponge of life swollen to saturation. I no longer look into the eyes of the woman I hold in my arms but I swim through, head and arms and legs, and I see that behind the sockets of the eyes there is a region unexplored, the world of futurity, and here there is no logic whatever, just the still germination of events unbroken by night and day, by yesterday and tomorrow. The eye, accustomed to concentration on points in space, now concentrates on points in time; the eye sees forward and backward at will. The eye which was the I of the self no longer exists; this selfless eye neither reveals nor illuminates. It travels along the line of the horizon, a ceaseless, uninformed voyager. Trying to retain the lost body I grew in logic as the city, a point digit in the anatomy of perfection. I grew beyond my own death, spiritually bright and hard. I was divided into endless yesterdays, endless tomorrows, resting only on the cusp of the event, a wall with many windows, but the house gone. I must shatter the walls and windows, the last shell of the lost body, if I am to rejoin the present. That is why I no longer look into the eyes or through the eyes, but by the legerdemain of will swim through the eyes, head and arms and legs to explore the curve of vision. I see around myself as the mother who bore me once saw round the corners of time. I have broken the wall created by birth and the line of voyage is round and unbroken, even as the navel. No form, no image, no architecture, only concentric flights of sheer madness. I am the arrow of the dream's substantiality. I verify by flight. I nullify by dropping to earth.
Henry Miller (Tropic of Capricorn (Tropic, #2))
This is the portrait of the future customers—connected yet distracted. A survey by the National Center for Biotechnology Information shows that the average human attention span has dropped from 12 seconds in 2000 to 8 seconds in 2013. This can be attributed to the massive and overwhelming volume of messages that constantly bombard our connected mobile devices and demand instant attention.
Philip Kotler (Marketing 4.0: Moving from Traditional to Digital)
Meanwhile, because humanity is still swept along by animal passions in a digitalized global world, and because we are conflicted between what we are and what we wish to become, and because we are drowning in information and starved for wisdom, it would seem appropriate to return philosophy to its once esteemed position, this time as the center of a humanistic science and a scientific humanities.
Edward O. Wilson (The Origins of Creativity)
One night, walking along 8th Street in the East Village, I saw some adolescent boys, out too late and unattended. They were playing an arcade video game set up on the sidewalk, piloting a digital spacecraft through starlit infinity, blasting everything in their path to bits. Now and then, the machine would let out a robotic shout of encouragement: You’re doing great! So the urchins flew on through the make-believe nothingness, destroying whatever they saw, hypnotized by the mechanical praise that stood in for the human voice of love. That, it seemed to me, was postmodernism in a nutshell. It ignored the full spiritual reality of life all around it in order to blow things apart inside a man-made box that only looked like infinity. You’re doing great, intellectuals! You’re doing great. Much
Andrew Klavan (The Great Good Thing: A Secular Jew Comes to Faith in Christ)
Life cannot be classified in terms of a simple neurological ladder, with human beings at the top; it is more accurate to talk of different forms of intelligence, each with its strengths and weaknesses. This point was well demonstrated in the minutes before last December's tsunami, when tourists grabbed their digital cameras and ran after the ebbing surf, and all the 'dumb' animals made for the hills.
B.R. Myers
Changing our energy model already means doubling rare metal production approximately every fifteen years. At this rate, over the next thirty years we will need to mine more mineral ores than humans have extracted over the last 70,000 years. But the shortages already looming on the horizon could burst the bubble of Jeremy Rifkin, green-tech industrialists, and Pope Francis, and prove our hermit right.
Guillaume Pitron (The Rare Metals War: the dark side of clean energy and digital technologies)
If you had to give a name to the whole apparatus, what would you call it?” “Hmmm,” Waterhouse says. “Well, its basic job is to perform mathematical calculations—like a computer.” Comstock snorts. “A computer is a human being.” “Well . . . this machine uses binary digits to do its computing. I suppose you could call it a digital computer.” Comstock writes it out in block letters on his legal pad: DIGITAL COMPUTER.
Neal Stephenson (Cryptonomicon)
But even when Facebook isn't deliberately exploiting its users, it is exploiting its users—its business model requires it. Even if you distance yourself from Facebook, you still live in the world that Facebook is shaping. Facebook, using our native narcissism and our desire to connect with other people, captured our attention and our behavioral data; it used this attention and data to manipulate our behavior, to the point that nearly half of America began relying on Facebook for news. Then, with the media both reliant on Facebook as a way of reaching readers and powerless against the platform's ability to suck up digital advertising revenue—it was like a paperboy who pocketed all the subscription money—Facebook bent the media's economic model to match its own practices: publications needed to capture attention quickly and consistently trigger high emotional responses to be seen at all. The result, in 2016, was an unending stream of Trump stories, both from the mainstream news and from the fringe outlets that were buoyed by Facebook's algorithm. What began as a way for Zuckerberg to harness collegiate misogyny and self-interest has become the fuel for our whole contemporary nightmare, for a world that fundamentally and systematically misrepresents human needs.
Jia Tolentino (Trick Mirror: Reflections on Self-Delusion)
If patterns of ones and zeros were “like” patterns of human lives and deaths, if everything about an individual could be represented in a computer record by a long string of ones and zeros, then what kind of creature would be represented by a long string of lives and deaths? It would have to be up one level at least—an angel, a minor god, something in a UFO. It would take eight human lives and deaths just to form one character in this being’s name—its complete dossier might take up a considerable piece of the history of the world. We are digits in God’s computer, she not so much thought as hummed to herself to a sort of standard gospel tune, And the only thing we’re good for, to be dead or to be living, is the only thing He sees. What we cry, what we contend for, in our world of toil and blood, it all lies beneath the notice of the hacker we call God.
Thomas Pynchon (Vineland)
There are many buzzwords that gloss over these operations and their economic origins: “ambient computing,” “ubiquitous computing,” and the “internet of things” are but a few examples. For now I will refer to this whole complex more generally as the “apparatus.” Although the labels differ, they share a consistent vision: the everywhere, always-on instrumentation, datafication, connection, communication, and computation of all things, animate and inanimate, and all processes—natural, human, physiological, chemical, machine, administrative, vehicular, financial. Real-world activity is continuously rendered from phones, cars, streets, homes, shops, bodies, trees, buildings, airports, and cities back to the digital realm, where it finds new life as data ready for transformation into predictions, all of it filling the ever-expanding pages of the shadow text.4
Shoshana Zuboff (The Age of Surveillance Capitalism)
It was there, in Spain, that the right to the future tense was on the move, insisting that the operations of surveillance capitalism and its digital architecture are not, never were, and never would be inevitable. Instead, the opposition asserted that even Google’s capitalism was made by humans to be unmade and remade by democratic processes, not commercial decree. Google’s was not to be the last word on the human or the digital future.
Shoshana Zuboff (The Age of Surveillance Capitalism)
there was once a time when human beings did not feel the need to share their every waking moment with hundreds of millions, even billions, of complete and utter strangers. If one went to a shopping mall to purchase an article of clothing, one did not post minute-by-minute details on a social networking site; and if one made a fool of oneself at a party, one did not leave a photographic record of the sorry episode in a digital scrapbook that would survive for all eternity. But now, in the era of lost inhibition, it seemed no detail of life was too mundane or humiliating to share. In the online age, it was more important to live out loud than to live with dignity. Internet followers were more treasured than flesh-and-blood friends, for they held the illusive promise of celebrity, even immortality. Were Descartes alive today, he might have written: I tweet, therefore I am.
Daniel Silva (The Heist (Gabriel Allon, #14))
At the same time, such technology—from the television to the computer and phone—can put pressure on the brain by presenting it with more information, and of a type of information, that makes it hard for us to keep up. That is particularly true of interactive electronics, delivering highly relevant, stimulating social content, and with increasing speed. The onslaught taxes our ability to attend, to pay attention, arguably among the most important, powerful, and uniquely human of our gifts.
Matt Richtel (A Deadly Wandering: A Mystery, a Landmark Investigation, and the Astonishing Science of Attention in the Digital Age)
The lesson of the last decades is that neither massive grass-roots protests (as we have seen in Spain and Greece) nor well-organized political movements (parties with elaborated political visions) are enough - we also need a narrow, striking force of dedicated 'engineers' (hackers, whistle-blowers...) organized as a disciplined conspiratorial group. Its task will be to 'take over' the digital grid, to rip it out of the hands of the corporations and state agencies that now de facto control it.
Slavoj Žižek (Like A Thief In Broad Daylight: Power in the Era of Post-Human Capitalism)
It occurred to me that we now as a culture, as a people have legitimately become the progeny of the Digital Age. Ostensibly, we subsist within a dehumanized frontier--a computational, compartmentalized, mathematized collectivist-grid. Metrics have prohibitively supplanted ethics. Alternately, the authentic aesthetic experience has been sacrificed and transposed by the new breed of evangelicals: the purveyors of the advertising industry. Thus the symbolic euphoria induced by the infomercial is celebrated as the new Delphic Oracle. Alas, we've transitioned from a carbon-based life form into an information-based, bio-mechanical, heuristically deprived and depleted entity best described as "a self-balancing 28-jointed adaptor-based biped, an electro-chemical reduction plant integral with segregated stowages of special energy extracts." Consequently, we exist under the tyranny of hyper-specialization, which dislodges and disposes our sense of logic, proportion and humanity from both our cognitive and synaptic ballet.
Albert Bifarelli
There's an analogy to be made between our craving for story and our craving for food. A tendency to overeat served our ancestors well when food shortages were a predictable part of life. But now that we modern desk jockeys are awash in cheap grease and corn syrup, overeating is more likely to fatten us up and kill us young. Likewise, it could be that an intense greed for story was healthy for our ancestors but has some harmful consequences in a world where books, MP3 players, TVs, and iPhones make story omnipresent - and where we have, in romance novels and television shows such as Jersey Shore, something like the story equivalent of deep-fried Twinkies. I think the literary scholar Brian Boyd is right to wonder if overconsuming in a world awash with junk story could lead to something like a "mental diabetes epidemic." Similarly, as digital technology evolves, our stories - ubiquitous, immersive, interactive - may become dangerously attractive. The real threat isn't that story will fade out of human life in the future; it's that story will take it over completely.
Jonathan Gottschall (The Storytelling Animal: How Stories Make Us Human)
By the time I got to work, I had this realization that I didn’t have any more goals.”26 For the next two months, he assiduously tended to the task of finding for himself a worthy life goal. “I looked at all the crusades people could join, to find out how I could retrain myself.” What struck him was that any effort to improve the world was complex. He thought about people who tried to fight malaria or increase food production in poor areas and discovered that led to a complex array of other issues, such as overpopulation and soil erosion. To succeed at any ambitious project, you had to assess all of the intricate ramifications of an action, weigh probabilities, share information, organize people, and more. “Then one day, it just dawned on me—BOOM—that complexity was the fundamental thing,” he recalled. “And it just went click. If in some way, you could contribute significantly to the way humans could handle complexity and urgency, that would be universally helpful.”27 Such an endeavor would address not just one of the world’s problems; it would give people the tools to take on any problem.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
All these processes are helped along by another friend of the earth, dematerialization. Progress in technology allows us to do more with less. An aluminum soda can used to weigh three ounces; today it weighs less than half an ounce. Mobile phones don't need miles of telephone poles and wires. The digital revolution, by replacing atoms with bits, is dematerializing the world in front of our eyes. The cubic yards of vinyl that used to be my music collection gave way to cubic inches of compact disks and then to the nothingness of MP3s. The river of newsprint flowing through my apartment has been stanched by an iPad. With a terabyte of storage on my laptop I no longer buy paper by the ten-ream box. And just think of all the plastic, metal, and paper that no longer go into the forty-odd consumer products that can be replaced by a single smartphone, including a telephone, answering machine, phone book, camera, camcorder, tape recorder, radio, alarm clock, calculator, dictionary, Rolodex, calendar, street maps, flashlight, fax, and compass--even a metronome, outdoor thermometer, and spirit level.
Steven Pinker (Enlightenment Now: The Case for Reason, Science, Humanism, and Progress)
But I am no Lord of the Hill; these hands pitching fastballs at glass houses are just as dirty as yours are. However, there are a lot of exemptions in my favor. One, much of my calamitous behavior occurred prior to the Digital Age, so no footage or real proof exists (thank fuck) and can only be found in hearsay and interviews. Two, I understand the difference between “getting it out of your system” when you are young and not giving a shit outright about making buffoonery seem like a career and not an aberration as you get old enough to actually know better. Three—and this is most important—it is my book, so I can do no wrong. Shit happens; it just so happens to be yours and not mine. So guess what? Even if you are not devoid of gray matter, even if you are not technically by definition bereft of intuitive mental faculties, you are all guilty by association. This is a RICO case, and I am the district attorney in charge of bringing justice to the world. I may not be infallible, but I can wear a suit and use big words, and it won’t even look like someone put peanut butter on the roof of my mouth.
Corey Taylor (You're Making Me Hate You: A Cantankerous Look at the Common Misconception That Humans Have Any Common Sense Left)
"Digital camera" can be bought for seventy-five cents a click, whereas "digital cameras" fetches a dollar and eight cents. The advertisers know that the plural is more likely to be typed by people who are planning to buy a digital camera, though they don’t know why.25 The reason is that a bare noun like "digital camera" is generic, and is likely to be typed by someone who wants to know how they work. A plural like "digital cameras" is more likely to be referential, and typed by someone who wants to know about the kinds that are out there and how to get one.
Steven Pinker (The Stuff of Thought: Language as a Window Into Human Nature)
Hey Pete. So why the leave from social media? You are an activist, right? It seems like this decision is counterproductive to your message and work." A: The short answer is I’m tired of the endless narcissism inherent to the medium. In the commercial society we have, coupled with the consequential sense of insecurity people feel, as they impulsively “package themselves” for public consumption, the expression most dominant in all of this - is vanity. And I find that disheartening, annoying and dangerous. It is a form of cultural violence in many respects. However, please note the difference - that I work to promote just that – a message/idea – not myself… and I honestly loathe people who today just promote themselves for the sake of themselves. A sea of humans who have been conditioned into viewing who they are – as how they are seen online. Think about that for a moment. Social identity theory run amok. People have been conditioned to think “they are” how “others see them”. We live in an increasing fictional reality where people are now not only people – they are digital symbols. And those symbols become more important as a matter of “marketing” than people’s true personality. Now, one could argue that social perception has always had a communicative symbolism, even before the computer age. But nooooooothing like today. Social media has become a social prison and a strong means of social control, in fact. Beyond that, as most know, social media is literally designed like a drug. And it acts like it as people get more and more addicted to being seen and addicted to molding the way they want the world to view them – no matter how false the image (If there is any word that defines people’s behavior here – it is pretension). Dopamine fires upon recognition and, coupled with cell phone culture, we now have a sea of people in zombie-like trances looking at their phones (literally) thousands of times a day, merging their direct, true interpersonal social reality with a virtual “social media” one. No one can read anymore... they just swipe a stream of 200-character headlines/posts/tweets, understanding the world as an aggregate of those fragmented sentences. Massive loss of comprehension happening, replaced by usually agreeable, "in-bubble" views - hence an actual loss of variety. So again, this isn’t to say non-commercial focused social media doesn’t have positive purposes, such as with activism at times. But, on the whole, it merely amplifies a general value system disorder of a “LOOK AT ME! LOOK AT HOW GREAT I AM!” – rooted in systemic insecurity. People lying to themselves, drawing meaningless satisfaction from superficial responses from a sea of avatars. And it’s no surprise. Market economics demands people self-promote shamelessly, coupled with the arbitrary constructs of beauty and success that have also resulted. People see status in certain things and, directly or pathologically, use those things for their own narcissistic advantage. Think of those endless status pics of people rock climbing, or hanging out on a stunning beach or showing off their new trophy girlfriend, etc. It goes on and on and, worse, the general public generally likes it, seeking to imitate those images/symbols to amplify their own false status. Hence the endless feedback loop of superficiality.
And people wonder why youth suicides have risen… a young woman looking at a model of perfection set by her peers, without proper knowledge of the medium, can be made to feel inferior far more dramatically than the typical body image problems associated with traditional advertising. That is just one example of the cultural violence inherent. The entire industry of social media is BASED on narcissistic status promotion and narrow self-interest. That is the emotion/intent that creates the billions and billions in revenue these platforms experience, as they in turn sell off people’s personal data to advertisers and governments. You are the product, of course.
Peter Joseph
Unfortunately, it seems that we, as a society, have entered into a Faustian deal. Yes, we have these amazing handheld marvels of the digital age - tablets and smartphones - miraculous glowing devices that connect people throughout the globe and can literally access the sum of all human knowledge in the palm of our hand. But what is the price of all this future tech? The psyche and soul of an entire generation. The sad truth is that for the oh-so-satisfying ease, comfort and titillation of these jewels of the modern age, we've unwittingly thrown an entire generation under the virtual bus.
Nicholas Kardaras (Glow Kids: How Screen Addiction Is Hijacking Our Kids - and How to Break the Trance)
Christian thinking pursues embodied community. Before physical people, with human faces we can read, human voices in which we can hear emotion, saying human words that bind us emotionally to a particular place and a particular moment, we are reminded of what the Bible really means by honoring one another, serving one another, preferring one another, loving one another, admonishing one another, confessing your sins to one another, and praying for one another. Pixels are not created in God’s image. People are. It is a holy thing to be with another human being. It is, in fact, our eternal destiny.
Samuel James (Digital Liturgies: Rediscovering Christian Wisdom in an Online Age)
Distributions can only be based on measurements, but as in the case of measuring intelligence, the nature of measurement is often complicated and troubled by ambiguities. Consider the problem of noise, or what is known as luck in human affairs. Since the rise of the new digital economy, around the turn of the century, there has been a distinct heightening of obsessions with contests like American Idol, or other rituals in which an anointed individual will suddenly become rich and famous. When it comes to winner-take-all contests, onlookers are inevitably fascinated by the role of luck. Yes, the winner of a singing contest is good enough to be the winner, but even the slightest flickering of fate might have changed circumstances to make someone else the winner. Maybe a different shade of makeup would have turned the tables. And yet the rewards of winning and losing are vastly different. While some critics might have aesthetic or ethical objections to winner-take-all outcomes, a mathematical problem with them is that noise is amplified. Therefore, if a societal system depends too much on winner-take-all contests, then the acuity of that system will suffer. It will become less reality-based.
Jaron Lanier (Who Owns the Future?)
Secular Israelis often complain bitterly that the ultra-Orthodox don’t contribute enough to society and live off other people’s hard work. Secular Israelis also tend to argue that the ultra-Orthodox way of life is unsustainable, especially as ultra-Orthodox families have seven children on average.32 Sooner or later, the state will not be able to support so many unemployed people, and the ultra-Orthodox will have to go to work. Yet it might be just the reverse. As robots and AI push humans out of the job market, the ultra-Orthodox Jews may come to be seen as the model for the future rather than as a fossil from the past. Not that everyone will become Orthodox Jews and go to yeshivas to study the Talmud. But in the lives of all people, the quest for meaning and community might eclipse the quest for a job. If we manage to combine a universal economic safety net with strong communities and meaningful pursuits, losing our jobs to algorithms might actually turn out to be a blessing. Losing control over our lives, however, is a much scarier scenario. Notwithstanding the danger of mass unemployment, what we should worry about even more is the shift in authority from humans to algorithms, which might destroy any remaining faith in the liberal story and open the way to the rise of digital dictatorships.
Yuval Noah Harari (21 Lessons for the 21st Century)
The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Every year or so I like to take a step back and look at a few key advertising, marketing, and media facts just to gauge how far removed from reality we advertising experts have gotten. These data represent the latest numbers I could find. I have listed the sources below. So here we go -- 10 facts, direct from the real world: E-commerce in 2014 accounted for 6.5 percent of total retail sales. 96% of video viewing is currently done on a television. 4% is done on a web device. In Europe and the US, people would not care if 92% of brands disappeared. The rate of engagement among a brand's fans with a Facebook post is 7 in 10,000. For Twitter it is 3 in 10,000. Fewer than one standard banner ad in a thousand is clicked on. Over half the display ads paid for by marketers are unviewable. Less than 1% of retail buying is done on a mobile device. Only 44% of traffic on the web is human. One bot-net can generate 1 billion fraudulent digital ad impressions a day. Half of all U.S. online advertising - $10 billion a year - may be lost to fraud. As regular readers know, one of our favorite sayings around The Ad Contrarian Social Club is a quote from Nobel Prize-winning physicist Richard Feynman, who wonderfully declared that “Science is the belief in the ignorance of experts.” I think these facts do a pretty good job of vindicating Feynman.
Bob Hoffman (Marketers Are From Mars, Consumers Are From New Jersey)
Much of the first round of innovation involved pouring old wine—books, newspapers, opinion pieces, journals, songs, television shows, movies—into new digital bottles. But new platforms, services, and social networks are increasingly enabling fresh opportunities for individual imagination and collaborative creativity. Role-playing games and interactive plays are merging with collaborative forms of storytelling and augmented realities. This interplay between technology and the arts will eventually result in completely new forms of expression and formats of media. This innovation will come from people who are able to link beauty to engineering, humanity to technology, and poetry to processors.
Walter Isaacson
a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.”111 Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Just sitting quietly, doing nothing at all, your brain churns through more information in thirty seconds than the Hubble Space Telescope has processed in thirty years. A morsel of cortex one cubic millimeter in size—about the size of a grain of sand—could hold two thousand terabytes of information, enough to store all the movies ever made, trailers included, or about 1.2 billion copies of this book. Altogether, the human brain is estimated to hold something on the order of two hundred exabytes of information, roughly equal to “the entire digital content of today’s world,” according to Nature Neuroscience. If that is not the most extraordinary thing in the universe, then we certainly have some wonders yet to find.
Bill Bryson (The Body: A Guide for Occupants)
Sonnet of Cryptocurrency
The reason people are nuts about cryptocurrency,
Is that they hear the magic phrase regulation-free.
But what they forget to take into account,
Is that it also means the user alone bears liability.
The purpose behind a centralized system,
Is not exploitation but to provide trust and stability.
Anything that is decentralized on the other hand,
Is a breeding ground for fraud and volatility.
Not every fancy innovation is gonna benefit society,
Innovation without accountability is only delusion.
Cryptocurrency can be a great boon to banking,
If it merges with the centralized financial institution.
Intoxication of tech is yet another fundamentalism.
Algorithm without humanity is digital barbarism.
Abhijit Naskar (Hometown Human: To Live for Soil and Society)
then “man-computer symbiosis,” as Licklider called it, will remain triumphant. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities—to forge a partnership in which we let the machines do what they do best, and they let us do what we do best. SOME LESSONS FROM THE JOURNEY Like all historical narratives, the story of the innovations that created the digital age has many strands. So what lessons, in addition to the power of human-machine symbiosis just discussed, might be drawn from the tale? First and foremost is that creativity is a collaborative process. Innovation comes from teams more often than from the lightbulb moments of lone geniuses.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Just a short time ago, reading a book was a part of our natural rhythm, an inclination to find the quiet within the chaos. When we had a few minutes to spare, we often turned to a book. In fact, we yearn for this core sense of peace because we viscerally recognize it. And we have the freedom to claim it, to lean into the quiet and pick up a book. To claim this—to slow down and settle in with a story—this becomes a radical act of self-care. Reading is self-care. As human beings living in a digital age, time-starved and rushing around, printed books are reminders of the time we once had, the time we want to have, and the time we hope to have. Printed books quell the chaos. Printed books make us feel comfortable and make us feel like everything is going to be OK.
Thatcher Wine
Through the fall, the president’s anger seemed difficult to contain. He threatened North Korea with “fire and fury,” then followed up with a threat to “totally destroy” the country. When neo-Nazis and white supremacists held a rally in Charlottesville, Virginia, and one of them killed a protester and injured a score of others, he made a brutally offensive statement condemning violence “on many sides … on many sides”—as if there was moral equivalence between those who were fomenting racial hatred and violence and those who were opposing it. He retweeted anti-Muslim propaganda that had been posted by a convicted criminal leader of a British far-right organization. Then as now, the president’s heedless bullying and intolerance of variance—intolerance of any perception not his own—has been nurturing a strain of insanity in public dialogue that has been long in development, a pathology that became only more virulent when it migrated to the internet. A person such as the president can on impulse and with minimal effort inject any sort of falsehood into public conversation through digital media and call his own lie a correction of “fake news.” There are so many news outlets now, and the competition for clicks is so intense, that any sufficiently outrageous statement made online by anyone with even the faintest patina of authority, and sometimes even without it, will be talked about, shared, and reported on, regardless of whether it has a basis in fact. How do you progress as a culture if you set out to destroy any common agreement as to what constitutes a fact? You can’t have conversations. You can’t have debates. You can’t come to conclusions. At the same time, calling out the transgressor has a way of giving more oxygen to the lie. Now it’s a news story, and the lie is being mentioned not just in some website that publishes unattributable gossip but in every reputable newspaper in the country. I have not been looking to start a personal fight with the president. When somebody insults your wife, your instinctive reaction is to want to lash out in response. When you are the acting director, or deputy director, of the FBI, and the person doing the insulting is the chief executive of the United States, your options have guardrails. I read the president’s tweets, but I had an organization to run. A country to help protect. I had to remain independent, neutral, professional, positive, on target. I had to compartmentalize my emotions. Crises taught me how to compartmentalize. Example: the Boston Marathon bombing—watching the video evidence, reviewing videos again and again of people dying, people being mutilated and maimed. I had the primal human response that anyone would have. But I know how to build walls around that response and had to build them then in order to stay focused on finding the bombers. Compared to experiences like that one, getting tweeted about by Donald Trump does not count as a crisis. I do not even know how to think about the fact that the person with time on his hands to tweet about me and my wife is the president of the United States.
Andrew G. McCabe (The Threat: How the FBI Protects America in the Age of Terror and Trump)
On the other hand, some of the family’s impatience with the public is justified. When I use Federal Express, I accept as a condition of business that its standardized forms must be filled out in printed letters. An e-mail address off by a single character goes nowhere. Transposing two digits in a phone number gets me somebody speaking heatedly in Portuguese. Electronic media tell you instantly when you’ve made an error; with the post office, you have to wait. Haven’t we all at some point tested its humanity? I send mail to friends in Upper Molar, New York (they live in Upper Nyack), and expect a stranger to laugh and deliver it in forty-eight hours. More often than not, the stranger does. With its mission of universal service, the Postal Service is like an urban emergency room contractually obligated to accept every sore throat, pregnancy, and demented parent that comes its way. You may have to wait for hours in a dimly lit corridor. The staff may be short-tempered and dilatory. But eventually you will get treated. In the Central Post Office’s Nixie unit—where mail arrives that has been illegibly or incorrectly addressed—I see street numbers in the seventy thousands; impossible pairings of zip codes and streets; addresses without a name, without a street, without a city; addresses that consist of the description of a building; addresses written in water-based ink that rain has blurred. Skilled Nixie clerks study the orphans one at a time. Either they find a home for them or they apply that most expressive of postal markings, the vermilion finger of accusation that lays the blame squarely on you, the sender.
Jonathan Franzen (How to Be Alone)
Search engines and social networks are analog computers of unprecedented scale. Information is being encoded (and operated upon) as continuous (and noise-tolerant) variables such as frequencies (of connection or occurrence) and the topology of what connects where, with location being increasingly defined by a fault-tolerant template rather than by an unforgiving numerical address. Pulse-frequency coding for the Internet is one way to describe the working architecture of a search engine, and PageRank for neurons is one way to describe the working architecture of the brain. These computational structures use digital components, but the analog computing being performed by the system as a whole exceeds the complexity of the digital code on which it runs. The model (of the social graph, or of human knowledge) constructs and updates itself.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
Is any of it real? I mean, look at this, look at it! A world built on fantasy! Synthetic emotions in the form of pills! Psychological warfare in the form of advertising! Mind altering chemicals in the form of food! Brainwashing seminars in the form of media! Controlled isolated bubbles in the form of social networks. Real? You want to talk about reality? We haven't lived in anything remotely close to it since the turn of the century! We turned it off, took out the batteries, snacked on a bag of GMOs, while we tossed the remnants into the ever expanding dumpster of the human condition. We live in branded houses, trademarked by corporations, built on bipolar numbers, jumping up and down on digital displays, hypnotizing us into the biggest slumber mankind has ever seen. You'd have to dig pretty deep, kiddo, before you can find anything real.
Sam Esmail
Say more about the Crips and the Bloods,” Richard said, stalling for time while he tried to get his mental house in order. “To us they look the same. Urban black kids with similar demographics and tastes. Seems like they all ought to pull together. But that’s not where they’re at. They are shooting each other to death because they see the Other as less than human. And I’m saying it has been the case for a long time in T’Rain that those people we have lately started calling the Earthtone Coalition have always looked at the ones we now call the Forces of Brightness and seen them as tacky, uncultured, not really playing the game in character. And what happened in the last few months was that the F.O.B. types just got tired of it and rose up and, you know, asserted their pride in their identity, kind of like the gay rights movement with those goddamned rainbow flags. And as long as it’s possible for those two groups to identify each other on sight, each one of them is going to see the other as, well, the Other, and killing people based on that is way more ingrained than killing them on this completely bogus and flimsy fake-Good and fake-Evil dichotomy that we were working with before.” “I get it,” Richard said. “But is that all we are? Just digital Crips and Bloods?” “What if it’s true?” Devin shrugged. “Then you’re not doing your fucking job,” Richard said. “Because the world is supposed to have a real story to it. Not just people killing each other over color schemes.” “Maybe you’re not doing yours,” Devin said. “How can I write a story about Good and Evil in a world where those concepts have no real meaning—no consequences?” “What sort of consequences do you have in mind? We can’t send people’s characters to virtual Hell.” “I know. Only Limbo.” They both laughed.
Neal Stephenson (Reamde)
Bertrand Russell famously said: “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” [but] Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset. We care about whether our creation story, our founding legends, our theories of invisible nutrients and germs and forces, our conceptions of the powerful, our suspicions about our enemies, are true or false. That’s because we have the tools to get answers to these questions, or at least to assign them warranted degrees of credence. And we have a technocratic state that should, in theory, put these beliefs into practice. But as desirable as that creed is, it is not the natural human way of believing. In granting an imperialistic mandate to the reality mindset to conquer the universe of belief and push mythology to the margins, we are the weird ones—or, as evolutionary social scientists like to say, the WEIRD ones: Western, Educated, Industrialized, Rich, Democratic. At least, the highly educated among us are, in our best moments. The human mind is adapted to understanding remote spheres of existence through a mythology mindset. It’s not because we descended from Pleistocene hunter-gatherers specifically, but because we descended from people who could not or did not sign on to the Enlightenment ideal of universal realism. Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
Star Wars introduced a new way for using the five screen speakers [in theaters]. By pushing left and right sound channels to the farthest out speakers the pair just inside those was made available. Lucas' mixers then placed low frequency effects in those speakers, and named it the 'baby boom' channel. Human ears can hear frequencies up to around 20,000 hertz, and down to around 20 hertz for very low sounds. Below that you don't *hear* the sound, but if the 'volume' is 'loud' enough, you can *feel* the sound. Super-low frequencies affect us emotionally, usually inducing something like fear. We feel them during earthquakes. Lucasfilm put sound effects in the baby boom channel for audiences to feel--for instance, in the opening shot of Star Wars where the little diplomatic ship is running from the Imperial Cruiser. It's no wonder this is one of the most memorable and ominous shots in cinematic history. It was not only cool looking, but cool *sounding*.
Michael Rubin (Droidmaker: George Lucas and the Digital Revolution)
"So which theory did Lagos believe in? The relativist or the universalist?" "He did not seem to think there was much of a difference. In the end, they are both somewhat mystical. Lagos believed that both schools of thought had essentially arrived at the same place by different lines of reasoning." "But it seems to me there is a key difference," Hiro says. "The universalists think that we are determined by the prepatterned structure of our brains -- the pathways in the cortex. The relativists don't believe that we have any limits." "Lagos modified the strict Chomskyan theory by supposing that learning a language is like blowing code into PROMs -- an analogy that I cannot interpret." "The analogy is clear. PROMs are Programmable Read-Only Memory chips," Hiro says. "When they come from the factory, they have no content. Once and only once, you can place information into those chips and then freeze it -- the information, the software, becomes frozen into the chip -- it transmutes into hardware. After you have blown the code into the PROMs, you can read it out, but you can't write to them anymore. So Lagos was trying to say that the newborn human brain has no structure -- as the relativists would have it -- and that as the child learns a language, the developing brain structures itself accordingly, the language gets 'blown into' the hardware and becomes a permanent part of the brain's deep structure -- as the universalists would have it." "Yes. This was his interpretation." "Okay. So when he talked about Enki being a real person with magical powers, what he meant was that Enki somehow understood the connection between language and the brain, knew how to manipulate it. The same way that a hacker, knowing the secrets of a computer system, can write code to control it -- digital namshubs?" "Lagos said that Enki had the ability to ascend into the universe of language and see it before his eyes. Much as humans go into the Metaverse. That gave him power to create nam-shubs. And nam-shubs had the power to alter the functioning of the brain and of the body." "Why isn't anyone doing this kind of thing nowadays? Why aren't there any namshubs in English?" "Not all languages are the same, as Steiner points out. Some languages are better at metaphor than others. Hebrew, Aramaic, Greek, and Chinese lend themselves to word play and have achieved a lasting grip on reality: Palestine had Qiryat Sefer, the 'City of the Letter,' and Syria had Byblos, the 'Town of the Book.' By contrast other civilizations seem 'speechless' or at least, as may have been the case in Egypt, not entirely cognizant of the creative and transformational powers of language. Lagos believed that Sumerian was an extraordinarily powerful language -- at least it was in Sumer five thousand years ago." "A language that lent itself to Enki's neurolinguistic hacking." "Early linguists, as well as the Kabbalists, believed in a fictional language called the tongue of Eden, the language of Adam. It enabled all men to understand each other, to communicate without misunderstanding. It was the language of the Logos, the moment when God created the world by speaking a word. In the tongue of Eden, naming a thing was the same as creating it. To quote Steiner again, 'Our speech interposes itself between apprehension and truth like a dusty pane or warped mirror. The tongue of Eden was like a flawless glass; a light of total understanding streamed through it. Thus Babel was a second Fall.'
And Isaac the Blind, an early Kabbalist, said that, to quote Gershom Scholem's translation, 'The speech of men is connected with divine speech and all language whether heavenly or human derives from one source: the Divine Name.' The practical Kabbalists, the sorcerers, bore the title Ba'al Shem, meaning 'master of the divine name.'" "The machine language of the world," Hiro says.
Neal Stephenson (Snow Crash)
Is any of it real? I mean look at this. Look at it! A world built on fantasy. Synthetic emotions in the form of pills. Psychological warfare in the form of advertising. Mind-altering chemicals in the form of...food! Brainwashing seminars in the form of media. Controlled isolated bubbles in the form of social networks. Real? You want to talk about reality? We haven't lived in anything remotely close to it since the turn of the century. We turned it off, took out the batteries, snacked on a bag of GMOs while we tossed the remnants in the ever-expanding dumpster of the human condition. We live in branded houses trademarked by corporations built on bipolar numbers jumping up and down on digital displays, hypnotizing us into the biggest slumber mankind has ever seen. You have to dig pretty deep before you can find anything real. We live in a kingdom of bullshit. A kingdom you've lived in for far too long. So don't tell me about not being real. I'm no less real than the fucking beef patty in your Big Mac.
Sam Esmail
The history of slavery provides the spine of this novel. Some texts that offered “deep background” were Boubacar Barry’s Senegambia and the Atlantic Slave Trade, which excavates eighteenth-century slave trading history in Wolof-speaking areas of West Africa, and Walter Rucker’s Gold Coast Diasporas: Identity, Culture, and Power, about Asante peoples of West Africa, those who would come to be called “Coromantee.” Sylviane Diouf’s Servants of Allah: African Muslims Enslaved in the Americas is a must-read for anyone interested in Muslim history on the American side of the Atlantic. And Marcus Rediker’s The Slave Ship: A Human History gives background information about the brutal transatlantic slave trade. In addition, the digitized Georgia Archives provided information about eighteenth-century slave and Native American codes, as well as Land Lottery records. Henry Louis Gates’s edited The Classic Slave Narratives, which include Jacobs’s as well as Frederick Douglass’s autobiographies, continue to be so important to me. Ailey’s family lives
Honorée Fanonne Jeffers (The Love Songs of W.E.B. Du Bois)
Finding the right mentor is not always easy. But we can locate role models in a more accessible place: the stories of great originals throughout history. Human rights advocate Malala Yousafzai was moved by reading biographies of Meena, an activist for equality in Afghanistan, and of Martin Luther King, Jr. King was inspired by Gandhi, as was Nelson Mandela. In some cases, fictional characters can be even better role models. Growing up, many originals find their first heroes in their most beloved novels, where protagonists exercise their creativity in pursuit of unique accomplishments. When asked to name their favorite books, Elon Musk and Peter Thiel each chose “Lord of the Rings”, the epic tale of a hobbit’s adventures to destroy a dangerous ring of power. Sheryl Sandberg and Jeff Bezos both pointed to “A Wrinkle in Time” in which a young girl learns to bend the laws of physics and travels through time. Mark Zuckerberg was partial to “Ender’s Game” where it’s up to a group of kids to save the planet from an alien attack. Jack Ma named his favorite childhood book as “Ali Baba and the Forty Thieves”, about a woodcutter who takes the initiative to change his own fate. … There are studies showing that when children’s stories emphasize original achievements, the next generation innovates more.… Unlike biographies, in fictional stories characters can perform actions that have never been accomplished before, making the impossible seem possible. The inventors of the modern submarine and helicopters were transfixed by Jules Verne’s visions in “20,000 Leagues Under the Sea” and “The Clippership of the Clouds”. One of the earliest rockets was built by a scientist who drew his motivation from an H.G. Wells novel. Some of the earliest mobile phones, tablets, GPS navigators, portable digital storage disks, and multimedia players were designed by people who watched “Star Trek” characters using similar devices. As we encounter these images of originality in history and fiction, the logic of consequence fades away; we no longer worry as much about what will happen if we fail… Instead of causing us to rebel because traditional avenues are closed, the protagonist in our favorite stories may inspire originality by opening our minds to unconventional paths.
Adam M. Grant (Originals: How Non-Conformists Move the World)
a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.” But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.” This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.” It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone?
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
[...]Telecomputer Man is assigned to an apparatus, just as the apparatus is assigned to him, by virtue of an involution of each into the other, a refraction of each by the other. The machine does what the human wants it to do, but by the same token the human puts into execution only what the machine has been programmed to do. The operator is working with virtuality: only apparently is the aim to obtain information or to communicate; the real purpose is to explore all the possibilities of a program, rather as a gambler seeks to exhaust the permutations in a game of chance. Consider the way the camera is used now. Its possibilities are no longer those of a subject who 'reflects' the world according to his personal vision; rather, they are the possibilities of the lens, as exploited by the object. The camera is thus a machine that vitiates all will, erases all intentionality and leaves nothing but the pure reflex needed to take pictures. Looking itself disappears without trace, replaced by a lens now in collusion with the object - and hence with an inversion of vision. The magic lies precisely in the subject's retroversion to a camera obscura - the reduction of his vision to the impersonal vision of a mechanical device. In a mirror, it is the subject who gives free rein to the realm of the imaginary. In the camera lens, and on-screen in general, it is the object, potentially, that unburdens itself - to the benefit of all media and telecommunications techniques. This is why images of anything are now a possibility. This is why everything is translatable into computer terms, commutable into digital form, just as each individual is commutable into his own particular genetic code. (The whole object, in fact, is to exhaust all the virtualities of such analogues of the genetic code: this is one of artificial intelligence's most fundamental aspects.) What this means on a more concrete level is that there is no longer any such thing as an act or event which is not refracted into a technical image or onto a screen, any such thing as an action which does not in some sense want to be photographed, filmed or tape-recorded, does not desire to be stored in memory so as to become reproducible for all eternity. No such thing as an action which does not aspire to self-transcendence into a virtual eternity - not, now, the durable eternity that follows death, but rather the ephemeral eternity of ever-ramifying artificial memory. The compulsion of the virtual is the compulsion to exist in potentia on all screens, to be embedded in all programs, and it acquires a magical force: the Siren call of the black box.
Jean Baudrillard (The Transparency of Evil: Essays in Extreme Phenomena)
I will give technology three definitions that we will use throughout the book. The first and most basic one is that a technology is a means to fulfill a human purpose. For some technologies - oil refining - the purpose is explicit. For others - the computer - the purpose may be hazy, multiple, and changing. As a means, a technology may be a method or process or device: a particular speech recognition algorithm, or a filtration process in chemical engineering, or a diesel engine. It may be simple: a roller bearing. Or it may be complicated: a wavelength division multiplexer. It may be material: an electrical generator. Or it may be nonmaterial: a digital compression algorithm. Whichever it is, it is always a means to carry out a human purpose. The second definition I will allow is a plural one: technology as an assemblage of practices and components. This covers technologies such as electronics or biotechnology that are collections or toolboxes of individual technologies and practices. Strictly speaking, we should call these bodies of technology. But this plural usage is widespread, so I will allow it here. I will also allow a third meaning. This is technology as the entire collection of devices and engineering practices available to a culture. Here we are back to the Oxford's collection of mechanical arts, or as Webster's puts it, "The totality of the means employed by a people to provide itself with the objects of material culture." We use this collective meaning when we blame "technology" for speeding up our lives, or talk of "technology" as a hope for mankind. Sometimes this meaning shades off into technology as a collective activity, as in "technology is what Silicon Valley is all about." I will allow this too as a variant of technology's collective meaning. The technology thinker Kevin Kelly calls this totality the "technium," and I like this word. But in this book I prefer to simply use "technology" for this because that reflects common use. The reason we need three meanings is that each points to technology in a different sense, a different category, from the others. Each category comes into being differently and evolves differently. A technology - singular - the steam engine - originates as a new concept and develops by modifying its internal parts. A technology - plural - electronics - comes into being by building around certain phenomena and components and develops by changing its parts and practices. And technology - general, the whole collection of all technologies that have ever existed past and present, originates from the use of natural phenomena and builds up organically with new elements forming by combination from old ones.
W. Brian Arthur (The Nature of Technology: What It Is and How It Evolves)
The dispersion of the daimonic by means of impersonality has serious and destructive effects. In New York City, it is not regarded as strange that the anonymous human beings secluded in single-room occupancies are so often connected with violent crime and drug addiction. Not that the anonymous individual in New York is alone: he sees thousands of other people every day, and he knows all the famous personalities as they come, via TV, into his single room. He knows their names, their smiles, their idiosyncrasies; they bandy about in a “we're-all-friends-together” mood on the screen which invites him to join them and subtly assumes that he does join them. He knows them all. But he himself is never known. His smile is unseen; his idiosyncrasies are important to nobody; his name is unknown. He remains a foreigner pushed on and off the subway by tens of thousands of other anonymous foreigners. There is a deeply depersonalizing tragedy involved in this. The most severe punishment Yahweh could inflict on his people was to blot out their name. “Their names,” Yahweh proclaims, “shall be wiped out of the book of the living.” This anonymous man's never being known, this aloneness, is transformed into loneliness, which may then become daimonic possession. For his self-doubts—“I don't really exist since I can't affect anyone”—eat away at his innards; he lives and breathes and walks in a loneliness which is subtle and insidious. It is not surprising that he gets a gun and trains it on some passer-by—also anonymous to him. And it is not surprising that the young men in the streets, who are only anonymous digits in their society, should gang together in violent attacks to make sure their assertion is felt. Loneliness and its stepchild, alienation, can become forms of demon possession. Surrendering ourselves to the impersonal daimonic pushes us into an anonymity which is also impersonal; we serve nature's gross purposes on the lowest common denominator, which often means with violence.
Rollo May (Love and Will)
Millions of us daily take advantage of [Skype], delighted to carry the severed heads of family members under our arms as we move from the deck to the cool of inside, or steering them around our new homes, bobbing them like babies on a seasickening tour. Skype can be a wonderful consolation prize in the ongoing tournament of globalization, though typically the first place it transports us is to ourselves. How often are the initial seconds of a video call's takeoff occupied by two wary, diagonal glances, with a quick muss or flick of the hair, or a more generous tilt of the screen in respect to the chin? Please attend to your own mask first. Yet, despite the obvious cheer of seeing a faraway face, lonesomeness surely persists in the impossibility of eye contact. You can offer up your eyes to the other person, but your own view will be of the webcam's unwarm aperture. ... The problem lies in the fact that we can't bring our silence with us through walls. In phone conversations, while silence can be both awkward and intimate, there is no doubt that each of you inhabits the same darkness, breathing the same dead air. Perversely, a phone silence is a thick rope tying two speakers together in the private void of their suspended conversation. This binding may be unpleasant and to be avoided, but it isn't as estranging as its visual counterpart. When talk runs to ground on Skype, and if the purpose of the call is to chat, I can quickly sense that my silence isn't their silence. For some reason silence can't cross the membrane of the computer screen as it can uncoil down phone lines. While we may be lulled into thinking that a Skype call, being visual, is more akin to a hang-out than a phone conversation, it is in many ways more demanding than its aural predecessor. Not until Skype has it become clear how much companionable quiet has depended on co-inhabiting an atmosphere, with a simple act of sharing the particulars of a place -- the objects in the room, the light through the window -- offering a lovely alternative to talk.
Laurence Scott (The Four-Dimensional Human: Ways of Being in the Digital World)
Jenna is acting strange. Weeping, moping, even remarks tending toward belittlement Melmoth might tolerate (although he cannot think why; she is not his wife and even in human females PMS is a plague of the past) but when he caught her lying about Raquel—udderly wonderful, indeed—he knew the problem was serious. After sex, Melmoth powers her down. He retrieves her capsule from underground storage, a little abashed to be riding up with the oblong vessel in a lobby elevator where anyone might see. Locked vertical for easy transport, the capsule on its castors and titanium carriage stands higher than Melmoth is tall. He cannot help feeling that its translucent pink upper half and tapered conical roundness make it look like an erect penis. Arriving at penthouse level, he wheels it into his apartment. Once inside his private quarters, he positions it beside the hoverbed and enters a six-character alphanumeric open-sesame to spring the lid. On an interior panel, Melmoth touches a sensor for AutoRenew. Gold wands deploy from opposite ends and set up a zero-gravity field that levitates Jenna from the topsheet. As if by magic—to Melmoth it is magic—the inert form of his personal android companion floats four feet laterally and gentles to rest in a polymer cradle contoured to her default figure. Jenna is only a SmartBot. She does not breathe, blood does not run in her arteries and veins. She has no arteries or veins, nor a heart, nor anything in the way of organic tissue. She can be replaced in a day—she can be replaced right now. If Melmoth touches “Upgrade,” the capsule lid will seal and lock, all VirtuLinks to Jenna will break, and a courier from GlobalDigital will collect the unit from a cargo bay of Melmoth’s high-rise after delivering a new model to Melmoth himself. It distresses him, how easy replacement would be, as if Jenna were no more abiding than an oldentime car he might decide one morning to trade-in. Seeing her in the capsule is bad enough; the poor thing looks as if she is lying in her coffin. Melmoth does not select “Power Down” on his cerebral menu any more often than he must. Only to update her software does Melmoth resort to pulling Jenna’s plug. Updating, too, disturbs him. In authorizing it, he cannot pretend she is human. [pp. 90-91]
John Lauricella (2094)