Analog Digital Quotes

We've searched our database for all the quotes and captions related to Analog Digital. Here they are! All 100 of them:

You are an analog girl, living in a digital world.
Neil Gaiman (American Gods (American Gods, #1))
Reality is not digital, an on-off state, but analog. Something gradual. In other words, reality is a quality that things possess in the same way that they possess, say, weight. Some people are more real than others, for example. It has been estimated that there are only about five hundred real people on any given planet, which is why they keep unexpectedly running into one another all the time.
Terry Pratchett (Moving Pictures (Discworld, #10; Industrial Revolution, #1))
There was a romance to that analog era, an ardency, an otherness that is missing in the post-Empire digital age where everything has ultimately come to feel disposable.
Bret Easton Ellis (White)
For most digital-age writers, writing is rewriting. We grope, cut, block, paste, and twitch, panning for gold onscreen by deleting bucketloads of crap. Our analog ancestors had to polish every line mentally before hammering it out mechanically. Rewrites cost them months, meters of ink ribbon, and pints of Tippex. Poor sods.
David Mitchell (The Bone Clocks)
Maybe I just have everything backwards. Maybe it's a problem of perspective. In this Post-Modern Age perhaps it is the digital experiences we ought to cheer as "genuine" and not those troublesome and inconvenient analog ones. Looking at it all fucking backwards.
Caitlín R. Kiernan
You can't live in the digital and die in the analog.
Dean Cavanagh
Our metaphors for the operation of the brain are frequently drawn from the production line. We think of the brain as a glorified sausage machine, taking in information from the senses, processing it and regurgitating it in a different form, as thoughts or actions. The digital computer reinforces this idea because it is quite explicitly a machine that does to information what a sausage machine does to pork. Indeed, the brain was the original inspiration and metaphor for the development of the digital computer, and early computers were often described as 'giant brains'. Unfortunately, neuroscientists have sometimes turned this analogy on its head, and based their models of brain function on the workings of the digital computer (for example by assuming that memory is separate and distinct from processing, as it is in a computer). This makes the whole metaphor dangerously self-reinforcing.
Steve Grand (Creation: Life and How to Make It)
Vreal filed away digital copies in triplicate, four analog copies, threw two copies away, shredded one, and finally, ate another. He was rather compulsive.
Jason Z. Christie (Perfect Me)
eBooks are just digital copies of analog books. Convenient, yes. But we have the technology now to rethink what a book is.
David Conger
It’s the digitals. Leith has that word he uses for the shift from analogs to digitals. That word he uses about eleven times an hour.
David Foster Wallace (Infinite Jest)
The machine seemed to understand time and space, but it didn’t, not as we do. We are analog, fluid, swimming in a flowing sea of events, where one moment contains the next, is the next, since the notion of “moment” itself is the illusion. The machine—it—is digital, and digital is the decision to forget the idea of the infinitely moving wave, and just take snapshots, convincing yourself that if you take enough pictures, it won’t matter that you’ve left out the flowing, continuous aspect of things. You take the mimic for the thing mimicked and say, Good enough. But now I knew that between one pixel and the next—no matter how densely together you packed them—the world still existed, down to the finest grain of the stuff of the universe. And no matter how frequently that mouse located itself, sample after sample, snapshot after snapshot—here, now here, now here—something was always happening between the here’s. The mouse was still moving—was somewhere, but where? It couldn’t say. Time, invisible, was slipping through its digital now’s.
Ellen Ullman (The Bug)
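
Ullman's mouse slipping through its digital "now's" is, at bottom, a description of sampling. A minimal sketch in Python, assuming nothing beyond the standard library: a continuous motion is reduced to snapshots, and whatever happens between two samples goes unrecorded.

    import math

    # A continuous motion: position as a function of time (the "flowing wave").
    def position(t: float) -> float:
        return math.sin(2 * math.pi * t)

    # Digital sampling: take snapshots at a fixed interval and discard the rest.
    SAMPLE_INTERVAL = 0.25
    snapshots = [(n * SAMPLE_INTERVAL, position(n * SAMPLE_INTERVAL)) for n in range(5)]

    for t, x in snapshots:
        print(f"t={t:.2f}  x={x:+.3f}")

    # Between t=0.00 and t=0.25 the point sweeps through infinitely many
    # positions; the samples record none of them -- the gap "between the
    # here's" that the passage describes.
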
In childhood, he declared, the word-rich get richer and the word-poor get poorer, a phenomenon he called the “Matthew Effect” after a passage in the New Testament. There is also a Matthew-Emerson Effect for background knowledge: those who have read widely and well will have many resources to apply to what they read; those who do not will have less to bring, which, in turn, gives them less basis for inference, deduction, and analogical thought and makes them ripe for falling prey to unadjudicated information, whether fake news or complete fabrications. Our young will not know what they do not know. Others, too. Without sufficient background
Maryanne Wolf (Reader, Come Home: The Reading Brain in a Digital World)
Plan in Analog — spend time in analog before jumping to digital
Carmine Gallo (The Presentation Secrets of Steve Jobs)
Digital assets will be much bigger and faster than analog assets, and decentralized products will be much bigger than centralized products.
Olawale Daniel
Are my thoughts too analog for this digital age?
Henry Reign (No More Normies: The First Notebook)
The industrial age was driven by analog copies—exact and cheap. The information age is driven by digital copies—exact and free.
Kevin Kelly (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future)
It could modulate and demodulate (hence the name) an analog signal, like that carried by a telephone circuit, in order to transmit and receive digital information.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
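
The modulate/demodulate round trip Isaacson describes can be sketched as toy frequency-shift keying. This is an illustrative scheme only, not a claim about any real modem standard; the two tone frequencies are arbitrary choices.

    # Toy frequency-shift keying: each bit becomes a tone (modulation),
    # and the receiver maps tones back to bits (demodulation).
    LOW_HZ, HIGH_HZ = 1070, 1270  # illustrative tone choices

    def modulate(bits: str) -> list[int]:
        """Digital -> analog: one tone frequency per bit."""
        return [HIGH_HZ if b == "1" else LOW_HZ for b in bits]

    def demodulate(tones: list[int]) -> str:
        """Analog -> digital: decide each bit by which tone is nearer."""
        midpoint = (LOW_HZ + HIGH_HZ) / 2
        return "".join("1" if f > midpoint else "0" for f in tones)

    message = "1011001"
    assert demodulate(modulate(message)) == message
    print("round trip ok:", message)
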
One fundamental problem, as mathematicians now realize, is that they made a crucial error fifty years ago in thinking the brain was analogous to a large digital computer.
Michio Kaku (Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100)
Algorithms are crude. Computers are machines. Data science is trying to make digital sense of an analog world.
Christian Rudder (Dataclysm: Love, Sex, Race, and Identity--What Our Online Lives Tell Us about Our Offline Selves)
Nearly all habits that lead us to calm exist in one place: the analog world. The more time we spend in the analog world, as opposed to the digital one, the calmer we become. We best unwind in the analog world, acting in accordance with how our ancient brain is wired.
Chris Bailey (How to Calm Your Mind: Finding Presence and Productivity in Anxious Times)
Many of the people libraries serve today are ill equipped to take advantage of all the great things about the digital present and future. Since libraries must be guided by those they serve, they will be awkwardly straddling the analog and the digital for some period of time.
John Palfrey (BiblioTech: Why Libraries Matter More Than Ever in the Age of Google)
It is important to note that the design of an entire brain region is simpler than the design of a single neuron. As discussed earlier, models often get simpler at a higher level—consider an analogy with a computer. We do need to understand the detailed physics of semiconductors to model a transistor, and the equations underlying a single real transistor are complex. A digital circuit that multiplies two numbers requires hundreds of them. Yet we can model this multiplication circuit very simply with one or two formulas. An entire computer with billions of transistors can be modeled through its instruction set and register description, which can be described on a handful of written pages of text and formulas. The software programs for an operating system, language compilers, and assemblers are reasonably complex, but modeling a particular program—for example, a speech recognition program based on hierarchical hidden Markov modeling—may likewise be described in only a few pages of equations. Nowhere in such a description would be found the details of semiconductor physics or even of computer architecture. A similar observation holds true for the brain. A particular neocortical pattern recognizer that detects a particular invariant visual feature (such as a face) or that performs a bandpass filtering (restricting input to a specific frequency range) on sound or that evaluates the temporal proximity of two events can be described with far fewer specific details than the actual physics and chemical relations controlling the neurotransmitters, ion channels, and other synaptic and dendritic variables involved in the neural processes. Although all of this complexity needs to be carefully considered before advancing to the next higher conceptual level, much of it can be simplified as the operating principles of the brain are revealed.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
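
Kurzweil's point about levels of description can be made concrete in a few lines. A sketch comparing a one-formula model of multiplication with a shift-and-add model, a rough stand-in for the gate-level circuit he mentions: both compute the same function, but one carries mechanism the other can safely ignore.

    # High-level model of a multiplier: one formula.
    def multiply_high_level(a: int, b: int) -> int:
        return a * b

    # Lower-level model: shift-and-add, roughly how a simple binary
    # multiplier circuit works (still far above transistor physics).
    def multiply_shift_add(a: int, b: int) -> int:
        result = 0
        while b:
            if b & 1:      # low bit of the multiplier is set: add
                result += a
            a <<= 1        # shift the multiplicand left
            b >>= 1        # consume one bit of the multiplier
        return result

    assert multiply_shift_add(37, 91) == multiply_high_level(37, 91)
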
In the early days of adopting a conversation-centric mindset, you might miss the security blanket of what Stephen Colbert astutely labeled "little sips of online connection," and the sudden loss of weak ties to the fringes of your social network might induce moments of loneliness. But as you trade more of this time for conversation, the richness of these analog interactions will far outweigh what you're leaving behind.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
Web 2.0 is our code word for the analog increasingly supervening upon the digital—reversing how digital logic was embedded in analog components, sixty years ago. Search engines and social networks are just the beginning—the Precambrian phase. “If the only demerit of the digital expansion system were its greater logical complexity, nature would not, for this reason alone, have rejected it,” von Neumann admitted in 1948.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
There's an analogy to be made between our craving for story and our craving for food. A tendency to overeat served our ancestors well when food shortages were a predictable part of life. But now that we modern desk jockeys are awash in cheap grease and corn syrup, overeating is more likely to fatten us up and kill us young. Likewise, it could be that an intense greed for story was healthy for our ancestors but has some harmful consequences in a world where books, MP3 players, TVs, and iPhones make story omnipresent - and where we have, in romance novels and television shows such as Jersey Shore, something like the story equivalent of deep-fried Twinkies. I think the literary scholar Brian Boyd is right to wonder if overconsuming in a world awash with junk story could lead to something like a "mental diabetes epidemic." Similarly, as digital technology evolves, our stories - ubiquitous, immersive, interactive - may become dangerously attractive. The real threat isn't that story will fade out of human life in the future; it's that story will take it over completely.
Jonathan Gottschall (The Storytelling Animal: How Stories Make Us Human)
Humans are conversant in many media (music, dance, painting), but all of them are analog except for the written word, which is naturally expressed in digital form (i.e. it is a series of discrete symbols—every letter in every book is a member of a certain character set, every “a” is the same as every other “a,” and so on). As any communications engineer can tell you, digital signals are much better to work with than analog ones because they are easily copied, transmitted, and error-checked. Unlike analog signals, they are not doomed to degradation over time and distance. That
Neal Stephenson (In the Beginning...Was the Command Line)
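
The engineering claim in this passage, that digital copies are exact and cheaply error-checked, can be sketched with a hash as the check; a minimal example assuming only Python's standard hashlib module.

    import hashlib

    original = "every 'a' is the same as every other 'a'".encode("utf-8")

    # A digital copy is the same bytes, bit for bit.
    copy = bytes(original)

    # Error-checking is cheap: compare digests rather than trusting the channel.
    assert hashlib.sha256(copy).hexdigest() == hashlib.sha256(original).hexdigest()
    print("copy verified: no generational loss")

    # An analog copy (a tape of a tape) has no equivalent check; each
    # generation adds noise that nothing downstream can detect or undo.
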
Complex networks—of molecules, people, or ideas—constitute their own simplest behavioral descriptions. This behavior can be more easily captured by continuous, analog networks than it can be defined by digital, algorithmic codes. These analog networks may be composed of digital processors, but it is in the analog domain that the interesting computation is being performed. “The purely ‘digital’ procedure is probably more circumstantial and clumsy than necessary,” von Neumann warned in 1951. “Better, and better integrated, mixed procedures may exist.” Analog is back, and here to stay.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
An “infinite number”? For Leonardo, that was not just a figure of speech. When he spoke of the infinite variety in nature, and especially of phenomena such as flowing water, he was making a distinction based on his preference for analog over digital systems. In an analog system, there are infinite gradations. That applies to most of the things that fascinated him: sfumato shadows, colors, movement, waves, the passage of time, the flow of fluids. That is why he believed that geometry was better than arithmetic at describing nature, and even though calculus had not yet been invented, he seemed to sense the need for such a mathematics of continuous quantities.
Walter Isaacson (Leonardo da Vinci)
Meme: A term introduced by the biologist Richard Dawkins in his 1976 book The Selfish Gene. Dawkins defined memes as small cultural units of transmission, analogous to genes, which are spread from person to person by copying or imitation. Examples of memes in his pioneering essay include cultural artifacts such as melodies, catchphrases, and clothing fashions, as well as abstract beliefs. Like genes, memes are defined as replicators that undergo variation, competition, selection, and retention. At any given moment, many memes are competing for the attention of hosts; however, only memes suited to their sociocultural environment spread successfully, while others become extinct.
Limor Shifman (Memes in Digital Culture)
By tracing the early history of USCYBERCOM it is possible to understand some of the reasons why the military has focused almost completely on network defense and cyber attack while being unaware of the need to address the vulnerabilities in systems that could be exploited in future conflicts against technologically capable adversaries. It is a problem mirrored in most organizations. The network security staff are separate from the endpoint security staff who manage desktops through patch and vulnerability management tools and ensure that software and anti-virus signatures are up to date. Meanwhile, the development teams that create new applications, web services, and digital business ventures, work completely on their own with little concern for security. The analogous behavior observed in the military is the creation of new weapons systems, ISR platforms, precision targeting, and C2 capabilities without ensuring that they are resistant to the types of attacks that USCYBERCOM and the NSA have been researching and deploying. USCYBERCOM had its genesis in NCW thinking. First the military worked to participate in the information revolution by joining their networks together. Then it recognized the need for protecting those networks, now deemed cyberspace. The concept that a strong defense requires a strong offense, carried over from missile defense and Cold War strategies, led to a focus on network attack and less emphasis on improving resiliency of computing platforms and weapons systems.
Richard Stiennon (There Will Be Cyberwar: How The Move To Network-Centric Warfighting Has Set The Stage For Cyberwar)
Search engines and social networks are analog computers of unprecedented scale. Information is being encoded (and operated upon) as continuous (and noise-tolerant) variables such as frequencies (of connection or occurrence) and the topology of what connects where, with location being increasingly defined by a fault-tolerant template rather than by an unforgiving numerical address. Pulse-frequency coding for the Internet is one way to describe the working architecture of a search engine, and PageRank for neurons is one way to describe the working architecture of the brain. These computational structures use digital components, but the analog computing being performed by the system as a whole exceeds the complexity of the digital code on which it runs. The model (of the social graph, or of human knowledge) constructs and updates itself.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
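
The "PageRank for neurons" line refers to a real algorithm, and its analog flavor shows even in miniature. A power-iteration sketch over an invented four-page web, using the standard damping factor of 0.85: rank flows as a continuous quantity over the link topology, even though every component is digital.

    # Minimal PageRank by power iteration on a toy four-page web.
    # links[i] lists the pages that page i links to.
    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
    N, damping = len(links), 0.85

    rank = [1.0 / N] * N
    for _ in range(50):  # iterate toward the fixed point
        new_rank = [(1 - damping) / N] * N
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank

    for page, score in enumerate(rank):
        print(f"page {page}: {score:.3f}")
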
Who among us can predict the future? Who would dare to? The answer to the first question is no one, really, and the answer to the second is everyone, especially every government and business on the planet. This is what that data of ours is used for. Algorithms analyze it for patterns of established behavior in order to extrapolate behaviors to come, a type of digital prophecy that’s only slightly more accurate than analog methods like palm reading. Once you go digging into the actual technical mechanisms by which predictability is calculated, you come to understand that its science is, in fact, anti-scientific, and fatally misnamed: predictability is actually manipulation. A website that tells you that because you liked this book you might also like books by James Clapper or Michael Hayden isn’t offering an educated guess as much as a mechanism of subtle coercion.
Edward Snowden (Permanent Record)
"So which theory did Lagos believe in? The relativist or the universalist?" "He did not seem to think there was much of a difference. In the end, they are both somewhat mystical. Lagos believed that both schools of thought had essentially arrived at the same place by different lines of reasoning." "But it seems to me there is a key difference," Hiro says. "The universalists think that we are determined by the prepatterned structure of our brains -- the pathways in the cortex. The relativists don't believe that we have any limits." "Lagos modified the strict Chomskyan theory by supposing that learning a language is like blowing code into PROMs -- an analogy that I cannot interpret." "The analogy is clear. PROMs are Programmable Read-Only Memory chips," Hiro says. "When they come from the factory, they have no content. Once and only once, you can place information into those chips and then freeze it -- the information, the software, becomes frozen into the chip -- it transmutes into hardware. After you have blown the code into the PROMs, you can read it out, but you can't write to them anymore. So Lagos was trying to say that the newborn human brain has no structure -- as the relativists would have it -- and that as the child learns a language, the developing brain structures itself accordingly, the language gets 'blown into' the hardware and becomes a permanent part of the brain's deep structure -- as the universalists would have it." "Yes. This was his interpretation." "Okay. So when he talked about Enki being a real person with magical powers, what he meant was that Enki somehow understood the connection between language and the brain, knew how to manipulate it. The same way that a hacker, knowing the secrets of a computer system, can write code to control it -- digital namshubs?" "Lagos said that Enki had the ability to ascend into the universe of language and see it before his eyes. Much as humans go into the Metaverse. That gave him power to create nam-shubs. And nam-shubs had the power to alter the functioning of the brain and of the body." "Why isn't anyone doing this kind of thing nowadays? Why aren't there any namshubs in English?" "Not all languages are the same, as Steiner points out. Some languages are better at metaphor than others. Hebrew, Aramaic, Greek, and Chinese lend themselves to word play and have achieved a lasting grip on reality: Palestine had Qiryat Sefer, the 'City of the Letter,' and Syria had Byblos, the 'Town of the Book.' By contrast other civilizations seem 'speechless' or at least, as may have been the case in Egypt, not entirely cognizant of the creative and transformational powers of language. Lagos believed that Sumerian was an extraordinarily powerful language -- at least it was in Sumer five thousand years ago." "A language that lent itself to Enki's neurolinguistic hacking." "Early linguists, as well as the Kabbalists, believed in a fictional language called the tongue of Eden, the language of Adam. It enabled all men to understand each other, to communicate without misunderstanding. It was the language of the Logos, the moment when God created the world by speaking a word. In the tongue of Eden, naming a thing was the same as creating it. To quote Steiner again, 'Our speech interposes itself between apprehension and truth like a dusty pane or warped mirror. The tongue of Eden was like a flawless glass; a light of total understanding streamed through it. Thus Babel was a second Fall.'
And Isaac the Blind, an early Kabbalist, said that, to quote Gershom Scholem's translation, 'The speech of men is connected with divine speech and all language whether heavenly or human derives from one source: the Divine Name.' The practical Kabbalists, the sorcerers, bore the title Ba'al Shem, meaning 'master of the divine name.'" "The machine language of the world," Hiro says.
Neal Stephenson (Snow Crash)
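
The PROM behavior Hiro spells out (no content from the factory, programmable once and only once, read-only thereafter) is easy to model. A sketch built around a hypothetical WriteOnceMemory class, invented here for illustration:

    # A hypothetical model of a PROM: no content from the factory,
    # programmable exactly once, read-only thereafter.
    class WriteOnceMemory:
        def __init__(self, size: int):
            self._cells = [0] * size
            self._programmed = False

        def program(self, data: list[int]) -> None:
            if self._programmed:
                raise PermissionError("PROM already blown; it is hardware now")
            self._cells = list(data)
            self._programmed = True

        def read(self) -> list[int]:
            return list(self._cells)

    prom = WriteOnceMemory(4)
    prom.program([0xDE, 0xAD, 0xBE, 0xEF])   # once and only once
    print(prom.read())
    # A second prom.program(...) call would raise: the code has
    # "transmuted into hardware," as Lagos's analogy has it.
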
Let’s step back. Every year between 1950 and 2000, Americans increased their productivity about 1 to 4 percent. Since 2005, however, this growth has slowed in advanced economies, with a productivity decrease recorded in the United States in 2016. Maybe our rapidly evolving technology that promises us near-limitless options to keep us busy is not, in fact, making us more productive? One possible explanation for our productivity slowdown is that we’re paralyzed by information overload. As Daniel Levitin writes in The Organized Mind, information overload is worse for our focus than exhaustion or smoking marijuana. It stands to reason, then, that to be more productive we need a way to stem the tide of digital distractions. Enter the Bullet Journal, an analog solution that provides the offline space needed to process, to think, and to focus. When you open your notebook, you automatically unplug. It momentarily pauses the influx of information so your mind can catch up. Things become less of a blur, and you can finally examine your life with greater clarity.
Ryder Carroll (The Bullet Journal Method: Track the Past, Order the Present, Design the Future)
Computational models of the mind would make sense if what a computer actually does could be characterized as an elementary version of what the mind does, or at least as something remotely like thinking. In fact, though, there is not even a useful analogy to be drawn here. A computer does not even really compute. We compute, using it as a tool. We can set a program in motion to calculate the square root of pi, but the stream of digits that will appear on the screen will have mathematical content only because of our intentions, and because we—not the computer—are running algorithms. The computer, in itself, as an object or a series of physical events, does not contain or produce any symbols at all; its operations are not determined by any semantic content but only by binary sequences that mean nothing in themselves. The visible figures that appear on the computer’s screen are only the electronic traces of sets of binary correlates, and they serve as symbols only when we represent them as such, and assign them intelligible significances. The computer could just as well be programmed so that it would respond to the request for the square root of pi with the result “Rupert Bear”; nor would it be wrong to do so, because an ensemble of merely material components and purely physical events can be neither wrong nor right about anything—in fact, it cannot be about anything at all. Software no more “thinks” than a minute hand knows the time or the printed word “pelican” knows what a pelican is. We might just as well liken the mind to an abacus, a typewriter, or a library. No computer has ever used language, or responded to a question, or assigned a meaning to anything. No computer has ever so much as added two numbers together, let alone entertained a thought, and none ever will. The only intelligence or consciousness or even illusion of consciousness in the whole computational process is situated, quite incommutably, in us; everything seemingly analogous to our minds in our machines is reducible, when analyzed correctly, only back to our own minds once again, and we end where we began, immersed in the same mystery as ever. We believe otherwise only when, like Narcissus bent above the waters, we look down at our creations and, captivated by what we see reflected in them, imagine that another gaze has met our own.
David Bentley Hart (The Experience of God: Being, Consciousness, Bliss)
Back in the early 1990s, the FBI started worrying about its ability to conduct telephone surveillance. The FBI could do it with the old analog phone switches: a laborious process involving alligator clips, wires, and a tape recorder. The problem was that digital switches didn’t work that way. Isolating individual connections was harder, and the FBI became concerned about the potential loss of its ability to wiretap. So it lobbied Congress hard and got a law passed in 1994 called the Communications Assistance for Law Enforcement Act, or CALEA, requiring telcos to re-engineer their digital switches to have eavesdropping capabilities built in. Fast-forward 20 years, and the FBI again wants the IT industry to make surveillance easier for itself. A lot of communications no longer happen over the telephone. They’re happening over chat. They’re happening over e-mail. They’re happening over Skype. The FBI is currently lobbying for a legislative upgrade to CALEA, one that covers all communications systems: all voice, video, and text systems, including World of Warcraft and that little chat window attached to your online Scrabble game. The FBI’s ultimate goal is government prohibition of truly secure communications. Valerie
Bruce Schneier (Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World)
GCHQ has traveled a long and winding road. That road stretches from the wooden huts of Bletchley Park, past the domes and dishes of the Cold War, and on towards what some suggest will be the omniscient state of the Brave New World. As we look to the future, the docile and passive state described by Aldous Huxley in his Brave New World is perhaps a more appropriate analogy than the strictly totalitarian predictions offered by George Orwell's Nineteen Eighty-Four. Bizarrely, many British citizens are quite content in this new climate of hyper-surveillance, since it's their own lifestyle choices that helped to create the 'wired world' - or even wish for it, for as we have seen, the new torrents of data have been a source of endless trouble for the overstretched secret agencies. As Ken Macdonald rightly points out, the real drivers of our wired world have been private companies looking for growth, and private individuals in search of luxury and convenience at the click of a mouse. The sigint agencies have merely been handed the impossible task of making an interconnected society perfectly secure and risk-free, against the background of a globalized world that presents many unprecedented threats, and now has few boundaries or borders to protect us. Who, then, is to blame for the rapid intensification of electronic surveillance? Instinctively, many might reply Osama bin Laden, or perhaps Pablo Escobar. Others might respond that governments have used these villains as a convenient excuse to extend state control. At first glance, the massive growth of security, which includes not only eavesdropping but also biometric monitoring, face recognition, universal fingerprinting and the gathering of DNA, looks like a sad response to new kinds of miscreants. However, the sad reality is that the Brave New World that looms ahead of us is ultimately a reflection of ourselves. It is driven by technologies such as text messaging and customer loyalty cards that we are free to accept or reject as we choose. The public debate on surveillance is often cast in terms of a trade-off between security and privacy. The truth is that luxury and convenience have been pre-eminent themes in the last decade, and we have given them a much higher priority than either security or privacy. We have all embraced the world of surveillance with remarkable eagerness, surfing the Internet in a global search for a better bargain, better friends, even a better partner. GCHQ's vast new circular headquarters is sometimes represented as a 'ring of power', exercising unparalleled levels of surveillance over citizens at home and abroad, collecting every email, every telephone call and every instance of internet access. It has even been asserted that GCHQ is engaged in nothing short of 'algorithmic warfare' as part of a battle for control of global communications. By contrast, the occupants of 'Cheltenham's Doughnut' claim that in reality they are increasingly weak, having been left behind by the unstoppable electronic communications that they cannot hope to listen to, still less analyse or make sense of. In fact, the frightening truth is that no one is in control. No person, no intelligence agency and no government is steering the accelerating electronic processes that may eventually enslave us. Most of the devices that cause us to leave a continual digital trail of everything we think or do were not devised by the state, but are merely symptoms of modernity. GCHQ is simply a vast mirror, and it reflects the spirit of the age.
Richard J. Aldrich (GCHQ)
Look around on your next plane trip. The iPad is the new pacifier for babies and toddlers… Parents and other passengers read on Kindles… Unbeknownst to most of us, an invisible, game-changing transformation links everyone in this picture: the neuronal circuit that underlies the brain’s ability to read is subtly, rapidly changing… As work in neurosciences indicates, the acquisition of literacy necessitated a new circuit in our species’ brain more than 6,000 years ago… My research depicts how the present reading brain enables the development of some of our most important intellectual and affective processes: internalized knowledge, analogical reasoning, and inference; perspective-taking and empathy; critical analysis and the generation of insight. Research surfacing in many parts of the world now cautions that each of these essential “deep reading” processes may be under threat as we move into digital-based modes of reading… Increasing reports from educators and from researchers in psychology and the humanities bear this out. English literature scholar and teacher Mark Edmundson describes how many college students actively avoid the classic literature of the 19th and 20th centuries because they no longer have the patience to read longer, denser, more difficult texts. We should be less concerned with students’ “cognitive impatience,” however, than by what may underlie it: the potential inability of large numbers of students to read with a level of critical analysis sufficient to comprehend the complexity of thought and argument found in more demanding texts… Karin Littau and Andrew Piper have noted another dimension: physicality. Piper, Littau and Anne Mangen’s group emphasize that the sense of touch in print reading adds an important redundancy to information – a kind of “geometry” to words, and a spatial “thereness” for text. As Piper notes, human beings need a knowledge of where they are in time and space that allows them to return to things and learn from re-examination – what he calls the “technology of recurrence”. The importance of recurrence for both young and older readers involves the ability to go back, to check and evaluate one’s understanding of a text. The question, then, is what happens to comprehension when our youth skim on a screen whose lack of spatial thereness discourages “looking back.”
Maryanne Wolf
Samsung digital TVs depended on countries making the switch from analog to digital television, which is why Samsung TVs did not take over world market share until this century.
Euny Hong (The Birth of Korean Cool: How One Nation Is Conquering the World Through Pop Culture)
48-track tape machines out there. Check out the Internet for used analog and digital tape machines made by Ampex, MCI, Otari, Revox, Sony, and Studer. Read the reviews and note the prices, and as soon as your budget allows, pick one up. It’ll be a smart move.
Robert Wolff (How to Make It in the New Music Business -- Now With the Tips You've Been Asking For!)
“analogue computers are stupidly named; they should be named continuous computers.” For real-world questions—especially ambiguous ones—analog computing can be faster, more accurate, and more robust, not only at computing the answers, but also at asking the questions and communicating the results.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
Our steps toward a more productive spiritual path begin with recognizing the need for change. We can no longer afford to seek analog solutions to digital problems.
Kevin Powell (The Black Male Handbook: A Blueprint for Life)
miscalculations, learning quickly, and moving on—is much easier with a strong mind-set to begin with. Failures often help make emotionally healthy people even better and stronger. With a strong mind, you may even make a breakthrough after a crisis or failure. You can learn to condition your mind and body and enhance your EQ. I will explain why and how in this section. But first, here is an analogy that helps distill and define the above components, given that your generation is far more technically astute in our increasingly interconnected, sophisticated, and inundated digital age: Your pragmatic psychology in your brain and nervous system is
Jason L. Ma (Young Leaders 3.0: Stories, Insights, and Tips for Next-Generation Achievers)
Yeah, young coders abound, but mostly only string together preassembled digital beads, and even today's brightest young nerdlets aren't immune to eventual wrinkles. As for real experts, well, as the dawn of the computer age recedes, so too have the hairlines of your true computer wizards, the males I mean. We females never change, we are eternally young.
Rajnar Vajra (Analog Science Fiction and Fact, 2013 January/February)
In thrall to the era of Little Digital, we overlook that this is still the era of Big Analog.
Anonymous
Digital disruption is a mindset that ultimately leads to a way of behaving; a mindset that bypasses traditional analog barriers, eliminating the gaps and boundaries that prevent people and companies from giving customers what they want in the moment that they want it.
James McQuivey (Digital Disruption: Unleashing the Next Wave of Innovation)
Creativity has often been analogized as “Thinking outside of the box.”
Pearl Zhu (Thinkingaire: 100 Game Changing Digital Mindsets to Compete for the Future)
The S curve is not just important as a model in its own right; it’s also the jack-of-all-trades of mathematics. If you zoom in on its midsection, it approximates a straight line. Many phenomena we think of as linear are in fact S curves, because nothing can grow without limit. Because of relativity, and contra Newton, acceleration does not increase linearly with force, but follows an S curve centered at zero. So does electric current as a function of voltage in the resistors found in electronic circuits, or in a light bulb (until the filament melts, which is itself another phase transition). If you zoom out from an S curve, it approximates a step function, with the output suddenly changing from zero to one at the threshold. So depending on the input voltages, the same curve represents the workings of a transistor in both digital computers and analog devices like amplifiers and radio tuners. The early part of an S curve is effectively an exponential, and near the saturation point it approximates exponential decay. When someone talks about exponential growth, ask yourself: How soon will it turn into an S curve? When will the population bomb peter out, Moore’s law lose steam, or the singularity fail to happen? Differentiate an S curve and you get a bell curve: slow, fast, slow becomes low, high, low. Add a succession of staggered upward and downward S curves, and you get something close to a sine wave. In fact, every function can be closely approximated by a sum of S curves: when the function goes up, you add an S curve; when it goes down, you subtract one. Children’s learning is not a steady improvement but an accumulation of S curves. So is technological change. Squint at the New York City skyline and you can see a sum of S curves unfolding across the horizon, each as sharp as a skyscraper’s corner. Most importantly for us, S curves lead to a new solution to the credit-assignment problem. If the universe is a symphony of phase transitions, let’s model it with one. That’s what the brain does: it tunes the system of phase transitions inside to the one outside. So let’s replace the perceptron’s step function with an S curve and see what happens.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
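
The curve Domingos describes is the logistic function, and each claim in the passage corresponds to a standard identity, collected here in LaTeX:

    % The logistic (sigmoid) S curve.
    \[ \sigma(x) = \frac{1}{1 + e^{-x}} \]
    % Its midsection is approximately linear:
    \[ \sigma(x) \approx \tfrac{1}{2} + \tfrac{x}{4} \quad \text{for small } x \]
    % Differentiating an S curve gives a bell curve:
    \[ \sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr) \]
    % The early part is effectively exponential, and zoomed out it
    % approaches a step function:
    \[ \sigma(x) \approx e^{x} \ \text{as}\ x \to -\infty, \qquad
       \lim_{k \to \infty} \sigma(kx) =
       \begin{cases} 0 & x < 0 \\ \tfrac{1}{2} & x = 0 \\ 1 & x > 0 \end{cases} \]
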
Support for libraries, both financial and otherwise, is crucial during this period of transition from a predominantly analog to a predominantly digital world.
John Palfrey (BiblioTech: Why Libraries Matter More Than Ever in the Age of Google)
Although earlier computers existed in isolation from the world, requiring their visuals and sound to be generated and live only within their memory, the Amiga was of the world, able to interface with it in all its rich analog glory. It was the first PC with a sufficient screen resolution and color palette as well as memory and processing power to practically store and display full-color photographic representations of the real world, whether they be scanned in from photographs, captured from film or video, or snapped live by a digitizer connected to the machine. It could be used to manipulate video, adding titles, special effects, or other postproduction tricks. And it was also among the first to make practical use of recordings of real-world sound. The seeds of the digital-media future, of digital cameras and Photoshop and MP3 players, are here. The Amiga was the first aesthetically satisfying PC. Although the generation of machines that preceded it were made to do many remarkable things, works produced on them always carried an implied asterisk; “Remarkable,” we say, “. . . for existing on such an absurdly limited platform.” Even the Macintosh, a dramatic leap forward in many ways, nevertheless remained sharply limited by its black-and-white display and its lack of fast animation capabilities. Visuals produced on the Amiga, however, were in full color and could often stand on their own terms, not as art produced under huge technological constraints, but simply as art. And in allowing game programmers to move beyond blocky, garish graphics and crude sound, the Amiga redefined the medium of interactive entertainment as being capable of adult sophistication and artistry. The seeds of the aesthetic future, of computers as everyday artistic tools, ever more attractive computer desktops, and audiovisually rich virtual worlds, are here. The Amiga empowered amateur creators by giving them access to tools heretofore available only to the professional. The platform’s most successful and sustained professional niche was as a video-production workstation, where an Amiga, accompanied by some relatively inexpensive software and hardware peripherals, could give the hobbyist amateur or the frugal professional editing and postproduction capabilities equivalent to equipment costing tens or hundreds of thousands. And much of the graphical and musical creation software available for the machine was truly remarkable. The seeds of the participatory-culture future, of YouTube and Flickr and even the blogosphere, are here. The
Jimmy Maher (The Future Was Here: The Commodore Amiga (Platform Studies))
Instead of distinct old and new media, what we have is a complex cultural ecosystem that spans the analog and digital, encompassing physical places and online spaces, material objects and digital copies, fleshy bodies and virtual identities.
Astra Taylor (The People’s Platform: Taking Back Power and Culture in the Digital Age)
For this really is the last straw, this aspiration to clear the way, with the digital, for the integral image, free from any real-world constraints. And we would not be forcing the analogy if we extended this same revolution to human beings in general, free now, thanks to this digital intelligence, to operate within an integral individuality, free from all history and subjective constraints ... At the end-point of this rise of the machine, in which all human intelligence is encapsulated—a machine which is now assured of total autonomy as a result—it is clear that mankind exists only at the cost of its own death. It becomes immortal only by paying the price of its technological disappearance, of its inscription in the digital order (the mental diaspora of the networks).
Jean Baudrillard (Why Hasn't Everything Already Disappeared? (The French List))
He notes that the output of neurons is digital: an axon either fires or it doesn’t. This was far from obvious at the time, in that the output could have been an analog signal. The processing in the dendrites leading into a neuron and in the soma, the neuron cell body, however, is analog. He describes these calculations as a weighted sum of inputs with a threshold.
John von Neumann (The Computer and the Brain: Abused City (The Silliman Memorial Lectures Series))
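
The "weighted sum of inputs with a threshold" is the McCulloch-Pitts picture of the neuron. A sketch with invented weights and threshold: the accumulation is analog, the output spike is digital.

    # McCulloch-Pitts-style neuron: analog weighted sum in, digital spike out.
    # The weights and threshold here are invented for illustration.
    def neuron(inputs, weights, threshold):
        activation = sum(x * w for x, w in zip(inputs, weights))  # analog dendrites/soma
        return 1 if activation >= threshold else 0                # digital axon: fire or not

    w = [0.5, -1.0, 0.8]
    print(neuron([0.9, 0.2, 0.7], w, threshold=0.6))  # 1: fires
    print(neuron([0.1, 0.9, 0.1], w, threshold=0.6))  # 0: stays silent
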
As he describes each mechanism in the brain, he shows how a modern computer could accomplish the same operation, despite the apparent differences. The brain’s analog mechanisms can be simulated through digital ones because digital computation can emulate analog values to any desired degree of precision (and the precision of analog information in the brain is quite low).
John von Neumann (The Computer and the Brain: Abused City (The Silliman Memorial Lectures Series))
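
The emulation argument comes down to quantization: each added bit halves the step size, so any stated analog precision can be met with enough bits. A sketch with an arbitrary value and arbitrary bit depths:

    # Quantizing an analog value at increasing bit depths: the error
    # shrinks by half per added bit, so digital can match any finite
    # analog precision -- including the brain's fairly low one.
    analog_value = 0.7317589  # arbitrary value in [0, 1)

    for bits in (4, 8, 16, 24):
        levels = 2 ** bits
        digital = round(analog_value * levels) / levels
        print(f"{bits:2d} bits: {digital:.7f}  error={abs(digital - analog_value):.1e}")
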
The cathode-ray tube (CRT) was a form of analog computer: varying the voltages to the deflection coils varied the path traced by the electron beam. The CRT, especially in its incarnation as an oscilloscope, could be used to add, subtract, multiply, and divide signals—the results being displayed directly as a function of the amplitude of the deflection and its frequency in time. From these analog beginnings, the digital universe took form.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
One of the latest theories of aging—and my favorite—is presented by my friend and colleague David Sinclair in Lifespan: Why We Age—and Why We Don’t Have To. The information theory of aging proposes that we age and become more susceptible to diseases because our cells lose information. DNA stores information digitally, but the cells have an analog format that can modulate the function of genes in the sequence of the DNA.
Nir Barzilai (Age Later: Secrets of the Healthiest, Sharpest Centenarians)
if we insist too strongly on the brain as a glorified digital machine, we shall be subject to some very just criticism, coming in part from the physiologists and in part from the somewhat opposite camp of those psychologists who prefer not to make use of the machine comparison. I have said that in a digital machine there is a taping, which determines the sequence of operations to be performed, and that a change in this taping on the basis of past experience corresponds to a learning process. In the brain, the clearest analogy to taping is the determination of the synaptic thresholds, of the precise combinations of the incoming neurons which will fire an outgoing neuron with which they are connected.
Norbert Wiener (The Human Use Of Human Beings: Cybernetics And Society (The Da Capo series in science))
What people didn’t realize, including Wall Street executives, was how deep and interrelated the risks CMOs posed were. Part of the problem was that CMOs were complex financial instruments supported by outdated financial architecture that blended analog and digital systems. The lack of seamless digital documentation made quantifying the risk and understanding exactly what CMOs were composed of difficult, if not impossible.
Chris Burniske (Cryptoassets: The Innovative Investor's Guide to Bitcoin and Beyond)
Returning to our canary-in-the-coal-mine analogy, the plight of iGen provides a strong warning about the danger of solitude deprivation. When an entire cohort unintentionally eliminated time alone with their thoughts from their lives, their mental health suffered dramatically. On reflection, this makes sense. These teenagers have lost the ability to process and make sense of their emotions, or to reflect on who they are and what really matters, or to build strong relationships, or even to just allow their brains time to power down their critical social circuits, which are not meant to be used constantly, and to redirect that energy to other important cognitive housekeeping tasks. We shouldn’t be surprised that these absences lead to malfunctions.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
Analog dollars are becoming digital pennies.
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
A similar argument can be made about the form that the interface takes—with little pictures of folders and pages and trash cans. Those analogies are based in physical forms and so we associate the simplicity of the physical folder with that of the digital one. At best, we have faith in the interface that it is an accurate simplification of a more complex system behind it, and at worst, we don’t even recognize the complexity at all.
Tania Allen (Solving Critical Design Problems: Theory and Practice)
A neuron in the human brain can never equate the human mind, but this analogy doesn't hold true for a digital mind, by virtue of its mathematical structure, it may – through evolutionary progression and provided there are no insurmountable evolvability constraints – transcend to the higher-order Syntellect. A mind is a web of patterns fully integrated as a coherent intelligent system; it is a self-generating, self-reflective, self-governing network of sentient components (that are themselves minds) that evolves, as a rule, by propagating through dimensionality and ascension to ever-higher hierarchical levels of emergent complexity. In this book, the Syntellect emergence is hypothesized to be the next meta-system transition, developmental stage for the human mind – becoming one global mind – that would constitute quintessence of the looming Cybernetic Singularity.
Alex M. Vikoulov (The Syntellect Hypothesis: Five Paradigms of the Mind's Evolution)
The philosophy of conversation-centric communication takes a harder stance. It argues that conversation is the only form of interaction that in some sense counts toward maintaining a relationship. This conversation can take the form of a face-to-face meeting, or it can be a video chat or a phone call—so long as it matches Sherry Turkle’s criteria of involving nuanced analog cues, such as the tone of your voice or facial expressions. Anything textual or non-interactive—basically, all social media, email, text, and instant messaging—doesn’t count as conversation and should instead be categorized as mere connection.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
The church was never meant to be a derivative of the cultural moment but, rather, a disruption of it.
Jay Y. Kim (Analog Church: Why We Need Real People, Places, and Things in the Digital Age)
Hadn’t they confiscated our cell phones, our tablets, all of our screens and digital access to the outside? We were being held in an analog prison, said David.
Lydia Millet (A Children's Bible)
Live fast in the digital! Die young in the analog!
Dean Cavanagh (The Secret Life Of The Novel)
I don’t know why the telephone, the analog landline telephone, was never formally mourned. What a many-splendored experience it once was to talk on the phone. You’d dial a number, rarely more than seven digits, typically known by heart and fingers.
Virginia Heffernan (Magic and Loss: The Pleasures of the Internet)
What is different from all the experiences humanity has gone through, however, is the nature of our communication environment: Digital communication has different characteristics than analog communication environments. Some of which are obvious such as the speed of transmission and the limitless reproducibility at virtually no cost. Some are beginning to be seen, including the potential of digital links and connections. Others are still beyond our imagination.
Frode Hegland (The Future of Text 1)
You're analog players in a digital world.
Eddie Izzard
But Bennie knew that what he was bringing into the world was shit. Too clear, too clean. The problem was precision, perfection; the problem was digitization, which sucked the life out of everything that got smeared through its microscopic mesh. Film, photography, music: dead. An aesthetic holocaust! Bennie knew better than to say this stuff aloud.
Jennifer Egan (A Visit from the Goon Squad)
By digitizing a traditionally analog business model or process, we're effectively turning it into bits and atoms and enabling an infinite variety of possibilities.
Nicholas D. Evans (Mastering Digital Business: How powerful combinations of disruptive technologies are enabling the next wave of digital transformation)
Numerous studies have shown that handwriting notes is simply better for engagement, information retention, and mental health than is writing on digital devices.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
Just as the digital dominance of the recording studio seemed complete, analog had its revenge. Musicians, producers, and engineers searching for the sound of the music that inspired them—roots Americana, blues, and classic rock—began thinking about how the process of recording affected the sound. These artists, including White, Dave Grohl, and Gillian Welch, began experimenting with old tape machines and vintage studio equipment, returning to the analog methods they’d once used. Critics and fans noted that these albums sounded different—more heartfelt, raw, and organic—and the industry began to take notice.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
As a result, the motto in Silicon Valley today is: everything that is analog is now being digitized, everything that is being digitized is now being stored, everything that is being stored is now being analyzed by software on these more powerful computing systems, and all the learning is being immediately applied to make old things work better, to make new things possible, and to do old things in fundamentally new ways. For instance, the invention of the Uber taxi service did all three: it didn’t just create a new competitive taxi fleet; it created a fundamentally new and better way to summon a taxi, to gather data on riders’ needs and desires, to pay for a taxi, and to rate the behavior of the driver and the passenger. These
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
the elevated anxiety he’s observed in this generation of campers is directly related to the constant hovering of their parents, who use digital technology to keep tabs on their children around the clock. They cannot surrender their authority. Many of the phones that Birenbaum has seized from campers over the past few summers were sent on the insistence of parents, who wanted to remain in touch.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
clear enough. I asked Birenbaum what he was ultimately trying to preserve by keeping Walden technology free. Was it the land, the cabins, and the lake, and leaving those spaces undisturbed by the outside world? Or were his efforts to keep the digital barbarians at the gate driven by a desire to preserve something deeper, that universal truth that not only made Walden what it was, but drove the Revenge of Analog in all its various forms? Birenbaum didn’t hesitate to answer. “We look at the heart of what we do, and it is interpersonal relationships,” he said. Any debate about technology’s use came down to a simple binary question: will it impact interpersonal relationships or not? “This camp could be wiped out by a meteor tomorrow, and we could rebuild across the road and we’d still be Walden,” he said. What mattered were the relationships and the uniquely analog recipe that enabled their formation. First, you place lots of people together, and have them relate to one another with the guidance of caregivers, who encourage and enforce mutual respect. Next, you mix in a program that creates various stresses, frustrations, and challenges that campers need to confront. This ranges from the simplest task of getting to breakfast on time to ten-day canoe trips in the harsh Canadian wilderness where twelve-year-olds might be expected to carry a 60-pound canoe on their head for a mile or more in the pouring rain, as blackflies gnaw at their ankles. These situations eventually lead to individual perseverance and self-respect . . . what most people call character. And that character is the glue that allows the relationships built at camp to last a lifetime, as my own friendships formed at Walden have. “You go a bit out of your comfort zone, endure a little hardship, people push you and help you to succeed, and you end up with friendships, confidence, and an inner fortitude that ends in a sense of belonging to a greater, interdependent community,” Birenbaum said. “This is one of the most basic aspects of the human condition.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
The MIT professor Sherry Turkle, who has devoted her career to studying and writing about the impact of digital technology on our lives, once wrote that sociable technology always disappoints, because it promises what it cannot deliver. “It promises friendship but can only deliver performance,
David Sax (The Revenge of Analog: Real Things and Why They Matter)
When I think back on the twenty years I spent in school, what sticks with me isn’t any particular subject, learning tool, or classroom. It is the teachers who brought my education to life and drove my interest forward, so that my passion for learning continued, despite the long days, the hard chairs, the difficult problems. These women and men were giants. They were underpaid, and they put up with all sorts of crap, but they made me the person I am today vastly more than the facts they taught. That relationship is what digital education technology cannot ever replicate or replace, and why a great teacher will always provide a more innovative model for the future of education than the most sophisticated device, software, or platform.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
All digital music listeners are equal. Acquisition is painless. Taste is irrelevant. It is pointless to boast about your iTunes collection, or the quality of your playlists on a streaming service. Music became data, one more set of 1's and 0's lurking in your hard drive, invisible to see and impossible to touch. Nothing is less cool than data.
David Sax (The Revenge of Analog: Real Things and Why They Matter)
Web 2.0 is our code word for the analog increasingly supervening upon the digital—reversing how digital logic was embedded in analog components, sixty years ago. Search engines and social networks are just the beginning—the Precambrian phase.
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
Chain letters—yes, the type you still occasionally get via email, or see on social media—have their roots in snail mail, first popularized in the late 1800s. One of the most successful ones, “The Prosperity Club,” originated in Denver in the post-Depression 1930s, and asked people to send a dime to a list of others who were part of the club. Of course, you would add yourself to the list as well. The next set of people would return the favor, sending dimes back, and so on and so forth—with the promise that it would eventually generate $1,562.50. This is about $29,000 in 2019 dollars—not bad! The last line says it all: “Is this worth a dime to you?” It might surprise you that in a world before email, social media, and everything digital, the Prosperity Club chain letter spread incredibly well—so well, in fact, that it reached hundreds of thousands of people within months, within Denver and beyond. There are historical anecdotes of local mail offices being overwhelmed by the sheer volume of letters, and not surprisingly, eventually the US Post Office would make chain letters like Prosperity Club illegal, to stop their spread. It clearly tapped into a Depression zeitgeist of the time, promising “Faith! Hope! Charity!” This is a clever, viral idea (for its time), and I will also argue that this is an analog version of a network effect from the 1800s, just as telephones and railways were, too. How so? First, chain letters are organized as a network, and can be represented by the list of names that are copied and recopied by each participant. These names are likely to be friends, family, and people in the community, furthering the Prosperity Club’s credibility, thereby increasing the engagement level. It follows the classic definition of network effects: the more people who are participating in this chain letter, the better, since you are then more likely to receive dimes. And it even faces the Cold Start Problem: if enough people aren’t already on the list and playing along, then it will fail to grow.
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
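Chen's $1,562.50 figure rewards a closer look, because it encodes the whole network effect in one number. Here is a minimal sketch in Python, under assumptions the quote implies but never states (a six-name list, five recruits per participant):

```python
# Sketch of the arithmetic behind the Prosperity Club's promise.
# Assumed (implied but not stated in the quote): each letter carries a
# six-name list and every participant recruits five more people, so
# 5**6 people eventually owe you a dime.

DIME = 0.10
FAN_OUT = 5   # assumed recruits per participant
DEPTH = 6     # assumed length of the name list

payout = DIME * FAN_OUT ** DEPTH
print(f"Promised payout: ${payout:,.2f}")   # $1,562.50, matching the letter

# The same geometry is the Cold Start Problem in miniature: for everyone
# at the bottom of the chain to be paid, participation must keep growing
# five-fold per generation.
total_participants = sum(FAN_OUT ** k for k in range(DEPTH + 1))
print(f"People needed for one full cycle: {total_participants:,}")  # 19,531
```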
WHAT IS THOUGHT? A thought is a logical picture of the facts, and a proposition is the expression of a thought in a way that we can read or hear. So what is a logical picture? Consider a gramophone record. It consists of variegated grooves on a plastic base. When the record is played, the information contained in the grooves is reproduced in the music. So the spatial patterns on the record must share a form with the auditory relations of the notes in the music. The music, the score of the music, a digital recording of the music and an analog recording all share homologous form, but there is no way of representing the form. In other words, you can’t SHOW a thought.
John Heaton (Introducing Wittgenstein: A Graphic Guide)
One might suppose that analog computers would be more powerful, since they can represent a continuum of values, whereas digital computers can represent data only as discrete numbers. However, this apparent advantage disappears if we take a closer look. A true continuum is unrealizable in the physical world. The problem with analog computers is that their signals can achieve only a limited degree of accuracy.
William Daniel Hillis (The Pattern on the Stone: The Simple Ideas that Make Computers Work)
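Hillis's claim about limited accuracy can be made concrete. A rough illustration in Python (mine, not from the book, with an assumed noise floor): noise caps how many analog levels can be told apart, while a digital encoding that quantizes up front can be recovered exactly.

```python
# Why noise limits the accuracy of an analog signal while a digital
# one survives intact. Signal range and noise amplitude are assumed.
import random

FULL_SCALE = 1.0   # assumed signal range, 0 to 1 "volt"
NOISE = 0.01       # assumed noise amplitude, +/- 10 mV

# Two analog levels closer together than the noise floor cannot be
# distinguished, so the usable alphabet is finite, not a continuum:
distinct_levels = int(FULL_SCALE / (2 * NOISE))
print(distinct_levels)   # about 50 distinguishable values

# Digital encoding accepts discreteness up front. With noise smaller than
# half a quantization step, every symbol survives transmission exactly.
STEP = 0.25   # four levels, i.e. two bits per sample

def transmit(level):
    return level + random.uniform(-NOISE, NOISE)   # analog channel noise

def restore(voltage):
    return round(voltage / STEP) * STEP            # snap to nearest level

assert all(restore(transmit(v)) == v for v in (0.0, 0.25, 0.5, 0.75))
```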
This is the fundamental difference between digital and analog: A digital valve is either on or off; an analog valve, like your kitchen faucet, can be anything in between. In the hydraulic computer, all that is required of the input signal is that it be strong enough to move the valve. In this case, the difference that makes a difference is the difference in water pressure sufficient to switch the valve on. And since a weakened signal entering an input will still produce a full-strength output, we can connect thousands of layers of logic, the output of one layer controlling the next, without worrying about a gradual decrease in pressure. The output of each gate will always be at full pressure.
William Daniel Hillis (The Pattern on the Stone: The Simple Ideas that Make Computers Work)
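The restoring property described here is easy to simulate in software. A sketch, with hypothetical pressure values: each gate only checks a threshold and always emits full pressure, so leakage and noise between layers never accumulate.

```python
# A software rendering of Hillis's hydraulic gates: any input strong
# enough to move the valve yields a full-strength output. Values are
# hypothetical.
import random

FULL_PRESSURE = 1.0
THRESHOLD = 0.5   # the "difference that makes a difference"

def valve(pressure):
    # A digital valve: on at full pressure, or off. Nothing in between.
    return FULL_PRESSURE if pressure >= THRESHOLD else 0.0

def leaky_pipe(pressure):
    # The analog reality between gates: the signal weakens and picks up noise.
    return pressure * 0.8 + random.uniform(-0.05, 0.05)

signal = FULL_PRESSURE
for _ in range(10_000):   # thousands of layers of logic
    signal = valve(leaky_pipe(signal))

print(signal)   # still 1.0: the signal is re-standardized at every gate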
He mentioned five elements that really set the deck apart from the rest. Here’s what you need, according to Andy:
1. Name a big, relevant change in the world. This should be an “indisputable truth.” “E-commerce will accelerate post-COVID-19 pandemic” is a good example.
2. Show there will be winners and losers. The point here is to give anxiety to the customers that may fall on the losing side. At Videoplaza, we cited the transition from analog to digital in video streaming and monetization, with Netflix and Amazon as the winners thus far.
3. Tease the promised land. Instead of introducing your product immediately, talk about the future state and your founding insights to give the prospect a glimpse into the future.
4. Introduce features as magic gifts for overcoming obstacles to the promised land. This is where your product comes in, with its ability to get the customer to the other side.
5. Present evidence that you can make the story come true. Case studies, customer testimonials, analyst quotes, product demos—all of these are appropriate in telling this part of the narrative.
Rags Gupta (One to Ten: Finding Your Way from Startup to Scaleup)
Our classical intuition tells us that analog computation is intrinsically continuous and digital computation is intrinsically discrete. As with many other classical intuitions, this one is incorrect when applied to quantum computation. Analog quantum computers and digital quantum computers are one and the same device.
Seth Lloyd (Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos)
Analog data are superior for this job because they can be changed back and forth with relative ease whenever the environment within or outside the cell demands it, and they can store an almost unlimited number of possible values, even in response to conditions that have never been encountered before. The unlimited number of possible values is why many audiophiles still prefer the rich sounds of analog storage systems. But even though analog devices have their advantages, they have a major disadvantage. In fact, it’s the reason we’ve moved from analog to digital. Unlike digital, analog information degrades over time—falling victim to the conspiring forces of magnetic fields, gravity, cosmic rays, and oxygen. Worse still, information is lost as it’s copied. No one was more acutely disturbed by the problem of information loss than Claude Shannon, an electrical engineer from the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts.
David A. Sinclair (Lifespan: Why We Age – and Why We Don’t Have To)
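Sinclair's contrast between the two kinds of copying is simple to demonstrate. A toy simulation, my own and with an assumed per-copy noise level: each generation of copying nudges the analog value a little further from where it started, while the digital copier re-decides each bit before errors can accumulate, which is roughly Shannon's answer to information loss.

```python
# Toy simulation: copying degrades analog information but not digital.
# The noise level and starting values are assumed for illustration.
import random

def noisy_copy(value):
    return value + random.gauss(0, 0.02)   # assumed per-copy noise

analog_value = 0.7                    # some stored analog level
digital_bits = [1, 0, 1, 1, 0, 0, 1, 0]

for _ in range(1000):                 # a thousand generations of copies
    analog_value = noisy_copy(analog_value)
    # Each bit is restored to exactly 0 or 1 at every generation,
    # so the noise never gets a chance to accumulate.
    digital_bits = [1 if noisy_copy(bit) > 0.5 else 0 for bit in digital_bits]

print(f"analog after 1,000 copies:  {analog_value:.3f}  (drifted from 0.7)")
print(f"digital after 1,000 copies: {digital_bits}")      # unchanged
```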
Go to a traditional folk music festival. The quality of the playing and singing will blow your mind. But like the rise in vinyl record production, house shows, and other aspects of hipster culture, it is quintessentially “analog”—the sonic equivalent of the farm-to-table movement. The great electronic musician and producer Brian Eno, who has been working in funky analog studios in West Africa, has begun to question the very raison d’être of digital recording, which, thanks to Auto-Tune (the tech tool that allows engineers to correct singers with bad pitch), makes it possible to turn a second-rate singer into a diva: “We can quantize everything now; we can quantize audio so the beat is absolutely perfect. We can sort of do and undo everything. And of course, most of the records we like, all of us, as listeners, are records where people didn’t do everything to fix them up and make them perfect.” Tech’s perfection tools do not make for human art.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
this was the age of acceleration. Everything that was analog was being digitized. Everything that was digitized was being stored. And everything that was stored was being analyzed, opening up entirely new dimensions for surveillance and attack.
Nicole Perlroth (This Is How They Tell Me the World Ends: The Cyberweapons Arms Race)
once you become a patient—vulnerable, scared, passive—then you need a doctor, a real doctor-person—calm, authoritative, wise. And since all of us, even the youngest, healthiest, fiercest hacker will, in the end, be in that place—which to discover we must travel to—we all need a system that incorporates both: the virtual and the real, the digital and the analog, the Fast and the Slow. And I believe we will have it.
Victoria Sweet (Slow Medicine: The Way to Healing)
As an analogy, we used to think of books, music, and movies as distinct. Then they all became represented by packets sent over the internet. Yes, we listened to music in audio players and viewed books in ebook readers, but their fundamental structure became digital. Similarly, today we think of stocks, bonds, gold, loans, and art as different. But all of them are represented as debits and credits on blockchains. Again, the fundamental structure became digital. Now, we are starting to think of different kinds of collections of people — whether communities, cities, companies, or countries — all fundamentally as networks, where the digital profiles and how they interact become more and more fundamental. This is obvious for communities and companies, which can already be fully remote and digital, but even already existing cities and countries are starting to be modeled this way, because (a) their citizens are often geographically remote, (b) the concept of citizenship itself is becoming similar to digital single sign-on, (c) many 20th century functions of government have already been de facto transferred to private networks like (electronic) mail delivery, hotel, and taxi regulation, (d) cities and countries increasingly recruit citizens online, (e) so-called smart cities are increasingly administrated through a computer interface, and (f) as countries issue central bank digital currencies and cities likely follow suit, every polity will be publicly traded on the internet just like companies and coins.
Balaji S. Srinivasan (The Network State: How To Start a New Country)
However, I can tell you that reputation travels even faster digitally than through analog channels.
Kurt Schmidt (The Little Book of Networking: How to Grow Your Career One Conversation at a Time)
We're seeing the world through a prism now; digital capitalism has refined the algorithm to monetize our attention. Peace, contentment, nuance, subtlety, cooperation - all of these things are lost as algorithms filter our online communications for maximum engagement. What holds sway, of course, is hyperbole, shock, outrage, sensationalism.
Jonathan Simons (The Analog Sea Review: Number Four)
Generation X is the generation that went from analog to digital.
Mia Mulrennan (Passed Over and Pissed Off: The Overlooked Leadership Talents of Generation X)
Broken DNA causes genome instability, I wrote, which distracts the Sir2 protein, which changes the epigenome, causing the cells to lose their identity and become sterile while they fixed the damage. Those were the analog scratches on the digital DVDs. Epigenetic changes cause aging.
David Sinclair (Lifespan: Why We Age—and Why We Don't Have To)
When you feel stuck in your creative pursuits, it doesn’t mean that there’s something wrong with you. You haven’t lost your touch or run out of creative juice. It just means you don’t yet have enough raw material to work with. If it feels like the well of inspiration has run dry, it’s because you need a deeper well full of examples, illustrations, stories, statistics, diagrams, analogies, metaphors, photos, mindmaps, conversation notes, quotes—anything that will help you argue for your perspective or fight for a cause you believe in.
Tiago Forte (Building a Second Brain: A Proven Method to Organise Your Digital Life and Unlock Your Creative Potential)