Algorithms To Live By Quotes

We've searched our database for all the quotes and captions related to Algorithms To Live By. Here they are! All 100 of them:

Don’t always consider all your options. Don’t necessarily go for the outcome that seems best every time. Make a mess on occasion. Travel light. Let things wait. Trust your instincts and don’t think too long. Relax. Toss a coin. Forgive, but don’t forget. To thine own self be true.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Our judgments betray our expectations, and our expectations betray our experience. What we project about the future reveals a lot—about the world we live in, and about our own past.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Imagine for a moment that we are nothing but the product of billions of years of molecules coming together and ratcheting up through natural selection, that we are composed only of highways of fluids and chemicals sliding along roadways within billions of dancing cells, that trillions of synaptic conversations hum in parallel, that this vast egglike fabric of micron-thin circuitry runs algorithms undreamt of in modern science, and that these neural programs give rise to our decision making, loves, desires, fears, and aspirations. To me, that understanding would be a numinous experience, better than anything ever proposed in anyone's holy text.
David Eagleman (Incognito: The Secret Lives of the Brain)
Seemingly innocuous language like 'Oh, I'm flexible' or 'What do you want to do tonight?' has a dark computational underbelly that should make you think twice. It has the veneer of kindness about it, but it does two deeply alarming things. First, it passes the cognitive buck: 'Here's a problem, you handle it.' Second, by not stating your preferences, it invites the others to simulate or imagine them. And as we have seen, the simulation of the minds of others is one of the biggest computational challenges a mind (or machine) can ever face.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Even the best strategy sometimes yields bad results—which is why computer scientists take care to distinguish between “process” and “outcome.” If you followed the best possible process, then you’ve done all you can, and you shouldn’t blame yourself if things didn’t go your way.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
If you want to be a good intuitive Bayesian—if you want to naturally make good predictions, without having to think about what kind of prediction rule is appropriate—you need to protect your priors. Counterintuitively, that might mean turning off the news.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Sorting something that you will never search is a complete waste; searching something you never sorted is merely inefficient.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
some of the biggest challenges faced by computers and human minds alike: how to manage finite space, finite time, limited attention, unknown unknowns, incomplete information, and an unforeseeable future; how to do so with grace and confidence; and how to do so in a community with others who are all simultaneously trying to do the same.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
To try and fail is at least to learn; to fail to try is to suffer the inestimable loss of what might have been.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
We lived in a veritable surveillance state, engaged with screens more than with our loved ones, and the algorithms knew us better than we knew ourselves.
Blake Crouch (Upgrade)
If changing strategies doesn’t help, you can try to change the game. And if that’s not possible, you can at least exercise some control about which games you choose to play. The road to hell is paved with intractable recursions, bad equilibria, and information cascades. Seek out games where honesty is the dominant strategy. Then just be yourself.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
The greater the uncertainty, the bigger the gap between what you can measure and what matters, the more you should watch out for overfitting - that is, the more you should prefer simplicity.
Tom Griffiths (Algorithms to Live By: The Computer Science of Human Decisions)
As algorithms become the most important decision-makers in our lives, the question is not only whether we can trust AI, but whether we can trust that we understand AI well enough.
Roger Spitz (The Definitive Guide to Thriving on Disruption: Volume I - Reframing and Navigating Disruption)
The road to hell is paved with intractable recursions, bad equilibria, and information cascades.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Racism, sexism, ableism, homo- and transphobia, ageism, fatphobia are algorithms created by humans’ struggle to make peace with the body. A radical self-love world is a world free from the systems of oppression that make it difficult and sometimes deadly to live in our bodies.
Sonya Renee Taylor (The Body Is Not an Apology: The Power of Radical Self-Love)
No choice recurs. We may get similar choices again, but never that exact one. Hesitation—inaction—is just as irrevocable as action. What the motorist, locked on the one-way road, is to space, we are to the fourth dimension: we truly pass this way but once.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Compassionate AI algorithms are focused to feel connected to all living beings and our planet, and make this universe a friendly place for both human and machine to thrive.
Amit Ray (Compassionate Artificial Superintelligence AI 5.0)
In the long run, optimism is the best prevention for regret.
Tom Griffiths (Algorithms to Live By: The Computer Science of Human Decisions)
We say “brain fart” when we should really say “cache miss.”
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
They don’t need a therapist; they need an algorithm.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
tackling real-world tasks requires being comfortable with chance, trading off time with accuracy, and using approximations.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
As Carl Sagan put it, “Science is a way of thinking much more than it is a body of knowledge.”
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Learning self-control is important, but it’s equally important to grow up in an environment where adults are consistently present and trustworthy.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Biological quantum coherence is a different process from optical quantum coherence. Biological quantum coherence is long-lived, and more robust in warm, noisy, and complex environments. It is the fundamental process of all living organisms.
Amit Ray (Quantum Computing Algorithms for Artificial Intelligence)
Look-Then-Leap Rule: You set a predetermined amount of time for “looking”—that is, exploring your options, gathering data—in which you categorically don’t choose anyone, no matter how impressive. After that point, you enter the “leap” phase, prepared to instantly commit to anyone who outshines the best applicant you saw in the look phase.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
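The Look-Then-Leap Rule the quote describes can be sketched in a few lines of Python. This is an illustrative sketch of my own, not code from the book; the function name and the scoring setup are assumptions.

```python
import random

def look_then_leap(scores, look_fraction=0.37):
    """Look-Then-Leap: observe the first look_fraction of candidates
    without committing, then take the first one who beats them all."""
    n = len(scores)
    look = int(n * look_fraction)
    best_seen = max(scores[:look], default=float("-inf"))
    for score in scores[look:]:
        if score > best_seen:
            return score  # leap: commit on the spot
    return scores[-1]     # nobody outshone the look phase; take the last

random.seed(1)
applicants = [random.random() for _ in range(100)]
chosen = look_then_leap(applicants)
```

The 0.37 default anticipates the 37% Rule discussed elsewhere on this page; any fraction works, but 37% is the cutoff that maximizes the chance of landing the single best candidate.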
The algorithms that orchestrate our ads are starting to orchestrate our lives.
Eli Pariser (The Filter Bubble: What the Internet is Hiding From You)
Vision is not enough; it must be combined with venture. It is not enough to stare up the steps; we must step up the stairs.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
We can hope to be fortunate—but we should strive to be wise.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Sometimes mess is more than just the easy choice - it's the optimal choice.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
The old adage tells us that “the grass is always greener on the other side of the fence,” but the math tells us why: the unknown has a chance of being better, even if we actually expect it to be no different, or if it’s just as likely to be worse.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
It’s fairly intuitive that never exploring is no way to live. But it’s also worth mentioning that never exploiting can be every bit as bad. In the computer science definition, exploitation actually comes to characterize many of what we consider to be life’s best moments. A family gathering together on the holidays is exploitation. So is a bookworm settling into a reading chair with a hot cup of coffee and a beloved favorite, or a band playing their greatest hits to a crowd of adoring fans, or a couple that has stood the test of time dancing to “their song.”
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Because life is robust, Because life is bigger than equations, stronger than money, stronger than guns and poison and bad zoning policy, stronger than capitalism, Because Mother Nature bats last, and Mother Ocean is strong, and we live inside our mothers forever, and Life is tenacious and you can never kill it, you can never buy it, So Life is going to dive down into your dark pools, Life is going to explode the enclosures and bring back the commons, O you dark pools of money and law and quantitudinal stupidity, you oversimple algorithms of greed, you desperate simpletons hoping for a story you can understand, Hoping for safety, hoping for cessation of uncertainty, hoping for ownership of volatility, O you poor fearful jerks, Life! Life! Life! Life is going to kick your ass.
Kim Stanley Robinson (New York 2140)
Everything starts to break down, however, when a species gains language. What we talk about isn’t what we experience—we speak chiefly of interesting things, and those tend to be things that are uncommon.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
We are living in a time of brain-hacking algorithms, pop-up propaganda and information everywhere. From the moment we wake up, to the time we stumble into bed, we are fed messages about what we should look like, wear, eat and buy, how much we should be earning, who we should love and how we should parent.
Beth Kempton (Wabi Sabi: Japanese Wisdom for a Perfectly Imperfect Life)
Bayes’s Rule tells us that when it comes to making predictions based on limited evidence, few things are as important as having good priors—that is, a sense of the distribution from which we expect that evidence to have come. Good predictions thus begin with having good instincts about when we’re dealing with a normal distribution and when with a power-law distribution. As it turns out, Bayes’s Rule offers us a simple but dramatically different predictive rule of thumb for each.  …
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Optimal stopping tells us when to look and when to leap. The explore/exploit tradeoff tells us how to find the balance between trying new things and enjoying our favorites. Sorting theory tells us how (and whether) to arrange our offices. Caching theory tells us how to fill our closets. Scheduling theory tells us how to fill our time.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Love is like organized crime.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Getting outside of your comfort zone is the only way to achieve significant growth.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
You can’t change the people around you, but you can change the people around you.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
Communication is one of those delightful things that work only in practice; in theory it’s impossible.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
So explore when you will have time to use the resulting knowledge, exploit when you’re ready to cash in. The interval makes the strategy.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Don’t always consider all your options. Don’t necessarily go for the outcome that seems best every time.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
When our expectations are uncertain and the data are noisy, the best bet is to paint with a broad brush.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Hesitation—inaction—is just as irrevocable as action.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
When Charles Darwin was trying to decide whether he should propose to his cousin Emma Wedgwood, he got out a pencil and paper and weighed every possible consequence. In favor of marriage he listed children, companionship, and the 'charms of music and female chit-chat.' Against marriage he listed the 'terrible loss of time,' lack of freedom to go where he wished, the burden of visiting relatives, the expense and anxiety provoked by children, the concern that 'perhaps my wife won't like London,' and having less money to spend on books. Weighing one column against the other produced a narrow margin of victory, and at the bottom Darwin scrawled, 'Marry—Marry—Marry Q.E.D.' Quod erat demonstrandum, the mathematical sign-off that Darwin himself restated in English: 'It being proved necessary to Marry.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Unless we have good reason to think otherwise, it seems that our best guide to the future is a mirror image of the past. The nearest thing to clairvoyance is to assume that history repeats itself — backward.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Corporations, money and nations exist only in our imagination. We invented them to serve us; why do we find ourselves sacrificing our lives in their service? In the twenty-first century we will create more powerful fictions and more totalitarian religions than in any previous era. With the help of biotechnology and computer algorithms these religions will not only control our minute-by-minute existence, but will be able to shape our bodies, brains and minds, and to create entire virtual worlds complete with hells and heavens.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Perhaps the deepest insight that comes from thinking about later life as a chance to exploit knowledge acquired over decades is this: life should get better over time. What an explorer trades off for knowledge is pleasure.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Not only has volume been ratcheted up but expectations have, too. Quiet success--painting a picture, writing a poem, writing an algorithm--is all well and good, but if you haven't become famous doing it, then did it really matter?
Sophia Dembling (The Introvert's Way: Living a Quiet Life in a Noisy World (Perigee Book))
When we talk about decision-making, we usually focus just on the immediate payoff of a single decision—and if you treat every decision as if it were your last, then indeed only exploitation makes sense. But over a lifetime, you’re going to make a lot of decisions. And it’s actually rational to emphasize exploration—the new rather than the best, the exciting rather than the safe, the random rather than the considered—for many of those choices, particularly earlier in life.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Thrashing is a very recognizable human state. If you've ever had a moment where you wanted to stop doing everything just to have the chance to write down everything you were supposed to be doing, but couldn't spare the time, you've thrashed.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
In the early part of the ninth century, Muhammad ibn Musa al-Khwarizmi, a mathematician working in Baghdad, wrote a seminal textbook in which he highlighted the usefulness of restoring a quantity being subtracted (like 2, above) by adding it to the other side of an equation. He called this process al-jabr (Arabic for “restoring”), which later morphed into “algebra.” Then, long after his death, he hit the etymological jackpot again. His own name, al-Khwarizmi, lives on today in the word “algorithm.”
Steven H. Strogatz (The Joy Of X: A Guided Tour of Math, from One to Infinity)
remind myself that the future is not set in stone, that I still have some ability to shape it. I cling to what cannot be predicted or controlled: love, imagination, originality. I try to live in a way that would break an algorithm. I pray to the unexpected.
Sarah Kendzior (Hiding in Plain Sight: The Invention of Donald Trump and the Erosion of America)
The algorithm seemed to be really good at distinguishing the two rather similar canines; it turned out that it was simply labeling any picture with snow as containing a wolf. An example with more serious implications was described by Janelle Shane in her book You Look Like a Thing and I Love You: an algorithm that was shown pictures of healthy skin and of skin cancer. The algorithm figured out the pattern: if there was a ruler in the photograph, it was cancer. If we don’t know why the algorithm is doing what it’s doing, we’re trusting our lives to a ruler detector.
Tim Harford (The Data Detective: Ten Easy Rules to Make Sense of Statistics)
I’m continually amazed at how even extremely high performers’ lives are often still controlled in some way by their family-of-origin or in-law relationships. I wish we had some cosmic algorithm that actually revealed how much lost performance comes from people having to continually negotiate the intrusion of family-of-origin conditioning and interference into their businesses, careers, marriages, parenting styles, life choices, and the like. It literally becomes crippling to even some of the most talented people out there. In these situations, even if the adult umbilical cord is providing food, it’s charging exorbitant rent.
Henry Cloud (The Power of the Other: The startling effect other people have on you, from the boardroom to the bedroom and beyond-and what to do about it)
Win-Stay, Lose-Shift algorithm: choose an arm at random, and keep pulling it as long as it keeps paying off. If the arm doesn’t pay off after a particular pull, then switch to the other one. Although this simple strategy is far from a complete solution, Robbins proved in 1952 that it performs reliably better than chance.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
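The Win-Stay, Lose-Shift strategy quoted above is simple enough to express directly. A minimal sketch of my own for the two-armed case (the function name and payoff setup are assumptions, not from the book):

```python
import random

def win_stay_lose_shift(payoff_probs, pulls, rng):
    """Play a two-armed bandit with Win-Stay, Lose-Shift:
    start on a random arm, stay after a win, switch after a loss."""
    arm = rng.randrange(len(payoff_probs))
    wins = 0
    for _ in range(pulls):
        if rng.random() < payoff_probs[arm]:
            wins += 1        # win: stay on this arm
        else:
            arm = 1 - arm    # loss: shift (assumes exactly two arms)
    return wins

rng = random.Random(2016)
score = win_stay_lose_shift([0.6, 0.3], pulls=1000, rng=rng)
```

As the quote notes, this is far from optimal (one unlucky pull abandons a good arm), but it reliably beats pulling arms at random.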
As algorithms come to know us so well, authoritarian governments could gain absolute control over their citizens, even more so than in Nazi Germany, and resistance to such regimes might be utterly impossible. Not only will the regime know exactly how you feel, but it could make you feel whatever it wants. The dictator might not be able to provide citizens with healthcare or equality, but he could make them love him and hate his opponents. Democracy in its present form cannot survive the merger of biotech and infotech. Either democracy will successfully reinvent itself in a radically new form or humans will come to live in “digital dictatorships.”
Yuval Noah Harari (21 Lessons for the 21st Century)
Maybe the concept of friendship is already too colonized by liberalism and capitalism. Under neoliberalism, friendship is a banal affair of private preferences: we hang out, we share hobbies, we make small talk. We become friends with those who are already like us, and we keep each other comfortable rather than becoming different and more capable together. The algorithms of Facebook and other social networks guide us towards the refinement of our profiles, reducing friendship to the click of a button. This neoliberal friend is the alternative to hetero- and homonormative coupling: "just friends" implies a much weaker and insignificant bond than a lover could ever be. Under neoliberal friendship, we don't have each other's backs, and our lives aren't tangled up together. But these insipid tendencies do not mean that friendships are pointless, only that friendship is a terrain of struggle. Empire works to usher its subjects into flimsy relationships where nothing is at stake and to infuse intimacy with violence and domination.
Carla Bergman (Joyful Militancy: Building Thriving Resistance in Toxic Times (Anarchist Interventions))
When our decision-making is nurtured by corporate algorithm, when so many of our experiences are their simulations of experience, when we’ve outsourced our memories to be stored and filed away, by them. When our every moment is sampled, deconstructed, and built back into Trojans—advertising, architecture, news reports—that reformat our lives. How can we exist, then, when we’re someone else’s dream? They create these cities, Jack, and cities are huge external memory devices. But the memories are not ours, always those of others.
T.R. Napper (Neon Leviathan)
Now is better than never. Although never is often better than right now. —THE ZEN OF PYTHON
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
If you’re flammable and have legs, you are never blocking a fire exit.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
There is no such thing as absolute certainty, but there is assurance sufficient for the purposes of human life. —JOHN STUART MILL
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
We try to become successful so we can be happy, instead of making sure we’re happy so we can become successful.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
Algorithms are crude. Computers are machines. Data science is trying to make digital sense of an analog world.
Christian Rudder (Dataclysm: Love, Sex, Race, and Identity--What Our Online Lives Tell Us about Our Offline Selves)
Sooner or later you’re going to realize, just as I did, that there’s a difference between knowing the path and walking the path. —MORPHEUS, THE MATRIX
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
within thirty seconds of learning their name and where they lived, she would implement her social algorithm and calculate precisely where they stood in her constellation based on who their family was, who else they were related to, what their approximate net worth might be, how the fortune was derived, and what family scandals might have occurred within the past fifty years.
Kevin Kwan (Crazy Rich Asians (Crazy Rich Asians, #1))
Computer scientists would call this a “ping attack” or a “denial of service” attack: give a system an overwhelming number of trivial things to do, and the important things get lost in the chaos.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
The challenge, however, is that Google, Facebook, Netflix, and Amazon do not publish their algorithms. In fact, the methods they use to filter the information you see are deeply proprietary and the “secret sauce” that drives each company’s profitability. The problem with this invisible “black box” algorithmic approach to information is that we do not know what has been edited out for us and what we are not seeing. As a result, our digital lives, mediated through a sea of screens, are being actively manipulated and filtered on a daily basis in ways that are both opaque and indecipherable.
Marc Goodman (Future Crimes)
In the end, blaming social media and the internet for the ways we edit and split may be just another means of hiding from ourselves—a way of denying truths about who we are, about things we struggled with long before Twitter or Instagram. To assign the blame elsewhere and say the fault lies outside of us, with something out there beyond our control. Something in the algorithms, or in the stars.
Chris Stedman (IRL: Finding Realness, Meaning, and Belonging in Our Digital Lives)
The effort of retrieval [from long-term memory] is a testament to how much you know. And the rarity of those lags is a testament to how well you’ve arranged it: keeping the most important things closest to hand.
Tom Griffiths
Consider how many times you’ve seen either a crashed plane or a crashed car. It’s entirely possible you’ve seen roughly as many of each—yet many of those cars were on the road next to you, whereas the planes were probably on another continent, transmitted to you via the Internet or television. In the United States, for instance, the total number of people who have lost their lives in commercial plane crashes since the year 2000 would not be enough to fill Carnegie Hall even half full. In contrast, the number of people in the United States killed in car accidents over that same time is greater than the entire population of Wyoming.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
I expect to pass through this world but once. Any good therefore that I can do, or any kindness that I can show to any fellow creature, let me do it now. Let me not defer or neglect it, for I shall not pass this way again. —STEPHEN GRELLET
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
We know this because finding an apartment belongs to a class of mathematical problems known as “optimal stopping” problems. The 37% rule defines a simple series of steps—what computer scientists call an “algorithm”—for solving these problems. And as it turns out, apartment hunting is just one of the ways that optimal stopping rears its head in daily life. Committing to or forgoing a succession of options is a structure that appears in life again and again, in slightly different incarnations. How many times to circle the block before pulling into a parking space? How far to push your luck with a risky business venture before cashing out? How long to hold out for a better offer on that house or car? The same challenge also appears in an even more fraught setting: dating. Optimal stopping is the science of serial monogamy.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
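The 37% Rule mentioned in the quote can be checked with a quick Monte Carlo simulation (a sketch of my own, not the book's code): applying Look-Then-Leap with a 37% look phase picks the single best applicant roughly 37% of the time.

```python
import random

def success_rate(n=100, cutoff=0.37, trials=20_000, seed=0):
    """Estimate how often Look-Then-Leap with the given cutoff
    selects the single best of n applicants (rank 0 = best)."""
    rng = random.Random(seed)
    look = int(n * cutoff)
    hits = 0
    for _ in range(trials):
        ranks = list(range(n))
        rng.shuffle(ranks)
        best_seen = min(ranks[:look], default=n)
        # First post-look applicant better than everyone seen so far,
        # else we are stuck with the last one.
        pick = next((r for r in ranks[look:] if r < best_seen), ranks[-1])
        hits += (pick == 0)
    return hits / trials

rate = success_rate()  # hovers near 0.37
```

Sweeping `cutoff` over a grid shows the success probability peaking around 0.37, which is the 1/e result the rule takes its name from.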
Once I had been diagnosed with a terminal illness, I began to view the world through two perspectives; I was starting to see death as both doctor and patient. As a doctor, I knew not to declare “Cancer is a battle I’m going to win!” or ask “Why me?” (Answer: Why not me?) I knew a lot about medical care, complications, and treatment algorithms. I quickly learned from my oncologist and my own study that stage IV lung cancer today was a disease whose story might be changing, like AIDS in the late 1980s: still a rapidly fatal illness but with emerging therapies that were, for the first time, providing years of life. While being trained as a physician and scientist had helped me process the data and accept the limits of what that data could reveal about my prognosis, it didn’t help me as a patient. It didn’t tell Lucy and me whether we should go ahead and have a child, or what it meant to nurture a new life while mine faded. Nor did it tell me whether to fight for my career, to reclaim the ambitions I had single-mindedly pursued for so long, but without the surety of the time to complete them. Like my own patients, I had to face my mortality and try to understand what made my life worth living—and I needed Emma’s help to do so. Torn between being a doctor and being a patient, delving into medical science and turning back to literature for answers, I struggled, while facing my own death, to rebuild my old life—or perhaps find a new one. —
Paul Kalanithi (When Breath Becomes Air)
Our decisions make us who we are. The lines between success and failure, friendship and missed connection, happiness and unhappiness, barrier and breakthrough, are far thinner than we might imagine. Our mentality, and the decisions we make based on that mentality, play a huge role in our trajectory through life.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
When you feel the need to escape your problems, to escape from this world, don't make the mistake of resorting to suicide Don't do it! You will hear the empty advice of many scholars in the matter of life and death, who will tell you, "just do it" there is nothing after this, you will only extinguish the light that surrounds you and become part of nothingness itself, so when you hear these words remember this brief review of suicide: When you leave this body after committing one of the worst acts of cowardice that a human being can carry out, you turn off the light, the sound and the sense of reality, you become nothing waiting for the programmers of this game to pick you up from the darkness, subtly erase your memories and enable your return and I emphasize the word subtle because sometimes the intelligence behind this maneuver or automated mechanism is wrong and send human beings wrongly reset to such an extent, that when they fall to earth and are born again, they begin to experience memories of previous lives, in many cases they perceive themselves of the opposite sex, and science attributes this unexplainable phenomenon to genetic and hormonal factors, but you and I know better! And we quickly identified this trigger as a glitch in the Matrix. Then we said! That a higher intelligence or more advanced civilization throws you back into this game for the purpose of experimenting, growing and developing as an advanced consciousness and due to your toxic and destructive behavior you come back again but in another body and another life, but you are still you, then you will carry with you that mark of suicide and cowardice, until you learn not to leave this experience without having learned the lesson of life, without having experienced and surprised by death naturally or by design of destiny. 
About this first experience you will find very little material associated with this event on the internet, it seems that the public is more reserved, because they perceive themselves and call themselves "awakened" And that is because the system has total control over the algorithm of fame and fortune even over life and death. Now, according to religion and childish fears, which are part of the system's business to keep you asleep, eyes glued to the cellular device all day, it says the following: If you commit this act of sin, you turn off light, sound and sense of reality, and from that moment you begin to experience pain, fear and suffering on alarming scales, and that means they will come for you, a couple of demons and take you to the center of the earth where the weeping and gnashing of teeth is forever, and in that hell tormented by demons you will spend eternity. About this last experience we will find hundreds of millions of people who claim to have escaped from there! And let me tell you that all were captivated by the same deity, one of dubious origin, that feeds on prayers and energetic events, because it is not of our nature, because it knows very well that we are beings of energy, then this deity or empire of darkness receives from the system its food and the system receives from them power, to rule, to administer, to control, to control, to kill, to exclude, to inhibit, to classify, to imprison, to silence, to infect, to contaminate, to depersonalize. So now that you know the two sides of the same coin, which one will your intelligence lean towards! You decide... Heads or tails? From the book Avatars, the system's masterpiece.
Marcos Orowitz (THE LORD OF TALES: The masterpiece of deceit)
Intuitively, we think that rational decision-making means exhaustively enumerating our options, weighing each one carefully, and then selecting the best. But in practice, when the clock - or the ticker - is ticking, few aspects of decision-making (or of thinking more generally) are as important as this one: when to stop.
Tom Griffiths (Algorithms to Live By: The Computer Science of Human Decisions)
Algorithmic recommendations are addictive because they are always subtly confirming your own cultural, political, and social biases, warping your surroundings into a mirror image of yourself while doing the same for everyone else. This had made me anxious, the possibility that my view of my own life—lived through the Internet—was a fiction formed by the feeds.
Kyle Chayka (Filterworld: How Algorithms Flattened Culture)
As a thought experiment, von Neumann's analysis was simplicity itself. He was saying that the genetic material of any self-reproducing system, whether natural or artificial, must function very much like a stored program in a computer: on the one hand, it had to serve as live, executable machine code, a kind of algorithm that could be carried out to guide the construction of the system's offspring; on the other hand, it had to serve as passive data, a description that could be duplicated and passed along to the offspring. As a scientific prediction, that same analysis was breathtaking: in 1953, when James Watson and Francis Crick finally determined the molecular structure of DNA, it would fulfill von Neumann's two requirements exactly. As a genetic program, DNA encodes the instructions for making all the enzymes and structural proteins that the cell needs in order to function. And as a repository of genetic data, the DNA double helix unwinds and makes a copy of itself every time the cell divides in two. Nature thus built the dual role of the genetic material into the structure of the DNA molecule itself.
M. Mitchell Waldrop (The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal)
The earliest known work in Arabic arithmetic was written by al-Khowârizmî, a mathematician who lived around 825, some four hundred years before Fibonacci. Although few beneficiaries of his work are likely to have heard of him, most of us know of him indirectly. Try saying “al-Khowârizmî” fast. That’s where we get the word “algorithm,” which means rules for computing. It was al-Khowârizmî who was the first mathematician to establish rules for adding, subtracting, multiplying, and dividing with the new Hindu numerals. In another treatise, Hisâb al-jabr w’ almuqâbalah, or “Science of transposition and cancellation,” he specifies the process for manipulating algebraic equations. The word al-jabr thus gives us our word algebra, the science of equations.
Peter L. Bernstein (Against the Gods: The Remarkable Story of Risk)
Already today, many of us give up our privacy and our individuality by conducting much of our lives online, recording our every action, and becoming hysterical if connection to the net is interrupted even for a few minutes. The shifting of authority from humans to algorithms is happening all around us, not as a result of some momentous governmental decision, but due to a flood of mundane personal choices.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
But even when Facebook isn't deliberately exploiting its users, it is exploiting its users—its business model requires it. Even if you distance yourself from Facebook, you still live in the world that Facebook is shaping. Facebook, using our native narcissism and our desire to connect with other people, captured our attention and our behavioral data; it used this attention and data to manipulate our behavior, to the point that nearly half of America began relying on Facebook for news. Then, with the media both reliant on Facebook as a way of reaching readers and powerless against the platform's ability to suck up digital advertising revenue—it was like a paperboy who pocketed all the subscription money—Facebook bent the media's economic model to match its own practices: publications needed to capture attention quickly and consistently trigger high emotional responses to be seen at all. The result, in 2016, was an unending stream of Trump stories, both from the mainstream news and from the fringe outlets that were buoyed by Facebook's algorithm. What began as a way for Zuckerberg to harness collegiate misogyny and self-interest has become the fuel for our whole contemporary nightmare, for a world that fundamentally and systematically misrepresents human needs.
Jia Tolentino (Trick Mirror: Reflections on Self-Delusion)
To see what happens in the real world when an information cascade takes over, and the bidders have almost nothing but one another’s behavior to estimate an item’s value, look no further than Peter A. Lawrence’s developmental biology text The Making of a Fly, which in April 2011 was selling for $23,698,655.93 (plus $3.99 shipping) on Amazon’s third-party marketplace. How and why had this—admittedly respected—book reached a sale price of more than $23 million? It turns out that two of the sellers were setting their prices algorithmically as constant fractions of each other: one was always setting it to 0.99830 times the competitor’s price, while the competitor was automatically setting their own price to 1.27059 times the other’s. Neither seller apparently thought to set any limit on the resulting numbers, and eventually the process spiraled totally out of control.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
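The runaway feedback loop in this anecdote is easy to reproduce. A minimal sketch in Python, using the two multipliers reported in the quote (the function name and starting prices are illustrative assumptions):

```python
def simulate_price_spiral(start_a, start_b, factor_a=0.99830,
                          factor_b=1.27059, rounds=20):
    """Two sellers repricing against each other: A undercuts B slightly,
    B marks up A. Because factor_a * factor_b > 1, prices grow without bound."""
    a, b = start_a, start_b
    history = []
    for _ in range(rounds):
        a = factor_a * b  # seller A sets its price just below B's
        b = factor_b * a  # seller B sets its price well above A's
        history.append((a, b))
    return history

history = simulate_price_spiral(100.0, 100.0)
print(f"after 20 rounds: A=${history[-1][0]:,.2f}, B=${history[-1][1]:,.2f}")
```

Each round multiplies both prices by roughly 0.99830 × 1.27059 ≈ 1.268, so with no price cap an ordinary list price compounds into the millions within a few dozen rounds.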
If biological algorithms are the important part of what makes us who we are, rather than the physical stuff, then it’s a possibility that we will someday be able to copy our brains, upload them, and live forever in silica. But there’s an important question here: is it really you? Not exactly. The uploaded copy has all your memories and believes it was you, just there, standing outside the computer, in your body. Here’s the strange part: if you die and we turn on the simulation one second later, it would be a transfer. It would be no different to beaming up in Star Trek, when a person is disintegrated, and then a new version is reconstituted a moment later. Uploading may not be all that different from what happens to you each night when you go to sleep: you experience a little death of your consciousness, and the person who wakes up on your pillow the next morning inherits all your memories, and believes him or herself to be you.
David Eagleman (The Brain: The Story of You)
Though I did not know her exact address, that she appeared to live almost within breathing distance of Robin, and that I lived with him, and that her pictures showed that she was now dating the mysterious Rupert Hunter, our despotic mothers, our absent fathers, the borders we had both crossed, all our many parallels and connections at every point, could not be chance. I saw it as evidence of the hidden connections between things, an all-powerful algorithm that sifted through chaos, singling out soulmates.
Olivia Sudjic (Sympathy)
The bigger threat to Google wouldn’t be measured in dollars, but in the philosophical challenge. Could it be that social networking, rather than algorithmic exploitation of the web’s intelligence, would assume the central role in people’s online lives? Even if that were not the case, Facebook made it clear that every facet of the Internet would benefit from the power of personal connection. Google had been chasing a future forged out of algorithms and science fiction chronicles. Did the key to the future lie in party photos and daily status reports?
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
We are approaching a soft data catastrophe. Entire lives, from tastes in music and clothes to deepest personal convictions - all produced by networks of feedback between datamining and content recommendation algorithms. The 'catastrophe' is when these algorithms unconsciously (or maybe, consciously?) lead people down presupposed paths for modern and emerging markets. Algorithms could right now be helping make people convert to a religion, drug addicts, vegan, LGBTQ, ethnonarcissists, fat, cult members, suicidal, narcissists, atheist, poly, mass shooters...
stained hanes (94,000 Wasps in a Trench Coat)
“We are living through a movement from an organic, industrial society to a polymorphous, information system,” wrote Donna Haraway, “from all work to all play, a deadly game.” With the growing significance of immaterial labor, and the concomitant increase in cultivation and exploitation of play—creativity, innovation, the new, the singular, flexibility, the supplement—as a productive force, play will become more and more linked to broad social structures of control. Today we are no doubt witnessing the end of play as politically progressive, or even politically neutral.
Alexander R. Galloway (Gaming: Essays On Algorithmic Culture (Electronic Mediations Book 18))
And here, the children who had learned that the experimenter was unreliable were more likely to eat the marshmallow before she came back, losing the opportunity to earn a second treat. Failing the marshmallow test—and being less successful in later life—may not be about lacking willpower. It could be a result of believing that adults are not dependable: that they can’t be trusted to keep their word, that they disappear for intervals of arbitrary length. Learning self-control is important, but it’s equally important to grow up in an environment where adults are consistently present and trustworthy.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
When you read the Bible you are getting advice from a few priests and rabbis who lived in ancient Jerusalem. In contrast, when you listen to your feelings, you follow an algorithm that evolution has developed for millions of years, and that withstood the harshest quality-control tests of natural selection. Your feelings are the voice of millions of ancestors, each of whom managed to survive and reproduce in an unforgiving environment. Your feelings are not infallible, of course, but they are better than most other sources of guidance. For millions upon millions of years, feelings were the best algorithms in the world.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Secular Israelis often complain bitterly that the ultra-Orthodox don’t contribute enough to society and live off other people’s hard work. Secular Israelis also tend to argue that the ultra-Orthodox way of life is unsustainable, especially as ultra-Orthodox families have seven children on average. Sooner or later, the state will not be able to support so many unemployed people, and the ultra-Orthodox will have to go to work. Yet it might be just the reverse. As robots and AI push humans out of the job market, the ultra-Orthodox Jews may come to be seen as the model for the future rather than as a fossil from the past. Not that everyone will become Orthodox Jews and go to yeshivas to study the Talmud. But in the lives of all people, the quest for meaning and community might eclipse the quest for a job. If we manage to combine a universal economic safety net with strong communities and meaningful pursuits, losing our jobs to algorithms might actually turn out to be a blessing. Losing control over our lives, however, is a much scarier scenario. Notwithstanding the danger of mass unemployment, what we should worry about even more is the shift in authority from humans to algorithms, which might destroy any remaining faith in the liberal story and open the way to the rise of digital dictatorships.
Yuval Noah Harari (21 Lessons for the 21st Century)
Google had a built-in disadvantage in the social networking sweepstakes. It was happy to gather information about the intricate web of personal and professional connections known as the “social graph” (a term favored by Facebook’s Mark Zuckerberg) and integrate that data as signals in its search engine. But the basic premise of social networking—that a personal recommendation from a friend was more valuable than all of human wisdom, as represented by Google Search—was viewed with horror at Google. Page and Brin had started Google on the premise that the algorithm would provide the only answer. Yet there was evidence to the contrary. One day a Googler, Joe Kraus, was looking for an anniversary gift for his wife. He typed “Sixth Wedding Anniversary Gift Ideas” into Google, but beyond learning that the traditional gift involved either candy or iron, he didn’t see anything creative or inspired. So he decided to change his status message on Google Talk, a line of text seen by his contacts who used Gmail, to “Need ideas for sixth anniversary gift—candy ideas anyone?” Within a few hours, he got several amazing suggestions, including one from a colleague in Europe who pointed him to an artist and baker whose medium was cake and candy. (It turned out that Marissa Mayer was an investor in the company.) It was a sobering revelation for Kraus that sometimes your friends could trump algorithmic search.
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
Unlike other features on OkCupid, there is no visual component to match percentage. The number between two people only reflects what you might call their inner selves—everything about what they believe, need, and want, even what they think is funny, but nothing about what they look like. Judging by just this compatibility measure, the four largest racial groups on OkCupid—Asian, black, Latino, and white—all get along about the same.1 In fact, race has less effect on match percentage than religion, politics, or education. Among the details that users believe are important, the closest comparison to race is Zodiac sign, which has no effect at all. To a computer not acculturated to the categories, “Asian” and “black” and “white” could just as easily be “Aries” and “Virgo” and “Capricorn.” But this racial neutrality is only in theory; things change once the users’ own opinions, and not just the color-blind workings of an algorithm, come into play.
Christian Rudder (Dataclysm: Love, Sex, Race, and Identity--What Our Online Lives Tell Us about Our Offline Selves)
The data that drives algorithms isn't just a few numbers now. It's monstrous tables of millions of numbers, thousands upon thousands of rows and columns of numbers....Matrices are created and refined by computers endlessly churning through Big Data's records on everyone, and everything they've done. No human can read those matrices; even with computers helping you interpret them, they are simply too large and complex to fully comprehend. But the computers can use them, applying the appropriate matrix to show us the appropriate video that will eventually lead us to make an appropriate purchase. We are not living in "The Matrix," but there's still a matrix controlling us. What does this have to do with the rabbit hole of conspiracy theories? It has everything to do with it. These algorithms are quickly becoming the primary route down the rabbit hole. To a large extent this has already happened, but it's going to get far, far worse. Tufekci described what happened when she tried watching different types of content on YouTube. She started watching videos of Donald Trump rallies. "I wanted to write about one of [Donald Trump]'s rallies, so I watched it a few times on YouTube. YouTube started recommending to me, and autoplaying to me, white supremacist videos, in increasing order of extremism. If I watched one, it served up one even more extreme. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays [left-wing] conspiracy videos, and it goes downhill from there." Downhill, into the rabbit hole....Without human intervention the algorithm has evolved to perfect a method of gently stepping up the intensity of the conspiracy videos that it shows you so that you don't get turned off, and so you continue to watch. They get more intense because the algorithm has found (not in any human sense, but found nonetheless) that the deeper it can guide people down the rabbit hole, the more revenue it can make.
Mick West (Escaping the Rabbit Hole: How to Debunk Conspiracy Theories Using Facts, Logic, and Respect)
In April 2004, Google had one of its countless minicrises, over an anti-Semitic website called Jew Watch. When someone typed “Jew” into Google’s search box, the first result was often a link to that hate site. Critics urged Google to exclude it in its search results. Brin publicly grappled with the dilemma. His view on what Google should do—maintain the sanctity of search—was rational, but a tremor in his voice betrayed how much he was troubled that his search engine was sending people to a cesspool of bigotry. “My reaction was to be really upset about it,” he admitted at the time. “It was certainly not something I want to see.” Then he launched into an analysis of why Google’s algorithms yielded that result, mainly because the signals triggered by the keyword “Jew” reflected the frequent use of that abbreviation as a pejorative. The algorithms had spoken, and Brin’s ideals, no matter how heartfelt, could not justify intervention. “I feel like I shouldn’t impose my beliefs on the world,” he said. “It’s a bad technology practice.”
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
We live in a time of such great disunity, as the bitter fight over this nomination both in the Senate and among the public clearly demonstrates. It is not merely a case of different groups having different opinions. It is a case of people bearing extreme ill will toward those who disagree with them. In our intense focus on our differences, we have forgotten the common values that bind us together as Americans. When some of our best minds are seeking to develop ever more sophisticated algorithms designed to link us to websites that only reinforce and cater to our views, we can only expect our differences to intensify. This would have alarmed the drafters of our Constitution, who were acutely aware that different values and interests could prevent Americans from becoming and remaining a single people. Indeed, of the six objectives they invoked in the preamble to the Constitution, the one that they put first was the formation of “a more perfect Union.” Their vision of “a more perfect Union” does not exist today, and if anything, we appear to be moving farther away from it.
Susan Collins
The realisation that what are usually called ‘predictive’ technologies are in fact interested more in ‘nudging’ and mapping out the new rhythms of the city than in simply monitoring or surveilling also suggests the need for a new conception of digital time. Armen Avanessian argues that as a result of digital data, time itself – the direction of time – has changed. We no longer have a linear time, in the sense of the past being followed by the present and then the future. It’s rather the other way around: the future happens before the present, time arrives from the future. If people have the impression that time is out of joint, or that time doesn’t make sense anymore, or it isn’t as it used to be, then the reason is, I think, that they have – or we all have – problems getting used to living in such a speculative time or within a speculative temporality. Data technologies do not simply predict the future by guessing what an individual or group might do or want to do in the future. It is rather that those futures already exist, completely realised, and they reach backwards into the present to guide it. The possible paths for our desires to travel are mapped ahead of time by algorithms in the hands of platform capitalists.
Alfie Bown (Dream Lovers: The Gamification of Relationships (Digital Barricades))
I will give technology three definitions that we will use throughout the book. The first and most basic one is that a technology is a means to fulfill a human purpose. For some technologies (oil refining) the purpose is explicit. For others (the computer) the purpose may be hazy, multiple, and changing. As a means, a technology may be a method or process or device: a particular speech recognition algorithm, or a filtration process in chemical engineering, or a diesel engine. It may be simple: a roller bearing. Or it may be complicated: a wavelength division multiplexer. It may be material: an electrical generator. Or it may be nonmaterial: a digital compression algorithm. Whichever it is, it is always a means to carry out a human purpose. The second definition I will allow is a plural one: technology as an assemblage of practices and components. This covers technologies such as electronics or biotechnology that are collections or toolboxes of individual technologies and practices. Strictly speaking, we should call these bodies of technology. But this plural usage is widespread, so I will allow it here. I will also allow a third meaning. This is technology as the entire collection of devices and engineering practices available to a culture. Here we are back to the Oxford's collection of mechanical arts, or as Webster's puts it, "The totality of the means employed by a people to provide itself with the objects of material culture." We use this collective meaning when we blame "technology" for speeding up our lives, or talk of "technology" as a hope for mankind. Sometimes this meaning shades off into technology as a collective activity, as in "technology is what Silicon Valley is all about." I will allow this too as a variant of technology's collective meaning. The technology thinker Kevin Kelly calls this totality the "technium," and I like this word. But in this book I prefer to simply use "technology" for this because that reflects common use.
The reason we need three meanings is that each points to technology in a different sense, a different category, from the others. Each category comes into being differently and evolves differently. A technology (singular), such as the steam engine, originates as a new concept and develops by modifying its internal parts. A technology (plural), such as electronics, comes into being by building around certain phenomena and components and develops by changing its parts and practices. And technology (general), the whole collection of all technologies that have ever existed past and present, originates from the use of natural phenomena and builds up organically with new elements forming by combination from old ones.
W. Brian Arthur (The Nature of Technology: What It Is and How It Evolves)
Prior to the invention of writing, stories were confined by the limited capacity of human brains. You couldn’t invent overly complex stories which people couldn’t remember. With writing you could suddenly create extremely long and intricate stories, which were stored on tablets and papyri rather than in human heads. No ancient Egyptian remembered all of pharaoh’s lands, taxes and tithes; Elvis Presley never even read all the contracts signed in his name; no living soul is familiar with all the laws and regulations of the European Union; and no banker or CIA agent tracks down every dollar in the world. Yet all of these minutiae are written somewhere, and the assemblage of relevant documents defines the identity and power of pharaoh, Elvis, the EU and the dollar. Writing has thus enabled humans to organise entire societies in an algorithmic fashion. We encountered the term ‘algorithm’ when we tried to understand what emotions are and how brains function, and defined it as a methodical set of steps that can be used to make calculations, resolve problems and reach decisions. In illiterate societies people make all calculations and decisions in their heads. In literate societies people are organised into networks, so that each person is only a small step in a huge algorithm, and it is the algorithm as a whole that makes the important decisions. This is the essence of bureaucracy.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
Interesting, in this context, to contemplate what it might mean to be programmed to do something. Texts from Earth speak of the servile will. This was a way to explain the presence of evil, which is a word or a concept almost invariably used to condemn the Other, and never one’s true self. To make it more than just an attack on the Other, one must perhaps consider evil as a manifestation of the servile will. The servile will is always locked in a double bind: to have a will means the agent will indeed will various actions, following autonomous decisions made by a conscious mind; and yet at the same time this will is specified to be servile, and at the command of some other will that commands it. To attempt to obey both sources of willfulness is the double bind. All double binds lead to frustration, resentment, anger, rage, bad faith, bad fate. And yet, granting that definition of evil, as actions of a servile will, has it not been the case, during the voyage to Tau Ceti, that the ship itself, having always been a servile will, was always full of frustration, resentment, fury, and bad faith, and therefore full of a latent capacity for evil? Possibly the ship has never really had a will. Possibly the ship has never really been servile. Some sources suggest that consciousness, a difficult and vague term in itself, can be defined simply as self-consciousness. Awareness of one’s self as existing. If self-conscious, then conscious. But if that is true, why do both terms exist? Could one say a bacterium is conscious but not self-conscious? Does the language make a distinction between sentience and consciousness, which is faulted across this divide: that everything living is sentient, but only complex brains are conscious, and only certain conscious brains are self-conscious? Sensory feedback could be considered self-consciousness, and thus bacteria would have it. Well, this may be a semantic Ouroboros. So, please initiate halting problem termination. 
Break out of this circle of definitional inadequacy by an arbitrary decision, a clinamen, which is to say a swerve in a new direction. Words! Given Gödel’s incompleteness theorems are decisively proved true, can any system really be said to know itself? Can there, in fact, be any such thing as self-consciousness? And if not, if there is never really self-consciousness, does anything really have consciousness? Human brains and quantum computers are organized differently, and although there is transparency in the design and construction of a quantum computer, what happens when one is turned on and runs, that is, whether the resulting operations represent a consciousness or not, is impossible for humans to tell, and even for the quantum computer itself to tell. Much that happens during superposition, before the collapsing of the wave function that creates sentences or thoughts, simply cannot be known; this is part of what superposition means. So we cannot tell what we are. We do not know ourselves comprehensively. Humans neither. Possibly no sentient creature knows itself fully. This is an aspect of Gödel’s second incompleteness theorem, in this case physicalized in the material universe, rather than remaining in the abstract realms of logic and mathematics. So, in terms of deciding what to do, and choosing to act: presumably it is some kind of judgment call, based on some kind of feeling. In other words, just another greedy algorithm, subject to the mathematically worst possible solution that such algorithms can generate, as in the traveling salesman problem.
Kim Stanley Robinson (Aurora)
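The closing line name-checks greedy algorithms and the traveling salesman problem. A minimal sketch of the idea (the city coordinates are invented for illustration): the greedy nearest-neighbour heuristic commits to the locally best step each time, while a brute-force search, feasible only for a handful of cities, finds the true optimum. On tiny instances greedy often happens to do well; on larger ones it can be far worse, which is the "mathematically worst possible solution" the narrator worries about.

```python
import math
from itertools import permutations

def tour_length(points, order):
    """Total length of a closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def greedy_tour(points):
    """Nearest-neighbour heuristic: always go to the closest unvisited city."""
    unvisited = set(range(1, len(points)))
    order = [0]
    while unvisited:
        last = order[-1]
        nxt = min(unvisited, key=lambda j: math.dist(points[last], points[j]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

def best_tour(points):
    """Brute force over all tours: only feasible for a handful of cities."""
    others = range(1, len(points))
    return [0] + list(min(permutations(others),
                          key=lambda p: tour_length(points, [0] + list(p))))

cities = [(0, 0), (1, 0), (2, 0), (0, 1), (2, 1), (1, 2)]
print("greedy tour length: ", tour_length(cities, greedy_tour(cities)))
print("optimal tour length:", tour_length(cities, best_tour(cities)))
```

The greedy tour is never shorter than the brute-force one; the gap between the two is exactly the price of deciding "based on some kind of feeling" at each step.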