Computer Courses Quotes

We've searched our database for all the quotes and captions related to Computer Courses. Here they are! All 100 of them:

Some catastrophic moments invite clarity, explode in split moments: You smash your hand through a windowpane and then there is blood and shattered glass stained with red all over the place; you fall out a window and break some bones and scrape some skin. Stitches and casts and bandages and antiseptic solve and salve the wounds. But depression is not a sudden disaster. It is more like a cancer: At first its tumorous mass is not even noticeable to the careful eye, and then one day -- wham! -- there is a huge, deadly seven-pound lump lodged in your brain or your stomach or your shoulder blade, and this thing that your own body has produced is actually trying to kill you. Depression is a lot like that: Slowly, over the years, the data will accumulate in your heart and mind, a computer program for total negativity will build into your system, making life feel more and more unbearable. But you won't even notice it coming on, thinking that it is somehow normal, something about getting older, about turning eight or turning twelve or turning fifteen, and then one day you realize that your entire life is just awful, not worth living, a horror and a black blot on the white terrain of human existence. One morning you wake up afraid you are going to live. In my case, I was not frightened in the least bit at the thought that I might live because I was certain, quite certain, that I was already dead. The actual dying part, the withering away of my physical body, was a mere formality. My spirit, my emotional being, whatever you want to call all that inner turmoil that has nothing to do with physical existence, were long gone, dead and gone, and only a mass of the most fucking god-awful excruciating pain like a pair of boiling hot tongs clamped tight around my spine and pressing on all my nerves was left in its wake. That's the thing I want to make clear about depression: It's got nothing at all to do with life. 
In the course of life, there is sadness and pain and sorrow, all of which, in their right time and season, are normal -- unpleasant, but normal. Depression is an altogether different zone because it involves a complete absence: absence of affect, absence of feeling, absence of response, absence of interest. The pain you feel in the course of a major clinical depression is an attempt on nature's part (nature, after all, abhors a vacuum) to fill up the empty space. But for all intents and purposes, the deeply depressed are just the walking, waking dead. And the scariest part is that if you ask anyone in the throes of depression how he got there, to pin down the turning point, he'll never know. There is a classic moment in The Sun Also Rises when someone asks Mike Campbell how he went bankrupt, and all he can say in response is, 'Gradually and then suddenly.' When someone asks how I lost my mind, that is all I can say too.
Elizabeth Wurtzel (Prozac Nation)
So, Diana thought, that was the bait she had to lay out for Jack. Of course. What else? He might lust for Diana, and long for Brianna, but Jack’s true love was made of silicon.
Michael Grant (Hunger (Gone, #2))
I really didn't foresee the Internet. But then, neither did the computer industry. Not that that tells us very much of course--the computer industry didn't even foresee that the century was going to end.
Douglas Adams
You think man can destroy the planet? What intoxicating vanity. Let me tell you about our planet. Earth is four-and-a-half-billion-years-old. There's been life on it for nearly that long, 3.8 billion years. Bacteria first; later the first multicellular life, then the first complex creatures in the sea, on the land. Then finally the great sweeping ages of animals, the amphibians, the dinosaurs, at last the mammals, each one enduring millions on millions of years, great dynasties of creatures rising, flourishing, dying away -- all this against a background of continuous and violent upheaval. Mountain ranges thrust up, eroded away, cometary impacts, volcano eruptions, oceans rising and falling, whole continents moving, an endless, constant, violent change, colliding, buckling to make mountains over millions of years. Earth has survived everything in its time. It will certainly survive us. If all the nuclear weapons in the world went off at once and all the plants, all the animals died and the earth was sizzling hot for a hundred thousand years, life would survive, somewhere: under the soil, frozen in Arctic ice. Sooner or later, when the planet was no longer inhospitable, life would spread again. The evolutionary process would begin again. It might take a few billion years for life to regain its present variety. Of course, it would be very different from what it is now, but the earth would survive our folly, only we would not. If the ozone layer gets thinner, ultraviolet radiation sears the earth, so what? Ultraviolet radiation is good for life. It's powerful energy. It promotes mutation, change. Many forms of life will thrive with more UV radiation. Many others will die out. Do you think this is the first time that's happened? Think about oxygen. Necessary for life now, but oxygen is actually a metabolic poison, a corrosive gas, like fluorine. 
When oxygen was first produced as a waste product by certain plant cells some three billion years ago, it created a crisis for all other life on earth. Those plants were polluting the environment, exhaling a lethal gas. Earth eventually had an atmosphere incompatible with life. Nevertheless, life on earth took care of itself. In the thinking of the human being a hundred years is a long time. A hundred years ago we didn't have cars, airplanes, computers or vaccines. It was a whole different world, but to the earth, a hundred years is nothing. A million years is nothing. This planet lives and breathes on a much vaster scale. We can't imagine its slow and powerful rhythms, and we haven't got the humility to try. We've been residents here for the blink of an eye. If we're gone tomorrow, the earth will not miss us.
Michael Crichton (Jurassic Park / Congo)
Through the computer, the heralds say, we will make education better, religion better, politics better, our minds better — best of all, ourselves better. This is, of course, nonsense, and only the young or the ignorant or the foolish could believe it.
Neil Postman
I would say that the five most important skills are of course, reading, writing, arithmetic, and then as you’re adding in, persuasion, which is talking. And then finally, I would add computer programming just because it’s an applied form of arithmetic that just gets you so much leverage for free in any domain that you operate in. If you’re good with computers, if you’re good at basic mathematics, if you’re good at writing, if you’re good at speaking, and if you like reading, you’re set for life.
Naval Ravikant
If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them.
Steve Jobs
Working an integral or performing a linear regression is something a computer can do quite effectively. Understanding whether the result makes sense—or deciding whether the method is the right one to use in the first place—requires a guiding human hand. When we teach mathematics we are supposed to be explaining how to be that guide. A math course that fails to do so is essentially training the student to be a very slow, buggy version of Microsoft Excel.
Jordan Ellenberg (How Not to Be Wrong: The Power of Mathematical Thinking)
The moral of the story is we're here on Earth to fart around. And, of course, the computers will do us out of that. And, what the computer people don't realize, or they don't care, is we're dancing animals. You know, we love to move around.
Kurt Vonnegut Jr.
The challenge lies in knowing how to bring this sort of day to a close. His mind has been wound to a pitch of concentration by the interactions of the office. Now there are only silence and the flashing of the unset clock on the microwave. He feels as if he had been playing a computer game which remorselessly tested his reflexes, only to have its plug suddenly pulled from the wall. He is impatient and restless, but simultaneously exhausted and fragile. He is in no state to engage with anything significant. It is of course impossible to read, for a sincere book would demand not only time, but also a clear emotional lawn around the text in which associations and anxieties could emerge and be disentangled. He will perhaps only ever do one thing well in his life. For this particular combination of tiredness and nervous energy, the sole workable solution is wine. Office civilisation could not be feasible without the hard take-offs and landings effected by coffee and alcohol.
Alain de Botton (The Pleasures and Sorrows of Work)
Let’s face it - English is a crazy language. There is no egg in eggplant nor ham in hamburger; neither apple nor pine in pineapple. English muffins weren’t invented in England or French fries in France. Sweetmeats are candies while sweetbreads, which aren’t sweet, are meat. We take English for granted. But if we explore its paradoxes, we find that quicksand can work slowly, boxing rings are square and a guinea pig is neither from Guinea nor is it a pig. And why is it that writers write but fingers don’t fing, grocers don’t groce and hammers don’t ham? If the plural of tooth is teeth, why isn’t the plural of booth beeth? One goose, 2 geese. So one moose, 2 meese? One index, 2 indices? Doesn’t it seem crazy that you can make amends but not one amend? If you have a bunch of odds and ends and get rid of all but one of them, what do you call it? If teachers taught, why didn’t preachers praught? If a vegetarian eats vegetables, what does a humanitarian eat? In what language do people recite at a play and play at a recital? Ship by truck and send cargo by ship? Have noses that run and feet that smell? How can a slim chance and a fat chance be the same, while a wise man and a wise guy are opposites? You have to marvel at the unique lunacy of a language in which your house can burn up as it burns down, in which you fill in a form by filling it out and in which an alarm goes off by going on. English was invented by people, not computers, and it reflects the creativity of the human race (which, of course, isn’t a race at all). That is why, when the stars are out, they are visible, but when the lights are out, they are invisible. And finally, why doesn't "buick" rhyme with "quick"?
Richard Lederer
When a computer creates art, who is the artist—the computer or the programmer? At MIT, a recent exhibit of highly accomplished algorithmic art had put an awkward spin on the Harvard humanities course: Is Art What Makes Us Human?
Dan Brown (Origin (Robert Langdon, #5))
Toward the end of his second decade in the airport, Clark was thinking about how lucky he’d been. Not just the mere fact of survival, which was of course remarkable in and of itself, but to have seen one world end and another begin. And not just to have seen the remembered splendors of the former world, the space shuttles and the electrical grid and the amplified guitars, the computers that could be held in the palm of a hand and the high-speed trains between cities, but to have lived among those wonders for so long. To have dwelt in that spectacular world for fifty-one years of his life. Sometimes he lay awake in Concourse B of the Severn City Airport and thought, “I was there,” and the thought pierced him through with an admixture of sadness and exhilaration.
Emily St. John Mandel (Station Eleven)
I live in nature where everything is connected, circular. The seasons are circular. The planet is circular, and so is the planet around the sun. The course of water over the earth is circular coming down from the sky and circulating through the world to spread life and then evaporating up again. I live in a circular teepee and build my fire in a circle. The life cycles of plants and animals are circular. I live outside where I can see this. The ancient people understood that our world is a circle, but we modern people have lost sight of that. I don’t live inside buildings because buildings are dead places where nothing grows, where water doesn’t flow, and where life stops. I don’t want to live in a dead place. People say that I don’t live in a real world, but it’s modern Americans who live in a fake world, because they have stepped outside the natural circle of life. Do people live in circles today? No. They live in boxes. They wake up every morning in a box of their bedrooms because a box next to them started making beeping noises to tell them it was time to get up. They eat their breakfast out of a box and then they throw that box away into another box. Then they leave the box where they live and get into another box with wheels and drive to work, which is just another big box broken into little cubicle boxes where a bunch of people spend their days sitting and staring at the computer boxes in front of them. When the day is over, everyone gets into the box with wheels again and goes home to the house boxes and spends the evening staring at the television boxes for entertainment. They get their music from a box, they get their food from a box, they keep their clothing in a box, they live their lives in a box. Break out of the box! This is not the way humanity lived for thousands of years.
Elizabeth Gilbert (The Last American Man)
You know what? This isn't about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain's feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn't even a feather in the scales, when a life is at stake. Just shut up and multiply.
Eliezer Yudkowsky
Of course, they haven’t seen your reports yet. (Joe) 'They would have if Carlos could have held a gun on the computer to make that piece of shit send an e-mail.’ (Carlos)
Sherrilyn Kenyon (Whispered Lies (B.A.D. Agency, #3))
But before a computer became an inanimate object, and before Mission Control landed in Houston; before Sputnik changed the course of history, and before the NACA became NASA; before the Supreme Court case Brown v. Board of Education of Topeka established that separate was in fact not equal, and before the poetry of Martin Luther King Jr.’s “I Have a Dream” speech rang out over the steps of the Lincoln Memorial, Langley’s West Computers were helping America dominate aeronautics, space research, and computer technology, carving out a place for themselves as female mathematicians who were also black, black mathematicians who were also female.
Margot Lee Shetterly (Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race)
My password? Of course. Three words, Ignis aurum probat. “Fire tests gold.” The rest of the phrase: “. . . and adversity tests the brave.” How true. A strong password, strong indeed, exactly as required by the computer system. Thank you, Seneca.
Gail Honeyman (Eleanor Oliphant Is Completely Fine)
I am in this same river. I can't much help it. I admit it: I'm racist. The other night I saw a group (or maybe a pack?) of white teenagers standing in a vacant lot, clustered around a 4x4, and I crossed the street to avoid them; had they been black, I probably would have taken another street entirely. And I'm misogynistic. I admit that, too. I'm a shitty cook, and a worse house cleaner, probably in great measure because I've internalized the notion that these are woman's work. Of course, I never admit that's why I don't do them: I always say I just don't much enjoy those activities (which is true enough; and it's true enough also that many women don't enjoy them either), and in any case, I've got better things to do, like write books and teach classes where I feel morally superior to pimps. And naturally I value money over life. Why else would I own a computer with a hard drive put together in Thailand by women dying of job-induced cancer? Why else would I own shirts made in a sweatshop in Bangladesh, and shoes put together in Mexico? The truth is that, although many of my best friends are people of color (as the cliche goes), and others of my best friends are women, I am part of this river: I benefit from the exploitation of others, and I do not much want to sacrifice this privilege. I am, after all, civilized, and have gained a taste for "comforts and elegancies" which can be gained only through the coercion of slavery. The truth is that like most others who benefit from this deep and broad river, I would probably rather die (and maybe even kill, or better, have someone kill for me) than trade places with the men, women, and children who made my computer, my shirt, my shoes.
Derrick Jensen (The Culture of Make Believe)
I am afraid every time I open my computer or look at my phone. I know I shouldn't look. Of course, I shouldn't look. I am afraid of looking but I am afraid not to look too. I am afraid all the time.
Louise O'Neill (Asking For It)
I think that it’s extraordinarily important that we in computer science keep fun in computing. When it started out it was an awful lot of fun. Of course the paying customers got shafted every now and then and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful error-free perfect use of these machines. I don’t think we are. I think we’re responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all I hope we don’t become missionaries. Don’t feel as if you’re Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don’t feel as if the key to successful computing is only in your hands. What’s in your hands I think and hope is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.
Alan J. Perlis
Theory is relevant to you because it shows you a new, simpler, and more elegant side of computers, which we normally consider to be complicated machines. The best computer designs and applications are conceived with elegance in mind. A theoretical course can heighten your aesthetic sense and help you build more beautiful systems.
Michael Sipser (Introduction to the Theory of Computation)
Could he tell her any of this? Of course not. Could he tell her that women almost never qualified for Eternity because, for some reason he did not understand (Computers might, but he himself certainly did not), their abstraction from Time was from ten to a hundred times as likely to distort Reality as was the abstraction of a man.
Isaac Asimov (The End of Eternity)
They thought, for example, that I really ought to know the name Intel because “it’s written on every computer.” I, of course, had never noticed.
Paulo Coelho (Adultery)
He’s already run the standard battery of questions, checked the check boxes, computed the data: hears voices = schizophrenic; too agitated = paranoid; too bright = manic; too moody = bipolar; and of course everyone knows a depressive, a suicidal, and if you’re all-around too unruly or obstructive or treatment resistant like a superbug, you get slapped with a personality disorder, too. In Crote Six, they said I “suffer” from schizoaffective disorder. That’s like the sampler plate of diagnoses, Best of Everything. But I don’t want to suffer. I want to live.
Mira T. Lee (Everything Here Is Beautiful)
Have you heard of the Monte Carlo method? Ah, it’s a computer algorithm often used for calculating the area of irregular shapes. Specifically, the software puts the figure of interest in a figure of known area, such as a circle, and randomly strikes it with many tiny balls, never targeting the same spot twice. After a large number of balls, the proportion of balls that fall within the irregular shape compared to the total number of balls used to hit the circle will yield the area of the shape. Of course, the smaller the balls used, the more accurate the result.
Liu Cixin (The Three-Body Problem (Remembrance of Earth’s Past, #1))
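For readers curious how the method Liu Cixin's character describes looks in practice, here is a minimal sketch. It is an illustration, not anything from the novel: the function names are mine, and for simplicity it bounds the irregular shape with a square of known area rather than a circle, then scales the fraction of random "balls" that land inside the shape.

```python
import random

def monte_carlo_area(inside_shape, bounding_area, sample_point, n=100_000):
    """Estimate the area of an irregular shape by random sampling.

    inside_shape: predicate returning True when a point lies in the shape.
    sample_point: draws a uniform random point from the bounding region.
    bounding_area: the known area of that bounding region.
    """
    hits = sum(1 for _ in range(n) if inside_shape(sample_point()))
    # Fraction of hits, scaled by the known area of the bounding region.
    return bounding_area * hits / n

# Example: the unit disk x^2 + y^2 <= 1 inside the square [-1, 1] x [-1, 1]
# (area 4). The true disk area is pi, so the estimate should land near 3.14.
random.seed(42)
estimate = monte_carlo_area(
    inside_shape=lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0,
    bounding_area=4.0,
    sample_point=lambda: (random.uniform(-1, 1), random.uniform(-1, 1)),
)
```

As the quote notes, accuracy grows with the number of samples: the statistical error shrinks roughly as one over the square root of n.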
Whew, this might be getting a bit confusing. I hope you are following me so far. This is the point in every Theory of Computation course at which students either throw up their hands and say "I can't get my mind around this stuff!" or clap their hands and say "I love this stuff!" Needless to say, I was the second kind of student, even though I shared the confusion of the first.
Melanie Mitchell (Complexity: A Guided Tour)
The Prophet database (of course, the whole IT—computer geek—world called it the For-Profit database) was well written, but all the programs the mother company tried to sell with it were garbage.
Patricia Briggs (Shifting Shadows: Stories from the World of Mercy Thompson)
Of course, the brain is a machine and a computer—everything in classical neurology is correct. But our mental processes, which constitute our being and life, are not just abstract and mechanical, but personal, as well—and, as such, involve not just classifying and categorising, but continual judging and feeling also.
Oliver Sacks (The Man Who Mistook His Wife For A Hat: And Other Clinical Tales)
It’s a long shot, but this baby is pretty cool.” He pushed the button that brought up the menu. “I need to run a search.” “Of course, master,” the computer said with an inviting smile. “Which pornographic material should I seek out today?” Dante grinned. “Really? You can do that?” He felt Meg’s stare. “Nothing like that.
Sophie Oak (Bound (A Faery Story, #1))
Mauchly and Eckert should be at the top of the list of people who deserve credit for inventing the computer, not because the ideas were all their own but because they had the ability to draw ideas from multiple sources, add their own innovations, execute their vision by building a competent team, and have the most influence on the course of subsequent developments. The machine they built was the first general-purpose electronic computer.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite -- just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consist of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.
Theodore J. Kaczynski
I took a course at Cal once called Statistical Analysis. And there was a guy in the course who used to make up all of his computations and he never used Sigma. He used his own initials. 'Cause he was the standard deviation.
Mort Sahl
Well, whatever you want to say, I recommend you come right out and say it. Just open your mouth and tell the world what's on your mind. Of course, with your generation, I always feel like I have to add this: Please don't do it through text or e-mail or anything like that. When you need to communicate something important, speak your truth face-to-face.... When you say what you have to say through a computer or phone, there are often miscommunications. But when it's just you and someone else, and you're right in front of them, speaking your truth, they are much more likely to understand.
Ali Benjamin (The Thing About Jellyfish)
Of course, I’m referring to the original. With Gene Wilder. Not the lame re-make with Depp. I like Depp. Don’t get me wrong. However, that rendition was totally spoiled by the single Umpa-Lumpa multiplied by however many in computer graphics. Awful.
Phillip Tomasso III (Vaccination (Vaccination Trilogy, #1))
Long live transfinite mountains, the hollow earth, time machines, fractal writing, aliens, dada, telepathy, flying saucers, warped space, teleportation, artificial reality, robots, pod people, hylozoism, endless shrinking, intelligent goo, antigravity, surrealism, software highs, two-dimensional time, gnarly computation, the art of photo composition, pleasure zappers, nanomachines, mind viruses, hyperspace, monsters from the deep and, of course, always and forever, the attack of the giant ants!
Rudy Rucker
Information, defined intuitively and informally, might be something like 'uncertainty's antidote.' This turns out also to be the formal definition- the amount of information comes from the amount by which something reduces uncertainty...The higher the [information] entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things- from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test...Entropy suggests that we gain the most insight on a question when we take it to the friend, colleague, or mentor of whose reaction and response we're least certain. And it suggests, perhaps, reversing the equation, that if we want to gain the most insight into a person, we should ask the question of whose answer we're least certain... Pleasantries are low entropy, biased so far that they stop being an earnest inquiry and become ritual. Ritual has its virtues, of course, and I don't quibble with them in the slightest. But if we really want to start fathoming someone, we need to get them speaking in sentences we can't finish.
Brian Christian (The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive)
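The formal definition Christian alludes to is Shannon entropy. A small sketch (my own, not from the book) shows his "pleasantries are low entropy" point numerically: a fair coin is maximally uncertain at one bit per flip, while a near-ritual exchange whose answer is almost always the same carries close to zero bits.

```python
from collections import Counter
from math import log2

def shannon_entropy(outcomes):
    """Shannon entropy, in bits, of the empirical distribution of outcomes."""
    counts = Counter(outcomes)
    total = sum(counts.values())
    # H = -sum over outcomes of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin: maximum uncertainty for two outcomes, exactly 1 bit.
fair_coin = shannon_entropy(["H", "T"])

# A ritual question ("How are you?") answered "fine" 99 times out of 100:
# almost no uncertainty is resolved, so the entropy is close to zero.
ritual = shannon_entropy(["fine"] * 99 + ["bad"])
```

The higher-entropy question is the one whose answer you genuinely cannot predict, which is exactly the kind Christian recommends asking.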
Dartmouth College employs computer learning techniques in a very broad array of courses. For example, a student can gain a deep insight into the statistics of Mendelian genetics in an hour with the computer rather than spend a year crossing fruit flies in the laboratory.
Carl Sagan (The Dragons of Eden: Speculations on the Evolution of Human Intelligence)
Of course, there are crude messages. Vile ones. Ones that don’t seem like they came from a real human being at all, but some computer program designed to say things no person should say to another person. I read all of those too, like Pringles: they might be terrible for you, but once you pop, you can’t stop. This is a rollercoaster that only goes down. Near the end I feel like a hollow shell clicking a mouse, scanning words with aching eyes.
Francesca Zappia (Eliza and Her Monsters)
Most persons are surprised, and many distressed, to learn that essentially the same objections commonly urged today against computers were urged by Plato in the Phaedrus (274–7) and in the Seventh Letter against writing. Writing, Plato has Socrates say in the Phaedrus, is inhuman, pretending to establish outside the mind what in reality can be only in the mind. It is a thing, a manufactured product. The same of course is said of computers. Secondly, Plato's Socrates urges, writing destroys memory. Those who use writing will become forgetful, relying on an external resource for what they lack in internal resources. Writing weakens the mind.
Walter J. Ong (Orality and Literacy: The Technologizing of the Word (New Accents))
Do you prefer fermented or distilled? This is a trick question. It doesn’t matter how much you like wine, because wine is social and writing is anti-social. This is a writer’s interview, writing is a lonely job, and spirits are the lubricant of the lonely. You might say all drinking is supposed to be social but there’s a difference, at one in the morning while you’re hunched over your computer, between opening up a bottle of Chardonnay and pouring two-fingers of bourbon into a tumbler. A gin martini, of course, splits the difference nicely, keeping you from feeling like a deadline reporter with a smoldering cigarette while still reminding you that your job is to be interesting for a living. Anyone who suggests you can make a martini with vodka, by the way, is probably in need of electroconvulsive therapy.
Stuart Connelly
The changing of bodies into light, and light into bodies, is very conformable to the course of Nature, which seems delighted with transmutations.” —Isaac Newton
Scott Krig (Computer Vision Metrics: Survey, Taxonomy, and Analysis)
Good modeling requires that we have just enough of the “right” transparencies in the map. Of course, the right transparencies depend on the needs of a particular user.
John H. Miller (Complex Adaptive Systems: An Introduction to Computational Models of Social Life (Princeton Studies in Complexity Book 14))
Of course, not all slow thinking requires that form of intense concentration and effortful computation—I did the best thinking of my life on leisurely walks with Amos.
Daniel Kahneman (Thinking, Fast and Slow)
Wikipedia is run by hippies of course - the same kind of impractical utopian losers who gave us the first affordable desktop computer and the iPod
Andre the BFG (Andre's Adventures in MySpace (Book 2))
But could something think, understand, and so on solely in virtue of being a computer with the right sort of program? Could instantiating a program, the right program of course, by itself be a sufficient condition of understanding?" This I think is the right question to ask, though it is usually confused with one or more of the earlier questions, and the answer to it is no.
John Rogers Searle
If “piracy” means using value from someone else’s creative property without permission from that creator–as it is increasingly described today – then every industry affected by copyright today is the product and beneficiary of a certain kind of piracy. Film, records, radio, cable TV… Extremists in this debate love to say “You wouldn’t go into Barnes & Noble and take a book off of the shelf without paying; why should it be any different with online music?” The difference is, of course, that when you take a book from Barnes & Noble, it has one less book to sell. By contrast, when you take an MP3 from a computer network, there is not one less CD that can be sold. The physics of piracy of the intangible are different from the physics of piracy of the tangible.
Lawrence Lessig (Free Culture: The Nature and Future of Creativity)
Where are you going?” I demanded. He looked over his shoulder at me with an exasperated sigh. “To my room, of course.” “Can’t we write the paper down here?” I asked. The corners of Wesley’s mouth turned slightly upward as he hooked a finger over his belt. “We could, Duffy, but the writing will go much faster if I’m typing, and my computer’s upstairs. You’re the one who said you wanted to get this over with.
Kody Keplinger (The DUFF: Designated Ugly Fat Friend (Hamilton High, #1))
Lately, because computer technology has made self-publishing an easier and less expensive venture, I'm getting a lot of review copies of amateur books by writers who would be better advised to hone their craft before committing it to print. The best thing you can do as a beginning writer is to write, write, write - and read, read, read. Concentrating on publication prematurely is a mistake. You don't pick up a violin and expect to play Carnegie Hall within the year - yet somehow people forget that writing also requires technical skills that need to be learned, practiced, honed. If I had a dollar for every person I've met who thought, with no prior experience, they could sit down and write a novel and instantly win awards and make their living as a writer, I'd be a rich woman today. It's unrealistic, and it's also mildly insulting to professional writers who have worked hard to perfect their craft. Of course, then you hear stories about people like J.K. Rowling, who did sit down with no prior experience and write a worldwide best-seller...but such people are as rare as hen's teeth. Every day I work with talented, accomplished writers who have many novels in print and awards to their name and who are ‘still’ struggling to make a living. The thing I often find myself wanting to say to new writers is: Write because you love writing, learn your craft, be patient, and be realistic. Anais Nin said about writing, "It should be a necessity, as the sea needs to heave, and I call it breathing."
Terri Windling
There are two moments in the course of education where a lot of kids fall off the math train. The first comes in the elementary grades, when fractions are introduced. Until that moment, a number is a natural number, one of the figures 0, 1, 2, 3 . . . It is the answer to a question of the form “how many.”* To go from this notion, so primitive that many animals are said to understand it, to the radically broader idea that a number can mean “what portion of,” is a drastic philosophical shift. (“God made the natural numbers,” the nineteenth-century algebraist Leopold Kronecker famously said, “and all the rest is the work of man.”) The second dangerous twist in the track is algebra. Why is it so hard? Because, until algebra shows up, you’re doing numerical computations in a straightforwardly algorithmic way. You dump some numbers into the addition box, or the multiplication box, or even, in traditionally minded schools, the long-division box, you turn the crank, and you report what comes out the other side. Algebra is different. It’s computation backward. When you’re asked to solve
Jordan Ellenberg (How Not to Be Wrong: The Power of Mathematical Thinking)
From then on, my computer monitored my vital signs and kept track of exactly how many calories I burned during the course of each day. If I didn’t meet my daily exercise requirements, the system prevented me from logging into my OASIS account. This meant that I couldn’t go to work, continue my quest, or, in effect, live my life. Once the lockout was engaged, you couldn’t disable it for two months. And the software was bound to my OASIS account, so I couldn’t just buy a new computer or go rent a booth in some public OASIS café. If I wanted to log in, I had no choice but to exercise first. This proved to be the only motivation I needed. The lockout software also monitored my dietary intake. Each day I was allowed to select meals from a preset menu of healthy, low-calorie foods. The software would order the food for me online and it would be delivered to my door. Since I never left my apartment, it was easy for the program to keep track of everything I ate. If I ordered additional food on my own, it would increase the amount of exercise I had to do each day, to offset my additional calorie intake. This was some sadistic software. But it worked. The pounds began to melt off, and after a few months, I was in near-perfect health. For the first time in my life I had a flat stomach, and muscles. I also had twice the energy, and I got sick a lot less frequently. When the two months ended and I was finally given the option to disable the fitness lockout, I decided to keep it in place. Now, exercising was a part of my daily ritual.
Ernest Cline (Ready Player One (Ready Player One, #1))
But the Esquire passage I found most poignant and revealing was this one: Mister Rogers' visit to a teenage boy severely afflicted with cerebral palsy and terrible anger. One of the boy's few consolations in life, Junod wrote, was watching Mister Rogers' Neighborhood. 'At first, the boy was made very nervous by the thought that Mister Rogers was visiting him. He was so nervous, in fact, that when Mister Rogers did visit, he got mad at himself and began hating himself and hitting himself, and his mother had to take him to another room and talk to him. Mister Rogers didn't leave, though. He wanted something from the boy, and Mister Rogers never leaves when he wants something from somebody. He just waited patiently, and when the boy came back, Mister Rogers talked to him, and then he made his request. He said, 'I would like you to do something for me. Would you do something for me?' On his computer, the boy answered yes, of course, he would do anything for Mister Rogers, so then Mister Rogers said: I would like you to pray for me. Will you pray for me?' And now the boy didn't know how to respond. He was thunderstruck... because nobody had ever asked him for something like that, ever. The boy had always been prayed for. The boy had always been the object of prayer, and now he was being asked to pray for Mister Rogers, and although at first he didn't know how to do it, he said he would, he said he'd try, and ever since then he keeps Mister Rogers in his prayers and doesn't talk about wanting to die anymore, because he figures if Mister Rogers likes him, that must mean that God likes him, too. As for Mister Rogers himself... he doesn't look at the story the same way the boy did or I did. In fact, when Mister Rogers first told me the story, I complimented him on being smart - for knowing that asking the boy for his prayers would make the boy feel better about himself - and Mister Rogers responded by looking at me first with puzzlement and then with surprise. 
'Oh heavens no, Tom! I didn't ask him for his prayers for him; I asked for me. I asked him because I think that anyone who has gone through challenges like that must be very close to God. I asked him because I wanted his intercession.
Tim Madigan (I'm Proud of You: My Friendship with Fred Rogers)
Users are a double-edged sword. They can help you improve your language, but they can also deter you from improving. So choose your users carefully, and be slow to grow their number. Having users is like optimization: the wise course is to delay it.
Paul Graham (Hackers and Painters)
There were marches, of course, a lot of women and some men. But they were smaller than you might have thought. I guess people were scared. And when it was known that the police, or the army, or whoever they were, would open fire almost as soon as any of the marches even started, the marches stopped. A few things were blown up, post offices, subway stations. But you couldn't even be sure who was doing it. It could have been the army, to justify the computer searches and the other ones, the door-to-doors.
Margaret Atwood (The Handmaid's Tale (The Handmaid's Tale, #1))
The nations, of course, that are most at risk of a destructive digital attack are the ones with the greatest connectivity. Marcus Ranum, one of the early innovators of the computer firewall, called Stuxnet 'a stone thrown by people who live in a glass house'.
Kim Zetter (Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon)
The study showed that chronic loneliness impacts our bodies as negatively as smoking two packs of cigarettes a day. Not the same way, of course, just the life risk part. And there's more bad news. The article went on to say that lonely people had worse reactions to flu shots than non-lonelies (I think I just made up that word; my computer put a red squiggly line under it) and that loneliness depresses the immune system. In other words, if you're lonely, not even your body wants to be around you, so it tries to off itself.
Richard Paul Evans (The Mistletoe Secret (Mistletoe #3))
For as to what we have heard you affirm, that there are other kingdoms and states in the world inhabited by human creatures as large as yourself, our philosophers are in much doubt, and would rather conjecture that you dropped from the moon, or one of the stars; because it is certain, that a hundred mortals of your bulk would in a short time destroy all the fruits and cattle of his majesty’s dominions: besides, our histories of six thousand moons make no mention of any other regions than the two great empires of Lilliput and Blefuscu. Which two mighty powers have, as I was going to tell you, been engaged in a most obstinate war for six-and-thirty moons past. It began upon the following occasion. It is allowed on all hands, that the primitive way of breaking eggs, before we eat them, was upon the larger end; but his present majesty’s grandfather, while he was a boy, going to eat an egg, and breaking it according to the ancient practice, happened to cut one of his fingers. Whereupon the emperor his father published an edict, commanding all his subjects, upon great penalties, to break the smaller end of their eggs. The people so highly resented this law, that our histories tell us, there have been six rebellions raised on that account; wherein one emperor lost his life, and another his crown. These civil commotions were constantly fomented by the monarchs of Blefuscu; and when they were quelled, the exiles always fled for refuge to that empire. It is computed that eleven thousand persons have at several times suffered death, rather than submit to break their eggs at the smaller end. Many hundred large volumes have been published upon this controversy: but the books of the Big-endians have been long forbidden, and the whole party rendered incapable by law of holding employments. 
During the course of these troubles, the emperors of Blefusca did frequently expostulate by their ambassadors, accusing us of making a schism in religion, by offending against a fundamental doctrine of our great prophet Lustrog, in the fifty-fourth chapter of the Blundecral (which is their Alcoran). This, however, is thought to be a mere strain upon the text; for the words are these: ‘that all true believers break their eggs at the convenient end.’ And which is the convenient end, seems, in my humble opinion to be left to every man’s conscience, or at least in the power of the chief magistrate to determine.
Jonathan Swift (Gulliver's Travels)
Daisy didn’t have a computer, so she did everything on her phone, from texting to writing fan fiction. She could type on it faster than I could on a regular keyboard. “Have you ever gotten a dick pic?” she asked in lieu of saying hello. “Um, I’ve seen one,” I said, scooting into the bench across from her. “Well, of course you’ve seen one, Holmesy. Christ, I’m not asking if you’re a seventeenth-century nun. I mean have you ever received an unsolicited, no-context dick pic. Like, a dick pic as a form of introduction.” “Not really,” I said.
John Green (Turtles All the Way Down)
A potential dajjalic interruption is an excessive esoterism. All of these people on a grail quest and looking for the ultimate secret to Ibn Arabi's 21st heaven and endlessly going into the most esoteric stuff without getting the basics right, that is also a fundamental error of our age because the nafs loves all sorts of spiritual stories without taming itself first. The tradition that was practiced in this place for instance (Turkey) was not by starting out on the unity of being or (spiritual realities). Of course not. You start off in the kitchen for a year and then you make your dhikr in your khanaqah and you're in the degree of service. Even Shah Bahauddin Naqshband, who was a great scholar, needed 21 years before he was 'cooked'. But we want to find a shortcut. Everything's a shortcut. Even on the computer there's a shortcut for everything. Some way around the hard work, and we want the same thing. Because there seems to be so little time (or so little barakah in our time) but there is no shortcut unless of course Allah (SWT) opens up a door of paradise or a way for you to go very fast. But we can't rely on that happening because it's not common. Mostly it's salook, constantly trudging forward and carrying the burden until it becomes something sweet and light. And that takes time, so the esoteric deviation is common in our age as well.
Abdal Hakim Murad
But the “jobs of the future” do not need scientists who have memorized the periodic table. In fact, business leaders say they are looking for creative, independent problem solvers in every field, not just math and science. Yet in most schools, STEM subjects are taught as a series of memorized procedures and vocabulary words, when they are taught at all. In 2009, only 3% of high school graduates had any credits in an engineering course. (National Science Board, 2012) Technology is increasingly being relegated to using computers for Internet research and test taking.
Sylvia Libow Martinez (Invent To Learn: Making, Tinkering, and Engineering in the Classroom)
In the course of time, all of Apple’s competitors lost their WHY. Now all those companies define themselves by WHAT they do: we make computers. They turned from companies with a cause into companies that sold products. And when that happens, price, quality, service and features become the primary currency to motivate a purchase decision. At that point a company and its products have ostensibly become commodities. As any company forced to compete on price, quality, service or features alone can attest, it is very hard to differentiate for any period of time or build loyalty on those factors alone.
Simon Sinek (Start with Why: How Great Leaders Inspire Everyone to Take Action)
Well, besides, I’ve arranged with the computer that anyone who doesn’t look and sound like one of us will be killed if he—or she—tries to board the ship. I’ve taken the liberty of explaining that to the Port Commander. I told him very politely that I would love to turn off that particular facility out of deference to the reputation that the Sayshell City Spaceport holds for absolute integrity and security—throughout the Galaxy, I said—but the ship is a new model and I didn’t know how to turn it off.” “He didn’t believe that, surely.” “Of course not! But he had to pretend he did, as otherwise he would have had no choice but to be insulted. And since there would be nothing he could do about that, being insulted would only lead to humiliation. And since he didn’t want that, the simplest path to follow was to believe what I said.” “And that’s another example of how people are?” “Yes. You’ll get used to this.
Isaac Asimov (Foundation's Edge (Foundation, #4))
The only sustainable approach to thinking today about problems, he argues, “is thinking without a box.” Of course, that doesn’t mean having no opinion. Rather, it means having no limits on your curiosity or the different disciplines you might draw on to appreciate how the Machine works. Wells calls this approach—which I will employ in this book—being “radically inclusive.” It involves bringing into your analysis as many relevant people, processes, disciplines, organizations, and technologies as possible—factors that are often kept separate or excluded altogether. For instance, the only way you will understand the changing nature of geopolitics today is if you meld what is happening in computing with what is happening in telecommunications with what is happening in the environment with what is happening in globalization with what is happening in demographics. There is no other way today to develop a fully rounded picture.
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
Lately, I usually write at the desk in my living-room or bedroom. From time to time, our red and stripy cat named Foxy decides to be my companion, poking his curious caramel-colored nose to the screen, watching me typing, and making attempts to put his paws on the keyboard despite the fact that he knows he is not allowed to; he also loves to arrange “sunbathing” sessions for himself, purring joyfully while lying with his belly up under the lamp placed to the left of my computer; and, of course, the cat can’t wait for when I happen to have a snack, to beg for some treats that seem to him tastiest if eaten from a caring human’s hand.
Sahara Sanders
Based on the above analyses, it is reasonable to expect the hardware that can emulate human-brain functionality to be available for approximately one thousand dollars by around 2020. As we will discuss in chapter 4, the software that will replicate that functionality will take about a decade longer. However, the exponential growth of the price-performance, capacity, and speed of our hardware technology will continue during that period, so by 2030 it will take a village of human brains (around one thousand) to match a thousand dollars’ worth of computing. By 2050, one thousand dollars of computing will exceed the processing power of all human brains on Earth. Of course, this figure includes those brains still using only biological neurons.
Ray Kurzweil (The Singularity is Near: When Humans Transcend Biology)
There is another issue with the largely cognitive approach to management, which we had big-time at Google. Smart, analytical people, especially ones steeped in computer science and mathematics as we were, will tend to assume that data and other empirical evidence can solve all problems. Quants or techies with this worldview tend to see the inherently messy, emotional tension that’s always present in teams of humans as inconvenient and irrational—an irritant that will surely be resolved in the course of a data-driven decision process. Of course, humans don’t always work that way. Things come up, tensions arise, and they don’t naturally go away. People do their best to avoid talking about these situations, because they’re awkward. Which makes it worse.
Eric Schmidt (Trillion Dollar Coach: The Leadership Playbook of Silicon Valley's Bill Campbell)
April 26th, 2014 is not only the day of the Alamogordo dig, it’s also my mother’s 78th birthday. How perfect is that? Without her, I wouldn’t be here. Of course, with her I might not be here either. She didn’t want me to go to Atari. When I announced I was leaving Hewlett-Packard to go make games, she told me I was throwing my life away. She told me I wasn’t her son, because no child of hers would do such a stupid thing. She came around though. After I made several million-sellers and put an addition on her home, she told me it was a good thing I had listened to her and gone into computers. This may shed some light on how my background prepared me for becoming a therapist, and before that a client. After all, if it weren’t for families, there would be no therapists.
Howard Scott Warshaw (Once Upon Atari: How I made history by killing an industry)
Many people, even those who view themselves as liberals on other issues, tend to grow indignant, even rather agitated, if invited to look closely at these inequalities. “Life isn’t fair,” one parent in Winnetka answered flatly when I pressed the matter. “Wealthy children also go to summer camp. All summer. Poor kids maybe not at all. Or maybe, if they’re lucky, for two weeks. Wealthy children have the chance to go to Europe and they have the access to good libraries, encyclopedias, computers, better doctors, nicer homes. Some of my neighbors send their kids to schools like Exeter and Groton. Is government supposed to equalize these things as well?” But government, of course, does not assign us to our homes, our summer camps, our doctors—or to Exeter. It does assign us to our public schools. Indeed, it forces us to go to them. Unless we have the wealth to pay for private education, we are compelled by law to go to public school—and to the public school in our district. Thus the state, by requiring attendance but refusing to require equity, effectively requires inequality. Compulsory inequity, perpetuated by state law, too frequently condemns our children to unequal lives.
Jonathan Kozol (Savage Inequalities: Children in America's Schools)
Technology, I said before, is most powerful when it enables transitions—between linear and circular motion (the wheel), or between real and virtual space (the Internet). Science, in contrast, is most powerful when it elucidates rules of organization—laws—that act as lenses through which to view and organize the world. Technologists seek to liberate us from the constraints of our current realities through those transitions. Science defines those constraints, drawing the outer limits of the boundaries of possibility. Our greatest technological innovations thus carry names that claim our prowess over the world: the engine (from ingenium, or “ingenuity”) or the computer (from computare, or “reckoning together”). Our deepest scientific laws, in contrast, are often named after the limits of human knowledge: uncertainty, relativity, incompleteness, impossibility. Of all the sciences, biology is the most lawless; there are few rules to begin with, and even fewer rules that are universal. Living beings must, of course, obey the fundamental rules of physics and chemistry, but life often exists on the margins and interstices of these laws, bending them to their near-breaking limit. The universe seeks equilibriums; it prefers to disperse energy, disrupt organization, and maximize chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organize chemicals into compartments; we sort laundry on Wednesdays. “It sometimes seems as if curbing entropy is our quixotic purpose in the universe,” James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses.
Siddhartha Mukherjee (The Gene: An Intimate History)
Pharaohs It took Khufu twenty-three years to build his Great Pyramid at Giza, where some eleven hundred stone blocks, each weighing about two and a half tons, had to be quarried, moved, and set in place every day during the annual building season, roughly four months long. Few commentators on these facts can resist noting that this achievement is an amazing testimonial to the pharaoh’s iron control over the workers of Egypt. I submit, on the contrary, that pharaoh Khufu needed to exercise no more control over his workers at Giza than pharaoh Bill Gates exercises over his workers at Microsoft. I submit that Egyptian workers, relatively speaking, got as much out of building Khufu’s pyramid as Microsoft workers will get out of building Bill Gates’s pyramid (which will surely dwarf Khufu’s a hundred times over, though it will not, of course, be built of stone). No special control is needed to make people into pyramid builders—if they see themselves as having no choice but to build pyramids. They’ll build whatever they’re told to build, whether it’s pyramids, parking garages, or computer programs. Karl Marx recognized that workers without a choice are workers in chains. But his idea of breaking chains was for us to depose the pharaohs and then build the pyramids for ourselves, as if building pyramids is something we just can’t stop doing, we love it so much.
Daniel Quinn (Beyond Civilization: Humanity's Next Great Adventure)
Our intellect is like a judge who decides the ‘right’ course of action. There is no dearth of information or knowledge in the world, and the mind is where all this is stored. A simple computer can beat the mind by storing many times more information than the mind can ever imagine grasping in its lifetime. But both the mind, as well as the computer, are of no use without the intellect, which alone can help them use their stored information. This makes the intellect superior to both.
Awdhesh Singh (Practising Spiritual Intelligence: For Innovation, Leadership and Happiness)
Of course, I don’t remember any of this time. It is absolutely impossible to identify with the infant my parents photographed, indeed so impossible that it seems wrong to use the word “me” to describe what is lying on the changing table, for example, with unusually red skin, arms and legs spread, and a face distorted into a scream, the cause of which no one can remember, or on a sheepskin rug on the floor, wearing white pajamas, still red-faced, with large, dark eyes squinting slightly. Is this creature the same person as the one sitting here in Malmö writing? And will the forty-year-old creature who is sitting in Malmö writing this one overcast September day in a room filled with the drone of the traffic outside and the autumn wind howling through the old-fashioned ventilation system be the same as the gray, hunched geriatric who in forty years from now might be sitting dribbling and trembling in an old people’s home somewhere in the Swedish woods? Not to mention the corpse that at some point will be laid out on a bench in a morgue? Still known as Karl Ove. And isn’t it actually unbelievable that one simple name encompasses all of this? The fetus in the belly, the infant on the changing table, the forty-year-old in front of the computer, the old man in the chair, the corpse on the bench? Wouldn’t it be more natural to operate with several names since their identities and self-perceptions are so very different? Such that the fetus might be called Jens Ove, for example, and the infant Nils Ove, and the five- to ten-year-old Per Ove, the ten- to twelve-year-old Geir Ove, the twelve- to seventeen-year-old Kurt Ove, the seventeen- to twenty-three-year-old John Ove, the twenty-three- to thirty-two-year-old Tor Ove, the thirty-two- to forty-six-year-old Karl Ove — and so on and so forth? Then the first name would represent the distinctiveness of the age range, the middle name would represent continuity, and the last, family affiliation.
Karl Ove Knausgård (Min kamp 3 (Min kamp, #3))
I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don't become missionaries. Don't feel as if you're Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don't feel as if the key to successful computing is only in your hands. What's in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.
Alan J. Perlis (Structure and Interpretation of Computer Programs)
The ‘one-drop rule’ was the foundation of slavery and miscegenation laws in many states, literally used to determine the legal status of individuals, whether they would be enslaved or free. Its logic extended from the notorious three-fifths compromise in the Constitution, which computed slaves as three-fifths of a person for purposes of counting the population when apportioning representation to government. Slaves could not, of course, vote; but white slave owners wanted them to count as part of the population so that their states could send more representatives to government, surely one of the more outrageous instances of having it both ways in human history. America was a nation long accustomed to quantifying people in terms of ethnic and racial composition, as words like mulatto and half-caste, quadroon and octoroon, make clear. Declaring someone ‘one hundred per cent American’ was no mere metaphor in a country that measured people in percentages and fractions, in order to deny some of them full humanity.
Sarah Churchwell (Behold, America: The Entangled History of "America First" and "the American Dream")
In 2003, a Dutch clinical psychologist named Christof van Nimwegen began a fascinating study of computer-aided learning that a BBC writer would later call “one of the most interesting examinations of current computer use and the potential downsides of our increasing reliance on screen-based interaction with information systems.”26 Van Nimwegen had two groups of volunteers work through a tricky logic puzzle on a computer. The puzzle involved transferring colored balls between two boxes in accordance with a set of rules governing which balls could be moved at which time. One of the groups used software that had been designed to be as helpful as possible. It offered on-screen assistance during the course of solving the puzzle, providing visual cues, for instance, to highlight permitted moves. The other group used a bare-bones program, which provided no hints or other guidance. In the early stages of solving the puzzle, the group using the helpful software made correct moves more quickly than the other group, as would be expected. But as the test proceeded, the proficiency of the members of the group using the bare-bones software increased more rapidly. In the end, those using the unhelpful program were able to solve the puzzle more quickly and with fewer wrong moves. They also reached fewer impasses—states in which no further moves were possible—than did the people using the helpful software. The findings indicated, as van Nimwegen reported, that those using the unhelpful software were better able to plan ahead and plot strategy, while those using the helpful software tended to rely on simple trial and error. Often, in fact, those with the helpful software were found “to aimlessly click around” as they tried to crack the puzzle.
Nicholas Carr (The Shallows: What the Internet is Doing to Our Brains)
Spacewar highlighted three aspects of the hacker culture that became themes of the digital age. First, it was created collaboratively. “We were able to build it together, working as a team, which is how we liked to do things,” Russell said. Second, it was free and open-source software. “People asked for copies of the source code, and of course we gave them out.” Of course—that was in a time and place when software yearned to be free. Third, it was based on the belief that computers should be personal and interactive. “It allowed us to get our hands on a computer and make it respond to us in real time,” said Russell.10
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Take for example job applications. In the 21st century the decision whether to hire somebody for a job will increasingly be made by algorithms. We cannot rely on the machines to set the relevant ethical standards, humans will still need to do that, but once we decide on an ethical standard in the job market, that it is wrong to discriminate against blacks or against women for example, we can rely on machines to implement and maintain these standards better than humans. A human manager may know and even agree that it is unethical to discriminate against blacks and women, but then, when a black woman applies for a job, the manager subconsciously discriminates against her and decides not to hire her. If we allow a computer to evaluate job applications and program computers to completely ignore race and gender, we can be certain that the computer will indeed ignore these factors, because computers do not have a subconscious. Of course it won't be easy to write code for evaluating job applications, and there is always the danger that the engineers will somehow program their own subconscious biases into the software, yet once we discover such mistakes it would probably be far easier to debug the software than to rid humans of their racist and misogynist biases.
Yuval Noah Harari (21 Lessons for the 21st Century)
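Harari's claim, that an evaluator which never receives race or gender cannot weigh them, can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from the book: the field names (`race`, `gender`, `experience_years`, `skills`) and the toy scoring rule are assumptions.

```python
# Hypothetical sketch of Harari's point: a scorer that never sees
# protected attributes cannot weigh them, consciously or otherwise.
PROTECTED = {"race", "gender"}

def score_applicant(applicant: dict) -> float:
    # Strip protected fields before scoring so they cannot influence the result.
    visible = {k: v for k, v in applicant.items() if k not in PROTECTED}
    # Toy scoring rule (an assumption): years of experience plus two points per skill.
    return visible.get("experience_years", 0) + 2 * len(visible.get("skills", []))

a = {"experience_years": 5, "skills": ["sql", "python"], "race": "x", "gender": "y"}
b = {"experience_years": 5, "skills": ["sql", "python"], "race": "p", "gender": "q"}
assert score_applicant(a) == score_applicant(b)  # identical on merit, identical score
```

As Harari warns in the same passage, bias can still leak in through proxy variables the engineers leave visible; dropping fields removes only the direct channel.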
The mind cannot be a blank slate, because blank slates don’t do anything. As long as people had only the haziest concept of what a mind was or how it might work, the metaphor of a blank slate inscribed by the environment did not seem too outrageous. But as soon as one starts to think seriously about what kind of computation enables a system to see, think, speak, and plan, the problem with blank slates becomes all too obvious: they don’t do anything. The inscriptions will sit there forever unless something notices patterns in them, combines them with patterns learned at other times, uses the combinations to scribble new thoughts onto the slate, and reads the results to guide behavior toward goals. Locke recognized this problem and alluded to something called “the understanding,” which looked at the inscriptions on the white paper and carried out the recognizing, reflecting, and associating. But of course explaining how the mind understands by invoking something called “the understanding” is circular. This argument against the Blank Slate was stated pithily by Gottfried Wilhelm Leibniz (1646-1716) in a reply to Locke. Leibniz repeated the empiricist motto “There is nothing in the intellect that was not first in the senses,” then added, “except the intellect itself.”8
Steven Pinker (The Blank Slate: The Modern Denial of Human Nature)
...Now let's set the record straight. There's no argument over the choice between peace and war, but there's only one guaranteed way you can have peace—and you can have it in the next second—surrender. Admittedly, there's a risk in any course we follow other than this, but every lesson of history tells us that the greater risk lies in appeasement, and this is the specter our well-meaning liberal friends refuse to face—that their policy of accommodation is appeasement, and it gives no choice between peace and war, only between fight or surrender. If we continue to accommodate, continue to back and retreat, eventually we have to face the final demand—the ultimatum. And what then—when Nikita Khrushchev has told his people he knows what our answer will be? He has told them that we're retreating under the pressure of the Cold War, and someday when the time comes to deliver the final ultimatum, our surrender will be voluntary, because by that time we will have been weakened from within spiritually, morally, and economically. He believes this because from our side he's heard voices pleading for "peace at any price" or "better Red than dead," or as one commentator put it, he'd rather "live on his knees than die on his feet." And therein lies the road to war, because those voices don't speak for the rest of us. You and I know and do not believe that life is so dear and peace so sweet as to be purchased at the price of chains and slavery. If nothing in life is worth dying for, when did this begin—just in the face of this enemy? Or should Moses have told the children of Israel to live in slavery under the pharaohs? Should Christ have refused the cross? Should the patriots at Concord Bridge have thrown down their guns and refused to fire the shot heard 'round the world? The martyrs of history were not fools, and our honored dead who gave their lives to stop the advance of the Nazis didn't die in vain. Where, then, is the road to peace? Well it's a simple answer after all. 
You and I have the courage to say to our enemies, "There is a price we will not pay." "There is a point beyond which they must not advance." And this—this is the meaning in the phrase of Barry Goldwater's "peace through strength." Winston Churchill said, "The destiny of man is not measured by material computations. When great forces are on the move in the world, we learn we're spirits—not animals." And he said, "There's something going on in time and space, and beyond time and space, which, whether we like it or not, spells duty." You and I have a rendezvous with destiny. We'll preserve for our children this, the last best hope of man on earth, or we'll sentence them to take the last step into a thousand years of darkness...
Ronald Reagan (Speaking My Mind: Selected Speeches)
“Raquel? You coming?” “I honestly never thought I would see the light of day again.” “Aww, come on. With me on your side? Of course things worked out.” She tried to smile, but her eyes filled with tears. “Thank you, Evie.” I threw my arms around her in a hug. “You don’t have to thank me.” “I really do. You wonderful girl. I’ve missed you so much.” “Well, now that we’re both unemployed fugitives, think of how much time we’ll have to hang out!” She laughed drily, and we walked with our arms around each other to the house. I opened the door and yelled, “Evie alert! Coming into the family room!” “You made it!” Lend shouted back. “Just a sec, I’ll go to the kitchen. Raquel’s with you?” “Yup!” “Good job! Jack and Arianna got back a couple of minutes ago.” I walked into the family room to find Arianna and Jack sitting on the couch, arguing. “But there would have been no point to you being there if it hadn’t been for my computer prowess.” “But your computer prowess wouldn’t have mattered if you couldn’t have gotten into the Center in the first place.” “Being a glorified taxi does not make you the bigger hero.” “Being a nerd who can tap on a keyboard or being able to navigate the dark eternities of the Faerie Paths . . . hmmm . . . which is a rarer and more valuable skill . . .” I put my hands on my hips. “Okay, kids, take it elsewhere. Raquel and I have work to do.” “Evie,” Raquel said. She was staring at Jack in horror. “Oh, that.” I waved a hand dismissively. “It’s all good. Jack’s been helping us.” “Don’t you remember how he tried to kill you?” Jack rolled his eyes. “Boring. We’ve all moved on.” “Really?” “Not really,” I said. “But he’s behaving. And everyone needs a glorified taxi now and then.” “Admit it: you all adore me.” Jack bowed dramatically as he left the room. Arianna smiled tightly at Raquel and left after him. Raquel collapsed onto the couch and closed her eyes. “You’re working with Reth and Jack? Have you lost your mind?” “Oh, that happened ages ago.
But I’ve had to do a lot of rescuing lately, and those two come in handy.” “Do you trust them?” “No, we don’t,” Lend called from the kitchen.
Kiersten White (Endlessly (Paranormalcy, #3))
I think we're all just doing our best to survive the inevitable pain and suffering that walks alongside us through life. Long ago, it was wild animals and deadly poxes and harsh terrain. I learned about it playing The Oregon Trail on an old IBM in my computer class in the fourth grade. The nature of the trail has changed, but we keep trekking along. We trek through the death of a sibling, a child, a parent, a partner, a spouse; the failed marriage, the crippling debt, the necessary abortion, the paralyzing infertility, the permanent disability, the job you can't seem to land; the assault, the robbery, the break-in, the accident, the flood, the fire; the sickness, the anxiety, the depression, the loneliness, the betrayal, the disappointment, and the heartbreak. There are these moments in life where you change instantly. In one moment, you're the way you were, and in the next, you're someone else. Like becoming a parent: you're adding, of course, instead of subtracting, as it is when someone dies, and the tone of the occasion is obviously different, but the principle is the same. Birth is an inciting incident, a point of no return, that changes one's circumstances forever. The second that beautiful baby onto whom you have projected all your hopes and dreams comes out of your body, you will never again do anything for yourself. It changes you suddenly and entirely. Birth and death are the same in that way.
Stephanie Wittels Wachs (Everything is Horrible and Wonderful: A Tragicomic Memoir of Genius, Heroin, Love and Loss)
If you had asked Dan during that period whether he still loved his wife, he would have looked at you in total confusion and said, “Of course!” Although his wife was at that very moment wallowing in despair over his treatment of her, he perceived things to be fine between them. This isn’t because he is dense; it’s just that after a lifetime of having people mad at or disappointed with him, Dan weathers periods of anger and criticism by mostly ignoring them. And, because people with ADHD don’t receive and process information in a hierarchical way, Maria’s suffering enters his mind at about the same level as everything else he perceives—the lights on the radio clock, the dog barking, the computer, the worrisome project he has at work. “But wait!” you say. “It doesn’t matter—she’s still alone!” You would be right. Regardless of whether Dan was intentionally ignoring his wife or just distracted, actions speak louder than words. She becomes lonely and unhappy, and her needs must be addressed. But recognizing and then identifying the correct underlying problem is critical to finding the right solution. In marriage, just like in middle school math, if you pick the wrong problem to solve, you generally don’t end up with a satisfactory result. Furthermore, the hurt caused by the incorrect interpretation that he no longer loves her elicits a series of bad feelings and behaviors that compound the problem. This is the critical dynamic of symptom–response–response at work.
Melissa Orlov (The ADHD Effect on Marriage: Understand and Rebuild Your Relationship in Six Steps)
Hey Pete. So why the leave from social media? You are an activist, right? It seems like this decision is counterproductive to your message and work." A: The short answer is I’m tired of the endless narcissism inherent to the medium. In the commercial society we have, coupled with the consequential sense of insecurity people feel, as they impulsively “package themselves” for public consumption, the expression most dominant in all of this - is vanity. And I find that disheartening, annoying and dangerous. It is a form of cultural violence in many respects. However, please note the difference - that I work to promote just that – a message/idea – not myself… and I honestly loathe people who today just promote themselves for the sake of themselves. A sea of humans who have been conditioned into viewing who they are – as how they are seen online. Think about that for a moment. Social identity theory run amok. People have been conditioned to think “they are” how “others see them”. We live in an increasing fictional reality where people are now not only people – they are digital symbols. And those symbols become more important as a matter of “marketing” than people’s true personality. Now, one could argue that social perception has always had a communicative symbolism, even before the computer age. But nooooooothing like today. Social media has become a social prison and a strong means of social control, in fact. Beyond that, as most know, social media is literally designed like a drug. And it acts like it as people get more and more addicted to being seen and addicted to molding the way they want the world to view them – no matter how false the image (If there is any word that defines people’s behavior here – it is pretension).
Dopamine fires upon recognition and, coupled with cell phone culture, we now have a sea of people in zombie like trances looking at their phones (literally) thousands of times a day, merging their direct, true interpersonal social reality with a virtual “social media” one. No one can read anymore... they just swipe a stream of 200 character headlines/posts/tweets, understanding the world as an aggregate of those fragmented sentences. Massive loss of comprehension happening, replaced by usually agreeable, "in-bubble" views - hence an actual loss of variety. So again, this isn’t to say non-commercial focused social media doesn’t have positive purposes, such as with activism at times. But, on the whole, it merely amplifies a general value system disorder of a “LOOK AT ME! LOOK AT HOW GREAT I AM!” – rooted in systemic insecurity. People lying to themselves, drawing meaningless satisfaction from superficial responses from a sea of avatars. And it’s no surprise. Market economics demands people self promote shamelessly, coupled with the arbitrary constructs of beauty and success that have also resulted. People see status in certain things and, directly or pathologically, use those things for their own narcissistic advantage. Think of those endless status pics of people rock climbing, or hanging out on a stunning beach or showing off their new trophy girlfriend, etc. It goes on and on and, worse, the general public generally likes it, seeking to imitate those images/symbols to amplify their own false status. Hence the endless feedback loop of superficiality. And people wonder why youth suicides have risen… a young woman looking at a model of perfection set by her peers, without proper knowledge of the medium, can be made to feel inferior far more dramatically than the typical body image problems associated with traditional advertising. That is just one example of the cultural violence inherent.
The entire industry of social media is BASED on narcissistic status promotion and narrow self-interest. That is the emotion/intent that creates the billions and billions in revenue these platforms experience, as they in turn sell off people’s personal data to advertisers and governments. You are the product, of course.
Peter Joseph
In theory, if some holy book misrepresented reality, its disciples would sooner or later discover this, and the text’s authority would be undermined. Abraham Lincoln said you cannot deceive everybody all the time. Well, that’s wishful thinking. In practice, the power of human cooperation networks depends on a delicate balance between truth and fiction. If you distort reality too much, it will weaken you, and you will not be able to compete against more clear-sighted rivals. On the other hand, you cannot organise masses of people effectively without relying on some fictional myths. So if you stick to unalloyed reality, without mixing any fiction with it, few people will follow you. If you used a time machine to send a modern scientist to ancient Egypt, she would not be able to seize power by exposing the fictions of the local priests and lecturing the peasants on evolution, relativity and quantum physics. Of course, if our scientist could use her knowledge in order to produce a few rifles and artillery pieces, she could gain a huge advantage over pharaoh and the crocodile god Sobek. Yet in order to mine iron ore, build blast furnaces and manufacture gunpowder the scientist would need a lot of hard-working peasants. Do you really think she could inspire them by explaining that energy divided by mass equals the speed of light squared? If you happen to think so, you are welcome to travel to present-day Afghanistan or Syria and try your luck. Really powerful human organisations – such as pharaonic Egypt, the European empires and the modern school system – are not necessarily clear-sighted. Much of their power rests on their ability to force their fictional beliefs on a submissive reality. That’s the whole idea of money, for example. The government makes worthless pieces of paper, declares them to be valuable and then uses them to compute the value of everything else. 
The government has the power to force citizens to pay taxes using these pieces of paper, so the citizens have no choice but to get their hands on at least some of them. Consequently, these bills really do become valuable, the government officials are vindicated in their beliefs, and since the government controls the issuing of paper money, its power grows. If somebody protests that ‘These are just worthless pieces of paper!’ and behaves as if they are only pieces of paper, he won’t get very far in life.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
A very important function of the nervous system, and, as we have said, a function equally in demand for computing machines, is that of memory, the ability to preserve the results of past operations for use in the future. It will be seen that the uses of the memory are highly various, and it is improbable that any single mechanism can satisfy the demands of all of them. There is first the memory which is necessary for the carrying out of a current process, such as a multiplication, in which the intermediate results are of no value once the process is completed, and in which the operating apparatus should then be released for further use. Such a memory should record quickly, be read quickly, and be erased quickly. On the other hand, there is the memory which is intended to be part of the files, the permanent record, of the machine or the brain, and to contribute to the basis of all its future behavior, at least during a single run of the machine. Let it be remarked parenthetically that an important difference between the way in which we use the brain and the machine is that the machine is intended for many successive runs, either with no reference to each other, or with a minimal, limited reference, and that it can be cleared between such runs; while the brain, in the course of nature, never even approximately clears out its past records. Thus the brain, under normal circumstances, is not the complete analogue of the computing machine but rather the analogue of a single run on such a machine. We shall see later that this remark has a deep significance in psychopathology and in psychiatry.
Norbert Wiener (Cybernetics: or the Control and Communication in the Animal and the Machine)
There has been an enduring misunderstanding that needs to be cleared up. Turing’s core message was never “If a machine can imitate a man, the machine must be intelligent.” Rather, it was “Inability to imitate does not rule out intelligence.” In his classic essay on the Turing test, Turing encouraged his readers to take a broader perspective on intelligence and conceive of it more universally and indeed more ethically. He was concerned with the possibility of unusual forms of intelligence, our inability to recognize those intelligences, and the limitations of the concept of indistinguishability as a standard for defining what is intelligence and what is not. In section two of the paper, Turing asks directly whether imitation should be the standard of intelligence. He considers whether a man can imitate a machine rather than vice versa. Of course the answer is no, especially in matters of arithmetic, yet obviously a man thinks and can think computationally (in terms of chess problems, for example). We are warned that imitation cannot be the fundamental standard or marker of intelligence. Reflecting on Turing’s life can change one’s perspective on what the Turing test really means. Turing was gay. He was persecuted for this difference in a manner that included chemical castration and led to his suicide. In the mainstream British society of that time, he proved unable to consistently “pass” for straight. Interestingly, the second paragraph of Turing’s famous paper starts with the question of whether a male or female can pass for a member of the other gender in a typed conversation. The notion of “passing” was of direct personal concern to Turing and in more personal settings Turing probably did not view “passing” as synonymous with actually being a particular way.
Tyler Cowen (Average Is Over: Powering America Beyond the Age of the Great Stagnation)
It’s not me telling you,” she said. “It’s neuroscience that would say that our capacity to multitask is virtually nonexistent. Multitasking is a computer-derived term. We have one processor. We can’t do it.” “I think that when I’m sitting at my desk feverishly doing seventeen things at once that I’m being clever and efficient, but you’re saying I’m actually wasting my time?” “Yes, because when you’re moving from this project to this project, your mind flits back to the original project, and it can’t pick it up where it left off. So it has to take a few steps back and then ramp up again, and that’s where the productivity loss is.” This problem was, of course, exacerbated in the age of what had been dubbed the “info-blitzkrieg,” where it took superhuman strength to ignore the siren call of the latest tweet, or the blinking red light on the BlackBerry. Scientists had even come up with a term for this condition: “continuous partial attention.” It was a syndrome with which I was intimately familiar, even after all my meditating.
Dan Harris (10% Happier)
There are two moments in the course of education where a lot of kids fall off the math train. The first comes in the elementary grades, when fractions are introduced. Until that moment, a number is a natural number, one of the figures 0, 1, 2, 3 . . . It is the answer to a question of the form “how many.”* To go from this notion, so primitive that many animals are said to understand it, to the radically broader idea that a number can mean “what portion of,” is a drastic philosophical shift. (“God made the natural numbers,” the nineteenth-century algebraist Leopold Kronecker famously said, “and all the rest is the work of man.”) The second dangerous twist in the track is algebra. Why is it so hard? Because, until algebra shows up, you’re doing numerical computations in a straightforwardly algorithmic way. You dump some numbers into the addition box, or the multiplication box, or even, in traditionally minded schools, the long-division box, you turn the crank, and you report what comes out the other side. Algebra is different. It’s computation backward.
Jordan Ellenberg (How Not to Be Wrong: The Power of Mathematical Thinking)
[Hyun Song Shin] most accurately portrayed the state of the global economy. 'I'd like to tell you about the Millennium Bridge in London,' he began… 'The bridge was opened by the queen on a sunny day in June,' Shin continued. 'The press was there in force, and many thousands of people turned up to savor the occasion. However, within moments of the bridge's opening, it began to shake violently.' The day it opened, the Millennium Bridge was closed. The engineers were initially mystified about what had gone wrong. Of course it would be a problem if a platoon of soldiers marched in lockstep across the bridge, creating sufficiently powerful vertical vibration to produce a swaying effect. The nearby Albert Bridge, built more than a century earlier, even features a sign directing marching soldiers to break step rather than stay together when crossing. But that's not what happened at the Millennium Bridge. 'What is the probability that a thousand people walking at random will end up walking exactly in step, and remain in lockstep thereafter?' Shin asked. 'It is tempting to say, "Close to zero."' But that's exactly what happened. The bridge's designers had failed to account for how people react to their environment. When the bridge moved slightly under the feet of those opening-day pedestrians, each individual naturally adjusted his or her stance for balance, just a little bit—but at the same time and in the same direction as every other individual. That created enough lateral force to turn a slight movement into a significant one. 'In other words,' said Shin, 'the wobble of the bridge feeds on itself. The wobble will continue and get stronger even though the initial shock—say, a small gust of wind—had long passed… Stress testing on the computer that looks only at storms, earthquakes, and heavy loads on the bridge would regard the events on the opening day as a 'perfect storm.' But this is a perfect storm that is guaranteed to come every day.'
In financial markets, as on the Millennium Bridge, each individual player—every bank and hedge fund and individual investor—reacts to what is happening around him or her in concert with other individuals. When the ground shifts under the world's investors, they all shift their stance. And when they all shift their stance in the same direction at the same time, it just reinforces the initial movement. Suddenly, the whole system is wobbling violently. Ben Bernanke, Mervyn King, Jean-Claude Trichet, and the other men and women at Jackson Hole listened politely and then went to their coffee break.
Neil Irwin (The Alchemists: Three Central Bankers and a World on Fire)
The lumbering bagos and topheavy four-wheelers form a moving slalom course for Hiro on his black motorcycle. All these beefy Caucasians with guns! Get enough of them together, looking for the America they always believed they'd grow up in, and they glom together like overcooked rice, form integral, starchy little units. With their power tools, portable generators, weapons, four-wheel-drive vehicles, and personal computers, they are like beavers hyped up on crystal meth, manic engineers without a blueprint, chewing through the wilderness, building things and abandoning them, altering the flow of mighty rivers and then moving on because the place ain't what it used to be. The byproduct of the lifestyle is polluted rivers, greenhouse effect, spouse abuse, televangelists, and serial killers. But as long as you have that four-wheel-drive vehicle and can keep driving north, you can sustain it, keep moving just quickly enough to stay one step ahead of your own waste stream. In twenty years, ten million white people will converge on the north pole and park their bagos there. The low-grade waste heat of their thermodynamically intense lifestyle will turn the crystalline icescape pliable and treacherous. It will melt a hole through the polar icecap, and all that metal will sink to the bottom, sucking the biomass down with it.
Neal Stephenson (Snow Crash)
More than anything, we have lost the cultural customs and traditions that bring extended families together, linking adults and children in caring relationships, that give the adult friends of parents a place in their children's lives. It is the role of culture to cultivate connections between the dependent and the dependable and to prevent attachment voids from occurring. Among the many reasons that culture is failing us, two bear mentioning. The first is the jarringly rapid rate of change in twentieth-century industrial societies. It requires time to develop customs and traditions that serve attachment needs, hundreds of years to create a working culture that serves a particular social and geographical environment. Our society has been changing much too rapidly for culture to evolve accordingly. There is now more change in a decade than previously in a century. When circumstances change more quickly than our culture can adapt to, customs and traditions disintegrate. It is not surprising that today's culture is failing its traditional function of supporting adult-child attachments. Part of the rapid change has been the electronic transmission of culture, allowing commercially blended and packaged culture to be broadcast into our homes and into the very minds of our children. Instant culture has replaced what used to be passed down through custom and tradition and from one generation to another. “Almost every day I find myself fighting the bubble-gum culture my children are exposed to,” said a frustrated father interviewed for this book. Not only is the content often alien to the culture of the parents but the process of transmission has taken grandparents out of the loop and made them seem sadly out of touch. Games, too, have become electronic. They have always been an instrument of culture to connect people to people, especially children to adults. 
Now games have become a solitary activity, watched in parallel on television sportscasts or engaged in, in isolation, on the computer. The most significant change in recent times has been the technology of communication — first the phone and then the Internet through e-mail and instant messaging. We are enamored of communication technology without being aware that one of its primary functions is to facilitate attachments. We have unwittingly put it into the hands of children who, of course, are using it to connect with their peers. Because of their strong attachment needs, the contact is highly addictive, often becoming a major preoccupation. Our culture has not been able to evolve the customs and traditions to contain this development, and so again we are all left to our own devices. This wonderful new technology would be a powerfully positive instrument if used to facilitate child-adult connections — as it does, for example, when it enables easy communication between students living away from home, and their parents. Left unchecked, it promotes peer orientation.
Gabor Maté (Hold On to Your Kids: Why Parents Need to Matter More Than Peers)
Military analysis is not an exact science. To return to the wisdom of Sun Tzu, and paraphrase the great Chinese political philosopher, it is at least as close to art. But many logical methods offer insight into military problems - even if solutions to those problems ultimately require the use of judgement and of broader political and strategic considerations as well. Military affairs may not be as amenable to quantification and formal methodological treatment as economics, for example. However, even if our main goal in analysis is generally to illuminate choices, bound problems, and rule out bad options - rather than arrive unambiguously at clear policy choices - the discipline of military analysis has a great deal to offer. Moreover, simple back-of-the-envelope methodologies often provide substantial insight without requiring the churning of giant computer models or access to the classified data of official Pentagon studies, allowing generalists and outsiders to play important roles in defense analytical debates. We have seen all too often (in the broad course of history as well as in modern times) what happens when we make key defense policy decisions based solely on instinct, ideology, and impression. To avoid cavalier, careless, and agenda-driven decision-making, we therefore need to study the science of war as well - even as we also remember the cautions of Clausewitz and avoid hubris in our predictions about how any war or other major military endeavor will ultimately unfold.
Michael O'Hanlon
The men in grey were powerless to meet this challenge head-on. Unable to detach the children from Momo by bringing them under their direct control, they had to find some roundabout means of achieving the same end, and for this they enlisted the children's elders. Not all grown-ups made suitable accomplices, of course, but plenty did. [....] 'Something must be done,' they said. 'More and more kids are being left on their own and neglected. You can't blame us - parents just don't have the time these days - so it's up to the authorities.' Others joined in the chorus. 'We can't have all these youngsters loafing around,' declared some. 'They obstruct the traffic. Road accidents caused by children are on the increase, and road accidents cost money that could be put to better use.' 'Unsupervised children run wild,' declared others. 'They become morally depraved and take to crime. The authorities must take steps to round them up. They must build centers where the youngsters can be molded into useful and efficient members of society.' 'Children,' declared still others, 'are the raw material for the future. A world dependent on computers and nuclear energy will need an army of experts and technicians to run it. Far from preparing children for tomorrow's world, we still allow too many of them to squander years of their precious time on childish tomfoolery. It's a blot on our civilization and a crime against future generations.' The timesavers were all in favor of such a policy, naturally, and there were so many of them in the city by this time that they soon convinced the authorities of the need to take prompt action. Before long, big buildings known as 'child depots' sprang up in every neighborhood. Children whose parents were too busy to look after them had to be deposited there and could be collected when convenient. They were strictly forbidden to play in the streets or parks or anywhere else.
Any child caught doing so was immediately carted off to the nearest depot, and its parents were heavily fined. None of Momo's friends escaped the new regulation. They were split up according to the districts they came from and consigned to various child depots. Once there, they were naturally forbidden to play games of their own devising. All games were selected for them by supervisors and had to have some useful, educational purpose. The children learned these new games but unlearned something else in the process: they forgot how to be happy, how to take pleasure in the little things, and last but not least, how to dream. Weeks passed, and the children began to look like timesavers in miniature. Sullen, bored and resentful, they did as they were told. Even when left to their own devices, they no longer knew what to do with themselves. All they could still do was make a noise, but it was an angry, ill-tempered noise, not the happy hullabaloo of former times. The men in grey made no direct approach to them - there was no need. The net they had woven over the city was so close-meshed as to seem impenetrable. Not even the brightest and most ingenious children managed to slip through its toils. The amphitheater remained silent and deserted.
Michael Ende, Momo
In the absence of expert [senior military] advice, we have seen each successive administration fail in the business of strategy - yielding a United States twice as rich as the Soviet Union but much less strong. Only the manner of the failure has changed. In the 1960s, under Robert S. McNamara, we witnessed the wholesale substitution of civilian mathematical analysis for military expertise. The new breed of "systems analysts" introduced new standards of intellectual discipline and greatly improved bookkeeping methods, but also a trained incapacity to understand the most important aspects of military power, which happen to be nonmeasurable. Because morale is nonmeasurable it was ignored, in large and small ways, with disastrous effects. We have seen how the pursuit of business-type efficiency in the placement of each soldier destroys the cohesion that makes fighting units effective; we may recall how the Pueblo was left virtually disarmed when it encountered the North Koreans (strong armament was judged as not "cost effective" for ships of that kind). Because tactics, the operational art of war, and strategy itself are not reducible to precise numbers, money was allocated to forces and single weapons according to "firepower" scores, computer simulations, and mathematical studies - all of which maximize efficiency - but often at the expense of combat effectiveness. An even greater defect of the McNamara approach to military decisions was its businesslike "linear" logic, which is right for commerce or engineering but almost always fails in the realm of strategy. Because its essence is the clash of antagonistic and outmaneuvering wills, strategy usually proceeds by paradox rather than conventional "linear" logic. 
That much is clear even from the most shopworn of Latin tags: si vis pacem, para bellum (if you want peace, prepare for war), whose business equivalent would be orders of "if you want sales, add to your purchasing staff," or some other, equally absurd advice. Where paradox rules, straightforward linear logic is self-defeating, sometimes quite literally. Let a general choose the best path for his advance, the shortest and best-roaded, and it then becomes the worst path of all paths, because the enemy will await him there in greatest strength... Linear logic is all very well in commerce and engineering, where there is lively opposition, to be sure, but no open-ended scope for maneuver; a competitor beaten in the marketplace will not bomb our factory instead, and the river duly bridged will not deliberately carve out a new course. But such reactions are merely normal in strategy. Military men are not trained in paradoxical thinking, but they do not have to be. Unlike the business-school expert, who searches for optimal solutions in the abstract and then presents them with all the authority of charts and computer printouts, even the most ordinary military mind can recall the existence of a maneuvering antagonist now and then, and will therefore seek robust solutions rather than "best" solutions - those, in other words, which are not optimal but can remain adequate even when the enemy reacts to outmaneuver the first approach.
Edward N. Luttwak
If it was a video-file that I was trying to watch, then at the bottom of the screen there’d be that line, that bar that slowly fills itself in—twice: once in bold red and, at the same time, running ahead of that, in fainter grey; the fainter section, of course, has to remain in advance of the bold section, and of the cursor showing which part of the video you’re actually watching at a given moment; if the cursor and red section catch up, then buffering sets in again. Staring at this bar, losing myself in it just as with the circle, I was granted a small revelation: it dawned on me that what I was actually watching was nothing less than the skeleton, laid bare, of time or memory itself. Not our computers’ time and memory, but our own. This was its structure. We require experience to stay ahead, if only by a nose, of our consciousness of experience—if for no other reason than that the latter needs to make sense of the former, to (as Peyman would say) narrate it both to others and ourselves, and, for this purpose, has to be fed with a constant, unsorted supply of fresh sensations and events. But when the narrating cursor catches right up with the rendering one, when occurrences and situations don’t replenish themselves quickly enough for the awareness they sustain, when, no matter how fast they regenerate, they’re instantly devoured by a mouth too voracious to let anything gather or accrue unconsumed before it, then we find ourselves jammed, stuck in limbo: we can enjoy neither experience nor consciousness of it. Everything becomes buffering, and buffering becomes everything. The revelation pleased me. I decided I would start a dossier on buffering.
Tom McCarthy (Satin Island)
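The mechanism McCarthy is narrating is real streaming behavior: a buffered ("rendered") position must stay ahead of the playback cursor, and playback stalls whenever the cursor catches up. A minimal, purely illustrative Python sketch of that condition (the function, names, and rates here are invented for the example, not taken from the quote):

```python
def step(playback, rendered, play_rate, download_rate):
    """Advance one tick; return new positions and whether playback is buffering."""
    rendered += download_rate              # the grey bar creeps forward
    if playback + play_rate <= rendered:
        playback += play_rate              # enough is rendered: keep playing
        buffering = False
    else:
        buffering = True                   # the cursor caught up: stall and wait
    return playback, rendered, buffering

playback, rendered = 0.0, 0.0
states = []
for tick in range(10):
    # a slow network (0.5 units/tick) against playback demand (1.0 unit/tick)
    playback, rendered, buffering = step(playback, rendered, 1.0, 0.5)
    states.append(buffering)

print(states)  # playback stalls on alternating ticks as the cursor collides with the bar
```

When the download rate drops below the playback rate, as above, "everything becomes buffering": the stall is not an accident but the steady state of the system.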
Gadgetry will continue to relieve mankind of tedious jobs. Kitchen units will be devised that will prepare ‘automeals,’ heating water and converting it to coffee; toasting bread; frying, poaching or scrambling eggs, grilling bacon, and so on. Breakfasts will be ‘ordered’ the night before to be ready by a specified hour the next morning. Communications will become sight-sound and you will see as well as hear the person you telephone. The screen can be used not only to see the people you call but also for studying documents and photographs and reading passages from books. Synchronous satellites, hovering in space, will make it possible for you to direct-dial any spot on earth, including the weather stations in Antarctica. [M]en will continue to withdraw from nature in order to create an environment that will suit them better. By 2014, electroluminescent panels will be in common use. Ceilings and walls will glow softly, and in a variety of colors that will change at the touch of a push button. Robots will neither be common nor very good in 2014, but they will be in existence. The appliances of 2014 will have no electric cords, of course, for they will be powered by long-lived batteries running on radioisotopes. [H]ighways … in the more advanced sections of the world will have passed their peak in 2014; there will be increasing emphasis on transportation that makes the least possible contact with the surface. There will be aircraft, of course, but even ground travel will increasingly take to the air a foot or two off the ground. [V]ehicles with ‘Robot-brains’ … can be set for particular destinations … that will then proceed there without interference by the slow reflexes of a human driver. [W]all screens will have replaced the ordinary set; but transparent cubes will be making their appearance in which three-dimensional viewing will be possible. [T]he world population will be 6,500,000,000 and the population of the United States will be 350,000,000. 
All earth will be a single choked Manhattan by A.D. 2450 and society will collapse long before that! There will, therefore, be a worldwide propaganda drive in favor of birth control by rational and humane methods and, by 2014, it will undoubtedly have taken serious effect. Ordinary agriculture will keep up with great difficulty and there will be ‘farms’ turning to the more efficient micro-organisms. Processed yeast and algae products will be available in a variety of flavors. The world of A.D. 2014 will have few routine jobs that cannot be done better by some machine than by any human being. Mankind will therefore have become largely a race of machine tenders. Schools will have to be oriented in this direction…. All the high-school students will be taught the fundamentals of computer technology, will become proficient in binary arithmetic and will be trained to perfection in the use of the computer languages that will have developed out of those like the contemporary ‘Fortran.’ [M]ankind will suffer badly from the disease of boredom, a disease spreading more widely each year and growing in intensity. This will have serious mental, emotional and sociological consequences, and I dare say that psychiatry will be far and away the most important medical specialty in 2014. [T]he most glorious single word in the vocabulary will have become work! in our society of enforced leisure.
Isaac Asimov
Best hacking books Cell phone hacking books Cell phone hacking sites Certified ethical hacking Computer hacking Computer hacking 101 Computer hacking books Computer hacking device Computer hacking equipment Computer hacking for dummies Computer hacking forensic investigator Computer hacking forensic investigator certification Computer hacking laws Computer hacking programs Computer hacking software Computer hacking tools Ethical hacking Ethical hacking and countermeasures Ethical hacking and countermeasures 2010 Ethical hacking and countermeasures attack phases Ethical hacking and countermeasures Linux Macintosh and mobile systems Ethical hacking and countermeasures secure network infrastructures Ethical hacking and countermeasures threats and defense mechanisms Ethical hacking and countermeasures web applications and data servers Ethical hacking and network defense Ethical hacking and pentesting Ethical hacking and pentesting guide Ethical hacking books Ethical hacking certification Ethical hacking course Ethical hacking kindle Ethical hacking tools Facebook hacking sites Facebook hacking software Facebook hacking tools Free computer hacking software Free Facebook hacking sites Free hacking software Hacking Hacking electronics Hacking electronics stuff Hacking electronics torrent Hacking electronics video Hacking exposed Hacking exposed 7 Hacking exposed 8 Hacking exposed book Hacking exposed computer forensics Hacking exposed Linux Hacking exposed mobile Hacking exposed network security secrets & solutions Hacking exposed PDF Hacking exposed windows Hacking exposed wireless Hacking sites Hacking software Hacking software computer Hacking software for iPhone Hacking tools Hacking tools and techniques Hacking your education torrent How to hacking sites Online Facebook hacking sites Password hacking software What is ethical hacking?
Matt Robbins (Hacking: Perfect Hacking for Beginners: Essentials You Must Know [Version 2.0] (hacking, how to hack, hacking exposed, hacking system, hacking 101, beg ... to hacking, Hacking, hacking for dummies))
The sponge or active charcoal inside a filter is three-dimensional. Their adsorbent surfaces, however, are two-dimensional. Thus, you can see how a tiny high-dimensional structure can contain a huge low-dimensional structure. But at the macroscopic level, this is about the limit of the ability for high-dimensional space to contain low-dimensional space. Because God was stingy, during the big bang He only provided the macroscopic world with three spatial dimensions, plus the dimension of time. But this doesn’t mean that higher dimensions don’t exist. Up to seven additional dimensions are locked within the micro scale, or, more precisely, within the quantum realm. And added to the four dimensions at the macro scale, fundamental particles exist within an eleven-dimensional space-time.” “So what?” “I just want to point out this fact: In the universe, an important mark of a civilization’s technological advancement is its ability to control and make use of micro dimensions. Making use of fundamental particles without taking advantage of the micro dimensions is something that our naked, hairy ancestors already began back when they lit bonfires within caves. Controlling chemical reactions is just manipulating micro particles without regard to the micro dimensions. Of course, this control also progressed from crude to advanced: from bonfires to steam engines, and then generators. Now, the ability for humans to manipulate micro particles at the macro level has reached a peak: We have computers and nanomaterials. But all of that is accomplished without unlocking the many micro dimensions. From the perspective of a more advanced civilization in the universe, bonfires and computers and nanomaterials are not fundamentally different. They all belong to the same level. That’s also why they still think of humans as mere bugs. Unfortunately, I think they’re right.
Liu Cixin (The Three-Body Problem (Remembrance of Earth’s Past, #1))
You act like a normal human and you’ll win an Oscar,” Marco said. He led the way up to his house and opened the door. “Okay, look, you wait right there by that table. Don’t go anywhere. If my dad comes in and talks to you, just say ‘yes’ and ‘no.’ Got it? Yes and no answers only. I’ll run up to my room. I’m gonna call one of the others to meet us at the bookstore. You’re already driving me nuts.” I stood by the table. There was a primitive computer on the table. It even had a solid, two-dimensional screen. And a keyboard! An actual keyboard. I touched the keyboard. It was amazing. Andalite computers once had keyboards, too. Although ours were very different. And it had been centuries since we’d used them. On the screen of the computer was a game. The object of the game was to spot the errors in a primitive symbolic language and correct them. Of course, before I could play I had to make sense of the system. But that was simple enough. Once I understood the system, it was easy to spot the errors. I quickly rewrote it to make sense out of it. I said to myself. “Hello?” I turned around. It was an older human. He was paler than Marco, but other features were similar. Marco had warned me to say nothing to his father but “yes” and “no.” “No,” I said to Marco’s father. “I’m Marco’s dad. Are you a friend of his?” “Yes.” “What’s your name?” “No,” I answered. “Your name is ‘No’?” “Yes.” “That’s an unusual name, isn’t it?” “No.” “It’s not?” “Yes.” “Yes, it’s not an unusual name?” “No.” “Now I’m totally confused.” “Yes.” Marco’s father stared at me. Then, in a loud voice, he yelled, “Hey, Marco? Marco? Would you . . . um . . . your friend is here. Your friend ‘No’ is here.” “No,” I said. “Yes, that’s what I said.” Marco came running down the stairs. “Whoa!” he cried. “Um, Dad! You met my friend?” “No?” Marco’s father said. “What?” Marco asked. Marco’s father shook his head. “I must be getting old. I don’t understand you kids.” “Yes,” I offered.
Katherine Applegate (The Alien (Animorphs, #8))