Computer Movie Quotes

We've searched our database for all the quotes and captions related to Computer Movie. Here they are! All 100 of them:

“Taylor’s dating Scott Casey?” He began to laugh. He held up one hand, clutching his side with the other. “Wait, wait.” He gasped for breath. “This really is too good. I gotta write this down to use one day.” Jeremy turned to his computer, reading out loud as he typed. “ ‘And then the evil, arrogant movie star learned that lying does not pay.
Julie James (Just the Sexiest Man Alive)
When a writer looked at an empty computer screen, what did she see? Tristan wondered. A movie screen ready to be lit with faces? A night sky with one small star blinking at the top, a universe ready to be written on? Endless possibilities. Love's endless twists and turns - and all love's impossibilities.
Elizabeth Chandler (The Power of Love (Kissed by an Angel, #2))
Never presume to know a person based on the one dimensional window of the internet. A soul can’t be defined by critics, enemies or broken ties with family or friends. Neither can it be explained by posts or blogs that lack facial expressions, tone or insight into the person’s personality and intent. Until people “get that”, we will forever be a society that thinks Beautiful Mind was a spy movie and every stranger is really a friend on Facebook.
Shannon L. Alder
You know what I noticed when I was with Jacob? In your world, people can reach each other in an instant. There's the telephone, and the fax - and on the computer you can talk to someone all the way around the world. You've got people telling their secrets on TV talk shows, and magazines that publish pictures of movie stars trying to hide their homes. All those connections, but everyone there seems so lonely.
Jodi Picoult (Plain Truth)
Considering that we live in an era of evolutionary everything—evolutionary biology, evolutionary medicine, evolutionary ecology, evolutionary psychology, evolutionary economics, evolutionary computing—it was surprising how rarely people thought in evolutionary terms. It was a human blind spot. We look at the world around us as a snapshot when it was really a movie, constantly changing.
Michael Crichton (Prey)
I remember when I first came around, the computer-generated stuff was pretty wicked. I was like, 'Wow!' but I feel like then for the longest time, we saw so much of it, after a while, you might as well just be watching an animated movie.
Paul Walker
I think the reason novels are regarded to have so much more 'information' than films is that they outsource the scenic design and cinematography to the reader... This, for me, is a powerful argument for the value and potency of literature specifically. Movies don't demand as much from the player. Most people know this; at the end of the day you can be too beat to read but not yet too beat to watch television or listen to music.
Brian Christian (The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive)
In 1970, Americans spent about $6 billion on fast food; in 2000, they spent more than $110 billion. Americans now spend more money on fast food than on higher education, personal computers, computer software, or new cars. They spend more on fast food than on movies, books, magazines, newspapers, videos, and recorded music—combined.
Eric Schlosser (Fast Food Nation: The Dark Side of the All-American Meal)
It does not matter. You train your soldiers to kill using video games. They blow enough people up on their computer and it becomes easier for them to kill with a real weapon. Why do you think your government funds so many war and terrorism movies? Hollywood does your dirty work for you. Had 9/11 happened twenty years earlier, the country would have been in chaos, but people have seen enough bad things on their television screen to prepare them for just about anything. We do not really need to talk about government conspiracies.
Sylvain Neuvel (Sleeping Giants (Themis Files, #1))
I noticed how utterly indifferent the passengers were to what they were doing, namely, flying through the air. A glance out of the window would have revealed furrowed fields of cloud stained smoke-blue and violet as night and morning changed shifts – but how were they passing time in First, Business and Coach? Crosswords. In-flight movies. Computer games. E-mail. Creation sprawls like a dewed and willing maiden outside your window awaiting only the lechery of your senses – and what do you do? Complain about the dwarf cutlery. Plug your ears. Blind your eyes. Discuss Julia Roberts’s hair. Ah, me. Sometimes I think my work is done.
Glen Duncan (I, Lucifer)
As in the movie The Matrix, we might one day be able to download memories and skills using computers.
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)
It was funny how bombs and wars looked so thrilling in movies and computer games, when the reality was so heart-stoppingly terrifying.
Sophie McKenzie (Split Second (Split Second #1))
Many of my all-time favorite movies are almost entirely verbal. The entire plot of My Dinner with Andre is “Wallace Shawn and Andre Gregory eat dinner.” The entire plot of Before Sunrise is “Ethan Hawke and Julie Delpy walk around Vienna.” But the dialogue takes us everywhere, and as Roger Ebert notes, of My Dinner with Andre, these films may be paradoxically among the most visually stimulating in the history of the cinema:
Brian Christian (The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive)
But I did wake up, and the main computer showed the slight rise in CO2 I had predicted. Looks like I’ll live another sol. Live Another Sol would be an awesome name for a James Bond movie.
Andy Weir (The Martian)
The final form of leverage is brand new—the most democratic form. It is: “products with no marginal cost of replication.” This includes books, media, movies, and code. Code is probably the most powerful form of permissionless leverage. All you need is a computer—you don’t need anyone’s permission. [1]
Eric Jorgenson (The Almanack of Naval Ravikant: A Guide to Wealth and Happiness)
My story is about all I got to my name right now, and that's why I feel robbed. But a story's a whole lot more than most people got. All you people watching out there, you're listening to what I say because I have something you don't: I got plot. Bought and paid for. That's what all you people want, and why you're sucking off me. You want my plot. I know how you feel too, since hey, I used to feel the same way. TV and video games and movies and computer screens... On April 8th, 1999, I jumped into the screen, I switched to watchee. Ever since, I've known what my life is about. I give good story. It may have been kinda gory, but admit it, you all loved it. You ate it up. Nuts, I ought to be on some government payroll. Without people like me, the whole country would jump off a bridge, 'cause the only thing on TV is some housewife on Who Wants to Be a Millionaire? winning $64,000 for remembering the name of the president's dog.
Lionel Shriver (We Need to Talk About Kevin)
Scott goes to the computer and loads a chart that says something about global warming. Scott says, "See?" Judy says, "I don't think global warming is important, people shouldn't need to use global warming as an excuse to stop being wasteful." Scott says, "How can you not believe this?" Judy says, "There has been golf ball-sized hail storms and hurricanes for a long time, it didn't just start all of the sudden. In the movie Al Gore drives in an SUV." Scott leaves to have a cigarette. Cory says, "Al Gore owns his own farm." Judy stares at the TV. Judy thinks, "No one in this room cares about global warming, this is ridiculous, we are all smoking cigarettes and eating cheese, how can any one of us care about voting? No one in this room cares about anything.
Ellen Kennedy
You are sitting on a computer in the projector cabin of a unique cinema hall in which the screen is not made up of white cloth. Instead, there is a big transparent room full of white liquid. You click on a movie file on your computer, the projector starts throwing light on the room of white liquid, real characters start emerging from the white liquid. You get attached to the characters. You start feeling their pain and pleasures. That room of white liquid is Space-Time or Maya. You are a soul sitting on the computer. The movie file is Karma-Desires. If you don’t like the movie, you can change it and play a better movie.
Shunya
By contrast Hobie lived and wafted like some great sea mammal in his own mild atmosphere, the dark brown of tea stains and tobacco, where every clock in the house said something different and time didn’t actually correspond to the standard measure but instead meandered along at its own sedate tick-tock, obeying the pace of his antique-crowded backwater, far from the factory-built, epoxy-glued version of the world. Though he enjoyed going out to the movies, there was no television; he read old novels with marbled end papers; he didn’t own a cell phone; his computer, a prehistoric IBM, was the size of a suitcase and useless.
Donna Tartt (The Goldfinch)
movie 2001: A Space Odyssey.” Off to the side were dozens of keypunch machines—what passed in those days for computer terminals.
Malcolm Gladwell (Outliers: The Story of Success)
No sensible person would prefer a computer screen to a well printed page for reading text.
James Monaco (How To Read a Film: Movies, Media, and Beyond)
except one bit about a movie with werewolves and a woman bursting like a balloon is just special effects, that’s drawing on computers.
Emma Donoghue (Room)
The basic plot of almost all movies and novels about AI revolves around the magical moment when a computer or a robot gains consciousness.
Yuval Noah Harari (21 Lessons for the 21st Century)
Sliding Doors and Run Lola Run (1998)—These two movies, neither of which is technically science fiction, were released in the same year. We see the idea of timelines branching from a single point which lead to different outcomes. In the example of Sliding Doors, a separate timeline branches off of the first timeline and then exists in parallel for some time, overlapping the main timeline, before merging back in. In Run Lola Run, on the other hand, we see Lola trying to rescue her boyfriend Manni by rewinding what happened and making different choices multiple times. We see visually what running our Core Loop might look like in a real-world, high-stress situation.
Rizwan Virk (The Simulated Multiverse: An MIT Computer Scientist Explores Parallel Universes, The Simulation Hypothesis, Quantum Computing and the Mandela Effect)
Awkward. \ˈȯ-kwərd\. Adjective. A feeling of embarrassment, discomfort, or abnormality. If music is the universal language, then awkward is the universal feeling. Awkward works in mysterious ways. Sometimes it’s a handshake that was meant to be a high-five. Other times it’s telling the guy who works at the movie theater to enjoy the movie, too. Awkward comes in so many forms: meeting your girlfriend’s parents, getting socks as a birthday present, a friend request that turned out to be a computer virus, on and on and on.
Michael McCreary (Funny, You Don't Look Autistic: A Comedian's Guide to Life on the Spectrum)
[O]ur attitudes towards things like race or gender operate on two levels. First of all, we have our conscious attitudes. This is what we choose to believe. These are our stated values, which we use to direct our behavior deliberately . . . But the IAT [Implicit Association Test] measures something else. It measures our second level of attitude, our racial attitude on an unconscious level - the immediate, automatic associations that tumble out before we've even had time to think. We don't deliberately choose our unconscious attitudes. And . . . we may not even be aware of them. The giant computer that is our unconscious silently crunches all the data it can from the experiences we've had, the people we've met, the lessons we've learned, the books we've read, the movies we've seen, and so on, and it forms an opinion.
Malcolm Gladwell (Blink: The Power of Thinking Without Thinking)
Science-fiction movies generally assume that in order to match and surpass human intelligence, computers will have to develop consciousness. But real science tells a different story. There might be several alternative ways leading to super-intelligence, only some of which pass through the straits of consciousness.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Michelangelo said that all he did was see the statue inside the block of marble and carve away the excess stone until the statue was revealed. Likewise, an algorithm carves away the excess transistors in the computer until the intended function is revealed, whether it’s an airliner’s autopilot or a new Pixar movie.
Pedro Domingos (The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World)
Hollywood movies, however, have brainwashed us into thinking that we can defeat the alien invaders if they are a few decades or centuries ahead of us in technology. Hollywood assumes that we can win by using some primitive, clever trick. In Independence Day, all we have to do is inject a simple computer virus into their operating system to bring them to their knees, as if the aliens use Microsoft Windows.
Michio Kaku (The Future of Humanity: Terraforming Mars, Interstellar Travel, Immortality and Our Destiny Beyond Earth)
Research on emotion shows that positive emotions wear off quickly. Our emotional systems like newness. They like novelty. They like change. We adapt to positive life circumstances so that before too long, the new car, the new spouse, the new house—they don’t feel so new and exciting anymore. But gratitude makes us appreciate the value of something, and when we appreciate the value of something, we extract more benefits from it; we’re less likely to take it for granted. In effect, I think gratitude allows us to participate more in life. We notice the positives more, and that magnifies the pleasures you get from life. Instead of adapting to goodness, we celebrate goodness. We spend so much time watching things—movies, computer screens, sports—but with gratitude we become greater participants in our lives as opposed to spectators.
Brené Brown (Atlas of the Heart: Mapping Meaningful Connection and the Language of Human Experience)
We live in the era of the search engine. Gone is the era of finding things on your own. If you want to find something, you can use your computer or phone to easily google it. You can find popular restaurants, movies, novels, and fashion anywhere in the world with no challenge. Ours is now a life of passive acquisition. But the joy of finding is gone, as is the catharsis of going to great trouble in searching for something and finding it.
Hideo Kojima (The Creative Gene: How Books, Movies, and Music Inspired the Creator of Death Stranding and Metal Gear Solid)
Szabo reckoned that the future of libraries was a combination of a people’s university, a community hub, and an information base, happily partnered with the Internet rather than in competition with it. In practical terms, Szabo felt the library should begin offering classes and voter registration and literacy programs and story times and speaker series and homeless outreach and business services and computer access and movie rentals and e-book loans and a nice gift shop. Also, books.
Susan Orlean (The Library Book)
In the medium term, AI may automate our jobs, to bring both great prosperity and equality. Looking further ahead, there are no fundamental limits to what can be achieved. There is no physical law precluding particles from being organised in ways that perform even more advanced computations than the arrangements of particles in human brains. An explosive transition is possible, although it may play out differently than in the movies. As mathematician Irving Good realised in 1965, machines with superhuman intelligence could repeatedly improve their design even further, in what science-fiction writer Vernor Vinge called a technological singularity. One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders and potentially subduing us with weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.
Stephen Hawking
What did we talk about? I don't remember. We talked so hard and sat so still that I got cramps in my knee. We had too many cups of tea and then didn't want to leave the table to go to the bathroom because we didn't want to stop talking. You will think we talked of revolution but we didn't. Nor did we talk of our own souls. Nor of sewing. Nor of babies. Nor of departmental intrigue. It was political if by politics you mean the laboratory talk that characters in bad movies are perpetually trying to convey (unsuccessfully) when they Wrinkle Their Wee Brows and say (valiantly--dutifully--after all, they didn't write it) "But, Doctor, doesn't that violate Finagle's Constant?" I staggered to the bathroom, released floods of tea, and returned to the kitchen to talk. It was professional talk. It left me grey-faced and with such concentration that I began to develop a headache. We talked about Mary Ann Evans' loss of faith, about Emily Brontë's isolation, about Charlotte Brontë's blinding cloud, about the split in Virginia Woolf's head and the split in her economic condition. We talked about Lady Murasaki, who wrote in a form that no respectable man would touch, Hroswit, a little name whose plays "may perhaps amuse myself," Miss Austen, who had no more expression in society than a firescreen or a poker. They did not all write letters, write memoirs, or go on the stage. Sappho--only an ambiguous, somewhat disagreeable name. Corinna? The teacher of Pindar. Olive Schreiner, growing up on the veldt, wrote one book, married happily, and never wrote another. Kate Chopin wrote a scandalous book and never wrote another. (Jean has written nothing.) There was M-ry Sh-ll-y who wrote you know what and Ch-rl-tt- P-rk-ns G-lm-an, who wrote one superb horror study and lots of sludge (was it sludge?) and Ph-ll-s Wh--tl-y who was black and wrote eighteenth century odes (but it was the eighteenth century) and Mrs. -nn R-dcl-ff- S-thw-rth and Mrs. G--rg- Sh-ld-n and (Miss?) G--rg-tt- H-y-r and B-rb-r- C-rtl-nd and the legion of those, who writing, write not, like the dead Miss B--l-y of the poem who was seduced into bad practices (fudging her endings) and hanged herself in her garter. The sun was going down. I was blind and stiff. It's at this point that the computer (which has run amok and eaten Los Angeles) is defeated by some scientifically transcendent version of pulling the plug; the furniture stood around unknowing (though we had just pulled out the plug) and Lady, who got restless when people talked at such length because she couldn't understand it, stuck her head out from under the couch, looking for things to herd. We had talked for six hours, from one in the afternoon until seven; I had at that moment an impression of our act of creation so strong, so sharp, so extraordinarily vivid, that I could not believe all our talking hadn't led to something more tangible--mightn't you expect at least a little blue pyramid sitting in the middle of the floor?
Joanna Russ (On Strike Against God)
Wowbagger grunted. He watched the majesty of creation outside his window for a moment or two. “I think I’ll take a nap,” he said, and then added, “What network areas are we going to be passing through in the next few hours?” The computer beeped. “Cosmovid, Thinkpix and Home Brain Box,” it said, and beeped. “Any movies I haven’t seen thirty thousand times already?” “No.” “Uh.” “There’s Angst in Space. You’ve only seen that thirty-three thousand five hundred and seventeen times.” “Wake me for the second reel.
Douglas Adams (The Ultimate Hitchhiker's Guide to the Galaxy (Hitchhiker's Guide to the Galaxy #1-5))
I hate computers. My hatred is entrenched, and I nourish it daily. I’m comfortable with it, and no community outreach program will change my mind. I hate computers for getting their own section in the New York Times and for lengthening commercials with the mention of a Web site address. Who really wants to find out more about Procter & Gamble? Just buy the toothpaste or laundry detergent, and get on with it. I hate them for creating the word org and I hate them for e-mail, which isn’t real mail but a variation of the pointless notes people used to pass in class. I hate computers for replacing the card catalog in the New York Public Library and I hate the way they’ve invaded the movies. I’m not talking about their contribution to the world of special effects. I have nothing against a well-defined mutant or full-scale alien invasion — that’s good technology. I’m talking about their actual presence in any given movie. They’ve become like horses in a western — they may not be the main focus, but everybody seems to have one.
David Sedaris (Me Talk Pretty One Day)
To love is to lose, Sam. Unfortunately, it’s just that simple. Maybe not today but someday. Maybe not when she’s too young and you’re too young, but you see that being old doesn’t help. Maybe not your wife or your girlfriend or your mother, but you see that friends die, too. I could not spare you this any more than I could spare you puberty. It is the inevitable condition of humanity. It is exacerbated by loving but also simply by leaving your front door, by seeing what’s out there in the world, by inventing computer programs that help people. You are afraid of time, Sam. Some sadness has no remedy. Some sadness you can’t make better.” “So what the hell do I do?” “Be sad.” “For how long?” “Forever.” “But then why isn’t everyone walking around miserable all the time?” “Because ice cream still tastes good. And sunny and seventy-five is still a lovely day. And funny movies make you laugh, and work is sometimes fulfilling, and a beer with a friend is nice. And other people love you too.” “And that’s enough?” “There is no enough. You are the paragon of animals, my love. You aspire to such greatness, to miracle, to newness and wonder. And that’s great. I’m so proud of you. But you forgot about the part that’s been around for time immemorial. Love, death, loss. You’ve run up against it. And there’s no getting around or over it. You stop and build your life right there at the base of that wall. But it’s okay. That’s where everyone else is too. Everyone else is either there or on their way. There is no other side, but there’s plenty of space there to build a life and plenty of company. Welcome to the wall, Sam.
Laurie Frankel (Goodbye for Now)
Take the 2013 film Monsters University. Even when using an industrial grade computing processor, it would have taken an average of 29 hours for each of the film’s 120,000-plus frames to be rendered. In total, that would have meant more than two years just to render the entire movie once, assuming not a single render was ever replaced or scene changed. With this challenge in mind, Pixar built a data center of 2,000 conjoined industrial-grade computers with a combined 24,000 cores that, when fully assigned, could render a frame in roughly seven seconds.
Matthew Ball (The Metaverse: And How It Will Revolutionize Everything)
The union of a zillion streams of information intermingling, flowing into each other, is what we call the cloud. Software flows from the cloud to you as a stream of upgrades. The cloud is where your stream of texts go before they arrive on your friend’s screen. The cloud is where the parade of movies under your account rests until you call for them. The cloud is the reservoir that songs escape from. The cloud is the seat where the intelligence of Siri sits, even as she speaks to you. The cloud is the new organizing metaphor for computers. The foundational units of this third digital regime, then, are flows, tags, and clouds.
Kevin Kelly (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future)
Of all the inventions Addie has seen ushered into the world—steam-powered trains, electric lights, photography, and phones, and airplanes, and computers—movies might just be her favorite one. Books are wonderful, portable, lasting, but sitting there, in the darkened theater, the wide screen filling her vision, the world falls away, and for a few short hours she is someone else, plunged into romance and intrigue and comedy and adventure. All of it complete with 4K picture and stereo sound. A quiet heaviness fills her chest when the credits roll. For a while she was weightless, but now she returns to herself, sinking until her feet are back on the ground.
Victoria E. Schwab (The Invisible Life of Addie LaRue)
Why do you hate the idea of being with yourself so much that ‘the time you spend with yourself’ is now considered as loneliness? Why we fear loneliness. The fear of loneliness was injected into our minds since we were kids. We have learned that the kid who eats alone, sits alone, and has no friends is pathetic. In every book or movie, the kid who is eating alone, and has no friend is always featured as a weak character who needs to be saved. It’s not pathetic to be alone. I realized that we don’t hate being alone. We hate to believe that we are left behind. Being alone is a part of life. But being lonely means viewing yourself from the lens of sympathy and misery. When you look at yourself through the lens of loneliness, you feel insecure and left out. Being alone doesn’t mean you are lonely. Being alone means YOU ARE WITH YOURSELF. Stop romanticizing your life, one day someone will come to save you, rescue you, or rather fall in love with you. The problem with this is that you CHOOSE to believe that YOU ARE NOT ENOUGH to change your life all by yourself. You rely your hope on someone who doesn’t exist. After college, you don’t make friends. You just network. You just try to be nice to people so you are not left behind (mostly). We don’t want people to think that no one chose us so what do we do? We start becoming like an ideal version of whom everyone loves. We start saying YES to things that we hate. But step by step, as we become like everyone else, we go far away from who we truly are. Loneliness is not when you don’t have people around. Loneliness occurs when you cannot find yourself inside you. The moment you feel the loss of your real self, that’s when loneliness makes a home inside you. “There are some days when you miss yourself more than you have ever missed anyone else.” Solitude is my home, Loneliness was my cage. Imagine Yourself as a computer and see how you have opened different tabs of your personality for each person you meet. New person, new tab. Perhaps, that's the reason your real personality has crashed.
Renuka Gavrani
I had known him since 1984, when he came to Manhattan to have lunch with Time’s editors and extol his new Macintosh. He was petulant even then, attacking a Time correspondent for having wounded him with a story that was too revealing. But talking to him afterward, I found myself rather captivated, as so many others have been over the years, by his engaging intensity. We stayed in touch, even after he was ousted from Apple. When he had something to pitch, such as a NeXT computer or Pixar movie, the beam of his charm would suddenly refocus on me, and he would take me to a sushi restaurant in Lower Manhattan to tell me that whatever he was touting was the best thing he had ever produced. I liked him.
Walter Isaacson (Steve Jobs)
Blood, Sweat, and Pixels: The Triumphant, Turbulent Stories Behind How Video Games Are Made, by Jason Schreier; Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture, by David Kushner; Hackers: Heroes of the Computer Revolution (specifically the section on Sierra On-Line), by Steven Levy; A Mind Forever Voyaging: A History of Storytelling in Video Games, by Dylan Holmes; Extra Lives: Why Video Games Matter, by Tom Bissell; All Your Base Are Belong to Us: How Fifty Years of Video Games Conquered Pop Culture, by Harold Goldberg; and the documentaries Indie Game: The Movie, directed by James Swirsky and Lisanne Pajot, and GTFO, directed by Shannon Sun-Higginson. I read Indie Games by Bounthavy Suvilay after I finished writing, and it’s a beautiful book for those looking to see how artful games can be.
Gabrielle Zevin (Tomorrow, and Tomorrow, and Tomorrow)
The laws that keep us safe, these same laws condemn us to boredom. Without access to true chaos, we’ll never have true peace. Unless everything can get worse, it won’t get any better. This is all stuff the Mommy used to tell him. She used to say, “The only frontier you have left is the world of intangibles. Everything else is sewn up too tight.” Caged inside too many laws. By intangibles, she meant the Internet, movies, music, stories, art, rumors, computer programs, anything that isn’t real. Virtual realities. Make-believe stuff. The culture. The unreal is more powerful than the real. Because nothing is as perfect as you can imagine it. Because it’s only intangible ideas, concepts, beliefs, fantasies that last. Stone crumbles. Wood rots. People, well, they die. But things as fragile as a thought, a dream, a legend, they can go on and on.
Chuck Palahniuk (Choke)
This is a book about the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing. You might even add a seventh, retail stores, which Jobs did not quite revolutionize but did reimagine. In addition, he opened the way for a new market for digital content based on apps rather than just websites. Along the way he produced not only transforming products but also, on his second try, a lasting company, endowed with his DNA, that is filled with creative designers and daredevil engineers who could carry forward his vision. In August 2011, right before he stepped down as CEO, the enterprise he started in his parents’ garage became the world’s most valuable company.
Walter Isaacson (Steve Jobs)
the thesis is that after many generations in which technology favored centralization (railroads, telegraph, radio, television, movies, mass production) since about 1950 it is now favoring decentralization (transistor, personal computer, internet, remote work, smartphone, cryptocurrency). So by this measure, peak centralization was about 1950, when there was one telephone company (AT&T), two superpowers (US/USSR), and three TV stations (ABC/CBS/NBC). Even though the 1950s are romanticized in the US, and there were certainly good things about the era, that level of centralization was not natural. This was an enormous degree of cultural homogenization, conformity, and sameness relative to the pre-1914 world just a few decades prior. Many aspects of individual initiative, creativity, and freedom had been dulled down or eliminated in the standardization process.
Balaji S. Srinivasan (The Network State: How To Start a New Country)
I saw a guy the other day at a wedding, and I told him my theory on why we’ve seen this explosion in comedies in the past fifteen years. Number one, America is tacking hard to the right. That sort of extremism always kind of kicks up the need to create comedy. But the second thing is Avid. What’s Avid? It’s a digital movie-editing program that directors use, and it’s incredibly helpful. I think Avid is hugely responsible for this boom in comedy. In the past, one would have to shoot the film and edit it, which was a big deal. Now, filmmakers can record the laughs from a test audience at a screening, and we can then cut to the rhythm of those laughs, the rhythm of the audience. We synchronize the laughs with the film. We can really get our timing down to a hundredth of a second. You can decide where you want your story to kick in, where you want a little bit of mood, where you want a hard laugh line. All of this can really be calibrated to these test screenings that we do. It doesn’t mean that it becomes mathematical. It still ultimately means that you have to make creative choices, but you can just really get a lot out of it. Sort of like surgery with a laser compared with a regular scalpel. We’re able to download a movie onto the computer and literally do all our edits in minutes. The precision is incredible. You play back the audio of the test screening and get everything timed just right. Like, “This laugh is losing this next line; let’s split the difference here.” You’re able to achieve this rolling energy. You can try experimental edits, and do multiple test screenings, and it’s all because you can move so fast with this program. Comedy is the one genre that I think has just really benefited from this more than any other.
Mike Sacks (Poking a Dead Frog: Conversations with Today's Top Comedy Writers)
As a society we are only now getting close to where Dogen was eight hundred years ago. We are watching all our most basic assumptions about life, the universe, and everything come undone, just like Dogen saw his world fall apart when his parents died. Religions don’t seem to mean much anymore, except maybe to small groups of fanatics. You can hardly get a full-time job, and even if you do, there’s no stability. A college degree means very little. The Internet has leveled things so much that the opinions of the greatest scientists in the world about global climate change are presented as being equal to those of some dude who read part of the Bible and took it literally. The news industry has collapsed so that it’s hard to tell a fake headline from a real one. Money isn’t money anymore; it’s numbers stored in computers. Everything is changing so rapidly that none of us can hope to keep up. All this uncertainty has a lot of us scrambling for something certain to hang on to. But if you think I’m gonna tell you that Dogen provides us with that certainty, think again. He actually gives us something far more useful. Dogen gives us a way to be okay with uncertainty. This isn’t just something Buddhists need; it’s something we all need. We humans can be certainty junkies. We’ll believe in the most ridiculous nonsense to avoid the suffering that comes from not knowing something. It’s like part of our brain is dedicated to compulsive dot-connecting. I think we’re wired to want to be certain. You have to know if that’s a rope or a snake, if the guy with the chains all over his chest is a gangster or a fan of bad seventies movies. Being certain means being safe. The downfall is that we humans think about a lot of stuff that’s not actually real. We crave certainty in areas where there can never be any. That’s when we start in with believing the crazy stuff. Dogen is interesting because he tries to cut right to the heart of this. He gets into what is real and what is not. Probably the main reason he’s so difficult to read is that Dogen is trying to say things that can’t actually be said. So he has to bend language to the point where it almost breaks. He’s often using language itself to show the limitations of language. Even the very first readers of his writings must have found them difficult. Dogen understood both that words always ultimately fail to describe reality and that we human beings must rely on words anyway. So he tried to use words to write about that which is beyond words. This isn’t really a discrepancy. You use words, but you remain aware of their limitations. My teacher used to say, “People like explanations.” We do. They’re comforting. When the explanation is reasonably correct, it’s useful.
Brad Warner (It Came from Beyond Zen!: More Practical Advice from Dogen, Japan's Greatest Zen Master (Treasury of the True Dharma Eye Book 2))
To escape the throngs, we decided to see the new Neil deGrasse Tyson planetarium show, Dark Universe. It costs more than two movie tickets and is less than thirty minutes long, but still I want to go back and see it again, preferably as soon as possible. It was more visually stunning than any Hollywood special effect I’d ever seen, making our smallness as individuals both staggering and - strangely - rather comforting. Only five percent of the universe consists of ordinary matter, Neil tells us. That includes all matter - you, and me, and the body of Michael Brown, and Mork’s rainbow suspenders, and the letters I wrote all summer, and the air conditioner I put out on the curb on Christmas Day because I was tired of looking at it and being reminded of the person who had installed it, and my sad dying computer that sounds like a swarm of bees when it gets too hot, and the fields of Point Reyes, and this year’s blossoms which are dust now, and the drafts of my book, and Israeli tanks, and the untaxed cigarettes that Eric Garner sold, and my father’s ill-fitting leg brace that did not accomplish what he’d hoped for in terms of restoring mobility, and the Denver airport, and haunting sperm whales that sleep vertically, and the water they sleep in, and Mars and Jupiter and all of the stars we see and all of the ones we don’t. That’s all regular matter, just five percent. A quarter is “dark matter,” which is invisible and detectable only by gravitational pull, and a whopping 70 percent of the universe is made up of “dark energy,” described as a cosmic antigravity, as yet totally unknowable. It’s basically all mystery out there - all of it, with just this one sliver of knowable, livable, finite light and life. And did I mention the effects were really cool? After seeing something like that it’s hard to stay mad at anyone, even yourself.
Summer Brennan
Music centers you,” I whispered to an empty car, staring at his front door. “You listened to your iPod between classes and while you sat on the bleachers before school every morning.” I smiled, letting more tears run down my cheeks and thinking back to him and his black hoodies, looking so dark. “You love popcorn. Almost every kind and flavor but especially with Tabasco sauce,” I said, remembering the times he would come into the theater where I worked. “You hold the door open for women—students, teachers, and even old ladies coming out of Baskin-Robbins. You love movies about natural disasters, but they have to have some comedy in them. Your favorite one is Armageddon.” I swallowed and thought about how little I’d ever seen Jax truly smile. “And while you love computers, it’s not your passion,” I concluded. “You love being outdoors. You love having space.” My whole face hurt, the last words barely audible. “And you deserve someone who makes you happy. I’m just not that person.
Penelope Douglas (Falling Away (Fall Away, #4))
More recently, mathematical script has given rise to an even more revolutionary writing system, a computerised binary script consisting of only two signs: 0 and 1. The words I am now typing on my keyboard are written within my computer by different combinations of 0 and 1. Writing was born as the maidservant of human consciousness, but is increasingly becoming its master. Our computers have trouble understanding how Homo sapiens talks, feels and dreams. So we are teaching Homo sapiens to talk, feel and dream in the language of numbers, which can be understood by computers. And this is not the end of the story. The field of artificial intelligence is seeking to create a new kind of intelligence based solely on the binary script of computers. Science-fiction movies such as The Matrix and The Terminator tell of a day when the binary script throws off the yoke of humanity. When humans try to regain control of the rebellious script, it responds by attempting to wipe out the human race.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
In the climactic scene of many Hollywood science-fiction movies, humans face an alien invasion fleet, an army of rebellious robots or an all-knowing super-computer that wants to obliterate them. Humanity seems doomed. But at the very last moment, against all the odds, humanity triumphs thanks to something that the aliens, the robots and the super-computers didn’t suspect and cannot fathom: love. The hero, who up till now has been easily manipulated by the super-computer and has been riddled with bullets by the evil robots, is inspired by his sweetheart to make a completely unexpected move that turns the tables on the thunderstruck Matrix. Dataism finds such scenarios utterly ridiculous. ‘Come on,’ it admonishes the Hollywood screenwriters, ‘is that all you could come up with? Love? And not even some platonic cosmic love, but the carnal attraction between two mammals? Do you really think that an all-knowing super-computer or aliens who managed to conquer the entire galaxy would be dumbfounded by a hormonal rush?
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
It was little things at first. Abby missed a phone call because she had an away game. Then one time Gretchen didn’t write back and never made up for the missing letter. They got busy with SATs and college applications, and even though they both applied to Georgetown, Gretchen didn’t get in, and Abby wound up going to George Washington anyways. At college they went to their computer labs and sent each other emails, sitting in front of black and green CRT screens and pecking them out one letter at a time. And they still wrote, but calling became a once-a-week thing. Gretchen was Abby’s maid of honor at her tiny courthouse wedding, but sometimes a month would go by and they wouldn’t speak. Then two months. Then three. They went through periods when they both made an effort to write more, but after a while that usually faded. It wasn’t anything serious, it was just life. The dance recitals, making the rent, first real jobs, pickups, dropoffs, the fights that seemed so important, the laundry, the promotions, the vacations taken, shoes bought, movies watched, lunches packed. It was a haze of the everyday that blurred the big things and made them feel distant and small.
Grady Hendrix (My Best Friend's Exorcism)
The idea that John Lasseter pitched was called “Toy Story.” It sprang from a belief, which he and Jobs shared, that products have an essence to them, a purpose for which they were made. If the object were to have feelings, these would be based on its desire to fulfill its essence. The purpose of a glass, for example, is to hold water; if it had feelings, it would be happy when full and sad when empty. The essence of a computer screen is to interface with a human. The essence of a unicycle is to be ridden in a circus. As for toys, their purpose is to be played with by kids, and thus their existential fear is of being discarded or upstaged by newer toys. So a buddy movie pairing an old favorite toy with a shiny new one would have an essential drama to it, especially when the action revolved around the toys’ being separated from their kid. The original treatment began, “Everyone has had the traumatic childhood experience of losing a toy. Our story takes the toy’s point of view as he loses and tries to regain the single thing most important to him: to be played with by children. This is the reason for the existence of all toys. It is the emotional foundation of their existence.
Walter Isaacson (Steve Jobs)
“The culture we have does not make people feel good about themselves. And you have to be strong enough to say if the culture doesn’t work, don’t buy it.” Morrie, true to these words, had developed his own culture—long before he got sick. Discussion groups, walks with friends, dancing to his music in the Harvard Square church. He started a project called Greenhouse, where poor people could receive mental health services. He read books to find new ideas for his classes, visited with colleagues, kept up with old students, wrote letters to distant friends. He took more time eating and looking at nature and wasted no time in front of TV sitcoms or “Movies of the Week.” He had created a cocoon of human activities—conversation, interaction, affection—and it filled his life like an overflowing soup bowl. I had also developed my own culture. Work. I did four or five media jobs in England, juggling them like a clown. I spent eight hours a day on a computer, feeding my stories back to the States. Then I did TV pieces, traveling with a crew throughout parts of London. I also phoned in radio reports every morning and afternoon. This was not an abnormal load. Over the years, I had taken labor as my companion and had moved everything else to the side.
Mitch Albom (Tuesdays with Morrie: An Old Man, a Young Man, and Life's Greatest Lesson)
Being unable to deal with the complexity of the world has seen us retreat into what Curtis calls a “static world”. Instead of looking to change the world for the better, we look either to change small things (our bodies, our own rights as an individual), or we fall back into the past. “This obsession with risk that politicians, terror experts and finance people have, it’s about going back into the past, looking for patterns – which computers now allow you to do – and adjusting everything to make sure things are stable. “When I was working with Massive Attack, I used an old Bauhaus song called Bela Lugosi’s Dead and [on the big screens] I constantly repeated the phrase, ‘If you like this, then you’ll love that.’ I think in a way that’s the motto of our time. We’ll give you tomorrow something very similar to what you had yesterday. And then the world will be stable. And that’s true in politics, finance and culture. “Look at the way culture plays it,” he continues. “I mean, look at me. Look at Edgar Wright: he makes movies constantly referencing things. We constantly play yesterday back to you in a slightly altered form, to try and make you feel stable and happy. And the world stays stuck and everyone gets ratty, which is why they all snark at each other on the internet.
Anonymous
The best entrepreneurs don’t just follow Moore’s Law; they anticipate it. Consider Reed Hastings, the cofounder and CEO of Netflix. When he started Netflix, his long-term vision was to provide television on demand, delivered via the Internet. But back in 1997, the technology simply wasn’t ready for his vision—remember, this was during the era of dial-up Internet access. One hour of high-definition video requires transmitting 40 GB of compressed data (over 400 GB without compression). A standard 28.8K modem from that era would have taken over four months to transmit a single episode of Stranger Things. However, there was a technological innovation that would allow Netflix to get partway to Hastings’s ultimate vision—the DVD. Hastings realized that movie DVDs, then selling for around $ 20, were both compact and durable. This made them perfect for running a movie-rental-by-mail business. Hastings has said that he got the idea from a computer science class in which one of the assignments was to calculate the bandwidth of a station wagon full of backup tapes driving across the country! This was truly a case of technological innovation enabling business model innovation. Blockbuster Video had built a successful business around buying VHS tapes for around $ 100 and renting them out from physical stores, but the bulky, expensive, fragile tapes would never have supported a rental-by-mail business.
Reid Hoffman (Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies)
Walking back through the mall to the exit nearest our part of the parking lot, we passed one shop which sold computers, printers, software, and games. It was packed with teenagers, the kind who wear wire rims and know what the new world is about. The clerks were indulgent, letting them program the computers. Two hundred yards away, near the six movie houses, a different kind of teenager shoved quarters into the space-war games, tensing over the triggers, releasing the eerie sounds of extraterrestrial combat. Any kid back in the computer store could have told the combatants that because there is no atmosphere in space, there is absolutely no sound at all. Perfect distribution: the future managers and the future managed ones. Twenty in the computer store, two hundred in the arcade. The future managers have run on past us into the thickets of CP/M, M-Basic, Cobol, Fortran, Z-80, Apples, and Worms. Soon the bosses of the microcomputer revolution will sell us preprogrammed units for each household which will provide entertainment, print out news, purvey mail-order goods, pay bills, balance accounts, keep track of expenses, and compute taxes. But by then the future managers will be over on the far side of the thickets, dealing with bubble memories, machines that design machines, projects so esoteric our pedestrian minds cannot comprehend them. It will be the biggest revolution of all, bigger than the wheel, bigger than Franklin’s kite, bigger than paper towels.
John D. MacDonald (Cinnamon Skin (Travis McGee, #20))
As an analogy, we used to think of books, music, and movies as distinct. Then they all became represented by packets sent over the internet. Yes, we listened to music in audio players and viewed books in ebook readers, but their fundamental structure became digital. Similarly, today we think of stocks, bonds, gold, loans, and art as different. But all of them are represented as debits and credits on blockchains. Again, the fundamental structure became digital. Now, we are starting to think of different kinds of collections of people — whether communities, cities, companies, or countries — all fundamentally as networks, where the digital profiles and how they interact become more and more fundamental. This is obvious for communities and companies, which can already be fully remote and digital, but even already existing cities and countries are starting to be modeled this way, because (a) their citizens are often geographically remote, (b) the concept of citizenship itself is becoming similar to digital single sign-on, (c) many 20th century functions of government have already been de-facto transferred to private networks like (electronic) mail delivery, hotel, and taxi regulation, (d) cities and countries increasingly recruit citizens online, (e) so-called smart cities are increasingly administrated through a computer interface, and (f) as countries issue central bank digital currencies and cities likely follow suit, every polity will be publicly traded on the internet just like companies and coins.
Balaji S. Srinivasan (The Network State: How To Start a New Country)
A serious reader of fiction is an adult who reads, let's say, two or more hours a night, three or four nights a week, and by the end of two or three weeks he has read the book. A serious reader is not someone who reads for half an hour at a time and then picks the book up again on the beach a week later. While reading, serious readers aren't distracted by anything else. They put the kids to bed, and then they read. They don't watch TV intermittently or stop off and on to shop on-line or to talk on the phone. There is, indisputably, a rapidly diminishing number of serious readers, certainly in America. Of course, the cause is something more than just the multitudinous distractions of contemporary life. One must acknowledge the triumph of the screen. Reading, whether serious or frivolous, doesn't stand a chance against the screen: first, the movie screen, then the television screen, now the proliferating computer screen, one in your pocket, one on your desk, one in your hand, and soon one imbedded between your eyes. Why can't serious reading compete? Because the gratifications of the screen are far more immediate, graspable, gigantically gripping. Alas, the screen is not only fantastically useful, it's fun, and what beats fun? There was never a Golden Age of Serious Reading in America but I don't remember ever in my lifetime the situation being as sad for books – with all the steady focus and uninterrupted concentration they require – as it is today. And it will be worse tomorrow and even worse the day after. My prediction is that in thirty years, if not sooner, there will be just as many people reading serious fiction in America as now read Latin poetry. A percentage do. But the number of people who find in literature a highly desirable source of sustaining pleasure and mental stimulation is sadly diminished.
Philip Roth
Despite the superficial similarities created by global technology, the dynamics of peer-orientation are more likely to promote division rather than a healthy universality. One need only to look at the extreme tribalization of the youth gangs, the social forms entered into by the most peer-oriented among our children. Seeking to be the same as someone else immediately triggers the need to be different from others. As the similarities within the chosen group strengthen, the differences from those outside the groups are accentuated to the point of hostility. Each group is solidified and reinforced by mutual emulation and cue-taking. In this way, tribes have formed spontaneously since the beginning of time. The crucial difference is that traditional tribal culture could be passed down, whereas these tribes of today are defined and limited by barriers among the generations. The school milieu is rife with such dynamics. When immature children cut off from their adult moorings mingle with one another, groups soon form spontaneously, often along the more obvious dividing lines of grade and gender and race. Within these larger groupings certain subcultures emerge: sometimes along the lines of dress and appearance, and sometimes along those of shared interests, attitudes, or abilities, as in groups of jocks, brains, and computer nerds. Sometimes they form among peer-oriented subcultures like skateboarders, bikers, and skinheads. Many of these subcultures are reinforced and shaped by the media and supported by cult costumes, symbols, movies, music, and language. If the tip of the peer-orientation iceberg are the gangs and the gang wannabes, at the base are the cliques. Immature beings revolving around one another invent their own language and modes of expression that impoverish their self-expression and cut them off from others. Such phenomena may have appeared before, of course, but not nearly to the same extent we are witnessing today. The result is tribalization.
Gabor Maté (Hold On to Your Kids: Why Parents Need to Matter More Than Peers)
Isaac Asimov’s short story “The Fun They Had” describes a school of the future that uses advanced technology to revolutionize the educational experience, enhancing individualized learning and providing students with personalized instruction and robot teachers. Such science fiction has gone on to inspire very real innovation. In a 1984 Newsweek interview, Apple’s co-founder Steve Jobs predicted computers were going to be a bicycle for our minds, extending our capabilities, knowledge, and creativity, much the way a ten-speed amplifies our physical abilities. For decades, we have been fascinated by the idea that we can use computers to help educate people. What connects these science fiction narratives is that they all imagined computers might eventually emulate what we view as intelligence. Real-life researchers have been working for more than sixty years to make this AI vision a reality. In 1962, the checkers master Robert Nealey played the game against an IBM 7094 computer, and the computer beat him. A few years prior, in 1957, the psychologist Frank Rosenblatt created Perceptron, the first artificial neural network, a computer simulation of a collection of neurons and synapses trained to perform certain tasks. In the decades following such innovations in early AI, we had the computation power to tackle systems only as complex as the brain of an earthworm or insect. We also had limited techniques and data to train these networks. The technology has come a long way in the ensuing decades, driving some of the most common products and apps today, from the recommendation engines on movie streaming services to voice-controlled personal assistants such as Siri and Alexa. AI has gotten so good at mimicking human behavior that oftentimes we cannot distinguish between human and machine responses. Meanwhile, not only has the computation power developed enough to tackle systems approaching the complexity of the human brain, but there have been significant breakthroughs in structuring and training these neural networks.
Salman Khan (Brave New Words: How AI Will Revolutionize Education (and Why That’s a Good Thing))
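Rosenblatt's perceptron, mentioned in the passage above, is simple enough to sketch in a few lines. The toy data, learning rate, and epoch count below are made up for illustration; this is a minimal sketch of the kind of trained artificial neuron the passage describes, not code from the book:

```python
# A minimal perceptron sketch on invented toy data, not code from the quoted book.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two Gaussian clusters in 2-D, labeled +1 and -1.
X = np.vstack([rng.normal(loc=[2, 2], size=(50, 2)),
               rng.normal(loc=[-2, -2], size=(50, 2))])
y = np.array([1] * 50 + [-1] * 50)

w = np.zeros(2)   # weights ("synapses")
b = 0.0           # bias

for epoch in range(20):
    mistakes = 0
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b >= 0 else -1
        if pred != target:            # classic perceptron rule: update only on errors
            w += 0.1 * target * xi
            b += 0.1 * target
            mistakes += 1
    if mistakes == 0:                 # converged: every point classified correctly
        break

print("learned weights:", w, "bias:", b)
```

On linearly separable data like this, the update rule is guaranteed to converge; the limits of that guarantee are a large part of why early neural-network research stalled before the later breakthroughs the passage goes on to describe.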
During homeroom, before first period, I start a bucket list in one of my notebooks. First on the list? 1) Eat in the cafeteria. Sit with people. TALK TO THEM. 2) And…that’s all I can come up with for now. But this is good. One task to work on. No distractions. I can do this. When my lunch period rolls around, I forgo the safety of my bag lunch and the computer lab and slip into the pizza line, wielding my very own tray of semi-edible fare for the first time in years. “A truly remarkable sight.” Jensen cuts into line beside me, sliding his tray next to mine on the ledge in front of us. He lifts his hands and frames me with his fingers, like he’s shooting a movie. “In search of food, the elusive creature emerges from her den and tries her luck at the watering hole." I shake my head, smiling, moving down the line. “Wow, Peters. I never knew you were such a huge Animal Planet fan.” “I’m a fan of all things nature. Birds. Bees. The like.” He grabs two pudding cups and drops one on my tray. “Pandas?” I say. “How did you know? The panda is my spirit animal.” “Oh, good, because Gran has this great pattern for an embroidered panda cardigan. It would look amazing on you.” “Um, yeah, I know. It was on my Christmas list, but Santa totally stiffed me." I laugh as I grab a carton of milk. So does he. He leans in closer. “Come sit with me.” “At the jock table? Are you kidding?” I hand the cashier my lunch card. Jensen squints his eyes in the direction of his friends. “We’re skinny-ass basketball players, Wayfare. We don’t really scream jock.” “Meatheads, then?” “I believe the correct term is Athletic Types.” We step out from the line and scan the room. “So where were you planning on sitting?" “I was thinking Grady and Marco were my safest bet.” “The nerd table?” I gesture to myself, especially my glasses. “I figure my natural camouflage will help me blend, yo.” He laughs, his honey-blond hair falling in front of his eyes. “And hey,” I say, nudging him with my elbow, “last I heard, Peters was cool with nerdy.” He claps me gently on the back. “Good luck, Wayfare. I’m pulling for ya.
M.G. Buehrlen (The Untimely Deaths of Alex Wayfare (Alex Wayfare #2))
know that taking a long walk was his preferred way to have a serious conversation. It turned out that he wanted me to write a biography of him. I had recently published one on Benjamin Franklin and was writing one about Albert Einstein, and my initial reaction was to wonder, half jokingly, whether he saw himself as the natural successor in that sequence. Because I assumed that he was still in the middle of an oscillating career that had many more ups and downs left, I demurred. Not now, I said. Maybe in a decade or two, when you retire. I had known him since 1984, when he came to Manhattan to have lunch with Time’s editors and extol his new Macintosh. He was petulant even then, attacking a Time correspondent for having wounded him with a story that was too revealing. But talking to him afterward, I found myself rather captivated, as so many others have been over the years, by his engaging intensity. We stayed in touch, even after he was ousted from Apple. When he had something to pitch, such as a NeXT computer or Pixar movie, the beam of his charm would suddenly refocus on me, and he would take me to a sushi restaurant in Lower Manhattan to tell me that whatever he was touting was the best thing he had ever produced. I liked him. When he was restored to the throne at Apple, we put him on the cover of Time, and soon thereafter he began offering me his ideas for a series we were doing on the most influential people of the century. He had launched his “Think Different” campaign, featuring iconic photos of some of the same people we were considering, and he found the endeavor of assessing historic influence fascinating. After I had deflected his suggestion that I write a biography of him, I heard from him every now and then. At one point I emailed to ask if it was true, as my daughter had told me, that the Apple logo was an homage to Alan Turing, the British computer pioneer who broke the German wartime codes and then committed suicide by biting into a cyanide-laced apple. He replied that he wished he had thought of that, but hadn’t. That started an exchange about the early history of Apple, and I found myself gathering string on the subject, just in case I ever decided to do such a book. When my Einstein biography came out, he came to a book event in Palo Alto and
Walter Isaacson (Steve Jobs)
Twenty years? No kidding: twenty years? It’s hard to believe. Twenty years ago, I was—well, I was much younger. My parents were still alive. Two of my grandchildren had not yet been born, and another one, now in college, was an infant. Twenty years ago I didn’t own a cell phone. I didn’t know what quinoa was and I doubt if I had ever tasted kale. There had recently been a war. Now we refer to that one as the First Gulf War, but back then, mercifully, we didn’t know there would be another. Maybe a lot of us weren’t even thinking about the future then. But I was. And I’m a writer. I wrote The Giver on a big machine that had recently taken the place of my much-loved typewriter, and after I printed the pages, very noisily, I had to tear them apart, one by one, at the perforated edges. (When I referred to it as my computer, someone more knowledgeable pointed out that my machine was not a computer. It was a dedicated word processor. “Oh, okay then,” I said, as if I understood the difference.) As I carefully separated those two hundred or so pages, I glanced again at the words on them. I could see that I had written a complete book. It had all the elements of the seventeen or so books I had written before, the same things students of writing list on school quizzes: characters, plot, setting, tension, climax. (Though I didn’t reply as he had hoped to a student who emailed me some years later with the request “Please list all the similes and metaphors in The Giver,” I’m sure it contained those as well.) I had typed THE END after the intentionally ambiguous final paragraphs. But I was aware that this book was different from the many I had already written. My editor, when I gave him the manuscript, realized the same thing. If I had drawn a cartoon of him reading those pages, it would have had a text balloon over his head. The text would have said, simply: Gulp. But that was twenty years ago. If I had written The Giver this year, there would have been no gulp. Maybe a yawn, at most. Ho-hum. In so many recent dystopian novels (and there are exactly that: so many), societies battle and characters die hideously and whole civilizations crumble. None of that in The Giver. It was introspective. Quiet. Short on action. “Introspective, quiet, and short on action” translates to “tough to film.” Katniss Everdeen gets to kill off countless adolescent competitors in various ways during The Hunger Games; that’s exciting movie fare. It sells popcorn. Jonas, riding a bike and musing about his future? Not so much. Although the film rights to The Giver were snapped up early on, it moved forward in spurts and stops for years, as screenplay after screenplay—none of them by me—was
Lois Lowry (The Giver (Giver Quartet Book 1))
The US traded its manufacturing sector’s health for its entertainment industry, hoping that Police Academy sequels could take the place of the rustbelt. The US bet wrong. But like a losing gambler who keeps on doubling down, the US doesn’t know when to quit. It keeps meeting with its entertainment giants, asking how US foreign and domestic policy can preserve its business-model. Criminalize 70 million American file-sharers? Check. Turn the world’s copyright laws upside down? Check. Cream the IT industry by criminalizing attempted infringement? Check. It’ll never work. It can never work. There will always be an entertainment industry, but not one based on excluding access to published digital works. Once it’s in the world, it’ll be copied. This is why I give away digital copies of my books and make money on the printed editions: I’m not going to stop people from copying the electronic editions, so I might as well treat them as an enticement to buy the printed objects. But there is an information economy. You don’t even need a computer to participate. My barber, an avowed technophobe who rebuilds antique motorcycles and doesn’t own a PC, benefited from the information economy when I found him by googling for barbershops in my neighborhood. Teachers benefit from the information economy when they share lesson plans with their colleagues around the world by email. Doctors benefit from the information economy when they move their patient files to efficient digital formats. Insurance companies benefit from the information economy through better access to fresh data used in the preparation of actuarial tables. Marinas benefit from the information economy when office-slaves look up the weekend’s weather online and decide to skip out on Friday for a weekend’s sailing. Families of migrant workers benefit from the information economy when their sons and daughters wire cash home from a convenience store Western Union terminal. This stuff generates wealth for those who practice it. It enriches the country and improves our lives. And it can peacefully co-exist with movies, music and microcode, but not if Hollywood gets to call the shots. Where IT managers are expected to police their networks and systems for unauthorized copying – no matter what that does to productivity – they cannot co-exist. Where our operating systems are rendered inoperable by “copy protection,” they cannot co-exist. Where our educational institutions are turned into conscript enforcers for the record industry, they cannot co-exist. The information economy is all around us. The countries that embrace it will emerge as global economic superpowers. The countries that stubbornly hold to the simplistic idea that the information economy is about selling information will end up at the bottom of the pile. What country do you want to live in?
Cory Doctorow (Content: Selected Essays on Technology, Creativity, Copyright, and the Future of the Future)
As the subject watches the movies, the MRI machine creates a 3-D image of the blood flow within the brain. The MRI image looks like a vast collection of thirty thousand dots, or voxels. Each voxel represents a pinpoint of neural energy, and the color of the dot corresponds to the intensity of the signal and blood flow. Red dots represent points of large neural activity, while blue dots represent points of less activity. (The final image looks very much like thousands of Christmas lights in the shape of the brain. Immediately you can see that the brain is concentrating most of its mental energy in the visual cortex, which is located at the back of the brain, while watching these videos.) Gallant’s MRI machine is so powerful it can identify two to three hundred distinct regions of the brain and, on average, can take snapshots that have one hundred dots per region of the brain. (One goal for future generations of MRI technology is to provide an even sharper resolution by increasing the number of dots per region of the brain.) At first, this 3-D collection of colored dots looks like gibberish. But after years of research, Dr. Gallant and his colleagues have developed a mathematical formula that begins to find relationships between certain features of a picture (edges, textures, intensity, etc.) and the MRI voxels. For example, if you look at a boundary, you’ll notice it’s a region separating lighter and darker areas, and hence the edge generates a certain pattern of voxels. By having subject after subject view such a large library of movie clips, this mathematical formula is refined, allowing the computer to analyze how all sorts of images are converted into MRI voxels. Eventually the scientists were able to ascertain a direct correlation between certain MRI patterns of voxels and features within each picture. At this point, the subject is then shown another movie trailer. The computer analyzes the voxels generated during this viewing and re-creates a rough approximation of the original image. (The computer selects images from one hundred movie clips that most closely resemble the one that the subject just saw and then merges images to create a close approximation.) In this way, the computer is able to create a fuzzy video of the visual imagery going through your mind. Dr. Gallant’s mathematical formula is so versatile that it can take a collection of MRI voxels and convert it into a picture, or it can do the reverse, taking a picture and then converting it to MRI voxels. I had a chance to view the video created by Dr. Gallant’s group, and it was very impressive. Watching it was like viewing a movie with faces, animals, street scenes, and buildings through dark glasses. Although you could not see the details within each face or animal, you could clearly identify the kind of object you were seeing. Not only can this program decode what you are looking at, it can also decode imaginary images circulating in your head.
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)
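The decoding idea Kaku describes (fit a mapping between image features and voxel responses, then identify new clips by how well their predicted voxel patterns match a measurement) can be sketched loosely. The array shapes, synthetic data, and plain least-squares model below are stand-ins for illustration, not Gallant's actual pipeline:

```python
# A loose sketch of encoding-model decoding, with synthetic stand-in data throughout.
import numpy as np

rng = np.random.default_rng(1)
n_train, n_features, n_voxels = 500, 50, 300        # hypothetical sizes

# Pretend training data: image features of movie clips and the voxel responses they evoke.
train_features = rng.normal(size=(n_train, n_features))
true_map = rng.normal(size=(n_features, n_voxels))
train_voxels = train_features @ true_map + 0.1 * rng.normal(size=(n_train, n_voxels))

# 1) Fit a linear "encoding model" from image features to voxel responses.
W, *_ = np.linalg.lstsq(train_features, train_voxels, rcond=None)

# 2) Decode a new scan: predict voxel patterns for a library of candidate clips,
#    then rank the clips by how closely each prediction matches the measurement.
library_features = rng.normal(size=(100, n_features))   # 100 candidate clips
predicted = library_features @ W                         # shape (100, n_voxels)

measured = library_features[42] @ true_map               # pretend the subject watched clip 42
scores = (predicted @ measured) / (
    np.linalg.norm(predicted, axis=1) * np.linalg.norm(measured))
print("best-matching clips:", np.argsort(scores)[::-1][:5])
```

In the sketch the "scan" is generated from clip 42, so clip 42 should rank first; the remarkable part of the real research is that the same ranking trick works on actual MRI measurements.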
me to be honest about his failings as well as his strengths. She is one of the smartest and most grounded people I have ever met. “There are parts of his life and personality that are extremely messy, and that’s the truth,” she told me early on. “You shouldn’t whitewash it. He’s good at spin, but he also has a remarkable story, and I’d like to see that it’s all told truthfully.” I leave it to the reader to assess whether I have succeeded in this mission. I’m sure there are players in this drama who will remember some of the events differently or think that I sometimes got trapped in Jobs’s distortion field. As happened when I wrote a book about Henry Kissinger, which in some ways was good preparation for this project, I found that people had such strong positive and negative emotions about Jobs that the Rashomon effect was often evident. But I’ve done the best I can to balance conflicting accounts fairly and be transparent about the sources I used. This is a book about the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing. You might even add a seventh, retail stores, which Jobs did not quite revolutionize but did reimagine. In addition, he opened the way for a new market for digital content based on apps rather than just websites. Along the way he produced not only transforming products but also, on his second try, a lasting company, endowed with his DNA, that is filled with creative designers and daredevil engineers who could carry forward his vision. In August 2011, right before he stepped down as CEO, the enterprise he started in his parents’ garage became the world’s most valuable company. This is also, I hope, a book about innovation. At a time when the United States is seeking ways to sustain its innovative edge, and when societies around the world are trying to build creative digital-age economies, Jobs stands as the ultimate icon of inventiveness, imagination, and sustained innovation. He knew that the best way to create value in the twenty-first century was to connect creativity with technology, so he built a company where leaps of the imagination were combined with remarkable feats of engineering. He and his colleagues at Apple were able to think differently: They developed not merely modest product advances based on focus groups, but whole new devices and services that consumers did not yet know they needed. He was not a model boss or human being, tidily packaged for emulation. Driven by demons, he could drive those around him to fury and despair. But his personality and passions and products were all interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is thus both instructive and cautionary, filled with lessons about innovation, character, leadership, and values.
Walter Isaacson (Steve Jobs)
Like,” he repeats with distaste. “How about I tell you what I don’t like? I do not like postmodernism, postapocalyptic settings, postmortem narrators, or magic realism. I rarely respond to supposedly clever formal devices, multiple fonts, pictures where they shouldn’t be—basically, gimmicks of any kind. I find literary fiction about the Holocaust or any other major world tragedy to be distasteful—nonfiction only, please. I do not like genre mash-ups à la the literary detective novel or the literary fantasy. Literary should be literary, and genre should be genre, and crossbreeding rarely results in anything satisfying. I do not like children’s books, especially ones with orphans, and I prefer not to clutter my shelves with young adult. I do not like anything over four hundred pages or under one hundred fifty pages. I am repulsed by ghostwritten novels by reality television stars, celebrity picture books, sports memoirs, movie tie-in editions, novelty items, and—I imagine this goes without saying—vampires. I rarely stock debuts, chick lit, poetry, or translations. I would prefer not to stock series, but the demands of my pocketbook require me to. For your part, you needn’t tell me about the ‘next big series’ until it is ensconced on the New York Times Best Sellers list. Above all, Ms. Loman, I find slim literary memoirs about little old men whose little old wives have died from cancer to be absolutely intolerable. No matter how well written the sales rep claims they are. No matter how many copies you promise I’ll sell on Mother’s Day.” Amelia blushes, though she is angry more than embarrassed. She agrees with some of what A.J. has said, but his manner is unnecessarily insulting. Knightley Press doesn’t even sell half of that stuff anyway. She studies him. He is older than Amelia but not by much, not by more than ten years. He is too young to like so little. “What do you like?” she asks. “Everything else,” he says. “I will also admit to an occasional weakness for short-story collections. Customers never want to buy them though.” There is only one short-story collection on Amelia’s list, a debut. Amelia hasn’t read the whole thing, and time dictates that she probably won’t, but she liked the first story. An American sixth-grade class and an Indian sixth-grade class participate in an international pen pal program. The narrator is an Indian kid in the American class who keeps feeding comical misinformation about Indian culture to the Americans. She clears her throat, which is still terribly dry. “The Year Bombay Became Mumbai. I think it will have special int—” “No,” he says. “I haven’t even told you what it’s about yet.” “Just no.” “But why?” “If you’re honest with yourself, you’ll admit that you’re only telling me about it because I’m partially Indian and you think this will be my special interest. Am I right?” Amelia imagines smashing the ancient computer over his head. “I’m telling you about this because you said you liked short stories! And it’s the only one on my list. And for the record”—here, she lies—“it’s completely wonderful from start to finish. Even if it is a debut. “And do you know what else? I love debuts. I love discovering something new. It’s part of the whole reason I do this job.” Amelia rises. Her head is pounding. Maybe she does drink too much? Her head is pounding and her heart is, too. “Do you want my opinion?” “Not particularly,” he says. “What are you, twenty-five?” “Mr. 
Fikry, this is a lovely store, but if you continue in this this this”—as a child, she stuttered and it occasionally returns when she is upset; she clears her throat—“this backward way of thinking, there won’t be an Island Books before too long.
Gabrielle Zevin (The Storied Life of A.J. Fikry)
We need to be humble enough to recognize that unforeseen things can and do happen that are nobody’s fault. A good example of this occurred during the making of Toy Story 2. Earlier, when I described the evolution of that movie, I explained that our decision to overhaul the film so late in the game led to a meltdown of our workforce. This meltdown was the big unexpected event, and our response to it became part of our mythology. But about ten months before the reboot was ordered, in the winter of 1998, we’d been hit with a series of three smaller, random events—the first of which would threaten the future of Pixar. To understand this first event, you need to know that we rely on Unix and Linux machines to store the thousands of computer files that comprise all the shots of any given film. And on those machines, there is a command—/bin/rm -r -f *—that removes everything on the file system as fast as it can. Hearing that, you can probably anticipate what’s coming: Somehow, by accident, someone used this command on the drives where the Toy Story 2 files were kept. Not just some of the files, either. All of the data that made up the pictures, from objects to backgrounds, from lighting to shading, was dumped out of the system. First, Woody’s hat disappeared. Then his boots. Then he disappeared entirely. One by one, the other characters began to vanish, too: Buzz, Mr. Potato Head, Hamm, Rex. Whole sequences—poof!—were deleted from the drive. Oren Jacobs, one of the lead technical directors on the movie, remembers watching this occur in real time. At first, he couldn’t believe what he was seeing. Then, he was frantically dialing the phone to reach systems. “Pull out the plug on the Toy Story 2 master machine!” he screamed. When the guy on the other end asked, sensibly, why, Oren screamed louder: “Please, God, just pull it out as fast as you can!” The systems guy moved quickly, but still, two years of work—90 percent of the film—had been erased in a matter of seconds. An hour later, Oren and his boss, Galyn Susman, were in my office, trying to figure out what we would do next. “Don’t worry,” we all reassured each other. “We’ll restore the data from the backup system tonight. We’ll only lose half a day of work.” But then came random event number two: The backup system, we discovered, hadn’t been working correctly. The mechanism we had in place specifically to help us recover from data failures had itself failed. Toy Story 2 was gone and, at this point, the urge to panic was quite real. To reassemble the film would have taken thirty people a solid year. I remember the meeting when, as this devastating reality began to sink in, the company’s leaders gathered in a conference room to discuss our options—of which there seemed to be none. Then, about an hour into our discussion, Galyn Susman, the movie’s supervising technical director, remembered something: “Wait,” she said. “I might have a backup on my home computer.” About six months before, Galyn had had her second baby, which required that she spend more of her time working from home. To make that process more convenient, she’d set up a system that copied the entire film database to her home computer, automatically, once a week. This—our third random event—would be our salvation. Within a minute of her epiphany, Galyn and Oren were in her Volvo, speeding to her home in San Anselmo. They got her computer, wrapped it in blankets, and placed it carefully in the backseat. 
Then they drove in the slow lane all the way back to the office, where the machine was, as Oren describes it, “carried into Pixar like an Egyptian pharaoh.” Thanks to Galyn’s files, Woody was back—along with the rest of the movie.
Ed Catmull (Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration)
So far, the smallest memory device known to be evolved and used in the wild is the genome of the bacterium Candidatus Carsonella ruddii, storing about 40 kilobytes, whereas our human DNA stores about 1.6 gigabytes, comparable to a downloaded movie. As mentioned in the last chapter, our brains store much more information than our genes: in the ballpark of 10 gigabytes electrically (specifying which of your 100 billion neurons are firing at any one time) and 100 terabytes chemically/biologically (specifying how strongly different neurons are linked by synapses). Comparing these numbers with the machine memories shows that the world’s best computers can now out-remember any biological system—at a cost that’s rapidly dropping and was a few thousand dollars in 2016.
Max Tegmark (Life 3.0: Being Human in the Age of Artificial Intelligence)
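The storage figures Tegmark cites are easy to put side by side; a quick back-of-envelope comparison using only the numbers from the passage:

```python
# Back-of-envelope comparison of the storage figures quoted above (bytes, decimal units).
KB, GB, TB = 10**3, 10**9, 10**12

carsonella_genome = 40 * KB     # smallest evolved memory device cited in the passage
human_dna         = 1.6 * GB    # roughly a downloaded movie
brain_electrical  = 10 * GB     # which neurons are firing right now
brain_synaptic    = 100 * TB    # synaptic connection strengths

print(f"human DNA vs. bacterial genome: {human_dna / carsonella_genome:,.0f}x")
print(f"synaptic vs. electrical brain memory: {brain_synaptic / brain_electrical:,.0f}x")
print(f"synaptic memory vs. DNA: {brain_synaptic / human_dna:,.0f}x")
```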
Marturano recommended something radical: do only one thing at a time. When you’re on the phone, be on the phone. When you’re in a meeting, be there. Set aside an hour to check your email, and then shut off your computer monitor and focus on the task at hand. Another tip: take short mindfulness breaks throughout the day. She called them “purposeful pauses.” So, for example, instead of fidgeting or tapping your fingers while your computer boots up, try to watch your breath for a few minutes. When driving, turn off the radio and feel your hands on the wheel. Or when walking between meetings, leave your phone in your pocket and just notice the sensations of your legs moving. “If I’m a corporate samurai,” I said, “I’d be a little worried about taking all these pauses that you recommend because I’d be thinking, ‘Well, my rivals aren’t pausing. They’re working all the time.’ ” “Yeah, but that assumes that those pauses aren’t helping you. Those pauses are the ways to make you a more clear thinker and for you to be more focused on what’s important.” This was another attack on my work style. I had long assumed that ceaseless planning was the recipe for effectiveness, but Marturano’s point was that too much mental churning was counterproductive. When you lurch from one thing to the next, constantly scheming, or reacting to incoming fire, the mind gets exhausted. You get sloppy and make bad decisions. I could see how the counterintuitive act of stopping, even for a few seconds, could be a source of strength, not weakness. This was a practical complement to Joseph’s “is this useful?” mantra. It was the opposite of zoning out, it was zoning in. In fact, I looked into it and found there was science to suggest that pausing could be a key ingredient in creativity and innovation. Studies showed that the best way to engineer an epiphany was to work hard, focus, research, and think about a problem—and then let go. Do something else. That didn’t necessarily mean meditate, but do something that relaxes and distracts you; let your unconscious mind go to work, making connections from disparate parts of the brain. This, too, was massively counterintuitive for me. My impulse when presented with a thorny problem was to bulldoze my way through it, to swarm it with thought. But the best solutions often come when you allow yourself to get comfortable with ambiguity. This is why people have aha moments in the shower. It was why Kabat-Zinn had a vision while on retreat. It was why Don Draper from Mad Men, when asked how he comes up with his great slogans, said he spends all day thinking and then goes to the movies. Janice Marturano was on
Dan Harris (10% Happier)
Leela's happiest childhood memories were of aloneness: reading in her room with the door closed, playing chess on the computer, embarking on long bike rides through the city, going to the movies by herself. You saw more that way, she explained to her parents. You didn't miss crucial bits of dialogue because your companion was busy making inane remarks. Her parents, themselves solitary individuals, didn't object. People– except for a selected handful– were noisy and messy. They knew that.
Chitra Banerjee Divakaruni (The Unknown Errors of Our Lives)
He convinced John Stainton to agree that there would be no CGI (computer-generated imagery) wildlife in the movie. We didn’t want to pretend to react to an animal in front of a green screen, and then have computer graphic technicians complete the shot later. That was how Hollywood would normally have done it, but that wasn’t an option for Steve. “All the animals have to be real,” he insisted to the executives at MGM. “I’m doing all of my own stunts. Otherwise, I am not interested.” I always believed that Steve would excel at anything he put his mind to, and a movie would be no different. The camera loved him. As talks ground on at MGM, we came up with a title: Crocodile Hunter: Collision Course. But mostly we had phone calls and meetings. The main sticking point was that no insurance company would touch us. No underwriter would write a policy for a project that required Steve to be working with real live crocodiles. As negotiations seemed to be grinding to a halt, we were all feeling frustrated. Steve looked around at John, Judi, and the others. He could see that everybody had gotten a bit stretched on all our various projects. He decided we needed a break. He didn’t lead us into the bush this time. Instead, Steve said a magic word. “Samoa.” “Sea snakes?” I asked. “Surfing,” he said.
Terri Irwin (Steve & Me)
Overused Settings: All too often, games borrow settings from one another or from common settings found in the movies, books, or television. A huge number of games are set in science fiction and fantasy worlds, especially the quasi-medieval, sword-and-sorcery fantasy inspired by J. R. R. Tolkien and Dungeons & Dragons, popular with the young people who used to be the primary—indeed, almost the only—market for computer games. But a more diverse audience plays games nowadays, and they want new worlds to play in. You should look beyond these hoary old staples of gaming.
Ernest Adams (Fundamentals of Game Design)
It is fun to be around really, really creative makers in the second half of the chessboard, to see what they can do, as individuals, with all of the empowering tools that have been enabled by the supernova. I met Tom Wujec in San Francisco at an event at the Exploratorium. We thought we had a lot in common and agreed to follow up on a Skype call. Wujec is a fellow at Autodesk and a global leader in 3-D design, engineering, and entertainment software. While his title sounds like a guy designing hubcaps for an auto parts company, the truth is that Autodesk is another of those really important companies few people know about—it builds the software that architects, auto and game designers, and film studios use to imagine and design buildings, cars, and movies on their computers. It is the Microsoft of design. Autodesk offers roughly 180 software tools used by some twenty million professional designers as well as more than two hundred million amateur designers, and each year those tools reduce more and more complexity to one touch. Wujec is an expert in business visualization—using design thinking to help groups solve wicked problems. When we first talked on the phone, he illustrated our conversation real-time on a shared digital whiteboard. I was awed. During our conversation, Wujec told me his favorite story of just how much the power of technology has transformed his work as a designer-maker.
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
Smart entrepreneurs have grabbed this opportunity with a vengeance. Now online lesson-plan marketplaces such as Gooru Learning, Teachers Pay Teachers, and Share My Lesson allow teachers who want to devote more of their time to other tasks the ability to purchase high-quality (and many lesser-quality) lesson plans, ready to go. With sensors, data, and A.I., we can begin, even today, testing for the learning efficacy of different lectures, styles, and more. And, because humans do a poor job of incorporating massive amounts of information to make iterative decisions, in the very near future, computers will start doing more and more of the lesson planning. They will write the basic lessons and learn what works and what doesn’t for specific students. Creative teachers will continue, though, to be incredibly valuable: they will learn how to steer and curate algorithmic and heuristically updated lesson creation in ways that computers could not necessarily imagine. All of this is, of course, a somewhat bittersweet development. Teaching is an idealistic profession. You probably remember a special teacher who shaped your life, encouraged your interests, and made school exciting. The movies and pop culture are filled with paeans to unselfish, underpaid teachers fighting the good fight and helping their charges. But it is becoming clearer that teaching, like many other white-collar jobs that have resisted robots, is something that robots can do—possibly, in structured curricula, better than humans can. The
Vivek Wadhwa (The Driver in the Driverless Car: How Our Technology Choices Will Create the Future)
It's like the guy in the Apollo 13 movie who says, 'Power is everything.' The kind of computers you're talking about, the ones that rival the human brain for processing nodes, consume on the order of four million watts of power. The chunk of meat in your head - which is not a computer, by the way - uses twenty watts. Not twenty million. Just twenty. Our brains are efficient thermodynamic systems, designed to help us produce valuable work from the potential energy around us in the world. Computers are simply extensions of our minds - tools we use that heighten that production value.
David Walton (The Genius Plague)
Writing was born as the maidservant of human consciousness, but it is increasingly becoming its master. Our computers have trouble understanding how Homo sapiens talks, feels and dreams. So we are teaching Homo sapiens to talk, feel and dream in the language of numbers, which can be understood by computers. And this is not the end of the story. The field of artificial intelligence is seeking to create a new kind of intelligence based solely on the binary script of computers. Science-fiction movies such as The Matrix and The Terminator tell of a day when the binary script throws off the yoke of humanity. When humans try to regain control of the rebellious script, it responds by attempting to wipe out the human race.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
In the wider world, we keep hectically busy and fill every free moment of our day with some form of diversion—work, computers, television, movies, radio, magazines, newspapers, sports, alcohol, drugs, parties. Perhaps we distract ourselves because looking at our lives confronts us with our lack of meaning, our unhappiness, and our loneliness—and with the difficulty, the fragility, and the unbelievable brevity of life.
Armand M. Nicholi Jr. (The Question of God: C.S. Lewis and Sigmund Freud Debate God, Love, Sex, and the Meaning of Life)
Yo, time, where are you? When 'Sorry' used to be atonement 'Progress' meant development 'I do' meant commitment 'Brands' meant advertisement 'TV' meant refreshment 'People' meant government 'Trees' meant environment 'Looks' weren't requirement 'Talent' meant achievement 'Outdoor' meant excitement 'Movies' meant Entertainment And 'Nice' used to be the complement Yo time, today, 'Apology' is just statement 'Social Service' is just department 'Politics' is just argument 'Love' is just arrangement 'J/Bail' is just judgment 'Computer' is just another instrument 'Family' is Adjustment as After all We live in our small apartment Don't give a damn, man, But don't jump in the deep dam, To be good was 90s dream To be good now is just a meme - Nice Guy
Bhavik Sarkhedi
Now I made myself scarce; I shut the door to my room and talked on the phone with friends or watched movies. My mother also disappeared, playing solitaire on the computer upstairs or reading in bed. I rarely went to her. You would think my father's behavior might have brought us together; in fact, it drove us apart.
Katharine Smyth (All the Lives We Ever Lived: Seeking Solace in Virginia Woolf)
Just think about the incredible transformation that took place in Steve’s life and career after Pixar. In 1983, Apple launched their computer Lisa, the last project Jobs worked on before he was let go. Jobs released Lisa with a nine-page ad in the New York Times spelling out the computer’s technical features. It was nine pages of geek talk nobody outside NASA was interested in. The computer bombed. When Jobs returned to the company after running Pixar, Apple became customer-centric, compelling, and clear in their communication. The first campaign he released went from nine pages in the New York Times to just two words on billboards all over America: Think Different. When Apple began filtering their communication to make it simple and relevant, they actually stopped featuring computers in most of their advertising. Instead, they understood their customers were all living, breathing heroes, and they tapped into their stories. They did this by (1) identifying what their customers wanted (to be seen and heard), (2) defining their customers’ challenge (that people didn’t recognize their hidden genius), and (3) offering their customers a tool they could use to express themselves (computers and smartphones). Each of these realizations are pillars in ancient storytelling and critical for connecting with customers. I’ll teach you about these three pillars and more in the coming chapters, but for now just realize the time Apple spent clarifying the role they play in their customers’ story is one of the primary factors responsible for their growth. Notice, though, the story of Apple isn’t about Apple; it’s about you. You’re the hero in the story, and they play a role more like Q in the James Bond movies. They are the guy you go see when you need a tool to help you win the day.
Donald Miller (Building a StoryBrand: Clarify Your Message So Customers Will Listen)
older movie they’d seen many times about a guy who finds out his life is not really a life, but some sort of computer-generated virtual reality, and he has to save the world from the tyranny of the machines.
Laura Ruby (The Shadow Cipher (York, #1))
It’s hard to explain how important Star Trek is to me. I think I went to my first Star Trek convention when I was fifteen. So to hear that Leonard Nimoy—Mr. Spock—was on the phone, I was not processing what he was saying. I could only focus on his amazing voice. I thought this was a phone call to see if he’d agree to do the part, but in his mind, he had already agreed to do it! He had one specific note on the script, which is that Mr. Spock doesn’t use contractions when he speaks. He says “cannot;” he doesn’t say “can’t.” And I remember just being chagrined that I hadn’t intervened and had allowed this to go on. I loved Spock so much, I used to sneak lines of Mr. Spock dialogue from the movies and TV shows into Big Bang Theory and give them to Sheldon. There’s an episode early on where Sheldon and Leonard are having a fight, and Penny asks, “Well, how do you feel?” And Sheldon replies, “I don’t understand the question.” That’s from the beginning of Star Trek IV where Spock has reunited with his mind and his body, and is being quizzed by a computer about his status. So Leonard Nimoy was just one of many fanboy moments. I once said to LeVar Burton, “If I could go back in time and tell my teenage self there would be a day where I would eventually talk to three crew members of the USS Enterprise, I’d fall over and die.
Jessica Radloff (The Big Bang Theory: The Definitive, Inside Story of the Epic Hit Series)
The reason why we can’t see our eyes moving with our own eyes is because our brains edit out the bits between the saccades—a process called saccadic suppression. Without it, we’d look at an object and it would be a blurry mess. What we perceive as vision is the director’s cut of a film, with your brain as the director, seamlessly stitching together the raw footage to make a coherent reality. Perception is the brain’s best guess at what the world actually looks like. Immense though the computing power of that fleshy mass sitting in the darkness of our skulls is, if we were to take in all the information in front of our eyes, our brains would surely explode.** Instead, our eyes sample bits and pieces of the world, and we fill in the blanks in our heads. This fact is fundamental to the way that cinema works. A film is typically 24 static images run together every second, which our brain sees as continuous fluid movement—that’s why it’s called a movie. The illusion of movement actually happens at more like 16 frames per second. At that speed, a film projection is indistinguishable from the real world, at least to us. It was the introduction of sound that set the standard of 24 frames per second with The Jazz Singer in 1927, the first film to have synchronized dialogue. The company
Adam Rutherford (The Complete Guide to Absolutely Everything (Abridged): Adventures in Math and Science)
Andrei avoided the internet as well and this evasion only added to his gloom. He loved music, especially old songs, and he loved movies, of all sorts. If he had the patience, sometimes he would read. While most of the pages he turned bored him to sleep, certain books with certain lines disarranged him. Some literature brought him to his feet, laughing and howling in his room. When the book was right, it was bliss and he wept. His room hushed with serenity and indebtedness. When he turned to his computer, however, or took out his phone, he would inevitably come across a viral trend or video that took the art he loved and turned it into a joke. The internet, in Andrei’s desperate eyes, managed to make fun of everything serious. And if one did not laugh, they were not intelligent. The internet could not be slowed and no protest to criticize its exploitation of art could be made because recreations of art hid perfectly under the veneer of mockery and was thus, impenetrable. It was easy to use Chopin’s ‘Sonata No. 2’ for a quick laugh, to reduce the ‘Funeral March’ to background music. It was a sneaky way for a digital creator to be considered an artist—and parodying the classics made them appear cleverer than the original artist. Meanwhile, Andrei’s body had healed playing Chopin alone in his apartment. He would frailly replay movie moments, too, that he later found the world edited and ripped apart with its cheap teeth. And everyone ate the internet’s crumbs. This cruel derision was impossible to escape. But enough jokes, memes, and glam over someone’s precious source of life would eventually make a sensitive body numb. And Andrei was afraid of that. He needed his fountain of hope unblemished. For this reason, he escaped the internet’s claws and only surrendered to it for e-mails, navigation, and the weather.
Kristian Ventura (A Happy Ghost)
The dream of Strong Artificial Intelligence—and more specifically the growing interest in the idea that a computer can become conscious and have first-person subjective experiences—has led to a cultural shift. Prophets like Kurzweil believe that we are much closer to cyberconsciousness and superintelligence than most observers acknowledge, while skeptics argue that current AI systems are still extremely primitive and that hopes of conscious machines are pipedreams. Who is right? This book does not attempt to address this question, but points out some philosophical problems and asks some philosophical questions about machine consciousness. One fundamental problem is that we do not understand human consciousness. Many in science and artificial intelligence assume that human consciousness is based on information or computations. Several writers have tried to tackle this assumption, most notably the British physicist Roger Penrose, whose controversial theory suggests that consciousness is based upon noncomputable quantum states in some of the tiniest structures in the brain, called microtubules. Other, perhaps less esoteric thinkers, like Duke’s Miguel Nicolelis and Harvard’s Leonid Perlovsky, are beginning to challenge the idea that the brain is computable. These scientists lead their fields in man-machine interfacing and computer science. The assumption of a computable brain allows artificial intelligence researchers to believe they will create artificial minds. However, despite assuming that the brain is a computational system—what philosopher Riccardo Manzotti calls “the computational stance”—neuroscience is still discovering that human consciousness is nothing like we think it is. For me this is where LSD enters the picture. It turns out that human consciousness is likely itself a form of hallucination. As I have said, it is a very useful hallucination, but a hallucination nonetheless. LSD and psychedelics may help reveal our normal everyday experience for the hallucination that it is. This insight has been argued about for centuries in philosophy in various forms. Immanuel Kant may have been first to articulate it in modern form when he called our perception of the world “synthetic.” The fundamental idea is that we do not have direct knowledge of the external world. This idea will be repeated often in this book, and you will have to get used to it. We only have knowledge of our brain’s creation of that world for us. In other words, what we see, hear, and subsequently think are like movies that our brain plays for us after the fact. These movies are based on perceptions that come into our senses from the external world, but they are still fictions of our brain’s creation. In fact, you might put the disclaimer “based on a true story” in front of each experience you have. I do not wish to imply that I believe in the homunculus argument—what philosopher Daniel Dennett describes as the “Cartesian Theater”—the hypothetical place in the mind where the self becomes aware of the world. I only wish to employ the metaphor to illustrate the idea that there is no direct relationship between the external world and your perception of it.
Andrew Smart (Beyond Zero and One: Machines, Psychedelics, and Consciousness)
I don't have a problem with AI generated content, I have a problem when it's rooted in fraud and deception. In fact, AI generated content could open up new horizons of human creativity - but only if practiced with conscience. For example, we could set up a whole new genre of AI generated material in every field of human endeavor. We could have AI generated movies, alongside human movies - we could have AI generated music, alongside human music - we could have AI generated poetry and literature, alongside human poetry and literature - and so on. The possibilities are endless - and all above board. This way we make AI a positive part of human existence, rather than facilitating the obliteration of everything human about human life.
Abhijit Naskar (Iman Insaniyat, Mazhab Muhabbat: Pani, Agua, Water, It's All One)
Today we immediately associate Los Angeles with movies, New York with finance, Silicon Valley with computers, Seattle with software, and the Raleigh-Durham area with medical research.
Enrico Moretti (The New Geography of Jobs)
And soon I was getting involved in one of the most amazing projects. Someone asked me to help design the digital part of the first hotel movie system, which was based on the very earliest VCRs. No one had VCRs then, of course. I was thinking, Oh my god! This is going to be incredible—designing movies for hotels! I couldn’t get over it. Their formula was this. They’d line up about six VCRs. Then they had a method of sending special TV channels to everybody’s room. They could play the movies on those channels. There was a filter in each room to block those channels. But the hotel clerk in the lobby could send a signal to unlock the filter in a particular room. Then the guest could watch the movie they ordered on their TV. Someone in the VCR room had to literally start the movie, but this was still a really cool system.
Steve Wozniak (iWoz: Computer Geek to Cult Icon)
The first video game based on a movie or television series is probably Mike Mayfield’s 1971 text-only game Star Trek, a strategy game about commanding the USS Enterprise against the Klingons. But Mayfield created the game as a hobbyist on a Sigma 7 minicomputer, a device that required as much space as several refrigerators. It hardly seemed to be at risk of becoming a commercial product.
Nick Montfort (Racing the Beam: The Atari Video Computer System (Platform Studies))
When the snowspeeder endures for two minutes in Star Wars: The Empire Strikes Back, it wins the temporary invulnerability of “the Force” and the Star Wars theme plays, as it does when the cartridge first starts up. This rare musical treat effectively draws a connection to the Star Wars movies and also works effectively in the game, making the period of invulnerability even more heightened. The theme plays as sound effects from the game continue, too, so that it is integrated into the experience of play rather than interrupting it.
Nick Montfort (Racing the Beam: The Atari Video Computer System (Platform Studies))