Digital Learning Quotes

We've searched our database for all the quotes and captions related to Digital Learning. Here they are:

All I ask is this: Do something. Try something. Speaking out, showing up, writing a letter, a check, a strongly worded e-mail. Pick a cause – there are few unworthy ones. And nudge yourself past the brink of tacit support to action. Once a month, once a year, or just once...Even just learning enough about a subject so you can speak against an opponent eloquently makes you an unusual personage. Start with that. Any one of you would have cried out, would have intervened, had you been in that crowd in Bashiqa. Well thanks to digital technology, you’re all in it now.
Joss Whedon
If you are on social media, and you are not learning, not laughing, not being inspired or not networking, then you are using it wrong.
Germany Kent
I hope you feel better about yourself. I hope you feel alive. I hope that good things happen to you, and I hope that when the inevitable bad things happen you can handle them and learn a lesson and move on. I hope you know you're not alone and I hope you spend plenty of time with your family and/or friends and I hope you write more and get a seven-figure book deal. I hope next year no more celebrities die and I hope you get an iPhone if you want one. Or maybe a pony. I hope someone writes a song for you on Valentines Day that's a bit like Hey There Delilah, and I hope they have a good singing voice, or at least one better than mine. I hope that you accept yourself the way you are, and figure out that losing 20 pounds isn't going to magically make you love yourself. I hope you read a lot. I hope you don't have to almost die to figure out how valuable life is. I hope you find the perfect nail polish/digital camera/home/life partner. I hope you stop being jealous of others. I hope you feel good, about yourself and the people around you and the world. I hope you eat heaps of salt and vinegar chips because they're the best kind. I hope you accomplish all your hopes & dreams & aspirations and are blissfully happy & get married to Edward Cullen/George Clooney/Megan Fox/Angelina Jolie (delete whichever are inappropriate) & ride a pretty white horse into the sunset & I hope it's all sweet and wonderful because you deserve it because you did well this year in the face of sparkly vampires/great evil/low self-esteem.
Steph Bowe
No one else knows exactly what the future holds for you, no one else knows what obstacles you've overcome to be where you are, so don't expect others to feel as passionate about your dreams as you do.
Germany Kent
These problems have been here so long that the only way I’ve been able to function at all is by learning to ignore them. Else I would be in a constant state of panic, unable to think or act constructively.
Mark Bowden (Worm: The First Digital World War)
Freedom of Speech doesn't justify online bullying. Words have power, be careful how you use them.
Germany Kent
She looks at her watch - a real one, with arms. Those digital ones came and went, thank God. When will people learn that just because you can make something doesn't mean you should?
Sara Gruen (Water for Elephants)
No child in my family watches holos before the age of twelve. We all have nature and nurture to shape us. She can watch other people’s opinions when she has opinions of her own, and no sooner. We’re not digital creatures. We’re flesh and blood. Better she learns that before the world finds her.
Pierce Brown (Morning Star (Red Rising, #3))
Humanity is on the verge of digital slavery at the hands of AI and biometric technologies. One way to prevent that is to develop inbuilt modules of deep feelings of love and compassion in the learning algorithms.
Amit Ray (Compassionate Artificial Superintelligence AI 5.0)
What is the value of libraries? Through lifelong learning, libraries can and do change lives, a point that cannot be overstated.
Michael E. Gorman (Our Enduring Values: Librarianship in the 21st Century)
Face-to-face conversation is the most human--and humanizing--thing we do. Fully present to one another, we learn to listen. It's where we develop the capacity for empathy. It's where we experience the joy of being heard, of being understood.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
Today, when routine cognitive tasks are digitized and automated, and multiple lifetimes worth of information are accessible at our fingertips (much of which rapidly becomes obsolete), the focus of education must shift.
Roger Spitz (The Definitive Guide to Thriving on Disruption: Volume III - Beta Your Life: Existence in a Disruptive World)
Learning agility means to learn, de-learn, and relearn all the times.
Pearl Zhu (Digital Agility: The Rocky Road from Doing Agile to Being Agile)
...rather than assuming that education is primarily about preparing for jobs and careers, what would it mean to think of education as a process of guiding kids' participation in public life more generally, a public life that includes social, recreational, and civic engagement.
Mizuko Ito (Living and Learning with New Media: Summary of Findings from the Digital Youth Project (John D. and Catherine T. MacArthur Foundation Reports on Digital Media and Learning))
Our world is now so complex, our technology and science so powerful, and our problems so global and interconnected that we have come to the limits of individual human intelligence and individual expertise.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
I had learned that a dexterous, opposable thumb stood among the hallmarks of human success. We had maintained, even exaggerated, this important flexibility of our primate forebears, while most mammals had sacrificed it in specializing their digits. Carnivores run, stab, and scratch. My cat may manipulate me psychologically, but he'll never type or play the piano.
Stephen Jay Gould (The Panda's Thumb: More Reflections in Natural History)
People over the age of thirty were born before the digital revolution really started. We’ve learned to use digital technology—laptops, cameras, personal digital assistants, the Internet—as adults, and it has been something like learning a foreign language. Most of us are okay, and some are even expert. We do e-mails and PowerPoint, surf the Internet, and feel we’re at the cutting edge. But compared to most people under thirty and certainly under twenty, we are fumbling amateurs. People of that age were born after the digital revolution began. They learned to speak digital as a mother tongue.
Ken Robinson (The Element: How Finding Your Passion Changes Everything)
Learning a new language, just like opening a new window, allows you to see the world with intimacy.
Pearl Zhu (Thinkingaire: 100 Game Changing Digital Mindsets to Compete for the Future)
Social tools leave a digital audit trail, documenting our learning journey—often an unfolding story—and leaving a path for others to follow.
Marcia Conner (The New Social Learning: A Guide to Transforming Organizations Through Social Media)
I learned electronics as a kid by messing around with old radios that were easy to tamper with because they were designed to be fixed.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Cloud is the digital wonderland of Internet of Things, powered by Artificial Intelligence and Big Data
Enamul Haque (Digital Transformation Through Cloud Computing: Developing a sustainable business strategy to eschew extinction)
To reclaim solitude we have to learn to experience a moment of boredom as a reason to turn inward, to defer going “elsewhere” at least some of the time.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
[[ ]] The story goes like this: Earth is captured by a technocapital singularity as renaissance rationalization and oceanic navigation lock into commoditization take-off. Logistically accelerating techno-economic interactivity crumbles social order in auto sophisticating machine runaway. As markets learn to manufacture intelligence, politics modernizes, upgrades paranoia, and tries to get a grip. The body count climbs through a series of globewars. Emergent Planetary Commercium trashes the Holy Roman Empire, the Napoleonic Continental System, the Second and Third Reich, and the Soviet International, cranking-up world disorder through compressing phases. Deregulation and the state arms-race each other into cyberspace. By the time soft-engineering slithers out of its box into yours, human security is lurching into crisis. Cloning, lateral genodata transfer, transversal replication, and cyberotics, flood in amongst a relapse onto bacterial sex. Neo-China arrives from the future. Hypersynthetic drugs click into digital voodoo. Retro-disease. Nanospasm.
Nick Land (Fanged Noumena: Collected Writings, 1987–2007)
Likewise, civilizations have throughout history marched blindly toward disaster, because humans are wired to believe that tomorrow will be much like today — it is unnatural for us to think that this way of life, this present moment, this order of things is not stable and permanent. Across the world today, our actions testify to our belief that we can go on like this forever, burning oil, poisoning the seas, killing off other species, pumping carbon into the air, ignoring the ominous silence of our coal mine canaries in favor of the unending robotic tweets of our new digital imaginarium. Yet the reality of global climate change is going to keep intruding on our fantasies of perpetual growth, permanent innovation and endless energy, just as the reality of mortality shocks our casual faith in permanence.
Roy Scranton (Learning to Die in the Anthropocene: Reflections on the End of a Civilization)
Every time we look at the clock, we must learn to feel a sense of urgency. We must learn to realize that “now” is happening and will very soon be gone. We must look at the digits on the display and be overcome with an urge to do something before those digits change. Before “now” slips through our fingers. We must look at the ink on the calendar and see an immediate opportunity to do something wonderful, incredible, or beautiful. It’s that simple. We need to change our thinking from “when the number changes” to “before the number changes”.
Dan Pearce (Single Dad Laughing: The Best of Year One)
In order for a digital neocortex to learn a new skill, it will still require many iterations of education, just as a biological neocortex does, but once a single digital neocortex somewhere and at some time learns something, it can share that knowledge with every other digital neocortex without delay. We can each have our own private neocortex extenders in the cloud, just as we have our own private stores of personal data today.
Ray Kurzweil (How to Create a Mind: The Secret of Human Thought Revealed)
To be successful in this field, you need to become a problem solver with good observation skills and a desire to create things. You never stop learning in this field. You face new challenges with every new project, many of which require innovative solutions that you must discover on your own.
William Vaughan (Digital Modeling)
When human beings acquired language, we learned not just how to listen but how to speak. When we gained literacy, we learned not just how to read but how to write. And as we move into an increasingly digital reality, we must learn not just how to use programs but how to make them. In the emerging highly programmed landscape ahead, you will either create the software or you will be the software. It’s really that simple: Program, or be programmed.
Douglas Rushkoff (Program or Be Programmed: Ten Commands for a Digital Age)
Before two years of age, human interaction and physical interaction with books and print are the best entry into the world of oral and written language and internalized knowledge, the building blocks of the later reading circuit.
Maryanne Wolf (Reader, Come Home: The Reading Brain in a Digital World)
I have learned the password of two of my neighbors’ wireless home networks, so you can use theirs if you like. Be a parasite on their network. Global digital parasitism is the new Trotskyism. Connect to anywhere in the world you like.
David Cronenberg (Consumed)
Anything you might want to accomplish—executing a project at work, getting a new job, learning a new skill, starting a business—requires finding and putting to use the right information. Your professional success and quality of life depend directly on your ability to manage information effectively. According to the New York Times, the average person’s daily consumption of information now adds up to a remarkable 34 gigabytes. A separate study cited by the Times estimates that we consume the equivalent of 174 full newspapers’ worth of content each and every day, five times higher than in 1986. Instead of empowering us, this deluge of information often overwhelms us. Information Overload has become Information Exhaustion, taxing our mental resources and leaving us constantly anxious that we’re forgetting something.
Tiago Forte (Building a Second Brain: A Proven Method to Organize Your Digital Life and Unlock Your Creative Potential)
A programmable mind embraces mental agility, to practice “de-learning” and “relearning” all the time.
Pearl Zhu (Thinkingaire: 100 Game Changing Digital Mindsets to Compete for the Future)
An adaptive mind has better learning capability.
Pearl Zhu (Thinkingaire: 100 Game Changing Digital Mindsets to Compete for the Future)
people like stories because they are good, not because they are true.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
I highly recommend treating a digital note as if the space were limited.
Sönke Ahrens (How to Take Smart Notes: One Simple Technique to Boost Writing, Learning and Thinking – for Students, Academics and Nonfiction Book Writers)
You need not leave your room. Remain sitting at your table and listen. You need not even listen, simply wait, just learn to become quiet, and still, and solitary. The world will freely offer itself to you to be unmasked.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
Unfortunately, parents are increasingly opting for digital companions over living, breathing ones, but I beg you, put the tablets, game consoles, and televisions away, and arrange play dates with a variety of real, live children.
Jessica Lahey (The Gift of Failure: How the Best Parents Learn to Let Go So Their Children Can Succeed)
human intelligence and creativity, today more than ever, are tied to connecting—synchronizing—people, tools, texts, digital and social media, virtual spaces, and real spaces in the right ways, in ways that make us Minds and not just minds, but also better people in a better world.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
It’s that time of the month again… As we head into those dog days of July, Mike would like to thank those who helped him get the toys he needs to enjoy his summer. Thanks to you, he bought a new bass boat, which we don’t need; a condo in Florida, where we don’t spend any time; and a $2,000 set of golf clubs…which he had been using as an alibi to cover the fact that he has been remorselessly banging his secretary, Beebee, for the last six months. Tragically, I didn’t suspect a thing. Right up until the moment Cherry Glick inadvertently delivered a lovely floral arrangement to our house, apparently intended to celebrate the anniversary of the first time Beebee provided Mike with her special brand of administrative support. Sadly, even after this damning evidence-and seeing Mike ram his tongue down Beebee’s throat-I didn’t quite grasp the depth of his deception. It took reading the contents of his secret e-mail account before I was convinced. I learned that cheap motel rooms have been christened. Office equipment has been sullied. And you should think twice before calling Mike’s work number during his lunch hour, because there’s a good chance that Beebee will be under his desk “assisting” him. I must confess that I was disappointed by Mike’s over-wrought prose, but I now understand why he insisted that I write this newsletter every month. I would say this is a case of those who can write, do; and those who can’t do Taxes. And since seeing is believing, I could have included a Hustler-ready pictorial layout of the photos of Mike’s work wife. However, I believe distributing these photos would be a felony. The camera work isn’t half-bad, though. It’s good to see that Mike has some skill in the bedroom, even if it’s just photography. And what does Beebee have to say for herself? Not Much. In fact, attempts to interview her for this issue were met with spaced-out indifference. 
I’ve had a hard time not blaming the conniving, store-bought-cleavage-baring Oompa Loompa-skinned adulteress for her part in the destruction of my marriage. But considering what she’s getting, Beebee has my sympathies. I blame Mike. I blame Mike for not honoring the vows he made to me. I blame Mike for not being strong enough to pass up the temptation of readily available extramarital sex. And I blame Mike for not being enough of a man to tell me he was having an affair, instead letting me find out via a misdirected floral delivery. I hope you have enjoyed this new digital version of the Terwilliger and Associates Newsletter. Next month’s newsletter will not be written by me as I will be divorcing Mike’s cheating ass. As soon as I press send on this e-mail, I’m hiring Sammy “the Shark” Shackleton. I don’t know why they call him “the Shark” but I did hear about a case where Sammy got a woman her soon-to-be ex-husband’s house, his car, his boat and his manhood in a mayonnaise jar. And one last thing, believe me when I say I will not be letting Mike off with “irreconcilable differences” in divorce court. Mike Terwilliger will own up to being the faithless, loveless, spineless, useless, dickless wonder he is.
Molly Harper (And One Last Thing ...)
But these conversations require time and space, and we say we’re too busy. Distracted at our dinner tables and living rooms, at our business meetings, and on our streets, we find traces of a new “silent spring”—a term Rachel Carson coined when we were ready to see that with technological change had come an assault on our environment. Now, we have arrived at another moment of recognition. This time, technology is implicated in an assault on empathy. We have learned that even a silent phone inhibits conversations that matter.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
The most important lesson in photography is learning to photograph what you love.
Tony Northrup (Tony Northrup's DSLR Book: How to Create Stunning Digital Photography)
You will be learning along with the students, and your status as a learning expert will provide them with the support they need so that their work is the best it can be.
Larissa Pahomov (Authentic Learning in the Digital Age: Engaging Students Through Inquiry)
Being a digital native may have long-term consequences related to learning how to read.
Jason Merkoski (Burning the Page: The eBook Revolution and the Future of Reading)
Digital disruption and the impact of coronavirus will bring the 2030 technological advancement earlier than predicted
Enamul Haque (The Ultimate Modern Guide to Artificial Intelligence)
The era of artificial intelligence has arrived. You, who only felt far from artificial intelligence, and the growing dream trees, are now inseparable from artificial intelligence.
Enamul Haque (The Ultimate Modern Guide to Artificial Intelligence: Including Machine Learning, Deep Learning, IoT, Data Science, Robotics, The Future of Jobs, Required Upskilling and Intelligent Industries)
Technology continues to be used to change the way we experience museums and the ways we learn and absorb information.
Mackenzie Finklea (Beyond the Halls: An Insider's Guide to Loving Museums)
Case applied the two lessons he had learned at Procter & Gamble: make a product simple and launch it with free samples.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory. Information theory is concerned with transmitting digital information in the presence of noise (which usually prevents all the information from getting through) and how to compensate for that. In 1949, he wrote the first article about programming a computer to play chess, and in 1952 he designed a mechanical mouse controlled by relays that could learn its way around a maze. Shannon was also well known at Bell Labs for riding a unicycle and juggling simultaneously.
Charles Petzold (Code: The Hidden Language of Computer Hardware and Software)
When you enter the professional world, the demands on your notetaking change completely. The entire approach to notetaking you learned in school is not only obsolete, it’s the exact opposite of what you need. In the professional world: It’s not at all clear what you should be taking notes on. No one tells you when or how your notes will be used. The “test” can come at any time and in any form. You’re allowed to reference your notes at any time, provided you took them in the first place. You are expected to take action on your notes, not just regurgitate them.
Tiago Forte (Building a Second Brain: A Proven Method to Organize Your Digital Life and Unlock Your Creative Potential)
I'm still learning how to express myself on the platform but the digital distance and the human values that i learned in my pre-smartphone education don't seem to match together very well sometimes.
Alain Bremond-Torrent ("Darling, it's not only about sex")
He also had a trait, so common among innovators, that was charmingly described by his biographer Andrew Hodges: “Alan was slow to learn that indistinct line that separated initiative from disobedience.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In the digital domain trust is now important not only because we really need to know how to trust people and whom to trust but because we need others to trust us and have to learn how to help them do so.
David Amerland (The Tribe That Discovered Trust: How Trust is Created, Propagated, Lost and Regained in Commercial Interactions)
They said that he had learned to read by age two. That he was fluent in Latin, Ancient Greek, German, English, and French, that he could divide two eight-digit numbers in his head by the time he was six,
Benjamín Labatut (The MANIAC)
School is often based not on problem solving, which perforce involves actions and goals, but on learning information, facts, and formulas that one has read about in texts or heard about in lectures. It is not surprising, then, that research has long shown that a student’s doing well in school, in terms of grades and tests, does not correlate with being able to solve problems in the areas in which the student has been taught (e.g., math, civics, physics).
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
…the act of reading is a special place in which human beings are freed from themselves to pass over to others and, in so doing, learn what it means to be another person with aspirations, doubts, and emotions that they might otherwise never have known.
Maryanne Wolf (Reader, Come Home: The Reading Brain in a Digital World)
We must learn to see the full picture, and not just the treats before our eyes. Our trendy gadgets, such as smartphones and tablets, have given us new access to the world. We regularly communicate with people we would never even have been aware of before the networked age. We can find information about almost anything at any time. But we have learned how much our gadgets and our idealistically motivated digital networks are being used to spy on us by ultrapowerful, remote organizations. We are being dissected more than we dissect.
Jaron Lanier (Who Owns the Future?)
As Page puts it, “Good ideas are always crazy until they’re not.” It’s a principle he’s tried to apply at Google. When Page and Sergey Brin began wondering aloud about developing ways to search the text inside of books, all of the experts they consulted said it would be impossible to digitize every book. The Google cofounders decided to run the numbers and see if it was actually physically possible to scan the books in a reasonable amount of time. They concluded it was, and Google has since scanned millions of books. “I’ve learned that your intuition about things you don’t know that much about isn’t very good,” Page said. “The way Elon talks about this is that you always need to start with the first principles of a problem. What are the physics of it? How much time will it take? How much will it cost? How much cheaper can I make it? There’s this level of engineering and physics that you need to make judgments about what’s possible and interesting. Elon is unusual in that he knows that, and he also knows business and organization and leadership and governmental issues.
Ashlee Vance (Elon Musk: Inventing the Future)
We had to learn their vocabularies in order to be able to run their problems. I could switch my vocabulary and speak highly technical for the programmers, and then tell the same things to the managers a few hours later but with a totally different vocabulary.” Innovation requires articulation.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
In family conversation, much of the work is done as children learn they are in a place they can come back to, tomorrow and tomorrow. When digital media encourage us to edit ourselves until we have said the “right thing,” we can lose sight of the important thing: Relationships deepen not because we necessarily say anything in particular but because we are invested enough to show up for another conversation. In family conversations, children learn that what can matter most is not the information shared but the relationships sustained. It is hard to sustain those relationships if you are on your phone.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
In recent years, psychologists have learned more about how creative ideas come from the reveries of solitude. When we let our minds wander, we set our brains free. Our brains are most productive when there is no demand that they be reactive. For some, this goes against cultural expectations. American culture tends to worship sociality. We have wanted to believe that we are our most creative during “brainstorming” and “groupthink” sessions. But this turns out not to be the case. New ideas are more likely to emerge from people thinking on their own. Solitude is where we learn to trust our imaginations.
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
Digital technologies will no more solve the so-called ‘crisis in education’ than airbags will stop drivers from having accidents.  What digital technologies can do, however, is to dramatically accelerate the changes in behaviours, values, and actions, which then transform the way we learn and our capacity to learn.
David Price (Open: How We’ll Work, Live and Learn In The Future)
Meaningful learning in a community requires both participation and reification to be present and in interplay. Sharing artifacts without engaging in discussions and activities around them impairs the ability to negotiate the meaning of what is being shared. Interacting without producing artifacts makes learning depend on individual interpretation and memory and can limit its depth, extent, and impact. Both participation and reification are necessary. Sometimes one process may dominate the other, or the two processes may not be well integrated. The challenge of this polarity is for communities to successfully cycle between the two.
Etienne Wenger (Digital Habitats: stewarding technology for communities)
Global warming, environmental degradation, global flows of economic speculation and risk taking, overpopulation, global debt, new viruses, terrorism and warfare, and political polarization are killing us. Dealing with big questions takes a long-term view, cooperation, delayed gratification, and deep learning that crosses traditional silos of knowledge production. All of these are in short supply today. In the United States and much of the developed world, decisions are based on short-term interests and gain (e.g., stock prices or election cycles), as well as pandering to ignorance. Such decisions make the world worse, not better, and bring Armageddon ever closer.
James Paul Gee (The Anti-Education Era: Creating Smarter Students through Digital Learning)
Instead of the education system banning ChatGPT from schools, the focus should be geared towards educating students on how to properly use AI tools. Schools should be at the forefront of innovation and technological progress NOT a place for preserving obsolete learning methods and clinging onto archaic practices that are no longer relevant for the world we live in.
Nicky Verd (Disrupt Yourself Or Be Disrupted)
Instead of learning from one mind at a time, the search engine learns from the collective human mind, all at once. Every time an individual searches for something, and finds an answer, this leaves a faint, lingering trace as to where (and what) some fragment of meaning is. The fragments accumulate and, at a certain point, as Turing put it in 1948, “the machine would have ‘grown up.’”
George Dyson (Turing's Cathedral: The Origins of the Digital Universe)
Ben developed a similarly Byzantine system for memorizing binary digits, which enables him to convert any ten-digit-long string of ones and zeros into a unique image. That’s 2¹⁰, or 1,024, images set aside for binaries. When he sees 1101001001, he immediately sees it as a single chunk, an image of a card game. When he sees 0111011010, he instantaneously conjures up an image of a cinema. In international memory competitions, mental athletes are given sheets of 1,200 binary digits, thirty to a row, forty rows to a page. Ben turns each row of thirty digits into a single image. The number 110110100000111011010001011010, for example, is a muscleman putting a fish in a tin. At the time, Ben held the world record for having learned 3,705 random ones and zeroes in half an hour.
Joshua Foer (Moonwalking with Einstein: The Art and Science of Remembering Everything)
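The chunking arithmetic Foer describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not Ben's actual system: the image lookup is hypothetical, and only the math (each 10-digit chunk indexes one of 2^10 = 1,024 images, so a 30-digit row becomes three image indices) comes from the passage.

```python
def binary_chunks(digits: str, size: int = 10):
    """Split a binary string into fixed-size chunks (10 digits each)."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

def chunk_to_index(chunk: str) -> int:
    """Interpret a 10-digit binary chunk as an index into 1,024 images."""
    return int(chunk, 2)

# One 30-digit competition row from the passage.
row = "110110100000111011010001011010"
indices = [chunk_to_index(c) for c in binary_chunks(row)]

# Three chunks -> three image indices, each in the range 0..1023,
# which the mnemonist would combine into one composite scene.
assert len(indices) == 3
assert all(0 <= i < 2**10 for i in indices)
```

The point of the design is compression: instead of thirty independent symbols, the memorizer holds three pre-learned images, which is why a fixed table of 1,024 entries is enough.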
• High-quality and affordable childcare and eldercare
• Paid family and medical leave for women and men
• A right to request part-time or flexible work
• Investment in early education comparable to our investment in elementary and secondary education
• Comprehensive job protection for pregnant workers
• Higher wages and training for paid caregivers
• Community support structures to allow elders to live at home longer
• Legal protections against discrimination for part-time workers and flexible workers
• Better enforcement of existing laws against age discrimination
• Financial and social support for single parents
• Reform of elementary and secondary school schedules to meet the needs of a digital rather than an agricultural economy and to take advantage of what we now know about how children learn
Anne-Marie Slaughter (Unfinished Business: Women Men Work Family)
Wences had first learned about Bitcoin in late 2011 from a friend back in Argentina who thought it might give Wences a quicker and cheaper way to send money back home. Wences’s background in financial technology gave him a natural appreciation for the concept. After quietly watching and playing with it for some time, Wences gave $100,000 of his own money to two high-level hackers he knew in eastern Europe and asked them to do their best to hack the Bitcoin protocol. He was especially curious about whether they could counterfeit Bitcoins or spend the coins held in other people’s wallets—the most damaging possible flaw. At the end of the summer, the hackers asked Wences for more time and money. Wences ended up giving them $150,000 more, sent in Bitcoins. In October they concluded that the basic Bitcoin protocol was unbreakable, even if some of the big companies holding Bitcoins were not.
Nathaniel Popper (Digital Gold: Bitcoin and the Inside Story of the Misfits and Millionaires Trying to Reinvent Money)
The meaning of a 3-star review, for example, can be interpreted differently for two users based on their average rating history. Standards also vary among countries and types of users (e.g., e-book readers versus physical book readers). Readers of physical books, for example, rate negatively when there are delivery delays/complications or production-quality issues (i.e., paper quality), which don’t affect e-book readers who receive a digital copy on-demand.
Oliver Theobald (Machine Learning: Make Your Own Recommender System (Machine Learning with Python for Beginners Book Series 3))
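The adjustment Theobald alludes to is usually handled by mean-centering each user's ratings. The sketch below is not his code; the user names and ratings are invented for illustration. It only demonstrates the idea from the quote: the same 3-star review lands below a generous rater's personal average and above a harsh rater's.

```python
from collections import defaultdict

# Invented (user, item) -> star-rating data for illustration.
ratings = {
    ("alice", "book_a"): 3,  # alice usually rates high, so 3 is negative for her
    ("alice", "book_b"): 5,
    ("bob", "book_a"): 3,    # bob usually rates low, so 3 is positive for him
    ("bob", "book_d"): 1,
}

def mean_centered(ratings):
    """Subtract each user's average rating from their individual ratings."""
    per_user = defaultdict(list)
    for (user, _), r in ratings.items():
        per_user[user].append(r)
    means = {u: sum(rs) / len(rs) for u, rs in per_user.items()}
    return {(u, item): r - means[u] for (u, item), r in ratings.items()}

centered = mean_centered(ratings)

# The identical 3-star rating centers negative for alice, positive for bob.
assert centered[("alice", "book_a")] < 0 < centered[("bob", "book_a")]
```

Recommender systems commonly apply this normalization before computing similarities, so that a user's deviation from their own baseline, rather than the raw star value, drives the comparison.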
Like gamblers, baseball fans and television networks, fishermen are enamored of statistics. The adoration of statistics is a trait so deeply embedded in their nature that even those rarefied anglers the disciples of Jesus couldn't resist backing their yarns with arithmetic: when the resurrected Christ appears on the morning shore of the Sea of Galilee and directs his forlorn and skunked disciples to the famous catch of John 21, we learn that the net contained not "a boatload" of fish, nor "about a hundred and a half," nor "over a gross," but precisely "a hundred and fifty three." This is, it seems to me, one of the most remarkable statistics ever computed. Consider the circumstances: this is after the Crucifixion and the Resurrection; Jesus is standing on the beach newly risen from the dead, and it is only the third time the disciples have seen him since the nightmare of Calvary. And yet we learn that in the net there were "great fishes" numbering precisely "a hundred and fifty three." How was this digit discovered? Mustn't it have happened thus: upon hauling the net to shore, the disciples squatted down by that immense, writhing fish pile and started tossing them into a second pile, painstakingly counting "one, two, three, four, five, six, seven... " all the way up to a hundred and fifty three, while the newly risen Lord of Creation, the Sustainer of all their beings, He who died for them and for Whom they would gladly die, stood waiting, ignored, till the heap of fish was quantified. Such is the fisherman's compulsion toward rudimentary mathematics! ....Concerning those disciples huddled over the pile of fish, another possibility occurs to me: perhaps they paid the fish no heed. Perhaps they stood in a circle adoring their Lord while He, the All-Curious Son of His All-Knowing Dad, counted them all Himself!
David James Duncan (The River Why)
Instead of thinking about addiction, it makes sense to confront this reality: We are faced with technologies to which we are extremely vulnerable and we don’t always respect that fact. The path forward is to learn more about our vulnerabilities. Then, we can design technology and the environments in which we use them with these insights in mind. For example, since we know that multitasking is seductive but not helpful to learning, it’s up to us to promote “unitasking.”
Sherry Turkle (Reclaiming Conversation: The Power of Talk in a Digital Age)
What was going on here was that like so many people in contemporary society, along the way to gaining their superb educations, and their shiny opportunities, they had absorbed the wrong lessons. They had mastered formulas in calculus and chemistry. They had read great books and learned world history and become fluent in foreign languages. But they had never formally been taught how to maximize their brains' potential or how to find meaning and happiness. Armed with iPhones and personal digital assistants, they had multitasked their way through a storm of resume-building experiences, often at the expense of actual ones. In their pursuit of high achievement, they had isolated themselves from their peers and loved ones and thus compromised the very support systems they so ardently needed. Repeatedly, I noticed these patterns in my own students, who often broke down under the tyranny of expectations we place on ourselves and those around us.
Shawn Achor
Fortunately, there’s a simple practice that can help you sidestep these inconveniences and make it much easier to regularly enjoy rich phone conversations. I learned it from a technology executive in Silicon Valley who innovated a novel strategy for supporting high-quality interaction with friends and family: he tells them that he’s always available to talk on the phone at 5:30 p.m. on weekdays. There’s no need to schedule a conversation or let him know when you plan to call—just dial him up.
Cal Newport (Digital Minimalism: Choosing a Focused Life in a Noisy World)
Access doesn’t automatically come with an ability to use the Web well. We aren’t suddenly self-directed, organized, and literate enough to make sense of all the people and information online — or savvy enough to connect and build relationships with others in safe, ethical, and effective ways. Access doesn’t grant the ability to stay on task when we need to get something done. No matter how often we dub our kids “digital natives,” the fact is they can still use our help to do those things and more if they are to thrive in the abundance of their times.
Will Richardson (Why School?: How Education Must Change When Learning and Information Are Everywhere)
Perhaps the most exasperating cliche is about children being forced to memorize, not think. But memorization is not an abomination in itself, though the mnemic pressure on our species has dropped. Memorization is, de facto, exercise for the mind. Neuroscience shows an active hippocampus stimulates cerebral activity. We have often observed how the most profound and creative pupils are those who know the most things, though their usefulness is not always apparent. No question is more insinuatingly stupid than 'What good will it do to me?' In certain teaching contexts, it is not wrong to ask pupils to memorize. While it is not the only goal, the idea that memorizing is useless since information is available online is also wrong and falsely self-evident. It denotes a misunderstanding of how our mind works. Our brains are not computers, and our memory can't be replaced by external HDDs. Each piece of info we memorize is integrated, albeit minimally, as living memory is active, while digital memory is passive. Strange as some may find it, memorizing can stimulate thinking as few other things can. What impairs thinking is the lack of the habit to reflect, the custom of stopping our mind's flow to go back to what we've learned.
Doru Castaian
As data analytics, superfast computers, digital technology, and other breakthroughs enabled by science play a bigger and bigger role in informing medical decision-making, science has carved out a new and powerful role as the steadfast partner of the business of medicine—which is also enjoying a new day in the sun. It may surprise some people to learn that the business of medicine is not a twenty-first-century invention. Health care has always been a business, as far back as the days when Hippocrates and his peers practiced medicine. Whether it was three goats, a gold coin, or a bank note, some type of payment was typically exchanged for medical services, and institutions of government or learning funded research. However, since the 1970s, business has been the major force directing the practice of medicine. Together, the business and science of medicine are the new kids on the block—the bright, shiny new things. Ideally, as I’ve suggested, the art, science, and business of medicine would work together in a harmonious partnership, each upholding the other and contributing all it has to offer to the whole. And sometimes (as we’ll find in later chapters) this partnership works well. When it does, the results are magnificent for patients and doctors, not to mention for scientists and investors.
Halee Fischer-Wright (Back To Balance: The Art, Science, and Business of Medicine)
Many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They extoll the virtues of learning Latin, but they are clueless about how to write an algorithm or tell BASIC from C++, Python from Pascal. They consider people who don’t know Hamlet from Macbeth to be Philistines, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a capacitor, or an integral and a differential equation. These concepts may seem difficult. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Jobs also attacked America’s education system, saying that it was hopelessly antiquated and crippled by union work rules. Until the teachers’ unions were broken, there was almost no hope for education reform. Teachers should be treated as professionals, he said, not as industrial assembly-line workers. Principals should be able to hire and fire them based on how good they were. Schools should be staying open until at least 6 p.m. and be in session eleven months of the year. It was absurd, he added, that American classrooms were still based on teachers standing at a board and using textbooks. All books, learning materials, and assessments should be digital and interactive, tailored to each student and providing feedback in real time.
Walter Isaacson (Steve Jobs)
In this digital age, the claims made about the power of a mere book seem almost preposterous. Yet Christians are asked to believe that God, working through Scripture, can do mighty works. Consider the biblical call to grow in love. How can we do this? A common answer is that we become more adept at loving by loving — by doing acts of love — and there is real wisdom in that response. Yet we know that merely trying to love does not lead to love. Love is a cultivated disposition that flourishes when our minds are trained to honor loving thoughts and our bodies are trained toward loving acts. Lectio divina can help us grow in love by experiencing God's true and healing love as we meditate on his Word and by learning to pay attention to the roots of love — our thoughts.
James C. Wilhoit (Discovering Lectio Divina: Bringing Scripture into Ordinary Life)
In fact, as these companies offered more and more (simply because they could), they found that demand actually followed supply. The act of vastly increasing choice seemed to unlock demand for that choice. Whether it was latent demand for niche goods that was already there or a creation of new demand, we don't yet know. But what we do know is that for the companies for which we have the most complete data - Netflix, Amazon, Rhapsody - sales of products not offered by their bricks-and-mortar competitors amounted to between a quarter and nearly half of total revenues - and that percentage is rising each year. In other words, the fastest-growing part of their businesses is sales of products that aren't available in traditional, physical retail stores at all. These infinite-shelf-space businesses have effectively learned a lesson in new math: A very, very big number (the products in the Tail) multiplied by a relatively small number (the sales of each) is still equal to a very, very big number. And, again, that very, very big number is only getting bigger. What's more, these millions of fringe sales are an efficient, cost-effective business. With no shelf space to pay for - and in the case of purely digital services like iTunes, no manufacturing costs and hardly any distribution fees - a niche product sold is just another sale, with the same (or better) margins as a hit. For the first time in history, hits and niches are on equal economic footing, both just entries in a database called up on demand, both equally worthy of being carried. Suddenly, popularity no longer has a monopoly on profitability.
Chris Anderson (The Long Tail: Why the Future of Business is Selling Less of More)
Learn techniques to get real results. Be aware: the landscape of digital marketing is always changing. Gurus, podcasters, and bloggers declare a tool or strategy hot one week, only for it to die the next. The truth is that digital marketing is less about “digital” and more about “marketing,” because at its core digital marketing has been around for ages and its foundations are already established. In the digital marketplace, our goal is to eliminate confusion about how strategies work and how they can be used to grow your business. Email marketing, digital advertising, and search engine optimization have each been declared dead at some point. Here, we are all about the basics. As you will see in this guide, these 8 main branches of digital marketing will be important for your business to grow today, tomorrow, and in the years to come.
digitalmarkett1
In certain young people today…I notice what I find increasingly troubling: a cold-blooded grasping, a hunger to take and take and take, but never give; a massive sense of entitlement; an inability to show gratitude; an ease with dishonesty and pretension and selfishness that is couched in the language of self-care; an expectation always to be helped and rewarded no matter whether deserving or not; language that is slick and sleek but with little emotional intelligence; an astonishing level of self-absorption; an unrealistic expectation of puritanism from others; an over-inflated sense of ability, or of talent where there is any at all; an inability to apologize, truly and fully, without justifications; a passionate performance of virtue that is well executed in the public space of Twitter but not in the intimate space of friendship. I find it obscene. People who ask you to ‘educate’ yourself while not having actually read any books themselves, while not being able to intelligently defend their own ideological positions, because by ‘educate,’ they actually mean ‘parrot what I say, flatten all nuance, wish away complexity.’ People who wield the words ‘violence’ and ‘weaponize’ like tarnished pitchforks. People who depend on obfuscation, who have no compassion for anybody genuinely curious or confused. Ask them a question and you are told that the answer is to repeat a mantra. Ask again for clarity and be accused of violence. And so we have a generation of young people on social media so terrified of having the wrong opinions that they have robbed themselves of the opportunity to think and to learn and to grow.
Chimamanda Ngozi Adichie
Transhumanism is Terrorism (The Sonnet) Intelligence comes easy, accountability not so much, Yet intelligence is complex, accountability is simple. Technology comes easy, transformation not so much, Yet technology is complicated, transformation is simple. In olden days there were just nutters of fundamentalism, Today there are nutters of nationalism and transhumanism. Some are obsessed with land, others with digital avatars, While humanity battles age-old crises like starvationism. When too much logic, coldness and pomposity set in, Common sense humanity goes out of the window. Once upon a time religion was the opium of all people, Today transhumanism and singularity are opium of the shallow. To replace the sky god with a computer god isn't advancement. Real advancement is when nobody suffers from scarcity of sustenance.
Abhijit Naskar (Amantes Assemble: 100 Sonnets of Servant Sultans)
It is important none the less that our remotest identifiable ancestors lived in trees because what survived in the next phase of evolution were genetic strains best suited to the special uncertainties and accidental challenges of the forest. That environment put a premium on the capacity to learn. Those survived whose genetic inheritance could respond and adapt to the surprising, sudden danger of deep shade, confused visual patterns and treacherous handholds. Strains prone to accident in such conditions were wiped out. Among those that prospered (genetically speaking) were some species with long digits which were to develop into fingers and, eventually, the oppositional thumb, and other forerunners of the apes already embarked upon an evolution towards three-dimensional vision and the diminution of the importance of the sense of smell.
J.M. Roberts (The Penguin History of the World)
Until now. You and I are a mis-Match, Ellie, because I hacked into your servers to manipulate our results.” “Rubbish,” Ellie said, secretly balking at the notion. She folded her arms indignantly. “Our servers are more secure than almost every major international company across the world. We receive so many hacking attempts, yet no one gets in. We have the best software and team money can buy to protect us against people like you.” “You’re right about some of that. But what your system didn’t take into account was your own vanity. Do you remember receiving an email some time ago with the subject ‘Businesswoman of the Year Award’? You couldn’t help but open it.” Ellie vaguely remembered reading the email as it had been sent to her private account, which only a few people had knowledge of. “Attached to it was a link you clicked on and that opened to nothing, didn’t it?” Matthew continued. “Well, it wasn’t nothing to me, because your click released a tiny, undetectable piece of tailor-made malware that allowed me to remotely access your network and work my way around your files. Everything you had access to, I had access to. Then I simply replicated my strand of DNA to mirror image yours, sat back and waited for you to get in touch. That’s why I came for a job interview, to learn a little more about the programming and systems you use. Please thank your head of personnel for leaving me alone in the room for a few moments with her laptop while she searched for a working camera to take my head shot. That was a huge help in accessing your network. Oh, and tell her to frisk interviewees for lens deflectors next time—they’re pocket-sized gadgets that render digital cameras useless.
John Marrs (The One)
The same thing, notes Brynjolfsson, happened 120 years ago, in the Second Industrial Revolution, when electrification—the supernova of its day—was introduced. Old factories did not just have to be electrified to achieve the productivity boosts; they had to be redesigned, along with all business processes. It took thirty years for one generation of managers and workers to retire and for a new generation to emerge to get the full productivity benefits of that new power source. A December 2015 study by the McKinsey Global Institute on American industry found a “considerable gap between the most digitized sectors and the rest of the economy over time and [found] that despite a massive rush of adoption, most sectors have barely closed that gap over the past decade … Because the less digitized sectors are some of the largest in terms of GDP contribution and employment, we [found] that the US economy as a whole is only reaching 18 percent of its digital potential … The United States will need to adapt its institutions and training pathways to help workers acquire relevant skills and navigate this period of transition and churn.” The supernova is a new power source, and it will take some time for society to reconfigure itself to absorb its full potential. As that happens, I believe that Brynjolfsson will be proved right and we will start to see the benefits—a broad range of new discoveries around health, learning, urban planning, transportation, innovation, and commerce—that will drive growth. That debate is for economists, though, and beyond the scope of this book, but I will be eager to see how it plays out. What is absolutely clear right now is that while the supernova may not have made our economies measurably more productive yet, it is clearly making all forms of technology, and therefore individuals, companies, ideas, machines, and groups, more powerful—more able to shape the world around them in unprecedented ways with less effort than ever before. 
If you want to be a maker, a starter-upper, an inventor, or an innovator, this is your time. By leveraging the supernova you can do so much more now with so little. As Tom Goodwin, senior vice president of strategy and innovation at Havas Media, observed in a March 3, 2015, essay on TechCrunch.com: “Uber, the world’s largest taxi company, owns no vehicles. Facebook, the world’s most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world’s largest accommodation provider, owns no real estate. Something interesting is happening.
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
There is nothing more dangerous to contemplate than World War III. It is worth considering whether part of the danger may not be intrinsic in the unguarded use of learning machines. Again and again I have heard the statement that learning machines cannot subject us to any new dangers, because we can turn them off when we feel like it. But can we? To turn a machine off effectively, we must be in possession of information as to whether the danger point has come. The mere fact that we have made the machine does not guarantee that we shall have the proper information to do this. This is already implicit in the statement that the checker-playing machine can defeat the man who has programmed it, and this after a very limited time of working in. Moreover, the very speed of operation of modern digital machines stands in the way of our ability to perceive and think through the indications of danger.
Norbert Wiener (Cybernetics or Control and Communication in the Animal and the Machine, Reissue of the 1961 second edition)
So which theory did Lagos believe in? The relativist or the universalist?" "He did not seem to think there was much of a difference. In the end, they are both somewhat mystical. Lagos believed that both schools of thought had essentially arrived at the same place by different lines of reasoning." "But it seems to me there is a key difference," Hiro says. "The universalists think that we are determined by the prepatterned structure of our brains -- the pathways in the cortex. The relativists don't believe that we have any limits." "Lagos modified the strict Chomskyan theory by supposing that learning a language is like blowing code into PROMs -- an analogy that I cannot interpret." "The analogy is clear. PROMs are Programmable Read-Only Memory chips," Hiro says. "When they come from the factory, they have no content. Once and only once, you can place information into those chips and then freeze it -- the information, the software, becomes frozen into the chip -- it transmutes into hardware. After you have blown the code into the PROMs, you can read it out, but you can't write to them anymore. So Lagos was trying to say that the newborn human brain has no structure -- as the relativists would have it -- and that as the child learns a language, the developing brain structures itself accordingly, the language gets 'blown into the hardware and becomes a permanent part of the brain's deep structure -- as the universalists would have it." "Yes. This was his interpretation." "Okay. So when he talked about Enki being a real person with magical powers, what he meant was that Enki somehow understood the connection between language and the brain, knew how to manipulate it. The same way that a hacker, knowing the secrets of a computer system, can write code to control it -- digital namshubs?" "Lagos said that Enki had the ability to ascend into the universe of language and see it before his eyes. Much as humans go into the Metaverse. That gave him power to create nam-shubs. 
And nam-shubs had the power to alter the functioning of the brain and of the body." "Why isn't anyone doing this kind of thing nowadays? Why aren't there any namshubs in English?" "Not all languages are the same, as Steiner points out. Some languages are better at metaphor than others. Hebrew, Aramaic, Greek, and Chinese lend themselves to word play and have achieved a lasting grip on reality: Palestine had Qiryat Sefer, the 'City of the Letter,' and Syria had Byblos, the 'Town of the Book.' By contrast other civilizations seem 'speechless' or at least, as may have been the case in Egypt, not entirely cognizant of the creative and transformational powers of language. Lagos believed that Sumerian was an extraordinarily powerful language -- at least it was in Sumer five thousand years ago." "A language that lent itself to Enki's neurolinguistic hacking." "Early linguists, as well as the Kabbalists, believed in a fictional language called the tongue of Eden, the language of Adam. It enabled all men to understand each other, to communicate without misunderstanding. It was the language of the Logos, the moment when God created the world by speaking a word. In the tongue of Eden, naming a thing was the same as creating it. To quote Steiner again, 'Our speech interposes itself between apprehension and truth like a dusty pane or warped mirror. The tongue of Eden was like a flawless glass; a light of total understanding streamed through it. Thus Babel was a second Fall.' And Isaac the Blind, an early Kabbalist, said that, to quote Gershom Scholem's translation, 'The speech of men is connected with divine speech and all language whether heavenly or human derives from one source: the Divine Name.' The practical Kabbalists, the sorcerers, bore the title Ba'al Shem, meaning 'master of the divine name.'" "The machine language of the world," Hiro says.
Neal Stephenson (Snow Crash)
Okay, I’m going to tell you what I think. It’s like this,” he said grimly. “Quit or don’t quit. Take the promotion or not take it. But, if you take the graveyard shift, mark my words, we will eventually—I don’t know how, and I don’t know when—live to regret it.” Without saying another word he walked inside. In bed Alexander let her kiss his hands. He was on his back, and Tatiana sidled up to him naked, kneeling by his side. Taking his hands, she kissed them slowly, digit by digit, knuckle by knuckle, pressing them to her trembling breasts, but when she opened her mouth to speak, Alexander took his hands away. “I know what you’re about to do,” he said. “I’ve been there a thousand times. Go ahead. Touch me. Caress me. Whisper to me. Tell me first you don’t see my scars anymore, then make it all right. You always do, you always manage to convince me that whatever crazy plan you have is really the best for you and me,” he said. “Returning to blockaded Leningrad, escaping to Sweden, Finland, running to Berlin, the graveyard shift. I know what’s coming. Go ahead, I’ll be good to you right back. You’re going to try to make me all right with you staying in Leningrad when I tell you that to save your hard-headed skull you must return to Lazarevo? You want to convince me that escaping through enemy territory across Finland’s iced-over marsh while pregnant is the only way for us? Please. You want to tell me that working all Friday night and not sleeping in my bed is the best thing for our family? Try. I know eventually you’ll succeed.” He was staring at her blonde and lowered head. “Even if you don’t,” he continued, “I know eventually, you’ll do what you want anyway. I don’t want you to do it. You know you should be resigning, not working graveyard—nomenclature, by the way, that I find ironic for more reasons than I care to go into. I’m telling you here and now, the path you’re taking us on is going to lead to chaos and discord not order and accord. It’s your choice, though.
This defines you—as a nurse, as a woman, as a wife—pretend servitude. But you can’t fool me. You and I both know what you’re made of underneath the velvet glove: cast iron.” When Tatiana said nothing, Alexander brought her to him and laid her on his chest. “You gave me too much leeway with Balkman,” he said, kissing her forehead. “You kept your mouth shut too long, but I’ve learned from your mistake. I’m not keeping mine shut—I’m telling you right from the start: you’re choosing unwisely. You are not seeing the future. But you do what you want.” Kneeling next to him, she cupped him below the groin into one palm, kneading him gently, and caressed him back and forth with the other. “Yes,” he said, putting his arms under his head and closing his eyes. “You know I love that, your healing stroke. I’m in your hands.” She kissed him and whispered to him, and told him she didn’t see his scars anymore, and made it if not all right then at least forgotten for the next few hours of darkness.
Paullina Simons (The Summer Garden (The Bronze Horseman, #3))
a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history. Larry Norton, a breast cancer specialist at New York’s Memorial Sloan-Kettering Cancer Center, was part of the team that worked with Watson. “Computer science is going to evolve rapidly, and medicine will evolve with it,” he said. “This is coevolution. We’ll help each other.”28 This belief that machines and humans will get smarter together is a process that Doug Engelbart called “bootstrapping” and “coevolution.”29 It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership. Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do. 
There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone?
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9 Based on this report, Congress established the National Science Foundation.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
And so, when I tell stories today about digital transformation and organizational agility and customer centricity, I use a vocabulary that is very consistent and very refined. It is one of the tools I have available to tell my story effectively. I talk about assumptions. I talk about hypotheses. I talk about outcomes as a measure of customer success. I talk about outcomes as a measurable change in customer behavior. I talk about outcomes over outputs, experimentation, continuous learning, and ship, sense, and respond. The more you tell your story, the more you can refine your language into your trademark or brand—what you’re most known for. For example, baseball great Yogi Berra was famous for his Yogi-isms—sayings like “You can observe a lot by watching” and “When you come to a fork in the road, take it.” It’s not just a hook or catchphrase, it helps tell the story as well. For Lean Startup, a best-selling book on corporate innovation written by Eric Ries, the words were “build,” “measure,” “learn.” Jeff Patton, a colleague of mine, uses the phrase “the differences that make a difference.” And he talks about bets as a way of testing confidence levels. He’ll ask, “What will you bet me that your idea is good? Will you bet me lunch? A day’s pay? Your 401(k)?” These words are not only their vocabulary. They are their brand. That’s one of the benefits of storytelling and telling those stories continuously. As you refine your language, the people who are beginning to pay attention to you start adopting that language, and then that becomes your thing.
Jeff Gothelf (Forever Employable: How to Stop Looking for Work and Let Your Next Job Find You)
Gina flopped back on her cot, arm up over her eyes. “Oh, my God, Molly, what am I going to do? The fact that he came here tonight at all is . . . He’s clearly interested, but that’s probably just because he thinks I’m a total perv.” “Whoa,” Molly said. “Wait. You lost me there.” Gina sat up, a mix of earnestness, horror, and amusement on her pretty face. “I didn’t tell you this, but after I first spoke to Lucy’s sister—we were in the shower tent so no one would see us—I let her leave first and then I waited, like, a minute, thinking we shouldn’t be seen leaving the tent together. And before I could go, he came in.” He. “Leslie Pollard?” Molly clarified. Gina nodded. “I freaked out when I saw him coming, and it’s stupid, I know, but I hid. And I should have just waited until I heard the shower go on, but God, maybe he wouldn’t have pulled the curtain, because he obviously thought he was in there alone . . .” Molly started to laugh. “Oh my.” “Yeah,” Gina said. “Oh my. So I decide to run for it, only he’s not in one of the changing booths, he’s over by the bench, you know?” Molly nodded. The bench in the main part of the room. “In only his underwear,” Gina finished, with a roll of her eyes. “Oh, my God.” “Really?” Molly asked. Apparently Jones was taking his change of identity very seriously. He hated wearing underwear of any kind, but obviously he thought it wouldn’t be in character for Leslie Pollard to go commando. “Boxers or briefs?” Gina gave her a look, but she was starting to laugh now, too, thank goodness. “Briefs. Very brief briefs.” She covered her mouth with her hands. “Oh, my God, Molly, he was . . . I think he showers at noon because he knows no one else will be in there, so he can, you know, have an intimate visit with Mr. Hand.” Oh, dear. “And now I know, and he knows I know, and he also probably thinks I lurk in the men’s shower,” Gina continued. “And the fact that he actually came to tea tonight, instead of hiding from me, in his tent, forever, means . . . 
something awful, don’t you think? Did I mention he has, like, an incredible body?” Molly shook her head. Oh dear. “No.” “Yes,” Gina said just a little too grimly, considering the topic. “Who would’ve guessed that underneath those awful shirts he’s a total god? And maybe that’s what’s freaking me out the most.” “You mean because . . . you’re attracted to him?” Molly asked. “No!” Gina said. “God! Because I’m not. I felt nothing. I’m standing there and he’s . . . You know how I said he reminds me of Hugh Grant?” Molly nodded, too relieved to speak. “Well, I got the wrong Hugh. This guy is built like Hugh Jackman. And beneath the hats and sunblock and glasses, he’s actually got cheekbones and a jaw line, too. I’m talking total hottie. And, yes, I can definitely appreciate that on one level, but . . .” She glanced over at the desk, at her digital camera. She’d gotten it out of her trunk earlier today. Which, Molly had learned, meant that she’d spent more time this afternoon looking at her saved pictures. Which included at least a few of Max. Molly’s relief over not having to deal with the complications of Gina having a crush on Leslie felt a whole lot less good. She wished someone would just go ahead and steal Gina’s camera already. Maybe that would help her move on.
Suzanne Brockmann (Breaking Point (Troubleshooters, #9))
The Memory Business Steven Sasson is a tall man with a lantern jaw. In 1973, he was a freshly minted graduate of the Rensselaer Polytechnic Institute. His degree in electrical engineering led to a job with Kodak’s Apparatus Division research lab, where, a few months into his employment, Sasson’s supervisor, Gareth Lloyd, approached him with a “small” request. Fairchild Semiconductor had just invented the first “charge-coupled device” (or CCD)—an easy way to move an electronic charge around a transistor—and Kodak needed to know if these devices could be used for imaging.4 Could they ever. By 1975, working with a small team of talented technicians, Sasson used CCDs to create the world’s first digital still camera and digital recording device. Looking, as Fast Company once explained, “like a ’70s Polaroid crossed with a Speak-and-Spell,”5 the camera was the size of a toaster, weighed in at 8.5 pounds, had a resolution of 0.01 megapixel, and took up to thirty black-and-white digital images—a number chosen because it fell between twenty-four and thirty-six and was thus in alignment with the exposures available in Kodak’s roll film. It also stored shots on the only permanent storage device available back then—a cassette tape. Still, it was an astounding achievement and an incredible learning experience. [Image: Portrait of Steven Sasson with first digital camera, 2009. Source: Harvey Wang, From Darkroom to Daylight] “When you demonstrate such a system,” Sasson later said, “that is, taking pictures without film and showing them on an electronic screen without printing them on paper, inside a company like Kodak in 1976, you have to get ready for a lot of questions. I thought people would ask me questions about the technology: How’d you do this? How’d you make that work? I didn’t get any of that. They asked me when it was going to be ready for prime time? When is it going to be realistic to use this? 
Why would anybody want to look at their pictures on an electronic screen?”6 In 1996, twenty years after this meeting took place, Kodak had 140,000 employees and a $28 billion market cap. They were effectively a category monopoly. In the United States, they controlled 90 percent of the film market and 85 percent of the camera market.7 But they had forgotten their business model. Kodak had started out in the chemistry and paper goods business, for sure, but they came to dominance by being in the convenience business. Even that doesn’t go far enough. There is still the question of what exactly Kodak was making more convenient. Was it just photography? Not even close. Photography was simply the medium of expression—but what was being expressed? The “Kodak Moment,” of course—our desire to document our lives, to capture the fleeting, to record the ephemeral. Kodak was in the business of recording memories. And what made recording memories more convenient than a digital camera? But that wasn’t how the Kodak Corporation of the late twentieth century saw it. They thought that the digital camera would undercut their chemical business and photographic paper business, essentially forcing the company into competing against itself. So they buried the technology. Nor did the executives understand how a low-resolution 0.01 megapixel image camera could hop on an exponential growth curve and eventually provide high-resolution images. So they ignored it. Instead of using their weighty position to corner the market, they were instead cornered by the market.
Peter H. Diamandis (Bold: How to Go Big, Create Wealth and Impact the World (Exponential Technology Series))
Similarly, the brains of mice that have learned many tasks are slightly different from the brains of other mice that have not learned these tasks. It is not so much that the number of neurons has changed, but rather that the nature of the neural connections has been altered by the learning process. In other words, learning actually changes the structure of the brain. This recalls the old adage “practice makes perfect.” Canadian psychologist Dr. Donald Hebb discovered an important fact about the wiring of the brain: the more we exercise certain skills, the more certain pathways in our brains become reinforced, so the task becomes easier. Unlike a digital computer, which is just as dumb today as it was yesterday, the brain is a learning machine with the ability to rewire its neural pathways every time it learns something. This is a fundamental difference between a digital computer and the brain. This lesson applies not only to London taxicab drivers, but also to accomplished concert musicians as well. According to psychologist Dr. K. Anders Ericsson and colleagues, who studied master violinists at Berlin’s elite Academy of Music, top concert violinists could easily rack up ten thousand hours of grueling practice by the time they were twenty years old, practicing more than thirty hours per week. By contrast, he found that students who were merely exceptional studied only eight thousand hours or fewer, and future music teachers practiced only a total of four thousand hours. Neurologist Daniel Levitin says, “The emerging picture from such studies is that ten thousand hours of practice is required to achieve the level of mastery associated with being a world-class expert—in anything.… In study after study, of composers, basketball players, fiction writers, ice skaters, concert pianists, chess players, master criminals, and what have you, this number comes up again and again.” Malcolm Gladwell, writing in the book Outliers, calls this the “10,000-hour rule.”
Michio Kaku (The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind)