Digital Illustration Quotes

We've searched our database for all the quotes and captions related to Digital Illustration. Here they are! All 24 of them:

WARNING: The following is a transcript of a digital recording. In certain places, the audio quality was poor, so some words and phrases represent the author's best guesses. Where possible, illustrations of important symbols mentioned in the recording have been added. Background noises such as scuffling, hitting, and cursing by the two speakers have not been transcribed. The author makes no claims for the authenticity of the recording. It seems impossible that the two young narrators are telling the truth, but you, the reader, must decide for yourself.
Rick Riordan (The Red Pyramid (The Kane Chronicles, #1))
Today Hedy’s invention serves millions through GPS, Galileo, and GLONASS satellites, Bluetooth, cell-phone, and digital wireless systems. (illustration credit i1.23)
Richard Rhodes (Hedy's Folly: The Life and Breakthrough Inventions of Hedy Lamarr, the Most Beautiful Woman in the World)
[The public intellectual] will also describe how she can work a pop culture reference into her essay, comparing the Supreme Court to the creature in the number-one box office movie of the moment. Editors like this sort of mass-media integration, first, because it gives them a way to illustrate the piece, and second because they are under the delusion that pop-culture references will propel a piece's readership into the five-digit area.
David Brooks (Bobos in Paradise)
Three researchers at Stanford University noticed the same thing about the undergraduates they were teaching, and they decided to study it. First, they noticed that while all the students seemed to use digital devices incessantly, not all students did. True to stereotype, some kids were zombified, hyperdigital users. But some kids used their devices in a low-key fashion: not all the time, and not with two dozen windows open simultaneously. The researchers called the first category of students Heavy Media Multitaskers. Their less frantic colleagues were called Light Media Multitaskers. If you asked heavy users to concentrate on a problem while simultaneously giving them lots of distractions, the researchers wondered, how good was their ability to maintain focus? The hypothesis: Compared to light users, the heavy users would be faster and more accurate at switching from one task to another, because they were already so used to switching between browser windows and projects and media inputs. The hypothesis was wrong. In every attentional test the researchers threw at these students, the heavy users did consistently worse than the light users. Sometimes dramatically worse. They weren’t as good at filtering out irrelevant information. They couldn’t organize their memories as well. And they did worse on every task-switching experiment. Psychologist Eyal Ophir, an author of the study, said of the heavy users: “They couldn’t help thinking about the task they weren’t doing. The high multitaskers are always drawing from all the information in front of them. They can’t keep things separate in their minds.” This is just the latest illustration of the fact that the brain cannot multitask. Even if you are a Stanford student in the heart of Silicon Valley.
John Medina (Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School)
In 2006, Avinash Kaushik and Ronny Kohavi, two data analysis professionals who were then working at Intuit and Microsoft, respectively, came up with the acronym HiPPO to summarize the dominant decision-making style at most companies. It stands for “highest-paid person’s opinion.” We love this shorthand and use it a lot, because it vividly illustrates the standard partnership. Even when the decisions are not made by the highest-paid people, they’re often—too often—based on opinions, judgments, intuition, gut, and System 1. The evidence is clear that this approach frequently doesn’t work well, and that HiPPOs too often destroy value.
Andrew McAfee (Machine, Platform, Crowd: Harnessing Our Digital Future)
• Storyboard: An illustration, typically across multiple panels, depicting a scenario. I used storyboards (see Figure 4.4) when working with MediaMaster, a digital music start-up that was trying to choose among a number of different directions in their product development.
Steve Portigal (Interviewing Users: How to Uncover Compelling Insights)
Bruce Horn: I thought that computers would be hugely flexible and we could be able to do everything and it would be the most mind-blowing experience ever. And instead we froze all of our thinking. We froze all the software and made it kind of industrial and mass-marketed. Computing went in the wrong direction: Computing went to the direction of commercialism and cookie-cutter.
Jaron Lanier: My whole field has created shit. And it's like we've thrust all of humanity into this endless life of tedium, and it's not how it was supposed to be. The way we've designed the tools requires that people comply totally with an infinite number of arbitrary actions. We really have turned humanity into lab rats that are trained to run mazes. I really think on just the most fundamental level we are approaching digital technology in the wrong way.
Andy van Dam: Ask yourself, what have we got today? We've got Microsoft Word and we've got PowerPoint and we've got Illustrator and we've got Photoshop. There's more functionality and, for my taste, an easier-to-understand user interface than what we had before. But they don't work together. They don't play nice together. And most of the time, what you've got is an import/export capability, based on bitmaps: the lowest common denominator—dead bits, in effect. What I'm still looking for is a reintegration of these various components so that we can go back to the future and have that broad vision at our fingertips. I don't see how we are going to get there, frankly. Live bits—where everything interoperates—we've lost that.
Bruce Horn: We're waiting for the right thing to happen to have the same type of mind-blowing experience that we were able to show the Apple people at PARC. There's some work being done, but it's very tough. And, yeah, I feel somewhat responsible. On the other hand, if somebody like Alan Kay couldn't make it happen, how can I make it happen?
Adam Fisher (Valley of Genius: The Uncensored History of Silicon Valley (As Told by the Hackers, Founders, and Freaks Who Made It Boom))
It is fun to be around really, really creative makers in the second half of the chessboard, to see what they can do, as individuals, with all of the empowering tools that have been enabled by the supernova. I met Tom Wujec in San Francisco at an event at the Exploratorium. We thought we had a lot in common and agreed to follow up on a Skype call. Wujec is a fellow at Autodesk and a global leader in 3-D design, engineering, and entertainment software. While his title sounds like a guy designing hubcaps for an auto parts company, the truth is that Autodesk is another of those really important companies few people know about—it builds the software that architects, auto and game designers, and film studios use to imagine and design buildings, cars, and movies on their computers. It is the Microsoft of design. Autodesk offers roughly 180 software tools used by some twenty million professional designers as well as more than two hundred million amateur designers, and each year those tools reduce more and more complexity to one touch. Wujec is an expert in business visualization—using design thinking to help groups solve wicked problems. When we first talked on the phone, he illustrated our conversation real-time on a shared digital whiteboard. I was awed. During our conversation, Wujec told me his favorite story of just how much the power of technology has transformed his work as a designer-maker.
Thomas L. Friedman (Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations)
The traditional illustration of the direct rule-based approach is the “three laws of robotics” concept, formulated by science fiction author Isaac Asimov in a short story published in 1942.22 The three laws were: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law; (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. Embarrassingly for our species, Asimov’s laws remained state-of-the-art for over half a century: this despite obvious problems with the approach, some of which are explored in Asimov’s own writings (Asimov probably having formulated the laws in the first place precisely so that they would fail in interesting ways, providing fertile plot complications for his stories).23 Bertrand Russell, who spent many years working on the foundations of mathematics, once remarked that “everything is vague to a degree you do not realize till you have tried to make it precise.”24 Russell’s dictum applies in spades to the direct specification approach. Consider, for example, how one might explicate Asimov’s first law. Does it mean that the robot should minimize the probability of any human being coming to harm? In that case the other laws become otiose since it is always possible for the AI to take some action that would have at least some microscopic effect on the probability of a human being coming to harm. How is the robot to balance a large risk of a few humans coming to harm versus a small risk of many humans being harmed? How do we define “harm” anyway? How should the harm of physical pain be weighed against the harm of architectural ugliness or social injustice? Is a sadist harmed if he is prevented from tormenting his victim? How do we define “human being”? Why is no consideration given to other morally considerable beings, such as sentient nonhuman animals and digital minds? The more one ponders, the more the questions proliferate. Perhaps
Nick Bostrom (Superintelligence: Paths, Dangers, Strategies)
extended to the borders of the new flat front crystal display, and the Digital Crown and side button are now elevated on the case.
David Walter (Apple Watch Ultra User Guide: A Complete Step By Step Manual For Beginners and Seniors With Practical Illustrations On How To Use & Master The New Apple Watch Ultra. With watchOS 9 Tips & Tricks)
Kirkus Review Children will appreciate that Fox gets his deserved comeuppance and will giggle over this spirited tale filled with comical banter that proves a smart, brave, levelheaded individual can outwit a bully. The dynamic, witty illustrations depict wonderfully expressive characters and droll underground details; kids will have fun poring over all the amusing activities happening in the bunnies’ habitat. (This book was reviewed digitally.) Readers will have a “hole” lot of fun with this entertaining book. (Picture book. 4-8)
Scott Slater (Down the Hole)
The connection between learning and evolution was proposed in 1896 by the American psychologist James Baldwin7 and independently by the British ethologist Conwy Lloyd Morgan8 but not generally accepted at the time. The Baldwin effect, as it is now known, can be understood by imagining that evolution has a choice between creating an instinctive organism whose every response is fixed in advance and creating an adaptive organism that learns what actions to take. Now suppose, for the purposes of illustration, that the optimal instinctive organism can be coded as a six-digit number, say, 472116, while in the case of the adaptive organism, evolution specifies only 472*** and the organism itself has to fill in the last three digits by learning during its lifetime. Clearly, if evolution has to worry about choosing only the first three digits, its job is much easier; the adaptive organism, in learning the last three digits, is doing in one lifetime what evolution would have
Stuart Russell (Human Compatible: Artificial Intelligence and the Problem of Control)
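Russell's digit example can be sketched as a toy simulation (illustrative only: the genome 472116 and uniform random guessing are stand-ins for evolution and learning, not anything from the book's own code):

```python
import random

TARGET = 472116  # the optimal instinctive behaviour in Russell's toy example

def random_search(space_size, target, rng):
    """Count uniform random guesses until `target` (0 <= target < space_size) is hit."""
    guesses = 1
    while rng.randrange(space_size) != target:
        guesses += 1
    return guesses

rng = random.Random(0)

# Instinctive organism: evolution must discover all six digits at once,
# a blind search over 10**6 possibilities.
instinctive_space = 10 ** 6

# Adaptive (Baldwin) organism: evolution discovers only the first three
# digits (472); learning fills in the rest (116) within one lifetime.
evolution_cost = random_search(10 ** 3, 472, rng)
learning_cost = random_search(10 ** 3, 116, rng)

# Expected cost: ~1,000 guesses for each three-digit search, i.e. roughly
# 2,000 in total, versus ~1,000,000 for the fully instinctive genome.
print(instinctive_space, evolution_cost + learning_cost)
```

The point of the sketch is the search-space arithmetic: splitting the six digits between evolution and lifetime learning turns one search over 10^6 candidates into two searches over 10^3 each.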
monetizable media empires on YouTube, while many freelancers make a better living on Upwork than they ever did or could at a traditional firm. My fascination with platforms emerged from a desire to understand business success and failure in the context of emerging digital business models. Platform Scale is an outcome of this growing fascination to unpack the inner workings of business models in a networked world. The ideas in this book aim to illustrate the importance of these models, the forces that power their rapid
Sangeet Paul Choudary (Platform Scale: How an emerging business model helps startups build large empires with minimum investment)
This focus on humans rather than money is best illustrated by the Apple store concept, which was the first to include the Genius Bar, a children’s play area and other features that critics thought were a waste of time. When the first Apple store designs were announced, Bloomberg reported: “(Steve) Jobs thinks he can do a better job than experienced retailers. Problem is, the numbers don’t add up. I give them two years before they’re turning out the lights on a very painful and expensive mistake.”15 However, after opening, it was obvious that the stores were engaging customers in an even more immersive, brand building experience. Eight years later, Apple’s New York store became the highest grossing retailer on Fifth Avenue.
Chris Skinner (Digital Bank: Strategies to launch or become a digital bank)
The problem of rapidly evolving technologies or “digital migration” was rather alarmingly illustrated in England in the 1980s with a considerably larger amount of information. Actually, it began in 1086 with the Domesday Book. The first public record ever made in England, the Domesday Book was instigated by William the Conqueror, who wished to take a census of his people and, more specifically, their possessions.
Christine Kenneally (The Invisible History of the Human Race: How DNA and History Shape Our Identities and Our Futures)
The “self-actualization” philosophy from which most of this new bureaucratic language emerged insists that we live in a timeless present, that history means nothing, that we simply create the world around us through the power of the will. This is a kind of individualistic fascism. Around the time the philosophy became popular in the seventies, some conservative Christian theologians were actually thinking along very similar lines: seeing electronic money as a kind of extension for God’s creative power, which is then transformed into material reality through the minds of inspired entrepreneurs. It’s easy to see how this could lead to the creation of a world where financial abstractions feel like the very bedrock of reality, and so many of our lived environments look like they were 3-D-printed from somebody’s computer screen. In fact, the sense of a digitally generated world I’ve been describing could be taken as a perfect illustration of another social law—at least, it seems to me that it should be recognized as a law—that, if one gives sufficient social power to a class of people holding even the most outlandish ideas, they will, consciously or not, eventually contrive to produce a world organized in such a way that living in it will, in a thousand subtle ways, reinforce the impression that those ideas are self-evidently true.
David Graeber (The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy)
Here are some services you can offer as a freelance fashion designer:
● Digital fashion flats (Adobe Illustrator)
● Tech packs
● Pattern making / pattern drafting (sample sewing / fittings)
● Design (trend / color research / collection design)
● Merchandising
Heidi Sew (Freelancing in Fashion: A Step-By-step Guide to Creating Your Portfolio, Setting Rates and Finding Clients You Love)
When you feel stuck in your creative pursuits, it doesn’t mean that there’s something wrong with you. You haven’t lost your touch or run out of creative juice. It just means you don’t yet have enough raw material to work with. If it feels like the well of inspiration has run dry, it’s because you need a deeper well full of examples, illustrations, stories, statistics, diagrams, analogies, metaphors, photos, mindmaps, conversation notes, quotes—anything that will help you argue for your perspective or fight for a cause you believe in.
Tiago Forte (Building a Second Brain: A Proven Method to Organise Your Digital Life and Unlock Your Creative Potential)
Pictures: As a subtext to all of this, be sure to include pictures, screenshots, illustrations, videos, and documentation that verify everything you say about yourself.
Raza Imam (Six Figure Blogging Blueprint: How to Start an Amazingly Profitable Blog in the Next 60 Days (Even If You Have No Experience) (Digital Marketing Mastery Book 3))
The same goes for textual authoring. It is easy to dictate or type in volume and most people today do, in the form of short ‘text’ messages, social media posts or longer pieces for study, work or leisure. It is much harder to make this mass of text truly accessible–we still write mostly in columns, import illustrations from elsewhere and have severe restrictions for how we can connect or link information from different locations or sources and hardly any opportunity to address specific sections of text. We then leave little information in our primitive digital documents, which are currently in the PDF document format, when we publish the result of our work.
Frode Hegland (The Future of Text 1)
In the beginning of Photoshop and Illustrator—what became Creative Suite—and to me it looked like another tool, another set of tools. And it fit in well with my photography. The photography had been there to support my art, basically, a way of gathering information and images in order to create better paintings.
James Stanford (Shimmering Zen)
The main criticism was really “he just presses a filter and gets these images.” But then they’d realize that I had learned how to paint and how to draw—that I had paid my dues, and I’d also been a graphic artist and a technical illustrator. And so I was able to show that I could draw with technical pens—and do anything that anyone else could do—and yet still was fascinated by this, and that sort of opened people up a little bit more. The more they knew about me, the more open they were to my explorations in the digital field.
James Stanford
But this is no longer a photograph and, literally speaking, it is no longer even an image. These shots may be said, rather, to be part of the murder of the image. That murder is being perpetrated continually by all the images that accumulate in series, in 'thematic' sequences, which illustrate the same event ad nauseam, which think they are accumulating, but are, in fact, cancelling each other out, till they reach the zero degree of information. There is a violence done to the world in this way, but there is also a violence done to the image, to the sovereignty of images. Now, an image has to be sovereign; it has to have its own symbolic space. If they are living images ('aesthetic' quality is not at issue here), they ensure the existence of that symbolic space by eliminating an infinite number of other spaces from it. There is a perpetual rivalry between (true) images. But it is exactly the opposite today with the digital, where the parade of images resembles the sequencing of the genome.
Jean Baudrillard (Why Hasn't Everything Already Disappeared? (The French List))