Research And Innovation Quotes

We've searched our database for all the quotes and captions related to Research And Innovation. Here they are! All 100 of them:

People’s confidence in their abilities influences how they approach life. Their dreams are likely anchored to what they feel they can achieve.
Raoul Davis Jr. (Firestarters: How Innovators, Instigators, and Initiators Can Inspire You to Ignite Your Own Life)
Innovators are owners of the situation. They own it because they create it—quite literally. They embrace the world as it should match the vision in their heads. And when something is missing from that vision, they fill the gap.
Raoul Davis Jr. (Firestarters: How Innovators, Instigators, and Initiators Can Inspire You to Ignite Your Own Life)
Belief that you can act is a powerful motivator. Belief that change can happen in a flash is an even stronger motivator.
Raoul Davis Jr. (Firestarters: How Innovators, Instigators, and Initiators Can Inspire You to Ignite Your Own Life)
Introspection is a form of self-management. You reflect. You decide. You change. You allow yourself to grow.
Raoul Davis Jr. (Firestarters: How Innovators, Instigators, and Initiators Can Inspire You to Ignite Your Own Life)
The secret to doing good research is always to be a little underemployed. You waste years by not being able to waste hours.
Michael Lewis (The Undoing Project: A Friendship That Changed Our Minds)
Under normal conditions the research scientist is not an innovator but a solver of puzzles, and the puzzles upon which he concentrates are just those which he believes can be both stated and solved within the existing scientific tradition.
Thomas S. Kuhn (The Structure of Scientific Revolutions)
Design must be an innovative, highly creative, cross-disciplinary tool responsive to the needs of men. It must be more research-oriented, and we must stop defiling the earth itself with poorly-designed objects and structures.
Victor Papanek (Design for the Real World: Human Ecology and Social Change)
If just a tiny fraction of the sums spent on scientific and technological research and innovation were devoted to labs for designing and testing new organizational and institutional structures, we might have a much broader range of options to head off the looming implosion.
Alvin Toffler (Revolutionary Wealth)
Research suggests that in over 90 percent of all successful new businesses, historically, the strategy that the founders had deliberately decided to pursue was not the strategy that ultimately led to the business’s success.
Clayton M. Christensen (The Innovator's Solution: Creating and Sustaining Successful Growth)
The daughter of Lithuanian immigrants, born with a precocious scientific intellect and a thirst for chemical knowledge, Elion had completed a master's degree in chemistry from New York University in 1941 while teaching high school science during the day and performing her research for her thesis at night and on the weekends. Although highly qualified, talented, and driven, she had been unable to find a job in an academic laboratory. Frustrated by repeated rejections, she had found a position as a supermarket product supervisor. When Hitchings found Trudy Elion, who would soon become one of the most innovative synthetic chemists of her generation (and a future Nobel laureate), she was working for a food lab in New York, testing the acidity of pickles and the color of egg yolk going into mayonnaise. Rescued from a life of pickles and mayonnaise…
Siddhartha Mukherjee (The Emperor of All Maladies: A Biography of Cancer)
I don't like museums, I like labs.
Amit Kalantri
Innovation basically involves making obsolete that which you did before.
Jay Abraham (The Sticking Point Solution: 9 Ways to Move Your Business from Stagnation to Stunning Growth in Tough Economic Times)
Research: the curiosity to find the unknown to make it known.
Lailah Gifty Akita (Think Great: Be Great! (Beautiful Quotes, #1))
Her research suggests a paradoxical truth about innovation: good ideas are more likely to emerge in environments that contain a certain amount of noise and error.
Steven Johnson (Where Good Ideas Come From)
We get smarter and more creative as we age, research shows. Our brain's anatomy, neural networks, and cognitive abilities can actually improve with age and increased life experiences. Contrary to the mythology of Silicon Valley, older employees may be even more productive, innovative, and collaborative than younger ones... Most people, in fact, have multiple cognitive peaks throughout their lives.
Rich Karlgaard (Late Bloomers: The Power of Patience in a World Obsessed with Early Achievement)
An invention is a responsibility of the individual; society cannot invent, it can only applaud the invention and inventor.
Amit Kalantri (Wealth of Words)
Dissociation is the ultimate form of human response to chronic developmental stress, because patients with dissociative disorders report the highest frequency of childhood abuse and/or neglect among all psychiatric disorders. The cardinal feature of dissociation is a disruption in one or more mental functions. Dissociative amnesia, depersonalization, derealization, identity confusion, and identity alterations are core phenomena of dissociative psychopathology which constitute a single dimension characterized by a spectrum of severity. Clinical Psychopharmacology and Neuroscience 2014 Dec; 12(3): 171-179 The Many Faces of Dissociation: Opportunities for Innovative Research in Psychiatry
Vedat Şar
Brain Research Through Advancing Innovative Neurotechnologies (or BRAIN) project announced by President Obama, and the Human Brain Project of the European Union, which will potentially allocate billions of dollars to decode the pathways of the brain, all the way down to the neural level.
Michio Kaku (The Future of the Mind: The Scientific Quest To Understand, Enhance and Empower the Mind)
narrowly focused specialists may be good at incremental innovation. But breakthrough innovation is often the product of temporary teams whose members cross disciplinary boundaries—at a time when breakthroughs in every field are, in fact, blurring those very boundaries. And this is not just a matter for scientists and researchers.
Alvin Toffler (Revolutionary Wealth)
Billions of dollars worth of research knowledge lie dormant at American universities waiting for the right disruptor to come along and create a business.
Jay Samit (Disrupt You!: Master Personal Transformation, Seize Opportunity, and Thrive in the Era of Endless Innovation)
“Basic research leads to new knowledge,” Bush wrote. “It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Research has shown that simply imagining yourself in a situation where the rules of regular life don’t apply can greatly increase your creativity.
Tanner Christensen (The Creativity Challenge: Design, Experiment, Test, Innovate, Build, Create, Inspire, and Unleash Your Genius)
“The main lesson of thirty-five years of AI research is that the hard problems are easy and the easy problems are hard,” according to Steven Pinker, the Harvard cognitive scientist.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
LISP, which was designed to facilitate artificial intelligence research.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
the Sources of Innovation: The Case of Scientific Instruments." Research Policy 23, no. 4: 459-469.
Eric von Hippel (Democratizing Innovation)
After 3.8 billion years of research and development, failures are fossils, and what surrounds us is the secret to survival.
Janine M. Benyus (Biomimicry: Innovation Inspired by Nature)
Most improved things can be improved.
Mokokoma Mokhonoana
Envisioning fungi as nanoconductors in mycocomputers, Gorman (2003) and his fellow researchers at Northwestern University have manipulated mycelia of Aspergillus niger to organize gold into its DNA, in effect creating mycelial conductors of electrical potentials. NASA reports that microbiologists at the University of Tennessee, led by Gary Sayler, have developed a rugged biological computer chip housing bacteria that glow upon sensing pollutants, from heavy metals to PCBs (Miller 2004). Such innovations hint at new microbiotechnologies on the near horizon. Working together, fungal networks and environmentally responsive bacteria could provide us with data about pH, detect nutrients and toxic waste, and even measure biological populations.
Paul Stamets (Mycelium Running: How Mushrooms Can Help Save the World)
As long as research data is stored as tacit knowledge in people’s minds or buried in interview transcripts, teams will experience difficulty synthesizing what has been observed and learned.
Bella Martin (Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions)
involving into an iterative process of simplifying the 'complexity', and then transforming this 'simplicity into newer complexity' while integrating the unsolved domain for an unprecedented success.
Priyavrat Thareja
ARPA should not force the research computers at each site to handle the routing of data, Clark argued. Instead ARPA should design and give each site a standardized minicomputer that would do the routing.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The commercialization of molecular biology is the most stunning ethical event in the history of science, and it has happened with astonishing speed. For four hundred years since Galileo, science has always proceeded as a free and open inquiry into the workings of nature. Scientists have always ignored national boundaries, holding themselves above the transitory concerns of politics and even wars. Scientists have always rebelled against secrecy in research, and have even frowned on the idea of patenting their discoveries, seeing themselves as working to the benefit of all mankind. And for many generations, the discoveries of scientists did indeed have a peculiarly selfless quality... Suddenly it seemed as if everyone wanted to become rich. New companies were announced almost weekly, and scientists flocked to exploit genetic research... It is necessary to emphasize how significant this shift in attitude actually was. In the past, pure scientists took a snobbish view of business. They saw the pursuit of money as intellectually uninteresting, suited only to shopkeepers. And to do research for industry, even at the prestigious Bell or IBM labs, was only for those who couldn't get a university appointment. Thus the attitude of pure scientists was fundamentally critical toward the work of applied scientists, and to industry in general. Their long-standing antagonism kept university scientists free of contaminating industry ties, and whenever debate arose about technological matters, disinterested scientists were available to discuss the issues at the highest levels. But that is no longer true. There are very few molecular biologists and very few research institutions without commercial affiliations. The old days are gone. Genetic research continues, at a more furious pace than ever. But it is done in secret, and in haste, and for profit.
Michael Crichton (Jurassic Park (Jurassic Park, #1))
What’s more, according to Brown’s research, play shapes our brain, fosters empathy, helps us navigate complex social groups, and is at the core of creativity and innovation. In some ways, it helps our overheated brain cool down.
Brené Brown (Dare to Lead: Brave Work. Tough Conversations. Whole Hearts.)
Taylor recalled that he ran into a brick wall every time he tried to deal with the suits back east. As the head of a Xerox research facility in Webster, New York, explained to him, “The computer will never be as important to society as the copier.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
the breakthrough researcher first discovers the fundamental causal mechanism behind the phenomena of success. This allows those who are looking for “an answer” to get beyond the wings-and-feathers mind-set of copying the attributes of successful companies.
Clayton M. Christensen (The Innovator's Solution: Creating and Sustaining Successful Growth)
scientific complex and technological wizardry. All successful late modern empires cultivated scientific research in the hope of harvesting technological innovations, and many scientists spent most of their time working on arms, medicines and machines for their imperial masters.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
November 2014, focus on science communications full-time. I now work for a cancer charity, “translating” science into English for fundraising teams. My writing reaches so many people and helps raise money toward cancer research, and it feels very meaningful to me because of that.
Sam Maggs (Wonder Women: 25 Innovators, Inventors, and Trailblazers Who Changed History)
While tracking trends can be a useful tool in dealing with the unpredictable future, market research can be more of a problem than a help. Research does best at measuring the past. New ideas and concepts are almost impossible to measure. No one has a frame of reference. People don’t know what they will do until they face an actual decision. The classic example is the research conducted before Xerox introduced the plain-paper copier. What came back was the conclusion that no one would pay five cents for a plain-paper copy when they could get a Thermofax copy for a cent and a half. Xerox ignored the research, and the rest is history.
Al Ries (The 22 Immutable Laws of Marketing: Violate Them at Your Own Risk)
private school teachers tend to have fewer credentials and to cling to traditional teaching styles, such as lecturing while students sit in rows and take notes. Public school teachers, by contrast, are much more likely to be certified, to hold higher degrees, and to embrace research-based innovations in curriculum and pedagogy
David C. Berliner (50 Myths and Lies That Threaten America's Public Schools: The Real Crisis in Education)
There is no more delicate matter to take in hand, nor more dangerous to conduct, nor more doubtful in its success, than to be a leader in the introduction of changes. For he who innovates will have for enemies all those who are well off under the old order of things, and only lukewarm supporters in those who might be better off under the new.
Mark E. Mendenhall (Global Leadership: Research, Practice, and Development)
Raffensperger also requested and received a $5,591,800 grant from the privately funded Center for Election Innovation and Research (CEIR), a group funded by Facebook founder Mark Zuckerberg and his wife Priscilla Chan. The group reported Georgia used the funds to push mail-in balloting and to counteract negative messaging about mail-in voting.
Mollie Ziegler Hemingway (Rigged: How the Media, Big Tech, and the Democrats Seized Our Elections)
Double the population of a city, and it doesn’t simply double its productivity; it yields productivity and innovation that is more than doubled. These relationships have been found in patents, a city’s gross metropolitan product, research and development budgets, and even the presence of so-called supercreative individuals, such as artists and academics.
Samuel Arbesman (The Half-life of Facts: Why Everything We Know Has an Expiration Date)
What is the purpose of education? Is it to impart knowledge and facts or is it to nurture curiosity, effortful problem solving, and the capacity for lifelong learning? Educational historians have repeatedly shown that today’s schools were designed during the first half of the twentieth century to meet the demands of the industrial era, not an innovative knowledge economy. “Very few schools teach students how to create knowledge,” says Professor Keith Sawyer of Washington University, a leading education and innovation researcher. “Instead, students are taught that knowledge is static and complete, and they become experts at consuming knowledge rather than producing knowledge.” This is unacceptable. Change
Peter Sims (Little Bets: How breakthrough ideas emerge from small discoveries)
[One way] researchers sometimes evaluate people's judgments is to compare those judgments with those of more mature or experienced individuals. This method has its limitations too, because mature or experienced individuals are sometimes so set in their ways that they can't properly evaluate new or unique conditions or adopt new approaches to solving problems.
Robert Epstein (Teen 2.0: Saving Our Children and Families from the Torment of Adolescence)
Entrepreneurs who kept their day jobs had 33 percent lower odds of failure than those who quit. If you’re risk averse and have some doubts about the feasibility of your ideas, it’s likely that your business will be built to last. If you’re a freewheeling gambler, your startup is far more fragile. Like the Warby Parker crew, the entrepreneurs whose companies topped Fast Company’s recent most innovative lists typically stayed in their day jobs even after they launched. Former track star Phil Knight started selling running shoes out of the trunk of his car in 1964, yet kept working as an accountant until 1969. After inventing the original Apple I computer, Steve Wozniak started the company with Steve Jobs in 1976 but continued working full time in his engineering job at Hewlett-Packard until 1977. And although Google founders Larry Page and Sergey Brin figured out how to dramatically improve internet searches in 1996, they didn’t go on leave from their graduate studies at Stanford until 1998. “We almost didn’t start Google,” Page says, because we “were too worried about dropping out of our Ph.D. program.” In 1997, concerned that their fledgling search engine was distracting them from their research, they tried to sell Google for less than $2 million in cash and stock. Luckily for them, the potential buyer rejected the offer. This habit of keeping one’s day job isn’t limited to successful entrepreneurs. Many influential creative minds have stayed in full-time employment or education even after earning income from major projects. Selma director Ava DuVernay made her first three films while working in her day job as a publicist, only pursuing filmmaking full time after working at it for four years and winning multiple awards. Brian May was in the middle of doctoral studies in astrophysics when he started playing guitar in a new band, but he didn’t drop out until several years later to go all in with Queen. 
Soon thereafter he wrote “We Will Rock You.” Grammy winner John Legend released his first album in 2000 but kept working as a management consultant until 2002, preparing PowerPoint presentations by day while performing at night. Thriller master Stephen King worked as a teacher, janitor, and gas station attendant for seven years after writing his first story, only quitting a year after his first novel, Carrie, was published. Dilbert author Scott Adams worked at Pacific Bell for seven years after his first comic strip hit newspapers. Why did all these originals play it safe instead of risking it all?
Adam M. Grant (Originals: How Non-Conformists Move the World)
Notwithstanding the intense pressure on faculty members to publish, nationwide surveys indicate that they value teaching as highly as scholarly research. For every research superstar seeking international acclaim and association only with graduate students, there are many professors who value not only scholarship but also teaching and mentoring undergraduates.
Clayton M. Christensen (The Innovative University: Changing the DNA of Higher Education from the Inside Out)
In the seventy years since von Neumann effectively placed his “Draft Report” on the EDVAC into the public domain, the trend for computers has been, with a few notable exceptions, toward a more proprietary approach. In 2011 a milestone was reached: Apple and Google spent more on lawsuits and payments involving patents than they did on research and development of new products.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
“Most lecture-based courses contribute nothing to real learning. Consequential and retained learning comes from applying knowledge to new situations or problems, research on questions and issues that students consider important, peer interaction, activities, and projects. Experiences, rather than short-term memorization, help students develop the skills and motivation that transform lives.” [pp. 7-8]
Tony Wagner (Most Likely to Succeed: Preparing Our Kids for the Innovation Era)
Research by Harvard Business School professor Amy Edmondson shows that in the type of psychologically safe environment that Meyer helped create, people learn and innovate more.* And it’s givers who often create such an environment: in one study, engineers who shared ideas without expecting anything in return were more likely to play a major role in innovation, as they made it safe to exchange information
Adam M. Grant (Give and Take: A Revolutionary Approach to Success)
Relationships can be mundane and boring. What I love about sensual living is that it gives you the ability to be innovative. You are able to introduce things that your partner doesn't know they want or need yet, and thus keep the excitement up. More like what Steve Jobs did with Apple. He said, “I never rely on market research. Our task is to read things that are not yet on the page.” Now that's the essence of true sensual living right there.
Lebo Grand
Today’s computer technology exists in some measure because millions of middle-class taxpayers supported federal funding for basic research in the decades following World War II. We can be reasonably certain that those taxpayers offered their support in the expectation that the fruits of that research would create a more prosperous future for their children and grandchildren. Yet, the trends we looked at in the last chapter suggest we are headed toward a very different outcome. BEYOND THE BASIC MORAL QUESTION of whether a tiny elite should be able to, in effect, capture ownership of society’s accumulated technological capital, there are also practical issues regarding the overall health of an economy in which income inequality becomes too extreme. Continued progress depends on a vibrant market for future innovations—and that, in turn, requires a reasonable distribution of purchasing power.
Martin Ford (Rise of the Robots: Technology and the Threat of a Jobless Future)
another obstacle to educating innovators in universities is the lack of respect for interdisciplinary inquiry, practical knowledge, and applied learning. Discipline-based, in-depth knowledge is important, and basic research makes significant contributions to innovation. It is essential to our future that we continue to support this kind of inquiry, but this cannot—and must not—be the only kind of knowledge that is valued by our universities and our society.
Tony Wagner (Creating Innovators: The Making of Young People Who Will Change the World)
Harvard Business School professor Teresa Amabile has conducted extensive research on employees working in creative endeavors in order to understand how work environments foster or impede creativity and innovation. She has consistently found that work environments in which employees have a high degree of operational autonomy lead to the highest degree of creativity and innovation. Operational autonomy, of course, can be seen as the extreme version of process fairness.
Harvard Business Review (HBR's 10 Must Reads on Emotional Intelligence (with featured article "What Makes a Leader?" by Daniel Goleman)(HBR's 10 Must Reads))
The researchers found that when students were given problems to solve, and they did not know methods to solve them, but they were given opportunity to explore the problems, they became curious, and their brains were primed to learn new methods, so that when teachers taught the methods, students paid greater attention to them and were more motivated to learn them. The researchers published their results with the title “A Time for Telling,” and they argued that the question is not “Should we tell or explain methods?” but “When is the best time to do this?”
Jo Boaler (Mathematical Mindsets: Unleashing Students' Potential through Creative Math, Inspiring Messages and Innovative Teaching (Mindset Mathematics))
On October 29 the connection was ready to be made. The event was appropriately casual. It had none of the drama of the “one small step for man, one giant leap for mankind” that had occurred on the moon a few weeks earlier, with a half billion people watching on television. Instead it was an undergraduate named Charley Kline, under the eye of Crocker and Cerf, who put on a telephone headset to coordinate with a researcher at SRI while typing in a login sequence that he hoped would allow his terminal at UCLA to connect through the network to the computer 354 miles away in Palo Alto.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
John Passmore writes in his book Science and Its Critics, The Spanish Inquisition sought to avoid direct responsibility for the burning of heretics by handing them over to the secular arm; to burn them itself, it piously explained, would be wholly inconsistent with its Christian principles. Few of us would allow the Inquisition thus easily to wipe its hands clean of bloodshed; it knew quite well what would happen. Equally, where the technological application of scientific discoveries is clear and obvious—as when a scientist works on nerve gases—he cannot properly claim that such applications are “none of his business,” merely on the ground that it is the military forces, not scientists, who use the gases to disable or kill. This is even more obvious when the scientist deliberately offers help to governments, in exchange for funds. If a scientist, or a philosopher, accepts funds from some such body as an office of naval research, then he is cheating if he knows his work will be useless to them and must take some responsibility for the outcome if he knows that it will be useful. He is subject, properly subject, to praise or blame in relation to any innovations which flow from his work.
Carl Sagan (The Demon-Haunted World: Science as a Candle in the Dark)
The earliest modern attempt to test prayer’s efficacy was Sir Francis Galton’s innovative but flawed survey in 1872. The field languished until the 1960s, when several researchers began clinical and laboratory studies designed to answer two fundamental questions: (1) Do the prayerful, compassionate, healing intentions of humans affect biological functions in remote individuals who may be unaware of these efforts? (2) Can these effects be demonstrated in nonhuman processes, such as microbial growth, specific biochemical reactions, or the function of inanimate objects? The answer to both questions appears to be yes.
Ervin Laszlo (The Akashic Experience: Science and the Cosmic Memory Field)
Much of the literature on creativity focuses on how to trigger these moments of innovative synthesis; how to drive the problem phase toward its resolution. And it turns out that epiphanies often happen when we are in one of two types of environment. The first is when we are switching off: having a shower, going for a walk, sipping a cold beer, daydreaming. When we are too focused, when we are thinking too literally, we can’t spot the obscure associations that are so important to creativity. We have to take a step back for the “associative state” to emerge. As the poet Julia Cameron put it: “I learned to get out of the way and let that creative force work through me.” The other type of environment where creative moments often happen, as we have seen, is when we are being sparked by the dissent of others. When Kevin Dunbar, a psychologist at McGill University, went to look at how scientific breakthroughs actually happen, for example (he took cameras into four molecular biology labs and recorded pretty much everything that took place), he assumed that it would involve scientists beavering away in isolated contemplation. In fact, the breakthroughs happened at lab meetings, where groups of researchers would gather around a desk to talk through their work. Why here? Because they were forced to respond to challenges and critiques from their fellow researchers. They were jarred into seeing new associations.
Matthew Syed (Black Box Thinking: Why Most People Never Learn from Their Mistakes--But Some Do)
Some of the most momentous innovations tiptoe quietly onto history’s stage. On August 6, 1991, Berners-Lee was glancing through the Internet’s alt.hypertext newsgroup and ran across this question: “Is anyone aware of research or development efforts in . . . hypertext links enabling retrieval from multiple heterogeneous sources?” His answer, “from: at 2:56 pm,” became the first public announcement of the Web. “The WorldWideWeb project aims to allow links to be made to any information anywhere,” he began. “If you’re interested in using the code, mail me.” With his low-key personality and even lower-key posting, Berners-Lee did not fathom what a profound idea he had unleashed. Any information anywhere.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
An obituary of a librarian could be about anything under the sun, a woman with a phenomenal memory, who recalled the books her aging patrons read as children - and was also, incidentally, the best sailor on her stretch of the Maine coast - or a man obsessed with maps, who helped automate the Library of Congress’s map catalog and paved the way for wonders like Google Maps… Whether the subject was a community librarian or a prophet, almost every librarian obituary contained some version of this sentence: “Under [their] watch, the library changed from a collection of books into an automated research center.” I began to get the idea that libraries were where it was happening - wide open territory for innovators, activists, and pioneers.
Marilyn Johnson (This Book Is Overdue!: How Librarians and Cybrarians Can Save Us All)
So, if you are predominantly a producer of intangible assets (writing software, doing design, producing research) you probably want to build an organization that allows information to flow, help serendipitous interactions, and keeps the key talent. That probably means allowing more autonomy, fewer targets, and more access to the boss, even if that is at the cost of influence activities. This seems to describe the types of autonomous organizations that the earlier writers, like Charles Leadbeater, had in mind. And it also seems to describe the increasing importance of systemic innovators. Such innovators are not inventors of single, isolated inventions. Rather, their role is to coordinate the synergies that successfully bring such an innovation to market.
Jonathan Haskel (Capitalism without Capital: The Rise of the Intangible Economy)
free to pursue new options even if such options imply loss of profits for selected industries. The same is clearly true in pharmaceutical research, in the pursuit of alternatives to the internal-combustion engine, and in many other technological frontiers. I do not think that the development of new technologies should be placed in the control of old technologies; the temptation to suppress the competition is too great. If we Americans live in a free-enterprise society, let us see substantial independent enterprise in all of the technologies upon which our future may depend. If organizations devoted to technological innovation and its boundaries of acceptability are not challenging (and perhaps even offending) at least some powerful groups, they are not accomplishing their purpose.
Carl Sagan (Broca's Brain: Reflections on the Romance of Science)
Entrepreneurship itself is an emergent system, where companies create the conditions for experimentation and learning to occur, often symbiotically with customers. In 1978, Eric von Hippel (my PhD advisor at MIT) pioneered the notion of user-driven innovation.10, 11 Back then, the conventional wisdom was that innovation only came from corporate, government, and university research-and-development labs. While some still believe this today, Eric's insight proved to be prescient in many areas, especially in the information age, as the widespread adoption of open-source software and Lean Startup methodologies have demonstrated.12 Twitter is a tangible example since three of the platform's most popular features—the @ reply, the # hashtag indexing, and retweet sharing—were all generated bottom-up by users.
Brad Feld (The Startup Community Way: Evolving an Entrepreneurial Ecosystem (Techstars))
Researchers at Shanghai Jiao Tong University in China, Saga University in Japan, and the University of California, Davis, proposed creating an artificial inorganic leaf modeled on the real thing. They took a leaf of Anemone vitifolia, a plant native to China, and injected its veins with titanium dioxide, a well-known industrial photocatalyst. By taking on the precise branching shape and structure of the leaf's veins, the titanium dioxide produced much higher light-harvesting ability than if it was used in a traditional configuration. The researchers found an astounding 800 percent increase in hydrogen production as well. The total performance was 300 percent more active than the world's best commercial photocatalysts. When they added platinum nanoparticles to the mix, it increased activity by a further 1,000 percent.
Jay Harman (The Shark's Paintbrush: Biomimicry and How Nature is Inspiring Innovation)
This interplay of military and academic motives became ingrained in the Internet. “The design of both the ARPANET and the Internet favored military values, such as survivability, flexibility, and high performance, over commercial goals, such as low cost, simplicity, or consumer appeal,” the technology historian Janet Abbate noted. “At the same time, the group that designed and built ARPA’s networks was dominated by academic scientists, who incorporated their own values of collegiality, decentralization of authority, and open exchange of information into the system.”90 These academic researchers of the late 1960s, many of whom associated with the antiwar counterculture, created a system that resisted centralized command. It would route around any damage from a nuclear attack but also around any attempt to impose control.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
[...] 'Imagine a town of scholars, all researching the most marvelous, fascinating things. Science. Mathematics. Languages. Literature. Imagine building after building filled with more books than you've seen in your entire life. Imagine quiet, solitude and a serene place to think.' He sighed. 'London is a blathering mess. It's impossible to get anything done here; the city's too loud, and it demands too much from you. You can escape out to places like Hampstead, but the screaming core draws you back in whether you like it or not. But Oxford gives you all the tools you need for your work – food, clothes, books, tea – and then it leaves you alone. It is the centre of all knowledge and innovation in the civilized world. And, should you progress sufficiently well in your studies here, you might one day be lucky enough to call it home.
R.F. Kuang (Babel)
To understand the fundamental benefits of an immigrant population, imagine that you could divide the population into two groups: one consisting on the average of the youngest, healthiest, boldest, most risk-tolerant, most hard-working, ambitious, and innovative people; the other consisting of everyone else. Transplant the first group to another country, and leave the second group in their country of origin. That selective transplanting approximates the decision to emigrate and its successful accomplishment. Hence it comes as no surprise that more than one-third of American Nobel Prize winners are foreign-born, and over half are either immigrants themselves or else the children of immigrants. That's because Nobel Prize-winning research demands those same qualities of boldness, risk-tolerance, hard work, ambition, and innovativeness.
Jared Diamond (Upheaval: Turning Points for Nations in Crisis)
Instead it was an undergraduate named Charley Kline, under the eye of Crocker and Cerf, who put on a telephone headset to coordinate with a researcher at SRI while typing in a login sequence that he hoped would allow his terminal at UCLA to connect through the network to the computer 354 miles away in Palo Alto. He typed in “L.” The guy at SRI told him that it had been received. Then he typed in “O.” That, too, was confirmed. When he typed in “G,” the system hit a memory snag because of an auto-complete feature and crashed. Nevertheless, the first message had been sent across the ARPANET, and if it wasn’t as eloquent as “The Eagle has landed” or “What has God wrought,” it was suitable in its understated way: “Lo.” As in “Lo and behold.” In his logbook, Kline recorded, in a memorably minimalist notation, “22:30. Talked to SRI Host to Host. CSK.”101
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Total obscurity. Bilbo in Gollum's tunnel. A mathematician's first steps into unknown territory constitute the first phase of a familiar cycle. After the darkness comes a faint, faint glimmer of light, just enough to make you think that something is there, almost within reach, waiting to be discovered . . . Then, after the faint, faint glimmer, if all goes well, you unravel the thread - and suddenly it's broad daylight! You're full of confidence, you want to tell anyone who will listen about what you've found. And then, after day has broken, after the sun has climbed high into the sky, a phase of depression inevitably follows. You lose all faith in the importance of what you've achieved. Any idiot could have done what you've done, go find yourself a more worthwhile problem and make something of your life. Thus the cycle of mathematical research . . .
Cédric Villani (Birth of a Theorem: A Mathematical Adventure)
The fundamental problem is that every technology embeds the ideologies of its creators! Who made the Internet? The military! The Internet is the product of the Defense Advanced Research Projects Agency! We call it DARPA for short! Who worked for DARPA? DARPA was a bunch of men! Not a single woman worked on the underlying technologies that fuel our digital universe! Men are the shit of the world and all of our political systems and philosophies were created and devised without the input of women! Half of the world’s population lives beneath systems of government and technological innovation into which their gender had zero input! Democracy is a bullshit ideology that a bunch of slaveholding Greek men constructed between rounds of beating their wives! All the presumed ideologies of men were taken for inescapable actualities and designed into the Internet! Packet switching is an incredible evil!
Jarett Kobek (I Hate the Internet)
As World War II was ending, the great engineer and public official Vannevar Bush argued that America’s innovation engine would require a three-way partnership of government, business, and academia. He was uniquely qualified to envision that triangle, because he had a foot in all three camps. He had been dean of engineering at MIT, a founder of Raytheon, and the chief government science administrator overseeing, among other projects, the building of the atom bomb.4 Bush’s recommendation was that government should not build big research labs of its own, as it had done with the atomic bomb project, but instead should fund research at universities and corporate labs. This government-business-university partnership produced the great innovations that propelled the U.S. economy in the postwar period, including transistors, microchips, computers, graphical user interfaces, GPS, lasers, the internet, and search engines.
Walter Isaacson (The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race)
We're all equal before a wave. —Laird Hamilton, professional surfer In 2005, I was working as an equity analyst at Merrill Lynch. When one afternoon I told a close friend that I was going to leave Wall Street, she was dumbfounded. "Are you sure you know what you're doing?" she asked me. This was her polite, euphemistic way of wondering if I'd lost my mind. My job was to issue buy or sell recommendations on corporate stocks—and I was at the top of my game. I had just returned from Mexico City for an investor day at América Móvil, now the fourth largest wireless operator in the world. As I sat in the audience with hundreds of others, Carlos Slim, the controlling shareholder and one of the world's richest men, quoted my research, referring to me as "La Whitney." I had large financial institutions like Fidelity Investments asking for my financial models, and when I upgraded or downgraded a stock, the stock price would frequently move several percentage points.
Whitney Johnson (Disrupt Yourself: Putting the Power of Disruptive Innovation to Work)
Montessori classrooms emphasize self-directed learning, hands-on engagement with a wide variety of materials (including plants and animals), and a largely unstructured school day. And in recent years they’ve produced alumni including the founders of Google (Larry Page and Sergey Brin), Amazon (Jeff Bezos), and Wikipedia (Jimmy Wales). These examples appear to be part of a broader trend. Management researchers Jeffrey Dyer and Hal Gregersen interviewed five hundred prominent innovators and found that a disproportionate number of them also went to Montessori schools, where “they learned to follow their curiosity.” As a Wall Street Journal blog post by Peter Sims put it, “the Montessori educational approach might be the surest route to joining the creative elite, which are so overrepresented by the school’s alumni that one might suspect a Montessori Mafia.” Whether or not he’s part of this mafia, Andy will vouch for the power of SOLEs. He was a Montessori kid for the
Erik Brynjolfsson (The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies)
DuPont, for 130 years, had confined itself to making munitions and explosives. In the mid-1920s it then organized its first research efforts in other areas, one of them the brand-new field of polymer chemistry, which the Germans had pioneered during World War I. For several years there were no results at all. Then, in 1928, an assistant left a burner on over the weekend. On Monday morning, Wallace H. Carothers, the chemist in charge, found that the stuff in the kettle had congealed into fibers. It took another ten years before DuPont found out how to make Nylon intentionally. The point of the story is, however, that the same accident had occurred several times in the laboratories of the big German chemical companies with the same results, and much earlier. The Germans were, of course, looking for a polymerized fiber—and they could have had it, along with world leadership in the chemical industry, ten years before DuPont had Nylon. But because they had not planned the experiment, they dismissed its results, poured out the accidentally produced fibers, and started all over again.
Peter F. Drucker (Innovation and Entrepreneurship)
The story of the Internet's origins departs from the explanations of technical innovation that center on individual inventors or on the pull of markets. Cerf and Kahn were neither captains of industry nor "two guys tinkering in a garage." The Internet was not built in response to popular demand, real or imagined; its subsequent mass appeal had no part in the decisions made in 1973. Rather, the project reflected the command economy of military procurement, where specialized performance is everything and money is no object, and the research ethos of the university, where experimental interest and technical elegance take precedence over commercial application. This was surely an unlikely context for the creation of what would become a popular and profitable service. Perhaps the key to the Internet's later commercial success was that the project internalized the competitive forces of the market by bringing representatives of diverse interest groups together and allowing them to argue through design issues. Ironically, this unconventional approach produced a system that proved to have more appeal for potential "customers"—people building networks—than did the overtly commercial alternatives that appeared soon after.
Janet Abbate (Inventing the Internet (Inside Technology))
I want to convince you that intellectual property is important, that it is something that any informed citizen needs to know a little about, in the same way that any informed citizen needs to know at least something about the environment, or civil rights, or the way the economy works. I will try my best to be fair, to explain the issues and give both sides of the argument. Still, you should know that this is more than mere description. In the pages that follow, I try to show that current intellectual property policy is overwhelmingly and tragically bad in ways that everyone, and not just lawyers or economists, should care about. We are making bad decisions that will have a negative effect on our culture, our kids’ schools, and our communications networks; on free speech, medicine, and scientific research. We are wasting some of the promise of the Internet, running the risk of ruining an amazing system of scientific innovation, carving out an intellectual property exemption to the First Amendment. I do not write this as an enemy of intellectual property, a dot-communist ready to end all property rights; in fact, I am a fan. It is precisely because I am a fan that I am so alarmed about the direction we are taking.
Perhaps the most remarkable elder-care innovation developed in Japan so far is the Hybrid Assistive Limb (HAL)—a powered exoskeleton suit straight out of science fiction. Developed by Professor Yoshiyuki Sankai of the University of Tsukuba, the HAL suit is the result of twenty years of research and development. Sensors in the suit are able to detect and interpret signals from the brain. When the person wearing the battery-powered suit thinks about standing up or walking, powerful motors instantly spring into action, providing mechanical assistance. A version is also available for the upper body and could assist caretakers in lifting the elderly. Wheelchair-bound seniors have been able to stand up and walk with the help of HAL. Sankai’s company, Cyberdyne, has also designed a more robust version of the exoskeleton for use by workers cleaning up the Fukushima Daiichi nuclear plant in the wake of the 2011 disaster. The company says the suit will almost completely offset the burden of over 130 pounds of tungsten radiation shielding worn by workers.* HAL is the first elder-care robotic device to be certified by Japan’s Ministry of Economy, Trade, and Industry. The suits lease for just under $2,000 per year and are already in use at over three hundred Japanese hospitals and nursing homes.21
Martin Ford (Rise of the Robots: Technology and the Threat of a Jobless Future)
Rejecting failure and avoiding mistakes seem like high-minded goals, but they are fundamentally misguided. Take something like the Golden Fleece Awards, which were established in 1975 to call attention to government-funded projects that were particularly egregious wastes of money. (Among the winners were things like an $84,000 study on love commissioned by the National Science Foundation, and a $3,000 Department of Defense study that examined whether people in the military should carry umbrellas.) While such scrutiny may have seemed like a good idea at the time, it had a chilling effect on research. No one wanted to “win” a Golden Fleece Award because, under the guise of avoiding waste, its organizers had inadvertently made it dangerous and embarrassing for everyone to make mistakes. The truth is, if you fund thousands of research projects every year, some will have obvious, measurable, positive impacts, and others will go nowhere. We aren’t very good at predicting the future—that’s a given—and yet the Golden Fleece Awards tacitly implied that researchers should know before they do their research whether or not the results of that research would have value. Failure was being used as a weapon, rather than as an agent of learning. And that had fallout: The fact that failing could earn you a very public flogging distorted the way researchers chose projects. The politics of failure, then, impeded our progress. There’s a quick way to determine if your company has embraced the negative definition of failure. Ask yourself what happens when an error is discovered. Do people shut down and turn inward, instead of coming together to untangle the causes of problems that might be avoided going forward? Is the question being asked: Whose fault was this? If so, your culture is one that vilifies failure. Failure is difficult enough without it being compounded by the search for a scapegoat. 
In a fear-based, failure-averse culture, people will consciously or unconsciously avoid risk. They will seek instead to repeat something safe that’s been good enough in the past. Their work will be derivative, not innovative. But if you can foster a positive understanding of failure, the opposite will happen. How, then, do you make failure into something people can face without fear? Part of the answer is simple: If we as leaders can talk about our mistakes and our part in them, then we make it safe for others. You don’t run from it or pretend it doesn’t exist. That is why I make a point of being open about our meltdowns inside Pixar, because I believe they teach us something important: Being open about problems is the first step toward learning from them. My goal is not to drive fear out completely, because fear is inevitable in high-stakes situations. What I want to do is loosen its grip on us. While we don’t want too many failures, we must think of the cost of failure as an investment in the future.
Ed Catmull (Creativity, Inc.: an inspiring look at how creativity can - and should - be harnessed for business success by the founder of Pixar)
Bush’s description of how basic research provides the seed corn for practical inventions became known as the “linear model of innovation.” Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.” By the end of his report, Bush had reached poetic heights in extolling the practical payoffs of basic scientific research: “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for past ages.”9 Based on this report, Congress established the National Science Foundation.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis,
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
The Blue Mind Rx Statement Our wild waters provide vast cognitive, emotional, physical, psychological, social, and spiritual values for people from birth, through adolescence, adulthood, older age, and in death; wild waters provide a useful, widely available, and affordable range of treatments healthcare practitioners can incorporate into treatment plans. The world ocean and all waterways, including lakes, rivers, and wetlands (collectively, blue space), cover over 71% of our planet. Keeping them healthy, clean, accessible, and biodiverse is critical to human health and well-being. In addition to fostering more widely documented ecological, economic, and cultural diversities, our mental well-being, emotional diversity, and resiliency also rely on the global ecological integrity of our waters. Blue space gives us half of our oxygen, provides billions of people with jobs and food, holds the majority of Earth's biodiversity including species and ecosystems, drives climate and weather, regulates temperature, and is the sole source of hydration and hygiene for humanity throughout history. Neuroscientists and psychologists add that the ocean and wild waterways are a wellspring of happiness and relaxation, sociality and romance, peace and freedom, play and creativity, learning and memory, innovation and insight, elation and nostalgia, confidence and solitude, wonder and awe, empathy and compassion, reverence and beauty — and help manage trauma, anxiety, sleep, autism, addiction, fitness, attention/focus, stress, grief, PTSD, build personal resilience, and much more. Chronic stress and anxiety cause or intensify a range of physical and mental afflictions, including depression, ulcers, colitis, heart disease, and more. Being on, in, and near water can be among the most cost-effective ways of reducing stress and anxiety. 
We encourage healthcare professionals and advocates for the ocean, seas, lakes, and rivers to go deeper and incorporate the latest findings, research, and insights into their treatment plans, communications, reports, mission statements, strategies, grant proposals, media, exhibits, keynotes, and educational programs and to consider the following simple talking points: •Water is the essence of life: The ocean, healthy rivers, lakes, and wetlands are good for our minds and bodies. •Research shows that nature is therapeutic, promotes general health and well-being, and blue space in both urban and rural settings further enhances and broadens cognitive, emotional, psychological, social, physical, and spiritual benefits. •All people should have safe access to salubrious, wild, biodiverse waters for well-being, healing, and therapy. •Aquatic biodiversity has been directly correlated with the therapeutic potency of blue space. Immersive human interactions with healthy aquatic ecosystems can benefit both. •Wild waters can serve as medicine for caregivers, patient families, and all who are part of patients’ circles of support. •Realization of the full range and potential magnitude of ecological, economic, physical, intrinsic, and emotional values of wild places requires us to understand, appreciate, maintain, and improve the integrity and purity of one of our most vital of medicines — water.
Wallace J. Nichols (Blue Mind: The Surprising Science That Shows How Being Near, In, On, or Under Water Can Make You Happier, Healthier, More Connected, and Better at What You Do)
The various ways of creating a culture of innovation that we’ve talked about so far are greatly influenced by the leaders at the top. Leaders can’t dictate culture, but they can nurture it. They can generate the right conditions for creativity and innovation. Metaphorically, they can provide the heat and light and moisture and nutrients for a creative culture to blossom and grow. They can focus the best efforts of talented individuals to build innovative, successful groups. In our work at IDEO, we have been lucky enough to meet frequently with CEOs and visionary leaders from both the private and public sectors. Each has his or her own unique style, of course, but the best all have an ability to identify and activate the capabilities of people on their teams. This trait goes far beyond mere charisma or even intelligence. Certain leaders have a knack for nurturing people around them in a way that enables them to be at their best. One way to describe those leaders is to say they are “multipliers,” a term we picked up from talking to author and executive advisor Liz Wiseman. Drawing on a background in organizational behavior and years of experience as a global human resources executive at Oracle Corporation, Liz interviewed more than 150 leaders on four continents to research her book Multipliers: How the Best Leaders Make Everyone Smarter. Liz observes that all leaders lie somewhere on a continuum between diminishers, who exercise tight control in a way that underutilizes their team’s creative talents, and multipliers, who set challenging goals and then help employees achieve the kind of extraordinary results that they themselves may not have known they were capable of.
Tom Kelley (Creative Confidence: Unleashing the Creative Potential Within Us All)
Yet the deepest and most enduring forms of cultural change nearly always occur from the “top down.” In other words, the work of world-making and world-changing are, by and large, the work of elites: gatekeepers who provide creative direction and management within spheres of social life. Even where the impetus for change draws from popular agitation, it does not gain traction until it is embraced and propagated by elites. The reason for this, as I have said, is that culture is about how societies define reality—what is good, bad, right, wrong, real, unreal, important, unimportant, and so on. This capacity is not evenly distributed in a society, but is concentrated in certain institutions and among certain leadership groups who have a lopsided access to the means of cultural production. These elites operate in well-developed networks and powerful institutions. Over time, cultural innovation is translated and diffused. Deep-rooted cultural change tends to begin with those whose work is most conceptual and invisible and it moves through to those whose work is most concrete and visible. In a very crude formulation, the process begins with theorists who generate ideas and knowledge; moves to researchers who explore, revise, expand, and validate ideas; moves on to teachers and educators who pass those ideas on to others, then passes on to popularizers who simplify ideas and practitioners who apply those ideas. All of this, of course, transpires through networks and structures of cultural production. Cultural change is most enduring when it penetrates the structure of our imagination, frameworks of knowledge and discussion, the perception of everyday reality. This rarely if ever happens through grassroots political mobilization though grassroots mobilization can be a manifestation of deeper cultural transformation.
James Davison Hunter (To Change the World: The Irony, Tragedy, and Possibility of Christianity in the Late Modern World)
the device had the property of transresistance and should have a name similar to devices such as the thermistor and varistor, Pierce proposed transistor. Exclaimed Brattain, “That’s it!” The naming process still had to go through a formal poll of all the other engineers, but transistor easily won the election over five other options.35 On June 30, 1948, the press gathered in the auditorium of Bell Labs’ old building on West Street in Manhattan. The event featured Shockley, Bardeen, and Brattain as a group, and it was moderated by the director of research, Ralph Bown, dressed in a somber suit and colorful bow tie. He emphasized that the invention sprang from a combination of collaborative teamwork and individual brilliance: “Scientific research is coming more and more to be recognized as a group or teamwork job. . . . What we have for you today represents a fine example of teamwork, of brilliant individual contributions, and of the value of basic research in an industrial framework.”36 That precisely described the mix that had become the formula for innovation in the digital age. The New York Times buried the story on page 46 as the last item in its “News of Radio” column, after a note about an upcoming broadcast of an organ concert. But Time made it the lead story of its science section, with the headline “Little Brain Cell.” Bell Labs enforced the rule that Shockley be in every publicity photo along with Bardeen and Brattain. The most famous one shows the three of them in Brattain’s lab. Just as it was about to be taken, Shockley sat down in Brattain’s chair, as if it were his desk and microscope, and became the focal point of the photo. Years later Bardeen would describe Brattain’s lingering dismay and his resentment of Shockley: “Boy, Walter hates this picture. . . . That’s Walter’s equipment and our experiment,
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Bell resisted selling Texas Instruments a license. “This business is not for you,” the firm was told. “We don’t think you can do it.”38 In the spring of 1952, Haggerty was finally able to convince Bell Labs to let Texas Instruments buy a license to manufacture transistors. He also hired away Gordon Teal, a chemical researcher who worked on one of Bell Labs’ long corridors near the semiconductor team. Teal was an expert at manipulating germanium, but by the time he joined Texas Instruments he had shifted his interest to silicon, a more plentiful element that could perform better at high temperatures. By May 1954 he was able to fabricate a silicon transistor that used the n-p-n junction architecture developed by Shockley. Speaking at a conference that month, near the end of reading a thirty-one-page paper that almost put listeners to sleep, Teal shocked the audience by declaring, “Contrary to what my colleagues have told you about the bleak prospects for silicon transistors, I happen to have a few of them here in my pocket.” He proceeded to dunk a germanium transistor connected to a record player into a beaker of hot oil, causing it to die, and then did the same with one of his silicon transistors, during which Artie Shaw’s “Summit Ridge Drive” continued to blare undiminished. “Before the session ended,” Teal later said, “the astounded audience was scrambling for copies of the talk, which we just happened to bring along.”39 Innovation happens in stages. In the case of the transistor, first there was the invention, led by Shockley, Bardeen, and Brattain. Next came the production, led by engineers such as Teal. Finally, and equally important, there were the entrepreneurs who figured out how to conjure up new markets. Teal’s plucky boss Pat Haggerty was a colorful case study of this third step in the innovation process.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Linguistic and musical sound systems illustrate a common theme in the study of music-language relations. On the surface, the two domains are dramatically different. Music uses pitch in ways that speech does not, and speech organizes timbre to a degree seldom seen in music. Yet beneath these differences lie deep connections in terms of cognitive and neural processing. Most notably, in both domains the mind interacts with one particular aspect of sound (pitch in music, and timbre in speech) to create a perceptually discretized system. Importantly, this perceptual discretization is not an automatic byproduct of human auditory perception. For example, linguistic and musical sequences present the ear with continuous variations in amplitude, yet loudness is not perceived in terms of discrete categories. Instead, the perceptual discretization of musical pitch and linguistic timbre reflects the activity of a powerful cognitive system, built to separate within-category sonic variation from differences that indicate a change in sound category. Although music and speech differ in the primary acoustic feature used for sound category formation, it appears that the mechanisms that create and maintain learned sound categories in the two domains may have a substantial degree of overlap. Such overlap has implications for both practical and theoretical issues surrounding human communicative development. In the 20th century, relations between spoken and musical sound systems were largely explored by artists. For example, the boundary between the domains played an important role in innovative works such as Schoenberg's Pierrot Lunaire and Reich's Different Trains (cf. Risset, 1991). In the 21st century, science is finally beginning to catch up, as relations between spoken and musical sound systems prove themselves to be a fruitful domain for research in cognitive neuroscience. Such work has already begun to yield new insights into our species' uniquely powerful communicative abilities.
Aniruddh D. Patel (Music, Language, and the Brain)
Modeling the evolution of modularity became significantly easier after a kind of genetic variation was discovered by quantitative trait locus (QTL) mapping in the lab of James Cheverud at Washington University called 'relationship QTL' or r-QTL for short. An r-QTL is a genetic locus that affects the correlations between two quantitative traits (i.e. their variational relationship, and therefore, 'relationship' loci). Surprisingly, a large fraction of these so-mapped loci are also neutral with respect to the character mean. This means one can select on these 'neutral' r-QTLs without simultaneously changing the character mean in a certain way. It was easy to show that differential directional selection on a character could easily lead to a decrease in genetic correlation between characters. Of course, it is not guaranteed that each and every population has the right kind of r-QTL polymorphisms, nor is it yet clear what kind of genetic architecture allows for the existence of an r-QTL. Nevertheless, these findings make it plausible that differential directional selection can enhance the genetic/variational individuality of traits and, thus, may play a role in the origin of evolutionary novelties by selecting for variational individuality. It must be added, though, that there has been relatively little research in this area and that we will need to see more to determine whether we understand what is going on here, if anything. In particular, one difficulty is the mathematical modeling of gene interaction (epistasis), because the details of an epistasis model determine the outcome of the evolution by natural selection. One result shows that natural selection increases or decreases mutational variance, depending on whether the average epistatic effects are positive or negative. This means that the genetic architecture is more determined by the genetic architecture that we start with than by the nature of the selection forces that act upon it. 
In other words, the evolution of a genetic architecture could be arbitrary with respect to selection.
Günter Wagner (Homology, Genes, and Evolutionary Innovation)
Was this luck, or was it more than that? Proving skill is difficult in venture investing because, as we have seen, it hinges on subjective judgment calls rather than objective or quantifiable metrics. If a distressed-debt hedge fund hires analysts and lawyers to scrutinize a bankrupt firm, it can learn precisely which bond is backed by which piece of collateral, and it can foresee how the bankruptcy judge is likely to rule; its profits are not lucky. Likewise, if an algorithmic hedge fund hires astrophysicists to look for patterns in markets, it may discover statistical signals that are reliably profitable. But when Perkins backed Tandem and Genentech, or when Valentine backed Atari, they could not muster the same certainty. They were investing in human founders with human combinations of brilliance and weakness. They were dealing with products and manufacturing processes that were untested and complex; they faced competitors whose behaviors could not be forecast; they were investing over long horizons. In consequence, quantifiable risks were multiplied by unquantifiable uncertainties; there were known unknowns and unknown unknowns; the bracing unpredictability of life could not be masked by neat financial models. Of course, in this environment, luck played its part. Kleiner Perkins lost money on six of the fourteen investments in its first fund. Its methods were not as fail-safe as Tandem’s computers. But Perkins and Valentine were not merely lucky. Just as Arthur Rock embraced methods and attitudes that put him ahead of ARD and the Small Business Investment Companies in the 1960s, so the leading figures of the 1970s had an edge over their competitors. Perkins and Valentine had been managers at leading Valley companies; they knew how to be hands-on; and their contributions to the success of their portfolio companies were obvious. 
It was Perkins who brought in the early consultants to eliminate the white-hot risks at Tandem, and Perkins who pressed Swanson to contract Genentech’s research out to existing laboratories. Similarly, it was Valentine who drove Atari to focus on Home Pong and to ally itself with Sears, and Valentine who arranged for Warner Communications to buy the company. Early risk elimination plus stage-by-stage financing worked wonders for all three companies. Skeptical observers have sometimes asked whether venture capitalists create innovation or whether they merely show up for it. In the case of Don Valentine and Tom Perkins, there was not much passive showing up. By force of character and intellect, they stamped their will on their portfolio companies.
Sebastian Mallaby (The Power Law: Venture Capital and the Making of the New Future)
Interestingly enough, creative geniuses seem to think a lot more like horses do. These people also spend a rather large amount of time engaging in that favorite equine pastime: doing nothing. In his book Fire in the Crucible: The Alchemy of Creative Genius, John Briggs gathers numerous studies illustrating how artists and inventors keep their thoughts pulsating in a field of nuance associated with the limbic system. In order to accomplish this feat against the influence of cultural conditioning, they tend to be outsiders who have trouble fitting into polite society. Many creative geniuses don’t do well in school and don’t speak until they’re older, thus increasing their awareness of nonverbal feelings, sensations, and body language cues. Einstein is a classic example. Like Kathleen Barry Ingram, he also failed his college entrance exams. As expected, these sensitive, often highly empathic people feel extremely uncomfortable around incongruent members of their own species, and tend to distance themselves from the cultural mainstream. Through their refusal to fit into a system focusing on outside authority, suppressed emotion, and secondhand thought, creative geniuses retain and enhance their ability to activate the entire brain. Information flows freely, strengthening pathways between the various brain functions. The tendency to separate thought from emotion, memory, and sensation is lessened. This gives birth to a powerful nonlinear process, a flood of sensations and images interacting with high-level thought functions and aspects of memory too complex and multifaceted to distill into words. These elements continue to influence and build on each other with increasing ferocity. Researchers emphasize that the entire process is so rapid the conscious mind barely registers that it is happening, let alone what is happening. 
Now a person — or a horse for that matter — can theoretically operate at this level his entire life and never receive recognition for the rich and innovative insights resulting from this process. Those called creative geniuses continuously struggle with the task of communicating their revelations to the world through the most amenable form of expression — music, visual art, poetry, mathematics. Their talent for innovation, however, stems from an ability to continually engage and process a complex, interconnected, nonlinear series of insights. Briggs also found that creative geniuses spend a large amount of time “doing nothing,” alternating episodes of intense concentration on a project with periods of what he calls “creative indolence.” Albert Einstein once remarked that some of his greatest ideas came to him so suddenly while shaving that he was prone to cut himself with surprise.
Linda Kohanov (The Tao of Equus: A Woman's Journey of Healing and Transformation through the Way of the Horse)
Manage Your Team’s Collective Time. Time management is a group endeavor. The payoff goes far beyond morale and retention. By Leslie Perlow. Most professionals approach time management the wrong way. People who fall behind at work are seen to be personally failing—just as people who give up on diet or exercise plans are seen to be lacking self-control or discipline. In response, countless time management experts focus on individual habits, much as self-help coaches do. They offer advice about such things as keeping better to-do lists, not checking e-mail incessantly, and not procrastinating. Of course, we could all do a better job managing our time. But in the modern workplace, with its emphasis on connectivity and collaboration, the real problem is not how individuals manage their own time. It’s how we manage our collective time—how we work together to get the job done. Here is where the true opportunity for productivity gains lies. Nearly a decade ago I began working with a team at the Boston Consulting Group to implement what may sound like a modest innovation: persuading each member to designate and spend one weeknight out of the office and completely unplugged from work. The intervention was aimed at improving quality of life in an industry that’s notorious for long hours and a 24/7 culture. The early returns were positive; the initiative was expanded to four teams of consultants, and then to 10. The results, which I described in a 2009 HBR article, “Making Time Off Predictable—and Required,” and in a 2012 book, Sleeping with Your Smartphone, were profound. Consultants on teams with mandatory time off had higher job satisfaction and a better work/life balance, and they felt they were learning more on the job. It’s no surprise, then, that BCG has continued to expand the program: As of this spring, it has been implemented on thousands of teams in 77 offices in 40 countries. 
During the five years since I first reported on this work, I have introduced similar time-based interventions at a range of companies—and I have come to appreciate the true power of those interventions. They put the ownership of how a team works into the hands of team members, who are empowered and incentivized to optimize their collective time. As a result, teams collaborate better. They streamline their work. They meet deadlines. They are more productive and efficient. Teams that set a goal of structured time off—and, crucially, meet regularly to discuss how they’ll work together to ensure that every member takes it—have more open dialogue, engage in more experimentation and innovation, and ultimately function better. CREATING “ENHANCED PRODUCTIVITY” DAYS One of the insights driving this work is the realization that many teams stick to tried-and-true processes that, although familiar, are often inefficient. Even companies that create innovative products rarely innovate when it comes to process. This realization came to the fore when I studied three teams of software engineers working for the same company in different cultural contexts. The teams had the same assignments and produced the same amount of work, but they used very different methods. One, in Shenzen, had a hub-and-spokes org chart—a project manager maintained control and assigned the work. Another, in Bangalore, was self-managed and specialized, and it assigned work according to technical expertise. The third, in Budapest, had the strongest sense of being a team; its members were the most versatile and interchangeable. Although, as noted, the end products were the same, the teams’ varying approaches yielded different results. For example, the hub-and-spokes team worked fewer hours than the others, while the most versatile team had much greater flexibility and control over its schedule. The teams were completely unaware that their counterparts elsewhere in the world were managing their work differently. 
My research provide
Technological innovations that produced certain major components of the United States military cannot be understood as resulting from a qualitative arms race. Those involved in decisions about new military technologies for the U.S. Army and Air Force simply do not appear to have had access to good intelligence about the Soviet military technological developments. How, then, were decisions made as to technologies to develop? Military research and development decisions are made amid great uncertainties. In an ideal world, such decisions would be managed by estimating the future costs of alternative programs and their prospective military values, and then pursuing the program with the best ratio of cost to value. But...there are tremendous difficulties in forecasting the real value and costs of weapons development programs. These uncertainties, combined with the empirical difficulty American technology managers had in collecting intelligence on the Soviet Union, meant that research and development strategies in the real world tended to become strategies for managing uncertainties. At least two such strategies are conceivable. One of the most politically important can be called, for want of a better phrase, "let the scientists choose." [This approach should be] compared with the theoretical and practical arguments for a strategy that concentrates on low-cost hedges against various forms of uncertainty.
Stephen Peter Rosen (Winning the Next War: Innovation and the Modern Military (Cornell Studies in Security Affairs))
Our focus, however, should not be on the differences among our sectors, but rather on supporting and promoting the entire innovation economy — from tech to bio to clean energy to health care and beyond. In fact, in its recent Impact 2020 report, the Massachusetts Biotechnology Council highlighted the interrelationship of these vital sectors working together, combining cutting-edge biomedical research with new information technology tools for capturing and integrating data, conducting sophisticated analytics, and enhancing personal connectivity. Massachusetts is a national and world leader in the growing field of life science information technology.
Saccharin was discovered in 1879 when a research fellow at Johns Hopkins University found his bread extra sweet one night and figured that something from the lab must have followed him home. Incredibly, he set about to tasting nearly everything in his lab—and lived to find o-benzoic sulfimide—saccharin by any other name.
Thomas Kelley (The Art of Innovation: Lessons in Creativity from IDEO, America's Leading Design Firm)
Melching is one of 25 mothers featured in Mothers of Innovation, a research report to be launched at a conference in London this week. I began investigating mothers as innovators after meeting women from mothers2mothers, a movement seeking to stop the transmission of the HIV virus to unborn babies. Soon afterwards, I heard about a scheme in which mothers were getting together to improve educational outcomes for children from deprived backgrounds in Turkey, with remarkable results.
THE CHASM – THE DIFFUSION MODEL WHY EVERYBODY HAS AN IPOD Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye? Sociologists describe the way in which a catchy idea or product becomes popular as ‘diffusion’. One of the most famous diffusion studies is an analysis by Bruce Ryan and Neal Gross of the diffusion of hybrid corn in the 1930s in Greene County, Iowa. The new type of corn was better than the old sort in every way, yet it took twenty-two years for it to become widely accepted. The diffusion researchers called the farmers who switched to the new corn as early as 1928 ‘innovators’, and the somewhat bigger group that was infected by them ‘early adaptors’. They were the opinion leaders in the communities, respected people who observed the experiments of the innovators and then joined them. They were followed at the end of the 1930s by the ‘sceptical masses’, those who would never change anything before it had been tried out by the successful farmers. But at some point even they were infected by the ‘hybrid corn virus’, and eventually transmitted it to the die-hard conservatives, the ‘stragglers’. Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adaptors to the sceptics, for at this point there is a ‘chasm’. According to the US sociologist Morton Grodzins, if the early adaptors succeed in getting the innovation across the chasm to the sceptical masses, the epidemic cycle reaches the tipping point. From there, the curve rises sharply when the masses accept the product, and sinks again when only the stragglers remain. 
With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adaptors turn away from the product as soon as the critical masses have accepted it, in search of the next new thing. The chasm model was introduced by the American consultant and author Geoffrey Moore. First they ignore you, then they laugh at you, then they fight you, then you win. Mahatma Gandhi
Mikael Krogerus (The Decision Book: 50 Models for Strategic Thinking)
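The S-shaped adoption curve Krogerus describes — slow uptake by innovators and early adaptors, a chasm, then rapid acceptance by the sceptical masses before saturation among the stragglers — is commonly formalized as the Bass diffusion model. Here is a minimal sketch; the coefficient values (p for the innovation effect, q for the imitation effect) are illustrative assumptions, not figures from the book:

```python
def bass_adoption(p=0.03, q=0.38, steps=50):
    """Cumulative adoption fraction per time step under the Bass model.

    p: "innovation" coefficient -- adoption independent of peers (innovators).
    q: "imitation" coefficient -- adoption driven by existing adopters
       (the sceptical masses joining once opinion leaders have switched).
    """
    adopted = 0.0
    curve = []
    for _ in range(steps):
        remaining = 1.0 - adopted
        # New adopters combine the innovation effect with word-of-mouth
        # imitation, which grows as the adopted share grows -- producing
        # the slow start, tipping point, and saturation described above.
        new = (p + q * adopted) * remaining
        adopted += new
        curve.append(adopted)
    return curve

curve = bass_adoption()
print(f"t=5: {curve[4]:.2f}  t=15: {curve[14]:.2f}  t=40: {curve[39]:.2f}")
```

With a higher q relative to p, the curve steepens after the tipping point — the epidemic-like spread the diffusion researchers observed once hybrid corn jumped the chasm.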
The message: Paris and Rome must reform their economies, removing barriers to the creation of businesses and jobs. Countries with the flexibility to spend more while staying within EU deficit rules should do so, creating what Mr Draghi described as “a more growth-friendly overall fiscal stance for the euro area”. Though the ECB president did not name names, that suggestion was widely interpreted as a call for Germany, the eurozone’s dominant economic power, to raid its fiscal coffers. “The part of Mr Draghi’s speech on the fiscal stance was an innovation,” says Lucrezia Reichlin, a professor at London Business School and a former head of research at the ECB. “The idea of co-ordination between monetary and fiscal policy from a euro area perspective is a hint to Germany.” France, already used to the ECB’s grumbles that it should do more to restructure the economy, received Mr Draghi’s calls warmly.
We’re not big fans of focus groups. We don’t much care for traditional market research either. We go to the source. Not the "experts" inside a company, but the actual people who use the product or something similar to what we’re hoping to create.
Thomas Kelley (The Art of Innovation: Lessons in Creativity from IDEO, America's Leading Design Firm)
With federal and state money drying up, research universities are increasingly trying to monetize their own intellectual property for revenue. In 2012, universities collectively generated $2.6 billion from their patents, a 6.8 percent jump from the previous year, according to the Association of University Technology Managers. Napolitano, of course, knows all of this. The University of California, especially its Berkeley and Los Angeles campuses, includes some of the biggest players in converting research into licensing fees and startups that might go public or be acquired. Witness the uptick in university-run incubators in the Bay Area. But someone must do the research that leads to those technologies that eventually hit the market, she said. When Napolitano first joined the University of California, one of her top priorities was to increase efficiency - to do more with less. But over time she came to realize that research is anything but efficient. But that's a good thing. "The grace note of basic research is failure," Napolitano said. "It's what doesn't work that leads to unexpected breakthroughs." There is nothing inherently wrong with seeking profit from innovation. But we must first understand that innovation starts when scientists ask how and why. Basic research "is where the action is," she said.
If we can make that defense equipment (alutsista) and matériel ourselves, then the TNI and Polri are obliged to buy domestic products. If we cannot yet make it ourselves but can carry it out through joint production; joint research, innovation, and development; or joint investment — that is what we choose. If we are genuinely not yet at that point and we have to purchase it from any country, think about transfer of technology; think about, once again, partnership, as we have done with Korea, for example, and as we do with other friendly nations.
Susilo Bambang Yudhoyono