Technology Famous Quotes

We've searched our database for all the quotes and captions related to Technology Famous. Here they are! All 100 of them:

Then let me ask you this famous question: Would you rather live in a world without technology…or in a world without religion? Would you rather live without medicine, electricity, transportation, and antibiotics…or without zealots waging war over fictional tales and imaginary spirits?
Dan Brown (Origin (Robert Langdon, #5))
Later, I remember to tell Ben about the girl. “Seconds!” I say, but he is unmoved. “People always talk about email and phones and how they alienate us from one another, but these sorts of fears about technology have always been with us,” he claims. When electricity was first introduced to homes, there were letters to the newspapers about how it would undermine family togetherness. Now there would be no need to gather around a shared hearth, people fretted. In 1903, a famous psychologist worried that young people would lose their connection to dusk and its contemplative moments. Hahaha! (Except when was the last time I stood still because it was dusk?)
Jenny Offill (Weather)
For thousands of years, it had been nature--and its supposed creator--that had had a monopoly on awe. It had been the icecaps, the deserts, the volcanoes and the glaciers that had given us a sense of finitude and limitation and had elicited a feeling in which fear and respect coagulated into a strangely pleasing feeling of humility, a feeling which the philosophers of the eighteenth century had famously termed the sublime. But then had come a transformation to which we were still the heirs.... Over the course of the nineteenth century, the dominant catalyst for that feeling of the sublime had ceased to be nature. We were now deep in the era of the technological sublime, when awe could most powerfully be invoked not by forests or icebergs but by supercomputers, rockets and particle accelerators. We were now almost exclusively amazed by ourselves.
Alain de Botton (The Pleasures and Sorrows of Work)
Why is it that if you say you don’t enjoy using an e-reader, or that you aren’t going to get one till the technology is mature, you get reported as “loathing” it? The little Time article itself is fairly accurate about what I’ve said about e-reading, but the title of the series, “Famous Writers Who Loathe E-Books,” reflects or caters to a silly idea: that not being interested in using a particular technology is the same as hating and despising it.
Ursula K. Le Guin
I thought of the fate of Descartes’ famous formulation: man as ‘master and proprietor of nature.’ Having brought off miracles in science and technology, this ‘master and proprietor’ is suddenly realizing that he owns nothing and is master neither of nature (it is vanishing, little by little, from the planet), nor of History (it has escaped him), nor of himself (he is led by the irrational forces of his soul). But if God is gone and man is no longer master, then who is master? The planet is moving through the void without any master. There it is, the unbearable lightness of being.
Milan Kundera (The Art of the Novel)
The founders of start-ups as varied as YouTube, Palantir Technologies, and Yelp all worked at PayPal. Another set of people—including Reid Hoffman, Thiel, and Botha—emerged as some of the technology industry’s top investors. PayPal staff pioneered techniques in fighting online fraud that have formed the basis of software used by the CIA and FBI to track terrorists and of software used by the world’s largest banks to combat crime. This collection of super-bright employees has become known as the PayPal Mafia—more or less the current ruling class of Silicon Valley—and Musk is its most famous and successful member.
Ashlee Vance (Elon Musk: Inventing the Future)
As Stewart Brand, that great theologian of the information age, famously put it, “We are gods and might as well get good at it.”
Meghan O'Gieblyn (God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning)
Because there was no pre-existing patrician elite, those successful in the new book industry could rise very swiftly to the top of the social hierarchy.
Andrew Pettegree (Brand Luther: How an Unheralded Monk Turned His Small Town into a Center of Publishing, Made Himself the Most Famous Man in Europe—and Started the Protestant Reformation)
His plain, undecorated, and utilitarian work reeked of provincialism.
Andrew Pettegree (Brand Luther: How an Unheralded Monk Turned His Small Town into a Center of Publishing, Made Himself the Most Famous Man in Europe—and Started the Protestant Reformation)
Anyone or anything can be made famous. Technology has truly elevated the trivial. This is the true curse of our times.
Daksh Tyagi (Signs of Life)
Sometime during the 1960s, the Nobel laureate economist Milton Friedman was consulting with the government of a developing Asian nation. Friedman was taken to a large-scale public works project, where he was surprised to see large numbers of workers wielding shovels, but very few bulldozers, tractors, or other heavy earth-moving equipment. When asked about this, the government official in charge explained that the project was intended as a “jobs program.” Friedman’s caustic reply has become famous: “So then, why not give the workers spoons instead of shovels?” Friedman
Martin Ford (The Rise of the Robots: Technology and the Threat of Mass Unemployment)
The approach to digital culture I abhor would indeed turn all the world's books into one book, just as Kevin (Kelly) suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what's important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don't know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. A continuation of the present trend will make us like various medieval religious empires, or like North Korea, a society with a single book. The Bible can serve as a prototypical example. Like Wikipedia, the Bible's authorship was shared, largely anonymous, and cumulative, and the obscurity of the individual authors served to create an oracle-like ambience for the document as "the literal word of God." If we take a non-metaphysical view of the Bible, it serves as a link to our ancestors, a window. The ethereal, digital replacement technology for the printing press happens to have come of age in a time when the unfortunate ideology I'm criticizing dominates technological culture. Authorship - the very idea of the individual point of view - is not a priority of the new ideology. The digital flattening of expression into a global mush is not presently enforced from the top down, as it is in the case of a North Korean printing press. Instead, the design of software builds the ideology into those actions that are the easiest to perform on the software designs that are becoming ubiquitous. It is true that by using these tools, individuals can author books or blogs or whatever, but people are encouraged by the economics of free content, crowd dynamics, and lord aggregators to serve up fragments instead of considered whole expressions or arguments. The efforts of authors are appreciated in a manner that erases the boundaries between them. The one collective book will absolutely not be the same thing as the library of books by individuals it is bankrupting. Some believe it will be better; others, including me, believe it will be disastrously worse. As the famous line goes from Inherit the Wind: 'The Bible is a book... but it is not the only book' Any singular, exclusive book, even the collective one accumulating in the cloud, will become a cruel book if it is the only one available.
Jaron Lanier (You Are Not a Gadget)
'The model? Whoa.' But Spanner's interest in human beings, even when dead or famous, was still secondary to his fondness for rare comics, technological innovation, and bands of which Strike had never heard.
Robert Galbraith (The Cuckoo's Calling (Cormoran Strike, #1))
After all, we are all immigrants to the future; none of us is a native in that land. Margaret Mead famously wrote about the profound changes wrought by the Second World War, “All of us who grew up before the war are immigrants in time, immigrants from an earlier world, living in an age essentially different from anything we knew before.” Today we are again in the early stages of defining a new age. The very underpinnings of our society and institutions--from how we work to how we create value, govern, trade, learn, and innovate--are being profoundly reshaped by amplified individuals. We are indeed all migrating to a new land and should be looking at the new landscape emerging before us like immigrants: ready to learn a new language, a new way of doing things, anticipating new beginnings with a sense of excitement, if also with a bit of understandable trepidation.
Marina Gorbis (The Nature of the Future: Dispatches from the Socialstructed World)
Singer cited the famous essay “The Tragedy of the Commons,” in which biologist Garrett Hardin argued that individuals acting in their rational self-interest may undermine the common good, and warned against assuming that technology would save us from ourselves. “If we ignore the present warning signs and wait for an ecological disaster to strike, it will probably be too late,” Singer noted. He imagined what it must have been like to be Noah, surrounded by “complacent compatriots,” saying, “‘Don’t worry about the rising waters, Noah; our advanced technology will surely discover a substitute for breathing.’ If it was wisdom that enabled Noah to believe in the ‘never-yet-happened,’ we could use some of that wisdom now,” Singer concluded.
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
This pandemic has inherited a habit in us of leaving behind a digital dust of who we are; what we are; how we are...to get snooped, pried and loathed upon. Just locking up phones for privacy and opening up feelings and emotions is like watching a pole dance in public
Oren Tamira (Counter-Strike: An Anthology of Dalit Short Stories)
Social networking technology allows us to spend our time engaged in a hypercompetitive struggle for attention, for victories in the currency of “likes.” People are given more occasions to be self-promoters, to embrace the characteristics of celebrity, to manage their own image, to Snapchat out their selfies in ways that they hope will impress and please the world. This technology creates a culture in which people turn into little brand managers, using Facebook, Twitter, text messages, and Instagram to create a falsely upbeat, slightly overexuberant, external self that can be famous first in a small sphere and then, with luck, in a large one. The manager of this self measures success by the flow of responses it gets. The social media maven spends his or her time creating a self-caricature, a much happier and more photogenic version of real life. People subtly start comparing themselves to other people’s highlight reels, and of course they feel inferior.
David Brooks (The Road to Character)
Max Weber famously pointed out that a sovereign state's institutional representatives maintain a monopoly on the right of violence within the state's territory. Normally, this violence can only be exercised by certain duly authorized officials (soldiers, police, jailers), or those authorized by such officials (airport security, private guards…), and only in a manner explicitly designated by law. But ultimately, sovereign power really is, still, the right to brush such legalities aside, or to make them up as one goes along. The United States might call itself "a country of laws, not men", but as we have learned in recent years, American presidents can order torture, assassinations, domestic surveillance programs, even set up extra-legal zones like Guantanamo where they can treat prisoners pretty much any way they choose to. Even on the lowest levels, those who enforce the law are not really subject to it. It's extraordinarily difficult, for instance, for a police officer to do *anything* to an American citizen that would lead to that officer being convicted of a crime. (p. 195)
David Graeber (The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy)
Kensi Gounden says Everybody wants to be famous, but nobody wants to do the work. I live by that. You grind hard so you can play hard. At the end of the day, you put all the work in, and eventually it’ll pay off. It could be in a year, it could be in 30 years. Eventually, your hard work will pay off.
Kensi Gounden
In 1907 the American Telephone and Telegraph Company faced a crisis. The patents of its founder, Alexander Graham Bell, had expired, and it seemed in danger of losing its near-monopoly on phone services. Its board summoned back a retired president, Theodore Vail, who decided to reinvigorate the company by committing to a bold goal: building a system that could connect a call between New York and San Francisco. The challenge required combining feats of engineering with leaps of pure science. Making use of vacuum tubes and other new technologies, AT&T built repeaters and amplifying devices that accomplished the task in January 1915. On the historic first transcontinental call, in addition to Vail and President Woodrow Wilson, was Bell himself, who echoed his famous words from thirty-nine years earlier, “Mr. Watson, come here, I want to see you.” This time his former assistant Thomas Watson, who was in San Francisco, replied, “It would take me a week.”
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Ben R. Rich, the ex-president of the famous “Skunk Works,” Lockheed-Martin’s Advanced Development Programs (ADP) group, revealed the truth just before he died. In an alumni speech at the University of California, Los Angeles, in 1993, he said, “We already have the means to travel among the stars, but these technologies are locked up in black projects and it would take an Act of God to ever get them out to benefit humanity . . . Anything you can imagine, we already know how to do.”
Len Kasten (The Secret History of Extraterrestrials: Advanced Technology and the Coming New Race)
I think it is almost impossible that he [Prophet Muhammad (saas)] could have known about things like the common origin of the universe, because scientists have only found out within the last few years with very complicated and advanced technological methods that this is the case. Somebody who did not know something about nuclear physics 1400 years ago could not, I think, be in a position to find out from his own mind for instance that the earth and the heavens had the same origin, or many others of the questions that we have discussed here. (Alfred Kroner, Professor of the Department of Geosciences, University of Mainz, Germany. One of the world's most famous geologists)
Harun Yahya (Allah's Miracles in the Qur'an)
Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes.
Mariana Mazzucato
Some researchers, such as psychologist Jean Twenge, say this new world where compliments are better than sex and pizza, in which the self-enhancing bias has been unchained and allowed to gorge unfettered, has led to a new normal in which the positive illusions of several generations have now mutated into full-blown narcissism. In her book The Narcissism Epidemic, Twenge says her research shows that since the mid-1980s, clinically defined narcissism rates in the United States have increased in the population at the same rate as obesity. She used the same test used by psychiatrists to test for narcissism in patients and found that, in 2006, one in four U.S. college students tested positive. That’s real narcissism, the kind that leads to diagnoses of personality disorders. In her estimation, this is a dangerous trend, and it shows signs of acceleration. Narcissistic overconfidence crosses a line, says Twenge, and taints those things improved by a skosh of confidence. Over that line, you become less concerned with the well-being of others, more materialistic, and obsessed with status in addition to losing all the restraint normally preventing you from tragically overestimating your ability to manage or even survive risky situations. In her book, Twenge connects this trend to the housing market crash of the mid-2000s and the stark increase in reality programming during that same decade. According to Twenge, the drive to be famous for nothing went from being strange to predictable thanks to a generation or two of people raised by parents who artificially boosted self-esteem to ’roidtastic levels and then released them into a culture filled with new technologies that emerged right when those people needed them most to prop up their self-enhancement biases. By the time Twenge’s research was published, reality programming had spent twenty years perfecting itself, and the modern stars of those shows represent a tiny portion of the population who not only want to be on those shows, but who also know what they are getting into and still want to participate. Producers with the experience to know who will provide the best television entertainment to millions then cull that small group. The result is a new generation of celebrities with positive illusions so robust and potent that the narcissistic overconfidence of the modern American teenager by comparison is now much easier to see as normal.
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)
Brand ran the Whole Earth Truck Store, which began as a roving truck that sold useful tools and educational materials, and in 1968 he decided to extend its reach with the Whole Earth Catalog. On its first cover was the famous picture of Earth taken from space; its subtitle was “Access to Tools.” The underlying philosophy was that technology could be our friend. Brand wrote on the first page of the first edition, “A realm of intimate, personal power is developing—power of the individual to conduct his own education, find his own inspiration, shape his own environment, and share his adventure with whoever is interested. Tools that aid this process are sought and promoted by the Whole Earth Catalog.” Buckminster Fuller followed with a poem that began: “I see God in the instruments and mechanisms that work reliably.
Walter Isaacson (Steve Jobs)
The expansion of cultures can also be tracked by following the waft of alcohol. Commenting on the settling of the American frontier, Mark Twain famously characterized whiskey as the “earliest pioneer of civilization,” ahead of the railway, newspaper, and missionary. By far the most technologically advanced and valuable artifacts found in early European settlements in the New World were copper stills, imported at great cost and worth more than their weight in gold. As the writer Michael Pollan has argued, Johnny Appleseed, whom American mythology now portrays as intent on spreading the gift of wholesome, vitamin-filled apples to hungry settlers, was in fact “the American Dionysus,” bringing badly needed alcohol to the frontier. Johnny’s apples, so desperately sought out by American homesteaders, were not meant to be eaten at the table, but rather used to make cider and “applejack” liquor.
Edward Slingerland (Drunk: How We Sipped, Danced, and Stumbled Our Way to Civilization)
Bertrand Russell famously said: “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” [but] Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset. We care about whether our creation story, our founding legends, our theories of invisible nutrients and germs and forces, our conceptions of the powerful, our suspicions about our enemies, are true or false. That’s because we have the tools to get answers to these questions, or at least to assign them warranted degrees of credence. And we have a technocratic state that should, in theory, put these beliefs into practice. But as desirable as that creed is, it is not the natural human way of believing. In granting an imperialistic mandate to the reality mindset to conquer the universe of belief and push mythology to the margins, we are the weird ones—or, as evolutionary social scientists like to say, the WEIRD ones: Western, Educated, Industrialized, Rich, Democratic. At least, the highly educated among us are, in our best moments. The human mind is adapted to understanding remote spheres of existence through a mythology mindset. It’s not because we descended from Pleistocene hunter-gatherers specifically, but because we descended from people who could not or did not sign on to the Enlightenment ideal of universal realism. Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
Steven Pinker (Rationality: What It Is, Why It Seems Scarce, Why It Matters)
This waking dream we call the internet also blurs the difference between my serious thoughts and my playful thoughts, or to put it more simply: I no longer can tell when I am working and when I am playing online. For some people the disintegration between these two realms marks all that is wrong with the internet: It is the high-priced waster of time. It breeds trifles and turns superficialities into careers. Jeff Hammerbacher, a former Facebook engineer, famously complained that the “best minds of my generation are thinking about how to make people click ads.” This waking dream is viewed by some as an addictive squandering. On the contrary, I cherish a good wasting of time as a necessary precondition for creativity. More important, I believe the conflation of play and work, of thinking hard and thinking playfully, is one of the greatest things this new invention has done. Isn’t the whole idea that in a highly evolved advanced society work is over?
Kevin Kelly (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future)
The demographic ageing of Europe and other leading industrial countries is multiplied by the economic burden of immigration. For the time being, we can still hold out, but this will not last. The lack of active workers, the burden of retirees and the expenses of healthcare will end, from 2005-2010, with burdening European economies with debt. Gains in productivity and technological advances (the famous ‘primitive accumulation of fixed capital’, the economists’ magic cure) will never be able to match the external demographic costs. Lastly, far from compensating for the losses of the working-age native-born population, the colonising immigration Europe is experiencing involves first of all welfare recipients and unskilled workers. In addition, this immigration represents a growing expense (insecurity, the criminal economy, urban policies, etc.). An economic collapse of Europe, the world’s leading commercial power, would drag down with it the United States and the entire Western economy.
Guillaume Faye (Convergence of Catastrophes)
THE FOUNDING PROPHET of modern antihumanism was Thomas Malthus (1766–1834). For three decades a professor at the British East India Company’s East India College, Malthus was a political economist who famously argued that human reproduction always outruns available resources. This doctrine served to rationalize the starvation of millions caused by his employer’s policy of brutal oppression of the peasants of the Indian subcontinent. The British Empire’s colonial helots, however, were not Malthus’s only targets. Rather, his Essay on the Principle of Population (first published in 1798 and later expanded in numerous further editions) was initially penned as a direct attack on such Enlightenment revolutionaries as William Godwin and the Marquis de Condorcet, who advanced the notion that human liberty, expanding knowledge, and technological progress could ultimately make possible a decent life for all mankind. Malthus prescribed specific policies to keep population down by raising the death rate:
Robert Zubrin (Merchants of Despair: Radical Environmentalists, Criminal Pseudo-Scientists, and the Fatal Cult of Antihumanism)
From every direction, the place is under assault—and unlike in the past, the adversary is not concentrated in a single force, such as the Bureau of Reclamation, but takes the form of separate outfits conducting smaller attacks that are, in many ways, far more insidious. From directly above, the air-tour industry has succeeded in scuttling all efforts to dial it back, most recently through the intervention of Arizona’s senators, John Kyl and John McCain, and is continuing to destroy one of the canyon’s greatest treasures, which is its silence. From the east has come a dramatic increase in uranium-mining claims, while the once remote and untrammeled country of the North Rim now suffers from an ever-growing influx of recreational ATVs. On the South Rim, an Italian real estate company recently secured approval for a massive development whose water demands are all but guaranteed to compromise many of the canyon’s springs, along with the oases that they nourish. Worst of all, the Navajo tribe is currently planning to cooperate in constructing a monstrous tramway to the bottom of the canyon, complete with a restaurant and a resort, at the confluence of the Little Colorado and the Colorado, the very spot where John Wesley Powell made his famous journal entry in the summer of 1869 about venturing “down the Great Unknown.” As vexing as all these things are, what Litton finds even more disheartening is the country’s failure to rally to the canyon’s defense—or for that matter, to the defense of its other imperiled natural wonders. The movement that he and David Brower helped build is not only in retreat but finds itself the target of bottomless contempt. On talk radio and cable TV, environmentalists are derided as “wackos” and “extremists.” The country has swung decisively toward something smaller and more selfish than what it once was, and in addition to ushering in a disdain for the notion that wilderness might have a value that extends beyond the metrics of economics or business, much of the nation ignorantly embraces the benefits of engineering and technology while simultaneously rejecting basic science.
Kevin Fedarko (The Emerald Mile: The Epic Story of the Fastest Ride in History Through the Heart of the Grand Canyon)
Evidently Nehru, though a nationalist at the political level, was intellectually and emotionally drawn to the Indus civilization by his regard for internationalism, secularism, art, technology and modernity. By contrast, Nehru’s political rival, Muhammad Ali Jinnah, the founder of Pakistan, neither visited Mohenjo-daro nor commented on the significance of the Indus civilization. Nor did Nehru’s mentor, Mohandas Karamchand Gandhi, India’s greatest nationalist leader. In Jinnah’s case, this silence is puzzling, given that the Indus valley lies in Pakistan and, moreover, Jinnah himself was born in Karachi, in the province of Sindh, not so far from Mohenjo-daro. In Gandhi’s case, the silence is even more puzzling. Not only was Gandhi, too, an Indus dweller, so to speak, having been born in Gujarat, in Saurashtra, but he must surely also have become aware in the 1930s of the Indus civilization as the potential origin of Hinduism, plus the astonishing revelation that it apparently functioned without resort to military violence. Yet, there is not a single comment on the Indus civilization in the one hundred large volumes of the Collected Works of Mahatma Gandhi. The nearest he comes to commenting is a touching remark recorded by the Mahatma’s secretary when the two of them visited the site of Marshall’s famous excavations at Taxila, in northern Punjab, in 1938. On being shown a pair of heavy silver ancient anklets by the curator of the Taxila archaeological museum, ‘Gandhiji with a deep sigh remarked: “Just like what my mother used to wear.
Andrew Robinson (The Indus)
As part of his long-winded bullshit, Baby fell into a genre trope that he had avoided in his first two novels. He started inventing new words. This was a common habit amongst Science Fiction writers. They couldn’t help themselves. They were always inventing new words. Perhaps the most famous example of a Science Fiction writer inventing a new word occurs in Robert Heinlein’s Stranger in a Strange Land. Part of Heinlein’s vision of horny decentralized alien sex involves the Martian word grok. To grok something is to comprehend that something with effortless and infinite intuition. When you grok something, that something becomes a part of you and you become a part of that something without any troublesome Earthling attempts at knowing. A good example of groking something is the way that members of the social construct of the White race had groked their own piglet pink. They’d groked their skin color so much that it became invisible. It had become part of them and they had become part of it. That was groking. People in the San Francisco Bay Area, especially those who worked in technology like Erik Willems, loved to talk about groking. With time, their overusage stripped away the original meaning and grok became synonymous with simple knowledge of a thing. In a weird way, people in the Bay Area who used the word grok did not grok the word grok. Baby had always been popular with people on the Internet, which was a wonderful place to deny climate change, willfully misinterpret the Bible, and denounce Darwin’s theory of evolution. Now that Baby had coined nonsense neologisms, he had become more than popular. He had become quotable.
Jarett Kobek (I Hate the Internet)
Moore’s Law, the rule of thumb in the technology industry, tells us that processor chips—the small circuit boards that form the backbone of every computing device—double in speed every eighteen months. That means a computer in 2025 will be sixty-four times faster than it is in 2013. Another predictive law, this one of photonics (regarding the transmission of information), tells us that the amount of data coming out of fiber-optic cables, the fastest form of connectivity, doubles roughly every nine months. Even if these laws have natural limits, the promise of exponential growth unleashes possibilities in graphics and virtual reality that will make the online experience as real as real life, or perhaps even better. Imagine having the holodeck from the world of Star Trek, which was a fully immersive virtual-reality environment for those aboard a ship, but this one is able to both project a beach landscape and re-create a famous Elvis Presley performance in front of your eyes. Indeed, the next moments in our technological evolution promise to turn a host of popular science-fiction concepts into science facts: driverless cars, thought-controlled robotic motion, artificial intelligence (AI) and fully integrated augmented reality, which promises a visual overlay of digital information onto our physical environment. Such developments will join with and enhance elements of our natural world. This is our future, and these remarkable things are already beginning to take shape. That is what makes working in the technology industry so exciting today. It’s not just because we have a chance to invent and build amazing new devices or because of the scale of technological and intellectual challenges we will try to conquer; it’s because of what these developments will mean for the world.
Eric Schmidt (The New Digital Age: Reshaping the Future of People, Nations and Business)
Every day, the markets were driven less directly by human beings and more directly by machines. The machines were overseen by people, of course, but few of them knew how the machines worked. He knew that RBC’s machines—not the computers themselves, but the instructions to run them—were third-rate, but he had assumed it was because the company’s new electronic trading unit was bumbling and inept. As he interviewed people from the major banks on Wall Street, he came to realize that they had more in common with RBC than he had supposed. “I’d always been a trader,” he said. “And as a trader you’re kind of inside a bubble. You’re just watching your screens all day. Now I stepped back and for the first time started to watch other traders.” He had a good friend who traded stocks at a big-time hedge fund in Stamford, Connecticut, called SAC Capital. SAC Capital was famous (and soon to be infamous) for being one step ahead of the U.S. stock market. If anyone was going to know something about the market that Brad didn’t know, he figured, it would be them. One spring morning he took the train up to Stamford and spent the day watching his friend trade. Right away he saw that, even though his friend was using technology given to him by Goldman Sachs and Morgan Stanley and the other big firms, he was experiencing exactly the same problem as RBC: The market on his screens was no longer the market. His friend would hit a button to buy or sell a stock and the market would move away from him. “When I see this guy trading and he was getting screwed—I now see that it isn’t just me. My frustration is the market’s frustration. And I was like, Whoa, this is serious.” Brad’s problem wasn’t just Brad’s problem. What people saw when they looked at the U.S. stock market—the numbers on the screens of the professional traders, the ticker tape running across the bottom of the CNBC screen—was an illusion. “That’s when I realized the markets are rigged. And I knew it had to do with the technology. That the answer lay beneath the surface of the technology. I had absolutely no idea where. But that’s when the lightbulb went off that the only way I’m going to find out what’s going on is if I go beneath the surface.
Michael Lewis (Flash Boys: A Wall Street Revolt)
A famous British writer is revealed to be the author of an obscure mystery novel. An immigrant is granted asylum when authorities verify he wrote anonymous articles critical of his home country. And a man is convicted of murder when he’s connected to messages painted at the crime scene. The common element in these seemingly disparate cases is “forensic linguistics”—an investigative technique that helps experts determine authorship by identifying quirks in a writer’s style. Advances in computer technology can now parse text with ever-finer accuracy. Consider the recent outing of Harry Potter author J.K. Rowling as the writer of The Cuckoo’s Calling , a crime novel she published under the pen name Robert Galbraith. England’s Sunday Times , responding to an anonymous tip that Rowling was the book’s real author, hired Duquesne University’s Patrick Juola to analyze the text of Cuckoo , using software that he had spent over a decade refining. One of Juola’s tests examined sequences of adjacent words, while another zoomed in on sequences of characters; a third test tallied the most common words, while a fourth examined the author’s preference for long or short words. Juola wound up with a linguistic fingerprint—hard data on the author’s stylistic quirks. He then ran the same tests on four other books: The Casual Vacancy , Rowling’s first post-Harry Potter novel, plus three stylistically similar crime novels by other female writers. Juola concluded that Rowling was the most likely author of The Cuckoo’s Calling , since she was the only one whose writing style showed up as the closest or second-closest match in each of the tests. After consulting an Oxford linguist and receiving a concurring opinion, the newspaper confronted Rowling, who confessed. Juola completed his analysis in about half an hour. By contrast, in the early 1960s, it had taken a team of two statisticians—using what was then a state-of-the-art, high-speed computer at MIT—three years to complete a project to reveal who wrote 12 unsigned Federalist Papers. Robert Leonard, who heads the forensic linguistics program at Hofstra University, has also made a career out of determining authorship. Certified to serve as an expert witness in 13 states, he has presented evidence in cases such as that of Christopher Coleman, who was arrested in 2009 for murdering his family in Waterloo, Illinois. Leonard testified that Coleman’s writing style matched threats spray-painted at his family’s home (photo, left). Coleman was convicted and is serving a life sentence. Since forensic linguists deal in probabilities, not certainties, it is all the more essential to further refine this field of study, experts say. “There have been cases where it was my impression that the evidence on which people were freed or convicted was iffy in one way or another,” says Edward Finegan, president of the International Association of Forensic Linguists. Vanderbilt law professor Edward Cheng, an expert on the reliability of forensic evidence, says that linguistic analysis is best used when only a handful of people could have written a given text. As forensic linguistics continues to make headlines, criminals may realize the importance of choosing their words carefully. And some worry that software also can be used to obscure distinctive written styles. “Anything that you can identify to analyze,” says Juola, “I can identify and try to hide.
Anonymous
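The tests described in the passage above (sequences of adjacent words, sequences of characters, tallies of the most common words, and a preference for long or short words) are standard stylometric features. The following is a minimal sketch of that idea in Python; it is a toy illustration, not Juola's actual software, and the function names, the choice of word bigrams and character 4-grams, and the use of cosine similarity are assumptions made purely for this example.

```python
from collections import Counter
import math


def fingerprint(text, top_n=50):
    """Build a crude stylistic feature vector for one text."""
    words = text.lower().split()
    if not words:
        return Counter()
    joined = " ".join(words)

    features = Counter()

    # Test 1: relative frequency of the most common words
    for w, c in Counter(words).most_common(top_n):
        features["word:" + w] = c / len(words)

    # Test 2: sequences of adjacent words (word bigrams)
    bigrams = list(zip(words, words[1:]))
    for bg, c in Counter(bigrams).most_common(top_n):
        features["bigram:" + " ".join(bg)] = c / max(len(bigrams), 1)

    # Test 3: sequences of adjacent characters (character 4-grams)
    grams = [joined[i:i + 4] for i in range(len(joined) - 3)]
    for g, c in Counter(grams).most_common(top_n):
        features["char:" + g] = c / max(len(grams), 1)

    # Test 4: preference for long or short words
    features["avg_word_len"] = sum(len(w) for w in words) / len(words)
    return features


def similarity(a, b):
    """Cosine similarity between two fingerprints (1.0 = identical style)."""
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


# Hypothetical usage: rank candidate authors against a disputed text.
# candidates = {"author_a": text_a, "author_b": text_b}
# disputed_fp = fingerprint(disputed_text)
# ranked = sorted(candidates,
#                 key=lambda n: similarity(disputed_fp, fingerprint(candidates[n])),
#                 reverse=True)
```

Under these assumptions, the candidate whose fingerprint scores closest to the disputed text is the likeliest author, which mirrors the ranking logic described above, where Rowling showed up as the closest or second-closest match in each of the tests.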
gave up on the idea of creating “socialist men and women” who would work without monetary incentives. In a famous speech he criticized “equality mongering,” and thereafter not only did different jobs get paid different wages but also a bonus system was introduced. It is instructive to understand how this worked. Typically a firm under central planning had to meet an output target set under the plan, though such plans were often renegotiated and changed. From the 1930s, workers were paid bonuses if the output levels were attained. These could be quite high—for instance, as much as 37 percent of the wage for management or senior engineers. But paying such bonuses created all sorts of disincentives to technological change. For one thing, innovation, which took resources away from current production, risked the output targets not being met and the bonuses not being paid. For another, output targets were usually based on previous production levels. This created a huge incentive never to expand output, since this only meant having to produce more in the future, since future targets would be “ratcheted up.” Underachievement was always the best way to meet targets and get the bonus. The fact that bonuses were paid monthly also kept everyone focused on the present, while innovation is about making sacrifices today in order to have more tomorrow. Even when bonuses and incentives were effective in changing behavior, they often created other problems. Central planning was just not good at replacing what the great eighteenth-century economist Adam Smith called the “invisible hand” of the market. When the plan was formulated in tons of steel sheet, the sheet was made too heavy. When it was formulated in terms of area of steel sheet, the sheet was made too thin. When the plan for chandeliers was made in tons, they were so heavy, they could hardly hang from ceilings. By the 1940s, the leaders of the Soviet Union, even if not their admirers in the West, were well aware of these perverse incentives. The Soviet leaders acted as if they were due to technical problems, which could be fixed. For example, they moved away from paying bonuses based on output targets to allowing firms to set aside portions of profits to pay bonuses. But a “profit motive” was no more encouraging to innovation than one based on output targets. The system of prices used to calculate profits was almost completely unconnected to the value of new innovations or technology. Unlike in a market economy, prices in the Soviet Union were set by the government, and thus bore little relation to value. To more specifically create incentives for innovation, the Soviet Union introduced explicit innovation bonuses in 1946. As early as 1918, the principle had been recognized that an innovator should receive monetary rewards for his innovation, but the rewards set were small and unrelated to the value of the new technology. This changed only in 1956, when it was stipulated that the bonus should be proportional to the productivity of the innovation. However, since productivity was calculated in terms of economic benefits measured using the existing system of prices, this was again not much of an incentive to innovate. One could fill many pages with examples of the perverse incentives these schemes generated. For example, because the size of the innovation bonus fund was limited by the wage bill of a firm, this immediately reduced the incentive to produce or adopt any innovation that might have economized on labor.
Daron Acemoğlu (Why Nations Fail: FROM THE WINNERS OF THE NOBEL PRIZE IN ECONOMICS: The Origins of Power, Prosperity and Poverty)
Growth was so rapid that it took in generations of Westerners, not just Lincoln Steffens. It took in the Central Intelligence Agency of the United States. It even took in the Soviet Union’s own leaders, such as Nikita Khrushchev, who famously boasted in a speech to Western diplomats in 1956 that “we will bury you [the West].” As late as 1977, a leading academic textbook by an English economist argued that Soviet-style economies were superior to capitalist ones in terms of economic growth, providing full employment and price stability and even in producing people with altruistic motivation. Poor old Western capitalism did better only at providing political freedom. Indeed, the most widely used university textbook in economics, written by Nobel Prize–winner Paul Samuelson, repeatedly predicted the coming economic dominance of the Soviet Union. In the 1961 edition, Samuelson predicted that Soviet national income would overtake that of the United States possibly by 1984, but probably by 1997. In the 1980 edition there was little change in the analysis, though the two dates were delayed to 2002 and 2012. Though the policies of Stalin and subsequent Soviet leaders could produce rapid economic growth, they could not do so in a sustained way. By the 1970s, economic growth had all but stopped. The most important lesson is that extractive institutions cannot generate sustained technological change for two reasons: the lack of economic incentives and resistance by the elites. In addition, once all the very inefficiently used resources had been reallocated to industry, there were few economic gains to be had by fiat. Then the Soviet system hit a roadblock, with lack of innovation and poor economic incentives preventing any further progress. The only area in which the Soviets did manage to sustain some innovation was through enormous efforts in military and aerospace technology. As a result they managed to put the first dog, Leika, and the first man, Yuri Gagarin, in space. They also left the world the AK-47 as one of their legacies. Gosplan was the supposedly all-powerful planning agency in charge of the central planning of the Soviet economy. One of the benefits of the sequence of five-year plans written and administered by Gosplan was supposed to have been the long time horizon necessary for rational investment and innovation. In reality, what got implemented in Soviet industry had little to do with the five-year plans, which were frequently revised and rewritten or simply ignored. The development of industry took place on the basis of commands by Stalin and the Politburo, who changed their minds frequently and often completely revised their previous decisions. All plans were labeled “draft” or “preliminary.” Only one copy of a plan labeled “final”—that for light industry in 1939—has ever come to light. Stalin himself said in 1937 that “only bureaucrats can think that planning work ends with the creation of the plan. The creation of the plan is just the beginning. The real direction of the plan develops only after the putting together of the plan.” Stalin wanted to maximize his discretion to reward people or groups who were politically loyal, and punish those who were not. As for Gosplan, its main role was to provide Stalin with information so he could better monitor his friends and enemies. It actually tried to avoid making decisions. If you made a decision that turned out badly, you might get shot. Better to avoid all responsibility. An example of what could happen
Daron Acemoğlu (Why Nations Fail: FROM THE WINNERS OF THE NOBEL PRIZE IN ECONOMICS: The Origins of Power, Prosperity and Poverty)
G. D. Birla famously wrote to his grandson Aditya Birla, then studying at the famous Massachusetts Institute of Technology, ‘And above all, don’t be extravagant.’
Ashwin Sanghi (13 Steps to Bloody Good Wealth)
Moreover, Netflix produces exactly what it knows its customers want based on their past viewing habits, eliminating the waste of all those pilots, and only loses customers when they make a proactive decision to cancel their subscription. The more a person uses Netflix, the better Netflix gets at providing exactly what that person wants. And increasingly, what people want is the original content that is exclusive to Netflix. The legendary screenwriter William Goldman famously wrote of Hollywood, “Nobody knows anything.” To which Reed Hastings replies, “Netflix does.” And all this came about because Hastings had the insight and persistence to wait nearly a decade for Moore’s Law to turn his long-term vision from an impossible pipe dream into one of the most successful media companies in history. Moore’s Law has worked its magic many other times, enabling new technologies ranging from computer animation (Pixar) to online file storage (Dropbox) to smartphones (Apple). Each of those technologies followed the same path from pipe dream to world-conquering reality, all driven by Gordon Moore’s 1965 insight.
Reid Hoffman (Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies)
It was during the early summer of 1952 that I found myself in the small community park next to Stevens Institute of Technology. Although I had a job, I had only worked as a “soda jerk” for a little over a week before I started looking for something else. The Hoboken waterfront was still familiar to me from earlier years when I walked this way to catch the trolley or the electrified Public Service bus home from the Lackawanna Ferry Terminal. Remembering the gray-hulled Liberty Ships being fitted out for the war at these dilapidated piers, was still very much embedded in my memory. Things had not changed all that much, except that the ships that were once here were now at the bottom of the ocean, sold, or nested at one of the “National Defense Reserve Fleets.” The iconic movie On the Waterfront had not yet been filmed, and it would take another two years before Marlon Brando would stand on the same pier I was now looking down upon, from the higher level of Stevens Park. Labor problems were common during this era, but it was all new to me. I was only 17 years old, but would later remember how Marlon Brando got the stuffing kicked out of him for being a union malcontent. When they filmed the famous fight scene in On the Waterfront, it took place on a barge, tied up in the very same location that I was looking upon.
Hank Bracker
Tony Hsieh, CEO of Zappos, helped disrupt the retail space by emphasizing mastery, making the “pursuit of growth and learning” central to his corporate philosophy and famously saying: “Failure isn’t a badge of shame. It is a rite of passage.”
Peter H. Diamandis (Bold: How to Go Big, Create Wealth and Impact the World (Exponential Technology Series))
This is the Number. It is in the things we do, the people we meet, the ID cards that we carry. It's part of our identities, our credit cards, our social interactions. It takes our influences, our biases, morals, lifestyles and turns them into a massive alternate reality that no-one can escape from. It lives on our phones, in our televisions, in the cards we swipe to enter office. At its best, it’s an exact mirror of how human society actually works - all our greatness, all our petty shallowness, all our small talk and social contacts all codified and reduced and made plain. At its worst, it’s also exactly that. It’s how poor and rich and famous and desirable you are. It’s the backchannel given a name and dragged out into the limelight for everyone to see.
Yudhanjaya Wijeratne (Numbercaste)
Another of our large ambitions here is to demonstrate that our new understanding of the relationship between parts and wholes in physical reality can serve as the basis for a renewed dialogue between the two cultures of humanists-social scientists and scientists-engineers. When C. P. Snow recognized the growing gap between these two cultures in his now famous Rede Lecture in 1959, his primary concern was that the culture of humanists-social scientists might become so scientifically illiterate that it would not be able to meaningfully evaluate the uses of new technologies
Robert L. Nadeau (The Non-Local Universe: The New Physics and Matters of the Mind)
After all, as Ray Kurzweil and Terry Grossman, MD, famously wrote, all we need to do is “live long enough to live forever.”
Sergey Young (The Science and Technology of Growing Young: An Insider's Guide to the Breakthroughs that Will Dramatically Extend Our Lifespan . . . and What You Can Do Right Now)
To reverse Arthur C. Clarke’s famous adage about magic, any sufficiently familiar technology is indistinguishable from nature.
Virginia Postrel (The Fabric of Civilization: How Textiles Made the World)
Nevertheless, the Icelandic language remains a source of pride and identity. Famously, rather than adopt foreign terms for new technologies and concepts, various committees establish new Icelandic words to enter the lexicon: tölva (a mixture of “number” and “prophetess”) for computer, friðþjófur (“thief of peace”) for a pager, and skriðdreki (“crawling dragon”) for an armored tank.
Eliza Reid (Secrets of the Sprakkar: Iceland's Extraordinary Women and How They Are Changing the World)
When someone is a serial pervert, this is not news. News is when an asexual is forced to come… by an invisible neutrino weapon! So it is more probable they’ll outcry the asexual than the pervert. Especially if the first (pervert) is famous and the second is a famous wannabe (asexual).
Maria Karvouni (You Are Always Innocent)
One final possibility should be mentioned from the perspective of military activity on the Utah ranch. Recent allegations have surfaced that the Air Force Office of Special Investigations (AFOSI) engaged in several deception and disinformation operations in the 1970s, the 1980s and (presumably) in the 1990s. Many of these operations involved the simulation of “UFOs,” the manufacture of bogus evidence indicative of “extraterrestrial visitation” designed to conceal classified military technology or simply to lead investigators astray. In 2005, retired AFOSI special agent Richard Doty broke his silence to publicly acknowledge being involved in several of these “alien visitation” operations, the most famous being the disinformation campaign to persuade Albuquerque physicist Paul Bennewitz that an alien base existed in Dulce, New Mexico. The operation is described in detail in Greg Bishop’s book Project Beta: The Story of Paul Bennewitz, National Security, and the Creation of a Modern UFO Myth.
Colm A. Kelleher (Hunt for the Skinwalker: Science Confronts the Unexplained at a Remote Ranch in Utah)
literature is full of stories where humans create something in a burst of optimism and then lose control of their creation. Dr Frankenstein creates a monster only for it to escape from him and commit murder. Aza [Raskin] began to think about these stories when he talked with his friends who were engineers working for some of the most famous websites in the world. He would ask them basic questions like why their recommendation engines recommend one thing over another and, he said to me, 'They're like: we're not sure why it's recommending those things.' They're not lying - they have set up a technology that is doing things they don't fully comprehend. He always says to them: 'Isn't that exactly the moment, in the allegories, where you turn the thing off - [when] it's starting to do things you can't predict?
Johann Hari (Stolen Focus: Why You Can't Pay Attention— and How to Think Deeply Again)
Initially working out of our home in Northern California, with a garage-based lab, I wrote a one page letter introducing myself and what we had and posted it to the CEOs of twenty-two Fortune 500 companies. Within a couple of weeks, we had received seventeen responses, with invitations to meetings and referrals to heads of engineering departments. I met with those CEOs or their deputies and received an enthusiastic response from almost every individual. There was also strong interest from engineers given the task of interfacing with us. However, support from their senior engineering and product development managers was less forthcoming. We learned that many of the big companies we had approached were no longer manufacturers themselves but assemblers of components or were value-added reseller companies, who put their famous names on systems that other original equipment manufacturers (OEMs) had built. That didn't daunt us, though when helpful VPs of engineering at top-of-the-food-chain companies referred us to their suppliers, we found that many had little or no R & D capacity, were unwilling to take a risk on outside ideas, or had no room in their already stripped-down budgets for innovation. Our designs found nowhere to land. It became clear that we needed to build actual products and create an apples-to-apples comparison before we could interest potential manufacturing customers. Where to start? We created a matrix of the product areas that we believed PAX could impact and identified more than five hundred distinct market sectors-with potentially hundreds of thousands of products that we could improve. We had to focus. After analysis that included the size of the addressable market, ease of access, the cost and time it would take to develop working prototypes, the certifications and metrics of the various industries, the need for energy efficiency in the sector, and so on, we prioritized the list to fans, mixers, pumps, and propellers. We began hand-making prototypes as comparisons to existing, leading products. By this time, we were raising working capital from angel investors. It's important to note that this was during the first half of the last decade. The tragedy of September 11, 2001, and ensuing military actions had the world's attention. Clean tech and green tech were just emerging as terms, and energy efficiency was still more of a slogan than a driver for industry. The dot-com boom had busted. We'd researched venture capital firms in the late 1990s and found only seven in the United States investing in mechanical engineering inventions. These tended to be expansion-stage investors that didn't match our phase of development. Still, we were close to the famous Silicon Valley and had a few comical conversations with venture capitalists who said they'd be interested in investing-if we could turn our technology into a website. Instead, every six months or so, we drew up a budget for the following six months. Via a growing network of forward-thinking private investors who could see the looming need for dramatic changes in energy efficiency and the performance results of our prototypes compared to currently marketed products, we funded the next phase of research and business development.
Jay Harman (The Shark's Paintbrush: Biomimicry and How Nature is Inspiring Innovation)
scientist and science fiction writer Arthur C. Clarke famously observed, “Any sufficiently advanced technology is indistinguishable from magic.”
Philip E. Tetlock (Superforecasting: The Art and Science of Prediction)
As John Adams famously pointed out, political wisdom has not improved over the ages; even as technology has advanced, mankind steps on the same rakes, and the new inventions often magnify the damage. Historian Daniel Boorstin referred to the nonprogressivity of human nature and politics as “Adams’ law,” but Boorstin was far too modest, for he appended several of his own astute observations to it, among which was that technology, far from fulfilling needs and solving problems, creates needs and spreads problems. “Boorstin’s law,” then, could be formulated thus in the modern world: beware of optimism about the social and political benefits of the Internet and social media, for while technology progresses, human nature and politics do not.
William J. Bernstein (Masters of the Word: How Media Shaped History from the Alphabet to the Internet)
Westerners, not just Lincoln Steffens. It took in the Central Intelligence Agency of the United States. It even took in the Soviet Union’s own leaders, such as Nikita Khrushchev, who famously boasted in a speech to Western diplomats in 1956 that “we will bury you [the West].” As late as 1977, a leading academic textbook by an English economist argued that Soviet-style economies were superior to capitalist ones in terms of economic growth, providing full employment and price stability and even in producing people with altruistic motivation. Poor old Western capitalism did better only at providing political freedom. Indeed, the most widely used university textbook in economics, written by Nobel Prize–winner Paul Samuelson, repeatedly predicted the coming economic dominance of the Soviet Union. In the 1961 edition, Samuelson predicted that Soviet national income would overtake that of the United States possibly by 1984, but probably by 1997. In the 1980 edition there was little change in the analysis, though the two dates were delayed to 2002 and 2012. Though the policies of Stalin and subsequent Soviet leaders could produce rapid economic growth, they could not do so in a sustained way. By the 1970s, economic growth had all but stopped. The most important lesson is that extractive institutions cannot generate sustained technological change for two reasons: the lack of economic incentives and resistance by the elites. In addition, once all the very inefficiently used resources had been reallocated to industry, there were few economic gains to be had by fiat. Then the Soviet system hit a roadblock, with lack of innovation and poor economic incentives preventing any further progress. The only area in which the Soviets did manage to sustain some innovation was through enormous efforts in military and aerospace technology. As a result they managed to put the first dog, Leika, and the first man, Yuri Gagarin, in space. They also left the world the AK-47 as one of their legacies. Gosplan was the supposedly all-powerful planning agency in charge of the central planning of the Soviet economy. One of the benefits of the sequence of five-year plans written and administered by Gosplan was supposed to have been the long time horizon necessary for rational investment and innovation. In reality, what got implemented in Soviet industry had little to do with the five-year plans, which were frequently revised and rewritten or simply ignored. The development of industry took place on the basis of commands by Stalin and the Politburo, who changed their minds frequently and often completely revised their previous decisions. All plans were labeled “draft” or “preliminary.” Only one copy of a plan labeled “final”—that for light industry in 1939—has ever come to light. Stalin himself said in 1937 that “only bureaucrats can think that planning work ends with the creation of the plan. The creation of the plan is just the beginning. The real direction of the plan develops only after the putting together of the plan.” Stalin wanted to maximize his discretion to reward people or groups who were politically loyal, and punish those who were not. As for Gosplan, its main role was to provide Stalin with information so he could better monitor his friends and enemies. It actually tried to avoid making decisions. If you made a decision that turned out badly, you might get shot. Better to avoid all responsibility. An example of what could happen
Daron Acemoğlu (Why Nations Fail: FROM THE WINNERS OF THE NOBEL PRIZE IN ECONOMICS: The Origins of Power, Prosperity and Poverty)
From its earliest forms, utopian fiction has depicted imaginary just and rational societies established in opposition to exploitative worldly ones. Marx was famously reluctant to describe the utopian society that would succeed the successful proletarian revolution, describing it only in the vaguest terms in the conclusion of the Communist Manifesto. Nonetheless he affirmed its importance as an historical goal. Marx also valued technology as a vital tool of human liberation. He believed that in a just world technological innovations were the guarantors of human freedom from toil, just as they were also the means of mass enslavement in an exploitative order. These ideas were forged in Marxist thought into a story of social and technological liberation that had clear affinities with the basic stories of sf.
Edward James (The Cambridge Companion to Science Fiction)
Social media technology creates a culture in which people turn into little brand managers, using Facebook, Twitter, and text messages to create a falsely upbeat, slightly overexuberant external self that can be famous first in a small sphere and then, with luck, in a large one.
David Brooks
book The World Beyond Your Head: On Becoming an Individual in an Age of Distraction as a jumping off point, he takes care to unpack the various cultural mandates  that have infected the way we think and feel about distraction. I found his ruminations not only enlightening but surprisingly emancipating: There are two big theories about why [distraction is] on the rise. The first is material: it holds that our urbanized, high-tech society is designed to distract us… The second big theory is spiritual—it’s that we’re distracted because our souls are troubled. The comedian Louis C.K. may be the most famous contemporary exponent of this way of thinking. A few years ago, on “Late Night” with Conan O’Brien, he argued that people are addicted to their phones because “they don’t want to be alone for a second because it’s so hard.” (David Foster Wallace also saw distraction this way.) The spiritual theory is even older than the material one: in 1887, Nietzsche wrote that “haste is universal because everyone is in flight from himself”; in the seventeenth century, Pascal said that “all men’s miseries derive from not being able to sit in a quiet room alone.”… Crawford argues that our increased distractibility is the result of technological changes that, in turn, have their roots in our civilization’s spiritual commitments. Ever since the Enlightenment, he writes, Western societies have been obsessed with autonomy, and in the past few hundred years we have put autonomy at the center of our lives, economically, politically, and technologically; often, when we think about what it means to be happy, we think of freedom from our circumstances. Unfortunately, we’ve taken things too far: we’re now addicted to liberation, and we regard any situation—a movie, a conversation, a one-block walk down a city street—as a kind of prison. Distraction is a way of asserting control; it’s autonomy run amok. Technologies of escape, like the smartphone, tap into our habits of secession. The way we talk about distraction has always been a little self-serving—we say, in the passive voice, that we’re “distracted by” the Internet or our cats, and this makes us seem like the victims of our own decisions. But Crawford shows that this way of talking mischaracterizes the whole phenomenon. It’s not just that we choose our own distractions; it’s that the pleasure we get from being distracted is the pleasure of taking action and being free. There’s a glee that comes from making choices, a contentment that settles after we’ve asserted our autonomy. When
Anonymous
After the arrival of mass-produced books, we became “typographical man,” and our voices lost some power. We were encouraged by the technologies of writing and printing to take on some kinds of input and discouraged from taking on others. Today we privilege the information we take in through our eyes while reading and pay less heed to information that arrives via our other senses. In plainest terms, McLuhan delivers his famous line: “The medium is the message.” What you use to interact with the world changes the way you see the world. Every lens is a tinted lens.
Anonymous
The situation was similar in the Soviet Union, with industry playing the role of sugar in the Caribbean. Industrial growth in the Soviet Union was further facilitated because its technology was so backward relative to what was available in Europe and the United States, so large gains could be reaped by reallocating resources to the industrial sector, even if all this was done inefficiently and by force. Before 1928 most Russians lived in the countryside. The technology used by peasants was primitive, and there were few incentives to be productive. Indeed, the last vestiges of Russian feudalism were eradicated only shortly before the First World War. There was thus huge unrealized economic potential from reallocating this labor from agriculture to industry. Stalinist industrialization was one brutal way of unlocking this potential. By fiat, Stalin moved these very poorly used resources into industry, where they could be employed more productively, even if industry itself was very inefficiently organized relative to what could have been achieved. In fact, between 1928 and 1960 national income grew at 6 percent a year, probably the most rapid spurt of economic growth in history up until then. This quick economic growth was not created by technological change, but by reallocating labor and by capital accumulation through the creation of new tools and factories. Growth was so rapid that it took in generations of Westerners, not just Lincoln Steffens. It took in the Central Intelligence Agency of the United States. It even took in the Soviet Union’s own leaders, such as Nikita Khrushchev, who famously boasted in a speech to Western diplomats in 1956 that “we will bury you [the West].” As late as 1977, a leading academic textbook by an English economist argued that Soviet-style economies were superior to capitalist ones in terms of economic growth, providing full employment and price stability and even in producing people with altruistic motivation. Poor old Western capitalism did better only at providing political freedom. Indeed, the most widely used university textbook in economics, written by Nobel Prize–winner Paul Samuelson, repeatedly predicted the coming economic dominance of the Soviet Union. In the 1961 edition, Samuelson predicted that Soviet national income would overtake that of the United States possibly by 1984, but probably by 1997. In the 1980 edition there was little change in the analysis, though the two dates were delayed to 2002 and 2012. Though the policies of Stalin and subsequent Soviet leaders could produce rapid economic growth, they could not do so in a sustained way. By the 1970s, economic growth had all but stopped. The most important lesson is that extractive institutions cannot generate sustained technological change for two reasons: the lack of economic incentives and resistance by the elites. In addition, once all the very inefficiently used resources had been reallocated to industry, there were few economic gains to be had by fiat. Then the Soviet system hit a roadblock, with lack of innovation and poor economic incentives preventing any further progress. The only area in which the Soviets did manage to sustain some innovation was through enormous efforts in military and aerospace technology. As a result they managed to put the first dog, Leika, and the first man, Yuri Gagarin, in space. They also left the world the AK-47 as one of their legacies. Gosplan was the supposedly all-powerful planning agency in charge of the central planning of the Soviet economy.
Daron Acemoğlu (Why Nations Fail: FROM THE WINNERS OF THE NOBEL PRIZE IN ECONOMICS: The Origins of Power, Prosperity and Poverty)
Inventions have long since reached their limit--and I see no hope for further developments." -- Julius Frontinus, world-famous engineer (Rome, 10 AD)
Frontinus
He is famous for saying that these days he prescribes a lot more applications than medications to his patients.
Bertalan Meskó (The Guide to the Future of Medicine (2022 Edition): Technology AND The Human Touch)
THE CHASM – THE DIFFUSION MODEL WHY EVERYBODY HAS AN IPOD Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye? Sociologists describe the way in which a catchy idea or product becomes popular as ‘diffusion’. One of the most famous diffusion studies is an analysis by Bruce Ryan and Neal Gross of the diffusion of hybrid corn in the 1930s in Greene County, Iowa. The new type of corn was better than the old sort in every way, yet it took twenty-two years for it to become widely accepted. The diffusion researchers called the farmers who switched to the new corn as early as 1928 ‘innovators’, and the somewhat bigger group that was infected by them ‘early adaptors’. They were the opinion leaders in the communities, respected people who observed the experiments of the innovators and then joined them. They were followed at the end of the 1930s by the ‘sceptical masses’, those who would never change anything before it had been tried out by the successful farmers. But at some point even they were infected by the ‘hybrid corn virus’, and eventually transmitted it to the die-hard conservatives, the ‘stragglers’. Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adaptors to the sceptics, for at this point there is a ‘chasm’. According to the US sociologist Morton Grodzins, if the early adaptors succeed in getting the innovation across the chasm to the sceptical masses, the epidemic cycle reaches the tipping point. From there, the curve rises sharply when the masses accept the product, and sinks again when only the stragglers remain. With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adaptors turn away from the product as soon as the critical masses have accepted it, in search of the next new thing. The chasm model was introduced by the American consultant and author Geoffrey Moore. First they ignore you, then they laugh at you, then they fight you, then you win. Mahatma Gandhi
Mikael Krogerus (The Decision Book: 50 Models for Strategic Thinking)
We need to analyze and contemplate the experience of modernity in the Arab and Muslim world, in order to grasp what is happening. Some of us, for example, reject modernity, and yet it’s obvious that these same people are using the products of modernity, even to the extent that when proselytizing their interpretation of Islam, which conflicts with modernity, they’re employing the tools of modernity to do so. This strange phenomenon can best be understood by contemplating our basic attitude towards modernity, stemming from two centuries ago. If we analyze books written by various Muslim thinkers at the time, concerning modernity and the importance of modernizing our societies, and so forth, we can see that they distinguished between certain aspects of modernity that should be rejected, and others that may be accepted. You can find this distinction in the very earliest books that Muslim intellectuals wrote on the topic of modernity. To provide a specific example, I’ll cite an important book that is widely regarded as having been the first ever written about modern thought in the Muslim world, namely, a book by the famous Egyptian intellectual, Rifa’ Rafi’ al-Tahtawi (1801–1873), Takhlish al-Ibriz fi Talkhish Baris, whose title may be translated as Mining Gold from Its Surrounding Dross. As you can immediately grasp from its title, the book distinguishes between the “gold” contained within modernity—gold being a highly prized, expensive and rare product of mining—and its so-called “worthless” elements, which Muslims are forbidden to embrace. Now if we ask ourselves, “What elements of modernity did these early thinkers consider acceptable, and what did they demand that we reject?,” we discover that technology is the “acceptable” element of modernity. We are told that we may adopt as much technology as we want, and exploit these products of modernity to our heart’s content. But what about the modes of thought that give rise to these products, and underlie the very phenomenon of modernity itself? That is, the free exercise of reason, and critical thought? These two principles are rejected and proscribed for Muslims, who may adopt the products of modernity, while its substance, values and foundations, including its philosophical modes of thought, are declared forbidden. Shaykh Rifa’ Rafi’ al-Tahtawi explained that we may exploit knowledge that is useful for defense, warfare, irrigation, farming, etc., and yet he simultaneously forbade us to study, or utilize, the philosophical sciences that gave rise to modern thought, and the love for scientific methodologies that enlivens the spirit of modern knowledge, because he believed that they harbored religious deviance and infidelity (to God).
علي مبروك
Perhaps the most famous, if flawed, oracle of the Federal Reserve, former chairman Alan Greenspan, knew that money was something that not only central bankers could create. In a speech in 1996, just as the Cypherpunks were pushing forward with their experiments, Greenspan said that he imagined that the technological revolution could bring back the potential for private money and that it might actually be a good thing: “We could envisage proposals in the near future for issuers of electronic payment obligations, such as stored-value cards or ‘digital cash,’ to set up specialized issuing corporations with strong balance sheets and public credit ratings.”
Nathaniel Popper (Digital Gold: Bitcoin and the Inside Story of the Misfits and Millionaires Trying to Reinvent Money)
Education was still considered a privilege in England. At Oxford you took responsibility for your efforts and for your performance. No one coddled, and no one uproariously encouraged. British respect for the individual, both learner and teacher, reigned. If you wanted to learn, you applied yourself and did it. Grades were posted publicly by your name after exams. People failed regularly. These realities never ceased to bewilder those used to “democracy” without any of the responsibility. For me, however, my expectations were rattled in another way. I arrived anticipating to be snubbed by a culture of privilege, but when looked at from a British angle, I actually found North American students owned a far greater sense of entitlement when it came to a college education. I did not realize just how much expectations fetter—these “mind-forged manacles,”2 as Blake wrote. Oxford upholds something larger than self as a reference point, embedded in the deep respect for all that a community of learning entails. At my very first tutorial, for instance, an American student entered wearing a baseball cap on backward. The professor quietly asked him to remove it. The student froze, stunned. In the United States such a request would be fodder for a laundry list of wrongs done against the student, followed by threatening the teacher’s job and suing the university. But Oxford sits unruffled: if you don’t like it, you can simply leave. A handy formula since, of course, no one wants to leave. “No caps in my classroom,” the professor repeated, adding, “Men and women have died for your education.” Instead of being disgruntled, the student nodded thoughtfully as he removed his hat and joined us. With its expanses of beautiful architecture, quads (or walled lawns) spilling into lush gardens, mist rising from rivers, cows lowing in meadows, spires reaching high into skies, Oxford remained unapologetically absolute. And did I mention? Practically every college within the university has its own pub. Pubs, as I came to learn, represented far more for the Brits than merely a place where alcohol was served. They were important gathering places, overflowing with good conversation over comforting food: vital humming hubs of community in communication. So faced with a thousand-year-old institution, I learned to pick my battles. Rather than resist, for instance, the archaic book-ordering system in the Bodleian Library with technological mortification, I discovered the treasure in embracing its seeming quirkiness. Often, when the wrong book came up from the annals after my order, I found it to be right in some way after all. Oxford often works such. After one particularly serendipitous day of research, I asked Robert, the usual morning porter on duty at the Bodleian Library, about the lack of any kind of sophisticated security system, especially in one of the world’s most famous libraries. The Bodleian was not a loaning library, though you were allowed to work freely amid priceless artifacts. Individual college libraries entrusted you to simply sign a book out and then return it when you were done. “It’s funny; Americans ask me about that all the time,” Robert said as he stirred his tea. “But then again, they’re not used to having u in honour,” he said with a shrug.
Carolyn Weber (Surprised by Oxford)
astronomical fees, of course—any day now. Please don’t get sucked into that mess. In 1986, [Brooks] famously predicted that there were no silver bullets: that by 1996, no single technology or management technique would offer a tenfold increase in productivity, reliability, or simplicity. None did. Agile development isn’t a silver bullet, either. In fact, I don’t recommend adopting agile
Anonymous
Singer cited the famous essay “The Tragedy of the Commons,” in which biologist Garrett Hardin argued that individuals acting in their rational self-interest may undermine the common good, and warned against assuming that technology would save us from ourselves. “If we ignore the present warning signs and wait for an ecological disaster to strike, it will probably be too late,” Singer noted. He imagined what it must have been like to be Noah, surrounded by “complacent compatriots,” saying, “‘Don’t worry about the rising waters, Noah; our advanced technology will surely discover a substitute for breathing.’”
Naomi Oreskes (Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming)
Reid Hoffman famously said, ‘If you’re not embarrassed by the first version of your product, you’ve launched too late.’ 
Peter H. Diamandis (Bold: How to Go Big, Create Wealth and Impact the World (Exponential Technology Series))
End of May 2012 The continuation of my email to Andy: …I was delighted to return to London after war-ravaged Belfast. The students in our college had to evacuate several times due to IRA bomb threats. I must have subconsciously selected to be in Northern Ireland because of my unsettling inner upheavals. Much like the riots that went on in the city in 1971, I was unconsciously fighting my inner demons within myself. I needed that year to overcome my sexual addictions and to immerse myself in my fashion studies. By the following year, I had compiled an impressive fashion design portfolio for application with various London Art and Design colleges. Foundation students generally required two years to complete their studies. I graduated from the Belfast College of Art with flying colors within a year. By the autumn of 1972, I was accepted into the prestigious Harrow School of Art and Technology. Around that period, my father’s business was waning and my family had financial difficulty sponsoring my graduate studies. Unbeknownst to my family, I had earned sufficient money during my Harem services to comfortably put myself through college. I lied to my parents and told them I was working part-time in London to make ends meet so I could finance my fashion education. They believed my tall tale. For the next three years I put my heart and soul into my fashion projects. I would occasionally work as a waiter at the famous Rainbow Room in Biba, which is now defunct. Working at this dinner dance club was a convenient way of meeting beautiful and trendy patrons, who often visited this capricious establishment.
Young (Unbridled (A Harem Boy's Saga, #2))
The nonspecialist electric technology retribalizes. The process of upset resulting from a new distribution of skills is accompanied by much culture lag in which people feel compelled to look at new situations as if they were old ones, and come up with ideas of “population explosion” in an age of implosion. Newton, in an age of clocks, managed to present the physical universe in the image of a clock. But poets like Blake were far ahead of Newton in their response to the challenge of the clock. Blake spoke of the need to be delivered “from single vision and Newton’s sleep,” knowing very well that Newton’s response to the challenge of the new mechanism was itself merely a mechanical repetition of the challenge. Blake saw Newton and Locke and others as hypnotized Narcissus types quite unable to meet the challenge of mechanism. W. B. Yeats gave the full Blakean version of Newton and Locke in a famous epigram: Locke sank into a swoon; The garden died; God took the spinning jenny Out of his side. Yeats presents Locke, the philosopher of mechanical and lineal associationism, as hypnotized by his own image. The “garden,” or unified consciousness, ended. Eighteenth century man got an extension of himself in the form of the spinning machine that Yeats endows with its full sexual significance. Woman, herself, is thus seen as a technological extension of man’s being.
Marshall McLuhan (Understanding Media: The Extensions of Man)
Examine any famous street in Paris, Cairo, London, or New York, and you’ll find plenty of shops where you can buy clothes or coffee, have your hair styled or nails polished. But where are the shops selling the secrets to full satisfaction and a truly happy life? The Yoga Wisdom Literatures The wisdom texts of the Vedic tradition specialize in happiness. Veda means “knowledge,” and the Vedas are ancient but ageless texts containing knowledge that lead us to happiness. We learn from them that human life is meant for self-inquiry and that whatever we do should lead to self-discovery and the purification of our body, mind, and consciousness. Vedic teachers show by example how to live a more peaceful and balanced life. They don’t neglect science or technology, but instead teach us how to use them purposefully so that we can attain our full potential.
Vaiśeṣika Dāsa (The Four Questions: A Pathway to Inner Peace)
Freud to his famous reading of the Oedipus myth and the sense of the Father’s law, since it is the competition with the Father - arising as a correlate of the infant’s incestual longing for the mother - that first brings the relation between desire and survival to a crisis. Later, in the formulation of the death drive, the sacrificial character of desire is thought even more immediately, so that desire is not merely integrated structurally with a threat to existence within the oedipal triangle, but is rather related to death by the intrinsic tendency of its own economy. The intensity of the affect is now thought as inherently oriented to its own extinction, as a differentiation from death or the inorganic that is from its beginning a compulsion to return. But despite recognizing that the conscious self is a modulation of the drives, so that all psychical energy stems from the unconscious (from which ego-energy is borrowed), Freud seems to remain committed to the right of the reality principle, and its representative the ego, and thus to accept a survival (or adaptation) imperative as the principle of therapeutic practice. It is because of this basic prejudice against the claims of desire that psychoanalysis has always had a tendency to degenerate into a technology of repression that subtilizes, and therefore reinforces, the authority of the ego. In the terms both of the reality principle and the conservative moment of psychoanalysis, desire is a negative pressure working against the conservation of life, a dangerous internal onslaught against the self, tending with inexorable force towards the immolation of the individual and his civilization
Nick Land (The Thirst for Annihilation: Georges Bataille and Virulent Nihilism (An Essay in Atheistic Religion))
Aren’t fears of disappearing jobs something that people claim periodically, like with both the agricultural and industrial revolution, and it’s always wrong?” It’s true that agriculture went from 40 percent of the workforce in 1900 to 2 percent in 2017 and we nonetheless managed to both grow more food and create many wondrous new jobs during that time. It’s also true that service-sector jobs multiplied in many unforeseen ways and absorbed most of the workforce after the Industrial Revolution. People sounded the alarm of automation destroying jobs in the 19th century—the Luddites destroying textile mills in England being the most famous—as well as in the 1920s and the 1960s, and they’ve always been wildly off the mark. Betting against new jobs has been completely ill-founded at every point in the past. So why is this time different? Essentially, the technology in question is more diverse and being implemented more broadly over a larger number of economic sectors at a faster pace than during any previous time. The advent of big farms, tractors, factories, assembly lines, and personal computers, while each a very big deal for the labor market, were orders of magnitude less revolutionary than advancements like artificial intelligence, machine learning, self-driving vehicles, advanced robotics, smartphones, drones, 3D printing, virtual and augmented reality, the Internet of things, genomics, digital currencies, and nanotechnology. These changes affect a multitude of industries that each employ millions of people. The speed, breadth, impact, and nature of the changes are considerably more dramatic than anything that has come before.
Andrew Yang (The War on Normal People: The Truth About America's Disappearing Jobs and Why Universal Basic Income Is Our Future)
Franklin, "the most accomplished American of his age and the most influential in inventing the type of society America would become."[4] Franklin became a newspaper editor, printer, and merchant in Philadelphia, becoming very wealthy, writing and publishing Poor Richard's Almanack and The Pennsylvania Gazette. Franklin was interested in science and technology, and gained international renown for his famous experiments. He played a major role in establishing the University of Pennsylvania and Franklin & Marshall College and was elected the first president of the American Philosophical Society. Franklin became a national hero in America when he spearheaded the effort to have Parliament repeal the unpopular Stamp Act. An accomplished diplomat, he was widely admired among the French as American minister to Paris and was a major figure in the development of positive Franco-American relations.
Benjamin Franklin (The Articles of Confederation)
The insatiable need for more processing power -- ideally located as close as possible to the user but, at the very least, in nearby industrial server farms -- invariably leads to a third option: decentralized computing. With so many powerful and often inactive devices in the homes and hands of consumers, near other homes and hands, it feels inevitable that we'd develop systems to share in their mostly idle processing power. "Culturally, at least, the idea of collectively shared but privately owned infrastructure is already well understood. Anyone who installs solar panels at their home can sell excess power to their local grid (and, indirectly, to their neighbor). Elon Musk touts a future in which your Tesla earns you rent as a self-driving car when you're not using it yourself -- better than just being parked in your garage for 99% of its life. "As early as the 1990s programs emerged for distributed computing using everyday consumer hardware. One of the most famous examples is the University of California, Berkeley's SETI@HOME, wherein consumers would volunteer use of their home computers to power the search for alien life. Sweeney has highlighted that one of the items on his 'to-do list' for the first-person shooter Unreal Tournament 1, which shipped in 1998, was 'to enable game servers to talk to each other so we can just have an unbounded number of players in a single game session.' Nearly 20 years later, however, Sweeney admitted that goal 'seems to still be on our wish list.' "Although the technology to split GPUs and share non-data center CPUs is nascent, some believe that blockchains provide both the technological mechanism for decentralized computing as well as its economic model. The idea is that owners of underutilized CPUs and GPUs would be 'paid' in some cryptocurrency for the use of their processing capabilities. There might even be a live auction for access to these resources, either those with 'jobs' bidding for access or those with capacity bidding on jobs. "Could such a marketplace provide some of the massive amounts of processing capacity that will be required by the Metaverse? Imagine, as you navigate immersive spaces, your account continuously bidding out the necessary computing tasks to mobile devices held but unused by people near you, perhaps people walking down the street next to you, to render or animate the experiences you encounter. Later, when you're not using your own devices, you would be earning tokens as they return the favor. Proponents of this crypto-exchange concept see it as an inevitable feature of all future microchips. Every computer, no matter how small, would be designed to be auctioning off any spare cycles at all times. Billions of dynamically arrayed processors will power the deep compute cycles of even the largest industrial customers and provide the ultimate and infinite computing mesh that enables the Metaverse.
Matthew Ball
The famous feminist fallacy version of Firestone also requires that we forget her repeated proviso that without a revolutionary transformation of society’s views of gender, kinship, and marriage new reproductive technologies would be more likely to further subordinate women than to liberate them (“to envision it in the hands of the present powers is to envision a nightmare,” she cautioned). As Debora Halbert points out in a more careful reading of The Dialectic of Sex on the question of technology, Firestone clearly articulated [that] the problem is not [reproductive] technology but the underlying sex-roles that it may or may not reproduce . . . [T]echnology alone will not liberate women and men, instead there must be a transformation in the way sex-roles are understood, a transformation that can only take place if technology is used to give women choices other than childrearing.
Mandy Merck (Further Adventures of The Dialectic of Sex: Critical Essays on Shulamith Firestone (Breaking Feminist Waves))
In the same way that Firestone’s embrace of scientific and technological progress as manifest destiny tips its hat to Marx and Engels, so also it resembles (perhaps even more closely) the Marxist-inspired biofuturism of the interwar period, particularly in Britain, in the work of writers such as H. G. Wells, J. B. S. Haldane, J. D. Bernal, Julian Huxley, Conrad Waddington, and their contemporaries (including Gregory Bateson and Joseph Needham, the latter of whose embryological interests led to his enduring fascination with the history of technology in China). Interestingly, it is also in these early twentieth century writings that ideas about artificial reproduction, cybernation, space travel, genetic modification, and ectogenesis abound. As cultural theorist Susan Squier has demonstrated, debates about ectogenesis were crucial to both the scientific ambitions and futuristic narratives of many of the United Kingdom’s most eminent biologists from the 1920s and the 1930s onward. As John Burdon Sanderson (“Jack”) Haldane speculated in his famous 1923 paper “Daedalus, or Science and the Future” (originally read to the Heretics society in Cambridge) ectogenesis could provide a more efficient and rational basis for human reproduction in the future: [W]e can take an ovary from a woman, and keep it growing in a suitable fluid for as long as twenty years, producing a fresh ovum each month, of which 90 per cent can be fertilized, and the embryos grown successfully for nine months, and then brought out into the air.
Mandy Merck (Further Adventures of The Dialectic of Sex: Critical Essays on Shulamith Firestone (Breaking Feminist Waves))
The strongest argument for gene editing cane toads, house mice, and ship rats is also the simplest: what's the alternative? Rejecting such technologies as unnatural isn't going to bring nature back. The choice is not between what was and what is, but between what is and what will be, which often enough is nothing. This is the situation of the Devil's Hole pupfish, the Shoshone pupfish, and the Pahrump poolfish, of the northern quoll, the Campbell Island teal, and the Tristan albatross. Stick to a strict interpretation of the natural and these--along with thousands of other species--are goners. The issue, at this point, is not whether we're going to alter nature, but to what end? "We are as gods and might as well get good at it," Stewart Brand, editor of the Whole Earth Catalog, famously wrote in its first issue, published in 1968. Recently, in response to the whole-earth transformation that's under way, Brand has sharpened his statement: "We are as gods and we have to get good at it.
Elizabeth Kolbert (Under a White Sky: The Nature of the Future)
...by the late 2000s, it seemed like a sucker's bet to try to make a living as an inventor in the classic sense, by creating useful and original things... the country's most famous inventors were inventing things of dubious merit, generating enormous wealth for a few by hawking gadgets to the many. In the San Francisco Bay Area, as America's coal-fired power plants continued to soak the atmosphere with gunk, as dysfunction snarled Congress and the roads and bridges chipped and cracked, as twelve million searched in vain for jobs and the economies of entire towns ran on food stamps, the best and brightest trilled about the awesomeness of their smartphone apps. Twitter, Facebook, Instagram, Angry Birds, Summly, Wavii: software to entertain, encapsulate, package, distract. Silicon Valley: a place that has made many useful things and created enormous wealth and transformed the way we live and where many are now working to build a virtual social layer atop the real corroding world.
Jason Fagone (Ingenious: A True Story of Invention, Automotive Daring, and the Race to Revive America)
All right, as a gambler I liked the idea of Betfair, because it offered better odds than the bookmakers and, if it was successful, it would be a thorn in their side. Every punter loves to hate the bookies. But, then again, I am not a follower of Victor Kiam, who famously bought Remington because he liked shaving with its razor. I never buy into a company because I like its product. For the first couple of years that I was a part-owner and director of Betfair, I didn’t register as a user. To be honest, I’m not very good with technology and I didn’t know how to go online and bet. I remember being ridiculed by some of my fellow Betfair directors when I remarked, nearly three years after making my investment, that I had just started using the site and found it impressive. How could anyone invest in their baby without giving it an extensive road-test first, without understanding how to use it? Wilful ignorance is one of my best investment tools. I don’t want to know too much before making an investment. I don’t want to cloud my judgement, or make the decision difficult. I don’t want to know about all the risks or understand them. I just want to be reasonably sure that it’s a star business. That makes life simple and fun. And profitable.
Richard Koch (The Star Principle: How it can make you rich)
At the end of 1999 I was the editor of Time, and we made a somewhat offbeat decision to make Bezos our Person of the Year, even though he wasn’t a famous world leader or statesman. I had the theory that the people who affect our lives the most are often the people in business and technology who, at least early in their careers, aren’t often found on the front pages. For example, we had made Andy Grove of Intel the Person of the Year at the end of 1997 because I felt the explosion of the microchip was changing our society more than any prime minister or president or treasury secretary. But as the publication date of our Bezos issue neared in December 1999, the air was starting to go out of the dot.com bubble. I was worried—correctly—that internet stocks, such as Amazon, would start to collapse. So I asked the CEO of Time Inc., the very wise Don Logan, whether I was making a mistake by choosing Bezos and would look silly in years to come if the internet economy deflated. No, Don told me. “Stick with your choice. Jeff Bezos is not in the internet business. He’s in the customer-service business. He will be around for decades to come, well after people have forgotten all the dot.coms that are going to go bust.
Jeff Bezos (Invent and Wander: The Collected Writings of Jeff Bezos)
In 1930, in a speech titled “Economic Possibilities for Our Grandchildren,” the economist John Maynard Keynes made a famous prediction: Within a century, thanks to the growth of wealth and the advance of technology, no one would have to work more than about fifteen hours a week. The challenge would be how to fill all our newfound leisure time without going crazy. “For the first time since his creation,” Keynes told his audience, “man will be faced with his real, his permanent problem—how to use his freedom from pressing economic cares.” But Keynes was wrong. It turns out that when people make enough money to meet their needs, they just find new things to need and new lifestyles to aspire to; they never quite manage to keep up with the Joneses, because whenever they’re in danger of getting close, they nominate new and better Joneses with whom to try to keep up. As a result, they work harder and harder, and soon busyness becomes an emblem of prestige. Which is clearly completely absurd: for almost the whole of history, the entire point of being rich was not having to work so much. Moreover,
Oliver Burkeman (Four Thousand Weeks: Time Management for Mortals)
the roughly $800 billion in available stimulus, we directed more than $90 billion toward clean energy initiatives across the country. Within a year, an Iowa Maytag plant I’d visited during the campaign that had been shuttered because of the recession was humming again, with workers producing state-of-the-art wind turbines. We funded construction of one of the world’s largest wind farms. We underwrote the development of new battery storage systems and primed the market for electric and hybrid trucks, buses, and cars. We financed programs to make buildings and businesses more energy efficient, and collaborated with Treasury to temporarily convert the existing federal clean energy tax credit into a direct-payments program. Within the Department of Energy, we used Recovery Act money to launch the Advanced Research Projects Agency–Energy (ARPA-E), a high-risk, high-reward research program modeled after DARPA, the famous Defense Department effort launched after Sputnik that helped develop not only advanced weapons systems like stealth technology but also an early iteration of the internet, automated voice activation, and GPS.
Barack Obama (A Promised Land)
Those Minecraftians and their junk… At least, I figured it was Minecraftian junk. I had never personally met one of the creatures before. From what I’d heard in my training and tales from other Endermen, the Minecraftians were small and weak, but were intelligent, and were able to transform the Overworld into tools, armor, and other technology that made them stronger. The older Endermen told me stories about the famous Steve, as well as other Minecraftians that came and went frequently on the Overworld. We even saw a Minecraftian or two appear every once and a while on the dragon’s island, stuck on our world because of dabbling with portal technology they didn’t understand.
Skeleton Steve (Diary of an Enderman Ninja, Book 1 (Diary of an Enderman Ninja #1))
The first thing companies did with computer technology back in the 1980s was to multiply the number of choices for their customers. More colors, more styles, more features, more models, more messages, more channels, more services, more brand extensions, more SKUs. The siren call of “consumer choice” proved impossible for companies to resist. If a little choice was good, they reasoned, more choice was better. Customers loved it. For about 15 minutes. Today their lives are so cluttered by choice that they can barely breathe. Americans now see that a little choice increases their freedom, but too much takes it away. Do you really want to spend three hours learning how to use the features on your new Samsung TV? Or sort through 17 varieties each time you buy Crest toothpaste at the supermarket? Or deal with the 3,000 pages of items shown in Restoration Hardware’s 15-pound set of catalogs? Not if you have a life. Of course, none of us wants to give up this lavish banquet of choice. We just want it off the floor and out of the way. “It’s not information over-load,” media consultant Clay Shirky famously said. “It’s filter failure.” Our brains can’t handle the deluge. We’re desperate for a way to organize, access, and make use of so many options. Amazon founder Jeff Bezos called it “cognitive overhead.
Marty Neumeier (Brand Flip, The: Why customers now run companies and how to profit from it (Voices That Matter))
In a famous speech he criticized “equality mongering,” and thereafter not only did different jobs get paid different wages but also a bonus system was introduced. It is instructive to understand how this worked. Typically a firm under central planning had to meet an output target set under the plan, though such plans were often renegotiated and changed. From the 1930s, workers were paid bonuses if the output levels were attained. These could be quite high—for instance, as much as 37 percent of the wage for management or senior engineers. But paying such bonuses created all sorts of disincentives to technological change. For one thing, innovation, which took resources away from current production, risked the output targets not being met and the bonuses not being paid. For another, output targets were usually based on previous production levels. This created a huge incentive never to expand output, since this only meant having to produce more in the future, since future targets would be “ratcheted up.” Underachievement was always the best way to meet targets and get the bonus. The fact that bonuses were paid monthly also kept everyone focused on the present, while innovation is about making sacrifices today in order to have more tomorrow.
Daron Acemoğlu (Why Nations Fail: The Origins of Power, Prosperity, and Poverty)
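The incentive problem Acemoğlu describes lends itself to a small worked example. The sketch below is a toy model of my own, not anything from the book: it assumes each period's output target is simply the highest output achieved so far and that a 37 percent bonus (the figure mentioned in the quote) is paid only when the target is met, which is enough to show why overproducing today makes the bonus harder to earn tomorrow.

```python
# Toy model of the "ratchet effect" described in the quote above (illustrative only):
# the output target ratchets up to the best output seen so far, and a bonus is paid
# only in periods where the target is met.

def total_bonus(outputs, initial_target=100, bonus=0.37):
    """Sum of bonuses (as a fraction of wage per period) under a ratcheted target."""
    target, earned = initial_target, 0.0
    for produced in outputs:
        if produced >= target:          # bonus only if this period's target is met
            earned += bonus
        target = max(target, produced)  # next period's target ratchets upward
    return earned

# A manager who merely meets the target every year collects the bonus every year,
# while one who overproduces early raises the bar and then misses it afterward.
print(total_bonus([100, 100, 100, 100]))  # 1.48 (four bonuses)
print(total_bonus([120, 100, 100, 100]))  # 0.37 (one bonus, then the bar is 120)
```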
I tell him that such stories make me nervous for the future in the face of advances in AI. Garry shakes his head. “I’m more optimistic about the future of humanity,” he says. “You’re not worried about AI taking over?” “Why should I be? I was the first knowledge worker whose job was threatened by machines,” he says. I laugh. It’s true. In 1997, Garry was famously beaten in a chess match by IBM’s supercomputer Deep Blue. “I think it’s wrong to cry about progress,” he says. “The future is not humans fighting machines. The future is humans collaborating with machines. Every technology in history destroyed jobs, but it also created new ones.”
A.J. Jacobs (The Puzzler: One Man's Quest to Solve the Most Baffling Puzzles Ever, from Crosswords to Jigsaws to the Meaning of Life)
The advent of the first ledger technology can be traced back to roughly 3000 BCE, in ancient Mesopotamia (modern-day Iraq). Of the tens of thousands of clay tablets the Mesopotamians left behind, most are, well, ledgers: records of taxes, payments, private wealth, worker pay. The famous Code of Hammurabi—the Babylonians’ system of law—was written on one of these ledgers, but most of the kings had their own rules set out as well. The rise of these ledgers matched the rise of the first large-scale civilizations. Why have ledgers been so important throughout history? Exchanges of goods and services have defined the expansion of societies, but this was possible only if people could keep track of the exchanges. It wasn’t so difficult for everyone in a small village to remember that someone had killed a pig and to trust—a word we’ll encounter throughout this book—that all who ate of it would find some way to later repay the hunter, perhaps with a new arrowhead or some other thing of value. It was another to manage these cross-societal obligations across a larger group of strangers—especially when moving outside of kinship boundaries made it harder to trust each other. Ledgers are record-keeping devices that help deal with those problems of complexity and trust.
Michael J. Casey (The Truth Machine: The Blockchain and the Future of Everything)
Many Silicon Valley insiders predicted that it would only get worse. One of its most famous investors, Paul Graham, wrote: “Unless the forms of technological progress that produced these things are subject to different laws than technological progress in general, the world will get more addictive in the next forty years than it did in the last forty.
Johann Hari (Stolen Focus: Why You Can't Pay Attention—and How to Think Deeply Again)
Wi-Fi is one of the most important technological developments of the modern age. It’s the wireless networking standard that lets us enjoy all the conveniences of modern media and connectivity. But what is Wi-Fi, really? The term Wi-Fi stands for wireless fidelity. Like other wireless connections, such as Bluetooth, Wi-Fi is a radio transmission technology. It is built upon a set of standards that allow high-speed, secure communication between a wide variety of digital devices, access points, and other hardware, making it possible for Wi-Fi-capable devices to reach the internet without physical wires. Wi-Fi can operate over short or long distances, be locked down and secured, or be left open and free. It is highly flexible and easy to use, which is why it is found in so many popular devices. Wi-Fi is ubiquitous and essential to the way our modern connected world operates. How does Wi-Fi work? Although Wi-Fi is commonly used to reach the internet on portable devices like smartphones, tablets, or laptops, Wi-Fi itself actually connects a device to a router or other access point, which in turn provides the internet access. Wi-Fi is a wireless connection to that device, not to the internet itself. It also provides access to a local network of connected devices, which is why you can print photos wirelessly or watch a video feed from Wi-Fi-connected cameras without being physically wired to them. Instead of using wired connections like Ethernet, Wi-Fi uses radio waves to transmit data at specific frequencies, most commonly 2.4 GHz and 5 GHz, though several others are used in more niche settings. Each frequency band is divided into channels that wireless devices can operate on, helping to spread the load so that individual devices don’t have their signals crowded or interrupted by other traffic, although that does still happen on busy networks.
Anonymous
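Since the passage above explains that each Wi-Fi band is split into numbered channels, here is a minimal sketch (my own illustration, not part of the quote) that maps the common 2.4 GHz channel numbers to their center frequencies, assuming the standard 5 MHz spacing from channel 1 at 2412 MHz, with channel 14 as the regional outlier at 2484 MHz.

```python
# Minimal sketch: center frequencies of the 2.4 GHz Wi-Fi channels mentioned above.
# Channels 1-13 are spaced 5 MHz apart starting at 2412 MHz; channel 14 (allowed
# only in a few regions) sits apart at 2484 MHz.

def channel_center_mhz(channel: int) -> int:
    """Return the center frequency in MHz of a 2.4 GHz Wi-Fi channel."""
    if channel == 14:
        return 2484
    if 1 <= channel <= 13:
        return 2412 + 5 * (channel - 1)
    raise ValueError(f"not a 2.4 GHz Wi-Fi channel: {channel}")

if __name__ == "__main__":
    for ch in range(1, 15):
        print(f"channel {ch:2d}: {channel_center_mhz(ch)} MHz")
```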
The Romans built houses, shops, public buildings, and baths from concrete. The breakwaters, towers, and other structures that made up the colossal man-made harbor of Caesarea, in what is now Israel, were built with concrete, as was the foundation of the Colosseum, along with countless bridges and aqueducts across the empire. Most famously, Rome’s Pantheon, built nearly 2,000 years ago, is roofed with a spectacular concrete dome—still the biggest concrete structure without reinforcing steel in the world. Like so much other knowledge the Romans had accumulated, though, the science and technology of concrete faded from memory as the empire slowly crumbled over the centuries that followed. “Perhaps the material was lost because it was industrial in nature and needed an industrial empire to support it,” writes scientist and engineer Mark Miodownik in Stuff Matters. “Perhaps it was lost because it was not associated with a particular skill or craft, such as ironmongery, stonemasonry, or carpentry, and so was not handed down as a family trade.” Whatever the reasons, the result was striking: “There were no concrete structures built for more than a thousand years after the Romans stopped making it,” notes Miodownik.
Vince Beiser (The World in a Grain: The Story of Sand and How It Transformed Civilization)
Kensi Gounden says: Everybody wants to be famous, but nobody wants to do the work. I live by that. You grind hard so you can play hard. At the end of the day, you put all the work in, and eventually it’ll pay off. It could be in a year, it could be in 30 years. Eventually, your hard work will pay off.
Kensi Gounden
Like many successful Chinese firms, it is caught at the bottom of what Taiwanese technology baron Stan Shih famously called the “smile.” Shih observed that in the tech industry, high profits are earned at one end by companies that control the design of core technologies (such as Intel), and at the other by companies that control the design and distribution of products to consumers (such as Apple). In between are commodity firms that manufacture and assemble the products, in high volumes but for low profit margins. Taiwan is filled with such low-margin bottom-of-the-smile firms, such as Shih’s own Acer, TSMC (the world’s biggest contract maker of integrated circuits), and Foxconn (the world’s biggest contract assembler of consumer electronics). For the most part, China’s technology companies seem to be heading in the same direction.
Arthur R. Kroeber (China's Economy: What Everyone Needs to Know)
Doing so showed that there was another variable that was a strong predictor of a person’s securing an entry in Wikipedia: the proportion of immigrants in your county of birth. The greater the percentage of foreign-born residents in an area, the higher the proportion of children born there who go on to notable success. (Take that, Donald Trump!) If two places have similar urban and college populations, the one with more immigrants will produce more prominent Americans. What explains this? A lot of it seems to be directly attributable to the children of immigrants. I did an exhaustive search of the biographies of the hundred most famous white baby boomers, according to the Massachusetts Institute of Technology’s Pantheon project, which is also working with Wikipedia data. Most of these were entertainers. At least thirteen had foreign-born mothers, including Oliver Stone, Sandra Bullock, and Julianne Moore. This rate is more than three times higher than the national average during this period. (Many had fathers who were immigrants, including Steve Jobs and John Belushi, but this data was more difficult to compare to national averages, since information on fathers is not always included on birth certificates.)
Seth Stephens-Davidowitz (Everybody Lies)