Google Algorithm Quotes

We've searched our database for all the quotes and captions related to Google Algorithm. Here they are! All 100 of them:

When people design web pages, they often cater to the taste of the Google search algorithm rather than to the taste of any human being.
Yuval Noah Harari (21 Lessons for the 21st Century)
...large technologies such as Google need to be broken up and regulated, because their consolidated power and cultural influence make competition largely impossible. This monopoly in the information sector is a threat to democracy...
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
The implications of such marginalization are profound. The insights about sexist and racist biases... are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based "tools" as if there are no political, social, or economic consequences of doing so.
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
Google Search is in fact an advertising platform, not intended to solely serve as a public information resource in the way that, say, a library might. Google creates advertising algorithms, not information algorithms.
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
Manufacturing consent begins by weaponizing the meme and utilizing the censorship algorithms of Google, Facebook, Twitter and YouTube.
James Scott, Senior Fellow, The Center for Cyber Influence Operations Studies
three platforms: Amazon, Google, and Facebook. Registering, iterating, and monetizing its audience is the heart of each platform’s business. It’s what the most valuable man-made things ever created (their algorithms) are designed to do.
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
The notion that Google/Alphabet has the potential to be a democratizing force is certainly laudable, but the contradictions inherent in its projects must be contextualized in the historical conditions that both create and are created by it.
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
The Google and Facebook algorithms not only know exactly how you feel, they also know myriad other things about you that you hardly suspect. Consequently you should stop listening to your feelings and start listening to these external algorithms instead. What’s the point of having democratic elections when the algorithms know not only how each person is going to vote, but also the underlying neurological reasons why one person votes Democrat while another votes Republican? Whereas humanism commanded: ‘Listen to your feelings!’ Dataism now commands: ‘Listen to the algorithms! They know how you feel.’
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
If Google isn’t responsible for its algorithm, then who is?
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
The expert goes over my text and says, “Don’t use this word—use that word instead. Then we will get more attention from the Google algorithm.” We know that if we can just catch the eye of the algorithm, we can take the humans for granted.
Yuval Noah Harari (21 Lessons for the 21st Century)
The next question is how? How does news find us? What you need is a certain critical literacy about the fact that you are almost always subject to an algorithm. The most powerful thing in your world now is an algorithm about which you know nothing.
Kelly McBride
The Google search algorithm has a very sophisticated taste when it comes to ranking the web pages of ice cream vendors, and the most successful ice cream vendors in the world are those that the Google algorithm ranks first—not those that produce the tastiest ice cream.
Yuval Noah Harari (21 Lessons for the 21st Century)
An Internet of Things is not a consumer society. It’s a materialised network society. It’s like a Google or Facebook writ large on the landscape.  Google and Facebook don’t have “users” or “customers”. Instead, they have participants under machine surveillance, whose activities are algorithmically combined within Big Data silos.
Bruce Sterling (The Epic Struggle of the Internet of Things)
Once Google, Facebook and other algorithms become all-knowing oracles, they may well evolve into agents and ultimately into sovereigns
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
the problem with Google is that it is forever evolving – adjusting its algorithm in ways it keeps secret.
Jon Ronson (So You've Been Publicly Shamed (Picador Collection))
From the outside looking in, trying to decipher Google’s search algorithms is like reading tea leaves in a toilet bowl…as it’s flushing. With the lights off.
Guy Kawasaki (What the Plus! Google+ for the Rest of Us)
Google filters out serendipity in favor of insularity. It douses the infectious messiness of a city with an algorithmic antiseptic.
Nicholas Carr (The Glass Cage: How Our Computers Are Changing Us)
Google hit such a degree of control in their operation of YouTube that they can tweak the algorithms just right to recommend very peculiar kinds of videos to suit economic and social changes.
stained hanes (94,000 Wasps in a Trench Coat)
The challenge, however, is that Google, Facebook, Netflix, and Amazon do not publish their algorithms. In fact, the methods they use to filter the information you see are deeply proprietary and the “secret sauce” that drives each company’s profitability. The problem with this invisible “black box” algorithmic approach to information is that we do not know what has been edited out for us and what we are not seeing. As a result, our digital lives, mediated through a sea of screens, are being actively manipulated and filtered on a daily basis in ways that are both opaque and indecipherable.
Marc Goodman (Future Crimes)
it’s crucial to make the right call about whether to use an algorithm or a heuristic in a specific situation. This is why the Google experiment with forty-one shades of blue seems so foreign to me, accustomed as I am to the Apple approach. Google used an A/B test to make a color choice. It used a single predetermined value criterion and defined it like so: The best shade of blue is the one that people clicked most often in the test. This is an algorithm.
Ken Kocienda (Creative Selection: Inside Apple's Design Process During the Golden Age of Steve Jobs)
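Read as pseudocode, the value criterion Kocienda describes is just an argmax over click counts. A minimal, hypothetical Python sketch (the shades and click numbers below are invented, not Google's experiment data):

```python
clicks = {"#0033A0": 1043, "#0044CC": 1187, "#0055E6": 1102}  # shade of blue -> clicks observed in the test
best_shade = max(clicks, key=clicks.get)                      # the single predetermined value criterion
print(best_shade)                                             # "#0044CC": whichever shade was clicked most
```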
Google’s search algorithms, for example, return results better than anyone else’s. Proprietary technologies for extremely short page load times and highly accurate query autocompletion add to the core search product’s robustness and defensibility.
Peter Thiel (Zero to One: Notes on Startups, or How to Build the Future)
Google’s enviable position as the monopoly leader in the provision of information has allowed its organization of information and customization to be driven by its economic imperatives and has influenced broad swaths of society to see it as the creator and keeper of information culture online, which I am arguing is another form of American imperialism that manifests itself as a “gatekeeper”18 on the web.
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
Just think of the way that within a mere two decades, billions of people have come to entrust the Google search algorithm with one of the most important tasks of all: searching for relevant and trustworthy information. We no longer search for information. Instead, we google. And as we increasingly rely on Google for answers, so our ability to search for information by ourselves diminishes. Already today, “truth” is defined by the top results of the Google search.11
Yuval Noah Harari (21 Lessons for the 21st Century)
Indeed, already today computers and algorithms are beginning to function as clients in addition to producers. In the stock exchange, for example, algorithms are becoming the most important buyers of bonds, shares and commodities. Similarly in the advertisement business, the most important customer of all is an algorithm: the Google search algorithm. When people design Web pages, they often cater to the taste of the Google search algorithm rather than to the taste of any human being.
Yuval Noah Harari (21 Lessons for the 21st Century)
Some photo-hosting services, such as Google Photos, are good examples of this. Once you upload all your family photos to the service, it automatically recognizes that the same person A shows up in photos 1, 5, and 11, while another person B shows up in photos 2, 5, and 7. This is the unsupervised part of the algorithm (clustering). Now all the system needs is for you to tell it who these people are. Just one label per person,4 and it is able to name everyone in every photo, which is useful for searching photos.
Aurélien Géron (Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems)
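As a hedged illustration of the cluster-then-label workflow Géron describes (a toy sketch, not Google Photos' actual pipeline), an unsupervised step can group made-up face embeddings, and a single label per cluster then names every photo:

```python
import numpy as np
from sklearn.cluster import KMeans

# Pretend these are face embeddings extracted from six photos (2-D only for illustration).
embeddings = np.array([
    [0.10, 0.20], [0.15, 0.22], [0.12, 0.18],   # three photos of one person
    [0.90, 0.80], [0.88, 0.85], [0.92, 0.79],   # three photos of another
])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

# One label per cluster is enough to name everyone in every photo.
names = {clusters[0]: "Person A", clusters[3]: "Person B"}
print([names[c] for c in clusters])
```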
because of the huge number of pages and links involved, Page and Brin named their search engine Google, playing off googol, the term for the number 1 followed by a hundred zeros. It was a suggestion made by one of their Stanford officemates, Sean Anderson, and when they typed in Google to see if the domain name was available, it was. So Page snapped it up. “I’m not sure that we realized that we had made a spelling error,” Brin later said. “But googol was taken, anyway. There was this guy who’d already registered Googol.com, and I tried to buy it from him, but he was fond of it. So we went with Google.”157 It was a playful word, easy to remember, type, and turn into a verb.IX Page and Brin pushed to make Google better in two ways. First, they deployed far more bandwidth, processing power, and storage capacity to the task than any rival, revving up their Web crawler so that it was indexing a hundred pages per second. In addition, they were fanatic in studying user behavior so that they could constantly tweak their algorithms.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
The bigger threat to Google wouldn’t be measured in dollars, but in the philosophical challenge. Could it be that social networking, rather than algorithmic exploitation of the web’s intelligence, would assume the central role in people’s online lives? Even if that were not the case, Facebook made it clear that every facet of the Internet would benefit from the power of personal connection. Google had been chasing a future forged out of algorithms and science fiction chronicles. Did the key to the future lie in party photos and daily status reports?
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
used to produce more robots, and so on. These corporations can grow and expand to the far reaches of the galaxy, and all they need are robots and computers – they don’t need humans even to buy their products. Indeed, already today computers and algorithms are beginning to function as clients in addition to producers. In the stock exchange, for example, algorithms are becoming the most important buyers of bonds, shares and commodities. Similarly in the advertisement business, the most important customer of all is an algorithm: the Google search algorithm.
Yuval Noah Harari (21 Lessons for the 21st Century)
At the highest levels of authority, we will probably retain human figureheads, who will give us the illusion that the algorithms are only advisors, and that ultimate authority is still in human hands. We will not appoint an AI to be the chancellor of Germany or the CEO of Google. However, the decisions taken by the chancellor and the CEO will be shaped by AI. The chancellor could still choose between several different options, but all these options will be the outcome of Big Data analysis, and they will reflect the way AI views the world more than the way humans view
Yuval Noah Harari (21 Lessons for the 21st Century)
Not satisfied with controlling information pipelines, the tech oligarchs have been moving to shape content as well. Controllers like those at Facebook and Twitter seek to “curate” content on their sites, or even eliminate views they find objectionable, which tend to be conservative views, according to former employees.35 Algorithms intended to screen out “hate groups” often spread a wider net, notes one observer, since the programmers have trouble distinguishing between “hate groups” and those who might simply express views that conflict with the dominant culture of Silicon Valley.36 That managers of social media platforms aim to control content is not merely the perception of conservatives. Over 70 percent of Americans believe that social media platforms “censor political views,” according to a recent Pew study.37 With their quasi-monopoly status, Facebook and Google don’t have to worry about competing with anyone, as the tech entrepreneur Peter Thiel observes, so they can indulge their own prejudices to a greater extent than the businesses that might be concerned about alienating customers.38 With their tightening control over media content, the tech elite are now situated to exert a cultural predominance that is unprecedented in the modern era.39 It recalls the cultural influence of the Catholic Church in the Middle Ages, but with more advanced technology.
Joel Kotkin (The Coming of Neo-Feudalism: A Warning to the Global Middle Class)
Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes.
Mariana Mazzucato
Google had a built-in disadvantage in the social networking sweepstakes. It was happy to gather information about the intricate web of personal and professional connections known as the “social graph” (a term favored by Facebook’s Mark Zuckerberg) and integrate that data as signals in its search engine. But the basic premise of social networking—that a personal recommendation from a friend was more valuable than all of human wisdom, as represented by Google Search—was viewed with horror at Google. Page and Brin had started Google on the premise that the algorithm would provide the only answer. Yet there was evidence to the contrary. One day a Googler, Joe Kraus, was looking for an anniversary gift for his wife. He typed “Sixth Wedding Anniversary Gift Ideas” into Google, but beyond learning that the traditional gift involved either candy or iron, he didn’t see anything creative or inspired. So he decided to change his status message on Google Talk, a line of text seen by his contacts who used Gmail, to “Need ideas for sixth anniversary gift—candy ideas anyone?” Within a few hours, he got several amazing suggestions, including one from a colleague in Europe who pointed him to an artist and baker whose medium was cake and candy. (It turned out that Marissa Mayer was an investor in the company.) It was a sobering revelation for Kraus that sometimes your friends could trump algorithmic search.
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
It is best to be the CEO; it is satisfactory to be an early employee, maybe the fifth or sixth or perhaps the tenth. Alternately, one may become an engineer devising precious algorithms in the cloisters of Google and its like. Otherwise, one becomes a mere employee. A coder of websites at Facebook is no one in particular. A manager at Microsoft is no one. A person (think woman) working in customer relations is a particular type of no one, banished to the bottom, as always, for having spoken directly to a non-technical human being. All these and others are ways for strivers to fall by the wayside — as the startup culture sees it — while their betters race ahead of them. Those left behind may see themselves as ordinary, even failures.
Ellen Ullman (Life in Code: A Personal History of Technology)
Here are some practical Dataist guidelines for you: ‘You want to know who you really are?’ asks Dataism. ‘Then forget about mountains and museums. Have you had your DNA sequenced? No?! What are you waiting for? Go and do it today. And convince your grandparents, parents and siblings to have their DNA sequenced too – their data is very valuable for you. And have you heard about these wearable biometric devices that measure your blood pressure and heart rate twenty-four hours a day? Good – so buy one of those, put it on and connect it to your smartphone. And while you are shopping, buy a mobile camera and microphone, record everything you do, and put it online. And allow Google and Facebook to read all your emails, monitor all your chats and messages, and keep a record of all your Likes and clicks. If you do all that, then the great algorithms of the Internet-of-All-Things will tell you whom to marry, which career to pursue and whether to start a war.’ But where do these great algorithms come from? This is the mystery of Dataism. Just as according to Christianity we humans cannot understand God and His plan, so Dataism declares that the human brain cannot fathom the new master algorithms. At present, of course, the algorithms are mostly written by human hackers. Yet the really important algorithms – such as the Google search algorithm – are developed by huge teams. Each member understands just one part of the puzzle, and nobody really understands the algorithm as a whole. Moreover, with the rise of machine learning and artificial neural networks, more and more algorithms evolve independently, improving themselves and learning from their own mistakes. They analyse astronomical amounts of data that no human can possibly encompass, and learn to recognise patterns and adopt strategies that escape the human mind. The seed algorithm may initially be developed by humans, but as it grows it follows its own path, going where no human has gone before – and where no human can follow.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
In April 2004, Google had one of its countless minicrises, over an anti-Semitic website called Jew Watch. When someone typed “Jew” into Google’s search box, the first result was often a link to that hate site. Critics urged Google to exclude it in its search results. Brin publicly grappled with the dilemma. His view on what Google should do—maintain the sanctity of search—was rational, but a tremor in his voice betrayed how much he was troubled that his search engine was sending people to a cesspool of bigotry. “My reaction was to be really upset about it,” he admitted at the time. “It was certainly not something I want to see.” Then he launched into an analysis of why Google’s algorithms yielded that result, mainly because the signals triggered by the keyword “Jew” reflected the frequent use of that abbreviation as a pejorative. The algorithms had spoken, and Brin’s ideals, no matter how heartfelt, could not justify intervention. “I feel like I shouldn’t impose my beliefs on the world,” he said. “It’s a bad technology practice.
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
A recent study commissioned by Google’s nemesis – Facebook – has indicated that already today the Facebook algorithm is a better judge of human personalities and dispositions than even people’s friends, parents and spouses. The study was conducted on 86,220 volunteers who have a Facebook account and who completed a hundred-item personality questionnaire. The Facebook algorithm predicted the volunteers’ answers based on monitoring their Facebook Likes – which webpages, images and clips they tagged with the Like button. The more Likes, the more accurate the predictions. The algorithm’s predictions were compared with those of work colleagues, friends, family members and spouses. Amazingly, the algorithm needed a set of only ten Likes in order to outperform the predictions of work colleagues. It needed seventy Likes to outperform friends, 150 Likes to outperform family members and 300 Likes to outperform spouses. In other words, if you happen to have clicked 300 Likes on your Facebook account, the Facebook algorithm can predict your opinions and desires better than your husband or wife!
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
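The study Harari summarizes is, at bottom, statistical prediction from binary Like features. Purely as a hedged, synthetic illustration of that idea (not the actual study's model or data), a simple classifier over 0/1 Like vectors might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
likes = rng.integers(0, 2, size=(500, 300))           # 500 people x 300 possible Likes (1 = liked)
trait = (likes[:, :10].sum(axis=1) > 5).astype(int)   # pretend a trait depends on a handful of Likes

model = LogisticRegression(max_iter=1000).fit(likes[:400], trait[:400])
print(model.score(likes[400:], trait[400:]))          # held-out accuracy from Likes alone
```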
Search engine query data is not the product of a designed statistical experiment and finding a way to meaningfully analyse such data and extract useful knowledge is a new and challenging field that would benefit from collaboration. For the 2012–13 flu season, Google made significant changes to its algorithms and started to use a relatively new mathematical technique called Elasticnet, which provides a rigorous means of selecting and reducing the number of predictors required. In 2011, Google launched a similar program for tracking Dengue fever, but they are no longer publishing predictions and, in 2015, Google Flu Trends was withdrawn. They are, however, now sharing their data with academic researchers... Google Flu Trends, one of the earlier attempts at using big data for epidemic prediction, provided useful insights to researchers who came after them... The Delphi Research Group at Carnegie Mellon University won the CDC’s challenge to ‘Predict the Flu’ in both 2014–15 and 2015–16 for the most accurate forecasters. The group successfully used data from Google, Twitter, and Wikipedia for monitoring flu outbreaks.
Dawn E. Holmes (Big Data: A Very Short Introduction (Very Short Introductions))
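Holmes mentions Elastic Net as a technique for selecting and reducing predictors. As a hedged sketch of that general method on synthetic data (nothing to do with Google's actual flu model), scikit-learn's ElasticNet shrinks uninformative inputs to zero:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                         # 50 candidate predictors, e.g. query volumes
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=200)

model = ElasticNet(alpha=0.5, l1_ratio=0.9).fit(X, y)  # combined L1/L2 penalty
kept = np.flatnonzero(model.coef_)                     # predictors the penalty did not shrink to zero
print(kept)                                            # expected: roughly [0, 3], the informative columns
```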
‘Listen, Google,’ I will say, ‘both John and Paul are courting me. I like both of them, but in different ways, and it’s so hard to make up my mind. Given everything you know, what do you advise me to do?’ And Google will answer: ‘Well, I’ve known you from the day you were born. I have read all your emails, recorded all your phone calls, and know your favourite films, your DNA and the entire biometric history of your heart. I have exact data about each date you went on and, if you want, I can show you second-by-second graphs of your heart rate, blood pressure and sugar levels whenever you went on a date with John or Paul. If necessary, I can even provide you with an accurate mathematical ranking of every sexual encounter you had with either of them. And naturally, I know them as well as I know you. Based on all this information, on my superb algorithms, and on decades’ worth of statistics about millions of relationships – I advise you to go with John, with an 87 per cent probability that you will be more satisfied with him in the long run. ‘Indeed, I know you so well that I also know you don’t like this answer. Paul is much more handsome than John, and because you give external appearances too much weight, you secretly wanted me to say “Paul”. Looks matter, of course; but not as much as you think. Your biochemical algorithms – which evolved tens of thousands of years ago on the African savannah – give looks a weight of 35 per cent in their overall rating of potential mates. My algorithms – which are based on the most up-to-date studies and statistics – say that looks have only a 14 per cent impact on the long-term success of romantic relationships. So, even though I took Paul’s looks into account, I still tell you that you would be better off with John.’
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
The issue is not merely one of false stories, incorrect facts, or even election campaigns and spin doctors: the social media algorithms themselves encourage false perceptions of the world. People click on the news they want to hear; Facebook, YouTube, and Google then show them more of whatever it is that they already favor, whether it is a certain brand of soap or a particular form of politics. The algorithms radicalize those who use them too. If you click on perfectly legitimate anti-immigration YouTube sites, for example, these can lead you quickly, in just a few more clicks, to white nationalist sites and then to violent xenophobic sites. Because they have been designed to keep you online, the algorithms also favor emotions, especially anger and fear. And because the sites are addictive, they affect people in ways they don't expect. Anger becomes a habit. Divisiveness becomes normal. Even if social media is not yet the primary news source for all Americans, it already helps shape how politicians and journalists interpret the world and portray it. Polarization has moved from the online world into reality. The result is a hyper-partisanship that adds to the distrust of "normal" politics, "establishment" politicians, derided "experts," and "mainstream" institutions--including courts, police, civil servants--and no wonder. As polarization increases, the employees of the state are invariably portrayed as having been "captured" by their opponents. It is not an accident that the Law and Justice Party in Poland, the Brexiteers in Britain, and the Trump administration in the United States have launched verbal assaults on civil servants and professional diplomats. It is not an accident that judges and courts are now the object of criticism, scrutiny, and anger in so many other places too. There can be no neutrality in a polarized world because there can be no nonpartisan or apolitical institutions.
Anne Applebaum (Twilight of Democracy: The Seductive Lure of Authoritarianism)
Google or Facebook were once in the right place at the right time. It’s not clear whether they are still better than anyone else at online data science, or whether their prominence is such that they’ve become the permanent “default.”
Frank Pasquale (The Black Box Society: The Secret Algorithms That Control Money and Information)
New opportunities for New York as a high-tech hub are related to the evolution of the Internet, according to Chris Dixon: “Imagine the Internet as a house. The first phase— laying the foundation, the bricks—happened in the ‘90s. No wonder that Boston and California, heavy tech places with MIT and Stanford, dominated the scene at that time. The house has been built, now it’s more about interior design. Many interesting, recent companies haven’t been started by technologists but by design and product-oriented people, which has helped New York a lot. New York City has always been a consumer media kind of city, and the Internet is in need of those kinds of skills now. Actually, when I say design, it’s more about product-focused people. I’d put Facebook in that category. Everything requires engineers, but unlike Google, their breakthrough was not as scientific. It was a well-designed product that people liked to use. Google had a significant scientific breakthrough with their search algorithm. That’s not what drives Facebook. In The Social Network movie, when they write equations on the wall that’s just not what it is, it’s not about that. Every company has engineering problems, but Facebook is product-design driven.
Maria Teresa Cometto (Tech and the City: The Making of New York's Startup Community)
with GDrive, it is not difficult to imagine scenarios whereby users might create public share folders whose contents were exposed to Google's search algorithms.
Matthew G. Kirschenbaum (Mechanisms: New Media and the Forensic Imagination (The MIT Press))
New technologies, be it the printed encyclopedia or Wikipedia, are not abstract machines that independently render us stupid or smart. As we saw with Enlightenment reading technologies, knowledge emerges out of complex processes of selection, distinction, and judgment—out of the irreducible interactions of humans and technology. We should resist the false promise that the empty box below the Google logo has come to represent—either unmediated access to pure knowledge or a life of distraction and shallow information. It is a ruse. Knowledge is hard won; it is crafted, created, and organized by humans and their technologies. Google’s search algorithms are only the most recent in a long history of technologies that humans have developed to organize, evaluate, and engage their world.
Chad Wellmon
More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.
Eli Pariser (The Filter Bubble)
In this scenario, ten years from now, if the tech giants are not restrained and their power as data-monopolies becomes further entrenched, governments will find themselves increasingly sidelined and impotent. Reduced to mere gatekeepers, politicians and civil servants will likely retreat behind algorithmic government, with laws shaped by data and machine learning, with all its inherent biases and imperfections, and public services gradually surrendered to private businesses. Indeed, we should expect just about every area of human existence, currently managed by government, to be dominated by Big Tech and its outriders: from the future of finance (just about everyone), to healthcare (Google), and from low-cost housing (Apple, Google) to education (Google, again) and autonomous vehicles (Tesla, Alphabet, Amazon, Apple, etc.).
Maelle Gavet (Trampled by Unicorns: Big Tech's Empathy Problem and How to Fix It)
The much-ballyhooed Internet of Things (what is a smart bin? In my day that was a dog) will see a surge in demand for rare metals, forcing up their price. Combine that with the invention of the brain-computer interface and there will be times when the spare capacity of human consciousness will be a cheaper option for processing and storage. For a nutritious bowl of soup we’ll be plugged in, with a thousand others, running algorithms to more precisely push a new wonder mop to lonely housewives idly googling through a Valium comedown.
Frankie Boyle (The Future of British Politics)
Now, data is democratizing, and American spy agencies are struggling to keep up. More than half the world is online,25 conducting five billion Google searches each day.26 Cell phone users are recording and posting events in real-time—turning everyone into intelligence collectors, whether they know it or not.27 Anyone with an Internet connection can access Google Earth satellite imagery, identify people using facial recognition software, and track events on Twitter.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
As former Google CEO Eric Schmidt and former Deputy Secretary of Defense Robert Work wrote, “AI is accelerating innovation in every scientific and engineering endeavor.”5 Not since electricity has a breakthrough technology ushered in so much potential promise and peril.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
There is greater upheaval still to come. In 2019, Google announced it had achieved “quantum supremacy”—a computing breakthrough so powerful that a math problem a supercomputer would need ten thousand years to solve could be cracked by its machine in just three minutes and twenty seconds.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
Amazon, Apple, Facebook, Google, and Microsoft are, too. Although some companies have declared they will never use their technology for weapons, the reality is their technology already is a weapon: hackers are attacking computer networks through Gmail phishing schemes and Microsoft coding vulnerabilities, terrorists are livestreaming attacks, and malign actors have turned social media platforms like Twitter and Facebook into disinformation superhighways that undermine democracy from within.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
Technological advances (like the Internet) used to start in government and then migrate to the commercial sector.52 Now that process is reversed, with breakthroughs coming from large companies like Google and Nvidia and from startups like Ginkgo Bioworks and Dataminr.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
Machine learning has been through several transition periods starting in the mid-90s. From 1995–2005, there was a lot of focus on natural language, search, and information retrieval. The machine learning tools were simpler than what we’re using today; they include things like logistic regression, SVMs (support vector machines), kernels with SVMs, and PageRank. Google became immensely successful using these technologies, building major success stories like Google News and the Gmail spam classifier using easy-to-distribute algorithms for ranking and text classification—using technologies that were already mature by the mid-90s. (Reza Zadeh)
David Beyer (The Future of Machine Intelligence)
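Of the techniques Zadeh lists, PageRank is the one most closely tied to Google's early ranking. Here is a minimal, hedged sketch of the textbook power-iteration form on an invented three-page link graph (not Google's production system):

```python
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0]}    # page -> pages it links to (a made-up three-page web)
n, d = 3, 0.85                         # d is the damping factor commonly quoted for PageRank

rank = np.full(n, 1.0 / n)
for _ in range(50):                    # power iteration until the scores settle
    new = np.full(n, (1 - d) / n)
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new

print(rank.round(3))                   # pages with more (and better-ranked) inbound links score higher
```

Ranking pages by the resulting scores is the core of the insight: links act as votes weighted by the voter's own rank.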
There is discrimination, and the opportunities are not equal for everyone. Most countries are blocked from using several crucial features on Google, Amazon, Shopify, AliExpress, and many more platforms that the "internet millionaires" use to get all of their wealth. They are not smarter than you! They simply have access to markets that are blocked to you! When you try to compete inside their markets, the domain owners alter the algorithms to favor people in that geolocation and put them and their products in front of yours. I have been stopped from uploading books for no other reason than being in Eastern Europe. People don't believe these stories are true because they don't want to believe they are living in such a world. It's like the story of the Native Americans, who were offered blankets contaminated with diseases to kill them. Now you are being offered a blanket of illusions that gives you lies. And when you tell the truth, they call it a conspiracy and hate speech.
Dan Desmarques
But since the Cold War’s end, it is estimated that more than 80 percent of information in a typical intelligence report is not secrets.40 It’s publicly available information—what intelligence officials call open-source intelligence, or OSINT for short. OSINT includes a wide range of information and sources—everything from routine bureaucratic documents published by obscure foreign government agencies to televised speeches by foreign leaders to maps available on Google Earth and ISIS decapitation videos posted online.
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
A mob has more tools at its disposal than individual actors do. Popularity—the quantity of clicks or use at any given time—is tracked and exploited by algorithms online, and a mob is a critical mass. If thousands of people are linking to something about you, that will quickly become the first thing people see when they google your name, regardless of whether it's a fact-checked news article or SmegmaDan69's video about what a bitch you are.
Zoe Quinn (Crash Override: How Gamergate (Nearly) Destroyed My Life, and How We Can Win the Fight Against Online Hate)
In August of 2016 Facebook announced it was changing its news-feed algorithm to try to cut down on the amount of click bait that appears on the site. It remains to be seen how this will affect quality journalism organizations that are dependent on Facebook traffic.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
Google’s John Mueller was asked whether site quality is quantifiable, meaning something that could be measured and expressed as a metric, and his answer was surprising because he indicated having looked into a Search Console quality metric to help publishers. The idea of site quality seems deceptively simple, but it’s not. Mueller and many others talk about the importance of site quality for indexing and ranking, but what it actually means at Google is a mystery, so we’re stuck with our own subjective ideas. “Site quality” is a subjective concept that depends on the opinions of individuals, opinions that are influenced by their wildly varied levels of experience and knowledge. There is no absolute and objective way to express what site quality is; every person is effectively blind to what “quality” actually means for Google, because Google doesn’t say what the height, width, and depth of its definition of site quality is. And it could be, as the person asking the question implies, that there is no way to quantify what multiple algorithms are independently verifying. The person asking the question, quite reasonably, was looking for something more objective and quantifiable. The question: “When you say improve the quality of your website, is this quality something that is quantifiable? Or is it simply a term used to determine how multiple algorithms look at your website?”
Fausto-Sterling later said she was writing ‘with tongue firmly in cheek’. And yet anyone who has taken gender studies in the past two decades will almost certainly have been assigned her essay, and it is taken inexplicably seriously. So entrenched is the ‘five sexes’ claim that Googling ‘how many sexes are there’ turned up ‘five’ as the top answer until the search algorithm was tweaked in response to complaints.
Helen Joyce (Trans: When Ideology Meets Reality)
The success of Google Flu Trends became emblematic of the hot new trend in business, technology, and science: big data and algorithms. “Big data” can mean many things, but let’s focus on the found data we discussed in the previous chapter, the digital exhaust of web searches, credit card payments, and mobile phones pinging the nearest cell tower, perhaps buttressed by the administrative data generated as organizations organize themselves.
Tim Harford (The Data Detective: Ten Easy Rules to Make Sense of Statistics)
When you deal with robots, you can act robotic. But when you are dealing with humans, try to be a human. Not to mention, this is also one of the ranking factors in the Google algorithm.
Pooja Agnihotri (The Art of Running a Successful Wedding Services Business: The Missing Puzzle Piece You’re Looking For)
A high-profile example of this type of data bias appeared in Google’s “Flu Trends” program. The program, which started in 2008, intended to leverage online searches and user location monitoring to pinpoint regional flu outbreaks. Google collected and used this information to tip-off and alert health authorities in regions they identified. Over time the project failed to accurately predict flu cases due to changes in Google’s search engine algorithm. A new algorithm update in 2012 caused Google’s search engine to suggest a medical diagnosis when users searched for the terms “cough” and “fever.” Google, therefore, inserted a false bias into its results by prompting users with a cough or a fever to search for flu-related results (equivalent to a research assistant lingering over respondents’ shoulder whispering to check the “flu” box to explain their symptoms). This increased the volume of searches for flu-related terms and led Google to predict an exaggerated flu outbreak twice as severe as public health officials anticipated.
Oliver Theobald (Statistics for Absolute Beginners: A Plain English Introduction)
In 2015, Google's image-recognition algorithm confused Black users with gorillas. The company's 'immediate action' in response to this was 'to prevent Google Photos from ever labelling any image as a gorilla, chimpanzee, or monkey - even pictures of the primates themselves.' Several years later, Google's 2018 Arts & Culture app with its museum doppelganger feature allowed users to find artwork containing figures and faces that look like them, prompting problematic pairings as the algorithm identified look-alikes based on essentializing ethnic or racialized attributes. For many of us, these 'tools' have done little more than gamify racial bias.
Legacy Russell (Glitch Feminism: A Manifesto)
UTT hypothesizes that this is due to the fact that these regions of your brain have more neuronal bandwidth available, allowing them to move around more information and sift through more potential solutions than your conscious centers of thinking. Your conscious mind, according to this theory, is like a home computer on which you can run carefully written programs that return correct answers to limited problems, whereas your unconscious mind is like Google’s vast data centers, in which statistical algorithms sift through terabytes of unstructured information, teasing out surprisingly useful solutions to difficult questions.
Cal Newport (Deep Work: Rules for Focused Success in a Distracted World)
The likes of Google and Target are no more keen to share their datasets and algorithms than Newton was to share his alchemical experiments. Sometimes
Tim Harford (The Data Detective: Ten Easy Rules to Make Sense of Statistics)
Proprietary technology is the most substantive advantage a company can have because it makes your product difficult or impossible to replicate. Google’s search algorithms, for example, return results better than anyone else’s.
Peter Thiel (Zero to One: Notes on Startups, or How to Build the Future)
In Google’s world, which is the world we enter when we go online, there’s little place for the pensive stillness of deep reading or the fuzzy indirection of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive—and better algorithms to steer the course of its thought.
Nicholas Carr (The Shallows: What the Internet Is Doing to Our Brains)
PUTTING IT ALL TOGETHER I’ve explained a lot of concepts in this chapter, so I want to recap it all into something a little more tangible. Step #1: The first step is to figure out what type of show you want to have. If you’re a writer, then you should start a blog. If you like video, then you should start a vlog on one of the video platforms. Lastly, if you like audio, then you should start a podcast. Step #2: Your show will be you documenting the process of achieving the same goal that your audience will be striving for. As you’re documenting your process, you’ll be testing your material and paying attention to the things that people respond to. If you commit to publishing your show every day for a year, you’ll have the ability to test your material and find your voice, and your dream customers will be able to find you. Step #3: You’ll leverage your Dream 100 by interviewing them on your show. This will give you the ability to build relationships with them, give them a platform, give you the ability to promote their episode on your show to their audience, and get access to their friends and followers. Step #4: Even though this is your own show, you’re renting time on someone else’s network. It’s important that you don’t forget it and that you focus on converting it into traffic that you own. Figure 7.11: As you create your own show, focus on converting traffic that you earn and control into traffic that you own. And with that, I will close out Section One of this book. So far, we’ve covered a lot of core principles to traffic. We: Identified exactly who your dream client is. Discovered exactly where they are congregating. Talked about how to work your way into those audiences (traffic that you earn) and how you buy your way into those audiences (traffic that you control). Learned how to take all the traffic that you earn and all the traffic that you buy and turn it all into traffic that you own (building your list). Discussed how to plug that list into a follow-up funnel so you can move them through your value ladder. Prepared to infiltrate your Dream 100, find your voice, and build your following by creating your own show. In the next section, we’ll shift our focus to mastering the pattern to get traffic from any advertising networks (like Instagram, Facebook, Google, and YouTube) and how to understand their algorithms so you can get unlimited traffic and leads pouring into your funnels.
Russell Brunson (Traffic Secrets: The Underground Playbook for Filling Your Websites and Funnels with Your Dream Customers)
Owned Traffic Channels A friend of mine owns a SaaS company that’s competing in a massively crowded space. His product gets 500,000 unique visitors a month because he’s exceptional at search engine optimization (SEO), and his company ranks on the first page of Google for many high-volume terms. He owns these organic traffic channels in his market, so even though other names on those pages might be more recognizable, he can stay highly competitive. Even if you own a high-traffic search term on Google, Amazon, or the WordPress plugin store, you can have a pretty commoditized product that can still succeed. One caveat is that this moat can be a bit dicey to maintain because the algorithms at any of those companies can change quickly—and have. Google’s many updates have tanked businesses overnight that depended solely on SEO-driven traffic.
Rob Walling (The SaaS Playbook: Build a Multimillion-Dollar Startup Without Venture Capital)
six reasons why email is the best: My company AppSumo generates $65 million a year in total transactions. And you know what? Nearly 50 percent of that comes from email. This percentage has been consistent for more than ten years. Don’t believe me? I have 120,000 Twitter followers, 750,000 YouTube subscribers, and 150,000 TikTok fans—and I would give them all up for my 100,000 email subscribers. Why? Every time I send an email, 40,000 people open it and consume my content. I’m not hoping the platform gods will allow me to reach them. On the other platforms, anywhere between 100 and 1 million people pay attention to my content, but it’s not consistent or in my control. I know what you’re saying: “C’mon, Noah, email is dead.” Now ask yourself, when was the last time you checked your email? Exactly. Email is used obsessively by over 4 billion people! It’s the largest way of communicating at scale that exists today. Eighty-nine percent of people check it EVERY DAY! Social media decides who and how many people you’re seen by. One tweak to the algorithm, and you’re toast. Remember the digital publisher LittleThings? Yeah, no one else does, either. They closed after they lost 75 percent of their 20,000,000 monthly visitors when Facebook changed its algorithm in 2018. CEO Joe Speiser says it killed his business and he lost $100 million. You own your email list. Forever. If AppSumo shuts down tomorrow, my insurance policy, my sweet sweet baby, my beloved, my email list comes with me and makes anything I do after so much easier. Because it’s mine. It also doesn’t cost you significant money to grow your list or to communicate with your list, whereas Facebook or Google ads consistently cost money.
Noah Kagan (Million Dollar Weekend: The Surprisingly Simple Way to Launch a 7-Figure Business in 48 Hours)
This dynamic is even stronger for digital goods, which can be produced almost for free. Once Amazon has formatted an e-book for sale, selling new copies of it doesn’t take any additional paper, ink, or labor—so it sells for a nearly infinite multiple of its marginal cost. As a result, the close relationship between marginal cost, price, and consumers’ willingness to pay has been weakened. In the case of services whose marginal cost is low enough that they can be free to consumers altogether, that relationship breaks down completely. Once Google has designed its search algorithms and built its server farms, providing a user with one additional search costs almost nothing.
Ray Kurzweil (The Singularity Is Nearer: When We Merge with AI)
The twenty-first-century shift into real-time analytics has only made the danger of metrics more intense. Avinash Kaushik, digital marketing evangelist at Google, warns that trying to get website users to see as many ads as possible naturally devolves into trying to cram sites with ads: “When you are paid on a [cost per thousand impressions] basis the incentive is to figure out how to show the most possible ads on every page [and] ensure the visitor sees the most possible pages on the site.… That incentive removes a focus from the important entity, your customer, and places it on the secondary entity, your advertiser.” The website might gain a little more money in the short term, but ad-crammed articles, slow-loading multi-page slide shows, and sensationalist clickbait headlines will drive away readers in the long run. Kaushik’s conclusion: “Friends don’t let friends measure Page Views. Ever.
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
Google is known as a “full stack AI company” that uses its own data stores “to train its own algorithms running on its own chips deployed on its own cloud.”
Shoshana Zuboff (The Age of Surveillance Capitalism)
Tips on Web Design and Site Marketing Web content is king, which is why we have devoted an entire chapter to it later in this book. It is what draws visitors and ultimately what converts them to customers. So, try to make your web content as engaging as possible. Make sure the content is interactive, unique and educational. Ensure that visitors have the option of plugins while encouraging them to visit as many pages on your site as possible if they want to obtain vital information. The images you use on your website should be both enticing and descriptive in nature. In today’s world, social media is all pervasive. In order to encourage visitors to share your web content, you can include icons of social media platforms on your website. In some select cases, consider integrating social media feeds, like Facebook or Instagram, onto your website so that they can automatically show the latest postings. A "Call-to-Action" can help convert visitors to your site into customers. Always try using a very clear and concise "Call-to-Action" language. Understand what type of conversion you are looking for, and try to provide multiple levels of conversion. For example, a plastic surgeon may provide Schedule an Appointment as a call to action, which will attract only the segment of web visitors who have reached their decision stage. By adding conversion points for visitors who are at earlier stages of their decision making, like signing up for a webcast or your newsletter can help you widen your conversion points and provide inputs to your email marketing. To raise the average amount of time a visitor spends on your website and to minimize the bounce rate, ensure that your website offers a user-friendly and attractive design. This way you will increase the number of links you have on your website and boost its SEO ranking (Tip: While Google’s algorithm is not public, our iterative testing shows that sites with good usability analytics metrics like time on site and bounce rate play favorably in Google’s algorithm, other things remaining constant). Ensure you observe due diligence when designing a website that will enable visitors to navigate in different languages. For example, you may need a lot more space for your menu, as there are languages that use up more space than the English language.
Danny Basu (Digital Doctor: Integrated Online Marketing Guide for Medical and Dental Practices)
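The usability metrics Basu cites, time on site and bounce rate, are easy to compute from raw session data. Below is a minimal sketch with made-up sessions; note that whether these metrics actually feed Google's ranking is the author's own inference from testing, not a documented signal.

```python
# A minimal sketch, using made-up session data, of the two usability metrics
# the tip mentions. Whether they influence Google's ranking is the author's
# inference from testing, not a documented signal.
from statistics import mean

# Each session: (pages_viewed, seconds_on_site) from a hypothetical analytics export.
sessions = [(1, 12), (4, 310), (1, 8), (3, 145), (6, 530), (1, 20)]

bounce_rate = sum(1 for pages, _ in sessions if pages == 1) / len(sessions)
avg_time_on_site = mean(seconds for _, seconds in sessions)

print(f"Bounce rate: {bounce_rate:.0%}")               # Bounce rate: 50%
print(f"Avg. time on site: {avg_time_on_site:.0f} s")  # Avg. time on site: 171 s
```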
Companies like Facebook, Google, and Twitter have built sophisticated, planetary-scale machine-learning algorithms whose entire purpose is to generate engagement
Kevin Roose (Futureproof: 9 Rules for Surviving in the Age of AI)
One of the early stage AI companies Google purchased is DeepMind, based in London. In 2015 researchers at DeepMind published a paper in Nature describing how they taught an AI to learn to play 1980s-era arcade video games, like Video Pinball. They did not teach it how to play the games, but how to learn to play the games—a profound difference. They simply turned their cloud-based AI loose on an Atari game such as Breakout, a variant of Pong, and it learned on its own how to keep increasing its score. A video of the AI’s progress is stunning. At first, the AI plays nearly randomly, but it gradually improves. After a half hour it misses only once every four times. By its 300th game, an hour into it, it never misses. It keeps learning so fast that in the second hour it figures out a loophole in the Breakout game that none of the millions of previous human players had discovered. This hack allowed it to win by tunneling around a wall in a way that even the game’s creators had never imagined. At the end of several hours of first playing a game, with no coaching from the DeepMind creators, the algorithms, called deep reinforcement machine learning, could beat humans in half of the 49 Atari video games they mastered.
Kevin Kelly (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future)
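The technique Kelly names, deep reinforcement learning, pairs a neural network with the same reward-driven loop as classic Q-learning. The sketch below is not DeepMind's DQN; it is a minimal tabular Q-learning example on a made-up five-state corridor, but it reproduces the behaviour the video shows: near-random play at first, near-optimal play after enough episodes.

```python
# Not DeepMind's DQN -- a minimal tabular Q-learning sketch of the same
# reinforcement-learning idea: the agent is never told how to solve the task,
# only rewarded, and its behaviour improves from near-random to near-optimal.
# The five-state corridor environment is invented for illustration.
import random

N_STATES, GOAL = 5, 4              # corridor: states 0..4, reward at state 4
ACTIONS = [-1, +1]                 # step left or step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def greedy(state: int) -> int:
    best = max(Q[state])
    return random.choice([a for a, v in enumerate(Q[state]) if v == best])

def run_episode() -> int:
    state, steps = 0, 0
    while state != GOAL and steps < 200:
        a = random.randrange(2) if random.random() < EPSILON else greedy(state)
        nxt = min(max(state + ACTIONS[a], 0), N_STATES - 1)
        reward = 1.0 if nxt == GOAL else 0.0
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        Q[state][a] += ALPHA * (reward + GAMMA * max(Q[nxt]) - Q[state][a])
        state, steps = nxt, steps + 1
    return steps

steps_per_episode = [run_episode() for _ in range(200)]
print("first episode:", steps_per_episode[0], "steps")   # roughly a random walk
print("last episode: ", steps_per_episode[-1], "steps")  # close to the optimal 4
```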
At the epicenter of Google’s bulging portfolio is one master project: The company wants to create machines that replicate the human brain, and then advance beyond. This is the essence of its attempts to build an unabridged database of global knowledge and its efforts to train algorithms to become adept at finding patterns, teaching them to discern images and understand language. Taking on this grandiose assignment, Google stands to transform life on the planet, precisely as it boasted it would. The laws of man are a mere nuisance that can only slow down such work. Institutions and traditions are rusty scrap for the heap. The company rushes forward, with little regard for what it tramples, on its way toward the New Jerusalem.
Franklin Foer (World Without Mind: The Existential Threat of Big Tech)
Since then, most of Google’s successful products have been based on strong technical insights, while most of the less successful ones lacked them. AdWords, the Google ads engine that generates most of the company’s revenue, was based on the insight that ads could be ranked and placed on a page based on their value as information to users, rather than just by who was willing to pay more.63 Google News, the site that aggregates news headlines from thousands of media outlets, was based on the insight that we could algorithmically group stories by topic, not source. Chrome, Google’s open-source browser, was founded on the insight that as websites grew more complex and powerful, browsers needed to be reengineered for speed. Pick an innovative, successful Google product, and you are likely to find at least one significant technical insight behind it, the sort of idea that could have appeared in a technical journal.
Eric Schmidt (How Google Works)
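The AdWords insight Schmidt describes (rank ads by their value as information to users, not by bid alone) can be illustrated with a toy auction. The formula below, bid multiplied by a predicted click-through rate, is a common textbook simplification rather than Google's actual system, and all of the numbers are invented.

```python
# A toy ad auction contrasting "highest bid wins" with ranking by expected
# value to the user (bid x predicted click-through rate). Advertisers, bids,
# and CTRs are all hypothetical; the real AdWords formula is more involved.
ads = [
    {"advertiser": "A", "bid": 5.00, "predicted_ctr": 0.01},
    {"advertiser": "B", "bid": 2.00, "predicted_ctr": 0.06},
    {"advertiser": "C", "bid": 3.50, "predicted_ctr": 0.02},
]

by_bid_alone = sorted(ads, key=lambda ad: ad["bid"], reverse=True)
by_relevance = sorted(ads, key=lambda ad: ad["bid"] * ad["predicted_ctr"], reverse=True)

print([ad["advertiser"] for ad in by_bid_alone])   # ['A', 'C', 'B'] -- deep pockets win
print([ad["advertiser"] for ad in by_relevance])   # ['B', 'C', 'A'] -- useful ads win
```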
Each algorithm is a feedback loop, taking an action, observing the resulting conditions, and taking another action after that. Again, and again, and again. It's an iterative process, in which the algorithms adjust themselves and their activity on every loop, responding less to the news on the ground than to one another. Such systems go out of control because the feedback of their own activity has become louder than the original signal.
Douglas Rushkoff (Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity)
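Rushkoff's description of feedback swamping the signal can be sketched as a toy model: two algorithms that each weight the other's last move far more heavily than the underlying input will oscillate and diverge even though the input never changes. The coefficients and the scenario below are invented for illustration.

```python
# Two interacting algorithms in a toy feedback loop. Each weights its rival's
# last action twelve times more heavily than the steady underlying signal,
# so their mutual feedback, not the signal, drives the system out of control.
signal = 1.0        # the unchanging "news on the ground"
a, b = 0.0, 0.0     # each algorithm's current position

for step in range(10):
    a_next = 0.1 * signal + 1.2 * b   # reacts mostly to the rival's last move
    b_next = 0.1 * signal - 1.2 * a   # reacts against the rival, same imbalance
    a, b = a_next, b_next
    print(f"step {step}: a={a:+.2f}  b={b:+.2f}")
# Output oscillates with growing amplitude even though `signal` never changed.
```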
Once Google, Facebook and other algorithms become all-knowing oracles, they may well evolve into agents and finally into sovereigns.
Yuval Noah Harari
In addition to pool installers and flower shops, our customers include people who make a living bombarding people with email offers, or gaming Google’s search algorithm, or figuring out which kind of misleading subject line is most likely to trick someone into opening a message. Online marketing is not quite as sleazy as Internet porn, but it’s not much better, either.
Dan Lyons (Disrupted: My Misadventure in the Start-Up Bubble)
Both Google search results and Facebook’s News Feed algorithm are based on showing us what they think we want.
Jacob Silverman (Terms of Service: Social Media and the Price of Constant Connection)
As long as algorithms determine the distribution of profits, they will also determine what gets published.
Nicholas Carr (The Big Switch: Rewiring the World, from Edison to Google)
The authors, academics from Northeastern University, Harvard University, and the University of Houston, concluded that Google Flu Trends had wildly overestimated the number of flu cases in the United States for more than two years. The article, “The Parable of Google Flu: Traps in Big Data Analysis,” concluded that the errors were, at least in part, due to the decisions made by GFT engineers about what to include in their models—mistakes the academics dubbed “algorithmic dynamics” and “big data hubris.”
Clayton M. Christensen (Competing Against Luck: The Story of Innovation and Customer Choice)
Here’s the point: You get out what you put in—to a degree. Your natural ability and strengths play a big role in determining the potential of your output. Don’t work on things that don’t play to your strengths and passions. Don’t work on things that provide opportunities that don’t interest you. It’s easy to get lost in the fight, and continuously bang your head against the wall when people tell you that you get out what you put in. That expression tells us to just keep working harder, regardless of the task. This isn’t always the answer. If you’re working hard and don’t feel like you’re getting out what you’re putting in, you probably need to stop banging your head against the wall and jump ship.

There is also a time factor here, which is very important to consider. YouTube was acquired by Google for $1.65 billion in 2006, less than two years after its founding.[28] PopCap, a gaming company, was acquired by Electronic Arts for $750 million in 2011, eleven years after its founding.[29] If you founded or worked for either of these companies and loved every moment of it, the difference in time-to-acquisition is a non-factor. But, if you did not enjoy yourself and didn’t grow along the way, you better hope you were working at YouTube. Two years of stagnant personal growth is far less painful than eleven.

The example above includes two companies that both sold for a bunch of money, which is great. But there’s more to life than money. There’s an opportunity cost to everything you do. If you’re hitting a wall for more than a year and you’re unhappy, I recommend jumping ship—even if there’s a potential pile of cash down the road. There are opportunities beyond whatever you’re currently working on. Forget how much time, effort, and money has already been invested and ignore whatever you might be giving up if you leave. Those are sunk costs and unknown outcomes. Instead, think about what you’re working on and who you’re working with right now. Then decide if that’s really how you want to be spending your days. That’s all that really matters.
Jesse Tevelow (The Connection Algorithm: Take Risks, Defy the Status Quo, and Live Your Passions)
The privacy issue was reignited in early 2014, when the Wall Street Journal reported that Facebook had conducted a massive social-science experiment on nearly seven hundred thousand of its users. To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site’s data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users. As it turned out, the experiment was very “successful” in that it was relatively easy to manipulate users’ emotions, but the backlash from the blogosphere was horrendous. “Apparently what many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but to actually change our emotions,” wrote Sophie Weiner on AnimalNewYork.com.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
Someday, you may even turn to Google, which has read all your emails and internet searches and looked at your bank account and DNA and sugar levels and blood pressure and heart rate, to determine who you should date. But this is also where liberalism collapses, on the day that the algorithms know you better than you know yourself.
GBF Summary (Summary: Homo Deus by Yuval Noah Harari (Great Books Fast))
The fifth factor in the T Algorithm is the ability to control the consumer experience, at purchase, through vertical integration.
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
told Nike that to have a shot at a trillion, they would need to do three things:
1. Increase the percentage of direct-to-consumer retail to 40 percent within ten years (it was closer to 10 percent in 2016).
2. Gain greater facility with data and how to incorporate it into product features.
3. Move their headquarters from Portland.
As I learned, the algorithm is the easy part. Getting them to listen to you (“You need to relocate HQ from Portland”) is the hard part.
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
So assuming one could argue that Google is a monopoly and needs to enter into a consent decree, would the Bell Labs model work? If Google were required to license every patent it owns for a nominal fee to any American company that asks for it, it would have to license its search algorithms, Android patents, self-driving car patents, smart-thermostat patents, advertising-exchange patents, Google Maps patents, Google Now patents, virtual-reality patents, and thousands of others. What is clear from the Bell Labs model is that such a solution actually benefits innovation in general.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
Over the last generation, journalism has slowly been swallowed. The ascendant media companies of our era don’t think of themselves as heirs to a great ink-stained tradition. Some prefer to call themselves technology firms. This redefinition isn’t just a bit of fashionable branding. Silicon Valley has infiltrated the profession, from both within and without.

Over the past decade, journalism has come to depend unhealthily on Facebook and Google. The big tech companies supply journalism with an enormous percentage of its audience—and therefore a big chunk of revenue. This gives Silicon Valley influence over the entire profession, and it has made the most of its power. Dependence generates desperation—a mad, shameless chase to gain clicks through Facebook, a relentless effort to game Google’s algorithms. It leads media to ink terrible deals, which look like self-preserving necessities, but really just allow Facebook and Google to hold them even tighter. Media will grant Facebook the right to sell advertising or give Google permission to publish articles directly on its fast-loading server.

What makes these deals so terrible is the capriciousness of the tech companies. They like to shift quickly in a radically different direction, which is great for their bottom line, but terrible for all the media companies dependent on the platforms. Facebook will decide that its users prefer video to words, or that its users prefer ideologically pleasing propaganda to hard news. When Facebook shifts direction like this or when Google tweaks its algorithm, they instantly crash Web traffic flowing to media, with all the rippling revenue ramifications that follow. Media know they should flee the grasp of Facebook, but dependence also breeds cowardice. The prisoner lies on the cot dreaming of escape plans that will never hatch.

Dependence on the big tech companies is increasingly the plight of the worker and the entrepreneur. Drivers maintain erratic patterns of sleep because of Uber’s shifting whims. Companies that manufacture tchotchkes sold on Amazon watch their businesses collapse when Amazon’s algorithms detect the profitability of their item, leading the giant to manufacture the goods itself at a lower price. The problem isn’t just financial vulnerability. It’s the way in which the tech companies dictate the patterns of work, the way in which their influence can shift the ethos of an entire profession to suit their needs—lowering standards of quality, eroding ethical protections.

I saw this up close during my time at the New Republic. I watched how dependence on the tech companies undermined the very integrity of journalism. At the very beginning of that chapter in my career, I never imagined that we would go down that path.
Franklin Foer (World Without Mind: The Existential Threat of Big Tech)