“
This monopoly of information is a threat to democracy...
”
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
“
we shall quickly find ourselves about as important to the algorithms as animals currently are to us.
”
Niall Ferguson (The Square and the Tower: Networks and Power, from the Freemasons to Facebook)
“
...large technologies such as Google need to be broken up and regulated, because their consolidated power and cultural influence make competition largely impossible. This monopoly in the information sector is a threat to democracy...
”
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
“
The implications of such marginalization are profound. The insights about sexist and racist biases... are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based "tools" as if there are no political, social, or economic consequences of doing so.
”
Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism)
“
The Google and Facebook algorithms not only know exactly how you feel, they also know myriad other things about you that you hardly suspect. Consequently you should stop listening to your feelings and start listening to these external algorithms instead. What’s the point of having democratic elections when the algorithms know not only how each person is going to vote, but also the underlying neurological reasons why one person votes Democrat while another votes Republican? Whereas humanism commanded: ‘Listen to your feelings!’ Dataism now commands: ‘Listen to the algorithms! They know how you feel.’
”
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
“
Manufacturing consent begins by weaponizing the meme and utilizing the censorship algorithms of Google, Facebook, Twitter and YouTube.
”
James Scott, Senior Fellow, The Center for Cyber Influence Operations Studies
“
three platforms: Amazon, Google, and Facebook. Registering, iterating, and monetizing its audience is the heart of each platform’s business. It’s what the most valuable man-made things ever created (their algorithms) are designed to do.
”
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
“
The right to free expression didn’t include the right to “algorithmic amplification.”
”
Sheera Frenkel (An Ugly Truth: Inside Facebook's Battle for Domination)
“
The reason why conversations like this are simultaneously so frustrating and revealing is that people like him have lost the desire to question what they are being told. Their bespoke, unchallenged diet of ‘news’, augmented we now know by Facebook algorithms and deliberately fake stories, is so unvaried that the possibility that it might be largely bogus is never entertained
”
James O'Brien (How To Be Right… in a World Gone Wrong)
“
The root of the disinformation problem, of course, lay in the technology. Facebook was designed to throw gas on the fire of any speech that invoked an emotion, even if it was hateful speech—its algorithms favored sensationalism.
”
Sheera Frenkel (An Ugly Truth: Inside Facebook's Battle for Domination)
“
An Internet of Things is not a consumer society. It’s a materialised network society. It’s like a Google or Facebook writ large on the landscape. Google and Facebook don’t have “users” or “customers”. Instead, they have participants under machine surveillance, whose activities are algorithmically combined within Big Data silos.
”
Bruce Sterling (The Epic Struggle of the Internet of Things)
“
Maybe the concept of friendship is already too colonized by liberalism and capitalism. Under neoliberalism, friendship is a banal affair of private preferences: we hang out, we share hobbies, we make small talk. We become friends with those who are already like us, and we keep each other comfortable rather than becoming different and more capable together. The algorithms of Facebook and other social networks guide us towards the refinement of our profiles, reducing friendship to the click of a button. This neoliberal friend is the alternative to hetero- and homonormative coupling: "just friends" implies a much weaker and insignificant bond than a lover could ever be. Under neoliberal friendship, we don't have each other's backs, and our lives
aren't tangled up together. But these insipid tendencies do not mean that friendships are pointless, only that friendship is a terrain of struggle. Empire works to usher its subjects into flimsy relationships where nothing is at stake and to infuse intimacy with violence and domination.
”
Carla Bergman (Joyful Militancy: Building Thriving Resistance in Toxic Times (Anarchist Interventions))
“
Once Google, Facebook and other algorithms become all-knowing oracles, they may well evolve into agents and ultimately into sovereigns
”
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
“
For beyond steering our clicks and our decisions, algorithms can keep us inside affinity bubbles, where we mostly encounter people who share our opinions.
”
Cyril Dion (Petit manuel de résistance contemporaine)
“
The challenge, however, is that Google, Facebook, Netflix, and Amazon do not publish their algorithms. In fact, the methods they use to filter the information you see are deeply proprietary and the “secret sauce” that drives each company’s profitability. The problem with this invisible “black box” algorithmic approach to information is that we do not know what has been edited out for us and what we are not seeing. As a result, our digital lives, mediated through a sea of screens, are being actively manipulated and filtered on a daily basis in ways that are both opaque and indecipherable.
”
Marc Goodman (Future Crimes)
“
Facebook had a “Verified Apps” program & claimed it certified the security of participating apps. It didn’t. Facebook promised users that it would not share their personal information with advertisers. It did.
”
Frank Pasquale (The Black Box Society: The Secret Algorithms That Control Money and Information)
“
In Germany, according to the report, more than one third of Facebook’s political groups were deemed extremist. The algorithm itself seemed to be responsible: 64 percent of people in the groups had joined at the system’s suggestion.
”
Max Fisher (The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World)
“
Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behavior can be altered, without our even being aware of the hand guiding us, in a superior direction. That's always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in their grand design.
”
Franklin Foer (World Without Mind: The Existential Threat of Big Tech)
“
For his birthday, she'd bought him an iPhone, which he'd returned to the store. He'd apologized, saying that it was a thoughtful gift, but he didn't want to carry a tiny high-powered mainframe on which he could compute astronomical algorithms, or check Facebook. He wanted a phone.
”
Laura Kasischke (Mind of Winter)
“
Sixty-four percent of people who join extremist groups on Facebook do so because the algorithm steers them there. Less than three years after QAnon appeared online, half of Americans had heard of its conspiracy theories. In reality, what social media favors is that which divides us.
”
Scott Galloway (Adrift: America in 100 Charts)
“
I post a petition on my Facebook page. Which of my friends will see it on their news feed? I have no idea. As soon as I hit send, that petition belongs to Facebook, and the social network’s algorithm makes a judgment about how to best use it. It calculates the odds that it will appeal to each of my friends. Some of them, it knows, often sign petitions, and perhaps share them with their own networks. Others tend to scroll right past. At the same time, a number of my friends pay more attention to me and tend to click the articles I post. The Facebook algorithm takes all of this into account as it decides who will see my petition. For many of my friends, it will be buried so low on their news feed that they’ll never see it.
”
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
“
We saw a blatant example of this abuse in mid-2014 when a study published by researchers at Facebook and Cornell University revealed that social networks can manipulate the emotions of their users simply by algorithmically altering what they see in the news feed. In a study published by the National Academy of Sciences, Facebook changed the update feeds of 700,000 of its users to show them either more sad or more happy news. The result? Users seeing more negative news felt worse and posted more negative things, the converse being true for those seeing the more happy news. The study’s conclusion: “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.
”
Marc Goodman (Future Crimes)
“
After studying all the hidden data—the stuff that Facebook doesn’t release to the public—the company’s scientists reached a definite conclusion. They wrote: “Our algorithms exploit the human brain’s attraction to divisiveness,” and “if left unchecked,” the site would continue to pump its users with “more and more divisive content in an effort to gain user attention and increase time on the platform.
”
Johann Hari (Stolen Focus: Why You Can't Pay Attention—and How to Think Deeply Again)
“
(By the way, wolves – or at least their dog cousins – aren’t a hopeless case. A company called ‘No More Woof’ is developing a helmet for reading canine experiences. The helmet monitors the dog’s brain waves, and uses computer algorithms to translate simple sentiments such as ‘I am angry’ into human language. Your dog may soon have a Facebook or Twitter account of his own – perhaps with more Likes and followers than you.)
”
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
“
But even when Facebook isn't deliberately exploiting its users, it is exploiting its users—its business model requires it. Even if you distance yourself from Facebook, you still live in the world that Facebook is shaping. Facebook, using our native narcissism and our desire to connect with other people, captured our attention and our behavioral data; it used this attention and data to manipulate our behavior, to the point that nearly half of America began relying on Facebook for news. Then, with the media both reliant on Facebook as a way of reaching readers and powerless against the platform's ability to suck up digital advertising revenue—it was like a paperboy who pocketed all the subscription money—Facebook bent the media's economic model to match its own practices: publications needed to capture attention quickly and consistently trigger high emotional responses to be seen at all. The result, in 2016, was an unending stream of Trump stories, both from the mainstream news and from the fringe outlets that were buoyed by Facebook's algorithm. What began as a way for Zuckerberg to harness collegiate misogyny and self-interest has become the fuel for our whole contemporary nightmare, for a world that fundamentally and systematically misrepresents human needs.
”
Jia Tolentino (Trick Mirror: Reflections on Self-Delusion)
“
In one tragicomic incident in October 2017, a Palestinian laborer posted to his private Facebook account a picture of himself in his workplace, alongside a bulldozer. Adjacent to the image he wrote “Good morning!” An automatic algorithm made a small error when transliterating the Arabic letters. Instead of ysabechhum (which means “good morning”), the algorithm identified the letters as ydbachhum (which means “kill them”). Suspecting that the man might be a terrorist intending to use a bulldozer to run people over, Israeli security forces swiftly arrested him. He was released after they realized that the algorithm made a mistake. But the offending Facebook post was nevertheless taken down. You can never be too careful. What Palestinians are experiencing today in the West Bank might be just a primitive preview of what billions will eventually experience all over the planet.
”
Yuval Noah Harari (21 Lessons for the 21st Century)
“
Supporters of Bolsonaro had created a video warning that his main rival, Fernando Haddad, wanted to turn all the children of Brazil into homosexuals, and that he had developed a cunning technique to do it. The video showed a baby sucking a bottle, only there was something peculiar about it—the teat of the bottle had been painted to look like a penis. This, the story that circulated said, is what Haddad will distribute to every kindergarten in Brazil. This became one of the most-shared news stories in the entire election. People in the favelas explained indignantly that they couldn’t possibly vote for somebody who wanted to get babies to suck these penis-teats, and so they would have to vote for Bolsonaro instead. On these algorithm-pumped absurdities, the fate of the whole country turned. When Bolsonaro unexpectedly won the presidency, his supporters chanted “Facebook! Facebook! Facebook!” They knew what the algorithms had done for them.
”
Johann Hari (Stolen Focus: Why You Can't Pay Attention—and How to Think Deeply Again)
“
I have no reason to believe that the social scientists at Facebook are actively gaming the political system. Most of them are serious academics carrying out research on a platform that they could only have dreamed about two decades ago. But what they have demonstrated is Facebook’s enormous power to affect what we learn, how we feel, and whether we vote. Its platform is massive, powerful, and opaque. The algorithms are hidden from us, and we see only the results of the experiments researchers choose to publish.
”
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
“
A recent study commissioned by Google’s nemesis – Facebook – has indicated that already today the Facebook algorithm is a better judge of human personalities and dispositions than even people’s friends, parents and spouses. The study was conducted on 86,220 volunteers who have a Facebook account and who completed a hundred-item personality questionnaire. The Facebook algorithm predicted the volunteers’ answers based on monitoring their Facebook Likes – which webpages, images and clips they tagged with the Like button. The more Likes, the more accurate the predictions. The algorithm’s predictions were compared with those of work colleagues, friends, family members and spouses. Amazingly, the algorithm needed a set of only ten Likes in order to outperform the predictions of work colleagues. It needed seventy Likes to outperform friends, 150 Likes to outperform family members and 300 Likes to outperform spouses. In other words, if you happen to have clicked 300 Likes on your Facebook account, the Facebook algorithm can predict your opinions and desires better than your husband or wife!
”
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
“
The bigger threat to Google wouldn’t be measured in dollars, but in the philosophical challenge. Could it be that social networking, rather than algorithmic exploitation of the web’s intelligence, would assume the central role in people’s online lives? Even if that were not the case, Facebook made it clear that every facet of the Internet would benefit from the power of personal connection. Google had been chasing a future forged out of algorithms and science fiction chronicles. Did the key to the future lie in party photos and daily status reports?
”
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
“
Analysis of your social network and its members can also be highly revealing of your life, politics, and even sexual orientation, as demonstrated in a study carried out at MIT. In an analysis known as Gaydar, researchers studied the Facebook profiles of fifteen hundred students at the university, including those whose profile sexual orientation was either blank or listed as heterosexual. Based on prior research that showed gay men have more friends who are also gay (not surprising), the MIT investigators had a valuable data point to review the friend associations of their fifteen hundred students. As a result, researchers were able to predict with 78 percent accuracy whether or not a student was gay. At least ten individuals who had not previously identified as gay were flagged by the researchers’ algorithm and confirmed via in-person interviews with the students. While these findings might not be troubling in liberal Cambridge, Massachusetts, they could prove problematic in the seventy-six countries where homosexuality remains illegal, such as Sudan, Iran, Yemen, Nigeria, and Saudi Arabia, where such an “offense” is punished by death.
”
Marc Goodman (Future Crimes)
“
Facebook Sonnet
Facebook is not just injurious to health,
It's now a full-on humanitarian crisis.
If you think it's just a harmless bad habit,
You're fanning the flames of social necrosis.
Social media ought to make people social,
Not make Pavlov's dogs out of humanity.
Yet all that facebook actually does today,
Is drive society towards clinical insanity.
Social media is not necessarily bad,
So long as it doesn't feed on our stability.
Yet facebook has devised the perfect algorithm,
To learn, pump and monetize human instability.
Facebook is the definition of what AI must be not.
Algorithm without humanity is mental holocaust.
”
Abhijit Naskar (Handcrafted Humanity: 100 Sonnets For A Blunderful World)
“
Not satisfied with controlling information pipelines, the tech oligarchs have been moving to shape content as well. Controllers like those at Facebook and Twitter seek to “curate” content on their sites, or even eliminate views they find objectionable, which tend to be conservative views, according to former employees. Algorithms intended to screen out “hate groups” often spread a wider net, notes one observer, since the programmers have trouble distinguishing between “hate groups” and those who might simply express views that conflict with the dominant culture of Silicon Valley. That managers of social media platforms aim to control content is not merely the perception of conservatives. Over 70 percent of Americans believe that social media platforms “censor political views,” according to a recent Pew study. With their quasi-monopoly status, Facebook and Google don’t have to worry about competing with anyone, as the tech entrepreneur Peter Thiel observes, so they can indulge their own prejudices to a greater extent than the businesses that might be concerned about alienating customers. With their tightening control over media content, the tech elite are now situated to exert a cultural predominance that is unprecedented in the modern era. It recalls the cultural influence of the Catholic Church in the Middle Ages, but with more advanced technology.
”
Joel Kotkin (The Coming of Neo-Feudalism: A Warning to the Global Middle Class)
“
Google had a built-in disadvantage in the social networking sweepstakes. It was happy to gather information about the intricate web of personal and professional connections known as the “social graph” (a term favored by Facebook’s Mark Zuckerberg) and integrate that data as signals in its search engine. But the basic premise of social networking—that a personal recommendation from a friend was more valuable than all of human wisdom, as represented by Google Search—was viewed with horror at Google. Page and Brin had started Google on the premise that the algorithm would provide the only answer. Yet there was evidence to the contrary. One day a Googler, Joe Kraus, was looking for an anniversary gift for his wife. He typed “Sixth Wedding Anniversary Gift Ideas” into Google, but beyond learning that the traditional gift involved either candy or iron, he didn’t see anything creative or inspired. So he decided to change his status message on Google Talk, a line of text seen by his contacts who used Gmail, to “Need ideas for sixth anniversary gift—candy ideas anyone?” Within a few hours, he got several amazing suggestions, including one from a colleague in Europe who pointed him to an artist and baker whose medium was cake and candy. (It turned out that Marissa Mayer was an investor in the company.) It was a sobering revelation for Kraus that sometimes your friends could trump algorithmic search.
”
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
“
One of Zuckerberg’s least favorite criticisms of Facebook was that it created ideological echo chambers, in which people only engaged with the ideas they wanted to hear. Facebook had already funded research, in 2015, to show echo chambers were mathematically not their fault. With the social network, everyone had the potential to engage with whatever kinds of ideas they wanted to, and tended to have at least some Facebook connections with people who held different political opinions. But if people chose not to interact with those they disagreed with, was that really Facebook’s doing? Their algorithm was just showing people what they demonstrated, through their own behavior, they wanted to see, enhancing their existing preferences.
”
Sarah Frier (No Filter: The inside story of Instagram)
“
It is best to be the CEO; it is satisfactory to be an early employee, maybe the fifth or sixth or perhaps the tenth. Alternately, one may become an engineer devising precious algorithms in the cloisters of Google and its like. Otherwise, one becomes a mere employee. A coder of websites at Facebook is no one in particular. A manager at Microsoft is no one. A person (think woman) working in customer relations is a particular type of no one, banished to the bottom, as always, for having spoken directly to a non-technical human being. All these and others are ways for strivers to fall by the wayside — as the startup culture sees it — while their betters race ahead of them. Those left behind may see themselves as ordinary, even failures.
”
Ellen Ullman (Life in Code: A Personal History of Technology)
“
The firm did this at the local level, creating right-wing pages with vague names like Smith County Patriots or I Love My Country. Because of the way Facebook’s recommendation algorithm worked, these pages would pop up in the feeds of people who had already liked similar content. When users joined CA’s fake groups, it would post videos and articles that would further provoke and inflame them. Conversations would rage on the group page, with people commiserating about how terrible or unfair something was. CA broke down social barriers, cultivating relationships across groups. And all the while it was testing and refining messages, to achieve maximum engagement. Now CA had users who (1) self-identified as part of an extreme group, (2) were a captive audience, and (3) could be manipulated with data.
”
Christopher Wylie (Mindf*ck: Cambridge Analytica and the Plot to Break America)
“
He does not know that Facebook is monitoring him and spying on him and even listening to him more or less constantly, nor does he believe it when he’s told this very thing by Jack, that he is being secretly watched by Facebook. This comes in the form of a long private letter that Jack has composed pleading with his father to stop spending so much time with all these conspiracies, that none of them are true, that Lawrence is getting unnecessarily worked up and angry about nothing, that there are no shadowy cabals secretly plotting against the world, and what’s happening here is actually just that a small group of engineers in Silicon Valley have built moneymaking algorithms that are now optimizing, that what Lawrence is seeing is not reality but rather an algorithmic abstraction of reality that sits invisibly atop reality like a kind of distortion field.
”
Nathan Hill (Wellness)
“
Already, when Facebook bought Instagram, it felt as though the walls of the Internet were closing in a little tighter around us users. The broad expanse of possibility, of messiness, on a network like Geocities or the personal expression of Tumblr was shut down. Digital life became increasingly templated, a set of boxes to fill in rather than a canvas to cover in your own image. (You don’t redesign how your Facebook profile looks; you just change your avatar.) I felt a certain sense of loss, but at first the trade-off of creativity for broadcast reach seemed worthwhile: You could talk to so many people at once on social media! But that exposure became enervating, too, and I missed the previous sense of intimacy, the Internet as a private place—a hideout from real life, rather than the determining force of real life. As the walls closed in, the algorithmic feeds took on more and more influence and authority.
”
Kyle Chayka (Filterworld: How Algorithms Flattened Culture)
“
The media environment... has changed in ways that foster [social and cultural] division. Long gone is the time when everybody watched one of three national television networks. By the 1990s there was a cable news channel for most points on the political spectrum, and by the early 2000s there was a website or discussion group for every conceivable interest group and grievance. By the 2010s most Americans were using social media sites like Facebook and Twitter, which make it easy to encase oneself within an echo-chamber. And then there's the "filter bubble," in which search engines and YouTube algorithms are designed to give you more of what you seem to be interested in, leading conservatives and progressives into disconnected moral matrices backed up by mutually contradictory informational worlds. Both the physical and the electronic isolation from people we disagree with allow the forces of confirmation bias, groupthink, and tribalism to push us still further apart.
”
Jonathan Haidt (The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting up a Generation for Failure)
“
Here are some practical Dataist guidelines for you: ‘You want to know who you really are?’ asks Dataism. ‘Then forget about mountains and museums. Have you had your DNA sequenced? No?! What are you waiting for? Go and do it today. And convince your grandparents, parents and siblings to have their DNA sequenced too – their data is very valuable for you. And have you heard about these wearable biometric devices that measure your blood pressure and heart rate twenty-four hours a day? Good – so buy one of those, put it on and connect it to your smartphone. And while you are shopping, buy a mobile camera and microphone, record everything you do, and put it online. And allow Google and Facebook to read all your emails, monitor all your chats and messages, and keep a record of all your Likes and clicks. If you do all that, then the great algorithms of the Internet-of-All-Things will tell you whom to marry, which career to pursue and whether to start a war.’ But where do these great algorithms come from? This is the mystery of Dataism. Just as according to Christianity we humans cannot understand God and His plan, so Dataism declares that the human brain cannot fathom the new master algorithms. At present, of course, the algorithms are mostly written by human hackers. Yet the really important algorithms – such as the Google search algorithm – are developed by huge teams. Each member understands just one part of the puzzle, and nobody really understands the algorithm as a whole. Moreover, with the rise of machine learning and artificial neural networks, more and more algorithms evolve independently, improving themselves and learning from their own mistakes. They analyse astronomical amounts of data that no human can possibly encompass, and learn to recognise patterns and adopt strategies that escape the human mind.
The seed algorithm may initially be developed by humans, but as it grows it follows its own path, going where no human has gone before – and where no human can follow.
”
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
“
Israel has an extremely vibrant hi-tech sector, and a cutting-edge cyber-security industry. At the same time it is also locked into a deadly conflict with the Palestinians, and at least some of its leaders, generals and citizens might well be happy to create a total surveillance regime in the West Bank as soon as they have the necessary technology. Already today whenever Palestinians make a phone call, post something on Facebook or travel from one city to another they are likely to be monitored by Israeli microphones, cameras, drones or spy software. The gathered data is then analysed with the aid of Big Data algorithms. This helps the Israeli security forces to pinpoint and neutralise potential threats without having to place too many boots on the ground. The Palestinians may administer some towns and villages in the West Bank, but the Israelis control the sky, the airwaves and cyberspace. It therefore takes surprisingly few Israeli soldiers to effectively control about 2.5 million Palestinians in the West Bank.
”
Yuval Noah Harari (21 Lessons for the 21st Century)
“
Humanism thought that experiences occur inside us, and that we ought to find within ourselves the meaning of all that happens, thereby infusing the universe with meaning. Dataists believe that experiences are valueless if they are not shared, and that we need not – indeed cannot – find meaning within ourselves. We need only record and connect our experience to the great data flow, and the algorithms will discover its meaning and tell us what to do. Twenty years ago Japanese tourists were a universal laughing stock because they always carried cameras and took pictures of everything in sight. Now everyone is doing it. If you go to India and see an elephant, you don’t look at the elephant and ask yourself, ‘What do I feel?’ – you are too busy looking for your smartphone, taking a picture of the elephant, posting it on Facebook and then checking your account every two minutes to see how many Likes you got. Writing a private diary – a common humanist practice in previous generations – sounds to many present-day youngsters utterly pointless. Why write anything if nobody else can read it? The new motto says: ‘If you experience something – record it. If you record something – upload it. If you upload something – share it.
”
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
“
Civil disobedience in the attention economy means withdrawing attention. But doing that by loudly quitting Facebook and then tweeting about it is the same mistake as thinking that the imaginary Pera is a real island that we can reach by boat. A real withdrawal of attention happens first and foremost in the mind. What is needed, then, is not a “once-and-for-all” type of quitting but ongoing training: the ability not just to withdraw attention, but to invest it somewhere else, to enlarge and proliferate it, to improve its acuity. We need to be able to think across different time scales when the mediascape would have us think in twenty-four-hour (or shorter) cycles, to pause for consideration when clickbait would have us click, to risk unpopularity by searching for context when our Facebook feed is an outpouring of unchecked outrage and scapegoating, to closely study the ways that media and advertising play upon our emotions, to understand the algorithmic versions of ourselves that such forces have learned to manipulate, and to know when we are being guilted, threatened, and gaslighted into reactions that come not from will and reflection but from fear and anxiety. I am less interested in a mass exodus from Facebook and Twitter than I am in a mass movement of attention: what happens when people regain control over their attention and begin to direct it again, together.
”
Jenny Odell (How to Do Nothing: Resisting the Attention Economy)
“
The issue is not merely one of false stories, incorrect facts, or even election campaigns and spin doctors: the social media algorithms themselves encourage false perceptions of the world. People click on the news they want to hear; Facebook, YouTube, and Google then show them more of whatever it is that they already favor, whether it is a certain brand of soap or a particular form of politics. The algorithms radicalize those who use them too. If you click on perfectly legitimate anti-immigration YouTube sites, for example, these can lead you quickly, in just a few more clicks, to white nationalist sites and then to violent xenophobic sites. Because they have been designed to keep you online, the algorithms also favor emotions, especially anger and fear. And because the sites are addictive, they affect people in ways they don't expect. Anger becomes a habit. Divisiveness becomes normal. Even if social media is not yet the primary news source for all Americans, it already helps shape how politicians and journalists interpret the world and portray it. Polarization has moved from the online world into reality.
The result is a hyper-partisanship that adds to the distrust of "normal" politics, "establishment" politicians, derided "experts," and "mainstream" institutions--including courts, police, civil servants--and no wonder. As polarization increases, the employees of the state are invariably portrayed as having been "captured" by their opponents. It is not an accident that the Law and Justice Party in Poland, the Brexiteers in Britain, and the Trump administration in the United States have launched verbal assaults on civil servants and professional diplomats. It is not an accident that judges and courts are now the object of criticism, scrutiny, and anger in so many other places too. There can be no neutrality in a polarized world because there can be no nonpartisan or apolitical institutions.
”
Anne Applebaum (Twilight of Democracy: The Seductive Lure of Authoritarianism)
“
It is best to be the CEO; it is satisfactory to be an early employee, maybe the fifth or sixth or perhaps the tenth. Alternately, one may become an engineer devising precious algorithms in the cloisters of Google and its like. Otherwise one becomes a mere employee. A coder of websites at Facebook is no one in particular. A manager at Microsoft is no one. A person (think woman) working in customer relations is a particular type of no one,
”
Ellen Ullman (Life in Code: A Personal History of Technology)
“
Invisible Facebook and Google algorithms steer you toward content you agree with, and nonconforming voices stay silent for fear of being flamed or trolled or unfriended. The result is a silo in which, whatever side you're on, you feel absolutely right to hate what you hate.
”
Jonathan Franzen (The End of the End of the Earth: Essays)
“
Even voices in the proud New York Times newsroom now cede that Facebook, not the Old Gray Lady itself, now drives the national conversation with the horsepower of its search traffic and algorithms providing traditional media its best chance to be seen. “Measured by web traffic, ad revenue and influence over the way the rest of the media makes money, Facebook has grown into the most powerful force in the news industry,” wrote Times media columnist Farhad Manjoo
”
Salena Zito (The Great Revolt: Inside the Populist Coalition Reshaping American Politics)
“
Today’s equivalent is probably ‘get an engineering degree’, but it will not necessarily be as lucrative. A third of Americans who graduated in STEM subjects (science, technology, engineering and maths) are in jobs that do not require any such qualification.52 They must still pay off their student debts. Up and down America there are programmers working as office temps and even fast-food servers. In the age of artificial intelligence, more and more will drift into obsolescence. On the evidence so far, this latest technological revolution is different in its dynamics from earlier ones. In contrast to earlier disruptions, which affected particular sectors of the economy, the effects of today’s revolution are general-purpose. From janitors to surgeons, virtually no jobs will be immune. Whether you are training to be an airline pilot, a retail assistant, a lawyer or a financial trader, labour-saving technology is whittling down your numbers – in some cases drastically so. In 2000, financial services employed 150,000 people in New York. By 2013 that had dropped to 100,000. Over the same period, Wall Street’s profits have soared. Up to 70 per cent of all equity trades are now executed by algorithms.53 Or take social media. In 2006, Google bought YouTube for $1.65 billion. It had sixty-five employees, so the price amounted to $25 million per employee. In 2012 Facebook bought Instagram, which had thirteen employees, for $1 billion. That came to $77 million per employee. In 2014, it bought WhatsApp, with fifty-five employees, for $19 billion, at a staggering $345 million per employee.54 Such riches are little comfort to the thousands of engineers who cannot find work. Facebook’s data servers are now managed by Cyborg, a software program. It requires one human technician for every twenty thousand computers.
”
Edward Luce (The Retreat of Western Liberalism)
“
The Ultimate Guide To SEO In The 21st Century
Search engine optimization is a complex and ever-changing method of getting your business the exposure that you need to make sales and to build a solid reputation online. To many people, the algorithms involved in SEO are cryptic, but the basic principle behind them is impossible to ignore if you are doing any kind of business on the internet. This article will help you solve the SEO puzzle and guide you through it, with some very practical advice!
To increase your website or blog traffic, post it in one place (e.g. to your blog or site), then work your social networking sites to build visibility and backlinks to where your content is posted. Facebook, Twitter, Digg and other news feeds are great tools to use that will significantly raise the profile of your pages.
An important part of starting a new business in today's highly technological world is creating a professional website, and ensuring that potential customers can easily find it is increased with the aid of effective search optimization techniques. Using relevant keywords in your URL makes it easier for people to search for your business and to remember the URL. A title tag for each page on your site informs both search engines and customers of the subject of the page while a meta description tag allows you to include a brief description of the page that may show up on web search results. A site map helps customers navigate your website, but you should also create a separate XML Sitemap file to help search engines find your pages. While these are just a few of the basic recommendations to get you started, there are many more techniques you can employ to drive customers to your website instead of driving them away with irrelevant search results.
One sure way to increase traffic to your website, is to check the traffic statistics for the most popular search engine keywords that are currently bringing visitors to your site. Use those search words as subjects for your next few posts, as they represent trending topics with proven interest to your visitors.
Ask for help, or better yet, search for it. There are hundreds of websites available that offer innovative expertise on optimizing your search engine hits. Take advantage of them! Research the best and most current methods to keep your site running smoothly and to learn how not to get caught up in tricks that don't really work.
For the most optimal search engine optimization, stay away from Flash websites. While Google has improved its ability to read text within Flash files, it is still an imperfect science. For instance, any text that is part of an image file in your Flash website will not be read by Google or indexed. For the best SEO results, stick with HTML or HTML5.
You have probably read a few ideas in this article that you would have never thought of, in your approach to search engine optimization. That is the nature of the business, full of tips and tricks that you either learn the hard way or from others who have been there and are willing to share! Hopefully, this article has shown you how to succeed, while making fewer of those mistakes and in turn, quickened your path to achievement in search engine optimization!
”
search rankings
“
Unleash the potential of Facebook's vast landscape. From algorithms to engagement, the eBook guides you through the art of turning 'likes' into thriving profits.
”
Akan Etefia (Facebook Cash Cow: How to Milk the World's Largest Social Network for Profit)
“
Contrary to what many well-intentioned people believe, the fact that we have multiple social media platforms today has little effect on spreading genuinely diverse narratives and perspectives. Social media is not only increasingly in the hands of a few billionaires strongly connected to the ruling class (e.g., Meta acquiring some of the most popular and active platforms), but also the fact that social media platforms operate based on carefully designed and manipulated algorithms to promote the viewpoints of the ruling class in what Cathy O’Neil has called ‘weapons of math destruction’, and what Safiya Umoja Noble insightfully calls ‘algorithms of oppression’, which apply not only to racial matters, but extend to every other matter that is potentially at odds with the desires of the ruling class.
”
Louis Yako
“
Amazon, Apple, Facebook, Google, and Microsoft are, too. Although some companies have declared they will never use their technology for weapons, the reality is their technology already is a weapon: hackers are attacking computer networks through Gmail phishing schemes and Microsoft coding vulnerabilities, terrorists are livestreaming attacks, and malign actors have turned social media platforms like Twitter and Facebook into disinformation superhighways that undermine democracy from within.
”
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
“
The sheer volume of online data today is so staggering, it’s hard to comprehend: in 2019, Internet users posted 500 million tweets, sent 294 billion emails, and posted 350 million photos on Facebook every day.29 Some estimate that the amount of information on earth is doubling every two years.30 This kind of publicly available information is called open-source intelligence and it is becoming increasingly valuable.
”
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)
“
After studying all the hidden data - the stuff that Facebook doesn't release to the public - the company's scientists reached a definite conclusion. They wrote: 'Our algorithms exploit the human brain's attraction to divisiveness,' and 'if left unchecked,' the site would continue to pump its users with 'more and more divisive content in an effort to gain user attention and increase time on the platform.' A separate internal Facebook team ...independently reached the same conclusions. They found that 64 percent of all the people joining extremist groups were finding their way to them because Facebook's algorithms were directly recommending them. This meant that across the world, people were seeing in their Facebook feeds racist, fascist and even Nazi groups next to the words: 'Groups You Should Join.' They warned that in Germany one-third of all the political groups on the site were extremist. Facebook's own team was blunt, concluding: 'Our recommendation systems grow the problem.'
After carefully analysing all the options, Facebook's scientists concluded there was one solution: they said Facebook would have to abandon its current business model. Because their growth was so tied up with toxic outcomes, the company should abandon attempts at growth. The only way out was for the company to adopt a strategy that was 'anti-growth' - deliberately shrink, and choose to be a less wealthy company that wasn't wrecking the world.
Once Facebook was shown - in plain language, by their own people - what they were doing, how did the company's executives respond? According to the Journal's in-depth reporting, they mocked the research, calling it an 'Eat Your Veggies' approach. They introduced some minor tweaks, but dismissed most of the recommendations.
”
Johann Hari (Stolen Focus: Why You Can't Pay Attention— and How to Think Deeply Again)
“
In October 2018, Sri Lankan civil leaders gave Facebook’s regional office, which oversees South Asia’s 400 million users from India, a stark presentation. Hate speech and misinformation were overrunning the platform, seemingly promoted by its algorithms. Violent extremists operated some of its most popular pages. Viral falsehoods were becoming consensus reality for users. Facebook, after all, had displaced local news outlets, just as it had in Myanmar, where villages were still burning. Sri Lanka might be next. Separately, government officials met privately with Facebook’s regional chiefs in Colombo. They pleaded with the company to better police the hate speech on their platform. These posts and pages violated the company’s own rules. Why wouldn’t Facebook act?
”
Max Fisher (The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World)
“
Already, when Facebook bought Instagram, it felt as though the walls of the Internet were closing in a little tighter around us users. The broad expanse of possibility, of messiness, on a network like Geocities or the personal expression of Tumblr was shut down. Digital life became increasingly templated, a set of boxes to fill in rather than a canvas to cover in your own image. (You don’t redesign how your Facebook profile looks; you just change your avatar.)
”
Kyle Chayka (Filterworld: How Algorithms Flattened Culture)
“
six reasons why email is the best: My company AppSumo generates $65 million a year in total transactions. And you know what? Nearly 50 percent of that comes from email. This percentage has been consistent for more than ten years. Don’t believe me? I have 120,000 Twitter followers, 750,000 YouTube subscribers, and 150,000 TikTok fans—and I would give them all up for my 100,000 email subscribers. Why? Every time I send an email, 40,000 people open it and consume my content. I’m not hoping the platform gods will allow me to reach them. On the other platforms, anywhere between 100 and 1 million people pay attention to my content, but it’s not consistent or in my control. I know what you’re saying: “C’mon, Noah, email is dead.” Now ask yourself, when was the last time you checked your email? Exactly. Email is used obsessively by over 4 billion people! It’s the largest way of communicating at scale that exists today. Eighty-nine percent of people check it EVERY DAY! Social media decides who and how many people you’re seen by. One tweak to the algorithm, and you’re toast. Remember the digital publisher LittleThings? Yeah, no one else does, either. They closed after they lost 75 percent of their 20,000,000 monthly visitors when Facebook changed its algorithm in 2018. CEO Joe Speiser says it killed his business and he lost $100 million. You own your email list. Forever. If AppSumo shuts down tomorrow, my insurance policy, my sweet sweet baby, my beloved, my email list comes with me and makes anything I do after so much easier. Because it’s mine. It also doesn’t cost you significant money to grow your list or to communicate with your list, whereas Facebook or Google ads consistently cost money.
”
Noah Kagan (Million Dollar Weekend: The Surprisingly Simple Way to Launch a 7-Figure Business in 48 Hours)
“
just as the Soviet secret police created the slavish Homo sovieticus through surveillance, rewards and punishments, so also the Facebook and YouTube algorithms have created internet trolls by rewarding certain base instincts while punishing the better angels of our nature.
”
Yuval Noah Harari (Nexus: A Brief History of Information Networks from the Stone Age to AI)
“
around 2010 there was a series of innovations that fundamentally changed these services. First and foremost, in 2009, Facebook introduced the “like” button and Twitter introduced the “retweet” button. Both of these innovations were then widely copied by other platforms, making viral content dissemination possible. These innovations quantified the success of every post and incentivized users to craft each post for maximum spread, which sometimes meant making more extreme statements or expressing more anger and disgust.[8] At the same time, Facebook began using algorithmically curated news feeds, which motivated other platforms to join the race and curate content that would most successfully hook users. Push notifications were released in 2009, pinging users with notifications throughout the day. The app store brought new advertising-driven platforms to smartphones. Front-facing cameras (2010) made it easier to take photos and videos of oneself, and the rapid spread of high-speed internet (reaching 61% of American homes by January 2010[9]) made it easier for everyone to consume everything quickly.
”
Jonathan Haidt (The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness)
“
When a “friend” is changed to an “acquaintance,” their posts are pushed lower in the Facebook algorithm and have a far lower chance of showing up in your feed.
”
S.J. Scott (10-Minute Digital Declutter: The Simple Habit to Eliminate Technology Overload)
“
New opportunities for New York as a high-tech hub are related to the evolution of the Internet, according to Chris Dixon: “Imagine the Internet as a house. The first phase— laying the foundation, the bricks—happened in the ‘90s. No wonder that Boston and California, heavy tech places with MIT and Stanford, dominated the scene at that time. The house has been built, now it’s more about interior design. Many interesting, recent companies haven’t been started by technologists but by design and product-oriented people, which has helped New York a lot. New York City has always been a consumer media kind of city, and the Internet is in need of those kinds of skills now. Actually, when I say design, it’s more about product-focused people. I’d put Facebook in that category. Everything requires engineers, but unlike Google, their breakthrough was not as scientific. It was a well-designed product that people liked to use. Google had a significant scientific breakthrough with their search algorithm. That’s not what drives Facebook. In The Social Network movie, when they write equations on the wall that’s just not what it is, it’s not about that. Every company has engineering problems, but Facebook is product-design driven.
”
Maria Teresa Cometto (Tech and the City: The Making of New York's Startup Community)
“
The monetary fines levied against Facebook for these violations were minuscule: less than a few hours of revenue.
”
Frank Pasquale (The Black Box Society: The Secret Algorithms That Control Money and Information)
“
Google or Facebook were once in the right place at the right time. It’s not clear whether they are still better than anyone else at online data science, or whether their prominence is such that they’ve become the permanent “default.
”
Frank Pasquale (The Black Box Society: The Secret Algorithms That Control Money and Information)
“
already today the Facebook algorithm is a better judge of human personalities and dispositions even than people's friends, parents and spouses.
”
Yuval Noah Harari
“
Once Google, Facebook and other algorithms become all-knowing oracles, they may well evolve into agents and finally into sovereigns.
”
Yuval Noah Harari
“
The privacy issue was reignited in early 2014, when the Wall Street Journal reported that Facebook had conducted a massive social-science experiment on nearly seven hundred thousand of its users. To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site’s data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users. As it turned out, the experiment was very “successful” in that it was relatively easy to manipulate users’ emotions, but the backlash from the blogosphere was horrendous. “Apparently what many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but to actually change our emotions,” wrote Sophie Weiner on AnimalNewYork.com.
”
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
“
So assuming one could argue that Google is a monopoly and needs to enter into a consent decree, would the Bell Labs model work? If Google were required to license every patent it owns for a nominal fee to any American company that asks for it, it would have to license its search algorithms, Android patents, self-driving car patents, smart-thermostat patents, advertising-exchange patents, Google Maps patents, Google Now patents, virtual-reality patents, and thousands of others. What is clear from the Bell Labs model is that such a solution actually benefits innovation in general.
”
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
“
Both Google search results and Facebook’s News Feed algorithm are based on showing us what they think we want. The
”
Jacob Silverman (Terms of Service: Social Media and the Price of Constant Connection)
“
Researchers have found that words used on Facebook are surprisingly reliable indicators of personality. Their results are published in the Journal of Personality and Social Psychology. The researchers utilized predictive algorithms of the language to create efficient large-scale personality assessments. The automated language-based models of traits were consistent with the participants' self-reported personality measurements.
”
Anonymous
“
Over the last generation, journalism has slowly been swallowed. The ascendant media companies of our era don’t think of themselves as heirs to a great ink-stained tradition. Some prefer to call themselves technology firms. This redefinition isn’t just a bit of fashionable branding. Silicon Valley has infiltrated the profession, from both within and without. Over the past decade, journalism has come to depend unhealthily on Facebook and Google. The big tech companies supply journalism with an enormous percentage of its audience—and therefore a big chunk of revenue. This gives Silicon Valley influence over the entire profession, and it has made the most of its power. Dependence generates desperation—a mad, shameless chase to gain clicks through Facebook, a relentless effort to game Google’s algorithms. It leads media to ink terrible deals, which look like self-preserving necessities, but really just allow Facebook and Google to hold them even tighter. Media will grant Facebook the right to sell advertising or give Google permission to publish articles directly on its fast-loading server. What makes these deals so terrible is the capriciousness of the tech companies. They like to shift quickly in a radically different direction, which is great for their bottom line, but terrible for all the media companies dependent on the platforms. Facebook will decide that its users prefer video to words, or that its users prefer ideologically pleasing propaganda to hard news. When Facebook shifts direction like this or when Google tweaks its algorithm, they instantly crash Web traffic flowing to media, with all the rippling revenue ramifications that follow. Media know they should flee the grasp of Facebook, but dependence also breeds cowardice. The prisoner lies on the cot dreaming of escape plans that will never hatch. Dependence on the big tech companies is increasingly the plight of the worker and the entrepreneur. 
Drivers maintain erratic patterns of sleep because of Uber’s shifting whims. Companies that manufacture tchotchkes sold on Amazon watch their businesses collapse when Amazon’s algorithms detect the profitability of their item, leading the giant to manufacture the goods itself at a lower price. The problem isn’t just financial vulnerability. It’s the way in which the tech companies dictate the patterns of work, the way in which their influence can shift the ethos of an entire profession to suit their needs—lowering standards of quality, eroding ethical protections. I saw this up close during my time at the New Republic. I watched how dependence on the tech companies undermined the very integrity of journalism. At the very beginning of that chapter in my career, I never imagined that we would go down that path.
”
Franklin Foer (World Without Mind: The Existential Threat of Big Tech)
“
The fifth factor in the T Algorithm is the ability to control the consumer experience, at purchase, through vertical integration.
”
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
“
told Nike that to have a shot at a trillion, they would need to do three things:
- Increase the percentage of direct-to-consumer retail to 40 percent within ten years (closer to 10 percent in 2016).
- Gain greater facility with data and how to incorporate it into product features.
- Move their headquarters from Portland.
As I learned, the algorithm is the easy part. Getting them to listen to you (“You need to relocate HQ from Portland”) is the hard part.
”
Scott Galloway (The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google)
“
this system was so entrenched, publication websites had dismantled their home pages to the point that they often featured only a few stories on the screen at a time, with a maximum of images and a minimum of text. When I browsed them, I felt like an unexpected visitor, someone who wasn’t supposed to be there. The sites all but shouted: Don’t you know you’re supposed to be on Facebook or Twitter!?
”
Kyle Chayka (Filterworld: How Algorithms Flattened Culture)
“
Our algorithms exploit the human brain’s attraction to divisiveness,” the researchers warned in a 2018 presentation later leaked to the Wall Street Journal. In fact, the presentation continued, Facebook’s systems were designed in a way that delivered users “more and more divisive content in an effort to gain user attention & increase time on the platform.
”
Max Fisher (The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World)
“
Humanovator (The Sonnet)
Chatgpt pampers plagiarism,
Facebook pampers conspiracy.
More and more innovations are
becoming catalyst of catastrophe.
Note, I didn't mention the birdie,
Very mindful, very demure.
Facebook can still be repaired,
but once a MAGA, always a sewer.
Innovation that outlives its usefulness,
is no longer innovation but carnivoration.
Innovators not in touch with soil-n-roots,
are predators of the concrete jungle.
The golden age of startups is behind us,
today it's mostly filth, fraud and smut.
Amidst the crowd of trust fund termites,
be the humanovator to humanize the world.
”
Abhijit Naskar (The Divine Refugee)
“
Companies like Facebook, Google, and Twitter have built sophisticated, planetary-scale machine-learning algorithms whose entire purpose is to generate engagement
”
Kevin Roose (Futureproof: 9 Rules for Surviving in the Age of AI)
“
PUTTING IT ALL TOGETHER
I’ve explained a lot of concepts in this chapter, so I want to recap it all into something a little more tangible.
Step #1: The first step is to figure out what type of show you want to have. If you’re a writer, then you should start a blog. If you like video, then you should start a vlog on one of the video platforms. Lastly, if you like audio, then you should start a podcast.
Step #2: Your show will be you documenting the process of achieving the same goal that your audience will be striving for. As you’re documenting your process, you’ll be testing your material and paying attention to the things that people respond to. If you commit to publishing your show every day for a year, you’ll have the ability to test your material and find your voice, and your dream customers will be able to find you.
Step #3: You’ll leverage your Dream 100 by interviewing them on your show. This will give you the ability to build relationships with them, give them a platform, give you the ability to promote their episode on your show to their audience, and get access to their friends and followers.
Step #4: Even though this is your own show, you’re renting time on someone else’s network. It’s important that you don’t forget it and that you focus on converting it into traffic that you own.
Figure 7.11: As you create your own show, focus on converting traffic that you earn and control into traffic that you own.
And with that, I will close out Section One of this book. So far, we’ve covered a lot of core principles to traffic. We:
- Identified exactly who your dream client is.
- Discovered exactly where they are congregating.
- Talked about how to work your way into those audiences (traffic that you earn) and how you buy your way into those audiences (traffic that you control).
- Learned how to take all the traffic that you earn and all the traffic that you buy and turn it all into traffic that you own (building your list).
- Discussed how to plug that list into a follow-up funnel so you can move them through your value ladder.
- Prepared to infiltrate your Dream 100, find your voice, and build your following by creating your own show.
In the next section, we’ll shift our focus to mastering the pattern to get traffic from any advertising networks (like Instagram, Facebook, Google, and YouTube) and how to understand their algorithms so you can get unlimited traffic and leads pouring into your funnels.
”
Russell Brunson (Traffic Secrets: The Underground Playbook for Filling Your Websites and Funnels with Your Dream Customers)
“
But as Facebook became a destination for political conversations, the human curation in “Trending Topics” wasn’t the actual problem. It was how human nature was manipulated by Facebook’s algorithm, and how Facebook looked away, that got the company in trouble.
”
Sarah Frier (No Filter: The inside story of Instagram)
“
News organizations had been designing more clickable headlines ever since the social network became key to their distribution. But those news organizations were getting beaten by these new players, who had come up with an easier, more lucrative way to go viral—by making up stories that played on Americans’ hopes and fears, and therefore winning via the Facebook algorithm.
”
Sarah Frier (No Filter: The inside story of Instagram)
“
With the social network, everyone had the potential to engage with whatever kinds of ideas they wanted to, and tended to have at least some Facebook connections with people who held different political opinions. But if people chose not to interact with those they disagreed with, was that really Facebook’s doing? Their algorithm was just showing people what they demonstrated, through their own behavior, they wanted to see, enhancing their existing preferences.
”
Sarah Frier (No Filter: The inside story of Instagram)
“
Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That’s not its concern. It only cares about one thing: Will you keep scrolling?

Unfortunately, there’s a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash.

Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It’s called “negativity bias.”

There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as “hates,” “obliterates,” “slams,” “destroys.” A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are “attack,” “bad,” and “blame.” A study by the Pew Research Center found that if you fill your Facebook posts with “indignant disagreement,” you’ll double your likes and shares.
So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it’s more enraging, it’s more engaging.
”
Johann Hari (Stolen Focus: Why You Can't Pay Attention—and How to Think Deeply Again)
“
The strategy used focus groups, psychographic modeling, and predictive algorithms, and it harvested private user data through online quizzes and contests, using a perfectly legal opt-in.
”
Brittany Kaiser (Targeted: The Cambridge Analytica Whistleblower's Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again)
“
The figure of 50 million was nearly twice the number of users whose data was stolen, and was what Cambridge had used to model some 240 million Americans. With that single, prodigious harvest, and personality profiling, Cambridge had been able to categorize, through predictive algorithms and the other data it had purchased, every single American over the age of eighteen according to many different models, including OCEAN scoring; that’s how it knew which individual Americans were “open,” “conscientious,” “neurotic,” and so on. And that’s what had made its microtargeting so precise and effective. It was one of the main ingredients in Cambridge’s secret sauce.
”
Brittany Kaiser (Targeted: The Cambridge Analytica Whistleblower's Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again)
“
Second, CA provided clients, political and commercial, with a benefit that set the company apart: the accuracy of its predictive algorithms. Dr. Alex Tayler, Dr. Jack Gillett, and CA’s other data scientists constantly ran new algorithms, producing much more than mere psychographic scores. They produced scores for every person in America, predicting on a scale of 0 to 100 percent how likely, for example, each was to vote; how likely each was to belong to a particular political party; or what toothpaste each was likely to prefer.
”
Brittany Kaiser (Targeted: The Cambridge Analytica Whistleblower's Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again)
“
Third, CA then took what they had learned from these algorithms and turned around and used platforms such as Twitter, Facebook, Pandora (music streaming), and YouTube to find out where the people they wished to target spent the most interactive time. Where was the best place to reach each person?
”
Brittany Kaiser (Targeted: The Cambridge Analytica Whistleblower's Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again)
“
My friend Bangaly Kaba, formerly head of growth at Instagram, called this idea the theory of “Adjacent Users.” He describes his experience at Instagram, which several years post-launch was growing fast but not at rocketship speed:

When I joined Instagram in 2016, the product had over 400 million users, but the growth rate had slowed. We were growing linearly, not exponentially. For many products, that would be viewed as an amazing success, but for a viral social product like Instagram, linear growth doesn’t cut it. Over the next 3 years, the growth team and I discovered why Instagram had slowed, developed a methodology to diagnose our issues, and solved a series of problems that reignited growth and helped us get to over a billion users by the time I left. Our success was anchored on what I now call The Adjacent User Theory. The Adjacent Users are aware of a product and possibly tried using it, but are not able to successfully become an engaged user. This is typically because the current product positioning or experience has too many barriers to adoption for them. While Instagram had product-market fit for 400+ million people, we discovered new groups of billions of users who didn’t quite understand Instagram and how it fit into their lives.

In my conversations with Bangaly on this topic, he described his approach as a systematic evaluation of the network of networks that constituted Instagram. Rather than focusing on the core network of Power Users—the loud and vocal minority that often drive product decisions—the approach was to constantly figure out the adjacent set of users whose experience was subpar. There might be multiple sets of nonfunctional adjacent networks at any given time, and it might require different approaches to fix each one. For some networks, it might be the features of the product, like Instagram not having great support for low-end Android apps. Or it might be because of the quality of their networks—if the right content creators or celebrities hadn’t yet arrived. You fix the experience for these users, then ask yourself again, who are the adjacent users? Then repeat. Bangaly describes this approach:

When I started at Instagram, the Adjacent User was women 35–45 years old in the US who had a Facebook account but didn’t see the value of Instagram. By the time I left Instagram, the Adjacent User was women in Jakarta, on an older 3G Android phone with a prepaid mobile plan. There were probably 8 different types of Adjacent Users that we solved for in-between those two points.

To solve for the needs of the Adjacent User, the Instagram team had to be nimble, focusing first on pulling the audience of US women from the Facebook network. This required the team to build algorithmic recommendations that utilized Facebook profiles and connections, so that Instagram could surface friends and family on the platform—not just influencers. Later on, targeting users in Jakarta and in other developing countries might involve completely different approaches—refining apps for low-end Android phones with low data connections. As the Adjacent User changes, the strategy has to change as well.
”
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
“
The root of the disinformation problem, of course, lay in the technology. Facebook was designed to throw gas on the fire of any speech that invoked an emotion, even if it was hateful speech—its algorithms favored sensationalism. Whether a user clicked on a link because they were curious, horrified, or engaged was immaterial; the system saw that the post was being widely read, and it promoted it more widely across users’ Facebook pages. The situation in Myanmar was a deadly experiment in what could happen when the internet landed in a country where a social network became the primary, and most widely trusted, source of news.
”
Sheera Frenkel (An Ugly Truth: Inside Facebook's Battle for Domination)
“
In August of 2016 Facebook announced it was changing its news-feed algorithm to try to cut down on the amount of click bait that appears on the site. It remains to be seen how this will affect quality journalism organizations that are dependent on Facebook traffic.
”
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
“
In 2017 an artificial intelligence program crushed the world’s top players at Texas Hold ’Em—that is to say, it knew how to bluff. Given enough examples, AI programs can now learn almost anything: Facebook’s DeepFace algorithm recognizes specific human faces in photos 97 percent of the time,
”
Bill McKibben (Falter: Has the Human Game Begun to Play Itself Out?)
“
Tips on Web Design and Site Marketing

Web content is king, which is why we have devoted an entire chapter to it later in this book. It is what draws visitors and ultimately what converts them to customers. So, try to make your web content as engaging as possible. Make sure the content is interactive, unique, and educational. Ensure that visitors have the option of plugins while encouraging them to visit as many pages on your site as possible if they want to obtain vital information. The images you use on your website should be both enticing and descriptive in nature.

In today’s world, social media is all-pervasive. To encourage visitors to share your web content, you can include icons of social media platforms on your website. In some select cases, consider integrating social media feeds, like Facebook or Instagram, into your website so that they automatically show the latest postings.

A "Call-to-Action" can help convert visitors to your site into customers. Always use very clear and concise "Call-to-Action" language. Understand what type of conversion you are looking for, and try to provide multiple levels of conversion. For example, a plastic surgeon may provide Schedule an Appointment as a call to action, which will attract only the segment of web visitors who have reached their decision stage. Adding conversion points for visitors who are at earlier stages of their decision making, like signing up for a webcast or your newsletter, can help you widen your conversion points and provide inputs to your email marketing.

To raise the average amount of time a visitor spends on your website and to minimize the bounce rate, ensure that your website offers a user-friendly and attractive design. This way you will increase the number of links you have on your website and boost its SEO ranking (Tip: While Google’s algorithm is not public, our iterative testing shows that sites with good usability analytics metrics, like time on site and bounce rate, play favorably in Google’s algorithm, other things remaining constant). Ensure you observe due diligence when designing a website that will enable visitors to navigate in different languages. For example, you may need a lot more space for your menu, as there are languages that take up more space than English.
”
Danny Basu (Digital Doctor: Integrated Online Marketing Guide for Medical and Dental Practices)
“
Data scientist Jeff Hammerbacher, former manager of the Data group at Facebook, once told Bloomberg Businessweek that “the best minds of my generation are thinking about how to make people click ads.”
”
Brian Christian (Algorithms to Live By: The Computer Science of Human Decisions)
“
Volume is key. Twitter now estimates that Russia used more than fifty thousand automated accounts or bots to Tweet election-related content during the 2016 presidential campaign. Twitter and Facebook are the best-known disinformation superhighways, but there are many others. Russian officers have infiltrated everything from 4chan to Pinterest.
”
Amy B. Zegart (Spies, Lies, and Algorithms: The History and Future of American Intelligence)