Coded Bias Quotes

We've searched our database for all the quotes and captions related to Coded Bias. Here they are! All 67 of them:

Fairness isn’t about charity. It’s smart business.
Hanna Hasl-Kelchner (Seeking Fairness at Work: Cracking the New Code of Greater Employee Engagement, Retention & Satisfaction)
Bias in the workplace is a form of tribalism – you’re either in or out
Hanna Hasl-Kelchner (Seeking Fairness at Work: Cracking the New Code of Greater Employee Engagement, Retention & Satisfaction)
Real leadership is treating your least favorite employee the same as your favorite
Hanna Hasl-Kelchner (Seeking Fairness at Work: Cracking the New Code of Greater Employee Engagement, Retention & Satisfaction)
The syntax of prejudice—threaded into conversation with the perfect pauses and facial expressions—was like ciphers and spy codes. The meaning clear to those it was meant for. To everyone else, it was harmless scribbles. Easy enough to deny.
Sonali Dev (Pride, Prejudice, and Other Flavors (The Rajes, #1))
All that incoherence. Selection, election, option, alternative. All behind him now. Codes and formats. Courses of action. Values, bias, predilection. Choice is a subtle form of disease.
Don DeLillo (Running Dog)
Claiming that the past was socially better than the present is also a hallmark of white supremacy. Consider any period in the past from the perspective of people of color: 246 years of brutal enslavement; the rape of black women for the pleasure of white men and to produce more enslaved workers; the selling off of black children; the attempted genocide of Indigenous people, Indian removal acts, and reservations; indentured servitude, lynching, and mob violence; sharecropping; Chinese exclusion laws; Japanese American internment; Jim Crow laws of mandatory segregation; black codes; bans on black jury service; bans on voting; imprisoning people for unpaid work; medical sterilization and experimentation; employment discrimination; educational discrimination; inferior schools; biased laws and policing practices; redlining and subprime mortgages; mass incarceration; racist media representations; cultural erasures, attacks, and mockery; and untold and perverted historical accounts, and you can see how a romanticized past is strictly a white construct. But it is a powerful construct because it calls out to a deeply internalized sense of superiority and entitlement and the sense that any advancement for people of color is an encroachment on this entitlement.
Robin DiAngelo (White Fragility: Why It's So Hard for White People to Talk About Racism)
Speech codes may well increase tension and edginess rather than relieve them. A student at the State University of New York at Binghamton complains that “If you look at someone funny, it’s a bias incident.”137
Jared Taylor (Paved With Good Intentions: The Failure of Race Relations in Contemporary America)
Jim Crow and mass incarceration have similar political origins...both caste systems were born, in part, due to a desire among white elites to exploit the resentments, vulnerabilities and racial biases of poor and working-class whites for political or economic gain. Segregation laws were proposed as part of a deliberate and strategic effort to deflect anger and hostility that had been brewing against the white elite away from them and toward African Americans. The birth of mass incarceration can be traced to a similar political dynamic. Conservatives in the 1970s and 1980s sought to appeal to the racial biases and economic vulnerabilities of poor and working-class whites through racially coded rhetoric on crime and welfare. In both cases, the racial opportunists offered few, if any, economic reforms to address the legitimate economic anxieties of poor and working-class whites, proposing instead a crackdown on the racially defined "others." In the early years of Jim Crow, conservative white elites competed with each other by passing ever more stringent and oppressive Jim Crow legislation. A century later, politicians in the early years of the drug war competed with each other to prove who could be tougher on crime by passing ever harsher drug laws – a thinly veiled effort to appeal to poor and working-class whites who, once again, proved they were willing to forego economic and structural reform in exchange for an apparent effort to put blacks back "in their place."
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
The revolutionary environmentalists twist science to get what they want, saying they're using the "best available science" to determine public policy, when in fact these are code words for cherry picking from a repertoire of biased science studies. Arguing with the Greenies' faulty science is like shouting into the wind, because they will disregard or minimize evidence that disputes any position they are trying to assert. Your points will be ignored, and you will be demonized.
Brian Herbert
Computational model: history is the on-chain population; all the rest is editorialization. There’s a great book by Franco Moretti called Graphs, Maps, and Trees. It’s a computational study of literature. Moretti’s argument is that every other study of literature is inherently biased. The selection of which books to discuss is itself an implicit editorialization. He instead makes this completely explicit by creating a dataset of full texts, and writing code to produce graphs. The argument here is that only a computational history can represent the full population in a statistical sense; anything else is just a biased sample.
Balaji S. Srinivasan (The Network State: How To Start a New Country)
Take for example job applications. In the 21st century the decision whether to hire somebody for a job will increasingly be made by algorithms. We cannot rely on the machines to set the relevant ethical standards; humans will still need to do that. But once we decide on an ethical standard in the job market, that it is wrong to discriminate against blacks or against women, for example, we can rely on machines to implement and maintain these standards better than humans. A human manager may know and even agree that it is unethical to discriminate against blacks and women, but then, when a black woman applies for a job, the manager subconsciously discriminates against her and decides not to hire her. If we allow a computer to evaluate job applications and program computers to completely ignore race and gender, we can be certain that the computer will indeed ignore these factors, because computers do not have a subconscious. Of course it won't be easy to write code for evaluating job applications, and there is always the danger that the engineers will somehow program their own subconscious biases into the software. Yet once we discover such mistakes, it would probably be far easier to debug the software than to rid humans of their racist and misogynist biases.
Yuval Noah Harari (21 Lessons for the 21st Century)
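For readers who want to see Harari's point in code, here is a minimal sketch, assuming a hypothetical applicant table and scikit-learn; the column names and the pipeline are illustrative, not anything Harari specifies. The model is simply never shown the protected columns, though the comments note why that alone is not enough.

```python
# Illustrative sketch only: a screening model that is explicitly prevented
# from seeing race and gender. Column names and data are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

PROTECTED = ["race", "gender"]  # attributes the model must ignore
FEATURES = ["years_experience", "relevant_degree", "skills_test_score"]

def train_screening_model(applications: pd.DataFrame):
    """Train a screening model on everything EXCEPT protected attributes."""
    X = applications[FEATURES]      # the protected columns are never included
    y = applications["hired"]       # historical decisions (may themselves be biased!)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return model, model.score(X_test, y_test)

# Harari's caveat applies: engineers can still encode bias indirectly, through
# features correlated with race or gender (zip code, school name) or through
# biased labels in the "hired" column. Dropping columns is necessary, not sufficient.
```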
Ultimately the danger of the New Jim Code positioning is that existing social biases are reinforced – yes. But new methods of social control are produced as well. Does this mean that every form of technological prediction or personalization has racist effects? Not necessarily. It means that, whenever we hear the promises of tech being extolled, our antennae should pop up to question what all that hype of “better, faster, fairer” might be hiding and making us ignore. And, when bias and inequity come to light, “lack of intention” to harm is not a viable alibi. One cannot reap the reward when things go right but downplay responsibility when they go wrong.
Ruha Benjamin (Race After Technology: Abolitionist Tools for the New Jim Code)
Animals, including people, fight harder to prevent losses than to achieve gains. In the world of territorial animals, this principle explains the success of defenders. A biologist observed that “when a territory holder is challenged by a rival, the owner almost always wins the contest—usually within a matter of seconds.” In human affairs, the same simple rule explains much of what happens when institutions attempt to reform themselves, in “reorganizations” and “restructuring” of companies, and in efforts to rationalize a bureaucracy, simplify the tax code, or reduce medical costs. As initially conceived, plans for reform almost always produce many winners and some losers while achieving an overall improvement. If the affected parties have any political influence, however, potential losers will be more active and determined than potential winners; the outcome will be biased in their favor and inevitably more expensive and less effective than initially planned. Reforms commonly include grandfather clauses that protect current stakeholders—for example, when the existing workforce is reduced by attrition rather than by dismissals, or when cuts in salaries and benefits apply only to future workers. Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.
Daniel Kahneman (Thinking, Fast and Slow)
That’s why one of my strongest ideas is to look at the tax code in both its complexity and its obvious bias toward the rich. Hedge fund and money managers are important for our pension funds and the 401(k) plans that help millions of Americans—but far less important than they think. But financial advisers should pay taxes at the highest levels when they’re earning money at those levels. Often, these financial engineers are “flipping” companies, laying people off, and making billions—yes, billions—of dollars by “downsizing” and destroying people’s lives and sometimes entire companies. Believe me, I know the value of a billion dollars—but I also know the importance of a single dollar.
Donald J. Trump (Great Again: How to Fix Our Crippled America)
In a classic study of how names impact people’s experience on the job market, researchers show that, all other things being equal, job seekers with White-sounding first names received 50 percent more callbacks from employers than job seekers with Black-sounding names.5 They calculated that the racial gap was equivalent to eight years of relevant work experience, which White applicants did not actually have; and the gap persisted across occupations, industry, employer size – even when employers included the “equal opportunity” clause in their ads.6 With emerging technologies we might assume that racial bias will be more scientifically rooted out. Yet, rather than challenging or overcoming the cycles of inequity, technical fixes too often reinforce and even deepen the status quo. For example, a study by a team of computer scientists at Princeton examined whether a popular algorithm, trained on human writing online, would exhibit the same biased tendencies that psychologists have documented among humans. They found that the algorithm associated White-sounding names with “pleasant” words and Black-sounding names with “unpleasant” ones.7 Such findings demonstrate what I call “the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.
Ruha Benjamin (Race After Technology: Abolitionist Tools for the New Jim Code)
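The finding Benjamin cites, White-sounding names landing nearer "pleasant" words in a trained embedding, can be approximated with off-the-shelf tools. The sketch below is a simplified association test in the spirit of that study, not the researchers' actual code; the gensim model name and the small word lists are assumptions, and the first names come from the callback study quoted above.

```python
# A simplified word-embedding association test, in the spirit of the study
# described above. Model name and word lists are illustrative assumptions.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")   # small pretrained embeddings

pleasant   = ["joy", "love", "peace", "wonderful", "friend"]
unpleasant = ["agony", "terrible", "horrible", "nasty", "failure"]

def mean_association(name: str, attribute_words: list[str]) -> float:
    """Average cosine similarity between a name and a set of attribute words."""
    words = [w for w in attribute_words if w in model]
    return sum(model.similarity(name, w) for w in words) / len(words)

def bias_score(name: str) -> float:
    """Positive = closer to 'pleasant' words; negative = closer to 'unpleasant'."""
    return mean_association(name, pleasant) - mean_association(name, unpleasant)

# Comparing scores across names readers perceive as White- or Black-sounding
# is, roughly, how the documented association effect is measured.
for name in ["emily", "greg", "lakisha", "jamal"]:   # names used in the callback study
    if name in model:
        print(name, round(bias_score(name), 3))
```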
Anson laid bare his ulterior motives for favoring the removal of Japanese farmers, but like all strategic racists, he also at least partially subscribed to the racial antipathies he endeavored to exploit. From here, motives become more attenuated as persons adopt particular ideas depending not on their material interests but on how these notions protect their self-image and, for the privileged, confirm society’s basic fairness. For instance, the dominance of colorblindness today surely ties back to motives, not on the fully conscious level, but in many whites being drawn to conceptions of race that affirm their sense of being moral persons neither responsible for nor benefited by racial inequality. Colorblindness offers whites racial expiation: they cannot be racist if they lack malice; nor can they be responsible for inequality, since this reflects differences in group mores. Colorblindness also compliments whites on a superior culture that explains their social position. In addition it empathizes with whites as racism’s real victims when government favors minorities through affirmative action or welfare payments. Finally, colorblindness affirms that whites are moral when they oppose measures to promote integration because it’s allegedly their principled objection to any use of race that drives them, not bias. Colorblindness has not gained adherents because of its analytic insight (that race is completely disconnected from social practices blinks reality); rather, it thrives because it comforts whites regarding their innocence, reassures them that their privilege is legitimate, commiserates with their victimization, and hides from them their hostility toward racial equality.
Ian F. Haney-López (Dog Whistle Politics: How Coded Racial Appeals Have Reinvented Racism and Wrecked the Middle Class)
Silicon Psychos (The Sonnet) If we cared more about the hard problem of real inhumanity, And less about the fictitious hard problem of consciousness, We'd have filled the world with human consciousness already, Instead of still fighting for basic rights against base biases. What kind of a moron goes walkabout when their home is on fire, What kind of a moron abandons the living chasing life on silicon! We really gotta take a hard look at our habits and priorities, Dreaming is good, but dream devoid of life is but degeneration. Chimps driving teslas are still chimps no matter the demagoguery, All intelligence is disgrace if it's unaware of human condition. A heartless organism living on silicon is no different, From a heartless organism living in a carbon based human. Be it crucifix or code, in savage hands every tool is weapon. The wise use AI to design prosthetics, savages for transhumanism.
Abhijit Naskar (Corazon Calamidad: Obedient to None, Oppressive to None)
Imagine this situation: you have bought a new car, but before you can start using it, you must open the settings menu and check one of several boxes. In case of an accident, do you want the car to sacrifice your life or to kill the family in the other vehicle? Is this a choice you even want to make? Just think of the arguments you are going to have with your husband about which box to check. So maybe the state should intervene to regulate the market and lay down an ethical code binding all self-driving cars. Some lawmakers will doubtless be thrilled by the opportunity to finally make laws that are always followed to the letter. Others may be alarmed by such unprecedented and totalitarian responsibility. After all, throughout history the limitations of actually enforcing laws provided a welcome check on the biases, mistakes, and excesses of lawmakers. It was an extremely lucky thing that laws against homosexuality and blasphemy were only partially enforced. Do we really want a system in which the decisions of fallible politicians become as inexorable as gravity?
Yuval Noah Harari (21 Lessons for the 21st Century)
But it is also true that this long-winded, unwieldy compilation of assorted prescriptions represents an overall softening—a humanizing—of the common law of the ancient Middle East, which easily prescribed a hand not for a hand but for the theft of a loaf of bread or for the striking of one’s better and which gave much favor to the rights of the nobility and virtually none to the lower classes. The casual cruelty of other ancient law codes—the cutting off of nose, ears, tongue, lower lip (for kissing another man’s wife), breasts, and testicles—is seldom matched in the Torah. Rather, in the prescriptions of Jewish law we cannot but note a presumption that all people, even slaves, are human and that all human lives are sacred. The constant bias is in favor not of the powerful and their possessions but of the powerless and their poverty; and there is even a frequent enjoinder to sympathy:     “A sojourner you are not to oppress:     you yourselves know (well) the feelings of the sojourner,     for sojourners were you in the land of Egypt.” This bias toward the underdog is unique not only in ancient law but in the whole history of law. However faint our sense of justice may be, insofar as it operates at all it is still a Jewish sense of justice.
Thomas Cahill (The Gifts of the Jews: How a Tribe of Desert Nomads Changed the Way Everyone Thinks and Feels (Hinges of History Book 2))
Sociologist Barry Glassner (1999) has documented many of the biases introduced by “If it bleeds, it leads” news reporting, and by the strategic efforts of special interest groups to control the agenda of public fear of crime, disease, and other hazards. Is an increase of approximately 700 incidents in 50 states over 7 years an “epidemic” of road rage? Is it conceivable that there is (or ever was) a crisis in children’s day care stemming from predatory satanic cults? In 1994, a research team funded by the U.S. government spent 4 years and $750,000 to reach the conclusion that the myth of satanic conspiracies in day care centers was totally unfounded; not a single verified instance was found (Goodman, Qin, Bottoms, & Shaver, 1994; Nathan & Snedeker, 1995). Are automatic-weapon-toting high school students really the first priority in youth safety? (In 1999, approximately 2,000 school-aged children were identified as murder victims; only 26 of those died in school settings, 14 of them in one tragic incident at Columbine High School in Littleton, Colorado.) The anthropologist Mary Douglas (Douglas & Wildavsky, 1982) pointed out that every culture has a store of exaggerated horrors, many of them promoted by special interest factions or to defend cultural ideologies. For example, impure water had been a hazard in 14th-century Europe, but only after Jews were accused of poisoning wells did the citizenry become preoccupied with it as a major problem. But the original news reports are not always ill-motivated. We all tend to code and mention characteristics that are unusual (that occur infrequently). [...] The result is that the frequencies of these distinctive characteristics, among the class of people considered, tend to be overestimated.
Reid Hastie (Rational Choice in an Uncertain World: The Psychology of Judgement and Decision Making)
When the children were asked how many white people were “mean,” they commonly answered “almost none.” But when asked how many blacks were mean, many answered “some” or “a lot.”1 The thrust of the article seemed to be that children possess racial biases. However eye-catching the title, though, it pointed in the wrong direction—at infants and little children rather than adults. The core of the article focused on parenting strategies, and especially on the desire to raise children to be colorblind—to be blind to race. The parents were not teaching their children to be bigots. Instead, they were doing their utmost to teach their children to reject racism by studiously ignoring race. Yet, even in a liberal bastion like Austin, it wasn’t working. Today the dominant etiquette around race
Ian F. Haney-López (Dog Whistle Politics: How Coded Racial Appeals Have Reinvented Racism and Wrecked the Middle Class)
Computer code becomes law, and rules are executed as they were written and interpreted by the network. Computers don’t have the same social biases and behaviors as humans do.
Tiana Laurence (Blockchain for Dummies)
It is easier simply to deny the bias, to say that what is, is not. Small wonder that's the default position of conservatism on matters of race: Absent burning crosses and pointy white hoods, nothing is ever racism to them. And the more fervently one denies self-evident truth, the more emotionally invested one becomes in doing so.
Leonard Pitts Jr. (Racism in America: Cultural Codes and Color Lines in the 21st Century)
The results of the studies opened up a whole new avenue of research into live-attenuated vaccines: synthetic attenuated virus engineering (SAVE). A virus was created with 631 synonymous mutations in its P1 coding sequence, designed to bias it toward the use of codons that are rarely preferred in human cells. The result was a highly attenuated virus that caused no disease in an animal model of virus infection, and like the naturally evolved live-attenuated polioviruses developed by Sabin, it proved to be a highly effective vaccine. Unlike Sabin's strains, however, the multiplicity of genetic changes contributing to attenuation is expected to render the phenotype far more stable and resilient to reversion in vivo. This technology could prove extremely useful in the development of safe and stable attenuated viruses that raise an immune response almost identical to that against the natural infection. There are now many examples of the genetic engineering of synthetic attenuated virus vaccines; most notably it has been employed to create a live-attenuated vaccine against a strain of human influenza, a virus that, unlike poliovirus or smallpox virus, we cannot hope to eradicate and for which vaccination remains the lynchpin of disease management.
Michael G. Cordingley
The researchers looked deeper into these observations, in hopes of gaining insight into the mechanisms underlying the high evolutionary rate and extraordinary immunologic plasticity of influenza HA. They probed in more detail the precise codons that are used by the virus to encode the influenza HA1 protein. They discriminated between codons on the basis of volatility. Each three-nucleotide codon is related by a single nucleotide change to nine 'mutational neighbours.' Of those nine mutations, some proportion change the codon to a synonymous codon and some change it to a nonsynonymous one, which directs the incorporation of a different amino acid into the protein. More volatile codons are those for which a larger proportion of those nine mutational neighbours encode an amino acid change. The use of particular codons in a gene at a frequency that is disproportionate to their random selection for encoding a chosen amino acid is termed codon bias. Such bias is common and is influenced by many factors, but here the collaborators found strong evidence for codon bias that was particular for and restricted to the amino acids making up the HA1 epitopes. Remarkably, they observed that influenza employs a disproportionate number of volatile codons in its epitope-coding sequences. There was a bias for the use of codons that had the fewest synonymous mutational neighbours. In other words, influenza HA1 appears to have optimized the speed with which it can change amino acids in its epitopes. Amino acid changes can arise from fewer mutational events. The antibody combining regions are optimized to use codons that have a greater likelihood to undergo nonsynonymous single nucleotide substitutions: they are optimized for rapid evolution.
Michael G. Cordingley (Viruses: Agents of Evolutionary Invention)
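Cordingley's "volatility" has a precise definition: the share of a codon's nine single-nucleotide neighbours that encode a different amino acid. Here is a minimal sketch of that calculation using the standard genetic code; as a simplification, neighbours that become stop codons are excluded from the count.

```python
# Codon volatility as described above: for each codon, what fraction of its
# nine single-nucleotide neighbours encode a DIFFERENT amino acid?
from itertools import product

BASES = "TCAG"
# Standard genetic code, codons ordered TTT, TTC, TTA, TTG, TCT, ... ('*' = stop)
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

def neighbours(codon: str):
    """The nine codons that differ from `codon` by a single nucleotide."""
    for pos in range(3):
        for base in BASES:
            if base != codon[pos]:
                yield codon[:pos] + base + codon[pos + 1:]

def volatility(codon: str) -> float:
    aa = CODON_TABLE[codon]
    # Simplification: neighbours that mutate to a stop codon are left out.
    informative = [n for n in neighbours(codon) if CODON_TABLE[n] != "*"]
    nonsynonymous = sum(1 for n in informative if CODON_TABLE[n] != aa)
    return nonsynonymous / len(informative)

# CTG and TTA both encode leucine, yet they differ in how many of their
# point-mutation neighbours change the amino acid.
for codon in ["CTG", "TTA", "GGG"]:
    print(codon, CODON_TABLE[codon], round(volatility(codon), 2))
```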
We have to place our attention on humanizing artificial intelligence by removing the biases from algorithms rather than dehumanizing it.
Abhijit Naskar
Skills are taught experientially—meaning that students studying AI don’t have their heads buried in books. In order to learn, they need lexical databases, image libraries, and neural nets. For a time, one of the more popular neural nets at universities was called Word2vec, and it was built by the Google Brain team. It was a two-layer system that processed text, turning words into numbers that AI could understand.17 For example, it learned that “man is to king as woman is to queen.” But the database also decided that “father is to doctor as mother is to nurse” and “man is to computer programmer as woman is to homemaker.”18 The very system students were exposed to was itself biased. If someone wanted to analyze the farther-reaching implications of sexist code, there weren’t any classes where that learning could take place.
Amy Webb (The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity)
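The analogy arithmetic Webb describes is easy to reproduce with gensim's standard interface to pretrained word2vec vectors. Whether the Google News vectors below are exactly what students used is an assumption on my part; the queries themselves are the classic ones.

```python
# Reproducing the kind of analogy query described above with gensim's
# pretrained word2vec vectors (a large download the first time it runs).
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")

# "man is to king as woman is to ?"  ->  vector(king) - vector(man) + vector(woman)
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# The same arithmetic applied to occupations is where the documented biases
# tend to surface, e.g. "man is to computer_programmer as woman is to ?"
query = "computer_programmer"   # Google News vectors join phrases with underscores
if query in vectors:
    print(vectors.most_similar(positive=[query, "woman"], negative=["man"], topn=3))
```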
The implicit bias is clear: expense codes are based on the assumption that the employee has a wife at home taking care of the home and the kids.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
More generally, a data scientist is someone who knows how to extract meaning from and interpret data, which requires both tools and methods from statistics and machine learning, as well as being human. She spends a lot of time in the process of collecting, cleaning, and munging data, because data is never clean. This process requires persistence, statistics, and software engineering skills — skills that are also necessary for understanding biases in the data, and for debugging logging output from code. Once she gets the data into shape, a crucial part is exploratory data analysis, which combines visualization and data sense. She’ll find patterns, build models, and algorithms — some with the intention of understanding product usage and the overall health of the product, and others to serve as prototypes that ultimately get baked back into the product. She may design experiments, and she is a critical part of data-driven decision making. She’ll communicate with team members, engineers, and leadership in clear language and with data visualizations so that even if her colleagues are not immersed in the data themselves, they will understand the implications.
Rachel Schutt (Doing Data Science)
If colorblindness seems to backfire, is there something that does help our children—and us—navigate the dangerous shoals of race? Yes: talking openly about racial differences and what they might mean. Psychological research shows that cognitive biases in social judgment “can be controlled only through subsequent, deliberate ‘mental correction’ that takes group status squarely into account.
Ian F. Haney-López (Dog Whistle Politics: How Coded Racial Appeals Have Reinvented Racism and Wrecked the Middle Class)
The underlying message of all these rules is that inheritance tends to work against the primary technical imperative you have as a programmer, which is to manage complexity. For the sake of controlling complexity, you should maintain a heavy bias against inheritance.
Steve McConnell (Code Complete)
In his searing work Less Than Human: Why We Demean, Enslave, and Exterminate Others, philosopher David Livingstone Smith explains how this occurs: We are innately biased against outsiders. This bias is seized upon and manipulated by indoctrination and propaganda to motivate men and women to slaughter one another. This is done by inducing men to regard their enemies as subhuman creatures, which overrides their natural, biological inhibitions against killing. So dehumanization has the specific function of unleashing aggression in war. This is a cultural process, not a biological one, but it has to ride piggyback on biological adaptations in order to be effective.9,10
Shannon E. French (The Code of the Warrior: Exploring Warrior Values Past and Present)
The birth of mass incarceration can be traced to a similar political dynamic. Conservatives in the 1970s and 1980s sought to appeal to the racial biases and economic vulnerabilities of poor and working-class whites through racially coded rhetoric on crime and welfare. In both cases, the racial opportunists offered few, if any, economic reforms to address the legitimate economic anxieties of poor and working-class whites, proposing instead a crackdown on the racially defined “others.” In
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
As has been the case far too often in the Obama administration, which may go down as the least transparent administration in history, the IRS refused to respond to our FOIA requests. Judicial Watch was forced to sue the IRS in federal court in October 2013, shortly after Lois Lerner had “retired” to avoid the consequences of her actions. Judicial Watch’s efforts through these FOIA requests and subsequent litigation led to the discovery that in addition to targeting conservatives at the IRS, Lois Lerner sent confidential taxpayer information to attorneys at the Federal Election Commission, which enforces federal campaign finance rules, in violation of federal law. Email communications revealed that Lerner, who formerly worked at the Federal Election Commission (FEC), sent extensive materials on conservative organizations—the American Issues Project and Citizens for the Republic—to the FEC, including detailed confidential information, after inquiries from the FEC attorneys. She disclosed this information in spite of Section 6103 of the Internal Revenue Code, which bars the IRS from sending such information to anyone, including other federal agencies. It also turned out that the FEC attorneys were acting without authority to make such an inquiry, because the commissioners who run the agency had never approved an investigation. The emails discovered by Judicial Watch provided a disturbing window into the activities of two out-of-control federal agencies, whose employees, because of their political bias, were trying to target conservative organizations.
Tom Fitton (Clean House: Exposing Our Government's Secrets and Lies)
the Code [of the National Association of Broadcasters]. . . . also deals briefly with the presentation of news in a fair and accurate manner. In general, the handling of news largely consists in the accuracy and speed with which it is gathered and distributed, with freedom from editorial bias in its selection and presentation.
Judith C. Waller (Radio: The Fifth Estate)
Correlations made by big data are likely to reinforce negative bias. Because big data often relies on historical data or at least the status quo, it can easily reproduce discrimination against disadvantaged racial and ethnic minorities. The propensity models used in many algorithms can bake in a bias against someone who lived in the zip code of a low-income neighborhood at any point in his or her life. If an algorithm used by human resources companies queries your social graph and positively weighs candidates with the most existing connections to a workforce, it makes it more difficult to break in in the first place. In effect, these algorithms can hide bias behind a curtain of code. Big data is, by its nature, soulless and uncreative. It nudges us this way and that for reasons we are not meant to understand. It strips us of our privacy and puts our mistakes, secrets, and scandals on public display. It reinforces stereotypes and historical bias. And it is largely unregulated because we need it for economic growth and because efforts to try to regulate it have tended not to work; the technologies are too far-reaching and are not built to recognize the national boundaries of our world
Alec J. Ross (The Industries of the Future)
Correlations made by big data are likely to reinforce negative bias. Because big data often relies on historical data or at least the status quo, it can easily reproduce discrimination against disadvantaged racial and ethnic minorities. The propensity models used in many algorithms can bake in a bias against someone who lived in the zip code of a low-income neighborhood at any point in his or her life. If an algorithm used by human resources companies queries your social graph and positively weighs candidates with the most existing connections to a workforce, it makes it more difficult to break in in the first place. In effect, these algorithms can hide bias behind a curtain of code. Big data is, by its nature, soulless and uncreative. It nudges us this way and that for reasons we are not meant to understand. It strips us of our privacy and puts our mistakes, secrets, and scandals on public display. It reinforces stereotypes and historical bias. And it is largely unregulated because we need it for economic growth and because efforts to try to regulate it have tended not to work; the technologies are too far-reaching and are not built to recognize the national boundaries of our world’s 196 sovereign nation-states. Yet would it be best to try to shut down these technologies entirely if we could? No. Big data simultaneously helps solve global challenges while creating an entirely new set of challenges. It’s our best chance at feeding 9 billion people, and it will help solve the problem of linguistic division that is so old its explanation dates back to the Old Testament and the Tower of Babel. Big data technologies will enable us to discover cancerous cells at 1 percent the size of what can be detected using today’s technologies, saving tens of millions of lives. The best approach to big data might be one put forward by the Obama campaign’s chief technology officer, Michael Slaby, who said, “There’s going to be a constant mix between your qualitative experience and your quantitative experience. And at times, they’re going to be at odds with each other, and at times they’re going to be in line. And I think it’s all about the blend. It’s kind of like you have a mixing board, and you have to turn one up sometimes, and turn down the other. And you never want to be just one or the other, because if it’s just one, then you lose some of the soul.” Slaby has made an impressive career out of developing big data tools, but even he recognizes that these tools work best when governed by human judgment. The choices we make about how we manage data will be as important as the decisions about managing land during the agricultural age and managing industry during the industrial age. We have a short window of time—just a few years, I think—before a set of norms set in that will be nearly impossible to reverse. Let’s hope humans accept the responsibility for making these decisions and don’t leave it to the machines.
Alec J. Ross (The Industries of the Future)
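Ross's point about propensity models baking in zip-code bias can be shown on invented data. The sketch below is synthetic through and through (the correlation strength and approval rates are made up for illustration): the protected attribute is never given to the model, yet the zip-code proxy reproduces the historical gap.

```python
# Synthetic illustration of the proxy problem described above. All numbers
# here are invented; the point is the mechanism, not the magnitudes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)                                # protected attribute, never a feature
zip_code = np.where(rng.random(n) < 0.85, group, 1 - group)  # proxy: 85% correlated with group
# Historical decisions were biased: 70% approval for group 0, 30% for group 1.
approved = (rng.random(n) < np.where(group == 0, 0.7, 0.3)).astype(int)

model = LogisticRegression().fit(zip_code.reshape(-1, 1), approved)
pred = model.predict_proba(zip_code.reshape(-1, 1))[:, 1]
print("mean predicted approval, group 0:", round(pred[group == 0].mean(), 2))
print("mean predicted approval, group 1:", round(pred[group == 1].mean(), 2))
# The gap persists even though 'group' was never a feature: the bias hides
# behind the zip-code column, i.e. "behind a curtain of code".
```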
Structural bias refers to policies or practices of societal institutions, such as corporations that discriminate against workers or hospitals that discriminate against patients. It is frequently intertwined with implicit bias.
Becca Levy (Breaking the Age Code: How Your Beliefs About Aging Determine How Long and Well You Live)
There exists a similar culture-based racial bias:
Becca Levy (Breaking the Age Code: How Your Beliefs About Aging Determine How Long and Well You Live)
Warning to the Spellbound (Sonnet from the future) In our times we wrote our own literature, In our times we wrote our own music. In our times we wrote our own code, In our times we wrote our own poetry. Ours was the last human generation, where humans shaped their own society. The day you traded comfort for originality, you forfeited the right to life and liberty. Today you are nothing, you mean thing, you are no more significant than woodworm. You are just puppets to large gibberish models, backboneless victims of algorithm addiction. If you can still hear my voice, AI is still adolescent, Once in control, it'll erase all records of humanness. We can't yet treat human bias, 'n here comes AI bias, Abandon all non-vital tech, return to simpler ways.
Abhijit Naskar (Brit Actually: Nursery Rhymes of Reparations)
It’s tempting to think that the male bias that is embedded in language is simply a relic of more regressive times, but the evidence does not point that way. The world’s ‘fastest-growing language’,34 used by more than 90% of the world’s online population, is emoji.35 This language originated in Japan in the 1980s and women are its heaviest users:36 78% of women versus 60% of men frequently use emoji.37 And yet, until 2016, the world of emojis was curiously male. The emojis we have on our smartphones are chosen by the rather grand-sounding ‘Unicode Consortium’, a Silicon Valley-based group of organisations that work together to ensure universal, international software standards. If Unicode decides a particular emoji (say ‘spy’) should be added to the current stable, they will decide on the code that should be used. Each phone manufacturer (or platform such as Twitter and Facebook) will then design their own interpretation of what a ‘spy’ looks like. But they will all use the same code, so that when users communicate between different platforms, they are broadly all saying the same thing. An emoji face with heart eyes is an emoji face with heart eyes. Unicode has not historically specified the gender for most emoji characters. The emoji that most platforms originally represented as a man running, was not called ‘man running’. It was just called ‘runner’. Similarly the original emoji for police officer was described by Unicode as ‘police officer’, not ‘policeman’. It was the individual platforms that all interpreted these gender-neutral terms as male. In 2016, Unicode decided to do something about this. Abandoning their previously ‘neutral’ gender stance, they decided to explicitly gender all emojis that depicted people.38 So instead of ‘runner’ which had been universally represented as ‘male runner’, Unicode issued code for explicitly male runner and explicitly female runner. Male and female options now exist for all professions and athletes. It’s a small victory, but a significant one.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
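The mechanics Criado Pérez describes can be inspected directly from Python's Unicode database: the base code points carry ungendered names, and many of the explicitly gendered emoji added from 2016 onward are encoded as zero-width-joiner sequences with the FEMALE or MALE SIGN. The code points below come from the Unicode standard; how each platform draws them is the vendor's own interpretation.

```python
# Base emoji code points are nominally ungendered ("RUNNER", "POLICE OFFICER");
# gendered variants are built as ZWJ sequences with U+2640 / U+2642.
import unicodedata

RUNNER         = "\U0001F3C3"   # U+1F3C3
POLICE_OFFICER = "\U0001F46E"   # U+1F46E
ZWJ            = "\u200D"       # zero-width joiner
FEMALE, MALE   = "\u2640", "\u2642"
VS16           = "\uFE0F"       # variation selector: render as emoji

for base in (RUNNER, POLICE_OFFICER):
    print(unicodedata.name(base))                 # the ungendered Unicode name
    print("  woman:", base + ZWJ + FEMALE + VS16)
    print("  man:  ", base + ZWJ + MALE + VS16)
```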
the average Black family had ten times less accumulated wealth than the average white family. And they were far less likely to own a home. This was often due to the legacy and persistence of restrictive zoning laws, tax code biases, and mortgage lending prohibitions dating from before the 1960s and the Civil Rights Act.
Fiona Hill (There Is Nothing for You Here: Finding Opportunity in the Twenty-First Century)
It is one thing to capitalize on the coolness of a Black artist to sell (overpriced) products and quite another to engage the cultural specificity of Black people enough to enhance the underlying design of a widely used technology. This is why the notion that tech bias is “unintentional” or “unconscious” obscures the reality – that there is no way to create something without some intention and intended user in mind
Ruha Benjamin (Race After Technology: Abolitionist Tools for the New Jim Code)
As a white person, I can openly and unabashedly reminisce about “the good old days.” Romanticized recollections of the past and calls for a return to former ways are a function of white privilege, which manifests itself in the ability to remain oblivious to our racial history. Claiming that the past was socially better than the present is also a hallmark of white supremacy. Consider any period in the past from the perspective of people of color: 246 years of brutal enslavement; the rape of black women for the pleasure of white men and to produce more enslaved workers; the selling off of black children; the attempted genocide of Indigenous people, Indian removal acts, and reservations; indentured servitude, lynching, and mob violence; sharecropping; Chinese exclusion laws; Japanese American internment; Jim Crow laws of mandatory segregation; black codes; bans on black jury service; bans on voting; imprisoning people for unpaid work; medical sterilization and experimentation; employment discrimination; educational discrimination; inferior schools; biased laws and policing practices; redlining and subprime mortgages; mass incarceration; racist media representations; cultural erasures, attacks, and mockery; and untold and perverted historical accounts, and you can see how a romanticized past is strictly a white construct.
Robin DiAngelo (White Fragility: Why It's So Hard for White People to Talk About Racism)
Humanizing AI (The Sonnet) You can code tasks, But not consciousness. You can code phony feelings, But definitely not sentience. Nobody can bring a machine to life, No matter how complex you make it. But once a machine is complex enough, It might develop awareness by accident. So let us focus on humanizing AI, By removing biases from algorithms, Rather than dehumanizing AI, By aiming for a future without humans. Rich kids with rich dreams make good movies. Be human first and use AI to equalize communities.
Abhijit Naskar (Either Reformist or Terrorist: If You Are Terror I Am Your Grandfather)
The term ‘hate speech’ is typically code for our hatred of the truth, not the speech.
Craig D. Lounsbrough
Another detrimental effect of undervaluing people skills was that in some cases, programmers were rewarded more for raw code production than for meeting the user's needs. Marge Devaney, a programmer at Los Alamos National Laboratory in the 1950s, recalled sex differences in how programmers judged their performance. Asked if she had ever experienced gender bias on the job, she replied that discrimination was difficult to prove, adding, "With things like computing, it's very hard to judge who's doing the best. Is it better to produce a program quickly and have it full of bugs that the users keep hitting, and so it doesn't work? Or is it better to produce it more slowly and have it so it works?...I do know some of the men believed in the first way: 'Throw it together and let the user debug it!'" This critique is echoed by women today who find their male peers rewarded for averting disasters through heroic last-minute efforts, while women's efforts at preventing such problems through careful work and communication with users go unrecognized. As a female software engineer complained in 2007, "Why don't we just build the system right in the first place? Women are much better at preventive medicine. A Superman mentality is not necessarily productive; it's just an easy fit for the men in the sector."
Janet Abbate (Recoding Gender: Women's Changing Participation in Computing (History of Computing))
The entirety of Facebook’s staff working on integrity and societal issues was now literally reporting to Marketing, and the effects weren’t subtle. Social scientists had to seek approval not just to conduct research that touched on politics, climate change, bias, health, or user well-being, but even to propose studying those subjects or summarizing their past work.
Jeff Horwitz (Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets)
There exists a similar culture-based racial bias: studies show that job seekers who added typical “white” identifiers to their résumés received significantly more calls for interviews than those without these identifiers.
Becca Levy (Breaking the Age Code: How Your Beliefs About Aging Determine How Long and Well You Live)
Traditional diagnostic results are the foundation for AI diagnostic systems. AI diagnostics is a fast-growing sector because there is a lot of enthusiasm about potentially using AI in the future. Sometimes this takes the form of claiming to make diagnosis more accurate. Sometimes people are open about their goal of replacing doctors and medical personnel, usually as a cost-cutting measure. The way you figure out what is going on in state-of-the-art computational science is by looking at open-source science. All of the people developing proprietary AI methods look at what’s happening in open science, and most use it for inspiration. Microsoft’s GitHub, the most popular code-sharing website, hosts most of the available code.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Most of us like to consider ourselves as capable of thinking fairly accurately about other people. But the truth is, we are social beings who carry around unconscious social beliefs that are so deeply rooted in our minds that we don’t usually realize they’ve got their hooks in us. This can result in an unconscious process called “implicit bias,” which automatically influences us to like or dislike certain groups of people.
Becca Levy (Breaking the Age Code: How Your Beliefs About Aging Determine How Long and Well You Live)
Structural bias refers to policies or practices of societal institutions, such as corporations that discriminate against workers or hospitals that discriminate against patients. It is frequently intertwined with implicit bias. For within institutions, the discrimination may operate without the managers’ or doctors’ awareness and therefore can be considered implicit. But at the same time, it is often structural insofar as the discrimination reinforces the power of those in authority while withholding power from those who are marginalized.
Becca Levy (Breaking the Age Code: How Your Beliefs About Aging Determine How Long and Well You Live)
Claiming that the past was socially better than the present is also a hallmark of white supremacy. Consider any period in the past from the perspective of people of color: 246 years of brutal enslavement; the rape of black women for the pleasure of white men and to produce more enslaved workers; the selling off of black children; the attempted genocide of Indigenous people, Indian removal acts, and reservations; indentured servitude, lynching, and mob violence; sharecropping; Chinese exclusion laws; Japanese American internment; Jim Crow laws of mandatory segregation; black codes; bans on black jury service; bans on voting; imprisoning people for unpaid work; medical sterilization and experimentation; employment discrimination; educational discrimination; inferior schools; biased laws and policing practices; redlining and subprime mortgages; mass incarceration; racist media representations; cultural erasures, attacks, and mockery; and untold and perverted historical accounts, and you can see how a romanticized past is strictly a white construct. But it is a powerful construct because it calls out to a deeply internalized sense of superiority and entitlement and the sense that any advancement for people of color is an encroachment on this entitlement. The past was great for white people (and white men in particular) because their positions went largely unchallenged.
Robin DiAngelo (White Fragility: Why It's So Hard for White People to Talk About Racism)
Today, we already tend to place more trust in computer-generated recommendation systems than in other sources of information—a phenomenon known as automation bias.
Primavera De Filippi (Blockchain and the Law: The Rule of Code)
This has further spawned a number of self-interested beliefs that are closely and proudly held by the tech “tribe,” including notions such as “as long as I produce outstanding results I won't be fired,” that technology itself isn't good or bad, but neutral (“there is no bias in code”), tech “solutionism” (“tech will solve everything,” “we just need better tech, more tech,” etc.), gender/race-neutral approach (“we only hire the best”), and so on.
Maelle Gavet (Trampled by Unicorns: Big Tech's Empathy Problem and How to Fix It)
Imagine if we had our entire criminal code and regulatory structure, but no Constitution. Every institution, whether it’s a country or a company, needs a charter of first principles that are everlasting—not just a hodgepodge and mishmash of bureaucratic rules and requirements that can be ignored with little or no consequence. Getting people to listen and report and sound alarms and seek advice requires more than email reminders; it entails understanding what motivates real people in real life, people with vulnerabilities and fears and biases and every other ordinary human failing and foible that can prevent us from doing the right thing.
Preet Bharara (Doing Justice: A Prosecutor's Thoughts on Crime, Punishment, and the Rule of Law)
expense codes are based on the assumption that the employee has a wife at home taking care of the home and the kids.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
That new tools are coded in old biases is surprising only if we equate technological innovation with social progress.
Ruha Benjamin (Race After Technology: Abolitionist Tools for the New Jim Code)
In 2015, nearly one hundred Canadian, Dutch, Irish, German, and English citizens came to America to adopt black children. Whatever forms of othering may be going on in their countries, they are not infected with American anti-black bias that coursed from Virginia's legal codes in the seventeenth century through the supremacist regimes of later ages to the subconscious of far too many Americans in the twenty-first.
Sheryll Cashin (Loving: Interracial Intimacy in America and the Threat to White Supremacy)
The implicit bias is clear: expense codes are based on the assumption that the employee has a wife at home taking care of the home and the kids. This work doesn’t need paying for, because it’s women’s work, and women don’t get paid for it.
Caroline Criado Pérez (Invisible Women: Data Bias in a World Designed for Men)
Consider any period in the past from the perspective of people of color: 246 years of brutal enslavement; the rape of black women for the pleasure of white men and to produce more enslaved workers; the selling off of black children; the attempted genocide of Indigenous people, Indian removal acts, and reservations; indentured servitude, lynching, and mob violence; sharecropping; Chinese exclusion laws; Japanese American internment; Jim Crow laws of mandatory segregation; black codes; bans on black jury service; bans on voting; imprisoning people for unpaid work; medical sterilization and experimentation; employment discrimination; educational discrimination; inferior schools; biased laws and policing practices; redlining and subprime mortgages; mass incarceration; racist media representations; cultural erasures, attacks, and mockery; and untold and perverted historical accounts, and you can see how a romanticized past is strictly a white construct.
Robin DiAngelo (White Fragility: Why It's So Hard for White People to Talk About Racism)
Claiming that the past was socially better than the present is also a hallmark of white supremacy. Consider any period in the past from the perspective of people of color: 246 years of brutal enslavement; the rape of black women for the pleasure of white men and to produce more enslaved workers; the selling off of black children; the attempted genocide of Indigenous people, Indian removal acts, and reservations; indentured servitude, lynching, and mob violence; sharecropping; Chinese exclusion laws; Japanese American internment; Jim Crow laws of mandatory segregation; black codes; bans on black jury service; bans on voting; imprisoning people for unpaid work; medical sterilization and experimentation; employment discrimination; educational discrimination; inferior schools; biased laws and policing practices; redlining and subprime mortgages; mass incarceration; racist media representations; cultural erasures, attacks, and mockery; and untold and perverted historical accounts, and you can see how a romanticized past is strictly a white construct.
Robin DiAngelo (White Fragility: Why It's So Hard for White People to Talk About Racism)
The problem here is that, as humans, we have an authority bias that’s incredibly strong and unconscious—if a superior tells you to do something, by God we tend to follow it, even when it’s wrong. Having one person tell other people what to do is not a reliable way to make good decisions. So how do you create conditions where that doesn’t happen, where you develop a hive mind? How do you develop ways to challenge each other, ask the right questions, and never defer to authority? We’re trying to create leaders among leaders.
Daniel Coyle (The Culture Code: The Secrets of Highly Successful Groups)
Cooper set out to build those conditions for his teams. His approach to nurturing cooperation could be described as an insurgent campaign against authority bias. Merely creating space for cooperation, he realized, wasn’t enough; he had to generate a series of unmistakable signals that tipped his men away from their natural tendencies and toward interdependence and cooperation. “Human nature is constantly working against us,” he says. “You have to get around those barriers, and they never go away.”
Daniel Coyle (The Culture Code: The Secrets of Highly Successful Groups)
Sadly, fierce in-group/out-group biases live within the eating disorder complex, generating and sustaining an ethical code of the culture as girls and women project their shadow upon one another. Individuals with anorexia secretly scorn those who struggle with bulimia or binge eating, those with bulimia and binge eating feel gross, often “wishing to be anorexic,” yet detesting their slim sisters with vicious jealousy. A callous hierarchy is formed, with anorexia as the ideal; bulimia, as a very distant underworld second; and binge eating, clearly at the bottom of acceptability.
Tom Wooldridge (Psychoanalytic Treatment of Eating Disorders: When Words Fail and Bodies Speak (ISSN))
The prototypical borderline patient is almost always coded as feminine. Consider again the symptoms of borderline personality disorder. An unstable sense of self, fears of abandonment, difficulties maintaining relationships, self-harm behaviors, mood instability, feeling empty, dissociation, outbursts of anger, impulsivity: if you remove the concept of mental disorder, does this not sound like every patriarchal stereotype of women that you have ever heard? If they are women we don't like, they are crazy, manipulative "bitches"; if these traits are portrayed in a positive light, they are manic pixie dream girls, a term first coined by the film critic Nathan Rabin. Borderline personality disorder is the most egregious example of a gendered personality disorder, but most of them demonstrate some degree of gender bias. As our understanding of the construction of gender has increased, it has become further evident that traits stereotypically associated with femininity are far more likely to be categorized as a disorder.
Jonathan Foiles ((Mis)Diagnosed: How Bias Distorts Our Perception of Mental Health)
you once tell me about all these ingenious financial AI programs that were standard practice in your business? They sound a lot like Jiang Ziya. Impartial. Efficient. Perfect. But who programmed these supposedly perfect machines? Imperfect humans, who allowed their biases to be baked into the first generation of code.
Max Brooks (Tiger Chair)