Glitch Techs Quotes

We've searched our database for all the quotes and captions related to Glitch Techs. Here they are! All 26 of them:

Was I trying too hard to make this mean something? I asked Leah. Was that just buying into the industry's own narratives about itself? I tried to summarize the frantic, self-important work culture in Silicon Valley, how everyone was optimizing their bodies for longer lives, which would then be spent productively; how it was frowned upon to acknowledge that a tech job was a transaction rather than a noble mission or a seat on a rocket ship. In this respect, it was not unlike book publishing: talking about doing work for money felt like screaming the safe word. While perhaps not unique to tech--it may even have been endemic to a generation--the expectation was overbearing. Why did it feel so taboo, I asked, to approach work the way most people did, as a trade of my time and labor for money? Why did we have to pretend it was all so fun? Leah nodded, curls bobbing. "That's real," she said. "But I wonder if you're forcing things. Your job can be in service of the rest of your life." She reached out to squeeze my wrist, then leaned her head against the window. "You're allowed to enjoy your life," she said. The city streaked past, the bridge cables flickering like a delay, or a glitch.
Anna Wiener (Uncanny Valley)
most 3D printer artists use the tech to create their visions to perfection. You just take existing things and mess with them. It’s like if the Venus de Milo's arms weren’t lopped off, just misprinted. That’s not art. That’s a hack, nothing more.
Alex Livingston (Glitch Rain)
there, because color is always on
Carolyn L. Kane (High-Tech Trash: Glitch, Noise, and Aesthetic Failure (Rhetoric & Public Culture: History, Theory, Critique Book 1))
1. If you ever get a gooey eye, don’t mess around, see a doctor, even if your schedule is packed.
2. Please don’t ever not use condoms. Please keep in mind that it is your own older and wiser self telling you this. It’s not propaganda.
3. Try running—you will like it. Has Pilates been invented yet? Do that. I wish you would pay attention to your laterals. Stay fit. Keep in mind that it will reduce cellulite acquisition.
4. When Grandma gives you a Krugerrand for your twenty-first birthday, have Dad put it in his safe deposit box because you will lose it, I promise you.
5. The stock market goes way up in 1999. But get out of tech by July 2000.
6. As discussed, keep wearing your retainer.
7. Sunscreen (60+) and remove makeup every night.
8. Brush up on multiple regressions before the departmental comprehensive senior year. But don’t freak out—you pass.
9. Try to maintain good sleep habits. I’m not sure this is possible. But try.
10. Start working on your Mandarin tones.
11. I’m giving you a list of big IPOs—see if you can invest early in any of them. They’re not going to let you in easily so show some hustle. Also, these are some great companies for a first job.
12. Remember you won’t always have the time you have now, so this is the time to learn Arabic.
13. Remember that greatness is difficult but worth it.
14. There will be plenty of time for boys/men/romance/dating once you’re a VP. There’s no point before then.
15. Remember, with men, the key quality you need is that they’ll put your career first, since it’s hard to both be extremely ambitious.
16. But men who don’t want to be #1 aren’t going to be exciting enough for you.
17. I haven’t figured out how to reconcile those two either, but maybe you can.
18. Having a killer work ethic is worth more than riches.
Elisabeth Cohen (The Glitch)
One approach involves looking at three different kinds of bias: physical bias, computational bias, and interpretation bias. This approach was proposed by UCLA engineering professor Achuta Kadambi in a 2021 Science paper. Physical bias manifests in the mechanics of the device, as when a pulse oximeter works better on light skin than darker skin. Computational bias might come from the software or the dataset used to develop a diagnostic, as when only light skin is used to train a skin cancer detection algorithm. Interpretation bias might occur when a doctor applies unequal, race-based standards to the output of a test or device, as when doctors give a different GFR threshold to Black patients. “Bias is multidimensional,” Kadambi told Scientific American. “By understanding where it originates, we can better correct it.”
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
However, Kadambi then recommends looking at different kinds of bias and deciding what is an acceptable threshold, which is a popular idea among computer scientists but not among the people affected. I disagree with Kadambi on this point. I want to normalize not using technology when the technology is impossible to make fair. “Computer systems can have an acceptable level of bias” is an idea, an argument, not a fact—for the moment, at least. It is contentious. It is presented in this paper as a fact, backed up by the writer’s expertise. This should not have made it through the editing process.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
realized that I wanted to tell my friend to pretend the grading algorithm was magic, so I could stop explaining in that moment. But I also wanted to push through and explain better, because I didn’t want my friend to feel abandoned intellectually the way I did when the professor got frustrated with me.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
I recommend trusting that the reason for an algorithmic decision is some kind of preexisting social bias manifesting in subtle (or obvious) ways. It feels unsatisfying to lack a reason, and that dissatisfaction can fuel a drive to achieve social justice. Tech is racist and sexist and ableist because the world is so. Computers just reflect the existing reality and suggest that things will stay the same—they predict the status quo. By adopting a more critical view of technology, and by being choosier about the tech we allow into our lives and our society, we can employ technology to stop reproducing the world as it is, and get us closer to a world that is truly more just.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Traditional diagnostic results are the foundation for AI diagnostic systems. AI diagnostics is a fast-growing sector because there is a lot of enthusiasm about potentially using AI in the future. Sometimes this takes the form of claiming to make diagnosis more accurate. Sometimes people are open about their goal of replacing doctors and medical personnel, usually as a cost-cutting measure. The way you figure out what is going on in state-of-the-art computational science is by looking at open-source science. All of the people developing proprietary AI methods look at what’s happening in open science, and most use it for inspiration. Microsoft’s GitHub, the most popular code-sharing website, hosts most of the available code.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Biologist Carl T. Bergstrom and information scientist Jevin West teach a class called “Calling Bullshit” at the University of Washington, and published a book with the same name. “Bullshit involves language, statistical figures, data graphics, and other forms of presentation intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence,” in their definition. They offer a simple, three-step method for bullshit detection, which involves these questions: Who is telling me this? How do they know it? What are they trying to sell me?
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
One of the people leading the field in algorithmic auditing is Cathy O’Neil, the author of Weapons of Math Destruction. Her book is one of the catalysts for the entire movement for algorithmic accountability. O’Neil’s consulting company, O’Neil Risk Consulting & Algorithmic Auditing (ORCAA), does bespoke auditing to help companies and organizations manage and audit their algorithmic risks. I have had the good fortune to consult with ORCAA. When ORCAA considers an algorithm, they start by asking two questions: What does it mean for this algorithm to work? How could this algorithm fail, and for whom? One thing ORCAA does is what’s called an internal audit, which means they ask these questions directly of companies and other organizations, focusing on algorithms as they are used in specific contexts.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Microsoft researchers developed what they call a “community jury,” a process for soliciting input from stakeholders during the software development process. Community input is not a new idea, but embedding it in business processes as a step to stave off algorithmic harms is a new development.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
audit is possible, yes—but it requires the inspector general’s office and the auditors and the audit report-readers and everyone else in the institutional context to have a level of mathematical and computational literacy in order to understand and communicate about the results. When you ask for medical test results, you get a report, and lawyers and activists understand that.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Public interest technology pushes back against this, with an awareness that we need to fund infrastructure as well as innovation. If we are building AI systems that intervene in people’s lives, we need to maintain and inspect and replace the systems the same way we maintain and inspect and replace bridges and roads.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
In April 2021, the FTC published guidance on corporate use of AI. “If a data set is missing information from particular populations, using that data to build an AI model may yield results that are unfair or inequitable to legally protected groups,” reads the FTC guidance. “From the start, think about ways to improve your data set, design your model to account for data gaps, and—in light of any shortcomings—limit where or how you use the model.” Other tips include watching out for discriminatory outcomes, embracing transparency, telling the truth about where data comes from and how it is used, and not exaggerating an algorithm’s capabilities. If a model causes more harm than good, FTC can challenge the model as unfair. This guidance put corporate America on alert. Companies need to hold themselves accountable for ensuring their algorithmic systems are not unfair, in order to avoid FTC enforcement penalties.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Joy Buolamwini’s organization, the Algorithmic Justice League (AJL), is doing a project called CRASH, which stands for Community Reporting of Algorithmic System Harms. Part of the project involves establishing bug bounties. The concept of a bug bounty comes from cybersecurity, where people can be rewarded by tech companies for finding and reporting system bugs.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
After the shoplifting incident, the Shinola store gave a copy of its surveillance video to the Detroit police. Five months later, a digital image examiner for the Michigan State Police looked at the grainy, poorly lit surveillance video on her computer and took a screen shot. She uploaded it to the facial recognition software the police used: a $5.5 million program supplied by DataWorks Plus, a South Carolina firm founded in 2000 that began selling facial recognition software developed by outside vendors in 2005. The system accepted the photo; scanned the image for shapes, indicating eyes, nose, and mouth; and set markers at the edges of each shape. Then, it measured the distance between the markers and stored that information. Next, it checked the measurements against the State Network of Agency Photos (SNAP) database, which includes mug shots, sex offender registry photographs, driver’s license photos, and state ID photos. To give an idea of the scale, in 2017, this database had 8 million criminal photos and 32 million DMV photos. Almost every Michigan adult was represented in the database.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Sasha Costanza-Chock’s book Design Justice outlines the principles and explores how “universalist design principles and practices erase certain groups of people—specifically, those who are intersectionally disadvantaged or multiply burdened under the matrix of domination (white supremacist heteropatriarchy, ableism, capitalism, and settler colonialism).” Design can be a tool for collective liberation. The disability justice movement has taken leadership on this issue. The #disabilityjustice hashtag is one place to start learning more; other hashtags for learning about disability include #deaftwitter, #blindtwitter, #a11y, #blindtiktok, #disabilityawareness, and #instainclusion.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
If you find yourself using a rationalization beloved of eugenicists in order to rationalize oppression, think again. This is a pretty good indicator that you are not on the side of the angels.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
The precedent was a 2011 case in which the National Association of the Deaf successfully sued Netflix because the streaming giant did not provide captions on streaming video. That decision, rendered in the US District Court for the District of Massachusetts, was the first time a federal court affirmed that the ADA applies to internet-based businesses. The next major step forward came when Girma’s team and NFB won their battle, securing access to Scribd’s online library for the 61 million Americans with disabilities.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
High tech methods, low tech methods, makes no difference to me, a man is still a man, a date is still a date and a male body part attached to the ideal man that makes me scream with pleasure, is still very much a penis, no matter where I actually find it.
Jill Thrussell (Love Inc: Sophistidated (Glitches #2))
Giovanni smiled. "Wow, high tech bonking, how advanced." He teased.
Jill Thrussell (Love Inc: Sophistidated (Glitches #2))
Technochauvinism is a kind of bias that considers computational solutions to be superior to all other solutions.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
We understand the quantitative through the qualitative.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)
Race is a social construct but it is often embedded in computational systems as if it were scientific fact.
Meredith Broussard (More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech)