Italicized Double Quotes

We've searched our database for all the quotes and captions related to Italicized Double. Here they are! All 2 of them:

When scientists underestimate complexity, they fall prey to the perils of unintended consequences. The parables of such scientific overreach are well known: foreign animals, introduced to control pests, become pests in their own right; the raising of smokestacks, meant to alleviate urban pollution, releases particulate effluents higher in the air and exacerbates pollution; stimulating blood formation, meant to prevent heart attacks, thickens the blood and results in an increased risk of blood clots in the heart. But when nonscientists overestimate [italics in original] complexity - "No one can possibly crack this [italics in original] code" - they fall into the trap of unanticipated consequences. In the early 1950s, a common trope among some biologists was that the genetic code would be so context-dependent - so utterly determined by a particular cell in a particular organism, and so horribly convoluted - that deciphering it would be impossible. The truth turned out to be quite the opposite: just one molecule carries the code, and just one code pervades the biological world. If we know the code, we can intentionally alter it in organisms, and ultimately in humans. Similarly, in the 1960s, many doubted that gene-cloning technologies could so easily shuttle genes between species. By 1980, making a mammalian protein in a bacterial cell, or a bacterial protein in a mammalian cell, was not just feasible; it was, in Berg's words, rather "ridiculously simple." Species were specious. "Being natural" was often "just a pose."
Siddhartha Mukherjee (The Gene: An Intimate History)
In 2006, researchers Brendan Nyhan and Jason Reifler created fake newspaper articles about polarizing political issues. The articles were written in a way that would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, experimenters then handed over a true article that corrected the first. For instance, one article suggested that the United States had found weapons of mass destruction in Iraq. The next article corrected the first and said that the United States had never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second.

These reactions shouldn’t surprise you. What should give you pause, though, is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before that there actually were WMDs and that their original beliefs were correct. The researchers repeated the experiment with other wedge issues, such as stem cell research and tax reform, and once again they found that corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.

Researchers Kelly Garrett and Brian Weeks expanded on this work in 2013. In their study, people already suspicious of electronic health records read factually incorrect articles about such technologies that supported those subjects’ beliefs. In those articles, the scientists had already identified any misinformation and placed it within brackets, highlighted it in red, and italicized the text. After they finished reading the articles, people who said beforehand that they opposed electronic health records reported no change in their opinions and felt even more strongly about the issue than before. The corrections had strengthened their biases instead of weakening them.

Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.
David McRaney (You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself)