New Social Historians Quotes

We've searched our database for all the quotes and captions related to New Social Historians. Here they are! All 56 of them:

Many in America, as one social historian wrote, 'believed implicitly that New York's social leaders went to bed in full evening dress, brushed their teeth in vintage champagne, married their daughters without exception to shady French counts, and arrayed their poodle dogs in diamond tiaras.'...
Greg King
The history of philosophy, and perhaps especially of moral, social and political philosophy, is there to prevent us from becoming too readily bewitched. The intellectual historian can help us to appreciate how far the values embodied in our present way of life, and our present ways of thinking about those values, reflect a series of choices made at different times between different possible worlds. This awareness can help to liberate us from the grip of any one hegemonal account of those values and how they should be interpreted and understood. Equipped with a broader sense of possibility, we can stand back from the intellectual commitments we have inherited and ask ourselves in a new spirit of enquiry what we should think of them.
Quentin Skinner (Liberty Before Liberalism)
Thus, by science I mean, first of all, a worldview giving primacy to reason and observation and a methodology aimed at acquiring accurate knowledge of the natural and social world. This methodology is characterized, above all else, by the critical spirit: namely, the commitment to the incessant testing of assertions through observations and/or experiments — the more stringent the tests, the better — and to revising or discarding those theories that fail the test. One corollary of the critical spirit is fallibilism: namely, the understanding that all our empirical knowledge is tentative, incomplete and open to revision in the light of new evidence or cogent new arguments (though, of course, the most well-established aspects of scientific knowledge are unlikely to be discarded entirely). . . . I stress that my use of the term 'science' is not limited to the natural sciences, but includes investigations aimed at acquiring accurate knowledge of factual matters relating to any aspect of the world by using rational empirical methods analogous to those employed in the natural sciences. (Please note the limitation to questions of fact. I intentionally exclude from my purview questions of ethics, aesthetics, ultimate purpose, and so forth.) Thus, 'science' (as I use the term) is routinely practiced not only by physicists, chemists and biologists, but also by historians, detectives, plumbers and indeed all human beings in (some aspects of) our daily lives. (Of course, the fact that we all practice science from time to time does not mean that we all practice it equally well, or that we practice it equally well in all areas of our lives.)
Alan Sokal
When The Matrix debuted in 1999, it was a huge box-office success. It was also well received by critics, most of whom focused on one of two qualities—the technological (it mainstreamed the digital technique of three-dimensional “bullet time,” where the on-screen action would freeze while the camera continued to revolve around the participants) or the philosophical (it served as a trippy entry point for the notion that we already live in a simulated world, directly quoting philosopher Jean Baudrillard’s 1981 reality-rejecting book Simulacra and Simulation). If you talk about The Matrix right now, these are still the two things you likely discuss. But what will still be interesting about this film once the technology becomes ancient and the philosophy becomes standard? I suspect it might be this: The Matrix was written and directed by “the Wachowski siblings.” In 1999, this designation meant two brothers; as I write today, it means two sisters. In the years following the release of The Matrix, the older Wachowski (Larry, now Lana) completed her transition from male to female. The younger Wachowski (Andy, now Lilly) publicly announced her transition in the spring of 2016. These events occurred during a period when the social view of transgender issues radically evolved, more rapidly than any other component of modern society. In 1999, it was almost impossible to find any example of a trans person within any realm of popular culture; by 2014, a TV series devoted exclusively to the notion won the Golden Globe for Best Television Series. In the fifteen-year window from 1999 to 2014, no aspect of interpersonal civilization changed more, to the point where Caitlyn (formerly Bruce) Jenner attracted more Twitter followers than the president (and the importance of this shift will amplify as the decades pass—soon, the notion of a transgender US president will not seem remotely implausible). 
So think how this might alter the memory of The Matrix: In some protracted reality, film historians will reinvestigate an extremely commercial action movie made by people who (unbeknownst to the audience) would eventually transition from male to female. Suddenly, the symbolic meaning of a universe with two worlds—one false and constructed, the other genuine and hidden—takes on an entirely new meaning. The idea of a character choosing between swallowing a blue pill that allows him to remain a false placeholder and a red pill that forces him to confront who he truly is becomes a much different metaphor. Considered from this speculative vantage point, The Matrix may seem like a breakthrough of a far different kind. It would feel more reflective than entertaining, which is precisely why certain things get remembered while certain others get lost.
Chuck Klosterman (But What If We're Wrong?: Thinking about the Present as If It Were the Past)
A succinct summary of the tragic vision was given by historians Will and Ariel Durant: Out of every hundred new ideas ninety-nine or more will probably be inferior to the traditional responses which they propose to replace. No one man, however brilliant or well-informed, can come in one lifetime to such fullness of understanding as to safely judge and dismiss the customs or institutions of his society, for these are the wisdom of generations after centuries of experiment in the laboratory of history.
Thomas Sowell (The Vision of the Anointed: Self-Congratulation as a Basis for Social Policy)
What produces change is new ideas and actions guided by them. What distinguishes one group from another is the effect of such innovations. These innovations are not accomplished by a group mind; they are always the achievements of individuals. What makes the American people different from any other people is the joint effect produced by the thoughts and actions of innumerable uncommon Americans. We know the names of the men who invented and step by step perfected the motorcar. A historian can write a detailed history of the evolution of the automobile. We do not know the names of the men who, in the beginnings of civilization, made the greatest inventions—for example lighting a fire. But this ignorance does not permit us to ascribe this fundamental invention to a group mind. It is always an individual who starts a new method of doing things, and then other people imitate his example. Customs and fashions have always been inaugurated by individuals and spread through imitation by other people.
Ludwig von Mises (Theory and History: An Interpretation of Social and Economic Evolution)
America had shifted from what the influential cultural historian Warren Susman called a Culture of Character to a Culture of Personality—and opened up a Pandora’s Box of personal anxieties from which we would never quite recover. In the Culture of Character, the ideal self was serious, disciplined, and honorable. What counted was not so much the impression one made in public as how one behaved in private. The word personality didn’t exist in English until the eighteenth century, and the idea of “having a good personality” was not widespread until the twentieth. But when they embraced the Culture of Personality, Americans started to focus on how others perceived them. They became captivated by people who were bold and entertaining. “The social role demanded of all in the new Culture of Personality was that of a performer,” Susman famously wrote. “Every American was to become a performing self.”
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
In former times, military power was isolated, with the consequence that victory or defeat appeared to depend upon the accidental qualities of commanders. In our day, it is common to treat economic power as the source from which all other kinds are derived; this, I shall contend, is just as great an error as that of the purely military historians whom it has caused to seem out of date. Again, there are those who regard propaganda as the fundamental form of power. This is by no means a new opinion; it is embodied in such traditional sayings as magna est veritas et prevalebit and ‘the blood of the martyrs is the seed of the Church’. It has about the same measure of truth and falsehood as the military view or the economic view. Propaganda, if it can create an almost unanimous opinion, can generate an irresistible power; but those who have military or economic control can, if they choose, use it for the purpose of propaganda.
Bertrand Russell (Power: A New Social Analysis (Routledge Classics))
Contrary to a notion that has become fashionable among American historians, the concept of race was not invented in the late eighteenth or nineteenth century. Indeed, systems of categorical generalization that separated groups of people according to social constructions of race (sometimes based on skin color, sometimes with reference to other attributes) and ranked them as to disposition and intelligence, were in use in Europe at least a thousand years before Columbus set off across the Atlantic.
David E. Stannard (American Holocaust: Columbus and the Conquest of the New World)
When the system of mass incarceration collapses (and if history is any guide, it will), historians will undoubtedly look back and marvel that such an extraordinarily comprehensive system of racialized social control existed in the United States. How fascinating, they will likely say, that a drug war was waged almost exclusively against poor people of color—people already trapped in ghettos that lacked jobs and decent schools. They were rounded up by the millions, packed away in prisons, and when released, they were stigmatized for life, denied the right to vote, and ushered into a world of discrimination. Legally barred from employment, housing, and welfare benefits—and saddled with thousands of dollars of debt—these people were shamed and condemned for failing to hold together their families. They were chastised for succumbing to depression and anger, and blamed for landing back in prison. Historians will likely wonder how we could describe the new caste system as a system of crime control, when it is difficult to imagine a system better designed to create—rather than prevent—crime.
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
During the period of the Renaissance, the English language changed very swiftly in keeping with rapid social, economic and political changes. However, writers in particular soon came to realise that the vocabulary of the English language did not always allow them to talk and write accurately about the new concepts, techniques and inventions which were emerging in Europe. At the same time a period of increasing exploration and trade across the whole world introduced new words, many of which had their origin in other languages. Historians of the language have suggested that between 1500 and 1650 around 12,000 new words were introduced into English.
Ronald Carter (The Routledge History of Literature in English: Britain and Ireland)
How are we going to bring about these transformations? Politics as usual—debate and argument, even voting—are no longer sufficient. Our system of representative democracy, created by a great revolution, must now itself become the target of revolutionary change. For too many years counting, vast numbers of people stopped going to the polls, either because they did not care what happened to the country or the world or because they did not believe that voting would make a difference on the profound and interconnected issues that really matter. Now, with a surge of new political interest having given rise to the Obama presidency, we need to inject new meaning into the concept of the “will of the people.” The will of too many Americans has been to pursue private happiness and take as little responsibility as possible for governing our country. As a result, we have left the job of governing to our elected representatives, even though we know that they serve corporate interests and therefore make decisions that threaten our biosphere and widen the gulf between the rich and poor both in our country and throughout the world. In other words, even though it is readily apparent that our lifestyle choices and the decisions of our representatives are increasing social injustice and endangering our planet, too many of us have wanted to continue going our merry and not-so-merry ways, periodically voting politicians in and out of office but leaving the responsibility for policy decisions to them. Our will has been to act like consumers, not like responsible citizens. Historians may one day look back at the 2000 election, marked by the Supreme Court’s decision to award the presidency to George W. Bush, as a decisive turning point in the death of representative democracy in the United States.
National Public Radio analyst Daniel Schorr called it “a junta.” Jack Lessenberry, columnist for the MetroTimes in Detroit, called it “a right-wing judicial coup.” Although more restrained, the language of dissenting justices Breyer, Ginsburg, Souter, and Stevens was equally clear. They said that there was no legal or moral justification for deciding the presidency in this way.3 That’s why Al Gore didn’t speak for me in his concession speech. You don’t just “strongly disagree” with a right-wing coup or a junta. You expose it as illegal, immoral, and illegitimate, and you start building a movement to challenge and change the system that created it. The crisis brought on by the fraud of 2000 and aggravated by the Bush administration’s constant and callous disregard for the Constitution exposed so many defects that we now have an unprecedented opportunity not only to improve voting procedures but to turn U.S. democracy into “government of the people, by the people, and for the people” instead of government of, by, and for corporate power.
Grace Lee Boggs (The Next American Revolution: Sustainable Activism for the Twenty-First Century)
First of all, we should not forget that the concept of fascism has frequently been used even after World War II, and not only in order to define the military dictatorships of Latin America. In 1959, Theodor Adorno wrote that ‘the survival of National Socialism within democracy’ was potentially more dangerous than ‘the survival of fascist tendencies against democracy’.2 In 1974, Pier Paolo Pasolini depicted the anthropological models of neoliberal capitalism as a ‘new fascism’ compared to which the regime of Mussolini appeared irremediably archaic, as a kind of ‘paleofascism’.3 And in even more recent decades, many historians seeking to provide interpretations of Berlusconi’s Italy recognized its intimacy—if not its filiation—with classical fascism.
Enzo Traverso (The New Faces of Fascism: Populism and the Far Right)
It wasn’t until nearly 400 years later [since capitalist privatizations at home in Britain, i.e. the Enclosures starting in 1500s] that life expectancies in Britain finally began to rise. […] It happened slightly later in the rest of Europe, while in the colonised world longevity didn’t begin to improve until the early 1900s [decolonization]. So if [capitalist economic] growth itself does not have an automatic relationship with life expectancy and human welfare, what could possibly explain this trend? Historians today point out that it began with a startlingly simple intervention […]: [public] sanitation. In the middle of the 1800s, public health researchers had discovered that health outcomes could be improved by introducing simple sanitation measures, such as separating sewage from drinking water. All it required was a bit of public plumbing. But public plumbing requires public works, and public money. You have to appropriate private land for things like public water pumps and public baths. And you have to be able to dig on private property in order to connect tenements and factories to the system. This is where the problems began. For decades, progress towards the goal of public sanitation was opposed, not enabled, by the capitalist class. Libertarian-minded landowners refused to allow officials to use their property [note: the Enclosures required state violence to privatize land], and refused to pay the taxes required to get it done. The resistance of these elites was broken only once commoners won the right to vote and workers organised into unions. Over the following decades these movements, which in Britain began with the Chartists and the Municipal Socialists, leveraged the state to intervene against the capitalist class. They fought for a new vision: that cities should be managed for the good of everyone, not just for the few. 
These movements delivered not only public sanitation systems but also, in the years that followed, public healthcare, vaccination coverage, public education, public housing, better wages and safer working conditions. According to research by the historian Simon Szreter, access to these public goods – which were, in a way, a new kind of commons – had a significant positive impact on human health, and spurred soaring life expectancy through the twentieth century.
Jason Hickel (Less Is More: How Degrowth Will Save the World)
In truth, most of the fights over the 1619 Project were never really about the facts. The Princeton historian Allen C. Guelzo, a particularly acerbic critic, published several articles that denounced the 1619 Project for treating “slavery not as a blemish that the Founders grudgingly tolerated…not as a regrettable chapter in the distant past, but as a living, breathing pattern upon which all American social life is based.” Guelzo then made clear that the source of his antipathy was not just what the project was saying but who was saying it: “It is the bitterest of ironies that the 1619 Project dispenses this malediction from the chair of ultimate cultural privilege in America, because in no human society has an enslaved people suddenly found itself vaulted into positions of such privilege, and with the consent—even the approbation—of those who were once the enslavers.”
Nikole Hannah-Jones (The 1619 Project: A New Origin Story)
Whatever the historian’s individual outlook may be, a subject such as the social history of art simply cannot be treated by relying on secondary authorities. Even Mr. Hauser’s belief in social determinism could have become fertile and valuable if it had inspired him, as it has inspired others, to prove its fruitfulness in research, to bring to the surface new facts about the past not previously caught in the net of more conventional theories. Perhaps the trouble lies in the fact that Mr. Hauser is avowedly not interested in the past for its own sake but that he sees it as "the purpose of historical research" to understand the present (p. 714). His theoretical prejudices may have thwarted his sympathies. For to some extent they deny the very existence of what we call the "humanities". If all human beings, including ourselves, are completely conditioned by the economic and social circumstances of their existence then we really cannot understand the past by ordinary sympathy.
E.H. Gombrich (Meditations on a Hobby Horse: And Other Essays on the Theory of Art)
The human and social costs are beyond measure. Such overwhelming traumas tear at the bonds that hold cultures together. The epidemic that struck Athens in 430 B.C., Thucydides reported, enveloped the city in “a great degree of lawlessness.” The people “became contemptuous of everything, both sacred and profane.” They joined ecstatic cults and allowed sick refugees to desecrate the great temples, where they died untended. A thousand years later the Black Death shook Europe to its foundations. Martin Luther’s rebellion against Rome was a grandson of the plague, as was modern anti-Semitism. Landowners’ fields were emptied by death, forcing them either to work peasants harder or pay more to attract new labor. Both choices led to social unrest: the Jacquerie (France, 1358), the Revolt of Ciompi (Florence, 1378), the Peasants’ Revolt (England, 1381), the Catalonian Rebellion (Spain, 1395), and dozens of flare-ups in the German states. Is it necessary to spell out that societies mired in fratricidal chaos are vulnerable to conquest? To borrow a trope from the historian Alfred Crosby, if Genghis Khan had arrived with the Black Death, this book would not be written in a European language.
Charles C. Mann (1491: New Revelations of the Americas Before Columbus)
It helps, of course, that Denmark is essentially one giant middle class or, as the Danes would have you believe, effectively classless. The creation of this economically and gender-equal society has driven much of Denmark’s social and economic development over the last hundred or so years. One very well-known Danish quotation sums this up—it is another line, like Holst’s “What was lost without…” that every Dane knows by heart, and was written by N. F. S. Grundtvig: Og da har i rigdom vi drevet det vidt, når få har for meget og færre for lidt. (And we will have made great strides in equality, when few have too much and fewer too little.) It sounds like some kind of utopian fantasy but, by and large, the Danes have succeeded in achieving it. As historian Tony Hall writes in Scandinavia: At War with Trolls, Grundtvig’s Folk High Schools were founded on the principle of “teaching them, whenever feasible, that regardless of their social rank and occupation, they belonged to one people, and as such had one mother, one destiny and one purpose.” The result is that, according to the New Statesman, “90 percent of the population [of Denmark] enjoy an approximately identical standard of living.”
Michael Booth (The Almost Nearly Perfect People: Behind the Myth of the Scandinavian Utopia)
The essence of Roosevelt’s leadership, I soon became convinced, lay in his enterprising use of the “bully pulpit,” a phrase he himself coined to describe the national platform the presidency provides to shape public sentiment and mobilize action. Early in Roosevelt’s tenure, Lyman Abbott, editor of The Outlook, joined a small group of friends in the president’s library to offer advice and criticism on a draft of his upcoming message to Congress. “He had just finished a paragraph of a distinctly ethical character,” Abbott recalled, “when he suddenly stopped, swung round in his swivel chair, and said, ‘I suppose my critics will call that preaching, but I have got such a bully pulpit.’ ” From this bully pulpit, Roosevelt would focus the charge of a national movement to apply an ethical framework, through government action, to the untrammeled growth of modern America. Roosevelt understood from the outset that this task hinged upon the need to develop powerfully reciprocal relationships with members of the national press. He called them by their first names, invited them to meals, took questions during his midday shave, welcomed their company at day’s end while he signed correspondence, and designated, for the first time, a special room for them in the West Wing. He brought them aboard his private railroad car during his regular swings around the country. At every village station, he reached the hearts of the gathered crowds with homespun language, aphorisms, and direct moral appeals. Accompanying reporters then extended the reach of Roosevelt’s words in national publications. Such extraordinary rapport with the press did not stem from calculation alone. Long before and after he was president, Roosevelt was an author and historian. From an early age, he read as he breathed. He knew and revered writers, and his relationship with journalists was authentically collegial. In a sense, he was one of them. 
While exploring Roosevelt’s relationship with the press, I was especially drawn to the remarkably rich connections he developed with a team of journalists—including Ida Tarbell, Ray Stannard Baker, Lincoln Steffens, and William Allen White—all working at McClure’s magazine, the most influential contemporary progressive publication. The restless enthusiasm and manic energy of their publisher and editor, S. S. McClure, infused the magazine with “a spark of genius,” even as he suffered from periodic nervous breakdowns. “The story is the thing,” Sam McClure responded when asked to account for the methodology behind his publication. He wanted his writers to begin their research without preconceived notions, to carry their readers through their own process of discovery. As they educated themselves about the social and economic inequities rampant in the wake of teeming industrialization, so they educated the entire country. Together, these investigative journalists, who would later appropriate Roosevelt’s derogatory term “muckraker” as “a badge of honor,” produced a series of exposés that uncovered the invisible web of corruption linking politics to business. McClure’s formula—giving his writers the time and resources they needed to produce extended, intensively researched articles—was soon adopted by rival magazines, creating what many considered a golden age of journalism. Collectively, this generation of gifted writers ushered in a new mode of investigative reporting that provided the necessary conditions to make a genuine bully pulpit of the American presidency. “It is hardly an exaggeration to say that the progressive mind was characteristically a journalistic mind,” the historian Richard Hofstadter observed, “and that its characteristic contribution was that of the socially responsible reporter-reformer.”
Doris Kearns Goodwin (The Bully Pulpit: Theodore Roosevelt, William Howard Taft, and the Golden Age of Journalism)
what about your new way of looking at things? We seem to have wandered rather a long way from that.’ ‘Well, as a matter of fact,’ said Philip, ‘we haven’t. All these camisoles en flanelle and pickled onions and bishops of cannibal islands are really quite to the point. Because the essence of the new way of looking is multiplicity. Multiplicity of eyes and multiplicity of aspects seen. For instance, one person interprets events in terms of bishops; another in terms of the price of flannel camisoles; another, like that young lady from Gulmerg,’ he nodded after the retreating group, ‘thinks of it in terms of good times. And then there’s the biologist, the chemist, the physicist, the historian. Each sees, professionally, a different aspect of the event, a different layer of reality. What I want to do is to look with all those eyes at once. With religious eyes, scientific eyes, economic eyes, homme moyen sensuel eyes . . .’ ‘Loving eyes too.’ He smiled at her and stroked her hand. ‘The result . . .’ he hesitated. ‘Yes, what would the result be?’ she asked. ‘Queer,’ he answered. ‘A very queer picture indeed.’ ‘Rather too queer, I should have thought.’ ‘But it can’t be too queer,’ said Philip. ‘However queer the picture is, it can never be half so odd as the original reality. We take it all for granted; but the moment you start thinking, it becomes queer. And the more you think, the queerer it grows. That’s what I want to get in this book—the astonishingness of the most obvious things. Really any plot or situation would do. Because everything’s implicit in anything. The whole book could be written about a walk from Piccadilly Circus to Charing Cross. Or you and I sitting here on an enormous ship in the Red Sea. Really, nothing could be queerer than that. 
When you reflect on the evolutionary processes, the human patience and genius, the social organisation, that have made it possible for us to be here, with stokers having heat apoplexy for our benefit and steam turbines doing five thousand revolutions a minute, and the sea being blue, and the rays of light not flowing round obstacles, so that there’s a shadow, and the sun all the time providing us with energy to live and think—when you think of all this and a million other things, you must see that nothing could well be queerer and that no picture can be queer enough to do justice to the facts.’ ‘All the same,’ said Elinor, after a long silence, ‘I wish one day you’d write a simple straightforward story about a young man and a young woman who fall in love and get married and have difficulties, but get over them, and finally settle down.’ ‘Or
Aldous Huxley (Point Counter Point)
The pathbreaker who disdains the applause he may get from the crowd of his contemporaries does not depend on his own age's ideas. He is free to say with Schiller's Marquis Posa: "This century is not ripe for my ideas; I live as a citizen of centuries to come." The genius' work too is embedded in the sequence of historical events, is conditioned by the achievements of preceding generations, and is merely a chapter in the evolution of ideas. But it adds something new and unheard of to the treasure of thoughts and may in this sense be called creative. The genuine history of mankind is the history of ideas. It is ideas that distinguish man from all other beings. Ideas engender social institutions, political changes, technological methods of production, and all that is called economic conditions. And in searching for their origin we inevitably come to a point at which all that can be asserted is that a man had an idea. Whether the name of this man is known or not is of secondary importance. This is the meaning that history attaches to the notion of individuality. Ideas are the ultimate given of historical inquiry. All that can be said about ideas is that they came to pass. The historian may point out how a new idea fitted into the ideas developed by earlier generations and how it may be considered a continuation of these ideas and their logical sequel.
Ludwig von Mises (Theory and History: An Interpretation of Social and Economic Evolution)
Hoover wanted the new investigation to be a showcase for his bureau, which he had continued to restructure. To counter the sordid image created by Burns and the old school of venal detectives, Hoover adopted the approach of Progressive thinkers who advocated for ruthlessly efficient systems of management. These systems were modeled on the theories of Frederick Winslow Taylor, an industrial engineer, who argued that companies should be run “scientifically,” with each worker’s task minutely analyzed and quantified. Applying these methods to government, Progressives sought to end the tradition of crooked party bosses packing government agencies, including law enforcement, with patrons and hacks. Instead, a new class of technocratic civil servants would manage burgeoning bureaucracies, in the manner of Herbert Hoover—“the Great Engineer”—who had become a hero for administering humanitarian relief efforts so expeditiously during World War I. As the historian Richard Gid Powers has noted, J. Edgar Hoover found in Progressivism an approach that reflected his own obsession with organization and social control. What’s more, here was a way for Hoover, a deskbound functionary, to cast himself as a dashing figure—a crusader for the modern scientific age. The fact that he didn’t fire a gun only burnished his image. Reporters noted that the “days of ‘old sleuth’ are over” and that Hoover had “scrapped the old ‘gum shoe, dark lantern and false moustache’ traditions of the Bureau of Investigation and substituted business methods of procedure.” One article said, “He plays golf. Whoever could picture Old Sleuth doing that?”
David Grann (Killers of the Flower Moon: The Osage Murders and the Birth of the FBI)
The importance and influence of Charles Darwin’s theory of evolution by natural selection can scarcely be exaggerated. A century after Darwin’s death, the great evolutionary biologist and historian of science, Ernst Mayr, wrote, ‘The worldview formed by any thinking person in the Western world after 1859, when On the Origin of Species was published, was by necessity quite different from a worldview formed prior to 1859… The intellectual revolution generated by Darwin went far beyond the confines of biology, causing the overthrow of some of the most basic beliefs of his age.’1 Adrian Desmond and James Moore, Darwin’s biographers, contend, ‘Darwin is arguably the best known scientist in history. More than any modern thinker—even Freud or Marx—this affable old-world naturalist from the minor Shropshire gentry has transformed the way we see ourselves on the planet.’2 In the words of the philosopher Daniel C. Dennett, ‘Almost no one is indifferent to Darwin, and no one should be. The Darwinian theory is a scientific theory, and a great one, but that is not all it is… Darwin’s dangerous idea cuts much deeper into the fabric of our most fundamental beliefs than many of its sophisticated apologists have yet admitted, even to themselves.’3 Dennett goes on to add, ‘If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law.’4 The editors of the Cambridge Companion to Darwin begin their introduction by stating, ‘Some scientific thinkers, while not themselves philosophers, make philosophers necessary. Charles Darwin is an obvious case.
His conclusions about the history and diversity of life—including the evolutionary origin of humans—have seemed to bear on fundamental questions about being, knowledge, virtue and justice.’5 Among the fundamental questions raised by Darwin’s work, which are still being debated by philosophers (and others) are these: ‘Are we different in kind from other animals? Do our apparently unique capacities for language, reason and morality point to a divine spark within us, or to ancestral animal legacies still in evidence in our simian relatives? What forms of social life are we naturally disposed towards—competitive and selfish forms, or cooperative and altruistic ones?’6 As the editors of the volume point out, virtually the entire corpus of the foundational works of Western philosophy, from Plato and Aristotle to Descartes to Kant to Hegel, has had to be re-examined in the light of Darwin’s work. Darwin continues to be read, discussed, interpreted, used, abused—and misused—to this day. As the philosopher and historian of science, Jean Gayon, puts it, ‘[T]his persistent positioning of new developments in relation to a single, pioneering figure is quite exceptional in the history of modern natural science.
Charles Darwin (On the Origin of Species)
On the first day of the meeting that would become known as the United States Constitutional Convention, Edmund Randolph of Virginia kicked off the proceedings. Addressing his great fellow Virginian General George Washington, victorious hero of the War of Independence, who sat in the chair, Randolph hoped to convince delegates sent by seven, so far, of the thirteen states, with more on the way, to abandon the confederation formed by the states that had sent them—the union that had declared American independence from England and won the war—and to replace it with another form of government. “Our chief danger,” Randolph announced, “arises from the democratic parts of our constitutions.” This was in May of 1787, in Philadelphia, in the same ground-floor room of the Pennsylvania State House, borrowed from the Pennsylvania assembly, where in 1776 the Continental Congress had declared independence. Others in the room already agreed with Randolph: James Madison, also of Virginia; Robert Morris of Pennsylvania; Gouverneur Morris of New York and Pennsylvania; Alexander Hamilton of New York; Washington. They wanted the convention to institute a national government. As we know, their effort was a success. We often say the confederation was a weak government, the national government stronger. But the more important difference has to do with whom those governments acted on. The confederation acted on thirteen state legislatures. The nation would act on all American citizens, throughout all the states. That would be a mighty change. To persuade his fellow delegates to make it, Randolph was reeling off a list of what he said were potentially fatal problems, urgently in need, he said, of immediate repair. He reiterated what he called the chief threat to the country. 
“None of the constitutions”—he meant those of the states’ governments—“have provided sufficient checks against the democracy.” The term “democracy” could mean different things, sometimes even contradictory things, in 1787. People used it to mean “the mob,” which historians today would call “the crowd,” a movement of people denied other access to power, involving protest, riot, what recently has been called occupation, and often violence against people and property. But sometimes “democracy” just meant assertive lawmaking by a legislative body staffed by gentlemen highly sensitive to the desires of their genteel constituents. Men who condemned the working-class mob as a democracy sometimes prided themselves on being “democratical” in their own representative bodies. What Randolph meant that morning by “democracy” is clear. When he said “our chief danger arises from the democratic parts of our constitutions,” and “none of the constitutions have provided sufficient checks against the democracy,” he was speaking in a context of social and economic turmoil, pervading all thirteen states, which the other delegates were not only aware of but also had good reason to be urgently worried about. So familiar was the problem that Randolph would barely have had to explain it, and he didn’t explain it in detail. Yet he did say things whose context everyone there would already have understood.
William Hogeland (Founding Finance: How Debt, Speculation, Foreclosures, Protests, and Crackdowns Made Us a Nation (Discovering America))
Describing the plight of the unemployed, the historian William Manchester wrote: “Although millions were trapped in a great tragedy for which there could plainly be no individual responsibility, social workers repeatedly observed that the jobless were suffering from feelings of guilt. ‘I haven’t had a steady job in more than two years,’ a man facing eviction told a New York Daily News reporter in February 1932. ‘Sometimes I feel like a murderer. What’s wrong with me, that I can’t protect my children?
Jon Meacham (The Soul of America: The Battle for Our Better Angels)
*Because of their obsession with gold, the conquistadors are often dismissed as “gold crazy.” In fact they were not so much gold crazy as status crazy. Like Hernán Cortés, who conquered Mexico, Pizarro was born into the lower fringes of the nobility and hoped by his exploits to earn titles, offices, and pensions from the Spanish crown. To obtain these royal favors, their expeditions had to bring something back for the king. Given the difficulty and expense of transportation, precious metals—“nonperishable, divisible, and compact,” as historian Matthew Restall notes—were almost the only goods that they could plausibly ship to Europe. Inka gold and silver thus represented to the Spaniards the intoxicating prospect of social betterment.
Charles C. Mann (1491: New Revelations of the Americas Before Columbus)
In 1957, historian Norman Cohn had tracked these outbreaks of apocalyptic thinking to periods of significant social, technological and economic change, when wealth inequality becomes highly visible.1 Arguably, with the tumultuous changes brought about by the COVID-19 pandemic, we’re living through one of those periods now.
Byron Clark (Fear: The must-read gripping new book about New Zealand's hostile underworld of extremists)
There are hints, too, of wider social trends. The first edition of the Dictionary contains more than thirty references to coffee, and even more to tea. Johnson would vigorously defend the latter, not long after the Dictionary was published, in his review of an essay by the umbrella-toting Hanway, who believed it was ‘pernicious to health, obstructing industry and impoverishing the nation’.2 Johnson’s love of tea was deep but not exceptional: the leaf had been available in England since the 1650s (Pepys records drinking it for the first time in September 1660), and by 1755 it was being imported to Britain at the rate of 2,000 tons a year. The fashion for tea-drinking, facilitated by Britain’s imperial resources, drove demand for another fruit of the colonies, sugar (‘the native salt of the sugar-cane, obtained by the expression and evaporation of its juice’). Tea also played a crucial role in the dissolution of the eighteenth-century British Empire, for it was of course Bostonian opponents of a British tax on tea who opened the final breach between Britain and colonial America. All the same, it was coffee that proved the more remarkable phenomenon of the age. Johnson gives a clue to this when he defines ‘coffeehouse’ as ‘a house of entertainment where coffee is sold, and the guests are supplied with newspapers’. It was this relationship between coffee and entertainment (by which Johnson meant ‘conversation’) that made it such a potent force. Coffee was first imported to Europe from Yemen in the early part of the seventeenth century, and the first coffee house opened in St Mark’s Square in Venice in 1647. The first in England opened five years later—a fact to which Johnson refers in his entry for ‘coffee’—but its proprietor, Daniel Edwards, could hardly have envisioned that by the middle of the following century there would be several thousand coffee houses in London alone, along with new coffee plantations, run by Europeans, in the East Indies and the Caribbean. 
Then as now, coffee houses were meeting places, where customers (predominantly male) could convene to discuss politics and current affairs. By the time of the Dictionary they were not so much gentlemanly snuggeries as commercial exchanges. As the cultural historian John Brewer explains, ‘The coffee house was the precursor of the modern office’; in later years Johnson would sign the contract for his Lives of the English Poets in a coffee house on Paternoster Row, and the London Stock Exchange and Lloyd’s have their origins in the coffee-house culture of the period. ‘Besides being meeting places’, the coffee houses were ‘postes restantes, libraries, places of exhibition and sometimes even theatres’. They were centres, too, of political opposition and, because they were open to all ranks and religions, they allowed a rare freedom of information and expression.
Henry Hitchings (Defining the World: The Extraordinary Story of Dr. Johnson's Dictionary)
Aldous Huxley is known today primarily as the author of the novel Brave New World. He was one of the first prominent Americans to publicly endorse the use of psychedelic drugs. Controversial political theorist Lyndon LaRouche called Huxley “the high priest for Britain’s opium war,” and claimed he played a conspicuous role in laying the groundwork for the Sixties counterculture. Huxley’s grandfather was Thomas H. Huxley, founder of the Rhodes Roundtable and a longtime collaborator with establishment British historian Arnold Toynbee. Toynbee headed the Research Division of British Intelligence during World War II, and was a briefing officer to Winston Churchill. Aldous Huxley was tutored at Oxford by novelist H. G. Wells, a well-known advocate of world government. Expounding in his “Open Conspiracy: Blue Prints for a World Revolution,” Wells wrote, “The Open Conspiracy will appear first, I believe, as a conscious organization of intelligent and quite possibly in some cases, wealthy men, as a movement having distinct social and political aims. . . . In all sorts of ways they will be influencing and controlling the apparatus of the ostensible government.” Wells introduced Huxley to the notorious Satanist, Aleister Crowley.
Donald Jeffries (Hidden History: An Exposé of Modern Crimes, Conspiracies, and Cover-Ups in American Politics)
As John Adams famously pointed out, political wisdom has not improved over the ages; even as technology has advanced, mankind steps on the same rakes, and the new inventions often magnify the damage. Historian Daniel Boorstin referred to the nonprogressivity of human nature and politics as “Adams’ law,” but Boorstin was far too modest, for he appended several of his own astute observations to it, among which was that technology, far from fulfilling needs and solving problems, creates needs and spreads problems. “Boorstin’s law,” then, could be formulated thus in the modern world: beware of optimism about the social and political benefits of the Internet and social media, for while technology progresses, human nature and politics do not.21
William J. Bernstein (Masters of the Word: How Media Shaped History from the Alphabet to the Internet)
The hallmark of originality is rejecting the default and exploring whether a better option exists. I’ve spent more than a decade studying this, and it turns out to be far less difficult than I expected. The starting point is curiosity: pondering why the default exists in the first place. We’re driven to question defaults when we experience vuja de, the opposite of déjà vu. Déjà vu occurs when we encounter something new, but it feels as if we’ve seen it before. Vuja de is the reverse—we face something familiar, but we see it with a fresh perspective that enables us to gain new insights into old problems. Without a vuja de event, Warby Parker wouldn’t have existed. When the founders were sitting in the computer lab on the night they conjured up the company, they had spent a combined sixty years wearing glasses. The product had always been unreasonably expensive. But until that moment, they had taken the status quo for granted, never questioning the default price. “The thought had never crossed my mind,” cofounder Dave Gilboa says. “I had always considered them a medical purchase. I naturally assumed that if a doctor was selling it to me, there was some justification for the price.” Having recently waited in line at the Apple Store to buy an iPhone, he found himself comparing the two products. Glasses had been a staple of human life for nearly a thousand years, and they’d hardly changed since his grandfather wore them. For the first time, Dave wondered why glasses had such a hefty price tag. Why did such a fundamentally simple product cost more than a complex smartphone? Anyone could have asked those questions and arrived at the same answer that the Warby Parker squad did. Once they became curious about why the price was so steep, they began doing some research on the eyewear industry. That’s when they learned that it was dominated by Luxottica, a European company that had raked in over $7 billion the previous year. 
“Understanding that the same company owned LensCrafters and Pearle Vision, Ray-Ban and Oakley, and the licenses for Chanel and Prada prescription frames and sunglasses—all of a sudden, it made sense to me why glasses were so expensive,” Dave says. “Nothing in the cost of goods justified the price.” Taking advantage of its monopoly status, Luxottica was charging twenty times the cost. The default wasn’t inherently legitimate; it was a choice made by a group of people at a given company. And this meant that another group of people could make an alternative choice. “We could do things differently,” Dave suddenly understood. “It was a realization that we could control our own destiny, that we could control our own prices.” When we become curious about the dissatisfying defaults in our world, we begin to recognize that most of them have social origins: Rules and systems were created by people. And that awareness gives us the courage to contemplate how we can change them. Before women gained the right to vote in America, many “had never before considered their degraded status as anything but natural,” historian Jean Baker observes. As the suffrage movement gained momentum, “a growing number of women were beginning to see that custom, religious precept, and law were in fact man-made and therefore reversible.
Adam M. Grant (Originals: How Non-Conformists Move the World)
Most of this material is accessible in the Dunayevskaya and Glaberman collections in the Wayne State University Archives of Labor and Urban Affairs in Detroit. When occasionally I look up something in the collections, I find it hard to believe that we wrote so much and took on so many literary critics and historians. The Johnson-Forest Tendency consisted of a small number of members—never more than sixty to seventy in an organization of several hundred. But the fervor with which we supported the independent black struggle and attacked the alienation of human beings in the process of capitalist production made us stand out in any gathering. Most members of the Johnson-Forest Tendency were part of the new generation who had joined the radical movement in the 1940s because we wanted to make a second American Revolution—which to us meant mainly encouraging the struggles of rank-and-file workers to take over control of production inside the plant and supporting the black struggle for full social, economic, and political equality. Black, white, Asian, and Chicano, workers and intellectuals, living on the East Coast, West Coast, and in the Midwest, we were a representative sample of the new human forces that were emerging in the United States during World War II. Because CLR could not be publicly active, we acted as his transmission belt to the larger American community. Our little organization was a collective way to know reality.
Grace Lee Boggs (Living for Change: An Autobiography)
Humans have natural rights in the state of nature but they do not have civil rights. Civil rights are derived from membership in a society. The Republicans who controlled both houses of Congress after the Civil War knew this. They also knew that, before conferring civil rights, they had to once and for all abolish slavery. The Thirteenth Amendment ending slavery was passed by the Senate on April 8, 1864, and by the House on January 31, 1865. Republican support for the amendment: 100 percent. Democratic support: 23 percent. Even after the Civil War, only a tiny percentage of Democrats were willing to sign up to permanently end slavery. Most Democrats wanted it to continue. In the following year, on June 13, 1866, the Republican Congress passed the Fourteenth Amendment overturning the Dred Scott decision and granting full citizenship and equal rights under the law to blacks. This amendment prohibited states from abridging the “privileges and immunities” of all citizens, from depriving them of “due process of law” or denying them “equal protection of the law.” The Fourteenth Amendment passed the House and Senate with exclusive Republican support. Not a single Democrat either in the House or the Senate voted for it. Two years later, in 1868, Congress with the support of newly-elected Republican president Ulysses Grant passed the Fifteenth Amendment granting suffrage to blacks. The right to vote, it said, cannot be “denied or abridged by the United States or any state on account of race, color or previous condition of servitude.” In the Senate, the Fifteenth Amendment passed by a vote of 39 to 13. Every one of the 39 “yes” votes came from Republicans. (Some Republicans like Charles Sumner abstained because they wanted the measure to go even further than it did.) All the 13 “no” votes came from Democrats. 
In the House, every “yes” vote came from a Republican and every Democrat voted “no.” It is surely a matter of the greatest significance that the constitutional provisions that made possible the Civil Rights Act, the Voting Rights Act, and the Fair Housing Bill only entered the Constitution thanks to the Republican Party. Beyond this, the GOP put forward a series of Civil Rights laws to further reinforce black people’s rights to freedom, equality, and social justice. When Republicans passed the Civil Rights Act of 1866—guaranteeing to blacks the rights to make contracts and to have the criminal laws apply equally to whites and blacks—the Democrats struck back. They didn’t have the votes in Congress, but they had a powerful ally in President Andrew Johnson. Johnson vetoed the legislation. Now this may seem like an odd act for Lincoln’s vice president, but it actually wasn’t. Many people don’t realize that Johnson wasn’t a Republican; he was a Democrat. Historian Kenneth Stampp calls him “the last Jacksonian.”8 Lincoln put him on the ticket because he was a pro-union Democrat and Lincoln was looking for ways to win the votes of Democrats opposed to secession. Johnson, however, was both a southern partisan and a Democratic partisan. Once the Civil War ended, he attempted to lead weak-kneed Republicans into a new Democratic coalition based on racism and white privilege. Johnson championed the Democratic mantra of white supremacy, declaring, “This is a country for white men and, by God, as long as I am president, it shall be a government of white men.” In his 1867 annual message to Congress, Johnson declared that blacks possess “less capacity for government than any other race of people. No independent government of any form has ever been successful in their hands. 
On the contrary, wherever they have been left to their own devices they have shown a consistent tendency to relapse into barbarism.”9 These are perhaps the most racist words uttered by an American president, and no surprise, they were uttered by a Democrat.
Dinesh D'Souza (Hillary's America: The Secret History of the Democratic Party)
A Hard Left For High-School History The College Board version of our national story BY STANLEY KURTZ. At the height of the “culture wars” of the late 1980s and early 1990s, conservatives were alive to the dangers of a leftist takeover of American higher education. Today, with the coup all but complete, conservatives take the loss of the academy for granted and largely ignore it. Meanwhile, America’s college-educated Millennial generation drifts ever farther leftward. Now, however, an ambitious attempt to force a leftist tilt onto high-school U.S.-history courses has the potential to shake conservatives out of their lethargy, pulling them back into the education wars, perhaps to retake some lost ground. The College Board, the private company that develops the SAT and Advanced Placement (AP) exams, recently ignited a firestorm by releasing, with little public notice, a lengthy, highly directive, and radically revisionist “framework” for teaching AP U.S. history. The new framework replaces brief guidelines that once allowed states, school districts, and teachers to present U.S. history as they saw fit. The College Board has promised to generate detailed guidelines for the entire range of AP courses (including government and politics, world history, and European history), and in doing so it has effectively set itself up as a national school board. Dictating curricula for its AP courses allows the College Board to circumvent state standards, virtually nationalizing America’s high schools, in violation of cherished principles of local control. Unchecked, this will result in a high-school curriculum every bit as biased and politicized as the curriculum now dominant in America’s colleges. Not coincidentally, David Coleman, the new head of the College Board, is also the architect of the Common Core, another effort to effectively nationalize American K–12 education, focusing on English and math skills.
As president of the College Board, Coleman has found a way to take control of history, social studies, and civics as well, pushing them far to the left without exposing himself to direct public accountability. Although the College Board has steadfastly denied that its new AP U.S. history (APUSH) guidelines are politically biased, the intellectual background of the effort indicates otherwise. The early stages of the APUSH redesign overlapped with a collaborative venture between the College Board and the Organization of American Historians to rework U.S.-history survey courses along “internationalist” lines. The goal was to undercut anything that smacked of American exceptionalism, the notion that, as a nation uniquely constituted around principles of liberty and equality, America stands as a model of self-government for the world. Accordingly, the College Board’s new framework for AP U.S. history eliminates the traditional emphasis on Puritan leader John Winthrop’s “City upon a Hill” sermon and its echoes in American history. The Founding itself is demoted and dissolved within a broader focus on transcontinental developments, chiefly the birth of an exploitative international capitalism grounded in the slave trade. The Founders’ commitment to republican principles is dismissed as evidence of a benighted belief in European cultural superiority. Thomas Bender, the NYU historian who leads the Organization of American Historians’ effort to globalize and denationalize American history, collaborated with the high-school and college teachers who eventually came to lead the College Board’s APUSH redesign effort. Bender frames his movement as a counterpoint to the exceptionalist perspective that dominated American foreign policy during the George W. Bush administration. Bender also openly hopes that students exposed to his approach will sympathize with Supreme Court justice Ruth Bader Ginsburg’s willingness to use foreign law to interpret the U.S.
Constitution rather than with Justice Antonin Scalia.
Anonymous
I was also troubled by a sensibility in much of the conventional history of the era that these events were somehow inevitable. White animosity toward blacks was accepted as a wrong, but logical extension of antebellum racial views. Events were presented as having transpired as a result of large--seemingly unavoidable--social and anthropological shifts, rather than the specific decisions and choices of individuals. What's more, African Americans were portrayed by most historians as an almost static component of U.S. society: Their leaders changed with each generation, but the mass of black Americans were depicted as if the freed slaves of 1863 were the same people still not free fifty years later. There was no acknowledgement of the effects of cycle upon cycle of malevolent defeat, of the injury of seeing one generation rise above the cusp of poverty only to be indignantly crushed, of the impact of repeating tsunamis of violence and obliterated opportunities on each new generation of an ever-changing population out-numbered in persons and resources.
Douglas A. Blackmon
How did the West produce the intense world of visual signs? What were the underlying forces that favored the multiplication of signs? It is generally understood that there is a close relationship between capitalism and Christianity. Especially through the Protestant Reformation the Christian faith produced a huge shift to the individual, a man or woman separated out before God. Sociologists and historians recognize that by means of this ideological transition the individual no longer existed within a containing order of duties and rights controlling the distribution of wealth. Wealth instead became a marker of individual divine blessing. Thus the Reformation led to the typical figure of the righteous business man, the mill-owner who made big profits during the week and with them endowed a church for giving thanks on Sunday. More recently we have the emergence of the ‘prosperity gospel’ which applies the same basic formula to everyone. As they say in these churches, ‘prayed for and paid for’, neatly chiming relationship to God and personal financial success. Thus Christianity has underpinned the multiplication of material wealth for individuals. But a consequence of this is the thickening of the world of signs. Prosperity is a sign of God’s favor, and this is shown, signified, by the actual goods, the houses, clothes, cars, etc. Against this metaphysical background, however, the goods very quickly attain their own social value and produce the well-known contours of the consumer world. Once they were declared divinely willed and good they could act as self-referential signs in and for themselves. People don’t have to give any thought to theological justification to derive meaning from the latest car model, from the good-life associations of household items, refrigerators, fitted kitchens, plasma T.V.s, and now from the plugged-in cool of the digital world, computers, cell phones, iPods, G.P.S. and so on.
So it is that our Western culture has developed a class of signs with a powerful inner content of validated desire.
Anthony Bartlett (Virtually Christian: How Christ Changes Human Meaning and Makes Creation New)
Very few young people know this history. Most of them haven’t even heard about Douglass; who hasn’t heard of Martin Luther King? Am I suggesting that the scandalous neglect of Douglass and the excessive praise heaped on King is part of the progressive whitewash? You bet I am. But, say the Democratic and progressive historians, wait a minute! While King’s program moved forward and was enacted into law, Douglass’s program was halted in its tracks. We cannot forget about the backlash! Yes, indeed. The Democratic storytellers are right that there was a powerful backlash against blacks in the South, so that the constitutional provisions of freedom, equality, and social justice became a dead letter. The Civil Rights laws were stymied, and even the provisions that passed were ignored. Blacks were reduced to new forms of subjugation not identical with, but reminiscent of, slavery. This re-enslavement of blacks was enforced by a juggernaut of violence epitomized by that institution of domestic terrorism, the Ku Klux Klan.
Dinesh D'Souza (Hillary's America: The Secret History of the Democratic Party)
Benjamin Franklin Learned about Democracy by Observing Native Americans One of the Founding Fathers, Benjamin Franklin, actually spent quite a lot of time observing and socializing with the Iroquois tribe. During his interactions with the Native Americans, Franklin noticed that the Iroquois was, in fact, a union of different tribes that were ruled by one chief. Their chief would only remain in power if the other tribes supported his actions, which technically made him an elected official. The Iroquois also had in place a system of checks and balances to make sure that no one abused their authority. Some historians speculate that Franklin introduced many of the things he learned from his interactions with the Native Americans when he and the other Founding Fathers drafted the United States Constitution.
William D. Willis (American History: US History: An Overview of the Most Important People & Events. The History of United States: From Indians, to "Contemporary" History ... Native Americans, Indians, New York Book 1))
My father's concern for his patients was only enhanced by the fact that so many of them had a personal connection to him...In the words of the historian David J. Rothman, 'doctor and patient occupied the same social space,' promoting a shared relationship. Meanwhile, the poor and minority patients my dad met for the first time at the Mount Sinai--including many he would then follow for years--got the same royal treatment...His goal was to 'take extra pains with the service patients, to be certain they are reassured and confident in your care, and come to believe that you really care about him or her as an individual.' One way he did this was to take advantage of his flexible schedule. 'It's so simple,' he wrote, 'to make an extra visit in the afternoon for these special cases, come back to report a new lab test result, review an X-ray [or] reassure that the scheduled test is necessary, important and will lead to some conclusive information.' Illness, he underscored, was 'frightening.
Barron H. Lerner (The Good Doctor: A Father, a Son, and the Evolution of Medical Ethics)
Still later, I read Ira Katznelson’s history of discrimination, When Affirmative Action Was White, which argued that similar exclusions applied to other “color-blind” New Deal programs, such as the beloved GI Bill, social security, and unemployment insurance. I was slowly apprehending that a rising tide, too, could be made to discriminate. A raft of well-researched books and articles pointed me this way. From historians, I learned that the New Deal’s exclusion of blacks was the price FDR paid to the southern senators for its passage. The price black people paid was being forced out of the greatest government-backed wealth-building opportunity in the twentieth century. The price of discrimination had more dimensions than those that were immediately observable. Since the country’s wealth was distributed along the lines of race and because black families were cordoned off, resources accrued and compounded for whites while relative poverty accrued for blacks. And so it was not simply that black people were more likely to be poor but that black people—of all classes—were more likely to live in poor neighborhoods. So thick was the barrier of segregation that upper-class blacks were more likely to live in poor neighborhoods than poor whites.
Ta-Nehisi Coates (We Were Eight Years in Power: An American Tragedy)
The discipline of history is particularly important in this context because while science has had a direct impact on how historians write, and what they write about, history has itself been evolving. One of the great debates in historiography is over how events move forward. One school of thought has it that ‘great men’ are mostly what matter, that the decisions of people in power can bring about significant shifts in world events and mentalities. Others believe that economic and commercial matters force change by promoting the interests of certain classes within the overall population. In the twentieth century, the actions of Stalin and Hitler in particular would certainly seem to suggest that ‘great’ men are vital to historical events. But the second half of the century was dominated by thermonuclear weapons, and can one say that any single person, great or otherwise, was really responsible for the bomb? No. In fact, I would suggest that we are living at a time of change, a crossover time in more ways than one, when what we have viewed as the causes of social movement in the past – great men or economic factors playing on social classes – are both being superseded as the engine of social development. That new engine is science.
Peter Watson (The Modern Mind: An Intellectual History of the 20th Century)
As the eminent historian Gordon Wood has pointed out, we must understand that a majority of the Articles’ most famous critics — and the later constitutional framers — were basically aristocrats in the pre-industrial, pre-capitalist sense of the word. They feared inflation, paper money, and debt relief measures because they modeled their social and economic world on the systems and tendencies of the English gentry. Their entire societal and agrarian order was at risk during the 1780s — in fact it would later collapse in the increasingly commercial northern states, only to live on in the plantation life of the antebellum South. Much of their complaint about “excessive democracy” in the new American state governments may ring hollow to modern ears, but they believed in their position most emphatically.
Daniel A. Sjursen (A True History of the United States: Indigenous Genocide, Racialized Slavery, Hyper-Capitalism, Militarist Imperialism and Other Overlooked Aspects of American Exceptionalism (Truth to Power))
The Historians’ Dispute The debate between the New Historians and the critical sociologists on one side, and the social scientists of the establishment on the other, broke out less than a year after the Oslo Accords were signed. The first salvo of what came to be known as “the historians’ dispute” was in a 1994 article published in Haaretz by author Aharon Meged, a longtime supporter of the Zionist Labor movement. In the article he accuses the post-Zionists of rewriting history “in the spirit of its enemies.”40 He claims that the post-Zionists had signed up to support the aims of “the Arabs” by constructing an anti-Zionist historiography that reproduced “the old communist and Soviet propaganda which presented Zionism as an imperialist-colonialist movement.” Meged claimed that this was the result of an innate suicidal instinct amongst the post-Zionists who know that denying the justification of Zionism will bring about the destruction of Israel. Hence, he overtly called for a social science whose role is to confirm the central tenet of Zionism.
Tikva Honig-Parnass (The False Prophets of Peace: Liberal Zionism and the Struggle for Palestine)
Across Europe, conservatives alarmed by the rise of labour were discovering antidotes in nationalism, racism, and jingoism. Intellectuals, politicians, industrialists, and empire-builders embraced the idea that the masses – the dark, threatening masses stirring in the social depths – could perhaps be distracted by a new kind of ‘bread and circuses’: the glory of empire. French philologist, philosopher, and historian Ernest Renan was explicit: it was ‘the only way to counter socialism’, and ‘a nation that does not colonise is condemned to end up with socialism, to experience a war between rich and poor’. Cecil Rhodes, the diamond magnate and colonial pioneer who did more than anyone to establish British imperial rule in Southern Africa, found himself thinking along precisely these lines after witnessing a rowdy meeting of the unemployed in East London. ‘On my way home,’ he later recalled, ‘I pondered over the scene, and I became more than ever convinced of the importance of imperialism … The Empire, as I have always said, is a bread and butter question. If you want to avoid civil war, you must become imperialists.’
Neil Faulkner (Empire and Jihad: The Anglo-Arab Wars of 1870-1920)
During the Vietnam War, a new generation of scholars such as William Appleman Williams and Gabriel Kolko challenged long-standing legends about the workings of US foreign policy. Social and cultural historians in the 1970s and 1980s wrote new histories of the nation from the bottom up, expanding our view to include long-overlooked perspectives on gender, race, and ethnic identities and, in the process, showing that narrow narratives focused solely on political leaders at the top obscured more than they revealed. Despite the fact that the term revisionist history is often thrown around by nonhistorians as an insult, in truth all good historical work is at heart “revisionist” in that it uses new findings from the archives or new perspectives from historians to improve, to perfect—and, yes, to revise—our understanding of the past. Today, yet another generation of historians is working once again to bring historical scholarship out of academic circles, this time to push back against misinformation in the public sphere. Writing op-eds and essays for general audiences; engaging the public through appearances on television, radio, and podcasts; and being active on social media sites such as Facebook, Twitter, and Substack, hundreds if not thousands of historians have been working to provide a counterbalance and corrections to the misinformation distorting our national dialogue. Such work has incredible value, yet historians still do their best work in the longer written forms of books, articles, and edited collections that allow us both to express our thoughts with precision in the text and provide ample evidence in the endnotes. This volume has brought together historians who have been actively engaging the general public through the short forms of modern media and has provided them a platform where they might expand those engagements into fuller essays that reflect the best scholarly traditions of the profession.
Kevin M. Kruse (Myth America: Historians Take On the Biggest Legends and Lies About Our Past)
By 1636, civil authorities on the island decreed a rule that became common in chattel systems throughout the hemisphere: slaves would remain in bondage for life. In 1661, with the island now amid a full-blown sugar boom, the authorities formulated a fuller set of laws governing the lives of slaves, a Black code that one historian has called “one of the most influential pieces of legislation passed by a colonial legislature.” Antigua, Jamaica, South Carolina, and, “indirectly,” Georgia adopted it in its entirety, while the laws of many other English colonies were modeled after it. The law described Africans as a “heathenish, brutish and uncertaine, dangerous kinde of people,” and gave their white owners near total control over their lives. The right of trial by jury guaranteed for whites was excluded for slaves, whom their owners could punish at will, facing no consequences even for murder, so long as they could cite a cause. Other rules barred Black slaves from skilled occupations, thus helping to reify race as a largely impermeable membrane dividing whites and Blacks in the New World. With steps like these, tiny Barbados became an enormously powerful driver of history, not only through the prodigious wealth it would generate, a wealth hitherto “unknown in other parts of colonial America,” but by its legal and social example as well. The island colony stood out as a pioneer in the development of chattel slavery and in the construction of the plantation machine, as the originator of codes like these, and later as a crucial source of early migration, both Black and white, to the Carolinas, Virginia, and later Jamaica. Here was the seed crystal of the English plantation system in the New World, or in the words of one historian, its “cultural hearth.”
Howard W. French (Born in Blackness: Africa, Africans, and the Making of the Modern World, 1471 to the Second World War)
Historian Kelly Lytle Hernandez is among those who have begun to connect the dots between mass incarceration and mass deportation. In her brilliant essay “Amnesty or Abolition: Felons, Illegals, and the Case for a New Abolition Movement,” she chronicles how these systems have emerged as dual, interlocking forms of social control that relegate “aliens” and “felons” to a racialized caste of outsiders.
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
In such an endeavor it is not enough to say that history unfolds by processes too complex for reductionistic analysis. That is the white flag of the secular intellectual, the lazy modernist equivalent of The Will of God. On the other hand, it is too early to speak seriously of ultimate goals, such as perfect green-belted cities and robot expeditions to the nearest stars. It is enough to get Homo sapiens settled down and happy before we wreck the planet. A great deal of serious thinking is needed to navigate the decades immediately ahead. We are gaining in our ability to identify options in the political economy most likely to be ruinous. We have begun to probe the foundations of human nature, revealing what people intrinsically most need, and why. We are entering a new era of existentialism, not the old absurdist existentialism of Kierkegaard and Sartre, giving complete autonomy to the individual, but the concept that only unified learning, universally shared, makes accurate foresight and wise choice possible. In the course of all of it we are learning the fundamental principle that ethics is everything. Human social existence, unlike animal sociality, is based on the genetic propensity to form long-term contracts that evolve by culture into moral precepts and law. The rules of contract formation were not given to humanity from above, nor did they emerge randomly in the mechanics of the brain. They evolved over tens or hundreds of millennia because they conferred upon the genes prescribing them survival and the opportunity to be represented in future generations. We are not errant children who occasionally sin by disobeying instructions from outside our species. We are adults who have discovered which covenants are necessary for survival, and we have accepted the necessity of securing them by sacred oath. The search for consilience might seem at first to imprison creativity. The opposite is true. 
A united system of knowledge is the surest means of identifying the still unexplored domains of reality. It provides a clear map of what is known, and it frames the most productive questions for future inquiry. Historians of science often observe that asking the right question is more important than producing the right answer. The right answer to a trivial question is also trivial, but the right question, even when insoluble in exact form, is a guide to major discovery. And so it will ever be in the future excursions of science and imaginative flights of the arts.
Edward O. Wilson (Consilience: The Unity of Knowledge)
“Russian Political Masonry, 1917, and Historians,” International Review of Social History, 28.2 (1983): 240–58. According to the New York Journal-American of February 3, 1949, Jacob Schiff, senior partner in Kuhn, Loeb & Co., “sank about 20,000,000 dollars for the final triumph of Bolshevism in Russia.
Athanasius Schneider (Christus Vincit: Christ's Triumph Over the Darkness of the Age)
Antipatriot sentiments were shared by a wide variety of people across a broad social spectrum. In New York, ironically, some of those who opposed the Revolution were poor tenant farmers from the 160,000-acre Livingston Manor in the Hudson Valley. Robert Livingston, Jr., lord of the manor, was a Whig Revolutionary—not because of deep philosophical convictions, but because his opponents in New York politics were all Tories. Livingston’s tenants, according to historian Staughton Lynd, saw in the Revolution a chance to oppose their Lord and possibly take possession of the land they worked.
Ray Raphael (A People's History of the American Revolution: How Common People Shaped the Fight for Independence)
RBG’s image as a moderate was clinched in March 1993, in a speech she gave at New York University known as the Madison Lecture. Sweeping judicial opinions, she told the audience, packed with many of her old New York friends, were counterproductive. Popular movements and legislatures had to first spur social change, or else there would be a backlash to the courts stepping in. As a case in point, RBG chose an opinion that was very personal to plenty of people listening: Roe v. Wade. The right had been aiming to overturn Roe for decades, and they’d gotten very close only months before the speech with Planned Parenthood v. Casey. Justices Anthony Kennedy, David Souter, and Sandra Day O’Connor had instead brokered a compromise, allowing states to put restrictions on abortion as long as they didn’t pose an “undue burden” on women—or ban it before viability. Neither side was thrilled, but Roe was safe, at least for the moment. Just as feminists had caught their breath, RBG declared that Roe itself was the problem. If only the court had acted more slowly, RBG said, and cut down one state law at a time the way she had gotten them to do with the jury and benefit cases. The justices could have been persuaded to build an architecture of women’s equality that could house reproductive freedom. She said the very boldness of Roe, striking down all abortion bans until viability, had “halted a political process that was moving in a reform direction and thereby, I believe, prolonged divisiveness and deferred stable settlement of the issue.” This analysis remains controversial among historians, who say the political process of abortion access had stalled before Roe. Meanwhile, the record shows that there was no overnight eruption after Roe. In 1975, two years after the decision, no senator asked Supreme Court nominee John Paul Stevens about abortion. But Republicans, some of whom had been pro-choice, soon learned that being the anti-abortion party promised gains.
And even if the court had taken another path, women’s sexual liberation and autonomy might have still been profoundly unsettling. Still, RBG stuck to her guns, in the firm belief that lasting change is incremental. For the feminists and lawyers listening to her Madison Lecture, RBG’s argument felt like a betrayal. At dinner after the lecture, Burt Neuborne remembers, other feminists tore into their old friend. “They felt that Roe was so precarious, they were worried such an expression from Ruth would lead to it being overturned,” he recalls. Not long afterward, when New York senator Daniel Patrick Moynihan suggested to Clinton that RBG be elevated to the Supreme Court, the president responded, “The women are against her.” Ultimately, Erwin Griswold’s speech, with its comparison to Thurgood Marshall, helped convince Clinton otherwise. It was almost enough for RBG to forgive Griswold for everything else.
Irin Carmon (Notorious RBG: The Life and Times of Ruth Bader Ginsburg)
When the system of mass incarceration collapses (and if history is any guide, it will), historians will undoubtedly look back and marvel that such an extraordinarily comprehensive system of racialized social control existed in the United States. How fascinating, they will likely say, that a drug war was waged almost exclusively against poor people of color—people already trapped in ghettos that lacked jobs and decent schools.
Michelle Alexander (The New Jim Crow: Mass Incarceration in the Age of Colorblindness)
...the historian has been trained in a society in which education and knowledge are put forward as technical problems of excellence and not as tools for contending social classes, races, nations...
Howard Zinn (A People's History of the United States: American Beginnings to Reconstruction (New Press People's History, 1))