Social Psychology Attribution Quotes

We've searched our database for all the quotes and captions related to Social Psychology Attribution. Here they are! All 33 of them:

Hostility, malice, and sadism are the result of helplessness and self-loathing; they are all produced by adaptation to a hypercritical social reality and are not attributable to innate aggression.
Arno Gruen
The Spirit is that element of transcendence, superiority, permanence, power, liberty, inner reality, creativity, harmony and synthesis in every manifestation, both individual and social. In people, therefore, the term ‘spiritual’ (to varying degrees) can be attributed to everything that compels them to transcend their selfish exclusiveness, fears, inertia and love of pleasure; everything that urges them to discipline, control and direct those untamed forces, instincts and emotions that seethe within; everything that induces the recognition of a greater, superior reality, social or ideal in nature, and to become one with it, extending the limits of the personality.
Roberto Assagioli (Transpersonal Development: The Dimension Beyond Psychosynthesis)
We deny responsibility for our actions when we attribute their cause to factors outside ourselves: Vague, impersonal forces—“I cleaned my room because I had to.” Our condition, diagnosis, or personal or psychological history—“I drink because I am an alcoholic.” The actions of others—“I hit my child because he ran into the street.” The dictates of authority—“I lied to the client because the boss told me to.” Group pressure—“I started smoking because all my friends did.” Institutional policies, rules, and regulations—“I have to suspend you for this infraction because it’s the school policy.” Gender roles, social roles, or age roles—“I hate going to work, but I do it because I am a husband and a father.” Uncontrollable impulses—“I was overcome by my urge to eat the candy bar.”
Marshall B. Rosenberg (Nonviolent Communication: A Language of Life: Life-Changing Tools for Healthy Relationships (Nonviolent Communication Guides))
I'm not interested in the labels of dominance, submission, top, bottom, fucking versus being fucked. I don't believe that particular sexual acts denote vulnerability or strength; that would be to buy the line that fucking is active and being fucked is passive; as if the arrangement of our bodies tells us something categorical about our psychological stances, our vulnerabilities, our feelings; as if the binaries of active and passive are not used to divide the ranks into powerful masculine attributes and powerless feminine ones. I am talking here instead about a psychological and social acceptance of vulnerability, of all our capacity for injury, of the shared softness of us all.
Katherine Angel (Tomorrow Sex Will Be Good Again: Women and Desire in the Age of Consent)
Despite all the positive psychological attributes of hatred we have outlined, hatred destroys finally the core of the life of the hater. While it lasts, burning in white heat, its effect seems positive and dynamic. But at last it turns to ash, for it guarantees a final isolation from one’s fellows. It blinds the individual to all values of worth, even as they apply to himself and to his fellows. Hatred bears deadly and bitter fruit. It is blind and nondiscriminating.
Howard Thurman (Jesus and the Disinherited)
[That] the driving force of the evolution of human intelligence was the coordination of multiple cognitive systems to pursue complex, shared goals [is called] the social brain hypothesis. It attributes the increase in intelligence to the increasing size and complexity of hominid social groups. Living in a group confers advantages, as we have seen with hunting, but it also demands certain cognitive abilities. It requires the ability to communicate in sophisticated ways, to understand and incorporate the perspectives of others, and to share common goals. The social brain hypothesis posits that the cognitive demands and adaptive advantages associated with living in a group created a snowball effect: As groups got larger and developed more complex joint behaviors, individuals developed new capabilities to support those behaviors. These new capabilities in turn allowed groups to get even larger and allowed group behavior to become even more complex.
Steven Sloman (The Knowledge Illusion: Why We Never Think Alone)
When someone is in some kind of social or psychological difficulty, and someone has been irresponsible in some way, we wonder what caused the problem: “Why are they like that?” And instead of attributing the problem to the person, our psychologists tend to refer it back to other things and other people: It was because of their environment, or because of family conditioning, or because of their father and mother. But there is no end to that, because you can take the blame straight back to Adam and Eve! And responsibility is evaded, because it was limited in the first place.
Alan W. Watts (What Is Zen?)
Exaggerated Emotional Coherence (Halo Effect) If you like the president’s politics, you probably like his voice and his appearance as well. The tendency to like (or dislike) everything about a person—including things you have not observed—is known as the halo effect. The term has been in use in psychology for a century, but it has not come into wide use in everyday language. This is a pity, because the halo effect is a good name for a common bias that plays a large role in shaping our view of people and situations. It is one of the ways the representation of the world that System 1 generates is simpler and more coherent than the real thing. You meet a woman named Joan at a party and find her personable and easy to talk to. Now her name comes up as someone who could be asked to contribute to a charity. What do you know about Joan’s generosity? The correct answer is that you know virtually nothing, because there is little reason to believe that people who are agreeable in social situations are also generous contributors to charities. But you like Joan and you will retrieve the feeling of liking her when you think of her. You also like generosity and generous people. By association, you are now predisposed to believe that Joan is generous. And now that you believe she is generous, you probably like Joan even better than you did earlier, because you have added generosity to her pleasant attributes.
Daniel Kahneman (Thinking, Fast and Slow)
Between the religion of a people and its actual mode of life there is always a compensatory relation, otherwise religion would have no practical significance at all. Beginning with the highly moral religion of the Persians and the notorious dubiousness, even in antiquity, of Persian habits of life, right down to our own “Christian” era, when the religion of love assisted at the greatest blood-bath in the world’s history—wherever we turn this rule holds true. We may therefore infer from the symbol of the Delphic reconciliation an especially violent split in the Greek character. This would also explain the longing for deliverance which gave the mysteries their immense significance for the social life of Greece, and which was completely overlooked by the early admirers of the Greek world. They were content with naïvely attributing to the Greeks everything they themselves lacked.
C.G. Jung (Collected Works of C. G. Jung, Volume 6: Psychological Types (The Collected Works of C. G. Jung Book 38))
In the car ahead, Jane was thinking fast and furiously. She had felt the purpose for which Tarzan had asked a few words with her, and she knew that she must be prepared to give him an answer in the very near future. He was not the sort of person one could put off, and somehow that very thought made her wonder if she did not really fear him. And could she love where she feared? She realized the spell that had been upon her in the depths of that far-off jungle, but there was no spell of enchantment now in prosaic Wisconsin. Nor did the immaculate young Frenchman appeal to the primal woman in her, as had the stalwart forest god. Did she love him? She did not know—now. She glanced at Clayton out of the corner of her eye. Was not here a man trained in the same school of environment in which she had been trained—a man with social position and culture such as she had been taught to consider as the prime essentials to congenial association? Did not her best judgment point to this young English nobleman, whose love she knew to be of the sort a civilized woman should crave, as the logical mate for such as herself? Could she love Clayton? She could see no reason why she could not. Jane was not coldly calculating by nature, but training, environment and heredity had all combined to teach her to reason even in matters of the heart. That she had been carried off her feet by the strength of the young giant when his great arms were about her in the distant African forest, and again today, in the Wisconsin woods, seemed to her only attributable to a temporary mental reversion to type on her part—to the psychological appeal of the primeval man to the primeval woman in her nature. If he should never touch her again, she reasoned, she would never feel attracted toward him. She had not loved him, then. It had been nothing more than a passing hallucination, super-induced by excitement and by personal contact. Excitement would not always mark their future relations, should she marry him, and the power of personal contact eventually would be dulled by familiarity. Again she glanced at Clayton. He was very handsome and every inch a gentleman. She should be very proud of such a husband.
Edgar Rice Burroughs (Tarzan of the Apes (Tarzan, #1))
Theories of generational difference make sense if they are expressed as theories of environmental difference rather than of psychological difference. People, especially young people, will respond to incentives because they have much to gain and little to lose from experimentation. To understand why people are spending so much time and energy exploring new forms of connection, you have to overcome the fundamental attribution error and extend to other people the set of explanations that you use to describe your own behavior: you respond to new opportunities, and so does everybody else, and these changes feed on one another, amplifying some kinds of behavior and damping others. People in my generation and older often tut-tut about young people’s disclosing so much of their lives on social networks like Facebook, contrasting that behavior with our own relative virtue in that regard: “You exhibitionists! We didn’t behave like that when we were your age!” This comparison conveniently ignores the fact that we didn’t behave that way because no one offered us the opportunity (and from what I remember of my twenties, I think we would have happily behaved that way if we’d had the chance). The generational explanations of Napster’s success fall apart because of the fundamental attribution error. The recording industry made that error when it became convinced that young people were willing to share because their generation was morally inferior (a complaint with obvious conceptual appeal to the elders). This thesis never made sense. If young people had become generally lawless, we’d expect to see a rise not just in sharing music but also in shoplifting and other forms of theft.
Clay Shirky (Cognitive Surplus: Creativity and Generosity in a Connected Age: How Technology Makes Consumers into Collaborators)
The Importance of Becoming Metacognitively Sophisticated as a Learner Whatever the reasons for our not developing accurate mental models of ourselves as learners, the importance of becoming sophisticated as a learner cannot be overemphasized. Increasingly, coping with the changes that characterize today’s world—technological changes, job and career changes, and changes in how much of formal and informal education happens in the classroom versus at a computer terminal, coupled with the range of information and procedures that need to be acquired—requires that we learn how to learn. Also, because more and more of our learning will be what Whitten, Rabinowitz, and Whitten (2006) have labeled unsupervised learning, we need, in effect, to know how to manage our own learning activities. To become effective in managing one’s own learning requires not only some understanding of the complex and unintuitive processes that underlie one’s encoding, retention, and retrieval of information and skills, but also, in my opinion, avoiding certain attribution errors. In social psychology, the fundamental attribution error (Ross, 1977) refers to the tendency, in explaining the behaviors of others, to overvalue the role of personality characteristics and undervalue the role of situational factors. That is, behaviors tend to be overattributed to a behaving individual’s or group’s characteristics and underattributed to situational constraints and influences. In the case of human metacognitive processes, there is both a parallel error and an error that I see as essentially the opposite. The parallel error is to overattribute the degree to which students and others learn or remember to innate ability. Differences in ability between individuals are overappreciated, whereas differences in effort, encoding activities, and whether the prior learning that is a foundation for the new learning in question has been acquired are underappreciated.
Aaron S. Benjamin (Successful Remembering and Successful Forgetting: A Festschrift in Honor of Robert A. Bjork)
Early biologists were the social scientists of their times, because their racial descriptions of the human species contain explicit behavioral correlations. Racial attributes were cited to explain social conditions, which then became a natural state of affairs. In the process of their construction, races are deemed part of nature; they are alleged to have been “discovered,” not constructed by an emphasis on particular anatomical attributes. This assumption of the naturalness of race is connected with the pursuit of an explanation of a particular social condition—inequality. Races, as unequal biological entities, must be said to have their peculiar cultures, psychologies, and unequal economic circumstances.
Yehudi O. Webster
A vast body of social psychological research reveals that, as people go about their daily lives, they tend to interpret the situations they encounter and the events they experience in a decidedly self-centered, self-aggrandizing, and self-justifying way [...] the majority of men and women possess unrealistically positive self-views — they judge positive traits as overwhelmingly more characteristic of themselves than negative traits; dismiss any unfavorable attributes they may have as inconsequential while at the same time emphasizing the uniqueness and importance of their favorable attributes; recall personal successes more readily than failures; take credit for positive outcomes while steadfastly denying responsibility for negative ones; and generally view themselves as “better” than the average person [...] In addition, people often fall prey to an illusion of control consisting of exaggerated perceptions of their own ability to master and control events and situations that are solely or primarily determined by chance [...] Moreover, most individuals are unrealistically optimistic about the future, firmly believing that positive life events are more likely (and negative events are less likely) to happen to them than to others […] These cognitive processes, collectively known as self-serving biases or self-enhancement biases, not only function to protect and enhance people’s self-esteem [...] but also color perceptions of the events that occur in their closest and most intimate relationships. [...] married individuals routinely overestimate the extent of their own contributions, relative to their spouses, to a variety of joint marital activities [...] People not only perceive their own attributes, behaviors, and future outcomes in an overly positive manner, but they also tend to idealize the characteristics of their intimate partners and relationships.
Pamela Regan (Close Relationships)
If the left prefrontal lobe is damaged, the patient may withdraw from the social world and show a marked reluctance to do anything at all. This is euphemistically called pseudodepression—“pseudo” because none of the standard criteria for identifying depression, such as feelings of bleakness and chronic negative thought patterns, are revealed by psychological or neurological probing. Conversely, if the right prefrontal lobe is damaged, a patient will seem euphoric even though, once again, he really won’t be. Cases of prefrontal damage are especially distressing to relatives. Such a patient seems to lose all interest in his own future and he shows no moral compunctions of any kind. He may laugh at a funeral or urinate in public. The great paradox is that he seems normal in most respects: his language, his memory, and even his IQ are unaffected. Yet he has lost many of the most quintessential attributes that define human nature: ambition, empathy, foresight, a complex personality, a sense of morality, and a sense of dignity as a human being. For these reasons the prefrontal cortex has long been regarded as the “seat of humanity.” As for the question of how such a relatively small patch of the brain manages to orchestrate such a sophisticated and elusive suite of functions, we are still very much at a loss.
V.S. Ramachandran (The Tell-Tale Brain: A Neuroscientist's Quest for What Makes Us Human)
Tolerance or intolerance can be attributed as the single most important factor in understanding the psychology of any society, vis-à-vis its attitude towards peace, progress and growth.
Ashok Anand (One Vs All: Beware Mr. Prime Minister, It's India Impossible)
The mistake is to think that Firestone’s history of acute psychological distress somehow explains the Dialectic, allowing us to see that the meaning of its radicalism, its stridently nonconformist worldview, was always incipient mental illness. The Dialectic thus becomes read as a symptom of Firestone’s “madness.” Which means, of course, not reading it. Not engaging with its ideas; but instead, dismissing it from the scene of serious political and theoretical engagement. But this is to get things the wrong way round. We must not use “mental illness” to depoliticize radical theory; but use radical theory to politicize “mental illness.” The urgent task is to identify and analyze the social and economic structures that work to produce a widespread psychological distress, to which are attributed diagnostic labels.
Victoria Margree (Neglected or Misunderstood: The Radical Feminism of Shulamith Firestone)
If we fail to hold ourselves to that standard of responsibility, then other people regard us as lacking in ethics and integrity. And it does not end there. Just as we hold people (including ourselves) accountable for the wrongs they have done, or the good they have failed to do, we also believe (or at least act out the proposition) that someone who has made a good decision freely, deserves whatever benefit might come of that decision. It is for that reason that we believe each person should justly receive the fruit of their honest and voluntary labor. There seems something natural and inevitable about such judgments; something at work within them that is universal and inescapable, psychologically and socially. What all this means is that everyone—child, adult, self, others—will rebel against being treated as a cog in a wheel, incapable of choice and devoid of freedom, and (similarly) that it is practically impossible to establish a positive relationship with any other (or even our private selves) without that attribution of personal agency, free will, and responsibility.
Jordan B. Peterson (Beyond Order: 12 More Rules For Life)
There is a perhaps understandable reluctance to come to grips scientifically with the problem of race differences in intelligence—to come to grips with it, that is to say, in the same way the scientists would approach the investigation of any other phenomenon. This reluctance is manifested in a variety of ‘symptoms’ found in most writings and discussions of the psychology of race differences. These symptoms include a tendency to remain on the remotest fringes of the subject, to sidestep central questions, and to blur the issues and tolerate a degree of vagueness in definitions, concepts and inferences that would be unseemly in any other realm of scientific discourse. Many writers express an unwarranted degree of skepticism about reasonably well-established quantitative methods and measurements. They deny or belittle facts already generally accepted—accepted, that is, when brought to bear on inferences outside the realm of race differences—and they demand practically impossible criteria of certainty before even seriously proposing or investigating genetic hypotheses, as contrasted with extremely uncritical attitudes towards purely environmental hypotheses. There is often a failure to distinguish clearly between scientifically answerable aspects of the question and the moral, political and social policy issues; there is a tendency to beat dead horses and set up straw men on what is represented, or misrepresented I should say, as the genetic side of the argument. We see appeals to the notion that the topic is either too unimportant to be worthy of scientific curiosity, or is too complex, or too difficult, or that it will be forever impossible for any kind of research to be feasible, or that answers to key questions are fundamentally ‘unknowable’ in any scientifically accepted sense. Finally, we often see complete denial of intelligence and race as realities, or as quantifiable attributes, or as variables capable of being related to one another. In short, there is an altogether ostrich-like dismissal of the subject.
Arthur R. Jensen (Genetics and education)
Traditional structures of social and economic support slowly weakened; no longer was it possible for a man to follow his father and grandfather into a manufacturing job, or to join the union and start on the union ladder of wages. Marriage was no longer the only socially acceptable way to form intimate partnerships, or to rear children. People moved away from the security of legacy religions or the churches of their parents and grandparents, toward churches that emphasized seeking an identity, or replaced membership with the search for connection or economic success (Wuthnow, 1988). These changes left people with less structure when they came to choose their careers, their religion, and the nature of their family lives. When such choices succeed, they are liberating; when they fail, the individual can only hold himself or herself responsible. In the worst cases of failure, this is a Durkheim-like recipe for suicide. We can see this as a failure to meet early expectations or, more fundamentally, as a loss of the structures that give life a meaning. Durkheim, in his book On Suicide, wrote: It is sometimes said that, by virtue of his psychological make-up, man cannot live unless he attaches himself to an object that is greater than himself and outlives him, and this necessity has been attributed to a supposedly common need not to perish entirely. Life, they say, is only tolerable if one can see some purpose in it, if it has a goal and one that is worth pursuing. But the individual in himself is not sufficient as an end for himself. He is too small a thing. Not only is he confined in space, he is also narrowly limited in time.
Chris Hedges (America: The Farewell Tour)
Increasingly, what people with AIDS share are not personal or psychological attributes. They do not share culture or language or a certain racial identity. They do not share sexual preference or an absolute income bracket. What they share, rather, is a social position—the bottom rung of the ladder in inegalitarian societies.
Paul Farmer (Infections and Inequalities: The Modern Plagues)
That there are nations, states, and churches, that there is social cooperation under the division of labor, becomes discernible only in the actions of certain individuals. Nobody ever perceived a nation without perceiving its members. In this sense one may say that a social collective comes into being through the actions of individuals. That does not mean that the individual is temporally antecedent. It merely means that definite actions of individuals constitute the collective. There is no need to argue whether a collective is the sum resulting from the addition of its elements or more, whether it is a being sui generis, and whether it is reasonable or not to speak of its will, plans, aims, and actions and to attribute to it a distinct "soul." Such pedantic talk is idle. A collective whole is a particular aspect of the actions of various individuals and as such a real thing determining the course of events. It is illusory to believe that it is possible to visualize collective wholes. They are never visible; their cognition is always the outcome of the understanding of the meaning which acting men attribute to their acts. We can see a crowd, i.e., a multitude of people. Whether this crowd is a mere gathering or a mass (in the sense in which this term is used in contemporary psychology) or an organized body or any other kind of social entity is a question which can only be answered by understanding the meaning which they themselves attach to their presence. And this meaning is always the meaning of individuals. Not our senses, but understanding, a mental process, makes us recognize social entities. Those who want to start the study of human action from the collective units encounter an insurmountable obstacle in the fact that an individual at the same time can belong and--with the exception of the most primitive tribesmen--really belongs to various collective entities. The problems raised by the multiplicity of coexisting social units and their mutual antagonisms can be solved only by methodological individualism.
Ludwig von Mises (Human Action: A Treatise on Economics)
While a fair amount of empirical evidence has accrued that people use these three types of information to draw conclusions about the causes of behaviour, it soon became apparent that formal models of attribution, like the co-variation model, were inherently limited.
Richard J. Crisp (Social Psychology: A Very Short Introduction (Very Short Introductions Book 439))
The contemptuous person is likely to experience feelings of low self-esteem, inadequacy, and shame. In a March 2019 New York Times opinion piece entitled Our Culture of Contempt, Arthur C. Brooks writes: “political scientists have found that our nation is more polarized than it has been at any time since the Civil War. One in six Americans has stopped talking to a family member or close friend because of the 2016 election. Millions of people organized their social lives and their news exposure along ideological lines to avoid people with opposing viewpoints.” What's our problem? A 2014 article in The Proceedings of the National Academy of Sciences on motive attribution asymmetry, the assumption that your ideology is based in love while your opponent’s is based in hate, suggests an answer. The researchers found that the average Republican and the average Democrat today suffer from a level of motive attribution asymmetry that is comparable with that of Palestinians and Israelis. Each side thinks it's driven by benevolence while the other side is evil and motivated by hatred, and is therefore an enemy with whom one cannot negotiate or compromise. People often say that our problem in America today is incivility or intolerance. This is incorrect. Motive attribution asymmetry leads to something far worse – contempt, which is a noxious brew of anger and disgust, and not just contempt for other people's ideas but also for other people. In the words of the philosopher Arthur Schopenhauer, contempt is “the unsullied conviction of the worthlessness of another.” Brooks goes on to say contempt makes political compromise and progress impossible. It also makes us unhappy as people. According to the American Psychological Association, “the feelings of rejection so often experienced after being treated with contempt increase anxiety, depression, and sadness. It also damages the contemptuous person by stimulating two stress hormones -- cortisol and adrenaline -- in ways both public and personal. Contempt causes us deep harm.”
Brené Brown (Atlas of the Heart: Mapping Meaningful Connection and the Language of Human Experience)
Here are ten facts about IQ. These facts are debated and often controversial among the general public but far less so among scientists who study intelligence. The best review of the academic literature supporting these facts is a 2012 paper by Richard Nisbett and colleagues – an interdisciplinary team of leading scholars, household names within intelligence research, comprised of psychologists, an economist, a behavioral geneticist, and a former President of the American Psychological Association. Their areas of expertise include cultural and sex differences in intelligence, the effect of social and genetic factors that affect intelligence, the development of intelligence over the lifespan, the relationship between economic development and intelligence, and changes in intelligence over history.
1. IQ is a good predictor of school and work performance, at least in WEIRD societies.
2. IQ differs in predictive power and is the least predictive of performance on tasks that demand low cognitive skill.
3. IQ may be separable into what can be called ‘crystallized intelligence’ and ‘fluid intelligence’. Crystallized intelligence refers to knowledge that is drawn on to solve problems. Fluid intelligence refers to an ability to solve novel problems and to learn.
4. Educational interventions can improve aspects of IQ, including fluid intelligence, which is affected by interventions such as memory training. Many of these results don’t seem to last long, although there is strong evidence that education as a whole causally raises IQ over a lifetime.
5. IQ test scores have been dramatically increasing over time. This is called the Flynn effect after James Flynn (also an author of the review mentioned above), who first noticed this pattern. The Flynn effect is largest for nations that have recently modernized. Large gains have been measured on the Raven’s test, a test that has been argued to be the most ‘culture-free’ and a good measure of fluid intelligence. That is, it’s not just driven by people learning more words or getting better at adding and subtracting.
6. IQ differences have neural correlates – i.e. you can measure these differences in the brain.
7. IQ is heritable, though the exact heritability differs by population, typically ranging from around 30% to 80%.
8. Heritability is lower for poorer people in the US, but not in Australia and Europe where it is roughly the same across levels of wealth.
9. Males and females differ in IQ performance in terms of variance and in the means of different subscales.
10. Populations and ethnicities differ on IQ performance.
You can imagine why some people might question these statements. But setting aside political considerations, how do we scientifically make sense of this? Popular books from Richard Herrnstein and Charles Murray’s The Bell Curve (1994) to Robert Plomin’s Blueprint (2018) have attributed much of this to genes. People and perhaps groups differ in genes, making some brighter than others. But humans are a species with two lines of inheritance. They have not just genetic hardware but also cultural software. And it is primarily by culture rather than genes that we became the most dominant species on earth. For a species so dependent on accumulated knowledge, not only is the idea of a culture-free intelligence test meaningless, so too is the idea of culture-free intelligence.
Michael Muthukrishna
But the historical point is that personhood was not always reduced to those psychological features and that therefore, as long as personhood was not thus redescribed, it could not be conceived of in terms of brainhood. Anthropologists who study conceptions and practices related to the beginnings and ends of life make a similar point when they notice that “producing persons is an inherently social project” and that “personhood is not an innate or natural quality but a cultural attribute” (Kaufman and Morgan 2005, 320–321).
Fernando Vidal (Being Brains: Making the Cerebral Subject)
One important reason that philosophers should take Nietzsche seriously is because he seems to have gotten, at least in broad contours, many points about human moral psychology right. Consider: (1) Nietzsche holds that heritable type-facts are central determinants of personality and morally significant behaviors, a claim well supported by extensive empirical findings in behavioral genetics. (2) Nietzsche claims that consciousness is a “surface” and that “the greatest part of conscious thought must still be attributed to [non-conscious] instinctive activity,” theses overwhelmingly vindicated by recent work by psychologists on the role of the unconscious (e.g., Wilson 2002) and by philosophers who have produced synthetic meta-analyses of work on consciousness in psychology and neuroscience (e.g., Rosenthal 2008). (3) Nietzsche claims that moral judgments are post-hoc rationalizations of feelings that have an antecedent source, and thus are not the outcome of rational reflection or discursiveness, a conclusion in sync with the findings of the ascendant “social intuitionism” in the empirical moral psychology of Jonathan Haidt (2001) and others. (4) Nietzsche argues that free will is an “illusion,” that our conscious experience of willing is itself the causal product of non-conscious forces, a view recently defended by the psychologist Daniel Wegner (2002), who, in turn, synthesizes a large body of empirical literature, including the famous neurophysical data about “willing” collected by Benjamin Libet.
Brian Leiter (Nietzsche and Morality)
When we deal with a person who has authority over us, it can often seem that the person is smarter and more competent than we are. Instead of attributing this power to the person's position in society, we make the fundamental attribution error. We assume the person is more competent than we are because the person has power over us. We ignore the fact that people have authority because of their social position, not necessarily because of any special expertise.
Michael Lovaglia (Knowing People: The Personal Use of Social Psychology)
But, actually, the idea of a personal god or spirit who peevishly withholds food, or maliciously hurls lightning, gets a boost from the evolved human brain. People reared in modern scientific societies may consider it only natural to ponder some feature of the world—the weather, say—and try to come up with a mechanistic explanation couched in the abstract language of natural law. But evolutionary psychology suggests that a much more natural way to explain anything is to attribute it to a humanlike agent. This is the way we’re “designed” by natural selection to explain things. Our brain’s capacity to think about causality—to ask why something happened and come up with theories that help us predict what will happen in the future—evolved in a specific context: other brains. When our distant ancestors first asked “Why,” they weren’t asking about the behavior of water or weather or illness; they were asking about the behavior of their peers. That’s a somewhat speculative (and, yes, hard-to-test!) claim. We have no way of observing our prehuman ancestors one or two or three million years ago, when the capacity to think explicitly about causality was evolving by natural selection. But there are ways to shed light on the process. For starters, we can observe our nearest nonhuman relatives, chimpanzees. We didn’t evolve from chimps, but chimps and humans do share a common ancestor in the not-too-distant past (4 to 7 million years ago). And chimps are probably a lot more like that common ancestor than humans are. Chimps aren’t examples of our ancestors circa 5 million BCE but they’re close enough to be illuminating. As the primatologist Frans de Waal has shown, chimpanzee society shows some clear parallels with human society. One of them is in the title of his book Chimpanzee Politics. Groups of chimps form coalitions—alliances—and the most powerful alliance gets preferred access to resources (notably a resource that in Darwinian terms is important: sex partners). Natural selection has equipped chimps with emotional and cognitive tools for playing this political game. One such tool is anticipation of a given chimp’s future behavior based on past behavior. De Waal writes of a reigning alpha male, Yeroen, who faced growing hostility from a former ally named Luit: “He already sensed that Luit’s attitude was changing and he knew that his position was threatened.” 8 One could argue about whether Yeroen was actually pondering the situation in as clear and conscious a way as de Waal suggests. But even if chimps aren’t quite up to explicit inference, they do seem close. If you imagine their politics getting more complex (more like, say, human politics), and them getting smarter (more like humans), you’re imagining an organism evolving toward conscious thought about causality. And the causal agents about which these organisms will think are other such organisms, because the arena of causality is the social arena. In this realm, when a bad thing happens (like a challenge for Yeroen’s alpha spot) or a good thing happens (like an ally coming to Yeroen’s aid), it is another organism that is making the bad or good thing happen.
Robert Wright (The Evolution of God)
Humans pursuing deep, complete connections respond to quite different incentives from those that influence self-interested utility maximizers. Rewards, monitoring, and punishments are less likely to be effective than engagement, communication, norms, socialization, identity, and common purpose. They share not out of a calculation of reciprocity but from a psychological pleasure in sharing. Those seeking connections make decisions from their hearts as well as their heads, influenced by emotion, fairness, empathy, and intuition. Their behavior, thoughts, feelings, and even personal attributes are highly socially contingent. The range of humanity includes individuals who display every possible combination of selfishness and sociability.
Anne-Marie Slaughter (The Chessboard and the Web: Strategies of Connection in a Networked World)
46. We attribute the social and psychological problems of modern society to the fact that that society requires people to live under conditions radically different from those under which the human race evolved and to behave in ways that conflict with the patterns of behavior that the human race developed while living under the earlier conditions.
Theodore J. Kaczynski (The Unabomber Manifesto: A Brilliant Madman's Essay on Technology, Society, and the Future of Humanity)
Not only does our individual and societal sanity depend on connection; so does our physical health. Because we are biopsychosocial creatures, the rising loneliness epidemic in Western culture is much more than just a psychological phenomenon: it is a public health crisis. A preeminent scholar of loneliness, the late neuroscientist John Cacioppo and his colleague and spouse, Stephania Cacioppo, published a letter in the Lancet only a month before his death in 2018. "Imagine," they wrote, "a condition that makes a person irritable, depressed, and self-centered, and is associated with a 26% increase in the risk of premature mortality. Imagine too that in industrialized countries around a third of people are affected by this condition, with one person in 12 affected severely, and that these proportions are increasing. Income, education, sex, and ethnicity are not protective, and the condition is contagious. The effects of the condition are not attributable to some peculiarity of the character of a subset of individuals, they are a result of the condition affecting ordinary people. Such a condition exists — loneliness." We now know without doubt that chronic loneliness is associated with an elevated risk of illness and early death. It has been shown to increase mortality from cancer and other diseases and has been compared to the harm of smoking fifteen cigarettes a day. According to research presented at the American Psychological Association's annual convention in 2015, the loneliness epidemic is a public health risk at least as great as the burgeoning rates of obesity. Loneliness, the researcher Steven Cole told me, can impair genetic functioning. And no wonder: even in parrots isolation impairs DNA repair by shortening chromosome-protecting telomeres. Social isolation inhibits the immune system, promotes inflammation, agitates the stress apparatus, and increases the risk of death from heart disease and strokes. Here I am referring to social isolation in the pre COVID-19 sense, though the pandemic has grievously exacerbated the problem, at great cost to the well-being of many.
Gabor Maté (The Myth of Normal: Trauma, Illness, and Healing in a Toxic Culture)
Nietzsche’s On the Genealogy of Morals appeared in 1887. In it, he begins with an argument that might well have been taken directly from Adam Smith—but he takes it a step further than Smith ever dared to, insisting that not just barter, but buying and selling itself, precede any other form of human relationship. The feeling of personal obligation, he observes, has its origin in the oldest and most primitive personal relationship there is, in the relationship between seller and buyer, creditor and debtor. Here for the first time one person moved up against another person, here an individual measured himself against another individual. We have found no civilization still at such a low level that something of this relationship is not already perceptible. To set prices, to measure values, to think up equivalencies, to exchange things—that preoccupied man’s very first thinking to such a degree that in a certain sense it’s what thinking itself is. Here the oldest form of astuteness was bred; here, too, we can assume are the first beginnings of man’s pride, his feeling of pre-eminence in relation to other animals. Perhaps our word “man” (manas) continues to express directly something of this feeling of the self: the human being describes himself as a being which assesses values, which values and measures, as the “inherently calculating animal.” Selling and buying, together with their psychological attributes, are even older than the beginnings of any form of social organizations and groupings; out of the most rudimentary form of personal legal rights the budding feeling of exchange, contract, guilt, law, duty, and compensation was instead first transferred to the crudest and earliest social structures (in their relationships with similar social structures), along with the habit of comparing power with power, of measuring, of calculating
David Graeber (Debt: The First 5,000 Years)