Unconscious Famous Quotes

We've searched our database for all the quotes and captions related to Unconscious Famous. Here they are! All 58 of them:

Most sane human beings who have managed to attain and retain fame each uses it to dramatically increase their name’s chances of being remembered until Jesus comes back, since their heart cannot do what they consciously or unconsciously lust for, that is to say, for it to beat until Jesus returns.
Mokokoma Mokhonoana (The Use and Misuse of Children)
If this constant sliding and hiding of meaning were true of conscious life, then we would of course never be able to speak coherently at all. If the whole of language were present to me when I spoke, then I would not be able to articulate anything at all. The ego, or consciousness, can therefore only work by repressing this turbulent activity, provisionally nailing down words on to meanings. Every now and then a word from the unconscious which I do not want insinuates itself into my discourse, and this is the famous Freudian slip of the tongue or parapraxis. But for Lacan all our discourse is in a sense a slip of the tongue: if the process of language is as slippery and ambiguous as he suggests, we can never mean precisely what we say and never say precisely what we mean. Meaning is always in some sense an approximation, a near-miss, a part-failure, mixing non-sense and non-communication into sense and dialogue.
Terry Eagleton (Literary Theory: An Introduction)
If you get in tune with the tendency field, through expanding and exploring the set of positive characteristics within yourself, the tendency field responds with constant energy, and affirmations. If you work against the tendency field, by being negative, unfair, unloving, and unconscious of the truth, you weaken your connection to the tendency field, and you experience existential dread, no matter how rich or famous or powerful you are.
Gregory David Roberts (The Mountain Shadow)
But now it seems clear that literary criticism was inherently doomed. Explicitly or otherwise it had based itself on a structure of echelons and hierarchies; it was about the talent elite. And the structure atomized as soon as the forces of democratization gave their next concerted push. Those forces – incomparably the most potent in our culture – have gone on pushing. And they are now running up against a natural barrier. Some citadels, true, have proved stormable. You can become rich without having any talent (via the scratchcard and the rollover jackpot). You can become famous without having any talent (by abasing yourself on some TV nerdathon; a clear improvement on the older method of simply killing a celebrity and inheriting the aura). But you cannot become talented without having any talent. Therefore, talent must go. Literary criticism, now almost entirely confined to the universities, thus moves against talent by moving against the canon. Academic preferment will not come from a respectful study of Wordsworth’s poetics; it will come from a challenging study of his politics – his attitude toward the poor, say, or his unconscious ‘valorization’ of Napoleon; and it will come still faster if you ignore Wordsworth and elevate some (justly) neglected contemporary, by which process the canon may be quietly and steadily sapped. A brief consultation of the Internet will show that meanwhile, everyone has become a literary critic – or at least, a book-reviewer.
Martin Amis (The War against Cliché: Essays and Reviews 1971-2000)
While Sigmund famously focused on the unconscious (the id), Anna made the ego seem more important, particularly in respect of therapy and psychoanalysis. Her
Tom Butler-Bowdon (50 Psychology Classics: Who We Are, How We Think, What We Do: Insight and Inspiration from 50 Key Books (50 Classics))
often has no suspicion of the causal connection between the precipitating event and the pathological phenomenon.
Sigmund Freud (Freud's Most Famous & Influential Books, Vol 1: The Interpretations of Dreams/On Dreams/On Psychotherapy/Jokes & Their Relation to the Unconscious)
Everybody has got to live for something, but Jesus is arguing that, if he is not that thing, it will fail you. First, it will enslave you. Whatever that thing is, you will tell yourself that you have to have it or there is no tomorrow. That means that if anything threatens it, you will become inordinately scared; if anyone blocks it, you will become inordinately angry; and if you fail to achieve it, you will never be able to forgive yourself. But second, if you do achieve it, it will fail to deliver the fulfillment you expected. Let me give you an eloquent contemporary expression of what Jesus is saying. Nobody put this better than the American writer David Foster Wallace. He got to the top of his profession. He was an award-winning, bestselling postmodern novelist known around the world for his boundary-pushing storytelling. He once wrote a sentence that was more than a thousand words long. A few years before the end of his life, he gave a now-famous commencement speech at Kenyon College. He said to the graduating class, Everybody worships. The only choice we get is what to worship. And the compelling reason for maybe choosing some sort of god . . . to worship . . . is that pretty much anything else you worship will eat you alive. If you worship money and things, if they are where you tap real meaning in life, then you will never have enough, never feel you have enough. It’s the truth. Worship your own body and beauty and sexual allure, and you will always feel ugly. And when time and age start showing, you will die a million deaths before [your loved ones] finally plant you. . . . Worship power, and you will end up feeling weak and afraid, and you will need ever more power over others to numb you to your own fear. Worship your intellect, being seen as smart, you will end up feeling stupid, a fraud, always on the verge of being found out. Look, the insidious thing about these forms of worship is not that they are evil or sinful; it is that they’re unconscious. They are default settings.4 Wallace was by no means a religious person, but he understood that everyone worships, everyone trusts in something for their salvation, everyone bases their lives on something that requires faith. A couple of years after giving that speech, Wallace killed himself. And this nonreligious man’s parting words to us are pretty terrifying: “Something will eat you alive.” Because even though you might never call it worship, you can be absolutely sure you are worshipping and you are seeking. And Jesus says, “Unless you’re worshipping me, unless I’m the center of your life, unless you’re trying to get your spiritual thirst quenched through me and not through these other things, unless you see that the solution must come inside rather than just pass by outside, then whatever you worship will abandon you in the end.
Timothy J. Keller (Encounters with Jesus: Unexpected Answers to Life's Biggest Questions)
With our media and celebrity-heavy culture, it’s very, very common to see people unconsciously adopt a frame of reference that if they’re not famous, they’re not successful. If they’re not wealthy, they’re not successful.
Knowledge@Wharton (Conversations on Success: 6 Thought Leaders Redefine What It Means to Succeed (Knowledge@Wharton Conversations))
Nor have I any reason for wishing to eliminate this evidence of my initial views. Even to-day I regard them not as errors but as valuable first approximations to knowledge which could only be fully acquired after long and continuous efforts.
Sigmund Freud (Freud's Most Famous & Influential Books, Vol 1: The Interpretations of Dreams/On Dreams/On Psychotherapy/Jokes & Their Relation to the Unconscious)
The Peacemaker Colt has now been in production, without change in design, for a century. Buy one to-day and it would be indistinguishable from the one Wyatt Earp wore when he was the Marshal of Dodge City. It is the oldest hand-gun in the world, without question the most famous and, if efficiency in its designated task of maiming and killing be taken as criterion of its worth, then it is also probably the best hand-gun ever made. It is no light thing, it is true, to be wounded by some of the Peacemaker’s more highly esteemed competitors, such as the Luger or Mauser: but the high-velocity, narrow-calibre, steel-cased shell from either of those just goes straight through you, leaving a small neat hole in its wake and spending the bulk of its energy on the distant landscape whereas the large and unjacketed soft-nosed lead bullet from the Colt mushrooms on impact, tearing and smashing bone and muscle and tissue as it goes and expending all its energy on you. In short when a Peacemaker’s bullet hits you in, say, the leg, you don’t curse, step into shelter, roll and light a cigarette one-handed then smartly shoot your assailant between the eyes. When a Peacemaker bullet hits your leg you fall to the ground unconscious, and if it hits the thigh-bone and you are lucky enough to survive the torn arteries and shock, then you will never walk again without crutches because a totally disintegrated femur leaves the surgeon with no option but to cut your leg off. And so I stood absolutely motionless, not breathing, for the Peacemaker Colt that had prompted this unpleasant train of thought was pointed directly at my right thigh. Another thing about the Peacemaker: because of the very heavy and varying trigger pressure required to operate the semi-automatic mechanism, it can be wildly inaccurate unless held in a strong and steady hand. There was no such hope here. The hand that held the Colt, the hand that lay so lightly yet purposefully on the radio-operator’s table, was the steadiest hand I’ve ever seen. It was literally motionless. I could see the hand very clearly. The light in the radio cabin was very dim, the rheostat of the angled table lamp had been turned down until only a faint pool of yellow fell on the scratched metal of the table, cutting the arm off at the cuff, but the hand was very clear. Rock-steady, the gun could have lain no quieter in the marbled hand of a statue. Beyond the pool of light I could half sense, half see the dark outline of a figure leaning back against the bulkhead, head slightly tilted to one side, the white gleam of unwinking eyes under the peak of a hat. My eyes went back to the hand. The angle of the Colt hadn’t varied by a fraction of a degree. Unconsciously, almost, I braced my right leg to meet the impending shock. Defensively, this was a very good move, about as useful as holding up a sheet of newspaper in front of me. I wished to God that Colonel Sam Colt had gone in for inventing something else, something useful, like safety-pins.
Alistair MacLean (When Eight Bells Toll)
What one should add here is that self-consciousness is itself unconscious: we are not aware of the point of our self-consciousness. If ever there was a critic of the fetishizing effect of fascinating and dazzling "leitmotifs", it is Adorno: in his devastating analysis of Wagner, he tries to demonstrate how Wagnerian leitmotifs serve as fetishized elements of easy recognition and thus constitute a kind of inner-structural commodification of his music. It is then a supreme irony that traces of this same fetishizing procedure can be found in Adorno's own writings. Many of his provocative one-liners do effectively capture a profound insight or at least touch on a crucial point (for example: "Nothing is more true in psychoanalysis than its exaggeration"); however, more often than his partisans are ready to admit, Adorno gets caught up in his own game, infatuated with his own ability to produce dazzlingly "effective" paradoxical aphorisms at the expense of theoretical substance (recall the famous line from Dialectic of Enlightenment on how Hollywood's ideological manipulation of social reality realized Kant's idea of the transcendental constitution of reality). In such cases where the dazzling "effect" of the unexpected short-circuit (here between Hollywood cinema and Kantian ontology) effectively overshadows the theoretical line of argumentation, the brilliant paradox works precisely in the same manner as the Wagnerian leitmotif: instead of serving as a nodal point in the complex network of structural mediation, it generates idiotic pleasure by focusing attention on itself. This unintended self-reflexivity is something of which Adorno undoubtedly was not aware: his critique of the Wagnerian leitmotif was an allegorical critique of his own writing. Is this not an exemplary case of his unconscious reflexivity of thinking? When criticizing his opponent Wagner, Adorno effectively deploys a critical allegory of his own writing - in Hegelese, the truth of his relation to the Other is a self-relation.
Slavoj Žižek (Living in the End Times)
What is famously called "the midlife crisis" is precisely such an erosion of programs and projections. We expect that by investing sincere energy in a career, a relationship, a set of roles, that they will return the investment in manifold, satisfying ways. We feverishly renew the projections, up the ante, and anxiously repress the insurgence of doubt once more. We do not realize that a projection has occurred, for it is an unconscious mechanism of our energic unconscious. Only after it has painfully dissolved may we begin to recognize that we placed such a large agenda on such a frangible place, that we asked too much of the beloved, of others, of institutions, and perhaps of life itself.
James Hollis
The style of a soul
“What’s the matter with both of you, Ellsworth? Why such talk—over nothing at all? People’s faces and first impressions don’t mean a thing.” “That, my dear Kiki,” he answered, his voice soft and distant, as if he were giving an answer, not to her, but to a thought of his own, “is one of our greatest common fallacies. There’s nothing as significant as a human face. Nor as eloquent. We can never really know another person, except by our first glance at him. Because, in that glance, we know everything. Even though we’re not always wise enough to unravel the knowledge. Have you ever thought about the style of a soul, Kiki?” “The … what?” “The style of a soul. Do you remember the famous philosopher who spoke of the style of a civilization? He called it ‘style.’ He said it was the nearest word he could find for it. He said that every civilization has its one basic principle, one single, supreme, determining conception, and every endeavor of men within that civilization is true, unconsciously and irrevocably, to that one principle. … I think, Kiki, that every human soul has a style of its own, also. Its one basic theme. You’ll see it reflected in every thought, every act, every wish of that person. The one absolute, the one imperative in that living creature. Years of studying a man won’t show it to you. His face will. You’d have to write volumes to describe a person. Think of his face. You need nothing else.” “That sounds fantastic, Ellsworth. And unfair, if true. It would leave people naked before you.” “It’s worse than that. It also leaves you naked before them. You betray yourself by the manner in which you react to a certain face. To a certain kind of face. … The style of your soul … There’s nothing important on earth, except human beings. There’s nothing as important about human beings as their relations to one another. …”
—Ayn Rand, The Fountainhead
Ayn Rand
A weariness of the desert was the living always in company, each of the party hearing all that was said and seeing all that was done by the others day and night. Yet the craving for solitude seemed part of the delusion of self-sufficiency, a factitious making-rare of the person to enhance its strangeness in its own estimation. To have privacy, as Newcombe and I had, was ten thousand times more restful than the open life, but the work suffered by the creation of such a bar between the leaders and men. Among the Arabs there were no distinctions, traditional or natural, except the unconscious power given a famous sheikh by virtue of his accomplishment; and they taught me that no man could be their leader except he ate the ranks’ food, wore their clothes, lived level with them, and yet appeared better in himself.
T.E. Lawrence (Seven Pillars of Wisdom: A Triumph)
Valentine’s concept of introversion includes traits that contemporary psychology would classify as openness to experience (“thinker, dreamer”), conscientiousness (“idealist”), and neuroticism (“shy individual”). A long line of poets, scientists, and philosophers have also tended to group these traits together. All the way back in Genesis, the earliest book of the Bible, we had cerebral Jacob (a “quiet man dwelling in tents” who later becomes “Israel,” meaning one who wrestles inwardly with God) squaring off in sibling rivalry with his brother, the swashbuckling Esau (a “skillful hunter” and “man of the field”). In classical antiquity, the physicians Hippocrates and Galen famously proposed that our temperaments—and destinies—were a function of our bodily fluids, with extra blood and “yellow bile” making us sanguine or choleric (stable or neurotic extroversion), and an excess of phlegm and “black bile” making us calm or melancholic (stable or neurotic introversion). Aristotle noted that the melancholic temperament was associated with eminence in philosophy, poetry, and the arts (today we might classify this as openness to experience). The seventeenth-century English poet John Milton wrote Il Penseroso (“The Thinker”) and L’Allegro (“The Merry One”), comparing “the happy person” who frolics in the countryside and revels in the city with “the thoughtful person” who walks meditatively through the nighttime woods and studies in a “lonely Towr.” (Again, today the description of Il Penseroso would apply not only to introversion but also to openness to experience and neuroticism.) The nineteenth-century German philosopher Schopenhauer contrasted “good-spirited” people (energetic, active, and easily bored) with his preferred type, “intelligent people” (sensitive, imaginative, and melancholic). “Mark this well, ye proud men of action!” declared his countryman Heinrich Heine. “Ye are, after all, nothing but unconscious instruments of the men of thought.” Because of this definitional complexity, I originally planned to invent my own terms for these constellations of traits. I decided against this, again for cultural reasons: the words introvert and extrovert have the advantage of being well known and highly evocative. Every time I uttered them at a dinner party or to a seatmate on an airplane, they elicited a torrent of confessions and reflections. For similar reasons, I’ve used the layperson’s spelling of extrovert rather than the extravert one finds throughout the research literature.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Those who are not psychologically sophisticated, do not realise the extent to which the average person is unconsciously motivated by jealousy and envy. People who are not happy, confident, and fulfilled will generally resent those who are happier, more confident, and more fulfilled than them. Admiration and envy seem to be received in equal proportion as one develops and succeeds. Many famous people are admired with a passion and also hated with a vengeance. Powerful political leaders are respected and also ruthlessly criticised. Famous movie stars are adored and also grossly invaded and scrutinised. Nevertheless, we learn to think not 'what the world is doing to us' but 'what we are doing for the world.' Our attention is not on how the world is hurting us, but on how our presence is helping to heal the world. This outward and upward focus is our protection and our guide.
Donna Goddard (The Love of Being Loving (Love and Devotion, #1))
The fading of a memory or the losing of its affect depends on various factors. The most important of these is whether there has been an energetic reaction to the event that provokes the affect. By ‘reaction’ we here understand the whole class of voluntary and involuntary reflexes - from tears to acts of revenge - in which, as experience shows us, the affects are discharged. If this reaction takes place to a sufficient amount a large part of the affect disappears as a result.
Sigmund Freud (Freud's Most Famous & Influential Books, Vol 1: The Interpretations of Dreams/On Dreams/On Psychotherapy/Jokes & Their Relation to the Unconscious)
A quote has an even more powerful effect if we presume not just a particular author behind it, but God, nature, the unconscious, labor, or difference. These are strong fetishes, each conjuring the powerful submedial in a particular way. Yet all of them must nonetheless be exchanged in a certain rhythm according to the laws of the medial economy. In order to create such fetishes, one does not have to use brilliant quotes by famous authors but can use anonymous quotes that stem from the authorless realm of the everyday, lowly, foreign, vulgar, aggressive, or stupid. Precisely such quotes produce the effect of medial sincerity, that is, the revelation of a deeply submerged, hidden, medial plane on the familiar medial surface. It then appears as if this surface had been blasted open from the inside and that the respective quotes had sprung forth from the submedial interior—like aliens. All of this, of course, refers to the economy of the quote as a gift that can be offered, accepted, and reciprocated.
Boris Groys (Under Suspicion)
In retrospect one’s life is subject to invention and distortion. Documentation is often scanty, and what survives rarely predicts a famous life. Indeed, fame begins with the mundane: a messy birth, a crying baby, health and sickness, parents, family, schooling. Later, we consciously and unconsciously edit our lives: saving some papers, throwing much away; telling and retelling certain stories and suppressing much more. By the time a biographer relates a life, much of it has faded into well-tailored memory, an open arena for creative retelling or outright invention.
William E. Wallace (Michelangelo: The Artist, the Man and his Times)
The injured person’s reaction to the trauma only exercises a completely ‘cathartic’ effect if it is an adequate reaction - as, for instance, revenge. But language serves as a substitute for action; by its help, an affect can be ‘abreacted’ almost as effectively. In other cases speaking is itself the adequate reflex, when, for instance, it is a lamentation or giving utterance to a tormenting secret, e.g. a confession. If there is no such reaction, whether in deeds or words, or in the mildest cases in tears, any recollection of the event retains its affective tone to begin with.
Sigmund Freud (Freud's Most Famous & Influential Books, Vol 1: The Interpretations of Dreams/On Dreams/On Psychotherapy/Jokes & Their Relation to the Unconscious)
Our famous scientific reality does not afford us the slightest protection against the so-called irreality of the unconscious. Something works behind the veil of fantastic images, whether we give this something a good name or a bad. It is something real, and for this reason its manifestations must be taken seriously. But first the tendency to concretization must be overcome; in other words, we must not take the fantasies literally when we approach the question of interpreting them. While we are in the grip of the actual experience, the fantasies cannot be taken literally enough.
C.G. Jung (The Red Book: Liber Novus)
As you know, there was a famous quarrel between Max Planck and Einstein, in which Einstein claimed that, on paper, the human mind was capable of inventing mathematical models of reality. In this he generalized his own experience because that is what he did. Einstein conceived his theories more or less completely on paper, and experimental developments in physics proved that his models explained phenomena very well. So Einstein says that the fact that a model constructed by the human mind in an introverted situation fits with outer facts is just a miracle and must be taken as such. Planck does not agree, but thinks that we conceive a model which we check by experiment, after which we revise our model, so that there is a kind of dialectic friction between experiment and model by which we slowly arrive at an explanatory fact compounded of the two. Plato-Aristotle in a new form! But both have forgotten something- the unconscious. We know something more than those two men, namely that when Einstein makes a new model of reality he is helped by his unconscious, without which he would not have arrived at his theories...But what role DOES the unconscious play?...either the unconscious knows about other realities, or what we call the unconscious is a part of the same thing as outer reality, for we do not know how the unconscious is linked with matter.
Marie-Louise von Franz (Alchemy: An Introduction to the Symbolism and the Psychology)
[Tolstoy] denounced [many historians'] lamentable tendency to simplify. The experts stumble onto a battlefield, into a parliament or public square, and demand, "Where is he? Where is he?" "Where is who?" "The hero, of course! The leader, the creator, the great man!" And having found him, they promptly ignore all his peers and troops and advisors. They close their eyes and abstract their Napoleon from the mud and the smoke and the masses on either side, and marvel at how such a figure could possibly have prevailed in so many battles and commanded the destiny of an entire continent. "There was an eye to see in this man," wrote Thomas Carlyle about Napoleon in 1840, "a soul to dare and do. He rose naturally to be the King. All men saw that he was such." But Tolstoy saw differently. "Kings are the slaves of history," he declared. "The unconscious swarmlike life of mankind uses every moment of a king's life as an instrument for its purposes." Kings and commanders and presidents did not interest Tolstoy. History, his history, looks elsewhere: it is the study of infinitely incremental, imperceptible change from one state of being (peace) to another (war). The experts claimed that the decisions of exceptional men could explain all of history's great events. For the novelist, this belief was evidence of their failure to grasp the reality of an incremental change brought about by the multitude's infinitely small actions.
Daniel Tammet (Thinking In Numbers: On Life, Love, Meaning, and Math)
You don’t have to have studied the description-experience gap to understand, if you’re truly expert at something, that you need experience to balance out the descriptions. Otherwise, you’re left with the illusion of knowledge—knowledge without substance. You’re an armchair philosopher who thinks that just because she read an article about something she is a sudden expert. (David Dunning, a psychologist at the University of Michigan most famous for being one half of the Dunning-Kruger effect—the more incompetent you are, the less you’re aware of your incompetence—has found that people go quickly from being circumspect beginners, who are perfectly aware of their limitations, to “unconscious incompetents,” people who no longer realize how much they don’t know and instead fancy themselves quite proficient.)
Maria Konnikova (The Biggest Bluff: How I Learned to Pay Attention, Master Myself, and Win)
It has been noted in various quarters that the half-illiterate Italian violin maker Antonio Stradivari never recorded the exact plans or dimensions for how to make one of his famous instruments. This might have been a commercial decision (during the earliest years of the 1700s, Stradivari’s violins were in high demand and open to being copied by other luthiers). But it might also have been because, well, Stradivari didn’t know exactly how to record its dimensions, its weight, and its balance. I mean, he knew how to create a violin with his hands and his fingers but maybe not in figures he kept in his head. Today, those violins, named after the Latinized form of his name, Stradivarius, are considered priceless. It is believed there are only around five hundred of them still in existence, some of which have been submitted to the most intense scientific examination in an attempt to reproduce their extraordinary sound quality. But no one has been able to replicate Stradivari’s craftsmanship. They’ve worked out that he used spruce for the top, willow for the internal blocks and linings, and maple for the back, ribs, and neck. They’ve figured out that he also treated the wood with several types of minerals, including potassium borate, sodium and potassium silicate, as well as a handmade varnish that appears to have been composed of gum arabic, honey, and egg white. But they still can’t replicate a Stradivarius. The genius craftsman never once recorded his technique for posterity. Instead, he passed on his knowledge to a number of his apprentices through what the philosopher Michael Polanyi called “elbow learning.” This is the process where a protégé is trained in a new art or skill by sitting at the elbow of a master and by learning the craft through doing it, copying it, not simply by reading about it. The apprentices of the great Stradivari didn’t learn their craft from books or manuals but by sitting at his elbow and feeling the wood as he felt it to assess its length, its balance, and its timbre right there in their fingertips. All the learning happened at his elbow, and all the knowledge was contained in his fingers. In his book Personal Knowledge, Polanyi wrote, “Practical wisdom is more truly embodied in action than expressed in rules of action.”1 By that he meant that we learn as Stradivari’s protégés did, by feeling the weight of a piece of wood, not by reading the prescribed measurements in a manual. Polanyi continues, To learn by example is to submit to authority. You follow your master because you trust his manner of doing things even when you cannot analyze and account in detail for its effectiveness. By watching the master and emulating his efforts in the presence of his example, the apprentice unconsciously picks up the rules of the art, including those which are not explicitly known to the master himself. These hidden rules can be assimilated only by a person who surrenders himself to that extent uncritically to the imitation of another.
Lance Ford (UnLeader: Reimagining Leadership…and Why We Must)
There is no doubt that the shock of an... emotional experience is often needed to make people wake up and pay attention to what they are doing. There's a famous case of the 13th century Spanish Hidalgo, Raimon Lull, who finally (after a long chase) succeeded in meeting the lady he admired at a secret rendezvous. She silently opened her dress and showed him her breast, rotten with cancer. The shock changed Lull's life; he eventually became an eminent Theologian and one of the Church's greatest missionaries. In the case of such a sudden change one can often prove that an archetype has been at work for a long time in the unconscious, skillfully arranging circumstances that will lead to the crisis... such experiences seem to show that archetypal forms are not just static patterns. They are Dynamic factors that manifest themselves in impulses, just as spontaneously as the instincts. Certain dreams, visions, or thoughts can suddenly appear; and however carefully one investigates, one cannot find out what causes them. This does not mean that they have no cause; they certainly have. But it is so remote or obscure that one cannot see what it is.
C.G. Jung (Man and His Symbols)
But I have said enough about the negative side of the anima. There are just as many important positive aspects. The anima is, for instance, responsible for the fact that a man is able to find the right marriage partner. Another function is at least equally important: Whenever a man’s logical mind is incapable of discerning facts that are hidden in his unconscious, the anima helps him to dig them out. Even more vital is the role that the anima plays in putting a man’s mind in tune with the right inner values and thereby opening the way into more profound inner depths. It is as if an inner “radio” becomes tuned to a certain wave length that excludes irrelevancies but allows the voice of the Great Man to be heard. In establishing this inner “radio” reception, the anima takes on the role of guide, or mediator, to the world within and to the Self. That is how she appears in the example of the initiations of shamans that I described earlier; this is the role of Beatrice in Dante’s Paradiso, and also of the goddess Isis when she appeared in a dream to Apuleius, the famous author of The Golden Ass, in order to initiate him into a higher, more spiritual form of life.
C.G. Jung (Man and His Symbols)
Freud’s incest theory describes certain fantasies that accompany the regression of libido and are especially characteristic of the personal unconscious as found in hysterical patients. Up to a point they are infantile-sexual fantasies which show very clearly just where the hysterical attitude is defective and why it is so incongruous. They reveal the shadow. Obviously the language used by this compensation will be dramatic and exaggerated. The theory derived from it exactly matches the hysterical attitude that causes the patient to be neurotic. One should not, therefore, take this mode of expression quite as seriously as Freud himself took it. It is just as unconvincing as the ostensibly sexual traumata of hysterics. The neurotic sexual theory is further discomfited by the fact that the last act of the drama consists in a return to the mother’s body. This is usually effected not through the natural channels but through the mouth, through being devoured and swallowed (pl. LXII), thereby giving rise to an even more infantile theory which has been elaborated by Otto Rank. All these allegories are mere makeshifts. The real point is that the regression goes back to the deeper layer of the nutritive function, which is anterior to sexuality, and there clothes itself in the experiences of infancy. In other words, the sexual language of regression changes, on retreating still further back, into metaphors derived from the nutritive and digestive functions, and which cannot be taken as anything more than a façon de parler. The so-called Oedipus complex with its famous incest tendency changes at this level into a “Jonah-and-the-Whale” complex, which has any number of variants, for instance the witch who eats children, the wolf, the ogre, the dragon, and so on. Fear of incest turns into fear of being devoured by the mother. The regressing libido apparently desexualizes itself by retreating back step by step to the presexual stage of earliest infancy. Even there it does not make a halt, but in a manner of speaking continues right back to the intra-uterine, pre-natal condition and, leaving the sphere of personal psychology altogether, irrupts into the collective psyche where Jonah saw the “mysteries” (“représentations collectives”) in the whale’s belly. The libido thus reaches a kind of inchoate condition in which, like Theseus and Peirithous on their journey to the underworld, it may easily stick fast. But it can also tear itself loose from the maternal embrace and return to the surface with new possibilities of life.
C.G. Jung (Collected Works of C. G. Jung, Volume 5: Symbols of Transformation (The Collected Works of C. G. Jung Book 7))
But the basis of Freud's ideas aren't accepted by all philosophers, though many accept that he was right about the possibility of unconscious thought. Some have claimed that Freud's theories are unscientific. Most famously, Karl Popper (whose ideas are more fully discussed in Chapter 36) described many of the ideas of psychoanalysis as ‘unfalsifiable’. This wasn't a compliment, but a criticism. For Popper, the essence of scientific research was that it could be tested; that is, there could be some possible observation that would show that it was false. In Popper's example, the actions of a man who pushed a child into a river, and a man who dived in to save a drowning child were, like all human behaviour, equally open to Freudian explanation. Whether someone tried to drown or save a child, Freud's theory could explain it. He would probably say that the first man was repressing some aspect of his Oedipal conflict, and that led to his violent behaviour, whereas the second man had ‘sublimated’ his unconscious desires, that is, managed to steer them into socially useful actions. If every possible observation is taken as further evidence that the theory is true, whatever that observation is, and no imaginable evidence could show that it was false, Popper believed, the theory couldn't be scientific at all. Freud, on the other hand, might have argued that Popper had some kind of repressed desire that made him so aggressive towards psychoanalysis. Bertrand
Nigel Warburton (A Little History of Philosophy (Little Histories))
We are conscious of only a tiny fraction of the information that our brains process in each moment.1 Although we continually notice changes in our experience—in thought, mood, perception, behavior, etc.—we are utterly unaware of the neurophysiological events that produce them. In fact, we can be very poor witnesses to experience itself. By merely glancing at your face or listening to your tone of voice, others are often more aware of your state of mind and motivations than you are. I generally start each day with a cup of coffee or tea—sometimes two. This morning, it was coffee (two). Why not tea? I am in no position to know. I wanted coffee more than I wanted tea today, and I was free to have what I wanted. Did I consciously choose coffee over tea? No. The choice was made for me by events in my brain that I, as the conscious witness of my thoughts and actions, could not inspect or influence. Could I have “changed my mind” and switched to tea before the coffee drinker in me could get his bearings? Yes, but this impulse would also have been the product of unconscious causes. Why didn’t it arise this morning? Why might it arise in the future? I cannot know. The intention to do one thing and not another does not originate in consciousness—rather, it appears in consciousness, as does any thought or impulse that might oppose it. The physiologist Benjamin Libet famously used EEG to show that activity in the brain’s motor cortex can be detected some 300 milliseconds before a person feels that he has decided to move.2 Another lab extended this work using functional magnetic resonance imaging (fMRI): Subjects were asked to press one of two buttons while watching a “clock” composed of a random sequence of letters appearing on a screen. They reported which letter was visible at the moment they decided to press one button or the other. The experimenters found two brain regions that contained information about which button subjects would press a full 7 to 10 seconds before the decision was consciously made.3 More recently, direct recordings from the cortex showed that the activity of merely 256 neurons was sufficient to predict with 80 percent accuracy a person’s decision to move 700 milliseconds before he became aware of it.4 These findings are difficult to reconcile with the sense that we are the conscious authors of our actions. One fact now seems indisputable: Some moments before you are aware of what you will do next—a time in which you subjectively appear to have complete freedom to behave however you please—your brain has already determined what you will do. You then become conscious of this “decision” and believe that you are in the process of making it. The distinction between “higher” and “lower” systems in the brain offers no relief: I, as the conscious witness of my experience, no more initiate events in my prefrontal cortex than I cause my heart to beat. There will always be some delay between the first neurophysiological events that kindle my next conscious thought and the thought itself. And even if there weren’t—even if all mental states were truly coincident with their underlying brain states—I cannot decide what I will next think or intend until a thought or intention arises. What will my next mental state be? I do not know—it just happens. Where is the freedom in that?
Sam Harris (Free Will)
Virginia Satir, one of our most famous family therapists, said, “Families are people factories.” She meant that we learn how to relate to others during our early experiences of our families. The patterns of interacting that we use today were set up early on in our lives and were reinforced over and over again until they became automatic and part of our unconscious. Our peers and others influence us as well, but the basics are learned very early on and inform much of how we think about ourselves and others later on in life.
7Cups (7 Cups for the Searching Soul)
Cognitive Bias, that is, unconscious—and irrational—brain processes that literally distort the way we see the world. Kahneman and Tversky discovered more than 150 of them. There’s the Framing Effect, which demonstrates that people respond differently to the same choice depending on how it is framed (people place greater value on moving from 90 percent to 100 percent—high probability to certainty—than from 45 percent to 55 percent, even though they’re both ten percentage points). Prospect Theory explains why we take unwarranted risks in the face of uncertain losses. And the most famous is Loss Aversion, which shows how people are statistically more likely to act to avert a loss than to achieve an equal gain. Kahneman
Chris Voss (Never Split the Difference: Negotiating As If Your Life Depended On It)
I have always fancied myself as a fairly objective looker, but I’m beginning to wonder whether I do not miss whole categories of things. Let me give you an example of what I mean, Alicia. Some years ago the U.S. Information Service paid the expenses of a famous and fine Italian photographer to go to America and to take pictures of our country. It was thought that pictures by an Italian would be valuable to Italians because they would be of things of interest to Italy. I was living in Florence at the time and I saw the portfolio as soon as the pictures were printed. The man had traveled everywhere in America, and do you know what his pictures were? Italy, in every American city he had unconsciously sought and found Italy. The portraits—Italians; the countryside—Tuscany and the Po Valley and the Abruzzi. His eye looked for what was familiar to him and found it. . . . This man did not see the America which is not like Italy, and there is very much that isn’t. And I wonder what I have missed in the wonderful trip to the south that I have just completed. Did I see only America? I confess I caught myself at it. Traveling over those breathtaking mountains and looking down at the shimmering deserts . . . I found myself saying or agreeing—yes, that’s like the Texas panhandle— that could be Nevada, and that might be Death Valley. . . . [B]y identifying them with something I knew, was I not cutting myself off completely from the things I did not know, not seeing, not even recognizing, because I did not have the easy bridge of recognition . . . the shadings, the nuance, how many of those I must not have seen. (Newsday, 2 Apr. 1966)
John Steinbeck (America and Americans and Selected Nonfiction)
The style of a soul. Do you remember the famous philosopher who spoke of the style of a civilization? He called it 'style.' He said it was the nearest word he could find for it. He said that every civilization has its one basic principle, one single, supreme, determining conception, and every endeavor of men within that civilization is true, unconsciously and irrevocably, to that one principle... every human soul has a style of its own, also. Its one basic theme. You'll see it reflected in every thought, every act, every wish of that person. The one absolute, the one imperative in that living creature.
Ayn Rand (The Fountainhead)
End of May 2012
The continuation of my email to Andy: …I was delighted to return to London after war-ravaged Belfast. The students in our college had to evacuate several times due to IRA bomb threats. I must have subconsciously selected to be in Northern Ireland because of my unsettling inner upheavals. Much like the riots that went on in the city in 1971, I was unconsciously fighting my inner demons within myself. I needed that year to overcome my sexual addictions and to immerse myself in my fashion studies. By the following year, I had compiled an impressive fashion design portfolio for application with various London Art and Design colleges. Foundation students generally required two years to complete their studies. I graduated from the Belfast College of Art with flying colors within a year. By the autumn of 1972, I was accepted into the prestigious Harrow School of Art and Technology. Around that period, my father’s business was waning and my family had financial difficulty sponsoring my graduate studies. Unbeknownst to my family, I had earned sufficient money during my Harem services to comfortably put myself through college. I lied to my parents and told them I was working part-time in London to make ends meet so I could finance my fashion education. They believed my tall tale. For the next three years I put my heart and soul into my fashion projects. I would occasionally work as a waiter at the famous Rainbow Room in Biba, which is now defunct. Working at this dinner dance club was a convenient way of meeting beautiful and trendy patrons, who often visit this capricious establishment.
Young (Unbridled (A Harem Boy's Saga, #2))
No matter how many decathlons you run, how many shoes you buy or how famous you get, those unresolved, unbearable feelings exist below. Freud was on the button when he said you have to bring your darkness into the light if you want to free yourself from those deep unconscious emotions. Just like a virus has to be sweated out, your malignant thoughts and feelings have to surface.
Ruby Wax (Sane New World: The original bestseller)
The style of a soul. Do you remember the famous philosopher who spoke of the style of a civilization? He called it 'style.' He said it was the nearest word he could find for it. He said that every civilization has its one basic principle, one single, supreme, determining conception, and every endeavor of men within that civilization is true, unconsciously and irrevocably, to that one principle....I think, Kiki, that every human soul has a style of its own, also. Its one basic theme. You'll see it reflected in every thought, every act, every wish of that person. The one absolute, the one imperative in that living creature. Years of studying a man won't show it to you. His face will. You'd have to write volumes to describe a person. Think of his face. You need nothing else.
Ayn Rand (The Fountainhead)
If you think of yourself instead as an almost-victor who thought correctly and did everything possible but was foiled by crap variance? No matter: you will have other opportunities, and if you keep thinking correctly, eventually it will even out. These are the seeds of resilience, of being able to overcome the bad beats that you can’t avoid and mentally position yourself to be prepared for the next time. People share things with you: if you’ve lost your job, your social network thinks of you when new jobs come up; if you’re recently divorced or separated or bereaved, and someone single who may be a good match pops up, you’re top of mind. This attitude is what I think of as a luck amplifier. … you will feel a whole lot happier … and your ready mindset will prepare you for the change in variance that will come … 134-135
W. H. Auden: “Choice of attention—to pay attention to this and ignore that—is to the inner life what choice of action is to the outer. In both cases man is responsible for his choice and must accept the consequences.” Pay attention, or accept the consequences of your failure. 142
Attention is a powerful mitigator to overconfidence: it forces you to constantly reevaluate your knowledge and your game plan, lest you become too tied to a certain course of action. And if you lose? Well, it allows you to admit when it’s actually your fault and not a bad beat. 147
Following up on Phil Galfond’s suggestion to be both a detective and a storyteller and figure out “what your opponent’s actions mean, and sometimes what they don’t mean.” [Like the dog that didn’t bark in the Sherlock Holmes “Silver Blaze” story.] 159
You don’t have to have studied the description-experience gap to understand, if you’re truly expert at something, that you need experience to balance out the descriptions. Otherwise, you’re left with the illusion of knowledge—knowledge without substance. You’re an armchair philosopher who thinks that just because she read an article about something she is a sudden expert. (David Dunning, a psychologist at the University of Michigan most famous for being one half of the Dunning-Kruger effect—the more incompetent you are, the less you’re aware of your incompetence—has found that people go quickly from being circumspect beginners, who are perfectly aware of their limitations, to “unconscious incompetents,” people who no longer realize how much they don’t know and instead fancy themselves quite proficient.) 161-162
Erik: Generally, the people who cash the most are actually losing players (Nassim Taleb’s Black Swan strategy, jp). You can’t be a winning player by min cashing. 190
The more you learn, the harder it gets; the better you get, the worse you are—because the flaws that you wouldn’t even think of looking at before are now visible and need to be addressed. 191
An edge, even a tiny one, is an edge worth pursuing if you have the time and energy. 208
Blake Eastman: “Before each action, stop, think about what you want to do, and execute.” … Streamlined decisions, no immediate actions, or reactions. A standard process. 217
John Boyd’s OODA: Observe, Orient, Decide, and Act. The way to outmaneuver your opponent is to get inside their OODA loop. 224
Here’s a free life lesson: seek out situations where you’re a favorite; avoid those where you’re an underdog. 237
[on folding] No matter how good your starting hand, you have to be willing to read the signs and let it go.
One thing Erik has stressed, over and over, is to never feel committed to playing an event, ever. “See how you feel in the morning.” Tilt makes you revert to your worst self. 257
Jared Tendler, psychologist, “It all comes down to confidence, self-esteem, identity, what some people call ego.” 251
JT: “As far as hope in poker, f#¢k it. … You need to think in terms of preparation. Don’t worry about hoping. Just Do.” 252
Maria Konnikova (The Biggest Bluff: How I Learned to Pay Attention, Master Myself, and Win)
gold standard for infrastructure, a brick house in a world of straw; those stupid raised freeways, built strong enough to withstand the Big One, had served as refugia for the entire population of the city, and the subsequent evacuation had proceeded successfully. A very impressive improvisation. Despite LA’s uneven popularity across the world, it was for sure immensely famous. The dream factory had accomplished that at least. Many people all over the world felt they knew the place, and were transfixed by the images of it suddenly inundated. If it could happen to LA, rich as it was, dreamy as it was, it could happen anywhere. Was that right? Maybe not, but it felt that way. Some deep flip in the global unconscious was making people queasy. Despite this sense that the world was falling apart, or maybe because of it, demonstrations in the capitals of the world intensified. Actually these seemed to be occupations rather than demonstrations, because they didn’t end but rather persisted as disruptions of the ordinary business of the capitals. Within the occupied spaces, people were setting up and performing alternative lifeways with gift supplies of food and impromptu shelter and
Kim Stanley Robinson (The Ministry for the Future)
The state of tranquility and nothingness, which at first sight seems to be the right awareness of the selfless self, is nothing other than the depth of unconsciousness called Alaya-vijnana, which has come to prevail over Mana-vijnana. This is suggested by the following Zen verse composed by the famous Master Chosha Keijin (Ch’ang-sha Ching-ts’en):36 Students of the Way do not comprehend the Truth Because they only recognize the existing Eighth Consciousness. Fools identify with the original man, The boundless origin of birth and death.
Omori Sogen (Introduction to Zen Training: A Physical Approach to Meditation and Mind-Body Training (The Classic Rinzai Zen Manual))
Take for instance a phenomenon called frustrated spontaneous emission. It sounds like an embarrassing sexual complaint that psychotherapy might help with. In fact, it involves the decay of radioactive particles, which ordinarily takes place at a predictably random rate. The exception, however, is when radioactive material is placed in an environment that cannot absorb the photons that are emitted by decay. In that case, decay ceases—the atoms become “frustrated.” How do these atoms “know” to stop decaying until conditions are suitable? According to Wharton, the unpredictable decay of radioactive particles may be determined in part by whatever receives their emitted photons in the future.20 Decay may not really be random at all, in other words. Another quantum mystery that arguably becomes less mysterious in a retrocausal world is the quantum Zeno effect. Usually, the results of measurements are unpredictable—again according to the famous uncertainty believed to govern the quantum kingdom—but there is a loophole. Persistent, rapid probing of reality by repeating the same measurement over and over produces repetition of the same “answer” from the physical world, almost as if it is “stopping time” in some sense (hence the name of the effect, which refers to Zeno’s paradoxes like an arrow that must first get halfway to its target, and then halfway from there, and so on, and thus is never able to reach the target at all).21 If the measurement itself is somehow influencing a particle retrocausally, then repeating the same measurement in the same conditions may effectively be influencing the measured particles the same way in their past, thereby producing the consistent behavior. Retrocausation may also be at the basis of a long-known but, again, hitherto unsatisfyingly explained quirk of light’s behavior: Fermat’s principle of least time. Light always takes the fastest possible path to its destination, which means taking the shortest available path through different media like water or glass. It is the rule that accounts for the refraction of light through lenses, and the reason why an object underwater appears displaced from its true location.22 It is yet another example of a creature in the quantum bestiary that makes little sense unless photons somehow “know” where they are going in order to take the most efficient possible route to get there. If the photon’s angle of deflection when entering a refractive medium is somehow determined by its destination, Fermat’s principle would make much more sense. (We will return to Fermat’s principle later in this book; it plays an important role in Ted Chiang’s short story, “Story of Your Life,” the basis for the wonderful precognition movie Arrival.) And retrocausation could also offer new ways of looking at the double-slit experiment and its myriad variants.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
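The Fermat's principle claim in the Wargo passage above lends itself to a quick numerical check. The sketch below is an editorial illustration, not anything from Wargo's book: a small Python script (the coordinates, refractive indices, and helper names are all invented for the example) that searches for the interface crossing point minimizing a light ray's travel time between two media and then verifies that the least-time path obeys Snell's law.

```python
import math

# Fermat's principle of least time, illustrated numerically (assumed example,
# not from the quoted book): light travels from A in medium 1 to B in medium 2,
# crossing a flat interface at y = 0. The crossing point x that minimizes
# total travel time should also satisfy Snell's law: n1*sin(th1) == n2*sin(th2).

n1, n2 = 1.0, 1.33          # refractive indices (air -> water, chosen for the example)
ax, ay = 0.0, 1.0           # source A, 1 unit above the interface
bx, by = 2.0, -1.0          # destination B, 1 unit below the interface

def travel_time(x):
    """Time for the path A -> (x, 0) -> B; speed in each medium is c/n, with c = 1."""
    t1 = n1 * math.hypot(x - ax, ay)    # time spent in medium 1
    t2 = n2 * math.hypot(bx - x, by)    # time spent in medium 2
    return t1 + t2

# Ternary search for the minimum of the (convex) travel-time function.
lo, hi = ax, bx
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if travel_time(m1) < travel_time(m2):
        hi = m2
    else:
        lo = m1
x_min = (lo + hi) / 2

sin1 = (x_min - ax) / math.hypot(x_min - ax, ay)   # sine of the incidence angle
sin2 = (bx - x_min) / math.hypot(bx - x_min, by)   # sine of the refraction angle
print(f"least-time crossing point: x = {x_min:.4f}")
print(f"n1*sin(th1) = {n1*sin1:.4f},  n2*sin(th2) = {n2*sin2:.4f}")  # equal => Snell's law
```

Running it prints nearly identical values for the two sides of Snell's law, which is the sense in which the "fastest possible path" and the familiar law of refraction are the same statement.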
What if it is somehow our misunderstood, unacknowledged, looping relationship to our future that makes us ill—or at least, that contributes to our suffering—and not our failure to connect appropriately to our past? Could some neuroses be time loops misrecognized and denied, the way we haunt ourselves from our futures and struggle to reframe it as being about our past history? The next two chapters will examine this question through the lives of two famously precognitive and neurotic writers. Both show strikingly how creativity may travel together with trauma and suffering along the resonating string that connects us to the Not Yet.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
Even after his wife puts the knife up on a high shelf, out of the reach of her sleepwalking self, it continues to exert a hypnotic power over her, repeatedly calling forth what seems like some buried male, violent personality. Meanwhile Beverton himself falls into a somnambulistic state and assumes the persona of a victimized woman. After Beverton throws the knife in a snowy field, his wife finds it in her trance and stabs him in the shoulder. After Beverton recovers, a psychologist specializing in hypnotism (a character perhaps based on the doctor Robertson had visited for his real-life difficulties) tries to convince Beverton that he and his wife are acting out the telepathically received story of the famous Caribbean pirate Captain Henry Morgan and his captive sex slave Isobel, but with the sexes reversed. They were somehow picking up the thoughts of “some strong, projective personality—some man or woman thoroughly enthused and interested in the history of the seventeenth-century pirates.” Beverton listens to the doctor’s explanation but believes the truth goes deeper: Reincarnation is the real answer. They had actually been these figures in their past lives and at night were playing out their old relationship.

Eisenbud noted that “The Sleep Walker” is a pretty weird gender-bender for such a resolutely masculine writer. What he didn’t catch is that Robertson may in this story have been expressing a strange truth about how he secretly understood his own fickle creative gifts. In the volume, Morgan Robertson the Man, one of Robertson’s friends, an artist named J. O’Neill, recalled that the writer believed that he had telepathically acquired the writing gift, the muse, of a young woman he had known years earlier but who had been unable to make anything of her talent due to a lack of “stickativeness.” In other words, Robertson believed his fickle and inconstant “astral helper” or “psychic partner” (in the words of another friend, Henry W. Francis) was specifically that of a female. He was effectively appropriating that muse telepathically, or allowing himself to be its vessel, because it was of no use to the woman anyway and he could profit better from it.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
What if it is somehow our misunderstood, unacknowledged, looping relationship to our future that makes us ill—or at least, that contributes to our suffering—and not our failure to connect appropriately to our past? Could some neuroses be time loops misrecognized and denied, the way we haunt ourselves from our futures and struggle to reframe it as being about our past history? The next two chapters will examine this question through the lives of two famously precognitive and neurotic writers. Both show strikingly how creativity may travel together with trauma and suffering along the resonating string that connects us to the Not Yet.

12. Fate, Free Will, and Futility — Morgan Robertson’s Tiresias Complex

Who can tell us of the power which events possess … Are their workings in the past or in the future; and are the more powerful of them those that are no longer, or those that are not yet? Is it to-day or to-morrow that moulds us? Do we not all spend the greater part of our lives under the shadow of an event that has not yet come to pass?
— Maurice Maeterlinck, “The Pre-Destined” (1914)

The monkey wrench precognition appears to throw into the problem of free will is an important part of the force field inhibiting serious consideration of it by many people in our culture. It may have been a fear of the inevitability of things prophesied that made the whole subject so anathema to Freud, for example. In a society that places priority on success and the individual’s responsibility for its attainment, it is both taken for granted and a point of fierce conviction that we choose and that our choices are not completely made for us by the inexorable clockwork of matter—the Newtonian inertia that brought the Titanic and the Iceberg, mere inert objects, together. Scientists may pay lip service to determinism—Freud himself did—but the inevitability of material processes due to causes “pushing” from the past somehow feels less restrictive than a block universe in which our fate is already set. The radical predestination implied by time loops may rob “great men” of their ability to claim credit for their successes.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
One reason for this “dirty little secret” is the positive publication bias described in Chapter 7. If researchers and medical journals pay attention to positive findings and ignore negative findings, then they may well publish the one study that finds a drug effective and ignore the nineteen in which it has no effect. Some clinical trials may also have small samples (such as for a rare disease), which magnifies the chances that random variation in the data will get more attention than it deserves. On top of that, researchers may have some conscious or unconscious bias, either because of a strongly held prior belief or because a positive finding would be better for their career. (No one ever gets rich or famous by proving what doesn’t cure cancer.)
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
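Wheelan's point about the one positive study among twenty can be made concrete with a short simulation. The sketch below is not from Naked Statistics; it is an assumed illustration in Python (the trial sizes, the 1.96 cutoff, and all function names are my own choices) showing how often a batch of twenty trials of a completely ineffective drug yields at least one "significant" result that selective publication could then showcase.

```python
import random
import statistics

# A minimal simulation of the publication-bias point in the passage above
# (editorial sketch, not Wheelan's): run 20 small clinical trials of a drug
# that truly does nothing, test each at roughly the 5% significance level,
# and count how often at least one "positive" finding appears that a journal
# could publish while the negative trials are quietly ignored.

random.seed(42)

def one_trial(n=30):
    """Useless drug: treatment and control drawn from the same distribution,
    followed by a crude two-sample z-test on the difference in means."""
    treat = [random.gauss(0, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(treat) - statistics.mean(control)
    se = ((statistics.pvariance(treat) + statistics.pvariance(control)) / n) ** 0.5
    return abs(diff / se) > 1.96    # "significant" at roughly the 5% level

batches = 1000
hits = 0
for _ in range(batches):
    if any(one_trial() for _ in range(20)):
        hits += 1

print(f"share of 20-trial batches with at least one false positive: {hits / batches:.2f}")
# With a ~5% false-positive rate per trial, roughly 1 - 0.95**20 (about 64%)
# of batches contain a publishable "effect" even though the drug does nothing.
```

With a roughly 5 percent false-positive rate per trial, about two thirds of batches contain at least one publishable "effect", which is the mechanism the quoted passage describes.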
“Our deepest fear is not that we are inadequate. Our deepest fear is that we are powerful beyond measure. It is our light, not our darkness that frightens us most. We ask ourselves, ‘who am I to be brilliant, gorgeous, talented and famous?’ Actually, who are you not to be? You are a child of God. Your playing small does not serve the world. There is nothing enlightening about shrinking so that people won’t feel insecure around you. We were born to make manifest the Glory of God that is within us. It’s not just in some of us; it’s in all of us. And when we let our own light shine, we unconsciously give other people permission to do the same. As we are liberated from our own fear, our presence automatically liberates others.” -Nelson Mandela
Matthew Barnes (Jesus Christ, Zen Master: The top 116 sayings of an Enlightened Jesus. (Zennish Series Book 4))
“Committing himself to the freelance’s life, he lived at the edge of poverty; a Southerner, he stood outside of the literary establishment of New England. Poe’s pale expression in his most famous photo shows a man who believed he was born to suffer. If circumstances in his life were not propitious to suffering, he made sure to change them until they were. Deep in his understanding, almost so as to be unconscious, was a respect for the driving power of his misery—that it could take manifold forms in ways that he didn’t have to be aware of, as if not he, but it, could create.” —E.L. Doctorow on Edgar Allan Poe
E.L. Doctorow (Creationists: Selected Essays: 1993-2006)
Ah, the old description-experience gap. Phil may not know the term, but he understands the concept—exactly what he’s been trying to tell me this whole time about poker terms. You don’t have to have studied the description-experience gap to understand, if you’re truly expert at something, that you need experience to balance out the descriptions. Otherwise, you’re left with the illusion of knowledge—knowledge without substance. You’re an armchair philosopher who thinks that just because she read an article about something she is a sudden expert. (David Dunning, a psychologist at the University of Michigan most famous for being one half of the Dunning-Kruger effect—the more incompetent you are, the less you’re aware of your incompetence—has found that people go quickly from being circumspect beginners, who are perfectly aware of their limitations, to “unconscious incompetents,” people who no longer realize how much they don’t know and instead fancy themselves quite proficient.)
Maria Konnikova (The Biggest Bluff: How I Learned to Pay Attention, Master Myself, and Win)
One important reason that philosophers should take Nietzsche seriously is because he seems to have gotten, at least in broad contours, many points about human moral psychology right. Consider: (1) Nietzsche holds that heritable type-facts are central determinants of personality and morally significant behaviors, a claim well supported by extensive empirical findings in behavioral genetics. (2) Nietzsche claims that consciousness is a “surface” and that “the greatest part of conscious thought must still be attributed to [non-conscious] instinctive activity,” theses overwhelmingly vindicated by recent work by psychologists on the role of the unconscious (e.g., Wilson 2002) and by philosophers who have produced synthetic meta-analyses of work on consciousness in psychology and neuroscience (e.g., Rosenthal 2008). (3) Nietzsche claims that moral judgments are post-hoc rationalizations of feelings that have an antecedent source, and thus are not the outcome of rational reflection or discursiveness, a conclusion in sync with the findings of the ascendent “social intuitionism” in the empirical moral psychology of Jonathan Haidt (2001) and others. (4) Nietzsche argues that free will is an “illusion,” that our conscious experience of willing is itself the causal product of non-conscious forces, a view recently defended by the psychologist Daniel Wegner (2002), who, in turn, synthesizes a large body of empirical literature, including the famous neurophysical data about “willing” collected by Benjamin Libet.
Brian Leiter (Nietzsche and Morality)
therapy wasn’t about truth; as Jack Nicholson famously shouted in A Few Good Men, sometimes people “can’t handle the truth.” Rather, it’s a matter of getting your unconscious to stop controlling your conscious mind. Effective therapy is about lowering your defences so that you can deal with the issues that arise in your life.
Catherine Gildiner (Good Morning, Monster: A Therapist Shares Five Heroic Stories of Emotional Recovery)
Determinism says that our behaviour is determined by two causes: our heredity and our environment. Heredity refers to the genes we inherit from our parents, while environment refers not only to our current environment but also to the environments we have experienced in the past—in effect, to all the experiences we have had from the time we were born. Determinism, in other words, says that our behaviour is entirely determined by our genes and experiences: if we knew every gene and every experience a person had, then, in principle, we could predict exactly what they would do at every moment in time. (p. 4)

And now we may be on the brink of yet another revolution. It has been taking place largely out of public view, in psychology laboratories around the world. Its implications, however, are profound. It is telling us that just as we lost our belief that we are at the centre of the universe, we may also be losing our claim to stand aloof from the material world, to rise above the laws of physics and chemistry that bind other species. Our behaviour, it suggests, is just as lawful, just as determined, as that of every other living creature. (p. 6)

Also, while determinism is clearly contrary to the religious doctrine of free will, it is important to note that it is not contrary to religion per se. Einstein famously said that ‘God does not play dice’ with nature. He believed in some form of creation, but he found it inconceivable that God would have left the running of this universe to chance. Determinism assumes that the universe is lawful, but it makes no assumptions about how this universe came into being. (p. 11)

Another way in which parents influence their children’s behaviour is simply by being who they are. Children have a strong tendency to imitate adults, especially when the adult is important in their lives, and you can’t get much more important to a child than a parent. (p. 62)

What children see does influence their understanding of how to get along in the world, of what is and isn’t acceptable. (p. 64)

Our need to be liked, combined with our horror of being rejected or ostracized, can influence all of us. (p. 79)

It is the brain which gives rise to thought: no brain activity, no thought. (p. 90)

We’ve seen that everything we think, feel and do depends on the existence of an intact brain – (p. 92)

…: that what remains in memory is not necessarily the precise details of an experience but our interpretation of that experience. (p. 140)

According to determinism, it is your behaviour which is determined, not events. … The future is not preordained; if you change your behaviour, your future will also change. (p. 151)

It is our brains that determine what we think and feel; if our brains don’t function properly, consciousness is disrupted. (p. 168)

Given how much of our mental processing takes place in the unconscious, it is perhaps not surprising that we are often unaware of the factors that have guided our conscious thought. … …, but insofar as behaviour is determined by the environment, then by changing that environment we can change that behaviour. (p. 169)
David Lieberman (The Case Against Free Will: What a Quiet Revolution in Psychology has Revealed about How Behaviour is Determined)
Albert Einstein, who famously could not accept that God would be so unclassy as to turn His universe into a giant craps table. What seemed for all the world like randomness—blind chance—may really be the previously unseen influence of particles’ future histories on their present behavior. Retrocausation, in other words.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
To be human is to be on a quest. To live is to be embarked on a kind of unconscious journey toward a destination of your dreams. As Blaise Pascal put it in his famous wager: “You have to wager. It is not up to you, you are already committed.” You can’t not bet your life on something. You can’t not be headed somewhere. We live leaning forward, bent on arriving at the place we long for.
James K.A. Smith (You Are What You Love: The Spiritual Power of Habit)
Experiments published in 1983 clearly showed that subjects could choose not to perform a movement that was on the cusp of occurring (that is, that their brain was preparing to make) and that was preceded by a large readiness potential. In this view, although the physical sensation of an urge to move is initiated unconsciously, will can still control the outcome by vetoing the action. Later researchers, in fact, reported readiness potentials that precede a planned foot movement not by mere milliseconds but by almost two full seconds, leaving free won’t an even larger window of opportunity. “Conscious will could thus affect the outcome of the volitional process even though the latter was initiated by unconscious cerebral processes,” Libet says. “Conscious will might block or veto the process, so that no act occurs.”

Everyone, Libet continues, has had the experience of “vetoing a spontaneous urge to perform some act. This often occurs when the urge to act involves some socially unacceptable consequence, like an urge to shout some obscenity at the professor.” Volunteers report something quite consistent with this view of the will as wielding veto power. Sometimes, they told Libet, a conscious urge to move seemed to bubble up from somewhere, but they suppressed it. Although the possibility of moving gets under way some 350 milliseconds before the subject experiences the will to move, that sense of will nevertheless kicks in 150 to 200 milliseconds before the muscle moves—and with it the power to call a halt to the proceedings.

Libet’s findings suggest that free will operates not to initiate a voluntary act but to allow or suppress it. “We may view the unconscious initiatives for voluntary actions as ‘bubbling up’ in the brain,” he explains. “The conscious will then selects which of these initiatives may go forward to an action or which ones to veto and abort…. This kind of role for free will is actually in accord with religious and ethical strictures. These commonly advocate that you ‘control yourself.’ Most of the Ten Commandments are ‘do not’ orders.” And all five of the basic moral precepts of Buddhism are restraints: refraining from killing, from lying, from stealing, from sexual misconduct, from intoxicants. In the Buddha’s famous dictum, “Restraint everywhere is excellent.”
Jeffrey M. Schwartz (The Mind & The Brain: Neuroplasticity and the Power of Mental Force)
Researchers may have some conscious or unconscious bias, either because of a strongly held prior belief or because a positive finding would be better for their career. (No one ever gets rich or famous by proving what doesn't cause cancer.)
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
For the time being, however, his bent was literary and religious rather than balletic. He loved, and what seventh grader doesn’t, the abstracter foxtrots and more metaphysical twists of a Dostoevsky, a Gide, a Mailer. He longed for the experience of some vivider pain than the mere daily hollowness knotted into his tight young belly, and no weekly stomp-and-holler of group therapy with other jejune eleven-year-olds was going to get him his stripes in the major leagues of suffering, crime, and resurrection. Only a bona-fide crime would do that, and of all the crimes available murder certainly carried the most prestige, as no less an authority than Loretta Couplard was ready to attest, Loretta Couplard being not only the director and co-owner of the Lowen School but the author, as well, of two nationally televised scripts, both about famous murders of the 20th Century. They’d even done a unit in social studies on the topic: A History of Crime in Urban America.

The first of Loretta’s murders was a comedy involving Pauline Campbell, R.N., of Ann Arbor, Michigan, circa 1951, whose skull had been smashed by three drunken teenagers. They had meant to knock her unconscious so they could screw her, which was 1951 in a nutshell. The eighteen-year-olds, Bill Morey and Max Pell, got life; Dave Royal (Loretta’s hero) was a year younger and got off with twenty-two years.

Her second murder was tragic in tone and consequently inspired more respect, though not among the critics, unfortunately. Possibly because her heroine, also a Pauline (Pauline Wichura), though more interesting and complicated, had also been more famous in her own day and ever since. Which made the competition, one best-selling novel and a serious film biography, considerably stiffer.

Miss Wichura had been a welfare worker in Atlanta, Georgia, very much into environment and the population problem, this being the immediate pre-Regents period when anyone and everyone was legitimately starting to fret. Pauline decided to do something, viz., reduce the population herself and in the fairest way possible. So whenever any of the families she visited produced one child above the three she’d fixed, rather generously, as the upward limit, she found some unobtrusive way of thinning that family back to the preferred maximal size. Between 1989 and 1993 Pauline’s journals (Random House, 1994) record twenty-six murders, plus an additional fourteen failed attempts. In addition she had the highest welfare department record in the U.S. for abortions and sterilizations among the families whom she advised.

“Which proves, I think,” Little Mister Kissy Lips had explained one day after school to his friend Jack, “that a murder doesn’t have to be of someone famous to be a form of idealism.”

But of course idealism was only half the story: the other half was curiosity. And beyond idealism and curiosity there was probably even another half, the basic childhood need to grow up and kill someone.
Thomas M. Disch (334)
The American psychologist Elliot Aronson, who studied this phenomenon, famously assembled a discussion group of pompous, dull people. Some of the participants were made to endure an arduous selection process; others were allowed to join immediately, without expending any effort. Those who were given the runaround reported enjoying the group far more than the ones who were simply let in. Aronson explained what was happening here: whenever we’ve invested time, money or energy into something and it ends up being a complete waste of time, this creates dissonance, which we try to reduce by finding ways of justifying our bad decision. Aronson’s participants focused unconsciously on what might be interesting, or at least bearable, about being part of a deliberately boring group. The people who had invested very little effort in joining therefore had less dissonance to reduce, and more readily admitted what a waste of time it had been.
Steven Bartlett (The Diary of a CEO: The 33 Laws of Business and Life)