Phenomenon Famous Quotes

We've searched our database for all the quotes and captions related to Phenomenon Famous. Here they are! All 45 of them:

Pick up a pinecone and count the spiral rows of scales. You may find eight spirals winding up to the left and 13 spirals winding up to the right, or 13 left and 21 right spirals, or other pairs of numbers. The striking fact is that these pairs of numbers are adjacent numbers in the famous Fibonacci series: 1, 1, 2, 3, 5, 8, 13, 21... Here, each term is the sum of the previous two terms. The phenomenon is well known and called phyllotaxis. Many are the efforts of biologists to understand why pinecones, sunflowers, and many other plants exhibit this remarkable pattern. Organisms do the strangest things, but all these odd things need not reflect selection or historical accident. Some of the best efforts to understand phyllotaxis appeal to a form of self-organization. Paul Green, at Stanford, has argued persuasively that the Fibonacci series is just what one would expect as the simplest self-repeating pattern that can be generated by the particular growth processes in the growing tips of the tissues that form sunflowers, pinecones, and so forth. Like a snowflake and its sixfold symmetry, the pinecone and its phyllotaxis may be part of order for free.
Stuart A. Kauffman (At Home in the Universe: The Search for the Laws of Self-Organization and Complexity)
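Not from Kauffman's text, but a minimal sketch of the arithmetic the quote describes: each Fibonacci term is the sum of the previous two, and adjacent terms give the candidate left/right spiral counts.

```python
# Illustrative sketch: the Fibonacci series referenced above, where each term
# is the sum of the previous two. Adjacent terms such as (8, 13) or (13, 21)
# match the left/right spiral counts reported for pinecones.

def fibonacci(n):
    """Return the first n terms of the Fibonacci series: 1, 1, 2, 3, 5, 8, ..."""
    terms = [1, 1]
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])
    return terms[:n]

series = fibonacci(10)
print(series)                 # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]

# Adjacent pairs -- the candidate (left, right) spiral counts in phyllotaxis.
pairs = list(zip(series, series[1:]))
print(pairs[5:7])             # [(8, 13), (13, 21)]
```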
The exponential growth of this industry was correlated with the phenomenon famously discovered by Moore, who in 1965 drew a graph of the speed of integrated circuits, based on the number of transistors that could be placed on a chip, and showed that it doubled about every two years, a trajectory that could be expected to continue. This was reaffirmed in 1971, when Intel was able to etch a complete central processing unit onto one chip, the Intel 4004, which was dubbed a “microprocessor.” Moore’s Law has held generally true to this day, and its reliable projection of performance to price allowed two generations of young entrepreneurs, including Steve Jobs and Bill Gates, to create cost projections for their forward-leaning products.
Walter Isaacson (Steve Jobs)
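A rough, hypothetical illustration of the doubling pattern Isaacson describes; the 1971 starting figure of about 2,300 transistors for the Intel 4004 is a commonly cited number, not taken from the quote.

```python
# Sketch of Moore's Law as described above: transistor counts doubling about
# every two years. The starting count (~2,300 for the Intel 4004 in 1971) is
# a commonly cited figure, not stated in Isaacson's text.

def project_transistors(start_year, start_count, end_year, doubling_years=2):
    """Project counts forward, assuming one doubling every `doubling_years`."""
    projection = {}
    for year in range(start_year, end_year + 1, doubling_years):
        doublings = (year - start_year) / doubling_years
        projection[year] = int(start_count * 2 ** doublings)
    return projection

for year, count in project_transistors(1971, 2300, 1981).items():
    print(f"{year}: ~{count:,} transistors")
# 1971: ~2,300   1973: ~4,600   1975: ~9,200   1977: ~18,400 ...
```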
often has no suspicion of the causal connection between the precipitating event and the pathological phenomenon.
Sigmund Freud (Freud's Most Famous & Influential Books, Vol 1: The Interpretations of Dreams/On Dreams/On Psychotherapy/Jokes & Their Relation to the Unconscious)
Neurologically speaking, though, there are reasons we develop a confused sense of priorities when we’re in front of our computer screens. For one thing, email comes at unpredictable intervals, which, as B. F. Skinner famously showed with rats seeking pellets, is the most seductive and habit-forming reward pattern to the mammalian brain. (Think about it: would slot machines be half as thrilling if you knew when, and how often, you were going to get three cherries?) Jessie would later say as much to me when I asked her why she was “obsessed”—her word—with her email: “It’s like fishing. You just never know what you’re going to get.” More to the point, our nervous systems can become dysregulated when we sit in front of a screen. This, at least, is the theory of Linda Stone, formerly a researcher and senior executive at Microsoft Corporation. She notes that we often hold our breath or breathe shallowly when we’re working at our computers. She calls this phenomenon “email apnea” or “screen apnea.” “The result,” writes Stone in an email, “is a stress response. We become more agitated and impulsive than we’d ordinarily be.”
Jennifer Senior (All Joy and No Fun: The Paradox of Modern Parenthood)
The interpretation of a result is an example. To take a trivial instance, there is a famous joke about a man who complains to a friend of a mysterious phenomenon. The white horses on his farm eat more than the black horses. He worries about this and cannot understand it, until his friend suggests that maybe he has more white horses than black ones.
Richard P. Feynman (The Meaning of It All: Thoughts of a Citizen-Scientist)
The exponential growth of this industry was correlated with the phenomenon famously discovered by Moore, who in 1965 drew a graph of the speed of integrated circuits, based on the number of transistors that could be placed on a chip, and showed that it doubled about every two years, a trajectory that could be expected to continue. This was reaffirmed in 1971, when Intel was able to etch a complete central processing unit onto one chip, the Intel 4004, which was dubbed a “microprocessor.” Moore’s Law has held generally true to this day, and its reliable projection of performance to price allowed two generations of young entrepreneurs, including Steve Jobs and Bill Gates, to create cost projections for their forward-leaning products.
Walter Isaacson (Steve Jobs)
The exponential growth of this industry was correlated with the phenomenon famously discovered by Moore, who in 1965 drew a graph of the speed of integrated circuits, based on the number of transistors that could be placed on a chip, and showed that it doubled about every two years, a trajectory that could be expected to continue. This was reaffirmed in 1971, when Intel was able to etch a complete central processing unit onto one chip, the Intel 4004, which was dubbed a “microprocessor.” Moore’s Law has held generally true to this day, and its reliable projection of performance to price allowed two generations of young entrepreneurs, including Steve Jobs and Bill Gates, to create cost projections for their forward-leaning products.
Walter Isaacson (Steve Jobs)
Most of us do not like not being able to see what others see or make sense of something new. We do not like it when things do not come together and fit nicely for us. That is why most popular movies have Hollywood endings. The public prefers a tidy finale. And we especially do not like it when things are contradictory, because then it is much harder to reconcile them (this is particularly true for Westerners). This sense of confusion triggers in us a feeling of noxious anxiety. It generates tension. So we feel compelled to reduce it, solve it, complete it, reconcile it, make it make sense. And when we do solve these puzzles, there's relief. It feels good. We REALLY like it when things come together. What I am describing is a very basic human psychological process, captured by the second Gestalt principle. It is what we call the 'press for coherence.' It has been called many different things in psychology: consonance, need for closure, congruity, harmony, need for meaning, the consistency principle. At its core it is the drive to reduce the tension, disorientation, and dissonance that come from complexity, incoherence, and contradiction. In the 1930s, Bluma Zeigarnik, a student of Lewin's in Berlin, designed a famous study to test the impact of this idea of tension and coherence. Lewin had noticed that waiters in his local cafe seemed to have better recollections of unpaid orders than of those already settled. A lab study was run to examine this phenomenon, and it showed that people tend to remember uncompleted tasks, like half-finished math or word problems, better than completed tasks. This is because the unfinished task triggers a feeling of tension, which gets associated with the task and keeps it lingering in our minds. The completed problems are, well, complete, so we forget them and move on. They later called this the 'Zeigarnik effect,' and it has influenced the study of many things, from advertising campaigns to coping with the suicide of loved ones to dysphoric rumination of past conflicts.
Peter T. Coleman (The Five Percent: Finding Solutions to Seemingly Impossible Conflicts)
Hölderlin's sense of loss and destitution was not simply due to a personal predilection for suffering, but was part of a larger cultural phenomenon that arose from powerful currents seething under the Enlightenment—an increasing alienation from nature and a growing sense of disenchantment in the face of a triumphant rationality and waning traditions and values. Hölderlin was not alone in perceiving these changes and experiencing them deeply. Hegel, for example, famously wrote of alienated consciousness, and Schiller described modern human beings as "stunted plants, that show only a feeble vestige of their nature." Hölderlin, for his part, reacted to these currents with an almost overwhelming longing for lost wholeness.
Friedrich Hölderlin (Odes and Elegies)
The sociologist Robert Merton famously called this phenomenon the “Matthew Effect” after the New Testament verse in the Gospel of Matthew: “For unto everyone that hath shall be given, and he shall have abundance. But from him that hath not shall be taken away even that which he hath.” It is those who are successful, in other words, who are most likely to be given the kinds of special opportunities that lead to further success. It’s the rich who get the biggest tax breaks. It’s the best students who get the best teaching and most attention. And it’s the biggest nine- and ten-year-olds who get the most coaching and practice. Success is the result of what sociologists like to call “accumulative advantage.” The professional hockey player starts out a little bit better than his peers.
Malcolm Gladwell (Outliers: The Story of Success)
Vul and Pashler drew inspiration from the well-known phenomenon known as the wisdom-of-crowds effect: averaging the independent judgments of different people generally improves accuracy. In 1907, Francis Galton, a cousin of Darwin and a famous polymath, asked 787 villagers at a country fair to estimate the weight of a prize ox. None of the villagers guessed the actual weight of the ox, which was 1,198 pounds, but the mean of their guesses was 1,200, just 2 pounds off, and the median (1,207) was also very close. The villagers were a “wise crowd” in the sense that although their individual estimates were quite noisy, they were unbiased. Galton’s demonstration surprised him: he had little respect for the judgment of ordinary people, and despite himself, he urged that his results were “more creditable to the trustworthiness of a democratic judgment than might have been expected.”
Daniel Kahneman (Noise)
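A small simulation, not Galton's data, of the noisy-but-unbiased crowd Kahneman describes: individual guesses scatter widely, yet the mean and median land near the true weight.

```python
# Illustrative simulation of the wisdom-of-crowds effect described above.
# The guesses are random, unbiased and noisy (the spread of 75 lb is an
# arbitrary assumption); their mean and median still land near the true value.
import random
import statistics

random.seed(42)
TRUE_WEIGHT = 1198                                   # pounds
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(787)]

print(f"mean of guesses   : {statistics.mean(guesses):.0f}")
print(f"median of guesses : {statistics.median(guesses):.0f}")
print(f"typical single-guess error: {statistics.pstdev(guesses):.0f} lb")
# The aggregates come out within a few pounds of 1,198, while an individual
# guess is typically off by tens of pounds.
```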
Speaking generally, there are two kinds of descriptive music. The first comes under the heading of literal description. A composer wishes to recreate the sound of bells in the night. He therefore writes certain chords, for orchestra or piano or whatever medium he is using, which actually sound like bells in the night. Something real is being imitated realistically. A famous example of that kind of description in music is the passage in one of Strauss’s tone poems where he imitates the bleating of sheep. The music has no other raison d’être than mere imitation at that point. The other type of descriptive music is less literal and more poetic. No attempt is made to describe a particular scene or event; nevertheless some outward circumstance arouses certain emotions in the composer which he wishes to communicate to the listener. It may be clouds or the sea or a country fair or an airplane. But the point is that instead of literal imitation, one gets a musicopoetic transcription of the phenomenon as reflected in the composer’s mind. That constitutes a higher form of program music. The bleating of sheep will always sound like the bleating of sheep, but a cloud portrayed in music allows the imagination more freedom.
Aaron Copland (What to Listen For in Music (Signet Classics))
If this is true—if solitude is an important key to creativity—then we might all want to develop a taste for it. We’d want to teach our kids to work independently. We’d want to give employees plenty of privacy and autonomy. Yet increasingly we do just the opposite. We like to believe that we live in a grand age of creative individualism. We look back at the midcentury era in which the Berkeley researchers conducted their creativity studies, and feel superior. Unlike the starched-shirted conformists of the 1950s, we hang posters of Einstein on our walls, his tongue stuck out iconoclastically. We consume indie music and films, and generate our own online content. We “think different” (even if we got the idea from Apple Computer’s famous ad campaign). But the way we organize many of our most important institutions—our schools and our workplaces—tells a very different story. It’s the story of a contemporary phenomenon that I call the New Groupthink—a phenomenon that has the potential to stifle productivity at work and to deprive schoolchildren of the skills they’ll need to achieve excellence in an increasingly competitive world. The New Groupthink elevates teamwork above all else. It insists that creativity and intellectual achievement come from a gregarious place. It has many powerful advocates. “Innovation—the heart of the knowledge economy—is fundamentally social,” writes the prominent journalist Malcolm Gladwell. “None of us is as smart as all of us,” declares the organizational consultant Warren Bennis.
Susan Cain (Quiet: The Power of Introverts in a World That Can't Stop Talking)
Maybe nostalgia is itself the problem. A Democrat I met in Macon during a conversation we had about the local enthusiasm for Trump told me that “people want to go back to Mayberry”, the setting of the beloved old Andy Griffith Show. (As it happens, the actual model for Mayberry, Mount Airy, a bedraggled town in North Carolina, has gone all in on the Trump revolution, as the Washington Post recently reported.) Maybe it’s also true, as my liberal friends believe, that what people in this part of the country secretly long to go back to are the days when the Klan was riding high or when Quantrill was terrorizing the people of neighboring Kansas, or when Dred Scott was losing his famous court case. For sure, there is a streak of that ugly sentiment in the Trump phenomenon. But I want to suggest something different: that the nostalgic urge does not necessarily have to be a reactionary one. There is nothing un-progressive about wanting your town to thrive, about recognizing that it isn’t thriving today, about figuring out that the mid-century, liberal way worked better. For me, at least, that is how nostalgia unfolds. When I drive around this part of the country, I always do so with a WPA guidebook in hand, the better to help me locate the architectural achievements of the Roosevelt years. I used to patronize a list of restaurants supposedly favored by Harry Truman (they are slowly disappearing). And these days, as I pass Trump sign after Trump sign, I wonder what has made so many of Truman’s people cast their lot with this blustering would-be caudillo. Maybe what I’m pining for is a liberal Magic Kingdom, a non-racist midwest where things function again. For a countryside dotted with small towns where the business district has reasonable job-creating businesses in it, taverns too. For a state where the giant chain stores haven’t succeeded in putting everyone out of business. For an economy where workers can form unions and buy new cars every couple of years, where farmers enjoy the protection of the laws, and where corporate management has not been permitted to use every trick available to them to drive down wages and play desperate cities off one against the other. Maybe it’s just an impossible utopia, a shimmering Mayberry dream. But somehow I don’t think so.
Thomas Frank (Rendezvous with Oblivion: Reports from a Sinking Society)
Every human being with normal mental and emotional faculties longs for more. People typically associate their longing for more with a desire to somehow improve their lot in life—to get a better job, a nicer house, a more loving spouse, become famous, and so on. If only this, that, or some other thing were different, we say to ourselves, then we’d feel complete and happy. Some chase this “if only” all their lives. For others, the “if only” turns into resentment when they lose hope of ever acquiring completeness. But even if we get lucky and acquire our “if only,” it never quite satisfies. Acquiring the better job, the bigger house, the new spouse, or world fame we longed for may provide a temporary sense of happiness and completeness, but it never lasts. Sooner or later, the hunger returns. The best word in any language that captures this vague, unquenchable yearning, according to C. S. Lewis and other writers, is the German word Sehnsucht (pronounced “zane-zookt”).[9] It’s an unusual word that is hard to translate, for it expresses a deep longing or craving for something that you can’t quite identify and that always feels just out of reach. Some have described Sehnsucht as a vague and bittersweet nostalgia and/or longing for a distant country, but one that cannot be found on earth. Others have described it as a quasi-mystical sense that we (and our present world) are incomplete, combined with an unattainable yearning for whatever it is that would complete it. Scientists have offered several different explanations for this puzzling phenomenon—puzzling, because it’s hard to understand how natural processes alone could have evolved beings that hunger for something nature itself doesn’t provide.[10] But this longing is not puzzling from a biblical perspective, for Scripture teaches us that humans and the entire creation are fallen and estranged from God. Lewis saw Sehnsucht as reflective of our “pilgrim status.” It indicates that we are not where we were meant to be, where we are destined to be; we are not home. Lewis once wrote to a friend that “our best havings are wantings,” for our “wantings” are reminders that humans are meant for a different and better state.[11] In another place he wrote: Our lifelong nostalgia, our longing to be reunited with something in the universe from which we now feel cut off, to be on the inside of some door which we have always seen from the outside is . . . the truest index of our real situation.[12] With Lewis, Christians have always identified this Sehnsucht that resides in the human heart as a yearning for God. As St. Augustine famously prayed, “You have made us for yourself, and our hearts are restless till they find their rest in you.”[13] In this light, we might think of Sehnsucht as a sort of homing device placed in us by our Creator to lead us into a passionate relationship with him.
Gregory A. Boyd (Benefit of the Doubt: Breaking the Idol of Certainty)
The mixture of a solidly established Romance aristocracy with the Old English grassroots produced a new language, a “French of England,” which came to be known as Anglo-Norman. It was perfectly intelligible to the speakers of other langues d’oïl and also gave French its first anglicisms, words such as bateau (boat) and the four points of the compass, nord, sud, est and ouest. The most famous Romance chanson de geste, the Song of Roland, was written in Anglo-Norman. The first verse shows how “French” this language was: Carles li reis, nostre emperere magnes, set anz tuz pleins ad estéd en Espaigne, Tresqu’en la mer cunquist la tere altaigne… King Charles, our great emperor, stayed in Spain a full seven years: and he conquered the high lands up to the sea… Francophones are probably not aware of how much England contributed to the development of French. England’s court was an important production centre for Romance literature, and most of the early legends of King Arthur were written in Anglo-Norman. Robert Wace, who came from the Channel Island of Jersey, first evoked the mythical Round Table in his Roman de Brut, written in French in 1155. An Englishman, William Caxton, even produced the first “vocabulary” of French and English (a precursor of the dictionary) in 1480. But for four centuries after William seized the English crown, the exchange between Old English and Romance was pretty much the other way around—from Romance to English. Linguists dispute whether a quarter or a half of the basic English vocabulary comes from French. Part of the argument has to do with the fact that some borrowings are referred to as Latinates, a term that tends to obscure the fact that they actually come from French (as we explain later, the English worked hard to push away or hide the influence of French). Words such as charge, council, court, debt, judge, justice, merchant and parliament are straight borrowings from eleventh-century Romance, often with no modification in spelling. In her book Honni soit qui mal y pense, Henriette Walter points out that the historical developments of French and English are so closely related that anglophone students find it easier to read Old French than francophones do. The reason is simple: Words such as acointance, chalenge, plege, estriver, remaindre and esquier disappeared from the French vocabulary but remained in English as acquaintance, challenge, pledge, strive, remain and squire—with their original meanings. The word bacon, which francophones today decry as an English import, is an old Frankish term that took root in English. Words that people think are totally English, such as foreign, pedigree, budget, proud and view, are actually Romance terms pronounced with an English accent: forain, pied-de-grue (crane’s foot—a symbol used in genealogical trees to mark a line of succession), bougette (purse), prud (valiant) and vëue. Like all other Romance vernaculars, Anglo-Norman evolved quickly. English became the expression of a profound brand of nationalism long before French did. As early as the thirteenth century, the English were struggling to define their nation in opposition to the French, a phenomenon that is no doubt the root of the peculiar mixture of attraction and repulsion most anglophones feel towards the French today, whether they admit it or not. When Norman kings tried to add their French territory to England and unify their kingdom under the English Crown, the French of course resisted. 
The situation led to the first, lesser-known Hundred Years War (1159–1299). This long quarrel forced the Anglo-Norman aristocracy to take sides. Those who chose England got closer to the local grassroots, setting the Anglo-Norman aristocracy on the road to assimilation into English.
Jean-Benoît Nadeau (The Story of French)
book The World Beyond Your Head: On Becoming an Individual in an Age of Distraction as a jumping-off point, he takes care to unpack the various cultural mandates that have infected the way we think and feel about distraction. I found his ruminations not only enlightening but surprisingly emancipating: There are two big theories about why [distraction is] on the rise. The first is material: it holds that our urbanized, high-tech society is designed to distract us… The second big theory is spiritual—it’s that we’re distracted because our souls are troubled. The comedian Louis C.K. may be the most famous contemporary exponent of this way of thinking. A few years ago, on “Late Night” with Conan O’Brien, he argued that people are addicted to their phones because “they don’t want to be alone for a second because it’s so hard.” (David Foster Wallace also saw distraction this way.) The spiritual theory is even older than the material one: in 1887, Nietzsche wrote that “haste is universal because everyone is in flight from himself”; in the seventeenth century, Pascal said that “all men’s miseries derive from not being able to sit in a quiet room alone.”… Crawford argues that our increased distractibility is the result of technological changes that, in turn, have their roots in our civilization’s spiritual commitments. Ever since the Enlightenment, he writes, Western societies have been obsessed with autonomy, and in the past few hundred years we have put autonomy at the center of our lives, economically, politically, and technologically; often, when we think about what it means to be happy, we think of freedom from our circumstances. Unfortunately, we’ve taken things too far: we’re now addicted to liberation, and we regard any situation—a movie, a conversation, a one-block walk down a city street—as a kind of prison. Distraction is a way of asserting control; it’s autonomy run amok. Technologies of escape, like the smartphone, tap into our habits of secession. The way we talk about distraction has always been a little self-serving—we say, in the passive voice, that we’re “distracted by” the Internet or our cats, and this makes us seem like the victims of our own decisions. But Crawford shows that this way of talking mischaracterizes the whole phenomenon. It’s not just that we choose our own distractions; it’s that the pleasure we get from being distracted is the pleasure of taking action and being free. There’s a glee that comes from making choices, a contentment that settles after we’ve asserted our autonomy.
Anonymous
For me, this was the first hint that the liturgy might be the cure for spiritual loneliness. Though I felt inadequate and alone during my prayer crisis, I was not alone. Much of American spiritual life trudges through the muck of solitary spirituality. Twenty years ago, Robert Bellah described this phenomenon in Habits of the Heart, with his now famous description of one woman: Sheila Larson is a young nurse who has received a good deal of therapy and describes her faith as “Sheilaism.” This suggests the logical possibility of more than 235 million American religions, one for each of us. “I believe in God,” Sheila says. “I am not a religious fanatic. I can’t remember the last time I went to church. My faith has carried me a long way. It’s Sheilaism. Just my own little voice.” “My little voice” guides many lonely people to and through New Age, wicca, Buddhism, labyrinths, Scientology, yoga, meditation, and various fads in Christianity—and then creates a new Sheilaism from the fragments that have not been discarded along the way. I love Sheila Larson precisely because she articulates nearly perfectly my lifelong struggle: “I believe in God. I am not a religious fanatic…. My faith has carried me a long way. It’s Sheilaism. Just my own little voice.” The difference between Sheila and me is that she has the courage of her convictions: she knows her faith is very personal and so hasn’t bothered with the church. I like to pretend that my faith is grounded in community, but I struggle to believe in anything but Markism. Fortunately God loves us so much he has made it a “spiritual law” that Sheilaism or Markism become boring after a while. The gift of the liturgy—and it is precisely why I need the liturgy—is that it helps me hear not so much “my little voice” but instead the still, small voice (Psalm 46). It leads away from the self and points me toward the community of God.
Mark Galli (Beyond Smells and Bells: The Wonder and Power of Christian Liturgy)
‘Inflation’, wrote Milton Friedman in a famous definition, ‘is always and everywhere a monetary phenomenon, in the sense that it cannot occur without a more rapid increase in the quantity of money than in output.’
Niall Ferguson (The Ascent of Money: A Financial History of the World: 10th Anniversary Edition)
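A toy calculation, with invented figures, of the intuition behind Friedman's definition: if the quantity-theory identity MV = PQ holds with velocity roughly constant, prices rise only when money grows faster than output.

```python
# Toy illustration (invented figures) of Friedman's point via the quantity
# theory identity M * V = P * Q: holding velocity V constant, the price level
# P rises only when money M grows faster than output Q.

def implied_inflation(money_growth, output_growth):
    """Inflation implied by MV = PQ with constant velocity (rates as decimals)."""
    return (1 + money_growth) / (1 + output_growth) - 1

for m, q in [(0.10, 0.03), (0.03, 0.03), (0.02, 0.04)]:
    print(f"money +{m:.0%}, output +{q:.0%} -> prices {implied_inflation(m, q):+.1%}")
# money +10%, output +3% -> prices +6.8%
# money +3%, output +3% -> prices +0.0%
# money +2%, output +4% -> prices -1.9%
```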
Think of the new conception of time, Digital Presentism, like real-time streaming of progressively generated content in immersive virtual reality. We’re all familiar with online music streaming, too: When you stream music online, every bit is discretely rendered, interpreted and finally interwoven into your unitary experiential reality. Only with Digital Presentism 'music' is also being created in 'real time' as if right from your mind... Since time can’t be absolute but is always subjective, Digital Presentism revolves around observer-centric temporality. What we call ‘time’ is a sequential change between static perceptual 'frames,' it’s an emergent phenomenon, 'a moving image of eternity' as Plato famously said more than two millennia ago.
Alex M. Vikoulov (The Physics of Time: D-Theory of Time & Temporal Mechanics (The Science and Philosophy of Information Book 2))
he is simply heads and tails more capable than anyone else. It’s a romantic notion in popular media—Sherlock Holmes, Miranda Priestly, Tony Stark—but in real life, these people are not who you want on your team no matter how talented they are. Instead of a multiplier effect, you get a divider effect: the presence of this person makes the rest of your team less effective. Stanford professor Robert I. Sutton described this phenomenon in his now famous book The No Asshole Rule. He defines an asshole as someone who makes other people feel worse about themselves or who specifically targets people less powerful than him or her.
Julie Zhuo (The Making of a Manager: What to Do When Everyone Looks to You)
Before the 1940’s, if one woman in an audience stood up and shrieked at the top of her lungs throughout an entire show she’d have been carted off to an asylum. By the mid-forties, however, entire audiences behaved like that, screaming, tearing at their clothes and hair, leaving their seats to board the stage. On December 30th, 1942, while Frank Sinatra sang at the Paramount Theater in New York, the behavior of the audience changed, and a part of our relationship to well-known people changed forever. Psychiatrists and psychologists of the day struggled to explain the phenomenon. They recalled medieval dance crazes, spoke of “mass frustrated love” and “mass hypnosis.” The media age did bring a type of mass hypnosis into American life. It affects all of us to some degree, and some of us to a great degree. Before the advent of mass-media, a young girl might have admired a performer from afar, and it would have been acceptable to have a passing crush. It would not have been acceptable if she pursued the performer to his home, or if she had to be restrained by police. It would not have been acceptable to skip school in order to wait for hours outside a hotel and then try to tear pieces of clothing from the passing star. Yet that unhealthy behavior became “normal” in the Sinatra days. In fact, audience behavior that surprised everyone in 1942 was expected two years later when Sinatra appeared again at the Paramount Theater. This time, the 30,000 screaming, bobby-soxed fans were joined by a troop of reporters. The media were learning to manipulate this new behavior to their advantage. Having predicted a commotion, 450 police officers were assigned to that one theater, and it appeared that society had learned to deal with this phenomenon. It had not. During the engagement, an 18-year-old named Alexander Ivanovich Dorogokupetz stood up in the theater and threw an egg that hit Sinatra in the face. The show stopped, and for a moment, a brief moment, Sinatra was not the star. Now it was Dorogokupetz mobbed by audience members and Dorogokupetz who had to be escorted out by police. Society had not learned to deal with this, and still hasn’t. Dorogokupetz told police: “I vowed to put an end to this monotony of two years of consecutive swooning. It felt good.” Saddled with the least American of names, he had tried to make one for himself in the most American way, and but for his choice of a weapon, he would probably be as famous today as Frank Sinatra. Elements in society were pioneering the skills of manipulating emotion and behavior in ways that had never been possible before: electronic ways. The media were institutionalizing idolatry.
Gavin de Becker (The Gift of Fear: Survival Signals That Protect Us from Violence)
Before the 1940’s, if one woman in an audience stood up and shrieked at the top of her lungs throughout an entire show she’d have been carted off to an asylum. By the mid-forties, however, entire audiences behaved like that, screaming, tearing at their clothes and hair, leaving their seats to board the stage. On December 30th, 1942, while Frank Sinatra sang at the Paramount Theater in New York, the behavior of the audience changed, and a part of our relationship to well-known people changed forever. Psychiatrists and psychologists of the day struggled to explain the phenomenon. They recalled medieval dance crazes, spoke of “mass frustrated love” and “mass hypnosis.” The media age did bring a type of mass hypnosis into American life. It affects all of us to some degree, and some of us to a great degree. Before the advent of mass-media, a young girl might have admired a performer from afar, and it would have been acceptable to have a passing crush. It would not have been acceptable if she pursued the performer to his home, or if she had to be restrained by police. It would not have been acceptable to skip school in order to wait for hours outside a hotel and then try to tear pieces of clothing from the passing star. Yet that unhealthy behavior became “normal” in the Sinatra days. In fact, audience behavior that surprised everyone in 1942 was expected two years later when Sinatra appeared again at the Paramount Theater. This time, the 30,000 screaming, bobby-soxed fans were joined by a troop of reporters. The media were learning to manipulate this new behavior to their advantage. Having predicted a commotion, 450 police officers were assigned to that one theater, and it appeared that society had learned to deal with this phenomenon. It had not. During the engagement, an 18-year-old named Alexander Ivanovich Dorogokupetz stood up in the theater and threw an egg that hit Sinatra in the face. The show stopped, and for a moment, a brief moment, Sinatra was not the star. Now it was Dorogokupetz mobbed by audience members and Dorogokupetz who had to be escorted out by police. Society had not learned to deal with this, and still hasn’t. Dorogokupetz told police: “I vowed to put an end to this monotony of two years of consecutive swooning. It felt good.” Saddled with the least American of names, he had tried to make one for himself in the most American way, and but for his choice of a weapon, he would probably be as famous today as Frank Sinatra. Elements in society were pioneering the skills of manipulating emotion and behavior in ways that had never been possible before: electronic ways. The media were institutionalizing idolatry.
Gavin de Becker (The Gift of Fear: Survival Signals That Protect Us from Violence)
Jang Jin Sung, a famous North Korea defector and former poet laureate who worked in North Korea’s propaganda bureau, calls this phenomenon “emotional dictatorship.” In North Korea, it’s not enough for the government to control where you go, what you learn, where you work, and what you say. They need to control you through your emotions, making you a slave to the state by destroying your individuality, and your ability to react to situations based on your own experience of the world.
Yeonmi Park (In Order to Live: A North Korean Girl's Journey to Freedom)
When examined through the lens of Meerkat’s Law and the central framework of this book, it is obvious why the resulting networks generated by big launches are weak. You’d rather have a smaller set of atomic networks that are denser and more engaged than a large number of networks that aren’t there. When a networked product depends on having other people in order to be useful, it’s better to ignore the top-line aggregate numbers. Instead, the quality of the traction can only be seen when you zoom all the way into the perspective of an individual user within the network. Does a new person who joins the product see value based on how many other users are already on it? You might as well ignore the aggregate numbers, and in particular the spike of users that a new product might see in its first days. As Eric Ries describes in his book The Lean Startup, these are “vanity metrics.” The numbers might make you feel good, especially when they are going up, but it doesn’t matter if you have a hundred million users if they are churning out at a high rate, due to a lack of other users engaging. When networks are built bottom-up, they are more likely to be densely interconnected, and thus healthier and more engaged. There are multiple reasons for this: A new product is often incubated within a subcommunity, whether that’s a college campus, San Francisco techies, gamers, or freelancers—as recent tech successes have shown. It will grow within this group before spreading into other verticals, allowing time for its developers to tune features like inviting or sharing, while honing the core value proposition. Once a new networked product is spreading via word of mouth, then each user is likely to know at least one other user already on the network. By the time it reaches the broader consciousness, it will be seen as a phenomenon, and top-down efforts can always be added on to scale a network that’s already big and engaged. If Big Bang Launches work so poorly in general, why do they work for Apple? This type of launch works for Apple because their core offerings can stand alone as premium, high-utility products that generally don’t need to construct new networks to function. At most, they tap into existing networks like email and SMS. Famously, Apple has not succeeded with social offerings like the now-defunct Game Center and Ping. The closest new networked product they’ve launched is arguably the App Store, but even that was initially not in Steve Jobs’s vision for the phone.87 Most important, though, you aren’t Apple. So don’t try to copy them without having their kinds of products.
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
Stanford professor Robert I. Sutton described this phenomenon in his now famous book The No Asshole Rule. He defines an asshole as someone who makes other people feel worse about themselves or who specifically targets people less powerful than him or her.
Julie Zhuo (The Making of a Manager: What to Do When Everyone Looks to You)
Take for instance a phenomenon called frustrated spontaneous emission. It sounds like an embarrassing sexual complaint that psychotherapy might help with. In fact, it involves the decay of radioactive particles, which ordinarily takes place at a predictably random rate. The exception, however, is when radioactive material is placed in an environment that cannot absorb the photons that are emitted by decay. In that case, decay ceases—the atoms become “frustrated.” How do these atoms “know” to stop decaying until conditions are suitable? According to Wharton, the unpredictable decay of radioactive particles may be determined in part by whatever receives their emitted photons in the future.20 Decay may not really be random at all, in other words. Another quantum mystery that arguably becomes less mysterious in a retrocausal world is the quantum Zeno effect. Usually, the results of measurements are unpredictable—again according to the famous uncertainty believed to govern the quantum kingdom—but there is a loophole. Persistent, rapid probing of reality by repeating the same measurement over and over produces repetition of the same “answer” from the physical world, almost as if it is “stopping time” in some sense (hence the name of the effect, which refers to Zeno’s paradoxes like an arrow that must first get halfway to its target, and then halfway from there, and so on, and thus is never able to reach the target at all).21 If the measurement itself is somehow influencing a particle retrocausally, then repeating the same measurement in the same conditions may effectively be influencing the measured particles the same way in their past, thereby producing the consistent behavior. Retrocausation may also be at the basis of a long-known but, again, hitherto unsatisfyingly explained quirk of light’s behavior: Fermat’s principle of least time. Light always takes the fastest possible path to its destination, which means taking the shortest available path through different media like water or glass. It is the rule that accounts for the refraction of light through lenses, and the reason why an object underwater appears displaced from its true location.22 It is yet another example of a creature in the quantum bestiary that makes little sense unless photons somehow “know” where they are going in order to take the most efficient possible route to get there. If the photon’s angle of deflection when entering a refractive medium is somehow determined by its destination, Fermat’s principle would make much more sense. (We will return to Fermat’s principle later in this book; it plays an important role in Ted Chiang’s short story, “Story of Your Life,” the basis for the wonderful precognition movie Arrival.) And retrocausation could also offer new ways of looking at the double-slit experiment and its myriad variants.
Eric Wargo (Time Loops: Precognition, Retrocausation, and the Unconscious)
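A numerical sketch of the least-time principle the passage invokes (geometry and refractive indices are invented for the example): brute-force minimizing travel time across an air/water boundary recovers Snell's-law refraction.

```python
# Illustrative sketch of Fermat's principle of least time, as described above:
# among all crossing points on the air/water boundary, the least-time path is
# the refracted one that satisfies Snell's law. Geometry and indices are
# invented for the example.
import math

N_AIR, N_WATER = 1.0, 1.33        # refractive indices (optical path = n * distance)
SOURCE = (0.0, 1.0)               # light source, 1 unit above the surface (y = 0)
TARGET = (1.0, -1.0)              # target, 1 unit below the surface

def optical_path(x):
    """Optical path length (proportional to travel time) via crossing point (x, 0)."""
    leg_air = math.hypot(x - SOURCE[0], SOURCE[1])
    leg_water = math.hypot(TARGET[0] - x, TARGET[1])
    return N_AIR * leg_air + N_WATER * leg_water

# Brute-force search for the least-time crossing point on the boundary.
best_x = min((i / 10000 for i in range(10001)), key=optical_path)

sin_in = (best_x - SOURCE[0]) / math.hypot(best_x - SOURCE[0], SOURCE[1])
sin_out = (TARGET[0] - best_x) / math.hypot(TARGET[0] - best_x, TARGET[1])
print(f"n1 * sin(theta1) = {N_AIR * sin_in:.4f}")
print(f"n2 * sin(theta2) = {N_WATER * sin_out:.4f}")
# The two products agree to within the search resolution -- Snell's law,
# recovered purely by minimizing travel time.
```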
'Weak argument: talk loudly.' Winston Churchill's famous marginal note is a classic example of a meta-equation. More precisely, it is a reminder of the basic principle of total valency: both halves of any equation strive toward self-repetition. In this respect, an equation is the ideal image of any reflection. Mimesis is tautological, and tautology as a universal phenomenon has its equivalent not only in mathematics but also in art; if in mathematics it takes form in an equation, then its ideal genre equivalent lies in the riddle (the equation is the rationalization of a riddle, and detective fiction is its dramatization). As it grows into 'higher' genres, the riddle preserves its principle: two equal sides with unknowns, in which the sides demonstrate that they are identical. A riddle is a game. The process of solving it essentially boils down to proving the obvious; one knows from the start that the meanings of the two functions given are equal. This transforms the whole process into a sort of intellectual ostensibility.
Evgeny Dobrenko (Late Stalinism: The Aesthetics of Politics)
Saunderson lectured on light, lenses, optics, the phenomenon of the rainbow, and other subjects connected with sight. He also helped to make Newton's theories of the Principia Mathematica and other works accessible to students of Cambridge. Unlike Newton, however, Saunderson was famously (or infamously) irreligious, which adds another layer to Diderot's interest in this blind man.
M. Leona Godin (There Plant Eyes: A Personal and Cultural History of Blindness)
Trade disputes were regularly argued before juries of hundreds of citizens, and this must have created an intensely financially literate society. The monetization of the Athenian economy was an equally important step. Recently, scholars have argued that it played a central role in the transition to the political phenomenon for which Athens is most famous: democracy. Money became both a tool for sharing the Athenian economic success and an instrument for aligning personal loyalties to the state.
William N. Goetzmann (Money Changes Everything: How Finance Made Civilization Possible)
Supernatural
Supernatural has several meanings; the usual is “miraculous; ascribed to agencies or powers above or beyond nature; divine.” Because science is commonly regarded as a method of studying the natural world, a supernatural phenomenon is by this definition unexplainable by, and therefore totally incompatible with, science. Today, a few religious traditions continue to maintain that psi is supernatural and therefore not amenable to scientific study. But a few hundred years ago virtually all natural phenomena were thought to be manifestations of supernatural agencies and spirits. Through years of systematic investigation, many of these phenomena are now understood in quite ordinary terms. Thus, it is entirely reasonable to expect that so-called miracles are simply indicators of our present ignorance. Any such events may be more properly labeled first as paranormal, then as normal once we have developed an acceptable scientific explanation. As astronaut Edgar Mitchell put it: “There are no unnatural or supernatural phenomena, only very large gaps in our knowledge of what is natural, particularly regarding relatively rare occurrences.”2
Mystical
Mystical refers to the direct perception of reality; knowledge derived directly rather than indirectly. In many respects, mysticism is surprisingly similar to science in that it is a systematic method of exploring the nature of the world. Science concentrates on outer, objective phenomena, and mysticism concentrates on inner, subjective phenomena. It is interesting that numerous scientists, scholars, and sages over the years have revealed deep, underlying similarities between the goals, practices, and findings of science and mysticism. Some of the most famous scientists wrote in terms that are practically indistinguishable from the writings of mystics.
Dean Radin (The Conscious Universe: The Scientific Truth of Psychic Phenomena)
The American psychologist Elliot Aronson, who studied this phenomenon, famously assembled a discussion group of pompous, dull people. Some of the participants were made to endure an arduous selection process; others were allowed to join immediately, without expending any effort. Those who were given the runaround reported enjoying the group far more than the ones who were simply let in. Aronson explained what was happening here: whenever we’ve invested time, money or energy into something and it ends up being a complete waste of time, this creates dissonance, which we try to reduce by finding ways of justifying our bad decision. Aronson’s participants focused unconsciously on what might be interesting, or at least bearable, about being part of a deliberately boring group. The people who had invested very little effort in joining therefore had less dissonance to reduce, and more readily admitted what a waste of time it had been.
Steven Bartlett (The Diary of a CEO: The 33 Laws of Business and Life)
Christianity was still primarily an urban phenomenon, attracting at first especially the ‘middle sort’ of people. It spread through persuasion and example, in ways that upper-class pagans found demeaning. ‘We see them in our own homes, wool dressers, cobblers and fullers, the most uneducated and common persons, not daring to say a word in the presence of their masters who are older and wiser,’ Celsus famously complained. ‘But when they get hold of the children in private, and silly women with them, they are wonderfully eloquent, to the effect that children must not listen to their father, but believe them, and be taught by them.’1
Larry Siedentop (Inventing the Individual: The Origins of Western Liberalism)
YOUTH: A fabricated feeling of superiority?
PHILOSOPHER: A familiar example would be “giving authority.”
YOUTH: What does that mean?
PHILOSOPHER: One makes a show of being on good terms with a powerful person (broadly speaking—it could be anyone from the leader of your school class to a famous celebrity). And by doing that, one lets it be known that one is special. Behaviors like misrepresenting one’s work experience or excessive allegiance to particular brands of clothing are forms of giving authority, and probably also have aspects of the superiority complex. In each case, it isn’t that the “I” is actually superior or special. It is only that one is making the “I” look superior by linking it to authority. In short, it’s a fabricated feeling of superiority.
Ichiro Kishimi (The Courage to Be Disliked: The Japanese Phenomenon That Shows You How to Change Your Life and Achieve Real Happiness)
The early Wittgenstein and the logical positivists that he inspired are often thought to have their roots in the philosophical investigations of René Descartes.9 Descartes’s famous dictum “I think, therefore I am” has often been cited as emblematic of Western rationalism. This view interprets Descartes to mean “I think, that is, I can manipulate logic and symbols, therefore I am worthwhile.” But in my view, Descartes was not intending to extol the virtues of rational thought. He was troubled by what has become known as the mind-body problem, the paradox of how mind can arise from nonmind, how thoughts and feelings can arise from the ordinary matter of the brain. Pushing rational skepticism to its limits, his statement really means “I think, that is, there is an undeniable mental phenomenon, some awareness, occurring, therefore all we know for sure is that something—let’s call it I—exists.” Viewed in this way, there is less of a gap than is commonly thought between Descartes and Buddhist notions of consciousness as the primary reality. Before 2030, we will have machines proclaiming Descartes’s dictum. And it won’t seem like a programmed response. The machines will be earnest and convincing. Should we believe them when they claim to be conscious entities with their own volition?
Ray Kurzweil (The Age of Spiritual Machines: When Computers Exceed Human Intelligence)
‘Inflation’, wrote Milton Friedman in a famous definition, ‘is always and everywhere a monetary phenomenon, in the sense that it cannot occur without a more rapid increase in the quantity of money than in output.’
Niall Ferguson (The Ascent of Money: A Financial History of the World: 10th Anniversary Edition)
If you were destined to be a poet, then you won't brainstorm for lines that rhyme. If you were destined to be a celebrity, then you shouldn't start searching for fans. If you are truly a god, then let others worship you!
Michael Bassey Johnson
We need to analyze and contemplate the experience of modernity in the Arab and Muslim world, in order to grasp what is happening. Some of us, for example, reject modernity, and yet it’s obvious that these same people are using the products of modernity, even to the extent that when proselytizing their interpretation of Islam, which conflicts with modernity, they’re employing the tools of modernity to do so. This strange phenomenon can best be understood by contemplating our basic attitude towards modernity, stemming from two centuries ago. If we analyze books written by various Muslim thinkers at the time, concerning modernity and the importance of modernizing our societies, and so forth, we can see that they distinguished between certain aspects of modernity that should be rejected, and others that may be accepted. You can find this distinction in the very earliest books that Muslim intellectuals wrote on the topic of modernity. To provide a specific example, I’ll cite an important book that is widely regarded as having been the first ever written about modern thought in the Muslim world, namely, a book by the famous Egyptian intellectual, Rifa’ Rafi’ al-Tahtawi (1801–1873), Takhlish al-Ibriz fi Talkhish Baris, whose title may be translated as Mining Gold from Its Surrounding Dross. As you can immediately grasp from its title, the book distinguishes between the “gold” contained within modernity—gold being a highly prized, expensive and rare product of mining—and its so-called “worthless” elements, which Muslims are forbidden to embrace. Now if we ask ourselves, “What elements of modernity did these early thinkers consider acceptable, and what did they demand that we reject?,” we discover that technology is the “acceptable” element of modernity. We are told that we may adopt as much technology as we want, and exploit these products of modernity to our heart’s content. But what about the modes of thought that give rise to these products, and underlie the very phenomenon of modernity itself? That is, the free exercise of reason, and critical thought? These two principles are rejected and proscribed for Muslims, who may adopt the products of modernity, while its substance, values and foundations, including its philosophical modes of thought, are declared forbidden. Shaykh Rifa’ Rafi’ al-Tahtawi explained that we may exploit knowledge that is useful for defense, warfare, irrigation, farming, etc., and yet he simultaneously forbade us to study, or utilize, the philosophical sciences that gave rise to modern thought, and the love for scientific methodologies that enlivens the spirit of modern knowledge, because he believed that they harbored religious deviance and infidelity (to God).
علي مبروك
Perhaps one of the most remarkable cases is one cited by F. W. H. Myers in his chapter on hypnotism in Human Personality: a young actress, an understudy, called upon suddenly to replace the star of her company, was sick with apprehension and stage-fright. Under light hypnosis she performed with competence and brilliance, and won great applause; but it was long before she was able to act her parts without the aid of the hypnotist, who stationed himself in her dressing-room. (Later in this same case the phenomenon of “post-hypnotic suggestion” began to be observed, and the foundations of the Nancy School of autosuggestion, of which Coué is the most famous contemporary associate, were laid.) In the same chapter in which he quotes the remarkable case of the actress, Myers made a theorizing comment which is of immense value to everyone who hopes to free himself of his bondage to failure. He points out that the ordinary shyness and tentativeness with which we all approach novel action is entirely removed from the hypnotized subject, who consequently acts instead with precision and self-confidence. Now the removal of shyness, or mauvaise honte (he wrote), which hypnotic suggestion can effect, is in fact a purgation of memory—inhibiting the recollection of previous failures, and setting free whatever group of aptitudes is for the moment required.
Dorothea Brande (Wake Up and Live!: A Formula for Success That Really Works!)
Meditation can generate several different kinds of altered states, like strong emotional swings. Some of these states may be fun, but they are not the aim, which is to explore the whole universe of phenomena — seeing, listening, feeling, eating, touching, and thinking — and to seek our liberation amid the storm rather than demanding that the phenomenon match our desires. Practices of contemplation are powerful. When you work alone and feel you're not free, please protect yourself. This dangerous feeling could include extreme fear, stress, uncertainty or even physical symptoms. Be sure to speak with an instructor, a psychologist or a professional who can educate you about the procedure if something like this happens. No wonder meditation is not a panacea. In fact, when the spiritual leader Jiddu Krishnamurti was asked, "What good is all this contemplation doing?" he responded, "It's no use at all." Meditation isn't guaranteed to make you wealthy, gorgeous or famous. That's a mystery. You do want to achieve your goal, but you need to let go of the target-oriented, overachieving, task-centered way of doing and remain in the state of being that helps to incorporate your mind and body in your meditation. It is the paradox of the Zen instruction “Try not to try.”
What to Do in an Emergency
A professional teacher's guide is often required. A group called the Spiritual Emergence Network advises people suffering from a spiritual emergency and lets qualified psychologists and physicians discern between a psychological emergency and a mental breakdown. Another way to tell the difference is that the person who sees visions in a spiritual emergency realizes they are delusions, whereas in a psychotic breakdown the person believes the visions are real. If you have feelings that are extremely unpleasant and no trainer is present, immediately stop the practice and concentrate on simple, earthy things to get yourself grounded. Dig in the yard, go out walking or jogging, get a workout, take a bath or a shower, and eat heavy food. Slow down your spiritual awakening when you feel threatened by it.
Adrian Satyam (Energy Healing: 6 in 1: Medicine for Body, Mind and Spirit. An extraordinary guide to Chakra and Quantum Healing, Kundalini and Third Eye Awakening, Reiki and Meditation and Mindfulness.)
You see, whether or not we want to admit it, political contempt and division are what economists call a demand-driven phenomenon. Famous people purvey it, but ordinary citizens are the ones creating a market for it.
Arthur C. Brooks (Love Your Enemies: How Decent People Can Save America from the Culture of Contempt)
The sociologist Robert Merton famously called this phenomenon the “Matthew Effect” after the New Testament verse in the Gospel of Matthew: “For unto everyone that hath shall be given, and he shall have abundance. But from him that hath not shall be taken away even that which he hath.” It is those who are successful, in other words, who are most likely to be given the kinds of special opportunities that lead to further success.
Malcolm Gladwell (Outliers: The Story of Success)
In 1968, elementary school teacher Jane Elliott conducted a famous experiment with her students in the days after the assassination of Dr. Martin Luther King Jr. She divided the class by eye color. The brown-eyed children were told they were better. They were the “in-group.” The blue-eyed children were told they were less than the brown-eyed children—hence becoming the “out-group.” Suddenly, former classmates who had once played happily side by side were taunting and torturing one another on the playground. Lest we assign greater morality to the “out-group,” the blue-eyed children were just as quick to attack the brown-eyed children once the roles were reversed.6 Since Elliott’s experiment, researchers have conducted thousands of studies to understand the in-group/out-group response. Now, with fMRI scans, these researchers can actually see which parts of our brains fire up when perceiving a member of an out-group. In a phenomenon called the out-group homogeneity effect, we are more likely to see members of our groups as unique and individually motivated—and more likely to see a member of the out-group as the same as everyone else in that group. When we encounter this out-group member, our amygdala—the part of our brain that processes anger and fear—is more likely to become active. The more we perceive this person outside our group as a threat, the more willing we are to treat them badly.
Sarah Stewart Holland (I Think You're Wrong (But I'm Listening): A Guide to Grace-Filled Political Conversations)
He was soft-spoken and grandfatherly as he introduced himself. He seemed utterly and completely normal. Nothing about him suggested his dark past. The “banality of evil,” Hannah Arendt famously called this odd phenomenon when writing about Eichmann’s trial.
Eric Lichtblau (The Nazis Next Door: How America Became a Safe Haven for Hitler's Men)
As a phenomenon, this isn’t new. For centuries, Agadez has been an important crossroads for travellers and traders trying to make it through the Sahara. In the Middle Ages, salt and gold merchants picking their way between Timbuktu and the Mediterranean often had to pass through the town. By the fifteenth century, Agadez had its own sultan, its famously imposing mosque, and a knot of winding streets that still exists today.
Patrick Kingsley (The New Odyssey: The Story of the Twenty-First Century Refugee Crisis)