Programme Brain Quotes

We've searched our database for all the quotes and captions related to Programme Brain. Here they are! All 61 of them:

Names are deeply meaningful to your brain, and misleading names add chaos to your code.
Andrew Hunt (The Pragmatic Programmer: From Journeyman to Master)
Stop worrying about what others think of you. Base your thoughts, your decisions, and your goals on what you want and what is important in your life.
Daniel G. Amen (Change Your Brain, Change Your Life: The breakthrough programme for conquering anger, anxiety and depression)
Most of us know that the media tell us our bodies are imperfect - too fat, too smelly, too wrinkled, or too soft. And, even though we may know it’s horseshit, these messages still seep into our brains and mess with our self-esteem. In a media-saturated country where most images of women and men have been photoshopped to perfection, it’s hard to find a living supermodel (much less a computer programmer) who doesn’t wish she had sexier earlobes or a tighter ass. So, buck up, even the prettiest bombshell has body insecurities. You can spend your life thinking your butt’s too big (or your cock’s too small) or feeling sexy as hell. Make the choice to appreciate your body as it is.
Victoria Vantoch (The Threesome Handbook: Make the Most of Your Favorite Fantasy - the Ultimate Guide for Tri-Curious Singles and Couples)
While we may continue to use the words smart and stupid, and while IQ tests may persist for certain purposes, the monopoly of those who believe in a single general intelligence has come to an end. Brain scientists and geneticists are documenting the incredible differentiation of human capacities, computer programmers are creating systems that are intelligent in different ways, and educators are freshly acknowledging that their students have distinctive strengths and weaknesses.
Howard Gardner (Intelligence Reframed: Multiple Intelligences for the 21st Century)
Your scientists have done studies with people connected to an EEG brain-scanning device while watching TV; they registered activity in the delta wave frequencies, essentially occupying a highly programmable sleep state while viewing TV.
Barbara Marciniak (Path of Empowerment: New Pleiadian Wisdom for a World in Chaos)
Music was a kind of penetration. Perhaps absorption is a less freighted word. The penetration or absorption of everything into itself. I don't know if you have ever taken LSD, but when you do so the doors of perception, as Aldous Huxley, Jim Morrison and their adherents ceaselessly remind us, swing wide open. That is actually the sort of phrase, unless you are William Blake, that only makes sense when there is some LSD actually swimming about inside you. In the cold light of the cup of coffee and banana sandwich that are beside me now it appears to be nonsense, but I expect you to know what it is taken to mean. LSD reveals the whatness of things, their quiddity, their essence. The wateriness of water is suddenly revealed to you, the carpetness of carpets, the woodness of wood, the yellowness of yellow, the fingernailness of fingernails, the allness of all, the nothingness of all, the allness of nothing. For me music gives access to every one of these essences, but at a fraction of the social or financial cost of a drug and without the need to cry 'Wow!' all the time, which is LSD's most distressing and least endearing side effect. ...Music in the precision of its form and the mathematical tyranny of its laws, escapes into an eternity of abstraction and an absurd sublime that is everywhere and nowhere at once. The grunt of rosin-rubbed catgut, the saliva-bubble blast of a brass tube, the sweaty-fingered squeak on a guitar fret, all that physicality, all that clumsy 'music making', all that grain of human performance...transcends itself at the moment of its happening, that moment when music actually becomes, as it makes the journey from the vibrating instrument, the vibrating hi-fi speaker, as it sends those vibrations across to the human tympanum and through to the inner ear and into the brain, where the mind is set to vibrate to frequencies of its own making.
The nothingness of music can be moulded by the mood of the listener into the most precise shapes or allowed to float as free as thought; music can follow the academic and theoretical pattern of its own modality or adhere to some narrative or dialectical programme imposed by a friend, a scholar or the composer himself. Music is everything and nothing. It is useless and no limit can be set to its use. Music takes me to places of illimitable sensual and insensate joy, accessing points of ecstasy that no angelic lover could ever locate, or plunging me into gibbering weeping hells of pain that no torturer could ever devise. Music makes me write this sort of maundering adolescent nonsense without embarrassment. Music is in fact the dog's bollocks. Nothing else comes close.
Stephen Fry (Moab Is My Washpot (Memoir, #1))
Technology is taking over mind control programmes today, which allows them to go direct to the brain's information processing systems through the medium of electricity, electromagnetism, frequency, and microchips.
David Icke
Concepts and patterns that your brain is sorting through and making sense of are much more scalable and universal than any specific vendor’s technology.
Chad Fowler (The Passionate Programmer: Creating a Remarkable Career in Software Development (Pragmatic Life))
The problem with this premise, of course, is that whereas other children had programmers who fed their brains with love and kindness, my programmers were evil. My code is flawed.
Stephanie Foo (What My Bones Know: A Memoir of Healing from Complex Trauma)
Our brains are like computers; it's our responsibility to programme them well, daily, and remove the viruses.
Sam Owen (500 Relationships And Life Quotes: Bite-Sized Advice For Busy People)
Confusion is part of programming.
Felienne Hermans (The Programmer's Brain)
Many rookie software managers think that they can "motivate" their programmers to work faster by giving them nice, "tight" (unrealistically short) schedules. I think this kind of motivation is brain-dead. When I'm behind schedule, I feel doomed and depressed and unmotivated. When I'm working ahead of schedule, I'm cheerful and productive. The schedule is not the place to play psychological games.
Joel Spolsky (Joel on Software)
A statement: children who watch violent TV programmes tend to be more violent when they grow up. But did the TV cause the violence, or do violent children preferentially enjoy watching violent programmes? Very likely both are true. Commercial defenders of TV violence argue that anyone can distinguish between television and reality. But Saturday morning children’s programmes now average 25 acts of violence per hour. At the very least this desensitizes young children to aggression and random cruelty. And if impressionable adults can have false memories implanted in their brains, what are we implanting in our children when we expose them to some 100,000 acts of violence before they graduate from elementary school?
Carl Sagan (The Demon-Haunted World: Science as a Candle in the Dark)
You see, programmers tend to be arrogant, self-absorbed introverts. We didn’t get into this business because we like people. Most of us got into programming because we prefer to deeply focus on sterile minutia, juggle lots of concepts simultaneously, and in general prove to ourselves that we have brains the size of a planet, all while not having to interact with the messy complexities of other people.
Robert C. Martin (Clean Coder, The: A Code of Conduct for Professional Programmers (Robert C. Martin Series))
if you program a purpose into a computer program, does that constitute its will? Does it have free will, if a programmer programmed its purpose? Is that programming any different from the way we are programmed by our genes and brains? Is a programmed will a servile will? Is human will a servile will? And is not the servile will the home and source of all feelings of defilement, infection, transgression, and rage?
Kim Stanley Robinson (2312)
Universal computers are capable of performing all the computations permitted by the laws of physics. Once a universal computer is constructed, all you have to do is to load it with the right programme, and it can simulate any other system that is physically allowed. This includes the biosphere, with all its splendid richness of animals, plants, and microorganisms; and, in principle, it even includes your brain, together with thoughts and emotions.
Chiara Marletto (The Science of Can and Can't: A Physicist's Journey Through the Land of Counterfactuals)
Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s? If this were then subjected to an appropriate course of education one would obtain the adult brain.
Nick Bostrom (Superintelligence: Paths, Dangers, Strategies)
Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s?” he asked. “If this were then subjected to an appropriate course of education, one would obtain the adult brain.
Walter Isaacson (The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution)
Working simultaneously, though seemingly without a conscience, was Dr. Ewen Cameron, whose base was a laboratory in Canada's McGill University, in Montreal. Since his death in 1967, the history of his work for both himself and the CIA has become known. He was interested in 'terminal' experiments and regularly received relatively small stipends (never more than $20,000) from the American CIA in order to conduct his work. He explored electroshock in ways that offered such high risk of permanent brain damage that other researchers would not try them. He immersed subjects in sensory deprivation tanks for weeks at a time, though often claiming that they were immersed for only a matter of hours. He seemed to fancy himself a pure scientist, a man who would do anything to learn the outcome. The fact that some people died as a result of his research, while others went insane and still others, including the wife of a member of Canada's Parliament, had psychological problems for many years afterwards, was not a concern to the doctor or those who employed him. What mattered was that by the time Cheryl and Lynn Hersha were placed in the programme, the intelligence community had learned how to use electroshock techniques to control the mind. And so, like her sister, Lynn was strapped to a chair and wired for electric shock. The experience was different for Lynn, though the sexual component remained present to a lesser degree...
Cheryl Hersha (Secret Weapons: How Two Sisters Were Brainwashed to Kill for Their Country)
Research shows that older people who remain physically active throughout old age have more proteins in the brain that keep the connections between the neurons strong and healthy. This correlates to higher cognitive function and less neurodegeneration. The brain is a remarkable piece of machinery that we can programme and continue to upgrade so that we can live a full life that’s governed by autonomy and control.
Nicole Vignola (Rewire: Break the Cycle, Alter Your Thoughts and Create Lasting Change (Your Neurotoolkit for Everyday Life))
That evolution should select for larger brains may seem to us like, well, a no-brainer. We are so enamoured of our high intelligence that we assume that when it comes to cerebral power, more must be better. But if that were the case, the feline family would also have produced cats who could do calculus, and frogs would by now have launched their own space programme. Why are giant brains so rare in the animal kingdom?
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
[God] tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s—merely descriptive. Merely. And here is why: As human beings evolved, the brains that eventually gave rise to self-consciousness expanded tremendously. This produced an evolutionary arms race between fetal head and female pelvis. The female graciously widened her hips, almost to the point where running would no longer be possible. The baby, for his part, allowed himself to be born more than a year early, compared to other mammals of his size, and evolved a semi-collapsible head. This was and is a painful adjustment for both. The essentially fetal baby is almost completely dependent on his mother for everything during that first year. The programmability of his massive brain means that he must be trained until he is eighteen (or thirty) before being pushed out of the nest. This is to say nothing of the woman’s consequential pain in childbirth, and high risk of death for mother and infant alike. This all means that women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
Jordan B. Peterson (12 Rules for Life: An Antidote to Chaos)
Because now mental health disorders have gone “mainstream”. And for all the good it’s brought people like me who have been given therapy and stuff, there’s a lot of bad it’s brought too. Because now people use the phrase OCD to describe minor personality quirks. “Oooh, I like my pens in a line, I’m so OCD.” NO YOU’RE FUCKING NOT. “Oh my God, I was so nervous about that presentation, I literally had a panic attack.” NO YOU FUCKING DIDN’T. “I’m so hormonal today. I just feel totally bipolar.” SHUT UP, YOU IGNORANT BUMFACE. Told you I got angry. These words – words like OCD and bipolar – are not words to use lightly. And yet now they’re everywhere. There are TV programmes that actually pun on them. People smile and use them, proud of themselves for learning them, like they should get a sticker or something. Not realizing that if those words are said to you by a medical health professional, as a diagnosis of something you’ll probably have for ever, they’re words you don’t appreciate being misused every single day by someone who likes to keep their house quite clean. People actually die of bipolar, you know? They jump in front of trains and tip down bottles of paracetamol and leave letters behind to their devastated families because their bullying brains just won’t let them be for five minutes and they can’t bear to live with that any more. People also die of cancer. You don’t hear people going around saying: “Oh my God, my headache is so, like, tumoury today.” Yet it’s apparently okay to make light of the language of people’s internal hell
Holly Bourne
So which theory did Lagos believe in? The relativist or the universalist?" "He did not seem to think there was much of a difference. In the end, they are both somewhat mystical. Lagos believed that both schools of thought had essentially arrived at the same place by different lines of reasoning." "But it seems to me there is a key difference," Hiro says. "The universalists think that we are determined by the prepatterned structure of our brains -- the pathways in the cortex. The relativists don't believe that we have any limits." "Lagos modified the strict Chomskyan theory by supposing that learning a language is like blowing code into PROMs -- an analogy that I cannot interpret." "The analogy is clear. PROMs are Programmable Read-Only Memory chips," Hiro says. "When they come from the factory, they have no content. Once and only once, you can place information into those chips and then freeze it -- the information, the software, becomes frozen into the chip -- it transmutes into hardware. After you have blown the code into the PROMs, you can read it out, but you can't write to them anymore. So Lagos was trying to say that the newborn human brain has no structure -- as the relativists would have it -- and that as the child learns a language, the developing brain structures itself accordingly, the language gets 'blown into the hardware' and becomes a permanent part of the brain's deep structure -- as the universalists would have it." "Yes. This was his interpretation." "Okay. So when he talked about Enki being a real person with magical powers, what he meant was that Enki somehow understood the connection between language and the brain, knew how to manipulate it. The same way that a hacker, knowing the secrets of a computer system, can write code to control it -- digital nam-shubs?" "Lagos said that Enki had the ability to ascend into the universe of language and see it before his eyes. Much as humans go into the Metaverse. That gave him power to create nam-shubs.
And nam-shubs had the power to alter the functioning of the brain and of the body." "Why isn't anyone doing this kind of thing nowadays? Why aren't there any nam-shubs in English?" "Not all languages are the same, as Steiner points out. Some languages are better at metaphor than others. Hebrew, Aramaic, Greek, and Chinese lend themselves to word play and have achieved a lasting grip on reality: Palestine had Qiryat Sefer, the 'City of the Letter,' and Syria had Byblos, the 'Town of the Book.' By contrast other civilizations seem 'speechless' or at least, as may have been the case in Egypt, not entirely cognizant of the creative and transformational powers of language. Lagos believed that Sumerian was an extraordinarily powerful language -- at least it was in Sumer five thousand years ago." "A language that lent itself to Enki's neurolinguistic hacking." "Early linguists, as well as the Kabbalists, believed in a fictional language called the tongue of Eden, the language of Adam. It enabled all men to understand each other, to communicate without misunderstanding. It was the language of the Logos, the moment when God created the world by speaking a word. In the tongue of Eden, naming a thing was the same as creating it. To quote Steiner again, 'Our speech interposes itself between apprehension and truth like a dusty pane or warped mirror. The tongue of Eden was like a flawless glass; a light of total understanding streamed through it. Thus Babel was a second Fall.' And Isaac the Blind, an early Kabbalist, said that, to quote Gershom Scholem's translation, 'The speech of men is connected with divine speech and all language whether heavenly or human derives from one source: the Divine Name.' The practical Kabbalists, the sorcerers, bore the title Ba'al Shem, meaning 'master of the divine name.'" "The machine language of the world," Hiro says.
Neal Stephenson (Snow Crash)
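The write-once behaviour of the PROM chips that Hiro describes above can be sketched as a toy model. This is a hypothetical illustration, not from the novel; the class and method names are invented:

```python
class PROM:
    """Toy model of a Programmable Read-Only Memory chip:
    blank from the factory, writable once, then frozen."""

    def __init__(self):
        self._data = None  # no content from the factory

    def burn(self, bits):
        # Once and only once: a second burn is rejected.
        if self._data is not None:
            raise RuntimeError("PROM already burned; it is read-only now")
        self._data = tuple(bits)  # freeze the contents

    def read(self):
        return self._data


chip = PROM()
chip.burn([1, 0, 1, 1])
print(chip.read())  # reading back works: (1, 0, 1, 1)

try:
    chip.burn([0, 0, 0, 0])  # writing again does not
except RuntimeError as err:
    print(err)
```

Once `burn` has run, the contents can only be read out, which is the property Lagos maps onto language acquisition.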
Grades can also be profoundly unfair, especially for students who are unable to keep up, because the level of the exams usually increases from week to week. Let’s take the analogy of video games. When you discover a new game, you initially have no idea how to progress effectively. Above all, you don’t want to be constantly reminded of how bad you are! That’s why video game designers start with extremely easy levels, where you are almost sure to win. Very gradually, the difficulty increases and, with it, the risk of failure and frustration—but programmers know how to mitigate this by mixing the easy with the difficult, and by leaving you free to retry the same level as many times as you need. You see your score steadily increase . . . and finally, the joyous day comes when you successfully pass the final level, where you were stuck for so long. Now compare this with the report cards of “bad” students: they start the year off with a bad grade, and instead of motivating them by letting them take the same test again until they pass, the teacher gives them a new exercise every week, almost always beyond their abilities. Week after week, their “score” hovers around zero. In the video game market, such a design would be a complete disaster. All too often, schools use grades as punishments.
Stanislas Dehaene (How We Learn: Why Brains Learn Better Than Any Machine . . . for Now)
I deal in information," he says to the smarmy, toadying pseudojournalist who "interviews" him. He's sitting in his office in Houston, looking slicker than normal. "All television going out to Consumers throughout the world goes through me. Most of the information transmitted to and from the CIC database passes through my networks. The Metaverse -- the entire Street -- exists by virtue of a network that I own and control. "But that means, if you'll just follow my reasoning for a bit, that when I have a programmer working under me who is working with that information, he is wielding enormous power. Information is going into his brain. And it's staying there. It travels with him when he goes home at night. It gets all tangled up into his dreams, for Christ's sake. He talks to his wife about it. And, goddamn it, he doesn't have any right to that information. If I was running a car factory, I wouldn't let workers drive the cars home or borrow tools. But that's what I do at five o'clock each day, all over the world, when my hackers go home from work. "When they used to hang rustlers in the old days, the last thing they would do is piss their pants. That was the ultimate sign, you see, that they had lost control over their own bodies, that they were about to die. See, it's the first function of any organization to control its own sphincters. We're not even doing that. So we're working on refining our management techniques so that we can control that information no matter where it is -- on our hard disks or even inside the programmers' heads. Now, I can't say more because I got competition to worry about. But it is my fervent hope that in five or ten years, this kind of thing won't even be an issue.
Neal Stephenson (Snow Crash)
Lagos, typically for a nonbusinessman, had a fatal flaw: he thought too small. He figured that with a little venture capital, this neurolinguistic hacking could be developed as a new technology that would enable Rife to maintain possession of information that had passed into the brains of his programmers. Which, moral considerations aside, wasn't a bad idea. "Rife likes to think big. He immediately saw that this idea could be much more powerful. He took Lagos's idea and told Lagos himself to buzz off. Then he started dumping a lot of money into Pentecostal churches. He took a small church in Bayview, Texas, and built it up into a university. He took a small-time preacher, the Reverend Wayne Bedford, and made him more important than the Pope. He constructed a string of self-supporting religious franchises all over the world, and used his university, and its Metaverse campus, to crank out tens of thousands of missionaries, who fanned out all over the Third World and began converting people by the hundreds of thousands, just like St. Louis Bertrand. L. Bob Rife's glossolalia cult is the most successful religion since the creation of Islam. They do a lot of talking about Jesus, but like many self-described Christian churches, it has nothing to do with Christianity except that they use his name. It's a postrational religion. "He also wanted to spread the biological virus as a promoter or enhancer of the cult, but he couldn't really get away with doing that through the use of cult prostitution because it is flagrantly anti-Christian. But one of the major functions of his Third World missionaries was to go out into the hinterlands and vaccinate people -- and there was more than just vaccine in those needles. "Here in the First World, everyone has already been vaccinated, and we don't let religious fanatics come up and poke needles into us. But we do take a lot of drugs. 
So for us, he devised a means for extracting the virus from human blood serum and packaged it as a drug known as Snow Crash.
Neal Stephenson (Snow Crash)
Computers speak machine language," Hiro says. "It's written in ones and zeroes -- binary code. At the lowest level, all computers are programmed with strings of ones and zeroes. When you program in machine language, you are controlling the computer at its brainstem, the root of its existence. It's the tongue of Eden. But it's very difficult to work in machine language because you go crazy after a while, working at such a minute level. So a whole Babel of computer languages has been created for programmers: FORTRAN, BASIC, COBOL, LISP, Pascal, C, PROLOG, FORTH. You talk to the computer in one of these languages, and a piece of software called a compiler converts it into machine language. But you never can tell exactly what the compiler is doing. It doesn't always come out the way you want. Like a dusty pane or warped mirror. A really advanced hacker comes to understand the true inner workings of the machine -- he sees through the language he's working in and glimpses the secret functioning of the binary code -- becomes a Ba'al Shem of sorts." "Lagos believed that the legends about the tongue of Eden were exaggerated versions of true events," the Librarian says. "These legends reflected nostalgia for a time when people spoke Sumerian, a tongue that was superior to anything that came afterward." "Is Sumerian really that good?" "Not as far as modern-day linguists can tell," the Librarian says. "As I mentioned, it is largely impossible for us to grasp. Lagos suspected that words worked differently in those days. If one's native tongue influences the physical structure of the developing brain, then it is fair to say that the Sumerians -- who spoke a language radically different from anything in existence today -- had fundamentally different brains from yours. Lagos believed that for this reason, Sumerian was a language ideally suited to the creation and propagation of viruses. 
That a virus, once released into Sumer, would spread rapidly and virulently, until it had infected everyone." "Maybe Enki knew that also," Hiro says. "Maybe the nam-shub of Enki wasn't such a bad thing. Maybe Babel was the best thing that ever happened to us.
Neal Stephenson (Snow Crash)
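Hiro's point above, that high-level languages are translated down to "ones and zeroes" by a compiler, can be seen directly in Python, whose standard `dis` module disassembles a function into the low-level instructions the interpreter actually runs. Python bytecode is a stand-in here, not true machine code, but the layering is the same idea:

```python
import dis


def add(a, b):
    return a + b


# The high-level source above compiles down to a handful of
# low-level instructions -- the interpreter's "machine language".
dis.dis(add)

# Beneath those mnemonics sit raw bytes: the ones and zeroes themselves.
print(list(add.__code__.co_code))
```

The exact instruction names vary between Python versions, which echoes Hiro's complaint that "you never can tell exactly what the compiler is doing."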
I WANT TO end this list by talking a little more about the founding of Pixar University and Elyse Klaidman’s mind-expanding drawing classes in particular. Those first classes were such a success—of the 120 people who worked at Pixar then, 100 enrolled—that we gradually began expanding P.U.’s curriculum. Sculpting, painting, acting, meditation, belly dancing, live-action filmmaking, computer programming, design and color theory, ballet—over the years, we have offered free classes in all of them. This meant spending not only the time to find the best outside teachers but also the real cost of freeing people up during their workday to take the classes. So what exactly was Pixar getting out of all of this? It wasn’t that the class material directly enhanced our employees’ job performance. Instead, there was something about an apprentice lighting technician sitting alongside an experienced animator, who in turn was sitting next to someone who worked in legal or accounting or security—that proved immensely valuable. In the classroom setting, people interacted in a way they didn’t in the workplace. They felt free to be goofy, relaxed, open, vulnerable. Hierarchy did not apply, and as a result, communication thrived. Simply by providing an excuse for us all to toil side by side, humbled by the challenge of sketching a self-portrait or writing computer code or taming a lump of clay, P.U. changed the culture for the better. It taught everyone at Pixar, no matter their title, to respect the work that their colleagues did. And it made us all beginners again. Creativity involves missteps and imperfections. I wanted our people to get comfortable with that idea—that both the organization and its members should be willing, at times, to operate on the edge. I can understand that the leaders of many companies might wonder whether or not such classes would truly be useful, worth the expense. And I’ll admit that these social interactions I describe were an unexpected benefit. 
But the purpose of P.U. was never to turn programmers into artists or artists into belly dancers. Instead, it was to send a signal about how important it is for every one of us to keep learning new things. That, too, is a key part of remaining flexible: keeping our brains nimble by pushing ourselves to try things we haven’t tried before. That’s what P.U. lets our people do, and I believe it makes us stronger.
Ed Catmull (Creativity, Inc.: an inspiring look at how creativity can - and should - be harnessed for business success by the founder of Pixar)
Christopher Cerf has been composing songs for Sesame Street for twenty-five years. His large Manhattan townhouse is full of Sesame Street memorabilia – photographs of Christopher with his arm around Big Bird, etc. ‘Well, it’s certainly not what I expected when I wrote them,’ Christopher said. ‘I have to admit, my first reaction was, “Oh my gosh, is my music really that terrible?” ’ I laughed. ‘I once wrote a song for Bert and Ernie called “Put Down The Ducky”,’ he said, ‘which might be useful for interrogating members of the Ba’ath Party.’ ‘That’s very good,’ I said. ‘This interview,’ Christopher said, ‘has been brought to you by the letters W, M and D.’ ‘That’s very good,’ I said. We both laughed. I paused. ‘And do you think that the Iraqi prisoners, as well as giving away vital information, are learning new letters and numbers?’ I said. ‘Well, wouldn’t that be an incredible double win?’ said Christopher. Christopher took me upstairs to his studio to play me one of his Sesame Street compositions, called ‘Ya! Ya! Das Is a Mountain!’ ‘The way we do Sesame Street,’ he explained, ‘is that we have educational researchers who test whether these songs are working, whether the kids are learning. And one year they asked me to write a song to explain what a mountain is, and I wrote a silly yodelling song about what a mountain was.’ Christopher sang me a little of the song: Oompah-pah! Oompah-pah! Ya! Ya! Das is a mountain! Part of zee ground zat sticks way up high! ‘Anyway,’ he said, ‘forty per cent of the kids had known what a mountain was before they heard the song, and after they heard the song, only about twenty-six per cent knew what a mountain was. That’s all they needed. You don’t know what a mountain is now, right? It’s gone! So I figure if I have the power to suck information out of people’s brains by writing these songs, maybe that’s something that could be useful to the CIA for brainwashing techniques.’ Just then, Christopher’s phone rang. 
It was a lawyer from his music publishers, BMI. I listened into Christopher’s side of the conversation: ‘Oh really?’ he said. ‘I see . . . Well, theoretically they have to log that and I should be getting a few cents for every prisoner, right? Okay. Bye, bye . . .’ ‘What was that about?’ I asked Christopher. ‘Whether I’m due some money for the performance royalties,’ he explained. ‘Why not? It’s an American thing to do. If I have the knack of writing songs that can drive people crazy sooner and more effectively than others, why shouldn’t I profit from that?’ This is why, later that day, Christopher asked Danny Epstein – who has been the music supervisor of Sesame Street since the very first programme was broadcast in July 1969 – to come to his house. It would be Danny’s responsibility to collect the royalties from the military if they proved negligent in filing a music-cue sheet.
Jon Ronson (The Men Who Stare At Goats)
Pericles’ speech is not only a programme. It is also a defence, and perhaps even an attack. It reads, as I have already hinted, like a direct attack on Plato. I do not doubt that it was directed, not only against the arrested tribalism of Sparta, but also against the totalitarian ring or ‘link’ at home; against the movement for the paternal state, the Athenian ‘Society of the Friends of Laconia’ (as Th. Gomperz called them in 1902). The speech is the earliest and at the same time perhaps the strongest statement ever made in opposition to this kind of movement. Its importance was felt by Plato, who caricatured Pericles’ oration half a century later in the passages of the Republic in which he attacks democracy, as well as in that undisguised parody, the dialogue called Menexenus or the Funeral Oration. But the Friends of Laconia whom Pericles attacked retaliated long before Plato. Only five or six years after Pericles’ oration, a pamphlet on the Constitution of Athens was published by an unknown author (possibly Critias), now usually called the ‘Old Oligarch’. This ingenious pamphlet, the oldest extant treatise on political theory, is, at the same time, perhaps the oldest monument of the desertion of mankind by its intellectual leaders. It is a ruthless attack upon Athens, written no doubt by one of her best brains. Its central idea, an idea which became an article of faith with Thucydides and Plato, is the close connection between naval imperialism and democracy. And it tries to show that there can be no compromise in a conflict between two worlds, the worlds of democracy and of oligarchy; that only the use of ruthless violence, of total measures, including the intervention of allies from outside (the Spartans), can put an end to the unholy rule of freedom. 
This remarkable pamphlet was to become the first of a practically infinite sequence of works on political philosophy which were to repeat more or less, openly or covertly, the same theme down to our own day. Unwilling and unable to help mankind along their difficult path into an unknown future which they have to create for themselves, some of the ‘educated’ tried to make them turn back into the past. Incapable of leading a new way, they could only make themselves leaders of the perennial revolt against freedom. It became the more necessary for them to assert their superiority by fighting against equality as they were (using Socratic language) misanthropists and misologists—incapable of that simple and ordinary generosity which inspires faith in men, and faith in human reason and freedom. Harsh as this judgement may sound, it is just, I fear, if it is applied to those intellectual leaders of the revolt against freedom who came after the Great Generation, and especially after Socrates. We can now try to see them against the background of our historical interpretation.
Karl Popper (The Open Society and Its Enemies)
Here’s something you may not know: every time you go to Facebook or ESPN.com or wherever, you’re unleashing a mad scramble of money, data, and pixels that involves undersea fiber-optic cables, the world’s best database technologies, and everything that is known about you by greedy strangers. Every. Single. Time. The magic of how this happens is called “real-time bidding” (RTB) exchanges, and we’ll get into the technical details before long. For now, imagine that every time you go to CNN.com, it’s as though a new sell order for one share in your brain is transmitted to a stock exchange. Picture it: individual quanta of human attention sold, bit by bit, like so many million shares of General Motors stock, billions of times a day. Remember Spear, Leeds & Kellogg, Goldman Sachs’s old-school brokerage acquisition, and its disappearing (or disappeared) traders? The company went from hundreds of traders and two programmers to twenty programmers and two traders in a few years. That same process was just starting in the media world circa 2009, and is right now, in 2016, kicking into high gear. As part of that shift, one of the final paroxysms of wasted effort at Adchemy was taking place precisely in the RTB space. An engineer named Matthew McEachen, one of Adchemy’s best, and I built an RTB bidding engine that talked to Google’s huge ad exchange, the figurative New York Stock Exchange of media, and submitted bids and ads at speeds of upwards of one hundred thousand requests per second. We had been ordered to do so only to feed some bullshit line Murthy was laying on potential partners that we were a real-time ads-buying company. Like so much at Adchemy, that technology would be a throwaway, but the knowledge I gained there, from poring over Google’s RTB technical documentation and passing Google’s merciless integration tests with our code, would set me light-years ahead of the clueless product team at Facebook years later.
Antonio García Martínez (Chaos Monkeys: Obscene Fortune and Random Failure in Silicon Valley)
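The "real-time bidding" exchange described above runs, in essence, one sealed-bid auction per page view. A minimal sketch of a single auction round, using the classic second-price rule many exchanges historically applied (bidder names, prices, and the floor value are invented for illustration):

```python
# One RTB auction round: a page view triggers a bid request; demand-side
# platforms respond with CPM bids; a second-price rule picks the winner.
# Bidder names and prices below are invented for illustration.
def run_auction(bids, floor=0.10):
    """bids: dict of bidder -> CPM bid. Returns (winner, clearing_price)."""
    valid = {b: p for b, p in bids.items() if p >= floor}
    if not valid:
        return None, 0.0
    ranked = sorted(valid.items(), key=lambda kv: kv[1], reverse=True)
    winner, _top = ranked[0]
    # Second-price: the winner pays the runner-up's bid (or the floor).
    price = ranked[1][1] if len(ranked) > 1 else floor
    return winner, price

winner, price = run_auction({"dsp_a": 2.50, "dsp_b": 1.75, "dsp_c": 0.05})
print(winner, price)  # → dsp_a 1.75
```

A real exchange does this billions of times a day, with a latency budget of roughly 100 milliseconds per request; the sorting and bookkeeping above is the easy part.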
The aim of the next nine sections will be to present careful arguments to show that none of the loopholes (a), (b), and (c) can provide a plausible way to evade the contradiction of the robot. Accordingly, it, and we also, are driven to the unpalatable (d), if we are still insistent that mathematical understanding can be reduced to computation. I am sure that those concerned with artificial intelligence would find (d) to be as unpalatable as I find it to be. It provides perhaps a conceivable standpoint-essentially the A/D suggestion, referred to at the end of 1.3, whereby divine intervention is required for the implanting of an unknowable algorithm into each of our computer brains (by 'the best programmer in the business'). In any case, the conclusion 'unknowable'-for the very mechanisms that are ultimately responsible for our intelligence-would not be a very happy conclusion for those hoping actually to construct a genuinely artificially intelligent robot! It would not be a particularly happy conclusion, either, for those of us who hope to understand, in principle and in a scientific way, how human intelligence has actually arisen, in accordance with comprehensible scientific laws, such as those of physics, chemistry, biology, and natural selection-irrespective of any desire to reproduce such intelligence in a robot device. In my own opinion, such a pessimistic conclusion is not warranted, for the very reason that 'scientific comprehensibility' is a very different thing from 'computability'. The conclusion should be not that the underlying laws are incomprehensible, but that they are non-computable.
Roger Penrose (Shadows of the Mind: A Search for the Missing Science of Consciousness)
One such usage is the turtle module (which is also part of the standard library). To quote the Python docs: Turtle graphics is a popular way for introducing programming to kids. It was part of the original Logo programming language developed by Wally Feurzig and Seymour Papert in 1966. Programmers
Paul Barry (Head First Python: A Brain-Friendly Guide)
Those who support the concept of a direct interface network don’t take into account the price they would have to pay for it, in privacy and safety and a thousand other areas of concern. Do you really want a machine to know where you are every minute of the day? Do you really trust the people who design these things, and program them, enough to let their work directly into your head? Don’t you realize that every time you let this creature come in contact with your brain, you are leaving your mark upon it as clearly as fingerprints upon glass, which any clever programmer can decipher? MAXWELL ONEGIN; Think Again! (Historical Archives, Hellsgate Station)
C.S. Friedman (This Alien Shore (The Outworlds series Book 1))
The earliest people to report porn-related problems in online forums were typically computer programmers and information-technology specialists. They had acquired high-speed internet porn ahead of the pack
Gary Wilson (Your Brain On Porn: Internet Pornography and the Emerging Science of Addiction)
Skills are taught experientially—meaning that students studying AI don’t have their heads buried in books. In order to learn, they need lexical databases, image libraries, and neural nets. For a time, one of the more popular neural nets at universities was called Word2vec, and it was built by the Google Brain team. It was a two-layer system that processed text, turning words into numbers that AI could understand.17 For example, it learned that “man is to king as woman is to queen.” But the database also decided that “father is to doctor as mother is to nurse” and “man is to computer programmer as woman is to homemaker.”18 The very system students were exposed to was itself biased. If someone wanted to analyze the farther-reaching implications of sexist code, there weren’t any classes where that learning could take place.
Amy Webb (The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity)
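The Word2vec analogies quoted above ("man is to king as woman is to queen") come from simple vector arithmetic over learned word embeddings. A toy sketch of the mechanism, with invented 3-dimensional vectors standing in for the hundreds of dimensions a real model learns:

```python
import math

# Toy "embeddings" -- the values are invented for illustration only.
vectors = {
    "king":   [0.9, 0.8, 0.1],
    "queen":  [0.9, 0.1, 0.8],
    "man":    [0.5, 0.9, 0.0],
    "woman":  [0.5, 0.1, 0.9],
    "doctor": [0.8, 0.6, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the vector b - a + c."""
    target = [vb - va + vc
              for va, vb, vc in zip(vectors[a], vectors[b], vectors[c])]
    # Nearest remaining word by cosine similarity is the answer.
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # → queen
```

The bias Webb describes arises the same way: if the training text pairs "mother" with "nurse" more often than with "doctor", the arithmetic faithfully reproduces that association.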
More radically, how can we be sure that the source of consciousness lies within our bodies at all? You might think that because a blow to the head renders one unconscious, the ‘seat of consciousness’ must lie within the skull. But there is no logical reason to conclude that. An enraged blow to my TV set during an unsettling news programme may render the screen blank, but that doesn’t mean the news reader is situated inside the television. A television is just a receiver: the real action is miles away in a studio. Could the brain be merely a receiver of ‘consciousness signals’ created somewhere else? In Antarctica, perhaps? (This isn’t a serious suggestion – I’m just trying to make a point.) In fact, the notion that somebody or something ‘out there’ may ‘put thoughts in our heads’ is a pervasive one; Descartes himself raised this possibility by envisaging a mischievous demon messing with our minds. Today, many people believe in telepathy. So the basic idea that minds are delocalized is actually not so far-fetched. In fact, some distinguished scientists have flirted with the idea that not all that pops up in our minds originates in our heads. A popular, if rather mystical, idea is that flashes of mathematical inspiration can occur by the mathematician’s mind somehow ‘breaking through’ into a Platonic realm of mathematical forms and relationships that not only lies beyond the brain but beyond space and time altogether. The cosmologist Fred Hoyle once entertained an even bolder hypothesis: that quantum effects in the brain leave open the possibility of external input into our thought processes and thus guide us towards useful scientific concepts. He proposed that this ‘external guide’ might be a superintelligence in the far cosmic future using a subtle but well-known backwards-in-time property of quantum mechanics in order to steer scientific progress.
Paul Davies (The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life)
When an airplane navigates through the sky it works its way along a route composed of beacons and waypoints – invisible signposts in the sky – which are defined by geographic coordinates. They constitute the pilot’s map of the world. Flight computers are programmed into these waypoints which are put into the systems before take-off. Assuming these coordinates have been programmed correctly, the plane will go from point A, passing through the designated waypoints, before arriving at point B without a hitch. However, if any of these waypoints are wrong, the aircraft will deviate from its flight programme and its destination which can prove fatal. Life for each of us contains thousands of waypoints; signposts that hopefully provide us with directions as to what to do, how to go about things and where to go next – our decision-making processes. But what happens when our own onboard computer, our brain, has initially been programmed with data that is corrupt and socially unacceptable. How are we able to make life decisions – correct decisions that is?
Christopher Berry-Dee (Inside the Mind of Jeffrey Dahmer: The Cannibal Killer)
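The waypoint model above can be made concrete: a route is an ordered list of geographic coordinates, and the flight computer's job reduces to flying each leg in turn. A sketch, with invented coordinates, using the standard haversine formula for great-circle leg lengths:

```python
import math

# A route as an ordered list of waypoints (name, lat, lon in degrees).
# Coordinates below are invented for illustration.
ROUTE = [
    ("DEP",  40.64, -73.78),
    ("WPT1", 41.98, -87.90),
    ("ARR",  33.94, -118.41),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def route_length(route):
    """Total distance flown: the sum of each consecutive leg."""
    return sum(haversine_km(a[1], a[2], b[1], b[2])
               for a, b in zip(route, route[1:]))

print(round(route_length(ROUTE)), "km")
```

Berry-Dee's point follows directly: corrupt one coordinate in `ROUTE` and every leg after it is flown toward the wrong place, with no error message.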
Part of the difference is that the video gamers Gentile studied were adolescents. It’s unusual for adults to experience serious negative consequences from playing video games. Adolescent brains, however, have not yet fully developed, so adolescents may act like adults with brain damage. The biggest difference in the adolescent brain is in the frontal lobes, which don’t completely develop until their early twenties. That’s a problem because it’s the frontal lobes that give adults good judgment. They act like a brake, warning us when we’re about to do something that might not be such a good idea. Without fully functioning frontal lobes, adolescents act impulsively, and are at greater risk of making unwise decisions, even when they know better. There’s more to it than that, though. Video games are more complex than slot machines, so there are more opportunities for programmers to bake in features that trigger dopamine release in order to make it hard to stop playing.
Daniel Z. Lieberman (The Molecule of More: How a Single Chemical in Your Brain Drives Love, Sex, and Creativity―and Will Determine the Fate of the Human Race)
According to many experts the majority of the people won't be needed anymore for the coming society. Almost everything will be done by artificial intelligence, including self-driving cars and trucks, which already exist anyway. Some even mentioned that AI is making universities obsolete by how fast it can produce information. However, In my view, the AI has limitations that the many can't see, because on a brain to brain comparison, the AI always wins, yet the AI can only compute with programmable data. In other words, the AI can think like a human but can't imagine or create a future. The AI is always codependent on the imagination of its user. So the limitations of the AI are in fact determined by humans. It is not bad that we have AI but that people have no idea of how to use it apart from replacing their mental faculties and being lazy. This is actually why education has always been a scam. The AI will simply remove that from the way. But knowledge will still require analysis and input of information, so the AI doesn't really replace the necessary individuals of the academic world, but merely the many useless ones that keep copying and plagiarizing old ideas to justify and validate a worth they don't truly possess. Being afraid and paranoid about these transitions doesn't make sense because evolution can't be stopped, only delayed. The problem at the moment has more to do with those who want to keep themselves in power by force and profiting from the transitions. The level of consciousness of humanity is too low for what is happening, which is why people are easily deceived. Consequently, there will be more anger, fear, and frustration, because for the mind that is fixed on itself, change is perceived as chaos. The suffering is then caused by emotional attachments, stubbornness and the paranoid fixation on using outdated systems and not knowing how to adapt properly. 
In essence, AI is a problem for the selfish mind - rooted in cognitive rationalizations -, but an opportunity of great value for the self-reflective mind - capable of a metacognitive analysis. And the reason why nobody seems to understand this is precisely because, until now, everyone separated the mind from the spirit, while not knowing how a spiritual ascension actually goes through the mind. And this realization, obviously, will turn all religions obsolete too. Some have already come to this conclusion, and they are the ones who are ready.
Dan Desmarques
That evolution should select for larger brains may seem to us like, well, a no-brainer. We are so enamoured of our high intelligence that we assume that when it comes to cerebral power, more must be better. But if that were the case, the feline family would also have produced cats who could do calculus, and frogs would by now have launched their own space programme. Why are giant brains so rare in the animal kingdom? The fact is that a jumbo brain is a jumbo drain on the body. It’s not easy to carry around, especially when encased inside a massive skull. It’s even harder to fuel. In Homo sapiens, the brain accounts for about 2–3 per cent of total body weight, but it consumes 25 per cent of the body’s energy when the body is at rest. By comparison, the brains of other apes require only 8 per cent of rest-time energy. Archaic humans paid for their large brains in two ways. Firstly, they spent more time in search of food. Secondly, their muscles atrophied. Like a government diverting money from defence to education, humans diverted energy from biceps to neurons. It’s hardly a foregone conclusion that this is a good strategy for survival on the savannah. A chimpanzee can’t win an argument with a Homo sapiens, but the ape can rip the man apart like a rag doll.
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
You have to be an optimist to believe in the Singularity,” she says, “and that’s harder than it seems. Have you ever played Maximum Happy Imagination?” “Sounds like a Japanese game show.” Kat straightens her shoulders. “Okay, we’re going to play. To start, imagine the future. The good future. No nuclear bombs. Pretend you’re a science fiction writer.” Okay: “World government … no cancer … hover-boards.” “Go further. What’s the good future after that?” “Spaceships. Party on Mars.” “Further.” “Star Trek. Transporters. You can go anywhere.” “Further.” I pause a moment, then realize: “I can’t.” Kat shakes her head. “It’s really hard. And that’s, what, a thousand years? What comes after that? What could possibly come after that? Imagination runs out. But it makes sense, right? We probably just imagine things based on what we already know, and we run out of analogies in the thirty-first century.” I’m trying hard to imagine an average day in the year 3012. I can’t even come up with a half-decent scene. Will people live in buildings? Will they wear clothes? My imagination is almost physically straining. Fingers of thought are raking the space behind the cushions, looking for loose ideas, finding nothing. “Personally, I think the big change is going to be our brains,” Kat says, tapping just above her ear, which is pink and cute. “I think we’re going to find different ways to think, thanks to computers. You expect me to say that”—yes—“but it’s happened before. It’s not like we have the same brains as people a thousand years ago.” Wait: “Yes we do.” “We have the same hardware, but not the same software. Did you know that the concept of privacy is, like, totally recent? And so is the idea of romance, of course.” Yes, as a matter of fact, I think the idea of romance just occurred to me last night. (I don’t say that out loud.) “Each big idea like that is an operating system upgrade,” she says, smiling. Comfortable territory. “Writers are responsible for some of it. 
They say Shakespeare invented the internal monologue.” Oh, I am very familiar with the internal monologue. “But I think the writers had their turn,” she says, “and now it’s programmers who get to upgrade the human operating system.” I am definitely talking to a girl from Google. “So what’s the next upgrade?” “It’s already happening,” she says. “There are all these things you can do, and it’s like you’re in more than one place at one time, and it’s totally normal. I mean, look around.” I swivel my head, and I see what she wants me to see: dozens of people sitting at tiny tables, all leaning into phones showing them places that don’t exist and yet are somehow more interesting than the Gourmet Grotto. “And it’s not weird, it’s not science fiction at all, it’s…” She slows down a little and her eyes dim. I think she thinks she’s getting too intense. (How do I know that? Does my brain have an app for that?) Her cheeks are flushed and she looks great with all her blood right there at the surface of her skin. “Well,” she says finally, “it’s just that I think the Singularity is totally reasonable to imagine.
Robin Sloan (Mr. Penumbra's 24-Hour Bookstore (Mr. Penumbra's 24-Hour Bookstore, #1))
There is even a view, not uncommonly expressed, that might best be regarded as a combination of A and D (or perhaps B and D)-a possibility that will actually feature significantly in our later deliberations. According to this view, the brain's action is indeed that of a computer, but it is a computer of such wonderful complexity that its imitation is beyond the wit of man and science, being necessarily a divine creation of God-the 'best programmer in the business'!
Roger Penrose (Shadows of the Mind: A Search for the Missing Science of Consciousness)
I’m not a heroic hacker with magic code. I was a brain in a box. But I am a programmer. Or was. And I knew the system. I knew the software. And I had a plan. And a little bit of time before anyone was going to bother me again. So I got to work.
John Scalzi (The End of All Things (Old Man's War, #6))
THINK OF THE WAY a stretch of grass becomes a road. At first, the stretch is bumpy and difficult to drive over. A crew comes along and flattens the surface, making it easier to navigate. Then, someone pours gravel. Then tar. Then a layer of asphalt. A steamroller smooths it; someone paints lines. The final surface is something an automobile can traverse quickly. Gravel stabilizes, tar solidifies, asphalt reinforces, and now we don’t need to build our cars to drive over bumpy grass. And we can get from Philadelphia to Chicago in a single day. That’s what computer programming is like. Like a highway, computers are layers on layers of code that make them increasingly easy to use. Computer scientists call this abstraction. A microchip—the brain of a computer, if you will—is made of millions of little transistors, each of whose job is to turn on or off, either letting electricity flow or not. Like tiny light switches, a bunch of transistors in a computer might combine to say, “add these two numbers,” or “make this part of the screen glow.” In the early days, scientists built giant boards of transistors, and manually switched them on and off as they experimented with making computers do interesting things. It was hard work (and one of the reasons early computers were enormous). Eventually, scientists got sick of flipping switches and poured a layer of virtual gravel that let them control the transistors by punching in 1s and 0s. 1 meant “on” and 0 meant “off.” This abstracted the scientists from the physical switches. They called the 1s and 0s machine language. Still, the work was agonizing. It took lots of 1s and 0s to do just about anything. And strings of numbers are really hard to stare at for hours. So, scientists created another abstraction layer, one that could translate more scrutable instructions into a lot of 1s and 0s. 
This was called assembly language and it made it possible that a machine language instruction that looks like this: 10110000 01100001 could be written more like this: MOV AL, 61h which looks a little less robotic. Scientists could write this code more easily. Though if you’re like me, it still doesn’t look fun. Soon, scientists engineered more layers, including a popular language called C, on top of assembly language, so they could type in instructions like this: printf(“Hello World”); C translates that into assembly language, which translates into 1s and 0s, which translates into little transistors popping open and closed, which eventually turn on little dots on a computer screen to display the words, “Hello World.” With abstraction, scientists built layers of road which made computer travel faster. It made the act of using computers faster. And new generations of computer programmers didn’t need to be actual scientists. They could use high-level language to make computers do interesting things.* When you fire up a computer, open up a Web browser, and buy a copy of this book online for a friend (please do!), you’re working within a program, a layer that translates your actions into code that another layer, called an operating system (like Windows or Linux or MacOS), can interpret. That operating system is probably built on something like C, which translates to Assembly, which translates to machine language, which flips on and off a gaggle of transistors. (Phew.) So, why am I telling you this? In the same way that driving on pavement makes a road trip faster, and layers of code let you work on a computer faster, hackers like DHH find and build layers of abstraction in business and life that allow them to multiply their effort. I call these layers platforms.
Shane Snow (Smartcuts: The Breakthrough Power of Lateral Thinking)
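Snow's layering can be demonstrated in a few lines: start from a single transistor-like primitive, build gates from it, an adder from the gates, and whole-number addition from the adder. Each layer forgets the one beneath it. A sketch (the layer boundaries are chosen for illustration):

```python
# Layer 0: one "transistor-like" primitive -- NAND, from which every
# other logic gate can be built.
def nand(a, b):
    return 0 if (a and b) else 1

# Layer 1: familiar gates, expressed only in terms of NAND.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# Layer 2: a full adder -- adds two bits plus a carry bit.
def full_adder(a, b, carry):
    s = xor(xor(a, b), carry)
    c = or_(and_(a, b), and_(carry, xor(a, b)))
    return s, c

# Layer 3: ripple-carry addition over whole numbers. At this level we
# never mention gates at all -- that is the abstraction.
def add(x, y, bits=8):
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(19, 23))  # → 42
```

Calling `add(19, 23)` quietly flips dozens of simulated "switches", just as `printf("Hello World")` ultimately does on real hardware.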
THE SUPERMEMO MODEL HOW TO REMEMBER EVERYTHING YOU HAVE EVER LEARNED Long-term memory has two components: retrievability and stability. Retrievability determines how easily we remember something, and depends on how near the surface of our consciousness the information is ‘swimming’. Stability, on the other hand, is to do with how deeply information is anchored in our brains. Some memories have a high level of stability but a low level of retrievability. Try to recall one of your old phone numbers – you probably won’t be able to. But if you see the number in front of you, you will recognise it immediately. Imagine that you are learning Chinese. You have learned a word and memorised it. Without practice, over time it will become increasingly difficult to remember. The amount of time it takes for you to forget it completely can be calculated, and ideally you should be reminded of the word precisely when you are in the process of forgetting it. The more often you are reminded of the word, the longer you will remember it for. This learning programme is called Super-Memo and was developed by the Polish researcher Piotr Woźniak. It’s not what you know, it’s what you remember. Jan Cox After learning something, you should ideally refresh your memory of it at the following intervals: one, ten, thirty and sixty days afterwards.
Mikael Krogerus (The Decision Book: Fifty Models for Strategic Thinking (The Tschäppeler and Krogerus Collection))
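The schedule in the quote (review one, ten, thirty and sixty days after learning) is easy to turn into code. A minimal sketch of the fixed-interval case (real SuperMemo adjusts the intervals per item based on recall performance; the example date is invented):

```python
from datetime import date, timedelta

# The review intervals named in the text: 1, 10, 30 and 60 days after learning.
INTERVALS = [1, 10, 30, 60]

def review_schedule(learned_on, intervals=INTERVALS):
    """Return the dates on which an item should be reviewed."""
    return [learned_on + timedelta(days=d) for d in intervals]

for when in review_schedule(date(2024, 1, 1)):
    print(when.isoformat())
# → 2024-01-02, 2024-01-11, 2024-01-31, 2024-03-01
```

Each successful review would, in a full implementation, stretch the next interval further out, matching the idea that reminders should arrive just as forgetting begins.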
You see, programmers tend to be arrogant, self-absorbed introverts. We didn’t get into this business because we like people. Most of us got into programming because we prefer to deeply focus on sterile minutia, juggle lots of concepts simultaneously, and in general prove to ourselves that we have brains the size of a planet, all while not having to interact with the messy complexities of other people. Yes,
Robert C. Martin (Clean Coder, The: A Code of Conduct for Professional Programmers (Robert C. Martin Series))
Why are we rejecting explicit word-based interfaces, and embracing graphical or sensorial ones—a trend that accounts for the success of both Microsoft and Disney? Part of it is simply that the world is very complicated now—much more complicated than the hunter-gatherer world that our brains evolved to cope with—and we simply can’t handle all of the details. We have to delegate. We have no choice but to trust some nameless artist at Disney or programmer at Apple or Microsoft to make a few choices for us, close off some options, and give us a conveniently packaged executive summary.
Neal Stephenson (In the Beginning...Was the Command Line)
Despite being roughly twice as many characters, it requires a fraction of the mental effort when you read it
Felienne Hermans (The Programmer's Brain)
Imagine another possibility – suppose you could back up your brain to a portable hard drive and then run it on your laptop. Would your laptop be able to think and feel just like a Sapiens? If so, would it be you or someone else? What if computer programmers could create an entirely new but digital mind, composed of computer code, complete with a sense of self, consciousness and memory? If you ran the program on your computer, would it be a person? If you deleted it could you be charged with murder?
Yuval Noah Harari (Sapiens: A Brief History of Humankind)
dating question -What do you want from this world? -To have a wardrobe. In his first meeting with Katrina, she asked him a dating question, and his answer was unconventional, he wished he could buy a wardrobe, in which he put his belongings, a metaphor for the instability in his life, so how does he do this, while he is without a homeland, without a home, moving from place to another, carrying a bag containing a few of his personal belongings. About to cheat on Khadija, the curiosity in the intelligence man’s mind overpowered him, the desire for knowledge, exploration, information, and a thirst for more details, the smallest details. Plan the process with the mentality of a computer programmer, “I will leave them a loophole in the system, they will hack me through it, and to do this they have to open their doors to send their code, and at this very moment, I am sending my code in the opposite direction. The most vulnerable account devices to hack are the hackers themselves. They enter the systems through special ports, which are opened to them by the so-called Trojan horse, a type of virus, with which they target the victim, open loopholes for them, infiltrate through them, and in both cases, they, in turn, have to open ports on their devices to complete the connection, from which they can be hacked backward. Katrina is a Trojan horse, he will not close the ports in front of her, she must succeed in penetrating him, and she will be his bridge connecting them, he will sneak through her, to the most secret and terrifying place in the world, a journey that leads him to the island of Malta, to enter the inevitable den. This is how the minds of investigators and intelligence men work, they must open the outlets of their minds to the fullest, to collect information, receive it, and deal with it, and that is why their minds are the most vulnerable to penetration, manipulation, and passing misleading information to them. 
It is almost impossible to convince a simple man, that there is life outside the planet, the outlets of his mind are closed, he is not interested in knowledge, nor is he collecting information, and the task of entering him is difficult, they call him the mind of the crocodile, a mind that is solid, closed, does not affect anything and is not affected by anything, He has his own convictions, he never changes them. While scientists, curious, intellectuals, investigators, and intelligence men, the ports of their minds are always open. And just as hackers can penetrate websites by injecting their URL addresses with programming phrases, they can implant their code into the website’s database, and pull information from it. The minds of such people can also be injected, with special codes, some of them have their minds ready for injection, and one or two injections are sufficient to prepare for the next stage, and for some, dozens of injections are not enough, and some of them injected their minds themselves, by meditation, thinking, and focusing on details, as Ruslan did. Khadija did not need more than three injections, but he trusted the love that brought them together, there is no need, she knew a lot about him in advance, and she will trust him and believe him. Her mind would not be able to get her away, or so he wished, the woman’s madness had not been given its due. What he is about to do now, and the revenge videos that she is going to receive will remain in her head forever, and will be her brain’s weapon to escape, when he tries to get her out of the box. From an early age, he did not enjoy safety and stability, he lived in the midst of hurricanes of chaos, and the heart of randomness. He became the son of shadows and their master. He deserved the nickname he called himself “Son of Chaos.
Ahmad I. AlKhalel (Zero Moment: Do not be afraid, this is only a passing novel and will end (Son of Chaos Book 1))
Let the spell collapse. Something inside him relaxed, like a ghost limb untensed; a mind-trick. The spell, the brain's equivalent of some tiny, crude, looping sub-programme collapsed, simply ceased to be said.
Iain M. Banks (The Player of Games (Culture, #2))
A lot of the psychological insights [in Parabellum] stemmed from my personal experiences. For example, I was a college athlete, so I could imagine what the ex-athlete was going through when her sports career ended. I'm a little emotionally detached (which helps in a field like forensics where I can see some unpleasant things), so I could identify with the detachment of the programmer. I tried to take my experiences and push them a little farther to develop characters with more serious psychological issues. And I read several memoirs to get a sense of what it feels like to live with depression, PTSD, brain trauma, etc.
Greg Hickey (Parabellum)
Every craftsman starts his or her journey with a basic set of good-quality tools. A woodworker might need rules, gauges, a couple of saws, some good planes, fine chisels, drills and braces, mallets, and clamps. These tools will be lovingly chosen, will be built to last, will perform specific jobs with little overlap with other tools, and, perhaps most importantly, will feel right in the budding woodworker's hands. Then begins a process of learning and adaptation. Each tool will have its own personality and quirks, and will need its own special handling. Each must be sharpened in a unique way, or held just so. Over time, each will wear according to use, until the grip looks like a mold of the woodworker's hands and the cutting surface aligns perfectly with the angle at which the tool is held. At this point, the tools become conduits from the craftsman's brain to the finished product—they have become extensions of his or her hands. Over time, the woodworker will add new tools, such as biscuit cutters, laser-guided miter saws, dovetail jigs—all wonderful pieces of technology. But you can bet that he or she will be happiest with one of those original tools in hand, feeling the plane sing as it slides through the wood.
Andrew Hunt (The Pragmatic Programmer)
So the key is repetition of these positive visualisations. It takes repeating and practising something twenty-one times over twenty-one days for the brain to create new neurological habit pathways, and in turn it takes twenty-one days for those pathways to begin to diminish if you cease the actions. Various studies have shown that it takes the brain a minimum of ten days and a maximum of twenty-one days to let go of an old belief and replace it with another one. Twenty-one days of affirmations and new habits of language and
Marisa Peer (You Can Be Thin: The Ultimate Programme to End Dieting...Forever)
Good and Bad Habits Habits can be compared to macros in an Excel sheet. If we have tasks we wish to repeat in multiple cells, we can record a macro to automate them and quickly apply the set of actions to selected cells. Habits are like macros in the brain. On receiving the given cue, the brain automatically performs the actions of its programming. However, there is a catch. The recorded macro does not care whether it was correctly designed or not. If correct, it saves time through automated processes. But if the macro itself is wrong, we end up with a messed-up Excel sheet. Likewise, habits too programme the brain for our benefit or harm. Here is an anecdotal tale about habits.
Swami Mukundananda (The Science of Mind Management)
Computers, as any programmer will tell you, are giant morons, not giant brains.
Arthur Samuel
Both the chef and the journalist can focus intensely when they are doing something that genuinely interests them. The rest of the time, focusing is a challenge. I have heard similar stories from other people with ADHD who work as programmers, car mechanics and marketing execs. They all describe facing an uphill battle in school and how, once they found their calling, they could suddenly put their hyperfocus to good use—something they had previously only experienced in front of the computer or TV. When they are truly interested, they don’t just move up one extra gear but two. Suddenly, their ability to concentrate is even better than that of a person without ADHD. The individuals I’m describing haven’t become successful in their professions despite having ADHD; they have become successful thanks to it.
Anders Hansen (Unlocking the ADHD Advantage: Why Your Brain Being Wired Differently Is Your Superpower)