Data Quality Quotes

We've searched our database for all the quotes and captions related to Data Quality. Here they are! All 100 of them:

Pilots used to fly planes manually, but now they operate a dashboard with the help of computers. This has made flying safer and improved the industry. Healthcare can benefit from the same type of approach, with physicians practicing medicine with the help of data, dashboards, and AI. This will improve the quality of care they provide and make their jobs easier and more efficient.
Ronald M. Razmi (AI Doctor: The Rise of Artificial Intelligence in Healthcare - A Guide for Users, Buyers, Builders, and Investors)
A change in Quantity also entails a change in Quality
Friedrich Engels
Every business can benefit from good quality management consulting services. Consultants are able to gather, assemble and utilize data in unique ways. Consultants also have perspectives that are likely to be unique compared to the perspectives you find internal to your business.
Hendrith Vanlon Smith Jr.
The five-star scale doesn’t really exist for humans; it exists for data aggregation systems, which is why it did not become standard until the internet era. Making conclusions about a book’s quality from a 175-word review is hard work for artificial intelligences, whereas star ratings are ideal for them.
John Green (The Anthropocene Reviewed: Essays on a Human-Centered Planet)
Many people object to “wasting money in space” yet have no idea how much is actually spent on space exploration. The CSA’s budget, for instance, is less than the amount Canadians spend on Halloween candy every year, and most of it goes toward things like developing telecommunications satellites and radar systems to provide data for weather and air quality forecasts, environmental monitoring and climate change studies. Similarly, NASA’s budget is not spent in space but right here on Earth, where it’s invested in American businesses and universities, and where it also pays dividends, creating new jobs, new technologies and even whole new industries.
Chris Hadfield (An Astronaut's Guide to Life on Earth)
The Qur'an follows on from the two Revelations that preceded it and is not only free from contradictions in its narrations, the sign of the various human manipulations to be found in the Gospels, but provides a quality all of its own for those who examine it objectively and in the light of science i.e. its complete agreement with modern scientific data.
Maurice Bucaille (The Bible, the Qur'an, and Science: The Holy Scriptures Examined in the Light of Modern Knowledge)
Perception requires imagination because the data people encounter in their lives are never complete and always equivocal. For example, most people consider that the greatest evidence of an event one can obtain is to see it with their own eyes, and in a court of law little is held in more esteem than eyewitness testimony. Yet if you asked to display for a court a video of the same quality as the unprocessed data captured on the retina of a human eye, the judge might wonder what you were trying to put over. For one thing, the view will have a blind spot where the optic nerve attaches to the retina. Moreover, the only part of our field of vision with good resolution is a narrow area of about 1 degree of visual angle around the retina’s center, an area the width of our thumb as it looks when held at arm’s length. Outside that region, resolution drops off sharply. To compensate, we constantly move our eyes to bring the sharper region to bear on different portions of the scene we wish to observe. And so the pattern of raw data sent to the brain is a shaky, badly pixilated picture with a hole in it. Fortunately the brain processes the data, combining input from both eyes, filling in gaps on the assumption that the visual properties of neighboring locations are similar and interpolating. The result - at least until age, injury, disease, or an excess of mai tais takes its toll - is a happy human being suffering from the compelling illusion that his or her vision is sharp and clear. We also use our imagination and take shortcuts to fill gaps in patterns of nonvisual data. As with visual input, we draw conclusions and make judgments based on uncertain and incomplete information, and we conclude, when we are done analyzing the patterns, that our “picture” is clear and accurate. But is it?
Leonard Mlodinow (The Drunkard's Walk: How Randomness Rules Our Lives)
Focus groups for business projects are kind of the same. In this, you are going to collect data from a group of your customers or potential customers rather than your friends.
Pooja Agnihotri (Market Research Like a Pro)
Although the method is simple, it shows how, mathematically, random brute force can overcome precise logic. It's a numerical approach that uses quantity to derive quality.
Liu Cixin (The Three-Body Problem (Remembrance of Earth’s Past, #1))
So the 185 billion events to be enjoyed over our mortal days might be either an overestimate or an underestimate. If we consider the amount of data the brain could theoretically process, the number might be too low; but if we look at how people actually use their minds, it is definitely much too high. In any case, an individual can experience only so much. Therefore, the information we allow into consciousness becomes extremely important; it is, in fact, what determines the content and the quality of life.
Mihály Csíkszentmihályi (Flow: The Psychology of Optimal Experience)
This suggests that our bonding mechanisms depend on our own perception of the other and that therefore our ability to bond with them depends much more on emotional settings than on abstract "humanlike" qualities. For the same reason, it is the very emotionality Commander Data from Star Trek displays every time it complains about having no emotions that endears it; an emotionless machine would not constantly raise the issues of its own worth, value, and personhood.
Anne Foerst (God in the Machine: What Robots Teach Us About Humanity and God)
the track of Quality preselects what data we’re going to be conscious of, and it makes this selection in such a way as to best harmonize what we are with what we are becoming.
Robert M. Pirsig (Zen and the Art of Motorcycle Maintenance)
Quality without science and research is absurd. You can't make inferences that something works when you have 60 percent missing data.
Peter Pronovost (Safe Patients, Smart Hospitals: How One Doctor's Checklist Can Help Us Change Health Care from the Inside Out)
Data is the blood of any organization; coming from everywhere, used everywhere, connecting all the body, transferring messages and when analyzed it reflects the whole picture of the body.
Khalid Abulmajd
A dispassionate conceptual development of the typology of violence must by definition ignore its traumatic impact. Yet there is a sense in which a cold analysis of violence somehow reproduces and participates in its horror. A distinction needs to be made, as well, between (factual) truth and truthfulness: what renders a report of a raped woman (or any other narrative of a trauma) truthful is its very factual unreliability, its confusion, its inconsistency. If the victim were able to report on her painful and humiliating experience in a clear manner, with all the data arranged in a consistent order, this very quality would make us suspicious of its truth.
Slavoj Žižek (Violence: Six Sideways Reflections)
Specifically, Kahan identified “scientific curiosity.” That’s different from scientific literacy. The two qualities are correlated, of course, but there are curious people who know rather little about science (yet), and highly trained people with little appetite to learn more.
Tim Harford (The Data Detective: Ten Easy Rules to Make Sense of Statistics)
Leyner's fiction is, in this regard, an eloquent reply to Gilder's prediction that our TV-culture problems can be resolved by the dismantling of images into discrete chunks we can recombine as we fancy. Leyner's world is a Gilder-esque dystopia. The passivity and schizoid decay still endure for Leyner in his characters' reception of images and waves of data. The ability to combine them only adds a layer of disorientation: when all experience can be deconstructed and reconfigured, there become simply too many choices. And in the absence of any credible, noncommercial guides for living, the freedom to choose is about as "liberating" as a bad acid trip: each quantum is as good as the next, and the only standard of an assembly's quality is its weirdness, incongruity, its ability to stand out from a crowd of other image-constructs and wow some Audience.
David Foster Wallace
As CEO, you should have an opinion on absolutely everything. You should have an opinion on every forecast, every product plan, every presentation, and even every comment. Let people know what you think. If you like someone’s comment, give her the feedback. If you disagree, give her the feedback. Say what you think. Express yourself. This will have two critically important positive effects:   Feedback won’t be personal in your company. If the CEO constantly gives feedback, then everyone she interacts with will just get used to it. Nobody will think, “Gee, what did she really mean by that comment? Does she not like me?” Everybody will naturally focus on the issues, not an implicit random performance evaluation.   People will become comfortable discussing bad news. If people get comfortable talking about what each other are doing wrong, then it will be very easy to talk about what the company is doing wrong. High-quality company cultures get their cue from data networking routing protocols: Bad news travels fast and good news travels slowly. Low-quality company cultures take on the personality of the Wicked Witch of the West in The Wiz: “Don’t nobody bring me no bad news.”
Ben Horowitz (The Hard Thing About Hard Things: Building a Business When There Are No Easy Answers)
Completeness of crucial information is extremely important, as missing data is not only a cost issue but is also a massive lost opportunity issue...
Rupa Mahanti (Data Quality: Dimensions, Measurement, Strategy, Management, and Governance)
“Make a habit of discussing a problem on the basis of the data and respecting the facts shown by them.” - Dr Kaoru Ishikawa
Suresh Lulla (Quality Fables: High density nuggets on vision, change, innovation, and problem solving.)
He said sometimes when you're young you have to think about things, because you're forming your value-sets and you keep coming up with Data Insufficient and finding holes in your programs. So you keep trying to do a fix on your sets. And the more powerful your mind is and the more intense your concentration is, the worse damage you can do to yourself, which is why, Justin says, Alphas always have trouble and some of them go way off and out-there, and why almost all Alphas are eccentric. But he says the best thing you can do if you're too bright for your own good is what the Testers do, be aware where you got which idea, keep a tab on everything, know how your ideas link up with each other and with your deep-sets and value-sets, so when you're forty or fifty or a hundred forty and you find something that doesn't work, you can still find all the threads and pull them. But that's not real easy unless you know what your value-sets are, and most CITs don't. CITs have a trouble with not wanting to know that kind of thing. Because some of them are real eetee once you get to thinking about how they link. Especially about sex and ego-nets. Justin says inflexibility is a trap and most Alpha types are inward-turned because they process so fast they're gone and thinking before a Gamma gets a sentence out. Then they get in the habit of thinking they thought of everything, but they don't remember everything stems from input. You may have a new idea, but it stems from input somebody gave you, and that could be wrong or your senses could have been lying to you. He says it can be an equipment-quality problem or a program-quality problem, but once an Alpha takes a falsehood for true, it's a personal problem.
C.J. Cherryh (Cyteen (Cyteen, #1-3))
The algorithms of superintelligence will change the world in a positive way, but trouncing human beings will not be possible due to emotions, empathy, social interactions, reproduction, and mortality, which are the qualities that belong to humans only.
Enamul Haque
the consequences of scientific illiteracy are far more dangerous in our time than in any that has come before. It’s perilous and foolhardy for the average citizen to remain ignorant about global warming, say, or ozone depletion, air pollution, toxic and radioactive wastes, acid rain, topsoil erosion, tropical deforestation, exponential population growth. Jobs and wages depend on science and technology. If our nation can’t manufacture, at high quality and low price, products people want to buy, then industries will continue to drift away and transfer a little more prosperity to other parts of the world. Consider the social ramifications of fission and fusion power, supercomputers, data “highways,” abortion, radon, massive reductions in strategic weapons, addiction, government eavesdropping on the lives of its citizens, high-resolution TV, airline and airport safety, fetal tissue transplants, health costs, food additives, drugs to ameliorate mania or depression or schizophrenia, animal rights, superconductivity, morning-after pills, alleged hereditary antisocial predispositions, space stations, going to Mars, finding cures for AIDS and cancer. How can we affect national policy—or even make intelligent decisions in our own lives—if we don’t grasp the underlying issues?
Carl Sagan (The Demon-Haunted World: Science as a Candle in the Dark)
As we’ve gone along, I’ve pointed out that a warm childhood relationship with his mother—not maternal education—was significantly related to a man’s verbal test scores, to high salary, to class rank at Harvard, and to military rank at the end of World War II. At the men’s twenty-fifth reunion, it looked, to my surprise, as though the quality of a man’s relationship with his mother had little effect on overall midlife adjustment. However, forty-five years later, to my surprise again, the data suggested that there was a significant positive correlation between the quality of one’s maternal relationship and the absence of cognitive decline. At age ninety, 33 percent of the men with poor maternal relationships, and only 13 percent of men with warm relationships, suffered from dementia.
George E. Vaillant (Triumphs of Experience: The Men of the Harvard Grant Study)
Every successful brand stands for something more than itself, and that thing is emotional. A great brand promises hope, the contagion of coolness, or desirability, or love, or romance, or acceptance, or luxury, or youth, or sophistication, or high-quality technology.
Martin Lindstrom (Small Data: The Tiny Clues That Uncover Huge Trends)
It is not that I am a genius or exceptionally gifted, not by any means. Quite the contrary. What happened (I shall try to explain it) is that every mind is shaped by its own experiences and memories and knowledge, and what makes it unique is the grand total and extremely personal nature of the collection of all the data that have made it what it is. Each person possesses a mind with powers that are, whether great or small, always unique, powers that belong to them alone. This renders them capable of carrying out a feat, whether grandiose or banal, that only they could have carried out. In this case, all others had failed because they had counted on the simple quantitative progression of intelligence and ingenuity, when what was required was an unspecified quantity, but of the appropriate quality, of both. My own intelligence is quite minimal, a fact that I have ascertained at great cost to myself. It has been just barely adequate to keep me afloat in the tempestuous waters of life. Yet, its quality is unique; not because I decided it would be, but rather because that is how it must be.
César Aira
Thanks in part to the resulting high score on the evaluation, he gets a longer sentence, locking him away for more years in a prison where he’s surrounded by fellow criminals—which raises the likelihood that he’ll return to prison. He is finally released into the same poor neighborhood, this time with a criminal record, which makes it that much harder to find a job. If he commits another crime, the recidivism model can claim another success. But in fact the model itself contributes to a toxic cycle and helps to sustain it. That’s a signature quality of a WMD.
Cathy O'Neil (Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy)
Drift. Down through deltas of former girlfriends, degrees of confirmation of girlfriendhood, personal sightings of Rez or Lo together with whichever woman in whatever public place, each account illuminated with the importance the event had held for whoever had posted it. This being for Laney the most peculiar aspect of this data, the perspective in which these two loomed. Human in every detail but then not so. Everything scrupulously, fanatically accurate, probably, but always assembled around the hollow armature of celebrity. He could see celebrity here, not like Kathy’s idea of a primal substance, but as a paradoxical quality inherent in the substance of the world. He saw that the quantity of data accumulated here by the band’s fans was much greater than everything the band themselves had ever generated. And their actual art, the music and the videos, was the merest fragment of that.
William Gibson (Idoru (Bridge, #2))
In this chapter I will describe the effects of the data deluge on all members of society generally and how it erodes the confidence, judgment, and decisiveness of leaders in particular. Then I will show the paradoxical side of the data deluge. Despite its anxiety-provoking effects, the proliferation of data also has an addictive quality. Leaders, healers, and parents “imbibe” data as a way of dealing with their own chronic anxiety. The pursuit of data, in almost any field, has come to resemble a form of substance abuse, accompanied by all the usual problems of addiction: self-doubt, denial, temptation, relapse, and withdrawal. Leadership training programs thus wind up in the codependent position of enablers, with publishers often in the role of “suppliers.” What does it take to get parents, healers, and managers, when they hear of the latest quick-fix fad that has just been published, to “just say no”?
Edwin H. Friedman (A Failure of Nerve: Leadership in the Age of the Quick Fix)
Indian children are more likely to be malnourished than children from Zimbabwe, Somalia and the Democratic Republic of Congo, Africa’s three poorest countries,1 and in Delhi, nearly 5 million school-aged children have irreversible lung damage from that city’s air quality, which is twice as bad as Beijing’s.2
Martin Lindstrom (Small Data: The Tiny Clues That Uncover Huge Trends)
Empirically, things are poignant, tragic, beautiful, humorous, settled, disturbed, comfortable, annoying, barren, harsh, consoling, splendid, fearful; are such immediately and in their own right and behalf.... These traits stand in themselves on precisely the same level as colours, sounds, qualities of contact, taste and smell. Any criterion that finds the latter to be ultimate and "hard" data will, impartially applied, come to the same conclusion about the former. -Any- quality as such is final; it is at once initial and terminal; just what it is as it exists. it may be referred to other things, it may be treated as an effect or a sign. But this involves an extraneous extension and use. It takes us beyond quality in its immediate qualitativeness.... The surrender of immediate qualities, sensory and significant, as objects of science, and as proper forms of classification and understanding, left in reality these immediate qualities just as they were; since they are -had- there is no need to -know- them. But... the traditional view that the object of knowledge is reality par excellence led to the conclusion that the object of science was preeminently metaphysically real. Hence, immediate qualities, being extended from the object of science, were left thereby hanging loose from the "real" object. Since their -existence- could not be denied, they were gathered together into a psychic realm of being, set over against the object of physics. Given this premise, all the problems regarding the relation of mind and matter, the psychic and the bodily, necessarily follow. Change the metaphysical premise; restore, that is to say, immediate qualities to their rightful position as qualities of inclusive situations, and the problems in question cease to be epistemological problems. They become specifiable scientific problems; questions, that is to say, of how such and such an event having such and such qualities actually occurs.
John Dewey (Experience and Nature)
The lack of meritocracy in academia is a problem that should concern all of us if we care about the quality of the research that comes out of the academy, because studies show that female academics are more likely than men to challenge male-default analysis in their work.53 This means that the more women who are publishing, the faster the gender data gap in research will close. And we should care about the quality of academic research.
Caroline Criado Pérez (Invisible Women: Exposing Data Bias in a World Designed for Men)
I like to ensure that I have music and art all around me. My personal favorite is old maps. What I love about old maps is that they are both beautiful and imperfect. These imperfections represent that some of the most talented in history were still very wrong (early cartography was very difficult). As the majority of my work is analysis and advisory, I find it a valuable reminder that my knowledge is limited. No matter how much data or insight I have, I can never fully “map out” any business. Yet, despite the incompleteness of these early cartographers, so much was learned of the world. So much done and accomplished. Therefore, these maps, or art pieces, serve as something to inspire both humility and achievement. This simple environmental factor helps my productivity and the overall quality of my work. Again, it’s like adding positive dice to my hand that are rolled each day.
Evan Thomsen (Don’t Chase The Dream Job, Build It: The unconventional guide to inventing your career and getting any job you want)
High-quality and transparent data, clearly documented, timely rendered, and publicly available are the sine qua non of competent public health management. During a pandemic, reliable and comprehensive data are critical for determining the behavior of the pathogen, identifying vulnerable populations, rapidly measuring the effectiveness of interventions, mobilizing the medical community around cutting-edge disease management, and inspiring cooperation from the public. The shockingly low quality of virtually all relevant data pertinent to COVID-19, and the quackery, the obfuscation, the cherrypicking and blatant perversion would have scandalized, offended, and humiliated every prior generation of American public health officials. Too often, Dr. Fauci was at the center of these systemic deceptions. The “mistakes” were always in the same direction—inflating the risks of coronavirus and the safety and efficacy of vaccines in
Robert F. Kennedy Jr. (The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health)
The master propagandist, like the advertising expert, avoids obvious emotional appeals and strives for a tone that is consistent with the prosaic quality of modern life—a dry, bland matter-of-factness. Nor does the propagandist circulate "intentionally biased" information. He knows that partial truths serve as more effective instruments of deception than lies. Thus he tries to impress the public with statistics of economic growth that neglect to give the base year from which growth is calculated, with accurate but meaningless facts about the standard of living—with raw and uninterpreted data, in other words, from which the audience is invited to draw the inescapable conclusion that things are getting better and the present régime therefore deserves the people's confidence, or on the other hand that things are getting worse so rapidly that the present régime should be given emergency powers to deal with the developing crisis.
Christopher Lasch (The Culture of Narcissism: American Life in An Age of Diminishing Expectations)
There's a myth that with age the vagina risks a "use it or lose it" scenario - meaning without the potency of a penis the vagina may shrink permanently. It's simply shocking (imagine my voice dripping in sarcasm) that there's no similar myth that the penis can avoid shrinkage with regular vaginal contact. The "use it or lose it" theory was based on lower quality data that wouldn't be accepted today. Loss of estrogen and age related changes are what affect a vagina: it's not a lament for the touch of a man.
Jennifer Gunter (The Menopause Manifesto: Own Your Health with Facts and Feminism)
In this zero-marginal-cost economy, the only way to make money is to scrape consumer data from your users and sell it to advertisers. In the creative world, nowhere does the fixed cost to produce high-quality music, video, books, and games get factored into this equation. How are musicians, journalists, photographers, and filmmakers going to survive in the zero-marginal-cost economy? For the media economy to continue, we are going to have to find ways to deal with the paradox that Summers and DeLong point to.
Jonathan Taplin (Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy)
Even at that early date, the basic building blocks of web search had been already set in stone. Search was a four-step process. First came a sweeping scan of all the world’s web pages, via a spider. Second was indexing the information drawn from the spider’s crawl and storing the data on racks of computers known as servers. The third step, triggered by a user’s request, identified the pages that seemed best suited to answer that query. That result was known as search quality. The final step involved formatting and delivering the results to the user.
Steven Levy (In the Plex: How Google Thinks, Works, and Shapes Our Lives)
The company’s problems with the FDA were far bigger than just the drugs intended for Africa. Ranbaxy had not properly tested the stability of almost any drugs on the U.S. market. The most basic good manufacturing practices require continuous monitoring of drug quality. Drug stability must be tested at intervals called “stations”: three months, six months, nine months, and so on. So long as a drug is on the market, that data has to be filed in an annual report with the FDA. One is never out of data, because obtaining it is simply part of the process.
Katherine Eban (Bottle of Lies: The Inside Story of the Generic Drug Boom)
As I came to understand, in retrospect, the magnetic quality that these works held for me, I came to understand that what motivated these men was not Earthly prizes or the respect of colleagues, but that they put their souls and minds on something and reached the extraordinary place where the mind could no longer produce data of the type that they wanted, and they were in the territory of inspiration, where their intuitions accelerated and they knew that there was something more than the realm of time and space and matter, something more than physical life.
Gary Zukav (The Seat of the Soul: 25th Anniversary Edition with a Study Guide)
The future for ancient DNA laboratories that I find appealing is based on a model that has emerged among radiocarbon dating laboratories. For example, the Oxford Radiocarbon Accelerator Unit processes large numbers of samples for a fee, and uses this income stream to support a factory that churns out routine dates and produces data more cheaply, efficiently, and at higher quality than would be possible if its scientists limited themselves to their own questions. But its scientists then piggyback on the juggernaut of the radiocarbon dating factory they have built to do cutting-edge science,
David Reich (Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past)
We have first to distinguish knowledge of things and knowledge of truths. In each there are two kinds, one immediate and one derivative. Our immediate knowledge of things, which we called acquaintance, consists of two sorts, according as the things known are particulars or universals. Among particulars, we have acquaintance with sense-data and (probably) with ourselves. Among universals, there seems to be no principle by which we can decide which can be known by acquaintance, but it is clear that among those that can be so known are sensible qualities, relations of space and time, similarity, and certain abstract logical universals. Our derivative knowledge of things, which we call knowledge by description, always involves both acquaintance with something and knowledge of truths. Our immediate knowledge of truths may be called intuitive knowledge, and the truths so known may be called self-evident truths. Among such truths are included those which merely state what is given in sense, and also certain abstract logical and arithmetical principles, and (though with less certainty) some ethical propositions. Our derivative knowledge of truths consists of everything that we can deduce from self-evident truths by the use of self-evident principles of deduction.
Bertrand Russell (The Problems of Philosophy)
Data show that the more patients actually know, the less they want of our treatments at the end of life. A study of 230 surrogate decision makers for patients on breathing machines demonstrated that the better the quality of clinician–family communication, the less life support was elected. Another study showed that people were less likely to want CPR after they learned what it actually entailed. Most people dramatically overestimate the likelihood of survival after CPR. When they learn the real numbers, they are less likely to want it by about 50 percent. In short, when people have a more robust understanding of the benefits and burdens of the treatment they are actually getting, they want less of it.
Jessica Nutik Zitter (Extreme Measures: Finding a Better Path to the End of Life)
I have found it frustrating at times that so few people know what the space program does and, as a result, are unaware that they benefit from it. Many people object to “wasting money in space” yet have no idea how much is actually spent on space exploration. The CSA’s budget, for instance, is less than the amount Canadians spend on Halloween candy every year, and most of it goes toward things like developing telecommunications satellites and radar systems to provide data for weather and air quality forecasts, environmental monitoring and climate change studies. Similarly, NASA’s budget is not spent in space but right here on Earth, where it’s invested in American businesses and universities, and where it also pays dividends, creating new jobs, new technologies and even whole new industries.
Chris Hadfield (An Astronaut's Guide to Life on Earth)
Joy and sadness, vivacity and obtuseness are data of introspection, and when we invest landscapes or other people with these states, it is because we have observed in ourselves the coincidence between these internal perceptions and the external signs associated with them by the accidents of our constitution. Perception thus impoverished becomes purely a matter of knowledge, a progressive noting down of qualities and of their most habitual distribution, and the perceiving subject approaches the world as the scientist approaches his experiments. If on the other hand we admit that all these ‘projections’, all these ‘associations’, all these ‘transferences’ are based on some intrinsic characteristic of the object, the ‘human world’ ceases to be a metaphor and becomes once more what it really is, the seat and as it were the homeland of our thoughts.
Maurice Merleau-Ponty (Phenomenology of Perception)
von Braun went looking for problems, hunches, and bad news. He even rewarded those who exposed problems. After Kranz and von Braun’s time, the “All Others Bring Data” process culture remained, but the informal culture and power of individual hunches shriveled. In 1974, William Lucas took over the Marshall Space Flight Center. A NASA chief historian wrote that Lucas was a brilliant engineer but “often grew angry when he learned of problems.” Allan McDonald described him to me as a “shoot-the-messenger type guy.” Lucas transformed von Braun’s Monday Notes into a system purely for upward communication. He did not write feedback and the notes did not circulate. At one point they morphed into standardized forms that had to be filled out. Monday Notes became one more rigid formality in a process culture. “Immediately, the quality of the notes fell,” wrote another official NASA historian.
David Epstein (Range: Why Generalists Triumph in a Specialized World)
Recent studies indicate that boys raised by women, including single women and lesbian couples, do not suffer in their adjustment; they are not appreciably less “masculine”; they do not show signs of psychological impairment. What many boys without fathers inarguably do face is a precipitous drop in their socioeconomic status. When families dissolve, the average standard of living for mothers and children can fall as much as 60 percent, while that of the man usually rises. When we focus on the highly speculative psychological effects of fatherlessness we draw away from concrete political concerns, like the role of increased poverty. Again, there are as yet no data suggesting that boys without fathers to model masculinity are necessarily impaired. Those boys who do have fathers are happiest and most well adjusted with warm, loving fathers, fathers who score high in precisely “feminine” qualities.
Terrence Real (I Don't Want to Talk About It: Overcoming the Secret Legacy of Male Depression)
The psychosis-inducing effects of synthetics offered one last, crucial piece of evidence about the risks of cannabis. And so, in January 2017, the National Academy of Medicine examined the thirty years of research that had begun with Sven Andréasson’s paper and declared the issue settled. “The association between cannabis use and development of a psychotic disorder is supported by data synthesized in several good-quality systematic reviews,” the NAM wrote. “The magnitude of this association is moderate to large and appears to be dose-dependent . . . The primary literature reviewed by the committee confirms the conclusions of the systematic reviews.” But almost no one noticed the National Academy report. The New York Times published an online summary of its findings—in May 2018, more than a year after it appeared. It has not changed the public policy debate around marijuana in the United States or perceptions of the safety of the drug.
Alex Berenson (Tell Your Children: The Truth About Marijuana, Mental Illness, and Violence)
I know that the consequences of scientific illiteracy are far more dangerous in our time than in any that has come before. It’s perilous and foolhardy for the average citizen to remain ignorant about global warming, say, or ozone depletion, air pollution, toxic and radioactive wastes, acid rain, topsoil erosion, tropical deforestation, exponential population growth. Jobs and wages depend on science and technology. If our nation can’t manufacture, at high quality and low price, products people want to buy, then industries will continue to drift away and transfer a little more prosperity to other parts of the world. Consider the social ramifications of fission and fusion power, supercomputers, data “highways,” abortion, radon, massive reductions in strategic weapons, addiction, government eavesdropping on the lives of its citizens, high-resolution TV, airline and airport safety, fetal tissue transplants, health costs, food additives, drugs to ameliorate mania or depression or schizophrenia, animal rights, superconductivity, morning-after pills, alleged hereditary antisocial predispositions, space stations, going to Mars, finding cures for AIDS and cancer. How can we affect national policy—or even make intelligent decisions in our own lives—if we don’t grasp the underlying issues? As I write, Congress is dissolving its own Office of Technology Assessment—the only organization specifically tasked to provide advice to the House and Senate on science and technology. Its competence and integrity over the years have been exemplary. Of the 535 members of the U.S. Congress, rarely in the twentieth century have as many as one percent had any significant background in science. The last scientifically literate President may have been Thomas Jefferson.* So how do Americans decide these matters? How do they instruct their representatives? Who in fact makes these decisions, and on what basis? —
Carl Sagan (The Demon-Haunted World: Science as a Candle in the Dark)
If we look at the way an industrial producer creates new products, we see a long list of trials and errors and eventually improvement in quality at a lower cost. Urban policies and strategies, by contrast, often do not follow this logic; they are often repeated even when it is well known that they failed. For instance, policies like rent control, greenbelts, new light rail transports, among others, are constantly repeated in spite of a near consensus on their failure to achieve their objectives. A quantitative evaluation of the failure of these policies is usually well documented through special reports or academic papers; it is seldom produced internally by cities, however, and the information does not seem to reach urban decision makers. Only a systematic analysis of data through indicators allows urban policies to be improved over time and failing policies to be abandoned. But as Angus Deaton wrote: 'without data, anyone who does anything is free to claim success.'
Alain Bertaud (Order without Design: How Markets Shape Cities)
Jacobs, whose faith in Moore’s Law was as strong as ever, thought a more complicated system of frequency-hopping would work better. Rather than keeping a given phone call on a certain frequency, he proposed moving call data between different frequencies, letting him cram more calls into available spectrum space. Most people thought he was right in theory, but that such a system would never work in practice. Voice quality would be low, they argued, and calls would be dropped. The amount of processing needed to move call data between frequencies and have it interpreted by a phone on the other end seemed enormous. Jacobs disagreed, founding a company called Qualcomm—Quality Communications—in 1985 to prove the point. He built a small network with a couple cell towers to prove it would work. Soon the entire industry realized Qualcomm’s system would make it possible to fit far more cell phone calls into existing spectrum space by relying on Moore’s Law to run the algorithms that make sense of all the radio waves bouncing around.
Chris Miller (Chip War: The Fight for the World's Most Critical Technology)
Imagine you're sitting having dinner in a restaurant. At some point during the meal, your companion leans over and whispers that they've spotted Lady Gaga eating at the table opposite. Before having a look for yourself, you'll no doubt have some sense of how much you believe your friend's theory. You'll take into account all of your prior knowledge: perhaps the quality of the establishment, the distance you are from Gaga's home in Malibu, your friend's eyesight. That sort of thing. If pushed, it's a belief that you could put a number on. A probability of sorts. As you turn to look at the woman, you'll automatically use each piece of evidence in front of you to update your belief in your friend's hypothesis. Perhaps the platinum-blonde hair is consistent with what you would expect from Gaga, so your belief goes up. But the fact that she's sitting on her own with no bodyguards isn't, so your belief goes down. The point is, each new observation adds to your overall assessment. This is all Bayes' theorem does: offers a systematic way to update your belief in a hypothesis on the basis of the evidence. It accepts that you can't ever be completely certain about the theory you are considering, but allows you to make a best guess from the information available. So, once you realize the woman at the table opposite is wearing a dress made of meat -- a fashion choice that you're unlikely to chance upon in the non-Gaga population -- that might be enough to tip your belief over the threshold and lead you to conclude that it is indeed Lady Gaga in the restaurant. But Bayes' theorem isn't just an equation for the way humans already make decisions. It's much more important than that. To quote Sharon Bertsch McGrayne, author of The Theory That Would Not Die: 'Bayes runs counter to the deeply held conviction that modern science requires objectivity and precision.'
By providing a mechanism to measure your belief in something, Bayes allows you to draw sensible conclusions from sketchy observations, from messy, incomplete and approximate data -- even from ignorance.
Hannah Fry (Hello World: Being Human in the Age of Algorithms)
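The updating process Fry describes can be sketched in a few lines of Python. Only the mechanics of Bayes' theorem come from the passage; the prior and the likelihood numbers below are invented purely for illustration.

```python
# A minimal sketch of sequential Bayesian updating, using the quote's
# Lady Gaga scenario. All probabilities here are made-up assumptions.

def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Apply Bayes' theorem: return P(hypothesis | new evidence)."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Prior belief that the diner is Lady Gaga (before looking).
belief = 0.01

# Each observation: (likelihood if it IS Gaga, likelihood if it is not).
observations = [
    (0.9, 0.1),    # platinum-blonde hair: belief goes up
    (0.3, 0.9),    # no bodyguards: belief goes down
    (0.4, 0.001),  # a dress made of meat: rare in the non-Gaga population
]

for p_h, p_not_h in observations:
    belief = update(belief, p_h, p_not_h)

print(round(belief, 3))  # → 0.924
```

Note how the meat dress, being wildly improbable under the "not Gaga" hypothesis, does most of the work of tipping the belief over the threshold, just as in the quote.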
When we mix a practical ability to engineer minds with our ignorance of the mental spectrum and with the narrow interests of governments, armies and corporations, we get a recipe for trouble. We may successfully upgrade our bodies and our brains, while losing our minds in the process. Indeed, techno-humanism may end up downgrading humans. The system may prefer downgraded humans not because they would possess any superhuman knacks, but because they would lack some really disturbing human qualities that hamper the system and slow it down. As any farmer knows, it’s usually the brightest goat in the flock that stirs up the most trouble, which is why the Agricultural Revolution involved downgrading animals’ mental abilities. The second cognitive revolution, dreamed up by techno-humanists, might do the same to us, producing human cogs who communicate and process data far more effectively than ever before, but who can hardly pay attention, dream or doubt. For millions of years we were enhanced chimpanzees. In the future, we may become oversized ants.
Yuval Noah Harari (Homo Deus: A Brief History of Tomorrow)
Galileo's mechanical world was only a partial representation of a finite number of probable worlds, each peculiar to a particular living species; and all these worlds are but a portion of the infinite number of possible worlds that may have once existed or may yet exist. But anything like a single world, common to all species, at all times, under all circumstances, is a purely hypothetical construction, drawn by inference from pathetically insufficient data, prized for the assurance of stability and intelligibility it gives, even though that assurance turns out, under severe examination, to be just another illusion. A butterfly or a beetle, a fish or a fowl, a dog or a dolphin, would have a different report to give even about primary qualities, for each lives in a world conditioned by the needs and environmental opportunities open to his species. In the gray visual world of the dog, smells, near and distant, subtle or violently exciting, probably play the part that colors do in man's world-though in the primal occupation of eating, the dog's world and man's world would approach each other more closely.
Lewis Mumford (The Pentagon of Power (The Myth of the Machine, Vol 2))
There was little effort to conceal this method of doing business. It was common knowledge, from senior managers and heads of research and development to the people responsible for formulation and the clinical people. Essentially, Ranbaxy’s manufacturing standards boiled down to whatever the company could get away with. As Thakur knew from his years of training, a well-made drug is not one that passes its final test. Its quality must be assessed at each step of production and lies in all the data that accompanies it. Each of those test results, recorded along the way, helps to create an essential roadmap of quality. But because Ranbaxy was fixated on results, regulations and requirements were viewed with indifference. Good manufacturing practices were stop signs and inconvenient detours. So Ranbaxy was driving any way it chose to arrive at favorable results, then moving around road signs, rearranging traffic lights, and adjusting mileage after the fact. As the company’s head of analytical research would later tell an auditor: “It is not in Indian culture to record the data while we conduct our experiments.”
Katherine Eban (Bottle of Lies: The Inside Story of the Generic Drug Boom)
Despite the advancements of systematic experimental pipelines, literature-curated protein-interaction data continue to be the primary data for investigation of focused biological mechanisms. Notwithstanding the variable quality of curated interactions available in public databases, the impact of inspection bias on the ability of literature maps to provide insightful information remains equivocal. The problems posed by inspection bias extend beyond mapping of protein interactions to the development of pharmacological agents and other aspects of modern biomedicine. Essentially the same 10% of the proteome is being investigated today as was being investigated before the announcement of completion of the reference genome sequence. One way forward, at least with regard to interactome mapping, is to continue the transition toward systematic and relatively unbiased experimental interactome mapping. With continued advancement of systematic protein-interaction mapping efforts, the expectation is that interactome 'deserts', the zones of the interactome space where biomedical knowledge researchers simply do not look for interactions owing to the lack of prior knowledge, might eventually become more populated. Efforts at mapping protein interactions will continue to be instrumental for furthering biomedical research.
Joseph Loscalzo (Network Medicine: Complex Systems in Human Disease and Therapeutics)
A few years ago my friend Jon Brooks supplied this great illustration of skewed interpretation at work. Here’s how investors react to events when they’re feeling good about life (which usually means the market has been rising):

Strong data: economy strengthening—stocks rally
Weak data: Fed likely to ease—stocks rally
Data as expected: low volatility—stocks rally
Banks make $4 billion: business conditions favorable—stocks rally
Banks lose $4 billion: bad news out of the way—stocks rally
Oil spikes: growing global economy contributing to demand—stocks rally
Oil drops: more purchasing power for the consumer—stocks rally
Dollar plunges: great for exporters—stocks rally
Dollar strengthens: great for companies that buy from abroad—stocks rally
Inflation spikes: will cause assets to appreciate—stocks rally
Inflation drops: improves quality of earnings—stocks rally

Of course, the same behavior also applies in the opposite direction. When psychology is negative and markets have been falling for a while, everything is capable of being interpreted negatively. Strong economic data is seen as likely to make the Fed withdraw stimulus by raising interest rates, and weak data is taken to mean companies will have trouble meeting earnings forecasts. In other words, it’s not the data or events; it’s the interpretation. And that fluctuates with swings in psychology.
Howard Marks (Mastering The Market Cycle: Getting the Odds on Your Side)
Many aspects of the modern financial system are designed to give an impression of overwhelming urgency: the endless ‘news’ feeds, the constantly changing screens of traders, the office lights blazing late into the night, the young analysts who find themselves required to work thirty hours at a stretch. But very little that happens in the finance sector has genuine need for this constant appearance of excitement and activity. Only its most boring part—the payments system—is an essential utility on whose continuous functioning the modern economy depends. No terrible consequence would follow if the stock market closed for a week (as it did in the wake of 9/11)—or longer, or if a merger were delayed or large investment project postponed for a few weeks, or if an initial public offering happened next month rather than this. The millisecond improvement in data transmission between New York and Chicago has no significance whatever outside the absurd world of computers trading with each other. The tight coupling is simply unnecessary: the perpetual flow of ‘information’ part of a game that traders play which has no wider relevance, the excessive hours worked by many employees a tournament in which individuals compete to display their alpha qualities in return for large prizes. The traditional bank manager’s culture of long lunches and afternoons on the golf course may have yielded more useful information about business than the Bloomberg terminal.
John Kay (Other People's Money: The Real Business of Finance)
Again you must learn the point which comes next. Every circle, of those which are by the act of man drawn or even turned on a lathe, is full of that which is opposite to the fifth thing. For everywhere it has contact with the straight. But the circle itself, we say, has nothing in either smaller or greater, of that which is its opposite. We say also that the name is not a thing of permanence for any of them, and that nothing prevents the things now called round from being called straight, and the straight things round; for those who make changes and call things by opposite names, nothing will be less permanent (than a name). Again with regard to the definition, if it is made up of names and verbal forms, the same remark holds that there is no sufficiently durable permanence in it. And there is no end to the instances of the ambiguity from which each of the four suffers; but the greatest of them is that which we mentioned a little earlier, that, whereas there are two things, that which has real being, and that which is only a quality, when the soul is seeking to know, not the quality, but the essence, each of the four, presenting to the soul by word and in act that which it is not seeking (i.e., the quality), a thing open to refutation by the senses, being merely the thing presented to the soul in each particular case whether by statement or the act of showing, fills, one may say, every man with puzzlement and perplexity. [...] But in subjects where we try to compel a man to give a clear answer about the fifth, any one of those who are capable of overthrowing an antagonist gets the better of us, and makes the man, who gives an exposition in speech or writing or in replies to questions, appear to most of his hearers to know nothing of the things on which he is attempting to write or speak; for they are sometimes not aware that it is not the mind of the writer or speaker which is proved to be at fault, but the defective nature of each of the four instruments. 
The process however of dealing with all of these, as the mind moves up and down to each in turn, does after much effort give birth in a well-constituted mind to knowledge of that which is well constituted. [...] Therefore, if men are not by nature kinship allied to justice and all other things that are honourable, though they may be good at learning and remembering other knowledge of various kinds-or if they have the kinship but are slow learners and have no memory-none of all these will ever learn to the full the truth about virtue and vice. For both must be learnt together; and together also must be learnt, by complete and long continued study, as I said at the beginning, the true and the false about all that has real being. After much effort, as names, definitions, sights, and other data of sense, are brought into contact and friction one with another, in the course of scrutiny and kindly testing by men who proceed by question and answer without ill will, with a sudden flash there shines forth understanding about every problem, and an intelligence whose efforts reach the furthest limits of human powers. Therefore every man of worth, when dealing with matters of worth, will be far from exposing them to ill feeling and misunderstanding among men by committing them to writing. In one word, then, it may be known from this that, if one sees written treatises composed by anyone, either the laws of a lawgiver, or in any other form whatever, these are not for that man the things of most worth, if he is a man of worth, but that his treasures are laid up in the fairest spot that he possesses. But if these things were worked at by him as things of real worth, and committed to writing, then surely, not gods, but men "have themselves bereft him of his wits".
Plato (The Letters)
..."facts" properly speaking are always and never more than interpretations of the data... the Gospel accounts are themselves such data or, if you like, hard facts. But the events to which the Gospels refer are not themselves "hard facts"; they are facts only in the sense that we interpret the text, together with such other data as we have, to reach a conclusion regarding the events as best we are able. They are facts in the same way that the verdict of a jury establishes the facts of the case, the interpretation of the evidence that results in the verdict delivered. Here it is as well to remember that historical methodology can only produce probabilities, the probability that some event took place in such circumstances being greater or smaller, depending on the quality of the data and the perspective of the historical enquirer. The jury which decides what is beyond reasonable doubt is determining that the probability is sufficiently high for a clear-cut verdict to be delivered. Those who like "certainty" in matters of faith will always find this uncomfortable. But faith is not knowledge of "hard facts"...; it is rather confidence, assurance, trust in the reliability of the data and in the integrity of the interpretations derived from that data... It does seem important to me that those who speak for evangelical Christians grasp this nettle firmly, even if it stings! – it is important for the intellectual integrity of evangelicals. Of course any Christian (and particularly evangelical Christians) will want to get as close as possible to the Jesus who ministered in Galilee in the late 20s of the first century. If, as they believe, God spoke in and through that man, more definitively and finally than at any other time and by any other medium, then of course Christians will want to hear as clearly as possible what he said, and to see as clearly as possible what he did, to come as close as possible to being an eyewitness and earwitness for themselves. 
If God revealed himself most definitively in the historical particularity of a Galilean Jew in the earliest decades of the Common Era, then naturally those who believe this will want to inquire as closely into the historical particularity and actuality of that life and of Jesus’ mission. The possibility that later faith has in some degree covered over that historical actuality cannot be dismissed as out of the question. So a genuinely critical historical inquiry is necessary if we are to get as close to the historical actuality as possible. Critical here, and this is the point, should not be taken to mean negatively critical, hermeneutical suspicion, dismissal of any material that has overtones of Easter faith. It means, more straightforwardly, a careful scrutiny of all the relevant data to gain as accurate or as historically responsible a picture as possible. In a day when evangelical, and even Christian, is often identified with a strongly right-wing, conservative and even fundamentalist attitude to the Bible, it is important that responsible evangelical scholars defend and advocate such critical historical inquiry and that their work display its positive outcome and benefits. These include believers growing in maturity • to recognize gray areas and questions to which no clear-cut answer can be given (‘we see in a mirror dimly/a poor reflection’), • to discern what really matters and distinguish them from issues that matter little, • and be able to engage in genuine dialogue with those who share or respect a faith inquiring after truth and seeking deeper understanding. In that way we may hope that evangelical (not to mention Christian) can again become a label that men and women of integrity and good will can respect and hope to learn from more than most seem to do today.
James D.G. Dunn (The Historical Jesus: Five Views)
This extreme situation in which all data is processed and all decisions are made by a single central processor is called communism. In a communist economy, people allegedly work according to their abilities, and receive according to their needs. In other words, the government takes 100 per cent of your profits, decides what you need and then supplies these needs. Though no country ever realised this scheme in its extreme form, the Soviet Union and its satellites came as close as they could. They abandoned the principle of distributed data processing, and switched to a model of centralised data processing. All information from throughout the Soviet Union flowed to a single location in Moscow, where all the important decisions were made. Producers and consumers could not communicate directly, and had to obey government orders. For instance, the Soviet economics ministry might decide that the price of bread in all shops should be exactly two roubles and four kopeks, that a particular kolkhoz in the Odessa oblast should switch from growing wheat to raising chickens, and that the Red October bakery in Moscow should produce 3.5 million loaves of bread per day, and not a single loaf more. Meanwhile the Soviet science ministry forced all Soviet biotech laboratories to adopt the theories of Trofim Lysenko – the infamous head of the Lenin Academy for Agricultural Sciences. Lysenko rejected the dominant genetic theories of his day. He insisted that if an organism acquired some new trait during its lifetime, this quality could pass directly to its descendants. This idea flew in the face of Darwinian orthodoxy, but it dovetailed nicely with communist educational principles. It implied that if you could train wheat plants to withstand cold weather, their progenies will also be cold-resistant. 
Lysenko accordingly sent billions of counter-revolutionary wheat plants to be re-educated in Siberia – and the Soviet Union was soon forced to import more and more flour from the United States.
Yuval Noah Harari (Homo Deus: A History of Tomorrow)
As Graedon scrutinized the FDA’s standards for bioequivalence and the data that companies had to submit, he found that generics were much less equivalent than commonly assumed. The FDA’s statistical formula that defined bioequivalence as a range—a generic drug’s concentration in the blood could not fall below 80 percent or rise above 125 percent of the brand name’s concentration, using a 90 percent confidence interval—still allowed for a potential outside range of 45 percent among generics labeled as being the same. Patients getting switched from one generic to another might be on the low end one day, the high end the next. The FDA allowed drug companies to use different additional ingredients, known as excipients, that could be of lower quality. Those differences could affect a drug’s bioavailability, the amount of drug potentially absorbed into the bloodstream. But there was another problem that really drew Graedon’s attention. Generic drug companies submitted the results of patients’ blood tests in the form of bioequivalence curves. The graphs consisted of a vertical axis called Cmax, which mapped the maximum concentration of drug in the blood, and a horizontal axis called Tmax, the time to maximum concentration. The resulting curve looked like an upside-down U. The FDA was using the highest point on that curve, peak drug concentration, to assess the rate of absorption into the blood. But peak drug concentration, the point at which the blood had absorbed the largest amount of drug, was a single number at one point in time. The FDA was using that point as a stand-in for “rate of absorption.” So long as the generic hit a similar peak of drug concentration in the blood as the brand name, it could be deemed bioequivalent, even if the two curves reflecting the time to that peak looked totally different. Two different curves indicated two entirely different experiences in the body, Graedon realized. 
The measurement of time to maximum concentration, the horizontal axis, was crucial for time-release drugs, which had not been widely available when the FDA first created its bioequivalence standard in 1992. That standard had not been meaningfully updated since then. “The time to Tmax can vary all over the place and they don’t give a damn,” Graedon emailed a reporter. That “seems pretty bizarre to us.” Though the FDA asserted that it wouldn’t approve generics with “clinically significant” differences in release rates, the agency didn’t disclose data filed by the companies, so it was impossible to know how dramatic the differences were.
Katherine Eban (Bottle of Lies: The Inside Story of the Generic Drug Boom)
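The Cmax/Tmax distinction Graedon noticed lends itself to a small illustration. The sketch below is simplified and uses invented sample data; it is not the FDA's actual statistical procedure (which applies 90 percent confidence intervals to log-transformed ratios from crossover studies). It shows two curves whose peak concentrations nearly match while their times to peak, and hence their shapes, differ sharply:

```python
# A simplified sketch of the problem Graedon saw: NOT the FDA's actual
# statistical test, and the blood-level data below are invented. Two curves
# can share a similar Cmax (peak concentration) while their Tmax values
# (time to peak) -- the shapes of the curves -- differ substantially.

def cmax_tmax(times, concentrations):
    """Return (Cmax, Tmax): peak concentration and the time it occurs."""
    peak = max(concentrations)
    return peak, times[concentrations.index(peak)]

# Hypothetical hourly blood levels (ng/mL) after one dose.
times   = [0, 1, 2, 3, 4, 6, 8, 12]
brand   = [0, 40, 95, 100, 80, 50, 25, 5]   # peaks at hour 3
generic = [0, 10, 30, 60, 85, 98, 60, 15]   # similar peak, but at hour 6

b_cmax, b_tmax = cmax_tmax(times, brand)
g_cmax, g_tmax = cmax_tmax(times, generic)

ratio = g_cmax / b_cmax  # 0.98: comfortably inside the 80-125% window
print(f"Cmax ratio: {ratio:.2f}, within 0.80-1.25: {0.80 <= ratio <= 1.25}")
print(f"Tmax: brand {b_tmax} h vs. generic {g_tmax} h")
```

Both drugs pass a peak-concentration comparison, yet a patient absorbs one twice as fast as the other, which is exactly the "two entirely different experiences in the body" the passage describes.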
The first 20 percent often begins with having the right data, the right technology, and the right incentives. You need to have some information—more of it rather than less, ideally—and you need to make sure that it is quality-controlled. You need to have some familiarity with the tools of your trade—having top-shelf technology is nice, but it’s more important that you know how to use what you have. You need to care about accuracy—about getting at the objective truth—rather than about making the most pleasing or convenient prediction, or the one that might get you on television. Then you might progress to a few intermediate steps, developing some rules of thumb (heuristics) that are grounded in experience and common sense and some systematic process to make a forecast rather than doing so on an ad hoc basis.
Nate Silver (The Signal and the Noise: Why So Many Predictions Fail-but Some Don't)
Maxine loves coding and she’s awesome at it. But she knows that there’s something even more important than code: the systems that enable developers to be productive, so that they can write high-quality code quickly and safely, freeing themselves from all the things that prevent them from solving important business problems.
Gene Kim (The Unicorn Project: A Novel about Developers, Digital Disruption, and Thriving in the Age of Data)
this is not just the normal churn of capitalism’s creative destruction, a process that has previously helped lead to a new equilibrium of more jobs, higher wages, and a better quality of life for all. The free market is supposed to be self-correcting, but these self-correcting mechanisms break down in an economy driven by artificial intelligence. Low-cost labor provides no edge over machines, and data-driven monopolies are forever self-reinforcing.
Kai-Fu Lee (AI Superpowers: China, Silicon Valley, and the New World Order)
The unity of the object will remain a mystery for as long as we think of its various qualities (its colour and taste, for example) as just so many data belonging to the entirely distinct worlds of sight, smell, touch and so on. Yet modern psychology, following Goethe’s lead, has observed that, rather than being absolutely separate, each of these qualities has an affective meaning which establishes a correspondence between it and the qualities associated with the other senses. For example, anyone who has had to choose carpets for a flat will know that a particular mood emanates from each colour, making it sad or happy, depressing or fortifying. Because the same is true of sounds and tactile data, it may be said that each colour is the equivalent of a particular sound or temperature. This is why some blind people manage to picture a colour when it is described, by way of an analogy with, for example, a sound.
Maurice Merleau-Ponty (The World of Perception)
At present we’re snowed under with an irrational expansion of blind data-gathering in the sciences because there’s no rational format for any understanding of scientific creativity. At present we are also snowed under with a lot of stylishness in the arts … thin art … because there’s very little assimilation or extension into underlying form. We have artists with no scientific knowledge and scientists with no artistic knowledge, and both with no spiritual sense of gravity at all. And the result is not just bad, it is ghastly. The time for real reunification of art and technology is really long overdue.
Robert Pirsig
First, purpose—what you hope to accomplish. In the case of Beane, he sought a better method for predicting baseball success. Second, scope—what to include or exclude in arriving at the decision. Beane decided to include past performance statistics and exclude aesthetic qualities. He reduced the scope of his search to data on performance. Third, perspective—your point of view in approaching this decision and how others might approach it.
Jeffrey Ma (The House Advantage: Playing the Odds to Win Big In Business)
Acting based on incomplete data means every moral act depends on the decision-making capabilities of the moral agent. Someone of less intellect will not consider as many factors as someone of superior intellect will. Which means the overall quality of moral acts will cluster around the intelligence level of the population average. As anyone who has read human history will tell you, the population average is… pretty average.
Vance Pravat (Zeroglyph: An AI Technothriller)
And here is where we could see the emergence of new types of companies—“Auto-Tech.” These would either be vertically integrated or strategically allied companies, from vehicle manufacture, to fleet management, to ride hailing through their own platforms. They would be the master coordinators of multiple capabilities—manufacturing, data and supply chain management, machine learning, software and systems integration, and the delivery of high-quality “mobility as a service” to customers around the world. At this point, there is still no tipping point where the benefits of new technology and business models prove so overwhelming that they obliterate the oil-fueled personal car model that has reigned for so long.
Daniel Yergin (The New Map: Energy, Climate, and the Clash of Nations)
Trademark A trademark is fundamentally a form of intellectual property consisting of designs, logos, and marks. Businesses use distinctive designs, logos, or words to distinguish their products and services from others. Marks that help distinguish a product or service and help customers identify its brand, quality, and even the source of the product are known as trademarks. Unlike patents, a trademark is registered for ten years, after which it can be renewed for a further ten years on payment of renewal fees. Trademark Objection After a trademark application is filed, the Examiner/Registrar or a third party can raise a trademark objection. Under Sections 9 (Absolute Grounds of Refusal) and 11 (Relative Grounds of Refusal) of the Act, the grounds of objection can be: the application contains incorrect information, or a similar or identical trademark already exists. Whenever the Registrar raises an objection, the applicant has the opportunity to send a written reply, along with solid evidence, facts, and reasons why the mark should be granted, within 30 days of the objection. If the Examiner/Registrar finds the reply satisfactory, all concerns in the examination report are addressed, and there is no conflict, he may permit the applicant to publish the application in the Trademark Journal before registration. How to respond to an objection A trademark examination report is published on the Trademark Office website along with the details of the trademark application, and the applicant or an agent has the opportunity to send a written reply, which is known as a trademark objection reply. 
The reply can be submitted as a "Reply to the examination report" either online, or by post or in person, along with supporting documents or an affidavit. Once the application is filed, the applicant should be given notice of the objection and the grounds for it. The further requirements are: a counter-statement to the objection should be filed within two months of the notice; if the applicant fails to file a counter-statement within that time, the status of the application will be marked abandoned. After the counter-statement is filed, the Registrar will call the applicant for a hearing. If the Registrar rules in the applicant's favour, the mark will proceed to registration; if the reply is not satisfactory, the application for registration will be rejected. Trademark Objection Reply Fees Although I have gone through various sites, finding a well-drafted formal reply is quite difficult. But Professional Utilities provides a well-drafted reply through experts, and the trademark objection reply fees are really affordable. They provide the service for just 1,499/- only.
Shweta Sharma
"Obscurity" involves questions that are often unrecognized and not seen to be relevant until they are posed explicitly—which requires curiosity, a quality often in short supply. Data to illuminate obscurities are also often available—even if found in non-traditional sources that require imagination to identify, if procedures to filter signal from noise can be developed.
Jeffrey R. Cooper (The CIA's Program for Improving Intelligence Analysis - "Curing Analytic Pathologies")
Die-Face Analysis In the 1930s, J. B. Rhine and his colleagues recognized and took into account the possibility that some dice studies may have been flawed because the probabilities of die faces are not equal. With some dice, it is slightly more likely that one will roll a 6 face than a 1 face because the die faces are marked by scooping out bits of material. The 6 face, for example, has six scoops removed from the surface of that side of the die, so it has slightly less mass than the other die faces. On any random toss, that tiny difference in mass will make the 6 slightly more likely to land face up, followed in decreasing probability by the 5, 4, 3, 2, and 1 faces. Thus, an experiment that relied exclusively upon the 6 face as the target may have been flawed because, unless there were also control tosses with no mental intention applied, we could not tell whether above-chance results were due to a mind-matter interaction or to the slightly higher probability of rolling a 6. To see whether this bias was present in these dice studies, we sifted out all reports for which the published data allowed us to calculate the effective hit rate separately for each of the six die faces used under experimental and control conditions. In fact, the suspected biases were found, as shown in figure 8.3. The hit rates for both experimental and control tosses tended to increase from die faces 1 to 6. However, most of the experimental hit rates were also larger than the corresponding control hit rates, suggesting something interesting beyond the artifacts caused by die-face biases. For example, for die face 6 the experimental condition was significantly larger than the control with odds against chance of five thousand to one. Figure 8.3. Relationship between die face and hit rates for experimental and control conditions. The error bars are 65 percent confidence intervals. 
Because of the evidence that the die faces were slightly biased, we examined a subset of studies that controlled for these dice biases—studies using design protocols where die faces were equally distributed among the six targets. We referred to such studies as the “balanced-protocol subset.” Sixty-nine experiments met the balanced-protocol criteria. Our examination of those experiments resulted in three notable points: there was still highly significant evidence for mind-matter interaction, with odds against chance of greater than a trillion to one; the effects were constant across different measures of experimental quality; and the selective-reporting “file drawer” required a twenty-to-one ratio of unretrieved, nonsignificant studies for each observed study. Thus chance, quality, and selective reporting could not explain away the results. Dice Conclusions Our meta-analysis findings led us to conclude that a genuine mind-matter interaction did exist with experiments testing tossed dice. The effect had been successfully replicated in more than a hundred experiments by more than fifty investigators for more than a half-century.
Dean Radin (The Conscious Universe: The Scientific Truth of Psychic Phenomena)
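The die-face artifact Radin describes can be simulated. In the sketch below the per-face probabilities are invented purely for illustration; it shows how a protocol that always targets the 6 face inflates the hit rate, while a balanced protocol that spreads targets equally over all six faces cancels the bias:

```python
import random

# A rough simulation of the die-face artifact described above. The per-face
# probabilities below are invented for illustration: scooped-out pips make
# the 6 face slightly lighter, hence slightly more likely to land face up.
WEIGHTS = [0.160, 0.163, 0.166, 0.168, 0.170, 0.173]  # faces 1..6, sum = 1.0

def toss(rng):
    """Roll the biased die once."""
    return rng.choices([1, 2, 3, 4, 5, 6], weights=WEIGHTS)[0]

rng = random.Random(42)
n = 198_000  # divisible by 6 so targets can be balanced exactly

# Protocol 1: always target face 6 (the design the critics worried about).
hits_on_6 = sum(toss(rng) == 6 for _ in range(n)) / n

# Protocol 2: balanced -- targets distributed equally over all six faces,
# so the per-face biases cancel and the expected hit rate is exactly 1/6.
balanced = sum(toss(rng) == target
               for target in (1, 2, 3, 4, 5, 6)
               for _ in range(n // 6)) / n

print(f"hit rate, face-6 targets only: {hits_on_6:.4f}")
print(f"hit rate, balanced targets:    {balanced:.4f} (chance = {1/6:.4f})")
```

With no mental intention in play at all, the face-6-only protocol scores above chance, which is precisely why the "balanced-protocol subset" was the right place to look for a residual effect.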
In recent years, Continuous Glucose Monitoring (CGM) devices have emerged as a game-changer in diabetes management, offering patients a real-time view of their glucose levels and revolutionizing the way they monitor their condition. Among the pioneers in providing these life-changing devices, Med Supply US stands out as a reliable source, offering CGMs from various renowned brands like Abbott, Dexcom, and more. This article explores the significance of CGM devices and highlights the contribution of Med Supply US in making them accessible to those in need. Understanding CGM Devices: For individuals living with diabetes, maintaining optimal blood glucose levels is crucial to prevent serious health complications. Traditionally, this involved frequent finger-prick tests, which could be inconvenient and sometimes inaccurate. CGM devices, however, have transformed this process by providing continuous and real-time glucose level readings. These devices consist of a small sensor inserted under the skin that measures glucose levels in the interstitial fluid. The data collected is then transmitted to a receiver or a smartphone app, allowing users to track their glucose levels throughout the day and night. Benefits of CGM Devices: The introduction of CGM devices has brought about a paradigm shift in diabetes management due to their numerous benefits: Real-time Monitoring: CGM devices offer a real-time insight into glucose trends, enabling users to make informed decisions about their diet, exercise, and insulin dosages. This real-time feedback empowers individuals to take timely action to maintain their glucose levels within a healthy range. Reduced Hypoglycemia and Hyperglycemia: By providing alerts for both low and high glucose levels, CGMs help users avoid dangerous hypoglycemic episodes and hyperglycemic spikes. This is particularly beneficial during sleep when such episodes might otherwise go unnoticed. 
Data-Driven Insights: CGM devices generate a wealth of data, including glucose trends, patterns, and even predictive alerts for potential issues. This information can be shared with healthcare providers to tailor treatment plans for optimal diabetes management. Enhanced Quality of Life: The convenience of CGM devices reduces the need for frequent finger pricks, leading to an improved quality of life for individuals managing diabetes. The constant insights also alleviate anxiety related to unpredictable glucose fluctuations. Med Supply US: Bringing Hope to Diabetes Management: Med Supply US has emerged as a prominent supplier of CGM devices, offering a range of options from reputable brands such as Abbott and Dexcom. The availability of CGMs through Med Supply US has made these cutting-edge devices accessible to a wider demographic, bridging the gap between technology and healthcare. Med Supply US not only provides access to CGM devices but also plays a crucial role in educating individuals about their benefits. Through informative resources, they empower users to make informed choices based on their specific needs and preferences. Furthermore, their commitment to customer support ensures that users can seamlessly integrate CGM devices into their daily routines.
CGM devices
At Castle Surveys Ltd, we understand the importance of accurate and reliable surveying data. We are dedicated to providing the highest quality service in everything we do, from Topographic Land Surveys to Measured Building Surveys, Scan to BIM, 3D Laser Scanning, Underground Utility Surveys, Site Engineering & Setting Out, CCTV Drainage Surveys, AVRs, Drone Surveys, and much more.
Castle Surveys Cheltenham
In 2012, Google Maps had become the premier provider of mapping services and location data for mobile phone users. It was a popular feature on Apple’s iPhone. However, with more consumer activity moving to mobile devices and becoming increasingly integrated with location data, Apple realized that Google Maps was becoming a significant threat to the long-term profitability of its mobile platform. There was a real possibility that Google could make its mapping technology into a separate platform, offering valuable customer connections and geographic data to merchants, and siphoning this potential revenue source away from Apple. Apple’s decision to create its own mapping app to compete with Google Maps made sound strategic sense—despite the fact that the initial service was so poorly designed that it caused Apple significant public embarrassment. The new app misclassified nurseries as airports and cities as hospitals, suggested driving routes that passed over open water (your car had better float!), and even stranded unwary travelers in an Australian desert a full seventy kilometers from the town they expected to find there. iPhone users erupted in howls of protest, the media had a field day lampooning Apple’s misstep, and CEO Tim Cook had to issue a public apology.19 Apple accepted the bad publicity, likely reasoning that it could quickly improve its mapping service to an acceptable quality level—and this is essentially what has happened. The iPhone platform is no longer dependent on Google for mapping technology, and Apple has control over the mapping application as a source of significant value.
Geoffrey G. Parker (Platform Revolution: How Networked Markets Are Transforming the Economy and How to Make Them Work for You)
A strain of newly minted “cyberlibertarian” ideals informed the early Internet, which assumed that a fairly minimal communications layer was sufficient; obviously necessary higher-level architectural elements, such as persistent identities for humans, would be supplied by a hypothetical future layer of private industry. But these higher layers turned out to give rise to natural monopolies because of network effects; the outcome was a new kind of unintended centralization of information and therefore of power. A tiny number of tech giants came to own the means of access to networks for most people. Indeed, these companies came to route and effectively control the data of most individuals. Similarly, there was no provision for provenance, authentication, or any other species of digital context that might support trust, a precious quality that underlies decent societies. Neither the Internet nor the Web built on top of it kept track of back links, meaning what nodes on the Internet included references to a given node. It was left to businesses like commercial search engines to maintain that type of context. Support for financial transactions was left to private enterprise and quickly became the highly centralized domain of a few credit card and online payment companies.
Eric A. Posner (Radical Markets: Uprooting Capitalism and Democracy for a Just Society)
How long does it take to Learn Freelancing? How long it takes to learn freelancing depends on what you are learning, how you start, and how hard you work at it. Learning anything requires willpower and concentration more than raw effort. The more consistently you study with focus, the sooner you will succeed; the slower you go, the longer it will take. So if you want to build an online career as a professional freelancer, you must invest extra time in it. Freelancing for Beginners: If you are new to the freelancing sector, there are a few things you need to know first. For example: What is data entry? What is outsourcing? What is web design? Having a basic understanding of these things will make it much easier for you to learn freelancing. Freelancing includes complex tasks as well as some simple ones, but the simple ones are few and pay little. Many new freelancers want to earn from freelancing with only a mobile phone. Their reasoning is, "I don't need much money; only 4-5 thousand taka will do." To them I would say: learn data entry, since you can earn that amount from it. But data entry work takes a long time; you need to work 7-8 hours a day. And if your dream is only 4-5 thousand rupees for 7-8 hours of work, then my suggestion is to take up tutoring instead; at least that will suit you better. Freelancing requires you to have big dreams and the passion to make them come true. Misconceptions about Freelancing: There is no substitute for a good-quality computer or laptop if you want to learn and master freelancing professionally. With one, you can practice and learn quickly and without hassle. Many people think that simply looking at a monitor and pressing keys makes them a freelancer who can earn lakhs of rupees a month. 
In fact, those who think so cannot be entirely blamed. Many of us are lured by mouth-watering advertisements such as "earn lakhs per month with just a one-month course" and waste both our precious time and money by joining bad, unprofessional coaching centers. Why is it not possible to learn freelancing in just one month, or sometimes even in a year? It is clear proof that all that glitters is not gold. There are thousands of jobs in freelancing; each job is different, and each takes a different amount of time to learn. So it is very difficult to say exactly how long it takes to learn freelancing. Be careful in choosing the right Freelancing Training Center: Whatever you do, don't settle for a cheap online course of Rs 400-600-1200, because it will drain the willpower you have to learn freelancing. Instead, take a government freelancing course, or practical training from an organization called "Bhairab IT Zone" for a nominal fee, where hands-on training is provided by professional freelancers using free, premium, and upgraded versions of the tools. There are many ways to learn freelancing or outsourcing: outsourcing books, YouTube video tutorials, seminars, and so on. Either way, some learn to swim in a day and some in a week, but to become a good swimmer one must keep swimming for a long time. Not everyone has the same mental capacity or stamina; people are naturally different from one another. The same goes for freelancing. You might learn the ins and outs of freelancing within 6-7 months, or it might take 1-2 years. However long it takes to learn, you need to work twice as long to become proficient at it. But with hard work, willpower, and determination you can make the impossible possible. Please visit our blogging website to read more articles related to freelancing and outsourcing.
Bhairab IT Zone
Data about the persecution of Jews in Europe drawn from almost a thousand cities between 1100 and 1800 shows that a decrease in the average growing-season temperature of about one-third of 1 degree Celsius is correlated with a rise in the probability of Jews being attacked in the subsequent five-year period – with those living in and near locations with poor soil quality and weaker institutions more likely still to be the victims of violence during times of food shortages and higher prices.
Peter Frankopan (The Earth Transformed: An Untold History)
Continuous Glucose Monitors are transforming diabetes management by providing real-time data, improving glycemic control, and improving quality of life. We should expect CGMs to become more accessible and user-friendly as technology advances, enhancing the lives of chronic disease patients. Ask your doctor about CGMs if you have diabetes.
Continuous Glucose Monitors (CGMs)
High schools routinely classified students who quit high school as transferring to another school, returning to their native country, or leaving to pursue a General Equivalency Diploma (GED)—none of which count as dropping out in the official statistics. Houston reported a citywide dropout rate of 1.5 percent in the year that was examined; 60 Minutes calculated that the true dropout rate was between 25 and 50 percent. The statistical chicanery with test scores was every bit as impressive. One way to improve test scores (in Houston or anywhere else) is to improve the quality of education so that students learn more and test better. This is a good thing. Another (less virtuous) way to improve test scores is to prevent the worst students from taking the test. If the scores of the lowest-performing students are eliminated, the average test score for the school or district will go up, even if all the rest of the students show no improvement at all. In Texas, the statewide achievement test is given in tenth grade. There was evidence that Houston schools were trying to keep the weakest students from reaching tenth grade. In one particularly egregious example, a student spent three years in ninth grade and then was promoted straight to eleventh grade—a deviously clever way of keeping a weak student from taking a tenth-grade benchmark exam without forcing him to drop out (which would have showed up on a different statistic).
Charles Wheelan (Naked Statistics: Stripping the Dread from the Data)
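The second, less virtuous route Wheelan describes is easy to demonstrate with a toy roster of hypothetical scores: cull the weakest test-takers and the average rises even though no individual student improved.

```python
# Toy roster of hypothetical test scores: no student improves, but removing
# the weakest test-takers raises the reported average anyway.
scores = [45, 52, 58, 63, 70, 74, 81, 88, 92, 97]

full_average = sum(scores) / len(scores)

kept = [s for s in scores if s >= 60]  # weakest students kept from the exam
culled_average = sum(kept) / len(kept)

print(f"average with every student tested: {full_average:.1f}")   # 72.0
print(f"average with weakest students cut: {culled_average:.1f}") # 80.7
```

An eight-point "gain" appears out of pure selection, which is why a reported average is meaningless without knowing who was excluded from the denominator.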
The convenience we've gained in the digital age should not be at the expense of our privacy. Bronva stands as a testament to the belief that we can have quality search results without sacrificing our personal data. Privacy and convenience are not mutually exclusive; they can coexist harmoniously.
James William Steven Parker
The social difficulties experienced by patients with anorexia are not only caused by the patients’ deficits in interpreting others’ minds. There is evidence that they present an unemotional “flat” face to others (Lang et al. 2016), and this can lead to failed social encounters. This is reminiscent of babies’ extreme distress when their mothers presented an unmoving expression to them, just for a minute or two (Weinberg et al. 2008; Tronick 2018). In other words, we are all expecting emotional expression in others and find it very unpleasant when we meet someone who presents a flat, unemotional face. The dependency on confirmation from others corresponds with major trends in contemporary culture, with great emphasis on visuality, bodily surfaces, external qualities, performances, etc. A central psychological trait in both contemporary culture and highly aggravated in eating disorders is the emphasis on comparison and comparison anxiety. Many are obsessively comparing themselves with others, concerning bodies, numbers and amounts of food, hence depending on profoundly superficial data.
Paul Robinson (Hunger: Mentalization-based Treatments for Eating Disorders)
How to Build a Mobile App with React Native With the continuous evolution of web, real-time, and hybrid applications, companies want faster development and easier maintenance for their apps. React Native app development has earned its significance by bringing all of these together within a company's limited budget. Overview of React Native Because React Native is based on the React framework, it helps if developers already know React. In addition, React Native provides platform-specific APIs where needed, allows development for both Android and iOS from a single codebase, and, most importantly, is free and open-source. Facebook’s React Native Developing apps that run on different operating systems with one tool, especially for mobile devices, is a great advantage for developers. React Native, created at Facebook, is one of the best ways to build apps that are scalable and flexible. Android App Development with React Native Given the number of active Android users, developing apps for Android devices creates substantial value for companies. Working with React Native In React Native, developers still carry real responsibilities. They write components in JavaScript rather than writing platform code by hand, as React Native maps those components to native views. This is why developers can focus more on the UX of the app. Several aspects are required in development: the native integration, the visual aesthetics, and the technical and back-end concerns. All of these come together in the design of the user interface, which is why React Native app development becomes quite important. By handling native rendering, design, and other technical aspects, React Native is a valuable tool for developers and teams alike. 
Benefits of React Native React Native helps teams build a complete native mobile app from a single JavaScript codebase. Its component library creates responsive, interactive interfaces from simple building blocks and thus supports the creation of high-quality applications. React Native brings web development concepts to mobile in a new form: it uses the native UI functionality of the operating system, so advanced ideas from web development can be applied to mobile apps. This makes React Native a preferred platform for apps made specifically for Android and iOS. With React Native, companies can develop a polished and efficient app in less time. Conclusion As stated above, the UI remains the most important part of a mobile app, and developers gravitate to different UI frameworks and libraries. Given below are some reasons to select React Native as a UI framework: it is the flagship cross-platform UI framework from Facebook and shares its component model with React; it offers features such as native rendering, networking, native embedding, data persistence, and offline support. Although React Native is capable of tackling many challenges, it still falls short of some web-only techniques such as server-side rendering (SSR).
Peter Lee (Nuneaton (Images of England))
Fractional dimension becomes a way of measuring qualities that otherwise have no clear definition: the degree of roughness or brokenness or irregularity in an object. A twisting coastline, for example, despite its immeasurability in terms of length, nevertheless has a certain characteristic degree of roughness. Mandelbrot specified ways of calculating the fractional dimension of real objects, given some technique of constructing a shape or given some data, and he allowed his geometry to make a claim about the irregular patterns he had studied in nature. The claim was that the degree of irregularity remains constant over different scales. Surprisingly often, the claim turns out to be true. Over and over again, the world displays a regular irregularity.
James Gleick (Chaos: Making a New Science)
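Mandelbrot's claim that the degree of irregularity stays constant across scales can be checked numerically. The sketch below estimates the box-counting dimension of the Koch curve, whose exact dimension is log 4 / log 3 ≈ 1.26; the scale choices and fitting details are illustrative rather than canonical:

```python
import math

# Estimate the box-counting dimension of the Koch curve: count how many
# grid boxes of shrinking size eps the curve touches, then fit the slope
# of log N(eps) against log(1/eps). For a fractal that slope is the
# (fractional) dimension, constant across scales.

def koch(points, depth):
    """Refine a polyline with the Koch construction, depth times."""
    for _ in range(depth):
        out = []
        for (x1, y1), (x2, y2) in zip(points, points[1:]):
            dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
            a, b = (x1 + dx, y1 + dy), (x1 + 2 * dx, y1 + 2 * dy)
            # Apex of the equilateral bump erected on the middle third a-b.
            px = (a[0] + b[0]) / 2 - (b[1] - a[1]) * math.sqrt(3) / 2
            py = (a[1] + b[1]) / 2 + (b[0] - a[0]) * math.sqrt(3) / 2
            out += [(x1, y1), a, (px, py), b]
        out.append(points[-1])
        points = out
    return points

def box_count(points, eps):
    """Number of eps-sized grid boxes containing at least one point."""
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in points})

curve = koch([(0.0, 0.0), (1.0, 0.0)], 7)  # 16,385 vertices

scales = [1 / 27, 1 / 81, 1 / 243]
xs = [math.log(1 / e) for e in scales]
ys = [math.log(box_count(curve, e)) for e in scales]
k = len(xs)  # ordinary least-squares slope
slope = (k * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
        (k * sum(x * x for x in xs) - sum(xs) ** 2)
print(f"estimated dimension: {slope:.2f} "
      f"(exact: {math.log(4) / math.log(3):.2f})")
```

The estimate hovers near 1.26 regardless of which scales are chosen, which is the "regular irregularity" the passage describes: the roughness itself is the scale-invariant quantity.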
Emaciated data-thin designs,” he warns, “provoke suspicions—and rightfully so—about the quality of measurement and analysis.
Tim Harford (The Data Detective: Ten Easy Rules to Make Sense of Statistics)
JUSTIFYING OPPRESSION While history has proven Malthusianism empirically false, however, it provides the ideal foundation for justifying human oppression and tyranny. The theory holds that there isn’t enough to go around, and can never be. Therefore human aspirations and liberties must be constrained, and authorities must be empowered to enforce the constraining. During Malthus’s own time, his theory was used to justify regressive legislation directed against England’s lower classes, most notably the Poor Law Act of 1834, which forced hundreds of thousands of poor Britons into virtual slavery. 11 However, a far more horrifying example of the impact of Malthusianism was to occur a few years later, when the doctrine motivated the British government’s refusal to provide relief during the great Irish famine of 1846. In a letter to economist David Ricardo, Malthus laid out the basis for this policy: “The land in Ireland is infinitely more peopled than in England; and to give full effect to the natural resources of the country, a great part of the population should be swept from the soil.” 12 For the last century and a half, the Irish famine has been cited by Malthusians as proof of their theory of overpopulation, so a few words are in order here to set the record straight. 13 Ireland was certainly not overpopulated in 1846. In fact, based on census data from 1841 and 1851, the Emerald Isle boasted a mere 7.5 million people in 1846, less than half of England’s 15.8 million, living on a land mass about two-thirds that of England and of similar quality. So compared to England, Ireland before the famine was if anything somewhat underpopulated. 14 Nor, as is sometimes said, was the famine caused by a foolish decision of the Irish to confine their diet to potatoes, thereby exposing themselves to starvation when a blight destroyed their only crop. 
In fact, in 1846 alone, at the height of the famine, Ireland exported over 730,000 cattle and other livestock, and over 3 million quarts of corn and grain flour to Great Britain. 15 The Irish diet was confined to potatoes because—having had their land expropriated, having been forced to endure merciless rack-rents and taxes, and having been denied any opportunity to acquire income through manufactures or other means—tubers were the only food the Irish could afford. So when the potato crop failed, there was nothing for the Irish themselves to eat, despite the fact that throughout the famine, their homeland continued to export massive amounts of grain, butter, cheese, and meat for foreign consumption. As English reformer William Cobbett noted in his Political Register: Hundreds of thousands of living hogs, thousands upon thousands of sheep and oxen alive; thousands upon thousands of barrels of beef, pork, and butter; thousands upon thousands of sides of bacon; and thousands and thousands of hams; shiploads and boats coming daily and hourly from Ireland to feed the west of Scotland; to feed a million and a half people in the West Riding of Yorkshire, and in Lancashire; to feed London and its vicinity; and to fill the country shops in the southern counties of England; we beheld all this, while famine raged in Ireland amongst the raisers of this very food. 16 “The population should be swept from the soil.” Evicted from their homes, millions of Irish men, women, and children starved to death or died of exposure. (Contemporary drawings from Illustrated London News.)
Robert Zubrin (Merchants of Despair: Radical Environmentalists, Criminal Pseudo-Scientists, and the Fatal Cult of Antihumanism)
The great majority of men in cities are apt to pride themselves on their own exemption from ‘superstition’, and to smile pityingly at the poor countrymen and countrywomen who believe in fairies. But when they do so they forget that, with all their own admirable progress in material invention, with all the far-reaching data of their acquired science, with all the vast extent of their commercial and economic conquests, they themselves have ceased to be natural. Wherever under modern conditions great multitudes of men and women are herded together there is bound to be an unhealthy psychical atmosphere never found in the country—an atmosphere which inevitably tends to develop in the average man who is not psychically strong enough to resist it, lower at the expense of higher forces or qualities, and thus to inhibit any normal attempts of the Subliminal Self (a well-accredited psychological entity) to manifest itself in consciousness.
W.Y. Evans-Wentz (The Fairy-Faith in Celtic Countries)
YouTube also contains a treasure trove of lectures by nearly all of finance's leading lights, strewn throughout its vast wasteland of misinformation. Tread carefully. A few wrong clicks and you'll wind up with a QAnon conspiracist or a crypto bro. Of the names I've mentioned in this book, I'd search for John Bogle, Eugene Fama, Kenneth French, Jonathan Clements, Zvi Bodie, William Sharpe, Burton Malkiel, Charles Ellis, and Jason Zweig. Worthwhile finance podcasts abound. Start with the Economist's weekly "Money Talks" and NPR's Planet Money, although most of the latter's superb coverage revolves around economics and relatively little around investing. Rick Ferri's Boglehead podcast interviews cover mainly passive investing. Another financial podcast I highly recommend is Barry Ritholtz's Masters in Business from Bloomberg. Podcasts are a rapidly evolving area. Lest you wear your ears out, you'll need discretion to curate the burgeoning amount of high-quality audio. Research mutual funds. All the fund companies discussed in this book have sophisticated websites from which basic fund facts, such as fees and expenses, can be obtained, as well as annual and semiannual reports that list and tabulate holdings. If you're researching a large number of funds, this gets cumbersome. The best way is to visit Morningstar.com. Use the site's search function to locate the main page for the fund you're interested in and click the "Expense" and "Portfolio" tabs to find the fund expense ratio and detailed data on the fund holdings. Click the "Performance" tab to see the fund's return over periods ranging from a single day up to 15 years, and the "Chart" tab to compare the returns of multiple funds over a given interval.
William J. Bernstein (The Four Pillars of Investing, Second Edition: Lessons for Building a Winning Portfolio)
The Future of Diabetes Management: Continuous Glucose Monitors by Med Supply US In the realm of diabetes management, continuous glucose monitors (CGMs) have emerged as a revolutionary technology, transforming the way individuals monitor their blood sugar levels. Med Supply US, a leading name in healthcare solutions, is at the forefront of this innovation, offering cutting-edge CGM devices that enhance the quality of life for those with diabetes. What sets continuous glucose monitors apart is their ability to provide real-time glucose readings, allowing users to track their levels throughout the day and night, without the need for constant finger pricks. This continuous monitoring not only offers convenience but also helps individuals make informed decisions about their diet, exercise, and insulin dosages. Med Supply US has established itself as a trusted provider of CGMs, offering a range of devices that cater to different needs and preferences. Whether it's the ease of use of their user-friendly interfaces or the accuracy of their readings, Med Supply US CGMs are designed to empower users in managing their diabetes effectively. One of the key advantages of Med Supply US CGMs is their compatibility with smartphone apps, allowing users to conveniently view their glucose data on their devices. This seamless integration with technology makes monitoring glucose levels more accessible and less intrusive, leading to better diabetes management outcomes. In conclusion, continuous glucose monitors by Med Supply US are revolutionizing diabetes management, offering a level of convenience, accuracy, and integration with technology that was previously unimaginable. With Med Supply US CGMs, individuals can take control of their diabetes with confidence, knowing that they have a reliable partner in their journey towards better health.
Med Supply US
Market research surveys in Myanmar play a crucial role in enabling businesses to make informed decisions based on consumer behavior and market trends. Market research firms in Myanmar offer essential services that provide valuable insights to businesses looking to thrive in the local market. The Myanmar market research landscape encompasses a range of services aimed at helping businesses understand and navigate a dynamic consumer market, including market analysis, consumer behavior studies, competitor analysis, and industry trend assessments. One of the central services is market analysis: through thorough data collection and analysis, these firms give businesses a deep understanding of market dynamics, including market size, growth potential, and key trends. This information is essential for businesses seeking to enter new markets or expand their existing operations in Myanmar. Consumer behavior studies are another fundamental component of market research services in Myanmar. By studying consumer preferences, buying habits, and decision-making processes, businesses can tailor their products and services to better meet the needs of their target audience. Competitor analysis is also a key offering: by conducting in-depth assessments of competitors' strategies, market positioning, and strengths and weaknesses, businesses can identify opportunities and threats within their industry.
This knowledge empowers businesses to refine their own strategies and differentiate themselves in a crowded marketplace. Industry trend assessments, in turn, help businesses stay ahead of the curve: by forecasting and analyzing industry trends, businesses can proactively adapt to changes in consumer preferences, technological advances, and regulatory developments, mitigating risks and capitalizing on emerging opportunities. In addition to these core services, market research firms in Myanmar offer customized research solutions tailored to the specific needs of businesses, whether that means conducting surveys, focus groups, or in-depth interviews, giving businesses the tools to gather valuable insights directly from their target audience. For businesses operating in Myanmar or seeking to enter the market, engaging such firms is key to making informed strategic decisions. By tapping into the knowledge and insights these organizations provide, businesses can deepen their market understanding, identify growth opportunities, and mitigate risks. In short, the services offered by market research surveys in Myanmar, from market analysis and consumer behavior studies to competitor analysis and industry trend assessments, equip businesses with the knowledge needed to succeed in Myanmar's dynamic and rapidly evolving market.
market research survey in Myanmar
The degree of data quality excellence that should be attained and sustained is driven by the criticality of the data, the business need, and the cost and time to achieve the defined degree of data quality.
Rupa Mahanti, in Data Quality: Dimensions, Measurement, Strategy, Management, and Governance
You do not need to have zero percent data issues... In other words, you do not need 100% data quality.
Rupa Mahanti, in Data Quality: Dimensions, Measurement, Strategy, Management, and Governance
Each data quality dimension captures a particular measurable aspect of data quality. In other words, the dimensions represent the views, benchmarks, or measures for data quality issues that can be understood, analysed, and resolved or minimized eventually.
Rupa Mahanti (Data Quality: Dimensions, Measurement, Strategy, Management, and Governance)
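The quotes above frame data quality in terms of measurable dimensions checked against a fit-for-purpose target rather than 100%. A minimal sketch of that idea follows; it is not taken from Mahanti's book, and the records, the validity rule, and the 95% target are invented for illustration:

```python
# Toy measurement of three common data quality dimensions
# (completeness, validity, uniqueness) against a business-driven
# threshold rather than a 100% requirement.
import re

records = [
    {"id": 1, "email": "ana@example.com", "age": 34},
    {"id": 2, "email": None,             "age": 29},
    {"id": 3, "email": "bo@example",     "age": 41},  # malformed email
    {"id": 3, "email": "cy@example.com", "age": 38},  # duplicate id
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, pattern):
    """Share of populated values matching a format rule."""
    vals = [r[field] for r in rows if r[field] is not None]
    return sum(bool(re.fullmatch(pattern, v)) for v in vals) / len(vals)

def uniqueness(rows, field):
    """Share of rows carrying a distinct key value."""
    vals = [r[field] for r in rows]
    return len(set(vals)) / len(vals)

scores = {
    "completeness(email)": completeness(records, "email"),
    "validity(email)":     validity(records, "email", r"[^@]+@[^@]+\.[^@]+"),
    "uniqueness(id)":      uniqueness(records, "id"),
}

TARGET = 0.95  # hypothetical threshold set by the business need
for dim, score in scores.items():
    status = "OK" if score >= TARGET else "below target"
    print(f"{dim}: {score:.0%} ({status})")
```

Each function scores one dimension separately, mirroring the point that dimensions are distinct, measurable views of quality; whether 75% completeness is acceptable depends entirely on the criticality of the data and the chosen target.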
My friend Bangaly Kaba, formerly head of growth at Instagram, called this idea the theory of “Adjacent Users.” He describes his experience at Instagram, which several years post-launch was growing fast but not at rocketship speed: When I joined Instagram in 2016, the product had over 400 million users, but the growth rate had slowed. We were growing linearly, not exponentially. For many products, that would be viewed as an amazing success, but for a viral social product like Instagram, linear growth doesn’t cut it. Over the next 3 years, the growth team and I discovered why Instagram had slowed, developed a methodology to diagnose our issues, and solved a series of problems that reignited growth and helped us get to over a billion users by the time I left. Our success was anchored on what I now call The Adjacent User Theory. The Adjacent Users are aware of a product and possibly tried using it, but are not able to successfully become an engaged user. This is typically because the current product positioning or experience has too many barriers to adoption for them. While Instagram had product-market fit for 400+ million people, we discovered new groups of billions of users who didn’t quite understand Instagram and how it fit into their lives.67 In my conversations with Bangaly on this topic, he described his approach as a systematic evaluation of the network of networks that constituted Instagram. Rather than focusing on the core network of Power Users—the loud and vocal minority that often drive product decisions—instead the approach was to constantly figure out the adjacent set of users whose experience was subpar. There might be multiple sets of nonfunctional adjacent networks at any given time, and it might require different approaches to fix each one. For some networks, it might be the features of the product, like Instagram not having great support for low-end Android apps. 
Or it might be because of the quality of their networks—if the right content creators or celebrities hadn’t yet arrived. You fix the experience for these users, then ask yourself again, who are the adjacent users? Then repeat. Bangaly describes this approach: When I started at Instagram, the Adjacent User was women 35–45 years old in the US who had a Facebook account but didn’t see the value of Instagram. By the time I left Instagram, the Adjacent User was women in Jakarta, on an older 3G Android phone with a prepaid mobile plan. There were probably 8 different types of Adjacent Users that we solved for in-between those two points. To solve for the needs of the Adjacent User, the Instagram team had to be nimble, focusing first on pulling the audience of US women from the Facebook network. This required the team to build algorithmic recommendations that utilized Facebook profiles and connections, so that Instagram could surface friends and family on the platform—not just influencers. Later on, targeting users in Jakarta and in other developing countries might involve completely different approaches—refining apps for low-end Android phones with low data connections. As the Adjacent User changes, the strategy has to change as well.
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
Just as YouTube started with manual curation, most networked products can start with manual efforts. This means exercising editorial judgment, or allowing users to curate content themselves. The App Store has millions of apps, so when Apple releases a list of “Apps of the Year” in the App Store, it aids discovery for consumers but also inspires app developers to invest in the design and quality of their products. Or platforms can leverage user-generated content, where content is organized by the ever-popular hashtag—one example is Amazon’s wish lists, which are driven primarily by users without editors. Similarly, using implicit data—whether that’s attributes of the content or grouping the originator by their company or college email domain name—can bring people together with data from the network. Twitter uses a hybrid approach—the team analyzes activity on the network to identify trending events, which are then editorialized into stories.
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
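The "implicit data" idea mentioned in the passage, grouping people by the domain of their company or college email address, can be sketched in a few lines. This is a toy illustration with invented users, not any product's actual implementation:

```python
# Group users into candidate seed networks by email domain.
from collections import defaultdict

users = [
    ("alice", "alice@stanford.edu"),
    ("bob",   "bob@stanford.edu"),
    ("carol", "carol@acme.com"),
    ("dave",  "dave@acme.com"),
    ("erin",  "erin@gmail.com"),
]

def group_by_domain(user_rows):
    """Map each email domain to the list of users sharing it."""
    groups = defaultdict(list)
    for name, email in user_rows:
        domain = email.rsplit("@", 1)[1].lower()
        groups[domain].append(name)
    return dict(groups)

cohorts = group_by_domain(users)
# Domains with more than one member form a seed network worth connecting;
# here stanford.edu and acme.com qualify, while gmail.com does not.
seed_networks = {d: members for d, members in cohorts.items() if len(members) > 1}
print(seed_networks)
```

A generic consumer domain like gmail.com carries little grouping signal, which is why real products combine this with many other implicit attributes.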
Anti-Network Effects Hit the Google+ Launch A charismatic executive from one of the most powerful technology companies in the world introduces a new product at a conference. This time, it’s June 2011 at the Web 2.0 Summit, where Google vice president Vic Gundotra describes the future of social networking and launches Google+. This was Google’s ambitious strategy to counteract Facebook, which was nearing their IPO. To give their new networked product a leg up, as many companies do, it led with aggressive upsells from their core product. The Google.com homepage linked to Google+, and they also integrated it widely within YouTube, Photos, and the rest of the product ecosystem. This generated huge initial numbers—within months, the company announced it had signed up more than 90 million users. While this might superficially look like a large user base, it actually consisted of many weak networks that weren’t engaged, because most new users showed up and tried out the product as they read about it in the press, rather than hearing from their friends. The high churn in the product was covered up by the incredible fire hose of traffic that the rest of Google’s network generated. Even though it wasn’t working, the numbers kept going up. When unengaged users interact with a networked product that hasn’t yet gelled into a stable, atomic network, then they don’t end up pulling other users into the product. In a Wall Street Journal article by Amir Efrati, Google+ was described as a ghost town even while the executives touted large top-line numbers: To hear Google Inc. Chief Executive Larry Page tell it, Google+ has become a robust competitor in the social networking space, with 90 million users registering since its June launch. But those numbers mask what’s really going on at Google+. It turns out Google+ is a virtual ghost town compared with the site of rival Facebook Inc., which is preparing for a massive initial public offering. New data from research firm comScore Inc. 
shows that Google+ users are signing up—but then not doing much there. Visitors using personal computers spent an average of about three minutes a month on Google+ between September and January, versus six to seven hours on Facebook each month over the same period, according to comScore, which didn't have data on mobile usage.86 The fate of Google+ was sealed in their go-to-market strategy. By launching big rather than focusing on small, atomic networks that could grow on their own, the teams fell victim to big vanity metrics. At its peak, Google+ claimed to have 300 million active users—by the top-line metrics, it was on its way to success. But network effects rely on the quality of the growth and not just its quantity.
Andrew Chen (The Cold Start Problem: How to Start and Scale Network Effects)
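The contrast between top-line signups and engagement quality can be made concrete with a toy calculation. The two fictional networks and all figures below are invented, loosely echoing the comScore minutes-per-month comparison quoted in the passage:

```python
# Contrast a "vanity metric" view (registered users) with two
# quality-side metrics: the share of registered users who are active,
# and the total monthly attention the network actually captures.

networks = {
    # name: (registered_users, monthly_active_users, minutes_per_active_user)
    "BigLaunch": (90_000_000, 9_000_000, 3),
    "SlowBurn":  (5_000_000,  4_000_000, 400),
}

def engagement_profile(registered, active, minutes_per_user):
    """Return active share and total monthly minutes of attention."""
    return {
        "active_share": active / registered,
        "total_minutes": active * minutes_per_user,
    }

for name, (reg, act, mins) in networks.items():
    p = engagement_profile(reg, act, mins)
    print(f"{name}: {p['active_share']:.0%} active, "
          f"{p['total_minutes']:,} minutes of monthly attention")
```

By registered users alone, the hypothetical BigLaunch looks eighteen times larger; by active share and captured attention, the smaller SlowBurn network is the healthier one, which is the distinction the Google+ story turns on.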