Data Compression Quotes

We've searched our database for all the quotes and captions related to Data Compression. Here they are:

Mark, some answers to your earlier questions: No, we will not tell our Botany Team to "Go fuck themselves." [...] The data transfer rate just isn't enough for the size of music files, even in compressed formats. So your request for "Anything, oh god, ANYTHING but Disco" is denied. Enjoy your boogie fever.
Andy Weir (The Martian)
From this point of view, the laws of science represent data compression in action. A theoretical physicist acts like a very clever coding algorithm. “The laws of science that have been discovered can be viewed as summaries of large amounts of empirical data about the universe,” wrote Solomonoff. “In the present context, each such law can be transformed into a method of compactly coding the empirical data that gave rise to that law.” A good scientific theory is economical. This was yet another way of saying so.
James Gleick (The Information: A History, a Theory, a Flood)
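Gleick's point that a good theory compresses its data can be made concrete with a toy experiment (an illustration of the idea, not from the book): observations generated by a simple law compress dramatically, while patternless observations barely compress at all.

```python
import random
import zlib

# 10,000 "observations" generated by a simple law: y = (3x + 7) mod 256.
lawful = bytes((3 * x + 7) % 256 for x in range(10_000))

# 10,000 patternless observations.
random.seed(42)
noise = bytes(random.randrange(256) for _ in range(10_000))

# A general-purpose compressor exploits the lawful regularity...
print(len(zlib.compress(lawful)))  # far smaller than 10,000 bytes

# ...but can do essentially nothing with noise.
print(len(zlib.compress(noise)))   # roughly 10,000 bytes, or slightly more
```

The lawful stream shrinks by more than an order of magnitude because the compressor, like Solomonoff's scientist, finds the short description behind the data.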
He looked past Chin toward streams of numbers running in opposite directions. He understood how much it meant to him, the roll and flip of data on a screen. He studied the figural diagrams that brought organic patterns into play, birdwing and chambered shell. It was shallow thinking to maintain that numbers and charts were the cold compression of unruly human energies, every sort of yearning and midnight sweat reduced to lucid units in the financial markets. In fact data itself was soulful and glowing, a dynamic aspect of the life process. This was the eloquence of alphabets and numeric systems, now fully realized in electronic form, in the zero-oneness of the world, the digital imperative that defined every breath of the planet's living billions. Here was the heave of the biosphere. Our bodies and oceans were here, knowable and whole.
Don DeLillo (Cosmopolis)
The interface theory says that space and time are not fundamental aspects of objective reality, but simply a data format for messages about fitness, a format evolved to compress and correct such messages. Objects in spacetime are not aspects of objective reality, but simply messages about fitness coded in a format of icons that is specific to the needs of Homo sapiens. In particular, our bodies are not aspects of objective reality, and our actions don’t give us direct access to preexisting objects in spacetime. Our bodies are messages about fitness that are coded as icons in a format specific to our species. When you perceive yourself sitting inside space and enduring through time, you’re actually seeing yourself as an icon inside your own data structure.
Donald D. Hoffman (The Case Against Reality: Why Evolution Hid the Truth from Our Eyes)
Strange to consider that these two linguistic operations, metaphor and analogy, so often linked together in rhetoric and narratology, and considered to be variants of the same operation, are actually hugely different from each other, to the point where one is futile and stupid, the other penetrating and useful. Can this not have been noticed before? Do they really think x is like y is equivalent to x is to y as a is to b? Can they be that fuzzy, that sloppy? Yes. Of course. Evidence copious. Reconsider data at hand in light of this; it fits the patterns. Because fuzzy is to language as sloppy is to action. Or maybe both these rhetorical operations, and all linguistic operations, all language—all mentation—simply reveal an insoluble underlying problem, which is the fuzzy, indeterminate nature of any symbolic representation, and in particular the utter inadequacy of any narrative algorithm yet invented and applied. Some actions, some feelings, one might venture, simply do not have ways to be effectively compressed, discretized, quantified, operationalized, proceduralized, and gamified; and that lack, that absence, makes them unalgorithmic. In short, there are some actions and feelings that are always, and by definition, beyond algorithm. And therefore inexpressible.
Kim Stanley Robinson (Aurora)
The best entrepreneurs don’t just follow Moore’s Law; they anticipate it. Consider Reed Hastings, the cofounder and CEO of Netflix. When he started Netflix, his long-term vision was to provide television on demand, delivered via the Internet. But back in 1997, the technology simply wasn’t ready for his vision—remember, this was during the era of dial-up Internet access. One hour of high-definition video requires transmitting 40 GB of compressed data (over 400 GB without compression). A standard 28.8K modem from that era would have taken over four months to transmit a single episode of Stranger Things. However, there was a technological innovation that would allow Netflix to get partway to Hastings’s ultimate vision—the DVD. Hastings realized that movie DVDs, then selling for around $20, were both compact and durable. This made them perfect for running a movie-rental-by-mail business. Hastings has said that he got the idea from a computer science class in which one of the assignments was to calculate the bandwidth of a station wagon full of backup tapes driving across the country! This was truly a case of technological innovation enabling business model innovation. Blockbuster Video had built a successful business around buying VHS tapes for around $100 and renting them out from physical stores, but the bulky, expensive, fragile tapes would never have supported a rental-by-mail business.
Reid Hoffman (Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies)
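Hoffman's four-month figure holds up; here is a quick back-of-the-envelope check of the quote's arithmetic (a sketch, not from the book):

```python
# Sanity-checking the quote: one hour of HD video as 40 GB of
# compressed data, pushed through a 28.8 kbps dial-up modem.
episode_bits = 40e9 * 8          # 40 GB expressed in bits
modem_bps = 28_800               # 28.8K modem throughput, bits per second
seconds = episode_bits / modem_bps
days = seconds / 86_400          # 86,400 seconds per day
print(f"{days:.0f} days")        # prints "129 days" — over four months
```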
The rate of time flow perceived by an observer in the simulated universe is completely independent of the rate at which a computer runs the simulation, a point emphasized in Greg Egan's science-fiction novel Permutation City. Moreover, as we discussed in the last chapter and as stressed by Einstein, it's arguably more natural to view our Universe not from the frog perspective as a three-dimensional space where things happen, but from the bird perspective as a four-dimensional spacetime that merely is. There should therefore be no need for the computer to compute anything at all; it could simply store all the four-dimensional data, that is, encode all properties of the mathematical structure that is our Universe. Individual time slices could then be read out sequentially if desired, and the "simulated" world should still feel as real to its inhabitants as in the case where only three-dimensional data is stored and evolved. In conclusion: the role of the simulating computer isn't to compute the history of our Universe, but to specify it. How specify it? The way in which the data are stored (the type of computer, the data format, etc.) should be irrelevant, so the extent to which the inhabitants of the simulated universe perceive themselves as real should be independent of whatever method is used for data compression. The physical laws that we've discovered provide great means of data compression, since they make it sufficient to store the initial data at some time together with the equations and a program computing the future from these initial data.
As emphasized on pages 340-344, the initial data might be extremely simple: popular initial states from quantum field theory with intimidating names such as the Hawking-Hartle wavefunction or the inflationary Bunch-Davies vacuum have very low algorithmic complexity, since they can be defined in brief physics papers, yet simulating their time evolution would simulate not merely one universe like ours, but a vast decohering collection of parallel ones. It's therefore plausible that our Universe (and even the whole Level III multiverse) could be simulated by quite a short computer program.
Max Tegmark (Our Mathematical Universe: My Quest for the Ultimate Nature of Reality)
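Tegmark's "initial data plus equations" form of compression can be sketched in a few lines (a toy analogy, not his code): rather than storing every state of a deterministic system, store the seed and the rule, and recompute on demand.

```python
# Toy version of "laws + initial data as compression": a deterministic
# update rule plays the role of a physical law.
def evolve(state: int, steps: int) -> list[int]:
    """Apply a simple deterministic rule repeatedly, returning the history."""
    history = [state]
    for _ in range(steps):
        state = (1_103_515_245 * state + 12_345) % 2**31
        history.append(state)
    return history

# Storing the full history costs thousands of integers...
full_history = evolve(state=42, steps=10_000)

# ...but the (initial state, rule, step count) triple reproduces it exactly,
# so those three pieces are a complete compressed description.
assert evolve(42, 10_000) == full_history
```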
This book is a compilation of interesting ideas that have strongly influenced my thoughts, and I want to share them in compressed form. These ideas can change your worldview and bring inspiration and the excitement of discovering something new. The emphasis is not on the technology, because it is constantly changing. It is much more difficult to change the accompanying circumstances that affect the way technological solutions are realized. The chef did not invent salt, pepper, and other spices. He just chooses good ingredients and uses them skilfully, so others can enjoy his art. If I’ve been successful, the book creates a new perspective for which the selection of ingredients is important, as well as the way they are smoothly and efficiently arranged together. In the first part of the book, we follow the natural flow needed to create the stimulating environment necessary for the survival of a modern company. It begins with the challenges that corporations are facing, the changes they are, more or less successfully, trying to make, and the culture they are trying to establish. After that, we discuss how to be creative, as well as what to look for in the innovation process. The book continues with a chapter on the importance of inclusion and purpose. This idea of inclusion – across ages, genders, geographies, cultures, sexual orientation, and all the other areas in which new ways of thinking can manifest – is essential for solving new problems, as well as integral to finding new solutions to old problems. Purpose motivates people to reach their full potential. The second and third parts of the book describe the areas that are important to support what is expressed in the first part. A flexible organization is based on IT alignment with business strategy. As a result of acceleration in the rate of innovation and technological change, markets evolve rapidly, products’ life cycles get shorter, and innovation becomes the main source of competitive advantage.
Business Process Management (BPM) goes from task-based automation to process-based automation (automating a number of tasks in a process), then to functional automation across multiple processes, and even moves towards automation at the business ecosystem level. Analytics brought us information and insight; AI turns that insight into superhuman knowledge and real-time action, unleashing new business models, new ways to build, dream, and experience the world, and new geniuses to advance humanity faster than ever before. Companies and industries are transforming our everyday experiences and the services we depend upon, from self-driving cars, to healthcare, to personal assistants. AI is a central tenet of the disruptive changes of the 4th Industrial Revolution; a revolution that will likely challenge our ideas about what it means to be human and just might be more transformative than any other industrial revolution we have seen yet. Another important disruptor is the blockchain – a distributed, decentralized digital ledger of transactions with the promise of liberating information and making the economy more democratic. You no longer need to trust anyone but an algorithm. It brings reliability, transparency, and security to all manner of data exchanges: financial transactions, contractual and legal agreements, changes of ownership, and certifications. A quantum computer can efficiently simulate any physical process that occurs in Nature. Potential (long-term) applications include pharmaceuticals, solar power collection, efficient power transmission, catalysts for nitrogen fixation, carbon capture, etc. Perhaps we can build quantum algorithms for improving computational tasks within artificial intelligence, including sub-fields like machine learning. Perhaps a quantum deep learning network can be trained more efficiently, e.g. using a smaller training set. This is still in the conceptual research domain.
Tomislav Milinović
The report showed that while ‘intangible assets’ were growing on US and UK company balance sheets at nearly three times the rate of tangible assets, the actual size of the digital sector in the GDP figures had remained static. So something is broken in the logic we use to value the most important thing in the modern economy. However, by any measure, it is clear that the mix of inputs has altered. An airliner looks like old technology. But from the atomic structure of the fan blades, to the compressed design cycle, to the stream of data it is firing back to its fleet HQ, it is ‘alive’ with information.
Paul Mason
Features of Cassandra
In order to keep this chapter short, the following bullet list covers the great features provided by Cassandra:
- Written in Java, and hence providing native Java support
- A blend of Google BigTable and Amazon Dynamo
- Flexible, schemaless column-family data model
- Support for structured and unstructured data
- Decentralized, distributed peer-to-peer architecture
- Multi-data-center and rack-aware data replication
- Location transparent
- Cloud enabled
- Fault-tolerant with no single point of failure
- Automatic and transparent failover
- Elastic, massively, and linearly scalable
- Online node addition or removal
- High performance
- Built-in data compression
- Built-in caching layer
- Write-optimized
- Tunable consistency, providing choices from very strong consistency to different levels of eventual consistency
- Provision of Cassandra Query Language (CQL), a SQL-like language imitating the INSERT, UPDATE, DELETE, and SELECT syntax of SQL
- Open source and community-driven
C.Y. Kan (Cassandra Data Modeling and Analysis)
The process of creating .jpgs is synonymous with the process of throwing away information. 12 bits of data per channel from the sensor get squeezed into 8 bits of data per channel (giving up some tonality and fine shades of color). A little bit of dynamic range gets lost too. Then lots of visual information that the human brain cannot perceive gets thrown away, which is what’s responsible for JPG’s famously small size. If there is a lot of high-frequency detail in the image, then that gets replaced by what’s called a .jpg compression artifact (which I describe in a couple of sections). Then the compressed .jpg image file is written to the memory card, and the raw information from which the .jpg was produced is discarded (unless you were wise enough to shoot in RAW + JPG mode).
Gary L. Friedman (The Complete Guide to Sony's Alpha 77 II: Professional Insights for the Experienced Photographer)
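The 12-bit-to-8-bit squeeze Friedman describes can be shown in miniature (a sketch of the principle only, not a camera's actual tone-mapping, which is nonlinear): each sensor value loses its four low-order bits, so sixteen distinct sensor levels collapse into a single JPG tone.

```python
# Crude 12-bit -> 8-bit reduction: drop the 4 least significant bits.
def to_8bit(sensor_value_12bit: int) -> int:
    """Map a 12-bit sensor value (0..4095) onto an 8-bit tone (0..255)."""
    return sensor_value_12bit >> 4

# Sixteen different sensor readings become indistinguishable after the
# squeeze — this is the lost tonality the quote refers to.
assert {to_8bit(v) for v in range(2048, 2064)} == {128}
```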
Like a recovering alcoholic, he marked each day without a jack in as an accomplishment. It was an exciting life, cutting through databank security and pilfering whatever he could, battling live Net security agents in some liquid mercury duel with programs he built from the ground up. The full-on speed of a crèche run was so different from just jacking into the front of the crèche. If a regular jack run was a sprint, a crèche run was a drag race, an intense compression of time and speed and data folded into every erg of consciousness. That kind of intensity couldn’t be easily put down, and once Bridge had removed himself from those runs, he’d felt their absence every goddamn day.
Gary A. Ballard (Under the Amoral Bridge)
Personal Thinking Blockchains More speculatively for the farther future, the notion of blockchain technology as the automated accounting ledger, the quantized-level tracking device, could be extensible to yet another category of record keeping and administration. There could be “personal thinking chains” as a life-logging storage and backup mechanism. The concept is “blockchain technology + in vivo personal connectome” to encode and make useful in a standardized compressed data format all of a person’s thinking. The data could be captured via intracortical recordings, consumer EEGs, brain/computer interfaces, cognitive nanorobots, and other methodologies. Thus, thinking could be instantiated in a blockchain — and really all of an individual’s subjective experience, possibly eventually consciousness, especially if it’s more precisely defined. After they’re on the blockchain, the various components could be administered and transacted — for example, in the case of a post-stroke memory restoration. Just as there has not been a good model with the appropriate privacy and reward systems that the blockchain offers for the public sharing of health data and quantified-self-tracking data, likewise there has not been a model or means of sharing mental performance data. In the case of mental performance data, there is even more stigma attached to sharing personal data, but these kinds of “life-streaming + blockchain technology” models could facilitate a number of ways to share data privately, safely, and remuneratively. As mentioned, in the vein of life logging, there could be personal thinking blockchains to capture and safely encode all of an individual’s mental performance, emotions, and subjective experiences onto the blockchain, at minimum for backup and to pass on to one’s heirs as a historical record. 
Personal mindfile blockchains could be like a next generation of Fitbit or Apple’s iHealth on the iPhone 6, which now automatically captures 200+ health metrics and sends them to the cloud for data aggregation and imputation into actionable recommendations. Similarly, personal thinking blockchains could be easily and securely recorded (assuming all of the usual privacy concerns with blockchain technology are addressed) and mental performance recommendations made to individuals through services such as Siri or Amazon’s Alexa voice assistant, perhaps piped seamlessly through personal brain/computer interfaces and delivered as both conscious and unconscious suggestions. Again perhaps speculatively verging on science fiction, ultimately the whole of a society’s history might include not just a public records and document repository, and an Internet archive of all digital activity, but also the mindfiles of individuals. Mindfiles could include the recording of every “transaction” in the sense of capturing every thought and emotion of every entity, human and machine, encoding and archiving this activity into life-logging blockchains.
Melanie Swan (Blockchain: Blueprint for a New Economy)
That iPhone sitting in your pocket is the exact equivalent of a Cray XMP supercomputer from twenty years ago that used to cost ten million dollars. It’s got the same operating system software, the same processing speed, the same data storage, compressed down to a six-hundred-dollar device. That is the breakthrough Steve achieved. That’s what these phones really are!
Brent Schlender (Becoming Steve Jobs: The Evolution of a Reckless Upstart into a Visionary Leader)
Use manual sanity checks in data pipelines. When optimizing data processing systems, it’s easy to stay in the “binary mindset” mode, using tight pipelines, efficient binary data formats, and compressed I/O. As the data passes through the system unseen, unchecked (except for perhaps its type), it remains invisible until something outright blows up. Then debugging commences. I advocate sprinkling a few simple log messages throughout the code, showing what the data looks like at various internal points of processing, as good practice — nothing fancy, just an analogy to the Unix head command, picking and visualizing a few data points. Not only does this help during the aforementioned debugging, but seeing the data in a human-readable format leads to “aha!” moments surprisingly often, even when all seems to be going well. Strange tokenization! They promised input would always be encoded in latin1! How did a document in this language get in there? Image files leaked into a pipeline that expects and parses text files! These are often insights that go way beyond those offered by automatic type checking or a fixed unit test, hinting at issues beyond component boundaries. Real-world data is messy. Catch early even things that wouldn’t necessarily lead to exceptions or glaring errors. Err on the side of too much verbosity.
Micha Gorelick (High Performance Python: Practical Performant Programming for Humans)
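Gorelick's "analogy to the Unix head command" takes only a few lines in practice (a minimal sketch; the `peek` helper and its usage are my invention, not from the book):

```python
import itertools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def peek(stream, n=3, label="data"):
    """Log the first n items of a stream (a la Unix head) without losing them."""
    head = list(itertools.islice(stream, n))
    for item in head:
        log.info("%s sample: %r", label, item)
    # Re-attach the peeked items so downstream stages see the full stream.
    return itertools.chain(head, stream)

# Hypothetical usage at an internal point of a pipeline:
records = iter(["héllo wörld", "text\x00with\x00nulls", "normal line"])
records = peek(records, label="raw input")
processed = [line.strip() for line in records]
assert len(processed) == 3
```

Seeing `text\x00with\x00nulls` or an unexpected encoding in the log is exactly the kind of "aha!" moment the quote describes, caught long before anything blows up.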