# The "fragility" of digital media



## Metryq (Jan 27, 2012)

http://www.variety.com/article/VR1118048861/

*Academy sounds alarm about fragility of digital production*
*Sci-Tech report spotlights short lifespan of non-film formats*

Treating the symptoms instead of the actual problem
This article touches on a diverse range of topics, yet addresses only the smallest of them. Specifically, the story is about indie producers being both ignorant of and powerless to deal with the "fragility" of digital media. I am both surprised and not surprised by the ignorance aspect. Production of any sort is going to require some technical knowledge. However, refinement of such technologies also makes it easier and easier for the layman to pass "Go." Like flipping on a light switch or heating a prepared meal in a microwave oven, many people take technology for granted. It just is, and they use it.

Anyone who is even marginally propellerheaded will already be aware of the "fragility" of digital media. So in one sense, I find the article vaguely panic-mongering, but I'll address that in more detail shortly. For now, let's stick with those indie producers. 

The tyranny of numbers
Semiconductors did not single-handedly make digital computers possible. In an alternate universe, technology might have bypassed vacuum tubes altogether. However, vacuum tubes were easier to construct and tinker with when the first semiconductors were discovered. So engineers took the path of least resistance and developed an entire "language" in vacuum tubes for switching, amplifying, and other basic building blocks. Only a few decades later, more was known about semiconductors, and they had major advantages like lower power consumption and significantly longer lifespans. But the big obstacle still to overcome was the "tyranny of numbers."

A single logic unit in a computer might take eight tubes or semiconductors, plus a few ancillary components, all assembled in the correct manner. A single logic unit is virtually worthless, so one needs several tens, hundreds, or even thousands of logic units to do anything worthwhile. But the more logic units are needed, the greater the chance for human error to creep in while assembling all those components. _One_ error can make the whole assembly useless. Also, in the time necessary to painstakingly assemble this cluster, one could have done the work manually a dozen times over. This is the tyranny of numbers.

Then two engineers, working independently, invented "monolithic circuits" and broke the "sound barrier" of the tyranny of numbers. In a nutshell, a complex circuit of millions of components can be designed once, then "photographically" mass-manufactured. It might take months to design, build, and test the prototype. After that, duplicates can be run off very rapidly, each one a "force multiplier" making the long effort worthwhile. It was a revolution as significant as the printing press in Renaissance Europe.

Shooting film, waiting for it to be developed, cutting and pasting it together by hand, along with audio tape handled in the same fashion, is a kind of tyranny of numbers. The economy of digital production and editing tools makes it _possible_ for indie programs to be made at all.

Irresistible force and immovable object
A handheld video camcorder is like the point of a spear—tremendous power focussed into a compact area. However, that fine point can also be blunted and damaged more easily than a club made of lesser materials. More is being produced with digital, but it may not last as long. This is actually something of a trade-off. One of digital media's advantages is the way content can be rapidly generated, disseminated, and even edited—compare newspapers with news sites on the Web. Not all digital content is meant to be transient, but transience is a built-in limitation of the storage technologies we currently use.

If we consider a single character printed on paper as a point of information, then that bit of information is supported by billions of atoms in ink and cellulose, and a letter chiseled into granite is supported by even more. Meanwhile a digital bit is stored on a tiny fraction of the matter needed for paper or stone. The paper and stone have a kind of "inertia" against change that is a disadvantage in a rapidly moving world, yet both can tolerate tremendous wear before the data is completely lost, while the digital bit can be gone before one knows it. Matter is very noisy at smaller scales.

There are books and documentaries speculating on what might happen to Man's creations if Mankind suddenly vanished tomorrow. A structure like the World Trade Center towers in New York would collapse (even without the help of jet planes) before a structure like the Empire State Building because the latter is so "over-built," like a castle...or a paper book. (In fact, a plane did crash into the Empire State Building in July of 1945.) Bronze and plastics are two materials with near-immortal longevity. I suppose we could resort to bronze punch cards for computers, or maybe even plastic discs with grooves cut into them...but then there's that tyranny of numbers again. Can you imagine trying to archive the petabytes of just the data we _want_ to preserve by this method?

"There's a big difference between _mostly_ dead and _all_ dead."
When a computer file is "deleted"—thrown in the trash and the trash emptied—the file still exists on the hard-drive. Computer hard-drives are like books with a table of contents (TOC) and pages full of print. "Deleting" every file on the hard-drive is like tearing the TOC out of a book. The pages full of text still exist, and the files can be "recovered" with special software that essentially "reads" the hard-drive from cover-to-cover, creating a new TOC along the way. Basically, a "deleted" file exists until something else over-writes it, and removing a file's listing from the TOC ("deleting" it) gives permission to over-write the file.
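The TOC analogy above can be sketched in a few lines of code. This is a toy model, not how any real filesystem is implemented—but it shows why "deleting" a file leaves its bytes recoverable until something overwrites them:

```python
# Toy model of the book/TOC analogy: deleting removes only the TOC
# entry; the page data survives until the page is reused.

class ToyDisk:
    def __init__(self, num_pages):
        self.pages = [b""] * num_pages   # raw storage ("pages of print")
        self.toc = {}                    # filename -> page index ("TOC")

    def write(self, name, data):
        # naive allocation: first page not currently listed in the TOC
        used = set(self.toc.values())
        page = next(i for i in range(len(self.pages)) if i not in used)
        self.pages[page] = data
        self.toc[name] = page

    def delete(self, name):
        # only the TOC entry goes away; the page bytes remain
        del self.toc[name]

    def recover(self):
        # "read the disk cover-to-cover" and rebuild a TOC from
        # whatever pages still hold data
        return {i: p for i, p in enumerate(self.pages) if p}

disk = ToyDisk(4)
disk.write("memo.txt", b"meet at noon")
disk.delete("memo.txt")
print(disk.recover())  # {0: b'meet at noon'} — the "deleted" data is still there
```

Note that a subsequent `write` in this model would happily reuse page 0, which is exactly the "permission to over-write" described above.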

Some operating systems may permit one to "secure delete" a file by deliberately over-writing it with patterns of ones and zeroes seven times. This is typically good enough to make the file unrecoverable by commercial data-recovery apps. Then there is "military grade" secure deletion, where a file is over-written _35_ times. This suggests that there are military techniques for recovering even a 7x secure delete. Consider that for a moment—if coherent data can be extracted even from the palimpsest of a 7x over-write, then magnetic storage must be more robust than generally thought.
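A multi-pass over-write can be sketched as below. This is a minimal illustration, not a substitute for a real tool: on SSDs with wear-leveling, or on journaling filesystems, the over-writes may never reach the original blocks at all.

```python
# Minimal sketch of a multi-pass "secure delete": over-write the
# file's bytes several times, then remove its directory entry.
import os

def secure_delete(path, passes=7):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # over-write with random bytes
            f.flush()
            os.fsync(f.fileno())       # push this pass to the device
    os.remove(path)                    # finally drop the "TOC" entry

with open("secret.txt", "wb") as f:
    f.write(b"launch codes")
secure_delete("secret.txt")
print(os.path.exists("secret.txt"))  # False
```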

An obsolete or proprietary format is not always unrecoverable:
http://www.cosmosmagazine.com/news/3827/lost-apollo-tapes-restored-and-broadcast

If the data is "valuable" enough, there are techniques for recovering it. A hard-drive may die, but there are companies specializing in recovering the data. They might rebuild damaged parts of the mechanism, or even pull apart the platters and reconstruct the data by a variety of hardware and software means. 

Even a damaged digital file can be recovered. An image file in an unrecognized format, or one with damaged headers, can sometimes be recovered with a commercial app like Adobe Photoshop. One might have to experiment by trial and error with the frame size, the number of color channels, etc. But it's doable.
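The trial-and-error part can even be narrowed down mechanically. A hypothetical sketch: for a headerless buffer of raw pixel data, enumerate every (width, height, channels) combination whose product matches the byte count, then eyeball each candidate:

```python
# Enumerate plausible frame geometries for a raw, headerless image
# buffer: every (width, height, channels) whose product equals the
# byte count is a candidate worth rendering and inspecting by eye.
def candidate_geometries(num_bytes, max_width=4096):
    for channels in (1, 3, 4):                  # gray, RGB, RGBA
        if num_bytes % channels:
            continue
        pixels = num_bytes // channels
        for width in range(1, max_width + 1):
            if pixels % width == 0:
                yield width, pixels // width, channels

raw = bytes(640 * 480 * 3)  # stand-in for bytes from a damaged file
geoms = list(candidate_geometries(len(raw)))
print((640, 480, 3) in geoms)  # True: the real geometry is on the list
```

A human still has to pick the candidate that looks like a picture rather than noise, but the search space shrinks from "anything" to a short list.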

Macromedia Director, a precursor to Flash, could compile a "Projector," a document and player all rolled into one. This made it very convenient for distribution, but also made Projectors extremely prone to obsolescence—at least on the Mac, but I know it affected many files for Windows users, too. Today, I can still extract most of the data from a Projector file by feeding it to FileJuicer, an app that scans through the pile looking for known data types, like images, video, audio, text, and so on.
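What a tool like FileJuicer does can be sketched in miniature (this is my own illustration of the general technique, not FileJuicer's actual code): scan a blob of bytes for known file signatures ("magic numbers") and note where each embedded file begins. Real carvers also locate the end of each file and cope with fragmentation; this only finds the starts.

```python
# FileJuicer-style carving in miniature: scan a byte blob for known
# file signatures and report the offset and type of each hit.
SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF89a": "gif",
}

def carve(blob):
    hits = []
    for magic, kind in SIGNATURES.items():
        start = 0
        while (i := blob.find(magic, start)) != -1:
            hits.append((i, kind))
            start = i + 1
    return sorted(hits)

# junk bytes with a PNG header and a GIF header buried inside
blob = b"\x00" * 10 + b"\x89PNG\r\n\x1a\n" + b"\x00" * 10 + b"GIF89a"
print(carve(blob))  # [(10, 'png'), (28, 'gif')]
```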

So "mostly dead" is not always "all dead," but I agree with the Variety article that some format standardization can help.

Sound barrier
The vast array of competing formats is not necessarily a bad thing. It means that there are lots of engineers out there, all trying to find some technological edge that might make their format stand out in the marketplace. On the other hand, the ever-shrinking amount of matter used to store each bit may mean that we're approaching an asymptotic limit—a "sound barrier" in storage.

But this does not mean it must be an absolute limit in the laws of physics. Perhaps a digital storage medium with the longevity of stone, paper, or plastic will require a breakthrough in materials physics... or maybe it is simply an engineering obstacle that some clever person will figure out how to bypass.

In the meantime, it is kind of ironic that new indie programming isn't being distributed when we have hundreds of TV channels overflowing with reruns of _old_ stuff. That, at least, is not an engineering problem.


----------



## chrispenycate (Jan 27, 2012)

The fact that a digital copy of an oeuvre is essentially identical to the original means that the information is potentially eternal. As long as information decay stays within certain limits, the copy survives, be it on the original medium or a younger mechanical support. Compare that with, say, the need to transfer films from (potentially explosive) nitrate stock to newer, longer-lived (but far from eternal) acetate, where there was a considerable loss of quality. When the studio was doing puppet shows on video and we went onto the Ampex DCT standard, one of our programs had sixty-eight generations of picture on it – just try doing that with Betacam SP, or 35 mm.
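The point about generations is worth making concrete. Each digital copy is bit-for-bit identical, which a hash makes trivial to verify; analog dubbing adds noise every generation, while bytes do not (a sketch, assuming error-free copies within the decay limits mentioned above):

```python
# Sixty-eight digital "generations" cost nothing in quality: every
# copy is bit-identical, so the hash never changes.
import hashlib

original = b"one frame of puppet show"
copy = original
for _ in range(68):          # sixty-eight generations of copying
    copy = bytes(copy)       # fresh copy each pass

same = hashlib.sha256(original).digest() == hashlib.sha256(copy).digest()
print(same)  # True
```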

When the library of Alexandria burnt down, a significant portion of the knowledge available to humanity was lost, and the copies (of such volumes as had been copied) were likely to contain errors introduced by the scribes responsible. Nowadays, the information could be backed up to the point that the biosphere could be destroyed without it being lost (obviously, the lack of lifeforms to appreciate it would be a drawback, and it would decay with time as the cycle of renewal was interrupted). 

The lack of archives of the material is more a symptom of everybody being more interested in producing originals than of any shortcomings in the mechanical support medium or coding systems. There's just too much new programming when most TV stations run 24/7, even with reruns. It would demand an archiving/filing team as big as the administration of the station, and we know that exceeded the production team back in the seventies. Try Draper's "MS fnd in a lbry" {hey, I actually found it!} (http://home.comcast.net/~bcleere/texts/draper.html), written in 1961, and compare with the present state of affairs…

So, for me, the base problem is not the actual storage; it would be quite possible with present-day technology to develop a storage medium that would last five thousand years, but the continuous renewal of all data which interests anybody has been "chosen" (admittedly by default) as adequate. And more than adequate. No, the problem is applying Sturgeon's Law, and filtering out information that is not relevant.

I somehow don't think YouTube would agree with me.


----------



## Metryq (Jan 27, 2012)

That story is pretty warped, Chris! But it dovetails with my own thoughts—it might not be bad if some of this crap was lost.


----------

