The Black Box of the World

In continuing to explore the domains of the digital and the analog, I’ve come to an unexpected conclusion, one that will no doubt be obvious to others but which I nevertheless found surprising. I’ve finally realized the extent to which analogicity has been hounded out of technical history. In the past I had assumed, incorrectly, that digitality and analogicity were more or less equal alternatives. Yes, there was a litany of digital techniques in human history — moveable type, arithmetic, metaphysics — but so too history could furnish its share of analogical triumphs, right? Not exactly. In my unscientific survey, the digital techniques far outweigh the analog ones. And many things categorized as blockbuster interjections of peak analogicity — the invention of the calculus, Richard Dedekind’s 1858 definition of real numbers — harbor deeply ingrained anti-analog biases upon closer inspection. Dedekind sought to discretize the real, not think the real as pure continuity, and his tool of choice was the cut, a digital technology if there ever was one. And while Newton’s “fluxions” are genuinely strange and interesting, both Newtonian and Leibnizian calculus aimed to “solve” the problem of pure continuity via recourse to a distinctly digital mechanism: the difference unit, or differential. Good show, now try it again without cheating! It almost makes me nostalgic for Euclid. At least he stayed true to the analog sciences of line and curve, without recourse to the digital crutches of algebra or arithmetic.

So while I’m deeply skeptical of the analog turn in theory — a few decades ago it was all language, structure, symbol, economy, logic, now it’s all affect, experience, expression, ethics, aesthetics — it’s only fair to admit the profound rareness of analogicity. Particularly in philosophy, which is almost entirely dominated by digital thinking. (In mathematics it’s not even close: math is one long sad story of arithmetic subduing geometry, the symbol subduing the real.) It’s exceptionally difficult to think continuity as continuity. Very few have accomplished this feat. So if anything we need more work on continuity and analogical sciences, not less. More work on signal processing, noise, randomness, modularity, curves and lines, heat and energy, fields, areas, transduction, quality, intuition. Less on arithmetic and discrete breaks. More on bending, blurring, bleeding, and sliding. More on the body, more on real experience. More on what William James called the “blooming, buzzing confusion” of life.

The answer is all around us, in real materiality. Here is the theorist Karen Barad talking about how quantum particles unsettle the various abstractions fueling Western philosophy and culture:

“Quantum particles unsettle comfortable notions of temporality, of the new, the now, presence, absence, progress, tradition, evolution/extinction, stasis/restoration, remediation, return/reversal, universal, generation/production, emergence, recursion, iterations, temporary, momentary, biographic, historical, fast/slow, speeding up, intensifying, compressing, pausing, disrupting, rupturing, changing, being, becoming.”

In other words, if computers are black boxes, and if they can be opened, revealing other black boxes, with boxes inside of those boxes, still there exists one black box that’s not like the others, the black box of the world. It’s the box that can never be opened.

Or I should clarify, we can open it. That’s what phenomenology teaches; our mode of being inquires into the conditions of real materiality. So we can. But computers can’t. Computers can process data but they can’t be there present in and as the data. I mean this 100% phenomenologically. Digital computers can’t not formalize. That is their special curse: always only form. Always only box. Presence isn’t part of the equation. Computers require input, but they have a hard time generating input out of whole cloth. Computers require givenness, but they can’t, ultimately, give it themselves.

In other words, computers finally solved the old mind-body problem from Descartes. It was easy. Just keep the mind, and amputate the body! Computers are idealism, perfected. And every perfection comes at a cost.

I’m exaggerating, of course. Computers exist in the real world. For instance, crypto-currency mining is defined explicitly as a thermodynamic process (the expenditure of energy) not an immaterial process. And to be clear: my point is really about digital computers in silicon. DNA computers and quantum computers are something quite different: DNA computers because they’re massively parallel; quantum computers because they don’t follow the logic of exclusively-binary states. And we know that computers can generate data until the cows come home, even if, as I maintain, computers are difference engines and not “presence engines.”

But what about neural nets and the kinds of images drawn using AI, don’t they generate new data? No, these kinds of algorithms are just glorified techniques for computing averages. The semantic essence of these images was pre-tagged and databased by some underpaid intern or Mechanical Turk worker. Don’t be dazzled by Deep Dream. You made the data, Google just crunched the numbers. In the 19th Century it was called “composite photography” and it was used for eugenics research. Oh how the world turns. And anyway artists like Jason Salavon or Trevor Paglen have made much better use of this technique than the AI companies ever will.

[Image: procedural noise landscape]

Of all genuinely computer-generated data — not the stuff tagged by workers — the most successful is probably procedural noise. Procedural noise is essentially a pseudo-random sequence of numbers. It has been incredibly influential in computer graphics and other areas.

Procedural noise is mathematically elegant. It’s what they call a “fract of a sine” – that is, the fractional component of a scaled-up sine value. How does it work? First, compute sin(x); then multiply by a large number, dragging the digits from deep, deep rightward past the decimal point up into significance; keep only the fraction that remains. That value will be, for all practical purposes, random. Do it again with a new x, and you have a pseudo-random sequence. So it’s the fractional component — or “fract” — of a scaled-up wave:

//compute the sine of x
sin(2) = 0.90929742682

//multiply by a large number to scale the value way up
0.90929742682 * 100000 = 90929.742682

//lop off the integer part and keep only the fractional component
//the remaining value is pseudo-random
.742682

//voila the formula for a pseudo-random number
random = fract(sin(x) * 100000)
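
For anyone who wants to try it, here is the same trick as a short, runnable sketch in Python. It is a minimal illustration of the technique, not canonical shader code; the function names are mine, and the round constant 100000 simply mirrors the example above (shader implementations usually prefer a large non-round constant, but the principle is identical).

# a minimal Python sketch of the "fract of a sine" generator
import math

def fract(value):
    # keep only the digits to the right of the decimal point
    return value - math.floor(value)

def noise(x, scale=100000.0):
    # pseudo-random value in [0, 1): the fractional part of a scaled sine
    return fract(math.sin(x) * scale)

# feed in successive inputs to get a pseudo-random sequence
for i in range(1, 6):
    print(i, noise(i))

Same input, same output, every time; the randomness here is only apparent, a matter of how little pattern the deep fractional digits of a sine seem to carry.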

There’s already enough noise in the system. You just have to know where to look for it, in this case deep, deep down inside a sine wave. But this makes perfect sense. Of course the source of this random “noise” would be a wave form, the sine wave. For we know that there is no greater analog technology than the wave form. And we know that all randomness — perceptual or actual — has its roots in the analogical real.

Another success is the JPEG compression algorithm, used for still images and, in close variants, for digital video. Philosophers like to wring their hands anxiously over the fact that digital images can be synthesized (and thus manipulated and even faked), breaking the supposed link between referent and representation. But, as usual, these philosophers don’t really know what they are talking about. If you’re nervous about images being inauthentic, the queue for complaints is twenty-five hundred years old and it starts right behind Socrates.

It’s not just that digital images can be manipulated; any image compressed through JPEG (or a similar algorithm) is *100%* synthetic. The image exists strictly as a combination of what they call “DCT basis functions” with corresponding coefficients. The DCT basis functions are little wavelets and they work a bit like the alphabet does, only with more significant visual impact. There are 64 of them, just like the squares of a chess board. Every inch of your image is synthesized by combining these 64 thumbnail elements in different strengths, determined by the coefficients. The “D” in DCT stands for “discrete.” The “C” stands for “cosine” — not “sine” but a wave form nonetheless. So, again, the analog is the “real materiality” at the heart of the digital.
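
To see what it means for an image to be synthesized from these 64 thumbnail elements, here is a minimal sketch in Python with NumPy. The coefficient values are toy numbers of my own choosing, not data pulled from a real JPEG file, and a real decoder also undoes quantization and runs far faster; but the arithmetic is the same: one 8×8 block rebuilt as a weighted sum of DCT basis functions.

import numpy as np

N = 8  # JPEG works on 8x8 blocks, hence the 64 basis functions

def dct_basis(u, v):
    # the (u, v)-th basis function: an 8x8 patch woven from two cosines
    x = np.arange(N)
    cos_x = np.cos((2 * x + 1) * u * np.pi / (2 * N))
    cos_y = np.cos((2 * x + 1) * v * np.pi / (2 * N))
    alpha = lambda k: np.sqrt(1.0 / N) if k == 0 else np.sqrt(2.0 / N)
    return alpha(u) * alpha(v) * np.outer(cos_y, cos_x)

def synthesize_block(coefficients):
    # the image block is nothing but the sum of all 64 patches,
    # each scaled by its coefficient
    block = np.zeros((N, N))
    for u in range(N):
        for v in range(N):
            block += coefficients[v, u] * dct_basis(u, v)
    return block

# toy coefficients: a strong "DC" term (overall brightness)
# plus a touch of low-frequency variation
coeffs = np.zeros((N, N))
coeffs[0, 0] = 400.0
coeffs[0, 1] = 30.0   # gentle left-to-right gradient
coeffs[1, 0] = -20.0  # gentle top-to-bottom gradient

print(np.round(synthesize_block(coeffs), 1))

Every pixel that comes out of this is pure combination: no pixel was ever stored, only the weights. That is the sense in which a JPEG image is synthetic through and through.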

In other words, the light at the end of the black box is an analog light. The computer is one half of an asymmetrical relation. Computers need their inputs; and most of the inputs come from other computers. But there’s one terminal input, the transductive interface where the analog real enters through the side door.

