During the 2020–2021 academic year I’m lucky enough to be co-leader (with Prof. Emily Apter) of a research lab at NYU focused on questions of translation. Translation is a notoriously difficult topic, not only within theoretical discussions, but also as a practical art form. Translation is hard to think; it’s also hard to do, and harder still to do well. Emily was part of a team working on “untranslatables,” which tried to keep the problem alive as a problem, while also providing some basis for orientation with specific terms that have remained, for whatever reason, untranslated.
I’m very much still a beginner when it comes to translation theory, and only have limited experience in actual translation. Instead of talking about translation I’d like to outline four operations that are somehow external to translation — prior to it perhaps, or maybe implicated by translation, yet still somehow necessary to it…I’m not yet sure how exactly.
Let’s begin with two basic kinds of conversion thresholds for bodies and information. I will call these “symmetrical” interfaces because they involve a common measure across the interface threshold. These are the two conversion interfaces that don’t entail a mode change, and thus are simpler in a certain sense.
1 – Transcoding
Those of you who do web coding know that colors can be easily transcoded between an RGB rendering and a hexadecimal rendering. For instance, the RGB color for solid red (255, 0, 0) may also be rendered in hexadecimal as (FF, 00, 00), or #FF0000 as it’s more commonly written. In this example, the decimal value “255” is equivalent to the hexadecimal value “FF.” (Try this yourself: when you count in base ten you get the ten integers from 0-9; but when you count in hex, you get six additional “integers” after 9 consisting of the letters A-F. Count up from 8 to 9, then to A, then B, and on up to F, before resetting to 0 again.)
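The RGB/hex round trip can be sketched in a few lines of Python (the function names here are mine, purely illustrative, not a standard API):

```python
def rgb_to_hex(r, g, b):
    """Render an RGB triple like (255, 0, 0) as a hex string like '#FF0000'."""
    return "#{:02X}{:02X}{:02X}".format(r, g, b)

def hex_to_rgb(code):
    """Recover the RGB triple from a '#RRGGBB' string."""
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

print(rgb_to_hex(255, 0, 0))   # #FF0000
print(hex_to_rgb("#FF0000"))   # (255, 0, 0)
```

Note that this particular transcoding is fully reversible: converting to hex and back always returns the original triple, since both renderings carry exactly the same value.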
All sorts of discrete values can be transcoded in this way. Thus the ASCII character “M” can be transcoded into a decimal or base-ten value as “77,” into a binary or base-two value as “01001101,” or into a hexadecimal or base-sixteen value as “4D.” Despite using different encoding schemes, all four of these renderings are equivalent at the level of value:
M ↔ 77 ↔ 01001101 ↔ 4D
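You can verify this chain of equivalences directly in Python, using the built-in `ord`, `format`, and `chr` functions:

```python
value = ord("M")              # the ASCII/Unicode code point of "M"
print(value)                  # 77        (decimal, base ten)
print(format(value, "08b"))   # 01001101  (binary, base two)
print(format(value, "X"))     # 4D        (hexadecimal, base sixteen)
print(chr(0x4D))              # M         (round trip back from hex)
```

All four renderings are interchangeable because the transformation rules between bases are fixed in advance; the machine simply executes them.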
I define transcoding as a digital-to-digital interface (D → D). When transcoding, one or more symbols are transformed from a certain symbolic system to another symbolic system. For the most part, transcoding is “merely” mechanical or technical, meaning that transcoding relies on pre-given transformation rules that are known and explicitly defined given the systems in question. Transcoding is thus the implementation of those rules.
As a digital-to-digital interface, the essence of transcoding is found in the logic of identity and similarity. Bridging two different symbolic systems, “M” forms an identity with “77.” Why? Because ASCII (in conjunction with decimal notation) has already defined this identity to be true. Yet while transcoding entails identity, that does not mean that all transcoding is reversible, lossless, or otherwise non-invasive.
Numerous examples exist of one-way transcoding. For instance, many transcoding calculations performed by computers are not reversible; compilation, one of the most ubiquitous examples of transcoding in software development, is in many cases non-reversible (at least not trivially so). Likewise transcoding operations are frequently compressive and even lossy, meaning they delete information, as is the case in some logical operations. For example, if two bits go in and only one bit comes out, the operation is compressive. And if this compression isn’t reversible, it’s lossy.
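The two-bits-in, one-bit-out case can be made concrete with logical AND (a minimal Python sketch):

```python
def AND(a, b):
    """Logical AND: two bits in, one bit out."""
    return a & b

# Enumerate the whole truth table.
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", AND(a, b))
```

Three of the four distinct input pairs collapse to the single output 0, so given only the output there is no way to recover which pair went in. The operation is compressive and, being irreversible, lossy.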
2 – Transduction
If transcoding is a digital-to-digital interface, transduction refers to thresholds that both start and end in analog representation (A → A). “The Way Things Go,” the film by Peter Fischli and David Weiss, is a good illustration of transduction. Watch here as a series of qualitatively different materials — glass, metal, rubber, water, fire — nevertheless communicate across a sequence of singularity thresholds. But what’s being communicated? Is it energy? Momentum? A message?
The Simondon and Deleuze folks are the best source for examples of transduction, which can sometimes acquire the luster of a grand epiphany: liquids shift from consistent flow to whirling turbulence; proteins relay signals in organisms; or crystals grow and propagate.
For a more mundane example, consider the common audio microphone. A microphone is a transducer in that it converts sound waves (marked by fluctuations in air pressure) into an electrical current. The waves are an instance of analog representation, but so is the intensity of the electrical current. Technical devices are certainly the distillation of complex symbolic regimes — as Flusser or Kittler would maintain, or even Marx with his notion of “general intellect” — yet the passage from air pressure intensity to electrical current intensity transpires real-on-real without having to encode the signal into discrete symbols.
Transduction entails analogy and correspondence. The electrical current fluctuates with the same proportionate intensities as those of the sound wave. Yet while “same,” the essence of transduction is not identity so much as difference. The air outside the microphone and the copper wire inside it do not form an identity because they are qualitatively different forms of materiality.
“For the process of transduction to occur, there must be some disparity, discontinuity or mismatch within a domain; two different forms or potentials whose disparity can be modulated,” observed Adrian Mackenzie in his book Transductions. “Transduction is a process whereby a disparity or a difference is topologically and temporally restructured across some interface. It mediates different organizations of energy. The membranes of the microphone move in a magnetic field. A microphone couples soundwaves and electrical currents” (25n3).
In a microphone sound wave and electrical current form a relation of difference, between which lies a point of singularity. Note that the notion of singularity is slightly counter-intuitive. We’re often taught that entities and bodies are defined by how they are enclosed. A perimeter, a membrane, a frame, a terminus — these are the ends of the body; the entity itself is within them. Yet transduction is exactly the reverse. With transduction, it’s the transition point that becomes the “thing.” The phase transition becomes the identifying marker. Or the moment when slope changes from negative to positive. Or the point when still water forms a vortex. Thresholds are no longer at the perimeter; the threshold is the thing.
This is one reason why Deleuze was obsessed with the “analog” mathematics of topology. Topology begins firmly with the infinitely “elastic” space of analog relation. Born adrift, topology finds its anchor points not in monads, like arithmetic does, but in singularities. These might be points of intense pressure or focus, or “catastrophe” points, or the place where a function crosses itself, or where it is undefined. All such singularities are examples of transductive thresholds. In the absence of symbolic identity, transduction relies on a material or even organic logic of difference and variation. Transduction is thus characterized by singularities or “individuations” rather than essences or identities. Transduction is characterized more through becoming and metastability than through being or stasis. If transcoding is monadic, transduction is nomadic.
Transcoding and transduction are the two “symmetrical” interfaces. I say symmetrical because they both involve a common measure across the conversion threshold. In the case of transcoding the common measure is a quantitative measure ensured through the principle of identity. In the case of transduction the common measure is a qualitative measure ensured through the principle of difference. (The former could be called an “arithmetic” measure; the latter a “geometric” measure.)
The key to both transcoding and transduction is that they are machinic rather than hermeneutic. In other words both transcoding and transduction operate within specific axiomatic systems in which all definitions and rules are known, and in which specific kinds of contingency and randomness are normal. One does not need to interpret the decimal value “255” in order to transcode it as the hexadecimal value “FF.” But this is only because the specific rules for transcoding mathematical values between different number bases have already been set up in advance. Transcoding is merely the execution of these technical rules. Likewise ice is not “interpreted” when it warms and passes the singularity point of 32 degrees Fahrenheit to become liquid water. Ice and water are subject to known laws governing heat transfer and the phase transitions of matter. (Of course one must admit the existence of chaotic and nonlinear phenomena; things don’t always happen the same way every time.)
There’s a lot more to be said about these first two interfaces. As regards computers, we can make the following observations: the vast majority of computation transpires under the heading of transcoding; an important but smaller subset of computation consists of transduction, mostly happening at the level of hardware. Very little if any true translation exists within computation, mostly because computers are not hermeneutic devices. This is not to deny the existence of things like machine translation, merely to point out that such technical marvels operate through transcoding not transduction (and certainly not through translation proper).
This post is already a bit long, so I’ll save interfaces three and four for the next post. These other interfaces — sampling and interpolation — are asymmetrical rather than symmetrical. That means they don’t involve a common measure across the interface threshold. In that sense, they are much more complex than either transcoding or transduction. As we will see, they entail a more radical change across two different modes of mediation. (To be continued…)