Model and Code

Excerpt from the book “In the Delirium of the Simulation. Baudrillard Revisited.”

As Baudrillard shows, simulation is not only an effect of political economy, and thus also of a (quasi-)philosophy, but a constitutive process of the system itself. It also stands for a new period of capital and a new mode of production. Within the global structures of simulation, however, things are not simulated because an automatic subject of capital has decreed it; rather, whatever can be simulated in the context of global capitalisation and commodification is simulated, not as an end or as the use of means, but in the midst of the recursiveness of the functionality and efficiency of capital. Jonathan Beller writes that Baudrillard could have said, by analogy with Marx, that today everything that is solid melts into information (Beller, 2021: 43). The calculability inherent in code liquefies the solid according to the requirements of capital. In the informatic flow, the computer’s close connection with the erasure of the territory from the map[i] and with the neoindustrial project of financialised globalisation is reflected in the derealisation of traditional forms of space and time. The ability of capital to use simulation models for itself, to differentiate every difference in the game of differences in order to enforce the code and, where necessary, to create differences and distinctions that serve only to expand the code, and to operationalise and optimise models and signs in order to ensure the further development of computation, is now reaching a climax (ibid.).

          In this context, Baudrillard also accuses Deleuze of defining difference itself both positively and affirmatively. The standard software widely used today, which is initially to be equated here with code, does more than just create facts; rather, its operations generate artificial realities in Baudrillard’s sense. Take, for example, the software packages from SAP, which dominate business administration and companies in Europe. On the one hand, computer simulations are used to create theory, i.e. concepts are created in the form of computer codes, and on the other hand, the simulations also provide instructions (methods) with the code, which are to be executed in order to test the theory (Dippel & Warnke, 2022: 130). Today, computers operate in reality and generate new realities in all kinds of areas, from economics and medicine to everyday life.

          Various optimisation models are part of the computer-aided decision-making tools. Their focus is on the variables of model structure, quantification and solution efficiency. There are also important metamodeling, analysis and interpretation activities associated with practical decision-making. Efforts to integrate A.I. and optimisation models are still mainly focused on model formulation and selection[ii]. With second-order cybernetics, contingent phenomena, including errors, disturbances and breakdowns, have been transformed into necessary laws and integrated into a computational optimisation model and into the rule of capital over indeterminacy. However, all models have approximation errors. Statistical models, for their part, usually claim that their approximation errors are unsystematic and without pattern: noise. This needs to be reconsidered. Automated decision-making therefore requires a mode of conceptual reasoning in which rules and laws are invented and experimentally structured around the social dimensions of computational learning. The investment of capital in technological intelligence has led to an explosion of artificial intelligence, with which unconscious or pre-cognitive decisions are executed automatically. With high-frequency trading, the algorithms of Netflix and Amazon, the live platforms of Uber and Airbnb, and online dating sites, cognitive capital seems to have transformed the subsumption of social intelligence into a set of learning algorithms that can efficiently make decisions without the assistance of consciousness. This leads to a synthesis of logic and calculation and, in the case of algorithmic intelligence, of non-deductive reasoning and dynamic statistics. Beyond data analysis, statistics then definitively becomes a model for the (re)construction of reality.
However, if a hypothetical model makes certain probabilistic assumptions, then other probabilistic implications can also follow deductively via the detour of testing and recalibration. The information of a poorly calibrated model can degenerate into noise, while that which has been discarded a priori as noise can become information.
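The point that a poorly calibrated model’s “noise” can turn out to be information admits a minimal sketch (Python; the data, coefficients and correlation threshold here are invented for illustration and are not drawn from the text): a straight line fitted to data generated by a quadratic process declares its residuals to be unsystematic noise, yet those residuals correlate almost perfectly with the missing quadratic term.

```python
# Illustration (hypothetical data): a misspecified model's residuals,
# treated a priori as noise, in fact carry a systematic pattern.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = 0.5 * x**2 + rng.normal(scale=0.1, size=x.size)  # true process is quadratic

# Misspecified model: ordinary least-squares straight line y ~ a*x + b
a, b = np.polyfit(x, y, deg=1)
residuals = y - (a * x + b)

# Were the residuals pure noise, they would be uncorrelated with x**2.
# Here they correlate strongly: the discarded "noise" is information.
r = np.corrcoef(residuals, x**2)[0, 1]
print(a, r)
```

Recalibrating the model to include the quadratic term would reabsorb that pattern as information, which is one sense in which, via testing and recalibration, further probabilistic implications follow from a model’s assumptions.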

          Phenomena of all kinds now become fodder for simulation models, with which the real phenomena are indexed and fed back into the model, while the rest is discarded. Making calculable what was not calculable by drawing it into the realm of mathematical calculation, making the incomprehensible rational and thus social because it is economic: that was the development path not only of science, but also of industrial capitalism on the way to computer capital. For Baudrillard, simulation involves the creation of an artificial reality through codes and their models, a hyperreality constructed from the code (as value) and its model. Every model has a code, but it is the code that produces the model; it is now primarily about the code, for example that of genetic miniaturisation[iii], which now indicates an important dimension of simulation. The code is to be understood as a coding of effects. In Baudrillard’s interpretation of genetics, the code is inscribed in the hereditary substance (with letters and their possible combinations). It is not a question of postulating a relationship between code and effect, because there is no relationship between the code and its effect; the effect is an integral part of the code. The code informs, so there is no longer any traversal from cause to effect. Nothing separates one pole from the other, the beginning from the end; rather, there is a collapse of the two poles (cause and effect) into each other, an implosion of meaning. This is where simulation begins. Today, however, we find ourselves in a world that Baudrillard, going beyond simulation, ultimately calls integral reality. This world, he claims, is the paroxysm of simulation, or the simulation of simulation. The virtual world is neurotic to the point of implosion, and our destiny is to merge with its surroundings, to disappear without feeling it, that is, to live forever, because life precedes death, the transparent shroud of a tailor-made immortality.
For immortality to work, we must first disappear, as in death. Either way, we die, because immortality is just another way of dying: to die and live on, or rather to live and die on.

Every system that approaches perfect operativity comes closer to its downfall. If the system says A is A, then it is approaching absolute power and at the same time total absurdity, that is, probable implosion. The infinitely living, the undead or the zombies, are now inevitably seized by the artificial corpse of hyperreality. They pay their respects by lining up to stare into the faces of the dead (hyperreality is also an accumulation of the dead), knowing that they could achieve the same effect if they stayed at home and looked in the mirror. The mirror throws back one’s own face as the perfection of inertia, and in this the uncanniness of implosion is revealed without ever being recognised (Shipley, 2021).

          Baudrillard writes that it is ultimately cultural or informational cloning that annuls all distinctions and precedes genetic cloning (Baudrillard, 2000: 56). According to Baudrillard, the reality that has become completely operative and functional, the thoroughly coded reality, is generated today from miniaturised cells, from matrices, memory banks and control models; and it can be replicated infinitely often[iv]. The code also constantly demonstrates the urgency of transforming its own process, its processors and its processing capacities. In this context, Beller sees digitised computing as a strategy of efficient risk management and a cost-benefit analysis of substitutable decisions for the essential program of capital; it opens up new ways of allocating resources and verifies the potential profitability of new sites of capitalisation necessary to halt the declining rate of profit.

          For Baudrillard, code as a “universal value” includes the essential elements and relations of functionality, programming and management, whereby the modelling, generation, reproduction and influencing of all relations of reality across different areas is included here (Baudrillard, 1976: 90ff.). The code is also based on the claim of programmatic infallibility and the ability to program reality ex ante, although uncertainty can never be ruled out (for Baudrillard, the code and the socio-economic stand for what could be called the “system”). With the code, the capital system perfects its own reproduction by means of a software of cool functionalisation, insofar as almost every new phenomenon can now be incorporated into the code and translated by it or even generated by it. He still insists on the tautological imperative of the mere functionality of the economy. The program code corresponds to deductive thinking, although it should be noted that today’s artificial intelligence models are not programmed, but trained. The code now also dominates the sign; thus, as Baudrillard explains, the term “sign” itself has only an allusive value (ibid.). The form of the structural law of value, as presented by Baudrillard in his book Symbolic Exchange and Death, is no longer inherent in the sign in general, but in a specific organisation, namely that of the code. The code thus no longer recognises the primacy of the sign (ibid.). We have already seen that Deleuze uses the concept of code in connection with modulation.

          In contrast to Marx’s work, Baudrillard’s work does not initially appear to contain any major sensational breaks. However, the concept of symbolic exchange, which still indulges in the utopian dream of transgression and the moment of religious nihilism that assumes a true world behind appearances, is later abandoned in favour of simulation and seduction. Baudrillard reveals a number of weaknesses in his book Symbolic Exchange and Death, which is generally regarded as his magnum opus, particularly in his discussion of Marxism. In this text, capital is not completely abandoned, but described as a totalising, technology-obsessed form of domination that seeps into and is scattered across all social, media and political fields, and which is optionally given names such as structure, system, code, model, etc. (ibid.: 23). The structural law of value stands here for the purest and at the same time most pervasive form of domination. It no longer has any reference to a ruling class, nor does it take the form of a balance of power structured by capital; it functions entirely without violence and penetrates the signs that surround us without a trace, acting everywhere by means of the code.


[i] The map is not the territory; when we walk around in the territory, we can only partially suspend the observation that remains separate from the map. This distinguishes the map of the social, in which we can become one of its elements, from the map of the sciences. In the second stage of simulation, the territory no longer precedes the map; rather, it is now the map that precedes the territory; it is the precession of the simulacra that produces the territory anew, and finally it is now the territory whose traces are slowly rotting away in the expansion of the map. It is then the rotting real, and not the map, whose traces can still be found here and there in the deserts: the desert of the real itself. The rotting real as a corpse is the desert of deserts, which is without an external boundary, because there is no outside, no beyond, only the reorganisation of the structure, only the wind that lifts the sand.

            Finally, the third stage of the simulation is neither about the map nor about the territory, because there is no longer any relationship between them. And it is this loss of contact between the original and the copy that leads to the artificial revival of reality, the reintroduction of difference in an attempt to simulate referentiality in order to prevent the collapse of representation. What happens now is the murder of the real. In the stage of the virtual, the sign is finally extinguished and the real has disappeared, insofar as any real is simultaneously possible and every possibility is immediately realised with empty acceleration.

[ii] The term “model” has a negative undertone in Baudrillard’s work and refers to the schematic reductionism that Guattari criticised in both structuralism and capitalist axiomatics. Guattari’s own metamodeling is offered as a more complex alternative to the prevailing social models. Guattari understood the term “model”, which he subjected to critique, in two ways: in the normative sense, the model is a learned pattern of behaviour that is adopted by the family, institutions and political regimes and that ultimately functions as a norm imposed by the system. In the descriptive sense, the model is, as is usual in the social sciences, a means of mapping processes and configurations. Marxist positions, by contrast with this latter, still rather vulgar scientistic position, start from conceptual models that are based on discursive disciplines and lead to concretisation in relation to epistemic objects. To use Niels Bohr’s analogy, they attempt to elaborate how, and through which scientific-technical means, discourses have the capacity to produce meanings, in order to relate these in turn to models of operativity and experiments. The model thus stands for an iterative process of producing new experimental traces that are checked as data for their compatibility with the existing model. This allows the model to be modified, i.e. the scientific process itself leads to a constant deconstruction of the model. These are mathematical models in the epistemological sense: models that actively realise something. For Serres, mathematical models in their statuesque massiveness are machine-like, automatic, algorithmic. They are objective, but they are not autonomous (Serres, 1991). Models are not primarily representative, but instrumental and operative (they realise a phenomenon), and they are also active (Serres even attributes mentality to the instruments that operate).
Models operate the equivalence between the real and the rational and mediate dark meanings in transparent structures.

            From a correlationist perspective, on the other hand, the classical scientist constructs a model, a grid or a syntax in order to superimpose it on the world, and the truth then emerges from the correspondence between theory and world.

            In recent research, permanent traces are referred to as data. Data can now be linked together in bottom-up models. Models can thus be roughly described as data clusters in data space. They can be used to capture a large amount of data at a glance, creating the illusion of being able to see a whole. Nevertheless, they can also be used to set in motion a cycle that keeps the model open and relativises the illusion, namely from the experiment to the model and back again.

[iii] Deleuze sees genetic engineering as the apotheosis of a new phase of capitalism, and for Baudrillard, DNA manipulation is not just another project of the sciences, but marks the transition to a neo-capitalist cybernetics, more precisely, to the organic (de)composition of the biocapital of the cybernetic order, which aims for absolute control. The biological theory of the code has long since placed itself at the service of this mutation. In this context, the clone is the biopunk body of simulation in the control society. Going further, clones now metonymically occupy the socio-political coordinates of the working class by performing both productive and reproductive labour. The new biocapitalism is the hidden and at the same time life-determining horizon of the clones’ existence. According to Sloterdijk, genetic engineering could also be described as a script that demands interpretation; in the 21st century, the genetic technologist is the new hermeneuticist.

[iv] We find ourselves in a logic of simulation that no longer has anything to do with a logic of facts and an order of causalities. Simulation is characterised by a precession of the model, of all models to the smallest fact, namely by the fact that the codes and models are there first, their circulation forming the magnetic field for all events. Facts now increasingly emerge at the interface of the models. The precession, the short circuit, indeed the confusion of the facts with their model, opens up an excess of possible interpretations, and they are all true insofar as their truth consists merely in exchanging themselves in a general cycle, according to Baudrillard, along the lines of the models from which they emerge.

You can order here: https://shop.becoming.press/products/in-the-delirium-of-the-simulation-baudrillard-revisited-by-achim-szepanski

or here: https://forceincmilleplateaux.bandcamp.com/merch/in-the-delirium-of-the-simulation-baudrillard-revisited-by-achim-szepanski
