Deleuze/Guattari and quantum finance

Deleuze/Guattari invite us to apply at least three models of thought to the theoretical analysis of economics, by tracing the image of the flow that, in their view, articulates each model. Only the third, the molecular-performative model, correctly stages the force of thought. (Laruelle contrasts the concept of the force of thought with thought. Thoughts or thinking are “false” representations that are developed and acted upon in the academic fields of concepts. The force-of-thinking, on the other hand, is the most important experience of thinking, which follows the transcendental axiom of the real.) The first model of thought, tied to philosophical classicism, asks again and again, with a kind of philosophical self-assurance, how the tree model can succeed in linearly distributing chaotic flows through unidirectional processes under certain circumstances. A centralized tree trunk is divided into two; such a binarity provides the dualistic distribution of flows. Or, to put it another way, from the centre of power the sprouts radiate by means of a uniform, synchronized set of homogenized binary processes. Digitalization, the division of one into two, means that two terms, whatever they may be, are placed in relation to each other: with regard to information, they are divided into two forms; with regard to language, into representation and the represented; with regard to thought, between the thinker and what he thinks. The digital thus contains the potential to separate things and objects and to continuously make further distinctions between them. With Alexander Galloway, we grasp the digital not as a distinction between zero and one, but as the more fundamental distinction/division into one and two that makes it possible to generate ever more distinctions. And this kind of distribution affects even those trees that are not strictly vertical in their organizational distribution.

With regard to the analysis of the economy this means that we have to explain how the tree system succeeds in linking different ‘unit dipoles’ (institutions, agents, assets, etc.), which all expand in a linear direction, in such a way that the established Euclidean routes, which have always been prescribed by a centre (link dipole), are followed. Here, Deleuze/Guattari are not exclusively concerned with the investigation of the functioning of planned economies, but also with analyses of the functioning of institutions such as the central banks of today’s capitalist economies. For example, the American Fed combines “open market operations” – an open market policy oriented towards the secondary financial markets and their course – with sovereign decisions (regarding short-term interest rates) in order to create root- or tree-like modes of distributing money capital flows. Such actions by central banks can certainly change the money supply and the velocity of money circulation, but today they influence prices and liquidity on the financial markets to a lesser extent than thirty years ago, through which the prices of classic commodity flows are ultimately manipulated (there is no fixed relationship between the money supply and the price level, nor between money supply and inflation via the money creation multiplier). As non-state institutions, the central banks are at the same time authorised by the state to act as a ‘link dipole’ on the money and capital markets in order to interfere with the rest of the ‘unit dipoles’ of the economy (commodity, equity and consumer goods markets). They try to implement so-called target interest rates, with which they have certain possibilities to actually regulate the demand for money (by buying and selling treasuries), and thus continuously give signals to the markets within the framework of a hierarchical distribution of information.
In fact, the central banks try everything to control and regulate the relations between the molar and the molecular, but ultimately they remain determined by what escapes them – or, to put it another way, by their impotence, which prevents them from comprehensively regulating the flow of money capital and thus repeatedly exposes their supposedly irrevocable zones of power and influence to the danger of crisis. The setting of growth and money supply targets – intermediate targets such as interest and discount rates, determination of bank reserves, etc. – has lost much of its effect on the economies in recent decades; these are now more intensely than ever influenced by real-financial capital and its machines, which, in the course of their ultra-fast transaction chains, have a strange effect on credit movements, interest rates, currency fluctuations and prices. Rather, the central banks now see themselves forced to conduct open market policy more offensively than ever and to trade on the money markets; they themselves mutate into investors, if not speculators, with the consequence that the definition of the monetary aggregates, which as control variables are supposed to show a stable relation to price formation processes, is shifted further and further by their own activities.

According to Deleuze/Guattari, the second model of thought is characterized as a system of small roots, as used especially in modernity. Deleuze/Guattari write: “The main root is (here) atrophied, its end dead, and already a multitude of secondary roots begins to proliferate wildly.” For Deleuze/Guattari, however, this model does not differ significantly from the tree model, because the system of small roots must clearly miss the idea of multiplicity because of a number of social restrictions; it takes into account the diversity of the phenomena of the outer world, but nevertheless only an identical subject seems to be able to coordinate them. Thus, the principal duality of thinking is not eliminated at all; rather, it proliferates even more strongly in the continuous division of the object, while the subject is simultaneously invented as a new type of totalizing unity.

With regard to economic analysis, too, this model contains a secret agreement with the higher unity, and this beyond a tendency that leads towards molecular multiplicity: contingencies such as those in the supply-demand relationship, decentralized causality and the stochastics of price movements may be regarded as examples of multiplicity, but these contingencies are limited; in the end they remain the so-called “contingencies” of the supply-demand relationship. They are finally to affirm “free market capitalism” as the model of a determinant, hierarchically organized capital and its state (the ideal total capitalist) by readily recognizing the power of a higher unity (the quasi-transcendentality of capital), which at best has to guarantee a balanced organization of the distribution of money capital flows. It is Deleuze/Guattari’s concern to show that the second model considers the (apersonal) “general” of the first model – a thoroughly apersonal unit that nevertheless remains bound either to the object or to the subject, and that is necessary in order to manage n entities in unison – indispensable in view of the modes of operation of capitalist competition, when it comes to keeping capital accumulation cyclically stable or in a state of equilibrium.

Let us now turn to the third model. For Deleuze/Guattari, the multiple or multiplicity must be produced continuously, not by constantly adding higher dimensions to the previous models, but conversely by avoiding unity in the simplest way, i.e. by always writing n-1. In a certain way, the rhizome is not; it is at most a multifaceted movement excluding the central unity, performing a subtraction that eliminates from each movement whatever wants to present itself as unity or determinant unity. Deleuze/Guattari call such a construct a rhizome, characterized by six basic principles or properties: a) n-dimensional connectivity, b) heterogeneity, c) multiplicity, d) asignifying rupture or non-linearity, e) cartography, and f) decalcomania.

One could now imagine whole clusters of markets populated by multiple operators and vectors that engage at once in hedging, arbitrage and speculation, with the help of different assets and their different classes of exchange and the requirements given by the economic properties/objects of each asset, finally creating the market continuously as a mobile horizon of heterogeneous, quantifying regimes of signs, diagrams and technologies. Deleuze/Guattari call these processes economic war machines. The volatility of the price movements of an asset has to be regulated to a certain degree by differentiating concepts (innovation of a derivative or a derivative market), techniques (mathematical formalisation or standardisation) and operations (strategies such as dynamic hedging), for which procedures of constant calibration and recalibration are required, since calculated forecasts can be divided into new forecasts (derivatives are written not only on underlying assets but also on derivatives). Future events need to be identified and quantified, their probabilities evaluated, in order to allocate them appropriate prices and apply them to future investment decisions. Today, this is done primarily by means of quantitative (mathematical) operations in which stochastic series are used.
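The calibration-recalibration loop described above can be illustrated with a minimal sketch: a standard Black-Scholes call price, inverted by bisection to recover the volatility a quoted price currently implies. All numerical values here are illustrative assumptions, not market data, and the model is the textbook formula rather than anything the passage prescribes.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call on a non-dividend-paying asset
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(market_price, S, K, T, r, lo=1e-4, hi=3.0, tol=1e-8):
    # Recalibration step: invert the pricing formula by bisection,
    # recovering the volatility that the observed price implies.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < market_price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Illustrative parameters: spot, strike, maturity (years), risk-free rate
S, K, T, r = 100.0, 100.0, 1.0, 0.01
quoted = bs_call(S, K, T, r, 0.2)        # pretend this is the observed market price
sigma_implied = implied_vol(quoted, S, K, T, r)
```

In practice this inversion is repeated continuously across strikes and maturities, which is one concrete sense of the “constant calibration and recalibration” invoked above.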

In this context, Randy Martin speaks of a newly emerging possibility of introducing derivative sociality on a planetary level, or of a phantasmatic break with linear time. He writes: “While derivatives are conceived in a language of futures and forwards, the present anticipation of what is to come, […] the act of bundling attributes suggests an orientation in all directions, which is an effect of mutual commensurability”. However, we have to note that the deterritorialized or virtualized money capital flows, or derivatives themselves, have long since been processing rhizomatic properties, i.e. molecular-dynamic, volatile processes, which are able to link even the metrised payment flows with more fungible, exact and topological flows of finance money. What seems essential here is that these molecular compositions of the money capital flows today dominate the metrised segments of the economy and its payment flows – segments which still involve important macroeconomic parameters such as interest rates, supply/demand ratios, etc., though these no longer function as they did in the past. The latter refers to the extensive determination of value, or cardinal value. In economic reports and balances, lines and segments are recorded in specific metrics, numerical registers or measures, based on given values or stock sizes. These lines are stratified; they are the striated metrics of money capital flows or the dominant metrics of cardinal value, e.g. empirical-statistical data such as wages, net savings, net profits, interest rates, capital claims, investments, consumption, etc.: this is the classical approach of economics to representing values – cardinal values. The power centres of the economy, as they are condensed in the central banks, for example, are constantly trying to mediate the molar and molecular flows. Even quantitative finance has a mediating function here.
It is closely linked to computer science and big data technologies, and makes use of research on artificial intelligence and evolutionary algorithms, which are used in game-theoretical simulations and serve to optimize computer hardware and communication networks. The molar is defined by macroeconomics (rigid, veridical, Euclidean, tree-like, etc.). Microeconomics, on the other hand, is characterized by properties such as fungibility, horizontality, topology, rhizomatics, etc.

There is a double reciprocal dependence between the two areas: Whenever a fixed line is identified, one may perceive that it continues in a different form, e.g. as a quantum flow. In each instance, a power apparatus can be located at the border between the molar and the molecular, which is by no means characterized by the absolute exercise of power, but by relative adaptations and conversions that entail certain effects in the money capital flows.
With the concept of the rhizome, Deleuze/Guattari begin to sketch more precisely their concept of ordinal value, which immediately raises the question of the economic quantification, valuation and movement of differential prices in smooth spaces that elude metrisation. Any stratification, representation and metrisation, whether enforced by price or any other economic parameter, must today refer to the non-quantitative differentiation of a rhizomatic financial economy. It could even be that, according to Deleuze/Guattari at least, in an economy of molecular war machines even money is no longer needed, at least in its functions as measure and means of circulation, through which it metricizes and moves valuations.

In both spaces discussed here there are points, lines and surfaces. In striated space the lines are usually subordinated to the points, i.e. lines only establish the connections between the points; in smooth space we find the opposite principle, i.e. the lines pass through the points. In the smooth space of capital, the line inevitably transforms into a vector (a direction) – produced by localized operations and directional changes – and is thus to be understood as non-dimensional and non-metric. In smooth space, the materials refer to manifold forces or serve them as symbols, while in striated space it is always the forms that organize the matter. Bodies without organs, distances, the symptomatic and evaluative perception of smooth space stand opposed to the organism and organization, to the unit of measure and the metric dimensions of striated space.

Lines are understood as constitutive elements of things and events; the line does not mean the unity of the One, but a unity characterized by diversity (one-all). In the open system of the rhizome (a network/folding of lines), the individual capital with its manifold components continuously draws lines and is drawn by them in the context of total capital – it uses the lines for the continuous efficiency of accumulation, not only in order to produce static products, but to achieve the optimization of the lines themselves in the context of its capitalization.

It is no longer primarily about the production of products (number and quality, including input and output), but about the creation of profitable lines/waves/vectors that run non-linearly and spirally – infinite and a-linear lines that flee in all directions. This per se virtualizing aspect of capital is supplemented, updated and thus always restricted by economic mathematics and its quantifying mode. It should be noted that the cult around “the” dialectic as Ariadne’s thread for mastering the labyrinth of capital has basically always attempted to think the algebraic only as a derivation of the linguistic-conceptual. Although this is a certain necessity of theoretical access, the polarity of the opposition has been shifted in favour of the linguistic logos. The differentiation, temporalization, modulation and smoothing of the measure (of money) provided by the deterritorialized money capital flows correlates with the mathematics of the economy (modularization), or the complex logic of the algorithm that performs the measurements today. In derivatives markets, non-quantifiable rhythmization plays an increasingly important role, although it must be incorporated and updated by the algorithms described elsewhere, which perform temporal-rhythmic, derivative “measurements” (volatility) in discrete steps. The first type of rhythmization serves purely to valorize money capital, which permanently draws speculative lines that document less the value of the goods or the companies than the optimization that money capital strives for.
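What such a discrete, temporal-rhythmic “measurement” of volatility amounts to can be sketched in a few lines: annualised realized volatility computed over a moving window of log returns. The price series and window length below are illustrative assumptions, and the rolling standard deviation is only the simplest of the volatility estimators such algorithms employ.

```python
from math import log, sqrt

def realized_vol(prices, window, periods_per_year=252):
    """Rolling realized volatility from discrete price observations:
    the annualised sample standard deviation of log returns
    over a moving window, recomputed step by step."""
    rets = [log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    vols = []
    for i in range(window, len(rets) + 1):
        chunk = rets[i - window:i]
        mean = sum(chunk) / window
        var = sum((r - mean) ** 2 for r in chunk) / (window - 1)
        vols.append(sqrt(var) * sqrt(periods_per_year))
    return vols

# Illustrative daily closes (assumptions, not data)
prices = [100, 101, 99.5, 100.2, 101.1, 100.4, 99.9, 100.8]
vols = realized_vol(prices, window=5)
```

Each element of `vols` is one discrete “measurement” in the sense above: the continuous movement of prices is captured only as a stepwise, windowed rhythm.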

In order to get direct access to price formation, which increasingly gets by without the “latency period” of human action speed, it was first necessary to make the electronic markets the paradigm of price formation. An important moment in this development was the introduction of Island, a kind of open-source platform that allowed buy and sell orders to be placed away from the well-known stock exchanges and their market makers. It was launched in 1996 and paved the way for automated trading. This innovation not only led to radical changes in stock exchanges (such as the NYSE or NASDAQ), but also to a reduction in the human labour force (with a few exceptions) that had dominated stock exchange trading up to that point: traders relying on affective and intellectual interaction, which had determined pricing on the trading floor, had become redundant. Trading the future in increasingly microscopic contemporary moments, taking into account gigantic amounts of historical data and the automation of order flow, promised markets with high liquidity and fairness in pricing. As Haim Bodek points out, the regulatory authorities were also impressed, as these systems led to a narrowing of the spread – the difference between buying and selling prices on which market makers lived – and thus to better order execution for all market participants, including relatively inexperienced private investors with small budgets.
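The electronic matching that replaced the trading floor can be reduced to a toy sketch: a limit order book in which the spread is simply the gap between best bid and best ask, and a competing quote narrows it. This is a deliberately minimal model (price priority only, no time priority, no fees); all prices and quantities are illustrative, not taken from any real venue.

```python
import heapq

class OrderBook:
    """Toy limit order book: price priority only, greedy quantity matching."""
    def __init__(self):
        self.bids = []      # max-heap of resting buy orders (price negated)
        self.asks = []      # min-heap of resting sell orders
        self.trades = []    # executed (price, quantity) pairs

    def limit_buy(self, price, qty):
        heapq.heappush(self.bids, (-price, qty))
        self._match()

    def limit_sell(self, price, qty):
        heapq.heappush(self.asks, (price, qty))
        self._match()

    def spread(self):
        # Gap between best ask and best bid, if both sides are quoted
        if not self.bids or not self.asks:
            return None
        return self.asks[0][0] - (-self.bids[0][0])

    def _match(self):
        # Cross the book while the best bid meets or exceeds the best ask
        while self.bids and self.asks and -self.bids[0][0] >= self.asks[0][0]:
            neg_bp, bq = heapq.heappop(self.bids)
            ap, aq = heapq.heappop(self.asks)
            qty = min(bq, aq)
            self.trades.append((ap, qty))   # execute at the resting ask price
            if bq > qty:
                heapq.heappush(self.bids, (neg_bp, bq - qty))
            if aq > qty:
                heapq.heappush(self.asks, (ap, aq - qty))

book = OrderBook()
book.limit_buy(99.0, 10)
book.limit_sell(101.0, 10)
wide = book.spread()          # 2.0: the gap market makers once lived on
book.limit_sell(100.5, 5)     # a competing electronic quote narrows the spread
narrow = book.spread()        # 1.5
book.limit_buy(100.5, 5)      # crosses the book: a trade executes at 100.5
```

The narrowing from `wide` to `narrow` is, in miniature, the spread compression that Bodek describes impressing the regulators.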

Gerald Raunig speaks of a purely speculative or virtual line, of an abstract individual line that crosses things and relations, of the social factory of the new precarious companies, of deterritorialized mortgages and loans, etc., which are bundled and reassembled as risks qua derivatives and transformed into a singular cash flow and a single risk in order to break this up again into tranches (CDO) with the aim of generating further profits. In this context, he quotes Randy Martin, whose argumentation here again refers to the dominance of financial capital over “real capital”, which is inseparably linked to it. Martin writes in his book Knowledge Ltd: “While the mass assembly line gathered all its inputs in one place to produce a tightly integrated commodity that was more than the sum of its parts, financial engineering reversed this process by decomposing a commodity into its constituent and variable elements and dispersing these attributes.

“The aim is to bundle them together with the elements of other goods that are interesting for a globally oriented market for risk-controlled exchange. All these moving parts are reassembled with their risk attribute so that they become worth more as derivatives than their individual commodities.” The derivative is much more than just a written contract regulating the exchange of an object or commodity at a future point in time and at a certain price; it is a financial instrument that, contrary to the notion of the separation of financial and real economy, and in a manner complementary to money (it must always be realized in money), enables a kind of measurement qua comparison of future money capital flows. It thereby enables the regulation and coupling of the various economic areas, individual capitals and capital fractions, making them commensurable with one another precisely via certain differentiation processes, whereby the derivatives are each already realised in money and are themselves to be understood as a form of money capital. Randy Martin writes about the differentiation processes of derivatives: “While commodities appear to us as a unit of wealth that can abstract parts into a whole, derivatives are still a more complex process in which parts are no longer uniform, but are constantly disassembled and collected again when different attributes are bundled and their value exceeds the whole economy under which they were once summed. Shifts in size from the concrete to the abstract or from the local to the global are no longer external standards of equivalence, but lie within the circulation of the bundled attributes that multiply derivative transactions and set them in motion.”

In the diagram of a synthetic asset, the discrete elements, which are nothing more than the economic properties of the asset (cash flow, maturity, price, risk, volatility, etc.), are related to each other. One can leave it at that and say that the relations exist prior to the relata, which however do not dissolve; or one can go one step further with Nietzsche and say that the properties of a thing are only effects on other things, and if one then takes away the term “other thing”, then a thing has no properties at all anymore, so that there is definitely no thing without other things. Thinginess dissolves into the flux of differential events. (If one, acknowledging the dominance of the relations over the relata, does not completely dissolve the latter into events and leaves them in a relative independence, then this still remains within the relation “relation and relatum”. If, however, with Heidegger, the objective particularity of the relation could be thought of as dependent on a real, on a non-objective transcendence that separates each relation into two formally distinguished sides, one side being assigned to the uncovering and concealment in the direction of the being of beings, and one side being assigned to the object that is indifferent to the other side. Here Heidegger poses the question of irreversibility as such, which in the end does not deny itself. The Kantian thing in itself is an essential subtraction, a tautological subtraction in itself as the essence of being. The nothing itself nothings.)

What is to be understood by a “quantum flow”? It is a deterritorialized stream of financial money that knows no segments and no stratified lines, but singularities and quanta. Here, the poles of the respective flows are regarded as condensations where money is created and destroyed; singularities mean the nominal liquid assets, and quanta stand for processes such as inflation, deflation and stagflation. Derivatives can be used to observe certain distributions and the corresponding rhythms of money capital flows into other forms. For Deleuze/Guattari, each quantum flow is “deeper” than the money capital flows whose metrics refer to elements of cardinal value: the quantum flows are mutant, convulsive, creative, circulatory and material flows that are bound to desire and always lie lower than the solid lines and their segments that determine, for example, interest rates and the relationship between supply and demand. For Deleuze/Guattari there are thus two economic ways of thinking: a) that of economic accounting (segments and solid lines of a molar organization that represent determinative metrics of cardinal value), and b) that of financial money, which today is called the flow of finance or financial flows. In his book “Algorhythmus” (Algorhythm), Miyazaki has countered the metaphor of flow by bringing into play the concept of rhythm, which is used to describe digital networks and the microstructure of the financial system. It is more precise because it displays a discrete, catastrophic flow, which is to be located as an algorhythm between the discrete-impulsive and the continuous-flowing. By means of algebraic logic, control can be described as a chain of circuits which serve to produce equilibria of flow, “whereby the expression ‘flow’ only conceals the fact that at any moment it is only a matter of more or less large distances of binary states, up to the limit value of their collapse” (Bahr).

The problematization of quantum physics promises to provide clarification at this point. It shook the foundations of classical physics through Max Planck’s groundbreaking insight that radiation such as light, which until then was regarded as a continuous phenomenon, can under certain conditions be quantized, i.e. have a discontinuous character. According to the classical view, light behaves like a wave, but since Einstein’s discovery that light can behave as a particle under certain circumstances, the classical position in physics has had to acknowledge an uncertainty: the wave-like and particle-like states of light cannot be observed at the same time. The visualization or representation of the quantum phenomenon in a single image is not possible.

The concept of complementarity, invented by Niels Bohr, establishes a relationship between quantum mechanics and philosophy in a way similar to the later Derrida (persistent opposites at the same time). If the main question is whether an object can assume the opposite properties of wave and particle at the same time, later experiments show that wave-like and particle-like are not even properties or attributes that can be attributed to quantum objects, because these designations are based on classical concepts that try to describe radically new phenomena. New designations even raised the question of whether quantum objects can be thought of as objects at all. This suddenly confronted physics with the indeterminable, which can only be observed in its effects. The unrecognizable objects of quantum phenomena were defined as “efficacities”, i.e. they are only accessible to us through their effects, which can only be observed and understood with the concepts of classical physics. Classical physics and its principles of causality require the construction of a model with which the interaction between natural objects and natural phenomena can be observed, measured, explained and verified. Such a model is not possible in quantum mechanics, since only the interactions between the effects of the efficacities and the measuring instruments can be described. As Bohr says, all this requires a departure not only from the classical principle of causality and its visualization, but in general from the classical attitude towards the problem of physical reality. But if the unknowable is recognizable only by its effects – through the concepts of classical physics – then such a situation requires a revision of what constitutes reality, and thus it is hardly possible to follow the model of a model, i.e. the concept of the model of classical physics.
In contrast to classical, causal and deterministic ways of constructing models according to models, the quantum phenomena proved that such ways of modelling do not work, because in quantum mechanics what was previously considered guaranteed was in general at stake. From this perspective, quantum theory can be seen as a crisis of representation, of its models and of mimesis, the Platonic model of a model with which science in general was able to represent certain phenomena as reasonable. With the rise of quantum theory, the visibility of such phenomena and thus their representation was at stake. In this sense, Bohr was not a Hegelian; his concept of complementarity not only fundamentally challenged the Hegelian concept of synthesis, but contained the critique of a metaphysics of presence. Bohr could perhaps be assigned to Derrida’s deconstruction, which considers synthesis possible only if one affirms the metaphysics of presence. For Bohr, an abyss really opened up at this point. And Bohr was perhaps the Bataille of physics, a non-Hegelian Hegel who interrupted every kind of synthesis between opposites in order to continue them endlessly in their disintegration.

Money capital flows and their rhythms cannot be indexed at the level of virtualization, nor can they be measured by the common metrics of operational accounting, and they are not regulated by new metrics and segments, rather they operate between and through the poles in order to constantly create new singularities and quanta – they determine not least the molar determinants of cardinal values. Molecular money flows cannot therefore be represented, they resist metrics and can even show lines of flight (from macroeconomic indicators) that are characterized by deterritorialization, destruction, and transformation of classical economic thought, belief, and desire, but in a sense they always come too late, insofar as macroeconomic indicators have already reterritorialized molecular movements again.

Liquidity often appears to be the most important property of an asset. This aspect can be described as transaction liquidity, i.e. an asset has liquidity if it can be exchanged for money (the object has liquidity). But liquidity can also be used to characterize those capital markets that are populated by differential varieties of assets. Here one no longer speaks of the liquidity of the assets, but of market liquidity, in order to attribute the liquidity within a simulative space to the exchange, in which the market participants can liquidate their positions quickly enough without having to accept excessive price reductions on the assets involved (one can here assume a shift of liquidity from a purely objective property to a property of the simulative space with objective consequences). Of course, liquidity is also referred to as a property that is entirely related to the debtor and is then called funding liquidity. This involves the debtor’s creditworthiness and his ability to borrow assets at an acceptable interest rate, so as not to suffer the conversion of illiquidity into bankruptcy.

The fundamental prerequisite for developed financial markets is so-called secondary trading, which is based on trust in highly liquid money and capital markets. The pricing process, from primitive collateral to individual financial innovation (derivatives), requires “continuous” financial values; and “continuous” pricing depends on the availability of funding liquidity. The smooth functioning of the financial system is based on the notion that the option to trade can be exercised even under constantly tested conditions. But this is also one of the problems for current financial capital, because although modern finance improves the conditions for capital realisation, it remains heavily dependent on market liquidity. When this evaporates, the whole setting quickly becomes fragile. In other words, the demand for higher discipline in capitalist power relations makes the economic milieu more fragile and vulnerable. Liquidity must be regarded as endogenous to the financial system. In times of danger, the assessment of risk opportunities and asset prices tends to deteriorate; market participants pull their credit lines and/or raise margin requirements to protect themselves against the risks posed by counterparties, and liquidity disappears when it is most needed, so that eventually the whole pricing process can collapse. This would be a description that could be called Marx’s financial instability hypothesis.
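The feedback loop just described (falling prices, tightening margins, forced sales, further price falls) can be caricatured in a toy simulation. Every parameter below (shock size, margin sensitivity, price impact) is an assumption chosen only to make the spiral visible, not an empirical estimate.

```python
def liquidity_spiral(price=100.0, position=1000.0, margin=0.1,
                     shock=0.05, sensitivity=0.5, impact=1e-5, steps=10):
    """Toy feedback loop: a price shock raises margin requirements,
    forcing sales whose price impact deepens the fall."""
    price *= (1 - shock)                        # initial exogenous shock
    path = [price]
    for _ in range(steps):
        margin *= (1 + sensitivity * shock)     # risk managers tighten margins
        forced_sale = position * margin         # units that must be liquidated
        position -= forced_sale
        price *= (1 - impact * forced_sale)     # price impact of the forced selling
        path.append(price)
    return path

path = liquidity_spiral()
```

The point of the sketch is only that each round of defensive tightening produces the very price decline it was meant to protect against, which is the endogeneity of liquidity claimed above.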

Economic objects never stand for processes such as inflation, deflation, etc.; rather, it is the relations that indicate the price spreads between the objects and the imago of value (money), relations that in turn refer to processes of inflation or deflation. Quanta, too, are not objects, but rather stochastic processes through which economic objects attain the status of objectivity. In this context, the phenomenon of inflation does not seem to differ much from that of weather; it is a haecceity, it articulates the differential price movement and/or the stochastics of price movements, which in turn condense in objects without ever being reducible to such objects. We can therefore say that quantum flows and their rhythms are often stochastic processes whose dynamics are formed around singularities, in order to refract themselves into lines and segments, into metrics of representation or cardinal value. In contrast, the theory of ordinal value insists that money capital flows emerge from the double reciprocal determination of molecular and molar machines, each with its own modalities and two unequal reference systems, although the two circulation circuits, which are only analytically separable, remain materially coupled and always flow only as one stream. The sedentary distribution deals with spaces (markets) and objects in space (assets), with territories (characteristics and their relations) and their zones and regions. And like all forms of distribution, it processes with the help of points and paths. The nomadic distribution also deals with these parameters; it can even follow ordinary paths, from one point to another, but the points remain subordinate to the paths they determine. In the nomadic mode of distribution, a point is reached only to leave it behind, so that each point represents a kind of relay and exists only as a relay.
Thus the in-between (between the points), the intermezzo, acquires its own consistency, even a new autonomy and dominance, which in the field of financial theory consists in the continuous recalibration of assets. A becoming whose characteristic feature lies between two points indicates, in practice, the abstract principle of the Cantor set. The Cantor set comprises an infinite dust of points whose continuous becoming creates a continuous space between the points; the principle of its activity consists in the endless repetition of division, so that the measure of the set tends continuously towards zero even though it is always enclosed in a finite space, remaining at once infinitely diverse and economical. Continuous recalibration is a practical method of endless deterritorialization.
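The Cantor construction mentioned above can be computed directly: repeatedly remove the open middle third of every remaining interval. A short sketch showing that the number of pieces grows without bound while the total length shrinks towards zero, all inside the finite span [0, 1]:

```python
def cantor_intervals(depth):
    """Iteratively remove the open middle third of each interval in [0, 1].

    After each division the total length shrinks by a factor of 2/3,
    while the number of intervals doubles; everything remains enclosed
    in the finite unit span, an 'infinite dust of points' in the limit.
    """
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        nxt = []
        for a, b in intervals:
            third = (b - a) / 3.0
            nxt.append((a, a + third))   # keep the left third
            nxt.append((b - third, b))   # keep the right third
        intervals = nxt
    return intervals

for d in (0, 1, 5):
    ivs = cantor_intervals(d)
    total = sum(b - a for a, b in ivs)
    print(d, len(ivs), round(total, 4))
```

At depth d there are 2**d intervals of total length (2/3)**d: infinitely diverse and economical at once, exactly the division-without-end that the text invokes.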

With the euphoric use of the terms microstructure and rhizome, a post-historical, post-romantic, post-genealogical fantasy of the network society was set in motion, in which the various sub-fields could allegedly co-evolve without any causal relations having to hold between them; instead, the respective fields were said to be interwoven like a colorful patchwork. Today, talk of the ecological system or the network pervades almost every sophisticated social analysis. Thus, in his paper “Deciphering the Riddle of Capital”, David Harvey lists, alongside the parameter of capital accumulation, factors such as class relations, institutional structures, production processes, relations to nature, reproduction, everyday life, demographic development, and intellectual conceptions, which taken together are said to form a network, an open complex whole. According to Harvey, the point is to show how these fields influence one another, how they organize and structure their relations, and which tensions, contradictions, evolutions, and inventive processes can result from them, processes that are by no means causally determined but merely contingently mediated. According to Harvey, exactly this theoretical structure was developed by none other than Marx himself, in Chapter 13 of Capital, Volume 1, in discussion with Darwin’s theory of evolution. Allegedly Marx recognized no primacy of any instance, nor did he know the concept of determination. Harvey thereby also demonstrates the strategic influence of the network metaphor, which today reaches far beyond the mere description of a data-organized infrastructure or a relational ecology. After all, today everything is a network, and the best answer to networks is ever more networks; indeed, there is an almost paranoid intellectual atmosphere in which everything is a network.

In this context, even the term “rhizome” had to serve to describe the fashion, in art, architecture, computer science, neurobiology, economics, and so on, for a network model reduced to the notion of virtuality (with the terms virtuality and digitality used as if congruent), a fashion that followed the almost paranoid premise that everything is networked. And indeed, many of today’s large corporations are network companies. While Google monetizes its network designs through clustering algorithms, Facebook overwrites subjectivity and social interaction along the lines of channeled, discrete network services. In military theory, the most effective response to terrorism is network building; in operaismo of the negative variety, the best response to Empire is the multitude; in ecology, networks are the most effective response to the systemic colonization of nature. In computer science, distributed architectures are the best answer to connectivity bottlenecks. In economics, heterogeneous economic latitude is the best answer to the distributed character of the long tail. Ecological or systems-oriented thinking has attained an unprecedented popularity, as a kind of solution to the problem of diachrony, in which space and landscape take the place of time and history. The postmodern “spatial turn” goes hand in hand with a devaluation of the temporal moment: think of Riemann’s complex surfaces, of the path from phenomenology to the theory of assemblages, from the time-image of cinema to the data-based image on the Internet. Finally, the old mantra of historicization has been replaced by the mantra of constant connectedness and networking. (Sloterdijk dates the phase of network compression to the period from 1492 to 1974, and thereby misjudges the very essence of the so-called digital revolution.)

In the age of clocks, the universe was still thought of as a mechanism in which the heavens rotated to the music of the spheres. In the age of the steam engine, the world mutated into an engine of indescribable thermodynamic forces. And with fully developed industrialization, the body was transformed into a factory, enriched with the seductive metaphors of technology and infrastructure. Today, in the age of networks, a new template (paradigm) inscribes itself into everything that presents itself as presence; more than that, the idea that everything is a network opens a new tautology of presence.

translated by: DeepL.

Photo: Bernhard Weber
