Deleuze/Guattari ask us to apply at least three models of thought to the theoretical analysis of economics by tracing the image of the flow that each model articulates. Only the third, the molecular-performative model, properly stages the force-of-thought. (Laruelle contrasts the term force-of-thought with thought: thoughts or thinking are “false” representations that are developed and acted upon in the academic fields of concepts, whereas the force-of-thought is the most important experience of thought, which follows the transcendental axiom of the real.) In a kind of philosophical self-assurance, the first model of thought, related to philosophical classicism, permanently asks how the tree model can succeed in linearly distributing chaotic currents under certain circumstances through unidirectional processes. A centralised tree trunk divides into two; such binarity performs the dualistic distribution of flows. To put it another way, from the centre of power the branches radiate out by means of a unified, synchronised set of homogenised binary processes. Digitisation, the division of one into two, means that two terms, whatever they may be, are set in relation to each other: with regard to information, the division into two forms; with regard to language, the division into representation and represented; with regard to thinking, the division between the thinker and what is thought. The digital thus contains the potential to separate things and objects and to make ever further distinctions between them. With Alexander Galloway, we conceive of the digital not as the distinction between zero and one, but as the more fundamental distinction/division of one into two that makes it possible to generate ever further distinctions. And this kind of distribution affects even those trees that are not strictly vertical in their organisation.
In terms of the analysis of the economy, this means that it is necessary to show how the tree system succeeds in linking various ‘unit dipoles’ (institutions, agents, assets, etc.), which all expand in a linear direction, in such a way that the Euclidean, established routes that are always already prescribed from a centre (link dipole) are followed. Deleuze/Guattari are not exclusively concerned here with examining how planned economies function; analyses of institutions such as the central banks of our contemporary capitalist economies can equally serve as material for examination. The American Fed, for example, combines “open market operations” – an open market policy oriented towards the secondary financial markets and their forms – with sovereign decisions (regarding short-term interest rates) to create root- or tree-like modes of distributing monetary capital flows. Such actions by central banks can certainly change the money supply and the velocity of money circulation, and by influencing prices and liquidity on the financial markets they ultimately also manipulate the prices of classical commodity flows; today, however, their influence is weaker than it was thirty years ago (there is neither a fixed relationship between the monetary base and the money supply nor, via the money-creation multiplier, between the money supply and inflation). The central banks, as non-state institutions, are at the same time authorised by the state to act as a ‘link-dipole’ on the money and capital markets in order to intervene in the remaining ‘unit-dipoles’ of the economy (goods, equity and consumer-goods markets). They try to implement so-called target interest rates, which give them some ability actually to regulate the demand for money (through the buying and selling of Treasuries), and they thus continuously signal to the markets as part of a hierarchical distribution of information.
Indeed, central banks try everything to control and regulate the relations between the molar and the molecular, but they remain ultimately determined by what escapes them – by their impotence, which prevents them from comprehensively regulating the flows of monetary capital and which thus repeatedly exposes their supposedly immutable zones of power and influence to the danger of crisis. The setting of growth and money-supply targets – intermediary targets such as interest and discount rates, the determination of bank reserves, etc. – has lost much of its impact in recent decades and is now more intensively influenced than ever by financial capital and its machines, which in the course of their ultra-fast chains of transactions modulate credit movements, interest rates, currency fluctuations and prices themselves. Circumstances such as the expansion of credit, permanent interest-rate fluctuations and the differential movement of derivative prices are largely beyond the grasp of central banks. On the contrary, central banks now see themselves forced to pursue open market policy more offensively than ever before and to trade on the money markets; they themselves mutate into investors, if not speculators, with the consequence that the determination of the monetary quantities, which as control variables are supposed to show a stable relation to price-formation processes, is shifted further and further by their own activities.
According to Deleuze/Guattari, the second model of thought is characterised by a system of small roots, as applied in particular in modernity. Deleuze/Guattari write: “The main root has atrophied (here), its end has died, and already a multiplicity of secondary roots begins to proliferate wildly.” For Deleuze/Guattari, however, this model is not essentially different from the tree model, for the system of small roots must clearly miss the idea of multiplicity because of a series of social restrictions; it does take into account the multiplicity of phenomena in the external world, but it nevertheless seems possible for only one identical subject to coordinate them. Thus, the principal duality of thought is not eliminated at all; rather, it proliferates even more in the perpetual division of the object, while at the same time the subject is invented as a new type of totalising unity.
Even with regard to economic analyses, one finds in this model the secret agreement with the higher unity, and this beyond a tendency that leads to molecular multiplicity: contingencies such as those inherent in the supply-demand relationship, decentralised causality and the stochasticity of price movements may indeed be considered examples of multiplicity, but these contingencies are limited; they are, after all, intended to support so-called “free market capitalism” as the model of a determinant, hierarchically organised capital and its state (the ideal total capitalist), by recognising without further ado the power of the higher entity (the quasi-transcendentality of capital), which has to guarantee an at best equally weighted organisation of the distribution of money-capital flows. Deleuze/Guattari’s point is to show that the second model considers the (apersonal) “general” of the first model – a thoroughly apersonal entity that nevertheless remains bound either to the object or to the subject, and which is necessary for n entities to perform their management in unison – to be indispensable, given the workings of capitalist competition, when it comes to keeping capital accumulation cyclically stable or in a state of equilibrium.
Let us now turn to the third model. For Deleuze/Guattari, the multiple or multiplicity must be constantly produced, not by constantly adding higher dimensions to the previous models, but conversely by avoiding unity in the simplest way, that is, by always writing n-1. In a certain sense, the rhizome is not; it is at best manifold movement to the exclusion of the central One, performing a subtraction that eliminates from every movement whatever wants to present itself as the One or as determinant unity. Such a construct is what Deleuze/Guattari call a rhizome, characterised by six basic principles or properties: a) n-dimensional connectivity, b) heterogeneity, c) multiplicity, d) asignifying rupture or non-linearity, e) cartography, and f) decalcomania.
One could now imagine whole clusters of markets populated by multiple operators and vectors that at once engage in hedging, arbitrage and speculation, using different assets and their different classes of exchange and the requirements given by the economic properties/objects of the assets in each case, finally to create the market continuously as a mobile horizon of heterogeneous, quantifying regimes of signs, diagrams and technologies. These processes are what Deleuze/Guattari call economic war machines. The volatility of an asset’s price movements has to be regulated to a certain extent through the differentiation of concepts (the innovation of a derivative or a derivative market), techniques (mathematical formalisation or standardisation) and operations (strategies such as dynamic hedging), for which one needs procedures of constant calibration and recalibration, since calculated forecasts can themselves be divided into new forecasts (derivatives are written not only on underlying assets but also on derivatives). Future events need to be identified and quantified, their probabilities evaluated, in order to assign them corresponding prices and apply them to upcoming investment decisions. Today, this is done primarily by means of quantitative (mathematical) operations in which, for example, stochastic series are applied.
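The loop of calibration and recalibration described above can be sketched in miniature. The following Python fragment is a toy illustration only (all parameter values are invented, and real desks use far richer models): it computes the Black-Scholes delta of a European call and recomputes the hedge ratio as spot price and implied volatility drift from tick to tick.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call_delta(spot, strike, maturity, rate, sigma):
    """Black-Scholes delta N(d1) of a European call option."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * maturity) \
         / (sigma * math.sqrt(maturity))
    return norm_cdf(d1)

# Dynamic hedging in miniature: each tick brings a new spot price and a
# newly calibrated implied volatility, so the hedge ratio is recomputed.
ticks = [(100.0, 0.20), (103.0, 0.22), (98.0, 0.25)]  # (spot, implied vol)
hedge_ratios = [bs_call_delta(s, 100.0, 0.5, 0.01, v) for s, v in ticks]
```

The point of the sketch is the structure, not the numbers: the "forecast" (the hedge ratio) is never settled once and for all but is re-derived at every step from freshly calibrated inputs.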
In this context, Randy Martin talks about derivatives as an emerging possibility of introducing derivative sociality on a planetary scale, or the phantasmatic break with linear time. He writes: “While derivatives are conceived in a language of futures and forwards, the present anticipation of what is future, […] the act of bundling attributes speaks to an orientation toward all sides that is an effect of mutual commensurability.” We must note, however, that the deterritorialised or virtualised money-capital flows or derivatives have themselves long since processed rhizomatic properties, i.e. molecular-dynamic, volatile processes capable of linking even the metricised payment flows with more fungible, anexact and topological flows of finance money. What seems essential here is that these molecular compositions of money-capital flows today dominate the metricised segments of the economy and its payment flows, which nevertheless still set, or at least influence, important macro-economic parameters such as interest rates and the ratio of supply and demand (the latter refers to the extensive determination of value, or cardinal value). In economic accounts and balance sheets, lines and segments are recorded in certain metrics, in numerical registers or measures, starting from given values or stocks. These lines are stratified; at stake are the notched metrics of monetary capital flows, the dominant metrics of cardinal value: empirical-statistical data such as wages, net savings, net profits, interest rates, capital claims, investments, consumption, etc. This is the classical approach of economics to representing values – cardinal values. The power centres of economics, as they are concentrated in the central banks, for example, permanently try to mediate the molar and the molecular flows. Even quantitative finance takes on a mediating function here.
It is closely linked to computer science and big-data technologies and draws on research in artificial intelligence and evolutionary algorithms, which are used in game-theoretical simulations and serve to optimise computer hardware and communication networks. The molar is defined by macroeconomics (rigid, veridical, Euclidean, arboreal, etc.); microeconomics, in contrast, is characterised by properties such as fungibility, horizontality, topology, rhizomaticity, etc.
There is a double reciprocal dependence to be noted between the two domains: whenever a fixed line is identified, it may be perceived to proliferate in another form, for example as a quantum flow. In each instance, an apparatus of power can be located at the border between the molar and the molecular, characterised by no means by the absolute exercise of power, but by relative adaptations and conversions that entail certain effects in the money-capital flows.
With the concept of the rhizome, Deleuze/Guattari begin to sketch out their concept of ordinal value in more detail, which immediately raises the question of the economic quantification, valuation and movement of differential prices in smooth spaces that defy metrication. Any stratification, representation and metrisation, whether enforced by price or any other economic parameter, must now refer to the non-quantitative differentiation of a rhizomatic, financial economy. It could even be, at least according to Deleuze/Guattari, that in an economy of molecular war machines, even money, at least in its functions as measure and means of circulation through which it metricates and moves valuations, is no longer needed.
In both spaces discussed here, there are points, lines and surfaces. In notched space, lines are mostly subordinate to points, i.e. lines only establish the connections between points; in smooth space we find the opposite principle, i.e. lines pass through points. In the smooth space of capital, the line inevitably transforms into a vector (a direction) – produced by localised operations and changes of direction, and thus to be grasped as non-dimensional and non-metric. The materials in smooth space refer to or serve as symbols of multiple forces, whereas in notched space it is always the forms that organise matter. Organless bodies, distances, and the symptomatic, estimative perception of smooth space are juxtaposed with the organism and organisation, the units of measurement and property of notched space.
Lines are understood as constitutive elements of things and events; the line does not mean the unity of the one, but a unity characterised by diversity (one-all). Individual capital with its multiple components incessantly draws lines in the open system of the rhizome (a mesh/fold of lines) and is drawn by them in the context of total capital; it takes up the lines for the continuous effectuation of accumulation, not at all only to produce static products, but to achieve for itself the optimisation of the lines in the context of its capitalisation.
It is no longer primarily about the production of products (number and quality, including inputs and outputs), but about the creation of profitable lines/waves/vectors that are non-linear and spiral – infinite and a-linear lines that flee in all directions. This per se virtualising aspect of capital is supplemented, actualised and thus always also restricted by the economic matheme and its quantifying mode. It is important to note that the cult of “the” dialectic as an Ariadne’s thread for mastering the labyrinth of capital has basically always tried to think the algebraic only as a derivative of the linguistic-conceptual. There is a certain necessity of theoretical access in this, but in the process the polarity of the opposition has been shifted in favour of the linguistic logos. The differentiation, temporalisation, modulation and smoothing of the measure (of money) performed by deterritorialised monetary capital flows correlates with the matheme of economics (modularisation), or with the complex logic of the algorithm that today performs the measurements. In the derivatives markets, non-quantifiable rhythmisation plays an increasingly important role, although it must be incorporated and updated by the algorithms described elsewhere, which perform temporal-rhythmic derivative “measurements” in discrete steps (volatility). The first type of rhythmisation serves purely the exploitation of money capital, which permanently draws speculative lines that do not so much document the value of commodities or enterprises as seek to optimise money capital.
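The point about discrete, rhythmic "measurement" of volatility can be made concrete. A minimal Python sketch (the price series is invented, purely for illustration) computes realised volatility as the square root of summed squared log returns; it also shows that the figure depends on the sampling rhythm, i.e. on how finely the discrete steps cut the flow.

```python
import math

def realised_vol(prices):
    """Realised volatility: sqrt of the sum of squared log returns
    over the discrete observation intervals."""
    rv = sum(math.log(b / a) ** 2 for a, b in zip(prices, prices[1:]))
    return math.sqrt(rv)

coarse = [100.0, 104.0]        # one coarse observation interval
fine = [100.0, 102.0, 104.0]   # the same path, sampled at a finer rhythm

vol_coarse = realised_vol(coarse)
vol_fine = realised_vol(fine)
# For this monotone path the finer rhythm yields the smaller figure:
# the "measurement" is in part an artefact of the chosen discretisation.
```

The measurement, in other words, is not a neutral reading of a continuous flow but a function of the algorithm's own rhythm of discretisation.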
In order to gain direct access to price formation, which increasingly does without the “latency” of human speed of action, it was first necessary to make electronic markets the paradigm of price formation. An important moment in this development was the introduction of the electronic trading platform Island, which made it possible to place buy and sell orders away from the familiar stock exchanges and their market makers. It was launched in 1996 and paved the way for automated trading. This innovation not only led to radical changes on the stock exchanges (such as the NYSE or NASDAQ), but also to a reduction of the human workforce (with a few exceptions) that had dominated stock-exchange trading up to that point: the traders who, through affective as well as intellectual interaction, had determined price formation on the trading floor became redundant. Trading the future in ever more microscopic present moments, taking into account gigantic amounts of historical data, and the automation of order flow promised markets with high liquidity and fairness in pricing. Regulators, as Haim Bodek points out, were also pleased, as these systems led to a reduction in the spread – the difference between buy and sell prices on which market makers lived – and thus to better order execution for all market participants, including relatively inexperienced private investors with small budgets.
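The narrowing of the spread through competing electronic limit orders can be illustrated with a toy order book in Python (a deliberately minimal sketch; real matching engines such as Island's were vastly more involved, with price-time priority, order sizes, cancellations, etc.):

```python
import heapq

class ToyBook:
    """Minimal limit order book tracking only best bid and best ask."""
    def __init__(self):
        self.bids = []  # max-heap, implemented by negating prices
        self.asks = []  # min-heap

    def limit(self, side, price):
        """Add a resting limit order on one side of the book."""
        if side == 'buy':
            heapq.heappush(self.bids, -price)
        else:
            heapq.heappush(self.asks, price)

    def spread(self):
        """Best ask minus best bid."""
        return self.asks[0] - (-self.bids[0])

book = ToyBook()
book.limit('buy', 99.0)
book.limit('sell', 101.0)
wide = book.spread()    # 2.0: only the market maker's own quotes
book.limit('buy', 99.5)
book.limit('sell', 100.5)
narrow = book.spread()  # 1.0: competing orders tighten the spread
```

The mechanism is exactly the one Bodek points to: every additional competitive quote placed inside the existing spread narrows it, eroding the margin on which floor market makers lived.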
In addition to the purely speculative or virtual line, Gerald Raunig speaks of an abstract-dividual line that crosses things and relations; he speaks of the social factory of the new precarised companies, of deterritorialised mortgages and loans, etc., which are bundled and recomposed as risks qua derivatives and transformed into a singular cash flow and a single risk, only to be broken down again into tranches (CDOs) with the aim of generating further profits. In this context, he quotes Randy Martin, whose argumentation again points to the dominance of financial capital over the “real capital” inextricably linked to it. Martin writes in his book Knowledge Ltd: “Whereas the mass-produced assembly line gathered all its inputs into one place to produce a tightly integrated commodity that was more than the sum of its parts, financial engineering reeled off this process in reverse, breaking down a commodity into its constituent and mutable elements and dispersing these attributes to bundle them together with the elements of other commodities of interest to a globally oriented market for risk-driven exchange.
All these moving parts are reassembled with their risk attribute so that they become worth more as a derivative than their individual commodities.” The derivative is far more than just a written contract that regulates the exchange of an object or commodity at a future time and price. It is a financial instrument that, contrary to notions of a separation of financial and real economies, allows for a kind of measurement qua comparison of future money-capital flows in a way that is supplementary to money (it must always be realised in money), and thus enables the regulation and coupling of the various economic areas, individual capitals and capital fractions, making them commensurable with each other precisely through certain processes of differentiation; the derivatives are already realised in money and are themselves to be understood as a form of money capital. Randy Martin writes about the processes of differentiation of derivatives: “While commodities appear to us as a unity of wealth that can abstract parts into a whole, derivatives are still a more complex process in which parts are no longer unitary but are constantly disassembled and reassembled as different attributes are bundled together and their value exceeds the whole economy under which they were once summed. Shifts in size from the concrete to the abstract or from the local to the global are no longer external measures of equivalence but internal to the circulation of bundled attributes that multiply derivative transactions and set them in motion.”
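The bundling and re-division into tranches that Raunig and Martin describe can be reduced to a simple payment waterfall. The Python sketch below is a stylised toy with invented notionals, not a model of any actual CDO: a pool's cash flow is distributed from the senior tranche downwards, so that shortfalls are absorbed first by the equity tranche and only then by the mezzanine.

```python
def waterfall(pool_cashflow, tranches):
    """Distribute a pooled cash flow over tranches, senior first.

    tranches: list of (name, notional) ordered from senior to junior.
    Returns a list of (name, amount_paid).
    """
    payouts = []
    remaining = pool_cashflow
    for name, notional in tranches:
        paid = min(remaining, notional)
        payouts.append((name, paid))
        remaining -= paid
    return payouts

tranches = [('senior', 70.0), ('mezzanine', 20.0), ('equity', 10.0)]
full = waterfall(100.0, tranches)   # every tranche is paid in full
short = waterfall(80.0, tranches)   # the junior tranches absorb the loss
```

The decomposition runs exactly "in reverse" to assembly: one heterogeneous pool of cash flows is cut into attribute-bundles (tranches) whose risk profiles, and hence prices, differ even though they all draw on the same underlying stream.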
In the diagram of a synthetic asset, the discrete elements, which are nothing more than the economic properties of the asset (cash flow, maturity, price, risk, volatility, etc.), are related to each other. One can now leave it at that and say that the relations exist before the relata, which, however, do not dissolve; or one can go a step further with Nietzsche and say that the properties of a thing are only effects on other things, and if one then takes away the term “other thing”, a thing no longer has any properties at all, so that there is definitely no thing without other things. Thingness thus dissolves entirely into the flux of differential events. (If, while recognising the dominance of relations over relata, one does not completely dissolve the relata into events and leaves them a relative independence, this still remains within the relation “relation and relata”. One could, on the other hand, think the objective particularity of the relata, with Heidegger, as dependent on a Real, on a non-objective transcendence that separates each relatum into two formally distinguished sides: one side assigned to disconcealment and concealment in the direction of the Being of beings, and one side assigned to the object that is indifferent to the other side. Here Heidegger poses the question of irreversibility as such, which in the end does not deny itself. The Kantian thing-in-itself is an essential subtraction, a tautological subtraction in itself as the essence of being. Nothing natures.)
Now what is meant by a “quantum flow”? It is a deterritorialised flow of financial money that knows no segments and no stratified lines, but only singularities and quanta. The poles of the respective flows are regarded as condensations at which money is created and destroyed; the singularities mean the nominal liquid assets, and the quanta stand for processes such as inflation, deflation and stagflation. Via derivatives, certain distributions and corresponding rhythms of monetary capital flows can be observed in other forms at the same time. For Deleuze/Guattari, each quantum flow is “deeper” than the monetary capital flows whose metrics refer to elements of cardinal value: quantum flows are mutant, convulsive, creative, circulatory and material flows that are tied to desire and always lie beneath the solid lines and their segments, which determine, for example, interest rates and the relationship between supply and demand. For Deleuze/Guattari, then, there are two economic ways of thinking: a) that of economic accounting (segments and solid lines of a molar organisation representing determinative metrics of cardinal value); b) that of financial money, which today is called the flow of finance or, precisely, financial flows. In his book “Algorhythm”, Miyazaki counters the metaphor of flow with the concept of rhythm, which is more precise for describing digital networks and the structures of storage, transmission and processing, because it indicates a discrete, catastrophic flow that, as algorhythmics, is located between the discrete-impactful and the continuous-flowing. By means of algebraic logic, command and control can be described as a chain of circuits that serve to establish equilibria of flow, “whereby the term ‘flow’ only obscures the fact that at any given moment it is only a matter of more or less large intervals of binary states, up to the limit of their coincidence” (Bahr).
The problematisation of quantum physics promises enlightenment at this point. Max Planck’s groundbreaking insight that radiation such as light, until then regarded as a continuous phenomenon, could under certain conditions be quantised, i.e. have a discontinuous character, shook the foundations of classical physics. According to the classical view, light behaves like a wave; but since Einstein’s discovery that light can behave as a particle under certain circumstances, the classical position in physics has had to acknowledge an indeterminacy: the wave-like and particle-like states of light cannot be observed at the same time. The visualisation or representation of the quantum phenomenon in a single image is not possible.
The concept of complementarity, invented by Niels Bohr, establishes a link between quantum mechanics and philosophy in a way similar to Derrida’s later work (persisting opposites at the same time). If the main question is whether an object can take on the opposite properties of wave and particle at the same time, later experiments show that “wave-like” and “particle-like” are not even properties or attributes that can be attributed to quantum objects, because these designations are based on classical concepts that attempt to describe radically new phenomena. The new designations even raised the question of whether quantum objects could be thought of as objects at all. Physics was thus suddenly confronted with the indeterminable, which is observable only in its effects. The unknowable objects of quantum phenomena were determined as “efficacities”, i.e. they are accessible to us only through their effects, which, however, can only be observed and understood with the concepts of classical physics. Classical physics and its principles of causality require the construction of a model with which the interaction between natural objects and natural phenomena can be observed, measured, explained and verified. In quantum mechanics such a model is not possible, since only the interactions between the efficacities and the measuring instruments can be described. As Bohr says, all this requires a departure not only from the classical principle of causality and its visualisation, but generally from the classical attitude towards the problem of physical reality. But if the unknowable is only knowable through its effects – through the concepts of classical physics – then such a situation requires a revision of what constitutes reality, and it is thus hardly possible to follow the model of a model, for instance the concept of the model in classical physics.
In contrast to classical, causal and deterministic ways of constructing models according to models, quantum phenomena proved that such ways of modelling do not work, because in quantum mechanics what was previously thought to be guaranteed was generally at stake. Quantum theory can be seen from this perspective as a crisis of representation, of its models and of mimesis, of the Platonic model of a model with which science was generally able to represent certain phenomena as reasonable. With the rise of quantum theory, the visibility of such phenomena and thus their representation was at stake. In this sense, Bohr was no Hegelian; his concept of complementarity not only fundamentally challenged the Hegelian concept of synthesis but contained a critique of the metaphysics of presence. Bohr could perhaps be assigned to Derrida’s deconstruction, which considers synthesis possible only if one affirms the metaphysics of presence. For Bohr, a real abyss opened up at this point. And Bohr was perhaps the Bataille of physics, a non-Hegelian Hegel who interrupted every kind of synthesis between opposites in order to continue them endlessly in their breaking apart.
Money capital flows and their rhythms cannot be indexed at the level of virtualisation, nor can they be measured by the common metrics of managerial accounting, nor can they be regulated by new metrics and segments, rather they operate between and through the poles to constantly create new singularities and quanta – they determine, not least, the molar determinants of cardinal values. Molecular money flows thus cannot be represented, they resist metrics and can even reveal lines of flight (from macroeconomic indicators) characterised by deterritorialisation, destruction and transformation of classical economic thought, belief and desire, but in a way they always come too late as well, insofar as macroeconomic indicators have already reterritorialised the molecular movements.
Liquidity often appears as the most important property of an asset. One can call this aspect transaction liquidity: an asset has liquidity if it can be exchanged for money (the object has liquidity). But liquidity can also be used to characterise those capital markets that are populated by differential varieties of assets. Here one no longer speaks of the liquidity of the assets but of market liquidity, in order to attribute liquidity to a simulative space of exchange in which market participants can liquidate their positions quickly enough without having to accept excessive price reductions on the assets involved (one can assume here a shift of liquidity from a purely objective property to a property of the simulative space with objective consequences). Finally, liquidity is also referred to as a property that relates entirely to the debtor and is then called funding liquidity. This involves the creditworthiness of the debtor as well as its ability to borrow against assets at an acceptable interest rate, so that a lack of liquidity does not turn into insolvency.
The fundamental prerequisite for developed financial markets is so-called secondary trading, which is based on trust in highly liquid money and capital markets. The price-formation process, from primitive collateral up to singular financial innovations (derivatives), requires “continuous” financial values; and “continuous” price formation depends on the existence of sufficient liquidity. The smooth functioning of the financial system is built on the notion that the option to trade can itself be executed under constantly tested conditions. But herein also lies one of the problems for contemporary financial capital: although modern finance improves the conditions of realisation for capital, it remains heavily dependent on market liquidity. If this evaporates, the whole setting quickly becomes brittle. In other words, the demand for greater discipline within the framework of capitalist power relations makes the economic milieu more fragile and vulnerable. Liquidity must be regarded as endogenous to the financial system. In times of danger, the valuation of risk opportunities and the prices of assets tend downwards; market participants cut their credit lines and/or raise margin requirements to protect themselves against the risks posed by counterparties, and liquidity disappears precisely when it is most needed, so that eventually the whole price-formation process can collapse. This would be a description that could be called Marx’s financial instability hypothesis.
Economic objects never stand directly for processes such as inflation, deflation, etc.; rather, it is the relations that indicate the price spreads between the objects and the imago of value (money), relations that in turn refer to processes of inflation or deflation. Quanta, too, are not objects but stochastic processes through which economic objects first acquire the status of objectivity. In this context, the phenomenon of inflation does not at first seem much different from that of weather: it is a haecceitas; it articulates the differential price movement and/or the stochasticity of price movements, which in turn are condensed into objects without ever being reducible to such objects. We can thus say that quantum flows and their rhythms are often stochastic processes whose dynamics form around singularities, only to refract back, in some circumstances, into lines and segments, into metrics of representation or cardinal value. In contrast, ordinal value theory insists that money-capital flows emerge from the double reciprocal determination of molecular and molar machines, each with its own modalities and, moreover, two unequal systems of reference, although the two circuits of circulation, which can only be separated analytically, remain materially coupled and always flow as one stream.
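A standard way of modelling such stochastic price processes in quantitative practice, offered here purely as an illustration of what "stochasticity of price movements" means (all parameters are invented), is the discretised geometric Brownian motion:

```python
import math
import random

def gbm_path(s0, mu, sigma, dt, steps, seed=7):
    """Sample one path of geometric Brownian motion using the exact
    log-normal update:
    S(t+dt) = S(t) * exp((mu - sigma^2/2) * dt + sigma * sqrt(dt) * Z)."""
    rng = random.Random(seed)
    path = [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

# One year of daily prices: a single realisation of the process, not an
# object; any "object" (a price level, a volatility figure) is condensed
# out of the path after the fact.
path = gbm_path(100.0, mu=0.05, sigma=0.2, dt=1 / 252, steps=252)
```

Each run with a different seed yields a different singular trajectory; what economics records as data is always such a condensation of a stochastic process into fixed figures.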
Sedentary distribution deals with spaces (markets) and objects in space (assets), with territories (features and their relations) and their zones and regions. And it proceeds, like all forms of distribution, by way of points and paths. Nomadic distribution, while also dealing with these parameters, may even follow ordinary paths from one point to another, but the points remain subordinate to the paths that determine them. In the nomadic mode of distribution, a point is reached only to be left behind, so that each point is a kind of relay and exists only as a relay. Thus the in-between (between points), the intermezzo, acquires a consistency all its own, even a new autonomy and dominance, which in the field of financial theory consists in the continuous recalibration of assets. A becoming whose distinguishing feature lies in the in-between of two points indicates in practice the abstract principle of the Cantor set. The Cantor set is an infinite dust of points whose continuous becoming creates a continuous space between the points: the principle of its construction is the endless repetition of division, so that the set’s total measure continuously approaches zero even though it remains enclosed in a finite interval, and it thus remains at once infinitely numerous and sparse. Continuous recalibration is a practical method of endless deterritorialisation.
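The Cantor construction invoked above can be made concrete in a few lines, as a minimal sketch: at each step the open middle third of every interval is removed, so the number of points (interval endpoints) keeps doubling while the total length shrinks toward zero, “infinitely diverse and sparse” within the unit interval.

```python
def cantor_step(intervals):
    """Remove the open middle third of each closed interval."""
    out = []
    for a, b in intervals:
        third = (b - a) / 3
        out.append((a, a + third))      # keep the left third
        out.append((b - third, b))      # keep the right third
    return out

intervals = [(0.0, 1.0)]
for n in range(5):
    intervals = cantor_step(intervals)
    total = sum(b - a for a, b in intervals)
    print(f"step {n + 1}: {len(intervals)} intervals, total length {total:.4f}")
```

After n steps there are 2**n intervals of total length (2/3)**n: the measure tends to zero, yet the dust of points between any two relays only grows denser.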
With the euphoric use of the concept of the structure or the rhizome, a post-historical, post-romantic, post-genealogical phantasmagoria about the network society was set in motion, in which the various sub-areas could supposedly co-evolve, no causal relationships needed to apply any more, and the respective areas were instead interwoven like a colourful patchwork. Talk of the ecological system or the network pervades almost every well-versed social analysis today. David Harvey, for example, in his paper “Unravelling the Riddle of Capital”, lists, alongside the parameter of capital accumulation, factors such as class relations, institutional structures, production processes, relations with nature, reproduction, everyday life, demographic development and intellectual ideas, which together are said to form a network or an open complex whole. According to Harvey, the important thing is to show how these areas influence one another, how they organise and structure their relations, from which tensions, contradictions, evolutions and inventive processes can result, which are, however, by no means causally determined, but simply contingently mediated. According to Harvey, this very theoretical structure was developed by none other than Marx himself, in chapter 13 of Volume 1 of Capital, in opposition to Darwin’s theory of evolution. Supposedly, Marx recognised no primacy of any instance and had no concept of determination. Thus the strategic influence of the network metaphor is also evident in Harvey, an influence that today goes far beyond the mere description of a data-organised infrastructure and a relational ecology. After all, everything is network today, and the best response to networks is even more networks; indeed, there is an almost paranoid intellectual atmosphere in which everything is network.
In this context, even the term “rhizome” came to describe the fact that in art, architecture, computer science, neurobiology, economics, etc., a network model reduced to the concept of virtuality came into fashion (at the very least, the terms virtuality and digitality were used interchangeably), one that followed the almost paranoid premise that everything is networked. In fact, many of today’s big corporations are network companies. While Google monetises its designs of networks through clustering algorithms, Facebook overwrites subjectivity and social interaction along the lines of channelled and discrete network services. In military theory, the most effective response to terrorism is to build networks; in Negrian operaismo, the best response to Empire is the multitude; in ecology, networks are the most effective response to the systemic colonisation of nature. In computer science, distributed architectures are the best answer to bottlenecks in connectivity. In economics, heterogeneous economic scopes are the best answer to the distributed nature of the “long tail”. Ecological or systems thinking gained an at times unimagined popularity as a kind of solution to the problem of diachrony, with space and landscape taking the place of time and history. The postmodern “spatial turn” goes hand in hand with the devaluation of the temporal moment; think of Riemann’s complex surfaces, of the path from phenomenology to the theory of assemblages, from the time-image of cinema to the data-based image on the internet. Finally, the old mantra of historicising has been replaced by the mantra of constant connecting and networking. (Sloterdijk situates the phase of network consolidation in the period from 1492 to 1974, and thereby misses precisely the essence of the so-called digital revolution.)
During the age of clocks, the universe was still thought of as a mechanism in which the heavens rotate to the music of the spheres. In the age of the steam engine, the world mutated into an engine of indescribable thermodynamic forces. And with fully developed industrialisation, the body was transformed into a factory, enriched with the seductive metaphors of technology and infrastructure. Today, in the age of networks, a new template (paradigm) inscribes itself into everything that shows itself as presence; more than that, the idea that everything is network opens up a new tautology of presence.
Photo: Bernhard Weber
Translated by DeepL