High Frequency Trading and Ecotechnology

If one follows the theory of technical objects developed by the French theorist Gilbert Simondon (Simondon 2012), and subsequently Frédéric Neyrat’s statements in the anthology The Technological Condition (Hörl 2011), it becomes necessary to fundamentally reconceptualize the already disturbed identity of nature and technology for today’s hyper-technicalized societies (there is neither a total integration of nature into technology, nor is technology to be understood purely as an expression of nature) by first affirming the machines or technical objects in their pure functionality, so that they can be understood in their unenclosed nature. Technical objects, which by no means extend human organs in a prosthesis-like manner or serve humans merely as means for use, are first affirmed in their pure functioning, so that they can finally attain the status of coherent and at the same time individuated systems, whose localizations are embedded in complex machine networks. Almost at the same time as Gilbert Simondon, Günther Anders had spoken of machines as apparatuses, but of an apparatus world that had rendered the distinction between technical and social configurations obsolete, indeed had generally rendered the distinction between technology and society irrelevant. (Anders 1980: 110) According to Günther Anders, every single technical device is integrated into an ensemble, is itself only a part of a device, a part in the system of devices – of apparatuses – whereby it on the one hand satisfies the needs of other devices and on the other hand, through its mere presence among other devices, stimulates the need for new devices. Anders writes: “What is true of these devices is true mutatis mutandis of all […] To claim of this system of devices, this macro-gear, that it is a ‘means,’ that is, that it is available to us for free purpose, would be utterly meaningless. The system of devices is our ‘world’. And ‘world’ is something different than ‘means’. Something categorically different.” (Anders 2002: 2) Or, to put it in Simondon’s terms, given our post-industrial situation we should speak of technical objects whose elements always form recursions and maintain internal resonances with each other, while at the same time the objects stand in external resonance with other technical objects, possibly in order to play out their inherent technicity in machine ensembles as open machines. In addition, many technical entities develop a plural functionality and execute several functions within a machine system instead of one; think of the combustion engine, for example, whose cooling fins take on the function of reinforcement in addition to cooling when they counteract the deformation of the cylinder head. (Cf. Barthélémy 2011: 97) Simondon, however, did not subscribe to the deeply pessimistic view of post-industrial technologies found in Günther Anders. Rather, in those technical objects that escape the hylomorphic juxtaposition of form and matter still envisioned in the labor model (matter shaped by tools), Simondon identified precisely a possibility for technology to approach the autonomy of nature, a tendency that leads to a dynamic cohesiveness of technical objects.
This is achieved by the production of associated milieus, the connection of their inside (resonance of the different parts and multifunctionality of the parts) with the outside, with other objects, be they natural or artificial. At the same time, the technical object cannot completely separate itself from an excess of abstraction, which is precisely what characterizes the so-called artificial, heteronomous object. Simondon attributes the power of abstraction primarily to man as his constitutive contribution to technology, which, however, prevents technical objects from concretizing themselves in open structures and from exercising their tendency toward autonomy. Simondon, however, by no means allows himself to be seduced by the thesis that in a post-industrial future all living things must rigorously be subjected to open technical ensembles; on the contrary, Simondon advocates a social concept of technical ensembles or open machine associations in which the human coexists with the ‘society of technical objects.’ But where man intervenes too determiningly in the technical, we are dealing with heteronomous artificial objects, whereas the technical object tends at least toward an autonomy (it cannot completely give up abstraction) that includes the natural moment, i.e., the closedness and consistency of a machine system. Paradoxically, then, for Simondon, it is precisely the artificial that prevents technology from achieving autonomy. (Ibid.) According to Simondon, abstract artificiality always refers to a lack of technicity, whereas the technical object is supposed to concretize itself in coherent processes, whereby it is necessary to integrate each local operation of the technical object into an overarching arrangement of machine ensembles. (Hegel defines the concrete as that which includes the relational, while the abstract is considered by him to be rather one-sided or isolated. The terms “concrete” and “abstract” therefore do not designate types of entities, such as the material and the immaterial, but are used to designate the way in which thought stands in relation to entities. Thus, as Hegel often explains, the abstract may turn out to be the most concrete and the concrete the most abstract. A materialist concept must be able to explain what constitutes the reality of a conceptually formed abstraction, but without wanting to hypostatize the form. It must be able to show how abstractions are “treated” by social practices, whereby the latter are more than just labor processes shaping matter, insofar as they repeatedly re-situate themselves in very specific ways, i.e., concretize themselves in relation to the concretizing technical objects as well, as Simondon suggests.) Thus, the technical object functions in associated milieus, i.e., it is related to other technical objects or it is just sufficient for itself, and in doing so it must always take nature into account.
Where Simondon’s technical objects point to their embedding in network structures and he depicts the contemporary coupling of technical objects to atomization technology, salutary anticipations of the image of a society that is no longer apparatusized come to the fore. The digital, information- and computation-intensive ecology of new media was foreshadowed as early as the 1960s: the dispositif of digital, transformational, and modular technologies, including a non-intentional and distributed neo-subjectivity deformed by machinic speeds. A subjectivity situated at the intersections of technological and monetary currents flowing at the edge of the speed of light, where it proves to be flexible, modular, and recombinable in the extreme through all its self-relations. Almost in unison with cybernetics, Simondon is also aware that the machine is not used as or like a work-product; rather, it is operated. Technical objects are neither prostheses of man, nor, conversely, can man be completely dissolved as a prosthesis of machines. First of all, technical objects should be considered purely in terms of their functioning, and this in terms of their explicable genesis, in the course of which, according to Simondon, they increasingly concretize (rather than abstractify) themselves on the basis of an immanent evolution, that is, not on the side of the adaptation and expediency of their use or their fixation as means. However, the technical object is not a creative agent in its own right; it remains constrained in an economic and scientific context, and the process of its concretization at the same time asserts synergetics, the interaction with other functional subsystems that modifies and completes the functionality of the technical object. The movens of the concretization of the technical object includes the organization of functional subsystems, in which the technical object, only by being interconnected, matures into a technical ensemble, which is characterized by comprehensive social, economic and technical processes and their structuring. Concretization also means the tendency to innovation, in which a number of quite conflicting requirements are satisfied by multifunctional solutions of individuated technical objects that establish causal cycles in order to integrate the respective requirements. Technical elements (parts of the machine), technical individuals (the machine as a whole) and technical ensembles (the machine as part of social, technical and economic systems) are each already in a dynamic relationship that potentially unleashes a process of technological change. However, the capitalist economy is not dominated by the media/machines; rather, capital and its social economy continue to determine the technological situation in the last instance. We are dealing here with a unilateral feedback loop, the dynamic relation between the economy and its social environment on the one hand (the determinant) and the machine ensembles on the other. The economy feeds the machines, sets conditions, and at the same time uses their knowledge for the organization of its own fields of power, while conversely the machine ensembles shape the consistencies of the economy, its communications and relations of power. (For Laruelle, the first distinction must be made between the technical and the technological.
Laruelle understands the technical as the object presented to science, while the concept of technology belongs to the order of discourse and knowledge, of a human-scientific knowledge of the philosophical type, understood as techno-logos or techno-logical difference. At first sight this seems not so far away from Hans-Dieter Bahr’s conception of the machine, which he had sketched in his writing Umgang mit Maschinen: on the one hand, a discourse on the genealogy of machines and techniques, from the trap to baroque slot machines to the industrial robot; on the other hand, a genealogy of the concepts of technology and machines, which until now have mostly been categorized in a philosophical, instrumental or anthropological scheme. Bahr, by contrast, is interested in machines in their differential neutrality or non-neutral indifference, insofar as they are blasting and leaping machines that cannot be causally reduced when they process as the epitome of an art of understanding beyond the means-ends scheme.
Laruelle’s concern with regard to a real object is the essence of the technique and not the technical object. If there are technical objects, they are those that Simondon describes, but Laruelle does not believe that they can exist as scientific objectivity and constitute the essence of the technical. That which constitutes the technical essence of technique, for Laruelle, who at least on this point agrees with Heidegger, is not itself technical. However, the technical essence is also not to be understood as techno-logical, in that philosophy turns to the technical object to exercise its dominance).
According to the French theorist Frédéric Neyrat, the identity between nature and technology, already disturbed in each case, refers to the “hyperject,” which denotes the machinic autonomization of technology vis-à-vis human actants as well as the material substitution of the natural by the artificial, without, however, a total integration of nature into the technical having to be assumed. (Technology as detachment from nature, as substitution of natural materials by synthetics, and as detachment of technology from humans qua machine autonomization. It is important to assume that machines and their materials stand in a relation of interference.) One can identify the hyperject as a substitution and autonomization milieu (materials and machines) of the technical that is independent of subject/mind and object/nature, whereas with regard to the contextualization of the two milieus one should speak not of unifications but of superimpositions when thinking about the inner and outer resonances of technical objects.
Post-industrial technology, for example the concept of the transclassical machine in Gotthard Günther, stays in the in-between of nature and spirit, because it is forbidden to reduce the transclassical machine purely to a scientific-human creation, precisely because of the processes of double detachment, since it follows an independent logic of reflection. At issue is the transclassical machine, whose essential function is to deliver, transform and translate information. [Information articulates the difference that makes a difference, as Gregory Bateson sees it, but this is not because the smallest unit of information, a bit, is simple, as Bateson assumes, but because, as Bernhard Vief writes in his essay Digital Money (Vief 1991: 120f.), it is doubly given: bits are immaterial, relative dividers, they stand for a movement of differentiality that is neither present nor absent, and thus the binary code, the binary sequence of numbers, can also only be positioned as an effect of the alternance articulating it. As Lacan has shown with the example of the cybernetic machine, the articulated is of the same order as the symbolic registers, the switches of switching algebra being the third of that order: the articulation, which itself is neither open nor closed, first indicates the possibility of the purely place-valued states.] The transclassical machine cannot be mapped onto either the object or the subject; rather, a three-valued logic inheres in it: subject, object, and the transclassical machine as hyperject. Thus the hyperject belongs neither to nature (object) nor to mind (subject), and thus it is subject to an exteriority which, however, is by no means to be understood as the externalization of the interior of a subject, but rather indicates an independent “region of being” – it contains a trivalence that shows its incompleteness per se, because it does not synthesize the opposites (subject and object); on the contrary, these non-trivial machines (Heinz von Foerster) are always already withdrawn from complete analysis as well as from synthesization. However, at this point the concept of technical being has to put up with the question of whether the medial of technical objects can be ontologically grasped as ways of dispersion into open spaces or as the dispersion of space itself. Second-order cybernetics had created its own constellation of terms in the last century (feedback, autopoiesis, temporal irreversibility, self-referentiality, etc.), which has long since migrated into mathematical models or computer simulation. Although the material substrate or the physicality on which these processes are based does not dissolve at all, the autonomous-immanent relations and interactions of a multiply graded complexity govern here, whereby complexifications take place in every single contingent process: systems transform random events into structures, just as, conversely, very specific events can also destroy structures, so that a single system shows a continuous fluctuation between disorganization and reorganization as well as between the virtual and the actual in almost every conceivable case. Gotthard Günther has above all tried to depict the ontological implications of these forms of knowledge and has introduced the concept of polycontexturality for this purpose.
In a polycontextural world context, the transclassical machines operating in a gap, or as the third between subject/mind and object/nature, are scattered across a multiplicity of objects, qualities, and differences. (Neyrat 2011: 165f.) These transclassical machines are conceivable as ensembles of universes, each of which can make an equivalent claim to objectivity without thereby having to map or even eliminate the claims of other ensembles. Here, the notion of contexture denotes a continuum of potential reality that changes its shape with each quantification. Günther thus speaks of the contingency of the objective itself, whose difference does not convey an intelligible hierarchy, with the consequence that in these technological fields we are dealing less with classifications or taxonomies than with decision situations and flexible practices. In contrast, the computers we know so far operate only autoreferentially, i.e., they cannot process the difference between their own operations and the environment within themselves.
Frédéric Neyrat introduces the so-called holoject as a fourth level of the technical, which, in contrast to the hyperject, is a medium of absolute connectivity. As such, the holoject is inexistent, though it can transfer its continuity properties to the hyperject and thus give form to it, which we then finally call a body without organs, a machinic ensemble that is machinic in all its parts. In this process, there is by no means a fusion of two realms (subject/object, knowledge/thing, etc.); rather, following quantum physics, we have to assume superpositions here, in which, for example, two waves retain their identity when they generate a third wave, which, however, is neither a synthesis of the two previous waves nor their destruction, but, according to François Laruelle, indicates a non-commutative identity. Idempotency, a term from computer science, designates a function that can be linked with itself or remains unchanged by further applications of itself, so that the generative matrix persists as a non-commutative identity through all variations without ever needing transcendence. Idempotency is inherent in what characterizes the holoject according to Neyrat, the “both as well as, as well as, as well as …”, whereby with regard to idempotency the main focus is on the function of the “and”, i.e. on the insistence of conjunctive syntheses, and this leads us towards an open technical structure in which the technical object as an “in-between” always already indicates itself with a certain delay as well as an inexhaustible reserve of the technical medium itself. In this context, McLuhan’s formula “the medium is the message” does not postulate an identity of terms, nor is the message degraded to a mere effect of technical structures; rather, the “is” echoes something that recurs in the medium as difference, virulence, or disjointedness, without ever being able to be immobilized. (Cf. Lenger 2013) The message of the medium occurs in the fact that difference only submits to a medial “together” in order to recur in it as disparation and to echo itself as difference, thus simultaneously undermining its previous technical modes and modifications. At this point, Jean-Luc Nancy speaks of an eco-technique of intersections, twists, and tensions, a technique to which the principle of coordination and control is alien, and he calls this pure juxtaposition, this unstable assemblage without any sense, struction. (Cf. Nancy 2011: 61)
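To make the computer-science sense of idempotency invoked above concrete, here is a minimal sketch (function name and values are hypothetical): applying the function a second time changes nothing, f(f(x)) = f(x).

```python
# Minimal sketch of idempotency: composing the function with itself
# changes nothing beyond the first application, f(f(x)) == f(x).

def clamp_to_unit_interval(x: float) -> float:
    """Map x into [0.0, 1.0]; applying it again leaves the value unchanged."""
    return max(0.0, min(1.0, x))

if __name__ == "__main__":
    for value in (-3.7, 0.42, 8.1):
        once = clamp_to_unit_interval(value)
        twice = clamp_to_unit_interval(once)
        assert once == twice  # idempotency: further applications add nothing
        print(value, "->", once)
```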
Last but not least, it was Félix Guattari who developed a concept of material, social machinization that moves beyond the conception of molar machines, which maintains the dualisms of subject versus object and nature versus culture. Machines are understood by Guattari as multiple assemblages that are at once material and semiotic, actual and virtual. Even before a machine can be qualified as technical, the machine is social and diagrammatic, and this dimension (diagrams, plans, equations as parts of the machine) cannot be separated from the dimension of its virtualizations and infinities. The molecular, autopoietic machine, of which the human is a part alongside a multitude of non-human agents, operates in the infinite by means of asignifying semiotics insofar as it gives shape or form to the indeterminate. And as in quantum physics, observation and the observed cannot be separated in molecular machines within the framework of material apparatuses and their practices. These assemblages, described by Karen Barad (Cf. Barad 2012) as intra-active practices, produce contingent phenomena, that is, relational, material configurations of components, human and nonhuman elements (signs, apparatuses, machines, etc.) that did not even exist as relations before the production of the relations. Guattari’s machines possess, on the one hand, the capacity for signification, the formation of relevant concepts, and, on the other hand, they lead to the gradual materialization of assemblages across all spacetimes, intervening directly in material processes to drive the dynamic configuration of topological manifolds. It is primarily through asignifying semiotics that machines express, indeed speak, phenomena (and the things within them) and at the same time communicate with other machines and real phenomena (materializations across spacetimes). And by means of certain signs, molecular machines intra-act (action upon an action) in the context of their material-discursive practices with the physical and biological strata of matter, which in turn is itself an agent of a material becoming.
With respect to the cybernetic situation, Alexander Galloway has defined the black box as an apparatus in which primarily the inputs and outputs are known or visible, with the diverse interfaces establishing its relation to the outside. While Marx’s fetishism critique of the commodity was still about deciphering the mystical shell in order to penetrate to the rational core, this relation is inverted in today’s post-industrial technologies.

In today’s post-industrial technologies, which produce information as a commodity, the shell, which functions purely via the interfaces, is open and visible, while at the same time the core remains invisible. (Galloway 2011: 269) The interactive interfaces occupy the surfaces of the black boxes and usually only allow selective passages from the visible outside to the opaque inside. Black boxes function as nodes integrated into networks, whose external connectivity is subject to a strict architecture as well as a management that remains largely invisible. According to Vilém Flusser, the camera can be considered exemplary for most apparatuses and their function. Its agent controls the camera to a certain extent because of his control of the interface, that is, on the basis of input and output selections, but the camera controls the agent precisely because of the opacity of the interior of the black box. For Simondon, on the other hand, it is precisely the digital technologies with their visually attractive and “black-boxed” interfaces that would prove highly problematic today. These technologies usually derive their popularity from a suggestive aesthetic of the surface.
They do not attract the user by offering him a possibility of indetermination of the technology, of flexible couplings of the machines with each other and with the human, something Simondon considers eminently worthwhile. Simondon, in fact, maintains that the principal movens of technological development is not an increase in automation, but rather the emergence and evolution of those open machines that are susceptible to social regulation. In the case of black boxes, by contrast, we are dealing with technological objects that are described as ensembles of readable rational functions, and this with respect to input-output relations that are kept as smooth as possible, whereby their interior remains invisible and their material construction exists in discourse at best as a referent to be neglected. Simondon, however, urges us to take a look inside the black boxes.
Furthermore, the problem of connectivity has to be taken into account, namely with regard to non-missionizing, transmitting machines that are endowed with a plurality of procedures and effects. This shows up as a matter of the highest economic relevance when these machines, contrary to a one-dimensional chain of effects, produce multiple machinic functions and effects in and with their complexes; indeed, these functions even set free blastings of previous machines and thus pose new conjunctures.

“The spheres of production and energy technology, transportation, information, and human technology give vague field determinations of machines into which the machine-environmental is already inscribed,” writes Hans-Dieter Bahr (Bahr 1983: 277), and in principle most machine ensembles and processes can thus be described as transmitting information, information into which natural, economic, and social structures and processes enter, including their deferrals, complexifications, and layer changes, whereby it has long been not only about communications, but also about the absorption and filtering of the informational itself, about the manipulation of data via algorithms – and thus the respective relations and programmings/functionalizations inside the technical objects themselves would have to be decoded, which, however, the hegemonic discourses on technology know how to prevent almost obsessively. The (technological) machine would therefore, as Hans-Dieter Bahr has argued in his writing Über den Umgang mit Maschinen, be understood less via the concept of the object “machine” than described as a discursive formation. (Ibid.: 277) Every (digital) machine is functionalized by programming, but it quickly becomes apparent that the description and maintenance of the constructive functions alone does not necessarily mean that a machine is “functioning”; rather, the manifold dysfunctionalities of the machines have to be taken into account, which can thwart the functioning system of input and output relations at any time: accidents, crashes, crises, etc. (It can happen that a deceleration of the speed of machines is cost-saving for an economy as a whole, think of the (external) climate costs that are not incurred, although the deceleration increases costs for the individual capital; a machine can become obsolete due to the competition between the companies, i.e. from an economic point of view, although it is still fully functional from a material point of view, a constellation that Marx called moral wear and tear.) The in-between of the machines, or the machine transmissions, quite violently blocks a teleological point of view: the outputs of complex machines today are less than ever objects of use but are mostly already further machine inputs, and they generate much more complexes of effects, including unintended side effects, whereby the machines themselves mutate into the labyrinthine and therefore constantly need new programming and modes of operation for orientation and control in order to keep their inputs selective.


The machines are supposed to function in particular by the rule-governed supply of programs, substances, and information and by controlling the input-output relations. Possible outputs of the machines can be use values, but also dysfunctionalities that disturb the continuous operation of the machines – most of these outputs are inputs into other machines. So machines send out energy and information streams which are cut or interrupted by other machines, while the source machines of the emitted streams have themselves already cut into or taken from other streams, which in turn belong to other source machines. Each emission of a stream is therefore a cut into another emission, and so on and so forth; this at least is how Deleuze/Guattari see it in Anti-Oedipus. (Deleuze/Guattari 1974: 11f.) At the same time, a double division emerges with the machine incisions, whereby the concept of the incision does not arise as a meaning from one inside in order then to be translated or transported into the inside of another; rather, in the communication of the incision something is indicated that already “is” as an outside,
e.g., a network of machine series that flee in all directions (Lenger 2013), or a gravitational field that enables the coupling of machines. Each co-communication or translation takes place over an inexpressible incision into which the net divides. This division remains unexpressed in the middle, but only because an open space is opened which allows that, in principle, everything can be communicated and expressed. Today, these divisions take place via interfaces. Interfaces are usually referred to as significant surfaces. An extension of the conceptualization takes place when they are conceived as transitions or passages, when they are described as thresholds, doors, or windows, or when they are understood as fields of choices in the sense of a flexibilization of input selections, in which case we can speak of an intraface that is an indeterminate zone of translations of inside and outside. (Cf. Galloway 2012) The intraface opens up the machinic structures in an indeterminate way to associated milieus, which means that we are dealing with open machines in which several intrafaces are always integrated as effects of translations that work or do not work, although even this distinction is still questionable if we consider that machinic transmissions simply cannot do without manifold side effects and disturbances.
Now, the cybernetic hypothesis is characterized precisely by the fact that it defines the technical system by the sum of its inputs and outputs, whereby its black boxes (computers, data objects, interfaces, codes) must eliminate dysfunctional inputs in permanence. Dysfunctional inputs include climatic conditions, incomplete classifications, influences of other machines, faulty programs, economics, wear, etc., and it is up to the cybernetic machines to absorb these structures and correct them according to their own criteria, and these transformations in turn affect the outputs. If machine systems select and transform different types of inputs, this means precisely that a variety of economic, social, natural, cultural, and juridical functions are among their inputs as well as among their outputs. (Bahr 1983: 281) Here the disciplining function of the feedback mode of cybernetic control loops clearly shows itself: the attempt to link the outputs back to the inputs in such a way that in the future dysfunctional inputs can be faded out or eliminated, or at least more functional selections of the inputs take place than before. Thus, cybernetics is characterized not only by automation, but also by the mechanism of input selections as well as the elimination of dysfunctions. A cybernetic feedback system is consistent with a recursive function of input and output: F = F(x). The system thereby assumes an outside, to which an inside is shown as its target value. It must be able to correct disturbances with respect to this set value, otherwise the system disappears, whereby it must be noted that the system can never determine its inside as set value by itself. If the human element is now taken out of the feedback loop, one speaks of the automaton. This of course contradicts the posthuman situation as Gilbert Simondon had still imagined it: if Simondon’s technical objects individualize themselves, then they are always also in external resonance, whereby the resonances insist in the in-between of the technical individual and the associated techno-logical milieu; they create a recursive causality in the in-between. Cybernetics, however, wants to subject this in-between completely to its automatism or input selections, whereby the identity of living being and machine is thought purely from the point of view of the automaton, while Simondon conceives the analogy between human and machine, which for him is asymptotic, from the perspective of the machines, each already oriented towards open spaces and associated milieus, which in turn corresponds to a certain affirmation of non-selective inputs and a variety of stratagems that perpetuate themselves as cuts, divisions, and traversals of machine milieus.
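A minimal sketch, in Python and with hypothetical names and numbers, of the feedback logic just described: the system measures the deviation of its output from an externally imposed set value and recursively corrects its input so that dysfunctional deviations are progressively eliminated.

```python
# Minimal sketch of a cybernetic feedback loop (all names hypothetical):
# the controller measures the deviation of the output from an externally
# imposed set value and feeds a correction back into the next input.

def plant(x: float, disturbance: float) -> float:
    """The controlled system: its output drifts with an external disturbance."""
    return x + disturbance

def feedback_loop(set_value: float, gain: float = 0.5, steps: int = 20) -> float:
    x = 0.0
    for _ in range(steps):
        output = plant(x, disturbance=0.3)   # the outside acting on the system
        error = set_value - output           # deviation from the target value
        x = x + gain * error                 # recursive correction of the input
    return output

if __name__ == "__main__":
    print(feedback_loop(set_value=1.0))      # converges toward the set value
```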
Technical objects are now typically embedded in digital networks, with the associated architecture of protocols regulating their exchange of information with one another, proliferating across a complex topology of condensations and dispersals – and even from this, for Simondon, a cultural force would likely still emerge. This force does not use the machines but confirms that cultural dignity lies precisely in the recognition of the pure functioning of technical objects, whereby the human can enter into a dialogue with the technical ensembles, and this dialogue can lead to a true transindividuality. If the input and output selections are considered starting from their crossing contingencies, then we are no longer dealing with automata but actually with open machines – and concretization then means appreciating the contingency of the functions as well as the interdependence of the elements in order to do justice to their inner resonance, which makes them probable machines that cannot be measured by the ideal of precision but display different degrees of precision by expanding their field of application, extending to new fields or extending them tangentially until they occupy all fields of the social, cultural, economic and technological, as has happened, albeit in a usurping way, in the case of computer technology. It is the process of disparation between two realities, in Deleuze’s sense the disparation between the virtual and the actual, which finally activates an information other than the digital and sets in motion a process of individuation that comes from the future. Information is located not so much on the homogeneous level of a single reality, but at least on two or more disparate levels, for example a 3-D topology that knots our posthuman reality; it is about a fabrication of reality that folds the past and the future into the counter-present, as an individuation of reality through disparation that is in itself information. If individuation includes the disparation of the virtual and the actual, then information is always already there, already the now (actualization) of a future present that can occur either way (virtualization). What is called past or present is thus mainly the disparation of an immanent source of information, which is always in the process of its dissolution. For Simondon, the notion of the capacity or potential of a technical object is closely related to his theory of individuation. The individual object is never given in advance; it must be produced, it must coagulate, or it must gain existence in an ongoing process. At the same time, the pre-individual is not a stage that lacks identity, it is not an undifferentiated chaos, but rather a condition that is more than a unity or an identity, namely itself a system of highest potentiality or full of potentials, an excess or a supersaturation, finally a system that exists independently of thought. Nevertheless, Simondon’s concept of transindividuation still remains far too strongly influenced by the optimistic vision of a harmonious balance.
Digital networks today not only span the globe which they themselves generate, but they penetrate into the social microstructures of the capitalist economy, whose human agents they in turn subject to permanent addressability, online presence, and informational control. (Lenger 2013) Being “online” today becomes the hegemonic way of being; the permanently mobilizable availability is part of a flexible normalization that users affirm in toto by practicing everyday wellness, cosmetic, and fitness programs until they finally incorporate the processes of normalization altogether in the course of their permanent recursion with the machines. In the Postscript on the Societies of Control, Deleuze had described human agents as “dividuals,” as largely a-physical, endlessly divisible entities that can be condensed into data representations and that act similarly to computer-based systems precisely because of the effects of the a-human technologies of control. At this stage, we can at least assume that a homology can be identified between post-Fordist management methods, which propagate non-hierarchical networks, self-organization, flexibility and innovation in heroic litanies, and neuroscience, which describes the brain as a decentered network of neuronal aggregates and emphasizes a neurological plasticity (Catherine Malabou) as the basis for cognitive flexibility and adaptation. According to Catherine Malabou, neuronal and social functions influence each other until it is no longer possible to distinguish between the two. At the very least, we must assume the possibility that the human species, with the rapid translation of its own material history into data streams, networked connectivity, artificial intelligence, and satellite surveillance, is tending to become a carbon copy of the technology. When events (mobile apps, technological devices, economic crises, digital money, drone wars, etc.) process at light speed, there is a definite destabilization of the reference systems of traditional techno-discourses, whose definitions and hypotheses increasingly fail as useful indicators of what the future of hyper-accelerated capitalism might hold. The obscuring of clearly defined boundaries between bodies and machines, the interpenetration of human perception and algorithmic code, the active remixing of the constituents of humans, animals, plants, and inanimate objects: all this culminates in the injection of a fundamental technological drift into the social, cultural, and economic, with the capitalist economy still determining the technological. Implemented into social reality, the currently important signifiers of technological acceleration include concepts such as “Big Data,” “distant reading,” and “augmented reality,” with which capital as power shoots the discourses still bound to gravity into the weightless space of the regimes of computation. In the future there will be further migrations into this weightless space, e.g. that of thoughts into mobile technologies; at the same time we will have to deal with a further increasing volatility in the field of the digital financial economy, triggered by trading algorithms based on neural networks and genetic programming, we will dive further into the relational networks of social media, and last but not least we will be confronted with a completely distributed brain, modulated by the experiments of neurotechnology. Nothing remains stable anymore, everything is in motion.
Let us now turn to the current machine ensembles and their environments, to the digital networks and their complex ecologies of the material and the economic, in which high frequency trading (HFT) is integrated. Digital technologies have long since permeated the entire financial system – with HFT, the fluid, planetary movement of financial capital, with its inherent drive toward the violence of pure market abstraction as well as the substitution of material experience by the diverse models of computer simulation, easily takes off from cumbersome production and the orders of consumption in order to perpetuate itself in a self-referential, semiotic system that permanently forces the calibration and recalibration of machine-machine relations.
The process of decimalization (the pricing of assets in decimals rather than in fractions), which has been rolling through the financial markets in self-accelerating fashion since about the year 2000 and has reduced the spread between buy and sell prices (the bid-ask spread) more and more, reflects and fuels the need to move ever larger and extremely time-critical transaction sums on the financial markets in order to compensate for the ever decreasing spreads. In doing so, the traders hold the positions of the respective deals only for a minimal period of time and realize only small spreads, so that the high profits result from the quantity and the speed of monetary transactions alone. With so-called direct trading, which allows large institutional investors in particular to bypass all mediators (including the stock exchange) between themselves and the respective trading partner, as well as the existence of almost completely automated infrastructures, it is becoming increasingly urgent for financial companies to access the latest technological innovations in order to manage, control and, if still possible at all, also steer them in the sense of accelerative dynamics.77 Thus, current HFT involves the digital automation of almost every aspect of the trading process, from analysis to the execution of the particular trade to back-end processes, with all components controlled by algorithms. An HFT system must fine-tune its programming as well as its memory capacity, manipulate individual data points and packets, capture databases, select inputs, etc. Thus, a tendency towards the hegemonization of automation can clearly be observed in the financial markets. [In the Grundrisse, Marx had rudimentarily described automation as a process of absorption of the general productive forces – parts of the social brain – into the machine or capital fixe, which includes the knowledge and technical skills of the workers (Marx 1974: 603), which now follow the logic of capital rather than still being an expression of social labor.] If we consider the history of the development of the relationship between capital and technology, it seems quite obvious that automation has moved away from the thermo-mechanical model of classical industrial capitalism and has integrated itself into the electronic-computing networks of contemporary capitalism. Today, digital automation processes the social nervous system and the social brain en détail; it encompasses the potentials of the virtual, the simulative and the abstract, feedback and autonomous processes; it unfolds in networks and their electronic and nervous connections, in which the user acts as a quasi-automatic relay of the non-stop flowing streams of information.
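A back-of-the-envelope sketch of the compensation logic described above, with invented numbers: as the spread shrinks from a sixteenth of a dollar to a cent, the same gross profit requires a multiple of the previous trade count.

```python
# Hypothetical illustration of the compensation logic described above:
# as the bid-ask spread shrinks, the same gross profit requires a
# correspondingly larger product of volume and trade frequency.

old_spread = 1 / 16      # dollars per share under fractional pricing
new_spread = 0.01        # one cent after decimalization
shares_per_trade = 1_000
old_trades_per_day = 200

old_profit = old_spread * shares_per_trade * old_trades_per_day   # $12,500
# Trades needed at the new spread to earn the same gross amount:
needed_trades = old_profit / (new_spread * shares_per_trade)

print(f"gross profit to match: ${old_profit:,.0f}")
print(f"trades needed per day: {needed_trades:,.0f}")             # 1,250 (6.25x more)
```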
Algorithms must be discussed in the context of this new mode of automation. Usually, one defines the algorithm as an instruction for action; here it drives the acceleration of financial transactions, which may well expand into metastatic growth. (Cf. Rosa 2005: 116) However, if acceleration is presented as the fundamental condition of capitalism or as a time-diagnostic societal panorama, the question cannot be omitted of how acceleration, which is after all a derived phenomenon, as we have briefly explained, is driven at all, if acceleration is not supposed – which is unlikely – to develop out of itself. Moreover, it is necessary to refer (following Luhmann) not only to the ever-present scarcity of time, which serves as a time limit for contingency and the production of order, but also to the present slowness of time: the faster, for example, information circulates within the systems, the higher the probability that fluctuations or slowdowns will occur in the system, which, however, can certainly promote the stability of the systems. Thus, the exchange of information in capitalism takes place within a rhythmology of speeds and slownesses, but clearly under the dominance of acceleration and its effects.

As such, the algorithm is an abstraction, its existence simultaneously integrated into the particular programming language of a particular machine architecture, which in turn consists of hardware, data, bodies, and decisions. In the process, the currently existing algorithms process ever larger amounts of data and thus a growing entropy of data streams (Big Data); they generate far more than just instructions, each of which must be executed, namely potentially infinite amounts of data and information, which in turn interfere with other algorithms to reprogram the diverse algorithmic processes. From an economic perspective, algorithms are forms of fixed capital in which social knowledge (extracted from the work of mathematicians and programmers, but also from user activities) is objectified, whereby these forms of fixed capital are not valorizable in themselves, but only insofar as they are integrated into monetary capitalization, which they can then further drive and force. In any case, algorithms are not to be understood as mere tools; rather, it should be understood that they actively intervene in the analysis and processing of data streams in order to translate them into economically relevant information, e.g. by creating the technological conditions for the exploitation of information by generating orders on the financial markets in a self-referential manner and, under certain circumstances, also successfully concluding them. In other words, by far the greater part of financial transactions in high-frequency trading today takes place via pure machine-to-machine communication, which the human actors are no longer able to observe, because the data and information streams flow at a-human high speeds via invisible apparatuses and further liquefy the distinction between machine, body and image. (Cf. Wilkins/Dragos 2013) While the composition of human and a-human entities varies in HFT systems, in extreme cases some of the finance companies eliminate almost any human intervention from the automated transactions, so that the data read by the machines flows continuously and self-referentially in and out of the algorithms controlling the processes. Any human intervention, on the other hand, complicates even those financial processes in which specific errors and problems have arisen.
To some extent, algorithms are already being physically implemented in silicon chips: the union of hardware and software. The contemporary financial economy, at least in the area of HFT systems, is thus largely invisibly shaped by algorithms – for example, certain programs permanently scan the financial markets to see whether the indicators fixed by algorithms reach certain levels, which then take effect as buy or sell signals. There are current versions of algorithms such as the “volume-weighted average price” algorithms (VWAP) that generate complex randomness functions in conjunction with econometric methods to optimize the size and execution times of monetary transactions in the context of global trading volumes. (Ibid.) We are dealing with other types of algorithms that attempt to identify and anticipate such transactions, or there are non-adaptive, low-latency algorithms that “process” both the differentials of transmission speeds in global financial networks and the correlating material transformations that enable those informational relations. Genetic algorithms are employed to optimize the possible combinations of price fluctuations of financial derivatives and instruments and to ensure the optimal fine-tuning of each parameter within a financial system. (Ibid.) The implementation of algorithmic systems in computerized financial economics represents a qualitatively new phase of the real subsumption of machinery under capital; it indicates the transition from cybernetics to contemporary scientific technicity, the so-called “nano-bio-info-cognitive” revolution that rests on distributed networks and supposedly frictionless systems (superconductors, ubiquitous computing). (Cf. Srnicek/Williams 2014) (Real subsumption under capital includes that every aspect of the production process – technology, research, markets, workers, means of production, etc. – is determined by a process whose purpose is the self-valorization of capital.) In this context, the trading processes in the financial markets remain integrated into a financial ecology of powerful dominant players who feed the self-referentially operating robots (those that liquidate large positions and those that monitor indexes) with information and only partially still control them, while the HFT systems have placed themselves at the top of the financial ecology, at least in technical terms. The monetary side of the financial ecology would have to be specified in the sense that digital money today processes in series of referenceless signs that are fed into the calculating automation of the various simulation models, into the screen media with their automated displays (graphics, indices, etc.), and into algorithmic trading itself (bot-to-bot transactions). In his essay Digital Money (Vief 1991: 120f.), Bernhard Vief pointed out early on that digital money is pure sign money, which today is binary coded like almost all other sign systems. The bits not only fulfill all the functions of the previous money, but also mediate any exchange of information. For Vief, bits are universal signs with the help of which different signs and sign systems (sound, image, writing, logic, values, etc.) can be translated and accounted for. (Ibid.: 120) Bits, pure sign money, not only encode money, but are money themselves, and thus, according to Vief, money is equal to a code.
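Since the text names the volume-weighted average price as a benchmark, here is a minimal sketch of that benchmark alone (hypothetical data; the order-slicing, randomization, and econometric volume forecasts of real VWAP execution algorithms are omitted):

```python
# Minimal sketch of the volume-weighted average price (VWAP) benchmark that
# the execution algorithms mentioned above try to track; real VWAP engines
# add order slicing, randomized timing, and volume forecasts.

from dataclasses import dataclass

@dataclass
class Trade:
    price: float   # execution price
    volume: int    # shares traded

def vwap(trades: list[Trade]) -> float:
    """Sum of price times volume, divided by total volume."""
    total_volume = sum(t.volume for t in trades)
    return sum(t.price * t.volume for t in trades) / total_volume

if __name__ == "__main__":
    tape = [Trade(100.02, 500), Trade(100.05, 1_200), Trade(99.98, 800)]
    print(f"VWAP: {vwap(tape):.4f}")   # volume-weighted benchmark price
```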
While the importance of speed has always been essential for financial markets, technical infrastructures today enable machine processing at completely a-human speeds plus the corresponding accelerations.78 The programs used, which are integrated into the markets as recursive loops, i.e., act on the markets via feedback, remain profitable only for a short time, which means that the programs must be continuously updated to further shorten response times, so that only the fastest computer systems generate profits, which in turn leads to enormous costs for the companies. In HFT systems, the realization of profits involves a continuously accelerating fluctuation of the respective portfolios – financial decisions are made in milliseconds, microseconds, even nanoseconds. For example, the latest iX-eCute chip from Fixnetix today processes trades in just 740 nanoseconds.79 This chip can process over 330,000 trades in the blink of an eye (about 250 milliseconds). Consequently, HFT systems have long since reached the temporal depth of nanoseconds (one billionth of a second). And so, too, the average daily volume of NYSE financial transactions increased by 300% over the period 2005-2009, while the number of daily trades grew by 800% in the same time. (Cf. Durbin 2010: vi-viii) Although the profits generated by a single trade in HFT systems remain relatively low compared to other financial investments such as complex derivative products, the investments generated by digital machines do grant relatively secure incomes. Thus, at least in the U.S., high-frequency trading marks an important influence on financial market structures. It is estimated that HFT systems (about 100 companies) are currently responsible for about 70% of the so-called equity market volume in the USA and for one third in the UK, and this with an increasing tendency. (Ibid.) Nevertheless, due to the low profit margins, the HFT systems do not remain particularly significant for the dominant financial companies; at best, these tolerate the companies that are active in HFT because this allegedly provides the required liquidity for the markets. (Srnicek/Williams 2014) Thus, the exclusive hedge funds are not so much interested in the price performance of a stock; rather, they break the companies down into different parts and then examine very specific aspects, e.g., in which country the company is located, whether it is a technology company, whether the company’s stock is traded as part of a certain index, etc. By using economic mathematics such as the Black-Scholes formula, hedge funds can then establish relationships between the parameters that result in the respective price trends, and consequently engage in arbitrage. In contrast, HFT systems work in the medium of statistical arbitrage by trying to exploit market inefficiencies through their software-driven trading in extremely short time intervals, thus participating in a systemic advantage which is also a result of the technicity of synthetic trading, although this remains completely opaque to the so-called outsiders of the financial markets.
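A quick order-of-magnitude check of the figures quoted above, using only the chip latency and blink duration given in the text:

```python
# Order-of-magnitude check of the figures cited above.
blink_seconds = 0.250           # a blink of the eye, roughly 250 milliseconds
trade_latency_seconds = 740e-9  # 740 nanoseconds per processed trade

trades_per_blink = blink_seconds / trade_latency_seconds
print(f"{trades_per_blink:,.0f} trades per blink")  # roughly 338,000
```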
In his book Europa im Weltwirtschaftskrieg, Heiner Mühlmann has described in detail how high-frequency trading affects CDS systems. (Mühlmann 2013: 115f.) If, for example, a financial actor enters into an insurance-taking CDS at time t₀ and an insurance-issuing CDS at time tₙ within the period from t₀ to tₙ, where the actor pays a fee (a) in the first case and collects a fee (b) in the second case, then he assumes that the collected fee (b) will be higher than the fee (a), because over the assumed time period the probability that a credit event (an event with negative effects) occurs increases – and thus the fee (b) increases. The time tₙ is the interval of a deferral period, at the end of which stands an unpredictable event that brings a qualitatively new dynamic into play. During this period, the n intervals that lie between the purchase and sale of insurance become shorter and shorter, or in other words, the number of cycles per transaction increases incessantly. Mühlmann also points out that the fee amounts per transaction are small, but the inverse linkage of buying and selling CDS remains profitable because enormous sums of money, further fueled by the cheap-money policies of the central banks, flow into this trade, so that both the respective sums of money per transaction and the high-frequency multiplication of transactions compensate for the smallness of the fee profits. (Ibid.: 155) Overall, the time to the occurrence of a catastrophic credit event has an asymptotic component, insofar as the catastrophe (insolvency, etc.) is approached infinitesimally over specified time periods, while paradoxically the catastrophe is not supposed to occur after all, since it inevitably leads to losses, at least for certain participants.
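A hypothetical numerical sketch of the inverse CDS linkage described by Mühlmann, with invented figures: the margin b − a per cycle is tiny, but notional size and the high-frequency multiplication of cycles add up.

```python
# Hypothetical illustration of the inverse CDS linkage described above:
# pay fee a when buying protection at t0, collect fee b when selling
# protection at tn; the margin (b - a) per cycle is tiny, but notional
# size and cycle count compensate for it.

notional = 100_000_000        # insured sum per transaction, in dollars
fee_paid_a = 0.0080           # 80 basis points paid at t0
fee_collected_b = 0.0083      # 83 basis points collected at tn (risk has risen)
cycles_per_day = 400          # high-frequency repetition of the buy/sell pair

margin_per_cycle = (fee_collected_b - fee_paid_a) * notional
print(f"margin per cycle: ${margin_per_cycle:,.0f}")                   # $30,000
print(f"gross per day:    ${margin_per_cycle * cycles_per_day:,.0f}")  # $12,000,000
```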
The abstract diagram of a trading system consists of three major components: a) trading strategies, b) mathematics integrated into software programs, and c) technological infrastructure. (Srnicek/Williams 2014) High-frequency trading is considered a perfect example of distributed real-time systems, incorporating and implementing patterns from the field of complex event processing, including thousands of individual programs that increasingly resist the tendency to concentrate processing in the computer’s CPU by delegating crucial trading tasks to specialized hardware components. (Cf. Durbin 2010: 8) GPU computing (graphics-processor-accelerated computing) has also long been used in HFT, where the graphics processor (GPU) is used together with the CPU to accelerate financial trading. Parallelization is considered the decisive concept of current GPUs, and it is being further perfected in financial-science discourses and practices in order to ensure, for example, the evaluation and calculation of Black-Scholes models including all components in real time. At the same time, the software of the financial systems remains bound to the modularity of the respective components to be managed and at the same time serves the structural coupling of the companies to specific communication networks. (Ibid.: 101-102) Cybernetic feedback technologies would not even be conceivable today without the modularity of digital machines/media, a structuring in which the modules, which all go back to an identical design, can be permanently reassembled without losing their autonomy. At the same time, the mode of per se possible recombination of modular constellations requires constantly flexible tests with which the continuous feedback functions at all. Software engineers working in financial companies develop such systems in terms of the elasticity, flexibility and profitability of financial events. Extracting the gains of every surplus nanosecond thus results in laborious processes of designing and optimizing the algorithms and the corresponding specific financial instruments.
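Because the Black-Scholes model is named above as the workload that GPU parallelization is meant to accelerate, here is a minimal vectorized sketch of the standard call-price formula in NumPy/SciPy; a GPU variant would swap in an array backend such as CuPy, and everything beyond the textbook formula is an assumption for illustration.

```python
# Minimal vectorized sketch of the Black-Scholes call price; the point of
# GPU acceleration mentioned above is that the same arithmetic runs in
# parallel over large arrays of strikes, maturities, and volatilities.

import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """European call price for spot S, strike K, maturity T (years),
    risk-free rate r, and volatility sigma; all arguments may be arrays."""
    S, K, T, sigma = map(np.asarray, (S, K, T, sigma))
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

if __name__ == "__main__":
    strikes = np.array([90.0, 100.0, 110.0])
    print(black_scholes_call(S=100.0, K=strikes, T=0.25, r=0.01, sigma=0.2))
```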
While popular perceptions in the context of financial-science discourses continue to see Wall Street as the central localization of global finance, it is precisely places like New Jersey and Chicago where much of the U.S. financial system is currently located, at least physically. HFT hubs such as the NYSE location in Mahwah house many of the largest “matching engines” (machines whose algorithms evaluate, match, buy, and sell transactions from around the world). (Ibid.: 16) Thus, there is definitely a physical concentration of the distribution systems of global finance, and these non-places, global cities, are of course considered components of the respective national infrastructures that need to be strongly protected. The physical infrastructure or hardware of the financial systems is intrinsically distributed and networked. And since electronic signals flow over optical fiber cables whose transmission rates are in the gigabit to terabit range, the distance between the sender and receiver of information is considered a key variable in the temporal latency of the systems. The competitive situation stimulates a rapid race for the shortest response times on the markets, which usually leads financial companies to locate their HFT servers directly at the locations of the exchange servers, if they still operate exchange-oriented trading. Of course, a functionally flawless operationality of the connectivity remains a prerequisite, which has to exclude as far as possible the parasitic element responsible for non-operationality. Financial enterprises, as complex socio-technical systems, are forced to permanently process the production of parasitic noise and to reduce the constantly fluctuating information disparities by operating at a high rate of data throughput and attempting to smooth out noise and entropy in the context of a financial ecology. (Cf. Wilkins/Dragos 2013) A signal currently takes eighteen milliseconds to travel from New York to London at the speed of light. This temporal gap provides a period of time for a trader in New York to detect and process new data before a trader in London even registers it. But even the speed of light can in some respects be too slow in high-frequency trading, so that if a trader in New York simultaneously offers a deal to a trader in London and one in Frankfurt, the trader in London is preferred because the signal sent at the speed of light takes too long to reach Frankfurt. In order to exceed the existing speed limits, certain companies are cutting holes in visible and invisible walls to position themselves physically as close as possible to the matching engines of the central trading venues. Some U.S. companies today, in order to further reduce latency, go so far as to run their communication cables through tunnels cut through mountains, for example, to further reduce transmission times between Chicago and New York. To maximize profits, spatial relations must be used in the most effective way, making HFT systems effectively parasitic with respect to their infrastructures, keeping a permanent eye on the data centers of host-based services in order to bypass, if possible, even these structures. Consequently, the imperatives of acceleration virtually demand that companies increasingly eliminate all disruptive intermediaries.
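A rough sketch of the latency arithmetic behind the New York-London figure quoted above; the great-circle distance is approximate, and fiber is slower than vacuum because light in glass travels at roughly two thirds of c.

```python
# Rough latency arithmetic behind the New York-London figure quoted above.
# Distances and speeds are approximate; real routes are longer than the
# great-circle path and add switching delays.

C_VACUUM_KM_S = 299_792                 # speed of light in vacuum, km/s
C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3    # light in glass fiber, roughly 2/3 c
NY_LONDON_KM = 5_570                    # approximate great-circle distance

one_way_vacuum_ms = NY_LONDON_KM / C_VACUUM_KM_S * 1_000
one_way_fiber_ms = NY_LONDON_KM / C_FIBER_KM_S * 1_000

print(f"vacuum (theoretical floor): {one_way_vacuum_ms:.1f} ms")  # about 18.6 ms
print(f"optical fiber:              {one_way_fiber_ms:.1f} ms")   # about 27.9 ms
```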
This implies a tendency toward spatial concentration of financial companies in highly networked global cities, while at the same time companies pursue a decentralization that depends on the simultaneity of digitality in order to manage the multiple exchange systems on the financial markets, including their speeds, rhythms, sequences and metrics, in a way that remains at all compatible with profit. Technological progress in the digital economy – complex, dynamic, integrated networks – is dictated by financial capital, i.e., its investment decisions and risk management determine the mode of technological developments. In this process, innovations take place in a cloudy and dense information structure that absorbs, stores and passes on immense masses of data and information – commodity prices, exchange rates, interest rates, political and social factors, etc. – with the media, which are in constant exchange with financial capital as well as the technology sector, acting as an informational marketing machine.
Just think of the endless loops that run along with every news broadcast on television channels such as CNN or N-TV, as if stock prices represented the content for the ECG of global financial capital. The technological-informational architecture of HFH systems thus plays a very crucial role in minimizing latency in financial systems networks, with a high proportion of data and information now flowing not over the public Internet but over one of the largest networks in the world – the Secure Financial Transaction Infrastructure (SFTI). As part of NYSE Euronext, SFTI provides a private high-speed computer network for financial companies in the U.S., Europe and Asia. However, because the speed of information transmission is limited by the speed of light, physicists have long since begun triangulating the planetary coordinates of optimal trading locations, while other researchers believe that the fiber optic cables used to optimize HFH systems may eventually be too slow, and therefore propose running the communication links through the Earth’s core to bypass the minimally slower routes along the Earth’s surface.80 Privatized particle accelerators would then generate and encode neutrinos to drill a sub-molecular path through the Earth in order to gain even minimal time margins over competitors. In the end, even the Earth could still become an obstacle to the acceleration of capital circulation. Until then, the Earth is at least regarded as a resource of capitalism, on the basis of which the expansion into a placeless horizon is to take place, insofar as capital itself still intends to economize the outside in order to produce, if possible, a new planetary constellation. The dromological aspect (Virilio) of HFH systems thus touches their immediate embodiments and the associated locations and requires at least the transformation of the entire planet into an accelerating medium of capital circulation, whose technological dispositif in HFH currently consists of multiple network structures.

The circulation of money capital involves a machine ecology of data centers and software programs based on a highly optimized material infrastructure. It is quite obvious that financial firms, within the framework of their algorithm-based and fine-tuned HFH systems, must fight for every possible exploitation of the nanosecond in financial markets in order eventually to touch the zero time of capital. (The zero time of capital is the time of utmost acceleration. However, we are no longer dealing with the opposite poles of zero and hyper velocity, but with a (virtual) tendency for the poles to merge. Both at absolute rest and in continuously repeated hyperactivity, the cerebral system or the electroencephalogram shows only an average and flat pattern. Time glides endlessly along a linear path, or it is wrapped in cascades of entropic decay. A white noise without any information finally characterizes the situation.) Virilio forcefully discusses the perception and organization of space and time in social life. For Virilio, we are so fascinated by the speed of technological apparatuses and at the same time so oversaturated by their mobile apps that we virtually freeze, or at least our perception seems increasingly disturbed the moment it is in free fall into a gray (dromospheric) techno-ecology, so that any escape from the demonic domination of technological time comes too late. The original technological accident (time is the accident of all accidents), which concerns the disappearance of routes or trajects, spreads with such force and vehemence that in its consequence literally everything tips over into a posthuman situation, the speed of which is that of a real time in and with which the three attributes of the divine are realized: omnipresence, instantaneousness and immediacy. (Virilio 2011: 20) For Virilio, capitalization is above all that of movement; he conceives of capital as power under the signs of quantum physics, insofar as power itself is modified into a velocity vector that places capital accumulation under the dictates of a violent mobility which has long since bumped up against its physical limit. But we no longer live in McLuhan’s fully technologized world, whose only limit is the speed of light, in a world full of digital prostheses, in a world in which the human sensorium is externalized and fluidized in communication technologies. In contrast, today the language of neuroscience re-defines, re-inscribes and re-codes human consciousness, allowing it to evaporate in the interplay of code drift, digital trauma and mobility as part of an expanded data-flesh. (Arthur Kroker 2014) Code drift refers to a total mobility that one is not able to program in advance; rather, through the (un)expected use of technologies, through creative applications and fluctuations of perception, a technological transformation is initiated that leads to an uncertain future of flows, chaos, sampling and remix. No teleology, just a digital stream of capital, information and technology filtered through the perturbations of frequency codes, propelled at the speed of random fluctuations, with which the question of identity becomes that of a sampling error, and one is available online at all times to reconnect to the fractured energy flows of bifurcations.

If the HFH systems carry further high risks besides the dromological risk problem, this can be explained by the fact that these systems always generate new fields of possibility and thus new risks, whose evaluation, execution and revision opens up further fields of possibility, and so on: what happened a few minutes before a trader’s decision is just as unimportant as what will happen a few minutes later, and thus, as Arthur Kroker writes in the Panic Encyclopedia, securities trading resembles, in the final analysis, the fluid movements of money chips that permanently modulate the hyper-pulse of unpredictable highs and lows on the financial markets. (Kroker/Kroker/Cook 1999: 104f.) And even if a state of equilibrium is reached in the markets in the short term, this does not necessarily imply that the systems have thereby also reached their optimal state, because degrees of interconnectedness can occur in financial systems that are so enormously complex that they look almost perfectly designed but in fact disrupt rather than benefit the functional connections between systems. The closer machine systems get to something like absolute efficiency, the less they will recognize the inefficiencies inherent in them, though of course the market can never achieve perfect or absolute efficiency, which is moreover associated with equilibrium states. As markets become more efficient, there will be fewer and fewer opportunities for informed traders to exploit profitable arbitrage (including calibration of the mismatch between current prices and underlying assets), while uninformed noise traders will increasingly begin to dominate financial markets.
Arbitrage opportunities and their correlated financial products exist today both within and between the respective exchange systems. Statistical arbitrage strategies, and the allegedly risk-free profits they promise, are based on the simultaneous processing of price differences for similar or identical financial products in different markets; for example, one analyzes the relations between two financial assets in order to generate profits through arbitrage when the respective components of a singular financial asset change. Thus, the first step is to recognize the statistical significance of the movement of two financial assets, although today it is hardly ever a matter of just two financial assets, but mostly of complex systems, even multiple sets of correlations.
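One common variant of the statistical arbitrage described here is pairs trading; the following is a minimal sketch under illustrative assumptions (the function name, window length and entry threshold are hypothetical choices, not the author’s models): track the log-price spread of two correlated assets and signal a trade when it deviates strongly from its rolling mean.

```python
# A minimal sketch of one common variant of the statistical arbitrage described
# above (pairs trading): track the log-price spread of two correlated assets and
# signal a trade when it deviates strongly from its rolling mean. The function
# name, window length and entry threshold are illustrative assumptions.
import numpy as np

def pairs_signal(prices_a, prices_b, window=100, z_entry=2.0):
    spread = np.log(prices_a) - np.log(prices_b)
    rolling_mean = np.convolve(spread, np.ones(window) / window, mode="valid")
    rolling_std = np.array([spread[i:i + window].std()
                            for i in range(len(rolling_mean))])
    z = (spread[window - 1:] - rolling_mean) / rolling_std
    # +1: spread unusually wide (short A / long B), -1: unusually narrow, 0: no trade
    return np.where(z > z_entry, 1, np.where(z < -z_entry, -1, 0))
```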
Arbitrage is supposedly also intended to evaluate, exploit and balance out the anomalies occurring in the markets in order to contribute to correct pricing in general and to provide liquidity – but arbitrage is precisely the result of fully computerized systems that automatically and simultaneously split and route enormous orders across all possible trading venues in order to exploit specific price movements, so that dysfunctionalities cannot be ruled out at all. The risk of a disruption of the entire distribution system of high-frequency trading is aggravated if the digital networks are very strongly interconnected by arbitrage functions, so that the indication of a “wrong” price on a single market can lead to a wave of wrong pricing. The increase in interconnectivity in standardized HFH systems corresponds to an expansion of volatility, which actually contradicts the abstract requirement for an efficient market, culminating in every security achieving the same price at all trading venues. Paradoxically, arbitrage is based precisely on the inefficiency of markets, which is supposed to be reduced by the use of specific models and instruments that serve the strategic orientation of actors in the markets. Accordingly, efficient markets should not actually allow arbitrage at all. (Cf. Esposito 2010: 167f.)
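The cross-venue logic at issue here can be reduced to a simple check, sketched below under illustrative assumptions (the venue names and quote values are hypothetical): if the best bid on one venue exceeds the best ask on another for the same instrument, a fully automated system would buy on the cheaper venue and sell on the more expensive one at the same moment.

```python
# A minimal sketch of the cross-venue check described above. Venue names and quote
# values are illustrative assumptions, not real market data.
def find_arbitrage(quotes):
    """quotes: dict mapping venue name -> (best_bid, best_ask)."""
    opportunities = []
    for buy_venue, (_, ask) in quotes.items():
        for sell_venue, (bid, _) in quotes.items():
            if buy_venue != sell_venue and bid > ask:
                opportunities.append((buy_venue, sell_venue, bid - ask))
    return opportunities

quotes = {"VenueA": (100.02, 100.04), "VenueB": (100.07, 100.09)}
print(find_arbitrage(quotes))  # [('VenueA', 'VenueB', ~0.03)]: buy on A, sell on B
```

In practice the window in which such a discrepancy exists is measured in microseconds, which is why the passage ties arbitrage so tightly to interconnectivity and speed.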
On an abstract-functional level, HFH systems operate by managing the probabilities of particular transactions, trying to compensate for the low profit of an individual transaction by the high number of profitable transactions. The typical HFH trader generates his profits mainly through two strategies: (1) managing the difference between bid and ask prices, (2) probability-based analysis and the subsequent exploitation of the price movements of various financial assets. (Cf. Srnicek/Williams 2014) In financial markets, both passive and active trading can be established: the former involves placing an order into the system without knowing whether another party is even willing to take the other side of the deal, and for this the HFH system provides programs (auto-quoters) that generate precisely those decision processes; active trading, by contrast, consists of filling the opposing side of orders that are already listed in the order books. If profits result from the difference in bid and ask prices, then the inherent risk is precisely that before the trader is even able to complete a round-trip trade (buy and sell, or vice versa), market prices may already have moved against him and he will necessarily incur a loss. This again indicates that in HFH the management of speed remains essential, on the one hand to beat the dominant market makers, and on the other hand to complete each transaction as quickly as possible in the battle against competitors. Thus, in HFH systems, in addition to managing risks and probabilities, the speed problem plays an enormous role, with financial firms permanently using self-accelerating HFH systems to manage risks and probabilities through instantaneous arbitrage.82
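The first of these two strategies can be illustrated with a minimal sketch of passive quoting in the spirit of the auto-quoter just mentioned; the Quote class, the skew rule and all numbers are assumptions added for illustration, not a real exchange interface or the text’s own specification.

```python
# A minimal sketch of passive quoting: post a bid and an ask around a reference
# mid-price and skew both quotes against accumulated inventory, so that completed
# round trips capture (part of) the spread. All names and numbers are illustrative
# assumptions.
from dataclasses import dataclass

@dataclass
class Quote:
    bid: float
    ask: float

def make_quote(mid: float, half_spread: float, inventory: int,
               skew_per_unit: float = 0.001) -> Quote:
    # Long inventory shifts both quotes down: the ask becomes more likely to be hit
    # (selling inventory off), the bid less likely (slowing further accumulation).
    skew = -inventory * skew_per_unit
    return Quote(bid=mid - half_spread + skew, ask=mid + half_spread + skew)

print(make_quote(mid=100.00, half_spread=0.02, inventory=5))
# roughly Quote(bid=99.975, ask=100.015)
```

The round-trip risk named in the passage shows up here as the possibility that the mid-price moves before both sides of the quote have traded.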
When HFH systems interact acceleratively with their complex functions, they also repeatedly produce emergent phenomena at temporal intervals, such as flash crashes or ultra-fast black swans (Taleb 2010), the latter understood as highly dysfunctional contingencies, quite in contrast to processes of structured randomness such as those that characterize the functioning of casinos. In this context, financial frictions can swell to the point where micro-fractures proliferate in an enormous number of minimal flash crashes until they infect the entire financial ecology. Moreover, because of the encapsulated logics they encode, algorithm-based trading platforms are intrinsically open to abusive practices, precisely because they represent highly opaque interfaces that are not 100% controllable by human actors. Thus, periodic 1000-millisecond circulations can be detected in financial markets, with enormous floods of transactions emerging from the interactions of HFH agents on the one hand and from the emergent rhythms of a largely automated financial ecology on the other.

Here, the exploitation of data within the nervous system of automated algorithms visibly sidelines the classic market maker. HFH traders compete not only against each other, but also against the classic market makers, who are supposed to provide liquidity for individual securities on behalf of institutional investors by simultaneously setting buy and sell prices. For both market makers and HFH traders, the imperative is to trade faster than the competition, which may well launch an attack, spread orders simulatively, or simply wait for the same signatures in order to trade on them itself. The algorithmic machines dominate the respective price movements when they initiate self-referential strategies, which in turn can evoke a certain herd behavior among the human actors.83 The implementation of learning algorithms in HFH systems leads to ever higher computer performance, to the flexible coding of efficiency and to deeper expert knowledge, whereby within this tripartite association an increasing diversification and securing of trading strategies is to be achieved in the future; at the same time, however, the accident continues to insist, precisely insofar as certain events appear statistically predictable. (Cf. Wilkins/Dragos 2013) In this respect, the accident is to be understood less as the consequence of a malfunction than as an expression of the fact that the systems work too perfectly. Trading strategies today include the collection of “slow quotes”, implying that an HFH trader makes his decisions faster than a market maker can adjust his quotes to the respective price changes at all. “Quote stuffing” implies that an enormous number of orders are sent to the stock exchange and deleted the very next moment in order to drive market prices in the intended direction in the short term, only to profit from the countermovement in the next moment. These strategies have been analyzed in depth by Nanex, an expert in the study of trading anomalies and a provider of software for real-time analysis of stock quotes.84 Ecstatic trading can be triggered by a single error in an algorithm, so that HFH systems must be constantly subjected to specific stress tests and quality checks, periodic updates and the well-known bug fixing. In 2003, for example, a company became insolvent in sixteen seconds when a “wrong” algorithm was set in motion. (Srnicek/Williams 2014) The use of a certain algorithm also led to an immense number of orders being placed in the system but not executed. According to Nanex, this algorithm alone accounted for four percent of all orders present at a given time in the central quotation system of the U.S. stock exchanges (which aggregates the orders existing on the various trading venues), affecting approximately five hundred securities, none of whose orders were executed.

Ten percent of the total bandwidth available for quotations was used, from which one can conclude that the algorithm had tried to extend the reaction times of competitors by claiming enormous bandwidth of its own. The order flood was used to reduce the bandwidth of the electronic trading system for other participants in order to influence price discovery. Some HFH traders program algorithms that generate a four-digit number of securities trading orders per second for a single stock and send them to the exchange, whereby the orders only become visible when they appear at the top of the exchange’s order book, i.e., when they constitute the highest bid price or the lowest ask price. The vast majority of trading, however, takes place in the realm of an invisible noise. (So-called latency arbitrage is about the management of the milliseconds in which HFH traders intentionally saturate the trading system, so that the larger share of traders, who rely on the so-called CQS data received by almost all market participants, do not even know where the market is trending for phases of up to one minute.)
In technological terms, the nonsense of a neo-cybernetic mechanism of pure machine-machine interference actually rules the financial markets today, and this generates a nomos under the hegemony of financial capital, which Nick Land, for example, imagines as a field of asignifying numbers interacting with each other in a non-representational space. (Cf. Land 2010) The acceleration of numerical technicity requires HFH systems as vectors of protrusion, i.e., production for profit instructs a technologically grounded production for production’s sake qua an inhuman register (ibid.: 260), with its metabolic processes ceaselessly accelerating under the dominance of the circulation of financial capital, so that infinitesimal price differentials and instantaneous arbitrage possibilities can be managed for as long as possible, while nomadic liquidity in global networks steadily increases. While human agents have long been acting too slowly, even too fleshly, to be able to overcome temporal and perceptual barriers, HFH systems generate fine nano-structures in the financial markets, which have long been too complicated and too complexly interwoven to be observed accurately by human agents. Obviously, it is precisely the viral automation of finance as a result of the functioning of the algorithmic machines that is currently generating a multitude of forms of monetary capitalization. However, it is important to consider the difference between the mere processing of data and the semantic processing of information; to put it another way, HFH systems coordinate chains of financial transactions that are initiated by the experts of the financial system but operate at an autoreferential level of data in the systems themselves – i.e., the algorithms of the HFH systems operate as pure data calculators to eject diagrams of a hidden connectivity that the networks of financial transaction chains virtually generate themselves. Based on the cognitive mapping of experts, HFH systems impress with their potential to execute vast amounts of trades beyond human perception thresholds at intervals of milliseconds in order to undermine the price signals recorded in the order books or to continuously transform price movements before human actors are even able to observe this. And this becomes crucial especially in times of crisis, when we are dealing with high volatility and low liquidity.
In the face of a possible long-term fall in the rate of profit, the desperation of companies, expressed in their inability even to stabilize their profit share amid falling global growth, let alone increase it, reveals itself precisely in the ubiquitous mode of elaborate digital valuations and measurements, where nanoseconds are at stake in the design of trades. If one assumes that this dromological acceleration cannot be reversed by any laws (there is only the physical limit), while the competitive advantage for the individual company slowly evaporates qua management of speed, then the question of nihilism in the capitalist economy finally arises, insofar as the dromological acceleration ultimately targets a horizon without any localization by carrying out nothing other than the hyper-intensification of capitalization. If the speed of light cannot be overcome and the various competing enterprises all tend to operate at the same speed, i.e., if the dromological competition between financial enterprises is at some point no longer able to exceed its immanent limit, then the monetary costs for enterprises to achieve an increase in the rate of profit, however marginal, grow immensely.

One system-immanent cunning to escape this problem, at least in phases, is, for example, to place ultra-rapid orders in the markets in order to make the prices and sizes of certain other bets visible before they are even listed in the so-called exchange order books. Some algorithms are designed to expose the telltale signatures of other algorithms in order to gain betting advantages themselves; certain algorithms continuously evolve by operating on the ecology of human and non-human traders to process the appropriate ratios. (Cf. Wilkins/Dragos 2013) Furthermore, on the basis of past data, weather constellations are simulated in order to forecast, for example, the harvests of wheat, soybeans and corn and, in turn, to predict price fluctuations that may be related to the weather-harvest relation. At the same time, HFH systems are linked to algorithmic systems used in other areas of the economy (consumption analysis, data collection of all kinds, the semantic web). In all these processes, algorithms multiply in an unmanageable way, even though human actors program them. The question of fine-tuning the algorithms then inevitably arises, but what if the fine-tuning is in turn based on automated processes in the computer systems? Algorithms, then, may well take on a life of their own over and above the intentions of their programmers; consider, for example, the program Eureqa, a technology that exposes and explains intrinsic mathematical relations hidden in complex data sets.
If computation is based on discreteness (the bit, recursive algorithmic processing, quantified measurement), then the aspect of continuity (i.e., reality) remains largely alien to digital computation. Algorithms encode a finite and iterative conception of time by using discrete numbers, while being unable to represent, say, geodesic spatialities or non-trivial continuities. (Srnicek/Williams 2014) This problem is, among other things, already rooted in Frege’s and Hilbert’s project of axiomatizing geometry through arithmetic. Thus, operationalization via computation can only provide an inadequate model for complex social, biological, chemical and quantum-mechanical processes. If the reductive project of Newton and Laplace insists at the heart of digital computation, then mathematics as a non-royal science, as called for by Deleuze/Guattari, for instance, must be disregarded. Because there is no generativity in algorithmic thinking, or if there is, then only that proper to the formulas and models of royal mathematics, algorithmic machine trading remains bound to prescribed rules and thus cannot produce a universal synthetic acceleration, as the Iranian philosopher Reza Negarestani or the English theorists Srnicek/Williams, for example, call for. (Srnicek/Williams 2013b: 2ff.) Finally, it was Turing himself who contrasted his original conception of the computer with a non-linear, continuous variant. One should thus bring the computational models applied in economics much closer to the continuous, non-iterative nature of the material/physical, to contemporary post-quantum physics and the sciences of complexity. However, certain stratagems of camouflage, mimesis and deception are already fueling non-adaptive mutations in financial commerce today – think of non-equilibrium dynamics due to the “Red Queen effect” (e.g., the co-evolutionary arms race between host and parasite) as well as certain modeling in evolutionary game theory via crypsis (camouflage behavior). (Cf. Wilkins/Dragos 2013) It is possible, however, that we are only dealing with the pathological tendencies of a hyper-technological capitalism, with a thanatopic mimicry which, in the course of the continuous calibration and redistribution of financial instruments, energy and information, finally leads to deadly processes of cannibalization and self-destruction of capital.
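To make the passage’s point about discreteness concrete, here is a minimal sketch under purely illustrative assumptions (the function name, step count, drift and volatility are hypothetical): a continuous-time price process such as geometric Brownian motion, the model underlying the Black-Scholes pricing mentioned earlier, can only be computed by chopping time into finite increments and iterating over them, which is exactly the “finite and iterative conception of time” referred to above.

```python
# A minimal illustration of discretization: simulating a continuous-time process
# (geometric Brownian motion) with an Euler-Maruyama scheme. Step count, drift and
# volatility are illustrative assumptions.
import numpy as np

def simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, t_end=1.0, n_steps=252, seed=0):
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps                 # continuity replaced by a finite increment
    path = np.empty(n_steps + 1)
    path[0] = s0
    for i in range(n_steps):             # a finite, iterative conception of time
        dw = rng.normal(0.0, np.sqrt(dt))
        path[i + 1] = path[i] * (1.0 + mu * dt + sigma * dw)
    return path

print(simulate_gbm()[-1])  # terminal value of one discretized sample path
```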
There would now perhaps be two possible global scenarios to forecast for a future evolution of HFH systems. (However, one could also assume, as Arthur Kroker does, that the emblematic signs of the new technopoiesis that holds us in its clutches indicate less the imagery of apocalypse than the slow suicide of the posthuman species.) In the first scenario, HFH systems are globally outlawed in the wake of a series of catastrophic financial apocalypses, because the unpredictable and emergent effects of the automated trading of financial war machines, combined with the constitutive irresponsibility of institutional investors, could lead to a virtual extinction of technologies and capital systems triggered by, say, a disastrous sequence of automated trades. In the second scenario, semantically effective HFH systems almost completely replace human components in a creeping mode, with capital eventually ridding itself of all human resistances in order to generate even more abstract process strategies, including the posthuman intelligences of self-referential financial instruments, which in the end want to disassemble even the solar systems in order to achieve a maximum of “computational power” – no matter if in the end everything is drawn into an absolutely deterritorialized stream of money capital leading to an annihilation of cosmic proportions.85 (Mention of the solar economy, the economy of the sun as a labyrinth of life, runs primarily through the writings of Georges Bataille and Nick Land.) Human intelligence with its limited cerebral potencies can no longer grasp such a xeno-economy, considering that today even the neoliberal financial systems are already phasing into an a-human panic, into an ultra-abstract accumulation for accumulation’s sake, perhaps soon to turn into a frenzied and utterly destructive deterritorialization of capital, indulging therein in a terrible thanatotopism. The second scenario would be a pure dromological acceleration, a linearly increasing acceleration, as already described approximately by Paul Virilio and Nick Land. As long as this dromological drift continues to produce dynamic effects in terms of successful capitalization, the currently hegemonic financial regime will never itself question the rules of the game that operate and govern it, although the propagation of “faster and faster” ultimately comes to naught when confronted with the recalcitrant traces left by the accelerating processes themselves.
Whether, in addition, there could be something like a universal acceleration in a post-capitalist society, characterized, for instance, by manifold and cunning experiments which, put on a permanent basis, are again and again fissured by contingency, remains extremely questionable, although the infinitesimal approximation of HFH systems to the speed of light virtually demands the transition to a more intelligent paradigm of the experimental in the social field. At this point, it is then necessary to address a third scenario, one in which financial automata are freed from the dispositifs and rules of capitalization, as envisioned, for example, in Srnicek/Williams’s accelerationist manifesto, so that the reshaping of capitalist technologies with their infrastructures actually leads to a post-capitalism (Srnicek/Williams 2013b: 30) that makes use of, for example, the métis as part of a universal strategy to achieve something like a nomadic communism focused on the stratagem of contingency beyond the stifling political-economic space of capital – this scenario, however, remains more than vague.86 Thus, one wants to replace the blind acceleration that takes place by means of current digital technologies with an intense acceleration, by means of refined machines based on a continuously neo-computing format, machines that are accompanied precisely by a complex métis, i.e., a sly and at the same time well-tempered action that amplifies the dynamic tendencies of the material one is working with, thus propagating at the same time a complicity with the material, with the contingent and the unpredictable. But if human agents are becoming more and more obsolete in hypertechnological capitalism and its accelerated future, who will be left at all to repair or refunction the machines when they actually collapse? And is there still a need for the human at all, when the machines have long since filled every social niche?

What question remains if the computer has occupied the evolutionary territory of a bright future? One tries to solve this problem with the call for a thoughtful intervention in the politics of abstraction in order to challenge capitalism politically and epistemically, although it remains undecided how a revolutionary praxis can be linked to the theoretical effort; indeed, little is said about how the cognitive functions or mappings of the economic can influence social practices.
If even the most intense, the last gradual acceleration is threatened by noise, then the boundary separating signal and noise can only be thought starting from contingency, though not from an infinite contingency, as Massimo De Carolis assumes, for instance (cf. De Carolis 2011: 297), but rather from an indefinite contingency that opens up an untuned field and the manifold of temporal functions. One is then dealing with neither an infinite nor a finite field of the socio-semiotic; rather, in the course of a non-politics, every discursive practice must display itself as an experimental stratagem in indefinite fields (Bahr 1983: 295ff.), and indeed as a permanent practice of virtualization, a register of in-extension, marked by an indefinite comprehension. In this field of unforeseen encounters, surprising entanglements of the neuronal-political with the liquid streams of information can also constantly occur, entanglements or associations between hitherto non-associated forces, and this not only in the course of a métis, a cunning and at the same time precisely timed action that knows about the dynamism of the material it deals with, but also in the course of seizing an actualizing continuous event that is implemented in rhythmic sequences of contingent plots (not plans) and requires counter-realization in action. Counter-realization grasps that part of the virtual event which shoots beyond ordinary actualizations; it happens through a negation (counter-) of previously practiced orientations and patterns, by resisting the mere implementation or objectification of plans that seek to reduce the included techniques to a program grid whose contingency is limited. In this precise sense, counter-realization implies the artificial, which differentiates itself more and more in its becoming. The “counter-” of counter-realization, in the context of contingent events, denotes contact, exchange, and opposition/autonomy (of the “counter”), and these factors are described in Deleuze as the three contracted moments of a disjunctive synthesis that insists in the counter-concepts. Thus, while the negative is not entirely adopted, it is extended by the problematic. Deleuze, against Hegel, is concerned with the production of positive distances in which, with the incessant differentiation of the counter-realized against itself, incorporeal transformations are realized in permanence as new inscriptions in the bodies, and this in the time of the event. In this, in the sense of a permanent revolution, the old system is no longer implied in the counter-realized, insofar as the new was precisely not objectively laid out in the (old) capitalist economy due to specific conditions and therefore did not have to occur of necessity. And thus the counter-realization has to constantly assure itself of its event marker by demanding its autonomy, its becoming-minoritarian, whose negative moment consists in not getting involved in any attempts that, for example, seek to derive communism from capitalist ecology/society in order to justify their revolutionary efforts. The experimental nature of the stratagem shows itself in the fact that, as counter-realization, it never gets involved purely in the preservation of what has been achieved, and thus ever already affirms its date. And for this it needs an affirmative force of negation that keeps the struggle against its own achievements in mind, in order to contain in advance the purely preserving tendencies, which would indicate the end of the experimental.
If the machinic transmissions are themselves situated in a field of encounters and non-encounters, of subfrontation or pragmatic differences, then it is still necessary to consider, beyond the métis, the experimentum machinarum of stratagems, including a logic of interruptions, which, however, at the same time strives for the precision of the indistinct and thus precisely refuses the ideal of undisturbed functioning, in order finally to direct discursive practice toward complex ensembles of effects, with which the machines, too, no longer show themselves as programmable models of precision, but at least as stratagems of differing degrees of precision. (Ibid.: 305f.) Thus, it is necessary to think the new that emerges from an experimental practice as a process which, starting from its specific situation, engages in a certain handling of the material in order to produce consistencies and generic consequences that always keep the experiment open, an experiment which in turn includes continuity and discontinuity within itself. The experiment is a struggle of stratagems for the possibilities of the possible.
If machines in principle exhibit a multitude of incalculable effects, then they cannot be used merely as means or, as in cybernetics, merely operated, nor can machine systems be determined solely by the relations of inputs and outputs and their recursive steering. It is not at all a matter of the non-functioning of present informational machines, of the crashes or black swans that have to be reduced with cybernetic feedback loops, but of recognizing and exploiting the unpredictable effects of the machine in time, effects that disfigure the human actant himself when he enters the technical systems with his asynchronous presence in the mode of a pure rhythmization, perhaps to detune them or to make music with them in an experiment. From the outset, machine systems contain unpredictable effects. For example, the development of compact, manageable engines in the twentieth century led to the manualization of mass transportation qua automobile (ibid.: 313), which in turn produced a series of unexpected effects that became apparent in a variety of complex machine systems, even social mega-machines, ranging from traffic planning and traffic systems to the military and questions of the economy.

To analyze the current situation of global finance in a broader context, it is first necessary to demolish some of the neoliberal dogmas, such as the belief in the efficiency of the competitive market mechanism, which is notoriously supposed to lead to states of equilibrium, or the confidence in the capacities of financial systems to self-regulate, or the doctrine of sustainable development, which ignores the fourth law of thermodynamics.

Translated by DeepL.
