In his book How Noise Matters to Finance, which is well worth reading, Knouf points first and foremost to traditional ideas about the stock market: control of the market by the famous invisible hand, dreams of equilibrium, transparency, explainable price movements and correct prices. All in all, these are social fictions that many traders, companies and economists still believe in today, despite the recurring cyclical financial crises, and even though the algorithms now execute exchange processes on the stock markets faster than people can even perceive them. The complexity of the new financial instruments is usually ignored. Knouf, however, notes one thing from the outset, especially in view of the increasing algorithmization of the markets: the market is composed of machines and people, whereby the machines are at least partly still manufactured by people, and even in the case of automated algorithmic trading, the algorithms are still written by people. And the actions of humans are not predictable. But even the modes of operation of the machines themselves are not 100% predictable, among other things because of the noise of the material world (one could also, with Kittler, mention the hardware), whereby noise should not simply be understood as an undesirable sound or signal but, with Michel Serres, as turbulence, as order and disorder at the same time: order that dissolves and forms itself anew through repetition and redundancy, while disorder is generated by new events, madness, uncertainty and the unpredictable. Noise is the fundamentally unstable ground on which machines and human existence rest. Order is then the form of an appearance that permanently massages the turbulent background and is massaged by it. All models are only approximations to something that is constantly in motion.
Knouf’s short book is not only about describing the relation between noise and finance, but about illuminating the nexus of machines, people and noise in terms of debates about the future and the trajectories that might lead us out of the quagmire in which we currently find ourselves.
The concept of noise found its way into the financial literature in the early 1980s, when the understanding of the market as an information-processing system (Hayek), kept in balance by signals and “true” prices, gradually began to waver; from then on, the price of a security could be based not only on information but also on noise generated by the less sophisticated traders. The shrewd, “intelligent” traders and the economists, however, quickly learned how to make profits out of noise, which culminated in high-frequency trading (HFT).
Knouf examines, from the point of view of materiality and of the interference between humans and machines, how certain forms and meanings give shape to financial noise in various space-times: first, how the noisy activities of traders shake the models of the rationally acting, utility-maximizing trader; second, how sonic noise accompanies the physical activities of conventional floor trading; third, how the analysis of the relation between computerization and trading shows that the material practices of the man-machine hybrid use noise as a means to make profits; and fourth, how the problem of speed dramatically escalates, transforming the race for risk-free profits into a race to zero.
Financial capital can no longer be discussed without reference to information theory and cyborg science. Claude Shannon’s information theory, which distinguishes between the signal and the ever-present noise that disturbs it, is widely known; less well known are Hayek’s remarks on information in the context of economics. Hayek sees prices as the most important medium in which information circulates in the markets. It is not so much centralized planning as decentralized planning, carried out by many people or, today, by algorithms, that is economically most effective, because the actors observe the constantly changing information reflected in the fluctuating price system independently of one another and act accordingly rationally, i.e. adjust their activities to the supposedly always correct price movements. The computer, especially in derivatives trading, was to make Hayek’s dream a reality.
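Shannon’s distinction between signal and noise can be illustrated with a minimal sketch (my own illustration, not from Knouf’s book): a binary symmetric channel that flips bits with some probability, and a repetition code whose redundancy lets the receiver recover the signal by majority vote — the same defence of order through repetition and redundancy that Serres describes. All function names here are hypothetical.

```python
import random

def transmit(bits, flip_prob, rng):
    """Binary symmetric channel: each bit is flipped with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def encode_repeat(bits, n=3):
    """Repetition code: redundancy as a defence against channel noise."""
    return [b for b in bits for _ in range(n)]

def decode_repeat(bits, n=3):
    """Majority vote over each group of n repeated bits."""
    return [1 if sum(bits[i:i + n]) * 2 > n else 0
            for i in range(0, len(bits), n)]
```

Without noise the message passes unchanged; with noise, the redundancy of the code (at the cost of bandwidth) makes most single-bit flips recoverable — Shannon’s point being that reliable communication over a noisy channel is possible at all.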
According to Black and Scholes, stock prices follow the paths of a random walk or a geometric Brownian motion. In a simple random walk, if you roll a die and, depending on whether you get an even or an odd number, go one step forward or back, then after a certain time you are, on average, back where you started, though a drift away from the starting point is in fact more likely. Black and Scholes replace the discrete random walk with a more complicated, mathematically tractable form that better captures the dynamics of current price movements on the stock markets. The random walk is now continuous and geometric, and the movements are based on the Gaussian normal distribution. The references here are the statistical physics and thermodynamics of the nineteenth century as well as diffusion equations. These prerequisites form the basis for the Black-Scholes equation, which simplifies and codifies assumed rules of man-machine systems to produce a controllable representation of reality. We have already presented this in the review of Lee and Martin’s book Derivatives and the Wealth of Societies. This leads straight to the CAPM and EMH models.
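The progression described above — from discrete random walk to geometric Brownian motion to the Black-Scholes closed form — can be made concrete in a few lines. This is a minimal sketch of the standard textbook constructions, not anything taken from Knouf; parameter names are my own.

```python
import math
import random

def random_walk(steps, seed=0):
    """Discrete random walk: one step forward or back per 'dice roll'."""
    rng = random.Random(seed)
    pos, path = 0, [0]
    for _ in range(steps):
        pos += 1 if rng.random() < 0.5 else -1
        path.append(pos)
    return path

def gbm_path(s0, mu, sigma, dt, steps, seed=0):
    """Geometric Brownian motion: the continuous, multiplicative analogue,
    driven by Gaussian increments (prices stay strictly positive)."""
    rng = random.Random(seed)
    s, path = s0, [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        s *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        path.append(s)
    return path

def bs_call(s, k, r, sigma, t):
    """Black-Scholes closed-form price of a European call option."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * N(d1) - k * math.exp(-r * t) * N(d2)
```

The point of Knouf’s (and Lee/Martin’s) critique survives the sketch: the formula’s tidiness rests entirely on the Gaussian assumption smuggled in by `gbm_path`.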
But not everyone has the capital to obtain the information needed to trade these models efficiently, because there are transaction costs and information comes in different gradations and scales. This, however, is of little concern in financial theory, since Milton Friedman’s thesis still holds that the assumptions and prerequisites an economic theory or its models introduce cannot themselves be the basis of criticism. Although the models are seen as idealizations that are economically unrealistic, they do provide something like a useful benchmark for measuring efficiency.
The EMH in its strongest version states that the market is efficient if prices immediately reflect all information, including information to which only insiders have access. In reality, however, there are asymmetries, for example when specialists gain advantages by exploiting tiny price fluctuations in order to make profits. For Fama, however, these asymmetries can be eliminated through electronic trading, which is dictated exclusively by the movement of time. Some of these assumptions were first challenged by behavioural economics, which pointed to the discrepancy between stochastic models and human expectations about future events. The impossibility of correctly pricing risk on the basis of probabilistic models indeed points to inefficiency as a constituent of markets, regardless of whether particular factors have a stabilizing or destabilizing effect.
It was Black himself who, in an essay, pointed out that noise trading is enormously important for the existence of liquid markets, thereby “saving” the EMH a little in the face of the irrationality of human action. In the EMH, there is no way to profit from information: market prices immediately reflect the existing information, and arbitrage is not possible. In real markets, however, these prerequisites do not hold, because privileged traders can make a profit by managing information. These traders are confronted with traders who believe that the noise they are trading on is information. The share price thus reflects the trading of both noise and information. Black, however, assumed that the noise trader was only a temporary phenomenon who could not generate profits over the longer term.
But what if noise traders fare better than the informed investors? Contrary to the model assumptions of the CAPM and the EMH, trading in derivatives is not frictionless: there are transaction costs and leverage is limited, so that better-informed investors may not be able to take advantage of certain mispricings. What appears to be an arbitrage opportunity is in fact the result of noise traders’ activity driving stock prices up or down, and this can indeed become costly for informed traders, so that in the end noise traders and informed arbitrage traders become indistinguishable. This has been translated into econometric models which show that noise traders can achieve higher expected returns simply by bearing the higher risks they themselves have created; they benefit, in other words, from their own destabilizing influence. Contrary to Friedman’s assumptions, they do not die out either, but are today definitely part of the market. Noise thus becomes a vital component of the system, an unpredictable activity which, paradoxically, can also support the equations underlying modern financial theory. The result is a binary: those who possess information and those who do not, the latter as noise traders taking advantage of those who trade on the basis of real information.
However, a strict distinction between information and noise is not possible, since all markets are noisy to some extent, insofar as transactions in space and time are never frictionless. Noise is an aspect of trading as part of an embodied, material world: embodied in the sense of the humans who trade or write algorithms, material in the sense of the man-machine relationship in real systems rather than in idealized equations. In the EMH, the relationship and distinction between noise and information remains unclear. Although the model knows how information should be absorbed by the markets for them to be efficient, it cannot define what kind of information this should be. In a world without rational agents and in the presence of uncertainty, noise is essential.
Knouf concludes that electronic trading — in its materiality of multiple monitors and ubiquitous Bloomberg terminals — has transformed the practices of contemporary finance, while the sights and sounds of physical, “open-outcry” trading continue to shape our cultural imaginary. It is affective moments, panic and fear in the face of crises and price slumps, that define the floor; the trader’s tableau today, by contrast, is one of numbers, graphics and texts, more visual than a compendium of the senses, more a singular modality than an interference of multiple multiplicities.
On the floor, buyers and sellers are linked to one another through their bodies and extreme behaviours, in a hierarchy that is also spatially visible. Hand signals and the voice are the most important instruments here, but there are also a number of body movements that count as specific signals. This combination of movement and voice creates noise: pure noise to the untrained ear, but a carefully constructed system for the experienced trader. The ambient noise conveys the state of the market as a whole: a rising sound level indicates rising trading volume and high volatility, and so the noise on the floor may contain information that is simply lost in screen trading. Sounds that articulate fear, panic, hysteria, emotion and uncertainty cannot easily be transferred to electronic networks; in this respect the machine may not be a good substitute for the human.
There are market makers who are not tied to large financial companies and whose trading, whatever the state of the market, is intended to provide liquidity to the markets. There are locals who speculate, making small profits on short-term trades but carrying out a large number of transactions. Other traders are linked to companies and work on a commission basis. Given the high number of buyers and sellers, it is important to find a suitable counterparty for a derivative contract that is expected to generate high returns.
Knouf also briefly discusses affect theories; think of Deleuze/Guattari and their Spinoza-derived concept of affect as intensity or capacity between and within bodies. Current debates revolve around intentionality and the question of the extent to which affect is pre-cognitive and pre-social. Against Brian Massumi, Knouf argues that affects are not asocial and autonomous, and therefore do not stand outside social signification. Precisely because affect is not autonomous, it possesses a political force. The power of affective events is also embodied in the structures of contemporary finance. The gestures of the traders who puff themselves up on the floor are incomprehensible unless one assumes that they have internalized the structures and means through which the market functions. Their affects may be more direct than those of traders in front of screens, but they are to be understood as the result of prior social and ideological interpellations.
With electronic trading, new trading floors become the norm, “populated” by batteries of servers in carefully air-conditioned rooms. While the popular perception within financial discourse continues to see Wall Street as the central site of global finance, a large part of the American financial system is currently located, at least physically, elsewhere. HFT hubs like the NYSE’s Mahwah site house many of the largest matching engines (machines whose algorithms evaluate, compare, buy and sell transactions from around the world). So there is also a physical concentration of the distribution systems of global finance, and these non-places, like the global cities, are of course considered strongly protected components of the respective national infrastructures. The material infrastructure, the hardware of the financial systems, is intrinsically distributed and networked. And since electronic signals flow via fiber-optic cables with transmission rates in the gigabit-to-terabit range, the distance between the sender and the receiver of information is regarded as a key variable for the temporal latency of the systems. The competitive situation stimulates a rapid race for the shortest reaction times on the markets, which usually leads financial companies to locate their HFT servers directly at the sites of the exchange servers, if they still trade on an exchange at all. The prerequisite, of course, is functionally flawless connectivity, which must exclude as far as possible the parasite responsible for non-operationality. Financial companies, as complex socio-technical systems, are forced to deal permanently with the production of parasitic noise and to reduce the constantly fluctuating information gap by operating at high rates of data throughput, attempting to shut out noise and entropy within a financial ecology.
Electronic networks process trades on an algorithmic basis, supposedly exclusively in the name of efficiency. If, for example, a trader wants to sell 100,000 shares of a security, the market price will certainly fall as the order is executed, and the trader may not find buyers immediately, which is why a specially designed algorithm splits the order into smaller parts, reducing its influence on price movements in the market. Such an order can also be distributed over a certain period of time; the corresponding algorithm is called time-weighted average price (TWAP). If the child orders are instead sized in proportion to trading volume, this is done with the help of the volume-weighted average price (VWAP). In practice, the algorithms are now much more complex. If one assumes that the space of potential market algorithms is practically infinite, then a race to zero ensues that shifts certain time frames and complexities beyond human perception.
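The order-splitting logic behind TWAP and VWAP can be sketched in a few lines — a deliberately simplified illustration of the idea, not any broker’s actual implementation; function names and the volume profile are my own assumptions:

```python
def twap_slices(total_qty, n_slices):
    """Time-weighted: split the parent order into equal child orders,
    one per time interval, distributing any rounding remainder."""
    base, rem = divmod(total_qty, n_slices)
    return [base + (1 if i < rem else 0) for i in range(n_slices)]

def vwap_slices(total_qty, volume_profile):
    """Volume-weighted: size each child order in proportion to the
    expected trading volume in its interval."""
    total_volume = sum(volume_profile)
    slices = [total_qty * v // total_volume for v in volume_profile]
    slices[-1] += total_qty - sum(slices)  # rounding remainder into last slice
    return slices

# The 100,000-share sell order from the text, split eight ways over time,
# or matched to a U-shaped intraday volume profile:
print(twap_slices(100_000, 8))
print(vwap_slices(100_000, [20, 10, 5, 5, 10, 20]))
```

Real execution algorithms add randomization, limit-price logic and feedback from live market data precisely so that the slicing itself does not become a detectable signal for other algorithms — which is where the race described above begins.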
In addition to new electronic trading venues such as BATS and Chi-X Europe, there is the expansion of electronic trading on established exchanges such as NASDAQ, which has led to the establishment of data centres like Mahwah. This, in turn, pushes the boundaries of computerization even further, as financial companies with enormous capital hire computer engineers and buy special equipment to guarantee that their algorithms run microseconds faster than those of their competitors. Colocation is one of the latest trends in electronic trading: one reason the NYSE data center exists is to provide space for companies that want to be closer to the machines that execute today’s trades. There are now laser-transmission links between the NYSE and NASDAQ that are nanoseconds faster than the previous cables. Such infrastructural investment allows certain financial companies to realize profits that would otherwise not be possible. Latency, and the closing of the distance to the market computers, is therefore becoming ever more important.
If the algorithms are not solely dependent on speed, then another logic must also come into play here, namely the tiny price fluctuations, a form of noise that today draws so much attention to high-frequency trading. HFT involves extremely high order volumes and their rapid cancellation, the exploitation of spreads, the short-term holding of positions and low profit margins, the latter compensated by the processing of a very large number of transactions. The ability to move quickly into and out of positions, made possible by fast computers, opens up new arbitrage opportunities. Latency is now definitively part of profit production. The money capital needed to perform successfully at these speeds exists only for those financial firms able to invest huge sums in the computerized infrastructures, sums which in turn contribute significantly to the growth of their business.
In any case, in the context of other changes in the markets resulting from altered human and machine behaviour and increased competitive pressure, HFT has further integrated the markets as networks, and perhaps markets today, like nuclear power plants, need to be understood as large-scale systems that require appropriate modelling, or a new ecology of practices. If HFT increases market volatility and seeks millisecond advantages from such strategies, the question is how the concept of noise can help us understand these phenomena. For an economist, one can speak of microstructure noise precisely when it becomes difficult to estimate the value of a particular time series of data; furthermore, noise is a factor responsible for observed values deviating from fundamentals. Noise is the falsification of an idealized, observable process, although there are plenty of attempts to incorporate it into mathematical models in order to obtain a “true” measure of the process. Other researchers concede that microstructure noise exists independently of the mathematics. It is then understood as a necessary deviation due to human activity, interconnected systems and processes that do not perfectly follow mathematical modelling, even as noise becomes a constant component of the market. Decisive here is the signal-to-noise ratio, which must be greater than 1 for successful trading; within such a binary scheme, noise is understood as that which lacks information content.
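The signal-to-noise criterion mentioned above can be stated as a toy calculation (not an econometric estimator; the decomposition into a signal series and a noise series is assumed as given, which is exactly what real markets refuse to provide): compare the mean squared amplitudes of the two components.

```python
def power(xs):
    """Mean squared amplitude of a series."""
    return sum(x * x for x in xs) / len(xs)

def snr(signal, noise):
    """Signal-to-noise ratio as a power ratio; > 1 means the
    signal component dominates the noise component."""
    return power(signal) / power(noise)

# A steady drift of 2 against jitter of amplitude 1: SNR = 4, tradable
# in this binary scheme; swap the magnitudes and the noise drowns it out.
print(snr([2, 2, 2, 2], [1, -1, 1, -1]))
```

The conceptual point of the passage survives the arithmetic: the ratio is only computable once one has already decided which part of the series counts as signal, and that decision is precisely what the distinction between noise trader and informed trader cannot stabilize.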
Knouf examines in detail some artistic projects that translate financial data and financial movements into sound, with projects like rynb going a step further and focusing on the problem of the sonification of resonance and feedback. There is resonance here with various strands of noise music and with contemporary sonic practices that transform data into sound, including the informational noise in the digital signal itself, a result of the increase in mathematical chaos through HFT. Financial noise is generated by all kinds of feedback, mimetic forces and anticipations. Flash Sonification by rynb aims to expose the obscurantism of today’s financial languages, the way in which supposedly useless noise is translated into a discursive language; at the same time, human perception is brought into contact with the time scales of the algorithms. Sonic noise here involves the translation of market data, abstract and material at once, into various abstract forms that do not immediately signify anything. Listening to such sounds does not lead us to a rational understanding of finance; rather, it gives us gloomy forebodings about its mechanisms: high-frequency pulses that begin in the rhythm of the heartbeat and become so fast that they can no longer be distinguished from one another. Goodman calls this “bass materialism”. Noise as volatility and fluctuation is a means of continuously generating and accumulating profits. The intersection of sonic and informational noise produces vocal indications such as panic and fear on the one hand, and a dark, ghostly cloud born of the onslaught of data on the other. Anyone who would attack the data centers instead of Wall Street needs to understand that the real physical manifestations of finance today are the air-conditioned units and computers of the data centers.
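A naive sonification mapping, in the spirit of (but not identical to) the projects Knouf describes: percentage returns are shifted onto a pitch scale so that volatility becomes audible as pitch jumps. The function name and the semitones-per-percent mapping are my own illustrative assumptions, not the method of any of the artworks discussed.

```python
def sonify(prices, base_freq=220.0, semitones_per_pct=2.0):
    """Map successive percentage returns of a price series to pitches.

    Each return moves the current pitch by a number of semitones
    proportional to the price change (equal temperament: one semitone
    multiplies frequency by 2**(1/12)), so a calm market drones and a
    volatile one leaps.
    """
    freqs = [base_freq]
    for prev, cur in zip(prices, prices[1:]):
        pct = 100.0 * (cur - prev) / prev
        freqs.append(freqs[-1] * 2 ** (pct * semitones_per_pct / 12.0))
    return freqs
```

Feeding such a frequency list to any oscillator makes the passage’s point audible: at HFT time scales the pulses fuse into a continuous tone, and the individual events disappear from perception.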
Finally, Knouf discusses some aspects of the current debate on accelerationism. Do we simply have to accelerate the processes taking place in the financial centers, as some accelerationists assume? The term was brought into play by Benjamin Noys with reference to the writings of Deleuze/Guattari, Lyotard and Nick Land. The topic has already been discussed in detail elsewhere, so in the case of Deleuze/Guattari we note only that the authors largely relativized the statements on reterritorialization and deterritorialization made in Anti-Oedipus when, in A Thousand Plateaus, they write, for example, that the deterritorialized flows should be accelerated only if circumstances demand it, which favours not an ultimate acceleration but a pragmatic stance. Benjamin Noys believes that both Deleuze/Guattari and Lyotard made their relativizing remarks on acceleration because they noticed that their earlier positions were congruent with capitalist money flows. Noys sees accelerationism as the fantasy of an easy integration of machines, people and capital, transforming misery into a new jouissance. Knouf adds, by way of criticism, that it seems problematic for this fantasy alone to serve as a guideline for future political action.
Srnicek/Williams do not share Nick Land’s joy in being consumed by capital and, from a left perspective, demand cybernetic full automation, an unconditional basic income and the reduction of working hours. But this Promethean politics of domination over technology overlooks the peculiarities of the machines and any positive way of dealing with them. The way machines operate as stratagems independent of their creators is completely misunderstood, so that it cannot be grasped how the nexus between machines, capital and people really works, which would be the basis for new configurations that have nothing to do with a rule over machines. Hans-Dieter Bahr calls for such a different way of dealing with machines when he speaks of stratagems as effects of “the experimental par excellence, as attempting and being attempted”, and this neither in an infinite nor in a finite but in an indefinite field, a “campus indefinitum”. At this point Bahr also calls for an “archaeography” that confronts the problem of “the overclear, the indistinct, the allusion and the overinterpretation” in order to escape the philosophical circle of depiction, reflection and representation between reality and discourse. Bahr’s labyrinth of monuments, the multiplicity of temporal functions, can here be brought into a certain theoretical proximity to Laruelle’s fractal power of indetermination, which quite by itself renders the given irregular, namely in the campus indefinitum according to the real.