The Legitimacy of High Frequency Trading

Mark Thoma drew my attention to a post by Dean Baker, High Speed Trading and Slow-Witted Economic Policy.  High Frequency Trading, or more generically Computer Based Trading, is proving problematic because it is a general term covering a variety of different techniques, some of which appear uncontroversial while others appear very dubious.

For example, a technique I would consider legitimate derives from Robert Almgren and Neil Chriss' work on optimal order execution: how do you structure a large trade so that it has minimal adverse price impact and low transaction costs?  There are firms that now specialise in performing these trades on behalf of institutions, and I don't think there is an issue with how they innovate in order to generate profits.
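To sketch what such an execution strategy looks like, the Almgren-Chriss model schedules a large liquidation along a trajectory that trades off market impact against price risk.  The parameter values below are illustrative assumptions, not calibrated figures:

```python
import math

def almgren_chriss_schedule(X, T, N, sigma, eta, lam):
    """Holdings trajectory x_0..x_N for liquidating X shares over
    horizon T in N intervals (Almgren-Chriss linear impact model).

    sigma: volatility per unit time, eta: temporary impact coefficient,
    lam: risk aversion.  All numbers used below are illustrative.
    """
    tau = T / N
    # kappa controls how front-loaded the selling is; in the
    # small-interval limit kappa ~ sqrt(lam * sigma**2 / eta)
    kappa = math.sqrt(lam * sigma**2 / eta)
    return [X * math.sinh(kappa * (T - j * tau)) / math.sinh(kappa * T)
            for j in range(N + 1)]

schedule = almgren_chriss_schedule(X=100_000, T=1.0, N=10,
                                   sigma=0.3, eta=1e-6, lam=1e-5)
# schedule[j] is the number of shares still held after j intervals:
# it declines monotonically from X to 0, selling faster early on
# the more risk-averse the trader is.
```

The point is that the profit here comes from solving an optimisation problem on the institution's behalf, not from racing other participants' orders.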

The technique that is most widely regarded as illegitimate is order, or quote, stuffing.  This involves placing orders and then, within a tenth of a second or less, cancelling them if they have not been executed.  I suspect this is the process Baker refers to as enabling HFTs to 'front run' the market.  Baker regards the process as illegitimate, arguing that
The issue here is that people are earning large amounts of money by using sophisticated computers to beat the market. This is effectively a form of insider trading. Pure insider trading, for example trading based on the CEO giving advance knowledge of better than expected profits, is illegal. The reason is that it rewards people for doing nothing productive at the expense of honest investors.
On the other hand, there are people who make large amounts of money by doing good research to get ahead of the market. ... The gains to the economy may not in all cases be equal to the private gains to these traders, but at least they are providing some service.
By contrast, the front-running high speed trader, like the inside trader, is providing no information to the market. They are causing the price of stocks to adjust milliseconds more quickly than would otherwise be the case. It is implausible that this can provide any benefit to the economy. This is simply siphoning off money at the expense of other actors in the market.
The problem I have with Baker's argument is that I do not think it is robust.  It starts by suggesting a link between insider trading and HFT, and I don't think this holds up.  When a trade is placed on an exchange, it becomes public information.  The HFTs make their profits by responding more quickly to that information, not by working on private information.  Baker also distinguishes one sort of 'research', traditional economic research, from another, novel research on computer networks and algorithms, and implies that traditional research has a legitimacy in market exchange that computer research does not.

Statements like "simply siphoning off money at the expense of other actors in the market" make me a bit uneasy because they create distinctions between 'legitimate' and 'illegitimate' activity without offering a clear basis for the distinction.  For me, the distinction Baker makes seems to rest on the intellectual background of the agents: economics or computer science.  I worry that the foundation of Baker's criticism is an affinity with institutional investors and a distaste for small-scale entrepreneurs.

Baker's solution of "A modest tax on financial transactions [that] would make this sort of rapid trading unprofitable" is, if my basic economic understanding is correct, a standard way for incumbents to create barriers to new entrants.  Wall Street, according to Jonathan Levy's Freaks of Fortune, has at least a hundred-year tradition of lobbying legislatures to protect its interests, and I think we should be wary of assuming that Wall Street's interests are aligned with those of the broader public.

The problem is somewhat more serious in the UK.  In 2012 the UK's Government Office for Science reviewed Computer Based Trading technologies and decided that, while acknowledging that order stuffing was dubious, it would not suggest inhibiting the practice.  The rationale was that the marketplace is a competitive arena and that traders would congregate at the exchanges that best enabled them to compete; i.e. for the UK to retain its position as a financial centre, the UK government should not legislate on the issue.

The substantive question is whether I can come up with a more robust argument than Baker's, and I offer one at the end of this piece.

I have been critical of the Foresight report.  However, I have also been conscious that I could not coherently justify my objections to practices such as order stuffing.  This concern was related to my uneasiness in identifying the concept of reciprocity as embedded in contemporary financial mathematics.  I come from a fairly orthodox background, and connecting mathematics and ethics has been a problem for me since I first identified a link around 2010.

For me, the intellectual resolution of the problem of linking mathematics and ethics comes from pragmatic philosophy.  Pragmatism is especially relevant to finance because it addresses the thorny issue of truth when we cannot rely on objectivity, neutrality and determinism, and because it acknowledges the role of ethics in science.  Specifically, by rejecting the ideology of the fact/value dichotomy, I claim that the principle of 'no arbitrage' in pricing contingent claims is infused with the moral concept of fairness.  This is all well and good, but the claim can be treated as a heuristic (as the Dutch Book argument is) or as a fact.  Based on the empirical evidence of the Ultimatum Game, I claim it is a fact that reciprocity is embedded in financial mathematics.  This raises the question of why reciprocity is important.
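To illustrate the empirical point: in the Ultimatum Game one player proposes a split of a pot and the other accepts it, or rejects it, in which case both get nothing.  A toy simulation, with wholly made-up fairness thresholds rather than fitted data, reproduces the behaviour observed in experiments: responders refuse 'unfair' positive offers that a pure income-maximiser would accept:

```python
import random

def ultimatum_round(offer_fraction, threshold):
    """One round of the Ultimatum Game: the proposer offers a share of
    the pot; the responder accepts only if it meets their fairness
    threshold, otherwise both players get nothing."""
    pot = 10.0
    if offer_fraction >= threshold:
        return (pot * (1 - offer_fraction), pot * offer_fraction)
    return (0.0, 0.0)

# Thresholds here are illustrative assumptions; experimentally, offers
# below roughly 30% of the pot are commonly rejected even though any
# positive offer beats the alternative of nothing.
random.seed(1)
thresholds = [random.uniform(0.2, 0.4) for _ in range(1000)]
low = sum(1 for t in thresholds if ultimatum_round(0.1, t)[1] > 0)
fair = sum(1 for t in thresholds if ultimatum_round(0.4, t)[1] > 0)
# Every simulated responder rejects the 10% offer (low == 0) and
# accepts the 40% offer (fair == 1000): reciprocity, not pure
# income-maximisation, drives the outcome.
```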

As well as justifying the connection between ethics and mathematics, pragmatism provides an explanatory hypothesis.  One problem I grappled with was why the link between reciprocity and finance became obscured between the eighteenth and twenty-first centuries.  The explanation comes in the theories developed in Adorno and Horkheimer's Dialectic of Enlightenment and Polanyi's The Great Transformation, both published in 1944.  The Dialectic claims that the Enlightenment led to the objectification of nature and its mathematisation, which in turn leads to 'instrumental mindsets' that seek to optimally achieve predetermined ends in the context of an underlying need to control external events.

Jürgen Habermas responded to the Dialectic in Structural Transformation of the Public Sphere, where he argues that during the seventeenth and eighteenth centuries public spaces emerged, the public sphere, which facilitated rational discussion that sought the truth in support of the public good.  In the nineteenth century mass circulation mechanisms came to dominate the public sphere, and these were controlled by private interests.  As a consequence, the public became consumers of news and information rather than creators of a consensus through engagement with information.  Having undertaken this analysis of the contemporary state of affairs, Habermas sought to describe how the ideal of the Enlightenment public sphere could be enacted in the more complex contemporary (pre-internet) society, and his Theory of Communicative Action was the result.

Central to Communicative Action is a rejection of the dominant philosophical paradigm, the 'philosophy of consciousness'.  This paradigm is rooted in Cartesian dualism, the separation of mind and body, subject and object; it is characterised by foundationalism, the view that philosophy is required to demonstrate the validity of science and that the validity of science rests on empiricism; and it carries certain views specific to the social sciences, such as that society is built from individuals (atoms) interacting, so that society is posterior to individuals, and yet that society (a material, extending the physical metaphor) can be studied as a unitary whole rather than as an aggregate of individuals.

The dominant paradigm sees language as being made up of statements that are either true or false, and complex statements are valid if they can be deduced from true primitive statements.  This approach is exemplified in the standard mathematical technique of axiom-theorem-proof.  Habermas replaces this paradigm with one that rests on a pragmatic theory of meaning, shifting the focus from what language says (bears truth) to what it does.  Specifically, Habermas sees the function of language as enabling different people to come to a shared understanding and achieve a consensus; this process is discourse.  Because discourse is based on making a claim, the claim being challenged and then justified, discourse needs to be governed by rules, or norms.  The most basic rules are logical and semantic; on top of these are norms governing procedure, such as sincerity and accountability; and finally there are norms to ensure that discourse is not subject to coercion or skewed by inequality.

I have come to the conclusion that markets are centres of communicative action enabled by the language of mathematics.  In this framework reciprocity is a norm of communication, but it is not the only norm.  Habermas emphasises the importance of sincerity in communication in general, and the implication is that it is required in markets.

It is on this basis that I believe we can identify order stuffing as illegitimate: it is insincere.  The difference between optimal order execution strategies, which earn their computer-scientist experts money, and order stuffing is that an HFT engaged in order stuffing is not "sincere" in issuing an order it intends to cancel immediately.  The antidote is not to impose an additional cost on transactions, which would not affect institutional investors but might hinder legitimate speculation and innovation, but to regulate the timing of order cancellations: order stuffing would not be possible if orders had to remain on the book for a few minutes.
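A minimal sketch of how such a rule could work, using a hypothetical resting time of half a second (the exact period is a policy choice, and I argue above for something far longer):

```python
import itertools

MIN_REST = 0.5  # hypothetical minimum resting time, in seconds

class Book:
    """Toy order book that enforces a minimum resting time before an
    order may be cancelled -- the regulatory fix suggested above."""
    def __init__(self):
        self._orders = {}             # order id -> submission time
        self._ids = itertools.count()

    def submit(self, now):
        oid = next(self._ids)
        self._orders[oid] = now
        return oid

    def cancel(self, oid, now):
        """Cancel succeeds only once the order has rested long enough;
        a too-early cancel is rejected and the order stays live."""
        if now - self._orders[oid] < MIN_REST:
            return False
        del self._orders[oid]
        return True

book = Book()
oid = book.submit(now=0.0)
stuffed = book.cancel(oid, now=0.05)  # HFT-style near-instant cancel
patient = book.cancel(oid, now=1.0)   # cancel after the resting period
# stuffed is False: the stuffing-style cancel is blocked and the
# order remains exposed to execution; patient is True.
```

Under such a rule an order carries genuine execution risk for its whole resting period, which is precisely what makes submitting it a sincere act.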
