Why High-Frequency Trading Is So Hard to Regulate – NYTimes.com.
Peter J. Henning writes in NYTimes.com’s DealBook that “The challenge in pursuing charges against these firms is that they are taking advantage of changes in the technology underpinning the markets to profit from quick trades, which is not illegal. But regulators can find it difficult to draw the line between acceptable trading strategies and manipulation because of the complexity of the strategies.”
However, detecting market manipulation (whether potential or attempted) has been challenging since the beginning of financial markets. Regulators’ timely access to detailed data on securities transactions is the key to successful market surveillance. Advances in market microstructure technology make this task much easier today compared to the paper-based trading of the past. Regulators should be able to view all the details of market transactions in real time, especially for exchange-traded instruments, since everything already resides in the electronic records of the exchanges. Imposing new rules that require market participants to submit detailed transaction data after the fact feels like a regulatory response of the last century.
In his 2011 Review of Futures Markets publication, Dr. Ahmet Karagozoglu suggests that regulators (and self-regulators) should design their own “market surveillance algorithms” and “co-locate” with exchanges’ matching engines in order to detect and respond rapidly to improper trading activity that might be taking place at extreme speeds. The number of quants, financial engineers, risk modelers and algo developers employed by regulatory agencies needs to increase dramatically (at the very least it should approach the number of lawyers and legal staff)! Regulators’ market surveillance algorithms should rival, in speed and complexity, the trading strategies used by market participants.
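To make the idea of a “market surveillance algorithm” concrete, here is a minimal sketch of one such heuristic: flagging traders whose order-cancellation rate inside a sliding one-second window exceeds a threshold, a crude proxy for quote stuffing. This is purely illustrative; the message format, the threshold, and the function name are assumptions, not any regulator’s or exchange’s actual system.

```python
from collections import deque

# Illustrative sketch only -- a toy surveillance heuristic, not an actual
# regulatory algorithm. Window size and threshold are hypothetical.
WINDOW_NS = 1_000_000_000      # sliding window: 1 second, in nanoseconds
MAX_CANCELS_PER_WINDOW = 100   # hypothetical alert threshold

def detect_quote_stuffing(messages, window_ns=WINDOW_NS,
                          max_cancels=MAX_CANCELS_PER_WINDOW):
    """messages: iterable of (timestamp_ns, trader_id, action) tuples,
    sorted by timestamp, where action is 'new', 'cancel', or 'trade'.
    Returns the set of trader_ids whose cancellations within any
    sliding window exceed the threshold."""
    windows = {}    # trader_id -> deque of recent cancel timestamps
    flagged = set()
    for ts, trader, action in messages:
        if action != "cancel":
            continue
        q = windows.setdefault(trader, deque())
        q.append(ts)
        # Evict cancellations that have fallen out of the sliding window.
        while q and ts - q[0] > window_ns:
            q.popleft()
        if len(q) > max_cancels:
            flagged.add(trader)
    return flagged
```

A real co-located implementation would consume the exchange’s raw message feed and run at matching-engine latencies, but the sliding-window structure of the check would look much the same.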