
Controlling risk in a lightning-speed trading environment
by Carol L. Clark, lead technical expert, Financial Markets Group
A handful of high-frequency trading firms accounted for an estimated 70 percent of overall
trading volume on U.S. equities markets in 2009. One firm with such a computerized system
traded over 2 billion shares in a single day in October 2008, amounting to over 10 percent of
U.S. equities trading volume for the day. What are the advantages and disadvantages of this
technology-dependent trading environment, and how are its risks controlled?
A small group of high-frequency algorithmic trading firms has invested heavily in technology
to leverage the nexus of high-speed communications, mathematical advances, trading, and high-speed
computing. By doing so, they are able to complete trades at lightning speed. High-frequency
algorithmic trading strategies rely on computerized quantitative models that identify
which types of financial instruments to buy or sell (e.g., stocks, options, or futures1), as well as
the quantity, price, timing, and location of the trades. These so-called black boxes are capable of
reading market data, transmitting thousands of order messages per second to an exchange,
cancelling and replacing orders based on changing market conditions, and capturing price
discrepancies with little or no human intervention. The TABB Group, a financial markets
research firm, estimates that algorithmic trading in the U.S. equities markets grew from 30
percent of total volume in 2005 to about 70 percent in 2009 and that 2 percent of the 20,000
trading firms in the U.S. initiate these transactions.2 These firms made about $21 billion in
profits during 2008.3 Many of them are based in Chicago and are staffed by former floor traders
of the city’s major exchanges—the Chicago Mercantile Exchange and the Chicago Board of
Trade (both now parts of the CME Group) and the Chicago Board Options Exchange.
In view of the speeds at which these firms trade, some industry experts have expressed concern
over the potential for one or more black boxes going berserk and causing huge losses. Certainly,
trading losses due to errors are routine in an open outcry trading environment, and numerous
trading errors in electronic, screen-based trading environments have resulted in individual losses
amounting to hundreds of millions of dollars. Although algorithmic trading errors have occurred,
we likely have not yet seen the full breadth, magnitude, and speed with which they can be
generated. Furthermore, many such errors may be hidden from public view because a large
number of high-frequency trading firms are privately held, rely on proprietary technology, and
have no customers.
While black boxes execute trades with little or no human intervention, their operation is still in
human hands. For example, human beings decide how these tools are developed, programmed,
and tested; when they are turned on and off; and who has access to them. So, what are the risks
associated with high-frequency trading, and how are these risks controlled? In this Chicago Fed
Letter, I examine why speed is important to high-frequency trading strategies and how trading
firms and exchanges increase speed through unfiltered sponsored access and co-location. I
discuss the advantages and disadvantages of speed, which require careful risk management.
Finally, I explain how pre- and post-trade risk controls help manage these risks.

The importance of speed
A main goal of high-frequency trading strategies is to reduce latency, or delays, in placing,
filling, and confirming or cancelling orders. This is important because price takers—those who
place orders to buy or sell—are exposed to market risk prior to receiving confirmation that their
orders have been filled. Price makers—those who provide resting bids (buy orders) and offers
(sell orders) or respond to buy or sell orders—are exposed to the risk that their prices will remain
in the market at a time when the market has moved in the opposite direction of their strategy.
Latency is measured in microseconds (millionths of a second) and has various components,
including the speed at which market data and signals from the marketplace are processed, as
well as the geographical distance to, and response time of, the exchange matching engine (a
computer or computers where trades are matched and executed). By reducing latency, high-frequency
traders are able to send their buy and sell orders to the exchange matching engine at breakneck
speeds in the hope of getting their trades executed first.
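To make these units concrete, here is a minimal sketch, in Python, of timing one order round trip and expressing it in microseconds. The send_order and await_ack functions are hypothetical stand-ins rather than any exchange's actual interface; production systems timestamp at the network card or in dedicated hardware for far finer resolution.

```python
import time

def send_order():
    """Hypothetical stand-in for transmitting an order to the exchange gateway."""
    pass

def await_ack():
    """Hypothetical stand-in that returns once the matching engine acknowledges."""
    pass

# Time one order round trip with a nanosecond-resolution clock, then
# convert to microseconds, the units in which latency is typically quoted.
t_start = time.perf_counter_ns()
send_order()
await_ack()
t_end = time.perf_counter_ns()

round_trip_us = (t_end - t_start) / 1_000
print(f"Order round-trip latency: {round_trip_us:.1f} microseconds")
```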
To further satisfy trading firms’ desire for speed, there have been some new developments in
how some customers of clearing members and some nonclearing members send trades to an
exchange. Clearing members are members of an exchange clearinghouse, where trades are
matched and settled, and are referred to as broker–dealers in the securities markets and as futures
commission merchants (FCMs) in the futures markets.4 Nonclearing members are not members
of the exchange clearinghouse and must clear their trades through clearing members. Clearing
members are financially responsible for the trades of their customers and any nonclearing
members who clear trades through them, which include any customers of those nonclearing
members.
A two-tiered risk-management structure exists to monitor the credit and trading risks that
clearing members pose to the clearinghouse and that customers and nonclearing members pose to
the clearing member, although the extent of oversight varies across clearinghouses, broker–
dealers, and FCMs. The clearinghouse monitors clearing members, requires them to meet
stringent capital requirements, and demands margin (collateral) payments for positions they
maintain, including customer and nonclearing member positions. Clearinghouses also require
clearing members to contribute to a loss-sharing pool that is maintained in case one or more
clearing members become insolvent. Some clearinghouses also impose additional assessments on
clearing members if the losses arising from a defaulting clearing member exceed the amount in
the loss-sharing pool. Together, these measures help mitigate the credit risk that a clearing
member poses to the clearinghouse. In turn, clearing members oversee customers and
nonclearing members because of the credit and trading risks they pose to the clearing member.
Unfiltered sponsored access and co-location
Clearing members’ customers and nonclearing members typically send their orders to the
exchange using the clearing members’ trading infrastructure. To control the risk of potential
losses arising from these trades, the clearing member sets pre-trade limits, such as price and
quantity, which prevent trades outside a certain range from being executed. Rather than using
clearing members’ trading platforms, some customers of clearing members and some
nonclearing members now reduce latency by sending their trades directly to the exchange, using
their own or vendor-provided trading platforms.
There are two types of arrangements: sponsored access and unfiltered sponsored access.
Sponsored access allows clearing members’ customers and nonclearing members to access the
exchange matching engine directly and includes some pre-trade risk controls, such as price and
quantity limits. Unfiltered sponsored access—known as "naked access" in the equities markets
and as "direct market access" in the futures markets—enables customers of clearing members
and nonclearing members to bypass pre-trade risk controls and to send trades directly to the
matching engine. Customers of clearing members and nonclearing members find it desirable to
have unfiltered sponsored access because pre-trade risk controls slow trades down and increase
latency.
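The distinction can be made concrete with a minimal sketch of a pre-trade filter of the sort that sponsored access retains and unfiltered sponsored access bypasses. The limit values and field names below are illustrative assumptions, not any clearing member's actual configuration.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str        # "buy" or "sell"
    quantity: int
    price: float

# Illustrative per-client limits; real systems apply many more checks
# (credit, position, duplicate-order, and message-rate limits, etc.).
MAX_ORDER_QTY = 10_000
PRICE_BAND = 0.05  # reject prices more than 5% away from a reference price

def pre_trade_check(order: Order, reference_price: float) -> bool:
    """Return True only if the order passes the quantity and price-band limits."""
    if order.quantity <= 0 or order.quantity > MAX_ORDER_QTY:
        return False
    if abs(order.price - reference_price) / reference_price > PRICE_BAND:
        return False
    return True

# An order three orders of magnitude too large is rejected before it
# ever reaches the matching engine.
oversized = Order("XYZ", "sell", quantity=10_000_000, price=100.0)
print(pre_trade_check(oversized, reference_price=100.0))  # False
```

Each such check takes time to run, which is precisely why latency-sensitive firms have sought to trade without them.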
Another development in reducing latency is related to how close a trading firm’s server is to the
exchange matching engine. It is estimated that for each 100 miles the server is located away from
the matching engine, 1 millisecond (thousandth of a second) of delay is added. To reduce this
latency, many exchanges now offer co-location services, which allow trading firms to place their
servers close to the exchange matching engine. Doing so significantly reduces the time it takes to
access the central order book, where electronic information on quotes to buy and sell, as well as
current market prices, is warehoused. It also decreases the time it takes to transmit trade
instructions and execute matched trades. Co-location services are offered by numerous
exchanges, including NYSE (New York Stock Exchange) Euronext, Eurex,
IntercontinentalExchange (ICE), and the Chicago Mercantile Exchange.5
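The 100-miles-per-millisecond rule of thumb can be sanity-checked from physics: light travels through optical fiber at roughly two-thirds of its speed in a vacuum. The sketch below is an idealized propagation-delay calculation that ignores switching, serialization, and equipment delays, which add to the totals in practice.

```python
# Light in optical fiber propagates at roughly two-thirds of c.
C_VACUUM_KM_PER_S = 299_792.458
FIBER_SPEED_KM_PER_S = C_VACUUM_KM_PER_S * 2 / 3

def one_way_fiber_delay_ms(distance_km: float) -> float:
    """Idealized one-way signal propagation delay over fiber, in milliseconds."""
    return distance_km / FIBER_SPEED_KM_PER_S * 1_000

# 100 miles is about 161 km: roughly 0.8 ms each way before any
# equipment delays, consistent with the rule of thumb above.
print(f"100 miles away: {one_way_fiber_delay_ms(161):.2f} ms one way")
print(f"co-located (1 km): {one_way_fiber_delay_ms(1):.4f} ms one way")
```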
Advantages and disadvantages of speed
Many factors have driven the adoption of high-frequency algorithmic trading, including the
ability to capture trading opportunities more quickly and to achieve best execution and a desired
benchmark price.6 A significant catalyst for this change was the decimalization of U.S. capital
markets in 2000, which resulted in the price of stocks being quoted in decimals (pennies) rather
than fractions of a dollar. Smaller tick sizes—the smallest increment by which the price of a
financial instrument can move—caused an explosion in market data volumes. Processing such
high volumes began to exceed the data assimilation capabilities of human traders, whereas
machines were ideally suited to handling thousands of data points per second.7
There is evidence that high-frequency algorithmic trading also provides some benefits for
investors by narrowing spreads—the difference between the price at which a buyer is willing to
purchase a financial instrument and the price at which a seller is willing to sell it—and by
increasing liquidity at each decimal point.8 However, a major issue for regulators and
policymakers is the extent to which high-frequency trading, unfiltered sponsored access, and co-location amplify risks, including systemic risk, by increasing the speed at which trading errors or
fraudulent trades can occur. Robert L. D. Colby, former deputy director of the Trading and
Markets Division, Securities and Exchange Commission (SEC), said that in the two minutes it
may take for a broker–dealer to receive information on trades executed by a customer with
unfiltered sponsored access, hundreds of thousands of trades worth billions of dollars could
take place.9 On January 13, 2010, the SEC proposed a rule change that would prevent broker–
dealers from providing customers with unfiltered sponsored access to an exchange.10
Concerns over a black box going berserk arise from some well-known trading errors in the
electronic trading environment.11 For example, the Dow Jones Industrial Average dropped 100
points in 2002 when a Bear Stearns trader inadvertently entered a sell order for $4 billion instead
of $4 million. More than $600 million of the stock changed hands before the error was
detected.12 Similarly, a Morgan Stanley trader entered an order to buy 100,000 shares in 2007,
using a tool with a built-in multiplier of 1,000. The order was entered as $10.8 billion instead of
$10.8 million. Before the bank realized the mistake and cancelled the order, 81.5 million shares
totaling $875.3 million had been traded.13
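Errors of this kind are exactly what a simple maximum-notional check is meant to catch. The sketch below uses an illustrative ceiling and prices, not Morgan Stanley's actual controls, to show how a misapplied 1,000x multiplier would be flagged.

```python
# Illustrative ceiling on the dollar value of any single order.
MAX_NOTIONAL_USD = 50_000_000

def notional_ok(quantity: int, price: float) -> bool:
    """Reject any order whose total dollar value exceeds the configured ceiling."""
    return quantity * price <= MAX_NOTIONAL_USD

# 100,000 shares silently multiplied by 1,000 by the order-entry tool:
print(notional_ok(100_000 * 1_000, price=108.0))  # False: $10.8 billion order
print(notional_ok(100_000, price=108.0))          # True:  $10.8 million order
```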
Sometimes, these trading errors have been the result of the removal of pre-trade risk controls to
decrease latency. For example, futures broker MF Global suffered $141.5 million in losses in
February 2008, when a rogue trader initiated transactions during off hours using a terminal
intended for the business of major customers. One breakdown in MF Global’s internal risk
systems was the removal of trade limits, which had been done to increase trading speeds.14
Because speed is vastly increased in an algorithmic trading environment, particularly when
unfiltered sponsored access and co-location are involved, some market participants have
expressed concern that trading errors and losses have the potential to be even greater than
those in the electronic, screen-based trading environment. While it is difficult to know if such a
scenario will ever occur, we do know there have been instances of high-frequency trading errors
leading to losses. For example, in 2003 a U.S. trading firm became insolvent in 16 seconds when
an employee who had no involvement with algorithms switched one on. It took the company 47
minutes to realize it had gone bust and to call its clearing bank, which was unaware of the
situation.15 In a separate event, the NYSE recently fined Credit Suisse for an incident in 2007,
when a trader who was also a programmer changed the parameters on an algorithm. The change
resulted in a message loop that sent 600,000 messages to the matching engine in 20 minutes,
which severely slowed message traffic at the exchange. The NYSE rejected about 400,000 of
these messages, but it has not been disclosed if the remaining 200,000 orders resulted in any
trading losses.16
Firms with weak internal controls are exposed to risks related to the speed at which trades can be
executed and the circumvention of pre-trade risk controls. Moreover, because black boxes
sometimes trade with other black boxes, an erroneous price from one could impact the trading
strategy of another.

Pre- and post-trade risk controls
Pre- and post-trade risk controls exist at various levels of the trading process to prevent and limit
losses. Some exchanges have pre-trade volume and price limits that stop trades outside a certain
quantity or price from being executed. Others have trade bust policies that cancel clearly
erroneous trades. A well-built algorithm contains risk controls, such as price and quantity limits.
Broker–dealers and FCMs are responsible for verifying the financial integrity and risk controls
of their customers and nonclearing members, whether they are floor, screen-based, or
algorithmic traders. As discussed earlier, clearinghouses impose risk controls on their members.
Moreover, some clearinghouses, such as the Chicago Mercantile Exchange, provide FCMs with
near real-time information on their customers’ trades. This post-trade information enables FCMs
to monitor customers and nonclearing members with unfiltered sponsored access and to make
decisions on whether to allow them to continue trading. The speed at which clearing members
receive post-trade information from the clearinghouse and incorporate it into their
risk-management systems is therefore of paramount importance, so that erroneous trades can be
detected and stopped. Clearing members also need an automated means of stopping trades, which
not all exchanges provide.
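As a rough illustration of how such near real-time information might be used, the following sketch applies a net position limit to a stream of fill reports. The message fields and the limit are assumptions for illustration, not any clearinghouse's actual feed format or any FCM's actual thresholds.

```python
from collections import defaultdict

POSITION_LIMIT = 500_000  # illustrative net position cap per account and symbol

positions = defaultdict(int)

def on_fill(account: str, symbol: str, side: str, quantity: int) -> None:
    """Update an account's net position from a post-trade fill report and
    flag the account when the position breaches its limit."""
    signed = quantity if side == "buy" else -quantity
    positions[(account, symbol)] += signed
    if abs(positions[(account, symbol)]) > POSITION_LIMIT:
        # In practice, the clearing member would then need an automated way
        # to halt the account's trading at the exchange.
        print(f"ALERT: {account} net position in {symbol} is "
              f"{positions[(account, symbol)]:,}")

on_fill("client-42", "XYZ", "buy", 600_000)  # triggers the alert
```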

Conclusion
The high-frequency trading environment has the potential to generate errors and losses at a speed
and magnitude far greater than those in a floor-based or screen-based trading environment. In
addition, the types of risk-management tools employed by broker–dealers and FCMs, their
customers, nonclearing members, exchanges, and clearinghouses vary, and their robustness in
withstanding losses from high-frequency algorithmic trading is uncertain. Because these losses
have the capability of impacting the financial condition of broker–dealers and FCMs and possibly the
clearinghouses, determining and applying the appropriate balance of financial and operational
controls is crucial.
Moreover, issues related to risk management of these technology-dependent trading systems are
numerous and complex and cannot be addressed in isolation within domestic financial markets.
For example, placing limits on high-frequency algorithmic trading or restricting unfiltered
sponsored access and co-location within one jurisdiction might only drive trading firms to
another jurisdiction where controls are less stringent.
1 Options are contracts that provide the owner with the right, but not the obligation, to buy (call) or sell (put) a specific quantity of an asset (a stock, bond, index, commodity, or currency) within a specified time period. Futures are contractual agreements to buy or sell a commodity or financial instrument on a specific date in the future at a predetermined price.

2 See David Weidner, 2009, "Subplots of the Goldman code heist," Wall Street Journal, MarketBeat blog, July 9, available at http://blogs.wsj.com/marketbeat/2009/07/09/subplots-of-the-goldman-code-heist/.

3 New York Times Company, 2009, "High-frequency algorithmic trading," New York Times, August 8 (updated September 17), available at http://topics.nytimes.com/topics/reference/timestopics/subjects/h/high_frequency_algorithmic_trading/index.html.

4 Broker–dealers trade securities for their own accounts and on behalf of their customers. FCMs trade futures contracts for their own accounts and on behalf of their customers.

5 Colin Packham, 2007, "Co-location, co-location, co-location," FOW, Vol. 17, No. 4, November, pp. 50–51.

6 Benchmark price is specific to the firm's trading strategy and may be based on a number of factors, e.g., the time- or volume-weighted average price of the financial instrument being bought or sold.

7 Anita Hawser, 2009, "The thinking machine's man … or woman," Automated Trader, No. 12, First Quarter, available by subscription at www.automatedtrader.net/articles/technology-strategy/7840/the--thinking-machines-man--or-woman.

8 Terrence Hendershott, Charles M. Jones, and Albert J. Menkveld, 2007, "Does algorithmic trading improve liquidity?," University of California, Berkeley, Haas School of Business; Columbia University, Graduate School of Business; and Vrije Universiteit Amsterdam and Tinbergen Institute, working paper, September 4, available at http://icf.som.yale.edu/pdf/seminars07-08/jones.pdf.

9 Robert L. D. Colby, 2009, keynote address at Aite Group panel discussion, New World Order: Deciphering the Future of High-Frequency Trading, New York, September 16.

10 Comments on this proposed rule change are due 60 days after its publication in the Federal Register; see www.sec.gov/news/press/2010/2010-7.htm. The SEC also issued a concept release on January 13, 2010, seeking public comment on the structure of the equity market, including high-frequency trading and co-location; see www.sec.gov/news/press/2010/2010-8.htm.

11 See, e.g., Maria Trombly, 2007, "Error in Singapore forced unwinding of 110,000 trades," Securities Industry News, August 6, available at www.securitiesindustry.com/issues/19_28/21320-1.html; and Tara Loader Wilkinson, 2007, "The fat finger points to trouble for traders," Dow Jones Financial News Online, March 12, available by subscription at www.efinancialnews.com/assetmanagement/content/2347355201/restricted.

12 Wilkinson (2007).

13 Ibid.

14 Greg Burns and Joshua Boak, 2008, "Ultraspeedy trades can rack up big losses, fast," Chicago Tribune, March 2, available at http://archives.chicagotribune.com/2008/mar/02/business/chi-sun-mf-trader-mar02.

15 Andreas Preuss, 2007, Eurex CEO's comments from Exchanges—The CEO Perspective panel, Futures Industry Association Expo 2007, Chicago, November 27.

16 John Hintze, 2010, "NYSE fines Credit Suisse for algo run amok," Securities Industry News, January 13, available at www.securitiesindustry.com/news/-24504-1.html.