January 2012


The preface gives the reason behind the book's title:

This year we learnt that there are many thousands of children across Britain who cannot read competently, that there are thousands who leave primary school unable to put together basic sentences. One in three teenagers reads only two books a year, or fewer, and one in six children rarely reads books outside of the classroom. Many parents do not read stories to their children, and many homes do not have books in them. Stories and poems, for these thousands of children, are not a source of enchantment or excitement. Books are associated with school, or worse – they are associated with acute feelings of shame and frustration.

The ten people who have contributed to this book are from very different backgrounds. Some grew up with a multitude and variety of wonderful books within their reach; some had parents who imparted to them a fierce desire for books and for learning; for others, books were hard to come by, or even illicit. But all ten are united here in a passionate belief in the distinctive and irreplaceable pleasures and powers of reading. They describe a poem as a lifeline, a compass, or literature as the holding place of human value. They each contend that books are not just for the classroom, but must be made easily available beyond it, because great books are essential to a richer quality of life. These writers know that learning to read transformed their very brains, and that literature has helped them to express their questions and ideals, and molded their imagination and sense of self. This book is a manifesto. In a year of rude awakenings to low levels of literacy and a widespread apathy towards books and reading, this book demands an interruption. Stop What You’re Doing and Read. Read these essays, because they aim to convince you to make reading part of your daily life.

In this post, I have listed some of the remarks, statements and quotes that I found interesting in the ten essays in the book.

1. Zadie Smith

  • I have spent a lot of time in libraries. To go somewhere to study, because you have chosen to, with no adult looking over your shoulder and only other students for support and company – this was a new experience for me.
  • I came to understand why silence is necessary for serious study, and what the point of coffee is.
  • You might discover how quickly an afternoon with a toddler passes in a local library, quicker than practically anywhere else. You might realize that giving up smoking or writing a novel is easier to do when you are one of a group of people all seated on some fold-up chairs in a circle. It might strike you that what you really want in life is silence. Despite the many wonders of the Internet, you might suddenly long for the smell of the old books.

2. Blake Morrison

  • The best books give us a lifeline, a reason to believe, a way to breathe more freely.
  • The existence of the text is a silent existence, silent until the moment in which a reader reads it. Only when the able eye makes contact with the markings on the tablet does the text come to active life. All writing depends on the generosity of the reader.
  • It takes courage to own up to dark thoughts and dangerous feelings. But poetry – the most intimate yet public of forms – is the ideal place.
  • Poetry is a serious business, but it isn’t solemn or funereal. All it insists on is that we read carefully, with concentration. The shortening of lines is a signal to slacken our pace.
  • Reading catatonically can become a way of coping with travails. To read so as to limit the power that the real world has over us.
  • When you are deep into a book, it’s for yourself alone. But once you’ve finished, if it’s any good, you want to share it with others – rehearse the story, assess the characters, discuss what makes the book special. Children do this instinctively, and the growth of book clubs, reading groups and blogs reflects a continuing and perhaps increasing need for adults to share books in this way.
  • Don’t build a canon (reading list) just for the heck of it. Read at whim and don’t be intimidated by experts.
  • Sometimes we find places that are very comforting, just like books. We tend to keep these places secret so that they don’t get ruined when a lot of people discover them. Books are NOT like that. We have no investment in keeping them to ourselves. Share them. The world will be the better for it.

3. Carmen Callil

  • “Live always in the best company when you read.” This my mother and I did: we read for company.
  • To this day, I do not move outside the house without a book; inside my house they are my paintings, my decorations, my fellow travellers and my comfort.
  • Books are shields against a terror of boredom, that curse of most childhoods.
  • Books are like gardens; a Kindle or an iPad is like a supermarket – it makes life easier, but one doesn’t want to loiter in it. You can fiddle with books. Like gardens, they can be wonderful to look at.

4. Tim Parks

  • Only the sequence of signs matters. The writing is in the sequence of the signs. This is the one thing that we can’t change. The experience is the sequence. The experience is not in any one moment of perception, but in the movement through the sequence from beginning to end, at our own speed, with interruptions. At the beginning of each sentence we are projected towards the end; at the end we have the momentum of the beginning. The same with the paragraph, the same with the chapter, the same with the whole book, maybe the trilogy. The beginning requires the end, the end the beginning. We are locked into a journey.
  • It’s learning how to take intense pleasure in reading that makes it so useful. There are two pleasures in reading: one is enchantment, and the second is an awareness of that enchantment. These two seemingly diverse experiences together make reading a pleasure.
  • Life is simply too short to read the wrong books, or even the right books at the wrong time.
  • So, this novel, which was definitely not what you were looking for, now turns out to be exactly what you needed. It has allowed you to discover something new about yourself, because you were watching your reaction as you read it.

5. Mark Haddon

  • It also occurs to me that whilst I read different books these days, my reasons for reading have changed very little. It’s the thrill of being transported to another world.
  • Select the right words and put them in the right order and you can run a cable into the hearts of strangers. Strangers in China, strangers not yet born.
  • Books are like people. Some look deceptively attractive from a distance, some deceptively unappealing, some are easy company, some demand hard work that isn’t guaranteed to pay off. Some become friends and some stay friends for life. Some change in our absence – or perhaps it’s we who change in theirs – and we meet up again only to find that we don’t get along any more. Unlike people, one can at least dump them or hand them to a friend without causing offense or feeling guilt.
  • I am always in search of novels that understand and articulate precisely what it feels like to be a human being.
  • Lay the novel alongside film and its specialness becomes obvious. Film promises everything. Ancient Rome, dinosaurs, talking dogs, car chases, sex, Mars, vampires… Such a boundless cornucopia that we forget what it can’t do. It can’t do smell or taste or texture. It can’t tell us what it is like to inhabit a human body. Its eyes are always open. It fails to understand the importance of the things we don’t notice. It can’t show those long stretches of time when we are seeing nothing at all, just drifting in our own minds.
  • The sense of being inside looking out, of seeing a world that belongs to everyone, but is nevertheless yours alone. It is this uncrossable gulf between me and not-me, between my private experience and yours, which lies at the heart of being human and which no other medium can touch, and this border is where the novel lives and moves and has its being.
  • Talking about reading as the cause of anything is to get things back to front. Reading is primarily a symptom. Of a healthy imagination, of our interest in this and other worlds, of our ability to be still and quiet, of our ability to dream during daylight.
  • If we want more people to enjoy better books, whatever that means, we should concentrate on the things that prevent people reading. Poverty, poor literacy, library closures, feelings of cultural exclusion. Alleviate any of these problems and reading will blossom. These are real threats, not technology, not the pervasive and rising fear that readers are being tempted elsewhere by the shallow pleasures of Britain’s Got Talent.
  • A novel is just the right words in the right order.
  • Films and television programs, plays and paintings and sculptures never really become friends in the way that novels do. We can admire, we can be impressed, we can be moved and consoled, but we rarely feel that peculiarly personal attachment we feel to a loved novel, because whilst writing novels is a long and solitary business, reading them is always a collaboration, and a good writer gives the reader space and encouragement to play their part so that when we close the final page, we have had an experience that is partly of our own making.

6. Michael Rosen

  • I can’t really sort out who’s who, real or imaginary, and I think this is how we all read when we have time and space to think about books.
  • Dickens told us about a Miss Havisham whom he created, but when many of us read about that Miss Havisham, we bring her to life with the Miss Havishams we know in our lives.
  • My father’s performance had given such life to the characters that their vocabulary became ours, and they could now live with us on the campsite and, it turned out, beyond, for years after.
  • Part of the power of stories is the way in which we can see facets of this or that fictional person in the people we know, and scenes from the fictional world have echoes in the events of the real world.

7. Dr. Jane Davis

  • You think of reading as an individual, even a solitary activity, one that you want to defend as such, because usually, for devoted readers, the act of reading is deeply private. I’m going to argue, though, that even highly proficient readers might want to try shared reading, which is in equal measure about books and people. It isn’t just about getting non-readers into reading; it is about building relationships out of communal meanings. Sharing a book is a multiplier, as anyone who has ever read, night after night, to a story-besotted child will know. It is about mutual recognitions, a sharing of selves.
  • Read the poem alone and you have your own experience and imagination to touch the poem into life. Read it with six others and you have six lives and six imaginations with which to inhabit this flexible human-shaped space.
  • Over the last 100 or so years, the loss of religion as a reputable discourse in common life has led to a poverty of language, and this to a poverty of contemplative thought and feeling about what we are and what we need. We need some inner stuff, scaffolding to help us get around our inner space, something to help us map, explore and even settle those places where we are still primitive.
  • What is that part of being human which is touched by silence, which recognizes an intense atmosphere when people are moved, which gets scared or exhilarated when alone in a big space, or when faced with a newborn baby? Science may gradually work this out – science is our mainstream model these days for accredited seriousness, for what we can be confident in believing. But literature – too often now dismissed or misplaced – has always known that buried part, and in thousands of ways.
  • What people instinctively know, and science is beginning to understand, is that what makes people happy, above all, is a network of supportive fellow creatures, a sense of purpose, challenge and meaningful occupation. Shared reading can provide all this. Get a few people together, pick up a good book and try it.

8. Jeanette Winterson

  • There are two dominant modes of experience offered to us at present – the actual (hence our appetite for reality TV, documentaries and “true-life” drama) and the virtual – the Web. Sometimes these come together, as in the bizarre concept of Facebook: relationships without the relating. Reading offers something else: an imaginative world.
  • To cross the threshold of a book is to make a journey in total time. I don’t think of reading as leisure time or wasted time and especially not as downtime. The total time of a book is more like uptime than downtime, in the way that salmon swim upstream to get home.
  • Reading is becoming a casualty of the surf syndrome of the web. Reading is not skimming for information. Reading is a deeper dive. Or a high climb. Nan Shepherd talks about the exhilaration of altitude. The air is thinner. The body is lighter. But you have to acclimatize. You have to acclimatize yourself to books.
  • The consequences of homogenized mass culture plus the failure of our education system and our contempt for books and art, mean that not reading cuts off the possibility of private thinking, or of a trained mind, or of a sense of self not dependent on external factors.
  • A trained mind is a mind that can concentrate. Attention Deficit Disorder is not a disease; it is a consequence of not reading. Teach a child to read and keep that child reading and you will change everything. And yes, I mean everything.

9. Nicholas Carr

  • For Emerson, the best books – the “true ones” – “take rank in our life with parents and lovers and passionate experiences, so medicinal, so stringent, so revolutionary, so authoritative”.
  • Books are not only alive; they give life, or at least give it a new twist.
  • The groups of nerve cells, or neurons, that are activated in the brains of readers closely mirror those involved when they perform, imagine or observe similar real-world activities.
  • When we open a book, it seems that we really do enter, as far as our brains are concerned, a new world – one conjured not just out of the author’s words, but out of our own memories and desires – and it is our cognitive immersion in that world that gives reading its rich emotional force.
  • In our day-to-day lives we are always trying to manipulate or otherwise act on our surroundings, whether it’s by turning a car’s steering wheel or frying an egg or clicking on a link at a website. But when we open a book our expectations and our attitudes change drastically. Because we understand that “we cannot or will not change the work of art by our actions”, we are relieved of our desire to exert an influence over objects and people and hence are able to “disengage our cognitive systems for initiating actions”. That frees us to become absorbed in the imaginary world of the literary work. We read the author’s words with “poetic faith”.
  • It is only when we leave behind the incessant busyness of our lives in society that we open ourselves to literature’s transformative emotional power.
  • Although early versions of popular e-readers like the Kindle and the Nook did a pretty good job of replicating the tranquility of a simple page of text, it now seems likely that the page’s calm, and the immersive reading it encourages, will be broken as a book’s words are made to compete for a reader’s attention with a welter of on-screen tools, messaging systems and other eye-catching diversions. The very form of a book seems fated to change as the written word shifts to a new means of production and distribution.
  • Reading is more than the visual decoding of alphabetic symbols. It is a state of mind, a dream of life, and a book, if it is going to be a true book, needs to be more than a container of words; it needs to be a shield against busyness, a transport to elsewhere. Stevens put it simply: “The house was quiet because it had to be”.

10. Dr. Maryanne Wolf

Dr. Maryanne Wolf and Dr. Mirit Barzillai ask a set of questions that probe into the “future of reading”. With the onslaught of information overload and increasing distractions (Twitter, instant messaging, etc.), there are many possible ways “the reading brain” could shape up in the future; “shape up” is the right phrase, as we are not born to read or write. Unlike vision or language, reading has no genetic program that unfolds to create an ideal form of itself. Rather, learning to read lies outside the original repertoire of the human brain’s functions and requires a whole new circuit to be built afresh with each new reader. The brain changes itself by building a versatile “reading circuit” out of a rearrangement of its original structures, such as visual, conceptual and language areas. In this last essay, the authors ask many questions in this context: how will the next generation of kids build a “reading circuit”? Will Twitter and IM leave the brain’s circuitry devoid of “deep reading” and instead build something completely new, equipped to handle the kind of information overload we are seeing?

  • We may already have within our grasp the tools to conceptualize what the “new readers” of the twenty-first century need: a differently evolving reading circuit, one that connects the existing expert deep-reading skills to the evolving information-processing skills, in order to use the resources of the twenty-first-century external platforms of knowledge wisely and well. The task is to figure out how to get there. We will need many minds to plot this progression.


The book is a nice collection of essays about “reading” by ten authors; all of the essays have one common theme – make time to deep-read in your busy schedule.


In the introduction, the author tries to define HFT using the voices of leading HFT players. What are the characteristics of HFT? There is no agreement on a common definition, but there are some elements one can at least attribute to HFT:

  • Low latency trading.
  • High turnover strategy.
  • Trader goes home `flat’ with no open position.

The goal of any HFT firm, broadly, is to have a set of uncorrelated trading strategies that have statistically more winners than losers across all the trading positions of the day. The introduction mentions the fundamental market-driven factors that gave rise to HFT:

  • RegNMS.
  • Order Handling Rules leading to ECNs and hence a proliferation of venues.
  • Explosion of the universe of instruments; Human market maker is giving way to computerized market maker.
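The goal stated above – a book of uncorrelated strategies with statistically more winners than losers – rests on simple diversification arithmetic: with zero correlation, combining N strategies shrinks the daily P&L volatility by roughly √N while preserving the average edge. A minimal Monte Carlo sketch (the strategy count, edge and volatility numbers here are invented for illustration, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 250 trading days of daily P&L for 10 hypothetical strategies.
# Each has a small positive edge (mean 0.02, std 1.0) and is independent
# of the others -- the "uncorrelated" assumption from the text.
days, n_strategies = 250, 10
pnl = rng.normal(loc=0.02, scale=1.0, size=(days, n_strategies))

single = pnl[:, 0]            # one strategy on its own
combined = pnl.mean(axis=1)   # equal-weight portfolio of all ten

# With zero correlation, portfolio volatility shrinks by ~sqrt(N) while
# the mean edge is preserved, so winning days outnumber losing days more
# reliably at the portfolio level than for any single strategy.
print(f"single   std: {single.std():.3f}")
print(f"combined std: {combined.std():.3f}")
print(f"combined win rate: {(combined > 0).mean():.2%}")
```

The win rate of the combined book improves only because the volatility shrinks around the same small positive mean; no individual strategy has to get any better.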

The author ends his intro by clarifying terms such as program trading, quant trading, algo trading, automated trading, prop trading, stat arb and ultra-high-frequency trading. I found these terms better explained in Barry Johnson’s book, `Algorithmic Trading and DMA’.

The Emergence of High-Frequency Trading

This section lists the significant events from 1969 to 2007 that were a precursor to today’s HFT:

  • 1969 Feb – Instinet Launched.
  • 1971 Feb – NASDAQ starts electronic trading in 2500 OTC stocks.
  • 1975 May – SEC bans fixed commission rates.
  • 1976 May – NYSE introduces DOT.
  • 1984 July – NYSE Approves SuperDOT.
  • 1987 – Launch of ITG Posit.
  • 1994 August – BNP purchases Cooper Neff Options trading firm.
  • 1997 – Order Handling rules approved.
  • 1998 – Launch of Attain, NexTrade, Strike Technologies, Bloomberg tradebook.
  • 1999 April – Regulation ATS becomes effective.
  • 1999 July – Goldman buys Hull for $530M.
  • 2000 – BRUT/Strike merger, Archipelago and PSX electronic exchange launched.
  • 2001- Launch of Liquidnet, Launch of Direct plus, Archipelago/REDI Book merger.
  • 2001 April – Decimalization rule hits stock exchanges.
  • 2002 – Launch of NASDAQ Super montage, Instinet – Island merger, Tradebot trades 100 million shares a day for the first time.
  • 2004 – Nasdaq acquires BRUT, Launch of Pipeline.
  • 2005 – RegNMS comes to force, NASDAQ acquires INET, NYSE acquires Archipelago, Knight acquires Attain and renames to DirectEdge ECN, Citi acquires NexTrade.
  • 2006 – NYSE buys Euronext, CME-CBOT merger.
  • 2007 – Chi-X Europe launched by Instinet. It becomes the largest multilateral trading facility in Europe.

A timeline graphic of the above events would have made the chapter easier to read and recall.



The Path to Growth

This chapter traces the developments through the period 2007-2010 June.

  • 2007 April – General Atlantic invests $300M in GETCO.
  • 2007 May – NASDAQ announces its plans for acquiring Nordic Stock exchange.
  • 2007 July – Citi buys Automated Trading desk for $680M.
  • 2007 Oct – NASDAQ buys Boston Stock exchange.
  • 2007 Nov – MiFID goes live in Europe; NYSE announces its plans for a data center in Mahwah, NJ.
  • 2008 June – `Flow traders’ receives funding.
  • 2008 June – GS Sigma X and Credit Suisse ATS platform log in record volumes of 406M and 210M shares respectively.
  • 2008 Nov – BATS becomes an exchange, with speed and the maker-taker model as its USP. As of today, BATS logs 12% of daily US equity trading volume.
  • 2008 – Citadel’s Tactical Trading fund earns $1.19B in 2009 with a team of 55 people!
  • 2009 Mar – GSET launches Sigma X in Hong Kong. It soon becomes a big hit.
  • 2009 June – NASDAQ introduces two types of flash orders; BATS introduces flash orders.

High Frequency Trading Goes Mainstream

This chapter traces the developments during 2009–2011, the time frame when HFT entered the parlance of the media, journalists, politicians and economists. It all began with a New York Times article on HFT by Charles Duhigg on July 24, 2009. That was the first time HFT became known to a wider audience and, most importantly, it led to the false notion that flash orders were used to manipulate the market. Click here to read the full article published in the NY Times.

Soon after this article was published, NASDAQ and BATS stopped offering flash orders. Amidst the falling economy, HFT was making money, and this was met with harsh reactions from everyone. Meanwhile, BATS and DirectEdge were eating into the market share of NASDAQ and NYSE. To stay competitive, NASDAQ launched INET in late 2009, an HFT-enabled platform. NASDAQ also asked the SEC to look into the various data centers that were mushrooming in the country; these data centers were heavily used by hedge funds and HFT shops. Meanwhile, Chi-X was launched in Europe and Asia and was very successful in both venues. In Japan, it was launched under the name `Arrow Head’, and the latest numbers from Arrow Head – 100 million shares a day – show that it is a massive success. There is another market development mentioned here: the ban on naked access. Naked access goes under various names – `sponsored access’, `unfiltered access’, etc. All those names boil down to one type of access: allowing hedge funds to participate in the market using a broker’s market-participant ID. The broker does not put any controls on the orders and merely offers the platform for clients to trade with the broker’s participant ID. According to the Aite Group, 38% of equities volume came from naked access, so banning it was a big jolt to all the broker-dealers.

The chapter also features the success story of GETCO, an options trading firm started by two floor traders in 1999 that went on to become one of the biggest success stories from Chicago. There are some interesting facts in GETCO’s story: for instance, it hired skilled video gamers from the Illinois Institute of Technology for its trading team. Come to think of it, HFT is a kind of video game played with sophisticated technology. In Feb 2011, GETCO became a NYSE designated market maker alongside biggies like Goldman, the Kellogg Group, Bank of America and Barclays. The success of GETCO is also an indication of what a powerful financial and technology hotbed Chicago had become.

The Technology Race in HFT

Based on a few interviews with HFT shops, the author lists some basic requirements for an HFT shop to get going:

  • A reliable and sufficiently fast data input source.
  • A robust and sufficiently fast market access.
  • Low enough transaction, clearing and processing costs as well as sufficiently effective and efficient processing and verification capability.
  • Strategy monitoring tools.

If these things appear obvious, note that Kumiega and Ben Van Vliet of the Illinois Institute of Technology have developed a step-by-step methodology (using machine control theory) that addresses the needs of the institutional trading and hedge fund industries for the development, presentation and evaluation of high-speed trading and investment systems. Their methodology prescribes the following stages of development as a road map for running an HFT shop:

  • Before firms even start building, they should ask themselves how they are going to monitor a successful algo.
  • Build a customized database of historical data and purchase or build a tool for proper back testing of a strategy.
  • Data cleaning methods are a key component. Writing your own scrub methods is better, as it gives control over the data. Procure dirty data and see whether your system works.
  • When an algo goes out of range, let’s say 3–4 stdev, stop the strategy and fix it, just as a plant shuts off a machine that runs 3 stdev from normal.
  • Backtesting is critical. However there are certain aspects you can’t back test and be sure. Latency, Market data and timing are always going to be things that are difficult to check in backtesting.
  • Fully define the functionalities and performance requirements of the trading/investment system and develop a system. Don’t tradeoff robustness for the sake of speed.
  • Put in place a reporting mechanism to capture relevant metrics for the HFT strategies.
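The out-of-range shutoff in the list above is essentially a control chart from statistical process control applied to a strategy's P&L. A minimal sketch of the idea; the window size, warm-up length and `StrategyMonitor` name are my own invented choices, not from the book:

```python
from collections import deque
import statistics

class StrategyMonitor:
    """Shut a strategy off when its trade P&L drifts out of statistical
    range, the way a plant shuts off a machine running 3 stdev from
    normal. Simplified control-chart sketch; parameters are illustrative."""

    def __init__(self, window=50, n_stdev=3.0):
        self.pnl = deque(maxlen=window)  # rolling baseline of recent P&L
        self.n_stdev = n_stdev
        self.halted = False

    def record(self, trade_pnl):
        # Only test against the baseline once we have enough history.
        if len(self.pnl) >= 10:
            mean = statistics.fmean(self.pnl)
            sd = statistics.pstdev(self.pnl)
            if sd > 0 and abs(trade_pnl - mean) > self.n_stdev * sd:
                self.halted = True   # out of range: stop and investigate
        self.pnl.append(trade_pnl)
        return not self.halted       # False means "stop trading"

monitor = StrategyMonitor()
for x in [0.1, -0.2, 0.05, 0.0, 0.15, -0.1, 0.08, -0.05, 0.12, 0.02]:
    monitor.record(x)
monitor.record(50.0)   # wildly out of range -> monitor halts the strategy
print(monitor.halted)
```

In production the same check would typically run on order rates, fill ratios and latency, not just P&L, but the mechanism is identical.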

The chapter ends by saying that low latency is essential, but it is the strategy that needs to be robust. This is a very different view from what I heard from one of my friends last week. He is of the opinion that there are fewer than 10 strategies basically used in all the HFT shops, and that speed is the ONLY thing that matters. This book and the people mentioned in it say otherwise. Maybe both matter equally!

The Real Story Behind the “Flash Crash”

I found this chapter the most interesting in the entire book, as it goes behind the events that led to the NY Times article in July 2009. It was in June 2009 that NASDAQ and BATS introduced flash orders to compete against DirectEdge. DirectEdge had given its participants a special order type called Indication of Interest, which had helped it add 10% to its existing market share. NASDAQ’s tactic was thus to get the SEC to ban such orders across the entire US market, including CBOE, which had a roaring flash-order business. The SEC did not make regulatory changes as far as options were concerned, but it did elsewhere, and flash orders were subsequently removed from the market.

One typically associates the flash crash with some kind of HFT trading gone bad. But this chapter makes it very clear that HFT traders hate flash orders, as the only entities that profit from a flash order are the exchange and the party placing the order. In fact, reading this chapter makes one realize that the flash crash actually goes to show why HFT firms are needed in the first place, because most HFT firms shut off their computers during the flash crash. Since there was no liquidity from HFT firms, volatility increased so much that the Dow tanked 600 points and bounced back. Liquidity and volatility have an inverse relationship, and thus `HFT firms providing liquidity lessen volatility’ is the argument offered for the HFT case. When the SEC report investigating the flash crash came out, it was clear that a massive $4 billion market sell order in E-Mini contracts was the tipping point for the crash in an already nervous market. Why a trader would place such a big order as a market order and not a limit order is surprising. Somehow the press has been flogging HFT for the flash crash when, in fact, there were firms providing liquidity even during the crash. Rumor has it that a firm in Chicago made $100M in one day. Given that annual HFT profits are close to $2B–$3B, that’s a huge amount of money.

Will there be more flash-crash-type events in the future? Certainly, says the book, and it argues this has nothing to do with HFT strategies.

Life after the “Flash Crash”

This section describes the various events after the May 2010 flash crash:

  • May 12, 2010 – Talk of an HFT tax, as HFT firms were falsely blamed for the flash crash.
  • May 14, 2010 – The first piece of investigation revealed that it was a traditional money manager who placed a $4B sell trade without a price limit, triggering the crash.
  • May 24, 2010 – Talk of eliminating stub quotes.
  • May 26, 2010 – Goldman launches DMA in Latin American markets
  • June 4, 2010 – SGX announces the $250M `Reach Initiative’, the fastest platform in the world with a 90-microsecond door-to-door speed; NASDAQ’s door-to-door metric is 177 microseconds. The proposed SGX platform is 100 times faster than its existing infrastructure.
  • June 10, 2010 – SEC approves rules that require exchanges to pause trading in a specific stock if its price moves by 10% or more within a 5-minute interval.
  • June 11, 2010 – GETCO hires Elizabeth King from the SEC. This practice of hiring personnel from regulatory bodies is known as the `revolving door’.
  • June 22, 2010 – GETCO launches its dark pool
  • July 21 2010 – DirectEdge launches two platforms, one for blackbox stat arb guys and second for passive agency order guys
  • Aug 24 2010 – Chi-X receives a takeover enquiry from BATS. Chi-X becomes the second largest trading venue after London Stock exchange in Europe
  • Sep 1, 2010 – BATS and NASDAQ stop flash orders. DirectEdge continues with Indication of Interest orders, a flash order in disguise.

From the flash crash to the Waddell & Reed trade that started it all, from the discussion about stub quotes to circuit breakers, from the launch of new exchanges in the United States to GETCO’s dark pool in Europe, by November 2010 high-frequency trading was on the minds of everyone in the financial world.
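The June 10 single-stock pause rule in the timeline above can be sketched as a rolling-window check over recent prices. This is a simplified illustration of the 10%-in-5-minutes trigger, not the SEC's actual rule text (which also carves out opening/closing periods and price bands):

```python
from collections import deque

def make_circuit_breaker(threshold=0.10, window_secs=300):
    """Halt a stock if its price moves >= `threshold` relative to any
    price printed in the last `window_secs` seconds. A sketch of the
    single-stock circuit-breaker idea, not the regulation itself."""
    history = deque()  # (timestamp, price) pairs within the window

    def on_tick(ts, price):
        # Drop prints that have aged out of the 5-minute window.
        while history and ts - history[0][0] > window_secs:
            history.popleft()
        halted = any(abs(price - p) / p >= threshold for _, p in history)
        history.append((ts, price))
        return halted   # True means the exchange should pause the stock

    return on_tick

tick = make_circuit_breaker()
print(tick(0, 100.0))    # no history yet: not halted
print(tick(60, 95.0))    # 5% move, within bounds: not halted
print(tick(120, 89.0))   # 11% down from the 100.0 print: halted
```

The deque keeps the check O(window) per tick; a production matching engine would use precomputed price bands instead, but the triggering logic is the same comparison.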

The Future of HFT

This section lists the events that happened in late 2010.

  • Oct 21, 2010 – Chi-X Global announces that it will start operating in Australia by Mar 2011
  • Oct 21, 2010 – SGX makes GETCO a trading member in its securities operations
  • Oct 25, 2010 – SGX makes an $8.3B takeover offer for the operator of the Australian bourse.
  • Nov 3, 2010 – SEC votes to bar brokers from granting HFT firms unfiltered access to an exchange, a move aimed at imposing safeguards meant to prevent bad trades.
  • Nov 8, 2010 – SEC decides to ban `stub quotes’
  • Nov 11, 2010 – Chi-East, a joint venture between Chi-X Global and SGX, was launched, becoming the first pan-Asian independent dark pool to be backed by a regional exchange. Chi-East is widely regarded as a landmark in the development of Asian equities
  • Nov 24, 2010 – DirectEdge announces that it is changing the way it handles flash orders and that it will slowly phase them out
  • Dec 1, 2010 – Equiduct Systems Ltd announced that it traded more than 1 billion euros in November 2010 for the first time, as the ATS aimed at retail brokers gathered momentum. Equiduct is an MTF operated by the Berlin bourse.
  • Dec 6, 2010 – ASX, which was selling itself to SGX, said that the decision to merge was in the best interests of Australia’s market microstructure development. The combined SGX–ASX entity would list companies worth $1.9T, fourth in Asia behind Tokyo, Hong Kong, and Shanghai.

The book is interspersed with interviews with six high frequency traders.

Meet the Speed Trader : John Netto


In this chapter, John Netto, a high frequency trader, talks about his trading experiences and HFT in general. Here are some of the points he makes:

  • Creating the HFT strategy was the easy part. Questions such as how to raise capital and how to handle compliance, regulatory reporting, and infrastructure development are not asked at first, but they become a critical part of the profitability equation
  • Low latency is very important for HFT but the degree of importance varies based on strategy.
  • In terms of infra, I would say buy rather than build, so that you can focus on strategies to make money.
  • Sophisticated systems can be thought of as assistants to traders, not replacements for them; after all, strategy, structure, and people are key to success.
  • HFT is going to get bigger, stronger and more prevalent. More traditional investment managers will explore HFT in the times to come.
  • Always have a few strategies that, at least on paper, make money.
  • I work out daily. Fitness and being active are a very important part of success. Trading requires you to be mentally alert, and hence you have to be fit.

Meet the Speed Trader : Aaron Lebovitz


Aaron Lebovitz spent two decades in the industry and then, in 2003, started his own firm, Infinium Capital Management, a prop shop, and built it into an exceptional force in the industry. Here are some of the points he makes:

  • Hook up with a firm that has a proven track record, and make sure you know how to write code.
  • Integrating formal models into trading decisions is the kicker you get from applying quant methods.
  • Started off with pairs as his first high-freq strategy.
  • His mantra is, `Developing new markets, increasing global market liquidity, and driving efficient price discovery’
  • Infinium trades 23.5 hours a day, 6 days a week. It did shut down its machines on the day of the flash crash
  • It is not all about speed. Having subtle or good strategies will get you alpha.
  • The future of HFT is still very much in doubt, hinging on an uncertain regulatory environment.
  • I don’t foresee traditional investment managers shifting their focus to becoming market makers or developing statistical arbitrage strategies

Meet the Speed Trader : Peter van Kleef


Peter van Kleef has spent the last 15 years running automated trading operations. Here are some of the points he makes:

  • It never really crossed my mind why someone would want to trade any other way than HFT. It is just so much more efficient
  • Start writing your own trading strategies and see whether they make money on paper. Think about fancy infra later.
  • Trying to apply solutions from other fields of science and research is also a start when exploring HFT.
  • Whenever possible, I try to balance work with some sort of physical exercise. It is a MUST for being a better trader.
  • Low latency is important but is not the be-all and end-all.
  • It is time for people to stop competing for speed and start producing alpha.
  • A lot of the profits that are sustainable usually will come from the strategy and not necessarily from technology.
  • Leverage is the key driver because returns are quite low.
  • The overall profit largely depends on the number of trades that can be done, because the average profit per trade is nearly certain. I don’t know many other strategies that, if run well, consistently return about 50% per year and above. We are not in the business to rate ourselves; high-frequency is very profitable, if done right. That much is sure.
  • Sophisticated retail investors are already running HFT strategies.
  • The HFT of yesteryear is becoming a commodity, as you can now buy all the infra that is needed.
  • Part of what will determine the future of the industry is what regulatory changes legislators and the SEC will introduce.

Meet the Speed Trader : Adam Afshar


Adam Afshar’s views on HFT are a little different from those of the traders covered earlier. He is of the opinion that low latency is extremely critical. Here are some of the points he makes:

  • HFT is a method or a tool. It is not a strategy. HFT methods are applied to three types of strategies – Market Making, Stat arb and Algorithmic execution.
  • My HFT shop uses models with absolutely no human intervention in the execution. I believe this is the way it should be: human order execution does not let you collect data, so you can’t back-test. If you use an algo for execution, you can collect all types of data and then improve it. Your reliance on specific individuals is also reduced
  • High-speed data management is the linchpin of a successful HFT shop.
  • You can’t avoid the latency issue. It’s a prerequisite for whatever you do in the HFT world.
  • The biggest revenue-generating idea in my shop is one that scours 110 million news items and takes positions intra-day
  • HFT popularity will slowly fade away. It will soon become a commodity once tech makes the landscape flat.
  • My advice to students is to take science and liberal arts subjects, as they teach you how to think. You can figure out what you want to do later.

Meet the Speed Trader : Stuart Theakston


Stuart Theakston runs GLC, an HFT firm. Here are some of the points he makes:

  • High frequency traders simply replace specialist/jobbers in providing liquidity in a much more competitive framework.
  • HFT traders are not getting a free lunch by installing heavy-duty infra. In fact, all they are doing is eating each other’s lunches.
  • The exchanges handled the flash crash in an unfortunate way: some of the trades were canceled. This gives HFT traders all the more reason to stay away from the market during a crisis.
  • I spend 70% of my time with quants, traders, and programmers and 30% watching the market and doing research myself. I read a lot of academic papers and journals, for they provide a valuable source of ideas.
  • Most people in the media, political circles, and policy-making are clueless about the function of HFT. Reducing volatility demands liquidity, and HFT firms play a crucial role here. Policy makers have the potential to kill the entire HFT crowd if they don’t understand the issues properly.
  • Order-anticipation and cross-venue arb are sensitive to latency, liquidity provision is less so.
  • Prop trading firms can spend 99 cents on infra to make 1 dollar. Hedge funds, given their fee structure, can only spend about 19 cents of that dollar. So these participants cannot compete in the same space
  • HFT is nearing the bottom in terms of the competition to be the fastest. The opportunities lie outside developed-market equities, in asset classes like forex and exchange-traded CDS.
  • Suppliers will spring up who try to sell HFT trading in a box to the masses. But the retail users won’t make any money, at least on a properly risk-adjusted basis
  • Most of the development in HFT has already happened in developed-market equity and equity derivative markets, and these are approaching maturity. These markets are likely to be the domain of increasingly specialist outfits. The low-hanging fruit has been picked. The next wave is likely to be in emerging markets and esoteric ETFs. This is where people developing their careers in this space should focus.

Meet the Speed Trader : Manoj Narang


Manoj Narang started his firm in 1999 as a provider of financial toolkits. Over the last few years, he has transformed it into a profitable HFT firm. Here are some of the points he makes:

  • It’s the law of large numbers at play in HFT. If you have a strategy that works 55 percent of the time, you can’t do just a few trades. You have to do a ton of trading to translate that edge into sustainable profits.
  • We generally are buyers of hardware and builders of software and systems.
  • We have never had a losing week since we started in the business.
  • A successful high frequency strategy will have a Sharpe ratio higher than 4, and a successful HFT operation that runs multiple strategies will generally have a double-digit Sharpe ratio. Such high Sharpe ratios are unheard of in traditional investment shops.
  • A back-of-the-envelope calculation shows that the profitability of the entire HFT industry is around $2B. Not much in relative terms. There are many other corners of the financial industry, such as derivatives trading, that generate hundreds of times this amount of profit in a year.
  • The main use of outside capital in the world of HFT is to fund research and development and operations, not to actually trade. Very little capital is required for trading purposes.
  • Once you are an electronic market maker, it is skills that become important and not status or power or connections.
  • The market making function in equities is decentralized. That doesn’t mean everyone should be a market maker. Just because everyone needs food doesn’t mean everyone becomes a grocer. Similarly, individuals should seek to be investors, not liquidity providers.
  • Wall street has little to do with HFT.
  • I think the main risk in the financial markets is that capital is getting increasingly concentrated, which makes herd-like behavior even more prevalent. The frequency with which bubbles inflate and explode is getting more and more rapid as a result, because massively capitalized investors jump from one asset class to another, leaving devastation in their wake
  • HFT does not connote systemic risk. The only systemic risks to the market are the ones posed by herd-like behavior.
  • Tradeworx started HFT in 2009, and this 50-member shop accounts for 3% of the overall volume in the SPDR ETF.
  • One must keep in mind that modeling systemic correlations is VASTLY different from modeling structural correlations.
  • How do I see our fund in five to ten years? I have no idea; five or ten years are an eternity when you are immersed in a world where microseconds matter.
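Narang’s first point about a 55% edge and his claim about double-digit Sharpe ratios are two sides of the same coin: with independent bets, the Sharpe ratio grows roughly with the square root of the trade count. A small simulation illustrates this (the win rate, payoff, and trade counts are illustrative assumptions, not figures from the book):

```python
import random
import statistics

random.seed(42)

def simulate_day(n_trades, win_rate=0.55, payoff=1.0):
    """Total P&L of one day of independent unit-payoff bets with a 55% edge."""
    return sum(payoff if random.random() < win_rate else -payoff
               for _ in range(n_trades))

# Compare a year (252 days) of trading at 10 trades/day vs 10,000 trades/day.
results = {}
for n in (10, 10_000):
    daily = [simulate_day(n) for _ in range(252)]
    mean, sd = statistics.mean(daily), statistics.stdev(daily)
    results[n] = (mean / sd) * (252 ** 0.5)   # annualized Sharpe ratio
    print(f"{n:>6} trades/day -> annualized Sharpe {results[n]:6.1f}")
```

The same edge that is barely distinguishable from noise at 10 trades a day produces an absurdly smooth equity curve at 10,000 trades a day, which is exactly why an HFT shop can go years without a losing week.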

Takeaway :

If you take all the events that have happened in the HFT world, list them in chronological order, and spice it up with some HFT trader stories and viewpoints, what you get is this book.


The book starts off by discussing the ‘What’ and ‘Who’ aspects of high frequency trading systems.

What is High Frequency Trading :

With the advent of electronic trading, the human market maker has been replaced by the electronic market maker: the back end is a programmed strategy that does the market making. HFT’s DNA comprises two elements. The first is the trading strategy, typically liquidity provision or intra-day arb extraction. The second is the holding time, usually a fraction of a second.

Who are High Frequency traders:

HFT firms can typically be categorized into four types 

  • Regulated Market Makers : Leading wholesalers like Citadel and Knight fall in this category. These firms have employed technology to enhance the overall operations of a traditional market making role
  • Statistical Arbitrage Hedge funds : Statarb typically connotes a strategy with a holding period of a few days. But in recent years, variants of statarb techniques are being employed intra-day, for example, pairs. When pairs trading was first used at Morgan Stanley, the holding period was a few days from the entry date. Now the same co-integration based strategies are employed on intra-day data. The fact that they are being used does not mean there is definite alpha in them. Intra-day data is characterized by far more noise than inter-day data, so there is always a risk of coming up with nonsense estimates.
  • Low-Latency brokers : These firms are the Lime Brokerages of the world whose strength is technology. Using very strong technology infra, they deploy strategies to do incredible volumes on the stock exchange
  • Clearing firms catering to HFT needs
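The intra-day pairs idea mentioned above can be sketched in a few lines: watch the spread between two co-moving instruments and flag an entry when its z-score against a rolling window gets stretched. This is a toy illustration on synthetic prices, not the book’s method, and a real implementation would estimate a hedge ratio via a co-integration test rather than using the raw price difference:

```python
import random
import statistics

def pairs_signals(prices_a, prices_b, window=20, entry_z=2.0):
    """Flag entries when the spread's z-score vs. a rolling window is stretched."""
    spread = [a - b for a, b in zip(prices_a, prices_b)]
    signals = []
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        mu, sd = statistics.mean(hist), statistics.stdev(hist)
        if sd == 0:
            continue
        z = (spread[t] - mu) / sd
        if z > entry_z:
            signals.append((t, "short_spread"))   # sell A, buy B
        elif z < -entry_z:
            signals.append((t, "long_spread"))    # buy A, sell B
    return signals

# Synthetic co-moving pair with one dislocation injected at t = 50.
random.seed(0)
prices_b = [100.0 + 0.1 * t for t in range(60)]
prices_a = [b + random.gauss(0, 0.05) + (1.0 if t == 50 else 0.0)
            for t, b in enumerate(prices_b)]
print(pairs_signals(prices_a, prices_b))
```

The point about noisy intra-day data applies directly here: with a short window, the rolling standard deviation is itself a noisy estimate, so spurious signals are inevitable.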

The common attributes among the above firms are a focus on low-latency, resilient, scalable technology, and a trading strategy that is their bread and butter and that they forever keep tweaking. Lastly, they all keep a low profile.

The chapter goes on to list the impact of HFT firms on the markets. 

  • Tech has become the critical component of HFT strategy. Hence there have been a lot of successful startups providing very niche services.
  • Post-HFT, spreads have tightened
  • 60% of the volume on NYSE is HFT driven
  • Expansion of the maker-taker model : Island is the example here; its rebate schedule has become the de facto pricing model of today’s US equities market. By offering a rebate for providing liquidity and charging for taking liquidity, Island created an ideal transaction model to attract HFT firms.
  • Active investor in market structure : Most trading venues have attracted investments from HFT firms, thus improving the market structure
  • HFT has moved beyond equities in to options, futures , FX and other assets
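The maker-taker economics mentioned above can be made concrete with a back-of-the-envelope calculation. The rebate and fee levels below are illustrative assumptions; actual exchange fee schedules vary and change over time:

```python
# Illustrative maker-taker economics: the venue pays a rebate for posting
# (making) liquidity and charges a fee for removing (taking) it.
MAKER_REBATE = 0.0020   # $ per share credited on a passive fill (assumed)
TAKER_FEE    = 0.0030   # $ per share charged on an aggressive fill (assumed)

def venue_revenue(shares_made, shares_taken):
    """Venue's net take: taker fees collected minus maker rebates paid out."""
    return shares_taken * TAKER_FEE - shares_made * MAKER_REBATE

def maker_pnl(shares, spread_captured=0.01):
    """A market maker earns the spread plus a rebate on each passive leg."""
    return shares * (spread_captured + 2 * MAKER_REBATE)

# A day where 1M shares change hands: every take has a matching make.
print(f"venue keeps  ${venue_revenue(1_000_000, 1_000_000):,.0f}")
print(f"maker earns  ${maker_pnl(1_000_000):,.0f} gross on a 1M-share round trip")
```

At these assumed levels the rebate is a meaningful fraction of a penny-wide spread, which is why the model was so effective at attracting passive HFT flow.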

So, whether someone likes it or not, HFT is a reality that is not going to go away. Without HFT, stock exchange transactions would look completely different.

Market Structure

This section talks about the evolution of the market structure since the Order Handling Rules were introduced in 1997. The following visual best summarizes the evolution:

1997 was a landmark year in market structure evolution, as ECNs came into existence. They were the main outlet for unwanted limit orders from market makers. Large buy-side firms became attracted to ECNs because of their ability to execute orders anonymously and to minimize market impact.

ECNs followed two main business models.


The first category comprised best-execution-centric ECNs, which were efficient order routers; the second was market-centric ECNs, which matched orders internally and routed them out only if they failed to match. Slowly, best-execution-centric ECNs lost their uniqueness to DMA. The Instinet and Island ECNs became popular, with the latter taking the larger market share. Instinet acquired Island in 2002, and by later acquiring the combined INET, NASDAQ became a big player in the ECN business. There was also a spate of mergers and acquisitions between 1997 and 2006, and by the end of 2006 all the ECNs had been consolidated into NYSE or NASDAQ.

The following visual summarizes the M&A activity in this space :


Then came Reg NMS. The future prospects of regional exchanges looked quite bleak, but in recent years they have made a great comeback. The chapter then goes on to cover one of the most interesting execution avenues in the US, dark pools, which currently number about 40 and account for approximately 13% of the US equities market. The types and nature of dark pools are covered in this section. One of the reasons for the rise of dark pools is hedge funds.

The various types of dark pools described are 

  • Block Trading Dark pools
  • Agency Dark Pools
  • Consortium Dark Pools
  • Exchange Dark Pools
  • Internalization Dark Pools

The estimated market share of dark pools as of Q2 2009 is shown below


Trading Infrastructure

With the rapid adoption of electronic and algorithmic trading, messaging volume has risen exponentially. Reg NMS and market fragmentation have also given rise to a tremendous amount of TAQ data in recent years. On the equities side, 1.8 billion messages per day is becoming the norm, and in options data the volumes are mind-boggling, at 2.5 million messages per second. The following illustrations give an idea of the same





What are the key components of a high frequency trading infrastructure? It helps to have the big picture in view and see where the trading infra components fit


Feed Handler :
The book says that most of the HFT firms write their own feed handlers and that it takes about 2 to 3 full-time dedicated engineers to take care of this task.

Ticker Plant :
Tier-one firms pay around 7 million USD per year in license and maintenance fees for these tasks. Add the FTEs needed to support and maintain the ticker plant, and the cost easily reaches 10 million USD. So a firm needs to generate that much money in trading just to support this high-cost operation.

Messaging Middle ware :
This is another big area where technology plays a crucial role. Given the various messages that flow between the entities in trading, low latency messaging infrastructure is key for trading algos.

Storage :
Depending on the size of the operation, firms pay anywhere between $10,000 and $2 million for storage.

Colocation :
This is the first thing that any HFT setup wants in place. It costs around $1,500 to $10,000 per month; in INR terms, about 9 to 60 lakh per year.

Sponsored Access :
Sponsored access accounts for 50% of the overall daily trading volume in the US equities market: 38% of it is unfiltered sponsored access and 12% is filtered. The former involves no pre-trade risk management from the broker’s end, whereas the latter involves some risk management checks. The biggest advantage of sponsored access is the low latency a firm gets. For DMA, latency is about 4 to 8 milliseconds, whereas for co-located unfiltered/filtered access it is 300/650 microseconds. Thus sponsored access is a big business for all the brokers in the US. The following illustration gives an idea of the latency for sponsored access trading.


This section describes the flash crash that occurred on May 6, 2010. Firstly, what’s a flash order?


Given the above context, the section goes on to describe the flash crash from a 10,000 ft view.

Trading Strategies

Trading strategies typically fall into the following categories 

  • Market Making
  • Statistical arbitrage
  • Momentum trading
  • Basis trading
  • News based trading

A few execution algorithms such as VWAP, TWAP, Pegging, and Arrival Price are mentioned, followed by a discussion of order types. Obviously, at this point it is clear that the content is better suited to a report than a book. In fact, there is a section where the authors refer to the content in the context of a report. So, clearly this book was some sort of report floated at Aite Group that was then converted into book format.
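Of the execution algorithms mentioned, TWAP is the simplest to sketch: split a parent order into equal child slices spaced evenly over the trading horizon. A minimal illustration of the scheduling logic only, with no exchange connectivity:

```python
from datetime import datetime, timedelta

def twap_schedule(total_qty, start, end, n_slices):
    """Split a parent order into equal child orders spaced evenly in time."""
    step = (end - start) / n_slices
    base, remainder = divmod(total_qty, n_slices)
    schedule = []
    for i in range(n_slices):
        qty = base + (1 if i < remainder else 0)   # spread the leftover shares
        schedule.append((start + i * step, qty))
    return schedule

# 10,000 shares over a full US session (9:30 to 16:00), one slice per 30 min.
slices = twap_schedule(10_000, datetime(2010, 5, 6, 9, 30),
                       datetime(2010, 5, 6, 16, 0), n_slices=13)
for when, qty in slices[:3]:
    print(when.strftime("%H:%M"), qty)
```

VWAP differs only in the weighting: instead of equal slices, child quantities are sized in proportion to a historical intra-day volume profile.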


Expansion in HFT

This is a nice section of the book that gives a bird’s eye view of HFT developments in various places across the world.

US Markets

This section lists a lot of metrics that give a sense of HFT development in various asset classes

  • Futures 
    • 25% of all the futures volume is derived from professional high frequency traders
    • 50% of all orders generated in the futures market were algo model based
  • Fixed income 
    • 35% to 40% of the `on-the-run’ treasuries volume is generated by HFT firms
  • FX 
    • The industry averaged approximately $4.3 trillion in daily trading volume in 2008 compared to about $4 trillion in 2007.
    • At the end of 2009, electronic trading accounted for approximately 65% of all FX trading
    • In 2001, the global FX market averaged slightly more than 200,000 trades daily. At the end of 2009, the average daily figure reached more than 1 million trades.
    • Another good indicator of high frequency trading’s impact in the marketplace is the overall trend in average trade size. In addition to the explosive growth in average daily trade number, the FX market has also seen a decline in average trade size. In 2005, the average trade size for spot FX stood at close to $4 million. By the end of 2009, the average trade size had shrunk to US$1.4 million.
    • FX high frequency trading is poised to grow quite rapidly over the next few years, as the first-generation high frequency trading firms are joined by an influx of next-generation equity and futures high frequency trading firms looking to capture uncorrelated alpha in FX. In addition, new high frequency trading firms have emerged in recent months, formed by FX quants and traders who have left large banks looking to capture new opportunities on the other side of the market. At the end of 2009, high frequency trading accounted for approximately 25% of overall trade volume
  • Options – The following are the big eight option exchanges in the US


In options, HFT makes sense by default, as making markets manually across various strikes and expirations is a nightmare. The various exchanges where options are traded have adopted different models to attract liquidity providers: some use the maker/taker model, while others have their own customized rules for attracting market makers.

European Markets

Like Reg NMS in the US, there has been a trigger for rapid market structure changes in the European markets: the Markets in Financial Instruments Directive (MiFID). MiFID is by far the most ambitious regulatory initiative within the European financial services industry. At the highest level, MiFID is designed to achieve the following goals: 

  • Provide pan-European harmonization in order to promote investor protection and the leveling of competition across borders.
  • Improve market transparency.
  • Create an environment for greater market competition for trade execution.
  • Create a pan-European mandate to uphold best execution obligation

The best execution burden is on the firm. This means the firm has to store historical data to be ready to prove that its execution complied with the regulation, which means opportunity for trading infra providers. The bypassing of concentration rules that is part of MiFID has also given rise to alternative trading venues. Over the last two and a half years, multiple venues have emerged from the dust of MiFID, including MTFs and dark pools. These alternative venues are expected to account for more than 30% of pan-European equities trade volume by the end of 2010. 




Brazilian Markets
As recently as the 1990s there were close to 20 different markets. Now there is a single exchange, Bovespa. Electronic trading in the futures market started in July 2009, and volumes have picked up considerably

Asian Markets

There are regulatory, IT, business and cultural obstacles that hamper the overall adoption of impending market structure changes. Unlike the US and European counterparts, there is no single pan-Asian capital market. Each major financial center has its own set of regulations and infrastructure, which makes it very tough for smaller players to build a significant presence in Asia. 

  • ATS – In general most Asian countries discourage off-exchange transactions. This means that ATS will take a long time to develop. Based on regulatory status and changes in market structure and exchange technology upgrades, Japan appears to be the most ready for first generation ATS adoption. This potential has been boosted by the recent launch of Arrowhead, the Tokyo Stock Exchange’s (TSE’s) next generation trading and market data platform. Unlike other Asian markets, Japan’s proprietary trading system (PTS) activity has increased over the last few months.



  • Lack of IT knowledge on the buy-side adds complexity to driving the adoption of advanced trade execution tools, which includes direct market access (DMA), algorithms, and ATSs.
  • Depending on the country, clearing and settlement is another area of concern. In Hong Kong, Australia, and Singapore, the primary exchanges own the clearing and settlement organizations.
  • While independent ATSs struggle to begin operations faced with inflexible regulations, most brokers/dealers have operated their own internal crossing platforms for a couple of years now, mainly focused on the Japanese and Hong Kong markets. As the markets become more electronic and major Asian market centers begin to fragment, these internal crossing engines will enable brokers/dealers to provide a wide array of liquidity services to their clients.
    The following gives an idea of various crossing platforms in Asia.




  • HFT adoption in Asia-Pacific


The overall adoption of high frequency trading in Asia is expected to lag behind Europe by a significant margin, driven by the lack of IT infrastructure, complexity in regulation, and lack of an attractive market microstructure. However, the presence of high frequency trading firms in most of the major Asian markets confirms that given the right mixture of conditions, the penetration of high frequency trading flow could be significant and quite rapid

Positives and Possibilities

A complete managed solution includes the following components 

  • Platform
  • Colocation
  • Network connectivity
  • Market data
  • Order management
  • Risk management
  • Algo chassis
  • Compliance reporting
  • Latency monitoring

Aite Group claims to have interviewed 40-odd HFT firms, and the following visual summarizes its findings on the key elements of an HFT managed solution


Close to 50% of the technology used in HFT firms is built in-house. The common themes of this in-house development are

  • Colocation : Key requirements include colocation for new markets and asset classes, data center consolidation, and moving additional strategies into existing colocation. Further, firms were waiting for new colocation opportunities to come online from exchanges and other liquidity providers.
  • Market data : Key requirements include acquiring more direct feeds, new data to support expansion, and building historical data repositories for back-testing strategies.
  • Risk management : Speed is the key requirement for risk management. How fast can trades make it through the required checks, and can the firm make it faster? Further, many firms are watching regulatory discussions and preparing to incorporate additional risk management requirements.
  • Geographic and asset class expansion : This bucket largely overlaps the other items in this list; the additional efforts primarily focus on changing existing systems internally to support new data types, new fields, and new connectivity methods. It is probably the largest bucket of outsourcing opportunity and interest.
  • Network connectivity : Key requirements in this area include upgrades to routers, the addition of network monitoring tools, and fiber upgrades.
  • General Latency: Key requirements include hardware upgrades, exploring hardware acceleration, engineering code for multi-core, and squeezing latency out of existing code.

On the quant analysis infrastructure side, databases, especially high performance databases (HPDBs), play a central role in quantitative trading.


Two types of dbs are used. In the first, all the historical tick data is stored, and the key is efficient retrieval of data for back-testing. In the second, the data is held in memory and used for active analysis. A third area that is mushrooming is complex event processing. In the context of dbs, there are lots of decisions to be taken. Should the db be outsourced? Will the current db infra be sufficient to expand beyond the traditional asset classes? The authors seem to say that an HFT firm’s success or failure depends critically on the decisions taken about databases
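A toy version of the second kind of database, with ticks held in memory and keyed for fast range retrieval during active analysis, might look like the following. This is a sketch of the idea only, not any particular HPDB product:

```python
import bisect
from collections import defaultdict

class TickStore:
    """In-memory tick store: per-symbol, timestamp-sorted, binary-searched."""

    def __init__(self):
        self._ticks = defaultdict(list)   # symbol -> [(ts, price, size), ...]

    def insert(self, symbol, ts, price, size):
        # Ticks usually arrive in time order, so append is O(1); fall back
        # to a sorted insert for the occasional out-of-order tick.
        ticks = self._ticks[symbol]
        if ticks and ticks[-1][0] > ts:
            bisect.insort(ticks, (ts, price, size))
        else:
            ticks.append((ts, price, size))

    def range_query(self, symbol, t0, t1):
        """All ticks for symbol with t0 <= ts < t1, found via binary search."""
        ticks = self._ticks[symbol]
        lo = bisect.bisect_left(ticks, (t0,))
        hi = bisect.bisect_left(ticks, (t1,))
        return ticks[lo:hi]

store = TickStore()
for ts, px in [(1, 100.0), (2, 100.1), (5, 100.2), (3, 100.05)]:
    store.insert("SPY", ts, px, 100)
print(store.range_query("SPY", 2, 5))
```

The historical/back-testing store differs mainly in scale: the same sorted-by-time layout, but persisted in compressed columnar files so that a date range can be scanned sequentially off disk.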

The chapter talks about smart order routing (SOR) at length. SOR is predominantly used by broker/dealers. The following illustration gives an estimate of the use of SOR by various entities

Market fragmentation and global trading are driving SOR adoption. Of the areas driving growth, Europe is leading the interest in adoption. The following visual gives an estimate of the growth potential of SOR.


What are the business drivers propelling SOR growth ?

  • Dark Liquidity Access
  • Liquidity Access in a New Geography.
  • Customized Trading Algorithms.
  • Small-Firm SOR Adoption
  • New Asset Classes.
  • Compliance.

I came to know about XBRL from this book, and that it has seen enthusiastic adoption in Japan. In April 2005, the Securities and Exchange Commission announced an initiative to move financial reporting to an electronic filing system. In the electronic filing process, data is published using eXtensible Business Reporting Language (XBRL) to segregate financial information into structured eXtensible Markup Language (XML) documents that can be read by XBRL document readers and machines. This is probably why there are hedge funds whose main strategies are news driven.
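Since XBRL instance documents are plain XML, pulling a reported figure out of a filing is mechanical, which is what makes the format attractive to news-driven strategies. A minimal illustration on a made-up fragment (the element names below merely imitate the us-gaap taxonomy style and are not copied from a real filing):

```python
import xml.etree.ElementTree as ET

# A made-up XBRL-style fragment; real filings use the full us-gaap taxonomy.
doc = """\
<xbrl xmlns:us-gaap="http://fasb.org/us-gaap/2009-01-31">
  <us-gaap:Revenues contextRef="FY2009" unitRef="USD">12500000</us-gaap:Revenues>
  <us-gaap:NetIncomeLoss contextRef="FY2009" unitRef="USD">1300000</us-gaap:NetIncomeLoss>
</xbrl>
"""

root = ET.fromstring(doc)
facts = {}
for elem in root:
    local_name = elem.tag.split("}", 1)[1]   # strip the "{namespace-uri}" prefix
    facts[local_name] = float(elem.text)
print(facts)
```

Because each fact is tagged with a standard concept name, a machine can map "Revenues" across thousands of filers without parsing free-form prose, which is the whole point of the SEC initiative described above.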

The authors predict that once low latency solutions are available to anyone, anywhere, there will be a ton of trading startups in a lot of countries.

The chapter ends with these words

Colocation is available. There are managed trading platforms and sponsored access. It would seem that anyone with technical acumen, an understanding of the markets, and some statistical analysis skill could build a strategy and start trading in a low latency environment. Sure, capital is a barrier to entry, but there are firms out there willing to fund people with a good strategy. People in the Ukraine, India, the Philippines, Malaysia, etc. will figure out how to turn their technical acumen and market knowledge into a profitable strategy.

Credit Crisis of 2008 : The Blame Game

The book ends with a brief description of the various regulatory checks put in place post the subprime crisis. A lot of regulation covering hedge funds, prop trading shops, derivative instruments, OTC securities, rating agencies, etc. has been drawn up and will be imposed on Wall Street firms. Hopefully this regulation will make the markets more stable in the times to come. But you never know; the next LTCM might happen in a few minutes’ time.

Takeaway :

Even though the contents are published in book form, this is more of a report on HFT. It traces the historical evolution of stock exchanges in the US, Europe, and some Asian countries to their current reality: high frequency trading.


The title is a marketing ploy, so that someone struggling to write, be it a graduate student, a researcher, or a professor, buys the book thinking they will get some magical advice from it. The author himself admits towards the end of the book that the title is not exactly what the book is about. He says the actual title should have been “How to Write More Productively During the Normal Workweek with Less Anxiety and Guilt”. 

This book is targeted at people in academia who need to publish their research, write research grants, write books, etc. It isn’t about cranking out fluff, publishing second-rate material for the sake of amassing publications, or turning a crisp article into a long-winded exposition. The author makes it clear that the book is not targeted at creative or artistic writers. He jokingly remarks, “The subtlety of your Analysis of Variance (ANOVA) will not move readers to tears, although the tediousness of it might”

The author’s suggestions can, I guess, be applied to any domain and any activity. Personally, I could relate them to programming: it’s better to be a “disciplined/scheduled programmer” than a “binge programmer”. I came across the term “binge writers” in this book. It’s a nice way to describe those who wait for a break, a 3-day weekend, a vacation, a sabbatical, etc. to write down their inventory of thoughts. Sadly, when a break does come by, they tend to accomplish less than they had expected. I find this aspect particularly relevant to programming.

Though there are various aspects mentioned in the book, there are some core ideas around which the content is organized. They are as follows:

  • There are some specious barriers to writing
    • I can’t FIND time
      • As though we run through our weekly/monthly planner and figure out that we have absolutely no time to spare. Crap… The author suggests that you need to 'ALLOT' time, not 'FIND' time. Make a schedule and stick to it.
    • I need to do more analysis
      • Binge writers tend to be binge readers and binge statisticians, meaning binge writers more often than not tend to do not-so-great analysis too.
      • Write up whatever you have analyzed, whatever partial work you have done, at frequent intervals. It will only help you do better analysis/research.
    • I need a new desk/new computer/new ___( fill in the blank with whatever fancy gadget that you need) / new printer / a study room to start my writing
      • All of them are mere excuses and the person who gives the above reasons also knows that they are mere excuses.
    • Let me wait for my inspiration. Will write when I feel like
      • Another barrier which crumbles under close examination. There are tons of experiments showing that "scheduled writing" always trumps "writing whenever you feel like it". A schedule automatically brings in creativity sooner or later.
  • Set specific goals, prioritize and track them religiously. As they say, anything that gets measured gets done.
    • The author shows his own system of tracking where he uses a simple file to track the following
      • Date
      • Duration of writing
      • Project worked on
      • # of words written
    • The utility of the above data is that when you sit down and look at metrics like the average number of words you have written per day/month/year and the number of days you managed to write, your whole outlook toward writing improves. These simple stats will make you aware of the limitations and short-sightedness in your planning and writing process.
    • Writer’s block is an utterly crappy notion. There is nothing like that. It is a dispositional fallacy (a description of behavior can’t explain the described behavior). It’s like saying you can’t write because you are not writing. Prolific writers follow their writing schedule regardless of whether they feel like writing.
  • Start your own Agraphia ( pathological loss of the ability to write ) group
    • The author mentions an interesting development among a faculty group in North Carolina. They have started a group where they meet regularly to discuss specific writing challenges, to understand others’ ideas and insights into writing etc. It is basically a “weight watchers” community approach to writing, I guess.
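The author's four-field tracking file lends itself to a few lines of code. A minimal sketch in Python (the log format and the numbers below are my own illustration, not the author's):

```python
from datetime import date

# Each entry mirrors the author's four tracked fields:
# (date, minutes spent, project, words written)
log = [
    (date(2012, 1, 2), 45, "grant", 350),
    (date(2012, 1, 3), 30, "grant", 0),   # showed up, wrote nothing
    (date(2012, 1, 5), 60, "paper", 800),
]

days_written = sum(1 for _, _, _, words in log if words > 0)
total_words = sum(words for _, _, _, words in log)
avg_words_per_session = total_words / len(log)

print(days_written, total_words, round(avg_words_per_session))
```

Even a log this crude makes the "binge vs. schedule" pattern visible within a few weeks.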

Stephen King, the prolific horror writer, says (about your writing room),

“It needs only one thing, a door which you are willing to shut”

This book, in a way, says the same thing with a tweak: “a DOOR which you are willing to SHUT & a TIME that you are willing to ALLOT each day.”


Any book that promises a journey spanning 300 years is bound to focus on the events and people that made the maximum impact on the development of the option pricing formula. If one were to pause and think about the option pricing formula, one would come up with questions like

  • A pretty naïve question to start with: why is there a need for options in the first place? How were they traded historically? Were they a precursor to some other instruments?
  • What factors drive option prices? Can one mathematically pin down a few factors, assuming others are constant?
  • An option by definition depends on the movement of the underlying security. So, how does one mathematically describe the underlying security?
    • Should one form an equation for the price?
    • Should one form an equation for price increments?
    • Should one assume a discrete process or a continuous process, given that, after all, a traded price series is a discrete time series?
  • Given a specific process for the underlying security, how does one go about figuring out the price of the option?

This book traces all the developments leading to the Black-Scholes equation, like Brownian motion, Ito’s calculus, the Kolmogorov forward and backward equations, etc., leading up to the most important idea in option pricing: “replication”. Each of these ideas is described chapter-wise. Let me briefly summarize the chapters in this book.

Flowers and Spices


The book starts off describing the Tulip mania of the 1630s, and the reason it talks about Tulip mania is this: it was the first instance when a government, in order to come out of a crisis, converted forward contracts into options contracts. The chapter then talks about the Dutch East India Company, which dealt in spices. The history of this company is closely linked to the emergence of the first formal market for options. The Dutch East India Company (VOC) became a powerful company in the Netherlands within just a few years of its inception. The shares of the company became valuable and a futures market emerged. Subsequently, to cater to various kinds of demands, an options market emerged for VOC shares. The VOC finally got bogged down in corporate politics and corruption and went bankrupt. The 1680s were also the time when there was a need to communicate to the general public the ways to trade and understand options. To clarify the various terms and mechanics of options, Joseph de la Vega wrote extensively in his book, “Confusion of Confusions”. De la Vega was the first to describe in writing the workings of the stock market, in particular the trading of options. His work is held in such high esteem that, since the year 2000, the Federation of European Securities Exchanges has sponsored an annual prize for the “outstanding research paper related to the securities markets in Europe”.

In the Beginning


This chapter covers some important developments in the financial markets between 1694 and 1885. It starts off in 1694 with John Law advising the French king on restoring the financial stability of the country. John Law started a firm in France that printed paper money. It was the first attempt in Europe to replace metal coins with paper money as legal tender. The bank, forerunner of France’s central bank, guaranteed that the notes would always be worth as much as the metal coins on which they were based. Law also convinced the king to grant him a charter for a natural-resource trading company so that he could bring more revenue into the country. Law’s company became very popular and there was a mad scramble amongst people to buy its shares. Like any bubble, the money-printing idea flopped by 1720 and France faced imminent financial disaster again. However, the taste for trading and speculation that Law gave to French citizens was still in full force, and unofficial trading of various other instruments increased. So, finally in 1724, the government decided to bring some order into the situation and an official Bourse was created. The whole system ran well until the French Revolution in 1789, after which chaos ensued. A few more developments led to the reopening of the Paris Bourse, and this time everybody was allowed to trade. Again, forwards were outside the regulation, but that did not stop volumes from increasing, and the Bourse soon became a hub for speculators. With the collapse of a big investment bank, France went into a recession. Out of all these developments there was one positive one: the legalization of the forward market in 1885.

From Rags to Riches


This chapter tells the life story of Jules Regnault, a classic rags-to-riches story. Why is the story relevant to the book or to option pricing? Well, Jules Regnault was the first person, at least as per the book, to deduce the square-root-of-time law. He not only tried proving it using math, but also used the law in the stock markets to retire rich. He started his working life at the age of 23 as a broker’s assistant. His living conditions were miserable, but he managed to improve them by working hard in the broker’s office. Regnault was the first person to try to understand the workings of the stock exchange in mathematical terms, and his explanations had all the trappings of a scientific theory. Regnault managed this with hardly any formal education. After a full day’s work at the Bourse, he would sit in a small room under the attic and do quant work. A truly inspiring life. Remember, this was in the 1860s, and he was investigating concepts such as random walks, the role of information in stock markets, the useful and harmful sides of speculation, insider information, the factors that drive the valuation of an instrument, ways to calculate the fair value of an instrument etc. An admirable quality of Jules Regnault’s life is that he never shied away from applying things in the stock market. He used all his principles at the Bourse and retired at the age of 47 after making a ton of money. In one sense, Jules Regnault can be called the first quant who made a fortune in the market.

The Banker’s Secretary


In any introductory book on options theory, you are likely to see payoff diagrams for various options and option strategies. This chapter talks about the first person to use these diagrams for communicating option positions, Henri Lefevre. Lefevre was personal secretary to the business tycoon James de Rothschild. He did not participate in speculation or trading activities, but was a keen observer and educator of markets. He published various books and articles, thanks to the fact that as secretary to Rothschild he could influence the publishers. He made two main contributions to options theory. The first was his mental model comparing the economy, with its flows of capital and goods, to the human body and its organs. In Lefevre’s model of the economy, the stock exchange is like the heart that keeps blood moving through the veins, the government is like a brain that thinks, combines and regulates the flow, and speculation is like the nervous system that provides the momentum keeping commodities and capital in continuous motion. His second contribution came in 1873 and 1874 through his books, in which he introduced a method of analysis we are all familiar with: the payoff diagram. The payoff diagram for an individual option might be very simple, and using such a diagram to explain things could seem a stretch. The real power of payoff diagrams comes into play when you are analyzing a set of option positions: the final payoff diagram for the whole set shows at once the price ranges where the entire position makes or loses money. Lefevre’s method and his extensive publications in the form of books, papers, articles, etc. helped the common masses better understand options and options-based investing.

The Spurned Professor


Louis Bachelier

The study of financial markets began in earnest at the turn of the twentieth century. The first underpinnings of a mathematical theory were developed in the doctoral thesis of a thirty-year-old French student, Louis Bachelier. In 1900 Bachelier defended his PhD thesis, and the committee, composed of Poincaré, Appell and Boussinesq, graded the thesis as “somewhat better than okay”. This evaluation haunted Bachelier throughout his life, as he could not secure a faculty position without a “real distinction” on his PhD. One of the reasons Bachelier’s theory did not get attention was that it was incorrect: he modeled price movements as a specific random walk (an arithmetic Brownian motion) which allowed stock prices to take negative values. So, the thesis that was probably the first step towards a mathematical treatment of financial markets lay dormant for a long time.
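The flaw in arithmetic Brownian motion is easy to see in simulation: with additive Gaussian steps, a "price" readily crosses zero. A quick sketch (the parameter values are my own, chosen only for illustration):

```python
import random

random.seed(42)

def abm_min(s0=10.0, sigma=1.0, steps=250):
    """Minimum value along one arithmetic-Brownian-motion path:
    the price moves by an ADDITIVE Gaussian step each period."""
    s, low = s0, s0
    for _ in range(steps):
        s += random.gauss(0.0, sigma)
        low = min(low, s)
    return low

# Fraction of simulated paths on which the "price" dips below zero.
frac_negative = sum(abm_min() < 0 for _ in range(1000)) / 1000
print(f"{frac_negative:.0%} of paths went negative")
```

With these toy parameters roughly half the paths go negative at some point, which no real stock price can do.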

Botany, Physics and Chemistry


The first observation of the jiggling movement of particles was by the Dutch scientist Jan Ingenhousz. However, the credit for the name goes to Robert Brown. I find Robert Brown’s life very inspiring. He made use of his free time and did all his observations and work outside office hours. He never socialized or dined out; basically he was a guy who kept to himself and did outstanding work. He observed that Brownian motion never ceases, though he never knew its cause. Remember, this was 1827, and molecular/atomic theory was not yet established. Then came along Einstein, aptly called the father of atomic theory. He hypothesized and predicted the behavior of Brownian motion. He also formulated the square-root law by clarifying that one must analyze not a single drunkard’s walk but an ensemble of drunkards’ walks. In 1906 Marian Smoluchowski, another scientist, made important contributions to understanding Brownian motion when he postulated that the observed jittery motions are displacements caused by unobservable zigzag movements resulting from a huge number of impacts. He concluded this after analyzing the speed of the particles at various resolutions: he saw that the velocity of the particles increased at higher resolutions, which made him conclude that whatever action is seen in the microscope is basically displacement. This book provides a fantastic analogy for Brownian motion which one will never forget after reading it once. The book says it in a beautiful way:

Imagine the dance floor of a disco. Flashing strobe lights highlight the seemingly jerky movements of the dancers. Of course, even the coolest dancers do not disappear into thin air between the flashes. They move continuously through time and space, but their movements are hidden during the dark periods between flashes. In fact, there could be different dance steps that give rise to identical positions during the strobe flashes. This is the key point. Particles pushed around by the molecules in a liquid describe a continuous path, and the jerky motion we see under the microscope is just a sample of the entire movement.

In the next chapter, the book gives an account of the life of Norbert Wiener, who proved that even for the jerkiest observations, it is practically certain that continuous paths exist which give rise to them. These historical developments give so much meaning to what one reads in a purely mathematical text on Brownian motion. Brownian motion is continuous. “Oh! OK!” was my first reaction when I studied stochastic calculus. But books such as this give so much context to mathematical properties that learning the math becomes that much more interesting.

Disco Dancers and Strobe Lights


It was in 1880 that the physicist John William Strutt (Lord Rayleigh) discovered the random walk in a completely different context: the superposition of waves of equal amplitudes and equal periods but random phases. He came to the conclusion that the amplitude was proportional to the square root of the number of vibrations (a variant of the square-root rule). Thorvald Nicolai Thiele, another scientist, also worked on the random walk while developing concepts in computational techniques. He used Gauss’s method of least squares to conclude that an ensemble of particles following a random walk would have an average displacement proportional to the square root of the number of steps. Some 80 years later, this was picked up by another scientist, Kalman, and today we know a ton of applications of the Kalman filter. Karl Pearson, the famous statistician, chipped in with his article in the magazine “Nature”. Despite all these efforts, Brownian motion was not on a sound mathematical foundation. It was Norbert Wiener who formally defined the process and proved its existence, in a 40-page paper in 1923. In 1948, Paul Lévy, considered a founding father of modern probability theory, defined and constructed another class of random walks called Lévy processes. In the last 60 years, more and more scientists have studied various classes of random walks such as reflecting random walks, loop-erased random walks, biased random walks, random walks with drift etc. In 2006, the prestigious Fields Medal, the Nobel Prize equivalent in mathematics, was awarded to a French mathematician, Wendelin Werner, for his work on random walks.
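The square-root rule that Rayleigh, Thiele, Einstein and Regnault all bumped into is easy to verify for an ensemble of coin-flip walks; here is a small sketch (the walker and step counts are my own arbitrary choices):

```python
import random

random.seed(0)

def rms_displacement(n_steps, n_walkers=5000):
    """Root-mean-square displacement of an ensemble of +/-1 random walks."""
    total = 0.0
    for _ in range(n_walkers):
        pos = sum(random.choice((-1, 1)) for _ in range(n_steps))
        total += pos * pos
    return (total / n_walkers) ** 0.5

r100, r400 = rms_displacement(100), rms_displacement(400)
# Quadrupling the number of steps should roughly double the RMS displacement.
print(round(r100, 1), round(r400, 1), round(r400 / r100, 2))
```

Note that the rule only holds for the ensemble average, exactly as Einstein clarified; any single walk can wander far from the square-root estimate.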

The Overlooked Thesis


Regnault had brought statistics, Lefevre geometry, and Bachelier calculus to the understanding of options. This chapter highlights some of the important elements of Bachelier’s PhD thesis that show how far removed it was from the traditional way of analyzing finance. The section mentions the following from the thesis:

  • The thesis anticipated the efficient market hypothesis and formulated its mathematical equivalent.
  • It departed from the tradition of describing economic phenomena verbally and used mathematical equations to describe things.
  • It showed that fluctuations of the stock price around the true price follow a Gaussian distribution.
  • It gave two justifications for the square-root-of-time law.
  • It used Fourier’s methodology to derive the heat equation in the context of price probabilities.
  • It showed that an option’s value depends on the security’s volatility and on time.
  • It used the reflection principle to reduce the complexity of calculations involving Brownian motion.

One of the offshoots of Bachelier’s thesis was the Chapman-Kolmogorov equation, which ties the probability distribution of a variable at the end of a discrete Markov chain to its intermediate probabilities. Books such as this should be required reading for a student before he/she gets exposed to the math relating to Brownian motion. If you learn the square-root-of-time law using math alone, with no prior exposure to the rich history behind its development, you might learn it but not appreciate the beauty that lies behind it. Similarly, you might prove that a Brownian path is nowhere differentiable, but you will feel the result in a completely different way after reading about Theodor Svedberg’s experiments on calculating the velocity of particles. All said and done, I strongly believe that the historical development of concepts and formulas is very important, sometimes more important than merely knowing a few ways to derive the Black-Scholes equation.

Another Pioneer

Developments in science tend to be non-linear and controversy-laden. Examples of plagiarism accusations are rampant. In such a context, it is hardly surprising that the option pricing formula too has, in its history, people who, one can only speculate, knew a way to value options much before anyone else. One such individual mentioned in this chapter is Vincenz Bronzin. His work was accidentally discovered by the Swiss historian Wolfgang Hafner, who later sent it to Prof. Zimmermann. Both concluded that Vincenz Bronzin’s work in the early 1900s had all the necessary math and concepts for pricing an option, and in fact Bronzin’s work ends up deriving the same formula as Bachelier’s. The fact that his work was never popular or recognized shows that the historical development of any concept is tough to attribute to individuals. You never know: some person who never published might have known the concept all along. Vincenz Bronzin’s work on option pricing was so advanced that some of the concepts looked similar to what the final Black-Scholes formula would look like, decades later.

Measuring the Immeasurable


I loved this section, where various historical figures in the development of probability and stochastic processes are mentioned. Firstly, what has a stochastic process got to do with the pricing of options? What’s wrong with using, or trying to use, a deterministic process? Well, the basic difference between a stochastic process and a deterministic process is that in the latter case you can pinpoint the result, while in the former you can only get a probability distribution for the result. So, diffusion processes, arithmetic random walks and geometric random walks are all ways of summarizing particle movement, and any computation on them yields a probability distribution. OK, let me get back to summarizing this chapter. It talks about Kolmogorov, the father of modern probability. Picking up one of Hilbert’s 23 problems, announced in 1900, Kolmogorov developed a full-fledged axiomatic theory of probability in 1933. In doing so, he relied heavily on the works of Henri Lebesgue, Georg Cantor and Bachelier. Henri Lebesgue is credited with a revolutionary method of integration, Lebesgue integration, that is applicable to a wide range of functions and sets of points. Lebesgue benefited from Cantor’s work on the real line and introduced the concept of measure. The development of Lebesgue integration is in itself a fantastic story, and “The Calculus Gallery” covers it in a splendid way. Kolmogorov also derived the Fokker-Planck equation in his monograph, unaware that the PDE had been developed in 1913 and 1914 by two physicists, Adriaan Fokker and Max Planck, to describe the time evolution of a variable’s probability distribution. The variable could be a particle’s position in a fluid or the distribution of a stock price. I liked this section because of a nice analogy that gives the difference between the Chapman-Kolmogorov and Fokker-Planck equations.
I love analogies, as they come to mind much before the raw equations do. I paraphrase the author’s words here:

The Chapman-Kolmogorov equation gives the probability of jumping from A to Z by investigating the probabilities of two consecutive jumps, say from A to P and then from P to Z, or from A to Q followed by a jump from Q to Z, and so on. It is like driving from New York in any direction and ending up in San Francisco. What are the chances of that happening? Well, you could drive via Dallas, Texas, via Wichita, Kansas, via Rapid City, South Dakota, or via any other route. The probability of ending up in San Francisco is then computed by adding the probabilities of all possible routes. This is what the Chapman-Kolmogorov equation does. The Fokker-Planck equation goes one step further. It develops the whistle-stop tours into a distribution of probabilities of ending up anywhere on the West Coast, be it San Francisco, Santa Barbara, Los Angeles, or Tijuana, Mexico.

After formulating the PDE, Kolmogorov found that its solution was the Chapman-Kolmogorov equation. He then turned the situation around: starting with a parabolic PDE, he pondered the question of whether it had a solution. If the answer was yes, then the solution could be none other than the Chapman-Kolmogorov equation. Thus he proved that the Chapman-Kolmogorov equation indeed existed and was not merely a figment of imagination. Further development was needed in this area, as there were strict conditions one had to impose on Kolmogorov’s PDE describing Brownian motion so that it had a solution.
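For a discrete Markov chain, the Chapman-Kolmogorov equation is simply the statement that multi-step transition probabilities come from summing over all intermediate stops. A toy sketch (the 3-state chain is my own example, not from the book):

```python
# Toy 3-state chain: P[i][j] = probability of jumping from state i to state j
# in one step. Each row sums to 1.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

def two_step(p, start, end):
    """Chapman-Kolmogorov: P(start -> end in two steps) is the sum, over
    every intermediate state, of P(start -> mid) * P(mid -> end)."""
    return sum(p[start][mid] * p[mid][end] for mid in range(len(p)))

# Probability of going from state 0 to state 2 in exactly two steps.
print(two_step(P, 0, 2))
```

This sum-over-routes is exactly the New York to San Francisco calculation in the analogy above, and in matrix language it is just the (0, 2) entry of P squared.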

Accounting for Randomness


Kiyoshi Ito

This section talks about the contribution of Kiyoshi Ito. Since a Brownian path is jagged and jittery at any resolution, one can’t calculate the velocity of the particles, as Marian Smoluchowski concluded. So how does one go about analyzing a Brownian path if the usual Riemann and Lebesgue calculus cannot be applied? Karl Weierstrass was the first mathematician to come up with such a nowhere-differentiable function. How does one work with such functions? There is no way to slice the paths and handle them in the usual way; whatever slice you look at, there will be jaggedness. What’s the way out? Here is where Ito comes in. In 1942 Ito published a paper that contained the mathematical machinery to handle such functions and paths. He used a Taylor series expansion, found that the first three terms of the expansion were all that mattered for a stochastic process, and ignored the rest to come up with a forecast of probability distributions. Today Ito’s calculus is synonymous with “the framework for handling stochastic differential equations”. Ito lived for 93 years and contributed immensely to the field of stochastics. In his Kyoto Prize address, he says that he devised stochastic differential equations after painstaking, solitary endeavors. For those who think that solitude drives people crazy and must be avoided, Ito is a classic example of the tremendous possibilities of solitude in life.
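The reason the second-order Taylor term survives in Ito's calculus is that the squared increments of a Brownian path do not vanish: summed over any partition they converge to the elapsed time, while the sum of absolute increments blows up (which is why the path has no derivative). A simulation sketch, with resolution counts of my own choosing:

```python
import random

random.seed(1)

T = 1.0
for n in (1_000, 10_000, 100_000):
    dt = T / n
    # Brownian increments over a grid of n steps: Gaussian with variance dt.
    dws = [random.gauss(0.0, dt ** 0.5) for _ in range(n)]
    quad_var = sum(dw * dw for dw in dws)   # sum of squared increments -> T
    abs_var = sum(abs(dw) for dw in dws)    # sum of |increments| keeps growing
    print(n, round(quad_var, 3), round(abs_var, 1))
```

However finely you slice the path, the quadratic variation stays near T = 1, which is the informal content of the rule "(dW)^2 = dt".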

The Sealed Envelope


Wolfgang Doblin

This section talks about Wolfgang Doblin, a young mathematician who took his own life while serving in the army, to avoid being captured by the Germans. Before his death, he sent a sealed envelope to the Academy of Sciences in Paris. There is a long history behind sealed envelopes, which one can read about in this section. The envelope was supposed to be opened in 2040, but thankfully it was opened in 2000. The math that Doblin sent to the academy contained a framework for dealing with stochastic PDEs in a way that lessened the restrictions imposed on them. Mathematicians wonder whether, had Doblin not served in the army and met a fatal end, option pricing might have developed much earlier. In any case, the reader gets a glimpse into this young genius, thanks to this book. Doblin’s life revolved around math, and it served as a way out of his gloomy and depressing environment. Though he only had the rare hour to focus entirely on math, usually during night shifts hidden away in the telephone station he was guarding (which at least provided some heat), his preoccupation with mathematical problems alleviated the dreariness and kept him from falling into depression. In one of his letters to his professor he writes,

Fortunately, my mathematical work helps me fight despair. As I am not interested in alcohol, I do not have the luxury of getting drunk like others.

The Utility of Logarithms


Paul Samuelson

This chapter talks about Paul Samuelson, who took a fresh look at Bachelier’s thesis and improved on it. Instead of plain-vanilla Brownian motion, Samuelson insisted on geometric Brownian motion (GBM), for a number of reasons:

  • Share values are always positive in a GBM scenario
  • Price jumps, in percent terms, are equally distributed
  • The model is in accordance with human nature, as described by Weber and Fechner’s psychology
  • Data observed on stock exchanges fit GBM better than arithmetic Brownian motion

The chapter also mentions M.F.M. Osborne, a physicist, who claimed in 1958 that it is the logs of stock prices, not the changes in the prices themselves, that are normally distributed. He was motivated to analyze this behavior after reading the works of the psychologists Weber and Fechner. This observation echoes the utility argument made by Daniel Bernoulli while resolving the St. Petersburg paradox (posed in 1713).
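Samuelson's and Osborne's point, that it is the logarithm of the price that takes Gaussian steps, can be checked in a few lines. A sketch with illustrative parameters of my own choosing:

```python
import math
import random

random.seed(7)

def gbm_terminal(s0=100.0, mu=0.05, sigma=0.2, t=1.0):
    """Geometric Brownian motion: the LOG of the price takes a Gaussian step,
    so the price itself is s0 times an exponential and can never go negative."""
    z = random.gauss(0.0, 1.0)
    return s0 * math.exp((mu - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)

prices = [gbm_terminal() for _ in range(10_000)]
log_returns = [math.log(p / 100.0) for p in prices]
mean_lr = sum(log_returns) / len(log_returns)

print(min(prices) > 0)    # GBM prices stay strictly positive
print(round(mean_lr, 2))  # mean log return ~ mu - sigma^2 / 2
```

The positivity of every simulated price is exactly the first bullet point above, and the Gaussian log returns are Osborne's 1958 observation.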

The Nobelists - The Three Musketeers


Fischer Black – Myron Scholes – Robert Merton

The next two chapters talk about Fischer Black, Myron Scholes and Robert Merton. MIT was a common denominator for all three. Fischer Black and Myron Scholes worked together on consulting projects for financial firms and thus were aware of markets and the places where quant techniques could be applied. They developed a good working rapport, as Black’s forte was theory and Scholes was good at implementation. Robert Merton had worked on pricing warrants along with his MIT professor, Samuelson. Warrants were similar to options but for some technical differences. Before the trio cracked the formula, quite a number of scholars came close. In 1961, a graduate student in economics, Case Sprenkle, made some headway. He assumed GBM for prices and allowed for drift, but his approach was faulty, as he posited that investors’ utility functions would be revealed in prices; he also ignored the time value of money. In 1962, James Boness, another student from Chicago, improved on Sprenkle’s model by incorporating the time value of money. However, there were problems with his theory too, as he assumed that all stocks on which options are traded are of the same risk class and that all investors are indifferent to risk. So, all the attempts of Thorp, Case Sprenkle, James Boness, Robert Merton and Samuelson had shortcomings, but they all carried seeds of the eventual solution.

Sometime in 1968, Fischer Black tried formulating the option price as a PDE but had a tough time solving it. So he filed it in his cabinet and went on with his work. However, in 1969, while conversing with Myron Scholes, the sole associate of his company, Black got a chance to start the work again. The key idea used in cracking the formula was “portfolio hedging and replication”. If the option is deep in the money, then the delta, or hedge ratio, is 1. This means you can perfectly hedge a long call option by shorting one share of the underlying. If the option is not deep in the money but near the money, then the hedge ratio is definitely not 1; it must be less than 1. If you represent the hedge ratio as x, then the combination of the call and the hedge position is basically a risk-free portfolio growing at the risk-free rate. With the unknown hedge ratio as a variable, Black and Scholes wrote an equation describing option pricing.
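The replication idea can be seen in miniature in a one-period binomial example, which is a standard textbook device rather than the book's own derivation (the numbers below are my own toy illustration):

```python
# One-period binomial sketch of replication and risk-free pricing.
s0, up, down = 100.0, 110.0, 90.0   # stock today, and its two possible futures
k, r = 100.0, 0.05                  # strike and one-period risk-free rate

call_up = max(up - k, 0.0)          # option payoff if the stock goes up
call_down = max(down - k, 0.0)      # option payoff if the stock goes down

# Hedge ratio x: the number of shares whose payoff spread matches the option's.
x = (call_up - call_down) / (up - down)

# Holding x shares and shorting the call pays the same in BOTH states,
# so that portfolio is riskless and must earn exactly the risk-free rate.
riskless_payoff = x * up - call_up      # equals x * down - call_down
call_price = x * s0 - riskless_payoff / (1 + r)

print(round(x, 2), round(call_price, 2))
```

No probability of an up or down move and no expected return appear anywhere, which is precisely the surprise that the full Black-Scholes solution delivers in continuous time.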

With the PDE in hand, Black and Scholes made an educated guess that turned out to be the solution of the option pricing problem. Surprisingly, the solution to the Black-Scholes PDE did not contain the stock’s return at all. This was totally unexpected, given the prevailing notion that the option price should somehow contain the return of the underlying instrument as one of its components. But Black and Scholes proved otherwise. Robert Merton arrived at the same solution using Ito’s calculus. Initially, Black had a lot of difficulty publishing the paper in academic journals. It was only in 1973 that Black and Scholes managed to publish it, under the title “The Pricing of Options and Corporate Liabilities”, in the Journal of Political Economy. According to a study undertaken in 2006, the paper has garnered 2,589 citations, making it the sixth most cited paper in all of economics. In the paper they acknowledged that Merton had worked out a different approach and arrived at the same solution, thus in a way validating their final answer to option pricing.
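The closed-form solution itself is compact enough to write down, and the absence of the stock's expected return is visible at a glance: only the spot, strike, time, risk-free rate and volatility enter. A minimal sketch using only the standard library:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function (no external libraries)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes European call price.
    Note: the stock's expected return appears nowhere in the inputs."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# An at-the-money one-year call, 5% rate, 20% volatility.
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.2), 2))
```

The five inputs are exactly what a trader punching numbers into a Texas Instruments calculator would have supplied, with no view on the stock's drift required.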

The Higher They Climb …

This section covers the lives of the three musketeers before the Nobel Prize awarded in recognition of their work. Some people believe that Texas Instruments was largely responsible for the tremendous marketing of the Black-Scholes formula, by incorporating it as a module in its calculators. Traders did not need to know anything about the formula except the inputs, so they happily traded options at the CBOE, today one of the biggest derivatives exchanges in the world. Black joined Goldman at the age of 46, became a partner within two years, and died of cancer at the age of 56 (1995). Scholes and Merton landed up at LTCM, a fund using quantitative techniques to manage money. The principals got their recognition in the form of a Nobel Prize in 1997.

…The Harder They Fall

This section talks about the fall of LTCM. The LTCM bust is basically a ‘leverage’ lesson for all money managers. If one wants to read about the demise of LTCM in detail, the book by Roger Lowenstein is spot on. This section does not quite fit the flow of the book.

The Long Tail

The last chapter talks about the problems with the Black-Scholes formula, such as assuming constant volatility and assuming GBM, which has the normal distribution at its core. The book ends by saying:

So, there are pitfalls, and they can and do lead to spectacular failures like the LTCM crash, the crisis of 2007, and the demise of Lehman Brothers. But to fault Black, Scholes and Merton and their equations for such aberrations would be utterly wrong. This would be like accusing Isaac Newton and the laws of motion for fatal traffic accidents.



“What risk premium should be used for option pricing?” was a stumbling block in the development of the option pricing formula. The answer given by Black, Scholes and Merton surprised everyone: no risk premium at all. This book traces the historical developments leading to the option pricing formula, and in doing so weaves an entertaining and gripping narration of all the mathematical developments that were precursors to the Black-Scholes formula.