June 2008


Pairs Trading

Simple ideas are usually very powerful. “Pairs trading” is based on one such simple but powerful idea, which generated millions in profits for various firms. Morgan Stanley was the first to implement this strategy, and a host of other firms copied the technique and made a lot of money.

Pairs trading is based on two constructs:

First, any stock moves in a random way; however, there is a systematic component to it and an idiosyncratic component. In simple terms, there is a component you can identify with a set of factors, and there is another component which is stock specific. So you can pick a pair of stocks such that the systematic component of one stock is related to the systematic component of the other. Thus you have a grip on the spread.

Second, test for mean reversion of the idiosyncratic risk. That is, check whether the error component of the pair exhibits mean reversion.

Once both things are known for a pair, or you can find a pair which satisfies the above two constructs, all one needs to do is take a long-short position, dynamically hedge it and make money whenever there is a mean reversion, meaning whenever the spread hits zero. In a stock universe of, let's say, 5000 stocks, one needs to compare roughly 12.5 million pairs (every possible pairing of 5000 stocks). Though the technique is simple, if you want to test it on the whole universe, it is going to be time intensive. This book talks about a technique to get over this problem.
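To make the mechanics concrete, here is a minimal sketch of the trading rule, entirely my own and not taken from the book: the toy prices, hedge ratio and entry band below are illustrative placeholders, and a real implementation would estimate the hedge ratio and the spread's equilibrium from data, add the dynamic hedging, and account for transaction costs.

```cpp
// Minimal pairs-trading sketch (illustrative only): the spread between stock A
// and a hedge-ratio-weighted stock B is assumed to be mean reverting; enter a
// long-short position when the spread strays from its mean and unwind when it
// crosses back.
#include <iostream>
#include <vector>

int main() {
    // Toy price series for two related stocks (made-up numbers).
    std::vector<double> priceA = {100.0, 101.5, 99.8, 102.0, 104.1, 103.0, 101.2};
    std::vector<double> priceB = { 50.0,  50.4, 50.1,  50.6,  51.0,  51.4,  50.9};
    const double hedgeRatio = 2.0;  // placeholder; in practice estimated by regression
    const double entryBand  = 1.5;  // enter when |spread - mean| exceeds this

    // Spread and its mean (in practice estimated on a training window).
    std::vector<double> spread;
    double meanSpread = 0.0;
    for (std::size_t t = 0; t < priceA.size(); ++t) {
        spread.push_back(priceA[t] - hedgeRatio * priceB[t]);
        meanSpread += spread.back();
    }
    meanSpread /= spread.size();

    int position = 0;  // +1: long A / short B, -1: short A / long B, 0: flat
    for (std::size_t t = 0; t < spread.size(); ++t) {
        double dev = spread[t] - meanSpread;
        if (position == 0 && dev > entryBand)        position = -1;  // spread rich: short A, long B
        else if (position == 0 && dev < -entryBand)  position = +1;  // spread cheap: long A, short B
        else if (position != 0 && dev * position >= 0.0) position = 0;  // spread reverted: unwind
        std::cout << "t=" << t << "  spread=" << spread[t]
                  << "  position=" << position << '\n';
    }
    return 0;
}
```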

The book is very well written because it tries to explain many things intuitively, with proper reasoning for all the equations used in the book.

This technique was first implemented in 1987, and traders have ripped the strategy apart in the US over the last 20 years. So, what does one learn from reading up on this dead strategy? Maybe some knowledge about what worked and why it worked in a specific market.

The highlight of the book is the following statement:

Any model should consider the following:

  1. Transaction Costs,
  2. Risk,
  3. Return,
  4. Market Efficiency

and the objective of an investment strategy should be to minimize the first two, maximize the third and work around the fourth. Liquidity and taxes are the additional elements needed to make any model realistic.

Coming back to what the book is all about,

It is about alpha, a term used to represent the excess over benchmark returns generated by a fund manager. This book talks about the eternal quest for alpha and poses a lot of interesting questions. It begins with a history of the passive indexing strategy and then tries to bring out various perspectives on seeking alpha. Ross's APT, Markowitz's portfolio theory, Sharpe's contribution and Tobin's separation theorem are some of the historical aspects which the author brings out in a conversational mode. The rise of managed futures, behavioral finance, quant jocks, VaR and the hype around it, etc. are highlighted in the book.

In the end, this book makes one think about the fund manager and the issues he/she faces.

If one looks at a typical fund manager, he has to deal with the following:

  • Public information
  • Technology and what role he should allow it to play
  • Portfolio selection
  • Portfolio risk management
  • His/her own emotional ups and downs while trading and taking positions
  • Dealing with Investors
  • Dealing with Markets
  • Analysts – Fundamental, technical, quant jocks, etc.

The above are only a few of the host of activities that a fund manager needs to look at. In the end, he is judged by only one thing – ALPHA.

This book talks about the changing dynamics of each of the factors mentioned above. The author takes the view that investing is becoming more like the whale hunting of yesteryear. In the initial period of whale hunting, the target was easily found near the shoreline. As time passed, one had to go deep into the ocean to get a whale, and over a period of time the whale population became so scarce that hunting was no longer considered a business one could depend upon. In the investment world, when the markets of yesteryear were riddled with inefficiencies, a lot of managers were able to give handsome returns to investors. But as the market became more efficient, alpha became elusive. With the rise of computing power, arbitrage opportunities are becoming scarce.

So, how does one achieve alpha these days, when beating the index is becoming more and more difficult? I don't know. It makes me wonder whether there is any case for stat arb trading models at all. Like pairs trading, is there some strategy which is yet to be exploited by Wall Street? Maybe there is, maybe there isn't. The reverse law of large numbers strategy and the emerging research output from behavioral finance would, I guess, throw more light on this in the times to come.

Recently, I stumbled onto an article which had a debate between the Fisherian approach and the Bayesian approach to design inference. According to the author, both had drawbacks when dealing with events with small probabilities. I became curious to know the Bayesian side of things. My work in the past few years did make me use the Fisherian approach in many forms. But what about the Bayesian world? Except for the famous Monty Hall problem and a few other general examples, I had not read or understood clearly how a Bayesian approach could be put to real use. Many years ago, reading "Hackers and Painters", I did come across an entire chapter on applying Bayesian principles to fight spam, written elegantly by Paul Graham. However, at that point of time my motivation levels for the Bayesian approach were low. Maybe I never understood anything at all, beyond a 10,000 ft view of that entire branch of statistics.

Why Bayesian now? My experiments with R, the open source software for statistics, were becoming rather boring. It was the same old wine in a new package, the same old commands with new syntax. The same old t-tests, hypothesis testing, regression, multivariate regression, logistic regression, multinomial regression, GARCH, etc. It was deja vu for me. Yes, all these techniques were indeed interesting; in fact I am tutoring a PhD student on using raw data for a thesis. Somehow, whenever I am dealing with the Fisherian world, I am more inclined to believe that it is used as a sophisticated wrap around our beliefs rather than as a way of validating our beliefs meaningfully. I guess that is where Bayesian probability comes in. When I stumbled onto this book, I wanted to go through the entire book in one sitting and understand once and for all why I should consider or ignore the Bayesian world. And so it happened that one afternoon I sat with this book with only one thing in my mind: "What's the practical real world application of Bayes?"
"Ending Spam" provided me with a solid answer. Let me try to summarize the key points of the book:

First, some history:
The world's first spam message was sent by a marketing manager at DEC in 1978, and it raised a furore on the then low-bandwidth ARPANET. It was followed by college fund spams, the Jesus spam :), the notorious couple Canter & Siegel who became famous for using a software program to spam, Jeff Slaton the "spam king", and Floodgate (the first spamware). A lot of individual, ineffective battles were fought by anti-spammers: blacklists, @abuse addresses, and so on. But nothing was effective. By 2002, spam had reached 40% of all email traffic, and a solution was becoming elusive.

Initial Tools:

Blacklists, i.e., centralized lists of blocked senders, were the first solution to the spam problem. Email software gave the user the choice of blacklisting senders based on their source email address or on a set of specific words. This gave rise to a lot of false positives. However, it became popular because it was very easy to implement and customize. Maintenance of the blacklists was a big problem.

Heuristic Filtering came next. Users connected to a centralized service which downloaded the mails from the user's ISP, ran them through a set of heuristics and lookups, and acted as a filter. This was effective for some time, until hackers learnt to get past the rules. A heuristic filter applied a universal score and, based on a set threshold, classified a message as spam or ham (a word generally used for a legitimate message). There were also a lot of maintenance headaches, as the server lists had to be updated regularly.
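To picture how such a rule-based filter works, here is a toy sketch of my own (the rules, weights and threshold are invented for illustration, not taken from the book): every matched heuristic adds to a score, and a message whose total crosses the threshold is tagged as spam.

```cpp
// Rough sketch of a heuristic (rule-based) spam filter: each matched rule adds
// a fixed weight to the score, and a message whose total crosses the threshold
// is tagged as spam. Rules, weights and threshold are made up for illustration.
#include <iostream>
#include <string>
#include <vector>

struct Rule {
    std::string pattern;  // substring to look for
    double weight;        // score added when the pattern is found
};

double score(const std::string& message, const std::vector<Rule>& rules) {
    double total = 0.0;
    for (const Rule& r : rules)
        if (message.find(r.pattern) != std::string::npos)
            total += r.weight;
    return total;
}

int main() {
    std::vector<Rule> rules = {
        {"FREE", 2.5}, {"viagra", 4.0}, {"click here", 1.5}, {"$$$", 3.0}
    };
    const double threshold = 5.0;  // one universal threshold applied to everyone

    std::string msg = "Get FREE pills now, click here!!! $$$";
    double s = score(msg, rules);
    std::cout << "score=" << s << " -> " << (s >= threshold ? "SPAM" : "HAM") << '\n';
    return 0;
}
```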

Whitelisting was next. Only approved senders can send you email. This cuts spam completely, but at the same time it is not good because it cuts off the rest of the world. Also, forgeries can plague such a system.

Challenge Response systems made the senders do the job of spam filtering. This was news to me; since I have never seen this kind of mail, it was surprising to learn that such systems were used to fight spam. Look at the mail below; I bet you will be surprised too:

Greetings,
You just sent an email to my spam-free email service. Because this is the first time you have sent to this email account, please confirm yourself so you’ll be recognized when you send to me in the future. It’s easy. To prove your message comes from a human and not a computer, click on the link below: http://[Some Web Link] . Attached is your original message that is in my pending folder, waiting for your quick authentication.

Throttling was probably one of the most sane means of attacking spam. The philosophy behind throttling is that a legitimate mail distribution would never need to send more than a certain threshold of traffic to any particular network. For example, a legitimate mailing list may send out huge quantities of mail, but each message goes to different recipients on different networks; at most, only a handful of the messages going out would be directed to any one network. A spammer, on the other hand, may have scripts designed to bombard a network with spam by using a dictionary attack, in which every possible username is generated.
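A rough sketch of that idea, with made-up limits and names rather than anything from the book: count how many messages a sender has pushed toward a given destination network inside the current window, and defer anything over the cap.

```cpp
// Sketch of throttling (illustrative limits): count recent deliveries per
// sender toward each destination network and defer anything above the cap.
#include <iostream>
#include <map>
#include <string>
#include <utility>

class Throttle {
    std::map<std::pair<std::string, std::string>, int> counts_;  // (sender, network) -> count
    int maxPerWindow_;
public:
    explicit Throttle(int maxPerWindow) : maxPerWindow_(maxPerWindow) {}

    // Returns true if the message may be delivered now, false if it should be deferred.
    bool allow(const std::string& sender, const std::string& destNetwork) {
        int& c = counts_[{sender, destNetwork}];
        if (c >= maxPerWindow_) return false;
        ++c;
        return true;
    }

    void resetWindow() { counts_.clear(); }  // call once per time window
};

int main() {
    Throttle throttle(3);  // at most 3 messages per sender per network per window
    for (int i = 0; i < 5; ++i) {
        bool ok = throttle.allow("bulk@spamhost.example", "10.0.0.0/8");
        std::cout << "message " << i + 1 << (ok ? " delivered" : " deferred") << '\n';
    }
    return 0;
}
```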

Collaborative filtering, address obfuscation and litigation were a few other methods used to cut down spam. However, spam continued.

Language Classification: use the spammer's weapon against the spammer. One of the things a spam filter has at its disposal is the content of the message. A language classification filter is a machine learning tool which does the following: it is first trained on a corpus, then it tokenizes the incoming mail and assigns a probability to each token in the message. Ultimately it assigns a spam score based on the joint probabilities and classifies the entire message as SPAM or NOT SPAM. The most wonderful thing about this approach is that the user is in control of the corpus and is in a position to provide feedback to the filter, so the filter works on a customized basis. This is a Bayesian approach, in which the initial a priori probabilities are revised as mail arrives and new posterior probabilities are computed.
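Here is a bare-bones sketch of such a filter, my own toy version rather than the book's code: per-token spam probabilities are estimated from counts in a tiny hand-made corpus and combined via Bayes' rule (in log space, assuming the tokens are independent) into a single spam score.

```cpp
// Bare-bones Bayesian text classifier sketch: token spam probabilities are
// estimated from counts in a user-supplied corpus, then combined (in log
// space, assuming token independence) into a single spam score.
#include <cmath>
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>

std::vector<std::string> tokenize(const std::string& text) {
    std::istringstream in(text);
    std::vector<std::string> tokens;
    for (std::string t; in >> t; ) tokens.push_back(t);
    return tokens;
}

int main() {
    // Tiny training corpus; a real filter trains on the user's own mail.
    std::vector<std::string> spam = {"cheap pills buy now", "buy cheap watches now"};
    std::vector<std::string> ham  = {"meeting notes for project", "lunch now or later"};

    std::map<std::string, int> spamCount, hamCount;
    for (const auto& m : spam) for (const auto& t : tokenize(m)) ++spamCount[t];
    for (const auto& m : ham)  for (const auto& t : tokenize(m)) ++hamCount[t];

    // Score a new message: log-odds of spam vs ham with add-one smoothing.
    std::string message = "buy cheap pills";
    double logOdds = std::log(double(spam.size()) / ham.size());  // prior odds
    for (const auto& t : tokenize(message)) {
        double pSpam = (spamCount[t] + 1.0) / (spam.size() + 2.0);
        double pHam  = (hamCount[t] + 1.0) / (ham.size() + 2.0);
        logOdds += std::log(pSpam / pHam);
    }
    double pMessageIsSpam = 1.0 / (1.0 + std::exp(-logOdds));
    std::cout << "P(spam) ~= " << pMessageIsSpam
              << (pMessageIsSpam > 0.9 ? "  -> SPAM" : "  -> HAM") << '\n';
    return 0;
}
```

Feeding the filter corrections (moving misclassified mail between the spam and ham piles and recounting) is what makes it adapt to an individual user's mail, which is the point the book stresses.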

In my next post, I will review the basic ideas of statistical filtering mentioned in the book.


AVOID THIS BOOK

This book has a deceptive title, "Analysis of Financial Data". Sadly, it does not help an analyst in any way. I went through the book and it did not add an iota of knowledge about analysis. It is a 10,000 ft view of statistics and its usage in finance, and even the 10,000 ft view is hazy! It gives a set of formulas with no explanation whatsoever. This book is nothing but a huge number of English words filled in between statistics jargon.

Against the Gods

"Against the gods" is a beautiful narrative on the history of risk management. Peter L Bernstein , the author of the book has a done a terrific work of narrating the evolution of risk measurement. The book is divided in to 5 periods where the story of risk is presented.

The five demarcated periods are: up to 1200, 1200-1700, 1700-1900, 1900-1960, and post-1960.

In each of these periods, the author talks about various personalities involved. Let me recap the book in the same manner, listing the main items from each of these time periods.

First, something about the title of the book, "Against the Gods". It is so named because the author brings out a pattern in his narrative: through the history of the development of risk, there was one powerful idea that galvanized it, the idea that humans are in control of their destiny versus gods who control that destiny.

Pre-1200
Numbers Era
The Greeks had an immense interest in gambling, and hence probability theory would have been a natural thing for them to take up and explore. Yet the Greeks never took up probability and worked on it. One of the reasons the author surmises is that the Greeks believed the world was controlled by the gods, and any study aimed at controlling the universe in whatever manner would only be a futile exercise.
The pre-1200s were a period characterized by folks trying to understand and formulate numbers.

Fibonacci

Leonardo Pisano (Fibonacci) wrote a book called Liber Abaci, which was the first treatment of the theory and application of various aspects of numbers. This was also the period when the development of zero made a significant impact on the way numbers were used.

Period 1200 – 1700
Outstanding facts Era

Renaissance Gambler:
Pacioli

Luca Pacioli was the first person to give an exhaustive treatment of accounting, in his book Summa; he is also regarded as the first accountant in human history. In the Summa he gave tables for 60*60 multiplication operations. He was a numbers man, and he posed one of the most famous problems of all time, the problem of balla:
A and B are playing a fair game of balla. They agree to continue until one has won six rounds. The game actually stops when A has won five and B has won three. How should the stakes be divided?

Cardano

A physician named Cardano was an eternal gambler. He gambled every day of his life, and he had seen so many gambling games that he wanted a set of rules for playing based on the odds of the various outcomes. He wrote a great book on mathematics, Ars Magna (The Great Art), which was followed by Liber de Ludo Aleae (Book on Games of Chance). This appears to have been the first serious effort to develop statistical principles of probability. Probability always had two meanings, one looking into the future, the other interpreting the past; the former is concerned with our opinions and the latter with what we actually know. The idea of measuring probability, which asks "how much can we accept of what we know?", came later. In a sense, Liber de Ludo Aleae was a primer on risk management. Cardano is credited with bringing in new terminology such as fair dice, circuit, combinations, odds ratio, etc. Interestingly, the term "fair dice" came into being because he had spent years at the gambling table and could see how various players cheated. However, his book was not accessible to many mathematicians of the Renaissance, for various reasons.

French Connection:
Pascal, Chevalier de Méré, Fermat

There are three important French personalities who played a significant role in the development of probability theory. First was Blaise Pascal, an outstanding mathematician whose work on conic sections at the young age of 16 brought great praise for his intellectual faculties. Another was Fermat, whose work on the theory of numbers is by far the most comprehensive done by an individual; he is more popularly known for his last theorem, which mathematicians struggled to solve for about 350 years.

These two mathematicians were great in their respective fields, but it was Chevalier de Méré, a nobleman with a keen interest in gambling and mathematics, who posed the old problem of balla. In a series of letters between Pascal and Fermat, Pascal came up with a triangle, popularly referred to as Pascal's triangle, to calculate the odds needed to solve the problem of balla. This was the first time a mathematical tool was used to forecast something, in this case the prize money in the game. Pascal's triangle is a neat way to summarize the events that can happen in a probabilistic sense. For example, if five games are to be played between two players, then 2^5 = 32 is the total of the row of the triangle corresponding to five trials, from which one can read off the different types of outcomes that can happen.
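To make the balla example concrete, here is my own working of the standard Pascal-Fermat argument rather than a quote from the book: in the game described earlier, A needs one more win and B needs three, so at most three more rounds settle the matter. Of the 2^3 = 8 equally likely ways those rounds could fall, B takes the stakes only by winning all three of them, a 1-in-8 chance, so the fair division is 7/8 of the stakes to A and 1/8 to B. The counts needed for this kind of reading, here 1, 3, 3, 1, are exactly what a row of Pascal's triangle supplies.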

Remarkable Notions Man:

Graunt and Petty

John Graunt, a merchant, and William Petty were folks who used statistical inference techniques. Graunt was a man obsessed with verifying common everyday notions. With the help of Petty, he developed a method of drawing inferences from a small sample. Both were extremely interested in the organization of human society rather than the science of nature, yet they never used the word probability. Estimating the odds of uncertain events had to wait until 1700-1900, a period appropriately titled "Measurement Unlimited".

Period 1700 – 1900
Measurement Unlimited Era

Meet the Bernoulli family:

The Bernoulli family is considered to have produced a swarm of mathematical descendants who made immense contributions to the understanding of uncertainty.

The Bernoulli family tree

Daniel Bernoulli

Daniel Bernoulli is credited with bringing the risk taker into the whole game of risk. He hypothesized that the importance of an increment of wealth for an individual is inversely proportional to the amount of wealth already accumulated. Moving on from the world of simple dice and roulette wheels, the inclusion of the player brought a whole new dimension to the development of risk management. Utility as a concept had a tremendous influence on the way risk management principles were developed in later years. The St. Petersburg paradox is the classic example Bernoulli used to explain the utility concept.
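To spell the paradox out, in its standard textbook form rather than as a quote from the book: a fair coin is tossed until the first head appears, and the payoff is 2 ducats if that happens on the first toss, 4 on the second, 8 on the third, and so on. The expected payoff is (1/2)(2) + (1/4)(4) + (1/8)(8) + ... = 1 + 1 + 1 + ..., which is infinite, yet nobody would pay more than a modest sum to play. Bernoulli's resolution was to value the game by the utility of the payoff rather than the payoff itself; with a logarithmic utility the expected utility is finite, so a finite entry price becomes perfectly reasonable.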


Jacob I Bernoulli

Jacob Bernoulli was interested in a posteriori probabilities, i.e., computing the probability of something after the fact. His example of a glass jar containing 3000 white pebbles and 2000 black ones is often quoted in the literature. The problem goes something like this:
A pebble is drawn, its color is noted, and it is put back. How many pebbles need to be drawn so that we can be reasonably certain of the true ratio, with the result within 2% of the true ratio? The answer turns out to be 25,550.
Experimenting along the same lines, he formulated the law of large numbers, which says that the average of a large number of throws is more likely than the average of a small number of throws to differ from the true average by less than some stated amount.
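Jacob Bernoulli's jar is easy to play with on a computer. The sketch below is my own toy simulation, not something from the book: it draws pebbles with replacement from a jar that is 60% white and shows how the estimated fraction settles toward the true ratio as the number of draws grows, with 25,550 draws included among the sample sizes for reference.

```cpp
// Toy simulation of Jacob Bernoulli's pebble jar (3000 white, 2000 black):
// draw pebbles with replacement and watch the estimated white fraction
// settle around the true ratio 0.6 as the number of draws grows.
#include <iostream>
#include <random>

int main() {
    std::mt19937 gen(42);
    // P(white) = 3000 / (3000 + 2000) = 0.6
    std::bernoulli_distribution draw_white(0.6);

    const int sample_sizes[] = {100, 1000, 25550, 100000};
    for (int n : sample_sizes) {
        int whites = 0;
        for (int i = 0; i < n; ++i)
            if (draw_white(gen)) ++whites;
        double estimate = static_cast<double>(whites) / n;
        std::cout << "draws = " << n
                  << ", estimated white fraction = " << estimate << '\n';
    }
    return 0;
}
```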

De Moivre

De Moivre then picked up the concept and formulated the normal distribution curve. The third person who belonged to the same era and contributed to the formulation of a posteriori probabilities was Bayes. Though none of his work was published while he was alive, it had a great influence later. Possibly the most important contribution of Bayes was his precise formulation of the problem:

Bayes

"Given the number of times in which an unknown event has happened and failed: Required, the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named."
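In modern notation (my gloss, not the book's): if the event has happened s times and failed f times, and we start from a uniform prior on its unknown probability p, Bayes' rule gives a posterior density proportional to p^s (1 - p)^f, the Beta(s + 1, f + 1) distribution, and the "chance that the probability lies between any two degrees" is simply the integral of that density between the two named limits.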
Then came Gauss, who formulated the single most important theorem in statistics, the central limit theorem.

Gauss

Gauss was conducting research on geodesic measurements, i.e., the distance along the earth's surface between two points versus the straight-line distance between them. As the earth has curvature, the two metrics differ, and they differ by different amounts in different places. But Gauss found an amazing pattern: even though the errors between actual and observed values varied from place to place, the average of the average errors followed De Moivre's bell curve. Thus the central limit theorem deals with the average of averages.
In simple words, this theorem says that if you pick a large sample, take its average, do this multiple times, and plot the averages of all the samples, the frequency distribution is a normal distribution. This is an amazing pattern, because the actual distribution of the underlying random variables can be anything, yet sample averages tend to be normal.
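The pattern is easy to see numerically. The sketch below is my own toy check, not from the book: individual draws come from a clearly non-normal exponential distribution, yet the averages of samples of 50 draws line up with the mean and spread the central limit theorem predicts.

```cpp
// Toy check of the central limit theorem: individual draws come from a skewed
// exponential distribution, yet the averages of many such draws cluster into
// a bell-shaped pattern with the predicted mean and standard deviation.
#include <iostream>
#include <random>
#include <vector>
#include <cmath>

int main() {
    std::mt19937 gen(7);
    std::exponential_distribution<double> expo(1.0);  // mean 1, clearly non-normal

    const int sample_size = 50;      // draws per sample
    const int num_samples = 10000;   // number of sample averages
    std::vector<double> averages;
    averages.reserve(num_samples);

    for (int s = 0; s < num_samples; ++s) {
        double sum = 0.0;
        for (int i = 0; i < sample_size; ++i) sum += expo(gen);
        averages.push_back(sum / sample_size);
    }

    // Mean and standard deviation of the sample averages.
    double mean = 0.0;
    for (double a : averages) mean += a;
    mean /= num_samples;
    double var = 0.0;
    for (double a : averages) var += (a - mean) * (a - mean);
    double sd = std::sqrt(var / (num_samples - 1));

    // CLT prediction: mean ~ 1, sd ~ 1/sqrt(sample_size) ~ 0.141
    std::cout << "mean of averages = " << mean
              << ", sd of averages = " << sd
              << " (CLT predicts ~1 and ~" << 1.0 / std::sqrt(sample_size) << ")\n";
    return 0;
}
```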

Francis Galton

Galton was an amateur scientist with a keen interest in heredity but with no interest in business or economics. He was a measurement freak and studied numbers extensively. His treatise on heredity is supposed to have evoked great praise from Charles Darwin. His contribution to statistics is "regression to the mean": in the long term, the high and low values of a variable stabilize toward an average value. He also hypothesized that the influences on a variable themselves had to be normally distributed for the dependent variable to be normally distributed, which is essentially the familiar result that the sum of N normal random variables is again a normal random variable.

Period 1900 – 1960
Clouds of Vagueness and the demand for Precision

The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us. Two people from this era wanted to attribute causality to everyday events: one was Laplace, the other Henri Poincaré. However, both agreed that there is not always complete information available to attribute causality.


Laplace and Poincaré

Hence the framework of rejecting or failing to reject a hypothesis came into being. It was thought that one can never be certain about anything; one can only reject or not reject a hypothesis with some confidence level. Thus statistical inference and hypothesis testing concepts flourished.

Soon, the winds in the development of risk management started to change after the world war. The so-called happy state in the imagination of most people was shattered by the destruction all around, and more information was only adding to the uncertainty. Francis Galton died in 1911 and Henri Poincaré the following year; their passing marked the end of the grand age of measurement. Subsequently the ideas of Keynes became widespread: he hailed uncertainty and critiqued all the classical ways of dealing with it using the law of large numbers.

Harry Markowitz

At the same time, a graduate student from Chicago, Markowitz, applied mathematics to portfolio selection and came up with a model for selecting stocks. In spite of its many assumptions, it was widely adopted by the Street. For quite some time the notion that investors are rational was in vogue, until a professor from Chicago advocated the behavioral aspect of investing, which opened up a new branch of economics called behavioral economics/finance. Thus the stage was set for understanding degrees of belief and exploring uncertainty.

Post 1960
Degrees of Belief: Exploring Uncertainty

The last part of the book focuses on prospect theory and on derivatives such as futures and options as instruments for taming uncertainty.

I have tried to give a fairly elaborate summary of the book. However, there are a lot of aspects you can cherish if you go through the details behind this chronology of events.

Pointers

The topic of pointers and dynamic memory management is not understood by many folks, for a couple of reasons:

One, let's face it, it requires some effort to understand what's going on. However, it is a one-time effort, and subsequently one can learn as one experiments and codes. So why don't folks put in the one-time effort?
The second reason, I surmise, is this: most introductory C++ books have one chapter, or at most two, on pointers before diving into OOP concepts. OOP principles are important and should be covered, but often the understanding of pointers is assumed or swept under the carpet when the aspects of dynamic memory management are introduced, such as virtual functions, cloning, copy constructors, new, etc.

So the programmer typically has some idea of pointers and tries to piece together the OOP concepts. This is fine if the programmer never has to deal with memory in his working life, i.e., if a virtual machine does all the work for him/her. So you see the point here: the evolution of modern languages like Java and the .NET framework gives a programmer a lot of stuff for free, but what is lost is the basic understanding of memory management. Thus a programmer might write a loop like for (int i = 0; i < 10; ++i) { for (int j = 0; j < 10; ++j) cout << RK[i][j] << endl; } and think of the data as being stored as a 10*10 cell matrix or something like that. He might not appreciate the fact that all 100 values in the array are stored in contiguous memory locations and that RK, declared as an array, is nothing but a label for a memory address. He might not appreciate the fact that the same result as the above loop can be obtained through a pointer to the first element of the array, as the sketch below shows.
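Here is a small self-contained sketch of that point, my own example rather than anything from Daconta's book: the 10*10 array occupies 100 contiguous ints, and the same element can be reached by indexing, by arithmetic on a pointer to the first element, or by comparing addresses directly.

```cpp
// Sketch: a 10x10 array occupies 100 contiguous ints, so the same element can
// be reached with indices, with a pointer to the first element, or by looking
// at the addresses themselves.
#include <iostream>

int main() {
    int RK[10][10];
    for (int i = 0; i < 10; ++i)
        for (int j = 0; j < 10; ++j)
            RK[i][j] = i * 10 + j;

    // 1. Ordinary indexing.
    std::cout << RK[3][7] << '\n';                 // prints 37

    // 2. Pointer to the first element: walk 3*10 + 7 ints from the start.
    int* p = &RK[0][0];
    std::cout << *(p + 3 * 10 + 7) << '\n';        // also prints 37

    // 3. Contiguity check: element [3][7] sits exactly 37 ints past the start.
    std::cout << (&RK[3][7] - &RK[0][0]) << '\n';  // prints 37
    return 0;
}
```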

In most of the OOP concepts in C++ there is an inevitable meeting with symbols such as *, & and ->, and they are part of what makes the code really fast. Crucial concepts like pass by reference, operator overloading, the rule of three and virtual functions all depend on prior knowledge of pointers. Folks who have not spent time on pointers cannot appreciate the beauty of some of the rules that need to be followed. One might have a vague idea of the heap and the stack, but when one sees a memory-related error thrown by the machine, it is time to go back to fundamentals.

Well, when I began coding in C++, I went through the same cycle: coded and learnt from some introductory books, knew the Scott Meyers tips and applied them in my code, but pointers were an area I was shaky about, until I was pointed to a book on pointers well suited to someone who had already coded in C++. My understanding of C++ was greatly enhanced by a book titled "C++ Pointers and Dynamic Memory Management" by Michael C. Daconta.

If a programmer wants to develop some real power applications, I guess this is one of the books which gives a clear description of the pointers business. It is a 400-page book, and if you are a person who loves to critique your own code, you can pretty much cover it over a weekend. The next time you start coding, you will be raring to use pointers, as they are indeed what makes C++ powerful.

Art of learning
 

A few days back, I was strolling in the quiet Barnes and Noble on 14th St. when I stumbled onto a book titled "The Art of Learning". At the outset, it looked similar to "Success Vs Joy", one of my all-time favorite books, written by Geet Sethi.

This book is an autobiography by Josh Waitzkin, the chess prodigy on whose life the movie "Searching for Bobby Fischer" was made. I browsed through a few pages, read a couple of scattered points and put the book down, the reason being that I did not have time to actually sit through and read a book on learning; I was in a hurry to get on with my daily schedule. Just as I walked out of the book store, something in me made me retrace my steps to the book section. The book had a magnetic effect on me, and I ended up buying it. I read it on my travel back to my apartment, forgot about my dinner and just kept reading until I reached the end. After a very long time I managed to read a book in one sitting. In this brief post I will try to recap some of the points mentioned in the book. I feel it should be read by anyone who wants to live up to their complete potential. Here are a few points from the book.
Making Smaller Circles:
It is extremely important to concentrate on depth rather than width. Josh gives a few examples to explain this crucial aspect of learning. We might learn all the fancy things in the world yet fail to internalize some of the basic principles. An example from chess: you can learn a gazillion openings and still crumble under a new attack, for the simple reason that you started your chess life with 32 pieces on the board. Why not start with just two pieces, king and pawn, or king and bishop, and then slowly look at king and bishop versus king and knight? Incrementally build up the knowledge. The most important point is to understand each piece so well that when you look at an opening, you see it as just one of the patterns that emerge when the individual pieces come together and move in a specific way. What do you gain by drawing these smaller circles? A great perspective on things, on the depth of each move. I guess everyone should ponder and ask themselves, "When was the last time I was completely focused on a task, lost all sense of time, and in the end understood the beauty of the underlying principles?"

An example from the everyday life of, let's say, an analyst who uses stats in his work: how often does an analyst make a small circle and wonder, say, "What exactly is the difference between correlation and covariance? Is one related to the other? What are the scenarios where they convey roughly the same meaning, and what are the scenarios where each has its own meaning? Can one calculate covariance between any two variables, say when both are stochastic, or when only one of them is?" The point is that as long as one does not take the time to really understand the basic principles of a subject, one's learning, understanding and application will ONLY be superficial.
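Taking the bait on that particular small circle, here is a toy illustration of my own (the numbers are made up): covariance measures how two series move together in their own units, and correlation is the same quantity rescaled by the two standard deviations, which makes it unit-free and confined to [-1, 1].

```cpp
// Toy illustration: covariance measures co-movement in the variables' own
// units; correlation is the same quantity divided by the two standard
// deviations, so it is unit-free and lies between -1 and 1.
#include <iostream>
#include <vector>
#include <cmath>

double mean(const std::vector<double>& v) {
    double s = 0.0;
    for (double x : v) s += x;
    return s / v.size();
}

int main() {
    std::vector<double> x = {1.0, 2.0, 3.0, 4.0, 5.0};
    std::vector<double> y = {2.1, 3.9, 6.2, 8.1, 9.8};

    double mx = mean(x), my = mean(y);
    double cov = 0.0, vx = 0.0, vy = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        cov += (x[i] - mx) * (y[i] - my);
        vx  += (x[i] - mx) * (x[i] - mx);
        vy  += (y[i] - my) * (y[i] - my);
    }
    std::size_t n = x.size();
    cov /= (n - 1);  // sample covariance
    double corr = cov / (std::sqrt(vx / (n - 1)) * std::sqrt(vy / (n - 1)));

    std::cout << "covariance = " << cov << ", correlation = " << corr << '\n';
    return 0;
}
```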

Other interesting ideas discussed in the book are Investment in Loss, Beginner's Mind, and Slowing Down Time. There is a lot more than that, and my single blog post can never do justice to the book. It is a wonderful book and I strongly recommend it.