The highlight of the book is the following statement:

Any model should consider the following:

- Transaction Costs,
- Risk,
- Return,
- Market Efficiency

and the objective of an investment strategy should be to minimize the first two, maximize the third, and work around the fourth. Liquidity and taxes are the additional elements that make any model realistic.

Coming back to what the book is all about:

It is about alpha, the term used for the excess-over-benchmark returns generated by a fund manager. This book talks about the eternal quest for alpha and poses a lot of interesting questions. It begins with a history of the passive indexing strategy and then tries to bring out various perspectives on seeking alpha. Ross's APT, Markowitz's portfolio theory, Sharpe's contributions, and Tobin's separation theorem are some of the historical aspects the author brings out in a conversational mode. The rise of managed futures, behavioral finance, quant jocks, VaR and the hype around it, and more are highlighted in the book.

In the end, this book makes one think about the fund manager and the issues he or she faces.

If one looks at a typical fund manager, he has to deal with the following:

- Public information
- Technology and what role he should allow it to play
- Portfolio selection
- Portfolio risk management
- His/her own emotional ups and downs while trading and taking positions
- Dealing with Investors
- Dealing with Markets
- Analysts: fundamental, technical, quant jocks, etc.

The above are only a few of the host of activities a fund manager needs to look at. In the end, he is judged by only one thing: ALPHA.

This book talks about the changing dynamics of each of the factors mentioned above. The author takes the view that investing is becoming like the whale hunting of yesteryear. In the early days of whale hunting, targets were easily found near the shoreline. As time passed, one had to go deep into the ocean to find a whale, and eventually the whale population became so scarce that hunting was no longer a business one could depend upon. In the investment world, when markets were riddled with inefficiencies, a lot of managers were able to deliver handsome returns to investors. But as markets became more efficient, alpha became elusive, and with the rise of computing power, arbitrage opportunities are becoming scarce.

So, how does one achieve alpha in these days when beating the index is becoming more and more difficult? I don't know... it makes me wonder whether there is any case for stat-arb trading models at all. Like pairs trading, is there some strategy yet to be exploited by Wall Street? Maybe there is, maybe there isn't. The reverse law of large numbers strategy and the emerging research output from behavioral finance would, I guess, throw more light on this in the times to come.

**"Against the Gods"** is a beautiful narrative on the history of risk management. Peter L. Bernstein, the author of the book, has done a terrific job of narrating the evolution of risk measurement. The book is divided into five periods across which the story of risk is presented.

**The five demarcated periods are: up to 1200, 1200–1700, 1700–1900, 1900–1960, and post-1960.**

In each of these periods, the author talks about various personalities involved. Let me recap the book in the same manner, listing the main items from each of these time periods.

First, something about the title of the book, "Against the Gods". It is so named because the author brings out a pattern in his narrative: through the history of the development of risk runs one powerful idea, the idea that humans are in control of their destiny, set against the belief that the gods control it.

**Pre-1200**

**Numbers Era**

The Greeks had an immense interest in gambling, so probability theory would have been a natural thing for them to take up and explore. Yet the Greeks never worked on probability. One reason the author surmises is that the Greeks believed the world was controlled by the gods, and that any study aimed at controlling the universe in whatever manner would only be a futile exercise.

The pre-1200 period was characterized by people trying to understand and formalize numbers.

Fibonacci

Leonardo Pisano (Fibonacci) wrote a book called **Liber Abaci**, the first treatment of the theory and application of various aspects of numbers. This was also the period when the introduction of zero made a significant impact on the way numbers were used.

**Period 1200 – 1700**

**Outstanding Facts Era**

**Renaissance Gambler:**

Luca Pacioli was the first person to give an exhaustive treatment to accounting, in his book **Summa**; he is also regarded as the first accountant in history. In the same book he gave tables for 60×60 multiplication operations. He was a numbers man, and he posed the most famous problem of all time, the problem of balla.

*A and B are playing a fair game of balla. They agree to continue until one has won six rounds. The game actually stops when A has won five and B has won three. How should the stakes be divided?*

**Cardano**

A physician named Cardano was an eternal gambler. He gambled every day of his life, and he had seen so many gambling games that he wanted a set of rules for play based on the odds of the various outcomes. He wrote a great book on mathematics, **Ars Magna** (The Great Art), which was followed by **Liber de Ludo Aleae** (Book on Games of Chance). The latter appears to have been the first serious effort to develop the statistical principles of probability. Probability has always had two meanings: one looking into the future, the other interpreting the past; the former concerned with our opinions, the latter with what we actually know. The idea of *measuring* probability, which asks "How much can we accept of what we know?", came later. In a sense, **Liber de Ludo Aleae** was a primer on risk management. Cardano is credited with bringing in new terminology such as fair dice, circuit, combinations, and the odds ratio. Interestingly, the term "fair dice" came into being because he had spent years at the gambling table and could see how various players cheated. However, his book remained inaccessible to many mathematicians of the Renaissance for various reasons.

**French Connection:**

Pascal, Chevalier de Méré, Fermat

Three important French personalities played a significant role in the development of probability theory. The first was Blaise Pascal, an outstanding mathematician whose work on conic sections at the young age of 16 brought great praise for his intellectual faculties. Another was Fermat, whose work on the theory of numbers is by far the most comprehensive done by an individual; he is more popularly known for his last theorem, which mathematicians struggled to solve for about 350 years.

These two mathematicians were great in their respective fields, but it was Chevalier de Méré, a nobleman with a keen interest in gambling and mathematics, who posed the **old problem of balla**. In a series of communications with Fermat, Pascal came up with the triangle now popularly referred to as Pascal's triangle to calculate the odds in the problem of balla. This was the first time a mathematical tool was used to forecast something, in this case the prize money in a game. **Pascal's triangle** is a neat way to summarize the events that can happen in a probabilistic sense. For example, if 5 games are to be played between two players, then the 2^5 = 32 equally likely outcomes correspond to the 5th row of the triangle, from which one can read off how many ways each split of wins can occur.

**Remarkable Notions Man:**

Graunt

Petty

John Graunt, a merchant, and William Petty were among the first to use statistical inference techniques. Graunt was a man obsessed with verifying common everyday notions. With the help of Petty, he developed a method of drawing inferences from a small sample. Both were extremely interested in the organization of human society rather than the science of nature; however, they never used the word probability. Estimating the odds of uncertain events had to wait until 1700–1900, a period appropriately titled "Measurement Unlimited".

**Period 1700 – 1900**

**Measurement Unlimited Era**

**Meet the Bernoulli family:**

The Bernoulli family produced a swarm of mathematical descendants who made immense contributions to the understanding of uncertainty.

Daniel Bernoulli

Daniel Bernoulli is credited with bringing the risk taker into the whole game of risk. He hypothesized that the importance of additional wealth to an individual is inversely proportional to the wealth already accumulated. Moving on from the world of simple dice and roulette wheels, the inclusion of the player brought a whole new dimension to the development of risk management. **Utility as a concept had a tremendous influence on the way risk management principles** were developed in later years. The St. Petersburg paradox is the classic example through which Bernoulli explained the utility concept.

Jacob I Bernoulli

Jacob Bernoulli was interested in a posteriori probabilities, i.e., computing the probability of something after the fact. His example of a glass jar holding 3000 white pebbles and 2000 black ones is often quoted in the literature. The problem goes something like this:

A pebble is drawn, its color noted, and the pebble put back. How many pebbles need to be drawn so that we can be reasonably certain the observed ratio is within 2% of the true ratio? The answer turns out to be 25,550.

Experimenting along the same lines, he formulated the **law of large numbers**, which says: *the average of a large number of throws is more likely than the average of a small number of throws to differ from the true average by less than some stated amount.*

De Moivre

De Moivre then picked up the concept and formulated the **normal distribution curve**. The third person who belonged to the same era and contributed to the formulation of a posteriori probabilities was Bayes. Though none of his work was published while he was alive, it had a great influence later. Possibly the most important contribution of Bayes was his precise problem formulation:

Bayes

*Given the number of times in which an unknown event has happened and failed: required, the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named.*

Then came Gauss, who formulated the single most important theorem in statistics, the central limit theorem.


Gauss

Gauss was conducting research on geodesic measurements, i.e., the distance between two points measured along the earth's surface versus the straight-line distance between them. As the earth has a curvature, the two metrics differ, and they differ by different amounts in different places. What Gauss found was an amazing pattern: even though the errors between actual and observed values varied from place to place, the average of the average errors followed De Moivre's bell curve. Thus the **central limit theorem deals with the average of averages.**

In simple words, the theorem says: if you pick a large sample, take its average, do this many times, and plot the averages of all the samples, the resulting frequency distribution is a normal distribution. This is an amazing pattern, because the actual distribution of the underlying random variable can be anything, yet sample averages tend to be normal.

Francis Galton

Galton was an amateur scientist with a keen interest in heredity but no interest in business or economics. He was a measurement freak and studied numbers extensively. His treatise on heredity is supposed to have evoked great praise from Charles Darwin. His contribution to statistics is **"regression to the mean"**: in the long term, the high and low values of a variable stabilize toward an average value. He also hypothesized that the influences on a variable must themselves be normally distributed for the variable to be normally distributed, which is essentially the familiar result that the sum of N normal random variables is itself a normal random variable.

**Period 1900 – 1960**

**Clouds of Vagueness and the demand for Precision**

The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control and the linkage between effect and cause is hidden from us. Two people from this era wanted to attribute causality to everyday events: one was Laplace, the other Henri Poincaré. However, both agreed that one does not always have complete information with which to attribute causality.

Laplace

Poincare

Hence the framework of **rejecting or failing to reject a hypothesis** came into being. The thought was that one can never be certain about anything; one can only reject or fail to reject a hypothesis at some confidence level. Thus statistical inference and hypothesis testing flourished.

Soon after the world war, the winds in the development of risk management started to change. The so-called happy state that lived in most people's imagination was shattered by the destruction all around; more information was only adding to the uncertainty. Francis Galton died in 1911 and Henri Poincaré the following year, and their passing marked the end of the grand age of measurement. Subsequently the ideas of Keynes became widespread: he hailed uncertainty and critiqued all the classical ways of dealing with it using the law of large numbers.

Markowitz

Around the same time, Markowitz, a graduate student from Chicago, applied mathematics to portfolio selection and came up with a model for selecting stocks. In spite of its many assumptions, it was widely adopted by the Street. For quite some time the notion that investors are rational was in vogue, until a professor from Chicago advocated the behavioral aspect of investing, which opened up a new branch of economics: behavioral economics and finance. Thus the stage was set for understanding degrees of belief and exploring uncertainty.

**Post 1960**

**Degrees of Belief & Exploring Uncertainty**

The last part of the book focuses on prospect theory, and on derivatives such as futures and options as instruments to tame uncertainty.

I have tried to give a fairly elaborate summary of the book. However, there are many more aspects to cherish if you go through the details behind this chronology of events.

The topic of **pointers and dynamic memory management** is not well understood by many folks, for a couple of reasons:

**One**: let's face it, it requires some effort to understand what's going on. However, it is a one-time effort, and subsequently one can keep learning by experimenting and coding. So why don't folks put in that one-time effort?

The **second** reason, I surmise, is this: most introductory C++ books have one chapter, or at most two, on pointers before diving into OOP concepts. OOP principles are important and should be covered, but the understanding of pointers is often assumed, or swept under the carpet, by the time the dynamic-memory aspects such as virtual functions, cloning, copy constructors, and `new` are introduced.

So the programmer typically has some idea of pointers and tries to piece together the OOP concepts. This is fine if the programmer never has to deal with memory in his or her working life, i.e., if a virtual machine does all the work. You see the point here: the evolution of modern platforms like Java and the .NET framework gives a programmer a lot of stuff for free, but what is lost is the basic understanding of memory management. Thus a programmer might write `for (i = 0; i < 10; i++) { for (j = 0; j < 10; j++) cout << RK[i][j] << endl; }` and think of the data as a 10×10 cell matrix or something like that. He might not appreciate that all 100 values in the array are stored in contiguous memory locations, and that `RK`, declared as an array, is essentially a label for a memory address. Nor might he appreciate that the same result as the loop above can be obtained through a pointer to the first element of the array.

In most OOP work in C++ there is an inevitable meeting with symbols such as `*`, `&`, and `->`, and they make code really fast. Crucial concepts like pass by reference, operator overloading, the rule of three, and virtual functions all depend on prior knowledge of pointers. Folks who have not spent time on pointers cannot appreciate the beauty of some of the rules that must be followed. One might have a vague idea of the heap and the stack, but when the machine throws a memory-related error, **it is time to go back to fundamentals**.

Well, when I began coding in C++ I went through the same cycle: I coded and learned from some introductory books, knew the Scott Meyers tips and applied them in my code, but pointers remained an area I was shaky about, until I was pointed to a book well suited to someone who had already coded in C++. My understanding of C++ was greatly enhanced by a book titled **"C++ Pointers and Dynamic Memory Management"** by Michael C. Daconta.

If a programmer wants to develop some real power applications, I guess this is one of the books that gives a clear description of the pointers business. It is a 400-page book, and if you are a person who loves to critique your own code, you can pretty much cover it over a weekend; the next time you start coding, you will be raring to use pointers, as they are indeed what makes C++ powerful.

A few days back, I was strolling in the quiet Barnes and Noble on 14th St. when I stumbled onto a book titled "The Art of Learning". At the outset it looked similar to "Success Vs Joy", one of my all-time favorite books, written by Geet Sethi.

This book is an autobiography of Josh Waitzkin, the chess prodigy whose life the movie "Searching for Bobby Fischer" was based on. I browsed through a few pages, read a couple of scattered points, and put the book down, the reason being that I did not have time to actually sit through a book on learning; I was in a hurry to get on with my daily schedule. Just as I walked out of the store, something in me made me retrace my steps to the book section. The book had a magnetic effect on me and I ended up buying it. I read it on the way back to my apartment, forgot about dinner, and just kept reading until I reached the end. After a very long time I managed to read a book in one sitting. In this brief post, I will try to recap some of the points mentioned in the book. I feel it should be read by anyone who wants to live up to one's complete potential. Here are a few points from the book.

**Making Smaller Circles:**

It is extremely important to concentrate on depth rather than width. Josh gives a few examples to explain this crucial aspect of learning. We might learn all the fancy things in the world yet fail to internalize some of the basic principles. An example from chess: you can learn a gazillion openings and still crumble under a new attack, for the simple reason that you started your chess life with all 32 pieces on the board. Why not start with just two pieces, king and pawn, or king and bishop, and then slowly look at king and bishop versus king and knight? Incrementally build up the knowledge. The most important point is to understand each piece so well that when you look at an opening, you see it as just one of the patterns that arise when the individual pieces come together and move in a specific way. What do you gain by drawing these smaller circles? A great perspective on things, on the depth of each move. I guess everyone should ponder the question: "When was the last time I was completely focused on a task, lost all sense of time, and in the end understood the beauty of the underlying principles?"

An example from the everyday life of, say, an analyst who uses stats in his work: how often does an analyst draw a small circle and wonder, "What exactly is the difference between correlation and covariance? Is one related to the other? In what scenarios do they convey roughly the same meaning? In what scenarios does each have its own meaning? Can one calculate the covariance between any two variables, say when both are stochastic, or when only one of them is?" The point is that as long as one does not take the time to really understand the basic principles of a subject, one's learning, understanding, and application will ONLY be superficial.

Other interesting ideas discussed in the book are **Investment in Loss, Beginner's Mind, and Slowing Down Time**. There is a lot more than that, and a single blog post can never do justice to the book. It is a wonderful book and I strongly recommend it.