May 2014



The title is meant to convey that many problems in mathematics can be solved using elementary tools: street-fighting moves rather than heavyweight combat weaponry. There have been many other books in this genre that highlight the importance of smart guessing and approximation, but this book is exceptional in one way: it shows that problems such as differentiation, integration, and differential equations, which are typically not dealt with in pop-science books, can also be cracked with street-fighting tools. Many nonlinear differential equations, too, can be solved using the tools in the book. I guess this book is particularly targeted at those who need to deal with some kind of math on a daily basis. What does the book contain? There are six tools mentioned in the book:

 


 

  1. Dimensional Analysis: This is the classic Physics 101 technique of using the dimensions of the various quantities in a formula to check its validity. However, the author shows enough examples to demonstrate its power in a variety of problems, from integration to solving differential equations.
  2. Easy cases: This tool entails using easy cases to check a proposed formula or solution and to construct low-entropy expressions that pass all the easy-cases tests. For many problems in geometry, this tool is very potent. In fact, it is something we all use intuitively, though we probably never notice that we are doing so.
  3. Lumping: This tool turns calculus on its head and solves problems not by "calculation on infinitesimal intervals" but by "lumping intervals". Techniques like the 1/e approximation and full width at half maximum (FWHM) turn integration problems into summation problems. The author also shows that by using a secant approximation to the tangent, one can estimate derivatives and, in the process, convert differential equations into simple equations that can be solved quickly.
  4. Pictorial Proofs: This tool is about using pictures to prove various mathematical identities, integrals, etc. When a problem is solved pictorially, the brain is far more likely to remember the proof and recall it effortlessly. With images, one can see all the big ideas at a glance.
  5. Taking out the big part: First things first, i.e. tackling the big part of the problem is the essence of this tool. Analyze the big part first, and worry about the correction afterward. This successive-approximation approach, a species of divide-and-conquer reasoning, gives results automatically in a low-entropy form.
  6. Analogy: This tool entails solving a similar but less complicated problem and using its results as a guide to the more complex one. An example in this chapter involving the roots of the equation tan(x) - x = 0 is beautiful: using street-fighting tools, the author sums an infinite series of transcendental numbers.

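To make the lumping idea (tool 3 above) concrete, here is a small Python sketch of the 1/e heuristic applied to the Gaussian integral of exp(-x^2) over [0, inf): replace the bell curve with a single rectangle whose height is the peak value and whose width is the distance at which the integrand falls to 1/e of that peak. This is my own illustration in the spirit of the book, not an excerpt from it.

```python
import math

def gaussian(x):
    # The integrand exp(-x^2); its exact integral over [0, inf) is sqrt(pi)/2.
    return math.exp(-x * x)

# Lumping: approximate the whole area by one rectangle.
peak = gaussian(0.0)            # height of the bump: 1
width = 1.0                     # gaussian(1) = 1/e, so the "1/e width" is 1
lumped = peak * width           # rectangle estimate of the integral

exact = math.sqrt(math.pi) / 2  # ~0.8862, the true value

print(f"lumped estimate: {lumped:.4f}")
print(f"exact value:     {exact:.4f}")
print(f"relative error:  {abs(lumped - exact) / exact:.1%}")
```

Despite its crudeness, the rectangle lands within about 13% of the true value, which is the whole point of lumping: a few seconds of mental arithmetic buys you the right order of magnitude and most of the leading digit.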
The book is neatly organized. It describes one tool per chapter, and to enable active learning, each chapter is interspersed with questions along the way to cement the reader's understanding. There are many things I liked about the book. I will list a few of them.

  • A way to derive Stirling's approximation of n! using pictorial proofs and lumping.
  • Pictorial proof for AM >= GM.
  • Newton-Raphson method explained via pictures.
  • Using a secant approximation to the derivative to solve linear differential equations such as the spring-mass system and the period of a pendulum.
  • Tackling the Navier-Stokes equations using dimensional analysis and easy cases.
  • Solving a Gaussian integral via dimensional analysis and easy cases. Gaussian integrals are something I see day in and day out, but to date I have never paused to solve one via dimensional analysis!
  • Using dimensionless groups of variables to guess the relationships amongst them.
  • Importance of working with low entropy expressions.
  • Deriving the Euler-Maclaurin formula via analogy.

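The Stirling bullet above is easy to check numerically. A quick sketch (mine, not the book's) comparing n! against the standard formula sqrt(2*pi*n) * (n/e)^n:

```python
import math

def stirling(n):
    # Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    exact = math.factorial(n)
    approx = stirling(n)
    # The ratio approaches 1 from below as n grows.
    print(f"n={n:2d}  ratio={approx / exact:.4f}")
```

Even at n = 5 the approximation is within a couple of percent, and the error shrinks roughly like 1/(12n) as n grows.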
 

Takeaway:

"When the going gets tough, the tough lower their standards" is the message of the book. In lowering the standards, one escapes rigor mortis: the fear of making an unjustified leap even when it lands on a correct result. Escaping rigor mortis need not always produce an accurate answer, but why bother with a three-decimal-place solution when a rough approximation is all we need most of the time?


The distinguishing feature of state space time series models is that observations are regarded as made up of distinct components such as trend, seasonal, regression elements, and disturbance terms, each of which is modeled separately. These component models are put together to form a single model, called a state space model, which provides the basis for the analysis. The book is primarily aimed at applied statisticians and
econometricians. Not much math background is needed to get through the book, at least its first part. State space time series analysis began with the path-breaking paper of Kalman, and early developments of the subject took place in engineering, which is where the term state space comes from. Statisticians and econometricians have tended to stick with the same terminology.

Part I of the book deals with linear Gaussian models, and Part II deals with their extensions to the nonlinear, non-Gaussian world. Extensions such as exponentially distributed observations, nonlinearities in the model, and heavy-tailed densities, which handle outliers in the observations and structural shifts in the state, are dealt with in the second part. The treatment given in the book is simulation based, since, except for a few models, no closed-form solutions are available. Instead of following the MCMC route, the book shows the use of importance sampling and antithetic variables in analyzing state space models. The book provides both the classical and the Bayesian treatment. In the link at the end of this post, I have summarized Part I of the book, which takes up 175 of its roughly 240 pages. Part II is all about nonlinear non-Gaussian models and the use of importance sampling to estimate the filtering and smoothing quantities. In contrast to the math-heavy Part II, the first part of the book is definitely manageable. In fact, one can go through the entire Part I knowing just one lemma.
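For a flavour of what Part I covers, here is a minimal Python sketch of the Kalman filter for the local level model, the simplest state space model: y_t = alpha_t + eps_t with alpha_{t+1} = alpha_t + eta_t. The recursions are the standard textbook ones; the variance values, seed, and variable names are illustrative choices of mine, not taken from the book.

```python
import random

def kalman_filter_local_level(y, var_eps, var_eta, a0=0.0, p0=1e7):
    """Filtered state estimates for the local level model:
    y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t."""
    a, p = a0, p0                   # near-diffuse initial state and variance
    filtered = []
    for obs in y:
        v = obs - a                 # prediction error (innovation)
        f = p + var_eps             # innovation variance
        k = p / f                   # Kalman gain
        a = a + k * v               # updated state estimate
        p = p * (1 - k) + var_eta   # updated state variance
        filtered.append(a)          # for this model, this equals a_{t|t}
    return filtered

# Simulate a random walk observed with noise, then filter it.
random.seed(42)
alpha, y = 0.0, []
for _ in range(100):
    y.append(alpha + random.gauss(0, 1.0))  # observation noise, var_eps = 1
    alpha += random.gauss(0, 0.5)           # state noise, var_eta = 0.25

filtered = kalman_filter_local_level(y, var_eps=1.0, var_eta=0.25)
print(f"last filtered state: {filtered[-1]:.3f}")
```

Everything in the linear Gaussian part of the subject is, in essence, elaborations of these few update lines to richer state vectors, which is why the material is so manageable once the one underlying lemma on conditional normal distributions is understood.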