November 2014


image

For those of us born before 1985, it is likely that we have seen two worlds: one that wasn't dependent on the Net, and another in which our lives are dominated and controlled by the web and social media. The author says that, given this vantage point, we have a unique perspective on how things have changed. It is impossible to imagine a life without print. Yet before Gutenberg's printing press, invented around the 1450s, access to knowledge was primarily through oral tradition. Similarly, maybe a decade or two from now, the next generation will be hard-pressed to imagine a life without connectivity. There is one BIG difference between Gutenberg's revolution and the Internet—the pace. Even though the printing press was invented around the 1450s, it was not until the 19th century that enough people were literate for the written word to influence society. In contrast, we have seen both the offline and online worlds in less than one lifetime.

We are, in a sense, a straddle generation, with one foot in the digital pond and the other on the shore, experiencing a strange suffering as we acclimatize. We are the only people in history to experience this scale of change within a single generation. The author is not critical of technology; after all, technology is neither good nor bad. It is neutral. First, something about the word in the title—Absence. It is used as a catch-all term for any activity that does not involve the internet, mobiles, tablets, social media, etc. Given this context, the author structures the book into two parts. The first part explores certain aspects of our behavior that have changed dramatically, and whose consequences we can see all around us. The second part is part reflection, part experimentation by the author in remaining disconnected in this hyper-connected world. In this post, I will summarize a few sections of the book.

Kids these days

Increasingly, kids are living in a world where the daydreaming silences in their lives are filled by social media notifications, and burning solitudes are extinguished by constant yapping on social networks and phones and by playing video games. That said, what is the role of a parent? The author argues that we have a responsibility to provide children with enough offline time.

How often have you seen a teenager staring out of a window, doing nothing, just being silent? In all likelihood, parents would think that something is wrong with their kid—he had a fight with a sibling, something in class upset him or her, someone taunted the kid, and so on. If the kid is typing on a mobile, talking over the phone, texting, or playing a video game, the standard parental reaction is "kids these days," and they leave it at that. Instead of actively creating an atmosphere in which downtime becomes a recurring event, parents are shoving technology onto kids to escape the responsibility. How else can one justify this product (the iPotty)?

image

One digital critic says, “It not only reinforces unhealthy overuse of digital media, it’s aimed at toddlers. We should NOT be giving them the message that you shouldn’t even take your eyes off a screen long enough to pee.”

Many research studies have concluded that teenagers are more at ease with their technologies than with one another. The author argues that parents should be aware of subtle cues and create engineered absences for their children, so that they develop empathy for others via real-world interactions rather than through avatars in the digital world.

Montaigne once wrote, “We must reserve a back shop, all our own, entirely free, in which to establish our real liberty and our principal retreat and solitude.” But where will tomorrow’s children set up such a shop, when the world seems to conspire against the absentee soul?

Confession

The author mentions the Amanda Todd incident—a teenager posted a YouTube video about her online bully and committed suicide. Online bullying is a widespread phenomenon, but social technologies offer only weak solutions to the problem—flag this as inappropriate / block the user. Though crowd-sourced moderation may look like a sensible solution, in practice it is not: the moderation teams at most social media firms cannot handle the volume of flagging requests. The author mentions big-data tools being developed that take in all the flagging streams and then decide on the appropriate action. The reduction of our personal lives to mere data does run the risk of collapsing things into a Big Brother scenario, with algorithms scouring the Internet for "unfriendly" behavior and dishing out "correction" in one form or another. Do we want algorithms to abstract, monitor and quantify us? Well, if online bullying can be reduced by digital tools, so be it, even though it smells like a digital band-aid for problems of the digital world. The author is, however, more concerned with the "broadcast" culture that has suddenly been thrust upon us.

When we make our confessions online, we abandon the powerful workshop of the lone mind, where we puzzle through the mysteries of our own existence without reference to the demands of an often ruthless public.

Our ideas wilt when exposed to scrutiny too early—and that includes our ideas about ourselves. But we almost never remember that. I know that in my own life, and in the lives of my friends, it often seems natural, now, to reach for a broadcasting tool when anything momentous wells up. Would the experience not be real until you had shared it, confessed your “status”?

The idea that technology must always be a way of opening up the world to us, of making our lives richer and never poorer, is a catastrophic one. But the most insidious aspect of this trap is the way online technologies encourage confession while simultaneously alienating the confessor.

Authenticity

The author wonders about the "distance" that any new technology creates between a person and his or her direct experience. Maps made the knowledge gained by exploring a place seem less useful and more time-consuming to acquire, since an abstract version of it was more convenient. The mechanical clock regimented leisurely time and eventually had more control over you than your body's own inclinations. So do MOOCs, which take us away from directly experiencing a teacher's lesson in flesh and blood. These are changes in our society, and possibly irrevocable ones. The fact that the selfie stick made Time magazine's list of the 25 best inventions of 2014 says that there is some part of us that would rather share the moment than actually experience it. Daniel Kahneman, in one of his interviews, talks about the riddle of the experiencing self vs. the remembering self.

Suppose you go on a vacation and, at the end, you get an amnesia drug. Of course, all your photographs are also destroyed. Would you take the same trip again? Or would you choose one that’s less challenging? Some people say they wouldn’t even bother to go on the vacation. In other words, they prefer to forsake the pleasure, which, of course, would remain completely unaffected by its being erased afterwards. So they are clearly not doing it for the experience; they are doing it entirely for the memory of it.

Almost every person in today's world, I guess, depends on digital technologies. Does that take us away from having an authentic, more direct experience? Suppose your cell phone and your internet connection were taken away from you over a weekend: could you still have a relaxed, refreshing weekend? If the very thought of such a temporary two-day absence causes discomfort, then we should probably revisit the very idea of what it means to have an authentic experience. Can messaging a group on WhatsApp count as an authentic "being with a friend" experience? All our screen time, our digital indulgence, may well be wreaking havoc on our conception of the authentic. Paradoxically, it's the impulse to hold more of the world in our arms that leaves us holding more of reality at arm's length.

The author mentions the Carrington Event:

On September 1, 1859, a storm on the surface of our usually benevolent sun released an enormous megaflare, a particle stream that hurtled our way at four million miles per hour. The Carrington Event (named for Richard Carrington, who saw the flare first) cast green and copper curtains of aurora borealis as far south as Cuba. By one report, the aurorae lit up so brightly in the Rocky Mountains that miners were woken from their sleep and, at one a.m., believed it was morning. The effect would be gorgeous, to be sure. But this single whip from the sun had devastating effects on the planet’s fledgling electrical systems. Some telegraph stations burst into flame.

and says that such an event, according to experts, has a 12% probability of occurring in the next decade and a 95% probability of occurring in the next two centuries. What will happen when it does?

Breaking Away

This chapter narrates the author's initial efforts to seek absence. In a way, the phrase "seeking absence" is itself ironic. If we simply stop seeking and stay still, aren't we already in absence? Not really, if our mind is in a hyperactive state.

One can think of many things that demand a significant time investment—an uninterrupted time investment, to be precise. In my life, there are a few such activities: reading a book, understanding a concept in math or statistics, writing a program, playing an instrument. One of the first difficulties in pursuing these tasks is: how does one manage distractions, digital or analog? As for the digital ones, constantly checking email, WhatsApp or Twitter makes it tough to concentrate on a task that requires full immersion.

Why does our brain want to check emails and messages so often? What makes these tools addictive? It turns out the answer was given way back in 1937 by the psychologist B.F. Skinner, who described the behavior as "operant conditioning." Studies show that constant, reliable rewards do not produce the most dogged behavior; rather, it's sporadic and random rewards that keep us hooked. Animals, including humans, become obsessed with reward systems that only occasionally and randomly give up the goods. We continue the conditioned behavior for longer when the reward is taken away because surely, surely, the sugar cube is coming up next time. So that one meaningful email once in a while keeps us hooked on frequent email checking. Is trading in the financial markets in search of alpha an outcome of operant conditioning? The more I look at traders who keep trading despite poor performance, the more certain I feel that it is: the occasional reward keeps them hooked on trading despite subpar results.
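As a toy illustration of why sporadically rewarded habits are so hard to extinguish, here is a small sketch of my own (not the author's or Skinner's model; the giving-up rule and all numbers are invented). The agent abandons a habit only once a run of unrewarded checks becomes very unlikely under the reward rate it learned during conditioning:

```python
# Toy model (illustration only): how many unrewarded checks does it take to
# "give up" a habit, if you quit only when such a long dry run would be very
# surprising under the reward rate you learned?

def checks_before_giving_up(learned_reward_rate, surprise_threshold=0.01):
    """Smallest run of misses n such that P(n misses in a row) < threshold."""
    p_miss = 1 - learned_reward_rate
    n, p_run = 0, 1.0
    while p_run >= surprise_threshold:
        n += 1
        p_run *= p_miss
    return n

for rate in (0.9, 0.5, 0.1, 0.02):   # fraction of checks that used to pay off
    print(f"learned reward rate {rate:4.0%} -> gives up after "
          f"{checks_before_giving_up(rate)} unrewarded checks")
```

The rarer and more random the reward was, the longer the behavior survives after the reward disappears, which is roughly the operant-conditioning effect described above.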

Try reading a book and catch yourself: how many times do you start thinking about what else you could be doing or reading right now? Have we lost the ability to remain attentive to a given book or task without constantly multitasking? Incidentally, research strongly suggests there is no such thing as multitasking; all we do is rapid mini-tasking. It definitely happens to me quite a number of times. When I am going through something that is really tough to understand in an ebook (mostly, these days, the books I end up reading are in ebook format, as hardbound editions are beyond my budget), I hit ALT+TAB—the attention-killing key combination that takes me from a situation where I have to actively focus in order to understand, to a Chrome/Firefox tab where I can passively consume content and indulge in hyperlink hopping, wasting time and really gaining nothing. Over the years, I have figured out a few hacks that alert me to this compulsive ALT+TAB behavior. I cannot say I have slain the ALT+TAB dragon for good, but at least I have managed to control it.

The author narrates his experience of trying to read "War and Peace," a thousand-page book, amidst his hyper-connected world. He fails in his initial attempts, finding himself indulging the automatic desires of the brain.

I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.

Seeking help from Peter Bregman ("18 Minutes"), he allots himself 100 pages of "War and Peace" each day, with only three email check-ins a day. He also explores the idea of using software that might help him control distractions (Dr. Sidney D'Mello, a Notre Dame professor, is creating software that tracks a person's attention in real time and sounds an alarm when it wanders). In the end, what helps the author complete "War and Peace" is his awareness of the lack of absence, which makes him find periods of absence in which he can immerse himself. I like the way he describes this aspect,

As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind.

There’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world—and that I am short on. So much of our work is an act of faith, in the end. We don’t know that the in-box is emergency-free, we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,”

Does real thinking require retreat? The author thinks so, and cites the example of John Milton, who took a decade off to read, read, and read, at a time when his peers were DOING and ACCOMPLISHING things. Did he waste his time? After this retreat, Milton wrote "Paradise Lost," a totemic feat of concentration. This example may be a little too extreme for a normal person to emulate, but I think we can actively seek mini-retreats in a day, a week, a month, or a year. By becoming oblivious to thoughts like "what are others doing?", "what else should I be doing right now?", and "what could the new notification on my mobile or desktop be about?", I guess we can manage to steal mini-retreats in our daily lives.

Memory

The word memory evokes, at least for many of us, a single stationary cabinet in which we file everything and from which retrieval is at best partial (at least for the trivia of our daily lives). This popular notion has been thoroughly invalidated by experiments in neuroscience. Brenda Milner's study of patient H.M. is considered a landmark event in neuroscience, as it established that memory is not a single stationary filing cabinet. Motor memory and declarative memory reside in different parts of the brain. Many subsequent experiments have established that human memory is a dynamic series of systems, with information constantly moving between them—and changing.

Why does the author talk about memory in this book? Mainly because we are relying more and more on Google, Wikipedia and other digital tools for storing and looking up information: we have outsourced "transactive memory" to these services. In this context, the author mentions Timehop, a service that aggregates content from your presence on Facebook, Twitter and blogs and reminds you what you were doing a year ago. You might think it is a cool thing that Timehop keeps track of your life, but something subtler is going on behind such services: we are offloading our memory to digital devices. Isn't it good that I don't have to remember all the trivia of life and yet have it at the click of a button? Why do we need memory at all when everything we need is a click away? There is no harm in relying on these tools. The issue, however, is that you cannot equate effective recall of information with human memory. Memorization is the act of making something "a property of yourself," in both senses: the memorized content is owned by the memorizer and also becomes a component of that person's makeup. If you have memorized something, then the next time you access it you form a new memory of it; accessing memory changes memory. This is fundamentally different from externalized memory. In a world where we increasingly find whatever we need online, "having a good memory" or memorization skills might seem useless. This chapter argues that they are not.

How to absent oneself?

This chapter is about the author's experience of staying away from the digital world for one complete month—he fondly calls it "Analog August." He dutifully records his thoughts each day. At the end of the month he has no epiphany, nor any breakthrough in his work. When he resumes his life after the month-long sabbatical, he realizes one thing: every hour, every day, we choose to allow devices, services and technologies into our lives. Prioritizing them and being aware of them is half the battle won; consciously stepping away from each of these connections on a daily basis is essential to escape their spell.

Takeaway:

How does it feel to be the only people in history to know life both with and without the Internet? Inevitably, the Internet is rewiring our brains. The Internet does not merely enrich our experience; it is becoming our experience. By thinking about the various aspects that are changing, we might be able to answer two questions: 1) What will we carry forward as a straddle generation? 2) What worthy things might we thoughtlessly leave behind as technology dissolves into the very atmosphere of our lives? Michael Harris has tried answering these questions and, in the process, has written a very interesting book. I thoroughly enjoyed reading it amidst complete silence. I guess every straddle-generation reader will relate to many aspects mentioned in the book.

image

In today's world, where access to information is being democratized like never before, "learning how to learn" is a skill that will play a key role in one's academic and professional accomplishments. This book collates ideas from some of the recent books on learning, such as "Make it Stick," "A Mind for Numbers," "The Five Elements of Effective Thinking," and "Mindset." The author adds his own take on the various research findings and has come up with a 250-page book. Even if you have really absorbed the concepts in those earlier books, this one is still worth reading: any exercise that puts you in retrieval mode for certain concepts alters your memory of those concepts. So even though this book serves as a content aggregator of the previous ones, reading the material through the eyes of a new author changes the way we store and retrieve the main principles behind effective learning.

Broaden the Margins

The book starts with the author narrating his own college experience, in which standard learning techniques like "find a quiet place to study," "practice something repeatedly to attain mastery," and "take up a project and do not rest until it is finished" were extremely ineffective. He chucked that advice and adopted an alternative mode of learning. Only later, in his career as a science journalist, did he realize that some of the techniques he had adopted in college were actually rooted in solid empirical research. Researchers over the past few decades have uncovered techniques that remain largely unknown outside scientific circles, and the interesting thing about them is that they run counter to the learning advice we have all received at some point in our lives. Many authors have written books and blog posts to popularize these techniques. The author carefully puts the main ones in a format that is easy to read, stripping away the academic jargon. The introductory chapter gives a roadmap of the four parts of the book and preps the reader's mind to look out for the various signposts in the "learning to learn" journey.

The Story maker

A brief look at the main players in our brain:

image

Labeled in the figure are three areas. The entorhinal cortex acts as a filter for incoming information, the hippocampus is where memory formation begins, and the neocortex is where conscious memories are stored. It was H.M., the legendary case study, who gave the medical research community its first glimpse into the workings of the brain. Doctors removed the hippocampus from H.M.'s brain, essentially removing his ability to form new long-term memories. Many amazing aspects of the brain were revealed by experiments on H.M. One of them is that motor skills, like playing music or driving a car, do not depend on the hippocampus. This meant that memories are not uniformly distributed and that the brain has specific areas handling different types of memory. H.M. retained some memories of his past after the removal of his hippocampus, which meant long-term memories were residing in some other part of the brain. Researchers then figured out that the only candidate left was the neocortex. The neocortex is the seat of human consciousness, an intricate quilt of tissue in which each patch has a specialized purpose.

image

To the extent that it's possible to locate a memory in the brain, that's where it resides: in neighborhoods along the neocortex, primarily, not at any single address. That is as far as storage is concerned; how is retrieval done? Again, a set of studies on epilepsy patients revealed that the left brain weaves a story based on the sensory information it receives. The left hemisphere takes whatever information it gets and tells a tale to the conscious mind. Which part of the left brain tells this story? There is no conclusive evidence; the only thing known is that this interpreter module sits somewhere in the left hemisphere and is vital to forming a memory in the first place. The science clearly establishes one thing: the brain does not store facts, ideas and experiences the way a computer does, as a file that is clicked open and always displays the identical image. It embeds them in a network of perceptions, facts and thoughts, slightly different combinations of which bubble up each time. No memory is completely lost, but any retrieval of a memory fundamentally alters it.

The Power of Forgetting

This chapter talks about Hermann Ebbinghaus and Philip Boswood Ballard, who were among the first to conduct experiments on memory storage and retrieval. Ebbinghaus tried to cram roughly 2,300 nonsense syllables and measured how long it took to forget them.

image

The figure above captures how we usually think of memory: our retention of anything falls as time passes. Philip Boswood Ballard, on the other hand, was curious about what could be done to improve learning. He tested his students at frequent intervals and found that testing increased their retention and made them better learners. These two lines of work were followed by several other experiments, and finally the Bjorks of UCLA shepherded the findings into a concrete theory, which they call "Forget to Learn." Any memory has two kinds of strength, storage strength and retrieval strength. Storage strength builds up steadily and grows with study and use. Retrieval strength, on the other hand, is a measure of how quickly a nugget of information comes to mind. It increases with studying and use, but without reinforcement it drops off quickly, and its capacity is relatively small. The surprising thing about retrieval strength is this: the harder we have to work to retrieve something, the greater the subsequent spike in retrieval and storage strength. The Bjorks call this "desirable difficulty." This leads to the key message of the chapter: forgetting is essential for learning.
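To make the storage-strength/retrieval-strength distinction concrete, here is a deliberately crude toy model of my own (not the Bjorks' actual equations; the decay rule and constants are invented purely for illustration): retrieval strength fades between sessions, each study session boosts both strengths, and the boost is larger the further retrieval strength has fallen.

```python
# Crude toy model (my own illustration, not the Bjorks' formal theory).

def decay(retrieval, gap_days, storage):
    # retrieval strength halves every (1 + storage) days -- arbitrary choice
    return retrieval * 0.5 ** (gap_days / (1 + storage))

def study(storage, retrieval):
    gain = 1.0 - retrieval            # harder retrieval -> larger gain
    return storage + gain, min(1.0, retrieval + gain)

def final_storage(gap_days, sessions=5):
    storage, retrieval = 0.3, 0.9
    for _ in range(sessions):
        retrieval = decay(retrieval, gap_days, storage)
        storage, retrieval = study(storage, retrieval)
    return storage

print("massed  (1-day gaps) :", round(final_storage(1), 2))
print("spaced (14-day gaps) :", round(final_storage(14), 2))
```

With the same number of study sessions, the spaced schedule ends with markedly higher storage strength in this toy model, which is the qualitative point the chapter makes.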

Breaking Good Habits

This chapter says that massed practice does not work as well as randomized practice. Find a particular place to do your work, work on just one thing till you master it, and then move on to the next: that is the advice we usually hear for effective learning. This chapter argues that changing the study environment and randomly varying the topics you study gives better retrieval than the old school of thought suggests.

Spacing out

This chapter says that spacing out your learning is better than massed practice. If you are learning anything new, it is always better to spread the work out than to cram everything in one go. This is the standard advice: do not study all at once; study a bit daily. But how do we space out the studies? What is the optimal time to revisit something you have already read? Wait too long and the rereading feels like completely new material; wait too little and your brain gets bored by the familiarity. This chapter narrates the story of Piotr Wozniak, who tackled the problem of how to space your studies and eventually created SuperMemo, a digital flashcard program used by many people to learn foreign languages. Anki, an open-source flashcard program built on the same spaced-repetition ideas, is another very popular way to build spaced repetition into your learning schedule. The essence of this chapter is to distribute your study time over a longer interval in order to retrieve efficiently and ultimately learn better.
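For the curious, here is a minimal sketch of an SM-2-style scheduler, the family of algorithms behind SuperMemo and Anki. The real programs differ in details (learning steps, interval fuzzing, lapse handling), so treat this as an approximation rather than either tool's actual implementation:

```python
# Minimal SM-2-style spaced-repetition scheduler (sketch, not Anki's or
# SuperMemo's exact implementation).

def next_review(interval_days, ease, quality):
    """Return (new_interval_days, new_ease) after one review.

    quality: self-graded recall from 0 (blackout) to 5 (perfect).
    """
    if quality < 3:                       # failed recall: start the card over
        return 1, ease
    if interval_days == 0:                # first successful review
        new_interval = 1
    elif interval_days == 1:              # second successful review
        new_interval = 6
    else:                                 # afterwards, grow multiplicatively
        new_interval = round(interval_days * ease)
    # ease nudged up for easy recalls, down for hard ones (SM-2 update rule)
    new_ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return new_interval, new_ease

interval, ease = 0, 2.5
for review in range(6):                   # consistently "good" recall
    interval, ease = next_review(interval, ease, quality=4)
    print(f"review {review + 1}: next review in {interval} day(s)")
```

With consistently good recall the intervals grow roughly as 1, 6, 15, 38, 95, 238 days, which is the "distribute your time over a longer interval" idea expressed in code.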

The Hidden Value of Ignorance

The chapter talks about the "fluency illusion," the number one reason many students flunk exams. You study formulae, concepts and theories, and you are under the illusion that you know everything—until the day you see the examination paper. One way out of this illusion is to test yourself often. The word "test" connotes different things to different people. For some teachers, it is a way to measure a student's learning; for some students, it is something to be cracked to get through a course. The literature on testing has a completely different perspective: testing is a way of learning. When you take a test, you retrieve concepts from your memory, and the very act of retrieving fundamentally alters the way you store those concepts. Testing yourself IS learning. The chapter cites a study done on students that shows the following results:

image

The above results show that testing does not equal studying. In fact, testing beats studying, and by a country mile on delayed tests. Researchers have come up with a new term to ward off some of the negative connotations of the word "test": they call it "retrieval practice." This is actually the more appropriate term, since testing oneself (answering a quiz, reciting from memory, writing from memory) is essentially a form of retrieval that shapes learning. When we successfully retrieve something from memory, we re-store it in a different way than before. Not only has its storage level spiked; the memory itself has new and different connections, now linked to other related things we have also retrieved. Using our memory changes our memory in ways we don't anticipate. One of the ideas the chapter delves into is administering a sample final exam right at the beginning of the semester. The student will flunk it, of course, but the very fact that he gets to see a set of questions and their pattern before anything is taught makes him a better learner by the end of the semester.

Quitting before you are ahead

The author talks about "percolation," the process of quitting an activity after we have begun it and then revisiting it at frequent intervals. Many writers explicitly describe this process, and you can read their autobiographies for the details. Most of them say something to this effect: "I start on a novel, then take a break and wander around a familiar or unfamiliar environment, and as I do so the characters appear in the real or imaginary world and give me clues to continue the story." This might seem domain-specific: maybe it applies only to writing, where, after all, writing about something is discovering what you think about it, and that takes consciously quitting and then revisiting your work.

The author cites enough stories to show that this kind of percolation can benefit many other tasks. There are three elements to it. The first is interruption. Whenever you begin a project, there will be times when your mind says, "Moron, quit it now, I can't take it anymore." Plodding through that phase is what we have been told leads to success; this chapter suggests another strategy, "quit with the intention of coming back to it." There is always a fear that we will never get back to it, but if it is something you truly care about, you will, at some point. An interesting thing happens when you quit and then want to return to the activity after a break: the second element of percolation kicks in, and your mind becomes tuned to see and observe things related to your work everywhere. Eventually the third element comes into play: listening to all the incoming bits and pieces of information from the environment and revisiting the unfinished project. In essence, working this way means quitting frequently with the intention of returning, which tunes your mind to notice things you have never paid attention to. I have seen this percolation effect in my own learning so many times that I don't need a raft of research to believe it works.

Being Mixed up

The author starts off by mentioning the famous beanbag-tossing experiment of 1978, which showed the benefits of interleaved practice. The study was buried by academics because it went against the conventional wisdom of "practice till you master it." Most psychologists who study learning fall into two categories: those who focus on motor/movement skills and those who focus on language/abstract skills. Studies have also shown that we memorize motor skills and language skills in separate ways; motor memories can be formed without the hippocampus, unlike declarative memories. Only in the 1990s did researchers start to conduct experiments that tested both motor and declarative memories, and after several studies they found that interleaving has a strong effect on any kind of learning. The most surprising thing about interleaving is that participants in these experiments felt that massed practice was somehow better, even though the test scores showed interleaving to be the better alternative. One can easily relate to this feeling. If you spend, say, a day on something and manage to understand a chapter of a book, you might be tempted to read the next chapter, and the next, until the difficulty level reaches a point where you need far more effort to get through the concepts. Many of us are not willing to take a break and revisit the material a week or a month later. Why? Here are a few reasons, based on my experience:

  • I have put so much effort into understanding the material (say, the first 100 pages of a book). This new principle/theorem on the 101st page is tough. If I take a break and come back after a week or so, I might have to review all 100 pages again, which could be a waste of time. Why not somehow keep going and put in the extra effort to understand the stuff on page 101 while the previous 100 pages are still in my working memory?
  • I might never get the time to revisit this paper/book again, and my understanding will remain shallow.
  • Why give up when I seem to be cruising along? This might be a temporary showstopper that I can slog through.
  • By taking a break from the book, am I giving in to my lazy brain, which does not want to work through the difficult part?
  • What is the point of reading something for a couple of hours, then reading something else for a couple of hours? I don't come away with the feeling that I have learnt something.
  • I have already put so many hours into this paper/book. Why not put in some extra hours and read through the entire thing?

The above thoughts, research says, are precisely the ones that hamper effective learning. Interleaving is unsettling, but it is very effective.

Importance of Sleep

We intuitively know that a good sleep or a quick nap brings our energy levels back. But why do humans sleep? One might think that, since this is an activity we have been doing for millennia, neuroscientists, psychologists and other researchers would have figured out the answer by now. No: there is no single agreed-upon scientific explanation for it. Two main theories have been put forth. The first is that sleep is essentially a time-management adaptation. Humans could not hunt or track in the dark; there was not much to do, and the internal body clock evolved to sleep during those hours. The brown bat sleeps 20 hours a day and is awake for the 4 hours around dusk when it can hunt mosquitoes and moths. Many such examples give credence to the theory that we are awake when there's hay to be made and asleep when there is none. The other theory is that sleep's primary purpose is memory consolidation. If we take for granted that evolution has, for whatever reason, made us crave sleep, what happens to the stuff we learn? Does it get consolidated during sleep? The author gives a crash course on the five stages of sleep.

image

The five stages of sleep are illustrated in the figure above. There are bursts of REM (rapid eye movement) sleep in a typical 8-hour sleep period; typically one experiences four to five REM bursts during the night, of about 20 minutes' average duration. With its bursts of REM and intricate, alternating layers of wave patterns, the brain must be up to something during sleep. But what? Over the last two decades massive evidence has accumulated that sleep improves retention and comprehension. Evidence also links Stage II sleep to motor-skill consolidation and the REM phase to the consolidation of other kinds of learning. If you are a musician or artist preparing for tomorrow's performance, it is better to practice late into the night and get up a little late, so that the Stage II phase of sleep is completed. If you are trying to learn something academic, it makes sense to go to bed early and get a full night's sleep, so that the phases that consolidate that kind of learning are not cut short. Similar research has been done on napping, which has also been found to help consolidation. The brain is basically doing the job of separating signal from noise.

The Foraging brain

If effective learning is such a basic prerequisite for survival in today's world, why haven't people figured out how to do it efficiently? There is no simple answer. The author's response is that our ideas about learning are at odds with the way our brain has been shaped over the millennia. Humans were foragers; hunting and tracking dominated human life for over a million years. The brain adapted to absorb, at maximum efficiency, the most valuable cues and survival lessons. The human brain, too, became a forager—for information, for strategies, for clever ways to foil other species' defenses. Yet our language, customs and schedules have come to define how we think the brain should work: be organized, develop consistent routines, concentrate on work, focus on one skill. All this sounds fine until we start applying it in our daily lives. Do these strategies actually make us effective learners?

We know intuitively that concentrating on something beyond a certain time is counterproductive, that massed practice does not lead to longer retention, and that it is difficult to be organized when there are so many distractions. Instead of adapting our learning schedules to the foraging brain, we have been trying to adapt our foraging brains (something that evolved over millennia) to our customs, schedules and notions about learning (something that took shape over just a few thousand years). The author says this is the crux of the problem, and it is what has kept us from becoming effective learners. The foraging brain that once brought us back to our campsite is the same one we use to make sense of academic and motor domains. Most often, when we do not understand something, the first instinct is to give up. But this feeling of being lost is essential: it drives the foraging brain to look for patterns and to create new pathways that make sense of the material. This reinforces many of the themes touched upon in this book:

  • If you do not forget and you are not lost, you do not learn.
  • If you do not space out learning, you do not get lost from time to time and hence you do not learn.
  • If you do not use different contexts/physical environments to learn, your brain has fewer cues to help you make sense of learning.
  • If you do not repeatedly test yourself, the brain doesn’t get feedback and the internal GPS becomes rusty

It is high time we adapted our notions of learning to the foraging brain; otherwise we will forever be trying to do something our brains resist.

Takeaway:

There are some counterintuitive strategies for learning mentioned in this book—changing the physical environment of your study, spaced repetition, testing as a learning strategy, interleaving, quitting and revisiting a project frequently, welcoming distractions into your study sessions, and so on. Most of these differ from the standard suggestions on "how to learn." The book collates the evidence from the research literature and argues that these strategies are far more effective for learning than the conventional advice.

image

In a world where uncertainty is the norm, being curious is one way to hedge the volatility in our professional and personal lives. By developing and maintaining a state of curiosity in whatever we do, we have a good chance of leading a productive life. The author of this book, Ian Leslie, is a journalist, so it should not come as a surprise that the book's content essentially annotates a set of articles and books on curiosity. The book is a little longer than a blog post or newspaper article and falls short of a well-researched book.

We all intuitively know that real learning comes from being curious. Does one need to read a book about it? Not really—provided you understand that curiosity is vulnerable to benign neglect, and that you truly understand what feeds it and what starves it. Unless we are consciously aware of it, our mind might drift toward comfort with the status quo. The better we can identify the factors that keep us in a curious state, the better we are at staying in one, or at least at making an effort to get into it. This book gives visuals, metaphors and examples that convey what others have said, written and experienced about curiosity.

First, a few terms about curiosity itself. Broadly, there are two kinds. The first is diversive curiosity, a restless desire for the new and the next. The other is epistemic curiosity, a more disciplined and effortful inquiry, the "keep traveling even when the road is bumpy" kind. The Googles, wikis and MOOCs of the world whet our diversive curiosity, but that alone is not enough. From time to time we need to get down and immerse ourselves to gain a deeper understanding of the things around us. If we are forever in the state of diversive curiosity, our capacity for the slow, difficult and frustrating process of gathering knowledge, i.e. epistemic curiosity, may be deteriorating.

The author uses Isaiah Berlin's metaphor of the hedgehog vs. the fox and says that we must be "foxhogs." Foxhogs combine the traits of a hedgehog (who has substantial expertise in something) and a fox (who is aware of what is happening in a ton of other areas). Curious learners go deep and they go wide. Here is a nice visual that captures the traits of a foxhog:

image

Typically, they say a startup with two or three founders is ideal. I guess the reason might be that at least the team as a whole satisfies the foxhog criterion. Wozniak was a hedgehog and Steve Jobs a fox, and their combination catapulted Apple from a garage startup to what it is today. Alexander Arguelles ("I can speak 50 languages") is another foxhog. Charles Darwin, Charlie Munger and Nate Silver are all foxhogs who developed T-shaped skill sets.

Tracing the history of curiosity, the author says that, irrespective of the period you analyze, there has always been a tension between diversive and epistemic curiosity. In today's digital world too, with the onslaught of social media and ever-multiplying attention-seeking tools, how does one draw the line between the two appetites? One consequence of knowledge being available at a mouse click is that it robs you of the "desirable difficulty" that is essential for learning. "Slow to learn and slow to forget" is becoming difficult, as the Internet provides instant solutions and turns learning into an "easy to learn and easy to forget" activity. Google gives answers to anything you ask, but it won't tell you what questions to ask. Widening access to information does not mean curiosity levels have increased; the ratchet effect described below is an example of this phenomenon.

Via Omniscience bias:

James Evans, a sociologist at the University of Chicago, assembled a database of 34 million scholarly articles published between 1945 and 2005. He analysed the citations included in the articles to see if patterns of research have changed as journals shifted from print to online. His working assumption was that he would find a more diverse set of citations, as scholars used the web to broaden the scope of their research. Instead, he found that as journals moved online, scholars actually cited fewer articles than they had before. A broadening of available information had led to “a narrowing of science and scholarship”. Explaining his finding, Evans noted that Google has a ratchet effect, making popular articles even more popular, thus quickly establishing and reinforcing a consensus about what’s important and what isn’t. Furthermore, the efficiency of hyperlinks means researchers bypass many of the “marginally related articles” print researchers would routinely stumble upon as they flipped the pages of a printed journal or book. Online research is faster and more predictable than library research, but precisely because of this it can have the effect of shrinking the scope of investigation.

The book makes a strong argument against the ideas propagated by people like Ken Robinson and Sugata Mitra, who claim that knowledge is obsolete, that self-directed learning is the only way to educate a child, that memorization should be banished from the syllabus, and so on. In the late nineteenth and twentieth centuries, a series of thinkers and educators founded "progressive" schools whose core principle was that teachers must not get in the way of a child's innate love of discovery. Are these claims based on evidence? The author cites a good deal of empirical research and dispels each of the following myths:

  • Myth 1: Children don't need teachers to instruct them
  • Myth 2: Facts kill creativity
  • Myth 3: Schools should teach thinking skills instead of knowledge

The last part of the chapter gives a few suggestions that help the reader stay in a curious state. Most of them are fairly obvious, but the anecdotes and stories that go along with them help one be more cognizant of them. Here are two examples from one of the sections:

image 
International Boring Conference

The Boring Conference is a one-day celebration of the mundane, the ordinary, the obvious and the overlooked – subjects often considered trivial and pointless, but which, when examined more closely, reveal themselves to be deeply fascinating. How often do we pause and look at mundane stuff?

image 
Georges Perec (Question your tea-spoons)

Perec urges the reader to pay attention not only to the extraordinary but to — as he terms it — the infraordinary, or what happens when nothing happens. We must learn to pay attention to “the daily,” “the habitual”: What we need to question is bricks, concrete, glass, our table manners, our utensils, our tools, the way we spend our time, our rhythms. To question that which seems to have ceased forever to astonish us. We live, true, we breathe, true; we walk, we open doors, we go down staircases, we sit at a table in order to eat, we lie down on a bed to go to sleep. How? Where? When? Why? Describe your street. Describe another street. Compare. Make an inventory of your pockets, of your bag. Ask yourself about the provenance, the use, what will become of each of the objects you take out. Question your tea-spoons.

The book ends with a quote by T.H. White:

“The best thing for being sad," replied Merlin, beginning to puff and blow, "is to learn something. That’s the only thing that never fails. You may grow old and trembling in your anatomies, you may lie awake at night listening to the disorder of your veins, you may miss your only love, you may see the world about you devastated by evil lunatics, or know your honour trampled in the sewers of baser minds. There is only one thing for it then — to learn. Learn why the world wags and what wags it. That is the only thing which the mind can never exhaust, never alienate, never be tortured by, never fear or distrust, and never dream of regretting. Learning is the only thing for you. Look what a lot of things there are to learn.”

image

Every year at least a dozen pop math/stat books get published. Most of them try to illustrate a variety of mathematical and statistical principles using analogies, anecdotes and stories that are easy to understand. It is safe to assume that the authors of these books spend a considerable amount of time thinking about apt analogies, ones that are not too taxing on the reader but still put across the key idea. I tend to read at least one pop math/stat book a year to whet my "analogy appetite." It is one thing to write an equation for some principle and a completely different thing to be able to explain the concept to somebody; books such as these help build one's analogy database so that one starts seeing far more things from a math perspective. The author of this book, Jordan Ellenberg, is a math professor at the University of Wisconsin-Madison and writes a math column for "Slate." The book runs to about 450 pages and gives a ton of analogies. In this post, I will try to list the analogies and some of the points made in the context of the mathematical principles illustrated in the book.

  • Survivorship bias
    • Abraham Wald’s logic of placing armor on engines that had no bullet holes
    • Mutual funds performance over a long period
    • Baltimore stockbroker parable
  • Linearity Vs. Nonlinear behavior
    • Laffer curve
  • Notion of limits in Calculus
    • Zeno’s Paradox
    • Augustin-Louis Cauchy and his work on summing infinite series
  • Regression
    • Will all Americans become obese? The dangers of extrapolation
    • Galton vs. Secrist – both observed "regression towards mediocrity" in the data, but they explained it differently. Secrist remained in the dark and thought the regression he had painstakingly documented was a new law of business physics, something that would bring more certainty and rigor to the scientific study of commerce. It was just the opposite. Galton, being a mathematician, rightly showed that in the presence of a random component, regression towards the mean is a necessary fact. Wherever there is random fluctuation, one observes regression towards the mean, be it in mutual funds, the performance of sportsmen, or mood swings (a short simulation below illustrates this).
    • Correlation is non-transitive. Karl Pearson's geometric view of correlation makes this easy to prove.
    • Berkson's fallacy – Why are handsome men jerks? Why are popular novels terrible?

image
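Here is the short simulation promised above (my own illustration, not from the book): performance is modeled as fixed skill plus luck, and the firms that look best in period 1 are, on average, closer to the pack in period 2, with no "law of business physics" required.

```python
import random

# Performance = fixed skill + luck. Whoever tops period 1 tends to have been
# lucky, so their period-2 average falls back toward the mean.
random.seed(0)
skill   = [random.gauss(0, 1) for _ in range(10_000)]
period1 = [s + random.gauss(0, 1) for s in skill]
period2 = [s + random.gauss(0, 1) for s in skill]   # fresh, independent luck

top = sorted(range(len(skill)), key=lambda i: period1[i], reverse=True)[:500]
avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
print(f"top 5% of period 1: averaged {avg1:.2f}, then {avg2:.2f} next period")
```

The drop is pure statistics: the selected firms' luck does not carry over, only their skill does.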

  • Law of Large numbers
    • Small school vs. Large school performance comparison
  • Partially ordered sets
    • Comparing disasters in human history
  • Hypothesis testing + “P value” + Type I error ( seeing a pattern where there is none) + Type II error(missing a pattern when there is one)
    • Experimental data from a dead-fish fMRI measurement: the dead fish appears to correctly assess the emotions displayed by the people in the pictures. An insane conclusion that nonetheless passes a standard significance test
    • The Torah dataset (a 304,805-letter document) used by a group of researchers to find hidden meanings beneath the stories, genealogies and admonitions; the dangers of data mining
    • Underpowered test: using binoculars to detect moons around Mars
    • Overpowered test: If you study a large sample size, you are bound to reject null as your dataset will enable you to see ever-smaller effects. Just because you can detect them doesn’t mean they matter.
    • "Hot hand" in basketball: if you want to detect the effect statistically, the question you ask matters. The wrong question is "Do basketball players sometimes temporarily get better or worse at making shots?" – the kind of yes/no question a significance test addresses; {null: no hot hand, alternative: hot hand} is an underpowered test. The right question is "How much does their ability vary with time, and to what extent can observers detect in real time whether a player is hot?" That is a tough question.
    • Skinner rejected the hypothesis that Shakespeare did not alliterate!
    • Null Hypothesis Significance Testing (NHST) is a fuzzy version of "proof by contradiction"
    • Testing whether a set of stars in one corner of a constellation (Taurus) is grouped together by chance?
    • A parable by Cosma Shalizi: examining the livers of sheep to predict future events. A very funny way to describe what is going on with the published papers in many journals
    • John Ioannidis's paper "Why Most Published Research Findings Are False"
    • Tests of genetic association with disease – awash with false positives
    • Example of a low-powered study: a paper in Psychological Science (a premier journal) concluded that "married women were more likely to support Mitt Romney when they were in the fertile portion of their ovulatory cycle"!
    • A low-powered study is only going to be able to see a pretty big effect. But sometimes you know that the effect, if it exists, is small. In other words, a study that accurately measures the effect of a gene is likely to be rejected as statistically insignificant, while any result that passes the p-value test is either a false positive or a true positive that massively overstates the effect
    • Uri Simonsohn, a professor at Penn, brilliantly summarizes the problem of replicability as "p-hacking" (somehow getting results to the 0.05 level that enables one to publish papers)

image

    • In 2013, the Association for Psychological Science announced that it would start publishing a new genre of article, the Registered Replication Report. These reports, aimed at reproducing the effects reported in widely cited studies, are treated differently from usual papers in a crucial way: the proposed experiment is accepted for publication before the study is carried out. If the outcomes support the initial finding, great news; if not, they are published anyway, so that the whole community can know the full state of the evidence.
  • Utility of Randomness in math
    • The "bounded gaps" conjecture: is there a bound such that infinitely many pairs of consecutive primes differ by less than it? Primes get rarer and rarer as we chug along the integer axis, so what keeps the gap bounded infinitely often?
    • How many twin primes are there in the first N numbers (Among first N numbers, about N/log N are prime)?
    • Mysteries of prime numbers need new mathematical ideas that structure the concept of structurelessness itself
  • How to explain a logarithm to a kid? The logarithm (base 10) of a positive integer can be thought of as roughly the number of digits in that integer; more precisely, the digit count of n is ⌊log₁₀ n⌋ + 1.
  • Forecast performance
    • Short-term weather forecasts have become possible, given the explosion of computing power and big data; however, any forecast beyond two weeks is dicey. On the other hand, for some problems more data and computing power do yield highly accurate forecasts, such as predicting the course of an asteroid. Whatever domain you work in, you need to consider where it lies between these two examples: one where big data plus computing power helps, and one where no amount of data and computing power will get you a meaningful forecast beyond the short term.
  • Recommendation Algorithms
    • After decades of being fed with browsing data, recommendations for almost all the popular sites suck
    • The Netflix Prize, an example used by many modern Machine Learning 101 courses. It took three years of community hacking to improve the recommendation algorithm, and sadly the winning algorithm was never put to use by Netflix: the world moved on in those three years, Netflix switched to streaming movies online, and dud recommendations became less of a big deal.
  • Bayes theorem
    • Which Facebook users are likely to be involved in terrorist activities? Suppose Facebook assigns each of its users a probability of being associated with terrorist activities. The following two questions have vastly different answers, so you need to be careful about which one you are asking (a small worked example follows after this list).
      1. What is the chance that a person gets put on Facebook's list, given that they are not a terrorist?
      2. What is the chance that a person is not a terrorist, given that they are on Facebook's list?
    • Why one must go Bayes: P(Data | Null) is what a frequentist answers; P(Null | Data) is what a Bayesian answers
    • Are Roulette wheels biased? Use priors and experimental data to verify the same
  • Expected Value
    • Lottery ticket pricing
    • Cash WinFall: how a few groups hijacked the Massachusetts State Lottery. Link: a Boston Globe story that explains why it effectively turned into a private lottery.
    • Use the additivity law of expectation to solve Buffon’s Needle problem
  • Utility curve
    • If you miss your flight, how to quantify your annoyance level?
    • Utility of dollars earned for guy moonlighting is different from that of a tenured professor
    • St Petersburg paradox
  • Error-correcting codes, Hamming codes, Hamming distance, Shannon's work:
    • Reducing the variance of outcomes in the Cash WinFall lottery: choosing combinations of numbers with low variance is a computationally expensive problem if brute force is used. Information theory and projective geometry could be the basis on which the successful MIT group generated ticket combinations with lower variance while betting.
    • Bertillon's card system for identifying criminals, and Galton's idea that the redundancy in the cards could be quantified, were formalized by Shannon, who showed that correlation between variables reduces the informativeness of a card
  • Condorcet Paradox
    • Deciding a three-way election is riddled with issues. There is no such thing as "the public's response"; the electoral process defines the public response and makes peace with the many paradoxes inherent in determining it.
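As a worked illustration of how different the answers to the two Facebook questions above can be (all numbers are my own and entirely hypothetical, not from the book), apply Bayes' theorem with a tiny base rate of actual terrorists and a classifier that flags 99% of terrorists and 1% of everyone else:

```python
# Hypothetical numbers, purely for illustration of Bayes' theorem.
base_rate  = 1 / 1_000_000   # P(terrorist)
p_flag_t   = 0.99            # P(flagged | terrorist)
p_flag_not = 0.01            # P(flagged | not terrorist)   <- question 1

# P(not terrorist | flagged)                                 <- question 2
p_flagged = p_flag_t * base_rate + p_flag_not * (1 - base_rate)
p_innocent_given_flag = p_flag_not * (1 - base_rate) / p_flagged

print(f"P(flagged | not a terrorist) = {p_flag_not:.2%}")            # ~1%
print(f"P(not a terrorist | flagged) = {p_innocent_given_flag:.2%}") # ~99.99%
```

The flagging error rate (question 1) is only 1%, yet virtually everyone on the list is innocent (question 2); the tiny base rate does all the work.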

Quotes from the book:

  • Knowing mathematics is like wearing a pair of X-ray specs that reveal hidden structures underneath the messy and chaotic surface of the world
  • Mathematics is the extension of common sense. Without the rigorous structure that math provides, common sense can lead you astray. Formal mathematics without common sense would turn math computations into a sterile exercise.
  • It is pretty hard to understand mathematics without doing mathematics. There is no royal road to any field of math. Getting your hands dirty is a prerequisite
  • People who go into mathematics for fame and glory don’t stay in mathematics for long
  • Just because we can assign whatever meaning we like to a string of mathematical symbols doesn’t mean we should. In math, as in life, there are good choices and there are bad ones. In the mathematical context, the good choices are the ones that settle unnecessary perplexities without creating new ones
  • We have to teach a math that values precise answers but also intelligent approximation, that demands the ability to deploy existing algorithms fluently but also the horse sense to work things out on the fly, and that mixes rigidity with a sense of play. If we don't teach it that way, we are not teaching mathematics at all.
  • Fields Medalist David Mumford: dispense with plane geometry entirely in the syllabus and replace it with a first course in programming.
  • "Statistically noticeable" / "statistically detectable" is a better term than "statistically significant." This should be the first statement drilled into any newbie taking a Stats 101 course.
  • If gambling is exciting, you are doing it wrong – A powerful maxim applicable for people looking for investment opportunities too. Hot stocks provide excitement and most of the times that is all they do.
  • It is tempting to think of “very improbable” as meaning “essentially impossible”. Sadly NHST makes us infer based on “very improbable observation”. One good reason why Bayes is priceless in this aspect
  • One of the most painful aspects of teaching mathematics is seeing my students damaged by the cult of the genius. That cult tells students that it’s not worth doing math unless you’re the best at math—because those special few are the only ones whose contributions really count. We don’t treat any other subject that way. I’ve never heard a student say, "I like ‘Hamlet,’ but I don’t really belong in AP English—that child who sits in the front row knows half the plays by heart, and he started reading Shakespeare when he was 7!" Basketball players don’t quit just because one of their teammates outshines them. But I see promising young mathematicians quit every year because someone in their range of vision is "ahead" of them. And losing mathematicians isn’t the only problem. We need more math majors who don’t become mathematicians—more math-major doctors, more math-major high-school teachers, more math-major CEOs, more math-major senators. But we won’t get there until we dump the stereotype that math is worthwhile only for child geniuses

The book ends with a quote from Samuel Beckett

image