The Art of Choosing

27 02 2016

The Art of Choosing by Sheena Iyengar exceeded all my expectations. I’m very impressed by this work. I was surprised to learn that the author is blind; at no point in the book would you get any idea of such a limitation on the author’s part. The book is the result of decades of her research in the field of choice, and it also assimilates the findings of numerous studies by many other researchers.

The book talked about how important choice is to one’s well-being, as well as how and in what circumstances it can be overwhelming and/or harmful. It laid out how choice shapes our lives. It is interesting to note that in many cases, a belief that we have a choice is more important than whether we actually have the choice or not. A large part of the choosing process can be about making a personal statement – establishing or reaffirming a self-image.

The book also touched upon cultural differences in how choice is perceived in different aspects of life. Asian cultures do not see many aspects of life, even ones as important as marriage or profession, as personal choices, and let the family and/or community make those choices for them. In contrast, Western cultures are characterized by individual freedom of choice. This puts the onus on the individual to make each and every choice that comes his or her way, and sometimes it can be quite self-defeating.

I expected the book to rant about consumerism and the profusion of choice we currently face as part of every buying experience. The book definitely touched on this aspect, but covered a lot more. Sheena Iyengar presented choice as a philosophy of life and dissected it from every angle worth considering. It gave a well-rounded perspective on choice and the role it plays in our lives. This book is a treasure as far as I’m concerned.

I lamented before on this blog about the analysis paralysis I frequently encounter while trying to choose among the plethora of options available out there, and I hoped that this book about choice would address that predicament and offer some tips. I was pleased to find that it did. They include:

  • Gain expertise in the field to choose better
  • Defer to experts’ recommendations or crowd wisdom in areas where you don’t have expertise
  • Consult experts when you are too emotionally tied to a situation to make sound judgment
  • Use programs like SMarT, StickK, or the SnuzNLuz alarm clock to make beneficial choices

I would add that going with a satisficing rather than a maximizing strategy can also combat analysis paralysis. But of course, this can be easier said than done, especially when choice is much more than what it appears to be at first glance.
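To make the contrast concrete, here is a minimal Python sketch of the two strategies; the score function, threshold, and laptop ratings are hypothetical illustrations of my own, not anything from the book. The maximizer scores every option before choosing, while the satisficer stops at the first option that clears a “good enough” bar.

```python
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")

def maximize(options: Iterable[T], score: Callable[[T], float]) -> Optional[T]:
    """Score every option and return the best one (invites analysis paralysis)."""
    best, best_score = None, float("-inf")
    for option in options:
        s = score(option)
        if s > best_score:
            best, best_score = option, s
    return best

def satisfice(options: Iterable[T], score: Callable[[T], float],
              threshold: float) -> Optional[T]:
    """Return the first option that is 'good enough'; fall back to the best
    seen so far only if nothing clears the threshold."""
    best, best_score = None, float("-inf")
    for option in options:
        s = score(option)
        if s >= threshold:
            return option  # stop searching as soon as one option is acceptable
        if s > best_score:
            best, best_score = option, s
    return best

# Hypothetical usage: laptops rated on a 0-10 scale.
laptops = {"A": 6.5, "B": 8.1, "C": 9.0, "D": 7.2}
print(maximize(laptops, laptops.get))        # "C": the best, but every option was scanned
print(satisfice(laptops, laptops.get, 8.0))  # "B": good enough, search stopped early
```

The satisficer trades a possibly better choice for a much shorter search, which is exactly the bargain that helps against analysis paralysis.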

The author concludes that despite all the assistance of science, choice remains an art at its core.





Intuition and decision making – Part 2

19 11 2014

By definition, intuition, like our logical processing, is based on past knowledge and experiences (whether we know it or not). So, it may not give you the right answer in a new situation. It is at best a guide: it can be right in some circumstances and wrong in others. David Myers, a social psychologist, in his brilliant book Intuition: Its Powers and Perils, suggests the contexts in which our intuition serves us well and those in which it doesn’t. We can trust our intuition when:

  • harnessing the automaticity of everyday life – our implicit learning, memory etc.
  • we have experience based expertise
  • we are reading emotions from others’ faces
  • after letting our distracted or sleeping unconscious mind work on a decision task

We should not trust our intuition when:

  • Buying a lottery ticket
  • Picking stocks to buy
  • Predicting athletic performance from who is currently “hot”
  • Predicting job performance from a casual interview
  • Judging who is lying vs. truth-telling

Sure, it gave me a lot of clarity. Here is a video (1 hr) by David Myers on the topic, in which he gives the gist of his book. In the video, he also explains why we intuitively fear the wrong things. He says we fear:

  • What our ancestral history prepared us to fear
  • What we cannot control
  • What’s immediate
  • What’s most readily available in memory (availability heuristic)

Some of the biases that we need to be wary of are:

  • Self-fulfilling prophecy – we tend to behave in ways that make our predictions about something come true.
  • Confirmation bias – we tend to look only at the evidence that confirms our beliefs.
  • Overconfidence bias – we are generally overconfident about our predictions and estimations.
  • Affective forecasting – we tend to overestimate how happy (or unhappy) a future event will make us.

Obviously, this short list is in no way exhaustive. Sometimes, I feel so overwhelmed by all these invisible forces acting against me. 😛 Given how much reliance people usually place on their intuition, even while making critical decisions, I think we need more convincing about its pitfalls, or rather perils, than about its powers. 🙂 Nonetheless, in view of its value, it helps to become more intuitive. So, how can we improve our intuition?

It’s pleasantly surprising and heartening to know that New Zealand would like its kids to “reflect on their own learning, draw on personal knowledge and intuitions, ask questions, and challenge the basis of assumptions and perceptions.” Jamie McKenzie, the editor of an educational technology journal, provides a list of steps that can help students make use of their intuitions:

  1. Clarifying, Demystifying and Defining
  2. Enhancing Awareness and the Ability to Read Intuitions
    1. Meeting new people
    2. Predicting the next move
    3. Sizing up a situation
  3. Testing and Balancing Intuitions against Other Thinking

Read his excellent article here.

Practicing meditation and being mindful is an excellent way to improve our intuition. Mindfulness enables us to be in the present, making it more feasible for us to pick up the subtle cues and information around us. Meditation helps us calm down and quiet the noise inside our heads, which makes it easier to listen to our inner self. Being more observant of other people and of our surroundings also helps a lot.

This brings me to my next question: why are some people more intuitive than others? Is it a predisposition, a natural inclination? Or is it the environment? Like everything else about us, it is a combination of both. Genetics partly shape our ability, but a significant portion of it depends on the kind of environment we are exposed to. Also, as we grow comfortable with a particular way of thinking, we tend to reinforce that behavior by repeatedly preferring it over the other, resulting in a positive feedback loop.

Popular psychology claims that there is a dominant side of your brain: are you left-brained or right-brained? Intuition is commonly associated with the right brain, and logical thinking and analysis with the left brain. But recent research indicates that this dominance is a myth and that both halves in fact need to work together to solve anything. Nevertheless, we see that some people approach things more in the “right” way and others more in the “left” way. This article points out that there are biases inherent in both approaches. Intuition-dominant biases:

  • Overlooking crucial details
  • Expecting solutions to sound a certain way
  • Not recognizing precise language
  • Believing their level of understanding is deeper than it is

Logical-dominant biases:

  • Ignoring information they cannot immediately fit into a framework
  • Ignoring their emotions
  • Making rules too strict

People need to use the appropriate approach based on the problem or task at hand. For example, learning a mathematical or scientific concept is more effective when approached analytically, while understanding emotions calls for a more intuitive approach. It is quite possible that you have no intuition in a certain situation; then, all you have to depend on is concrete data. On the other hand, it is also possible that you don’t have enough data to make a deliberate decision, or that you are too pressed for time to reason through one logically. In those cases, you may have to act on your intuition.

I will end this post with an interesting question: are women more intuitive than men? If so, why? If not, why is the myth so prevalent? I’ll explore this in a later post. 🙂

Part 1

Note: This is a continuation of an earlier post on Effective decision-making.





Intuition and decision making – Part 1

18 11 2014

We all have intuition, whether we acknowledge it or not. It’s not uncommon for people to decide something based on their gut feeling, instinct, or intuition.

The questions that immediately sprang up in my mind include:

  • Does it always serve us well?
  • How reliable is it?
  • When should we trust it and when should we question it?
  • Why are some people more intuitive than others?

Before trying to answer these questions, let’s start by looking at what intuition actually is. The dictionary says it is “the ability to understand something immediately, without the need for conscious reasoning.” It’s what is referred to as System 1 thinking in Daniel Kahneman’s latest best seller, Thinking, Fast and Slow. It is typically fast, automatic, effortless, implicit, and emotional.

It also serves the current discussion to differentiate intuition from “instinct”. Instinct is an instantaneous reaction to a physical environment or situation without any thought put into it. All animals have instinct – it is essential for survival. (Intuition vs Instinct)

It’s interesting to note that the past 50 years of scientific research challenges the trust people put in their intuition. It suggests that the more deliberative, slower, conscious, effortful, explicit, and logical System 2 thinking works in favor of the decision maker. The problem with intuition, or System 1 thinking, is that it is influenced by several psychological biases and heuristics, which can lead to incorrect decisions.

But the research results fail to provide conclusive proof of the positive effect of the various strategies for reducing decision biases and encouraging System 2 thinking. The most important prerequisite for these strategies to work is that decision makers are aware of their biases and willing to address them to improve their decision-making. For biases that people don’t like to acknowledge, changing the environment in which the decision is made can be a plausible approach; for example, the status quo bias can be addressed by making the desired option the default (Milkman, Chugh, Bazerman, 2008).

When Malcolm Gladwell brought into the limelight the fabulous “power of thinking without thinking”, by which we make brilliant decisions in the Blink of an eye, I believe many people, including myself, were dazzled. I feel the book is written powerfully and extremely engagingly, with the purpose of eliciting exactly that response. Sure, he touches upon some of the prejudices and biases that can influence our intuition and result in misjudgments, but the book fails to clarify when intuition is right and when it is wrong. In short, this bestselling book doesn’t offer a complete picture and failed to answer my questions.

Personally, I’m a skeptic with a scientific mind. It’s easy for me to dismiss anything that doesn’t sound logical to me. But of course, it would be a mistake to do so without investigating or giving it the benefit of the doubt, wouldn’t it? I believe that despite our strong convictions (or rather because of them), we need to be open to any claim or any new information, and be willing to investigate it to determine its validity. Nevertheless, I’m highly dubious of ideas like “intuitive healing”, which place too much reliance on intuition and make it sound more like magic. 🙂

I think my intuition is not very strong. Actually, I never really thought about it. I’m sure I get certain messages from my intuition, but I guess I’m usually not attentive to them and miss their significance. However, I have come to realize that this is a mistake, because our intuition is a valuable resource without which our logical analysis is incomplete.

This makes sense because there is no magic about intuition. It is in fact the result of years of learning, experience, and expertise. We all know that as we gain expertise in something, it becomes automatic (driving, for example). Over time, we no longer consciously exercise our logical mind to do the task but perform it automatically. Intuition is based on a lot of cues and subtle information that our subconscious picks up and processes so fast that our conscious mind has no idea anything happened at all. As such, our stereotypes, prejudices, and other biases, which are so ingrained in our psyche, manifest themselves in intuition. So, we must always take it with a grain of salt.

Even Hercule Poirot, the master detective who relies on method and intelligence, proclaims, “Never ignore your intuition”. 😛 So, the best way to go about it is, as Robert Heller puts it, “Never ignore a gut feeling, but never believe that it’s enough”. We should never dismiss intuition right away, because it may be taking into account some important information that our conscious mind is not able to pick up.

Given the nature of intuition – that it is automatic, arises without any conscious reasoning process, and actually comes from our past experiences or other subconscious knowledge – we need to take it into account. But test it against data. If you don’t have enough data to dismiss it, gather more data.

Note: This is a continuation of an earlier post on Effective decision-making.





Effective decision-making

2 11 2014

I went to a discussion meet-up last night. The topic was ‘Effective decision-making’. It was an informal session trying to tease out the different aspects of decision-making. We all make decisions, big and small, all the time, so it goes without saying that it serves us well for them to be effective. The first question is how we define “effective decision-making”: is it about the process, the outcome, or some combination of the two? According to the management guru Peter F. Drucker, an effective decision-making process must go through these steps:

  1. The classification of the problem
  2. The definition of the problem
  3. The specifications which the solution to the problem must satisfy (the “boundary conditions”).
  4. The decision as to what is “right”, rather than what is acceptable, in order to meet the boundary conditions
  5. The building into the decision of the action to carry it out.
  6. The feedback which tests the validity of the decision against the actual course of events.

Unless these elements are the stepping stones of the decision process, the executive will not arrive at a right, and certainly not at an effective, decision.

So, this tells us that “effective decision-making” is all about the process. But decisions are usually interpreted as good or bad only in hindsight, once we know the actual outcome, and I believe that’s not the right way to judge a decision, because while we are making decisions there is risk and uncertainty in play. Risk is known, and if we don’t factor it into our decision-making process, then we are not being effective. Uncertainty, on the other hand, relates to the unknown, and we have no idea about it. (Who would have predicted something like 9/11?) The Strategic Decisions Group of Stanford says that “making good decisions in the face of uncertainty requires understanding the difference between decisions and outcomes.” They say that decision quality integrates the art and science of decision making along six elements:

  1. Appropriate frame
  2. Creative, doable alternatives
  3. Meaningful reliable information
  4. Clear values and tradeoffs
  5. Logically correct reasoning
  6. Commitment to action

There are pitfalls at each step that we need to first be aware of and then avoid. The framework suggests that we should try to attain 100% quality on each of these elements, 100% being the point where additional effort will not improve the decision. But in most cases, when we are judging others’ decision-making, the only thing we have access to is the outcome.

It was pointed out in the discussion that in the political arena, all that citizens know is how the decisions made by the President or Prime Minister turned out, i.e., they judge in hindsight. Is that how it’s supposed to be? Logically, no. But since they have access only to the outcome, people vote based on the outcome.

Coming to the process, there can be several ways to go about deciding something: rational, spontaneous, avoidant, participatory/dependent, intuitive. How do we determine which process is best? Should it be dictated by the type of problem or situation, or just by a preference in decision-making style? Every one of us has a preferred or dominant style of decision-making, which is part of our personality. Scott & Bruce’s (1995) General Decision Making Style questionnaire can be used to assess how people approach decision situations.

Many questions sprang up during the discussion. Most important of them being:

  • If your data says one thing but your gut pulls you towards something else, how should we decide?
  • How reliable is “gut feeling/instinct”?
  • How much data is enough to make a data-driven decision? How do we handle analysis-paralysis?
  • How do we arrive at the list of criteria to evaluate our alternatives? How do we prioritize them?

As for me, I make (or at least try to make) rational decisions: I list out my alternatives, come up with criteria to evaluate them, prioritize the criteria, and make a choice based on all the pros and cons of the alternatives. Sure, it’s taxing. It’s a lot of effort. Will it always produce good results? I’m afraid not. The challenge for me personally is in evaluating the alternatives: what are the criteria to consider, and what weightage do I need to give to each of them? In hindsight, I see that failing to factor in some important or relevant criteria has resulted in bad outcomes. How do I address this major pitfall?
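As an illustration of the weighted pros-and-cons exercise described above, here is a small Python sketch of the scoring step. The criteria, weights, and ratings are made-up placeholders purely for demonstration, not a prescription.

```python
# A minimal weighted-scoring (decision-matrix) sketch.
# Criteria weights reflect priorities and should sum to 1; ratings are on a 1-10 scale.
criteria_weights = {
    "price": 0.40,
    "reliability": 0.35,
    "features": 0.25,
}

# Each alternative is rated against every criterion (hypothetical numbers).
alternatives = {
    "Option A": {"price": 8, "reliability": 6, "features": 7},
    "Option B": {"price": 5, "reliability": 9, "features": 8},
    "Option C": {"price": 7, "reliability": 7, "features": 6},
}

def weighted_score(ratings: dict, weights: dict) -> float:
    """Sum of rating * weight over all criteria."""
    return sum(ratings[criterion] * weight for criterion, weight in weights.items())

scores = {name: weighted_score(ratings, criteria_weights)
          for name, ratings in alternatives.items()}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
print("Choose:", max(scores, key=scores.get))
```

The sketch also makes the pitfall I mention visible: if a relevant criterion never makes it into criteria_weights, the arithmetic stays tidy but the decision remains blind to it.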

One interesting question that was posed is: what is the optimal number of criteria we need to consider? Given that we may have tens of requirements and that it may not be economically or otherwise feasible to consider all of them, we may have to prioritize and limit them to a handful so that we can arrive at a decision in a timely and effective manner. But is there a one-size-fits-all magic number?

One of the major takeaways for me from the discussion is the “cost-benefit analysis” of the decision itself. With the data and criteria at our disposal, we have to make sure that the cost of making a decision does not outweigh the benefits that can be accrued from effective decision-making. In some cases, the quality of the decision may not affect us in any big way, and we can live with sub-optimal choices. We are better off focusing our time and energy only on those decisions which are crucial and warrant extensive analysis. This is a very important aspect to think about and is one way to deal with analysis paralysis.
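As a toy illustration of that screen, with all figures hypothetical and expressed in the same unit purely so the arithmetic works, the check boils down to comparing the expected benefit of careful analysis against its cost:

```python
# Toy cost-benefit screen: is this decision worth extensive analysis?
# All figures are hypothetical and expressed in the same unit (say, dollars).
value_at_stake = 10_000        # roughly how much a poor choice could cost
expected_improvement = 0.05    # guessed fraction of that value careful analysis recovers
cost_of_analysis = 800         # time and effort of the analysis, priced in the same unit

expected_benefit = value_at_stake * expected_improvement  # 500
if expected_benefit > cost_of_analysis:
    print("Worth the extensive analysis")
else:
    print("Live with a quick, satisficing choice")  # 500 < 800, so this branch runs
```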

It was indeed an exuberant and stimulating discussion. But as you can see, no concrete conclusions were arrived at, and there are more questions than answers. There are a lot of interesting and intriguing aspects that I would like to explore further and “try to” come up with some perspective on, which I hope will be objective.





Forewarned is forearmed

23 09 2014

Why would we want to understand our everyday thinking in the first place?

Because, we would like to improve it.

But why would we like to improve it?

Because it affects our day-to-day decision-making and we want to make better decisions.

By trying to understand the science of everyday thinking, we attempt to understand the biases, influences, and attitudes that affect our thinking, how they affect it, and how awareness of them can enable us to be better thinkers.

But, ironically, the renowned psychologist Daniel Kahneman, who has studied human judgment and decision-making and the heuristics and biases at play for decades, confesses that his own thinking has not improved much over time. He does, however, have valuable advice for us: “Pick up an area and work on improving it, rather than focusing on improving overall thinking in general.”

Hmm…!

Well, let’s just believe in the adage “forewarned is forearmed” and set this not-so-encouraging confession aside for now.

It is startling, to say the least, to discover that we operate under the influence of a large number of psychological biases, heuristics, and cognitive errors on a daily basis. To start with, we are subject to “Naïve Realism”, which implies that we believe the world to be just as it seems. But actually it’s not: we each see it through our own “lens” (perception), and that lens is different for everyone. People tend to underestimate the contribution of their beliefs and theories to observation and judgment, and fail to realize how many other ways their observations could have been interpreted. This tendency is referred to as the “Fundamental Cognitive Error”.

Did you know that every one of us likes to see himself or herself as above average? This is called “the above-average effect”. Can you relate to the experience of planning for final exams or a paper submission and falling far short of the time it really takes to get them done? Well, you are not alone. We all have this tendency, called the “Planning Fallacy”, to underestimate how long it will take us to complete a task.

The “Availability Heuristic” causes us to misinterpret the ease with which something comes to mind as an indication of how common it is. So over-hyped news items or tragic events that stick in memory make us believe they are more prevalent than they are. The classic example is airplane crashes vs. road accidents: even though far more people die on the road than in planes, the fact that airplane crashes get more media coverage leads us to believe that flying is more dangerous than driving.

Of course, there is the “Confirmation Bias”, by which people look for and gather evidence that supports or confirms their beliefs or hypotheses. And through the “Representativeness Heuristic”, we estimate the likelihood of an event by comparing it to a prototype that already exists in our minds.

The most interesting cognitive bias is the “Fundamental Attribution Error”. It describes our tendency to attribute others’ behavior in a given situation, especially negative behavior, to their personality traits rather than to external factors. For example, if a colleague is late to a meeting, we think that he or she is lazy or irresponsible. By contrast, we attribute our own lapses in behavior almost always to external circumstances: given the same situation, when we are late to a meeting, we believe we’re just having a bad day.

The discussion of these hidden forces reminds me of an article in Harvard Business Review that I read a long time ago. In fact, it was the first of its kind I had read, and it was an enlightening revelation. The article is called “The Hidden Traps of Decision Making”. Some of the traps are:

  • Anchoring – giving disproportionate weightage to the first information received
  • Status Quo Bias – the tendency to maintain the status quo
  • Framing effect – the way a problem is framed affects the decision in a big way

You can read the full article here and enlighten yourself.

There are many other interesting things that underlie our thinking and shape our behavior. David Myers’ Exploring Social Psychology is a wonderful book that explains several phenomena pertaining to our behavior with others. It’s a must-read for anyone interested in understanding the hidden drivers of social behavior. There’s an amazing course called “Social Psychology” on Coursera, covering the same material and more. Unfortunately, I couldn’t take it either of the two times it was offered during the past two years. I compensated a little by reading Myers’ book, and I must say I was immensely rewarded. I’ll cover the book in a separate blog post.

Part 2 of Science of Everyday Thinking series.

Part 1