Wednesday, July 18, 2012

decomposing problems


(...and by that I don't mean necrotising fasciitis or "flesh eating bacteria"... that's a decomposing problem too, but not one that I want to talk about.)

Still slogging through the "Thinking, Fast and Slow" audiobook during my drive.  This morning Danny was on the topic of decomposing problems correctly.  In other words, he was talking about setting the frame.
He uses an example of an economist who asked his friend, "Would you accept a coin-flip bet?  If the coin comes up heads, you win $200.  If it comes up tails, you lose $100."

A rational actor would jump all over it, but his friend (a human and therefore not completely rational) responded, "No, because I would emotionally suffer the loss of $100 far more acutely than I would emotionally enjoy the gain of $200."  His friend then added, "However, if you were willing to make 100 such bets, I'd take you up on it."

Danny goes on to explain a theoretical foundation, drawing on prospect theory and utility theory, for why this makes sense given the relative loss aversion and personal utility of winning and losing money.  In really simple terms, when you work out the numbers (a code sketch follows the list):
  1. with one coin flip, you have a 50% chance of losing money
  2. with two coin flips, you have a 25% chance of losing money (you only end up down if both flips come up tails, since a single $200 win more than covers a $100 loss)
  3. with three coin flips, you have a 12.5% chance of losing money (even one head out of three leaves you at break-even or better)
  4. and so on.
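
A quick sketch of that arithmetic (my own illustration, not anything from the book): for each number of flips, add up the binomial probability of every outcome that leaves you below break-even.

  from math import comb

  def p_net_loss(n, win=200, loss=100):
      """Chance of ending below break-even after n fair flips
      of the win-$200 / lose-$100 bet."""
      # With k heads out of n flips, the net payoff is k*win - (n-k)*loss.
      # Count the probability mass of every k that leaves you underwater.
      losing = sum(comb(n, k) for k in range(n + 1)
                   if k * win - (n - k) * loss < 0)
      return losing / 2 ** n

  for n in (1, 2, 3, 10, 100):
      print(f"{n:>3} flips: {p_net_loss(n):.4%} chance of losing money")

The chance of walking away poorer falls from 50% at one flip to a small fraction of a percent at a hundred, which is exactly the friend's point.
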
The real point is that framing matters.  We often make the mistake of taking big, complicated analytical problems, breaking them up into SDs, and sending special teams off to work specific parts of the problem.  They report back one at a time with their answers, but the questions were framed too narrowly, and the sum of the answers to the individual questions may not add up to the right answer for the big problem.

In the case of the coin flip above, we could take the portfolio of 100 bets, say "whoa, this is too complicated," pick 100 teams, and tell each of them to go work the numbers.  Each would come back and say "don't make the bet," and the leader, having invested much effort from his org, would rationally say "do none of this stuff" and reject the whole portfolio.  That would be a mistake.

I don't have data to support it, but I think this is, in large part, why large companies often fail to take advantage of the many good, small ideas that crop up.  We have many, many of these small bets that we could be taking.  Each of them has the potential for a big win but is more likely to lose.  We feel the losses more acutely than the hypothetical wins.  For that matter, we suffer the investment more than we enjoy the excitement of the possible win.  One at a time, we kill these ideas, without ever looking at the whole portfolio.
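
To make that concrete, here's a toy simulation with made-up numbers (the numbers are mine, not Danny's): each small idea costs 1 unit and has a 10% chance of paying off 20 units.  Any single idea usually loses; a portfolio of 100 of them almost always wins.

  import random

  random.seed(42)

  def idea(stake=1, p_win=0.10, payoff=20):
      """One small bet: usually lose the stake, occasionally win big."""
      return stake * payoff if random.random() < p_win else -stake

  trials = 10_000
  single_loses = sum(idea() < 0 for _ in range(trials)) / trials
  portfolio_loses = sum(sum(idea() for _ in range(100)) < 0
                        for _ in range(trials)) / trials

  print(f"single idea loses money:       {single_loses:.0%}")    # ~90%
  print(f"portfolio of 100 loses money:  {portfolio_loses:.1%}") # only a few percent

Judged one at a time, roughly nine out of ten of these ideas look like losers; judged as a portfolio, losses are rare.  Same bets, different frame.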

Monday, July 9, 2012

thinking slow and slower


I'm reading Danny Kahneman's book "Thinking, Fast and Slow".  It is definitely a good read and worthwhile for just about anyone.  It is especially important for people who make large, infrequent decisions with long delays before feedback.

I just have to point out one comical inconsistency in the book... it sort of proves that Kahneman himself is prone to mixed thinking.

Early in the book he talks about regression to the mean, also sometimes called "mean reversion".  The central concept of regression to the mean is that any outcome is a mixture of skill and luck.  Skill holds relatively constant from contest to contest, trial to trial, or decision to decision, which leaves luck as the big difference from one chance to the next.  If somebody just had a great outcome, it is very likely that they were lucky on that outcome.  Just playing the odds, that means it's also very likely that they will do worse on the next outcome.  The reverse applies to bad outcomes.
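
A toy simulation of that skill-plus-luck model (my construction, not Kahneman's): give everyone a fixed skill, add fresh luck on each trial, then see how the first trial's stars fare on the second.

  import random

  random.seed(1)
  n = 10_000
  skill  = [random.gauss(0, 1) for _ in range(n)]      # fixed per person
  trial1 = [s + random.gauss(0, 1) for s in skill]     # skill + luck
  trial2 = [s + random.gauss(0, 1) for s in skill]     # same skill, fresh luck

  # Pick the top 10% of performers on trial 1 and average both trials.
  cutoff = sorted(trial1, reverse=True)[n // 10]
  stars = [i for i in range(n) if trial1[i] > cutoff]
  avg1 = sum(trial1[i] for i in stars) / len(stars)    # ~2.5: skill plus good luck
  avg2 = sum(trial2[i] for i in stars) / len(stars)    # ~1.25: skill, average luck
  print(f"stars on trial 1: {avg1:+.2f}")
  print(f"stars on trial 2: {avg2:+.2f}")

With skill and luck contributing equal variance, the stars give back about half their edge on the next trial, no causal story required.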

He then tells a great story about a flight instructor, which you can find anywhere if you're interested.  He also cautions the reader not to build a plausible, explanatory, causal story to explain the shift in performance.  He beats up on sportscasters, experts of all kinds, and so on for building up stories to explain why high performers often do less well after a spectacular success.

A few chapters later in the book, he starts talking about optimists.  He spends a while on the benefits and dangers of optimism, including how award-winning CEOs tend to lead their companies to underperform the market.  He builds a plausible, causal story about how the overconfidence of CEOs who have done well leads them to make overconfident decisions, such as acquisitions and extreme risk-taking.  His story sounds a lot like the sort of story he was lambasting sportscasters for earlier.
:)

Friday, June 15, 2012

thinking fast and slow

Reading Kahneman's book about System 1 and System 2 thinking, or precognitive v. cognitive, subconscious v. conscious, id v. superego, automatic v. reflective, etc.

It's definitely a concept that has been around for a long time in various fields, but fun to read about.  The brain primarily operates on habits and perceived patterns.  At the same time, what we think of as our consciousness is essentially a focused self-reflection on some portion of that boiling churn.  Certain things are pushed up from the subconscious for more careful reflection: things that stand out as abnormal or that require mental processing that can't be done from habit (such as complicated math).

Kahneman's theory is that easy problems are best solved with automatic thinking, that hard problems are best solved through deliberate thinking, and that we have to be careful about which mode we're actually using.  He also has a great sentence, something like:
"When faced with hard problems, we often solve an easier problem instead, and are unaware of the substitution"

Friday, March 16, 2012

viral campaign from Intel

http://ultrabooktemptations.intel.com/

Cute little viral campaign for the Ultrabooks from Intel.