This is a long quote/excerpt from Adam Robinson I’ve been holding onto for a while, from Tribe of Mentors. Worth considering, especially if you strive to work in a data-informed product organization.
Virtually all investors have been told when they were younger — or implicitly believe, or have been tacitly encouraged to do so by the cookie-cutter curriculums of the business schools they all attend — that the more they understand the world, the better their investment results. It makes sense, doesn’t it? The more information we acquire and evaluate, the “better informed” we become, the better our decisions. Accumulating information, becoming “better informed,” is certainly an advantage in numerous, if not most, fields.
But not in the counterintuitive world of investing, where accumulating information can hurt your investment results.
In 1974, Paul Slovic — a world-class psychologist, and a peer of Nobel laureate Daniel Kahneman — decided to evaluate the effect of information on decision-making. This study should be taught at every business school in the country. Slovic gathered eight professional horse handicappers and announced, “I want to see how well you predict the winners of horse races.” Now, these handicappers were all seasoned professionals who made their livings solely on their gambling skills.
Slovic told them the test would consist of predicting 40 horse races in four consecutive rounds. In the first round, each gambler would be given the five pieces of information he wanted on each horse, which would vary from handicapper to handicapper. One handicapper might want the years of experience the jockey had as one of his top five variables, while another might not care about that at all but want the fastest speed any given horse had achieved in the past year, or whatever.
Finally, in addition to asking the handicappers to predict the winner of each race, he asked each one also to state how confident he was in his prediction. Now, as it turns out, there were an average of ten horses in each race, so we would expect by blind chance — random guessing — each handicapper to be right 10 percent of the time, and his confidence with a blind guess to be 10 percent.
So in round one, with just five pieces of information, the handicappers were 17 percent accurate, which is pretty good, 70 percent better than the 10 percent chance they started with when given zero pieces of information. And interestingly, their confidence was 19 percent — almost exactly as confident as they should have been. They were 17 percent accurate and 19 percent confident in their predictions.
In round two, they were given ten pieces of information. In round three, 20 pieces of information. And in the fourth and final round, 40 pieces of information. That’s a whole lot more than the five pieces of information they started with. Surprisingly, their accuracy had flatlined at 17 percent; they were no more accurate with the additional 35 pieces of information. Unfortunately, their confidence nearly doubled — to 34 percent! So the additional information made them no more accurate but a whole lot more confident. Which would have led them to increase the size of their bets and lose money as a result.
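The arithmetic in the study can be made concrete with a short sketch. Only the 10 percent chance rate and the 17/19/34 percent figures come from the excerpt; the code itself is purely illustrative:

```python
# Numbers quoted in the excerpt: accuracy stayed at 17% in every round,
# while confidence rose from 19% (5 cues) to 34% (40 cues).
# Blind chance with ~10 horses per race is 10%.
chance = 1 / 10
rounds = {5: (0.17, 0.19), 40: (0.17, 0.34)}  # cues -> (accuracy, confidence)

for cues, (accuracy, confidence) in rounds.items():
    lift = (accuracy - chance) / chance   # improvement over blind guessing
    overconfidence = confidence - accuracy  # calibration gap
    print(f"{cues} cues: +{lift:.0%} over chance, "
          f"overconfidence {overconfidence:+.0%}")
```

Running this shows the point of the study: the lift over chance (+70%) is identical in both rounds, while the calibration gap widens from roughly +2 to +17 percentage points.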
Beyond a certain minimum amount, additional information only feeds — leaving aside the considerable cost of and delay occasioned in acquiring it — what psychologists call “confirmation bias.” The information we gain that conflicts with our original assessment or conclusion, we conveniently ignore or dismiss, while the information that confirms our original decision makes us increasingly certain that our conclusion was correct.
So, to return to investing, the second problem with trying to understand the world is that it is simply far too complex to grasp, and the more dogged our attempts to understand the world, the more we earnestly want to “explain” events and trends in it, the more we become attached to our resulting beliefs — which are always more or less mistaken — blinding us to the financial trends that are actually unfolding. Worse, we think we understand the world, giving investors a false sense of confidence, when in fact we always more or less misunderstand it.
You hear it all the time from even the most seasoned investors and financial “experts” that this trend or that “doesn’t make sense.” “It doesn’t make sense that the dollar keeps going lower” or “it makes no sense that stocks keep going higher.” But what’s really going on when investors say that something makes no sense is that they have a dozen or whatever reasons why the trend should be moving in the opposite direction, yet it keeps moving in the current direction. So they believe the trend makes no sense. But what makes no sense is their model of the world. That’s what doesn’t make sense. The world always makes sense.
In fact, because financial trends involve human behavior and human beliefs on a global scale, the most powerful trends won’t make sense until it becomes too late to profit from them. By the time investors formulate an understanding that gives them the confidence to invest, the investment opportunity has already passed.
So when I hear sophisticated investors or financial commentators say, for example, that it makes no sense how energy stocks keep going lower, I know that energy stocks have a lot lower to go. Because all those investors are on the wrong side of the trade, in denial, probably doubling down on their original decision to buy energy stocks. Eventually they will throw in the towel and have to sell those energy stocks, driving prices lower still.
9 thoughts on “Adam Robinson on Understanding”
“But what’s really going on when investors say that something makes no sense is that they have a dozen or whatever reasons why the trend should be moving in the opposite direction, yet it keeps moving in the current direction. So they believe the trend makes no sense. But what makes no sense is their model of the world. That’s what doesn’t make sense. The world always makes sense.”
“The world always makes sense.” This is, in my opinion, the most important thing to remember in most any situation. Our perceptions shape our understanding of the world, but they’re just our perceptions.
One of my favorite quotes that I often summon in relation to this is,
“The major problems of the world are the result of the difference between the way nature works and the way people think.”
– Gregory Bateson
“The world always makes sense” part has also resonated very much with me.
I think there is both an element of truth to this and a point where it goes too far.
The goal, then, is to find high-quality information instead of just large quantities of information. Another way to put it: not all information has equal value.
That is the point where I think he drove his point over the cliff. The world always is what it is, and “making sense” is relative. What makes sense to one person can be defined as insanity by others.
There’s a related paper that sheds some light on why knowing more can decrease accuracy: “The illusion of knowledge: When more information reduces accuracy and increases confidence” (http://www.sciencedirect.com/science/article/pii/S0749597807000064).
Interestingly, I wrote about this a long time ago, because data tends to give you only a piece of the picture (https://www.withinboredom.info/2015/06/data-cant-drive-your-business-veers-into-oncoming-competition/). For example, if you set out a collector to measure rainfall and put it under a tree, your measurements will be way off. You have to know that the tree will influence your measurements and figure out a way to either remove the tree or change how you’re collecting. And first you have to notice the tree, or at least realize your measurements are off or unbelievable.
This is like how businesses use aggregate data to drive decisions based on 80% of 80% of their users (or less) without really looking at the rest. This works at first, because 80% of 80% is a lot. But then you lose 10% of the 20% the business didn’t focus on, and eventually customer growth plateaus because the product is now tailored to a very specific kind of user instead of all users. Interestingly, AI models have a similar problem: overfitting and getting stuck focusing on all the wrong things.
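The coverage arithmetic behind the “80% of 80%” point is worth spelling out. The percentages below come from the comment itself; the framing as code is just illustrative:

```python
# "80% of 80%": optimizing for 80% of the users you can see, when your
# aggregate data only covers 80% of users in the first place.
focus = 0.80 * 0.80           # share of the user base the product is tuned for
neglected = 1 - focus         # everyone the data-driven loop ignores

# The comment's churn scenario: losing 10% of the unfocused 20%.
churned = 0.10 * 0.20

print(f"Users the product is tuned for: {focus:.0%}")      # 64%
print(f"Users outside the loop:         {neglected:.0%}")  # 36%
print(f"Churn from the neglected 20%:   {churned:.0%}")    # 2%
```

So a team can feel “data-driven” while its data loop systematically covers under two-thirds of its users.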
Granted, not all businesses have this problem.
Perhaps more information isn’t the culprit, but rather holding onto confirmation bias. I wonder if simply being aware of one’s bias (and letting go of it) can improve outcomes. Maybe that’s easier said than done.
This part is also relevant for leaders:
“Slovic told them the test would consist of predicting 40 horse races in four consecutive rounds. In the first round, each gambler would be given the five pieces of information he wanted on each horse, which would vary from handicapper to handicapper. One handicapper might want the years of experience the jockey had as one of his top five variables, while another might not care about that at all but want the fastest speed any given horse had achieved in the past year, or whatever.”
So the handicappers chose the information they would base their predictions on; the information they had wasn’t really random. They started round 1 with an open mind, so to speak, and asked the right initial questions by requesting the information that was most relevant to them (based on experience, I assume). It’s not surprising that this information actually improved their predictions.
Now, in round two the confirmation bias really kicks in, because by now they have a theory and will most likely ask for information that confirms it. But more confirmation of a wrong theory doesn’t make the theory more correct. At some point you have to start actively looking for information that goes against your theory (call it the scientific method if you will).
And to bring it back to real-life decision making:
– What is the bare minimum of information you need to make that decision?
– How can you mitigate/soften the impact in case you are wrong?
– Whom can you rely on to challenge your decisions/convictions?
Such valuable thought/reality in this bit. In the end, whether we go with our (we think) informed opinions or our confirmation biases, we’re all interpreting whatever information we can see, whether it fell into our laps or we sought it out.
Tribe of Mentors has been on my to-read list.
In a world of high-frequency trading, where algorithms buy and sell stocks in fractions of a second based on natural language processing of Twitter comments... is anything in the original quote even relevant anymore?