3 Common Errors
February 2023, Vol. 12 No. 2
Hello,

Making decisions based on what we think we know to be true can be dangerous, especially when what we "know" is either partially or wholly incorrect.

Today's newsletter looks at the three causes of this problem: flawed data, flawed models, and flawed thinking. It also offers suggestions for avoiding these common traps.
Please reply with your thoughts and comments.
Charlie
Charlie Goodrich
Founder and Principal
Goodrich & Associates
In this issue…
When What You Know to be True Isn’t
Heard on the Street
About Us
When What You Know To Be True Isn’t
“It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.” — Mark Twain

Mark Twain may not have been a practicing business professional, but his insight regarding the danger of acting on “truths” that aren’t, applies just the same.

There are three causes of this problem: flawed data, flawed models, and flawed thinking. 

Let’s look at each in turn…

#1. Flawed Data

What can go wrong here? Lots.

The sample from which the data is drawn may have issues, usually due to:

  • Path forking. This occurs when we comb through lots of data looking for something meaningful, making the resulting data set highly conditional. We sort through data and say "not this," "not this," "oh, look at that." The data set has been narrowed, and the real finding is therefore "If this, and this, then that."

  • Sample size. If the sample is too small, any finding must represent a large difference to be statistically significant. That difference looks compelling and meaningful, but it may not be.

Taken together, these two things are known as "p-hacking." The key in both cases is that the mere presence of statistical significance does not necessarily tell the entire story or reliably predict future outcomes.
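To make the small-sample point concrete, here is a minimal Python sketch (all numbers invented for illustration). It draws many pairs of small samples from the same distribution, so the true difference between groups is zero, and shows that the comparisons that happen to look "significant" are precisely the ones with large apparent differences:

```python
import random
import statistics

random.seed(42)

def mean_diff_and_se(n=10):
    # Two small samples drawn from the SAME distribution:
    # any apparent "effect" is pure noise.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(a) - statistics.mean(b)
    se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
    return diff, se

significant = []
for _ in range(2000):
    diff, se = mean_diff_and_se()
    if abs(diff) > 1.96 * se:  # crude ~5% significance cutoff
        significant.append(abs(diff))

print(f"{len(significant)} of 2000 null comparisons looked 'significant'")
print(f"smallest 'significant' effect: {min(significant):.2f} (true effect is 0)")
```

Roughly 5% of these pure-noise comparisons clear the significance bar, and every one of them shows a sizable difference: exactly the compelling-but-meaningless pattern described above.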


For example, for years it was believed that the US decline in manufacturing jobs was the result of productivity improvements, a conclusion reached by looking at country-wide data. But when an economist took a closer look, it was discovered that nearly all of the productivity gain came from a single industry category: computers and electronic products. There was no productivity gain anywhere else in manufacturing, and the computer-sector gain came largely from fitting more chips on a wafer.

Not considering how the data is collected or measured.

A client of mine buys a commodity by the ton, expands and processes it, and then puts the expanded product in a bag measured in cubic feet. Previously, my client estimated how much of the commodity was used by taking a one-cubic-foot sample of the processed product once every hour and weighing it. They then multiplied cubic feet produced per hour by the weight of the one-cubic-foot sample to calculate tons used. This method was very inaccurate.

Today, they take accurate physical inventories and calculate usage directly: beginning inventory + tons received - ending inventory.
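The reconciliation arithmetic is simple. Here is a quick sketch with hypothetical tonnage figures:

```python
# Hypothetical figures, in tons: estimating usage by inventory
# reconciliation rather than by hourly density sampling.
beginning_inventory = 120.0  # tons on hand at start of period
tons_received       = 480.0  # tons purchased during the period
ending_inventory    = 95.0   # tons on hand at end of period

tons_used = beginning_inventory + tons_received - ending_inventory
print(f"tons used: {tons_used}")  # -> tons used: 505.0
```

Because it is anchored to counted inventory at both ends of the period, any hour-to-hour sampling error simply drops out.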

Not aggregating the data. 

A classic example of this involves adding up the profits of all public companies and examining trends in that aggregate number to divine the relative health of business and the economy in general. But public companies account for only a slice of the entire economy, so the aggregate of all public company profitability is a lousy measure of business profitability.

Instead, economists look at net income of corporations in the National Income and Product Accounts.

Faulty data filtration.

By necessity, organizations filter data. The result is that some data is missing. The way around this is to bypass the filters (internal staff): meet customers and suppliers, go to trade shows, and simply walk about your physical operation in an approachable way to get unfiltered data.

#2. Flawed Models

Data is meaningful because we use it in the context of a model to draw conclusions. Often, although we may think we are not using a model, we really are: at its simplest, a model is nothing more than "If A, then B." The problem arises when A leads to C and not B.

Common errors…

Thinking that your business operates differently than it really does.

This often occurs when applying a model that worked in a different context.

For example, when I was in the rent-a-car business we always lost money in New York City. Prices were low, costs were high, and market share was suboptimal. But even after I was able to demonstrate to the COO that breakeven volume was more than 100% of the market (!), he insisted on doubling down on volume because at his prior company (then a market leader), greater market share meant greater profit.

Overlooking industry changes that cause a change in your business model. 

When LaGuardia airport went to shared common facilities for rental cars, much of the cost advantage of the big companies evaporated (but not the price advantage).

Using too many variables.

Adding more variables and relationships between those variables often, but not always, yields a more statistically significant explanation than does a simpler model. 

But since each variable contains some degree of error, when it comes to projecting, the relationship between some of these variables may no longer exist (or may have been a statistical fluke all along). When that occurs, the projection compounds to error × error × error... resulting in a very bad projection!
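A small simulation makes the compounding visible. In this sketch (assumed, not from the newsletter), a projection is built by multiplying several estimated factors together, each carrying roughly 10% error; the spread of the final projection grows with the number of factors:

```python
import random
import statistics

random.seed(0)

def projection_spread(num_vars, noise=0.10, trials=5000):
    """Relative spread of a projection built by multiplying num_vars
    estimated factors, each off by roughly `noise` (10%) on average."""
    outcomes = []
    for _ in range(trials):
        value = 1.0
        for _ in range(num_vars):
            value *= random.gauss(1.0, noise)  # each factor carries its own error
        outcomes.append(value)
    return statistics.stdev(outcomes)

for k in (1, 3, 6):
    print(f"{k} variables -> projection spread ~{projection_spread(k):.2f}")
```

With one factor the spread is about the 10% you put in; with six factors it is well over double that, even though no single input got any worse.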

#3. Flawed Thinking

Unfortunately, our brains are not wired to always think logically, something that academics have demonstrated for quite some time.

Some of the more common mind traps that can trip up business decisions include…

Anchoring

The natural tendency to make preliminary or "straw man" first-impression decisions often leads to ignoring the rest of the data — even when that data does not support the initial decision.

Projecting

Projecting our own value and belief systems onto others when predicting how other organizations (even countries) will act. There is no better example of this than the Cuban Missile Crisis, in which Kennedy's advisors modeled Soviet thinking through American values, beliefs, and political systems (something we are reminded of again vis-à-vis Ukraine).

Delusions of Clairvoyance

Thinking we know what other people are thinking, or how they would think, much better than we really do. This applies to "reading body language" as well. In both cases, research has shown that despite our best efforts, we can't do this with any degree of reliability.

Confirmation Bias 

Focusing on information that confirms our existing beliefs and rejecting data that goes against it. Humans tend to be overconfident in their own knowledge and opinions, something that certainly tripped up my rent-a-car COO. Confirmation bias can make anchoring bias worse.

Faulty Pattern Recognition 

Our ability to see patterns in information based on experience, having seen the pattern before, is valuable in any number of situations. If we don't check the underlying data, however, it's easy to identify the wrong problem or cause of a problem. Pattern recognition without data verification often results from a lack of time to make proper decisions, sloppiness, or overconfidence in general.

Final Thoughts

In short, always look for robustness — the existence of multiple, at least partially independent means of determining something. (Thanks to the University of Chicago's William Wimsatt for this definition.)

Make sure as well to look at different data sets and see if they draw the same conclusion, get unfiltered data as previously discussed, and above all, don’t be overconfident about what you think you know!
Please share with your colleagues:
Heard on the Street
What is the impact of the mandate to switch credit line pricing from LIBOR-based to SOFR-based? According to recent research from the Federal Reserve Bank of New York, the impact is to increase the borrowing cost on credit lines. 

The short answer as to why is that LIBOR is a credit-based rate and SOFR is a risk-free-based rate. Read more of this short explanation here.
About Us
Goodrich & Associates is a management consulting firm. We specialize in restructuring and insolvency problems. Our Founder and Principal, Charlie Goodrich, holds an MBA in Finance from the University of Chicago and a Bachelor's Degree in Economics from the University of Virginia, and has over 30 years' experience in this area.


To ensure that you continue to receive emails from us, please add
charlie@goodrich-associates.com to your address book today.

Goodrich & Associates respects your privacy.
We do not sell, rent, or share your information with anybody.

Copyright © 2023 Goodrich & Associates LLC. All rights reserved.

For more on Goodrich & Associates and the services we offer, click here.

Newsletter developed by Blue Penguin Development
Goodrich & Associates
781.863.5019