I can’t tell you.
In my climb up Maslow’s pyramid, I find myself checking Google Maps for turn-by-turn directions. In life, I’m looking for a script while doing improv comedy; for a recipe amongst chefs who don’t measure; for a blueprint in an imprecise world.
I’m better off coming to terms with the fact that there is no one path; no right or wrong; no universal truth. And yet, we live in a world — and in an age of media — where so-called experts are seeking to capture (and monetize) our attention with the definitive guides to success.
As someone who looks (too often) to the outside world for guidance, I’m struggling. For every compelling and practical article I read, there is another equally compelling and practical article telling me to do the exact opposite of what the first compelling and practical article told me to do.
We’ll call it Norman’s Third Law of the Internet —
For every prescription there is an equal and opposite opinion.
I recently wrote about aspiration and value optimization. In short, when we aspire, we’re making decisions that optimize the values of our future selves over the values of our current selves. It’s awkward and contradictory for our current selves, but necessary throughout the transition process.
This topic had me thinking about benchmarking — as we aspire, who do we benchmark against? Someone at our level, or someone at the level we’re reaching for?
Daniel Egan, the Director of Behavioral Finance and Investing at Betterment, writes of his aspiration to become a better swimmer after a lapse in practice. He's inclined to compare himself to Michael Phelps or, more realistically, the people in the pool swimming faster than he is. However, he argues that this comparison does not serve him well:
We should be benchmarking and improving on our own performance relative to our own goals. Not others.
Meanwhile, Benjamin Hardy writes of gamifying our lives to quickly accomplish big goals by “competing” with people far more advanced than our current level. It’s a tactic taken from chess Grandmaster and overall wunderkind Josh Waitzkin:
Waitzkin observed that most others in his Tai Chi class would naturally practice with those at their same skill-level or slightly worse… He would purposefully practice with people far more skilled than he was… This process compressed and quickened Waitzkin’s skill development… [and] he progressed much faster than others in his class.
Two compelling and practical arguments, almost directly in opposition to each other. Whose advice do I follow?
For me, the acknowledgment that there is no universal truth is important and reassuring. Rather than trying to find the answer, and trying to evaluate which argument above is “better”, I can approach a decision with the understanding that I’m not seeking the “right” choice, but merely the “least wrong choice”.
The only truth is that there is no truth.
Venture capitalist Josh Wolfe hit this point home in a podcast interview on The Knowledge Project with Shane Parrish. In a discussion on parenting advice, Josh said:
The one thing that I know for sure, is the mere existence proof — when you walk into Barnes & Noble and there are a thousand books on parenting means nobody knows what they’re talking about. Because if there was an answer, you would have one book.
With parenting, it’s to each their own, and every child is different and every circumstance is different, and what they need is different…
When faced with two (or more) divergent, yet equally compelling and practical arguments, the acknowledgment that there is no right or wrong is a license to experiment, to make mistakes and to iterate.
So, who should we listen to?
Whose advice should inform our hypothesis?
We use heuristics — mental shortcuts — to facilitate the decision-making process. A relevant heuristic is our deference and automatic obedience to authority figures. Often, obedience to authority is a useful rule of thumb.
Still, psychologist Robert Cialdini, in his book Influence, suggests two defense mechanisms against purported authorities: firstly, asking “Is this authority truly an expert?”, and secondly, asking “How truthful can we expect the expert to be?”
In the example above, both Daniel Egan and Josh Waitzkin are relevant authorities on the topic at hand. While they may have an interest in selling Betterment’s services (in Dan’s case) or books (in Josh’s case), they otherwise don’t appear to have any reason to be insincere or misleading in the advice they’re giving.
We then must be wary of cognitive biases…
As I wrote, incidentally, in a prior post entitled Why You Should Stop Following Internet Experts, we fall victim to the outcome bias when we judge the quality of a decision once the outcome is already known.
When a team wins a sporting event, we attribute the win to the team’s or coaches’ performance, rather than randomness or chance.
We already know that Josh Waitzkin is wildly successful. Therefore, we attribute the success to his expertise and subsequently believe that his advice — what worked for him — will work for us.
Similarly, we fall victim to the survivorship bias when we concentrate on a select group of people — already successful entrepreneurs or those who are putting out an abundance of content — and overlook those with a lack of visibility.
We learn from the successes, and failures, of successful entrepreneurs. But, we don’t get to learn from the successes or failures of failed entrepreneurs, whose insight may be equally valuable.
“Resulting” is a term coined by Annie Duke in her book Thinking in Bets: Making Smarter Decisions When You Don’t Have All The Facts. We are resulting when we focus on the outcome of decisions rather than the process.
Thinking probabilistically, following Josh’s advice may give us a 75% chance of success. That also means that if we follow his advice exactly, we still have a 1 in 4 chance of failure.
Does that mean Josh’s advice was bad advice? Or bad advice for us?
These biases are important to consider while formulating a hypothesis, to ensure that we are testing the best possible hypothesis given the information available to us at present.
Although a similar or identical hypothesis may have tested positive for Dan or Josh, we must acknowledge that the hypothesis may not test positive for us.
While this should go without saying, it is also vital to actually test the hypothesis and, per the scientific method, analyze the results and redevelop and retest new hypotheses when and where necessary.
How often have we made assumptions about something, or been certain in our convictions, only to find out later that our prior assumptions or convictions were incorrect or did not serve us well?
A high level of scrutiny and skepticism is valuable, but it can also be debilitating and a cause of procrastination. I think this is a problem with the sheer volume of self-help books and content available. We’re so focused on reading what so-called experts suggest we do that we forget to actually do it. Or we fear choosing a path because of the sheer number of paths suggested to us.
Experimentation reminds us that we can get answers to our questions, but that we have to act — and that sometimes we may be (pleasantly) surprised!
I’m not here to make a qualitative argument and tell you whether Daniel’s or Josh’s argument is right or wrong. I don’t know which argument is right or wrong. I know which one sounds good to me, and I have a hypothesis as to which is better for me, but who am I to tell you that it’ll work for you? I don’t even know that it’ll work for me!
All I can say is that I’m comforted by the opportunity to experiment and figure out which method is best for me.
I’ve read about emerging technology that can better prescribe medication based on our DNA sequencing. With machine learning and AI, perhaps one day a computer will be able to tell us the most value-optimizing decision to make, and whose advice is best to follow.
Until then, we have no choice but to find what works the old-fashioned way — through life experience and trial and error.