Expertise vs. Judgement
As I noted yesterday, the core development need of an investor is judgement, not technical expertise. As if to affirm this point, Nicholas Kristof wrote in the NY Times:
"There’s evidence that what matters in making a sound forecast or decision isn’t so much knowledge or experience as good judgment — or, to be more precise, the way a person’s mind works."
He continues:
The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, “Expert Political Judgment,” is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about.
The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board.
“It made virtually no difference whether participants had doctorates, whether they were economists, political scientists, journalists or historians, whether they had policy experience or access to classified information, or whether they had logged many or few years of experience,” Mr. Tetlock wrote.
Indeed, the only consistent predictor was fame — and it was an inverse relationship. The more famous experts did worse than unknown ones. That had to do with a fault in the media. Talent bookers for television shows and reporters tended to call up experts who provided strong, coherent points of view, who saw things in blacks and whites.
Other studies have confirmed the general sense that expertise is overrated. In one experiment, clinical psychologists did no better than their secretaries in their diagnoses. In another, a white rat in a maze repeatedly beat groups of Yale undergraduates in understanding the optimal way to get food dropped in the maze. The students overanalyzed and saw patterns that didn’t exist, so they were beaten by the rodent.