You can fool yourself – and intuition is often mistaken. This is not fashionable to say in certain circles, in which intuition is given a special status. And yes, the brain and mind can do some amazing things, and intuition is often astonishingly right, especially certain types of intuition – for example, when we ‘intuit’ how another person is feeling, or even what thoughts they are having.

But intuition is, or can be, just as subject to “filter mistakes” as any other function of consciousness. What are filter mistakes? A filter mistake happens when we make assumptions, filter out information that doesn’t support those assumptions, and filter in information that does. Other terms for what I call a “filter mistake” are perception bias and cognitive bias.

A recent article, The Hazards of Confidence, discusses these kinds of cognitive biases. It’s worth a read – and as you read it, consider how these kinds of problems may be affecting your training and exploration.

Because our impressions of how well each soldier performed were generally coherent and clear, our formal predictions were just as definite. We rarely experienced doubt or conflicting impressions. We were quite willing to declare: “This one will never make it,” “That fellow is rather mediocre, but should do O.K.” or “He will be a star.” We felt no need to question our forecasts, moderate them or equivocate. If challenged, however, we were fully prepared to admit, “But of course anything could happen.”

We were willing to make that admission because, as it turned out, despite our certainty about the potential of individual candidates, our forecasts were largely useless. The evidence was overwhelming. Every few months we had a feedback session in which we could compare our evaluations of future cadets with the judgments of their commanders at the officer-training school. The story was always the same: our ability to predict performance at the school was negligible. Our forecasts were better than blind guesses, but not by much.

We were downcast for a while after receiving the discouraging news. But this was the army. Useful or not, there was a routine to be followed, and there were orders to be obeyed. Another batch of candidates would arrive the next day. We took them to the obstacle field, we faced them with the wall, they lifted the log and within a few minutes we saw their true natures revealed, as clearly as ever. The dismal truth about the quality of our predictions had no effect whatsoever on how we evaluated new candidates and very little effect on the confidence we had in our judgments and predictions.

I thought that what was happening to us was remarkable. The statistical evidence of our failure should have shaken our confidence in our judgments of particular candidates, but it did not. It should also have caused us to moderate our predictions, but it did not. We knew as a general fact that our predictions were little better than random guesses, but we continued to feel and act as if each particular prediction was valid. I was reminded of visual illusions, which remain compelling even when you know that what you see is false. I was so struck by the analogy that I coined a term for our experience: the illusion of validity.

I had discovered my first cognitive fallacy.

http://www.nytimes.com/2011/10/23/magazine/dont-blink-the-hazards-of-confidence.html?_r=1&pagewanted=all

Summary – In general, however, you should not take assertive and confident people at their own evaluation unless you have independent reason to believe that they know what they are talking about. Unfortunately, this advice is difficult to follow: overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.