I came across a reference to this in a discussion recently, so I looked it up on Wikipedia.
No doubt we all think it describes those whose view of things is the opposite of our own.
The Dunning–Kruger effect is a cognitive bias in which unskilled people make poor decisions and reach erroneous conclusions, but their incompetence denies them the metacognitive ability to appreciate their mistakes. The unskilled therefore suffer from illusory superiority, rating their ability as above average, much higher than it actually is, while the highly skilled underrate their own abilities, suffering from illusory inferiority. This leads to the situation in which less competent people rate their own ability higher than more competent people. It also explains why actual competence may weaken self-confidence. Competent individuals falsely assume that others have an equivalent understanding. "Thus, the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others."
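The miscalibration pattern described above can be sketched in a few lines of Python. This is a hypothetical toy model of my own, not data from the original study: it simply assumes everyone's self-estimate regresses toward "somewhat above average", and shows that this assumption alone makes the weakest overestimate and the strongest underestimate their standing.

```python
import random

random.seed(42)

# Toy model (illustrative assumption, not Kruger & Dunning's data):
# true skill is a percentile from 0 to 100, but everyone's self-estimate
# is pulled toward roughly the 65th percentile.
N = 1000
actual = [random.uniform(0, 100) for _ in range(N)]
predicted = [0.3 * a + 0.7 * 65 for a in actual]

# Estimation error (predicted minus actual) for the weakest and strongest.
bottom = [predicted[i] - actual[i] for i in range(N) if actual[i] < 25]
top = [predicted[i] - actual[i] for i in range(N) if actual[i] > 75]

print(f"bottom quartile mean error: {sum(bottom) / len(bottom):+.1f}")  # large positive: overestimate
print(f"top quartile mean error:    {sum(top) / len(top):+.1f}")        # negative: underestimate
```

The single assumed parameter (how strongly estimates regress to the mean) is doing all the work here; the point is only that over- and under-estimation at the two ends fall out of one shared bias.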
Well, if you define incompetence as the inability to calibrate your own skill, then I guess that makes both the bad and the good incompetent. So if you have a talent, best to keep schtum and let the skill do the talking.
In other words, trust no man at his own estimation... and this insight is probably at the root of the scientific method: "Nullius in Verba"?
They suffer from illusory superiority, rating their ability as above average, much higher than it actually is
sounds a lot more educated than my normal sentence: "They ain't got a bl**dy clue what they're talking about." And as someone who is the fount of all knowledge, I should know! :P
Post by speakertoanimals on Feb 1, 2011 17:52:15 GMT 1
The usual example of this that many people can relate to is school students (when did pupils become students? It confuses the hell out of old farts like me, for whom "student" means undergraduate...), and the poor students who have no idea how weak they actually are, or that they don't understand what they think they understand. Whereas the bright ones who are going to get 95% in their maths exam are nevertheless convinced they are stupid, because of those few lost percent and the few little things they got wrong on the exam paper.
Although you could argue that the reason they get 95% is precisely because they continue to focus on what they still don't quite understand or get right...
I wouldn't have thought anyone would have gone to all the trouble of giving it a double-barrelled name, though...
Ah -- interesting to see cultural differences, and perhaps all that American self-esteem stuff fits in there as well -- the dim MAY have a sense of self-esteem, and MAY feel that they are doing okay, but that doesn't mean that they are. And research does indicate that inflating self-esteem can actually HARM grades rather than improve them. You can't IMPROVE unless you focus on what you don't yet know -- hence the clever students who focus on the elusive 5%, rather than the 95% they did get right.
Post by marchesarosa on Mar 16, 2011 16:27:49 GMT 1
How does that relate to the highly elastic IPCC concept of "uncertainty", which they are now pushing like mad, and to plain ignorance of all the pertinent facts?
Last Edit: Apr 25, 2011 19:59:08 GMT 1 by marchesarosa
On Judith Curry's blog, Climate Etc, Paul Vaughan said:
Statisticians, perceived by many as possessing magical powers, receive FLOODS of requests for assistance. Academic training in the field of Statistics is far more abstract than what the general public imagines (and even far more abstract than what most with a B.Sc. or M.Sc. would imagine). From a mathematician's perspective, Stats is “applied”, and since many statisticians have a background in math, they also (in general) view their field as “applied”. What is needed in the climate discussion is NOT application of ABSTRACT concepts based on UNTENABLE assumptions, but rather DATA EXPLORATION (which differs fundamentally from statistical inference). NO amount of abstractly-clever parametric “uncertainty” computations can make up for OVERLOOKING key lurking variables (a responsibility of field specialists, not statistical helpers).
I welcome participation by statisticians, physicists, etc., but I caution everyone to realize that such participation doesn’t guarantee silver bullets – or even anything meaningful – if the mainstream culture of PRETENDING untenable assumptions aren’t a problem is allowed to remain dominant. Absolute Guarantee: Statisticians will bring FLOODS of untenable assumptions to the discussion. FLOODS. To avoid being blown into an abstract realm, EVERY untenable assumption will have to be patiently & thoroughly disarmed.
At THIS stage, the problem is not “uncertainty”, but IGNORANCE. ONLY once the ignorance problem has been overcome can we move on to meaningfully addressing uncertainty. (At that stage statisticians could be a tremendous help …but we’re categorically not there yet.)
What we REALLY need at THIS stage is PROFOUNDLY TALENTED data explorers (ones who are wise enough to outright DISMISS all untenable assumptions). Once ignorance is overcome, the data explorers will be able to pass the torch to the statisticians, who will at THAT stage actually be empowered to make meaningful assessments of uncertainty (something which is IMPOSSIBLE given current ignorance levels).
Best Regards.
----------- Well said, Paul. I couldn't have put it better myself. You can't make a silk purse out of a pig's ear, no matter how clever-clever your statistical tricks are. If you don't have the proper data you ain't got nuffin. Sweet FA!
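Paul's point about overlooked lurking variables can be made concrete with a toy example of Simpson's paradox (my illustration, not his): fit a least-squares slope within each group and the trend is clearly negative, but pool the data while ignoring the grouping variable and the sign flips. The groups and numbers below are entirely made up for illustration.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Two made-up groups: within each, y falls as x rises (slope -1),
# but group B sits higher on both axes than group A.
group_a = [(x, 10 - x) for x in range(5)]        # x: 0..4
group_b = [(x, 30 - x) for x in range(10, 15)]   # x: 10..14

xs, ys = zip(*(group_a + group_b))

print(slope(*zip(*group_a)))  # -1.0 within group A
print(slope(*zip(*group_b)))  # -1.0 within group B
print(slope(xs, ys))          # positive: the lurking group variable flips the sign
```

No amount of "uncertainty" arithmetic on the pooled slope would flag the problem; only knowing that the grouping variable exists (the field specialist's job) does.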
Last Edit: Apr 25, 2011 20:01:18 GMT 1 by marchesarosa