Australian General Semantics Society Inc.

Seminar Summary - 19 August 2012

" Managing our Biases"
What are the main biases affecting our rational judgments and decisions
and can we be educated out of them?

Led by Professor Gabriel Donleavy

at Gavan and Pauline's welcoming home "Clifftop View"


Catching Up

As "GS practitioners", we "always" have some GS Diary entries to share with the group.  Birthdays, babies and books were a popular focus for the day.

Homework from last month ... !

Remember what Pauline asked us to do over the month from last meeting ...
      Write down three strategies you intend to put into practice in the month,
      and make a note on your calendar, each Friday, to check you are actually doing it.

Our compliance with this proposal was somewhat less than 100% ...
Ahh well, there's always next month!

Today's discussion on Biases ...

Incoming AGS President, Gabriel Donleavy, addressed us on the topic "Biases in Decision Making". He began by reminding us that GS characterises mathematics as a language which has all particulars included. Rational decision making accordingly uses mathematical models to make the best possible decisions, based on accurate numbers for the variables and coefficients that feed into the model. He outlined the payoff matrix used in Game Theory and worked through two cases of the Prisoners' Dilemma, showing how even the most rational decision making depends on having enough of the right information and assembling it appropriately.
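As a concrete illustration (my own addition, not from Gabriel's talk), the Prisoners' Dilemma payoff matrix can be set out in a few lines of Python. The sentence lengths are the usual textbook figures, chosen only to show why "rational" play drives both prisoners to confess:

    # A minimal sketch of the Prisoners' Dilemma payoff matrix (illustrative
    # years-in-prison figures; lower is better for the prisoner concerned).
    payoffs = {                                   # (A's sentence, B's sentence)
        ("silent",  "silent"):  (1, 1),
        ("silent",  "confess"): (10, 0),
        ("confess", "silent"):  (0, 10),
        ("confess", "confess"): (5, 5),
    }

    # Whatever B does, A serves less time by confessing ...
    for b_choice in ("silent", "confess"):
        a_silent  = payoffs[("silent",  b_choice)][0]
        a_confess = payoffs[("confess", b_choice)][0]
        print(f"B {b_choice}: A silent -> {a_silent} yrs, A confesses -> {a_confess} yrs")

    # ... so both prisoners "rationally" confess and serve 5 years each, even
    # though mutual silence (1 year each) would have left both better off.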

He pointed out that the real world rarely allows for such perfect information, as randomness, noise and unpredictability are intrinsic to virtually all real-world situations, especially those involving people. He explained how the Lens Model depicts the role and influence of uncertainty: the optimum model derivable from the real world is reflected through the cues we use in our own judgment models, and our models fall short of the best ones because we may be inconsistent and inaccurate in our selection and weighting of cues.
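As a rough numerical sketch of the Lens Model idea (again my own illustration, with made-up cue weights), the gap between the best cue-based model and a judge's actual policy can be simulated in a few lines of Python:

    import numpy as np

    # Hypothetical lens-model setup: the environment combines three cues into
    # a noisy criterion, while the judge weights the same cues differently.
    rng = np.random.default_rng(0)
    n = 1000
    cues = rng.normal(size=(n, 3))             # three observable cues

    env_weights   = np.array([0.6, 0.3, 0.1])  # how the world actually weights the cues
    judge_weights = np.array([0.2, 0.2, 0.6])  # an inaccurate judgment policy

    criterion = cues @ env_weights + rng.normal(scale=0.5, size=n)  # noisy real outcome
    judgment  = cues @ judge_weights                                # the judge's estimate

    print("optimal model vs criterion:", np.corrcoef(cues @ env_weights, criterion)[0, 1])
    print("judgment      vs criterion:", np.corrcoef(judgment, criterion)[0, 1])

    # The judge's achievement falls short of the best cue-based model, and the
    # irreducible noise caps even the optimal model below perfect accuracy.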

However, even if we could model the world very closely and our decision models were well calibrated with the world, we would still be vulnerable to flawed decision making through a range of systematic biases. Most of these have their roots in our emotions, but some stem from our inability to cope with complexity without using oversimplified rules of thumb called heuristics. Tversky and Kahneman popularised their initial discovery of the widespread prevalence of three heuristics:
  * availability (recency and primacy),
  * representativeness (stereotyping), and
  * anchorage (stubbornly clinging to the neighbourhood of a seriously wrong first guess,
    especially of some quantity such as a future stock price, in the face of contradictory evidence).

Wikipedia lists pages and pages of cognitive biases, but Gabriel selected those that he, and some 452 pages of research articles, suggested were the most prevalent. These were discussed by the group and included:
  * endowment effects,
  * framing bias,
  * halo effects,
  * omission bias,
  * Keats's error,
  * accountability and risk aversion bias,
  * hindsight bias,
  * sunk cost bias,
  * conflict avoidance bias and
  * path dependence.

Biases could be reduced by editing judgments and proposals before promulgating and printing them. Convening a focus group to filter proposals using de Bono's "Six Thinking Hats" had been found very helpful in some quarters. De Bono recommends wearing his hats in the following sequence: blue, red, white, yellow, black, green, blue again, red again. There was discussion around this, and some group members already had experience of the thinking-hats approach to meetings, with and without role rotation.

~0~

David added a little on the classification of errors:

Apart from the common notions of sample error, mathematical error, or logical error, statisticians commonly speak of two types of errors: Type I and Type II.

A Type I error is a "false positive": erroneously accepting something as true when it is not. This could occur, for example, when we believe a superstitious falsehood, or when an investigator is "crying wolf" without a wolf in sight.

A Type II error is a "false negative": erroneously rejecting something that is in fact true, even when presented with clear evidence. This could be when we ignore the signals of an imminent health or safety crisis.
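By way of illustration (my own addition, with arbitrary parameters), a small simulation shows both error types at work in an ordinary two-sample test at the conventional 5% significance level:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    alpha, trials, n = 0.05, 2000, 30

    # Type I: no real difference exists, yet the test sometimes "cries wolf".
    false_pos = sum(
        stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0, 1, n)).pvalue < alpha
        for _ in range(trials)
    )

    # Type II: a real difference of 0.5 exists, yet the test sometimes misses it.
    false_neg = sum(
        stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0.5, 1, n)).pvalue >= alpha
        for _ in range(trials)
    )

    print("Type I rate  (false alarms):  ", false_pos / trials)  # close to alpha
    print("Type II rate (missed effects):", false_neg / trials)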

In a 1957 article titled "Errors of the Third Kind in Statistical Consulting", Allyn W. Kimball introduced a third type of error (Type III) to describe the more common occurrence of producing the correct answer - to the wrong question!   That is, solving the wrong problem correctly.

People have suggested lots of others. One of my favourites is:
Type IX error: "The consultant has performed an analysis which adequately addresses the research question posed by the client. The client does not like the answer. The client suggests an inappropriate analysis that he thinks will give him the answer he wants."
   .. or
"The doctor gets a correct diagnosis and the patient does not like the answer. They then suggest an inappropriate analysis that they think will give a diagnosis they want."
   .. or
I imagine this sort of error might come up in the accounting world when an auditor performs an audit correctly but the customer does not like the answer...

Here are a couple of books on Biases that you might like:
  * "Inevitable Illusions: How mistakes of reason rule our minds"
      by Massimo Piattelli-Palmarini
      (It covers the framing bias we talked about, amongst others.), and
  * "Tools of Critical Thinking: Metathoughts for Psychology"
      by David Levy


Next Meeting:

Sunday 16 September: "Managing Our Lives' Transitions"
    How do we manage the transitions between phases in our lives?
    How can GS principles assist in this?
    How do hidden assumptions direct our decision-making?
    Led by Mr Laurie Cox.

10:30am - 4:30pm at Bonnet Bay, Sydney, Australia

(Updated by RJ 19/08/2012)

For details of our discussion meetings and seminars, locations and membership, Contact AGS
 
         .....
Web site by RLJamez