The confirmation bias in the forensic sciences.

James and Nicki always wanted to work in the forensic sciences. Whilst reading for their undergraduate degrees they would borrow as many books on forensics from the library as they could and watch the popular television programmes about ‘forensic experts’. One day, whilst looking through a book of forensic case studies, Nicki came across a particularly striking case.

In 1988, Barry Laughman confessed during interrogation to the rape and murder of his neighbour. The following day, tests revealed that the person who committed the crime had Type A blood whilst Laughman had Type B. Aware that Laughman had confessed to the crimes, the state forensic chemist proposed four theories (none of them scientific) to explain away the mismatch. Laughman was duly convicted and spent 16 years in prison. He was eventually released in November 2003 after DNA testing of the preserved evidence excluded him.

The case of Barry Laughman gives us a clear example of the influence of confirmation bias in the forensic sciences. Confirmation bias occurs when an individual seeks out evidence that confirms an existing belief whilst ignoring evidence that contradicts it (Dror, 2006). In Barry’s case the Pennsylvania state forensic chemist ignored the contradictory blood-type evidence and persisted in explaining away the mismatch.


The confirmation bias causes problems in all areas of decision-making. In the forensic sciences, errors caused by the confirmation bias can have severe consequences: innocent people can spend a lifetime in prison whilst the actual criminal goes on to reoffend. The National Academy of Sciences (2009) has reported confirmation bias in firearms analysis, hair and fibre analysis, bloodstain-pattern analysis, handwriting analysis and fingerprint examination (Kassin et al., 2013; Garrett et al., 2011).

In a recent study, investigators found a striking example of how the confirmation bias can influence the outcome of a forensic analysis (Ulery et al., 2012). The investigators gave forensic fingerprint examiners the same evidence twice and, approximately 10% of the time, the examiners reached different conclusions. Three reasons why examiners reach differing conclusions are (i) examiners often receive direct communication from the police (e.g., letters, phone calls), (ii) cross-communication between examiners, and (iii) examiners overstating the strength of evidence.


There are measures that can be taken to prevent the confirmation bias. The FBI’s Latent Print Unit revised its Standard Operating Procedures (SOP) (Cole et al., 2005) and adopted a programme of masked verification, whereby fingerprint comparisons that involve a single latent print are verified in isolation, with no further information about the print or the case. This change in SOP prevents the second examiner from inferring the first examiner’s conclusion when two examiners independently examine the same evidence (Office of the Inspector General, 2011).
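
To make the logic of masked verification concrete, below is a minimal Python sketch of the two workflows. It is an illustration only: the case record, field names and examiner interface are hypothetical, not the FBI’s actual systems or procedure.

```python
# A hypothetical sketch of masked verification. The record and fields
# below are illustrative, not the FBI's actual data structures.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Comparison:
    latent_print: str                       # the crime-scene print
    candidate_print: str                    # the suspect's print
    case_notes: str                         # contextual case information
    first_conclusion: Optional[str] = None  # e.g. "identification"

Examiner = Callable[[str, str], str]  # two prints in, a conclusion out

def masked_verify(case: Comparison, verifier: Examiner) -> str:
    # The verifier sees only the two prints: no case notes and, crucially,
    # no record of the first examiner's conclusion to anchor on.
    return verifier(case.latent_print, case.candidate_print)

def unmasked_verify(case: Comparison, verifier: Examiner) -> str:
    # Here the verifier sees everything, including the first examiner's
    # conclusion: exactly the situation the revised SOP is designed to avoid.
    print(f"Context: {case.case_notes}; first call: {case.first_conclusion}")
    return verifier(case.latent_print, case.candidate_print)

def dummy_verifier(latent: str, candidate: str) -> str:
    return "inconclusive"  # stand-in for a human examiner's judgement

case = Comparison("latent_001", "suspect_042", "high-profile case",
                  first_conclusion="identification")
print(masked_verify(case, dummy_verifier))  # reached in isolation
```

The only real difference between the two functions is what information reaches the second examiner, and that difference is the whole point of the procedure.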

Other measures that can be taken to prevent the confirmation bias include training all forensic examiners so that they know about cognitive biases. Two courses that help to instil knowledge of cognitive biases are the FBI’s week-long Facial Comparison and Identification Training and the Australian government’s two-day facial comparison course. The linear examination of evidence by multiple examiners (Heyer et al., 2013), cross-laboratory verification (Kassin et al., 2013) and peer verification (Heyer et al., 2011) can all help to reduce the impact of the confirmation bias in the forensic sciences.

So, if, like James and Nicki, you are interested in working in the forensic sciences, it is important to learn about the influence of cognitive biases on decision-making. Some private forensic companies have begun to provide such training for their employees, and some governments have started to provide it too. With adequate training we may one day be able to avoid false convictions.


First blog post

A brief introduction to heuristics and biases in decision-making research.

John found himself standing at the station, surrounded by the cosmopolitan rush. As the crowd ebbed and flowed around him he struggled to remember. John had been to London some 30 years earlier but had visited so many places since; he had meant to return but never got around to it until today. He needed to get to his conference on time but could not remember the route. He had to make a decision: risk being late by going the wrong way, or stay at the station and guarantee being late. At that moment John thought to himself, “Aren’t nice conference halls always in a nice part of town, at an impressive hotel? Of course they are.” Remembering the numerous other conferences he had attended, John headed towards the nicest hotel in that part of London. He got to his conference on time by going with his ‘gut feeling’, a ‘hunch’ that he knew had worked many times before; he just did not know how.

Just like John, many of us go with a ‘gut feeling’ about a situation every day, choosing to rely on our ‘hunches’ and ‘intuition’. We like to think of ourselves as logical thinkers who take our time when making an important decision. When asked “How did you make that decision?” or “Why did you choose that option?” most of us would reply that we weighed up the ‘pros and cons’, taking all of the facts into consideration. We are naturally inclined to think that decisions made with slow and careful consideration produce better answers than those made without.

Many of us have grown up reading books and watching films that portray famous double-acts who oppose each other in the way they make decisions: consider Dr Jekyll and Mr Hyde, Captain Kirk and Spock, or Sherlock Holmes and Dr Watson. The popular duo of Sherlock Holmes and Dr Watson are a clear example of our natural inclination to believe that one decision-making strategy is superior to another. If you sit down and read the books (I’d recommend them), Holmes makes slow and calculated decisions, generating ingenious plans for whatever situation he finds himself in, whilst Watson, on the few occasions when he does act, makes rapid decisions.

Holmes is of course famous for his deductive reasoning, a type of reasoning that is not always reliable outside the idealised world of the great works of Sir Arthur Conan Doyle. Below are two interesting examples where Holmes fails to make reliable decisions. First, in the short story “The Adventure of the Priory School” we encounter Holmes trying to deduce the direction in which a bicycle had travelled by observing the tracks left behind in the mud (The Strand Magazine, 1904).

Holmes: “This track, as you perceive, was made by a rider who was going from the direction of the school.”

Watson: “Or towards it?”

Holmes: “No, no, my dear Watson. The more deeply sunk impression is, of course, the hind wheel, upon which the weight rests. You perceive several places where it has passed across and obliterated the more shallow mark of the front one. It was undoubtedly heading away from the school.”

Here we see that Holmes’ deductive reasoning clearly fails. If you think about it, no matter which direction a bicycle is travelling the hind (back) wheel must always follow the front wheel; bicycles cannot travel backwards, well, not very easily in any instance. Holmes falls prey to the confirmation bias here: he takes notice of the information that confirms his idea (the heavier track cutting through the lighter track) whilst ignoring the contradictory point that this crossing would happen no matter which direction the bicycle was travelling.


Second, in another of the great Sherlock Holmes stories (The Hound of the Baskervilles, 1902), we see Holmes and Watson examining a walking stick with a small silver band at one end. On the silver band the following is engraved: “To James Mortimer, M.R.C.S., from his friends of the C.C.H., 1884.” Holmes and Watson spend some time trying to work out to whom it belongs and what the initials stand for, when Holmes makes the unusual ‘mistake’ of going with his intuition.

“…I would suggest, for example, that a presentation to a doctor is more likely to come from a hospital than from a hunt, and that when the initials ‘C.C.’ are placed before that hospital the words ‘Charing Cross’ very naturally suggest themselves.”

At first glance Holmes’ deduction seems logical; however, he is using what cognitive psychologists now call the ‘representativeness heuristic’. Holmes has no direct evidence that ‘C.C.H.’ refers to a hospital; he simply judges that a presentation to a doctor is most representative of a gift from a hospital, and lets that stereotype settle the matter. ‘C.C.H.’ could just as easily stand for the ‘Country Club of Honiton’ or a number of other things.

In recent years, since the Nobel prize-winning research of Daniel Kahneman (the 2002 prize in Economics), a substantial amount of work has been produced on how we make decisions. John’s ‘gut feeling’ about which way to go to get to his conference on time and Holmes’s diversion from his normal decision-making strategy both fall into the realm of Kahneman’s short-cuts in thinking (heuristics). Kahneman, although a psychologist by training, won the Nobel prize in economics because he demonstrated that in all walks of life we rely on these short-cuts to make decisions.

Intuition and heuristics can be used to both good and bad effect. Research in cognitive neuroscience and psychology since 2003 has documented the use of heuristics and intuition in most situations in which decisions are required. We use heuristics when we are in a hurry to make a decision or when we are distracted by something else. Experienced police officers (Brown & Daus, 2015), managers (Tversky & Kahneman, 1981), gamblers (Alberola et al., 2013), retail investors (Butler et al., 2014), forensic experts (Dror & Cole, 2010) and even the military (Keller et al., 2015) often use intuition and heuristics to make quick decisions.

Some professionals have even demonstrated the ability to choose which decision-making style to use when in a hurry (intuition, or slow and calculated deliberation). A piece of research published last year by Dr Volker Thoma and colleagues investigated decision-making in financial traders from the trading floors of London, and in non-experts, when asked to make decisions about financial transactions (Thoma et al., 2015, PLOS ONE). The study found that even when required to make cognitively taxing decisions the City traders set aside their intuition and made calculated decisions. The non-expert group, as you might expect, relied on intuition. This study shows that although we like to think of ourselves as logical thinkers, we often rely on intuition, heuristics and ‘gut feeling’ to make decisions for us, particularly when confronted with a lot of facts to weigh up.

Despite intuition and heuristics being useful at times, there is also a down-side to heuristic and intuitive reasoning. One case from 2004 demonstrates how heuristics can have negative consequences when we do not understand how we have come to a particular conclusion (the resulting systematic errors are known as cognitive biases).

On the morning of Thursday the 11th of March 2004, disaster struck Madrid: between 7:30 and 8:00, several near-simultaneous explosions occurred on the city’s commuter trains. In the investigation that followed, the FBI offered to help the Spanish National Police find those responsible. Fingerprints collected from evidence at the scene were linked by the FBI to Brandon Mayfield, a Muslim attorney from Oregon, who was arrested and held for two weeks on the basis of an erroneous match. The fingerprints were not an exact match, but shared some similarities in their ridge detail. Not only did one FBI fingerprint examiner misinterpret the prints, but two additional examiners corroborated the original finding. After two weeks in jail Brandon Mayfield was released without charge, and later in the investigation the Spanish National Police linked the fingerprints to an Algerian national, Ouhnane Daoud.

The similarities in the ridges had made it easy for cognitive biases to take over and affect the identification of the suspect. Once the FBI examiners found matches in some of the ridges, confirmation bias took control: the examiners paid explicit attention to the similarities whilst ignoring the differences. This case goes to show that even in criminal cases of great importance, decisions can be affected by differences in decision-making strategy, even when this is unknown to the decision-maker.


These cases all go to show that the way in which we make decisions affects everything we do; nobody is immune to unknowingly making decisions based on cognitive biases. Since the growth in research after Kahneman won the Nobel prize in 2002, a large number of heuristics and cognitive biases have been identified. We have already seen clear examples of the confirmation bias and the representativeness heuristic; to name just a few of the others, there are the anchoring heuristic, the availability heuristic, the framing effect, the hindsight bias, the attribution bias and the recognition heuristic. The positive and negative sides of heuristic decision-making are seen every day, when heuristics can lead either to a quick, accurate decision or to an incorrect one.

Research in cognitive psychology by Professor Gerd Gigerenzer at the Max Planck Institute for Human Development in Berlin has focused primarily on the use of heuristics and cognitive biases. In numerous pieces of research, Gigerenzer’s lab has demonstrated that heuristics can often lead to more accurate decisions than taking the time to think through the alternatives. This research shows that taking the time to weigh up the ‘pros and cons’ and thinking ‘logically’, as many of our popular literary heroes do, does not always produce a superior solution to a problem; sometimes going with our intuition works (intuition being under-pinned by a set of basic heuristics).

Our decision-making strategies are complex things, and part of Gigerenzer’s work has focused on the recognition heuristic. The recognition heuristic suggests that when given two alternatives we often go with the one we know. For instance, if you were asked to choose between two brands of clothing and there were no significant differences between the items other than the make, you would more than likely go with the brand you are familiar with. Advertisers take advantage of the fact that we trust a brand we know more than one we do not; this is in part why advertising has grown into a ‘big money’ industry. When a company advertises a car, for example, on billboards around town, in television advertisements and in full-page spreads in magazines and newspapers, its sales go up. Put quite simply, we often go with what we know and trust: seeing a brand on a daily basis reinforces our perceived knowledge of it, which makes us more likely to trust it and to associate it with good quality.
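
The recognition heuristic is simple enough to write down in a few lines of code. Here is a minimal sketch, assuming the classic two-alternative set-up from Goldstein and Gigerenzer’s city-size experiments; the recognition set and city pairings below are made-up illustrations of one person’s memory.

```python
# A minimal sketch of the recognition heuristic for two-alternative choice.
# The recognition set is a made-up illustration of one person's memory.

from typing import Optional

recognised = {"Berlin", "Munich", "Hamburg"}  # options this person has heard of

def recognition_heuristic(a: str, b: str) -> Optional[str]:
    """Pick the recognised option when exactly one of the two is recognised.

    Returns None when recognition cannot discriminate (both or neither
    option recognised); the decision maker must then fall back on other
    cues, knowledge or a guess.
    """
    if (a in recognised) != (b in recognised):
        return a if a in recognised else b
    return None

# "Which city has the larger population?"
print(recognition_heuristic("Munich", "Herne"))    # Munich: only one recognised
print(recognition_heuristic("Berlin", "Hamburg"))  # None: recognition can't decide
```

The striking empirical finding is that this one-line rule can compete with far more knowledge-hungry strategies, because recognition itself tends to correlate with the quantity being judged.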

In research conducted some 30 years before he won his Nobel prize, Daniel Kahneman and his long-running collaborator Amos Tversky identified another heuristic that is prevalent in everyday life: the availability heuristic. In a now classic study in cognitive psychology, participants were asked to judge how frequently particular letters appeared in either the first position (e.g., right) or the third position of a word (e.g., work). Participants could think of more instances of particular letters appearing as the first letter of a word than as the third, and therefore often judged, falsely, that those letters occurred more frequently at the beginning of a word. In terms of the speed of making a decision and the use of cognitive resources, it was simply easier to think of words by their first letter. Participants therefore went with the examples that came to mind most easily, succumbing to what Kahneman and Tversky called the ‘availability heuristic’.
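
You can run a toy version of the letter-position check yourself. The short word sample below is made up purely for illustration; a proper check would count positions across a large dictionary or corpus.

```python
# A toy version of the Tversky and Kahneman letter-position question.
# The word sample is made up for illustration; a real check would use
# a large dictionary or corpus.

words = [
    "road", "ring", "rust",                      # 'r' in first position
    "work", "word", "dark", "park", "more",      # 'r' in third position
    "fore", "bare", "hero", "care", "fire",
    "the", "and", "with", "about",               # neither
]

letter = "r"
first = sum(w[0] == letter for w in words)
third = sum(len(w) >= 3 and w[2] == letter for w in words)

print(f"'{letter}' in first position: {first} of {len(words)} words")
print(f"'{letter}' in third position: {third} of {len(words)} words")

# In real English text, letters such as 'r' appear more often in third
# position than first, yet words *starting* with them come to mind far
# more easily: ease of retrieval, not true frequency, drives the
# intuitive judgement.
```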

For better or for worse, intuition, heuristics and ‘gut feelings’ are an important tool in our decision-making toolbox, and this blog will explore them in the posts to come.