Diagnosis Bias and its Relevance During the Diagnosis Process
Eduardo Esteban-Zubero1*, Mario Antonio Valdivia-Grandez2, Moisés Alejandro Alatorre-Jiménez3, Lourdes del Carmen Rizo-De La Torre4, Alejandro Marín-Medina4, Sara Anabel Alonso-Barragán5 and Carlos Arturo López-García6
1Medical Doctor in Emergency Department of Hospital San Pedro. Calle Piqueras, 98, 26006, Logroño, Spain
2Intensive Care Unit of Hospital Clínico Universitario Lozano Blesa. Avenida San Juan Bosco, 15, 50009 Zaragoza, Spain
3Department of Neurosciences, Western Biomedical Research Center, Mexican Institute of Social Security, Guadalajara, Mexico
4Department of Genetics, Western Biomedical Research Center, Mexican Institute of Social Security, Guadalajara, Mexico
5Department of Genetics, University of Guadalajara. Av. Juárez No. 976, Centro, 44100 Guadalajara, JAL, México
6Mission Regional Medical Center. 900 S Bryan Rd, Mission Tx 78572, USA
*Corresponding Author: Eduardo Esteban-Zubero, Medical Doctor in Emergency Department of Hospital San Pedro. Calle Piqueras, 98, 26006, Logroño, Spain, Tel: +34654123994;
Received: 17 July 2017; Accepted: 21 July 2017; Published: 25 July 2017
Diagnosis bias occurs when the diagnosis is unintentionally delayed (the physician does not have sufficient information available), wrong, or missed despite the information provided (which may happen, for example, the first time a physician tries to diagnose a given pathology). These biases may be classified into cognitive errors, with different subtypes (including cognitive biases, heuristics, diagnostic anchoring, player's fallacy, satisfaction bias, confirmation bias, outcome bias, retrospective distortion, and overconfidence), and affective influences. Both may occur in the two different diagnostic models: model 1, based on an intuitive and automatic process that requires little cognitive effort; and model 2, a reflexive and analytical process that requires great cognitive effort. In clinical practice, however, a mix of the two is generally used. Diagnosis biases are important because the diagnostic error rate has been estimated to range from 0.6% to 12%. In addition, the adverse effects generated by diagnostic errors have been estimated to range from 6.9% to 17%. The purpose of this review is to improve knowledge about diagnosis biases, raise awareness of them, and provide adequate ways to avoid them.
Diagnosis bias, Diagnosis, Cognitive errors, Affective influences
1. Introduction
Uncertainty is an essential part of the practice of medicine. It pervades the entire diagnostic and therapeutic process and is conditioned by external agents (the technology available to the doctor, the environment of the hospital or consultation...) and internal ones (knowledge of the symptomatology and potential pathology, personal problems). Nowadays, medical activity is also conditioned by care pressure: long work shifts and the short time that can be dedicated to each patient mean that the time available to reach a medical decision is shorter than adequate.
Diagnosis bias occurs when the diagnosis is unintentionally delayed (the physician does not have sufficient information available), wrong, or missed despite the information provided (which may happen, for example, the first time a physician tries to diagnose a given pathology). Among diagnosis biases, cognitive errors are the most frequent in the field of Primary Care Medicine and Emergency Medicine [2, 3]. The diagnostic error rate has been estimated to range from 0.6% to 12%, although some authors assert that it may reach 15%. The adverse effects generated by diagnostic errors have been estimated to range from 6.9% to 17% [4, 6].
The aim of this study is to provide a short review of diagnosis biases, including their types as well as adequate ways to prevent them.
2. Diagnostic Models
To avoid diagnosis bias, physicians have developed systematic diagnostic methods based on theorems, such as Bayes' theorem. This model relies on the clinician's ability to assess the information received (the patient's symptomatology) and to estimate the probability that the patient presents it given the suspected disease. Owing to the current style of clinical practice and the lack of information that is sometimes observed, this model is not viable nowadays. At present, several authors affirm that the diagnostic process is a dual system: model 1 would be based on an intuitive and automatic process that requires little cognitive effort, whereas model 2 is a reflexive and analytical process that requires great cognitive effort [9-11].
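As a toy illustration of the Bayesian model mentioned above, the post-test probability of a disease can be computed from the pre-test probability and the test's operating characteristics. All numbers below are hypothetical, chosen only to show the mechanics, not taken from the literature:

```python
def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem.

    prior       -- pre-test probability of the disease
    sensitivity -- P(positive test | disease)
    specificity -- P(negative test | no disease)
    """
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1 - specificity
    # Total probability of a positive test across diseased and healthy patients
    p_pos = prior * p_pos_given_disease + (1 - prior) * p_pos_given_healthy
    return prior * p_pos_given_disease / p_pos

# Hypothetical suspected disease with 2% pre-test probability,
# tested with 90% sensitivity and 95% specificity.
print(round(posterior(0.02, 0.90, 0.95), 3))  # prints 0.269
```

The example also hints at why the formal model is hard to apply at the bedside: it requires reliable prior probabilities and test characteristics for every candidate diagnosis, figures the clinician rarely has at hand.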
Model 1 is the one used by physicians with great clinical experience, drawing on similar previous situations or on abstract prototypes in disease schemes [12, 13]. It is estimated that, by this method, the diagnosis is reached in approximately 10 seconds. Model 2, on the other hand, is the one used by less experienced physicians or students, who rely on a thorough interview to find a guiding symptom on which to base the differential diagnosis. The list of possible differential diagnoses has been estimated to be obtained after 28 seconds, and it is usually no longer than 6 options. Compared with model 1, the definitive diagnosis takes longer to elaborate, estimated at 1 to 7 minutes. It should also be noted that the information is processed in different parts of the cerebral cortex: the ventromedial prefrontal cortex participates in model 1, and the right inferior prefrontal cortex in model 2.
Nevertheless, clinical reasoning is not based on only one of these two models; the physician may switch from one to the other. For example, the expert physician will make a more detailed differential diagnosis when faced with an apparently unknown pathology. In addition, both models can be combined: an intuitive first pass to elaborate the diagnosis of a pathology (model 1), followed by a reflexive process to confirm the suspected diagnosis (model 2). However, authors have not established which of the two models generates the lowest rate of diagnostic errors [15, 16].
3. Cognitive Bias and Affective Influences
3.1 Cognitive bias
A doctor accepts a clinical diagnosis as correct when there is broad agreement with other colleagues and the diagnosis is reached quickly during the consultation. However, it has been observed that consultation time is not related to the probability of reaching a correct clinical diagnosis, since clinicians have been shown to draw a conclusion with only 60% to 70% of the information available [6, 10]. As already mentioned, neither diagnostic model has shown a higher error rate. Nevertheless, a higher prevalence of erroneous diagnoses has been observed in relation to fatigue, lack of sleep, excess work, and overconfidence [11, 17].
Diagnostic errors have been classified over time into different types. Some authors have classified them as no-fault errors (atypical presentations of the disease, lack of patient collaboration, lack of knowledge of the disease...), systematic errors, and cognitive errors. Subsequently, it was proposed that cognitive errors derive from three sources: knowledge deficits, cognitive biases, and clinician attitude problems. Cognitive biases are just one of these, and they can be classified in different ways. Some schemes classify them according to the stage of decision making in which they occur: during the collection of data, during the interpretation of the data, or during the evaluation of probability [7, 19].
Heuristics, understood as practical shortcuts for reaching a conclusion, are influenced by availability. Availability affects the choice of a definitive diagnosis among the different options, which is presumably influenced by previous similar cases as well as by the plausibility of the diagnosis given the symptomatology presented [20-22]. For example, in a patient with chest pain, an acute myocardial infarction will be chosen as a diagnosis more readily than a pulmonary thromboembolism.
Diagnostic anchoring occurs when it is concluded that a patient has a certain pathology before all the necessary information about it has been received. It negatively influences the reassessment of the diagnosis as more data on the case are collected [20-22]. In addition, it can be reinforced by three other biases: the primacy effect (the first idea is the one that prevails), the recency effect (the most recent information received is easier to retain and is weighed more heavily than the first), and confirmation bias (the initial diagnosis is affirmed by agents internal or external to the physician) [2, 21]. This is consistent with several studies affirming that clinicians fail to diagnose pathologies that do not resemble the "classical" pattern of presentation.
The player's fallacy is another related bias. It consists of the gambler's presumption that random events, such as coin tosses, follow a logical sequence of repetition. This is erroneous. A clinical analogue: if a Primary Care physician diagnoses breast cancer in a woman presenting with breast discomfort, and another woman presents the same day with the same symptomatology, the physician may discard that pathology because such a probability or sequencing of cases is unusual. Satisfaction bias refers to ending the diagnostic search once an abnormality is found. The alteration is then taken as the etiological cause of the problem, omitting the possibility of a different pathology. An example would be the discovery of a bone cyst on an X-ray, ruling out any ligamentous problem.
Another bias to consider is confirmation bias. It refers to the idea that the formulation of the first diagnostic hypothesis influences the doctors and colleagues who subsequently evaluate the case. This bias increases directly with the number of intermediate steps and may be reinforced by healthcare professionals (confirming the hypothesis) as well as by non-healthcare people (relatives, friends, etc., who have suffered the pathology with which the patient has been labeled). This bias also applies to the triage process in Emergency Rooms: it has been observed that patients screened with higher severity and an apparently more severe diagnosis are more likely to receive further studies than those who are not [1, 20].
Another bias is outcome bias, which occurs when the clinician underestimates the likelihood of a poor outcome and overestimates that of a good one [7, 20]. One example would be attributing a high fever to an apparently banal process rather than to bacteremia.
Retrospective distortion refers to evaluating an outcome in light of the plausibility of the previously elaborated diagnosis of suspicion. It also applies when the outcome of a case is known and the result of a diagnostic test performed during the diagnostic process is analyzed: if the outcome was death, the interpretation of that test may differ from the evaluation made during the diagnostic process, because it is influenced by the final result [19, 24].
Finally, overconfidence bias is the best known of all. Some studies have found that health professionals frequently rate themselves within the upper half of their profession compared with their peers. Likewise, it has been observed that overconfidence increases exponentially during the transition from medical student to resident and specialist physician.
3.2 Affective influences
Affective influences have been studied less in the literature, but the negative influence that intoxicated or manipulative patients, or those with some connotation contrary to the ideology of the health professional, exert on the doctor is well known. All of this is included in the countertransference elicited by the patient. Because of this, the physician must be aware of this possible influence and try to avoid it. It is also related to the tendency to attribute the poor evolution of a disease to the patient rather than to a diagnostic error. It has likewise been observed that diagnostic errors are more frequent in the last hours of a shift, owing to lack of sleep or fatigue.
3.3 Different ways to resolve biases
The first step in resolving biases is awareness of them by health professionals. Depending on the etiology, Table 1 summarizes the options available to both the health professional and the health system to diminish this risk.
Improve the knowledge of:
Strategies for reviewing the diagnosis:
Table 1: Measures to avoid diagnostic errors.
Diagnosis biases are relevant in clinical practice. They may be classified by type or by the step of the diagnostic process in which they occur. Moreover, they may be observed in expert as well as novice physicians. For this reason, awareness of them by health professionals and by the health system is necessary to prevent them.
- Phua DH, Tan NC. Cognitive aspect of diagnostic errors. Ann Acad Med Singapore 42 (2013): 33-41.
- Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 165 (2005): 1493-1499.
- Kachalia A, Gandhi TK, Puopolo AL, et al. Missed and delayed diagnoses in the emergency department: a study of closed malpractice claims from 4 liability insurers. Ann Emerg Med 49 (2007): 196-205.
- Sandhu H, Carpenter C, Freeman K, et al. Clinical decision-making: opening the black box of cognitive reasoning. Ann Emerg Med 48 (2006): 713-739.
- Schwartz A, Elstein AS. Clinical Reasoning in Medicine. In: Higgs J, Jones MA, Loftus S, Christensen N, editors. Clinical Reasoning in the Health Professions. 3rd Edn. Amsterdam: Butterworth-Heinemann (2008).
- Kuhn GJ. Diagnostic errors. Acad Emerg Med 9 (2002): 740-750.
- Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med 74 (1999): 791-794.
- Graber ML. Educational strategies to reduce diagnostic error: can you teach this stuff? Adv Health Sci Educ Theory Pract 1 (2009): 63-69.
- Elstein AS. Thinking about diagnostic thinking: a 30-year perspective. Adv Health Sci Educ Theory Pract 1 (2009): 7-18.
- Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ Theory Pract 1 (2009): 37-49.
- Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract 1 (2009): 27-35.
- Elstein AS, Schwartz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. BMJ 324 (2002): 729-732.
- Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med 355 (2006): 2217-2225.
- Goel V, Dolan RJ. Explaining modulation of reasoning by belief. Cognition 87 (2003): B11-22.
- Coderre S, Mandin H, Harasym PH, et al. Diagnostic reasoning strategies and diagnostic success. Med Educ 37 (2003): 695-703.
- Eva KW, Hatala RM, Leblanc VR, et al. Teaching from the clinical reasoning literature: combined reasoning strategies help novice diagnosticians overcome misleading information. Med Educ 41 (2007): 1152-1158.
- Patel VL, Cohen T. New perspectives on error in critical care. Curr Opin Crit Care 14 (2008): 456-459.
- Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ 44 (2010): 94-100.
- Bornstein BH, Emler AC. Rationality in medical decision making: a review of the literature on doctors' decision-making biases. J Eval Clin Pract 7 (2001): 97-107.
- Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 9 (2002): 1184-1204.
- Trowbridge RL. Twelve tips for teaching avoidance of diagnostic errors. Med Teach 30 (2008): 496-500.
- Mamede S, van Gog T, van den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA 304 (2010): 1198-1203.
- Payne VL, Crowley RS. Assessing the use of cognitive heuristic representativeness in clinical reasoning. AMIA Annu Symp Proc 6 (2008): 571-575.
- Wears RL, Nemeth CP. Replacing hindsight with insight: toward better understanding of diagnostic failures. Ann Emerg Med 49 (2007): 206-209.
- Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med 121 (2008): S2-23.
- Friedman CP, Gatti GG, Franz TM, et al. Do physicians know when their diagnoses are correct? Implications for decision support and error reduction. J Gen Intern Med 20 (2005): 334-339.
- Croskerry P, Abbass AA, Wu AW. How doctors feel: affective issues in patients' safety. Lancet 372 (2008): 1205-1206.
- Dawson NV. Physician judgment in clinical settings: methodological influences and cognitive performance. Clin Chem 39 (1993): 1468-1478.
- Ruutiainen AT, Durand DJ, Scanlon MH, et al. Increased error rates in preliminary reports issued by radiology residents working more than 10 consecutive hours overnight. Acad Radiol 20 (2013): 305-311.