Investigación en educación médica

On-line version ISSN 2007-5057

Investigación educ. médica vol.1 n.4 Ciudad de México Oct./Dec. 2012

 

Editorial

 

Conceptual frameworks... What lenses can they provide to medical education?

 

Marcos conceptuales... ¿Qué ópticas pueden ofrecer a la educación médica?

 

Imagine that you have been appointed to a task force that your associate dean for education created to develop a new test of medical students' diagnostic reasoning. At the first meeting, members of the task force had many ideas about such a test. Some wanted to assess the students' ability to justify their differential diagnoses; others focused on assessing diagnostic probabilities. Some wanted a test with open-ended, written responses; others preferred multiple-choice questions because they are easier to score. Some quoted examples from the literature where similar tests had been reported. The discussion grew heated and lacked focus.

Time out... How could conceptual frameworks from education and the social sciences help the committee deliberate on the three main tasks they face in creating a new test: determining what to assess, deciding how to assess diagnostic reasoning, and establishing the reliability of the scores and the extent to which they actually measure diagnostic reasoning (i.e., issues of validity)? Conceptual frameworks are "ways of thinking about a problem or a study, or ways of representing how complex things work the way they do. Different frameworks will emphasize different variables and outcomes, and their inter-relatedness."1 Let's explore how conceptual frameworks could help the task force.

In 2005, Eva framed diagnostic reasoning into two broad categories of reasoning strategies: non-analytical reasoning (i.e., "automatic, unconscious strategies," such as pattern recognition) and analytical reasoning (i.e., "controlled, conscious strategies").2 Furthermore, there are many theories and models of analytical reasoning, including, for example, hypothetico-deductive reasoning, causal reasoning, decision analysis, and problem representation. The analytical/non-analytical framework provides a broad view of what to assess, the big picture, while the individual theories and models of analytical reasoning each focus on particular aspects of the complexities of medical diagnostic reasoning. No one theory or model explains the whole reasoning process, but each provides some in-depth understanding of certain limited aspects.

For example, the hypothetico-deductive model of reasoning breaks the diagnostic process into four stages of medical inquiry: data acquisition, hypothesis generation, data interpretation, and hypothesis evaluation.3 Theories of problem representation and structural semantics focus on how clinicians choose to view (represent) the problem overall, at a more abstract level, in terms of what are technically called "semantic qualifiers." For example, given "a 72-year-old man says that he was awakened in the middle of the night with a horrible pain in his right knee, like he had last year," the clinician thinks, "Here's an older man with acute, recurrent, nocturnal attacks of severe knee pain in a single, large joint, a monoarthritis. This makes me think of..."4 Theories of decision analysis, on the other hand, focus on choices (decision trees) and prior and conditional probabilities, as well as how to combine the clinical information into a diagnostic decision (posterior probability).
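The decision-analytic step just described, combining a prior probability with conditional probabilities into a posterior probability, can be made concrete with Bayes' theorem. The following is a minimal sketch; the prior, sensitivity, and specificity values are invented for illustration and are not clinical data.

```python
# A hypothetical sketch of the decision-analytic step described above:
# combining a prior disease probability with conditional probabilities
# (a finding's sensitivity and specificity) into a posterior probability
# via Bayes' theorem. All numbers are illustrative, not clinical data.

def posterior(prior, sensitivity, specificity):
    """P(disease | positive finding) by Bayes' theorem."""
    # Total probability of a positive finding: true positives + false positives.
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# Suppose the clinician's prior for gout in the knee-pain vignette is 0.30,
# and a (hypothetical) finding has sensitivity 0.85 and specificity 0.90.
print(round(posterior(0.30, 0.85, 0.90), 3))  # prints 0.785
```

The positive finding roughly doubles the probability of the diagnosis here, which is the kind of explicit updating a decision-analytic framework would have the task force measure.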
By highlighting different aspects of a problem or construct, the different conceptual frameworks lead educators and researchers to consider different outcomes or variables, depending on the framework selected. For example, assessing diagnostic reasoning from a problem representation perspective would lead the task force to measure the number and use of semantic qualifiers, while using a decision-analytic framework would have them measure options and probabilities.

Conceptual frameworks provide alternative ways of viewing the task, in this case the various aspects of clinical reasoning. Instead of jumping too quickly to one particular solution (e.g., a multiple-choice test of diagnostic justifications), the committee, by exploring various conceptual frameworks, now has a broader view of its charge and a number of alternatives to choose from. Faced with a similar charge of developing a diagnostic reasoning test, Williams, Klamen & Hoffman chose to build a test that would assess one non-analytical process, pattern recognition, and one analytical process, data interpretation.5 They then chose two test formats to measure these processes: extended matching to assess pattern recognition and a modified script concordance test to assess data interpretation.

The third task facing the task force is to gather evidence on the reliability and validity of the scores. Three national organizations (i.e., the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education) have put forth a unified construct of validity in their Standards for Educational and Psychological Testing.6 This framework highlights five sources of construct validity evidence "to support or refute meaningful score interpretation":7 content, response process, internal structure, relationship to other variables, and consequences. The task force could use this framework to pick relevant elements of validity to guide their validation work.
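As an illustration of one kind of internal-structure evidence, the task force might estimate the internal-consistency reliability of the scores with a coefficient such as Cronbach's alpha. The sketch below uses fabricated item scores; both the data and the choice of coefficient are assumptions for illustration, not a prescription from the Standards.

```python
# A hypothetical sketch: Cronbach's alpha as one internal-consistency
# estimate of score reliability (internal-structure validity evidence).
# The item scores below are fabricated for illustration.

def cronbach_alpha(items):
    """items: one list of scores per test item, aligned across examinees."""
    k = len(items)            # number of items
    n = len(items[0])         # number of examinees

    def var(xs):              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Four items scored 0/1 for six examinees (columns are examinees).
items = [
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 0, 1],
    [1, 1, 1, 1, 0, 1],
]
print(round(cronbach_alpha(items), 2))  # prints 0.62
```

A low coefficient like this one would signal to the task force that the scores are not yet consistent enough to support the intended interpretation, before any claims about what the test measures.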

Conceptual frameworks come from theories, best practices, and models. Theories are explanatory and predictive and are based on observations or experiments, such as Ericsson's theory of expertise development (i.e., deliberate practice with feedback).8 Best practices are derived from outcome or effectiveness studies, such as Case & Swanson's handbook on how to write good test questions,9 while models (sometimes called design tools) are derived from theories or concepts, such as the Standards quoted above (see Bordage1 for more examples).

When considering conceptual frameworks, either for development projects or for research studies, begin by asking yourself, "What perspective or background am I coming from? Which conceptual framework am I currently using?" Whether explicit or not, we all have conceptual frameworks in mind when sorting out a problem or formulating a research question. Then, force yourself to consider competing frameworks. Making the frameworks explicit and exploring alternatives will broaden your perspectives and choices. Each conceptual framework provides you with a set of well-defined concepts and variables to help you better understand the problem at hand. The various conceptual frameworks will enrich the development process, and eventually the product, such as the diagnostic reasoning test in the introductory example. In the case of a research study, the research question will be clearer and the framework will guide the selection of outcome variables to measure and the interpretation of the results.

Conceptual frameworks are crucial because they have implications for how you analyze the problem (e.g., which aspect of diagnostic reasoning to test), which aspects you will pursue (e.g., pattern recognition and data interpretation), and how you will interpret the results (e.g., test score results). Different conceptual frameworks can be used for different aspects of the task at hand. Often in education, the task has two dimensions, one related to the educational process (e.g., developing and validating a test), the other related to content (e.g., what aspect(s) of diagnostic reasoning to assess). The task will thus require the consideration of two different sets of conceptual frameworks, one for process and one for content.

Review the literature broadly regarding your topic of interest, going from the medical education literature to health professions education and the social sciences such as education and psychology. Review articles can be especially helpful in identifying relevant frameworks and authors. When reviewing the literature, ask yourself what framework guided each author's thinking. The frameworks are not always explicit in the articles. In the absence of an explicit statement, try to infer the framework used by the researchers by looking at the references they quoted, or contact the authors or experts in the field for clarification.

Each conceptual framework is inherently limited. It is like a lens used to look at a problem: each lens highlights and magnifies certain aspects and filters out others.1 You will need to make choices among frameworks. Don't be too eager to settle on one particular framework or solution too soon. Each conceptual framework has its particular focus and set of assumptions; consequently, different frameworks can lead to different interpretations. For example, in a study of diagnostic reasoning, two methodological approaches, one mainly propositional (i.e., the linear dimension of clinical discourse), the other mainly semantic (i.e., the vertical dimension of clinical discourse), were used to analyze the same data set and resulted in two different interpretations of the reasoning processes used by clinicians.10 To avoid being one-track-minded, begin by opening up the possibilities: explore alternatives, generate multiple competing hypotheses, and then make conscious choices as to which frameworks will be most useful to you, that is, which one(s) will provide better insights or interpretations of what's going on. Without alternatives, you will likely have many blind spots that risk remaining undetected.

Small is beautiful! Don't try to solve all the world's problems at once. Pick certain aspects of the problem and build on solid ground. Be explicit about your choices and let the conceptual framework, or set of frameworks, guide your work. In your search for conceptual frameworks, seek the collaboration of educators or social scientists in the field who know the literature and the array of frameworks well. It's a great opportunity to establish collaborations between you (the clinical or basic science teacher) and the educators. While you provide the cogent questions and relevant problems to be addressed, the educators can provide the relevant education or social science frameworks, not only regarding your content of interest but also alternative methodologies to address those contents. The Lemieux & Bordage study cited above is a good example of such a synergistic and productive interdisciplinary collaboration; Lemieux is an anthropologist.

Conceptual frameworks also offer guidance on how you can build on other researchers' work and develop your own conceptually based program of research, often starting with one framework and then adding new frameworks as new insights are gained or because old frameworks no longer suffice to explain what is going on. This allows you to do programmatic research and development work and thus gain ever greater depth of understanding about the problem or research topic. In the long run, this well-grounded and gradual process will help move both practice and the field forward.11,12 The insights gained from each framework can be gathered within and across researchers and disciplines to provide a rich, overall representation of the complexities of a problem or situation. Eva's non-analytical and analytical portrayal of clinical reasoning is a good example of such a synthesis.

In summary, conceptual frameworks are important and do matter in medical education. They provide myriad lenses and perspectives through which to look at and explore problems and methodologies. They help medical educators build well-grounded, solid solutions to problems and research questions. By exploring alternative frameworks, you will find which ones best fit your needs and which ones give you the most insight. Up front, the frameworks provide you with a set of well-defined concepts and variables to guide the development of your work and the interpretation of your results. You will benefit by producing and publishing well-thought-out, grounded work. Your institution will benefit from high-quality solutions and products (e.g., a well-grounded tool to assess students' diagnostic reasoning). And eventually the fields of medical education and health professions education will benefit from new insights to enlighten theory and practice.

 

Georges Bordage
Department of Medical Education College of Medicine,
University of Illinois at Chicago Chicago, USA

 

Correspondence:
Department of Medical Education.
College of Medicine,
University of Illinois at Chicago.
Chicago, USA.
E-mail: Bordage@uic.edu.

 

References

1. Bordage G. Conceptual Frameworks to Illuminate and Magnify. Med Educ 2009;43:312-319.

2. Eva KW. What Every Teacher Needs to Know About Clinical Reasoning. Med Educ 2005;39:98-106.

3. Elstein A, Shulman L, Sprafka S. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, Mass.: Harvard University Press; 1978.

4. Bordage G. Why Did I Miss the Diagnosis? Some Cognitive Explanations and Educational Implications. Acad Med 1999;74:S138-143.

5. Williams RG, Klamen DL, Hoffman RM. Medical Student Acquisition of Clinical Working Knowledge. Teach Learn Med 2008;20:5-10.

6. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, D.C.: American Educational Research Association; 1999.

7. Downing SM. Validity: On the Meaningful Interpretation of Assessment Data. Med Educ 2003;37:830-837.

8. Ericsson KA. Deliberate Practice and the Acquisition and Maintenance of Expert Performance in Medicine and Related Domains. Acad Med 2004;79:S70-81.

9. Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences. Philadelphia: National Board of Medical Examiners. Accessed August 6, 2012. http://www.nbme.org/publications/item-writing-manual-download.html

10. Lemieux M, Bordage G. Propositional Versus Structural Semantic Analyses of Medical Diagnostic Thinking. Cogn Sci 1992;16:185-204.

11. Bordage G. Moving the Field Forward: Going Beyond Quantitative-Qualitative. Acad Med 2007;82:S126-128.

12. Norman G. Fifty Years of Medical Education Research: Waves of Migration. Med Educ 2011;45:785-791.

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.