Do consecutive Patient Management Problem (PMP) and Modified Essay Question (MEQ) Examinations Improve Clinical Reasoning in Students?

AUTHORS

Mohammad Reza Mahmoodi 1, *

1 Physiology Research Center, Institute of Basic and Clinical Physiology and Nutrition Department, School of Health, Kerman University of Medical Sciences, Kerman, Iran

How to Cite: Mahmoodi MR. Do consecutive Patient Management Problem (PMP) and Modified Essay Question (MEQ) Examinations Improve Clinical Reasoning in Students? Strides Dev Med Educ. 2019;16(1):e86566. doi: 10.5812/sdme.86566.

ARTICLE INFORMATION

Strides in Development of Medical Education: 16 (1); e86566
Published Online: September 25, 2019
Article Type: Research Article
Received: November 19, 2018
Revised: May 7, 2019
Accepted: May 13, 2019
Abstract

Objectives: The purpose of this study was to evaluate the improvement of students’ ability to answer consecutive patient management problem (PMP) and modified essay question (MEQ) exams, to assess its relationship with academic progress, and to determine whether consecutive PMP-MEQ exams can improve the students’ clinical reasoning skills by improving the test scores.

Methods: This descriptive, analytical, cross-sectional study included 67 third-year nutrition students over three consecutive years, who were asked to prepare for a multiple-choice question (MCQ) test and consecutive PMP-MEQ exams. The students were required to answer the PMP-MEQ exam, which comprised two five-choice questions (PMP) and three short-answer questions (MEQ). Repeated measures ANOVA, independent t-test, paired t-test, and Pearson’s correlation test were used for statistical analysis.

Results: The mean difference in PMP scores was significant across the three periods (P = 0.0001). However, the difference in the mean score of each PMP exam between students with a grade point average (GPA) ≥ 16 and GPA < 16 was not significant, except for PMP3 (P = 0.001). The scores of students in both groups increased with continuous PMP examination, and the significant mean difference in the PMP3 exam showed that the improvement of students with GPA ≥ 16 was greater than that of students with GPA < 16 (P = 0.001). The difference in the mean scores of the MCQ and PMP exams was significant, except for the third PMP exam in students with GPA ≥ 16 (P = 0.143).

Conclusions: Use of PMP-MEQ exams in reasoning-based clinical education can be a suitable approach for clinical evaluation of undergraduate students. Also, continuous PMP-MEQ examination can improve the clinical reasoning of students, mainly those with GPA ≥ 16.

Keywords

Clinical Reasoning; Continuous Assessment; Modified Essay Question; Patient Management Problem

Copyright © 2019, Strides in Development of Medical Education. This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits copying and redistribution of the material for noncommercial use only, provided the original work is properly cited.

1. Background

Effective clinical reasoning depends on the health professional’s ability to collect and analyze the right cues or information to reach an accurate understanding of a patient problem or differential diagnosis, to plan and implement the right interventions, and finally to learn from the process (1, 2). Reasoning- and competency-based medical education requires a robust and multi-dimensional assessment system (3). It relies on continuous, inclusive, and elaborate assessment and feedback systems, which facilitate the development of reasoning and competence (4).

On the other hand, in most countries, the multiple-choice question (MCQ) is the most common method of assessing medical knowledge, followed by the modified essay question (MEQ) (5). MCQs do not focus on the evaluation of cognitive skills, and many assess only small sections of textbooks. With the introduction of problem-based learning for the evaluation of clinical reasoning and competence in medical and health professional courses, along with the shift from a traditional lecture-based curriculum to a student-centered one, many schools are currently reviewing their assessment tools and introducing new strategies for evaluating students (6).

In a study, two popular formats of tests, i.e., MCQ and MEQ, were compared. Based on their findings, although MCQ and MEQ may assess different skills, there is a very strong relationship between their content scores (7). In another study, the results of MEQ and MCQ were strongly and positively correlated, and the overall examination showed good reliability and validity. In their study, MEQ included more questions on recall of knowledge, which were more structurally flawed, compared to MCQ. The MEQ exam failed to achieve its primary goal, that is to assess higher-order cognitive skills (8). In fact, some researchers believe that a well-constructed MCQ is superior to MEQ in assessing the higher-order cognitive skills of undergraduate medical students in a problem-based learning setup.

Development of MEQ for the assessment of students’ cognitive skills is not a simple task and is frequently associated with item-writing flaws (9). Knox described that with careful preparation, MEQ can provide a measure of abilities (including attitudes), which cannot be easily assessed by other means. MEQ can also provide an active learning experience in small groups or in a large plenary session (10). In another study, the patient management problem (PMP) method was applied to assess whether an increase in clinical experience can influence the nutrition care planning process. The findings revealed that basic nutrition care planning skills are attained during dietetic internships, while advanced skills, such as information processing and/or confidence in clinical decision-making, are acquired through clinical experience (11).

2. Objectives

The effectiveness of continuous PMP-MEQ examination in clinical reasoning training for nutrition students with different levels of academic progress has not been studied yet. Therefore, the purpose of this study was to evaluate the improvement of students’ ability in consecutive PMP-MEQ exams and to determine its relationship with different levels of academic progress. This study also aimed to determine whether consecutive PMP-MEQ exams can improve different aspects of clinical reasoning skills by increasing the exam scores.

3. Methods

3.1. Study Sample

This descriptive, analytical, cross-sectional study was conducted at Kerman University of Medical Sciences among 67 third-year undergraduate nutrition students, who were enrolled in the study between 2015 and 2017 in three consecutive years.

3.2. Study Design

At the end of the routine teaching module on the topic of “food-borne diseases”, the students were asked to prepare for MCQ and PMP-MEQ exams. The assessment method was described for all students. Ten MCQs were presented, with five options for each question, and the students were told that one of the options would be the correct response. In the first phase of the examination, after the MCQ test, the students were asked to complete the PMP-MEQ exam, which comprised two five-choice questions (PMP) and three short-answer questions (MEQ). In the second and third phases, the students took the second and third PMP-MEQ examinations, each held one week after the previous one.

Generally, the instructor must be familiar with the design and development of PMP-MEQ exams. Arrangement and preparation of the PMP-MEQ exam was based on the modified four-step instructions published by Harden (12); no test-retest was performed. In the first stage, the instructor planned and designed a clinical case and provided information about an individual patient, who was referred to the emergency ward with a set of signs associated with ingestion of an unknown contaminated food (based on the subjective report). Next, based on their etiological knowledge of diseases transmitted by microorganisms, the students described the incubation period, as well as the signs and symptoms of the clinical case.

The students were required to answer two questions (the PMP section) about the type of microorganism and the food causing the intoxication. In the final stage, the students answered three short questions (the MEQ section) to explain the reasoning behind their diagnosis and to suggest appropriate treatments for the patient, as well as preventive measures to limit the spread of the disease in the community. PMPs simulate reality and reproduce the decisions a medical student makes in investigating and managing a patient; the students were also required to engage actively with the problem (12).

Bloom’s revised taxonomy describes six levels of cognitive learning (remembering, understanding, applying, analyzing, evaluating, and creating); this study targeted four of these levels: understanding, applying, analyzing, and evaluating. Various dimensions of clinical reasoning, such as awareness of clinical cues, confirmation of clinical problems, determination and implementation of actions, and evaluation and reflection, were incorporated in the PMP-MEQ exam.

3.3. Statistical Analysis

Statistical analysis was performed in SPSS version 22.0, and P < 0.05 was considered statistically significant. General linear models (repeated measures ANOVA) were used to compare the mean differences in PMP and MEQ scores, based on the grade point average (GPA) of the semester and the total GPA of five semesters. GPA generally represents the average value of the accumulated final grades earned in courses over time. The results of analyses are presented in Tables 1 and 2.

Table 1. Results of One-Way Repeated Measures ANOVA of PMP-MEQ Scores Based on the Students’ GPA in the Fifth Semestera
Scores      Total (N = 67)   GPA ≥ 16 (N = 26)   GPA < 16 (N = 41)   Sig.
PMP-MEQ1    4.79 ± 4.68      4.85 ± 4.89         4.76 ± 4.60         0.939
PMP-MEQ2    9.93 ± 6.90      10.65 ± 7.12        9.46 ± 6.81         0.496
PMP-MEQ3    14.10 ± 4.54     16.31 ± 3.80        12.71 ± 4.44        0.001
Sig.        0.0001           0.0001              0.0001

aValues are expressed as mean ± SD.

Table 2. Results of One-Way Repeated Measures ANOVA of PMP-MEQ Scores Based on the Students’ Total GPA During Five Semestersa
Scores      Total (N = 67)   Total GPA ≥ 16 (N = 37)   Total GPA < 16 (N = 30)   Sig.
PMP-MEQ1    4.79 ± 4.68      4.65 ± 4.91               4.97 ± 4.45               0.784
PMP-MEQ2    9.93 ± 6.90      11.41 ± 6.98              8.10 ± 6.46               0.051
PMP-MEQ3    14.10 ± 4.54     15.49 ± 4.25              12.40 ± 4.35              0.005
Sig.        0.0001           0.0001                    0.0001

aValues are expressed as mean ± SD.

The non-significance of Box’s test indicates equality of the covariance matrices of the dependent variables across groups. Likewise, the non-significance of Mauchly’s test of sphericity meets the compound symmetry assumption, and Levene’s test indicates that the variances in the three periods (PMP1, PMP2, and PMP3) are equivalent. A paired t-test was performed to compare the mean scores of the PMP-MEQ and MCQ exams within the same students, and an independent t-test to compare the mean PMP-MEQ scores between students with GPA ≥ 16 and GPA < 16. Moreover, Pearson’s correlation test was used to determine the relationship between the exam scores and academic progress variables, namely GPA and total GPA.
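The statistical workflow described above (repeated measures ANOVA across the three exam periods, followed by a paired t-test) can be sketched in Python. The scores below are illustrative values only, not the study data; the ANOVA is computed from its sums of squares, with scipy supplying the F distribution and the t-test:

```python
import numpy as np
from scipy import stats

# Hypothetical scores of 6 students in the three consecutive PMP-MEQ exams
# (illustrative values only -- not the study data).
scores = np.array([
    [3.0,  8.0, 13.0],
    [5.0, 11.0, 16.0],
    [2.0,  7.0, 12.0],
    [6.0, 12.0, 17.0],
    [4.0,  9.0, 14.0],
    [5.0, 10.0, 15.0],
])
n, k = scores.shape  # n subjects, k repeated measurements (exam periods)

# One-way repeated measures ANOVA from first principles: partition the
# total variability into subject, condition (period), and error terms.
grand_mean = scores.mean()
ss_total = ((scores - grand_mean) ** 2).sum()
ss_subjects = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum()
ss_conditions = n * ((scores.mean(axis=0) - grand_mean) ** 2).sum()
ss_error = ss_total - ss_subjects - ss_conditions

df_conditions = k - 1
df_error = (k - 1) * (n - 1)
F = (ss_conditions / df_conditions) / (ss_error / df_error)
p = stats.f.sf(F, df_conditions, df_error)  # upper-tail probability
print(f"Repeated measures ANOVA: F({df_conditions}, {df_error}) = {F:.1f}, p = {p:.2g}")

# Paired t-test between the first and third exam periods (same students).
t_paired, p_paired = stats.ttest_rel(scores[:, 0], scores[:, 2])
print(f"Paired t-test, period 1 vs. 3: t = {t_paired:.2f}, p = {p_paired:.2g}")
```

The `AnovaRM` class in statsmodels produces the same F-test directly; the manual computation here makes the error term that the sphericity assumption applies to explicit.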

4. Results

Male students comprised 29.9% of the study population. The results of repeated measures ANOVA showed that the mean difference in PMP scores was significant across the three examination periods (P = 0.0001). However, the difference in the mean score of each PMP exam between students with GPA ≥ 16 and GPA < 16 was not significant, except for PMP3 (P = 0.001). The students’ scores increased with continuous PMP examination in both groups, and the significant mean difference in PMP3 scores showed that the progress of students with GPA ≥ 16 was greater than that of students with GPA < 16 (P = 0.001) (Table 1). Therefore, continuous PMP assessment contributes to the improvement of students’ clinical reasoning, mainly in students with GPA ≥ 16.

The results of repeated measures ANOVA based on the total GPA of five semesters paralleled those based on the GPA of the fifth semester. The mean difference in PMP scores was significant across the examination periods (P = 0.0001); in other words, the scores differed significantly between these periods. The students’ scores increased with continuous PMP examination in both groups. The difference in the mean score of the third PMP exam was significant between students with total GPA ≥ 16 and total GPA < 16 (P = 0.005), and the difference was close to significance in the second PMP exam (P = 0.051). The significant mean difference in PMP3 scores indicates that the progress of students with total GPA ≥ 16 was greater than that of students with total GPA < 16 (P = 0.005) (Table 2). The interpretation of the results presented in Table 2 is the same as that of Table 1.

Additionally, the results of the independent t-tests in Table 3 confirm the results presented in Tables 1 and 2 regarding the academic progress variables. The paired t-tests comparing the mean MCQ score with each PMP score were significant, except for the comparison between the MCQ score and the third PMP score in students with GPA ≥ 16 (P = 0.143) (Table 3). The increase in scores with continuous PMP examination, particularly among students with GPA ≥ 16, revealed that the improvement in clinical reasoning was prominent in this group.

Table 3. Significant Differences Between MCQ and PMP-MEQ Scores Based on the Students' GPA of the Fifth Semester and Total GPA of Five Semestersa
Scores                    MCQ            PMP-MEQ1      Sig.     PMP-MEQ2       Sig.     PMP-MEQ3       Sig.
Total                     16.49 ± 2.57   4.79 ± 4.68   0.0001   9.93 ± 6.90    0.0001   14.10 ± 4.54   0.0001
GPA ≥ 16 (N = 26)         17.38 ± 1.86   4.85 ± 4.89   0.0001   10.65 ± 7.12   0.0001   16.31 ± 3.80   0.143
GPA < 16 (N = 41)         15.93 ± 2.81   4.76 ± 4.60   0.0001   9.46 ± 6.81    0.0001   12.71 ± 4.45   0.0001
Sig.                      0.023          0.939                  0.496                   0.001
Total GPA ≥ 16 (N = 37)   17.27 ± 2.09   4.65 ± 4.91   0.0001   11.41 ± 6.98   0.0001   15.49 ± 4.25   0.015
Total GPA < 16 (N = 30)   15.53 ± 2.81   4.97 ± 4.45   0.0001   8.10 ± 6.46    0.0001   12.40 ± 4.35   0.001
Sig.                      0.005          0.784                  0.051                   0.005

aValues are expressed as mean ± SD.

Table 4 presents the results of Pearson’s correlation coefficient (r), a common method for analyzing the relationship between two variables. The third PMP exam score was significantly related to the academic progress variables, i.e., GPA and total GPA (P < 0.01), as was the MCQ score (P < 0.01) (Table 4). These relationships indicate that the students’ third PMP scores behaved similarly to their MCQ scores with respect to academic progress, supporting the improvement of students’ clinical reasoning through continuous PMP examination.

Table 4. Correlation Coefficients Between MCQ Score, PMP-MEQ Score, and Academic Progress Variables
Variables               MCQ     PMP-MEQ1   PMP-MEQ2   PMP-MEQ3   GPA of Fifth Semester   Total GPA
MCQ                     1       0.162      0.238      0.315      0.424a                  0.452a
PMP-MEQ1                        1          0.295b     -0.160     -0.002                  -0.079
PMP-MEQ2                                   1          0.080      0.163                   0.282b
PMP-MEQ3                                              1          0.373a                  0.425a
GPA of fifth semester                                            1                       0.878a
Total GPA                                                                                1

aCorrelation is significant at 0.01.

bCorrelation is significant at 0.05.
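The correlation analysis summarized in Table 4 can be reproduced with `scipy.stats.pearsonr`; the paired observations below are hypothetical and for illustration only:

```python
import numpy as np
from scipy import stats

# Hypothetical paired observations for eight students (illustrative only):
# third PMP-MEQ exam score and GPA of the fifth semester.
pmp3 = np.array([12.0, 16.0, 14.0, 18.0, 11.0, 15.0, 13.0, 17.0])
gpa = np.array([14.5, 17.0, 15.5, 18.5, 14.0, 16.5, 15.0, 18.0])

# Pearson's r quantifies the linear association; the p-value tests the
# null hypothesis of zero correlation.
r, p = stats.pearsonr(pmp3, gpa)
print(f"r = {r:.3f}, p = {p:.4f}")
```

A full correlation matrix like Table 4 would repeat this computation for each pair of variables (or use `np.corrcoef` for the coefficients alone).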

5. Discussion

This study aimed to provide applicable evidence for medical and paramedical school instructors in clinical departments, who are responsible for evaluating the clinical reasoning of undergraduate students. Schmidt and Mamede claimed that different approaches can be implemented in clinical reasoning education in different phases of training. In their review, they discussed the most common approach, i.e., serial-cue approach, perhaps because of its simulation of diagnostic activities (13). In Germany, development and implementation of a clinical reasoning course in the final year of undergraduate medical training was a major objective of medical education, which could lead to an improvement in the target skills. Overall, it seems advantageous to integrate a longitudinal course in the medical curriculum in order to present better strategies for improving clinical reasoning (14).

In this regard, a previous study provided a successful example of a small-group brainstorming course for enhancing the diagnostic and clinical reasoning skills of new medical clerks. The positive results obtained during the “clinical excellence program” encouraged the formal implementation of this course as part of the clerkship curriculum (15). Therefore, the small group teaching-learning approach is one of the effective approaches, which can improve clinical reasoning skills.

In the current study, by continuing problem-based learning in the form of PMP-MEQ examinations, we aimed to improve the clinical experience and clinical reasoning of students. It should be noted that integration of basic sciences knowledge into clinical reasoning is an essential component of health professional education. Generally, effective clinical reasoning involves several sequential domains: awareness and collection of clinical cues and information, confirmation of clinical problems, determination and implementation of actions, and evaluation and reflection. It draws on memory and recall, understanding and recognition, interpretation and organization, integration and analysis, and deduction to solve a clinical case in different settings (e.g., the classroom and the patient’s bedside).

Knowledge of basic sciences supports the acquisition of new clinical knowledge, which improves diagnostic reasoning. Successful teaching strategies involve establishing connections between basic and clinical sciences, use of reasonable analogies, and study of multiple clinical problems in multiple settings (16). Conversely, inadequate clinical knowledge is the most common problem, resulting in poor clinical reasoning, as obviously reported in the present study. In the current study, improvement of clinical reasoning in students with poor academic progress was lower than that of students with appropriate academic progress. One of the main concerns in medical education is integration of clinical reasoning into the medical curriculum (without clinical reasoning being consistently defined, taught, or assessed within or between educational programs in the curriculum), which may result in major variations in clinical reasoning education. These findings support the need for the development of optimal educational practices for clinical reasoning curricula and learning assessment (17).

In another study, different attitudes to teaching and learning clinical reasoning were identified, which reflect the Western and Asian cultures of learning. The potential effect of cultural differences in planning optimal programs for teaching and learning clinical reasoning is important in the changing global context of medical education, especially when the Western medical education is implemented in Asian settings (18).

Generally, assessment follows the teaching-learning process. The assessment method of important examinations strongly influences student learning and may shape and improve the student’s learning approaches (19). In a study, the modified problem-based learning (PBL) method, with short-answer questions, was the preferred method in 39% of students, followed by PBL with the modified essay question (36%) and lectures (25%). Therefore, the modified PBL is a reasonable option for schools that cannot meet the staff and space requirements of PBL curriculum (20). Accordingly, in some universities, where the clinical environment for teaching and learning clinical reasoning is not available, implementation of some exams, such as PMP-MEQ, in a clinical format is preferable.

Palmer and Devitt noted that MEQs are often preferred to other forms of assessment, such as MCQs, for the evaluation of higher-order cognitive skills, and that they often form a vital component of end-of-course assessments in higher education. In their study at an undergraduate institution, however, the modified essay questions failed to consistently assess higher-order cognitive skills, whereas the MCQs examined more than mere recall of knowledge. The researchers concluded that constructing MEQs for the assessment of higher-order cognitive skills cannot be assumed to be a simple task (21).

Moreover, a study investigated the effect of a practice exam on the scores of a test comprising both MCQs and PMPs; the effect of the practice exam on the PMP score was greater than its effect on the MCQ score (22). In another study, correlations between the objective structured clinical examination (OSCE) and written tests, such as script concordance testing and clinical reasoning problems, were non-significant. The results showed that written tests of clinical reasoning could provide additional applicable information for the evaluation of students’ capabilities during a family medicine clerkship (23).

5.1. Conclusions

Integration of PMP-MEQ in reasoning-based clinical education can be an effective approach to the clinical evaluation of undergraduate students. Continuous PMP examination can improve the students’ clinical reasoning, mainly among students with GPA ≥ 16.

References

  • 1.

    Levett-Jones T, Hoffman K, Dempsey J, Jeong SY, Noble D, Norton CA, et al. The 'five rights' of clinical reasoning: An educational model to enhance nursing students' ability to identify and manage clinically 'at risk' patients. Nurse Educ Today. 2010;30(6):515-20. doi: 10.1016/j.nedt.2009.10.020. [PubMed: 19948370].

  • 2.

    Kiesewetter J, Ebersbach R, Gorlitz A, Holzer M, Fischer MR, Schmidmaier R. Cognitive problem solving patterns of medical students correlate with success in diagnostic case solutions. PLoS One. 2013;8(8). e71486. doi: 10.1371/journal.pone.0071486. [PubMed: 23951175]. [PubMed Central: PMC3741183].

  • 3.

    Norcini JJ, Holmboe ES, Hawkins RE. Evaluation challenges in the era of outcomes-based education. In: Holmboe ES, Hawkins RE, editors. Practical guide to the evaluation of clinical competence. Philadelphia, PA: Mosby/Elsevier; 2008. p. 1-9.

  • 4.

    Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676-82. doi: 10.3109/0142159X.2010.500704. [PubMed: 20662580].

  • 5.

    Kittrakulrat J, Jongjatuporn W, Jurjai R, Jarupanich N, Pongpirul K. The ASEAN economic community and medical qualification. Glob Health Action. 2014;7:24535. doi: 10.3402/gha.v7.24535. [PubMed: 25215908]. [PubMed Central: PMC4161945].

  • 6.

    Azer SA. Assessment in a problem-based learning course: Twelve tips for constructing multiple choice questions that test students' cognitive skills. Biochem Mol Biol Educ. 2003;31(6):428-34. doi: 10.1002/bmb.2003.494031060288.

  • 7.

    Norman GR, Smith EK, Powles AC, Rooney PJ, Henry NL, Dodd PE. Factors underlying performance on written tests of knowledge. Med Educ. 1987;21(4):297-304. doi: 10.1111/j.1365-2923.1987.tb00367.x. [PubMed: 3626897].

  • 8.

    Palmer EJ, Duggan P, Devitt PG, Russell R. The modified essay question: Its exit from the exit examination? Med Teach. 2010;32(7):e300-7. doi: 10.3109/0142159X.2010.488705. [PubMed: 20653373].

  • 9.

    Khan MU, Aljarallah BM. Evaluation of modified essay questions (MEQ) and multiple choice questions (MCQ) as a tool for assessing the cognitive skills of undergraduate medical students. Int J Health Sci (Qassim). 2011;5(1):39-43. [PubMed: 22489228]. [PubMed Central: PMC3312767].

  • 10.

    Knox JD. Use modified essay questions. Med Teach. 1980;2(1):20-4. doi: 10.3109/01421598009072166. [PubMed: 24480002].

  • 11.

    Gates GE, Kris-Etherton PM, Greene G. Nutrition care planning: Comparison of the skills of dietitians, interns, and students. J Am Diet Assoc. 1990;90(10):1393-7. [PubMed: 2212421].

  • 12.

    Harden RM. Preparation and presentation of patient-management problems (PMPs). Med Educ. 1983;17(4):256-76. [PubMed: 6192317].

  • 13.

    Schmidt HG, Mamede S. How to improve the teaching of clinical reasoning: A narrative review and a proposal. Med Educ. 2015;49(10):961-73. doi: 10.1111/medu.12775. [PubMed: 26383068].

  • 14.

    Harendza S, Krenz I, Klinge A, Wendt U, Janneck M. Implementation of a Clinical Reasoning Course in the Internal Medicine trimester of the final year of undergraduate medical training and its effect on students' case presentation and differential diagnostic skills. GMS J Med Educ. 2017;34(5):Doc66. doi: 10.3205/zma001143. [PubMed: 29226234]. [PubMed Central: PMC5704605].

  • 15.

    Yang LY, Huang CC, Hsu HC, Yang YY, Chang CC, Chuang CL, et al. Voluntary attendance of small-group brainstorming tutoring courses intensify new clerk's "excellence in clinical care": A pilot study. BMC Med Educ. 2017;17(1):2. doi: 10.1186/s12909-016-0843-6. [PubMed: 28056969]. [PubMed Central: PMC5217545].

  • 16.

    Castillo JM, Park YS, Harris I, Cheung JJH, Sood L, Clark MD, et al. A critical narrative review of transfer of basic science knowledge in health professions education. Med Educ. 2018;52(6):592-604. doi: 10.1111/medu.13519. [PubMed: 29417600].

  • 17.

    Christensen N, Black L, Furze J, Huhn K, Vendrely A, Wainwright S. Clinical reasoning: Survey of teaching methods, integration, and assessment in entry-level physical therapist academic education. Phys Ther. 2017;97(2):175-86. doi: 10.2522/ptj.20150320. [PubMed: 27609900].

  • 18.

    Findyartini A, Hawthorne L, McColl G, Chiavaroli N. How clinical reasoning is taught and learned: Cultural perspectives from the University of Melbourne and Universitas Indonesia. BMC Med Educ. 2016;16:185. doi: 10.1186/s12909-016-0709-y. [PubMed: 27443145]. [PubMed Central: PMC4957336].

  • 19.

    Mahmoodi MR, Baneshi MR, Mohammad Alizadeh S. Influence of assessment method selection in studying and learning approaches: Is it necessary to change assessment style? Future Med Educ J. 2014;4(2):35-40.

  • 20.

    Al-Faris EA, Abdulghani HM, Abdulrahman KA, Al-Rowais NA, Saeed AA, Shaikh SA. Evaluation of three instructional methods of teaching for undergraduate medical students, at King Saud University, Saudi Arabia. J Family Community Med. 2008;15(3):133-8. [PubMed: 23012180]. [PubMed Central: PMC3377127].

  • 21.

    Palmer EJ, Devitt PG. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper. BMC Med Educ. 2007;7:49. doi: 10.1186/1472-6920-7-49. [PubMed: 18045500]. [PubMed Central: PMC2148038].

  • 22.

    Bridgham RG, Rothman AI. The effects of taking a practice examination on scores in the Qualifying Examination of the Medical Council of Canada. Med Educ. 1982;16(4):219-22. doi: 10.1111/j.1365-2923.1982.tb01252.x. [PubMed: 7121339].

  • 23.

    Dory V, Charlin B, Vanpee D, Gagnon R. Multifaceted assessment in a family medicine clerkship: A pilot study. Fam Med. 2014;46(10):755-60. [PubMed: 25646825].
