Relationships between Bloom’s taxonomy, judges’ estimation of item difficulty and psychometric properties of items from a progress test: a prospective observational study

Authors

Pedro Tadao Hamamoto Filho, Eduardo Silva, Zilda Maria Tosta Ribeiro, Maria de Lourdes Marmorato Botta Hafner, Dario Cecilio-Fernandes, Angélica Maria Bicudo

Keywords:

Psychometrics, Educational measurements, Medical education

Abstract

BACKGROUND: Progress tests are longitudinal assessments of students’ knowledge based on successive tests. Calibrating test difficulty is challenging, especially because item-writers tend to overestimate students’ performance. The relationships between the levels of Bloom’s taxonomy, the ability of test judges to predict item difficulty and the real psychometric properties of test items have been insufficiently studied.

OBJECTIVE: To investigate the psychometric properties of items according to their classification in Bloom’s taxonomy and judges’ estimates, through an adaptation of the Angoff method.

DESIGN AND SETTING: Prospective observational study using secondary data from students’ performance in a progress test applied at ten medical schools, mainly in the state of São Paulo, Brazil.

METHODS: We compared the expected and real difficulty of the items used in a progress test. The items were classified according to Bloom’s taxonomy, and their psychometric properties were assessed in relation to taxonomy level and field of knowledge.

RESULTS: There was a 54% match between the expert panel’s expectations and the real difficulty of the items. Items expected to be easy had significantly lower mean difficulty than items expected to be of medium (P < 0.05) or high difficulty (P < 0.01). Items at higher taxonomy levels had higher discrimination indices than lower-level items (P = 0.026). We did not find any significant differences between fields of knowledge in terms of difficulty or discrimination.

CONCLUSIONS: Our study demonstrated that items at higher taxonomy levels showed better discrimination indices and that a panel of experts may develop coherent reasoning regarding item difficulty.
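As an illustration of the psychometric indices discussed in the abstract, the sketch below shows how item difficulty (proportion of correct answers) and discrimination (corrected item-total point-biserial correlation) are commonly computed under classical test theory. This is a minimal, hypothetical example: the function name `item_statistics` and the simulated response matrix are assumptions for illustration and are not taken from the study’s data or analysis code.

```python
import numpy as np

def item_statistics(responses):
    """Compute classical test theory item statistics.

    responses: binary matrix (examinees x items), 1 = correct, 0 = incorrect.
    Returns per-item difficulty (proportion of correct answers) and
    discrimination (corrected item-total point-biserial correlation).
    """
    responses = np.asarray(responses, dtype=float)
    # Difficulty index: proportion of examinees answering the item correctly
    # (under this convention, a higher value means an easier item).
    difficulty = responses.mean(axis=0)

    total = responses.sum(axis=1)
    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    for j in range(n_items):
        # Correlate the item score with the rest score (total minus the item)
        # so the item does not inflate its own discrimination.
        rest = total - responses[:, j]
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical responses from 5 examinees on 4 items (illustrative only).
example = np.array([
    [1, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
p, r = item_statistics(example)
print("difficulty:", p)
print("discrimination:", r)
```

Under this convention, items with higher difficulty values are easier (more examinees answered them correctly), and rest-score correlations above roughly 0.2 are often taken as acceptable discrimination, though cut-offs vary between testing programmes.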


Author Biographies

Pedro Tadao Hamamoto Filho, Universidade Estadual Paulista

MD, PhD. Physician, Department of Neurology, Psychology and Psychiatry, Universidade Estadual Paulista (UNESP), Botucatu (SP), Brazil.

Eduardo Silva, Edudata Informática

BSc. Statistical Manager, Edudata Informática, São Paulo (SP), Brazil.

Zilda Maria Tosta Ribeiro, Faculdade de Medicina de Marília

MD. Assistant Professor, Faculdade de Medicina de Marília (FAMEMA), Marília (SP), Brazil.

Maria de Lourdes Marmorato Botta Hafner, Faculdade de Medicina de Marília

MD, MSc. Assistant Professor, Faculdade de Medicina de Marília (FAMEMA), Marília (SP), Brazil.

Dario Cecilio-Fernandes

PhD. Researcher, Department of Medical Psychology and Psychiatry, Universidade Estadual de Campinas (UNICAMP), Campinas (SP), Brazil.

Angélica Maria Bicudo, Universidade Estadual de Campinas

MD, PhD. Associate Professor, Department of Pediatrics, Universidade Estadual de Campinas (UNICAMP), Campinas (SP), Brazil.


Published

2020-02-06

How to Cite

Hamamoto Filho PT, Silva E, Ribeiro ZMT, Hafner M de LMB, Cecilio-Fernandes D, Bicudo AM. Relationships between Bloom’s taxonomy, judges’ estimation of item difficulty and psychometric properties of items from a progress test: a prospective observational study. Sao Paulo Med J [Internet]. 2020 Feb. 6;138(1):33-9. Available from: https://periodicosapm.emnuvens.com.br/spmj/article/view/576

Issue

Vol. 138 No. 1 (2020)

Section

Original Article