Expert assessment of requirements for members of qualification commissions

  • Authors: Kochubey A.V.1,2, Misharin V.M.3,2, Kazakov A.S.4,2, Kochubey V.V.1,2
  • Affiliations:
    1. Federal Scientific and Clinical Center of Specialized Types of Medical Care, Moscow, Russia
    2. Russian University of Medicine, Moscow, Russia
    3. Research Institute of Pulmonology, Moscow, Russia
    4. Research Institute for Healthcare Organization and Medical Management, Moscow, Russia
  • Issue: No 4 (2024)
  • Pages: 434-439
  • Section: Articles
  • URL: https://remedium-journal.ru/journal/article/view/1728
  • DOI: https://doi.org/10.32687/1561-5936-2024-28-4-434-439

Abstract


An assessment conducted by another person is vulnerable to the examiner's varying degree of rigor, possible lack of competence, and likely dependence on third parties. Objectivity is pursued by selecting examiners based on informal signs of professional development. Categorization implies that specialists' qualifications are assessed directly by the members of qualification commissions. The purpose of the study is to assess whether the requirements imposed on members of qualification commissions can ensure a uniform expert level of qualification and their independence. Methods. The expert survey was conducted remotely and consisted of a single iteration. The requirements for members of qualification commissions were rated on three parameters: the ability of the requirements to guarantee a uniform expert level of qualification of commission members in their professional field (1) and in related disciplines (2), and the independence of commission members (3). A ten-point Stapel scale was used. For the analysis, the sums of the ratings were calculated, and the consistency of the ratings, the normality of their distribution, and the differences between expert ratings on two or more criteria were tested. The significance threshold was set at p < 0.05. Results. The expert ratings are consistent, W = 0.877, p < 0.001. The sum of the ratings of the ability of the requirements to ensure the expert level of qualification of a commission member in the professional field is −193; in related disciplines, −199; for independence, −299. Expert ratings of the ability of the requirements to guarantee independence are lower than those for guaranteeing an expert level in the professional and related fields, U = 1372, p < 0.001 for both pairs. Ratings on the first and second parameters do not differ significantly, U = 1964.5, p = 0.915. The ability to guarantee expertise in the professional and related fields is rated higher for accreditation requirements than for the other requirements, p = 0.001. Conclusions. Most of the requirements imposed on members of qualification commissions guarantee neither their expert level in the professional field and related disciplines nor their independence from third parties.
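The statistical workflow described in the Methods (rater consistency via Kendall's coefficient of concordance W, followed by pairwise Mann-Whitney U comparisons of ratings) can be sketched in Python. The score matrix below is purely illustrative, hypothetical expert ratings on a Stapel-style scale; it is not the study's raw data, and the tie-corrected form of W is omitted for brevity:

```python
import numpy as np
from scipy.stats import mannwhitneyu, rankdata

def kendalls_w(ratings: np.ndarray) -> float:
    """Kendall's coefficient of concordance W for an (experts x items) matrix.

    Ties are ranked with average ranks; the tie correction to W is omitted
    here for brevity.
    """
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank items within each expert
    rank_sums = ranks.sum(axis=0)                      # per-item rank totals
    s = float(((rank_sums - rank_sums.mean()) ** 2).sum())
    return 12.0 * s / (m**2 * (n**3 - n))

# Illustrative scores on a Stapel-style scale (-5..+5, no zero): 5 hypothetical
# experts rating 4 hypothetical requirements. NOT the study's raw ratings.
scores = np.array([
    [ 4,  3, -2, -5],
    [ 5,  2, -1, -4],
    [ 3,  4, -3, -5],
    [ 4,  2, -2, -4],
    [ 5,  3, -1, -3],
])

w = kendalls_w(scores)                            # concordance across experts
u, p = mannwhitneyu(scores[:, 2], scores[:, 0])   # compare two requirements' ratings
print(f"W = {w:.3f}, U = {u:.1f}, p = {p:.4f}")
```

A W close to 1 indicates that the experts rank the requirements in near-identical order; the Mann-Whitney U test then checks whether the ratings of one requirement are systematically lower than another's, as in the study's comparison of the independence parameter against the two qualification parameters.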

About the authors

Adelina V. Kochubey

Federal Scientific and Clinical Center of Specialized Types of Medical Care, Moscow, Russia; Russian University of Medicine, Moscow, Russia

Email: kochoubeya@gmail.com

Viktor M. Misharin

Research Institute of Pulmonology, Moscow, Russia; Russian University of Medicine, Moscow, Russia

Email: info@pulmonology-russia.ru

Alexey S. Kazakov

Research Institute for Healthcare Organization and Medical Management, Moscow, Russia; Russian University of Medicine, Moscow, Russia

Email: keyprojet@yandex.ru

Valentin V. Kochubey

Federal Scientific and Clinical Center of Specialized Types of Medical Care, Moscow, Russia; Russian University of Medicine, Moscow, Russia

Email: kochoubey@gmail.com




Copyright (c) 2024 AO "Shiko"

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Mailing Address

Address: 105064, Moscow, st. Vorontsovo Pole, 12, building 1

Email: redactor@remedium-journal.ru

Phone: +7(495) 917-48-86



Principal Contact

Sherstneva Elena Vladimirovna
Executive Secretary
FSSBI «N.A. Semashko National Research Institute of Public Health»

105064, Vorontsovo Pole st., 12, Moscow


Email: redactor@remedium-journal.ru
