The outcome of the undergraduate medical programme is to produce clinically competent health care providers relevant to the South African context. Educational institutions find it difficult to ensure the quality of assessments where competency must be assessed. This study aimed to compile an assessment framework that can be used to benchmark current assessment practices in the clinical phase of the undergraduate medical programme where competency must be certified.
In this observational, descriptive study, qualitative data were gathered using the steps described by the World Health Organization for rapid reviews. Literature was searched, screened and selected before data were analysed and a framework was constructed.
Twenty-five official documents were included in the study. The framework addressed the three components of quality assessment, namely, accreditation, assessment and quality assurance. Assessors should attend to the principles of assessment, namely, validity, reliability, fairness, feasibility, educational effect and acceptability, but realise that no assessment meets all these criteria. The first step to ensure quality assessment is to identify a clear outcome. Assessment should be planned and aligned with this outcome.
It is clear that clinical assessment is multidimensional and that no assessment is perfect. Programme accreditation, assessment practices and psychometrics can help to improve the quality of assessment but cannot judge clinical competence. Using experienced assessors with a variety of assessment methods on a continuous basis is the proposed way to assess clinical competence. An assessment framework can help to improve assessment, but it cannot guarantee quality assessment.
In South Africa, undergraduate medical training programmes are offered at nine accredited universities.
Three components of quality in assessment have been described in the literature, namely, accreditation, assessment and audit.
To assess the quality of an assessment, it must be benchmarked against appropriate criteria. Benchmarking is described as the process of comparing standards with external criteria, with the aim of improvement.
An assessment framework can be described as a ‘common language and mental model’ that guides assessors on what to look for in student assessment to maximise the reliability of the assessment. This framework also informs students and leadership on what to expect during assessment.
Accreditation entails certification, which confirms that a programme and/or training facility is capable of fulfilling required specifications for a specific period. For instance, the South African Qualifications Authority (SAQA) accredits the providers who offer outcomes-based learning programmes that are aligned with registered unit standards and qualifications of the National Qualifications Framework (NQF).
South African Qualifications Authority defines assessment as ‘a process used to identify, gather and interpret information and evidence against required competencies’ in order to make a judgement about a learner’s achievement.
Quality, standards and relevance are key elements of quality assurance.
The duration of undergraduate medical training is between 5 and 6 years, and it is offered in three phases, namely, orientation, pre-clinical and clinical training.
A preliminary literature review was performed to identify frameworks, policies and criteria on quality assessment in the clinical phase of an undergraduate medical programme. At the institutional level, the assessment policy of the UFS sets minimum criteria for assessment of undergraduate students, which requires alignment with national policies and acts.
What are the current regulations and policies as well as best evidence practices that inform quality assessment in the clinical phase of an MBChB programme?
To compile a framework that can be used to benchmark current assessment practices based on official regulations and policies, and supported by best evidence practices to ensure defendable assessment in the clinical phase of the MBChB programme in South Africa.
In this article, a rapid review of the regulations and policy documents of the bodies that regulate the assessment and training of the MBChB programme at the UFS was used to formulate a framework for clinical assessment. This framework may be helpful to benchmark the quality of assessment in the current MBChB programmes in South Africa and beyond its borders.
Qualitative data were gathered using a rapid review. No formal definition or uniform method for conducting a rapid review has been described, although a rapid review can be regarded as a simplified systematic review.
The World Health Organization (WHO) proposed seven steps to follow for a systematic review, which may be adjusted according to the specific needs for a rapid review.
needs assessment and topic selection
protocol development with or without registration
literature search
screening and selection of the literature
data extraction
risk-of-bias and quality assessment of data
knowledge synthesis.
In rapid reviews, some of these steps are commonly simplified or omitted, but the description of the method should not be compromised.
The following steps were used in this rapid review.
The difficulty of defending the quality of clinical assessment in an undergraduate medical programme was identified as an area to investigate.
A protocol was developed before the study commenced. The protocol limited the inclusion of primary source documents to the following:
official regulations and policy documents of the regulatory bodies responsible for assessment in undergraduate medical education at the UFS and South Africa
international guidelines on clinical assessment issued by the WFME
the AMEE guidelines on frameworks for clinical assessment.
The primary outcomes to investigate were:
accreditation and registration
assessment
quality assurance.
The following search strategies were followed:
No date limitations were placed on the documents included in the review.
The search process was conducted from May to July 2019.
An Internet data search was conducted.
The following databases and websites, identified from the preliminary literature study, were consulted: the UFS official website and the HPCSA, SAQA, Council on Higher Education (CHE), WFME and AMEE websites.
The following words and/or phrases were searched: accreditation, assessment policies, assessment guidelines, clinical assessment, quality assurance in assessment, principles of quality assessment and undergraduate assessment. Searches were conducted with single words and phrases and the inclusion and exclusion of ‘AND’ and ‘OR’.
Backward searching was performed using references and cross-references to related policies and regulations of the identified regulatory bodies.
Forward searches of the literature entailed the search for related and updated information from the same documents or topics to ensure that all relevant information was identified.
Documents were screened for selection by a single review author. Documents initially found not to meet the outcomes of the study were not included but were saved separately. These documents were screened a second time to ensure that relevant data were not omitted. When in doubt, the study leader assisted with selection decisions.
Documents were grouped according to the primary outcomes that were accreditation and registration, assessment and audit. The assessment category was subdivided into the following subcategories:
assessment content and standards
assessment types
assessment methods
principles of quality assessment.
A table displayed the specific outcomes that were addressed by each document included in the study.
Risk-of-bias and quality assessment of data was omitted in this review, although care was taken to include all relevant documents by following the prescribed protocol. Document quality was not assessed, as only official policies and regulations were included.
For each category, the results of the review were summarised and discussed. This was supplemented by a secondary literature search to clarify concepts. The guidelines for framework development described by Pangaro and Ten Cate were followed in constructing the framework.
To ensure the credibility of the data collected and to ensure that relevant documents were included in the document review, the protocol was strictly followed. National and international guidelines were added to enable the transferability of results to other institutions. The steps followed in the rapid review were described clearly to support the dependability of the results.
The study was registered with and approved by the Health Sciences Research and Ethics Committee (HSREC) at the University of the Free State (UFS-HSD 2019/0001/2304). As only documents in the public domain were used for this literature review and analysis, no permission was necessary.
The MBChB programme is offered under the legislation of the Department of Health and the Department of Education (previously the Ministry of Education).
Table: Primary documents used in the document review.

| Document | Accreditation or registration | Assessment: content and standards | Assessment: types | Assessment: methods | Assessment: principles of quality assessment | Quality assurance |
|---|---|---|---|---|---|---|
| South Africa. Council on Higher Education. | √ | - | - | - | - | - |
| South Africa. | √ | - | - | - | - | - |
| South Africa. | - | √ | √ | - | √ | √ |
| Health Professions Council of South Africa. Core competencies for undergraduate students in clinical associate, dentistry and medical teaching and learning programmes in South Africa. 2014. | - | √ | - | - | - | - |
| Health Professions Council of South Africa. Accredited facilities. 2019. | √ | - | - | - | - | - |
| Health Professions Council of South Africa. Professional Boards. 2019. | √ | - | - | - | - | - |
| South African Qualifications Authority. | √ | - | - | - | - | - |
| South African Qualifications Authority. The National Qualifications Framework Curriculum Development. 2000. | - | - | √ | - | √ | - |
| South African Qualifications Authority. National Qualifications Framework and the Standards setting. 2003. | - | - | - | - | √ | √ |
| South African Qualifications Authority. Criteria and Guidelines for Assessment of NQR Registered Unit standards and Qualifications. 2001. | - | - | - | - | - | - |
| South African Qualifications Authority. Guidelines for integrated assessment. 2005. | - | √ | √ | - | √ | - |
| South African Qualifications Authority. National Policy and Criteria for Designing and Implementing Assessment for NQF Qualifications and Part-Qualifications and Professional Designations in South Africa. 2014. | - | - | √ | √ | - | - |
| South Africa. National Qualifications Framework. | √ | - | - | - | - | - |
| University of the Free State. Teaching-Learning Policy. 2008. | - | - | √ | √ | √ | - |
| University of the Free State. Quality assurance policy. 2009. | - | - | - | - | - | √ |
| University of the Free State. Guidelines for the implementation of external moderation. 2009. | - | - | - | - | - | √ |
| University of the Free State. Assessment policy on the UFS coursework learning programme. 2016. | - | - | √ | √ | √ | √ |
| University of the Free State. General rules for undergraduate qualifications, postgraduate diplomas, Bachelor Honours degrees, Master’s degrees, Doctoral degrees, Higher Doctorates, Honorary degrees and the Convocation. 2019. | - | - | √ | - | √ | - |
| University of the Free State. Faculty of Health Sciences. Rule book. School of Clinical Medicine. Undergraduate Qualifications. 2019. | - | √ | √ | - | √ | - |
| University of the Free State. School of Clinical Medicine. Undergraduate programme management. 2019. SOP Quality assurance. | - | - | - | - | - | √ |
| World Federation for Medical Education. Standards. Basic Medical Education. 2015. | - | - | √ | √ | √ | √ |
| World Federation for Medical Education. Accreditation. 2017. | √ | - | - | - | - | - |
| Pangaro L, Ten Cate O. AMEE Guide No. 78. | - | √ | √ | √ | √ | - |
| Tavakol M, Dennick R. AMEE Guide 119. | - | - | - | - | √ | - |

NQR, National Qualification Register; NQF, National Qualification Framework; UFS, University of the Free State; SOP, Standard Operating Procedures; AMEE, Association for Medical Education in Europe.
Because of globalisation and the increased demand for accountability in health care, the WHO and the WFME worked together on documents for the accreditation of health training institutions worldwide. The WFME gives ‘recognition status’ to an accrediting agency that meets international standards.
Four components of assessment were identified, namely, assessment content and standards, assessment types, assessment methods and principles of assessment.
An assessment to ensure a competent practitioner must include elements of knowledge, skills and values.
Different types of assessment applicable to medical assessment were identified from the document review, namely, formative assessment, integrated assessment and summative assessment.
Formative assessment is described as a series of assessments that occur during the learning and training process.
Summative assessment is the assessment that takes place after learning. The aim of summative assessment is to award grades and to validate performance and competence.
Theoretical, practical and integrated assessment methods were described, and they relate to the aim or outcome of the assessment. Theoretical assessments include multiple-choice questions, modified essay questions or short-answer questions, as well as long questions. Oral examinations can be used to test knowledge or to combine knowledge with communication skills. Clinical assessments include unobserved long cases, mini clinical evaluation exercises (mini-CEX), objective structured clinical examinations (OSCEs) and direct observation of procedural skills (DOPS). Integrated assessment methods include portfolios, logbooks, elective reports and workplace-based assessments, as well as feedback from stakeholders.
The following principles of quality assessment were identified from the UFS general rules and related assessment policies:
Assessment should be an integral part of curriculum planning and must be aligned with outcomes.
Assessment should be performed on the appropriate NQF level in accordance with programme registration.
All assessments should be planned to cover all assessment domains.
Assessment takes place in a system and must be planned accordingly.
To qualify as a quality assessment, each of these assessments should fulfil the criteria of validity, reliability, transparency, fairness and practicability.
Moderation should form part of overall, as well as individual, assessments.
There should be accountability for each assessment, with evidence that the assessment was moderated.
An assessment can be considered credible if the criteria for fairness, validity, reliability and practicability have been met.
Quality assurance policies are essential for ensuring that specifications and standards are maintained.
All the primary documents necessary for the rapid review were available in the public domain on the identified websites. The information in these documents was consistent across sources, and many cross-references to other documents were found in the source documents. A comparison of the respective documents revealed no contradictions. The data included in the rapid review can therefore be considered representative and appropriate for the purpose of this study.
The three components of quality assessment, namely, accreditation and registration, assessment and quality assurance, should form part of an assessment framework to benchmark current assessment. The inclusion of best-practice evidence in the framework makes the framework globally relevant.
Accreditation and registration are usually not a problem for training facilities in South Africa, as the HPCSA conducts regular site visits and requires annual progress reports from training facilities to ensure compliance with accreditation and registration requirements.
Accreditation of the training provider and the qualification by the HPCSA.
Training may take place only at a university registered with the Department of Education.
The qualification must be registered with SAQA.
All students in the MBChB programme must be registered with the HPCSA.
A recommendation of the 2010 Ottawa Conference was to develop criteria for accreditation of international medical educational programmes.
Assessment in medical education is complex and includes various stakeholders, each with their own expectations. These stakeholders include students, teachers, lecturers, educational institutions, health care systems, regulatory bodies and patients.
The quality of clinical assessment can be improved if attention is given to the following assessment principles:
Validity: Content validity can be improved with blueprinting of individual as well as overall assessments, and construct validity with appropriate assessment methods.
Reliability: Reliability can be improved by training assessors, enhancing the quality of questions and mark sheets, and by increasing the number of stations or questions per assessment.
Fairness: Although all assessments cannot be identical, there should be no discrimination against any student, assessor or patient. It is also important that assessment be conducted at the NQF level at which the programme is registered.
Feasibility: All assessments should be realistic, practical and sensible in the context where they take place. This can be achieved by careful planning and consideration of all resources.
Educational effect: Assessment should promote learning through study for assessments, or making use of workplace-based assessment and constructive feedback.
Acceptability: All stakeholders, including patients, students and the educational institution, should be satisfied with the assessment. This can be achieved through transparency and keeping all stakeholders informed.
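The effect of increasing the number of stations or questions, noted under the reliability principle above, can be illustrated with the Spearman–Brown prophecy formula, a standard psychometric result (included here as an illustrative addition; it is not drawn from the source documents):

```latex
\[
  \rho^{*} \;=\; \frac{n\,\rho}{1 + (n-1)\,\rho}
\]
% \rho   : reliability of the current assessment
% n      : factor by which the assessment is lengthened
% \rho^* : predicted reliability of the lengthened assessment
```

For example, doubling the number of stations (n = 2) in an OSCE with a reliability of ρ = 0.6 predicts ρ* = (2 × 0.6)/(1 + 0.6) = 0.75, which shows why adding stations is an effective, if resource-intensive, way to improve reliability.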
Moderation is a quality assurance process that confirms that the assessment is valid and reliable and meets minimum standards. Moderation should form part of the overall assessment in the MBChB programme, as well as of each individual assessment. Moderation can be conducted internally and/or externally, and should take place before and after assessments. An external moderator should moderate all high-stakes examinations.
Limitations of the study: Although the rapid review was performed according to the protocol, the risk of bias and the quality of documents were not evaluated by a second reviewer. The results may not be fully transferable to all MBChB programmes, as different universities have different assessment policies and methods.
The complexity of clinical assessment warrants that assessment be ‘evaluated on a programmatic level’ rather than on individual assessment level, as no individual assessment meets all the criteria of validity, reliability, educational impact, acceptability and cost.
This rapid review attempted to develop a framework to benchmark the quality of assessment in the clinical phase of an undergraduate medical programme. As a first step, all stakeholders should be aware of the outcome of the programme. All assessment and training in the MBChB programme must be aligned with the outcome of the programme, namely, to produce a competent medical practitioner who can integrate knowledge, skills and attitudes relevant to the South African context.
Open-mindedness is essential during the assessment process. Programme accreditation, assessment practices and psychometrics can help to improve the quality of assessment but cannot judge clinical competence. Having experienced assessors use a variety of assessment methods on a continuous basis is the proposed approach. By implementing quality assurance processes, institutions can ensure that specifications and standards are maintained and improved, and that they remain globally competitive. It is clear that clinical assessment is multidimensional and that no assessment is perfect. An assessment framework can help to improve assessment, but it cannot guarantee quality assessment.
Schematic display of the framework to measure the quality of assessment in the clinical phase of an undergraduate medical programme.
The authors have declared that no competing interests exist.
H.B. was in charge of conceptualisation of the study, protocol development, data collection and writing of the article. J.B. and L.J.V.d.M. assisted with conceptualisation and planning of the study, as well as critical evaluation and the final approval of the manuscript.
This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
Data sharing is not applicable to this article as no new data were created or analysed in this study.
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.