About the Author(s)

Neetha J. Erumeda
Department of Family Medicine and Primary Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa

Gauteng Department of Health, Ekurhuleni District Health Services, Germiston, South Africa

Ann Z. George
Centre of Health Science Education, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa

Louis S. Jenkins
Division of Family Medicine and Primary Care, Faculty of Medicine and Health Science, Stellenbosch University, Cape Town, South Africa

Western Cape Department of Health, George Hospital, George, South Africa

Primary Health Care Directorate, Department of Family, Community and Emergency Care, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa


Erumeda NJ, George AZ, Jenkins LS. Evidence of learning in workplace-based assessments in a Family Medicine Training Programme. S Afr Fam Pract. 2024;66(1), a5850. https://doi.org/10.4102/safp.v66i1.5850

Original Research

Evidence of learning in workplace-based assessments in a Family Medicine Training Programme

Neetha J. Erumeda, Ann Z. George, Louis S. Jenkins

Received: 19 Oct. 2023; Accepted: 10 Jan. 2024; Published: 26 Apr. 2024

Copyright: © 2024. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Background: Learning portfolios (LPs) provide evidence of workplace-based assessments (WPBAs) in clinical settings. The educational impact of LPs has been explored in high-income countries, but the use of portfolios and the types of assessments used for and of learning have not been adequately researched in sub-Saharan Africa. This study investigated the evidence of learning in registrars’ LPs and the influence of the training district and year of training on assessments.

Methods: A cross-sectional study evaluated 18 Family Medicine registrars’ portfolios from study years 1–3 across five decentralised training sites affiliated with the University of the Witwatersrand. Descriptive statistics were calculated for the portfolio and quarterly assessment (QA) scores and self-reported clinical skills competence levels. The competence levels obtained from the portfolios and university records served as proxy measures for registrars’ knowledge and skills.

Results: Across the training years, the total LP median scores ranged from 59.9 to 81.0 and the QA median scores from 61.4 to 67.3. Across the training districts, the total LP median scores ranged from 62.1 to 83.5 and the QA median scores from 62.0 to 67.5. Registrars’ competence levels across skill sets did not meet the required standards. Higher skills competence levels were reported in the women’s health, child health, emergency care, clinical administration and teaching and learning domains.

Conclusion: The training district and training year influence WPBA effectiveness. Ongoing faculty development and registrar support are essential for WPBA.

Contribution: This study contributes to the ongoing discussion of how to utilise WPBA in resource-constrained sub-Saharan settings.

Keywords: assessment for learning; assessment of learning; formative assessments; workplace-based assessments; work-based learning; family physician; family medicine; learning portfolio.


The World Health Organization (WHO) advocates that medical training should develop socially accountable health professionals with the required skills.1,2 Socially accountable medical training requires education and service activities that address the priority health concerns of the communities being served.2 Trainees’ ‘social responsiveness’, an essential element of social accountability,3 refers to their ability to integrate the complex skills required to attend to patient needs and improve patient outcomes in the community.

The recent shift to work-based learning (WBL) and workplace-based assessment (WPBA) aligns with the call for greater social accountability in medical training. Training in authentic clinical contexts promotes professionalism, communication, teamwork and interprofessional collaboration and enhances self-directed learning, thus preparing trainees to be responsive to communities’ needs.4,5 Despite the benefits of WBL, medical schools may not be equipped to assess the complex integration of knowledge, skills and attitudes in authentic settings and have been criticised for not linking the phases of trainee learning with patient outcomes.5 The educational impact of WBL has been explored in high-income countries, but the use of WPBAs and the types of assessments used for and of learning have not been adequately researched in sub-Saharan Africa. This article describes the evidence of learning in WPBAs in a Family Medicine training programme in South Africa (SA).

Work-based learning shifts the focus from theoretical learning to learning in and from practice.6 Work-based learning also broadens the scope of educational assessment from assessment of learning (AoL), summative components with pass/fail decisions,5 to assessment for learning (AfL).7 Assessments for learning form part of day-to-day clinical practice: seeking, reflecting on and responding to information from observation and dialogue to promote learning.8 Assessment for learning provides meaningful narrative feedback to enhance the learning experience.7,9 Assessments of learning are high-stakes assessments focusing on the achievement of learning outcomes at the end of a programme7,10 for accountability, ranking and certification of trainees’ achievement of competence.11

The distinction between AfL and AoL is complex, making it difficult for trainers to discern how to use the different assessment types effectively.11 Workplace-based assessments captured in learning portfolios (LPs) provide adequate evidence of trainees’ competence and are collectively used for the summative AoL.7 Given that WBL and WPBA are recent developments, it may be more difficult for trainers to use AfL and AoL to the trainees’ best benefit.

Learning portfolios for workplace-based assessment

In programmatic assessment, defined as ‘a consciously designed assessment system in which the longitudinal development is visible to the learner as usable feedback which provides rich data for informed, holistic decision-making on learner progression’, individual data points from low-stakes AfL accumulate into multiple data points for high-stakes AoL.4 Learning portfolios, introduced in the United Kingdom in 1996 for postgraduate general practice vocational training,12,13 are now widely used as a longitudinal tool in programmatic assessments.

Learning portfolios offer several benefits for trainees’ longitudinal development. Portfolios may include evidence of personal and professional WBL,14 assessments of ‘soft’ skills, like professionalism, communication and teamwork, and of ‘hard’ clinical procedural skills.4,15 Soft skills are better learned and assessed when opportunities are provided to address the complexities of clinical practice in the workplace.4,14,15 Learning portfolios provide a systematic overview of the tasks, the expected levels of competency for the required tasks, perceived levels of competence (ability to perform the tasks) and areas needing improvement.15 Competency cannot be measured, but competence, an attribute of the person performing the task, can be measured.16 Learning plans, an essential component of LPs, encourage trainees to track their educational progress and stimulate reflection and feedback as critical skills in self-directed learning.13,17 Supervisors are expected to assist trainees in developing learning goals and support them in achieving their learning plans.18,19 The most critical factor influencing LP use in high-income countries was supervision that stimulates trainees’ reflection and promotes deep learning.20,21 The organisation of portfolios also affects their efficacy.21 Trainees require clear guidelines on the portfolio structure, format and content, and on how to avoid excessive paperwork.21

Workplace-based assessments are best used as part of a programmatic assessment approach; no single assessment method embodies all the required educational characteristics, such as reliability and validity, educational impact and acceptability.22 However, different WPBA tools have varying educational impacts across clinical contexts.23,24,25 In high-income countries, the effectiveness of WPBA tools depended on prior training in using the tools and on the contexts where assessments were conducted.22,24 One factor affecting WPBA is the lack of consensus on the number of assessments to be used as AfL.24 Other factors arise from trainee-tool interactions; for example, trainee-assessor relationships could promote lenient marking.24 Despite the challenges associated with WPBAs, trainees and trainers agree that WPBAs provide valuable narrative feedback and augment trainees’ self-reflective skills, thus improving learning opportunities.26

Potential barriers to effective LP use include the paperwork required, finding senior staff to supervise procedures, ineffective supervision, the lack of faculty engagement21 and ticking boxes to indicate whether skills were performed.17,27,28 Ticking boxes does not provide feedback on the quality of performance.20 Other barriers that have been identified were the need for multiple assessors, inadequate supervisor and trainee training on WPBA tools, low trainee confidence and whether WPBAs are conducted for learning or of learning.13,21,24 In one study, trainees were concerned that delayed feedback after WPBAs would impact their results – supervisors could not always remember individual trainees.28

While WBL theories and models have been developed, variations in how trainees and supervisors utilise WPBA tools across settings need further research.13 Workplace-based assessment tools have been extensively investigated in high-income settings like the Netherlands and the United Kingdom, but there is a paucity of studies exploring their effect on trainee performance in low-middle-income contexts.29 There is a need for further characterisation of LP use, purpose, structure, evidence for mentoring and assessments across different contexts.21 Some authors have suggested a need to explore the influence of context on LP use to understand why implementation failed in some contexts and to identify and interrogate improvements.21

The SA learning portfolio requirements

Learning portfolios were introduced in SA as a WPBA tool in postgraduate family medicine (FM) training in 2012.30 The Colleges of Medicine of South Africa (CMSA), the examining board for the national exit examinations for registrars (SA postgraduate medical trainees), specifies five unit standards and 83 programmatic learning outcomes to be achieved by the end of the 4-year specialist training programme.30,31,32 The national LP was designed to capture the evidence for 50 of the 83 learning outcomes.31,33 The LP records evidence of registrars’ personal and professional development through reflection and from WPBA tools.30 Detailed learning plans should specify registrars’ learning needs, plans to address those needs and the goals they have achieved.34 Registrars have to obtain a subminimum of 60% in the LP in each of the 3 years to progress to the following year and qualify for the national exit examinations.30

The guidelines for the national LP recommend establishing effective supervisor-registrar relationships with regular engagements, including monthly meetings, at least 6 h of dedicated weekly training and supervision and adequate mentoring.30 Learning portfolios can be handwritten or electronic, depending on university requirements, but electronic portfolios are associated with better completion rates, improved quality of scoring entries, increased frequency of use and sustained supervisor-registrar engagement.35 Previous SA studies found that registrars’ reflective skills, knowledge of the requirements, good supervisor feedback and regular engagement with supervisors positively influenced LP use.34

The skills list for registrars was revised and validated in a 2018 national Delphi study.36 This list of 205 core and 39 elective skills is part of the national LP requirements.30

There are four self-reported competence levels for skills:

  • A – Only possesses theoretical knowledge about the skill.
  • B – Possesses theoretical knowledge and has observed the skill being performed.
  • C – Performed the skill under supervision.
  • D – Can be fully entrusted to perform that specific skill independently.

There are few studies on WPBA and LP use in postgraduate medical training in low- and middle-income countries, especially in public district health systems where WBL occurs across various primary health care (PHC) facilities and hospitals. This study adds to the discourse by quantifying evidence of learning in LPs, evaluating the AfL and AoL, and investigating the influence of context, such as the training district or the year of training, on the nature of assessments.

Postgraduate family medicine programme at the University of the Witwatersrand

Family medicine was recognised nationally as a new speciality in 2007. The University of the Witwatersrand (Wits University) introduced a full-time postgraduate decentralised FM training programme in 2008. The training programme is structured across 4 years. Registrars undergo 2- to 3-week theoretical training blocks at the university annually, depending on the training year. The university also provides an online component comprising FM-related clinical and non-clinical topics, research and journal clubs. Fourth-year registrars focus on the research component and undergo elective rotations to gain further knowledge in a clinical discipline or area of interest developed during the earlier training years.

The FM registrar training programme is based in PHC clinics, community health centres (CHCs) or district hospitals across the five study districts. Community health centres are 24-h nurse-led facilities supported by general practitioners and family physicians (FPs). District hospitals are 50–300-bed hospitals serving a defined population, providing 24-h comprehensive care services and run by general practitioners with a few general specialists, including FPs.37 For skills training, registrars rotate every 2 to 3 months through district or regional hospital departments, including internal medicine, obstetrics and gynaecology and paediatrics. The timing, location and duration of clinical rotations vary between districts. Decentralised clinical training also includes weekly training sessions at district regional training centres comprising academic discussions (clinical and non-clinical topics), clinical case discussions (one-on-one or group) and WPBAs using the mini-clinical evaluation exercise (mini-CEX) or the direct observation of procedural skills tool. The structure of these sessions varies slightly across districts. Another training component is the monthly or bimonthly one-on-one engagements, captured in the LP, between registrars and supervisors to discuss registrars’ learning needs, review their learning plans and progress, and identify gaps to be addressed. In some districts, registrars also have observed consultations or procedures under FP supervision at CHCs or PHC clinics.

The University of the Witwatersrand FM registrar training uses the programmatic assessment model,38 with multiple assessments during the training year. Assessments for learning include WPBAs across the CMSA-prescribed curriculum, captured in the LP. Other LP sections are learning plans, educational meetings, direct observations, written assignments, logbooks, certificates for attending courses and meetings, and end-of-year assessments.30 Assessment for learning is used to decide annual registrar progression: the quarterly assessment (QA) scores and the overall LP score are considered. The QAs are observed consultations or procedures evaluated by a supervisor from another district using WPBA tools. The QA scores were taken as AfL separately from the overall LP scores contributing to the year mark. The involvement of multiple supervisors improves the validity of WPBA.24 Assessment of learning includes university examinations conducted 18 months into the training programme and the national exit examination, taken after at least three training years, as the final high-stakes assessment towards specialist qualification.

In the logbook section of the Wits University FM LP, registrars must achieve competencies clustered into core and elective skills within 14 clinical domains aligned to CMSA requirements. These clinical domains are adult health, ear, nose and throat (ENT), eyes and skin, women’s health, emergency care, child health, consultation skills, clinical governance, orthopaedics, anaesthetics, forensics, community-oriented primary care (COPC), clinical administration, teaching and learning and palliative care. Registrars must achieve competence at level D for all core skills and level C or D for elective skills. Their immediate supervisors evaluate registrars’ bi-annual self-assessments of skills in their logbooks before submitting the LPs to the university for final evaluation by the portfolio committee.


Study design

This study forms part of a broader, mixed-methods case study evaluating the postgraduate FM training programme at the University of the Witwatersrand using a logic model. The quantitative component of the broader study reported in this article evaluated evidence of learning in WPBAs as short-term outcomes of the logic model; the short-term outcomes were proxy measures of registrars’ knowledge and skills.

Study setting

The study was conducted across the five training districts affiliated with the University of the Witwatersrand: Ekurhuleni, Johannesburg Metro, Sedibeng and West Rand in the Gauteng province and Dr Kenneth Kaunda in the North West province of South Africa. Gauteng province is more densely populated (total population above 15 million) than the North West province (just over 4 million).39 However, more than 75% of the population in both provinces is uninsured and utilises government health services at PHC clinics, CHCs and district hospitals.39 There is a gross disparity in the distribution of doctors across the provinces. While Gauteng province has over 4000 medical practitioners and 1500 specialists, North West has an acute doctor shortage, with just over 1000 medical practitioners and 100 specialists covering the province.39

The health facilities that serve the population vary widely across the training districts.39,40 While the Johannesburg Metro district has 108 PHC clinics, 11 CHCs and two district hospitals, Ekurhuleni has 84 PHC clinics, nine CHCs and one district hospital. Sedibeng has 30 PHC clinics and eight CHCs, while the West Rand district has 45 PHC clinics and three CHCs.41 The less-populated Dr Kenneth Kaunda district in the North West province has only 30 PHC clinics and 10 CHCs, spread over 14 671 km².41 The number of regional and tertiary hospitals ranges from five hospitals in the Johannesburg Metro district to no tertiary facility in the Sedibeng, West Rand and Dr Kenneth Kaunda districts.41 Most of the uninsured population in these communities relies solely on health services provided by PHC and CHC facilities as the first point of health care, emphasising the need for high-quality patient care at these facilities.

Study population and sampling

The study population included all 18 registrars’ LPs across 3 years of the 2020 training programme. All 18 registrars who submitted LPs at the end of the training year 2020 (N = 18) consented to their LPs being evaluated. The registrars’ total QA and LP sectional scores, including their skills competence levels and final LP scores, were collected as the AfL. The AoL component for the first- and second-year registrars was the outcome of the 18-month university examinations and the national exit college examination results for third-year registrars.

Data collection

The ‘evidence for learning tool’ was developed to record registrars’ scores and skills competence levels. Quarterly assessment scores and university or CMSA examinations’ pass or fail outcomes (in the same or the following year) were extracted from university records. Registrars in the second year wrote the University of the Witwatersrand exams in 2020; the first years wrote in 2021 after completing their 18 months of training. The third-year registrars, who had submitted their LPs in 2020, sat the CMSA exams in 2021, in their fourth year of study. This study used supervisors’ total scores allocated for QA, LP and various LP sections and the final pass/fail in the university or CMSA examinations as proxy measures of registrars’ knowledge.

The nationally validated skills set36 was used to develop the skills section of the data collection tool. Registrars’ self-assessment scores for each skill in the LP logbook were taken as proxy measures for skill competencies. After the final submission, the registrar’s immediate supervisor and the university’s portfolio committee verified all the LP scores.

Data analysis

The data were entered into a Microsoft Excel spreadsheet and imported into Stata 14.2 software. Medians and interquartile ranges were calculated for the total LP, LP sectional and QA scores for each training district and year. Frequencies of self-reported competence levels for each skill (category scores A–D) were determined. Registrars’ total skill-set scores for each domain were calculated using only the highest competence level, ‘D’; levels A, B and C were not counted. For example, if a registrar assessed themselves at level D in 30 of the 38 ‘adult health’ skills, their total skill-set score for that domain would be 30/38. The medians and interquartile ranges of each domain’s total skill-set score were calculated for each training year and district.
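The domain scoring rule and the summary statistics described above can be sketched as follows. This is a minimal illustration in Python rather than Stata; the `adult_health` entries and the score values are hypothetical data invented for the example, not taken from the study.

```python
import statistics

# Hypothetical self-assessed competence levels for the 38 'adult health'
# core skills of one registrar (levels A-D, as in the LP logbook).
adult_health = ["D"] * 30 + ["C"] * 5 + ["B"] * 3

# Total skill-set score for the domain: only the highest level 'D' is
# counted, as described in the Methods; levels A, B and C are not credited.
skill_set_score = sum(1 for level in adult_health if level == "D")
print(f"{skill_set_score}/{len(adult_health)}")  # → 30/38

# Median and interquartile range for a set of scores, e.g. total LP
# scores across a group of registrars (hypothetical values).
scores = [62.1, 59.9, 76.3, 81.2, 83.5, 70.4]

def median_iqr(values):
    """Return the median and the (Q1, Q3) interquartile bounds."""
    q1, q2, q3 = statistics.quantiles(values, n=4)  # quartile cut points
    return q2, (q1, q3)

med, (q1, q3) = median_iqr(scores)
print(f"median {med} (IQR {q1}-{q3})")
```

The same per-domain counting would be repeated for each registrar and domain, with the resulting skill-set scores then summarised by median and interquartile range per training year and district.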

Ethical considerations

Ethical approval was obtained from the Human Research Ethics Committee (HREC Medical) of the University of the Witwatersrand (Certificate number M191140). Permission was obtained from the University Registrar and the Head of the Department of Family Medicine to conduct the research and access the university records and LPs. Informed consent was obtained from registrars to access the LPs. The research was carried out following the Helsinki Declaration.


Six LPs from each year of registrar training were assessed (N = 18). The total LP median score (interquartile range) was 76.3 (62.1–81.2), and the QA median score (interquartile range) was 64.9 (60.7–69.5).

The medians and interquartile ranges calculated for the total LP scores and the various LP sectional and QA scores across the training years (Y1–Y3) are shown in Table 1. The total LP median scores were higher in Year 3 (Y3) than in Years 1 (Y1) and 2 (Y2) (Table 1). The Y2 registrars’ QA median scores were higher than those of Y1 and Y3 (Table 1). Supervisors awarded full marks (10/10) in some LP sections in Y1 and Y2 but not in Y3 (Table 1).

TABLE 1: Assessment for learning scores in learning portfolios and quarterly assessments across training years.

Table 2 shows the medians and interquartile ranges calculated for the total LP scores and the various LP sectional and QA scores across the five districts (D1–D5). The total LP median scores varied considerably across D1–D5 (Table 2), but the QA median scores did not. Again, some LP sections (learning plans, educational meetings and observations) had median scores of 10/10 in some districts (Table 2).

TABLE 2: Assessment for learning scores in learning portfolios and quarterly assessments across training districts (N = 18).

In the first- and second-year summative assessments (AoL), six of the 12 registrars passed the 18-month university examinations on the first attempt, five failed, and one did not take the mid-point examinations. Of the six registrars who completed the 3 years of training, the four who sat the national exit examination passed on their first attempt; two did not take the examination.

Clinical skills competence levels

The logbook skills section was incomplete across all training years, but primarily in the first year: four first-year registrars did not complete the section. The self-assessed competence levels of the 18 registrars for the 205 core clinical skills and 39 elective skills varied considerably, from A to D, depending on the registrar’s self-assessment and the type of skill (Appendix 1). Third-year registrars self-assessed their competence as higher (D scores) in most skills than registrars in the other 2 years did. Registrars reported higher competence levels in clinical domains such as adult health, women’s health, child health and emergency care than in ENT, eyes and skin, orthopaedics and anaesthetics. First- and second-year registrars scored low in domains like orthopaedics, anaesthetics, ENT, and eyes and skin because these clinical rotations occur during the third year of training. Most registrars, even first-years, reported higher competence levels in the consultation, clinical administration, clinical governance, COPC and teaching and learning domains (Appendix 1).

Although the self-reported level of competence in performing clinical skills was relatively better among third-year registrars than among those in Years 1 and 2, some core skills were still reported at levels A, B and C. Examples included laparotomy for ectopic pregnancy, cricothyroidotomy, assisted vaginal delivery, proctoscopy, vasectomy and reduction of an elbow dislocation (Table 3). Although each year had six registrars, the denominators varied because many core skills were left unreported (Table 3).

TABLE 3: Self-reported competence levels on selected core clinical skills among third-year registrars (n = 6).

Table 4 shows the total skill-set score medians and interquartile ranges for all 14 clinical domains in the LP across the 3 years of training. These medians were higher in the adult health, women’s health, emergency care and child health domains for second- and third-year registrars than in the ENT, eyes and skin, orthopaedics and anaesthetics domains (Table 4). The Year 2 median was higher than the Year 3 median for the COPC and teaching and learning domains. While the total core skills median scores progressed across the training years, the elective skills median scores remained low across all three training years (Table 4).

TABLE 4: Total skills set score median and interquartile ranges for Years 1–3 (N = 18) in 14 family medicine domains.


This study examined the evidence of registrar learning from the scores and self-reported clinical-skills competence levels recorded in LPs and QAs. The major finding was that LPs are not being used optimally as self-directed learning tools in decentralised training contexts, as has been found previously.13,17,42 The scores in various LP sections and the skill-level scores (for each skill and skill set) varied across clinical domains and training years and lacked adequate evidence of registrar progression in knowledge and skills. Although the LPs showed evidence of registrar learning, they can be used more effectively in WPBA.

Multiple assessments from various sources in authentic contexts improve assessment validity and reliability.22 Although there was evidence that the programme included multiple WPBAs, there was also evidence of lenient scoring in LP sections like the learning plans and skills competence levels. The negative impact of lenient scoring may be exacerbated by inadequate narrative feedback, as previously reported for this training programme.35 These combined problems raise concerns about whether LPs are being used effectively to support self-directed learning and suggest that supervisors need more clarity on using LPs in postgraduate training.13,17

The total LP scores improved across the training years, but in the QA scores, second-year registrars averaged higher than third-year registrars, echoing previous studies that detailed the inadequacies and leniency of supervisors when scoring WPBAs.43 Other LP sections, such as educational meetings and direct observations, showed the required number of educational meetings and WPBAs between supervisors and registrars. As reported in previous studies,28,34,42 registrars and supervisors in this study focused on completing LP sections rather than utilising them effectively as a learning tool. Whether supervisor–registrar engagement was merely quantified, or took place according to the expected standards with quality feedback to the registrars, needs further exploration. In our WBL context, all LP scores were allocated by immediate FP supervisors and later verified by the university’s portfolio committee. Introducing the LP in the first year, with adequate registrar training on its use and ongoing mentoring by supervisors, could enhance LP utilisation during self-directed learning.21,44

Quarterly assessment scores, part of the AfL component, contributed separately from the LP scores to the decision on yearly registrar progression. In authentic clinical settings, supervisors and registrars struggle to use WPBA tools such as the mini-CEX strictly as summative tools during QAs and always add a formative component to those assessments. Similar challenges related to mini-CEX use have been identified in high-income countries.24,43

The self-assessed competence in this study ranged across all four levels. Self-assessment is a feasible method for registrars to report their perceived competence, but it is susceptible to over- or under-reporting.45,46 Although logbooks are intended to ensure that trainees perform the number of skills required for competence,23 incomplete entries across all three training years undermine that intention. Registrars may have been reluctant to complete the logbook skills section because they perceived their skills competence as inadequate. Alternatively, they may not have felt confident reporting their competence levels because of inadequate exposure to these skills in their context. Despite the increasing emphasis on self-assessment and critical analysis of trainees’ performance in competency-based medical education,47 the registrars appeared to lack the skills to self-assess their performance. The reported competence levels improved in the higher training years, suggesting continuous skills learning at the workplace. Registrars perform skills rotations in the major disciplines during their first and second years of training, allowing them to acquire the required skills. It was encouraging to find that registrars reported adequate competence levels in domain-specific skills learnt earlier in their training, such as consultation, clinical governance and administration, community-oriented primary care and teaching and learning.

A concerning finding was that third-year registrars reported lower competence levels in skills such as cricothyroidotomy, assisted vaginal delivery, laparotomy for ruptured ectopic pregnancy and reduction of an elbow dislocation. Third-year registrars should be fully entrusted to perform these essential skills and be able to train junior registrars or students. Registrars’ skills deficiencies should be included in learning plans as learning needs, discussed with supervisors, and plans made for acquiring them. Achieving adequate procedural skills competence is a priority in full-time postgraduate FM training programmes compared with older part-time programmes.44 Medical officers working in district hospitals in similar contexts reported adequate competence in performing these procedures.45 Once qualified as an FP, the registrar will need to function as a consultant and capacitator for medical practitioners and other health workers, performing various surgical and obstetric procedures to strengthen district hospital services.48 This study reiterated that registrars require adequate exposure to learning opportunities to achieve mastery in performing core procedures commonly encountered in primary care settings.

Third-year registrars also reported lower competence levels in core clinical skills such as proctoscopy, applying a clubfoot cast, vasectomy, culdocentesis and performing a brachial block. Questions that need to be answered include whether registrars had sufficient opportunities or allocated time to practise these skills during clinical rotations, and whether these procedures were performed in sufficient numbers in the various disciplines of the regional and district hospitals where registrars rotated. Current trends in medical education emphasise contextualising the curriculum and learning opportunities to the knowledge and skills needed in the health care system where trainees practise.49 While the FM skills list was revised in 2017,36 some core skills are still not performed sufficiently often at peri-urban district hospitals.45,50 Perhaps it is time to revise the FM registrar training skills list to include more relevant, context-specific procedural skills.

The impact of the training context on achieving the required levels of core skills competence may not be sufficiently considered. Family physicians in SA work in various health-system contexts, such as district, regional and tertiary hospitals, private general practice, and rural and urban clinics, where they require most of these skills.51 Mastering all core skills could be more relevant for an FP practising in a rural district hospital, where acute health worker shortages and referral challenges are experienced, than in urban settings. It may be time to consider separate skill sets for FPs practising in rural and urban contexts.45,52 Even in high-income countries, doctors working in rural areas reported higher skill levels than their urban counterparts.46,53 A compulsory rural block for registrars based in peri-urban district hospitals, for example a 2- to 3-month rotation or even a year of longitudinal clinical work at a rural district hospital, could mitigate the skills gaps identified among registrars.

Although studies were conducted when the LP was introduced into postgraduate FM training in SA about 10 years ago,42 there has been no further research measuring its longitudinal impact. This study initially sought short-term evidence of workplace-based learning (WBL), but comparing its findings with studies from a decade ago also highlights the long-term impact of LPs and whether they have been effectively utilised in WBL. Despite the positive impact of LPs on WBL, this study highlighted several issues that adversely affected how effectively registrars used the LPs, including incomplete logbook sections and a lack of clarity about the roles of FPs.

Workplace-based assessments captured in the LP provide trainees with more opportunities to reflect on their practice in authentic settings, positively influencing learning behaviour.43 Teamwork, professionalism and self-appraisal are all assessed when a mini-CEX tool is used in WPBAs.22 The assessment of complex tasks relies on the trainee’s ability to integrate cognitive, psychomotor and affective components, which is best evaluated in authentic clinical settings.22 Currently, the LP focuses primarily on scoring systems to determine whether the required numbers have been met and whether registrars have achieved adequate skills competence levels. Providing more qualitative feedback on registrar performance would enhance the comprehensiveness of WPBAs. Most importantly, creating a learning environment that encourages reflective dialogue between trainees and their supervisors is vital for effective LP utilisation.20


Strengths and limitations
The number of LPs assessed was low, despite all available portfolios being included. However, the results contributed to a better understanding of the ‘phenomenon of interest’ of the broader mixed-methods case study, namely decentralised postgraduate FM registrar training at the University of the Witwatersrand.

The data extracted depended on the legibility and completeness of the various LP sections. Many skills competence entries were incomplete, especially in the first-year registrars’ portfolios, and data collection was likely impacted by the coronavirus disease 2019 (COVID-19) lockdown. During the lockdown, registrars struggled to complete LPs for some clinical skills rotations, which affected our results.

Self-assessment may have resulted in over- or under-reporting of skills competency levels. However, this effect may have been mitigated by supervisors evaluating competency levels through direct observation of procedures. Additional mitigation likely arose during the final evaluation by the University’s portfolio committee. All districts are represented on this internal committee, which interrogates the LPs and discusses inconsistencies, thereby improving the validity of the results. The initial intention was to determine associations between the scores and skills and the training districts and training years, but this was not feasible given the small number of LPs. Future research on supervisors’ entrustment decisions for directly observed skills may corroborate registrars’ self-reported competency levels.


Recommendations
The study highlighted the need for faculty development and registrar training to improve FPs’ and registrars’ WPBA literacy. Training will capacitate supervisors to mentor registrars effectively and enhance registrars’ self-directed learning. The LPs should include multiple assessments of various competencies, covering both hard and soft skills, by various assessors in different contexts. Adequate formative feedback needs to be provided to augment registrars’ learning opportunities. Regular formative assessment visits by faculty programme managers would improve and maintain WPBA standards, which could translate into better registrar learning at their workplaces.


Conclusion
This study aimed to evaluate evidence of learning in LPs, formative and summative WPBAs, and the influence of the training district and year of training on assessments. While the findings provide a holistic view of WPBA in FM training in one setting, they could apply to similar contexts at other South African universities. Future research on WPBA across multiple training programmes would give a more complete picture of postgraduate FM training in SA, which may also benefit other sub-Saharan African countries.


Acknowledgements
Special thanks to the University of the Witwatersrand registrars for allowing access to their learning portfolios. The authors acknowledge the statistical assistance of Prof. E.L. and Dr Z.M.Z. of the University of the Witwatersrand.

Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.

Authors’ contributions

N.J.E. conceptualised the research, collected and analysed the data and wrote the first draft of the article. A.Z.G. and L.S.J. contributed to the data analysis and writing the subsequent drafts. All authors contributed to the article and approved the final version.

Funding information

This research work is supported by the Faculty Research Committee Individual Research Grants 2021, University of the Witwatersrand.

This work is based on the research supported in part by the National Research Foundation of South Africa for the grant, Unique Grant No 122003.

Data availability

Data are available from the corresponding author, N.J.E., upon reasonable request.


Disclaimer
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors, and the publisher.


References
  1. World Health Organization. Transforming and scaling up health professionals’ education and training: World Health Organization guidelines 2013 [homepage on the Internet]. c2019 [cited 2021 Nov 11]. Available from: https://www.who.int/publications-detail-redirect/transforming-and-scaling-up-health-professionals’-education-and-training
  2. Boelen C, Heck JE, World Health Organization. Division of Development of Human Resources for Health. Defining and measuring the social accountability of medical schools / Charles Boelen and Jeffery E. Heck [homepage on the Internet]. WHO/HRH/95.7. Unpublished. 1995. [cited 2023 Mar 23]. Available from: https://apps.who.int/iris/handle/10665/594411
  3. Kenwright DN, Wilkinson T. Quality in medical education. In: Swanwick T, Forrest K, O’Brien BC, editors. Understanding medical education: Evidence, theory and practice. 3rd ed. Hoboken, NJ: Wiley-Blackwell, 2019; p. 101–110.
  4. Friedman BDM, Howie PW, Ker J, Pippard MJ. AMEE Medical Education Guide No. 24: Portfolios as a method of student assessment. Med Teach. 2001;23(6):535–551. https://doi.org/10.1080/0142159012009095
  5. Van der Vleuten C, Lindemann I, Schmidt L. Programmatic assessment: the process, rationale and evidence for modern evaluation approaches in medical education. Med J Aust. 2018;209(9):386–388. https://doi.org/10.5694/mja17.00926
  6. Morris C. Work-based learning. In: Swanwick T, Forrest K, O’Brien BC, editors. Understanding medical education: Evidence, theory and practice. 3rd ed. Hoboken, NJ: Wiley-Blackwell, 2019; p. 163–174.
  7. Wood DF. Formative assessment: Assessment for learning. In: Swanwick T, Forrest K, O’Brien BC, editors. Understanding medical education: Evidence, theory and practice. 3rd ed. Hoboken, NJ: Wiley-Blackwell, 2019; p. 361–373.
  8. Klenowski V. Assessment for learning revisited: An Asia-Pacific perspective. Assess Educ. 2009;16(3):263–268. https://doi.org/10.1080/09695940903319646
  9. Kulasegaram K, Rangachari PK. Beyond ‘formative’: Assessments to enrich student learning. Adv Physiol Educ. 2018;42(1):5–14. https://doi.org/10.1152/advan.00122.2017
  10. Nigel DKS, Ramesh M. Assessment and competence. In: Mehay R, editor. The essential handbook for GP training and education. London: Radcliff Publishing, 2013; p. 409–423.
  11. Schellekens LH, Bok HGJ, De Jong LH, Van der Schaaf MF, Kremer WDJ, Van der Vleuten CPM. A scoping review on the notions of Assessment as Learning (AaL), Assessment for Learning (AfL), and Assessment of Learning (AoL). Stud Educ Evaluation. 2021;71:101094. https://doi.org/10.1016/j.stueduc.2021.101094
  12. Snadden D, Thomas ML, Griffin EM, Hudson H. Portfolio-based learning and general practice vocational training. Med Educ. 1996;30(2):148–152. https://doi.org/10.1111/j.1365-2923.1996.tb00733.x
  13. Snadden D, Thomas ML. Portfolio learning in general practice vocational training – Does it work? Med Educ. 1998;32(4):401–406. https://doi.org/10.1046/j.1365-2923.1998.00245.x
  14. Tartwijk JV, Driessen EW. Portfolios in personal and professional development. In: Swanwick T, Forrest K, O’Brien BC, editors. Understanding medical education: Evidence, theory and practice. 3rd ed. Hoboken, NJ: Wiley-Blackwell, 2019; p. 255–262.
  15. Tartwijk JV, Driessen EW. Portfolios for assessment and learning: AMEE Guide no. 45. Med Teach. 2009;31(9):790–801. https://doi.org/10.1080/01421590903139201
  16. Holmes A, Tuin MP, Turner SL. Competence and competency in higher education, simple terms yet with complex meanings: Theoretical and practical issues for university teachers and assessors implementing Competency-Based Education (CBE). Ed Process Int J. 2021;10(3):39–52. https://doi.org/10.22521/edupij.2021.103.3
  17. Heeneman S, Driessen EW. The use of a portfolio in postgraduate medical education – Reflect, assess and account, one for each or all in one? GMS J Med Educ. 2017;34(5):Doc57. https://doi.org/10.3205/zma001134
  18. Lockspeiser TM, Kaul P. Using individualized learning plans to facilitate learner-centered teaching. J Pediatr Adolesc Gynecol. 2016;29(3):214–217. https://doi.org/10.1016/j.jpag.2015.10.020
  19. Robinson JD, Persky AM. Developing self-directed learners. Am J Pharm Educ. 2020;84(3):847512. https://doi.org/10.5688/ajpe847512
  20. Driessen E. Do portfolios have a future? Adv Health Sci Educ. 2017; 22(1):221–228. https://doi.org/10.1007/s10459-016-9679-4
  21. Driessen E, Van Tartwijk J, Van Der Vleuten C, Wass V. Portfolios in medical education: Why do they meet with mixed success? A systematic review. Med Educ. 2007;41(12):1224–1233. https://doi.org/10.1111/j.1365-2923.2007.02944.x
  22. Van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: From methods to programmes. Med Educ. 2005;39(3):309–317. https://doi.org/10.1111/j.1365-2929.2005.02094.x
  23. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9–10):855–871. https://doi.org/10.1080/01421590701775453
  24. Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessments in postgraduate medical education: A hermeneutic review. Med Educ. 2020;54(11):981–992. https://doi.org/10.1111/medu.14221
  25. Hejri SM, Jalili M, Masoomi R, Shirazi M, Nedjat S, Norcini J. The utility of mini-clinical evaluation exercise in undergraduate and postgraduate medical education: A BEME review: BEME Guide No.59. Med Teach. 2020;42(2):125–142. https://doi.org/10.1080/0142159X.2019.1652732
  26. Driessen EW, Van Tartwijk J, Govaerts M, Teunissen P, Van der Vleuten CPM. The use of programmatic assessment in the clinical workplace: A Maastricht case report. Med Teach. 2012;34(3):226–231. https://doi.org/10.3109/0142159X.2012.652242
  27. Hrisos S, Illing JC, Burford BC. Portfolio learning for foundation doctors: Early feedback on its use in the clinical workplace. Med Educ. 2008;42(2):214–223. https://doi.org/10.1111/j.1365-2923.2007.02960.x
  28. Belcher R, Jones A, Smith LJ, et al. Qualitative study of the impact of an authentic electronic portfolio in undergraduate medical education. BMC Med Educ. 2014;14(1):265. https://doi.org/10.1186/s12909-014-0265-2
  29. Miller A, Archer J. Impact of workplace-based assessment on doctors’ education and performance: A systematic review. BMJ. 2010;341:c5064. https://doi.org/10.1136/bmj.c5064
  30. The Colleges of Medicine of South Africa. Fellowship of the College of Family Physicians of South Africa [homepage on the Internet]. The College of Family Physicians: CFP(CMSA); No date [cited 2022 Aug 15]. Available from: https://www.cmsa.co.za/view_exam.aspx?QualificationID=9
  31. Couper I, Mash B, Smith S, Schweitzer B. Outcomes for family medicine postgraduate training in South Africa. S Afr Fam Pract. 2012;54(6):501–506. https://doi.org/10.1080/20786204.2012.10874283
  32. Mash R, Steinberg H, Naidoo M. Updated programmatic learning outcomes for the training of family physicians in South Africa. S Afr Fam Pract. 2021;63(1):5342. https://doi.org/10.4102/safp.v63i1.5342
  33. Jenkins L, Mash B, Derese A. Development of a portfolio of learning for postgraduate family medicine training in South Africa: A Delphi study. BMC Fam Pract. 2012;13(1):11. https://doi.org/10.1186/1471-2296-13-11
  34. Jenkins L, Mash B, Derese A. The national portfolio for postgraduate family medicine training in South Africa: A descriptive study of acceptability, educational impact, and usefulness for assessment. BMC Med Educ. 2013;13(1):1–11. https://doi.org/10.1186/1472-6920-13-101
  35. De Swardt M, Jenkins LS, Von Pressentin KB, Mash R. Implementing and evaluating an e-portfolio for postgraduate family medicine training in the Western Cape, South Africa. BMC Med Educ. 2019;19(1):251. https://doi.org/10.1186/s12909-019-1692-x
  36. Akoojee Y, Mash R. Reaching national consensus on the core clinical skill outcomes for family medicine postgraduate training programmes in South Africa. Afr J Prim Health Care Fam Med. 2017;9(1):e1–e8. https://doi.org/10.4102/phcfm.v9i1.1353
  37. Department of Health. National Health Act, 2003: Regulations relating to categories of hospitals. Pretoria: South African Government Publishers, 2012; p.1–26. (Gov Not R185 Gov Gaz 35101).
  38. Van Der Vleuten CPM, Schuwirth LW, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205–214. https://doi.org/10.3109/0142159X.2012.652239
  39. Ndlovu N, Day C, Gray A, Busang J, Mureithi L. South African Health Review: Health and related indicators 2021 [homepage on the Internet]. Durban: Health Systems Trust, No date [cited 2022 Jul 9]; p. 311–355. Available from: https://www.hst.org.za/publications/South%20African%20Health%20Reviews/Chapter29_Indicators_SAHR21_04022022_OD.pdf1
  40. Massyn N, Ndlovu N, Thesandree P. District Health Barometer 2019/20 [homepage on the Internet]. Durban: Health Systems Trust; c2020 [cited 2022 Jul 9]. Available from: https://www.hst.org.za/publications/District%20Health%20Barometers/DHB%202019-20%20Complete%20Book.pdf#search=District%20health%20Barometer%202020%2F202
  41. Massyn N, Tanna G, Day C, Ndlovu N. District Health Barometer: District Health profiles 2017/2018 [homepage on the Internet]. Durban: Health System Trust; c2018 [cited 2023 Apr 02]. Available from: https://www.hst.org.za/publications/District%20Health%20Barometers/Distric%20Health%20Barometer-District%20Health%20Profiles%2020172018.pdf
  42. Jenkins LS. The development and evaluation of a portfolio of learning in the workplace for postgraduate family medicine education in South Africa [Thesis]. Stellenbosch: Stellenbosch University; 2014 [cited 2020 Dec 30]. Available from: https://scholar.sun.ac.za:443/handle/10019.1/86358
  43. Burch VC. The changing landscape of workplace-based assessment. JATT [serial online]. 2019 [cited 2021 Aug 20];20(S2):37–59. Available from: http://www.jattjournal.net/index.php/atp/article/view/143675
  44. Mash R, Malan Z, Blitz J, Edwards J. Improving the quality of clinical training in the workplace: Implementing formative assessment visits. S Afr Fam Pract. 2019;61(6):264–272. https://doi.org/10.1080/20786190.2019.1647639
  45. Erumeda NJ, Couper ID, Thomas LS. A self-assessment study of procedural skills of doctors in peri-urban district hospitals of Gauteng, South Africa. Afr J Prim Health Care Fam Med. 2019;11(1):e1–e8.
  46. Garcia-Rodriguez JA, Dickinson JA, Perez G, et al. Procedural knowledge and skills of residents entering Canadian family medicine programs in Alberta. Fam Med. 2018;50(1):10–21. https://doi.org/10.22454/FamMed.2018.968199
  47. Eno C, Correa R, Stewart DE, et al. Milestones guidebook for residents and fellows [home page on the Internet]. Accreditation Council for Graduate Medical Education (ACGME), c2020 [cited 2023 Aug 16]; p. 1–19. Available from: https://www.acgme.org/globalassets/PDFs/Milestones/MilestonesGuidebookforResidentsFellows.pdf
  48. Hendriks H, Adeniji A, Jenkins L, Mash RJ. The contribution of family physicians to surgical capacity at district hospitals in South Africa. Afr J Prim Health Care Fam Med. 2021;13(1):e1–e3. https://doi.org/10.4102/phcfm.v13i1.3193
  49. Grant J. Principles of curriculum design. In: Swanwick T, Forrest K, O’Brien BC, editors. Understanding medical education: Evidence, theory and practice. 3rd ed. Hoboken, NJ: Wiley-Blackwell, 2019; p. 71–76.
  50. De Villiers M, De Villiers P. The knowledge and skills gap of medical practitioners delivering district hospital services in the Western Cape, South Africa. South Afr Fam Pract. 2006;48(2):16–16c. https://doi.org/10.1080/20786204.2006.10873333
  51. Flinkenflögel M, Sethlare, Cubaka, Makasa M, Guyse A, De Maeseneer J. A scoping review on family medicine in sub-Saharan Africa: Practice, positioning and impact in African health care systems. Hum Resour Health. 2020;18:27. https://doi.org/10.1186/s12960-020-0455-4
  52. Kalu QN, Eshiet AI, Ukpabio EI, Etiuma AU, Monjok E. A rapid need assessment survey of anaesthesia and surgical services in district public hospitals in Cross River State, Nigeria. BJMP [serial online]. 2014 [cited 2023 Apr 02];7(4):a733. Available from: https://www.bjmp.org/content/rapid-need-assessment-survey-anaesthesia-and-surgical-services-district-public-hospitals-cross-river-state-nigeria
  53. Goertzen J. Learning procedural skills in family medicine residency: Comparison of rural and urban programs. Can Fam Physician. 2006;52(5):622–623.

Appendix 1

TABLE 1-A1: Self-reported skills competence levels on 205 core clinical skills in 18 registrar learning portfolios.
