How does learning happen in field epidemiology training programmes? A qualitative study
BMC Medical Education volume 25, Article number: 411 (2025)
Abstract
Background
Despite a 75-year history of building epidemiologic capacity and strengthening public health systems, the learning processes in field epidemiology training programmes (FETPs) remain largely unexamined.
Methods
We codesigned a grounded theory and narrative inquiry qualitative study to fill this gap. The study aimed to understand the learning processes in four FETPs by describing training approaches for field epidemiologists, outlining learning strategies among trainees, and examining principles and practices that align training approaches and learning strategies. Data collection included participant observations and semi-structured interviews with FETP trainees and advisors within programmes in Australia, Japan, Mongolia, and Taiwan.
Results
Analysis revealed that learning occurs as trainees engage in real-world public health contexts, interacting with the people, systems, data, and knowledge within those contexts. Facilitators of the learning process were learning environments (projects, routine placement work, field investigations, and courses), advisor stewardship, and trainee tenacity.
Conclusions
Our findings align with established and contemporary learning theories and suggest that all countries have the tools to build field epidemiology capacity and leadership. To refine these tools, governments, partners, and programme leaders should ensure access to learning environments, fortify advisor stewardship, and foster a culture of resilience among trainees. FETP is among the strongest levers to bolster the workforce for global health security before the next pandemic, and these findings reveal pathways toward better investments.
Background
Field epidemiologists play a critical role in public health. They investigate urgent problems 'to guide, as quickly as possible, the processes of selecting and implementing interventions to lessen or prevent illness or death' ([1], p. 3). These urgent problems include outbreaks [2], armed conflict [3], and disasters [4]. During the COVID-19 pandemic, field epidemiologists worked in surveillance, case investigations, response coordination, point of entry, and risk communication, detecting cases, assessing the risk and spread of SARS-CoV-2, and disseminating information [5].
For almost 75 years, the core training mechanism for field epidemiologists has been the field epidemiology training programme (FETP). These programmes employ service-based, hands-on training that emphasizes learning by doing to develop skilled field epidemiologists who detect, investigate, and control disease outbreaks; conduct surveillance and applied epidemiology studies; and measure intervention impacts [6]. Since the first programme in the USA in 1951, which aimed to turn clinicians into field-savvy epidemiologists [7], almost 100 FETPs have operated in more than 200 countries and territories [8]. The WHO Joint External Evaluation recognizes FETP as a key component of workforce development [9]. For more details about FETP, see [10,11,12].
Despite the critical role of FETPs in public health, systematic research into FETP learning processes is scarce. A few studies have examined supervision quality and impact [13, 14], and one explored alignment of the foundational programme with adult learning theories [15]. Most of the FETP literature comprises commentary (for example, [16,17,18]) or evaluation of programme outputs and outcomes. Al Nsour and colleagues reviewed the FETP evaluation literature and concluded that there was a ‘substantial positive impact of FETPs on trainees and graduates’ ([19], p. 10). The review identifies evaluation outcomes for skills and knowledge, involvement of trainees and graduates in field epidemiology activities, improvements in field epidemiology functions, and others. It also identifies gaps, including programme coverage, funding, mentoring, and recruitment of veterinary and medical school graduates. Evaluations are essential to understanding and improving training programmes because they reveal whether trainees have learned, but they do not reveal how trainees have learned. A gap thus persists in the literature on how trainees learn, how contextual factors influence learning processes, and how emerging adult learning theories might inform training approaches.
To address these gaps in the literature, we sought to understand the learning processes in today’s FETPs. Such understanding could inform strategies to scale training during emergencies, adapt an FETP to context, balance global standardization with local needs, measure learning, and align training with technological advances and modern adult learning theories. Specifically, we explored learning within four FETPs by 1) describing training approaches for field epidemiologists; 2) outlining learning strategies among trainees; and 3) examining principles and practices that align training approaches and learning strategies. In this paper, we describe the common learning processes across programmes.
Methods
This co-designed study, situated in constructivist qualitative research, explored learning processes in the FETPs of Australia, Japan, Mongolia, and Taiwan. Our design employed grounded theory and narrative inquiry because they align with our worldview and research aims. Grounded theory is a social research method that aims to generate explanatory theory inductively from analysis of social research data [20]. Charmaz [21] argues that while quantitative research aims for statistical inferences, grounded theory aims to fit theories emerging from the data with the data, supplying evidence for emerging hypotheses that quantitative research could then pursue. Narrative inquiry focuses on narratives as displays of human existence situated in action, examining lives and lived experiences as sources of knowledge and understanding [22] and revealing our social nature, social structures, and sense-making [23]. Data collection included in-depth semi-structured interviews and participant observations, and analysis followed Charmaz’s [24] grounded theory and Polkinghorne’s [27] analysis of narratives and narrative analysis. For a comprehensive description of the wider study methodology, see [24].
The lead researcher (MMG) invited seven one- and two-year FETPs from the WHO Western Pacific Region to participate in this study. The programmes differed in duration, training cohort size, years in operation, host institution, recruitment strategies, national language, and health system. We limited invitations to one region to minimize variability that might have complicated interpretation and to make the best use of available resources. To participate, programme directors had to agree for a lead trainer to serve as a co-researcher. Six FETPs agreed to participate, with two later withdrawing for administrative reasons. Table 1 provides an overview of the four participating FETPs. The programmes varied in aims, duration, recruitment, cohort sizes, supervision and mentoring, and course structures. Australia’s programme alone offered a master’s degree and required a research thesis. Mongolia’s programme had the shortest duration and was the only one to include a group project.
We codesigned the study with a lead trainer, facilitator, convenor, or supervisor from each participating FETP (authors EF, TS, ASH, and MB). We modelled the co-design on Wadsworth’s Critical Reference Group [25]. During monthly virtual discussions, the lead researcher and lead trainers from participating FETPs discussed perspectives on learning and programmes’ research needs, leading to our initial research objectives: to understand how programmes train, how trainees learn, and what prevents effective training. These discussions informed the study design.
The lead researcher (MMG) conducted participant observations [26] for five consecutive days each in Australia, Japan, and Mongolia. We did not conduct observations in Taiwan because of resource and scheduling constraints. The directors of the programmes discussed the participant observation protocol with all potential participants, clarifying that they could opt out. Directors addressed any concerns and signed informed consent for the programme to participate in the observations. During participant observations, the lead researcher observed and took field notes on advisors and trainees during classroom training sessions, project meetings, outbreak investigation planning, group project development, project presentations, and routine surveillance meetings. He also interacted with advisors and trainees through informal conversations between activities. For this study, we used the term ‘advisor’ for trainers, mentors, facilitators, and supervisors because each programme used different terms, and the same people often performed multiple roles. A field protocol for participant observations is available as a supplementary file.
For interviews, we enrolled a pool of trainees and advisors from each FETP. Because of programme sizes and research participation cultures, we invited all trainees from Japan, Mongolia, and Taiwan and a random sample of 10 trainees from Australia. We invited all core advisors—identified by co-researchers as those most involved in training, mentoring, or supervising current trainees—to enrol. All enrolled participants provided demographic information and informed consent to participate as interviewees and have interviews video recorded. We then selected participants from the enrolled pool for interviews, aiming for diversity in age, gender, and educational background within each programme. Table 2 presents the interview recruitment numbers.
We trained three bilingual interviewers to conduct semi-structured interviews virtually in Mongolian, Mandarin, Japanese, or English. Interview guides are available in a supplementary file. Interviewers transcribed the interviews and provided the transcripts to the interviewees for concurrence before translating them into English. We conducted 47 interviews, 1–3 each with 19 trainees and 6 advisors. Table 3 details the participant characteristics.
The lead researcher began analysis with open and then focused coding following Charmaz [21], using NVivo 14. He read interview transcripts and participant observation field notes employing constant comparison [20, 21] to generate initial codes and coding trees and brought those with relevant data to the monthly meetings with co-researchers. They read transcript excerpts, provided interpretations, and discussed verisimilitude and usefulness of the coding, which informed adjustments to the coding scheme.
For focused coding, we integrated analysis of narratives [27] by identifying stories in the data and coding them according to the coding scheme while remaining open to emerging codes. We followed Polkinghorne’s ([27], p. 12) definition of ‘sustained emplotted accounts with a beginning, middle, and end.’ Plots were conceptual schemes that provide contextual meaning and integrate events into an organized whole. Narratives ranged from complete stories with clear sequences of events and meanings to fragmented stories that required composition through probing questions or researcher synthesis. Again, the lead researcher brought coded data to the co-researchers for interpretation and discussion of verisimilitude and usefulness, further adapting the coding.
Next, the lead researcher employed narrative analysis [27], integrating narratives by programme to construct one case narrative for each programme. By ‘case narrative’ we mean each FETP’s story of trainees coming to and going through the programme, synthesising the observed or recounted experiences in the data into a common account. Doing this reduced the likelihood that any one participant’s experience might overly influence the findings. After constructing the case narratives, we compared them with the remaining (non-story) interview and participant observation data to ensure consistency. For contrasting data, we either incorporated them into the case narrative or adjusted the language of the case narrative to reflect the diversity. The lead researcher met with each co-researcher to assess the verisimilitude of the programme’s case narrative. We then compared the four case narratives, returning to grounded theory’s constant comparison, to identify common learning processes and contextual factors influencing them.
Results
Comparison of the four case narratives revealed a common learning process and its facilitators. Here, we present the learning process, the learning outcomes, and the facilitators of that process.
Learning process: Interacting with and in context
Trainees learned by engaging in real and simulated public health contexts focused primarily on acute infectious diseases (see Fig. 1). These contexts involved interactions with people, data, systems, and knowledge through practical assignments, simulations, discussions, case studies, routine work, projects, and field investigations. Trainees applied existing knowledge and strategies to navigate these situations, reinforcing, revising, or identifying gaps in their knowledge. For example, the following narrative, from an investigation of antimicrobial-resistant pathogens, features ZY’s interactions with teammates, advisors, and hospital staff and shows how ZY’s knowledge and strategies evolved.
Listening to and smoothing communication to control disease—ZY (second-year trainee): [T]he most memorable thing for me was that I thought I had no idea about clinical work (...) I didn't know what nurses do in a hospital (...) So, I think the most important thing I learned was to hear about their work (...)
In the first place, the [advisors] taught us in advance what they are going to ask about when interviewing in the investigation of hospital-acquired infections, and we also prepared an interview guide. Then, we followed the guide during the interviews. I think that learning about those interview processes was an important outcome of FETP.
So, we asked the nurses about how they usually manage infection control in the hospital wards (...) I conducted interviews with the [teammate who] specialized in [nosocomial infections] and the [advisor] and the other FETP trainee, who has clinical experience, collected the information from electronic medical records. I think it would have been difficult if the person in charge of electronic medical records did not have any experience using [the records system. . .] I think we were able to collect information efficiently (...)
I didn’t really have much awareness of the problem of hospital-acquired infections in the first place (...) I heard from the nurses on site that although the nurses and others on site were working very hard, [management and other staff] were not really responding (...) The [advisor] said something like, ‘[FETP] does conduct field investigations and speaks for the infection control staff and nurses—what they want to tell [management], what they want to see done. It is one of our roles to facilitate smooth communication’ (...)
The drug resistance issue was very new to me personally, and I was able to learn a lot about it in terms of general knowledge. The other thing I learned was that the FETP's work, or the impact or role of field investigations, is that sometimes we are called upon to help smooth out a situation where a solution is already there in the field, but something is not going well.
The learning described in narratives occurred mostly in real contexts (e.g., presenting to stakeholders, checking mosquito traps, and evaluating a surveillance system) and occasionally in simulated contexts in courses (e.g., case studies, role plays). All narratives involved interactions with people, data, systems, or knowledge. For example, ZY (above) interacted with hospital staff, teammates, patients, data systems, hospital systems (e.g., staff responsibilities, infection prevention and control), and knowledge of these systems, the data, and antimicrobial resistance; GG’s research project (below) involved interactions with advisors, stakeholders, and data owners while navigating the challenges of traffic accident data systems and the knowledge of those systems and of research methods.
Seeing research with a new perspective—GG (first-year trainee): [I]t was difficult to get data on [traffic] accidents, even though I work for [this] organization. The (...) data cannot be given directly because it contains personal information [. . .so] it was a little difficult to get that data from the specialist in charge (...) Also, it was difficult to make a conclusion [based] on that [traffic accident] number alone (...) In addition, [ . . .] it was difficult to put the [material from the] biostatistics course into a real environment and use it (...)
Prior to the project, GG had aspired to present at the provincial academic research conference but lacked confidence in research methods. Learning through the project gave GG a new perspective on that conference.
GG (first-year trainee): I had thought about doing research before, but my knowledge was lacking. When I wanted to do research, I read it in a book, but I didn't understand it (...) This year, [the] academic conference was held (...) [I realized the] entire process of [the researchers’] detailed research work (...) seems to be very lacking. So, I am very happy to have studied in [FETP. . .] I learned everything about the rationale, goals, and objectives in [FETP] this year and acquired a small amount of knowledge [ . . .] I think that there was a lot of effort to improve and show my knowledge in front of people (...) I improved a lot and showed my knowledge proudly in front of people [at the project defence].
As shown in ZY’s and GG’s narratives, field investigations and projects provided the impetus for contextual engagement. Trainees applied existing knowledge (e.g., medicine, public health, software programmes, laboratory diagnostics) to navigate these contexts. Their strategies included building relationships; leading; observing or shadowing; consulting advisors; obtaining feedback; reading guidance, textbooks, manuscripts, etc.; viewing online tutorials; collaborating with advisors; seeking peer support; and trying and erring. ZY sought guidance from a peer, talked with hospital nurses, took advice from advisors, and followed an interview guide to navigate the investigation, whereas GG applied knowledge of research, national data sources, and biostatistics (developed in FETP), took advisors’ advice, read, and prepared to answer stakeholder questions.
As trainees applied knowledge, they reinforced it, revised it, or identified gaps in it. ZY revised knowledge of interviewing, clinical practice, antimicrobial resistance, and the role of field epidemiologists. GG revised knowledge of research methods, motor vehicle accidents, and presentations. AB described a gap in knowledge of the data that surfaced during a project:
Not just data analysis—AB (second-year trainee): You expect everything to go fine and smooth, the data to be 100% complete (...) But, you know, so if you were just doing the data analysis with the dataset that you were provided [. . .you’re] just a data analyst, not an epidemiologist (...) Like having a research question and then able to answer with what resources you have, what dataset you have, and with the limited resources—if you can answer that research question, I think you would be a nice epidemiologist. That's what I learned here.
(...) I normally meet with my [advisors], and then when I present the data in tables and figures, I'm questioned, like, ‘Does it make sense? So, what is the data missing?’ (...) So that's what I learned: like, okay, so I should have some answers (...) to address those questions (...) So, for that, I need to understand like the surveillance system in place as well (...) Like, what's the surveillance system? [ . . .] And when did it start? (...) So, I need to understand and work with people (...) who have access to that data and who had been in that department for many years (...)
So, yeah, so the question started from my [advisors]. Then, that's what widened my horizon. Okay. I need to be, need to be investigating further. It's not just data analysis.
To address knowledge gaps, trainees employed similar strategies (i.e., building relationships, leading, observing, etc.). AB read documents and consulted colleagues. ZY sought advice from peers with clinical backgrounds. GG read and consulted the advisor. Although trainees had preferred strategies, they adapted them to the situation.
Learning outcomes
Trainees learned more than technical knowledge. Table 4 summarizes the learning outcomes described in trainee interviews, including improvements in knowledge of public health systems, communication, relationships, and decision-making, as well as changes in perspective, in understanding what it means to be a field epidemiologist, and in perceptions of the value of field epidemiology. Some advisors said that the most important outcome was for trainees to know how to handle outbreak investigations. While other types of knowledge, such as surveillance and research, were important, advisors saw them as supporting the ability to handle an investigation. Few trainees had multiple opportunities for field investigations, and fewer still in this study led one.
Facilitators of the learning process: learning environments, advisor stewardship, trainee tenacity
Learning environments
Narratives illustrated learning experiences primarily in routine placement work, projects, field investigations, and intensive courses. We use the term ‘learning environments’ to reflect that these settings did more than provide a backdrop; they facilitated learning by providing context and purpose for interaction with reduced risk. By ‘risk’, we refer to the potential consequences of trainee error. Routine placement work included daily and ad hoc assignments that trainees performed in their placements (not described in the Mongolia FETP). In Taiwan, Japan, and Australia, projects included epidemiological studies, surveillance data analyses, and surveillance evaluations. For field investigations, trainees deployed in teams to investigate predominantly acute public health events, e.g., in communities, military institutions, mining sites, and hospitals. Intensive courses comprised introductory and short courses led by FETP staff or by subject-matter experts at host institutions and local universities.
For example, JS learned that communication in public health involved knowing the audience and delivering clear messages to ensure appropriate and meaningful actions. The trainee linked this learning to multiple experiences in routine placement work, small events that accumulated through repetition, where JS had to convince clinicians, laboratory personnel, and public health workers to share data, order tests, and take action. Immediate feedback from the success or failure of these interactions informed learning. Similarly, HK learned the limits of surveillance through a project evaluating a surveillance system. '[Each trainee] evaluates the surveillance system of one disease from various perspectives, which is both work and study. I think I learned the most about the mechanisms and limitations of surveillance through this activity' (HK, second-year trainee).
Table 5 presents the characteristics and common trainee tasks, roles, and motivations for each learning environment. Although all FETP activities involved learning, each environment balanced learning and practice to a different degree. Routine placement work, projects, and field investigations focused on learning through practice, while intensive courses relied on knowledge transfer through didactic and interactive methods. Each learning environment had distinct timing and feedback characteristics. Trainees interacted with advisors in all environments and with experts in three. Interactions with other groups differed by environment. Advisors reduced risk in routine work, projects, and field investigations by supervising, assigning tasks, selecting topics, and preparing trainees before sending them to non-FETP settings. Trainee motivations to engage in learning environments were mostly about contributing, showing competence, and learning.
Learning experiences in intensive courses were mixed. Trainees expressed frustration more often at didactic sessions. A few identified new subjects, tools, and terms in lectures as ‘interesting’ and as starting points for independent learning. Many felt confused, overwhelmed, or disconnected. During participant observations, trainees often appeared to disengage from lectures after twenty minutes. The accessibility of guest lecturers and their understanding of trainee needs were issues for some trainees, and nonnative English speakers struggled with epidemiological terms and presentations delivered in English. Interactive sessions helped some trainees understand field epidemiology and its relation to public health. Some trainees also appreciated meeting peers and identifying their own contributions and knowledge gaps.
A bit of a blur—BD (second-year trainee): I wouldn't say there was a whole lot of collaboration going on in [the course]. It was more guest speakers coming in and doing their bit and leaving, um, which is great because you get different people's expertise but not—because you get a lot of people who may have variable levels of investment in your [laughs] future and your career (...) But yeah, I don't know, it was a bit of a blur. All the [course sessions] were full on and then over quickly.
Forgetting the next day—SG (first-year trainee): [Dr H] taught us topics related to what is frequency, mean, correlation, etc., for five days. We understood at that time. When he gave a case, [we could] find the average: what is the frequency? Find the [relative risk]. Find the [odds ratio], etc. But we forgot the next day when he re-asked again from us. Therefore, to be more productive (...) we should have sat and [studied] ourselves.
Trainee tenacity
Trainee motivation, perseverance, and resourcefulness emerged as key factors for learning, which we termed ‘tenacity’. Most trainees cited the COVID-19 pandemic as influencing their decision to join the FETP, and some also highlighted improvements in knowledge, contributions to public health, and advancing their careers (e.g., obtaining certificates, connecting with the alumni network, and securing overseas work). Above, we described motivations for engaging in learning environments (see Table 5), which were essential for accessing learning opportunities. However, difficulties arose (e.g., accessing and interpreting data, accessing resources in the native language, applying standard concepts to diverse contexts, and lacking access to advisors) that required perseverance and resourcefulness. These qualities helped trainees stay engaged in learning (see the ZY and GG narratives). Advisors described these qualities as ‘acting on their own ideas’, being ‘ready to bear the load’, showing ‘eagerness to learn’, and ‘love for the profession’. One advisor said, 'With the passion, they can complete everything. They can overcome everything if they have some passion' (ET, advisor). When asked about trainees without passion, ET responded, 'They need more (…)—every time, we need to say the same thing (…) because they do not want to learn from us (…)—they don't hear our voice.'
An unexpected learning resource—KG (first-year trainee): It was really difficult to work on a topic where there was no one to ask (...) I couldn’t find anyone [in this country] who knew to ask (...)
I was able to get information [from a social media website]. There is an epidemiology group [there . . .] A student [in the group] from [a foreign university] contacted me, gave me articles and books, and gradually supported me. There was a language barrier for me when I read the provided books. I don't understand the [topic in] English (...) So, to understand and read in [my language], there were a lot of difficulties (...), many scientific terms, which are difficult to understand in English for a person who does not fully understand [one’s own language . . .]
Interviewer: (...) How much effect did the [research project] have on your work?
KG: (...) There is no longer a situation where our [advisors] criticize me because [now], they say I will do good research [. . . and they] have [sent me] requests to [teach] in this field (...) I aim to conduct [training] after gaining in-depth and comprehensive knowledge. Currently [I am] reading a lot of textbooks.
A few narratives illustrated where more tenacity might have aided learning. In one example, a trainee attempted to learn a statistical programme using multiple resources (i.e., following a handbook, planning, watching an advisor, doing exercises, and joining a peer group) but lacked a project that required the software, lost motivation, and stopped. The trainee explained that there was 'no angle' or 'no goal', so the trainee shifted focus to other priorities. Another trainee narrated the story of SI, a trainee from the cohort who left the programme. SI worked another job while taking part in the introductory course and had difficulty meeting with the assigned advisor. When SI was due to present the research project, he felt unprepared to answer the panel’s potential questions and thus did not attend.
Advisor stewardship
Trainees interacted with advisors more than with any other group, relying on them to navigate contexts and bridge knowledge gaps. Advisors shaped trainee learning by determining assignments, roles, and responsibilities and by managing the risks of trainee error. As illustrated in the next example, they connected trainees with context. In addition to supervising or training, advisors stewarded learning.
Learning more than investigation—NU (second-year trainee): I don't have much experience [in the hospital], and it's a very complicated setting (...) because (...) they have a lot of patients, and (...) they have a high workload. They always feel stressed, and it's very hard to work with them. So, if they do not collaborate with me, it will be very challenging for me to get the data (...)
So, mostly if we want to work with the stakeholders in [this country], I will get the advice from my [advisors] because they have many experiences working with different settings, especially the hospital. So before working with [the stakeholders . . .] the connection is very important. So, one of my [advisors] helped me to connect with the leader of the [hospital. . .] And if the leaders have some agreement about conducting the kind of study, they will like support us to have some meeting, have some discussion and then (...) we will work closely to try to do something like this.
And from many times I have to go to the hospital to collect the data, I realize that I have to [be] familiar with the schedule of the doctors (...) because in the morning they will [be] very busy (...)
And it's very hard to try to balance the benefits between two organizations because (...) they want to have some kind of benefit for the article of the clinical side. And from my side, I want to finish my [investigation] but focus on the epi descriptive analysis (...) I have to discuss a lot with my [advisors], and they advise me [. . .to] propose some kind of solution (...) So, I learn how to work with the hospital and how to try to balance the benefits between the different organizations.
Discussion
We explored learning in four FETPs via qualitative methods and found that trainees learned through interactions with people, data, systems, and knowledge in real and simulated public health contexts. Four learning environments emerged: routine placement work, projects, field investigations, and intensive courses. Advisors stewarded learning by shaping opportunities, whereas tenacity enabled trainees to navigate challenges and remain engaged. Through these processes, trainees acquired both the necessary technical knowledge and the tacit knowledge to apply it in practice.
Foundational and contemporary constructivist theories support our findings by emphasizing the role of interactions with environments in learning. Piaget [28] argued that knowledge is constructed through environmental interactions, leading to equilibrium or disequilibrium and triggering schema assimilation or accommodation. Vygotsky [29] highlighted the importance of social interactions mediated by cultural tools, especially language internalized as thought. Kolb [30] described learning as a cyclical process of experience, reflection, conceptualization, and experimentation. Illeris [31] proposed a triangular model where environmental and psychological processes interact, with content and incentives shaping learning. Together, these theories underscore that learning involves interpreting and revising knowledge through engagement with the environment.
Among the four learning environments identified, routine work emerged as the most robust and intensive courses as the least robust, suggesting that some environments may facilitate learning more than others. Taber and colleagues similarly reported that paramedics and firefighters understood their professions through practice despite extensive courses [32]. In our data, interactive approaches seemed to foster more learning than didactic methods did in intensive courses. Such a finding implies a misalignment that merits exploration, given the resources invested in FETP curriculum development (for example, [33,34,35]).
Multiple factors could explain differences across learning environments. Trainees spend more time in routine work, projects, and field investigations than in courses. Additionally, they may be less likely to remember course experiences in interviews because years of schooling could have made them accustomed to such environments. Less learning and greater difficulty are also expected early in training, when trainees are mostly in courses. Finally, FETP lectures seldom include follow-up activities, unlike most secondary and university courses, which incorporate exercises or essays to reinforce learning.
We argue instead that routine work, projects, and field investigations facilitate learning more than courses do by immersing trainees in the practice of field epidemiology, allowing them to grasp its meaning, knowledge, and competence. Lave and Wenger’s theory of legitimate peripheral participation in communities of practice [36] supports this claim. It posits that learning is a social process situated in the everyday activities of a community of practitioners. Newcomers progress from peripheral toward full participation, acquiring knowledge and competence through interaction with the community’s learning resources. This theory, developed through studies of apprenticeships in health and non-health professions, has also been applied to learning in firefighters and paramedics [32], emergency room residents [37], and probationary constables [38].
Lave and Wenger [36] argue that knowledge is not inherently ‘abstract’ or ‘concrete’ but that a separation arises when newcomers are sequestered from practice, forced to interact with knowledge disconnected from practice. This explanation resonates with FETP trainees in this study who felt overwhelmed or disengaged during courses. Freire’s ‘banking concept of education’ similarly critiques this disconnect, advocating for praxis—integrated theory and practice [39]. Campbell’s study of police newcomers likewise concluded that relegating trainees to observation ignores their prior knowledge and reduces enthusiasm, and recommended more opportunities for involvement in diverse environments and bridging the gap between the training institute and the field [40].
Lave and Wenger [36] propose facilitating access to the community of practice (COP), including its activities, members, resources, tools, and opportunities to participate. Through legitimate peripheral participation, newcomers learn not through abstract curricula but through modified forms of participation that open the COP to them [41]. Wenger clarifies that participation fosters learning because through it knowing and learning align: the nature of competence aligns with the processes of acquiring, sharing, and extending competence, making learning through participation epistemologically correct [41].
Facilitating newcomer access to practice, however, requires peripherality and legitimacy [41]. Peripherality allows newcomers to take part with reduced responsibility, risk, intensity, complexity, or cost of error, while legitimacy—such as recognition as an FETP trainee or mentorship from an experienced field epidemiologist—acknowledges newcomers as potential community members despite their inability to demonstrate competence. These concepts highlight the critical role of advisor stewardship in FETP learning through managing peripherality and bestowing legitimacy.
In summary, learning always occurs, but its nature depends on what and how trainees engage. Engaging in lectures leads to learning lecturing; engaging in field epidemiology practice leads to learning the practice. Routine work, projects, field investigations, and advisor stewardship provide the peripherality and legitimacy needed to open the practice to trainees.
While legitimate peripheral participation explains how learning environments and advisor stewardship facilitate learning, it also raises concerns. First, when learning occurs inherently through social processes, training design risks becoming futile. Second, the COP may be reified too much, prioritizing ‘joining the club’ over the practice itself. Indeed, a few trainees in this study admitted joining FETP ‘for the connections’, ‘to work overseas’, or for the ‘official certificate’. Some stopped routine work or declined field opportunities to draft reports for graduation, which their advisors encouraged. Third, the COP may overlook, ignore, or exclude those who do not fit its worldview. Nicolini and colleagues’ review of COP literature highlights that power struggles in COPs can hinder knowledge sharing, defend entrenched power structures, and delegitimize outsider perspectives [42], which could homogenize FETP trainees and alumni and obstruct innovation and adaptation to a country’s dynamic needs. Among our participants, nonnative English speakers and women with caregiving responsibilities reported additional challenges, raising concerns about inclusivity, especially given FETP’s mandate to serve diverse communities and achieve a multisectoral One Health approach.
To address these concerns, we suggest three approaches. First, ensure trainees have diverse opportunities to engage in legitimate peripheral practice by leveraging the peripherality and legitimacy of routine work, projects, field investigations, and advisor stewardship. Second, members of the COP should be regularly involved in defining the practice and what trainees must demonstrate to join the COP and in clarifying how they will engage trainees at the periphery. Although community members are mostly national FETP alumni, those who take other paths to field epidemiology should be included. Third, reflect and clarify with COP members why the FETP wants trainees from diverse backgrounds and then ensure that trainees from diverse backgrounds have sufficient learning resources (linguistically, socially, professionally, culturally, etc.) and advisor stewardship for legitimate peripheral participation. Check in regularly with trainees to know what they need.
We do not advocate removing intensive courses from FETP. Trainees in this study valued meeting peers, identifying learning resources, and obtaining initial images of field epidemiology. Interactive sessions such as case studies helped trainees achieve these outcomes. Courses can be useful for simulated practice (e.g., interviewing cases, analysing messy data, negotiating findings with stakeholders). The foundational FETP relied on case studies to introduce the discipline, although most learning occurred in two-year placements [15]. Wenger, too, emphasizes that classroom instruction is not useless but that it should supplement the inherent learning potential of practice, not substitute for it. He suggests grounding courses in practice by interspersing information sharing and reflection sessions with peripheral engagement in real-world settings instead of front-loading with classroom training [41].
A last important finding is the role of trainee tenacity in learning. The motivation to join FETP and the motivation to engage in specific tasks helped trainees engage, whereas perseverance and resourcefulness helped them learn through difficulty. Goldman [37], studying emergency medicine residents, also reported that high motivation and self-direction were required for learning. Harris and colleagues reported that self-initiating skills were important for building police constables’ confidence and competence [38]. Illeris specifies incentive, including feelings, emotion, volition, and motivation, as an important dimension providing and directing the necessary mental energy for learning [31]. For Lave and Wenger [36], motivation is at the core of the theory: it drives trainees toward full participation. Wenger explains learning as an experience of identity rather than an accumulation of skills and information because learning transforms who we are and what we can do; it is thus a process of becoming and a source of meaningfulness and energy [41]. His explanation resonates with trainees who want to perform well in front of others, improve technical abilities, or fulfil their perceived professional needs. In contrast, two study participants did not complete their programmes, citing a preference for another COP or a different identity. Programmes would thus do well to look for tenacity when selecting trainees, to clarify the importance of tenacity with trainees at the beginning, and to cultivate tenacity throughout the programme. Research into selecting for and cultivating tenacity in adults would benefit these efforts.
A few strengths [23, 43] of our method are notable. First, we codesigned and co-interpreted this research. The inclusion of lead advisors from the programmes helped reduce bias and ensure that the questions and findings were meaningful to the programmes. Periodic review of data with these advisors increased verisimilitude, validity, and trustworthiness. Second, the high number of trainee interviews, follow-up interviews with most, and participant observations allowed for the triangulation of findings across data sources, further increasing validity. Finally, including multiple programmes with diverse characteristics reduces the likelihood that these findings are driven by one individual, country, culture, or programme.
Two key assumptions underlie our analysis. First, learning reflects a change in knowledge or perspective [28, 29]. While such changes may lead to behaviour modifications, other factors influence behaviour, so we focused on changes in knowledge and perspective as evidence of learning outcomes. Second, learning links with memorable moments that have a lasting impact on understanding and action [30, 44], especially within a one- or two-year programme. We prioritized these moments as evidence of learning.
Our study of four programmes does not represent the diversity of FETPs. We did not include Frontline FETP or low-income countries, and we were unable to recruit advisors from Australia’s programme (the only university-based programme in the study). Therefore, caution should be used when these findings are applied. Nevertheless, this study provides the first systematic description of the learning process in FETP, which can be built upon through research and practice to scale efforts for emergencies, adapt FETP to new contexts, balance global standardization and local needs, inform measures of learning and evaluation, and align training with modern adult learning theories.
Conclusions
We have shown that learning in FETPs occurs through trainees engaging in public health contexts, being stewarded by advisors and driven by tenacity. The identification of these key components suggests that countries in diverse resource settings have the tools to build field epidemiology capacity and public health leadership. To sharpen such tools, governments, partners, and programme leaders should focus on enhancing learning environments, ensuring advisor accessibility, and fostering a culture of tenacity among trainees. Future research should focus on cultural context, advisor stewardship, and trainee tenacity. FETP is among the strongest tools for national, regional, and global health security; we cannot let the next pandemic arrive before fortifying it with smart investments.
Data availability
The datasets generated and analysed during the current study are not publicly available due to individual privacy and confidentiality but are available from the corresponding author on reasonable request.
Abbreviations
- CDC (Taiwan): Taiwan Centers for Disease Control
- COHFE: Competencies for One Health field epidemiology
- COP: Community of Practice
- COVID-19: Coronavirus disease 2019
- FETP: Field Epidemiology Training Programme
- NA: Not Applicable
- SARS-CoV-2: Severe Acute Respiratory Syndrome coronavirus 2
- TEPHINET: Training Programs in Epidemiology and Public Health Interventions Network
- US CDC: United States Centers for Disease Control and Prevention
- USA: United States of America
- WHO: World Health Organization
References
Goodman RA, Buehler JW, Mott JA. Defining field epidemiology. In: Rasmussen SA, Goodman RA, editors. The CDC Field Epidemiology Manual. 4th ed. Oxford: Oxford University Press; 2019. p. 3–19.
Matsui Y, Kamiya H, Kamenosono A, Matsui T, Oishi K, Kidokoro M, et al. First mumps outbreak in a decade: measuring impact of mumps among naïve population—Tokunoshima island, Japan. Open Forum Infect Dis. 2017;4(suppl_1):S243.
Khairy A, Bashier H, Nuh H, Ahmed N, Ali Y, Izzoddeen A, et al. The role of the Field Epidemiology Training Program in the public health emergency response: Sudan armed conflict 2023. Front Public Health. 2024;12:1300084.
Baltazar CS, Rossetto EV. Mozambique field epidemiology and laboratory training programme as responders workforce during idai and kenneth cyclones: A commentary. Pan African Medical Journal. 2020;36:1–5.
Hu AE, Fontaine R, Turcios-Ruiz R, Abedi AA, Williams S, Hilmers A, et al. Field epidemiology training programmes contribute to COVID-19 preparedness and response globally. BMC Public Health. 2022;22(1):63.
O’Carroll PW, Kirk MD, Reddy C, Morgan OW, Baggett HC. The Global Field Epidemiology Roadmap: Enhancing Global Health Security by Accelerating the Development of Field Epidemiology Capacity Worldwide. Health Secur. 2021;19(3):349–51.
Langmuir AD. The Epidemic Intelligence Service of the Center for Disease Control. Public Health Rep. 1980;95(5):470–7.
TEPHINET. Training Programs 2023 [updated December 2023]. Available from: https://www.tephinet.org/training-programmes.
WHO. Joint external evaluation tool: International Health Regulations (2005). Geneva: World Health Organization; 2016. Available from: https://apps.who.int/iris/bitstream/handle/10665/204368/9789241510172_eng.pdf.
Music SI, Schultz MG. Field epidemiology training programs: new international health resources. JAMA: The Journal of the American Medical Association. 1990;263(24):3309–11.
Jones DS, Dicker RC, Fontaine RE, Boore AL, Omolo JO, Ashgar RJ, et al. Building global epidemiology and response capacity with field epidemiology training programmes. Emerg Infect Dis. 2017;23:S158–65.
The Task Force for Global Health, editor. The Global Field Epidemiology Roadmap. Meeting held at the Rockefeller Foundation Bellagio Center; November 26, 2018; Bellagio, Italy.
Forbes O, Davis S, Dyda A, Rosewell A, Williams S, Kirk M, et al. Field epidemiology training programmes in the Asia-Pacific: what is best practice for supervision? Western Pacific surveillance and response journal: WPSAR. 2019;10(4):9–17.
Forbes O, Davis S, Dyda A, Rosewell A, Williams S, Moffatt C, Viney K. Expert perspectives on outbreak investigation training: a quality improvement exercise. Global Biosecurity. 2020;1(4).
Griffith MM, Pamphilon B. Case study: adult learning and public health–a foundational training programme in field epidemiology with lessons and opportunities for collaboration. Int J Lifelong Educ. 2024;43(1):52–66.
Bensyl DM, King ME, Greiner A. Applied Epidemiology Training Needs for the Modern Epidemiologist. Am J Epidemiol. 2019;188(5):830–5.
Griffith MM, Ochirpurev A, Yamagishi T, Nishiki S, Jantsansengee B, Matsui T, et al. An approach to building Field Epidemiology Training Programme (FETP) trainees’ capacities as educators. Western Pacific surveillance and response journal: WPSAR. 2018;9(3):1–3.
Patel MS, Phillips CB. Strengthening field-based training in low and middle-income countries to build public health capacity: Lessons from Australia’s Master of Applied Epidemiology programme. Aust New Zealand Health Policy. 2009;6:5.
Al Nsour M, Khasawneh G, Khader Y, Bashier H. Evaluation of field epidemiology training programmes: a scoping review. Front Epidemiol. 2024;4:1376071.
Glaser BG, Strauss AL. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine Publishing Co.; 1967.
Charmaz K. Constructing grounded theory. 2nd ed. London: SAGE; 2014.
Clandinin DJ. Engaging in narrative inquiry. Walnut Creek, California: Left Coast Press, Inc; 2013.
Patton MQ. Qualitative research & evaluation methods: integrating theory and practice. 4th ed. Thousand Oaks, CA: SAGE Publications; 2015.
Griffith MM, Field E, Huang AS-e, Shimada T, Battsend M, Housen T, et al. How do field epidemiologists learn? A protocol for a qualitative inquiry into learning in field epidemiology training programmes. BMJ Open. 2024;14(1):e077690.
Wadsworth Y. What is Participatory Action Research? Action research international. 1998: Paper 2.
DeWalt KM, DeWalt BR. Participant observation: a guide for fieldworkers. Blue Ridge Summit, PA: AltaMira Press; 2010.
Polkinghorne DE. Narrative configuration in qualitative analysis. Int J Qual Stud Educ. 1995;8(1):5–23.
Piaget J. The origins of intelligence in children. 2nd ed. New York: International Universities Press; 1952.
Vygotsky LS, Cole M. Mind in society: the development of higher psychological processes. Cambridge: Harvard University Press; 1978.
Kolb DA. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
Illeris K. A comprehensive understanding of human learning. In: Illeris K, editor. Contemporary Theories of Learning. 2nd ed. London: Routledge; 2018. p. 1–15.
Taber N, Plumb D, Jolemore S. "Grey" areas and "organized chaos" in emergency response. Journal of Workplace Learning. 2008;20(4):272–85.
Guidance for One Health field epidemiology curriculum development: a supplemental manual to the Competencies for One Health field epidemiology (COHFE) framework. Geneva: World Health Organization, Food and Agriculture Organization of the United Nations and World Organisation for Animal Health; 2023.
Traicoff DA, Suarez-Rangel G, Espinosa-Wilkins Y, Lopez A, Diaz A, Caceres V. Strong and Flexible: Developing a Three-Tiered Curriculum for the Regional Central America Field Epidemiology Training Program. Pedagogy in Health Promotion. 2015;1(2):74–82.
Traicoff DA, Walke HT, Jones DS, Gogstad EK, Imtiaz R, White ME. Replicating success: Developing a standard FETP curriculum. Public Health Rep. 2008;123(SUPPL. 1):28–34.
Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge England: Cambridge University Press; 1991.
Goldman E. Learning in a chaotic environment. J Work Learn. 2009;21(7):555–74.
Harris R, Simons M, Carden P. Peripheral journeys: learning and acceptance of probationary constables. J Work Learn. 2004;16(4):205–18.
Freire P. Pedagogy of the oppressed. 30th anniversary ed. New York: The Continuum International Publishing Group; 2000 (original work published 1968).
Campbell M. Learning in early-career police: coming into the workplace. Asia-Pacific Journal of Cooperative Education. 2009;10(1):19–28.
Wenger E. Communities of Practice: learning, meaning, and identity. 1st ed. Cambridge: Cambridge University Press; 1998. p. 318.
Nicolini D, Pyrko I, Omidvar O, Spanellis A. Understanding communities of practice: taking stock and moving forward. Acad Manag Ann. 2022;16(2):680–718.
Leung L. Validity, reliability, and generalizability in qualitative research. J Family Med Prim Care. 2015;4(3):324–7.
Dewey J. Experience and education. New York: The Macmillan company; 1938.
Acknowledgements
We are grateful to the following individuals and programmes, without whom this research could not have been conducted: Li Fang Li, Yukiko Kamio, Oyuntuya Bayanjargal, the Australian Master of Applied Epidemiology, FETP Japan, Mongolian FETP, and Taiwan FETP.
Funding
This research received no specific financial support.
Author information
Authors and Affiliations
Contributions
MMG contributed to the conception and design of the study, conducted participant observations and interviews, analysed and interpreted data, and drafted and revised the manuscript. EF contributed to the conception and design of the study, interpreted data, and provided input on the manuscript. ASH contributed to the conception and design of the study, interpreted data, and provided input on the manuscript. TS contributed to the conception and design of the study, interpreted data, and provided input on the manuscript. MB contributed to the conception and design of the study and interpreted data. TH contributed to the conception and design of the study and provided input on the manuscript. BP contributed to the conception and design of the study, interpreted data, and provided input on the manuscript. MDK contributed to the conception and design of the study and provided input on the manuscript. All authors have approved the submitted version and have agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work are appropriately investigated, resolved, and the resolution documented in the literature.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
The Australian National University Human Research Ethics Committee (2021/771) and the Institutional Review Board of Centers for Disease Control, Ministry of Health and Welfare, Taiwan (112206), approved this research. No additional institutes required ethics review. Consent was obtained from programme directors for participant observation after discussion with participants, who could opt out. Interviewees consented to participate individually.
Consent for publication
All included interview transcript data has been anonymised as much as possible. Participants whose transcript data are included in the manuscript had the opportunity to review the text and provide consent.
Competing interests
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Griffith, M.M., Field, E., Huang, A.S.-E. et al. How does learning happen in field epidemiology training programmes? A qualitative study. BMC Med Educ 25, 411 (2025). https://doi.org/10.1186/s12909-025-06982-6
DOI: https://doi.org/10.1186/s12909-025-06982-6