
Educators’ perceptions and challenges of student assessment process at Prince Sattam Bin Abdulaziz University dentistry program: a qualitative study

Abstract

Background

As part of Saudi Vision 2030, there is increasing demand for dentistry colleges to provide training in the student assessment process. Assessment is the process of accurately determining a learner’s skills across multiple educational domains. The objectives of this study were to investigate teachers’ perspectives, identify assessment challenges, and make recommendations for improving the assessment process of undergraduate dental students at Prince Sattam bin Abdulaziz University in Al-Kharj, Saudi Arabia.

Methods

This qualitative study employed a grounded theory approach with purposive sampling. Four focus group interviews were conducted with course directors from the College of Dentistry at Prince Sattam bin Abdulaziz University (PSAU), using open-ended questions for data collection. Otter software was used for transcription and NVivo 14 for data analysis.

Results

Four themes emerged: perspectives on the assessment process, summative and formative assessments, challenges of assessments, and proposed solutions to the assessment challenges. Most educators perceived assessments as assessments of learning, with the planning and execution of assessments requiring regulation. Different feedback models were occasionally used by examiners to improve student performance. Examiner standardization training, communication, and calibration were lacking, according to the educators in this study.

Conclusion

The challenges of the assessment process in the College of Dentistry at PSAU are multifactorial including the examiners themselves, students, and the college. These challenges indicated the need for a tailor-made, appropriately designed faculty development training program related to different methods of student assessment.


Introduction

A successful dental education program requires the assessment of skills, knowledge, and professional values. Students learn through interactions with their peers and teachers, as well as receiving feedback on their performance. Assessments are frequently used in dental education to track students’ learning progress and determine their competency and readiness for the dental profession [1,2,3]. Having robust academic assessments and associated processes should ensure patient safety, adhering to the practice principle “do no harm” [4]. Furthermore, assessments are frequently the primary focus and motivator for students to engage in the learning process. Given this, it is proposed that “assessment processes should be appropriate, trustworthy, and reliable as a gateway for dental graduates to become qualified to practice independently” [5].

Various approaches have been used to assess student learning outcomes. Many dental schools have traditionally used written exams, multiple-choice questions, true/false assessments, clinical or procedural examinations, and case presentations to assess their students [6]. However, these methods have been used with a focus on memorization and clinical practice, which often contributes to academic failure as it is likely to encourage a superficial learning approach. Moreover, they can lead to the continuation of a traditional, purely dental training approach, with no or only limited connection between courses [6, 7].

Assessment processes have been used to evaluate students’ learning stages, facilitating informed decisions that promote their academic progression. Assessment should focus on knowledge development rather than the product, namely an assignment grade or test result [8, 9]; usually, an educational establishment is responsible for defining an assessment model that complies with legal requirements [9]. Prince Sattam bin Abdulaziz University (PSAU) is a modern university in the Kingdom of Saudi Arabia, with the continuous improvement of teaching and learning processes as one of its strategic objectives. The statement “In alignment with Saudi Arabia’s Human Capability Development Program (one of Vision 2030’s Realization Programs) that seeks to improve the quality of education outcomes, enhance teacher preparation and development, and ensure alignment of educational outcomes with labor market needs” [10] indicates a growing emphasis on advancing assessment practices and professional development in higher education.

Educational assessment occurs in two main contexts. First, teachers evaluate students’ summative achievement over time. Second, educational leaders draw on students’ and educators’ perceptions to assess whether students have met learning objectives. Many studies have been conducted to assess student perceptions related to educational environments [11], learning approaches [12], and the curriculum of dentistry programs [13] in Saudi Arabia. However, to the authors’ knowledge, few studies highlight educators’ perceptions related to the methods of assessment and any related challenges, whether in the PSAU College of Dentistry or globally.

AlAskari et al. [14] investigated the perceptions, implementations, and challenges encountered by faculty members in applying formative assessment to undergraduate medical students at Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia. The faculty members acknowledged the concept and significance of formative assessment; however, its implementation was restricted. Furthermore, Knox et al. [15] examined the perceptions and experiences of educators regarding the assessments implemented within their institution. The educators expressed satisfaction with their institutional assessment while identifying areas for improvement.

Assessment plays a crucial role in confirming that students have attained the necessary competencies to practice safely and effectively. Accordingly, ascertaining educators’ perceptions regarding the assessment process implemented in dentistry courses at PSAU, to identify areas of improvement, merits investigation. Hence, this study’s objectives were to investigate educators’ perspectives, identify assessment challenges, and make recommendations for improving the assessment process of undergraduate dental students at PSAU in Al-Kharj, Saudi Arabia.

Methods

Ethical approval

The study was approved by the Standing Committee of Bioethics Research at Prince Sattam bin Abdulaziz University with approval number (SCBR-110/2023). Written informed consent was obtained from the participants prior to the data collection.

Study settings

This is a qualitative study conducted using a grounded theory approach. This method uses real-world data to examine a process and posit new hypotheses. It is an inductive approach that derives new theories from facts, unlike hypothetico-deductive methods; adopting it therefore makes data collection more efficient [16, 17]. The study was carried out at the PSAU College of Dentistry in Al-Kharj, Saudi Arabia. PSAU is developing significantly at all levels to meet the Kingdom of Saudi Arabia’s Vision 2030. The College of Dentistry was founded in 2007, and ten student cohorts have since graduated. This study used a focus group discussion methodology with open-ended questions to facilitate educators in expounding the challenges that may arise during assessment processes and to make recommendations for improving undergraduate dental student assessments.

A focus group discussion interview guide was developed. This guide includes: a welcome statement describing the research, the discussion objectives, confidentiality, and basic rules; an introduction to the moderator, assistant, and group members; the interview questions; and a researcher’s conclusion to formally end the discussion. Three content validation experts from medical fields were assigned to evaluate the focus group discussion guidelines in respect of the suitability, simplicity, and comprehensibility of the interview questions. Letters of invitation were sent via email. On an evaluation form, the validation experts were asked to rate each question on a scale of “good,” “moderate,” “weak,” and “not suitable”. Furthermore, they were asked to provide feedback on additional questions. Having gathered the relevant information, we focused on selecting study participants based on their experience and willingness to participate in the discussion.

Participants and recruitment

Forty participants from the college’s four dentistry departments responded and expressed interest in participating in the focus group discussion. As inclusion criteria, we chose educators with at least five years of full-time teaching experience in higher education, which is the minimum period for completing a bachelor’s degree in dentistry, as well as a commitment to pedagogical activities at the institution. Only the core courses from the final three years of the undergraduate program were considered, as they are mostly taught by dentistry department educators and include clinical training; the first training year comprises courses that are common to all health sciences programs.

Using these criteria, twenty-eight educators were eligible to participate in the study. However, it was impossible to include everyone because some were on leave or traveling. Finally, twenty-four educators agreed to participate in the study. All educators were approached and informed by the institutional head about the purpose of the study, date, and venue of the focus group discussion.

The recruited educators were randomly assigned to four focus groups. Each focus group consisted of six participants, and open-ended questions (see Table 1 and Supplementary Document) were used to explore and obtain participants’ viewpoints regarding the assessment challenges they had encountered when assessing dental students at PSAU College of Dentistry.

Table 1 Data collection tool (Open ended questions)

Data collection

A focus group discussion interview guide was sent by email to the recruited educators. A pilot focus group interview (including five randomly selected people from the recruited educators) was conducted a week before the data collection. This was done via the Zoom platform to ensure the interview questions were both appropriate and aligned to the objectives and research questions. Furthermore, this was to evaluate the questions’ clarity and measure the time needed for completion, which significantly reduces the risk of errors, saving both time and resources. Four separate focus group discussions, related to the four departments in the college, were held from October to December 2023 by four investigators who are college educators, with the assistance of observers. Each discussion took place in a college meeting room and lasted approximately two hours per group. One expert in medical education science and student assessment methodology moderated all four discussions. The participants were informed of the purpose of the focus group. To ensure the requisite data confidentiality, their written permission was also obtained to record the discussion for the researchers’ use after the interview. Each focus group was audio recorded, and the recordings were securely stored using dedicated computer programs.

Data analysis

The discussions were transcribed verbatim using Otter software, and the qualitative data collected were then analyzed using a thematic approach. Thematic analysis uses both inductive and deductive methods, through reflexive thematic analysis (RTA) and the codebook approach, respectively [18, 19]. The reflexive thematic analysis approach consists of six steps [18, 19]: data familiarization, initial code development, initial theme development, reviewing themes, defining themes, and finally writing the report (Fig. 1) [16]. The first step was to read all the focus group discussion interview data to obtain a general understanding, followed by reading the verbatim transcription and double-checking for accuracy. The second step involved content analysis with NVivo 14 software to extract the meaningful units, which were then encoded. In step three, we revised the initially developed codes and determined themes based on their similarities and differences. The fourth step was to review the themes, eliminating weaker themes or combining them with others to create clearly defined themes. The fifth step was to define the themes [16], giving a clear description of each one (Table 2). The final step was to create the report, which included interview excerpts and a narrative of findings designed to achieve the research objectives.

Fig. 1
figure 1

The reflexive thematic analysis approach components

Table 2 Code book showing themes, sub-themes, number of files coded and references

Results

The demographic information showed that twenty-four participants (70.83% male and 29.17% female) took part in the four FGDs, with six participants in each group. Most participants were assistant professors (45.83%); further details about the participants are given in Table 3.

Table 3 Demographic characteristics of focus group discussion (FGD) participants (n = 24)

A total of four themes and thirteen sub-themes were extracted from the data (Fig. 2). The four main themes with their descriptions are presented in Table 4. These four themes were perspectives on the process of the assessments, summative and formative assessments, challenges of assessments, and suggestions for improving the assessment process. Perspectives from all participants will be used to present and illustrate the subthemes.

Fig. 2
figure 2

Identified themes and sub-themes based on the focus group discussions

Table 4 Four main themes with their descriptions

Perspectives on the process of assessments

General views of examiners

Most of the participants (70.83%) highlighted the importance of the assessment process to track students’ learning progress. However, they focused primarily on summative assessment, particularly in the practical assessment, “In dentistry, we have a practical assessment for courses involving practical skills”. The importance of the diversity within the assessment process, such as formative and summative assessment, utilizing diverse methods of assessment, and the importance of giving feedback to students, was reported by other participants (29.16%). They also raised the importance of developing and reviewing the current faculty development training program, “I think it is good and diverse, but this will not eliminate the need to review and develop it”.

Assessment planning and execution

Most of the participants (60.61%) stated that the course directors of the subject are responsible for planning and carrying out student assessments: “Definitely the execution of the assessment will be the responsibility, of the course director.” “Course directors planning and executing assessments however, the midterm and final exams, should be approved by the department”. However, other participants attributed responsibility for the student assessment process to various college departments, including the assessment unit (9.09%) if one is available, the Vice Dean for Academic Affairs (9.09%), and the department chair (15.15%) “The department Chairman can supervise and monitor the assessment practices” and “has to be the vice deanship of academic affairs”.

College support

All participants stated that their college provided adequate resources, such as libraries, books, materials, and references, to assist them with the student assessment process. Furthermore, the participants appreciated the support from the Deanship for Academic Affairs in organizing workshops and training sessions: “Access to resources like libraries, references, books, materials” and “I really appreciate the help provided by the Vice Dean for Academic Affairs in terms of helping us to deliver the best exams.”

Summative and formative assessments

Utilizing blueprint

Most participants (85.19%) emphasized the importance of creating a test blueprint based on the course objectives and program learning outcomes. A test blueprint defines the knowledge and skills to be assessed, allowing them to create purpose-driven and successful assessments. “I design the blueprint based on the nature of the course that I teach and its learning objectives that must be achieved for students in accordance with the learning outcomes of my college in knowledge, skills and values”. Other participants undervalued the importance of developing a test blueprint.

Utilizing different assessment tools

All participants gave examples of how they evaluate their students in various domains, including knowledge, skills, and attitudes (Fig. 3). Regarding knowledge, they utilize quizzes and written summative exams through direct questions: “We are able to evaluate how well students understand through objective tests, quizzes, and assignments” and “knowledge is fairly assessed by direct questions to the students”. For skills, they stated that they utilize practical and clinical exams in the final exams as well as direct observation during training procedures: “We are in the dentistry college. So here the skills are very important. Assessing them by clinical exam”. Furthermore, participants stated that they used questionnaire surveys or observation as the main strategy to assess students’ professional attitude and mentioned: “Attitude could be assessed by considering the attendance, behaviour and adherence to the university and college regulations”.

Fig. 3
figure 3

Participants’ current strategies for assessing various domains: knowledge, skills, and attitude

Ways of providing feedback

Most participants (91.66%) identified various strategies for providing feedback to students, such as direct written or verbal feedback: “Direct we are doing by written and verbal feedback”, by open discussion: “The way I believe is effective is to discuss directly with students their performance and how they can improve it”, or using the sandwich technique: “One-to-one feedback when it’s provided, I prefer to use sandwich technique.” Verbal feedback offers prompt responses, suitable for contexts requiring rapid communication. Nonetheless, the absence of a written record complicates subsequent review. Open feedback enhances student confidence, professional communication, and trust. However, it may lead to unequal student participation and be time-consuming if not managed correctly. The sandwich feedback method can mitigate the impact of criticism, facilitate the feedback process, and conclude meetings on a positive note. However, this approach may lack clarity, and some students might perceive it as undermining their positive performance.

Other participants (8.35%) raised the importance of the Pendleton feedback model: “It’s very good and very directed to the benefit of the students. Ensuring compassionate relationship with the students. So rather than going and criticising their work, or [saying] they did bad here, I can go for a one-to-one feedback based on Pendleton”.

Challenges of assessments

Educator related challenges

The participants noted various challenges they faced, such as not adhering to standard rubrics when grading assignments, “No clear rubric”; inconsistent grading because of bias, “As an examiner there should not be any bias towards students”; and a lack of calibration or standardization between assessors, “Lack of calibration between assessors, so there is a variation in the marks”. Some participants also highlighted that having a single assessor was not appropriate and affected the validity of the grades: “As recommended by the external examiner, it is advisable to have multiple individuals, rather than just one person referred to as the director, assessing the final marks”. Thus, a second, external examiner was required.

Student-related challenges

The participants identified a variety of student-related challenges they encountered while conducting assessments. One of the most common is a focus on grades rather than skill development, which is what students have been taught since they were in school: “Students always focus on the grades rather than what they learn in the labs or the clinic. [I] always like, emphasize this point for them, what they get out of the school from the skills, for example, or the knowledge is the most important factors that they focus on”. Furthermore, students might reject different assessment methods: “the most common difficulty seen from students is that some of them are not taking the formative exams seriously so do not attend”.

College-related challenges

The participants reported various challenges linked to the college, including a lack of technological infrastructure and of appropriately tailored faculty development training programs: “There is no exam centre or exam software or device assisting in correcting exam papers” and “Lack of financial support to attend workshops, self-development is done by faculty members themselves”.

Suggestions for improvements of student assessment process (Table 5)

Educator-related improvements

The participants offered the following remedies for educator-related challenges: (1) standardized criteria to evaluate all students, “There should be standardised criteria to evaluate all students”; (2) uniform grading criteria, “Adherence to established grading criteria [can] also contribute to minimising challenges associated with the examiner’s role”; (3) commitment to fairness, peer feedback, and continuous improvement, “Continuous improvement and a commitment to fairness are essential in overcoming challenges related to the examiner”; and (4) clear communication of expectations to students, “clear communication of expectations to students”.

Table 5 Summary and recommendation plan for improvement of the assessment process in the study

Student-related improvements

Participants offered two potential strategies for addressing difficulties related to student challenges: enhancing communication with students through different approaches “Maintain open lines of communication with students” and establishing a supportive learning environment “Create support structures, such as tutoring services, study groups, and counselling resources.”

College related improvements

Ongoing professional development through the college assessment unit was the solution offered in respect of college challenges: “Colleges should have assessment unit monitor the assessment’s system and support examiners by offering continuous training sessions and introducing workshops and discussion panel about assessments”.

General suggestions for improvements

Figure 4 shows participants’ suggestions for improving the student assessment process. Additionally, participants said that better student assessment practices could be facilitated by incorporating formative assessments into the curriculum: “Incorporate formative assessments throughout the curriculum to provide ongoing feedback and identify areas that need improvement early. This would require reasonable students/teacher ratio.” Participants also emphasized the significance of evaluating and modernizing teaching methodology by implementing effective strategies such as team-based learning, flipped classrooms, and case-based discussions.

Fig. 4
figure 4

Participants’ suggestions for improvements of student assessment process

Discussion

In pedagogy, assessment is the process of evaluating and providing feedback on skills, knowledge, and attitudes to improve future learning outcomes. It should focus on developing knowledge rather than the product, in the sense of a simple score or test result [20, 21]. This study presented a clear picture of the educators’ perspectives, challenges, and their recommendations for improving the assessment process of undergraduate dental students at PSAU in Saudi Arabia.

The current study used focus group interviews with college educators, conducted by qualified and experienced facilitators, allowing the educators to express their opinions freely and thereby identify student assessment challenges in the College of Dentistry. Focus group discussion is an effective tool for gathering truthful and reliable information by promoting interaction among participants. Furthermore, it can support quantitative data from students’ grades or results and contribute to the modernization of the student assessment process [22]. The findings of this study are significant and will assist administrative stakeholders in reevaluating the assessment process used in PSAU College of Dentistry for the undergraduate dentistry courses, by considering its current challenges and potential for student development.

According to the study’s findings, most assessments in the College of Dentistry at PSAU are summative, that is, assessment of learning. Participants reported that formative assessment is critical for improving the quality of student learning [23]. These findings are in line with AlAskari et al. [14], who evaluated educators’ perceptions of using formative assessment at the medical college of Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia. They found that the medical faculty and staff were generally aware of formative assessment and its usefulness. However, the application domain showed poor implementation by faculty members in their assessment process, due to the large number of students and the increased workload. On the other hand, Morris et al. [24] reported that, even though formative assessment is widely promoted in educational discourse, strong causal evidence supporting its effectiveness in higher education, particularly regarding its impact on academic performance, is still limited and “mixed”. Additionally, Schellekens et al. [25] explored how educators in their institution perceive and achieve assessment quality, as well as how they perceive assessment impact upon student learning. Their study found a gap between theory and practice, in that many educators were not aware of the quality criteria that relate to assessment for learning.

Most of the educators at PSAU confirmed they have full responsibility for planning and executing the assessments of their students. However, there is a need for collaboration between course directors, departments, and academic affairs to ensure the reliability as well as the validity of the implemented assessments from different domains, and this should be done by considering both the program and course learning outcomes. Notably, as suggested by the educators, the assessment unit is needed to regulate the assessment procedure, including planning and execution, and to establish a clear assessment policy to be followed by all examiners in the college. This finding concurred with a previous study [20], which emphasized that rather than working independently to provide separate assessments, it was widely agreed that a longitudinally oriented, systematic, and programmatic approach to assessment would offer greater reliability and enhance the capability to illustrate learning.

Most of the educators utilized a test blueprint based on the course objectives and program learning outcomes. However, others undervalued the importance of developing a test blueprint. Several studies have indicated that using a blueprint can help examiners to minimise the likelihood of either insufficient or excessive representation of assessment contents, which can jeopardise the assessment’s validity [26,27,28]. Furthermore, the PSAU educators utilized assessment tools for assessing dental students in the various domains of knowledge, skills, and attitude that could be acceptable and consistent with previous evidence [28, 29]. However, our research cannot determine the validity and reliability of these assessments, as the findings were reported by the course directors rather than measured directly. Additionally, other assessment tools, including workplace-based assessment (mini-CEX, case-based discussion), are required to assess practical and clinical skills [30]. Furthermore, objective structured clinical examinations (OSCEs) or standardized patients would be more relevant when assessing undergraduate dentistry students, potentially increasing the reliability and validity of assessments in some practical courses [31].

Educators at PSAU’s College of Dentistry acknowledged the importance of providing feedback to dental students, recognizing its critical role in improving the learning process and enhancing student performance. Educators have suggested a series of training activities regarding this topic. In the absence of suitable feedback, students might struggle to gain the necessary knowledge and skills essential for dentistry. Therefore, it is essential for educators to provide timely feedback to enhance students’ learning outcomes [32].

A significant challenge in terms of student assessments at PSAU’s College of Dentistry is the lack of calibration and standardization among assessors, which has a negative impact on the assessment process. Undertaking standardization with all assessors should guarantee a clear and precise grading system, minimize grade discrepancies among students, and improve overall student academic achievement. These benefits would significantly improve the educational environment, which should positively correlate with students’ quality of learning, and influence their cognitive outcomes. Furthermore, the educators expressed the need for a specific exam center and exam software. Additionally, a lack of resources could affect student achievement [33,34,35].

Educators proposed solutions to the challenges encountered when administering assessments. Having adequate resources would facilitate the use of multiple summative exams and quizzes both on-site and via Blackboard. Additionally, educators suggested training, workshops, and discussion panels for assessments as ways to enhance the processes of teaching and assessment. Furthermore, incorporating formative assessments into the curriculum was considered beneficial, and using a rubric would aid in decreasing subjectivity in the assessments [36]. These recommendations are in line with Kasem et al., who reported that assessment requires considering everything that can impact the learning environment. Practically we should focus on learners and trainees, the near future healthcare workforce, to prepare them to practice with contextual competence [37]. Moreover, having qualified external examiners was mentioned by most educators as a solution to some challenges and to improve the reliability of the assessment process. This finding is in line with several studies, which stated that external examiners could comment on the assessment process, whether it is properly designed and applied, and whether it is undertaken in a way that is fair and equitable to all students, and supportive of the achievement of learning outcomes [38, 39].

The study has identified key challenges, including lack of assessor calibration, inadequate feedback models, and limited resources. The findings of this study offer several directions that could improve the student assessment process in PSAU’s educational context. Firstly, it is essential to establish an assessment unit to regulate the college assessment procedures. In addition, faculty policy should emphasize assessment-for-learning quality criteria more explicitly to generate a sense of urgency in changing current practices, for example, by addressing these criteria more explicitly in quality assurance procedures. Secondly, faculty members require training on the different types of assessment. The faculty development training program should be designed and implemented after performing a needs assessment, through cooperation between the Faculty Development Unit and the college’s Medical Education Committee. Customized training activities should be compulsory for each academic rank, since faculty members play a significant role in teaching and are central to ensuring student learning. In alignment with Saudi Arabia’s Vision 2030, dental colleges are committed to enhancing quality education through a strong emphasis on teacher training and professional development [10].

Additionally, participants commented that students’ perceptions of the assessment procedures in the College of Dentistry at PSAU should be investigated. Furthermore, educators’ and students’ perspectives on the challenges of assessment should be explored in dental schools across Saudi Arabian universities.

Limitations to this study are acknowledged in that these findings are specific to the College of Dentistry at PSAU and cannot be generalized. Furthermore, the dual role of the authors as both educators and researchers constitutes a limitation. To mitigate bias, a specialist in medical education science moderated all the discussions, and three content validation experts evaluated the suitability and simplicity of the interview questions in the focus group discussion guidelines. However, student perspectives on the challenges they encountered during their assessments were not included; triangulating student input, educators’ perspectives, and the existing literature could have provided additional insights.

Conclusion

The challenges of the assessment process in the College of Dentistry at PSAU are multifactorial, including the examiners themselves, students, and the college. These challenges indicated the need for a tailored, appropriately designed faculty development training program covering different methods of student assessment. The customized training activities should be compulsory for each academic rank and should count toward faculty members’ promotions. The establishment of an assessment unit is highly recommended to regulate the assessment procedure. The findings of this qualitative research will assist the relevant authorities in enhancing the student assessment process. Discussing these findings in the upcoming department meeting, and subsequently at the college board meeting before the end of the academic year, could facilitate the provision of adequate financial, technological, training, and facility resources to enhance the quality of the assessment process in the next academic year.

Data availability

The datasets generated and/or analyzed during the current study are not publicly available but are available from the corresponding author on reasonable request.

References

  1. Alkharusi H. Effects of classroom assessment practices on students’ achievement goals. Edu Asses. 2008;13(4):243–66.


  2. Baird JA. Does the learning happen inside the black box? Assess Educ Princ Policy Pract. 2011;18(4):343–5.


  3. Boud D. How can practice reshape assessment? In: Joughin G, editor. Assessment, learning and judgement in higher education. 2008. p. 1–15.

  4. Mossey P, Newton J, Stirrups D. Defining, conferring and assessing the skills of the dentist. Br Dent J. 1997;182(4):123.


  5. Taylor CL, Grey N, Satterthwaite JD. Assessing the clinical skills of dental students: a review of the literature. J Educ Learn. 2013;2(1):20–31.


  6. Bissell V, Dawson LJ. Assessment and feedback in dental education: a journey. Br Dent J. 2022;233(6):499–502.


  7. Shaik SA, Almarzuqi A, Almogheer R, Alharbi O, Jalal A, Alorainy M. Assessing Saudi medical students learning approach using the revised two-factor study process questionnaire. Int J Med Educ. 2017;8:292–6.


  8. Albino JE, Young SK, Neumann LM, Kramer GA, Andrieu SC, Henson L, Horn B, Hendricson WD. Assessing dental students’ competence: best practice recommendations in the performance assessment literature and investigation of current practices in predoctoral dental education. J Dent Educ. 2008;72(12):1405–35.


  9. Schuwirth LWT, Van der Vleuten CP. General overview of the theories used in assessment: AMEE guide 57. Med Teach. 2011;33(10):783–97.


  10. Human Capability Development Program. Human Capability Development Program Delivery Plan 2021–2025. Kingdom of Saudi Arabia: Vision 2030 Realization Program; 2021. Last accessed 25/3/2025 at: https://www.vision2030.gov.sa/media/pgid4z3t/2021-2025-human-capability-development-program-delivery-plan-en.pdf

  11. Sabbagh HJ, Bakhaider HA, Abokhashabah HM, Bader MU. Students’ perceptions of the educational environment at King Abdulaziz university faculty of dentistry (KAUFD): a cross-sectional study. BMC Med Educ. 2020;20(1):241.


  12. Alahmari F, Basudan A, Shaheen M. Assessment of learning approaches of Saudi dental students using the revised two-factor study process questionnaire: a cross-sectional study. Open Dent J. 2023;17:1–7.

  13. Al Kuwaiti A. Assessing the students’ perception of the quality of dental program offered in Saudi Arabia. Open Dent J. 2021;15:650–7.

  14. AlAskari AM, Al Elq AH, Mohamed ER, Almulhem MA, Zeeshan M, Alabbad KA, AlSairafi AH, Alsharif AM. Faculty members’ perception, implementation, and challenges of formative assessment in undergraduate medical education: a cross-sectional study. Cureus. 2024;16(11):e74107.

  15. Knox S, Brand C, Sweeney C. Perceptions of paramedic educators on assessments used in the first year of a paramedic programme: a qualitative exploration. BMC Med Educ. 2023;23(1):952.


  16. Glaser BG, Strauss AL. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine; 1967.


  17. Corbin JM, Strauss A. Grounded theory research: procedures, canons, and evaluative criteria. Qual Sociol. 1990;13(1):3–21.

  18. Braun V, Clarke V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qual Res Psychol. 2021;18(3):328–52.


  19. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.


  20. Patel US, Tonni I, Gadbury-Amyot C, Van der Vleuten CP, Escudier M. Assessment in a global context: an international perspective on dental education. Eur J Dent Edu. 2018;22(1):21–7.


  21. Norton L. Using assessment to promote quality learning in higher education. Learning, teaching and assessing in higher education. Dev Ref Prac. 2007;1(1):92–101.


  22. Rauf A, Baig L, Jaffery T, Shafi R. Exploring the trustworthiness and reliability of focus groups for obtaining useful feedback for evaluation of academic programs. Educ Health (Abingdon). 2014 Jan-Apr;27(1):28–33.

  23. Menéndez IY, Napa MA, Moreira ML, Zambrano GG. The importance of formative assessment in the learning teaching process. Int J Humanit Soc Sci. 2019;3(2):238–49.


  24. Morris R, Perry T, Wardle L. Formative assessment and feedback for learning in higher education: a systematic review. Rev Educ. 2021;9(3):614–49. https://doi.org/10.1002/rev3.3292


  25. Schellekens LH, Kremer WDJ, Schaaf MF, Vleuten CPM, Bok HGJ. Between theory and practice: educators’ perceptions on assessment quality criteria and its impact on student learning. Front Educ. 2023;8:1–9.


  26. Adkoli BV, Deepak KK. Blueprinting in assessment. In: Principles of assessment in medical education. 2012. p. 205–13.

  27. Gujarathi AP, Dhakne Palwe S, Patil RN, Borade-Gedam P, Mahale MS, et al. Preparation of blueprints for formative theory assessment of undergraduate medical students in community medicine. MVP J Med Sci. 2020;2(2):100–03.


  28. Higgins JP, Green S, editors. Cochrane handbook for systematic reviews of interventions. 2008.

  29. Leung K, Trevena L, Waters D. Systematic review of instruments for measuring nurses’ knowledge, skills and attitudes for evidence-based practice. J Adv Nurs. 2014;70(10):2181–95.


  30. Saedon H, Saedon MHM, Aggarwal SP. Workplace-based assessment as an educational tool: guide supplement 31.3 viewpoint. Med Teach. 2010;32(9):e369–72.

  31. Regehr GL, Freeman RI, Robb AN, Missiha NA, Heisey RU. OSCE performance evaluations made by standardized patients: comparing checklist and global rating scores. Acad Med. 1999;74(10):135–7.


  32. Schlenz MA, Michel K, Wegner K, Schmidt A, Rehmann P, Wöstmann B. Undergraduate dental students’ perspective on the implementation of digital dentistry in the preclinical curriculum: a questionnaire survey. BMC Oral Health. 2020;20:1–0.


  33. Greenwald R, Hedges LV, Laine RD. The effect of school resources on student achievement. Rev Educ Res. 1996;66(3):361–96.


  34. Hauer KE, Park YS, Bullock JL, Tekian A. My assessments are biased! Measurement and sociocultural approaches to achieve fairness in assessment in medical education. Acad Med. 2023;98(8 Suppl):S16–27.


  35. Holmboe ES, Osman NY, Murphy CM, Kogan JR. The urgency of now: rethinking and improving assessment practices in medical education programs. Acad Med. 2023;98(8 Suppl):S37–49.


  36. Bennett C. Assessment rubrics: thinking inside the boxes. Learn Teach. 2016;9(1):50–72.


  37. Kassam A, de Vries I, Zabar S, Durning SJ, Holmboe E, Hodges B, Boscardin C, Kalet A. The next era of assessment within medical education: exploring intersections of context and implementation. Perspect Med Educ. 2024;13(1):496–506.


  38. Sengupta E, Blessinger P, Ssemwanga A, Cozza B. Introduction to the role of external examining in higher education: challenges and best practices. Emerald Insight. 2021;38(1):3–11.


  39. Burgess M, Phillips H. Authentic assessments and the external examiner. In: The role of external examining in higher education: challenges and best practices. Emerald Insight. 2021;38(1):43–59.



Acknowledgements

This study is supported via funding from Prince Sattam bin Abdulaziz University project number (PSAU/2024/R/1446).

Funding

None.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization, A.S.A., S.A., and A.A.; methodology, A.S.A., S.A., and A.A., software, validation, A.S.A. and S.A.; investigation, A.S.A., and A.A.; data curation, A.S.A.; S.A.; and A.A; writing—original draft preparation, A.S.A.; S.A.; A.A.; and N.A., writing—review and editing, A.S.A. and N.A.; supervision, A.S.A. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Abdullah Saad Alqahtani.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the Standing Committee of Bioethics Research at Prince Sattam bin Abdulaziz University with approval number (SCBR-110/2023). Written informed consent was obtained from the participants prior to the study.

Consent for publication

Not Applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Alqahtani, A.S., Al-Nasser, S., Alzahem, A. et al. Educators’ perceptions and challenges of student assessment process at Prince Sattam Bin Abdulaziz University dentistry program: a qualitative study. BMC Med Educ 25, 640 (2025). https://doi.org/10.1186/s12909-025-07227-2



  • DOI: https://doi.org/10.1186/s12909-025-07227-2
