Introduction: The challenge of terminology in EMI dental education
The globalization of higher education has expanded English-medium instruction (EMI) in non-Anglophone settings. For dental students, who must absorb dense specialized terminology, mastering that vocabulary in English becomes a critical gatekeeper to success. In Malaysia, where international students make up a substantial share of dental program enrolment, tools that scaffold terminology without compromising disciplinary depth are particularly valuable.
Study purpose and design
This mixed-methods pilot investigates whether AI-assisted terminology support—specifically via ChatGPT—can improve comprehension of oral anatomy terms among linguistically diverse dental students in a Malaysian university context. A convergent design collected quantitative outcomes (terminology comprehension tests and engagement scales) and qualitative insights (focus groups and open-ended surveys) to explore not only learning gains but also student experiences, ethical considerations, and interactions with AI in a multilingual EMI curriculum.
Participants and intervention
Participants were 35 Chinese international students in Year 1 and Year 2 enrolled in the Bachelor of Dental Surgery program. Over eight weeks, students used ChatGPT at least twice weekly to clarify definitions, explain functions, and contextualize anatomical terms encountered in oral anatomy. An orientation covered effective prompting, safety, and academic integrity, with faculty taking a non-participatory oversight role to preserve student autonomy.
Data collection and measures
Quantitative data included a 20-item multiple-choice terminology test administered pre- and post-intervention, and an adapted Utrecht Work Engagement Scale for Students (UWES-S). Usage logs captured frequency and duration of AI interactions. Qualitative data came from four focus groups and open-ended survey responses, analyzed thematically to identify shifts in clarity, autonomy, and ethical awareness.
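The article does not reproduce the adapted UWES-S scoring key, but subscale scoring for this kind of Likert instrument typically reduces to averaging the items keyed to each subscale. The sketch below illustrates that idea; the item-to-subscale mapping, the number of items, and the 0-6 rating range are illustrative assumptions, not the instrument's published key.

```python
# Sketch of subscale scoring for an engagement survey like the UWES-S.
# The item indices and the 0-6 Likert range are illustrative assumptions.
from statistics import mean

SUBSCALES = {  # hypothetical mapping of item indices to subscales
    "vigor":      [0, 1, 2],
    "dedication": [3, 4, 5],
    "absorption": [6, 7, 8],
}

def score_engagement(responses):
    """Return the mean rating per subscale for one respondent."""
    return {name: round(mean(responses[i] for i in idx), 2)
            for name, idx in SUBSCALES.items()}

# Example with made-up ratings on a 0-6 scale:
pre_responses = [3, 4, 3, 4, 4, 5, 3, 3, 4]
print(score_engagement(pre_responses))
```

Comparing per-subscale means before and after the intervention is what allows statements about shifts in vigor, dedication, and absorption rather than a single undifferentiated engagement score.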
Key findings: improved comprehension and engagement
Results indicate a substantial improvement in terminology comprehension, with mean scores rising from 10.4 at pre-test to 16.1 at post-test on the 20-item measure. Engagement also increased significantly, reflecting stronger vigor, dedication, and absorption in learning tasks. On average, students used ChatGPT 2.6 times per week, spending about 8.4 minutes per session. The most frequent queries involved clarifying terms (45%), explaining functions or clinical relevance (32%), and comparing related structures (23%).
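A few derived figures follow directly from the reported numbers: the mean gain on the 20-item test, that gain as a share of the scale, and a rough estimate of total AI exposure over the eight weeks. This sketch only restates that arithmetic; the totals are estimates from the reported weekly averages, not logged values.

```python
# Back-of-the-envelope figures derived from the reported results:
# means of 10.4 (pre) and 16.1 (post) on a 20-item test, and usage of
# 2.6 sessions/week at ~8.4 minutes/session over the 8-week study.

def summarize(pre_mean, post_mean, max_score,
              sessions_per_week, minutes_per_session, weeks):
    gain = post_mean - pre_mean
    return {
        "mean_gain": round(gain, 1),                        # raw score points
        "gain_pct_of_scale": round(100 * gain / max_score, 1),
        "est_total_sessions": round(sessions_per_week * weeks, 1),
        "est_total_minutes": round(sessions_per_week * weeks
                                   * minutes_per_session, 1),
    }

print(summarize(10.4, 16.1, 20, 2.6, 8.4, 8))
```

On these inputs the mean gain is 5.7 points (28.5% of the scale), over an estimated ~21 sessions totalling roughly three hours of AI interaction.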
Qualitative insights: autonomy, confidence, and caution
Four main themes emerged. First, learners reported greater clarity and confidence to participate in class discussions. Second, they valued self-directed, private tutoring that supports pacing and personalized explanations. Third, concerns about over-reliance and variable accuracy highlighted the need for faculty guidance and critical appraisal. Fourth, students demonstrated ethical awareness, including data privacy considerations and careful handling of sensitive information.
Implications for pedagogy
The findings align with three theoretical frameworks:
- Self-Determination Theory (SDT): Autonomy and competence were enhanced as students directed their own inquiries.
- Cognitive Load Theory (CLT): AI helped reduce extraneous linguistic load, freeing cognitive resources for core anatomical concepts.
- Disciplinary literacy: AI-assisted terminology supports students’ fluency in biomedical discourse within EMI contexts.
Ethics and implementation considerations
Ethical safeguards—clear usage guidelines, faculty oversight, and ongoing monitoring—were central to the design. Students reported awareness of accuracy limits and privacy concerns, underscoring the need for co-designed AI integration with pedagogy and policy in health professions education.
Limitations and future directions
Limitations include a small, single-institution sample and absence of a control group. Future work should incorporate larger, diverse cohorts, measure prior English proficiency and digital literacy, and explore longitudinal effects across multiple courses. Investigating educator perspectives will further illuminate sustainable, ethical AI adoption in dental education.
Conclusion
AI-assisted terminology learning via ChatGPT can meaningfully enhance comprehension and engagement for linguistically diverse dental students in EMI settings. When embedded within a well-structured, ethically guided framework, AI acts as a valuable supplement to traditional pedagogy, reducing linguistic barriers while preserving disciplinary rigor.