Categories: Education Research / Dental Education

Leveraging ChatGPT to Support Terminology Learning in Oral Anatomy: A Mixed-Methods Study Among Linguistically Diverse Dental Students

Introduction: The challenge of terminology in EMI dental education

Globalization in higher education has increased English-medium instruction (EMI) across non-Anglophone contexts. In dental education, mastering dense disciplinary terminology is essential for future clinicians, yet linguistically diverse students often face cognitive and linguistic barriers when engaging with oral anatomy. This mixed-methods study investigates how artificial intelligence, specifically ChatGPT, can support terminology learning for international dental students in Malaysia, highlighting how AI can reduce cognitive load while promoting autonomous, self-directed study.

Study context and aims

Malaysia’s role as a regional hub for international education places multilingual learners in EMI settings where English is the primary language of instruction. The study, conducted at SEGi University, focused on Chinese international students enrolled in the Bachelor of Dental Surgery program. Its aims were to (1) assess whether AI-assisted tools improve comprehension of dental terminology, (2) understand how students experience AI in a multilingual EMI curriculum, and (3) explore ethical considerations associated with AI in education.

Methods: A convergent mixed-methods pilot

Researchers combined quantitative and qualitative data in an eight-week pilot using ChatGPT (GPT-4). A 20-item terminology test measured comprehension before and after the intervention, while the Utrecht Work Engagement Scale for Students (UWES-S) gauged engagement. Usage logs captured interaction patterns and time spent, and four focus groups explored student experiences. The sample consisted of 35 Year 1 and Year 2 students, a purposive group chosen for high relevance to oral anatomy terminology challenges. Ethical oversight, informed consent, and data privacy safeguards were integral to the design.

Intervention details: How ChatGPT was used

Participants received an orientation on prompt engineering and safe usage, then used ChatGPT at least twice weekly to clarify definitions, context, and clinical relevance of terms. Faculty took a non-participatory role, ensuring that AI use remained student-centered. The intervention unfolded in three phases: baseline measurement and briefing; active AI-supported study; and post-intervention assessment plus focus groups. Analyses triangulated quantitative results with qualitative insights for robust interpretation.

Key findings: Improved comprehension and engagement

Results indicated a substantial improvement in terminology comprehension, with mean scores rising from 10.4 to 16.1 (out of 20), a statistically significant gain with a large effect size (Cohen’s d = 1.91). Engagement also increased, with UWES-S scores rising from 3.2 to 4.1 on average. Usage patterns showed students averaged 2.6 ChatGPT sessions per week, spending about 8.4 minutes per session. The most common queries involved term clarification, followed by explanations of function and clinical relevance, and comparisons between similar structures.
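For readers unfamiliar with the effect-size statistic cited above, Cohen's d is the difference between two means divided by the pooled standard deviation. The sketch below is illustrative only: the sample score lists are hypothetical, but the second part back-calculates the pooled SD implied by the study's reported means (10.4, 16.1) and d (1.91), which is an inference rather than a reported value.

```python
import math
import statistics

def cohens_d(pre, post):
    """Cohen's d for two score samples, using the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.stdev(pre), statistics.stdev(post)
    # Pooled SD weights each group's variance by its degrees of freedom.
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (statistics.mean(post) - statistics.mean(pre)) / sp

# Hypothetical pre/post terminology scores (out of 20), NOT the study's data:
pre_scores = [9, 11, 10, 12, 10]
post_scores = [15, 17, 16, 16, 17]
print(round(cohens_d(pre_scores, post_scores), 2))

# Pooled SD implied by the study's reported figures (an inference):
implied_sd = (16.1 - 10.4) / 1.91
print(round(implied_sd, 2))  # ≈ 2.98
```

By convention, d values around 0.8 or higher are considered large, so the reported d = 1.91 represents a very large pre-to-post gain.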

Qualitative themes underline value and caveats

Focus-group analyses revealed four dominant themes: (1) clarity and confidence in using terminology, (2) self-directed, personalized learning with a perceived “private tutor” effect, (3) concerns about over-reliance and occasional inaccuracies, and (4) ethical and privacy considerations underscoring the need for safeguards. Students valued autonomy and immediate feedback but stressed the necessity of faculty oversight to verify accuracy and prevent superficial engagement.

Implications for pedagogy: AI as a scaffold, not a substitute

The findings align with Cognitive Load Theory (reducing extraneous load) and Self-Determination Theory (supporting autonomy and competence). AI-assisted terminology learning can lower linguistic barriers, enabling deeper engagement with core anatomical concepts. The study argues for a thoughtful, co-designed approach where educators provide structure, transparency, and ethical guidelines, ensuring AI augments rather than replaces traditional instruction.

Limitations and future directions

As a pilot study without a control group, the research offers limited generalizability. Future work should include diverse institutions, longer follow-up periods, and additional measured variables such as baseline English proficiency and digital literacy. Investigating faculty perspectives and developing co-designed AI integration frameworks will be crucial to sustainable, equitable adoption in EMI dental programs.

Conclusion: Toward equitable, AI-enhanced terminology learning

ChatGPT shows promise as a scalable, accessible support for linguistically diverse dental students facing terminology-heavy curricula. When embedded within a guided, ethically aware framework, AI-assisted terminology learning can enhance comprehension, boost engagement, and promote autonomous study—contributing to more equitable outcomes in global dental education.