Aytekin, Mehmet Cem and Saygın, Yücel (2024) Discovering prerequisite relations using large language models. Interactive Learning Environments. ISSN 1049-4820 (Print), 1744-5191 (Online). Published Online First. https://dx.doi.org/10.1080/10494820.2024.2375338
Full text not available from this repository.
Official URL: https://dx.doi.org/10.1080/10494820.2024.2375338
Abstract
Automatic detection of prerequisite relations between concepts in education has long been a challenging AI task. Identifying prerequisite relations enables students to study new subjects more effectively and systematically, while allowing instructors to better tailor their learning materials to students' needs. However, to accurately detect these relations, an AI system must understand the context and meaning behind each concept and how it relates to other concepts in the domain. This requires a deep understanding of the educational curriculum and the ability to analyze large amounts of text and data. Large language models (LLMs) are a recent innovation in AI. LLMs can understand and generate human-like text because they are trained on vast amounts of text from the internet, books, articles, and more. LLMs can also be fine-tuned to specialize in tasks such as document summarization, question answering, or detecting user sentiment in reviews. Fine-tuning is done with a smaller, task-specific dataset. In this work, we introduce strategies for fine-tuning LLMs to improve their capability to detect prerequisite relations between educational concepts. To the best of our knowledge, this is the first work that utilizes fine-tuned LLMs for prerequisite detection. Our evaluation results demonstrate that fine-tuned LLMs are effective models for prerequisite detection. Our fine-tuning process also facilitates the generation of explanations that shed light on the reasoning behind prerequisite relations. The datasets we generated and used for fine-tuning are made public for the research community. We hope our contributions can aid in organizing and presenting knowledge in education and serve as a foundation for future research in the field.
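To illustrate the kind of task-specific dataset the abstract describes, the sketch below formats labeled concept pairs as chat-style fine-tuning records. The prompt wording, label strings, and JSONL layout are assumptions for illustration only; they are not the paper's actual prompts or data format.

```python
import json

def make_finetune_example(concept_a: str, concept_b: str, is_prerequisite: bool) -> dict:
    """Build one chat-style fine-tuning record asking whether concept_a
    is a prerequisite of concept_b.

    Illustrative only: the question template and Yes/No labels are
    hypothetical, not taken from the paper's released datasets.
    """
    prompt = (
        f"Is understanding '{concept_a}' a prerequisite for "
        f"learning '{concept_b}'? Answer Yes or No."
    )
    return {
        "messages": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": "Yes" if is_prerequisite else "No"},
        ]
    }

# Serialize a small labeled set of concept pairs to JSONL, the common
# input format for LLM fine-tuning pipelines (one JSON record per line).
pairs = [
    ("variables", "loops", True),
    ("recursion", "calculus", False),
]
jsonl = "\n".join(json.dumps(make_finetune_example(a, b, y)) for a, b, y in pairs)
print(jsonl)
```

A model fine-tuned on records like these can then be queried with a new concept pair and asked for the same Yes/No answer, optionally followed by an explanation of the relation.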
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | deep learning models in education; educational data mining; large language models; prerequisite detection |
| Divisions: | Faculty of Engineering and Natural Sciences |
| Depositing User: | Yücel Saygın |
| Date Deposited: | 23 Aug 2024 16:17 |
| Last Modified: | 23 Aug 2024 16:17 |
| URI: | https://research.sabanciuniv.edu/id/eprint/49789 |