Using Large Language Models to Create Educational Materials for the “Databases” Discipline
https://doi.org/10.35596/1729-7648-2025-31-4-5-14
Abstract
This paper examines the use of large language models (LLMs) for creating educational materials. A practical, multi-stage methodology for generating high-quality educational content for the specific discipline of “Databases” is proposed and verified: one LLM generates the content, while a second, independent “reasoning” model checks its quality and correctness. The generated materials were checked for factual errors by comparison with an authoritative source and by a modified “chain-of-verification” algorithm. The results confirm that this approach, when used with modern high-performance LLMs (such as DeepSeek and Gemini), enables the creation of high-quality educational texts with a low probability of hallucinations. The methodology can significantly accelerate the development of reliable educational materials and can be optimized by reducing the number of iterations while maintaining a high-quality initial response.
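The generate-then-verify loop described in the abstract can be sketched as follows. This is an illustrative outline, not code from the paper: `generate` and `verify` are hypothetical stand-ins for calls to two independent LLM APIs (e.g. a generator such as DeepSeek and a “reasoning” verifier such as Gemini), and the iteration cap mirrors the paper's observation that fewer iterations suffice when the initial response is already of high quality.

```python
# Illustrative sketch (assumptions, not the paper's implementation):
# one model drafts educational content, a second model reviews it for
# factual issues, and the draft is revised until the verifier is satisfied.

def generate(topic: str, feedback: str = "") -> str:
    """Hypothetical stand-in for the generator LLM."""
    draft = f"Lecture notes on {topic}."
    return draft + " (revised)" if feedback else draft

def verify(draft: str) -> list[str]:
    """Hypothetical stand-in for the verifier LLM.

    Returns a list of detected factual issues (empty means the draft passed).
    """
    return [] if "(revised)" in draft else ["claim lacks an authoritative source"]

def create_material(topic: str, max_iterations: int = 3) -> str:
    """Iterate generation and verification until no issues remain."""
    feedback = ""
    draft = ""
    for _ in range(max_iterations):
        draft = generate(topic, feedback)
        issues = verify(draft)
        if not issues:
            break
        feedback = "; ".join(issues)  # fed back into the next generation pass
    return draft

print(create_material("relational normalization"))
```

In a real deployment the two roles would be played by separate model endpoints, so that the verifier is not biased toward the generator's own errors.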
About the Author
A. D. Asenchik, Belarus
Asenchik Aleh Daniilovich, Cand. Sci. (Phys. and Math.), Associate Professor, Associate Professor at the Department of Information Technology
Tel.: +375 29 696-75-01
246029, Oktyabrya Ave., 48, Gomel
References
1. Gan W., Wang Z., Li J., Li Y., Zheng Y., Yu J. (2023) Large Language Models in Education: Vision and Opportunities. 2023 IEEE International Conference on Big Data. 4776–4785. https://doi.org/10.1109/BigData59044.2023.10386291.
2. Kasneci E., Seßler K., Küchemann S., Bannert M., Dementieva D., Fischer F., et al. (2023) ChatGPT for Good? On Opportunities and Challenges of Large Language Models for Education. Learning and Individual Differences. 103. https://doi.org/10.1016/j.lindif.2023.102274.
3. Bonner E., Lege R., Frazier E. (2023) Large Language Model-Based Artificial Intelligence in the Language Classroom: Practical Ideas for Teaching. Teaching English with Technology. 23 (1), 23–41.
4. Baidoo-Anu D., Owusu Ansah L. (2023) Education in the Era of Generative Artificial Intelligence (AI): Understanding the Potential Benefits of ChatGPT in Promoting Teaching and Learning. Journal of AI. 7 (1), 52–62. https://doi.org/10.61969/jai.1337500.
5. Mittal U., Sai S., Chamola V., Sangwan D. (2024) A Comprehensive Review on Generative AI for Education. IEEE Access. 12, 142733–142759. https://doi.org/10.1109/ACCESS.2024.3468368.
6. Hennekeuser D., Vaziri D. D., Golchinfar D., El-Khatib K. (2024) Enlarged Education – Exploring the Use of Generative AI to Support Lecturing in Higher Education. International Journal of Artificial Intelligence in Education. https://doi.org/10.1007/s40593-024-00424-y.
7. Întorsureanu I., Oprea S.-V., Bâra A., Vespan D. (2025) Generative AI in Education: Perspectives Through an Academic Lens. Electronics. 14. https://doi.org/10.3390/electronics14051053.
8. Huang Q., Lv C., Lu L., Tu S. (2025) Evaluating the Quality of AI-Generated Digital Educational Resources for University Teaching and Learning. Systems. 13. https://doi.org/10.3390/systems13030174.
9. Ji Z., Lee N., Frieske R., Yu T., Su D., Xu Y., et al. (2023) Survey of Hallucination in Natural Language Generation. ACM Computing Surveys. 55 (12), 1–38. https://doi.org/10.1145/3571730.
10. Lewis P., Perez E., Piktus A., Petroni F., Karpukhin V., Goyal N., et al. (2020) Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks. Advances in Neural Information Processing Systems. 33, 9459–9474.
11. Dhuliawala S., Komeili M., Xu J., Raileanu R., Weston J., Roller S. (2023) Chain-of-Verification Reduces Hallucination in Large Language Models. arXiv:2309.11495. https://doi.org/10.48550/arXiv.2309.11495.
For citations:
Asenchik A.D. Using Large Language Models to Create Educational Materials for the “Databases” Discipline. Digital Transformation. 2025;31(4):5-14. (In Russ.) https://doi.org/10.35596/1729-7648-2025-31-4-5-14