Ed Explainable AI
It has been two years since ChatGPT became part of everyday conversation in many areas of life, including higher ed. The potential applications of generative AI (GenAI) are being widely discussed and considered among academics. This week, I would like to share recent research on the possibilities of GenAI in higher ed. The article is entitled “Educational eXplainable Artificial Intelligence (XAI) Tools for personalized learning” by Ogata et al. (2024).
The authors propose that XAI should encompass two essential components:
providing explanations to raise students’ awareness of their learning progress;
explaining recommendations to establish trust and motivate students to continue using the system.
In the field of education there is a long history of research into self-explanation, in which students reflect on and then explain how they arrived at their answers. This has been recognized as a beneficial intervention for promoting metacognitive skills; however, there is also unexplored potential to gain insight into the problems learners experience because of inadequate prerequisite knowledge and skills, or difficulties in applying them to the task. In this study, the authors propose a system in which the student and the AI system explain to each other the reasons behind the decisions they made. These explanations include the student’s self-explanation of their cognition during the answering process, and the system’s explanation of its recommendations based on internal mechanisms and other abstract representations of the model’s algorithms.
The paper proposes a symbiotic learning system that adapts to the learner’s performance and promotes mutual understanding through explanations by both the learner and the system. The Educational eXplainable AI Tools (EXAIT) system aims to combine the benefits of both types of explanation into a single learning tool that co-evolves symbiotically through the learner’s self-explanations and the AI’s generated explanations.
Initially, the system provides AI recommendations of potential learning paths to foster trust and learner awareness. The student then completes a task using a stylus pen to capture their work. The system prompts the student to self-explain their answer process by replaying it interactively and annotating points in time to indicate the knowledge applied to overcome sub-problems. Time series analysis is then applied to the self-explanation and answer-process data to extract information, such as backtracking or stuck points, that could indicate problems with dependent or related knowledge. The ultimate goal of the system is to complete the symbiotic explanation cycle by incorporating the self-explanation analysis into the AI recommendation model.
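The paper does not publish its analysis code, but a minimal sketch may help readers picture what “time series analysis of the answer process” could look like in practice. Everything here is my own assumption: the event format, the 30-second pause threshold, and the idea that a stuck point is either a long pause or a return to an earlier sub-problem.

```python
from dataclasses import dataclass

@dataclass
class StrokeEvent:
    timestamp: float   # seconds since the task started
    step_label: str    # sub-problem the learner annotated in their self-explanation

def find_stuck_points(events, pause_threshold=30.0):
    """Flag long pauses and backtracking in the answer-process time series.

    A 'stuck point' is assumed here to be either a pause longer than
    pause_threshold seconds or a return to an earlier sub-problem.
    """
    stuck = []
    seen_steps = []
    for prev, curr in zip(events, events[1:]):
        gap = curr.timestamp - prev.timestamp
        if gap > pause_threshold:
            stuck.append(("long_pause", prev.step_label, gap))
        if curr.step_label in seen_steps and curr.step_label != prev.step_label:
            stuck.append(("backtrack", curr.step_label, curr.timestamp))
        seen_steps.append(prev.step_label)
    return stuck

# Example: a learner pauses 45 seconds while factoring, then returns to an
# earlier step, which could hint at trouble with a prerequisite skill.
events = [
    StrokeEvent(0.0, "isolate x"),
    StrokeEvent(12.0, "factor the quadratic"),
    StrokeEvent(57.0, "factor the quadratic"),
    StrokeEvent(70.0, "isolate x"),
]
print(find_stuck_points(events))
```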
The learner reviews their answer and, using the answer-process analysis and self-explanation tool, explains why they took those steps. The AI engine collects learners’ answers, self-evaluations, and self-explanations during these processes. It then analyzes the learner’s self-explanations to identify potential stuck points in the answering process and link them to relevant concepts. This information is combined with the learner’s answer history and analyzed to generate a limited number of appropriate recommendations based on the learner’s current situation. Finally, the AI explains why the recommendations have been made. The learner can select learning material that they deem appropriate based on their interpretation of the explanation and what they perceive as meeting their current learning needs.
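Again as a sketch only, here is one way the “limited recommendations plus explanations” step could be structured. The concept map, mastery scores, cutoff, and wording below are illustrative assumptions, not the authors’ actual model.

```python
# Maps each concept to prerequisite concepts a stuck point might indicate.
PREREQUISITES = {
    "factoring quadratics": ["multiplying binomials", "integer factors"],
    "solving linear equations": ["inverse operations"],
}

def recommend(stuck_concepts, mastery, max_items=3):
    """Return up to max_items (material, explanation) pairs.

    stuck_concepts: concepts linked to stuck points in the self-explanation.
    mastery: concept -> score in [0, 1] estimated from the answer history.
    """
    candidates = []
    for concept in stuck_concepts:
        for prereq in PREREQUISITES.get(concept, []):
            score = mastery.get(prereq, 0.0)
            if score < 0.7:  # assumed mastery cutoff
                explanation = (
                    f"You seemed stuck on '{concept}'. Your answer history suggests "
                    f"'{prereq}' (mastery {score:.0%}) may be the gap, so practice "
                    f"on '{prereq}' is recommended."
                )
                candidates.append((score, prereq, explanation))
    # Weakest prerequisites first, limited to a small, reviewable set.
    candidates.sort(key=lambda c: c[0])
    return [(prereq, why) for _, prereq, why in candidates[:max_items]]

recs = recommend(
    stuck_concepts=["factoring quadratics"],
    mastery={"multiplying binomials": 0.4, "integer factors": 0.9},
)
for material, why in recs:
    print(material, "->", why)
```

The point of the explanation string is the paper’s second XAI component: the learner sees not just what to study next, but why the system believes it will help.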
Other studies have also explored the value of student reflection and self-explanation using technology. In Soto’s (2015) study, the author asked students to explain their work on mathematical word problems using screencasts and the Explain Everything app. The results showed that when students explained their thinking, they often used more sophisticated language and focused on the reasoning behind the mathematical concepts, and that the act of creating screencasts encouraged students to actively think about their explanations.
References
Ogata et al. (2024). Educational eXplainable Artificial Intelligence Tools for personalized learning. Research and Practice in Technology Enhanced Learning, 19(19).
Soto, M. (2015). Elementary students' mathematical explanations and attention to audience with screencasts. Journal of Research on Technology in Education, 47(4), 242-258.