Funding: Supported by the National Natural Science Foundation of China (Nos. 62266054 and 62166050), the Key Program of Fundamental Research Project of Yunnan Science and Technology Plan (No. 202201AS070021), Yunnan Fundamental Research Projects (No. 202401AT070122), the Yunnan International Joint Research and Development Center of China-Laos-Thailand Educational Digitalization (No. 202203AP140006), and the Scientific Research Foundation of Yunnan Provincial Department of Education (No. 2024Y159).
Abstract: In the field of intelligent education, the integration of artificial intelligence, especially deep learning technologies, has garnered significant attention. Knowledge tracing (KT) plays a pivotal role in this field by predicting students' future performance through the analysis of historical interaction data, thereby assisting educators in evaluating knowledge mastery and tailoring instructional strategies. Traditional knowledge tracing methods, largely based on Recurrent Neural Networks (RNNs) and Transformer models, primarily focus on capturing long-term interaction patterns in sequential data. However, these models may neglect crucial short-term dynamics and other relevant features. This paper introduces a novel approach to knowledge tracing by leveraging a pure Multilayer Perceptron (MLP) architecture. We propose MixerKT, a knowledge tracing model based on the HyperMixer framework, which uniquely integrates global and local Mixer feature extractors. This architecture enables more effective extraction of both long-term interaction trends and recent learning behaviors, addressing limitations in current models that may overlook these key aspects. Empirical evaluations on two widely used datasets, ASSISTments2009 and Algebra2005, demonstrate that MixerKT consistently outperforms several state-of-the-art models, including DKT, SAKT, and Separated Self-Attentive Neural Knowledge Tracing (SAINT). Specifically, MixerKT achieves higher prediction accuracy, highlighting its effectiveness in capturing the nuances of learners' knowledge states. These results indicate that our model provides a more comprehensive representation of student learning patterns, enhancing the ability to predict future performance with greater precision.
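To make the described architecture more concrete, the PyTorch sketch below illustrates the general idea of pairing a HyperMixer-style token mixer applied over the whole interaction sequence (global path) with the same mixer restricted to a recent window (local path). All names and hyperparameters here (TokenMixer, MixerKTBlock, d_model, d_hidden, window, the tied hypernetwork weights) are illustrative assumptions for exposition, not the authors' implementation of MixerKT.

```python
# Minimal sketch of a global/local Mixer block for knowledge tracing, assuming
# a HyperMixer-style token mixer whose mixing weights are generated from the input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TokenMixer(nn.Module):
    """HyperMixer-style token mixing with tied hypernetwork weights (assumption)."""
    def __init__(self, d_model, d_hidden):
        super().__init__()
        self.w_gen = nn.Linear(d_model, d_hidden)   # hypernetwork producing per-position mixing weights

    def forward(self, x):                           # x: (batch, seq_len, d_model)
        w = self.w_gen(x)                           # (batch, seq_len, d_hidden)
        mixed = torch.einsum("bsh,bsd->bhd", w, x)  # aggregate information across positions
        return torch.einsum("bsh,bhd->bsd", w, F.gelu(mixed))

class MixerKTBlock(nn.Module):
    """Combines a global mixer (long-term trends) with a local mixer (recent behavior)."""
    def __init__(self, d_model=128, d_hidden=256, window=16):
        super().__init__()
        self.window = window                        # length of the "recent behavior" window (assumption)
        self.global_mixer = TokenMixer(d_model, d_hidden)
        self.local_mixer = TokenMixer(d_model, d_hidden)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                                 nn.Linear(d_hidden, d_model))

    def forward(self, x):                           # x: (batch, seq_len, d_model) interaction embeddings
        w = min(self.window, x.size(1))
        g = self.global_mixer(self.norm1(x))                 # long-term interaction trends
        l = self.local_mixer(self.norm1(x[:, -w:, :]))       # recent learning behavior
        l = F.pad(l, (0, 0, x.size(1) - w, 0))               # align the local output with the full sequence
        x = x + g + l
        return x + self.ffn(self.norm2(x))
```

A complete KT model built on such blocks would embed each (question, response) interaction, stack several blocks, and apply a sigmoid output layer to predict the probability of answering the next question correctly.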
Funding: This work was supported by the National Natural Science Foundation of China (No. 62166050), Yunnan Fundamental Research Projects (No. 202201AS070021), the Yunnan Innovation Team of Education Informatization for Nationalities, the Scientific Technology Innovation Team of Educational Big Data Application Technology in University of Yunnan Province, and the Yunnan Normal University Graduate Research and Innovation Fund in 2020 (No. ysdyjs2020006).
Abstract: Question Generation (QG) is the task of utilizing Artificial Intelligence (AI) technology to generate questions that can be answered by a span of text within a given passage. Existing research on QG in the educational field struggles with two challenges: mainstream QG models based on the sequence-to-sequence (seq2seq) framework fail to utilize the structured information in the passage, and specialized educational QG datasets are lacking. To address these challenges, a specialized QG dataset, the reading comprehension dataset from examinations for QG (named RACE4QG), is reconstructed by applying a new answer-tagging approach and a data-filtering strategy to the RACE dataset. Further, an end-to-end QG model, which can exploit intra- and inter-sentence information to generate better questions, is proposed. In our model, the encoder utilizes a Gated Recurrent Units (GRU) network, which takes the concatenation of word embedding, answer tagging, and Graph Attention neTworks (GAT) embedding as input. The hidden states of the GRU are processed with a gated self-attention mechanism to obtain the final passage-answer representation, which is then fed to the decoder. Results show that our model outperforms baselines on automatic metrics and in human evaluation. Specifically, the model improves over the baseline by 0.44, 1.32, and 1.34 points on the BLEU-4, ROUGE-L, and METEOR metrics, respectively, indicating the effectiveness and reliability of our model. The remaining gap with human expectations also reflects its research potential.
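The PyTorch sketch below outlines an encoder of the kind described above: a bidirectional GRU over the concatenation of word, answer-tag, and graph embeddings, followed by a gated self-attention fusion that produces the passage-answer representation passed to the decoder. The dimensions, the B/I/O tagging scheme, and the assumption that the GAT embeddings arrive precomputed are all illustrative; the GAT itself and the decoder are omitted.

```python
# Minimal sketch of a GRU encoder with gated self-attention, assuming
# precomputed GAT node embeddings aligned with the passage tokens.
import torch
import torch.nn as nn

class QGEncoder(nn.Module):
    def __init__(self, vocab_size, d_word=300, d_tag=16, d_gat=64, d_hidden=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_word)
        self.tag_emb = nn.Embedding(3, d_tag)          # assumed B/I/O answer-tagging scheme
        self.gru = nn.GRU(d_word + d_tag + d_gat, d_hidden,
                          batch_first=True, bidirectional=True)
        d_enc = 2 * d_hidden
        # gated self-attention: fuse each hidden state with its attention context
        self.attn = nn.Linear(d_enc, d_enc, bias=False)
        self.fuse = nn.Linear(2 * d_enc, d_enc)
        self.gate = nn.Linear(2 * d_enc, d_enc)

    def forward(self, tokens, answer_tags, gat_emb):
        # tokens, answer_tags: (batch, seq_len); gat_emb: (batch, seq_len, d_gat)
        x = torch.cat([self.word_emb(tokens), self.tag_emb(answer_tags), gat_emb], dim=-1)
        h, _ = self.gru(x)                             # (batch, seq_len, 2 * d_hidden)
        scores = torch.matmul(self.attn(h), h.transpose(1, 2))   # (batch, seq_len, seq_len)
        ctx = torch.matmul(torch.softmax(scores, dim=-1), h)     # attention context per position
        hc = torch.cat([h, ctx], dim=-1)
        f = torch.tanh(self.fuse(hc))                  # candidate fused representation
        g = torch.sigmoid(self.gate(hc))               # gate controlling how much context to keep
        return g * f + (1 - g) * h                     # passage-answer representation for the decoder
```

A decoder, for example an attention-based GRU with a copy mechanism as is common in seq2seq QG systems, would then attend over this representation to generate the question word by word.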