Personalizing running training with immersive technologies using a multimodal framework

Journal, New Pub
To improve performance and prevent injuries, running training requires proper personalized supervision and planning. This study examines the factors that influence running training programs, as well as the benefits and challenges of personalized plans. It also investigates how multimodal, immersive, and artificial intelligence (AI) technologies can improve personalized training. We conducted an exploratory sequential mixed-methods study with running coaches and, from the analysis, identified the factors that shape the training process. Four key aspects of running training emerged: physical, technical, mental, and body awareness. Building on these aspects, we created a framework that proposes multimodal, immersive, and AI technologies to support personalized running training and lets coaches guide their athletes on each aspect individually. The framework aims to personalize training by showing how coaches and multimodal learning experience agents…
Read More
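
As an editorial illustration of the framework described above, here is a minimal Python sketch of how the four training aspects might be represented and mapped to example sensing signals and supporting technologies. All signal names and technology mappings below are illustrative assumptions, not the published framework itself.

```python
# Hypothetical sketch: the four training aspects identified in the study
# (physical, technical, mental, body awareness) mapped to example signals a
# coach or MLX agent could monitor and example immersive/AI technologies.
# All mappings are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class TrainingAspect:
    name: str
    example_signals: list[str]        # data a coach or agent could monitor
    example_technologies: list[str]   # immersive/AI tools that could support it


FRAMEWORK_SKETCH = [
    TrainingAspect("physical", ["heart rate", "cadence", "GPS pace"],
                   ["smartwatch", "AI load-monitoring dashboard"]),
    TrainingAspect("technical", ["foot strike", "joint angles"],
                   ["IMU sensors", "AR form-feedback overlay"]),
    TrainingAspect("mental", ["perceived exertion", "self-reported mood"],
                   ["VR race simulation", "conversational agent check-ins"]),
    TrainingAspect("body awareness", ["breathing rate", "posture drift"],
                   ["haptic wearable", "audio cueing"]),
]


def coach_summary(aspects: list[TrainingAspect]) -> None:
    """Print a per-aspect overview a coach could use to personalize a plan."""
    for a in aspects:
        print(f"{a.name}: monitor {', '.join(a.example_signals)} "
              f"via {', '.join(a.example_technologies)}")


if __name__ == "__main__":
    coach_summary(FRAMEWORK_SKETCH)
```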
New Pub: the Multimodal Learning Analytics Handbook

Book, Publication
The new book "The Multimodal Learning Analytics Handbook" has finally been published by Springer, edited by Michail Giannakos, Daniel Spikol, Daniele Di Mitri, Kshitij Sharma, Xavier Ochoa, and Rawad Hammad. The book is the first comprehensive resource on multimodal data for learning. It presents state-of-the-art machine learning and AI methods for making sense of complex learning data and explores the role and impact of multimodal data on teaching, learning, and training.
Read More
New Pub: Multimodal Learning Experience for Deliberate Practice

Book chapter
A new book chapter has been published as part of the Multimodal Learning Analytics Handbook (Springer). While digital education technologies have made educational resources more widely available, the modes of interaction they implement remain largely unnatural for the learner. Modern sensor-enabled computer systems make it possible to extend human-computer interfaces for multimodal communication, and advances in Artificial Intelligence allow interpreting the data collected from multimodal and multi-sensor devices. These insights can be used to support deliberate practice with personalised feedback and adaptation through Multimodal Learning Experiences (MLX). The chapter elaborates on the approaches, architectures, and methodologies of five use cases that apply multimodal learning analytics to deliberate practice. Di Mitri, D., Schneider, J., Limbu, B., Mat Sanusi, K.A., Klemke, R. (2022). Multimodal Learning Experience for Deliberate Practice. In: Giannakos,…
Read More
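
To make the MLX idea of sensor-driven feedback for deliberate practice more concrete, here is a small illustrative Python sketch of a feedback loop: a multimodal sample is interpreted and turned into a personalized prompt. The sensor names, thresholds, and rule-based "interpret" step are assumptions for illustration and do not reproduce the architectures described in the chapter.

```python
# Illustrative MLX-style feedback loop for deliberate practice:
# multimodal sensor readings -> interpretation -> personalized feedback.
# Sensor names and thresholds are invented for this sketch.
from typing import Dict


def interpret(sample: Dict[str, float]) -> Dict[str, bool]:
    """Stand-in for an AI interpretation layer over multi-sensor input."""
    return {
        "posture_off": sample.get("torso_lean_deg", 0.0) > 10.0,
        "pace_too_fast": sample.get("pace_min_per_km", 6.0) < 4.5,
    }


def feedback(flags: Dict[str, bool]) -> str:
    """Map interpreted states to a short, personalized prompt for the learner."""
    if flags["posture_off"]:
        return "Straighten your torso and relax your shoulders."
    if flags["pace_too_fast"]:
        return "Ease off the pace to stay in the planned training zone."
    return "Good form, keep going."


if __name__ == "__main__":
    # One simulated multimodal sample (e.g., wearable IMU + GPS watch).
    sample = {"torso_lean_deg": 12.5, "pace_min_per_km": 5.2}
    print(feedback(interpret(sample)))
```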