Designing and implementing multimodal data collection in classroom to capture metacognition in collaborative learning

Jonna Malmberg, Sanna Järvelä, Hanna Järvenoja, Eetu Haataja, Héctor J. Pijeira-Díaz, Ahsen Cini

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

While empirical research exploring the use of different data channels to study regulation in collaborative learning is emerging, we are still discovering which combinations of data sources are relevant and how data from different channels can properly be combined. This is particularly the case with metacognition. The potential of multiple data channels also lies in their power to be turned into a tool that provides learners with 'on the fly' support for regulation when needed. However, an advanced understanding of regulated learning in collaborative learning contexts, and particularly of metacognitive processes, is essential to harness the benefits of technology in supporting these processes.
Original language: English
Title of host publication: CrossMMLA in practice
Pages: 23-29
Number of pages: 7
Volume: 2610
Publication status: Published - 1 Jan 2020
Event: 2020 CrossMMLA in Practice: Collecting, Annotating and Analyzing Multimodal Data Across Spaces, co-located with the 10th International Conference on Learning Analytics and Knowledge (LAK 2020) - Online, Germany
Duration: 24 Mar 2020 – 24 Mar 2020

Publication series

Series: CEUR Workshop Proceedings
ISSN: 1613-0073

Workshop

Workshop: 2020 CrossMMLA in Practice: Collecting, Annotating and Analyzing Multimodal Data Across Spaces
Abbreviated title: CrossMMLA 2020
Country/Territory: Germany
Period: 24/03/20 – 24/03/20

Keywords

  • Collaborative learning
  • Metacognition
  • Multimodal data
