Article

Evaluating the Impact of Learning Management Systems in Geographical Education in Primary School: An Experimental Study on the Importance of Learning Analytics-Based Feedback

by Sergio Tirado-Olivares 1, Ramón Cózar-Gutiérrez 1,*, José Antonio González-Calero 1 and Nuno Dorotea 2

1 LabinTic, Laboratory of Technology Integration in Classroom, Faculty of Education of Albacete, University of Castilla-La Mancha (UCLM), 02071 Albacete, Spain
2 UIDEF, Research and Development Unit in Education and Training, Institute of Education, University of Lisbon, 1649-004 Lisbon, Portugal
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(7), 2616; https://doi.org/10.3390/su16072616
Submission received: 14 February 2024 / Revised: 19 March 2024 / Accepted: 19 March 2024 / Published: 22 March 2024

Abstract

Traditionally, educational processes have focused on learning theoretical geography content, often supplemented with hands-on activities. Advances in technology, however, have enabled the integration of Learning Management Systems (LMSs) such as Moodle, which allow students to learn at their own pace and receive instant, individualized feedback on their daily academic performance, while techniques such as Learning Analytics (LAs) let teachers gather richer information about each learner on a daily basis. Despite these benefits, there is a lack of evidence supporting this educational approach in primary education. This experimental study, involving 80 fifth-grade students studying the territorial and socio-economic organization of their environment, aims to address this gap by comparing two types of feedback: the correct answer alone (control group) and more extensive, explanatory feedback (experimental group). The findings suggest that completing Moodle tasks facilitates learning, irrespective of the type of feedback provided. However, students rated the activities higher in terms of usefulness and satisfaction with the teaching–learning process when extensive feedback was provided. Additionally, the daily data collected proved useful for teachers in predicting students' final outcomes. These results highlight the potential benefits of carrying out activities in Moodle, despite their short duration, particularly at this academic level and within this knowledge domain.

1. Introduction

Recently, there has been increasing acknowledgment of the significance of student-centered learning contexts. Indeed, this pedagogical goal has become a primary focus of 21st-century education [1]. In this regard, authors such as Redecker et al. [2] underscore that remarkable technological advancements are facilitating innovative and precise pedagogical approaches centered on each learner, who is actively engaged in their educational journey. In addition, Pelletier et al. [1] argue that current technological tools enhance educators' ability to monitor their students more effectively, identify learning difficulties, and assess the overall progress of the learning environment. This approach facilitates the renewal of the teaching–learning process from the perspective of both the student and the teacher. Thanks to technology, daily tasks, commonly known as homework, can be completed through more sustainable learning models that avoid excessive use of natural resources such as paper, while also enabling teachers to collect information in a far more automated and sustainable way over time. This, in turn, allows teaching methods to be adapted to student-centered learning, while providing teachers with the tools and insights needed to monitor progress and address challenges in a timely manner, thanks to current educational trends such as Learning Analytics (hereinafter, LAs) [3,4].
In addition, education is confronted with the challenge of student demotivation. Traditional methodologies, which are predominantly lecture-based, may encourage students to perceive content as closed knowledge that they passively receive in class and should memorize. This approach has been widely used in social science disciplines [5] such as geography [6]. For this reason, there has been limited research to date that specifically examines the integrated application of technological and active approaches in disciplines outside the scientific-technological field [7].
Within this evolving educational landscape, different authors underscore the necessity of adapting assessment processes to align with this new active educational paradigm and the benefits of enhanced monitoring capabilities (e.g., [8]). This adaptation involves a departure from traditional summative assessment approaches, which concentrate exclusively on the final outcome, focusing instead on assessing the learning process itself [4,9]. Authors such as Bulut et al. [10] and Gašević et al. [11] support this approach, emphasizing the importance of redefining assessment strategies to reflect the dynamic and multifaceted nature of contemporary learning environments more accurately.
In this context, the potential of Learning Management Systems (LMSs) such as Moodle to transform the teaching–learning process is noteworthy [12]. In particular, Moodle provides a dual benefit: it offers an environment for individual student learning and supports the integration of LAs, enabling more frequent and elaborate feedback on the learning process for both students and teachers [13,14]. Given the limited evidence to date on the use of LAs in educational settings [15], this study employs the LMS Moodle, together with the LAs data gathered during its use, to examine the impact of feedback in the field of social sciences, particularly geography in primary education.

Aim and Research Questions

This research aims to address the existing gaps in the literature regarding the lack of evidence on the use of LMSs such as Moodle in compulsory education, particularly in the context of geographic education. Consequently, this study analyzes the application of feedback mediated by LAs data, not only to support teachers' daily monitoring of the learning process and the development of predictive models based on formative assessment, but also to investigate the impact of its integration on both student learning outcomes and satisfaction.
In view of the above, the present research aims to contribute to the educational community by examining the potential of this approach. The overall objective of this study is addressed through three specific research questions:
RQ01. How does the completion of the LMS tasks and the associated feedback impact the academic performance of students in geography in primary education?
RQ02. What is the potential of the LAs data gathered during the LMS task in predicting the final academic performance of fifth-grade students?
RQ03. Are fifth-grade students satisfied with the LMS task and the feedback provided?

2. Literature Review

In order to answer these questions, it is first necessary to review the literature on how education and the agents involved must adapt to the new educational demands. This is an area in which technology, in general, and both the LMS Moodle and techniques such as LAs in particular can play an important role. Therefore, it is essential to understand what other authors have already said on this topic and the conclusions they have reached.

2.1. Tradition vs. Adaptation to Current Education Needs: From Passive and Paper-Based Learning to an Active, Technological and Student-Centered One

Although contemporary society is changing due to, among other factors, the significant technological development of recent years, the pedagogy of geography, its purpose, and the methodologies and resources implemented have not undergone major modifications [5]. In this context, mere memorization, traditionally associated with the study of social sciences, is losing its relevance. As various authors such as Wineburg [16] point out, information is now just a few mouse clicks away. Therefore, the importance lies not just in knowing the information contained in a textbook, but in understanding how to access it and competently judge its veracity [17,18]. In fact, Brooks et al. [19] underscore the need for students to think geographically. That is, students should become aware of the importance of geographical content in understanding current issues such as multiculturalism, climate change, and socio-political challenges, based on knowledge of their causes and the interaction between different phenomena. This allows them to develop a critical consciousness through the analysis and interpretation of the information presented to them. In line with this, Roberts [20] had already highlighted the importance of knowing geography to make sense of one’s own geographical context and the global context—a context that is increasingly accessible thanks to technology.
Despite the above, geography teaching has a weaker research tradition of educational renewal [21,22]. The teaching of geographical content, and of subjects focused on understanding social aspects over time, has followed a passive model of knowledge transmission. In this model, students are mere recipients of information provided by their teachers during lectures and through reading materials, usually textbooks, which sometimes contain images or maps. After the initial learning phase, students are expected to complete activities from the textbook in their notebooks, which are then corrected in a group setting under the supervision of the teacher on the following day. This can lead to generalized disinterest among students, who consider these subjects of little use in their daily lives [23]. This lack of perceived applicability is partly attributed to the difficulties students have in understanding spatial relationships [24]. The information, in addition to being received passively, has to be absorbed and retained by students for later reproduction in summative tests [9], traditionally exams, which have been frequently used in social science areas [25]. This pedagogical approach contrasts with current educational trends that emphasize the need for a more competency-based approach to learning [5,19]. This raises a question: why not promote a change in didactics using technology?
Authors such as Palacios-Rodríguez et al. [26] suggest that the lack of educational innovation is due to the insufficient training of teachers in digitalization. However, it has already been noted that this transition towards more interactive and practical methods not only improves students’ academic performance, fostering a deeper understanding and a broader appreciation of the relevance of geography in their environment, but also contributes to a more positive perception of the discipline and its contents [27]. Therefore, it is necessary to continue along this line of action. This is a need that is also justified according to current educational approaches that emphasize that teaching should help students learn for themselves and adapt to their educational needs [2]. This can be achieved by technological tools and techniques such as Moodle and LAs [13,28].

2.2. Leveraging Tasks in Moodle to Enhance Active and Autonomous Learning Paths

We must bear in mind that any new activity integrated into the teaching–learning process should have a distinct pedagogical objective [29]. Moodle allows students to complete various types of activities individually and enables subsequent automatic feedback processes, making it an appropriate platform for the implementation of daily and substantive feedback [12]. In particular, this LMS offers features to design and conduct different kinds of activities and concurrently monitor their performance on a daily basis. This allows educators and students to track learners' activity completion and the academic achievements attained [13,30]. In fact, the information gathered enables teachers to predict students' academic performance [10,12,31]. As can be seen, the use of LMSs helps to enhance the teaching–learning process, with a focus on both students and teachers.
On the one hand, the integration of LMS tasks proves beneficial for students. This approach enables students to take on more active and autonomous roles in their learning process, leading to various advantages. For instance, Kliziene et al. [32] highlight the positive impact on student academic achievement resulting from the use of this type of virtual teaching–learning platform. Furthermore, Magalhães et al. [33] demonstrate through their comprehensive literature review that digital assignments tend to be more effective than traditional paper-based assignments. This, coupled with their ability to reduce paper consumption in educational contexts, makes them a potentially sustainable and effective alternative for learning. In addition, Suad et al. [34] also noted increased levels of student motivation. Therefore, based on the findings of these authors, the use of LMSs not only enhances teaching practices through the ability to gather and analyze significant daily information, but also proves beneficial for students and current policies for a more sustainable world. This underscores the importance of integrating this approach broadly, including primary education levels and disciplines such as geography.
On the other hand, regarding teachers, the use of Moodle within the pedagogical context not only assists them in understanding the final academic level attained by students, but also how this learning process has been taking place. This, in essence, facilitates the implementation of contemporary educational trends such as LAs. Although various definitions of LAs exist, one of the most commonly used definitions is proposed by Long et al. [3]: “the measurement, collection, analysis, and data report about both learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (p. 1). The potential of LAs is so significant that the EDUCAUSE Horizon Report, Teaching and Learning Edition, identifies it as one of the most promising emerging technology trends [1].
However, despite this promising interest, LAs are currently underutilized in humanities disciplines [15,35]. One reason is that systematically processing and analyzing student learning data in real time using analog, traditional methods is very time-consuming and difficult. Therefore, further research in this area is needed. Nevertheless, teachers can introduce LAs into their lessons through an LMS, as previously noted. Moodle in particular, one of the most well-known and widely used LMSs in education [14], not only facilitates the collection of data relevant to LAs, but also supports the adaptation of learning sequences, allowing each student to work autonomously at their own pace [36].
By integrating tasks from the LMS Moodle, and the LAs data gathered during these tasks into the pedagogical process, the need to wait for subsequent summative tests to determine whether the content has been assimilated is eliminated. This approach facilitates the faster identification of students’ difficulties and students at risk of dropping out [37], thereby enabling timely intervention [38,39]. This is crucial to conduct effective daily evaluations of students.

2.3. Empowering Both Educators and Learners: The Potential of Learning Analytics to Foster Greater and Daily Feedback throughout the LMS Moodle Sustainably

While it is widely acknowledged that assessment processes need to be updated to provide more frequent and individualized feedback, traditional summative paradigms still persist in practice [4,8,40,41]. This is, among other factors, due to the time-consuming and costly nature of manually collecting individual learner performance data [38]. However, LAs enable the collection of large volumes of data or information in real-time and in diverse contexts, thus monitoring the effectiveness or not of the teaching–learning processes [3,42]. The potential of LAs is so significant that the European Competence Framework (DigCompEdu) includes it among the digital competences that teachers should apply in class for daily monitoring of the teaching–learning process [2].
As Gašević et al. [4] highlight, LAs enables not only the automatic collection of large amounts of information, but also the integration of formative assessment processes as another daily activity. The Learning Analytics Community Exchange project underscores the future significance of LAs. Both education administrations and teachers are expected to increasingly rely on LAs recommendations in their decision-making processes [43]. This underscores the potential of LAs to provide reliable evidence for determining optimal learning trajectories, appropriate instructional materials, and personalized measures for individual learners, thereby facilitating a more personalized learning process. This aligns with one of the key concepts of formative assessment, as it helps teachers to know how the learning process is going and enables students to improve: daily feedback.
Sadler [44] defined formative assessment as a process where the quality of students’ responses is evaluated with the aim of enhancing their competence. This process necessitates the use of feedback mechanisms to gauge the success of the teaching–learning process. Despite the wide range of definitions, as noted by Black and Wiliam [45], there are common elements that underscore its importance. These definitions highlight the application of daily feedback processes, which not only allow teachers to closely monitor classroom activities but also enable students to self-regulate their learning. Consequently, the implementation of formative assessment aligns with the ongoing transformation in education, effectively addressing the limitations of summative assessments. In the specific case of social science, different authors propose that the assessment process should be integrated as any other class activity to regulate and enhance student learning, address mistakes, and make informed decisions [9,23]. Thus, the use of LAs is aligned with this purpose, as it enables more effective monitoring of learning and the contextual factors influencing it [8,10,30].
However, despite the promising prospects of LAs, Dubé and Wen [46], in their review of technology trends in K-12 education, emphasize that while the educational impact of LAs is often emphasized in the literature, there is still a scarcity of concrete evidence on the effective integration of LAs in classroom settings.
Similarly, different authors emphasize the necessity for further research that specifically addresses how to effectively articulate these findings for formative evaluation purposes (e.g., [4,30]). This becomes even more crucial considering the limited evidence available on the integration of LAs in primary education and in domains beyond the scientific-technological field, as different literature reviews state [15,35,47]. The use of LAs has demonstrated promising outcomes [48,49,50]. These studies highlight the effectiveness of LAs in personalizing learning, assessing academic achievement, and improving skills while preventing misconceptions in students. Thus, LAs lead to increased student awareness of their learning process, improved feedback, and enhanced formative assessment processes.
By way of summary, Srinivasa et al. [51] suggest that the successful application of LAs requires collecting student experiences in a virtual environment, such as Moodle, which provides sufficient quality data to build an effective predictive model for student achievement. Simultaneously, this approach should help students to become more aware of their learning process and take appropriate actions. Thus, it is crucial for students to comprehend the reported data, including both failures and successes, in a straightforward manner.
This study adopts this approach with the intention of providing the educational community with a practical and useful method for applying more frequent and daily feedback in Primary Education.

3. Materials and Methods

3.1. Design

To address the research questions previously mentioned, a quantitative, experimental study was conducted. The LMS used, Moodle, facilitates individualized work for students, thereby enabling the pure randomization of all participants. This design was intended to mitigate potential influences from variables extraneous to the study, obviate the need for manually assigning students to new groups, and simultaneously facilitate the replication of the study under similar conditions and contexts in future research [52]. Such are the advantages of this design that authors like Leppink [53] highlight it as the most rigorous way to measure causality in the relationships between variables, given that all pre-existing differences are left to chance. Consequently, this design ensures that both experimental conditions are present in each class set, avoiding potential biases such as students' previous level, educational context, or the teacher's teaching style.
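As a concrete illustration of this within-class randomization, the following minimal Python sketch shows one way such an assignment could be reproduced. The paper does not report an assignment script, so the function name, seed, and student identifiers here are hypothetical.

```python
import random

def assign_conditions(students, seed=2024):
    """Randomly split one class list into control (CG) and experimental (EG)
    halves, so that both conditions are present in every class set.
    `students` is any list of identifiers; the seed is illustrative only."""
    rng = random.Random(seed)
    shuffled = list(students)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"CG": shuffled[:half], "EG": shuffled[half:]}

# Hypothetical class of six students
print(assign_conditions(["s01", "s02", "s03", "s04", "s05", "s06"]))
```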

3.2. Context and Participants

This experimental and quantitative study involved 80 fifth-grade students from six classes across two Spanish public schools. To initiate contact with the participating schools, we obtained the necessary permissions from both the education administration of the autonomous community of Castilla-La Mancha and the ethics committee of the University of Castilla-La Mancha (date 7 April 2022; No. CEIS-632710-Z1N4). Following this, the administration itself informed the schools about the study via email. Upon receiving approval from the schools, two of the participating researchers reached out to each school to provide a detailed explanation of the project, including its objectives, procedure, and expected timeline, which were adapted to fit the teaching plan. Lastly, we presented the project to the students' families, who gave their explicit consent for their children's participation.
To conform to the experimental design, students were randomly assigned to one of two experimental conditions. All students engaged in identical activities and adhered to their usual learning dynamics. However, the feedback provided differed between groups. The experimental group (EG) received feedback on the accuracy of their responses, including the correct answers and comprehensive explanations that justified the correct choice for each question. In contrast, the control group (CG) received feedback only on the correctness of their answers, together with the correct answers themselves. An example of this activity and the two types of feedback is presented in Figure 1.
Initially, the sample size included more students from both schools who agreed to participate in the study. However, for the purpose of analyzing all the research questions, only those students who attended all the sessions comprising the study were considered. After discarding those students, the EG consisted of 43 students (18 females and 25 males), and the CG was composed of 37 students (14 females and 27 males). All students were studying at the same level and covering the same content.

3.3. Procedure

To execute all tasks, an online course in Moodle was established, and all students were registered. Each student was assigned a unique username and password to access the activities in a controlled manner. Activities were initially concealed from the students until the corresponding content was explained in class. Consequently, until they had reviewed the content block with their teacher, the students were unable to complete the questionnaire related to these contents. Similarly, once the session concluded, the activities were again hidden to prevent students from accessing them outside of school. For this reason, continuous communication with the teaching staff was essential. Therefore, all these questionnaires were conducted within the school premises. This approach was designed to have better control over the timing and manner of student completion of such questionnaires.
In particular, the project planning comprised six sessions. The initial session was allocated for the pre-test, during which students completed a questionnaire on the geography topics to be covered in the project: demography and social organization of Spain. The same questionnaire was administered at the end of the project (post-test). The execution of this test, both pre- and post-project, was allotted a 30 min duration for the students. Moreover, this test was conducted on the Moodle platform under conditions mirroring those of a formal examination, with consultation of any external information prohibited.
In the subsequent sessions, students continued to complete questionnaires via Moodle. However, these instructional sessions featured questionnaires designed for a shorter completion time (approximately 10 min), focusing on reviewing previously covered material. Thus, this questionnaire was intended to serve as a continuous evaluation tool while at the same time being included as another activity in the teaching–learning process [9]. Topics such as understanding population pyramids, the population density of the country where this study was conducted, and comprehension of economic sectors were among the content reviewed.
Finally, in the last session, the students not only completed the post-test to address RQ01 and RQ02 but also filled out the LOES-S instrument, aiming to address RQ03 (see Section 3.4). A comprehensive summary of the research procedure is depicted in Figure 2.

3.4. Instruments

To answer the research questions, distinct instruments were employed throughout the study. Initially, Moodle was leveraged to design and administer all the questionnaires. This LMS, in addition to facilitating the design and implementation of the questionnaires, enabled the collection of LAs data, such as questionnaire scores and completion times, which were crucial for both RQ01 and RQ02. Moreover, by conducting all activities within this digital environment, researchers were able to continuously monitor the project without the need for in-person intervention or disruption of the typical classroom dynamics. This approach helped to prevent bias due to the physical presence of researchers. The questionnaires combined the options provided by Moodle: multiple choice, true or false, text completion, image completion, and drag and drop. The test conducted before and after the intervention, which covered the entire topic, consisted of 12 questions. The remaining questionnaires, which focused on specific parts of the syllabus, comprised between 6 and 8 questions, depending on the requirements of each section. This ensured that it could be completed at the end of the class in a short period of time.
In the final session, the Learning Object Evaluation Scale for Students (LOES-S), developed by Kay and Knaack [54], was employed to assess students' satisfaction with the task (RQ03). The items on this five-level Likert scale questionnaire (ranging from 1 (Strongly disagree) to 5 (Strongly agree)) were adapted to our learning object (LMS Moodle). This adaptation aimed to evaluate students' satisfaction based on the three dimensions of LOES-S: (1) learning, (2) quality, and (3) engagement with the learning object. This scale was chosen, first, because it can be adapted to evaluate different educational tools, in our case Moodle and its tasks. Secondly, as its authors state, there is little evidence of validated instruments of this kind for use in K-12 environments. Finally, despite its short length, it allows us to evaluate the three dimensions mentioned above, which are of special interest given our research questions.

3.5. Data Analyses

All data gathered were coded and exported to a database for analysis. In particular, the statistical software R 4.3.2 [55] was used to address all the research questions. To this end, the learning scores (correct answers in each Moodle task) were standardized on a 10-point scale. Then, both descriptive and inferential statistical analyses were carried out. A 95% confidence level was used for all the analyses.
First, to assess potential differences in student achievement (RQ01), a moderation analysis was executed to investigate the relationship between variables. In particular, we examined the impact of feedback on post-test scores, considering group conditions as the independent variable and pre-test scores as the moderator variable. This analysis aimed to determine whether students’ prior knowledge influenced the effect of feedback. Unlike some alternative methods (such as ANCOVA), this analysis accommodates the possibility of an interaction between the independent variable and the covariate. By doing so, we can ascertain whether the strength of the independent variable’s effect varies based on the level of the covariate [56]. A comparison was conducted between CG and EG conditions using the PROCESS macro in R [57]. Subsequently, if significant interactions were observed, the Johnson–Neyman technique [57] was applied to determine the regions of the pre-test scores (moderator) for which the effect of experimental conditions on students’ final academic outcome was significant.
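The authors ran this moderation analysis in R with the PROCESS macro [57]. As a hedged illustration of the underlying model rather than the authors' actual code, the Python sketch below fits the same kind of moderated regression (condition, pre-test, and their interaction) on simulated data, then probes the conditional effect of condition at several pre-test values, which is the quantity the Johnson–Neyman technique scans across the moderator's range. All data, coefficients, and variable names here are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data standing in for the study's scores (10-point scale);
# n, coefficients, and noise are invented for illustration.
rng = np.random.default_rng(0)
n = 80
df = pd.DataFrame({
    "pre": rng.uniform(1, 8, n),    # pre-test score (moderator)
    "cond": rng.integers(0, 2, n),  # 0 = CG, 1 = EG
})
df["post"] = 1 + 0.7 * df["pre"] + 0.5 * df["cond"] + rng.normal(0, 1.5, n)

# Moderated regression: the cond:pre term tests whether the feedback
# effect depends on prior knowledge (what a PROCESS model 1 estimates).
model = smf.ols("post ~ cond * pre", data=df).fit()
print(model.summary())

# Conditional effect of condition at a given pre-test value:
# b_cond + b_interaction * pre, with its standard error taken from the
# covariance matrix -- the quantity Johnson-Neyman examines.
b, V = model.params, model.cov_params()
for pre in (2, 5, 8):
    eff = b["cond"] + b["cond:pre"] * pre
    se = (V.loc["cond", "cond"]
          + 2 * pre * V.loc["cond", "cond:pre"]
          + pre ** 2 * V.loc["cond:pre", "cond:pre"]) ** 0.5
    print(f"pre = {pre}: conditional effect = {eff:.2f} (se = {se:.2f})")
```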
After that, to examine the predictive potential of LAs-formative assessment activities on students’ academic performance (RQ02), a linear regression was performed using the LAs data collected from Moodle throughout the project. To achieve this, the data were initially analyzed to verify that regression assumptions were satisfied, including normality, linearity, and homoscedasticity. Finally, to measure students’ satisfaction with the project (RQ03), the non-parametric Mann–Whitney U test was used as ordinal data were gathered from the LOES-S instrument.
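Again as an illustrative sketch under assumed, simulated data rather than a reproduction of the authors' R analysis, the snippet below shows how the RQ02 regression and its assumption checks (normality, homoscedasticity) could be run, followed by an RQ03-style Mann–Whitney U test on ordinal Likert scores; the column names and toy ratings are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan
from scipy import stats

# Simulated stand-ins for the study's variables (scores on a 10-point scale).
rng = np.random.default_rng(1)
n = 80
df = pd.DataFrame({
    "lms_mean": rng.uniform(3, 9, n),  # mean daily Moodle task score
    "pre": rng.uniform(1, 8, n),
    "cond": rng.integers(0, 2, n),     # 0 = CG, 1 = EG
})
df["post"] = 0.4 + 0.8 * df["lms_mean"] + 0.1 * df["pre"] + rng.normal(0, 1, n)

# RQ02: linear regression predicting post-test scores (cf. Table 2).
fit = smf.ols("post ~ lms_mean + pre + cond", data=df).fit()
print(fit.params, fit.rsquared)

# Assumption checks: normality of residuals and homoscedasticity.
print(stats.shapiro(fit.resid))                     # Shapiro-Wilk
print(het_breuschpagan(fit.resid, fit.model.exog))  # Breusch-Pagan

# RQ03: Mann-Whitney U on ordinal Likert ratings by condition (toy data).
cg = [4, 3, 4, 5, 3, 4, 3, 4]
eg = [5, 4, 5, 4, 5, 5, 4, 5]
print(stats.mannwhitneyu(cg, eg, alternative="two-sided"))
```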

4. Results

The obtained results have been systematically organized to address the research questions. Firstly, we present the data derived from the analysis of the impact of utilizing extensive feedback in LMS tasks on students’ academic achievement (RQ01). Subsequently, we discuss how the LAs data, gathered from the execution of these tasks, enables the prediction of this academic achievement (RQ02). Lastly, we examine the level of satisfaction exhibited by the students with the tasks performed (RQ03). This structure ensures a comprehensive and coherent presentation of the findings, aligning with the standards of academic research papers.

4.1. Is the Inclusion of Extensive Feedback Beneficial for the Improvement of Students’ Academic Achievement?

In relation to the first research question, Table 1 presents the descriptive results obtained. Before the intervention, students demonstrated a comparable medium–low knowledge level. They appeared to possess some understanding, albeit insufficient, about the geography content they were expected to work on. Following the intervention, there was an increase in the level of knowledge attained. In this regard, students in the EG appeared to outperform those in the CG.
This observation underscores the efficacy of the intervention in bolstering the students’ comprehension of the subject matter. However, an inferential analysis was conducted to clearly identify the differences (if any) between both experimental conditions.
To examine whether these differences were statistically significant, a moderation analysis with one moderator (pre-test scores) was carried out. This analysis enables the evaluation of the impact (if any) of extensive feedback on final academic performance. Upon confirming the absence of an interaction between pre-test scores and experimental conditions (F(1, 76) = 0.63, p = 0.43), the moderation analysis showed that students' prior geographical knowledge significantly affected the final knowledge reached after the intervention (b = 0.72, se = 0.15, 95% CI [0.42, 1.03], t = 4.77, p < 0.001). However, no statistically significant differences were observed in the post-test between the experimental conditions (b = 1.05, se = 1.04, 95% CI [−1.02, 3.12], t = 1.01, p = 0.317).
Students’ prior knowledge is a relevant variable that should be considered when implementing and evaluating interventions, as it could significantly moderate the effectiveness of such strategies. However, the Johnson–Neyman analysis confirmed that the effect of extensive feedback was not statistically significant for students compared to short feedback, regardless of their pre-test scores (see Figure 3). Thus, although the extensive feedback appears to be particularly beneficial for those with a low prior level, no significant differences were found with respect to the control condition.

4.2. Can LAs Data Collected during LMS Tasks Be Used to Predict Student Academic Achievement?

To assess the potential of utilizing LAs data collected during LMS tasks as a tool for formative assessment, a multilinear regression analysis was conducted (Table 2). This model aimed to predict post-test scores based on the experimental condition, pre-test, and mean scores obtained during the daily LMS tasks.
A significant regression equation was found (F(3, 76) = 35.91; p < 0.001; R2 = 0.58). As shown in Table 2, participants' predicted post-test scores are calculated as 0.36 + 0.82 (LMS task) + 0.11 (pre-test) + 0.15 (experimental condition). Here, LMS scores are measured on a 10-point scale and the experimental condition is coded as 0 for the CG and 1 for the EG. Therefore, a 1-point increase in LMS task scores was associated with a 0.82-point increase in post-test scores for students in both groups, regardless of the experimental condition and their prior knowledge. Thus, the daily LMS scores were significant predictors of the post-test scores.
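Written out as a formula, with a worked example using hypothetical scores (a student with a mean LMS task score of 7 and a pre-test score of 5 in the EG), the fitted model reads:

```latex
\hat{Y}_{\mathrm{post}} = 0.36 + 0.82\,\bar{X}_{\mathrm{LMS}}
  + 0.11\,X_{\mathrm{pre}} + 0.15\,Z_{\mathrm{cond}},
\qquad Z_{\mathrm{cond}} =
\begin{cases} 0 & \text{CG} \\ 1 & \text{EG} \end{cases}
% Worked example (hypothetical student): LMS mean = 7, pre-test = 5, EG:
% 0.36 + 0.82(7) + 0.11(5) + 0.15(1) = 0.36 + 5.74 + 0.55 + 0.15 = 6.80
```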

4.3. What Is the Level of Student Satisfaction with the LMS Task after the Study?

To address the last research question, the results obtained from the LOES-S questionnaire were used. This questionnaire was voluntarily completed by students at the conclusion of the experimental phase. The first step was to confirm the reliability of the questionnaire for the analyzed sample using Cronbach's alpha. A Cronbach's alpha close to 0.9 was obtained (α = 0.868), indicating high reliability. Subsequently, the students' responses were analyzed by condition (see Table 3).
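For readers unfamiliar with this reliability coefficient, the following minimal Python sketch computes Cronbach's alpha from a respondents-by-items score matrix; the response matrix shown is invented for illustration and is not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative 5-point Likert responses (rows = students, cols = items).
demo = np.array([
    [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
    [2, 3, 2, 3], [4, 4, 5, 5], [3, 4, 3, 3],
])
print(round(cronbach_alpha(demo), 3))
```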
Focusing on the differences between the two conditions, students from the EG rated the LMS task higher than those from the CG, particularly in the Learning and Engagement dimensions. The Mann–Whitney U test confirmed a statistically significant difference in the Learning (U = 543.5, p = 0.001, r = 0.35) and Engagement (U = 601.0, p = 0.005, r = 0.30) dimensions, but not in the Quality dimension (U = 720.0, p = 0.078, r = 0.14). According to Cohen [58], the effect sizes obtained in the two dimensions with statistically significant differences can be considered medium. The results are positive in both groups, which is logical considering that the activities were the same under both experimental conditions. However, the EG students scored significantly higher on aspects such as the usefulness of the tasks and the subsequent feedback provided. Similarly, students from the EG found the LMS tasks more attractive than those from the CG. Therefore, it appears that EG students perceived the extensive feedback provided as useful and interesting during their learning process.

5. Discussion

Following the study and considering the results obtained, several noteworthy aspects merit further analysis. Firstly, the execution of the study demonstrates that the LMS Moodle facilitates the autonomous and active engagement of primary education students in activities, including those in areas such as geography. Its integration aligns with contemporary approaches to practical learning and the individualization of the teaching–learning process [1,2]. In addition, this approach can help reduce paper consumption. Traditionally, class activities are completed by hand, which has been shown to be less efficient [33]; the digital approach, by contrast, is more efficient and can make the teaching–learning process more sustainable. Furthermore, it incorporates evaluation as a daily activity [9,23]. Despite the scarcity of evidence in the existing literature [4,22,41], this study aims to contribute to the scientific and educational community by showing that this educational approach can also be successfully implemented in compulsory education beyond the scientific-technological domain, thanks to technological tools such as this LMS.
Utilizing Moodle and the data collected through LAs, students were able to consolidate their daily learning through brief quizzes. This approach aimed to foster greater student engagement in their own teaching–learning process, promoting self-management and awareness of their knowledge. This line of action is aligned with current issues in geography education [17,19,20]. Concurrently, this approach provided teachers with daily information of interest about the learning process, eliminating the need to wait for summative assessments [8,9,40]. However, regarding students' academic performance by type of feedback, and contrary to what might be expected, the moderation analysis revealed no significant differences between the groups, even though the descriptive analysis suggested that extensive feedback was beneficial to students, particularly those facing greater learning challenges.
Both groups demonstrated an improvement in their knowledge over the course of the study. This enhancement was observed despite the brevity of the activities and the fact that the post-test was conducted under the same conditions as the other sessions: on the same Moodle platform, without notifying students beforehand that they would take the test or asking them to study for it. These observations suggest a positive impact from the activities, irrespective of the nature of the feedback provided. Thus, while this study supports the need for formative assessment procedures in which feedback is a crucial component [44,45], future research should investigate whether these non-significant differences are attributable to insufficient reflection time, or to students' limited interest in revisiting the tasks and feedback after assessment.
In turn, the reports generated by Moodle from the collected LAs data proved effective in enabling educators to gather pertinent data on students' daily academic performance without the additional time that manual data collection would require [38]. This is thus a possible way of applying formative evaluation in a sustainable manner over time. The linear regression model obtained indicates that students' daily scores are a robust predictor of their final academic standing in a geography course, irrespective of the experimental condition. In addition, the beta statistics demonstrate the utility of these LMS tasks for improving students' academic achievement. Consequently, this educative approach could not only be useful for improving academic performance, but could also address the limitations associated with summative assessments and potentially enable earlier intervention. This is of particular significance, as authors such as Alfageme and Miralles Martínez [25] and Feliu [39] have highlighted the often belated nature of traditional evaluation processes.
Returning to the student population, the results obtained regarding the last research question about their satisfaction with assignments suggest that the implementation of comprehensive feedback may significantly influence this factor. The higher scores associated with the advantages of these tasks and their capacity to engage EG students suggest that the extensive feedback was deemed beneficial, given that the activities were identical to those of the CG. Considering the widespread lack of student motivation for these subject matters [23], and the assertion by authors such as Brooks [19], Graves [24], and Roberts [20] that students must comprehend geographical content to appreciate its utility, this approach appears to be a promising alternative that warrants further application in educational settings.

6. Conclusions

Society, education, and the instruction of subjects such as geography cannot remain unaffected by the ongoing technological transformations. As society broadly adapts to these changes, there is mounting evidence supporting the necessity of training teachers in their use [26]. This is crucial for adapting education to the needs of the 21st century, even in this area of knowledge, despite the relative scarcity of evidence [15,21,22,35]. For this reason, the objective of this study was to demonstrate how the Learning Management System Moodle and daily collected Learning Analytics data can be effectively utilized in geography instruction at the primary school stage.
This study was designed in line with current educational trends of individualized learning and increased student monitoring to facilitate formative evaluation [1,2,4]. For this purpose, fifth-grade students completed identical activities on Moodle during a complete didactic unit. The only difference was the type of feedback they received: extensive feedback versus simply being told whether their response was correct (see the example in Figure 1). The results indicate that students in both groups improved their knowledge during the intervention, and that extensive feedback, in contrast to limited feedback, did not significantly impact their academic achievement. However, students in the experimental group rated the tasks higher: they recognized their relevance to learning and expressed satisfaction with the teaching–learning process.
In line with the necessity to provide teachers with technological tools and appropriate methodologies for the formative assessment of their students [30,40], the data gathered daily throughout this project underscores the merit of such an approach. These Moodle activities, despite requiring only a short period for their daily execution during the class, serve as a reliable predictor of eventual academic achievement. Consequently, this study not only aims to demonstrate the feasibility of the application of this pedagogical strategy despite the scarce bibliographic evidence, but also its potential benefits for both students and teachers.

Limitations and Proposals for Future Research

Throughout this study, various improvement proposals and novel hypotheses have surfaced, presenting promising opportunities for future research. Given the study's time frame, one potential area of exploration involves extending the duration of the sessions in which the Learning Management System activities were carried out. This extension would enable a two-fold analysis: firstly, it could help determine whether a longer implementation of the proposed pedagogical model, in relation to the type of feedback, amplifies the disparities in students' final academic achievement; secondly, it could assess whether students' satisfaction with the tasks is sustained over a longer period. Additionally, future studies should assess students' capacity to focus on feedback reports, providing them with more time to check their own performance on the activities. It may even be beneficial to allow students access to these activities and reports outside of class hours, enabling them to review them in preparation for subsequent examinations.
Likewise, increasing the number of students can improve the external validity of the conclusions presented in this study. This will enable the analysis of other variables such as gender that could lead to an enhancement in the linear model derived in this study. Consequently, future research could consider gathering data on students’ progression through questionnaires and the time taken to complete them. These factors, coupled with the potential implementation of additional metrics such as teachers’ perceptions of these tasks, could provide valuable insights. Finally, expanding the study to other educational levels and disciplines may also yield new findings that could be compared with the results obtained in this study.

Author Contributions

Conceptualization, S.T.-O., R.C.-G. and J.A.G.-C.; methodology, S.T.-O., R.C.-G. and J.A.G.-C.; formal analysis, S.T.-O. and J.A.G.-C.; investigation, S.T.-O., R.C.-G. and N.D.; resources, S.T.-O. and R.C.-G.; writing—original draft preparation, S.T.-O.; writing—review and editing, S.T.-O., R.C.-G., J.A.G.-C. and N.D.; supervision, project administration, and funding acquisition, R.C.-G. and J.A.G.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Castilla-La Mancha Regional Administration under project SBPLY/19/180501/000278; by the University of Castilla-La Mancha and by the European Regional Development Fund (ERDF) under Grant 2022-GRIN-34039; and by the Ministry of Universities of Spain under Grant FPU20/02375 and EST23/00741.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the University of Castilla-La Mancha (Date 7 April 2022./No. CEIS-632710-Z1N4).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

We would like to thank all of the teachers and students who participated in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Baldassarri, S. 2022 EDUCAUSE Horizon Report. Teaching and Learning Edition. Educ. Publ. 2022, 32, e14. [Google Scholar] [CrossRef]
  2. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu; Punie, Y., Ed.; Publications Office: Luxembourg, 2017; ISBN 978-92-79-73494-6. [Google Scholar]
  3. Long, P.; Siemens, G.; Conole, G.; Gašević, D. LAK ’11: Proceedings of the 1st International Conference on Learning Analytics and Knowledge; Association for Computing Machinery: New York, NY, USA, 2011. [Google Scholar]
  4. Gašević, D.; Greiff, S.; Shaffer, D.W. Towards Strengthening Links between Learning Analytics and Assessment: Challenges and Potentials of a Promising New Bond. Comput. Human Behav. 2022, 134, 107304. [Google Scholar] [CrossRef]
  5. Gómez-Carrasco, C.J.; Ortuño, J.; Miralles-Martínez, P. Enseñar Ciencias Sociales Con Métodos Activos de Aprendizaje: Reflexiones y Propuestas a Través de la Indagación; Octaedro: Barcelona, Spain, 2018; ISBN 9788417219536. [Google Scholar]
  6. Acar, O.A.; Tuncdogan, A. Using the Inquiry-Based Learning Approach to Enhance Student Innovativeness: A Conceptual Model. Teach. High. Educ. 2019, 24, 895–909. [Google Scholar] [CrossRef]
  7. Kozanitis, A.; Nenciovici, L. Effect of Active Learning versus Traditional Lecturing on the Learning Achievement of College Students in Humanities and Social Sciences: A Meta-Analysis. High. Educ. 2022, 86, 1377–1394. [Google Scholar] [CrossRef]
  8. Sagarika, R.H.; Kandakatla, R.; Gulhane, A. Role of Learning Analytics to Evaluate Formative Assessments: Using a Data Driven Approach to Inform Changes in Teaching Practices. J. Eng. Educ. Transform. 2021, 34, 550–556. [Google Scholar] [CrossRef]
  9. Miralles Martínez, P.; Gómez Carrasco, C.J.; Sánchez Ibañez, R. Dime Qué Preguntas y Te Diré Qué Evalúas y Enseñas. Análisis de Los Exámenes de Ciencias Sociales En Tercer Ciclo de Educación Primaria. Aula Abierta 2014, 42, 83–89. [Google Scholar] [CrossRef]
  10. Bulut, O.; Gorgun, G.; Yildirim-Erbasli, S.N.; Wongvorachan, T.; Daniels, L.M.; Gao, Y.; Lai, K.W.; Shin, J. Standing on the Shoulders of Giants: Online Formative Assessments as the Foundation for Predictive Learning Analytics Models. Br. J. Educ. Technol. 2023, 54, 19–39. [Google Scholar] [CrossRef]
  11. Gašević, D.; Dawson, S.; Rogers, T.; Gasevic, D. Learning Analytics Should Not Promote One Size Fits All: The Effects of Instructional Conditions in Predicting Academic Success. Internet High. Educ. 2016, 28, 68–84. [Google Scholar] [CrossRef]
  12. Yassine, S.; Kadry, S.; Sicilia, M.-A. IEEE A Framework for Learning Analytics in Moodle for Assessing Course Outcomes. In Proceedings of the 2016 IEEE Global Engineering Education Conference (EDUCON), Abu Dhabi, United Arab Emirates, 10–13 April 2016; pp. 261–266. [Google Scholar]
  13. Justin, T.S.; Krishnan, R.; Nair, S.; Samuel, B.S. Learners’ Performance Evaluation Measurement Using Learning Analytics in Moodle. In Lecture Notes in Networks and Systems; Springer Science and Business Media Deutschland GmbH: Berlin/Heidelberg, Germany, 2022; Volume 191, pp. 931–942. [Google Scholar]
  14. Ros Martínez de Lahidalga, I. Moodle, la Plataforma Para la Enseñanza y Organización Escolar; 2008; pp. 1–12. Available online: https://addi.ehu.es/handle/10810/6876 (accessed on 18 March 2024).
  15. Li, K.C.; Wong, B. Trends of Learning Analytics in STE(A)M Education: A Review of Case Studies. Interact. Technol. Smart Educ. 2020, 17, 323–335. [Google Scholar] [CrossRef]
  16. Wineburg, S. Why Learn History (When It’s Already on Your Phone); The University of Chicago Press: Chicago, IL, USA, 2018; ISBN 9780226357218. [Google Scholar]
  17. Moreno-Vera, J.R.; Alvén, F. Concepts for Historical and Geographical Thinking in Sweden’s and Spain’s Primary Education Curricula. Humanit. Soc. Sci. Commun. 2020, 7, 107. [Google Scholar] [CrossRef]
  18. Chinn, C.A.; Barzilai, S.; Duncan, R.G. Education for a “Post-Truth” World: New Directions for Research and Practice. Educ. Res. 2021, 50, 51–60. [Google Scholar] [CrossRef]
  19. Brooks, C.; Butt, G.; Fargher, M. The Power of Geographical Thinking; Brooks, C., Butt, G., Fargher, M., Eds.; International Perspectives on Geographical Education; Springer International Publishing: Cham, Switzerland, 2017; ISBN 978-3-319-49985-7. [Google Scholar]
  20. Roberts, M. Powerful Knowledge and Geographical Education. Curric. J. 2014, 25, 187–209. [Google Scholar] [CrossRef]
  21. Scholten, N.; Caldis, S.; Sprenger, S. Intervention Studies to Improve Initial Teacher Education in Geography: A Scoping Review. In International Perspectives on Geographical Education; Springer Nature: Berlin/Heidelberg, Germany, 2022; pp. 9–24. [Google Scholar]
  22. Gómez-Carrasco, C.J.; Miralles-Martínez, P.; López-Facal, R. Handbook of Research on Teacher Education in History and Geography; Peter Lang AG: Lausanne, Switzerland, 2021. [Google Scholar]
  23. Gómez-Carrasco, C.J.; Miralles-Martínez, P. Los Contenidos de Ciencias Sociales y Las Capacidades Cognitivas En Los Exámenes de Tercer Ciclo de Educación Primaria ¿una Evaluación En Competencias? Rev. Complut. Educ. 2013, 24, 91–121. [Google Scholar] [CrossRef]
  24. Graves, N.J. Geography in Education; Heinemann Educational: Portsmouth, NH, USA, 1975. [Google Scholar]
  25. Alfageme, M.B.; Miralles Martínez, P. El Profesorado de Geografía e Historia de Enseñanza Secundaria Ante La Evaluación. Educ. Rev. 2014, 52, 193–209. [Google Scholar] [CrossRef]
  26. Palacios-Rodríguez, A.; Cabero-Almenara, J.; Barroso-Osuna, J. Competencia Digital Docente Según #DigCompEdu. Aportes Desde la Investigación; Universidad de Sevilla: Sevilla, Spain, 2023. [Google Scholar]
  27. Tüzün, H.; Yilmaz-Soylu, M.; Karakuş, T.; Inal, Y.; Kizilkaya, G. The Effects of Computer Games on Primary School Students’ Achievement and Motivation in Geography Learning. Comput. Educ. 2009, 52, 68–77. [Google Scholar] [CrossRef]
  28. Tsai, Y.-S.; Perrotta, C.; Gasevic, D. Empowering Learners with Personalised Learning Approaches? Agency, Equity and Transparency in the Context of Learning Analytics. Assess. Eval. High. Educ. 2020, 45, 554–567. [Google Scholar] [CrossRef]
  29. Rosário, P.; Cunha, J.; Nunes, T.; Nunes, A.R.; Moreira, T.; Núñez, J.C. “Homework Should Be...but We Do Not Live in an Ideal World”: Mathematics Teachers’ Perspectives on Quality Homework and on Homework Assigned in Elementary and Middle Schools. Front. Psychol. 2019, 10, 430481. [Google Scholar] [CrossRef]
  30. Stanja, J.; Gritz, W.; Krugel, J.; Hoppe, A.; Dannemann, S. Formative Assessment Strategies for Students’ Conceptions—The Potential of Learning Analytics. Br. J. Educ. Technol. 2023, 54, 58–75. [Google Scholar] [CrossRef]
  31. Mwalumbwe, I.; Mtebe, J.S. Using Learning Analytics to Predict Students’ Performance in Moodle Learning Management System: A Case of Mbeya University of Science and Technology. Electron. J. Inf. Syst. Dev. Ctries. 2017, 79, 1–13. [Google Scholar] [CrossRef]
  32. Kliziene, I.; Taujanskiene, G.; Augustiniene, A.; Simonaitiene, B.; Cibulskas, G. The Impact of the Virtual Learning Platform Eduka on the Academic Performance of Primary School Children. Sustainability 2021, 13, 2268. [Google Scholar] [CrossRef]
  33. Magalhães, P.; Ferreira, D.; Cunha, J.; Rosário, P. Online vs. Traditional Homework: A Systematic Review on the Benefits to Students’ Performance. Comput. Educ. 2020, 152, 103869. [Google Scholar] [CrossRef]
  34. Suad, A.; Tapalova, O.; Berestova, A.; Vlasova, S. The Impact of Moodle Learning Analytics on Students’ Performance and Motivation. Int. J. Instr. 2023, 16, 297–312. [Google Scholar] [CrossRef]
  35. Tirado-Olivares, S.; Bueno-Baquero, A.; López-Fernández, C.; Mínguez-Pardo, R.; Cózar-Gutiérrez, R. Revisión de La Literatura Sobre El Uso de Learning Analytics En El Rendimiento Académico de Estudiantes de Pregrado: Impresiones Iniciales. In Educar para Transformar: Innovación Pedagógica, Calidad y TIC en Contextos Formativos; Cobos-Sanchiz, D., López-Meneses, E., Martín-Padilla, A.H., Molina-García, L., Jaén-Martínez, A., Eds.; Dykinson: Madrid, Spain, 2023; pp. 2511–2521. ISBN 978-84-1122-469-7. [Google Scholar]
  36. Lunsford, M.L.; Pendergrass, M. Making Online Homework Work. Primus 2016, 26, 531–544. [Google Scholar] [CrossRef]
  37. Cechinel, C.; De Freitas Dos Santos, M.; Barrozo, C.; Schardosim, J.E.; De Vila, E.; Ramos, V.; Primo, T.; Munoz, R.; Queiroga, E.M. A Learning Analytics Dashboard for Moodle: Implementing Machine Learning Techniques to Early Detect Students at Risk of Failure. In Proceedings of the 2021 16th Latin American Conference on Learning Technologies, LACLO 2021, Arequipa, Peru, 19–21 October 2021; Institute of Electrical and Electronics Engineers Inc.: New York, NY, USA, 2021; pp. 130–136. [Google Scholar]
  38. Pardo, A.; Jovanovic, J.; Dawson, S.; Gašević, D.; Mirriahi, N. Using Learning Analytics to Scale the Provision of Personalised Feedback. Br. J. Educ. Technol. 2019, 50, 128–138. [Google Scholar] [CrossRef]
  39. Feliu, J. Evaluación Colaborativa Por Competencias En Un Equipo Docente. In Analítica del Aprendizaje: 30 Experiencias con Datos en el Aula; Filvà, D.A., Ed.; Independiente: Badalona, Spain, 2018; pp. 100–104. [Google Scholar]
  40. Børte, K.; Lillejord, S.; Chan, J.; Wasson, B.; Greiff, S. Prerequisites for Teachers’ Technology Use in Formative Assessment Practices: A Systematic Review. Educ. Res. Rev. 2023, 41, 100568. [Google Scholar] [CrossRef]
  41. Tirado-Olivares, S.; López-Fernández, C.; González-Calero, J.A.; Cózar-Gutiérrez, R. Enhancing Historical Thinking through Learning Analytics in Primary Education: A Bridge to Formative Assessment. Educ. Inf. Technol. 2024, 1–25. [Google Scholar] [CrossRef]
  42. Pelletier, K.; Brown, M.; Brooks, D.C.; McCormack, M.; Reeves, J.; Arbino, N.; Bozkurt, A.; Crawford, S.; Czerniewicz, L.; Gibson, R.; et al. 2021 EDUCAUSE Horizon Report Teaching and Learning Edition—Learning & Technology Library (LearnTechLib); EDU: San Diego, CA, USA, 2021; ISBN 978-1-933046-08-2. [Google Scholar]
  43. Ferguson, R.; Clow, D. ACM Learning Analytics Community Exchange: Evidence Hub. In Proceedings of the LAK ’16: Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, New York, NY, USA, 25–29 April 2016; pp. 520–521. [Google Scholar]
  44. Sadler, D.R. Formative Assessment and the Design of Instructional Systems. Instr. Sci. 1989, 18, 119–144. [Google Scholar] [CrossRef]
  45. Black, P.; Wiliam, D. Developing the Theory of Formative Assessment. Educ. Assess. Eval. Account. 2009, 21, 5–31. [Google Scholar] [CrossRef]
  46. Dubé, A.K.; Wen, R. Identification and Evaluation of Technology Trends in K-12 Education from 2011 to 2021. Educ. Inf. Technol. 2022, 27, 1929–1958. [Google Scholar] [CrossRef]
  47. Knobbout, J.; van der Stappen, E. Where Is the Learning in Learning Analytics? A Systematic Literature Review to Identify Measures of Affected Learning. EC-TEL 2018. Lifelong Technol. Learn. 2018, 11082, 88–100. [Google Scholar]
  48. Christopoulos, A.; Pellas, N.; Laakso, M.-J. A Learning Analytics Theoretical Framework for STEM Education Virtual Reality Applications. Educ. Sci. 2020, 10, 317. [Google Scholar] [CrossRef]
  49. Mangaroska, K.; Sharma, K.; Gasevic, D.; Giannakos, M. Multimodal Learning Analytics to Inform Learning Design: Lessons Learned from Computing Education. J. Learn. Anal. 2020, 7, 79–97. [Google Scholar] [CrossRef]
  50. Ifenthaler, D.; Yau, J.Y.K. Utilising Learning Analytics to Support Study Success in Higher Education: A Systematic Review. Educ. Technol. Res. Dev. 2020, 68, 1961–1990. [Google Scholar] [CrossRef]
  51. Srinivasa, K.G.; Muralidhar, K. A Beginner’s Guide to Learning Analytics; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar] [CrossRef]
  52. Hedges, L.V. Challenges in Building Usable Knowledge in Education. J. Res. Educ. Eff. 2018, 11, 1–21. [Google Scholar] [CrossRef]
  53. Leppink, J. The Question-Design-Analysis Bridge. In Statistical Methods for Experimental Research in Education and Psychology; Springer: Cham, Switzerland, 2019; pp. 3–21. ISBN 978-3-030-21241-4. [Google Scholar]
  54. Kay, R.H.; Knaack, L. Assessing Learning, Quality and Engagement in Learning Objects: The Learning Object Evaluation Scale for Students (LOES-S). Educ. Technol. Res. Dev. 2009, 57, 147–168. [Google Scholar] [CrossRef]
  55. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020. [Google Scholar]
  56. Leppink, J. Analysis of Covariance (ANCOVA) vs. Moderated Regression (MODREG): Why the Interaction Matters. Health Prof. Educ. 2018, 4, 225–232. [Google Scholar] [CrossRef]
  57. Hayes, A.F. Introduction to Mediation, Moderation, and Conditional Process Analysis: A Regression-Based Approach; Guildford Publications: New York, NY, USA, 2022. [Google Scholar]
  58. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Erlbaum: Mahwah, NJ, USA, 1988. [Google Scholar]
Figure 1. Example of an activity and the feedback provided depending on the research condition.
Figure 2. Research process conducted to integrate LAs-formative assessments in geography education.
Figure 3. Graph based on the post-test scores (ordinates) considering the scores of the pre-test (abscissae).
Table 1. Descriptive data according to students' scores in both pre-test and post-test.

Test        Group   N    M      SD
Pre-test    CG      37   4.26   1.94
            EG      43   4.77   1.83
Post-test   CG      37   5.76   2.14
            EG      43   6.37   2.11
Table 2. Multilinear regression analysis for LMS tasks, pre-test and experimental condition predicting post-test scores.

Predictor                  B      SD     Beta   t      p
Constant                   0.36   0.58   —      0.62   0.536
LMS tasks (mean score)     0.82   0.12   0.69   6.82   <0.001
Pre-test                   0.11   0.11   0.10   0.96   0.338
Experimental condition     0.15   0.32   0.04   0.47   0.639
Table 3. Descriptive data obtained from the LOES-S questionnaire.

Test     Dimension    Group   M (SD)        U       p       r
LOES-S   Learning     CG      3.67 (0.68)   543.5   0.001   0.35
                      EG      4.14 (0.75)
         Quality      CG      4.11 (0.73)   720.0   0.078   0.14
                      EG      4.38 (0.64)
         Engagement   CG      3.63 (0.70)   601.0   0.005   0.30
                      EG      4.02 (0.99)
