Article

An Architecture for Workplace Learning Analytics (WLA) to Support Lifelong Learning in Sustainable Smart Organisations

Department of Computing Sciences, Nelson Mandela University, Gqeberha 6001, South Africa
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(9), 3595; https://doi.org/10.3390/su16093595
Submission received: 29 February 2024 / Revised: 10 April 2024 / Accepted: 14 April 2024 / Published: 25 April 2024

Abstract

An environment that supports lifelong learning contributes to the sustainability of the organisations in a Smart City, their stakeholders and, ultimately, the city itself. Workplace Learning Analytics (WLA) can provide an organisation’s employees with the opportunity for lifelong learning in the workplace to enhance their skills and knowledge in their current and/or future roles. It uses the data generated by Learning Experience Platforms (LXPs) and other learning platforms to support data-driven decision-making and to gain a competitive edge. One of the components of successful and sustainable WLA is a layered architecture. The purpose of this paper is to present the design of a WLA architecture that can be used in organisations to support successful WLA. This architecture was developed from the results of evaluating four potentially relevant architectures for WLA. The evaluation consisted of three phases. In the first phase, the architectures were evaluated using the DeLone and McLean Model of Information Systems Success. In the second phase, a real-world case of an organisation that provides lifelong learning opportunities to its stakeholders was used to validate the findings of the first phase. The proposed Layered Architecture for WLA was further validated in the third phase, where a focus group discussion was held with participants from the real-world context. The architecture can provide valuable guidance to both practitioners and researchers to improve the success of WLA.

1. Introduction

Learning is a constant process in people’s lives, where technology and applications encourage users to adopt continuous, lifelong learning [1]. Lifelong learning refers to acquiring knowledge over time and being a continuous learner, where what you have learned can be used to assist with future learning and development [2]. It is a key success factor of Smart Cities and forms part of the Smart People, Smart Organisation, and Smart Technology and ICT Infrastructure dimensions of Smart Cities [3,4]. Smart People are those who adopt a lifelong learning strategy, and Smart Organisations are those that have a data-driven culture [4]. An environment that supports lifelong learning contributes to the sustainability of the Smart Organisations in a Smart City, their stakeholders and ultimately the city itself [5,6,7].
The Smart Technology and ICT Infrastructure dimension considers both smart data and technology as integral factors in the success of a Smart City. Technology can enable Smart Cities, which are learning cities, to implement lifelong learning, and it is an important connector that provides access to learning for all individuals in society [5]. Lifelong learning and its supporting technologies can have a large socio-technical impact on Smart Cities [5]. In these cities, technology is seen as an enabler of sustainability [6], facilitating learning for and in the workplace and fostering a culture of learning throughout life [7]. A key technology for lifelong learning systems is machine learning, which accumulates the knowledge acquired in previous processes or tasks and utilises this knowledge to assist with future learning [1]. As a result, the learner becomes increasingly more knowledgeable and more effective at learning.
Lifelong learning is sometimes referred to as continuous learning or continuous education [8]. In this paper, the term lifelong learning is used to encompass all similar or related terminology. Learning in organisations is referred to as workplace learning and has its own set of characteristics that distinguish it from learning in higher education [8].
Despite the challenges of lifelong learning, employers (organisations) are becoming more aware of the need for and benefits of adopting it in the workplace [8]. Workplace learning is especially important for professional transformation that spans an employee’s career path [8] and for being able to adapt to shifts in technology [9]. By conducting career pathing and identifying areas of development, organisations can become more aware of what skills are needed by an employee at any given time [8]. Lifelong learning can take place in workplace learning environments where employers provide learning and upskilling opportunities to their employees [7]. By providing the means and time for learning, employees can translate their learning into practice, be empowered to innovate and be encouraged to collaborate. By using technology in the workplace to foster learning, good habits can be formed, and learning can happen anywhere and at any time. Some organisations have expanded their development/upskilling offerings and provide learning opportunities to their external stakeholders and clients in addition to their employees [10].
Workplace Learning Analytics (WLA), Learning Management Systems (LMSs) and Learning Experience Platforms (LXPs) can provide employees with the opportunity for lifelong learning in the workplace [8,11,12]. WLA uses the data generated by the LXP and other learning platforms to support data-driven decision-making and to provide a competitive edge for organisations [13]. Current, real-time data is required for this decision-making and is a key factor in the success of machine learning and lifelong learning systems [1]. The Data, Information, Knowledge, and Wisdom (DIKW) Hierarchy describes how data, information, knowledge, and wisdom build on and influence each other [14]. The DIKW Hierarchy can be aligned with the attributes of information quality, where it is important to have the raw data (data), data which is meaningful (information), that can be understood in context (knowledge) and, ultimately, used to forecast and predict future action (wisdom). Therefore, accurate data is essential for organisations.
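As a minimal illustration of this alignment, the short sketch below traces hypothetical learning records through the four DIKW stages; the records, field names and threshold are invented for illustration and are not drawn from any specific platform discussed in this paper.

```python
# Hypothetical illustration of the DIKW Hierarchy applied to workplace learning data.
# The records, field names and the 75% target are invented for illustration only.

raw_records = [  # Data: raw completion events exported from a learning platform
    {"learner": "L001", "course": "ERP-101", "score": 82, "completed": True},
    {"learner": "L002", "course": "ERP-101", "score": 47, "completed": False},
    {"learner": "L003", "course": "ERP-101", "score": 91, "completed": True},
]

# Information: a meaningful aggregation of the raw data
completion_rate = sum(r["completed"] for r in raw_records) / len(raw_records)

# Knowledge: the aggregate interpreted in its organisational context
needs_intervention = completion_rate < 0.75  # compared against an internal target

# Wisdom: using the interpretation to decide on future action
if needs_intervention:
    print(f"Completion rate {completion_rate:.0%} is below target; "
          "schedule refresher training for ERP-101.")
```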
One of the components of successful and sustainable WLA is a layered architecture [4,15]. Currently, there is a gap in the existing literature regarding WLA [16]. Since most Learning Analytics (LA) research has focused on formal education in schools and higher education, learning processes in the workplace have been neglected, particularly in terms of WLA [8,16,17]. In the workplace context, success factors for Business Analytics (BA) have been proposed in the literature, such as those in [18], but these factors were not considered in the context of training or learning in corporate organisations. Limited research has been found on architectures specifically applied to the workplace and WLA, and there is little empirical data on the factors or components of architectures that support successful WLA.
The purpose of this study is to address these gaps and to answer the research question: What architecture can be used for successful WLA?
The study used the pragmatic, systematic approach proposed in [19] because it was conducted in a real-world context at a corporate organisation, called ERPCo, that sells and implements ERP software all over the world. Research conducted in the Information Systems (IS) domain is known to adopt a pragmatic approach, as the research is applied to both theoretical and practical issues [20]. Since WLA is a type of IS, this approach was adopted. User participation in IS projects should be reinforced so that pragmatic models and practical processes are created. Using a pragmatic, systematic approach together with Focus Group Discussions (FGDs) can provide traceability and explore continuous associations between the problem, the requirements, the solution and the artefact [19]. For theoretical framing, because the purpose of this research was to develop an architecture for WLA that can contribute to the success of WLA, the DeLone and McLean Information Systems (D&M IS) Success Model [21] was deemed a relevant theory to use.
The structure of the paper is as follows: Section 2 provides details of the research design. Section 3 reports on a literature review of lifelong learning, WLA and data management for sustainability, and includes a comparison of four potentially relevant architectures for WLA. Section 4 reports on the evaluation of these architectures. Section 5 presents the requirements for a WLA architecture and the findings of an evaluation of the ERPCo learning architecture against these requirements. Section 6 reports on an evaluation of the architectures by employees at ERPCo using an FGD. Section 7 presents the proposed Layered Architecture for WLA that can contribute to its successful adoption in organisations. Finally, the conclusions of this paper are provided in Section 8.

2. Research Design (Materials and Methods)

The objective of the research was to select and design an architecture that can support successful WLA.
A pragmatic, systematic approach [19] and a three-phase evaluation were conducted to meet this objective. In the first phase, a thorough and rigorous literature review of papers published between 2011 and 2019 was conducted, whereby four architectures were identified as potentially relevant for WLA and evaluated to determine the common components and shortcomings. These timelines were used because this research forms part of a larger PhD research study, which commenced in 2019. Ethics approval is not required for literature reviews, and therefore, ethics approval was only applied for later in the study prior to the FGDs being conducted in 2022 and 2023.
In the second phase, the architectures were further evaluated using the D&M IS Success Model [21]. More recent studies have shown the D&M IS Success Model applied in a variety of research projects to determine the success of different systems; for example, eCommerce systems, knowledge management systems, e-government systems, and Massive Open Online Courses [22]. The D&M IS Success Model has even been used to measure the success of an Enterprise Resource Planning system post-implementation [23].
One of the dimensions of success in this theory is Information Quality, which addresses the required and favourable characteristics of system outputs such as management reports and web pages [24]. These characteristics include relevance, understanding, accuracy, conciseness, completeness, currency, timeliness and usability. The characteristics of relevance, completeness, and timeliness were used to evaluate the four architectures identified in the literature review and to compare them with the real-world case at ERPCo as a proof of concept. In this organisation, lifelong learning through training and education is provided to its stakeholders (the employees, customers, and partners). Learning opportunities are made available via their LXP. Other dimensions of the D&M IS Success Model were considered in the larger PhD study; however, based on the FGD findings and the analysis of the dimensions of the D&M IS Success Model, only Information Quality was applicable for the architecture evaluation.
From the findings of the first two phases, a set of requirements for WLA was identified. In the third phase, an FGD was conducted with employees from ERPCo. Multiple FGDs were conducted as part of the larger PhD study; however, only one of these investigated architectures and is, therefore, the only one reported on in this paper. An FGD was used since this approach is used extensively for knowledge generation and is seen as an effective instrument for collecting new ideas and for engaging with key stakeholders who need to problem-solve and make decisions. In essence, a focus group brings together individuals with certain common characteristics, with the main purpose of better understanding how they feel or think about an issue, idea, product, or service. In [25], the increased use of FGDs in IS research was emphasised, which also highlights the appropriateness of using FGDs to propose improvements for an architecture’s design and to determine its usefulness in its field of use. The FGD was used to gather empirical data from participants at ERPCo. Using a real-world context provided traceability and explored continuous associations between the problem, the requirements, and the solution [19]. FGDs were deemed the most appropriate form of data collection, as experts in different fields were able to review, comment on, and discuss the architectures with like-minded individuals who might use a WLA architecture to serve different needs in the same organisation. The topics outlined in the FGD covered two main themes: (1) architectures and (2) tools, technologies, and platforms. The participants needed to meet the following inclusion criteria:
  • Have more than three years of experience working on technical projects;
  • Have prior experience and knowledge of working with data, tools, technologies, and platforms to support advanced integration and technically related or dependent projects or initiatives; and
  • Have a clear understanding of how the learning data is pushed or pulled from or to other systems in the organisation (in other words, to be data literate).
The requirement to be data literate was measured by their job role and years of experience. If the employee had worked with data in the company or a previous company, they were considered data literate. In addition, years of experience formed part of this criterion, so those who met the criterion in their job role had to have also worked in that role for more than one year.
The FGD targeted between five and seven participants, as is recommended. The target participants were ERPCo employees at different levels in the organisational hierarchy, such as executives, managers, team leads, and specialists. The participants were required to be either part of the technical team at ERPCo or to have had some involvement in projects and processes actioned by technical staff members.
Since the primary researcher worked in the real-world setting of ERPCo, where the field-based research was conducted, the participant–observer methodology of qualitative research was adopted [26], which is also referred to as the practitioner–researcher approach [27]. The practitioner–researcher, who was the primary researcher of this study, fully understood and was deeply immersed in the research context as an employee at ERPCo. The researcher drew on extensive experience (over eight years in the ERP systems and learning space, where they had been present during two separate LMS/LXP implementations). The research was framed by the practitioner–researcher’s theoretical knowledge of IS, learning data, LA, LMSs and LXPs. Thus, the researcher was simultaneously considered both the practitioner and researcher. However, there might be some limitations or possible bias with this approach. A method to reduce the risk of bias was to request input from other experts, which was achieved through the FGD and the use of two external experts to verify the data analysis findings.
The data collected from the FGDs was analysed using the Qualitative Content Analysis (QCA) method [28]. QCA is a structured way to convert large amounts of text, for example, from FGDs, into summarised and concise key results and findings. The findings of the evaluations assisted with the design of a WLA architecture that can be used in corporate organisations.

3. Literature Review

Lifelong learning, WLA and data management are crucial for sustainability in Smart Cities (Section 3.1). Four architectures identified from the literature were found to be potentially relevant for evaluating the requirements for successful WLA (Section 3.2).
The literature review only included papers written in English that were either journal or conference papers and were published between 2011 and 2019. No white papers were reviewed.

3.1. Data Management for WLA and Sustainability

A sustainable organisation is one that makes efficient use of its available resources [3]; in this context, the focus is specifically on how organisations can use lifelong learning together with technology to redefine the learning society of the new age [5]. When a Smart City becomes a reliable source of lifelong learning and technology, it is sustainable [5]. A Smart Organisation is a company that relies on technology, infrastructure, and systems. A Smart City should ideally support learning and education, which is one of the requirements for a Smart City to be fully integrated [6].
Large amounts of learner data are produced within corporate and educational institutions on LMSs, LXPs, and other learning systems/environments. An LMS is an online learning system that provides an educational experience for learners, including students, teachers/facilitators, and managers. An LMS provides learning material that is always accessible from the comfort of the learner’s home [13] and is hosted over the Internet [29,30]. In some cases, it can also be accessed on-the-go from a mobile device [13]. On the other hand, an LXP focuses more on the user experience (UX) and offers a more personalised learning experience, that is, one that is less formal than the approach offered by an LMS [31].
With an abundance of data available from LMSs and LXPs and with the help of LA, insights can be provided that can then be used to assist with organisational decision-making [12]. WLA focuses on learning materials, tools, and interventions in the workplace [8,32]. It highlights the importance of learning data for corporate organisations [8], where digital learning environments or LMSs collect large amounts of data about users, their behaviours, and the way they learn [11,12].
WLA needs to consider the significant impact of the human element, as learning data has its own distinctive set of characteristics compared to other workplace data [18]. Enhancing WLA can have a constructive effect on organisational decision-making concerning training, thereby enhancing an organisation’s skillset, competitiveness and sustainability.
Data management occurs at all levels of an organisation and is seen as a key benefit, especially if it supports information availability, data quality, and efficient operations and thereby provides a competitive advantage [33]. In [34], it was reported that there are data management challenges related to privacy, security, data governance, data and information sharing, cost/operational expenditures, and data ownership. For privacy, global governmental policies, laws, and acts need to be considered, such as the South African Protection of Personal Information (POPI) Act [35].
To overcome any data management challenges, a solid architecture should be adopted to assist organisations during their LA implementations [10,36]. Emphasis needs to be placed on the need to follow design principles and on the creation of LA software architectures because of the complexity and data intensity of multiple data sources and activities related to the processing of data [37].
The visualisation of data in dashboards is essential for an architecture’s design [38,39]. Displaying the output of LA using dashboards can help to identify learners who are at risk, who are struggling, and who require additional support. However, it is not always clear whether WLA is built into the LMS/LXP platform or whether it is seen as a separate module or add-on.
Through the practitioner–researcher approach, the researcher had exposure to several learning platforms over the years; some of these platforms had WLA built in, while others offered it as a separate module that required an additional payment to add it to the platform. For other platforms, the WLA component is entirely external to the LMS/LXP. An example of a separate Business Intelligence (BI) or analytics platform is Microsoft Power BI. These platforms use different terminology for WLA, such as insights, dashboards, analytics, BI, training intelligence, data modules, and reporting. Often, LMSs and LXPs are also referred to by terms that are used interchangeably, such as system, platform, application, tool, or technology. In essence, these terms all refer to some form of IS.

3.2. Comparison of Architectures

The management of data and the architectures used to support it are essential for the use of WLA in an organisation. The literature review identified only four architectures in the IS field that could be relevant for WLA. These were the BI Architecture (Section 3.2.1), the Big Data Architecture for LA in Education (Section 3.2.2), the Data Collection, Evaluation and Knowledge (DEK) Model for LA in Higher Education (Section 3.2.3), and the Social Semantic Server (SSS) for WLA (Section 3.2.4).

3.2.1. BI Architecture

LA can be classified under the broader field of BI [10], and therefore, the BI Architecture proposed in [10] was deemed relevant. This architecture has five layers that are essential for supporting high-quality data and smooth information flow. The layers are data sources, Extract-Transform-Load (ETL), data warehouse, end user, and metadata (Figure 1). The data source layer acquires data from internal and external sources [10]. This data is a combination of structured, unstructured and semi-structured data, depending on the data source. The ETL layer concentrates on three main processes: extraction, transformation, and loading of the data. Identifying and collecting relevant data from different data sources (extraction) is a significant part of this process, since internal and external data sources are often not integrated, are incomplete and/or may contain duplicate data. The extraction process is therefore needed to select the data required to support organisational decision-making. The transformation process can then take place, where the data is converted, using a set of business rules, to a consistent format for reporting and analysis. The loading process is the last stage in the ETL layer, where the data is loaded into the target repository.
The data warehouse layer consists of three components: an Operational Data Store (ODS), a data warehouse, and data marts. In essence, data flows from the ODS to the data warehouse and then to the subsequent data marts. The end user layer consists of the tools and technologies that are used to display the data in different formats to different users. This layer includes query and reporting tools, Online Analytical Processing (OLAP) and data mining, data visualisation tools, and analytical applications, for example, modelling and forecasting applications such as BI and AI. As depicted in the pyramid of the end user layer, the move from the bottom of the pyramid to the top represents the respective increase in the level of comprehensiveness at which the data is processed and presented. This layer also represents the increased complexity of the decision-making process as an organisation moves higher up in the hierarchy. The metadata layer describes the data itself, for instance, where the data is used, from where the data is sourced and how data relates to other data. This metadata is visible and interconnected across various layers of the architecture.
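A minimal sketch of the ETL flow described above is given below; the source records, business rules and target structure are illustrative assumptions and do not form part of the BI Architecture in [10].

```python
# Hypothetical sketch of the Extract-Transform-Load (ETL) process described above.
# Source records, business rules and the target structure are illustrative assumptions.

internal_source = [{"emp_id": "E1", "course": "erp-101", "minutes": "95"}]
external_source = [
    {"emp_id": "E1", "course": "ERP-101", "minutes": 95},    # duplicate of the internal record
    {"emp_id": "E2", "course": "ERP-201", "minutes": None},  # incomplete record
]

def extract(*sources):
    """Collect relevant records from all sources and filter out incomplete ones."""
    records = [r for source in sources for r in source]
    return [r for r in records if r["minutes"] not in (None, "")]

def transform(records):
    """Apply simple business rules: normalise course codes, cast types, drop duplicates."""
    seen, clean = set(), []
    for r in records:
        key = (r["emp_id"], r["course"].upper())
        if key not in seen:
            seen.add(key)
            clean.append({"emp_id": r["emp_id"],
                          "course": r["course"].upper(),
                          "minutes": int(r["minutes"])})
    return clean

def load(records, target):
    """Load the transformed records into the target repository."""
    target.extend(records)

data_warehouse = []  # stands in for the ODS/data warehouse/data mart chain
load(transform(extract(internal_source, external_source)), data_warehouse)
print(data_warehouse)  # [{'emp_id': 'E1', 'course': 'ERP-101', 'minutes': 95}]
```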

3.2.2. Big Data Architecture for LA in Education

A Big Data architecture framework for LA in higher education was proposed by [40], which integrates data requirements and processes for the generation, acquisition, cleaning, pre-processing, storage management, analytics, visualisation, and alerts of data. This architecture implements five different elements of LA, which are:
  • Data gathering devices are used to collect raw structured and unstructured data to send to the data management systems for analysis; for example, student cards, LMSs, social networks, sensors, and a student’s IS;
  • Data storage and management systems are used to process and convert the raw data into a form that can be processed by the data analytics system; for example, data pre-processing and data cleaning;
  • Data analytics systems are used to extract meaningful and valuable information from the raw, static data; for instance, to obtain insights or to make predictions;
  • Data visualisations are used to visually depict the results from the data analysis to assist with instant and informed decision-making;
  • Action is the element that caters for the goal of the LA system, which is to provide data that can be used to detect alerts, provide warnings and highlight improvement areas.
Only limited information about this architecture framework was provided in [40], and not much further detail could be found. The researcher attempted to contact the authors for further information but did not receive a response. Without this detail, the architecture could not be deemed useful and was excluded from further evaluation.

3.2.3. DEK Model

The DEK Model for LA in Higher Education proposed in [41] has four levels/layers (Figure 2): Data Collection, Evaluation, Knowledge, and Presentation (which includes visualisation). In [41], the need to develop integrated LA toolsets for all stakeholders was stressed. These toolsets should consist of four elements, namely:
  • An LA engine/framework or architecture (used to identify and process data);
  • An adaptive content engine for providing adaptivity of the learning process, content, and instructional design;
  • An intervention engine (including student progress);
  • Dashboard, reporting and visualisation tools, that is, the sense-making components of the LA system that visualise data to help with decision-making.
The DEK Model incorporates all aspects of the European Standards and Guidelines for Quality Assurance (ESG), including ESG1 to ESG7 [42]:
  • ESG1—Policy and procedures for quality assurance;
  • ESG2—Approval, monitoring, and periodic review of programs and awards;
  • ESG3—Assessment of students;
  • ESG4—Quality assurance of teaching staff;
  • ESG5—Learning resources and student support;
  • ESG6—Information systems;
  • ESG7—Public information.
The Data Collection Level addresses external data sources such as the employability of graduates, educational objects data, and learning/digital content and covers data integration and storage [36,37]. The Evaluation Level addresses evaluation from three different perspectives: the objective, the subjective, and the aggregation module (ESG2, ESG3, ESG4, and ESG5). An objective evaluation quantifies criteria from the obtained data and converts them into a numerical representation for comparison purposes; for example, student progress and success assessment, e-learning course formal criteria compliance, teacher publications quality, and teacher publications originality. Subjective evaluation obtains information directly from the users of the educational offering, such as students, and focuses specifically on analysing levels of satisfaction with the provided programmes and studies, their teachers, and their e-learning courses. Aggregation evaluation combines the previously mentioned evaluations with the results of programme, student, teacher and content assessments.
The Knowledge Level processes the collected data and evaluates it for decision support to uncover hidden connections. Forecasting can also be used for trend and opportunity identification, as well as for the identification of strong and/or weak factors that can influence future initiatives’ quality and success. The Presentation Layer focuses on ESG1 and ESG7. ESG6 is noticeable throughout the model and process as it focuses on ISs.
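As a simple illustration of the Evaluation Level, the sketch below combines an objective score with a subjective satisfaction score in an aggregation module; the criteria, weights and values are invented for illustration and are not prescribed by the DEK Model [41].

```python
# Hypothetical sketch of the DEK Evaluation Level's aggregation module.
# The criteria, weights and example values are invented; the DEK Model does not prescribe them.

def objective_score(progress_pct, formal_criteria_met, criteria_total):
    """Quantify measurable criteria (e.g. progress, formal compliance) on a 0..1 scale."""
    return 0.5 * (progress_pct / 100) + 0.5 * (formal_criteria_met / criteria_total)

def subjective_score(satisfaction_ratings):
    """Average learner satisfaction ratings (1..5) and rescale to 0..1."""
    return (sum(satisfaction_ratings) / len(satisfaction_ratings)) / 5

def aggregate(objective, subjective, weight_objective=0.6):
    """Aggregation module: weighted combination of both perspectives."""
    return weight_objective * objective + (1 - weight_objective) * subjective

course_evaluation = aggregate(objective_score(80, 9, 10), subjective_score([4, 5, 3, 4]))
print(f"Aggregated course evaluation: {course_evaluation:.2f}")  # 0.83 for these values
```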
The DEK Model was implemented at a Higher Education Institution (HEI) where there were noticeable barriers relating to its adoption and deployment [41]. The findings revealed that a variety of skills and a good understanding of different topics are required by university staff to correctly implement the rules, regulations and standards involved in implementing such an architecture. In [41], the need for a set of guidelines regarding policies, privacy standards, data protection and governance was identified, with data security also highlighted. Emphasis was also placed on management support and the importance of continuous analysis to ensure successful LA architecture design and development. Due to the complexity of an LA implementation, further analysis is required in areas relating to its technological, organisational, and human-related factors.

3.2.4. SSS for WLA

Ruiz-Calleja et al. [43] proposed an SSS, which is a service-based infrastructure or architecture specifically for WLA. The SSS has evolved over the years and stemmed from knowledge creation theories, specifically the three metaphors of learning, namely knowledge acquisition, participation, and knowledge creation [44], where it was used to collect data from workplace learning tools. The SSS then evolved to integrate the data into a common data model that was sent back to WLA applications to exploit the data. Due to its flexibility of design, the SSS can be adapted for different WLA situations.
Figure 3 represents a scenario for the SSS, which consists of [43]:
  • Workplace learning participants (REQ1);
  • The use of workplace learning tools, such as web applications, to learn while at work and collection tools that collect learning events (REQ2);
  • The WLA infrastructure collects the data (REQ3) from the workplace learning tools to create a dataset (REQ5);
  • This data is then sent back to the WLA applications (REQ4) in the form of data visualisation applications or recommendation systems to support the decision-making of workplace learning participants based on their learning evidence and experiences (REQ6).
Figure 4 represents a possible configuration of the SSS software architecture [43]. In this architecture, external learning tools submit their data to the SSS Activity service, which tracks learner and resource interactions. The data is stored in the Metadata service, which also manages the datasets of the SSS. The Metadata service also allows other services to access the data. The APIs implemented in the Metadata service wrap the interfaces of the databases, and the data model can then be abstracted to fit the business logic of each service. Thus, the SSS data layer is both scalable and adaptable. The business logic of the SSS is composed of a collection of services, namely Simple services, which serve a single function or manage entity types, and Composed services (for example, recommender systems), which exploit other services to provide their own functionality. Since the SSS architecture is both flexible and extensible, it can be adapted to different workplace learning situations, and the data can be presented in different ways, for example, in an LA Dashboard. The SSS includes a set of services that can be exploited by several LA functionalities, such as the Activity service, the Metadata service and the Service API, which is a REST API.
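To illustrate the kind of interaction described in this scenario, the sketch below shows a workplace learning tool posting a learning event to an Activity-style REST service; the endpoint URL, payload fields and function name are assumptions for illustration and do not reproduce the actual SSS API described in [43].

```python
# Hypothetical sketch of a learning tool submitting a learning event to an Activity-style
# REST service, as in the SSS scenario above. The endpoint URL and payload fields are
# assumptions and do not reproduce the actual SSS API described in [43].
import json
from urllib import request

SSS_ACTIVITY_ENDPOINT = "https://sss.example.org/rest/activities"  # placeholder URL

def submit_learning_event(user_id, resource_uri, verb="viewed"):
    """POST a learner-resource interaction to the Activity service."""
    event = {"user": user_id, "resource": resource_uri, "verb": verb}
    req = request.Request(
        SSS_ACTIVITY_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as response:  # the Metadata service would store it downstream
        return json.loads(response.read())

# Example call (requires a reachable service):
# submit_learning_event("employee-42", "https://lxp.example.org/courses/erp-101")
```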

4. Evaluation of Architectures

The four architectures were evaluated in terms of their common components, context, and shortcomings. The first phase of the evaluation was based on the findings from the literature review (Section 3). A summary of this evaluation is provided in Table 1. These architectures were all published between 2011 and 2019 and focused on LA in HEIs, WLA and BI. Therefore, they were all deemed relevant to the context of this research.
According to [4,15], one of the primary requirements of WLA is a layered architecture. The only architecture that specifically focused on WLA was the one found in [43]; however, it is not a layered architecture. Data is at the crux of any architecture, which was also highlighted in [4,15]. All the architectures included data collection in their layers/elements. Only two of the four architectures explicitly defined layers.
In [4,15], the importance of using visualisation techniques for data presentation in WLA was highlighted. These techniques should be incorporated into a presentation layer in the architecture. Three of the architectures included some form of presentation or visualisation element. Tools and technologies for WLA are needed to visualise data; however, only two of the four architectures made some mention of tools or technologies used.
In the second phase of the evaluations, the architectures were evaluated by the practitioner–researcher according to the typical requirements of an organisation using the real-world context of ERPCo as a proof of concept. The criteria used for the evaluation were three from the Information Quality dimension in the D&M IS Success Model. These criteria were relevance, timeliness, and completeness:
  • Relevance—to the context/environment:
    Scored 3 if in the context of WLA;
    Scored 2 if in the context of LA;
    Scored 1 if in the context of BI in general.
  • Relevance—most appropriate or relevant for the real-world context in corporate organisations such as ERPCo:
    Ranked from 1 to 4, with 1 being the least relevant and 4 being the most relevant.
  • Timeliness:
    Ranked from 1 to 4, with 1 being the oldest and 4 being the most recent.
  • Completeness:
    Ranked from 1 to 4, with 1 having the least detail and 4 having the most detail.
The scores allocated to each architecture are summarised in Table 2, where the top two overall scores are marked with an asterisk (*). From the evaluation criteria, it was evident that the SSS for WLA scored the highest for all four criteria, which revealed that it is the most relevant, the most appropriate, the most recent, and the most complete of the architectures, with a score of 14 out of a possible 15 points. However, unlike some of the other architectures, it does not have explicitly defined layers as recommended in [4,15].
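The scoring scheme can be expressed as a simple calculation, as sketched below; the individual values shown are placeholders demonstrating how a total out of 15 is obtained and are not the scores reported in Table 2.

```python
# Sketch of the scoring scheme used in the second evaluation phase. The example values
# are placeholders showing how a total out of 15 is computed, not the actual Table 2 scores.

def total_score(context_relevance, org_relevance_rank, timeliness_rank, completeness_rank):
    """Context relevance is scored 1-3; the other three criteria are ranked 1-4."""
    assert 1 <= context_relevance <= 3
    assert all(1 <= r <= 4 for r in (org_relevance_rank, timeliness_rank, completeness_rank))
    return context_relevance + org_relevance_rank + timeliness_rank + completeness_rank

example = total_score(context_relevance=3, org_relevance_rank=4,
                      timeliness_rank=4, completeness_rank=3)
print(example, "out of a possible 15")  # 14
```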
The DEK Model had the second highest score, but it was not deemed a good fit as its context is more in line with pedagogy, content, and the learners themselves, which is not the focus of this research. Aspects of relevance from the DEK Model related to data collection, knowledge, and levels/layers.

5. WLA Architecture Requirements

From the reviewed literature, the comparison of architectures and, finally, the findings of the FGD, it was deduced that a WLA architecture should at least meet the following requirements (a brief sketch illustrating the layer requirements follows the list):
  • Consist of a layered design with the following eight layers:
    Data source layer (should consider all the different types of data and sources, both internal and external);
    Data collection layer (for example, to collect learning events data);
    ETL layer;
    Data warehouse layer (storage) or dataset;
    Evaluation level;
    Knowledge level;
    Visualisation (or presentation) layer;
    End user (or workplace learning participants) layer.
  • Include the technologies (the systems, platforms, and tools) involved in WLA;
  • Include recommendations and predictions (machine learning/AI) for learning optimisation;
  • Incorporate data management, ownership, and governance.
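The layer requirements above can be captured as a simple checklist, as sketched below; the example architecture description and helper function are hypothetical and do not represent ERPCo’s Education Architecture.

```python
# Sketch that checks an architecture description against the eight required layers listed
# above. The example architecture is a hypothetical stand-in, not ERPCo's actual architecture.

REQUIRED_LAYERS = [
    "data source", "data collection", "etl", "data warehouse",
    "evaluation", "knowledge", "visualisation", "end user",
]

def missing_layers(architecture_layers):
    """Return the required layers not covered by the given architecture description."""
    provided = {layer.lower() for layer in architecture_layers}
    return [layer for layer in REQUIRED_LAYERS if layer not in provided]

example_architecture = ["Data source", "ETL", "Data warehouse", "Visualisation"]
print(missing_layers(example_architecture))
# ['data collection', 'evaluation', 'knowledge', 'end user']
```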
In order to test and validate the proposed requirements for a WLA architecture, the practitioner–researcher obtained a copy of ERPCo’s architecture for learning and WLA used at the time of writing, which was referred to as their Education Architecture. The requirements identified above from the literature were then applied to this architecture, and the following shortcomings and commonalities were identified:
  • A layered design:
    This layout was unclear in the current ERPCo Education Architecture. Although the six components were included in various forms, they were not tiered or layered.
  • Different types of data:
    The architecture should cater for structured, semi-structured, and unstructured data, as data is available in several forms. The ERPCo Education Architecture referred to data in the general sense and focused only on structured learning data. Therefore, unstructured data should be added to it.
  • Data source layer and data collection layer:
    Sources of data were especially important for ERPCo since they are reliant on different types of data from a variety of sources. A high-level mention of learning data was found in the architecture, but without much detail. In addition, no element related to the collection and integration of the data was shown.
  • ETL layer; data warehouse layer (storage) or dataset; evaluation level; and knowledge level:
    None of these layers were evident in the architecture.
  • Visualisation layer and end user layer:
    These layers were missing from the architecture, and no presentation element for data visualisations was mentioned at all.
  • Data management, ownership, and governance:
    This requirement is an important challenge for ERPCo to address as part of data management but was not included in their architecture at all.
  • Recommendations, suggestions, and predictions (machine learning/AI) for learning optimisation:
    The ERPCo Education Architecture did not cater for this requirement at all; it should, therefore, be added.
  • Include the technologies (the systems, platforms, and tools) involved in WLA:
    The various systems, platforms and tools are clearly shown in the architecture.
In conclusion, it is evident that the four reviewed architectures, as well as ERPCo’s Education Architecture, have some gaps in terms of relevance, timeliness and completeness. All existing architectures would require improvements in these three areas to be able to support successful WLA.

6. Review of WLA Architectures by FGD Participants

The FGD probing topics focused on analysing existing architectures, tools, technologies, and platforms for WLA. During the evaluation of the four architectures (Section 3), the Big Data Architecture was withdrawn due to a lack of available information. Therefore, only three of the architectures were presented to the FGD participants. They were asked to comment on whether they believed the architectures were complete and represented the main aspects, whether anything should be added to or removed from the architectures, and to provide general feedback. Participants were also asked to comment on which architecture(s) they thought were the most appropriate or relevant for ERPCo. The data were analysed using the QCA method as described in Section 2, and several themes were identified. The participants of the FGD, the architecture feedback from participants and the derived themes are described in this section.

6.1. Participant Profile

Five targeted participants who met the criteria consented to and participated in the FGD. The participants’ biographical information is summarised in Table 3. There were three females and two males, with all of them being in middle, intermediate, or top management. All five had postgraduate qualifications, and their job roles were strongly linked to managing learners and working with learning data.
In order to avoid bias, two external experts who were not involved in the research reviewed and validated the analysis of the FGD data in terms of the themes and codes identified. These experts were both academics with PhDs in Computer Science.

6.2. Themes: Review of Architectures in Literature

The FGD participants were not presented with the requirements for a WLA architecture identified in Section 5 and were, therefore, unaware of these. However, in the general discussion of the three architectures presented, these requirements were verified to some extent since they highlighted the following main themes as they related to the needs of ERPCo:
  • A data-focused architecture rather than a technically oriented one.
  • A central source of the truth for the data, which can be achieved with an integrated dataset.
  • Data availability, which can be achieved with an integrated, central source of data.
  • The importance of an integration or ETL layer.
  • The importance of middleware, for example, using a semantic server.
While reviewing the BI Architecture, P14 mentioned, “I think it’s a pretty standard definition of how the BI structure would work and we are working towards implementing very much this same type of architecture. We’ve got a few variations in it. In terms of the global goal that we have for ERPCo as a business, but you know this is that general overview and that’s pretty accurate”. P14 could apply what was available in the BI Architecture to what was in progress at ERPCo, especially in their role as BI Specialist. For example, they stated that “there’s also the integration layer between the systems, which is also an ETL process on its own, and I think when we starting to talk in the business, there is some, maybe not misunderstanding, but some confusion between the systems integration ETL and the BI ETL, which are similar but have very different focuses in terms of their desired outcome”.
P13 mentioned that the “API layer as we could potentially provide that information to them. So, I just feel like the business has so many points of getting information of which maybe we should potentially have one point of getting that information”. P14 added to this thought and said that ERPCo was working towards a central source of the truth for data and reporting, which was noted from this comment: “We do want to have that central source, that is a reflection of the individual sources of truth. And all reporting eventually once that is fully built out and can provide all the different requirements within the business for their data, all of these additional sources should go away and be replaced with their central data source”. Having a centralised data source as one source of the truth has various benefits, such as those mentioned by P14, “that way there’s a central query to the data source that everybody gets the same information, and therefore the information remains consistent across the business and is not variant depending on the ETL that person’s using or the source of that one may not be exactly the same as the other source that’s being used”.
P14 described the BI Architecture as highly technical since it includes how the data should be technically managed and stored. P8 noticed that the BI Architecture pulled data from multiple systems.
The DEK Model was reported as being less technical than the BI Architecture by P14, who said that it focuses on the “data objective point of view rather than the technical observation of how data is going to be managed”. P8 mentioned that the DEK Model purely focused on the education aspect, that it provided a general view and that it would be relevant for targeted audiences. P5 agreed and noted that “it includes subjective responses or data as well”. P8 added that “then you’d have to have rules around that subjectivity to eliminate the outliers”.
Only one participant (P14) commented on the SSS architecture and said, “I think this is a nice data flow to it, because it’s extrapolated just high enough to remove those variances between different systems and data availability and yet can still very clearly talk to how the systems are actually going to communicate. It can be translated quite easily down to the lower layer where you can talk to the individual systems and their capabilities. It specifically mentions REST APIs, but you know that technology could be a different technology and play there and just be substituted in, provided it provides similar features”.

6.3. Architecture Information Quality Comparison

The participants were asked to comment on which of the three architectures they thought was the most appropriate or relevant for ERPCo according to the three criteria of Information Quality from the D&M IS Success Model, that is, relevance, timeliness, and completeness. A summary of the findings is provided in Table 4, where it is evident that the DEK Model was the least preferred of the three. The BI Architecture and the SSS for WLA were each supported by three participants. The two main themes were the need for an architecture that supported:
  • A clear data strategy;
  • Objective and subjective content.
P13 stated that of the three, they preferred the BI Architecture [10] because they had worked with it before, and it was “easy to use and understand”. P14 supported both the BI Architecture and the SSS Architecture. The BI Architecture was preferred because of their beliefs around where data should be stored; they mentioned the following: “I think there is an opportunity for us to consider our data strategy overall and potentially use some of the ideas coming out of that to enhance our overall data strategy. Validity and wisdom in having a look at that and comparing it to our data strategy and seeing where we can improve.” P14 supported the SSS Architecture as well and felt that it would meet most of their needs.
P8 mentioned that the DEK Model, from the perspective of education data, would be a good fit in the education space since it included both objective and subjective content. However, P8 also noted that “when we spread further, we are wanting to look across systems with triggers from different instances that happen in other systems” and added that it would be a better option to apply the BI Architecture in the interim and the SSS Architecture in the future. P13 and P14 also had varying opinions on whether the DEK Model could be incorporated into the BI Architecture at the end user layer, with only one of them mentioning that it could be incorporated while the other participant did not think so.
The participants were then asked to consider the three presented architectures and compare them with the ERPCo Education Architecture, considering specific aspects such as layers, presentation, data ownership, data governance, data security and types of data, for example, structured, semi-structured or unstructured data.
Regarding the layers, P14 mentioned that many of the layers fell in between or across some of the different areas and said, “we’ve got the database, that’s our initial landing database. We’re gathering the information into, if you relate that back to your BI model, that’s part of where we were hitting the ETL layer. And then when it moves into ZAP, it’s where it’s going into the data warehousing. And then when it’s coming out the top to Power BI, it is where you’re starting to see, conceptually the data marts and the reporting layer at the top. So, the layers are kind of broken up between the different pieces. Here the view is not showing it in layers but the layers do still implicitly exist underneath”. P13 agreed with this explanation from P14.
For the types of data, P14 mentioned the following: “comments of the structured, unstructured I would hesitate to say, but it’s pretty much everything we’re working with. It would fall into the category of structured data. I don’t know of anywhere that we’re consuming anything that sort of would fall even vaguely into the category of unstructured data”.

7. Discussion and Layered WLA Architecture

Considering the literature review of existing architectures and the feedback from the participants of the FGD, the proposed Layered WLA Architecture is illustrated in Figure 5. Since the SSS Architecture was deemed the most relevant from the literature review (Table 2) and was also supported by the participants of the FGD (Section 6), it was used as the foundation for the proposed architecture. However, elements from the BI Architecture, the DEK Model and the ERPCo Education Architecture were also considered and incorporated.
The Layered WLA Architecture has four layers with the SSS (Middleware) at the centre. The bottom layer is for Data Sources, both internal and external. For example, the dataset includes learning data from the LMS/LXP, partner user data from the Partner Relations Management (PRM) system, customer user data from the Customer Relationship Management (CRM) system, project data from the Project Management (PM) system, and Pay-As-You-Go (PAYG) user data from the eCommerce platform. The dataset also includes data about the users (learners), including their interactions and assigned learning content. These layers and their associated components should work together to achieve WLA success.
The Data Warehouse layer incorporates the databases and integration systems/workflow tools, as well as the simple, composed and metadata services, that is, the REST APIs and the Data API. User resources and interactions (data) are transported via an Activity Service REST API. The data warehouse layer is where the ETL processes take place. The SSS is the middleware that sits between the data warehouse and the next layer, which is the Tools and Technology layer. This layer contains all the tools for content creation, data visualisation, learning (for example, LXP/LMS data), AI and machine learning (recommender systems), and external platforms. The top layer is Visualisations, which includes the LA Dashboard as well as the stakeholder users who will use the dashboard visualisations to make informed decisions.
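A minimal sketch expressing the proposed Layered WLA Architecture as a configuration structure is given below; the layer and component names follow Figure 5 as described above, while the structure itself (including the internal/external split shown) is only an illustrative representation and not an implementation.

```python
# Sketch of the proposed Layered WLA Architecture (Figure 5) as a simple configuration
# structure. Layer and component names follow the description above; the dictionary form
# and the internal/external grouping are illustrative assumptions.

LAYERED_WLA_ARCHITECTURE = {
    "data_sources": {  # bottom layer: internal and external sources
        "internal": ["LMS/LXP learning data", "PRM partner user data",
                     "CRM customer user data", "PM project data"],
        "external": ["eCommerce PAYG user data"],
    },
    "data_warehouse": {  # ETL, storage and the SSS (middleware) services
        "components": ["databases", "integration/workflow tools",
                       "simple services", "composed services", "metadata service"],
        "apis": ["Activity service REST API", "Data API"],
    },
    "tools_and_technology": ["content creation", "data visualisation", "LXP/LMS",
                             "AI/machine learning (recommender systems)",
                             "external platforms"],
    "visualisations": ["LA Dashboard", "stakeholder decision-making views"],
}

# Example use: list the layers from the bottom up.
for layer in LAYERED_WLA_ARCHITECTURE:
    print(layer)
```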
The findings highlighted that a WLA architecture should provide a central source of the truth, pulling data from multiple systems, to ensure the success of WLA. The proposed Layered WLA Architecture meets this requirement and also aligns with the DIKW Hierarchy, as each layer of the architecture corresponds to data, information, knowledge and wisdom, respectively, and these layers impact the various tools and systems required for WLA.
The results of the evaluations of the architectures in the first two phases indicate that, for an architecture to impact the success of WLA, it needs to be relevant to the context, complete and timely. If these characteristics are present, it is an indicator of the quality of information being presented and included in the architecture. This aligns with the Information Quality dimension of the D&M IS Success Model. For this reason, it can be argued that the research has transferability, since relating research to an existing theory broadens its theoretical significance [45].

8. Conclusions

This paper answered the research question: What architecture can be used for successful WLA? through theoretical and practical evaluations. In the first two phases of the research, four architectures were identified in the literature review and reviewed using the D&M IS Success Model dimension of Information Quality. A set of common requirements for WLA was identified. These requirements were then validated by the researcher by applying them to the real-world context at ERPCo. Of the architectures reviewed, the SSS was deemed the most recent and most relevant for the context; however, there were still gaps in its requirements specific to WLA and ERPCo.
In the FGD, the five participants who all worked at ERPCo with learners and/or learning data confirmed the SSS as one of the most appropriate/relevant architectures for an organisation such as ERPCo due to its attributes related to data flow, data availability and how the systems communicate with each other.
The findings of these evaluations were used as input to the design of the Layered WLA Architecture (Figure 5), which has four distinct layers, namely: visualisation, tools and technology, data warehouse, and internal and external data sources. In the data warehouse layer, it is important to note that there is a clear distinction between the systems integration ETL and the BI ETL as they focus on different desired outcomes.
The proposed Layered WLA Architecture is an architecture that can be adopted by corporate organisations to support WLA’s success. Without the adoption and use of such an architecture, the success of WLA could be at risk. Potential challenges that could arise whilst implementing the proposed WLA Architecture could be related to stakeholder support, budget constraints in acquiring the various tools and technology, as well as the skills required to configure and maintain the infrastructure.
Researchers can use the WLA Architecture as a point of reference for what constitutes successful WLA at corporate organisations, as well as expand on it as tools and technologies develop in the future. The adoption of a WLA architecture can assist Smart Cities and Smart Organisations to improve sustainability, specifically in terms of lifelong learning, data, and technology. Since the real-world context was the case of a global company in the field of software and software training, the results could be applicable to organisations in any country that operate in a similar industry, that is, one that is global and provides online training on software.
There are some possible threats to validity since the number of participants was fairly low. However, the feedback from these participants provided some interesting lessons learned from the real-world context, which might be applicable to similar companies. In addition, since the architecture was designed from best practices found in published architectures, it has already been validated. The feedback from participants served as additional qualitative data and feedback only; it did not impact the architecture proposed in the paper, but merely provided additional validity to it.
Future research could provide further validation to this work if applied in a different organisation in a similar context (generalisability) or adopted in a different context to further evaluate transferability. Transferability of the findings is provided by using a theory to validate the approach; this is an additional step that was used to alleviate any possible threats to validity. In this study, transferability was ensured through using the D&M IS Success Model theory. Due to only one company in the real-world context being studied, it cannot definitively be generalised to all real-world contexts in all industries. Future research should implement the architecture in similar contexts but with more than one company and a larger sample size. Future work could also further analyse data requirements for successful WLA adoption within corporate organisations.

Author Contributions

Conceptualisation, A.W. and B.S.; methodology, A.W. and B.S.; software, A.W.; validation, A.W. and B.S.; formal analysis, A.W. and B.S.; investigation, A.W. and B.S.; resources, A.W. and B.S.; data curation, A.W. and B.S.; writing—original draft preparation, A.W. and B.S.; writing—review and editing, A.W. and B.S.; visualisation, A.W.; supervision, B.S.; project administration, A.W. and B.S.; funding acquisition, A.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by ERPCo (the company the primary researcher works for).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Nelson Mandela University Research Ethics Committee—Human (REC-H) with an ethics clearance reference number H22-SCI-CSS-009 on 1 November 2022.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets presented in this article are not readily available because of privacy, legal, and ethical reasons concerning the participants of this study.

Acknowledgments

We thank Simone Beets for assisting with proofreading and technical checking.

Conflicts of Interest

The authors declare that this study received funding from ERPCo. The funder was not involved in the study design, collection, analysis, interpretation of data, the writing of this article or the decision to submit it for publication.

References

  1. Brna, A.P.; Brown, R.C.; Connolly, P.M.; Simons, S.B.; Shimizu, R.E.; Aguilar-Simon, M. Uncertainty-based modulation for lifelong learning. Neural Netw. 2019, 120, 129–142. [Google Scholar] [CrossRef] [PubMed]
  2. Agirre, E.; Jonsson, A.; Larcher, A. Framing Lifelong Learning as Autonomous Deployment: Tune Once Live Forever. In Proceedings of the International Workshop on Spoken Dialogue Systems Technology, Siracusa, Italy, 6 September 2019. [Google Scholar]
  3. Giffinger, R.; Wien, T.U.; Fertner, C.; Kalasek, R.; Milanović, N.P. Smart Cities Ranking of European Medium-Sized Cities; Vienna University of Technology: Vienna, Austria, 2007. [Google Scholar]
  4. Van der Hoogen, A. An Enterprise Architecture for Environmental Information Management and Reporting; Nelson Mandela Metropolitan University: Gqeberha, South Africa, 2013. [Google Scholar]
  5. Scott, L. Learning Cities as Smart Cities: Connecting Lifelong Learning and Technology. In Examining the Socio-Technical Impact of Smart Cities; IGI Global: Hershey, PA, USA, 2021; pp. 68–90. [Google Scholar] [CrossRef]
  6. Brown, K.; Larionova, V.A.; Lally, V. Lifelong learning as a tool for the development of smart cities: Technology enhanced learning as an enabler. R-Economy 2018, 4, 133–143. [Google Scholar] [CrossRef]
  7. Zhuang, R.; Fang, H.; Zhang, Y.; Lu, A.; Huang, R. Smart Learning environments for a Smart City: From the perspective of lifelong and lifewide learning. Smart Learn. Environ. 2017, 4, 6. [Google Scholar] [CrossRef]
  8. Kopp, T.; Kinkel, S.; Schäfer, T.; Kieslinger, B.; Brown, A.J. Measuring the impact of learning at the workplace on organisational performance analytics. Int. J. Product. Perform. Manag. 2020, 69, 1455–1474. [Google Scholar] [CrossRef]
  9. Longmore, A.L.; Grant, G.; Golnaraghi, G. Closing the 21st-century knowledge gap: Reconceptualizing teaching and learning to transform business education. J. Transform. Educ. 2018, 16, 197–219. [Google Scholar] [CrossRef]
  10. Ong, I.L.; Siew, P.H.; Wong, S.F. A Five-Layered Business Intelligence Architecture. Commun. IBIMA 2011, 2011, 695619. [Google Scholar] [CrossRef]
  11. Sedkaoui, S.; Khelfaoui, M. Understand, develop and enhance the learning process with Big Data. Inf. Discov. Deliv. 2019, 47, 2–16. [Google Scholar] [CrossRef]
  12. Parrish, A.H.; Richman, L.J. Dual perspectives on learning analytics in higher education. J. Appl. Res. High. Educ. 2020, 12, 4–14. [Google Scholar] [CrossRef]
  13. Aldiab, A.; Chowdhury, H.; Kootsookos, A.; Alam, F.; Allhibi, H. Utilization of Learning Management Systems (LMSs) in higher education system: A case review for Saudi Arabia. Energy Procedia 2019, 160, 731–737. [Google Scholar] [CrossRef]
  14. Johansson, C.; Parida, V.; Larsson, A.C. How are Knowledge and Information Evaluated?—Decision-making in Stage-gate Processes. In Proceedings of the ICED 09, the 17th International Conference on Engineering Design, Palo Alto, CA, USA, 24–27 August 2009; pp. 195–206. [Google Scholar]
  15. Villegas-Ch, W.; Luján-Mora, S.; Buenaño-Fernandez, D. Data Mining Toolkit for Extraction of Knowledge from LMS. In Proceedings of the 2017 9th International Conference on Education Technology and Computers, Barcelona, Spain, 20–22 December 2017; pp. 31–35. [Google Scholar] [CrossRef]
  16. Rulevy, D.F.; Aprilianti, A. The Analysis of Factors That Affect Intention to Use on E-learning Users Using Technology Acceptance Model (TAM) Approach. In Proceedings of the 5th Global Conference on Business, Management and Entrepreneurship (GCBME 2020), Bandung, Indonesia, 8 August 2020; pp. 602–608. [Google Scholar]
  17. Attwell, G.; Kieslinger, B.; Blunk, O.; Schmidt, A.P.; Schaefer, T.; Jelonek, M.; Kunzmann, C.; Prilla, M.; Reynard, C. Workplace Learning Analytics for Facilitation in European Public Employment Services; CEUR Workshop: Aachen, Germany, 2016; Volume 1601, pp. 91–97. [Google Scholar]
  18. Seddon, P.B.; Constantinidis, D.; Tamm, T.; Dod, H. How does business analytics contribute to business value? Inf. Syst. J. 2017, 27, 237–269. [Google Scholar] [CrossRef]
  19. Henriques, T.A.; O’Neill, H. Design science research with focus groups—A pragmatic meta-model. Int. J. Manag. Proj. Bus. 2021, 16, 119–140. [Google Scholar] [CrossRef]
  20. Ågerfalk, P.J. Getting pragmatic. Eur. J. Inf. Syst. 2010, 19, 251–256. [Google Scholar] [CrossRef]
  21. DeLone, W.H.; McLean, E.R. Information Systems Success Revisited. In Proceedings of the Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 7–10 January 2002; pp. 2966–2976. [Google Scholar]
  22. Wang, Y.; Wang, H.; Albert, L.J. MOOC Relevance: A Key Determinant of the Success for Massive Open Online Courses. J. Inf. Syst. Educ. 2023, 34, 456–471. [Google Scholar]
  23. Lessa, L.; Negash, S.; Mekonnen, T. Respecifying DeLone and McLean Information Systems Success Model for Measuring ERP System Post-implementation Success. In Proceedings of the Twenty-eighth Americas Conference on Information Systems, AMCIS 2022, Minneapolis, MN, USA, 10–14 August 2022. [Google Scholar]
  24. DeLone, W.H.; McLean, E.R. Information Systems Success Measurement; Now Publishers Inc.: Hanover, NH, USA, 2016. [Google Scholar]
  25. Hevner, A.; Chatterjee, S. Design Research in Information Systems; Springer: New York, NY, USA, 2010. [Google Scholar]
  26. Yin, R.K. Qualitative Research from Start to Finish; The Guilford Press: New York, NY, USA, 2011. [Google Scholar]
  27. Kotsias, J.; Ahmad, A.; Scheepers, R. Adopting and integrating cyber-threat intelligence in a commercial organisation. Eur. J. Inf. Syst. 2023, 32, 35–51. [Google Scholar] [CrossRef]
  28. Erlingsson, C.; Brysiewicz, P. A hands-on guide to doing content analysis. Afr. J. Emerg. Med. 2017, 7, 93–99. [Google Scholar] [CrossRef]
  29. Ramírez-Correa, P.E.; Rondan-Cataluña, F.J.; Arenas-Gaitán, J.; Alfaro-Perez, J.L. Moderating effect of learning styles on a learning management system’s success. Telemat. Inform. 2017, 34, 272–286. [Google Scholar] [CrossRef]
  30. Pour, M.J.; Mesrabadi, J.; Hosseinzadeh, M. A comprehensive framework to rank cloud-based e-learning providers using best-worst method (BWM): A multidimensional perspective. Online Inf. Rev. 2019, 44, 114–138. [Google Scholar] [CrossRef]
  31. Feffer, M. LXP vs. LMS: What Are the Differences? Tech Target. Available online: https://www.techtarget.com/searchhrsoftware/tip/LXP-vs-LMS-What-are-the-differences (accessed on 24 April 2022).
  32. Ruiz-Calleja, A.; Prieto, L.P.; Ley, T.; Rodriguez-Triana, M.J.; Dennerlein, S. Learning Analytics for Professional and Workplace Learning: A Literature Review. In Proceedings of the European Conference on Technology Enhanced Learning, Tallinn, Estonia, 12–15 September 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 164–178. [Google Scholar]
  33. Enofe, M.O. Data Management in an Operational Context: A Study at Volvo Group Trucks Operations. Master’s Thesis, School of Economics and Management, Lund University, Lund, Sweden, 2017. [Google Scholar]
  34. Sivarajah, U.; Kamal, M.M.; Irani, Z.; Weerakkody, V. Critical analysis of Big Data challenges and analytical methods. J. Bus. Res. 2017, 70, 263–286. [Google Scholar] [CrossRef]
  35. Mohlameane, M.; Ruxwana, N. Exploring the impact of cloud computing on existing South African regulatory frameworks. S. Afr. J. Inf. Manag. 2020, 22, 1–9. [Google Scholar] [CrossRef]
  36. Pecori, R.; Suraci, V.; Ducange, P. Efficient computation of key performance indicators in a distance learning university. Inf. Discov. Deliv. 2019, 47, 96–105. [Google Scholar] [CrossRef]
  37. Shankar, S.K.; Prieto, L.P.; Rodriguez-Triana, M.J.; Ruiz-Calleja, A. A review of multimodal learning analytics architectures. In Proceedings of the IEEE 18th International Conference on Advanced Learning Technologies, ICALT 2018, Mumbai, India, 9–13 July 2018; pp. 212–214. [Google Scholar]
  38. Armatas, C.; Spratt, C.F. Applying learning analytics to program curriculum review. Int. J. Inf. Learn. Technol. 2019, 36, 243–253. [Google Scholar] [CrossRef]
  39. Bennett, L.; Folley, S. Four design principles for learner dashboards that support student agency and empowerment. J. Appl. Res. High. Educ. 2020, 12, 15–26. [Google Scholar] [CrossRef]
  40. Matsebula, F.; Mnkandla, E. A big data architecture for learning analytics in higher education. In Proceedings of the 2017 IEEE AFRICON: Science, Technology and Innovation for Africa, AFRICON 2017, Cape Town, South Africa, 18–20 September 2017; pp. 951–956. [Google Scholar] [CrossRef]
  41. Drlik, M.; Skalka, J.; Svec, P.; Kapusta, J. Proposal of Learning Analytics Architecture Integration into University IT Infrastructure. In Proceedings of the IEEE 12th International Conference on Application of Information and Communication Technologies, AICT 2018, Almaty, Kazakhstan, 17–19 October 2018; pp. 1–6. [Google Scholar]
  42. Skalka, J.; Drlik, M.; Svec, P. Knowledge Discovery from University Information Systems for Purposes of Quality Assurance Implementation. In Proceedings of the IEEE Global Engineering Education Conference, EDUCON, Berlin, Germany, 13–15 March 2013; pp. 591–596. [Google Scholar] [CrossRef]
  43. Ruiz-Calleja, A.; Dennerlein, S.; Kowald, D.; Theiler, D.; Lex, E.; Ley, T. An infrastructure for workplace learning analytics: Tracing knowledge creation with the social semantic server. J. Learn. Anal. 2019, 6, 120–139. [Google Scholar] [CrossRef]
  44. Paavola, S.; Hakkarainen, K. The knowledge creation metaphor—An emergent epistemological approach to learning. Sci. Educ. 2005, 14, 535–557. [Google Scholar] [CrossRef]
  45. Saunders, M.; Lewis, P.; Thornhill, A. Research Methods for Business Students, 5th ed.; Pearson Education Limited: Essex, UK, 2019. [Google Scholar]
Figure 1. BI Architecture [10] (p. 3).
Figure 2. DEK Model [41].
Figure 3. Potential Scenario Supported by the SSS Architecture [43] (p. 124).
Figure 4. Possible Configuration of the SSS Software Architecture [43] (p. 124).
Figure 5. Layered WLA Architecture.
Table 1. Comparison of BI and LA Architectures.

Architecture: BI Architecture [10] (Year: 2011; Context: BI)
Components — five layers:
  • Data source layer
  • ETL layer
  • Data warehouse layer
  • Metadata layer
  • End user layer (visualisation)
Shortcomings:
  • Does not explicitly refer to tools or technologies
  • Not specific to WLA

Architecture: Big Data Architecture for LA [40] (Year: 2017; Context: LA, HEIs)
Components — five elements:
  • Data gathering devices (sources)
  • Data storage and management systems
  • Data analytics systems (predictions)
  • Data visualisations (presentation)
  • Action
Shortcomings:
  • No detail provided
  • The layers are not explicitly defined
  • Not specific to WLA
  • Actions (can be used to assist adaptivity)

Architecture: DEK Model [41] (Year: 2018; Context: LA, HEIs)
Components — four layers/levels/tiers:
  • Data collection level
  • Evaluation level
  • Knowledge level
  • Presentation layer
Shortcomings:
  • Does not explicitly refer to tools or technologies
  • Not specific to WLA

Architecture: SSS for WLA [43] (Year: 2019; Context: WLA)
Components — five requirements:
  • Workplace learning participants
  • Workplace learning tools (web application and collaboration tool)
  • Collects learning events (data)
  • Workplace LA applications (data visualisation application and recommendation systems)
  • Dataset
Shortcomings:
  • The layers are not explicitly defined
Table 2. Evaluation of BI and LA Architectures.

Architecture | Relevance to Context | Relevance | Timeliness | Completeness | Total
BI Architecture [10] | 1 | 2 | 1 | 3 | 7
Big Data Architecture for LA [40] | 2 | 1 | 2 | 1 | 6
DEK Model [41] | 2 | 3 | 3 | 3 | 11 *
SSS for WLA [43] | 3 | 4 | 4 | 3 | 14 *
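To make the comparison and scoring above concrete, the sketch below encodes two of the candidate architectures from Table 1 together with their criterion scores from Table 2 and reproduces the reported totals. This is an illustrative sketch only, written in Python; the class name ArchitectureCandidate, its field names, and the helper total_score are our own constructs for illustration and are not part of the published architectures or of the study's tooling.

```python
# Illustrative sketch only: the layer lists come from Table 1 and the criterion
# scores from Table 2; all identifiers are assumptions made for this example.
from dataclasses import dataclass, field


@dataclass
class ArchitectureCandidate:
    """A candidate architecture with its layers (Table 1) and criterion scores (Table 2)."""
    name: str
    year: int
    context: str
    layers: list[str] = field(default_factory=list)
    scores: dict[str, int] = field(default_factory=dict)

    def total_score(self) -> int:
        """Sum the criterion scores, as in the Total column of Table 2."""
        return sum(self.scores.values())


CRITERIA = ["Relevance to Context", "Relevance", "Timeliness", "Completeness"]

candidates = [
    ArchitectureCandidate(
        name="BI Architecture [10]", year=2011, context="BI",
        layers=["Data source", "ETL", "Data warehouse", "Metadata",
                "End user (visualisation)"],
        scores=dict(zip(CRITERIA, [1, 2, 1, 3])),
    ),
    ArchitectureCandidate(
        name="DEK Model [41]", year=2018, context="LA, HEIs",
        layers=["Data collection", "Evaluation", "Knowledge", "Presentation"],
        scores=dict(zip(CRITERIA, [2, 3, 3, 3])),
    ),
]

for candidate in candidates:
    print(f"{candidate.name}: total = {candidate.total_score()}")
```

Running the sketch prints totals of 7 for the BI Architecture [10] and 11 for the DEK Model [41], matching the Total column in Table 2.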
Table 3. Summary of FGD Participants.

Participant Code | Age Group | Gender | Current Job Title/Role | Years of Experience in Current Job Role | Current Job Level | Highest Level of Qualification
P5 | 35–44 years old | Female | Education Executive; Global Program Manager: Partners | 3–5 years | Middle Management | Honours Degree
P8 | 55–64 years old | Female | Customer Relations Executive | More than 10 years | Top Management (Director/Executive) | Bachelor’s Degree
P10 | 35–44 years old | Male | Chief People Officer; Chief Strategy and People Officer | 6–10 years | Top Management (Director/Executive) | Master’s Degree
P13 | 25–34 years old | Female | Systems Owner | 6–10 years | Intermediate/Experienced (Senior Team Member) | Bachelor’s Degree
P14 | 45–54 years old | Male | Business Intelligence Specialist | Less than 2 years | Intermediate/Experienced (Senior Team Member) | Diploma
Table 4. Most Appropriate/Relevant Architecture for ERPCo.

Architecture | Participants Who Verified
BI Architecture [10] | P8, P13, P14
DEK Model [41] | P8
SSS for WLA [43] | P8, P13, P14