Article

Enriching User-Visitor Experiences in Digital Museology: Combining Social and Virtual Interaction within a Metaverse Environment

Instituto de Diseño y Fabricación, Universitat Politècnica de València, 46022 València, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(9), 3769; https://doi.org/10.3390/app14093769
Submission received: 20 March 2024 / Revised: 25 April 2024 / Accepted: 26 April 2024 / Published: 28 April 2024

Abstract

This study investigates the potential of integrating multilayer animations and sophisticated shader technologies to enhance visitor social interactions within metaverse exhibition spaces. It is part of a broader initiative aimed at developing innovative digital museology strategies that foster social engagement through virtual reality (VR) experiences. The methodology adopted seeks to provide a more immersive and human-centric exploration of 3D digital environments by blending elements of physical spaces with the interactive dynamics common in video games. A virtual exhibition space themed around Mars was created as a testbed to facilitate social interactions among users, who navigate the environment via avatars. This digital space was developed using a specialized Unity template designed by the metaverse platform Spatial.io. Overcoming the programming constraints imposed by Spatial.io, which limits the use of external scripts for security and stability, posed a significant challenge. Nonetheless, by leveraging the ability to modify shader codes used for material creation and employing advanced animation techniques with layered effects, the authors of this work achieved dynamic material responses to lighting changes and initiated complex asset interactions beyond simple linear animations.

1. Introduction

A noticeable trend is emerging in the evolution of conventional museum displays, marked by the incorporation of multisensory Extended Reality (XR) technologies. This category encompasses virtual reality (VR), augmented reality (AR), and mixed reality (MR), each adding a layer of immersive sensory experience to the traditional exhibition format [1]. Indeed, the use of XR technologies in cultural heritage is increasingly confirmed as positive [2]. Despite the currently small number of publications on this topic, “the majority of the authors are developing VR experiences” by resorting to game-engine software. In fact, these types of technologies present “clear benefits for attracting visitors and encouraging revisits”, enabling users to experience more engaging, immersive, and meaningful cultural heritage content [3].
Recent studies have shown that VR in museums not only enhances the visitor’s experience by providing immersive and interactive learning opportunities but also serves as a powerful tool for storytelling and preservation of cultural heritage. For instance, Lee et al. discovered that absorptive experiences in VR significantly influence the overall museum VR tour experience and visitor’s intention to visit a museum, suggesting that VR can substantially enhance the educational and entertainment value of museum visits [4].
Furthermore, immersive VR applications in cultural heritage contexts have shifted the way individuals access and experience cultural assets, from virtual museum tours to exploring ancient buildings, thus broadening the reach and accessibility of cultural education [5]. These technologies not only augment the visitor’s experience but also offer new methodologies for cultural heritage preservation and interpretation, enabling a richer and more detailed understanding of past civilizations and cultural narratives [6,7].
The adoption of virtual exhibition spaces in the metaverse responds to the progress of cutting-edge technologies and the interest of museums in integrating them as a complementary activity to attract visitors, improve their relationship with them, and add value to their exhibits. Also, these virtual environments contribute to an immersive and accessible experience that provides new ways of interacting with culture.
One of the key motivations behind adopting these technologies is their ability to transform the conventional museum experience. Virtual and augmented realities enable users to explore, learn, and interact in ways that go beyond traditional physical boundaries, making cultural heritage accessible to a global audience. This is particularly vital for providing educational and cultural access to remote or underserved populations, who can now experience world-class exhibits from the comfort of their own homes.
The motivation also stems from the potential of these technologies to create engaging and dynamic environments. Such environments encourage users to interact not only with the exhibits but also with each other, promoting a sense of community and shared learning. Advanced technologies allow for the creation of interactive narratives and educational programs within the virtual space, enhancing the user’s engagement and understanding of cultural heritage.
Furthermore, the shift toward digital integration in museums reflects broader societal trends towards digitalization and virtual interaction. As digital literacy grows and virtual experiences become more common, museums are adapting to meet the expectations of a technologically adept audience. The adoption of immersive technologies is a strategic response to these trends, positioning museums as forward-thinking, inclusive, and adaptive institutions in the digital age.
By embracing virtual and augmented realities, museums are not only expanding their reach but also redefining the possibilities of cultural engagement and education, ensuring they remain relevant and resonant in an increasingly digital world.
Consequently, this work’s objective of harnessing and implementing cutting-edge technologies has led to the development of a platform that revolutionizes the traditional museum experience. This initiative is underpinned by the conviction that educational opportunities should transcend the physical boundaries of conventional museum spaces, whether accessed from a university, a conference, or the comfort of one’s home.
The strategic vision of the authors of this work for future museums includes equipping them to offer visitors a more immersive interaction with exhibits. For instance, the proposed approach enables users to virtually “walk through” historical sites or engage with detailed reconstructions of significant events that conventional displays might fail to convey comprehensively. This method not only facilitates the exploration of rare artifacts and fragile ecosystems without risking damage but also adapts educational content to accommodate diverse learning styles through interactive narratives and gamification strategies. Such enhancements not only increase the educational value of museum visits but also appeal to a broader demographic, particularly younger, tech-savvy audiences who are drawn to dynamic and interactive experiences.
Furthermore, this technological advancement in museums extends beyond maintaining relevance in today’s digital age, since it significantly widens the educational scope by engaging a larger audience. The proposed method also aims for these virtual experiences to enhance user interaction, thereby enriching the cultural exhibits. By rendering learning experiences more accessible and captivating beyond the traditional limits of a museum, the proposed approach redefines the concept of visiting and learning from a museum. This strategic evolution is designed to enable the museum sector to adapt and prosper within the modern digital context.
The integration of VR and holographic technologies has notably transformed museum and educational applications, enhancing visitor engagement and learning experiences significantly. This evolution began with the pioneering work of Lee et al. [8], who introduced a novel content control technology using hand gestures to enhance the usability of holographic displays. Concurrently, Zhao et al. [9] developed VR-based 3D modeling and interaction technologies specifically for museums, significantly elevating visitor satisfaction and engagement. Additionally, Zhu et al. [10] extended the application of gesture interactions to the automotive industry, analyzing skylight gesture interaction for automobiles and establishing a set of design criteria based on data from young Chinese drivers.
Building on these foundations, other researchers have employed VR to create immersive and interactive virtual museum exhibits. Such initiatives not only enrich visitor interactions by enabling novel ways of engagement but also incorporate advanced deep learning technologies to assist curators in creating more engaging narratives [11,12]. The broader implementation of VR in museums has fostered a more dynamic visitor experience, facilitating direct interaction with exhibits in a virtual space, with international applications exploring the diversity of visitor experiences, particularly in cultural contexts such as China [13,14,15].
Furthermore, targeted applications for younger audiences have emerged, where interactive VR allows children to animate their drawings within a virtual environment, thereby promoting educational engagement through an amalgamation of art and technology [16]. Meileni et al. [17] developed an Android-based VR application for the Balaputra Dewa Museum, offering virtual tours that enhance visitor accessibility and understanding of historical collections through immersive 360° photos. Similarly, Schlachhoff et al. [18] introduced RHINO-VR, an interactive museum exhibit using VR to teach mobile robotics concepts, revitalizing the historical tour guide robot RHINO and allowing visitors to interact with the robot in a virtual environment.
The integration of tangible interaction artifacts with digital information in museum exhibits has also significantly enhanced visitor interaction and learning [15,19]. This multidimensional approach deepens the educational impact of digital content, providing a multisensory learning environment [20]. Comprehensive surveys have discussed the use of VR in structural design and cultural heritage, highlighting VR’s capability to enhance interaction with cultural artifacts and architectural structures, aiding significantly in education and preservation practices [21].
In this vein, Yokutkhon [21] emphasizes the transformative impact of VR and AR in education, particularly in Eastern countries, by enhancing traditional educational methods and improving learning outcomes through immersive and interactive experiences. Recent studies, including that by Gülen et al. [20], discuss the integration of STEM (Science, Technology, Engineering, and Mathematics) education within the metaverse, highlighting both the challenges and opportunities presented. This revolutionary approach adapts educational environments to advanced technological ecosystems, offering immersive and interactive learning experiences.
Jiang et al. [22] presented an intelligent digital museum system that uses mixed reality and AI-driven data analysis to enhance human–computer interaction and make the display of cultural artifacts more accessible and engaging. Kanematsu et al. [23] discussed a virtual STEM class for nuclear safety education within the metaverse, blending e-learning with hands-on activities to enhance student engagement and understanding at an early educational stage. Solanes et al. [24] explored the potential of metaverses for STEM education, presenting a methodological framework and a case study on the integration of interactive metaverses into STEM courses.
The metaverse, primarily a playground for brands to explore commercial opportunities, also offers museums a unique opportunity to cater to their audience in innovative ways [25]: by facilitating immersive learning experiences and interactive engagements, museums can transcend traditional boundaries, blending education with entertainment in a manner that is both engaging and accessible [26]. The initial challenges of metaverse content creation, marked by complex development processes, have lessened with the advent of user-friendly virtual spaces that can be customized directly from a web browser, significantly lowering the barriers to entry and democratizing the creation and consumption of digital content.
In this dynamic landscape, the duration of VR tours can be more flexible, allowing for a more personalized adaptation to the user’s needs and preferences. It is advisable to consider this aspect in the initial stages of storytelling and experience design, bearing in mind the final intended configuration [27]. This factor can be particularly important when analyzing user experience, as the duration of the experience can influence the user’s satisfaction, level of engagement, and overall perception of the visit.
The interactivity inherent in the metaverse offers museums an unprecedented opportunity to revolutionize visitor engagement and education. Drawing from the insights of Hennig-Thurau et al. [28], it is evident that advancements in VR technology are pivotal, significantly enhancing the user’s sense of spatial presence, a crucial element for immersive VR applications as highlighted by Bailenson [29]. These capabilities extend beyond individual experiences, fostering real-time social interactions among users represented by avatars, and enabling a form of engagement that transcends geographical and physical limitations [7,30]. Visitors can engage with exhibits, participate in guided tours, and interact with other visitors from around the world as if they were physically present together in the same space.
Al-Jundi and Tanbour have also emphasized the importance of such immersive experiences in enhancing learning outcomes and visitor satisfaction [31]. The metaverse, with its capacity for high-fidelity social interaction, opens up new avenues for collaborative learning and community building within museum spaces. Museums can leverage this technology to host interactive workshops, live question and answer sessions with experts, and collaborative art projects, fostering a sense of community among visitors [32].
The integration with digital technologies has significantly transformed the way museums interact with their audiences. The use of online resources and digital access has made museum collections, exhibitions, and educational materials remotely accessible, enriching visitors’ understanding through virtual tours, digital audio guides, and other interactive digital experiences, a concept supported by earlier observations from Hawkey [33,34]. The COVID-19 pandemic accelerated this digital shift, necessitating innovative approaches to mitigate the loss of human capital and maintain public engagement with cultural heritage globally [35]. In response, many museums turned to emerging technologies as a lifeline, thereby preserving public interest and participation.
Dohoney introduces the concept of “smart museums”, which synergize traditional exhibitions with cutting-edge technologies. These institutions leverage immersive technology to enhance the delivery of complex cultural heritage material to visitors. This approach has not only broadened access but also deepened the engagement and learning opportunities for audiences worldwide [36]. However, while embracing these technological advances, it is crucial for museums to strike a balance between interactivity and historical accuracy. The models and recreated spaces within museums, especially those pertaining to heritage, must maintain a level of realism [37]. They should offer interactive experiences that are engaging and educational yet remain faithful to the reality they represent. This balance ensures that the immersive experiences not only captivate and educate visitors but also respect and accurately convey the cultural and historical significance of the exhibits. Through careful implementation of emerging technologies, museums can continue to serve as gateways to the past, providing immersive and educational experiences that resonate with a modern audience while preserving the integrity of the cultural heritage they showcase.
The exploration of new technologies and the functionalities offered by the metaverse is pivotal in understanding their appeal and potential application within museum spaces [38], both in traditional exhibition settings and in the context of interactive or “new museum” environments. This investigation is not only about gauging the attractiveness of these innovations to audiences but also about assessing their practicality, effectiveness, and the added value they can bring to the museum experience.
At the heart of this exploration is the utilization of specific shaders and parametric animations within Unity, which are tools that represent just a fraction of the technological capabilities available. Shaders, for example, have been employed to create dynamic visual effects that can reveal hidden aspects of exhibits or artworks when viewed through a digital lens [39]. This can transform a static viewing experience into an interactive exploration, inviting visitors to engage more deeply with the content. Such shaders can uncover layers of an artwork, historical context, or detailed insights into an artifact’s significance, enriching the visitor’s understanding and appreciation.
Parametric animations add another layer of interactivity, enabling objects within the metaverse to move and change not randomly but in response to predefined parameters or user interactions. With relatively few parameters, these animations can simulate real-world physics, create lifelike movements, or trigger changes in the environment that respond to the visitor’s actions. This can make the experience of exploring a virtual museum more engaging, mimicking the exploration of a physical space where one’s movements and decisions influence what one sees and learns.
The integration of these technologies into museum exhibitions—whether virtual replicas of existing museums or entirely new, interactive museum experiences designed for the metaverse—opens numerous possibilities. For traditional museums, it offers a way to extend the reach of their collections, making them accessible to a global audience unable to visit in person. It also allows for the creation of enhanced or augmented experiences that can add value even for those who can visit the physical space, offering layers of interaction and exploration that go beyond what is possible in the real world.
Thus, the deployment of advanced technologies forms the backbone of new museum concepts. These technologies facilitate the creation of virtual spaces unrestrained by physical limitations, allowing for the dynamic updating, expansion, and complete transformation of exhibitions with ease. Such flexibility makes these virtual spaces not only educational but also highly engaging, utilizing the metaverse’s unique capabilities to craft immersive and interactive narratives that captivate visitors in ways traditional museums cannot.
However, the emphasis in developing these virtual museum experiences should not be solely on the technology itself but rather on the conditions and content. The design of museum experiences is inherently complex and demands a diverse set of skills encompassing scientific, engineering, artistic, graphic, pedagogical, psychological, technical, and educational expertise. The goal is to forge a new and impactful medium for cultural transmission that merges real and virtual elements to enhance the sense of presence within the narrative.
Looking ahead, the principal challenge in museum studies and digital technology is to transition the focus of research towards understanding perception from both a content and technical perspective. This shift aims to minimize cognitive errors and optimize user comfort and satisfaction, ultimately enriching the visitor’s experience in digital applications. By addressing these challenges, museums can better harness the potential of digital environments to offer enriching, educational, and enjoyable experiences.
The primary objective is to develop an exhibition space in the metaverse employing multilayer animations and shading technologies to enhance interactions among visitors. This study conducts an empirical investigation into the use of multilayer animations and sophisticated shader technologies within metaverse platforms. The authors of this work achieved dynamic material responses to changes in lighting and complex asset interactions that go beyond basic linear animations. By examining the interplay of these visual elements, the research illuminates their impact on user engagement, immersion, and overall experience.
Furthermore, this work transcends theoretical discussions and extends its findings to practical applications. Cultural and educational institutions can derive preliminary practical insights from the results presented in this study. Lastly, this research highlights the transformative potential of these technological approaches. Beyond mere aesthetics, multilayer animations and advanced shaders facilitate social interaction and enhance educational experiences.
This paper’s structure is outlined as follows: the proposed application is developed in Section 2. Subsequently, Section 3 provides insights into the interface’s usability and additional aspects. Finally, the paper concludes with a discussion and concluding remarks presented in Section 4 and Section 5.

2. Materials and Methods

2.1. Design Methodology for Metaverse Exhibition Spaces

The methodology for creating an exhibition space in the metaverse, as depicted in Figure 1, encompasses a comprehensive approach to designing applications that implement interactive VR platforms in museum environments. While acknowledging that human–computer interaction (HCI) is a crucial component of user experience, this methodology is not limited to that aspect. Instead, it aims to provide a detailed guide that covers the entire design process, from initial conception to final product delivery. In this regard, aspects ranging from understanding client needs to selecting technical tools and engaging in continuous iteration are addressed to ensure a robust final product that is tailored to both user and client expectations within the context of the metaverse.
The process follows a systematic progression through several key stages. It starts with gathering client specifications, which involves making initial contact to understand the client’s needs and vision for the use of the virtual environment. This stage is crucial for aligning the project’s goals with the client’s expectations.
Following this, the collaborative design stage is initiated, where a joint process is established based on the client’s indications. During this phase, mock-ups are created to provide a concrete visual representation, allowing the client to provide feedback before moving on to the next stage. These mock-ups include assets, media, and a metaverse mockup that incorporates branded elements, user interaction possibilities, and avatar creation, ensuring that all design decisions are well documented.
The next step involves a meticulous selection of tools, platforms, and hardware to ensure technical feasibility and efficiency, meeting the requirements of the exhibition metaverse. The chosen platform must support the envisioned user interactions and the immersive experience intended for the virtual space.
An alpha version of the exhibition is then released for review by testers, followed by the production of a beta version for customer approval. This iterative cycle incorporates feedback, addressing modifications and enhancements as needed. The evaluation and delivery stage involves subjecting the beta version to rigorous analysis during a period agreed upon with clients and evaluators, culminating in the delivery of the completed expository metaverse.
The methodology concludes with a period of constant testing and refinement, where the final version is validated in more detail by users within the exhibition environment. Feedback from this phase leads to ongoing adjustments to ensure the metaverse’s uninterrupted adaptation and maximum efficiency.
Each of these stages is essential for developing a user-centric virtual exhibition space that meets the client’s needs and enhances the visitor’s experience in the metaverse. The detailed documentation of design decisions, selection of appropriate tools, and iterative validation ensure that the final product is both technically sound and aligned with the envisioned user experience.

2.2. Client Specifications for Metaverse Exhibition Space

This section delineates the tailored requirements set forth by the Hub of Experimental Museology (HUME) for the virtual exhibition space [40]. HUME, an innovative collaboration hub for companies, researchers, and museums within the extended reality division of the Institute of Design and Manufacturing (IDF) at the Universitat Politècnica de València, is dedicated to fostering the adoption and exploration of cutting-edge technologies like extended reality, Artificial Intelligence, and metaverses. The genesis of this exhibition space is rooted in HUME’s initiative to forge novel digital museology strategies that leverage virtual reality to bolster social engagement.
In the developmental phase (see Figure 1), meticulous attention was given to HUME’s feedback, which was instrumental in identifying and addressing specific needs. These requirements encompass various project facets, including the assets constituting the exhibition space, multimedia integration, the preliminary metaverse mockup, branding elements to ensure client recognition, user interaction dynamics, and the process of avatar creation. These elements are methodically introduced following the client-collaborative design approach (“Collaborative preliminary design with the client”) depicted in Figure 1.

2.3. Design and Animation of the Metaverse Exhibition Space

The virtual exhibition designed for this study simulates an environment reminiscent of a floating space station above Mars’s thin atmosphere. It features a central platform linked to six smaller platforms by transparent ‘glass’ bridges. One of the peripheral platforms highlights a 3D model of the Perseverance rover, which becomes animated when a visitor activates a blue button located in front of it. Adjacent to this is a platform designed as a balcony overlooking a full-size Starship rocket model, which simulates takeoff and landing cycles when a user presses another blue button.
The central interactive exhibit on the main platform includes a miniature of Mars, a robotic arm equipped with a spotlight, and four floor buttons surrounding it, a design cue borrowed from video games to indicate interactive elements. Visitors can activate these buttons by positioning their avatars over them, triggering a combination of pre-recorded and transitional animations. As a result, the articulated arm directs a spotlight to illuminate areas of a Mars map, revealing significant details such as extreme climate temperatures, geological crater history, orographic features, and a speculative visualization of Mars terraformed with blue oceans and green lands. In the case of the crater scene, a particle system was added to enhance the three-dimensional perception of asteroid impacts on the surface of the planet; see Figure 2.
Each interactive animation is accompanied by an audio explanation exclusive to the activating visitor, ensuring a personalized experience. This setup allows multiple users to engage with the exhibit simultaneously without interference from overlapping activities, while still facilitating social communication among visitors through their microphones, avatar expressions, and webcams.
Complementing these features, the exhibition showcases various images sourced from NASA [41] around the main platform, providing educational insights into Mars and humanity’s endeavors to explore and possibly colonize the planet.

2.4. Development of the Metaverse Exhibition Space

The virtual exhibition was developed using the Spatial.io platform [42,43], renowned for its robust device compatibility, supporting a diverse array of hardware including PCs, smartphones, tablets, and various VR systems [44]. Key to this work was the Spatial Creator Toolkit [45], integrated within the Unity game engine, which facilitated the creation of immersive virtual environments. This toolkit is particularly advantageous due to its comprehensive suite of features, such as customizable avatars, dynamic entrance points, and sophisticated event triggers, which are essential for interactive virtual spaces.
The initial build was conducted using version 0.52 of the Toolkit, released on 11 April 2023. This version introduced significant enhancements to interactivity within virtual environments. However, owing to its early development stage, it restricted access to Unity’s full scripting and animation APIs, posing challenges in achieving the desired complexity in interactive features.
To circumvent the API restrictions, the authors of this work employed innovative solutions such as multi-timeline animation blending and the use of complex shaders. These shaders were programmed to dynamically alter the visual properties of objects in response to virtual lighting conditions, thereby simulating realistic environmental interactions at a reduced computational cost.
Interaction within the exhibition was primarily facilitated through the Spatial Interactable component in Unity [46]. While this component enables only basic interactions and imposes adherence to a specific interaction model, limiting scripting flexibility and precluding the direct animation of objects via code, it was pivotal in creating a structured and user-friendly interaction model that maintained the exhibition’s interactivity.
Despite these challenges, the project team devised inventive solutions to circumvent these limitations and realize the envisioned interactive experiences. First, by strategically blending animations across two timelines, the team succeeded in simulating the intricate movements of the robotic arm exhibit. This technique enabled the creation of a more dynamic and lifelike animation, enhancing the visitor’s engagement with the exhibit. Second, the team employed complex shaders programmed to respond dynamically to light sources, locally changing the texture maps applied to objects. This approach allowed objects’ appearances and transparencies to be modified in real time at a low computational cost, based on interaction with the virtual spotlights moved by the first technique.
Utilizing the Shader Graph tool in the Unity Editor, a custom shader was designed specifically for the project. This shader plays a crucial role in the visualization of the Martian surface within the exhibition space.
The shader was crafted to function not merely as a conventional light shader but as an interactive visual mask. This was achieved by configuring the shader to alter its behavior in response to illumination by a spotlight. Rather than simply adding light to the scene, the shader selectively reveals a hidden map overlaid on the original terrain texture of Mars; see Figure 3.
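Conceptually, the node graph implements a per-fragment blend between the two textures, gated by how strongly the spotlight illuminates the fragment. A minimal formulation of this masking logic, written in our own notation rather than the exact Shader Graph layout, is

\[ C_{\text{out}} = \operatorname{lerp}\big(C_{\text{Mars}},\; C_{\text{hidden}},\; \operatorname{saturate}(I_{\text{spot}})\big), \]

where \(C_{\text{Mars}}\) is the base terrain texture, \(C_{\text{hidden}}\) is the overlay map to be revealed, and \(I_{\text{spot}}\) is the spotlight’s attenuated intensity at the fragment. Fragments outside the spotlight cone have \(I_{\text{spot}} \approx 0\) and show the original surface, while fully lit fragments display the hidden map, producing the “flashlight” reveal effect described above.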
For this virtual space, user interaction with various models was required, such as the rocket, the planet Mars, or the robot. For the rocket and the vehicle, simple animation clips were recorded to establish two states: the spacecraft transitions from its resting position through takeoff and landing in a long loop, while the Martian wheeled vehicle transitions from rest to simulated forward motion by spinning its wheels as the Ingenuity helicopter circles around it.
However, for the central exhibitor displaying the model of the planet, a different strategy was applied to create a more interactive and complex experience, since it was designed to show completely different maps depending on the user’s position around it. The challenge was to create fluid animations able to transition from any position of the articulated lamp placed on top of the exhibitor to the position of the user when they move onto one of the marked squares on the ground, without the use of custom scripts; see Figure 4. The only script allowed in that version of the platform to set the animations was the “Trigger Event”, which updates one or more parameters of an animation state machine.
In Unity, state machines facilitate the creation of flow diagrams that specify the conditions under which the system transitions from one animation clip to another [47]. These diagrams are represented by nodes linked with arrows, with each arrow depicting the transition condition. For example, transitioning from the “Unfolded Articulated Light Lamp” state to the folded state requires setting the Boolean parameter “FoldState” to true, triggered by any corresponding event, and similarly for the reverse transition; see Figure 4a,b.
While this setup effectively governs the deployment of the articulated lamp, it does not address the rotation around the vertical axis needed to align the lamp with the user’s position via a floor button. To achieve a seamless and continuous adjustment, a strategy involving layered animations and extended transitions between states was implemented. This approach allows multiple animations to be superimposed on a single object, enabling the simultaneous execution of folding–unfolding and rotational movements; see Figure 4c,d. Additionally, transitions lasting approximately two seconds were configured to move from “Any State” to a specific rotation state, thereby facilitating on-the-fly animation creation—animations that are generated in real-time to smoothly transition from any orientation to the targeted one.
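Although the exhibition itself had to rely on Spatial’s no-code Trigger Event component rather than custom scripts, the state machine logic just described can be made concrete with a short sketch of how it would be driven in an unrestricted Unity project. The parameter and state names follow Figure 4; the component itself is hypothetical:

```csharp
using UnityEngine;

// Hypothetical sketch: scripted equivalent of the setup that drives the
// articulated lamp's Animator. Not used in the actual exhibition, where
// Spatial's no-code Trigger Event set these parameters instead.
public class LampController : MonoBehaviour
{
    [SerializeField] private Animator animator; // holds the fold and rotation layers
    private const int RotationLayer = 1;        // rotation is superimposed on a second layer

    public void Fold()   => animator.SetBool("FoldState", true);
    public void Unfold() => animator.SetBool("FoldState", false);

    // Rotate toward the pressed floor button: a ~2 s cross-fade from whatever
    // pose is current mirrors the "Any State" transitions built in the editor.
    public void RotateTo(string rotationState) =>
        animator.CrossFadeInFixedTime(rotationState, 2f, RotationLayer);
}
```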
Once the animation flow is created, a Trigger Event is added to the object, specifying the animations to be activated by the trigger. As shown in Figure 5a, one can trigger events upon entering (On Enter Event) or exiting (On Exit Event) the trigger volume. With the Unity Event is Synced option enabled, all users in the virtual space see the result of the trigger once any of them activates it. With this option disabled, only the user stepping on the button and triggering the event sees the result, in this case, the activation of animations or audio on each of the exhibitor’s buttons.
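Functionally, such a floor button corresponds to Unity’s standard pattern of a trigger collider paired with serialized events. The sketch below is an illustrative approximation only, since Spatial’s actual implementation (including its user syncing) is not exposed; all names here are ours:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative approximation of a floor button: fires events when an avatar's
// collider enters or leaves the button's trigger volume.
[RequireComponent(typeof(Collider))]
public class FloorButtonTrigger : MonoBehaviour
{
    public UnityEvent onEnter; // e.g., start the lamp animation and narration
    public UnityEvent onExit;  // e.g., reset the exhibit

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) onEnter.Invoke();
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) onExit.Invoke();
    }
}
```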
For instance, the spot marked as “Mars Past” triggers an animation that directs an articulated spotlight to illuminate the surface of the planet from above the user’s position. The light from the spotlight activates the shader’s masking function, which in turn displays a vibrant, full-color map. This map imaginatively portrays Mars as it might have appeared in the past, with expansive blue oceans and lush green continents—an artistic interpretation inspired by terrestrial landscapes provided by NASA.
In terms of audio, sounds have been added to various objects, along with explanations for each of the Mars maps—activated by the Trigger Event—and the ambient sound of the metaverse. All of this is achieved using Unity’s Audio Source component. With this component, an audio track can be added, and options such as volume, stereo, distance, play on start, play in loop, etc., can be configured; see Figure 5b.
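For reference, the same Inspector options can equally be set from code; the following is a minimal sketch of configuring a narration source, with the clip and parameter values being illustrative assumptions rather than the project’s actual settings:

```csharp
using UnityEngine;

// Minimal sketch: configuring an AudioSource for an exhibit narration,
// mirroring the Inspector options listed above (values are illustrative).
public class ExhibitNarration : MonoBehaviour
{
    [SerializeField] private AudioClip narrationClip; // e.g., one Mars-map explanation

    private void Awake()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = narrationClip;
        source.volume = 0.8f;       // overall loudness
        source.panStereo = 0f;      // centered stereo pan
        source.spatialBlend = 1f;   // fully 3D, so distance attenuation applies
        source.maxDistance = 15f;   // fades out beyond this range
        source.playOnAwake = false; // started by the Trigger Event instead
        source.loop = false;        // narration plays once per activation
    }
}
```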

2.5. Research Design

This section outlines the experimental design used to validate the alpha version of the Metaverse Exhibition Space created in this study, conducted at the Institute of Design and Manufacturing of the Universitat Politècnica de València (Spain). The experiments involved groups of two users each, exploring the virtual exhibition space using Meta Quest 2 headsets, ensuring a consistent VR experience across the board. The research consisted of two distinct case studies:
  • Case Study 1: This case concentrated on the concurrent use of PC and VR interfaces. The objective was to gather and compare the insights from two users utilizing these differing interfaces. Each user started with a brief tutorial on how to operate the VR headset provided by the assistants. Following this introductory session, users were allowed to freely explore the exhibition space at their leisure, engaging with the exhibits and each other.
  • Case Study 2: This case aimed to evaluate the usability and the sensation of presence within the application, using a uniform interface for all users. The process began with an explanation of the virtual space they were about to explore. Once equipped with VR headsets, users were instructed on how to navigate using the controllers, including optimal hand and finger positioning as needed. They were then given complete freedom to explore the space without any time restrictions. During their exploration, if users seemed unsure of where to go next, they could be directed to specific areas of interest. Notably, assistants could join the Exhibition Space from a separate computer while a user was inside, enhancing interaction by demonstrating the environment’s capability to simultaneously accommodate multiple users, each interacting with elements of their choosing.
For both case studies, there were no time constraints imposed on the users, allowing for a fully immersive experience. External observers monitored and documented the interactions among users during the experiments. The sessions were concluded when any user indicated a desire to leave the virtual space.

2.6. Data Analysis

Consistent with previous studies [48,49,50,51,52], this research involved conducting several usability tests and user interviews to validate the proposed methodology and to demonstrate the advantages of the developed application.
For the users involved in Case Study 2, two standard questionnaires were administered: the Igroup Presence Questionnaire (IPQ) [53,54,55] and the System Usability Scale (SUS) [56]. The IPQ was selected for its effectiveness in evaluating the sense of presence within virtual environments, as well as various other factors including realism and the quality of the interface and devices used. Conversely, the SUS questionnaire was primarily used to evaluate the usability of the interface developed.
The IPQ questionnaire consists of 14 items designed to measure three distinct subscales: spatial presence, which refers to the sensation of physically being in the virtual environment (VE); involvement, which gauges the level of attention and engagement with the VE; and experienced realism, which assesses the perceived realism within the VE. Additionally, the IPQ includes a general item that captures the overall ‘sense of being there’, showing significant correlations with all three subscales, particularly with spatial presence. The items or questions featured in the IPQ are outlined in Table 1 and employ a seven-point Likert-type scale (ranging from 1 to 7). The scores from each item are averaged to calculate a score for each dimension. This article evaluates the variability for each dimension using standard deviation.
Meanwhile, the SUS questionnaire, whose ten items are outlined in Table 2, was also employed. Each item is scored on a scale from 1 to 5. For items with a positive statement, the score is the scale position minus 1. For negatively worded items, the score is 5 minus the scale position. After scoring individual items, the scores are summed and then multiplied by 2.5 to convert the original scores of 0–40 to 0–100. Furthermore, the mean and standard deviation are calculated to assess the responses of all users for each question.
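To make the conversion concrete, the SUS computation described above reduces to a few lines; this sketch assumes the conventional SUS layout in which odd-numbered items are positively worded and even-numbered items negatively worded:

```csharp
// Sketch of standard SUS scoring; responses[i] is the 1–5 answer to item i+1.
static double SusScore(int[] responses)
{
    double sum = 0;
    for (int i = 0; i < 10; i++)
        sum += (i % 2 == 0)
            ? responses[i] - 1   // positively worded items: scale position minus 1
            : 5 - responses[i];  // negatively worded items: 5 minus scale position
    return sum * 2.5;            // rescale the 0–40 total to 0–100
}
```

For example, a respondent answering 4 to every item would score (5 × 3 + 5 × 1) × 2.5 = 50.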
Furthermore, to gather more detailed insights into their experience, users were interviewed by several staff members.

3. Results

3.1. Case Study 1

In this section, the alpha version of the developed Metaverse Exhibition Space is shown. Two users participated in a video demonstration (https://media.upv.es/player/?id=883b46d0-e130-11ee-9972-a18d14c897dd, accessed on 24 April 2024) to showcase the application: one utilized the Meta Quest 2 virtual reality headset (Figure 6a), and the other operated a personal computer (Figure 6b), each in separate spaces. Both users were previously acquainted with the virtual environment.
Upon entering the virtual space, users encountered signage that provided a brief overview of the space’s purpose and guidance on how to interact with the site, including a critical clarification on engaging with floor buttons (Figure 7a), which was an adjustment made following prior user feedback.
Figure 8a shows the two users on the main platform, where the planet Mars is located, along with the various buttons that, when pressed, activate the flashlight-robot arm, which in this case ‘reveals’ the temperatures the planet can reach. Other shader-driven views show, for example, Mars’s craters or its orography. Likewise, the same figure shows that each user sees only the interactions he or she makes with the elements in the space. When a button of the flashlight-robot arm is pressed by a user, it is activated only for that user, even if another user is also standing on a button. This makes it easy for all users to interact with this exhibit at the same time.
The virtual exhibition’s use of multilayer animations and advanced shader technologies significantly enhanced user engagement and interactivity. These technologies were particularly impactful when users interacted with the animated 3D model of the Perseverance rover and the Starship rocket’s takeoff and landing simulations, which were highlighted as key aspects enriching the user experience (Figure 8b,c).
Distinct user behaviors were observed: the VR headset user spent more time engaging with the textual information (time stamp 0:00:14 in the video demonstration), while the PC user demonstrated a broader range of movements, such as running and jumping (time stamp 0:02:36 in the video demonstration), facilitated by the Spatial.io platform. The VR user’s experience was more ‘conventional’, focused on walking and exploring.
Interaction between users was seamless, allowing for easy communication, even when they were not co-located or at the same exhibit (time stamp 0:02:32 in the video demonstration). Users could visually encounter and converse within the virtual space, enhancing the social dimension of the experience.
Despite these positive aspects, some issues were noted with VR interactions, such as unintentional menu activation (Figure 9a) and challenges with navigating too close to virtual objects (Figure 9b), which occasionally disrupted the immersive experience.

3.2. Case Study 2

A total of 22 users were carefully chosen for this analysis. The demographic breakdown is as follows: 50% of the users identified as female, and an equal 50% as male. The study aimed to encompass a broad age range, resulting in 9.09% under 18 years old, 22.73% between 18 and 25 years old, 22.73% between 25 and 40 years old, 40.91% between 40 and 55 years old, and 4.54% between 55 and 70 years old. Regarding educational background, 50% had basic studies, 27.27% had bachelor’s degrees, and 22.73% had post-graduate qualifications.
Additionally, 22.73% of the users reported having experience in metaverse worlds. Concerning gaming habits, 31.82% indicated regular use of various videogame devices (e.g., gamepads, joysticks), while 13.64% reported occasional use. Regarding travel planning, 59.1% stated they typically use the Internet for trip planning. Lastly, 27.27% of the users mentioned a tendency to visit museums during their travels, while 54.55% reported occasional museum visits when on the road.
The outcomes of the IPQ are depicted in Figure 10. Specifically, Figure 10a exhibits the mean and standard deviation for each IPQ question, while Figure 10b displays the mean and standard deviation, expressed as percentages, for each IPQ subscale. Noteworthy findings include a General Presence score of 87.77% with a standard deviation of 1.21%, demonstrating that users experienced a strong sense of immersion in the Virtual Environment (VE). The Spatial Presence score reached 62.34% with a standard deviation of 2.09%, reflecting that users felt physically present in the VE. An Involvement score of 58.77% with a standard deviation of 1.36% suggests that a majority of users were actively engaged with all aspects of the VE. The Experienced Realism score was 55.19% with a standard deviation of 1.79%, indicating that over half of the users consistently felt immersed in the VE despite the absence of realistic objects. These findings support the main objective of the proposed approach, which aims to create a natural and user-friendly VE compatible with most current commercial VR headsets. It is important to note that achieving enhanced realism would require greater computational resources and specialized hardware, such as advanced graphics cards.
Regarding the SUS questionnaire, the overall perceived usability scored 83.52 out of 100 (ranging from 70 to 100 with a standard deviation of 8.23), indicating a notably high level of usability for the proposed application. Additionally, Figure 11 visually represents the results for each item of the SUS questionnaire, which is detailed in Table 2. Importantly, a majority of users expressed a willingness to use this application frequently, highlighting its user-friendly nature. Users consistently reported that the application functionalities were well integrated, ensuring a seamless and consistent experience. Furthermore, users expressed confidence in their interactions with the application.

4. Discussion

In an initial public showcase, the alpha version of the developed Metaverse Exhibition Space was exhibited at the III CM Málaga (Culture & Museums International Tech Forum, 19–20 June 2023, Málaga, Spain) (see Figure 12), albeit in an earlier version than the one presented in the current study. The demonstration involved using a VR headset and a laptop, allowing attendees to observe the virtual space in real time as users explored it.
Notably, one of the authors of this study, represented by an avatar and operating from the laptop, accompanied the user in the space, fostering interaction and preventing the user from feeling isolated. This setup aimed to introduce a novel exhibition format within a metaverse, exploring the potential for visitor interaction in a virtual museum setting centered around Mars.
The space was presented to the public as a new form of museum exhibition within a metaverse, through which the potential for social interaction among visitors is investigated. Specifically, attendees were introduced to the virtual exhibition built around the planet Mars, which they could view with a VR headset, along with the different elements they could interact with.
A concise questionnaire, featuring four quantitative and two qualitative questions, was administered to attendees post-experience; see Table 3. During the showcase, attendees were given complete freedom to explore the space without any time restrictions. Seven random users aged between 25 and 65 years completed this survey, which aimed to gauge their navigation and interaction within the Spatial.io metaverse, their engagement with overlay maps and interactive 3D Mars maps, their views on the animation quality, and their social experience within the virtual space.
The quantitative responses were assessed using a scale of 1 (“Very unsatisfactory”) to 5 (“Very satisfactory”). The mean score obtained was 3.6, with a standard deviation of 1.3, indicating a promising yet improvable application. Notably, among all users, only two users rated any of the questions with a score of 1 out of 5. Upon further discussion with them after the test, it became evident that they faced difficulties interacting with the buttons in the immersive space.
Furthermore, the results revealed that four out of the seven users were proficient in activating the interactive 3D maps without external assistance. In contrast, the remaining three users expressed their inability to do so. These findings underscore the importance of addressing usability issues, particularly concerning navigation within the immersive environment, to enhance user experience and application effectiveness.
The qualitative feedback also highlighted areas for enhancement, such as UI improvements, graphic quality, and clearer interaction cues. Some of the comments were as follows:
  • “I found it very interesting, easy, and quick to learn how to use it. Congratulations!”;
  • “It is a very interactive experience with a lot of potential.”;
  • “Possibility to manage the vehicle. Some improvement in UI (change the position of the button to activate the vehicles).”;
  • “Further research.”;
  • “Improve the graphics.”;
  • “The audio helps the experience!”;
  • “It is not very clear which are the buttons to trigger interactions. It would be nice to be able to go down from the platforms to the ground. The vertigo sensation is interesting”.
This valuable input informed subsequent refinements, including clearer signage to aid interaction with the virtual environment. Specifically, a distinct signal was integrated at the virtual space entrance as shown in Figure 7a. This signage serves to guide visitors in identifying interactive buttons and understanding that it is necessary to stand on them to activate them. This modification was prompted by feedback from III CM Málaga.
Figure 13 provides a visual comparison between the interaction buttons featured in the alpha version showcased in the III CM Málaga (see Figure 13a) and the subsequent modification introduced in the last iteration to enhance user interaction (see Figure 13b). Notably, all buttons have been redesigned and now require stepping on them, or standing on them, to initiate an action. This transformation was implemented with the goal of enhancing user comprehension and simplifying interaction within the virtual environment. These refinements are aimed at improving overall user experience and promoting seamless navigation through the immersive space.
Future testing will expand user diversity and combine VR headsets with other devices to ensure a comprehensive and interactive visitor experience. The Spatial.io platform’s capabilities for audiovisual interaction and dynamic movement will continue to play a crucial role in these developments.
Qualitative data highlighted that visitors engaged more with interactive elements than static displays, indicating the value of dynamic content in maintaining visitor interest. Feedback post-visit was overwhelmingly positive, with many users expressing a desire to return to the virtual exhibition, suggesting a successful engagement and educational outcome.
It is important to recognize that approaches such as the one presented in this work can bring significant benefits to the museum industry, and they also raise various ethical considerations and inherent limitations related to the use of VR. Technological dependence and the excessive use of devices can create a bubble of informational relationships, which calls for a critical reflection on the potential risks embedded within this densely woven information network [57].
Moreover, privacy and data protection are critical ethical concerns when implementing VR in museums. The collection and utilization of personal data can compromise user privacy and pose significant risks to information security [58]. Additionally, issues of accessibility and inclusion are paramount; the technology must be inclusive and accessible to all individuals, including those with disabilities and those who do not have access to high-end devices [59].
Furthermore, the authenticity and integrity of cultural heritage must be carefully considered. It is essential that VR technology respects the authenticity and integrity of cultural artifacts and avoids any distortion or manipulation of historical information [57].
Lastly, the sustainability and environmental impact of VR technologies should not be overlooked. These technologies must be designed and employed in a manner that minimizes environmental degradation and promotes long-term sustainability [58].

5. Conclusions

This research anticipates leveraging the advancements provided by the recently released Spatial Creator Toolkit v1.8. With its expanded scripting support and the introduction of numerous new features, this updated version promises to substantially enhance the capabilities for creating more sophisticated and engaging virtual museum experiences. This study advocates for the continuous exploration and adoption of emerging technologies to enrich the interactivity, realism, and educational value of metaverse-based exhibitions.
The empirical investigation conducted has underscored the potential of multilayer animations and advanced shader technologies within metaverse platforms to significantly augment user engagement, immersion, and the overall experience. This research contributes to the burgeoning field of digital museology, demonstrating how virtual environments can transcend traditional museum experiences by offering interactive and immersive narratives that draw visitors into the subject matter in unprecedented ways.
Future research will continue to explore the broad spectrum of technological opportunities presented by the metaverse, aiming to enhance the educational and interactive aspects of museum exhibitions. The findings from this study lay a foundation for further exploration into how virtual reality can be harnessed to create dynamic, engaging, and informative museum experiences that reach a global audience.
In addition, this study has not only achieved its aim of creating an immersive and educational virtual exhibition space but also illuminated the path for future research and development in digital museology within the metaverse. The insights garnered from this work are anticipated to make a valuable contribution to the field, offering a new perspective on the intersection of technology, culture, and education.
It is crucial to underscore that the results obtained in this work by the authors are preliminary. Therefore, the main goal of their future work is to deploy the beta version of the application within a museum setting. This approach will be designed to gather comprehensive data from a diverse array of visitors.
The acquisition of these data is intended to validate and corroborate the initial findings obtained in this work. Moreover, the insights derived from this data collection may facilitate the exploration of new avenues for research and study, potentially leading to the establishment of various novel research trajectories based on the outcomes observed.
This research aimed to initiate an exploration of how VR in museums might enhance understanding and retention of knowledge vis-à-vis traditional methods. Hence, the authors of this work acknowledge the need for extensive research to fully capture the diverse impacts of VR. This would include assessing its potential to convey complex scientific theories and historical narratives more effectively through immersive, experiential learning experiences that go beyond what is possible with conventional books and static exhibits. Furthermore, the analysis conducted in this work about the changes in museum attendance following the introduction of VR exhibits offered preliminary insights into consumer behavior, highlighting the importance of further studies to guide strategic planning. These insights are vital for leveraging digital innovations to improve educational outcomes and visitor engagement, ultimately supporting the museum’s goals of maintaining relevance and competitiveness in a rapidly evolving cultural sector.
The integration of VR immersive technologies in museum settings, as discussed in this article, offers transformative potential for enhancing visitor engagement and educational outcomes. However, the deployment of these technologies also brings forth significant challenges. Technical issues, such as the robustness of the VR systems, the need for regular updates, and compatibility with existing museum infrastructures, must be meticulously addressed to ensure smooth operation. Moreover, the financial implications of adopting such advanced technologies—including initial investment and ongoing maintenance costs—pose substantial challenges, particularly for institutions with limited budgets.
Another critical aspect is the training of staff to manage and operate VR setups effectively, as well as educating visitors on how to use these systems efficiently; both are additional hurdles that necessitate careful planning and execution.
The preliminary findings of this study emphasize that, by addressing the aforementioned challenges and concentrating on the quality of the visitor experience, museums can effectively leverage immersive technologies. This strategic approach enables museums to remain relevant and competitive in an increasingly digital world, thereby ensuring their continued cultural and educational impact. This focus not only supports technological integration but also enhances overall engagement and learning outcomes for visitors.

Author Contributions

Conceptualization, A.M.; Methodology, A.A.; Software, L.F.; Validation, A.M.-T.; Formal analysis, A.A.; Investigation, A.A.; Data curation, A.M.-T.; Writing—original draft, A.A. and L.F.; Writing—review & editing, J.E.S. and L.G.; Supervision, A.M. and A.M.-T.; Funding acquisition, A.M. and L.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant INVEST/2022/324).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Spadoni, E.; Carulli, M.; Ferrise, F.; Bordegoni, M. Impact of Multisensory XR Technologies on Museum Exhibition Visits. In Proceedings of the International Conference on Human-Computer Interaction, Copenhagen, Denmark, 23–28 July 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 120–132.
2. Silva, M.; Teixeira, L. eXtended Reality (XR) experiences in museums for cultural heritage: A systematic review. In Proceedings of the International Conference on Intelligent Technologies for Interactive Entertainment, Virtual Event, 3–4 December 2021; Springer: Berlin/Heidelberg, Germany, 2021; pp. 58–79.
3. Carrozzino, M.; Bergamasco, M. Beyond virtual museums: Experiencing immersive virtual reality in real museums. J. Cult. Herit. 2010, 11, 452–458.
4. Lee, H.; Jung, T.; tom Dieck, M.T.; Chung, N. Experiencing immersive virtual reality in museums. Inf. Manag. 2020, 57, 103229.
5. Cecotti, H. Cultural Heritage in Fully Immersive Virtual Reality. Virtual Worlds 2022, 1, 82–102.
6. Banfi, F.; Brumana, R.; Stanga, C. Extended reality and informative models for the architectural heritage: From scan-to-BIM process to virtual and augmented reality. Virtual Archaeol. Rev. 2019, 10, 14–30.
7. Margetis, G.; Apostolakis, K.C.; Ntoa, S.; Papagiannakis, G.; Stephanidis, C. X-Reality Museums: Unifying the Virtual and Real World towards Realistic Virtual Museums. Appl. Sci. 2021, 11, 338.
8. Lee, H.-K.; Park, S.; Lee, Y. A proposal of virtual museum metaverse content for the MZ generation. Digit. Creat. 2022, 33, 79–95.
9. Zhao, W.; Su, L.; Dou, F. Designing virtual reality based 3D modeling and interaction technologies for museums. Heliyon 2023, 9, e16486.
10. Yancong, Z.; Guandong, T.; Liu, W.; Qi, R. How Post 90's Gesture Interact with Automobile Skylight. Int. J. Hum.–Comput. Interact. 2022, 38, 395–405.
11. Zidianakis, E.; Partarakis, N.; Ntoa, S.; Dimopoulos, A.; Kopidaki, S.; Ntagianta, A.; Ntafotis, E.; Xhako, A.; Pervolarakis, Z.; Kontaki, E.; et al. The Invisible Museum: A User-Centric Platform for Creating Virtual 3D Exhibitions with VR Support. Electronics 2021, 10, 363.
12. Rizvic, S.; Boskovic, D.; Mijatovic, B. Advanced interactive digital storytelling in digital heritage applications. Digit. Appl. Archaeol. Cult. Herit. 2024, 33, e00334.
13. Gong, Q.; Zou, N.; Yang, W.; Zheng, Q.; Chen, P. User experience model and design strategies for virtual reality-based cultural heritage exhibition. Virtual Real. 2024, 28, 69.
14. Baradaran Rahimi, F.; Boyd, J.E.; Eiserman, J.R.; Levy, R.M.; Kim, B. Museum beyond physical walls: An exploration of virtual reality-enhanced experience in an exhibition-like space. Virtual Real. 2022, 26, 1471–1488.
15. Li, G.; Lin, S.; Tian, Y. Immersive Museums in the Digital Age: Exploring the Impact of Virtual Reality on Visitor Satisfaction and Loyalty. J. Knowl. Econ. 2024, 15, 1–34, in press.
16. Vera, L.; Coma, I.; Pérez, M.; Riera, J.V.; Martínez, B.; Gimeno, J. The Mediterranean forest in a science museum: Engaging children through drawings that come to life in a virtual world. Multimed. Tools Appl. 2024, 83, 1–22, in press.
17. Meileni, H.; Hapsari, Y.; Devani, F.T.; Nugraha, A.C.; Jannah, M.; Firdaus, R. Exploration of Virtual Reality Technology Implementation in the Mobile Application of Balaputra Dewa Museum. In Proceedings of the 7th FIRST 2023 International Conference on Global Innovations (FIRST-ESCSI 2023), Palembang, Indonesia, 30–31 October 2023; Atlantis Press: Malang, Indonesia, 2024; pp. 352–359.
18. Schlachhoff, E.; Dengler, N.; Holland, L.V.; Stotko, P.; de Heuvel, J.; Klein, R.; Bennewitz, M. RHINO-VR Experience: Teaching Mobile Robotics Concepts in an Interactive Museum Exhibit. arXiv 2024, arXiv:cs.RO/2403.15151.
19. Shehade, M.; Stylianou-Lambert, T. Virtual Reality in Museums: Exploring the Experiences of Museum Professionals. Appl. Sci. 2020, 10, 4031.
20. Gulen, S.; Donmez, I.; Idin, S. STEM Education in Metaverse Environment: Challenges and Opportunities. J. Steam Educ. 2022, 5, 100–103.
21. Yokutkhon, R. The use of virtual reality and augmented reality in education. J. New Century Innov. 2024, 48, 157–161.
22. Jiang, T.; Gan, X.; Liang, Z.; Luo, G. AIDM: Artificial intelligent for digital museum autonomous system with mixed reality and software-driven data collection and analysis. Autom. Softw. Eng. 2022, 29, 22.
23. Kanematsu, H.; Kobayashi, T.; Barry, D.M.; Fukumura, Y.; Dharmawansa, A.; Ogawa, N. Virtual STEM Class for Nuclear Safety Education in Metaverse. Procedia Comput. Sci. 2014, 35, 1255–1261.
24. Solanes, J.E.; Montava-Jordà, S.; Golf-Laville, E.; Colomer-Romero, V.; Gracia, L.; Muñoz, A. Enhancing STEM Education through Interactive Metaverses: A Case Study and Methodological Framework. Appl. Sci. 2023, 13, 10785.
25. Mittal, S. Metaverse Business and Economic Models: Opportunities and Challenges for Virtual Economies, Digital Asset Markets, and Advertising Strategies. BSc Thesis, Delhi Technological University, Delhi, India, May 2023. Available online: http://14.139.251.106:8080/jspui/bitstream/repository/20292/1/Shivangi%20Mittal%20DMBA.pdf (accessed on 24 April 2024).
26. Trunfio, M.; Jung, T.; Campana, S. Mixed reality experiences in museums: Exploring the impact of functional elements of the devices on visitors' immersive experiences and post-experience behaviours. Inf. Manag. 2022, 59, 103698.
27. Pagano, A.; Palombini, A.; Bozzelli, G.; De Nino, M.; Cerato, I.; Ricciardi, S. ArkaeVision VR Game: User Experience Research between Real and Virtual Paestum. Appl. Sci. 2020, 10, 3182.
28. Hennig-Thurau, T.; Aliman, D.N.; Herting, A.M.; Cziehso, G.P.; Linder, M.; Kübler, R.V. Social interactions in the metaverse: Framework, initial evidence, and research roadmap. J. Acad. Mark. Sci. 2023, 51, 889–913.
29. Bailenson, J. Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do; W. W. Norton & Company: New York, NY, USA, 2018.
30. Shin, D. Empathy and embodied experience in virtual environment: To what extent can virtual reality stimulate empathy and embodied experience? Comput. Hum. Behav. 2018, 78, 64–73.
31. Al-Jundi, H.A.; Tanbour, E.Y. A framework for fidelity evaluation of immersive virtual reality systems. Virtual Real. 2022, 26, 1103–1122.
32. Longo, M.C.; Faraci, R. Next-Generation Museum: A Metaverse Journey into the Culture. Sinergie Ital. J. Manag. 2023, 41, 147–176.
33. Hutson, J.; Hutson, P. Museums and the Metaverse: Emerging Technologies to Promote Inclusivity and Engagement. In Application of Modern Trends in Museum; IntechOpen: London, UK, 2023; pp. 1–20.
34. Hawkey, R. Learning with Digital Technologies in Museums, Science Centres and Galleries; Futurelab Series; Futurelab: Bristol, UK, 2004; pp. 1–44.
35. Krantz, A.; Downey, S. The significant loss of museum educators in 2020: A data story. J. Mus. Educ. 2021, 46, 417–429.
36. Dohoney, R. The Chicago sound show at the Smart Museum of Art, the University of Chicago. Sound Stud. 2020, 6, 271–274.
37. Malik, U.S.; Tissen, L.; Vermeeren, A. 3D reproductions of cultural heritage artifacts: Evaluation of significance and experience. Stud. Digit. Herit. 2021, 5, 1–29.
38. Gil-Sang, Y. Transformation of the museum experience through virtual reality (METAVERSE). Public Hist. Mus. 2022, 5, 7.
39. Unity. Shaders. Unity Documentation. Available online: https://docs.unity3d.com/Manual/Shaders.html (accessed on 11 March 2024).
40. Hub de Museología Experimental. Available online: https://hume.institutoidf.com/ (accessed on 11 March 2024).
41. NASA. NASA Images. Available online: https://www.nasa.gov/images/ (accessed on 28 April 2023).
42. Spatial.io. Available online: https://www.spatial.io/ (accessed on 11 March 2024).
43. Zawish, M.; Dharejo, F.A.; Khowaja, S.A.; Raza, S.; Davy, S.; Dev, K.; Bellavista, P. AI and 6G Into the Metaverse: Fundamentals, Challenges and Future Research Trends. IEEE Open J. Commun. Soc. 2024, 5, 730–778.
44. Martí-Testón, A.; Muñoz, A.; Gracia, L.; Solanes, J.E. Using WebXR Metaverse Platforms to Create Touristic Services and Cultural Promotion. Appl. Sci. 2023, 13, 8544.
45. Spatial Creator Toolkit. Available online: https://docs.spatial.io/ (accessed on 11 March 2024).
46. Unity. Game Development Platform. Available online: https://unity.com/ (accessed on 11 March 2024).
47. Unity. Animation System Overview. Unity Documentation. Available online: https://docs.unity3d.com/Manual/AnimationOverview.html (accessed on 11 March 2024).
48. Blattgerste, J.; Strenge, B.; Renner, P.; Pfeiffer, T.; Essig, K. Comparing Conventional and Augmented Reality Instructions for Manual Assembly Tasks. In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, Island of Rhodes, Greece, 21–23 June 2017; ACM: New York, NY, USA, 2017; pp. 75–82.
49. Attig, C.; Wessel, D.; Franke, T. Assessing Personality Differences in Human-Technology Interaction: An Overview of Key Self-report Scales to Predict Successful Interaction. In HCI International 2017—Posters' Extended Abstracts; Stephanidis, C., Ed.; Springer: Cham, Switzerland, 2017; pp. 19–29.
50. Franke, T.; Attig, C.; Wessel, D. A Personal Resource for Technology Interaction: Development and Validation of the Affinity for Technology Interaction (ATI) Scale. Int. J. Hum.-Comput. Interact. 2018, 35, 456–467.
51. Muñoz, A.; Martí, A.; Mahiques, X.; Gracia, L.; Solanes, J.E.; Tornero, J. Camera 3D positioning mixed reality-based interface to improve worker safety, ergonomics and productivity. CIRP J. Manuf. Sci. Technol. 2020, 28, 24–37.
52. Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Appl. Sci. 2022, 12, 6071.
53. Schubert, T.; Friedmann, F.; Regenbrecht, H. The Experience of Presence: Factor Analytic Insights. Presence Teleoperators Virtual Environ. 2001, 10, 266–281.
54. Regenbrecht, H.; Schubert, T. Real and Illusory Interactions Enhance Presence in Virtual Environments. Presence Teleoperators Virtual Environ. 2002, 11, 425–434.
55. Schubert, T. The sense of presence in virtual environments: A three-component scale measuring spatial presence, involvement, and realness. Z. Medienpsychol. 2003, 15, 69–71.
56. Brooke, J. SUS: A Quick and Dirty Usability Scale. In Usability Evaluation in Industry; CRC Press: Boca Raton, FL, USA, 1996; ISBN 9780748404605.
57. Martínez Ruiz, X. Educación virtual: Consideraciones éticas y semánticas desde la infoesfera. Innov. Educ. 2015, 15, 9–14.
58. EVE Museos Innovación. Museos y su Futuro Tecnológico. Available online: https://evemuseografia.com/2023/02/07/museos-y-su-futuro-tecnologico/ (accessed on 14 April 2024).
59. Pando, F. La Realidad Virtual como Herramienta para el Aprendizaje, la Salud y el Trabajo del Futuro. Available online: https://www.itmastersmag.com/noticias-analisis/que-es-y-hacia-donde-avanza-la-realidad-virtual/ (accessed on 14 April 2024).
Figure 1. Methodology proposed for designing metaverse exhibition spaces.
Figure 2. Animation of asteroid impacts on the Mars surface.
Figure 3. Custom materials with interactive light reactive shaders.
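Since Spatial.io barred external scripts at the time of the study, the light-reactive behaviour in Figure 3 was obtained by editing the shader code of the materials themselves. Purely as an illustrative sketch (not the authors' implementation), the same effect can also be expressed on the Unity side by driving a shader property from a scene light; the property name _EmissionStrength and the component wiring below are assumptions.

```csharp
using UnityEngine;

// Minimal sketch: drives a material's emission from a scene light's
// intensity, approximating the light-reactive behaviour that the
// authors embedded directly in shader code. The shader property
// "_EmissionStrength" is hypothetical and must exist in the shader.
public class LightReactiveMaterial : MonoBehaviour
{
    [SerializeField] private Light sceneLight;        // light the material reacts to
    [SerializeField] private Renderer targetRenderer; // renderer using the custom material

    private MaterialPropertyBlock block;

    private void Awake()
    {
        block = new MaterialPropertyBlock();
    }

    private void Update()
    {
        // Map the current light intensity onto the shader's emission parameter
        // without instantiating a new material per frame.
        targetRenderer.GetPropertyBlock(block);
        block.SetFloat("_EmissionStrength", sceneLight.intensity);
        targetRenderer.SetPropertyBlock(block);
    }
}
```

Using a MaterialPropertyBlock rather than writing to `renderer.material` avoids duplicating the material at runtime, which matters in a shared metaverse scene with many reactive surfaces.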
Figure 4. Articulated lamp animation example: (a) Fold–unfold animation of the articulated lamp. (b) Base layer holding the state machine for folding and unfolding the articulated lamp. (c) Rotation animations around the Y axis. (d) State machine showing parameters on the left and, on the right, the rotation layer that overlaps the base layer.
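The multilayer setup in Figure 4 (a base fold–unfold state machine plus an overlapping rotation layer) is driven through Unity's standard Animator API. The following minimal sketch shows the general shape; the parameter name "Unfold" and the layer name "Rotation" are assumptions chosen to match the caption, not names taken from the project.

```csharp
using UnityEngine;

// Minimal sketch of driving the two-layer Animator from Figure 4:
// the base layer folds/unfolds the lamp while an overlay layer adds
// rotation around the Y axis, so both animations can run at once.
[RequireComponent(typeof(Animator))]
public class ArticulatedLampController : MonoBehaviour
{
    private Animator animator;
    private int rotationLayer;

    private void Awake()
    {
        animator = GetComponent<Animator>();
        // Look up the overlay layer by name (hypothetical: "Rotation").
        rotationLayer = animator.GetLayerIndex("Rotation");
    }

    // Called, e.g., from an interactive button to fold or unfold the lamp.
    public void SetUnfolded(bool unfolded)
    {
        animator.SetBool("Unfold", unfolded); // parameter name is an assumption
    }

    // Blends the rotation layer in or out on top of the base layer,
    // enabling composite motion beyond a single linear animation.
    public void SetRotationActive(bool active)
    {
        animator.SetLayerWeight(rotationLayer, active ? 1f : 0f);
    }
}
```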
Figure 5. Developed GameObjects: (a) Trigger Event component. (b) Audio Source component.
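The Trigger Event and Audio Source components in Figure 5, and the trigger-based activation shown later in Figure 13b, correspond to standard Unity collision callbacks. A minimal sketch follows, assuming a "Player" tag on avatar colliders and a "Play" animation trigger; both names are hypothetical, and Spatial.io wires avatar collisions through its own toolkit.

```csharp
using UnityEngine;

// Minimal sketch of a trigger zone: when an avatar enters the trigger
// collider, an AudioSource plays and an asset animation starts.
// Requires a Collider marked "Is Trigger" on the same GameObject.
[RequireComponent(typeof(AudioSource))]
public class TriggerEventZone : MonoBehaviour
{
    [SerializeField] private Animator targetAnimator; // asset to animate on entry

    private AudioSource audioSource;

    private void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    private void OnTriggerEnter(Collider other)
    {
        // Only react to avatars (tag name is an assumption).
        if (!other.CompareTag("Player")) return;

        audioSource.Play(); // narration or sound effect for this stop

        if (targetAnimator != null)
            targetAnimator.SetTrigger("Play"); // start the asset animation
    }
}
```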
Figure 6. Users with different devices: (a) User with VR headset. (b) User with computer.
Figure 7. Signs inside the immersive space: (a) Explanation of the buttons. (b) Images about Mars.
Figure 8. Users in the different stops of the immersive space: (a) Main platform. (b) Starship stop. (c) Perseverance stop.
Figure 9. Problems or drawbacks for the user experience: (a) Open menu from VR headset. (b) Suddenly having Mars too close.
Figure 10. Results of the Igroup Presence Questionnaire: (a) Data (red dots), mean (blue bars), and standard deviation (vertical black lines) per question. (b) Subscale results (mean and standard deviation).
Figure 11. Results of the SUS questionnaire: data (red dots), mean (blue bars), and standard deviation (vertical black lines) per question.
Figure 12. Alpha version of the Metaverse Exhibition Space exhibited at the III CM Málaga (Culture & Museums International Tech Forum, 19–20 June 2023, Málaga, Spain).
Figure 13. Animation activation buttons: (a) By interactive button. (b) By trigger event.
Table 1. Items of the IPQ questionnaire [53,54,55].

Number | Description of the Item/Question
IPQ1 | In the computer generated world I had a sense of "being there"
IPQ2 | Somehow I felt that the virtual world surrounded me
IPQ3 | I felt like I was just perceiving pictures
IPQ4 | I did not feel present in the virtual space
IPQ5 | I had a sense of acting in the virtual space, rather than operating something from outside
IPQ6 | I felt present in the virtual space
IPQ7 | How aware were you of the real world surrounding while navigating in the virtual world? (i.e., sounds, room temperature, other people, etc.)
IPQ8 | I was not aware of my real environment
IPQ9 | I still paid attention to the real environment
IPQ10 | I was completely captivated by the virtual world
IPQ11 | How real did the virtual environment seem to you?
IPQ12 | How much did your experience in the virtual environment seem consistent with your real world experience?
IPQ13 | How real did the virtual world seem to you?
IPQ14 | The virtual world seemed more realistic than the real world
Table 2. Items of the SUS questionnaire [56].

Number | Description of the Item/Question
SUS1 | I think that I would like to use this system frequently
SUS2 | I found the system unnecessarily complex
SUS3 | I thought the system was easy to use
SUS4 | I think that I would need the support of a technical person to be able to use this system
SUS5 | I found the various functions in this system were well integrated
SUS6 | I thought there was too much inconsistency in this system
SUS7 | I would imagine that most people would learn to use this system very quickly
SUS8 | I found the system very cumbersome to use
SUS9 | I felt very confident using the system
SUS10 | I needed to learn a lot of things before I could get going with this system
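For reference, SUS responses are conventionally converted to a single score from 0 to 100: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 [56]. A minimal sketch of that arithmetic:

```csharp
using System;

// Computes the standard SUS score (0-100) from ten responses on a
// 1-5 scale, following Brooke's scoring rule [56].
public static class SusScorer
{
    public static double Score(int[] responses)
    {
        if (responses == null || responses.Length != 10)
            throw new ArgumentException("SUS requires exactly 10 responses.");

        int sum = 0;
        for (int i = 0; i < 10; i++)
        {
            bool oddItem = (i % 2) == 0; // items 1, 3, 5, 7, 9
            sum += oddItem ? responses[i] - 1   // positively worded items
                           : 5 - responses[i];  // negatively worded items
        }
        return sum * 2.5; // scale the 0-40 raw sum to 0-100
    }
}
```

For example, a respondent answering 4, 2, 4, 1, 5, 2, 4, 1, 5, 2 would score (3 + 3 + 3 + 4 + 4 + 3 + 3 + 4 + 4 + 3) × 2.5 = 85.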
Table 3. Questionnaire for attendees post-experience in the III CM Málaga.

Number | Type | Description of the Question
(1) | Quantitative | How would you rate your ability to navigate and interact in the Spatial.io metaverse?
(2) | Quantitative | What did you think of the functionality of the overlay maps (geographic heights, temperatures, meteorite fallout) activatable by stepping on buttons with the avatar?
(3) | Quantitative | How would you rate the animation of the SpaceX spacecraft and Perseverance?
(4) | Quantitative | How would you rate your experience of socializing in the space with other visitors and the guide assisting in the visit?
(5) | Qualitative | Were you able to activate the interactive 3D Mars maps without external help?
(6) | Qualitative | Do you have any ideas or comments on how to improve or apply this experience?