Research Article

Collaborative Skills Training Using Digital Tools: A Systematic Literature Review

Received 20 Oct 2023, Accepted 23 Apr 2024, Published online: 13 May 2024

Abstract

The development of information and communication technologies has changed our way of working, emphasizing the need for individuals to develop collaborative skills. The aim of the present systematic review was to examine the extent to which these skills can be developed through the use of digital tools. A search of seven electronic databases, following PRISMA guidelines, yielded 18 relevant peer-reviewed articles. Analysis of the literature revealed that digital tools have the potential to enhance collaborative skills. However, the effects vary considerably, depending on which tools, methods, and measures are used. It also revealed that studies were conducted mainly in the social sciences, mostly among students, and half of them focused on short interventions. Another finding was that little is known about the features of the digital tools that actually contribute to these effects. Work on how digital tools contribute to the development of collaborative skills is still in its infancy, and more research based on rigorous methods and measures is needed.

1. Introduction

The work environment has changed considerably over the past few decades, with the development of information and communications technology (ICT). This has led to the transformation of organizations and the widespread adoption of teleworking and hybrid work. Employee work dynamics have changed to accommodate these new practices, and because of the transformation of activities brought about by ICT, students now need specific skills to become part of the knowledge society (Anderson, Citation2008). These include digital literacy, critical thinking, creativity, and collaborative skills, which are variously referred to as soft skills, transferable skills, key skills, social skills, nontechnical skills, and twenty-first century skills (e.g., Cinque, Citation2016; Flin et al., Citation2008; Goggin et al., Citation2019; Touloumakos, Citation2020). Similarly, various classifications of these skills have been developed in different domains. For example, Binkley et al. (Citation2012) suggested organizing skills according to whether they are related to ways of thinking (e.g., creativity, critical thinking, metacognition), ways of working (e.g., communication, collaboration), tools for working (e.g., information literacy, ICT literacy), or living in the world (e.g., citizenship, personal responsibility, and social responsibility). Taken together with other classifications (Ananiadou & Claro, Citation2009; Flin et al., Citation2008; Soulé & Warrick, Citation2015; Sun et al., Citation2022), they indicate that some skills are regarded as central (i.e., collaboration, communication, ICT literacy), while others are viewed as more peripheral (e.g., creativity, critical thinking), or are completely overlooked, such as planning and conflict resolution (Voogt & Roblin, Citation2012).

Despite differences between the classifications and terminologies used in the literature, there is a broad consensus on the need to develop these skills in order to facilitate performance and adaptation in the 21st-century work environment. First, these skills are needed to search, assess, select, and communicate relevant information, and to collaborate within teams (Morrison-Smith & Ruiz, Citation2020). In today’s labor market, these skills are viewed as essential for employability and are much sought after by employers (Abelha et al., Citation2020; Cimatti, Citation2016; Succi & Canovi, Citation2020). They therefore need to be developed not just among students, but also among employees in the course of their professional career, from a lifelong learning perspective (for reviews, see Noah & Abdul Aziz, Citation2020; Widad & Abdellah, Citation2022). Second, the transformation of the work environment brought about by ICT has created new and specific challenges for employees (Sun et al., Citation2022). For example, with the development of digital remote working environments and teleworking, teams are more likely to encounter difficulties when it comes to efficiently making decisions, negotiating, or collaborating (e.g., Swaab et al., Citation2012; Vayre, Citation2021). Digital technologies are omnipresent in education and the workplace, presenting fresh challenges when it comes to managing information and working with others. Hence the need to examine the literature concerning the training of individuals’ soft skills using an ICT environment.

To our knowledge, few systematic reviews have been carried out on the development of soft skills using digital tools. The systematic review by van Laar et al. (Citation2017) examined the concepts used to describe the skills needed in a digital environment based on a classification of 21st-century skills, and identified seven core skills with digital components: technical, information, communication, collaboration, creativity, critical thinking, and problem solving. In a subsequent literature review, the same authors (van Laar et al., Citation2020) sought to identify the main determinants of these seven skills. Results indicated that the main determinants were limited to personality and psychological factors. Their systematic review also revealed that a specific category of soft skills related to collaboration and communication tended to be neglected by researchers, with the authors concluding that “21st-century skills and 21st-century digital skills studies measured the determinants of problem-solving skills relatively frequently, whereas collaboration and communication skills studies were underreported” (van Laar et al., Citation2020, p. 9). It therefore seems important to investigate how people develop teamwork and collaborative skills using digital tools. Conceptually, the terms teamwork and collaborative (problem-solving) skills are often used interchangeably, but the former is generally broader than the latter. Teamwork can be defined as the process whereby team members collaborate to achieve task goals (Driskell et al., Citation2018). In other words, it refers to “the integration of individuals’ efforts toward the accomplishment of a shared goal” (Mathieu et al., Citation2008, p. 458). At a more general level, teamwork refers to the skills involved in working with others, such as coordination, collaboration, cooperation, communication, leadership, conflict management, problem solving, and decision making.

A number of international reports have underlined the importance of developing both teamwork and collaborative skills. For example, the OECD Program for International Student Assessment (PISA) highlighted a generally low level of proficiency among students in terms of collaborative problem-solving (OECD, Citation2017a, Citation2017b). The OECD Teaching and Learning International Survey (TALIS), which asks teachers and school leaders about their learning environments, reached the same conclusions for education professionals, as the latter were found to have very poor collaborative skills (OECD, Citation2019). Commenting on collaborative problem-solving skills in education, Fiore et al. (Citation2018) noted that students tend to overestimate their collaborative skills, even though training to develop these skills is rarely included in their curricula. Although numerous studies have described ways of assessing students’ ability to solve problems collaboratively (e.g., Care et al., Citation2016; Hamet Bagnou et al., Citation2022; Rojas et al., Citation2021; Sun et al., Citation2020, Citation2022), there continues to be a dearth of research on how best to train students to develop teamwork and collaborative skills (Ahonen & Kinnunen, Citation2015; Love et al., Citation2023). In recent studies, authors have advocated for educators and researchers to work together to develop evidence-based training of collaborative skills for students (Fiore et al., Citation2018; Greiff & Borgonovi, Citation2022). Moreover, research suggests that various digital tools can be used to develop teamwork and collaborative skills (e.g., Cuomo et al., Citation2022; Li & Liu, Citation2022; Makri et al., Citation2021).
For example, robots can promote collaborative skills (Toh et al., Citation2016), and games and simulations can help develop teamwork skills such as decision-making (Akcaoglu & Koehler, Citation2014; Tiwari et al., Citation2014), problem solving (Lancaster, Citation2014; Liu et al., Citation2011), and collaborative skills (Qian & Clark, Citation2016; Yang & Chang, Citation2013).

Despite a growing interest in technology-based environments for learning 21st-century skills, there remains a need for empirical evidence to assess how digital tools can be used specifically to develop teamwork and collaborative skills. The purpose of the present systematic review was to fill some of these gaps by comprehensively summarizing existing research on the development of collaborative and communication skills with digital tools.

The present review aims to investigate the following research question: How does training involving the use of digital tools promote the development of collaborative skills? More specifically, we seek to answer the following research questions (RQs):

  • RQ1. In which subject areas and disciplines were previous studies conducted?

  • RQ2. What was the population of these studies?

  • RQ3. Which types of interventions were used to develop collaborative skills?

  • RQ4. Which research designs were used?

  • RQ5. Which digital tools were used?

  • RQ6. How were these skills measured and what were the effects of training?

2. Method

This systematic review was performed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. PRISMA provides a systematic checklist of 27 criteria and a flow diagram allowing for the reproducibility and transparency of the research (Moher et al., Citation2015; Page et al., Citation2021).

2.1. Search strategy

We searched seven databases: Web of Science and six databases available through the EBSCO platform: APA PsycInfo, APA PsycArticles, MEDLINE, Psychology and Behavioral Science Collection, Academic Search Premier, and ERIC. These databases were chosen because they cover various disciplinary areas in psychology, learning sciences, and medicine. Our institution’s access to the selected databases allowed us to use the same Boolean function for each one. The three expanders proposed by EBSCO were used to broaden the scope of the search: apply related words (“include synonyms and plurals of the terms”), search within documents (“search for the keywords within the full text of articles, as well as abstract and citation information”), and apply equivalent subjects (“utilize mapped vocabulary terms to add precision to unqualified keyword searches”).

Our search concerned three concepts: teamwork and collaborative skills, digital tools, and educational context. We used several keywords and synonyms for each one, to ensure a broad coverage of studies. We therefore implemented the following Boolean search: (“teamwork skill*” OR “teamwork competenc*” OR “teamwork abilit*” OR “collaborati* skill*” OR “collaborati* competenc*” OR “collaborati* abilit*”) AND (“software” OR “online tool*” OR “online environment” OR “digital tool*” OR “digital environment” OR “numerical tool*” OR “electronic tool*” OR “computer-based tool*” OR “interactive environment” OR “computer-assisted” OR “computer-aided” OR “web-based” OR “groupware”) AND (“train*” OR “educat*” OR “learn*” OR “instruct*” OR “teach*”). The final search and final data extraction were performed in January 2023.
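The query above consists of three parenthesized OR-blocks (skills, tools, context) joined by AND. Purely as an illustration of its structure (the string was entered directly into each database interface; the assembly script below is ours, not part of the original procedure), the same query can be built mechanically from the keyword lists:

```python
# Illustrative reconstruction of the Boolean search string.
# Keyword lists are taken verbatim from the text; the assembly logic is an assumption.

skills = ['"teamwork skill*"', '"teamwork competenc*"', '"teamwork abilit*"',
          '"collaborati* skill*"', '"collaborati* competenc*"', '"collaborati* abilit*"']
tools = ['"software"', '"online tool*"', '"online environment"', '"digital tool*"',
         '"digital environment"', '"numerical tool*"', '"electronic tool*"',
         '"computer-based tool*"', '"interactive environment"', '"computer-assisted"',
         '"computer-aided"', '"web-based"', '"groupware"']
context = ['"train*"', '"educat*"', '"learn*"', '"instruct*"', '"teach*"']

def block(terms):
    """Join synonyms with OR and wrap the block in parentheses."""
    return "(" + " OR ".join(terms) + ")"

# One AND-joined string, identical in shape to the query reported above.
query = " AND ".join(block(t) for t in [skills, tools, context])
print(query)
```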

2.2. Eligibility criteria

We included studies related to (1) teamwork and collaborative skills development, (2) involving the use of an ICT environment or digital tools, and (3) in the context of education and learning. Studies that did not deal with collaborative skills or teamwork or did not use a digital tool in the context of education and learning were excluded. Given our criteria, we focused on collaborative skills as in computer-supported collaborative learning (CSCL), and not on cooperative skills as in computer-supported cooperative work (CSCW). In the literature, a distinction is made between these two concepts, based on the way in which the task is divided (Schürmann et al., Citation2024). With cooperation, a given task is divided among group members, whereas collaboration refers to “the mutual engagement of participants in a coordinated effort to solve the problem together” (Roschelle & Teasley, Citation1995, p. 70). Only the latter was considered in the present systematic review.

To be included, articles also had to be written in English and peer reviewed, and their full texts had to be available. We excluded tests, editorials, dissertations, chapters, proceedings, and book reviews. Gray literature was also excluded.

2.3. Study selection

The search yielded a total of 2475 articles. After eliminating 642 duplicates, we were left with 1833 articles. The abstracts of these articles were read, resulting in the exclusion of 1722 articles. The 111 remaining articles were retrieved in full text (three were not available to our institution and the authors did not respond to our requests, so they were excluded from the next step). After reading the full texts, 90 articles were excluded: 25 that did not involve the use of a digital tool designed to develop learners’ collaborative skills, and 65 that did not provide any data on the development of learners’ collaborative skills. A total of 18 articles therefore met all our inclusion criteria and were selected for analysis. The procedure is shown in the form of a PRISMA flow chart in Figure 1, and the study selection file is available online in our OSF project (https://osf.io/u2qez/).
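The selection figures reported above form a simple subtraction chain and can be checked against one another. As a sketch (counts taken directly from the text; variable names are ours):

```python
# PRISMA-style count check, using the figures reported in the text.
identified = 2475
duplicates = 642
after_dedup = identified - duplicates                  # records screened on abstracts
excluded_on_abstract = 1722
full_text_sought = after_dedup - excluded_on_abstract  # reports sought in full text
not_retrieved = 3
full_text_assessed = full_text_sought - not_retrieved  # reports actually read
excluded_no_tool = 25
excluded_no_data = 65
included = full_text_assessed - excluded_no_tool - excluded_no_data
print(after_dedup, full_text_sought, included)  # 1833 111 18
```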

Figure 1. Flow chart of systematic review.


From: Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. doi:10.1136/bmj.n71.


2.4. Screening and selection bias

To avoid selection bias, the selection process was performed by two authors (A.C. and B.H.). The first selection step based on titles and abstracts was performed independently by the two coders, who each examined all the articles (Cohen’s kappa = 0.76; interrater agreement = 97.76%). To be included in the second step, an article had to be selected by at least one of the two coders. For the last step, requiring full-text assessments, all 18 articles were independently coded by the two coders according to the selection criteria (Cohen’s kappa = 0.72; interrater agreement = 90.74%). Any disagreements (10 of the 111 full-text articles) were discussed until a consensus was reached.
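Cohen’s kappa corrects raw percentage agreement for the agreement expected by chance, which is why it is lower than the raw interrater figures reported above. A minimal sketch of the computation for two coders’ include/exclude decisions (the decision lists below are invented for illustration; the review’s actual screening data are in its OSF project):

```python
# Sketch of Cohen's kappa for two raters' categorical decisions.

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters (equal-length lists)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    categories = set(a) | set(b)
    # Chance agreement: product of each rater's marginal proportions, per category.
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions for ten articles:
coder1 = ["in", "out", "out", "in", "out", "out", "in", "out", "out", "out"]
coder2 = ["in", "out", "out", "out", "out", "out", "in", "out", "out", "out"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.74, despite 90% raw agreement
```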

2.5. Data extraction

We created a data extraction form to explore the selected studies. This form included the following sections: characteristics of studies (authors, date of publication, country), number of participants and design of studies, ICT environment used, metrics, and main findings (see Appendix A).

2.6. Quality appraisal

We used the Mixed Methods Appraisal Tool (MMAT) to assess the quality of the studies (Hong et al., Citation2018). The MMAT comprises two screening questions and, for each of the five categories of study design (qualitative, quantitative descriptive, quantitative randomized, quantitative nonrandomized, and mixed methods), a set of five core quality criteria. Studies were considered to be of high quality if they met 100% of the criteria, moderate quality if they met 80–99%, average quality if they met 60–79%, low quality if they met 40–59%, and very low quality below that. The authors performed a quality assessment of the 18 articles after full-text screening, and discussed the results of this assessment. None of the appraised articles were excluded, even when the quality score was low. The research designs of the selected studies were as follows: qualitative (n = 4), quantitative descriptive (n = 2), quantitative randomized (n = 4), and quantitative nonrandomized (n = 8). Across all designs, five studies scored 100, five scored 80, five scored 60, one scored 20, and two scored 0. Just over half the studies were rated as being of high or moderate quality.
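The banding described above maps a percentage of MMAT criteria met onto a quality label. A sketch (thresholds as stated in the text; the function and variable names are ours):

```python
# Quality banding applied to the 18 reported MMAT scores.
from collections import Counter

def mmat_quality(pct):
    """Map the percentage of MMAT criteria met to the quality band used here."""
    if pct == 100:
        return "high"
    if pct >= 80:
        return "moderate"
    if pct >= 60:
        return "average"
    if pct >= 40:
        return "low"
    return "very low"

# Scores reported in the text: five at 100, five at 80, five at 60, one at 20, two at 0.
scores = [100] * 5 + [80] * 5 + [60] * 5 + [20] + [0] * 2
counts = Counter(mmat_quality(s) for s in scores)
print(counts)  # 5 high, 5 moderate, 5 average, 3 very low
```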

3. Results

The 18 studies analyzed in the present review are listed in Appendix A.

3.1. Subject areas and disciplines

The selected articles were published between 2001 and 2021. The number of articles remained fairly constant across 5-year periods up to 2016 (two papers between 2001 and 2006, and three papers in each of the periods 2006–2011 and 2011–2016), and the remaining 10 were published after 2016.

We used the Scimago Journal & Country Rank database (https://www.scimagojr.com) to establish the subject areas and disciplines covered by the journals in which the selected articles were published (see Table 1). The most prevalent subject area was social science (n = 12), followed by medicine (n = 7) and computer science (n = 5). Education was the most prevalent discipline (n = 11), followed by e-learning (n = 6) and computer science applications (n = 5).

Table 1. Categorization of studies according to subject area and discipline.

3.2. Population

Studies described in the selected articles were primarily conducted among students (n = 13), notably at university (n = 9), but also in middle or high school (n = 4); three of the latter were conducted in Taiwan. Four of the nine studies among university students involved healthcare students, and three involved engineering students. Five studies focused on employees in aviation (n = 2) and healthcare (n = 3).

3.3. Intervention programs

All the studies conducted in healthcare were embedded in one of two types of intervention program: either interprofessional education (IPE)Footnote1 (Carbonaro et al., Citation2008; Djukic et al., Citation2015; Evans et al., Citation2016; Lee et al., Citation2020; Riesen et al., Citation2012) or Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS)Footnote2 (Burns et al., Citation2021; Wang et al., Citation2017). Other intervention programs mentioned in the selected articles pertained to vocational fields, such as human resources management programs for business school students (Riivari et al., Citation2021), crew resource management for employees such as pilots (Brannick et al., Citation2005), or science, technology, engineering, and mathematics (STEM) for junior high school students (Lin et al., Citation2018).

3.4. Research design

The selected articles could be divided into case studies (n = 4) and (quasi-)experimental studies (n = 14). Three distinct designs were used for the latter: between-group (n = 5), within-participants (n = 3), and mixed (n = 6). The activity was mostly performed in groups (n = 13), rather than individually (n = 5). As shown in Table 2, all four case studies and nine of the 14 experimental designs dealt with a group activity.

Table 2. Numbers of articles referring to Individual or group activities according to study design.

Eight of the selected articles described longitudinal interventions where the activity lasted between 6 weeks and 12 months. The 10 remaining articles described a one-off session featuring two activities to develop collaborative skills. Only two of the eight longitudinal studies used repeated training sessions to improve collaborative skills (Jarrett et al., Citation2016; Riesen et al., Citation2012).

3.5. Digital tools used

Across the selected articles, a wide variety of digital tools were used, with authors generally using their own tool. These tools could be divided into two categories: immersive environments (n = 10) and non-immersive environments (n = 7). In addition, activities using the digital tools were administered either individually or in groups (see Table 3).

Table 3. Numbers of articles referring to Individual or group activities according to category of digital tool used.

3.5.1. Immersive environments

The digital tools classified as immersive environments could be subdivided into four categories: (1) video games (played on a digital device with one or more players through an avatar in a virtual world); (2) simulations with scenarios (programs developed to represent different situations from case studies, using standardized scripts); (3) PC-based simulators (programs developed to use a computer to represent the operation of a machine, system, or phenomenon); and (4) augmented reality tools (integrating 3D virtual elements into a real environment in real time).

The five studies using video games were conducted in an immersive virtual world where players moved using an avatar and played in a team to complete a mission (Chang & Hwang, Citation2017; Jarrett et al., Citation2016; Riesen et al., Citation2012; Riivari et al., Citation2021; Sancho-Thomas et al., Citation2009). In Web.Alive, the immersive world is similar to the real world, and participants can interact with other players and access learning content in the form of videos, slide presentations, or other web-based resources (Riesen et al., Citation2012). NUCLEO plunges players into an immersive world representing an island, where they must form teams to complete several missions assigned to them by an instructor (Sancho-Thomas et al., Citation2009). Players can interact with their group, the instructor and other players on the island, and through the game they can access Moodle learning management system services, tools and content. In NoviCraft, participants need to complete different tasks as a team in order to escape from an island (Riivari et al., Citation2021). To be successfully completed, each task requires participants to share information, coordinate their actions, and make collective decisions. Adopting a similar game design, Chang and Hwang (Citation2017) used a video game about biology where teams of players had to perform various tests in order to collect information needed to complete a mission. Finally, Jarrett et al. (Citation2016) used a game in which teams of four played against the computer to destroy enemy tanks. Within each team, participants were split into two pairs, and each pair was assigned a tank, with one member taking on the role of gunner, and the other the role of commander/driver. The four players each had their own computer and communicated with each other using a microphone and headset.

The four studies using simulations with scenarios were conducted among healthcare workers or students in medicine or nursing. Two of the studies involved online simulations using Storyline with standardized scripts (Burns et al., Citation2021; Wang et al., Citation2017), and the two others integrated online course modules into the simulation (Djukic et al., Citation2015; Lee et al., Citation2020). In Burns et al. (Citation2021), participants performed three simulations of a pediatric patient in septic shock, in which they had to select optimum medical actions and communication toward other team members from a set of options. After each simulation, participants received automatic feedback through two scores based on their medical and communication selections during the scenario. Wang et al. (Citation2017) used five scenario-based computer simulations featuring videos of a patient with interactive features. In Lee et al. (Citation2020), the e-learning program featured simulations of four case scenarios, coupled with immediate feedback for incorrect answers. In Djukic et al. (Citation2015), participants performed five knowledge-focused modules and two virtual patient modules, in which participants collaborated with a computer agent. The agent’s comments and the content were created by the faculty staff.

A single study used a PC-based simulator, resembling line-oriented flight training, to train two-member teams (Brannick et al., Citation2005). After team-based training provided either by digital tools or by an instructor, participants performed a team-based task designed to assess the development of their collaborative skills using a high-fidelity simulator.

In the only study to use augmented reality tools (Chen et al., Citation2020), students in groups of three or four used a tablet to access augmented reality material and complete two robot building tasks using interlocking bricks.

3.5.2. Nonimmersive environments

In nonimmersive environments, digital tools can be divided into three categories: sharing platforms (digital tools developed to enable several people to work together on a subject); online courses (digital tools developed to provide online courses); and collaborative problem-solving tools (digital tools developed to solve problems collaboratively).

Five studies involved a sharing platform (Baser et al., Citation2017; Carbonaro et al., Citation2008; Carroll et al., Citation2015; Cortez et al., Citation2009; Evans et al., Citation2016). All provided a shared workplace and a synchronous and asynchronous discussion channel. Three used a virtual classroom environment providing a shared workplace such as an interactive whiteboard and instant messaging (Carbonaro et al., Citation2008), a functionality to participate in team conferences via an asynchronous team discussion board in Blackboard (Evans et al., Citation2016), or an asynchronous online forum with the possibility of sharing documents in group or class folders (Baser et al., Citation2017). Basic Resources for Integrated Distributed Group Environments (BRIDGE), another sharing platform, was used during a semester to support collaborative homework activities, providing a space for sharing information or various documents for planning work, and for interacting synchronously and asynchronously with team members (Carroll et al., Citation2015). Finally, Cortez et al. (Citation2009) used a Pocket PC on which participants received two sets of elements specific to each group member: one set (riddles) could not be exchanged with other members, whereas the other (solutions) could. The objective was to associate each riddle with a solution on the Pocket PC. To perform the task, group members had to collaborate by exchanging their solutions.

Only one study used an online course (Kraus & Gramopadhye, Citation2001): aircraft maintenance technicians individually received team-based training through Aircraft Maintenance Team Training (AMTT), a computer-based multimedia training tool. This tool provided material in a variety of formats (videos, photos, animations), together with different submodules to develop specific team skills such as communication, leadership, and decision making (Kraus & Gramopadhye, Citation1999). A collaborative problem-solving tool was likewise used in a single study (Lin et al., Citation2018), in which eight collaborative problem-solving tasks were implemented using a digital tool. Four tasks were used for training, and four for the assessment of collaborative skills. A computer-simulated participant served as a teammate during the problem-solving tasks. Participants communicated with the computer agent on the screen, and the agent answered by searching for the solution in a database.

3.6. Collaborative skills measured and effects of training

Analysis of the included articles showed that collaborative skills were measured using questionnaires, observations, interviews, written reports, or task performances. Concerning questionnaires, results revealed a considerable variety of assessment tools. We counted 11 different literature-based questionnaires, and only one of these, the University of the West of England Interprofessional Questionnaire,Footnote3 was used in two different articles (Carbonaro et al., Citation2008; Evans et al., Citation2016). Concerning observations, several different methods were used, including the subjective rating of behaviors on a Likert scale (Brannick et al., Citation2005; Carroll et al., Citation2015; Wang et al., Citation2017), and the counting of specific behaviors observed during an activity (Chang & Hwang, Citation2017). Some authors relied on task performances, such as the mean numbers of riddles solved or not solved (Cortez et al., Citation2009), the number of enemy tanks destroyed (Jarrett et al., Citation2016), or the number of bottles knocked over by a robot car (Chen et al., Citation2020). One study used 1-page written reports in which team participants assessed team roles and analyzed their team’s effectiveness and decision making.

Most of the included articles used a combination of tools to assess collaborative skills. Of the 10 articles that combined two methods, the most frequently used combination was questionnaire plus observation (n = 5). Only two articles used three methods (questionnaire, observation, and performance task). Of the six articles that used a single method, four used a questionnaire, one used observation, and one used a performance task. Taken together, the most frequently used methods were the questionnaire (n = 14), followed by observation (n = 9).

Overall, most of the articles reported a positive effect of training with digital tools on the development of collaborative skills. However, five of them failed to find any effect on the development of collaborative skills. This lack of effect was particularly noticeable in the articles dealing with various modes of learning, when blended learning (i.e., a combination of online and in-person) was compared with more traditional approaches (i.e., either online or in-person) (Burns et al., Citation2021; Carbonaro et al., Citation2008; Djukic et al., Citation2015; Kraus & Gramopadhye, Citation2001; Wang et al., Citation2017). Of the remaining 13 articles, only three compared groups with and without the use of digital tools, showing a positive effect of digital tools on the development of collaborative skills (Brannick et al., Citation2005; Chen et al., Citation2020; Lin et al., Citation2018). Three other studies explored the effects of specific variables in conjunction with the use of digital tools to develop collaborative skills. In two of these studies, video games were used, with results revealing that peer support helped to improve collaborative skills (Chang & Hwang, Citation2017), and the use of an after-action review procedure (e.g., debriefing) enhanced team performance (Jarrett et al., Citation2016). In the third study, using the learning-to-collaborate-by-collaborating process improved collaborative skills (Cortez et al., Citation2009). The seven other articles reported a positive effect of training with digital tools on the development of collaborative skills.
More specifically, four of these seven, with a case study design, only assessed collaborative skills after the use of digital tools (Baser et al., Citation2017; Carroll et al., Citation2015; Riivari et al., Citation2021; Sancho-Thomas et al., Citation2009), whereas the other three assessed them both before and after the use of the digital tool (Evans et al., Citation2016; Lee et al., Citation2020; Riesen et al., Citation2012).

4. Discussion

The present systematic review examined how digital tools can be used to train collaborative skills. A search of seven databases allowed us to identify 2475 articles, of which only 18 met our inclusion criteria.

Most of these articles were published in the social sciences, specifically education and e-learning, followed by medicine and computer science. Indeed, 13 of the 18 included articles were conducted primarily among students, though only four of them involved middle- or high-school students. The five remaining articles focused on employees in two vocational fields: aviation and healthcare.

Regarding study design, analysis revealed a high proportion of experimental designs (14/18), although these seldom featured a longitudinal design (4/14). These results are consistent with a recent systematic review in which only one of the 46 selected papers using gamification in education and learning had a longitudinal design (Zainuddin et al., Citation2020). Moreover, the present review revealed that all the articles in the medical field, contrary to those in other fields, used standardized tools for measuring collaborative skills. More specifically, they were based on one of two systematic intervention programs (i.e., TeamSTEPPS or IPE), which provide a framework and tools for developing and assessing collaborative skills. This observation is not surprising, given that the medical field, like the aeronautics field, has pioneered the training of professionals to develop teamwork skills, and is more likely to involve team simulations (e.g., using high-fidelity mannequins) (Fritz et al., Citation2008). As the ongoing transformation of work increasingly calls for collaborative skills and teamwork, research on the development of these skills now needs to be extended to other vocational fields (Succi & Canovi, Citation2020).

The present systematic review also revealed that a wide variety of digital tools were used to develop collaborative skills, with immersive environments predominating (10/18). Three main types of digital tools were used: video games, sharing platforms, and scenario-based simulations. Although most of the digital tools in the included studies required group activities (13/18), some of them relied on individual activities. It was mainly the scenario-based simulations that involved individual activities, for example when a virtual agent simulated an environment that gave individuals the impression of doing group work. These results highlight the importance of working in groups to develop collaborative skills using digital tools, although digital environments simulating group work can also be used individually for this purpose. In the included articles, all the video games had to be played in teams in order for a mission to be accomplished. It was by collaborating that participants could perform a given mission. In other words, in order to develop collaborative skills from video games, participants had to take part in the mission collectively, communicating with and/or helping each other.

The present systematic literature review highlighted a diversity of findings about the effects of digital tools on the development of collaborative skills. This diversity can be attributed not only to the variety of digital tools used by researchers, but also to the variety of measures used to assess collaborative skills. Self-report questionnaires were often preferred, and the use of different nonstandardized instruments without any psychometric validation limited comparisons between studies. When researchers use their own self-report measures, it is difficult to gauge the efficacy of an intervention at both the first (participants’ satisfaction) and second (attitudinal or knowledge change) levels identified by Kirkpatrick and Kirkpatrick (Citation2006). To assess this efficacy, and also to improve collaborative skills, future studies should reinforce behavioral and performance measures and use them in addition to self-report measures. Concerning the latter, it would be useful to administer standardized and validated questionnaires measuring collaborative skills, notably in repeated training sessions aimed at improving these skills, as there have so far been very few longitudinal studies (Jarrett et al., Citation2016; Riesen et al., Citation2012).

Moreover, analysis revealed that some of the quantitative studies included in the present systematic review had adopted a media comparison approach, assessing the contribution of digital tools compared with more traditional pedagogical situations where they are not used (Brannick et al., Citation2005; Burns et al., Citation2021; Carbonaro et al., Citation2008; Kraus & Gramopadhye, Citation2001). This is a useful approach for demonstrating the possible effects of digital tool use, but it also has limitations, notably because it focuses on the technology rather than on the learner, and also because it does not help to demonstrate the complementarity of media, the importance of instructional methods, or the features of the digital environment (see Buchner & Kerres, Citation2023). Other studies varied the pedagogical situations in which one or more digital tools were used, assessing the effects of synchronization-based peer assistance (Chang & Hwang, Citation2017), a learning-to-collaborate step (Cortez et al., Citation2009), or the delivery of the same module in versus outside the classroom (Djukic et al., Citation2015). Finally, a few studies combined the two approaches, investigating, for example, the effects of augmented reality and competition (Chen et al., Citation2020) or a web-based collaborative problem-solving system with or without a teacher’s guidance. The variety of these pedagogical situations and instructional methods made it difficult to come up with practical recommendations, but nevertheless demonstrated the importance of the pedagogical context in which a digital tool is used.

4.1. Theoretical and practical implications

This systematic analysis of the literature has implications for our original research question. It would seem that we must shift our focus away from the question of which digital tools to use in order to develop collaborative skills, toward that of how to use them (the value-added approach). In a closely related field of research, Cai et al. (Citation2022) demonstrated in their meta-analysis that adding scaffolding (e.g., feedback, hints, or reflection) to digital game-based learning environments has positive effects on learners’ performance. This finding supports a value-added approach in which the emphasis is on assessing the characteristics of the learning environment, rather than on the mere use (or non-use) of a digital tool. The necessary development of this value-added approach has been advocated in the field of game-based learning (Mayer, Citation2019), as well as more recently in the field of augmented reality (Buchner & Kerres, Citation2023).

In terms of recommendations, this systematic review may have implications not only for Human-Computer Interaction researchers, but also for educators aiming to train people to collaborate in teams. Indeed, Human-Computer Interaction researchers need to focus their research on this approach in order to gain a better understanding of how to design useful tools for training collaborative skills. For example, some recent findings have suggested that delivering instructions on how to collaborate during a computer-based collaborative task improves performance and task conflict regulation in online groups (Hémon et al., Citation2024). This study adopted an approach that we believe should be developed further, in order to better understand how specific features of digital tools can support group activities by enhancing teamwork and collaborative skills. As underscored by Fiore et al. (Citation2018), “students rarely receive meaningful instruction, modelling and feedback on collaboration” (p. 368), which may explain why they overestimate their collaborative skills without necessarily collaborating efficiently. This review also has implications for education and training. Collaborative skills are transversal skills that students need to develop, and they should be better integrated into the curriculum alongside disciplinary content. Nowadays, immersive and non-immersive digital tools appear to be promising complements to traditional non-technological methods based on face-to-face collaboration. Nevertheless, educators need to be vigilant in the way they measure the acquisition of collaborative skills using digital tools. We also recommend organizing a series of training sessions over time, and evaluating their effects using validated metrics.

5. Future research directions

Based on this systematic review, we can propose some ideas for future research.

First, our findings confirm the need to go further in the study of digital tool use and, in particular, to continue taking a closer look at the different functionalities offered by digital environments. Future research should apply the value-added approach in order to draw up recommendations for future digital tool development. For example, research should focus on the impact of automatic feedback on self-assessment and/or the use of instruments to guide and facilitate the development of collaborative skills during training sessions.

Furthermore, although a diversity of collaborative skills measures was used, self-report questionnaires were favored by researchers. It therefore appears necessary to verify the psychometric qualities of the scales used. In addition, future research would benefit from using performance tasks to evaluate collaborative skills, thus avoiding the potential desirability bias and false memories that arise when these skills are measured with questionnaires. From this perspective, the desire to integrate the development of learners’ collaborative skills into their curriculum, notably through the use of digital tools, leads us to reconsider research design. Future research would benefit from the implementation of longitudinal study designs to examine the effects of training over time in ecological contexts.

6. Limitations

The present systematic review had several limitations. First, we focused exclusively on articles published in peer-reviewed English-language journals, and did not include any gray literature (e.g., white papers, doctoral dissertations, chapters, and articles published in non-peer-reviewed journals). Second, although we used the expanders proposed by databases to search for terms synonymous with our keywords, we may have missed some articles involving teamwork, collaborative skills, or digital tools if these terms were not mentioned in the abstract or keywords. For these reasons, some relevant studies may not have been included in this systematic review. Third, the review failed to yield clear evidence of the effects of digital tools on the development of collaborative skills, owing to the variable quality of the selected studies. According to the MMAT, just over half of our selected studies were of moderate or high quality, depending on their design. There is therefore a need for further high-quality studies investigating the effects of team training with digital tools on the development of learners’ teamwork and collaborative skills. Given the methodological issues we highlighted, and the disparity of the study fields, digital tools, methods, and measures used, any attempt to generalize the results reported here must be made with caution.

7. Conclusion

One of the main conclusions of the present systematic review is that digital tools have the potential to develop collaborative skills. Integrating digital tools into traditional classrooms, or using them to train teams in in-person courses, contributes to the development of these skills. Nevertheless, the mere presence of digital tools does not systematically guarantee an improvement in collaborative skills. Rather, the transformative potential of digital tools lies in their ability to reshape and enhance existing learning modalities through specific interventions. Another important issue is the need for future research to better understand the features of digital environments that enhance these skills. A further challenge for the future is to embed interventions that use digital tools to develop collaborative skills within the academic curriculum. In order to assess the development of these skills, and the efficacy of the digital tools intended to promote them, standardized measures with good psychometric validity should be used, as well as behavioral and performance measures collected over different training sessions from a longitudinal perspective. To conclude, work on how digital tools contribute to the development of collaborative skills is still in its infancy, and more research based on rigorous methods and measures is needed.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Data and codebook are available on our OSF webpage: https://osf.io/u2qez/.

Additional information

Funding

This research is a part of the ECC’IPE project supported by the French Ministry of National Education, Youth and Sports (MENJS), Digital innovation and educational excellence action.

Notes on contributors

Anthony Cherbonnier

Anthony Cherbonnier is a post-doctoral fellow in social psychology and ergonomics at the University of Rennes 2 (France) where he obtained his PhD. His research focuses on emotion in online environments, collaborative skills and human-computer interaction.

Brivael Hémon

Brivael Hémon obtained his PhD in social psychology at the University of Rennes 2 (France). His research focuses on safety voice and collaboration in work groups. In his current postdoctoral position, he is working on the acceptability of a digital tool intended to develop pupils’ critical thinking skills.

Nicolas Michinov

Nicolas Michinov is a Professor of social and applied psychology at the University of Rennes 2 (France). He studies individual and group processes to determine their influence on different outcomes (e.g., performance, creativity and emotional states), and in different contexts, such as collaborative working, collaborative learning and online learning.

Eric Jamet

Eric Jamet is a Professor of cognitive psychology and ergonomics at the University of Rennes 2 (France). His research focuses on various issues related to the cognitive processing of complex documents and human-computer interaction (e.g., e-learning and multimedia comprehension, active learning and scaffolding effects).

Estelle Michinov

Estelle Michinov is a Professor of social psychology at the University of Rennes 2 (France). Her research interests include group processes and team performance in different contexts. She examines factors that contribute to team performance and team training methods (simulation-based, technology-based, and virtual reality).

Notes

1 “Interprofessional education occurs when students from two or more professions learn about, from, and with each other to enable effective collaboration and improve health outcomes” World Health Organization (WHO) (Citation2010, p. 7).

2 TeamSTEPPS is an evidence-based set of teamwork tools, aimed at optimizing patient outcomes by improving communication and teamwork skills among healthcare professionals (French Agency for Healthcare Research and Quality).

3 UWEIQ has 35 items belonging to four subscales: communication and teamwork (9 items), interprofessional learning (9 items), interprofessional interaction (9 items), and interprofessional relationships (8 items) (Pollard et al., Citation2004, Citation2005).

References

  • References marked with an asterisk indicate studies included in the literature review.
  • Abelha, M., Fernandes, S., Mesquita, D., Seabra, F., & Ferreira-Oliveira, A. T. (2020). Graduate employability and competence development in higher education – A systematic literature review using PRISMA. Sustainability, 12(15), 5900. https://doi.org/10.3390/su12155900
  • Ahonen, A. K., & Kinnunen, P. (2015). How do students value the importance of twenty-first century skills? Scandinavian Journal of Educational Research, 59(4), 395–412. https://doi.org/10.1080/00313831.2014.904423
  • Akcaoglu, M., & Koehler, M. J. (2014). Cognitive outcomes from the game-design and learning (GDL) after-school program. Computers & Education, 75, 72–81. https://doi.org/10.1016/j.compedu.2014.02.003
  • Ananiadou, K., & Claro, M. (2009). 21st century skills and competences for new millennium learners in OECD countries (OECD Education Working Papers, No. 41). https://doi.org/10.1787/218525261154
  • Anderson, R. E. (2008). Implications of the information and knowledge society for education. In J. Voogt & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 5–22). Springer US. https://doi.org/10.1007/978-0-387-73315-9_1
  • Arthur, W., Jr., Bell, S. T., & Edwards, B. D. (2007). A longitudinal examination of the comparative criterion-related validity of additive and referent-shift consensus operationalizations of team-efficacy. Organizational Research Methods, 10, 35–58. https://doi.org/10.1177/1094428106287574
  • *Baser, D., Ozden, M. Y., & Karaarslan, H. (2017). Collaborative project-based learning: An integrative science and technological education project. Research in Science & Technological Education, 35(2), 131–148. https://doi.org/10.1080/02635143.2016.1274723
  • Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining twenty-first century skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Springer Netherlands. https://doi.org/10.1007/978-94-007-2324-5_2
  • *Brannick, M. T., Prince, C., & Salas, E. (2005). Can pc-based systems enhance teamwork in the cockpit? The International Journal of Aviation Psychology, 15(2), 173–187. https://doi.org/10.1207/s15327108ijap1502_4
  • Buchner, J., & Kerres, M. (2023). Media comparison studies dominate comparative research on augmented reality in education. Computers & Education, 195, 104711. https://doi.org/10.1016/j.compedu.2022.104711
  • *Burns, R., Gray, M., Peralta, D., Scheets, A., & Umoren, R. (2021). TeamSTEPPS online simulation: Expanding access to teamwork training for medical students. BMJ Simulation & Technology Enhanced Learning, 7(5), 372–378. https://doi.org/10.1136/bmjstel-2020-000649
  • Cai, Z., Mao, P., Wang, D., He, J., Chen, X., & Fan, X. (2022). Effects of scaffolding in digital game-based learning on student’s achievement: A three-level meta-analysis. Educational Psychology Review, 34(2), 537–574. https://doi.org/10.1007/s10648-021-09655-0
  • *Carbonaro, M., King, S., Taylor, E., Satzinger, F., Snart, F., & Drummond, J. (2008). Integration of e-learning technologies in an interprofessional health science course. Medical Teacher, 30(1), 25–33. https://doi.org/10.1080/01421590701753450
  • Care, E., Scoular, C., & Griffin, P. (2016). Assessment of collaborative problem solving in education environments. Applied Measurement in Education, 29(4), 250–264. https://doi.org/10.1080/08957347.2016.1209204
  • *Carroll, J. M., Jiang, H., & Borge, M. (2015). Distributed collaborative homework activities in a problem-based usability engineering course. Education and Information Technologies, 20(3), 589–617. https://doi.org/10.1007/s10639-013-9304-6
  • Chai, C.-S., Deng, F., Tsai, P.-S., Koh, J. H. L., & Tsai, C.-C. (2015). Assessing multidimensional students’ perceptions of twenty-first-century learning practices. Asia Pacific Education Review, 16(3), 389–398. https://doi.org/10.1007/s12564-015-9379-4
  • *Chang, S.-C., & Hwang, G.-J. (2017). Development of an effective educational computer game based on a mission synchronization-based peer-assistance approach. Interactive Learning Environments, 25(5), 667–681. https://doi.org/10.1080/10494820.2016.1172241
  • *Chen, X., Xie, H., Zou, D., & Hwang, G.-J. (2020). Application and theory gaps during the rise of artificial intelligence in education. Computers and Education, 1, 100002. https://doi.org/10.1016/j.caeai.2020.100002
  • Cimatti, B. (2016). Definition, development, assessment of soft skills and their role for the quality of organizations and enterprises. International Journal for Quality Research, 10(1), 90–130. https://doi.org/10.18421/IJQR10.01-05
  • Cinque, M. (2016). “Lost in translation”. Soft skills development in European countries. Tuning Journal for Higher Education, 3(2), 389–427. https://doi.org/10.18543/tjhe-3(2)-2016pp389-427
  • *Cortez, C., Nussbaum, M., Woywood, G., & Aravena, R. (2009). Learning to collaborate by collaborating: A face-to-face collaborative activity for measuring and learning basics about teamwork. Journal of Computer Assisted Learning, 25(2), 126–142. https://doi.org/10.1111/j.1365-2729.2008.00298.x
  • Cuomo, S., Roffi, A., Luzzi, D., & Ranieri, M. (2022). Immersive environments in higher education: The digital well-being perspective. In M. Ranieri, M. Pellegrini, L. Menichetti, A. Roffi, & D. Luzzi (Eds.), Social justice, media and technology in teacher education (Vol. 1649, pp. 30–41). Springer International Publishing. https://doi.org/10.1007/978-3-031-20777-8_3
  • *Djukic, M., Adams, J., Fulmer, T., Szyld, D., Lee, S., Oh, S.-Y., & Triola, M. (2015). E-Learning with virtual teammates: A novel approach to interprofessional education. Journal of Interprofessional Care, 29(5), 476–482. https://doi.org/10.3109/13561820.2015.1030068
  • Driskell, J. E., Salas, E., & Driskell, T. (2018). Foundations of teamwork and collaboration. The American Psychologist, 73(4), 334–348. https://doi.org/10.1037/amp0000241
  • *Evans, S., Ward, C., & Margerison, C. (2016). Online interprofessional education in dietetic students. Nutrition & Dietetics, 73(3), 268–274. https://doi.org/10.1111/1747-0080.12235
  • Fiore, S. M., Graesser, A., & Greiff, S. (2018). Collaborative problem-solving education for the twenty-first-century workforce. Nature Human Behaviour, 2(6), 367–369. https://doi.org/10.1038/s41562-018-0363-y
  • Flin, R., O’Connor, P., & Crichton, M. (2008). Safety at the sharp end: A guide to non-technical skills. Ashgate Publishing Company.
  • Fritz, P. Z., Gray, T., & Flanagan, B. (2008). Review of mannequin-based high-fidelity simulation in emergency medicine. Emergency Medicine Australasia, 20(1), 1–9. https://doi.org/10.1111/j.1742-6723.2007.01022.x
  • Goggin, D., Sheridan, I., Lárusdóttir, F., & Guðmundsdóttir, G. (2019). Towards the identification and assessment of transversal skills. INTED2019 Proceedings, 2513–2519. https://doi.org/10.21125/inted.2019.0686
  • Greiff, S., & Borgonovi, F. (2022). Teaching of 21st century skills needs to be informed by psychological research. Nature Reviews Psychology, 1(6), 314–315. https://doi.org/10.1038/s44159-022-00064-w
  • Hamet Bagnou, J., Prigent, E., Martin, J.-C., & Clavel, C. (2022). Adaptation and validation of two annotation scales for assessing social skills in a corpus of multimodal collaborative interactions. Frontiers in Psychology, 13, 1039169. https://doi.org/10.3389/fpsyg.2022.1039169
  • Hémon, B., Cherbonnier, A., Michinov, E., Jamet, E., & Michinov, N. (2024). When instructions based on constructive controversy boost synergy in online groups. International Journal of Human–Computer Interaction, 40(5), 1102–1110. https://doi.org/10.1080/10447318.2022.2132028
  • Hong, Q. N., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M.-P., Griffiths, F., Nicolau, B., O’Cathain, A., Rousseau, M.-C., Vedel, I., & Pluye, P. (2018). The Mixed Methods Appraisal Tool (MMAT) version 2018 for information professionals and researchers. Education for Information, 34(4), 285–291. https://doi.org/10.3233/EFI-180221
  • *Jarrett, S. M., Glaze, R. M., Schurig, I., Muñoz, G. J., Naber, A. M., McDonald, J. N., Bennett, W., & Arthur, W. (2016). The comparative effectiveness of distributed and colocated team after-action reviews. Human Performance, 29(5), 408–427. https://doi.org/10.1080/08959285.2016.1208662
  • Jeng, J. H., & Tang, T. I. (2004). A model of knowledge integration capability. Journal of Information, Technology and Society, 4(1), 13–45.
  • King, S., Pimlott, J., Taylor, E., Loomis, J., & Skakun, E. (2005). Final report: Evaluation of team objective structured clinical examination. University of Alberta University Teaching Research Fund.
  • Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels. (3rd ed). Berrett-Koehler.
  • *Kraus, D. C., & Gramopadhye, A. K. (2001). Effect of team training on aircraft maintenance technicians: Computer-based training versus instructor-based training. International Journal of Industrial Ergonomics, 27(3), 141–157. https://doi.org/10.1016/S0169-8141(00)00044-5
  • Kraus, D., & Gramopadhye, A. K. (1999). Team training: Role of computers in the aircraft maintenance environment. Computers & Industrial Engineering, 36(3), 635–654. https://doi.org/10.1016/S0360-8352(99)00156-4
  • Lancaster, R. J. (2014). Serious game simulation as a teaching strategy in pharmacology. Clinical Simulation in Nursing, 10(3), e129–e137. https://doi.org/10.1016/j.ecns.2013.10.005
  • *Lee, S. J., Park, M. S., Kwon, D. Y., Kim, Y., & Chang, S. O. (2020). The development and effectiveness of combining case-based online lecture and simulation programs to facilitate interprofessional function care training in nursing homes. Computers, Informatics, Nursing, 38(12), 646–656. https://doi.org/10.1097/CIN.0000000000000655
  • Leucht, R., Madson, M., Taugher, M., & Petterson, J. (1990). Assessing perceptions: Design and validation of an interdisciplinary education perception scale. Journal of Allied Health, 19, 181–191.
  • Li, C., & Liu, J. (2022). Collaborative virtual environment for distant and blended learning in the higher education setting: A systematic review. In R. C. Li, S. K. S. Cheung, P. H. F. Ng, L.-P. Wong, & F. L. Wang (Eds.), Blended learning: Engaging students in the new normal era (Vol. 13357, pp. 135–146). Springer International Publishing. https://doi.org/10.1007/978-3-031-08939-8_12
  • *Lin, K.-Y., Yu, K.-C., Hsiao, H.-S., Chang, Y.-S., & Chien, Y.-H. (2018). Effects of web-based versus classroom-based STEM learning environments on the development of collaborative problem-solving skills in junior high school students. International Journal of Technology and Design Education, 30(1), 21–34. https://doi.org/10.1007/s10798-018-9488-6
  • Liu, C.-C., Cheng, Y.-B., & Huang, C.-W. (2011). The effect of simulation games on the learning of computational problem solving. Computers & Education, 57(3), 1907–1918. https://doi.org/10.1016/j.compedu.2011.04.002
  • Love, H. B., Cross, J. E., Fosdick, B. K., Tofany, E., & Dickmann, E. M. (2023). Teaching team science: The key to addressing 21st century global challenges. Small Group Research, 54(3), 396–427. https://doi.org/10.1177/10464964221121349
  • MacDonald, C.J., Archibald, D., Trumpower, D., Casimiro, L., Cragg, B., & Jelley, W. (2010). Designing and operationalizing a toolkit of bilingual interprofessional education assessment instruments. Journal of Research in Interprofessional Practice and Education, 1(3), 304–316. https://doi.org/10.22230/jripe.2010v1n3a36
  • Makri, A., Vlachopoulos, D., & Martina, R. A. (2021). Digital escape rooms as innovative pedagogical tools in education: A systematic literature review. Sustainability, 13(8), 4587. https://doi.org/10.3390/su13084587
  • Mathieu, J., Maynard, M. T., Rapp, T., & Gilson, L. (2008). Team effectiveness 1997–2007: A review of recent advancements and a glimpse into the future. Journal of Management, 34(3), 410–476. https://doi.org/10.1177/0149206308316061
  • Mayer, R. E. (2019). Computer games in education. Annual Review of Psychology, 70(1), 531–549. https://doi.org/10.1146/annurev-psych-010418-102744
  • Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., & Stewart, L. A. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1. https://doi.org/10.1186/2046-4053-4-1
  • Morrison-Smith, S., & Ruiz, J. (2020). Challenges and barriers in virtual teams: A literature review. SN Applied Sciences, 2(6), 1096. https://doi.org/10.1007/s42452-020-2801-5
  • Noah, J. B., & Abdul Aziz, A. (2020). A Systematic review on soft skills development among university graduates. EDUCATUM Journal of Social Sciences, 6(1), 53–68. https://doi.org/10.37134/ejoss.vol6.1.6.2020
  • OECD. (2017a). PISA 2015 assessment and analytical framework: Science, reading, mathematic, financial literacy and collaborative problem solving. Organisation for Economic Co-operation and Development. https://www.oecd-ilibrary.org/education/pisa-2015-assessment-and-analytical-framework_9789264281820-en.
  • OECD. (2017b). PISA 2015 results (Volume V): Collaborative problem solving. OECD. https://doi.org/10.1787/9789264285521-en
  • OECD. (2019). Attracting and effectively preparing candidates. In OECD, TALIS 2018 Results (Vol I, pp. 121–150). OECD. https://doi.org/10.1787/dd6dd4bc-en
  • Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71
  • Pollard, K., Miers, M. E., & Gilchrist, M. (2004). Collaborative learning for collaborative working? Initial findings from a longitudinal study of health and social care students. Health & Social Care in the Community, 12(4), 346–358. https://doi.org/10.1111/j.1365-2524.2004.00504.x
  • Pollard, K., Miers, M. E., & Gilchrist, M. (2005). Second year scepticism: Pre-qualifying health and social care students’ midpoint self-assessment, attitudes and perceptions concerning interprofessional learning and working. Journal of Interprofessional Care, 19(3), 251–268. https://doi.org/10.1080/13561820400024225
  • Qian, M., & Clark, K. R. (2016). Game-based learning and 21st century skills: A review of recent research. Computers in Human Behavior, 63, 50–58. https://doi.org/10.1016/j.chb.2016.05.023
  • *Riesen, E., Morley, M., Clendinneng, D., Ogilvie, S., & Ann Murray, M. (2012). Improving interprofessional competence in undergraduate students using a novel blended learning approach. Journal of Interprofessional Care, 26(4), 312–318. https://doi.org/10.3109/13561820.2012.660286
  • *Riivari, E., Kivijärvi, M., & Lämsä, A.-M. (2021). Learning teamwork through a computer game: For the sake of performance or collaborative learning? Educational Technology Research and Development, 69(3), 1753–1771. https://doi.org/10.1007/s11423-021-10009-4
  • Rojas, M., Nussbaum, M., Chiuminatto, P., Guerrero, O., Greiff, S., Krieger, F., & Van Der Westhuizen, L. (2021). Assessing collaborative problem-solving skills among elementary school students. Computers & Education, 175, 104313. https://doi.org/10.1016/j.compedu.2021.104313
  • Roschelle, J., & Teasley, S. D. (1995). The construction of shared knowledge in collaborative problem solving. In C. O’Malley (Ed.), Computer supported collaborative learning (pp. 69–97). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-85098-1_5
  • *Sancho-Thomas, P., Fuentes-Fernández, R., & Fernández-Manjón, B. (2009). Learning teamwork skills in university programming courses. Computers & Education, 53(2), 517–531. https://doi.org/10.1016/j.compedu.2009.03.010
  • Schürmann, V., Marquardt, N., & Bodemer, D. (2024). Conceptualization and measurement of peer collaboration in higher education: A systematic review. Small Group Research, 55(1), 89–138. https://doi.org/10.1177/10464964231200191
  • Siegler, E. L., Hyer, K., Fulmer, T., & Mezey, M. (1998). Core measures for the GITT program. In E. L. Siegler, K. Hyer, T. Fulmer, & M. Mezey (Eds.), Geriatric interdisciplinary team training (pp. 259–277). Springer Publishing Company.
  • Soulé, H., & Warrick, T. (2015). Defining 21st century readiness for all students: What we know and how to get there. Psychology of Aesthetics, Creativity, and the Arts, 9(2), 178–186. https://doi.org/10.1037/aca0000017
  • Succi, C., & Canovi, M. (2020). Soft skills to enhance graduate employability: Comparing students and employers’ perceptions. Studies in Higher Education, 45(9), 1834–1847. https://doi.org/10.1080/03075079.2019.1585420
  • Sun, C., Shute, V. J., Stewart, A. E. B., Beck-White, Q., Reinhardt, C. R., Zhou, G., Duran, N., & D'Mello, S. K. (2022). The relationship between collaborative problem solving behaviors and solution outcomes in a game-based learning environment. Computers in Human Behavior, 128, 107120. https://doi.org/10.1016/j.chb.2021.107120
  • Sun, C., Shute, V. J., Stewart, A., Yonehiro, J., Duran, N., & D'Mello, S. (2020). Towards a generalized competency model of collaborative problem solving. Computers & Education, 143, 103672. https://doi.org/10.1016/j.compedu.2019.103672
  • Swaab, R. I., Galinsky, A. D., Medvec, V., & Diermeier, D. A. (2012). The communication orientation model: Explaining the diverse effects of sight, sound, and synchronicity on negotiation and group decision-making outcomes. Personality and Social Psychology Review, 16(1), 25–53. https://doi.org/10.1177/1088868311417186
  • Tiwari, S. R., Nafees, L., & Krishnan, O. (2014). Simulation as a pedagogical tool: Measurement of impact on perceived effective learning. The International Journal of Management Education, 12(3), 260–270. https://doi.org/10.1016/j.ijme.2014.06.006
  • Toh, L. P. E., Causo, A., Tzuo, P. W., Chen, I. M., & Yeo, S. H. (2016). A review on the use of robots in education and young children. Journal of Educational Technology & Society, 19(2), 148–163.
  • Touloumakos, A. K. (2020). Expanded yet restricted: A mini review of the soft skills literature. Frontiers in Psychology, 11, 2207. https://doi.org/10.3389/fpsyg.2020.02207
  • van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M., & de Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72, 577–588. https://doi.org/10.1016/j.chb.2017.03.010
  • van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M., & de Haan, J. (2020). Determinants of 21st-century skills and 21st-century digital skills for workers: A systematic literature review. SAGE Open, 10(1), 215824401990017. https://doi.org/10.1177/2158244019900176
  • Vayre, E. (2021). Challenges in deploying telework: Benefits and risks for employees. In M. B. Chaumon (Ed.), Digital transformations in the challenge of activity and work (1st ed., pp. 87–100). Wiley. https://doi.org/10.1002/9781119808343.ch7
  • Voogt, J., & Roblin, N. P. (2012). A comparative analysis of international frameworks for 21st century competences: Implications for national curriculum policies. Journal of Curriculum Studies, 44(3), 299–321. https://doi.org/10.1080/00220272.2012.668938
  • *Wang, C. L., Chinnugounder, S., Hippe, D. S., Zaidi, S., O'Malley, R. B., Bhargava, P., & Bush, W. H. (2017). Comparative effectiveness of hands-on versus computer simulation–based training for contrast media reactions and teamwork skills. Journal of the American College of Radiology, 14(1), 103–110.e3. https://doi.org/10.1016/j.jacr.2016.07.013
  • Widad, A., & Abdellah, G. (2022). Strategies used to teach soft skills in undergraduate nursing education: A scoping review. Journal of Professional Nursing, 42, 209–218. https://doi.org/10.1016/j.profnurs.2022.07.010
  • World Health Organization. (2010). Framework for action on interprofessional education & collaborative practice.
  • Yang, Y.-T. C., & Chang, C.-H. (2013). Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement. Computers & Education, 68, 334–344. https://doi.org/10.1016/j.compedu.2013.05.023
  • Zainuddin, Z., Chu, S. K. W., Shujahat, M., & Perera, C. J. (2020). The impact of gamification on learning and instruction: A systematic review of empirical evidence. Educational Research Review, 30, 100326. https://doi.org/10.1016/j.edurev.2020.100326

Appendix A.