
Health Promotion MOOCs (hpMOOCs): A Dual Lens for Assessing Quality

ABSTRACT

Introduction

Health promotion MOOCs (hpMOOCs) can be designed to educate general audiences on global health issues and provide accessible evidence-based content. Thus, it is essential to assess the instructional design quality of hpMOOCs through more than one perspective to ensure course participants have an effective learning experience and their learning needs are met.

Methods

Our study involved a comprehensive exploration of an hpMOOC about dementia risk reduction with sustained massive reach and established impact. To fully understand the quality of this course, instructional design elements were assessed from an instructional designer lens and a course participant lens. The designer lens involved a detailed review of the quality of the course against a recognized quality rubric. The participant lens involved topic modeling and thematic analysis of 1,531 responses to a feedback survey item in order to identify design elements preferred by participants.

Results

While the course performs well and meets most quality standards, it falls short of fully meeting the criteria set by the rubric. However, the rubric itself could be further developed to better facilitate the quality assurance of hpMOOCs. The participant lens reveals that the course meets participants’ expectations and identifies two new avenues for enhancing the current presentation of course content.

Conclusion

Assessing the quality of hpMOOCs through a dual lens provides triangulation that helps to gain a more comprehensive understanding of how to improve their instructional design quality.

Introduction

MOOCs in health and medicine

Massive Open Online Courses (MOOCs), which first entered the online learning space in 2008 (Liyanagunawardena et al., 2013), have become widespread open learning educational tools that can deliver useful, accessible, and effective learning experiences to general and professional audiences (Ossiannilsson, 2021). MOOCs have added new learning and teaching dimensions to conventional education (Salas-Rueda et al., 2022; Walls et al., 2015). They are seen as an evolution in online education as they potentially allow education to be easily accessed by an unlimited number of people around the world (Ossiannilsson, 2021; Rodrigo et al., 2020).

One burgeoning group of MOOCs that has emerged since 2012 focuses on topics related to health and medicine. Within this category, courses are designed to achieve distinct goals and attract a wide range of individuals, including health practitioners, medical students, patients, and the general public (Liyanagunawardena & Williams, 2014). Some health and medicine MOOCs focus on training healthcare workers (Magaña-Valladares et al., 2018). Other courses aim to be resources in medical school curricula (Hamdan et al., 2022). Recently, a large focus of health and medicine MOOCs has been on training the health workforce to respond to national and global emergencies such as the COVID-19 pandemic (Gómez Gómez & Munuera Gómez, 2021; Utunen et al., 2022). Another focus is to promote health, educate the general public on global health issues, and provide accessible evidence-based content that does not require prior subject knowledge. This subgroup of MOOCs will be referred to as health promotion MOOCs (hpMOOCs) in this study.

Online health educational tools such as hpMOOCs can help raise awareness of conditions such as dementia, which has been identified as a major chronic and progressive disease of the 21st century (Van Asbroeck et al., 2021; World Health Organization, 2017). The Preventing Dementia Massive Open Online Course (PDMOOC), developed by the Wicking Dementia Research and Education Centre, is one example of an hpMOOC. The PDMOOC serves as a globally available tool to increase participants’ knowledge of dementia risk reduction and develop risk literacy through evidence-based content (Farrow et al., 2017, 2020). It has had sustained global uptake over successive years, reaching over 100,000 participants, and has an atypically high completion rate (Farrow et al., 2022). Given the demonstrated wide reach of this free public health intervention, it is imperative to ensure its instructional design is of high quality. This paper aims to explore the design quality of this hpMOOC about dementia risk reduction through a dual lens that draws insights from multiple data sources. Doing so may maximize its potential to democratize public health messaging and empower users to translate knowledge into practice.

One important perspective on course design is that of the course designer, which helps to ensure that accepted instructional design best practices are applied to develop effective learning environments that meet evolving technology demands and align with institutional traditions and accreditation goals (Brown & Green, 2019). However, a designer lens might be constrained by institutional contexts and conservative teaching strategies, overlooking elements that learners consider important for a successful learning experience (Brown & Green, 2019; Teixeira et al., 2018). Further, the absence of culturally, pedagogically, and epistemically diverse MOOC designers involved in the course development process poses the risk of creating non-inclusive and inaccessible courses (Adam, 2020; Ruipérez-Valiente et al., 2022). Therefore, to bridge design gaps, it is essential to also investigate a course’s design through a participant lens to understand the design features that learners expect to see in a course (Oh et al., 2023), evaluate their perceptions and experiences (Hood & Littlejohn, 2016), and create engaging and accessible courses (Lan & Hew, 2020). The substantial number of enrollments in MOOCs presents a range of participants with diverse cultural, linguistic, academic, and professional backgrounds that can inform course design (Rizvi et al., 2022; Ruipérez-Valiente et al., 2022).

MOOC quality through the designer lens

Quality assurance rubrics are often valuable to course designers because they provide a set of standards to define, evaluate, and achieve quality in courses (Caskurlu et al., 2021; Downes, 2015; Lenert & Janes, 2017). Given the increasing role of MOOCs in reshaping online education and their integration into the educational and professional sectors across various disciplines, researchers and organizations have developed quality assurance instruments for MOOCs founded on known and accepted online learning principles (Margaryan et al., 2015; Yousef et al., 2014). Other scholars have modified these instruments to study the instructional design quality of MOOCs in various fields of study (Egloffstein et al., 2019; Oh et al., 2020).

Of the quality instruments previously employed to assess the quality of MOOCs, the Quality Matters (QM) rubrics are particularly notable as they provide a set of implementable quality standards related to online education, are grounded in research, and are informed by instructional design best practices (Shattuck et al., 2014). QM rubrics provide the opportunity to generate a holistic course review composed of the strengths of the course, specific areas of improvement, and detailed examples of how to improve course design elements. In contrast, other quality assurance instruments take the form of questionnaires or checklists (Margaryan et al., 2015; Yousef et al., 2014). Among the QM rubrics currently available, the Continuing and Professional Education (CPE) Rubric is the most applicable to MOOCs, because it is presented as being relevant to self-paced, non-credit, and open enrollment courses (Adair et al., 2014; Ekren, 2017). This rubric assesses various elements of distance education courses including content presentation, learning goals, evaluation of progress, use of materials, learner engagement, educational technology, learner support, and accessibility. The rubric expands each element to cover a wider range of quality standards which are important in providing a quality educational experience to MOOC participants.

MOOC quality through the participant lens

MOOCs allow participants to set and achieve their own goals in ways different to the expected path set by MOOC designers (Downes, 2015; Shrader et al., 2016). MOOC participants may be interested in a specific section within the course but not concerned with completing all the assessments and activities needed to obtain a certificate of completion (El Said, 2017; Goopio & Cheung, 2021). While students’ perspectives on QM standards and their impact on learning have been studied, this has been within the context of for-credit online courses and from the point of view of tertiary students (Sadaf et al., 2019). Since the learning behavior of participants in MOOCs differs greatly from the expected behavior of traditional students (Poellhuber et al., 2019), it is essential to examine which design elements MOOC participants value most in hpMOOCs.

Course feedback is an invaluable component for continued course development (Knox et al., 2014) and MOOCs can provide a vast quantity of rich data to investigate a variety of concepts such as participants’ preferred design elements in MOOCs (Diver & Martinez, 2015; King et al., 2013). Feedback from open-ended survey responses can generate a large amount of textual data which can be analyzed with statistical approaches such as Structural Topic Modeling (STM) (Roberts et al., 2014). STM provides a reliable and replicable method to generate a mixture of topics from the data, but interpretation of the topics’ meanings is guided by researchers’ understandings of the data (Roberts et al., 2014). Previous research has employed this method to analyze discussion board posts and open-ended survey responses from hpMOOCs (Borchard et al., 2022; Farrow et al., 2022; McInerney et al., 2018). These studies have produced useful insights into MOOC participants’ understanding of course content, their level of engagement with learning activities, and the application of the knowledge.

The sustained global reach of the PDMOOC means that a range of participant perspectives are relevant for consideration. People who undertake the PDMOOC tend to be mostly female, middle-aged, highly educated, working in aged care, and residing in English-speaking countries (Farrow et al., 2022), but this user profile is representative of populations seeking online health information (Jia et al., 2021).

The review of this established, domain-specific hpMOOC, the PDMOOC, allowed the research team to gain a detailed understanding of its quality through a dual lens: that of course designers and that of course participants. The current study has two aims:

  • Aim one: assess the quality of the PDMOOC design elements against the CPE QM Rubric.

  • Aim two: explore participants’ preferences for specific design elements in the PDMOOC.

Materials and methods

The Preventing Dementia MOOC

The PDMOOC is organized in four modules (Module 1: Can dementia be prevented?, Module 2: Dementia risk – it’s not all in your head, Module 3: A healthy and active mind, and Module 4: Interventions for prevention) released weekly (Wicking Dementia Research and Education Centre, 2022). It delivers evidence-based information addressing the non-modifiable and the key modifiable risk factors associated with dementia, exposes some of the myths regarding dementia risk and causes, and presents how the evidence on prevention interventions can be translated into practice (Farrow et al., 2017, 2022). All course modules follow a similar structure: pages are arranged by topic and include embedded video lectures given by field experts, readings, links to external additional readings, and discussion board forums. An end-of-module quiz tests participants’ knowledge and measures course progression and completion (Wicking Dementia Research and Education Centre, 2022). The PDMOOC was developed independently by an academic team from the same center as the authors; however, the authors of this study had no involvement in the development of the course, and the academic team had no involvement in this evaluation. The course was designed for a wide audience, from health professionals to the general public, with or without an interest in dementia risk reduction.

To review the MOOC, a study with an explanatory sequential design was undertaken (Creswell & Creswell, 2023). This two-phase design allowed the research team to integrate two different types of data to triangulate and offer a more comprehensive understanding of course design elements.

The CPE rubric

The fully annotated version of the CPE Rubric and permission to use it for this purpose were made available to the course reviewer by QM’s research team (QM CPE Continuing Education and Professional Development Rubric, 2015). The first author (IME) completed the self-review of the PDMOOC, as he is an experienced instructional designer and course reviewer and is familiar with QM’s standards and review processes. He was not involved in the design or development of the MOOC. A self-review was chosen to examine the course quality of the PDMOOC as it allows reviewers to conduct an internal review of the course against the CPE Rubric and does not involve any cost. In total, the QM CPE Rubric has eight General Standards (GS): GS 1 - Course Overview and Introduction, GS 2 - Learning Objectives (Competencies), GS 3 - Assessment and Measurement, GS 4 - Instructional Materials, GS 5 - Learning Activities and Learner Interaction, GS 6 - Course Technology, GS 7 - Learner Support, and GS 8 - Accessibility and Usability.

These standards are further divided into 43 Specific Review Standards (SRS). There are 22 essential three-point SRSs, 13 very important two-point SRSs, and eight important one-point SRSs. The CPE Rubric standards add up to a total of 100 points. For a course to successfully meet QM standards, 85% of the points (i.e., 85 points) in the rubric must be obtained. Additionally, all 22 standards marked as essential must also be met for the course to meet QM standards overall. For each SRS not met, QM requires that course reviewers provide a recommendation that guides and enables course developers to make the necessary course adjustments to meet QM standards in a subsequent course review.
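As a concrete illustration, this two-part decision rule can be expressed in a few lines of R (a minimal sketch with hypothetical review outcomes; the point weights and thresholds are those of the CPE Rubric described above):

```r
# 43 SRSs worth 100 points in total: 22 essential (3 pts),
# 13 very important (2 pts), and 8 important (1 pt)
srs_points    <- c(rep(3, 22), rep(2, 13), rep(1, 8))
srs_essential <- c(rep(TRUE, 22), rep(FALSE, 21))

# A course meets QM standards only if it scores >= 85 points
# AND every essential (three-point) standard is met
meets_qm <- function(met) {
  sum(srs_points[met]) >= 85 && all(met[srs_essential])
}

met <- rep(TRUE, 43); met[25] <- FALSE  # one two-point SRS unmet
meets_qm(met)                           # TRUE: 98 points, all essentials met

met[1] <- FALSE                         # now an essential SRS is also unmet
meets_qm(met)                           # FALSE despite scoring 95 points
```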

Qualitative feedback data

PDMOOC participants are invited to complete a Course Feedback Survey after course completion. This survey consists of Likert scale items (strongly agree, agree, neutral, disagree, and strongly disagree) and free-text items about the use of and satisfaction with specific existing content elements (discussion boards, videos, and readings). Four further free-text items explored general satisfaction with the course and content improvement. Preliminary analysis of the data from these four items indicated that one item offered the most relevant information regarding course takers’ learning preferences and needs for additional or supplementary content: “Please suggest any additional course content, discussion questions, activities or other approaches for delivering the content that you would have liked” (henceforth labeled “extra content wanted”). The responses to this item provided specific, practical, and implementable recommendations focused on new educational materials that would enhance the learning experience of future course participants. Text responses from this single item for the October 2020 and May 2021 iterations of the PDMOOC were collated for analysis.

Participants

The participants for this study had enrolled in and completed either the October 2020 or the May 2021 iteration of the PDMOOC. Participation in research is completely voluntary and is open to all course takers over 18 years old. Enrollees are provided with information about all elements of the research, and those who decide to contribute consent digitally. Ethical approval for this study was granted by the University of Tasmania Social Sciences Human Research Ethics Committee (H0024547).

Topic modeling

Structural topic modeling (STM) is a way to analyze large quantities of textual data (Borchard et al., 2022; Roberts et al., 2019). STM was used to analyze participants’ text responses to the item “extra content wanted” in RStudio (R version 4.1.2; see Supplementary File 1). The textProcessor function applied stemming and removed stopwords and punctuation in the corpus. Next, the prepDocuments function structured and indexed the data for STM. The searchK function was employed to select the most appropriate number of topics (K), with candidate values of 10, 20, 30, 40, and 50. After evaluating the quality of each of these models using the diagnostic tools provided by the stm package, the candidate range was narrowed to between 20 and 30 topics. Thematic analysis was undertaken on the topics that exceeded a 5% expected proportion across the entire data set.
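A minimal sketch of this preprocessing and model-selection pipeline is shown below (the data frame feedback and its column extra_content are hypothetical names; the published analysis script is provided in Supplementary File 1):

```r
library(stm)

# Stem, lowercase, and strip stopwords and punctuation (textProcessor defaults)
processed <- textProcessor(documents = feedback$extra_content,
                           metadata  = feedback)

# Structure and index the documents and vocabulary for STM
out <- prepDocuments(processed$documents, processed$vocab, processed$meta)

# Estimate candidate models and compare held-out likelihood, residuals,
# semantic coherence, and exclusivity across values of K
k_search <- searchK(out$documents, out$vocab,
                    K = c(10, 20, 30, 40, 50), data = out$meta)
plot(k_search)  # diagnostics used to narrow the candidate range of K
```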

Consistent with Farrow et al. (2022) and Borchard et al. (2022), the STM results were further explored through thematic analysis to interpret the meanings underpinning the data. Using the findThoughts function, one author (IME) reviewed 30 exemplar responses that were highly associated with each estimated topic to gain an initial understanding of each topic and then delve deeper into the meanings and patterns in the participants’ responses. Themes and their associated exemplars were discussed with all authors and consensus was reached as to accuracy and meaning.
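Continuing the sketch, fitting the final model and retrieving exemplar responses for thematic analysis might look as follows (the 25-topic solution and the 5% threshold are reported in the Results; variable names carry over from the sketch above):

```r
# Fit the selected 25-topic model
fit <- stm(out$documents, out$vocab, K = 25,
           data = out$meta, init.type = "Spectral")

# Expected topic proportions across the corpus (fit$theta is documents x topics)
props <- colMeans(fit$theta)
key_topics <- which(props >= 0.05)  # topics at or above the 5% threshold

# Retrieve the 30 exemplar responses most associated with each key topic
exemplars <- findThoughts(fit, texts = out$meta$extra_content,
                          topics = key_topics, n = 30)
```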

Results

CPE rubric – course review

The QM self-review of the PDMOOC returned a score of 84 out of a possible 100 points. The PDMOOC met 36 SRSs, meeting standards regarding overall course design, course instructional materials, course assessments, course learning activities, course technologies, and course usability (Table 1).

Table 1. QM CPE rubric specific review standards (SRS) met.

However, the PDMOOC scored below the 85-point threshold and did not meet seven standards in the rubric (Table 2). Specifically, the seven SRSs were not met for the following reasons:

Table 2. QM CPE rubric specific review standards (SRS) not met.

Standard 1.4 – course policies

The PDMOOC did not include course policies such as academic integrity and late submission policies.

Standard 1.7 – statement of technical skills

The PDMOOC’s main website includes a Frequently Asked Questions (FAQs) list stating the requirement of basic computer skills for course completion. However, this information is external to the course and may not reach all enrollees.

Standard 2.3 – stated learning objectives

Though learning objectives are mentioned in the introduction video at the beginning of the course, they are not “clearly and prominently” stated within the course.

Standards 7.2, 7.3, and 7.4 – institutional support services

The PDMOOC does not link to accessibility policies, learner services, resources, or course support resources from within the Learning Management System (LMS) as required by these standards.

Standard 8.2 – accessibility statement

The PDMOOC does not require the use of technologies outside the course, but it does not include an accessibility statement for the LMS where the course is hosted.

PDMOOC participants

The October 2020 and May 2021 iterations of the PDMOOC registered a total of 19,823 enrollees. Of these, 12,549 people completed the course (a 63% completion rate). Of these completers, 6,351 (51%) consented to the research and completed the Course Feedback Survey. The analysis included only course completers who consented to be part of the study, provided their age, and were aged 18 to 100 years.

Qualitative feedback data

The Course Feedback Survey elicited a total of 6,351 responses, with 3,439 responses from the October 2020 and 2,912 responses from the May 2021 iterations of the PDMOOC. Of these, 1,597 provided a text response to the “extra content wanted” survey item. Only the first survey attempt was analyzed for those who completed the survey multiple times. This left a total of 1,531 responses in the corpus. Participants had a median age of 59 and most identified as female (83%).

Topic modeling

A 25-topic model (Figure 1) was selected as providing the best balance of held-out likelihood, residuals, semantic coherence, and exclusivity (Roberts et al., 2019).

Figure 1. The expected topic proportions and the top five words associated with each topic.

Note. The topics in bold text are the topics with proportions of 5% and above within the whole data set.

Topics with an expected proportion of 5% and above across the entire corpus were used to determine three key themes (Figure 2): “leave it,” “visualize it,” and “personalize it.”

Figure 2. STM process for generating topics and themes.

Leave it

The current design of the PDMOOC was perceived by some participants as organized and well presented: “I was satisfied with exactly the way the course was presented. 10 out of 10 for presentation”. The available course content also seemed to satisfy some course completers, who left no suggestions for additional content: “I am satisfied with all that was offered”. Some participants, rather than providing suggestions, used this as an occasion to thank course developers: “ … I am extremely thankful to MOOC professionals for developing such a great resource.”

Visualize it

Some PDMOOC participants expressed a preference for a more visually oriented presentation of the course content. Participants suggested that course messages could be made more engaging through the incorporation of visual elements: “More visual aids can be used to make the subject lively” and that visuals would help increase knowledge: “Animations/graphics or other visual representations … can help processing and consolidation of knowledge.” Specifically, they highlighted infographics as a valuable tool that aided their comprehension of the course information: “Infographics and supporting texts were key to assisting me with understanding the content.” Further, infographics could also help learners with knowledge retention and recall after course completion: “Maybe more infographic images and they are probably the easiest to recall or otherwise to save to view at a later time to jog memory”. The existing video scripts were useful to learners, and participants suggested enriching these text versions with visual elements: “More infographics/animations can be added into the text version of the videos to facilitate learners who are better with reading and absorbing information at their own pace.” It is worth noting that the participants’ requests for infographics were intended to supplement the existing content rather than replace it: “More infographics, please, but only in addition to the present content.”

Personalize it

Within this theme, participants suggested adding content with which they could have a connection. There was an interest from some participants for content to include real-world situations: “Perhaps more real life people in real life scenarios who are experiencing dementia.” Participants felt that they could connect with the course a bit more if it included interviews with people living with dementia and people supporting a person who has dementia: “ … getting insight from people with dementia, hearing their stories, and having a glimpse of their life.”

Discussion

From a designer lens, the PDMOOC design achieved many quality standards and was successful in meeting most of the essential design elements outlined in the CPE Rubric. According to literature on MOOC design, participants value clear structure and organization (Hew, 2018), prefer concise and easy-to-understand course content (Nanda et al., 2021), and expect easy-to-navigate courses (Oh et al., 2023). In line with this, the PDMOOC includes clear “how to get started” instructions, a detailed overview of the course structure, intuitive course navigation, and content that meets the learning needs of course completers. The presence of these elements may have contributed to participants’ preferences for leaving the course largely unchanged. Specifically, the theme “Leave it” confirms that much of the current design and approach to delivering the course met the expectations of participants and was suitable for hpMOOCs.

Conversely, within the theme “Visualize it,” the participant lens highlighted the integration of visual materials as a potential avenue for enhancing content presentation. In particular, infographics emerge as a promising tool that could meet the learning needs of participants in hpMOOCs. Compared to traditional textual formats, informational graphics (infographics) can communicate complex health information to users in innovative, accessible, and engaging visual formats (McCrorie et al., 2016; McSween-Cadieux et al., 2021). In support of this theme, an assessment via the designer lens found that while existing course materials meet QM standards for accessibility, there remains an opportunity to enhance them. Developers of hpMOOCs could design materials that combine pictures and text to increase attention, comprehension, and recall of health information, particularly among individuals with limited literacy skills and lower levels of education (Houts et al., 2006). By doing so, hpMOOCs may reach a broader range of learners and potentially motivate non-completers to persist and successfully finish the course.

Under the theme “Personalize it,” participants suggested the addition of materials that showcase personal and first-hand narratives as a new approach to contextualizing the health information presented. This aligns with the idea that MOOC experiences are influenced by content that promotes authentic learning and emotional engagement (Deng & Benckendorff, 2021). Thus, the PDMOOC could incorporate real-life stories of people making lifestyle changes to reduce their risk of dementia. Through a personalization approach, participants in hpMOOCs may establish a meaningful connection with the content and apply the acquired knowledge to their everyday lives. Interestingly, the designer lens overlooked this design aspect, despite its importance to participants. While designers adhere to rubric standards to assess the quality of a course, design elements valued by participants may inadvertently be overlooked if the rubric is not fit for purpose.

The unmet rubric standards are largely associated with providing better clarification of institutional requirements and policies (see Table 2). To rectify these, it is recommended that the course include a statement of requisite minimum technical skills (SRS 1.7), explicitly state its learning objectives (SRS 2.3), and include the accessibility statement for the LMS hosting the PDMOOC (SRS 8.2). These particular elements have not emerged as key aspects that contribute to a positive learning experience for participants in MOOCs (Deng & Benckendorff, 2021). The relevance of the remaining unmet standards can be debated in the context of the intent of hpMOOCs.

The PDMOOC did not describe any related course policies to course participants (SRS 1.4) because the course has no compulsory assignments, learning activities, or penalties for late submissions; rather, the course has intended action outcomes. Contrary to traditional for-credit online courses in higher education, MOOC participants can freely choose to engage with course content or ignore the parts of a course that are not suited to their learning needs (Alamri, 2022; Askeroth & Richardson, 2019). Learner support standards (SRS 7.2, 7.3, and 7.4) guide course reviewers to verify that typical support services in for-credit courses are provided in MOOCs. It is questionable whether support services such as financial aid or access to library resources are relevant to large-scale, non-credit courses, particularly considering these courses are mostly offered for free or at low cost.

Notably, there is an opportunity to further improve the alignment of the CPE Rubric with hpMOOC content. The CPE Rubric originated as an adaptation of QM’s flagship rubric, the Higher Education (HE) Rubric, Fifth Edition, which is based on best practices and accreditation standards in traditional online education (Legon, 2006; Shattuck, 2017). It has been presented as effectively modified for most MOOCs due to changes to standards concerning active involvement from the instructor and direct student-to-student interactions (Adair, 2013). However, as noted above, learner support standards still guide reviewers to verify that MOOCs provide support services to participants. In addition, the CPE Rubric allocates the same points to each of its standards as the HE Rubric from which it was adapted. The exception is SRS 4.4 (“the instructional materials are current”), which appropriately increases from a “very important” two-point standard to an “essential” three-point standard. Since English remains the prevalent language of instruction in MOOCs (Türkay et al., 2017), SRS 4.5 (“a variety of instructional materials is used”), a two-point “very important” standard, could be upgraded to an “essential” standard to highlight the importance of providing course content that is accessible and understandable to participants for whom English might not be their first language or who prefer different learning modes. Similarly, SRS 8.5 (“course multimedia facilitates ease of use”) should be upgraded to an “essential” standard, as videos have emerged as the widely adopted and nearly ubiquitous medium for disseminating information in MOOCs (Deng & Benckendorff, 2021).

The CPE Rubric attempts to assure the quality of many different types of courses: online and blended non-credit courses, continuing education courses, professional training courses, personal development courses, non-credit competency-based courses, and MOOCs (Quality Matters, n.d.). Rather than post hoc removal of standards that are not relevant to MOOCs (Lowenthal & Hodges, 2015), QM might consider developing a fit-for-purpose rubric specific to MOOCs that incorporates contemporary research regarding the hallmark features of MOOCs. By doing so, the application of the CPE Rubric could be expanded to facilitate the quality assurance of hpMOOCs and, more broadly, guide developers of specialized MOOCs in the health and medicine domain to design courses that are more accessible to lay audiences.

A holistic understanding of course quality can be gained from employing a dual-lens approach. In this study, the exploration of course design elements from different perspectives provided insights into the strengths and areas for improvement of an hpMOOC. From the course designer perspective, despite the existing limitations of the CPE Rubric, opportunities for small administrative adjustments are evident. Conversely, the participant lens echoes the success of the MOOC but also indicates two new avenues for targeted content development to deliver a more appealing and personalized course. A promising direction for future research is to investigate content designed to deliver both visually appealing and personally relevant messages to participants.

Limitations

A limitation of this study is the use of the CPE Rubric to evaluate a single hpMOOC. However, this provides a useful test case for an in-depth exploration of the rubric’s utility for MOOCs intended to disseminate public health messages to the general public. A tailored rubric could be tested on multiple hpMOOCs to fine-tune the breadth of its applicability. Another limitation is the absence of feedback data from participants who did not complete the course. Future investigations should explore the factors contributing to participant attrition or partial involvement within the context of hpMOOCs.

Conclusion

Assessing the quality of hpMOOCs through a dual lens provides a powerful triangulation that brings a more comprehensive understanding of how to improve the design elements participants deem most valuable. Course feedback provides a venue for participants to voice their preferences and expectations. To further participants’ involvement in the quality assurance of hpMOOCs, course developers could adopt a co-design approach to collaborate with course takers and create content that is closely aligned to their needs and preferences. In this way, participants could become not just course takers but content co-designers.


Acknowledgments

We wish to thank all participants for their contributions to this study, and the team behind the development and delivery of the PDMOOC.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The free-text response data are not publicly available due to restrictions imposed by the University of Tasmania Human Research Ethics Committee. A sample of carefully selected excerpts from free-text responses is included in the manuscript.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/08923647.2024.2325845


References

  • Adair, D. (2013). Quality matters for MOOCs: Results and implications of the first QM MOOC reviews. Educause. https://library.educause.edu/-/media/files/library/2013/12/eliweb1313-pdf.pdf
  • Adair, D., Alman, S. W., Budzick, D., Grisham, L. M., Mancini, M. E., & Thackaberry, A. S. (2014). Many shades of MOOCs. Internet Learning, 3(1), 7. https://doi.org/10.18278/il.3.1.5
  • Adam, T. (2020). Open educational practices of MOOC designers: Embodiment and epistemic location. Distance Education, 41(2), 171–185. https://doi.org/10.1080/01587919.2020.1757405
  • Alamri, M. M. (2022). Investigating students’ adoption of MOOCs during COVID-19 pandemic: Students’ academic self-efficacy, learning engagement, and learning persistence. Sustainability, 14(2), 714. https://doi.org/10.3390/su14020714
  • Askeroth, J. H., & Richardson, J. C. (2019). Instructor perceptions of quality learning in MOOCs they teach. Online Learning, 23(4). https://doi.org/10.24059/olj.v23i4.2043
  • Borchard, J., Bindoff, A., Farrow, M., Kim, S., McInerney, F., & Doherty, K. (2022). Family carers of people living with dementia and discussion board engagement in the understanding dementia massive open online course. Aging & Mental Health, 27(5), 1–9. https://doi.org/10.1080/13607863.2022.2042188
  • Brown, A. H., & Green, T. D. (2019). The essentials of instructional design: Connecting fundamental principles with process and practice. Routledge. https://ikhsanaira.files.wordpress.com/2016/05/the-essential-of-instructional-design.pdf
  • Caskurlu, S., Richardson, J. C., Alamri, H. A., Chartier, K., Farmer, T., Janakiraman, S., Strait, M., & Yang, M. (2021). Cognitive load and online course quality: Insights from instructional designers in a higher education context. British Journal of Educational Technology, 52(2), 584–605. https://doi.org/10.1111/bjet.13043
  • Creswell, J. W., & Creswell, J. D. (2023). Research design: Qualitative, quantitative, and mixed methods approaches (6th ed.). SAGE.
  • Deng, R., & Benckendorff, P. (2021). What are the key themes associated with the positive learning experience in MOOCs? An empirical investigation of learners’ ratings and reviews. International Journal of Educational Technology in Higher Education, 18(1). https://doi.org/10.1186/s41239-021-00244-3
  • Diver, P., & Martinez, I. (2015). MOOCs as a massive research laboratory: Opportunities and challenges. Distance Education, 36(1), 5–25. https://doi.org/10.1080/01587919.2015.1019968
  • Downes, S. (2015). The quality of massive open online courses. In B. H. Khan & M. Ally (Eds.), International handbook of E-Learning volume 1: Theoretical perspectives and research (pp. 65–78). Routledge.
  • Egloffstein, M., Koegler, K., & Ifenthaler, D. (2019). Instructional quality of business MOOCs: Indicators and initial findings. Online Learning, 23(4). https://doi.org/10.24059/olj.v23i4.2091
  • Ekren, G. (2017). Existing criteria determining course quality in distance education. The Online Journal of Quality in Higher Education, 4(4), 17–24.
  • El Said, G. R. (2017). Understanding how learners use massive open online courses and why they drop out. Journal of Educational Computing Research, 55(5), 724–752. https://doi.org/10.1177/0735633116681302
  • Farrow, M., Fair, H., Klekociuk, S. Z., Vickers, J. C., & Lavorgna, L. (2022). Educating the masses to address a global public health priority: The preventing dementia massive open online course (MOOC). Public Library of Science ONE, 17(5), e0267205. https://doi.org/10.1371/journal.pone.0267205
  • Farrow, M., Klekociuk, S. Z., Bindoff, A., & Vickers, J. C. (2020). The preventing dementia massive open online course results in behaviour change associated with reduced dementia risk: The my INdex of DEmentia risk (MINDER) study. Alzheimer’s & Dementia, 16(S10). https://doi.org/10.1002/alz.046565
  • Farrow, M., Ward, D., Klekociuk, S., & Vickers, J. (2017). Building capacity for dementia risk reduction: The preventing dementia MOOC. Alzheimer’s & Dementia, 13, P871–P872. https://doi.org/10.1016/j.jalz.2017.06.1244
  • Gómez Gómez, F., & Munuera Gómez, P. (2021). Use of MOOCs in health care training: A descriptive-exploratory case study in the setting of the COVID-19 pandemic. Sustainability, 13(19), 10657. https://doi.org/10.3390/su131910657
  • Goopio, J., & Cheung, C. (2021). The MOOC dropout phenomenon and retention strategies. Journal of Teaching in Travel & Tourism, 21(2), 177–197. https://doi.org/10.1080/15313220.2020.1809050
  • Hamdan, D., Pamoukdjian, F., Lehmann-Che, J., de Bazelaire, C., Vercellino, L., Calvani, J., Battistella, M., Bertheau, P., Falgarone, G., & Bousquet, G. (2022). A massive open online course to teach undergraduate medical students in oncology: Keys of success. Heliyon, 8(11), e11306. https://doi.org/10.1016/j.heliyon.2022.e11306
  • Hew, K. F. (2018). Unpacking the strategies of Ten highly rated MOOCs: Implications for engaging students in large online courses. Teachers College Record: The Voice of Scholarship in Education, 120(1), 1–40. https://doi.org/10.1177/016146811812000107
  • Hood, N., & Littlejohn, A. (2016). MOOC quality: The need for new measures. Journal of Learning for Development, 3(3), 28–42. https://doi.org/10.56059/jl4d.v3i3.165
  • Houts, P. S., Doak, C. C., Doak, L. G., & Loscalzo, M. J. (2006). The role of pictures in improving health communication: A review of research on attention, comprehension, recall, and adherence. Patient Education and Counseling, 61(2), 173–190. https://doi.org/10.1016/j.pec.2005.05.004
  • Jia, X., Pang, Y., & Liu, L. S. (2021). Online health information seeking behavior: A systematic review. Healthcare (Basel), 9(12), 1740. https://doi.org/10.3390/healthcare9121740
  • King, C., Kelder, J.-A., Phillips, R., McInerney, F., Doherty, K., Walls, J., Robinson, A., & Vickers, J. (2013). Something for everyone: MOOC design for informing dementia education and research. Paper presented at the European Conference in E-Learning (ECEL 2013), France: Sophia Antipolis
  • Knox, J., Ross, J., Sinclair, C., Macleod, H., & Bayne, S. (2014). MOOC feedback: Pleasing all the people? In S. D. Krause & C. Lowe (Eds.), Invasion of the MOOCs: The promise and perils of massive open online courses (pp. 98–104). Parlor Press.
  • Lan, M., & Hew, K. F. (2020). Examining learning engagement in MOOCs: A self-determination theoretical perspective using mixed method. International Journal of Educational Technology in Higher Education, 17(1). https://doi.org/10.1186/s41239-020-0179-5
  • Legon, R. (2006). Comparison of the quality matters rubric to accreditation standards for distance learning. MarylandOnline. https://confluence.delhi.edu/pages/worddav/preview.action?fileName=Comparison+of+the+Quality+Matters+Rubric±+Summary.pdf&pageId=74055682
  • Lenert, K. A., & Janes, D. P. (2017). The incorporation of quality attributes into online course design in higher education. International Journal of E-Learning & Distance Education, 32(1), 1–14. https://files.eric.ed.gov/fulltext/EJ1146391.pdf
  • Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature 2008-2012. International Review of Research in Open & Distributed Learning, 14(3), 202–227. https://doi.org/10.19173/irrodl.v14i3.1455
  • Liyanagunawardena, T. R., & Williams, S. A. (2014). Massive open online courses on health and medicine: Review. Journal of Medical Internet Research, 16(8), e191. https://doi.org/10.2196/jmir.3439
  • Lowenthal, P. R., & Hodges, C. B. (2015). In search of quality: Using quality matters to analyze the quality of massive, open, online courses (MOOCs). International Review of Research in Open & Distributed Learning, 16(5), 83–101. https://doi.org/10.19173/irrodl.v16i5.2348
  • Magaña-Valladares, L., Rosas-Magallanes, C., Montoya-Rodríguez, A., Calvillo-Jacobo, G., Alpuche-Arande, C. M., & García-Saisó, S. (2018). A MOOC as an immediate strategy to train health personnel in the cholera outbreak in Mexico. BMC Medical Education, 18(111), 1–7. https://doi.org/10.1186/s12909-018-1215-1
  • Margaryan, A., Bianco, M., & Littlejohn, A. (2015). Instructional quality of Massive Open Online Courses (MOOCs). Computers & Education, 80, 77–83. https://doi.org/10.1016/j.compedu.2014.08.005
  • McCrorie, A. D., Donnelly, C., & McGlade, K. J. (2016). Infographics: Healthcare communication for the digital age. The Ulster Medical Journal, 85(2), 71–75. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4920488/
  • McInerney, F., Doherty, K., Bindoff, A., Robinson, A., & Vickers, J. (2018). How is palliative care understood in the context of dementia? Results from a massive open online course. Palliative Medicine, 32(3), 594–602. https://doi.org/10.1177/0269216317743433
  • McSween-Cadieux, E., Chabot, C., Fillol, A., Saha, T., & Dagenais, C. (2021). Use of infographics as a health-related knowledge translation tool: Protocol for a scoping review. BMJ Open, 11(6), e046117. https://doi.org/10.1136/bmjopen-2020-046117
  • Nanda, G., Douglas, A., Waller, D. E., Merzdorf, H., & Goldwasser, D. (2021). Analyzing large collections of open-ended feedback from MOOC learners using LDA topic modeling and qualitative analysis. IEEE Transactions on Learning Technologies, 14(2), 146–160. https://doi.org/10.1109/tlt.2021.3064798
  • Oh, E. G., Chang, Y., & Park, S. W. (2020). Design review of MOOCs: Application of e-learning design principles. Journal of Computing in Higher Education, 32(3), 455–475. https://doi.org/10.1007/s12528-019-09243-w
  • Oh, E. G., Cho, M.-H., & Chang, Y. (2023). Learners’ perspectives on MOOC design. Distance Education, 44(3), 1–19. https://doi.org/10.1080/01587919.2022.2150126
  • Ossiannilsson, E. (2021). MOOCS for lifelong learning, equity, and liberation. In D. M. Cvetković (Ed.), MOOC (Massive Open Online Courses). Intechopen. https://doi.org/10.5772/intechopen.99659
  • Poellhuber, B., Roy, N., & Bouchoucha, I. (2019). Understanding participant’s behaviour in massively open online courses. International Review of Research in Open & Distributed Learning, 20(1), 221–242. https://doi.org/10.19173/irrodl.v20i1.3709
  • QM CPE (Continuing Education and Professional Development Rubric). (2015). Quality matters. Used under license. All rights reserved. https://www.qmprogram.org/myqm/
  • Quality Matters. (n.d.). CPE rubric - course design rubric standards. https://www.qualitymatters.org/qa-resources/rubric-standards/cpe-rubric
  • Rizvi, S., Rienties, B., Rogaten, J., & Kizilcec, R. F. (2022). Beyond one-size-fits-all in MOOCs: Variation in learning design and persistence of learners in different cultural and socioeconomic contexts. Computers in Human Behavior, 126, 106973. https://doi.org/10.1016/j.chb.2021.106973
  • Roberts, M. E., Stewart, B. M., & Tingley, D. (2019). Stm: An R package for structural topic models. Journal of Statistical Software, 91(1), 1–40. https://doi.org/10.18637/jss.v091.i02
  • Roberts, M. E., Stewart, B. M., Tingley, D., Lucas, C., Leder-Luis, J., Gadarian, S. K., Albertson, B., & Rand, D. G. (2014). Structural topic models for open-ended survey responses. American Journal of Political Science, 58(4), 1064–1082. https://doi.org/10.1111/ajps.12103
  • Rodrigo, C., Iniesto, F., & García-Serrano, A. (2020). Reflections on instructional design guidelines from the MOOCification of distance education. In UXD and UCD approaches for accessible education (pp. 21–37). IGI Global. https://doi.org/10.4018/978-1-7998-2325-4.ch002
  • Ruipérez-Valiente, J. A., Staubitz, T., Jenner, M., Halawa, S., Zhang, J., Despujol, I., Maldonado-Mahauad, J., Montoro, G., Peffer, M., Rohloff, T., Lane, J., Turro, C., Li, X., Pérez-Sanagustín, M., & Reich, J. (2022). Large scale analytics of global and regional MOOC providers: Differences in learners’ demographics, preferences, and perceptions. Computers & Education, 180, 104426. https://doi.org/10.1016/j.compedu.2021.104426
  • Sadaf, A., Martin, F., & Ahlgrim-Delzell, L. (2019). Student perceptions of the impact of “quality matters” certified online courses on their learning and engagement. Online Learning, 23(4). https://doi.org/10.24059/olj.v23i4.2009
  • Salas-Rueda, R.-A., Castañeda-Martínez, R., Eslava-Cervantes, A.-L., & Alvarado-Zamorano, C. (2022). Teachers’ perception about MOOCs and ICT during the COVID-19 pandemic. Contemporary Educational Technology, 14(1), ep343. https://doi.org/10.30935/cedtech/11479
  • Shattuck, K. (2017). Building a scalable bridge while assuring quality. American Journal of Distance Education, 31(3), 151–153. https://doi.org/10.1080/08923647.2017.1337930
  • Shattuck, K., Zimmerman, W. A., & Adair, D. (2014). Continuous improvement of the QM rubric and review processes: Scholarship of integration and application. Internet Learning, 3(1), 25–34. https://doi.org/10.18278/il.3.1.3
  • Shrader, S., Wu, M., Owens-Nicholson, D., & Santa Ana, K. (2016). Massive Open Online Courses (MOOCs): Participant activity, demographics, and satisfaction. Online Learning, 20(2). https://doi.org/10.24059/olj.v20i2.596
  • Teixeira, A. M., Pinto, M. D. C. T., Stracke, C. M., Tan, E., Kameas, A., Vassiliadis, B., & Sgouropoulou, C. (2018). Divergent perceptions from MOOC Designers and learners on interaction and learning experience: Findings from the Global MOOQ Survey. Proceedings of the European Distance and E-Learning Network - EDEN- 2018 Annual Conference. Genova, 17-20 June, 2018 (pp. 215–225). https://proc.eden-online.org/index.php/PROC/article/view/1613/1321
  • Türkay, S., Eidelman, H., Rosen, Y., Seaton, D., Lopez, G., & Whitehill, J. (2017). Getting to Know English Language Learners in MOOCs. Proceedings of the Fourth (2017) ACM Conference on Learning at Scale, 209–212. https://doi.org/10.1145/3051457.3053987
  • Utunen, H., Mattar, L., Piroux, C., Ndiaye, N., Christen, P., & Attias, M. (2022). Superusers of self-paced online learning on OpenWHO. Studies in Health Technology and Informatics, 295, 16–19. https://doi.org/10.3233/shti220648
  • Van Asbroeck, S., Van Boxtel, M. P. J., Steyaert, J., Köhler, S., Heger, I., De Vugt, M., Verhey, F., & Deckers, K. (2021). Increasing knowledge on dementia risk reduction in the general population: Results of a public awareness campaign. Preventive Medicine, 147, 106522. https://doi.org/10.1016/j.ypmed.2021.106522
  • Walls, J., Kelder, J.-A., King, C., Booth, S., & Sadler, D. (2015). Quality assurance for massive open access online courses: Building on the old to create something new. In E. McKay & J. Lenarcic (Eds.), Macro-level learning through Massive Open Online Courses (MOOCs): Strategies and predictions for the future (pp. 25–47). IGI Global. https://doi.org/10.4018/978-1-4666-8324-2.ch002
  • Wicking Dementia Research and Education Centre. (2022). Preventing Dementia. https://mooc.utas.edu.au/course/30
  • World Health Organization. (2017). Global action plan on the public health response to dementia 2017–2025. World Health Organization.
  • Yousef, A. M. F., Chatti, M. A., Schroeder, U., & Wosnitza, M. (2014). What drives a successful MOOC? An empirical examination of criteria to assure design quality of MOOCs. 2014 IEEE 14th International Conference on Advanced Learning Technologies. https://doi.org/10.1109/icalt.2014.23