Our course could be your life(style): Tales of MOOC revisiting learners 2015-2020

Abstract

Many studies have identified crucial factors that impact learner engagement in online courses, particularly free-to-join courses like MOOCs, and have explored aspects of openness, freeness, production values, retention strategies and the impact of MOOCs on digital teaching and learning. Little has been said, however, about learners who continue to revisit the same MOOC, their intention to revisit, and their behaviour when actually revisiting. This study explored the unique learner cohort of an open-ended, skills-based MOOC and, through the analysis of learner comments, examined the behaviour, motivations and factors that contributed to learners revisiting the same MOOC over a five-year period from 2015 to 2020. It found that learners may first join the course for the content, but then choose to revisit for the community, the course educators, content updates and enhancements, and for the impact that revisiting has on their behaviour and emotions. It also found that a significant number of learners revisited, some up to 10 times in consecutive sequence, while others revisited in a syncopated pattern. While the outcomes of this study may not be generalisable to MOOCs for all topic areas, the insights gained may be of interest to designers curious about creating flexible, social, practical and skills-based experiences for lifelong learners.

Introduction

Since their emergence in the mid-2000s, massive open online courses (MOOCs) have been predicated on making learning available to everyone, and at scale. Much effort has been spent analysing data generated by MOOC participants (e.g., Guo, Kim & Rubin 2014; Savage 2009; Wang 2017) across a variety of MOOCs (e.g., Engle, Mankoff & Carbrey 2015; Hew & Cheung 2008; Taib, Chuah & Aziz 2017) to determine whether video production methods (format, style, type and duration) and pedagogical dimensions (cooperative learning, feedback, activities and assessments) are crucial for cultivating an engaging learning experience. However, few studies have examined the specific cohorts of students who revisit, or their behaviour and motivations for revisiting.

The purpose of this research is to identify the factors that contribute to learners revisiting a MOOC. In order to identify those factors we asked:

‘What can we discover about the behaviour and motivations of revisiting learners from comments in a MOOC, and how can these findings inform the design of MOOCs in the future?’

MOOC learners

Many learners take MOOCs to develop or build on existing skills to enhance their future employability, shape a goal for further study, connect with people, understand basic concepts or gain a general understanding, and satisfy their curiosity (Zheng, Rosson, Shih & Carroll 2015). Laurillard (2014) noted that a number of previous studies have shown that most MOOC users are already well educated. In Coursera MOOCs, for example, an average of 85% of participants have one or more degrees; for London- and Edinburgh-based MOOCs offered in 2013, the figure was around 70%. In their examination of learners taking a five-week ‘Introduction to Infographics and Data Visualization’ MOOC, Liu, Kang and McKelroy (2015) found the top two reasons for taking the MOOC were to learn more about the topic for personal reasons (employment prospects/career readiness) and for their current job. Other main reasons included learning about future career possibilities, finding out what MOOCs are like, getting course materials, and learning from specific instructor(s). Achieving a certificate (of completion) and engaging with MOOC takers as a community of learners were also listed as reasons for taking the MOOC, but drew fewer responses. Interviews by Zheng et al. (2015) with learners about their reasons for taking a MOOC support the findings of Liu et al. (2015): learners revealed they took MOOCs to fulfil their current study needs, help their current position (as a student or in the workplace), develop a social connection with others who shared similar interests, and prepare for future job opportunities or gain experience in a field they might study in a more formal manner after taking the MOOC (Zheng et al. 2015).

Reasons for taking a MOOC were further explored by Xiong, Kornhaber, Suen, Pursel & Goins (2015), who defined a general interest in taking a MOOC as intrinsic motivation, taking a MOOC for external rewards (such as earning a certificate) as extrinsic motivation, and taking a MOOC to connect with others as social motivation. As observed by Crues, Bosch, Anderson, Perry, Bhat and Shaik (2018), the subject matter of the course was also indicative of the reason a learner might take a MOOC.

Engle, Mankoff and Carbrey (2015) noted that understanding MOOC students and the characteristics that lead to their success will enable courses to be modified for increased student achievement; Hew and Cheung (2008) explored how to promote student contribution in asynchronous online discussions; and Taib, Chuah and Aziz (2017) investigated four unique MOOCs using Assessing MOOC Pedagogy (AMP) to characterise ten pedagogical dimensions: (1) Epistemology, (2) Role of teacher, (3) Focus of activities, (4) Structure, (5) Approach to content, (6) Feedback, (7) Co-operative learning, (8) Accommodation of individual differences, (9) Activities or assessments, and (10) User role. These factors, noted by research carried out across a variety of MOOCs (e.g., Engle, Mankoff & Carbrey 2015; Hew & Cheung 2008; Taib, Chuah & Aziz 2017), may seem elementary for a single instance or one-off run of a MOOC, but what of revisiting learners – those learners who repeatedly join and then continue to actively participate in the same MOOC over multiple instances? An example of this cohort would be the learners who continue to participate in Monash University’s ‘Mindfulness for Wellbeing and Peak Performance’ (https://www.futurelearn.com/courses/mindfulness-wellbeing-performance) and ‘Maintaining a Mindful Life’ (https://www.futurelearn.com/courses/mindfulness-life) free-to-join courses on FutureLearn.

Mindfulness at Monash

Mindfulness is an everyday experience. It is about being fully present and fully engaged in each moment of our lives (Chambers 2020). Mindfulness is associated with paying attention, and the evidence suggests that training the attention and learning to pay attention may be the most important skill we ever learn (Hassed n.d.) – it helps us to focus; to stay on task; to communicate more effectively and empathically; to not get caught in cycles of rumination and worry; and to enjoy life more, including life’s simple pleasures (Chambers & Hassed 2015).

The development of ‘Mindfulness for Wellbeing and Peak Performance’ (MIND) and the follow-up complementary ‘Maintaining a Mindful Life’ (MINDLIFE) emerged from Monash’s long term interest in the application of mindfulness in education settings and a partnership with UK-based online learning platform FutureLearn, which culminated in the first run of MIND in 2015 (Chambers & Hassed 2015).
Since then, a total of 328,091 learners have enrolled in MIND across its 14 runs, while a total of 67,690 learners have enrolled in MINDLIFE across its 8 runs since 2017. Class Central, an online course aggregator, reports that FutureLearn listed MIND as its fifth most popular course by enrolment numbers since its first release (Bowden 2019). MIND and MINDLIFE are also listed in Class Central’s Top 100 online courses of all time – a status based on reviews written by Class Central users.

While both courses are predicated on mindfulness, in MIND learners cultivate mindfulness techniques to reduce stress and improve wellbeing and work/study performance (possibly for the first time). In MINDLIFE, learners cultivate mindfulness techniques to improve their communication, relationships and emotional health – it builds upon the introductory MIND course and demonstrates to learners how to embed mindfulness into all aspects of life.

MIND and MINDLIFE are four week courses (although MIND was originally designed as a six week course and remained so from Run 1 to Run 7) with a time commitment of 3 hours per week to work through the course material and sample each mindfulness exercise. Sections of each course make extensive use of video (didactic content and bespoke weekly wrap-ups from lead educators) and audio (mindfulness meditations and exercises) as well as text-based explanations of mindfulness concepts. Quizzes, articles and additional reading are also included to provide learners with opportunities to check their understanding or dedicate more time to explore the science of mindfulness in more depth, and to provide additional information to where claims or references to research are made in the course. Given the experiential nature of the courses, learners are strongly encouraged to watch the videos, practise the mindfulness meditations and exercises, share their ideas, reflections and experiences and join in the discussions – each run of the course features two course mentors to ensure appropriate facilitation of the course.

Our learners

Demographic data identifies that MIND and MINDLIFE learners originate from many countries around the world, with the highest numbers of learners from the United Kingdom, Australia and the United States of America, followed by India, Canada, Ireland, Spain, France and New Zealand. Demographic data is gathered via an optional survey at the enrolment stage of the learner journey. Of those learners who responded to the survey, 76% identified as female while 22% identified as male, as shown in Figure 1.

Figure 1. Gender reported by respondents

For employment status, 32% of respondents revealed they work full time, 23% are retired, 14% work part-time and 10% are self-employed, as shown in Figure 2.

Figure 2. Employment status reported by respondents

For age group, 22% of respondents were aged over 65 years, 22% between 56 and 65, 18% between 46 and 55, and 15% between 36 and 45, as shown in Figure 3.

Figure 3. Age reported by respondents

Our learners are mostly female, over 36 years of age and employed, which is interesting when compared with the enrolment patterns and gender distribution found in MOOCs related to programming, science, technology, engineering and mathematics (STEM courses), which have a greater number of male learners (Glass, Shiokawa‐Baklan & Saltarelli 2016; Williams, Stafford, Corliss & Reilly 2018).

Literature review

Previous studies have identified a range of factors which impact on learner engagement with MOOCs. Given the freeness (no cost to join) of earlier MOOCs, low or no vested interest has been linked to low completion rates (Petronzi & Hadi 2016). Lack of engagement in MOOCs may be due to factors such as connectivity, digital skills, time zones and institutional power dynamics (Walji, Deacon, Small & Czerniewicz 2016), or even disaffection with forum and peer-related issues (Hew 2018). As noted by Walji et al. (2016), simply using a (MOOC) platform that promotes social learning is, of course, not enough for engaged learning to happen. Other studies have identified factors that MOOC learners find engaging in large-scale open online courses: for example, Hew, Qiao & Tang (2016) surfaced the themes of structure and pace, video, instructors, content and resources, interaction and support, and assignments and assessments. These themes were surfaced by applying a machine learning classifier to 24,612 reflective sentences posted by 5,884 students who participated in one or more of 18 highly rated MOOCs. Meanwhile, an analysis of three top-rated MOOCs in the disciplines of programming languages, literature, and arts & design undertaken by Hew (2016) revealed problem-centric learning with clear expositions, instructor accessibility and passion, peer interaction, active learning, and course resources that address participant learning needs to be design factors of a well-received MOOC.

Irrespective of the learner’s inclination for undertaking learning, the opportunity must be flexible and permit the learner to drop in and out at their discretion, often at intervals throughout their life, much like ‘a personal learning continuum, rather than as unrelated, separate gatherings’ (Bozkurt & Keefer 2018). Communities that foster a culture of participatory learning contribute to the premise of lifelong learning, something to be part of, a sense ‘that the teaching and learning experience, the idea of imagining where you stand within that environment is something that’s akin to the DIY punk experience’ (Groom 2016). While it is possible for an online course to cultivate an environment for this type of experience, it is an environment suited to a learner who is self-determined, attentive, can think critically and reason, and is ready to collaborate (Pegrum 2009). The presence of instructors is critical: they support learners and the development of a community of learning, all the while gently guiding learners through the course. These varying reasons elicit the question of why learners continue to participate and persist with a course and continue to engage in the community of learning.

As MOOCs are voluntary, is it possible that learners simply enjoy being part of a community, or that their goals to deepen their knowledge and skills are being met? Or is it the convenience, ease of use and flexibility afforded by the platform, or possibly something else entirely related to their individual needs that is being served?

Revisiter: defining the repeat MOOC learner

For the purposes of this study, a revisiter is a learner who has enrolled in more than one run of either of Monash’s mindfulness courses on FutureLearn between 2016 and 2020, with some learners having revisited since the first run of MIND in 2015. ‘Returning’ and ‘re-take’ are terms used in other studies to describe repeated participation.

As a phenomenon, there is little research available on the motivations of returning MOOC learners. In their investigation of learners returning to a teacher professional development MOOC, Chen, Fan, Zhang & Qiong (2017) found that learners were possibly motivated by the opportunity to improve their grades, refresh their theoretical understanding, and solve practical problems. Although the focus of their study was learner intention to revisit, Huang, Zhang & Liu (2017) found vividness, teacher subject knowledge and interactivity to positively influence revisiting. Viewing returning learners through a non-MOOC lens, Woodgate, Macleod, Scott & Haywood (2015) note how uncommon it is for students to re-take higher education courses, given the cost, time commitment and other regulations that may limit participation. As they also note, the affordances of MOOCs (openness and freeness) create opportunities for students to re-take a course and then engage in mastery learning, a philosophy and set of instructional strategies (Guskey 2012) designed to promote a student’s capacity to practise a skill or knowledge. Because mindfulness is an open-ended skill that requires perseverance and time for learning – two key variables that promote mastery learning (Bloom 1968) – the recurring runs of MIND and MINDLIFE create opportunities for learners to join and revisit the courses, as such skills have no natural limit to continuation or expansion (Degree of Freedom 2013).

What other motivations are there for learners to revisit? On the course description of MIND on FutureLearn, learners noted the flexibility of the course content and design, its capacity for enabling mastery learning and its applicability to changing personal and world events, stating

‘Despite this being a re-run, the content was kept up-to-date with current situations and the application of mindfulness as an aid to dealing with the impact that the pandemic is having felt very relevant’,

‘Excellent course and perfect during COVID-19’,

‘I found this course particularly useful and pertinent in this time of COVID, but I am sure in life generally’,

‘I started this course during our most recent lockdown, and found that the course content and mindfulness practice couldn’t have come at a better time’

and
‘I thoroughly enjoyed this course and the way it was presented. I have learnt a great deal and I have found this very useful especially during COVID-19 and working from home’.

Learners (Class Central 2018) who participated in either or both of the mindfulness-related offerings by Monash reported

‘The two courses that I have been blessed with the opportunity to do on a number of ocassions [sic] have really enhanced the quality of not just my life but those who share their lives with me in any way. I am simply a better person for the learnings that I have had through undertaking the courses’,

‘I have found the leaders, […], really engaging and great role models for their subject, being calm, measured and reassuring, but also inspirational and encouraging. The strength of the course is their rapport in the short videos and audios which acknowledge human frailty and make the continuing practice of mindfulness seem vital and attainable in daily life.’

‘I plan to take the course again when it is offered in November, so I can delve more deeply into this more authentic way of viewing the world and living my life’.

These statements go some way to providing insight into ongoing learner participation. Furthermore, the learner sentiment expressed towards the mindfulness-related offerings is somewhat akin to the model proposed by Peltier, Drago and Schibrowsky (2003), which identified student-to-student interactions, student-instructor interactions and instructor support and mentoring, as well as course content, course structure and information delivery technology, as key components of learners’ perceived quality of the online learning experience.

Vividness is defined as the quality of being very clear, powerful and detailed in your mind (Campbell, Fuller & Hess 2009). When applied to digital experiences through the use of multimedia components such as video, text, voice and animation, vividness is a media capability (Campbell et al. 2009) that should engage and fully immerse the user in a sensorially rich mediated environment appealing to multiple senses (Coyle & Thurston 2001), which is associated with increased feelings of being elsewhere. While multimedia vividness is associated with increased feelings of telepresence, vividness of course content in a MOOC (often a sensorially rich mediated environment) has also been found to be positively associated with a student’s intention to revisit (Huang, Zhang & Liu 2017) – that is, an individual’s readiness or willingness to make a repeat visit to the same destination (Stylos, Vassiliadis, Bellou & Andronikidis 2016).

The role of video

Although the use of video isn’t exclusive to MOOCs, it is their primary means of content delivery. Each MOOC is unique: educators, course content designers and producers work together to create video content that makes use of a diverse combination of video types (text, static image, image and presenter face/talking head, and fully animated) and video design elements (static slides, digital ink/scribbler/analysis, discussion/discursive) that complements the pedagogical approach. The impact of video production on learning is of great interest to educators and MOOC producers. Guo, Kim and Rubin (2014) found that videos of short duration are engaging to learners. An equal level of engagement was identified for video types that featured the ‘talking head’ of a presenter or used digital ink/scribbler/analysis to provide a more detailed explanation. Neilson (2014) and Savage (2009) support the approach of short-duration videos. To maintain learner engagement, Neilson (2014) recommended that videos combine different video types and design elements, such as a presenter face/talking head with digital ink/scribbler/analysis – instructors who sketched Khan-style tutorials could situate themselves “on the same level” as the student rather than talking at the student in “lecturer mode” (Guo et al. 2014).

Multiple studies have investigated the role of cinematography and cinema production design/methodology in learner engagement. In his research, Wang (2017) identified camera shot style and background design, ‘down to the granularity of how the image is captured, recorded and delivered’ (Wang 2017), to be equally as important as video types and video design elements for learner attention and engagement. In addition, Wang, Chen and Wu (2015) observed the impact of different video types and video design elements on learning.

They concluded that the production cost and intensity associated with creating ‘lecture capture and picture-in-picture videos may be worthwhile for online learning from the perspectives of improved learning performance and reduced cognitive load’ (Wang, Chen and Wu 2015). The use of video isn’t limited to pre-prepared and pre-recorded content that contributes to the course material. Video can also be created in response to learner activity, playing a critical role in enabling educators and course instructors to provide feedback to learners throughout the delivery of the course. Henderson and Phillips (2015) reported that the affordances of video enable instructors to convey detailed and elaborate feedback, as well as encouragement and praise, to learners more so than written commentary. Henderson and Phillips (2015) also reported positive learner responses to video-based feedback, where learners felt they had a closer connection with the instructors or educators. Echoing these reports are learner reviews posted on Class Central, a search engine and review website for free online courses,

‘The final feedback video each week is the jewel in their crown. It sums up the week’s thoughts and mentions some of the issues and talking points brought up in the online comments’,

‘Weekly feedback on YouTube responds to learners questions and comments as they have arisen that week’

and
‘The weekly feedback sessions allowed the participants to feel engaged and gave a more personal feel to an online course’ (Class Central 2018).

The use of video-based feedback in this capacity not only addresses any potential knowledge deficit experienced by learners and ‘makes the massive feel intimate’ (Pappano 2018), but also personalises the course in a meaningful and practical way, while actively contributing to changing learner perceptions of the course as the type where an ‘instructor is not as available because there are tens of thousands of others in the class’ (Pappano 2012).

Because the feedback videos by the lead educators are recorded at the end of each week during each run of MIND and MINDLIFE, they help personalise the course experience for learners. Learners from MIND shared their thoughts on the use of feedback videos on the course description page on FutureLearn,

‘The review at the end of each week shows the comments of learners are read and taken note so you feel everyone is really involved’

and
‘I really liked the use of video, particularly the end of week round-ups’.

While video plays a vital role in the delivery of course content and as a mechanism for feedback to learners of Monash’s mindfulness offerings, it also plays an important role as edutainment, that is, content that is primarily educational but has incidental entertainment value (Zheng et al. 2015). These edutainment videos, produced during each run, are made publicly available via the Monash Mindfulness YouTube channel and act as exclusive supplementary content (responsive to global events and/or collective learner discussion – COVID-19, for example) for learners to watch in their spare time (without joining the course), as well as standalone content to increase awareness of Monash mindfulness offerings among non-learners. Responding to learners in this way helps cultivate a unique currency for learners to contextualise their experience to immediately relatable personal and emerging (local and global) situations that may benefit from the application of mindfulness practices.

To this end, the use of video in Monash’s mindfulness offerings makes a crucial contribution to the course design and delivery, and to learners’ intention to revisit.

The educator presence

For learners in an online course like a MOOC, with potentially thousands of learners, a course team presence (educators, instructors, course mentors) is a crucial component that ensures the facilitation and guidance of discussions and interaction between learners so they can have a rewarding experience. According to Hew (2014), instructor accessibility and passion are among the features identified as key for promoting learner engagement, where engagement is defined as an observable action in the course. While engagement is different from completion and retention (which in MOOCs are often misconstrued metrics devised by educators and platform providers for defining their view of learner success, satisfaction, needs or goals), an analysis by Adamopoulos (2013) revealed that the role of the instructor had the largest possible effect on the likelihood of completion.

Learner reviews about Mindfulness for Wellbeing and Peak Performance on Class Central stated
‘In addition to this there are two outstanding mentors who support the learning and help clarify the more scientific aspects of the course and make lots of useful links and resourses [sic] available to enable the independant [sic] exploration of a complex subject’,

‘The moderators are extremely active in their support for learners’

and
‘The mentors had valuable insights and comments and added a lot’ (Class Central 2018).

A learner review on the course description of MIND noted
‘How valuable it is to have meditation and mindfulness companions. Yes, they are online but the videos and discussion areas plus feedback as well as the teachers plus mentors make you feel that you are with real people — people who care about you and your progress.’

These reviews echo the findings of Adamopoulos (2013) and Pilli & Admiraal (2017) in relation to learner interaction with an instructor, where learner retention and satisfaction are higher when instructors are highly responsive. However, some earlier literature on MOOCs (Kop 2011; Mackness 2013; Milligan & Littlejohn 2014) seemingly bemoans the absence of interaction between instructors and learners, asserting that learners may be prevented from having a quality learning experience due to their limited capacity to undertake self-directed learning. In the past, a diminished learning experience may have been inevitable, considering the vintage of the online environment in which their learner cohort was constrained (the anti-social EdX platform and the hardcore Connectivist Edupunk-ness of the open web). Since the launch of FutureLearn, an online learning platform that’s been ‘inspired by Laurillard’s Conversational Framework’ (Sánchez-Vera, León-Urrutia & Davis 2015), a diminished online learning experience could potentially be less likely, as the platform is predicated on a social learning experience: instructor to learner, learner to learner and learner to content. In learning, and likewise online, there is no escape from the need for dialogue, no room for mere telling, nor for practice without description, nor for experimentation without reflection, nor for student action without feedback (Laurillard 2001).
According to Mill (2008), Laurillard divides her learning conversation into four phases: (1) a discursive phase that introduces learners to new concepts and allows them to try out the idea and its corresponding language, questioning and clarifying; (2) an interactive phase in which learners interact with tasks, attempt to apply the new concept and get feedback on their performance; (3) an adaptive phase in which learners apply their own ideas to the practice, modify their ideas and adapt based on what they’ve learned; and (4) a reflective phase in which learners consider their experience of the interactive and adaptive phases, reflect on their learning, relate theory back to their practice, adjust their thinking based on their reflection and frame future actions to be more successful.

Research gap

As demonstrated in the review of the literature, previous MOOC research identified structure and pace, video, content, instructors, interaction and support, and assignments as contributing factors to learner engagement in a MOOC, while factors such as connectivity, digital skills, time zones, or even peer-related issues may contribute to a lack of engagement. It’s worth noting that most literature relating to engagement and retention is associated with learners staying in a course rather than returning to or revisiting the same course, which is a distinct behaviour. Analysis of three top-rated MOOCs also revealed problem-centric learning with clear expositions, instructor accessibility and passion, peer interaction, active learning, and course resources that address participant learning needs to be design factors of a well-received MOOC.

People undertake learning for personal and professional reasons: to increase their knowledge and skills to benefit themselves and others, and to maintain or improve that knowledge and those skills. Given these different reasons, the learning opportunity must be flexible enough to enable the learner to drop in and out at intervals throughout their life – intervals of learning that most likely occur with a community of like-minded individuals. While online communities that foster a culture of participatory learning contribute to the premise of lifelong learning, these environments are suited to learners who are self-determined, attentive, able to think critically and reason, and ready to collaborate.

The openness and freeness of MOOCs create opportunities for learners to ‘re-take’ a course, which is uncommon with higher education courses. Re-taking a course enables learners to practise and continue to master a skill or knowledge which is ideal for MOOCs about cultivating open-ended skills (with no natural limit to continuation or expansion) that require perseverance and time for learning. MOOCs are often sensorially rich mediated environments with the quality of vividness. This quality has been found to be positively associated with an individual’s readiness or willingness to make a repeat visit.

Video is the primary means of content delivery in MOOCs, which has driven considerable research interest in video production. Researchers have identified that videos should combine a number of different video types, design elements, cinematography and design methodologies to maintain learner engagement. Video can also be used responsively to provide feedback and engage learners throughout delivery of the MOOC, which helps to cultivate a personal connection. The presence and responsiveness of the course team (educators, instructors, course mentors) in guiding and facilitating learners is a crucial component of a rewarding MOOC experience, and may have the largest effect on the likelihood of completion.

Many researchers have explored aspects of openness, freeness, impacts of MOOCs on digital teaching and learning, learner demographics, enrolments, motivation and retention (Veletsianos & Shephardson 2016; Zhu, Sari & Lee 2018), sentiment analysis of discussion forums (Wen, Yang & Rosé 2014) or even hype (Fischer 2014). While these research areas of interest were inherent to the early MOOC phenomenon before the MOOC pivot where courses and content were moved partially or fully behind paywalls (Reich & Ruipérez-Valiente 2019), little has been said about learners who continue to revisit the same course, their intention to revisit and then their behaviour when actually revisiting.

Learner behaviour and motivations for revisiting the same course are missing from current research on the design, development and delivery of digital experiences like MOOCs, and are therefore the focus of this study. The goal of this research is to investigate enrolment data, learner activity data and themes identified from comments made by revisiting learners between 2015 and 2020, to identify the factors that contribute to learners revisiting a MOOC. The investigation aims to reveal findings that can inform design decisions for similar MOOCs in the future.

Research methodology

Methodology

For researchers who want to develop a better understanding of the experiences and motivations of learners in a MOOC, comments made by learners provide a rich and authentic data source, because they can reveal the underlying reasons for learner activity. For this study, quiz responses, step activity and enrolment information, as well as learner comments, were used to uncover revisiting learners – these data sets are made readily available to partners of FutureLearn. Using the available FutureLearn data sets rather than independently surveying learners makes it possible to link freshly caught ‘wild comments’ made by learners to their level of engagement, activity and number of times they revisited the MOOC. It also means the comments were made within the naturally occurring context of the learning and course-revisiting experience, rather than in response to an additional survey. The use of freshly caught comments is not without its problems, chiefly the lack of clarity and directness generally afforded by survey questions: the comments require analysis, thematic coding and interpretation, and the distillation of themes has the potential to lack precision. With a survey, revisiting learners could be invited to respond to questions asking them specifically about their revisiting, their reasons for doing so, the behavioural and affective outcomes of their revisiting, and more. Although a survey may yield more direct responses, the approach is challenged by being optional and not strictly related to the course experience, which may reduce the number and consistency of interested respondents (particularly revisiting learners) over time.
It’s worth noting that the FutureLearn platform does permit partners to invite learners to participate in research, but given the constraints of General Data Protection Regulation (GDPR) and its impact on direct communication with learners, the potential for an invitation to reach a large number of learners (from current and past course runs) is extremely limited.

To further investigate the outcomes identified in the comments, case studies of a small group of revisiting learners were undertaken to develop a better understanding of their motivations and underlying behaviour and emotions. While case studies provide a unique opportunity to find out more about an individual learner, or a small group of learners, in a real-life context, their usefulness in telling a broader story about revisiting learners may be limited.

Research methods

This project has approval from the Deakin University Arts and Education Faculty Human Ethics Advisory Group (HAE-20-081). This project is based on data sets from each run of MIND and MINDLIFE which were made available to Monash University from FutureLearn for research purposes in a de-identified form. The data consists of de-identified:

  • student comments from runs of the course between 2015 and 2020 made by students during the course of their learning
  • data analytics of student engagement with different steps in the course (number of clicks, time on page, number of comments, number of replies)
  • enrolments, demographics (age and country), sentiment and survey data (pre-course, leaving).

This data has been collected by FutureLearn and used by Monash University as a partner institution to monitor and improve learning and teaching within the MOOC over a number of iterations of the course. While the data sets used for this research are provided in a de-identified form, it would be possible for students, teachers and moderators in a specific iteration of the MOOC to go back into the MOOC, search for a specific comment, and identify the public profile of the student who made that comment. For this reason, no comments will be quoted in this paper or any subsequent publications, ensuring no student can be identified from the outcomes of this research. Learners consented to this use of the data as part of the Terms and Conditions (Privacy Policy/Research ethics) of their enrolment with FutureLearn. This includes consent to their data being used for research purposes, and an acknowledgement that their comments will not be quoted directly in any publications without their permission.

Data sets of learner enrolment, step activity and learner comments in comma-separated values (CSV) format were sourced from course runs between 2015 and 2020 of Monash University’s ‘Mindfulness for Wellbeing and Peak Performance’ (MIND) and ‘Maintaining a Mindful Life’ (MINDLIFE) free-to-join courses on FutureLearn. Data from MINDLIFE was investigated, but as no additional insights were found, this study focuses on MIND.

Table 1 lists the title, run number, start date and label of each data set sourced from MIND and MINDLIFE used for this research.

Table 1. Data source

 Course  Run number  Start date  Label
 MIND  1  Monday 14 September, 2015  R1
 MIND  2  Monday 8 February, 2016  R2
 MIND  4  Monday 23 May, 2016  R4
 MIND  5  Monday 19 September, 2016  R5
 MIND  6  Monday 6 February, 2017  R6
 MIND  7  Monday 15 May, 2017  R7
 MIND  8  Monday 2 October, 2017  R8
 MIND  9  Monday 5 February, 2018  R9
 MIND  11  Monday 7 May, 2018  R11
 MIND  12  Monday 1 October, 2018  R12
 MIND  13  Monday 4 March, 2019  R13
 MIND  14  Monday 1 July, 2019  R14
 MIND  15  Monday 7 October, 2019  R15
 MIND  16  Monday 16 March, 2020  R16
 MINDLIFE  1  Monday 13 November, 2017  R1
 MINDLIFE  2  Monday 11 June, 2018  R2
 MINDLIFE  3  Monday 23 July, 2018  R3
 MINDLIFE  4  Monday 1 April, 2019  R4
 MINDLIFE  5  Monday 27 May, 2019  R5
 MINDLIFE  6  Monday 15 November, 2019  R6
 MINDLIFE  7  Monday 19 February, 2020  R7
 MINDLIFE  8  Monday 29 May, 2020  R8

It’s worth noting that Run 3 and Run 10 of MIND were not included as data sources because they were closed/private runs of the course, which meant learner comments, learner activity and even motivation for joining the course would be incongruent with the open/free to join course offering.

The CSV files were imported into Microsoft Excel and saved as Excel workbooks for greater flexibility and utility. Quantitative analysis of enrolment data was carried out in Excel. Because FutureLearn counts team members as enrolments, additional work was done to remove course team members (lead educators, course mentors, course developers and designers) and reviewers (non-team members/miscellaneous personnel) from all data sets to ensure accurate enrolment numbers. Pivot tables and concatenation were then used to calculate the total number of revisiting joiners (learners who enrolled in more than one run of the course).
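Although the study carried out this counting with Excel pivot tables, the same revisiting-joiner calculation can be sketched in pandas. The column names (`learner_id`, `role`, `run`) and the toy data below are illustrative placeholders, not FutureLearn’s actual export schema:

```python
import pandas as pd

# Toy stand-in for concatenated per-run enrolment exports; real
# FutureLearn exports use different column names.
enrolments = pd.DataFrame({
    "learner_id": ["a", "a", "b", "c", "c", "c", "d"],
    "role":       ["learner"] * 6 + ["educator"],
    "run":        ["R1", "R2", "R1", "R1", "R2", "R4", "R1"],
})

# 1. Remove course team members so counts reflect learners only.
learners = enrolments[enrolments["role"] == "learner"]

# 2. Count distinct runs per learner; more than one run marks a
#    'revisiting joiner'.
runs_per_learner = learners.groupby("learner_id")["run"].nunique()
revisiters = runs_per_learner[runs_per_learner > 1]

print(sorted(revisiters.index))   # → ['a', 'c']
```

The `groupby`/`nunique` step plays the role of the pivot table: one row per learner, with the number of distinct runs they joined.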

The data files were then imported into NVivo 12 for ‘coding’ of themes. During the initial analysis of the data in NVivo, a number of themes were identified. Further analysis of the themes using the lens of learner archetypes and behavioural and affective/emotional subthemes was undertaken. The analysis revealed two main groups of revisiting learners based on the frequency and pattern of their revisiting. These two groups were then explored further, which resulted in a number of case studies that mapped each revisiting learner’s journey through each course run.

Ten revisiting learners were initially identified based on the number and frequency of their revisits, step activity, quiz responses and comments. Because FutureLearn allocates a unique identification number to each learner when they create their FutureLearn profile, it was possible to link data sets together and map the journey of each revisiting learner. Two main groups of revisiting learners were identified in the data: ‘Sequencers’ and ‘Syncopators’. Sequencers revisited a course offering multiple times, one run after the other without a break, while Syncopators revisited multiple times non-sequentially, on the off-beat, or after a long break (sometimes more than 12 months between revisits). From the ten revisiting learners initially identified, four were selected as case studies to represent the two groups of revisiters.
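The distinction between the two groups can be expressed as a simple rule over the run numbers a learner joined: consecutive run numbers suggest a Sequencer, any gap a Syncopator. The sketch below is a simplification of the study’s qualitative grouping (note that because Runs 3 and 10 of MIND were closed runs, a numeric gap spanning them would not necessarily indicate a break):

```python
def classify_revisiter(run_numbers):
    """Return 'Sequencer', 'Syncopator' or None for a learner's runs.

    Simplified rule: all consecutive run numbers => Sequencer;
    any gap between joined runs => Syncopator. Closed runs (e.g.
    MIND Runs 3 and 10) are not corrected for in this sketch.
    """
    runs = sorted(set(run_numbers))
    if len(runs) < 2:
        return None                       # joined once: not a revisiter
    gaps = [b - a for a, b in zip(runs, runs[1:])]
    return "Sequencer" if all(g == 1 for g in gaps) else "Syncopator"

print(classify_revisiter([5, 6, 7]))      # consecutive runs → Sequencer
print(classify_revisiter([1, 2, 9]))      # long break → Syncopator
```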

Results and discussion

Enrolments and revisits

Combined, a total of 33,990 revisiting learners joined a Monash mindfulness offering between Monday 14 September 2015 and Monday 29 May 2020. It’s worth noting that this number is determined by enrolments and does not represent active learners – typically, around 4% of learners leave a course run. There were 27,957 revisiting learners who joined MIND between Run 1 (14 September 2015) and Run 16 (16 March 2020). Table 2 summarises the number of revisits and the number of joiners, from highest to lowest, for MIND.

Table 2. Revisits to MIND

MIND
 Number of revisits  Joiners
 14  12
 13  7
 12  12
 11  18
 10  42
 9  41
 8  84
 7  164
 6  272
 5  525
 4  1,315
 3  4,144
 2  21,321
 Total  27,957
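A distribution like Table 2 can be tallied from per-learner run counts with a simple frequency count. The figures below are toy numbers, not the study’s data:

```python
import pandas as pd

# Toy per-learner run counts (e.g. 'a' joined 2 runs, 'e' joined 4).
runs_per_learner = pd.Series({"a": 2, "b": 2, "c": 3, "d": 2, "e": 4})

# Frequency of each revisit count, ordered from highest to lowest,
# mirroring the layout of Table 2.
distribution = (runs_per_learner.value_counts()
                .sort_index(ascending=False)
                .rename_axis("number_of_revisits")
                .rename("joiners"))

print(distribution.to_dict())   # → {4: 1, 3: 1, 2: 3}
print(distribution.sum())       # → 5 revisiting joiners in total
```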

There were 6,033 revisiting learners who joined MINDLIFE between Run 1 (13 November 2017) and Run 8 (29 May 2020). While MINDLIFE is complementary to MIND as a learning experience, the number of revisiting learners suggests that revisiting MIND is not an isolated phenomenon. Table 3 summarises the number of revisits and the number of joiners, from highest to lowest.

Table 3. Revisits to MINDLIFE

MINDLIFE
 Number of revisits  Joiners
 8  44
 7  48
 6  67
 5  170
 4  353
 3  1,003
 2  4,348
 Total  6,033

Themes and sub-themes

Initial thematic analysis revealed core themes of revisiting, community, personal practice, content approval, observed outcomes and course team as reasons for learners to revisit the course. Table 4 lists these themes and provides a brief description.

Table 4. Summary of themes from comments made by revisiting learners

 Theme  Description
 Revisiting  Learners who enrolled in, or self-identified as having previously participated in, a course offering, either sequentially or more than once – many learners listed the number of revisits, run number or year of commencement in their comments.
 Community  Learners who highlighted the importance of community, felt a sense of community, or felt supported in a community of like-minded and enthusiastic people.
 Personal practice  Learners who described maintaining, refreshing or growing a regular, ongoing mindfulness meditation practice.
 Content approval  Learners who identified the value of the course content (videos, written content, mindfulness meditations and exercises, and bespoke weekly feedback videos).
 Observed outcomes/change  Learners who noticed changes in their behaviour, awareness or emotions as a result of their ongoing participation in the course.
 Course team  Learners who celebrated and valued the contribution made by the course team – the lead educators and course mentors.

Results indicate that revisiting learners chose to return because the course enabled them to refresh and reinforce their skills and knowledge (practising mindfulness has no natural limit), stay motivated to maintain and grow a regular, ongoing meditation practice, feel supported in a community of like-minded and enthusiastic people, and access and engage with new or course-specific content. Revisiters also revealed behavioural and emotional reasons for their ongoing participation. Many learners noticed an increased awareness and changes in behaviour: staying on track and maintaining their established mindfulness practice, unitasking (efficient attention switching) instead of multitasking, changing ingrained behaviours with a positive impact, reinforcing and practising mindfulness, and creating permanent change in their outlook on life. These learners also noticed how their ongoing participation made them feel less anxious (or anxious for a shorter time), more motivated, less overwhelmed, and part of a global community made up of old friends.
Revisiting reflects the findings of Bozkurt & Keefer (2018), where the learning opportunity must be flexible and permit the learner to drop in and out at their discretion, often at intervals throughout their life, much like ‘a personal learning continuum, rather than as unrelated, separate gatherings’. Typical comments made by revisiting learners in relation to the theme of revisiting include references to the number of course runs they’ve joined, and their eagerness to enrol, engage with other revisiting learners and make a start on the course – many learners felt revisiting gave them a chance to pause and recharge their batteries, build on existing skills, or (re)discover something they’d previously missed.

Revisiting learners also revealed in their comments that feeling supported in a community of like-minded and enthusiastic people was another reason for their return. These comments reflect the findings of Groom (2016), where communities that foster a culture of participatory learning contribute to the premise of lifelong learning. The community doesn’t come without a degree of effort from the learner, as noted by Pegrum (2009), where the course is an environment suited to a learner who is self-determined and attentive, can think critically and reason, and is ready to collaborate. Typical comments made by revisiting learners in relation to the theme of community include references to following the success of other learners, opportunities to learn from an ‘international’ community, a sense of strength, support and the perspectives of others, and being motivated by other learners.

About the Author

rowan_peter
