Tag Archives: learning

My first noise displacement

Super-stoked about taking my very first steps with the visual coding software TouchDesigner by making my way through Polyhop’s Noise Displacement Tutorial – it’s a fun tutorial with just the right amount of detail designed especially for first timers like me.

You literally start with a ‘blank canvas’ and then slowly build up to something visually interesting and fun to play with – I’m most likely going to adopt this workflow while I’m still finding my feet with TouchDesigner.

My key takeaways

  • Learning how to chain together Noise TOPs
  • Using Nulls to connect nodes
  • Polyhop’s expression “toot the parameters” – love it!

My noise displacement network
Movie out of my noise displacement network

Research plan – They’re back! To what extent and in what ways do particular pedagogical dimensions, or combinations thereof, contribute to repeated learner participation in an online course over consecutive or non-consecutive runs?


Since their emergence in the mid-2000s, massive open online courses (MOOCs) have been predicated on making learning available to everyone, and at scale. Much effort has been spent analysing data generated by MOOC participants (e.g., Guo, Kim & Rubin, 2014; Savage, 2009; Wang, 2017) to determine whether video production methods, format, style, type or even duration have the capacity to solely influence student learning and engagement. While the use of instructional video by educators isn’t new to face-to-face or online learning experiences, it does serve as a critical and primary means of content delivery in a MOOC. Research across a variety of MOOCs (e.g., Engle, Mankoff & Carbrey, 2015; Hew & Cheung, 2008; Taib, Chuah & Aziz, 2017) has also been conducted into better understanding the impact of pedagogical dimensions such as cooperative learning, feedback, activities and assessments on learner participation, suggesting that there may be more to cultivating an engaging learning experience than video instruction alone. These factors may seem elementary for a single instance or one-off run of a MOOC, but what of repeat learners – those learners who repeatedly join and then continue to actively participate in the same MOOC over multiple instances? An example of this cohort would be the learners who continue to participate in Monash University’s ‘Mindfulness for Wellbeing and Peak Performance’ (MIND) and ‘Maintaining a Mindful Life’ (MINDLIFE) courses on FutureLearn.

I currently work in a central unit within Monash University responsible for the design, development and delivery of short online courses with education partner FutureLearn. MIND is one of our longest running courses and continues to enjoy a large learner cohort who actively and consistently engage with the course content, other learners and the course team. The positive learner response to MIND stimulated the release of MINDLIFE, a complementary course which has also been well received by learners. While the design, development and delivery of Monash’s other non-mindfulness courses are remarkably similar to the two mindfulness offerings, it has become evident that these courses may not attract repeat learners (over consecutive or non-consecutive runs) in a similar way. But why? The purpose of this research is to identify, explore and provide a detailed account of the extent and ways in which particular pedagogical dimensions, or combinations thereof, contribute to repeated learner participation in an online course over consecutive or non-consecutive runs.

This research has a number of potential benefits for individual respondents, education/teaching and learning, and designers and developers of online courses. Respondents may find it interesting to reflect on their ongoing participation in an online course and note any particular elements or experiences they’ve found to be particularly important to them or that have influenced their ongoing course participation. More broadly, this research should help improve our understanding of online learners’ motivation to continue to participate in an online course over consecutive or non-consecutive runs, which could then be used to increase ongoing course participation as well as guide the design, development and delivery of alternate or expanded online course offerings.

Literature review

The following identifies key themes from literature related to the pedagogical dimensions, or combinations thereof, inherent in online courses, thus providing context for my research.

The use of video

Although the use of video isn’t exclusive to MOOCs, video is the primary means of content delivery. Each MOOC is unique: educators, course content designers and producers work together to create video content that makes use of a diverse combination of video types (text, static image, image and presenter face/talking head, and fully animated) and video design elements (static slides, digital ink/scribbler/analysis, discussion/discursive) that complements the pedagogical approach. The impact of video production on learning is of great interest to educators and MOOC producers. Guo, Kim and Rubin (2014) found that videos of short duration are engaging to learners. An equal level of engagement was identified for video types that featured the ‘talking head’ of a presenter or used digital ink/scribbler/analysis to provide a more detailed explanation. Neilson (2014) and Savage (2009) support the use of videos of short duration. To maintain learner engagement, Neilson (2014) recommended that videos combine different video types and design elements, such as a presenter face/talking head with digital ink/scribbler/analysis. Multiple studies have investigated the role of cinematography and cinema production design/methodology in learner engagement. In his research, Wang (2017) identified camera shot style and background design ‘down to the granularity of how the image is captured, recorded and delivered’ (Wang 2017) to be equally as important to learner attention and engagement as video types and video design elements. In addition, Wang, Chen and Wu (2015) observed the impact of different video types and video design elements on learning. They concluded that the production cost and intensity associated with creating ‘lecture capture and picture-in-picture videos may be worthwhile for online learning from the perspectives of improved learning performance and reduced cognitive load’ (Wang, Chen and Wu 2015).
The use of video isn’t limited to pre-prepared and pre-recorded content that contributes to the course material. Video can also be created in response to learner activity, and plays a critical role in enabling educators and instructors to provide feedback to learners throughout course delivery. Henderson and Phillips (2015) reported that the affordances of video enable instructors to convey detailed and elaborate feedback, as well as encouragement and praise, to learners more so than written commentary. Henderson and Phillips (2015) also reported positive learner responses to video-based feedback, where learners felt they had a closer connection with the instructors or educators. Echoing these reports are learner reviews posted online: ‘The final feedback video each week is the jewel in their crown. It sums up the week’s thoughts and mentions some of the issues and talking points brought up in the online comments’ and ‘The weekly feedback sessions allowed the participants to feel engaged and gave a more personal feel to an online course’ (Class Central 2018). The use of video-based feedback in this capacity not only addresses any potential knowledge deficit experienced by learners and ‘makes the massive feel intimate’ (Pappano 2018), but also personalises the course in a meaningful and practical way, while actively contributing to a change in learner perceptions of the course as the type where an ‘instructor is not as available because there are tens of thousands of others in the class’ (Pappano 2012).

The role of the facilitator

When thinking about why learners persist with a course or even choose to repeat the same course again, it’s critical to consider the role of instructors, course mentors or any of the multitude of names for the course team members responsible for facilitating discussions and interacting with learners. According to Hew (2014), instructor accessibility and passion are some of the features identified as key for promoting learner engagement, where engagement is defined as an observable action in the course. While engagement is different from completion and retention (which, in MOOCs, are often misconstrued metrics devised by educators and platform providers for defining their view of learner success, satisfaction, needs or goals), an analysis by Adamopoulos (2013) revealed that the role of the instructor had the largest possible effect on the likelihood of completion. Learner reviews posted online – ‘In addition to this there are two outstanding mentors who support the learning and help clarify the more scientific aspects of the course and make lots of useful links and resourses [sic] available to enable the independant [sic] exploration of a complex subject’ (Class Central 2018), ‘The moderators are extremely active in their support for learners’ (Class Central 2018) and ‘The mentors had valuable insights and comments and added a lot’ (Class Central 2018) – echo the findings of Adamopoulos (2013) and Pilli & Admiraal (2017) in relation to learner interaction with an instructor, where learner retention and satisfaction are higher when instructors are highly responsive. However, some earlier literature on MOOCs (Kop, 2011; Mackness, 2013; Milligan & Littlejohn, 2014) seemingly bemoans the absence of interaction between instructors and learners, asserting that learners may be prevented from having a quality learning experience due to their limited capacity to undertake self-directed learning.
In the past, a diminished learning experience may have been inevitable, considering the vintage of the online environment in which their learner cohort was constrained (the anti-social EdX platform and the hardcore Connectivist Edupunk-ness of the open web). Since the launch of FutureLearn, a platform that’s been ‘inspired by Laurillard’s Conversational Framework’ (Sánchez-Vera, Leon Urrutia & Davis 2015), a diminished learning experience seems less likely, as the platform is predicated on a social learning experience: instructor-learner, learner-learner and learner-content.

A learning continuum

According to Horrigan (2016), people undertake learning for personal and professional reasons. Personal learners often choose learning opportunities that are likely to increase knowledge and skills that benefit themselves and others. Professional learners often choose learning opportunities that are likely to maintain or improve their knowledge and skills. Irrespective of the learner’s inclination for undertaking learning, the opportunity must be flexible and permit the learner to drop in and out at their discretion, often at intervals throughout their life, much like ‘a personal learning continuum, rather than as unrelated, separate gatherings’ (Bozkurt & Keefer 2018). Communities that foster a culture of participatory learning contribute to the premise of lifelong learning, something to be part of, an idea ‘that the teaching and learning experience, the idea of imagining where you stand within in that environment is something that’s akin to the DIY punk experience‘ (Groom 2016). While the course creates an environment for this type of experience, it’s an environment suited to a learner who is self-determined, attentive, can think critically, reason, and is ready to collaborate (Pegrum, 2009). The presence of instructors is once again critical: they support learners and the development of a community of learning, all the while gently guiding learners through the course. This raises the question as to why learners continue to participate and persist with a course and continue to engage in the community of learning. As MOOCs are voluntary, is it possible that learners simply enjoy being part of a community, or that their goals to deepen their knowledge and skills are being met? Or is it the convenience, ease of use and flexibility afforded by the platform, or possibly even something else entirely that’s related to their individual needs that’s being served?
Learners (Class Central, 2018) who participated in either or both of the mindfulness-related offerings by Monash reported ‘The two courses that I have been blessed with the opportunity to do on a number of ocassions [sic] have really enhanced the quality of not just my life but those who share their lives with me in any way. I am simply a better person for the learnings that I have had through undertaking the courses’, ‘I have found the leaders, Craig and Richard, really engaging and great role models for their subject, being calm, measured and reassuring, but also inspirational and encouraging. The strength of the course is their rapport in the short videos and audios which acknowledge human frailty and make the continuing practice of mindfulness seem vital and attainable in daily life.’ and ‘I plan to take the course again when it is offered in November, so I can delve more deeply into this more authentic way of viewing the world and living my life’, which provides some insight into their ongoing participation. Furthermore, the learner sentiment expressed towards the mindfulness related offerings is somewhat akin to the model proposed by Peltier, Drago, and Schibrowsky (2003), that identified student-to-student interactions, student-instructor interactions and instructor support and mentoring, as well as course content, course structure and information delivery technology as key components to learner perceived quality of the online learning experience.

Research methodology

This research is situated in the postpositivist paradigm, which is predicated on the need to identify and assess the causes of outcomes (Creswell 2018). This seems the most appropriate for my research because, as noted by Phillips and Burbules (2000), data, evidence and rational considerations shape knowledge. For my research, knowledge will be accrued from data gathered from measurement tools (surveys) completed by respondents and from direct observation (semi-structured interviews). Creswell (2018) astutely synthesised elements from the work of Phillips and Burbules (2000) in arguing that research seeks to develop relevant, true statements, ones that can serve to explain the situation of concern or that describe the causal relationships of interest. Explaining the causal relationships is most critical for my research, particularly when I’ll be required to propose a possible relationship between the pedagogical dimensions, or combinations thereof, that contribute to repeated learner participation. This research will make use of an explanatory sequential mixed methods design, which leverages quantitative and qualitative research data. Creswell (2018) notes the overall intent of this design is to have the qualitative data help explain in more detail the initial quantitative results. Fittingly, the affordances of the FutureLearn platform mean that quantitative data about course participant activity is readily available, which will streamline the first phase of the two-phase design. A two-phase design has advantages that, according to Morse (1991), include straightforwardness and opportunities to explore the quantitative results in more detail.

*Based upon the ‘Visual model for mixed-methods sequential explanatory design procedures’ outlined by Ivankova, Creswell & Stick (2006).

Data collection procedures

Initially, quantitative data will be collected from datasets automatically generated from learner activity in the platform. Additional quantitative and principal qualitative data will then be collected from learners who have previously taken part in either Monash mindfulness course. They will be invited, together with learners currently completing the course, to participate in an anonymous 10-15 minute online survey via custom course notices (scheduled emails sent to learners). Learners will also be invited to participate in the survey through strategically placed calls-to-action within course content, combined with carefully prepared ‘nudges’ included in scheduled course notices throughout course delivery. At the end of the survey, learners will be invited to take part in a semi-structured interview, scheduled according to their availability. Data will be gathered from Monash-based Qualtrics surveys and from datasets (comments, weekly sentiment surveys, step activity, post-course survey data, etc.) readily available from the stats dashboard of each run of each mindfulness course on FutureLearn. Qualtrics was chosen as the online survey tool due to its organisational endorsement and support, as well as its accessibility features and responsive design, which contribute to the ease of survey completion by respondents who may choose to answer using a range of devices (mobile phones, tablets and computers). This project is considered to be low risk and respondents are unlikely to experience any serious harm from participating.

The participants

Since its first run in 2015, over 254,000 learners have participated in MIND, with over 27,000 learners taking part in MINDLIFE since its inception in 2017. These learners, and any new learners who join future runs of either mindfulness course, will be invited to participate in the online survey.

The setting

FutureLearn is the platform which hosts the two mindfulness-based courses offered by Monash (MIND and MINDLIFE) and will serve as an appropriate and relevant context to engage with learners.

Data analysis

Online survey responses, FutureLearn datasets, interview transcripts and audio recordings will be imported into NVivo for thematic analysis based on identified pedagogical themes, with the provision for the creation of new themes or modification of existing themes as they arise.

Limitations of the study

One key limitation of this research is the discipline-specific nature of the course offerings. The mindfulness courses are made up of a number of pedagogical dimensions that create opportunities for learners to investigate the science and theoretical framework of mindfulness, but also to cultivate and apply the skills as a deeply personal ongoing practice. While a lifelong commitment to learning and practising a particular skill is in no way exclusive to mindfulness or the course offerings, some disciplines may not lend themselves to a similar level of commitment. These differences warrant strong caution against research findings that reduce or oversimplify recommendations into a simple transposition of the course design.


Adamopoulos, P. (2013). What makes a great MOOC? An interdisciplinary analysis of student retention in online courses. In Thirty fourth international conference on information systems, Milan, 2013.

Bozkurt, A., & Keefer, J. (2018). Participatory learning culture and community formation in connectivist MOOCs. Interactive Learning Environments, 26(6), 776-788.

Creswell, J.D., & Creswell, J.W. (2018). Research design: qualitative, quantitative & mixed methods approaches, 5th edn, Sage, Melbourne.

Class Central 2018, Maintaining a Mindful Life, retrieved 20 July 2018, <https://www.class-central.com/course/futurelearn-maintaining-a-mindful-life-9078#reviews>

Class Central 2018, Mindfulness for Wellbeing and Peak Performance, retrieved 20 July 2018, <https://www.class-central.com/course/futurelearn-mindfulness-for-wellbeing-and-peak-performance-3714#reviews>

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the first ACM conference on Learning@ scale (pp. 41-50).

Henderson, M., & Phillips, M. (2015). Video-based feedback on student assessment: scarily personal. Australasian Journal of Educational Technology, 31(1), 51-66.

Hew, K. F. (2014). Promoting engagement in online courses: what strategies can we learn from three highly rated MOOCS. British Journal of Educational Technology (Online First). http://doi.org/10.1111/bjet.12235.

IMDB 2018, The Young Ones – Nasty, retrieved 21 July 2018, <https://www.imdb.com/title/tt0752259/>.

Kop, R. (2011). The challenges to connectivist learning on open online networks: learning experiences during a massive open online course. The International Review of Research in Open and Distance Learning, 12(3), 19–38.

Mackness, J. (2013). cMOOCs and xMOOCs-key differences. Available from: https://jennymackness.wordpress.com/2013/10/22/cmoocs-and-xmoocs-key-differences/.

Milligan, C., & Littlejohn, A. (2014). Supporting professional learning in a massive open online course. The International Review of Research in Open and Distance Learning, 15(5), 197–213.

Neilson, B. (2014). Video Production and Learner Engagement in MOOCs, retrieved 22 July 2018, <http://www.yourtrainingedge.com/video-production-and-learner-engagement-in-moocs/>

Pegrum, M. (2009). From blogs to bombs: The future of digital technologies in education. Perth,
Australia: University of Western Australia Publishing.

Peltier, J. W., Drago, W., & Schibrowsky, J. A. (2003). Virtual communities and the assessment of online marketing education. Journal of Marketing Education, 25(3), 260–276.

Horrigan, J. B. (2016). Lifelong Learning and Technology, Pew Research Center, retrieved 20 July 2018, <http://www.pewinternet.org/2016/03/22/lifelong-learning-and-technology/>

Phillips, D. C., & Burbules, N. C. (2000). Postpositivism and educational research. Lanham, MD: Rowman & Littlefield.

Pilli, O., & Admiraal, W. (2017). Students’ learning outcomes in massive open online courses (MOOCs): Some suggestions for course design. Journal of Higher Education, 7(1), 46–71.

Sánchez-Vera, M. d. M., Leon Urrutia, M., & Davis, H. C. (2015). Challenges in the creation, development and implementation of MOOCs: Web Science course at the University of Southampton. Comunicar, 22(44), 37-43.

Pappano, L. (2012). The Year of the MOOC, The New York Times, retrieved 21 July 2018, <https://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html>

This Course Could Be Your Life, Keynote – Jim Groom 2016, YouTube, CIRT Lab, 3 March, retrieved 20 July 2018, <https://www.youtube.com/watch?v=Xk97NwetXtE>.

Wang, W., Chen, C., & Wu, C. (2015). Effects of different video lecture types on sustained attention, emotion, cognitive load, and learning performance. In 2015 IIAI 4th International Congress on Advanced Applied Informatics, Okayama (pp. 385-390).

Wang, P.-Y. (2017). The impact of camera shot and background design for MOOC videos on student recall and flow experience. Journal of Educational Media and Library Science, 54(3):237-268

Hi Rowan

You did a very good job of using the literature to complement the purpose of a research project. However, I found it difficult to locate a clearly written research question within the text of the submitted paper; the research question is somewhat implied. You used research theory well to justify a mixed methods procedure, but there were elements within the design that required more information, for instance:

  • whether the online survey is going to be designed to answer open or closed questions (or a combination of both)
  • how many participants will be interviewed
  • what process will be used to select participants for interviews
  • whether descriptive and/or inferential statistics will be used to analyse quantitative data, and
  • a review of the ethical considerations.

Targeting 250,000 participants to respond to an online survey is massive and not suited to a Master’s level research project, given it will take considerable time to gather and analyse data. Random sampling should have been considered and discussed. The research project is of value within the field of education; however, the process needs to be narrowed down and clearly articulated.

WWW – Simple access to web pages

I’m building an interactive experience which makes use of available weather data, much like the data services made available by the BOM or the OpenWeatherMap. This means I need to be able to load data from the web, parse it and then transform 3D geometry based on real world locations and the loaded weather data.

That’s a whole load of stuff to figure out and then pull together, so I’m making a start by learning how to load data by retrieving the contents of URLs and weather specific data, just to get the whole thing going.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SimpleWebAccess : MonoBehaviour {

	// Start can be declared as a coroutine, which lets us yield until the download completes
	IEnumerator Start () {
		WWW www = new WWW ("ftp://ftp.bom.gov.au/anon/gen/fwo/IDV10751.xml");
		yield return www;
		print (www.text);
	}
}

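The next job will be parsing the XML that comes back before I can drive any geometry with it. I haven’t settled on an approach yet, but a minimal sketch using System.Xml might look like the following – note that the element and attribute names (`area`, `aac`) are my assumptions about the BOM forecast format, so they may need adjusting against the real data:

```csharp
using System.Xml;
using UnityEngine;

public class SimpleWeatherParse : MonoBehaviour {

	// Hypothetical: parse the XML text returned by the download above.
	// "area" and "aac" are assumed names from the BOM forecast feed,
	// not verified against the live data.
	public void ParseForecast (string xmlText) {
		XmlDocument doc = new XmlDocument ();
		doc.LoadXml (xmlText);

		// List each forecast area so it can later be mapped to a real-world location
		foreach (XmlNode area in doc.GetElementsByTagName ("area")) {
			XmlAttribute id = area.Attributes ["aac"];
			if (id != null) {
				print (id.Value);
			}
		}
	}
}
```

The idea would be to call `ParseForecast (www.text)` after the `yield return www;` line, once the download has finished.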

Related links

Service Design Boot Camp at GA

On Saturday 20 August I attended the Service Design Boot Camp workshop at General Assembly in Melbourne, which was pretty cool. I’ve been interested in the discipline for some time, exploring aspects of design process, prototyping, testing and iterating with Coursera’s Design: Creation of Artifacts in Society open online course, but I’ve never had the chance to embrace it fully face-to-face, until now. Awesome.

On the promotional page for the workshop they said “This course is for anyone that has an interest in applying the design process to solve complex problems. It’s likely you’ll have many transferable skills or experiences that will be put to use through the course of the day.” Cool. That’s exactly what I want to be able to do.

What did we do?

In the workshop we worked through the components that make up the practice of service design:

  • Discovery: gaining empathy and understanding the needs and pain points of users.
  • Ideation: Developing a range of ideas on how to develop a solution to meet the needs of all users.
  • Prototyping: Testing and iterating, including the customer experience, “front of house” interactions, and back of house dependencies.
  • Communication: Articulating the many facets of your offering in a concise way.

“How might we…” “for…” “so that…”


Deliver – Service blueprinting and role play

Completed service blueprint for our solution.

We used role play to articulate the customer journey and the services we designed.

Leading practitioners

Mobas – mobile checklists enabling authentic workplace assessment

Mobas is a web application (and it’s actually still going…) that enables vocational education students and teachers to create, complete and submit formative and summative assessments from their smartphone or other internet-enabled device – it was funded by the E-learning for Participation and Skills project under the National VET e-Learning Strategy and released in 2012/2013.


National VET e-Learning Strategy and the students and teachers at Box Hill Institute


Web application/Moodle learning management system (LMS) with a custom plug-in.


Students, teachers and workplace supervisors who are undertaking and facilitating training and assessment in the workplace, specifically trades such as carpentry, plumbing and automotive.

My role – learning designer

  • Work with the project manager and project director to determine assessment templates that would be most relevant to the students, workplace and industry.
  • Map proposed assessment templates to relevant units of competency
  • Design mockups for the specified assessment templates
  • Work with multimedia development, design and infrastructure team to design user interface elements, functionality and user experience.
  • Write instructional copy for Mobas and Moodle LMS.
  • Report to the project manager and the project director on the completion of allocated tasks.
  • Test and evaluate assessment templates throughout production cycle and provide feedback as required. 
  • Facilitate training sessions for students and teachers and co-present information sessions for project stakeholders and management.
  • Naming rights – I titled the web application ‘Mobas’.

Watch the project team share their reflections on the project and the value that Mobas brings to teaching and learning in the VET sector.

Log in screen of Mobas
The work diary assessment template – students can add an entry (offline) which is then uploaded to the LMS.

My first attempts at strumming D major and E minor

Here are my first attempts at playing the D major and E minor chords. Pretty bad. I’ll continue to practice slowly alternating between the two chords and then I’ll attempt to record an Asynchronous Jam with the Class Recording: Strumming D major and E minor. Then, I’ll learn A minor.

E minor
E minor

D major
D major

Strumming D major and E minor (Badly)

Stringing my guitar and tuning it (Standard tuning)

New strings
My new D’Addario Medium Light strings. They have a full, bright tone?

It took some time, but I was finally able to string my guitar with new D’Addario Medium Light strings. This is the process I worked through. Using pliers, I gently tugged the bridge pins from the bridge, removed the old strings and then placed the ball end of each new string into its respective bridge hole. I then inserted the end of each string through the hole of each tuning post. Next I had to wind each string around its post. I found an informative video on YouTube demonstrating how to string a guitar and do a funky little loop around each post. I had to watch it a couple of times just to get the direction of the loop right.
All up, it took me about 45 minutes to finish stringing my guitar. Now it’s time to tune it.

I used Guitar Tuna to tune my guitar. It's good for guitar newbies like me!
Guitar Tuna! The polygraph-test line-on-graph-paper style, individually labelled strings and Tune up or Tune down instructions made it easier for a total guitar beginner like me to tune each string.

I had tried using a number of different guitar tuner apps to tune my guitar. I had even tried using a web-based guitar tuner. I felt like they were more suited to people with a bit more guitar experience than me. That’s ok. I just needed something with a little more instructional scaffolding. That’s why I liked Guitar Tuna! I was able to select the string on the app I wanted to tune, pick the same string on the guitar and then adjust the tuning key according to the feedback provided by the coloured line drawn by the pen on the graph paper and the Tune up or Tune down instructions that were displayed. Once the line was drawn in the middle of the graph paper and coloured green, I knew the string was tuned. I then followed the same process for the remaining strings. With more experience and the need to play songs with alternate tunings, I might find Guitar Tuna restrictive, but for now it’s my tuner of choice.

My introduction to the Introduction to Guitar open online course

I can’t play guitar. I decided to change that by buying a second-hand acoustic guitar and then signing up as an open online student to the Introduction to Guitar course. The course is going to be challenging and fun. I hope it’s not too painful to watch and listen to me learn how to play the guitar. Let’s see how it goes.

A public conversation with myself about learning

A public conversation with myself about learning is a collection and reflection on some of @todd_conaway’s tweets from #silt2011. One of the most interesting things for me is the familiarity of themes expressed in his tweets. Learning objectives with real-world application of skills, learning outcomes and competency based completion are all staple objectives in the Vocational Education and Training (VET) sector in which I work.