I’m building an interactive experience which makes use of available weather data, much like the data services made available by the BOM or OpenWeatherMap. This means I need to be able to load data from the web, parse it, and then transform 3D geometry based on real-world locations and the loaded weather data.
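As a rough sketch of that pipeline, here's how fetching a city's current weather and mapping it onto transform parameters might look in Python. The endpoint shape matches OpenWeatherMap's current-weather API, but the API key, and especially the weather-to-geometry mapping, are my own illustrative choices, not anything prescribed:

```python
import json
import math
from urllib.request import urlopen

# OpenWeatherMap's current-weather endpoint takes a city name (q) and an
# API key (appid) as query parameters; the key here is a placeholder.
OWM_URL = "https://api.openweathermap.org/data/2.5/weather?q={city}&units=metric&appid={key}"

def fetch_weather(city, api_key):
    """Load and parse the current weather JSON for a city (needs a network connection)."""
    with urlopen(OWM_URL.format(city=city, key=api_key)) as resp:
        return json.load(resp)

def weather_to_transform(weather):
    """Map parsed weather data onto simple 3D transform parameters.

    The mapping is arbitrary and illustrative: temperature drives vertical
    scale, wind speed drives spin rate, and wind direction becomes a
    rotation about the Y axis (in radians).
    """
    temp = weather["main"]["temp"]          # degrees Celsius
    wind_speed = weather["wind"]["speed"]   # metres per second
    wind_deg = weather["wind"].get("deg", 0)
    return {
        "scale_y": 1.0 + max(temp, 0) / 40.0,   # hotter -> taller geometry
        "spin_rate": wind_speed * 0.1,          # windier -> faster rotation
        "rotation_y": math.radians(wind_deg),   # align with wind direction
    }

# Example with a canned response, so no network call is needed:
sample = {"main": {"temp": 20.0}, "wind": {"speed": 5.0, "deg": 90}}
params = weather_to_transform(sample)
```

The nice thing about separating the fetch from the mapping is that the geometry logic can be tested (and art-directed) with canned data before the live feed is wired up.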
I’m interested in the teaching and learning applications of this technology as well as the artistic ones. What’s interesting about this exercise is the wasted opportunity to celebrate respondents and showcase their input into the data-gathering and research process. Metro and Monash could have actively showcased the exercise by representing the number of passengers/users on the platform in a creative way, personally and publicly – on their devices, on large screens and through speakers. Doing this may have mitigated some of the possible resistance from passengers who only find out about their contribution to the research after the fact – they see a small sign that gives them the option of opting out by turning off their Wi-Fi (and therefore disabling their own connectivity for work or study), which isn’t really fair.
This trial is a joint initiative between Metro, Public Transport Victoria (PTV) and Monash University.
What is this trial about?
Metro, PTV and Monash University are conducting research to gain real time data of passenger numbers on platforms at Richmond station and on board trains travelling from Richmond across the network.
This is about using technology to provide better information to improve the services we provide to customers.
What will the trial do?
Information will be collected on how people are using Richmond Station by counting the number of Wi-Fi enabled devices on the platforms and trains.
As a result of this collection of data, Metro will be able to further analyse how it can improve the customer experience by:
collecting data to better inform future network service planning
improving information available at stations and allocation of customer service staff
identifying crowd movements on and around platforms
providing customers with a better overall service
When will the trial take place?
The trial will commence on the 17th of February 2017 and run until the 30th of June 2017.
Where will the trial be conducted?
The trial will focus only on passenger flow on platforms 7, 8, 9 and 10, plus the concourse at Richmond Station, and on board four trains.
All areas where this technology is active will be clearly marked to advise customers.
How does it work?
Wi-Fi routers will be installed on platforms 7, 8, 9 and 10, plus the concourse at Richmond Station, and on four trains. These routers will be able to count the number of active Wi-Fi devices in the vicinity of these platforms.
How can I opt out?
All you need to do is to turn off the Wi-Fi on your personal device and you will not be included in this trial.
Can you access my personal information from my electronic devices?
No. Personal information is never traced or tracked.
Your device’s unique identification number (MAC address) is put through two levels of encoding. This ensures your personal information cannot be traced or tracked.
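The FAQ doesn’t say what those “two levels of encoding” actually are, but a common way to make device counting non-reversible is to hash the MAC address twice with secret salts. This sketch is my own guess at the general technique, not the trial’s actual scheme; the salt values are obviously placeholders:

```python
import hashlib

# Hypothetical reconstruction: two rounds of salted SHA-256. The same
# device always maps to the same token (so it can be counted), but the
# token can't be turned back into a MAC address without the salts.
SALT_1 = b"collector-secret"   # illustrative salt, e.g. held on the router
SALT_2 = b"analyst-secret"     # illustrative salt, e.g. held by the researchers

def anonymise_mac(mac: str) -> str:
    """Return a double-hashed, non-reversible identifier for a MAC address."""
    first = hashlib.sha256(SALT_1 + mac.encode()).digest()
    return hashlib.sha256(SALT_2 + first).hexdigest()

token_a = anonymise_mac("aa:bb:cc:dd:ee:ff")
token_b = anonymise_mac("aa:bb:cc:dd:ee:ff")   # same device, same token
token_c = anonymise_mac("11:22:33:44:55:66")   # different device, different token
```

Splitting the two salts between two parties means neither party alone can re-identify a device, which is presumably the point of having two levels rather than one.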
How will I know what trains I’m being tracked on?
All trains where this technology is active will be clearly marked with posters to advise customers.
Will I start being registered as soon as I step on one of the four trains, or only when we approach Richmond Station?
The technology counts the number of active Wi-Fi devices on board the trains regardless of their location on the network. However, during analysis, only trains that travel through Richmond Station will be considered.
Who has access to this information?
The number of devices will be the only information collected. Your personal information is never traced or tracked and will remain completely anonymous.
The raw data is held only by Monash University and will be deleted 90 days after finalisation of the trial. The statistical information will be used to gauge the accuracy of this technology and will be shared with Metro and PTV.
How can I get more information?
For more information you can contact the PTV call centre: 1800 800 007
On the promotional page for the workshop they said “This course is for anyone that has an interest in applying the design process to solve complex problems. It’s likely you’ll have many transferable skills or experiences that will be put to use through the course of the day.” Cool. That’s exactly what I want to be able to do.
What did we do?
In the workshop we worked through the components that make up the practice of service design:
Discovery: gaining empathy and understanding the needs and pain points of users.
Ideation: developing a range of ideas for a solution that meets the needs of all users.
Prototyping: testing and iterating, including the customer experience, “front of house” interactions, and back-of-house dependencies.
Communication: articulating the many facets of your offering in a concise way.
“What is the strange profound attraction that this rectangular piece of concrete holds for them? Do we now observe the rites of passage of a newly emerging civilisation?” – Dr Eugene D Mander (Public Domain, 1988)
Today I attended the Rethinking Online Learning: Melding the Best of Teaching, Television and Testing seminar presented by Professor Gosling (Department of Psychology, University of Texas, Austin, USA) as part of the Innovations in Teaching and Learning series of seminars presented by Melbourne University.
Professor Gosling’s seminar was based on work he’s doing with a colleague at the University of Texas on rethinking online learning, particularly a synchronous broadcast delivered to a large number of students. In the description for the seminar, Professor Gosling described his work in the following way:
We teach a Synchronous Massive Online Course (or SMOC), broadcast live to about 2000 students. With daily quizzes and a television show format, we find that absentee rates are low, test performance high, study habits greatly improved, with large drops in achievement gaps between rich and poor students. The synchronous broadcast model offers a number of benefits, including facilitating interactive elements and addressing concerns about cheating. Many challenges remain but our experiences (and data) suggest that large online classes taught using this format have great potential.
In his seminar, Professor Gosling spoke about the design, development and delivery of a Synchronous Massive Online Course (or SMOC) for the Introduction to Psychology course at the University of Texas. The SMOC was a response to what he called the Big Old Class (BOC), which suffered from high student attrition and low achievement. Built on Canvas (the learning management system (LMS) by Instructure), Gosling and his colleague were able to broadcast their lecture – in a chat-show format with segments such as daily news items, lab experiments and interviews with experts – from a studio at the university to a live student audience, with other students tuning in online. Within the Canvas LMS, students were also able to form mentor-based study groups (known as pods), complete surveys, access online textbooks and resources, and complete daily tests (known as benchmarking). Benchmarking featured questions individualised to the student and contained feedback and support that enabled the student to undertake self-regulated learning. Professor Gosling advocated daily benchmarking as a method of providing students with feedback and measurement of their performance, in contrast to more traditional mid-term examinations, where performance is often measured too late for the student to do something about it.
Problems and issues
The only problem or issue with the television show format that Professor Gosling mentioned was the cost of production, particularly the team (analogous to a live television production crew) required to coordinate and sequence the broadcast of the show.
Although Professor Gosling didn’t mention cheating and collusion as a problem or issue for the course, it isn’t something specific to this course format; it just becomes a little more complex when student behaviour is somewhat obfuscated by online delivery. (Professor Gosling did go on to talk about his approach to managing cheating and collusion between his students.)
Professor Gosling went on to tell us that the course had been a success, with increases in student retention rates and grades. He attributed the success to the television show format, the intensive benchmarking with feedback (which encouraged self-regulated learning), and the student mentoring and facilitated discussions (via the study-group pods). The course was also a success in gathering data about students’ online behaviour that could be used for further research and continued course enhancement. Although not mentioned by Professor Gosling, this data could also serve as a potential revenue stream. Based on the success of the course, Professor Gosling told us this model was being strongly considered for adoption by other faculties at his university.
Managing cheating and collusion
Professor Gosling and his team managed cheating and collusion between students during the daily benchmarking by assigning someone to write software that monitored and compared, in real time, the order in which each student completed the questions and the amount of time each of them took to complete each question. The software then identified patterns of completion and was able to determine the likelihood of collusion between students during benchmarking. Professor Gosling and his team then decided whether to send a suspected student an email warning them that their behaviour was being closely monitored.
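The talk didn’t go into implementation detail, but the core idea – comparing the order in which pairs of students answered questions – is straightforward to sketch. The following is my own toy reconstruction, not the actual software: the data shape, the positional-match similarity measure and the 0.9 threshold are all illustrative (a real system would also compare answer timings in an analogous way):

```python
from itertools import combinations

def order_similarity(a, b):
    """Fraction of positions at which two students answered the same question."""
    matches = sum(1 for qa, qb in zip(a, b) if qa == qb)
    return matches / max(len(a), len(b))

def flag_collusion(logs, order_threshold=0.9):
    """logs maps student id -> list of question ids in the order answered.
    Returns the pairs whose answer order is suspiciously similar."""
    flagged = []
    for s1, s2 in combinations(sorted(logs), 2):
        if order_similarity(logs[s1], logs[s2]) >= order_threshold:
            flagged.append((s1, s2))
    return flagged

logs = {
    "alice": [3, 1, 4, 2, 5],
    "bob":   [3, 1, 4, 2, 5],   # identical answer order -> flagged
    "carol": [1, 2, 3, 4, 5],   # independent order -> not flagged
}
pairs = flag_collusion(logs)
```

Even a crude measure like this makes the pedagogical point: identical completion orders across a randomised question pool are very unlikely to happen by chance.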
There’s no denying that the flexible and fun nature of the online television-show-style broadcast would have been a contributing factor to the increase in student performance, but I can’t help but think the mentored study groups and the rather rigorous and regimented daily benchmarking would also have been major contributing factors, particularly when the benchmarking provided feedback that helped students undertake their own self-regulated learning. Besides, a fair, reasonable and diverse assessment strategy would probably measure student performance, provide them with feedback and support their self-regulated learning anyway.
From what Professor Gosling told us, the SMOC has been a success, but I also can’t help but think the broadcast model is somewhat traditional and doesn’t consider constructivist and connectivist approaches to course design that incorporate the network (as a learning environment with peers) and the large number of tools available to enable students to become authors and contribute to course content.
There’s certainly a place for student-generated, curated, moderated and broadcast content (with the teacher and other students responding to that content), particularly within the premise of a television show format. Unfortunately, the broadcast method (without feedback or input from students) of guiding, monitoring and directing students could be considered a fairly regular and popular instructional strategy for those yearning to repetitively deliver learning at scale.