Kollum

Kollum is a concept for a collaborative, location-based, realtime audio experience that takes place in urban or suburban environments. It is an attempt to conceptualise and capture the elements of location-based audio experiences that incorporate elevation or altitude through cumulative and persistent columns of sound. Users can use the audio recording features of their smartphone or mobile device to create a new audio block that begins a column at their location, or add an audio block to an existing column nearby. This animation is an attempt to visually represent the assembly…
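To help me think the concept through, here’s a minimal sketch of how a cumulative, persistent column of sound might be modelled in code. All of the names, and the 30 metre “nearby” radius, are assumptions I’ve made for the example; they aren’t part of the concept itself.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: class names and the "nearby" radius are assumptions.
class AudioBlock {
    final String recordingUri;    // audio captured with the device's recorder
    final String contributorId;

    AudioBlock(String recordingUri, String contributorId) {
        this.recordingUri = recordingUri;
        this.contributorId = contributorId;
    }
}

class SoundColumn {
    final double latitude;
    final double longitude;
    // Blocks stack in the order they are contributed, giving the column its height.
    final List<AudioBlock> blocks = new ArrayList<>();

    SoundColumn(double latitude, double longitude) {
        this.latitude = latitude;
        this.longitude = longitude;
    }

    void addBlock(AudioBlock block) {
        blocks.add(block);    // each contribution raises the column by one block
    }

    int height() {
        return blocks.size();
    }
}

class Kollum {
    static final double JOIN_RADIUS_METRES = 30.0;    // assumed "nearby" threshold
    final List<SoundColumn> columns = new ArrayList<>();

    // Join a nearby column if one exists, otherwise start a new column here.
    SoundColumn contribute(double lat, double lon, AudioBlock block) {
        for (SoundColumn c : columns) {
            if (distanceMetres(lat, lon, c.latitude, c.longitude) <= JOIN_RADIUS_METRES) {
                c.addBlock(block);
                return c;
            }
        }
        SoundColumn fresh = new SoundColumn(lat, lon);
        fresh.addBlock(block);
        columns.add(fresh);
        return fresh;
    }

    // Equirectangular approximation; adequate at neighbourhood scale.
    static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
        double earthRadius = 6371000.0;
        double x = Math.toRadians(lon2 - lon1) * Math.cos(Math.toRadians((lat1 + lat2) / 2));
        double y = Math.toRadians(lat2 - lat1);
        return Math.sqrt(x * x + y * y) * earthRadius;
    }

    public static void main(String[] args) {
        Kollum app = new Kollum();
        app.contribute(-37.81360, 144.96310, new AudioBlock("block1.wav", "user-a"));
        SoundColumn c = app.contribute(-37.81361, 144.96312, new AudioBlock("block2.wav", "user-b"));
        System.out.println("Column height: " + c.height());    // prints 2
    }
}
```

The key idea is simply that a contribution either joins an existing column within range or founds a new one at the user’s location.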

In the absence of Glass

In the absence of my own Google Glass, I’d like some type of wearable rig that can hold a mobile device such as a tablet or mobile phone. This rig would sit on the wearer’s shoulder or be strapped to their chest. It would allow the wearer to view any assistive augment displayed on the screen of the mobile device and also allow the wearer to use both hands to interact with the point of interest. The rig would need to be adjustable to suit the wearer and multiple…

Exploring Daniel Shiffman’s Display RGB, IR, and Depth Images, Point Cloud and Average Point Tracking examples

Getting started with Kinect and Processing by working my way through Daniel Shiffman’s most helpful Display RGB, IR, and Depth Images, Point Cloud and Average Point Tracking examples. Thanks for sharing your code, Daniel.
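For my own notes, here’s a minimal Processing sketch in the spirit of the depth image and average point tracking examples. It assumes a recent version of the Open Kinect for Processing library (method names have changed between library versions), a Kinect v1, and an arbitrary depth threshold; it’s a sketch of the idea rather than a copy of Daniel’s code.

```java
// A minimal Processing sketch in the spirit of the depth image and average
// point tracking examples. Assumes a recent version of the Open Kinect for
// Processing library (method names differ between versions) and a Kinect v1.
// The depth threshold is an arbitrary assumption.
import org.openkinect.freenect.*;
import org.openkinect.processing.*;

Kinect kinect;
int threshold = 700;    // raw depth values below this count as "close"

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  // Show the depth image as the background.
  image(kinect.getDepthImage(), 0, 0);

  // Average the coordinates of every pixel closer than the threshold.
  int[] depth = kinect.getRawDepth();
  float sumX = 0, sumY = 0, count = 0;
  for (int x = 0; x < kinect.width; x++) {
    for (int y = 0; y < kinect.height; y++) {
      int d = depth[x + y * kinect.width];
      if (d > 0 && d < threshold) {
        sumX += x;
        sumY += y;
        count++;
      }
    }
  }

  // Mark the average "closest point" with a red dot.
  if (count > 0) {
    fill(255, 0, 0);
    noStroke();
    ellipse(sumX / count, sumY / count, 32, 32);
  }
}
```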

Sew, grow and mow: An AR lawn art experience

Sew, grow and mow is a multi-user virtual lawn art experience played in real locations, with real dimensions, in real time. It is a continuation of an existing idea about persistently augmenting a space. The experience: Location – Users choose a location nearby, or a location they can easily access, to place their virtual lawn. Suitable locations can be determined by browsing satellite photographs such as Google Maps. They can then choose the design. Design – Users can choose from a library of lawn art design templates or create their own with…
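One way to picture “real locations with real dimensions” is to anchor the lawn grid at a latitude and longitude and map each player’s GPS fix onto a one-metre cell. The sketch below is purely illustrative; the class names, the cell size and the flat-earth approximation are my own assumptions rather than part of the design.

```java
// Illustrative sketch only: class names, the 1 m cell size and the flat-earth
// approximation are my assumptions, not part of the Sew, grow and mow design.
class VirtualLawn {
    final double originLat, originLon;    // south-west corner of the lawn
    final int widthCells, heightCells;
    final double cellSizeMetres = 1.0;
    final boolean[][] mown;               // true once a cell has been "mown"

    VirtualLawn(double originLat, double originLon, int widthCells, int heightCells) {
        this.originLat = originLat;
        this.originLon = originLon;
        this.widthCells = widthCells;
        this.heightCells = heightCells;
        this.mown = new boolean[widthCells][heightCells];
    }

    // Convert a GPS fix to metres east/north of the origin (equirectangular
    // approximation, fine over a lawn-sized area), then to a grid cell.
    boolean mowAt(double lat, double lon) {
        double metresPerDegLat = 111320.0;
        double metresPerDegLon = metresPerDegLat * Math.cos(Math.toRadians(originLat));
        double east = (lon - originLon) * metresPerDegLon;
        double north = (lat - originLat) * metresPerDegLat;
        int cx = (int) Math.floor(east / cellSizeMetres);
        int cy = (int) Math.floor(north / cellSizeMetres);
        if (cx < 0 || cy < 0 || cx >= widthCells || cy >= heightCells) {
            return false;    // the player is standing outside the lawn
        }
        mown[cx][cy] = true;
        return true;
    }

    public static void main(String[] args) {
        // A 20 m x 20 m lawn anchored at an arbitrary location.
        VirtualLawn lawn = new VirtualLawn(-37.81400, 144.96300, 20, 20);
        System.out.println("Mowed a cell: " + lawn.mowAt(-37.81395, 144.96305));    // true
    }
}
```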

Mental note: AR homework

Mental note: I need to do some homework. I need to determine how I can use the Metaio SDK and Unity as an alternative to Blender and Aurasma for developing AR experiences for my VET Development Centre Specialist Scholarship. On the list: AR in the browser with something like the JSARToolKit library and the WebRTC getUserMedia API; Brekel Kinect Pro Face software; Metaio SDK – Fundamentals of SDK and Unity; Metaio SDK – Tutorial 1 – Hello, World!; Metaio SDK – Setting up the Development Environment; Metaio SDK – Getting started with the SDK…

ARcamp: Day 2 – Tuesday 21 May 2013

As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the second day of camp. The work of Thomas Tucker – While not specifically working in the field of AR, Thomas Tucker from Winston-Salem State University spoke about his work with high-end technology such as the FARO scanner and consumer-level technology such as the…

ARcamp: Day 1 – Monday 20 May 2013

As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the first day of camp. Welcome to ARcamp – Danny Munnerley from ARstudio welcomed us all to camp. He spoke about ARstudio as a two-year practical and research project that was nearing completion. He also spoke about the eventual release of a resource that…

VET Development Centre Specialist Scholarship: Event 2 – Professional development

Today I attended the second scheduled event for the VET Development Centre Specialist Scholarship. The event was a professional development session facilitated by Greg Stephens. During the session, Greg presented his unique perspective on the following themes that mapped to the nine units that make up BSB51407 Diploma of Project Management: The People side of Leading Projects; Leading projects in the contemporary workplace; Project Leadership – What’s important?; Leading through the Project life cycle; High performing project teams; Leading teams at each stage of the project cycle; Leading through Project challenges…

ARcamp is coming

I’m pretty stoked to be attending ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May as part of my VET Development Centre Specialist Scholarship. Judging by the recent camp update, it looks like we are all going to be immersed in two days of hands-on augmented reality workshop goodness. They’ve also provided us with a link to the ARcamp schedule and the Augmented Reality in Education Wiki and some links to AR industry players such as BuildAR, Metaio, Junaio and Aurasma to…

ARIG: A prototype camera rig for recording contextual tablet and mobile phone screen activity

ARIG is a camera rig for recording activity on and around the screen of a tablet or mobile phone. The concept for ARIG came from my need to record my experiments with marker and location-based augmented reality experiences. In this example, ARIG records a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.

A simple cube (Blender 2.62 and Aurasma Studio)

Blender 2.62 does a good job of exporting a 3D scene in the Collada (DAE) format for use as an overlay in Aurasma Studio. You just need to make sure you interpret the newest version of the Aurasma 3D Guidelines in a Blender 2.62 context. For a Blender 2.62 user, the most important guidelines to follow are: models need triangulated faces (Edit mode > Ctrl+T); no more than four lamps (lights), although three are recommended; models are to have no more than one map/texture; create a .tar archive to upload…

The challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale

As part of my VET Development Centre Specialist Scholarship I’m in the process of developing my practical skills in designing and building augmented reality learning experiences. One of the experiences I’m currently prototyping is a workplace hazard identification activity. This has brought about an interesting challenge. I’m currently grappling with the challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale. A marker needs to be in view and recognisable at all times for the augment to work…
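To get a feel for why this is hard, here’s a back-of-the-envelope calculation (my own, not from any AR SDK) of how small a floor marker appears on screen once you step back far enough to take in a life-size augment. All of the numbers are illustrative assumptions.

```java
// Back-of-the-envelope check (my own, not from any AR SDK) of how large a
// floor marker appears on screen when the user steps back to view a
// life-size augment. All numbers are illustrative assumptions.
public class MarkerFootprint {

    // Approximate on-screen width, in pixels, of a flat marker of width
    // markerMetres viewed from distanceMetres with the given horizontal field
    // of view and image width. Ignores the foreshortening of a marker lying
    // on the floor, which makes matters worse in practice.
    static double apparentWidthPixels(double markerMetres, double distanceMetres,
                                      double horizontalFovDegrees, int imageWidthPixels) {
        double viewWidthMetres = 2.0 * distanceMetres
                * Math.tan(Math.toRadians(horizontalFovDegrees / 2.0));
        return imageWidthPixels * (markerMetres / viewWidthMetres);
    }

    public static void main(String[] args) {
        // A 0.2 m printed marker on the floor, seen from 3 m away through a
        // tablet camera with a ~60 degree horizontal field of view at 1280 px.
        double px = apparentWidthPixels(0.2, 3.0, 60.0, 1280);
        System.out.printf("Marker spans roughly %.0f px across the frame%n", px);
        // A result well under ~100 px, viewed at a shallow angle, suggests the
        // marker is unlikely to stay recognisable while the augment is explored.
    }
}
```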

VET Development Centre Specialist Scholarship: Event 1 – Induction

This year I was fortunate enough to be granted a VET Development Centre Specialist Scholarship. Specialist Scholarships are available to non-teaching staff who wish to develop their skills, capability and professional standing within the VET system. Among the services provided by specialist staff are student support, student administration, human resources, learning design, records management, purchasing, learning resources, information technology, occupational health and safety and financial management. The Specialist Scholarship Program focuses on the professional development of non-teaching staff in the context of high-level administrative and specialist tasks required of…

Concept: Augmented advertising could reduce litter on the streets

Advertising taped to the footpath at pedestrian crossings is a common sight in areas of Busan, such as Seomyeon, that are frequented by young people and university students. As a location for advertising it is ideal: it is one of the rare times busy pedestrians will stop, if only for a short time. That’s long enough to catch their eye with some brightly coloured paper. Placement on the footpath also increases the likelihood that a pedestrian will view the advertisement as they gaze down at…
