-
Explorations in augmented reality presentation for the E-LEARNING INDUSTRY ASSOCIATION event
This afternoon I attended the E-LEARNING INDUSTRY ASSOCIATION PechaKucha event at the Centre for Adult Education (CAE) in Melbourne’s CBD and presented Explorations in augmented reality, which described my experience of prototyping an augmented reality experience as part of my VET Development Centre Specialist Scholarship. The event also featured presentations from Aliki Kompogiorgas (Box Hill Institute) and Scott Wallace (Nine Lanterns). Aliki presented on the six niche MOOCs for VET currently being developed by the Institute, while Scott presented on interactive storytelling and the use of narrative in learning experiences.
-
Augmented contextual instruction user experience (Image tracking)
Screen captures from the completed image-tracking augmented contextual instruction user experience. The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.
-
Augmented contextual instruction user experience (Object tracking)
Screen captures from the completed object-tracking augmented contextual instruction user experience. The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.
-
Save. Upload. Wait. Test. Tweak. Save again. Upload again. Wait again. Test again. Tweak again. Almost there!
Save. Upload. Wait. Test. Tweak. Save again. Upload again. Wait again. Test again. Tweak again. Almost there! And another thing. It’s been 11 days since I posted my issue with Z translation of buttons in UI Designer of Metaio Creator 2.6 to the Helpdesk of the Metaio Developer Portal and there hasn’t been one response! Damn.
-
Placing geometry in the 3D point cloud in Metaio Creator 2.6
Placing geometry in the 3D point cloud in Metaio Creator 2.6 is a little cumbersome, bewildering and often inaccurate. It takes lots of clicking and seemingly random fiddling to place geometry. You also can’t really determine if your geometry is placed correctly within the 3D point cloud until you upload to your channel and then view your channel in Junaio on your device. If it’s not placed correctly, you need to tweak it in Creator, publish it and then view it again in Junaio. Repeat process until it looks like it might almost be placed correctly. Curious.
-
Lost in Metaio Creator’s Z translation
I’m currently wrestling with the correct Z translation of images and buttons in the UI Designer of Creator 2.6. I’ve placed some buttons on top of a background image in the UI Designer of Creator and have encountered a problem where the image obscures the buttons and prevents them from working. I’ve checked the Z translation of the image in relation to the buttons and have made sure the buttons are translated above the background image, but this doesn’t seem to make a difference when the published channel is viewed in Junaio on my iPad. I’ve posted my issue with Z translation of buttons in UI Designer of Creator 2.6 to the…
-
One to prepare and one to publish
One to prepare the required channel and associated assets and one with the only license to publish. Not the best workflow, but it’s the only way at this time.
-
Two channels published without the extras
After two days of waiting, my Step 4: Remove piston seal from caliper and Step 1: Remove bracket from caliper channels on Junaio have been made public. This is good news, but I think I went a little early on the publish because I failed to include extras such as buttons for the instructional video, learner resources and a link to the next step in the process. The geometry is also misplaced. I’m now trying to upload an update to the published channel and it doesn’t seem to be working. You might have to unpublish the channel, upload changes, publish it and then wait two more days for it to be approved. Nope, you can just upload…
-
Geometry for brake caliper augment: Step 4: Remove piston seal from caliper
Previsualising the pointed tool, piston seal and rear brake caliper geometry for the Step 4: Remove piston seal from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared by FBXMeshConverter for import into Creator, then uploaded to my Metaio channel for final use as an augment. Figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction. Important tip: Remember to export your FBX from Blender to your Desktop and not anywhere else on your computer. For some unknown or unexplained reason FBX files exported to anywhere else but the Desktop do not seem to work properly…
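For my own reference, here’s a minimal sketch of that export step as a Blender Python snippet (2.6x-era API). The output file name is just a placeholder, and writing to the Desktop is simply a workaround for the quirk described above.

```python
# Minimal Blender Python sketch of the FBX export step described above.
# Assumptions: run inside Blender (2.6x-era API), the objects for this step
# are selected in the 3D View, and the Desktop path works around the quirk
# where FBX files exported elsewhere don't behave in FBXMeshConverter/Creator.
import os
import bpy

desktop = os.path.join(os.path.expanduser("~"), "Desktop")
out_path = os.path.join(desktop, "step4_remove_piston_seal.fbx")  # placeholder name

bpy.ops.export_scene.fbx(
    filepath=out_path,
    use_selection=True,  # export only the selected geometry for this step
)
print("Exported:", out_path)
```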
-
Supplementary material for augmented contextual instruction
These four instructional videos are supplementary material for the four augmented contextual instruction sequences I’m developing as part of my VET Development Centre Specialist Scholarship.
-
Geometry for brake caliper augment: Step 3: Remove piston from caliper
Previsualising the piston, boot and arrow geometry for the Step 3: Remove piston from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared by FBXMeshConverter for import into Creator, then uploaded to my Metaio channel for final use as an augment. Figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.
-
Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper.
Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper. Augmented contextual instruction is to be prepared in Metaio Creator, published to a Metaio channel and then accessed by learners through the Junaio app. 3D point cloud data is gathered with Metaio Toolbox. This video conceptualises one stage of the disassembly of a brake caliper. [Edit: Tuesday 13 August 2013] This video incorrectly conceptualises one part of the brake caliper disassembly process. A screwdriver is not used to extract the piston, seal and rubber boot from the caliper.
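As a rough illustration of the low-poly preparation step, here’s a small Blender Python sketch (2.6x-era API) that adds and applies a Decimate modifier to the selected caliper parts before export. The target ratio is an assumption for illustration, not a value I’ve settled on.

```python
# Rough sketch: reduce the polygon count of the selected objects with a
# Decimate modifier before exporting for use as an augment. The 0.3 ratio
# is an illustrative assumption, not a value from the original post.
import bpy

TARGET_RATIO = 0.3  # keep roughly 30% of the faces (hypothetical)

for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    mod = obj.modifiers.new(name="LowPoly", type='DECIMATE')
    mod.ratio = TARGET_RATIO
    # Apply the modifier so the exported FBX carries the reduced mesh.
    bpy.context.scene.objects.active = obj  # 2.6x-era active-object assignment
    bpy.ops.object.modifier_apply(modifier=mod.name)
```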
-
Prototyping augmented contextual instruction with Metaio Creator (Demo mode) and Toolbox
For this prototype, I’m using Metaio Creator’s image and object recognition features, along with Toolbox, in restrictive Demo Mode to further explore some aspects of my concept of augmented contextual instruction. Unfortunately, in Demo Mode I can’t use the excellent 3D point cloud data captured in Toolbox to prototype all aspects of my augmented contextual instruction. In Demo Mode augments can only be triggered by a QR code, which is kinda okay for testing while you’re building. I’m thinking about buying a license. This video shows some of the features of the augmented contextual help I’m trying to prototype with Metaio Creator in Demo Mode.
-
Conceptualising proximity-based level of detail for augmented reality experiences incorporating 3D geometry
Attempting to conceptualise proximity-based level of detail for augmented reality experiences such as Sew, grow and mow that incorporate 3D geometry.
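To make the idea a little more concrete, here’s a small, engine-agnostic sketch of what proximity-based level of detail could look like: the distance between the viewer and the tracked point of interest selects which version of the geometry is displayed. The distance thresholds and model names are assumptions for illustration only.

```python
# Engine-agnostic sketch of proximity-based level of detail (LOD).
# Distance thresholds (in metres) and model file names are illustrative
# assumptions; a real AR implementation would read the camera-to-target
# distance from the tracking system each frame.
import math

# (max_distance_m, model) pairs ordered from nearest/most detailed to farthest.
LOD_LEVELS = [
    (2.0, "mower_high_detail.obj"),
    (8.0, "mower_medium_detail.obj"),
    (float("inf"), "mower_low_detail.obj"),
]

def select_lod(camera_pos, target_pos):
    """Pick the model to display based on viewer-to-target distance."""
    distance = math.dist(camera_pos, target_pos)
    for max_distance, model in LOD_LEVELS:
        if distance <= max_distance:
            return model

# Example: viewer standing 5 m from the augment's anchor point.
print(select_lod((0.0, 0.0, 0.0), (3.0, 4.0, 0.0)))  # -> mower_medium_detail.obj
```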
-
Concept: Augmented contextual instruction
Augmented contextual instruction provides users with procedural demonstrations based on recognisable features and attributes of an object. Augmented contextual instruction could be used in vocational training and assessment contexts. Users can add (record), edit and share their own contextual instruction.
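As a loose sketch only, one step of augmented contextual instruction might be represented as data something like this. The field names and example values are my own assumptions, not part of the concept itself.

```python
# Hypothetical data structure for one step of augmented contextual instruction.
# Field names and values are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InstructionStep:
    title: str                       # e.g. "Step 1: Remove bracket from caliper"
    trigger: str                     # recognisable feature: image target or object point cloud
    geometry: str                    # 3D model shown as the augment
    video_url: Optional[str] = None  # supplementary instructional video
    resources: List[str] = field(default_factory=list)  # learner resources
    next_step: Optional[str] = None  # link to the following step's channel

step1 = InstructionStep(
    title="Step 1: Remove bracket from caliper",
    trigger="caliper_point_cloud.3dmap",
    geometry="bracket_and_caliper.fbx",
    next_step="Step 2",
)
```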
-
Blender selection modes: The sync selection button
Once you’ve created seams in your mesh and then unwrapped it, you’ll then probably want to move contiguous groups of faces or UV islands into a location that will make it easier for you to create and then paint a texture. Before you can start moving your UV islands you’ll need to change your selection mode. Sync selection is an extremely useful selection mode that allows you to select separate UV islands and then rotate, scale and transform them in the UV/Image editor without affecting the corresponding elements in the 3D editor. You can activate and then deactivate Sync selection by selecting the Sync selection button in the header of the…
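If you’d rather flip it from Blender’s Python console than hunt for the header button, the same setting is exposed on the scene’s tool settings. A one-line sketch, assuming the standard Blender Python API:

```python
# Toggle UV sync selection from the Python console instead of the UV/Image
# editor header button.
import bpy

tool_settings = bpy.context.scene.tool_settings
tool_settings.use_uv_select_sync = not tool_settings.use_uv_select_sync
print("Sync selection is now", "on" if tool_settings.use_uv_select_sync else "off")
```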
-
Kollum
Kollum is a concept for a collaborative location-based realtime audio experience that takes place in urban or suburban environments. Kollum is an attempt to conceptualise and capture the elements of location-based audio experiences that incorporate elevation or altitude through cumulative and persistent columns of sound. Users can use the audio recording features of their smartphone or mobile device to create a new audio block that makes up a column at their location or add an audio block to an existing column nearby. This animation is an attempt to visually represent the assembly of a number of sound columns in an urban space (Does not represent actual experience…YET!).
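To capture the idea a little more concretely, here’s a speculative sketch of how sound columns might be modelled: audio blocks accumulating into persistent columns keyed by a grid-snapped location. The grid size and field names are assumptions for illustration only.

```python
# Speculative sketch of Kollum's sound columns: audio blocks accumulate into
# persistent columns keyed by a grid-snapped location. Grid size and field
# names are illustrative assumptions only.
from collections import defaultdict
from dataclasses import dataclass

GRID = 0.0001  # roughly 10 m of latitude/longitude per column cell (assumption)

@dataclass
class AudioBlock:
    contributor: str
    recording_url: str
    elevation_m: float  # altitude at which the block was recorded

def column_key(lat, lon):
    """Snap a position to the grid cell that identifies its column."""
    return (round(lat / GRID), round(lon / GRID))

columns = defaultdict(list)  # column key -> stacked audio blocks

def add_block(lat, lon, block):
    """Add a block to the column at this location, kept sorted by elevation."""
    key = column_key(lat, lon)
    columns[key].append(block)
    columns[key].sort(key=lambda b: b.elevation_m)

add_block(-37.8136, 144.9631, AudioBlock("rowan", "block_001.m4a", 12.0))
```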
-
In the absence of Glass
In the absence of my own Google Glass, I’d like some type of wearable rig that can hold a mobile device such as a tablet or mobile phone. This rig would sit on the wearer’s shoulder or be strapped to their chest. It would allow the wearer to view any assistive augment displayed on the screen of the mobile device and also allow the wearer to use both hands to interact with the point of interest. The rig would need to be adjustable to suit the wearer and multiple types of devices to prevent a confronting experience if the mobile device was placed too close to the user’s face.…
-
Mental note: AR homework
Mental note. I need to do some homework. I need to determine how I can use the Metaio SDK and Unity as an alternative to Blender and Aurasma for developing AR experiences for my VET Development Centre Specialist Scholarship.
AR in the browser with something like JSARToolKit library with the WebRTC getUserMedia API
Brekel Kinect Pro Face software
Metaio SDK – Fundamentals of SDK and Unity
Metaio SDK – Tutorial 1 – Hello, World!
Metaio SDK – Setting up the Development Environment
Metaio SDK – Getting started with the SDK
Metaio – Getting started
Unity3D
Unity – Learn modules
-
ARcamp: Day 2 – Tuesday 21 May 2013
As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the second day of camp.
The work of Thomas Tucker
While not specifically working in the field of AR, Thomas Tucker from Winston-Salem State University spoke about his work with high-end technology such as the FARO scanner and consumer-level technology such as the Kinect. We also got to play with the FARO scanner and scan the Inspire Centre and surrounding area. There are more…
-
ARcamp: Day 1 – Monday 20 May 2013
As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the first day of camp.
Welcome to ARcamp
Danny Munnerley from ARstudio welcomed us all to camp. He spoke about ARstudio as a two-year practical and research project that was nearing completion. He also spoke about the eventual release of a resource that compiles their research findings and that this year would be the last camp. The welcome also featured an update on…
-
ARcamp is coming
I’m pretty stoked to be attending ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May as part of my VET Development Centre Specialist Scholarship. Judging by the recent camp update, it looks like we are all going to be immersed in two days of hands-on augmented reality workshop goodness. They’ve also provided us with a link to the ARcamp schedule and the Augmented Reality in Education Wiki and some links to AR industry players such as BuildAR, Metaio, Junaio and Aurasma to help us prepare. Awesome.
-
ARIG: A prototype camera rig for recording contextual tablet and mobile phone screen activity
ARIG is a camera rig for recording activity on and around the screen of a tablet or mobile phone. The concept for ARIG came from my need to record my experiments with marker and location-based augmented reality experiences. In this example, ARIG records a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
-
The challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale
As part of my VET Development Centre Specialist Scholarship I’m in the process of developing my practical skills in designing and building augmented reality learning experiences. One of the experiences I’m currently prototyping is a workplace hazard identification activity. This has brought about an interesting challenge. I’m currently grappling with the challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale. A marker needs to be in view and recognisable at all times for the augment to work. An augment containing a 3D object not modelled to scale can be easily triggered and engaged with by a marker…
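A rough back-of-the-envelope way to see the problem: while aiming the camera at the centre of a life-size augment, the marker on the floor has to stay within the camera’s vertical field of view. The numbers below (camera height, field of view, augment height) are assumptions purely for illustration.

```python
# Back-of-the-envelope check: while aiming at the centre of a life-size augment,
# does the floor marker at its base stay inside the camera's vertical field of view?
# All numbers are illustrative assumptions, not measurements from the project.
import math

CAMERA_HEIGHT_M = 1.5       # device held at roughly chest/eye height
VERTICAL_FOV_DEG = 48.0     # assumed vertical field of view of a tablet camera
AUGMENT_HEIGHT_M = 1.8      # life-size 3D object, e.g. a person-sized hazard

def marker_in_view(distance_m):
    """True if the floor marker stays in frame while aiming at the augment's centre."""
    # Angle (below horizontal, negative) down to the marker on the floor.
    angle_to_marker = math.degrees(math.atan2(-CAMERA_HEIGHT_M, distance_m))
    # Angle to the augment's vertical centre, also measured from horizontal.
    angle_to_centre = math.degrees(math.atan2(AUGMENT_HEIGHT_M / 2 - CAMERA_HEIGHT_M, distance_m))
    # The marker must sit within half the vertical FOV of the aiming direction.
    return abs(angle_to_marker - angle_to_centre) <= VERTICAL_FOV_DEG / 2

for d in (1.0, 2.0, 4.0, 8.0):
    print(f"{d} m away: marker {'stays in view' if marker_in_view(d) else 'leaves the frame'}")
```

With these assumed numbers the marker already drops out of frame at close range, which matches the problem described above.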
-
Your device does not support this type of Aura
Your device does not support this type of Aura.
-
Dr Softain’s emergency broadcast
Dr Sigmeund Softain is a specialist surgeon. He was responsible for pioneering The Softain Biopsy medical procedure. Dr Softain’s research is experimental and ongoing. He tweets his day-to-day research activities and also discusses his work and recent findings on his blog. One day everything changed. Something went wrong in Dr Softain’s laboratory. An experimental medical procedure went awry. An outbreak. Amidst all the chaos and confusion Dr Softain recorded his final broadcast. An emergency broadcast warning us of an unknown peril.
Dr Softain’s emergency broadcast
Outside Dr Softain’s laboratory
All that remained was a pool of blood and Dr Softain’s laboratory pass.
User engagement
I recorded and edited Dr Softain’s emergency video on my iPad. I then used…
-
Concept: Augmented advertising could reduce litter on the streets
Advertising taped to the footpath at pedestrian crossings is a common sight in areas of Busan, such as Seomyeon, that are frequented by young people and university students. As a location for advertising it is ideal. It is one of the rare times busy pedestrians will stop, if only for a short time. That’s long enough to catch their eye with some brightly coloured paper. Placement on the footpath also increases the chance that a pedestrian will view the advertisement as they gaze down at their smartphone. Accidental line of sight! The downside of this type of advertising is the large volume of waste paper.…