VET Development Centre CEO Denise Stevens opens proceedings for the 2013 Specialist Scholar graduation ceremony.
Today I attended the graduation ceremony for all 2013 VET Development Centre Specialist Scholars, the final event for all recipients of this year's Specialist Scholarship. The event featured an opening address and the presentation of certificates and gifts to all scholars by VET Development Centre CEO Denise Stevens.
The graduation ceremony also featured an amazing presentation from Terese McAleese, Director of Learning at Swinburne University of Technology. Terese spoke about her journey as an immigrant from 1980s Belfast to Melbourne, as well as her journey in vocational education and training (VET): from student to administrator and teacher, and finally to a professional responsible for designing learning experiences. Terese also spoke about the transformational nature of the VET sector and TAFE, a place where skills can be developed for direct application in industry, and a place for second chances, for opportunities for education that might have been missed the first time. The chance for a new or better life.
Screen capture from Metaio Creator. Untextured 3D geometry placed within the point cloud of a mapped environment. Once published to a junaio channel, the 3D geometry will be placed over the environment when browsed in the junaio AR browser.
Screen captures from the completed image-tracking augmented contextual instruction user experience. The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
Insert the air tool into the fluid inlet port of the caliper.
Use a pointed tool to remove the piston seal from the caliper.
The instructional poster provides the user with an entry point to the object-based or image-based ‘How to disassemble a rear brake caliper’ augmented contextual instruction.
Screen captures from the completed object-tracking augmented contextual instruction user experience. The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.
Place the assembled rear brake caliper on the workbench with the retaining bolts facing towards you.
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
Remove the bracket from the caliper and then place the bracket on the workbench facing towards you.
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
Put the bracket to one side. Place the caliper upside down on the workbench with the inlet port facing away from you.
Insert the air tool into the fluid inlet port of the caliper.
Turn the caliper over with the cylinder bore facing towards you.
Use a pointed tool to remove the piston seal from the caliper.
You have now completed the final step in disassembling a rear brake caliper. Remember to inspect and clean all parts before reassembly.
Rear brake caliper placed in the correct position for Step 1: Remove bracket from brake caliper.
Bracket placed in correct position for Step 2: Inspect and clean retaining bolts.
Brake caliper placed in correct position for Step 3: Remove piston from brake caliper.
Caliper placed in correct position for Step 4: Remove piston seal from brake caliper.
Channel viewed in Junaio AR browser (Portrait).
Channel viewed in Junaio AR browser (Landscape).
Channel viewed in Junaio AR browser (Portrait).
Channel viewed in Junaio AR browser (Landscape).
Placing geometry in the 3D point cloud in Metaio Creator 2.6 is a little cumbersome, bewildering and often inaccurate. It takes lots of clicking and seemingly random fiddling to place geometry. You also can't really determine if your geometry is placed correctly within the 3D point cloud until you upload to your channel and then view your channel in Junaio on your device. If it's not placed correctly, you need to tweak it in Creator, publish it and then view it again in Junaio. Repeat the process until it looks like it might almost be placed correctly. Curious.
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
The third step in disassembling a rear brake caliper is to remove the piston from the caliper.
The fourth and final step in disassembling a rear brake caliper is to remove the piston seal from the caliper.
Screen captures from my image-tracking tests in Junaio for each step of the disassembly process, with UI elements. Unfortunately, due to my Z-translation problems in the UI Designer of Creator 2.6, I had to remove the background image that was placed underneath the buttons. Doing this gives the buttons a floaty and slightly random feel, but at least the buttons function as they should!
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
The third step in disassembling a rear brake caliper is to remove the piston from the caliper.
The fourth and final step in disassembling a rear brake caliper is to remove the piston seal from the caliper.
I'm currently wrestling with the correct Z translation of images and buttons in the UI Designer of Creator 2.6. I've placed some buttons on top of a background image in the UI Designer and have encountered a problem where the image obscures the buttons and prevents them from working. I've checked the Z translation of the image in relation to the buttons and have made sure the buttons are translated above the background image, but this doesn't seem to make a difference when the published channel is viewed in Junaio on my iPad.
While attempting to solve this problem on my own in Creator, I experimented with placing the buttons only partially over the background image and then publishing to my channel. This had an interesting result: the buttons display and function correctly!
Here are some screen captures of the properties windows for the background image and each button in the UI Designer of Creator and the published channel viewed in Junaio on my iPad.
The properties window for the background image. The image has a 0.0 Z translation.
The properties window for the button placed over the background image. The button has a 5.0 Z translation.
The properties window for a button placed partially over the background image. The button has a 5.0 Z translation.
The placement of UI elements in the published channel viewed in Junaio on my iPad.
One to prepare the required channel and associated assets, and one with the only license to publish. Not the best workflow, but it's the only way at this time.
Event 3: Knowledge sharing. The third and final event for all 2013 Specialist Scholars.
Today I attended the third and final scheduled event for the VET Development Centre Specialist Scholarship. The event was a knowledge-sharing event where all specialist scholars presented their findings and outcomes from their participation in the program. Unlike Event 1 and Event 2, this event gave everyone the opportunity to share what they've learned.
Surprised, but not surprised, by the broad range of topics presented by all specialist scholars.
The Tweets were correct! The diversity of presentations from each scholar and the range of application and utilisation was amazing. From study programmes, workshops and conferences to the creation of artefacts, each scholar had made the most of their time in a useful and productive way. Thanks, VET Development Centre, for giving us all the opportunity to take part in the Specialist Scholar program.
If you’re interested, you can download the PDF version (6.7MB) or the PowerPoint version (6.9MB) of my VET Development Centre Specialist Scholarship: Event 3 – Knowledge sharing presentation.
After two days of waiting, my Step 4: Remove piston seal from caliper and Step 1: Remove bracket from caliper channels on Junaio have been made public. This is good news, but I think I went a little early on the publish because I failed to include extras such as buttons for the instructional video, learner resources and a link to the next step in the process. The geometry is also misplaced. I'm now trying to upload changes to update the published channel and it doesn't seem to be working. You might have to unpublish the channel, upload changes, publish it and then wait two more days for it to be approved.
Nope, you can just upload the Creator file to the server again and it will ask you to update the already published channel. For me, Creator and the Metaio/Junaio platform lack some expectation-outcomes-experience-scaffolding for users. This software, platform and service can be a little bewildering at times.
This recording shows my first two channels published to Junaio. Both channels feature incorrect placement of geometry and missing user interface. It’s a work in progress.
Previsualising the pointed tool, piston seal and rear brake caliper geometry for the Step 4: Remove piston seal from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared with FBXMeshConverter for import into Creator, then uploaded to my Metaio channel for final use as an augment. Figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.
Important tip: Remember to export your FBX from Blender to your Desktop and not anywhere else on your computer. For some unknown or unexplained reason, FBX files exported anywhere other than the Desktop do not seem to work properly when converted using FBXMeshConverter.
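If you find yourself exporting over and over, this step can be scripted from inside Blender. Here's a minimal sketch of my own (not part of the original workflow); it assumes Blender 2.6x's Python API and uses a hypothetical file name:

```python
# A minimal sketch: export the scene as FBX straight to the Desktop, ready
# for FBXMeshConverter. Run from Blender's Python console (Blender 2.6x).
import os
import bpy

desktop = os.path.join(os.path.expanduser("~"), "Desktop")

# "caliper_step4.fbx" is a hypothetical file name used for illustration.
bpy.ops.export_scene.fbx(filepath=os.path.join(desktop, "caliper_step4.fbx"))
```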
Screen captures from 3D point cloud data gathering with Metaio Toolbox for sequences to be used in my augmented contextual instruction on the rear brake caliper. The 3D point cloud data will then be used as tracking technology to place my augments.
A rear brake caliper. This 3D point cloud will be used to place an augment for the first step of the rear brake caliper disassembly sequence.
The bracket from a rear brake caliper. This 3D point cloud data will be used to place an augment for the second step of the rear brake caliper disassembly sequence.
An overturned rear brake caliper. This point cloud data will be used to place an augment for the third step of the rear brake caliper disassembly sequence.
A rear brake caliper with the piston removed. This point cloud data will be used to place an augment for the fourth and final step of the rear brake caliper disassembly sequence.
Metaio specify iTunes for the transfer of 3D point cloud data from your iPad to your computer.
Captions can help to provide an equivalent learning experience for viewers who may be hearing impaired, speak other languages or use assistive technology. Captions are also valuable in a teaching and learning context where it may be impractical for learners to wear headphones or play video at high volume in a group training environment such as a workshop, classroom or laboratory.
Previsualising the piston, boot and arrow geometry for the Step 3: Remove piston from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared with FBXMeshConverter for import into Creator, then uploaded to my Metaio channel for final use as an augment. Figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.
Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper. Augmented contextual instruction is to be prepared in Metaio Creator, published to a Metaio channel and then accessed by learners through the Junaio app. 3D point cloud data is gathered with Metaio Toolbox.
This video conceptualises one stage of the disassembly of a brake caliper.
Using Metaio Toolbox to map the brake caliper and create a 3D point cloud.
Brake caliper with components in partial disassembly.
Three components that make up the brake caliper.
Components that make up the brake caliper.
Roughly storyboarding the brake caliper disassembly process. Arrows show direction and movement of the point of interest.
Once the piston has been loosened, its rubber boot can be removed from the brake caliper.
[Edit: Tuesday 13 August 2013] This video incorrectly conceptualises one part of the brake caliper disassembly process. A screwdriver is not used to extract the piston, seal and rubber boot from the caliper.
The obtuse futuristic device that can only be serviced by a fearless technician with a little help from some augmented contextual instruction.
For this prototype, I'm using the image and object recognition features of Metaio Creator in restrictive Demo Mode to further explore some aspects of my concept of augmented contextual instruction. Unfortunately, in Demo Mode I can't use the excellent 3D point cloud data captured in Toolbox to prototype all aspects of my augmented contextual instruction. In Demo Mode augments can only be triggered by a QR code, which is kinda okay for testing while you're building. I'm thinking about buying a license.
This video shows some of the features of the augmented contextual help I’m trying to prototype with Metaio Creator in Demo Mode.
Point cloud with 3D object. The 3D object provides instruction to the learner on how to rotate the component.
Working with the point cloud in Metaio Creator.
Working within the constraints of Demo mode. Placing a button that launches additional resources about the component.
Prototyping AR concepts to determine the value of Creator is challenging, particularly due to the extremely tight constraints of Demo mode. Workaround – Make a number of different channels with single features!
Brainstorming additional screen elements to support the learner during contextual help/instruction.
Creating and editing UV layout for 3D object augment.
3D geometry exported from Blender as an FBX and then processed with FBXMeshConverter to make it suitable for Creator.
Augmented contextual instruction provides users with procedural demonstrations based on recognisable features and attributes of an object. Augmented contextual instruction could be used in vocational training and assessment contexts. Users can add (record), edit and share their own contextual instruction.
Planning the augmented contextual instruction with pen and envelope.
Planning with pen, highlighters and post-it notes.
Mental note: I need to do some homework and determine how I can use the Metaio SDK and Unity as an alternative to Blender and Aurasma for developing AR experiences for my VET Development Centre Specialist Scholarship.
Thomas Tucker from Winston-Salem State University talks about many of his projects.
The work of Thomas Tucker
While not specifically working in the field of AR, Thomas Tucker from Winston-Salem State University spoke about his work with high-end technology such as the FARO scanner and consumer-level technology such as the Kinect. We also got to play with the FARO scanner and scan the INSPIRE centre and surrounding area.
Rob Fitzgerald discusses the challenges of designing, developing and implementing programs in the field.
Rob Fitzgerald
Rob Fitzgerald spoke about the Agriculture Sector Linkages Program in Pakistan (ASLP2) that was co-developed by the University of Canberra and local communities and government in Pakistan. Although ASLP2 didn’t contain any AR experiences, Rob did discuss universal themes such as the importance of engaging and considering learners/intended audience/market/users during the design process and the role of technology in the solution.
Meet Trak Lord (US Marketing and Communications) from Metaio
Trak Lord from Metaio
Trak Lord (US Marketing and Communications) from Metaio Skyped into ARcamp to present examples of AR in education and also talk about what they are doing in the field of AR and the future of their product. Although it was treated as incidental amongst all the 3D dinosaurs, the 3D object/image recognition feature of the Metaio SDK that enabled the diagnostic/procedural instruction manual augment on the Mitsubishi air-conditioner was my personal highlight. This was the only practical, useful or vocational example of AR as a value-add or assistive tool in his suite of marketing videos.
Amber Standley talks APositive and other AR design case studies.
Amber Standley
Amber Standley presented two AR case studies that demonstrated possible implementations of AR. The first AR case study was a marker-based experience that allowed users to learn more about the University of Canberra’s emissions by scanning a poster. The user was then able to:
identify and compare emission levels from buildings that make up the University of Canberra campus
review past emission levels (performance)
display weekly information on buildings with low emissions
display discussions about each building.
The second AR case study was also a marker-based experience that allowed users to learn more about distinguished alumni from the University of Canberra by scanning photographs mounted in an engraved frame. Each photograph triggered a video which displayed biographical and additional information about each distinguished alumnus.
Paul Krix talks Unity and the Metaio SDK.
Paul Krix
Paul Krix provided a very general overview of creating a mobile AR app for iOS or Android with Unity and the Metaio SDK (SDK can be downloaded from the Metaio website). The MetaioMan featured in his presentation.
Danny Munnerley from ARstudio welcomed us all to camp. He spoke about ARstudio as a two-year practical and research project that was nearing completion. He also spoke about the eventual release of a resource compiling their research findings, and mentioned that this year would be the last camp.
Rob Manson talks about the current state of play in AR.
Tony and Dean from AIE talk about their experiences developing 35° 17 South.
35° 17 South
Tony Oakden and Dean Walshe from the Academy of Interactive Entertainment (AIE) spoke about their experiences developing 35° 17 South, a multi-reality location-based game that took place in April on the grounds of the National Gallery of Australia. Although it didn't contain elements of AR, many aspects of its design, development and implementation reflected issues typically encountered when producing technology-based activities.
Alex Young talks about elements that make up the Design 29 exhibition at the National Archives of Australia.
Design 29: Creating a capital
Alex Young spoke about the issues and development concerns she encountered, and the insights she gained, from developing AR experiences for the Design 29: Creating a capital exhibition at the National Archives of Australia. The exhibition brought together the original designs for Canberra by finalists of the 1911 Federal Capital City Design Competition and featured AR elements such as video, animation and detailed scans of original artwork.
Stephen Barrass fields questions about the Garden of Australian Dreams.
Garden of Australian Dreams
Stephen Barrass spoke about his recent project Garden of Australian Dreams as well as his past AR projects AVIARy, Edible Audience and Vanishing Point. Pretty awesome stuff.
Danny Munnerley introduces the Mini design challenge.
AR mini design challenge
The AR mini design challenge was facilitated by Danny Munnerley and Matt Bacon. The challenge was to form a group, brainstorm, design an AR experience on paper and then present the concept to the camp. Prizes were awarded to the best AR experiences. The AR mini design challenge was an awesome exercise in design thinking, where we needed to discover, interpret, ideate, experiment and then evolve a concept.
Matt Bacon talks Studio Aurasma.
Studio Aurasma and the Aurasma iPad app
Matt Bacon presented a session on designing and developing simple AR experiences with the Aurasma iPad app and the browser-based Studio Aurasma. This session gave us the opportunity to create a simple marker-based video augment with the iPad app as well as explore the features of Studio Aurasma.
Facilitator Greg Stephens got us all thinking about leading projects and teams in this group activity.
Today I attended the second scheduled event for the VET Development Centre Specialist Scholarship. The event was a professional development session facilitated by Greg Stephens. During the session, Greg presented his unique perspective on the following themes that mapped to the nine units that make up BSB51407 Diploma of Project Management:
The People side of Leading Projects
Leading projects in the contemporary workplace
Project Leadership – What’s important?
Leading through the Project life cycle
High performing project teams
Leading teams at each stage of the project cycle
Leading through Project challenges
Addressing resistance; when things go wrong; managing yourself
Greg also initiated lively discussion and activities, and presented examples of:
model project success factors such as:
Pinto and Slevin's 1988 list of 10 project success factors
Turner’s 1999 project drivers diagram
differences between project management and project leadership
project leader qualities
characteristics of effective leaders
characteristics of an effective team
situational leadership model
work preferences – team management systems
Belbin’s Team role summary descriptions
what drives performance and engagement
coaching with the GROW model
Tuckman’s four-stage (Forming, Storming, Norming and Performing) group development model with Edison’s (Informing, Conforming and Deforming) expansion
tips for influencing, persuading and resolving conflict
guidelines for leading change
managing employee resistance to change
responding to indifference or anger
a resilience framework (Vision, Determination, Interaction, Relationships, Problem solving, Organisation and Self confidence)
ARIG is a camera rig for recording activity on and around the screen of a tablet or mobile phone. The concept for ARIG came from my need to record my experiments with marker-based and location-based augmented reality experiences.
In this example, ARIG records a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
The original concept sketch for ARIG.
Concept sketch for the Light sock component of ARIG. The Light sock prevents reflections on the iPad surface when recording augments outside.
Pattern block for the Light sock component of ARIG.
Unpainted ARIG with tablet mount.
Unpainted ARIG with mobile phone mount.
The complete ARIG kit with light sock components and the iPad, mobile phone and DSLR required to record the experience.
A simple 3D cube augment created with Blender and Aurasma Studio
Blender 2.62 does a good job of exporting a 3D scene in the Collada (DAE) format for use as an overlay in Aurasma Studio. You just need to make sure you interpret the newest version of the Aurasma 3D Guidelines in a Blender 2.62 context. For a Blender 2.62 user the most important guidelines to follow are:
Models need triangulated faces (Edit mode > Ctrl+T)
No more than four lamps (lights) although three are recommended
Models are to have no more than one map/texture
Create a .tar archive to upload to Aurasma Studio made up of a .dae (exported from Blender 2.62), a .png (texture) and a .png thumbnail (256 x 256). A quick sketch of this step follows.
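Building that .tar archive only takes a couple of lines of Python. This is a minimal sketch of my own, not from the Aurasma guidelines, and the file names are hypothetical:

```python
# A minimal sketch: bundle the exported assets into the .tar archive that
# Aurasma Studio expects. File names are hypothetical examples.
import tarfile

with tarfile.open("cube_overlay.tar", "w") as tar:
    tar.add("cube.dae")       # scene exported from Blender 2.62
    tar.add("texture.png")    # the single map/texture
    tar.add("thumbnail.png")  # 256 x 256 thumbnail
```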
This video is an example of a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
The 3D object used in this augment has been reduced in scale to enable the object to be viewed within the constraints of a marker-based augment.
As part of my VET Development Centre Specialist Scholarship I'm in the process of developing my practical skills in designing and building augmented reality learning experiences. One of the experiences I'm currently prototyping is a workplace hazard identification activity. This has brought about an interesting challenge: using markers placed on the floor to trigger, and then engage with, an augment containing a 3D object modelled to scale.
A marker needs to be in view and recognisable at all times for the augment to work. An augment containing a 3D object not modelled to scale can be easily triggered and engaged with by a marker placed on the floor as the marker will most likely remain in view and recognisable at all times. An augment containing a 3D object modelled to scale can also be easily triggered by a marker placed on the floor. The user then needs to move away from the marker to engage with the augment. As the user moves away from the marker it no longer remains in view and recognisable. This means the augment will fail.
In this example, a simple augment of an industrial workplace scene is triggered by the marker. The industrial workplace scene has been reduced in size and is no longer suitable.
Possible solutions?
Increase the size of the marker or place the marker on a wall to ensure the marker remains in view and recognisable at all times. Increasing the size of the marker could be a solution, but then a specialist printer may be required instead of a standard domestic or office printer. Placing the marker on the wall could be a solution, but only if the experience was thematically relevant. A marker placed on a wall could also be used to trigger an augment on the floor. This could also work, but would require strict placement to ensure the augment is positioned accurately on the floor relative to the marker and not floating in the air or buried in the floor.
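To get a feel for how big is big enough, here's a rough back-of-envelope sketch of my own. The camera field of view and image width are assumptions for illustration, not measured values:

```python
# Roughly estimate a marker's width in image pixels at a given viewing
# distance, using a pinhole-camera approximation. The field of view and
# image width below are assumed values, not measurements.
import math

def marker_pixels(marker_m, distance_m, hfov_deg=60.0, image_width_px=1280):
    """Approximate on-screen width (pixels) of a marker marker_m metres wide,
    viewed square-on from distance_m metres away."""
    view_width_m = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    return image_width_px * marker_m / view_width_m

# A 20 cm marker up close versus from across the room:
print(marker_pixels(0.2, 0.5))  # ~443 px: easily recognisable
print(marker_pixels(0.2, 3.0))  # ~74 px: probably too small to track reliably
```

On these assumed numbers, a user stepping a few metres back from a floor marker quickly shrinks it below a trackable size, which is consistent with the failure described above.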
Another possible solution could be to trigger the augment containing the 3D object modelled to scale based on location. This could work if the designated location for the augment was outside or if the location could be accurately determined when indoors.
Prototyping a formative hazard identification activity.
Aurasma’s advanced image recognition technology can sometimes get a little confused!
‘Your device does not support this type of Aura.’ Testing a marker-based 3D augment with my mobile phone. Pity my phone doesn't have the resources to manage a 3D augment. If it did, it would've appeared in Aurasma's list of supported devices.
My exploration of how we learn and how we design and develop digital stuff that helps us learn.