Tag Archives: specialist scholars

Graduation ceremony for the 2013 Specialist Scholarship

Denise Stevens
VET Development Centre CEO Denise Stevens opens proceedings for the 2013 Specialist Scholar graduation ceremony.

Today I attended the graduation ceremony for all 2013 VET Development Centre Specialist Scholars. The graduation ceremony was the final event for all recipients of this year’s Specialist Scholarship. The event featured an opening address and the presentation of certificates and gifts to all scholars from VET Development Centre CEO Denise Stevens.

The graduation ceremony also featured an amazing presentation from Terese McAleese, Director of Learning at Swinburne University of Technology. Terese spoke about her journey as an immigrant from 1980s Belfast to Melbourne as well as her journey in vocational education and training (VET), from student to administrator and teacher and finally to a professional responsible for designing learning experiences. Terese also spoke about the transformational nature of the VET sector and TAFE: a place where skills can be developed for direct application in industry, as well as a place for second chances and opportunities for education that might have been missed the first time. The chance for a new or better life.

A solution to the challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale

Using simultaneous localisation and mapping (SLAM) to map a room or environment (creating a 3D point cloud) with Metaio Toolbox could be a solution to the challenge of using markers placed on the floor to trigger, and then engage with, an augment containing a 3D object modelled to scale.

point_cloud_room_mapping
Screen capture from Metaio Creator. Untextured 3D geometry placed within the point cloud of a mapped environment. Once published to a junaio channel, the 3D geometry will be placed over the environment when browsed in the junaio AR browser.

Augmented contextual instruction user experience (Image tracking)

Screen captures from the completed image-tracking augmented contextual instruction user experience.  The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.

Step 1: Remove bracket from caliper
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
Step 2: Inspect and clean retaining bolts
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
Step 3: Remove piston from brake caliper
Insert the air tool into the fluid inlet port of the caliper.
Step 4: Remove seal from brake caliper
Use a pointed tool to remove the piston seal from the caliper.
Instructional poster
The instructional poster provides the user with an entry point to the object-based or image-based ‘How to disassemble a rear brake caliper’ augmented contextual instruction.

Augmented contextual instruction user experience (Object tracking)

Screen captures from the completed object-tracking augmented contextual instruction user experience.  The augmented contextual instruction is made up of a sequence of junaio channels that can be browsed in the junaio AR browser. For this example, the channels were browsed using junaio on an iPad.

Step 1: Remove bracket from caliper
Place the assembled rear brake caliper on the workbench with the retaining bolts facing towards you.
step_01_screen
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
step_02_instruct
Remove the bracket from the caliper and then place the bracket on the workbench facing towards you.
step_02_screen
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
step_03_instruct
Put the bracket to one side. Place the caliper upside down on the workbench with the inlet port facing away from you.
step_03_screen
Insert the air tool into the fluid inlet port of the caliper.
step_04_instruct
Turn the caliper over with the cylinder bore facing towards you.
The fourth and final step in disassembling a rear brake caliper is to remove the piston seal from the caliper.
Use a pointed tool to remove the piston seal from the caliper.
Disassembly complete!
You have now completed the final step in disassembling a rear brake caliper. Remember to inspect and clean all parts before reassembly.
learner_resources
Web page for supplementary material. http://rowanpeter.com/exp/ar/creator/resources/index.html
Supplementary material
Once the bolts have been loosened, you can then use your fingers to remove them. You can check out the How to disassemble a rear brake caliper playlist on YouTube.

(Unfortunately) I think someone may have heard correctly!

Someone on the Metaio Developer Portal said they’d heard that the UI Designer in Creator was made for iPhone 4 screens and that UI designs created with it are not responsive to different screen sizes. I think that someone may have heard correctly. Damn!

2013-10-11 12.20.49
Channel viewed in Junaio AR browser (Portrait).
2013-10-11 12.22.39
Channel viewed in Junaio AR browser (Landscape).
2013-10-11 12.21.07
Channel viewed in Junaio AR browser (Portrait).
2013-10-11 12.22.04
Channel viewed in Junaio AR browser (Landscape).

Save. Upload. Wait. Test. Tweak. Save again. Upload again. Wait again. Test again. Tweak again. Almost there!

Save. Upload. Wait. Test. Tweak. Save again. Upload again. Wait again. Test again. Tweak again. Almost there!

2013-10-09 22.33.57-1
Yes. It’s still one to prepare and one to publish.

And another thing. It’s been 11 days since I posted my issue with the Z translation of buttons in the UI Designer of Metaio Creator 2.6 to the Helpdesk of the Metaio Developer Portal and there hasn’t been a single response! Damn.

2013-10-13 10.34.58
Too many glowing screens if you know what I mean.

Placing geometry in the 3D point cloud in Metaio Creator 2.6

Placing geometry in the 3D point cloud in Metaio Creator 2.6 is a little cumbersome, bewildering and often inaccurate. It takes lots of clicking and seemingly random fiddling to place geometry. You also can’t really determine whether your geometry is placed correctly within the 3D point cloud until you upload it to your channel and then view the channel in Junaio on your device. If it’s not placed correctly, you need to tweak it in Creator, publish it and then view it again in Junaio. Repeat the process until it looks like it might almost be placed correctly. Curious.

step_01_point_cloud
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
step_02_point_cloud
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
step_03_point_cloud
The third step in disassembling a rear brake caliper is to remove the piston from the caliper.
step_04_point_cloud
The fourth and final step in disassembling a rear brake caliper is to remove the piston seal from the caliper.

Testing image tracking with UI elements for each step of the disassembly process

Screen captures from my image tracking tests in Junaio for each step of the disassembly process, with UI elements. Unfortunately, due to my Z translation problems in the UI Designer of Creator 2.6, I had to remove the background image that was placed underneath the buttons. Doing this gives the buttons a floaty and slightly random feel, but at least the buttons function as they should!

step_01_remove_bracket
The first step in disassembling a rear brake caliper is to remove the bracket from the caliper. Use a spanner to loosen the retaining bolts.
step_02_inspect_and_clean_bolts
The second step in disassembling a rear brake caliper is to inspect and clean the retaining bolts and remove the rubber seal from the bracket.
step_03_remove_and_clean_piston
The third step in disassembling a rear brake caliper is to remove the piston from the caliper.
step_04_remove_seal_brake_caliper
The fourth and final step in disassembling a rear brake caliper is to remove the piston seal from the caliper.

Lost in Metaio Creator’s Z translation

I’m currently wrestling with the correct Z translation of images and buttons in the UI Designer of Creator 2.6. I’ve placed some buttons on top of a background image in the UI Designer of Creator and have encountered a problem where the image obscures the buttons and prevents them from working. I’ve checked the Z translation of the image in relation to the buttons and have made sure the buttons are translated above the background image, but this doesn’t seem to make a difference when the published channel is viewed in Junaio on my iPad.

I’ve posted my issue with Z translation of buttons in UI Designer of Creator 2.6 to the Metaio Helpdesk and look forward to hearing from someone in the community.

While attempting to solve this problem on my own in Creator, I experimented with placing the buttons partially over the background image and then publishing to my channel. This had an interesting result. The button displays and functions correctly!

Here are some screen captures of the properties windows for the background image and each button in the UI Designer of Creator and the published channel viewed in Junaio on my iPad.

ui_issue_button_background_image
The properties window for the background image. The image has a 0.0 Z translation.
ui_issue_button_over_image
The properties window for the button placed over the background image. The button has a 5.0 Z translation.
ui_issue_button_partial
The properties window for a button placed partially over the background image. The button has a 5.0 Z translation.
ui_issue_ipad
The placement of UI elements in the published channel viewed in Junaio on my iPad.

VET Development Centre Specialist Scholarship: Event 3 – Knowledge sharing

knowledge_sharing_2013_09_13
Event 3: Knowledge sharing. The third and final event for all 2013 Specialist Scholars.

Today I attended the third and final scheduled event for the VET Development Centre Specialist Scholarship. The event was a knowledge sharing session where all specialist scholars presented their findings and outcomes from their participation in the program. Unlike Event 1 and Event 2, this event gave everyone the opportunity to share what they’d learned.

Tweets
Surprised, but not surprised, by the broad range of topics presented by all specialist scholars.

The Tweets were correct! The diversity of presentations from each scholar and the range of application and utilisation was amazing. From study programmes, workshops and conferences to the creation of artifacts, each scholar had made the most of their time in a useful and productive way. Thanks, VET Development Centre, for giving us all the opportunity to take part in the Specialist Scholar program.

If you’re interested, you can download the PDF version (6.7MB) or the PowerPoint version (6.9MB) of my VET Development Centre Specialist Scholarship: Event 3 – Knowledge sharing presentation.

Two channels published without the extras

After two days of waiting, my Step 4: Remove piston seal from caliper and Step 1: Remove bracket from caliper channels on Junaio have been made public. This is good news, but I think I published a little early because I failed to include extras such as buttons for the instructional video, learner resources and a link to the next step in the process. The geometry is also misplaced. I’m now trying to upload changes to update the published channel and it doesn’t seem to be working. You might have to unpublish the channel, upload the changes, publish it and then wait two more days for it to be approved.

Nope, you can just upload the Creator file to the server again and it will ask you to update the already published channel. For me, Creator and the Metaio/Junaio platform lack some scaffolding around expectations, outcomes and experience for users. This software, platform and service can be a little bewildering at times.

This recording shows my first two channels published to Junaio. Both channels feature incorrectly placed geometry and a missing user interface. It’s a work in progress.

step_04_published

Geometry for brake caliper augment: Step 4: Remove piston seal from caliper

Previsualising the pointed tool, piston seal and rear brake caliper geometry for the Step 4: Remove piston seal from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared with FBXMeshConverter for import into Creator, and then uploaded to my Metaio channel for final use as an augment. I’m figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.

Important tip: Remember to export your FBX from Blender to your Desktop and not anywhere else on your computer. For some unknown or unexplained reason, FBX files exported anywhere other than the Desktop do not seem to work properly when converted using FBXMeshConverter.
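For anyone who prefers to script the export rather than click through File > Export, here’s a minimal Blender Python sketch of the same step. The file name is just an example and I normally run the export from Blender’s menus; this is only to show the Desktop-only quirk in code form.

```python
# Minimal sketch: export the scene as FBX to the Desktop, because
# FBXMeshConverter only seems happy with files that live there.
import os
import bpy

desktop = os.path.join(os.path.expanduser("~"), "Desktop")
target = os.path.join(desktop, "step_04_piston_seal.fbx")  # example file name only

# Uses Blender's bundled FBX exporter (the same one behind File > Export > FBX).
bpy.ops.export_scene.fbx(filepath=target)
print("Exported", target)
```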

Screen Shot 2013-09-08 at 11.06.57 AM

Screen Shot 2013-09-07 at 12.33.09 PM

Gathering 3D point cloud with Metaio Toolbox

Screen captures from 3D point cloud data gathering with Metaio Toolbox for sequences to be used in my augmented contextual instruction on the rear brake caliper. The 3D point cloud data will then be used as tracking technology to place my augments.

3d_point_cloud_caliper_complete
A rear brake caliper. This 3D point cloud will be used to place an augment for the first step of the rear brake caliper disassembly sequence.

3d_point_cloud_clean_bot_pins
The bracket from a rear brake caliper. This 3D point cloud data will be used to place an augment for the second step of the rear brake caliper disassembly sequence.

3d_point_cloud_remove_piston
An overturned rear brake caliper. This point cloud data will be used to place an augment for the third step of the rear brake caliper disassembly sequence.

3d_point_cloud_remove_seal
A rear brake caliper with the piston removed. This point cloud data will be used to place an augment for the fourth and final step of the rear brake caliper disassembly sequence.

using_itunes_to_gather_point_cloud
Metaio specify iTunes for the transfer of 3D point cloud data from your iPad to your computer.

Preparing and testing captions for the augmented contextual help supplementary instructional videos

Captions can help to provide an equivalent learning experience for viewers who may be hearing impaired, speak other languages or use assistive technology. Captions are also valuable in a teaching and learning context where it may be impractical for learners to wear headphones or play video at high volume in a group training environment such as a workshop, classroom or laboratory.

This screen capture shows the tools and process I used to prepare and test captions for my augmented contextual help supplementary instructional videos.

Soundbooth, TextEdit and Movist are the tools I used to create captions.
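If it helps, this is roughly what the caption side of things looks like as a SubRip (.srt) file, the sort of plain-text subtitle format a player like Movist can load alongside the video. The timings, wording and file name below are made up for the example rather than taken from my actual caption files.

```python
# Rough sketch: write a couple of caption cues out as a SubRip (.srt) file.
captions = [
    ("00:00:01,000", "00:00:05,000",
     "The first step in disassembling a rear brake caliper is to remove the bracket from the caliper."),
    ("00:00:05,500", "00:00:09,000",
     "Use a spanner to loosen the retaining bolts."),
]

with open("step_01_captions.srt", "w") as srt:
    for number, (start, end, text) in enumerate(captions, start=1):
        # SubRip format: cue number, start --> end, caption text, blank line.
        srt.write("{}\n{} --> {}\n{}\n\n".format(number, start, end, text))
```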

Geometry for brake caliper augment: Step 3: Remove piston from caliper

Previsualising the piston, boot and arrow geometry for the Step 3: Remove piston from caliper stage of the brake caliper augment. Geometry will be exported as FBX from Blender, prepared with FBXMeshConverter for import into Creator, and then uploaded to my Metaio channel for final use as an augment. I’m figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.

Screen Shot 2013-08-25 at 2.30.16 PM

Screen Shot 2013-08-25 at 10.44.53 AM

Screen Shot 2013-08-26 at 8.24.00 PM

Screen Shot 2013-08-26 at 8.24.37 PM

Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper.

Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper. Augmented contextual instruction is to be prepared in Metaio Creator, published to a Metaio channel and then accessed by learners through the Junaio app. 3D point cloud data is gathered with Metaio Toolbox.
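As a rough idea of what “preparing low-poly geometry” can look like when scripted, here’s a small Blender Python sketch using the Decimate modifier. It’s a generic approach rather than a record of the exact settings I used on each part, and the ratio is just a starting point to tune per model.

```python
# Sketch: reduce the face count of the selected mesh with a Decimate modifier.
import bpy

obj = bpy.context.active_object                   # the mesh selected in the 3D viewport
mod = obj.modifiers.new(name="LowPoly", type='DECIMATE')
mod.ratio = 0.25                                  # keep roughly a quarter of the faces

bpy.ops.object.modifier_apply(modifier=mod.name)  # bake the reduction into the mesh
print(obj.name, "now has", len(obj.data.polygons), "faces")
```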

This video conceptualises one stage of the disassembly of a brake caliper.

[Edit: Tuesday 13 August 2013] This video incorrectly conceptualises one part of the brake caliper disassembly process. A screwdriver is not used to extract the piston, seal and rubber boot from the caliper.

Prototyping augmented contextual instruction with Metaio Creator (Demo mode) and Toolbox

2013-07-31 18.32.49
The obtuse futuristic device that can only be serviced by a fearless technician with a little help from some augmented contextual instruction.

For this prototype, I’m using Metaio Creator and the image and object recognition features of Toolbox in the restrictive Demo Mode to further explore some aspects of my concept of augmented contextual instruction. Unfortunately, in Demo Mode I can’t use the excellent 3D point cloud data captured in Toolbox to prototype all aspects of my augmented contextual instruction. In Demo Mode, augments can only be triggered by a QR code, which is kinda okay for testing while you’re building. I’m thinking about buying a license.

This video shows some of the features of the augmented contextual help I’m trying to prototype with Metaio Creator in Demo Mode.

Conceptualising proximity-based level of detail for augmented reality experiences incorporating 3D geometry

Attempting to conceptualise proximity-based level of detail for augmented reality experiences such as Sew, grow and mow that incorporate 3D geometry.

Attempting to conceptualise proximity-based level of detail for augmented reality experiences incorporating 3D geometry.

Conceptualising with pen, highlighter and post-it notes.

Conceptualising lawn art.

Screen Shot 2013-07-13 at 2.05.30 PM

Screen Shot 2013-07-13 at 1.32.23 PM

Concept: Augmented contextual instruction

Augmented contextual instruction provides users with procedural demonstrations based on recognisable features and attributes of an object. Augmented contextual instruction could be used in vocational training and assessment contexts. Users can add (record), edit and share their own contextual instruction.

Planning the augmented contextual instruction with pen and envelope.

Mental note: AR homework

mental_note
Post-it notes help to record mental notes.

Mental note. I need to do some homework. I need to determine how I can use the Metaio SDK and Unity as an alternative to Blender and Aurasma for developing AR experiences for my VET Development Centre Specialist Scholarship.

ARcamp: Day 2 – Tuesday 21 May 2013

Augmented poster
One of the many marker-based AR experiences that could be found inside the INSPIRE centre during the two days of ARcamp.

As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the second day of camp.

Thomas Tucker from Winston-Salem State University.
Thomas Tucker from Winston-Salem State University talks about many of his projects.

The work of Thomas Tucker

While not specifically working in the field of AR, Thomas Tucker from Winston-Salem State University spoke about his work with high-end technology such as the FARO scanner and consumer-level technology such as the Kinect. We also got to play with the FARO scanner and scan the INSPIRE centre and surrounding area.

There are more photos of Thomas Tucker’s presentation from Day 2 of ARcamp in my ARcamp 2.0 set on Flickr.

Rob Fitzgerald discusses the challenges of designing, developing and implementing programs in the field
Rob Fitzgerald discusses the challenges of designing, developing and implementing programs in the field.

Rob Fitzgerald

Rob Fitzgerald spoke about the Agriculture Sector Linkages Program in Pakistan (ASLP2) that was co-developed by the University of Canberra and local communities and government in Pakistan. Although ASLP2 didn’t contain any AR experiences, Rob did discuss universal themes such as the importance of engaging and considering learners/intended audience/market/users during the design process and the role of technology in the solution.

There are more photos of Rob Fitzgerald’s presentation from Day 2 of ARcamp in my ARcamp 2.0 set on Flickr.

Meet Trak Lord (US Marketing and Communications) from Metaio

Trak Lord from Metaio

Trak Lord (US Marketing and Communications) from Metaio Skyped into ARcamp to present examples of AR in education and to talk about what Metaio are doing in the field of AR and the future of their product. Although it was treated as incidental amongst all the 3D dinosaurs, my personal highlight was the 3D object/image recognition feature of the Metaio SDK that enabled the diagnostic/procedural instruction manual augment on the Mitsubishi air-conditioner. It was the only practical, useful or vocational example of AR as a value-add or assistive tool in his suite of marketing videos.

There are more photos of Trak Lord’s presentation from Day 2 of ARcamp in my ARcamp 2.0 set on Flickr.

Amber Standley talks APositive and other AR design case studies.

Amber Standley

Amber Standley presented two AR case studies that demonstrated possible implementations of AR. The first AR case study was a marker-based experience that allowed users to learn more about the University of Canberra’s emissions by scanning a poster. The user was then able to:

  • identify and compare emission levels from buildings that make up the University of Canberra campus
  • review past emission levels (performance)
  • display weekly information on buildings with low emissions
  • display discussions about each building.

The second AR case study was also a marker-based experience that allowed users to learn more about distinguished alumni from the University of Canberra by scanning photographs mounted in an engraved frame. Each photograph triggered a video that displayed biographical and additional information about each alumnus.

Paul Krix talks Unity and the Metaio SDK.

Paul Krix

Paul Krix provided a very general overview of creating a mobile AR app for iOS or Android with Unity and the Metaio SDK (SDK can be downloaded from the Metaio website). The MetaioMan featured in his presentation.

ARcamp: Day 1 – Monday 20 May 2013

Early
I arrived an hour and a half early for ARcamp 2.0. I spent my time waiting in front of the huge video wall.

As part of my VET Development Centre Specialist Scholarship I attended ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May 2013. This blog post provides an overview of the presentations and activities that I participated in throughout the first day of camp.

Welcome to ARcamp

Danny Munnerley from ARstudio welcomed us all to camp. He spoke about ARstudio as a two-year practical and research project that was nearing completion. He also spoke about the eventual release of a resource that compiles their research findings and that this year would be the last camp.

Rob Manson talks about the current state of play in AR.

The welcome also featured an update on the current state of play in AR by Rob Manson from Mob Labs. Rob spoke about AR technology, simultaneous localisation and mapping (SLAM), his company buildAR, the challenge of developing standards for AR, the probable slow demise of proprietary browsers and the eventual rise of AR as a common web browser experience built on WebSockets, Web Real-Time Communication (WebRTC) and the Web Graphics Library (WebGL).

There are more photos of Rob Manson’s presentation from Day 1 of ARcamp in my ARcamp 2.0 set on Flickr.

Tony and Dean from AIE
Tony and Dean from AIE talk about their experiences developing 35° 17 South.

35° 17 South

Tony Oakden and Dean Walshe from the Academy of Interactive Entertainment (AIE) spoke about their experiences developing 35° 17 South. 35° 17 South was a multi-reality location-based game that took place in April on the grounds of the National Gallery of Australia. Although it didn’t contain elements of AR, many aspects of its design, development and implementation reflected issues typically encountered when producing technology-based activities.

Alex Young talks about Design 29
Alex Young talks about elements that make up the Design 29 exhibition at the National Archives of Australia.

Design 29: Creating a capital

Alex Young spoke about the issues and development concerns she encountered, and the insights she gained, while developing AR experiences for the Design 29: Creating a capital exhibition at the National Archives of Australia. The Design 29: Creating a capital exhibition brought together the original designs for Canberra by finalists of the 1911 Federal Capital City Design Competition and featured AR elements such as video, animation and detailed scans of original artwork.

Stephen Barrass fields questions about the Garden of Australian Dreams.

Garden of Australian Dreams

Stephen Barrass spoke about his recent project Garden of Australian Dreams as well as his past AR projects AVIARy, Edible Audience and Vanishing Point. Pretty awesome stuff.

Mini design challenge
Danny Munnerley introduces the Mini design challenge.

AR mini design challenge

The AR mini design challenge was facilitated by Danny Munnerley and Matt Bacon. The challenge was to form a group, brainstorm, design an AR experience on paper and then present the concept to the camp. Prizes were awarded to the best AR experiences. The AR mini design challenge was an awesome exercise in design thinking, where we needed to discover, interpret, ideate, experiment and then evolve a concept.

Matt Bacon talks Studio Aurasma.

Studio Aurasma and the Aurasma iPad app

Matt Bacon presented a session on designing and developing simple AR experiences with the Aurasma iPad app and the browser-based Studio Aurasma. This session gave us the opportunity to create a simple marker-based video augment with the iPad app as well as explore the features of Studio Aurasma.

 

VET Development Centre Specialist Scholarship: Event 2 – Professional development

Group activity
Facilitator Greg Stephens got us all thinking about leading projects and teams in this group activity.

Today I attended the second scheduled event for the VET Development Centre Specialist Scholarship. The event was a professional development session facilitated by Greg Stephens. During the session, Greg presented his unique perspective on the following themes, which mapped to the nine units that make up the BSB51407 Diploma of Project Management:

  • The People side of Leading Projects
    • Leading projects in the contemporary workplace
    • Project Leadership – What’s important?
  • Leading through the Project life cycle
    • High performing project teams
    • Leading teams at each stage of the project cycle
  • Leading through Project challenges
    • Addressing resistance; when things go wrong; managing yourself

Greg also initiated lively discussion, activities and presented examples of:

  • models of project success factors, such as:
    • Pinto and Slevin’s 1998 list of 10 project success factors
    • Turner’s 1999 project drivers diagram
  • differences between project management and project leadership
  • project leader qualities
  • characteristics of effective leaders
  • characteristics of an effective team
  • situational leadership model
  • work preferences – team management systems
  • Belbin’s Team role summary descriptions
  • what drives performance and engagement
  • coaching using the GROW model
  • Tuckman’s four-stage (Forming, Storming, Norming and Performing) group development model with Edison’s (Informing, Conforming and Deforming) expansion
  • tips for influencing, persuading and resolving conflict
  • guidelines for leading change
  • managing employee resistance to change
  • responding to indifference or anger
  • a resilience framework (Vision, Determination, Interaction, Relationships, Problem solving, Organisation and Self confidence)

ARcamp is coming

ARcamp is coming. Indeed.

I’m pretty stoked to be attending ARcamp 2.0 at the Inspire Centre at the University of Canberra from Monday 20 May to Tuesday 21 May as part of my VET Development Centre Specialist Scholarship. Judging by the recent camp update, it looks like we are all going to be immersed in two days of hands-on augmented reality workshop goodness. They’ve also provided us with a link to the ARcamp schedule and the Augmented Reality in Education Wiki and some links to AR industry players such as BuildAR, Metaio, Junaio and Aurasma to help us prepare. Awesome.

ARIG: A prototype camera rig for recording contextual tablet and mobile phone screen activity

ARIG is a camera rig for recording activity on and around the screen of a tablet or mobile phone. The concept for ARIG came from my need to record my experiments with marker-based and location-based augmented reality experiences.

In this example, ARIG records a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.

ARIG (Tablet mount)
Unpainted ARIG with tablet mount.

ARIG (Phone mount)
Unpainted ARIG with mobile phone mount.

The complete ARIG kit with iPad, mobile phone, DSLR and light sock components.
The complete ARIG kit with light sock components and iPad, mobile phone and DSLR required to record the experience.

A simple cube (Blender 2.62 and Aurasma Studio)

simple_cube
A simple 3D cube augment created with Blender and Aurasma Studio

Blender 2.62 does a good job of exporting a 3D scene in the Collada (DAE) format for use as an overlay in Aurasma Studio. You just need to make sure you interpret the newest version of the Aurasma 3D Guidelines in a Blender 2.62 context. For a Blender 2.62 user the most important guidelines to follow are:

  • Models need triangulated faces (Edit mode > Ctrl+T)
  • No more than four lamps (lights) although three are recommended
  • Models are to have no more than one map/texture
  • Create a .tar archive to upload to Aurasma Studio, made up of the .dae (exported from Blender 2.62), a .png texture and a .png thumbnail (256 x 256); see the packaging sketch below.
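To show what that last guideline amounts to in practice, here’s a small Python sketch that bundles the three files into a plain .tar archive ready for upload. The file names are placeholders for whatever you exported from Blender, so check them against the current Aurasma 3D Guidelines before relying on this.

```python
# Sketch: package the exported Collada scene, its texture and a 256 x 256
# thumbnail into a plain .tar archive for upload to Aurasma Studio.
import tarfile

files = [
    "simple_cube.dae",   # scene exported from Blender 2.62 (File > Export > Collada)
    "simple_cube.png",   # the single texture map
    "thumbnail.png",     # 256 x 256 preview image
]

with tarfile.open("simple_cube.tar", "w") as tar:    # "w" = uncompressed .tar
    for name in files:
        tar.add(name, arcname=name)                  # keep the paths flat inside the archive

print("Created simple_cube.tar")
```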

This video is an example of a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.

The challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale

prototype_03
The 3D object used in this augment has been reduced in scale to enable the object to be viewed within the constraints of a marker-based augment.

As part of my VET Development Centre Specialist Scholarship I’m in the process of developing my practical skills in designing and building augmented reality learning experiences. One of the experiences I’m currently prototyping is a workplace hazard identification activity. This has brought about an interesting challenge. I’m currently grappling with the challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale.

A marker needs to be in view and recognisable at all times for the augment to work. An augment containing a 3D object that is not modelled to scale can be easily triggered and engaged with from a marker placed on the floor, because the marker will most likely remain in view and recognisable at all times. An augment containing a 3D object modelled to scale can also be easily triggered by a marker placed on the floor, but the user then needs to move away from the marker to engage with the augment. As the user moves away, the marker no longer remains in view and recognisable, and the augment fails.
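A quick back-of-envelope calculation illustrates why the augment fails. The camera field of view, image width and minimum trackable size below are assumed values for a typical tablet camera, not figures measured from Junaio, but the trend is the point: the further you step back to take in a life-sized scene, the fewer pixels the floor marker occupies.

```python
import math

def marker_pixels(marker_size_m, distance_m, hfov_deg=60.0, image_width_px=1280):
    """Approximate on-screen width (pixels) of a flat marker viewed head-on.

    Simple pinhole-camera estimate; hfov_deg and image_width_px are assumptions
    for a typical tablet camera rather than measured values.
    """
    frame_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return image_width_px * marker_size_m / frame_width_m

MIN_TRACKABLE_PX = 100   # rough guess at the smallest marker a tracker copes with

for distance in (0.5, 1.0, 2.0, 4.0):
    px = marker_pixels(0.2, distance)   # an A4-ish marker about 0.2 m wide
    status = "ok" if px >= MIN_TRACKABLE_PX else "probably lost"
    print("{:.1f} m -> marker about {:.0f} px wide ({})".format(distance, px, status))
```

At half a metre the marker fills a healthy chunk of the frame; by the time you have stepped back far enough to take in a scene modelled to scale, it has shrunk to a sliver.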

In this example, a simple augment of an industrial workplace scene is triggered by the marker. The industrial workplace scene has been reduced in size and is no longer suitable.

Possible solutions?
Increase the size of the marker or place the marker on a wall to ensure the marker remains in view and recognisable at all times. Increasing the size of the marker could be a solution, but a specialist printer may then be required instead of a standard domestic or office printer. Placing the marker on the wall could be a solution, but only if the experience was thematically relevant. A marker placed on a wall could also be used to trigger an augment on the floor. This could also work, but would require strict placement to ensure the augment sits in an accurate position on the floor relative to the marker, and is not floating in the air or buried in the floor. A rough sketch of the offset involved follows below.
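For what it’s worth, that strict placement boils down to a fixed offset expressed in the marker’s coordinate frame. The axis convention and numbers below are placeholders rather than values taken from Creator, but they show the idea: push the scene down by the marker’s height and out by however far from the wall it should sit.

```python
# Sketch: offset needed to anchor a floor-level augment to a wall-mounted marker.
# Assumes the marker's Y axis points up the wall and its Z axis points into the room.

marker_height_above_floor_m = 1.5   # placeholder: centre of the wall marker above the floor
distance_from_wall_m = 2.0          # placeholder: how far out on the floor the scene sits

scene_offset = (0.0, -marker_height_above_floor_m, distance_from_wall_m)
print("Translate the 3D scene by", scene_offset, "relative to the marker")
```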

Another possible solution would be to trigger the augment containing the 3D object modelled to scale based on location. This could work if the designated location for the augment was outside, or if the location could be accurately determined indoors.