While not specifically working in the field of AR, Thomas Tucker from Winston-Salem State University spoke about his work with high-end technology such as the FARO scanner and consumer-level technology such as the Kinect. We also got to play with the FARO scanner and scan the INSPIRE centre and surrounding area.
Rob Fitzgerald spoke about the Agriculture Sector Linkages Program in Pakistan (ASLP2), which was co-developed by the University of Canberra and local communities and government in Pakistan. Although ASLP2 didn’t contain any AR experiences, Rob did discuss universal themes such as the importance of engaging and considering learners (the intended audience, market or users) during the design process and the role of technology in the solution.
Trak Lord (US Marketing and Communications) from Metaio Skyped into ARcamp to present examples of AR in education and also talk about what they are doing in the field of AR and the future of their product. Although it was treated as incidental amongst all the 3D dinosaurs, the 3D object/image recognition feature of the Metaio SDK that enabled the diagnostic/procedural instruction manual augment on the Mitsubishi air-conditioner was my personal highlight. This was the only practical, useful or vocational example of AR as a value-add or assistive tool in his suite of marketing videos.
Amber Standley presented two AR case studies that demonstrated possible implementations of AR. The first AR case study was a marker-based experience that allowed users to learn more about the University of Canberra’s emissions by scanning a poster. The user was then able to:
identify and compare emission levels from buildings that make up the University of Canberra campus
review past emission levels (performance)
display weekly information on buildings with low emissions
display discussions about each building.
The second AR case study was also a marker-based experience that allowed users to learn more about distinguished alumni from the University of Canberra by scanning photographs mounted in an engraved frame. Each photograph triggered a video that displayed biographical and additional information about the featured alumnus.
Paul Krix provided a very general overview of creating a mobile AR app for iOS or Android with Unity and the Metaio SDK (SDK can be downloaded from the Metaio website). The MetaioMan featured in his presentation.
Danny Munnerley from ARstudio welcomed us all to camp. He spoke about ARstudio as a two-year practical and research project that was nearing completion. He also spoke about the eventual release of a resource that compiles their research findings and that this year would be the last camp.
Tony Oakden and Dean Walshe from the Academy of Interactive Entertainment (AIE) spoke about their experiences developing 35° 17 South, a multi-reality location-based game that took place in April on the grounds of the National Gallery of Australia. Although it didn’t contain elements of AR, many aspects of its design, development and implementation reflected issues typically encountered when producing technology-based activities.
Design 29: Creating a capital
Alex Young spoke about the issues and development concerns she encountered, and the insights she gained, while developing AR experiences for the Design 29: Creating a capital exhibition at the National Archives of Australia. The exhibition brought together the original designs for Canberra by finalists of the 1911 Federal Capital City Design Competition and featured AR elements such as video, animation and detailed scans of original artwork.
The AR mini design challenge was facilitated by Danny Munnerley and Matt Bacon. The challenge was to form a group, brainstorm, design an AR experience on paper and then present the concept to the camp. Prizes were awarded to the best AR experiences. The AR mini design challenge was an awesome exercise in design thinking, where we needed to discover, interpret, ideate, experiment and then evolve a concept.
Studio Aurasma and the Aurasma iPad app
Matt Bacon presented a session on designing and developing simple AR experiences with the Aurasma iPad app and the web browser based Studio Aurasma. This session gave us the opportunity to create a simple marker-based video augment with the iPad app as well as explore the features of Studio Aurasma.
Today I attended the second scheduled event for the VET Development Centre Specialist Scholarship. The event was a professional development session facilitated by Greg Stephens. During the session, Greg presented his unique perspective on the following themes that mapped to the nine units that make up BSB51407 Diploma of Project Management:
The People side of Leading Projects
Leading projects in the contemporary workplace
Project Leadership – What’s important?
Leading through the Project life cycle
High performing project teams
Leading teams at each stage of the project cycle
Leading through Project challenges
Addressing resistance; when things go wrong; managing yourself
Greg also initiated lively discussion and activities, and presented examples of:
model project success factors such as:
Pinto and Slevin’s 1998 list of 10 project success factors
Turner’s 1999 project drivers diagram
differences between project management and project leadership
project leader qualities
characteristics of effective leaders
characteristics of an effective team
situational leadership model
work preferences – team management systems
Belbin’s Team role summary descriptions
what drives performance and engagement
coaching and the GROW model
Tuckman’s four-stage (Forming, Storming, Norming and Performing) group development model with Edison’s (Informing, Conforming and Deforming) expansion
tips for influencing, persuading and resolving conflict
guidelines for leading change
managing employee resistance to change
responding to indifference or anger
a resilience framework (Vision, Determination, Interaction, Relationships, Problem solving, Organisation and Self confidence)
ARIG is a camera rig for recording activity on and around the screen of a tablet or mobile phone. The concept for ARIG came from my need to record my experiments with marker and location-based augmented reality experiences.
In this example, ARIG records a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
Blender 2.62 does a good job of exporting a 3D scene in the Collada (DAE) format for use as an overlay in Aurasma Studio. You just need to make sure you interpret the newest version of the Aurasma 3D Guidelines in a Blender 2.62 context. For a Blender 2.62 user the most important guidelines to follow are:
Models need triangulated faces (Edit mode > Ctrl+T)
No more than four lamps (lights) although three are recommended
Models are to have no more than one map/texture
Create a .tar archive to upload to Aurasma Studio made up of the .dae (exported from Blender 2.62), the .png texture and a 256 × 256 .png thumbnail.
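The last step above can be scripted rather than done by hand. This is a minimal sketch in Python; the file names are hypothetical stand-ins for whatever your Blender export produced, and the flat, uncompressed archive layout is my reading of the Aurasma 3D Guidelines, so check it against the current guidelines before relying on it.

```python
import tarfile
from pathlib import Path

def pack_augment(dae_path, texture_path, thumb_path, out_path):
    """Bundle the exported model, texture and thumbnail into a plain .tar archive."""
    with tarfile.open(out_path, "w") as tar:  # "w" = uncompressed tar
        for p in (dae_path, texture_path, thumb_path):
            # arcname keeps the archive flat: just the file name, no folder paths
            tar.add(p, arcname=Path(p).name)
    return out_path

# Hypothetical file names from a Blender 2.62 export:
# pack_augment("cube.dae", "cube_texture.png", "thumbnail.png", "cube_augment.tar")
```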
This video is an example of a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
As part of my VET Development Centre Specialist Scholarship I’m in the process of developing my practical skills in designing and building augmented reality learning experiences. One of the experiences I’m currently prototyping is a workplace hazard identification activity. This has brought about an interesting challenge. I’m currently grappling with the challenge of using markers placed on the floor to trigger and then engage with an augment containing a 3D object modelled to scale.
A marker needs to be in view and recognisable at all times for the augment to work. When the 3D object is not modelled to scale, a marker placed on the floor works well: the whole augment fits in the frame, so the marker stays in view and recognisable. When the object is modelled to scale, the floor marker still triggers the augment easily, but the user then needs to move away from the marker to engage with the augment. As the user moves away, the marker no longer remains in view and recognisable, and the augment fails.
In this example, a simple augment of an industrial workplace scene is triggered by the marker. The industrial workplace scene has been reduced in size and is no longer suitable.
One option is to increase the size of the marker, or to place the marker on a wall, so that it remains in view and recognisable at all times. Increasing the size of the marker could be a solution, but then a specialist printer may be required instead of a standard domestic or office printer. Placing the marker on the wall could be a solution, but only if the experience was thematically relevant. A marker placed on a wall could also be used to trigger an augment on the floor. This could also work, but would require strict placement to ensure the augment sits in an accurate position on the floor relative to the marker, and is not floating in the air or buried in the floor.
Another possible solution could also be to trigger the augment containing the 3D object modelled to scale based on location. This solution could work if the designated location for the augment was outside or if the location could be accurately determined when indoors.
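The scale problem above can be roughed out with a little camera geometry. This is an illustrative sketch only; the scene width, field of view and marker size below are assumed numbers, not measurements from my prototype.

```python
import math

def distance_to_frame(object_width_m, fov_deg):
    """Distance at which an object of this width just fills the horizontal field of view."""
    half_fov = math.radians(fov_deg) / 2
    return (object_width_m / 2) / math.tan(half_fov)

def fraction_of_frame(marker_width_m, distance_m, fov_deg):
    """Approximate fraction of the frame width a marker occupies at a given distance."""
    half_fov = math.radians(fov_deg) / 2
    frame_width_m = 2 * distance_m * math.tan(half_fov)
    return marker_width_m / frame_width_m

# Assumptions: a 4 m wide workplace scene, a 60 degree camera field of view,
# and an A4-ish floor marker about 0.2 m wide.
d = distance_to_frame(4.0, 60)     # ~3.5 m back to frame the to-scale scene
f = fraction_of_frame(0.2, d, 60)  # the marker then spans ~5% of the frame width
```

At that distance the marker occupies only a sliver of the frame, which is consistent with it dropping below what the tracker can recognise, and it suggests roughly how much bigger a floor marker would need to be for that option to work.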
Remember Simple SketchUp Models, back when the 3D Warehouse was still the Google 3D Warehouse? My favourite simple model was the 350 ml takeaway coffee cup with a sipper-style lid. Since 2008, the simple takeaway coffee cup has been viewed and downloaded a few times. I hope people found it useful.
My exploration of how we learn and how we design and develop stuff that helps us learn.