The obtuse futuristic device that can only be serviced by a fearless technician with a little help from some augmented contextual instruction.
For this prototype, I’m using the image and object recognition features of Metaio Creator, together with object data captured in Toolbox, in the restrictive Demo Mode to further explore some aspects of my concept of augmented contextual instruction. Unfortunately, in Demo Mode I can’t use the excellent 3D point cloud data captured in Toolbox to prototype every aspect of my augmented contextual instruction. In Demo Mode, augments can only be triggered by a QR code, which is kinda okay for testing while you’re building. I’m thinking about buying a license.
This video shows some of the features of the augmented contextual help I’m trying to prototype with Metaio Creator in Demo Mode.
Point cloud with 3D object. The 3D object shows the learner how to rotate the component.
Working with the point cloud in Metaio Creator.
Working within the constraints of Demo Mode: placing a button that launches additional resources about the component.
Prototyping AR concepts to determine the value of Creator is challenging, particularly given the extremely tight constraints of Demo Mode. Workaround: build a number of separate channels, each with a single feature!
Brainstorming additional screen elements to support the learner during contextual help/instruction.
Creating and editing UV layout for 3D object augment.
3D geometry exported from Blender as an FBX and then processed with FBXMeshConverter to make it suitable for Creator.
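The export-and-convert step above can be scripted so it’s repeatable while iterating on the model. Below is a minimal sketch that builds a headless Blender export command (Blender’s `--background` and `--python-expr` flags, and `bpy.ops.export_scene.fbx`, are real) plus a conversion step; the FBXMeshConverter invocation and all file paths are illustrative assumptions, so check Metaio’s documentation for the converter’s actual arguments.

```python
# Sketch of the Blender -> FBX -> Creator asset pipeline.
# The FBXMeshConverter invocation and paths are hypothetical examples.
import subprocess

def blender_fbx_export_cmd(blend_path: str, fbx_path: str) -> list:
    """Build a headless Blender command that exports the scene as FBX."""
    expr = "import bpy; bpy.ops.export_scene.fbx(filepath={!r})".format(fbx_path)
    return ["blender", "--background", blend_path, "--python-expr", expr]

def convert_for_creator_cmd(fbx_path: str) -> list:
    # Hypothetical invocation -- FBXMeshConverter's real CLI may differ.
    return ["FBXMeshConverter", fbx_path]

if __name__ == "__main__":
    for cmd in (blender_fbx_export_cmd("device.blend", "device.fbx"),
                convert_for_creator_cmd("device.fbx")):
        print(" ".join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment with the tools installed
```

Keeping the two steps as plain command builders makes it easy to swap in the converter’s real flags once they’re confirmed, without touching the Blender side.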