Previsualising the pointed tool, piston seal and rear brake caliper geometry for the "Step 4: Remove piston seal from caliper" stage of the brake caliper augment. Geometry will be exported as FBX from Blender, converted with FBXMeshConverter, imported into Metaio Creator and uploaded to my Metaio channel for final use as an augment. Figuring out the production workflow for each 3D model used in each step of the augmented contextual instruction.
Important tip: Remember to export your FBX from Blender to your Desktop and not anywhere else on your computer. For some unexplained reason, FBX files exported anywhere other than the Desktop do not convert properly with FBXMeshConverter.
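The Desktop-only quirk can be baked into a small helper so the export path is never typed by hand. This is just a sketch, not my actual workflow script: the function name is mine, and the Blender export call is shown commented out because it only runs inside Blender's Python environment.

```python
from pathlib import Path

def desktop_export_path(name: str) -> str:
    """Build an FBX export path on the user's Desktop, the only
    location that converted reliably with FBXMeshConverter."""
    return str(Path.home() / "Desktop" / f"{name}.fbx")

# Inside Blender you would then export with the standard FBX operator:
# import bpy
# bpy.ops.export_scene.fbx(filepath=desktop_export_path("piston_seal"))
```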
Preparing low-poly geometry for augmented contextual instruction in the disassembly and assembly of a vehicle’s rear brake caliper. Augmented contextual instruction is to be prepared in Metaio Creator, published to a Metaio channel and then accessed by learners through the Junaio app. 3D point cloud data is gathered with Metaio Toolbox.
This video conceptualises one stage of the disassembly of a brake caliper.
[Edit: Tuesday 13 August 2013] This video incorrectly conceptualises one part of the brake caliper disassembly process. A screwdriver is not used to extract the piston, seal and rubber boot from the caliper.
Once you’ve created seams in your mesh and unwrapped it, you’ll probably want to move contiguous groups of faces, or UV islands, into a layout that makes it easier to create and paint a texture.
Before you can start moving your UV islands you’ll need to change your selection mode. Sync selection is an extremely useful mode that lets you select separate UV islands and then rotate, scale and translate them in the UV/Image editor without affecting the corresponding elements in the 3D editor. You can toggle it with the Sync selection button in the header of the UV/Image editor.
Making heavy use of the Sync selection button to arrange UV islands on the texture for the pre-visualisation geometry for the Sew, grew and mow lawn art experience.
Blender 2.62 does a good job of exporting a 3D scene in the Collada (DAE) format for use as an overlay in Aurasma Studio. You just need to make sure you interpret the newest version of the Aurasma 3D Guidelines in a Blender 2.62 context. For a Blender 2.62 user the most important guidelines to follow are:
Models need triangulated faces (Edit mode > Ctrl+T)
No more than four lamps (lights), although three are recommended
Models should use no more than one map/texture
Create a .tar archive to upload to Aurasma Studio containing the .dae (exported from Blender 2.62), the .png texture and a 256 x 256 .png thumbnail.
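The archive step can be scripted. A minimal sketch using Python's standard tarfile module; the function name and the example file names are mine, not part of the Aurasma tooling:

```python
import tarfile
from pathlib import Path

def build_overlay_tar(dae, texture, thumbnail, out="overlay.tar"):
    """Bundle the Blender .dae export, its texture and a 256 x 256
    thumbnail into an uncompressed .tar for Aurasma Studio."""
    with tarfile.open(out, "w") as tar:
        for f in (dae, texture, thumbnail):
            # arcname keeps the archive flat, with no directory paths
            tar.add(f, arcname=Path(f).name)
    return out
```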
This video is an example of a simple 3D cube augment produced with Blender 2.62 and Aurasma Studio.
A work-in-progress render of the scene and 50-frame animated geometry for an augment of the door/entry area to Dr Softain’s laboratory. Thematically, this scene takes place around the same time as Dr Softain’s emergency broadcast.
The modelling is based on measurements and reference photos taken at the scene. The animated door opening was achieved by creating a simple bone system, skinning the mesh and then animating the bones. I did this because I thought animated rigid geometry wasn’t supported by Aurasma Studio. I was wrong: it is. I may continue to use bones to animate the opening of the door and other geometry that makes up the scene.
If all goes to plan, the final .dae export and augment with Aurasma Studio of the alternate animated door/entry area should replace the real door/entry area entirely.
Work to be completed
The completed scene will be made up of a partially visible collapsed Dr Softain, hanging lights and elements such as strange equipment and tools you might expect to find in a laboratory. I’m also considering replacing the fridge and bin seen in the reference photos with animated versions. Each element will need to be low-poly and combined with the other geometry into a single mesh to meet the 3D guidelines for Aurasma Studio. Further visual effects such as dirt, spilt chemicals and blood will be painted onto a 512 x 512 texture that is then applied to the mesh. The animation looks a bit stiff, so I’ll give that a tweak too!
Thinking out loud
Sketching out the door/entry scene and thinking about the limitations of designing and developing augments. There’s something about them that makes them merely passive observational pieces. They seem read-only. Web 1.0. Augments and the fictional layer should be read/write by those who interact with the space. That’s more web 2.0 – beyond. I guess that’s the challenge. Integrate them into/with something else where action is required and/or make the diorama read/write.
I’m interested in exploring the use of augmented reality (AR) in learning experiences. I’ve decided to prototype my early, simple AR experiments with Processing and Blender. These early explorations will make use of augments placed with fiducial markers. My goal is then to explore developing AR learning experiences with Layar that can be viewed through iOS and Android mobile devices. After that I’d like to explore placing augments without fiducial markers; these augments could instead be determined by location. One step at a time.
I used Processing, Blender and NyARToolkit to create this very simple zombie wound augment. This needs a bit more work as the augment is displaying bounding box information and the low-poly modelling is not as smooth as it should be. The augment could be made to look a little more integrated with my body with improved modelling and texturing. That will come with the next iteration.
I created the Epidermis Edit animation in early 2000. The aesthetic is directly influenced by anatomy, medical procedures, macro-photography, electron microscopy and scientific inquiry. Back then, the Epidermis Edit animation was intended to be a realistic study in the movement and manipulation of the skin. It was meant to be slightly surreal, but now it just seems confusing and weird. Unintended.
I used Blender 1.5 for all polygon modelling, animation, textures, lighting and rendering. I used After Effects to compile the animation, sound and credits, and then render the final output to AVI and QuickTime formats.
My exploration of how we learn and how we design and develop stuff that helps us learn.