A few years back, the Earth Science building at WashU was dedicated, and the wealthy donor whose name the building now bears generously gave the department a number of valuable rock samples. Some of them are precious and fragile enough that they are not on display in the cabinets in the entrance hallway, which seems a shame. However, they could make a good project for learning to do photogrammetry a little better!
In the lab we set up a light box where we can place the samples on a turntable, so we can rotate them without having to move the camera. I also put a couple of scale bars on the turntable that we can use as a reference to scale the object correctly during the photogrammetry steps. After each photo I rotated the turntable about 10 degrees, so about 36 photos per lap, and then I did 2 more laps at different angles to get plenty of coverage. I would normally use a remote with the camera and spend a little time getting the exposure levels correct, but I didn’t have a remote with me and I just wanted to use this as an example anyway.
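For anyone wanting to reproduce the shot list above (three laps of roughly 36 photos, stepping the turntable about 10 degrees between shots), it can be sketched in a few lines of Python; the lap elevations here are illustrative placeholders, not the exact camera angles I used:

```python
def shot_list(elevations=(0, 25, 50), step_deg=10):
    """Return (elevation, azimuth) pairs for each photo:
    one full turntable lap per camera elevation."""
    return [(elev, az) for elev in elevations for az in range(0, 360, step_deg)]

shots = shot_list()
print(len(shots))  # 3 laps x 36 photos = 108 shots
```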
Once the files are loaded into the photogrammetry software (Agisoft Photoscan), the first thing to do is to add markers to the photos at known places on the scale bars and set their locations (1 cm intervals). Doing this on a bunch of photos allows the software to correctly scale the final model. The software then does a lot of the heavy lifting: aligning the cameras, calculating common points, and adding color to the vertices. After a couple more steps you end up with a textured 3D mesh of the rock sample.
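The scaling step itself boils down to simple arithmetic: the software measures the distance between two markers in the unscaled model, compares it to the real-world distance you entered (1 cm here), and multiplies every vertex by that ratio. A minimal sketch of the idea, where the function names are my own and not Photoscan's API:

```python
import math

def scale_factor(marker_a, marker_b, true_dist_cm=1.0):
    """Ratio between the known marker spacing and the model-space spacing."""
    model_dist = math.dist(marker_a, marker_b)
    return true_dist_cm / model_dist

def apply_scale(vertices, s):
    """Scale every vertex so the model ends up in real-world centimetres."""
    return [(x * s, y * s, z * s) for (x, y, z) in vertices]

# Two markers that are 1 cm apart in reality but 0.25 units apart in the model:
s = scale_factor((0.0, 0.0, 0.0), (0.25, 0.0, 0.0))
print(s)  # 4.0
```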
Now we have our 3D model of the rock sample, we could just leave it there, stuck on a 2D monitor in perpetuity… but we have a HoloLens! Exporting an FBX file and a 4K PNG texture to Unity is relatively straightforward. Unity’s coordinate system is different from Photoscan’s, so I need to rotate the model and place it at the correct point in the scene. I also noticed that my scale was now off by a factor of 100; this is probably because the Photoscan markers’ units are in centimetres while Unity works in metres.
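That fix-up can be written down explicitly. Photoscan's export here is right-handed and Z-up in the marker units (centimetres), while Unity expects left-handed, Y-up coordinates in metres; one common conversion is to swap the Y and Z axes (a swap of two axes also flips handedness) and divide by 100. This is a hedged sketch of one convention, since exporters and import settings differ:

```python
def photoscan_to_unity(p):
    """Convert a right-handed, Z-up point in centimetres to a
    left-handed, Y-up point in metres: swap Y/Z, divide by 100."""
    x, y, z = p
    return (x / 100.0, z / 100.0, y / 100.0)

# A point 100 cm up the Photoscan Z axis lands 1 m up Unity's Y axis:
print(photoscan_to_unity((0.0, 0.0, 100.0)))  # (0.0, 1.0, 0.0)
```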
There’s also just been a release of some new scripts on Microsoft’s Mixed Reality GitHub which I thought would be good to test out. They’ve added some handy manipulation tools which I’ve added to this scene. I’m not sure whether they’ll be easily transferable to the apps I’m currently developing, as they seem a little buggy at the moment. This scene then gets deployed to the HoloLens and I can test out how well Photoscan was able to recreate the sample:
It turned out pretty well, I think, given that it was just a rough set of photos and I didn’t use the highest accuracy settings in Photoscan. We could end up with a nice little holographic rock museum before too long!