I have previously outlined my goal of testing multiple photogrammetry solutions on a single dataset, and reporting times and results.
I’m using a dataset based on photographs of this Styracosaurus model (I’ve had it since I was quite young):
The dataset has 53 photos in total, and is available from this link. [This will be moved to figshare in due course].
The model is about 12 cm in total length, has textured scales, and sits on a base with a reflective brass nameplate. It was photographed on a glass desk, so there may be reflection issues from that.
I’ve had a few requests to take a look at MicMac. I’ve dabbled with it before, and I’ve seen plenty of praise for it online, but because it – and the website you get it from – are in French, I [like a typical English speaker] haven’t invested much time in trying to suss it out.
Hold on to your pants, folks, because I’m going to fumble through. Most of what I’m doing is taken from the English-language wiki. You can download the latest binaries from here.
As far as I can tell, there’s no interface, and this thing is run entirely from the command line. The default install went to C:\MicMac64bits for me, and for the sake of simplicity [i.e. so I don’t have to call binaries from a different directory] I’ve stuck my folder of photos, “Styrac”, in the MicMac binary folder. Not great practice, but it’s convenient here.
So I’ll open a command prompt inside the images folder, ‘Styrac’.
As an aside, the most recent version of Windows 10, the Creators Update, allows execution of Windows binaries from the Bash on Windows command line, which is ace. As such, I’m running everything through Bash so I can use the Unix ‘time’ command to time how long each stage takes. However, that means the GPU is currently off limits (I think), though to my knowledge MicMac doesn’t leverage the GPU anyway.
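To show what that timing workflow looks like in practice, here’s a minimal sketch. `run_timed` is my own hypothetical helper name (not part of MicMac), and the relative path to mm3d.exe assumes my folder layout described above:

```shell
# Hypothetical helper for timing a MicMac stage from bash.
# Assumes mm3d.exe sits one directory up from the image folder,
# as in my setup above.
run_timed() {
    local start=$SECONDS
    "$@"                                   # run the stage exactly as given
    local rc=$?
    echo "elapsed: $((SECONDS - start))s"  # wall-clock time, whole seconds
    return $rc
}

# Usage (commented out so the sketch stands alone):
# run_timed ../mm3d.exe Tapioca MulScale ".*.JPG" 500 2500
```

Bash’s built-in `time` keyword does the same job with finer resolution; the helper just makes it easy to log each stage’s duration alongside its output.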
Tie Point Matching
The first command to run is:
..\mm3d.exe Tapioca MulScale ".*.JPG" 500 2500
The ‘Tapioca’ command tells MicMac to look for tie points. ‘MulScale’ means it will first run through the images at 500 px resolution (500 pixels on the longest side) to find the most likely matches, before re-running at a higher resolution, in this case 2500 px. Note that ‘Tapioca’ and ‘MulScale’ are case-sensitive.
I had a load of errors about my Sony NEX-6 zoom by, but it kept on trucking.
Time taken for tie point search: 5m 54s
Internal and Relative Orientation
It doesn’t help that MicMac uses different terminology from anything I’ve come across before… this seems to be a portion of what I’d normally call matching cameras.
The command is:
..\mm3d.exe Tapas RadialStd ".*.JPG" Out=MyFolder
We’re creating a new sub-folder here called “MyFolder” in which the computed orientation data will be stored (I have obviously not delved into the nitty-gritty for this one) – you can call it whatever you want.
Time taken: 1m 33s
Visualize Relative Orientation
This command just outputs a file so we can see the cameras and sparse point cloud in MeshLab:
..\mm3d.exe AperiCloud ".*.JPG" MyFolder
Time taken: 1m 43s
Here’s said sparse point cloud (AperiCloud_MyFolder.ply) visualized in MeshLab:
So far so good, seems everything is in order.
Now we can mask the sparse point cloud. Run:
..\mm3d.exe SaisieMasqQT AperiCloud_MyFolder.ply
This opens a GUI showing the point cloud and cameras:
Zoom and move the mouse according to the controls at the bottom of the window, then hit ‘F9’ to change to selection mode, and draw a polygon around the area you’re interested in (in this case, the point cloud actually contains very little that isn’t of interest, so I’m being fairly liberal with my selection). Left-click to build the polygon, and right-click to close the loop:
Then go to Selection -> Add inside polygon to selection. I just chose all the points. Then File -> Save Selection Info, which leaves two files in your images folder, in this case AperiCloud_MyFolder_polyg3d.xml and AperiCloud_MyFolder_selectionInfo.xml.
Dense Point Cloud Generation

Next, we generate the dense point cloud. The command to use is:
..\mm3d.exe C3DC BigMac ".*.JPG" MyFolder
Time taken: 19m 0s
We’ve now created a dense point cloud, which appeared in my Styrac folder as “C3DC_BigMac.ply”. Here it is in MeshLab:
This is pretty poor, especially if we view the other side:
There are tools described on the website (but not in the tutorials) for meshing and texturing, but honestly the reconstruction is so poor I’m not going to bother.
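For anyone who wants to replicate (or improve on) this run, the four non-interactive commands above chain together neatly. Here’s a minimal sketch, wrapped in a bash function so the mm3d path can be swapped in; `run_micmac_pipeline` is my own name, not MicMac’s, and the stages and arguments are exactly those used in this post:

```shell
# Sketch of the full pipeline from this post as one bash function.
# Set MM3D to point at your mm3d binary (e.g. MM3D=../mm3d.exe)
# and run from inside the image folder.
run_micmac_pipeline() {
    local MM3D=${MM3D:-../mm3d.exe}
    "$MM3D" Tapioca MulScale ".*.JPG" 500 2500 && \
    "$MM3D" Tapas RadialStd ".*.JPG" Out=MyFolder && \
    "$MM3D" AperiCloud ".*.JPG" MyFolder && \
    "$MM3D" C3DC BigMac ".*.JPG" MyFolder
}

# Usage, from inside the 'Styrac' folder:
# run_micmac_pipeline
```

Note that the interactive masking step (SaisieMasqQT) sits between AperiCloud and C3DC and can’t be scripted this way – you’d pause the chain there to draw the mask by hand.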
Conclusions and Notes
Before commenting on the quality of the model, I want to point out that MicMac saves all intermediate steps and files, which in this case left >20 GB of files in my images folder.
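If disk space matters to you, it’s worth checking what those intermediate folders are holding before deleting anything. A quick sketch; the folder names mentioned in the comments (Homol, Pastis, Tmp-MM-Dir) are assumptions based on MicMac versions I’ve seen, not something I verified for this exact run:

```shell
# Check where MicMac's intermediate files are eating disk space.
# Run from inside the image folder. Folder names like Homol/ (tie points),
# Pastis/ (image pyramids) and Tmp-MM-Dir/ (temporaries) are assumptions
# from the MicMac versions I've seen - yours may differ.
du -sh ./*/ 2>/dev/null | sort -rh | head -n 10

# Only delete once you're sure you won't re-run a stage, e.g.:
# rm -r Pastis Tmp-MM-Dir
```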
The model is, obviously, pretty poor. The processing time wasn’t great either, and the tools are the hardest to use of any I’ve encountered yet.
Ultimately, I can’t recommend MicMac to a novice user when there are so many clearer, easier-to-use packages available. I have no doubt as to the power lurking in MicMac’s settings, but by god it’s not straightforward to use, and the tutorials and instructions available are not great. If you have plenty of time you could probably get much better results, and I’d be interested to hear from people who’ve run this dataset through MicMac.