This repository contains an unfinished algorithm for matching objects with their corresponding reflections in a marine environment, with the goal of assisting depth analysis. The code was later converted to Python (not stored in this repository) and completed. It is one component of a real-time SLAM system developed in the Computer Vision research group under former UIUC professors Soon-Jo Chung and Seth Hutchinson.
The basic algorithm searches an input image for maximally stable extremal regions (MSERs): extremal regions of uniform intensity whose shape remains nearly unchanged over a large range of intensity thresholds. To match an object to its reflection, the input image is divided in two, with the 'shoreline' acting as the cutoff. The convex hull of each MSER is then generated and scaled by 1.5x, 2x, and 3x. Using the robust matching approach from "Robust Wide Baseline Stereo from Maximally Stable Extremal Regions" by Matas et al., the algorithm then searches for regions on the other side of the shoreline that are similar to the scaled regions within a certain threshold. The algorithm works well in ideal marine conditions (still water, etc.), but performance deteriorates as the water becomes choppier. As a result, the code needs to be optimized to account for rougher water conditions.
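The hull-scaling step above can be sketched in a few lines. This is a minimal illustration, not the actual code from this project: it assumes regions are represented as lists of (x, y) hull vertices, and the function names (`scale_hull`, `polygon_area`, `areas_match`) and the area-ratio similarity test are hypothetical stand-ins for the Matas et al. matching criterion.

```python
def scale_hull(hull, factor):
    """Scale a convex hull's vertices about its centroid by `factor`.

    hull: list of (x, y) vertex tuples. Returns a new vertex list;
    the centroid of the hull is left unchanged by the scaling.
    """
    cx = sum(p[0] for p in hull) / len(hull)
    cy = sum(p[1] for p in hull) / len(hull)
    return [(cx + factor * (x - cx), cy + factor * (y - cy)) for x, y in hull]


def polygon_area(pts):
    """Area of a simple polygon via the shoelace formula."""
    total = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0


def areas_match(area_a, area_b, tol=0.2):
    """Hypothetical similarity test: relative area difference within `tol`.

    A real implementation would use the full region-matching criterion
    from Matas et al., not area alone.
    """
    return abs(area_a - area_b) / max(area_a, area_b) <= tol


# Example: a unit square scaled 2x keeps its centroid and quadruples its area.
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
doubled = scale_hull(square, 2.0)
```

In the full pipeline, each hull would be scaled by 1.5x, 2x, and 3x, and candidate regions on the opposite side of the shoreline would be compared against each scaled version.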