Max edited this page May 14, 2022 · 11 revisions

Welcome to the aquamarine wiki!

Basic information

The current software compares images. It works on two pictures taken from the same position under similar conditions. The library reads pixel data from files as a matrix of RGB values and produces difference results. These results represent pixel changes; if the changed pixels are connected, there is a chance that the changes belong to one object.

Some of these changes can be skipped to avoid image 'noise', which normally appears when lighting conditions change (for example, the sun shines at a different brightness, and objects under this light change color while remaining the same objects). The software should therefore ignore pixels whose color changes come from objects that have not changed or moved. To handle this, a set of parameters lets users adjust the comparison to their particular circumstances.

For example, the 'affinity threshold' parameter defines what counts as a color change: the user sets the maximum color delta that should still be recognized as noise. In simple words, this is a sensitivity value for color change; if a checked pixel's delta exceeds this constant, the pixel is treated as changed and, depending on the processing type, recognized as a possible part of an object. 'Minimum pixels in object' defines how many pixels must be connected before they count as an object; smaller groups are skipped in the output. This parameter helps skip unimportant changes, which are common in outdoor natural conditions: there is usually no need to track grass or tree leaves trembling in the wind, and it is more useful to focus on bigger objects.
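The two parameters could be sketched like this; the struct and function names here are illustrative assumptions, not the library's actual API:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Hypothetical parameter set, mirroring the wiki's description.
struct CompareOptions {
    int affinityThreshold = 25;   // max per-pixel color delta still treated as noise
    int minPixelsInObject = 50;   // connected groups smaller than this are skipped
};

struct Rgb { uint8_t r, g, b; };

// A pixel counts as "changed" only when its color delta exceeds the threshold.
bool isChanged(const Rgb& a, const Rgb& b, const CompareOptions& opt) {
    int delta = std::abs(a.r - b.r) + std::abs(a.g - b.g) + std::abs(a.b - b.b);
    return delta > opt.affinityThreshold;
}
```

A small lighting shift (delta below the threshold) is ignored as noise, while a large delta marks the pixel as changed.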

Movement is also a kind of object change, but one with a pattern, and this pattern should be checked across a collection of images. For example, once the positions of changed objects are found, the next checks over three images should show the same change with a direction pattern (left, right, etc.). This approach confirms whether movement has been found. For optimization purposes, the class implemented for movement detection checks only the area of previously found objects and tracks how they behave over a period.
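One way such a direction pattern could be checked is from an object's x-centroid over consecutive comparison results; this is an illustrative sketch, not the library's implementation:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical helper: infer a horizontal movement pattern from an object's
// x-centroid across a sequence of images. Movement is confirmed only when
// every step agrees on the same direction, as the wiki describes.
std::string horizontalDirection(const std::vector<double>& centroidX) {
    if (centroidX.size() < 2) return "unknown";
    int right = 0, left = 0;
    for (size_t i = 1; i < centroidX.size(); ++i) {
        if (centroidX[i] > centroidX[i - 1]) ++right;
        else if (centroidX[i] < centroidX[i - 1]) ++left;
    }
    int steps = static_cast<int>(centroidX.size()) - 1;
    if (right == steps) return "right";
    if (left == steps) return "left";
    return "unknown";   // no consistent pattern: not confirmed as movement
}
```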

Algorithms

The Breadth-first search (BFS) algorithm was selected for finding objects: for every potentially changed pixel, all neighbor pixels are checked, newly found changes are passed to the next iteration, and the algorithm finishes when all neighbors have been checked and no more changes are found. The result is an array (std::vector) of pixels. To speed up this search, two optimizations were added: 1) multi-threading for the BFS, 2) a user-defined 'step' for checking pixels.
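The BFS described above can be sketched as follows; the `Mask` type and function name are assumptions for illustration, and the real library types may differ:

```cpp
#include <cassert>
#include <queue>
#include <utility>
#include <vector>

// Grid of "changed pixel" flags produced by the comparison step.
using Mask = std::vector<std::vector<bool>>;

// Collect the connected group of changed pixels reachable from (sy, sx).
// Visited pixels are cleared from the mask so each object is found once.
std::vector<std::pair<int, int>> collectObject(Mask& changed, int sy, int sx) {
    std::vector<std::pair<int, int>> object;
    if (!changed[sy][sx]) return object;
    int h = static_cast<int>(changed.size());
    int w = static_cast<int>(changed[0].size());
    std::queue<std::pair<int, int>> frontier;
    frontier.push({sy, sx});
    changed[sy][sx] = false;
    while (!frontier.empty()) {
        auto [y, x] = frontier.front();
        frontier.pop();
        object.push_back({y, x});
        // Pass every still-changed neighbor to the next iteration.
        const int dy[] = {-1, 1, 0, 0}, dx[] = {0, 0, -1, 1};
        for (int k = 0; k < 4; ++k) {
            int ny = y + dy[k], nx = x + dx[k];
            if (ny >= 0 && ny < h && nx >= 0 && nx < w && changed[ny][nx]) {
                changed[ny][nx] = false;   // mark visited
                frontier.push({ny, nx});
            }
        }
    }
    return object;   // all connected changed pixels, as a std::vector
}
```

The loop terminates exactly as the text says: once every neighbor of every reached pixel has been checked and no more changes are found.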

Multi-threaded BFS:

The user can specify the desired thread count, and the thread pool feature allows utilizing all available threads. Users should be aware that a large thread count adds work for context switching and can overload the CPU with scheduling tasks. On embedded devices, a large thread count can also overheat the CPU (forcing it to lower its clock). For example, on a 7700HQ CPU, 32 threads produce results faster than 64.
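A minimal sketch of how the scan could be split across a user-chosen thread count, with each thread taking a contiguous band of rows; the function is a stand-in for the real per-pixel work, not the library's code:

```cpp
#include <algorithm>
#include <atomic>
#include <cassert>
#include <thread>
#include <vector>

// Count changed pixels in parallel, splitting rows evenly across threads.
long countChangedParallel(const std::vector<std::vector<bool>>& changed,
                          unsigned threadCount) {
    std::atomic<long> total{0};
    int h = static_cast<int>(changed.size());
    int band = (h + static_cast<int>(threadCount) - 1)
               / static_cast<int>(threadCount);   // rows per thread, rounded up
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < threadCount; ++t) {
        int y0 = static_cast<int>(t) * band;
        int y1 = std::min(h, y0 + band);
        workers.emplace_back([&changed, &total, y0, y1] {
            long local = 0;   // accumulate locally to avoid contention
            for (int y = y0; y < y1; ++y)
                for (bool px : changed[y]) local += px ? 1 : 0;
            total += local;
        });
    }
    for (auto& w : workers) w.join();
    return total;
}
```

Each extra thread adds spawn/join and scheduling overhead, which is why (as noted above) more threads are not always faster.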

The image below shows example starting positions for the first and second threads when the optimal thread count equals 2:

(image: starting positions for the first and second threads)

Pixel step:

This means the user can define how frequently pixels on the image are checked, i.e. check with an interval. Especially on high-resolution pictures there is no need to check every pixel, because the search result will generally be the same, while a check without steps is slower. The only requirement: the step must still be small enough to detect the smallest objects.
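The stepped scan could look like the sketch below (an illustrative assumption, not the library's code): only every step-th pixel is sampled, and a square object whose side is at least `step` pixels always contains at least one sampled point, which is why the step must not exceed the smallest object's size.

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Sample every step-th pixel of the change mask; hits become starting
// points (seeds) for the full BFS object search.
std::vector<std::pair<int, int>>
sampleChanged(const std::vector<std::vector<bool>>& changed, int step) {
    std::vector<std::pair<int, int>> seeds;
    for (int y = 0; y < static_cast<int>(changed.size()); y += step)
        for (int x = 0; x < static_cast<int>(changed[y].size()); x += step)
            if (changed[y][x]) seeds.push_back({y, x});
    return seeds;
}
```

With `step` larger than the smallest interesting object, the scan can land entirely between its pixels and miss it, so the step is bounded by the object size, not the image size.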
