The first version of PINK was written largely in C style, with duplicated code for CPU and GPU execution. For PINK version 2 we would therefore like to rewrite the code with a more generic structure, avoiding the CPU/GPU code duplication and enabling a Python interface so that the C++ functions can be called directly from Python.
The basic idea is to separate data and algorithm classes and to formulate the SOM, neuron, and training objects as LayoutTypes. We start with the two LayoutTypes `Cartesian<dim, T>` and `Hexagonal<T>`, where `dim` is the dimension of the Cartesian grid and `T` is the data type, e.g. `float`. An image then has the type `Cartesian<2, float>`. A 2-dimensional Cartesian SOM would have the type `Cartesian<2, Cartesian<2, float>>`, and a hexagonal SOM `Hexagonal<Cartesian<2, float>>`.
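The nested types described above could look roughly like the following sketch; the member names (`length`, `data`) and the row-major storage are assumptions for illustration, not the actual PINK implementation:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Hypothetical sketch of a Cartesian LayoutType: a fixed number of
// dimensions, an extent per dimension, and flat (row-major) storage.
template <std::size_t dim, typename T>
struct Cartesian
{
    std::array<std::size_t, dim> length;  // extent in each dimension
    std::vector<T> data;                  // flat element storage
};

// An image is a 2-dimensional Cartesian layout of floats ...
using Image = Cartesian<2, float>;

// ... and a 2-dimensional Cartesian SOM stores an image in each node.
using SOM = Cartesian<2, Image>;
```

Because the element type `T` can itself be a LayoutType, the same template describes both single images and whole SOMs.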
The SOM training and mapping algorithms are formulated as functors, whose call operator receives the SOM and the training data object:
The development will be performed in the branch https://github.com/HITS-AIN/PINK/tree/redesign.
The new design also makes it easier to add new SOM types and training algorithms.
Discussions and suggestions are always welcome.