SddF aims to be a scalable, distributed duplicate detection framework that uses Apache Spark as its distributed computing engine. It is the result of my master's thesis in computational science. SddF is still a prototype and is not intended for production use. Detailed information can be found in my thesis.
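SddF's actual architecture is described in the thesis. As a rough illustration of the general technique behind scalable duplicate detection (blocking, then pairwise comparison within each block) — and explicitly not SddF's real API — a minimal sketch over plain Scala collections might look like:

```scala
// Hypothetical sketch of blocking-based duplicate detection.
// Record, blockingKey, and isDuplicate are illustrative stand-ins,
// not classes or functions from SddF.
case class Record(id: Int, name: String, city: String)

object DedupSketch {
  // Blocking key: only records sharing this key are compared,
  // which avoids the quadratic all-pairs comparison.
  def blockingKey(r: Record): String = r.name.toLowerCase.take(3)

  // Toy similarity test: case-insensitive match on name and city.
  def isDuplicate(a: Record, b: Record): Boolean =
    a.name.equalsIgnoreCase(b.name) && a.city.equalsIgnoreCase(b.city)

  // Group records into blocks, then compare pairs inside each block.
  def findDuplicates(records: Seq[Record]): Set[(Int, Int)] =
    records.groupBy(blockingKey).values.flatMap { block =>
      for {
        i <- block.indices
        j <- (i + 1) until block.size
        if isDuplicate(block(i), block(j))
      } yield (block(i).id, block(j).id)
    }.toSet
}
```

On Spark, the `groupBy` step would become a shuffle over an RDD or DataFrame, which is what makes the approach distribute across a cluster.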
There is no binary release of SddF at the moment, so you have to build it yourself. The following lines briefly describe how to get started.
The Scala Build Tool (SBT) needs to be installed. Look at the SBT docs to find the installation procedure for your OS.
- Clone the sddf repository
- Run `sbt publishLocal` to compile SddF and make it available to other projects locally.

```sh
git clone https://github.com/numbnut/sddf.git
sbt publishLocal
```
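After `sbt publishLocal`, other local sbt projects can depend on the published artifact. The exact organization, artifact name, and version are defined in SddF's own build file; the coordinates below are placeholders, not the real ones:

```scala
// Hypothetical build.sbt entry; replace the organization, name, and
// version with the values from SddF's build definition.
libraryDependencies += "com.example" %% "sddf" % "0.1.0-SNAPSHOT"
```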
- Clone the sddf-example repository
- Run the example class with `sbt run`

```sh
git clone https://github.com/numbnut/sddf-example.git
sbt run
```
SddF is licensed under the GPLv3.