From 9c9984892ed6306a9e06f0e599a4ee5abdf0939c Mon Sep 17 00:00:00 2001
From: Robin Linacre
Date: Fri, 24 Jun 2022 10:10:31 +0100
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index dac0a6ce6a..bf38d75928 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@
 ✨✨ **Note to new users:** ✨✨
 
-Version 3 of Splink is in development that will make it simpler and more intuitive to use. It also removes the need for PySpark for smaller data linkages of up to around 1 million records. You can try it by installing a [pre-release](https://pypi.org/project/splink/#history), or in the new demos [here](https://github.com/moj-analytical-services/splink_demos/tree/splink3_demos). For new users, it may make sense to work with the new version, because it is quicker to learn. However, note that the new code is not yet fully tested.
+Version 3 of Splink is in development that will make it simpler and more intuitive to use. It also removes the need for PySpark for smaller data linkages of up to around 1 million records. You can find the documentation [here](https://moj-analytical-services.github.io/splink/index.html). You can try it by installing a [pre-release](https://pypi.org/project/splink/#history), or in the new demos [here](https://github.com/moj-analytical-services/splink_demos/tree/splink3_demos). For new users, it may make sense to work with the new version, because it is quicker to learn. However, note that the new code is not yet fully tested.
 
 # Splink: Probabilistic record linkage and deduplication at scale