From 14271d6debaea7510477bce24688a4ead48e3df9 Mon Sep 17 00:00:00 2001
From: Cabir C <64752006+Cabir40@users.noreply.github.com>
Date: Wed, 8 May 2024 23:20:41 +0300
Subject: [PATCH] added apple silicon installation (#1193)

---
 docs/en/licensed_install.md | 69 +++++++++++++++++++++++++++++++++++++
 1 file changed, 69 insertions(+)

diff --git a/docs/en/licensed_install.md b/docs/en/licensed_install.md
index 0b35ad9eb7..bbfd377be2
--- a/docs/en/licensed_install.md
+++ b/docs/en/licensed_install.md
@@ -353,6 +353,75 @@ conda activate sparknlp
 jupyter notebook
 ```
 
+## Apple Silicon Support
+
+**Installation for Apple Silicon (M1, M2, M3)**: Starting with version 4.0.0, Spark NLP provides experimental support for Apple Silicon.
+
+Make sure the following prerequisites are in place:
+
+1. Installing SDKMAN (you can also follow the official documentation at https://sdkman.io/install)
+    - `curl -s "https://get.sdkman.io" | bash`
+    - `source "$HOME/.sdkman/bin/sdkman-init.sh"`
+    - `sdk list java`
+      This lists the available Java versions:
+
+      ![image](https://github.com/JohnSnowLabs/spark-nlp-workshop/assets/64752006/9d05bd11-14c5-454e-bbab-fea4e91da905)
+
+
+2. Installing Java
+    - `sdk install java 8.0.402-amzn`
+    - `whereis java`
+    - `java -version`
+
+3. Installing Miniconda (you can also follow the official documentation at https://docs.anaconda.com/free/miniconda/#quick-command-line-install)
+    - `mkdir -p ~/miniconda3`
+    - `curl https://repo.anaconda.com/miniconda/Miniconda3-py39_23.11.0-2-MacOSX-arm64.sh -o ~/miniconda3/miniconda.sh` (you can change the Python version to 3.10 or 3.11)
+    - `bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3`
+    - `~/miniconda3/bin/conda init bash`
+    - `~/miniconda3/bin/conda init zsh`
+    - `source ~/miniconda3/bin/activate`
+
+4. Installing a Jupyter environment (alternatively, you can run the notebooks via `VSCode`)
+
+    ```bash
+    # use the base environment
+    conda --version
+    java -version
+    conda activate
+    pip install pyspark==3.4.0
+    pip install jupyter
+    conda env config vars set PYSPARK_PYTHON=python
+    conda activate
+    conda env config vars set PYSPARK_DRIVER_PYTHON=jupyter
+    conda activate
+    conda env config vars set PYSPARK_DRIVER_PYTHON_OPTS=notebook
+    conda activate
+    jupyter notebook
+    ```
+
+    ```bash
+    # or create a new sparknlp environment
+    conda --version
+    java -version
+    conda create -n sparknlp python=3.9 -y
+    conda activate sparknlp
+    pip install pyspark==3.4.0
+    pip install jupyter
+    conda env config vars set PYSPARK_PYTHON=python
+    conda activate sparknlp
+    conda env config vars set PYSPARK_DRIVER_PYTHON=jupyter
+    conda activate sparknlp
+    conda env config vars set PYSPARK_DRIVER_PYTHON_OPTS=notebook
+    conda activate sparknlp
+    jupyter notebook
+    ```
+
+5. Installing Spark NLP Healthcare
+
+    Please see the [Spark NLP Healthcare Installation Notebook](https://github.com/JohnSnowLabs/spark-nlp-workshop/blob/master/platforms/apple-silicon/installation.ipynb)
+
+
+
 
 ## Non-johnsnowlabs Clinical NLP on Ubuntu
 > These instructions use non-johnsnowlabs installation syntax. For simplified installation with `johnsnowlabs` library, check first section.
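
Before moving on to step 5, it can help to confirm that the toolchain from steps 1-4 is running natively on Apple Silicon rather than under Rosetta. The sketch below is a supplementary check, not part of the patch itself; it assumes the default Miniconda paths and the PySpark/Jupyter environment variables set in step 4, and it should be run inside the activated conda environment:

```bash
# Sanity check for the Apple Silicon setup described in steps 1-4.
uname -m                                                   # expected: arm64
java -version                                              # expected: a Java 8 (1.8.x) runtime installed via SDKMAN
python -c "import platform; print(platform.machine())"     # expected: arm64 (a native build, not x86_64)
python -c "import pyspark; print(pyspark.__version__)"     # expected: 3.4.0
echo "$PYSPARK_DRIVER_PYTHON $PYSPARK_DRIVER_PYTHON_OPTS"  # expected: jupyter notebook
```

If `uname -m` or `platform.machine()` reports `x86_64`, the terminal or the Python build is running under Rosetta; in that case, reinstall using the arm64 Miniconda installer from step 3 before proceeding.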