Commit

Make sure pyspark is 3.1.2 [skip test]
maziyarpanahi committed Nov 3, 2021
1 parent 13c0714 commit 0b9dd28
Showing 7 changed files with 18 additions and 18 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -149,7 +149,7 @@ $ java -version
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
# spark-nlp by default is based on pyspark 3.x
-$ pip install spark-nlp==3.3.2 pyspark
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2
```
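The point of this commit is that spark-nlp 3.3.2 is built against the pyspark 3.1.x line, so the previously unpinned `pip install ... pyspark` could pull in a newer, incompatible release. A minimal sketch of that constraint as a version check (the exact set of supported series is an assumption for illustration, not taken from this commit):

```python
def pyspark_supported(version: str) -> bool:
    """Return True if a pyspark version string falls in the series that
    spark-nlp 3.3.2 is assumed to target (the 3.0.x / 3.1.x lines)."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) in {(3, 0), (3, 1)}

print(pyspark_supported("3.1.2"))  # True: the version pinned in this commit
print(pyspark_supported("3.2.0"))  # False: a newer series an unpinned install could fetch
```

In practice the pin `pyspark==3.1.2` in the install command makes such a check unnecessary at install time; the sketch only makes the encoded compatibility rule explicit.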

In Python console or Jupyter `Python3` kernel:
@@ -689,7 +689,7 @@ The easiest way to get this done on Linux and macOS is to simply install `spark-
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
# spark-nlp by default is based on pyspark 3.x
-$ pip install spark-nlp==3.3.2 pyspark jupyter
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2 jupyter
$ jupyter notebook
```

2 changes: 1 addition & 1 deletion docs/_layouts/landing.html
@@ -201,7 +201,7 @@ <h3 class="grey h3_title">{{ _section.title }}</h3>
<div class="highlight-box">
{% highlight bash %}
# Install Spark NLP from PyPI
-$ pip install spark-nlp==3.3.2 pyspark
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2

# Install Spark NLP from Anaconda/Conda
$ conda install -c johnsnowlabs spark-nlp
4 changes: 2 additions & 2 deletions docs/en/concepts.md
@@ -4,7 +4,7 @@ header: true
title: General Concepts
permalink: /docs/en/concepts
key: docs-concepts
-modify_date: "2021-08-31"
+modify_date: "2021-11-03"
use_language_switcher: "Python-Scala"
show_nav: true
sidebar:
@@ -62,7 +62,7 @@ $ java -version
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
# spark-nlp by default is based on pyspark 3.x
-$ pip install spark-nlp==3.3.2 pyspark jupyter
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2 jupyter
$ jupyter notebook
```

16 changes: 8 additions & 8 deletions docs/en/developers.md
@@ -5,15 +5,15 @@ seotitle: Spark NLP
title: Developers Guideline
permalink: /docs/en/developers
key: docs-developers
-modify_date: "2020-05-08"
+modify_date: "2021-11-03"
show_nav: true
sidebar:
nav: sparknlp
---


Spark NLP is an open-source library and everyone's contribution is welcome!
-In this section we provide a guide on how to setup your environment using IntelliJ IDEA for a smoother start. You can also check our video tutorials available on our YouTube channel: https://www.youtube.com/johnsnowlabs
+In this section we provide a guide on how to setup your environment using IntelliJ IDEA for a smoother start. You can also check our video tutorials available on our YouTube channel: https://www.youtube.com/johnsnowlabs

## Setting up the Environment

@@ -58,7 +58,7 @@ When the repo cloned IDE will detect SBT file with dependencies. Click **Yes** t

![Pop up build](\assets\images\pop_up_build.png)

-In the **Import from sbt** pop up make sure you have JDK 8 detected. Click **Ok** to proceed and download required resources.
+In the **Import from sbt** pop up make sure you have JDK 8 detected. Click **Ok** to proceed and download required resources.

![Pop up settings build](\assets\images\settings_build.png)

@@ -72,11 +72,11 @@ Next step is to install Python plugin to the IntelliJ IDEA. To do this, open `Fi

![Python plugin](\assets\images\python_plugin.png)

-After this steps you can check project structure in the `File -> Project Structure -> Modules`.
+After this steps you can check project structure in the `File -> Project Structure -> Modules`.

![Project structure](\assets\images\project_structure.png)

-Make sure what you have `spark-nlp` and `spark-nlp-build` folders and no errors in the exported dependencies.
+Make sure what you have `spark-nlp` and `spark-nlp-build` folders and no errors in the exported dependencies.

In the `Project` settings check what project SDK is set to 1.8 and in `Platform Settings -> SDK's` you have Java installation as well as Python installation.

@@ -94,7 +94,7 @@ If you don't see Python installed in the `SDK's` tab click **+** button, add **P

### Run tests in Scala

-Click **Add configuration** in the Top right corner. In the pop up click on the **+** and look for **sbt task**.
+Click **Add configuration** in the Top right corner. In the pop up click on the **+** and look for **sbt task**.

![Add config](\assets\images\add_config.png)

@@ -178,7 +178,7 @@ source venv/bin/activate
after install packages by running

```
-pip install pyspark numpy
+pip install pyspark==3.1.2 numpy
```
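Pinning at install time only helps if the interpreter that later runs the build actually picked up that version. A hedged sketch of a runtime guard along those lines (`check_pyspark` is a hypothetical helper for illustration, not part of this repository; a real check would compare against `pyspark.__version__`):

```python
def check_pyspark(version: str, required: str = "3.1.2") -> None:
    """Raise if the detected pyspark version is not in the required
    major.minor series; patch-level differences are tolerated."""
    found = tuple(int(part) for part in version.split(".")[:2])
    wanted = tuple(int(part) for part in required.split(".")[:2])
    if found != wanted:
        raise RuntimeError(f"pyspark {version} found, {required}-series expected")

check_pyspark("3.1.2")  # passes silently: matches the pinned series
```

Calling it with a version from another series (say `"3.2.0"`) would raise `RuntimeError`.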

</div><div class="h3-box" markdown="1">
@@ -194,7 +194,7 @@ In the **Name** field put `AssemblyCopy`. In the **Tasks** field write down `ass

You can find created jar in the folder ``spark-nlp/python/lib/sparknlp.jar``

-*Note: Assembly command creates a fat jars, that includes all dependencies within*
+*Note: Assembly command creates a fat jars, that includes all dependencies within*

</div>

4 changes: 2 additions & 2 deletions docs/en/examples.md
@@ -4,7 +4,7 @@ header: true
title: Examples
key: docs-examples
permalink: /docs/en/examples
-modify_date: "2021-08-31"
+modify_date: "2021-11-03"
---

Showcasing notebooks and codes of how to use Spark NLP in Python and Scala.
@@ -16,7 +16,7 @@ $ java -version
# should be Java 8 (Oracle or OpenJDK)
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
-$ pip install spark-nlp==3.3.2 pyspark
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2
```

## Google Colab Notebook
2 changes: 1 addition & 1 deletion docs/en/install.md
@@ -47,7 +47,7 @@ $ java -version
# should be Java 8 (Oracle or OpenJDK)
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
-$ pip install spark-nlp==3.3.2 pyspark
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2
```

Of course you will need to have jupyter installed in your system:
4 changes: 2 additions & 2 deletions python/README.md
@@ -149,7 +149,7 @@ $ java -version
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
# spark-nlp by default is based on pyspark 3.x
-$ pip install spark-nlp==3.3.2 pyspark
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2
```

In Python console or Jupyter `Python3` kernel:
@@ -689,7 +689,7 @@ The easiest way to get this done on Linux and macOS is to simply install `spark-
$ conda create -n sparknlp python=3.7 -y
$ conda activate sparknlp
# spark-nlp by default is based on pyspark 3.x
-$ pip install spark-nlp==3.3.2 pyspark jupyter
+$ pip install spark-nlp==3.3.2 pyspark==3.1.2 jupyter
$ jupyter notebook
```

Expand Down

0 comments on commit 0b9dd28