Release/440 release candidate (#13742)
* SPARKNLP-782 Removes deprecated parameter enablePatternRegex (#13664)

* SPARKNLP-748: Custom Entity Name for Date2Chunk (#13680)

- added parameter "entityName" to change metadata name
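
A minimal sketch of the new option (Python import paths and the `setEntityName` setter are assumed from the description above; the entity label is illustrative):

```python
# Sketch only: Date2Chunk with a custom entity name in the chunk metadata.
from pyspark.ml import Pipeline
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import DateMatcher, Date2Chunk

document = DocumentAssembler().setInputCol("text").setOutputCol("document")
dates = DateMatcher().setInputCols(["document"]).setOutputCol("date")
date_chunk = (
    Date2Chunk()
    .setInputCols(["date"])
    .setOutputCol("date_chunk")
    .setEntityName("DATE_OF_VISIT")  # assumed setter; replaces the default entity label
)
pipeline = Pipeline(stages=[document, dates, date_chunk])
```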

* SPARKNLP-784 Fix loading WordEmbeddingsModel bug when cache_folder is from S3 (#13707)
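
For context, a hedged sketch of the setup this fix targets: downloading a pretrained model through a `cache_folder` that lives on S3 (the bucket path is illustrative; AWS credentials are configured separately):

```python
# Sketch only: point the pretrained cache to S3, the scenario fixed here.
import sparknlp
from sparknlp.annotator import WordEmbeddingsModel

spark = sparknlp.start(cache_folder="s3://my-bucket/cache_pretrained")

embeddings = (
    WordEmbeddingsModel.pretrained("glove_100d")  # downloads through the S3 cache_folder
    .setInputCols(["document", "token"])
    .setOutputCol("embeddings")
)
```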

* SPARKNLP-605: ConvNextForImageClassification (#13713)

* SPARKNLP-605: ConvNextForImageClassification

- Added ConvNextForImageClassification with new tests
- Refactored image Preprocessor and added new config
- Implemented filters with resample property for
  ImageResizeUtils.resizeBufferedImage (with minor
  performance gain)
- Minor improvements for ViT and Swin

* SPARKNLP-605: Docs

* SPARKNLP-605: Lazy values for test
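
A hedged usage sketch for the new ConvNextForImageClassification annotator described above, following the existing ViT/Swin pattern (paths and column names are illustrative; `pretrained()` resolves this release's default model):

```python
# Sketch only: image classification with the new ConvNextForImageClassification,
# mirroring the ViT and Swin annotators.
import sparknlp
from pyspark.ml import Pipeline
from sparknlp.base import ImageAssembler
from sparknlp.annotator import ConvNextForImageClassification

spark = sparknlp.start()
image_df = spark.read.format("image").option("dropInvalid", True).load("path/to/images")

image_assembler = ImageAssembler().setInputCol("image").setOutputCol("image_assembler")
classifier = (
    ConvNextForImageClassification.pretrained()  # this release's default model
    .setInputCols(["image_assembler"])
    .setOutputCol("class")
)

predictions = Pipeline(stages=[image_assembler, classifier]).fit(image_df).transform(image_df)
predictions.select("image.origin", "class.result").show(truncate=False)
```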

* SPARKNLP-785 Fix WordEmbeddingsModel bug with LightPipeline (#13715)
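
For context, a sketch of the combination this fixes: `WordEmbeddingsModel` with in-memory storage used through a `LightPipeline` (model and column names are illustrative):

```python
# Sketch only: WordEmbeddingsModel with in-memory storage inside a LightPipeline,
# the combination addressed by this fix.
import sparknlp
from pyspark.ml import Pipeline
from sparknlp.base import DocumentAssembler, LightPipeline
from sparknlp.annotator import Tokenizer, WordEmbeddingsModel

spark = sparknlp.start()

document = DocumentAssembler().setInputCol("text").setOutputCol("document")
tokenizer = Tokenizer().setInputCols(["document"]).setOutputCol("token")
embeddings = (
    WordEmbeddingsModel.pretrained("glove_100d")
    .setInputCols(["document", "token"])
    .setOutputCol("embeddings")
    .setEnableInMemoryStorage(True)
)

empty = spark.createDataFrame([[""]], ["text"])
model = Pipeline(stages=[document, tokenizer, embeddings]).fit(empty)
light = LightPipeline(model)
print(light.annotate("Spark NLP is an open-source text processing library."))
```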

* [skip test] SPARKNLP-783: Python 3.6 deprecated in Spark 3.2 (#13724)

* SPARKNLP-763 Implementing ZeroShot Text Classification for BERT and DistilBERT based on NLI (#13727)

* SPARKNLP-763 Fix a typo

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 add unfinished traits

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Create a new BertForZeroShotClassification annotator

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Create a new HasCandidateLabelsProperties

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Implement predict sequence with NLI, new tokenize from strings, and new tag ZeroShot

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Clean up the code

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Add BertForZeroShotClassification to annotator [skip test]

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Add BertForZeroShotClassification to ResourceDownloader [skip test]

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Implement BertForZeroShotClassification in Python [skip test]

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Add unit tests for BertForZeroShotClassification

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* change default model to bert_base_cased_zero_shot_classifier_xnli

* SPARKNLP-763 Fix Scaladoc and Pydoc

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-763 Fix and update unit test in Scala

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

---------

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
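
A hedged usage sketch for the new BertForZeroShotClassification annotator described above; the default model name comes from this change set, and `setCandidateLabels` is assumed from the new HasCandidateLabelsProperties trait:

```python
# Sketch only: zero-shot text classification with candidate labels supplied at
# inference time. Default model name taken from this change set.
import sparknlp
from pyspark.ml import Pipeline
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import Tokenizer, BertForZeroShotClassification

spark = sparknlp.start()

document = DocumentAssembler().setInputCol("text").setOutputCol("document")
tokenizer = Tokenizer().setInputCols(["document"]).setOutputCol("token")
zero_shot = (
    BertForZeroShotClassification.pretrained("bert_base_cased_zero_shot_classifier_xnli")
    .setInputCols(["document", "token"])
    .setOutputCol("class")
    .setCandidateLabels(["weather", "sports", "politics", "technology"])  # assumed setter
)

data = spark.createDataFrame([["I have a problem with my iPhone that needs to be resolved asap!"]], ["text"])
Pipeline(stages=[document, tokenizer, zero_shot]).fit(data).transform(data) \
    .select("class.result").show(truncate=False)
```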

* SPARKNLP-534 Introducing BART Transformer for text-to-text generation tasks like translation and summarization (#13731)

* WIP: Added Bart transformer scala files

* WIP: Added BART tokenizer test and BART is locally working

* WIP: Added BART tokenizer test and BART is locally working

* WIP: Added Beam Hypothesis and Beam Scorer implementations

* WIP: Added Logit Processors

* WIP: Added Beam Search implementation

* WIP: Completed Beam Search implementation
WIP: Added Generate method for text generation

* WIP: fixed a bug in Beam search algorithm
WIP: Generate method for text generation

* WIP: changed BartTransformer methods to include beam size and added description

* WIP: changed BartTransformer test methods

* WIP: fixed errors in BeamSearch

* WIP: Updated to use separate encoder decoder model

* WIP: Changed model to handle the int64 version of the model weights

* WIP: Added python API implementation

* Pass session and encoder state as a parameter
Clean up unnecessary code

* Update TopK Logit Warper Logic

* Code clean up

* Update Tests

* Update documentation

* Update documentation and python tests

* Update python tests

* SPARKNLP-534 move BartTokenizer to the Bart backend

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-534 Fix the copyright year

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-534 Add BartTransformer to annotator and ResourceDownloader

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-534 Fix BartTransformer in annotator

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

---------

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
Co-authored-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
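
A hedged usage sketch for the new BartTransformer described above; `distilbart_xsum_12_6` is the default model chosen later in this release, and the generation setter is an assumption based on the other seq2seq annotators:

```python
# Sketch only: abstractive summarization with the new BartTransformer.
import sparknlp
from pyspark.ml import Pipeline
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer

spark = sparknlp.start()

document = DocumentAssembler().setInputCol("text").setOutputCol("documents")
bart = (
    BartTransformer.pretrained("distilbart_xsum_12_6")
    .setInputCols(["documents"])
    .setOutputCol("summaries")
    .setMaxOutputLength(60)  # assumed setter, by analogy with the other seq2seq annotators
)

data = spark.createDataFrame(
    [["PG&E stated it scheduled the blackouts in response to forecasts for high winds."]],
    ["text"])
Pipeline(stages=[document, bart]).fit(data).transform(data) \
    .select("summaries.result").show(truncate=False)
```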

* Bump version to 4.4.0

* Update doc style and fix unit test [skip test]

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-605: Fix parameter eval for vit tests

* Update default model name (#13744)

* SPARKNLP-796 Creating a new `nerHasNoSchema` param (#13745)

* Adding missing CPUvsGPUbenchmark page

* SPARKNLP-796 Creating a new `nerHasNoSchema` param

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

---------

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
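
A minimal sketch of the new param on NerConverter (the Python setter name is assumed to follow the usual `set...` convention):

```python
# Sketch only: NerConverter for NER outputs whose labels carry no IOB/IOB2 schema
# (e.g. "PER" instead of "B-PER"/"I-PER").
from sparknlp.annotator import NerConverter

ner_converter = (
    NerConverter()
    .setInputCols(["document", "token", "ner"])
    .setOutputCol("ner_chunk")
    .setNerHasNoSchema(True)  # assumed setter for the new nerHasNoSchema param
)
```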

* Change default model for BART to distilbart-xsum-12-6

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* Change default model for BART to distilbart_xsum_12_6

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* Replace nlp.johnsnowlabs.com with the sparknlp.org website

* Update INT64 to INT32 (#13748)

* Fix the wrong column in unit test [skip test]

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>

* SPARKNLP-805: Documentation for release/440 (#13743)

* Fixed memory leak

* Added Bart Notebook

* Add new features and update docs [run doc]

* Update install.md

* Update CHANGELOG [run doc]

* Update Scala and Python APIs

* release spark-nlp 4.4.0 on Conda [skip test]

---------

Signed-off-by: Maziyar Panahi <maziyar.panahi@iscpif.fr>
Co-authored-by: Danilo Burbano <37355249+danilojsl@users.noreply.github.com>
Co-authored-by: Devin Ha <33089471+DevinTDHa@users.noreply.github.com>
Co-authored-by: Prabod Rathnayaka <prabod@rathnayaka.me>
Co-authored-by: Devin Ha <t.ha@tu-berlin.de>
Co-authored-by: github-actions <action@github.com>
6 people committed Apr 10, 2023
1 parent 12c7baf commit 684f8c9
Showing 1,743 changed files with 62,015 additions and 12,627 deletions.
4 changes: 2 additions & 2 deletions .github/ISSUE_TEMPLATE/config.yml
@@ -5,5 +5,5 @@ contact_links:
url: https://github.com/JohnSnowLabs/spark-nlp/discussions/5669
about: Discussion about importing models from other libraries
- name: Want to contribute a model? Visit the NLP Models Hub to upload your model.
url: https://nlp.johnsnowlabs.com/models
about: A place for sharing and discovering Spark NLP models and pipelines
url: https://sparknlp.org/models
about: A place for sharing and discovering Spark NLP models and pipelines
22 changes: 21 additions & 1 deletion CHANGELOG
@@ -1,3 +1,23 @@
========
4.4.0
========
----------------
New Features
----------------
* Implement a new Zero-Shot Text Classification annotator for BERT called `BertForZeroShotClassification`
* Implement a new `ConvNextForImageClassification` annotator
* Introduce the BART Transformer for text-to-text generation tasks such as translation and summarization
* Set a custom entity name in `Date2Chunk` via the `setEntityName` param
* Add a new `nerHasNoSchema` param to NerConverter for when labels coming from NerDLModel and NerCrfModel don't have any schema
----------------
Bug Fixes & Enhancements
----------------
* Fix a bug loading `WordEmbeddingsModel` when the `cache_folder` config points to S3
* Fix a `WordEmbeddingsModel` failure when it is used with `setEnableInMemoryStorage` set to `True` in a LightPipeline
* Remove deprecated parameter enablePatternRegex from EntityRulerApproach & EntityRulerModel
* Deprecate Python 3.6


========
4.3.2
========
@@ -798,7 +818,7 @@ New Features
Backward compatibility
----------------

* We have updated our MarianTransformer annotator to be compatible with TF v2 models. This change is not compatible with previous models/pipelines. However, we have updated and uploaded all the models and pipelines for `3.1.x` release. You can either use `MarianTransformer.pretrained(MODEL_NAME)` and it will automatically download the compatible model or you can visit [Models Hub](https://nlp.johnsnowlabs.com/models) to download the compatible models for offline use via `MarianTransformer.load(PATH)`
* We have updated our MarianTransformer annotator to be compatible with TF v2 models. This change is not compatible with previous models/pipelines. However, we have updated and uploaded all the models and pipelines for `3.1.x` release. You can either use `MarianTransformer.pretrained(MODEL_NAME)` and it will automatically download the compatible model or you can visit [Models Hub](https://sparknlp.org/models) to download the compatible models for offline use via `MarianTransformer.load(PATH)`

========
3.0.3
141 changes: 73 additions & 68 deletions README.md

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions build.sbt
@@ -6,7 +6,7 @@ name := getPackageName(is_silicon, is_gpu, is_aarch64)

organization := "com.johnsnowlabs.nlp"

version := "4.3.2"
version := "4.4.0"

(ThisBuild / scalaVersion) := scalaVer

@@ -54,7 +54,7 @@ publishTo := {
else Some("releases" at nexus + "service/local/staging/deploy/maven2")
}

homepage := Some(url("https://nlp.johnsnowlabs.com"))
homepage := Some(url("https://sparknlp.org"))

scmInfo := Some(
ScmInfo(
8 changes: 4 additions & 4 deletions conda/meta.yaml
@@ -1,15 +1,15 @@
package:
name: "spark-nlp"
version: 4.3.2
version: 4.4.0

app:
entry: spark-nlp
summary: Natural Language Understanding Library for Apache Spark.

source:
fn: spark-nlp-4.3.2.tar.gz
url: https://files.pythonhosted.org/packages/40/a7/6d450edede7a7f54b3a5cd78fe3d521bad33ada0f69de0b542c1ab13f3bd/spark-nlp-4.3.2.tar.gz
sha256: 749d591175a7c88c96d75dcd84565a37216df5ca76aac5200a0d7214c0440022
fn: spark-nlp-4.4.0.tar.gz
url: https://files.pythonhosted.org/packages/2e/aa/19e34297e3cc22a0f361c22cc366418158612bca8f5b9e3959c1f066f747/spark-nlp-4.4.0.tar.gz
sha256: e76fdd82b966ca169ba8a1fdcfe2e684fc63abaf88de841d2eb881cacb5e0105
build:
noarch: generic
number: 0
38 changes: 20 additions & 18 deletions docs/_layouts/landing.html
@@ -201,22 +201,22 @@ <h3 class="grey h3_title">{{ _section.title }}</h3>
<div class="highlight-box">
{% highlight bash %}
# Install Spark NLP from PyPI
$ pip install spark-nlp==4.3.2 pyspark==3.3.0
$ pip install spark-nlp==4.4.0 pyspark==3.3.0

# Install Spark NLP from Anaconda/Conda
$ conda install -c johnsnowlabs spark-nlp

# Load Spark NLP with Spark Shell
$ spark-shell --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.3.2
$ spark-shell --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.4.0

# Load Spark NLP with PySpark
$ pyspark --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.3.2
$ pyspark --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.4.0

# Load Spark NLP with Spark Submit
$ spark-submit --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.3.2
$ spark-submit --packages com.johnsnowlabs.nlp:spark-nlp_2.12:4.4.0

# Load Spark NLP as an external Fat JAR
$ spark-shell --jars spark-nlp-assembly-4.3.2.jar
$ spark-shell --jars spark-nlp-assembly-4.4.0.jar
{% endhighlight %}
</div>
</div>
@@ -236,8 +236,8 @@ <h2 class="h2_title grey">Transformers at Scale</h2>
<div class="transformer-descr">
<b>Spark NLP</b> is the only open-source NLP library in <b>production</b> that offers state-of-the-art transformers such as
<b>BERT</b>, <b>CamemBERT</b>, <b>ALBERT</b>, <b>ELECTRA</b>, <b>XLNet</b>, <b>DistilBERT</b>, <b>RoBERTa</b>, <b>DeBERTa</b>,
<b>XLM-RoBERTa</b>, <b>Longformer</b>, <b>ELMO</b>, <b>Universal Sentence Encoder</b>, <b>Google T5</b>, <b>MarianMT</b>, and <b>OpenAI GPT2</b> not only to <b>Python</b>, and <b>R</b>
but also to <b>JVM</b> ecosystem (<b>Java</b>, <b>Scala</b>, and <b>Kotlin</b>) at <b>scale</b> by extending <b>Apache Spark</b> natively
<b>XLM-RoBERTa</b>, <b>Longformer</b>, <b>ELMO</b>, <b>Universal Sentence Encoder</b>, <b>Facebook BART</b>, <b>Google T5</b>, <b>MarianMT</b>, <b>OpenAI GPT2</b>,
<b>Google ViT</b>, <b>ASR Wav2Vec2</b> and many more not only to <b>Python</b>, and <b>R</b> but also to <b>JVM</b> ecosystem (<b>Java</b>, <b>Scala</b>, and <b>Kotlin</b>) at <b>scale</b> by extending <b>Apache Spark</b> natively
</div>
</div>
</div>
@@ -313,18 +313,11 @@ <h4 class="blue h4_title">NLP Features</h4>
<li><strong>ALBERT</strong> Embeddings</li>
<li><strong>XLNet</strong> Embeddings</li>
<li><strong>ELMO</strong> Embeddings</li>

</ul>
<ul class="list1">

<li><strong>Universal Sentence</strong> Encoder</li>
<li><strong>Sentence</strong> Embeddings</li>
<li><strong>Chunk</strong> Embeddings</li>
<li>Neural <strong>Machine Translation</strong> (MarianMT)</li>
<li><strong>Text-To-Text</strong> Transfer Transformer <strong>(Google T5)</strong></li>
<li><strong>Generative Pre-trained</strong> Transformer 2 <strong>(OpenAI GPT-2)</strong></li>
<li>Vision Transformer (ViT) <strong>Image Classification</strong></li>
<li>Automatic Speech Recognition <strong>(Wav2Vec2)</strong></li>
</ul>
<ul class="list1">
<li>Table Question Answering <strong>(TAPAS)</strong></li>
<li>Unsupervised <strong>keywords extraction</strong></li>
<li>Language <strong>Detection</strong> & <strong>Identification</strong> (up to 375 languages)</li>
@@ -342,11 +335,20 @@ <h4 class="blue h4_title">NLP Features</h4>
<li>Longformer for <strong>Token & Sequence Classification</strong></li>
<li>Transformer-based <strong>Question Answering</strong></li>
<li><strong>Named entity</strong> recognition (DL model)</li>
<li>Facebook BART <strong>NLG, Translation, and Comprehension</strong></li>
<li>Zero-Shot <strong>NER & Text</strong> Classification (ZSL)</li>
<li>Neural <strong>Machine Translation</strong> (MarianMT)</li>
<li><strong>Text-To-Text</strong> Transfer Transformer <strong>(Google T5)</strong></li>
<li><strong>Generative Pre-trained</strong> Transformer 2 <strong>(OpenAI GPT-2)</strong></li>
<li>Vision Transformer (Google ViT) <strong>Image Classification</strong></li>
<li>Microsoft Swin Transformer <strong>Image Classification</strong></li>
<li>Facebook ConvNext <strong>Image Classification</strong></li>
<li>Automatic Speech Recognition <strong>(Wav2Vec2 & HuBERT)</strong></li>
<li>Easy <strong>TensorFlow</strong> integration</li>
<li><strong>GPU</strong> Support</li>
<li>Full integration with <strong>Spark ML</strong> functions</li>
<li><strong>9400+</strong> pre-trained <strong>models </strong> in <strong>200+ languages! </strong>
<li><strong>3200+</strong> pre-trained <strong>pipelines </strong> in <strong>200+ languages! </strong>
<li><strong>12000+</strong> pre-trained <strong>models </strong> in <strong>200+ languages! </strong>
<li><strong>5000+</strong> pre-trained <strong>pipelines </strong> in <strong>200+ languages! </strong>
</ul>
</div>
{% highlight python %}
8 changes: 4 additions & 4 deletions docs/api/com/index.html
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com" />
<title>Spark NLP 4.4.0 ScalaDoc - com</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
8 changes: 4 additions & 4 deletions docs/api/com/johnsnowlabs/client/CredentialParams.html
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.CredentialParams</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.CredentialParams" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com.johnsnowlabs.client.CredentialParams" />
<title>Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.CredentialParams</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.CredentialParams" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com.johnsnowlabs.client.CredentialParams" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSAnonymousCredentials</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSAnonymousCredentials" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com.johnsnowlabs.client.aws.AWSAnonymousCredentials" />
<title>Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSAnonymousCredentials</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSAnonymousCredentials" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com.johnsnowlabs.client.aws.AWSAnonymousCredentials" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
8 changes: 4 additions & 4 deletions docs/api/com/johnsnowlabs/client/aws/AWSBasicCredentials.html
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSBasicCredentials</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSBasicCredentials" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com.johnsnowlabs.client.aws.AWSBasicCredentials" />
<title>Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSBasicCredentials</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSBasicCredentials" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com.johnsnowlabs.client.aws.AWSBasicCredentials" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSCredentialsProvider</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSCredentialsProvider" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com.johnsnowlabs.client.aws.AWSCredentialsProvider" />
<title>Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSCredentialsProvider</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSCredentialsProvider" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com.johnsnowlabs.client.aws.AWSCredentialsProvider" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
14 changes: 7 additions & 7 deletions docs/api/com/johnsnowlabs/client/aws/AWSGateway.html
@@ -3,9 +3,9 @@
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
<title>Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSGateway</title>
<meta name="description" content="Spark NLP 4.3.2 ScalaDoc - com.johnsnowlabs.client.aws.AWSGateway" />
<meta name="keywords" content="Spark NLP 4.3.2 ScalaDoc com.johnsnowlabs.client.aws.AWSGateway" />
<title>Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSGateway</title>
<meta name="description" content="Spark NLP 4.4.0 ScalaDoc - com.johnsnowlabs.client.aws.AWSGateway" />
<meta name="keywords" content="Spark NLP 4.4.0 ScalaDoc com.johnsnowlabs.client.aws.AWSGateway" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />


@@ -28,7 +28,7 @@
</head>
<body>
<div id="search">
<span id="doc-title">Spark NLP 4.3.2 ScalaDoc<span id="doc-version"></span></span>
<span id="doc-title">Spark NLP 4.4.0 ScalaDoc<span id="doc-version"></span></span>
<span class="close-results"><span class="left">&lt;</span> Back</span>
<div id="textfilter">
<span class="input">
@@ -450,9 +450,9 @@ <h3>Value Members</h3>


</li><li name="com.johnsnowlabs.client.aws.AWSGateway#downloadFilesFromDirectory" visbl="pub" class="indented0 " data-isabs="false" fullComment="no" group="Ungrouped">
<a id="downloadFilesFromDirectory(bucketName:String,keyPrefix:String,directoryPath:java.io.File):Unit"></a><a id="downloadFilesFromDirectory(String,String,File):Unit"></a>
<a id="downloadFilesFromDirectory(bucketName:String,keyPrefix:String,directoryPath:java.io.File,isIndex:Boolean):Unit"></a><a id="downloadFilesFromDirectory(String,String,File,Boolean):Unit"></a>
<span class="permalink">
<a href="../../../../com/johnsnowlabs/client/aws/AWSGateway.html#downloadFilesFromDirectory(bucketName:String,keyPrefix:String,directoryPath:java.io.File):Unit" title="Permalink">
<a href="../../../../com/johnsnowlabs/client/aws/AWSGateway.html#downloadFilesFromDirectory(bucketName:String,keyPrefix:String,directoryPath:java.io.File,isIndex:Boolean):Unit" title="Permalink">
<i class="material-icons"></i>
</a>
</span>
@@ -461,7 +461,7 @@
<span class="kind">def</span>
</span>
<span class="symbol">
<span class="name">downloadFilesFromDirectory</span><span class="params">(<span name="bucketName">bucketName: <span class="extype" name="scala.Predef.String">String</span></span>, <span name="keyPrefix">keyPrefix: <span class="extype" name="scala.Predef.String">String</span></span>, <span name="directoryPath">directoryPath: <span class="extype" name="java.io.File">File</span></span>)</span><span class="result">: <span class="extype" name="scala.Unit">Unit</span></span>
<span class="name">downloadFilesFromDirectory</span><span class="params">(<span name="bucketName">bucketName: <span class="extype" name="scala.Predef.String">String</span></span>, <span name="keyPrefix">keyPrefix: <span class="extype" name="scala.Predef.String">String</span></span>, <span name="directoryPath">directoryPath: <span class="extype" name="java.io.File">File</span></span>, <span name="isIndex">isIndex: <span class="extype" name="scala.Boolean">Boolean</span> = <span class="symbol">false</span></span>)</span><span class="result">: <span class="extype" name="scala.Unit">Unit</span></span>
</span>

