Remove Scala Docs Page, keep backup (#155)
* Remove Scala Docs Page, keep backup


---------

Co-authored-by: Mufaddal Rohawala <mufi@amazon.com>
mufaddal-rohawala and Mufaddal Rohawala committed Feb 22, 2024
1 parent e27ccff commit 6cdd969
Showing 5 changed files with 18 additions and 18 deletions.
2 changes: 1 addition & 1 deletion buildspec-release.yml
@@ -32,7 +32,7 @@ phases:
- tox -e flake8,twine,sphinx

# pyspark unit tests (no coverage)
-  - tox -e py37 -- tests/
+  - tox -e py38 -- tests/

# todo consider adding subset of integration tests

6 changes: 3 additions & 3 deletions buildspec.yml
@@ -41,7 +41,7 @@ phases:
# pyspark linters and unit tests
- cd $CODEBUILD_SRC_DIR/sagemaker-pyspark-sdk
- tox -e flake8,twine,sphinx
-  - tox -e py37,stats -- tests/
+  - tox -e py38,stats -- tests/

# spark integration tests
- cd $CODEBUILD_SRC_DIR/integration-tests/sagemaker-spark-sdk
@@ -51,6 +51,6 @@ phases:

# pyspark integration tests
- cd $CODEBUILD_SRC_DIR/sagemaker-pyspark-sdk
-  - IGNORE_COVERAGE=- tox -e py37 -- $CODEBUILD_SRC_DIR/integration-tests/sagemaker-pyspark-sdk/tests/ -n 10 --boxed --reruns 2
-  # - test_cmd="IGNORE_COVERAGE=- tox -e py37 -- $CODEBUILD_SRC_DIR/integration-tests/sagemaker-pyspark-sdk/tests/ -n 10 --boxed --reruns 2"
+  - IGNORE_COVERAGE=- tox -e py38 -- $CODEBUILD_SRC_DIR/integration-tests/sagemaker-pyspark-sdk/tests/ -n 10 --boxed --reruns 2
+  # - test_cmd="IGNORE_COVERAGE=- tox -e py38 -- $CODEBUILD_SRC_DIR/integration-tests/sagemaker-pyspark-sdk/tests/ -n 10 --boxed --reruns 2"
# - execute-command-if-has-matching-changes "$test_cmd" "src/" "tests/" "setup.*" "requirements.txt" "tox.ini" "buildspec.yml"
File renamed without changes.
@@ -33,10 +33,10 @@ class PCASageMakerEstimator(SageMakerEstimatorBase):
dictionary with entries in trainingSparkDataFormatOptions with key "labelColumnName" or
"featuresColumnName", with values corresponding to the desired label and features columns.
-  PCASageMakerEstimator uses
-  :class:`~sagemaker_pyspark.transformation.serializers.ProtobufRequestRowSerializer` to serialize
-  Rows into RecordIO-encoded Amazon Record protobuf messages for inference, by default selecting
-  the column named "features" expected to contain a Vector of Doubles.
+  :class:`~sagemaker_pyspark.transformation.serializers.ProtobufRequestRowSerializer` is used
+  by PCASageMakerEstimator to serialize Rows into RecordIO-encoded Amazon Record protobuf
+  messages for inference, by default selecting the column named "features" expected to contain
+  a Vector of Doubles.
Inferences made against an Endpoint hosting a PCA model contain a "projection" field appended
to the input DataFrame as a Dense Vector of Doubles.
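
For context, a minimal PySpark usage sketch (not part of this diff) of the behavior the docstring describes. It assumes a Spark session whose classpath already includes the sagemaker-spark jars; the role ARN, instance types, and the setFeatureDim/setNumComponents hyperparameter setters are illustrative assumptions to verify against the sagemaker_pyspark API.

from pyspark.ml.linalg import Vectors
from pyspark.sql import SparkSession
from sagemaker_pyspark import IAMRole
from sagemaker_pyspark.algorithms import PCASageMakerEstimator

# Assumes the sagemaker-spark jars are already on the Spark classpath.
spark = SparkSession.builder.getOrCreate()

# Training data with the default "features" column holding a Vector of Doubles.
training_df = spark.createDataFrame(
    [(Vectors.dense([1.0, 2.0, 3.0]),), (Vectors.dense([4.0, 5.0, 6.0]),)],
    ["features"],
)

# Placeholder role ARN and instance settings; replace with real values.
estimator = PCASageMakerEstimator(
    trainingInstanceType="ml.m5.xlarge",
    trainingInstanceCount=1,
    endpointInstanceType="ml.m5.xlarge",
    endpointInitialInstanceCount=1,
    sagemakerRole=IAMRole("arn:aws:iam::123456789012:role/example-sagemaker-role"),
)
# Hyperparameter setters assumed from the SageMaker PCA algorithm; verify the names.
estimator.setFeatureDim(3)
estimator.setNumComponents(2)

# fit() writes the training data to S3, runs a SageMaker PCA training job,
# and deploys the resulting model to an endpoint.
model = estimator.fit(training_df)

# transform() serializes each Row of the "features" column with
# ProtobufRequestRowSerializer, calls the endpoint, and appends the returned
# "projection" field as a dense Vector of Doubles.
projected_df = model.transform(training_df)
projected_df.show(truncate=False)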
20 changes: 10 additions & 10 deletions sagemaker-pyspark-sdk/tox.ini
@@ -1,14 +1,14 @@
[tox]
-  envlist = flake8,twine,sphinx,py37,stats
+  envlist = flake8,twine,sphinx,py38,stats
skip_missing_interpreters = False

[testenv]
deps =
-  coverage == 4.4
-  pytest
-  pytest-cov
-  pytest-rerunfailures
-  pytest-xdist
+  coverage>=5.2, <6.2
+  pytest==6.2.5
+  pytest-cov==3.0.0
+  pytest-rerunfailures==10.2
+  pytest-xdist==2.4.0

LANG=en_US.UTF-8
LANGUAGE=en_US:en
@@ -28,8 +28,8 @@ passenv =
[testenv:sphinx]
basepython=python3
deps =
-  sphinx
-  sphinx_rtd_theme
+  sphinx==5.1.1
+  sphinx-rtd-theme==0.5.0

commands = sphinx-build -b html docs html

@@ -44,8 +44,8 @@ commands =
[testenv:flake8]
basepython=python3
deps =
-  flake8
-  flake8_formatter_abspath
+  flake8==4.0.1
+  flake8_formatter_abspath==1.0.1

skip_install = true
commands=flake8 src/sagemaker_pyspark/ tests/ setup.py
