
Resolve issue #2018 #2022

Merged: 158 commits, Feb 19, 2024
23dd95b
Issue with TF
miguelgfierro Oct 12, 2023
212d5ac
Comment out the PR gate affected tests with the upgrade to TF>2.10.1
miguelgfierro Oct 13, 2023
5ce18fa
Comment out the nightly builds affected tests with the upgrade to TF>…
miguelgfierro Oct 13, 2023
46c76a5
:bug:
miguelgfierro Oct 13, 2023
7e86950
Comment out the nightly builds affected tests with the upgrade to TF>…
miguelgfierro Oct 13, 2023
0e4d72b
revert the breaking tests with TF 2.10.1
miguelgfierro Oct 13, 2023
1d13495
temporary pin to TF=2.8.4
miguelgfierro Oct 13, 2023
c3f9d67
Update security tests
miguelgfierro Oct 13, 2023
1d543c7
Try to resolve #2018
SimonYansenZhao Oct 17, 2023
4d86d38
Exclude tensorflow versions that are not supported
SimonYansenZhao Oct 17, 2023
90c3920
Correct version comparison using packaging.version.Version
SimonYansenZhao Oct 17, 2023
eb47a4f
Capture importerror
SimonYansenZhao Oct 17, 2023
0e580bc
Restrict tensorflow < 2.13
SimonYansenZhao Oct 17, 2023
7973235
Set tensorflow < 2.12
SimonYansenZhao Oct 17, 2023
4c39a80
Not triggering unit tests on Draft PR (#2033)
loomlike Nov 3, 2023
22d8707
Refactor ranking metric `map` to be the same as Spark's (#2004)
loomlike Nov 3, 2023
6fa92b5
Add missing kernelspec language
SimonYansenZhao Nov 13, 2023
00102be
Remove scrapbook and papermill deps
miguelgfierro Oct 30, 2023
f285677
notebook utils programmatic execution
miguelgfierro Oct 30, 2023
cad762e
Test notebook programmatic
miguelgfierro Oct 31, 2023
e55b311
Added test notebook for utils
miguelgfierro Oct 31, 2023
76dc706
data notebooks
miguelgfierro Oct 31, 2023
ccd98c5
Replace papermill and scrapbook for new internal function
miguelgfierro Oct 31, 2023
e855606
Replace papermill and scrapbook for new internal function
miguelgfierro Oct 31, 2023
f662d17
Update new programmatic execution code
miguelgfierro Oct 31, 2023
a712885
Update new programmatic execution code
miguelgfierro Oct 31, 2023
9db40a6
Update notebooks with new utility
miguelgfierro Oct 31, 2023
d8378e8
:bug:
miguelgfierro Oct 31, 2023
84b0bbd
Issue with xDeepFM WIP
miguelgfierro Oct 31, 2023
53fcbcf
:bug:
miguelgfierro Oct 31, 2023
e997621
:bug:
miguelgfierro Nov 3, 2023
de7633e
Document the tests in programmatic notebook
miguelgfierro Nov 3, 2023
e78b8b8
:memo:
miguelgfierro Nov 3, 2023
ccfb5ca
WIP
miguelgfierro Nov 3, 2023
72192f9
WIP
miguelgfierro Nov 4, 2023
428820e
Import missing store_metadata
SimonYansenZhao Nov 13, 2023
0620ccd
Correct pattern matching and substitution
SimonYansenZhao Nov 14, 2023
393ac47
Merge multiline parameters into one line
SimonYansenZhao Nov 18, 2023
fa4f87c
Increase timeout
SimonYansenZhao Dec 18, 2023
9befa84
Fix nightly test errors (#2045)
loomlike Dec 21, 2023
7df82f3
Fix benchmarks last cell to store value, not [value]
loomlike Dec 21, 2023
f9e4b34
:memo:
miguelgfierro Dec 23, 2023
56d9369
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
7b83ae5
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
2151469
:memo:
miguelgfierro Dec 23, 2023
4c07eb1
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
f407446
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
32ea148
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
4fabbb1
:memo:
miguelgfierro Dec 23, 2023
080bc0f
Updated PR template
miguelgfierro Jan 7, 2024
93a4f1c
Updated contributing
miguelgfierro Jan 7, 2024
0c43ad5
Updated PR template and contributing
miguelgfierro Jan 7, 2024
0ae786e
Updated contributing
miguelgfierro Jan 7, 2024
f275965
[Fix] correct MIND data construction of user behavior history
thaiminhpv Jan 8, 2024
31b7970
change path hybrid
miguelgfierro Dec 29, 2023
5d0c496
Update hybrid to CF
miguelgfierro Dec 29, 2023
9dfc3be
change path hybrid
miguelgfierro Dec 29, 2023
e34fdad
change path hybrid
miguelgfierro Dec 29, 2023
faba430
:memo:
miguelgfierro Dec 29, 2023
f925db2
Replace LayerRNNCell with AbstractRNNCell
SimonYansenZhao Jan 22, 2024
70d04a8
Stop testing for deeprec
SimonYansenZhao Jan 30, 2024
b6036bd
Refactor ranking metric `map` to be the same as Spark's (#2004)
loomlike Nov 3, 2023
ac93817
notebook utils programmatic execution
miguelgfierro Oct 30, 2023
d08d649
Test notebook programmatic
miguelgfierro Oct 31, 2023
330770b
Added test notebook for utils
miguelgfierro Oct 31, 2023
c3e92df
Replace papermill and scrapbook for new internal function
miguelgfierro Oct 31, 2023
7122109
Replace papermill and scrapbook for new internal function
miguelgfierro Oct 31, 2023
e795de6
Update new programmatic execution code
miguelgfierro Oct 31, 2023
a5116f3
Update new programmatic execution code
miguelgfierro Oct 31, 2023
0ec3b9e
Update notebooks with new utility
miguelgfierro Oct 31, 2023
96a7e21
:bug:
miguelgfierro Oct 31, 2023
32f9858
Issue with xDeepFM WIP
miguelgfierro Oct 31, 2023
4ecb668
:bug:
miguelgfierro Oct 31, 2023
3d89766
:bug:
miguelgfierro Nov 3, 2023
a894b5c
Document the tests in programmatic notebook
miguelgfierro Nov 3, 2023
00b7634
:memo:
miguelgfierro Nov 3, 2023
0ca11dc
WIP
miguelgfierro Nov 3, 2023
6a97812
WIP
miguelgfierro Nov 4, 2023
5b01e25
Import missing store_metadata
SimonYansenZhao Nov 13, 2023
3211560
Correct pattern matching and substitution
SimonYansenZhao Nov 14, 2023
a4b0377
Increase timeout
SimonYansenZhao Dec 18, 2023
afe19f3
Fix nightly test errors (#2045)
loomlike Dec 21, 2023
9b881b6
Fix benchmarks last cell to store value, not [value]
loomlike Dec 21, 2023
bda7322
:memo:
miguelgfierro Dec 23, 2023
27f6aa9
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
a810580
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
7860d29
:memo: remove papermill and scrapbook references
miguelgfierro Dec 23, 2023
22bea12
:memo:
miguelgfierro Dec 23, 2023
756467e
Updated PR template
miguelgfierro Jan 7, 2024
728a6ee
Updated contributing
miguelgfierro Jan 7, 2024
2f6c765
Updated PR template and contributing
miguelgfierro Jan 7, 2024
8dca2b1
Updated contributing
miguelgfierro Jan 7, 2024
7765d38
change path hybrid
miguelgfierro Dec 29, 2023
27da022
change path hybrid
miguelgfierro Dec 29, 2023
080262d
:memo:
miguelgfierro Dec 29, 2023
780bafb
Creating a jupyter book
miguelgfierro Dec 27, 2023
f9ddd80
Creating documentation
miguelgfierro Dec 27, 2023
434d6e7
:memo:
miguelgfierro Dec 28, 2023
919a6ed
WIP
miguelgfierro Dec 29, 2023
796ec60
Added rst files
miguelgfierro Dec 29, 2023
32fbf2a
license
miguelgfierro Dec 29, 2023
d2067a7
Weird warning with a link in the docstrings
miguelgfierro Dec 29, 2023
d5a20d9
:memo:
miguelgfierro Dec 30, 2023
ded2254
Fix docstring errors and replace .. note:: with Note:
miguelgfierro Dec 30, 2023
b831e23
:memo:
miguelgfierro Dec 30, 2023
770a474
:memo:
miguelgfierro Dec 30, 2023
aeabba7
Automatic build of documentation
miguelgfierro Dec 31, 2023
bbf05ea
Automatic build of documentation dev
miguelgfierro Dec 31, 2023
6568a93
Automatic build of documentation deps
miguelgfierro Dec 31, 2023
08f2c11
Automatic build of documentation deps
miguelgfierro Dec 31, 2023
82328eb
Automatic build of documentation deps
miguelgfierro Dec 31, 2023
ddb43cc
Delete workflow and try via UI
miguelgfierro Dec 31, 2023
5b348fd
Added again the workflow
miguelgfierro Dec 31, 2023
b68f9f6
git add * -rf
miguelgfierro Dec 31, 2023
1a4f54b
git add * -f
miguelgfierro Dec 31, 2023
a2928a3
add git info
miguelgfierro Dec 31, 2023
03005e4
actions to automatically update documentation
miguelgfierro Jan 4, 2024
b2afba1
actions to automatically update documentation
miguelgfierro Jan 4, 2024
37edbcf
actions to automatically update documentation :bug:
miguelgfierro Jan 4, 2024
b01bc7e
actions to automatically update documentation :bug:
miguelgfierro Jan 4, 2024
36e36a8
trying github token
miguelgfierro Jan 4, 2024
66fb9bd
trying github token
miguelgfierro Jan 4, 2024
4ae1fc0
trying github token and pull before pushing
miguelgfierro Jan 4, 2024
40ef986
pull rebase
miguelgfierro Jan 4, 2024
e82463c
pull rebase and -Xtheirs
miguelgfierro Jan 4, 2024
96ba2e3
clean
miguelgfierro Jan 4, 2024
b27e88a
Update documentation badge
miguelgfierro Jan 4, 2024
986dab6
install all deps
miguelgfierro Jan 4, 2024
156ce2f
try adding other sphinx extensions
miguelgfierro Jan 4, 2024
caece7f
Refact model rst
miguelgfierro Jan 5, 2024
f72067f
comment geoimc and rlrmc docs until issue is fixed
miguelgfierro Jan 5, 2024
9678fe2
:memo:
miguelgfierro Jan 5, 2024
c9d6056
Adding init and other special members
miguelgfierro Jan 5, 2024
725eb6e
Adding init and other special members
miguelgfierro Jan 5, 2024
9dd9fa1
Reviewing other rst
miguelgfierro Jan 5, 2024
4be08b4
Change sphinx version
miguelgfierro Jan 15, 2024
86f6378
Change sphinx version and jupyter book
miguelgfierro Jan 15, 2024
7e12dc7
Change the way we compile the documentation
miguelgfierro Jan 15, 2024
9ac20ce
Using the latest JB release
miguelgfierro Jan 27, 2024
4e6f265
Documentation working
miguelgfierro Jan 27, 2024
aa2ca0a
Update docs/_config.yml
miguelgfierro Jan 29, 2024
6ad777f
Update docs/requirements-doc.txt
miguelgfierro Jan 29, 2024
b622bd1
Update docs/_config.yml
miguelgfierro Jan 29, 2024
484c403
Added comments by @SimonYansenZhao
miguelgfierro Jan 29, 2024
66842de
Upgrade versions of GitHub Actions
SimonYansenZhao Jan 30, 2024
77235dc
Update setup.py
SimonYansenZhao Jan 30, 2024
2783200
Try to disable sum and sum_component only
SimonYansenZhao Jan 30, 2024
ff6768a
Upgrade AzureML docker image
SimonYansenZhao Jan 31, 2024
2d9341a
Correct variable names
SimonYansenZhao Jan 31, 2024
b886919
Install git in the Conda env
SimonYansenZhao Feb 12, 2024
35610bf
Disable test_xdeepfm_component_definition
SimonYansenZhao Feb 12, 2024
8866565
Use latest CUDA
SimonYansenZhao Feb 19, 2024
02103e2
Correct GPU selection
SimonYansenZhao Feb 19, 2024
530f85b
Remove leading whitespaces in Dockerfile
SimonYansenZhao Feb 19, 2024
c37f370
Simplify azureml-test/action.yml
SimonYansenZhao Feb 19, 2024
76ae5e0
Install wget in Docker image
SimonYansenZhao Feb 19, 2024
dfaa9ed
Update
SimonYansenZhao Feb 19, 2024
94ece08
Merge staging
SimonYansenZhao Feb 19, 2024
49 changes: 19 additions & 30 deletions .github/actions/azureml-test/action.yml
@@ -69,7 +69,7 @@ runs:
using: "composite"
steps:
- name: Setup python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: "3.8"
- name: Install azureml-core and azure-cli on a GitHub hosted server
@@ -82,43 +82,32 @@ runs:
- name: Install wheel package
shell: bash
run: pip install --quiet wheel
- name: Submit CPU tests to AzureML
- name: Submit tests to AzureML
shell: bash
if: contains(inputs.TEST_GROUP, 'cpu')
run: >-
python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{inputs.CPU_CLUSTER_NAME}}
--subid ${{inputs.AZUREML_TEST_SUBID}} --reponame "recommenders" --branch ${{ github.ref }}
--rg ${{inputs.RG}} --wsname ${{inputs.WS}} --expname ${{inputs.EXP_NAME}}_${{inputs.TEST_GROUP}}
--testlogs ${{inputs.TEST_LOGS_PATH}} --testkind ${{inputs.TEST_KIND}}
--conda_pkg_python ${{inputs.PYTHON_VERSION}} --testgroup ${{inputs.TEST_GROUP}}
--disable-warnings --sha "${GITHUB_SHA}"
- name: Submit GPU tests to AzureML
shell: bash
if: contains(inputs.TEST_GROUP, 'gpu')
run: >-
python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{inputs.GPU_CLUSTER_NAME}}
--subid ${{inputs.AZUREML_TEST_SUBID}} --reponame "recommenders" --branch ${{ github.ref }}
--rg ${{inputs.RG}} --wsname ${{inputs.WS}} --expname ${{inputs.EXP_NAME}}_${{inputs.TEST_GROUP}}
--testlogs ${{inputs.TEST_LOGS_PATH}} --add_gpu_dependencies --testkind ${{inputs.TEST_KIND}}
--conda_pkg_python ${{inputs.PYTHON_VERSION}} --testgroup ${{inputs.TEST_GROUP}}
--disable-warnings --sha "${GITHUB_SHA}"
- name: Submit PySpark tests to AzureML
shell: bash
if: contains(inputs.TEST_GROUP, 'spark')
run: >-
python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{inputs.CPU_CLUSTER_NAME}}
--subid ${{inputs.AZUREML_TEST_SUBID}} --reponame "recommenders" --branch ${{ github.ref }}
--rg ${{inputs.RG}} --wsname ${{inputs.WS}} --expname ${{inputs.EXP_NAME}}_${{inputs.TEST_GROUP}}
--testlogs ${{inputs.TEST_LOGS_PATH}} --add_spark_dependencies --testkind ${{inputs.TEST_KIND}}
--conda_pkg_python ${{inputs.PYTHON_VERSION}} --testgroup ${{inputs.TEST_GROUP}}
--disable-warnings --sha "${GITHUB_SHA}"
python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py \
--subid ${{inputs.AZUREML_TEST_SUBID}} \
--reponame "recommenders" \
--branch ${{ github.ref }} \
--rg ${{inputs.RG}} \
--wsname ${{inputs.WS}} \
--expname ${{inputs.EXP_NAME}}_${{inputs.TEST_GROUP}} \
--testlogs ${{inputs.TEST_LOGS_PATH}} \
--testkind ${{inputs.TEST_KIND}} \
--conda_pkg_python ${{inputs.PYTHON_VERSION}} \
--testgroup ${{inputs.TEST_GROUP}} \
--disable-warnings \
--sha "${GITHUB_SHA}" \
--clustername $(if [[ ${{inputs.TEST_GROUP}} =~ "gpu" ]]; then echo "${{inputs.GPU_CLUSTER_NAME}}"; else echo "${{inputs.CPU_CLUSTER_NAME}}"; fi) \
$(if [[ ${{inputs.TEST_GROUP}} =~ "gpu" ]]; then echo "--add_gpu_dependencies"; fi) \
$(if [[ ${{inputs.TEST_GROUP}} =~ "spark" ]]; then echo "--add_spark_dependencies"; fi)
- name: Get exit status
shell: bash
id: exit_status
run: echo "code=$(cat ${{inputs.PYTEST_EXIT_CODE}})" >> $GITHUB_OUTPUT
- name: Check Success/Failure
if: ${{ steps.exit_status.outputs.code != 0 }}
uses: actions/github-script@v3
uses: actions/github-script@v7
with:
script: |
core.setFailed('All tests did not pass!')
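The three per-group submit steps (CPU, GPU, Spark) are collapsed here into a single step that derives the cluster name and the extra dependency flags from the test group name at run time. A minimal Python sketch of that selection logic — the group and cluster names below are illustrative, not the real action inputs:

```python
def select_cluster(test_group: str, gpu_cluster: str, cpu_cluster: str) -> str:
    """GPU test groups run on the GPU cluster; cpu and spark groups on the CPU one."""
    return gpu_cluster if "gpu" in test_group else cpu_cluster


def extra_flags(test_group: str) -> list:
    """Optional flags forwarded to submit_groupwise_azureml_pytest.py."""
    flags = []
    if "gpu" in test_group:
        flags.append("--add_gpu_dependencies")
    if "spark" in test_group:
        flags.append("--add_spark_dependencies")
    return flags


print(select_cluster("group_gpu_001", "gpu-cluster", "cpu-cluster"))  # gpu-cluster
print(extra_flags("group_spark_001"))  # ['--add_spark_dependencies']
```

The YAML achieves the same effect with inline `$( if [[ ... ]] ... )` command substitutions, which keeps a single `run:` block instead of three conditionally skipped steps.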
4 changes: 2 additions & 2 deletions .github/workflows/azureml-cpu-nightly.yml
@@ -47,7 +47,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Get test group names
id: get_test_groups
uses: ./.github/actions/get-test-groups
@@ -71,7 +71,7 @@
test-group: ${{ fromJSON(needs.get-test-groups.outputs.test_groups) }}
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Execute tests
uses: ./.github/actions/azureml-test
id: execute_tests
4 changes: 2 additions & 2 deletions .github/workflows/azureml-gpu-nightly.yml
@@ -47,7 +47,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Get test group names
id: get_test_groups
uses: ./.github/actions/get-test-groups
@@ -71,7 +71,7 @@
test-group: ${{ fromJSON(needs.get-test-groups.outputs.test_groups) }}
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Execute tests
uses: ./.github/actions/azureml-test
id: execute_tests
4 changes: 2 additions & 2 deletions .github/workflows/azureml-release-pipeline.yml
@@ -33,9 +33,9 @@ jobs:
needs: [unit-test-workflow, cpu-nightly-workflow, gpu-nightly-workflow, spark-nightly-workflow]
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Setup python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: "3.8"
- name: Install wheel package
4 changes: 2 additions & 2 deletions .github/workflows/azureml-spark-nightly.yml
@@ -46,7 +46,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Get test group names
id: get_test_groups
uses: ./.github/actions/get-test-groups
@@ -70,7 +70,7 @@
test-group: ${{ fromJSON(needs.get-test-groups.outputs.test_groups) }}
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Execute tests
uses: ./.github/actions/azureml-test
id: execute_tests
4 changes: 2 additions & 2 deletions .github/workflows/azureml-unit-tests.yml
@@ -36,7 +36,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Get test group names
id: get_test_groups
uses: ./.github/actions/get-test-groups
@@ -60,7 +60,7 @@
test-group: ${{ fromJSON(needs.get-test-groups.outputs.test_groups) }}
steps:
- name: Check out repository code
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Execute tests
uses: ./.github/actions/azureml-test
id: execute_tests
12 changes: 6 additions & 6 deletions .github/workflows/sarplus.yml
@@ -41,10 +41,10 @@ jobs:
matrix:
python-version: ["3.8", "3.9"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

@@ -96,15 +96,15 @@ jobs:
- name: Upload Python wheel as GitHub artifact when merged into main
# Upload the whl file of the specific python version
if: github.ref == 'refs/heads/main'
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: pysarplus-${{ env.sarplus_version }}-cp${{ matrix.python-version }}-wheel
path: ${{ env.PYTHON_ROOT }}/dist/*.whl

- name: Upload Python source as GitHub artifact when merged into main
# Only one pysarplus source tar file is needed
if: github.ref == 'refs/heads/main' && matrix.python-version == '3.10'
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: pysarplus-${{ env.sarplus_version }}-source
path: ${{ env.PYTHON_ROOT }}/dist/*.tar.gz
@@ -131,7 +131,7 @@ jobs:
hadoop-version: "3.3.1"

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Test
run: |
@@ -180,7 +180,7 @@ jobs:

- name: Upload Scala bundle as GitHub artifact when merged into main
if: github.ref == 'refs/heads/main'
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: sarplus-${{ env.sarplus_version }}-bundle_2.12-spark-${{ matrix.spark-version }}-jar
path: ${{ env.SCALA_ROOT }}/target/scala-2.12/*bundle*.jar
4 changes: 2 additions & 2 deletions .github/workflows/update_documentation.yml
@@ -16,10 +16,10 @@ jobs:

steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: 3.10

@@ -601,9 +601,9 @@ def __init__(
):
self._build_bias = build_bias

if args is None or (nest.is_sequence(args) and not args):
if args is None or (nest.is_nested(args) and not args):
raise ValueError("`args` must be specified")
if not nest.is_sequence(args):
if not nest.is_nested(args):
args = [args]
self._is_sequence = False
else:
3 changes: 2 additions & 1 deletion setup.py
@@ -58,7 +58,7 @@
extras_require = {
"gpu": [
"nvidia-ml-py3>=7.352.0",
"tensorflow==2.8.4", # FIXME: Temporarily pinned due to issue with TF version > 2.10.1 See #2018
"tensorflow>=2.8.4,!=2.9.0.*,!=2.9.1,!=2.9.2,!=2.10.0.*,<3",
"tf-slim>=1.1.0",
"torch>=1.13.1", # for CUDA 11 support
"fastai>=1.0.46,<2",
@@ -72,6 +72,7 @@
"pytest>=3.6.4",
"pytest-cov>=2.12.1",
"pytest-mock>=3.6.1", # for access to mock fixtures in pytest
"packaging>=20.9", # for version comparison in test_dependency_security.py
],
}
# For the brave of heart
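The setup.py change relaxes the hard `tensorflow==2.8.4` pin to a range that excludes only the releases known to break (#2018), and adds `packaging` for version comparisons in the tests. A minimal sketch of evaluating that same specifier with `packaging` — the helper function name is illustrative:

```python
# Evaluate the new TensorFlow range from setup.py with the `packaging` library.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Mirrors the specifier string introduced in setup.py above.
TF_SPEC = SpecifierSet(">=2.8.4,!=2.9.0.*,!=2.9.1,!=2.9.2,!=2.10.0.*,<3")


def tf_version_allowed(version: str) -> bool:
    """Return True if `version` satisfies the pinned TensorFlow range."""
    return Version(version) in TF_SPEC


print(tf_version_allowed("2.8.4"))   # True: lower bound is inclusive
print(tf_version_allowed("2.9.1"))   # False: explicitly excluded
print(tf_version_allowed("2.10.1"))  # True: only 2.10.0.* is excluded
```

Using `packaging` rather than string comparison matters here: `"2.10.0" < "2.9.0"` is True lexicographically but False as versions, which is the bug the "Correct version comparison using packaging.version.Version" commit addresses.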
56 changes: 34 additions & 22 deletions tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py
@@ -37,7 +37,6 @@
"""
import argparse
import logging
import glob

from azureml.core.authentication import AzureCliAuthentication
from azureml.core import Workspace
@@ -146,7 +145,6 @@ def setup_persistent_compute_target(workspace, cluster_name, vm_size, max_nodes)

def create_run_config(
cpu_cluster,
docker_proc_type,
add_gpu_dependencies,
add_spark_dependencies,
conda_pkg_jdk,
@@ -165,7 +163,6 @@
the following:
- Reco_cpu_test
- Reco_gpu_test
docker_proc_type (str) : processor type, cpu or gpu
add_gpu_dependencies (bool) : True if gpu packages should be
added to the conda environment, else False
add_spark_dependencies (bool) : True if PySpark packages should be
@@ -179,7 +176,39 @@
run_azuremlcompute = RunConfiguration()
run_azuremlcompute.target = cpu_cluster
run_azuremlcompute.environment.docker.enabled = True
run_azuremlcompute.environment.docker.base_image = docker_proc_type
if not add_gpu_dependencies:
# https://github.com/Azure/AzureML-Containers/blob/master/base/cpu/openmpi4.1.0-ubuntu22.04
run_azuremlcompute.environment.docker.base_image = "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu22.04"
else:
run_azuremlcompute.environment.docker.base_image = None
# Use the latest CUDA
# See
# * https://learn.microsoft.com/en-us/azure/machine-learning/how-to-train-with-custom-image?view=azureml-api-1#use-a-custom-dockerfile-optional
# * https://github.com/Azure/AzureML-Containers/blob/master/base/gpu/openmpi4.1.0-cuda11.8-cudnn8-ubuntu22.04
run_azuremlcompute.environment.docker.base_dockerfile = r"""
FROM nvcr.io/nvidia/cuda:12.3.1-devel-ubuntu22.04
USER root:root
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV LANG=C.UTF-8 LC_ALL=C.UTF-8
ENV DEBIAN_FRONTEND noninteractive
RUN apt-get update && \
apt-get install -y wget git-all && \
apt-get clean -y && \
rm -rf /var/lib/apt/lists/*
# Conda Environment
ENV MINICONDA_VERSION py38_23.3.1-0
ENV PATH /opt/miniconda/bin:$PATH
ENV CONDA_PACKAGE 23.5.0
RUN wget -qO /tmp/miniconda.sh https://repo.anaconda.com/miniconda/Miniconda3-${MINICONDA_VERSION}-Linux-x86_64.sh && \
bash /tmp/miniconda.sh -bf -p /opt/miniconda && \
conda install conda=${CONDA_PACKAGE} -y && \
conda update --all -c conda-forge -y && \
conda clean -ay && \
rm -rf /opt/miniconda/pkgs && \
rm /tmp/miniconda.sh && \
find / -type d -name __pycache__ | xargs rm -rf
"""

# Use conda_dependencies.yml to create a conda environment in
# the Docker image for execution
@@ -195,6 +224,7 @@

# install recommenders
reco_extras = "dev"
conda_dep.add_conda_package("anaconda::git")
if add_gpu_dependencies and add_spark_dependencies:
conda_dep.add_channel("conda-forge")
conda_dep.add_conda_package(conda_pkg_jdk)
@@ -326,13 +356,6 @@ def create_arg_parser():
default="STANDARD_D3_V2",
help="Set the size of the VM either STANDARD_D3_V2",
)
# cpu or gpu
parser.add_argument(
"--dockerproc",
action="store",
default="cpu",
help="Base image used in docker container",
)
# Azure subscription id, when used in a pipeline, it is stored in keyvault
parser.add_argument(
"--subid", action="store", default="123456", help="Azure Subscription ID"
@@ -421,16 +444,6 @@ def create_arg_parser():

logger = logging.getLogger("submit_groupwise_azureml_pytest.py")
args = create_arg_parser()

if args.dockerproc == "cpu":
from azureml.core.runconfig import DEFAULT_CPU_IMAGE

docker_proc_type = DEFAULT_CPU_IMAGE
else:
from azureml.core.runconfig import DEFAULT_GPU_IMAGE

docker_proc_type = DEFAULT_GPU_IMAGE

cli_auth = AzureCliAuthentication()

workspace = setup_workspace(
@@ -450,7 +463,6 @@

run_config = create_run_config(
cpu_cluster=cpu_cluster,
docker_proc_type=docker_proc_type,
add_gpu_dependencies=args.add_gpu_dependencies,
add_spark_dependencies=args.add_spark_dependencies,
conda_pkg_jdk=args.conda_pkg_jdk,