Update dependencies (#428)
* Update dependencies

* Fix for numpy v1.17.0

* Downgrade pytest version for travis

* Update .travis.yml

* Rollback to previous docker image

* Update .travis.yml

* Trying to upgrade pytest

* Try to ignore pytest warning

* Update setup.cfg

* Use full name to ignore pytest warning

* Correct import

* Remove gym and tf warnings

* Add TD3 to tensorboard tests

* Ignore additional gym warning

* Get rid of additional tf warning

* Re-enable new docker image

* Test with different tf version

* Upgrade pytest

* Upgrade tf to 1.13.2

* Try downgrading tf

* Move Travis CI test to separate bash file

* Try splitting up test suite

* Diagnostic: echo test glob

* Avoid unintended wildcard

* Upgrade TF to 1.13.2, move scripts to subdirectory

* Split up tests to keep them <20m long

* Try thread-safe SubprocVecEnv

* Disable non-thread safe start methods in tests; document default change

* Rebalance tests

* Rebalance some more

* Rebalance, hopefully for the last time

* Fix globs

* Update docker cpu image: add coverage reporter for travis

* Codacy partial upload

* Bump Docker image version

* Make Travis read environment variable

* Pass project token in

* Remove pip install and fix coverage final report
araffin authored and hill-a committed Sep 6, 2019
1 parent 0133b86 commit e2e5c1a
Showing 18 changed files with 143 additions and 50 deletions.
36 changes: 28 additions & 8 deletions .travis.yml
@@ -2,23 +2,43 @@ language: python
python:
- "3.5"

env:
global:
- DOCKER_IMAGE=araffin/stable-baselines-cpu:v2.7.1

notifications:
email: false

services:
- docker

install:
- docker pull araffin/stable-baselines-cpu
- docker pull ${DOCKER_IMAGE}

script:
- ./scripts/run_tests_travis.sh "${TEST_GLOB}"

matrix:
jobs:
include:
- name: "Unit Tests"
script:
# For pull requests from fork, Codacy token is not available, leading to build failure
- 'if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then docker run -it --rm --network host --ipc=host --mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines-cpu bash -c "cd /root/code/stable-baselines/ && pytest --cov-config .coveragerc --cov-report term --cov=. -v tests/"; fi'
- 'if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then docker run -it --env CODACY_PROJECT_TOKEN=$CODACY_PROJECT_TOKEN --rm --network host --ipc=host --mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines-cpu bash -c "cd /root/code/stable-baselines/ && pytest --cov-config .coveragerc --cov-report term --cov-report xml --cov=. -v tests/ && python-codacy-coverage -r coverage.xml --token=$CODACY_PROJECT_TOKEN"; fi'
# Big test suite. Run in parallel to decrease wall-clock time, and to avoid OOM error from leaks
- stage: Test
name: "Unit Tests a-h"
env: TEST_GLOB="[a-h]*"

- name: "Unit Tests i-l"
env: TEST_GLOB="[i-l]*"

- name: "Unit Tests m-sa"
env: TEST_GLOB="{[m-r]*,sa*}"

- name: "Unit Tests sb-z"
env: TEST_GLOB="{s[b-z]*,[t-z]*}"

- name: "Sphinx Documentation"
script:
- 'docker run -it --rm --mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines-cpu bash -c "cd /root/code/stable-baselines/ && pip install .[docs] && pushd docs/ && make clean && make html"'
- 'docker run -it --rm --mount src=$(pwd),target=/root/code/stable-baselines,type=bind ${DOCKER_IMAGE} bash -c "cd /root/code/stable-baselines/ && pip install .[docs] && pushd docs/ && make clean && make html"'

- stage: Codacy Trigger
script:
# When all test coverage reports have been uploaded, instruct Codacy to start analysis.
- 'docker run -it --rm --network host --ipc=host --mount src=$(pwd),target=/root/code/stable-baselines,type=bind --env CODACY_PROJECT_TOKEN=${CODACY_PROJECT_TOKEN} ${DOCKER_IMAGE} bash -c "cd /root/code/stable-baselines/ && /root/code/codacy-coverage-reporter final"'
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -91,7 +91,7 @@ Also, when a bug fix is proposed, tests should be added to avoid regression.
To run tests:

```
./run_tests.sh
./scripts/run_tests.sh
```

## Changelog and Documentation
18 changes: 9 additions & 9 deletions docker/Dockerfile.cpu
@@ -13,27 +13,27 @@ RUN \
pip install --upgrade pip && \
pip install codacy-coverage && \
pip install scipy && \
pip install tqdm && \
pip install joblib && \
pip install zmq && \
pip install dill && \
pip install progressbar2 && \
pip install mpi4py && \
pip install cloudpickle && \
pip install tensorflow==1.5.0 && \
pip install click && \
pip install tensorflow==1.8.0 && \
pip install opencv-python && \
pip install numpy && \
pip install pandas && \
pip install pytest==3.5.1 && \
pip install pytest && \
pip install pytest-cov && \
pip install pytest-env && \
pip install pytest-xdist && \
pip install matplotlib && \
pip install seaborn && \
pip install glob2 && \
pip install gym[atari,classic_control]>=0.10.9

ENV PATH=$VENV/bin:$PATH

RUN apt-get -y update && apt-get -y install curl jq

# Codacy code coverage report: used for partial code coverage reporting
RUN cd $CODE_DIR &&\
curl -Ls -o codacy-coverage-reporter "$(curl -Ls https://api.github.com/repos/codacy/codacy-coverage-reporter/releases/latest | jq -r '.assets | map({name, browser_download_url} | select(.name | contains("codacy-coverage-reporter-linux"))) | .[0].browser_download_url')" &&\
chmod +x codacy-coverage-reporter

CMD /bin/bash
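The `jq` pipeline in that `RUN` step picks the first Linux asset out of the latest-release JSON. A sketch of the same selection in Python, on a hypothetical payload (asset names and URLs are made up):

```python
import json

# Hypothetical latest-release payload, shaped like the GitHub releases API response
payload = json.loads("""
{"assets": [
  {"name": "codacy-coverage-reporter-macos", "browser_download_url": "https://example.org/mac"},
  {"name": "codacy-coverage-reporter-linux", "browser_download_url": "https://example.org/linux"}
]}
""")

# Same logic as:
#   .assets | map(select(.name | contains("codacy-coverage-reporter-linux")))
#          | .[0].browser_download_url
urls = [asset["browser_download_url"]
        for asset in payload["assets"]
        if "codacy-coverage-reporter-linux" in asset["name"]]
download_url = urls[0]
print(download_url)  # https://example.org/linux
```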
9 changes: 1 addition & 8 deletions docker/Dockerfile.gpu
@@ -13,25 +13,18 @@ RUN \
pip install --upgrade pip && \
pip install codacy-coverage && \
pip install scipy && \
pip install tqdm && \
pip install joblib && \
pip install zmq && \
pip install dill && \
pip install progressbar2 && \
pip install mpi4py && \
pip install cloudpickle && \
pip install tensorflow-gpu==1.8.0 && \
pip install click && \
pip install opencv-python && \
pip install numpy && \
pip install pandas && \
pip install pytest==3.5.1 && \
pip install pytest && \
pip install pytest-cov && \
pip install pytest-env && \
pip install pytest-xdist && \
pip install matplotlib && \
pip install seaborn && \
pip install glob2 && \
pip install gym[atari,classic_control]>=0.10.9

ENV PATH=$VENV/bin:$PATH
4 changes: 2 additions & 2 deletions docs/guide/install.rst
@@ -140,7 +140,7 @@ Or, with the shell file:

.. code-block:: bash
./run_docker_gpu.sh pytest tests/
./scripts/run_docker_gpu.sh pytest tests/
Run the docker CPU image

@@ -152,7 +152,7 @@ Or, with the shell file:

.. code-block:: bash
./run_docker_cpu.sh pytest tests/
./scripts/run_docker_cpu.sh pytest tests/
Explanation of the docker command:

33 changes: 33 additions & 0 deletions docs/misc/changelog.rst
@@ -16,6 +16,11 @@ Breaking Changes:
OpenMPI-dependent algorithms.
See :ref:`installation notes <openmpi>` and
`Issue #430 <https://github.com/hill-a/stable-baselines/issues/430>`_.
- ``SubprocVecEnv`` now defaults to a thread-safe start method: ``forkserver``
  when available, otherwise ``spawn``. This may require application code to be
  wrapped in ``if __name__ == '__main__'``. You can restore the previous
  behavior by explicitly setting ``start_method='fork'``. See
  `PR #428 <https://github.com/hill-a/stable-baselines/pull/428>`_.
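A minimal sketch of the new default selection and the required guard (the ``make_env`` factory is hypothetical; the selection logic mirrors the change in ``subproc_vec_env.py``):

```python
import multiprocessing

# 'fork' is fast and needs no __main__ guard, but is not thread-safe when the
# parent process holds e.g. a live TensorFlow session. 'forkserver' and 'spawn'
# start children from a fresh interpreter, which re-imports the module.
available = multiprocessing.get_all_start_methods()
start_method = 'forkserver' if 'forkserver' in available else 'spawn'


def make_env():
    # Hypothetical environment factory; would call gym.make(...) in practice.
    return object()


if __name__ == '__main__':
    # Without this guard, 'forkserver'/'spawn' would re-execute module-level
    # code in every subprocess. Passing start_method='fork' to SubprocVecEnv
    # restores the old behaviour on platforms where 'fork' is available.
    print(start_method)
```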

New Features:
^^^^^^^^^^^^^
@@ -51,6 +56,34 @@ Documentation:
- Fix and rename custom policy names (@eavelardev)



Pre-Release 2.7.1a0 (WIP)
--------------------------


Breaking Changes:
^^^^^^^^^^^^^^^^^
- updated dependencies: tensorflow v1.8.0 is now required

New Features:
^^^^^^^^^^^^^

Bug Fixes:
^^^^^^^^^^
- set `allow_pickle=True` for numpy>=1.17.0 when loading expert dataset

Deprecations:
^^^^^^^^^^^^^

Others:
^^^^^^^
- docker images were updated

Documentation:
^^^^^^^^^^^^^^



Release 2.7.0 (2019-07-31)
--------------------------

2 changes: 1 addition & 1 deletion run_docker_cpu.sh → scripts/run_docker_cpu.sh
@@ -8,5 +8,5 @@ echo $cmd_line


docker run -it --rm --network host --ipc=host \
--mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines-cpu\
--mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines-cpu:v2.7.1 \
bash -c "cd /root/code/stable-baselines/ && $cmd_line"
2 changes: 1 addition & 1 deletion run_docker_gpu.sh → scripts/run_docker_gpu.sh
@@ -8,5 +8,5 @@ echo $cmd_line


docker run -it --runtime=nvidia --rm --network host --ipc=host \
--mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines\
--mount src=$(pwd),target=/root/code/stable-baselines,type=bind araffin/stable-baselines:v2.7.1 \
bash -c "cd /root/code/stable-baselines/ && $cmd_line"
File renamed without changes.
35 changes: 35 additions & 0 deletions scripts/run_tests_travis.sh
@@ -0,0 +1,35 @@
#!/usr/bin/env bash

DOCKER_CMD="docker run -it --rm --network host --ipc=host --mount src=$(pwd),target=/root/code/stable-baselines,type=bind"
BASH_CMD="cd /root/code/stable-baselines/"

if [[ $# -ne 1 ]]; then
echo "usage: $0 <test glob>"
exit 1
fi

if [[ ${DOCKER_IMAGE} = "" ]]; then
echo "Need DOCKER_IMAGE environment variable to be set."
exit 1
fi

TEST_GLOB=$1

set -e # exit immediately on any error

# For pull requests from fork, Codacy token is not available, leading to build failure
if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then
${DOCKER_CMD} ${DOCKER_IMAGE} \
bash -c "${BASH_CMD} && \
pytest --cov-config .coveragerc --cov-report term --cov=. -v tests/test_${TEST_GLOB}"
else
if [[ ${CODACY_PROJECT_TOKEN} = "" ]]; then
echo "Need CODACY_PROJECT_TOKEN environment variable to be set."
exit 1
fi

${DOCKER_CMD} --env CODACY_PROJECT_TOKEN=${CODACY_PROJECT_TOKEN} ${DOCKER_IMAGE} \
bash -c "${BASH_CMD} && \
pytest --cov-config .coveragerc --cov-report term --cov-report xml --cov=. -v tests/test_${TEST_GLOB} && \
/root/code/codacy-coverage-reporter report -l python -r coverage.xml --partial"
fi
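The `TEST_GLOB` values from `.travis.yml` rely on bash brace expansion plus character classes; pytest ultimately receives `tests/test_${TEST_GLOB}`. A sketch with hypothetical file names showing how the patterns partition the suite:

```shell
#!/usr/bin/env bash
set -e

# Hypothetical test files standing in for the real tests/ directory
tmp=$(mktemp -d)
touch "$tmp"/test_{a2c,gail,ppo2,sac,save,schedules,tensorboard,vec_envs}.py
cd "$tmp"

# Each brace alternative is a separate glob, expanded in order
echo "a-h  :" test_[a-h]*            # test_a2c.py test_gail.py
echo "m-sa :" test_{[m-r]*,sa*}      # test_ppo2.py test_sac.py test_save.py
echo "sb-z :" test_{s[b-z]*,[t-z]*}  # test_schedules.py test_tensorboard.py test_vec_envs.py
```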
12 changes: 11 additions & 1 deletion setup.cfg
@@ -4,5 +4,15 @@ license_file = LICENSE

[tool:pytest]
# Deterministic ordering for tests; useful for pytest-xdist.
env =
env =
PYTHONHASHSEED=0
filterwarnings =
ignore:inspect.getargspec:DeprecationWarning:tensorflow
ignore::pytest.PytestUnknownMarkWarning
# Tensorflow internal warnings
ignore:builtin type EagerTensor has no __module__ attribute:DeprecationWarning
ignore:The binary mode of fromstring is deprecated:DeprecationWarning
ignore::FutureWarning:tensorflow
# Gym warnings
ignore:Parameters to load are deprecated.:DeprecationWarning
ignore:the imp module is deprecated in favour of importlib:PendingDeprecationWarning
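pytest's `filterwarnings` entries use the stdlib `warnings` filter syntax (`action:message:category:module`), where the message part is a regex matched against the start of the warning text. A small sketch of the equivalent programmatic filter:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    # Equivalent of the setup.cfg line
    # "ignore:The binary mode of fromstring is deprecated:DeprecationWarning"
    warnings.filterwarnings(
        'ignore',
        message='The binary mode of fromstring is deprecated',
        category=DeprecationWarning,
    )
    warnings.warn('The binary mode of fromstring is deprecated', DeprecationWarning)
    warnings.warn('an unrelated deprecation', DeprecationWarning)

# Only the warning that did not match the ignore filter is recorded
print(len(caught))  # 1
```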
6 changes: 3 additions & 3 deletions setup.py
@@ -12,7 +12,7 @@
install_tf, tf_gpu = False, False
try:
import tensorflow as tf
if tf.__version__ < LooseVersion('1.5.0'):
if tf.__version__ < LooseVersion('1.8.0'):
install_tf = True
# check if a gpu version is needed
tf_gpu = tf.test.is_gpu_available()
@@ -29,7 +29,7 @@

tf_dependency = []
if install_tf:
tf_dependency = ['tensorflow-gpu>=1.5.0'] if tf_gpu else ['tensorflow>=1.5.0']
tf_dependency = ['tensorflow-gpu>=1.8.0'] if tf_gpu else ['tensorflow>=1.8.0']
if tf_gpu:
print("A GPU was detected, tensorflow-gpu will be installed")

@@ -140,7 +140,7 @@
license="MIT",
long_description=long_description,
long_description_content_type='text/markdown',
version="2.7.0",
version="2.7.1a0",
)

# python setup.py sdist
3 changes: 1 addition & 2 deletions stable_baselines/__init__.py
@@ -7,7 +7,6 @@
from stable_baselines.td3 import TD3
from stable_baselines.sac import SAC


# Load mpi4py-dependent algorithms only if mpi is installed.
try:
import mpi4py
@@ -21,4 +20,4 @@
from stable_baselines.trpo_mpi import TRPO
del mpi4py

__version__ = "2.7.0"
__version__ = "2.7.1a0"
16 changes: 8 additions & 8 deletions stable_baselines/common/vec_env/subproc_vec_env.py
@@ -55,17 +55,17 @@ class SubprocVecEnv(VecEnv):
.. warning::
Only 'forkserver' and 'spawn' start methods are thread-safe,
which is important when TensorFlow
sessions or other non thread-safe libraries are used in the parent (see issue #217).
However, compared to 'fork' they incur a small start-up cost and have restrictions on
global variables. With those methods,
users must wrap the code in an ``if __name__ == "__main__":``
which is important when TensorFlow sessions or other non thread-safe
libraries are used in the parent (see issue #217). However, compared to
'fork' they incur a small start-up cost and have restrictions on
global variables. With those methods, users must wrap the code in an
``if __name__ == "__main__":`` block.
For more information, see the multiprocessing documentation.
:param env_fns: ([Gym Environment]) Environments to run in subprocesses
:param start_method: (str) method used to start the subprocesses.
Must be one of the methods returned by multiprocessing.get_all_start_methods().
Defaults to 'fork' on available platforms, and 'spawn' otherwise.
Defaults to 'forkserver' on available platforms, and 'spawn' otherwise.
"""

def __init__(self, env_fns, start_method=None):
@@ -77,8 +77,8 @@ def __init__(self, env_fns, start_method=None):
# Fork is not a thread safe method (see issue #217)
# but is more user friendly (does not require to wrap the code in
# a `if __name__ == "__main__":`)
fork_available = 'fork' in multiprocessing.get_all_start_methods()
start_method = 'fork' if fork_available else 'spawn'
forkserver_available = 'forkserver' in multiprocessing.get_all_start_methods()
start_method = 'forkserver' if forkserver_available else 'spawn'
ctx = multiprocessing.get_context(start_method)

self.remotes, self.work_remotes = zip(*[ctx.Pipe() for _ in range(n_envs)])
2 changes: 1 addition & 1 deletion stable_baselines/gail/dataset/dataset.py
@@ -38,7 +38,7 @@ def __init__(self, expert_path=None, traj_data=None, train_fraction=0.7, batch_s
if traj_data is None and expert_path is None:
raise ValueError("Must specify one of 'traj_data' or 'expert_path'")
if traj_data is None:
traj_data = np.load(expert_path)
traj_data = np.load(expert_path, allow_pickle=True)

if verbose > 0:
for key, val in traj_data.items():
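numpy 1.17 flipped the `np.load` default to `allow_pickle=False`, and expert datasets can contain object arrays (e.g. variable-length trajectories), which are stored pickled. A self-contained sketch of the failure and the fix, using an in-memory archive:

```python
import io

import numpy as np

# Object array, as produced when trajectories have different lengths
ragged = np.empty(2, dtype=object)
ragged[0], ragged[1] = np.zeros(3), np.zeros(5)

buf = io.BytesIO()
np.savez(buf, obs=ragged, episode_returns=np.array([1.0, 2.0]))

buf.seek(0)
try:
    np.load(buf)['obs']  # numpy>=1.17 refuses pickled data by default
except ValueError as exc:
    print('load failed:', exc)

buf.seek(0)
data = np.load(buf, allow_pickle=True)  # the fix applied in dataset.py
assert data['obs'][1].shape == (5,)
```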
2 changes: 1 addition & 1 deletion tests/test_gail.py
@@ -72,7 +72,7 @@ def test_generate(generate_env):
if key != 'episode_returns':
assert val.shape[0] == n_timesteps, "inconsistent number of timesteps at '{}'".format(key)

dataset_loaded = np.load('expert.npz')
dataset_loaded = np.load('expert.npz', allow_pickle=True)
assert dataset.keys() == dataset_loaded.keys()
for key in dataset.keys():
assert (dataset[key] == dataset_loaded[key]).all(), "different data at '{}'".format(key)
5 changes: 2 additions & 3 deletions tests/test_tensorboard.py
@@ -3,7 +3,7 @@

import pytest

from stable_baselines import A2C, ACER, ACKTR, DQN, DDPG, PPO1, PPO2, SAC, TRPO
from stable_baselines import A2C, ACER, ACKTR, DQN, DDPG, PPO1, PPO2, SAC, TD3, TRPO

TENSORBOARD_DIR = '/tmp/tb_dir/'

@@ -19,6 +19,7 @@
'ppo1': (PPO1, 'CartPole-v1'),
'ppo2': (PPO2, 'CartPole-v1'),
'sac': (SAC, 'Pendulum-v0'),
'td3': (TD3, 'Pendulum-v0'),
'trpo': (TRPO, 'CartPole-v1'),
}

@@ -47,5 +48,3 @@ def test_multiple_runs(model_name):
assert os.path.isdir(TENSORBOARD_DIR + logname + "_1")
# Check that the log dir name increments correctly
assert os.path.isdir(TENSORBOARD_DIR + logname + "_2")


6 changes: 5 additions & 1 deletion tests/test_vec_envs.py
@@ -265,7 +265,11 @@ def obs_assert(obs):


def test_subproc_start_method():
start_methods = [None] + multiprocessing.get_all_start_methods()
start_methods = [None]
# Only test thread-safe methods. Others may deadlock tests! (gh/428)
safe_methods = {'forkserver', 'spawn'}
available_methods = multiprocessing.get_all_start_methods()
start_methods += list(safe_methods.intersection(available_methods))
space = gym.spaces.Discrete(2)

def obs_assert(obs):
