[pre-commit.ci] pre-commit autoupdate (#362)
<!--pre-commit.ci start-->
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.0.284 → v0.0.285](astral-sh/ruff-pre-commit@v0.0.284...v0.0.285)
- [github.com/asottile/blacken-docs: 1.15.0 → 1.16.0](adamchainz/blacken-docs@1.15.0...1.16.0)
<!--pre-commit.ci end-->

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
pre-commit-ci[bot] committed Aug 22, 2023
1 parent 22f49df commit 4408389
Showing 3 changed files with 5 additions and 11 deletions.
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -22,7 +22,7 @@ repos:
       - id: black-jupyter
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: v0.0.284
+    rev: v0.0.285
     hooks:
       - id: ruff
         args: ["--fix"]
@@ -34,6 +34,6 @@ repos:
         args: ["--write"]
   # Python inside docs
   - repo: https://github.com/asottile/blacken-docs
-    rev: 1.15.0
+    rev: 1.16.0
     hooks:
       - id: blacken-docs
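These rev bumps are the same change `pre-commit autoupdate` produces locally. A quick way to confirm which revisions a checkout actually pins is to read the config directly; a minimal sketch, assuming PyYAML is available and the script runs from the repository root (the file name and output format are illustrative, not part of this commit):

```python
# check_hook_revs.py - print the pinned rev of every hook repo in .pre-commit-config.yaml
import yaml  # PyYAML, assumed to be installed

with open(".pre-commit-config.yaml") as f:
    config = yaml.safe_load(f)

for repo in config.get("repos", []):
    # local/meta repos carry no rev, so fall back to a placeholder
    print(f'{repo["repo"]}: {repo.get("rev", "<no rev>")}')

# Expected to include, after this commit:
#   https://github.com/astral-sh/ruff-pre-commit: v0.0.285
#   https://github.com/asottile/blacken-docs: 1.16.0
```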
5 changes: 1 addition & 4 deletions doc/getting-started.md
@@ -116,10 +116,7 @@ resources = Resources(
     gpu_per_node=2,
     queue_name="GPU_2080Ti",
     group_size=4,
-    custom_flags=[
-        "#SBATCH --nice=100",
-        "#SBATCH --time=24:00:00"
-    ],
+    custom_flags=["#SBATCH --nice=100", "#SBATCH --time=24:00:00"],
     strategy={
         # used when you want to add CUDA_VISIBLE_DIVECES automatically
         "if_cuda_multi_devices": True
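The doc change above is a pure blacken-docs reflow: collapsing the custom_flags list onto one line does not change the value passed to Resources. A minimal standalone sketch of the equivalence (it does not use the dpdispatcher API):

```python
# The multi-line and single-line spellings build the identical list object,
# so the rendered example behaves exactly as it did before the reflow.
multi_line = [
    "#SBATCH --nice=100",
    "#SBATCH --time=24:00:00",
]
single_line = ["#SBATCH --nice=100", "#SBATCH --time=24:00:00"]
assert multi_line == single_line
```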
7 changes: 2 additions & 5 deletions dpdispatcher/hdfs_cli.py
@@ -28,8 +28,7 @@ def exists(uri):
             )
         except Exception as e:
             raise RuntimeError(
-                f"Cannot check existence of hdfs uri[{uri}] "
-                f"with cmd[{cmd}]"
+                f"Cannot check existence of hdfs uri[{uri}] " f"with cmd[{cmd}]"
             ) from e
 
     @staticmethod
@@ -81,9 +80,7 @@ def copy_from_local(local_path, to_uri):
         """
         # Make sure local_path is accessible
         if not os.path.exists(local_path) or not os.access(local_path, os.R_OK):
-            raise RuntimeError(
-                f"try to access local_path[{local_path}] " "but failed"
-            )
+            raise RuntimeError(f"try to access local_path[{local_path}] " "but failed")
         cmd = f"hadoop fs -copyFromLocal -f {local_path} {to_uri}"
         try:
             ret, out, err = run_cmd_with_all_output(cmd)
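Both hdfs_cli.py hunks are likewise formatting-only: Python concatenates adjacent string literals at compile time, so joining the two message pieces onto one line leaves the raised error text unchanged. A minimal sketch of that rule (the uri and cmd values are placeholders, not taken from dpdispatcher):

```python
# Adjacent string literals are merged by the parser, whether they sit on one line or two.
uri = "hdfs:///tmp/demo"            # placeholder value
cmd = f"hadoop fs -test -e {uri}"   # placeholder value

two_lines = (
    f"Cannot check existence of hdfs uri[{uri}] "
    f"with cmd[{cmd}]"
)
one_line = f"Cannot check existence of hdfs uri[{uri}] " f"with cmd[{cmd}]"
assert two_lines == one_line
```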
