Bug Report
Please provide information about your setup
Output of dvc version:
$ dvc version
WARNING: Unable to detect supported link types, as cache directory '.dvc/cache' doesn't exist. It is usually auto-created by commands such as `dvc add/fetch/pull/run/import`, but you could create it manually to enable this check.
DVC version: 1.1.2
Python version: 3.7.7
Platform: Linux-4.20.17-042017-generic-x86_64-with-debian-stretch-sid
Binary: False
Package: pip
Supported remotes: http, https, s3
Repo: dvc, git
Filesystem type (workspace): ('ext4', '/dev/sda2')
Additional Information (if any):
I was trying out DVC and could not make it work with a local deployment of Minio. Minio is hosted at 127.0.0.1:9000 and works as expected; I tested it directly.
Contents of .dvc/config:
[cache]
s3 = s3cache
['remote "s3cache"']
url = s3://mybucket
endpointurl = http://127.0.0.1:9000
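For context, this is roughly the boto3 setup I expect the "s3cache" remote to produce. It is only a sketch of the intended behaviour; the region and the minioadmin credentials below are placeholders, not my actual settings:

import boto3

# Client configured the way the "s3cache" remote describes: endpoint_url
# points at the local Minio server instead of the default AWS endpoint.
s3 = boto3.client(
    "s3",
    region_name="us-east-1",                # placeholder region
    endpoint_url="http://127.0.0.1:9000",   # endpointurl from .dvc/config
    aws_access_key_id="minioadmin",         # placeholder Minio credentials
    aws_secret_access_key="minioadmin",
)

# The HeadObject check that `dvc add --external` needs to make to see that
# s3://mybucket/textfile exists; this kind of call should go to Minio.
s3.head_object(Bucket="mybucket", Key="textfile")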
Logs:
$ dvc add s3://mybucket/textfile --external --verbose
2020-07-02 09:05:55,227 DEBUG: fetched: [(3,)]
2020-07-02 09:05:55,583 DEBUG: fetched: [(0,)]
2020-07-02 09:05:55,587 ERROR: unexpected error - An error occurred (403) when calling the HeadObject operation: Forbidden
------------------------------------------------------------
Traceback (most recent call last):
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/main.py", line 53, in main
ret = cmd.run()
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/command/add.py", line 22, in run
external=self.args.external,
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/repo/__init__.py", line 36, in wrapper
ret = f(repo, *args, **kwargs)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/repo/scm_context.py", line 4, in run
result = method(repo, *args, **kw)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/repo/add.py", line 91, in add
stage.save()
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/stage/__init__.py", line 380, in save
self.save_outs()
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/stage/__init__.py", line 391, in save_outs
out.save()
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/output/base.py", line 253, in save
if not self.exists:
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/output/base.py", line 189, in exists
return self.remote.tree.exists(self.path_info)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/remote/s3.py", line 133, in exists
return self.isfile(path_info) or self.isdir(path_info)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/dvc/remote/s3.py", line 166, in isfile
self.s3.head_object(Bucket=path_info.bucket, Key=path_info.path)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/botocore/client.py", line 316, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/lmaheo/miniconda3/envs/dvc/lib/python3.7/site-packages/botocore/client.py", line 637, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
After some investigation, DVC does seem to take the configuration and the endpointurl into account, but not for this specific boto3 request. I did not dig much further into the code to find out why the two S3 clients are generated from different configurations.
Configuration for the failing request:
{'url_path': '/mybucket/textfile', 'query_string': {}, 'method': 'HEAD', 'headers': {'User-Agent': 'Boto3/1.14.14 Python/3.7.7 Linux/4.20.17-042017-generic Botocore/1.17.14'}, 'body': b'', 'url': 'https://s3.amazonaws.com/mybucket/textfile', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7f4fd6aaf310>, 'has_streaming_input': False, 'auth_type': None, 'signing': {'bucket': 'mybucket'}, 'timestamp': '20200702T130555Z'}}
Configuration loaded by DVC at some point during the call:
{'url': 's3://mybucket', 'endpointurl': 'http://127.0.0.1:9000', 'use_ssl': True, 'listobjects': False}
Any idea as to why this behaviour is happening?
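In case it helps, here is a minimal illustration of what I suspect (not DVC's actual code), since the failing request above goes to https://s3.amazonaws.com/mybucket/textfile rather than to the Minio endpoint:

import boto3

# Client built with the endpoint from the remote config: talks to Minio.
with_endpoint = boto3.client(
    "s3", region_name="us-east-1", endpoint_url="http://127.0.0.1:9000"
)

# Client built without endpoint_url: falls back to the default AWS endpoint
# (in us-east-1), which matches the URL of the failing HeadObject request.
default_client = boto3.client("s3", region_name="us-east-1")

print(with_endpoint.meta.endpoint_url)   # http://127.0.0.1:9000
print(default_client.meta.endpoint_url)  # https://s3.amazonaws.com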
Thanks,
Lucas