
Fix S3.rename didn't correctly remove the old file #248

Merged (5 commits) on Dec 27, 2021

Conversation

belltailjp
Member

@belltailjp belltailjp commented Dec 27, 2021

In the current pfio, S3.rename just copies the content into a new file without removing the original one.

Fixing rename

>>> with pfio.v2.from_url('s3://suzuo/writer/test') as fs:
...     with fs.open('foo.dat', 'wb') as f:
...         f.write(b'hello world')
... 
...     print('ls before', list(fs.list()))
...     fs.rename('foo.dat', 'bar.dat')
...     print('ls after', list(fs.list()))

ls before ['foo.dat']
ls after ['bar.dat', 'foo.dat']

This is because rename calls remove with an already-normalized key, which remove then normalizes again. As a result, remove tries to delete a key that does not exist instead of the original source object that we actually need to delete.
https://github.com/pfnet/pfio/blob/2.0.1/pfio/v2/s3.py#L523

This PR removes the unnecessary normalization when calling remove from rename.
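The double normalization can be illustrated with a self-contained toy (a sketch, not pfio's actual code; `ToyS3` and its method names are made up for illustration):

```python
# Toy model of the bug: both rename() and remove() prepend the
# directory prefix to the key, so calling remove() with an
# already-normalized key makes it target a nonexistent object.

class ToyS3:
    def __init__(self, prefix):
        self.prefix = prefix
        self.objects = {}          # key -> content, stands in for a bucket

    def _normalize(self, key):
        return self.prefix + key

    def remove(self, key):
        # remove() normalizes its argument itself.
        self.objects.pop(self._normalize(key), None)

    def rename_buggy(self, src, dst):
        nsrc = self._normalize(src)
        self.objects[self._normalize(dst)] = self.objects[nsrc]
        self.remove(nsrc)          # normalized twice: deletes nothing

    def rename_fixed(self, src, dst):
        self.objects[self._normalize(dst)] = self.objects[self._normalize(src)]
        self.remove(src)           # raw key: remove() normalizes exactly once


fs = ToyS3('writer/test/')
fs.objects['writer/test/foo.dat'] = b'hello world'
fs.rename_buggy('foo.dat', 'bar.dat')
print(sorted(fs.objects))  # ['writer/test/bar.dat', 'writer/test/foo.dat']
```

With the buggy path, remove ends up popping `'writer/test/writer/test/foo.dat'`, so the old object survives, matching the `ls after` output above.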

Fixing remove

One of the reasons the incorrect rename behavior went unnoticed is that remove didn't raise anything even when the specified target doesn't actually exist.
This S3 behavior is inconsistent with other filesystems.
Since it seems to be a constraint coming from boto3, or perhaps S3 itself, I added an explicit existence check.
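The idea can be sketched against a boto3-style client (`remove_with_check` is an illustrative name, not pfio's actual implementation; the client is assumed to expose `head_object`/`delete_object` as boto3's S3 client does):

```python
def remove_with_check(client, bucket, key):
    # S3's DeleteObject reports success even when the key does not
    # exist, so probe with HeadObject first and translate a 404 into
    # the FileNotFoundError a local filesystem would raise.
    try:
        client.head_object(Bucket=bucket, Key=key)
    except Exception as e:
        # boto3 raises botocore's ClientError, which carries the HTTP
        # status in e.response; duck-typed here to stay self-contained.
        code = getattr(e, 'response', {}).get('Error', {}).get('Code')
        if code in ('404', 'NoSuchKey'):
            raise FileNotFoundError(f's3://{bucket}/{key}') from e
        raise
    client.delete_object(Bucket=bucket, Key=key)
```

Note the check-then-delete pair is not atomic, but for surfacing a missing key to the caller that race is acceptable here.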

Test refactoring

In addition, since the number of S3 test cases is increasing, I DRYed the fixtures to make each test case a little simpler.

The following conventional test case

@mock_s3
def test_foo():
    bucket = "test-dummy-bucket"
    key = "it's me!deadbeef"
    secret = "asedf;lkjdf;a'lksjd"
    with S3(bucket, create_bucket=True):
        with from_url('s3://test-dummy-bucket/',
                      aws_access_key_id=key,
                      aws_secret_access_key=secret) as s3:
            ...
        with open_url('s3://test-dummy-bucket/xxxx', 'rb',
                      aws_access_key_id=key,
                      aws_secret_access_key=secret) as f:
            ...

can be equivalently rewritten as:

def test_foo(s3_fixture):
    with from_url('s3://test-dummy-bucket/', **s3_fixture.aws_kwargs) as s3:
        ...
    with open_url('s3://test-dummy-bucket/xxxx', 'rb', **s3_fixture.aws_kwargs) as f:
        ...
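A hypothetical sketch of the fixture's plumbing (the real fixture would also enter @mock_s3 and S3(bucket, create_bucket=True), omitted here to keep the sketch self-contained and runnable; apart from the `s3_fixture`/`aws_kwargs` names used above, everything is an assumption):

```python
import collections

# Record returned by the (hypothetical) s3_fixture: the bucket name
# plus the credential kwargs each test splats via **fx.aws_kwargs.
S3Fixture = collections.namedtuple('S3Fixture', ['bucket', 'aws_kwargs'])

def make_s3_fixture():
    # In the real suite this body would sit inside a pytest fixture
    # wrapping the moto mock and the S3(create_bucket=True) context.
    return S3Fixture(
        bucket='test-dummy-bucket',
        aws_kwargs={
            'aws_access_key_id': "it's me!deadbeef",
            'aws_secret_access_key': "asedf;lkjdf;a'lksjd",
        },
    )

fx = make_s3_fixture()
print(fx.bucket)              # test-dummy-bucket
print(sorted(fx.aws_kwargs))  # ['aws_access_key_id', 'aws_secret_access_key']
```

Bundling the credentials in one place means a credential change touches a single fixture instead of every test.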

@belltailjp
Member Author

/test

@pfn-ci-bot

Successfully created a job for commit 83a36ac:

@belltailjp
Member Author

/test

@pfn-ci-bot

Successfully created a job for commit 18c64d1:

@kuenishi kuenishi added this to the 2.1.0 milestone Dec 27, 2021
@belltailjp
Member Author

/test

@pfn-ci-bot

Successfully created a job for commit 7e5074d:

@kuenishi kuenishi added the cat:bug Bug report or fix. label Dec 27, 2021
@belltailjp
Member Author

/test

@pfn-ci-bot

Successfully created a job for commit e5d55ff:

Member

@kuenishi kuenishi left a comment

Awesome!

@kuenishi kuenishi merged commit 11ffde6 into pfnet:master Dec 27, 2021