Release 13.0.0
davidfischer-ch committed Nov 14, 2018
1 parent 4780cee commit 9d0fe04
Showing 3 changed files with 27 additions and 3 deletions.
24 changes: 24 additions & 0 deletions changelog.rst
@@ -2,6 +2,30 @@
changelog
=========

-------
v13.0.0
-------

Compatibility breaks
====================

* Remove the fake bson ObjectId (private module) when the library is not available.
* Function filesystem.find_recursive now matches patterns against the whole path.
* Function aws.s3.list_objects: handle multiple patterns, like filesystem.find_recursive.
* Function aws.s3.remove_objects: handle multiple patterns, like filesystem.find_recursive.
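
This break changes how patterns apply: they are now anchored against the whole path, not just the file name. A minimal sketch of the new behaviour, using a hypothetical helper (`matches_whole_path` is not the library's API, just an illustration built on `fnmatch`):

```python
import fnmatch
import re

def matches_whole_path(path, patterns):
    # Hypothetical helper: translate each Unix-style pattern to a regex
    # and anchor the match at the start of the WHOLE path.
    regexes = [re.compile(fnmatch.translate(pattern)) for pattern in patterns]
    return any(regex.match(path) for regex in regexes)

# 'main*' no longer matches 'src/app/main.py', because the pattern is
# matched against the whole path rather than only the file name.
print(matches_whole_path('main.py', ['main*']))
print(matches_whole_path('src/app/main.py', ['main*']))
print(matches_whole_path('src/app/main.py', ['src/*']))
```

Callers that previously passed name-only patterns must now include a directory prefix (or a leading wildcard) to keep matching nested files.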

Features
========

* Add function filesystem.to_user_id.
* Add function filesystem.to_group_id.
* Add function regex.from_path_patterns.
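
The new regex.from_path_patterns presumably centralises the pattern handling shared by filesystem.find_recursive and the aws.s3 functions. A hedged sketch of what it could look like (this is not the library's actual implementation; the signature and the `unix_wildcards` flag are inferred from the aws.s3 diff below):

```python
import fnmatch
import re

def from_path_patterns(patterns, unix_wildcards=True):
    # Sketch only: accept a single pattern or an iterable of patterns,
    # then compile each one, translating Unix wildcards to regexes
    # unless unix_wildcards is False (raw regex mode).
    if isinstance(patterns, str):
        patterns = [patterns]
    return [
        re.compile(fnmatch.translate(p) if unix_wildcards else p)
        for p in patterns
    ]

regexes = from_path_patterns(['*.txt', 'data/*'])
print([bool(r.match('data/report.txt')) for r in regexes])  # [True, True]
```

With `unix_wildcards=False` the same entry point would accept plain regular expressions, which explains how one function can serve both matching styles.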

Fixes and enhancements
======================

* Replace relative imports deeper than one level with absolute imports.

-------
v12.2.3
-------
2 changes: 1 addition & 1 deletion pytoolbox/__init__.py
@@ -2,4 +2,4 @@

from __future__ import absolute_import, division, print_function, unicode_literals

-__version__ = '12.2.3'
+__version__ = '13.0.0'
4 changes: 2 additions & 2 deletions pytoolbox/aws/s3.py
@@ -60,10 +60,10 @@ def read_object(s3, bucket_name, path, file=None, fail=True):
         raise


-def remove_objects(s3, bucket_name, prefix='', pattern=r'.*', simulate=False):
+def remove_objects(s3, bucket_name, prefix='', patterns=r'*', simulate=False, unix_wildcards=True):
     """Remove objects matching pattern, by chunks of 1000 to be efficient."""
     objects = []
-    for obj in list_objects(s3, bucket_name, prefix, pattern):
+    for obj in list_objects(s3, bucket_name, prefix, patterns, unix_wildcards=unix_wildcards):
         key = obj['Key']
         objects.append({'Key': key})
         yield obj
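
The docstring's "by chunks of 1000" reflects an S3 constraint: the DeleteObjects API accepts at most 1000 keys per request. A simplified, hedged sketch of that batching (the real remove_objects yields objects lazily and takes more parameters; `remove_objects_sketch` and `chunks` here are illustrative helpers only):

```python
def chunks(items, size):
    # Yield successive slices of at most `size` items.
    for i in range(0, len(items), size):
        yield items[i:i + size]

def remove_objects_sketch(s3, bucket_name, keys):
    # S3's DeleteObjects call is limited to 1000 keys per request,
    # so deletions are issued in batches of up to 1000.
    for chunk in chunks([{'Key': key} for key in keys], 1000):
        s3.delete_objects(Bucket=bucket_name, Delete={'Objects': chunk})
```

Deleting 2500 keys this way issues three requests (1000, 1000, then 500 keys), instead of 2500 single-object deletes.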
