Automatically transform dynamodb types #103
Conversation
        cls.table.delete()

    def test_item_resource(self):
        self.table.put_item(Item=copy.deepcopy(self.item_data))
It is sort of strange that the dictionary provided by the user gets modified (that is why I needed to send it a copy). I am not sure there is a great workaround for this, because all of these transformations happen before serialization and there is no real way to return a new dictionary of params from the before-parameter-build botocore event.
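For context, here is a minimal sketch of the kind of handler being discussed, assuming a before-parameter-build hook that rebuilds the `Item` mapping rather than mutating it in place; the handler name and event wiring are illustrative, not the PR's actual code:

```python
from boto3.dynamodb.types import TypeSerializer

_serializer = TypeSerializer()

def serialize_item_param(params, **kwargs):
    # Build a brand-new Item mapping instead of writing the wire-format
    # values back into the dictionary the caller passed in.
    if 'Item' in params:
        params['Item'] = {
            key: _serializer.serialize(value)
            for key, value in params['Item'].items()
        }

# Hypothetical wiring; the actual handler/event names may differ:
# session.events.register(
#     'before-parameter-build.dynamodb.PutItem', serialize_item_param)
```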
Force-pushed from 1966aae to 6307175
I updated this PR to plumb in all of the conditional expression stuff. Note that the only new commit is the last one. The rest of the commits come from the conditional expression builder PR.
# This assumes the base class is the base resource object
resource_base_class = base_classes[0]

class DynamoDBHighLevelResource(resource_base_class):
As I was reading this, I was curious why this wasn't just its own class at the module level. It would avoid having to execute the same code every time a DDB resource is created, would allow us to unit test this class directly, and would make the code a little cleaner IMO. Not a huge deal, but I was curious.
The reason I did this was that I needed a handle to the event emitter. By creating the class in the handler, I get a handle to the base resource class, and if I subclass from that and call `super()` on its `__init__` I will get an event emitter.

Ideally, I would add the event emitter to the event, but unfortunately I cannot, because the event emitter at that event belongs to the factory; if I register to that emitter, the higher-level abstraction would be applied to all future clients that are created, even though it should only be applied when the DynamoDB resource is created.

I could also take it out of the method and subclass it from `ServiceResource`, but that may need to be updated in the future if we ever build more base abstractions on top of `ServiceResource`. For now, however, this is a completely viable way of doing it.
I was actually suggesting just making it a mixin class that is also a base class for the passed-in resource class, i.e.:

class DDBHLResource(object):
    def __init__(self, ...):  # all the same code you have

def register_...():
    base_classes.insert(0, DDBHLResource)

That should avoid the problem of subclassing from `ServiceResource` that you mention, and still work with the event emitter requirements you have, wouldn't it?
I am not sure if that would work. I need to subclass from `ServiceResource` because I need the logic in its `__init__`, where it actually sets up the `ResourceMeta`: https://github.com/boto/boto3/blob/develop/boto3/resources/base.py#L85

EDIT: Never mind, it passes the integration tests. I was basing that comment on failed unit tests; those need updating.
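To make the agreed-on approach concrete, here is a minimal sketch of the mixin registration under discussion; the handler and event names are assumptions for illustration rather than the exact code in this PR:

```python
class DynamoDBHighLevelResource(object):
    """Mixin layered in front of the generated DynamoDB resource class.

    It relies on the real base class (ultimately ServiceResource) coming
    later in the MRO, so super().__init__ still sets up ResourceMeta and
    gives access to the client's event emitter.
    """

    def __init__(self, *args, **kwargs):
        super(DynamoDBHighLevelResource, self).__init__(*args, **kwargs)
        # Register the (de)serialization handlers on the client's event
        # emitter here, e.g. via self.meta.client.meta.events.register(...).


def register_high_level_interface(base_classes, **kwargs):
    # Handler for the resource factory's creating-resource-class event:
    # prepend the mixin so it sits before the generated base class.
    base_classes.insert(0, DynamoDBHighLevelResource)

# Hypothetical registration; the event name is an assumption:
# session.events.register(
#     'creating-resource-class.dynamodb', register_high_level_interface)
```

Since the mixin only does useful work when combined with the generated class, instantiating it directly is a bit awkward, which matches the caveat raised later in the thread.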
Overall it looks good and I like the general approach. I just had some comments about the tests, and about possibly pulling some of the inner functions and classes out to the module level to avoid having to create them every time the handlers are invoked.
generated_names = {}
generated_values = {}

# Apply transformation for conditon and key conditon expression parameters.
typo for conditon
Added handlers on both the input and output of clients and resources to automatically use TypeSerializer and TypeDeserializer. Also added the conditional expression builder to allow automatic use of the Attr and Key classes when condition expressions are needed.
Avoids the creation of partial functions by using classes instead.
Avoids directly modifying mutable structures provided by the user.
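A short usage sketch of what this PR enables, assuming a table named 'users' with a 'username' hash key (table and attribute names are illustrative):

```python
import boto3
from boto3.dynamodb.conditions import Attr, Key

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('users')

# Plain Python types are serialized automatically; no {'S': ...} wrappers.
table.put_item(Item={'username': 'jane', 'age': 30, 'active': True})

# Key and Attr build condition expressions without hand-written strings.
response = table.query(KeyConditionExpression=Key('username').eq('jane'))
print(response['Items'])  # values come back as plain Python types

response = table.scan(FilterExpression=Attr('age').gt(21))
```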
@jamesls
Thanks! Just had a small alternate suggestion for that dynamodb high level class that might be a little cleaner.
I made the change. Works fine. I did think it was a little weird that if you instantiate the class directly (i.e. without using
Looks good.
Automatically transform dynamodb types
Added handlers on both the input and output of clients and resources to
automatically use TypeSerializer and TypeDeserializer.
cc @jamesls