Store byte object as field #37
Comments
That's a good idea. Besides, this doesn't seem like something hard to add: just create a new field. A PR is welcome 👍
Okay, good. But what would the default serialization method do? Byte data isn't necessarily convertible to a string, right? How would someone want a bytes field to be serialized? In my special case the bytes field represents some pickled state of an object that I don't even want to be serialized and sent over my API (is there a way to exclude fields on serialization?).
Storing works fine, and if the stored byte field is valid UTF-8 it gets converted into a string on serialization.
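The caveat behind that comment can be shown with plain Python, independent of umongo: arbitrary bytes are not always valid UTF-8, so a blind bytes-to-str conversion can fail (stdlib-only sketch):

```python
# Valid UTF-8 bytes decode cleanly back to a str.
valid = 'héllo'.encode('utf-8')
recovered = valid.decode('utf-8')

# Arbitrary binary data (e.g. pickled state) need not be valid UTF-8.
invalid = b'\xff\xfe\x00'
try:
    invalid.decode('utf-8')
    decodable = True
except UnicodeDecodeError:
    decodable = False
```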
Bytes is a valid BSON type (named "Binary data" in MongoDB's types). Besides, it seems pymongo does the conversion:

```python
>>> hello = 'héllo'
>>> doc_id = db.test.insert({'str': hello, 'bytes': hello.encode()})
>>> db.test.find_one(doc_id)
{'bytes': b'h\xc3\xa9llo', 'str': 'héllo', '_id': ObjectId('57ad9b0713adf23b7095fcee')}
```

So I think the
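The bytes shown in the REPL output above are simply the UTF-8 encoding of the original string, which can be verified without a database:

```python
hello = 'héllo'
encoded = hello.encode()  # str.encode() defaults to UTF-8

# Matches the 'bytes' value pymongo returned in the example above.
assert encoded == b'h\xc3\xa9llo'
assert encoded.decode() == hello
```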
Yes there is! You should use the `load_only`/`dump_only` attributes:

```python
@instance.register
class MyDoc(Document):
    pickled_stuff = fields.BytesField(load_only=True, dump_only=True)
    public_name = fields.StrField()


# inside your POST API
payload = get_payload_from_request()
# raises ValidationError if a 'pickled_stuff' field is present in the payload
my_doc = MyDoc(**payload)
assert my_doc.pickled_stuff is None
my_doc.pickled_stuff = pickle_my_stuff()  # must return bytes
my_doc.commit()
return 200, 'Ok'

# inside your GET API
my_doc = MyDoc.find({'id': my_id})
print(my_doc)
# <... {'pickled_stuff': b'<pickled data>', 'public_name': 'test'}...>
my_doc.dump()
# {'public_name': 'test'}
return 200, json.dumps(my_doc.dump())
```

You should also have a look at the flask example, which shows how to use umongo inside an API with a custom loading/dumping schema.
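For readers without umongo at hand, the effect of hiding a field on dump can be sketched with plain dicts (the `dump` helper and `PRIVATE_FIELDS` set below are hypothetical, not umongo's implementation):

```python
import json
import pickle

# A document holding both public data and private pickled state.
doc = {'public_name': 'test', 'pickled_stuff': pickle.dumps({'state': 1})}

# Fields to hide on serialization, mimicking load_only=True.
PRIVATE_FIELDS = {'pickled_stuff'}

def dump(document):
    """Drop private fields so the result is JSON-serializable."""
    return {k: v for k, v in document.items() if k not in PRIVATE_FIELDS}

payload = json.dumps(dump(doc))
```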
Thanks for sharing this, great stuff! My pickled data would fail on serialization.
I think we should not try to do any string encode/decode inside umongo; this should be the user's responsibility.
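One way for the user to take that responsibility is to encode the bytes explicitly before they reach a text-based format such as JSON (a base64 sketch, independent of umongo):

```python
import base64
import json

raw = b'\x00\xff not valid utf-8 \xfe'

# Encode to ASCII-safe text before JSON, decode back after.
payload = json.dumps({'blob': base64.b64encode(raw).decode('ascii')})
restored = base64.b64decode(json.loads(payload)['blob'])
```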
Is this enhancement still needed? If so, I'd like to give it a try.
@touilleMan @chenjr0719 @lafrech @martinjuhasz
Conclusion from the above: uMongo needs a BinaryField. If the Marshmallow guys refuse to add support for it – f*ck them, let's do it in uMongo. Unfortunately, I'm not a uMongo developer and haven't dug deep into how everything works. Here is an example:

```python
import bson
from marshmallow import compat as ma_compat, fields as ma_fields
from umongo import fields


class BinaryField(fields.BaseField, ma_fields.Field):
    default_error_messages = {
        'invalid': 'Not a valid byte sequence.'
    }

    def _serialize(self, value, attr, data):
        return ma_compat.binary_type(value)

    def _deserialize(self, value, attr, data):
        if not isinstance(value, ma_compat.binary_type):
            self.fail('invalid')
        return value

    def _serialize_to_mongo(self, obj):
        return bson.binary.Binary(obj)

    def _deserialize_from_mongo(self, value):
        return bytes(value)
```

Maybe there are some obscure caveats, maybe not. This is the code I'm currently using in a project and it seems to work like a charm. (I'm using Motor.) Would be nice if someone familiar with the internals of uMongo could take a look.
@thodnev Thanks for the code, I'm going to use it as I need the ability to store binary data (in this case, a password salt created from `os.urandom`). If this code works, I can't imagine it would be too hard to add in a PR (if you haven't already).
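The salt use case mentioned above is a good illustration of why a genuine binary field helps: values from `os.urandom` are raw bytes and in general not valid UTF-8 (stdlib-only sketch; the PBKDF2 parameters are just an example, not from this thread):

```python
import hashlib
import os

salt = os.urandom(16)  # 16 raw bytes; rarely valid UTF-8
key = hashlib.pbkdf2_hmac('sha256', b'password', salt, 100_000)
```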
Is there a way to store a Python bytes object as a field using umongo without relying on GridFS?
In my case I want to store pretty small binary objects that change rarely.
In pymongo the recommended way seems to be a BSON Binary field, but I cannot find a related field in umongo's fields.py.