Commit
Merge remote-tracking branch 'MongoEngine/master'
9nix00 committed Jun 19, 2015
2 parents ab2ef69 + 5fa5284 commit 0bbbbdd
Showing 19 changed files with 767 additions and 186 deletions.
1 change: 1 addition & 0 deletions AUTHORS
@@ -223,3 +223,4 @@ that much better:
* Kiryl Yermakou (https://github.com/rma4ok)
* Matthieu Rigal (https://github.com/MRigal)
* Charanpal Dhanjal (https://github.com/charanpald)
* Emmanuel Leblond (https://github.com/touilleMan)
7 changes: 7 additions & 0 deletions docs/changelog.rst
@@ -19,6 +19,13 @@ Changes in 0.9.X - DEV
- Added __ support to escape field names in lookup keywords that match operator names #949
- Support for PyMongo 3+ #946
- Fix for issue where FileField deletion did not free space in GridFS.
- No_dereference() not respected on embedded docs containing a reference. #517
- Document save raises an exception if save_condition fails #1005
- Fixed some internal _id handling issues. #961
- Updated URL and Email Field regex validators, added schemes argument to URLField validation. #652
- Removed get_or_create() deprecated since 0.8.0. #300
- Capped collection multiple of 256. #1011
- Added `BaseQuerySet.aggregate_sum` and `BaseQuerySet.aggregate_average` methods.
Changes in 0.9.0
================
34 changes: 23 additions & 11 deletions docs/guide/defining-documents.rst
@@ -315,12 +315,12 @@ reference with a delete rule specification. A delete rule is specified by
supplying the :attr:`reverse_delete_rule` attributes on the
:class:`ReferenceField` definition, like this::

    class ProfilePage(Document):
        ...
        employee = ReferenceField('Employee', reverse_delete_rule=mongoengine.CASCADE)

The declaration in this example means that when an :class:`Employee` object is
removed, the :class:`ProfilePage` that references that employee is removed as
well. If a whole batch of employees is removed, all profile pages that are
linked are removed as well.

@@ -447,8 +447,10 @@ A :class:`~mongoengine.Document` may use a **Capped Collection** by specifying
:attr:`max_documents` and :attr:`max_size` in the :attr:`meta` dictionary.
:attr:`max_documents` is the maximum number of documents that is allowed to be
stored in the collection, and :attr:`max_size` is the maximum size of the
collection in bytes. :attr:`max_size` is rounded up to the next multiple of
256 by MongoDB internally (and by mongoengine beforehand), so use a multiple
of 256 yourself to avoid confusion. If :attr:`max_size` is not specified and
:attr:`max_documents` is, :attr:`max_size` defaults to 10485760 bytes (10MB).
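The rounding behaviour can be sketched in plain Python (a sketch of the arithmetic, not mongoengine's actual implementation):

```python
def round_capped_size(max_size):
    """Round a capped collection's max_size up to the next multiple of 256,
    mirroring what MongoDB does internally."""
    return -(-max_size // 256) * 256  # ceiling division

print(round_capped_size(10000000))         # 10000128 -- why 10000000 was a confusing default
print(round_capped_size(2 * 1024 * 1024))  # 2097152 -- already a multiple of 256
```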
The following example shows a :class:`Log` document that will be limited to
1000 entries and 2MB of disk space::

@@ -465,19 +467,26 @@ You can specify indexes on collections to make querying faster. This is done
by creating a list of index specifications called :attr:`indexes` in the
:attr:`~mongoengine.Document.meta` dictionary, where an index specification may
either be a single field name, a tuple containing multiple field names, or a
dictionary containing a full index definition.

A direction may be specified on fields by prefixing the field name with a
**+** (for ascending) or a **-** sign (for descending). Note that direction
only matters on multi-field indexes. Text indexes may be specified by prefixing
the field name with a **$**. Hashed indexes may be specified by prefixing
the field name with a **#**::

    class Page(Document):
        category = IntField()
        title = StringField()
        rating = StringField()
        created = DateTimeField()
        meta = {
            'indexes': [
                'title',
                '$title',  # text index
                '#title',  # hashed index
                ('title', '-rating'),
                ('category', '_cls'),
                {
                    'fields': ['created'],
                    'expireAfterSeconds': 3600
@@ -532,11 +541,14 @@ There are a few top level defaults for all indexes that can be set::
    :attr:`index_background` (Optional)
        Set the default value for if an index should be indexed in the background

    :attr:`index_cls` (Optional)
        A way to turn off a specific index for _cls.

    :attr:`index_drop_dups` (Optional)
        Set the default value for if an index should drop duplicates

        .. note:: Since MongoDB 3.0 drop_dups is not supported anymore; it
           raises a warning and has no effect.
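As a sketch, these defaults could appear together in a document's meta dictionary (the values below are illustrative, not recommendations):

```python
# Illustrative values only -- pick what fits your deployment.
meta = {
    'index_background': True,   # build indexes in the background by default
    'index_cls': False,         # don't automatically add _cls to each index
    'index_drop_dups': False,   # ignored since MongoDB 3.0 (warning only)
}
```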


Compound Indexes and Indexing sub documents
20 changes: 5 additions & 15 deletions docs/guide/querying.rst
@@ -263,21 +263,11 @@ no document matches the query, and
if more than one document matched the query. These exceptions are merged into
your document definitions, e.g. `MyDoc.DoesNotExist`

A variation of this method exists,
:meth:`~mongoengine.queryset.QuerySet.get_or_create`, that will create a new
document with the query arguments if no documents match the query. An
additional keyword argument, :attr:`defaults` may be provided, which will be
used as default values for the new document, in the case that it should need
to be created::

>>> a, created = User.objects.get_or_create(name='User A', defaults={'age': 30})
>>> b, created = User.objects.get_or_create(name='User A', defaults={'age': 40})
>>> a.name == b.name and a.age == b.age
True

.. warning::
:meth:`~mongoengine.queryset.QuerySet.get_or_create` method is deprecated
since :mod:`mongoengine` 0.8.
A variation of this method, get_or_create() existed, but it was unsafe. It
could not be made safe, because there are no transactions in mongoDB. Other
approaches should be investigated, to ensure you don't accidentally duplicate
data when using something similar to this method. Therefore it was deprecated
in 0.8 and removed in 0.10.
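The race that made get_or_create() unsafe can be simulated with an in-memory stand-in for a collection (names here are illustrative, not a mongoengine API):

```python
class FakeCollection(object):
    """In-memory stand-in for a MongoDB collection (illustrative only)."""
    def __init__(self):
        self.docs = []

    def find(self, name):
        return [d for d in self.docs if d['name'] == name]

    def insert(self, doc):
        self.docs.append(doc)

col = FakeCollection()

# Two clients run a get-or-create concurrently: both check first...
client_a_sees = col.find('User A')   # [] -- not found
client_b_sees = col.find('User A')   # [] -- not found either

# ...and since the check and the insert are not atomic, both insert.
if not client_a_sees:
    col.insert({'name': 'User A'})
if not client_b_sees:
    col.insert({'name': 'User A'})

print(len(col.find('User A')))  # 2 -- the document was duplicated
```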

Default Document queries
========================
35 changes: 25 additions & 10 deletions mongoengine/base/document.py
@@ -184,7 +184,7 @@ def __setattr__(self, name, value):
self__initialised = False
# Check if the user has created a new instance of a class
if (self._is_document and self__initialised
and self__created and name == self._meta.get('id_field')):
super(BaseDocument, self).__setattr__('_created', False)

super(BaseDocument, self).__setattr__(name, value)
@@ -672,7 +672,7 @@ def _delta(self):

@classmethod
def _get_collection_name(cls):
"""Returns the collection name for this class.
"""Returns the collection name for this class. None for abstract class
"""
return cls._meta.get('collection', None)

@@ -782,7 +782,7 @@ def _build_index_spec(cls, spec):
allow_inheritance = cls._meta.get('allow_inheritance',
ALLOW_INHERITANCE)
include_cls = (allow_inheritance and not spec.get('sparse', False) and
spec.get('cls', True) and '_cls' not in spec['fields'])

# 733: don't include cls if index_cls is False unless there is an explicit cls with the index
include_cls = include_cls and (spec.get('cls', False) or cls._meta.get('index_cls', True))
@@ -795,16 +795,25 @@ def _build_index_spec(cls, spec):

# ASCENDING from +
# DESCENDING from -
# TEXT from $
# HASHED from #
# GEOSPHERE from (
# GEOHAYSTACK from )
# GEO2D from *
direction = pymongo.ASCENDING
if key.startswith("-"):
    direction = pymongo.DESCENDING
elif key.startswith("$"):
    direction = pymongo.TEXT
elif key.startswith("#"):
    direction = pymongo.HASHED
elif key.startswith("("):
    direction = pymongo.GEOSPHERE
elif key.startswith(")"):
    direction = pymongo.GEOHAYSTACK
elif key.startswith("*"):
    direction = pymongo.GEO2D
if key.startswith(("+", "-", "*", "$", "#", "(", ")")):
    key = key[1:]
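The prefix table above can be illustrated standalone; in this sketch plain strings stand in for the pymongo direction constants:

```python
# Plain strings stand in for pymongo.DESCENDING, pymongo.TEXT, etc.
PREFIX_DIRECTIONS = {
    '-': 'DESCENDING',
    '$': 'TEXT',
    '#': 'HASHED',
    '(': 'GEOSPHERE',
    ')': 'GEOHAYSTACK',
    '*': 'GEO2D',
}

def parse_index_key(key):
    """Split an index spec key into (field_name, direction)."""
    direction = PREFIX_DIRECTIONS.get(key[0], 'ASCENDING')
    if key[0] in '+-*$#()':
        key = key[1:]          # strip the direction marker
    return key, direction

print(parse_index_key('#title'))   # ('title', 'HASHED')
print(parse_index_key('+name'))    # ('name', 'ASCENDING')
```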

# Use real field name, do it manually because we need field
@@ -827,7 +836,8 @@ def _build_index_spec(cls, spec):
index_list.append((key, direction))

# Don't add cls to a geo index
if include_cls and direction not in (
        pymongo.GEO2D, pymongo.GEOHAYSTACK, pymongo.GEOSPHERE):
index_list.insert(0, ('_cls', 1))

if index_list:
@@ -973,8 +983,13 @@ def _lookup_field(cls, parts):
if hasattr(getattr(field, 'field', None), 'lookup_member'):
new_field = field.field.lookup_member(field_name)
else:
    # Look up subfield on the previous field or raise
    try:
        new_field = field.lookup_member(field_name)
    except AttributeError:
        raise LookUpError('Cannot resolve subfield or operator {} '
                          'on the field {}'.format(field_name, field.name))
if not new_field and isinstance(field, ComplexBaseField):
if hasattr(field.field, 'document_type') and cls._dynamic \
and field.field.document_type._dynamic:
8 changes: 6 additions & 2 deletions mongoengine/base/fields.py
@@ -290,6 +290,7 @@ def to_python(self, value):
return value

if self.field:
self.field._auto_dereference = self._auto_dereference
value_dict = dict([(key, self.field.to_python(item))
for key, item in value.items()])
else:
@@ -424,8 +425,11 @@ class ObjectIdField(BaseField):
"""

def to_python(self, value):
try:
    if not isinstance(value, ObjectId):
        value = ObjectId(value)
except Exception:
    pass
return value

def to_mongo(self, value):
31 changes: 23 additions & 8 deletions mongoengine/base/metaclasses.py
@@ -385,15 +385,17 @@ def __new__(cls, name, bases, attrs):
new_class._auto_id_field = getattr(parent_doc_cls,
'_auto_id_field', False)
if not new_class._meta.get('id_field'):
    # Since 0.10, pick a free name instead of overwriting an existing field
    id_name, id_db_name = cls.get_auto_id_names(new_class)
    new_class._auto_id_field = True
    new_class._meta['id_field'] = id_name
    new_class._fields[id_name] = ObjectIdField(db_field=id_db_name)
    new_class._fields[id_name].name = id_name
    new_class.id = new_class._fields[id_name]
    new_class._db_field_map[id_name] = id_db_name
    new_class._reverse_db_field_map[id_db_name] = id_name
    # Prepend id field to _fields_ordered
    new_class._fields_ordered = (id_name, ) + new_class._fields_ordered

# Merge in exceptions with parent hierarchy
exceptions_to_merge = (DoesNotExist, MultipleObjectsReturned)
@@ -408,6 +410,19 @@ def __new__(cls, name, bases, attrs):

return new_class

def get_auto_id_names(self):
    id_name, id_db_name = ('id', '_id')
    if id_name not in self._fields and \
            id_db_name not in (v.db_field for v in self._fields.values()):
        return id_name, id_db_name
    id_basename, id_db_basename, i = 'auto_id', '_auto_id', 0
    while id_name in self._fields or \
            id_db_name in (v.db_field for v in self._fields.values()):
        id_name = '{0}_{1}'.format(id_basename, i)
        id_db_name = '{0}_{1}'.format(id_db_basename, i)
        i += 1
    return id_name, id_db_name
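To illustrate the renaming scheme in get_auto_id_names above, here is a standalone sketch in which plain sets stand in for the metaclass's field registry (the function name is hypothetical):

```python
def pick_auto_id_name(field_names, db_field_names):
    """Mirror get_auto_id_names: prefer 'id'/'_id', else 'auto_id_0', ..."""
    id_name, id_db_name = 'id', '_id'
    if id_name not in field_names and id_db_name not in db_field_names:
        return id_name, id_db_name
    i = 0
    while id_name in field_names or id_db_name in db_field_names:
        id_name = 'auto_id_{0}'.format(i)
        id_db_name = '_auto_id_{0}'.format(i)
        i += 1
    return id_name, id_db_name

# a document with no 'id' field keeps the default names
print(pick_auto_id_name({'name'}, {'name'}))               # ('id', '_id')
# a document that already defines 'id' gets a fresh name instead
print(pick_auto_id_name({'id', 'name'}, {'_id', 'name'}))  # ('auto_id_0', '_auto_id_0')
```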


class MetaDict(dict):

25 changes: 16 additions & 9 deletions mongoengine/dereference.py
@@ -128,21 +128,25 @@ def _fetch_objects(self, doc_type=None):
"""
object_map = {}
for collection, dbrefs in self.reference_map.iteritems():
    if hasattr(collection, 'objects'):  # We have a document class for the refs
        col_name = collection._get_collection_name()
        refs = [dbref for dbref in dbrefs
                if (col_name, dbref) not in object_map]
        references = collection.objects.in_bulk(refs)
        for key, doc in references.iteritems():
            object_map[(col_name, key)] = doc
    else:  # Generic reference: use the refs data to convert to document
        if isinstance(doc_type, (ListField, DictField, MapField,)):
            continue

        refs = [dbref for dbref in dbrefs
                if (collection, dbref) not in object_map]

        if doc_type:
            references = doc_type._get_db()[collection].find({'_id': {'$in': refs}})
            for ref in references:
                doc = doc_type._from_son(ref)
                object_map[(collection, doc.id)] = doc
        else:
            references = get_db()[collection].find({'_id': {'$in': refs}})
            for ref in references:
@@ -154,7 +158,7 @@ def _fetch_objects(self, doc_type=None):
for x in collection.split('_')))._from_son(ref)
else:
doc = doc_type._from_son(ref)
object_map[(collection, doc.id)] = doc
return object_map
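The switch from plain ids to (collection, id) keys matters when two collections each contain a document with the same id; a minimal sketch of the idea with plain dicts:

```python
# A flat {id: doc} map silently collides when two collections share an id.
flat_map = {}
flat_map[1] = {'_id': 1, 'kind': 'employee'}
flat_map[1] = {'_id': 1, 'kind': 'profile_page'}   # clobbers the employee

# Keying by (collection, id) keeps both documents distinct.
keyed_map = {}
keyed_map[('employee', 1)] = {'_id': 1, 'kind': 'employee'}
keyed_map[('profile_page', 1)] = {'_id': 1, 'kind': 'profile_page'}

print(len(flat_map))    # 1 -- one entry lost
print(len(keyed_map))   # 2 -- both survive
```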

def _attach_objects(self, items, depth=0, instance=None, name=None):
@@ -180,7 +184,8 @@ def _attach_objects(self, items, depth=0, instance=None, name=None):

if isinstance(items, (dict, SON)):
if '_ref' in items:
return self.object_map.get(
    (items['_ref'].collection, items['_ref'].id), items)
elif '_cls' in items:
doc = get_document(items['_cls'])._from_son(items)
_cls = doc._data.pop('_cls', None)
@@ -216,17 +221,19 @@ def _attach_objects(self, items, depth=0, instance=None, name=None):
for field_name, field in v._fields.iteritems():
v = data[k]._data.get(field_name, None)
if isinstance(v, (DBRef)):
data[k]._data[field_name] = self.object_map.get(
    (v.collection, v.id), v)
elif isinstance(v, (dict, SON)) and '_ref' in v:
data[k]._data[field_name] = self.object_map.get(
    (v['_ref'].collection, v['_ref'].id), v)
elif isinstance(v, (dict, list, tuple)) and depth <= self.max_depth:
item_name = "{0}.{1}.{2}".format(name, k, field_name)
data[k]._data[field_name] = self._attach_objects(v, depth, instance=instance, name=item_name)
elif isinstance(v, (dict, list, tuple)) and depth <= self.max_depth:
item_name = '%s.%s' % (name, k) if name else name
data[k] = self._attach_objects(v, depth - 1, instance=instance, name=item_name)
elif hasattr(v, 'id'):
data[k] = self.object_map.get((v.collection, v.id), v)

if instance and name:
if is_list: