django.contrib.postgres.fields.ArrayField #2485
Conversation
```python
def index_transform_factory(index, base_field):

    class IndexTransform(Transform):
```
Please don't create new classes dynamically for each query like this. `IndexTransform` should be factored out and take some parameter to its constructor (and then offer an `__call__` or something); same with `SliceTransform`.

I'd prefer these to live in the
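The suggested refactor can be sketched outside Django. This is only an illustration of the pattern being asked for, using a simplified stand-in for `Transform`; the class and method names here are assumptions, not Django's actual API:

```python
# Sketch of the suggested refactor: instead of a factory that builds a new
# class per query, one reusable class takes the index as constructor state.
# The Transform base here is a simplified stand-in, not Django's real one.

class Transform:
    def __init__(self, expression):
        self.expression = expression


class IndexTransform(Transform):
    """One class for all queries; the index is per-instance state."""

    def __init__(self, index, base_field, expression):
        super().__init__(expression)
        self.index = index
        self.base_field = base_field

    def as_sql(self):
        # Hypothetical SQL rendering; Postgres array indexing is 1-based.
        return '%s[%d]' % (self.expression, self.index)


transform = IndexTransform(2, 'integer', 'myarray')
print(transform.as_sql())  # myarray[2]
```

The same object can then be constructed once per lookup instead of minting a fresh class each time.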
Option 2 for dimensions looks good. As for deconstruction, what extra control would you like? I'd rather this stuff was more achievable from inside fields themselves. Looking over the diff, it looks like you'd want the ability to pass out whole field instances? That should work... And for testing things with migrations, it's enough to just add migrations into a test app, and they'll get run at test time. If you want to explicitly test individual migration operations, you'll need something like I have in the "migrations" tests, where you swap in different values of MIGRATION_MODULES for certain tests and run the migrate command (or the machinery underlying it) directly.
Thanks Andrew, I hadn't realised that deconstruction was recursive. I've added a test that
Most of the forms code is now present. The js in the admin needs improving, and the admin integration needs some tests. I need to look at how we've tested similar things in other areas to know exactly what to write here.
|
```python
vals = json.loads(value)
value = []
for val in vals:
    value.append(self.base_field.to_python(val))
```
Why not a list comprehension here? (And on line 106)

`value = [self.base_field.to_python(val) for val in vals]`
👍
Or even the faster `map(self.base_field.to_python, vals)`, but that's more arguable.
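The three forms under discussion are equivalent; a quick standalone check, using `int` as a stand-in for `self.base_field.to_python`:

```python
import json

# Compare the three conversion styles discussed above. `int` stands in for
# base_field.to_python, which isn't available outside Django.
value = '[1, "2", 3]'
to_python = int

vals = json.loads(value)

# Explicit loop (as in the patch)
result_loop = []
for val in vals:
    result_loop.append(to_python(val))

# List comprehension (the suggestion)
result_comp = [to_python(val) for val in vals]

# map() -- on Python 3 it returns a lazy iterator, so wrap it in list()
result_map = list(map(to_python, vals))

assert result_loop == result_comp == result_map == [1, 2, 3]
```

Note that `map` being "faster" mostly holds when the callable is a builtin; with a Python-level `to_python` the difference is usually negligible.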
Great work :) I really can't wait to see it in Django!
```python
    NullableIntegerArrayModel.objects.create(field=[2, 3]),
    NullableIntegerArrayModel.objects.create(field=[20, 30, 40]),
    NullableIntegerArrayModel.objects.create(field=None),
]
```
Why not use `bulk_create` here? I may be a bit obsessed with performance, but I like it when tests are fast too ;)
```python
self.objs = NullableIntegerArrayModel.objects.bulk_create([
    NullableIntegerArrayModel(field=[1]),
    NullableIntegerArrayModel(field=[2]),
    NullableIntegerArrayModel(field=[2, 3]),
    NullableIntegerArrayModel(field=[20, 30, 40]),
    NullableIntegerArrayModel(field=None),
])
```
Bulk create bypasses some logic so I'd rather stick to the "safe" option.
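The "bypasses some logic" trade-off can be shown with a toy, non-Django model. This is only an analogy: the class and function names below are invented for illustration, but the shape matches Django's behaviour, where `bulk_create` does a single INSERT and skips per-object `save()` logic:

```python
# Toy illustration (not Django) of why a bulk path can be less "safe":
# per-object save() may run extra logic that a bulk insert skips.

class Model:
    saved_hooks_run = 0

    def save(self):
        # Imagine signals, auto-now fields, or custom save() overrides here.
        Model.saved_hooks_run += 1


def create(obj):
    obj.save()          # per-object path: hooks run
    return obj


def bulk_create(objs):
    return list(objs)   # bulk path: one INSERT, no save() hooks


create(Model())
assert Model.saved_hooks_run == 1

bulk_create([Model(), Model()])
assert Model.saved_hooks_run == 1  # the two bulk objects skipped save()
```

For test fixtures that don't depend on `save()` side effects, the bulk path is purely a speed win; here the author prefers the safe option.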
They don't play nice with flexible sizes.
Missing:

- Tests
- Fully working js
It needs a better way of handling JS widgets in the admin as a whole before this is easy to write. In particular, there are serious issues involving DateTimePicker when used in an array.
This will be a documented pattern so having a test for it is useful.
Ok, so I have removed the admin functionality for now. To do this nicely, it seems likely I will need a more thorough review of how JavaScript widgets in the admin are built. However, the model field, form fields and documentation are ready for review. I think this is a complete enough patch for initial inclusion.
```python
self.base_field.set_attributes_from_name(name)

@property
def definition(self):
```
Is this needed somewhere?
```python
return '%s[%s]' % (self.base_field.db_type(connection), size)

def get_prep_value(self, value):
    if isinstance(value, list):
```
Is `list` sufficient here, or should this work for every iterable?
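The broader check the reviewer is asking about can be sketched without Django; `as_list` is a hypothetical helper, not part of the patch. The usual wrinkle is that strings and bytes are iterable but shouldn't be treated as array values:

```python
from collections.abc import Iterable

# Accept any iterable as an array value, except strings/bytes, which are
# iterable but not "array-like". Non-iterables pass through unchanged.
def as_list(value):
    if isinstance(value, Iterable) and not isinstance(value, (str, bytes)):
        return list(value)
    return value


assert as_list((1, 2)) == [1, 2]        # tuples accepted
assert as_list(range(3)) == [0, 1, 2]   # generators/ranges accepted
assert as_list('abc') == 'abc'          # strings left alone
```

The `isinstance(value, list)` check in the patch is the conservative choice; widening it like this is a design decision, not a bug fix.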
```python
return self.widget.is_hidden

def value_from_datadict(self, data, files, name):
    regex = re.compile(name + '_([0-9]+).*')
```
Are we sure that `name` does not have to be escaped? I guess it should be a valid Python identifier and thus be safe, but maybe it's worth leaving a comment here?
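The defensive version would use `re.escape`; for identifier-like names the two patterns behave identically, which is the reviewer's point. A quick check (the name values here are made up):

```python
import re

# With an identifier-like name, escaping changes nothing:
name = 'field'
unescaped = re.compile(name + '_([0-9]+).*')
escaped = re.compile(re.escape(name) + '_([0-9]+).*')

assert unescaped.match('field_12_0').group(1) == '12'
assert escaped.match('field_12_0').group(1) == '12'

# But if a name ever contained regex metacharacters, only the escaped
# pattern would match it literally:
odd = 'a.b'
assert re.match(re.escape(odd) + '_([0-9]+)', 'a.b_3')
assert re.match(odd + '_([0-9]+)', 'aXb_3')  # unescaped '.' matches 'X' too
```

So escaping costs nothing and guards against a future caller passing an unusual name, which is why a comment (or `re.escape`) is worth adding.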
…m_lookup and custom_transform. Previously, class lookups from the output_type would be used, but any changes to custom_lookup or custom_transform would be ignored.
Also fix slicing as much as it can be fixed.
If we aren't including the variable size one, we don't need to search like this.
Committed in 6041626
What about basic admin functionality for the array field?
This is a first draft of Array fields. The basic field definition is there, with the required functionality to handle arrays of almost any type. I've also written the lookups/transforms specific to array fields.
Work still to do:
The last of these is a particularly interesting case. Postgres has a "casual relationship" with the definition of an array field. You can create `integer[]`, `integer[][]`, `integer[3][4]` etc., but the Postgres docs state that this is basically just documentation, as it is not enforced at all. We have a couple of options here:

1. Allow `max_size=4` and do Python-side-only validation. We'd still pass the correct `[4]` to Postgres, but it won't enforce integrity.
2. Allow a `dimensions` flag to be passed, allowing for any option. I think this isn't needed, as if you want a 2-dimensional array you could do `ArrayField(ArrayField(IntegerField()))`. This also makes the code path much easier, as all the functions which delegate to the `base_field` don't have to worry about its dimensions.

In the absence of strong opinion otherwise, I'm going to do option 2.
Other notes for reviewers:

- `contained_by` is `contains` with the arguments reversed. It's basically an "is subset" operator. Thinking about it as I'm writing this, I think it does have use cases, so I should add it in.
- Text lookups (`__iexact`, `startswith` etc.) continue to be accepted, even though they are largely useless. `contains` has been overloaded with a more sensible implementation. This is on the principle that date-based fields accept them, and the query is functional (casts everything to text). Personally, I would like fields to only support the lookups which make sense on them now that is easily done, but this is a backwards-incompatible change. I may open it up as a ticket when working on refactoring `__year` etc. into transforms.
- The `deconstruct` method means the `__init__` accepts two formats for the base field. I wonder whether this could be avoided if there is a suitable hook in `migrations.writer` to allow me to pass a string containing the correct field definition for the `base_field` from `deconstruct`. This would make the migration files look less weird. @andrewgodwin, is this sensible? Also, should I have explicit tests that migrations work, and if so, what would that look like?
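The recursion Andrew points out can be shown with a toy, non-Django example. This is only an analogy: Django's real `deconstruct` returns a 4-tuple and the migration writer does the recursion, but the shape is the same, so a nested field "just works":

```python
# Toy recursive deconstruction (not Django's actual machinery): each field
# reduces to (path, args, kwargs), and any field appearing in args is
# reduced the same way, so arbitrarily nested fields serialise cleanly.

class Field:
    def deconstruct(self):
        return ('myapp.Field', [], {})


class ArrayField(Field):
    def __init__(self, base_field):
        self.base_field = base_field

    def deconstruct(self):
        # Recurse into the base field rather than special-casing nesting.
        return ('myapp.ArrayField', [self.base_field.deconstruct()], {})


path, args, kwargs = ArrayField(ArrayField(Field())).deconstruct()
assert path == 'myapp.ArrayField'

inner_path, inner_args, _ = args[0]
assert inner_path == 'myapp.ArrayField'
assert inner_args[0][0] == 'myapp.Field'
```

This is why `ArrayField(ArrayField(IntegerField()))` needs no extra migration support: the writer simply deconstructs whatever field instances it finds in `args`.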