Large queries cause errors #9
This issue seems similar to this one from the original repository: michiya#143. At least the hot-fix solution is the same: split requests into chunks of 2000 and swallow the bad performance :)
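The chunking hot-fix described above can be sketched as follows. This is a minimal, hedged illustration, not code from the issue: the helper name `chunked` is made up, and the usage comment reuses `MyModel` / `my_field` / `new_values` from the reporter's example further down the thread.

```python
from itertools import islice

def chunked(iterable, size=2000):
    """Yield successive lists of at most `size` items from `iterable`.

    2000 keeps each query safely under SQL Server's 2100-parameter limit.
    """
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical usage with a Django queryset:
# results = []
# for batch in chunked(new_values, 2000):
#     results.extend(MyModel.objects.filter(my_field__in=batch))
```

The trade-off, as the comment says, is performance: one large query becomes many round trips.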
Here is a workaround:
Basically, what it does is send a comma-separated value and then re-split it to use in the query.
Don't even need SPLIT_STRING...
The issue that I have with my previous solutions is that they only work on a case-by-case basis and will not work if we use prefetch_related with more than 2100 related objects. To fix this, create a lookup.py next to your models.py with:
Make sure to import it in your models.py file:
and voilà, it works!
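The lookup.py referenced above was not captured in this thread. As a sketch of the core idea only (the function name and delimiter are assumptions, and this is plain Python rather than the full Django lookup class): instead of emitting one placeholder per value, pass the whole list as a single comma-joined string and let SQL Server's STRING_SPLIT expand it server-side, so the query uses exactly one parameter.

```python
def as_split_string_sql(column, values, delimiter=","):
    """Build an IN-style predicate that uses a single query parameter.

    `column IN (%s, %s, ...)` needs one placeholder per value and hits
    SQL Server's 2100-parameter limit; a subquery over STRING_SPLIT
    lets the whole list travel as one string parameter instead.
    """
    sql = f"{column} IN (SELECT value FROM STRING_SPLIT(%s, '{delimiter}'))"
    params = [delimiter.join(str(v) for v in values)]
    return sql, params
```

In a real Django lookup this logic would live in a subclass of the `In` lookup registered via `Field.register_lookup`, which appears to be what the comment describes. Note the values themselves must not contain the delimiter.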
STRING_SPLIT was added in SQL Server 2016; for earlier versions, this function works for me:
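The function itself was not captured above. A commonly used pre-2016 substitute is an XML-based table-valued split function; the sketch below is one such pattern, not necessarily the commenter's exact code, and it breaks on values containing XML special characters (`<`, `&`) unless they are escaped first.

```sql
-- Sketch of a STRING_SPLIT substitute for SQL Server < 2016.
CREATE FUNCTION dbo.SplitString (@List NVARCHAR(MAX), @Delim NCHAR(1))
RETURNS TABLE
AS
RETURN (
    SELECT x.i.value('.', 'NVARCHAR(MAX)') AS value
    FROM (
        SELECT CAST('<r>' + REPLACE(@List, @Delim, '</r><r>') + '</r>' AS XML) AS doc
    ) AS t
    CROSS APPLY t.doc.nodes('/r') AS x(i)
);
```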
@etiennepouliot I'm trying your solution, but despite importing it into models.py, those functions in the IN lookup don't seem to be run when I call prefetch_related. Is this not meant to work for that? I'm using graphene-django with gql_optimizer, which results in one giant query with 25000+ parameters.
A solution has been implemented on the dev branch of the Microsoft fork. See my issue on that repo for details. It uses the Microsoft-recommended approach to large parameter lists: creating a TEMP TABLE and joining against it.
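The fork's implementation is not reproduced here; the sketch below only illustrates the temp-table idea under stated assumptions (the `#in_values` table name, single-column layout, and batch size are all made up). The value list is loaded into a temp table in batches small enough to stay under the parameter limit, and the final query joins against that table instead of using a huge `IN (...)`.

```python
def temp_table_in_clause(values, batch_size=1000):
    """Sketch of the temp-table approach to large IN lists.

    Returns a list of (sql, params) statements: create a temp table,
    then fill it in batches that each stay under SQL Server's
    2100-parameter limit. The caller's main query would then use
    `JOIN #in_values ON ...` instead of `IN (%s, %s, ...)`.
    """
    stmts = [("CREATE TABLE #in_values (value NVARCHAR(MAX))", [])]
    values = list(values)
    for i in range(0, len(values), batch_size):
        batch = values[i:i + batch_size]
        placeholders = ", ".join("(%s)" for _ in batch)
        stmts.append(
            (f"INSERT INTO #in_values (value) VALUES {placeholders}", batch)
        )
    return stmts
```

The per-statement inserts are cheap relative to parsing a single statement with tens of thousands of parameters, which is why Microsoft recommends this pattern.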
Indeed, it's not working for prefetch_related this way. If anybody knows how, I would appreciate it. Maybe I need to dig into this deeper.
I have a product where around 300k instances are created in an import. Before importing (which I fear may fail as well), I perform an existence check against a single field in the form

MyModel.objects.filter(my_field__in=new_values).exists()

where new_values is a set of strings. Here is the stack trace of the issue:
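For this particular existence check, the chunking workaround from earlier in the thread can be applied directly. A hedged sketch (the helper name `chunked_exists` is made up; it short-circuits on the first chunk that matches):

```python
def chunked_exists(queryset, field, values, size=2000):
    """Run an __in .exists() check in chunks of `size` values.

    Each chunked query stays under SQL Server's 2100-parameter limit,
    and any() stops issuing queries after the first hit.
    """
    values = list(values)
    return any(
        queryset.filter(**{f"{field}__in": values[i:i + size]}).exists()
        for i in range(0, len(values), size)
    )

# Hypothetical usage, matching the report above:
# chunked_exists(MyModel.objects.all(), "my_field", new_values)
```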