-
Having too many columns has a big impact on performance, especially on inserts and updates. If you are using SQL Server, you could also have a look at sparse columns (if you have a lot of NULL values). Another thing you may consider is creating a specific scope for this one huge table, and a regular scope for the others.
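As a concrete illustration of the sparse-column suggestion, here is a minimal T-SQL sketch. The table and column names (`dbo.BigTable`, `RarelyUsedValue`) are hypothetical; note that a sparse column must be nullable, and sparse storage only pays off when most rows hold NULL in that column.

```sql
-- Mark a mostly-NULL column as SPARSE so SQL Server stores
-- only the non-NULL values; the column must allow NULLs.
ALTER TABLE dbo.BigTable
ALTER COLUMN RarelyUsedValue int SPARSE NULL;

-- Reverting is the same statement without SPARSE:
-- ALTER TABLE dbo.BigTable ALTER COLUMN RarelyUsedValue int NULL;
```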
-
Hello!
We are syncing a database with 10 tables, where one of the tables has around 60 columns we have to sync and around 250k rows. Reinitializing that table takes a long time (10+ minutes). We found that most of the work happens when applying (inserting) the data for that one huge table on the client side. After some testing we found that the number of columns does affect performance, so we narrowed them down to 25 columns, but the sync is still slow, specifically on some devices (it's a mobile app client). If we drop all the columns but two, the full sync completes in around 4 minutes. Obviously, that is not a solution for us; we need the data in those columns.

So we are wondering: what can we do (without changing/modifying the database) to improve the performance of the sync? Any tips? We reinitialize each time the user changes, but this one large table could somehow stay, since every user uses the same data. Can we take advantage of that? Could I reinitialize all but one table?
Thanks in advance.