0013726: table "container_content": over 13041450 rows - in words: thirteen million rows - how to clean up? #6789
Reported by nbe-renzel-net on 23 Jan 2018 13:58
We're syncing our Tine20 installation with our Active Directory every hour using a cron job. Tine20 has been running for about 10 months now.
The PostgreSQL dumps keep growing - about 6 MB larger every day (gzip-compressed) - and I wondered why.
Looking at the database directly, I see that the table "tine20_container_content" now contains about 13 million rows. Most rows have "container_id=1", which corresponds to our "Internal Contacts" that are updated with every Active Directory sync.
Is it really necessary for this table to grow so large? Are there any caveats if I delete all entries "WHERE container_id=1 AND action='update'"? I see that this table is used by the "Syncroton" library, so I'd better ask here before I delete the - in my eyes - unnecessary entries.
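For reference, this is roughly the query I used to see where the rows accumulate. The table and column names are the ones visible in our installation; it assumes the default "tine20_" table prefix, so adjust if yours differs:

```sql
-- Count container_content rows per container and action to see
-- which container is responsible for the growth.
SELECT container_id, action, COUNT(*) AS row_count
FROM tine20_container_content
GROUP BY container_id, action
ORDER BY row_count DESC;
```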
TIA and regards,
Comment posted by pschuele on 8 Mar 2018 15:13
If you don't use ActiveSync or CardDAV, it should be no problem to remove the records with container_id = 1 from that table.
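A minimal sketch of such a cleanup, assuming PostgreSQL and the default "tine20_" table prefix - wrapping it in a transaction lets you inspect the result before committing:

```sql
BEGIN;

-- Remove only the 'update' entries for the "Internal Contacts" container;
-- 'create' and 'delete' entries are kept so clients can still detect
-- added/removed records.
DELETE FROM tine20_container_content
WHERE container_id = 1
  AND action = 'update';

-- Sanity check: see what remains for this container before committing.
SELECT action, COUNT(*)
FROM tine20_container_content
WHERE container_id = 1
GROUP BY action;

COMMIT;  -- or ROLLBACK if the numbers look wrong
```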
We'll discuss this internally and may provide a cleanup script that removes the obsolete records from the table.
That said, the AD sync should not update the records on every sync run - maybe something is wrong there, too? Perhaps you could check the logs of the user sync (and/or the contact history) to see which fields are updated in each sync.
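One way to check which contact fields actually churn is to look at the modification log. This is only a sketch: the table and column names (tine20_timemachine_modlog, modified_attribute, modification_time) are assumptions based on a default installation and may differ between Tine20 versions:

```sql
-- Count modifications per attribute over the last day to spot fields
-- that are rewritten by every AD sync run.
SELECT modified_attribute, COUNT(*) AS changes
FROM tine20_timemachine_modlog
WHERE modification_time > NOW() - INTERVAL '1 day'
GROUP BY modified_attribute
ORDER BY changes DESC;
```

If one attribute dominates, comparing its old and new values in that table should show what the sync keeps overwriting.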