chore: greatly speed up migrator and lower memory usage #2874

Conversation

lincolnthalles
Contributor

  • remove all Store indirections
  • query database directly with prepared statements
  • add progress feedback every 500 updates

At some point, these migrations should probably be split per database driver, but in their current state the queries they use are compatible across SQLite, Postgres, and MySQL.
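For illustration, here is a minimal Go sketch of the approach the bullet points describe: read the rows once, then rewrite the paths through a single prepared UPDATE inside one transaction, logging progress every 500 rows. The table and column names (resource, internal_path), the path rewrite itself, and the package name are assumptions for the example, not the actual code in store/migrator.go, and the ? placeholder is SQLite/MySQL style.

package example

import (
	"database/sql"
	"log"
	"path/filepath"
)

// migrateLocalResourcePaths rewrites stored resource paths in one transaction,
// using a single prepared statement instead of going through the Store layer.
func migrateLocalResourcePaths(db *sql.DB, basePath string) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	defer tx.Rollback() // no-op after a successful Commit

	rows, err := tx.Query("SELECT id, internal_path FROM resource WHERE internal_path <> ''")
	if err != nil {
		return err
	}
	defer rows.Close()

	type res struct {
		id   int64
		path string
	}
	var items []res
	for rows.Next() {
		var r res
		if err := rows.Scan(&r.id, &r.path); err != nil {
			return err
		}
		items = append(items, r)
	}
	if err := rows.Err(); err != nil {
		return err
	}

	// Prepared once, executed per row, all inside the same transaction,
	// so there is exactly one commit at the end.
	stmt, err := tx.Prepare("UPDATE resource SET internal_path = ? WHERE id = ?")
	if err != nil {
		return err
	}
	defer stmt.Close()

	for i, r := range items {
		newPath := filepath.Join(basePath, filepath.Base(r.path)) // stand-in rewrite
		if _, err := stmt.Exec(newPath, r.id); err != nil {
			return err
		}
		if (i+1)%500 == 0 {
			log.Printf("migrated %d/%d local resource paths", i+1, len(items))
		}
	}
	return tx.Commit()
}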

This most likely closes #2850.


The resource path migrator takes more than 25 minutes to complete with 300k resources on the current v0.19.0. Memory usage peaked at 280 MB, and while CPU usage is not particularly high, it never goes down. Because each change is committed separately, the total data written to disk is also excessive, going over 2.5 GB. That's probably why some VPS and low-end systems have a hard time upgrading to v0.19.0.
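For contrast, the write amplification comes from a pattern like the sketch below (not the actual v0.19.0 code; the table and column names are the same assumptions as above): one transaction, and therefore one synced commit, per row, so 300k updates force 300k journal/WAL flushes.

package example

import "database/sql"

// migratePerRowCommit is an anti-pattern sketch: one transaction and one
// commit per resource. Every commit flushes journal/WAL pages, so the bytes
// written balloon far past the size of the database itself.
func migratePerRowCommit(db *sql.DB, ids []int64, newPaths []string) error {
	for i, id := range ids {
		tx, err := db.Begin()
		if err != nil {
			return err
		}
		if _, err := tx.Exec("UPDATE resource SET internal_path = ? WHERE id = ?", newPaths[i], id); err != nil {
			tx.Rollback()
			return err
		}
		if err := tx.Commit(); err != nil { // ~300k separate commits
			return err
		}
	}
	return nil
}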

memos-stable-1 | 2024-01-31T08:09:14.919Z INFO store/migrator.go:54 migrated 300000 local resource paths in 28m31.269576206s

At the time of writing, it's still stuck with this message, probably still committing data, and the server is not yet available.


This PR brings the migration time with the same dataset down to under 4 seconds on my machine, and memory usage never goes beyond 125 MB. The total data written is only a little over the database size (113 MB).

memos-dev-1 | 2024-01-31T07:40:45.866Z INFO store/migrator.go:108 Migrated 300000 local resource paths in 3.860779589s

Both containers in these captures were seeded with the same data and started at the same time.

If you'd like, I can add the docker-compose file and the Python seeding script that helped with this testing.
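The seeding script itself isn't attached in this thread; purely as an illustration of the kind of seeding used for the test, here is a Go sketch that inserts n fake local resources in one transaction. The column list is an assumption, not the real memos schema, and the author's actual script was written in Python.

package example

import (
	"database/sql"
	"fmt"
)

// seedResources inserts n fake local resources so the migrator has a large
// dataset to work through.
func seedResources(db *sql.DB, n int) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	defer tx.Rollback()

	stmt, err := tx.Prepare(
		"INSERT INTO resource (creator_id, filename, internal_path, type, size) VALUES (?, ?, ?, ?, ?)")
	if err != nil {
		return err
	}
	defer stmt.Close()

	for i := 0; i < n; i++ {
		name := fmt.Sprintf("seed-%06d.png", i)
		if _, err := stmt.Exec(1, name, "/var/opt/memos/assets/"+name, "image/png", 1024); err != nil {
			return err
		}
	}
	return tx.Commit()
}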

Collaborator

@boojack boojack left a comment


LGTM. I think we can remove these migrators written in Go in the next release.

@boojack boojack merged commit 279cba0 into usememos:main Jan 31, 2024
5 checks passed
@eallion
Contributor

eallion commented Jan 31, 2024

still

@lincolnthalles
Contributor Author

still

Try using docker pull neosmemo/memos:0.19; that currently gets you v0.19.1, which removed these migrations entirely. The stable tag is still on v0.19.0.

@gitkeniwo

still

Try using docker pull neosmemo/memos:0.19; that currently gets you v0.19.1, which removed these migrations entirely. The stable tag is still on v0.19.0.

Same here. I pulled the latest v0.19. After logging in, the latest memos seem to load normally, but once I clicked "click to fetch more" the page crashed. After refreshing, I clicked "fetch more" again and it crashed again. The same applies to the explore page.
The resource page seems fine, though.

@eallion
Contributor

eallion commented Jan 31, 2024

Try using docker pull neosmemo/memos:0.19; that currently gets you v0.19.1, which removed these migrations entirely. The stable tag is still on v0.19.0.

version: "3.0"
services:
  memos:
    image: ghcr.io/usememos/memos:0.19.1
  ... ...

I have tried 0.19.1 and tested it; the bugs are still terrible.

@Domonlee

Domonlee commented Feb 1, 2024

still

Try using docker pull neosmemo/memos:0.19; that currently gets you v0.19.1, which removed these migrations entirely. The stable tag is still on v0.19.0.

Works fine for me! Thanks a lot.
