
Proxy command improvements #20

Closed
luya-bot opened this issue Dec 18, 2017 · 8 comments
@luya-bot

This issue was originally reported by @nadar at luyadev/luya#1546.
Moved here by @nadar.


  • ask to skip large tables (more than 10k entries?)
  • insert data after each request instead of collecting it in an array and inserting everything after the sync (does the buffering slow things down?); see the sketch after this list
  • option to skip tables
  • option to skip files
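
For the second point, a minimal sketch of per-request inserts, assuming a Yii2 \yii\db\Connection; the function name, column list, and the iterable of response pages are illustrative, not the actual proxy API:

```php
<?php
use yii\db\Connection;

/**
 * Hypothetical sketch: insert every response page as soon as it arrives
 * instead of buffering the whole table in an array. Memory use is then
 * bounded by the page size rather than the table size.
 */
function syncTable(Connection $db, string $table, array $columns, iterable $pages): void
{
    foreach ($pages as $rows) {
        // batchInsert() is the standard Yii2 bulk insert.
        $db->createCommand()->batchInsert($table, $columns, $rows)->execute();
    }
}
```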
@boehsermoe
Member

Data sync failed while truncating a table that is referenced by foreign keys from other tables. During the sync the foreign key check should be disabled:
FOREIGN_KEY_CHECKS=0

Maybe change the unique check and SQL mode too?
UNIQUE_CHECKS=0
SQL_MODE='NO_AUTO_VALUE_ON_ZERO'
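
A minimal sketch of how those session variables could be set around the sync, assuming a Yii2 \yii\db\Connection in $db (the try/finally placement is illustrative):

```php
<?php
// All SET statements below are session-scoped, so only the sync
// connection is affected, not other clients.
$oldMode = $db->createCommand('SELECT @@SQL_MODE')->queryScalar();
$db->createCommand("SET FOREIGN_KEY_CHECKS=0, UNIQUE_CHECKS=0, SQL_MODE='NO_AUTO_VALUE_ON_ZERO'")->execute();
try {
    // ... truncate and re-import the synced tables here ...
} finally {
    // Restore the checks and the previous SQL mode even if the sync fails.
    $db->createCommand('SET FOREIGN_KEY_CHECKS=1, UNIQUE_CHECKS=1, SQL_MODE=' . $db->quoteValue($oldMode))->execute();
}
```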

@nadar
Member

nadar commented Apr 17, 2018

Could you give an example of what you mean?

@boehsermoe
Member

boehsermoe added a commit to boehsermoe/luya-module-admin that referenced this issue May 28, 2018
boehsermoe added a commit to boehsermoe/luya-module-admin that referenced this issue May 29, 2018
@boehsermoe
Member

What do you think about extending \luya\admin\proxy\ClientBuild::$optionTable with a negation sign "!" to exclude tables from the data sync?

@nadar
Member

nadar commented May 30, 2018

So an example could be --t=cms_*,!admin_*, which would include all cms_* tables but exclude all admin_* tables?

Maybe create a new issue; since we should keep PRs as small as possible, it is better to have multiple issues with multiple PRs.
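
A hypothetical sketch of that matching logic (illustrative only, not the merged implementation): split the option on commas, treat a leading "!" as an exclude pattern, and match table names with fnmatch():

```php
<?php
/**
 * Hypothetical sketch of the proposed --t syntax: shell-style wildcards,
 * where a leading "!" negates a pattern and an exclude always wins.
 * (fnmatch() availability varies by platform; a preg-based match would
 * be more portable.)
 */
function filterTables(array $tables, string $option): array
{
    $patterns = explode(',', $option); // e.g. "cms_*,!admin_*"

    return array_values(array_filter($tables, function ($table) use ($patterns) {
        $included = false;
        foreach ($patterns as $pattern) {
            if ($pattern !== '' && $pattern[0] === '!') {
                if (fnmatch(substr($pattern, 1), $table)) {
                    return false; // excluded, regardless of other patterns
                }
            } elseif (fnmatch($pattern, $table)) {
                $included = true;
            }
        }
        return $included;
    }));
}

// filterTables(['cms_nav', 'cms_block', 'admin_user'], 'cms_*,!admin_*')
// => ['cms_nav', 'cms_block']
```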

@nadar nadar closed this as completed in de9f569 May 30, 2018
@nadar
Member

nadar commented May 31, 2018

@boehsermoe

*****_table has 1668138 entries. Do you want continue table sync? (yes|no) [yes]:y

Works perfectly :-) But I did not skip the 1'668'138 rows, and the download works without reaching the memory limit! Thanks

@boehsermoe
Member

Nice to hear! And how long did the download take?

@nadar
Member

nadar commented May 31, 2018

still downloading :-)
