fix: D1 execute and backup commands improvements #2107
Conversation
celso
commented
Oct 31, 2022
- Better and faster handling when importing big SQL files using execute --file
- Increased visibility during imports, sends output with each batch API call
- Backups are now downloaded to the directory where wrangler was initiated from
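A minimal sketch of the "output with each batch API call" behavior described above. `executeBatch` and `importFile` are hypothetical stand-ins, not wrangler's actual functions; the real implementation calls the D1 /execute endpoint.

```typescript
type QueryResult = { query?: string; success?: boolean };

// Stand-in for a real D1 /execute API call over one batch of statements.
async function executeBatch(sql: string[]): Promise<QueryResult[]> {
  return sql.map((query) => ({ query, success: true }));
}

// Import statements in batches, reporting progress after every API call
// instead of only once at the end. Returns the number of API calls made.
async function importFile(statements: string[], batchSize: number): Promise<number> {
  let calls = 0;
  for (let i = 0; i < statements.length; i += batchSize) {
    const batch = statements.slice(i, i + batchSize);
    await executeBatch(batch);
    calls++;
    console.log(`Executed ${i + batch.length}/${statements.length} statements`);
  }
  return calls;
}
```

Emitting a line per batch gives the user visible progress on large imports, where a single end-of-run message could leave the CLI silent for minutes.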
🦋 Changeset detected. Latest commit: f31091d. The changes in this PR will be included in the next version bump.
A wrangler prerelease is available for testing. You can install this latest build in your project with:

    npm install --save-dev https://prerelease-registry.developers.workers.dev/runs/3363610522/npm-package-wrangler-2107

You can reference the automatically updated head of this PR with:

    npm install --save-dev https://prerelease-registry.developers.workers.dev/prs/2107/npm-package-wrangler-2107

Or you can use:

    npx https://prerelease-registry.developers.workers.dev/runs/3363610522/npm-package-wrangler-2107 dev path/to/script.js

Additional artifacts:

    npm install https://prerelease-registry.developers.workers.dev/runs/3363610522/npm-package-cloudflare-pages-shared-2107
Codecov Report

    @@            Coverage Diff             @@
    ##             main    #2107      +/-   ##
    ==========================================
    - Coverage   73.11%   73.05%    -0.06%
    ==========================================
      Files         127      127
      Lines        8610     8618        +8
      Branches     2264     2265        +1
    ==========================================
    + Hits         6295     6296        +1
    - Misses       2315     2322        +7
    @@ -52,7 +51,7 @@ type QueryResult = {
      query?: string;
    };
    // Max number of bytes to send in a single /execute call
    -const QUERY_LIMIT = 1_000_000; // 1MB
    +const QUERY_LIMIT = 10_000;
What is the benefit in sending 10kb over 1MB?
QUERY_LIMIT is actually the number of SQL statements going into a single D1 API request inside a transaction, not MB. 10,000 is a reasonable number that's efficient/fast enough when importing big SQL files, but also doesn't kill D1's DO from resource exhaustion.
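To make the semantics concrete, here is a hedged sketch of QUERY_LIMIT as a statement count rather than a byte size. `batchSplit` is a hypothetical helper for illustration; the actual chunking logic in wrangler may differ.

```typescript
// Max number of SQL statements (not bytes) per D1 /execute call.
const QUERY_LIMIT = 10_000;

// Split a list of SQL statements into batches of at most `limit` statements.
function batchSplit(statements: string[], limit: number = QUERY_LIMIT): string[][] {
  const batches: string[][] = [];
  for (let i = 0; i < statements.length; i += limit) {
    batches.push(statements.slice(i, i + limit));
  }
  return batches;
}

// A 25,000-statement import goes out as three API calls:
const stmts = Array.from({ length: 25_000 }, (_, i) => `INSERT INTO t VALUES (${i});`);
const batches = batchSplit(stmts);
console.log(batches.map((b) => b.length)); // → lengths [10000, 10000, 5000]
```

Capping the statement count per transaction keeps each request fast while bounding the work a single D1 Durable Object must absorb at once.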