SegmentationFault at 0 when benchmarking graphql #180
Labels: bug (Something isn't working)
Comments
Thanks for the report. Looking into it now.
As a temporary workaround, this works:

```ts
import { parse } from "graphql";

export default {
  port: 3000,
  async fetch(request: Request) {
    const req: any = await request.blob();
    return new Response(JSON.stringify(parse((await req.json()).query)));
  },
};
```

On my M1, it reports 44,657 requests/second. Note that
Slightly faster to use `request.text()` and `JSON.parse`:

```ts
import { parse } from "graphql";

export default {
  port: 3000,
  async fetch(request: Request) {
    const req: any = await request.text();
    return new Response(JSON.stringify(parse(JSON.parse(req).query)));
  },
};
```
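For completeness, here is a sketch of a client request that exercises either of the workaround servers above. The localhost URL and the `{ hello }` query string are assumptions for illustration, not taken from the issue; any syntactically valid GraphQL document would do.

```ts
// Hypothetical client for the workaround servers above (not from the issue).
// It POSTs a JSON body of the shape { query: "<graphql document>" }, which is
// what both snippets read out of the request body.
const res = await fetch("http://localhost:3000/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query: "{ hello }" }),
});

// The server replies with the parsed GraphQL AST serialized as JSON.
console.log(await res.json());
```

Running this (e.g. with bun) against one of the servers should print the AST of the parsed query.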
There are two bugs:
Should be fixed now!
Hi, first of all, great work! I'm excited to see these insane performance gains and just tested out the HTTP server.
I have a very simple test setup, which I'm benchmarking with oha. When I sent queries for about 1s, bun crashed with a SegmentationFault.
This is my setup (see the sketch after this list):
http.ts
package.json
Benchmark command that succeeds:
Benchmark command that fails:
Error output:
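The reporter's actual http.ts, package.json, benchmark commands, and error output are not preserved above. Purely as a hypothetical reconstruction, a minimal setup of this kind, consistent with the workaround snippets earlier in the thread, would read the JSON request body and feed the query to graphql's parse. The workarounds above replace request.json() with request.blob() or request.text(), which suggests (but does not confirm) that the original code called request.json() directly.

```ts
// Hypothetical http.ts, NOT the reporter's actual file: a minimal Bun HTTP
// server that parses an incoming GraphQL query, inferred from the workaround
// snippets earlier in the thread.
import { parse } from "graphql";

export default {
  port: 3000,
  async fetch(request: Request) {
    // Presumed original form: read the JSON body directly. The workarounds
    // above avoid this call by going through blob() or text() instead.
    const body: any = await request.json();
    return new Response(JSON.stringify(parse(body.query)));
  },
};
```

A matching package.json would mainly declare graphql as a dependency; the exact versions the reporter used are not preserved in this thread.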