
SegmentationFault at 0 when benchmarking graphql #180

Closed
timsuchanek opened this issue May 26, 2022 · 6 comments
Labels
bug Something isn't working

Comments

@timsuchanek

timsuchanek commented May 26, 2022

Hi, first of all, great work! Excited to see these insane performance gains, just tested out the HTTP server.
I have a very simple test setup, which I'm benchmarking with oha. When I send queries for about 1s, bun crashes with a SegmentationFault.
This is my setup:

http.ts

import { parse } from 'graphql'

export default {
  port: 3000,
  async fetch(request: Request) {
    const req: any = await request.json()
    return new Response(JSON.stringify(parse(req.query)))
  },
}

package.json

{ "dependencies": { "graphql": "^16.5.0" } }

Benchmark command that succeeds:

oha -z 10ms http://localhost:3000 -T 'application/json' -d '{"query": "{ this is a query }"}'

Benchmark command that fails:

oha -z 1s http://localhost:3000 -T 'application/json' -d '{"query": "{ this is a query }"}'

Error output:

➜  bun-test bun http.ts

SegmentationFault at 0


–––– bun meta ––––
Bun v0.0.83 macOS Silicon 21.4.0
AutoCommand: public_folder 
Elapsed: 1920ms | User: 475ms | Sys: 357ms
RSS: 112.02MB | Peak: 112.02MB | Commit: 0.27GB | Faults: 0
–––– bun meta ––––

Ask for #help in https://bun.sh/discord or go to https://bun.sh/issues
@Jarred-Sumner
Collaborator

Thanks for the report. Looking into it now

@Jarred-Sumner
Collaborator

Jarred-Sumner commented May 26, 2022

As a temporary workaround, this works:

import { parse } from "graphql";

export default {
  port: 3000,
  async fetch(request: Request) {
    const req: any = await request.blob();
    return new Response(JSON.stringify(parse((await req.json()).query)));
  },
};

On my M1, it reports 44,657 requests/second

(screenshot: oha benchmark results)

Note that json() is a non-standard addition to Blob in bun. It makes Blob more consistent with Response & Request.
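
For illustration, a minimal sketch (not from the original comment) of the non-standard Blob#json() next to the standard Blob#text() route, assuming a recent Bun runtime:

// blob-json.ts — hypothetical example; run with `bun blob-json.ts`.
// Both lines produce the same parsed object; json() is Bun's convenience addition.
const blob = new Blob(['{"query": "{ this is a query }"}']);
const viaJson = await blob.json();             // Bun-specific Blob#json()
const viaText = JSON.parse(await blob.text()); // standard Blob#text() + JSON.parse
console.log(viaJson.query === viaText.query);  // true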

@Jarred-Sumner
Collaborator

Slightly faster to use text(), it seems:

❯ oha -z 1s http://localhost:3000 -T 'application/json' -d '{"query": "{ this is a query }"}'
Summary:
  Success rate:	1.0000
  Total:	1.0008 secs
  Slowest:	0.0064 secs
  Fastest:	0.0001 secs
  Average:	0.0010 secs
  Requests/sec:	48298.9863

  Total data:	37.71 MiB
  Size/request:	818 B
  Size/sec:	37.68 MiB

Response time histogram:
  0.000 [303]   |
  0.001 [5426]  |■■■■■■■■
  0.001 [8870]  |■■■■■■■■■■■■■■
  0.001 [19702] |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
  0.001 [10148] |■■■■■■■■■■■■■■■■
  0.002 [2307]  |■■■
  0.002 [978]   |■
  0.002 [285]   |
  0.003 [141]   |
  0.003 [50]    |
  0.003 [129]   |

Latency distribution:
  10% in 0.0006 secs
  25% in 0.0008 secs
  50% in 0.0010 secs
  75% in 0.0012 secs
  90% in 0.0014 secs
  95% in 0.0016 secs
  99% in 0.0021 secs

Details (average, fastest, slowest):
  DNS+dialup:	0.0016 secs, 0.0009 secs, 0.0019 secs
  DNS-lookup:	0.0000 secs, 0.0000 secs, 0.0001 secs

Status code distribution:
  [200] 48339 responses

import { parse } from "graphql";

export default {
  port: 3000,
  async fetch(request: Request) {
    const req: any = await request.text();
    return new Response(JSON.stringify(parse(JSON.parse(req).query)));
  },
};

@Jarred-Sumner
Collaborator

There are two bugs:

  1. Incorrect logic when freeing an aborted request that has a pending request body
  2. There are cases where the promise does not hold a strong reference to the JS objects correctly, so those objects can be freed by the time the promise resolves
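
For context, here is a minimal sketch (my own, not from the thread) of how the abort path in bug 1 can be hit without oha: with the http.ts server from this issue listening on port 3000, cancel an in-flight POST via an AbortController while the server is still awaiting request.json(). Any runtime whose fetch supports abort signals (e.g. Node 18+ or a recent Bun) should work as the client.

// client.ts — hypothetical reproduction sketch.
const controller = new AbortController();

// Abort almost immediately, so the server may still have a pending body read.
setTimeout(() => controller.abort(), 1);

try {
  await fetch("http://localhost:3000", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: '{"query": "{ this is a query }"}',
    signal: controller.signal,
  });
} catch {
  // An AbortError is expected here; the interesting part is how the server
  // frees the aborted request while its body promise is still pending.
}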

@Jarred-Sumner
Collaborator

8% faster since yesterday

(screenshot: updated benchmark results)

@sno2 added the bug (Something isn't working) and segfault labels on Jul 29, 2022
@Jarred-Sumner
Collaborator

Should be fixed now!
