
perf: Reduce object generation and optimize performance. #95

Merged
merged 25 commits into honojs:main on Nov 27, 2023

Conversation

usualoma
Member

This is an optimization in a way that does not use the web standard Request and Response, so it may be out of line with the implementation policy in this repository, but this change made it 1.6 times faster.

Currently, only the simplest response where a string is returned is optimized, but if we work hard with additional PRs, we can also optimize the pattern where streaming is returned.
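As a rough illustration of the direction (a hypothetical sketch with made-up names like `lightResponse`; the PR's diff stores the raw pieces on fields such as `__body` / `__init`), the idea is to carry the body and init in a plain object and only construct a real `Response` when something actually needs one:

```javascript
// Hypothetical sketch: a lightweight stand-in for Response.
// `lightResponseCache` and `lightResponse` are illustrative names,
// not the PR's actual identifiers.
const lightResponseCache = Symbol('lightResponseCache')

function lightResponse(body, init = {}) {
  return {
    // Keep the raw pieces around so the server can write the string
    // body straight to the socket without allocating a Response.
    [lightResponseCache]: { body, init },
    // Fall back to a real Response lazily, only when needed
    // (e.g. when middleware inspects headers or reads the body).
    toRealResponse() {
      return new Response(body, init)
    },
  }
}

const res = lightResponse('{"hello":"world"}', {
  headers: { 'content-type': 'application/json' },
})
console.log(res[lightResponseCache].body) // → {"hello":"world"}
```

The fast path never touches the web-standard constructor, while the fallback keeps full compatibility for everything else.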

app

import { serve } from '../node-server/dist/index.js'
import { Hono } from './dist/index.js'

const app = new Hono()
app.get('/', (c) => c.json({ hello: 'world' }))

serve(app)

main branch

[benchmark screenshot]

perf/proxy-object branch

[benchmark screenshot]

@usualoma
Member Author

with 5d8da3b

[benchmark screenshot]

@yusukebe
Member

Wow, ultra fast!

I'll review this later.

@tangye1234, what are your thoughts on this? Are there any problems?

src/listener.ts Outdated
Comment on lines 117 to 119
get body() {
return this.getRequestCache().body
},


This would make a huge difference for devs migrating from Express to Hono in the next few weeks. 🚀

Would it be possible to also support a built-in middleware such as body-parser (or a how-to guide) via config? cc @yusukebe

Thanks!

Member Author


Hi @rafaell-lycan!
Thanks for your comment.

Do you think this will cause problems? If so, please let us know what specific issues you see.

> This would make a huge difference for devs migrating from Express to Hono in the next few weeks. 🚀

@usualoma
Member Author

Sorry, I added this after the review started. I would also like to include the following commits.

In hono-base.ts, there is an instanceof Response condition, so I made it behave more like a Response.

https://github.com/honojs/hono/blob/864fae668c7505e523cfe9d57e4e0778ddaa2fe5/src/hono-base.ts#L302

When a Response object is returned from fetch instead of a Promise, it now returns the response synchronously, without using await or then.

This produced the following results in my environment:

[benchmark screenshot]

@usualoma
Member Author

Incidentally, if we proceed with this implementation, I would like to merge the following two additional PRs after this.

@tangye1234
Contributor

After extensive testing, I found that the new Request / new Response are the main culprits killing performance.
Too many large objects are created by the Node.js Request / Response during the benchmark, which also explains why Koa is faster than Express.

The story is, in the past we deliberated over whether to use @remix-run/fetch or the Node.js native fetch. Node.js's undici seemed to be heavy, slower than @remix-run/fetch, and it creates more internal objects, which kills performance.

However, the performance differences were marginal. We deemed that a more conventional implementation would be neat and compatible with more cases. Hence we decided to stick with the native fetch. Now, we see a more BRILLIANT implementation taking place here.

Personally, I am not sure whether we should work around it with this PR. By this I mean: should we make extensive efforts to avoid using the native Request / Response constructors, aiming at incredibly high speeds for simple use cases like a hello-world benchmark? This could disrupt the simplicity of the project, as sometimes magic is just that - magic.

The original purpose behind this PR was to shed the conversion between Node.js and web standards, aiming to employ the simplest methods possible. Maybe yes, it's appropriate to leverage this concept, as hono node-server is designed to bridge this gap.

@yusukebe, the decision lies with you. I think it has reached a stage where there's not much coding work left; it's more about the direction you want to take.

@yusukebe
Member

@usualoma @tangye1234

Super thanks for your great work. I'll consider this deeply this weekend.

@usualoma
Member Author

Hi, @tangye1234!

Thanks for your thoughtful and detailed review. I fully agree with your comments. (I also learned a lot from the comments about the Node.js Response object submitted before that.)

While there is no doubt that this PR will achieve a very significant speedup on small benchmarks, it will compromise the simplicity of the implementation. Whether we should focus our efforts there or not is a matter of judgment.

For my part, even if this PR were to be merged, it is the effort of the node-server implementation up to this point that makes the result viable. If the bodyInit is not a string or ReadableStream, this PR’s Response will fall back to the Node.js Response object, which requires the existing implementation.

As an aside, this is similar to the relationship between Hono's RegExpRouter and TrieRouter. Where RegExpRouter, which is complex and huge in implementation but very fast, cannot handle some routing, TrieRouter, a straightforward implementation, can take it as a fallback destination.

Anyway, I have no objection to leaving it to the decision of @yusukebe.

Best regards!

@usualoma
Member Author

P.S.

If someone asks me whether this PR is a "cheat against benchmark scripts", I would answer "No" to that question.

This is because most requests expected to be accelerated in a website production environment will likely be GET requests that return either a string or a ReadableStream. Since this PR is an optimization for those (which is annoying considering GraphQL), it will also contribute to performance gains for the production environment (although the effect will be more minor as the response size increases).

RegExpRouter performs amazingly well in the benchmark scripts. Still, it is also faster in the production environment for "lots of handler registrations" and "lots of middleware registrations" than other routers. These characteristics are similar to this case.

In short, the content of this PR should produce results consistent with Hono's policy of being "speedy in benchmark scripts, but also fast in production environments."

@yusukebe
Member

Let's make Hono one of the fastest frameworks for Node.js!

For Hono, being fast and small are the most important things. So, let's work on making Hono the fastest framework for Node.js!

I've deeply considered the PR by @usualoma and comments by @tangye1234, and after running the app with the Node.js Adapter with this PR, I've decided to proceed with it.

The reason is simple: it makes Hono faster. My investigation revealed that the implementation isn't overly complex or "magical". The logic is simple: it makes operations that aren't immediately necessary lazy. This approach is already applied in some parts of Hono's core.
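The lazy pattern mentioned here can be sketched in a few lines (illustrative code, not the PR's actual implementation): a property is backed by a getter that computes its value on first access and then caches it, so requests that never touch it pay nothing.

```javascript
// Sketch of lazy, cached property computation (illustrative names).
function defineLazyProperty(obj, key, compute) {
  Object.defineProperty(obj, key, {
    configurable: true,
    get() {
      const value = compute()
      // Replace the getter with the computed value after first access,
      // so later reads are plain property lookups.
      Object.defineProperty(this, key, { value, writable: true })
      return value
    },
  })
  return obj
}

let computeCount = 0
const req = defineLazyProperty({}, 'headers', () => {
  computeCount++
  return { 'content-type': 'text/plain' }
})

console.log(computeCount) // → 0 (nothing computed yet)
req.headers
req.headers
console.log(computeCount) // → 1 (computed once, then cached)
```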

How much faster will Hono be with this PR?

I'm comparing it with other frameworks using the scripts:

  • Express
  • Fastify
  • Polka
  • Hono with Node.js Adapter v1.2.3
  • Hono with Node.js Adapter with this PR
  • Hono on Bun

The scripts you can use: https://gist.github.com/yusukebe/895f2ff55014c10f3e249a87e592ca49

One test is accessing /, returning plain text, and the other is returning a JSON response, as used in Fastify's Benchmarks.

According to these tests, Hono is either the fastest or the second fastest, behind Fastify. We need to document these results, so I will do that.

Next Actions

@usualoma

Could you run the benchmarks and look for more tuning points?

I'll review and comment later.

@tangye1234

Your contribution so far is appreciated. I hope you will continue to do so. Thank you.

src/listener.ts Outdated
outgoing: ServerResponse | Http2ServerResponse
) => {
try {
res = await res
Member


Can we make it not await when it's not necessary?
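One way to do that (a sketch with assumed names, not the code under review) is to branch on whether the handler returned a Promise at all, so the common synchronous case never schedules a microtask:

```javascript
// Sketch: only go async when we actually received a Promise.
// `await` on a plain value still defers to the microtask queue,
// so branching keeps the hot path fully synchronous.
function handleResult(res, send) {
  if (res instanceof Promise) {
    return res.then(send)
  }
  send(res)
}

const order = []
handleResult('sync value', (v) => order.push(v))
order.push('next statement')
console.log(order) // → [ 'sync value', 'next statement' ]
```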


__body: body,
__init: init,
__cache: [body, (init?.headers || {}) as Record<string, string>],
})
Member


Hono does not return a Content-Type header when returning a response with a method like c.text('hi').

We have to set Content-Type to text/plain;charset=UTF-8 if it's not set, either here or somewhere else.
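A minimal sketch of the fix being asked for (assuming a plain record with lower-cased header names; real code has to treat header names case-insensitively):

```javascript
// Sketch: supply the default Content-Type that the web-standard
// Response constructor would normally add for a string body.
// Simplified: assumes header keys are already lower-cased.
function withDefaultContentType(headers = {}) {
  if (headers['content-type'] === undefined) {
    return { ...headers, 'content-type': 'text/plain;charset=UTF-8' }
  }
  return headers
}

console.log(withDefaultContentType({}))
// → { 'content-type': 'text/plain;charset=UTF-8' }
console.log(withDefaultContentType({ 'content-type': 'application/json' }))
// → { 'content-type': 'application/json' }
```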

Member Author


Added a default content-type in 2df0dd8 and 2bed5ba.

Member


Thanks!

@yusukebe
Member

PS.

As we have discussed, being fast in benchmarks is not the same as being fast in the real world. But being fast in benchmarks is a good metric, and as @usualoma said, in many cases this improvement will improve performance in the real world.

@yusukebe
Member

The results on my machine:

[benchmark screenshot]

@yusukebe
Member

The results for /json. Fastify is the fastest.

[benchmark screenshot]

@usualoma
Member Author

Thank you. I will now merge the following and do some refactoring:
usualoma#1
usualoma#2

I will ask for a re-review here when I am done.

@yusukebe
Member

@usualoma

If you're interested, you might want to check out this repository: https://github.com/the-benchmarker/web-frameworks

This is another popular benchmark. Hono is tested on Bun, but the results show it as slow, perhaps because it is running on a single thread under Bun. No framework on Bun seems to perform well in this benchmark.

I think Hono has the potential to be the fastest among JavaScript frameworks.

@usualoma
Member Author

> Could you run the benchmarks and look for more tuning points?

The /json performance is so good with Fastify because the serializer is prepared in advance. This is indeed a good idea.

The npm package that builds the serializer can be found here; it would also speed up Hono's /json. (It's already fast, though, and it isn't easy to get stable benchmark numbers.)
https://github.com/fastify/fast-json-stringify

I have seen improved performance with the following code:

import fastJson from 'fast-json-stringify'
import { serve } from '../node-server/dist/index.mjs'
import { Hono } from './dist/index.js'

const stringify = fastJson({
    type: 'object',
    properties: {
        hello: {
            type: 'string'
        }
    }
})

const app = new Hono()
app.get('/', (c) => c.text('Hi'))
app.get('/json', (c) =>
    new Response(stringify({ hello: 'world' }), {
        headers: {
            'content-type': 'application/json; charset=UTF-8'
        }
    })
)

const port = 3001
console.log(port)

serve({
    fetch: app.fetch,
    port
})

Creating a custom serializer is not that difficult; for a straightforward object it can be implemented as follows, making it faster. However, supporting all kinds of schemas would be a big job.

import { serve } from '../node-server/dist/index.mjs'
import { Hono } from './dist/index.js'

// No `g` flag here: a global regex keeps lastIndex state between
// calls to `test()`, which would make the check unreliable.
const jsonEscapeRe = /["\\\n\r]/
const isJsonEscapeRequired = (str) => jsonEscapeRe.test(str)

const buildSerializer = (schema) => {
    const list = []
    const keys = []
    for (const [k, v] of Object.entries(schema.properties)) {
        if (v.type === 'string') {
            // After the first key, close the previous value and add the
            // field separator so multi-key objects serialize correctly.
            list.push(list.length === 0 ? `"${k}":"` : `","${k}":"`)
            keys.push(k)
        }
    }
    list[0] = '{' + list[0]
    list.push('"}')
    return (data) => {
        let res = list[0]
        for (let i = 0; i < keys.length; i++) {
            const value = data[keys[i]]
            res += (isJsonEscapeRequired(value) ? JSON.stringify(value).replace(/^"|"$/g, '') : value) + list[i + 1]
        }
        return res
    }
}
const serializer = buildSerializer(
    {
        type: 'object',
        properties: {
            hello: {
                type: 'string'
            }
        }
    }
)

const app = new Hono()
app.get('/', (c) => c.text('Hi'))
app.get('/json', (c) =>
    new Response(serializer({ hello: 'world' }), {
        headers: {
            'content-type': 'application/json; charset=UTF-8'
        }
    })
)

const port = 3001
console.log(port)

serve({
    fetch: app.fetch,
    port
})

Considering the possibility of introducing this to Hono, it would be nice if it could be associated with jsonT or replace it somehow. However, it sounds like a challenging task.

@usualoma
Member Author

My work is done.

Would you please review this?
@yusukebe @tangye1234

Although only some comments were added, 661a123 moved the Request and Response definitions into independent files for better visibility.
The function names are descriptive throughout, so I don't think we will have trouble understanding them.

As for the tests, I have added some for the Request and Response fallback pattern. The existing tests will cover the others.

Member

@yusukebe yusukebe left a comment


@usualoma

LGTM! Awesome work!

__body: body,
__init: init,
__cache: [body, (init?.headers || {}) as Record<string, string>],
})
Member


Thanks!

const app = new Hono()

app.get('/json-blob', async () => {
return new Response(new Blob([JSON.stringify({ foo: 'blob' })]), {
Member


Nice tests!

@yusukebe
Member

@tangye1234

Sorry to ping you again, but could you review this? If I don't hear back from you soon, I'll go ahead and merge it.

@tangye1234
Contributor

tangye1234 commented Nov 21, 2023

I am testing an implementation which fully shares the pros of this PR but is cleaner, while also satisfying these cases:

// response name not changed
Response.name === 'Response'

// response prototype chain not changed
new Response() instanceof (await fetch('https://www.google.com')).constructor

// can only be called with the new operator
expect(() => Response()).toThrow()

// support Response static methods
typeof Response.error === 'function'
typeof Response.json === 'function'
typeof Response.redirect === 'function'

// support other classes extending Response
class NextResponse extends Response {}

// support ReadableStream / URLSearchParams / string / ArrayBuffer quick proxy (without constructing a real Response)

But I don't know when it could be finished.
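For illustration, these checks can largely be satisfied without constructing a real Response per request, e.g. by grafting a lightweight class onto the global Response's prototype chain (an assumed approach with illustrative names, not the exact code in this PR or @tangye1234's branch):

```javascript
const GlobalResponse = Response

// Lightweight stand-in that never runs the real Response constructor.
class LightResponse {
  #body
  #init
  constructor(body, init) {
    this.#body = body
    this.#init = init
  }
  // ...lazy getters that fall back to a real Response would go here...
}

// Preserve the public identity: Response.name === 'Response'.
Object.defineProperty(LightResponse, 'name', { value: 'Response' })
// Inherit static methods (Response.json / redirect / error).
Object.setPrototypeOf(LightResponse, GlobalResponse)
// Make instances pass `instanceof Response` and keep the prototype
// chain intact for subclasses such as a NextResponse-style class.
Object.setPrototypeOf(LightResponse.prototype, GlobalResponse.prototype)

console.log(LightResponse.name) // → Response
console.log(new LightResponse('hi') instanceof GlobalResponse) // → true
console.log(typeof LightResponse.redirect) // → function
```

Being a class, it also throws when called without new, which covers the `expect(() => Response()).toThrow()` case.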

@usualoma
Member Author

@tangye1234

Thank you for your important points! You are indeed correct.
In 04162de, I have ensured that the tests you have shown me succeed. How about this?
In my environment, adding this change did not degrade performance.

cc: @yusukebe

@tangye1234
Contributor

tangye1234 commented Nov 21, 2023

I also did the same tests, based entirely on your branch.

https://github.com/tangye1234/honojs-node-server/tree/perf/proxy-object

I did not open a PR here, because I respect @usualoma's performant and beautiful work.

In my code snippets, there are 3 important things which should be clear:

  1. Use symbol and private (#) members or methods as much as possible; this will not leak internal members and will never conflict with outside environments.
  2. Polyfill Response in globals, not in the response file itself; this will make all polyfills clearer and more controllable.
  3. Use a wrapper Request class to construct the input rather than a plain object. It is a similar approach to the Response, so the prototype chain can be easily maintained. The remaining things can then be easily done, like parsing the body or cookies.

Additionally, I did some work to support stream / ArrayBuffer bodies and also support getting the native Node.js response state for a more direct proxy. And I have reserved the static methods for Response (json / redirect / error).

The job may not be completely over; I tested this implementation, and it is slightly slower than @usualoma's. I guess it is due to the more complex code.

Speed tests:
@usualoma's: 16.5 MB/s
Mine: 15.75 MB/s

@yusukebe
Member

Thanks, @tangye1234,

In this case, I prefer @usualoma's implementation (I think it's because I am familiar with reading @usualoma's code). Therefore, I'd like to proceed with this PR.

@usualoma
Member Author

usualoma commented Nov 21, 2023

@tangye1234
Thank you!

I have confirmed your branch.
https://github.com/tangye1234/honojs-node-server/tree/perf/proxy-object

  1. Use symbol and private (#) members or methods as much as possible; this will not leak internal members and will never conflict with outside environments.

Yes, indeed, my branch was very messy in its implementation in this regard. It was fixed in e0c57d2.

It would also be reasonable to override global.Request via globals.ts. That was fixed in 5e28a91. However, there is a jest issue: global.Response = Response does not take effect in the jest environment, so on the current https://github.com/tangye1234/honojs-node-server/tree/perf/proxy-object, the tests run without Response being overridden.

I think getResponseInternalBody is an exciting implementation; it would help speed things up and reduce conditional branching in the following part, so I find it very attractive.
https://github.com/honojs/node-server/blob/main/src/listener.ts#L78-L81

However, since it depends on the internal implementation of Node.js, it is debatable whether it should be introduced into node-server, so it is better to make it a separate PR. A separate PR is also better because it will make clear that it is @tangye1234's work.

As for me, I would like to prioritize performance here. The content of this PR is the fastest at the moment, as indicated by #95 (comment), so I would like to merge the current HEAD. After that, improving with additional PRs as needed would be good.

@tangye1234
Contributor

There is an important case you may have missed.

const app = new Hono()

app.get('/', c => {
  const isRealRequest = c.req.raw instanceof Request
  // should return true
  return c.json(isRealRequest)
})

In the listener, the incoming message should be converted to a Request that is an instance of the real Node.js Request.
This Request is not a polyfill; it is just a wrapper constructed from the IncomingMessage.
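A sketch of that wrapper idea (illustrative names, using a stub in place of a real IncomingMessage): create the object off Request.prototype so instanceof Request holds, and expose fields lazily from the underlying message.

```javascript
// Sketch: wrap a Node.js IncomingMessage-like object so the result
// passes `instanceof Request` without running the Request constructor.
function wrapIncoming(incoming) {
  const req = Object.create(Request.prototype)
  // Own getters shadow the prototype's accessors, so the real
  // Request internals are never touched.
  Object.defineProperty(req, 'method', { get: () => incoming.method })
  Object.defineProperty(req, 'url', {
    get: () => `http://${incoming.headers.host ?? 'localhost'}${incoming.url}`,
  })
  return req
}

// Stub standing in for a real IncomingMessage.
const incoming = { method: 'GET', url: '/', headers: { host: 'example.com' } }
const req = wrapIncoming(incoming)
console.log(req instanceof Request) // → true
console.log(req.method) // → GET
console.log(req.url) // → http://example.com/
```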

@usualoma
Member Author

@tangye1234

Thank you again.
I had not forgotten about that point, but I could not think of a situation where a check like c.req.raw instanceof Request would be needed, so I thought that if such a use case came up, it could be handled in a later minor version. Nevertheless, as you point out, if we can handle it, it is better to handle it from the beginning.

It was fixed in a9c3cd2.

@yusukebe
Member

@usualoma

Sorry for the delay! Is this ready to be merged? Okay??

I'll release v1.3.0, which includes this PR, after merging.

@usualoma
Member Author

@yusukebe
Welcome back!
Yes, I believe we can merge the current branch's HEAD.

@yusukebe
Member

Okay! Let's gooooooooooo!!

@yusukebe yusukebe merged commit 68df0e1 into honojs:main Nov 27, 2023
3 checks passed
@usualoma usualoma deleted the perf/proxy-object branch November 27, 2023 06:48