Undici performance vs axios #2622
Have you got something to replicate this setup? How many HTTP calls are you making?
Nope. I was not able to create a deterministic environment locally. I will try again.
We average 25–30 req/s.
No need, I tracked it down. What is happening is that undici is establishing a new connection for every POST, so we are paying the TLS price each time. This behavior is correct but strict: in certain cases the connection can be kept open. This was introduced by @ronag in #76, and it made things too strict. @ronag wdyt?
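The per-POST connection cost described above can be illustrated with Node's built-in `http` module. This is a minimal sketch of the same socket-reuse effect, not undici's internals; the request body and socket counts are for illustration only.

```js
// Sketch: count distinct server-side sockets with and without keep-alive.
import { createServer, request as httpRequest, Agent } from 'node:http';

const sockets = new Set();
const server = createServer((req, res) => {
  sockets.add(req.socket); // each new TCP connection appears as a new socket
  res.end('ok');
});
await new Promise((resolve) => server.listen(0, resolve));
const { port } = server.address();

function post(agent) {
  return new Promise((resolve, reject) => {
    const req = httpRequest({ port, method: 'POST', agent }, (res) => {
      res.resume();
      res.on('end', resolve);
    });
    req.on('error', reject);
    req.end(JSON.stringify({ hello: 'world' }));
  });
}

// One keep-alive agent: all five requests share a single connection.
const keepAliveAgent = new Agent({ keepAlive: true, maxSockets: 1 });
for (let i = 0; i < 5; i++) await post(keepAliveAgent);
const reused = sockets.size;

// A fresh non-keep-alive agent per request: a new connection every time,
// which is where the repeated TLS handshake cost would come from over HTTPS.
sockets.clear();
for (let i = 0; i < 5; i++) await post(new Agent({ keepAlive: false }));
const fresh = sockets.size;

console.log({ reused, fresh }); // { reused: 1, fresh: 5 }

keepAliveAgent.destroy();
server.close();
```

Over plain TCP the difference is just connection setup; over TLS each fresh socket also pays a full handshake, which matches the latency gap reported in this issue.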
Yes, the implementation is too strict. Though it will require some thinking to make sure we don't introduce bugs by loosening it.
I think the simplest way is to make `pipelining` equal to 1 a special case.
What about adding a new parameter to the pool configuration to let the user choose this behavior? |
@mcollina Actually I'm unsure if you are right. I don't see why it wouldn't re-use the connection, once the whole request/response cycle has completed.
Application headers:

```js
"Content-Type": "application/x-thrift",
"Accept": "application/x-thrift",
"Authorization": internalAuthToken,
"User-Agent": consumerKey,
```
@ronag, you are right, the socket is correctly reused:

```js
import { createServer } from 'node:http';
import { request } from './index.js';
import { setTimeout as sleep } from 'node:timers/promises';
import { strictEqual } from 'node:assert';

let lastSocket = null;

const server = createServer((req, res) => {
  console.log('request', req.url, req.method);
  if (lastSocket) {
    // Report whether this request arrived on the same socket as the last one
    res.end(JSON.stringify({ same: lastSocket === req.socket }));
  } else {
    res.end(JSON.stringify({ noSocket: true }));
  }
  lastSocket = req.socket;
});

await new Promise((resolve) => server.listen(3000, resolve));

{
  const { body } = await request('http://localhost:3000', {
    method: 'POST',
    body: JSON.stringify({ hello: 'world' })
  });
  console.log(await body.json());
}

await sleep(1000);

{
  const { body } = await request('http://localhost:3000', {
    method: 'POST',
    body: JSON.stringify({ hello: 'world' })
  });
  const { same } = await body.json();
  strictEqual(same, true);
  console.log('same socket');
}

server.close();
```

So I guess that whatever combination of inputs you are using @Gigioliva is causing undici to create a new socket every time. Can you create a small reproduction of what you are doing?
...and the server.
I will try 😅 The system is quite complex to reproduce... I will start from a simple setup and add new steps until the problem also occurs locally.
Hi @mcollina, thank you for your time. After some investigation, we found the problem: the default value of … Sorry for the issue. I think we can close it 🙏🏼
I would like to migrate my client from axios to undici. However, on some occasions undici is slower than axios. In the example in the screenshot, an API call with axios has ~5ms latency (P90). When I use undici instead, the latency increases to ~40ms.

Analyzing the program with Google Cloud Profiler, I noticed that most of the time is spent creating the TLS connection, even though I use a `Pool`. I would expect that after an initial phase, creating the connection should no longer be a problem. I also tried using the `connections` and `pipelining` parameters of the `Pool` object, but with no improvement 😞

Can you help me understand how to optimize these calls?
Reproducible By
Providing the full code is complex. High level:
Expected Behavior
Latency using undici is more or less the same as it was with axios.
Logs & Screenshots
Environment
node:20.9-bookworm-slim