
fix: bug: memory leak when using bentoml>=1.2 #4775

Merged
merged 7 commits into bentoml:main
Jun 12, 2024

Conversation

frostming
Contributor

@frostming frostming commented Jun 4, 2024

Fixes #4760

Signed-off-by: Frost Ming me@frostming.com

What does this PR address?

Change the per-request temp directory so that it is created on demand (lazily) instead of eagerly for every request.
/cc @Zheaoli @gusghrlrl101
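
The on-demand behavior can be pictured as a lazily created directory on the context object. A minimal sketch, assuming a simplified stand-in class; the names and structure are illustrative, not BentoML's actual implementation:

```python
import tempfile
from typing import Optional


class RequestContext:
    """Illustrative stand-in for the request context; not BentoML's real class."""

    def __init__(self) -> None:
        self._temp_dir: Optional[tempfile.TemporaryDirectory] = None

    @property
    def temp_dir(self) -> str:
        # Create the per-request directory only when a handler first asks for it,
        # instead of eagerly for every request.
        if self._temp_dir is None:
            self._temp_dir = tempfile.TemporaryDirectory(prefix="bentoml-request-")
        return self._temp_dir.name

    def close(self) -> None:
        # Clean up only if the directory was actually created, so requests that
        # never touch temp_dir pay no filesystem cost and leak nothing.
        if self._temp_dir is not None:
            self._temp_dir.cleanup()
            self._temp_dir = None
```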

@frostming frostming requested a review from a team as a code owner June 4, 2024 00:58
@frostming frostming requested review from ssheng and removed request for a team June 4, 2024 00:58
@frostming
Contributor Author

Do you think we need to reuse the temp directories across different requests, i.e. use a temp dir pool? Or is that overkill?

@frostming frostming closed this Jun 4, 2024
@frostming frostming reopened this Jun 4, 2024
@Zheaoli

Zheaoli commented Jun 4, 2024

Do you think we need to reuse the temp directories across different requests, i.e. use a temp dir pool? Or is that overkill?

I don't think that's recommended.

I'd prefer to create a tmp directory only when it's actually needed, e.g. when the request is file-style. Otherwise, it would be wasteful to create a tmp directory for every request.

@Zheaoli

Zheaoli commented Jun 4, 2024

I think maybe we don't need a directory per request? We could group the directories by thread/process id, with all the request files in the same directory.

@frostming
Contributor Author

I'd prefer to create a tmp directory only when it's actually needed, e.g. when the request is file-style. Otherwise, it would be wasteful to create a tmp directory for every request.

Yes, that is exactly what this PR does. The creation is lazy.

@Zheaoli

Zheaoli commented Jun 4, 2024

Yes, that is exactly what this PR does. The creation is lazy.

Yep, I got it. But I have another idea here:

I think maybe we don't need a directory per request? We could group the directories by thread/process id, with all the request files in the same directory.

@frostming
Contributor Author

frostming commented Jun 4, 2024

I think maybe we don't need a directory per request? We could group the directories by thread/process id, with all the request files in the same directory.

But then we would also need to clean it periodically, or its size will keep growing. The purpose of the temp dir is to isolate the storage path for each request, and requests are handled in coroutines, so keying on a process/thread id wouldn't work: the same process and thread may be handling more than one request concurrently. For example, the per-request dir makes the following work:

from pathlib import Path
import bentoml

@bentoml.api
def predict(self, input: int, ctx: bentoml.Context) -> Path:
    output_path = Path(ctx.temp_dir, "output.txt")  # unique per request, so it can't be overwritten by another request
    with open(output_path, "wb") as f:
        f.write(some_bytes)  # some_bytes: whatever payload the handler produced
    return output_path
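
A minimal sketch of the collision described above, assuming a hypothetical per-thread directory shared by coroutine-handled requests; all names are illustrative, none of this is BentoML code:

```python
import asyncio
import os
import tempfile
import threading

# Hypothetical per-thread directory, the alternative being discussed.
THREAD_DIR = tempfile.mkdtemp(prefix=f"thread-{threading.get_ident()}-")


async def handle_request(request_id: int) -> str:
    # Both requests run as coroutines on the same thread, so they share
    # THREAD_DIR and both write to the same "output.txt" path.
    path = os.path.join(THREAD_DIR, "output.txt")
    with open(path, "w") as f:
        f.write(f"result of request {request_id}")
    await asyncio.sleep(0)  # yield control; the other request may overwrite the file here
    with open(path) as f:
        return f.read()


async def main() -> None:
    results = await asyncio.gather(handle_request(1), handle_request(2))
    # Both requests can end up reading "result of request 2"; the isolation
    # that a per-request directory provides is lost.
    print(results)


asyncio.run(main())
```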

@Zheaoli

Zheaoli commented Jun 4, 2024

I think maybe we don't need a directory per request? We could group the directories by thread/process id, with all the request files in the same directory.

But then we would also need to clean it periodically, or its size will keep growing. The purpose of the temp dir is to isolate the storage path for each request, and requests are handled in coroutines, so keying on a process/thread id wouldn't work: the same process and thread may be handling more than one request concurrently.

Oh, I see what you mean: you want developers to follow the common convention of keeping all of a request's files in the same directory.


@Zheaoli Zheaoli left a comment


LGTM

@xianml
Contributor

xianml commented Jun 4, 2024

How about we just use a mem file instead? Say pyfakefs, or just use fsspec to create a memory file system.
After all, we don't really need to sink the data to a real file, right?
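
For reference, a minimal sketch of that idea using fsspec's in-memory filesystem; the paths and contents are illustrative, not anything BentoML actually does:

```python
import fsspec  # requires the fsspec package

# In-memory filesystem; nothing is written to disk.
fs = fsspec.filesystem("memory")

with fs.open("/output.txt", "wb") as f:
    f.write(b"prediction result")

print(fs.cat("/output.txt"))  # b'prediction result'

# The limitation raised in the replies below: this path exists only inside
# fsspec's MemoryFileSystem, so code using open(), os.path, or pathlib.Path,
# and libraries expecting a real filesystem path, cannot read it.
```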

@frostming
Contributor Author

How about we just use a mem file instead

That would be better in principle, but our goal is to provide a seamless experience where users can save files using the familiar os.path or pathlib.Path interfaces, and neither of those options supports that.

@xianml
Contributor

xianml commented Jun 4, 2024

That would be better in principle, but our goal is to provide a seamless experience where users can save files using the familiar os.path or pathlib.Path interfaces, and neither of those options supports that.

A mem file is also an implementation of the file interface. We could use it as the implementation behind ctx.temp_dir and still return it as a Path.
Is that feasible?

@frostming
Contributor Author

A mem file is also an implementation of the file interface. We could use it as the implementation behind ctx.temp_dir and still return it as a Path.
Is that feasible?

I'm afraid not: many inference functions expect a plain string as the input or output path, and once we hand them a path string the memfile magic is lost.

@aarnphm
Contributor

aarnphm commented Jun 4, 2024

Hmm, can we use a memfile and still offer a pathlib-like API here?

I guess the FS API is actually not very well known, afaict.

@frostming
Contributor Author

Hmm, can we use a memfile and still offer a pathlib-like API here?

But the os.path APIs won't work, right? That's an extra learning burden for users.

Fixes bentoml#4760

Signed-off-by: Frost Ming <me@frostming.com>
@frostming frostming enabled auto-merge (squash) June 12, 2024 00:37
@frostming frostming requested a review from bojiang June 12, 2024 00:37
@frostming frostming merged commit 89eb556 into bentoml:main Jun 12, 2024
42 checks passed
@frostming frostming deleted the frostming/issue4760 branch June 12, 2024 09:53