remove unnecessary field for FixedBlockPool in inference (#5729)

Closed
hy-NJU wants to merge 1 commit into pytorch:main from hy-NJU:export-D91654624

Conversation


@hy-NJU hy-NJU commented May 4, 2026

Summary:

X-link: https://github.com/facebookresearch/FBGEMM/pull/2657

In the training FixedBlockPool, the per-block metadata has four fields:
int64_t key;
uint32_t timestamp;
uint32_t count : 31;
bool used : 1;

But the count field is never used in inference, while it still takes 4 bytes of DRAM per block. This diff sets up InferenceFixedBlockPool to save that space.

Reviewed By: emlin

Differential Revision: D91654624

@meta-cla meta-cla Bot added the cla signed label May 4, 2026

meta-codesync Bot commented May 4, 2026

@hy-NJU has exported this pull request. If you are a Meta employee, you can view the originating Diff in D91654624.

hy-NJU pushed a commit to hy-NJU/FBGEMM that referenced this pull request May 4, 2026
@hy-NJU hy-NJU force-pushed the export-D91654624 branch from 5a750f7 to c32b764 Compare May 5, 2026 01:44
@meta-codesync meta-codesync Bot changed the title remove uneccesarry field for FixedBlockPool in inference remove uneccesarry field for FixedBlockPool in inference (#5729) May 5, 2026
hy-NJU pushed a commit to hy-NJU/FBGEMM that referenced this pull request May 5, 2026
hy-NJU pushed a commit to hy-NJU/FBGEMM that referenced this pull request May 5, 2026
@hy-NJU hy-NJU force-pushed the export-D91654624 branch from c32b764 to bba8806 Compare May 5, 2026 14:09
hy-NJU pushed a commit to hy-NJU/FBGEMM that referenced this pull request May 5, 2026
@meta-codesync meta-codesync Bot closed this in 4ad11ec May 5, 2026

meta-codesync Bot commented May 5, 2026

This pull request has been merged in 4ad11ec.
