
Support for Intel QuickAssist? #455

Closed
bcronje opened this issue Nov 17, 2016 · 5 comments

bcronje commented Nov 17, 2016

Are there any plans to support Intel QuickAssist inside ZSTD? The performance that the QuickAssist adapters like these bring to the table is really compelling, especially in real time compression scenarios.

Cyan4973 (Contributor) commented

This is certainly an interesting piece of hardware, though supporting it would require dedicated development work. It's also unclear how programmable these adapters are, which means it might be a decision entirely in Intel's hands.

kklouzal commented Nov 22, 2016

Many modern CPUs have Intel Quick Assist acceleration, so you don't have to shell out for one of those high-end cards to reap the benefits of Quick Assist. Access to the API is provided at the OS level, whether it be Windows or Linux. I assume zstd would provide alternate compression/decompression methods that take advantage of Quick Assist, and leave it to the developer to determine whether the system supports Quick Assist and to implement those alternate methods, so one could take advantage of the hardware provided by the system.

bcronje (Author) commented Nov 22, 2016

After doing a bit more reading up on QuickAssist I realized my initial assumptions were incorrect. Initially I thought the QuickAssist hardware provided generic compute offload that ZSTD could use, but it appears that Intel has implemented its own compression algorithm inside QuickAssist, so it does not appear to be usable by other algorithms.

Cyan4973 closed this as completed Dec 2, 2016
Ornias1993 (Contributor) commented Dec 20, 2019

@bcronje @Cyan4973 That's not entirely true; AFAIK QAT could be used to offload parts of the zstd stack, such as hashing. It should also be possible to use it with custom compression, as Ceph already uses it that way:

QAT Support for Compression

As mentioned above, QAT support for compression is based on the QATzip library in user space, which is designed to take full advantage of the performance provided by QuickAssist Technology. Unlike QAT-based encryption, QAT-based compression is supported through a tool class for QAT acceleration rather than a compressor plugin. The common tool class will be shared among the zip, snappy, and lz4 compressor plugins, and can transparently accelerate the existing compression types. So the user is allowed to use it to speed up the existing compression types as long as the QAT hardware is available and QAT is capable of handling them.
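For reference, here is roughly what driving the user-space QATzip library directly can look like. This is only a sketch: the function names (qzInit, qzSetupSession, qzCompress, ...) are recalled from qatzip.h and may differ between QATzip releases, and the snippet is not code from zstd, Ceph, or QATzip itself.

```c
/* Hedged sketch: compress one buffer through QATzip, falling back to
 * software when no QAT device is available. API names are from memory
 * of qatzip.h and may not match every QATzip version exactly. */
#include <qatzip.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    QzSession_T sess;
    memset(&sess, 0, sizeof(sess));

    /* sw_backup = 1: use a software fallback if no QAT hardware is found. */
    if (qzInit(&sess, 1) != QZ_OK)
        return 1;
    /* NULL params: use the library defaults. */
    if (qzSetupSession(&sess, NULL) != QZ_OK)
        return 1;

    unsigned char src[4096], dst[8192];
    memset(src, 'a', sizeof(src));
    unsigned int src_len = sizeof(src);
    unsigned int dst_len = sizeof(dst);

    /* last = 1: this buffer is the final chunk of the stream. */
    if (qzCompress(&sess, src, &src_len, dst, &dst_len, 1) != QZ_OK) {
        fprintf(stderr, "qzCompress failed\n");
        return 1;
    }
    printf("compressed %u -> %u bytes\n", src_len, dst_len);

    qzTeardownSession(&sess);
    qzClose(&sess);
    return 0;
}
```

The point of the sketch is the offload model the Ceph docs describe: the application keeps its normal compress/decompress call sites, and the library decides whether the work runs on QAT hardware or in software.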

klauspost commented

Sorry if I'm reviving the dead. QATzip's qzstd path uses LZ4 to compress the data, then partially decompresses the output, converts it to zstd sequences and a literal buffer, and then recompresses those using the (software) zstd library.

Source: https://github.com/intel/QATzip/blob/master/utils/qzstd.c#L211
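For anyone curious what the "converts it to zstd sequences" step could plug into on the zstd side: recent zstd releases expose an experimental ZSTD_compressSequences() entry point (behind ZSTD_STATIC_LINKING_ONLY) that takes externally produced sequences plus the original source buffer and entropy-codes them into a zstd frame. The toy sketch below hand-writes two sequences instead of deriving them from an LZ4 parse, so it only illustrates the mechanism, not QATzip's actual code.

```c
/* Hedged sketch: feed externally produced sequences into zstd via the
 * experimental ZSTD_compressSequences() API. Requires a zstd build that
 * ships this API (it is marked experimental and may change). */
#define ZSTD_STATIC_LINKING_ONLY
#include <zstd.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* 36 bytes of input: 16 literals, a 16-byte repeat, 4 trailing literals. */
    char src[36];
    memcpy(src,      "0123456789abcdef", 16);
    memcpy(src + 16, "0123456789abcdef", 16);
    memcpy(src + 32, "WXYZ", 4);

    /* Sequences as an external matcher (e.g. an LZ4-style parse) might emit:
     * 16 literals plus a 16-byte match at distance 16, then an explicit block
     * delimiter (offset == 0, matchLength == 0) carrying 4 trailing literals. */
    ZSTD_Sequence seqs[2] = {
        { /*offset*/ 16, /*litLength*/ 16, /*matchLength*/ 16, /*rep*/ 0 },
        { 0, 4, 0, 0 },
    };

    ZSTD_CCtx* cctx = ZSTD_createCCtx();
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_blockDelimiters, ZSTD_sf_explicitBlockDelimiters);
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_validateSequences, 1);  /* error out on bad sequences */

    char dst[128];
    size_t csize = ZSTD_compressSequences(cctx, dst, sizeof(dst),
                                          seqs, 2, src, sizeof(src));
    if (ZSTD_isError(csize)) {
        fprintf(stderr, "error: %s\n", ZSTD_getErrorName(csize));
        return 1;
    }
    printf("compressed %zu -> %zu bytes\n", sizeof(src), csize);
    ZSTD_freeCCtx(cctx);
    return 0;
}
```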

That approach must compress worse than level 1, and I can't imagine it being faster.

I tried a similar trick with Snappy when I was learning Zstandard. Fun, but pretty much useless. Fully decompressing and re-compressing is usually both better and faster.
