
PackJPG disabled #2

Open
M-Gonzalo opened this issue Jul 30, 2021 · 8 comments

@M-Gonzalo

I'm guessing due to licensing? Could you see if you can work around this? Thanks in advance.

@M-Gonzalo M-Gonzalo changed the title PackJPG disabled PackJPG and wavpack disabled Jul 30, 2021
@M-Gonzalo M-Gonzalo changed the title PackJPG and wavpack disabled PackJPG disabled Jul 30, 2021
@bartoszek (Owner) commented Jul 30, 2021

As far as I can tell it should be enabled; check with nm -C /usr/lib/libpcompress.so | grep jpg:

0000000000095220 t packjpg_filter
00000000000a4290 t packjpg_filter_process
00000000001a12a8 b jpgfilename
00000000001a129c b jpgfilesize
000000000009a310 t jpg_parse_jfif(unsigned char, unsigned int, unsigned char*)
0000000000099750 t jpg_next_mcupos(int*, int*, int*, int*, int*, int*)
000000000009c230 t jpg_next_mcuposn(int*, int*, int*)
000000000009b3b0 t jpg_encode_crbits(abitwriter*, abytewriter*)
000000000009ca00 t jpg_encode_eobrun(abitwriter*, huffCodes*, int*) [clone .part.0]
000000000009c120 t jpg_next_huffcode(abitreader*, huffTree*)
000000000009b510 t jpg_setup_imginfo()
000000000009b430 t jpg_rebuild_header()
0000000000014100 t jpg_rebuild_header() [clone .cold]
000000000009da20 t jpg_decode_block_seq(abitreader*, huffTree*, huffTree*, short*)
000000000009c180 t jpg_decode_dc_prg_fs(abitreader*, huffTree*, short*)
000000000009b220 t jpg_encode_block_seq(abitwriter*, huffCodes*, huffCodes*, short*)
000000000009c990 t jpg_encode_dc_prg_fs(abitwriter*, huffCodes*, short*) [clone .isra.0]

By default it's built with the LGPLv3 license, which has packjpg enabled.

./archive/pc_arc_filter.c
#ifndef _MPLV2_LICENSE_
extern size_t packjpg_filter_process(uchar_t *in_buf, size_t len, uchar_t **out_buf);
ssize_t packjpg_filter(struct filter_info *fi, void *filter_private);

extern size_t packpnm_filter_process(uchar_t *in_buf, size_t len, uchar_t **out_buf);
ssize_t packpnm_filter(struct filter_info *fi, void *filter_private);
#endif

@M-Gonzalo (Author) commented Jul 31, 2021

You're right, I get the same output as you. But it still doesn't work:

$ pcompress -a -l14 -GLPxjC photos photos.pz
Scanning files.
Sorting ...
Scaling to 2 threads


Compression Statistics
======================
Total chunks           : 3
Best compressed chunk  : 2 MB(14.52%)
Worst compressed chunk : 19 MB(99.90%)
Avg compressed chunk   : 13 MB(69.93%)

Adaptive mode stats:
        BZIP2 chunk count: 0
        LIBBSC chunk count: 0
        PPMd chunk count: 0
        LZMA chunk count: 1
        LZ4 chunk count: 0

37623558        photos.tar.pcf //precomp -cn
44000941        photos.pz
44862622        photos

@bartoszek (Owner) commented Jul 31, 2021

Yep, looks like a bug in pcompress.c:init_pc_context().
pctx.enable_packjpg is set only in the section responsible for automatic selection of extra compression filters when higher compression levels are used.
https://github.com/moinakg/pcompress/blob/c6e779c40041b7bb46259e9806fa92b20c7b78fb/pcompress.c#L3658-L3674

Using any of the -DPGjx flags sets pctx.advanced_opts=1, which skips this section, so pctx.enable_packjpg and friends are never set according to the command-line flags.

I've pushed a small patch to GitHub that should fix this; give it a try. (I've postponed the push to AUR until it's confirmed to resolve the issue. Also, debug flags are set for you in the PKGBUILD, so just makepkg -CLfi it 😏)

3190e4b

@bartoszek (Owner) commented Jul 31, 2021

To quickly see what's going on, start pcompress with the desired flags and attach perf to its PID with sudo perf top -d 3 -p $PID. You should see some pjg_* functions from the packjpg filter:

  11.90%  libpcompress.so.1  [.] Bt4_MatchFinder_GetMatches
   8.94%  libpcompress.so.1  [.] model_s::update_model
   8.28%  libpcompress.so.1  [.] aricoder::encode
   6.66%  libpcompress.so.1  [.] GetMatchesSpec1
   5.06%  libpcompress.so.1  [.] model_s::shift_context
   4.81%  libpcompress.so.1  [.] abitreader::read
   3.88%  libpcompress.so.1  [.] model_s::totalize_table
   3.70%  libpcompress.so.1  [.] pjg_encode_ac_high
   3.53%  libpcompress.so.1  [.] RangeEnc_EncodeBit
   3.03%  libpcompress.so.1  [.] model_b::shift_context
   2.42%  libpcompress.so.1  [.] pjg_aavrg_context
   2.25%  libpcompress.so.1  [.] aricoder::write_bit
   2.07%  libpcompress.so.1  [.] GetOptimum

@M-Gonzalo (Author) commented Jul 31, 2021

It's not working for me:

35810318        photos.pz    // -l14
44000941        photos.pz    // -l14 -GLPxjC

It also overrides the use of wavpack:

2021471        audio.pz    // -l14
4720015        audio.pz    // -l14 -GLPxjC

I'm using the new binary, of course.
Do you want me to upload it, or to run it with some other parameters?

@bartoszek (Owner) commented Aug 1, 2021

I've tried precomp (btw, thanks, I hadn't known about it) and it looks fine to me 🤔

d /tmp/photos*
42M     /tmp/photos_fix.pz    // pcompress -j
42M     /tmp/photos_orig.pz   // pcompress -l14
42M     /tmp/photos.pcf       // precomp /tmp/photos.tar
57M     /tmp/photos.tar

@M-Gonzalo (Author) commented Aug 1, 2021

It's working now! I can't tell why it wasn't before, though... Maybe the system was using a cached version? But it shouldn't have been.

Anyways, I'm reaaally glad. I've been trying to make pcompress work to test it since I first found out about it years ago. Thank you!

BTW, you might want to see this, if you're planning on giving pcompress some more love:
Google's brunsli is 5x faster than packJPG, at only about a 1% loss in ratio. Sometimes it's even stronger, and it's under active development.

Also, fast-lzma2 is 2x faster than mt-lzma, and lends itself beautifully to the chunk-based approach of pcompress. Basically, it allows using the same state across chunks, producing a truly solid stream no matter how many threads are used. Bottom line: it will probably compress much better than the current approach while doing it twice as fast.

There's a lot more that can be done with pcompress, if you want to discuss it. Thanks again for your work!

@bartoszek (Owner)

Thanks for the great suggestions; I'm not really into C++ development, but I'll look into it.
