
Usage about Wavemesh #1

Closed
Seven377 opened this issue Nov 19, 2015 · 12 comments


@Seven377

Can you provide instructions for wavemesh? I don't know how to use it.

@valette
Owner

valette commented Nov 19, 2015

As explained in the Readme file:

execute wavemesh without arguments to see the available options.

Also, feel free to submit pull requests if you want to add more documentation!

Thanks

@Seven377
Author

I have successfully run Wavemesh. I have two questions:

  1. When I set the coordinate quantization to 16 bits, I get an error:
    " terminate called after throwing an instance of 'std::length_error'
    what(): vector::_M_fill_insert "
  2. I don't know how to compute the compression rates (bpv), including the connectivity and geometry compression rates.

@valette
Owner

valette commented Nov 19, 2015

When I set the coordinate quantization to 16 bits, I get an error:
" terminate called after throwing an instance of 'std::length_error'
what(): vector::_M_fill_insert "

I think you cannot use 16 bits; 15 bits should work, however.

I don't know how to compute the compression rates (bpv), including the connectivity and geometry compression rates.

During decompression, a file "report.txt" should give you all this info.

@Seven377
Author

I found it, thank you very much.

@Seven377
Author

I have understood the meaning of each value in the file "report.txt". However, I would like to know whether the "total data" includes both the connectivity and geometry compression rates. Can I obtain the geometry compression rate at each level of detail?

@valette
Owner

valette commented Nov 23, 2015

OK, the data is not completely clear. Let's look at one line:

Level 25: 11547f, 5826v, valence entropy= 2.2838, total data: 4.51972 bits/v (connectivity: 25757bits, 3.73387 bits/vertex for this level)

At level 25, the total data weighs 4.51972 bits/v, i.e. 4.51972 × 5826 ≈ 26332 bits. As written, connectivity took 25757 bits, so geometry took the remaining part.
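The arithmetic can be checked with a few lines of Python (a throwaway sketch; the numbers are copied from the report line quoted above):

```python
# Geometry size at a level = total data minus connectivity data.
# Values copied from the level-25 report.txt line quoted above.
total_bpv = 4.51972        # "total data", in bits per vertex
vertices = 5826            # vertex count at this level
connectivity_bits = 25757  # connectivity data, in bits

total_bits = total_bpv * vertices               # ~26332 bits in total
geometry_bits = total_bits - connectivity_bits  # geometry is the rest

print(f"total:    {total_bits:.0f} bits")
print(f"geometry: {geometry_bits:.0f} bits")
```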

@Seven377
Author

I have tried to compute the geometry compression rate this way; however, the total data size is sometimes smaller than the connectivity size. For example, when I used the Bunny model:

Level 16: 5618f, 2811v, valence entropy= 2.31131, total data: 3.19518 bits/v (connectivity: 13092bits, 3.92571 bits/vertex for this level)

The total data weighs 3.19518 bits/v, i.e. 3.19518 × 2811 ≈ 8982 bits, but connectivity took 13092 bits. In fact, this problem exists from level 1 to level 16.

@valette
Owner

valette commented Nov 23, 2015

Oops, sorry, I made a mistake in my explanation. On the bunny:
Filename: out.ddd
Quantization : 12 bits
No lifting
No Wavelet Geometrical Criterion
Total execution time : 0.75816 seconds : 91604.7 faces/s
Level 0 : 26f, 19v, total data: 1456 bits (connectivity: 422bits)
Level 1: 27f, 20v, valence entropy= 2.22821, total data: 0.050066 bits/v (connectivity: 606bits, 184 bits/vertex for this level)
[...cut...]
Level 25: 25587f, 12874v, valence entropy= 2.0304, total data: 8.37412 bits/v (connectivity: 48430bits, 3.29857 bits/vertex for this level)
Level 26: 69451f, 34834v, valence entropy= 1.19835, total data: 16.2191 bits/v (connectivity: 96366bits, 2.18288 bits/vertex for this level)
Global coding: 16.2191 bits/vertex, connectivity : 2.76644 bits/vertex, geometry : 13.4527bits/vertex
File size: 70622bytes

For level 25:
the total data weighs 8.37412 bits/v, i.e. 8.37412 × 34834 ≈ 291704 bits. For this model, you should always multiply the bitrate (in bits/v) by the original number of vertices: 34834 in the bunny case.
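A sketch of the corrected computation (Python; the numbers are copied from the level-25 line of the bunny report above):

```python
# Per-level bitrates in report.txt are relative to the ORIGINAL
# vertex count of the mesh (34834 for this bunny), not to the
# vertex count reached at that level.
original_vertices = 34834
total_bpv = 8.37412        # "total data" at level 25, bits/vertex
connectivity_bits = 48430  # connectivity data at level 25, bits

total_bits = total_bpv * original_vertices      # ~291704 bits
geometry_bits = total_bits - connectivity_bits  # the remaining bits

print(f"total:    {total_bits:.0f} bits")
print(f"geometry: {geometry_bits:.0f} bits")
```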

Sorry for the mix-up; is that better now?

@Seven377
Author

I have two questions:

  1. For the Bunny model, at each level, the geometry compression rate = (total data × 34834 − connectivity) / 34834, right? Or should I divide by the number of vertices at that level (12874, taking level 25 as an example)?
  2. Should I use the original number of vertices for all models? Because this problem also exists when I use the Sphere model.

@valette
Owner

valette commented Nov 23, 2015

1 - Correct.
2 - Yes, you should always use the original number of vertices. This is the common convention for progressive compression (all papers report it the same way).
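Putting the two answers together, a hypothetical helper (not part of wavemesh; it just assumes the per-level line format quoted earlier in this thread) could parse a report.txt line and return the geometry bitrate in bits per original vertex:

```python
import re

# Matches per-level lines of the form quoted in this thread, e.g.
# "Level 25: 25587f, 12874v, valence entropy= 2.0304, total data:
#  8.37412 bits/v (connectivity: 48430bits, ...)"
LINE_RE = re.compile(
    r"Level (\d+): (\d+)f, (\d+)v, .*"
    r"total data: ([\d.]+) bits/v \(connectivity: (\d+)bits"
)

def geometry_bpv(report_line, original_vertices):
    """Geometry bitrate = (total * N_orig - connectivity) / N_orig."""
    m = LINE_RE.search(report_line)
    if m is None:
        raise ValueError("unrecognized report line")
    total_bpv = float(m.group(4))
    connectivity_bits = int(m.group(5))
    return (total_bpv * original_vertices - connectivity_bits) / original_vertices

line = ("Level 25: 25587f, 12874v, valence entropy= 2.0304, "
        "total data: 8.37412 bits/v (connectivity: 48430bits, "
        "3.29857 bits/vertex for this level)")
print(round(geometry_bpv(line, 34834), 4))  # ~6.9838 bits/vertex of geometry
```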

@Seven377
Author

What I have learned is far from enough for mesh compression; I will keep working at it. Thank you for your patient explanation.

@valette
Owner

valette commented Nov 23, 2015

you're welcome!

@valette valette closed this as completed Jul 4, 2018