I have started trying to import partially melted snowflakes and I am constantly getting segfaults.
The first memory corruption I have identified is in the algorithm that parses the constituents: it considers the vector of particle constituents to be Ndipoles*Nmaterials long instead of Ndipoles. I have started implementing a solution in branch https://github.com/rhoneyager/libicedb/tree/heterogeneous_particles but I hesitate to open a pull request because:
We need to discuss a little more how we store the constituent information (at the moment it is not clear to me, and I might have resolved the issue in the wrong way)
There is a check in Shapes.cpp that prevents writing files with more than one constituent unless constituent_names is specified
Thanks to the answers to #15 I know the second point has some HDF I/O issues as well. We might want to fix those before making additional progress.
I have also added some two-component shapefiles to the samplefiles folder.
After the recent changes to the master branch I no longer get segmentation faults.
However, if I do
I get an abort message.
That is due to lines 149-152 of Shapes.cpp, which check whether the number of constituent names equals the number of particle constituents. Fine: I comment out that check, run again, and obtain my first two-component shape database.
Now, looking at the content, it seems that components are just randomly assigned.
I think that is related to how particle_scattering_element_composition_whole is filled in shapeIOtextParsers2.cpp.
There the variable is treated as if it were Ndipoles*Nmaterials long, whereas it is actually just Ndipoles long (the differing sizes of the variables may "erroneously" compensate for the mismatched memory allocations).
I do not think it is urgent to fix this; it is much more important to agree on the standard first. I thought we were looking to include just particle_scattering_element_composition_fractional to simplify the standard (at the cost of trivially larger floating-point storage), but I might be wrong.