I have some basic file-structure support right now. For files that had filenames this works about 80% of the time, but some palettes aren't detected correctly, because for Tonberry the filename doesn't matter as long as the hash is right. At least I think so.
Some mods have to use the hashmap to find out which texture and palette to replace, so we would need to port the hash code from Tonberry to C#. It's in GlobalContext.cpp.
Basic file-structure support: `((\w{2})\w\d+)` => `$1\$2\$2*.png` or `$1\$2\$2_13-28*.png`
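As a sketch of how that pattern could be applied (the directory layout it produces is my reading of the replacement string, not a confirmed Tonberry convention):

```cpp
#include <regex>
#include <string>

// Sketch: map an internal texture code like "icm12" to the on-disk
// glob pattern described above. $1 is the whole code, $2 the first
// two characters; the resulting layout is an assumption.
std::string to_glob(const std::string& code) {
    static const std::regex re(R"(((\w{2})\w\d+))");
    std::smatch m;
    if (std::regex_match(code, m, re)) {
        // ((\w{2})\w\d+) => $1\$2\$2*.png
        return m[1].str() + "\\" + m[2].str() + "\\" + m[2].str() + "*.png";
    }
    return {}; // no match: not a code we recognize
}
```

For example, `to_glob("icm12")` would yield `icm12\ic\ic*.png`; the `_13-28` variant would just be a second replacement string on the same captures.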
Port the hash algorithm. Tonberry hashes the texture that Direct3D is drawing. The source code was released, and it's in C++.
Find out what state the textures need to be in. We need the texture in the correct state to generate the same hash. We have the texture both in its raw 8/16-bit state and in the converted 24/32-bit state.
I'm guessing it's the latter.
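If the converted state is the right one, we would first need to expand the raw palette-indexed data before hashing. A minimal sketch, assuming an 8-bit index buffer, a 256-entry palette, and RGBA byte order (all assumptions, not Tonberry's confirmed format):

```cpp
#include <cstdint>
#include <vector>

// Sketch: expand an 8-bit palette-indexed texture into the 32-bit
// RGBA buffer we suspect Tonberry hashes. Byte order is assumed.
std::vector<uint8_t> expand_to_rgba(const std::vector<uint8_t>& indices,
                                    const std::vector<uint32_t>& palette) {
    std::vector<uint8_t> out;
    out.reserve(indices.size() * 4);
    for (uint8_t idx : indices) {
        uint32_t c = palette[idx];
        out.push_back(static_cast<uint8_t>(c >> 24)); // R (assumed order)
        out.push_back(static_cast<uint8_t>(c >> 16)); // G
        out.push_back(static_cast<uint8_t>(c >> 8));  // B
        out.push_back(static_cast<uint8_t>(c));       // A
    }
    return out;
}
```

The point is that any mismatch here (byte order, padding, alpha handling) produces a different buffer and therefore a different hash, which is exactly the risk being weighed.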
Be able to parse the CSV files. These contain the modder's chosen filename and the hash in decimal form.
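A minimal sketch of reading those CSV files, assuming a simple `filename,hash` column layout (the real column order may differ):

```cpp
#include <cstdint>
#include <istream>
#include <sstream>
#include <string>
#include <unordered_map>

// Sketch: parse "filename,hash" lines, where the hash is written in
// decimal as described above. Column order is an assumption.
std::unordered_map<uint64_t, std::string> parse_hashmap_csv(std::istream& in) {
    std::unordered_map<uint64_t, std::string> map;
    std::string line;
    while (std::getline(in, line)) {
        auto comma = line.find(',');
        if (comma == std::string::npos) continue; // skip malformed lines
        std::string name = line.substr(0, comma);
        uint64_t hash = std::stoull(line.substr(comma + 1)); // decimal
        map[hash] = name;
    }
    return map;
}
```

Keying the map by hash matches how it would be used at draw time: hash the texture, look up the replacement filename.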
Maybe it would be better to use a standard hash, since they seem to want a new hash anyway. Once we have mapped out all the textures, we can run multiple hash algorithms on all of them, track the time and collision rate, and then suggest the winner be added to Tonberry Enhanced. Tonberry doesn't check the entire texture, so we could also come up with sampling tests, like every other pixel, and keep going until we get maximum speed and zero collisions. https://softwareengineering.stackexchange.com/questions/49550/which-hashing-algorithm-is-best-for-uniqueness-and-speed
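The timing/collision experiment could be sketched like this, using FNV-1a as one stand-in candidate and a byte stride to mimic the "every other pixel" idea (the function names and sampling scheme are hypothetical, not what Tonberry does):

```cpp
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <unordered_set>
#include <vector>

// Sketch: FNV-1a over every Nth byte of a texture buffer. stride = 1
// hashes everything; larger strides trade accuracy for speed.
uint64_t fnv1a(const std::vector<uint8_t>& data, std::size_t stride) {
    uint64_t h = 1469598103934665603ull;           // FNV offset basis
    for (std::size_t i = 0; i < data.size(); i += stride) {
        h = (h ^ data[i]) * 1099511628211ull;      // FNV prime
    }
    return h;
}

struct Result {
    std::size_t collisions; // textures that hashed to an already-seen value
    double millis;          // wall-clock time for the whole run
};

// Hash every texture once, counting duplicate hash values and timing
// the run. Repeat per candidate hash / stride to compare them.
Result benchmark(const std::vector<std::vector<uint8_t>>& textures,
                 std::size_t stride) {
    std::unordered_set<uint64_t> seen;
    std::size_t collisions = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (const auto& tex : textures) {
        if (!seen.insert(fnv1a(tex, stride)).second) ++collisions;
    }
    auto t1 = std::chrono::steady_clock::now();
    return {collisions,
            std::chrono::duration<double, std::milli>(t1 - t0).count()};
}
```

The same harness would be run once per candidate algorithm and stride, and the suggestion to Tonberry Enhanced would be whichever combination hits zero collisions fastest on the full texture set.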
I'm going to put this on the back burner for now; I wanted to record it here as a reminder to come back to it.
We are going to skip hash support, as I don't think we can generate the hash without producing exactly the same data in our textures. So we will just require paths and filenames to be very specific.