Leveldb data format #21
Comments
I have the same question. Since I am not so familiar with the leveldb format, I simply tried to look into the code in e.g. convert_cifar_data.cpp and figure out how it works. To convert a .mat file to work with Caffe, I first convert it to binary format (using fwrite in MATLAB), following the CIFAR-10 dataset's binary layout as described here (http://www.cs.toronto.edu/~kriz/cifar.html). After that, I use a modified version of convert_cifar_data.cpp to convert it to leveldb format. This approach works well when my .mat file contains image data, i.e., values stored as uint8. However, I run into trouble with int16 or float data. (It seems the converter assumes uint8 data.)
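As a rough stdlib-only sketch of the binary layout described above (the function name and toy data are my own, not part of Caffe or the CIFAR-10 tooling), each record is a single label byte followed by 32*32*3 uint8 pixel bytes:

```python
import os
import struct
import tempfile

def write_cifar_style_records(path, labels, images):
    # One record = 1-byte label + 32*32*3 uint8 pixel bytes,
    # matching the CIFAR-10 binary layout that convert_cifar_data.cpp reads.
    with open(path, "wb") as f:
        for label, pixels in zip(labels, images):
            assert len(pixels) == 32 * 32 * 3, "expected a 32x32 RGB image"
            f.write(struct.pack("B", label))  # label byte (0-255)
            f.write(bytes(pixels))            # row-major uint8 pixel data

# toy example: a single all-zero image with label 3
out = os.path.join(tempfile.mkdtemp(), "toy.bin")
write_cifar_style_records(out, [3], [bytes(32 * 32 * 3)])
print(os.path.getsize(out))  # 3073 bytes per record
```

In MATLAB the equivalent is an fwrite of the label followed by the pixel values cast to uint8; just be careful about dimension order, since MATLAB stores arrays column-major.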
Caffe accepts single/float image data. If you are testing, the matlab wrapper should be suitable for your needs as long as you pass images in the expected dimension order. For efficiency, or for parceling up data for training, you can convert to leveldb. It could be helpful to take a look at the matlab wrapper and the python conversion examples, as well as the io utility code (although you may have to do an intermediate step of exporting your .mat to images).
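Since the CIFAR-style converter assumes uint8 data, one stdlib-only workaround for int16 or float inputs is to linearly rescale each sample into the 0-255 range before writing the binary file. This is a sketch, not Caffe's own tooling; the helper name is hypothetical, and in practice you would choose the rescaling range per image or per dataset:

```python
def rescale_to_uint8(values):
    """Linearly map numeric samples (e.g. int16 or float) into 0-255
    so they can be written as uint8 in the CIFAR-style binary format."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant input
    return bytes(int(round((v - lo) * 255 / span)) for v in values)

print(rescale_to_uint8([-100, 0, 100]))  # b'\x00\x80\xff'
```

Note that this loses precision; if your network genuinely needs float inputs, feeding data through the matlab or python wrappers (which accept single/float) avoids the quantization entirely.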
Thanks ChenlongChen. I just followed what you said, and can now transform the CIFAR-10 .mat format to binary format; then I just follow the cifar10 example and get the correct training accuracy.
Hi guys, since my dataset is in a MATLAB .mat file, I want to convert it to leveldb format so it can be used in the Caffe framework. Can you give me some hints? Many thanks.