TF-slim: Unable to read dataset using slim.dataset_data_provider.DatasetDataProvider when images are not in JPEG #6788
@kwotsin would you be able to do what you want to do just by using the core (i.e. non-contrib) APIs? @sguada This issue makes a compelling case that there's an API design issue in contrib/slim that prevents the user from loading grayscale PNG files. It's a niche use case, but looks like something that ought to be fixed, if you have time to help our friend.
tf-slim.data can decode JPEG, PNG, or raw images (see here). The only caveat is that it needs to know that the encoded image is in PNG format when the tf-example is created.
@sguada Yes, I have changed the code to read the images in PNG format before reading them with the slim API; however, I can't seem to find a way to tell tf-slim that the files are encoded as PNG. @jart I am figuring out how to do this in pure TF, but it seems I might have to write a custom function for reading the dataset, or simply read the images from disk (which might be much slower). For the time being I have bypassed the problem by trying JPEG files first.
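As a sketch of the pure-TF route mentioned above: decoding with the PNG-specific op sidesteps format auto-detection entirely. This is a minimal example, assuming a modern TF with the tf.io namespace (the 0.11-era equivalents lived under tf.image):

```python
import numpy as np
import tensorflow as tf

# Build a tiny 4x4 single-channel (grayscale) image and encode it as PNG.
gray = np.arange(16, dtype=np.uint8).reshape(4, 4, 1)
png_bytes = tf.io.encode_png(gray)

# Decode with the PNG-specific op rather than a generic/JPEG decoder;
# channels=1 keeps the grayscale shape explicit.
decoded = tf.io.decode_png(png_bytes, channels=1)
print(tuple(decoded.shape))  # (4, 4, 1)
```

Since PNG is lossless, the decoded tensor round-trips to the original array, which makes this easy to sanity-check before wiring it into a reading pipeline.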
Just change 'jpg' to 'png' in this line when the file is a PNG.
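On the writing side, instead of hard-coding 'jpg' or 'png' at that line, the format string could be derived from the file's magic bytes. `sniff_image_format` below is a hypothetical helper (not part of slim) that does this in plain Python:

```python
def sniff_image_format(data: bytes) -> str:
    """Guess the encoded image format from its leading magic bytes."""
    if data.startswith(b'\x89PNG\r\n\x1a\n'):  # 8-byte PNG signature
        return 'png'
    if data.startswith(b'\xff\xd8\xff'):       # JPEG SOI marker
        return 'jpg'
    raise ValueError('unrecognized image format')

fmt = sniff_image_format(b'\x89PNG\r\n\x1a\n' + b'\x00' * 8)
print(fmt)  # png
```

The returned string can then be stored under the example's image-format feature when the tf-example is created, so mixed directories of JPEGs and PNGs are handled uniformly.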
What related GitHub issues or StackOverflow threads have you found by searching the web for your problem?
Couldn't find relevant threads as there aren't many tf-slim questions.
Environment info
Operating System:
Ubuntu 16.04
Installed version of CUDA and cuDNN:
8.0
If installed from binary pip package, provide:
python -c "import tensorflow; print(tensorflow.__version__)"
0.11
If possible, provide a minimal reproducible example (We usually don't have time to read hundreds of lines of your code)
Basically, I have a directory of subdirectories, with each subdirectory containing PNG images of a certain class. After editing tf-slim's download_and_convert_flowers.py to suit my images, I created a set of TFRecord files (both train and validation) and stored them in a directory.
Following that, I used the get_split function from dataset_utils (https://github.com/tensorflow/models/blob/master/slim/datasets/dataset_utils.py) to create a Dataset object that reads the TFRecord files of a given split (train or validation; I indicated 'train'). So now I have a Dataset object to read.
The problem comes when I try to use a batch-loading function to actually read the Dataset object for extracting images and creating a batch. Specifically, slim.dataset_data_provider.DatasetDataProvider cannot read the dataset at all. Here is my error traceback:
Upon inspection, I found that the error likely comes from this line:
File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/slim/python/slim/data/tfexample_decoder.py", line 297, in tensors_to_item image = self._decode(image_buffer, image_format)
In tfexample_decoder.py, if the image_format is not indicated, the image is decoded as JPEG by default unless it has 4 channels (RGBA). But since my images are grayscale, the JPEG decoder is used. Yet I have no way to specify the image format as 'png' unless the source code is changed. How should I go about this, or am I mistaken about the error I have arrived at?
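For reference, slim's decoder looks the format up under a feature key (conventionally 'image/format'); giving that feature a default_value of 'png' in keys_to_features routes format-less records to the PNG decoder. The sketch below shows only the parsing side, using TF2's tf.io ops rather than the 0.11-era contrib API, and the feature spec is an assumption modeled on slim's conventions:

```python
import tensorflow as tf

# Feature spec modeled on slim's TFExampleDecoder conventions. The key
# change is default_value='png', which applies whenever a record omits
# the 'image/format' feature.
keys_to_features = {
    'image/encoded': tf.io.FixedLenFeature([], tf.string, default_value=''),
    'image/format': tf.io.FixedLenFeature([], tf.string, default_value='png'),
}

# A minimal record that omits 'image/format' entirely.
example = tf.train.Example(features=tf.train.Features(feature={
    'image/encoded': tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b'\x89PNG...'])),
}))

parsed = tf.io.parse_single_example(example.SerializeToString(),
                                    keys_to_features)
print(parsed['image/format'].numpy())  # b'png'
```

With the format resolved to 'png' at parse time, the decoder can dispatch to the PNG path instead of silently assuming JPEG for grayscale images.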
I have previously run this code successfully (much of it adapted from the tf-slim walkthrough notebook), but the images I used then were JPEGs.
Any help is very much appreciated. Thank you for your time.