Searched a lot but didn't find any solution: input/output error #113

Closed
khalid5454 opened this issue Aug 23, 2019 · 2 comments

khalid5454 commented Aug 23, 2019

Hi all, I am using this file to open a directory of image files and save some features, but I am getting errors. I searched a lot but didn't find any solution.

################################################################################################################################
# This file is used to extract features from the dataset and save them to disk.
#
# inputs:
# outputs:
################################################################################################################################

import os
import random

import numpy as np
import tensorflow as tf

# Just disables the warning, doesn't enable AVX/FMA

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import pickle

os._warn_preinit_stderr = 0

BOTTLENECK_TENSOR_NAME = 'pool_3/_reshape:0'
BOTTLENECK_TENSOR_SIZE = 2048
MODEL_INPUT_WIDTH = 299
MODEL_INPUT_HEIGHT = 299
MODEL_INPUT_DEPTH = 3
JPEG_DATA_TENSOR_NAME = 'DecodeJpeg/contents:0'
RESIZED_INPUT_TENSOR_NAME = 'ResizeBilinear:0'
MAX_NUM_IMAGES_PER_CLASS = 2 ** 27 - 1 # ~134M

def create_inception_graph():
    """Creates a graph from the saved GraphDef file and returns a Graph object.

    Returns:
        Graph holding the trained Inception network, and various tensors we'll be
        manipulating.
    """
    with tf.compat.v1.Session() as sess:
        model_filename = os.path.join('imagenet', 'classify_image_graph_def.pb')
        with tf.gfile.FastGFile(model_filename, 'rb') as f:
            graph_def = tf.compat.v1.GraphDef()
            graph_def.ParseFromString(f.read())
            bottleneck_tensor, jpeg_data_tensor, resized_input_tensor = (
                tf.import_graph_def(graph_def, name='', return_elements=[
                    BOTTLENECK_TENSOR_NAME, JPEG_DATA_TENSOR_NAME,
                    RESIZED_INPUT_TENSOR_NAME]))
    return sess.graph, bottleneck_tensor, jpeg_data_tensor, resized_input_tensor

def run_bottleneck_on_image(sess, image_data, image_data_tensor,
                            bottleneck_tensor):
    # Runs the bottleneck tensor on one encoded image and returns its values.
    bottleneck_values = sess.run(
        bottleneck_tensor,
        {image_data_tensor: image_data})
    bottleneck_values = np.squeeze(bottleneck_values)
    return bottleneck_values

# Get outputs from second-to-last layer in pre-built model

boots_files = ['uploads/dogs_and_cats/Boots/' + f
               for f in os.listdir('uploads/dogs_and_cats/Boots')]
sandals_files = ['uploads/dogs_and_cats/Sandals/' + f
                 for f in os.listdir('uploads/dogs_and_cats/Sandals')]
shoes_files = ['uploads/dogs_and_cats/Shoes/' + f
               for f in os.listdir('uploads/dogs_and_cats/Shoes')]
slippers_files = ['uploads/dogs_and_cats/Slippers/' + f
                  for f in os.listdir('uploads/dogs_and_cats/Slippers')]
apparel_files = ['uploads/dogs_and_cats/apparel/' + f
                 for f in os.listdir('uploads/dogs_and_cats/apparel')]

all_files = boots_files + shoes_files + slippers_files + sandals_files + apparel_files

random.shuffle(all_files)

num_images = 10000
neighbor_list = all_files[:num_images]
with open('neighbor_list_recom.pickle', 'wb') as f:
    pickle.dump(neighbor_list, f)
print("saved neighbour list")

extracted_features = np.ndarray((num_images, 2048))
sess = tf.compat.v1.Session()
graph, bottleneck_tensor, jpeg_data_tensor, resized_image_tensor = (create_inception_graph())

for i, filename in enumerate(neighbor_list):
    image_data = tf.io.gfile.GFile(filename, 'rb').read()
    features = run_bottleneck_on_image(sess, image_data, jpeg_data_tensor, bottleneck_tensor)
    extracted_features[i:i + 1] = features
    if i % 250 == 0:
        print(i)

np.savetxt("saved_features_recom.txt", extracted_features)
print("saved exttracted features")

++++++++++++++++++++++++
Error:

"C:\Users\Muhammad Khalid\Anaconda3\python.exe" "C:/Users/Muhammad Khalid/Desktop/Recommendation systems using image similarity powered by deep learning/Deeplearning_Image_Similarity-master/server/image_vectorizer.py"

WARNING: Logging before flag parsing goes to stderr.
W0823 21:18:45.101209 20020 __init__.py:308] Limited tf.compat.v2.summary API due to missing TensorBoard installation.
saved neighbour list

W0823 21:18:45.223723 20020 deprecation.py:323] From C:/Users/Muhammad Khalid/Desktop/Recommendation systems using image similarity powered by deep learning/Deeplearning_Image_Similarity-master/server/image_vectorizer.py:39: FastGFile.__init__ (from tensorflow.python.platform.gfile) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.gfile.GFile.

Traceback (most recent call last):
File "C:/Users/Muhammad Khalid/Desktop/Recommendation systems using image similarity powered by deep learning/Deeplearning_Image_Similarity-master/server/image_vectorizer.py", line 113, in
image_data = tf.io.gfile.GFile(filename, 'rb').read()

File "C:\Users\Muhammad Khalid\Anaconda3\lib\site-packages\tensorflow\python\lib\io\file_io.py", line 122, in read
self._preread_check()

File "C:\Users\Muhammad Khalid\Anaconda3\lib\site-packages\tensorflow\python\lib\io\file_io.py", line 84, in _preread_check
compat.as_bytes(self.__name), 1024 * 512)

tensorflow.python.framework.errors_impl.UnknownError: NewRandomAccessFile failed to Create/Open: uploads/dogs_and_cats/Sandals/Athletic : Access is denied.
; Input/output error

Process finished with exit code 1
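
One thing that stands out in the traceback: the failing path uploads/dogs_and_cats/Sandals/Athletic has no file extension, so it is probably a subdirectory rather than an image file. os.listdir() returns subdirectory names as well as file names, and reading a directory with GFile fails on Windows with exactly this kind of "Access is denied / Input/output error". A minimal guard, just a sketch against the directory layout used in the script above (the helper name list_image_files is made up for illustration), would be to keep only regular files when building the lists:

import os

def list_image_files(directory):
    # Keep only regular files; skip subdirectories such as 'Athletic'.
    return [os.path.join(directory, f)
            for f in os.listdir(directory)
            if os.path.isfile(os.path.join(directory, f))]

boots_files = list_image_files('uploads/dogs_and_cats/Boots')
sandals_files = list_image_files('uploads/dogs_and_cats/Sandals')

If fewer than num_images real image files remain after filtering, num_images would need to be lowered accordingly.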


yilei (Contributor) commented Aug 26, 2019

This doesn't seem to be an absl issue; did you mean tensorflow? Closing.

@yilei yilei closed this as completed Aug 26, 2019
khalid5454 (Author) commented

Yeah, it looks like a TensorFlow error.
