
Failure in segmenting OD from DRIONS DB #10

Open
Mahanteshambi opened this issue Apr 23, 2019 · 1 comment

@Mahanteshambi

I am using your

U-Net, OD on DRIONS-DB (fold 0)

notebook to run OD segmentation on DRIONS-DB. It works perfectly fine when I use your pre-trained model from the folder

05.03,02_40,U-Net light, on DRIONS-DB 256 px fold 0, SGD, high augm, CLAHE, log_dice loss

to segment images from your "DRIONS_DB.hdf5".
But when I load an image directly from DRIONS-DB and predict the segmentation with the following code, the result is very poor:

import numpy as np
import cv2
import matplotlib.pyplot as plt
from PIL import Image

# `model` and `tf_to_th_encoding` are loaded/defined as in the notebook
img_path = 'E:/DRIONS/DRIONS-DB/images/image_001.jpg'
im = np.array(Image.open(img_path))
im = im[0:, 40:]                       # crop as in the notebook
print(im.shape)
im = cv2.resize(im, (256, 256))
print(im.shape)
plt.imshow(im), plt.show()
im = np.expand_dims(im, axis=0)        # add batch dimension
im = tf_to_th_encoding(im)             # channels-last -> channels-first
prediction = (model.predict(im)[0, 0]).astype(np.float64)
plt.imshow(prediction > 0.5, cmap=plt.cm.Greys_r), plt.show()

So can you help me understand why it works with an image from your DRIONS_DB.hdf5 file but fails on a new image?

@seva100
Owner

seva100 commented Dec 25, 2019

@Mahanteshambi, big apologies for not noticing your request in time.
Actually, you just need to add one line that rescales the image from the uint8 [0, 255] intensity range to the float [0, 1] range:

im = np.array(Image.open(img_path))
im = im[0:, 40:]
print(im.shape)
im = cv2.resize(im, (256, 256))

im = im.astype(np.float64) / 255.0     # rescale intensities to [0, 1]

print(im.shape)
plt.imshow(im), plt.show()
im = np.expand_dims(im, axis=0)
im = tf_to_th_encoding(im)
prediction = (model.predict(im)[0, 0]).astype(np.float64)
plt.imshow(prediction, cmap=plt.cm.Greys_r), plt.show()

This way it works fine for me.

Also, to reproduce exactly the same results, it's necessary to apply CLAHE with im = skimage.exposure.equalize_adapthist(im) right after the rescaling line im = im.astype(np.float64) / 255.0, since the model expects images that have already been processed with CLAHE.
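For reference, here is a minimal sketch of the full preprocessing pipeline with both fixes applied (assuming model and tf_to_th_encoding are loaded from the notebook, and the image path is just an example):

import numpy as np
import cv2
import skimage.exposure
import matplotlib.pyplot as plt
from PIL import Image

img_path = 'E:/DRIONS/DRIONS-DB/images/image_001.jpg'   # example path
im = np.array(Image.open(img_path))
im = im[0:, 40:]                                 # crop as in the notebook
im = cv2.resize(im, (256, 256))
im = im.astype(np.float64) / 255.0               # rescale to [0, 1]
im = skimage.exposure.equalize_adapthist(im)     # CLAHE, as the model expects
im = np.expand_dims(im, axis=0)                  # add batch dimension
im = tf_to_th_encoding(im)                       # channels-last -> channels-first
prediction = model.predict(im)[0, 0].astype(np.float64)
plt.imshow(prediction, cmap=plt.cm.Greys_r), plt.show()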
