Hello, I am using your code for change detection. In your code you apply some transpose operations to the input data and to the model's output. What is the purpose of these transpose operations? Do I need to transpose my change detection dataset as well?
Also, when I trained with your code, the loss became huge after a few hundred iterations in the first epoch, and the validation precision stayed at 0. I am not sure where the problem is.
Very much looking forward to your answer. Thank you, and all the best!
Those transpose operations are just for visualisation purposes.
prepare_img returns a height x width x channel numpy array, while the network expects batch x channel x height x width as its input.
If you are getting huge loss values and the only change is your dataset, I would recommend verifying that your ground truth masks are in the correct format: each pixel in the segmentation mask must be an integer index corresponding to its semantic class.
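As a shapes-only sketch of that layout conversion (using a random dummy array in place of prepare_img's output, and an assumed 21-class score map standing in for the network's output):

```python
import numpy as np

# Dummy stand-in for prepare_img (assumed to return a float H x W x C array).
img = np.random.rand(224, 224, 3).astype(np.float32)

# Network input: H x W x C -> C x H x W, then [None] adds a leading batch
# axis, giving the batch x channel x height x width layout PyTorch expects.
img_inp = img.transpose(2, 0, 1)[None]
print(img_inp.shape)  # (1, 3, 224, 224)

# Network output (a dummy 21-class score map here), transposed back to
# H x W x C purely so it can be visualised like an ordinary image:
out = np.random.rand(21, 224, 224)
segm = out.transpose(1, 2, 0)
print(segm.shape)  # (224, 224, 21)
```

So the transposes only shuttle between the image-library convention (channels last) and the network convention (channels first); the dataset itself does not need to be stored transposed.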
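One quick way to run that check is to load a mask and confirm its values are integer class indices in range; this helper is a sketch (the two-class change-detection setting and the example arrays are assumptions, not from the original code):

```python
import numpy as np

def check_mask(mask: np.ndarray, num_classes: int) -> None:
    """Raise ValueError unless the mask contains only integer class indices."""
    if not np.issubdtype(mask.dtype, np.integer):
        raise ValueError(f"mask dtype is {mask.dtype}, expected an integer type")
    values = np.unique(mask)
    bad = values[(values < 0) | (values >= num_classes)]
    if bad.size:
        raise ValueError(f"mask contains out-of-range labels: {bad}")

# A binary change-detection mask with classes {0, 1} passes the check,
# while a colour-coded mask (values such as 255) is caught:
check_mask(np.array([[0, 1], [1, 0]]), num_classes=2)    # OK
# check_mask(np.array([[0, 255]]), num_classes=2)        # raises ValueError
```

A typical failure mode this catches is exporting masks as RGB or 0/255 images rather than as per-pixel class indices, which makes the loss explode and the validation metrics collapse to 0.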
Thank you very much for your answer. I found the reason the loss was so large: my learning rate was set too high. Thank you again, and I wish you all the best!
For reference, the two transpose calls in question:

```python
img_inp = torch.tensor(prepare_img(img).transpose(2, 0, 1)[None]).float()
segm = mnet(img_inp)[0].data.cpu().numpy().transpose(1, 2, 0)
```