
prediction difference between batch=1 and batch=16 #16

Closed
alanbekker opened this issue May 7, 2018 · 8 comments

Comments

@alanbekker

alanbekker commented May 7, 2018

Any ideas why I'm getting different prediction values when running with batch_size=1 versus batch_size=16?
Code below.
Thanks!

def predict(self, speech_input):
    # One dummy label per input sample; only the count is used below.
    labels = np.arange(speech_input.shape[0])
    feature, logits, _ = self.session.run(
        [self.features, self.logits, self.end_points_speech],
        feed_dict={self.is_training: False,
                   self.batch_dynamic: labels.shape[0],
                   self.margin_imp_tensor: 50,
                   self.batch_speech: speech_input})
                   # self.batch_labels: labels.reshape([labels.shape[0], 1])})

    return feature, logits
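Since the graph is fed `is_training: False`, per-sample predictions should not depend on what else is in the batch. One framework-agnostic way to confirm that (a hypothetical helper, not part of this repo) is to run the same inputs one at a time and as a single batch, then compare row by row:

```python
import numpy as np

def check_batch_invariance(predict_fn, inputs, atol=1e-5):
    """Return True if predict_fn gives the same per-sample output
    whether samples are run individually or together in one batch.
    A mismatch usually points to an op that mixes information across
    the batch (e.g. batch norm in training mode, batch-wise averaging)."""
    full = predict_fn(inputs)  # one run over all N samples
    singles = np.concatenate(
        [predict_fn(inputs[i:i + 1]) for i in range(len(inputs))])
    return np.allclose(full, singles, atol=atol)

# Toy per-sample "model": invariance should hold.
x = np.random.RandomState(0).randn(16, 4)
print(check_batch_invariance(lambda a: a * 2.0 + 1.0, x))  # True
```

In this issue, `predict_fn` would wrap the `predict` method above; if the check fails, the discrepancy is in the graph, not in the feeding code.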
@astorfi
Owner

astorfi commented May 13, 2018

Are you running on your own dataset?

@alanbekker
Author

alanbekker commented May 13, 2018 via email

@astorfi
Owner

astorfi commented May 13, 2018

I don't quite follow your question, actually. Why are you expecting to get the same prediction values?

@alanbekker
Author

alanbekker commented May 13, 2018 via email

@astorfi
Owner

astorfi commented May 14, 2018

What do you mean by batch16?

@alanbekker
Author

alanbekker commented May 14, 2018 via email

@astorfi
Owner

astorfi commented May 16, 2018

I think you should debug the values more carefully. It's weird, and it's hard for me to form an idea about the details and the values you are getting. Please check whether the averaging is being performed correctly. This might be due to the drastic changes introduced in newer TensorFlow versions.

@astorfi
Owner

astorfi commented May 29, 2018

Please reopen this issue if the problem has not been resolved!
Thanks

@astorfi astorfi closed this as completed May 29, 2018