
Benchmark: XGBoost performance on Fashion-MNIST 89.8% and on MNIST 96.8% #88

Closed
anktplwl91 opened this issue Dec 8, 2017 · 1 comment · Fixed by #89

Comments

@anktplwl91

Hello, I tried XGBoost on both the Fashion-MNIST and MNIST datasets, with the only pre-processing being scaling the pixel values to mean=0.0 and var=1.0.

Fashion-MNIST
Train accuracy 99.5%
Validation accuracy 90.7%
Test accuracy 89.8%

MNIST
Train accuracy 99.7%
Validation accuracy 97.4%
Test accuracy 96.8%

Notebook link
https://github.com/anktplwl91/fashion_mnist.git
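
For reference, here is a minimal sketch of the setup described above (standardize pixels to mean 0 / variance 1, then train an XGBoost classifier). Loading Fashion-MNIST via OpenML and the hyper-parameters shown are assumptions for illustration; the notebook's exact data loading, validation split, and boosting settings may differ.

```python
# Minimal sketch (assumed, not the notebook's exact code): standardize pixel values
# to mean 0.0 / variance 1.0, then fit an XGBoost classifier on flattened images.
import xgboost as xgb
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 784-dimensional flattened 28x28 images, labels 0-9 (Fashion-MNIST from OpenML)
X, y = fetch_openml("Fashion-MNIST", version=1, return_X_y=True, as_frame=False)
y = y.astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=42, stratify=y)

# Scale each pixel to mean 0.0 and variance 1.0 (the pre-processing described above)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)  # fit scaling statistics on the training set only
X_test = scaler.transform(X_test)        # reuse the same statistics on the test set

# Illustrative hyper-parameters, not necessarily those used in the notebook
clf = xgb.XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```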

hanxiao pushed a commit that referenced this issue Dec 8, 2017
@SynthaxWarrior

Isn't it wrong to do fit_transform() instead of transform() on the test set when scaling?
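
To illustrate the distinction (assuming scikit-learn's StandardScaler is the scaler in question, and using tiny made-up arrays): calling fit_transform() on the test set re-estimates the mean and variance from the test data, so train and test end up standardized with different statistics; the usual pattern is to fit on the training data and only transform() the test set.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Tiny illustrative arrays standing in for flattened image features
X_train = np.array([[0.0, 10.0], [2.0, 20.0], [4.0, 30.0]])
X_test = np.array([[1.0, 15.0], [3.0, 25.0]])

scaler = StandardScaler()

# Usual pattern: estimate mean/variance on the training data only ...
X_train_scaled = scaler.fit_transform(X_train)
# ... and apply those same statistics to the test data.
X_test_scaled = scaler.transform(X_test)
print(X_test_scaled)

# By contrast, StandardScaler().fit_transform(X_test) would re-estimate mean/variance
# from the test set itself, standardizing train and test with different statistics
# (a form of test-set leakage into the pre-processing).
```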
