
[Image Classification API] Bottleneck phase values always computed #4452

Closed
luisquintanilla opened this issue Nov 7, 2019 · 1 comment

luisquintanilla commented Nov 7, 2019

System information

  • OS version/distro: Windows 10
  • .NET Version (e.g., dotnet --info): 2.1
  • ML.NET Version (e.g., dotnet --info): 1.4.0

Issue

Even with the ReuseTrainSetBottleneckCachedValues and ReuseValidationSetBottleneckCachedValues parameters in ImageClassificationTrainer.Options set to true, the bottleneck computation still runs on subsequent runs. I believe that once the bottleneck values have been computed on the first run, setting both of those parameters to true should skip the bottleneck computation and go directly into the training phase. Is this no longer the case?

Source code / logs

See sample source code at this link: https://github.com/luisquintanilla/machinelearning-samples/blob/33f87d226f350fb36552dd8b1cee6a7c3f12da89/samples/csharp/getting-started/DeepLearning_ImageClassification_Binary/DeepLearning_ImageClassification_Binary/Program.cs#L53

luisquintanilla referenced this issue Nov 7, 2019
codemzs (Member) commented Nov 7, 2019

Hi Luis, you need to specify a workspace path so that the bottleneck files are not deleted; otherwise, the bottleneck files are created in a temporary folder that is removed after the training run.
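A minimal sketch of that fix as an options fragment. ImageClassificationTrainer.Options, WorkspacePath, and the two Reuse* flags are the ML.NET API discussed in this issue; the column names and the "workspace" folder name are illustrative assumptions, not taken from the linked sample:

```csharp
// Sketch only: point the trainer at a durable workspace folder so the
// bottleneck cache files survive between runs, instead of the default
// temporary folder that is deleted after training.
var options = new ImageClassificationTrainer.Options
{
    FeatureColumnName = "Image",        // illustrative column names
    LabelColumnName = "LabelAsKey",
    // "workspace" is a hypothetical path; any folder that persists
    // between runs works.
    WorkspacePath = "workspace",
    // With a durable WorkspacePath, these flags can take effect on the
    // second and later runs and skip the bottleneck computation.
    ReuseTrainSetBottleneckCachedValues = true,
    ReuseValidationSetBottleneckCachedValues = true,
};

var pipeline = mlContext.MulticlassClassification.Trainers
    .ImageClassification(options);
```

On the first run the bottleneck values are computed and cached under WorkspacePath; subsequent runs with the same workspace reuse them and proceed directly to training.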

codemzs closed this Nov 7, 2019