
Commit

Add batch_size parameter to inference configuration files
valhassan committed May 1, 2024
1 parent 24dd60b commit 71b60c2
Showing 2 changed files with 2 additions and 0 deletions.
1 change: 1 addition & 0 deletions config/inference/default_binary.yaml
@@ -6,6 +6,7 @@ inference:
   model_path: ${general.save_weights_dir}/
   output_path:
   checkpoint_dir: # (string, optional): directory in which to save the object if url
+  batch_size: 8
   chunk_size: # if empty, will be calculated automatically from max_pix_per_mb_gpu
   # Maximum number of pixels each Mb of GPU Ram to allow. E.g. if GPU has 1000 Mb of Ram and this parameter is set to
   # 10, chunk_size will be set to sqrt(1000 * 10) = 100.
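
The chunk_size fallback described in the comment above follows a simple square-root rule. A minimal sketch of that calculation, using hypothetical variable names (gpu_ram_mb, max_pix_per_mb_gpu) for the quantities the comment refers to:

    import math

    # Hypothetical names for the quantities named in the config comment above.
    gpu_ram_mb = 1000        # available GPU RAM, in Mb (example value from the comment)
    max_pix_per_mb_gpu = 10  # pixels allowed per Mb of GPU RAM (example value from the comment)

    # chunk_size = sqrt(GPU RAM in Mb * max_pix_per_mb_gpu)
    chunk_size = int(math.sqrt(gpu_ram_mb * max_pix_per_mb_gpu))
    print(chunk_size)  # 100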
1 change: 1 addition & 0 deletions config/inference/default_multiclass.yaml
@@ -6,6 +6,7 @@ inference:
   model_path: ${general.save_weights_dir}/
   output_path:
   checkpoint_dir: # (string, optional): directory in which to save the object if url
+  batch_size: 8
   chunk_size: # if empty, will be calculated automatically from max_pix_per_mb_gpu
   # Maximum number of pixels each Mb of GPU Ram to allow. E.g. if GPU has 1000 Mb of Ram and this parameter is set to
   # 10, chunk_size will be set to sqrt(1000 * 10) = 100.
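
The ${general.save_weights_dir} interpolation suggests these files are consumed through OmegaConf/Hydra. A minimal sketch of reading the newly added parameter, assuming the file is loaded directly with OmegaConf rather than through the project's actual entry point:

    from omegaconf import OmegaConf

    # Sketch only: the real pipeline composes configs through Hydra, and the
    # ${general.save_weights_dir} interpolation would need the 'general' group
    # to resolve; batch_size itself is a plain value and reads fine.
    cfg = OmegaConf.load("config/inference/default_binary.yaml")

    batch_size = cfg.inference.batch_size  # -> 8, the value added by this commit
    chunk_size = cfg.inference.chunk_size  # -> None while left empty in the YAML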
