If I have multiple models but only one GPU, can TensorFlow Serving allow the models to share that GPU? Also, how do I enable GPU usage? Currently it seems like only the CPU is used, even though I have a GPU installed (and it works fine with TensorFlow training/inference).
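For context, this is roughly the setup I'm aiming for — multiple models in one model server process on the GPU build. This is a sketch, not something I have working: the paths and model names are placeholders, and I'm assuming the official `tensorflow/serving:latest-gpu` Docker image with the NVIDIA container toolkit installed.

A `models.config` listing the models:

```
model_config_list {
  config {
    name: "model_a"
    base_path: "/models/model_a"
    model_platform: "tensorflow"
  }
  config {
    name: "model_b"
    base_path: "/models/model_b"
    model_platform: "tensorflow"
  }
}
```

And the server launched with GPU access, capping per-process GPU memory so the models can coexist:

```shell
# --gpus all exposes the GPU to the container (needs nvidia-container-toolkit);
# --per_process_gpu_memory_fraction limits how much GPU memory the server grabs
docker run --gpus all -p 8501:8501 \
  -v /path/to/models:/models \
  -v /path/to/models.config:/models/models.config \
  tensorflow/serving:latest-gpu \
  --model_config_file=/models/models.config \
  --per_process_gpu_memory_fraction=0.4
```

Is something along these lines the right approach, or is there more to it?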