diff --git a/pages/managed-inference/faq.mdx b/pages/managed-inference/faq.mdx
index 4bfe11ee96..4132842e5f 100644
--- a/pages/managed-inference/faq.mdx
+++ b/pages/managed-inference/faq.mdx
@@ -57,7 +57,7 @@ Managed Inference offers different Instance types optimized for various workload
 You can select the Instance type based on your model’s computational needs and compatibility.
 
 ## How is Managed Inference billed?
-Billing is based on the Instance type and usage duration. Unlike [Generative APIs](/generative-apis/quickstart/), which are billed per token, Managed Inference provides predictable costs based on the allocated infrastructure.
+Billing is based on the Instance type and usage duration (in minutes). Unlike [Generative APIs](/generative-apis/quickstart/), which are billed per token, Managed Inference provides predictable costs based on the allocated infrastructure. Billing starts only when a model deployment is ready and can be queried.
 Pricing details can be found on the [Scaleway pricing page](https://www.scaleway.com/en/pricing/model-as-a-service/#managed-inference).
 
 ## Can I pause Managed Inference billing when the instance is not in use?
@@ -79,4 +79,4 @@ Absolutely. Managed Inference integrates seamlessly with other Scaleway services
 ## Do model licenses apply when using Managed Inference?
 Yes, model licenses need to be complied with when using Managed Inference. Applicable licenses are available for [each model in our documentation](/managed-inference/reference-content/).
 - For models provided in the Scaleway catalog, you need to accept licenses (including potential EULA) before creating any Managed Inference deployment.
-- For custom models you choose to import on Scaleway, you are responsible for complying with model licenses (as with any software you choose to install on a GPU Instance for example). 
\ No newline at end of file
+- For custom models you choose to import on Scaleway, you are responsible for complying with model licenses (as with any software you choose to install on a GPU Instance for example).