
LLMs and MIMIC #1593

Jul 20, 2023 · 1 comment · 3 replies


Thank you for being thoughtful about this!

  1. It is clear that sending data over an API is a violation of the DUA.
  2. It's also clear that by storing data on Google Cloud Platform, we are to some extent trusting GCP to house the data, albeit under our complete control (we can delete it, and the data are encrypted such that GCP staff cannot read it).

For LLMs, we are carrying on with the principles behind that second point. So, as of now (July 2023), our stance is:

  1. Sending data to an external API endpoint (Claude, OpenAI, etc.): Not permitted.
  2. Deploying a model within your controlled cloud environment and using it (e.g. Azure OpenAI): Permitted (see the sketch after this list).
  3. Running an LLM locally on your own hardware/acc…
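
For point 2, here is a minimal sketch of the permitted pattern, assuming the `openai` Python SDK (v1.x) and a hypothetical Azure OpenAI deployment provisioned inside your own subscription; the endpoint, API version, and deployment name are placeholders, not a prescribed setup:

```python
# Minimal sketch: querying an Azure OpenAI deployment that lives inside your
# own subscription, so MIMIC text never leaves infrastructure you control.
# The endpoint, API version, and deployment name below are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-institution.openai.azure.com",  # your private resource
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt-4-deployment",  # name of the deployment in your resource
    messages=[
        {"role": "user", "content": "Summarize this de-identified note: ..."},
    ],
)
print(response.choices[0].message.content)
```

The contrast with point 1 is not the client code but where the data go: the same request sent to a public API endpoint would ship MIMIC text to servers outside your control, which is what the DUA prohibits.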
