OCI Data Science can be used to fine-tune, deploy, and manage Large Language Models (LLMs) effectively, efficiently, and easily. This page curates links to some common LLM use cases.
- Fine-tune Llama 2 with a distributed multi-node, multi-GPU job
- Quantize Llama 2 70B to 4 bits and deploy it on 2x A10 GPUs
- Deploy Llama 2 with a fully service-managed TGI or vLLM container
- Deploy Meta-Llama-3-8B-Instruct with the Oracle service-managed vLLM (0.3.0) container
- Deploy Meta-Llama-3.1-405B-Instruct with the vLLM (0.5.3.post1) container
- Deploy Meta-Llama-3.1-8B-Instruct with the vLLM (0.5.3.post1) container
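Once a model is deployed, it can be invoked over HTTPS. The sketch below is a minimal, hypothetical example of calling a vLLM-backed model deployment; the endpoint URL placeholders, the choice of resource-principal authentication, and the model name are assumptions to be replaced with your own deployment's details. It relies on vLLM serving an OpenAI-compatible chat-completions payload.

```python
# Hypothetical sketch of invoking a vLLM model deployment on OCI Data Science.
# The endpoint URL and auth method below are assumptions; substitute your own
# deployment's invoke endpoint and an OCI signer appropriate to your environment.
import json


def build_chat_payload(
    prompt: str,
    model: str = "meta-llama/Meta-Llama-3-8B-Instruct",  # assumed model name
    max_tokens: int = 256,
    temperature: float = 0.2,
) -> dict:
    """Build an OpenAI-compatible chat-completions payload, as served by vLLM."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


if __name__ == "__main__":
    import requests
    import oci

    # Resource-principal auth works inside OCI (e.g. a notebook session);
    # use config-file auth (oci.config.from_file) when running elsewhere.
    signer = oci.auth.signers.get_resource_principals_signer()
    # Placeholder endpoint; copy the real invoke URL from your model deployment.
    endpoint = "https://modeldeployment.<region>.oci.customer-oci.com/<deployment-ocid>/predict"
    resp = requests.post(endpoint, json=build_chat_payload("Hello!"), auth=signer)
    print(json.dumps(resp.json(), indent=2))
```

The payload builder is kept separate from the network call so the request shape can be inspected or reused independently of the deployment.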
AI Quick Actions make working with LLMs simple and require no coding. From the AI Quick Actions extension in a notebook session, you can explore foundation models, kick off a fine-tuning process, deploy a model as a web endpoint, test it with a simple chat interface, and run evaluation jobs. Learn more in this blog post: Introducing AI Quick Actions in OCI Data Science