Prompt Optimization

This repository demonstrates prompt optimization using both a local LLM and a GPT-4o endpoint authenticated with an Azure AD Service Principal.

Prerequisites

Dataset Creation

A dataset was created to identify Personally Identifiable Information (PII) using ChatGPT with the following prompt:

Create an Excel file that contains three columns: 'id', 'text', and 'contains_pii'. 
Add around 1,000 rows of text in the 'text' column, ensuring some rows contain PII and others do not. 
Fill the 'contains_pii' column with 'yes' or 'no' accordingly. Be consistent.

The generated dataset is saved in the data folder.
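For reference, a minimal sketch of loading the dataset with pandas is shown below. The file name pii_dataset.xlsx is hypothetical; substitute the actual file saved in the data folder.

import pandas as pd

# Load the generated Excel dataset (columns: id, text, contains_pii).
# The file name below is an assumption; adjust it to the file in data/.
df = pd.read_excel("data/pii_dataset.xlsx")

# Quick sanity check on the label distribution.
print(df["contains_pii"].value_counts())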

Objective

Using this dataset, we aim to optimize prompts by:

  1. Running a local LLM.
  2. Utilizing a GPT-4o endpoint with an Azure AD Service Principal.

Notebook Overview

Part 1: Optimization Using a Local LLM

This section demonstrates how to optimize prompts using a locally hosted LLM served by Ollama. It assumes the deepseek-r1:8b model has already been downloaded (e.g., with ollama pull deepseek-r1:8b).

Steps:

  1. Start the Ollama server:
    ollama serve
    
  2. Run the first part of the notebook (a minimal query sketch is shown after these steps).
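As a rough illustration, the sketch below queries the local deepseek-r1:8b model through the Ollama REST API on its default port. The prompt wording and helper name are illustrative only and do not reproduce the notebook's exact prompts.

import requests

def contains_pii(text: str) -> str:
    # Ask the locally served model for a yes/no PII judgment.
    response = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default endpoint
        json={
            "model": "deepseek-r1:8b",
            "prompt": (
                "Answer with exactly 'yes' or 'no'. "
                "Does the following text contain personally identifiable information?\n\n"
                + text
            ),
            "stream": False,  # return the full completion as a single JSON object
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"].strip().lower()

print(contains_pii("My SSN is 123-45-6789."))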

Part 2: Optimization Using GPT-4o Endpoint

This section demonstrates how to optimize prompts using a GPT-4o endpoint authenticated with an Azure AD Service Principal. A client configuration sketch using these values follows the list below.

Required Values:

  • TENANT_ID: Your Azure tenant ID.
  • CLIENT_ID: Your Azure client ID.
  • CLIENT_SECRET: Your Azure client secret.
  • AZURE_OPENAI_ENDPOINT: The endpoint URL, e.g., https://gpt4o....
  • AZURE_OPENAI_DEPLOYMENT: The deployment name (not the raw model name), e.g., gpt-4o.
  • AZURE_API_VERSION: The API version, e.g., 2023-05-15.
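The sketch below shows one way to wire these values into an Azure OpenAI client using the azure-identity and openai packages. The environment variable names mirror the list above and are assumptions about how the notebook stores them; the PII question in the example is illustrative.

import os

from azure.identity import ClientSecretCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Build a Service Principal credential from tenant/client ID and secret.
credential = ClientSecretCredential(
    tenant_id=os.environ["TENANT_ID"],
    client_id=os.environ["CLIENT_ID"],
    client_secret=os.environ["CLIENT_SECRET"],
)

# Exchange the credential for bearer tokens scoped to Azure Cognitive Services.
token_provider = get_bearer_token_provider(
    credential, "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token_provider=token_provider,
    api_version=os.environ["AZURE_API_VERSION"],
)

response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],  # deployment name, not the raw model name
    messages=[
        {"role": "system", "content": "Answer with 'yes' or 'no' only."},
        {"role": "user", "content": "Does this text contain PII? 'Call me at 555-0100.'"},
    ],
)
print(response.choices[0].message.content)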

Folder Structure

prompt_optimization/
├── data/                # Contains the PII dataset
├── notebooks/           # Jupyter notebooks for optimization
├── README.md            # Project documentation

Getting Started

  1. Ensure the dataset is available in the data folder.
  2. Follow the steps in the notebook to perform prompt optimization.

License

This project is licensed under the MIT License. See the LICENSE file for details.
