Merged
@@ -99,7 +99,7 @@ Results:
}
```

## Step 2: Show the current Python version
## Show the current Python version

The following command retrieves the Python runtime version currently used by your Azure App Service.

@@ -115,7 +115,7 @@ Results:
"PYTHON|3.10"
```
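As a side note (not part of the original walkthrough), the reported value packs runtime and version into one `RUNTIME|VERSION` string; the version number alone can be peeled off with `cut`:

```bash
# Split a linuxFxVersion-style string on '|' and keep the version field
runtime="PYTHON|3.10"
version=$(echo "$runtime" | cut -d'|' -f2)
echo "$version"   # prints 3.10
```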

## Step 3: Set the desired Python version
## Set the desired Python version

Update your Azure App Service instance to use a specific Python version. Replace the version string (for example, "PYTHON|3.11") as needed.

@@ -124,6 +124,7 @@ export DESIRED_PYTHON_VERSION="PYTHON|3.11"
az webapp config set --resource-group $RESOURCE_GROUP --name $APP_NAME --linux-fx-version $DESIRED_PYTHON_VERSION
```
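Before calling `az webapp config set`, it can help to sanity-check the runtime string. The guard below is a sketch: the `is_valid_runtime` helper and the accepted pattern are assumptions for illustration, not part of the Azure CLI.

```bash
# Hypothetical guard: accept only strings shaped like PYTHON|3.x
is_valid_runtime() {
  printf '%s' "$1" | grep -Eq '^PYTHON\|3\.[0-9]+$'
}
is_valid_runtime "PYTHON|3.11" && echo "accepted"   # prints accepted
is_valid_runtime "python3.11" || echo "rejected"    # prints rejected
```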

## Verify the Python version

Verify the updated Python version:

```bash
@@ -138,7 +139,7 @@ Results:
"PYTHON|3.11"
```

## Step 4: List all supported Python runtime versions
## List all supported Python runtime versions

Use the following command to view all Python versions supported by Azure App Service on Linux.
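The full list is long; assuming the command prints one runtime per line, the Python entries can be filtered with `grep` (the sample output below is made up for illustration):

```bash
# Hypothetical sample of the runtime list; keep only the PYTHON entries
runtimes='DOTNETCORE|7.0
NODE|18-lts
PYTHON|3.10
PYTHON|3.11'
echo "$runtimes" | grep '^PYTHON'
```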

7 changes: 2 additions & 5 deletions scenarios/CreateAKSWebApp/create-aks-webapp.md
@@ -486,18 +486,18 @@ Cert-manager provides Helm charts as a first-class method of installation on Kubernetes

```bash
helm repo add jetstack https://charts.jetstack.io
helm repo update
helm install cert-manager jetstack/cert-manager --namespace cert-manager --version v1.7.0
```

2. Update local Helm Chart repository cache

```bash
helm repo update
```

3. Install the Cert-Manager add-on via Helm by running the following:

```bash
helm install cert-manager jetstack/cert-manager --namespace cert-manager --version v1.7.0
```

4. Apply Certificate Issuer YAML File
@@ -538,9 +538,6 @@ Cert-manager provides Helm charts as a first-class method of installation on Kubernetes
nodeSelector:
"kubernetes.io/os": linux
EOF
```

```bash
cluster_issuer_variables=$(<cluster-issuer-prod.yml)
```
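The `$(<file)` expansion above slurps a file into a variable without spawning `cat`; a minimal bash demonstration with a throwaway file:

```bash
# Read a file into a variable using the $(<file) idiom (bash-specific)
printf 'apiVersion: v1\nkind: ConfigMap\n' > /tmp/demo.yml
contents=$(</tmp/demo.yml)
echo "$contents"
```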

@@ -573,7 +573,7 @@ The metrics collected are:
* Look for single processes with high read/write rates per second. Use this information as guidance on which processes generate the most I/O, rather than as a way to identify issues on its own.
Note: the `--human` option can be used to display numbers in human-readable format (that is, `KB`, `MB`, `GB`).

### `ps`
### Top CPU processes

Lastly, the `ps` command displays system processes, and its output can be sorted by either CPU or memory usage.

@@ -599,6 +599,7 @@ root 2186 42.0 0.0 73524 5836 pts/1 R+ 16:55 0:06 stress-ng --c
root 2191 41.2 0.0 73524 5592 pts/1 R+ 16:55 0:06 stress-ng --cpu 12 --vm 2 --vm-bytes 120% --iomix 4 --timeout 240
```
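The same `ps` invocation can be pivoted to CPU; a sketch of listing the top consumers by `%CPU` (the column set is assumed from procps-style `ps` on Linux):

```bash
# Top 10 processes by CPU usage: header line plus ten rows
ps -eo pid,user,%cpu,%mem,comm --sort=-%cpu | head -n 11
```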

### Top memory processes

To sort by `MEM%` and obtain the top 10 processes:

```azurecli-interactive
@@ -634,13 +635,4 @@

To run, you can create a file with the above contents, add execute permissions by running `chmod +x gather.sh`, and run with `sudo ./gather.sh`.

This script saves the output of the commands in a file located in the same directory where the script was invoked.

Additionally, all the commands in the bash code blocks covered in this document can be run through the Azure CLI using the run-command extension, parsing the output through `jq` to obtain output similar to running the commands locally:

```azurecli-interactive
output=$(az vm run-command invoke -g $MY_RESOURCE_GROUP_NAME --name $MY_VM_NAME --command-id RunShellScript --scripts "ls -l /dev/disk/azure")
value=$(echo "$output" | jq -r '.value[0].message')
extracted=$(echo "$value" | awk '/\[stdout\]/,/\[stderr\]/' | sed '/\[stdout\]/d' | sed '/\[stderr\]/d')
echo "$extracted"
```
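The `awk`/`sed` pipeline can be exercised against canned run-command output to see exactly what survives (the sample message below is invented for illustration):

```bash
# Simulated run-command message; keep only lines between [stdout] and [stderr]
value='Enable succeeded:
[stdout]
total 0
lrwxrwxrwx 1 root root 9 scsi1 -> ../../sdb
[stderr]'
extracted=$(echo "$value" | awk '/\[stdout\]/,/\[stderr\]/' | sed '/\[stdout\]/d' | sed '/\[stderr\]/d')
echo "$extracted"
```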
This script saves the output of the commands in a file located in the same directory where the script was invoked.
45 changes: 0 additions & 45 deletions scenarios/PostgresRAGLLM/app.py

This file was deleted.

35 changes: 12 additions & 23 deletions scenarios/PostgresRAGLLM/chat.py
@@ -8,7 +8,7 @@
from db import VectorDatabase

# Configure logging
logging.basicConfig(level=logging.DEBUG)
logging.basicConfig(level=logging.INFO)

parser = argparse.ArgumentParser()
parser.add_argument('--api-key', dest='api_key', type=str)
@@ -18,13 +18,11 @@
parser.add_argument('--pgpassword', dest='pgpassword', type=str)
parser.add_argument('--pgdatabase', dest='pgdatabase', type=str)
parser.add_argument('--populate', dest='populate', action="store_true")
parser.add_argument('--question', dest='question', type=str, help="Question to ask the chatbot")
args = parser.parse_args()


class ChatBot:
def __init__(self):
logging.debug("Initializing ChatBot")
self.db = VectorDatabase(pguser=args.pguser, pghost=args.phhost, pgpassword=args.pgpassword, pgdatabase=args.pgdatabase)
self.api = AzureOpenAI(
azure_endpoint=args.endpoint,
@@ -39,21 +37,17 @@ def __init__(self):
)

def load_file(self, text_file: str):
logging.debug(f"Loading file: {text_file}")
logging.info(f"Loading file: {text_file}")
with open(text_file, encoding="UTF-8") as f:
data = f.read()
chunks = self.text_splitter.create_documents([data])
for i, chunk in enumerate(chunks):
text = chunk.page_content
embedding = self.__create_embedding(text)
self.db.save_embedding(i, text, embedding)

def __create_embedding(self, text: str):
logging.debug(f"Creating embedding for text: {text[:30]}...")
return self.api.embeddings.create(model="text-embedding-ada-002", input=text).data[0].embedding
logging.info("Done loading data.")

def get_answer(self, question: str):
logging.debug(f"Getting answer for question: {question}")
question_embedding = self.__create_embedding(question)
context = self.db.search_documents(question_embedding)

@@ -80,26 +74,21 @@ def get_answer(self, question: str):
)
return response.choices[0].message.content

def __create_embedding(self, text: str):
return self.api.embeddings.create(model="text-embedding-ada-002", input=text).data[0].embedding


def main():
chat_bot = ChatBot()

if args.populate:
logging.debug("Loading embedding data into database...")
chat_bot.load_file("knowledge.txt")
logging.debug("Done loading data.")
return

if args.question:
logging.debug(f"Question provided: {args.question}")
print(chat_bot.get_answer(args.question))
return

while True:
q = input("Ask a question (q to exit): ")
if q == "q":
break
print(chat_bot.get_answer(q))
else:
while True:
q = input("Ask a question (q to exit): ")
if q == "q":
break
print(chat_bot.get_answer(q))


if __name__ == "__main__":
101 changes: 6 additions & 95 deletions scenarios/PostgresRAGLLM/postgres-rag-llm.md
@@ -136,101 +136,12 @@ pip install -r requirements.txt
python chat.py --populate --api-key $API_KEY --endpoint $ENDPOINT --pguser $PGUSER --phhost $PGHOST --pgpassword $PGPASSWORD --pgdatabase $PGDATABASE
```

## Set up Web Interface
## Run Chat bot

Create a simple web interface for the chatbot using Flask.

1. **Install Flask**

```bash
pip install Flask
```

2. **Create `app.py`**

Create a file named `app.py` in the `scenarios/PostgresRagLlmDemo` directory with the following content:

```python
from flask import Flask, request, render_template
import subprocess
import os

app = Flask(__name__)

@app.route('/', methods=['GET'])
def home():
return render_template('index.html', response='')

@app.route('/ask', methods=['POST'])
def ask():
question = request.form['question']
result = subprocess.run([
'python', 'chat.py',
'--api-key', os.getenv('API_KEY'),
'--endpoint', os.getenv('ENDPOINT'),
'--pguser', os.getenv('PGUSER'),
'--phhost', os.getenv('PGHOST'),
'--pgpassword', os.getenv('PGPASSWORD'),
'--pgdatabase', os.getenv('PGDATABASE'),
'--question', question
], capture_output=True, text=True)
response = result.stdout
return render_template('index.html', response=response)

if __name__ == '__main__':
app.run(host='0.0.0.0', port=5000)
```

3. **Create `index.html`**

Create a `templates` directory inside `scenarios/PostgresRagLlmDemo` and add an `index.html` file with the following content:

```html
<!doctype html>
<html lang="en">
<head>
<title>Chatbot Interface</title>
</head>
<body>
<h1>Ask about Zytonium</h1>
<form action="/ask" method="post">
<input type="text" name="question" required>
<button type="submit">Ask</button>
</form>
<pre>{{ response }}</pre>
</body>
</html>
```

4. **Run the Web Server**

Ensure that all environment variables are exported and then run the Flask application:

```bash
export API_KEY="$API_KEY"
export ENDPOINT="$ENDPOINT"
export PGUSER="$PGUSER"
export PGHOST="$PGHOST"
export PGPASSWORD="$PGPASSWORD"
export PGDATABASE="$PGDATABASE"

python app.py
```

The web interface will be accessible at `http://localhost:5000`. You can ask questions about Zytonium through the browser.

## Next Steps

- Explore more features of [Azure Cognitive Search](https://learn.microsoft.com/azure/search/search-what-is-azure-search).
- Learn how to [use Azure OpenAI with your data](https://learn.microsoft.com/azure/cognitive-services/openai/use-your-data).
<!-- ## Run Chat bot

This final step initializes the chatbot in your terminal. You can ask it questions about Zytonium and it will use the embeddings in the postgres database to augment your query with relevant context before sending it to the LLM model.

```bash
echo "Ask the chatbot a question about Zytonium!"
```
To run the chatbot, paste this following command to the terminal: `cd ~/scenarios/PostgresRagLlmDemo && python chat.py --api-key $API_KEY --endpoint $ENDPOINT --pguser $PGUSER --phhost $PGHOST --pgpassword $PGPASSWORD --pgdatabase $PGDATABASE`

```bash
python chat.py --api-key $API_KEY --endpoint $ENDPOINT --pguser $PGUSER --phhost $PGHOST --pgpassword $PGPASSWORD --pgdatabase $PGDATABASE
``` -->
echo "
To run the chatbot, see the last step for more info.
"
```
3 changes: 1 addition & 2 deletions scenarios/PostgresRAGLLM/requirements.txt
@@ -1,5 +1,4 @@
azure-identity==1.17.1
openai==1.55.3
psycopg2==2.9.9
langchain-text-splitters==0.2.2
Flask==2.3.2
langchain-text-splitters==0.2.2
13 changes: 0 additions & 13 deletions scenarios/PostgresRAGLLM/templates/index.html

This file was deleted.
