From 52e11de52b0f7fbee77ea4c428152d4ffb7b4f92 Mon Sep 17 00:00:00 2001
From: Hansong Zhang
Date: Thu, 19 Sep 2024 11:11:43 -0700
Subject: [PATCH] Update GH link in docs

---
 examples/demo-apps/android/LlamaDemo/README.md | 12 ++++++------
 examples/demo-apps/apple_ios/LLaMA/README.md   |  8 ++++----
 2 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/examples/demo-apps/android/LlamaDemo/README.md b/examples/demo-apps/android/LlamaDemo/README.md
index 397d677a5b7..41b030cef06 100644
--- a/examples/demo-apps/android/LlamaDemo/README.md
+++ b/examples/demo-apps/android/LlamaDemo/README.md
@@ -46,7 +46,7 @@ Below are the UI features for the app.
Select the settings widget to get started with picking a model, its parameters and any prompts.

- +

@@ -55,7 +55,7 @@ Select the settings widget to get started with picking a model, its parameters a
Once you've selected the model, tokenizer, and model type, you are ready to click on "Load Model" to have the app load the model and go back to the main Chat activity.

- +

@@ -87,12 +87,12 @@ int loadResult = mModule.load();
### User Prompt
Once the model is successfully loaded, enter any prompt and click the send (i.e. generate) button to send it to the model.

- +

You can ask it follow-up questions as well.

- +

> [!TIP]
@@ -109,14 +109,14 @@ mModule.generate(prompt,sequence_length, MainActivity.this);
For the LLaVA-1.5 implementation, select the exported LLaVA .pte and tokenizer file in the Settings menu and load the model. After this, you can send an image from your gallery or take a live picture along with a text prompt to the model.

- +

### Output Generated
To show completion of the follow-up question, here is the complete detailed response from the model.

- +

> [!TIP]
diff --git a/examples/demo-apps/apple_ios/LLaMA/README.md b/examples/demo-apps/apple_ios/LLaMA/README.md
index a2316fd437e..3f3214e5b49 100644
--- a/examples/demo-apps/apple_ios/LLaMA/README.md
+++ b/examples/demo-apps/apple_ios/LLaMA/README.md
@@ -58,11 +58,11 @@ For more details integrating and Running ExecuTorch on Apple Platforms, checkout
* Ensure that the ExecuTorch package dependencies are installed correctly, then select which ExecuTorch framework should link against which target.

-iOS LLaMA App Swift PM
+iOS LLaMA App Swift PM

-iOS LLaMA App Choosing package
+iOS LLaMA App Choosing package

* Run the app. This builds and launches the app on the phone.
@@ -82,13 +82,13 @@ For more details integrating and Running ExecuTorch on Apple Platforms, checkout
If the app runs successfully on your device, you should see something like the following:

-iOS LLaMA App
+iOS LLaMA App

For LLaVA 1.5 models, you can select an image (via the image/camera selector button) before typing a prompt and tapping the send button.

-iOS LLaMA App
+iOS LLaMA App

## Reporting Issues
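
The Android README hunks in this patch quote two Java calls, `int loadResult = mModule.load();` and `mModule.generate(prompt,sequence_length, MainActivity.this);`. The sketch below shows how those two calls fit together in a load-then-generate flow. It is a minimal sketch only: the `org.pytorch.executorch` package path, the `LlamaModule` constructor arguments, and the `LlamaCallback` method names are assumptions that may differ between ExecuTorch releases, so check the LlamaDemo sources for the exact API.

```java
// Sketch only: package path, constructor arguments, and callback method names are
// assumptions; load() and generate() are the calls quoted in the README diff context.
import org.pytorch.executorch.LlamaCallback;
import org.pytorch.executorch.LlamaModule;

public class LlamaRunnerSketch implements LlamaCallback {
    private LlamaModule mModule;

    public void run(String modelPath, String tokenizerPath, String prompt, int sequenceLength) {
        // Assumed constructor: exported .pte path, tokenizer path, sampling temperature.
        mModule = new LlamaModule(modelPath, tokenizerPath, 0.8f);

        // From the README context: load() returns a status code; 0 is assumed to mean success.
        int loadResult = mModule.load();
        if (loadResult != 0) {
            throw new RuntimeException("Failed to load model, status code: " + loadResult);
        }

        // From the README context: generate() streams tokens back through a LlamaCallback
        // (the demo passes MainActivity.this, which implements the callback interface).
        mModule.generate(prompt, sequenceLength, this);
    }

    @Override
    public void onResult(String token) {
        // Each generated token arrives here; the demo appends it to the chat view.
        System.out.print(token);
    }

    @Override
    public void onStats(float tokensPerSecond) {
        // Generation speed reported after decoding.
        System.out.println("\ntokens/s: " + tokensPerSecond);
    }
}
```

Because the callback receives tokens as they are generated, the chat activity can stream the response into the UI instead of waiting for the full completion.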