From 203a7f8591eb8d654cc0a705f7d2292d6c385dc5 Mon Sep 17 00:00:00 2001
From: James Busche
Date: Mon, 14 Oct 2024 14:26:08 -0700
Subject: [PATCH] Some small usability suggestions

Signed-off-by: James Busche
---
 docs/lab-1/README.md    |  2 +-
 docs/lab-2/README.md    |  9 ++++++---
 docs/pre-work/README.md | 13 +++++++++++++
 3 files changed, 20 insertions(+), 4 deletions(-)

diff --git a/docs/lab-1/README.md b/docs/lab-1/README.md
index 081b41e..e061893 100644
--- a/docs/lab-1/README.md
+++ b/docs/lab-1/README.md
@@ -23,7 +23,7 @@ Many open LLMs available today license the model itself for derivative work, but
 Granite Code comes in a wide range of sizes to fit your workstation's available resources. Generally, the bigger the model, the better the results, with a tradeoff: model responses will be slower, and it will take up more resources on your machine. We chose the 20b option as my starting point for chat and the 8b option for code generation. Ollama offers a convenient pull feature to download models:
 
-Open up your terminal, and run the following commands:
+Open up a second terminal, and run the following commands:
 
 ```bash
 ollama pull granite-code:20b
diff --git a/docs/lab-2/README.md b/docs/lab-2/README.md
index 067db51..40a0884 100644
--- a/docs/lab-2/README.md
+++ b/docs/lab-2/README.md
@@ -11,7 +11,7 @@
 Before we go any farther, write in "Who is batman?" to verify that `ollama`, VSCode, and `continue` are all working correctly.
 
 !!! troubleshooting
-    If Continue is taking a long time to respond, restart Visual Studio Code. If that doesn't resolve your issue, restart Ollama.
+    If Continue is taking a long time to respond, make sure your terminal with `ollama serve` is still running. If Ollama is running, restart Visual Studio Code. If that doesn't resolve your issue, restart Ollama.
 
 If you would like to go deeper with `continue`, take a look at the [official Continue.dev how-to guide](https://docs.continue.dev/how-to-use-continue).
 Its worth taken the moment if you want, otherwise, when you get home and try this on your own
@@ -27,6 +27,9 @@ this technology is there to support you, not _do_ your work.
 Now, lets open up VSCode and have it look something like the following:
 
 ![batman](../images/whoisbatman.png)
 
+!!! troubleshooting
+    If you lose the Continue pane in VSCode, you can re-enable it by clicking "View --> Appearance --> Secondary Side Bar" at the top of the screen; the Continue window will then be visible again.
+
 ## Building out `main.py`
 
 Now create a new file, and put it in a new directory. Normally it's `ctrl-n` or `command-n` call it
@@ -69,7 +72,7 @@ example we need to to get the code fixed.
 
 ## First pass at debugging
 
-I'll run the following commands to build up an virtual environment, and install some modules, lets
+We'll run the following commands to build up a virtual environment and install some modules; let's
 see how far we get.
 
 !!! tip
@@ -115,7 +118,7 @@ For me, all I had to do was remove those extra spaces, but I'd be curious to kno
 
 ## Second pass at debugging
 
-Now that I've clean it up, and it seems I had to do some importing:
+Now that I've cleaned it up, it seems I had to do some importing:
 
 ```python
 import tkinter
diff --git a/docs/pre-work/README.md b/docs/pre-work/README.md
index d3b105b..f68db16 100644
--- a/docs/pre-work/README.md
+++ b/docs/pre-work/README.md
@@ -34,6 +34,11 @@ After the installation is complete, install [ollama](https://ollama.com) via `br
 brew install ollama
 ```
 
+Then start the Ollama server in a terminal window:
+```bash
+ollama serve
+```
+
 ### Windows installation steps
 
 Install ollama via the website [here](https://ollama.com/download/windows).
@@ -54,6 +59,14 @@ After the installation is complete, install [vscode](https://code.visualstudio.c
 ```bash
 brew install --cask visual-studio-code
 ```
 
+Then start up VSCode:
+
+```bash
+mkdir ~/INSTRUCTLAB
+cd ~/INSTRUCTLAB
+code .
+```
+
 ### Windows installation steps
 
 Install Code via the website [here](https://code.visualstudio.com/Download).
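
The pre-work changes above assume the `ollama serve` terminal stays running for the rest of the labs. As a quick sanity check (a suggestion of mine, not part of the patch), Ollama's HTTP server listens on port 11434 by default and answers its root endpoint with "Ollama is running":

```shell
# Query Ollama's default HTTP endpoint; if the server isn't up
# (e.g. `ollama serve` was never started), print a clear fallback message.
curl -s --max-time 2 http://localhost:11434/ || echo "Ollama is not running"
```

If this prints "Ollama is running", the `ollama pull` and Continue steps in the patched labs should work without the restarts the troubleshooting notes describe.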