A little terminal-based program that lets you interact with LLMs available via Amazon Bedrock.
- You will need an AWS account
- You will need to enable the LLMs you wish to use in Amazon Bedrock via the Model Access page in the AWS Console. The default LLMs for both the `chat` and `prompt` commands are provided by Anthropic, so it is recommended to enable those as a starting point.
- You will need to install the AWS CLI tool and run `aws configure` from the command line to set up credentials.
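If you prefer not to store credentials with `aws configure`, the standard AWS SDK credential chain should also work; as a sketch (a typical setup, not something chat-cli documents explicitly), you can export credentials as environment variables instead:

```
export AWS_ACCESS_KEY_ID=<your-access-key-id>
export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
export AWS_REGION=us-east-1   # pick a region where Bedrock is available
```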
At this time you can install chat-cli via pre-packaged binaries (thanks to GoReleaser!) for your operating system/architecture combination of choice.
- Head to https://github.com/chat-cli/chat-cli/releases/latest to find the binary for your setup.
- Download and unzip to find a pre-compiled binary file that should work on your system.
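For example, on macOS or Linux the install steps look roughly like this (the archive name below is illustrative; use the file you actually downloaded):

```
unzip chat-cli_<os>_<arch>.zip    # or tar -xzf for .tar.gz archives
chmod +x chat-cli                 # make sure the binary is executable
sudo mv chat-cli /usr/local/bin/  # optional: put it on your PATH
```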
If you have Homebrew installed on your system you can run the following two commands:

```
brew tap chat-cli/chat-cli
brew install chat-cli
```
Notes:
- You won't need Go installed on your system to use the pre-packaged binaries or Homebrew.
- These are currently unsigned binary files. For most systems this will not be an issue, but on macOS you will need to follow these instructions.
You will need Go v1.22.1 installed on your system. You can type `go version` to ensure you have the correct version installed.
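For example, on an Apple Silicon Mac the output might look like this (the OS/architecture suffix will vary on your machine):

```
$ go version
go version go1.22.1 darwin/arm64
```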
To build the project from source, clone this repository to your local machine and use Make to build the binary:

```
git clone git@github.com:go-micah/chat-cli.git
cd chat-cli
make
```
To run the program from within the same directory, use the following command syntax:

```
./bin/chat-cli <command> <args> <flags>
```

If you downloaded a pre-packaged binary or used Homebrew to install, your path will be different. You can add the binary to your PATH (Homebrew does this for you), and then you can simply run:

```
chat-cli <command> <args> <flags>
```
You can get help at any time with the `--help` flag. Typing `--help` after any command will display the args and flags available for that command.
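For example, to see everything the `prompt` command accepts:

```
chat-cli prompt --help
```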
There are currently three ways to interact with foundation models through this interface:
- Send a single prompt to an LLM from the command line using the `prompt` command
- Start an interactive chat with an LLM using the `chat` command
- Generate an image with the `image` command
You can send a one-liner prompt like this:

```
chat-cli prompt "How are you today?"
```
You can also read in a file from `stdin` as part of your prompt like this:

```
cat myfile.go | chat-cli prompt "explain this code"
```

or

```
chat-cli prompt "explain this code" < myfile.go
```

This will add `<document></document>` tags around your document ahead of your prompt. This syntax works especially well with Anthropic Claude; other models may produce different results.
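Conceptually, the text sent to the model then looks something like the sketch below (the exact formatting is an internal detail and may differ):

```
<document>
...contents of myfile.go...
</document>

explain this code
```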
An interactive chat session will remember your conversation as you chat back and forth with the LLM. You can start one like this:

```
chat-cli chat
```
- Type `quit` to exit the interactive chat session.
Starting a session with the `chat-cli chat` command will automatically save your chats to a local SQLite database. If you would like to restore a prior chat session, you can do so in the following way:
Start by using the `chat list` command to list the 10 most recent chat sessions:

```
chat-cli chat list
```
This will print a list that looks something like the following:

```
❯ go run main.go chat list
2024-12-17T04:29:59Z | 9be2adda-5966-45c9-8a07-f7a7d486ca36 | How do I get started with AWS?
2024-12-17T04:25:53Z | 07927821-f443-4e92-84c6-86d6fa30ebf2 | What't the best way to decide which car
2024-12-17T04:23:57Z | 6ecdece8-9547-4b8b-9f36-2b92df2f84d6 | What is the best way to decide on which
2024-12-16T04:29:09Z | 879c2dd7-ba3d-4f59-a576-a1ce556ceb4e | What do you know about optics?
2024-12-16T04:28:52Z | 3a51ea83-93df-4af4-a1b3-d1ce89d845d9 | What can you tell me about electronics?
2024-12-16T04:25:14Z | e16d52a8-83a9-4dc6-8e74-e41610689a9e | What is a Go package for printing markdo
2024-12-16T04:24:35Z | 7c4764e1-029d-4ebe-a7d6-43ef230e5117 | Can you help me write a poem about dogs?
2024-12-15T05:25:14Z | 5b2c9fb0-9ed4-4616-90be-b482bc640f8c | Can you summarize what you know about Gi
2024-12-15T05:24:04Z | 042ce5bc-a693-4e8b-9db6-eb4834b5dbac | What do you know about the Go programmin
2024-12-15T04:28:47Z | 56614689-356c-4d54-bb2c-10bd5af56b93 | How are you today?
```
Find the `chat-id` that corresponds to the chat session you would like to load and copy it to your clipboard. Once copied, you can load that chat session like this:

```
chat-cli chat --chat-id 9be2adda-5966-45c9-8a07-f7a7d486ca36
```

This will print out the saved chat and leave you at a prompt where you can pick up where you left off. Future chats will continue to save under the same `chat-id` as you go.
Please note: eventually your chat session will accumulate a very large prompt context. Depending on the LLM you are using, you may exceed its context window and get an error. Consider starting a new session when your chat gets very long!
You can get a list of all supported models in your current region like this:

```
chat-cli models list
```

Please note: this is the full list of all possible models. You will still need to enable access for any models you'd like to use.
Currently, all text-based LLMs available through Amazon Bedrock are supported, provided they have been enabled within Amazon Bedrock.
To switch LLMs, use the `--model-id` flag. You can supply the exact model ID from the list above like so:

```
chat-cli prompt "How are you today?" --model-id cohere.command-text-v14
```
By default, responses will stream to the command line as they are generated. This can be disabled using the `--no-stream` flag with the `prompt` command. Not all models offer a streaming response capability.

You can disable streaming like this:

```
chat-cli prompt "What is event driven architecture?" --no-stream
```

Only models capable of streaming responses can be used with the `chat` command.
There are several flags you can use to override the default config settings. Not all config settings are used by each model.

- `--max-tokens` defaults to 500
- `--temperature` defaults to 1.0
- `--topP` defaults to 0.999
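For example, to ask for a longer and more deterministic response (the values here are just illustrative):

```
chat-cli prompt "Summarize the history of Unix" --max-tokens 1000 --temperature 0.2
```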
Some LLMs support uploading an image. Images can be either PNG or JPG and must be less than 5MB. To upload an image, do the following:

```
chat-cli prompt "Explain this image" --image IMG_1234.JPG
```

Please note this only works with supported models.
With the `image` command you can generate images with any supported foundation model. Simply follow the syntax below:

```
chat-cli image "Generate an image of a cat eating cereal"
```
You can specify the model with the `--model-id` flag set to the model's full model ID or family name. You can also specify an output filename with the `--filename` flag.
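For example (the model ID below is Amazon's Titan Image Generator; treat it as an assumption and confirm availability with `chat-cli models list`):

```
chat-cli image "Generate an image of a cat eating cereal" --model-id amazon.titan-image-generator-v1 --filename cat.png
```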