Plugin for [LLM](https://llm.datasette.io/) adding support for Meta Llama 2 and Llama 3 models in Amazon Bedrock.
Install this plugin in the same environment as LLM:

```bash
llm install llm-bedrock-meta
```
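After installation, the plugin should appear in the output of LLM's standard plugin listing command:

```bash
llm plugins
```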
You will need to configure AWS credentials and region using the standard boto3 mechanisms, such as environment variables or shared configuration profiles.

For example, to use the region `us-west-2` and the AWS credentials stored under the `personal` profile, set these environment variables:

```bash
export AWS_DEFAULT_REGION=us-west-2
export AWS_PROFILE=personal
```
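If you have the AWS CLI installed, you can check that these settings resolve to valid credentials before calling Bedrock (the AWS CLI is not required by the plugin; this is just a convenient sanity check):

```bash
aws sts get-caller-identity
```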
This plugin adds models called `bedrock-llama2-13b`, `bedrock-llama2-70b`, `bedrock-llama3-8b-instruct`, and `bedrock-llama3-70b-instruct`. You can also use aliases like `bl2` or `bl2-70`.
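To confirm the models are registered, you can list them with LLM's standard model listing command (the `grep` filter is just a convenience and assumes the model IDs appear in the output):

```bash
llm models list | grep -i llama
```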
You can query them like this:

```bash
llm -m bl3-70-i "Ten great names for a new space station"
```
You can also chat with the model:

```bash
llm chat -m bl3-70-i
```
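Outside of an interactive chat session, you can continue your most recent conversation from the regular prompt command using LLM's standard `-c` flag (a core LLM feature, not specific to this plugin):

```bash
llm -m bl3-70-i "Ten great names for a new space station"
llm -c "Pick your favorite of those and explain why"
```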
The following options are available:

- `-o max_gen_len 1024`, default 2_048: The maximum number of tokens to generate before stopping.
- `-o verbose 1`, default 0: Output more verbose logging.
- `-o temperature 0.8`, default 0.6: Use a lower value to decrease randomness in the response.
- `-o top_p`, default 0.9: Use a lower value to ignore less probable options. Set to 0 or 1.0 to disable.
Use like this:

```bash
llm -m bl3-70-i -o max_gen_len 20 "Sing me the alphabet"
```
```
Ah, ah, ah! Here's the alphabet song for you:

A, B, C, D, E, F, G,
H, I, J, K, L, M, N, O,
P, Q, R, S, T, U, V, W,
X, Y, Z,

Now I know my ABCs,
Next time won't you sing with me?

Hope that brought a smile to your face!
```
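Options can be combined by repeating `-o`. For example (the prompt here is hypothetical, shown only to illustrate passing multiple options at once):

```bash
llm -m bl3-70-i -o temperature 0.2 -o max_gen_len 256 "Summarize the plot of Hamlet in three sentences"
```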