
Releases: simonw/llm-mlc

0.5

15 Aug 04:22
  • New -o max_gen_len 100 option for setting the maximum length of the generated text. #10
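The new option is passed like the other model options shown in the 0.4 notes below. A minimal sketch (the prompt is illustrative, and it assumes the Llama-2-7b-chat model has already been downloaded):

```shell
# Cap the response at 100 generated tokens
llm -m Llama-2-7b-chat \
  -o max_gen_len 100 \
  'a short poem about a pet ferret'
```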

0.4

14 Aug 00:21
afbafe8
  • The llm mlc download-model command now takes zero or more -a/--alias options to configure aliases for the model once it has been installed. #4:
    llm mlc download-model Llama-2-7b-chat --alias llama2
  • Installation instructions are clearer, and show how to install required dependencies first. #6
  • The plugin no longer crashes llm if it cannot find the dist/prebuilt folder. #9
  • New options for temperature, top_p and repetition_penalty: #7
    llm -m Llama-2-7b-chat \
      -o temperature 0.5 \
      -o top_p 0.9 \
      -o repetition_penalty 0.9 \
      'five names for a cute pet ferret'

0.3

12 Aug 05:26
5792acb
  • Conversation mode now works, so you can continue a conversation with an MLC model with llm -c "follow-up prompt". #3
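In practice that pairs an initial prompt with one or more -c follow-ups. A hypothetical session (both prompts are illustrative, assuming Llama-2-7b-chat is installed):

```shell
# Start a conversation with an MLC model
llm -m Llama-2-7b-chat 'five names for a cute pet ferret'
# Continue the same conversation with a follow-up prompt
llm -c 'now describe each one in a single sentence'
```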

0.2

12 Aug 04:52
c400925
  • Token streaming now works. #2

0.1a0

12 Aug 02:07
Pre-release
  • Initial release. Tools for installing and running MLC models using LLM. #1