
Releases: eoctet/Octet.Chat

v1.2.2

18 Oct 12:53
0f6abd3
  1. Update inference generator.
  2. Update llama.cpp libs version to b1395.

v1.2.1

17 Oct 17:25
9f4fa4f
  1. Update the tensor_split parameter (see the sketch below).
  2. Update Java docs.
  3. Update llama.cpp libs version to b1387.
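
In llama.cpp, tensor_split is a list of proportions describing how much of the model each GPU should hold. A minimal sketch of that mapping in plain Java, using hypothetical values; the parameter name comes from llama.cpp, but the class below is not part of this library:

```java
// Illustrative only: how a tensor_split array maps model layers to GPUs.
// The real parameter is passed through to llama.cpp; the names used by
// Octet.Chat for model parameters may differ.
public class TensorSplitSketch {
    public static void main(String[] args) {
        float[] tensorSplit = {0.6f, 0.4f}; // proportions per GPU (hypothetical)
        int totalLayers = 40;               // hypothetical layer count

        float sum = 0f;
        for (float f : tensorSplit) sum += f;

        int assigned = 0;
        for (int gpu = 0; gpu < tensorSplit.length; gpu++) {
            int layers = Math.round(totalLayers * tensorSplit[gpu] / sum);
            layers = Math.min(layers, totalLayers - assigned);
            System.out.printf("GPU %d -> %d layers%n", gpu, layers);
            assigned += layers;
        }
    }
}
```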

v1.2.0

16 Oct 04:03
  1. Add model prompt templates (see the sketch below).
  2. Update Java docs.
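
A prompt template wraps the user's message in the control tokens a given model was trained with. A minimal sketch assuming a Llama-2-chat style format; the templates actually bundled with this release may use different names and tokens:

```java
// Illustrative only: what a model prompt template does. The concrete
// templates shipped by the library are not reproduced here.
public class PromptTemplateSketch {
    // A Llama-2-chat style template, used purely as an example format.
    static String format(String system, String user) {
        return "<s>[INST] <<SYS>>\n" + system + "\n<</SYS>>\n\n" + user + " [/INST]";
    }

    public static void main(String[] args) {
        System.out.println(format("You are a helpful assistant.", "Hello!"));
    }
}
```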

v1.1.9

15 Oct 05:58
  1. Update llama.cpp libs version to b1381.

v1.1.8

13 Oct 15:27
  1. Update llama.cpp libs version to b1380.

v1.1.7

12 Oct 06:56
  1. Update llama.cpp libs version to b1369.

v1.1.6

08 Oct 07:26
  1. Update llama.cpp libs version to b1345.
  2. Update continuous generation and chat.

v1.1.5

06 Oct 12:28
  1. Fix decoding failure.
  2. Add continuous chat session (see the sketch below).
  3. Update batch decode.
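
A continuous chat session keeps earlier turns and feeds them back as part of the next prompt. A minimal sketch of that loop, with generate() as a stand-in for the library's real decoder rather than its actual API:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Scanner;

// Illustrative only: the shape of a continuous chat session. generate() is a
// stub; the real inference call belongs to the library and is not shown here.
public class ChatSessionSketch {
    private final Deque<String> history = new ArrayDeque<>();

    String generate(String prompt) {
        return "(model reply to: " + prompt + ")"; // stub for the real decoder
    }

    String chat(String userInput) {
        history.addLast("User: " + userInput);
        String prompt = String.join("\n", history) + "\nAssistant:";
        String reply = generate(prompt);
        history.addLast("Assistant: " + reply);
        return reply;
    }

    public static void main(String[] args) {
        ChatSessionSketch session = new ChatSessionSketch();
        try (Scanner in = new Scanner(System.in)) {
            while (in.hasNextLine()) {
                System.out.println(session.chat(in.nextLine()));
            }
        }
    }
}
```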

v1.1.4

04 Oct 03:22
  1. Update llama.cpp libs version to b1317.
  2. Update llama grammar support (see the grammar sketch below).
  3. Add batch decode support.
  4. Fix rope_freq_base default value.
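
llama.cpp's grammar support constrains sampling with a GBNF grammar. A minimal example of such a grammar, restricting output to "yes" or "no"; how the grammar is wired into this library's generator is not shown here:

```java
// Illustrative only: a small GBNF grammar of the kind llama.cpp's grammar
// support accepts. Passing it to the generator is library-specific.
public class GrammarSketch {
    static final String YES_NO_GRAMMAR =
            "root ::= answer\n" +
            "answer ::= \"yes\" | \"no\"\n";

    public static void main(String[] args) {
        System.out.println(YES_NO_GRAMMAR);
    }
}
```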

v1.1.3

29 Sep 12:10
  1. Update llama.cpp libs version to b1292.
  2. Update LlamaService API.
  3. Add conversation memory to the prompt (see the sketch below).
  4. Optimize code structure.
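
Conversation memory means past turns are folded back into the prompt, usually trimmed so the context window is not exceeded. A minimal sketch under that assumption, with a rough character budget standing in for a token budget; the library's actual memory handling may differ:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: one way conversation memory can be folded into the prompt,
// keeping only as many recent turns as fit a rough character budget.
public class ConversationMemorySketch {
    private final List<String> turns = new ArrayList<>();
    private final int budgetChars;

    ConversationMemorySketch(int budgetChars) { this.budgetChars = budgetChars; }

    void remember(String turn) { turns.add(turn); }

    String buildPrompt(String question) {
        StringBuilder memory = new StringBuilder();
        // Walk backwards so the most recent turns survive trimming.
        for (int i = turns.size() - 1; i >= 0; i--) {
            if (memory.length() + turns.get(i).length() > budgetChars) break;
            memory.insert(0, turns.get(i) + "\n");
        }
        return memory + "User: " + question + "\nAssistant:";
    }

    public static void main(String[] args) {
        ConversationMemorySketch memory = new ConversationMemorySketch(200);
        memory.remember("User: Hi\nAssistant: Hello!");
        System.out.println(memory.buildPrompt("What did I just say?"));
    }
}
```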