Releases: eoctet/Octet.Chat
v1.2.2
- Update inference generator.
- Update llama.cpp libs version to b1395.
v1.2.1
- Update tensor_split param.
- Update Java docs.
- Update llama.cpp libs version to b1387.
v1.2.0
- Add model prompt templates.
- Update Java docs.
v1.1.9
- Update llama.cpp libs version to b1381.
v1.1.8
- Update llama.cpp libs version to b1380.
v1.1.7
- Update llama.cpp libs version to b1369.
v1.1.6
- Update llama.cpp libs version to b1345.
- Update continuous generation and chat.
v1.1.5
- Fix decoding failure.
- Add continuous chat sessions.
- Update batch decode.
v1.1.4
- Update llama.cpp libs version to b1317.
- Update llama grammar support.
- Add batch decode support.
- Fix rope_freq_base default value.
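The llama grammar support mentioned in v1.1.4 refers to llama.cpp's GBNF (GGML BNF) grammar format, which constrains token generation to match a user-defined grammar. A minimal illustrative grammar, not taken from this repository, that restricts the model's output to a yes/no answer might look like:

```
# Hypothetical GBNF sketch: limit generation to "yes" or "no"
root ::= ("yes" | "no") "\n"
```

Passing such a grammar to the decoder guarantees the sampled tokens form a string accepted by the grammar, which is useful for structured outputs such as JSON or constrained classification answers.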
v1.1.3
- Update llama.cpp libs version to b1292.
- Update LlamaService API.
- Add conversation memory to the prompt.
- Optimize code structure.