Releases: eoctet/Octet.Chat
v1.4.2
🎉 Refactored function calls & optimized the chat formatter and APIs.
- Refactored function calls and added function call APIs, enabling adaptation to more models.
- Optimized API parameters to follow the OpenAPI v2 style.
- Added support for custom Jinja templates for chat.
- Updated llama-java libs.
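The custom chat-template support can be illustrated with a minimal, stdlib-only sketch. This is not the project's actual API: the real implementation renders Jinja templates (via Jinjava), while this sketch hard-codes the common ChatML layout that such templates typically produce.

```java
import java.util.List;

// Minimal sketch of chat-template rendering. Assumption: the real project
// renders a configurable Jinja template; this version fixes the ChatML
// layout purely for illustration.
public class ChatTemplateSketch {

    // A chat message: role ("system", "user", "assistant") plus content.
    public record Message(String role, String content) {}

    // Render messages in ChatML form:
    //   <|im_start|>{role}\n{content}<|im_end|>\n
    // then open an assistant turn for the model to complete.
    public static String render(List<Message> messages) {
        StringBuilder sb = new StringBuilder();
        for (Message m : messages) {
            sb.append("<|im_start|>").append(m.role()).append('\n')
              .append(m.content()).append("<|im_end|>\n");
        }
        sb.append("<|im_start|>assistant\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        String prompt = render(List.of(
                new Message("system", "You are a helpful assistant."),
                new Message("user", "Hello!")));
        System.out.print(prompt);
    }
}
```

A custom template would change only the markup around each message, which is why moving the layout into a user-supplied Jinja template makes the formatter model-agnostic.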
v1.4.1
🎉 Optimized chat formatter and Windows CLI.
- Refactor the chat message formatting module to use Jinjava.
- Add a new native API: `llamaModelMeta`.
- Optimize model parameters.
- Fix Windows CLI output formatting error.
- Update Maven dependencies.
- Update llama-java libs.
v1.4.0
☕️ LLaMA-Java-Core
🤖 Octet-Chat-App
- Update Maven dependencies.
- Update model config loading.
v1.3.9
☕️ LLaMA-Java-Core
- Support dynamic temperature sampling.
- Update llama-java libs.
🤖 Octet-Chat-App
- Optimize auto agent.
- Add OpenAPI docs.
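Dynamic temperature sampling scales the sampling temperature with the entropy of the token distribution: confident (low-entropy) steps sample colder, uncertain steps sample hotter. A sketch of the idea, mirroring the common "dynatemp" formulation from llama.cpp rather than this project's exact code (parameter names are illustrative):

```java
// Sketch of entropy-based dynamic temperature. Assumption: temperature is
// interpolated between minTemp and maxTemp by normalized entropy, as in the
// widely used "dynatemp" scheme; not the project's actual implementation.
public class DynaTempSketch {

    // Shannon entropy of a probability distribution, in nats.
    static double entropy(double[] probs) {
        double h = 0.0;
        for (double p : probs) {
            if (p > 0) h -= p * Math.log(p);
        }
        return h;
    }

    // Map entropy into [minTemp, maxTemp]: a peaked (low-entropy)
    // distribution gets a low temperature, a flat one a high temperature.
    static double dynamicTemperature(double[] probs, double minTemp,
                                     double maxTemp, double exponent) {
        double maxEntropy = Math.log(probs.length); // entropy of the uniform case
        double norm = maxEntropy > 0 ? entropy(probs) / maxEntropy : 0.0;
        return minTemp + (maxTemp - minTemp) * Math.pow(norm, exponent);
    }

    public static void main(String[] args) {
        double[] peaked = {0.97, 0.01, 0.01, 0.01};
        double[] flat   = {0.25, 0.25, 0.25, 0.25};
        System.out.printf("peaked -> %.3f%n",
                dynamicTemperature(peaked, 0.2, 1.2, 1.0));
        System.out.printf("flat   -> %.3f%n",
                dynamicTemperature(flat, 0.2, 1.2, 1.0));
    }
}
```

The exponent parameter lets the mapping favor low temperatures except when the model is genuinely uncertain.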
v1.3.8
☕️ LLaMA-Java-Core
🤖 Octet-Chat-App
- Add WebUI support.
- Rename the project.
- Optimize the open APIs.
- Fix an API response parsing issue.
v1.3.7
☕️ LLaMA-Java-Core
- Optimize components.
- Optimize the CMake build & fix metallib loading failure.
- Update the handling of special tokens.
- Update chatml prompt template.
- Optimize generation parameters.
- Update llama-java libs.
🤖 LLaMA-Java-App
- Add AI agent support.
- Add plugin modules, such as Api and Datetime.
- Add automated QA testing.
- Update characters config.
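The plugin modules for the AI agent might look roughly like the following hypothetical interface. The `Plugin`/`execute` names are illustrative assumptions, not the project's actual API:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical sketch of an agent plugin interface. Assumption: each plugin
// exposes a name the agent can route tool calls to, plus an execute method;
// the real project's contract may differ.
public class PluginSketch {

    interface Plugin {
        String name();
        String execute(String input);
    }

    // A Datetime-style plugin: ignores its input and returns the current
    // local timestamp in ISO-8601 form.
    static class DatetimePlugin implements Plugin {
        public String name() { return "Datetime"; }
        public String execute(String input) {
            return LocalDateTime.now()
                    .format(DateTimeFormatter.ISO_LOCAL_DATE_TIME);
        }
    }

    public static void main(String[] args) {
        Plugin p = new DatetimePlugin();
        System.out.println(p.name() + " -> " + p.execute(""));
    }
}
```

An Api-style plugin would implement the same interface but perform an HTTP call in `execute`, which is what makes the agent's tool dispatch uniform.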
v1.3.6
☕️ LLaMA-Java-Core
- Fix multibyte decoding failures.
- Update llama-java libs.
🤖 LLaMA-Java-App
v1.3.5
☕️ LLaMA-Java-Core
- Add the `split_mode` parameter.
- Optimize all parameter formats.
- Fix incorrect construction methods in completion mode.
- Update llama-java libs.
v1.3.4
☕️ LLaMA-Java-Core
- Update llama-java libs.
- Fix missing project in Maven deploy.
Note: Please use the latest version of the Java library.