
0.8.0

@trufae released this 11 Jul 16:58
· 174 commits to master since this release

What's Changed

Packaging:

  • Fix installation and usage; requires the latest r2pipe
  • Separate r2pm packages for r2ai and r2ai-plugin
  • r2pm -r r2ai-server to launch llamafile, llama and koboldcpp servers for use with the openapi backend (see the example below)
  • Use the latest Hugging Face API
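
A minimal sketch of the new packaging flow, assuming r2pm's standard -ci (clean-install) flag; only r2pm -r r2ai-server is taken verbatim from this release:

    # install the now-separate r2pm packages
    r2pm -ci r2ai
    r2pm -ci r2ai-plugin
    # launch llamafile, llama or koboldcpp locally for the openapi backend
    r2pm -r r2ai-server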

Commands:

  • -M and -MM for the short and long lists of supported models
  • -L-# to delete the last N messages from the chat history
  • Add 'decai', an AI-based decompiler, as an r2js script
  • -w webserver supports tabbyml, ollama, openapi, r2pipe and llamaserver REST endpoints
  • -ed command edits the r2ai.rc file with $EDITOR
  • ?t command to benchmark commands
  • ?V command now correctly reports the versions of r2, r2ai and llama (see the session sketch below)
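
A short interactive sketch of the new commands; the prompt, the editor opened by -ed and the ?t argument syntax are illustrative assumptions, while the flags themselves come from the list above:

    [r2ai] -M        # short list of supported models
    [r2ai] -MM       # long list of supported models
    [r2ai] -L-3      # delete the last 3 messages from the chat history
    [r2ai] -ed       # open r2ai.rc in $EDITOR
    [r2ai] ?t -M     # benchmark a command (argument syntax assumed)
    [r2ai] ?V        # show the r2, r2ai and llama versions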

Backends:

  • -e llm.gpu to select between CPU and GPU (see the config sketch after this list)
  • OpenAPI backend (HTTP requests to llamaserver)
  • Support ollama servers
  • Larger context window by default (32K)
  • top_p and top_k parameters can now be tweaked
  • The latest llamapy supports the latest gemma2 models
  • Add support for Google Gemini API by @dnakov in #24
  • docs: update README.md by @eltociear in #27
  • Package updates by @dnakov in #29
  • Fix anthropic chat mode by @dnakov in #31
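
A hedged configuration sketch for the backend options above; -e llm.gpu comes from this release, while the llm.top_p and llm.top_k key names and the key=value syntax are assumptions rather than confirmed identifiers:

    -e llm.gpu=false     # run the local backend on the CPU (set true for GPU)
    -e llm.top_p=0.9     # nucleus-sampling cutoff (key name assumed)
    -e llm.top_k=40      # top-k sampling cutoff (key name assumed)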

Contributors

@trufae, @dnakov and @eltociear

Full Changelog: 0.7.0...0.8.0