Releases: radareorg/r2ai
0.8.4
What's Changed
- Add r2plugin Makefile system install by @prodrigestivill in #32
- Several fixes for standalone mode, add support for AWS bedrock models by @FernandoDoming in #33
New Contributors
- @prodrigestivill made their first contribution in #32
- @FernandoDoming made their first contribution in #33
Full Changelog: 0.8.2...0.8.4
0.8.2
Full Changelog: 0.8.0...0.8.2
- decai plugin: an r2ai-based decompiler for radare2
- Pin llamacpp version to avoid installation issues
- Fix autocompletion on Linux terminals
- Improved prompt
- Use chromadb instead of vectordb, and use it by default
- Add support for ddg (DuckDuckGo) web scraping
- Support JSON output
- Handle <<EOF for multiline inputs
- Add support for llama3
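The `<<EOF` multiline input works like a shell heredoc: everything up to the closing `EOF` marker is sent as a single multiline prompt. As a sketch of the same mechanism in plain shell (here with `cat` standing in for the r2ai REPL, which is an assumption about how the prompt is consumed):

```shell
# Heredoc: the lines between <<EOF and the EOF marker are passed
# as one multiline string; r2ai's <<EOF input follows this pattern.
cat <<EOF
explain this function
focusing on the error handling paths
EOF
```

In the r2ai REPL the closing `EOF` line ends the prompt and submits it in one request instead of line by line.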
0.8.0
What's Changed
Packaging:
- Fix installation and usage, requires latest r2pipe
- Separate r2pm packages for r2ai and r2ai-plugin
- r2pm -r r2ai-server to launch llamafile, llama and koboldcpp to use with the openapi backend
- Use the latest huggingface api
Commands:
- -M and -MM for the short and long lists of supported models
- -L-# - delete the last N messages from the chat history
- Add 'decai', an AI-based decompiler, as an r2js script
- -w webserver supports tabbyml, ollama, openapi, r2pipe and llamaserver rest endpoints
- -ed command edits the r2ai.rc file with $EDITOR
- ?t command to benchmark commands
- ?V command is now fixed to show the versions of r2, r2ai and llama
Backends:
- -e llm.gpu to use cpu or gpu
- OpenAPI (http requests for llamaserver)
- Support ollama servers
- Larger context window by default (32K)
- top_p and top_k parameters can now be tweaked
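Since `-ed` edits the `r2ai.rc` startup file and `-e llm.gpu` selects the compute device, these backend options can be made persistent there. A minimal sketch of such a file; the `llm.gpu` values and the exact key names for the sampling parameters (`top_p`/`top_k`) are assumptions, so verify them against the `-e` output of your r2ai build:

```
# r2ai.rc - commands run at startup (hypothetical key names)
-e llm.gpu=gpu
# sampling parameters mentioned above; key paths may differ per build
-e llm.top_p=0.9
-e llm.top_k=40
```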
- Latest llamapy supports the latest gemma2 models
Contributors
- Add support for Google Gemini API by @dnakov in #24
- docs: update README.md by @eltociear in #27
- Package updates by @dnakov in #29
- Fix anthropic chat mode by @dnakov in #31
Full Changelog: 0.7.0...0.8.0
0.7.0
What's Changed
- Add a few tests and run the linter in the CI
- Use llama3 model by default
- Add TAB autocompletion
- Better support for Python's pip and venv
- Add support for user plugins via the '..' command
- r2ai -repl implemented in rlang and r2pipe modes
Full Changelog: 0.6.1...0.7.0
0.6.0
What's Changed
- [WIP] Auto mode by @dnakov in #3
- Fix live code/message blocks and simplify the code by @trufae in #6
- change :auto to ' by @dnakov in #7
- support chatml-function-calling via llama-cpp by @dnakov in #4
- Support functionary models for auto mode by @dnakov in #10
- Support for anthropic claude + tools by @dnakov in #11
- Support Hermes-2-Pro and auto mode by @dnakov in #13
- Add anthropic claude3 haiku model by @dnakov in #12
- Update repl.py by @Xplo8E in #18
- Support for groq by @dnakov in #19
- Fixes by @dnakov in #20
New Contributors
- @dnakov made their first contribution in #3
- @trufae made their first contribution in #6
- @Xplo8E made their first contribution in #18
Full Changelog: 0.5.0...0.6.0