Hi, thanks for the great work! I found that the EOS token is masked out when using tensor_fn.create_attention_mask:
https://github.com/ZihanWang314/RAGEN/blob/4c457dc3a9509394c56acc97bb68e85cd19a4c75/ragen/llm_agent/tensor_helper.py#L35C1-L37C72
I'm not sure whether this affects the model's ability to learn when to stop generating. Thanks!
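To illustrate the concern: if the tokenizer reuses the EOS token as the padding token (common for decoder-only models), then a mask built by comparing token IDs against `pad_token_id` also zeroes out every real EOS position. This is a minimal sketch of that behavior, assuming `create_attention_mask` does a pad-ID comparison and that `pad_token_id == eos_token_id`; the IDs and helper below are hypothetical, not taken from the repo.

```python
import torch

# Assumed setup: tokenizer.pad_token_id == tokenizer.eos_token_id (hypothetical IDs).
pad_token_id = 2
eos_token_id = 2

def create_attention_mask(input_ids: torch.Tensor) -> torch.Tensor:
    """Sketch of a pad-based attention mask: 1 for non-pad tokens, 0 for pad."""
    return input_ids.ne(pad_token_id).long()

# A sequence ending in a genuine EOS, followed by two padding tokens.
input_ids = torch.tensor([[5, 6, 7, eos_token_id, pad_token_id, pad_token_id]])
attention_mask = create_attention_mask(input_ids)

# The real EOS at index 3 gets mask value 0, just like the padding after it.
print(attention_mask.tolist())  # → [[1, 1, 1, 0, 0, 0]]
```

If the EOS position is masked, its loss and attention contribution are dropped along with the padding, which is the potential issue for learning when to stop.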
Thanks for the nice catch! Will check!