
Hi @TheTinyTeddy #5

Open
Garibaldi2000 opened this issue Jul 26, 2024 · 0 comments

Comments

@Garibaldi2000 (Owner)

Hi @TheTinyTeddy
Thanks for your interest in our work. The behavior you want requires setting up the cache during decoding. To enable this, set `configuration = TTTConfig(use_cache=True)`. Please let me know if this resolves your issue.

Originally posted by @xvjiarui in test-time-training/ttt-lm-pytorch#17 (comment)
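For reference, the suggestion above can be sketched as follows. This is a minimal, untested sketch: the `ttt` import path and the `TTTForCausalLM` class name are assumptions based on the test-time-training/ttt-lm-pytorch repo, and the dummy input is illustrative; adjust both to your local setup.

```python
import torch
from ttt import TTTConfig, TTTForCausalLM  # assumed import path (repo's ttt.py)

# Enable the decoding cache as the comment suggests; all other
# configuration fields keep their defaults.
configuration = TTTConfig(use_cache=True)
model = TTTForCausalLM(configuration).eval()

# Dummy prompt token ids. With use_cache=True, generate() reuses the
# per-step cache during decoding instead of recomputing the full prefix.
input_ids = torch.randint(0, configuration.vocab_size, (1, 8))
with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=16)
```

This is a configuration fragment against an external research repo, so it is not verified here; the key point is simply passing `use_cache=True` when constructing the config.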
