examples : evaluate tokens in batches after swapping context #1014


Merged
2 commits merged on Apr 21, 2023

Conversation

grencez
Contributor

@grencez grencez commented Apr 16, 2023

This new loop around llama_eval is a bit redundant with the batching done in the main loop, but without a refactor it's all still necessary to keep print statements happening at the right times.

@grencez grencez force-pushed the batching branch 5 times, most recently from 26748b2 to 3bc0a89 Compare April 16, 2023 10:22
@grencez grencez changed the title Evaluate tokens in batches after swapping context examples: Evaluate tokens in batches after swapping context Apr 16, 2023
@grencez grencez changed the title examples: Evaluate tokens in batches after swapping context examples : evaluate tokens in batches after swapping context Apr 16, 2023
@grencez grencez marked this pull request as ready for review April 16, 2023 10:30
@grencez
Contributor Author

grencez commented Apr 17, 2023

Tests passed yesterday. I just synced recent changes and added a comment.
