Question about Llama-7B and Llama-7B-Pro comparison. #5

Open

ryusaeba opened this issue Jan 12, 2024 · 2 comments

@ryusaeba

Since Llama-7B-Pro uses an additional 80B tokens of pretraining data to improve math and code, did you also train Llama-7B directly on the same 80B tokens? If so, what were the results?

@hills-code
Collaborator

We have not done this experiment yet; we may consider it later. Currently, we are working on extending the expansion to Mistral and to multi-modal models.
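For context, the comparison the question asks about is between plain continued pretraining of the base model and block expansion, where new decoder blocks are interleaved into the frozen base model and only the new blocks are trained. A toy sketch of the block-expansion idea (my own illustration, not code from this repo): each new block is a copy whose residual branch is zero-initialized, so the expanded model is functionally identical to the original before any further training.

```python
# Toy sketch of block expansion (illustrative only, not the repo's code).
# Each block is modeled as a residual map x -> x + f(x). A newly inserted
# block zero-initializes its residual branch, so it contributes nothing
# until it is trained on the new data.

def residual_block(scale):
    # Toy residual branch: f(x) = scale * x
    return lambda x: x + scale * x

def identity_copy():
    # Zero-initialized copy: residual branch outputs 0, so x passes through
    return lambda x: x + 0.0 * x

def expand(blocks, group_size):
    """Insert one identity-initialized block after every `group_size` blocks."""
    expanded = []
    for i, blk in enumerate(blocks, 1):
        expanded.append(blk)
        if i % group_size == 0:
            expanded.append(identity_copy())
    return expanded

def forward(blocks, x):
    for blk in blocks:
        x = blk(x)
    return x

base = [residual_block(s) for s in (0.1, 0.2, 0.3, 0.4)]
grown = expand(base, group_size=2)  # 4 blocks -> 6 blocks
print(len(grown))                                  # 6
print(forward(base, 1.0) == forward(grown, 1.0))   # True: same output pre-training
```

The property the assertion-style comments highlight is what distinguishes this from directly continuing pretraining on Llama-7B: the base model's behavior is preserved exactly at initialization, and only the inserted blocks absorb the new 80B tokens.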

@ryusaeba
Author

Understood. Please share any updates with me. I am also looking forward to your expansion to Mistral and multi-modal models.

2 participants