Facing an out-of-memory issue with the llama-13b model when training on 4 GPUs #3554

premanand09 started this conversation in General

Replies: 0
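Since the thread has no replies, the sketch below is not an answer from the discussion; it shows common out-of-memory mitigations when fine-tuning a 13B model across 4 GPUs with Hugging Face transformers and DeepSpeed. The checkpoint name, config filename, and hyperparameters are illustrative assumptions, not anything stated in this thread.

```python
# Minimal sketch of common OOM mitigations for fine-tuning a 13B model on
# 4 GPUs with Hugging Face transformers + DeepSpeed. The checkpoint name,
# config filename, and hyperparameters below are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, TrainingArguments

model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-13b",          # hypothetical checkpoint name
    torch_dtype=torch.float16,       # half precision halves parameter memory
)
model.gradient_checkpointing_enable()  # trade recompute for activation memory

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=1,   # smallest micro-batch per GPU
    gradient_accumulation_steps=16,  # recover the effective batch size
    fp16=True,
    deepspeed="ds_zero3.json",       # ZeRO-3 shards params/grads/optimizer
)
# Launch across the 4 GPUs with: deepspeed --num_gpus 4 train.py
```

With ZeRO stage 3, parameters, gradients, and optimizer states are partitioned across the 4 GPUs rather than replicated on each, which is typically what makes a 13B model fit in memory during training.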
