How to deploy a LoRA adapter on NIM from a checkpoint created by llm.finetune with NeMo Run? #12493
I successfully followed the NeMo tutorial to fine-tune Llama 3 with LoRA using nemo:24.12. This process generated a checkpoint stored in the directory:

### Checkpoint Structure

The checkpoint directory contains the following structure:

### NIM's Required Structure

Now, I want to deploy this LoRA adapter on NIM, but NIM requires a different file structure for adapters:

### Problem

I am unable to find a way to convert my checkpoint into the structure NIM expects.

### Questions
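For reference, a hedged sketch of the adapter-store layout NIM's LoRA support generally looks for, per NVIDIA's NIM documentation. The adapter name `llama3-lora` and the `/tmp/loras` root here are placeholders, not values from this thread; check the NIM docs for your release:

```shell
# Placeholder adapter store; NIM discovers one adapter per subdirectory.
mkdir -p /tmp/loras/llama3-lora

# After conversion, the subdirectory should hold the HF PEFT files, e.g.:
#   /tmp/loras/llama3-lora/adapter_config.json
#   /tmp/loras/llama3-lora/adapter_model.safetensors

# Point the NIM container at the adapter store root.
export NIM_PEFT_SOURCE=/tmp/loras
```

When launching the NIM container, `NIM_PEFT_SOURCE` is passed through so the server can register each subdirectory as a servable LoRA adapter.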
Any guidance on this would be greatly appreciated!
Replies: 1 comment
I found a solution: convert it to the Hugging Face format, which is also supported by NIM, using `llm.peft.export_lora`:
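A minimal sketch of that conversion call. The keyword-argument names below are assumptions, not confirmed by this thread; verify them against the `llm.peft.export_lora` signature in your NeMo release:

```python
def export_lora_to_hf(lora_ckpt: str, out_dir: str) -> None:
    """Hedged sketch: convert a NeMo 2.0 LoRA checkpoint to the Hugging Face
    adapter format that NIM can load.

    The parameter names passed to export_lora are assumptions; check the
    function signature shipped in your NeMo version before running.
    """
    # Requires the NeMo container (e.g. nemo:24.12) or a NeMo 2.0 install.
    from nemo.collections import llm

    llm.peft.export_lora(
        lora_checkpoint_path=lora_ckpt,  # checkpoint produced by llm.finetune
        output_path=out_dir,             # HF-format adapter dir, servable by NIM
    )
```

Run it inside the NeMo container against your checkpoint path (placeholder example: `export_lora_to_hf("/results/llama3_lora/checkpoints/last", "/tmp/loras/llama3-lora")`), then point NIM's adapter store at the output directory.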