
How to fine-tune? #16

Open
jasonisme123 opened this issue Jun 18, 2024 · 5 comments

Comments

@jasonisme123

No description provided.

@asirgogogo
Collaborator

You can refer to https://github.com/datawhalechina/self-llm/tree/master/bilibili_Index-1.9B

@asirgogogo
Collaborator

Hi~, the finetune code has been added: https://github.com/bilibili/Index-1.9B/blob/main/finetune/README.md

@Moemu

Moemu commented Aug 6, 2024

> Hi~, the finetune code has been added: https://github.com/bilibili/Index-1.9B/blob/main/finetune/README.md

This "instruction set construction" doesn't seem to support dialogue with context (multi-turn), does it?

@Moemu

Moemu commented Aug 6, 2024

Also, train.sh should not use python -m torch.distributed.launch, since that launcher has been deprecated since torch 2.0; it should use torchrun instead.

@asirgogogo
Collaborator

If you want to fine-tune on multi-turn dialogue (i.e. with context), you need to modify the process_func function in finetune.py to adapt it to your multi-turn data (it is very flexible). The concatenation format for multi-turn dialogue is "{system}reserved_0{human}reserved_1{assistant}...reserved_0{human}reserved_1{assistant}".
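
For example, here is a minimal sketch of an adapted process_func for that format. The sample schema (a `system` field plus a `conversations` list of human/assistant turns) is hypothetical; adapt the field names to your own data:

```python
IGNORE_INDEX = -100  # tokens with this label are excluded from the loss


def process_func(example, tokenizer, max_length=1024):
    """Tokenize one multi-turn sample into the Index-1.9B concatenation format:
    {system}reserved_0{human}reserved_1{assistant}...reserved_0{human}reserved_1{assistant}

    Expects (hypothetical schema, adapt to your own data):
      {"system": "...", "conversations": [{"human": "...", "assistant": "..."}, ...]}
    """
    input_ids, labels = [], []

    system_ids = tokenizer.encode(example["system"], add_special_tokens=False)
    input_ids += system_ids
    labels += [IGNORE_INDEX] * len(system_ids)  # no loss on the system prompt

    for turn in example["conversations"]:
        # "reserved_0" / "reserved_1" are the role separators from the format
        # above; they are assumed here to be encoded by the Index tokenizer as
        # its special separator tokens.
        prompt_ids = tokenizer.encode(
            "reserved_0" + turn["human"] + "reserved_1", add_special_tokens=False
        )
        answer_ids = tokenizer.encode(turn["assistant"], add_special_tokens=False)
        input_ids += prompt_ids + answer_ids
        # mask every prompt token; compute loss only on the assistant replies
        labels += [IGNORE_INDEX] * len(prompt_ids) + answer_ids

    input_ids, labels = input_ids[:max_length], labels[:max_length]
    return {
        "input_ids": input_ids,
        "attention_mask": [1] * len(input_ids),
        "labels": labels,
    }


# usage sketch:
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("IndexTeam/Index-1.9B-Chat", trust_remote_code=True)
# features = process_func(sample, tokenizer)
```

The key point is masking all prompt tokens with -100 so the loss is computed only on the assistant replies.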

As for the launch method, python -m torch.distributed.launch still works; of course, you can also change it to
torchrun --nnodes 1 --nproc_per_node=4 --node_rank=0 --master_addr=your_ip --master_port=your_port finetune.py
