
Issues about integrated inference #177

Open
mikecovlee opened this issue Mar 4, 2024 · 0 comments
Labels
bug Something isn't working enhancement New feature or request

Comments

@mikecovlee
Contributor

Traceback

```
Traceback (most recent call last):
  File "/home/mikecovlee/work/multi-lora-fine-tune/mlora.py", line 175, in <module>
    inference(config, model, tokenizer)
  File "/home/mikecovlee/work/multi-lora-fine-tune/mlora.py", line 106, in inference
    input_data = mlora.MultiLoraBatchData(
TypeError: MultiLoraBatchData.__init__() got an unexpected keyword argument 'prompts_'
```
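The error is consistent with `MultiLoraBatchData` being a dataclass whose `prompts_` field was renamed or removed while `inference()` in `mlora.py` still passes it. A minimal sketch reproducing the failure and a defensive workaround (all field names except `prompts_` are assumptions for illustration, not mLoRA's real API):

```python
from dataclasses import dataclass, fields
from typing import List

# Hypothetical stand-in for mlora.MultiLoraBatchData; these field names
# are assumptions, chosen only to reproduce the reported TypeError.
@dataclass
class MultiLoraBatchData:
    batch_tokens_: List[List[int]]
    lora_batch_data_config_: List[dict]

# A dataclass-generated __init__ rejects keywords it does not define,
# which matches the traceback when a stale `prompts_` is still passed.
err = None
try:
    MultiLoraBatchData(prompts_=["hi"], batch_tokens_=[[1]],
                       lora_batch_data_config_=[])
except TypeError as exc:
    err = str(exc)  # "... got an unexpected keyword argument 'prompts_'"

# Defensive construction: filter kwargs down to fields that actually exist,
# so callers survive a field rename until the inference path is updated.
def build_batch(**kwargs) -> MultiLoraBatchData:
    valid = {f.name for f in fields(MultiLoraBatchData)}
    return MultiLoraBatchData(**{k: v for k, v in kwargs.items() if k in valid})

batch = build_batch(batch_tokens_=[[1, 2]], lora_batch_data_config_=[],
                    prompts_=["hi"])  # stale keyword is dropped, not fatal
```

The filtering helper is only a stopgap; the real fix is updating the call site at `mlora.py` line 106 to match the current `MultiLoraBatchData` fields.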

TODO

Improve inference functions. @mikecovlee

@mikecovlee mikecovlee added bug Something isn't working enhancement New feature or request labels Mar 4, 2024
@mikecovlee mikecovlee self-assigned this Mar 4, 2024