
how to use with h2ovl mississippi? #4

Open
karen-pal opened this issue Nov 27, 2024 · 0 comments

@karen-pal

Hello, thank you for your project. I'm trying to learn how to use it:

from transformers import AutoTokenizer
from onnxllm import AutoModelForCausalLM

model_path = 'h2oai/h2ovl-mississippi-800m'
# you should download onnx models from https://huggingface.co/inisis-me first
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)

image_path = 'dni_frente.jpg'
question = "<image>\nExtract the text from the image and fill the following json {'Document':'','Document_number':'','Surname':'','Name':'','date_of_birth':'','Nationality':'','issue_date':'','expiration_date':'',}"

# generation_config must be defined before calling chat(); assumed example values
generation_config = dict(max_new_tokens=1024, do_sample=False)
# Perform inference
response, history = model.chat(tokenizer, image_path, question, generation_config, history=None, return_history=True)

but I get these errors:


Traceback (most recent call last):
  File "/home/kpalacio/Documentos/kiut-ld-message-processing/ocr_data_extraction/test.py", line 7, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kpalacio/micromamba/envs/llms/lib/python3.11/site-packages/onnxllm/models/auto_factory.py", line 57, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kpalacio/micromamba/envs/llms/lib/python3.11/site-packages/onnxllm/models/auto_factory.py", line 35, in _get_model_class
    supported_models = model_mapping[type(config)]
                       ~~~~~~~~~~~~~^^^^^^^^^^^^^^
  File "/home/kpalacio/micromamba/envs/llms/lib/python3.11/site-packages/onnxllm/models/auto_factory.py", line 87, in __getitem__
    model_type = self._reverse_config_mapping[key.__name__]
                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^
KeyError: 'H2OVLChatConfig'
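
The KeyError suggests that onnxllm's auto factory has no entry for H2OVLChatConfig, i.e. this architecture isn't registered in its model mapping. As a sketch, the supported architectures can be listed like this (it relies on the private attributes visible in the traceback above, so it may break between versions):

from onnxllm import AutoModelForCausalLM

# _model_mapping and _reverse_config_mapping are internal names taken from
# the traceback; the mapping goes from config class names to model classes.
mapping = AutoModelForCausalLM._model_mapping
print(sorted(mapping._reverse_config_mapping.keys()))  # 'H2OVLChatConfig' is absent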

Am I missing something? I'm guessing I'm missing a step, like exporting the model to ONNX? I'm new to the ONNX ecosystem.
Thank you in advance.
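
For reference, the download step mentioned in the snippet's comment can be done with huggingface_hub. A minimal sketch, assuming a hypothetical repo id under inisis-me (the actual export names must be checked at https://huggingface.co/inisis-me):

from huggingface_hub import snapshot_download

# Hypothetical repo id for illustration only; verify the real ONNX export
# matching h2ovl-mississippi-800m on the inisis-me Hub page first.
local_path = snapshot_download(repo_id='inisis-me/h2ovl-mississippi-800m-onnx')
# from_pretrained would then be pointed at the downloaded folder:
# model = AutoModelForCausalLM.from_pretrained(local_path, trust_remote_code=True)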
