
Tune VLM configs for SmolVLM and Qwen2-VL #1307

Merged

merged 2 commits into main Jan 29, 2025
Conversation

xrdaukar
Collaborator

@xrdaukar xrdaukar commented Jan 29, 2025

Description

-- Remove unnecessary dataset limits for SmolVLM and Qwen2-VL
-- SmolVLM: --exclude "onnx/*"
-- Tested on GCP
-- The changes are similar to #1287 and #1306
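The `--exclude "onnx/*"` pattern skips files under the repo's `onnx/` directory when fetching model weights. As an illustrative sketch (not the PR's actual code), glob-style exclusion of this kind can be reproduced with Python's standard `fnmatch`; the file names below are hypothetical:

```python
from fnmatch import fnmatch

# Hypothetical file listing from a model repository (illustrative only).
files = [
    "config.json",
    "model.safetensors",
    "onnx/model.onnx",
    "onnx/model_quantized.onnx",
]

# Keep only files that do NOT match the exclude pattern "onnx/*".
exclude_pattern = "onnx/*"
kept = [f for f in files if not fnmatch(f, exclude_pattern)]
print(kept)  # ['config.json', 'model.safetensors']
```

Skipping the ONNX variants this way avoids downloading duplicate weight formats that the training config never loads.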

Related issues

Fixes OPE-951

Before submitting

  • This PR only changes documentation. (You can ignore the following checks in that case.)
  • Did you read the contributor guideline (Pull Request guidelines)?
  • Did you link the issue(s) related to this PR in the section above?
  • Did you add or update tests where needed?

Reviewers

At least one review from a member of oumi-ai/oumi-staff is required.

@xrdaukar xrdaukar marked this pull request as ready for review January 29, 2025 02:49
@xrdaukar xrdaukar changed the title from "save" to "Tune VLM configs for SmolVLM and Qwen2-VL" Jan 29, 2025
@xrdaukar xrdaukar merged commit f9d5d68 into main Jan 29, 2025
1 check passed
@xrdaukar xrdaukar deleted the xrdaukar/vlm-cfg-tune-v3 branch January 29, 2025 03:06

3 participants