
Fix Failing Transformers Tests #53

Merged: 6 commits from sa/transformer_test_fixes into main on Aug 5, 2024

Conversation

@Satrat (Contributor) commented on Aug 2, 2024

SUMMARY:
We missed a few test failures while GHA was broken in this repo.

  • Upping the perplexity threshold for the 15M models; this was required due to a GPTQ fix
  • Fixing a test recipe that sparsified only one layer so that it sparsifies all layers (a hedged sketch follows this list)
  • Fixing a device assignment issue in a finetuning test (see the test sketch after the test plan)
  • Setting all of the 7B-parameter tests to run weekly instead of nightly
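
The PR body doesn't include the recipe diff itself. As a hedged sketch of the recipe fix above, assuming a YAML recipe string of the kind llm-compressor tests feed to a sparsification modifier (the modifier name, sparsity value, and layer patterns here are illustrative assumptions, not the actual change):

```python
# Hypothetical before/after of the test recipe fix (illustrative only).
# Before: `targets` named a single decoder layer, so only that layer
# was sparsified and the test's sparsity check was nearly a no-op.
recipe_one_layer = """
test_stage:
  pruning_modifiers:
    SparseGPTModifier:
      sparsity: 0.5
      targets: ["model.layers.0"]     # only layer 0 was sparsified
"""

# After: a regex target covers every decoder layer, so the whole model
# is sparsified and the test checks what it was meant to check.
recipe_all_layers = """
test_stage:
  pruning_modifiers:
    SparseGPTModifier:
      sparsity: 0.5
      targets: ["re:model.layers.*"]  # regex: all decoder layers
"""
```

The whole fix lives in the `targets` field: a literal single-layer target silently limits the check to one layer, while a pattern covers the full model.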

TEST PLAN:
Updated the affected unit tests and reran them locally to confirm the fixes.
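
As a hedged sketch of what one of those unit tests might look like, combining the threshold bump, the device fix, and the weekly cadence from the bullets above (the model id, marker name, and threshold are hypothetical placeholders, not the repo's actual values):

```python
import pytest
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


# Hypothetical cadence marker: the heavy tests would carry "weekly"
# instead of "nightly" so CI schedules them less often.
@pytest.mark.weekly
def test_perplexity_small_model():
    model_id = "Xenova/llama2.c-stories15M"  # hypothetical 15M model id
    model = AutoModelForCausalLM.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    inputs = tokenizer("Once upon a time", return_tensors="pt")
    # Device fix: move the batch to wherever the model actually lives,
    # instead of assuming CPU (or cuda:0).
    inputs = {k: v.to(model.device) for k, v in inputs.items()}

    with torch.no_grad():
        out = model(**inputs, labels=inputs["input_ids"])
    perplexity = torch.exp(out.loss).item()

    # The GPTQ fix shifted results slightly, so the bar was raised;
    # 30.0 is a placeholder, not the PR's actual threshold.
    assert perplexity < 30.0
```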

@Satrat Satrat merged commit 0a0a2de into main Aug 5, 2024
8 of 12 checks passed
@Satrat Satrat deleted the sa/transformer_test_fixes branch August 5, 2024 15:55
markmc pushed a commit to markmc/llm-compressor that referenced this pull request on Nov 13, 2024
* initial commit

* Update build-nightly.yaml

* Update build-nightly.yaml

* Update build-nightly.yaml

* Update build-nightly.yaml

* try again

* typo

* try again

* change os

* update

* try again

* add test step

* add publish step; update tests

* cleanup

* fix typo; switch to cron

* fix test

* fix

* Update setup.py

---------

Co-authored-by: bogunowicz@arrival.com <bogunowicz@arrival.com>
Co-authored-by: Dipika Sikka <dipikasikka1@gmail.com>