
llm-guard package installation with poetry. #104

Open
Jurik-001 opened this issue Feb 27, 2024 · 14 comments

@Jurik-001

Whenever I try to install llm-guard, I always run into the following issue:

ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

You already address this in your docs, but the problem with that solution is that I could not find a way to integrate it with Poetry. Can you please fix it?
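
For context, a rough sketch of the kind of Poetry setup I mean (the version constraints below are placeholders I picked for illustration, not values from the llm-guard docs):

```toml
# Sketch only: declare torch explicitly so it is resolved together with llm-guard.
# The constraints here are placeholders, not llm-guard's actual requirements.
[tool.poetry.dependencies]
python = ">=3.9,<3.12"
torch = ">=2.0"
llm-guard = "*"
```

Even with torch declared like this, Poetry builds source distributions in isolated environments, so a manual pre-step such as `poetry run pip install torch` before `poetry install` may still be needed; that pre-step is exactly what I could not find a clean, Poetry-native way to express.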

@asofter
Collaborator

asofter commented Feb 28, 2024

Hey @Jurik-001, we are planning to migrate it to Poetry, but unfortunately there is no ETA on that. Are you running on a Mac M1? We usually see this problem on ARM-based machines.

@Jurik-001
Author

Yes, that's correct, I use an M1 device. I can also help with migrating to Poetry if you want, @asofter.

@asofter
Collaborator

asofter commented Feb 28, 2024

That would be great. I have only limited experience with Poetry (I've used it, but never set up a project with it myself). Thank you!

@Jurik-001
Author

Okay, I'm starting to work on it. I saw you also don't configure black; is that right? @asofter

@asofter
Collaborator

asofter commented Feb 28, 2024

Thanks! We do, but through pre-commit hooks: https://github.com/protectai/llm-guard/blob/main/.pre-commit-config.yaml#L12-L16
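
For anyone following along, a standard black entry in .pre-commit-config.yaml looks roughly like this (the `rev` below is a placeholder, not necessarily the version pinned in llm-guard's file):

```yaml
# Illustrative shape of a black pre-commit hook; the rev is a placeholder.
repos:
  - repo: https://github.com/psf/black
    rev: 24.2.0
    hooks:
      - id: black
```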

@liadlevy-pando

I have this problem as well using pip on macOS (Apple M1 Max). @asofter

@Jurik-001
Author

Yeah, I worked on the switch to Poetry yesterday but ran into trouble with the xformers package, the same issue as when installing the current llm-guard package. It needs some intervention there; that's what's actually going wrong, @liadlevy-pando.

@asofter
Collaborator

asofter commented Mar 4, 2024

Do you think it's solved with the newer version of PyTorch? We can try testing with it. Or I am actually planning to just rely on the ONNX Runtime to remove some dependencies.

@Jurik-001
Author

@asofter So I think the problem is xFormers. If I understand correctly, that library is based on CUDA (facebookresearch/xformers#987), which clearly does not work on Mac M-series chips. In which context do you use that lib?

@asofter
Collaborator

asofter commented Mar 12, 2024

Honestly, we used it somewhere, but I don't think we do anymore. I removed the package. Please let me know if that fixed the problem.

@jayita13

jayita13 commented Mar 24, 2024

I'm facing the same issue with Python 3.12.1 on Windows; it fails at the xformers installation.
Any way to resolve it yet? @asofter @Jurik-001

@jayita13

Seems like Python 3.12 isn't compatible.
I was able to run it with 3.10.
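
If that's the case, declaring the supported range in pyproject.toml would at least make the failure obvious up front. A sketch, with bounds inferred only from this thread rather than from llm-guard's published metadata:

```toml
# Sketch only: bounds inferred from this thread (3.10 works, 3.12 fails),
# not from llm-guard's actual metadata.
[project]
requires-python = ">=3.10,<3.12"
```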

@CandiedCode
Contributor

CandiedCode commented Apr 29, 2024

Now that Python has standardized on several build improvements like PEP 517 and PEP 660, I'm curious what migrating to Poetry specifically solves.

Hatch is a PyPA project that seems to support similar functionality to Poetry.

Rye, from the makers of ruff and uv, also has a package management solution for Python. Because of its monorepo support, it may be an interesting choice for managing llm-guard and llm-guard-api.

I use setuptools + pip-tools personally, as I find I have fewer IDE configuration issues versus using Poetry + VS Code together.
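
To make the PEP 517 point concrete: with a standards-based pyproject.toml, the backend is essentially a one-line swap, so the choice between these tools is mostly about workflow rather than packaging capability. A sketch (the pins are placeholders, not llm-guard's actual configuration):

```toml
# Sketch only: PEP 517 build-system table; swapping the backend is a one-line change.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# Equivalent with setuptools:
# requires = ["setuptools>=61"]
# build-backend = "setuptools.build_meta"
```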

@rahulsingh50

Python 3.12.1 on Windows fails. Issue:

from llm_guard import scan_output, scan_prompt
ModuleNotFoundError: No module named 'llm_guard'
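
That ModuleNotFoundError usually means the install never completed (for example, it aborted at the xformers step) or llm-guard landed in a different interpreter than the one running the script. A quick check, assuming nothing beyond the import already shown in this thread:

```bash
# Confirm pip and python refer to the same environment, then retry the import.
python -m pip show llm-guard
python -c "from llm_guard import scan_output, scan_prompt; print('llm_guard import OK')"
```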

Labels: None yet · Projects: None yet · Development: No branches or pull requests · 6 participants