
[Bug]: litellm 1.59.9 ModuleNotFoundError: No module named 'cgi' #8081

Closed
glenn-jocher opened this issue Jan 29, 2025 · 9 comments · Fixed by #8160
Labels
bug, mlops user request

Comments

@glenn-jocher

What happened?

The latest version of litellm (1.59.9) raises ModuleNotFoundError: No module named 'cgi'.

Relevant log output

Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/importlib/__init__.py:88: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_llm.py:9: in <module>
    from assistant.utils.llm import MODELS, LLMClient
assistant/utils/__init__.py:19: in <module>
    from assistant.utils.llm import MODELS, TOKENIZER, Chunk, LLMClient
assistant/utils/llm.py:11: in <module>
    import litellm
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/site-packages/litellm/__init__.py:1061: in <module>
    from .llms.bedrock.chat.converse_transformation import AmazonConverseConfig
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/site-packages/litellm/llms/bedrock/chat/__init__.py:1: in <module>
    from .converse_handler import BedrockConverseLLM
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/site-packages/litellm/llms/bedrock/chat/converse_handler.py:20: in <module>
    from .invoke_handler import AWSEventStreamDecoder, MockResponseIterator, make_call
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/site-packages/litellm/llms/bedrock/chat/invoke_handler.py:32: in <module>
    from litellm.litellm_core_utils.prompt_templates.factory import (
/opt/hostedtoolcache/Python/3.13.1/x64/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py:2156: in <module>
    from cgi import parse_header
E   ModuleNotFoundError: No module named 'cgi'
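
For reference, the failure needs nothing more than the import. A minimal reproduction sketch, assuming Python 3.13 and litellm 1.59.9:

import sys

# The stdlib cgi module was removed in Python 3.13 (PEP 594).
print(sys.version_info[:2])  # (3, 13)

import litellm  # raises ModuleNotFoundError: No module named 'cgi'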

Are you a ML Ops Team?

Yes

What LiteLLM version are you on?

v1.59.9

Twitter / LinkedIn details

https://www.linkedin.com/in/glenn-jocher/

@glenn-jocher glenn-jocher added the bug Something isn't working label Jan 29, 2025
@glenn-jocher glenn-jocher changed the title [Bug]: litellm 1.59.9 error: ModuleNotFoundError: No module named 'cgi' [Bug]: litellm 1.59.9 ModuleNotFoundError: No module named 'cgi' Jan 29, 2025
@starpit commented Jan 29, 2025

Same, only affecting Python 3.13 venvs in our CI tests.

This seems to be the proximate commit: 8eaa5dc#diff-a3c9843a716ad6c49482395f7357e83abc5397168eafc75fa2dba2124619dcebR2156

Merged 15 hours ago. Does this project not have Python 3.13 tests?

starpit added a commit to starpit/prompt-declaration-language that referenced this issue Jan 29, 2025
BerriAI/litellm#8081

Signed-off-by: Nick Mitchell <nickm@us.ibm.com>
@ishaan-jaff
Contributor

cc @krrishdholakia can you look into this? It's coming from here: 8eaa5dc#diff-a3c9843a716ad6c49482395f7357e83abc5397168eafc75fa2dba2124619dcebR2156

starpit added a commit to IBM/prompt-declaration-language that referenced this issue Jan 29, 2025
BerriAI/litellm#8081

Signed-off-by: Nick Mitchell <nickm@us.ibm.com>
@JosTheNeutralGood

The cgi module is no longer part of the Python standard library in 3.13.

I had a similar issue after installing some packages, and it was resolved after I ran %pip install legacy-cgi.

The documentation says to use the legacy-cgi fork of the module:
https://docs.python.org/3/library/cgi.html
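
As a quick sanity check (assuming the backport is installed with pip install legacy-cgi), the failing import resolves again:

# After: pip install legacy-cgi
from cgi import parse_header  # now provided by the legacy-cgi package on Python 3.13

print(parse_header('text/html; charset=utf-8'))
# ('text/html', {'charset': 'utf-8'})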

@neoneye commented Jan 30, 2025

Same issue here.

I'm trying to get smolagents to use Ollama, and smolagents' LiteLLMModel class uses litellm.

@mmabrouk

We have the same issue. cgi was removed in Python 3.13, which means litellm does not work on Python 3.13.

@blkt commented Jan 31, 2025

@ishaan-jaff @krrishdholakia this might help fix support for Python 3.13: basically, you can add legacy-cgi = "==2.6.2" as a dependency.

Source: https://docs.python.org/3/library/cgi.html
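
For example, in pip requirements syntax (a sketch; the pin could be scoped to the affected interpreters with a PEP 508 environment marker, so older Pythons keep using the stdlib module):

legacy-cgi==2.6.2; python_version >= "3.13"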

@dhh1995 (Contributor) commented Feb 1, 2025

@ishaan-jaff @krrishdholakia I implemented a replacement for the parse_header function in #8160, as suggested in PEP 594, without adding a new package dependency. Please review the changes to resolve this issue for Python 3.13 users.
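
For anyone needing a stopgap before the fix lands: the cgi docs and PEP 594 point to the email package as the migration path. A minimal sketch of what such a replacement can look like (not necessarily the exact code merged in #8160):

from email.message import Message

def parse_header(line):
    # Stand-in for cgi.parse_header(), built on email.message.Message
    # as suggested by PEP 594 and the cgi module docs.
    msg = Message()
    msg["content-type"] = line
    # get_params() returns [(main_value, ''), (param, value), ...]
    params = msg.get_params()
    return params[0][0], dict(params[1:])

print(parse_header('text/html; charset=utf-8'))
# ('text/html', {'charset': 'utf-8'})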

@krrishdholakia (Contributor)

Closing as the fix is now on main. Should be live in today's release - v1.60.x

@krrishdholakia krrishdholakia closed this as not planned Feb 1, 2025
@krrishdholakia krrishdholakia reopened this Feb 1, 2025
@Daniil-Aleshechkin-IQ commented Feb 3, 2025

FYI I just ran into this again on v1.60.0

Here's the stack trace:

PS C:\Users\Daniil.Aleshechkin\litellm> litellm --config .\config.yaml
C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic\_internal\_config.py:345: UserWarning: Valid config keys have changed in V2:
* 'fields' has been removed
  warnings.warn(message, UserWarning)
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Scripts\litellm.exe\__main__.py", line 4, in <module>        
    from litellm import run_server
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\litellm\__init__.py", line 779, in <module>
    from .llms.bedrock.chat.converse_transformation import AmazonConverseConfig
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\litellm\llms\bedrock\chat\__init__.py", line 1, in <module>
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\litellm\llms\bedrock\chat\converse_handler.py", line 20, in <module>
    from .invoke_handler import AWSEventStreamDecoder, MockResponseIterator, make_call
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\litellm\llms\bedrock\chat\invoke_handler.py", line 34, in <module>
    from litellm.litellm_core_utils.prompt_templates.factory import (
    ...<7 lines>...
    )
  File "C:\Users\Daniil.Aleshechkin\AppData\Local\Programs\Python\Python313\Lib\site-packages\litellm\litellm_core_utils\prompt_templates\factory.py", line 2156, in <module>
    from cgi import parse_header
ModuleNotFoundError: No module named 'cgi'
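
Since importing litellm is itself what crashes, importlib.metadata can confirm which release is actually installed without triggering the import:

from importlib.metadata import version

print(version("litellm"))  # e.g. "1.60.0"; does not import the package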
