
[vulnerability-fix] Huggingface - Server side template injection #2949

Closed
krrishdholakia opened this issue Apr 11, 2024 · 7 comments
Assignees
Labels
bug Something isn't working

Comments

@krrishdholakia
Contributor

krrishdholakia commented Apr 11, 2024

Links:

GHSA-46cm-pfwv-cgf8
PR to fix the issue: #2941

Status

✅ Resolved

What happened?

Huggingface chat template is stored on huggingface hub in the tokenizer_config.json

e.g. - https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2/blob/41b61a33a2483885c981aa79e0df6b32407ed873/tokenizer_config.json#L42

This is fetched whenever a Hugging Face model is called, to get its chat template.
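To illustrate the mechanism, here is a minimal sketch (the JSON contents and template are hypothetical, not the real Mistral config) of how a `chat_template` string from `tokenizer_config.json` gets rendered against conversation messages. Note that rendering untrusted Hub content naively like this is exactly the vulnerability this issue describes:

```python
import json
from jinja2 import Environment

# Hypothetical tokenizer_config.json contents; real Hub files store the
# Jinja template string under the "chat_template" key.
config = json.loads(
    '{"chat_template": '
    '"{% for m in messages %}[{{ m.role }}] {{ m.content }}{% endfor %}"}'
)

# The template is compiled and rendered with the conversation messages.
template = Environment().from_string(config["chat_template"])
prompt = template.render(messages=[{"role": "user", "content": "Hello"}])
print(prompt)  # prints "[user] Hello"
```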

We need to add a check to ensure the chat template is rendered safely (i.e., that it behaves as a prompt template, not as arbitrary code).
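A quick sketch of why this matters: a Jinja template rendered in a plain `Environment` can traverse from an ordinary string up to `object` and enumerate every loaded Python class, which is the first step of a classic SSTI exploit. The payload below is a standard illustration, not taken from this issue:

```python
from jinja2 import Environment

# Attribute-traversal payload: '' -> str -> object -> all loaded subclasses.
# In an unsandboxed Environment this renders successfully, proving the
# template has reached Python internals.
malicious = "{{ ''.__class__.__mro__[1].__subclasses__() | length }}"

count = Environment().from_string(malicious).render()
print(count)  # prints a large class count, i.e. the payload executed
```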


Update: PR is now live - #2941

@krrishdholakia krrishdholakia added the bug Something isn't working label Apr 11, 2024
@krrishdholakia
Contributor Author

I believe we can handle this the same way transformers does, by running Jinja inside an `ImmutableSandboxedEnvironment`:

    # Adapted from transformers' `apply_chat_template` machinery; assumes
    # `from functools import lru_cache` and `from packaging import version`
    # at module level.
    @lru_cache
    def _compile_jinja_template(self, chat_template):
        try:
            import jinja2
            from jinja2.exceptions import TemplateError
            from jinja2.sandbox import ImmutableSandboxedEnvironment
        except ImportError:
            raise ImportError("apply_chat_template requires jinja2 to be installed.")

        if version.parse(jinja2.__version__) < version.parse("3.0.0"):
            raise ImportError(
                "apply_chat_template requires jinja2>=3.0.0 to be installed. Your version is " f"{jinja2.__version__}."
            )

        def raise_exception(message):
            raise TemplateError(message)

        # The sandboxed environment blocks access to unsafe attributes
        # (e.g. dunders) and forbids state mutation, so an untrusted
        # template cannot escape into arbitrary code execution.
        jinja_env = ImmutableSandboxedEnvironment(trim_blocks=True, lstrip_blocks=True)
        jinja_env.globals["raise_exception"] = raise_exception
        return jinja_env.from_string(chat_template)
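As a quick sanity check (a sketch, not code from the PR), the sandboxed environment refuses the standard attribute-traversal payload that would execute in a plain `Environment`:

```python
from jinja2.exceptions import SecurityError
from jinja2.sandbox import ImmutableSandboxedEnvironment

env = ImmutableSandboxedEnvironment(trim_blocks=True, lstrip_blocks=True)

# The classic SSTI payload: walk from a string literal to Python internals.
malicious = "{{ ''.__class__.__mro__[1].__subclasses__() }}"

try:
    env.from_string(malicious).render()
except SecurityError:
    # The sandbox flags dunder attribute access as unsafe and raises.
    print("blocked")
```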

@ishaan-jaff ishaan-jaff pinned this issue Apr 11, 2024
@krrishdholakia
Contributor Author

PR - #2941

@krrishdholakia krrishdholakia changed the title [Bug]: Huggingface - Server side template injection Huggingface - Server side template injection Apr 11, 2024
@ishaan-jaff ishaan-jaff self-assigned this Apr 11, 2024
@ishaan-jaff ishaan-jaff changed the title Huggingface - Server side template injection [vulnerability-fix] Huggingface - Server side template injection Apr 11, 2024
@ishaan-jaff
Contributor

Confirming the fix works: I ran a malicious tokenizer config through it, and it was not able to impact my machine.

(Screenshot attached: 2024-04-10, 8:51 PM)

@ishaan-jaff
Contributor

PR here - confirmed it works for me: #2941

@ishaan-jaff
Contributor

merged into main + queued a new release

@krrishdholakia
Contributor Author

krrishdholakia commented Apr 11, 2024

Fix should be live soon in v1.34.42

@krrishdholakia
Contributor Author

krrishdholakia commented Apr 11, 2024

Closing as fix is now live in v1.34.42

pypi - https://pypi.org/project/litellm/1.34.42/
docker - https://github.com/BerriAI/litellm/pkgs/container/litellm
