Hello, I am trying to set up Yi-Coder locally on my M1 Pro Mac (macOS). However, it looks like vLLM is no longer available for Mac users because of a torch dependency.
I'm wondering if this is a known issue.
Hello @sajal1123, regarding this issue, I haven't encountered any version incompatibility problems when using Ollama or transformers for local inference. Could you please provide the specific error message you're seeing? Perhaps we can further troubleshoot this problem together.
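
For reference, here is a minimal sketch of local inference with transformers on Apple Silicon, along the lines suggested above. It assumes the Hugging Face model id `01-ai/Yi-Coder-9B-Chat` (swap in the variant you actually downloaded) and a recent torch build with MPS support:

```python
# Minimal sketch: local Yi-Coder inference with transformers on Apple Silicon.
# Assumptions: model id "01-ai/Yi-Coder-9B-Chat" is available on Hugging Face,
# and torch/transformers/accelerate are installed (pip install torch transformers accelerate).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-Coder-9B-Chat"  # assumed model id; adjust to the checkpoint you use
device = "mps" if torch.backends.mps.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 keeps memory usage manageable on an M1 Pro
).to(device)

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Alternatively, if Ollama lists a Yi-Coder model for your machine, running it through Ollama avoids the vLLM/torch dependency issue on macOS entirely.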