
Using a local LLM for Q&A in PyGwalker visualization #640

Open
Anhduchb01 opened this issue Oct 11, 2024 · 1 comment

Comments

@Anhduchb01

I need to run PyGwalker locally and don't want to use OpenAI for the Q&A feature. I can run LLM models locally, so how can I connect a local model to PyGwalker's Q&A?

@ObservedObserver
Member

Related discussion here: #454

Also, if you want more customization of AI + data visualization, try graphic-walker, which provides more APIs for local integration.
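
For reference, the usual way to swap OpenAI for a local model in tools that expect the OpenAI API is to serve the local model behind an OpenAI-compatible endpoint (Ollama, vLLM, and llama.cpp's server all offer one) and point an OpenAI-style client at it. The sketch below shows only that general pattern, not a PyGwalker-specific hook; the endpoint URL, model name, and prompts are assumptions for illustration, and how to wire this into PyGwalker's Q&A is the subject of #454.

```python
# Minimal sketch: talk to a locally served model through an OpenAI-compatible API.
# The base_url and model name below are assumptions (an Ollama-style local server).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local OpenAI-compatible endpoint
    api_key="not-needed-for-local",        # placeholder; local servers usually ignore it
)

# Ask the local model a data question, the same way a hosted OpenAI model would be used.
response = client.chat.completions.create(
    model="llama3",  # assumed locally available model name
    messages=[
        {"role": "system", "content": "You answer questions about a pandas DataFrame."},
        {"role": "user", "content": "Which month had the highest total sales?"},
    ],
)
print(response.choices[0].message.content)
```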
