A high-performance HTTP/2-enabled proxy server designed specifically to enable Cursor IDE's Composer to use DeepSeek's language models. This proxy translates OpenAI-compatible API requests to DeepSeek API format, allowing Cursor's Composer and other OpenAI API-compatible tools to seamlessly work with DeepSeek's models.
This proxy was created to enable Cursor IDE users to leverage DeepSeek's powerful language models through Cursor's Composer interface as an alternative to OpenAI's models. By running this proxy locally, you can configure Cursor's Composer to use DeepSeek's models for AI assistance, code generation, and other AI features. It handles all the necessary request/response translations and format conversions to make the integration seamless.
## Features

- HTTP/2 support for improved performance
- Full CORS support
- Streaming responses (see the example below)
- Support for function calling/tools
- Automatic message format conversion
- Compression support (Brotli, Gzip, Deflate)
- Compatible with OpenAI API client libraries
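As a quick illustration of the streaming support and OpenAI-style request format, the sketch below sends a streaming chat completion through the proxy and prints the server-sent event chunks as they arrive. It assumes the proxy is already running locally on the default port 9000 (see Setup below); the model name, prompt, and API key are placeholders.

```go
package main

import (
	"bufio"
	"bytes"
	"fmt"
	"log"
	"net/http"
	"strings"
)

func main() {
	// OpenAI-style chat completion body with streaming enabled.
	body := []byte(`{
		"model": "deepseek-chat",
		"stream": true,
		"messages": [{"role": "user", "content": "Say hello in one sentence."}]
	}`)

	req, err := http.NewRequest("POST", "http://localhost:9000/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Content-Type", "application/json")
	// Placeholder key; which key the proxy accepts depends on how it validates requests.
	req.Header.Set("Authorization", "Bearer your_deepseek_api_key_here")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Streamed responses arrive as server-sent events, one "data: {...}" line per chunk.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		if line := scanner.Text(); strings.HasPrefix(line, "data:") {
			fmt.Println(line)
		}
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```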
## Prerequisites

- Go 1.19 or higher
- DeepSeek API key
- Public Endpoint
## Setup

- Clone the repository
- Install dependencies:

  ```bash
  go mod download
  ```

- Create a `.env` file in the project root:

  ```
  DEEPSEEK_API_KEY=your_deepseek_api_key_here
  ```

- Start the proxy server:

  ```bash
  go run proxy.go
  ```

  The server will start on port 9000 by default.

- Use the proxy with your OpenAI API clients by setting the base URL to `http://your-public-endpoint:9000/v1` (see the example below)
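For a concrete example, the sketch below points the community go-openai client (used here purely for illustration; it is not a requirement of this project) at the proxy by overriding the base URL. Replace the endpoint host and the placeholder key with your own values; whether the proxy expects your DeepSeek key or another client key here depends on how it validates requests.

```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Point an OpenAI-compatible client at the proxy instead of api.openai.com.
	cfg := openai.DefaultConfig("your_deepseek_api_key_here") // placeholder key
	cfg.BaseURL = "http://your-public-endpoint:9000/v1"       // the proxy's base URL

	client := openai.NewClientWithConfig(cfg)
	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		Model: "gpt-4o", // translated by the proxy to DeepSeek's equivalent model
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Write a haiku about proxies."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```

Any other OpenAI-compatible client, including Cursor's Composer configured with a custom OpenAI base URL, works the same way: only the base URL and API key change.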
## API Endpoints

- `/v1/chat/completions` - Chat completions endpoint
- `/v1/models` - Models listing endpoint
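A minimal sketch of calling the models listing endpoint with the standard library, assuming the proxy is running locally on port 9000; the API key is a placeholder.

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Ask the proxy which model names it exposes.
	req, err := http.NewRequest("GET", "http://localhost:9000/v1/models", nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Authorization", "Bearer your_deepseek_api_key_here") // placeholder

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body)) // OpenAI-style JSON list of models
}
```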
## Supported Models

- `gpt-4o` - maps to DeepSeek's GPT-4o equivalent model
- `deepseek-chat` - for DeepSeek's native chat model
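Conceptually, the name translation behaves like a small lookup table. The sketch below only illustrates that idea, under the assumption that `gpt-4o` is aliased to `deepseek-chat`; it is not the proxy's actual source.

```go
package main

import "fmt"

// modelMap is a hypothetical alias table; the proxy's real mapping may differ.
var modelMap = map[string]string{
	"gpt-4o": "deepseek-chat", // assumption: gpt-4o is served by DeepSeek's chat model
}

// resolveModel translates an OpenAI-style model name into a DeepSeek model name,
// passing unknown names through unchanged.
func resolveModel(requested string) string {
	if mapped, ok := modelMap[requested]; ok {
		return mapped
	}
	return requested
}

func main() {
	fmt.Println(resolveModel("gpt-4o"))        // deepseek-chat
	fmt.Println(resolveModel("deepseek-chat")) // deepseek-chat
}
```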
## Dependencies

- `github.com/andybalholm/brotli` - Brotli compression support
- `github.com/joho/godotenv` - Environment variable management
- `golang.org/x/net` - HTTP/2 support
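To show how these pieces can fit together, here is a rough sketch of a server that loads `DEEPSEEK_API_KEY` with godotenv and serves cleartext HTTP/2 (h2c) on port 9000 via `golang.org/x/net/http2`. The handler body and the h2c wiring are assumptions for illustration, not the proxy's actual code; `brotli` would similarly come into play when encoding or decoding compressed request/response bodies.

```go
package main

import (
	"log"
	"net/http"
	"os"

	"github.com/joho/godotenv"
	"golang.org/x/net/http2"
	"golang.org/x/net/http2/h2c"
)

func main() {
	// Load DEEPSEEK_API_KEY from .env into the process environment.
	if err := godotenv.Load(); err != nil {
		log.Println("no .env file found, relying on the existing environment")
	}
	apiKey := os.Getenv("DEEPSEEK_API_KEY")
	if apiKey == "" {
		log.Fatal("DEEPSEEK_API_KEY is not set")
	}

	mux := http.NewServeMux()
	mux.HandleFunc("/v1/chat/completions", func(w http.ResponseWriter, r *http.Request) {
		// Placeholder: the real proxy translates the OpenAI-style request and
		// forwards it to the DeepSeek API using the loaded key.
		http.Error(w, "not implemented in this sketch", http.StatusNotImplemented)
	})

	// h2c serves HTTP/2 over plain TCP (no TLS), matching a local listener.
	handler := h2c.NewHandler(mux, &http2.Server{})
	log.Fatal(http.ListenAndServe(":9000", handler))
}
```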
## Security

- The proxy includes CORS headers for cross-origin requests
- API keys are required and validated
- Secure handling of request/response data
## License

This project is licensed under the GNU General Public License v2.0 (GPLv2). See the LICENSE.md file for details.