This is a self-hostable proxy server that interfaces with your systems on behalf of Inferable. Inferable can only read your services' metadata, which makes the proxy a good place to:
- Place secrets that you don't want exposed to the outside world.
- Wrap your existing services selectively so that Inferable can only access the ones you want it to.
- Place your own custom logic in the proxy to do things like rate limiting, authentication, etc.
- Add additional observability.
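As a concrete illustration of the custom-logic point above, here is a minimal sketch of a fixed-window rate limiter that could wrap an upstream call inside the proxy. All names here are hypothetical illustrations, not part of the Inferable SDK:

```typescript
// Hypothetical sketch: a fixed-window rate limiter for wrapping upstream
// calls inside the proxy. Not part of the Inferable SDK.
type Clock = () => number;

class FixedWindowLimiter {
  private count = 0;
  private windowStart: number;

  constructor(
    private readonly limit: number, // max calls per window
    private readonly windowMs: number, // window length in milliseconds
    private readonly now: Clock = Date.now,
  ) {
    this.windowStart = this.now();
  }

  /** Returns true if the call is allowed under the current window. */
  tryAcquire(): boolean {
    const t = this.now();
    if (t - this.windowStart >= this.windowMs) {
      // Window has elapsed: reset the counter and start a new window.
      this.windowStart = t;
      this.count = 0;
    }
    if (this.count >= this.limit) return false;
    this.count += 1;
    return true;
  }
}

// Wrap any async upstream call so excess requests are rejected in the proxy
// before they ever reach your internal service.
function withRateLimit<T extends unknown[], R>(
  limiter: FixedWindowLimiter,
  fn: (...args: T) => Promise<R>,
): (...args: T) => Promise<R> {
  return async (...args: T) => {
    if (!limiter.tryAcquire()) {
      throw new Error("rate limit exceeded");
    }
    return fn(...args);
  };
}
```

The same wrapper pattern works for authentication checks or custom logging: intercept the call, apply your policy, then delegate to the real service.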
You can `npm install` any `@inferable/*` package in the proxy, and it will be discovered and bootstrapped when the proxy starts.
For all first-party integrations, visit the Inferable Marketplace in your dashboard, or add them with the CLI:

```shell
inferable proxy add <service_name>
```
Or, if you choose to do this manually, note that:
- All your services must be placed in the `src/services` directory.
- A service must be exposed in a single TypeScript file that ends with `service.ts`.
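Under those constraints, a manually added service file might look like the sketch below. The `ServiceDefinition` shape, file name, and function signatures are assumptions for illustration; the actual registration API comes from the Inferable SDK:

```typescript
// Hypothetical contents of src/services/weather.service.ts.
// The types below are illustrative assumptions, not the Inferable SDK's API.
type ServiceFunction = (input: Record<string, unknown>) => Promise<unknown>;

interface ServiceDefinition {
  name: string;
  functions: Record<string, ServiceFunction>;
}

const getForecast: ServiceFunction = async (input) => {
  // A real service would call your internal API here; stubbed for the sketch.
  return { city: input.city, forecast: "sunny" };
};

const service: ServiceDefinition = {
  name: "weather",
  functions: { getForecast },
};

export default service;
```

Keeping one service per `service.ts` file keeps the proxy's selective exposure explicit: only files in `src/services` are visible to Inferable.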
This application is dockerized, and you can run it in any environment that supports docker.
```shell
docker build . -t inferable/proxy:latest
```
The proxy requires the following environment variables at a minimum:

```shell
INFERABLE_API_SECRET="sk_xxx"
```
Assuming you have a `.env` file with the above, you can run the proxy with:

```shell
docker run -p 8173:8173 --env-file .env inferable/proxy:latest
```
Port `8173` exposes the following endpoints:
- `/health`: Health check endpoint. Reports some telemetry about the proxy.
- `/live`: Liveness check endpoint. Returns 200 if the proxy is alive.
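A startup script or orchestrator can poll `/live` until the proxy reports 200 before routing traffic to it. The sketch below is an assumption about how you might do that; the URL, retry count, and poll interval are illustrative, and the fetcher is injected so the helper stays self-contained:

```typescript
// Hedged sketch: wait for the proxy's /live endpoint to return 200.
// The fetcher is injected (e.g. pass globalThis.fetch) to keep this testable.
type Fetcher = (url: string) => Promise<{ status: number }>;

async function waitUntilLive(
  fetcher: Fetcher,
  url = "http://localhost:8173/live", // assumed local proxy address
  attempts = 10,
  intervalMs = 100,
): Promise<boolean> {
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetcher(url);
      if (res.status === 200) return true; // proxy reports alive
    } catch {
      // Proxy is not accepting connections yet; retry after a short delay.
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false;
}
```

In a real deployment you would more likely wire `/live` into your platform's native liveness probe (for example, a Kubernetes `livenessProbe`) rather than polling by hand.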