Kepler is a vulnerability database, lookup store, and API that currently uses the National Vulnerability Database as a data source, implementing CPE 2.3 tree expressions and version-range evaluation in real time.
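To picture what version-range evaluation means here, a CPE match range such as `versionStartIncluding`/`versionEndExcluding` is satisfied when `start <= version < end`. The sketch below is a deliberately simplified illustration in Python, not kepler's actual (Rust) implementation:

```python
# Simplified sketch of version-range evaluation (not kepler's actual code).
# A range defined by versionStartIncluding / versionEndExcluding matches
# when start <= version < end, comparing versions component by component.

def parse(version: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def in_range(version: str, start_including: str, end_excluding: str) -> bool:
    v = parse(version)
    return parse(start_including) <= v < parse(end_excluding)

# libxml2 2.9.10 falls inside a hypothetical vulnerable range [2.9.0, 2.9.11)
print(in_range("2.9.10", "2.9.0", "2.9.11"))  # True
print(in_range("2.9.11", "2.9.0", "2.9.11"))  # False (end is exclusive)
```

Real CPE version strings can contain non-numeric parts, which kepler has to handle as well; the tuple comparison above is the core idea only.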
Docker (recommended)
We provide a docker bundle with kepler, a dedicated PostgreSQL database, and Ofelia as a job scheduler for continuous updates:
export CONTAINER_SOCKET=/var/run/docker.sock
docker compose build
docker compose up
Podman (optional)
export CONTAINER_SOCKET=/run/user/1000/podman/podman.sock
podman compose build
podman compose up
Or just use an alias:
alias docker=podman
When the application starts, it checks for pending database migrations and automatically applies them. Remove the `--migrate` option to make the application stop when a pending migration is detected instead.
If you're interested in adding new migrations, install diesel-cli. Once you have diesel-cli installed, you can run:
diesel migration generate <name_your_migration>
This will generate `up.sql` and `down.sql` files, which you can then apply with:
diesel migration run
Or restart your kepler container, which triggers migrations automatically.
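As an example of what the generated files contain, a hypothetical migration adding a column might look like this (the table and column names below are made up for illustration):

```sql
-- up.sql (hypothetical example: add a severity column)
ALTER TABLE cves ADD COLUMN severity TEXT;
```

```sql
-- down.sql (must revert exactly what up.sql did)
ALTER TABLE cves DROP COLUMN severity;
```

Diesel runs `up.sql` when applying the migration and `down.sql` when reverting it, so the two files should always be mirror images of each other.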
Alternatively, you can build kepler from source. To build you need `rust`, `cargo`, and `libpq-dev` (or the equivalent PostgreSQL client library for your Linux distribution):
cargo build --release
If you use our bundle, the system automatically fetches and imports new records every 3 hours; historical data must be imported manually. Kepler currently supports two data sources, the National Vulnerability Database and NPM Advisories. You can import historical data as follows.
To import NIST records from all available years (2002 to 2025):
for year in $(seq 2002 2025); do
docker exec -it kepler kepler import_nist $year -d /data
done
- The system automatically fetches and imports new records every 3 hours, using a scheduled ofelia job.
- Append the `--refresh` argument if you want to re-fetch records from the National Vulnerability Database (NVD) source.
Example - Refresh data for 2025
docker exec -it kepler kepler import_nist 2025 -d /data --refresh
Example - Custom batch size via `-e KEPLER__BATCH_SIZE`
docker exec -it -e KEPLER__BATCH_SIZE=4500 kepler kepler import_nist 2025 -d /data --refresh
NOTE: Postgres supports at most 65535 bound parameters per statement, so be aware of this limit when changing the default `KEPLER__BATCH_SIZE=5000` (see the Postgres limits documentation).
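The reason the batch size matters: in a bulk INSERT, every column of every row binds one parameter, so the safe batch size is the 65535-parameter cap divided by the columns per row. A quick sanity check (the column count of 13 below is an assumption for illustration, not kepler's actual schema):

```python
# Postgres allows at most 65535 bound parameters per statement.
# A bulk INSERT binds one parameter per column per row, so:
#   max_batch = 65535 // columns_per_row
PG_MAX_PARAMS = 65535

def max_batch_size(columns_per_row: int) -> int:
    return PG_MAX_PARAMS // columns_per_row

# Assuming (hypothetically) 13 columns per record:
print(max_batch_size(13))  # 5041, so the default of 5000 stays under the cap
```

If you raise `KEPLER__BATCH_SIZE` past this bound for your schema's actual column count, inserts will fail with a parameter-limit error.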
If you want to rebuild your database, follow these steps:
docker compose down -v   # -v also removes volumes
docker compose build     # optional (only if you made backend changes)
docker compose up
Then re-run the NIST data import step.
There are currently two primary APIs: the `product` API and the `cve` API, detailed below.
Products can be listed:
curl http://localhost:8000/products
Grouped by vendor:
curl http://localhost:8000/products/by_vendor
Or searched:
curl http://localhost:8000/products/search/iphone
To use the vulnerabilities search API via cURL (prepend `node-` to the product name to search for NPM-specific packages):
curl \
--header "Content-Type: application/json" \
--request POST \
--data '{"product":"libxml2","version":"2.9.10"}' \
http://localhost:8000/cve/search
Responses are cached in memory with an LRU limit of 4096 elements.
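The caching behaviour is that of a standard LRU: once 4096 distinct queries are cached, the least recently used entry is evicted to make room. A minimal Python sketch of the idea (illustrative only; kepler's actual cache is implemented in Rust):

```python
from collections import OrderedDict

# Minimal LRU cache sketch mirroring the 4096-entry response cache
# (illustrative only; not kepler's actual implementation).
class LruCache:
    def __init__(self, capacity: int = 4096):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

cache = LruCache(capacity=2)
cache.put(("libxml2", "2.9.10"), ["example-cve-id"])
cache.put(("iphone", "15"), [])
cache.get(("libxml2", "2.9.10"))          # touch -> most recently used
cache.put(("node-lodash", "4.17.0"), [])  # evicts ("iphone", "15")
print(cache.get(("iphone", "15")))        # None: it was least recently used
```

In practice this means repeated searches for the same product/version pair are served from memory, while rarely queried pairs age out.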
If you get a `linking with cc` error similar to the one below, you are likely missing some C-related tooling or libraries.
error: linking with `cc` failed: exit status: 1
//...
= note: /usr/bin/ld: cannot find -lpq: No such file or directory
collect2: error: ld returned 1 exit status
This one requires the PostgreSQL client C library to be installed:
Fedora
sudo dnf install postgresql-devel
Arch
sudo pacman -S postgresql-libs