Welcome! To get started with our project, you need to install a few essential tools:
- Go (version 1.22 or higher)
- uv (a Python package manager)
- Rust
- Docker
Follow the instructions below to set up your environment.
Platform-specific installation instructions for each tool:
- Go: https://go.dev/doc/install
- uv: https://docs.astral.sh/uv/getting-started/installation/
- Rust: https://www.rust-lang.org/tools/install
- Docker: https://docs.docker.com/engine/install/
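Once installed, it can help to confirm that every tool is reachable from your shell before going further. A minimal sketch (the tool list mirrors this guide; adjust it to your setup):

```shell
# Check that each required tool is on PATH and report the result.
for tool in go uv rustc docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```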
The /templates folder contains templates for the environment variables needed to run the project. If you do not need to modify them, you can copy them as-is:

```shell
cp ./templates/.env.template ./.env
cp ./templates/grafana.env.template ./grafana.env
cp ./templates/production.env.template ./production.env
```
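If more templates are added to the project later, the copies can also be done generically. A sketch that assumes the `name.template` → `name` convention shown above (the second glob pattern is there to catch dotfiles such as `.env.template`):

```shell
# Copy every *.template from ./templates into the repo root,
# stripping the .template suffix; existing files are left untouched.
for t in ./templates/*.template ./templates/.*.template; do
  [ -e "$t" ] || continue              # pattern matched nothing
  dest="./$(basename "$t" .template)"
  [ -e "$dest" ] || cp "$t" "$dest"
done
```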
You will need the environment variables defined in the .env file to be loaded. You can either use the dotenv plugin or load them with:

```shell
export $(grep -v '^#' .env | xargs)
```
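A POSIX-shell alternative to the xargs one-liner, which also preserves quoted values containing spaces, is to turn on the shell's allexport mode while sourcing the file:

```shell
# Export everything defined in .env (if present) into the current shell.
if [ -f ./.env ]; then
  set -a      # auto-export every variable defined from here on
  . ./.env    # source the file in the current shell
  set +a      # stop auto-exporting
fi
```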
At the root of the repository, you will find a Makefile that assists with various tasks.
The first step is to download the data, which you will find here. Place it inside the /data directory.
The second step is to train the autoencoder. From the root directory, run:

```shell
make model/build
```

This command trains the model and saves the artifacts in the ml_service directory.
Next, you'll need to containerize the ML facade and the ML service. To do this, run:

```shell
make services/build
```
Once everything is trained and built, you can start the services by running:

```shell
make services/up
```
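Containers can take a moment to become healthy after `make services/up`, so it can be useful to poll before sending traffic. A small helper sketch; the URL is whatever health endpoint your services expose, which is not defined in this guide:

```shell
# Poll a URL until it answers with a 2xx/3xx status, or give up.
# Usage: wait_for <url> [max_tries]
wait_for() {
  url="$1"; tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    curl -sf "$url" >/dev/null && return 0
    i=$((i + 1)); sleep 1
  done
  return 1
}
```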
To stop the services, run:

```shell
make services/stop
```

To delete the containers, run:

```shell
make services/down
```
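The service lifecycle targets above typically wrap a container orchestrator. A hypothetical skeleton of what they might look like, assuming Docker Compose; the real Makefile at the repository root is the source of truth, and every recipe body here is an assumption:

```make
# Hypothetical skeleton -- not the project's actual recipes.
model/build:
	cd ml_service && uv run python train.py   # assumed training entry point

services/build:
	docker compose build

services/up:
	docker compose up -d

services/stop:
	docker compose stop

services/down:
	docker compose down
```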
You can send data to either the RabbitMQ server or the API endpoint by running:

```shell
make run/send-data --rabbitmq={true/false} --requests=10
```
You can log in to Grafana to check the dashboards and monitor predictions in real time:

```shell
make grafana
```