
Setup Instructions

Welcome! To get started with our project, you need to install a few essential tools:

  • Go (version 1.22 or higher)
  • uv (a Python package manager)
  • Rust
  • Docker

Follow the instructions below to set up your environment.

Go

You will find the instructions for installing Go on your platform here: https://go.dev/doc/install

Uv

You will find the instructions for installing uv on your platform here: https://docs.astral.sh/uv/getting-started/installation/

Rust

You will find the instructions for installing Rust here: https://www.rust-lang.org/tools/install

Docker

You will find the instructions for installing Docker here: https://docs.docker.com/engine/install/
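
Once everything is installed, a quick sanity check can confirm each tool is on your PATH. This is an optional helper sketch, not part of the repository; the `check_tool` function is hypothetical and the tool list can be adjusted to your needs:

```shell
# Report whether a required tool is available on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found ($(command -v "$1"))"
    return 0
  else
    echo "$1: NOT FOUND - see the install links above"
    return 1
  fi
}

# Check every tool this project needs; keep going even if one is missing.
for tool in go uv rustc docker; do
  check_tool "$tool" || true
done
```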

Setup environment variables

In the /templates folder you will find templates for the environment variables needed to run the project. If you do not need to modify them, you can just run:

cp ./templates/.env.template ./.env
cp ./templates/grafana.env.template ./grafana.env
cp ./templates/production.env.template ./production.env

You will need the environment variables defined in the .env file to be loaded into your shell. You can either use the dotenv plugin or load them with:

export $(grep -v '^#' .env | xargs)
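
Note that the `export $(grep ... | xargs)` one-liner can mangle values that contain spaces or quotes. A more robust alternative is the standard POSIX `set -a` idiom, which auto-exports every variable the shell assigns while sourcing the file (shown here against a throwaway demo file; in practice you would source `./.env`):

```shell
# Demo .env file; APP_NAME deliberately contains a space.
cat > /tmp/demo.env <<'EOF'
# comment lines are handled natively by the shell
APP_NAME="demo app"
API_PORT=8080
EOF

set -a            # mark all subsequently assigned variables for export
. /tmp/demo.env   # source the file; assignments become exported vars
set +a            # restore normal behavior

echo "$APP_NAME on port $API_PORT"
```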

Next steps

At the root of the repository, you will find a Makefile that assists with various tasks.

Download the data

The first step is to download the data that you will find here. Put it inside the /data directory.

Build the Model

The second step is to train the autoencoder. From the root directory, run:

make model/build

This command will train the model and save the artifacts in the ml_service directory.

Build the Facade

Next, you'll need to containerize the ML facade and the ML service. To do this, run:

make services/build

Run the Services

Once everything is trained and built, you can start the services by running:

make services/up

To stop the services, run:

make services/stop

To delete the containers, run:

make services/down

Send the data

You can send data to either the RabbitMQ server or the API endpoint by running:

make run/send-data --rabbitmq={true/false} --requests=10

Check the dashboards

You can log in to Grafana to check the dashboards and monitor predictions in real time. To open it, run:

make grafana