If you want to boot up an MLflow project with a one-liner, this repo is for you. The only requirement is Docker installed on your system; we are going to use Bash on Linux/Windows.
- Configure the `.env` file to your liking; anything you put there will be used to configure your services (see the sketch below)
- Run `docker compose up`
- Open http://localhost:5000 for MLflow, and http://localhost:9001/ to browse your files in the S3 artifact store
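As a starting point, here is a minimal `.env` sketch. The variable names and values below are illustrative assumptions, not guaranteed to match this repo; keep whatever names the repository's `.env` and compose file actually use.

```bash
# Hypothetical .env sketch -- names and values are assumptions for illustration;
# check the .env shipped with the repo for the variables your compose file reads.
AWS_ACCESS_KEY_ID=minio_user
AWS_SECRET_ACCESS_KEY=minio_password
MYSQL_DATABASE=mlflow
MYSQL_USER=mlflow_user
MYSQL_PASSWORD=mlflow_password
MYSQL_ROOT_PASSWORD=mlflow_root_password
```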
👇 Video tutorial on how to set it up + BONUS with Microsoft Azure 👇
- One-file setup (`.env`)
- MinIO S3 artifact store with GUI
- MySQL MLflow backend store
- Ready-to-use bash scripts for Python development
- Automatically created S3 buckets (see the quick check after this list)
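Once `docker compose up` has finished, a quick sanity check is to list the running services; the exact service names depend on the repo's `docker-compose.yml`, so treat the expected output described in the comment as an assumption.

```bash
# List the running services; you should see the MLflow server, MySQL and
# MinIO containers (exact service names come from docker-compose.yml).
docker compose ps
```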
- Configure your client side

For running MLflow projects you need several environment variables set on the client side. To generate them, use the convenience script `./bashrc_install.sh`, which installs them on your system, or `./bashrc_generate.sh`, which just displays the config for you to copy & paste.
```bash
$ ./bashrc_install.sh
[ OK ] Successfully installed environment variables into your .bashrc!
```
The script installs these variables: `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `MLFLOW_S3_ENDPOINT_URL`, `MLFLOW_TRACKING_URI`. All of them are needed to use MLflow from the client side.
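For reference, the generated config boils down to four `export` lines like the sketch below. The URLs follow the default local ports used elsewhere in this README; the key values are assumptions and must match your `.env`.

```bash
# Sketch of the client-side config (key values are assumptions -- they must
# match your .env; the URLs assume the default local ports).
export AWS_ACCESS_KEY_ID=minio_user
export AWS_SECRET_ACCESS_KEY=minio_password
export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
export MLFLOW_TRACKING_URI=http://localhost:5000
```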
- Test the pipeline with the command below, which uses conda. If you don't have conda installed, run with `--no-conda`:

```bash
mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
# or
python ./quickstart/mlflow_tracking.py
```
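After a run finishes, you can confirm that artifacts actually landed in MinIO. The use of the AWS CLI and the bucket name `mlflow` are assumptions for illustration; substitute the bucket your setup auto-creates.

```bash
# List artifacts written by the run (assumes the AWS CLI is installed and the
# auto-created bucket is named "mlflow" -- adjust to your setup; credentials
# come from the environment variables installed above).
aws --endpoint-url http://localhost:9000 s3 ls s3://mlflow/ --recursive
```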
- (Optional) If you are constantly switching environments, you can set the variables inline for a single command:

```bash
MLFLOW_S3_ENDPOINT_URL=http://localhost:9000 MLFLOW_TRACKING_URI=http://localhost:5000 mlflow run git@github.com:databricks/mlflow-example.git -P alpha=0.5
```
Copyright (c) 2021 Tomasz Dłuski
Licensed under the MIT License (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License by reviewing the file LICENSE in the repository.