# MLflow On-Premise Deployment using Docker Compose

An MLflow server to manage the ML model lifecycle.

Last updated: 4/16/2026
Easily deploy an MLflow tracking server with a single command. MinIO serves as the S3-compatible artifact store, and PostgreSQL as the backend store.
## How to run
1. Clone this repository: `git clone https://github.com/rihal-om/mlflow-server.git`
2. `cd` into the `mlflow-server` directory
3. Build and run the containers with `docker-compose`: `docker-compose up -d --build`
4. Access the MLflow UI at http://localhost:5000
5. Access the MinIO UI at http://localhost:9002
6. Access the FastAPI docs at http://localhost:8008/docs
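Once the stack is up, a client can log runs against it. The sketch below shows the client-side setup; it assumes `mlflow` is installed locally, that MinIO's S3 API listens on port 9000, and uses placeholder credentials that must match whatever this repository's `docker-compose.yml` actually defines:

```python
import os

# Point the MLflow client at the tracking server from step 4.
TRACKING_URI = "http://localhost:5000"

# Artifact uploads go directly from the client to MinIO, so the client
# also needs the S3 endpoint and credentials. The port and credentials
# below are assumptions -- use the values from your docker-compose.yml.
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://localhost:9000"
os.environ["AWS_ACCESS_KEY_ID"] = "minio"
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio123"

# With the environment prepared, logging a run would look like:
# import mlflow
# mlflow.set_tracking_uri(TRACKING_URI)
# with mlflow.start_run():
#     mlflow.log_param("alpha", 0.5)
#     mlflow.log_metric("rmse", 0.27)
```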
## Containerization
The MLflow tracking server is composed of six Docker containers:
- MLflow server
- MLflow server config
- MinIO object storage server
- MinIO create bucket
- PostgreSQL database server
- FastAPI server
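The relationship between these containers can be sketched in compose terms. The fragment below is illustrative only: the service names, images, ports, and credentials are assumptions, not this repository's actual `docker-compose.yml`, and the config and bucket-creation helpers are omitted for brevity:

```yaml
services:
  db:                        # PostgreSQL backend store (runs, params, metrics)
    image: postgres:13
    environment:
      POSTGRES_USER: mlflow
      POSTGRES_PASSWORD: mlflow
      POSTGRES_DB: mlflow

  minio:                     # MinIO object storage (artifact store)
    image: minio/minio
    command: server /data --console-address ":9002"
    ports:
      - "9000:9000"          # S3 API
      - "9002:9002"          # web console

  mlflow:                    # MLflow tracking server
    build: .                 # built from this repository's Dockerfile
    command: >
      mlflow server
      --backend-store-uri postgresql://mlflow:mlflow@db:5432/mlflow
      --default-artifact-root s3://mlflow/
      --host 0.0.0.0
    ports:
      - "5000:5000"
    depends_on: [db, minio]
```

The key design point is the split of responsibilities: metadata goes through the tracking server into PostgreSQL, while artifacts are written to MinIO via its S3 API.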