
MLflow On-Premise Deployment using Docker Compose

An MLflow tracking server to manage the ML model lifecycle.

Last updated: 4/16/2026


Easily deploy an MLflow tracking server with a single command.

MinIO is used as the S3-compatible artifact store, and a PostgreSQL server is used as the backend store.
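
These two stores map onto the flags the MLflow server is started with. The sketch below shows the shape of the two URIs; the user, password, hostnames, and bucket name are illustrative assumptions, not values copied from this repository (the real ones live in docker-compose.yml and its environment files).

```python
# Illustrative values only -- the real credentials come from docker-compose.yml.
pg_user, pg_password = "mlflow", "mlflow"

# PostgreSQL holds experiments, runs, params, and metrics (the backend store).
# "db" is an assumed compose service hostname.
backend_store_uri = f"postgresql://{pg_user}:{pg_password}@db:5432/mlflow"

# MinIO holds model files and other artifacts (the artifact store).
# "mlflow" is an assumed bucket name.
default_artifact_root = "s3://mlflow/"

# Inside the container, the server is started roughly as:
#   mlflow server \
#     --backend-store-uri <backend_store_uri> \
#     --default-artifact-root <default_artifact_root> \
#     --host 0.0.0.0
print(backend_store_uri.split("://")[0])
```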

How to run

  1. Clone this repository

    git clone https://github.com/rihal-om/mlflow-server.git
    
  2. cd into the mlflow-server directory

  3. Build and run the containers with docker-compose

    docker-compose up -d --build
    
  4. Access the MLflow UI at http://localhost:5000

  5. Access the MinIO console at http://localhost:9002

  6. Access the FastAPI docs at http://localhost:8008/docs
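
Once the stack is up, a client points at it through environment variables. A minimal sketch, assuming the default ports from the steps above and placeholder MinIO credentials (the actual keys are defined in the compose configuration):

```python
import os

# Point the MLflow client at the local tracking server.
os.environ["MLFLOW_TRACKING_URI"] = "http://localhost:5000"

# The client uploads artifacts directly to MinIO, so it also needs the
# S3 endpoint and credentials. Port and keys below are assumptions --
# use the values from your docker-compose.yml.
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://localhost:9000"
os.environ["AWS_ACCESS_KEY_ID"] = "minio"
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio123"

# With the environment set, a run would be logged like this:
#   import mlflow
#   with mlflow.start_run():
#       mlflow.log_param("lr", 0.01)
#       mlflow.log_metric("loss", 0.42)
print(os.environ["MLFLOW_TRACKING_URI"])
```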

Containerization

The MLflow tracking server is composed of six Docker containers:

  • MLflow server
  • MLflow server config
  • MinIO object storage server
  • MinIO create bucket
  • PostgreSQL database server
  • FastAPI server
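
A trimmed docker-compose sketch of how these services could fit together. Service names, image tags, build paths, and ports here are illustrative assumptions; see the repository's docker-compose.yml for the real definitions.

```yaml
# Illustrative sketch only -- not the repository's actual compose file.
services:
  db:                        # PostgreSQL backend store
    image: postgres
    environment:
      POSTGRES_DB: mlflow
  minio:                     # MinIO object storage server
    image: minio/minio
    command: server /data --console-address ":9002"
    ports:
      - "9000:9000"
      - "9002:9002"
  mlflow:                    # MLflow tracking server
    build: ./mlflow          # assumed build context
    ports:
      - "5000:5000"
    depends_on:
      - db
      - minio
  api:                       # FastAPI server
    build: ./api             # assumed build context
    ports:
      - "8008:8008"
    depends_on:
      - mlflow
```

The one-off containers (the MLflow server config and MinIO bucket-creation steps) would run alongside these as short-lived init services that exit after doing their setup.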