
best practices on production deployment? #76

Open
RafalSkolasinski opened this issue Nov 4, 2020 · 6 comments

Comments

@RafalSkolasinski

RafalSkolasinski commented Nov 4, 2020

What is the recommended way of deploying mlmd in production?

The only piece of documentation I found about starting the gRPC server is this command:

bazel run -c opt --define grpc_no_ares=true  //ml_metadata/metadata_store:metadata_store_server

Did I miss something?

What would be the best approach to deploying mlmd in Kubernetes?
Some example manifests or documentation on this would be very helpful.
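
For context, here is a minimal sketch of the kind of manifest I have in mind. The image name and tag, port, flag names, and database settings are all assumptions (loosely modeled on how Kubeflow Pipelines deploys its metadata server), not an official recommendation:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: metadata-grpc-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: metadata-grpc-server
  template:
    metadata:
      labels:
        app: metadata-grpc-server
    spec:
      containers:
        - name: metadata-grpc-server
          # Assumed prebuilt MLMD server image; pin whatever version you actually use.
          image: gcr.io/tfx-oss-public/ml_metadata_store_server:1.0.0
          args:
            - --grpc_port=8080
            # Assumed flags pointing the server at an external MySQL instance.
            - --mysql_config_host=metadata-db
            - --mysql_config_port=3306
            - --mysql_config_database=metadb
            - --mysql_config_user=root
          ports:
            - name: grpc-api
              containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: metadata-grpc-service
spec:
  selector:
    app: metadata-grpc-server
  ports:
    - name: grpc-api
      port: 8080
      protocol: TCP
```

Clients inside the cluster would then connect to `metadata-grpc-service:8080` over gRPC.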

@paveldournov

Please take a look at the Kubeflow Pipelines project; it uses MLMD for tracking the lineage of artifacts and jobs.

https://github.com/kubeflow/pipelines

@hughmiao
Contributor

@dushyanthsc for deployment guidelines from the KFP side.

@Jeffwan

Jeffwan commented Nov 12, 2020

What's the plan for multi-tenancy support? I know MLMD doesn't have this concept yet; what's the best practice we should follow in the meantime?

@hughmiao
Contributor

@Jeffwan In the current release, each mlmd-server talks to a single db instance. If the users are allowed to share the same db, then reusing a single server with the released image is fine. When clients need to store to different dbs, multiple server instances are needed at the moment. Could you elaborate on the multi-tenancy use case in your deployment, so we can better understand the priorities?
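
To illustrate the one-server-per-db pattern: each server instance would get its own connection config, e.g. a `MetadataStoreServerConfig` text proto passed to the server binary. The field names below follow my reading of the MLMD protos, and the host/database values are placeholders for one tenant's instance:

```
connection_config {
  mysql {
    host: "tenant-a-db"
    port: 3306
    database: "metadb"
    user: "mlmd"
  }
}
```

A second tenant would run another server instance with the same config shape pointing at its own database.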

@Jeffwan

Jeffwan commented Nov 14, 2020

@hughmiao Thanks for the explanation. Sure, let me collect more requirements internally and come back with a concrete summary, and then we can have some discussion.

@htahir1

htahir1 commented May 30, 2022

I'm also looking for some guidance here. How exactly do we deploy MLMD in production? Is there a Docker image that we can simply run without going through Bazel? Any tips would be helpful :)
