Merge branch-24.03 into branch-24.06
dagardner-nv committed Apr 11, 2024
2 parents 951e3b8 + ab8d0a7 commit 90a6f59
Showing 12 changed files with 34 additions and 21 deletions.
13 changes: 13 additions & 0 deletions CHANGELOG.md
@@ -14,6 +14,19 @@
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
# Morpheus 24.03.01 (10 Apr 2024)

## 🚨 Breaking Changes

- Move MemoryDescriptor to the morpheus namespace ([#1602](https://github.com/nv-morpheus/Morpheus/pull/1602)) [@dagardner-nv](https://github.com/dagardner-nv)

## 🐛 Bug Fixes

- Switch to kafka 3.5.2 ([#1612](https://github.com/nv-morpheus/Morpheus/pull/1612)) [@dagardner-nv](https://github.com/dagardner-nv)
- Update mlflow to avoid CVE-2024-27132 and CVE-2024-27133 ([#1609](https://github.com/nv-morpheus/Morpheus/pull/1609)) [@dagardner-nv](https://github.com/dagardner-nv)
- Fix for databricks_cli import error ([#1604](https://github.com/nv-morpheus/Morpheus/pull/1604)) [@dagardner-nv](https://github.com/dagardner-nv)
- Move MemoryDescriptor to the morpheus namespace ([#1602](https://github.com/nv-morpheus/Morpheus/pull/1602)) [@dagardner-nv](https://github.com/dagardner-nv)


# Morpheus 24.03.00 (7 Apr 2024)

2 changes: 1 addition & 1 deletion ci/conda/recipes/morpheus/meta.yaml
@@ -91,7 +91,7 @@ outputs:
- grpcio # Version determined from cudf
- libmrc
- libwebp>=1.3.2 # Required for CVE mitigation: https://nvd.nist.gov/vuln/detail/CVE-2023-4863
- - mlflow>=2.2.1,<3
+ - mlflow>=2.10.0,<3
- mrc
- networkx>=2.8
- numpydoc =1.5.*
4 changes: 2 additions & 2 deletions ci/scripts/download_kafka.py
@@ -20,8 +20,8 @@
import pytest_kafka
from pytest_kafka.install import set_up_kafka

- DEFAULT_KAFKA_URL = 'https://downloads.apache.org/kafka/3.4.1/kafka_2.13-3.4.1.tgz'
- DEFAULT_KAFKA_TAR_ROOTDIR = 'kafka_2.13-3.4.1/'
+ DEFAULT_KAFKA_URL = 'https://downloads.apache.org/kafka/3.5.2/kafka_2.13-3.5.2.tgz'
+ DEFAULT_KAFKA_TAR_ROOTDIR = 'kafka_2.13-3.5.2/'


def main():
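For context, the pinned URL is handed to `set_up_kafka` inside `main()` (body elided above). A minimal standalone sketch of what that download step amounts to, using only the Python standard library (illustrative only; not the script's actual implementation):

```python
# Illustrative only: fetch and unpack the pinned Kafka release with the stdlib.
# The real script delegates this work to pytest_kafka.install.set_up_kafka.
import tarfile
import urllib.request

KAFKA_URL = 'https://downloads.apache.org/kafka/3.5.2/kafka_2.13-3.5.2.tgz'


def fetch_kafka(dest_dir: str = '.') -> None:
    archive, _ = urllib.request.urlretrieve(KAFKA_URL)
    with tarfile.open(archive, 'r:gz') as tar:
        # Unpacks into the kafka_2.13-3.5.2/ root directory under dest_dir.
        tar.extractall(dest_dir)


if __name__ == '__main__':
    fetch_kafka()
```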
2 changes: 1 addition & 1 deletion conda/environments/all_cuda-121_arch-x86_64.yaml
@@ -56,7 +56,7 @@ dependencies:
- librdkafka>=1.9.2,<1.10.0a0
- libtool
- libwebp=1.3.2
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- mrc=24.06
- myst-parser=0.18.1
- nbsphinx
2 changes: 1 addition & 1 deletion conda/environments/dev_cuda-121_arch-x86_64.yaml
@@ -45,7 +45,7 @@ dependencies:
- isort
- librdkafka>=1.9.2,<1.10.0a0
- libtool
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- mrc=24.06
- myst-parser=0.18.1
- nbsphinx
2 changes: 1 addition & 1 deletion conda/environments/examples_cuda-121_arch-x86_64.yaml
@@ -27,7 +27,7 @@ dependencies:
- jsonpatch>=1.33
- kfp
- libwebp=1.3.2
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- networkx=2.8.8
- newspaper3k=0.2
- nodejs=18.*
2 changes: 1 addition & 1 deletion conda/environments/runtime_cuda-121_arch-x86_64.yaml
@@ -16,7 +16,7 @@ dependencies:
- elasticsearch==8.9.0
- feedparser=6.0.10
- grpcio=1.59
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- networkx=2.8.8
- numpydoc=1.5
- nvtabular=23.08.00
4 changes: 2 additions & 2 deletions dependencies.yaml
@@ -253,7 +253,7 @@ dependencies:
- elasticsearch==8.9.0
- feedparser=6.0.10
- grpcio=1.59
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- networkx=2.8.8
- nvtabular=23.08.00
- pydantic
@@ -301,7 +301,7 @@ dependencies:
- dask=2023.12.1
- distributed=2023.12.1
- kfp
- - mlflow=2.9.2
+ - mlflow>=2.10.0,<3
- papermill=2.4.0
- s3fs=2023.12.2

2 changes: 1 addition & 1 deletion examples/digital_fingerprinting/production/conda_env.yml
@@ -27,7 +27,7 @@ dependencies:
- distributed
- kfp
- librdkafka
- - mlflow>=2.2.1,<3
+ - mlflow>=2.10.0,<3
- nodejs=18.*
- nvtabular=23.06
- papermill
@@ -24,7 +24,7 @@ RUN apt update && \
rm -rf /var/cache/apt/* /var/lib/apt/lists/*

# Install python packages
- RUN pip install "mlflow >=2.2.1,<3" boto3 pymysql pyyaml
+ RUN pip install "mlflow >=2.10.0,<3" boto3 pymysql pyyaml

# We run on port 5000
EXPOSE 5000
18 changes: 9 additions & 9 deletions models/mlflow/README.md
@@ -22,8 +22,8 @@ are included for publishing TensorRT, ONNX and FIL models to your MLflow Model Registry.

## Requirements

- * MLflow (tested on 1.24.0)
- * Python (tested on 3.8)
+ * MLflow (tested on 2.11.3)
+ * Python (tested on 3.11)

## Install Triton Docker Image

@@ -89,7 +89,7 @@
Create an MLflow container with a volume mounting the Triton model repository:
```bash
docker run -it -v /opt/triton_models:/triton_models \
--env TRITON_MODEL_REPO=/triton_models \
- --env MLFLOW_TRACKING_URI=localhost:5000 \
+ --env MLFLOW_TRACKING_URI="http://localhost:5000" \
--gpus '"device=0"' \
--net=host \
--rm \
```

@@ -115,29 +115,29 @@
The `publish_model_to_mlflow` script is used to publish `triton` flavor models to MLflow:
```
python publish_model_to_mlflow.py \
--model_name sid-minibert-onnx \
- --model_directory <path-to-morpheus-models-repo>/models/triton-model-repo/sid-minibert-onnx \
+ --model_directory /triton_models/triton-model-repo/sid-minibert-onnx \
--flavor triton
```
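
Note that `MLFLOW_TRACKING_URI` above now includes an explicit `http://` scheme. Before publishing, it can be worth confirming the tracking server is reachable; a minimal sketch using standard MLflow client calls, with the URI assumed to match the container setup above:

```python
import mlflow
from mlflow import MlflowClient

# MLflow expects a full URI with a scheme, e.g. "http://localhost:5000",
# rather than a bare "localhost:5000".
mlflow.set_tracking_uri("http://localhost:5000")

# Listing registered models is a cheap connectivity check against the tracking server.
client = MlflowClient()
for model in client.search_registered_models():
    print(model.name)
```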

## Deployments

The Triton `mlflow-triton-plugin` is installed on this container and can be used to deploy your models from MLflow to Triton Inference Server. The following are examples of how the plugin is used with the `sid-minibert-onnx` model that we published to MLflow above. For more information about the
- `mlflow-triton-plugin`, refer to Triton's [documentation](https://github.com/triton-inference-server/server/tree/r23.01/deploy/mlflow-triton-plugin)
+ `mlflow-triton-plugin`, refer to Triton's [documentation](https://github.com/triton-inference-server/server/tree/r24.03/deploy/mlflow-triton-plugin)

### Create Deployment

To create a deployment, use the following command:

##### CLI
```
- mlflow deployments create -t triton --flavor triton --name sid-minibert-onnx -m models:/sid-minibert-onnx/1
+ mlflow deployments create -t triton --flavor triton --name sid-minibert-onnx -m "models:/sid-minibert-onnx/1"
```

##### Python API
```
from mlflow.deployments import get_deploy_client
client = get_deploy_client('triton')
- client.create_deployment("sid-minibert-onnx", " models:/sid-minibert-onnx/1", flavor="triton")
+ client.create_deployment("sid-minibert-onnx", "models:/sid-minibert-onnx/1", flavor="triton")
```
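
Either way, the new deployment can be inspected afterwards through the same client; a short sketch using the generic MLflow deployments API (the exact fields returned depend on the `mlflow-triton-plugin` version):

```python
from mlflow.deployments import get_deploy_client

client = get_deploy_client('triton')
# Fetch the plugin's metadata for this deployment to confirm Triton picked it up.
print(client.get_deployment("sid-minibert-onnx"))
```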

### Delete Deployment
@@ -158,14 +158,14 @@ client.delete_deployment("sid-minibert-onnx")

### Update Deployment

##### CLI
```
- mlflow deployments update -t triton --flavor triton --name sid-minibert-onnx -m models:/sid-minibert-onnx/2
+ mlflow deployments update -t triton --flavor triton --name sid-minibert-onnx -m "models:/sid-minibert-onnx/1"
```

##### Python API
```
from mlflow.deployments import get_deploy_client
client = get_deploy_client('triton')
- client.update_deployment("sid-minibert-onnx", "models:/sid-minibert-onnx/2", flavor="triton")
+ client.update_deployment("sid-minibert-onnx", "models:/sid-minibert-onnx/1", flavor="triton")
```

### List Deployments
2 changes: 1 addition & 1 deletion models/mlflow/docker/Dockerfile
@@ -44,7 +44,7 @@ RUN sed -i 's/conda activate base/conda activate mlflow/g' ~/.bashrc
SHELL ["/opt/conda/bin/conda", "run", "-n", "mlflow", "/bin/bash", "-c"]

ARG TRITON_DIR=/mlflow/triton-inference-server
- ARG TRITON_VER=r24.01
+ ARG TRITON_VER=r24.03

RUN mkdir ${TRITON_DIR} && \
cd ${TRITON_DIR} && \
