Operationalizing ML models involves moving them from development to production to drive business value.
Preparing the model for deployment involves optimizing performance, ensuring it handles real-world data, and packaging it for integration into existing systems.
Deploying the model involves moving it from development to production, making it accessible to users and applications.
Once deployed, models must be continuously monitored for accuracy and reliability, and may need retraining on new data and updates to maintain effectiveness.
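The monitoring-and-retraining loop described above can be sketched in a few lines. This is an illustrative Python example, not code from any specific tool: it tracks a rolling window of prediction outcomes and flags the model for retraining when accuracy falls below a threshold (the window size and threshold values are assumptions for the example).

```python
from collections import deque

# Hypothetical monitoring sketch: keep a rolling window of prediction
# outcomes and flag the model when rolling accuracy drops below a
# threshold. Window size and threshold are illustrative assumptions.
class AccuracyMonitor:
    def __init__(self, window_size=100, threshold=0.8):
        self.outcomes = deque(maxlen=window_size)
        self.threshold = threshold

    def record(self, prediction, actual):
        # Store whether this prediction matched the observed label.
        self.outcomes.append(prediction == actual)

    def rolling_accuracy(self):
        if not self.outcomes:
            return None
        return sum(self.outcomes) / len(self.outcomes)

    def needs_retraining(self):
        acc = self.rolling_accuracy()
        return acc is not None and acc < self.threshold

monitor = AccuracyMonitor(window_size=4, threshold=0.75)
for pred, actual in [(1, 1), (0, 0), (1, 0), (0, 1)]:
    monitor.record(pred, actual)
print(monitor.rolling_accuracy())  # 0.5
print(monitor.needs_retraining())  # True
```

In a real deployment, ground-truth labels often arrive with a delay, so the same idea is usually applied in a scheduled batch job rather than inline with each prediction.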
The operationalized model must be integrated into existing workflows, applications, and decision-making processes to drive business impact.
Effective operationalization enables organizations to move beyond experimentation and derive tangible value from ML at scale, powering intelligent applications that personalize the customer experience and create real business value.
MLOps fosters collaboration between data scientists, ML engineers, and DevOps teams by providing a unified environment for experiment tracking, feature engineering, model management, and deployment. This breaks down silos and accelerates the entire machine learning lifecycle.
MLOps ensures high-quality, reliable models in production through clean datasets, proper testing, validation, CI/CD practices, monitoring, and governance.
MLOps enables reproducibility and compliance by versioning datasets, code, and models, providing transparency and auditability to ensure adherence to policies and regulations.
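One minimal way to realize the versioning idea above is to fingerprint the dataset and training configuration so each model version can be traced back to exactly what produced it. The sketch below is a hedged illustration; the field names are assumptions for the example, not a standard schema or any particular tool's API.

```python
import hashlib
import json

# Illustrative reproducibility sketch: hash the dataset bytes and the
# training configuration so a model artifact can be linked to the exact
# inputs that produced it. Field names are assumptions for this example.
def fingerprint(data_bytes: bytes, config: dict) -> dict:
    return {
        "dataset_sha256": hashlib.sha256(data_bytes).hexdigest(),
        # sort_keys makes the config hash independent of dict ordering.
        "config_sha256": hashlib.sha256(
            json.dumps(config, sort_keys=True).encode()
        ).hexdigest(),
    }

record = fingerprint(
    b"feature1,feature2\n1,2\n",
    {"model": "xgboost", "max_depth": 6},
)
print(record["dataset_sha256"][:12])
```

Storing a record like this alongside each model artifact gives auditors a verifiable link between a deployed model and the data and code that trained it; dedicated tools (e.g. data version control systems) apply the same hashing idea at larger scale.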
MLOps streamlines the ML lifecycle, enabling organizations to successfully deploy more projects to production and derive tangible business value and ROI from AI/ML investments at scale.
Mage offers features to build, run, and manage data pipelines for data transformation and integration, including pipeline orchestration, notebook environments, data integrations, and streaming pipelines for real-time data.
Mage helps prepare data, train machine learning models, and deploy them with accessible API endpoints.
Mage simplifies MLOps by providing a unified platform for data pipelining, model development, deployment, versioning, CI/CD, and maintenance, allowing developers to focus on model creation while improving efficiency and collaboration.
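A model deployed behind an HTTP endpoint, as described above, is typically called with a JSON request. The sketch below uses only the Python standard library; the URL and payload fields are hypothetical assumptions for illustration — consult your actual deployment's endpoint and schema.

```python
import json
import urllib.request

# Hypothetical sketch of calling a model served behind an HTTP API.
# The endpoint URL and the "inputs" payload shape are assumptions for
# this example, not a documented Mage schema.
ENDPOINT = "http://localhost:6789/api/predict"  # hypothetical URL

def build_request(features: dict) -> urllib.request.Request:
    body = json.dumps({"inputs": [features]}).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request({"trip_distance": 3.2, "passenger_count": 1})
print(req.get_method())  # POST
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

Keeping request construction in a small helper like this makes it easy to unit-test the payload shape without a live server.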
- Clone the repository containing the complete code for this module:

  ```bash
  git clone https://github.com/mage-ai/mlops.git
  cd mlops
  ```

- Launch Mage and the database service (PostgreSQL):

  ```bash
  ./scripts/start.sh
  ```

  If you don't have bash in your environment, modify the following command and run it:

  ```bash
  PROJECT_NAME=mlops \
  MAGE_CODE_PATH=/home/src \
  SMTP_EMAIL=$SMTP_EMAIL \
  SMTP_PASSWORD=$SMTP_PASSWORD \
  docker compose up
  ```

  It's OK if you get this warning; you can ignore it:

  ```
  The "PYTHONPATH" variable is not set. Defaulting to a blank string.
  ```
- The subproject that contains all the pipelines and code is named `unit_3_observability`.
- Open http://localhost:6789 in your browser.
- In the top left corner of the screen, next to the Mage logo and the `mlops` project name, click the project selector dropdown and choose the `unit_0_setup` option.
- Click on the pipeline named `example_pipeline`.
- Click on the button labeled `Run @once`.