- Install the AWS CLI - link to instruction
- Generate an access key for the user you will use to manage Terraform resources - link to the instruction
- Log in using the `aws configure` command
- Terraform version used here: v1.5.0
- Install Terraform - link to the official documentation
- Install terraform-docs for automatic documentation (optional)
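`aws configure` prompts for an access key ID, secret access key, default region, and output format, then writes them to the CLI's configuration files. As a sketch of the result (the key values and region below are placeholders, not real credentials):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>

# ~/.aws/config
[default]
region = eu-west-1
output = json
```

Terraform's AWS provider picks these up automatically, so no credentials need to appear in the `.tf` files.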
- For Terraform to work optimally, we should store our Terraform state in the cloud. You can store it locally, but it is not recommended. Details about Terraform state.
- Go to Amazon S3 and click Create Bucket
- Choose a name and region for your bucket
- You can leave the rest of the settings at their defaults
- Change the `bucket` value in the `main.tf` file to the name of the bucket you just created
- Adjust `variables.tf` and `local.tf` so the data matches your project; if you are not using a VPN, use `0.0.0.0/0` instead (it will make your application available from any IP)
- Run `terraform init`
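The `bucket` change above points Terraform's S3 backend at your state bucket. A minimal sketch of what the relevant blocks might look like (the bucket name, key, region, and the `vpn_cidr` variable are illustrative, not taken from the repo):

```hcl
terraform {
  backend "s3" {
    bucket = "mlflow-terraform-state"  # the bucket you just created
    key    = "mlflow/terraform.tfstate"
    region = "eu-west-1"               # the region you chose for it
  }
}

# Example of a variable to adjust in variables.tf: restrict access to your
# VPN's CIDR block, or fall back to 0.0.0.0/0 to allow any IP.
variable "vpn_cidr" {
  type        = string
  default     = "0.0.0.0/0"
  description = "CIDR block allowed to reach the application"
}
```

`terraform init` is what actually connects to this backend, so it must be re-run after the backend block changes.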
Documentation for terraform elements HERE
- Create the ECR repository using the command `terraform apply -target=aws_ecr_repository.mlflow_ecr`
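The `-target` flag applies only the ECR repository, which has to exist before the image can be pushed. A sketch of what such a resource might look like (the resource name `mlflow_ecr` comes from the command above; the repository name is illustrative):

```hcl
resource "aws_ecr_repository" "mlflow_ecr" {
  name                 = "mlflow"   # illustrative repository_name
  image_tag_mutability = "MUTABLE"  # allows re-pushing the :latest tag
}
```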
- Build an image based on the Dockerfile from `/mlflow-gcp/AWS`
- Upload the image to ECR with the tag `latest` - official instruction
Change `region`, `aws_account_id`, `repository_name` and `image_id` so they match your project.
- Authenticate Docker to AWS: `aws ecr get-login-password --region region | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.region.amazonaws.com`
- Tag your image: `docker tag image_id aws_account_id.dkr.ecr.region.amazonaws.com/repository_name:latest`
- Push your image to AWS: `docker push aws_account_id.dkr.ecr.region.amazonaws.com/repository_name:latest`
- Use `terraform plan` to review what elements will be created
- Use `terraform apply` to set up the rest of the infrastructure (it will take a while)
- In the folder `AWS/sample_client`, create a `.env` file based on the template
- `check_tracking.py` - uploads an artifact to the bucket; it will be visible in the MLflow GUI
- `upload_model.py` - uploads a more advanced model
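The sample clients read their configuration (tracking URI, credentials) from the `.env` file. As a minimal, self-contained sketch of that pattern, assuming the template holds plain `KEY=value` lines (the function name and the example keys are illustrative, not taken from the repo):

```python
import os


def load_env(path=".env"):
    """Parse simple KEY=value lines into os.environ, skipping blanks and comments."""
    loaded = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)  # make the values visible to mlflow and boto3
    return loaded
```

With the values in the environment, the MLflow client picks up e.g. `MLFLOW_TRACKING_URI` automatically when the scripts run.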