
add support for endpoint url
dral3x committed Feb 17, 2024
1 parent 980a970 commit c3c305e
Showing 5 changed files with 55 additions and 44 deletions.
4 changes: 4 additions & 0 deletions sync-backups-s3/CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -1,5 +1,9 @@
# Changelog

## 1.0.2

- Add `endpoint_url` option to support non-AWS S3-compatible services

## 1.0.1

- Remove unnecessary option `aws_region`
9 changes: 6 additions & 3 deletions sync-backups-s3/README.md
@@ -8,18 +8,21 @@ Upload HomeAssistant backups to an AWS S3 bucket.
## Prerequisites

In order to use this add-on, you'll need an AWS account with:

- an S3 bucket
- an IAM user with access to that bucket.

Be aware of the security implications of managing an AWS account and its permissions.

This add-on also supports S3-compatible services (like DigitalOcean, Cloudflare, Backblaze, Wasabi and more). You only need to provide credentials and the endpoint URL of the service.
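As a sketch, an add-on configuration pointing at an S3-compatible service could look like the following. All values, including the endpoint URL, are placeholders, not working credentials; note that many non-AWS services only support the `STANDARD` storage class:

```yaml
aws_key_id: YOUR_ACCESS_KEY_ID
aws_key_secret: YOUR_SECRET_ACCESS_KEY
bucket_name: ha-backups
bucket_folder: /
delete_if_missing: false
storage_class: STANDARD
endpoint_url: "https://s3.us-west-004.backblazeb2.com"
```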

## Installation

The installation of this add-on is straightforward and no different from installing any other Home Assistant add-on.

1. Click the "Add add-on repository to my Home Assistant" button below to open the add-on repository on your Home Assistant instance, or manually add the repository `https://github.com/dral3x/ha-addons` in the Add-on Store.

[![Add add-on repository to my Home Assistant](https://my.home-assistant.io/badges/supervisor_add_addon_repository.svg)](https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https%3A%2F%2Fgithub.com%2Fdral3x%2Fha-addons)

2. Search for this add-on, and click it.
3. Click the "Install" button to install the add-on.
@@ -51,6 +54,6 @@ action:

At 2 AM a new backup will be created, and 15 minutes after that the upload to S3 will start.
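A sketch of that automation pair might look like the following. The `hassio.backup_full` and `hassio.addon_start` services are core Supervisor services; the add-on slug prefix is installation-specific and shown here as a placeholder:

```yaml
automation:
  - alias: "Nightly full backup"
    trigger:
      - platform: time
        at: "02:00:00"
    action:
      - service: hassio.backup_full
  - alias: "Sync backups to S3"
    trigger:
      - platform: time
        at: "02:15:00"
    action:
      - service: hassio.addon_start
        data:
          addon: XXXXXXXX_sync-backups-on-s3  # replace XXXXXXXX with your repository hash
```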

## Credits

This add-on is based on the work of [hassio-backup-s3](https://github.com/mikebell/hassio-backup-s3) and several other forks.
65 changes: 31 additions & 34 deletions sync-backups-s3/config.json
@@ -1,35 +1,32 @@
{
"name": "Sync Backups on Amazon S3",
"version": "1.0.1",
"slug": "sync-backups-on-s3",
"description": "Sync backups to your Amazon S3 bucket",
"url": "https://github.com/dral3x/ha-addons/tree/main/sync-backups-s3",
"arch": [
"aarch64",
"amd64"
],
"image": "ghcr.io/dral3x/sync-backups-on-s3-{arch}",
"init": false,
"boot": "manual",
"backup": "cold",
"hassio_role": "backup",
"options": {
"aws_key_id": null,
"aws_key_secret": null,
"bucket_name": null,
"bucket_folder": "/",
"delete_if_missing": false,
"storage_class": "STANDARD_IA"
},
"schema": {
"aws_key_id": "password",
"aws_key_secret": "password",
"bucket_name": "str",
"bucket_folder": "str",
"delete_if_missing": "bool",
"storage_class": "list(STANDARD|REDUCED_REDUNDANCY|STANDARD_IA|ONEZONE_IA|INTELLIGENT_TIERING|GLACIER|DEEP_ARCHIVE|GLACIER_IR)"
},
"map": [
"backup:rw"
]
}
"name": "Sync Backups on Amazon S3",
"version": "1.0.2",
"slug": "sync-backups-on-s3",
"description": "Sync backups to your Amazon S3 bucket",
"url": "https://github.com/dral3x/ha-addons/tree/main/sync-backups-s3",
"arch": ["aarch64", "amd64"],
"image": "ghcr.io/dral3x/sync-backups-on-s3-{arch}",
"init": false,
"boot": "manual",
"backup": "cold",
"hassio_role": "backup",
"options": {
"aws_key_id": null,
"aws_key_secret": null,
"bucket_name": null,
"bucket_folder": "/",
"delete_if_missing": false,
"storage_class": "STANDARD_IA",
"endpoint_url": ""
},
"schema": {
"aws_key_id": "password",
"aws_key_secret": "password",
"bucket_name": "str",
"bucket_folder": "str",
"delete_if_missing": "bool",
"storage_class": "list(STANDARD|REDUCED_REDUNDANCY|STANDARD_IA|ONEZONE_IA|INTELLIGENT_TIERING|GLACIER|DEEP_ARCHIVE|GLACIER_IR)",
"endpoint_url": "str?"
},
"map": ["backup:rw"]
}
4 changes: 4 additions & 0 deletions sync-backups-s3/run.sh
@@ -10,6 +10,7 @@ BUCKET=`jq -r .bucket_name /data/options.json`
FOLDER=`jq -r .bucket_folder /data/options.json`
STORAGE_CLASS=`jq -r .storage_class /data/options.json`
DELETE=`jq -r .delete_if_missing /data/options.json`
ENDPOINT_URL=`jq -r .endpoint_url /data/options.json`

# Setting up aws cli
aws configure set aws_access_key_id $KEY
@@ -23,6 +24,9 @@ OPTIONS="--storage-class $STORAGE_CLASS"
if [[ $DELETE == true ]]; then
OPTIONS+=" --delete"
fi
if [[ -n "$ENDPOINT_URL" ]]; then
OPTIONS+=" --endpoint-url=$ENDPOINT_URL"
fi

echo "Sync started..."

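The option assembly in `run.sh` can be sketched end to end. The values below are hard-coded placeholders standing in for the `jq` reads from `/data/options.json`, and since the `aws s3 sync` invocation itself is collapsed in the diff, only the assembled options string is echoed here:

```shell
#!/usr/bin/env sh
# Sketch of the OPTIONS assembly from run.sh, with placeholder values
# standing in for the jq reads from /data/options.json.
STORAGE_CLASS="STANDARD_IA"
DELETE="true"
ENDPOINT_URL="https://example-s3-compatible.invalid"  # placeholder; empty means plain AWS S3

OPTIONS="--storage-class $STORAGE_CLASS"
if [ "$DELETE" = "true" ]; then
  OPTIONS="$OPTIONS --delete"
fi
if [ -n "$ENDPOINT_URL" ]; then
  # --endpoint-url is the AWS CLI global flag for S3-compatible endpoints
  OPTIONS="$OPTIONS --endpoint-url=$ENDPOINT_URL"
fi

echo "$OPTIONS"
```

Leaving `ENDPOINT_URL` empty skips the flag entirely, so the CLI falls back to the default AWS endpoints.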
17 changes: 10 additions & 7 deletions sync-backups-s3/translations/en.yaml
@@ -1,19 +1,22 @@
configuration:
aws_key_id:
name: AWS Access Key
aws_key_id:
name: Access Key ID
description: The access key used to upload files on the bucket.
aws_key_secret:
name: AWS Secret Access Key
aws_key_secret:
name: Secret Access Key
description: The secret used to upload files on the bucket.
bucket_name:
name: S3 Bucket name
description:
bucket_folder:
name: Files prefix
description: The "folder" where backups are uploaded inside the bucket. It must end with /
delete_if_missing:
name: Delete files from bucket if missing locally
description: Set if you want to remove files from the bucket if those are missing from the local backup folder.
storage_class:
name: Storage Class
description: Visit https://aws.amazon.com/s3/storage-classes for performance and cost of each class.
endpoint_url:
name: S3-compatible API endpoint URL
description: Leave it empty to use the AWS S3 service. Otherwise, set it to the endpoint offered by other S3-compatible services (for example DigitalOcean, Backblaze, Wasabi or Cloudflare). Note that the URL must start with "https://".
