Change the way backups are handled! #22

Open

vjeeva opened this issue Oct 20, 2017 · 1 comment

vjeeva (Owner) commented Oct 20, 2017

Backups are copied even when they are local, then streamed over an SSH tunnel into the engine container. This means we can't take advantage of parallel restore jobs, and remote SSH file streaming is super slow.

We have to get off of pyopen3 and use `docker exec` directly on the container!
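
A minimal sketch of that direction, assuming the backup file is already visible inside the container via a host mount (the container name, paths, and database name below are hypothetical, not this repo's actual config):

```python
import subprocess

# Hypothetical names for illustration; the real values live in this repo's config.
ENGINE_CONTAINER = "dbaas-engine"
BACKUP_PATH = "/backups/dump.pgdump"  # visible via a host mount, not an SSH pipe

# Run the restore inside the engine container with `docker exec` instead of
# streaming the dump through an SSH tunnel. Because pg_restore reads the file
# directly, it can also use parallel jobs (-j).
subprocess.run(
    ["docker", "exec", ENGINE_CONTAINER,
     "pg_restore", "-j", "4", "-d", "mydb", BACKUP_PATH],
    check=True,
)
```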

vjeeva (Owner, Author) commented Oct 20, 2017

Covered in #20, but only for Postgres. Will need to do this for Mongo and MySQL as well.
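
The analogous exec-based restores for the other engines might look something like this (a sketch only; container names, credentials, and dump paths are assumptions):

```python
import subprocess

def exec_restore(container, cmd):
    """Run a restore command inside a database container via `docker exec`."""
    subprocess.run(["docker", "exec", container] + cmd, check=True)

# Mongo: mongorestore reads a dump directory mounted into the container.
exec_restore("dbaas-mongo", ["mongorestore", "/backups/mongo-dump"])

# MySQL: the `mysql` client reads a SQL dump from stdin, so shell out for the redirect.
exec_restore("dbaas-mysql",
             ["sh", "-c", 'mysql -u root -p"$MYSQL_ROOT_PASSWORD" mydb < /backups/dump.sql'])
```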

vjeeva pushed a commit that referenced this issue Oct 23, 2017
* Updated README with what we need, plus added triggering functionality to the UI

* Added support for reading remote logs

* Saving work in progress on logs based on configs.

* Automation tool triggering added; log dumping is now local or S3 based on the driver, and logs can be dumped to and retrieved from either. Also finally routed things properly so Docker images load backups via a mount rather than an SSH-tunnel file pipe (see the sketch after this list). Added support for Postgres pg_restore parallel restore. Not sure if I missed anything else.

* Updated README for leftover items

* S3 backups go into a host-mounted tmp dir

* Fix to tmp dir
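
A sketch of the flow these commits describe, pulling an S3 backup into a host-mounted tmp dir and restoring in parallel (bucket, paths, and container name are illustrative assumptions):

```python
import subprocess
import boto3

HOST_TMP = "/tmp/dbaas-backups"   # host tmp dir mounted into the container as /backups (assumed)
CONTAINER = "dbaas-postgres"      # hypothetical engine container name

# Download the backup from S3 into the host-mounted tmp dir (placeholder bucket/key).
boto3.client("s3").download_file("my-backup-bucket", "dump.pgdump",
                                 f"{HOST_TMP}/dump.pgdump")

# The container sees the file through the mount, so pg_restore reads it directly
# and can run with parallel jobs (-j) -- no SSH-tunnel file pipe involved.
subprocess.run(
    ["docker", "exec", CONTAINER,
     "pg_restore", "-j", "4", "-U", "postgres", "-d", "mydb",
     "/backups/dump.pgdump"],
    check=True,
)
```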