This repository has been archived by the owner on Dec 5, 2021. It is now read-only.

Store packages on S3 #91

Open
nanorepublica opened this issue Jan 30, 2018 · 0 comments
Currently pyshop defaults to storing packages locally on disk. This is not ideal if your server can be killed at any point (e.g. on a typical cloud service, or during a new deployment).

The lines that currently enforce this behaviour are:

dir_ = os.path.join(settings['pyshop.repository'],
                    filename[0].lower())
if not os.path.exists(dir_):
    os.makedirs(dir_, 0o750)
filepath = os.path.join(dir_, filename)
while os.path.exists(filepath):
    if asbool(settings.get('pyshop.upload.never_overwrite', '0')):
        # policy: don't overwrite files that already exist in the repo
        raise exc.HTTPConflict(
            "Uploading version ({}) would overwrite existing file"
            .format(params['version']))
    log.warning('File %s exists but new upload self.request, deleting',
                filepath)
    os.unlink(filepath)
size = 0
with open(filepath, 'wb') as output_file:
    input_file.seek(0)
    while True:
        data = input_file.read(2 << 16)
        if not data:
            break
        size += len(data)
        output_file.write(data)

It would be great to have the storage mechanism as a configurable option rather than being forced to use local disk. I would suggest using pyramid_storage (https://pypi.python.org/pypi/pyramid_storage/0.1.2) to abstract the storage mechanism and make it configurable and extensible. A rough sketch of what that could look like follows.
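For illustration only: this is how I imagine the upload path could look if it went through pyramid_storage instead of writing to disk directly. The setting names (storage.backend, storage.aws.*) and the exists/delete/save_file calls reflect my reading of pyramid_storage's interface, so treat the details as assumptions rather than a drop-in patch:

# development.ini (assumed pyramid_storage settings)
# storage.backend = s3
# storage.aws.bucket_name = my-pyshop-packages
# storage.aws.access_key = ...
# storage.aws.secret_key = ...

from pyramid import httpexceptions as exc
from pyramid.settings import asbool


def store_release_file(request, input_file, filename, version):
    """Persist an uploaded package through the configured storage backend."""
    settings = request.registry.settings
    storage = request.storage  # provided by config.include('pyramid_storage')
    folder = filename[0].lower()
    key = '%s/%s' % (folder, filename)

    if storage.exists(key):
        if asbool(settings.get('pyshop.upload.never_overwrite', '0')):
            raise exc.HTTPConflict(
                "Uploading version ({}) would overwrite existing file"
                .format(version))
        storage.delete(key)

    input_file.seek(0)
    # save_file streams the upload to whichever backend is configured
    # (local disk, S3, ...), replacing the hard-coded open()/write() loop.
    storage.save_file(input_file, filename, folder=folder)

With something like this in place, switching between local disk and S3 becomes a matter of configuration rather than a code change.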
