██████╗ ███████╗██████╗ ██╗ ██╗████████╗██╗ ██╗
██╔══██╗██╔════╝██╔══██╗██║ ██║╚══██╔══╝╚██╗ ██╔╝
██║ ██║█████╗ ██████╔╝██║ ██║ ██║ ╚████╔╝
██║ ██║██╔══╝ ██╔═══╝ ██║ ██║ ██║ ╚██╔╝
██████╔╝███████╗██║ ╚██████╔╝ ██║ ██║
╚═════╝ ╚══════╝╚═╝ ╚═════╝ ╚═╝ ╚═╝
>----------------------------- REST API for Proxies
- REST API providing free proxies to use
- Integration with scrapydoo to obtain proxies
- Integration with inspector to validate proxies
- Proxies updated and checked hourly
# Copy the example environment file to .env
# SCRAPYD_URL must be set for the workflow to work.
# You can get an instance up and running through https://github.com/zubedev/scrapydoo
cp .env.example .env
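A minimal .env might look like the sketch below. SCRAPYD_URL and DEBUG are mentioned in this README; any other variables and the example values are illustrative placeholders, so check .env.example for the actual names and defaults.

```shell
# Point at a running scrapydoo instance (required for proxy scraping)
SCRAPYD_URL=http://localhost:6800

# Enables the browsable API at http://localhost:8000
DEBUG=True
```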
# Build the docker image and run the container
docker-compose up --build --detach
# You can scale up the number of workers for more concurrency
docker-compose up --scale worker=4 --detach
Deputy API is now available at http://localhost:8000. If DEBUG=True, you can see the browsable API.
Deputy Admin is now available at http://localhost:8000/admin. Credentials are set automatically from the .env file.
- random: `/proxies/random` - get a random proxy
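A quick sketch of consuming the endpoint from Python with only the standard library. The response field names (`protocol`, `ip`, `port`) are assumptions, not confirmed by this README; inspect the browsable API for the actual serializer fields.

```python
import json
from urllib.request import urlopen


def to_requests_proxies(proxy: dict) -> dict:
    """Convert a proxy record into a requests-style proxies mapping.

    Assumes the record has `protocol`, `ip`, and `port` keys; adjust to
    match the real response shape.
    """
    url = f"{proxy['protocol']}://{proxy['ip']}:{proxy['port']}"
    return {"http": url, "https": url}


def fetch_random_proxy(base_url: str = "http://localhost:8000") -> dict:
    """GET a random proxy from a running Deputy instance."""
    with urlopen(f"{base_url}/proxies/random") as resp:
        return json.loads(resp.read())
```

`fetch_random_proxy` requires the stack from the steps above to be up; `to_requests_proxies` can then feed the result straight into `requests.get(..., proxies=...)`.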
# Poetry is required for installing and managing dependencies
# https://python-poetry.org/docs/#installation
poetry install
# If you don't like doing `poetry run` all the time
# poetry shell # Activate virtual environment in terminal
# Requires a PostgreSQL database to be running and configured in .env
# poetry run python manage.py makemigrations # Create migrations
poetry run python manage.py migrate # Run migrations
# Collect static files for whitenoise
poetry run python manage.py collectstatic
# Run Deputy API
poetry run python manage.py runserver 0.0.0.0:8000
# Create a superuser
poetry run python manage.py createsuperuser
# Install pre-commit hooks
poetry run pre-commit install
# Formatting (formats code in place)
poetry run black .
# Linting (add --fix to fix issues automatically)
poetry run ruff .
poetry run ruff --fix .
# Type checking
poetry run mypy .
Configuration details can be found in pyproject.toml.