Use multi-stage build for Docker
pi-sigma committed Feb 7, 2024
1 parent d0f37c2 commit 4c52eae
Showing 5 changed files with 42 additions and 45 deletions.
54 changes: 38 additions & 16 deletions Dockerfile
@@ -1,24 +1,46 @@
-FROM python:3.10-slim
+#
+# Backend build
+#
+FROM python:3.11-slim-bookworm AS backend
+
+# build deps
+RUN apt-get update && apt-get upgrade && apt-get install -y --no-install-recommends \
+    build-essential \
+    libpq-dev
+
+# get latest version of pip
+RUN pip install pip -U
+
+# install requirements
+COPY /requirements/* /app/requirements/
+RUN pip install -r /app/requirements/base.txt
+
+# pyppeteer deps (https://stackoverflow.com/a/71935536)
+RUN xargs apt-get install -y --no-install-recommends < /app/requirements/pyppeteer_deps.txt
+
+
+#
+# Final build
+#
+FROM python:3.11-slim-bookworm AS final
+
 ENV PIP_DISABLE_PIP_VERSION_CHECK 1
 ENV PYTHONDONTWRITEBYTECODE 1
 ENV PYTHONUNBUFFERED 1
 ENV DJANGO_ENV "BASE"
 
-COPY requirements/* requirements/
+RUN apt-get update && apt-get upgrade -y && apt-get install -y --no-install-recommends \
+    postgresql-client
 
-RUN apt-get update \
-    # psycopg2 + deps
-    && apt-get install -y --no-install-recommends build-essential \
-    && apt-get install -y --no-install-recommends libpq-dev \
-    && pip install psycopg2 \
-    # pyppeteer deps (cf. https://stackoverflow.com/a/71935536)
-    && xargs apt-get install -y --no-install-recommends < requirements/pyppeteer_deps.txt \
-    && pip install -r requirements/production.txt
+# copy backend deps
+COPY --from=backend /usr/local/lib/python3.11 /usr/local/lib/python3.11
+COPY --from=backend /usr/local/bin/ /usr/local/bin/
 
-COPY . /usr/src/app
-WORKDIR /usr/src/app
+COPY . /app
+WORKDIR /app
 
-RUN python manage.py collectstatic --link --no-input
+# create user and drop privileges
+RUN useradd -m pi-sigma
+RUN chown -R pi-sigma /app
+USER pi-sigma
 
-RUN useradd -m myuser
-USER myuser
+RUN python manage.py collectstatic --link --no-input
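For local verification of the new stages, they can be built individually. This is a minimal sketch, not part of the commit: it reuses the `nous_aggregator` image name from docker-compose.yml, while the `:backend` tag is an arbitrary choice for illustration.

```sh
# Build everything; only the last stage ("final") is tagged
docker build -t nous_aggregator .

# Build and tag just the backend stage, e.g. to inspect the installed packages
docker build --target backend -t nous_aggregator:backend .
```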
25 changes: 0 additions & 25 deletions Dockerfile.dev

This file was deleted.

4 changes: 2 additions & 2 deletions README.md
@@ -68,8 +68,8 @@ http://localhost:8000
 ```
 
 If all went well, you should see the homepage of the app with a list of news sources arranged in a grid.
-The grids are empty to begin with, but the celery workers will start right away and you should see the
-first articles displayed shortly thereafter.
+The grids are empty to begin with and fill up when the celery workers start
+(depends on the schedule in `scraper.tasks`).
 
 In order to extract data about the sources from the database, use the following command while the web container is running (the commands for the other tables are analogous):
 ```sh
2 changes: 1 addition & 1 deletion docker-compose.yml
@@ -14,7 +14,7 @@ services:
     image: nous_aggregator
     build: &web_build
       context: .
-      dockerfile: Dockerfile.dev
+      dockerfile: Dockerfile
     environment: &web_env
       - DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE}
      - SECRET_KEY=${SECRET_KEY}
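Since the compose file now points at the multi-stage Dockerfile instead of the deleted Dockerfile.dev, the image has to be rebuilt before the stack is brought up again. A minimal sketch of that workflow, assuming the stack is otherwise unchanged:

```sh
# Rebuild the image from the new Dockerfile
docker compose build

# Recreate the containers with the fresh image and follow the logs
docker compose up -d
docker compose logs -f
```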
2 changes: 1 addition & 1 deletion scraper/tasks.py
@@ -1,6 +1,6 @@
 magazines = {
     "en": {
-        "schedule": 3600 * 4, # 4h
+        "schedule": 3600,
         "titles": [
             "Al Jazeera",
             "Associated Press",
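With the scrape interval tightened from four hours to one, already-running containers will not pick up the change until their processes restart, assuming the celery beat schedule is built from `scraper.tasks` at start-up (as the README change above suggests). A hedged sketch:

```sh
# Restart the stack so the scheduler re-reads scraper.tasks
# (assumption: the schedule is only loaded at process start-up)
docker compose restart
```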
