An async program that takes requests to fetch TV guides and spins up async Celery workers to do the job.
- Create a `.env` file following the example in `.env.example`.
- Add the domain you are hosting the application on to the `ALLOWED_HOSTS` in the `.env` file.
- Spin up the docker containers.
- Create a folder in the root directory called `temp`. We will use this to clone the `siteini.pack` in the next step.
- First you need to hit the endpoint that updates the `siteini.pack` folder. You can look at the swagger docs to find the endpoint. Currently the endpoint for that is `update-site-pack` and it accepts a GET request (see the sketch after this list).
- Run the containers: `docker compose up`
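With the containers running, the site pack can then be refreshed with a plain GET request. A minimal sketch, assuming the app is listening on the default local port `8000` and that the route matches the name shown in the swagger docs:

```bash
# Refresh the siteini.pack folder (verify the exact path in the swagger docs)
curl http://127.0.0.1:8000/update-site-pack
```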
- Create a `.env` file following the example in `.env.example`.
- Assumes that there is a `postgres` database outside of docker that the app can connect to.
- Run the containers: `docker compose -f docker-compose.yml -f docker-compose.prod.yml up`
- We can run the containers in a detached state using the command: `docker compose -f docker-compose.yml -f docker-compose.prod.yml up --detach`
- Once you have the containers up and running, you need to get into the `web` service and run migrations. You can do that like this:
  - Access the container using the command `docker compose exec web bash`, where `web` is the name of the service we defined in the docker compose file.
  - Run the migrations using the command `python manage.py migrate`
- With the server running, go to the url `http://127.0.0.1:8000/swagger-ui`.
This application makes use of docker to spin up several services to get the job done.
The `web` service runs Django.
A postgres database for the Django application, running the `postgres:15.3` image. This is defined in `docker-compose.override.yml`.
If you are using your own postgres instance running on your local machine, you can create a database and a user like this:

```sql
create database mydb;
create user myuser with encrypted password 'mypass';
grant all privileges on database mydb to myuser;
```

Each of the above is run as a single command: first create the database, then the user, then grant the privileges.
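If it helps, the same statements can be run non-interactively with `psql`; a minimal sketch, assuming a local superuser role named `postgres` (adjust the names and credentials to match your `.env`):

```bash
# Run each statement as its own command against the local postgres instance
psql -U postgres -c "create database mydb;"
psql -U postgres -c "create user myuser with encrypted password 'mypass';"
psql -U postgres -c "grant all privileges on database mydb to myuser;"
```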
We are using the redis server as the message broker for Celery, the task queue we are using.
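For orientation, the broker is usually pointed at the redis service with a URL along these lines; the variable name and the `redis` hostname here are assumptions, so check `.env.example` and the compose file for the real values:

```bash
# Hypothetical broker setting; the actual variable name and host come from .env.example
CELERY_BROKER_URL=redis://redis:6379/0
```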
We have each celery worker running as its own container because of how WebGrab, the scraper we are using, runs. Otherwise a single celery worker container making use of concurrency would have worked. To add a new worker, copy one of the existing `celery_workers`.
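For reference, each worker container typically runs a command of this shape; this is a minimal sketch assuming the project name `web_grab`, and the real invocation lives in the compose file for each worker:

```bash
# Hypothetical worker invocation; each container runs one worker with its own node name
celery -A web_grab worker --loglevel=info --concurrency=1 -n worker1@%h
```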
- How do you access a service running on the host machine from within a container?
  - We set up `extra_hosts` on the `web` service. So from within the `web` service we can run `curl http://host.docker.internal:8080` to access a service running on port `8080` on the host machine.
- See the docker containers that are running: `docker ps`
- Get the logs of a specific container (you can get the name of the container from the previous step): `docker logs -f <name_of_container>`
- Check the status of the celery workers.
  - First get into the web container: `docker compose exec web bash`
  - Check the workers: `celery -A web_grab inspect active`, where `web_grab` is the name of the project. See the celery docs.
- Reading all docker logs: `docker compose -f docker-compose.yml -f docker-compose.prod.yml logs --follow`
- See the docs on docker compose logs here.
- Creating an API Key:
  - Log in as a superuser and create an API key using the admin dashboard.
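If no superuser exists yet, the standard Django management command can create one inside the `web` container; a minimal sketch:

```bash
# Create a Django superuser so you can log in to the admin dashboard
docker compose exec web python manage.py createsuperuser
```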
| Endpoint | Functionality | Note |
|---|---|---|
| `POST /upload/<file_name>` | Upload custom ini file. | Needs to be `.xml` or `.ini` |
| `GET /docker-down` | Docker compose down | Brings down the docker containers. |
| `GET /docker-status/` | List containers running. | Shows which containers are running. |
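For reference, an upload call might look like the sketch below; the API-key header name, the payload format, and the file name are assumptions, so check the swagger docs for the exact contract:

```bash
# Hypothetical upload request; header name, body format and file name are assumptions
curl -X POST \
  -H "Authorization: Api-Key <your_api_key>" \
  --data-binary @mychannels.ini \
  "http://127.0.0.1:8000/upload/mychannels.ini"
```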