
fix all files using pre-commit #1170

Merged
merged 6 commits into from
Mar 5, 2024
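This PR applies the repository's configured pre-commit hooks to every file (typically `pre-commit run --all-files`), so most of the diff below is mechanical: trailing whitespace stripped, magic trailing commas added, and docstrings reflowed. As a hedged illustration, a trailing-whitespace fixer of the kind pre-commit ships reduces to something like this sketch (the exact hook behaviour is an assumption):

```python
def strip_trailing_whitespace(text: str) -> str:
    # Approximation of a trailing-whitespace hook: remove spaces and
    # tabs at the end of every line, keeping the line content intact.
    return "\n".join(line.rstrip(" \t") for line in text.splitlines()) + "\n"


# Two lines that look identical in a rendered diff but differ only in
# trailing spaces collapse to the same line once the hook has run.
assert strip_trailing_whitespace("PATH=/usr/bin  \n") == "PATH=/usr/bin\n"
```

This is why many hunks in this PR show an old and a new line that render identically: the only difference is invisible end-of-line whitespace.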
1 change: 1 addition & 0 deletions .nvmrc
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
16.*
4 changes: 2 additions & 2 deletions Dockerfile
@@ -1,4 +1,4 @@
# Production static resource building
# Currently moved to makefile for development

# FROM node:lts-buster-slim AS jsdeps
@@ -20,7 +20,7 @@ ARG ENV="prod"
ENV ENV="${ENV}" \
PYTHONUNBUFFERED=1 \
PYTHONDONTWRITEBYTECODE=1\
PATH="${PATH}:/home/tamato/.local/bin"

# don't run as root
RUN groupadd -g 1000 tamato && \
44 changes: 22 additions & 22 deletions Makefile
@@ -10,7 +10,7 @@ DB_DUMP?=${PROJECT}_db.sql
DB_NAME?=${PROJECT}
DB_USER?=postgres
DATE=$(shell date '+%Y_%m_%d')
TEMPLATE_NAME?="${DB_NAME}_${DATE}"

-include .env
export
@@ -20,7 +20,7 @@ SPHINXOPTS ?=
.PHONY: help clean clean-bytecode clean-static collectstatic compilescss dependencies \
docker-clean docker-deep-clean docker-down docker-up-db docker-down docker-image \
docker-db-dump docker-test node_modules run test test-fast docker-makemigrations \
docker-checkmigrations docker-migrate build-docs docker-restore-db docker-import-new-db



@@ -118,7 +118,7 @@ docker-test-fast:
@echo
@echo "> Running tests in docker..."
@${COMPOSE_LOCAL} ${DOCKER_RUN} \
${PROJECT} ${PYTHON} -m pytest -x -n=auto --dist=loadfile

## clean-docs: Clean the generated documentation files
clean-docs:
@@ -134,15 +134,15 @@ build-docs html:
## docker-clean: clean unused images and volumes
docker-clean:
@echo
@echo "> Cleaning unused images in docker..."
@${DOCKER} image prune -a -f
@echo "> Cleaning unused volumes in docker..."
@${DOCKER} volume prune -f

## docker-deep-clean: deep clean all unused systems (containers, networks, images, cache)
docker-deep-clean:
@echo
@echo "> Cleaning unused systems in docker..."
@${DOCKER} system prune -a

## docker-down: shut down services in Docker
@@ -154,38 +154,38 @@ docker-down:
## docker-up-db: start the db service in Docker
docker-up-db:
@echo
@echo "> Running db in docker..."
@${COMPOSE_LOCAL} up -d db
@echo
@echo "Waiting for database \"ready for connections\""
@sleep 15;
@echo "Database Ready for connections!"

## docker-import-new-db: Import DB_DUMP into a new TEMPLATE_NAME database in Docker (the db container must be running)
docker-import-new-db: docker-up-db
@${COMPOSE_LOCAL} exec -u ${DB_USER} db psql -c "DROP DATABASE ${TEMPLATE_NAME}" || true
@${COMPOSE_LOCAL} exec -u ${DB_USER} db psql -c "CREATE DATABASE ${TEMPLATE_NAME} TEMPLATE template0"
@echo "> Running db dump: ${DB_DUMP} in docker..."
@cat ${DB_DUMP} | ${COMPOSE_LOCAL} exec -T db psql -U ${DB_USER} -d ${TEMPLATE_NAME}
@sleep 5;

## docker-restore-db: Restore db in Docker (set DB_NAME to rename; the db container must be running)
docker-restore-db: docker-down docker-up-db
@${COMPOSE_LOCAL} exec -u ${DB_USER} db psql -c "DROP DATABASE ${DB_NAME}" || true
@${COMPOSE_LOCAL} exec -u ${DB_USER} db psql -c "CREATE DATABASE ${DB_NAME} TEMPLATE ${TEMPLATE_NAME}"
@sleep 5;

## docker-db-dump: Run db dump to import data into Docker
docker-db-dump: docker-up-db
@echo "> Running db dump in docker..."
@cat ${DB_DUMP} | ${COMPOSE_LOCAL} exec -T db psql -U ${DB_USER} -d ${DB_NAME}

## docker-first-use: Run the application for the first time in Docker
docker-first-use: docker-down docker-clean node_modules compilescss docker-build docker-import-new-db \
docker-restore-db docker-migrate docker-superuser docker-up

## docker-makemigrations: Run django makemigrations in Docker
docker-makemigrations:
@echo
@echo "> Running makemigrations in docker..."
@${COMPOSE_LOCAL} ${DOCKER_RUN} \
2 changes: 1 addition & 1 deletion checks/tests/test_tasks.py
@@ -56,7 +56,7 @@ def check(request):
assert num_completed >= num_successful

check = factories.TransactionCheckFactory.create(
**{trait: True for trait in traits}
**{trait: True for trait in traits},
)
check_names = [str(i) for i in range(num_checks)]
completes = repeat(True, num_completed)
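The only change in this hunk is the trailing comma added after the `**`-unpacked dict, consistent with a magic-trailing-comma style hook (which hook produced it is an assumption). The unpacking pattern itself, with a hypothetical stand-in for the factory, works like this:

```python
def create_check(**kwargs):
    # Hypothetical stand-in for TransactionCheckFactory.create():
    # simply return the keyword arguments it received.
    return kwargs


traits = ["completed", "successful"]
# Each trait name becomes a keyword argument set to True; the trailing
# comma after the ** argument is valid Python 3 call syntax.
check = create_check(
    **{trait: True for trait in traits},
)
assert check == {"completed": True, "successful": True}
```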
4 changes: 2 additions & 2 deletions commodities/import_handlers.py
@@ -106,7 +106,7 @@ def get_absorbed_into_goods_nomenclature_link(self, model, kwargs):
):
previous = (
models.GoodsNomenclatureSuccessor.objects.filter(
**{key: self.data[key] for key in self.identifying_fields}
**{key: self.data[key] for key in self.identifying_fields},
)
.latest_approved()
.get()
@@ -144,7 +144,7 @@ def create_missing_goods_nomenclature_description_period(
goods_nomenclature_description_handler,
):
"""
in some circumstances, we will receive an EU update that will reference
In some circumstances, we will receive an EU update that will reference
a historic description period, and since TAP does not track SIDs
currently for this data we can't resolve the reference.

9 changes: 7 additions & 2 deletions commodities/models/dc.py
@@ -911,7 +911,10 @@ def to_transaction(self, workbasket: WorkBasket) -> TrackedModel:

with workbasket.new_transaction(order=order) as transaction:
return self.obj.new_version(
workbasket, transaction, update_type=self.update_type, **attrs
workbasket,
transaction,
update_type=self.update_type,
**attrs,
)

def _get_preemptive_transaction_order(self, workbasket: WorkBasket) -> int:
@@ -1693,7 +1696,9 @@ def __init__(self, **kwargs):


def get_tracked_model_reflection(
obj: TrackedModel, transaction: Transaction = None, **overrides
obj: TrackedModel,
transaction: Transaction = None,
**overrides,
):
"""
Returns a reflection of a TrackedModel object.
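The signature reflow above is the standard "one parameter per line once a trailing comma is present" layout. Python 3 accepts a trailing comma after `**kwargs` in both definitions and calls, which is what lets a formatter add one uniformly. A minimal self-contained check, with illustrative names rather than the project's actual function:

```python
def get_reflection(
    obj,
    transaction=None,
    **overrides,
):
    # Merge the positional object, the optional transaction, and any
    # keyword overrides into one dict, mirroring the reflowed signature.
    return {"obj": obj, "transaction": transaction, **overrides}


result = get_reflection(
    "tracked-model",
    note="draft",
)
assert result == {"obj": "tracked-model", "transaction": None, "note": "draft"}
```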
12 changes: 9 additions & 3 deletions commodities/tests/conftest.py
@@ -166,7 +166,9 @@ def create_collection(


def create_record(
transaction_pool: Iterator[Transaction], factory, **kwargs
transaction_pool: Iterator[Transaction],
factory,
**kwargs,
) -> TrackedModel:
"""
Returns a new TrackedModel instance.
@@ -179,7 +181,9 @@ def create_record(


def create_dependent_measure(
commodity: Commodity, transaction_pool: Iterator[Transaction], **kwargs
commodity: Commodity,
transaction_pool: Iterator[Transaction],
**kwargs,
) -> Measure:
"""Returns a new measure linked to a given good."""
factory = factories.MeasureFactory
@@ -190,7 +194,9 @@ def create_dependent_measure(


def create_footnote_association(
commodity: Commodity, transaction_pool: Iterator[Transaction], **kwargs
commodity: Commodity,
transaction_pool: Iterator[Transaction],
**kwargs,
) -> FootnoteAssociationGoodsNomenclature:
"""Returns a new footnote association linked to a given good."""
factory = factories.FootnoteAssociationGoodsNomenclatureFactory
22 changes: 14 additions & 8 deletions common/business_rules.py
@@ -133,7 +133,8 @@ def get_linked_models(
:param model TrackedModel: Get models linked to this model instance
:param transaction Transaction: Get latest approved versions of linked
models as of this transaction
:rtype Iterator[TrackedModel]: The linked models
:return: The linked models
:rtype: Iterator[TrackedModel]
"""
for field, related_model in get_relations(type(model)).items():
business_rules = getattr(related_model, "business_rules", [])
@@ -169,9 +170,11 @@ def violation(
"""
Create a violation exception object.

:param model Optional[TrackedModel]: The model that violates this business rule
:param message Optional[str]: A message explaining the violation
:rtype BusinessRuleViolation: An exception indicating a business rule violation
:param model Optional[TrackedModel]: The model that violates this
business rule :param message Optional[str]: A message explaining the
violation
:rtype BusinessRuleViolation: An exception indicating a business rule
violation
"""

return getattr(self.__class__, "Violation", BusinessRuleViolation)(
@@ -185,7 +188,7 @@ def only_applicable_after(cutoff: Union[date, datetime, str]):
Decorate BusinessRules to make them only applicable after a given date.

:param cutoff Union[date, datetime, str]: The date, datetime or isoformat
date string of the time before which the rule should not apply
date string of the time before which the rule should not apply
"""

if isinstance(cutoff, str):
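The hunk above cuts off after the cutoff normalisation. A sketch of how a decorator like `only_applicable_after` can be completed from that fragment — the `validity_start` attribute and the skip semantics are assumptions, not the project's actual implementation:

```python
from datetime import date, datetime


def only_applicable_after(cutoff):
    # Normalise the cutoff to a date, as the docstring above describes:
    # accept a date, a datetime, or an ISO-format date string.
    if isinstance(cutoff, str):
        cutoff = date.fromisoformat(cutoff)
    if isinstance(cutoff, datetime):
        cutoff = cutoff.date()

    def decorator(cls):
        original_validate = cls.validate

        def validate(self, model):
            # Assumed skip rule: do nothing for models dated on or
            # before the cutoff; otherwise defer to the original check.
            if model.validity_start <= cutoff:
                return None
            return original_validate(self, model)

        cls.validate = validate
        return cls

    return decorator
```

With a rule whose `validate` always raises, a model dated before the cutoff passes silently while a later one still triggers the rule.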
@@ -226,8 +229,10 @@ def skip_when_update_type(cls: Type[BusinessRule], update_types: Iterable[Update
"""
Skip business rule validation for given update types.

:param cls Type[BusinessRule]: The BusinessRule to decorate
:param update_types Iterable[int]: The UpdateTypes to skip
:param cls: The BusinessRule to decorate
:type cls: Type[BusinessRule]
:param update_types: The UpdateTypes to skip
:type update_types: Iterable[int]
"""
_original_validate = cls.validate

@@ -336,7 +341,8 @@ def validate(self, model):
Check whether the specified model violates this business rule.

:param model TrackedModel: The model to check
:raises BusinessRuleViolation: Raised if the passed model violates this business rule.
:raises BusinessRuleViolation: Raised if the passed model violates this
business rule.
"""
if self.has_violation(model):
raise self.violation(model)
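The `validate` hunk shows the rule contract: `has_violation` answers yes/no and `validate` raises. A minimal base class consistent with the visible lines — the `Violation` class lookup mirrors the `getattr(self.__class__, "Violation", BusinessRuleViolation)` call shown earlier in this file; everything else is a sketch:

```python
class BusinessRuleViolation(Exception):
    """Default exception for a violated business rule."""


class BusinessRule:
    def has_violation(self, model) -> bool:
        # Concrete rules override this with their actual check.
        raise NotImplementedError

    def violation(self, model=None, message=None):
        # Prefer a rule-specific Violation subclass when one is defined,
        # falling back to the generic BusinessRuleViolation.
        exc_class = getattr(self.__class__, "Violation", BusinessRuleViolation)
        return exc_class(message or f"{type(self).__name__} violated by {model!r}")

    def validate(self, model):
        # Raise if the passed model violates this business rule.
        if self.has_violation(model):
            raise self.violation(model)
```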
5 changes: 4 additions & 1 deletion common/fields.py
@@ -100,7 +100,10 @@ class TaricDateRangeField(DateRangeField):
range_type = TaricDateRange

def from_db_value(
self, value: Union[DateRange, TaricDateRange], *_args, **_kwargs
self,
value: Union[DateRange, TaricDateRange],
*_args,
**_kwargs,
) -> TaricDateRange:
"""
By default Django ignores the range_type and just returns a Psycopg2
23 changes: 12 additions & 11 deletions common/migrations/0002_transaction_partition_1_of_3.py
@@ -8,17 +8,18 @@ class Migration(migrations.Migration):
"""
Transaction partition field 1 of 3.

1/3 Create Partition field, with partition defaulting to (1, SEED), as at the time of writing most of
Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3, DRAFT). REVISION: Before this
migration was written REVISION transactions are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more control over SEED / REVISION
transactions. DRAFT: Draft Transactions are inferred by checking for transactions not in the first workbasket
that lack an approved workbasket status..

3/3
Set the default value to (3, DRAFT)
1/3 Create Partition field, with partition defaulting to (1, SEED), as at
the time of writing most of Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3,
DRAFT). REVISION: Before this migration was written REVISION transactions
are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more
control over SEED / REVISION transactions. DRAFT: Draft Transactions are
inferred by checking for transactions not in the first workbasket that lack
an approved workbasket status.

3/3 Set the default value to (3, DRAFT)
"""

dependencies = [
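The three-step docstring above can be condensed into a small sketch of the classification rule the data migration applies. The function below is an illustrative reconstruction, not the migration's actual code; the field names and exact boundary logic are assumptions:

```python
from enum import IntEnum


class TransactionPartition(IntEnum):
    # The three partition values named in the docstring.
    SEED = 1
    REVISION = 2
    DRAFT = 3


def classify(workbasket_index, workbasket_approved, first_approved_index=0):
    # Assumed rule from the docstring: transactions up to and including
    # the first approved workbasket stay SEED; later ones are REVISION
    # when their workbasket is approved and DRAFT otherwise.
    if workbasket_index <= first_approved_index:
        return TransactionPartition.SEED
    if workbasket_approved:
        return TransactionPartition.REVISION
    return TransactionPartition.DRAFT
```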
23 changes: 12 additions & 11 deletions common/migrations/0003_transaction_partition_2_of_3.py
@@ -42,17 +42,18 @@ class Migration(migrations.Migration):
"""
Transaction partition field 2 of 3.

1/3 Create Partition field, with partition defaulting to (1, SEED), as at the time of writing most of
Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3, DRAFT). REVISION: Before this
migration was written REVISION transactions are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more control over SEED / REVISION
transactions. DRAFT: Draft Transactions are inferred by checking for transactions not in the first workbasket
that lack an approved workbasket status..

3/3
Set the default value to (3, DRAFT)
1/3 Create Partition field, with partition defaulting to (1, SEED), as at
the time of writing most of Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3,
DRAFT). REVISION: Before this migration was written REVISION transactions
are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more
control over SEED / REVISION transactions. DRAFT: Draft Transactions are
inferred by checking for transactions not in the first workbasket that lack
an approved workbasket status.

3/3 Set the default value to (3, DRAFT)
"""

dependencies = [
23 changes: 12 additions & 11 deletions common/migrations/0004_transaction_partition_3_of_3.py
@@ -8,17 +8,18 @@ class Migration(migrations.Migration):
"""
Transaction partition field 3 of 3.

1/3 Create Partition field, with partition defaulting to (1, SEED), as at the time of writing most of
Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3, DRAFT). REVISION: Before this
migration was written REVISION transactions are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more control over SEED / REVISION
transactions. DRAFT: Draft Transactions are inferred by checking for transactions not in the first workbasket
that lack an approved workbasket status..

3/3
Set the default value to (3, DRAFT)
1/3 Create Partition field, with partition defaulting to (1, SEED), as at
the time of writing most of Transactions are from the seed file.

2/3 Data migration to set Transaction partitions to (2, REVISION) and (3,
DRAFT). REVISION: Before this migration was written REVISION transactions
are contained in workbaskets after the first workbasket with approved
workbasket status. After this migration was written data schemas allow more
control over SEED / REVISION transactions. DRAFT: Draft Transactions are
inferred by checking for transactions not in the first workbasket that lack
an approved workbasket status.

3/3 Set the default value to (3, DRAFT)
"""

dependencies = [
2 changes: 1 addition & 1 deletion common/models/mixins/description.py
@@ -147,7 +147,7 @@ def get_descriptions(self) -> TrackedModelQuerySet:
}

query = descriptions_model.objects.filter(**filter_kwargs).order_by(
*descriptions_model._meta.ordering
*descriptions_model._meta.ordering,
)

return query.current()
14 changes: 11 additions & 3 deletions common/models/tracked_qs.py
@@ -195,7 +195,11 @@ def version_ordering(self) -> TrackedModelQuerySet:
return self.order_by("transaction__partition", "transaction__order")

def _get_current_related_lookups(
self, model, *lookups, prefix="", recurse_level=0
self,
model,
*lookups,
prefix="",
recurse_level=0,
) -> List[str]:
"""
Build a list of lookups for the current versions of related objects.
@@ -241,10 +245,14 @@ def with_latest_links(self, *lookups, recurse_level=0) -> TrackedModelQuerySet:
run multiple queries for every current relation.
"""
related_lookups = self._get_current_related_lookups(
self.model, *lookups, recurse_level=recurse_level
self.model,
*lookups,
recurse_level=recurse_level,
)
return self.select_related(
"version_group", "version_group__current_version", *related_lookups
"version_group",
"version_group__current_version",
*related_lookups,
)

def with_transactions_and_models(self) -> TrackedModelQuerySet:
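The reflowed `_get_current_related_lookups` / `with_latest_links` pair builds double-underscore lookup paths recursively for `select_related`. A standalone approximation over plain dicts — the `version_group__current_version` suffix comes from the visible `select_related` call, while the traversal details are assumptions:

```python
def build_current_lookups(relations, prefix="", recurse_level=0):
    # relations maps a relation name to its own nested relation mapping,
    # standing in for Django model relation metadata.
    paths = []
    for name, nested in relations.items():
        path = f"{prefix}{name}"
        # Every related object pulls in its current version.
        paths.append(f"{path}__version_group__current_version")
        if recurse_level > 0:
            paths.extend(
                build_current_lookups(
                    nested,
                    prefix=f"{path}__",
                    recurse_level=recurse_level - 1,
                ),
            )
    return paths
```

For example, one level of recursion over a good with a measure-type relation yields both `goods__version_group__current_version` and `goods__measure_type__version_group__current_version`.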