chore(data-warehouse): Log out memory every 10 secs for the new workers #27083

Merged 1 commit on Dec 19, 2024
19 changes: 19 additions & 0 deletions posthog/temporal/data_imports/external_data_job.py
@@ -3,10 +3,13 @@
import datetime as dt
import json
import re
import threading
import time

from django.conf import settings
from django.db import close_old_connections
import posthoganalytics
import psutil
from temporalio import activity, exceptions, workflow
from temporalio.common import RetryPolicy
from temporalio.exceptions import WorkflowAlreadyStartedError
@@ -177,6 +180,22 @@ def create_source_templates(inputs: CreateSourceTemplateInputs) -> None:
    create_warehouse_templates_for_source(team_id=inputs.team_id, run_id=inputs.run_id)


def log_memory_usage():
    process = psutil.Process()
    logger = bind_temporal_worker_logger_sync(team_id=0)

    while True:
        memory_info = process.memory_info()
        logger.info(f"Memory Usage: RSS = {memory_info.rss / (1024 * 1024):.2f} MB")

        time.sleep(10)  # Log every 10 seconds


if settings.TEMPORAL_TASK_QUEUE == DATA_WAREHOUSE_TASK_QUEUE_V2:
    thread = threading.Thread(target=log_memory_usage, daemon=True)
    thread.start()


# TODO: update retry policies
@workflow.defn(name="external-data-job")
class ExternalDataJobWorkflow(PostHogWorkflow):
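For reference, a minimal standalone sketch of the same pattern, outside the Temporal worker: it uses the standard `logging` module instead of `bind_temporal_worker_logger_sync`, and adds a hypothetical `stop_event` so the loop can be shut down cleanly in tests. The `interval` parameter and the `__main__` driver are illustrative and not part of the PR; the PR itself relies on `daemon=True` and a bare `time.sleep(10)`, which is sufficient for a logger that should live as long as the process.

```python
import logging
import threading

import psutil

logger = logging.getLogger(__name__)


def log_memory_usage(stop: threading.Event, interval: float = 10.0) -> None:
    """Periodically log the resident set size (RSS) of the current process."""
    process = psutil.Process()  # handle to the current process
    while not stop.is_set():
        rss_mb = process.memory_info().rss / (1024 * 1024)  # bytes -> MB
        logger.info("Memory Usage: RSS = %.2f MB", rss_mb)
        stop.wait(interval)  # sleeps, but wakes early if stop is set


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    stop_event = threading.Event()  # illustrative: the PR uses a plain daemon thread with time.sleep
    thread = threading.Thread(target=log_memory_usage, args=(stop_event,), daemon=True)
    thread.start()
    try:
        threading.Event().wait(35)  # pretend to do ~35 seconds of work
    finally:
        stop_event.set()
        thread.join()
```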