fix: Update S3 test and README
tomasfarias committed Mar 19, 2024
1 parent dbee673 commit 65a406a
Showing 2 changed files with 14 additions and 7 deletions.
11 changes: 7 additions & 4 deletions posthog/temporal/tests/batch_exports/README.md
@@ -6,7 +6,8 @@ This module contains unit tests covering activities, workflows, and helper functions

BigQuery batch exports can be tested against a real BigQuery instance, but doing so requires additional setup. For this reason, these tests are skipped unless an environment variable pointing to a BigQuery credentials file (`GOOGLE_APPLICATION_CREDENTIALS=/path/to/my/project-credentials.json`) is set.
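
As a rough illustration of how such an environment-gated skip is typically expressed with `pytest` (a sketch only; the guard variable and test name below are hypothetical, not the repository's actual code):

```python
import os

import pytest

# Hypothetical guard: skip BigQuery tests unless credentials are configured.
MISSING_BIGQUERY_CREDENTIALS = "GOOGLE_APPLICATION_CREDENTIALS" not in os.environ


@pytest.mark.skipif(
    MISSING_BIGQUERY_CREDENTIALS,
    reason="BigQuery credentials not set in the environment",
)
def test_bigquery_batch_export():  # hypothetical test name
    ...
```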

> :warning: Since BigQuery batch export tests require additional setup, we skip them by default and they will not be run by automated CI pipelines. Please ensure these tests pass when making changes that affect BigQuery batch exports.
> [!WARNING]
> Since BigQuery batch export tests require additional setup, we skip them by default and they will not be run by automated CI pipelines. Please ensure these tests pass when making changes that affect BigQuery batch exports.
To enable testing for BigQuery batch exports, we require:
1. A BigQuery project and dataset
@@ -24,7 +25,8 @@ DEBUG=1 GOOGLE_APPLICATION_CREDENTIALS=/path/to/my/project-credentials.json pytest

Redshift batch exports can be tested against a real Redshift (or Redshift Serverless) instance, with additional setup steps required. Due to this requirement, these tests are skipped unless Redshift credentials are specified in the environment.

> :warning: Since Redshift batch export tests require additional setup, we skip them by default and they will not be run by automated CI pipelines. Please ensure these tests pass when making changes that affect Redshift batch exports.
> [!WARNING]
> Since Redshift batch export tests require additional setup, we skip them by default and they will not be run by automated CI pipelines. Please ensure these tests pass when making changes that affect Redshift batch exports.
To enable testing for Redshift batch exports, we require:
1. A Redshift (or Redshift Serverless) instance.
@@ -41,14 +43,15 @@ Replace the `REDSHIFT_*` environment variables with the values obtained from the
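
As a sketch of how the `REDSHIFT_*` variables might be consumed by the tests (the variable names and defaults here are assumptions, not the repository's exact configuration):

```python
import os

# Hypothetical mapping of REDSHIFT_* environment variables to connection settings.
redshift_config = {
    "host": os.getenv("REDSHIFT_HOST"),
    "port": int(os.getenv("REDSHIFT_PORT", "5439")),  # 5439 is Redshift's default port
    "user": os.getenv("REDSHIFT_USER"),
    "password": os.getenv("REDSHIFT_PASSWORD"),
}

# A test module could then skip itself when no host is configured.
MISSING_REDSHIFT_CREDENTIALS = redshift_config["host"] is None
```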

## Testing S3 batch exports

S3 batch exports are tested against a MinIO bucket available in the local development stack. However, there are also unit tests that specifically target an S3 bucket. Additional setup is required to run those specific tests:
S3 batch exports are tested against a MinIO bucket available in the local development stack. However, there are also unit tests that specifically target an S3 bucket (like `test_s3_export_workflow_with_s3_bucket`). Additional setup is required to run those specific tests:

1. Ensure you are logged in to an AWS account.
2. Create or choose an existing S3 bucket from that AWS account to use as the test bucket.
3. Create or choose an existing KMS key id from that AWS account to use in tests.
4. Make sure the role/user you are logged in as has permissions to use the bucket and KMS key.

For PostHog employees, check your password manager for these details.
> [!NOTE]
> For PostHog employees, your password manager contains a set of credentials for S3 batch exports development testing. You may populate your development environment with these credentials and use the provided test bucket and KMS key.
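
Before running the suite, a quick ad-hoc check that the logged-in role can reach the bucket and KMS key might look like this (a sketch only; the bucket name and key id are placeholders, and this helper is not part of the repository):

```python
import asyncio

import aioboto3


async def check_s3_test_setup(bucket: str, kms_key_id: str) -> None:
    session = aioboto3.Session()
    async with session.client("s3") as s3:
        await s3.head_bucket(Bucket=bucket)  # raises if the bucket is missing or inaccessible
    async with session.client("kms") as kms:
        await kms.describe_key(KeyId=kms_key_id)  # raises if the key id is invalid or not accessible


asyncio.run(check_s3_test_setup("my-test-bucket", "my-test-kms-key-id"))
```
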
With these setup steps done, we can run all tests (MinIO and S3 bucket) from the root of the `posthog` repo with:

10 changes: 7 additions & 3 deletions in the S3 batch export workflow test module
@@ -524,7 +524,7 @@ async def s3_client(bucket_name, s3_key_prefix):
    async with aioboto3.Session().client("s3") as s3_client:
        yield s3_client

        await delete_all_from_s3(minio_client, bucket_name, key_prefix=s3_key_prefix)
        await delete_all_from_s3(s3_client, bucket_name, key_prefix=s3_key_prefix)
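        # Cleanup goes through the same real S3 client the test exported with; the
        # local MinIO client would not remove objects written to the AWS bucket.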


@pytest.mark.skipif(
@@ -593,7 +593,11 @@ async def test_s3_export_workflow_with_s3_bucket(
)

    workflow_id = str(uuid4())
    destination_config = s3_batch_export.destination.config | {"endpoint_url": None}
    destination_config = s3_batch_export.destination.config | {
        "endpoint_url": None,
        "aws_access_key_id": os.getenv("AWS_ACCESS_KEY_ID"),
        "aws_secret_access_key": os.getenv("AWS_SECRET_ACCESS_KEY"),
    }
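    # Clearing endpoint_url makes the client fall back to the default AWS S3 endpoint
    # rather than the local MinIO endpoint used by the MinIO-backed tests, and the AWS
    # credentials are read from the environment so the workflow writes to the real bucket.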
    inputs = S3BatchExportInputs(
        team_id=ateam.pk,
        batch_export_id=str(s3_batch_export.id),
@@ -631,7 +635,7 @@ async def test_s3_export_workflow_with_s3_bucket(
    assert run.status == "Completed"

    await assert_clickhouse_records_in_s3(
        s3_compatible_client=minio_client,
        s3_compatible_client=s3_client,
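        # Read back with the real S3 client so the assertion inspects the AWS
        # bucket the workflow actually exported to.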
        clickhouse_client=clickhouse_client,
        bucket_name=bucket_name,
        key_prefix=s3_key_prefix,
