feat: Add port of itertools.batched #312
GitHub Actions / Test Results
succeeded
Nov 11, 2024 in 1s
68 passed, 0 failed and 0 skipped
✅ test-results/results.xml
68 tests were completed in 11s with 68 passed, 0 failed and 0 skipped.
Test suite | Passed | Failed | Skipped | Time |
---|---|---|---|---|
glue_utils | 68✅ | 0 | 0 | 11s |
✅ glue_utils
test.pyspark.context.test_opensearch.TestGluePySparkContextForMongoDB
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame
test.pyspark.context.test_kinesis.TestGluePySparkContextForKinesis
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame
test.pyspark.context.test_documentdb.TestGluePySparkContextForDocumentDB
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame
test.pyspark.context.test_dynamodb.TestGluePySparkContextForDynamoDB
✅ test_create_dynamic_frame
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame_using_export
test.test_helpers
✅ test_generate_partitioned_path[partitions0-year=2022/month=01/day=01]
✅ test_generate_partitioned_path[partitions2-brand=apple/category=electronics]
✅ test_generate_partitioned_path[partitions1-category=electronics/brand=apple]
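Judging from the parametrized test IDs above, `generate_partitioned_path` joins ordered partition key/value pairs into a Hive-style path. A hypothetical sketch of that behavior (name and signature inferred from the test IDs, not taken from the glue_utils source):

```python
def generate_partitioned_path(partitions: dict[str, str]) -> str:
    """Join partition key/value pairs into a Hive-style path segment.

    Assumed behavior, inferred from the test IDs:
    {"year": "2022", "month": "01", "day": "01"} -> "year=2022/month=01/day=01"
    Key order follows the dict's insertion order.
    """
    return "/".join(f"{key}={value}" for key, value in partitions.items())
```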
test.test_helpers.TestBatched
✅ test_batched[iterable3-1-expected3]
✅ test_batched[iterable1-3-expected1]
✅ test_batched_with_generator
✅ test_batched[iterable2-5-expected2]
✅ test_batched_with_invalid_n
✅ test_batched[iterable0-3-expected0]
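For context on what this group covers: `itertools.batched` is only available on Python 3.12+, so this PR ports it for older runtimes. A minimal sketch of such a backport, following the roughly equivalent code in the `itertools.batched` documentation (where it lives inside glue_utils is an assumption):

```python
from itertools import islice
from typing import Iterable, Iterator, TypeVar

T = TypeVar("T")


def batched(iterable: Iterable[T], n: int) -> Iterator[tuple[T, ...]]:
    # Yield successive n-length tuples; the final batch may be shorter.
    # Raises ValueError for n < 1, matching itertools.batched (Python 3.12+).
    if n < 1:
        raise ValueError("n must be at least one")
    iterator = iter(iterable)
    while batch := tuple(islice(iterator, n)):
        yield batch


# Example: list(batched("ABCDEFG", 3)) -> [('A', 'B', 'C'), ('D', 'E', 'F'), ('G',)]
```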
test.pyspark.context.test_jdbc.TestGluePySparkContextForJDBCRedshift
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame
test.pyspark.context.test_jdbc.TestGluePySparkContextForJDBC
✅ test_write_dynamic_frame[sqlserver]
✅ test_create_dynamic_frame[mysql]
✅ test_write_dynamic_frame[mysql]
✅ test_write_dynamic_frame[postgresql]
✅ test_create_dynamic_frame[postgresql]
✅ test_create_dynamic_frame[sqlserver]
✅ test_create_dynamic_frame[oracle]
✅ test_write_dynamic_frame[oracle]
test.pyspark.test_job.TestGluePySparkJob
✅ test_init_with_log_level[LogLevel.INFO]
✅ test_init_with_invalid_spark_conf
✅ test_init_with_spark_conf
✅ test_init_options_cls
✅ test_init_with_log_level[LogLevel.ALL]
✅ test_init_with_log_level[LogLevel.TRACE]
✅ test_init_with_invalid_options_cls
✅ test_init_with_log_level[LogLevel.ERROR]
✅ test_init_with_log_level[LogLevel.OFF]
✅ test_init_with_partition_options
✅ test_init_with_log_level[LogLevel.DEBUG]
✅ test_init[args0-resolved_options0]
✅ test_init[args1-resolved_options1]
✅ test_init_with_log_level[LogLevel.FATAL]
✅ test_init_with_default_log_level
✅ test_init_with_log_level[LogLevel.WARN]
test.pyspark.test_job.TestGluePySparkJobMethods
✅ test_managed_glue_context_without_commit
✅ test_commit
✅ test_managed_glue_context
test.test_options.TestOptionsWithNonString
✅ test_warning_for_non_string_fields
test.test_options.TestBaseOptions
✅ test_from_sys_argv[args0-resolved_options0]
✅ test_from_sys_argv[args1-resolved_options1]
test.test_options.TestNullableOptionsWithDefaults
✅ test_missing_options
✅ test_from_sys_argv
test.test_options.TestOptions
✅ test_missing_options
✅ test_from_sys_argv
test.test_options.TestNullableOptions
✅ test_from_sys_argv
✅ test_missing_options
test.pyspark.context.test_kafka.TestGluePySparkContextForKafka
✅ test_create_dynamic_frame
✅ test_write_dynamic_frame
test.pyspark.context.test_mongodb.TestGluePySparkContextForMongoDB
✅ test_write_dynamic_frame
✅ test_create_dynamic_frame
test.pyspark.context.test_s3.TestGluePySparkContextForS3
✅ test_write_dynamic_frame_to_s3_json
✅ test_write_dynamic_frame_to_s3_csv
✅ test_write_dynamic_frame_to_s3_parquet
✅ test_create_dynamic_frame_from_s3_json
✅ test_create_dynamic_frame_from_s3_csv
✅ test_create_dynamic_frame_from_s3_parquet
✅ test_create_dynamic_frame_from_s3_xml
✅ test_write_dynamic_frame_to_s3_xml