Commit

Configure the Spark metrics properties while creating an S3 Glue Connector

Signed-off-by: Louis Chu <[email protected]>
noCharger committed Feb 6, 2024
1 parent 70d94e6 commit d0c9f36
Showing 1 changed file with 9 additions and 0 deletions.
docs/user/interfaces/asyncqueryinterface.rst: 9 additions & 0 deletions
@@ -35,6 +35,15 @@ The system relies on the default AWS credentials chain for making calls to the E
* ``applicationId``, ``executionRoleARN`` and ``region`` are required parameters.
* ``sparkSubmitParameter`` is an optional parameter. It can take the form ``--conf A=1 --conf B=2 ...``.

Starting with Flint 0.1.1, users can use AWS CloudWatch as an external metrics sink while configuring their own metric sources. For example::

  plugins.query.executionengine.spark.config:
    '{ "applicationId":"xxxxx",
       "executionRoleARN":"arn:aws:iam::***********:role/emr-job-execution-role",
       "region":"eu-west-1",
       "sparkSubmitParameters": "--conf spark.dynamicAllocation.enabled=false --conf spark.metrics.conf.*.sink.cloudwatch.class=org.apache.spark.metrics.sink.CloudWatchSink --conf spark.metrics.conf.*.sink.cloudwatch.namespace=OpenSearchSQLSpark --conf spark.metrics.conf.*.sink.cloudwatch.regex=(opensearch|numberAllExecutors).* --conf spark.metrics.conf.*.source.cloudwatch.class=org.apache.spark.metrics.source.FlintMetricSource"
    }'

Please refer to https://github.com/opensearch-project/opensearch-spark for a complete list of supported Spark configurations for custom metrics support.
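
For readability, the ``--conf spark.metrics.conf.*`` pairs in the example above correspond one-to-one to entries in Spark's ``metrics.properties`` format (Spark strips the ``spark.metrics.conf.`` prefix). The sketch below is illustrative only and is not an additional file you need to create::

  # Send metric names matching the regex to the CloudWatch sink
  *.sink.cloudwatch.class=org.apache.spark.metrics.sink.CloudWatchSink
  *.sink.cloudwatch.namespace=OpenSearchSQLSpark
  *.sink.cloudwatch.regex=(opensearch|numberAllExecutors).*

  # Register Flint's custom metric source
  *.source.cloudwatch.class=org.apache.spark.metrics.source.FlintMetricSource

The ``spark.dynamicAllocation.enabled=false`` setting in the example is an ordinary Spark property, not part of the metrics configuration.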

Async Query Creation API
======================================
