
[Backport 0.4] [Bugfix] Insights on query execution error #486

Merged · 1 commit merged into 0.4 on Jul 26, 2024

Conversation

opensearch-trigger-bot[bot]

Backport 3c8a490 from #475.

* BugFix: Add error logs

Signed-off-by: Louis Chu <[email protected]>

* Add IT

Signed-off-by: Louis Chu <[email protected]>

* Fix IT

Signed-off-by: Louis Chu <[email protected]>

* Log stacktrace

Signed-off-by: Louis Chu <[email protected]>

* Use full msg instead of prefix

Signed-off-by: Louis Chu <[email protected]>

---------

Signed-off-by: Louis Chu <[email protected]>
(cherry picked from commit 3c8a490)
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
@noCharger
Collaborator

2024-07-26T00:12:27.4148148Z 24/07/25 17:12:27 ERROR CustomLogging: {"timestamp":1721952747413,"severityText":"ERROR","severityNumber":17,"body":{"message":"Fail to analyze query. Cause: Table or view not found: testTable; line 1 pos 22;\n'Project ['name, 'age]\n+- 'UnresolvedRelation [testTable], [], false\n"},"attributes":{"domainName":"UNKNOWN:UNKNOWN","clientId":"UNKNOWN","exception.type":"org.apache.spark.sql.AnalysisException","exception.message":"Table or view not found: testTable; line 1 pos 22;\n'Project ['name, 'age]\n+- 'UnresolvedRelation [testTable], [], false\n"}}
2024-07-26T00:12:27.4150793Z org.apache.spark.sql.AnalysisException: Table or view not found: testTable; line 1 pos 22;
2024-07-26T00:12:27.4151405Z 'Project ['name, 'age]
2024-07-26T00:12:27.4151793Z +- 'UnresolvedRelation [testTable], [], false
2024-07-26T00:12:27.4152169Z 
2024-07-26T00:12:27.4152574Z 	at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
2024-07-26T00:12:27.4153528Z 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:131)
2024-07-26T00:12:27.4154822Z 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1$adapted(CheckAnalysis.scala:102)
2024-07-26T00:12:27.4155813Z 	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:367)
2024-07-26T00:12:27.4156586Z 	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1(TreeNode.scala:366)
2024-07-26T00:12:27.4157411Z 	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1$adapted(TreeNode.scala:366)
2024-07-26T00:12:27.4158311Z 	at scala.collection.Iterator.foreach(Iterator.scala:943)
2024-07-26T00:12:27.4158823Z 	at scala.collection.Iterator.foreach$(Iterator.scala:943)
2024-07-26T00:12:27.4159376Z 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
2024-07-26T00:12:27.4160269Z 	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
2024-07-26T00:12:27.4160860Z 	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
2024-07-26T00:12:27.4161439Z 	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
2024-07-26T00:12:27.4162093Z 	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:366)
2024-07-26T00:12:27.4162903Z 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:102)
2024-07-26T00:12:27.4163914Z 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:97)
2024-07-26T00:12:27.4164773Z 	at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:188)
2024-07-26T00:12:27.4165620Z 	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:214)
2024-07-26T00:12:27.4166532Z 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:330)
2024-07-26T00:12:27.4167555Z 	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:211)
2024-07-26T00:12:27.4168387Z 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:76)
2024-07-26T00:12:27.4169273Z 	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
2024-07-26T00:12:27.4170176Z 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:185)
2024-07-26T00:12:27.4171031Z 	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
2024-07-26T00:12:27.4171878Z 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
2024-07-26T00:12:27.4172632Z 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
2024-07-26T00:12:27.4173703Z 	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:184)
2024-07-26T00:12:27.4175123Z 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:76)
2024-07-26T00:12:27.4176949Z 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:74)
2024-07-26T00:12:27.4178427Z 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:66)
2024-07-26T00:12:27.4179714Z 	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
2024-07-26T00:12:27.4180849Z 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
2024-07-26T00:12:27.4181813Z 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
2024-07-26T00:12:27.4183364Z 	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:622)
2024-07-26T00:12:27.4184502Z 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
2024-07-26T00:12:27.4185595Z 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:617)
2024-07-26T00:12:27.4186802Z 	at org.apache.spark.sql.FlintJobExecutor.executeQuery(FlintJobExecutor.scala:412)
2024-07-26T00:12:27.4188205Z 	at org.apache.spark.sql.FlintJobExecutor.executeQuery$(FlintJobExecutor.scala:401)
2024-07-26T00:12:27.4189440Z 	at org.apache.spark.sql.FlintREPL$.executeQuery(FlintREPL.scala:49)
2024-07-26T00:12:27.4190883Z 	at org.apache.spark.sql.FlintREPL$.$anonfun$executeQueryAsync$1(FlintREPL.scala:821)
2024-07-26T00:12:27.4192122Z 	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
2024-07-26T00:12:27.4193165Z 	at scala.util.Success.$anonfun$map$1(Try.scala:255)
2024-07-26T00:12:27.4193886Z 	at scala.util.Success.map(Try.scala:213)
2024-07-26T00:12:27.4194668Z 	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
2024-07-26T00:12:27.4195658Z 	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
2024-07-26T00:12:27.4196764Z 	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
2024-07-26T00:12:27.4197863Z 	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
2024-07-26T00:12:27.4199039Z 	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
2024-07-26T00:12:27.4200428Z 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
2024-07-26T00:12:27.4202031Z 	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
2024-07-26T00:12:27.4203854Z 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
2024-07-26T00:12:27.4205401Z 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
2024-07-26T00:12:27.4206699Z 	at java.base/java.lang.Thread.run(Thread.java:829)
2024-07-26T00:12:27.4333580Z 24/07/25 17:12:27 INFO FlintREPL: command complete: FlintCommand(state=failed, query=SELECT name, age FROM testTable, statementId=eecc682a-28ed-4641-a953-fc5be840e21a, queryId=104, submitTime=1721952737217, error=Some({"Message":"Fail to analyze query. Cause: Table or view not found: testTable; line 1 pos 22;\n'Project ['name, 'age]\n+- 'UnresolvedRelation [testTable], [], false\n"}))
2024-07-26T00:12:27.4336768Z 24/07/25 17:12:27 INFO RetryableHttpAsyncClient: Building retryable http async client with options: FlintRetryOptions{maxRetries=3, retryableStatusCodes=429,502, retryableExceptionClassNames=Optional.empty}

Logs are output as expected in the IT run.
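
The backported commits above (log the stack trace, use the full message instead of a prefix) are what produce the `CustomLogging` error entry and the `AnalysisException` stack trace seen in this output. A minimal sketch of that pattern follows; the object and method names are hypothetical, not the actual Flint classes:

```scala
import org.apache.spark.sql.{AnalysisException, SparkSession}
import org.slf4j.LoggerFactory

// Hypothetical sketch: catch an analysis failure, build the error message from the
// full exception message (not a truncated prefix), and log it with the stack trace.
object QueryErrorLoggingSketch {
  private val logger = LoggerFactory.getLogger(getClass)

  def executeWithErrorInsights(spark: SparkSession, query: String): Unit = {
    try {
      spark.sql(query).collect()
    } catch {
      case e: AnalysisException =>
        // Use the full message so the log carries the analyzer's complete diagnosis
        // ("Table or view not found: ...") rather than a generic prefix.
        val msg = s"Fail to analyze query. Cause: ${e.getMessage}"
        // Passing the throwable makes the logger emit the stack trace as well.
        logger.error(msg, e)
        throw e
    }
  }
}
```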

@noCharger merged commit 84dcceb into 0.4 on Jul 26, 2024
5 checks passed