[BUG] Not found exception for class in org.apache.log4j.* #183

Open
1 of 8 tasks
jcamachor opened this issue Sep 3, 2024 · 0 comments
Labels
bug Something isn't working

@jcamachor
Contributor

Willingness to contribute

Yes. I would be willing to contribute a fix for this bug with guidance from the OpenHouse community.

OpenHouse version

0.5.116

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 20.0): Ubuntu 24.04 LTS
  • JDK version: openjdk version "1.8.0_422"

Describe the problem

While trying to start Spark 3.1.1, e.g., the Thrift server, we get a ClassNotFoundException (stack trace below). It seems that the OpenHouse packaging does not relocate the org.apache.log4j dependencies in apps/spark/build.gradle, but string constants referencing those classes appear to be rewritten by the relocation rules, which causes the issue (a relocation sketch follows the log below).

starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to /home/ohuser/spark/logs/spark-ohuser-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-oh-headnode.out
failed to launch: nice -n 0 bash /home/ohuser/spark/bin/spark-submit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server --packages org.apache.iceberg:iceberg-azure:1.5.0,org.apache.iceberg:iceberg-spark-runtime-3.1_2.12:1.2.0 --jars openhouse-spark-apps_2.12-*-all.jar,openhouse-spark-runtime_2.12-latest-all.jar --conf spark.sql.catalog.openhouse=org.apache.iceberg.spark.SparkCatalog --conf spark.sql.catalog.openhouse.catalog-impl=com.linkedin.openhouse.spark.OpenHouseCatalog --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,com.linkedin.openhouse.spark.extensions.OpenhouseSparkSessionExtensions
    at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
    at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  Caused by: java.lang.ClassNotFoundException: openhouse/relocated/org.apache.log4j.Category
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    ... 15 more
full log in /home/ohuser/spark/logs/spark-ohuser-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-oh-headnode.out
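For illustration, here is a minimal sketch of what a Shadow-plugin relocation rule for these packages might look like. The task name, relocation prefix, and patterns below are assumptions for discussion and may not match what apps/spark/build.gradle actually contains:

```groovy
// Hypothetical sketch only — not the actual OpenHouse shadow configuration.
shadowJar {
    // Option A: also relocate the log4j classes themselves, so that rewritten
    // string constants (e.g. "openhouse.relocated.org.apache.log4j.Category")
    // resolve to a class that really exists in the shaded jar.
    relocate 'org.apache.log4j', 'openhouse.relocated.org.apache.log4j'

    // Option B: leave log4j untouched by excluding it from a broader
    // org.apache relocation, so Spark keeps resolving the original names.
    // relocate('org.apache', 'openhouse.relocated.org.apache') {
    //     exclude 'org.apache.log4j.**'
    // }
}
```

Either direction would presumably keep the class names embedded in string constants consistent with the classes actually shipped in the shaded jar.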

Stacktrace, metrics and logs

No response

Code to reproduce bug

No response

What component does this bug affect?

  • Table Service: This is the RESTful catalog service that stores table metadata. :services:tables
  • Jobs Service: This is the job orchestrator that submits data services for table maintenance. :services:jobs
  • Data Services: These are the jobs that perform table maintenance. apps:spark
  • Iceberg internal catalog: This is the internal Iceberg catalog for OpenHouse Catalog Service. :iceberg:openhouse
  • Spark Client Integration: This is the Apache Spark integration for OpenHouse catalog. :integration:spark
  • Documentation: This is the documentation for OpenHouse. docs
  • Local Docker: This is the local Docker environment for OpenHouse. infra/recipes/docker-compose
  • Other: Please specify the component.
jcamachor added the bug label on Sep 3, 2024