ES-hadoop is not compatible with spark 3.5.1 #2210
Comments
Many projects have similar issues from the Spark API change.
Thanks for the report!
[INFO] | | +- org.apache.spark:spark-network-common_2.12:jar:3.5.1:compile
Also, the dependencies bring in an old protobuf version that sets off OSS vulnerability scanning.
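For reference, one way to quiet the scanner findings until the connector's own dependencies move is to force a newer protobuf-java at the application level. This is only a sketch, not from this thread; the coordinates and version below are assumptions, and the override style shown is sbt's.

```scala
// build.sbt sketch (assumed version): override the old transitive protobuf-java
// that Spark pulls in so OSS vulnerability scanners stop flagging it.
dependencyOverrides += "com.google.protobuf" % "protobuf-java" % "3.25.5"
```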
Upgrading to Spark 3.5 is going to be tricky because of compiler errors caused by a breaking change in the Spark API. I think we'll have to move several more classes from our spark core package down into the various spark-version-specific packages.
These are unavoidable; previously, in Hive, we made "shim layers" and used reflection to deal with breaking API changes. I will look into at least getting it working, and then we can see what the change set is.
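As a rough illustration of that shim-layer idea (all class, trait, and package names below are hypothetical, not elasticsearch-hadoop code): the common module compiles only against a small trait, and a version-specific implementation is instantiated reflectively at runtime, so nothing in the common module links against the Spark API that changed.

```scala
import org.apache.spark.SPARK_VERSION

// Hypothetical abstraction over the Spark internals whose signatures changed.
trait SparkShim {
  def describe(): String
}

object SparkShimLoader {
  // Map the running Spark version to an implementation that lives in its own
  // module and is compiled against the matching Spark release.
  private val shimClassName: String =
    if (SPARK_VERSION.startsWith("3.5")) "org.example.shims.Spark35Shim"
    else "org.example.shims.Spark30Shim"

  // Reflection keeps this module free of a compile-time dependency on either shim.
  lazy val shim: SparkShim =
    Class.forName(shimClassName)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[SparkShim]
}
```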
Are there any updates on that?
We recently added support for 3.4.3, but we have not dealt with the big changes in 3.5 yet.
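Until 3.5.x support lands, the practical path suggested by that comment is to build against a Spark release the connector already supports. A minimal sbt sketch (the connector version string is a placeholder, not from this thread):

```scala
// build.sbt sketch: pin Spark to 3.4.3, which the maintainers say is supported,
// rather than 3.5.x. Use the elasticsearch-spark release that matches your cluster.
val esSparkVersion = "x.y.z" // placeholder

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "3.4.3" % Provided,
  "org.elasticsearch" %% "elasticsearch-spark-30" % esSparkVersion
)
```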
@masseyke I am facing this issue with Spark 3.5.2. Any update on this so far?
No update yet, sorry.
@masseyke Thanks for the confirmation.
What kind of an issue is this? Bug report.
Issue description
Spark 3.5.1 has changed some UDF code in Catalyst, which breaks a number of applications built against older versions of Spark.
Steps to reproduce
Code:
Stack trace:
Version Info
OS: Linux
JVM: JDK 8/11
Hadoop/Spark: Spark 3.5.1
ES-Hadoop:
ES: 7.x latest.