Feature fix load as spark #509
Conversation
Fixed an issue when calling load_as_spark() in the case where a Spark session was already initiated in a separate thread and cannot be started in the current thread. This usually arises when using the method in a Django REST API.
This reverts commit 22f3124.
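A minimal sketch (not from the PR) of the failure mode described above, assuming the per-thread session tracking the description refers to; a Django request-handler thread would play the role of the worker thread here:

```python
import threading

from pyspark.sql import SparkSession

# A session created on the main thread.
spark = SparkSession.builder.master("local[1]").getOrCreate()

def worker():
    # getActiveSession() tracks the active session per thread, so on this
    # worker thread it can return None even though a session exists in the
    # process -- the failure mode this PR addresses.
    print(SparkSession.getActiveSession())

t = threading.Thread(target=worker)
t.start()
t.join()
```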
@@ -150,11 +150,7 @@ def load_as_spark(
    except ImportError:
        raise ImportError("Unable to import pyspark. `load_as_spark` requires PySpark.")

    spark = SparkSession.getActiveSession()
    assert spark is not None, (
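For context, a hedged sketch of how the call site might look after this change; the replacement lines are not shown in the diff above, so this is an assumption based on the PR description:

```python
try:
    from pyspark.sql import SparkSession
except ImportError:
    raise ImportError("Unable to import pyspark. `load_as_spark` requires PySpark.")

# SparkSession.active() (PySpark 3.5+) falls back to the default session
# when no session is active on the current thread, and raises an error
# instead of returning None, which is what makes the explicit assert
# above unnecessary.
spark = SparkSession.active()
```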
do we still want to keep the assert?
It would not be beneficial to keep it. If any case occurs where fetching the Spark session fails, this assert would handle it.
I'd like to hear your thoughts on which approach would be better.
Sorry for the delay. Reading this, it seems getActiveSession will default to None, so I think this assert is still helpful: https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/SparkSession.html
Feel free to reopen.
Closes #508
Used SparkSession.active() instead of SparkSession.getActiveSession() in Delta-Sharing.py; active() returns a Spark session regardless of whether an active session is present on the current thread.
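For illustration (not code from the PR), a hedged comparison of the two lookups, assuming PySpark 3.5+ where SparkSession.active() is available:

```python
from pyspark.sql import SparkSession

# Ensure a session exists somewhere in the process for this demo.
SparkSession.builder.master("local[1]").getOrCreate()

# getActiveSession(): returns the session bound to the current thread, or
# None when called from a thread with no active session -- which is why
# the old code needed the assert.
maybe_spark = SparkSession.getActiveSession()

# active(): returns the thread-active session if present, otherwise the
# default session, and raises rather than returning None, so a separate
# None-check is unnecessary.
spark = SparkSession.active()
```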