Cannot create a pydantic model with a pandera.typing.pyspark.DataFrame type #1446
Comments
The pyspark.sql pandera backend does not currently support pydantic types. The current behavior is designed to work only with pyspark types. Going to change this to an enhancement ticket; will need discussion with the de facto code owners for the pyspark.sql integration: @NeerajMalhotra-QB @jaskaransinghsidana.
Ah, okay, I misread this issue! You want to use a pandera pyspark.sql schema in your pydantic models, correct? This should actually work, so reverting this to a bug. Open to contributions for this.
Just looking at the code above, I suspect the issue is your import.
I get the same error with both imports. I have a fix working locally and will submit a PR for this in the next couple of days.
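Presumably the two imports being compared are the generic and the pyspark-specific DataFrame types; a hedged sketch, assuming these are the paths in question (the thread does not show them explicitly):

```python
# Assumed import paths under comparison (not shown explicitly in the thread):
from pandera.typing import DataFrame as PandasDataFrame           # pandas-oriented generic type
from pandera.typing.pyspark import DataFrame as PySparkDataFrame  # pyspark.sql-specific type
```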
Submitted a bugfix in #1447 for review.
Describe the bug
Pydantic models always throw is_instance_of validation errors if a pandera.typing.pyspark.DataFrame type is used. Pydantic integration with pyspark dataframes is broken.
Code Sample, a copy-pastable example
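A minimal sketch of the kind of code described in this report (not the reporter's exact snippet, which is not shown above); SimpleSchema is a hypothetical schema name, while PydanticContainer matches the name used in the expected-behavior section below:

```python
# Minimal sketch, not the reporter's original snippet. SimpleSchema is a
# hypothetical schema; PydanticContainer is the name referenced in the
# expected-behavior section below.
import pandera.pyspark as pa
import pyspark.sql.types as T
from pandera.typing.pyspark import DataFrame
from pydantic import BaseModel
from pyspark.sql import SparkSession


class SimpleSchema(pa.DataFrameModel):
    """Hypothetical single-column pyspark schema."""
    name: T.StringType() = pa.Field()


class PydanticContainer(BaseModel):
    """Pydantic model holding a pandera-typed pyspark DataFrame."""
    data: DataFrame[SimpleSchema]


spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("foo",)], ["name"])

# Per the report, this raises a pydantic ValidationError with an
# is_instance_of error even though df is a valid pyspark.sql DataFrame.
container = PydanticContainer(data=df)
```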
The above leads to an is_instance_of validation error from pydantic.
Expected behavior
We would expect the PydanticContainer to instantiate successfully. The error says that the DataFrame we're feeding in is not a DataFrame.