Replies: 6 comments 10 replies
-
Asking here per request: as this uncouples core and adapters, it seems like a pretty big shift. Why is this not version 2.0.0? Or, if for internal reasons it must be 1.8.0, are adapters free to version their compatible releases as 2.0.0? I want to signal to customers that they can't just take the latest without making the change, particularly in environments where they get the latest by default, like Databricks Workflows.
-
How do we decide the full spectrum of SQL types and YAML types to put in the type tests? Is it basically: here are all the SQL types that my database supports, and here are all the ways I expect users might want to specify them? Is there any standardized way to communicate this mapping to users?
-
@dataders, the `test_unit_testing` test is generating a nested CTE, which is not yet supported in Fabric.
-
Also, I added `render_limited()`, since the `LIMIT` keyword is not supported by T-SQL. It looks like the default implementation from dbt-core is picked up instead of the adapter's implementation.
-
Hi. Curious to know if anyone has successfully implemented dbt-core (v1.8.6) unit testing with dbt-athena (v1.8.4) for Iceberg tables. I tried running a unit test, but it attempts to create a Hive table instead.
-
Overview
This discussion is for communicating to adapter maintainers the scope of work needed to make use of the changes coming in dbt-core 1.8.0. If you have questions or concerns, please ask them here for posterity.
Please consider this a living document between now and the date of final release. If there's something missing, please comment below!
Loom video overview (12 min)
TBD
release timeline
The table below gives the milestones up to and including the final release. It will be updated with each subsequent release.
prior maintainer upgrade guides
Example Diffs
TL;DR
This upgrade is a big deal, and should be the last time in a while that dbt Labs asks maintainers for this amount of work. There are two logical pieces of work here that, in theory, aren't especially complex:
Decoupled Dependency on dbt-core
Context
Excerpt from #9171
How to implement
Everything you need to know is in dbt-adapters#87. If you have a question or concern, please ask it there.
Unit Testing
Unit testing is a long-awaited feature that is finally coming to dbt. It is a powerful feature that allows users to test their models in isolation from the rest of their project. This should not require a great deal of work on the part of the adapter maintainer; but, more importantly, it changes how dbt is used, so it's important that we have test coverage for these new scenarios.
support for the `--empty` flag to enable "dry run" mode

How to implement `--empty` support

dbt-core#8971 added a new `BaseRelation` method: `render_limited()`. Effectively, this method will wrap a model's `SELECT` statement into a subquery. If your data platform supports `LIMIT` clauses, you have no work to do. However, some SQL dialects (e.g. T-SQL) do not support `LIMIT` clauses. In this case, you will need to implement `render_limited()` for your adapter.

`--empty` Tests

- `dbt.tests.adapter.empty.test_empty.BaseTestEmpty`
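To make the override concrete for a dialect without `LIMIT`: the string transformation `render_limited()` performs can be sketched in isolation. This is a simplified, hypothetical illustration — in a real adapter the logic lives as a method on your `BaseRelation` subclass, and the signature and subquery alias should match dbt's actual implementation:

```python
def render_limited(rendered_relation: str, limit=None) -> str:
    """Wrap an already-rendered relation in a row-capping subquery.

    T-SQL-style sketch: since the dialect has no LIMIT clause, emit
    SELECT TOP (n) inside an aliased subquery instead. The function
    shape and alias name here are illustrative, not dbt's actual code.
    """
    if limit is None:
        return rendered_relation
    # dbt's --empty "dry run" mode renders inputs with a limit of 0,
    # so models compile and run against zero-row fixtures
    return f"(select top {limit} * from {rendered_relation}) _dbt_limit_subq"
```

With `--empty`, each input relation is rendered with a limit of 0, so the model's SQL executes against zero-row inputs and the run stays cheap.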
macros: `CAST` and `SAFE_CAST` support

dbt does a vast amount of type casting behind the scenes, such as:
- `agate` (our current CSV reader) <> Python
- YAML <> Python

In the case of unit testing, we needed to extend the `yaml` <> `python` casting further to allow for more specific definition of mock data. Theoretically this should be a no-op for most adapters, but it's worth checking to make sure that the `CAST` and `SAFE_CAST` macros are supported in your adapter. For example, dbt-spark now has a `safe_cast` that it did not have before (dbt-spark#files).

Tests within `dbt.tests.adapter.unit_testing`
For unit testing, there are a handful (3) of functional tests worth implementing as an adapter to ensure both baseline functionality and expected behavior when mocking inputs with various types:

- `test_types.BaseUnitTestingTypes`
  - has a `data_types` fixture (example)
  - `data_types` is a list of (`sql_value`, `yaml_value`) pairs, where `sql_value` should be a literal in the upstream "real" input, and `yaml_value` is what the value looks like in YAML when it is being mocked out by the user
  - the `unit` materialization makes use of `safe_cast` to cast the user-provided YAML value to the expected input type while building fixture CTEs
  - `safe_cast` defers to `cast` if the adapter does not support safe casting to a particular type (e.g. Snowflake's `safe_cast` does not support variant)
  - `safe_cast` and `cast` may need to be extended as appropriate to support a fuller range of inputs that can be expected from the user when specifying unit test input fixtures
- `test_case_insensitivity.BaseUnitTestCaseInsensivity`
  - Note: there is a misspelling in the class name; it is `Insensivity`, not `Insensitivity`
  - it is possible to override `get_fixture_sql`, which I'd advise against, as those provide the main adapter framework/entrypoints of the unit testing functionality, and our 1p implementations have not had to
- `test_invalid_input.BaseUnitTestInvalidInput`
  - exercises the `format_row` macro, which is what provides this functionality in the default implementation

Additional Tests
Important
These are new tests introduced into the adapter zone that you should have in your adapter.
TBD
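As a purely illustrative model of the unit-testing machinery described above (not dbt's actual macros or fixtures): the `(sql_value, yaml_value)` pairs that a `data_types` fixture contains, and the safe-cast-with-fallback rendering that the `unit` materialization relies on, sketched in plain Python. All names, values, and the `try_cast` spelling are hypothetical stand-ins:

```python
# Hypothetical (sql_value, yaml_value) pairs, in the spirit of a
# data_types fixture: the SQL literal used in the "real" upstream
# input, and the YAML form a user would write when mocking that value.
data_types = [
    ("1", "1"),                                    # integer
    ("'my_string'", "my_string"),                  # string
    ("true", "true"),                              # boolean
    ("cast('2020-01-02' as date)", "2020-01-02"),  # date
]

def safe_cast_sql(expr: str, data_type: str, supports_safe_cast: bool) -> str:
    """Render a cast for a mocked fixture value.

    Mirrors the described fallback: use the platform's safe cast when
    the type supports it, otherwise fall back to a plain cast (e.g.
    Snowflake's safe_cast does not support variant). Function and
    macro names here are illustrative, not dbt's actual signatures.
    """
    if supports_safe_cast:
        # try_cast is one common dialect spelling of a safe cast
        return f"try_cast({expr} as {data_type})"
    return f"cast({expr} as {data_type})"
```

The point of `test_types.BaseUnitTestingTypes` is to exercise every such pair end to end, so extending the fixture (and, where needed, the `safe_cast`/`cast` macros) is how an adapter broadens its supported mock inputs.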
Materialized Views Refactor
When the `1.7` upgrade guide was originally published (October 2023), the "Materialized Views Refactor" section was stubbed out. In December it was more fully fleshed out. If your warehouse supports materialized views, you should check it out. Even if yours doesn't, the changes implemented represent a vision of the future for how materializations are handled in dbt.