Document new experimental ingestion streaming APIs #584
Conversation
Signed-off-by: Andriy Redko <[email protected]>
spec/namespaces/_core.yaml
Outdated
/_bulk/stream:
  post:
    operationId: bulk.stream.0
    x-operation-group: bulk
The running linter will throw an error on the operationId and the x-operation-group. I think you want to use bulk_stream for x-operation-group and bulk_stream.0 for operationId. Then, on the generated client, there will be a client.bulk_stream() method, which is different from client.bulk().
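Applied to the spec snippet above, the suggested renaming would look roughly like this (a hedged sketch based on the reviewer's comment; indentation follows the snippet under review):

```yaml
/_bulk/stream:
  post:
    operationId: bulk_stream.0
    x-operation-group: bulk_stream
```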
Thanks a lot for the hints, @nhtruong !
Please add tests to these operations as well.
Changes Analysis
Commit SHA: 8043e59
API Changes Summary
Report: The full API changes report is available at: https://github.com/opensearch-project/opensearch-api-specification/actions/runs/11108235686/artifacts/1996253002
API Coverage
Spec Test Coverage Analysis
Signed-off-by: Andriy Redko <[email protected]>
Good start. Needs tests, please!
Signed-off-by: Andriy Redko <[email protected]>
Signed-off-by: Andriy Redko <[email protected]>
You'll need to update the 2.17 version used to the released one, and switch the next version to 2.18. Also see validator failures, etc.
tests/default/_core/bulk_stream.yaml
Outdated
@@ -0,0 +1,54 @@
$schema: ../../../json_schemas/test_story.schema.yaml
File should be _core/bulk/stream.yaml.
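For reference, a minimal test story at that path might look something like the sketch below. The field names (description, chapters, synopsis, request, response) are assumptions about the repo's test_story schema, not copied from it, and the payload is a made-up example:

```yaml
$schema: ../../../../json_schemas/test_story.schema.yaml
description: Test the experimental streaming bulk endpoint.
chapters:
  - synopsis: Stream a bulk request with a single index action.
    path: /_bulk/stream
    method: POST
    request:
      content_type: application/x-ndjson
      payload:
        - {index: {_index: movies, _id: '1'}}
        - {title: Dune}
    response:
      status: 200
```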
@@ -0,0 +1,60 @@
$schema: ../../../json_schemas/test_story.schema.yaml
Same, bulk/stream.yaml.
Signed-off-by: Andriy Redko <[email protected]>
Signed-off-by: Andriy Redko <[email protected]>
So does it actually stream data? 🤔
It does, but I am not sure how we could express that in OpenAPI.
I don't think we need to, but do clients automatically do it? AFAIK we don't have code in our tooling that does anything special.
@dblock need your advice here: streaming (as of today) requires a dedicated transport plugin. Here are 2 options for how we could make it work:
Oh, we don't need to actually, no special treatment required.
I think this is the right answer. The tooling in this repo should support streaming just like we support various content types.
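On the client side, "streaming" a bulk body usually just means sending the NDJSON payload with chunked transfer encoding instead of buffering it all in memory first. A minimal Python sketch of that idea follows; the /_bulk/stream path comes from the spec change in this PR, while the helper name and the sample actions are purely illustrative:

```python
import json


def ndjson_chunks(actions):
    """Lazily yield NDJSON-encoded bulk lines, one action (plus its
    optional source document) at a time, so the full payload is never
    assembled in memory."""
    for action, doc in actions:
        yield json.dumps(action) + "\n"
        if doc is not None:
            yield json.dumps(doc) + "\n"


# With a chunked-capable HTTP client (e.g. `requests`), passing a
# generator as the body sends it with Transfer-Encoding: chunked:
#   requests.post(host + "/_bulk/stream",
#                 data=ndjson_chunks(actions),
#                 headers={"Content-Type": "application/x-ndjson"})

actions = [
    ({"index": {"_index": "movies", "_id": "1"}}, {"title": "Dune"}),
    ({"delete": {"_index": "movies", "_id": "2"}}, None),
]
payload = "".join(ndjson_chunks(actions))
print(payload)
```

Since the generator only yields plain NDJSON lines, the spec does not need to describe anything beyond the application/x-ndjson content type, which matches the conclusion above.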
Signed-off-by: Andriy Redko <[email protected]>
Signed-off-by: Andriy Redko <[email protected]>
@@ -0,0 +1,60 @@
$schema: ../../../../json_schemas/test_story.schema.yaml
You'll have to move the tests into the plugins folder as well, out of tests/default.
Signed-off-by: Andriy Redko <[email protected]>
Signed-off-by: Andriy Redko <[email protected]>
Description
Document new experimental ingestion streaming APIs
Issues Resolved
Closes #537
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.