
feat: sync BigQueryDataTransferConfig to pinned googleapis version #3335

Conversation

@jingyih (Collaborator) commented Dec 6, 2024

Bring the BigQueryDataTransferConfig resource up to date with the pinned googleapis version.

This introduces the following new fields.

  • spec.scheduleOptionsV2
    • spec.scheduleOptionsV2.eventDrivenSchedule.pubSubSubscriptionRef is added as a reference field
  • status.observedState.error

Adding new optional fields in v1beta1 should be fine. The conversion can be handled automatically by kube-apiserver.
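For orientation, here is a minimal Go sketch of where the new fields sit in the v1beta1 types. The surrounding struct and field names follow the usual Config Connector conventions and the bullet list above; they are assumptions for illustration, not excerpts from this PR's generated code.

// Sketch only: names are assumed, not copied from the PR.
type BigQueryDataTransferConfigSpec struct {
	// ...existing spec fields...

	// Options customizing different types of data transfer schedule.
	ScheduleOptionsV2 *ScheduleOptionsV2 `json:"scheduleOptionsV2,omitempty"`
}

type ScheduleOptionsV2 struct {
	// Event-driven transfer schedule.
	// EventDrivenSchedule (with its Pub/Sub subscription reference) is shown
	// in the review thread below.
	EventDrivenSchedule *EventDrivenSchedule `json:"eventDrivenSchedule,omitempty"`
}

type BigQueryDataTransferConfigObservedState struct {
	// ...existing observed-state fields...

	// Output only. Error information from the latest transfer config failure.
	// The Error type is discussed in the review thread on the NOTYET block below.
	Error *Error `json:"error,omitempty"`
}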

Issue: #3254

@jingyih force-pushed the sync_bigquerydatatransferconfig_to_pinned_version branch from ac3f334 to 39dbcbe on December 7, 2024 at 00:07
@jingyih (Collaborator, Author) commented Dec 7, 2024

This PR bumps cloud.google.com/go/bigquery from 1.63.1 to 1.65.0, which unblocks #3125.

cc @yuwenma @xiaoweim

@jingyih changed the title from "[WIP] feat: sync BigQueryDataTransferConfig to pinned googleapis version" to "feat: sync BigQueryDataTransferConfig to pinned googleapis version" on Dec 7, 2024
type EventDrivenSchedule struct {
// Pub/Sub subscription used to receive events.
// Only Google Cloud Storage data source support this option.
// Format: projects/{project}/subscriptions/{subscription}
Collaborator:
nit: comment needs to be updated to resource reference

Collaborator (Author):

Done!
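For reference, the updated field and its comment might read roughly as follows. The reference type's shape is the common KCC pattern (an external resource name or an in-cluster name/namespace) and is an assumption here, not a copy of the PR's code.

type EventDrivenSchedule struct {
	// Reference to the Pub/Sub subscription used to receive events.
	// Only the Google Cloud Storage data source supports this option.
	PubSubSubscriptionRef *PubSubSubscriptionRef `json:"pubSubSubscriptionRef,omitempty"`
}

// PubSubSubscriptionRef sketches the usual KCC reference shape: either an
// external resource name or a reference to a PubSubSubscription object.
type PubSubSubscriptionRef struct {
	// External name, e.g. "projects/{project}/subscriptions/{subscription}".
	External string `json:"external,omitempty"`
	// Name (and optional namespace) of a PubSubSubscription resource in the cluster.
	Name      string `json:"name,omitempty"`
	Namespace string `json:"namespace,omitempty"`
}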

/* NOTYET
// A list of messages that carry the error details. There is a common set of
// message types for APIs to use.
Details []Any `json:"details,omitempty"`
Collaborator:

Can I ask why we comment this out? The Any struct seems to be good (though the name is confusing) https://github.com/GoogleCloudPlatform/k8s-config-connector/pull/3335/files#diff-bed80b84c2e759102b2daf0a568eea2dd1d8309634dea509f245721d4684a350R102

Collaborator (Author):

I didn’t spend much time on this field because it’s an output field that can be added later if needed. This PR was created quickly to unblock another PR and was put together in a very short timeframe.
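For context, the guarded field follows the shape of google.rpc.Status, whose details entries are google.protobuf.Any messages. If the field is enabled later, the mapping would likely look something like the sketch below; the exact struct and field names here are assumptions based on those proto messages, not the PR's generated code.

// Error mirrors google.rpc.Status (sketch).
type Error struct {
	// The status code, an enum value of google.rpc.Code.
	Code *int32 `json:"code,omitempty"`
	// A developer-facing error message in English.
	Message *string `json:"message,omitempty"`
	// A list of messages that carry the error details. There is a common set
	// of message types for APIs to use. Currently guarded by NOTYET.
	Details []Any `json:"details,omitempty"`
}

// Any mirrors google.protobuf.Any (sketch).
type Any struct {
	// URL that uniquely identifies the type of the serialized message.
	TypeURL *string `json:"typeUrl,omitempty"`
	// The serialized message bytes.
	Value []byte `json:"value,omitempty"`
}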

}
return nil, fmt.Errorf("error reading referenced PubSubSubscription %v: %w", nn, err)
}
projectID, err := ResolveProjectID(ctx, reader, u)
Collaborator:

Have you tried this against real GCP?
The PubSubSubscription uses the project-id annotation so this will always end up as not-found.

cc @gemmahou to improve the project reference (focus area)

Collaborator (Author):

Not sure if I understand. Currently the code does look for the project-id annotation.

if projectID := obj.GetAnnotations()["cnrm.cloud.google.com/project-id"]; projectID != "" {

But I did not try this against real GCP; I can add a test later.
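For illustration, the lookup under discussion boils down to something like the sketch below. Only the annotation key comes from the quoted line above; the function name and the namespace fallback are assumptions based on the common KCC convention, not this PR's actual resolver.

package bigquerydatatransfer

import "k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"

// resolveProjectIDFromAnnotation picks the project for a referenced object:
// the cnrm.cloud.google.com/project-id annotation wins if present, otherwise
// fall back to the object's namespace (a common KCC default).
func resolveProjectIDFromAnnotation(obj *unstructured.Unstructured) string {
	if projectID := obj.GetAnnotations()["cnrm.cloud.google.com/project-id"]; projectID != "" {
		return projectID
	}
	return obj.GetNamespace()
}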


 go run . generate-mapper \
     --service google.cloud.bigquery.datatransfer.v1 \
-    --api-version bigquerydatatransfer.cnrm.cloud.google.com/v1alpha1
+    --api-version bigquerydatatransfer.cnrm.cloud.google.com/v1beta1
Collaborator:
👍

-    --resource BigQueryDataTransferConfig:TransferConfig
+    --api-version bigquerydatatransfer.cnrm.cloud.google.com/v1beta1 \
+    --resource BigQueryDataTransferConfig:TransferConfig \
+    --skip-scaffold-files # skipping because the files were generated using a previous pattern, making them incompatible with the new scaffolding approach.
Collaborator:
Oh! I didn't know about this flag. Nice!

Collaborator:

(reviewed the following code) Ah... no wonder... 😅

Collaborator (Author):

Yeah, this is needed for re-running the codegen commands, because we updated both the naming of the scaffolded files and the location of the external reference and identity code.

Collaborator:

Could you update the direct guide?

Collaborator (Author):

Ideally, new resources shouldn't require this flag. It's only intended for legacy direct resources created while the code template was being updated. We want the codegen tool to be rerunnable without the flag. I'm looking into the possibility of adding a presubmit check to ensure the tool can be rerun on each PR.

@yuwenma (Collaborator) commented Dec 11, 2024

/approve
/lgtm

Looks good, especially the flag in the scaffolding tooling. I think we want to resolve the project reference, but it's not a blocker for this PR.

google-oss-prow bot (Contributor):

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: yuwenma

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@yuwenma (Collaborator) commented Dec 11, 2024

/hold

Holding in case you can take a look at the comment. Feel free to unhold and resolve it in this PR or in a follow-up PR.

google-oss-prow bot removed the lgtm label on Dec 11, 2024
@yuwenma (Collaborator) commented Dec 12, 2024

/lgtm
/hold cancel

google-oss-prow bot merged commit 05b63d0 into GoogleCloudPlatform:master on Dec 12, 2024
19 checks passed