[BUG] Node Slice Fast IPAM Object Too Large #497

Open
ivelichkovich opened this issue Aug 27, 2024 · 1 comment

Describe the bug
If you use a very large CIDR range with a very small node slice size, e.g. a /8 range with a /27 slice size, there are so many possible slices that they cannot fit within a single Kubernetes object, which has a maximum size of roughly 1.5 MB.
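
The slice count grows as a power of two in the difference between the two prefix lengths. A minimal sketch of the arithmetic (standalone Go for illustration, not Whereabouts code):

```go
package main

import "fmt"

// sliceCount returns how many node slices a CIDR range divides into:
// a /rangePrefix network split into /slicePrefix slices yields
// 2^(slicePrefix-rangePrefix) slices.
func sliceCount(rangePrefix, slicePrefix int) int {
	return 1 << (slicePrefix - rangePrefix)
}

func main() {
	// A /8 range with a /27 node slice size: 2^(27-8) = 524,288 slices,
	// all of which must be tracked in a single object.
	fmt.Println(sliceCount(8, 27)) // 524288
}
```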

Expected behavior
We should validate this, or surface an error at the outset that the configuration is not supported. A /8 range with /27 slices would produce 524,288 node slices, far beyond the official 5,000-node support. Whereabouts should support more than the base 5,000 nodes, but I don't think we need to support 500,000.
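
As a sketch of what that validation could look like (the `validateSliceConfig` name and `maxNodeSlices` cap here are hypothetical, not the actual Whereabouts API; the real cap would be chosen so the object stays well under the ~1.5 MB limit):

```go
package main

import "fmt"

// maxNodeSlices is a hypothetical cap on the number of node slices;
// the real value would be chosen so the resulting object stays
// comfortably under the ~1.5 MB Kubernetes object size limit.
const maxNodeSlices = 8192

// validateSliceConfig rejects range/slice combinations that would
// produce more slices than the supported maximum.
func validateSliceConfig(rangePrefix, slicePrefix int) error {
	if slicePrefix < rangePrefix {
		return fmt.Errorf("node_slice_size /%d cannot be larger than ip_range /%d", slicePrefix, rangePrefix)
	}
	if n := 1 << (slicePrefix - rangePrefix); n > maxNodeSlices {
		return fmt.Errorf("ip_range /%d with node_slice_size /%d yields %d node slices; the supported maximum is %d", rangePrefix, slicePrefix, n, maxNodeSlices)
	}
	return nil
}

func main() {
	// Surfaces an error up front instead of creating an oversized object.
	if err := validateSliceConfig(8, 27); err != nil {
		fmt.Println(err)
	}
}
```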

To Reproduce
Steps to reproduce the behavior:

  1. Use Node Slice Pool
  2. Create a NAD with an ip_range of /8 and a node_slice_size of /27

Environment:

  • Whereabouts version: latest main (not yet released)
  • Kubernetes version (use kubectl version): any
  • Network-attachment-definition: N/A
  • Whereabouts configuration (on the host): N/A
  • OS (e.g. from /etc/os-release): N/A
  • Kernel (e.g. uname -a): N/A
  • Others: N/A

Additional info / context
My team can work on a resolution for this. Our initial thinking is to surface this as an error and document the supported limits.

@ivelichkovich (Contributor, Author) commented Aug 27, 2024

This can be assigned to me for now. Thanks @xagent003 for finding this one!
