Improved Tests for Schema Size Limit (Issue #312)(Adding Test max encoded schema limit exceeded to pallet/schema) #435
Add this suggestion to a batch that can be applied as a single commit.
This suggestion is invalid because no changes were made to the code.
Suggestions cannot be applied while the pull request is closed.
Suggestions cannot be applied while viewing a subset of changes.
Only one suggestion per line can be applied in a batch.
Add this suggestion to a batch that can be applied as a single commit.
Applying suggestions on deleted lines is not supported.
You must change the existing code in this line in order to create a valid suggestion.
Outdated suggestions cannot be applied.
This suggestion has been applied or marked resolved.
Suggestions cannot be applied from pending reviews.
Suggestions cannot be applied on multi-line comments.
Suggestions cannot be applied while the pull request is queued to merge.
Suggestion cannot be applied right now. Please check back later.
Hi everyone,
This pull request addresses issue #312 and aims to improve the testing of the maximum schema size limit. These improvements ensure that the system correctly handles scenarios where the schema size might be exceeded.
My Changes:
I've added more comprehensive tests to the existing suite:
One test specifically checks the Schema::create function to verify it throws the MaxEncodedSchemaLimitExceeded error when data is too large.
Another test case focuses on an edge case, making sure the limit check works correctly for data sizes very close to the maximum allowed size.
I've used a mock storage system (MockStorage) to isolate the tests from external dependencies, making them more reliable.
I've included comments throughout the code to explain the purpose of each section (arrange, act, assert) for better understanding.
Benefits:
These additional tests will help us catch potential issues related to exceeding the schema size limit before they occur in production, leading to a more robust system.
Developers can be more confident in how the system handles large schemas.
Clear test cases will make it easier to maintain the code in the future.
Next Steps:
I'd appreciate it if you could review the code changes and test them.
If there are other functions that might be affected by the schema size limit, we could consider adding similar test cases for those as well.
Please double-check that the expected error type (MaxEncodedSchemaLimitExceeded) is accurate.
Thank you for your time and consideration. Please merge this pull request when you're satisfied with the changes.
Best,
Harish Karthick S