merge dev to main (v2.8.0) #1827
Conversation

Commits:
- …create/update input types (#1804)
- …ids (#1801) Co-authored-by: ymc9 <[email protected]>
- …satisfy Next15's linter (#1809)
- …irect abstract base (#1818)
> ⌛ **Warning: Rate limit exceeded.** @ymc9 has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 8 minutes and 32 seconds before requesting another review. After the wait time has elapsed, a review can be triggered again. CodeRabbit enforces hourly rate limits for each developer per organization; paid plans have higher rate limits than the trial, open-source, and free plans, and further reviews are re-allowed after a brief timeout.

📝 **Walkthrough**

The pull request encompasses extensive modifications across various files, primarily focusing on enhancing documentation clarity, updating changelogs, and introducing new features related to type definitions in the ZenStack framework. Key changes include the addition of a

Changes
Actionable comments posted: 28
🧹 Outside diff range and nitpick comments (67)
packages/schema/src/language-server/zmodel-documentation-provider.ts (1)
Lines 3-5: **Enhance class documentation with more details.**

While the documentation is clear, consider adding:
- An `@extends` tag to document inheritance
- Example usage showing both triple-slash and JSDoc comment scenarios
````diff
 /**
+ * @extends JSDocDocumentationProvider
  * Documentation provider that first tries to use triple-slash comments and falls back to JSDoc comments.
+ *
+ * @example
+ * ```ts
+ * // Using triple-slash comments
+ * /// This is a model
+ * model User {
+ *     id Int
+ * }
+ *
+ * // Using JSDoc comments
+ * /** User model description *\/
+ * model User {
+ *     id Int
+ * }
+ * ```
  */
````

script/test-scaffold.ts (1)
Line 22: **Add a comment explaining version requirements.**

Consider adding a comment above the dependency installation line to document why specific versions are required and any known compatibility constraints.
```diff
+// Prisma versions must be kept in sync with the framework's requirements
 run('npm i --no-audit --no-fund typescript [email protected] @prisma/[email protected] zod decimal.js @types/node');
```

packages/schema/src/plugins/enhancer/model-meta/index.ts (1)
Line range hint 13-18: **Consider adding a comment to explain the preserveTsFiles logic.**

The function call update correctly includes the new `typeDefs` parameter. However, the preserveTsFiles condition could benefit from a brief comment explaining when each condition applies. Consider adding a comment like this:
```diff
+    // Preserve TypeScript files if explicitly requested or if a custom output path is provided
     const preserveTsFiles = options.preserveTsFiles === true || !!options.output;
```
packages/schema/src/language-server/validator/typedef-validator.ts (3)
Lines 7-10: **Enhance TypeDefValidator documentation.**

While the documentation is present, it could be more descriptive by including:
- The purpose and responsibility of the validator
- A description of the validation rules being applied
- Example usage, if applicable
```diff
 /**
- * Validates type def declarations.
+ * Validates type definition declarations in the schema.
+ *
+ * This validator ensures:
+ * 1. No duplicate field declarations within a type definition
+ * 2. Valid attribute applications on type definition fields
+ *
+ * @example
+ * const validator = new TypeDefValidator();
+ * validator.validate(typeDefNode, acceptor);
  */
```
Lines 11-14: **Add a null check for the typeDef parameter.**

The validate method should handle the case where typeDef is null or undefined, to prevent runtime errors.
```diff
 validate(typeDef: TypeDef, accept: ValidationAcceptor): void {
+    if (!typeDef) {
+        return;
+    }
     validateDuplicatedDeclarations(typeDef, typeDef.fields, accept);
     this.validateFields(typeDef, accept);
 }
```
Lines 16-18: **Add defensive checks in the validateFields method.**

Consider adding null checks and handling an empty fields array gracefully.
```diff
 private validateFields(typeDef: TypeDef, accept: ValidationAcceptor) {
+    if (!typeDef.fields?.length) {
+        return;
+    }
     typeDef.fields.forEach((field) => this.validateField(field, accept));
 }
```

tests/integration/tests/enhancements/json/validation.test.ts (3)
Lines 4-20: **Consider enhancing test coverage for provider support.**

The test effectively validates the postgres-only constraint for custom-typed fields. However, consider expanding the test coverage:
- Add explicit tests with different providers (e.g., SQLite, MySQL) to verify the error occurs
- Include more complex type definitions to ensure the constraint applies universally

Example enhancement:
```ts
it('fails with different providers', async () => {
    const model = `
    datasource db {
        provider = "sqlite"
        url = "file:./dev.db"
    }
    type Profile {
        age Int @gt(0)
        address String?
    }
    model User {
        id Int @id @default(autoincrement())
        profile Profile @json
        @@allow('all', true)
    }
    `;
    await expect(loadModelWithError(model)).resolves.toContain(
        'Custom-typed field is only supported with "postgresql" provider'
    );
});
```
Lines 22-38: **Consider additional test cases for @json attribute validation.**

The test effectively validates the @json attribute requirement. Consider adding test cases for:
- Multiple custom-typed fields in the same model
- Nested custom types
- Arrays of custom types

Example enhancement:
```ts
it('requires @json for all custom-typed fields', async () => {
    const model = `
    type Address {
        street String
    }
    type Profile {
        age Int @gt(0)
        addresses Address[]
    }
    model User {
        id Int @id @default(autoincrement())
        profile Profile
        address Address
        @@allow('all', true)
    }
    `;
    await expect(loadModelWithError(model)).resolves.toContain(
        'Custom-typed field must have @json attribute'
    );
});
```
Lines 1-39: **Consider additional test scenarios for robustness.**

The test file would benefit from:
- Test setup/cleanup hooks (beforeAll/afterAll) if any shared resources are needed
- Error cases for invalid type definitions (e.g., circular references, invalid field types)
- Tests for type validation during runtime CRUD operations
Would you like me to provide examples of these additional test scenarios?
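One of the suggested error cases, circular type-definition references, can be checked with a plain graph walk. A minimal standalone sketch, not the ZenStack validator's actual implementation; the `TypeDefNode` shape and `hasCircularTypeDef` helper are hypothetical names for illustration:

```typescript
// Hypothetical minimal shape of a type definition: a name plus the
// names of other type defs it references through its fields.
interface TypeDefNode {
    name: string;
    references: string[];
}

// Depth-first search that reports whether any type def can reach itself.
function hasCircularTypeDef(typeDefs: TypeDefNode[]): boolean {
    const byName = new Map(typeDefs.map((t) => [t.name, t]));
    const visiting = new Set<string>();
    const done = new Set<string>();

    const visit = (name: string): boolean => {
        if (done.has(name)) return false;
        if (visiting.has(name)) return true; // back edge => cycle
        visiting.add(name);
        for (const ref of byName.get(name)?.references ?? []) {
            if (visit(ref)) return true;
        }
        visiting.delete(name);
        done.add(name);
        return false;
    };

    return typeDefs.some((t) => visit(t.name));
}

// Profile -> Address -> Profile is circular; Profile -> Address alone is not.
console.log(hasCircularTypeDef([
    { name: 'Profile', references: ['Address'] },
    { name: 'Address', references: ['Profile'] },
])); // true
console.log(hasCircularTypeDef([
    { name: 'Profile', references: ['Address'] },
    { name: 'Address', references: [] },
])); // false
```

A test along these lines would then only need to assert that the validator surfaces a diagnostic when such a cycle exists in the schema.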
tests/regression/tests/issue-1786.test.ts (2)
Lines 26-35: **Consider adding constraints and access rules to the Content model.**

The Content model could be improved:
- The `contentType` field should be an enum to restrict valid values
- Missing access control rules for content visibility

Consider this enhancement:
```diff
 model Content extends BaseContent {
     id String @id @default(cuid())
     createdAt DateTime @default(now())
     updatedAt DateTime @updatedAt
     owner User @relation(fields: [ownerId], references: [id])
     ownerId String
-    contentType String
+    contentType ContentType
     @@delegate(contentType)
+    @@allow('read', published == true || owner == auth())
 }

+enum ContentType {
+    Post
+    Video
+}
```
Lines 37-44: **Add field constraints to the Post and Video models.**

Consider adding validation rules:
- Add length constraints to the `title` and `name` fields
- Add a minimum value constraint to the `duration` field

```diff
 model Post extends Content {
-    title String
+    title String @length(1, 255)
 }

 model Video extends Content {
-    name String
-    duration Int
+    name String @length(1, 255)
+    duration Int @gt(0)
 }
```

tests/regression/tests/issue-1755.test.ts (2)
Lines 7-29: **Consider enhancing schema robustness and documentation.**

The schema could benefit from the following improvements:
- Add documentation comments explaining the delegation pattern and its purpose
- Add validation for the `contentType` field to ensure it only contains valid values ('Post' or 'Video')
- Add indexes for better query performance
```diff
 model Content {
     id Int @id @default(autoincrement())
     createdAt DateTime @default(now())
     user User @relation(fields: [userId], references: [id])
     userId Int
-    contentType String
+    contentType String @allowed(['Post', 'Video'])
     @@delegate(contentType)
+    @@index([userId])
+    @@index([createdAt])
 }
```
Lines 3-4: **Improve test organization and documentation.**

Consider restructuring the test suite for better clarity and maintainability:
- Split into separate test cases for each ordering scenario
- Add more descriptive test titles
- Add comments explaining the purpose of each test case
```diff
-describe('issue 1755', () => {
-    it('regression', async () => {
+describe('Content Model Ordering (Issue #1755)', () => {
+    let db: any;
+    let user: any;
+
+    beforeEach(async () => {
+        // Setup code here
+    });
+
+    it('should order posts by createdAt using scalar orderBy', async () => {
+        // First test case
+    });
+
+    it('should order posts by createdAt using array orderBy', async () => {
+        // Second test case
+    });
+
+    it('should support nested ordering in relations', async () => {
+        // Third test case
+    });
```

packages/schema/src/language-server/validator/zmodel-validator.ts (1)
Lines 79-81: **Consider caching the TypeDefValidator instance.**

The implementation correctly follows the established validation pattern. However, creating a new TypeDefValidator instance for each validation could impact performance, especially in scenarios with frequent validations. Consider caching the TypeDefValidator instance as a class member:
```diff
 export class ZModelValidator {
+    private readonly typeDefValidator = new TypeDefValidator();

     checkTypeDef(node: TypeDef, accept: ValidationAcceptor): void {
-        this.shouldCheck(node) && new TypeDefValidator().validate(node, accept);
+        this.shouldCheck(node) && this.typeDefValidator.validate(node, accept);
     }
 }
```

packages/runtime/src/enhancements/node/create-enhancement.ts (2)
Lines 94-100: **Consider memoizing the fields computation.**

The field computation logic is correct, but since this might be called frequently during enhancement processing, consider memoizing the result to avoid repeated computations.
```diff
-    const allFields = Object.values(options.modelMeta.models).flatMap((modelInfo) => Object.values(modelInfo.fields));
-    if (options.modelMeta.typeDefs) {
-        allFields.push(
-            ...Object.values(options.modelMeta.typeDefs).flatMap((typeDefInfo) => Object.values(typeDefInfo.fields))
-        );
-    }
+    const allFields = useMemo(() => {
+        const modelFields = Object.values(options.modelMeta.models)
+            .flatMap((modelInfo) => Object.values(modelInfo.fields));
+        const typeDefFields = options.modelMeta.typeDefs
+            ? Object.values(options.modelMeta.typeDefs)
+                .flatMap((typeDefInfo) => Object.values(typeDefInfo.fields))
+            : [];
+        return [...modelFields, ...typeDefFields];
+    }, [options.modelMeta]);
```
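Note that `useMemo` is a React hook and is not available in the runtime package; outside a React render, the same effect can be achieved with a plain cache keyed on the metadata object. A minimal standalone sketch under stated assumptions — the `MetaLike` shape and `getAllFields` helper are illustrative names, not the actual `@zenstackhq/runtime` API:

```typescript
// Illustrative minimal shape of the model metadata; the real ModelMeta
// type in the runtime has more fields.
interface MetaLike {
    models: Record<string, { fields: Record<string, unknown> }>;
    typeDefs?: Record<string, { fields: Record<string, unknown> }>;
}

// Cache keyed by the metadata object itself, so a given modelMeta
// instance is only flattened once and garbage-collects naturally.
const fieldsCache = new WeakMap<MetaLike, unknown[]>();

function getAllFields(meta: MetaLike): unknown[] {
    let fields = fieldsCache.get(meta);
    if (!fields) {
        fields = [
            ...Object.values(meta.models).flatMap((m) => Object.values(m.fields)),
            ...Object.values(meta.typeDefs ?? {}).flatMap((t) => Object.values(t.fields)),
        ];
        fieldsCache.set(meta, fields);
    }
    return fields;
}

const meta: MetaLike = {
    models: { User: { fields: { id: {}, name: {} } } },
    typeDefs: { Profile: { fields: { age: {} } } },
};
console.log(getAllFields(meta).length); // 3
console.log(getAllFields(meta) === getAllFields(meta)); // true: second call hits the cache
```

A `WeakMap` is a reasonable choice here because enhancement options typically share a single metadata object for the lifetime of the client.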
Line 93: **Address TODO comments before merging.**

The TODO comments about moving detection logic and caching should be addressed:
- Moving detection logic into each enhancement would improve modularity
- Proper caching could improve performance

Would you like help implementing these improvements? I can:
- Propose a design for moving detection logic into enhancements
- Implement a caching strategy for detection results
packages/schema/src/language-server/zmodel-workspace-manager.ts (2)
Line range hint 27-67: **Consider adding validation and safety measures to plugin initialization.**

The plugin initialization process could benefit from additional safeguards:
- Add validation for the plugin provider format
- Implement timeout handling for plugin loading
- Consider using a streaming approach for large workspaces
- Add concurrent access protection for the document collection

Here's a suggested improvement for the plugin provider validation:
```diff
 if (providerField) {
     const provider = getLiteral<string>(providerField.value);
-    if (provider) {
+    if (provider && this.isValidPluginProvider(provider)) {
         this.pluginModels.add(provider);
+    } else {
+        console.warn(`Invalid plugin provider format: ${provider}`);
     }
 }
```

Add this helper method:
```ts
private isValidPluginProvider(provider: string): boolean {
    // Add validation rules (e.g., npm package naming conventions)
    return /^(@[a-z0-9-~][a-z0-9-._~]*\/)?[a-z0-9-~][a-z0-9-._~]*$/.test(provider);
}
```
Line range hint 89-134: **Optimize directory traversal and sorting logic.**

While the implementation is functional, there are opportunities for optimization:
- The sorting logic for node_modules could be simplified
- The directory traversal could be more efficient

Consider this optimization:
```diff
-    const content = await (
-        await this.fileSystemProvider.readDirectory(folderPath)
-    ).sort((a, b) => {
-        // make sure the node_moudules folder is always the first one to be checked
-        // so it could be early exited if the plugin is found
-        if (a.isDirectory && b.isDirectory) {
-            const aName = Utils.basename(a.uri);
-            if (aName === 'node_modules') {
-                return -1;
-            } else {
-                return 1;
-            }
-        } else {
-            return 0;
-        }
-    });
+    const content = await this.fileSystemProvider.readDirectory(folderPath);
+
+    // Process node_modules first if it exists
+    const nodeModulesEntry = content.find(entry =>
+        entry.isDirectory && Utils.basename(entry.uri) === 'node_modules'
+    );
+    if (nodeModulesEntry) {
+        await this.processNodeModules(nodeModulesEntry, pluginModels, collector);
+        if (pluginModels.size === 0) return;
+    }
+
+    // Process remaining directories
+    for (const entry of content) {
+        if (entry.isDirectory && Utils.basename(entry.uri) !== 'node_modules') {
+            await this.loadPluginModels(workspaceFolder, entry.uri, pluginModels, collector);
+        }
+    }
```

Add this helper method:
```ts
private async processNodeModules(
    nodeModulesEntry: { uri: URI; isDirectory: boolean },
    pluginModels: Set<string>,
    collector: (document: LangiumDocument) => void
): Promise<void> {
    for (const plugin of Array.from(pluginModels)) {
        const path = Utils.joinPath(nodeModulesEntry.uri, plugin, PLUGIN_MODULE_NAME);
        try {
            this.fileSystemProvider.readFileSync(path);
            const document = this.langiumDocuments.getOrCreateDocument(path);
            collector(document);
            console.log(`Adding plugin document from ${path.path}`);
            pluginModels.delete(plugin);
        } catch {
            // no-op. The module might be found in another node_modules folder
        }
    }
}
```

tests/integration/tests/enhancements/json/typing.test.ts (4)
Lines 4-51: **Consider adding error cases and type assertions.**

While the happy path is covered, consider enhancing the test coverage:
- Add test cases for invalid age values (≤ 0) to verify @gt(0) validation
- Add type assertions to verify the structure of returned data
- Handle potential null references more safely in the findUnique query

Example enhancement:
```diff
 async function main() {
     const u = await db.user.create({ data: { profile: { age: 18 }, posts: { create: { title: 'Post1' }} } });
+    expect(u.profile.age).toBe(18);
     console.log(u.profile.age);
     const u1 = await db.user.findUnique({ where: { id: u.id } });
-    console.log(u1?.profile.age);
+    expect(u1?.profile?.age).toBe(18);
+
+    // Test validation
+    await expect(
+        db.user.create({ data: { profile: { age: 0 } } })
+    ).rejects.toThrow();
 }
```
Lines 53-92: **Enhance optional field test coverage.**

The test handles optional chaining well but could be more comprehensive by:
- Testing user creation without a profile
- Testing profile update scenarios
- Testing profile deletion

Example enhancement:
```diff
 async function main() {
     const u = await db.user.create({ data: { profile: { age: 18 } } });
     console.log(u.profile?.age);
+
+    // Test creation without profile
+    const u2 = await db.user.create({ data: {} });
+    expect(u2.profile).toBeNull();
+
+    // Test profile update
+    const u3 = await db.user.update({
+        where: { id: u.id },
+        data: { profile: { age: 25 } }
+    });
+    expect(u3.profile?.age).toBe(25);
+
+    // Test profile removal
+    const u4 = await db.user.update({
+        where: { id: u.id },
+        data: { profile: null }
+    });
+    expect(u4.profile).toBeNull();
 }
```
Lines 135-181: **Enhance nested type testing scenarios.**

The test covers basic nested type functionality but could be more robust with:
- Independent address updates
- Validation of nested type constraints
- Edge cases for optional nested fields

Example enhancement:
```diff
 async function main() {
     const u = await db.user.create({
         data: { profile: { age: 18, address: { city: 'Issaquah' } } }
     });
-    console.log(u.profile.address?.city);
+    expect(u.profile.address?.city).toBe('Issaquah');

+    // Test address update
+    const updated = await db.user.update({
+        where: { id: u.id },
+        data: {
+            profile: {
+                age: 18,
+                address: { city: 'Seattle' }
+            }
+        }
+    });
+    expect(updated.profile.address?.city).toBe('Seattle');
+
+    // Test removing nested optional field
+    const removed = await db.user.update({
+        where: { id: u.id },
+        data: {
+            profile: {
+                age: 18,
+                address: null
+            }
+        }
+    });
+    expect(removed.profile.address).toBeNull();
 }
```
Lines 1-181: **Consider implementing a shared test utility for common JSON field operations.**

Given the repetitive nature of the test cases and common operations (create, update, query), consider:
- Creating shared test utilities for common operations
- Implementing a custom matcher for JSON field assertions
- Adding a test helper for validation checks

Example shared utility:
```ts
// test-utils.ts
export const jsonFieldTestUtils = {
    async createAndVerify(db: any, data: any) {
        const created = await db.user.create({ data });
        expect(created).toBeDefined();
        return created;
    },
    async verifyValidation(db: any, data: any, expectedError?: string) {
        await expect(
            db.user.create({ data })
        ).rejects.toThrow(expectedError);
    }
};
```

This would make the tests more maintainable and consistent across different JSON field scenarios.

packages/runtime/src/enhancements/node/types.ts (1)
packages/runtime/src/enhancements/node/types.ts (1)
Lines 82-91: **Consider enhancing the documentation.**

The `DelegateConstraint` type is well-structured and enables powerful relationship-based permission checks. However, the documentation could be more detailed to help developers understand:
- How the delegation works in practice
- When to use this constraint type
- Examples of common use cases

Consider expanding the JSDoc like this:
```diff
 /**
  * Constraint delegated to another model through `check()` function call
  * on a relation field.
+ *
+ * This constraint type enables permission checks to be delegated to related models,
+ * allowing for complex permission scenarios. For example, a Post's visibility
+ * might depend on its Author's settings.
+ *
+ * @example
+ * {
+ *     kind: 'delegate',
+ *     model: 'Author',
+ *     relation: 'author',
+ *     operation: 'read'
+ * }
  */
```

packages/runtime/src/enhancements/node/default-auth.ts (2)
Line range hint 164-174: **Consider enhancing the error message for clarity.**

While the error handling is correct, the error message could be more specific about what kind of user context is required.
```diff
-    `Evaluating default value of field \`${fieldInfo.name}\` requires a user context`
+    `Evaluating default value of field \`${fieldInfo.name}\` requires an authenticated user context`
```
Lines 175-203: **Well-structured type definition handling, with room for safety improvements.**

The implementation is thorough and handles nested structures well. However, consider adding safety measures:
- Add a depth limit to prevent stack overflow with deeply nested structures
- Consider adding performance optimizations for large nested objects
```diff
-    private setDefaultValueForTypeDefData(type: string, data: any) {
+    // Add a maximum depth constant at class level
+    private static MAX_NESTING_DEPTH = 10;
+
+    private setDefaultValueForTypeDefDataInternal(type: string, data: any, depth = 0) {
+        if (depth >= DefaultAuthHandler.MAX_NESTING_DEPTH) {
+            throw new Error('Maximum nesting depth exceeded in type definition');
+        }
         if (!data || (typeof data !== 'object' && !Array.isArray(data))) {
             return;
         }
         // ... rest of the implementation, updating the recursive call to:
-        this.setDefaultValueForTypeDefData(fieldInfo.type, item[fieldInfo.name]);
+        this.setDefaultValueForTypeDefDataInternal(fieldInfo.type, item[fieldInfo.name], depth + 1);
     }
```

packages/schema/src/cli/actions/repl.ts (1)
Lines 55-57: **Add path validation for loadPath.**

While the dynamic loading logic is correct, consider adding validation for the `loadPath` to ensure it exists and is accessible before attempting to require it.

```diff
+    if (options.loadPath && !require('fs').existsSync(path.resolve(options.loadPath))) {
+        throw new Error(`Invalid loadPath: ${options.loadPath} does not exist`);
+    }
     const { enhance } = options.loadPath
         ? require(path.join(path.resolve(options.loadPath), 'enhance'))
         : require('@zenstackhq/runtime');
```

tests/integration/tests/cli/plugins.test.ts (1)
Line 78: **LGTM! Prisma dependency versions updated.**

The synchronized update of both `@prisma/client` and `prisma` to version `5.22.x` maintains version compatibility. Consider adding a test case that explicitly verifies compatibility with the new Prisma version, particularly for any new features or breaking changes introduced in 5.22.x.
Also applies to: 88-88
tests/integration/tests/enhancements/json/crud.test.ts (4)
Lines 18-72: **Consider expanding test coverage.**

While the test case thoroughly validates creation scenarios, consider adding:
- Tests for update operations with partial JSON updates
- Tests for null/undefined handling in optional fields
- Tests for field removal scenarios
Lines 74-111: **Enhance array field test coverage.**

Consider adding test scenarios for:
- Empty array handling
- Array manipulation operations (push, remove elements)
- Updating specific array elements
- Array length validation
Lines 194-233: **Consider additional default value scenarios.**

The test covers basic default value handling well. Consider adding:
- Tests for nested default value cascading
- Tests for default values in array elements
- Tests for default value interaction with optional fields
Lines 235-269: **Enhance authentication default value testing.**

Consider adding test scenarios for:
- Different auth context values
- Auth context changes between operations
- Error handling for missing/invalid auth context
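The missing/invalid auth context case can be illustrated in isolation. A minimal standalone sketch of the behavior under test, using a hypothetical `applyAuthDefaults` helper rather than the actual enhancement code (the field names and error text are assumptions for illustration):

```typescript
// Hypothetical stand-in for the runtime's auth-default handling: fills
// a field declared with an auth()-based default from the auth context,
// and rejects when no authenticated user is available.
interface AuthContext {
    id?: string;
}

function applyAuthDefaults(
    data: Record<string, unknown>,
    auth: AuthContext | undefined
): Record<string, unknown> {
    if (data.ownerId === undefined) {
        if (!auth?.id) {
            throw new Error('Evaluating default value requires a user context');
        }
        return { ...data, ownerId: auth.id };
    }
    return data;
}

// With a valid auth context, the default is filled in.
console.log(applyAuthDefaults({ title: 'Post1' }, { id: 'user1' }));
// With a missing auth context, the call throws.
try {
    applyAuthDefaults({ title: 'Post1' }, undefined);
} catch (e) {
    console.log((e as Error).message);
}
```

Tests for the real enhancement would follow the same three branches: default filled from auth, explicit value left untouched, and an error when the context is absent.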
packages/plugins/swr/src/runtime/index.ts (1)
Line 431: **Approve the enhanced type checking, with a suggestion.**

The added type check `typeof parsed === 'object'` improves robustness by preventing potential runtime errors when accessing properties of non-object values. However, we should also handle `null`, since `typeof null === 'object'` in JavaScript. Consider this slightly more robust check:
```diff
-    if (typeof parsed === 'object' && parsed?.data && parsed?.meta?.serialization) {
+    if (typeof parsed === 'object' && parsed !== null && parsed?.data && parsed?.meta?.serialization) {
```

packages/schema/src/language-server/validator/datamodel-validator.ts (1)
Lines 107-115: **Document the PostgreSQL provider restriction for custom types.**

The validation logic for custom-typed fields is correct, but the PostgreSQL-only restriction is a significant limitation that should be documented. Consider adding a code comment explaining why custom types are limited to PostgreSQL, or update the project's documentation to clarify this restriction. This will help users understand the limitation upfront.

```diff
+    // Custom types are currently only supported with PostgreSQL due to its native JSON handling capabilities
     if (isTypeDef(field.type.reference?.ref)) {
```
packages/schema/tests/generator/prisma-generator.test.ts (1)
Line range hint 1-570: **Well-structured test suite enhancement.**

The changes demonstrate good testing practices:
- Comprehensive coverage of comment generation scenarios
- Consistent configuration across test cases
- Clear separation of concerns in test cases
- Good balance of positive and negative assertions

Consider adding test cases for:
- Edge cases in comment formatting
- Multi-line comments with special characters
- Comments in different languages/encodings
packages/schema/src/res/stdlib.zmodel (1)
Lines 710-713: **Consider enhancing the @json attribute documentation.**

While the attribute is properly defined, the documentation could be more comprehensive by including:
- Usage examples
- An explanation of the "strong-typed JSON" implications
- Its relationship with TypeDefField

Consider expanding the documentation like this:
````diff
 /**
  * Marks a field to be strong-typed JSON.
+ *
+ * When applied to a type definition field, it ensures type-safe JSON handling
+ * where the field's value must conform to the specified type structure.
+ *
+ * Example:
+ * ```
+ * type Address {
+ *     street: String
+ *     city: String
+ * }
+ *
+ * model User {
+ *     id: Int @id
+ *     address: Address @json
+ * }
+ * ```
  */
````

packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (1)
Lines 751-758: **Fix indentation while keeping the logical changes.**

The logical changes correctly extend validation checks to include type-def fields. However, the indentation appears inconsistent with the surrounding code. Apply this diff to fix the indentation:
```diff
-        writer.write(
-            `hasValidation: ${
-                // explicit validation rules
-                hasValidationAttributes(model) ||
-                // type-def fields require schema validation
-                this.hasTypeDefFields(model)
-            }`
-        );
+        writer.write(`hasValidation: ${
+            // explicit validation rules
+            hasValidationAttributes(model) ||
+            // type-def fields require schema validation
+            this.hasTypeDefFields(model)
+        }`);
```

tests/integration/tests/plugins/zod.test.ts (1)
Lines 1064-1082: **Consider adding more edge cases to the test suite.**

While the current test coverage is good, consider adding these test cases for more thorough validation:
- Maximum city length validation (>20 characters)
- Type validation for the city field (non-string values)
- Type validation for the address field (non-object values)

Example additions:
```diff
 expect(schemas.ProfileSchema.safeParse({ address: { age: 18, city: 'N' } })).toMatchObject({ success: false });
+expect(schemas.ProfileSchema.safeParse({ address: { city: 'A'.repeat(21) } })).toMatchObject({ success: false });
+expect(schemas.ProfileSchema.safeParse({ address: { city: 123 } })).toMatchObject({ success: false });
+expect(schemas.ProfileSchema.safeParse({ address: "invalid" })).toMatchObject({ success: false });
```

tests/integration/tests/enhancements/with-delegate/enhanced-client.test.ts (3)
Lines 1411-1441: **Consider adding more test scenarios for the one-to-one self relation.**

The test verifies basic functionality but could be enhanced with additional scenarios:
- Verify that setting multiple successors fails (due to the @unique constraint)
- Test null cases when no successor/predecessor exists
- Verify cascade delete behavior
Lines 1443-1474: **Enhance test coverage for the one-to-many self relation.**

Consider adding the following test scenarios:
- Verify handling of multiple children
- Test nested hierarchies (grandparent -> parent -> child)
- Ensure circular references are prevented
- Test error cases for invalid references
Lines 1476-1506: **Strengthen test coverage for the many-to-many self relation.**

The test should include additional scenarios:
- Test with multiple followers/following
- Verify disconnect operations
- Test the self-following case (if applicable)
- Verify bulk operations (connectMany, disconnectMany)

Additionally, consider extracting common test data setup into beforeEach to reduce code duplication:
```ts
let db: any;
let person: any;
let organization: any;

beforeEach(async () => {
    const { enhance } = await loadSchema(/* schema */);
    db = enhance();
    person = await db.person.create({ data: {} });
    organization = await db.organization.create({ data: {} });
});
```

packages/sdk/src/validation.ts (1)
Line 17: **Update the function documentation to reflect `TypeDef` support.**

The documentation comment for `hasValidationAttributes` should be updated to indicate that the function now accepts both `DataModel` and `TypeDef` declarations. Suggested change:
```diff
 /**
- * Returns if the given model contains any data validation rules (both at the model
+ * Returns if the given declaration contains any data validation rules (both at the model
  * level and at the field level).
  */
```

packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (1)
Lines 41-42: **Enhance the error message for unsupported field types.**

The error message in `PluginError` may not provide sufficient detail, as `type` is an object and may not be properly converted to a string. Consider using `JSON.stringify` or accessing specific properties to improve the error message. Apply this diff to enhance the error message:
```diff
-    throw new PluginError(name, `Unsupported field type: ${type}`);
+    throw new PluginError(name, `Unsupported field type: ${JSON.stringify(type)}`);
```

packages/runtime/src/enhancements/edge/json-processor.ts (5)
Line range hint 74-86: **Missing `await` when calling `resolveField` in `fixJsonDateFields`.**

In the `fixJsonDateFields` method, `resolveField` is an asynchronous function but is being called without `await`. This could lead to unexpected behavior since the promise may not resolve before the code proceeds. To fix this issue, declare `fixJsonDateFields` as an `async` function and use `await` when calling `resolveField`. Additionally, ensure that all calls to `fixJsonDateFields` are awaited. Apply the following diff to address the issue:
```diff
-    private fixJsonDateFields(entityData: any, typeDef: string) {
+    private async fixJsonDateFields(entityData: any, typeDef: string) {
         if (typeof entityData !== 'object' && !Array.isArray(entityData)) {
             return;
         }
         enumerate(entityData).forEach(async (item) => {
             if (!item || typeof item !== 'object') {
                 return;
             }
             for (const [key, value] of Object.entries(item)) {
-                const fieldInfo = resolveField(this.options.modelMeta, typeDef, key, true);
+                const fieldInfo = await resolveField(this.options.modelMeta, typeDef, key, true);
                 if (!fieldInfo) {
                     continue;
                 }
                 if (fieldInfo.isTypeDef) {
                     // recurse
-                    this.fixJsonDateFields(value, fieldInfo.type);
+                    await this.fixJsonDateFields(value, fieldInfo.type);
                 } else if (fieldInfo.type === 'DateTime' && typeof value === 'string') {
                     // convert to Date
                     const parsed = Date.parse(value);
                     if (!isNaN(parsed)) {
                         item[key] = new Date(parsed);
                     }
                 }
             }
         });
     }
```

Also, update the calls to `fixJsonDateFields` in other methods. In the `doPostProcess` method (line 50):

```diff
 if (fieldInfo.isTypeDef) {
-    this.fixJsonDateFields(entityData[field], fieldInfo.type);
+    await this.fixJsonDateFields(entityData[field], fieldInfo.type);
 } else if (fieldInfo.isDataModel) {
```
Line range hint 33-35: **Optimize asynchronous processing with `Promise.all`.**

In the `processResultEntity` method, you are sequentially awaiting each call to `doPostProcess`. If these processes are independent, consider running them concurrently to improve performance. Apply the following diff to process the entities concurrently:
```diff
 protected override async processResultEntity<T>(_method: PrismaProxyActions, data: T): Promise<T> {
-    for (const value of enumerate(data)) {
-        await this.doPostProcess(value, this.model);
-    }
+    await Promise.all(
+        enumerate(data).map((value) => this.doPostProcess(value, this.model))
+    );
     return data;
 }
```
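The performance difference the suggestion is after can be shown in a standalone sketch; this is not the runtime's actual code, just an illustration of sequential awaiting versus `Promise.all` with hypothetical `processSequential`/`processConcurrent` helpers:

```typescript
// Sequentially awaiting independent async tasks serializes them;
// Promise.all lets their delays overlap.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function processSequential(items: number[]): Promise<number[]> {
    const out: number[] = [];
    for (const i of items) {
        await delay(20); // ~20ms per item, total ~items.length * 20ms
        out.push(i * 2);
    }
    return out;
}

async function processConcurrent(items: number[]): Promise<number[]> {
    // all delays run at once, total ~20ms regardless of item count
    return Promise.all(
        items.map(async (i) => {
            await delay(20);
            return i * 2;
        })
    );
}

async function main() {
    const t0 = Date.now();
    await processSequential([1, 2, 3]);
    const sequentialMs = Date.now() - t0;
    const t1 = Date.now();
    const result = await processConcurrent([1, 2, 3]);
    const concurrentMs = Date.now() - t1;
    console.log(result); // [2, 4, 6]
    console.log(concurrentMs < sequentialMs); // concurrent run finishes faster
}
main();
```

The caveat in the review still applies: this refactor is only safe when the per-entity post-processing steps are truly independent of one another.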
Line range hint 54-57: **Parallelize recursive `doPostProcess` calls for better performance.**

In the `doPostProcess` method, the recursive calls within the loop can be executed concurrently if they are independent, enhancing performance. Modify the code as follows:
```diff
-for (const item of items) {
-    // recurse
-    await this.doPostProcess(item, fieldInfo.type);
-}
+// collect promises and await them together
+const promises = items.map((item) => this.doPostProcess(item, fieldInfo.type));
+await Promise.all(promises);
```
Line range hint 68-89: **Ensure proper use of asynchronous functions within `forEach`.**

In the `fixJsonDateFields` method, you're using `forEach` with asynchronous calls, which can lead to unhandled promises. Consider using a `for...of` loop with `await`, or properly handle the promises within `forEach`. Refactor the code to handle asynchronous operations correctly:
- enumerate(entityData).forEach(async (item) => { + for (const item of enumerate(entityData)) { if (!item || typeof item !== 'object') { continue; } for (const [key, value] of Object.entries(item)) { const fieldInfo = await resolveField(this.options.modelMeta, typeDef, key, true); if (!fieldInfo) { continue; } if (fieldInfo.isTypeDef) { // recurse await this.fixJsonDateFields(value, fieldInfo.type); } else if (fieldInfo.type === 'DateTime' && typeof value === 'string') { // convert to Date const parsed = Date.parse(value); if (!isNaN(parsed)) { item[key] = new Date(parsed); } } } - }); + }
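The hazard described above can be reproduced in isolation. In this sketch (hypothetical `touch` helper, not the actual enhancement code), `forEach` discards the promises returned by its async callback, so the caller resumes before any of them settle; the `for...of` version awaits each one:

```typescript
type Item = { done?: boolean };

// hypothetical async mutation, deferred by a macrotask
async function touch(item: Item): Promise<void> {
    await new Promise((resolve) => setTimeout(resolve, 0));
    item.done = true;
}

async function withForEach(items: Item[]): Promise<void> {
    // BUG: forEach ignores the promises returned by the async callback
    items.forEach(async (item) => {
        await touch(item);
    });
    // returns here before any callback has finished
}

async function withForOf(items: Item[]): Promise<void> {
    for (const item of items) {
        await touch(item); // properly awaited
    }
}

(async () => {
    const a: Item[] = [{}, {}];
    await withForEach(a);
    console.log('after forEach:', a.map((x) => x.done)); // flags still unset

    const b: Item[] = [{}, {}];
    await withForOf(b);
    console.log('after for...of:', b.map((x) => x.done)); // [true, true]
})();
```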
Line range hint `83-86`: Handle invalid date formats gracefully

When parsing date strings, ensure that invalid formats are handled appropriately to prevent potential runtime errors. Consider adding logging or error handling for invalid date parsing:

```diff
 const parsed = Date.parse(value);
 if (!isNaN(parsed)) {
     item[key] = new Date(parsed);
+} else {
+    // Handle invalid date format
+    console.warn(`Invalid date format for key "${key}": ${value}`);
 }
```

packages/runtime/src/enhancements/node/json-processor.ts (2)
`41-61`: Enhance type safety by specifying explicit types instead of `any`

Using `any` bypasses TypeScript's type checking, which can lead to runtime errors. Consider defining explicit types for `entityData` in `doPostProcess` and `fixJsonDateFields` to improve code reliability and maintainability.

Also applies to: 64-91
`1-1`: Reconsider disabling the ESLint rule `no-explicit-any`

Disabling the `@typescript-eslint/no-explicit-any` rule project-wide can mask potential type-related issues. Instead, aim to replace `any` with more specific types where possible to leverage TypeScript's type checking capabilities.

packages/runtime/src/cross/model-meta.ts (2)
`199-206`: Enhanced `resolveField` function for type definitions

Updating `resolveField` to accept the `isTypeDef` parameter enhances flexibility by allowing field resolution from both models and type definitions.

Consider adding a JSDoc comment for the `isTypeDef` parameter to clarify its purpose:

```diff
 export function resolveField(
     modelMeta: ModelMeta,
     modelOrTypeDef: string,
     field: string,
+    /**
+     * Indicates whether to resolve the field from a type definition instead of a model
+     */
     isTypeDef = false
 ): FieldInfo | undefined {
```
`212-213`: Update `requireField` to support type definitions

Passing the `isTypeDef` parameter to `resolveField` ensures that `requireField` can resolve fields from type definitions.

Consider enhancing the error message to specify whether the field was expected in a model or a type definition, for clearer debugging:

```diff
 export function requireField(modelMeta: ModelMeta, model: string, field: string, isTypeDef = false) {
     const f = resolveField(modelMeta, model, field, isTypeDef);
     if (!f) {
-        throw new Error(`Field ${model}.${field} cannot be resolved`);
+        const containerType = isTypeDef ? 'type definition' : 'model';
+        throw new Error(`Field ${model} (${containerType}).${field} cannot be resolved`);
     }
     return f;
 }
```

packages/schema/src/plugins/zod/utils/schema-gen.ts (1)
`184-184`: Consider adding a comment to explain the use of `z.lazy` for `TypeDef`

To enhance code readability, consider adding a comment explaining why `z.lazy` is used when handling `TypeDef` fields. This will help future maintainers understand that lazy evaluation addresses potential circular reference issues in schema definitions.

packages/sdk/src/model-meta-generator.ts (3)
`134-155`: Review and test the new `writeTypeDefs` function.

The `writeTypeDefs` function has been introduced to handle the writing of type definitions. The implementation appears sound, but consider adding unit tests to validate its functionality and ensure it handles all edge cases appropriately.

Would you like assistance in creating unit tests for `writeTypeDefs` to enhance test coverage?
Line range hint `226-284`: Consider optimizing field attribute filtering logic.

Currently, when `options.generateAttributes` is false, the code still retrieves all attributes before filtering them. Optimizing this by only fetching the required attributes could improve performance.

Apply this diff to optimize attribute retrieval:

```diff
-if (options.generateAttributes) {
-    const attrs = getAttributes(f);
-    if (attrs.length > 0) {
-        writer.write(` attributes: ${JSON.stringify(attrs)},`);
-    }
-} else {
-    // only include essential attributes
-    const attrs = getAttributes(f).filter((attr) => ['@default', '@updatedAt'].includes(attr.name));
-    if (attrs.length > 0) {
-        writer.write(` attributes: ${JSON.stringify(attrs)},`);
-    }
-}
+const attrs = options.generateAttributes
+    ? getAttributes(f)
+    : getAttributes(f).filter((attr) => ['@default', '@updatedAt'].includes(attr.name));
+if (attrs.length > 0) {
+    writer.write(` attributes: ${JSON.stringify(attrs)},`);
+}
```
`80-80`: Handle potentially large metadata objects gracefully.

The initializer for the `metadata` variable can become quite large as models and type definitions grow. Consider modularizing the metadata generation or using streams to handle large datasets efficiently.

packages/schema/src/plugins/zod/generator.ts (2)
`277-282`: Consider refactoring to eliminate code duplication in schema generation

The loop starting at line 277 for generating schemas for `TypeDef` entities is similar to the existing logic for `DataModel` schemas. To adhere to the DRY (Don't Repeat Yourself) principle, consider refactoring the code to unify the schema generation process for both `DataModel` and `TypeDef` entities. This will improve maintainability and reduce potential inconsistencies.
`292-325`: Extract common logic between `generateTypeDefSchema` and `generateModelSchema`

The `generateTypeDefSchema` method introduced between lines 292-325 has functionality that overlaps with the existing `generateModelSchema` method. Refactoring shared code into a common helper function or base class can reduce duplication and facilitate easier maintenance. This approach aligns with the DRY principle and enhances code readability.

packages/schema/src/plugins/enhancer/enhance/index.ts (5)
`109-110`: Consider simplifying conditional imports and function calls

The repeated use of `prismaTypesFixed` to choose between `createLogicalPrismaImports` and `createSimplePrismaImports`, as well as between `createLogicalPrismaEnhanceFunction` and `createSimplePrismaEnhanceFunction`, adds complexity. Refactoring to minimize conditional logic can improve code readability and maintainability.

Also applies to: 119-119
`182-186`: Check for redundant imports in `createLogicalPrismaImports`

The function imports from `${prismaClientImport}` multiple times:

```ts
import type * as _P from '${prismaClientImport}';
import type { Prisma, PrismaClient } from '${prismaClientImport}';
```

Review whether both imports are necessary or whether they can be consolidated to avoid redundancy.
`230-232`: Assess the potential performance impact of the `needsPrismaClientTypeFixes` getter

The `needsPrismaClientTypeFixes` getter calls `this.hasTypeDef(this.model)`, which may be a computationally intensive operation if the model is large. Consider caching the result or optimizing `hasTypeDef` to prevent potential performance degradation.
`247-250`: Add documentation for the `hasTypeDef` method

Providing a brief comment for the `hasTypeDef` method would clarify its purpose and enhance code readability for future maintainers.
`753-760`: Include comments explaining the `generateExtraTypes` function

The `generateExtraTypes` function plays a key role in generating type definitions. Adding descriptive comments will improve the understandability and maintainability of the code.

tests/integration/tests/enhancements/with-policy/checker.test.ts (3)
`653-692`: Consider Refactoring to Reduce Code Duplication

The test cases for the different operations (`read`, `create`, `update`, `delete`) have similar structures and repetitive assertions. To improve maintainability and readability, consider refactoring these assertions into a helper function that can be reused across tests.

Example:

```ts
function expectOperation(
    db: any,
    operation: string,
    authData?: any,
    expectedResult: boolean = true
) {
    const assertion = expectedResult ? 'toResolveTruthy' : 'toResolveFalsy';
    return expect(enhance(authData).model.check({ operation }))[assertion]();
}

// Usage in your test
await expectOperation(db, 'read', { id: 1 }, true);
await expectOperation(db, 'delete', { id: 1 }, false);
```
`717-762`: Simplify Complex Conditions for Better Clarity

In the `'supports policy delegation combined'` test, the model permissions use complex conditions combining `check(foo)` with additional criteria (e.g., `value > 0`). Consider simplifying these conditions or adding comments to explain the rationale, improving readability and maintainability.

Example:

```ts
// @@allow('all', check(foo) && value > 0)
// The model is accessible if 'foo' passes its checks and 'value' is greater than 0
```
`653-762`: Improve Test Descriptions for Clarity

Test descriptions like `'supports policy delegation simple'`, `'explicit'`, and `'combined'` may not fully convey the intent of each test. Consider elaborating the descriptions to provide more context about what specific aspect of policy delegation each test is verifying.

Example:

```ts
it('supports simple policy delegation through related models', async () => {
    /* ... */
});
```
`684-691`: Refine `any` types to `unknown` for better type safety

The type guard functions `isVariableConstraint` and `isDelegateConstraint` currently accept `value: any`. Replacing `any` with `unknown` enhances type safety and prevents unintended type coercion.

Apply this diff to update the function signatures:

```diff
-const isVariableConstraint = (value: any): value is VariableConstraint => {
+const isVariableConstraint = (value: unknown): value is VariableConstraint => {
     return value && typeof value === 'object' && value.kind === 'variable';
 };

-const isDelegateConstraint = (value: any): value is DelegateConstraint => {
+const isDelegateConstraint = (value: unknown): value is DelegateConstraint => {
     return value && typeof value === 'object' && value.kind === 'delegate';
 };
```
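One wrinkle the diff above glosses over: with `value: unknown`, TypeScript will not allow accessing `value.kind` directly, so the guard body needs step-by-step narrowing. A compiling sketch (the constraint shape is assumed from the review, not ZenStack's actual definition):

```typescript
type VariableConstraint = { kind: 'variable'; name?: string };

function isVariableConstraint(value: unknown): value is VariableConstraint {
    // narrow `unknown` step by step before touching `.kind`
    return (
        typeof value === 'object' &&
        value !== null &&
        'kind' in value &&
        (value as { kind: unknown }).kind === 'variable'
    );
}

console.log(isVariableConstraint({ kind: 'variable' })); // true
console.log(isVariableConstraint({ kind: 'delegate' })); // false
console.log(isVariableConstraint(null)); // false
```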
`679-719`: Consider refactoring the `getCheckerConstraint` method for maintainability

The `getCheckerConstraint` method spans multiple responsibilities and includes nested functions and traversals, which may affect readability and maintainability. Breaking it down into smaller, reusable functions can improve clarity.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (20)
- `package.json` is excluded by `!**/*.json`
- `packages/ide/jetbrains/package.json` is excluded by `!**/*.json`
- `packages/language/package.json` is excluded by `!**/*.json`
- `packages/language/src/generated/ast.ts` is excluded by `!**/generated/**`
- `packages/language/src/generated/grammar.ts` is excluded by `!**/generated/**`
- `packages/language/syntaxes/zmodel.tmLanguage.json` is excluded by `!**/*.json`
- `packages/misc/redwood/package.json` is excluded by `!**/*.json`
- `packages/plugins/openapi/package.json` is excluded by `!**/*.json`
- `packages/plugins/swr/package.json` is excluded by `!**/*.json`
- `packages/plugins/tanstack-query/package.json` is excluded by `!**/*.json`
- `packages/plugins/trpc/package.json` is excluded by `!**/*.json`
- `packages/runtime/package.json` is excluded by `!**/*.json`
- `packages/schema/package.json` is excluded by `!**/*.json`
- `packages/sdk/package.json` is excluded by `!**/*.json`
- `packages/server/package.json` is excluded by `!**/*.json`
- `packages/testtools/package.json` is excluded by `!**/*.json`
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`, `!**/*.yaml`
- `tests/integration/test-run/package.json` is excluded by `!**/*.json`
- `tests/integration/tests/frameworks/nextjs/test-project/package.json` is excluded by `!**/*.json`
- `tests/integration/tests/frameworks/trpc/test-project/package.json` is excluded by `!**/*.json`
📒 Files selected for processing (55)
- `README.md` (5 hunks)
- `packages/ide/jetbrains/CHANGELOG.md` (1 hunks)
- `packages/ide/jetbrains/build.gradle.kts` (1 hunks)
- `packages/language/src/zmodel.langium` (4 hunks)
- `packages/language/syntaxes/zmodel.tmLanguage` (1 hunks)
- `packages/plugins/swr/src/generator.ts` (2 hunks)
- `packages/plugins/swr/src/runtime/index.ts` (1 hunks)
- `packages/plugins/tanstack-query/src/generator.ts` (3 hunks)
- `packages/plugins/tanstack-query/src/runtime/common.ts` (1 hunks)
- `packages/runtime/src/cross/model-meta.ts` (4 hunks)
- `packages/runtime/src/cross/utils.ts` (4 hunks)
- `packages/runtime/src/enhancements/edge/json-processor.ts` (1 hunks)
- `packages/runtime/src/enhancements/node/create-enhancement.ts` (3 hunks)
- `packages/runtime/src/enhancements/node/default-auth.ts` (6 hunks)
- `packages/runtime/src/enhancements/node/delegate.ts` (3 hunks)
- `packages/runtime/src/enhancements/node/json-processor.ts` (1 hunks)
- `packages/runtime/src/enhancements/node/policy/policy-utils.ts` (3 hunks)
- `packages/runtime/src/enhancements/node/types.ts` (2 hunks)
- `packages/schema/src/cli/actions/repl.ts` (4 hunks)
- `packages/schema/src/cli/index.ts` (1 hunks)
- `packages/schema/src/cli/plugin-runner.ts` (2 hunks)
- `packages/schema/src/language-server/validator/attribute-application-validator.ts` (3 hunks)
- `packages/schema/src/language-server/validator/datamodel-validator.ts` (2 hunks)
- `packages/schema/src/language-server/validator/typedef-validator.ts` (1 hunks)
- `packages/schema/src/language-server/validator/zmodel-validator.ts` (4 hunks)
- `packages/schema/src/language-server/zmodel-documentation-provider.ts` (1 hunks)
- `packages/schema/src/language-server/zmodel-module.ts` (2 hunks)
- `packages/schema/src/language-server/zmodel-workspace-manager.ts` (1 hunks)
- `packages/schema/src/plugins/enhancer/enhance/index.ts` (11 hunks)
- `packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts` (1 hunks)
- `packages/schema/src/plugins/enhancer/model-meta/index.ts` (1 hunks)
- `packages/schema/src/plugins/enhancer/policy/constraint-transformer.ts` (5 hunks)
- `packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts` (2 hunks)
- `packages/schema/src/plugins/prisma/schema-generator.ts` (16 hunks)
- `packages/schema/src/plugins/zod/generator.ts` (4 hunks)
- `packages/schema/src/plugins/zod/transformer.ts` (3 hunks)
- `packages/schema/src/plugins/zod/utils/schema-gen.ts` (4 hunks)
- `packages/schema/src/res/starter.zmodel` (3 hunks)
- `packages/schema/src/res/stdlib.zmodel` (5 hunks)
- `packages/schema/src/utils/ast-utils.ts` (4 hunks)
- `packages/schema/tests/generator/prisma-generator.test.ts` (8 hunks)
- `packages/schema/tests/schema/validation/attribute-validation.test.ts` (1 hunks)
- `packages/sdk/src/model-meta-generator.ts` (9 hunks)
- `packages/sdk/src/utils.ts` (4 hunks)
- `packages/sdk/src/validation.ts` (2 hunks)
- `script/test-scaffold.ts` (1 hunks)
- `tests/integration/tests/cli/plugins.test.ts` (2 hunks)
- `tests/integration/tests/enhancements/json/crud.test.ts` (1 hunks)
- `tests/integration/tests/enhancements/json/typing.test.ts` (1 hunks)
- `tests/integration/tests/enhancements/json/validation.test.ts` (1 hunks)
- `tests/integration/tests/enhancements/with-delegate/enhanced-client.test.ts` (1 hunks)
- `tests/integration/tests/enhancements/with-policy/checker.test.ts` (2 hunks)
- `tests/integration/tests/plugins/zod.test.ts` (1 hunks)
- `tests/regression/tests/issue-1755.test.ts` (1 hunks)
- `tests/regression/tests/issue-1786.test.ts` (1 hunks)
✅ Files skipped from review due to trivial changes (3)
- packages/ide/jetbrains/CHANGELOG.md
- packages/ide/jetbrains/build.gradle.kts
- packages/schema/src/res/starter.zmodel
🧰 Additional context used
🪛 Biome
packages/runtime/src/enhancements/node/json-processor.ts
[error] 85-85: isNaN is unsafe. It attempts a type coercion. Use Number.isNaN instead.
See the MDN documentation for more details.
Unsafe fix: Use Number.isNaN instead.
(lint/suspicious/noGlobalIsNan)
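The coercion difference behind this lint can be seen in a short sketch; note that `Date.parse` on an invalid string yields `NaN`, which both functions agree on, while non-numeric inputs only look like `NaN` to the global `isNaN`:

```typescript
// the global isNaN coerces its argument to a number first
console.log(isNaN('not a number' as unknown as number)); // true (string coerces to NaN)

// Number.isNaN only returns true for the actual NaN value, no coercion
console.log(Number.isNaN('not a number')); // false

// both agree on a genuinely failed date parse
const parsed = Date.parse('garbage');
console.log(Number.isNaN(parsed)); // true
```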
packages/schema/src/cli/actions/repl.ts
[error] 75-75: eval() exposes to security risks and performance issues.
See the MDN web docs for more details.
Refactor the code so that it doesn't need to call eval().
(lint/security/noGlobalEval)
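One common way to address this class of lint in a Node-based REPL is the built-in `vm` module. The sketch below is an illustration under assumptions (it is not ZenStack's actual REPL code), and `vm` is not a real security boundary, but it avoids the global `eval` and limits what the evaluated expression can see to an explicit set of bindings:

```typescript
import * as vm from 'node:vm';

// evaluate an expression against an explicit set of allowed bindings
function evalExpression(expression: string, bindings: Record<string, unknown>): unknown {
    // each call gets its own contextified sandbox object
    const context = vm.createContext({ ...bindings });
    return vm.runInContext(expression, context, { timeout: 100 });
}

console.log(evalExpression('a + b', { a: 2, b: 3 })); // 5
```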
🔇 Additional comments (112)
packages/schema/src/language-server/zmodel-documentation-provider.ts (2)
1-1
: LGTM!
Clean and focused imports that are all utilized in the implementation.
6-16
: Verify documentation handling for TypeDef nodes.
Let's ensure the documentation provider works correctly with the new TypeDef feature.
✅ Verification successful
Let me try a different approach to verify the documentation handling implementation.
Let me try one final verification to check the base class and its relationship with our documentation provider.
Based on the verification results, I can now provide a final response:
Documentation provider implementation is correct and well-integrated
The implementation correctly handles both triple-slash and JSDoc comments:
- Prioritizes triple-slash comments by checking the node's `comments` array first
- Falls back to JSDoc comments through the parent class when needed
- Follows the existing documentation patterns seen across the codebase (verified through triple-slash comment usage)
- Properly extends the base `JSDocDocumentationProvider` class
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Verify TypeDef documentation handling
# Test 1: Check for TypeDef documentation patterns in the codebase
echo "Checking TypeDef documentation patterns..."
rg -l "typedef.*///"
# Test 2: Check for test cases covering TypeDef documentation
echo "Checking test coverage for TypeDef documentation..."
rg -l "typedef.*test" --type=ts
# Test 3: Verify consistent comment style across the codebase
echo "Checking comment style consistency..."
ast-grep --pattern 'typedef $_ {
  $$$
}'
```
Length of output: 393
Script:
```shell
#!/bin/bash
# Let's analyze the documentation provider implementation and its usage
# Check for documentation provider usage patterns
echo "Checking documentation provider usage..."
rg -A 5 "getDocumentation.*AstNode"
# Look for test files related to documentation provider
echo "Checking test files..."
fd "documentation.*test" --type f
# Check for triple-slash comment patterns in the codebase
echo "Analyzing comment patterns..."
rg -A 2 "^///.*" --type=ts
# Look for related language server components
echo "Checking language server components..."
fd "language-server" --type d
```
Length of output: 7585
Script:
```shell
#!/bin/bash
# Check the JSDocDocumentationProvider implementation and related files
# Look for JSDocDocumentationProvider implementation
echo "Checking JSDocDocumentationProvider..."
rg -A 10 "class.*JSDocDocumentationProvider"
# Check for any other documentation provider implementations
echo "Checking other documentation providers..."
rg -A 5 "class.*DocumentationProvider"
# Look for documentation provider registration/usage in language server
echo "Checking documentation provider usage in language server..."
rg -A 5 "DocumentationProvider.*register|createServices"
```
Length of output: 2477
packages/schema/src/plugins/enhancer/model-meta/index.ts (2)
2-2
: LGTM! Import changes are well-organized.
The consolidation of AST-related imports and the addition of `isTypeDef` are clean and logical.
9-9
: LGTM! Type definition filtering looks good.
The filtering logic is clean and aligns with the framework's type definition support initiative.
Let's verify consistent type definition handling across related files:
✅ Verification successful
Type definition filtering is consistently implemented across the codebase
The verification shows that the type definition filtering pattern `declarations.filter(isTypeDef)` is uniformly implemented across all relevant files:

- `packages/schema/src/plugins/zod/generator.ts`
- `packages/plugins/tanstack-query/src/generator.ts`
- `packages/plugins/swr/src/generator.ts`
- `packages/schema/src/plugins/enhancer/model-meta/index.ts`

Additionally, all `generateModelMeta` calls correctly pass the filtered `typeDefs` parameter with consistent configuration patterns.
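The filter-with-type-predicate pattern these generators share can be sketched in isolation (the declaration shapes below are assumptions for illustration, not ZenStack's real AST types):

```typescript
// Minimal stand-ins for AST declaration nodes.
type Declaration = { $type: 'DataModel' | 'TypeDef' | 'Enum'; name: string };
type TypeDef = Declaration & { $type: 'TypeDef' };

// type predicate: lets Array.filter narrow the element type
const isTypeDef = (d: Declaration): d is TypeDef => d.$type === 'TypeDef';

const declarations: Declaration[] = [
    { $type: 'DataModel', name: 'User' },
    { $type: 'TypeDef', name: 'Address' },
    { $type: 'Enum', name: 'Role' },
];

// the result is typed TypeDef[], not Declaration[]
const typeDefs = declarations.filter(isTypeDef);
console.log(typeDefs.map((t) => t.name)); // logs the single TypeDef name
```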
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Check for consistent type definition handling patterns
# across related generator files
# Look for similar type definition filtering patterns
rg "declarations\.filter.*isTypeDef" --type ts
# Check for generateModelMeta calls with typeDefs parameter
rg "generateModelMeta.*typeDefs" --type ts -A 5
```
Length of output: 2496
packages/schema/src/language-server/validator/typedef-validator.ts (1)
1-5
: LGTM: Imports are well-organized and complete.
The imports cover all necessary dependencies and follow good practices with external imports followed by internal ones.
tests/integration/tests/enhancements/json/validation.test.ts (1)
1-3
: LGTM! Clear and focused test suite setup.
The import and describe block are well-structured and appropriately named for JSON field typing tests.
tests/regression/tests/issue-1786.test.ts (1)
7-18
: Verify security implications of the User model.
The model has some security considerations to review:
- The email length constraint (6-32) might be too restrictive for some valid email addresses
- The `@@allow('create', true)` rule permits unrestricted user creation
✅ Verification successful
Based on the search results, I can now provide a comprehensive analysis of the security implications of the User model:
The email length and signup restrictions are consistent with security best practices
- The email length constraint of 6-32 characters is commonly used in test cases across the codebase and provides reasonable validation
- The `@@allow('create', true)` rule for signup is a standard pattern used in many test models to enable user registration
- The password field correctly uses the `@password` and `@omit` decorators to ensure secure handling
- The `@@allow('all', auth() == this)` rule properly restricts full access to only the authenticated user
The security model follows established patterns seen throughout the test codebase, with appropriate:
- Password field protection
- Self-access restrictions
- Public signup capability
- Email validation
The implementation aligns with common authentication and authorization patterns demonstrated across multiple test cases in the repository.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Search for similar User models in the codebase to verify if this is consistent with existing patterns
ast-grep --pattern $'model User {
  $$$
  @@allow($_, true)
  $$$
}'
```
Length of output: 69
Script:
```shell
#!/bin/bash
# Let's try a broader search for User models and their access patterns
rg -U "model User \{[\s\S]*?\}" --type ts
# Also search for any @@allow rules to understand access patterns
rg "@@allow\(" --type ts -A 2 -B 2
```
Length of output: 504497
packages/runtime/src/cross/utils.ts (4)
2-2
: LGTM!
The import statement correctly includes the new TypeDefInfo type needed for the added functionality.
Line range hint 66-80
: LGTM!
The implementation is type-safe and provides clear error handling with proper TypeScript generic constraints.
Line range hint 97-103
: LGTM!
Good refactoring to use the new `getModelInfo` utility while maintaining the same functionality.
Line range hint 49-65
: Verify the unique constraint assumption.
The function assumes that the first unique constraint contains the ID fields. While this is common, it might not always be true in complex schemas. Consider:
- Adding a comment explaining this assumption
- Adding logic to identify the primary key constraint specifically
Let's check if there are models with multiple unique constraints where the first one isn't the primary key:
packages/language/syntaxes/zmodel.tmLanguage (1)
23-23
: LGTM! The 'type' keyword addition looks correct.
The keyword is properly integrated into the control pattern, maintaining alphabetical order and consistent word boundary matching.
Let's verify that all type-related keywords are properly covered:
✅ Verification successful
The 'type' keyword addition is correctly integrated and consistent with the grammar implementation
The verification confirms that:
- The 'type' keyword is properly added to both tmLanguage and tmLanguage.json files
- The keyword aligns with the TypeDef grammar rule implementation in zmodel.langium
- The syntax highlighting pattern is consistent with the language grammar where 'type' is used to declare type definitions
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Verify type-related keywords in grammar files
# Expected: The new 'type' keyword should be consistently defined across grammar files
# Check if there are any type-related grammar rules that might need corresponding keywords
ast-grep --pattern 'type' packages/language/src/zmodel.langium
# Check for any type-related keywords in other grammar or configuration files
rg -l 'type' packages/language/
```
Length of output: 379
Script:
```shell
#!/bin/bash
# Let's examine the grammar definition and related files more closely
# Check grammar rules related to type definitions
rg -A 5 "type\s*:" packages/language/src/zmodel.langium
# Check the JSON grammar file to ensure consistency
cat packages/language/syntaxes/zmodel.tmLanguage.json | jq '.patterns[] | select(.name | contains("keyword"))'
# Look for type-related AST definitions
rg -A 3 "interface.*Type" packages/language/src/ast.ts
```
Length of output: 327
Script:
```shell
#!/bin/bash
# Let's try a different approach to verify the grammar consistency
# Check the grammar file content directly
cat packages/language/src/zmodel.langium | grep -A 10 "TypeDef"
# Look at both grammar files to compare keywords
cat packages/language/syntaxes/zmodel.tmLanguage
cat packages/language/syntaxes/zmodel.tmLanguage.json
# Search for type-related rules and definitions
rg -l "TypeDef|typedef" packages/language/
```
Length of output: 6884
packages/schema/src/language-server/validator/zmodel-validator.ts (2)
10-10
: LGTM: TypeDef import follows existing pattern
The TypeDef import is correctly placed with other AST types and is necessary for the new TypeDef validation support.
36-36
: LGTM: TypeDef validation check properly registered
The TypeDef validation check is correctly registered following the existing pattern and is appropriately grouped with other type validations.
packages/runtime/src/enhancements/node/create-enhancement.ts (3)
13-13
: LGTM: Import follows established patterns
The new import for JSON processing capabilities follows the existing enhancement import patterns.
105-105
: LGTM: Clear and consistent feature detection
The type definition field detection follows the established pattern of other feature detection flags in the codebase.
154-156
: Verify enhancement order dependencies
While the JSON processor integration looks correct, please verify that applying it at the end of the enhancement chain doesn't affect other enhancements' functionality, especially for:
- Policy enforcement on JSON fields
- Validation rules on type definition fields
- Field omission behavior
✅ Verification successful
Let me gather more information about the JSON processor implementation and its interactions.
Based on the implementation details revealed in the files, I can now generate the final response:
Enhancement order is correctly placed and safe
The JSON processor enhancement's placement at the end of the chain is appropriate and safe because:
- It only post-processes JSON fields by converting date strings to Date objects in type definition fields
- It operates independently without modifying data structure or validation rules
- Other enhancements (policy, validation, omit) are applied first, ensuring their rules are enforced before JSON processing
- The implementation uses a recursive approach that preserves the data structure while only transforming date fields
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Search for any order-dependent operations in the JSON processor
rg -A 5 "withJsonProcessor" packages/runtime/src/enhancements/node/
# Look for potential interactions with other enhancements
ast-grep --pattern 'function withJsonProcessor($_) {
  $$$
  return $_.$$$
}'
```
Length of output: 1712
Script:
```shell
#!/bin/bash
# Check the JSON processor implementation
cat packages/runtime/src/enhancements/node/json-processor.ts
# Look for the enhancement chain order in create-enhancement.ts
cat packages/runtime/src/enhancements/node/create-enhancement.ts
# Check for any JSON field handling in other enhancements
rg -l "JSON\." packages/runtime/src/enhancements/node/
```
Length of output: 9045
packages/schema/src/language-server/zmodel-workspace-manager.ts (1)
12-12
: Verify import statements are updated for the named export change.
The change from default export to named export is a good practice, but it's a breaking change that requires updates to import statements across the codebase.
✅ Verification successful
The search results show that there is only one import of ZModelWorkspaceManager
in the codebase, and it's already using the correct named import syntax. Let me verify if there are any default imports that might have been missed by the previous search pattern.
All import statements are correctly using named imports
The verification shows that:
- The only import of `ZModelWorkspaceManager` is in `packages/schema/src/language-server/zmodel-module.ts`, and it's already using the correct named import syntax
- No default imports or dynamic requires were found in the codebase
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Find all imports of ZModelWorkspaceManager to ensure they're updated
# Expected: All imports should use named import syntax
# Search for both old and new import patterns
echo "Checking import statements..."
rg "import.*ZModelWorkspaceManager.*from.*zmodel-workspace-manager"
```
Length of output: 260
Script:
```shell
#!/bin/bash
# Search for any default imports from this file
rg "import\s+ZModelWorkspaceManager\s+from"
# Also check for any dynamic imports that might use the file
rg "require.*zmodel-workspace-manager"
```
Length of output: 86
tests/integration/tests/enhancements/json/typing.test.ts (1)
1-3
: LGTM! Test suite structure follows best practices.
The test suite is well-organized and follows Jest conventions with proper imports.
packages/schema/src/language-server/zmodel-module.ts (2)
30-30
: LGTM! Import changes look good.
The import changes are well-structured:
- New import for documentation provider aligns with the new documentation feature
- Updated import style for WorkspaceManager maintains consistency
Also applies to: 37-37
81-83
: LGTM! Documentation service registration is properly implemented.
The documentation service is correctly integrated into the ZModelModule following Langium's service registration pattern.
Let's verify the DocumentationProvider implementation:
✅ Verification successful
Documentation service implementation is correctly verified
The implementation follows Langium's patterns correctly:
- Extends `JSDocDocumentationProvider` from Langium
- Properly overrides the `getDocumentation` method with enhanced functionality
- Maintains fallback to JSDoc comments via `super.getDocumentation`
- Is correctly integrated in the module and used by other services (e.g. `CompletionProvider`)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```shell
#!/bin/bash
# Description: Verify the DocumentationProvider implementation follows Langium patterns
# Test 1: Check if the DocumentationProvider follows Langium's provider pattern
ast-grep --pattern 'class ZModelDocumentationProvider {
  $$$
  constructor($services) {
    $$$
  }
  $$$
}'
# Test 2: Check for any existing documentation-related tests
rg -l "DocumentationProvider" --glob "**/*.test.ts"
```
Length of output: 163
Script:
```shell
#!/bin/bash
# Let's try different approaches to verify the DocumentationProvider implementation
# Check for the DocumentationProvider class definition
rg -A 10 "class ZModelDocumentationProvider"
# Look for files containing DocumentationProvider
fd -t f . | rg "DocumentationProvider"
# Search for documentation-related imports and usage
rg "DocumentationProvider|@langium.*documentation" --type ts
```
Length of output: 2095
packages/runtime/src/enhancements/node/types.ts (2)
21-24: LGTM! Well-defined CRUD type.
The new `CRUD` type effectively standardizes the representation of database operations using a union of string literals. The naming and documentation are clear and follow TypeScript best practices.
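As a hedged illustration of what such a union buys (the actual definition lives in `types.ts`; the member names and helper below are assumptions, not the real source), a string-literal union lets the compiler enforce exhaustive handling:

```typescript
// Sketch only: assumed shape of a CRUD union type, not the actual source.
type CRUD = 'create' | 'read' | 'update' | 'delete';

// Exhaustiveness: if a new member is added to CRUD and a case is missed,
// the assignment to `never` below becomes a compile error.
function describe(op: CRUD): string {
    switch (op) {
        case 'create': return 'inserts a row';
        case 'read': return 'queries rows';
        case 'update': return 'modifies rows';
        case 'delete': return 'removes rows';
        default: {
            const exhaustive: never = op;
            return exhaustive;
        }
    }
}

console.log(describe('update')); // modifies rows
```

Misspelled operations like `describe('upadte')` fail to compile, which is the main advantage over plain strings.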
100-101: Verify the impact of the new constraint type.
The addition of `DelegateConstraint` to the union type is well-placed. However, since `PermissionCheckerConstraint` is a fundamental type, we should verify that all code handling these constraints properly supports the new delegate type.
Let's verify the constraint handling implementation:
✅ Verification successful
The DelegateConstraint type is properly handled throughout the codebase
The verification shows that the new `DelegateConstraint` type is well integrated:
- It's properly defined in `types.ts` with required properties
- The constraint is handled in `policy-utils.ts` with dedicated type checking and processing logic
- The constraint solver (`constraint-solver.ts`) correctly processes all constraint types, including logical constraints that may contain delegate constraints
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for switch/if statements handling constraint types
rg -t typescript "kind['\"]?\s*[:=]\s*['\"]" -A 5 packages/runtime/src/
Length of output: 112
Script:
#!/bin/bash
# Search for switch/if statements handling constraint types with a more generic pattern
rg -A 5 "switch.*kind|if.*kind" packages/runtime/src/
# Also search for any references to constraint types
rg -A 2 "LogicalConstraint|DelegateConstraint" packages/runtime/src/
# Look for files that might handle constraints
fd "constraint" packages/runtime/src/
Length of output: 11275
packages/schema/src/cli/index.ts (1)
136-137: LGTM! Please verify backward compatibility.
The changes improve clarity by explicitly indicating that paths (rather than module names) are expected. The new `--load-path` option enhances flexibility for module loading.
Let's verify the implementation and potential impact:
Consider updating the documentation to:
- Clarify the path-based approach for `--prisma-client`
- Provide examples of using the new `--load-path` option
- If needed, add migration notes for users updating from older versions
✅ Verification successful
Based on the implementation in `packages/schema/src/cli/actions/repl.ts`, I can now provide a definitive response about the changes:
The changes are safe and improve clarity
The implementation shows that both options are handled appropriately:
- The `prismaClient` option supports both module names and paths, with fallback logic:
```ts
const prismaClientModule = options.prismaClient ?? '@prisma/client';
try {
    // try direct require
    const module = require(prismaClientModule);
} catch (err) {
    if (!path.isAbsolute(prismaClientModule)) {
        // try relative require
        const module = require(path.join(projectPath, prismaClientModule));
    }
}
```
- `loadPath` is properly handled for loading ZenStack modules:
```ts
const { enhance } = options.loadPath
    ? require(path.join(path.resolve(options.loadPath), 'enhance'))
    : require('@zenstackhq/runtime');
```
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for existing usages of the prisma-client option that might be affected by the change from module to path
# Test 1: Search for repl command usages with the old --prisma-client format
rg -l "repl.*--prisma-client" --type md --type ts --type js
# Test 2: Check if the changes are documented
rg -l "load-path.*path to load modules" --type md
Length of output: 113
Script:
#!/bin/bash
# Let's try a broader search to understand the implementation and usage
# Check the repl command implementation
ast-grep --pattern 'function repl($$$) { $$$ }'
# Look for any references to these CLI options
rg -A 3 -B 3 "(prisma-client|load-path)"
# Check for any documentation or examples
fd "README" --type f
fd "docs" --type d
Length of output: 61941
packages/runtime/src/enhancements/node/default-auth.ts (2)
11-11: LGTM: Import addition aligns with new functionality.
The addition of the `getTypeDefInfo` import is necessary for the new type definition handling capabilities.
74-77: LGTM: Enhanced payload preprocessing with type definition support.
The changes appropriately handle type definitions while maintaining the existing flow. The method name changes better reflect their generic nature rather than being auth-specific.
Also applies to: 89-92
packages/schema/src/cli/actions/repl.ts (1)
12-14: LGTM: Function signature update is well-documented
The addition of the optional `loadPath` parameter is properly typed and maintains backward compatibility.
tests/integration/tests/cli/plugins.test.ts (1)
Line range hint 13-24: Consider expanding test coverage for package managers.
The commented-out package managers (`pnpm`, `pnpm-workspace`) suggest incomplete test coverage. Consider re-enabling these tests to ensure plugin functionality across all supported package managers.
tests/integration/tests/enhancements/json/crud.test.ts (2)
1-16: LGTM! Well-structured test setup.
The test suite setup follows best practices with proper database lifecycle management and cleanup.
113-192: LGTM! Comprehensive validation testing.
Excellent coverage of validation rules across different operations and scenarios, including:
- Create/Update/Upsert operations
- Nested object validation
- Complex constraint combinations
README.md (6)
161-161: LGTM! Good documentation improvement.
Adding the link to Zod's website helps users better understand the underlying technology.
Line range hint 169-175: LGTM! Framework adapters section is well-organized.
The framework adapters section maintains consistency in naming and provides correct documentation links.
183-183: LGTM! New feature properly documented.
The addition of the "Strongly typed JSON field" feature aligns with the type definition enhancements mentioned in the PR summary.
203-206: LGTM! Example implementations are well-organized.
The example implementations are properly categorized and maintain consistent naming conventions.
Also applies to: 212-215
228-228: LGTM! Contributing guidelines improved.
The addition of documentation repository information helps contributors understand where to make documentation changes.
251-251: LGTM! Sponsor acknowledgment properly updated.
Benjamin Zecirovic has been correctly added to the previous sponsors section with proper formatting.
packages/plugins/tanstack-query/src/runtime/common.ts (1)
224-224: LGTM! Enhanced type safety in deserialization.
The additional type check `typeof parsed === 'object'` improves robustness by ensuring `parsed` is an object before attempting property access. This defensive programming approach helps prevent potential runtime errors.
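To illustrate why the extra check matters (a hypothetical sketch, not the plugin's actual code): `JSON.parse` can return primitives or `null`, and since `typeof null === 'object'` in JavaScript, a truthiness check is needed alongside the `typeof` check before reading properties:

```typescript
// Hypothetical sketch of defensive deserialization; names are assumptions.
function safeGetData(json: string): unknown {
    const parsed: unknown = JSON.parse(json);
    // `typeof null === 'object'`, so also require truthiness before
    // attempting property access on the parsed value.
    if (parsed && typeof parsed === 'object' && 'data' in parsed) {
        return (parsed as { data: unknown }).data;
    }
    return undefined;
}

console.log(safeGetData('{"data": 42}')); // 42
console.log(safeGetData('null'));         // undefined
console.log(safeGetData('"plain"'));      // undefined
```

Without the guards, inputs like `'null'` or `'"plain"'` would either throw or silently misbehave when properties are accessed.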
packages/plugins/swr/src/generator.ts (3)
13-13: LGTM: Import addition for type definition support
The addition of the `isTypeDef` import aligns with the PR's objective of supporting type definitions in the framework.
31-31: LGTM: Clean extraction of type definitions
The extraction of type definitions from model declarations is implemented cleanly and follows the codebase's conventions.
33-36: LGTM: Updated model metadata generation
The integration of type definitions into the model metadata generation is implemented correctly and maintains consistency with related changes across the codebase.
Let's verify the consistency of this change with related files:
✅ Verification successful
Consistent implementation of type definitions verified across the codebase
The `generateModelMeta` function is consistently called with the `typeDefs` parameter across all relevant files:
- packages/sdk/src/model-meta-generator.ts
- packages/schema/src/plugins/enhancer/model-meta/index.ts
- packages/plugins/tanstack-query/src/generator.ts
- packages/plugins/swr/src/generator.ts
The implementation in the SWR plugin follows the same pattern as other plugins and core modules, maintaining consistency in the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify consistent implementation of type definitions across related files
# Expected: All related files should show similar pattern of handling type definitions
# Check for similar changes in related files
rg -A 5 "generateModelMeta.*typeDefs"
Length of output: 1996
packages/schema/src/cli/plugin-runner.ts (3)
3-3: LGTM: Import changes for type definition support
The addition of the `DataModel` and `isTypeDef` imports aligns with the new type definition validation functionality.
314-315: LGTM: Enhanced validation to include type definition fields
The addition of the `this.hasTypeDefFields(model)` check appropriately extends validation to cover type definition fields.
317-318: LGTM: Well-implemented type definition field detection
The `hasTypeDefFields` method is concise, type-safe, and follows the single responsibility principle. The use of optional chaining (`?.`) in `f.type.reference?.ref` provides proper null safety.
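The null-safety pattern the review praises can be sketched as follows. The AST shapes below are simplified assumptions for illustration; the real types come from the ZModel AST:

```typescript
// Simplified, assumed AST shapes; the real ones come from the ZModel AST.
interface Decl { kind: 'TypeDef' | 'DataModel' }
interface Field { type: { reference?: { ref?: Decl } } }

// Optional chaining short-circuits to `undefined` when the reference is
// missing or unresolved, so fields without a resolved type never throw.
const hasTypeDefFields = (fields: Field[]): boolean =>
    fields.some((f) => f.type.reference?.ref?.kind === 'TypeDef');

console.log(hasTypeDefFields([{ type: {} }]));                                          // false
console.log(hasTypeDefFields([{ type: { reference: { ref: { kind: 'TypeDef' } } } }])); // true
```

Compared with `f.type.reference && f.type.reference.ref && ...`, the chained form keeps the predicate to a single readable expression.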
packages/schema/src/language-server/validator/datamodel-validator.ts (2)
9-9: LGTM: Well-organized imports
The new imports are properly organized and support the added custom type validation functionality.
Also applies to: 11-18
107-115: Verify handling of nested custom types
The current validation looks good, but let's verify the handling of edge cases.
✅ Verification successful
Let me try a different approach to verify the custom type validation by checking the actual implementation and usage.
Let me check the actual implementation of the type validation to verify the handling of custom types.
Based on the search results, I can now provide a final response about the custom type validation implementation.
Custom type validation is correctly implemented
The validation logic for custom-typed fields is properly implemented with two essential checks:
- Enforces the `@json` attribute for custom-typed fields
- Restricts usage to the PostgreSQL provider only
The implementation is consistent across the codebase:
- Custom types are correctly transformed to Prisma's `Json` type
- Type validation is properly integrated with schema generation
- Proper handling in various plugins (Zod, validation, policy guards)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check for nested custom type definitions that might cause validation issues
echo "Searching for nested type definitions..."
ast-grep --pattern 'type $type {
$field: $nestedType
$$$
}'
# Check for circular type references that could cause infinite loops
echo "Searching for potential circular type references..."
ast-grep --pattern 'type $type1 {
$field1: $type2
$$$
}
type $type2 {
$field2: $type1
$$$
}'
Length of output: 373
Script:
#!/bin/bash
# Check how custom types are defined and used in the codebase
echo "Searching for type definitions and their usage..."
rg "type\s+\w+\s*{" -A 5
# Check for @json attribute usage with custom types
echo "Searching for @json attribute usage..."
rg "@json" -A 2
# Look for provider-specific validations
echo "Checking provider-specific validations..."
rg "provider.*postgresql" -A 3
Length of output: 44162
Script:
#!/bin/bash
# Check the implementation of type validation
echo "Searching for type validation implementation..."
rg -A 10 "isTypeDef.*reference.*ref"
# Check for related validation functions
echo "Searching for related validation functions..."
ast-grep --pattern 'function isTypeDef($_) {
$$$
}'
Length of output: 13360
packages/schema/tests/generator/prisma-generator.test.ts (5)
50-78: LGTM! Well-structured test data with comprehensive documentation.
The test data demonstrates good practices:
- Proper documentation for enum values and models
- Clear access control rules
- Correct schema organization with proper model relationships
87-88: LGTM! Appropriate generator configuration for testing comments.
The `customAttributesAsComments` option correctly enables the testing of comment generation functionality.
99-106: LGTM! Comprehensive test assertions for comment generation.
The assertions provide good coverage by:
- Verifying presence of expected comments
- Checking absence of comments where appropriate
- Testing different types of documentation (model, field, enum)
209-210: Similar configuration change as seen before.
242-243: Consistent configuration across test cases.
The `customAttributesAsComments` option is consistently applied across different test scenarios, ensuring comprehensive testing of comment generation functionality.
Also applies to: 436-437, 487-488, 519-520
packages/sdk/src/utils.ts (2)
33-33: LGTM: TypeDefField import addition
The addition of `TypeDefField` to the imports aligns with the broader changes for type definition support.
464-466: LGTM: Added container type check
Good defensive programming practice to validate the container type before proceeding with discriminator field logic.
packages/schema/src/res/stdlib.zmodel (3)
50-50: LGTM: Addition of TypeDefField to AttributeTargetField enum
The new enum value enables proper targeting of type definition fields, which is essential for the enhanced type definition support.
179-182: LGTM: Well-defined @supportTypeDef attribute
The attribute is properly defined with clear documentation explaining its purpose of making attributes applicable to both type definitions and fields.
567-637: LGTM: Consistent application of @supportTypeDef to validation attributes
All validation attributes have been consistently updated to support type definitions while maintaining their original validation functionality. This enhancement allows these validation rules to be applied uniformly across both regular fields and type definitions.
packages/plugins/tanstack-query/src/generator.ts (3)
14-14: LGTM: Import addition for type definition support
The addition of the `isTypeDef` import aligns with the enhanced type definition support mentioned in the PR objectives.
31-32: LGTM: Type definitions integration
The changes correctly integrate type definitions support:
- Properly filters type definitions from model declarations
- Correctly passes them to the model metadata generator
- Maintains backward compatibility
Also applies to: 48-51
Line range hint 52-824: No changes needed in the existing code
The existing code base continues to work seamlessly with the new type definitions support.
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (2)
14-14: LGTM!
The `isTypeDef` import is correctly added and follows the existing import pattern.
766-768: LGTM!
The `hasTypeDefFields` method is well-implemented with a clear purpose and correct logic for checking TypeDef fields.
packages/schema/tests/schema/validation/attribute-validation.test.ts (1)
1358-1371: LGTM! Well-structured test case for type definition field attributes.
The test case effectively validates that the `@omit` attribute cannot be used on type declaration fields. It follows the established testing patterns and provides clear error expectations.
tests/integration/tests/plugins/zod.test.ts (1)
1029-1060: LGTM! Well-structured test setup.
The test case demonstrates a clear and practical example of custom type definitions with nested types and meaningful constraints.
packages/sdk/src/validation.ts (2)
18-21: Function signature correctly updated to handle `TypeDef` and cycle detection
The `hasValidationAttributes` function signature has been appropriately updated to accept both `DataModel` and `TypeDef`, and includes a `seen` set to prevent infinite recursion due to circular references. This change enhances the function's capability to handle nested validations.
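The visited-set pattern used to break cycles can be sketched as below. The node shape is an assumption for illustration, not ZenStack's actual AST:

```typescript
// Sketch of visited-set recursion over a possibly-cyclic type graph.
// Shapes are assumed for illustration only.
interface Node { name: string; hasAttr: boolean; refs: Node[] }

function hasAttrDeep(node: Node, seen = new Set<Node>()): boolean {
    if (seen.has(node)) return false; // already visited: break the cycle
    seen.add(node);
    return node.hasAttr || node.refs.some((r) => hasAttrDeep(r, seen));
}

// Two types referencing each other would recurse forever without `seen`.
const a: Node = { name: 'A', hasAttr: false, refs: [] };
const b: Node = { name: 'B', hasAttr: true, refs: [a] };
a.refs.push(b);
console.log(hasAttrDeep(a)); // true
```

Passing the same `Set` down every recursive call is what guarantees termination; a fresh set per call would not detect the cycle.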
18-21: Ensure all usages of `hasValidationAttributes` are updated to the new signature
With the updated function signature of `hasValidationAttributes`, please verify that all calls to this function in the codebase have been updated accordingly. This ensures consistency and prevents potential runtime errors from mismatched parameters.
Run the following script to find all usages of `hasValidationAttributes`:
packages/runtime/src/cross/model-meta.ts (3)
47-51: Addition of the `isTypeDef` property to `FieldInfo`
Introducing the `isTypeDef` property enhances the `FieldInfo` type by distinguishing fields that are type definitions, which improves metadata accuracy.
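A minimal sketch of how consumers might use such a flag (the `FieldInfo` shape here is simplified and assumed, not the runtime's actual definition):

```typescript
// Assumed, simplified shape of FieldInfo for illustration.
interface FieldInfo {
    name: string;
    type: string;
    isTypeDef?: boolean;
}

const fields: FieldInfo[] = [
    { name: 'id', type: 'String' },
    { name: 'profile', type: 'UserProfile', isTypeDef: true },
];

// Consumers can branch on the flag, e.g. to validate JSON payloads differently.
const typeDefFields = fields.filter((f) => f.isTypeDef).map((f) => f.name);
console.log(typeDefFields); // ['profile']
```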
151-165: Definition of `TypeDefInfo`
Adding the `TypeDefInfo` type allows for detailed metadata representation of type definitions, promoting consistency alongside `ModelInfo`.
175-179: Integration of `typeDefs` into `ModelMeta`
Including `typeDefs` in the `ModelMeta` structure appropriately extends model metadata to encompass type definitions.
packages/language/src/zmodel.langium (6)
14-14: Inclusion of `TypeDef` in `AbstractDeclaration` is appropriate
The addition of `TypeDef` to the `AbstractDeclaration` rule correctly integrates the new type definition feature into the grammar.
182-188: Definition of the `TypeDef` grammar rule is well-structured
The `TypeDef` rule is properly defined, allowing for the declaration of custom types with one or more fields. This enhances the language's capability to support custom type definitions.
189-192: `TypeDefField` aligns with existing field definitions
The `TypeDefField` rule is consistent with other field declarations, ensuring uniformity in how fields are defined within the grammar.
193-195: `TypeDefFieldType` supports built-in and custom types effectively
Allowing `TypeDefFieldType` to reference both built-in types and other `TypeDef` instances provides flexibility in defining complex types, which is valuable for users defining custom structures.
225-225: Updating `RegularID` to include `'type'` as an identifier
Including `'type'` in `RegularID` ensures that it can be used as an identifier, which is necessary for defining custom types without conflicts in the grammar.
242-242: Extending `TypeDeclaration` with `TypeDef` for consistency
Adding `TypeDef` to `TypeDeclaration` allows type definitions to be used wherever a type declaration is expected, maintaining consistency across the grammar.
packages/schema/src/plugins/zod/utils/schema-gen.ts (5)
31-31: Updated `makeFieldSchema` function to accept `TypeDefField`
The function `makeFieldSchema` now accepts `TypeDefField` in addition to `DataModelField`. This enhancement allows the function to handle custom type definitions, ensuring that schema generation accommodates new types defined in the model.
177-177: Modified `makeZodSchema` function to support `TypeDefField`
The `makeZodSchema` function's signature has been updated to accept a parameter of type `DataModelField | TypeDefField`. This change extends the function's capability to generate schemas for custom type definitions, aligning with the enhanced type handling in the codebase.
180-187: Added handling for `TypeDef` within `makeZodSchema`
The conditional logic inside `makeZodSchema` now includes a check for `TypeDef` references. When a field's type is a `TypeDef`, the schema is generated using `z.lazy()` to defer evaluation:
schema = `z.lazy(() => ${upperCaseFirst(field.type.reference.ref.name)}Schema)`;
This approach ensures that custom types are correctly referenced in the schema, preventing issues with circular dependencies.
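The deferral idea behind `z.lazy()` can be shown without Zod itself. This dependency-free sketch (types and names are illustrative assumptions) demonstrates why a validator for a self-referential type must be resolved at call time rather than definition time:

```typescript
// Dependency-free sketch of lazy schema references for recursive types.
type Validator = (v: unknown) => boolean;

// Defers resolving the validator until it is actually invoked.
const lazy = (get: () => Validator): Validator => (v) => get()(v);

// `TreeNode { value: number; children: TreeNode[] }` references itself,
// so its validator is referenced lazily inside its own definition.
const isTreeNode: Validator = (v) => {
    if (typeof v !== 'object' || v === null) return false;
    const o = v as { value?: unknown; children?: unknown };
    return (
        typeof o.value === 'number' &&
        Array.isArray(o.children) &&
        o.children.every((c) => lazy(() => isTreeNode)(c))
    );
};

console.log(isTreeNode({ value: 1, children: [{ value: 2, children: [] }] })); // true
console.log(isTreeNode({ value: 1, children: [{ value: 'x', children: [] }] })); // false
```

Zod's `z.lazy(() => Schema)` plays the same role: the callback is only evaluated when validation runs, by which point the referenced schema exists.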
238-239: Refined path handling in `makeValidationRefinements`
The construction of the `path` option has been updated to correctly handle array expressions:
const path =
    pathArg && isArrayExpr(pathArg) ? `path: ['${getLiteralArray<string>(pathArg)?.join(`', '`)}'],` : '';
This change ensures that the validation refinement paths are accurately built, improving the precision of validation error reporting.
284-284: Updated `getFieldSchemaDefault` function to include `TypeDefField`
The function `getFieldSchemaDefault` now accepts `TypeDefField` alongside `DataModelField`. This modification ensures that default values are correctly retrieved for both data model fields and type definition fields during schema generation.
packages/schema/src/utils/ast-utils.ts (4)
20-26: Import statements are appropriate.
The added imports are necessary for the updated functionality and are correctly specified.
70-70: Updated `filterBaseAttribute` function call aligns with new signature.
The function call now correctly includes `dataModel` as the first parameter, matching the updated function signature.
Line range hint 94-119: Enhanced `filterBaseAttribute` with inheritance checks.
The updated logic properly handles uninheritable attributes, especially when inheritance is from or through a delegate model. This ensures that attributes like `@@unique`, `@@index`, and `@@fulltext` are not incorrectly inherited.
122-128: Addition of the `isInheritedFromOrThroughDelegate` function improves inheritance handling.
The new function efficiently determines if a model inherits from or through a delegate model, which is crucial for the attribute filtering logic in `filterBaseAttribute`.
packages/schema/src/language-server/validator/attribute-application-validator.ts (1)
355-357: Validate the inclusion of `TypeDefField` in attribute target checks
The addition of the `TypeDefField` case in the switch statement enhances the `isValidAttributeTarget` function to handle type definitions appropriately. Ensure that `isTypeDef(targetDecl.type.reference?.ref)` accurately identifies type definitions, and verify that this inclusion aligns with the intended attribute validation logic.
Run the following script to confirm that `TypeDefField` instances are correctly processed:
✅ Verification successful
Let me gather more information about the type definition handling and attribute validation logic.
Based on the gathered information, I can now provide a final response:
The `TypeDefField` case in attribute target checks is correctly implemented
The implementation is verified to be correct for the following reasons:
- The `TypeDefField` case properly uses `isTypeDef` to validate the field's reference type
- Attributes that can be applied to type definition fields are explicitly marked with `@@@supportTypeDef`
- The validator correctly enforces that only attributes with `@@@supportTypeDef` can be used on type definition fields
- The validation logic aligns with the language's type system as defined in the AST
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify that attributes are correctly applied to TypeDefFields where appropriate.
# Test: Search for usage of `isValidAttributeTarget` to see how it's called with different field types.
rg -A 5 "isValidAttributeTarget"
Length of output: 1690
Script:
#!/bin/bash
# Search for isTypeDef function definition and usage
ast-grep --pattern 'function isTypeDef($_) { $$$ }'
# Search for TypeDefField related code
rg "TypeDefField" -A 3 -B 3
# Look for @@@supportTypeDef attribute usage
rg "@@@supportTypeDef" -A 3 -B 3
Length of output: 30140
packages/sdk/src/model-meta-generator.ts (7)
253-255: Ensure correct identification of `TypeDef` references.
In the condition checking for `TypeDef` references, verify that `isTypeDef(f.type.reference?.ref)` correctly identifies type definitions and that the resulting metadata (`isTypeDef: true`) is accurately captured.
384-384: Update dependent functions to accommodate `TypeDefField`.
The `getAttributes` function signature now includes `TypeDefField`. Please review any functions that call `getAttributes` to ensure they pass `TypeDefField` instances where appropriate and handle the returned attributes correctly.
Line range hint 545-557: Verify default value generation for `TypeDefField` instances.
The `generateDefaultValueProvider` function has been extended to handle `TypeDefField`. Ensure that default values, especially those involving expressions like `auth()`, are correctly generated for `TypeDefField` instances without introducing errors.
289-329: Confirm safe usage of `dmField` in conditional blocks.
Within the `writeFields` function, `dmField` is assigned based on whether the field is a `DataModelField`. Ensure that all usages of `dmField` within conditional blocks correctly check for its existence to prevent potential `undefined` errors.
94-101: Ensure consistency in `generateModelMetadata` function calls.
The `generateModelMetadata` function now accepts `typeDefs` as an additional parameter. Confirm that all calls to this function provide the `typeDefs` argument to maintain consistency and prevent potential issues.
Run the following script to locate and check all usages of `generateModelMetadata`:
#!/bin/bash
# Description: Find all calls to `generateModelMetadata` to verify they include `typeDefs`.
# Expected: All calls should include `typeDefs` as the second parameter.
rg -A 2 $'generateModelMetadata\\(dataModels[, ]'
Line range hint 226-284: Validate the `writeFields` function for `TypeDef` compatibility.
The `writeFields` function now accepts a `container` parameter that can be either a `DataModel` or a `TypeDef`. Ensure that the function correctly processes fields from both types, and that all conditional logic accounts for `TypeDef` instances to prevent unexpected behavior.
Run the following script to find all calls to `writeFields` and ensure they pass the correct `container` type:
#!/bin/bash
# Description: Verify that `writeFields` is called with appropriate `container` types.
rg -A 3 $'writeFields\\(sourceFile, writer, '
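The `DataModel | TypeDef` container union can be handled safely with a discriminant and a type guard. The sketch below uses `$type` discriminants, mirroring how Langium ASTs are commonly shaped; the concrete types are assumptions for illustration:

```typescript
// Sketch of handling a `DataModel | TypeDef` container union.
interface DataModel { $type: 'DataModel'; name: string; fields: string[] }
interface TypeDef { $type: 'TypeDef'; name: string; fields: string[] }
type Container = DataModel | TypeDef;

// Type guard: narrows the union for model-specific logic.
const isDataModel = (c: Container): c is DataModel => c.$type === 'DataModel';

function describeContainer(c: Container): string {
    // Shared members (`name`, `fields`) are safe on the union;
    // model-only logic must be gated behind the type guard.
    const base = `${c.name}: ${c.fields.length} field(s)`;
    return isDataModel(c) ? `${base} (model)` : `${base} (type)`;
}

console.log(describeContainer({ $type: 'TypeDef', name: 'Profile', fields: ['age'] }));
```

Accessing only shared members on the union is exactly what keeps `writeFields` safe for both container kinds.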
69-74: Verify all calls to the updated `generate` function.
The `generate` function signature has been modified to include the new parameter `typeDefs: TypeDef[]`. Please ensure that all invocations of this function across the codebase are updated to pass the `typeDefs` argument to prevent runtime errors.
Run the following script to find and review all usages of the `generate` function:
✅ Verification successful
Let's search specifically for the usage of the `generate` function from model-meta-generator.ts.
Based on the search results, I can now provide a final response:
All calls to `generateModelMeta` are already updated with the required `typeDefs` parameter
The search results show that all existing calls to the function (exported as `generateModelMeta`) are already passing the `typeDefs` parameter:
- packages/schema/src/plugins/enhancer/model-meta/index.ts: `await generateModelMeta(project, dataModels, typeDefs, {...})`
- packages/plugins/swr/src/generator.ts: `await generateModelMeta(project, models, typeDefs, {...})`
- packages/plugins/tanstack-query/src/generator.ts: `await generateModelMeta(project, models, typeDefs, {...})`
All callers have been properly updated to include the new parameter in the correct position.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find all calls to the `generate` function to verify they include the new `typeDefs` parameter.
# Expected: All calls should pass `typeDefs` as the third argument.
rg -A 2 $'generate\\(project, models[, ]'
Length of output: 825
Script:
#!/bin/bash
# Let's try a different approach to find calls to the generate function
# First, let's use ast-grep to find function calls
ast-grep --pattern 'generate(project, models$$$)'
# Also search for any direct references to generate
rg --type ts "generate\(.*project.*models.*\)" -A 2
Length of output: 105
Script:
#!/bin/bash
# Let's try a more comprehensive search to find any usage of the generate function
# First, let's find all files that import from model-meta-generator
rg --type ts "from ['\"].*model-meta-generator['\"]"
# Then search for any calls to generate function
rg --type ts "await generate\(" -A 3
rg --type ts "generate\(" -A 3
Length of output: 37691
Script:
#!/bin/bash
# Let's find files that import the generate function from model-meta-generator
rg --type ts "from ['\"]\.\./.*model-meta-generator['\"]|from ['\"]@zmodel/sdk.*model-meta-generator['\"]" -A 1
# Also search for any direct usage of generateModelMeta
rg --type ts "generateModelMeta\(" -A 2
Length of output: 1369
packages/schema/src/plugins/zod/generator.ts (1)
327-373: Ensure `addPreludeAndImports` handles both `DataModel` and `TypeDef` correctly
The `addPreludeAndImports` method signature has been updated to accept `decl: DataModel | TypeDef`. Verify that all operations within this method are compatible with both types. For instance, accessing `decl.fields` assumes that both `DataModel` and `TypeDef` have a `fields` property with the same structure. Ensure that any type-specific logic is appropriately handled to prevent runtime errors.
packages/schema/src/plugins/enhancer/enhance/index.ts (2)
71-73: Verify the necessity of calling `generateLogicalPrisma` when `needsPrismaClientTypeFixes` is true
The condition now includes `this.needsPrismaClientTypeFixes`, which may cause `generateLogicalPrisma()` to execute even if `this.needsLogicalClient` is false. Ensure that `generateLogicalPrisma()` can handle cases where `needsLogicalClient` is false without unintended side effects.
354-358: Ensure correct order of method calls in `processClientTypes`
The sequence of calling `this.transformPrismaTypes(sf, sfNew, delegateInfo);` followed by `this.generateExtraTypes(sfNew);` may have dependencies. Verify that `generateExtraTypes` doesn't rely on uninitialized state from `transformPrismaTypes` and that the order of execution yields the expected results.
tests/integration/tests/enhancements/with-policy/checker.test.ts (2)
698-702: Potential Conflict in Policy Rules
In the `Foo` model, both an unconditional `@@allow('all', true)` and an unconditional `@@deny('update', true)` are specified. This might lead to confusion about which rule takes precedence. It's important to ensure that the policy logic is clear and unambiguous.
Please confirm that the deny rule for `'update'` is intended to override the allow-all rule.
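ZenStack documents that a matching deny rule rejects an operation regardless of allow rules. Under that assumption, the precedence question above can be sketched as a simple evaluation order (rule shapes here are simplified illustrations, not ZenStack's implementation):

```typescript
// Sketch of deny-over-allow evaluation: deny rules are checked first and win.
// Rule and context shapes are simplified assumptions.
type Ctx = { op: string; value: number };
type Rule = (ctx: Ctx) => boolean;

function isPermitted(ctx: Ctx, allows: Rule[], denies: Rule[]): boolean {
    if (denies.some((r) => r(ctx))) return false; // any matching deny rejects
    return allows.some((r) => r(ctx));            // otherwise an allow must match
}

const allows: Rule[] = [() => true];               // like @@allow('all', true)
const denies: Rule[] = [(c) => c.op === 'update']; // like @@deny('update', true)

console.log(isPermitted({ op: 'read', value: 0 }, allows, denies));   // true
console.log(isPermitted({ op: 'update', value: 0 }, allows, denies)); // false
```

With that ordering, the unconditional deny on `'update'` in the `Foo` model would indeed override the allow-all rule for updates.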
740-741: Ensure Consistent Use of Deny Rules
The deny rule `@@deny('update', check(foo) && value == 1)` may cause unintended side effects if not carefully managed. Verify that this rule is correctly overriding any allow rules and that it reflects the intended permission logic.
Would you like to add tests that specifically check the `'update'` operation when `value == 1` to ensure the deny rule functions as expected?
packages/schema/src/plugins/prisma/schema-generator.ts (17)
26-26: Add missing import for `isTypeDef`
The function `isTypeDef` is now imported, which is necessary for type checking used later in the code.
103-103: Initialize the `customAttributesAsComments` property
The new property `customAttributesAsComments` is initialized to `false`, setting a default behavior when the option is not specified.
121-128: Validate and set the `customAttributesAsComments` option
The code properly validates that `options.customAttributesAsComments`, if defined, is a boolean. It then sets the instance property accordingly.
296-296
: Include custom attributes as comments in models
The method getCustomAttributesAsComments(decl)
is used to add custom attributes as comments to the model declarations. This ensures that custom metadata is retained in the generated Prisma schema.
406-406
: Filter descendant models for polymorphic relations
The addition of isDescendantOf(d, fieldType)
refines the selection to models that are descendants of fieldType
, enhancing the accuracy of polymorphic relation expansions.
421-421
: Check for @relation
attribute before accessing fields
By checking if relAttr
exists, the code avoids potential undefined
errors when accessing relation attributes.
423-423
: Handle foreign key replication if relation has fields
The condition if (getAttributeArg(relAttr, 'fields'))
ensures that foreign key fields are replicated only when they exist in the relation, preventing unnecessary processing.
452-452
: Use auxiliary relation field name in @relation
Setting the relation name to auxRelationField.name
within the @relation
attribute ensures uniqueness and clarity in relation mappings.
467-469
: Set flag after adding relation attribute
The relAttrAdded
flag correctly tracks whether a relation attribute was added, allowing for appropriate handling in subsequent logic.
498-505
: Recreate @unique
attribute with unique constraint name
The code filters out existing @unique
attributes and recreates them with a disambiguated map
name to prevent naming conflicts in the database constraints.
Line range hint `589-736`: **Ensure inherited delegate relations are correctly processed**

The `ensureRelationsInheritedFromDelegate` method ensures that relations inherited from delegate models are appropriately included and that their relation attributes are correctly generated, maintaining the integrity of polymorphic relationships.
`789-808`: **Handle `TypeDef` fields appropriately in type determination**

The code sets `fieldType` to `'Json'` when `field.type.reference?.ref` is a `TypeDef`, ensuring that custom type definitions are treated as JSON in the Prisma schema.
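For illustration, a ZModel custom type used as a field type would surface as a plain `Json` column in the generated Prisma schema; the declarations below are hypothetical:

```zmodel
type Address {
  city String
  zip  String
}

model User {
  id      Int     @id
  address Address @json
}
```

which would generate roughly:

```prisma
model User {
  id      Int  @id
  address Json
}
```

The structural typing of `Address` exists only at the ZModel/ZenStack layer; Prisma sees an opaque JSON value.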
`834-836`: **Include custom attributes as comments in fields**

The `docs` array now combines field comments with custom attributes converted to comments, ensuring all relevant metadata is included in the generated schema.
`967-968`: **Add custom attributes as comments in enums**

Custom attributes for enums are included as comments, maintaining consistency in how custom metadata is handled across different schema elements.
`975-977`: **Process custom attributes for enum fields**

Enum fields now also include custom attributes as comments, enhancing documentation and clarity in the schema.
`979-986`: **Implement the `getCustomAttributesAsComments` method**

The `getCustomAttributesAsComments` method effectively extracts non-Prisma attributes and formats them as comments using the `zModelGenerator`. This enables custom attributes to be preserved in the generated schema when `customAttributesAsComments` is enabled.
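The core of such a method is a filter-and-format pass over a declaration's attributes. A minimal sketch under assumed data shapes (the real method identifies Prisma attributes via the SDK and delegates formatting to the ZModel generator; the set and names below are illustrative):

```typescript
// Hypothetical attribute shape; the real code works with ZModel AST nodes.
type Attribute = { name: string; text: string };

// Illustrative subset of attributes Prisma understands natively.
const PRISMA_ATTRIBUTES = new Set(['@id', '@unique', '@default', '@relation', '@map']);

// Turn non-Prisma attributes into `///` comment lines when the option is enabled.
function getCustomAttributesAsComments(attrs: Attribute[], enabled: boolean): string[] {
    if (!enabled) return [];
    return attrs
        .filter((attr) => !PRISMA_ATTRIBUTES.has(attr.name))
        .map((attr) => `/// ${attr.text}`);
}
```

Emitting the attributes verbatim inside the comment text keeps them machine-recoverable from the generated schema.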
`813-816`: **Correctly determine `isArray` for `TypeDef` fields**

By setting `isArray` to `false` when the field type is a `TypeDef`, the code prevents arrays of custom types, which may not be supported. This aligns with the schema generation requirements.

However, please verify whether arrays of `TypeDef` should be supported. If arrays of custom types are intended to be allowed, this logic may need to be adjusted.
To verify whether any fields are defined as arrays of `TypeDef`, you can run the following script:
✅ Verification successful

**Arrays of `TypeDef` are used in the codebase but intentionally prevented in the Prisma schema**

The verification reveals that while `TypeDef` arrays are used in the codebase (particularly in `packages/sdk/src/model-meta-generator.ts`), they are intentionally prevented in Prisma schema generation. This is correct because:

- The SDK uses `TypeDef` arrays for internal metadata handling (`typeDefs: TypeDef[]`).
- The Prisma schema generator explicitly converts `TypeDef` fields to a non-array `Json` type, which is the expected behavior since Prisma handles complex types through JSON serialization.
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Find fields that are arrays of `TypeDef`
# Expectation: No fields should be arrays of `TypeDef`. If any are found, they may require special handling.
ast-grep --pattern '$FIELD: $_TypeDef\[\]' --lang prisma
```

Length of output: 171
Script:

```shell
#!/bin/bash
# Search for TypeDef array usage patterns in TypeScript files

# Look for type definitions and their array usage
rg -U "type.*=.*TypeDef.*\[\]" -A 2

# Look for variable/property declarations with TypeDef arrays
rg -U ":\s*.*TypeDef.*\[\]" -A 2

# Look for TypeDef instantiations with array flag
ast-grep --pattern 'new TypeDef($$$, true$$$)'
```

Length of output: 991
Script:

```shell
#!/bin/bash
# Check the schema generator implementation for array handling of TypeDefs
rg -U "isTypeDef.*array" -A 5
```

Length of output: 691
Script:

```shell
#!/bin/bash
# Check for test cases or examples involving TypeDef arrays
fd "test|spec|example" -e ts -x rg -l "TypeDef.*\[\]" {}
```

Length of output: 60
No description provided.