-
-
Notifications
You must be signed in to change notification settings - Fork 90
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
fix: clean up generation of logical prisma client #1082
Conversation
WalkthroughWalkthroughThis update introduces significant enhancements across various plugins and core components, focusing on refining the plugin architecture, improving error handling, and extending functionality through additional parameters and return types. Changes include modifications to generator functions, error checking, and the introduction of new types and utilities to facilitate more robust and flexible plugin development. The overarching goal is to streamline plugin interactions, enhance developer experience, and provide more comprehensive support for handling third-party APIs and data management. Changes
Related issues
Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on your favorite social media? TipsChatThere are 3 ways to chat with CodeRabbit:
Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments. CodeRabbit Commands (invoked as PR comments)
Additionally, you can add CodeRabbit Configration File (
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Review Status
Actionable comments generated: 14
Configuration used: CodeRabbit UI
Files ignored due to path filters (1)
packages/runtime/package.json
is excluded by:!**/*.json
Files selected for processing (33)
- packages/plugins/openapi/src/generator-base.ts (2 hunks)
- packages/plugins/openapi/src/index.ts (2 hunks)
- packages/plugins/openapi/src/rest-generator.ts (1 hunks)
- packages/plugins/openapi/src/rpc-generator.ts (1 hunks)
- packages/plugins/swr/src/generator.ts (1 hunks)
- packages/plugins/swr/src/index.ts (1 hunks)
- packages/plugins/tanstack-query/src/generator.ts (3 hunks)
- packages/plugins/tanstack-query/src/index.ts (1 hunks)
- packages/plugins/trpc/src/generator.ts (6 hunks)
- packages/plugins/trpc/src/helpers.ts (2 hunks)
- packages/plugins/trpc/src/index.ts (1 hunks)
- packages/runtime/res/prisma.d.ts (1 hunks)
- packages/runtime/src/prisma.d.ts (1 hunks)
- packages/schema/src/cli/plugin-runner.ts (10 hunks)
- packages/schema/src/plugins/enhancer/enhance/index.ts (6 hunks)
- packages/schema/src/plugins/enhancer/index.ts (2 hunks)
- packages/schema/src/plugins/enhancer/model-meta/index.ts (1 hunks)
- packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (3 hunks)
- packages/schema/src/plugins/plugin-utils.ts (1 hunks)
- packages/schema/src/plugins/prisma/index.ts (1 hunks)
- packages/schema/src/plugins/prisma/schema-generator.ts (7 hunks)
- packages/schema/src/plugins/zod/generator.ts (2 hunks)
- packages/schema/src/plugins/zod/index.ts (1 hunks)
- packages/schema/src/plugins/zod/transformer.ts (10 hunks)
- packages/schema/src/plugins/zod/types.ts (2 hunks)
- packages/schema/src/telemetry.ts (1 hunks)
- packages/sdk/src/model-meta-generator.ts (2 hunks)
- packages/sdk/src/prisma.ts (1 hunks)
- packages/sdk/src/types.ts (4 hunks)
- packages/testtools/src/schema.ts (2 hunks)
- tests/integration/tests/enhancements/with-delegate/enhanced-client.test.ts (1 hunks)
- tests/integration/tests/enhancements/with-delegate/plugin-interaction.test.ts (1 hunks)
- tests/integration/tests/enhancements/with-delegate/utils.ts (1 hunks)
Additional comments: 65
packages/runtime/res/prisma.d.ts (1)
- 1-1: The syntax for exporting types is correct. However, please verify the path used in the export statement to ensure it accurately points to the intended directory and does not introduce any dependency or path resolution issues.
packages/runtime/src/prisma.d.ts (1)
- 1-2: The use of
@ts-expect-error
and the export syntax are appropriate for the intended functionality. However, please verify the path used in the export statement and ensure the use of@ts-expect-error
aligns with the project's standards and doesn't unintentionally mask other errors.packages/plugins/swr/src/index.ts (1)
- 6-11: Refactoring the
run
function to usePluginFunction
and adding a check for the presence ofdmmf
are good practices that improve code maintainability and robustness. Please ensure thatPluginFunction
includes all necessary types previously imported individually.packages/plugins/tanstack-query/src/index.ts (1)
- 6-11: The refactoring to use
PluginFunction
and the addition of admmf
presence check are consistent with improvements seen in other plugins, enhancing code maintainability and robustness. Ensure thatPluginFunction
encompasses all necessary types.packages/plugins/trpc/src/index.ts (1)
- 7-12: Refactoring to use
PluginFunction
and adding admmf
presence check are consistent improvements across plugins, enhancing code maintainability and robustness. Verify thatPluginFunction
includes all necessary types.packages/schema/src/plugins/zod/index.ts (1)
- 3-11: Changing the import to
ZodSchemaGenerator
and its usage in therun
function likely clarifies the generator's purpose or functionality. Please verify that this change aligns with the intended functionality and naming conventions.packages/schema/src/plugins/zod/types.ts (1)
- 11-16: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [1-18]
The removal of the
zmodel
property fromTransformerParams
suggests a simplification or refactoring of the type structure. Please verify that this change does not negatively impact the Zod plugin's functionality or any dependent parts of the codebase.packages/plugins/openapi/src/index.ts (1)
- 1-14: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [7-22]
Refactoring to use
PluginFunction
and adding admmf
presence check, along with handling differentflavor
options, are good practices that enhance code maintainability, robustness, and flexibility. Ensure that the default case and error messages in the switch statement are clear and provide actionable guidance to developers.tests/integration/tests/enhancements/with-delegate/plugin-interaction.test.ts (1)
- 5-24: The integration test setup for testing the polymorphic plugin interaction, specifically with the Tanstack Query plugin, appears correct. Please verify the accuracy of paths and configurations used in the test to ensure they align with the intended test environment and objectives.
tests/integration/tests/enhancements/with-delegate/utils.ts (1)
- 1-47: The Prisma schema defined in
POLYMORPHIC_SCHEMA
uses custom directives@delegate
and@allow
, which appear to be project-specific extensions. Assuming these are correctly implemented elsewhere in the project, the schema looks well-structured and follows Prisma's conventions.packages/sdk/src/types.ts (1)
- 45-78: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [1-86]
The changes to type definitions in this file, including the addition of
prismaClientPath
toPluginOptions
,tsProject
toPluginGlobalOptions
, and modifications toPluginFunction
, align well with the objectives of enhancing plugin systems and improving flexibility. These changes follow good practices for type definitions.packages/schema/src/plugins/plugin-utils.ts (1)
- 58-60: The addition of the
./prisma
entry in theensureDefaultOutputFolder
function aligns with the objectives of enhancing the Prisma plugin system. Ensure that any user input is properly sanitized before being used in path operations to prevent potential security issues.packages/schema/src/telemetry.ts (1)
- 114-125: The modifications to the
trackSpan
method, allowing it to return the result of theaction
function and use a generic type for the return value, enhance its flexibility and usability. These changes follow good practices for generic programming.packages/plugins/openapi/src/generator-base.ts (1)
- 15-15: The change to make the
generate
method abstract and returnPluginResult
is a positive step towards standardizing result and error handling across plugins. Ensure that all subclasses ofOpenAPIGeneratorBase
correctly implement this method and properly handlePluginResult
.packages/testtools/src/schema.ts (2)
- 256-259: The addition of
.zenstack/prisma
totsconfig.compilerOptions.paths
and the adjustments toinclude
andexclude
settings are beneficial for enhancing type safety and the development experience. Ensure to verify that these changes do not introduce any unintended side effects in the project's build or development process.- 341-351: The modifications to the arguments passed to the
prismaPlugin
function appear to provide more control over the Prisma client generation for testing purposes. Ensure that these new arguments are correctly handled by theprismaPlugin
function and that they meet the intended testing requirements without causing regressions.packages/plugins/trpc/src/helpers.ts (1)
- 227-229: The modification to the
generateRouterTypingImports
function signature to acceptoptions: PluginOptions
enhances flexibility and configurability. Ensure that all calls to this function have been updated to pass the correctoptions
parameter and that the function properly utilizes these options to determine the necessary imports.packages/plugins/trpc/src/generator.ts (6)
- 75-76: The addition of the
options
parameter to thecreateAppRouter
function enhances the flexibility of plugin options handling. This change allows for more customizable behavior of the router based on the provided options, aligning with the PR objectives to improve functionality and configurability.- 90-91: The
options
parameter in thecreateAppRouter
function signature is correctly typed asPluginOptions
, ensuring type safety and consistency with the rest of the codebase. This change supports the objective of enhancing plugin flexibility and error handling.- 100-100: The
getPrismaClientImportSpec
function is used to dynamically determine the module specifier for importing thePrismaClient
. This approach is beneficial for maintaining flexibility in the project structure and supporting different configurations.- 174-175: Including the
zodSchemasImport
andoptions
parameters in thegenerateModelCreateRouter
function call within thecreateRouter
function body demonstrates a commitment to enhancing the configurability and flexibility of the plugin system. This change aligns with the PR's objectives to improve the generation process and handling of the logical Prisma client.- 244-245: The addition of the
zodSchemasImport
andoptions
parameters to thegenerateModelCreateRouter
function signature is a positive change, promoting flexibility and configurability in the generation of model routers. This aligns with the PR's goal of refining the plugin system and improving error handling.- 263-263: The conditional inclusion of
generateRouterTypingImports
based on the presence ofgenerateClientHelpers
is a good practice, as it avoids unnecessary imports and keeps the generated code clean and efficient. This decision supports the PR's objectives of improving maintainability and performance.packages/schema/src/cli/plugin-runner.ts (8)
- 6-18: The introduction of type declarations (
OptionValue
,PluginDeclaredOptions
,PluginFunction
,PluginResult
) and utility functions (createProject
,emitProject
,saveProject
) at the beginning of the file is a positive change. It enhances code readability, maintainability, and type safety, aligning with the PR's objectives to improve the plugin system's architecture and error handling.- 109-110: The separation of plugins into
corePlugins
anduserPlugins
within thecalculateAllPlugins
method is a strategic improvement. It allows for a clearer distinction between core and user-defined plugins, facilitating better management and execution flow. This change supports the PR's goal of refining the plugin execution flow and improving project management.- 132-148: The handling of core plugins before user plugins, including the management of
dmmf
andprismaClientPath
, demonstrates a thoughtful approach to plugin execution order. This ensures that core functionalities are established before extending them with user plugins, aligning with best practices for plugin systems.- 152-160: The decision to compile code generated by core plugins before running user plugins is a good practice. It ensures that any foundational changes or enhancements made by core plugins are available and stable before user plugins are executed. This supports the PR's objectives of improving the plugin execution flow and ensuring robustness.
- 183-196: The conditional inclusion of the
@core/enhancer
plugin based onoptions.defaultPlugins
and validation rules presence is a smart design choice. It allows for dynamic plugin activation based on project requirements, contributing to a more flexible and efficient plugin system.- 208-208: The logic to ensure the
@core/zod
plugin is enabled if the@core/enhancer
is active and validation rules are present is a thoughtful addition. It demonstrates an understanding of dependencies between plugins and ensures that necessary functionalities are available when needed.- 260-260: The
calculateAllPlugins
method's return structure, separatingcorePlugins
anduserPlugins
, is a clean and effective way to manage the different types of plugins within the system. This approach enhances maintainability and clarity in the plugin execution process.- 367-375: The
compileProject
function's handling of thecompile
option fromrunnerOptions
to decide between emitting or saving project files is a flexible approach. It allows developers to choose the desired behavior based on their development or deployment needs, aligning with the PR's goal of improving project compilation and plugin execution flow.packages/sdk/src/model-meta-generator.ts (2)
- 51-54: The addition of the
preserveTsFiles
option inModelMetaGeneratorOptions
provides flexibility in deciding whether to save TypeScript files post-compilation. This is a useful feature for debugging or auditing generated code.- 68-70: The conditional saving of TypeScript files based on the
preserveTsFiles
option is correctly implemented. However, ensure that the rest of the codebase or external tooling that relies on the generated files is aware of this conditional behavior to avoid any unexpected issues.packages/plugins/swr/src/generator.ts (2)
- 52-52: The addition of the
options
parameter to thegenerateModelHooks
function call is a significant change that enhances the function's flexibility by allowing it to accept configuration options. Ensure that all calls to this function throughout the codebase have been updated to include this new parameter.Verification successful
Given the script execution did not return any results, it suggests that there are no calls to
generateModelHooks
missing the newoptions
parameter. This indicates that the necessary updates to include this parameter in all relevant function calls have likely been made throughout the codebase.* 56-56: Changing the return value of the `generate` function to return an object containing `warnings` instead of directly returning `warnings` improves the function's extensibility. This allows for future enhancements, such as returning additional metadata or results alongside warnings.Scripts Executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for calls to `generateModelHooks` without the `options` parameter. ast-grep --lang javascript --pattern $'generateModelHooks($_, $_, $_, $_, $_)'Length of output: 79
packages/schema/src/plugins/enhancer/enhance/index.ts (7)
- 1-1: The import of
DMMF
from@prisma/generator-helper
is correctly added to support the new functionalities related to Prisma schema handling.- 13-13: The addition of the
fs
import is necessary for file system operations, which are crucial for the new functionalities introduced in this PR, such as saving source files and reading the Prisma schema file.- 36-36: The declaration of
dmmf
as an optionalDMMF.Document
is appropriate for storing the Prisma schema metadata, which is essential for the enhancements made in this PR.- 40-40: The call to
generateLogicalPrisma
within the conditional block is correctly placed to ensure that a logical Prisma schema is generated only when delegate models are present.- 48-53: Creating a reexport of the logical Prisma client using
project.createSourceFile
and saving it withsaveSourceFile
is a clean approach to manage the generated Prisma client.- 23-74: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [64-86]
The creation and saving of the
enhance.ts
source file, which includes the enhancement logic for the Prisma client, is implemented correctly. The conditional logic for includingPrismaClient
import based on the presence oflogicalPrismaClientDir
is a good practice.
- 100-143: The
generateLogicalPrisma
function is well-structured and performs its intended purpose of generating a logical Prisma schema and client. The use ofPrismaSchemaGenerator
, handling of the output directory, and thegetDMMF
call for loading the schema metadata are all correctly implemented.packages/schema/src/plugins/zod/generator.ts (7)
- 28-41: The constructor implementation correctly initializes class properties and enforces the presence of global options, ensuring that the necessary configuration is provided for schema generation.
- 139-173: The
getExcludedModels
method effectively calculates models to be excluded from generation, considering both direct exclusions and transitive dependencies. The implementation is comprehensive and well-thought-out.- 187-198: The
generateCommonSchemas
method effectively generates common schemas, such as for Decimal types, and writes them to a source file. The implementation is straightforward and correct.- 201-211: The
generateEnumSchemas
method correctly generates schemas for enums by combining Prisma and model schema enums. The use of the Transformer class for schema generation is appropriate and well-implemented.- 214-242: The
generateObjectSchemas
method effectively generates object schemas, correctly handling the option to exclude unchecked input types. The iterative approach and use of the Transformer class for schema generation are well-implemented.- 245-258: The
generateModelSchemas
method effectively generates schemas for models, correctly handling the exclusion of specified models. The iterative approach and schema generation logic are well-implemented.- 487-513: The helper methods
makePartial
,makeOmit
,makeMerge
, andmakePassthrough
correctly implement utilities for manipulating Zod schemas. The implementation is straightforward and effectively achieves the intended schema manipulations.packages/plugins/tanstack-query/src/generator.ts (2)
- 289-290: The addition of the
options
parameter to thegenerateModelHooks
function enhances its flexibility and aligns with the PR's objectives to introduce new parameters for more customizable behavior.- 62-62: Changing the return value of the
generate
function to an object containingwarnings
improves consistency and allows for future expansions in the function's return type.packages/schema/src/plugins/prisma/schema-generator.ts (4)
- 98-100: The error handling for unspecified output files is clear and follows best practices by providing a descriptive error message. This ensures that users are immediately aware of missing required configuration.
- 95-105: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [102-111]
Checking the Prisma version against a minimum required version is a good practice for maintaining compatibility. Adding a warning to inform the user rather than throwing an error allows the generation process to continue, which might be acceptable in some cases but should be carefully considered depending on how critical the version requirement is.
- 34-39: > 📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [113-130]
Iterating through the zmodel declarations and handling each type (DataSource, Enum, DataModel, GeneratorDecl) with a dedicated method is a clean and modular approach. This enhances the readability and maintainability of the code by separating concerns and allowing for easier updates or additions to the generation logic in the future.
- 134-139: > 📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [132-147]
Ensuring the output directory exists before writing the file and providing an option to format the schema file using Prisma's format command are excellent practices. These steps improve the user experience by ensuring the generated schema is not only saved correctly but also presented in a well-formatted and readable manner.
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (2)
- 63-63: The change in the method signature of
generate
to acceptoptions
beforeoutput
is a positive step towards enhancing flexibility and clarity in handling generation options. Ensure that all calls to this method across the project have been updated to reflect this new order of parameters.- 78-78: Swapping the parameters in the
prismaImport
function call to(output, options)
aligns with the overall goal of enhancing flexibility and clarity in handling options. Ensure that thegetPrismaClientImportSpec
function has been appropriately updated to handle this new parameter order and that all its call sites have been adjusted accordingly.Verification successful
The change in the
prismaImport
function call parameters to(output, options)
has been correctly implemented across the codebase, as evidenced by the consistent usage in various files. This aligns with the goal of enhancing flexibility and clarity in handling options and confirms that thegetPrismaClientImportSpec
function and its call sites have been appropriately updated.Scripts Executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for calls to the `getPrismaClientImportSpec` function to ensure they've been updated. ast-grep --lang typescript --pattern $'getPrismaClientImportSpec($_, $_)'Length of output: 1510
packages/plugins/openapi/src/rpc-generator.ts (1)
- 92-92: The modification to the
generate
method to return an object with awarnings
property instead of directly returningthis.warnings
is a positive change. It encapsulates the warnings within a structured response, allowing for potential expansion of the return value in the future without breaking changes. This approach enhances the method's flexibility and maintainability.packages/schema/src/plugins/zod/transformer.ts (5)
- 3-3: Importing
PluginOptions
is a good practice as it ensures type safety and clarity when dealing with plugin options. This aligns with the PR's objective to enhance configurability and robustness in plugin systems.- 30-30: Changing
private zmodel: Model
topublic sourceFiles: SourceFile[] = []
shifts the class's responsibility from handling a single model to managing multiple source files. This change supports the PR's goal of enhancing the logical Prisma client generation process by allowing theTransformer
class to directly manage the output files it generates. However, ensure that this change does not introduce unintended side effects in how models are handled elsewhere in the codebase.Verification successful
The verification process has confirmed the presence of the
sourceFiles
property within theTransformer
class in the Zod plugin, as indicated in the review comment. This property is utilized to store generated source files, aligning with the enhancement of the logical Prisma client generation process by managing multiple source files directly within theTransformer
class. The change from handling a single model to managing multiple source files is evident and supports the PR's goal. No issues were found related to this change in the provided context and the additional verification steps.* 82-82: Updating method signatures to include `options: PluginOptions` enhances the flexibility and configurability of the `Transformer` class. This change allows for more granular control over the behavior of these methods, aligning with the PR's objectives to improve plugin systems. Ensure that all calls to these methods have been updated to pass the necessary `options` parameter.Scripts Executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Verify that the removal of `Model` and the introduction of `SourceFile[]` do not negatively impact other parts of the code. # This script is a placeholder for the actual verification process, which should be conducted manually or through automated tests. echo "Manual verification needed to ensure that the change from Model to SourceFile[] does not introduce issues."Length of output: 222
Script:
#!/bin/bash # Search for references to `Model` to understand its usage before the change. rg "Model" --vimgrep # Search for references to `zmodel` to see if it's still being referenced after the supposed removal. rg "zmodel" --vimgrep # Search for references to `sourceFiles` to understand how the new array is intended to be used. rg "sourceFiles" --vimgrepLength of output: 321637
Also applies to: 260-260, 291-291, 390-390
- 394-394: The use of
options.noUncheckedInput
to determine whether to generate unchecked Prisma input types is a good example of leveraging the newly introducedPluginOptions
to customize plugin behavior. This approach enhances the flexibility of theTransformer
class and aligns with the PR's goal of improving configurability.- 60-70: Pushing created
SourceFile
instances to thesourceFiles
array instead of handling them individually aligns with the change to manage multiple source files. This approach facilitates better tracking and management of generated files, supporting the PR's objective of enhancing the logical Prisma client generation process. Ensure that thesourceFiles
array is properly utilized downstream to handle the generated files.Also applies to: 88-88, 675-683
packages/plugins/openapi/src/rest-generator.ts (1)
- 79-79: The modification to the
generate
method to return an object with awarnings
property enhances flexibility and maintainability. Ensure that all usages of this method across the project have been updated to handle the new return type.tests/integration/tests/enhancements/with-delegate/enhanced-client.test.ts (1)
- 5-5: The replacement of a hardcoded schema with the
POLYMORPHIC_SCHEMA
constant is a positive change for maintainability and reusability. Ensure thatPOLYMORPHIC_SCHEMA
is correctly defined and accessible within the test environment.
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | ||
try { | ||
// run 'prisma generate' | ||
await execPackage(generateCmd, { stdio: 'ignore' }); | ||
} catch { | ||
await trackPrismaSchemaError(logicalPrismaFile); | ||
try { | ||
// run 'prisma generate' again with output to the console | ||
await execPackage(generateCmd); | ||
} catch { | ||
// noop | ||
} | ||
throw new PluginError(name, `Failed to run "prisma generate"`); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The error handling in the generateLogicalPrisma
function could be improved. Currently, if the first attempt to run prisma generate
fails, it tries again and then throws a PluginError
without providing much context about the failure. Consider enhancing the error handling to provide more detailed information about the failure to aid in debugging.
- throw new PluginError(name, `Failed to run "prisma generate"`);
+ throw new PluginError(name, `Failed to run "prisma generate". Please check the Prisma schema file for errors.`);
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | |
try { | |
// run 'prisma generate' | |
await execPackage(generateCmd, { stdio: 'ignore' }); | |
} catch { | |
await trackPrismaSchemaError(logicalPrismaFile); | |
try { | |
// run 'prisma generate' again with output to the console | |
await execPackage(generateCmd); | |
} catch { | |
// noop | |
} | |
throw new PluginError(name, `Failed to run "prisma generate"`); | |
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | |
try { | |
// run 'prisma generate' | |
await execPackage(generateCmd, { stdio: 'ignore' }); | |
} catch { | |
await trackPrismaSchemaError(logicalPrismaFile); | |
try { | |
// run 'prisma generate' again with output to the console | |
await execPackage(generateCmd); | |
} catch { | |
// noop | |
} | |
throw new PluginError(name, `Failed to run "prisma generate". Please check the Prisma schema file for errors.`); |
// save ts files if requested explicitly or the user provided | ||
const preserveTsFiles = options.preserveTsFiles === true || !!options.output; | ||
if (preserveTsFiles) { | ||
await sf.save(); | ||
} |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The addition of a conditional block to save TypeScript files based on options.preserveTsFiles
and options.output
is a thoughtful enhancement. However, ensure that user input used in file paths, especially in options.output
, is properly sanitized or validated to prevent path traversal vulnerabilities.
+ import { sanitizePath } from 'path-utils'; // Assuming 'path-utils' is a utility library for path sanitization
...
- const sf = project.createSourceFile(path.join(output, 'policy.ts'), undefined, { overwrite: true });
+ const safeOutput = sanitizePath(output); // Sanitize the output path
+ const sf = project.createSourceFile(path.join(safeOutput, 'policy.ts'), undefined, { overwrite: true });
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
// save ts files if requested explicitly or the user provided | |
const preserveTsFiles = options.preserveTsFiles === true || !!options.output; | |
if (preserveTsFiles) { | |
await sf.save(); | |
} | |
// save ts files if requested explicitly or the user provided | |
+ import { sanitizePath } from 'path-utils'; // Assuming 'path-utils' is a utility library for path sanitization | |
const preserveTsFiles = options.preserveTsFiles === true || !!options.output; | |
if (preserveTsFiles) { | |
+ const safeOutput = sanitizePath(output); // Sanitize the output path | |
+ const sf = project.createSourceFile(path.join(safeOutput, 'policy.ts'), undefined, { overwrite: true }); | |
await sf.save(); | |
} |
warnings.push(`Unable to find mapping for model ${dataModel.name}`); | ||
return; | ||
} | ||
generateModelHooks(target, version, project, outDir, dataModel, mapping); | ||
generateModelHooks(target, version, project, outDir, dataModel, mapping, options); | ||
}); | ||
|
||
await saveProject(project); | ||
return warnings; | ||
return { warnings }; | ||
} | ||
|
||
function generateQueryHook( |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [46-46]
Potential security risk: Ensure that user input passed to path.join
or path.resolve
is properly sanitized or validated to prevent path traversal vulnerabilities.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [295-295]
Potential security risk: Ensure that user input passed to path.join
or path.resolve
is properly sanitized or validated to prevent path traversal vulnerabilities.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [531-531]
Potential security risk: Ensure that user input passed to path.join
or path.resolve
is properly sanitized or validated to prevent path traversal vulnerabilities.
warnings.push(`Unable to find mapping for model ${dataModel.name}`); | ||
return; | ||
} | ||
generateModelHooks(project, outDir, dataModel, mapping, legacyMutations); | ||
generateModelHooks(project, outDir, dataModel, mapping, legacyMutations, options); | ||
}); | ||
|
||
await saveProject(project); | ||
return warnings; | ||
return { warnings }; | ||
} | ||
|
||
function generateModelHooks( | ||
project: Project, | ||
outDir: string, | ||
model: DataModel, | ||
mapping: DMMF.ModelMapping, | ||
legacyMutations: boolean | ||
legacyMutations: boolean, | ||
options: PluginOptions | ||
) { | ||
const fileName = paramCase(model.name); | ||
const sf = project.createSourceFile(path.join(outDir, `${fileName}.ts`), undefined, { overwrite: true }); | ||
|
||
sf.addStatements('/* eslint-disable */'); | ||
|
||
const prismaImport = getPrismaClientImportSpec(model.$container, outDir); | ||
const prismaImport = getPrismaClientImportSpec(outDir, options); | ||
sf.addImportDeclaration({ | ||
namedImports: ['Prisma'], | ||
isTypeOnly: true, |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [40-40]
Detected potential security risk related to path traversal vulnerabilities due to user input being used in path.join
or path.resolve
functions. Ensure that the outDir
and other path-related inputs are properly sanitized or validated to prevent unauthorized file system access.
- let outDir = requireOption<string>(options, 'output', name);
+ let outDir = sanitizePath(requireOption<string>(options, 'output', name));
Consider implementing a sanitizePath
function that checks and cleans the input path to mitigate this risk.
Also applies to: 68-68, 279-279
@@ -86,7 +87,8 @@ function createAppRouter( | |||
generateModelActions: string[] | undefined, | |||
generateClientHelpers: string[] | undefined, | |||
zmodel: Model, | |||
zodSchemasImport: string | |||
zodSchemasImport: string, | |||
options: PluginOptions | |||
) { | |||
const indexFile = path.resolve(outDir, 'routers', `index.ts`); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Detected potential security risk related to path traversal vulnerabilities. Ensure that user input influencing file paths, such as outputDir
and other parameters, is properly sanitized or validated to prevent unauthorized access to the file system. Consider implementing strict validation rules or using a whitelist approach for allowed paths.
Also applies to: 213-213, 222-222, 230-230, 247-247, 327-327
// save ts files if requested explicitly or the user provided | ||
const preserveTsFiles = options.preserveTsFiles === true || !!options.output; | ||
await generateModelMeta(project, dataModels, { | ||
output: outFile, | ||
generateAttributes: true, | ||
preserveTsFiles, |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The introduction of preserveTsFiles
based on options is a good addition for flexibility. However, ensure that any paths derived from user options, especially when used in functions like path.join
, are properly sanitized or validated to prevent potential security vulnerabilities such as path traversal.
// a logical client is generated | ||
if (typeof options.output === 'string') { | ||
// `options.output` is either relative to zmodel path or absolute | ||
prismaClientPath = path.join(options.output, 'prisma'); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Potential security issue: The use of path.join
with options.output
could lead to a path traversal vulnerability if options.output
is user-controlled and not properly sanitized. Ensure that options.output
is validated or sanitized before use.
prismaClientPath = clientOutput; | ||
} else { | ||
// first get absolute path based on prisma schema location | ||
const absPath = path.resolve(path.dirname(output), clientOutput); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Potential security issue: The use of path.resolve
with user-provided or package.json-defined paths could lead to a path traversal vulnerability if these inputs are not properly sanitized. Ensure that these inputs are validated or sanitized before use.
private async handleGeneratorOutputValue(output: string) { | ||
// create the output directory and delete contents that might exist from a previous run | ||
await fs.mkdir(output, { recursive: true }); | ||
const isRemoveContentsOnly = true; | ||
await removeDir(output, isRemoveContentsOnly); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The handleGeneratorOutputValue
method correctly prepares the output directory. However, it involves user input in file system operations, which could lead to path traversal vulnerabilities. Ensure that user input is properly sanitized or validated before use.
- await removeDir(output, isRemoveContentsOnly);
+ // Ensure output is sanitized or validated to prevent path traversal vulnerabilities
+ await removeDir(sanitizePath(output), isRemoveContentsOnly);
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
private async handleGeneratorOutputValue(output: string) { | |
// create the output directory and delete contents that might exist from a previous run | |
await fs.mkdir(output, { recursive: true }); | |
const isRemoveContentsOnly = true; | |
await removeDir(output, isRemoveContentsOnly); | |
private async handleGeneratorOutputValue(output: string) { | |
// create the output directory and delete contents that might exist from a previous run | |
await fs.mkdir(output, { recursive: true }); | |
const isRemoveContentsOnly = true; | |
// Ensure output is sanitized or validated to prevent path traversal vulnerabilities | |
await removeDir(sanitizePath(output), isRemoveContentsOnly); |
})`; | ||
prismaUpdateSchema = this.makePartial(prismaUpdateSchema); | ||
if (refineFuncName) { | ||
prismaUpdateSchema = `${refineFuncName}(${prismaUpdateSchema})`; | ||
} | ||
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaUpdateSchema = ${prismaUpdateSchema};`); | ||
|
||
//////////////////////////////////////////////// | ||
// 3. Create schema | ||
//////////////////////////////////////////////// | ||
let createSchema = 'baseSchema'; | ||
const fieldsWithDefault = scalarFields.filter( | ||
(field) => hasAttribute(field, '@default') || hasAttribute(field, '@updatedAt') || field.type.array | ||
); | ||
} | ||
if (fieldsWithDefault.length > 0) { | ||
createSchema = this.makePartial( | ||
createSchema, | ||
fieldsWithDefault.map((f) => f.name) | ||
); | ||
} | ||
|
||
if (fkSchema) { | ||
// export schema with only scalar fields | ||
const createScalarSchema = `${upperCaseFirst(model.name)}CreateScalarSchema`; | ||
writer.writeLine(`export const ${createScalarSchema} = ${createSchema};`); | ||
if (fkSchema) { | ||
// export schema with only scalar fields | ||
const createScalarSchema = `${upperCaseFirst(model.name)}CreateScalarSchema`; | ||
writer.writeLine(`export const ${createScalarSchema} = ${createSchema};`); | ||
|
||
// merge fk fields | ||
createSchema = makeMerge(createScalarSchema, fkSchema); | ||
} | ||
// merge fk fields | ||
createSchema = this.makeMerge(createScalarSchema, fkSchema); | ||
} | ||
|
||
if (refineFuncName) { | ||
// export a schema without refinement for extensibility | ||
const noRefineSchema = `${upperCaseFirst(model.name)}CreateWithoutRefineSchema`; | ||
writer.writeLine(`export const ${noRefineSchema} = ${createSchema};`); | ||
createSchema = `${refineFuncName}(${noRefineSchema})`; | ||
} | ||
writer.writeLine(`export const ${upperCaseFirst(model.name)}CreateSchema = ${createSchema};`); | ||
if (refineFuncName) { | ||
// export a schema without refinement for extensibility | ||
const noRefineSchema = `${upperCaseFirst(model.name)}CreateWithoutRefineSchema`; | ||
writer.writeLine(`export const ${noRefineSchema} = ${createSchema};`); | ||
createSchema = `${refineFuncName}(${noRefineSchema})`; | ||
} | ||
writer.writeLine(`export const ${upperCaseFirst(model.name)}CreateSchema = ${createSchema};`); | ||
|
||
//////////////////////////////////////////////// | ||
// 3. Update schema | ||
//////////////////////////////////////////////// | ||
let updateSchema = makePartial('baseSchema'); | ||
//////////////////////////////////////////////// | ||
// 3. Update schema | ||
//////////////////////////////////////////////// | ||
let updateSchema = this.makePartial('baseSchema'); | ||
|
||
if (fkSchema) { | ||
// export schema with only scalar fields | ||
const updateScalarSchema = `${upperCaseFirst(model.name)}UpdateScalarSchema`; | ||
writer.writeLine(`export const ${updateScalarSchema} = ${updateSchema};`); | ||
updateSchema = updateScalarSchema; | ||
if (fkSchema) { | ||
// export schema with only scalar fields | ||
const updateScalarSchema = `${upperCaseFirst(model.name)}UpdateScalarSchema`; | ||
writer.writeLine(`export const ${updateScalarSchema} = ${updateSchema};`); | ||
updateSchema = updateScalarSchema; | ||
|
||
// merge fk fields | ||
updateSchema = makeMerge(updateSchema, makePartial(fkSchema)); | ||
} | ||
// merge fk fields | ||
updateSchema = this.makeMerge(updateSchema, this.makePartial(fkSchema)); | ||
} | ||
|
||
if (refineFuncName) { | ||
// export a schema without refinement for extensibility | ||
const noRefineSchema = `${upperCaseFirst(model.name)}UpdateWithoutRefineSchema`; | ||
writer.writeLine(`export const ${noRefineSchema} = ${updateSchema};`); | ||
updateSchema = `${refineFuncName}(${noRefineSchema})`; | ||
} | ||
writer.writeLine(`export const ${upperCaseFirst(model.name)}UpdateSchema = ${updateSchema};`); | ||
}); | ||
if (refineFuncName) { | ||
// export a schema without refinement for extensibility | ||
const noRefineSchema = `${upperCaseFirst(model.name)}UpdateWithoutRefineSchema`; | ||
writer.writeLine(`export const ${noRefineSchema} = ${updateSchema};`); | ||
updateSchema = `${refineFuncName}(${noRefineSchema})`; | ||
} | ||
writer.writeLine(`export const ${upperCaseFirst(model.name)}UpdateSchema = ${updateSchema};`); | ||
}); | ||
|
||
return schemaName; | ||
} | ||
return schemaName; |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The generateModelSchema
method comprehensively generates a schema for a single model, covering scalar fields, relation fields, foreign key fields, and validation refinements. The approach is well-thought-out and effectively implemented. However, it involves user input in file system operations, which could lead to path traversal vulnerabilities. Ensure that user input is properly sanitized or validated before use.
- const sf = this.project.createSourceFile(path.join(output, 'models', `${schemaName}.ts`), undefined, {
+ // Ensure output is sanitized or validated to prevent path traversal vulnerabilities
+ const sf = this.project.createSourceFile(sanitizePath(path.join(output, 'models', `${schemaName}.ts`)), undefined, {
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
private async generateModelSchema(model: DataModel, output: string) { | |
const schemaName = `${upperCaseFirst(model.name)}.schema`; | |
const sf = this.project.createSourceFile(path.join(output, 'models', `${schemaName}.ts`), undefined, { | |
overwrite: true, | |
}); | |
this.sourceFiles.push(sf); | |
sf.replaceWithText((writer) => { | |
const scalarFields = model.fields.filter( | |
(field) => | |
// regular fields only | |
!isDataModel(field.type.reference?.ref) && !isForeignKeyField(field) | |
); | |
writer.writeLine('/* eslint-disable */'); | |
writer.writeLine(`import { z } from 'zod';`); | |
const relations = model.fields.filter((field) => isDataModel(field.type.reference?.ref)); | |
const fkFields = model.fields.filter((field) => isForeignKeyField(field)); | |
// import user-defined enums from Prisma as they might be referenced in the expressions | |
const importEnums = new Set<string>(); | |
for (const node of streamAllContents(model)) { | |
if (isEnumFieldReference(node)) { | |
const field = node.target.ref as EnumField; | |
if (!isFromStdlib(field.$container)) { | |
importEnums.add(field.$container.name); | |
writer.writeLine('/* eslint-disable */'); | |
writer.writeLine(`import { z } from 'zod';`); | |
// import user-defined enums from Prisma as they might be referenced in the expressions | |
const importEnums = new Set<string>(); | |
for (const node of streamAllContents(model)) { | |
if (isEnumFieldReference(node)) { | |
const field = node.target.ref as EnumField; | |
if (!isFromStdlib(field.$container)) { | |
importEnums.add(field.$container.name); | |
} | |
} | |
} | |
} | |
if (importEnums.size > 0) { | |
const prismaImport = getPrismaClientImportSpec(model.$container, path.join(output, 'models')); | |
writer.writeLine(`import { ${[...importEnums].join(', ')} } from '${prismaImport}';`); | |
} | |
if (importEnums.size > 0) { | |
const prismaImport = getPrismaClientImportSpec(path.join(output, 'models'), this.options); | |
writer.writeLine(`import { ${[...importEnums].join(', ')} } from '${prismaImport}';`); | |
} | |
// import enum schemas | |
const importedEnumSchemas = new Set<string>(); | |
for (const field of scalarFields) { | |
if (field.type.reference?.ref && isEnum(field.type.reference?.ref)) { | |
const name = upperCaseFirst(field.type.reference?.ref.name); | |
if (!importedEnumSchemas.has(name)) { | |
writer.writeLine(`import { ${name}Schema } from '../enums/${name}.schema';`); | |
importedEnumSchemas.add(name); | |
// import enum schemas | |
const importedEnumSchemas = new Set<string>(); | |
for (const field of scalarFields) { | |
if (field.type.reference?.ref && isEnum(field.type.reference?.ref)) { | |
const name = upperCaseFirst(field.type.reference?.ref.name); | |
if (!importedEnumSchemas.has(name)) { | |
writer.writeLine(`import { ${name}Schema } from '../enums/${name}.schema';`); | |
importedEnumSchemas.add(name); | |
} | |
} | |
} | |
} | |
// import Decimal | |
if (scalarFields.some((field) => field.type.type === 'Decimal')) { | |
writer.writeLine(`import { DecimalSchema } from '../common';`); | |
writer.writeLine(`import { Decimal } from 'decimal.js';`); | |
} | |
// base schema | |
writer.write(`const baseSchema = z.object(`); | |
writer.inlineBlock(() => { | |
scalarFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field, true)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
// relation fields | |
let relationSchema: string | undefined; | |
let fkSchema: string | undefined; | |
// import Decimal | |
if (scalarFields.some((field) => field.type.type === 'Decimal')) { | |
writer.writeLine(`import { DecimalSchema } from '../common';`); | |
writer.writeLine(`import { Decimal } from 'decimal.js';`); | |
} | |
if (relations.length > 0 || fkFields.length > 0) { | |
relationSchema = 'relationSchema'; | |
writer.write(`const ${relationSchema} = z.object(`); | |
// base schema | |
writer.write(`const baseSchema = z.object(`); | |
writer.inlineBlock(() => { | |
[...relations, ...fkFields].forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
scalarFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field, true)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
} | |
if (fkFields.length > 0) { | |
fkSchema = 'fkSchema'; | |
writer.write(`const ${fkSchema} = z.object(`); | |
writer.inlineBlock(() => { | |
fkFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
// relation fields | |
let relationSchema: string | undefined; | |
let fkSchema: string | undefined; | |
if (relations.length > 0 || fkFields.length > 0) { | |
relationSchema = 'relationSchema'; | |
writer.write(`const ${relationSchema} = z.object(`); | |
writer.inlineBlock(() => { | |
[...relations, ...fkFields].forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
}); | |
}); | |
}); | |
writer.writeLine(');'); | |
} | |
writer.writeLine(');'); | |
} | |
// compile "@@validate" to ".refine" | |
const refinements = makeValidationRefinements(model); | |
let refineFuncName: string | undefined; | |
if (refinements.length > 0) { | |
refineFuncName = `refine${upperCaseFirst(model.name)}`; | |
writer.writeLine( | |
`export function ${refineFuncName}<T, D extends z.ZodTypeDef>(schema: z.ZodType<T, D, T>) { return schema${refinements.join( | |
'\n' | |
)}; }` | |
); | |
} | |
if (fkFields.length > 0) { | |
fkSchema = 'fkSchema'; | |
writer.write(`const ${fkSchema} = z.object(`); | |
writer.inlineBlock(() => { | |
fkFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
} | |
//////////////////////////////////////////////// | |
// 1. Model schema | |
//////////////////////////////////////////////// | |
const fieldsWithoutDefault = scalarFields.filter((f) => !getFieldSchemaDefault(f)); | |
// mark fields without default value as optional | |
let modelSchema = makePartial( | |
'baseSchema', | |
fieldsWithoutDefault.length < scalarFields.length ? fieldsWithoutDefault.map((f) => f.name) : undefined | |
); | |
// compile "@@validate" to ".refine" | |
const refinements = makeValidationRefinements(model); | |
let refineFuncName: string | undefined; | |
if (refinements.length > 0) { | |
refineFuncName = `refine${upperCaseFirst(model.name)}`; | |
writer.writeLine( | |
`export function ${refineFuncName}<T, D extends z.ZodTypeDef>(schema: z.ZodType<T, D, T>) { return schema${refinements.join( | |
'\n' | |
)}; }` | |
); | |
} | |
// omit fields | |
const fieldsToOmit = scalarFields.filter((field) => hasAttribute(field, '@omit')); | |
if (fieldsToOmit.length > 0) { | |
modelSchema = makeOmit( | |
modelSchema, | |
fieldsToOmit.map((f) => f.name) | |
//////////////////////////////////////////////// | |
// 1. Model schema | |
//////////////////////////////////////////////// | |
const fieldsWithoutDefault = scalarFields.filter((f) => !getFieldSchemaDefault(f)); | |
// mark fields without default value as optional | |
let modelSchema = this.makePartial( | |
'baseSchema', | |
fieldsWithoutDefault.length < scalarFields.length ? fieldsWithoutDefault.map((f) => f.name) : undefined | |
); | |
} | |
if (relationSchema) { | |
// export schema with only scalar fields | |
const modelScalarSchema = `${upperCaseFirst(model.name)}ScalarSchema`; | |
writer.writeLine(`export const ${modelScalarSchema} = ${modelSchema};`); | |
modelSchema = modelScalarSchema; | |
// omit fields | |
const fieldsToOmit = scalarFields.filter((field) => hasAttribute(field, '@omit')); | |
if (fieldsToOmit.length > 0) { | |
modelSchema = this.makeOmit( | |
modelSchema, | |
fieldsToOmit.map((f) => f.name) | |
); | |
} | |
// merge relations | |
modelSchema = makeMerge(modelSchema, makePartial(relationSchema)); | |
} | |
if (relationSchema) { | |
// export schema with only scalar fields | |
const modelScalarSchema = `${upperCaseFirst(model.name)}ScalarSchema`; | |
writer.writeLine(`export const ${modelScalarSchema} = ${modelSchema};`); | |
modelSchema = modelScalarSchema; | |
// refine | |
if (refineFuncName) { | |
const noRefineSchema = `${upperCaseFirst(model.name)}WithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${modelSchema};`); | |
modelSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}Schema = ${modelSchema};`); | |
// merge relations | |
modelSchema = this.makeMerge(modelSchema, this.makePartial(relationSchema)); | |
} | |
//////////////////////////////////////////////// | |
// 2. Prisma create & update | |
//////////////////////////////////////////////// | |
// refine | |
if (refineFuncName) { | |
const noRefineSchema = `${upperCaseFirst(model.name)}WithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${modelSchema};`); | |
modelSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}Schema = ${modelSchema};`); | |
// schema for validating prisma create input (all fields optional) | |
let prismaCreateSchema = makePassthrough(makePartial('baseSchema')); | |
if (refineFuncName) { | |
prismaCreateSchema = `${refineFuncName}(${prismaCreateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaCreateSchema = ${prismaCreateSchema};`); | |
// schema for validating prisma update input (all fields optional) | |
// note numeric fields can be simple update or atomic operations | |
let prismaUpdateSchema = `z.object({ | |
${scalarFields | |
.map((field) => { | |
let fieldSchema = makeFieldSchema(field); | |
if (field.type.type === 'Int' || field.type.type === 'Float') { | |
fieldSchema = `z.union([${fieldSchema}, z.record(z.unknown())])`; | |
} | |
return `\t${field.name}: ${fieldSchema}`; | |
}) | |
.join(',\n')} | |
})`; | |
prismaUpdateSchema = makePartial(prismaUpdateSchema); | |
if (refineFuncName) { | |
prismaUpdateSchema = `${refineFuncName}(${prismaUpdateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaUpdateSchema = ${prismaUpdateSchema};`); | |
//////////////////////////////////////////////// | |
// 3. Create schema | |
//////////////////////////////////////////////// | |
let createSchema = 'baseSchema'; | |
const fieldsWithDefault = scalarFields.filter( | |
(field) => hasAttribute(field, '@default') || hasAttribute(field, '@updatedAt') || field.type.array | |
); | |
if (fieldsWithDefault.length > 0) { | |
createSchema = makePartial( | |
createSchema, | |
fieldsWithDefault.map((f) => f.name) | |
//////////////////////////////////////////////// | |
// 2. Prisma create & update | |
//////////////////////////////////////////////// | |
// schema for validating prisma create input (all fields optional) | |
let prismaCreateSchema = this.makePassthrough(this.makePartial('baseSchema')); | |
if (refineFuncName) { | |
prismaCreateSchema = `${refineFuncName}(${prismaCreateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaCreateSchema = ${prismaCreateSchema};`); | |
// schema for validating prisma update input (all fields optional) | |
// note numeric fields can be simple update or atomic operations | |
let prismaUpdateSchema = `z.object({ | |
${scalarFields | |
.map((field) => { | |
let fieldSchema = makeFieldSchema(field); | |
if (field.type.type === 'Int' || field.type.type === 'Float') { | |
fieldSchema = `z.union([${fieldSchema}, z.record(z.unknown())])`; | |
} | |
return `\t${field.name}: ${fieldSchema}`; | |
}) | |
.join(',\n')} | |
})`; | |
prismaUpdateSchema = this.makePartial(prismaUpdateSchema); | |
if (refineFuncName) { | |
prismaUpdateSchema = `${refineFuncName}(${prismaUpdateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaUpdateSchema = ${prismaUpdateSchema};`); | |
//////////////////////////////////////////////// | |
// 3. Create schema | |
//////////////////////////////////////////////// | |
let createSchema = 'baseSchema'; | |
const fieldsWithDefault = scalarFields.filter( | |
(field) => hasAttribute(field, '@default') || hasAttribute(field, '@updatedAt') || field.type.array | |
); | |
} | |
if (fieldsWithDefault.length > 0) { | |
createSchema = this.makePartial( | |
createSchema, | |
fieldsWithDefault.map((f) => f.name) | |
); | |
} | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const createScalarSchema = `${upperCaseFirst(model.name)}CreateScalarSchema`; | |
writer.writeLine(`export const ${createScalarSchema} = ${createSchema};`); | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const createScalarSchema = `${upperCaseFirst(model.name)}CreateScalarSchema`; | |
writer.writeLine(`export const ${createScalarSchema} = ${createSchema};`); | |
// merge fk fields | |
createSchema = makeMerge(createScalarSchema, fkSchema); | |
} | |
// merge fk fields | |
createSchema = this.makeMerge(createScalarSchema, fkSchema); | |
} | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}CreateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${createSchema};`); | |
createSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}CreateSchema = ${createSchema};`); | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}CreateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${createSchema};`); | |
createSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}CreateSchema = ${createSchema};`); | |
//////////////////////////////////////////////// | |
// 3. Update schema | |
//////////////////////////////////////////////// | |
let updateSchema = makePartial('baseSchema'); | |
//////////////////////////////////////////////// | |
// 3. Update schema | |
//////////////////////////////////////////////// | |
let updateSchema = this.makePartial('baseSchema'); | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const updateScalarSchema = `${upperCaseFirst(model.name)}UpdateScalarSchema`; | |
writer.writeLine(`export const ${updateScalarSchema} = ${updateSchema};`); | |
updateSchema = updateScalarSchema; | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const updateScalarSchema = `${upperCaseFirst(model.name)}UpdateScalarSchema`; | |
writer.writeLine(`export const ${updateScalarSchema} = ${updateSchema};`); | |
updateSchema = updateScalarSchema; | |
// merge fk fields | |
updateSchema = makeMerge(updateSchema, makePartial(fkSchema)); | |
} | |
// merge fk fields | |
updateSchema = this.makeMerge(updateSchema, this.makePartial(fkSchema)); | |
} | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}UpdateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${updateSchema};`); | |
updateSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}UpdateSchema = ${updateSchema};`); | |
}); | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}UpdateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${updateSchema};`); | |
updateSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}UpdateSchema = ${updateSchema};`); | |
}); | |
return schemaName; | |
} | |
return schemaName; | |
private async generateModelSchema(model: DataModel, output: string) { | |
const schemaName = `${upperCaseFirst(model.name)}.schema`; | |
// Ensure output is sanitized or validated to prevent path traversal vulnerabilities | |
const sf = this.project.createSourceFile(sanitizePath(path.join(output, 'models', `${schemaName}.ts`)), undefined, { | |
overwrite: true, | |
}); | |
this.sourceFiles.push(sf); | |
sf.replaceWithText((writer) => { | |
const scalarFields = model.fields.filter( | |
(field) => | |
// regular fields only | |
!isDataModel(field.type.reference?.ref) && !isForeignKeyField(field) | |
); | |
const relations = model.fields.filter((field) => isDataModel(field.type.reference?.ref)); | |
const fkFields = model.fields.filter((field) => isForeignKeyField(field)); | |
writer.writeLine('/* eslint-disable */'); | |
writer.writeLine(`import { z } from 'zod';`); | |
// import user-defined enums from Prisma as they might be referenced in the expressions | |
const importEnums = new Set<string>(); | |
for (const node of streamAllContents(model)) { | |
if (isEnumFieldReference(node)) { | |
const field = node.target.ref as EnumField; | |
if (!isFromStdlib(field.$container)) { | |
importEnums.add(field.$container.name); | |
} | |
} | |
} | |
if (importEnums.size > 0) { | |
const prismaImport = getPrismaClientImportSpec(path.join(output, 'models'), this.options); | |
writer.writeLine(`import { ${[...importEnums].join(', ')} } from '${prismaImport}';`); | |
} | |
// import enum schemas | |
const importedEnumSchemas = new Set<string>(); | |
for (const field of scalarFields) { | |
if (field.type.reference?.ref && isEnum(field.type.reference?.ref)) { | |
const name = upperCaseFirst(field.type.reference?.ref.name); | |
if (!importedEnumSchemas.has(name)) { | |
writer.writeLine(`import { ${name}Schema } from '../enums/${name}.schema';`); | |
importedEnumSchemas.add(name); | |
} | |
} | |
} | |
// import Decimal | |
if (scalarFields.some((field) => field.type.type === 'Decimal')) { | |
writer.writeLine(`import { DecimalSchema } from '../common';`); | |
writer.writeLine(`import { Decimal } from 'decimal.js';`); | |
} | |
// base schema | |
writer.write(`const baseSchema = z.object(`); | |
writer.inlineBlock(() => { | |
scalarFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field, true)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
// relation fields | |
let relationSchema: string | undefined; | |
let fkSchema: string | undefined; | |
if (relations.length > 0 || fkFields.length > 0) { | |
relationSchema = 'relationSchema'; | |
writer.write(`const ${relationSchema} = z.object(`); | |
writer.inlineBlock(() => { | |
[...relations, ...fkFields].forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
} | |
if (fkFields.length > 0) { | |
fkSchema = 'fkSchema'; | |
writer.write(`const ${fkSchema} = z.object(`); | |
writer.inlineBlock(() => { | |
fkFields.forEach((field) => { | |
writer.writeLine(`${field.name}: ${makeFieldSchema(field)},`); | |
}); | |
}); | |
writer.writeLine(');'); | |
} | |
// compile "@@validate" to ".refine" | |
const refinements = makeValidationRefinements(model); | |
let refineFuncName: string | undefined; | |
if (refinements.length > 0) { | |
refineFuncName = `refine${upperCaseFirst(model.name)}`; | |
writer.writeLine( | |
`export function ${refineFuncName}<T, D extends z.ZodTypeDef>(schema: z.ZodType<T, D, T>) { return schema${refinements.join( | |
'\n' | |
)}; }` | |
); | |
} | |
//////////////////////////////////////////////// | |
// 1. Model schema | |
//////////////////////////////////////////////// | |
const fieldsWithoutDefault = scalarFields.filter((f) => !getFieldSchemaDefault(f)); | |
// mark fields without default value as optional | |
let modelSchema = this.makePartial( | |
'baseSchema', | |
fieldsWithoutDefault.length < scalarFields.length ? fieldsWithoutDefault.map((f) => f.name) : undefined | |
); | |
// omit fields | |
const fieldsToOmit = scalarFields.filter((field) => hasAttribute(field, '@omit')); | |
if (fieldsToOmit.length > 0) { | |
modelSchema = this.makeOmit( | |
modelSchema, | |
fieldsToOmit.map((f) => f.name) | |
); | |
} | |
if (relationSchema) { | |
// export schema with only scalar fields | |
const modelScalarSchema = `${upperCaseFirst(model.name)}ScalarSchema`; | |
writer.writeLine(`export const ${modelScalarSchema} = ${modelSchema};`); | |
modelSchema = modelScalarSchema; | |
// merge relations | |
modelSchema = this.makeMerge(modelSchema, this.makePartial(relationSchema)); | |
} | |
// refine | |
if (refineFuncName) { | |
const noRefineSchema = `${upperCaseFirst(model.name)}WithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${modelSchema};`); | |
modelSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}Schema = ${modelSchema};`); | |
//////////////////////////////////////////////// | |
// 2. Prisma create & update | |
//////////////////////////////////////////////// | |
// schema for validating prisma create input (all fields optional) | |
let prismaCreateSchema = this.makePassthrough(this.makePartial('baseSchema')); | |
if (refineFuncName) { | |
prismaCreateSchema = `${refineFuncName}(${prismaCreateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaCreateSchema = ${prismaCreateSchema};`); | |
// schema for validating prisma update input (all fields optional) | |
// note numeric fields can be simple update or atomic operations | |
let prismaUpdateSchema = `z.object({ | |
${scalarFields | |
.map((field) => { | |
let fieldSchema = makeFieldSchema(field); | |
if (field.type.type === 'Int' || field.type.type === 'Float') { | |
fieldSchema = `z.union([${fieldSchema}, z.record(z.unknown())])`; | |
} | |
return `\t${field.name}: ${fieldSchema}`; | |
}) | |
.join(',\n')} | |
})`; | |
prismaUpdateSchema = this.makePartial(prismaUpdateSchema); | |
if (refineFuncName) { | |
prismaUpdateSchema = `${refineFuncName}(${prismaUpdateSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}PrismaUpdateSchema = ${prismaUpdateSchema};`); | |
//////////////////////////////////////////////// | |
// 3. Create schema | |
//////////////////////////////////////////////// | |
let createSchema = 'baseSchema'; | |
const fieldsWithDefault = scalarFields.filter( | |
(field) => hasAttribute(field, '@default') || hasAttribute(field, '@updatedAt') || field.type.array | |
); | |
if (fieldsWithDefault.length > 0) { | |
createSchema = this.makePartial( | |
createSchema, | |
fieldsWithDefault.map((f) => f.name) | |
); | |
} | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const createScalarSchema = `${upperCaseFirst(model.name)}CreateScalarSchema`; | |
writer.writeLine(`export const ${createScalarSchema} = ${createSchema};`); | |
// merge fk fields | |
createSchema = this.makeMerge(createScalarSchema, fkSchema); | |
} | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}CreateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${createSchema};`); | |
createSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}CreateSchema = ${createSchema};`); | |
//////////////////////////////////////////////// | |
// 3. Update schema | |
//////////////////////////////////////////////// | |
let updateSchema = this.makePartial('baseSchema'); | |
if (fkSchema) { | |
// export schema with only scalar fields | |
const updateScalarSchema = `${upperCaseFirst(model.name)}UpdateScalarSchema`; | |
writer.writeLine(`export const ${updateScalarSchema} = ${updateSchema};`); | |
updateSchema = updateScalarSchema; | |
// merge fk fields | |
updateSchema = this.makeMerge(updateSchema, this.makePartial(fkSchema)); | |
} | |
if (refineFuncName) { | |
// export a schema without refinement for extensibility | |
const noRefineSchema = `${upperCaseFirst(model.name)}UpdateWithoutRefineSchema`; | |
writer.writeLine(`export const ${noRefineSchema} = ${updateSchema};`); | |
updateSchema = `${refineFuncName}(${noRefineSchema})`; | |
} | |
writer.writeLine(`export const ${upperCaseFirst(model.name)}UpdateSchema = ${updateSchema};`); | |
}); | |
return schemaName; |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Review Status
Actionable comments generated: 8
Configuration used: CodeRabbit UI
Files selected for processing (4)
- packages/schema/src/plugins/enhancer/enhance/index.ts (6 hunks)
- packages/schema/src/plugins/enhancer/index.ts (2 hunks)
- packages/sdk/src/prisma.ts (1 hunks)
- packages/sdk/src/utils.ts (1 hunks)
Additional comments: 7
packages/schema/src/plugins/enhancer/enhance/index.ts (7)
- 1-1: The addition of the
DMMF
import from@prisma/generator-helper
is appropriate for accessing the Prisma Data Model Meta Format (DMMF), which is essential for generating the logical Prisma schema.- 9-9: The import of
PluginError
is crucial for improved error handling within the plugin system. This aligns with the PR's objective of enhancing error handling capabilities.- 13-13: The addition of the
fs
import is necessary for file system operations, such as reading and writing files, which are integral to generating and saving the logical Prisma schema and client.- 26-26: The import of
name
from the parent directory is likely used for error handling or logging purposes, ensuring that errors can be associated with the specific plugin or operation that failed.- 27-28: The imports of
execPackage
andtrackPrismaSchemaError
are essential for executing external commands (likeprisma generate
) and tracking errors related to the Prisma schema, respectively. These additions support the PR's focus on enhancing error handling and logical Prisma client generation.- 36-36: The declaration of the
dmmf
variable is a key addition for storing the DMMF document, which is crucial for processing and generating the logical Prisma client.- 40-40: The call to
generateLogicalPrisma
within thegenerate
function is a significant addition, as it encapsulates the logic for generating the logical Prisma schema based on the model and options provided. This modular approach enhances readability and maintainability.
// a logical client is generated | ||
if (typeof options.output === 'string') { | ||
// get the absolute path of the logical prisma client | ||
const prismaClientPathAbs = path.resolve(options.output, 'prisma'); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Detected a potential security issue with the use of path.resolve
incorporating user input from options.output
. This could lead to a path traversal vulnerability, allowing an attacker to access arbitrary files on the file system. It's crucial to ensure that user input is sanitized or validated before use.
- const prismaClientPathAbs = path.resolve(options.output, 'prisma');
+ // Ensure options.output is sanitized or validated to prevent path traversal
+ const prismaClientPathAbs = path.resolve(sanitizePath(options.output), 'prisma');
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
const prismaClientPathAbs = path.resolve(options.output, 'prisma'); | |
// Ensure options.output is sanitized or validated to prevent path traversal | |
const prismaClientPathAbs = path.resolve(sanitizePath(options.output), 'prisma'); |
// resolve the prisma client output path, which is relative to the prisma schema | ||
const resolvedPrismaClientOutput = path.resolve(prismaSchemaOutputDir, clientOutput); | ||
// resolve absolute path based on the zmodel file location | ||
const resolvedPrismaClientOutput = path.resolve(path.dirname(options.schemaPath), options.prismaClientPath); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Given the previous comment regarding the potential security issue with path.resolve
and user input, it's important to reiterate the need for sanitizing or validating user inputs to prevent path traversal vulnerabilities. Ensure that options.schemaPath
and options.prismaClientPath
are properly handled to avoid security risks.
- const resolvedPrismaClientOutput = path.resolve(path.dirname(options.schemaPath), options.prismaClientPath);
+ // Ensure options.schemaPath and options.prismaClientPath are sanitized or validated
+ const resolvedPrismaClientOutput = path.resolve(path.dirname(sanitizePath(options.schemaPath)), sanitizePath(options.prismaClientPath));
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
const resolvedPrismaClientOutput = path.resolve(path.dirname(options.schemaPath), options.prismaClientPath); | |
// Ensure options.schemaPath and options.prismaClientPath are sanitized or validated | |
const resolvedPrismaClientOutput = path.resolve(path.dirname(sanitizePath(options.schemaPath)), sanitizePath(options.prismaClientPath)); |
@@ -285,7 +285,7 @@ | |||
if (path.isAbsolute(_path)) { | |||
return _path; | |||
} else { | |||
return path.join(path.dirname(options.schemaPath), _path); | |||
return path.resolve(path.dirname(options.schemaPath), _path); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The use of path.resolve
with potentially user-controlled input in the resolvePath
function raises a security concern regarding path traversal vulnerabilities. It's essential to validate or sanitize user inputs to ensure the security of file system access.
- return path.resolve(path.dirname(options.schemaPath), _path);
+ // Ensure _path is sanitized or validated to prevent path traversal
+ return path.resolve(path.dirname(options.schemaPath), sanitizePath(_path));
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
return path.resolve(path.dirname(options.schemaPath), _path); | |
// Ensure _path is sanitized or validated to prevent path traversal | |
return path.resolve(path.dirname(options.schemaPath), sanitizePath(_path)); |
const prismaDts = project.createSourceFile( | ||
path.join(outDir, 'prisma.d.ts'), | ||
`export type * from '${logicalPrismaClientDir}/index-fixed';`, | ||
{ overwrite: true } | ||
); | ||
await saveSourceFile(prismaDts, options); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The logic for creating a reexport of the logical Prisma client by generating a prisma.d.ts
file is a crucial addition. However, the use of path.join
with logicalPrismaClientDir
derived from user input or configuration could potentially lead to a path traversal vulnerability. Ensure that logicalPrismaClientDir
is sanitized or validated to prevent malicious paths.
- path.join(outDir, 'prisma.d.ts'),
+ path.join(sanitizePath(outDir), 'prisma.d.ts'),
Please ensure to implement a sanitizePath
function that validates or sanitizes the path to mitigate potential security risks.
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
const prismaDts = project.createSourceFile( | |
path.join(outDir, 'prisma.d.ts'), | |
`export type * from '${logicalPrismaClientDir}/index-fixed';`, | |
{ overwrite: true } | |
); | |
await saveSourceFile(prismaDts, options); | |
const prismaDts = project.createSourceFile( | |
path.join(sanitizePath(outDir), 'prisma.d.ts'), | |
`export type * from '${logicalPrismaClientDir}/index-fixed';`, | |
{ overwrite: true } | |
); | |
await saveSourceFile(prismaDts, options); |
TypeAliasDeclaration, | ||
VariableStatement, | ||
} from 'ts-morph'; | ||
import { name } from '..'; | ||
import { execPackage } from '../../../utils/exec-utils'; | ||
import { trackPrismaSchemaError } from '../../prisma'; | ||
import { PrismaSchemaGenerator } from '../../prisma/schema-generator'; | ||
|
||
// information of delegate models and their sub models | ||
type DelegateInfo = [DataModel, DataModel[]][]; | ||
|
||
export async function generate(model: Model, options: PluginOptions, project: Project, outDir: string) { | ||
const outFile = path.join(outDir, 'enhance.ts'); | ||
let logicalPrismaClientDir: string | undefined; | ||
let dmmf: DMMF.Document | undefined; | ||
|
||
if (hasDelegateModel(model)) { | ||
logicalPrismaClientDir = await generateLogicalPrisma(model, options, outDir); | ||
// schema contains delegate models, need to generate a logical prisma schema | ||
const result = await generateLogicalPrisma(model, options, outDir); | ||
|
||
logicalPrismaClientDir = './.logical-prisma-client'; | ||
dmmf = result.dmmf; | ||
|
||
// create a reexport of the logical prisma client | ||
const prismaDts = project.createSourceFile( | ||
path.join(outDir, 'prisma.d.ts'), | ||
`export type * from '${logicalPrismaClientDir}/index-fixed';`, | ||
{ overwrite: true } | ||
); | ||
await saveSourceFile(prismaDts, options); | ||
} else { | ||
// just reexport the prisma client | ||
const prismaDts = project.createSourceFile( | ||
path.join(outDir, 'prisma.d.ts'), | ||
`export type * from '${getPrismaClientImportSpec(outDir, options)}';`, | ||
{ overwrite: true } | ||
); | ||
await saveSourceFile(prismaDts, options); | ||
} | ||
|
||
project.createSourceFile( | ||
outFile, | ||
const enhanceTs = project.createSourceFile( | ||
path.join(outDir, 'enhance.ts'), | ||
`import { createEnhancement, type EnhancementContext, type EnhancementOptions, type ZodSchemas } from '@zenstackhq/runtime'; | ||
import modelMeta from './model-meta'; | ||
import policy from './policy'; | ||
${options.withZodSchemas ? "import * as zodSchemas from './zod';" : 'const zodSchemas = undefined;'} | ||
import { Prisma } from '${getPrismaClientImportSpec(model, outDir)}'; | ||
${ | ||
logicalPrismaClientDir | ||
? `import type { PrismaClient as EnhancedPrismaClient } from '${logicalPrismaClientDir}/index-fixed';` | ||
: '' | ||
} | ||
import { Prisma } from '${getPrismaClientImportSpec(outDir, options)}'; | ||
${logicalPrismaClientDir ? `import { type PrismaClient } from '${logicalPrismaClientDir}/index-fixed';` : ``} | ||
|
||
export function enhance<DbClient extends object>(prisma: DbClient, context?: EnhancementContext, options?: EnhancementOptions) { | ||
return createEnhancement(prisma, { |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
📝 NOTE
This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [62-83]
The creation of the enhance.ts
source file with dynamic content based on the presence of a logical Prisma client directory is a key part of the PR's objective to enhance the logical Prisma client generation. However, similar to the previous comment, ensure that paths used with path.join
are sanitized or validated to prevent path traversal vulnerabilities.
- path.join(outDir, 'enhance.ts'),
+ path.join(sanitizePath(outDir), 'enhance.ts'),
Again, please ensure to implement a sanitizePath
function that validates or sanitizes the path to mitigate potential security risks.
// generate the prisma client | ||
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | ||
try { | ||
// run 'prisma generate' | ||
await execPackage(generateCmd, { stdio: 'ignore' }); | ||
} catch { | ||
await trackPrismaSchemaError(logicalPrismaFile); | ||
try { | ||
// run 'prisma generate' again with output to the console | ||
await execPackage(generateCmd); | ||
} catch { | ||
// noop | ||
} | ||
throw new PluginError(name, `Failed to run "prisma generate"`); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The error handling in the generateLogicalPrisma
function has been previously flagged for improvement. The updated error message is a step in the right direction, providing more context about the failure. However, consider logging the error details captured in the catch blocks to aid in debugging.
- // noop
+ console.error("Error during 'prisma generate':", e);
Adding error logging will provide developers with more information to troubleshoot issues effectively.
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
// generate the prisma client | |
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | |
try { | |
// run 'prisma generate' | |
await execPackage(generateCmd, { stdio: 'ignore' }); | |
} catch { | |
await trackPrismaSchemaError(logicalPrismaFile); | |
try { | |
// run 'prisma generate' again with output to the console | |
await execPackage(generateCmd); | |
} catch { | |
// noop | |
} | |
throw new PluginError(name, `Failed to run "prisma generate"`); | |
// generate the prisma client | |
const generateCmd = `prisma generate --schema "${logicalPrismaFile}" --no-engine`; | |
try { | |
// run 'prisma generate' | |
await execPackage(generateCmd, { stdio: 'ignore' }); | |
} catch { | |
await trackPrismaSchemaError(logicalPrismaFile); | |
try { | |
// run 'prisma generate' again with output to the console | |
await execPackage(generateCmd); | |
} catch (e) { | |
console.error("Error during 'prisma generate':", e); | |
} | |
throw new PluginError(name, `Failed to run "prisma generate"`); |
import type { DMMF } from '@prisma/generator-helper'; | ||
import { DELEGATE_AUX_RELATION_PREFIX } from '@zenstackhq/runtime'; | ||
import { | ||
getAttribute, | ||
getDataModels, | ||
getDMMF, | ||
getPrismaClientImportSpec, | ||
isDelegateModel, | ||
PluginError, | ||
type PluginOptions, | ||
} from '@zenstackhq/sdk'; | ||
import { DataModel, DataModelField, isDataModel, isReferenceExpr, type Model } from '@zenstackhq/sdk/ast'; | ||
import fs from 'fs'; | ||
import path from 'path'; | ||
import { | ||
FunctionDeclarationStructure, |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [311-313]
The dynamic construction of a regular expression using delegateModels
could potentially lead to a Regular Expression Denial-of-Service (ReDoS) vulnerability if the input is controlled by the user. Consider validating or limiting the input to known safe values or using hardcoded regexes where possible.
Consider performing input validation or using a regex checking/sanitization library to ensure the regex does not appear vulnerable to ReDoS.
📝 NOTE
This review was outside the diff hunks, and no overlapping diff hunk was found. Original lines [334-336]
Similar to the previous comment, the dynamic construction of a regular expression using delegateInfo
could potentially lead to a Regular Expression Denial-of-Service (ReDoS) vulnerability. Ensure that the input is validated or limited to known safe values.
Consider performing input validation or using a regex checking/sanitization library to ensure the regex does not appear vulnerable to ReDoS.
Summary by CodeRabbit