This repository has been archived by the owner on Oct 10, 2022. It is now read-only.

Testing organization #228

Open
boenrobot opened this issue Nov 14, 2019 · 4 comments
Labels
enhancement new feature or bug with easy workaround

Comments

@boenrobot
Contributor

boenrobot commented Nov 14, 2019

I've more than once been bitten in the ass by the way tests are done when doing PRs, and the fact that I can't really run them locally (because I don't have enough HDD space for a Docker VM, container, and images...) doesn't help either.

I think it would be cleaner if the tests were reorganized in a way that makes them easier to alter and to extend with new cases, without relying on parsing files with testing utilities.

To this end, I propose the following.

Have a folder structure like:

/test/schemas/{driver}/{schemaName}.sql
/test/configs/{testName-or-testGroup}/.tomg-config.json (here perhaps also include auxiliary files, like a custom naming strategy...)
/test/reference/{driver}/{testName-or-testGroup}/**
/test/output/{driver}/{testName-or-testGroup}/** (wiped and recreated on each test run)

The testing file would run each configuration and compare the output with the contents of the "reference" folder for that same combination. The config would be augmented with credentials from env variables, so that one doesn't have to alter every config when things change. If a config specifies a driver, only that driver would be generated and expected in the reference folder; otherwise, every enabled driver for which a reference exists would run. If a reference is missing, a warning could be emitted, I guess.
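For illustration, here is a minimal sketch of such a runner, assuming the layout above; the env variable scheme (TOMG_{DRIVER}_*) and the driver list are my own placeholders, not existing project code:

```typescript
// Minimal runner sketch. The folder layout follows the proposal above;
// the env variable scheme (TOMG_{DRIVER}_*) is a hypothetical placeholder.
import * as fs from "fs";
import * as path from "path";

interface TomgConfig {
    driver?: string;
    [key: string]: unknown;
}

const drivers = ["mssql", "postgres", "mysql", "mariadb", "oracle", "sqlite"];

// Augment a config with credentials taken from env variables, so configs
// don't have to change when credentials do.
function withCredentials(config: TomgConfig, driver: string): TomgConfig {
    const prefix = `TOMG_${driver.toUpperCase()}_`;
    return {
        ...config,
        host: process.env[`${prefix}HOST`],
        user: process.env[`${prefix}USER`],
        password: process.env[`${prefix}PASSWORD`],
    };
}

for (const testName of fs.readdirSync(path.join("test", "configs"))) {
    const configPath = path.join("test", "configs", testName, ".tomg-config.json");
    const config = JSON.parse(fs.readFileSync(configPath, "utf8")) as TomgConfig;
    // If the config pins a driver, only that driver is expected; otherwise
    // run every driver that has a reference, warning when one is missing.
    const targets = config.driver ? [config.driver] : drivers;
    for (const driver of targets) {
        const referenceDir = path.join("test", "reference", driver, testName);
        if (!fs.existsSync(referenceDir)) {
            console.warn(`No reference for ${driver}/${testName} - skipping`);
            continue;
        }
        const effectiveConfig = withCredentials(config, driver);
        // ...generate into test/output/{driver}/{testName} using
        // effectiveConfig, then diff it against referenceDir.
    }
}
```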

To minimize whitespace-related differences, "prettier" could be applied to both the references and the output.
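For example (a sketch assuming prettier v2, where format() returns a string synchronously; in v3 it returns a Promise):

```typescript
// Normalize both sides before diffing, so formatting-only differences
// don't fail a test. Assumes prettier v2 (synchronous format()).
import * as prettier from "prettier";

function normalize(source: string): string {
    return prettier.format(source, { parser: "typescript" });
}

function sameModuloFormatting(generated: string, reference: string): boolean {
    return normalize(generated) === normalize(reference);
}
```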

Creating the schemas in CI could happen when files are added or updated in the schemas folder (Travis has env variables that help with that...), at which point the respective schema would be dropped if it exists and recreated. The data files could then be cached for the next build (or the container saved as a separate layer... I'm not sure what's more optimal for Docker and travis-ci...).

The Docker container would merely contain the installed DBs and set envs for the credentials. When running locally, one would set envs for the DBs they have in an env file, similarly to now.

The key difference is that, to run without a container, one can just run the schemas from the schema folder in their SQL editor (or perhaps via a dedicated "one time only" script that takes the env into account). If exploring a scenario not covered by the existing schemas, a new minimal schema demonstrating the issue can be created. Equally importantly, if there is no reference for a config and driver combo, that should not be considered a test failure, allowing contributors to contribute tests only for the drivers they actually work with.
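As a rough illustration, such a "one time only" loader might look like the sketch below; the flat TOMG_* env names are hypothetical, and typeorm's pre-0.3 createConnection API is used just for illustration:

```typescript
// Hypothetical one-time schema loader; env variable names and paths follow
// the proposal above and are not existing project code.
import * as fs from "fs";
import * as path from "path";
import { createConnection, ConnectionOptions } from "typeorm";

async function loadSchemas(driver: string): Promise<void> {
    const options = {
        type: driver,
        host: process.env.TOMG_HOST,
        port: Number(process.env.TOMG_PORT),
        username: process.env.TOMG_USER,
        password: process.env.TOMG_PASSWORD,
    } as ConnectionOptions;
    const connection = await createConnection(options);
    try {
        const dir = path.join("test", "schemas", driver);
        for (const file of fs.readdirSync(dir).filter((f) => f.endsWith(".sql"))) {
            // Assumes each schema file drops/recreates its own objects and can
            // run as one batch (mysql would need multipleStatements enabled).
            await connection.query(fs.readFileSync(path.join(dir, file), "utf8"));
        }
    } finally {
        await connection.close();
    }
}

loadSchemas(process.argv[2] || "mysql").catch((err) => {
    console.error(err);
    process.exit(1);
});
```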

@Kononnable
Owner

As for Docker - you can test some of the drivers locally (not all of them) if you like. This should drastically reduce storage usage (and make running tests faster). It's fine most of the time; Travis will catch a bug if it is database specific.

I agree that using EntityFileToJson isn't ideal - it has a relatively high maintenance cost. However, keeping entity definitions in the form of typeorm files has its advantages. It allows one to visually assess whether the generated output is similar to the desired one, which wouldn't be possible with pure SQL queries. It also prevents using SQL statements which can't be reproduced in typeorm - someone might use their exact production schema even if typeorm (or the model generator) doesn't support some of its functionality (or put unnecessary information there, which would obfuscate the code). Later on, when such functionality is introduced, this might be problematic. Finally, some entity models will generate different SQL statements depending on the SQL engine version, e.g. CreateDateColumn on mysql <5.6 and >5.6.

> The key difference is that, to run without a container, one can just run the schemas from the schema folder in their SQL editor (or perhaps via a dedicated "one time only" script that takes the env into account).

You can achieve the same thing right now using npm t -- --grep testName

> allowing contributors to contribute tests only for the drivers they actually work with.

It works this way for a reason. You can still enable a test only on some of the drivers (in case it is driver-specific functionality), but you have to do it explicitly. Most of the time, a feature can be implemented the same way on each of the database engines. The current approach 'forces' implementation of a feature on all supported drivers, which makes the feature supported by every driver, not only the specific one actually used by the person implementing it. Without it, after some time we would have some features working only on specific db engines (while all of them could be supported), or different implementations depending on the database driver (and a higher probability of bugs).

PS: In the future, tests will also include a check that the generated code doesn't require syncing the schema before it is used for the first time. But this won't be implemented until typeorm's schema sync issues are resolved.

@boenrobot
Contributor Author

boenrobot commented Nov 18, 2019

> It allows one to visually assess whether the generated output is similar to the desired one, which wouldn't be possible with pure SQL queries.

The "reference" folder is supposed to contain the exact expected generated code, thus enabling this visual check.

> It also prevents using SQL statements which can't be reproduced in typeorm - someone might use their exact production schema even if typeorm (or the model generator) doesn't support some of its functionality (or put unnecessary information there, which would obfuscate the code).

True. However, it might be worth checking how the generator would handle such unsupported features.

> Later on, when such functionality is introduced, this might be problematic. Finally, some entity models will generate different SQL statements depending on the SQL engine version, e.g. CreateDateColumn on mysql <5.6 and >5.6.

The job of the generator is to take an existing DB and turn it into entities, not to turn entities into SQL statements. That said, I see your point about version differences...

In the setup above, this could be solved by an extra ts file in the config and schema folders, exporting a function that receives the TypeORM version, DB driver version, and DB server version, and returns a Promise telling the main testing script whether to skip the current test. If absent, the test would be assumed unskippable on all drivers, regardless of any version constraints - the typical case. Come to think of it, with such a file in place, the driver in each .tomg-config.json could be ignored, forcing tests to be engine and version agnostic by default.
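For instance, such a file might look like the sketch below; the exported signature and the VersionInfo shape are assumptions rather than an existing API, with the mysql <5.6 case mirroring the CreateDateColumn example above:

```typescript
// Hypothetical per-test skip module; the signature and VersionInfo shape
// are assumptions, not an existing project API.
interface VersionInfo {
    typeormVersion: string;
    driverVersion: string;
    serverVersion: string;
}

// Example: skip on mysql servers older than 5.6, where CreateDateColumn
// generates different SQL (see the discussion above).
export default async function shouldSkip(
    driver: string,
    versions: VersionInfo
): Promise<boolean> {
    if (driver !== "mysql") {
        return false;
    }
    const [major, minor] = versions.serverVersion.split(".").map(Number);
    return major < 5 || (major === 5 && minor < 6);
}
```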

If a test by design produces slightly different output depending on a version and/or driver, a new config and reference can be created, each with the appropriate version constraints. If a certain schema is only supported on certain DB server versions, a ts file named after the schema would be required to clarify those version constraints; otherwise, the schema would be assumed supported on all DB server versions.

> It works this way for a reason. You can still enable a test only on some of the drivers (in case it is driver-specific functionality), but you have to do it explicitly.

With the above tweaks, that too would be addressed. The config would specify the schema name but no driver name, meaning a test would by default require the schema on all drivers. A contributor contributing a new schema would be forced to consider whether a new schema is even required for the test they're making, or whether editing all existing references and/or adding a backwards compatible option would be preferable.

Though I guess the process could be streamlined by changing the schema folder even further, to:

/test/schemas/{schemaName}/{driver}.sql <-- optional initial SQL definition (skip straight to entities if undefined; require at least this or an entities folder)
/test/schemas/{schemaName}/{driver}.ts <-- optional version constraints function (applies to all drivers if undefined)
/test/schemas/{schemaName}/entities/**.ts <-- optional engine-independent entities to be synced after the initial SQL definition (also applied if the sql file is undefined; require at least this or the sql file)

The entities to be synced could also serve as a form of driver-independent reference (in contrast to the actual "reference" folder, which is driver dependent by design).
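Assuming that layout, the part of the runner that resolves a schema's inputs could look roughly like this (all names and the error message are illustrative):

```typescript
// Illustrative resolution of one schema's inputs under the layout above.
import * as fs from "fs";
import * as path from "path";

interface SchemaInputs {
    sqlFile?: string; // optional initial SQL definition
    entitiesDir?: string; // optional engine-independent entities to sync
}

function resolveSchema(schemaName: string, driver: string): SchemaInputs {
    const base = path.join("test", "schemas", schemaName);
    const sqlFile = path.join(base, `${driver}.sql`);
    const entitiesDir = path.join(base, "entities");
    const inputs: SchemaInputs = {
        sqlFile: fs.existsSync(sqlFile) ? sqlFile : undefined,
        entitiesDir: fs.existsSync(entitiesDir) ? entitiesDir : undefined,
    };
    // At least one of the two inputs must exist for the schema to be usable.
    if (!inputs.sqlFile && !inputs.entitiesDir) {
        throw new Error(
            `${schemaName}: provide at least a ${driver}.sql file or an entities folder`
        );
    }
    return inputs;
}
```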

@boenrobot
Contributor Author

> You can achieve the same thing right now using npm t -- --grep testName

In the clone of the repo I have, running npm t -- --grep simple1-simple-entity or even just npm test gives me:

internal/modules/cjs/loader.js:895
    throw err;
    ^

Error: Cannot find module 'C:\Projects\github\typeorm-model-generator\node'
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:892:15)
    at Function.Module._load (internal/modules/cjs/loader.js:785:27)
    at Function.Module.runMain (internal/modules/cjs/loader.js:1143:12)
    at internal/main/run_main_module.js:16:11 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}

@Kononnable
Owner

Sorry for the delay.

Weird error message. It is working correctly on my machine (so it might be OS related).

After some consideration: using the described test organization (raw SQL queries) might be a better approach for the future, especially once we're able to generate models defined by entity schemas. Like every approach, it has its ups and downs.
However, it won't be implemented in the near future (0.4), so this won't be changed now (it will require some work to do correctly).

@Kononnable added the enhancement (new feature or bug with easy workaround) label on Dec 12, 2019