
Automated Integration Tests #30

Open · ClashLuke opened this issue May 9, 2022 · 0 comments

Labels: engineering (Software-engineering problems that don't require ML-Expertise), mlops

Comments

ClashLuke (Member) commented:
Currently, our codebase is untested, so manual evaluation is needed to determine whether a PR broke something or whether it is valid and ready to be merged. To avoid this effort, we could start a dedicated TPU that tries to overfit the model on a single batch. This would give us an easy sanity check that the model can run a forward pass, that it can learn, and that its gradients are correct.
This issue tracks the progress of implementing such automated testing infrastructure.
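The single-batch overfit check described above can be sketched in miniature. This is an illustrative sketch only, not the project's actual test code: it uses a hypothetical toy linear model in plain Python instead of the real model and TPU setup, but it exercises the same three properties the issue names (forward pass runs, loss drops when overfitting one batch, analytic gradients match finite differences).

```python
# Toy single-batch overfit sanity check (illustrative sketch, not the real codebase).
# Model: y = w * x + b, trained with MSE on one fixed batch.

def forward(w, b, xs):
    # Sanity check 1: the forward pass runs.
    return [w * x + b for x in xs]

def loss(w, b, xs, ys):
    preds = forward(w, b, xs)
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)

def grads(w, b, xs, ys):
    # Analytic MSE gradients for the toy model.
    n = len(xs)
    preds = forward(w, b, xs)
    dw = sum(2.0 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / n
    db = sum(2.0 * (p - y) for p, y in zip(preds, ys)) / n
    return dw, db

def grad_check(w, b, xs, ys, eps=1e-6, tol=1e-4):
    # Sanity check 3: analytic gradients agree with central finite differences.
    dw, db = grads(w, b, xs, ys)
    num_dw = (loss(w + eps, b, xs, ys) - loss(w - eps, b, xs, ys)) / (2 * eps)
    num_db = (loss(w, b + eps, xs, ys) - loss(w, b - eps, xs, ys)) / (2 * eps)
    return abs(dw - num_dw) < tol and abs(db - num_db) < tol

def overfit_one_batch(xs, ys, steps=500, lr=0.05):
    # Sanity check 2: plain gradient descent drives the single-batch loss down.
    w, b = 0.0, 0.0
    initial = loss(w, b, xs, ys)
    for _ in range(steps):
        dw, db = grads(w, b, xs, ys)
        w -= lr * dw
        b -= lr * db
    return initial, loss(w, b, xs, ys)

if __name__ == "__main__":
    # One fixed batch drawn from y = 2x + 1.
    xs, ys = [0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]
    assert grad_check(0.5, 0.5, xs, ys), "gradients disagree with finite differences"
    before, after = overfit_one_batch(xs, ys)
    assert after < 1e-3 * before, f"model failed to overfit: {before} -> {after}"
    print("sanity check passed")
```

In a CI job, the same structure would run against the real model on the dedicated TPU: failing any of the three assertions would flag the PR before merge.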

ClashLuke added the engineering and mlops labels on May 9, 2022
ma7dev self-assigned this on May 21, 2022
Projects: None yet
Development: No branches or pull requests

2 participants