Y24-327 - scRNA Core bulk test #1925
Comments
For recent UAT sessions, we have used the following integration suite branch: It allows specifying parameters such as the number of runs and the number of vac tubes per run. However, it only covers the usual 'scRNA Core entry point 1 - LRC Blood Vac tubes' scenario. All steps are in separate methods, so disabling calls to them in the main flow is easy. I am looking into
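As a rough illustration of the structure described above (not the actual branch code), an RSpec feature can keep the run parameters in `let` blocks and each pipeline stage in its own method, so individual calls can be commented out in the main flow. All names below are placeholders:

```ruby
# Hypothetical sketch of a parameterised integration spec; identifiers are assumptions.
RSpec.describe "scRNA Core entry point 1 - LRC Blood Vac tubes", type: :feature do
  let(:number_of_runs)    { 2 }  # how many banking runs to simulate
  let(:vac_tubes_per_run) { 12 } # LRC Blood Vac tubes created per run

  def create_vac_tubes(run_index)
    # generate `vac_tubes_per_run` tubes for this run (details omitted)
  end

  def bank_run(run_index)
    # drive the Blood Banking steps for one run (details omitted)
  end

  it "runs the configured number of banking runs" do
    number_of_runs.times do |run_index|
      create_vac_tubes(run_index)
      bank_run(run_index) # commenting out a call here skips that stage
    end
  end
end
```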
I have written a volume test which covers Blood Banking, cDNA Prep, and Library Prep. It can be configured using the `let` blocks. I have set the config to make 7 banking runs. From each run we pick 8 Seq and 2 Spare tubes, and we add 10 Input tubes on top of those, making 80 tubes going into pooling.
Local run:
Running it against the UAT and training environments needs the Sequencescape UAT action for randomised FluidX barcodes to be deployed (already merged into develop).
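For clarity, the configuration above works out to 7 × (8 + 2) + 10 = 80 tubes going into pooling. A minimal sketch of how such a configuration could be expressed with `let` blocks follows; the names are assumptions for illustration, not the actual spec's identifiers:

```ruby
# Illustrative only: volume-test configuration expressed as RSpec `let` blocks.
RSpec.describe "scRNA Core bulk volume test" do
  let(:number_of_banking_runs) { 7 }  # Blood Banking runs
  let(:seq_tubes_per_run)      { 8 }  # Seq tubes picked from each run
  let(:spare_tubes_per_run)    { 2 }  # Spare tubes picked from each run
  let(:extra_input_tubes)      { 10 } # additional Input tubes added before pooling

  # 7 * (8 + 2) + 10 = 80
  let(:tubes_into_pooling) do
    number_of_banking_runs * (seq_tubes_per_run + spare_tubes_per_run) + extra_input_tubes
  end

  it "feeds 80 tubes into pooling with this configuration" do
    expect(tubes_into_pooling).to eq(80)
  end
end
```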
The Sequencescape UAT action for generating FluidX barcodes has been deployed to UAT. I will run the bulk test against UAT using GitLab.
User story
As a user of the scRNA Core pipeline, I would like to be confident that the pipeline will work for the volume of samples and size of pools that I'm planning to put through the end-to-end test.
Who are the primary contacts for this story
Abby, Andrew, Katy
Who is the nominated tester for UAT
This doesn't need UAT itself; it just needs the test to pass in the UAT and/or training environments.
Acceptance criteria
To be considered successful, the solution must allow:
Dependencies
Development of the test is not blocked by the following; however, we need to check that it passes once they are done: