# Release Engineering Process
This document contains:
- Quality Assurance Surveillance Plan (QASP) and Implementation Details
- CI/CD Pipeline
- Code Review Checklist
## Quality Assurance Surveillance Plan (QASP) and Implementation Details

The following tables list QASP deliverables (from the RFQ Statement of Objectives) and other quality requirements. "Implementation" describes how each quality check will be implemented. "Implementation Status" records whether the check has yet been implemented in the CI/CD pipeline or elsewhere.
| Deliverable 1 | Accepted Features |
|---|---|
| Performance Standard(s) | At the beginning of each sprint, the Product Owner and development team will collaborate to define a set of user stories to be completed during the sprint. Acceptance criteria for each story will also be defined. The development team will deliver code and functionality to satisfy these user stories. |
| Acceptable Quality Level | Delivered code meets the acceptance criteria for each user story. Incomplete stories will be assessed and considered for inclusion in the next sprint. |
| Method of Assessment | Manual review |
| Due Date | Every sprint |
| Implementation | To do |
| Implementation Status | To do |
| Deliverable 2 | Tested Code |
|---|---|
| Performance Standard(s) | Code delivered under the order must have substantial test code coverage. A version-controlled GHE (GitHub Enterprise) repository holds the code that comprises products that will remain in the government domain. |
| Acceptable Quality Level | Minimum of 90% test coverage of all code. All areas of code are meaningfully tested. |
| Method of Assessment | Combination of manual review and automated testing |
| Due Date | Every sprint |
| Implementation | Automated, blocks merging. Code coverage is assessed by the test runners in both `api` and `frontend`. Both apps must pass the 90% threshold; otherwise the build fails and merging is blocked. |
| Implementation Status | Not yet implemented. |
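A coverage gate like this is typically expressed directly in the test runner's configuration. Below is a minimal sketch, assuming Jest as the test runner (the QASP does not name one); any metric falling below 90% makes the run exit non-zero, which is what lets CI block the merge.

```ts
// jest.config.ts — hypothetical sketch; assumes Jest is the test runner
import type { Config } from "jest";

const config: Config = {
  collectCoverage: true,
  // Jest fails the run if any global metric drops below these values.
  coverageThreshold: {
    global: {
      branches: 90,
      functions: 90,
      lines: 90,
      statements: 90,
    },
  },
};

export default config;
```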
| Deliverable 3 | Properly Styled Code |
|---|---|
| Performance Standard(s) | GSA 18F Front-End Guide |
| Acceptable Quality Level | 0 linting errors and 0 warnings |
| Method of Assessment | Combination of manual review and automated testing using tools such as SonarQube or equivalent |
| Due Date | Every sprint |
| Implementation | Automated, blocks merging. TypeScript code style is enforced for `api` and `frontend` using `prettier`. A `.prettierignore` file holds exceptions and is manually reviewed during code review. |
| Implementation Status | Not yet implemented in Jenkins. Done in CircleCI. |
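In CI this kind of enforcement is usually a `prettier --check` invocation that exits non-zero on unformatted files. The sketch below shows the equivalent check through Prettier's Node API, in case the pipeline needs a programmatic hook; the file path and script name are illustrative, not part of the actual setup.

```ts
// format-check.ts — hypothetical sketch using Prettier's Node API
import { check, resolveConfig } from "prettier";
import { readFile } from "node:fs/promises";

// Returns true when the file already matches the resolved Prettier config.
// Files matched by .prettierignore should be filtered out before calling this.
async function isFormatted(path: string): Promise<boolean> {
  const source = await readFile(path, "utf8");
  const options = (await resolveConfig(path)) ?? {};
  return check(source, { ...options, filepath: path });
}

// Example: fail the build if a file is unformatted.
isFormatted("src/index.ts").then((ok) => process.exit(ok ? 0 : 1));
```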
| Deliverable 4 | Accessible |
|---|---|
| Performance Standard(s) | Web Content Accessibility Guidelines 2.1 AA standards |
| Acceptable Quality Level | 0 errors reported using an automated scanner and 0 errors reported in manual testing |
| Method of Assessment | Combined approach using automated and manual testing with tools equivalent to Accessibility Insights |
| Due Date | Every sprint |
| Implementation | Automated, blocks merging. `pa11y` tests every view/screen. Manual code review ensures that `pa11y` is set up to cover every view. |
| Implementation Status | Not yet implemented. Minimal implementation in CircleCI. |
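As a sketch of how per-view checks like this can be driven from the pa11y Node API, the snippet below scans a list of URLs against WCAG 2 AA and exits non-zero on any issue. The URL list is hypothetical; in practice it must be kept in sync with the application's full set of views (see the code review checklist below).

```ts
// a11y-check.ts — hypothetical sketch; assumes the pa11y Node API
import pa11y from "pa11y";

// Hypothetical list of views; the real config must cover every screen.
const urls = ["http://localhost:3000/", "http://localhost:3000/grants"];

async function main(): Promise<void> {
  let failed = false;
  for (const url of urls) {
    const results = await pa11y(url, { standard: "WCAG2AA" });
    if (results.issues.length > 0) {
      failed = true;
      console.error(`${url}: ${results.issues.length} issue(s)`);
      for (const issue of results.issues) {
        console.error(`  ${issue.code}: ${issue.message}`);
      }
    }
  }
  process.exit(failed ? 1 : 0); // non-zero exit blocks the merge
}

main();
```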
| Deliverable 5 | Deployed |
|---|---|
| Performance Standard(s) | Code must be successfully built and deployed into the staging environment. |
| Acceptable Quality Level | Successful build with a single command |
| Method of Assessment | Combination of manual review and automated testing |
| Due Date | Every sprint |
| Implementation | After a successful deploy into the staging environment, a manual and/or automated test suite (run with a web crawler) checks for functionality and regressions. Once these tests pass, the code may be deployed to production. |
| Implementation Status | Not yet implemented. |
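As a sketch of the post-deploy gate, the snippet below probes a few staging URLs and fails on any non-OK response. The host, routes, and the choice of a plain HTTP probe (rather than a full crawler) are assumptions for illustration only.

```ts
// smoke-test.ts — hypothetical post-deploy smoke test against staging
const staging = "https://staging.example.gov"; // placeholder host
const paths = ["/", "/grants", "/agreements"]; // illustrative routes

async function main(): Promise<void> {
  for (const path of paths) {
    const res = await fetch(staging + path);
    if (!res.ok) {
      console.error(`${path}: HTTP ${res.status}`);
      process.exit(1); // block promotion to production
    }
  }
  console.log("Smoke tests passed; OK to promote to production.");
}

main();
```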
| Deliverable 6 | Documented |
|---|---|
| Performance Standard(s) | Summary of user stories completed every two weeks. All dependencies are listed and the licenses are documented. Major functionality in the software/source code is documented, including system diagram and information. |
| Acceptable Quality Level | Combination of manual review and automated testing, if available |
| Method of Assessment | Manual review |
| Due Date | Every sprint |
| Implementation | To do |
| Implementation Status | To do |
| Deliverable 7 | Secure |
|---|---|
| Performance Standard(s) | NIST Security and Privacy Controls for Information Systems and Organizations 800-53 Rev 5 and Open Web Application Security Project (OWASP) Application Security Verification Standard 3.0 or equivalent |
| Acceptable Quality Level | Code submitted must be free of medium- and high-level static and dynamic security vulnerabilities |
| Method of Assessment | Clean tests resulting from a tool such as SonarQube or equivalent supported CI processes, along with documentation explaining any false positives |
| Due Date | Every sprint |
| Implementation | Automated, blocks merging. An OWASP scan reports any issues. |
| Implementation Status | The OWASP scan runs in Jenkins and blocks deploys, but not merges. Severity levels should be verified. |
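Since severity levels still need verification, a post-scan filter along these lines could enforce the "no medium or high findings" rule. This assumes the scan is OWASP ZAP and uses its JSON report layout (`site[].alerts[].riskcode`, where 2 = medium and 3 = high); both the tool and the field names should be confirmed against the actual pipeline.

```ts
// zap-gate.ts — hypothetical severity gate over an OWASP ZAP JSON report
import { readFile } from "node:fs/promises";

interface ZapAlert {
  alert: string;
  riskcode: string; // "0" info, "1" low, "2" medium, "3" high (assumed)
}

async function main(): Promise<void> {
  const report = JSON.parse(await readFile("zap-report.json", "utf8"));
  const alerts: ZapAlert[] = report.site.flatMap((s: any) => s.alerts);
  const blocking = alerts.filter((a) => Number(a.riskcode) >= 2);
  for (const a of blocking) {
    console.error(`[risk ${a.riskcode}] ${a.alert}`);
  }
  process.exit(blocking.length > 0 ? 1 : 0);
}

main();
```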
| Deliverable 8 | User research |
|---|---|
| Performance Standard(s) | Usability testing and other user research methods must be conducted at regular intervals throughout the development process (not just at the beginning or end). |
| Acceptable Quality Level | Research plans and artifacts from usability testing and/or other research methods with end users are available at the end of every applicable sprint, in accordance with the contractor's research plan. |
| Method of Assessment | FS will manually evaluate the artifacts based on a research plan provided by the contractor at the end of the second sprint and every applicable sprint thereafter. |
| Due Date | As needed |
| Implementation | To do |
| Implementation Status | To do |
| Deliverable 9 (Non-QASP) | Code quality |
|---|---|
| Performance Standard(s) | Merged code is considered high quality and is generally free of bugs and code smells. |
| Acceptable Quality Level | 0 bugs, and the code passes the Quality Gate. If possible, set at B-level quality or higher. |
| Method of Assessment | SonarQube static analysis scans for code quality. |
| Due Date | As needed |
| Implementation | Automated, blocks merging. SonarQube is set up in Jenkins and blocks merging if code does not pass the Quality Gate. Some code can be excluded from quality scans if project leadership deems it necessary. |
| Implementation Status | Not yet implemented. |
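SonarQube exposes the gate result over its web API, so a merge check can poll `/api/qualitygates/project_status` as sketched below. The project key and token handling are placeholders, not values from this project.

```ts
// quality-gate.ts — sketch of a SonarQube Quality Gate check via the web API
const host = process.env.SONAR_HOST_URL ?? "http://localhost:9000";
const token = process.env.SONAR_TOKEN ?? ""; // placeholder auth token
const projectKey = "nrm-grants-agreements"; // hypothetical project key

async function main(): Promise<void> {
  const res = await fetch(
    `${host}/api/qualitygates/project_status?projectKey=${projectKey}`,
    { headers: { Authorization: `Bearer ${token}` } },
  );
  const body = await res.json();
  // projectStatus.status is "OK" or "ERROR" per the SonarQube web API.
  if (body.projectStatus.status !== "OK") {
    console.error("Quality Gate failed; blocking merge.");
    process.exit(1);
  }
}

main();
```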
## CI/CD Pipeline

The CI/CD pipeline runs many of the automated checks specified by the QASP. See the CI/CD Pipeline page for more details, and see the CI/CD diagram on Mural.

Note: Some information may be duplicated and out of sync between this page and the CI/CD Pipeline page.
## Code Review Checklist

Not all checks can be run automatically in CI/CD. The following criteria should also be considered during code review, before approving and merging code.
- Does the code meet the provided acceptance criteria? (Deliverable 1)
- Does the code being submitted include proper tests? If appropriate, does it need unit tests or integration tests? (The overall code coverage check may pass even if the code being submitted isn't properly tested.) (Deliverable 2)
- Do the `pa11y` config URLs cover every view that needs to be tested? (Deliverable 4)
- What manual accessibility testing needs to be done? (Deliverable 4)
- Does this provide major functionality that needs to be documented and diagrammed? (Deliverable 6)
- Can related low-priority vulnerabilities be addressed, or low-priority dependencies be swapped out or updated? (Deliverable 7)
- Does usability testing need to occur on this feature before or after release? (Deliverable 8)
- Are there minor quality issues (e.g. code smells) that can be addressed within this sprint? Should a separate issue be opened for refactoring in the next sprint? (Deliverable 9)