Commit

Update based on PR comments
Signed-off-by: Ian Hoang <[email protected]>
Ian Hoang committed Mar 5, 2024
1 parent 34584e0 commit cb8bd0f
Showing 2 changed files with 49 additions and 39 deletions.
18 changes: 10 additions & 8 deletions MAINTAINERS_GUIDE.md
@@ -4,33 +4,35 @@

### Work on Current Issues and Create New Issues

* Maintainers can choose any issues to work on and should make new issues when needed. Maintainers should add issues that they are working on to the the `In Progress` column of this [roadmap board](https://github.com/opensearch-project/opensearch-benchmark/projects/1). Any issue added to the `In Progress` column should be properly labeled. For more information on how to properly label issues and PRs, see the [labels](#labels) section of this document.
* Maintainers can select any issues they would like to work on and should add them to the `In Progress` column of the [roadmap board](https://github.com/opensearch-project/opensearch-benchmark/projects/1). Any issue added to the `In Progress` column should be properly labeled. For more information on how to properly label issues and PRs, see the [labels](#labels) section of this document.

Maintainers should create new issues as needed.

### Triage Issues
* Maintainers should meet biweekly to triage issues. This involves assessing new, current, and old issues and prioritize them based on this [roadmap board](https://github.com/opensearch-project/opensearch-benchmark/projects/1).
* Maintainers should meet biweekly to triage issues. This involves assessing new, current, and old issues and prioritizing them based on the [roadmap board](https://github.com/opensearch-project/opensearch-benchmark/projects/1).

### Review Pull Requests

* Maintainers should review pull requests. Pull requests only require one maintainer to approve. The maintainer reviewing the PRs should be a subject matter expert (understand the context and purpose of the PR) and drive best practices in clean code and architecture.
* Maintainers should regularly review the backlog of pull requests. Pull requests only require one maintainer to approve. The maintainer reviewing the PRs should be a subject matter expert (understand the context and purpose of the PR) and drive best practices in clean code and architecture.

### Drive Releases
* Maintainers drive releases. A week prior to the scheduled release, maintainers should announce a code freeze in the [#performance channel](https://opensearch.slack.com/archives/C0516H8EJ7R) within the OpenSearch Slack community. For more information on releases, see the [release guide](<https://github.com/opensearch-project/OpenSearch-Benchmark/blob/main/RELEASE_GUIDE.md>)
* Maintainers drive releases. A week prior to the scheduled release, maintainers should announce a code freeze in the [#performance channel](https://opensearch.slack.com/archives/C0516H8EJ7R) within the OpenSearch Slack community. For more information on the release process, see the [release guide](https://github.com/opensearch-project/OpenSearch-Benchmark/blob/main/RELEASE_GUIDE.md).


## Labels

Here are a few suggestions on how to use labels.
Issues, pull requests and releases may be tagged with labels to categorize them. Here are some suggestions on how to use labels.

Priorities are set by Maintainers of the repository and should be put on specific issues and not all issues.
Priorities are set by Maintainers of the repository and should be assigned to a selected subset of issues and not all issues.

* **Low Priority** - Implementations and PRs should be reviewed and completed within a sprint
* **Medium Priority** - Implementations and PRs should be reviewed and completed within a week
* **High Priority** - Implementations and PRs should be reviewed and completed within a few days and up to a week


**Release Labels (vN.N.N)** - Repositories create consistent release labels, such as `v1.0.0`, `v1.1.0` and `v2.0.0`, as well as `patch` and `backport`. Use release labels to target an issue or a PR for a given release. See [MAINTAINERS](MAINTAINERS.md#triage-open-issues) for more information on triaging issues.
**Release Labels:** Releases are tagged with labels in the scheme `vN.N.N`, for instance `v1.0.0`, `v1.1.0` and `v2.0.0`, as well as `patch` and `backport`. Use release labels to target an issue or a PR for a given release. See [MAINTAINERS](MAINTAINERS.md#triage-open-issues) for more information on triaging issues.

The release process is standard across repositories in this org and is run by a release manager volunteering from amongst [maintainers](MAINTAINERS.md).
The release process is standard across repositories in this open-source project and is run by a release manager volunteering from amongst the [maintainers](MAINTAINERS.md).

**Request For Comments (RFC)** - This should only be applied to RFCs. These are automatically applied to the RFCs when they are published.

70 changes: 39 additions & 31 deletions RELEASE_GUIDE.md
@@ -1,37 +1,38 @@
# SOP: OpenSearch-Benchmark Release Guide

## Table of Contents
- [Overview](#overview)
- [Branches](#branches)
- [Release Branches](#release-branches)
- [Feature Branches](#feature-branches)
- [Release Labels](#release-labels)
- [Prerequisites](#prerequisites)
- [Release new version of OpenSearch Benchmark to PyPi, Docker, and ECR](#release-new-version-of-opensearch-benchmark-to-pypi-docker-and-ecr)
- [Release the new version of OpenSearch Benchmark to PyPI, Docker, and ECR](#release-the-new-version-of-opensearch-benchmark-to-pypi-docker-and-ecr)
- [Error Handling](#error-handling)

## Overview
This document explains the release strategy for artifacts in this organization.
This document explains the release strategy for artifacts in this project.

## Branches

### Release Branches

Given the current major release of 1.0, projects in this organization maintain the following active branches.
Given the current major release of 1.0, the OpenSearch Benchmark project maintains the following active branches.

* **main**: The next `1.x` release. This is the branch where all merges take place and code moves fast.
* **1.0**: The _current_ release. In between minor releases, only hotfixes (e.g. security) are backported to `1.0`.

Label PRs with the next major version label (e.g. `2.0.0`) and merge changes into `main`. Label PRs that you believe need to be backported as `1.x` and `1.0`. Backport PRs by checking out the versioned branch, cherry-pick changes and open a PR against each target backport branch.
Label PRs with the next major version label (e.g. `2.0.0`) and merge changes into `main`. Label PRs that you believe need to be backported as `1.x` and `1.0`. Backport PRs by checking out the versioned branch, cherry-picking changes and opening a PR against each target backport branch.
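The backport flow just described (check out the versioned branch, cherry-pick, open a PR) can be sketched as follows. This is a demonstration in a throwaway repository, not the real upstream repo; the branch and commit names are hypothetical:

```shell
# Demonstrate the backport flow in a scratch repository.
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git config user.email "[email protected]"; git config user.name "Dev"
echo base > file.txt; git add file.txt; git commit -qm "initial commit"
git branch 1.0                           # the versioned release branch
echo fix >> file.txt; git commit -qam "fix: a hotfix landed on main"
FIX_SHA=$(git rev-parse HEAD)            # the commit labeled for backport
# Backport: check out the versioned branch and cherry-pick the fix
git checkout -q 1.0
git checkout -q -b backport/1.0-hotfix   # PR branch targeting 1.0
git cherry-pick "$FIX_SHA"
git log --oneline                        # the hotfix now sits on top of 1.0
# A PR would now be opened from backport/1.0-hotfix against 1.0.
```

In the real flow, the cherry-pick targets the `1.x` or `1.0` branch of the upstream repository and a PR is opened against each target backport branch.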

### Feature Branches

Do not creating branches in the upstream repo, use your fork, for the exception of long lasting feature branches that require active collaboration from multiple developers. Name feature branches `feature/<thing>`. Once the work is merged to `main`, please make sure to delete the feature branch.
Do not create branches in the upstream repo; use your fork instead, with the exception of long-lasting feature branches that require active collaboration from multiple developers. Name feature branches `feature/<FEATURE>`. Once the feature branch is merged into `main`, please make sure to delete it.

## Release Labels

Repositories create consistent release labels, such as `v1.0.0`, `v1.1.0` and `v2.0.0`, as well as `patch` and `backport`. Use release labels to target an issue or a PR for a given release. See [MAINTAINERS](MAINTAINERS.md#triage-open-issues) for more information on triaging issues.
Releases are tagged with labels in the scheme `vN.N.N`, for instance `v1.0.0`, `v1.1.0` and `v2.0.0`, as well as `patch` and `backport`. Use release labels to target an issue or a PR for a given release. See [MAINTAINERS](MAINTAINERS.md#triage-open-issues) for more information on triaging issues.

The release process is standard across repositories in this org and is run by a release manager volunteering from amongst [maintainers](MAINTAINERS.md).
The release process is standard across repositories in this open-source project and is run by a release manager volunteering from amongst the [maintainers](MAINTAINERS.md).

## Prerequisites

@@ -40,23 +41,26 @@ The release process is standard across repositories in this org and is run by a
```
OpenSearch Benchmark release is scheduled for 1/25 and a code freeze will be put in place starting on 1/23.
```
* Ensure that version.txt matches the new release version before proceeding. If not, open a PR that updates the version in version.txt and merge it in before proceeding with the following steps. For example, if OSB is currently on version 0.3.0 and we want to release the next version as 0.4.0, update version.txt from 0.3.0 to 0.4.0.
* Ensure you have git cloned the official OpenSearch Benchmark repository with the ssh address on your local computer.
* Ensure that all new committed changes in OSB that are visible by users are added to documentation
* Ensure that version.txt matches the new release version before proceeding. If not, open a PR that updates the version in version.txt and merge it in before proceeding with the following steps. For example, if OSB is currently at version `0.3.0` and we want to release the next version as `0.4.0`, update `version.txt` from `0.3.0` to `0.4.0`.
* Ensure you have cloned the official OpenSearch Benchmark git repository using the SSH address.
* Ensure that all new committed changes in OSB that are visible to users are added to the documentation.
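The `version.txt` pre-check above can be sketched as a few lines of shell. This runs against a scratch directory rather than the real repository, and `0.4.0` is a stand-in for the intended release version:

```shell
set -eu
RELEASE_VERSION="0.4.0"             # the version about to be released (hypothetical)
tmp=$(mktemp -d); cd "$tmp"
echo "0.3.0" > version.txt          # simulate the repo's current version.txt
if [ "$(cat version.txt)" = "$RELEASE_VERSION" ]; then
  echo "version.txt matches $RELEASE_VERSION - OK to proceed"
else
  echo "version.txt is $(cat version.txt); open a PR updating it to $RELEASE_VERSION first"
fi
```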

## Release new version of OpenSearch Benchmark to PyPi, Docker, and ECR
## Release the new version of OpenSearch Benchmark to PyPI, Docker, and ECR

NOTE: The version number below is in semantic format, for instance, 1.2.0.
NOTE: The version number below is in semantic format, for instance, `1.2.0`.

1. Create a tag: `git tag <VERSION> main`
1. Ensure that this is done in the main official opensearch-benchmark repository
2. This should be the new version that you have in `version.txt`.

2. Push the tag: `git push origin <VERSION>`
1. This starts a workflow in Jenkins and creates an automated issue in the OSB repository. The issue needs to be commented on by a maintainer of the repository for the release process to proceed.
2. Example of automated issue opened by Jenkins Workflow
3. Maintainer needs to comment on Automated Issue: Once Maintainer has commented, the workflow uploads OSB to PyPi and OSB Dockerhub Staging account. Once the workflows are finished publishing OSB to PyPI and OSB Dockerhub staging account (verify here), person who pushed the tag should visit both PyPi and Dockerhub staging.
1. Check progress of release here in Jenkins console:: https://build.ci.opensearch.org/job/opensearch-benchmark-release/
1. If failed,
2. Example of automated issue opened by the Jenkins Workflow

3. A maintainer needs to comment on the automated issue: once the maintainer has commented, the workflow uploads OSB to PyPI and to the OSB Docker Hub staging account. Once the workflows have finished publishing, the maintainer who pushed the tag should visit both PyPI and Docker Hub staging and perform the following steps to verify that the artifacts were uploaded properly.
1. Check the progress of the release in the Jenkins console: https://build.ci.opensearch.org/job/opensearch-benchmark-release/
1. If it failed, inspect the logs.

2. Verify PyPI:
1. Download the OSB distribution build from PyPI: https://pypi.org/project/opensearch-benchmark/#files. This is a `wheel` file with the extension `.whl`.
2. Install it with `pip install`.
@@ -65,51 +69,55 @@ NOTE: The version number below is in semantic format, for instance, 1.2.0.
5. Run `opensearch-benchmark list workloads`
6. Run a basic workload on Linux and macOS: `opensearch-benchmark execute-test --workload pmc --test-mode`

3. Verify Dockerhub Staging OSB Image Works:
3. Verify Docker Hub Staging OSB Image Works:
1. The staging images are at https://hub.docker.com/r/opensearchstaging/opensearch-benchmark/tags.
2. Pull the latest image: `docker pull opensearchstaging/opensearch-benchmark:<VERSION>`
3. Check the version of OSB: `docker run opensearchstaging/opensearch-benchmark:<VERSION> --version`
4. Run any other commands listed on the Dockerhub overview tab.
4. Copy over image from Dockerhub Staging to Dockerhub Production and ECR: Once you have verified that PyPi and Dockerhub staging image works, contact Admin team member. Admin team member will help promote the “copy-over” workflow, where Jenkins copies the Docker image from Dockerhub staging account to both Dockerhub prod account and ECR.
4. Run any other commands listed on the Docker Hub overview tab.

4. Copy the image over from Docker Hub staging to Docker Hub production and ECR: once you have verified that PyPI and the Docker Hub staging image work, contact an Admin team member, who will help run the “copy-over” workflow, where Jenkins copies the Docker image from the Docker Hub staging account to both the Docker Hub production account and ECR.
1. Admin will need to invoke the copy-over four times:
1. repository: opensearchstaging, image: opensearch-benchmark:<VERSION> → repository: opensearchproject, image: opensearch-benchmark:<VERSION>
2. repository: opensearchstaging, image: opensearch-benchmark:<VERSION> → repository: opensearchproject, image: opensearch-benchmark:latest
3. repository: opensearchstaging, image: opensearch-benchmark:<VERSION> → repository: public.ecr.aws/opensearchproject, image: opensearch-benchmark:<VERSION>
4. repository: opensearchstaging, image: opensearch-benchmark:<VERSION> → repository: public.ecr.aws/opensearchproject, image: opensearch-benchmark:latest

5. Check that the OpenSearch-Benchmark tag is published:
1. Check that the version appears in GitHub (https://github.com/opensearch-project/opensearch-benchmark/releases) and is marked as the “latest” release. There should be an associated changelog as well. Clicking on the “Tags” tab should indicate the version number is one of the project’s tags and its timestamp should match that of the last commit.
2. Check Dockerhub Production: https://hub.docker.com/r/opensearchproject/opensearch-benchmark. Both “latest” and the published release should appear on the page along with the appropriate publication timestamp.
2. Check Docker Hub Production: https://hub.docker.com/r/opensearchproject/opensearch-benchmark. Both “latest” and the published release should appear on the page along with the appropriate publication timestamp.
3. Check ECR: https://gallery.ecr.aws/opensearchproject/opensearch-benchmark. The dropdown box at the top should list both “latest” and the published release as entries. The publication time is also indicated.
6. Notify Community: Inform everyone in the following channels that the new OpenSearch-Benchmark version is available and provide a brief summary of what the new version includes.

6. Notify the Community: Create a message that introduces the newly released OpenSearch Benchmark version and includes a brief summary of changes, enhancements, and bug fixes in the new version. The message may look something like the following:
```
@here OpenSearch Benchmark (OSB) 1.2.0 has just been released! :hitom: :mega: :tada:
@here OpenSearch Benchmark (OSB) 1.2.0 has just been released!
What’s changed?
* Read here: https://github.com/opensearch-project/opensearch-benchmark/releases/tag/1.2.0
* This version includes several enhancements and fixes contributed by OSCI participants
* Documentation: https://opensearch.org/docs/latest/benchmark
Wow! Where can I get this?
* PyPI: https://pypi.org/project/opensearch-benchmark
* DockerHub: https://hub.docker.com/r/opensearchproject/opensearch-benchmark/tags
* Docker Hub: https://hub.docker.com/r/opensearchproject/opensearch-benchmark/tags
* ECR: https://gallery.ecr.aws/opensearchproject/opensearch-benchmark
```
Notify the following channels in OpenSearch Community Slack
* #performance

Send this message in the following channels in OpenSearch Community Slack:
* [#performance](https://opensearch.slack.com/archives/C0516H8EJ7R)


7. Ensure that we back port changes to other version branches as needed. See guide for more information.
1. Unless you released a major version, update main branch’s version.txt to the next minor version. For instance, it should be updated to 1.2.0 immediately after the 1.1.0 release.
2. Update the version.txt in the branch for the version that was just released with current version but patch version incremented
7. Ensure that we backport changes to other version branches as needed. See the guide for more information.
1. Unless you released a major version, update main branch’s `version.txt` to the next minor version. For instance, if `1.1.0` was just released, the file in the `main` branch should be updated to `1.2.0`.
2. Update the `version.txt` in the branch for the version that was just released with the current version but patch version incremented. For instance, if 1.1.0 was just released, the file in the `1.1` branch should be updated to `1.1.1`.
3. The previous minor version is now stale.
4. For patch releases, ensure that the `main` branch and the branch of the same major and minor version have their `version.txt` files updated. For instance, if `1.1.1` was just released, we need to update the file in the `1.1` branch to `1.1.2`.
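The post-release version bumps described above follow a simple pattern. Here is a minimal sketch of that arithmetic as a shell helper; `bump` is a hypothetical name and is not part of the actual release tooling:

```shell
# bump SEMVER PART -> the next version; PART is "minor" or "patch".
bump() {
  IFS=. read -r maj min pat <<EOF
$1
EOF
  case "$2" in
    minor) echo "$maj.$((min + 1)).0" ;;    # main after a minor release
    patch) echo "$maj.$min.$((pat + 1))" ;; # released branch's next patch
  esac
}
bump 1.1.0 minor   # main's version.txt after releasing 1.1.0 -> 1.2.0
bump 1.1.1 patch   # the 1.1 branch after releasing 1.1.1 -> 1.1.2
```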

## Error Handling

If an error occurs during the build process and you need to retrigger the workflow, do the following:

* Delete tag locally `git tag -d <VERSION>`
* Delete tag on Github
* Delete the tag locally: `git tag -d <VERSION>`
* Delete the tag on GitHub
* Delete the draft release on GitHub

Afterwards, remake the tag and push it.
Then, create the tag again and push it.
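The delete-and-recreate sequence can be sketched in a throwaway repository; the version number is hypothetical, and in the real flow the remote tag is deleted against the official repo and the draft release is deleted in the GitHub UI:

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git config user.email "[email protected]"; git config user.name "Dev"
git commit -q --allow-empty -m "release commit"
VERSION=1.2.0                  # hypothetical release version
git tag "$VERSION" main        # the original (failed) tag
git tag -d "$VERSION"          # delete the tag locally
# In the real flow, also delete the remote tag:
#   git push origin :refs/tags/$VERSION
# and delete the draft release on GitHub, then recreate and push:
git tag "$VERSION" main
git tag --list                 # the recreated tag is ready to push
```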
