diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/automating-migration-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/automating-migration-with-github-actions-importer.md
deleted file mode 100644
index 6d3b868ea9d5..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/automating-migration-with-github-actions-importer.md
+++ /dev/null
@@ -1,176 +0,0 @@
----
-title: Automating migration with GitHub Actions Importer
-intro: 'Use {% data variables.product.prodname_actions_importer %} to plan and automate your migration to {% data variables.product.prodname_actions %}.'
-redirect_from:
- - /actions/migrating-to-github-actions/automating-migration-with-github-actions-importer
- - /actions/migrating-to-github-actions/automated-migrations/automating-migration-with-github-actions-importer
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: how_to
-topics:
- - Migration
- - CI
- - CD
-shortTitle: 'Automate migration with {% data variables.product.prodname_actions_importer %}'
----
-
-{% data reusables.actions.enterprise-github-hosted-runners %}
-
-[Legal notice](#legal-notice)
-
-## About {% data variables.product.prodname_actions_importer %}
-
-You can use {% data variables.product.prodname_actions_importer %} to plan and automatically migrate your CI/CD supported pipelines to {% data variables.product.prodname_actions %}.
-
-{% data variables.product.prodname_actions_importer %} is distributed as a Docker container, and uses a [{% data variables.product.prodname_dotcom %} CLI](https://cli.github.com) extension to interact with the container.
-
-Any workflow that is converted by {% data variables.product.prodname_actions_importer %} should be inspected for correctness before using it as a production workload. The goal is to achieve an 80% conversion rate for every workflow; however, the actual conversion rate will depend on the makeup of each individual pipeline that is converted.
-
-## Supported CI platforms
-
-You can use {% data variables.product.prodname_actions_importer %} to migrate from the following platforms:
-
-* Azure DevOps
-* Bamboo
-* Bitbucket Pipelines
-* CircleCI
-* GitLab (both cloud and self-hosted)
-* Jenkins
-* Travis CI
-
-## Prerequisites
-
-{% data variables.product.prodname_actions_importer %} has the following requirements:
-
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-### Updating the {% data variables.product.prodname_actions_importer %} CLI
-
-To ensure you're running the latest version of {% data variables.product.prodname_actions_importer %}, you should regularly run the `update` command:
-
-```bash
-gh actions-importer update
-```
-
-### Authenticating at the command line
-
-You must configure credentials that allow {% data variables.product.prodname_actions_importer %} to communicate with {% data variables.product.prodname_dotcom %} and your current CI server. You can configure these credentials using environment variables or a `.env.local` file. The environment variables can be configured in an interactive prompt by running the following command:
-
-```bash
-gh actions-importer configure
-```
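-
-Alternatively, the same values can be stored in a `.env.local` file that {% data variables.product.prodname_actions_importer %} loads when it runs. A minimal sketch is shown below; the values are placeholders, and your CI platform requires additional platform-specific variables, which are listed in each platform's migration guide:
-
-```shell
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-```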
-
-## Using the {% data variables.product.prodname_actions_importer %} CLI
-
-Use the subcommands of `gh actions-importer` to begin your migration to {% data variables.product.prodname_actions %}, including `audit`, `forecast`, `dry-run`, and `migrate`.
-
-### Auditing your existing CI pipelines
-
-The `audit` subcommand can be used to plan your CI/CD migration by analyzing your current CI/CD footprint. This analysis can be used to plan a timeline for migrating to {% data variables.product.prodname_actions %}.
-
-To run an audit, use the following command to determine your available options:
-
-```bash
-$ gh actions-importer audit -h
-Description:
- Plan your CI/CD migration by analyzing your current CI/CD footprint.
-
-[...]
-
-Commands:
- azure-devops An audit will output a list of data used in an Azure DevOps instance.
- bamboo An audit will output a list of data used in a Bamboo instance.
- circle-ci An audit will output a list of data used in a CircleCI instance.
- gitlab An audit will output a list of data used in a GitLab instance.
- jenkins An audit will output a list of data used in a Jenkins instance.
- travis-ci An audit will output a list of data used in a Travis CI instance.
-```
-
-### Forecasting usage
-
-The `forecast` subcommand reviews historical pipeline usage to create a forecast of {% data variables.product.prodname_actions %} usage.
-
-To run a forecast, use the following command to determine your available options:
-
-```bash
-$ gh actions-importer forecast -h
-Description:
- Forecasts GitHub Actions usage from historical pipeline utilization.
-
-[...]
-
-Commands:
- azure-devops Forecasts GitHub Actions usage from historical Azure DevOps pipeline utilization.
- bamboo Forecasts GitHub Actions usage from historical Bamboo pipeline utilization.
- jenkins Forecasts GitHub Actions usage from historical Jenkins pipeline utilization.
- gitlab Forecasts GitHub Actions usage from historical GitLab pipeline utilization.
- circle-ci Forecasts GitHub Actions usage from historical CircleCI pipeline utilization.
- travis-ci Forecasts GitHub Actions usage from historical Travis CI pipeline utilization.
- github Forecasts GitHub Actions usage from historical GitHub pipeline utilization.
-```
-
-### Testing the migration process
-
-The `dry-run` subcommand can be used to convert a pipeline to its {% data variables.product.prodname_actions %} equivalent, and then write the workflow to your local filesystem.
-
-To perform a dry run, use the following command to determine your available options:
-
-```bash
-$ gh actions-importer dry-run -h
-Description:
- Convert a pipeline to a GitHub Actions workflow and output its yaml file.
-
-[...]
-
-Commands:
- azure-devops Convert an Azure DevOps pipeline to a GitHub Actions workflow and output its yaml file.
- bamboo Convert a Bamboo pipeline to GitHub Actions workflows and output its yaml file.
- circle-ci Convert a CircleCI pipeline to GitHub Actions workflows and output the yaml file(s).
- gitlab Convert a GitLab pipeline to a GitHub Actions workflow and output the yaml file.
- jenkins Convert a Jenkins job to a GitHub Actions workflow and output its yaml file.
- travis-ci Convert a Travis CI pipeline to a GitHub Actions workflow and output its yaml file.
-```
-
-### Migrating a pipeline to {% data variables.product.prodname_actions %}
-
-The `migrate` subcommand can be used to convert a pipeline to its GitHub Actions equivalent and then create a pull request with the contents.
-
-To run a migration, use the following command to determine your available options:
-
-```bash
-$ gh actions-importer migrate -h
-Description:
- Convert a pipeline to a GitHub Actions workflow and open a pull request with the changes.
-
-[...]
-
-Commands:
- azure-devops Convert an Azure DevOps pipeline to a GitHub Actions workflow and open a pull request with the changes.
- bamboo Convert a Bamboo pipeline to GitHub Actions workflows and open a pull request with the changes.
- circle-ci Convert a CircleCI pipeline to GitHub Actions workflows and open a pull request with the changes.
- gitlab Convert a GitLab pipeline to a GitHub Actions workflow and open a pull request with the changes.
- jenkins Convert a Jenkins job to a GitHub Actions workflow and open a pull request with the changes.
- travis-ci Convert a Travis CI pipeline to a GitHub Actions workflow and open a pull request with the changes.
-```
-
-## Performing self-serve migrations using IssueOps
-
-You can use {% data variables.product.prodname_actions %} and {% data variables.product.prodname_github_issues %} to run CLI commands for {% data variables.product.prodname_actions_importer %}. This allows you to migrate your CI/CD workflows without installing software on your local machine. This approach is especially useful for organizations that want to enable self-service migrations to {% data variables.product.prodname_actions %}. Once IssueOps is configured, users can open an issue with the relevant template to migrate pipelines to {% data variables.product.prodname_actions %}.
-
-For more information about setting up self-serve migrations with IssueOps, see the [`actions/importer-issue-ops`](https://github.com/actions/importer-issue-ops) template repository.
-
-## Using the {% data variables.product.prodname_actions_importer %} labs repository
-
-The {% data variables.product.prodname_actions_importer %} labs repository contains platform-specific learning paths that teach you how to use {% data variables.product.prodname_actions_importer %} and how to approach migrations to {% data variables.product.prodname_actions %}. You can use this repository to learn how to use {% data variables.product.prodname_actions_importer %} to help plan, forecast, and automate your migration to {% data variables.product.prodname_actions %}.
-
-To learn more, see the [GitHub Actions Importer labs repository](https://github.com/actions/importer-labs/tree/main#readme).
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/extending-github-actions-importer-with-custom-transformers.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/extending-github-actions-importer-with-custom-transformers.md
deleted file mode 100644
index 3a7a7d15acad..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/extending-github-actions-importer-with-custom-transformers.md
+++ /dev/null
@@ -1,182 +0,0 @@
----
-title: Extending GitHub Actions Importer with custom transformers
-intro: '{% data variables.product.prodname_actions_importer %} offers the ability to extend its built-in mapping.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: how_to
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Extending GitHub Actions Importer
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/extending-github-actions-importer-with-custom-transformers
----
-
-[Legal notice](#legal-notice)
-
-## About custom transformers
-
-{% data variables.product.prodname_actions_importer %} offers the ability to extend its built-in mapping by creating custom transformers. Custom transformers can be used to:
-
-* Convert items that {% data variables.product.prodname_actions_importer %} does not automatically convert, or modify how items are converted. For more information, see "[Creating custom transformers for items](#creating-custom-transformers-for-items)."
-* Convert references to runners to use different runner labels. For more information, see "[Creating custom transformers for runners](#creating-custom-transformers-for-runners)."
-* Convert environment variable values from your existing pipelines to {% data variables.product.prodname_actions %} workflows. For more information, see "[Creating custom transformers for environment variables](#creating-custom-transformers-for-environment-variables)."
-
-## Using custom transformers with {% data variables.product.prodname_actions_importer %}
-
-A custom transformer contains mapping logic that {% data variables.product.prodname_actions_importer %} can use to transform your plugins, tasks, runner labels, or environment variables to work with {% data variables.product.prodname_actions %}. Custom transformers are written with a domain-specific language (DSL) built on top of Ruby, and are defined within a file with the `.rb` file extension.
-
-You can use the `--custom-transformers` CLI option to specify which custom transformer files to use with the `audit`, `dry-run`, and `migrate` commands.
-
-For example, if custom transformers are defined in a file named `transformers.rb`, you can use the following command to use them with {% data variables.product.prodname_actions_importer %}:
-
-```shell
-gh actions-importer ... --custom-transformers transformers.rb
-```
-
-Alternatively, you can use the glob pattern syntax to specify multiple custom transformer files. For example, if multiple custom transformer files are within a directory named `transformers`, you can provide them all to {% data variables.product.prodname_actions_importer %} with the following command:
-
-```shell
-gh actions-importer ... --custom-transformers transformers/*.rb
-```
-
-{% note %}
-
-**Note:** When you use custom transformers, the custom transformer files must reside in the directory where the `gh actions-importer` command is run, or in one of its subdirectories.
-
-{% endnote %}
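-
-As an illustration, a single transformer file can presumably combine the different kinds of mappings described in the sections below. The following sketch uses placeholder identifiers, labels, and variable names:
-
-```ruby
-# transformers.rb: a hypothetical file combining the mapping types
-# described in the following sections.
-
-# Convert a custom build step identifier into a workflow step.
-transform "buildJavaScriptApp" do |item|
-  {
-    name: "build javascript app",
-    run: "npm run build"
-  }
-end
-
-# Map a runner label from the source CI/CD instance to a GitHub Actions runner label.
-runner "linux", "ubuntu-latest"
-
-# Map an environment variable from the source pipeline to a new value.
-env "OCTO", "CAT"
-```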
-
-## Creating custom transformers for items
-
-You can create custom transformers that {% data variables.product.prodname_actions_importer %} will use when converting existing build steps or triggers to their equivalent in {% data variables.product.prodname_actions %}. This is especially useful when:
-
-* {% data variables.product.prodname_actions_importer %} doesn't automatically convert an item.
-* You want to change how an item is converted by {% data variables.product.prodname_actions_importer %}.
-* Your existing pipelines use custom or proprietary extensions, such as shared libraries in Jenkins, and you need to define how these steps should function in {% data variables.product.prodname_actions %}.
-
-{% data variables.product.prodname_actions_importer %} uses custom transformers that are defined using a DSL built on top of Ruby. In order to create custom transformers for build steps and triggers:
-
-* Each custom transformer file must contain at least one `transform` method.
-* Each `transform` method must return a `Hash`, an array of `Hash` objects, or `nil`. The returned value corresponds to an action defined in YAML. For more information about actions, see "[AUTOTITLE](/actions/learn-github-actions/understanding-github-actions)."
-
-### Example custom transformer for a build step
-
-The following example converts a build step that uses the "buildJavaScriptApp" identifier to run various `npm` commands:
-
-```ruby copy
-transform "buildJavaScriptApp" do |item|
- command = ["build", "package", "deploy"].map do |script|
- "npm run #{script}"
- end
-
- {
- name: "build javascript app",
- run: command.join("\n")
- }
-end
-```
-
-The above example results in the following {% data variables.product.prodname_actions %} workflow step. It is composed of the converted build steps that had a `buildJavaScriptApp` identifier:
-
-```yaml
-- name: build javascript app
- run: |
- npm run build
- npm run package
- npm run deploy
-```
-
-The `transform` method takes the identifier of the build step from your source CI/CD instance as an argument. In this example, the identifier is `buildJavaScriptApp`. You can also use comma-separated values to pass multiple identifiers to the `transform` method. For example, `transform "buildJavaScriptApp", "buildTypeScriptApp" do |item| ... end`.
-
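-For instance, the following sketch applies the same conversion to both identifiers; the identifiers and command are placeholders:
-
-```ruby
-transform "buildJavaScriptApp", "buildTypeScriptApp" do |item|
-  # Both identifiers are converted into the same workflow step.
-  {
-    name: "build app",
-    run: "npm run build"
-  }
-end
-```
-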
-{% note %}
-
-**Note**: The data structure of `item` will be different depending on the CI/CD platform and the type of item being converted.
-
-{% endnote %}
-
-## Creating custom transformers for runners
-
-You can customize the mapping between runners in your source CI/CD instance and their equivalent {% data variables.product.prodname_actions %} runners.
-
-{% data variables.product.prodname_actions_importer %} uses custom transformers that are defined using a DSL built on top of Ruby. To create custom transformers for runners:
-
-* The custom transformer file must have at least one `runner` method.
-* The `runner` method accepts two parameters. The first parameter is the source CI/CD instance's runner label, and the second parameter is the corresponding {% data variables.product.prodname_actions %} runner label. For more information on {% data variables.product.prodname_actions %} runners, see "[AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources)."
-
-### Example custom transformers for runners
-
-The following example shows a `runner` method that converts one runner label to one {% data variables.product.prodname_actions %} runner label in the resulting workflow.
-
-```ruby copy
-runner "linux", "ubuntu-latest"
-```
-
-You can also use the `runner` method to convert one runner label to multiple {% data variables.product.prodname_actions %} runner labels in the resulting workflow.
-
-```ruby copy
-runner "big-agent", ["self-hosted", "xl", "linux"]
-```
-
-{% data variables.product.prodname_actions_importer %} attempts to map the runner label as best it can. In cases where it cannot do this, the `ubuntu-latest` runner label is used as a default. You can use a special keyword with the `runner` method to control this default value. For example, the following custom transformer instructs {% data variables.product.prodname_actions_importer %} to use `macos-latest` as the default runner instead of `ubuntu-latest`.
-
-```ruby copy
-runner :default, "macos-latest"
-```
-
-## Creating custom transformers for environment variables
-
-You can customize the mapping between environment variables in your source CI/CD pipelines to their values in {% data variables.product.prodname_actions %}.
-
-{% data variables.product.prodname_actions_importer %} uses custom transformers that are defined using a DSL built on top of Ruby. To create custom transformers for environment variables:
-
-* The custom transformer file must have at least one `env` method.
-* The `env` method accepts two parameters. The first parameter is the name of the environment variable in the original pipeline, and the second parameter is the updated value for the environment variable for {% data variables.product.prodname_actions %}. For more information about {% data variables.product.prodname_actions %} environment variables, see "[AUTOTITLE](/actions/learn-github-actions/variables)."
-
-### Example custom transformers for environment variables
-
-There are several ways you can set up custom transformers to map your environment variables.
-
-* The following example sets the value of any existing environment variables named `OCTO` to `CAT` when transforming a pipeline.
-
- ```ruby copy
- env "OCTO", "CAT"
- ```
-
- You can also remove all instances of a specific environment variable so they are not transformed to an {% data variables.product.prodname_actions %} workflow. The following example removes all environment variables with the name `MONA_LISA`.
-
- ```ruby copy
- env "MONA_LISA", nil
- ```
-
-* You can also map your existing environment variables to secrets. For example, the following `env` method maps an environment variable named `MONALISA` to a secret named `OCTOCAT`.
-
- ```ruby copy
- env "MONALISA", secret("OCTOCAT")
- ```
-
- This will set up a reference to a secret named `OCTOCAT` in the transformed workflow. For the secret to work, you will need to create the secret in your GitHub repository. For more information, see "[AUTOTITLE](/actions/security-guides/using-secrets-in-github-actions#creating-secrets-for-a-repository)."
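-
-  For example, assuming your repository is `octo-org/octo-repo` (a placeholder), one way to create the secret is with the [{% data variables.product.prodname_dotcom %} CLI](https://cli.github.com):
-
-  ```shell
-  gh secret set OCTOCAT --repo octo-org/octo-repo --body "my-secret-value"
-  ```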
-
-* You can also use regular expressions to update the values of multiple environment variables at once. For example, the following custom transformer removes all environment variables from the converted workflow:
-
- ```ruby copy
- env /.*/, nil
- ```
-
- The following example uses a regular expression match group to transform environment variable values to dynamically generated secrets.
-
- ```ruby copy
-  env /^(.+)_SSH_KEY/, secret("%s_SSH_KEY")
- ```
-
- {% note %}
-
- **Note**: The order in which `env` methods are defined matters when using regular expressions. The first `env` transformer that matches an environment variable name takes precedence over subsequent `env` methods. You should define your most specific environment variable transformers first.
-
- {% endnote %}
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/index.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/index.md
deleted file mode 100644
index 308059c51592..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/index.md
+++ /dev/null
@@ -1,23 +0,0 @@
----
-title: 'Using {% data variables.product.prodname_actions_importer %} to automate migrations'
-shortTitle: Automated migrations
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to migrate your CI/CD workflows to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghes: '*'
- ghec: '*'
-children:
- - /automating-migration-with-github-actions-importer
- - /extending-github-actions-importer-with-custom-transformers
- - /supplemental-arguments-and-settings
- - /migrating-from-azure-devops-with-github-actions-importer
- - /migrating-from-bamboo-with-github-actions-importer
- - /migrating-from-bitbucket-pipelines-with-github-actions-importer
- - /migrating-from-circleci-with-github-actions-importer
- - /migrating-from-gitlab-with-github-actions-importer
- - /migrating-from-jenkins-with-github-actions-importer
- - /migrating-from-travis-ci-with-github-actions-importer
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations
----
-
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-azure-devops-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-azure-devops-with-github-actions-importer.md
deleted file mode 100644
index 8a1455c24259..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-azure-devops-with-github-actions-importer.md
+++ /dev/null
@@ -1,547 +0,0 @@
----
-title: Migrating from Azure DevOps with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Azure DevOps pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Azure DevOps migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-azure-devops-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from Azure DevOps with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Azure DevOps pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* An Azure DevOps account or organization with projects and pipelines that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Access to create an Azure DevOps {% data variables.product.pat_generic %} for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from Azure DevOps to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}:
-
-* {% data variables.product.prodname_actions_importer %} requires version 5.0 of the Azure DevOps API, available in either Azure DevOps Services or Azure DevOps Server 2019. Older versions of Azure DevOps Server are not compatible.
-* Tasks that are implicitly added to an Azure DevOps pipeline, such as checking out source code, may be added to a {% data variables.product.prodname_actions_importer %} audit as a GUID name. To find the friendly task name for a GUID, you can use the following URL: `https://dev.azure.com/:organization/_apis/distributedtask/tasks/:guid`. An example request is shown after this list.
-
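-For example, the following sketch looks up a task's metadata, including its name, assuming your Azure DevOps {% data variables.product.pat_generic %} is stored in the `AZURE_DEVOPS_ACCESS_TOKEN` environment variable. The organization name and task GUID are placeholders, and Azure DevOps accepts the token as a basic authentication password with an empty username:
-
-```shell
-curl -u ":${AZURE_DEVOPS_ACCESS_TOKEN}" "https://dev.azure.com/my-organization-name/_apis/distributedtask/tasks/TASK_GUID"
-```
-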
-#### Manual tasks
-
-Certain Azure DevOps constructs must be migrated manually from Azure DevOps into {% data variables.product.prodname_actions %} configurations. These include:
-* Organization, repository, and environment secrets
-* Service connections such as OIDC Connect, {% data variables.product.prodname_github_apps %}, and {% data variables.product.pat_generic_plural %}
-* Unknown tasks
-* Self-hosted agents
-* Environments
-* Pre-deployment approvals
-
-For more information on manual migrations, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-azure-pipelines-to-github-actions)."
-
-#### Unsupported tasks
-
-{% data variables.product.prodname_actions_importer %} does not support migrating the following tasks:
-
-* Pre-deployment gates
-* Post-deployment gates
-* Post-deployment approvals
-* Some resource triggers
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Azure DevOps and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create an Azure DevOps {% data variables.product.pat_generic %}. For more information, see [Use {% data variables.product.pat_generic_plural %}](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&tabs=Windows#create-a-pat) in the Azure DevOps documentation. The token must have the following scopes:
-
-   * Agent Pools: `Read`
- * Build: `Read`
- * Code: `Read`
- * Release: `Read`
- * Service Connections: `Read`
- * Task Groups: `Read`
- * Variable Groups: `Read`
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `Azure DevOps`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for Azure DevOps", enter the value for the Azure DevOps {% data variables.product.pat_generic %} that you created earlier, and press Enter.
- * For "Base url of the Azure DevOps instance", press Enter to accept the default value (`https://dev.azure.com`).
- * For "Azure DevOps organization name", enter the name for your Azure DevOps organization, and press Enter.
- * For "Azure DevOps project name", enter the name for your Azure DevOps project, and press Enter.
-
- An example of the `configure` command is shown below:
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: Azure DevOps
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for Azure DevOps: ***************
- ✔ Base url of the Azure DevOps instance: https://dev.azure.com
- ✔ Azure DevOps organization name: :organization
- ✔ Azure DevOps project name: :project
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to the {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of Azure DevOps
-
-You can use the `audit` command to get a high-level view of all projects in an Azure DevOps organization.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in an Azure DevOps organization.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit of an Azure DevOps organization, run the following command in your terminal:
-
-```shell
-gh actions-importer audit azure-devops --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecast potential {% data variables.product.prodname_actions %} usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in Azure DevOps.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast azure-devops --output-dir tmp/forecast_reports
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
-
- This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-Additionally, these metrics are defined for each queue of runners in Azure DevOps. This is especially useful if there is a mix of hosted and self-hosted runners, or of high- and low-spec machines, so you can see metrics specific to the different types of runners.
-
-## Perform a dry-run migration
-
-You can use the `dry-run` command to convert an Azure DevOps pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-### Running the dry-run command for a build pipeline
-
-To perform a dry run of migrating your Azure DevOps build pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `pipeline_id` with the ID of the pipeline you are converting.
-
-```shell
-gh actions-importer dry-run azure-devops pipeline --pipeline-id :pipeline_id --output-dir tmp/dry-run
-```
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-### Running the dry-run command for a release pipeline
-
-To perform a dry run of migrating your Azure DevOps release pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `pipeline_id` with the ID of the pipeline you are converting.
-
-```shell
-gh actions-importer dry-run azure-devops release --pipeline-id :pipeline_id --output-dir tmp/dry-run
-```
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-## Perform a production migration
-
-You can use the `migrate` command to convert an Azure DevOps pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command for a build pipeline
-
-To migrate an Azure DevOps build pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `pipeline_id` with the ID of the pipeline you are converting.
-
-```shell
-gh actions-importer migrate azure-devops pipeline --pipeline-id :pipeline_id --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
-```
-
-The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate azure-devops pipeline --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --azure-devops-project my-azure-devops-project
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-### Running the migrate command for a release pipeline
-
-To migrate an Azure DevOps release pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `pipeline_id` with the ID of the pipeline you are converting.
-
-```shell
-gh actions-importer migrate azure-devops release --pipeline-id :pipeline_id --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
-```
-
-The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate azure-devops release --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --azure-devops-project my-azure-devops-project
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Azure DevOps.
-
-### Configuration environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Azure DevOps instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `AZURE_DEVOPS_ACCESS_TOKEN`: The {% data variables.product.pat_generic %} used to authenticate with your Azure DevOps instance. This token requires the following scopes:
- * Build: `Read`
- * Agent Pools: `Read`
- * Code: `Read`
- * Release: `Read`
- * Service Connections: `Read`
- * Task Groups: `Read`
- * Variable Groups: `Read`
-* `AZURE_DEVOPS_PROJECT`: The project name or GUID to use when migrating a pipeline. This is optional if you want to perform an audit of all projects.
-* `AZURE_DEVOPS_ORGANIZATION`: The organization name of your Azure DevOps instance.
-* `AZURE_DEVOPS_INSTANCE_URL`: The URL to the Azure DevOps instance, such as `https://dev.azure.com`.
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
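-
-For example, a `.env.local` file might look like the following sketch, where all values are placeholders:
-
-```shell
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-AZURE_DEVOPS_ACCESS_TOKEN=example_azure_devops_token
-AZURE_DEVOPS_PROJECT=my-project-name
-AZURE_DEVOPS_ORGANIZATION=my-organization-name
-AZURE_DEVOPS_INSTANCE_URL=https://dev.azure.com
-```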
-
-### Optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```shell
-gh actions-importer dry-run azure-devops pipeline --output-dir ./output/ --source-file-path ./path/to/azure_devops/pipeline.yml
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-The `--config-file-path` argument can also be used to specify which repository a converted reusable workflow or composite action should be migrated to.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform an audit.
-
-```shell
-gh actions-importer audit azure-devops pipeline --output-dir ./output/ --config-file-path ./path/to/azure_devops/config.yml
-```
-
-To audit an Azure DevOps instance using a configuration file, the configuration file must be in the following format and each `repository_slug` must be unique:
-
-```yaml
-source_files:
-  - repository_slug: azdo-project/1
-    path: file.yml
-  - repository_slug: azdo-project/2
-    path: path.yml
-```
-
-You can generate the `repository_slug` for a pipeline by combining the Azure DevOps organization name, project name, and the pipeline ID. For example, `my-organization-name/my-project-name/42`.
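-
-For instance, a configuration entry for pipeline ID `42` in the `my-project-name` project of the `my-organization-name` organization might look like the following sketch, where the file path is a placeholder:
-
-```yaml
-source_files:
-  - repository_slug: my-organization-name/my-project-name/42
-    path: azure-pipelines.yml
-```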
-
-##### Dry run example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.
-
-The pipeline is selected by matching the `repository_slug` in the configuration file to the values of the `--azure-devops-organization` and `--azure-devops-project` options. The `path` is then used to pull the specified source file.
-
-```shell
-gh actions-importer dry-run azure-devops pipeline --output-dir ./output/ --config-file-path ./path/to/azure_devops/config.yml
-```
-
-##### Specify the repository of converted reusable workflows and composite actions
-
-{% data variables.product.prodname_actions_importer %} uses the YAML file provided to the `--config-file-path` argument to determine the repository that converted reusable workflows and composite actions are migrated to.
-
-To begin, you should run an audit without the `--config-file-path` argument:
-
-```shell
-gh actions-importer audit azure-devops --output-dir ./output/
-```
-
-The output of this command will contain a file named `config.yml` that contains a list of all the reusable workflows and composite actions that were converted by {% data variables.product.prodname_actions_importer %}. For example, the `config.yml` file may have the following contents:
-
-```yaml
-reusable_workflows:
- - name: my-reusable-workflow.yml
- target_url: https://github.com/octo-org/octo-repo
- ref: main
-
-composite_actions:
- - name: my-composite-action.yml
- target_url: https://github.com/octo-org/octo-repo
- ref: main
-```
-
-You can use this file to specify which repository and ref a reusable workflow or composite action should be added to. You can then use the `--config-file-path` argument to provide the `config.yml` file to {% data variables.product.prodname_actions_importer %}. For example, you can use this file when running a `migrate` command to open a pull request for each unique repository defined in the config file:
-
-```shell
-gh actions-importer migrate azure-devops pipeline --config-file-path config.yml --target-url https://github.com/my-org/my-repo
-```
-
-### Supported syntax for Azure DevOps pipelines
-
-The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.
-
-| Azure Pipelines | {% data variables.product.prodname_actions %} | Status |
-| :-------------------- | :------------------------------------ | :------------------ |
-| condition | `jobs.<job_id>.if` <br> `jobs.<job_id>.steps[*].if` | Supported |
-| container | `jobs.<job_id>.container` <br> `jobs.<job_id>.name` | Supported |
-| continuousIntegration | | Supported |
-| job | | Supported |
-| pullRequest | | Supported |
-| stage | | Supported |
-| steps | | Supported |
-| strategy | `jobs.<job_id>.strategy.fail-fast` <br> `jobs.<job_id>.strategy.max-parallel` <br> `jobs.<job_id>.strategy.matrix` | Supported |
-| timeoutInMinutes | | Supported |
-| variables | `env` <br> `jobs.<job_id>.env` <br> `jobs.<job_id>.steps.env` | Supported |
-| manual deployment | | Partially supported |
-| pool | `runners` <br> `self hosted runners` | Partially supported |
-| services | | Partially supported |
-| strategy | | Partially supported |
-| triggers | | Partially supported |
-| pullRequest | | Unsupported |
-| schedules | `on.schedule` <br> `on.workflow_run` | Unsupported |
-| triggers | | Unsupported |
-
-For more information about supported Azure DevOps tasks, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/azure_devops/index.md).
-
-### Environment variable mapping
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Azure DevOps environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| Azure Pipelines | {% data variables.product.prodname_actions %} |
-| :------------------------------------------ | :-------------------------------------------------- |
-| {% raw %}`$(Agent.BuildDirectory)`{% endraw %} | {% raw %}`${{ runner.workspace }}`{% endraw %} |
-| {% raw %}`$(Agent.HomeDirectory)`{% endraw %} | {% raw %}`${{ env.HOME }}`{% endraw %} |
-| {% raw %}`$(Agent.JobName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(Agent.OS)`{% endraw %} | {% raw %}`${{ runner.os }}`{% endraw %} |
-| {% raw %}`$(Agent.ReleaseDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Agent.RootDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Agent.ToolsDirectory)`{% endraw %} | {% raw %}`${{ runner.tool_cache }}`{% endraw %} |
-| {% raw %}`$(Agent.WorkFolder)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Build.ArtifactStagingDirectory)`{% endraw %} | {% raw %}`${{ runner.temp }}`{% endraw %} |
-| {% raw %}`$(Build.BinariesDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Build.BuildId)`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
-| {% raw %}`$(Build.BuildNumber)`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| {% raw %}`$(Build.DefinitionId)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| {% raw %}`$(Build.DefinitionName)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| {% raw %}`$(Build.PullRequest.TargetBranch)`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
-| {% raw %}`$(Build.PullRequest.TargetBranch.Name)`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
-| {% raw %}`$(Build.QueuedBy)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
-| {% raw %}`$(Build.Reason)`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
-| {% raw %}`$(Build.Repository.LocalPath)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Build.Repository.Name)`{% endraw %} | {% raw %}`${{ github.repository }}`{% endraw %} |
-| {% raw %}`$(Build.Repository.Provider)`{% endraw %} | {% raw %}`GitHub`{% endraw %} |
-| {% raw %}`$(Build.Repository.Uri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
-| {% raw %}`$(Build.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
-| {% raw %}`$(Build.SourceBranch)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
-| {% raw %}`$(Build.SourceBranchName)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
-| {% raw %}`$(Build.SourceVersion)`{% endraw %} | {% raw %}`${{ github.sha }}`{% endraw %} |
-| {% raw %}`$(Build.SourcesDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(Build.StagingDirectory)`{% endraw %} | {% raw %}`${{ runner.temp }}`{% endraw %} |
-| {% raw %}`$(Pipeline.Workspace)`{% endraw %} | {% raw %}`${{ runner.workspace }}`{% endraw %} |
-| {% raw %}`$(Release.DefinitionEnvironmentId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(Release.DefinitionId)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| {% raw %}`$(Release.DefinitionName)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| {% raw %}`$(Release.Deployment.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
-| {% raw %}`$(Release.DeploymentID)`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
-| {% raw %}`$(Release.EnvironmentId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(Release.EnvironmentName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(Release.Reason)`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
-| {% raw %}`$(Release.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
-| {% raw %}`$(System.ArtifactsDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(System.DefaultWorkingDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$(System.HostType)`{% endraw %} | {% raw %}`build`{% endraw %} |
-| {% raw %}`$(System.JobId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(System.JobName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.PullRequestId)`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.PullRequestNumber)`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.SourceBranch)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.SourceRepositoryUri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.TargetBranch)`{% endraw %} | {% raw %}`${{ github.event.base.ref }}`{% endraw %} |
-| {% raw %}`$(System.PullRequest.TargetBranchName)`{% endraw %} | {% raw %}`${{ github.event.base.ref }}`{% endraw %} |
-| {% raw %}`$(System.StageAttempt)`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| {% raw %}`$(System.TeamFoundationCollectionUri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
-| {% raw %}`$(System.WorkFolder)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-
-### Templates
-
-You can transform Azure DevOps templates with {% data variables.product.prodname_actions_importer %}.
-
-#### Limitations
-
-{% data variables.product.prodname_actions_importer %} is able to transform Azure DevOps templates with some limitations.
-
-* Azure DevOps templates used under the `stages`, `deployments`, and `jobs` keys are converted into reusable workflows in {% data variables.product.prodname_actions %}. For more information, see "[AUTOTITLE](/actions/using-workflows/reusing-workflows)."
-* Azure DevOps templates used under the `steps` key are converted into composite actions. For more information, see "[AUTOTITLE](/actions/creating-actions/creating-a-composite-action)."
-* If you currently have job templates that reference other job templates, {% data variables.product.prodname_actions_importer %} converts the templates into reusable workflows. Because reusable workflows cannot reference other reusable workflows, this is invalid syntax in {% data variables.product.prodname_actions %}. You must manually correct nested reusable workflows.
-* If a template references an external Azure DevOps organization or {% data variables.product.prodname_dotcom %} repository, you must use the `--credentials-file` option to provide credentials to access this template. For more information, see "[AUTOTITLE](/actions/migrating-to-github-actions/automated-migrations/supplemental-arguments-and-settings#using-a-credentials-file-for-authentication)."
-* You can dynamically generate YAML using `each` expressions with the following caveats:
- * Nested `each` blocks are not supported and cause the parent `each` block to be unsupported.
- * `each` and contained `if` conditions are evaluated at transformation time, because {% data variables.product.prodname_actions %} does not support this style of insertion.
- * `elseif` blocks are unsupported. If this functionality is required, you must manually correct them.
- * Nested `if` blocks are supported, but `if/elseif/else` blocks nested under an `if` condition are not.
- * `if` blocks that use predefined Azure DevOps variables are not supported.
-
-#### Supported templates
-
-{% data variables.product.prodname_actions_importer %} supports the templates listed in the table below.
-
-| Azure Pipelines | {% data variables.product.prodname_actions %} | Status |
-| :---------------------------- | :------------------------------------ | ------------------: |
-| Extending from a template | `Reusable workflow` | Supported |
-| Job templates | `Reusable workflow` | Supported |
-| Stage templates | `Reusable workflow` | Supported |
-| Step templates | `Composite action` | Supported |
-| Task groups in classic editor | Varies | Supported |
-| Templates in a different Azure DevOps organization, project, or repository | Varies | Supported |
-| Templates in a {% data variables.product.prodname_dotcom %} repository | Varies | Supported |
-| Variable templates | `env` | Supported |
-| Conditional insertion | `if` conditions on job/steps | Partially supported |
-| Iterative insertion | Not applicable | Partially supported |
-| Templates with parameters | Varies | Partially supported |
-
-#### Template file path names
-
-{% data variables.product.prodname_actions_importer %} can extract templates with relative or dynamic file paths that use variable, parameter, and iterative expressions in the file name. However, a default value must be set.
-
-##### Variable file path name example
-
-```yaml
-# File: azure-pipelines.yml
-variables:
-- template: 'templates/vars.yml'
-
-steps:
-- template: "./templates/${{ variables.one }}"
-```
-
-```yaml
-# File: templates/vars.yml
-variables:
- one: 'simple_step.yml'
-```
-
-##### Parameter file path name example
-
-```yaml
-parameters:
-- name: template
- type: string
- default: simple_step.yml
-
-steps:
-- template: "./templates/{% raw %}${{ parameters.template }}{% endraw %}"
-```
-
-##### Iterative file path name example
-
-```yaml
-parameters:
-- name: steps
- type: object
- default:
- - build_step
- - release_step
-steps:
-- {% raw %}${{ each step in parameters.steps }}{% endraw %}:
-  - template: "{% raw %}${{ step }}{% endraw %}-variables.yml"
-```
-
-#### Template parameters
-
-{% data variables.product.prodname_actions_importer %} supports the parameters listed in the table below.
-
-| Azure Pipelines | {% data variables.product.prodname_actions %} | Status |
-| :-------------------- | :----------------------------------------- | :------------------- |
-| string | `inputs.string` | Supported |
-| number | `inputs.number` | Supported |
-| boolean | `inputs.boolean` | Supported |
-| object | `inputs.string` with `fromJSON` expression | Partially supported |
-| step | `step` | Partially supported |
-| stepList | `step` | Partially supported |
-| job | `job` | Partially supported |
-| jobList | `job` | Partially supported |
-| deployment | `job` | Partially supported |
-| deploymentList | `job` | Partially supported |
-| stage | `job` | Partially supported |
-| stageList | `job` | Partially supported |
-
-{% note %}
-
-**Note:** A template used under the `step` key with this parameter type is only serialized as a composite action if the steps are used at the beginning or end of the template steps. A template used under the `stage`, `deployment`, or `job` key with this parameter type is not transformed into a reusable workflow, and is instead serialized as a standalone workflow.
-
-{% endnote %}
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bamboo-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bamboo-with-github-actions-importer.md
deleted file mode 100644
index 6719c745f615..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bamboo-with-github-actions-importer.md
+++ /dev/null
@@ -1,386 +0,0 @@
----
-title: Migrating from Bamboo with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Bamboo pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Bamboo migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-bamboo-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from Bamboo with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Bamboo pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* A Bamboo account or organization with projects and pipelines that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Bamboo version 7.1.1 or greater.
-* Access to create a Bamboo {% data variables.product.pat_generic %} for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from Bamboo to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}:
-
-* {% data variables.product.prodname_actions_importer %} relies on the YAML specification generated by the Bamboo Server to perform migrations. When Bamboo does not support exporting something to YAML, the missing information is not migrated.
-* Trigger conditions are unsupported. When {% data variables.product.prodname_actions_importer %} encounters a trigger with a condition, the condition is surfaced as a comment and the trigger is transformed without it.
-* Bamboo Plans with customized settings for storing artifacts are not transformed. Instead, artifacts are stored and retrieved using the [`upload-artifact`](https://github.com/actions/upload-artifact) and [`download-artifact`](https://github.com/actions/download-artifact) actions.
-* Disabled plans must be disabled manually in the GitHub UI. For more information, see "[AUTOTITLE](/actions/using-workflows/disabling-and-enabling-a-workflow)."
-* Disabled jobs are transformed with an `if: false` condition, which prevents them from running. You must remove this condition to re-enable the job (see the example after this list).
-* Disabled tasks are not transformed because they are not included in the exported plan when using the Bamboo API.
-* Bamboo provides options to clean up build workspaces after a build is complete. These are not transformed because it is assumed GitHub-hosted runners or ephemeral self-hosted runners will automatically handle this.
-* The hanging build detection options are not transformed because there is no equivalent in {% data variables.product.prodname_actions %}. The closest option is `timeout-minutes` on a job, which can be used to set the maximum number of minutes to let a job run. For more information, see "[AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idtimeout-minutes)."
-* Pattern match labeling is not transformed because there is no equivalent in {% data variables.product.prodname_actions %}.
-* All artifacts are transformed into an `actions/upload-artifact` step, regardless of whether they are `shared`, so they can be downloaded from any job in the workflow.
-* Permissions are not transformed because there is no suitable equivalent in {% data variables.product.prodname_actions %}.
-* If the Bamboo version is between 7.1.1 and 8.1.1, project and plan variables will not be migrated.
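-
-As noted above, a disabled Bamboo job is converted into a workflow job guarded by an `if: false` condition, similar to the following sketch (the job name and step are hypothetical):
-
-```yaml
-jobs:
-  deploy:
-    # Remove the following line to re-enable the job.
-    if: false
-    runs-on: ubuntu-latest
-    steps:
-      - run: ./deploy.sh
-```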
-
-#### Manual tasks
-
-Certain Bamboo constructs must be migrated manually. These include:
-
-* Masked variables
-* Artifact expiry settings
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Bamboo and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a Bamboo {% data variables.product.pat_generic %}. For more information, see [{% data variables.product.pat_generic_title_case_plural %}](https://confluence.atlassian.com/bamboo/personal-access-tokens-976779873.html) in the Bamboo documentation.
-
- Your token must have the following permissions, depending on which resources will be transformed.
-
- Resource Type | View | View Configuration | Edit
- |:--- | :---: | :---: | :---:
- | Build Plan | {% octicon "check" aria-label="Required" %} | {% octicon "check" aria-label="Required" %} | {% octicon "check" aria-label="Required" %}
- | Deployment Project | {% octicon "check" aria-label="Required" %} | {% octicon "check" aria-label="Required" %} | {% octicon "x" aria-label="Not required" %}
- | Deployment Environment | {% octicon "check" aria-label="Required" %} |{% octicon "x" aria-label="Not required" %}| {% octicon "x" aria-label="Not required" %}
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `Bamboo`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for Bamboo", enter the value for the Bamboo {% data variables.product.pat_generic %} that you created earlier, and press Enter.
- * For "Base url of the Bamboo instance", enter the URL for your Bamboo Server or Bamboo Data Center instance, and press Enter.
-
- An example of the `configure` command is shown below:
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: Bamboo
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for Bamboo: ********************
- ✔ Base url of the Bamboo instance: https://bamboo.example.com
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of Bamboo
-
-You can use the `audit` command to get a high-level view of all projects in a Bamboo organization.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in a Bamboo organization.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit of a Bamboo instance, run the following command in your terminal:
-
-```shell
-gh actions-importer audit bamboo --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecasting usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Bamboo instance.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast bamboo --output-dir tmp/forecast_reports
-```
-
-### Forecasting a project
-
-To limit the forecast to the plans and deployment environments associated with a project, you can use the `--project` option, where the value is set to a build project key.
-
-For example:
-
-```shell
-gh actions-importer forecast bamboo --project PAN --output-dir tmp/forecast_reports
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
- * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-## Perform a dry-run migration of a Bamboo pipeline
-
-You can use the `dry-run` command to convert a Bamboo pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-### Running a dry-run migration for a build plan
-
-To perform a dry run of migrating your Bamboo build plan to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `:my_plan_slug` with the plan's project and plan key in the format `<project_key>-<plan_key>` (for example: `PAN-SCRIP`).
-
-```shell
-gh actions-importer dry-run bamboo build --plan-slug :my_plan_slug --output-dir tmp/dry-run
-```
-
-### Running a dry-run migration for a deployment project
-
-To perform a dry run of migrating your Bamboo deployment project to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `:my_deployment_project_id` with the ID of the deployment project you are converting.
-
-```shell
-gh actions-importer dry-run bamboo deployment --deployment-project-id :my_deployment_project_id --output-dir tmp/dry-run
-```
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Perform a production migration of a Bamboo pipeline
-
-You can use the `migrate` command to convert a Bamboo pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command for a build plan
-
-To migrate a Bamboo build plan to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `:my_plan_slug` with the plan's project and plan key in the format `<project_key>-<plan_key>`.
-
-```shell
-gh actions-importer migrate bamboo build --plan-slug :my_plan_slug --target-url :target_url --output-dir tmp/migrate
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate bamboo build --plan-slug :PROJECTKEY-PLANKEY --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-### Running the migrate command for a deployment project
-
-To migrate a Bamboo deployment project to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `:my_deployment_project_id` with the ID of the deployment project you are converting.
-
-```shell
-gh actions-importer migrate bamboo deployment --deployment-project-id :my_deployment_project_id --target-url :target_url --output-dir tmp/migrate
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate bamboo deployment --deployment-project-id 123 --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
-[2023-04-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20230420-014033.log'
-[2023-04-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Bamboo.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Bamboo instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires `repo` and `workflow` scopes).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `BAMBOO_ACCESS_TOKEN`: The Bamboo {% data variables.product.pat_generic %} used to authenticate with your Bamboo instance.
-* `BAMBOO_INSTANCE_URL`: The URL to the Bamboo instance (for example, `https://bamboo.example.com`).
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
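-
-For example, a minimal `.env.local` sketch for Bamboo might look like the following (all values are placeholders):
-
-```shell
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-BAMBOO_ACCESS_TOKEN=example-bamboo-token
-BAMBOO_INSTANCE_URL=https://bamboo.example.com
-```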
-
-### Optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `dry-run` or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bamboo instance. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```shell
-gh actions-importer dry-run bamboo build --plan-slug IN-COM -o tmp/bamboo --source-file-path ./path/to/my/bamboo/file.yml
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bamboo instance. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```bash
-gh actions-importer audit bamboo -o tmp/bamboo --config-file-path "./path/to/my/bamboo/config.yml"
-```
-
-To audit a Bamboo instance using a config file, the config file must be in the following format, and each `repository_slug` must be unique:
-
-```yaml
-source_files:
- - repository_slug: IN/COM
- path: path/to/one/source/file.yml
- - repository_slug: IN/JOB
- path: path/to/another/source/file.yml
-```
-
-##### Dry run example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.
-
-The repository slug is built using the `--plan-slug` option. The source file path is matched and pulled from the specified source file.
-
-```bash
-gh actions-importer dry-run bamboo build --plan-slug IN-COM -o tmp/bamboo --config-file-path "./path/to/my/bamboo/config.yml"
-```
-
-### Supported syntax for Bamboo pipelines
-
-The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.
-
-| Bamboo | GitHub Actions | Status |
-| :---------------------------------- | :-----------------------------------------------| ---------------------: |
-| `environments` | `jobs` | Supported |
-| `environments.<environment>` | `jobs.<job_id>` | Supported |
-| `<job>.artifacts` | `jobs.<job_id>.steps.actions/upload-artifact` | Supported |
-| `<job>.artifact-subscriptions` | `jobs.<job_id>.steps.actions/download-artifact` | Supported |
-| `<job>.docker` | `jobs.<job_id>.container` | Supported |
-| `<job>.final-tasks` | `jobs.<job_id>.steps.if` | Supported |
-| `<job>.requirements` | `jobs.<job_id>.runs-on` | Supported |
-| `<job>.tasks` | `jobs.<job_id>.steps` | Supported |
-| `<job>.variables` | `jobs.<job_id>.env` | Supported |
-| `stages` | `jobs.<job_id>.needs` | Supported |
-| `stages.<stage>.final` | `jobs.<job_id>.if` | Supported |
-| `stages.<stage>.jobs` | `jobs` | Supported |
-| `stages.<stage>.jobs.<job>` | `jobs.<job_id>` | Supported |
-| `stages.<stage>.manual` | `jobs.<job_id>.environment` | Supported |
-| `triggers` | `on` | Supported |
-| `dependencies` | `jobs.<job_id>.steps` | Partially Supported |
-| `branches` | Not applicable | Unsupported |
-| `deployment.deployment-permissions` | Not applicable | Unsupported |
-| `environment-permissions` | Not applicable | Unsupported |
-| `notifications` | Not applicable | Unsupported |
-| `plan-permissions` | Not applicable | Unsupported |
-| `release-naming` | Not applicable | Unsupported |
-| `repositories` | Not applicable | Unsupported |
-
-For more information about supported Bamboo concept and plugin mappings, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/bamboo/index.md).
-
-### Environment variable mapping
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Bamboo environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| Bamboo | GitHub Actions |
-| :----------------------------------------------- | :-------------------------------------------------- |
-| `bamboo.agentId` | {% raw %}`${{ github.runner_name }}`{% endraw %}
-| `bamboo.agentWorkingDirectory` | {% raw %}`${{ github.workspace }}`{% endraw %}
-| `bamboo.buildKey` | {% raw %}`${{ github.workflow }}-${{ github.job }}`{% endraw %}
-| `bamboo.buildNumber` | {% raw %}`${{ github.run_id }}`{% endraw %}
-| `bamboo.buildPlanName` | {% raw %}`${{ github.repository }}-${{ github.workflow }}-${{ github.job }}`{% endraw %}
-| `bamboo.buildResultKey` | {% raw %}`${{ github.workflow }}-${{ github.job }}-${{ github.run_id }}`{% endraw %}
-| `bamboo.buildResultsUrl` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %}
-| `bamboo.build.working.directory` | {% raw %}`${{ github.workspace }}`{% endraw %}
-| `bamboo.deploy.project` | {% raw %}`${{ github.repository }}`{% endraw %}
-| `bamboo.ManualBuildTriggerReason.userName` | {% raw %}`${{ github.actor }}`{% endraw %}
-| `bamboo.planKey` | {% raw %}`${{ github.workflow }}`{% endraw %}
-| `bamboo.planName` | {% raw %}`${{ github.repository }}-${{ github.workflow }}`{% endraw %}
-| `bamboo.planRepository.branchDisplayName` | {% raw %}`${{ github.ref }}`{% endraw %}
-| `bamboo.planRepository.<position>.branch` | {% raw %}`${{ github.ref }}`{% endraw %}
-| `bamboo.planRepository.<position>.branchName` | {% raw %}`${{ github.ref }}`{% endraw %}
-| `bamboo.planRepository.<position>.name` | {% raw %}`${{ github.repository }}`{% endraw %}
-| `bamboo.planRepository.<position>.repositoryUrl` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %}
-| `bamboo.planRepository.<position>.revision` | {% raw %}`${{ github.sha }}`{% endraw %}
-| `bamboo.planRepository.<position>.username` | {% raw %}`${{ github.actor }}`{% endraw %}
-| `bamboo.repository.branch.name` | {% raw %}`${{ github.ref }}`{% endraw %}
-| `bamboo.repository.git.branch` | {% raw %}`${{ github.ref }}`{% endraw %}
-| `bamboo.repository.git.repositoryUrl` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %}
-| `bamboo.repository.pr.key` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %}
-| `bamboo.repository.pr.sourceBranch` | {% raw %}`${{ github.event.pull_request.head.ref }}`{% endraw %}
-| `bamboo.repository.pr.targetBranch` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %}
-| `bamboo.resultsUrl` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %}
-| `bamboo.shortJobKey` | {% raw %}`${{ github.job }}`{% endraw %}
-| `bamboo.shortJobName` | {% raw %}`${{ github.job }}`{% endraw %}
-| `bamboo.shortPlanKey` | {% raw %}`${{ github.workflow }}`{% endraw %}
-| `bamboo.shortPlanName` | {% raw %}`${{ github.workflow }}`{% endraw %}
-
-{% note %}
-
-**Note:** Unknown variables are transformed to {% raw %}`${{ env.<variable_name> }}`{% endraw %} and must be replaced, or added under `env`, for proper operation. For example, `${bamboo.jira.baseUrl}` will become {% raw %}`${{ env.jira_baseUrl }}`{% endraw %}.
-
-{% endnote %}
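-
-Continuing the example from the note above, a converted workflow will only resolve {% raw %}`${{ env.jira_baseUrl }}`{% endraw %} once the value is defined under `env`, as in the following sketch (the URL and step are placeholders):
-
-```yaml
-env:
-  jira_baseUrl: https://jira.example.com
-
-jobs:
-  build:
-    runs-on: ubuntu-latest
-    steps:
-      - run: {% raw %}echo "Jira is at ${{ env.jira_baseUrl }}"{% endraw %}
-```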
-
-### System Variables
-
-System variables used in tasks are transformed to the equivalent bash shell variable and are assumed to be available. For example, `${system.<variable_name>}` will be transformed to `$variable_name`. We recommend you verify this to ensure proper operation of the workflow.
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bitbucket-pipelines-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bitbucket-pipelines-with-github-actions-importer.md
deleted file mode 100644
index 30e9b5d77f3b..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-bitbucket-pipelines-with-github-actions-importer.md
+++ /dev/null
@@ -1,350 +0,0 @@
----
-title: Migrating from Bitbucket Pipelines with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Bitbucket pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Bitbucket Pipelines migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-bitbucket-pipelines-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from Bitbucket Pipelines with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Bitbucket Pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from Bitbucket Pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.
-
-* Images in a private AWS ECR are not supported.
-* The Bitbucket Pipelines option `size` is not supported. {% ifversion fpt or ghec %}If additional runner resources are required in {% data variables.product.prodname_actions %}, consider using {% data variables.actions.hosted_runner %}s. For more information, see "[AUTOTITLE](/actions/using-github-hosted-runners/about-larger-runners)."{% endif %}
-* Metrics detailing the queue time of jobs are not supported by the `forecast` command.
-* Bitbucket [after-scripts](https://support.atlassian.com/bitbucket-cloud/docs/step-options/#After-script) are supported using {% data variables.product.prodname_actions %} `always()` in combination with checking the `steps.<step_id>.conclusion` of the previous step. For more information, see "[AUTOTITLE](/actions/learn-github-actions/contexts#steps-context)."
-
-  The following is an example of using `always()` with `steps.<step_id>.conclusion`.
-
- ```yaml
- - name: After Script 1
- run: |-
- echo "I'm after the script ran!"
- echo "We should be grouped!"
- id: after-script-1
- if: "{% raw %}${{ always() }}{% endraw %}"
- - name: After Script 2
- run: |-
- echo "this is really the end"
- echo "goodbye, for now!"
- id: after-script-2
- if: "{% raw %}${{ steps.after-script-1.conclusion == 'success' && always() }}{% endraw %}"
- ```
-
-### Manual tasks
-
-Certain Bitbucket Pipelines constructs must be migrated manually. These include:
-
-* Secured repository, workspace, and deployment variables (see the example after this list)
-* SSH keys
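-
-For example, a secured repository variable can be recreated manually as an encrypted {% data variables.product.prodname_actions %} secret using the {% data variables.product.prodname_dotcom %} CLI. The repository and secret names below are hypothetical:
-
-```shell
-gh secret set DEPLOY_TOKEN --repo octo-org/octo-repo
-```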
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Bitbucket Pipelines and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a Workspace Access Token for Bitbucket Pipelines. For more information, see [Workspace Access Token permissions](https://support.atlassian.com/bitbucket-cloud/docs/workspace-access-token-permissions/) in the Bitbucket documentation. Your token must have the `read` scope for pipelines, projects, and repositories.
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `Bitbucket`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for Bitbucket", enter the Workspace Access Token that you created earlier, and press Enter.
- * For "Base url of the Bitbucket instance", enter the URL for your Bitbucket instance, and press Enter.
-
- An example of the `configure` command is shown below:
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: Bitbucket
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for Bitbucket: ********************
- ✔ Base url of the Bitbucket instance: https://bitbucket.example.com
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of the Bitbucket instance
-
-You can use the `audit` command to get a high-level view of all pipelines in a Bitbucket instance.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the pipelines defined in a workspace.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to audit.
-
-```bash
-gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit
-```
-
-Optionally, a `--project-key` option can be provided to the audit command to limit the results to only pipelines associated with a project.
-
-In the example command below, `:project_key` should be replaced with the key of the project to audit. Project keys can be found in Bitbucket on the workspace's projects page.
-
-```bash
-gh actions-importer audit bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecasting usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Bitbucket instance.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to forecast. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast bitbucket --workspace :workspace --output-dir tmp/forecast_reports
-```
-
-### Forecasting a project
-
-To limit the forecast to a project, you can use the `--project-key` option. Replace `:project_key` with the project key of the project to forecast.
-
-```shell
-gh actions-importer forecast bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/forecast_reports
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
- * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time.
-
-## Performing a dry-run migration
-
-You can use the `dry-run` command to convert a Bitbucket pipeline to its equivalent {% data variables.product.prodname_actions %} workflow(s). A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-### Running the dry-run command
-
-To perform a dry run of migrating a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `:workspace` with the name of the workspace and `:repo` with the name of the repository in Bitbucket.
-
-```bash
-gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run
-```
-
-### Inspecting the converted workflows
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Performing a production migration
-
-You can use the `migrate` command to convert a Bitbucket pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow(s).
-
-### Running the migrate command
-
-To migrate a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the following values.
-
-* Replace the `target-url` value with the URL for your {% data variables.product.company_short %} repository.
-* Replace `:repo` with the name of the repository in Bitbucket.
-* Replace `:workspace` with the name of the workspace.
-
-```bash
-gh actions-importer migrate bitbucket --workspace :workspace --repository :repo --target-url https://github.com/:owner/:repo --output-dir tmp/migrate
-```
-
-The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```bash
-$ gh actions-importer migrate bitbucket --workspace actions-importer --repository custom-trigger --target-url https://github.com/valet-dev-testing/demo-private --output-dir tmp/bitbucket
-[2023-07-18 09:56:06] Logs: 'tmp/bitbucket/log/valet-20230718-165606.log'
-[2023-07-18 09:56:24] Pull request: 'https://github.com/valet-dev-testing/demo-private/pull/55'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Bitbucket Pipelines.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Bitbucket instance.
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a transformed workflow (requires `repo` and `workflow` scopes).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `BITBUCKET_ACCESS_TOKEN`: The workspace access token with read scopes for pipeline, project, and repository.
-
-These environment variables can be specified in a `.env.local` file that will be loaded by {% data variables.product.prodname_actions_importer %} at run time. The distribution archive contains a `.env.local.template` file that can be used to create these files.
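-
-For example, a minimal `.env.local` sketch for Bitbucket Pipelines might look like the following (all values are placeholders):
-
-```shell
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-BITBUCKET_ACCESS_TOKEN=example-workspace-access-token
-```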
-
-### Optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `dry-run` or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```bash
-gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run --source-file-path path/to/my/pipeline/file.yml
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```bash
-gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit --config-file-path "path/to/my/bitbucket/config.yml"
-```
-
-To audit a Bitbucket instance using a config file, the config file must be in the following format, and each `repository_slug` must be unique:
-
-```yaml
-source_files:
- - repository_slug: repo_name
- path: path/to/one/source/file.yml
- - repository_slug: another_repo_name
- path: path/to/another/source/file.yml
-```
-
-### Supported syntax for Bitbucket Pipelines
-
-The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.
-
-| Bitbucket | GitHub Actions | Status |
-| :------------------- | :------------------------------------------- | -----------: |
-| `after-script` | `jobs.<job_id>.steps[*]` | Supported |
-| `artifacts` | `actions/upload-artifact` & `actions/download-artifact` | Supported |
-| `caches` | `actions/cache` | Supported |
-| `clone` | `actions/checkout` | Supported |
-| `condition` | `jobs.<job_id>.steps[*].run` | Supported |
-| `deployment` | `jobs.<job_id>.environment` | Supported |
-| `image` | `jobs.<job_id>.container` | Supported |
-| `max-time` | `jobs.<job_id>.steps[*].timeout-minutes` | Supported |
-| `options.docker` | None | Supported |
-| `options.max-time` | `jobs.<job_id>.steps[*].timeout-minutes` | Supported |
-| `parallel` | `jobs.<job_id>` | Supported |
-| `pipelines.branches` | `on.push` | Supported |
-| `pipelines.custom` | `on.workflow_dispatch` | Supported |
-| `pipelines.default` | `on.push` | Supported |
-| `pipelines.pull-requests` | `on.pull_request` | Supported |
-| `pipelines.tags` | `on.push.tags` | Supported |
-| `runs-on` | `jobs.<job_id>.runs-on` | Supported |
-| `script` | `jobs.<job_id>.steps[*].run` | Supported |
-| `services` | `jobs.<job_id>.services` | Supported |
-| `stage` | `jobs.<job_id>` | Supported |
-| `step` | `jobs.<job_id>.steps[*]` | Supported |
-| `trigger` | `on.workflow_dispatch` | Supported |
-| `fail-fast` | None | Unsupported |
-| `oidc` | None | Unsupported |
-| `options.size` | None | Unsupported |
-| `size` | None | Unsupported |
-
-### Environment variable mapping
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Bitbucket environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| Bitbucket | GitHub Actions |
-| :------------------------------------- | :------------------------------------------------------ |
-| `CI` | {% raw %}`true`{% endraw %} |
-| `BITBUCKET_BUILD_NUMBER` | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| `BITBUCKET_CLONE_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| `BITBUCKET_COMMIT` | {% raw %}`${{ github.sha }}`{% endraw %} |
-| `BITBUCKET_WORKSPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
-| `BITBUCKET_REPO_SLUG` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `BITBUCKET_REPO_UUID` | {% raw %}`${{ github.repository_id }}`{% endraw %} |
-| `BITBUCKET_REPO_FULL_NAME` | {% raw %}`${{ github.repository_owner }}`{% endraw %}/{% raw %}`${{ github.repository }}`{% endraw %} |
-| `BITBUCKET_BRANCH` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `BITBUCKET_TAG` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `BITBUCKET_PR_ID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
-| `BITBUCKET_PR_DESTINATION_BRANCH` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
-| `BITBUCKET_GIT_HTTP_ORIGIN` | {% raw %}`${{ github.event.repository.clone_url }}`{% endraw %} |
-| `BITBUCKET_GIT_SSH_ORIGIN` | {% raw %}`${{ github.event.repository.ssh_url }}`{% endraw %} |
-| `BITBUCKET_EXIT_CODE` | {% raw %}`${{ job.status }}`{% endraw %} |
-| `BITBUCKET_STEP_UUID` | {% raw %}`${{ job.github_job }}`{% endraw %} |
-| `BITBUCKET_PIPELINE_UUID` | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| `BITBUCKET_PROJECT_KEY` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
-| `BITBUCKET_PROJECT_UUID` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
-| `BITBUCKET_STEP_TRIGGERER_UUID` | {% raw %}`${{ github.actor_id }}`{% endraw %} |
-| `BITBUCKET_SSH_KEY_FILE` | {% raw %}`${{ github.workspace }}/.ssh/id_rsa`{% endraw %} |
-| `BITBUCKET_STEP_OIDC_TOKEN` | No Mapping |
-| `BITBUCKET_DEPLOYMENT_ENVIRONMENT` | No Mapping |
-| `BITBUCKET_DEPLOYMENT_ENVIRONMENT_UUID` | No Mapping |
-| `BITBUCKET_BOOKMARK` | No Mapping |
-| `BITBUCKET_PARALLEL_STEP` | No Mapping |
-| `BITBUCKET_PARALLEL_STEP_COUNT` | No Mapping |
-
-### System Variables
-
-System variables used in tasks are transformed to the equivalent bash shell variable and are assumed to be available. For example, `${system.<variable_name>}` will be transformed to `$variable_name`. We recommend you verify this to ensure proper operation of the workflow.
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-circleci-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-circleci-with-github-actions-importer.md
deleted file mode 100644
index a7eab3e873ec..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-circleci-with-github-actions-importer.md
+++ /dev/null
@@ -1,362 +0,0 @@
----
-title: Migrating from CircleCI with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your CircleCI pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: CircleCI migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-circleci-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from CircleCI with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate CircleCI pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* A CircleCI account or organization with projects and pipelines that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Access to create a CircleCI personal API token for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from CircleCI to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}:
-
-* Automatic caching in between jobs of different workflows is not supported.
-* The `audit` command is only supported when you use a CircleCI organization account. The `dry-run` and `migrate` commands can be used with a CircleCI organization or user account.
-
-#### Manual tasks
-
-Certain CircleCI constructs must be migrated manually. These include:
-
-* Contexts
-* Project-level environment variables
-* Unknown job properties
-* Unknown orbs
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with CircleCI and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a CircleCI personal API token. For more information, see [Managing API Tokens](https://circleci.com/docs/managing-api-tokens/#creating-a-personal-api-token) in the CircleCI documentation.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `CircleCI`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for CircleCI", enter the value for the CircleCI personal API token that you created earlier, and press Enter.
- * For "Base url of the CircleCI instance", press Enter to accept the default value (`https://circleci.com`).
- * For "CircleCI organization name", enter the name for your CircleCI organization, and press Enter.
-
- An example of the `configure` command is shown below:
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: CircleCI
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for CircleCI: ********************
- ✔ Base url of the CircleCI instance: https://circleci.com
- ✔ CircleCI organization name: mycircleciorganization
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of CircleCI
-
-You can use the `audit` command to get a high-level view of all projects in a CircleCI organization.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in a CircleCI organization.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit of a CircleCI organization, run the following command in your terminal:
-
-```shell
-gh actions-importer audit circle-ci --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecast potential {% data variables.product.prodname_actions %} usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in CircleCI.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast circle-ci --output-dir tmp/forecast_reports
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
-
- This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-Additionally, these metrics are defined for each queue of runners in CircleCI. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.
-
-## Perform a dry-run migration of a CircleCI pipeline
-
-You can use the `dry-run` command to convert a CircleCI pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-To perform a dry run of migrating your CircleCI project to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-circle-ci-project` with the name of your CircleCI project.
-
-```shell
-gh actions-importer dry-run circle-ci --output-dir tmp/dry-run --circle-ci-project my-circle-ci-project
-```
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Perform a production migration of a CircleCI pipeline
-
-You can use the `migrate` command to convert a CircleCI pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command
-
-To migrate a CircleCI pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `my-circle-ci-project` with the name of your CircleCI project.
-
-```shell
-gh actions-importer migrate circle-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --circle-ci-project my-circle-ci-project
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate circle-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --circle-ci-project my-circle-ci-project
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from CircleCI.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your CircleCI instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires `repo` and `workflow` scopes).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `CIRCLE_CI_ACCESS_TOKEN`: The CircleCI personal API token used to authenticate with your CircleCI instance.
-* `CIRCLE_CI_INSTANCE_URL`: The URL to the CircleCI instance (for example, `https://circleci.com`). If the variable is left unset, `https://circleci.com` is used as the default value.
-* `CIRCLE_CI_ORGANIZATION`: The organization name of your CircleCI instance.
-* `CIRCLE_CI_PROVIDER`: The location where your pipeline's source file is stored (such as `github`). Currently, only {% data variables.product.prodname_dotcom %} is supported.
-* `CIRCLE_CI_SOURCE_GITHUB_ACCESS_TOKEN` (Optional): The {% data variables.product.pat_v1 %} used to authenticate with your source {% data variables.product.prodname_dotcom %} instance (requires `repo` scope). If not provided, the value of `GITHUB_ACCESS_TOKEN` is used instead.
-* `CIRCLE_CI_SOURCE_GITHUB_INSTANCE_URL` (Optional): The URL to the source {% data variables.product.prodname_dotcom %} instance. If not provided, the value of `GITHUB_INSTANCE_URL` is used instead.
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
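-
-For example, a minimal `.env.local` sketch for CircleCI might look like the following (all values are placeholders):
-
-```shell
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-CIRCLE_CI_ACCESS_TOKEN=example-circleci-token
-CIRCLE_CI_INSTANCE_URL=https://circleci.com
-CIRCLE_CI_ORGANIZATION=mycircleciorganization
-CIRCLE_CI_PROVIDER=github
-```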
-
-### Optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```shell
-gh actions-importer dry-run circle-ci --output-dir ./output/ --source-file-path ./path/to/.circleci/config.yml
-```
-
-If you would like to supply multiple source files when running the `forecast` subcommand, you can use pattern matching in the file path value. For example, `gh actions-importer forecast circle-ci --source-file-path ./tmp/previous_forecast/jobs/*.json` supplies {% data variables.product.prodname_actions_importer %} with any source files that match the `./tmp/previous_forecast/jobs/*.json` file path.
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-The `--config-file-path` argument can also be used to specify which repository a converted composite action should be migrated to.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```bash
-gh actions-importer audit circle-ci --output-dir ./output/ --config-file-path ./path/to/circle-ci/config.yml
-```
-
-To audit a CircleCI instance using a config file, the config file must be in the following format, and each `repository_slug` must be unique:
-
-```yaml
-source_files:
- - repository_slug: circle-org-name/circle-project-name
- path: path/to/.circleci/config.yml
- - repository_slug: circle-org-name/some-other-circle-project-name
- path: path/to/.circleci/config.yml
-```
-
-##### Dry run example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.
-
-The pipeline is selected by matching the `repository_slug` in the config file to the value of the `--circle-ci-organization` and `--circle-ci-project` options. The `path` is then used to pull the specified source file.
-
-```bash
-gh actions-importer dry-run circle-ci --circle-ci-project circle-org-name/circle-project-name --output-dir ./output/ --config-file-path ./path/to/circle-ci/config.yml
-```
-
-##### Specify the repository of converted composite actions
-
-{% data variables.product.prodname_actions_importer %} uses the YAML file provided to the `--config-file-path` argument to determine the repository that converted composite actions are migrated to.
-
-To begin, you should run an audit without the `--config-file-path` argument:
-
-```bash
-gh actions-importer audit circle-ci --output-dir ./output/
-```
-
-The output of this command will contain a file named `config.yml` that contains a list of all the composite actions that were converted by {% data variables.product.prodname_actions_importer %}. For example, the `config.yml` file may have the following contents:
-
-```yaml
-composite_actions:
- - name: my-composite-action.yml
- target_url: https://github.com/octo-org/octo-repo
- ref: main
-```
-
-You can use this file to specify which repository and ref a reusable workflow or composite action should be added to. You can then use the `--config-file-path` argument to provide the `config.yml` file to {% data variables.product.prodname_actions_importer %}. For example, you can use this file when running a `migrate` command to open a pull request for each unique repository defined in the config file:
-
-```bash
-gh actions-importer migrate circle-ci --circle-ci-project my-project-name --output-dir output/ --config-file-path config.yml --target-url https://github.com/my-org/my-repo
-```
-
-#### `--include-from`
-
-You can use the `--include-from` argument with the `audit` subcommand.
-
-The `--include-from` argument specifies a file that contains a line-delimited list of repositories to include in the audit of a CircleCI organization. Any repositories that are not included in the file are excluded from the audit.
-
-For example:
-
-```bash
-gh actions-importer audit circle-ci --output-dir ./output/ --include-from repositories.txt
-```
-
-The file supplied for this parameter must be a line-delimited list of repositories, for example:
-
-```text
-repository_one
-repository_two
-repository_three
-```
-
-### Supported syntax for CircleCI pipelines
-
-The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.
-
-| CircleCI Pipelines | GitHub Actions | Status |
-| :------------------ | :--------------------------------- | :------------------ |
-| cron triggers | | Supported |
-| environment | `env`, `jobs.<job_id>.env`, `jobs.<job_id>.steps.env` | Supported |
-| executors | | Supported |
-| jobs | | Supported |
-| job | | Supported |
-| matrix | `jobs.<job_id>.strategy`, `jobs.<job_id>.strategy.matrix` | Supported |
-| parameters | `env`, `workflow-dispatch.inputs` | Supported |
-| steps | | Supported |
-| when, unless | | Supported |
-| triggers | | Supported |
-| executors | | Partially Supported |
-| orbs | | Partially Supported |
-| executors | | Unsupported |
-| setup | Not applicable | Unsupported |
-| version | Not applicable | Unsupported |
-
-For more information about supported CircleCI concepts and orb mappings, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/circle_ci/index.md).
-
-### Environment variable mapping
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default CircleCI environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| CircleCI | GitHub Actions |
-| :------------------------------------ | :--------------------------------------------- |
-| `CI` | {% raw %}`$CI`{% endraw %} |
-| `CIRCLE_BRANCH` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CIRCLE_JOB` | {% raw %}`${{ github.job }}`{% endraw %} |
-| `CIRCLE_PR_NUMBER` | {% raw %}`${{ github.event.number }}`{% endraw %} |
-| `CIRCLE_PR_REPONAME` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CIRCLE_PROJECT_REPONAME` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CIRCLE_SHA1` | {% raw %}`${{ github.sha }}`{% endraw %} |
-| `CIRCLE_TAG` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CIRCLE_USERNAME` | {% raw %}`${{ github.actor }}`{% endraw %} |
-| `CIRCLE_WORKFLOW_ID` | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| `CIRCLE_WORKING_DIRECTORY` | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| `<< pipeline.id >>` | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| `<< pipeline.number >>` | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| `<< pipeline.project.git_url >>` | `$GITHUB_SERVER_URL/$GITHUB_REPOSITORY` |
-| `<< pipeline.project.type >>` | `github` |
-| `<< pipeline.git.tag >>` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `<< pipeline.git.branch >>` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `<< pipeline.git.revision >>` | {% raw %}`${{ github.event.pull_request.head.sha }}`{% endraw %} |
-| `<< pipeline.git.base_revision >>` | {% raw %}`${{ github.event.pull_request.base.sha }}`{% endraw %} |
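-
-As a rough illustration of this mapping, a step that previously echoed `$CIRCLE_SHA1` and `$CIRCLE_BRANCH` would reference the equivalent {% data variables.product.prodname_actions %} contexts after conversion. The step name and command in the sketch below are assumptions for illustration, not actual {% data variables.product.prodname_actions_importer %} output:
-
-```yaml
-steps:
-  # Hypothetical converted step: the CircleCI variables CIRCLE_SHA1 and
-  # CIRCLE_BRANCH are replaced with the mapped GitHub Actions contexts.
-  - name: Print build metadata
-    run: echo "Building {% raw %}${{ github.sha }}{% endraw %} on {% raw %}${{ github.ref }}{% endraw %}"
-```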
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-gitlab-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-gitlab-with-github-actions-importer.md
deleted file mode 100644
index 190916ce7a4a..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-gitlab-with-github-actions-importer.md
+++ /dev/null
@@ -1,421 +0,0 @@
----
-title: Migrating from GitLab with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your GitLab pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: GitLab migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-gitlab-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from GitLab with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate GitLab pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* A GitLab account or organization with pipelines and jobs that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Access to create a GitLab {% data variables.product.pat_generic %} for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations on migrating processes automatically from GitLab pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.
-
-* Automatic caching in between jobs of different workflows is not supported.
-* The `audit` command is only supported when using an organization account. However, the `dry-run` and `migrate` commands can be used with an organization or user account.
-
-#### Manual tasks
-
-Certain GitLab constructs must be migrated manually. These include:
-
-* Masked project or group variable values
-* Artifact reports
-
-For more information on manual migrations, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-gitlab-cicd-to-github-actions)."
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with GitLab and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a GitLab {% data variables.product.pat_generic %}. For more information, see [{% data variables.product.pat_generic_caps_plural %}](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token) in the GitLab documentation.
-
- Your token must have the `read_api` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `GitLab`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "Private token for GitLab", enter the value for the GitLab {% data variables.product.pat_generic %} that you created earlier, and press Enter.
- * For "Base url of the GitLab instance", enter the URL of your GitLab instance, and press Enter.
-
- An example of the output of the `configure` command is shown below.
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: GitLab
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ Private token for GitLab: ***************
- ✔ Base url of the GitLab instance: http://localhost
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of GitLab
-
-You can use the `audit` command to get a high-level view of all pipelines in a GitLab server.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in a GitLab server.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Prerequisites for the audit command
-
-In order to use the `audit` command, you must have a {% data variables.product.pat_generic %} configured with a GitLab organization account.
-
-### Running the audit command
-
-To perform an audit of a GitLab server, run the following command in your terminal, replacing `my-gitlab-namespace` with the namespace or group you are auditing:
-
-```shell
-gh actions-importer audit gitlab --output-dir tmp/audit --namespace my-gitlab-namespace
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecast potential build runner usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your GitLab server.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal, replacing `my-gitlab-namespace` with the namespace or group you are forecasting. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast gitlab --output-dir tmp/forecast --namespace my-gitlab-namespace
-```
-
-### Forecasting an entire namespace
-
-To forecast an entire namespace and all of its subgroups, you must specify each subgroup in the `--namespace` argument or `NAMESPACE` environment variable.
-
-For example:
-
-```shell
-gh actions-importer forecast gitlab --namespace my-gitlab-namespace my-gitlab-namespace/subgroup-one my-gitlab-namespace/subgroup-two ...
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
- * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-Additionally, these metrics are defined for each queue of runners in GitLab. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.
-
-## Perform a dry-run migration of a GitLab pipeline
-
-You can use the `dry-run` command to convert a GitLab pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the dry-run command
-
-You can use the `dry-run` command to convert a GitLab pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-To perform a dry run of migrating your GitLab pipelines to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-gitlab-project` with your GitLab project slug, and `my-gitlab-namespace` with the namespace or group (full group path for subgroups, e.g. `my-org/my-team`) you are performing a dry run for.
-
-```shell
-gh actions-importer dry-run gitlab --output-dir tmp/dry-run --namespace my-gitlab-namespace --project my-gitlab-project
-```
-
-### Inspecting the converted workflows
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Perform a production migration of a GitLab pipeline
-
-You can use the `migrate` command to convert a GitLab pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command
-
-To migrate a GitLab pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the following values:
-
-* `target-url` value with the URL for your {% data variables.product.product_name %} repository
-* `my-gitlab-project` with your GitLab project slug
-* `my-gitlab-namespace` with the namespace or group you are migrating (full path for subgroups, e.g. `my-org/my-team`)
-
-```shell
-gh actions-importer migrate gitlab --target-url https://github.com/:owner/:repo --output-dir tmp/migrate --namespace my-gitlab-namespace --project my-gitlab-project
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate gitlab --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --namespace octo-org --project monas-project
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from GitLab.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your GitLab instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `GITLAB_ACCESS_TOKEN`: The GitLab {% data variables.product.pat_generic %} used to view GitLab resources.
-* `GITLAB_INSTANCE_URL`: The URL of the GitLab instance.
-* `NAMESPACE`: The namespaces or groups that contain the GitLab pipelines.
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
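-
-For example, a `.env.local` file for GitLab might look like the following sketch, where every value is a placeholder that you would replace with your own:
-
-```shell
-GITHUB_ACCESS_TOKEN=<your-github-pat>
-GITHUB_INSTANCE_URL=https://github.com
-GITLAB_ACCESS_TOKEN=<your-gitlab-pat>
-GITLAB_INSTANCE_URL=https://gitlab.example.com
-NAMESPACE=my-gitlab-namespace
-```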
-
-### Using optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```shell
-gh actions-importer dry-run gitlab --output-dir output/ --namespace my-gitlab-namespace --project my-gitlab-project --source-file-path path/to/.gitlab-ci.yml
-```
-
-If you would like to supply multiple source files when running the `forecast` subcommand, you can use pattern matching in the file path value. The following example supplies {% data variables.product.prodname_actions_importer %} with any source files that match the `./tmp/previous_forecast/jobs/*.json` file path.
-
-```shell
-gh actions-importer forecast gitlab --output-dir output/ --namespace my-gitlab-namespace --project my-gitlab-project --source-file-path ./tmp/previous_forecast/jobs/*.json
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-The `--config-file-path` argument can also be used to specify which repository a converted reusable workflow should be migrated to.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```shell
-gh actions-importer audit gitlab --output-dir path/to/output/ --namespace my-gitlab-namespace --config-file-path path/to/gitlab/config.yml
-```
-
-To audit a GitLab instance using a configuration file, the file must be in the following format, and each `repository_slug` value must be unique:
-
-```yaml
-source_files:
- - repository_slug: namespace/project-name
- path: path/to/.gitlab-ci.yml
- - repository_slug: namespace/some-other-project-name
- path: path/to/.gitlab-ci.yml
-```
-
-##### Dry run example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.
-
-The pipeline is selected by matching the `repository_slug` in the configuration file to the value of the `--namespace` and `--project` options. The `path` is then used to pull the specified source file.
-
-```shell
-gh actions-importer dry-run gitlab --namespace my-gitlab-namespace --project my-gitlab-project-name --output-dir ./output/ --config-file-path ./path/to/gitlab/config.yml
-```
-
-##### Specify the repository of converted reusable workflows
-
-{% data variables.product.prodname_actions_importer %} uses the YAML file provided to the `--config-file-path` argument to determine the repository that converted reusable workflows are migrated to.
-
-To begin, you should run an audit without the `--config-file-path` argument:
-
-```shell
-gh actions-importer audit gitlab --output-dir ./output/
-```
-
-The output of this command will contain a file named `config.yml` that contains a list of all the reusable workflows that were converted by {% data variables.product.prodname_actions_importer %}. For example, the `config.yml` file may have the following contents:
-
-```yaml
-reusable_workflows:
- - name: my-reusable-workflow.yml
- target_url: https://github.com/octo-org/octo-repo
- ref: main
-```
-
-You can use this file to specify which repository and ref a reusable workflow or composite action should be added to. You can then use the `--config-file-path` argument to provide the `config.yml` file to {% data variables.product.prodname_actions_importer %}. For example, you can use this file when running a `migrate` command to open a pull request for each unique repository defined in the config file:
-
-```shell
-gh actions-importer migrate gitlab --project my-project-name --output-dir output/ --config-file-path config.yml --target-url https://github.com/my-org/my-repo
-```
-
-### Supported syntax for GitLab pipelines
-
-The following table shows the type of properties {% data variables.product.prodname_actions_importer %} is currently able to convert. For more details about how GitLab pipeline syntax aligns with {% data variables.product.prodname_actions %}, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-gitlab-cicd-to-github-actions)".
-
-| GitLab Pipelines | GitHub Actions | Status |
-| :-------------------------------------- | :------------------------------ | :-------------------------- |
-| `after_script` | `jobs.<job_id>.steps` | Supported |
-| `auto_cancel_pending_pipelines` | `concurrency` | Supported |
-| `before_script` | `jobs.<job_id>.steps` | Supported |
-| `build_timeout` or `timeout` | `jobs.<job_id>.timeout-minutes` | Supported |
-| `default` | Not applicable | Supported |
-| `image` | `jobs.<job_id>.container` | Supported |
-| `job` | `jobs.<job_id>` | Supported |
-| `needs` | `jobs.<job_id>.needs` | Supported |
-| `only_allow_merge_if_pipeline_succeeds` | `on.pull_request` | Supported |
-| `resource_group` | `jobs.<job_id>.concurrency` | Supported |
-| `schedule` | `on.schedule` | Supported |
-| `script` | `jobs.<job_id>.steps` | Supported |
-| `stages` | `jobs` | Supported |
-| `tags` | `jobs.<job_id>.runs-on` | Supported |
-| `variables` | `env`, `jobs.<job_id>.env` | Supported |
-| Run pipelines for new commits | `on.push` | Supported |
-| Run pipelines manually | `on.workflow_dispatch` | Supported |
-| `environment` | `jobs.<job_id>.environment` | Partially supported |
-| `include` | Files referenced in an `include` statement are merged into a single job graph before being transformed. | Partially supported |
-| `only` or `except` | `jobs.<job_id>.if` | Partially supported |
-| `parallel` | `jobs.<job_id>.strategy` | Partially supported |
-| `rules` | `jobs.<job_id>.if` | Partially supported |
-| `services` | `jobs.<job_id>.services` | Partially supported |
-| `workflow` | `if` | Partially supported |
-
-For information about supported GitLab constructs, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/gitlab/index.md).
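-
-To make the mapping above concrete, consider a small, hypothetical GitLab job that uses `script`, `tags`, and `variables`. Based on the table, the converted workflow would be expected to look roughly like the following sketch; the job, step, and variable names are assumptions, and the actual output from {% data variables.product.prodname_actions_importer %} may differ:
-
-```yaml
-# Sketch of a converted workflow for a GitLab job that ran a build script
-# on a tagged runner with a NODE_ENV variable set.
-name: example-pipeline
-on:
-  push: # "Run pipelines for new commits" maps to on.push
-jobs:
-  build:
-    runs-on: ubuntu-latest # GitLab `tags` map to jobs.<job_id>.runs-on
-    env:
-      NODE_ENV: production # GitLab `variables` map to env or jobs.<job_id>.env
-    steps:
-      - uses: actions/checkout@v4
-      - run: npm ci && npm run build # GitLab `script` maps to jobs.<job_id>.steps
-```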
-
-### Environment variables syntax
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default GitLab environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| GitLab | GitHub Actions |
-| :-------------------------------------------- | :------------------------------------------------------------------------------------ |
-| `CI_API_V4_URL` | {% raw %}`${{ github.api_url }}`{% endraw %} |
-| `CI_BUILDS_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| `CI_COMMIT_BRANCH` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CI_COMMIT_REF_NAME` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CI_COMMIT_REF_SLUG` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CI_COMMIT_SHA` | {% raw %}`${{ github.sha }}`{% endraw %} |
-| `CI_COMMIT_SHORT_SHA` | {% raw %}`${{ github.sha }}`{% endraw %} |
-| `CI_COMMIT_TAG` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CI_JOB_ID` | {% raw %}`${{ github.job }}`{% endraw %} |
-| `CI_JOB_MANUAL` | {% raw %}`${{ github.event_name == 'workflow_dispatch' }}`{% endraw %} |
-| `CI_JOB_NAME` | {% raw %}`${{ github.job }}`{% endraw %} |
-| `CI_JOB_STATUS` | {% raw %}`${{ job.status }}`{% endraw %} |
-| `CI_JOB_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |
-| `CI_JOB_TOKEN` | {% raw %}`${{ github.token }}`{% endraw %} |
-| `CI_NODE_INDEX` | {% raw %}`${{ strategy.job-index }}`{% endraw %} |
-| `CI_NODE_TOTAL` | {% raw %}`${{ strategy.job-total }}`{% endraw %} |
-| `CI_PIPELINE_ID` | {% raw %}`${{ github.repository }}/${{ github.workflow }}`{% endraw %} |
-| `CI_PIPELINE_IID` | {% raw %}`${{ github.workflow }}`{% endraw %} |
-| `CI_PIPELINE_SOURCE` | {% raw %}`${{ github.event_name }}`{% endraw %} |
-| `CI_PIPELINE_TRIGGERED` | {% raw %}`${{ github.actions }}`{% endraw %} |
-| `CI_PIPELINE_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |
-| `CI_PROJECT_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| `CI_PROJECT_ID` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CI_PROJECT_NAME` | {% raw %}`${{ github.event.repository.name }}`{% endraw %} |
-| `CI_PROJECT_NAMESPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
-| `CI_PROJECT_PATH_SLUG` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CI_PROJECT_PATH` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CI_PROJECT_ROOT_NAMESPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
-| `CI_PROJECT_TITLE` | {% raw %}`${{ github.event.repository.full_name }}`{% endraw %} |
-| `CI_PROJECT_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
-| `CI_REPOSITORY_URL` | {% raw %}`${{ github.event.repository.clone_url }}`{% endraw %} |
-| `CI_RUNNER_EXECUTABLE_ARCH` | {% raw %}`${{ runner.os }}`{% endraw %} |
-| `CI_SERVER_HOST` | {% raw %}`${{ github.server_url }}`{% endraw %} |
-| `CI_SERVER_URL` | {% raw %}`${{ github.server_url }}`{% endraw %} |
-| `CI_SERVER` | {% raw %}`${{ github.actions }}`{% endraw %} |
-| `GITLAB_CI` | {% raw %}`${{ github.actions }}`{% endraw %} |
-| `GITLAB_USER_EMAIL` | {% raw %}`${{ github.actor }}`{% endraw %} |
-| `GITLAB_USER_ID` | {% raw %}`${{ github.actor }}`{% endraw %} |
-| `GITLAB_USER_LOGIN` | {% raw %}`${{ github.actor }}`{% endraw %} |
-| `GITLAB_USER_NAME` | {% raw %}`${{ github.actor }}`{% endraw %} |
-| `TRIGGER_PAYLOAD` | {% raw %}`${{ github.event_path }}`{% endraw %} |
-| `CI_MERGE_REQUEST_ASSIGNEES` | {% raw %}`${{ github.event.pull_request.assignees }}`{% endraw %} |
-| `CI_MERGE_REQUEST_ID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
-| `CI_MERGE_REQUEST_IID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
-| `CI_MERGE_REQUEST_LABELS` | {% raw %}`${{ github.event.pull_request.labels }}`{% endraw %} |
-| `CI_MERGE_REQUEST_MILESTONE` | {% raw %}`${{ github.event.pull_request.milestone }}`{% endraw %} |
-| `CI_MERGE_REQUEST_PROJECT_ID` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CI_MERGE_REQUEST_PROJECT_PATH` | {% raw %}`${{ github.repository }}`{% endraw %} |
-| `CI_MERGE_REQUEST_PROJECT_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
-| `CI_MERGE_REQUEST_REF_PATH` | {% raw %}`${{ github.ref }}`{% endraw %} |
-| `CI_MERGE_REQUEST_SOURCE_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.head.ref }}`{% endraw %} |
-| `CI_MERGE_REQUEST_SOURCE_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.head.sha }}`{% endraw %} |
-| `CI_MERGE_REQUEST_SOURCE_PROJECT_ID` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
-| `CI_MERGE_REQUEST_SOURCE_PROJECT_PATH` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
-| `CI_MERGE_REQUEST_SOURCE_PROJECT_URL` | {% raw %}`${{ github.event.pull_request.head.repo.url }}`{% endraw %} |
-| `CI_MERGE_REQUEST_TARGET_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
-| `CI_MERGE_REQUEST_TARGET_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.base.sha }}`{% endraw %} |
-| `CI_MERGE_REQUEST_TITLE` | {% raw %}`${{ github.event.pull_request.title }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_IID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_SOURCE_REPOSITORY` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY` | {% raw %}`${{ github.event.pull_request.base.repo.full_name }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_SOURCE_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.head.ref }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_SOURCE_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.head.sha }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_TARGET_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
-| `CI_EXTERNAL_PULL_REQUEST_TARGET_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.base.sha }}`{% endraw %} |
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-jenkins-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-jenkins-with-github-actions-importer.md
deleted file mode 100644
index 5e6dc8f84a23..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-jenkins-with-github-actions-importer.md
+++ /dev/null
@@ -1,326 +0,0 @@
----
-title: Migrating from Jenkins with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Jenkins pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Jenkins migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-jenkins-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from Jenkins with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Jenkins pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* A Jenkins account or organization with pipelines and jobs that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Access to create a Jenkins personal API token for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from Jenkins to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}. For example, you must migrate the following constructs manually:
-
-* Mandatory build tools
-* Scripted pipelines
-* Secrets
-* Self-hosted runners
-* Unknown plugins
-
-For more information on manual migrations, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-jenkins-to-github-actions)."
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Jenkins and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a Jenkins API token. For more information, see [Authenticating scripted clients](https://www.jenkins.io/doc/book/system-administration/authenticating-scripted-clients/) in the Jenkins documentation.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `Jenkins`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for Jenkins", enter the value for the Jenkins personal API token that you created earlier, and press Enter.
- * For "Username of Jenkins user", enter your Jenkins username and press Enter.
- * For "Base url of the Jenkins instance", enter the URL of your Jenkins instance, and press Enter.
-
- An example of the `configure` command is shown below:
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: Jenkins
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for Jenkins: ***************
- ✔ Username of Jenkins user: admin
- ✔ Base url of the Jenkins instance: https://localhost
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of Jenkins
-
-You can use the `audit` command to get a high-level view of all pipelines in a Jenkins server.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in a Jenkins server.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit of a Jenkins server, run the following command in your terminal:
-
-```shell
-gh actions-importer audit jenkins --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecast potential build runner usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Jenkins server.
-
-### Prerequisites for running the forecast command
-
-In order to run the `forecast` command against a Jenkins instance, you must install the [`paginated-builds` plugin](https://plugins.jenkins.io/paginated-builds) on your Jenkins server. This plugin allows {% data variables.product.prodname_actions_importer %} to efficiently retrieve historical build data for jobs that have a large number of builds. Because Jenkins does not provide a method to retrieve paginated build data, using this plugin prevents timeouts from the Jenkins server that can occur when fetching a large amount of historical data. The `paginated-builds` plugin is open source, and exposes a REST API endpoint to fetch build data in pages, rather than all at once.
-
-To install the `paginated-builds` plugin:
-
-1. On your Jenkins instance, navigate to `https://<your-jenkins-instance>/pluginManager/available`, replacing `<your-jenkins-instance>` with the URL of your Jenkins instance.
-1. Search for the `paginated-builds` plugin.
-1. Check the box on the left and select **Install without restart**.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast jenkins --output-dir tmp/forecast
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
- * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-Additionally, these metrics are defined for each queue of runners in Jenkins. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.
-
-## Perform a dry-run migration of a Jenkins pipeline
-
-You can use the `dry-run` command to convert a Jenkins pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the dry-run command
-
-You can use the `dry-run` command to convert a Jenkins pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-To perform a dry run of migrating your Jenkins pipelines to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-jenkins-project` with the URL of your Jenkins job.
-
-```shell
-gh actions-importer dry-run jenkins --source-url my-jenkins-project --output-dir tmp/dry-run
-```
-
-### Inspecting the converted workflows
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Perform a production migration of a Jenkins pipeline
-
-You can use the `migrate` command to convert a Jenkins pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command
-
-To migrate a Jenkins pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.product_name %} repository, and `my-jenkins-project` with the URL for your Jenkins job.
-
-```shell
-gh actions-importer migrate jenkins --target-url https://github.com/:owner/:repo --output-dir tmp/migrate --source-url my-jenkins-project
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:
-
-```shell
-$ gh actions-importer migrate jenkins --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --source-url http://localhost:8080/job/monas_dev_work/job/monas_freestyle
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Jenkins.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Jenkins instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires `repo` and `workflow` scopes).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `JENKINS_ACCESS_TOKEN`: The Jenkins API token used to view Jenkins resources.
-
- {% note %}
-
- **Note**: This token requires access to all jobs that you want to migrate or audit. In cases where a folder or job does not inherit access control lists from their parent, you must grant explicit permissions or full admin privileges.
-
- {% endnote %}
-
-* `JENKINS_USERNAME`: The username of the user account that created the Jenkins API token.
-* `JENKINS_INSTANCE_URL`: The URL of the Jenkins instance.
-* `JENKINSFILE_ACCESS_TOKEN`: (Optional) The API token used to retrieve the contents of a `Jenkinsfile` stored in the build repository. This token requires the `repo` scope. If it is not provided, the `GITHUB_ACCESS_TOKEN` will be used instead.
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
-
-### Using optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead. You can use this option for Jenkinsfile and multibranch pipelines.
-
-If you would like to supply multiple source files when running the `forecast` subcommand, you can use pattern matching in the file path value. For example, `gh actions-importer forecast jenkins --source-file-path ./tmp/previous_forecast/jobs/*.json` supplies {% data variables.product.prodname_actions_importer %} with any source files that match the `./tmp/previous_forecast/jobs/*.json` file path.
-
-##### Jenkinsfile pipeline example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified Jenkinsfile as the source file to perform a dry run.
-
-```shell
-gh actions-importer dry-run jenkins --output-dir path/to/output/ --source-file-path path/to/Jenkinsfile --source-url :url_to_jenkins_job
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-When you use the `--config-file-path` option with the `dry-run` or `migrate` subcommands, {% data variables.product.prodname_actions_importer %} matches the repository slug to the job represented by the `--source-url` option to select the pipeline. It uses the `config-file-path` to pull the specified source file.
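-
-For example, a dry run that combines the two options might look like the following command, where `pipeline-name` is assumed to be a `repository_slug` defined in the config file and the job URL is hypothetical:
-
-```shell
-gh actions-importer dry-run jenkins --output-dir path/to/output/ --source-url http://localhost:8080/job/pipeline-name --config-file-path path/to/jenkins/config.yml
-```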
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```shell
-gh actions-importer audit jenkins --output-dir path/to/output/ --config-file-path path/to/jenkins/config.yml
-```
-
-To audit a Jenkins instance using a config file, the config file must be in the following format, and each `repository_slug` value must be unique:
-
-```yaml
-source_files:
- - repository_slug: pipeline-name
- path: path/to/Jenkinsfile
- - repository_slug: multi-branch-pipeline-name
- branches:
- - branch: main
- path: path/to/Jenkinsfile
- - branch: node
- path: path/to/Jenkinsfile
-```
-
-### Supported syntax for Jenkins pipelines
-
-The following tables show the type of properties {% data variables.product.prodname_actions_importer %} is currently able to convert. For more details about how Jenkins pipeline syntax aligns with {% data variables.product.prodname_actions %}, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-jenkins-to-github-actions)".
-
-For information about supported Jenkins plugins, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/jenkins/index.md).
-
-#### Supported syntax for Freestyle pipelines
-
-| Jenkins | GitHub Actions | Status |
-| :------------------------ | :--------------------------------- | :------------------ |
-| docker template | `jobs.<job_id>.container` | Supported |
-| build | `jobs` | Partially supported |
-| build environment | `env` | Partially supported |
-| build triggers | `on` | Partially supported |
-| general | `runners` | Partially supported |
-
-#### Supported syntax for Jenkinsfile pipelines
-
-| Jenkins | GitHub Actions | Status |
-| :---------- | :--------------------------------- | :------------------ |
-| docker | `jobs.<job_id>.container` | Supported |
-| stage | `jobs.<job_id>` | Supported |
-| agent | `runners` | Partially supported |
-| environment | `env` | Partially supported |
-| stages | `jobs` | Partially supported |
-| steps | `jobs.<job_id>.steps` | Partially supported |
-| triggers | `on` | Partially supported |
-| when | `jobs.<job_id>.if` | Partially supported |
-| inputs | `inputs` | Unsupported |
-| matrix | `jobs.<job_id>.strategy.matrix` | Unsupported |
-| options | `jobs.<job_id>.strategy` | Unsupported |
-| parameters | `inputs` | Unsupported |
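-
-As a rough illustration of the Jenkinsfile mappings above, a declarative pipeline with a single `Build` stage that runs `make build` could be expected to convert to something like the following sketch; the job and step names, the checkout step, and the runner label are assumptions, and the actual output from {% data variables.product.prodname_actions_importer %} may differ:
-
-```yaml
-# Sketch of a converted workflow: the Jenkinsfile stage becomes a job,
-# its steps become jobs.<job_id>.steps, and the agent maps to a runner label.
-jobs:
-  Build:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v4
-      - name: Run build
-        run: make build
-```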
-
-### Environment variables syntax
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Jenkins environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| Jenkins | GitHub Actions |
-| :---------------- | :------------------------------------------------------------------------------------ |
-| `${BUILD_ID}` | `{% raw %}${{ github.run_id }}{% endraw %}` |
-| `${BUILD_NUMBER}` | `{% raw %}${{ github.run_id }}{% endraw %}` |
-| `${BUILD_TAG}` | `{% raw %}${{ github.workflow }}-${{ github.run_id }}{% endraw %}` |
-| `${BUILD_URL}` | `{% raw %}${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}{% endraw %}` |
-| `${JENKINS_URL}` | `{% raw %}${{ github.server_url }}{% endraw %}` |
-| `${JOB_NAME}` | `{% raw %}${{ github.workflow }}{% endraw %}` |
-| `${WORKSPACE}` | `{% raw %}${{ github.workspace }}{% endraw %}` |
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-travis-ci-with-github-actions-importer.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-travis-ci-with-github-actions-importer.md
deleted file mode 100644
index cf6837fee0a5..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/migrating-from-travis-ci-with-github-actions-importer.md
+++ /dev/null
@@ -1,364 +0,0 @@
----
-title: Migrating from Travis CI with GitHub Actions Importer
-intro: 'Learn how to use {% data variables.product.prodname_actions_importer %} to automate the migration of your Travis CI pipelines to {% data variables.product.prodname_actions %}.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: tutorial
-topics:
- - Migration
- - CI
- - CD
-shortTitle: Travis CI migration
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/migrating-from-travis-ci-with-github-actions-importer
----
-
-[Legal notice](#legal-notice)
-
-## About migrating from Travis CI with GitHub Actions Importer
-
-The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Travis CI pipelines to {% data variables.product.prodname_actions %}.
-
-### Prerequisites
-
-* A Travis CI account or organization with pipelines and jobs that you want to convert to {% data variables.product.prodname_actions %} workflows.
-* Access to create a Travis CI API access token for your account or organization.
-{% data reusables.actions.actions-importer-prerequisites %}
-
-### Limitations
-
-There are some limitations when migrating from Travis CI pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.
-
-#### Manual tasks
-
-Certain Travis CI constructs must be migrated manually. These include:
-
-* Secrets
-* Unknown job properties
-
-For more information on manual migrations, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-travis-ci-to-github-actions)."
-
-#### Travis CI project languages
-
-{% data variables.product.prodname_actions_importer %} transforms Travis CI project languages by adding a set of preconfigured build tools and a default build script to the transformed workflow. If no language is explicitly declared, {% data variables.product.prodname_actions_importer %} assumes the project language is Ruby.
-
-For a list of the project languages supported by {% data variables.product.prodname_actions_importer %}, see "[Supported project languages](#supported-project-languages)."
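-
-Because of this default, it is worth confirming that each `.travis.yml` declares its language explicitly before you convert it. For example, a hypothetical Node.js project would declare:
-
-```yaml
-# Declaring the language explicitly prevents the importer from
-# falling back to the Ruby default.
-language: node_js
-node_js:
-  - 18
-```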
-
-## Installing the {% data variables.product.prodname_actions_importer %} CLI extension
-
-{% data reusables.actions.installing-actions-importer %}
-
-## Configuring credentials
-
-The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Travis CI and {% data variables.product.prodname_dotcom %}.
-
-1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)."
-
- Your token must have the `workflow` scope.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. Create a Travis CI API access token. For more information, see [Get your Travis CI API token](https://docs.travis-ci.com/user/migrate/travis-migrate-to-apps-gem-guide/#4-get-your-travis-ci-api-token) in the Travis CI documentation.
-
- After creating the token, copy it and save it in a safe location for later use.
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:
-
- ```shell
- gh actions-importer configure
- ```
-
- The `configure` command will prompt you for the following information:
-
- * For "Which CI providers are you configuring?", use the arrow keys to select `Travis CI`, press Space to select it, then press Enter.
- * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press Enter.
- * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for your {% data variables.product.product_name %} instance, and press Enter.{% else %}press Enter to accept the default value (`https://github.com`).{% endif %}
- * For "{% data variables.product.pat_generic_caps %} for Travis CI", enter the value for the Travis CI API access token that you created earlier, and press Enter.
- * For "Base url of the Travis CI instance", enter the URL of your Travis CI instance, and press Enter.
- * For "Travis CI organization name", enter the name of your Travis CI organization, and press Enter.
-
- An example of the output of the `configure` command is shown below.
-
- ```shell
- $ gh actions-importer configure
- ✔ Which CI providers are you configuring?: Travis CI
- Enter the following values (leave empty to omit):
- ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
- ✔ Base url of the GitHub instance: https://github.com
- ✔ {% data variables.product.pat_generic_caps %} for Travis CI: ***************
- ✔ Base url of the Travis CI instance: https://travis-ci.com
- ✔ Travis CI organization name: actions-importer-labs
- Environment variables successfully updated.
- ```
-
-1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
-
- ```shell
- gh actions-importer update
- ```
-
- The output of the command should be similar to below:
-
- ```shell
- Updating ghcr.io/actions-importer/cli:latest...
- ghcr.io/actions-importer/cli:latest up-to-date
- ```
-
-## Perform an audit of Travis CI
-
-You can use the `audit` command to get a high-level view of all pipelines in a Travis CI server.
-
-The `audit` command performs the following steps:
-
-1. Fetches all of the projects defined in a Travis CI server.
-1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
-1. Generates a report that summarizes how complete and complex a migration is possible with {% data variables.product.prodname_actions_importer %}.
-
-### Running the audit command
-
-To perform an audit of a Travis CI server, run the following command in your terminal:
-
-```shell
-gh actions-importer audit travis-ci --output-dir tmp/audit
-```
-
-### Inspecting the audit results
-
-{% data reusables.actions.gai-inspect-audit %}
-
-## Forecast potential build runner usage
-
-You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Travis CI server.
-
-### Running the forecast command
-
-To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
-
-```shell
-gh actions-importer forecast travis-ci --output-dir tmp/forecast
-```
-
-### Inspecting the forecast report
-
-The `forecast_report.md` file in the specified output directory contains the results of the forecast.
-
-Listed below are some key terms that can appear in the forecast report:
-
-* The **job count** is the total number of completed jobs.
-* The **pipeline count** is the number of unique pipelines used.
-* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
- * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
-* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
-* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.
-
-Additionally, these metrics are defined for each queue of runners in Travis CI. This is especially useful if there is a mix of hosted and self-hosted runners, or of high- and low-spec machines, because it lets you see metrics specific to each type of runner.
-
-## Perform a dry-run migration of a Travis CI pipeline
-
-You can use the `dry-run` command to convert a Travis CI pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
-
-To perform a dry run of migrating your Travis CI pipelines to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-travis-ci-repository` with the name of your Travis CI repository.
-
-```shell
-gh actions-importer dry-run travis-ci --travis-ci-repository my-travis-ci-repository --output-dir tmp/dry-run
-```
-
-You can view the logs of the dry run and the converted workflow files in the specified output directory.
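-
-For example, you can locate the converted workflow files before reviewing them; the exact directory layout under the output directory depends on the pipeline being converted:
-
-```shell
-# Find the converted workflow YAML files produced by the dry run.
-find tmp/dry-run -name '*.yml'
-```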
-
-{% data reusables.actions.gai-custom-transformers-rec %}
-
-## Perform a production migration of a Travis CI pipeline
-
-You can use the `migrate` command to convert a Travis CI pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
-
-### Running the migrate command
-
-To migrate a Travis CI pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `my-travis-ci-repository` with the name of your Travis CI repository.
-
-```shell
-gh actions-importer migrate travis-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --travis-ci-repository my-travis-ci-repository
-```
-
-The command's output includes the URL to the pull request that adds the converted workflow to your repository. A successful run produces output similar to the following:
-
-```shell
-$ gh actions-importer migrate travis-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --travis-ci-repository my-travis-ci-repository
-[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
-[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
-```
-
-{% data reusables.actions.gai-inspect-pull-request %}
-
-## Reference
-
-This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Travis CI.
-
-### Using environment variables
-
-{% data reusables.actions.gai-config-environment-variables %}
-
-{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Travis CI instance:
-
-* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
-* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
-* `TRAVIS_CI_ACCESS_TOKEN`: The Travis CI API access token used to view Travis CI resources.
-* `TRAVIS_CI_ORGANIZATION`: The organization name of your Travis CI instance.
-* `TRAVIS_CI_INSTANCE_URL`: The URL of the Travis CI instance.
-* `TRAVIS_CI_SOURCE_GITHUB_ACCESS_TOKEN`: (Optional) The {% data variables.product.pat_generic %} used to authenticate with your source GitHub instance. If not provided, `GITHUB_ACCESS_TOKEN` will be used instead.
-* `TRAVIS_CI_SOURCE_GITHUB_INSTANCE_URL`: (Optional) The URL to the source GitHub instance (for example, `https://github.com`). If not provided, `GITHUB_INSTANCE_URL` will be used instead.
-
-These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.
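-
-For example, a `.env.local` file for Travis CI might look like the following, with placeholder values substituted for your own tokens and organization name:
-
-```shell
-# .env.local (all values below are placeholders)
-GITHUB_ACCESS_TOKEN=ghp_example_token
-GITHUB_INSTANCE_URL=https://github.com
-TRAVIS_CI_ACCESS_TOKEN=example_travis_api_token
-TRAVIS_CI_ORGANIZATION=my-travis-org
-TRAVIS_CI_INSTANCE_URL=https://travis-ci.com
-```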
-
-### Using optional arguments
-
-{% data reusables.actions.gai-optional-arguments-intro %}
-
-#### `--source-file-path`
-
-You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
-
-For example:
-
-```shell
-gh actions-importer dry-run travis-ci --output-dir ./path/to/output/ --travis-ci-repository my-travis-ci-repository --source-file-path ./path/to/.travis.yml
-```
-
-#### `--allow-inactive-repositories`
-
-You can use this argument to specify whether {% data variables.product.prodname_actions_importer %} should include inactive repositories in an audit. If this option is not set, inactive repositories are not included in audits.
-
-```shell
-gh actions-importer dry-run travis-ci --output-dir ./path/to/output/ --travis-ci-repository my-travis-ci-repository --allow-inactive-repositories
-```
-
-#### `--config-file-path`
-
-You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.
-
-By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.
-
-##### Audit example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.
-
-```shell
-gh actions-importer audit travis-ci --output-dir ./path/to/output/ --config-file-path ./path/to/travis-ci/config.yml
-```
-
-To audit a Travis CI instance using a configuration file, the file must be in the following format and each `repository_slug` value must be unique:
-
-```yaml
-source_files:
- - repository_slug: travis-org-name/travis-repo-name
- path: path/to/.travis.yml
- - repository_slug: travis-org-name/some-other-travis-repo-name
- path: path/to/.travis.yml
-```
-
-##### Dry run example
-
-In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.
-
-The pipeline is selected by matching the `repository_slug` in the configuration file to the value of the `--travis-ci-repository` option. The `path` is then used to pull the specified source file.
-
-```shell
-gh actions-importer dry-run travis-ci --travis-ci-repository travis-org-name/travis-repo-name --output-dir ./output/ --config-file-path ./path/to/travis-ci/config.yml
-```
-
-### Supported project languages
-
-{% data variables.product.prodname_actions_importer %} supports migrating Travis CI projects in the following languages:
-
-* android
-* bash
-* c
-* clojure
-* c++
-* crystal
-* c#
-* d
-* dart
-* elixir
-* erlang
-* generic
-* go
-* groovy
-* haskell
-* haxe
-* java
-* julia
-* matlab
-* minimal
-* nix
-* node_js
-* objective-c
-* perl
-* perl6
-* php
-* python
-* r
-* ruby
-* rust
-* scala
-* sh
-* shell
-* smalltalk
-* swift
-
-### Supported syntax for Travis CI pipelines
-
-The following table shows the type of properties {% data variables.product.prodname_actions_importer %} is currently able to convert. For more details about how Travis CI pipeline syntax aligns with {% data variables.product.prodname_actions %}, see "[AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-travis-ci-to-github-actions)".
-
-| Travis CI           | GitHub Actions                                                                                          | Status              |
-| :------------------ | :------------------------------------------------------------------------------------------------------ | ------------------: |
-| branches            | - `on.<push>.<branches>`                                                                                 | Supported           |
-| build_pull_requests | - `on.<pull_request>`                                                                                    | Supported           |
-| env                 | - `env`<br>- `jobs.<job_id>.env`<br>- `jobs.<job_id>.steps.env`                                          | Supported           |
-| if                  |                                                                                                          | Supported           |
-| job                 |                                                                                                          | Supported           |
-| matrix              | - `jobs.<job_id>.strategy`<br>- `jobs.<job_id>.strategy.fail-fast`<br>- `jobs.<job_id>.strategy.matrix`  | Supported           |
-| os & dist           |                                                                                                          | Supported           |
-| scripts             |                                                                                                          | Supported           |
-| stages              |                                                                                                          | Supported           |
-| env                 | - `on`                                                                                                   | Partially supported |
-| branches            | - `on..`<br>- `on..paths`                                                                                | Unsupported         |
-| build_pull_requests | - `on..`<br>- `on..`<br>- `on..paths`                                                                    | Unsupported         |
-| cron triggers       | - `on.schedule`<br>- `on.workflow_run`                                                                   | Unsupported         |
-| env                 | - `jobs.<job_id>.timeout-minutes`<br>- `on.<event_name>.types`                                           | Unsupported         |
-| job                 | - `jobs.<job_id>.container`                                                                              | Unsupported         |
-| os & dist           |                                                                                                          | Unsupported         |
-
-For information about supported Travis CI constructs, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/travis_ci/index.md).
-
-### Environment variables syntax
-
-{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Travis CI environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
-
-| Travis CI | GitHub Actions |
-| :---------------------------- | :------------------------------------------------------------------------------------ |
-| {% raw %}`$CONTINUOUS_INTEGRATION`{% endraw %} | {% raw %}`$CI`{% endraw %} |
-| {% raw %}`$USER`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
-| {% raw %}`$HOME`{% endraw %} | {% raw %}`${{ github.workspace }}` {% endraw %} |
-| {% raw %}`$TRAVIS_BRANCH`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
-| {% raw %}`$TRAVIS_BUILD_DIR`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
-| {% raw %}`$TRAVIS_BUILD_ID`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
-| {% raw %}`$TRAVIS_BUILD_NUMBER`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
-| {% raw %}`$TRAVIS_COMMIT`{% endraw %} | {% raw %}`${{ github.sha }}`{% endraw %} |
-| {% raw %}`$TRAVIS_EVENT_TYPE`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
-| {% raw %}`$TRAVIS_PULL_REQUEST_BRANCH`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
-| {% raw %}`$TRAVIS_PULL_REQUEST`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
-| {% raw %}`$TRAVIS_PULL_REQUEST_SHA`{% endraw %} | {% raw %}`${{ github.head.sha }}`{% endraw %} |
-| {% raw %}`$TRAVIS_PULL_REQUEST_SLUG`{% endraw %} | {% raw %}`${{ github.repository }}`{% endraw %} |
-| {% raw %}`$TRAVIS_TAG`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
-| {% raw %}`$TRAVIS_OS_NAME`{% endraw %} | {% raw %}`${{ runner.os }}`{% endraw %} |
-| {% raw %}`$TRAVIS_JOB_ID`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
-| {% raw %}`$TRAVIS_REPO_SLUG`{% endraw %} | {% raw %}`${{ github.repository_owner/github.repository }}`{% endraw %} |
-| {% raw %}`$TRAVIS_BUILD_WEB_URL`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}
diff --git a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/supplemental-arguments-and-settings.md b/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/supplemental-arguments-and-settings.md
deleted file mode 100644
index 582be574c7d5..000000000000
--- a/content/actions/migrating-to-github-actions/using-github-actions-importer-to-automate-migrations/supplemental-arguments-and-settings.md
+++ /dev/null
@@ -1,206 +0,0 @@
----
-title: Supplemental arguments and settings
-intro: '{% data variables.product.prodname_actions_importer %} has several supplemental arguments and settings to tailor the migration process to your needs.'
-versions:
- fpt: '*'
- ghec: '*'
- ghes: '*'
-type: reference
-topics:
- - Migration
- - CI
- - CD
-redirect_from:
- - /actions/migrating-to-github-actions/automated-migrations/supplemental-arguments-and-settings
----
-
-[Legal notice](#legal-notice)
-
-This article provides general information for configuring {% data variables.product.prodname_actions_importer %}'s supplemental arguments and settings, such as optional parameters, path arguments, and network settings.
-
-## Optional parameters
-
-{% data variables.product.prodname_actions_importer %} has several optional parameters that you can use to customize the migration process.
-
-### Limiting allowed actions
-
-The following options can be used to limit which actions are allowed in converted workflows. When used in combination, these options expand the list of allowed actions. If none of these options are supplied, then all actions are allowed.
-
-* `--allowed-actions` specifies a list of actions to allow in converted workflows. Wildcards are supported. Any actions other than those provided will be disallowed.
-
- For example:
-
- ```shell
- --allowed-actions {% data reusables.actions.action-checkout %} actions/upload-artifact@* my-org/*
- ```
-
- You can provide an empty list to disallow all actions. For example, `--allowed-actions=`.
-
-* `--allow-verified-actions` specifies that all actions from verified creators are allowed.
-
-* `--allow-github-created-actions` specifies that actions published from the `github` or `actions` organizations are allowed.
-
- For example, such actions include `github/super-linter` and `actions/checkout`.
-
- This option is equivalent to `--allowed-actions actions/* github/*`.
-
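-For example, the following sketch (with `...` standing in for the usual dry-run arguments) allows actions from verified creators plus anything published by a hypothetical `my-org` organization:
-
-```shell
-# Sketch: combining the options expands the list of allowed actions.
-gh actions-importer dry-run travis-ci ... --allow-verified-actions --allowed-actions my-org/*
-```
-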
-### Using a credentials file for authentication
-
-The `--credentials-file` parameter specifies the path to a file containing credentials for different servers that {% data variables.product.prodname_actions_importer %} can authenticate to. This is useful when build scripts (such as `.travis.yml` or a `Jenkinsfile`) are stored in multiple {% data variables.product.prodname_ghe_server %} instances.
-
-A credentials file must be a YAML file containing a list of server and access token combinations. {% data variables.product.prodname_actions_importer %} uses the credentials for the URL that most closely matches the network request being made.
-
-For example:
-
-```yaml
-- url: https://github.com
- access_token: ghp_mygeneraltoken
-- url: https://github.com/specific_org/
- access_token: ghp_myorgspecifictoken
-- url: https://jenkins.org
- access_token: abc123
- username: marty_mcfly
-```
-
-For the above credentials file, {% data variables.product.prodname_actions_importer %} uses the access token `ghp_mygeneraltoken` to authenticate all network requests to `https://github.com`, _unless_ the network request is for a repository in the `specific_org` organization. In that case, the `ghp_myorgspecifictoken` token is used to authenticate instead.
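-
-To use the file, pass its path with the `--credentials-file` parameter. For example, a dry run might look like the following (check `--help` to confirm which subcommands accept this option):
-
-```shell
-# Sketch: point the importer at the credentials file described above.
-gh actions-importer dry-run ... --credentials-file ./path/to/credentials.yml
-```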
-
-#### Alternative source code providers
-
-{% data variables.product.prodname_actions_importer %} can automatically fetch source code from non-{% data variables.product.prodname_dotcom %} repositories. A credentials file can specify the `provider`, the provider URL, and the credentials needed to retrieve the source code.
-
-For example:
-
-```yaml
-- url: https://gitlab.com
- access_token: super_secret_token
- provider: gitlab
-```
-
-For the above example, {% data variables.product.prodname_actions_importer %} uses the token `super_secret_token` to retrieve any source code that is hosted on `https://gitlab.com`.
-
-Supported values for `provider` are:
-
-* `github` (default)
-* `gitlab`
-* `bitbucket_server`
-* `azure_devops`
-
-### Controlling optional features
-
-You can use the `--features` option to limit the features used in workflows that {% data variables.product.prodname_actions_importer %} creates. This is useful for excluding newer {% data variables.product.prodname_actions %} syntax from workflows when migrating to an older {% data variables.product.prodname_ghe_server %} instance. When using the `--features` option, you must specify the version of {% data variables.product.prodname_ghe_server %} that you are migrating to.
-
-For example:
-
-```shell
-gh actions-importer dry-run ... --features ghes-3.3
-```
-
-The supported values for `--features` are:
-
-* `all` (default value)
-* `ghes-latest`
-* `ghes-<number>`, where `<number>` is the version of {% data variables.product.prodname_ghe_server %}, `3.0` or later. For example, `ghes-3.3`.
-
-You can view the list of feature flags available in {% data variables.product.prodname_actions_importer %} by running the `list-features` command. For example:
-
-```shell copy
-gh actions-importer list-features
-```
-
-You should see output similar to the following.
-
-
-
-```shell
-Available feature flags:
-
-actions/cache (disabled):
- Control usage of actions/cache inside of workflows. Outputs a comment if not enabled.
- GitHub Enterprise Server >= ghes-3.5 required.
-
-composite-actions (enabled):
- Minimizes resulting workflow complexity through the use of composite actions. See https://docs.github.com/en/actions/creating-actions/creating-a-composite-action for more information.
- GitHub Enterprise Server >= ghes-3.4 required.
-
-reusable-workflows (disabled):
- Avoid duplication by re-using existing workflows. See https://docs.github.com/en/actions/using-workflows/reusing-workflows for more information.
- GitHub Enterprise Server >= ghes-3.4 required.
-
-workflow-concurrency-option-allowed (enabled):
- Allows the use of the `concurrency` option in workflows. See https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#concurrency for more information.
- GitHub Enterprise Server >= ghes-3.2 required.
-
-Enable features by passing --enable-features feature-1 feature-2
-Disable features by passing --disable-features feature-1 feature-2
-```
-
-
-
-To toggle feature flags, you can use either of the following methods:
-* Use the `--enable-features` and `--disable-features` options when running a `gh actions-importer` command.
-* Use an environment variable for each feature flag.
-
-You can use the `--enable-features` and `--disable-features` options to select specific features to enable or disable for the duration of the command.
-For example, the following command disables use of `actions/cache` and `composite-actions`:
-
-```shell
-gh actions-importer dry-run ... --disable-features=composite-actions actions/cache
-```
-
-You can use the `configure --features` command to interactively configure feature flags and automatically write them to your environment:
-
-```shell
-$ gh actions-importer configure --features
-
-✔ Which features would you like to configure?: actions/cache, reusable-workflows
-✔ actions/cache (disabled): Enable
-? reusable-workflows (disabled):
-› Enable
- Disable
-```
-
-### Disabling network response caching
-
-By default, {% data variables.product.prodname_actions_importer %} caches responses from network requests to reduce network load and run time. You can use the `--no-http-cache` option to disable the network cache. For example:
-
-```shell
-gh actions-importer forecast ... --no-http-cache
-```
-
-## Path arguments
-
-When running {% data variables.product.prodname_actions_importer %}, path arguments are relative to the container's disk, so absolute paths that refer to the container's host machine are not supported. When {% data variables.product.prodname_actions_importer %} runs, the directory it is run from is mounted into the container at `/data`.
-
-For example, the following command, when used in the `/Users/mona` directory, outputs the {% data variables.product.prodname_actions_importer %} audit summary to the `/Users/mona/out` directory:
-
-```shell
-gh actions-importer audit --output-dir /data/out
-```
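-
-Continuing the example, a host-style absolute path refers to the container's filesystem rather than the host, so the results would not appear in the host directory; keep output paths under `/data`:
-
-```shell
-# Not supported: /Users/mona/out refers to a path inside the container, not the host.
-gh actions-importer audit --output-dir /Users/mona/out
-```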
-
-## Using a proxy
-
-To access servers that are configured with an HTTP proxy, you must set the following environment variables with the proxy's URL:
-
-* `OCTOKIT_PROXY`: for any {% data variables.product.prodname_dotcom %} server.
-* `HTTP_PROXY` (or `HTTPS_PROXY`): for any other servers.
-
-For example:
-
-```shell
-export OCTOKIT_PROXY=https://proxy.example.com:8443
-export HTTPS_PROXY=$OCTOKIT_PROXY
-```
-
-If the proxy requires authentication, a username and password must be included in the proxy URL. For example, `https://username:password@proxy.url:port`.
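-
-For example, a minimal sketch with an authenticating proxy (host, port, and credentials are placeholders):
-
-```shell
-export OCTOKIT_PROXY=https://username:password@proxy.example.com:8443
-export HTTPS_PROXY=$OCTOKIT_PROXY
-```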
-
-## Disabling SSL certificate verification
-
-By default, {% data variables.product.prodname_actions_importer %} verifies SSL certificates when making network requests. You can disable SSL certificate verification with the `--no-ssl-verify` option. For example:
-
-```shell
-gh actions-importer audit --output-dir ./output --no-ssl-verify
-```
-
-## Legal notice
-
-{% data reusables.actions.actions-importer-legal-notice %}