[DO NOT MERGE] Feature/workspace parameters ci #270

Closed

Changes from all commits (40 commits)
2c8d9d3
Add support for dynamic application configurations (#5855)
tianleh Mar 2, 2024
b705ce2
refact: move workspace specific logic to savedObjectWrapper
SuZhou-Joe Oct 18, 2023
108f254
fix: some error
SuZhou-Joe Feb 23, 2024
d391dd6
feat: fix test error
SuZhou-Joe Feb 26, 2024
f4213f3
feat: remove useless config in test
SuZhou-Joe Feb 26, 2024
51aa1d0
feat: add CHANGELOG
SuZhou-Joe Feb 26, 2024
acfb09c
feat: add more unit test
SuZhou-Joe Sep 26, 2023
9f2f56d
fix: unit test
SuZhou-Joe Oct 18, 2023
c7585cc
feat: revert test in repository.test.js
SuZhou-Joe Feb 27, 2024
a476600
feat: revert test in import_saved_objects.test.ts
SuZhou-Joe Feb 27, 2024
3fc01b6
feat: revert test in repository.test.js
SuZhou-Joe Feb 27, 2024
08508a4
feat: add type
SuZhou-Joe Feb 27, 2024
5f932f5
fix: bootstrap type error
SuZhou-Joe Feb 27, 2024
803ac7d
feat: optimize code and add comment
SuZhou-Joe Feb 27, 2024
4d4cb22
fix: unit test error
SuZhou-Joe Feb 27, 2024
8551809
fix: integration test fail
SuZhou-Joe Feb 28, 2024
2e01f65
feat: add missing code
SuZhou-Joe Feb 28, 2024
6ef6e74
feat: optimize code
SuZhou-Joe Feb 29, 2024
3e239b8
Add permissions field to the mapping only if the permission control i…
gaobinlong Feb 29, 2024
816241a
Fix test failure
gaobinlong Feb 29, 2024
516307e
feat: modify unit test
SuZhou-Joe Mar 1, 2024
ad41030
fix: bulk create error
SuZhou-Joe Mar 1, 2024
419b8c4
fix: bulk create error
SuZhou-Joe Mar 1, 2024
40fdc96
feat: add new config in yml file
SuZhou-Joe Mar 1, 2024
09f5030
feat: add new config in yml file
SuZhou-Joe Mar 1, 2024
04c1d86
feat: update yml file
SuZhou-Joe Mar 2, 2024
833517b
feat: fix unit test
SuZhou-Joe Mar 2, 2024
02c8859
feat: do not skip migration when doing integration test
SuZhou-Joe Mar 2, 2024
7b368fd
feat: remove useless code
SuZhou-Joe Mar 3, 2024
aa49775
feat: remove useless code
SuZhou-Joe Mar 3, 2024
10adb9f
feat: change flag variable
SuZhou-Joe Mar 3, 2024
984d51a
feat: add test cases
SuZhou-Joe Mar 3, 2024
cdd660e
Self host runner (#140)
SuZhou-Joe Sep 14, 2023
d4a5d9b
disable github workflows running on windows for development (#161)
ruanyl Sep 15, 2023
7677b67
feat: update branch
SuZhou-Joe Mar 2, 2024
c6b4c34
[Workspace]Optional workspaces params in repository (#5949)
SuZhou-Joe Mar 4, 2024
f2691b1
feat: temp save
SuZhou-Joe Mar 4, 2024
74ed1db
fix: unit test
SuZhou-Joe Mar 4, 2024
1ba7c69
feat: add some comment
SuZhou-Joe Mar 4, 2024
fec6604
feat: merge
SuZhou-Joe Mar 4, 2024
2 changes: 1 addition & 1 deletion .github/workflows/add-untriaged.yml
@@ -6,7 +6,7 @@ on:

jobs:
apply-label:
runs-on: ubuntu-latest
runs-on: arc-runner-set
steps:
- uses: actions/github-script@v6
with:
2 changes: 1 addition & 1 deletion .github/workflows/backport.yml
@@ -7,7 +7,7 @@ on:

jobs:
backport:
runs-on: ubuntu-latest
runs-on: arc-runner-set
permissions:
contents: write
pull-requests: write
40 changes: 11 additions & 29 deletions .github/workflows/build_and_test_workflow.yml
@@ -6,7 +6,7 @@ name: Build and test
# trigger on every commit push and PR for all branches except pushes for backport branches
on:
push:
branches: ['main', '[0-9].x', '[0-9].[0-9]+'] # Run the functional test on push for only release branches
branches: ['**', '!backport/**'] # Run the functional test on push for only release branches
paths-ignore:
- '**/*.md'
- 'docs/**'
@@ -33,13 +33,11 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, windows-latest]
os: [arc-runner-set]
group: [1, 2, 3, 4]
include:
- os: ubuntu-latest
- os: arc-runner-set
name: Linux
- os: windows-latest
name: Windows
runs-on: ${{ matrix.os }}
steps:
- name: Configure git's autocrlf (Windows only)
@@ -100,13 +98,13 @@ jobs:

- name: Run linter
# ciGroup 1 of unit-tests is shorter and Linux is faster
if: matrix.group == 1 && matrix.os == 'ubuntu-latest'
if: matrix.group == 1 && matrix.os == 'arc-runner-set'
id: linter
run: yarn lint

- name: Validate NOTICE file
# ciGroup 1 of unit-tests is shorter and Linux is faster
if: matrix.group == 1 && matrix.os == 'ubuntu-latest'
if: matrix.group == 1 && matrix.os == 'arc-runner-set'
id: notice-validate
run: yarn notice:validate

@@ -138,13 +136,11 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, windows-latest]
os: [arc-runner-set]
group: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
include:
- os: ubuntu-latest
- os: arc-runner-set
name: Linux
- os: windows-latest
name: Windows
runs-on: ${{ matrix.os }}
steps:
- run: echo Running functional tests for ciGroup${{ matrix.group }}
@@ -322,31 +318,16 @@ jobs:
strategy:
matrix:
include:
- os: ubuntu-latest
- os: arc-runner-set
name: Linux x64
ext: tar.gz
suffix: linux-x64
script: build-platform --linux --skip-os-packages
- os: ubuntu-latest
- os: arc-runner-set
name: Linux ARM64
ext: tar.gz
suffix: linux-arm64
script: build-platform --linux-arm --skip-os-packages
- os: macos-latest
name: macOS x64
ext: tar.gz
suffix: darwin-x64
script: build-platform --darwin --skip-os-packages
- os: macos-latest
name: macOS ARM64
ext: tar.gz
suffix: darwin-arm64
script: build-platform --darwin-arm --skip-os-packages
- os: windows-latest
name: Windows x64
ext: zip
suffix: windows-x64
script: build-platform --windows --skip-os-packages
runs-on: ${{ matrix.os }}
defaults:
run:
@@ -430,8 +411,9 @@ jobs:
retention-days: 1

bwc-tests:
if: false
needs: [build-min-artifact-tests]
runs-on: ubuntu-latest
runs-on: arc-runner-set
container:
image: docker://opensearchstaging/ci-runner:ci-runner-rockylinux8-opensearch-dashboards-integtest-v2
options: --user 1001
2 changes: 1 addition & 1 deletion .github/workflows/changelog_verifier.yml
@@ -7,7 +7,7 @@ on:
jobs:
# Enforces the update of a changelog file on every pull request
verify-changelog:
runs-on: ubuntu-latest
runs-on: arc-runner-set
steps:
- uses: actions/checkout@v3
with:
2 changes: 1 addition & 1 deletion .github/workflows/create_doc_issue.yml
@@ -9,7 +9,7 @@ env:
jobs:
create-issue:
if: ${{ github.event.label.name == 'needs-documentation' }}
runs-on: ubuntu-latest
runs-on: arc-runner-set
name: Create Documentation Issue
steps:
- name: GitHub App token
2 changes: 1 addition & 1 deletion .github/workflows/cypress_workflow.yml
@@ -43,7 +43,7 @@ env:

jobs:
cypress-tests:
runs-on: ubuntu-latest
runs-on: arc-runner-set
strategy:
fail-fast: false
matrix:
2 changes: 1 addition & 1 deletion .github/workflows/delete_backport_branch.yml
@@ -6,7 +6,7 @@ on:

jobs:
delete-branch:
runs-on: ubuntu-latest
runs-on: arc-runner-set
if: startsWith(github.event.pull_request.head.ref,'backport/')
steps:
- name: Delete merged branch
2 changes: 1 addition & 1 deletion .github/workflows/github-workflow-badger.yml
@@ -7,7 +7,7 @@ on:

jobs:
call-action:
runs-on: ubuntu-latest
runs-on: arc-runner-set
permissions:
pull-requests: write
steps:
2 changes: 1 addition & 1 deletion .github/workflows/links_checker.yml
@@ -12,7 +12,7 @@ on:
jobs:
linkchecker:

runs-on: ubuntu-latest
runs-on: arc-runner-set

steps:
- uses: actions/checkout@v2
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -74,6 +74,8 @@ Inspired from [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
- [Multiple Datasource] Refactor client and legacy client to use authentication registry ([#5881](https://github.com/opensearch-project/OpenSearch-Dashboards/pull/5881))
- [Multiple Datasource] Improved error handling for the search API when a null value is passed for the dataSourceId ([#5882](https://github.com/opensearch-project/OpenSearch-Dashboards/pull/5882))
- [Multiple Datasource] Hide/Show authentication method in multi data source plugin based on configuration ([#5916](https://github.com/opensearch-project/OpenSearch-Dashboards/pull/5916))
- [Dynamic Configurations] Add support for dynamic application configurations ([#5855](https://github.com/opensearch-project/OpenSearch-Dashboards/pull/5855))
- [Workspace] Optional workspaces params in repository ([#5949](https://github.com/opensearch-project/OpenSearch-Dashboards/pull/5949))

### 🐛 Bug Fixes

12 changes: 11 additions & 1 deletion config/opensearch_dashboards.yml
@@ -29,6 +29,13 @@
# dashboards. OpenSearch Dashboards creates a new index if the index doesn't already exist.
#opensearchDashboards.index: ".opensearch_dashboards"

# OpenSearch Dashboards uses an index in OpenSearch to store dynamic configurations.
# This shall be a different index from opensearchDashboards.index.
# opensearchDashboards.configIndex: ".opensearch_dashboards_config"

# Set the value of this setting to true to enable plugin application config. By default it is disabled.
# application_config.enabled: false

# The default application to load.
#opensearchDashboards.defaultAppId: "home"

@@ -285,4 +292,7 @@
# opensearchDashboards.survey.url: "https://survey.opensearch.org"

# Set the value of this setting to true to enable plugin augmentation on Dashboard
# vis_augmenter.pluginAugmentationEnabled: true
# vis_augmenter.pluginAugmentationEnabled: true

# Set the value to true to enable workspace feature
# workspace.enabled: false
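
The settings above are plain entries in opensearch_dashboards.yml. As a rough sketch of how a boolean flag such as workspace.enabled typically reaches plugin code in OpenSearch Dashboards (the plugin class, schema, and import paths below are illustrative assumptions, not part of this PR):

import { first } from 'rxjs/operators';
import { schema, TypeOf } from '@osd/config-schema';
import { PluginInitializerContext } from 'src/core/server';

// Hypothetical plugin config schema gated by an "enabled" flag in opensearch_dashboards.yml.
const configSchema = schema.object({
  enabled: schema.boolean({ defaultValue: false }),
});
type WorkspaceConfigType = TypeOf<typeof configSchema>;

export class WorkspacePlugin {
  constructor(private readonly initializerContext: PluginInitializerContext<WorkspaceConfigType>) {}

  public async setup() {
    // config.create<T>() exposes the validated plugin config as an observable.
    const config = await this.initializerContext.config
      .create<WorkspaceConfigType>()
      .pipe(first())
      .toPromise();
    if (!config.enabled) {
      // The feature stays off unless `workspace.enabled: true` is set in the yml.
      return {};
    }
    // ...register workspace-specific saved-object wrappers, routes, etc.
    return {};
  }
}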
1 change: 1 addition & 0 deletions src/core/public/saved_objects/saved_objects_client.ts
@@ -345,6 +345,7 @@ export class SavedObjectsClient {
filter: 'filter',
namespaces: 'namespaces',
preference: 'preference',
workspaces: 'workspaces',
};

const renamedQuery = renameKeys<SavedObjectsFindOptions, any>(renameMap, options);
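
With workspaces added to the rename map, a find() call on the public SavedObjectsClient can carry the option through to the /api/saved_objects/_find endpoint as a workspaces query parameter. A minimal sketch, assuming the workspace id shown here and that the find options type accepts workspaces once the workspace changes land:

import { SavedObjectsClientContract } from 'src/core/public';

// Hypothetical usage: `client` is coreStart.savedObjects.client.
async function findDashboardTitlesInWorkspace(client: SavedObjectsClientContract) {
  const response = await client.find<{ title?: string }>({
    type: 'dashboard',
    perPage: 20,
    // Forwarded as a `workspaces` query parameter by the rename map shown above.
    workspaces: ['example-workspace-id'], // assumed workspace id, for illustration only
  });
  return response.savedObjects.map((so) => so.attributes.title);
}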
1 change: 1 addition & 0 deletions src/core/server/mocks.ts
@@ -77,6 +77,7 @@ export function pluginInitializerContextConfigMock<T>(config: T) {
const globalConfig: SharedGlobalConfig = {
opensearchDashboards: {
index: '.opensearch_dashboards_tests',
configIndex: '.opensearch_dashboards_config_tests',
autocompleteTerminateAfter: duration(100000),
autocompleteTimeout: duration(1000),
},
1 change: 1 addition & 0 deletions src/core/server/opensearch_dashboards_config.ts
@@ -48,6 +48,7 @@ export const config = {
schema: schema.object({
enabled: schema.boolean({ defaultValue: true }),
index: schema.string({ defaultValue: '.kibana' }),
configIndex: schema.string({ defaultValue: '.opensearch_dashboards_config' }),
autocompleteTerminateAfter: schema.duration({ defaultValue: 100000 }),
autocompleteTimeout: schema.duration({ defaultValue: 1000 }),
branding: schema.object({
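
A minimal sketch of how the new configIndex key behaves under @osd/config-schema, with an illustrative override value:

import { schema } from '@osd/config-schema';

// Reduced copy of the schema above, just to show how the new key validates.
const opensearchDashboardsConfig = schema.object({
  index: schema.string({ defaultValue: '.kibana' }),
  configIndex: schema.string({ defaultValue: '.opensearch_dashboards_config' }),
});

// Nothing set in opensearch_dashboards.yml: the default applies.
const defaults = opensearchDashboardsConfig.validate({});
// defaults.configIndex === '.opensearch_dashboards_config'

// With `opensearchDashboards.configIndex: ".my_config"` set in the yml:
const overridden = opensearchDashboardsConfig.validate({ configIndex: '.my_config' });
// overridden.configIndex === '.my_config'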
1 change: 1 addition & 0 deletions src/core/server/plugins/plugin_context.test.ts
@@ -98,6 +98,7 @@ describe('createPluginInitializerContext', () => {
expect(configObject).toStrictEqual({
opensearchDashboards: {
index: '.kibana',
configIndex: '.opensearch_dashboards_config',
autocompleteTerminateAfter: duration(100000),
autocompleteTimeout: duration(1000),
},
7 changes: 6 additions & 1 deletion src/core/server/plugins/types.ts
@@ -287,7 +287,12 @@ export interface Plugin<

export const SharedGlobalConfigKeys = {
// We can add more if really needed
opensearchDashboards: ['index', 'autocompleteTerminateAfter', 'autocompleteTimeout'] as const,
opensearchDashboards: [
'index',
'configIndex',
'autocompleteTerminateAfter',
'autocompleteTimeout',
] as const,
opensearch: ['shardTimeout', 'requestTimeout', 'pingTimeout'] as const,
path: ['data'] as const,
savedObjects: ['maxImportPayloadBytes'] as const,
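
Because configIndex is now part of SharedGlobalConfigKeys.opensearchDashboards, a plugin can read it from the legacy global config observable. A sketch under that assumption (the helper name and import paths are illustrative):

import { first } from 'rxjs/operators';
import { PluginInitializerContext } from 'src/core/server';

// Resolves the dynamic-config index name a plugin should use.
export async function resolveConfigIndex(initializerContext: PluginInitializerContext) {
  const globalConfig = await initializerContext.config.legacy.globalConfig$
    .pipe(first())
    .toPromise();
  // Available here only because 'configIndex' was added to SharedGlobalConfigKeys above.
  return globalConfig.opensearchDashboards.configIndex; // '.opensearch_dashboards_config' by default
}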
@@ -60,6 +60,8 @@ export interface SavedObjectsExportOptions {
excludeExportDetails?: boolean;
/** optional namespace to override the namespace used by the savedObjectsClient. */
namespace?: string;
/** optional workspaces to override the workspaces used by the savedObjectsClient. */
workspaces?: string[];
}

/**
@@ -87,13 +89,15 @@ async function fetchObjectsToExport({
exportSizeLimit,
savedObjectsClient,
namespace,
workspaces,
}: {
objects?: SavedObjectsExportOptions['objects'];
types?: string[];
search?: string;
exportSizeLimit: number;
savedObjectsClient: SavedObjectsClientContract;
namespace?: string;
workspaces?: string[];
}) {
if ((types?.length ?? 0) > 0 && (objects?.length ?? 0) > 0) {
throw Boom.badRequest(`Can't specify both "types" and "objects" properties when exporting`);
@@ -121,6 +125,7 @@ async function fetchObjectsToExport({
search,
perPage: exportSizeLimit,
namespaces: namespace ? [namespace] : undefined,
...(workspaces ? { workspaces } : {}),
});
if (findResponse.total > exportSizeLimit) {
throw Boom.badRequest(`Can't export more than ${exportSizeLimit} objects`);
@@ -153,6 +158,7 @@ export async function exportSavedObjectsToStream({
includeReferencesDeep = false,
excludeExportDetails = false,
namespace,
workspaces,
}: SavedObjectsExportOptions) {
const rootObjects = await fetchObjectsToExport({
types,
@@ -161,6 +167,7 @@
savedObjectsClient,
exportSizeLimit,
namespace,
workspaces,
});
let exportedObjects: Array<SavedObject<unknown>> = [];
let missingReferences: SavedObjectsExportResultDetails['missingReferences'] = [];
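
A sketch of a caller passing the new workspaces option to exportSavedObjectsToStream; the import path, size limit, and workspace id are assumptions for illustration:

import { SavedObjectsClientContract } from 'src/core/server';
// Module path assumed; exportSavedObjectsToStream is the function shown in the diff above.
import { exportSavedObjectsToStream } from 'src/core/server/saved_objects/export';

async function exportWorkspaceDashboards(savedObjectsClient: SavedObjectsClientContract) {
  // Returns a readable stream of exported saved objects (plus export details unless excluded).
  return exportSavedObjectsToStream({
    savedObjectsClient,
    exportSizeLimit: 10000,
    types: ['dashboard'],
    includeReferencesDeep: true,
    // New option, threaded through fetchObjectsToExport into savedObjectsClient.find:
    workspaces: ['example-workspace-id'], // assumed workspace id
  });
}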
3 changes: 3 additions & 0 deletions src/core/server/saved_objects/import/check_conflicts.ts
@@ -44,6 +44,7 @@ interface CheckConflictsParams {
ignoreRegularConflicts?: boolean;
retries?: SavedObjectsImportRetry[];
createNewCopies?: boolean;
workspaces?: string[];
}

const isUnresolvableConflict = (error: SavedObjectError) =>
@@ -56,6 +57,7 @@ export async function checkConflicts({
ignoreRegularConflicts,
retries = [],
createNewCopies,
workspaces,
}: CheckConflictsParams) {
const filteredObjects: Array<SavedObject<{ title?: string }>> = [];
const errors: SavedObjectsImportError[] = [];
@@ -77,6 +79,7 @@ export async function checkConflicts({
});
const checkConflictsResult = await savedObjectsClient.checkConflicts(objectsToCheck, {
namespace,
workspaces,
});
const errorMap = checkConflictsResult.errors.reduce(
(acc, { type, id, error }) => acc.set(`${type}:${id}`, error),
@@ -41,6 +41,7 @@ interface CreateSavedObjectsParams<T> {
overwrite?: boolean;
dataSourceId?: string;
dataSourceTitle?: string;
workspaces?: string[];
}
interface CreateSavedObjectsResult<T> {
createdObjects: Array<CreatedObject<T>>;
@@ -60,6 +61,7 @@ export const createSavedObjects = async <T>({
overwrite,
dataSourceId,
dataSourceTitle,
workspaces,
}: CreateSavedObjectsParams<T>): Promise<CreateSavedObjectsResult<T>> => {
// filter out any objects that resulted in errors
const errorSet = accumulatedErrors.reduce(
@@ -169,6 +171,7 @@
const bulkCreateResponse = await savedObjectsClient.bulkCreate(objectsToCreate, {
namespace,
overwrite,
workspaces,
});
expectedResults = bulkCreateResponse.saved_objects;
}
@@ -57,6 +57,7 @@ export async function importSavedObjectsFromStream({
namespace,
dataSourceId,
dataSourceTitle,
workspaces,
}: SavedObjectsImportOptions): Promise<SavedObjectsImportResponse> {
let errorAccumulator: SavedObjectsImportError[] = [];
const supportedTypes = typeRegistry.getImportableAndExportableTypes().map((type) => type.name);
@@ -92,6 +93,7 @@
savedObjectsClient,
namespace,
ignoreRegularConflicts: overwrite,
workspaces,
};

const checkConflictsResult = await checkConflicts(checkConflictsParams);
@@ -142,6 +144,7 @@
namespace,
dataSourceId,
dataSourceTitle,
...(workspaces ? { workspaces } : {}),
};
const createSavedObjectsResult = await createSavedObjects(createSavedObjectsParams);
errorAccumulator = [...errorAccumulator, ...createSavedObjectsResult.errors];
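
A sketch of an import call that scopes conflict checking and bulk creation to a workspace through the new workspaces option; the import path, type registry wiring, and the workspace id are assumptions:

import { Readable } from 'stream';
import { ISavedObjectTypeRegistry, SavedObjectsClientContract } from 'src/core/server';
// Module path assumed; importSavedObjectsFromStream is the function shown in the diff above.
import { importSavedObjectsFromStream } from 'src/core/server/saved_objects/import';

async function importIntoWorkspace(
  readStream: Readable,
  savedObjectsClient: SavedObjectsClientContract,
  typeRegistry: ISavedObjectTypeRegistry
) {
  return importSavedObjectsFromStream({
    readStream,
    objectLimit: 10000,
    overwrite: false,
    createNewCopies: false,
    savedObjectsClient,
    typeRegistry,
    // New option: conflicts are checked and objects bulk-created within these workspaces,
    // via the checkConflicts and createSavedObjects changes shown above.
    workspaces: ['example-workspace-id'], // assumed workspace id
  });
}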
@@ -61,6 +61,7 @@ export async function resolveSavedObjectsImportErrors({
createNewCopies,
dataSourceId,
dataSourceTitle,
workspaces,
}: SavedObjectsResolveImportErrorsOptions): Promise<SavedObjectsImportResponse> {
// throw a BadRequest error if we see invalid retries
validateRetries(retries);
@@ -163,6 +164,7 @@
overwrite,
dataSourceId,
dataSourceTitle,
workspaces,
};
const { createdObjects, errors: bulkCreateErrors } = await createSavedObjects(
createSavedObjectsParams