# [Reporting] Restore the "csv by savedobject" endpoint for 7.17 (#148030)
## Summary

This restores an endpoint that was added in 7.3 in [this
PR](#34571) and removed in 7.9 in [this PR](#71031). The
changes have been re-done on top of 7.17, and the endpoint remains mostly
compatible with the one that existed in 7.3-7.8. This serves third parties
that relied on the earlier experimental code.

Supports:
* Saved searches with filters
* Saved searches with custom sorting
* Saved searches with or without selected columns
* Exports based on Index Patterns with or without a "time field"
* Requests can include an [optional POST
body](https://github.com/elastic/kibana/pull/148030/files#diff-0f565e26f3309c257fa919c5db227c3b7a78237015940c3d3677cbb1132a6701R27-R37)
with extra time range filters and/or a custom time zone (see the example request below).
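
As a rough sketch, a request that includes the optional body might look like the following. `HOST`, `SAVED_SEARCH_ID`, and `AUTH_HEADER` are the same placeholders used in the script under Testing, the `timerange` values are hypothetical, and the body shape is assumed here to mirror the `timerange` field of the `JobParamsCsvFromSavedObject` type added by this PR:
```sh
# Hypothetical example: generate a CSV from a saved search, constraining the
# time range and time zone via the optional POST body.
curl -XPOST "${HOST}/api/reporting/v1/generate/csv/saved-object/search:${SAVED_SEARCH_ID}" \
  -H 'kbn-xsrf: kibana-reporting' \
  -H "${AUTH_HEADER}" \
  -H 'Content-Type: application/json' \
  -d '{
    "timerange": {
      "timezone": "US/Alaska",
      "min": "2023-01-01T00:00:00.000Z",
      "max": "2023-01-08T00:00:00.000Z"
    }
  }'
```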

LIMITATIONS:
* This endpoint is not supported in 8.x at this time.
* Saved Search objects created in older versions of Kibana may not work.
* A query that searches across hundreds of shards could cause
Elasticsearch instability.
* The CSV output may have minor bugs, such as fields not being formatted
exactly as they are in the Discover table.
* This code may be forward-ported to `main` in a way that uses a
different API that is not compatible with this change.
* Does not allow "raw state" to be merged with the Search object, as the
previous code did. Otherwise, the API is compatible with the previous
code.
* This feature remains in "experimental" status, and is not ready to be
documented at this time.

## Testing
Since there is no UI for this endpoint, there are a few options for
testing:
1. Run the functional test:
```sh
node scripts/functional_tests.js \
  --config x-pack/test/reporting_api_integration/reporting_and_security.config.ts \
  --grep 'CSV Generation from Saved Search ID'
```

2. Create a saved search in Kibana, and use a script to send a request:
```sh
POST_URL="${HOST}/api/reporting/v1/generate/csv/saved-object/search:${SAVED_SEARCH_ID}"

# Generate a report, wait for execution to complete, and download the report
# (for example, to send it as an email attachment).

# 1. Send a request to generate a report
DOWNLOAD_PATH=$(curl --silent -XPOST "$POST_URL" -H "kbn-xsrf: kibana-reporting" -H "${AUTH_HEADER}" | jq -e -r ".payload.path | values")
if [ -z "$DOWNLOAD_PATH" ]; then
  echo "Something went wrong! Could not send the request to generate a report!" 1>&2
  # Re-run the request so the error response is visible in the console
  curl --silent -XPOST "$POST_URL" -H "kbn-xsrf: kibana-reporting" -H "${AUTH_HEADER}"
  exit 1
fi

# 2. Log the path used to download the report
DOWNLOAD_PATH=${HOST}$DOWNLOAD_PATH
echo "Download path: $DOWNLOAD_PATH"

# 3. Wait for report execution to finish
echo "While the report is executing in the Kibana server, the reporting service will return a 503 status code response."
STATUS=''
while [[ -z $STATUS || $STATUS =~ .*503.* ]]
do
  echo Waiting 5 seconds...
  sleep 5
  STATUS=$(curl --silent --head "$DOWNLOAD_PATH" -H "${AUTH_HEADER}" | head -1)
  if [[ -z "$STATUS" || $STATUS =~ .*500.* ]]; then
    echo "Something went wrong! Could not request the report execution status!" 1>&2
    curl "$DOWNLOAD_PATH" -H "${AUTH_HEADER}" 1>&2
    exit 1
  fi
  echo "$STATUS"
done

# 4. Download final report and show the contents in the console
curl -v "$DOWNLOAD_PATH" -H "$AUTH_HEADER"
```

3. Test that the above script from (2) works in 7.8, and continues to
work after migrating to 7.17.
tsullivan authored Jan 11, 2023
1 parent 336bdb6 commit 7fea0ad
Showing 31 changed files with 3,448 additions and 4 deletions.
2 changes: 2 additions & 0 deletions x-pack/plugins/reporting/common/constants.ts
@@ -66,6 +66,8 @@ export const DEFAULT_VIEWPORT = {
};

// Export Type Definitions
export const CSV_SAVED_OBJECT_JOB_TYPE = 'csv_saved_object';

export const CSV_REPORT_TYPE = 'CSV';
export const CSV_JOB_TYPE = 'csv_searchsource';

@@ -0,0 +1,21 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import type { BaseParams, BasePayload } from '../base';

interface CsvFromSavedObjectBase {
objectType: 'saved search';
timerange?: {
timezone?: string;
min?: string | number;
max?: string | number;
};
savedObjectId: string;
}

export type JobParamsCsvFromSavedObject = CsvFromSavedObjectBase & BaseParams;
export type TaskPayloadCsvFromSavedObject = CsvFromSavedObjectBase & BasePayload;
@@ -8,6 +8,7 @@
export * from './csv';
export * from './csv_searchsource';
export * from './csv_searchsource_immediate';
export * from './csv_saved_object';
export * from './png';
export * from './png_v2';
export * from './printable_pdf';
2 changes: 1 addition & 1 deletion x-pack/plugins/reporting/server/core.ts
@@ -259,7 +259,7 @@ export class ReportingCore {
return this.pluginSetupDeps;
}

-  private async getSavedObjectsClient(request: KibanaRequest) {
+  public async getSavedObjectsClient(request: KibanaRequest) {
const { savedObjects } = await this.getPluginStartDeps();
return savedObjects.getScopedClient(request) as SavedObjectsClientContract;
}
@@ -0,0 +1,19 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import { CreateJobFn, CreateJobFnFactory } from '../../types';
import { JobParamsCsvFromSavedObject, TaskPayloadCsvFromSavedObject } from './types';

type CreateJobFnType = CreateJobFn<JobParamsCsvFromSavedObject, TaskPayloadCsvFromSavedObject>;

export const createJobFnFactory: CreateJobFnFactory<CreateJobFnType> =
function createJobFactoryFn() {
return async function createJob(jobParams) {
// params have been finalized in server/routes/generate_from_savedobject.ts
return jobParams;
};
};
@@ -0,0 +1,145 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

jest.mock('../csv_searchsource/generate_csv', () => ({
CsvGenerator: class CsvGeneratorMock {
generateData() {
return {
size: 123,
content_type: 'text/csv',
};
}
},
}));

jest.mock('./lib/get_sharing_data', () => ({
getSharingData: jest.fn(() => ({ columns: [], searchSource: {} })),
}));

import { Writable } from 'stream';
import nodeCrypto from '@elastic/node-crypto';
import { ReportingCore } from '../../';
import { CancellationToken } from '../../../common';
import {
createMockConfigSchema,
createMockLevelLogger,
createMockReportingCore,
} from '../../test_helpers';
import { runTaskFnFactory } from './execute_job';

const logger = createMockLevelLogger();
const encryptionKey = 'tetkey';
const headers = { sid: 'cooltestheaders' };
let encryptedHeaders: string;
let reportingCore: ReportingCore;
let stream: jest.Mocked<Writable>;

beforeAll(async () => {
const crypto = nodeCrypto({ encryptionKey });
encryptedHeaders = await crypto.encrypt(headers);
});

beforeEach(async () => {
stream = {} as typeof stream;
reportingCore = await createMockReportingCore(createMockConfigSchema({ encryptionKey }));
});

test('recognized saved search', async () => {
reportingCore.getSavedObjectsClient = jest.fn().mockResolvedValue({
get: () => ({
attributes: {
kibanaSavedObjectMeta: {
searchSourceJSON: '{"indexRefName":"kibanaSavedObjectMeta.searchSourceJSON.index"}',
},
},
references: [
{
id: 'logstash-yes-*',
name: 'kibanaSavedObjectMeta.searchSourceJSON.index',
type: 'index-pattern',
},
],
}),
});

const runTask = runTaskFnFactory(reportingCore, logger);
const payload = await runTask(
'cool-job-id',
{
headers: encryptedHeaders,
browserTimezone: 'US/Alaska',
savedObjectId: '123-456-abc-defgh',
objectType: 'saved search',
title: 'Test Search',
version: '7.17.0',
},
new CancellationToken(),
stream
);

expect(payload).toMatchInlineSnapshot(`
Object {
"content_type": "text/csv",
"size": 123,
}
`);
});

test('saved search object is missing references', async () => {
reportingCore.getSavedObjectsClient = jest.fn().mockResolvedValue({
get: () => ({
attributes: {
kibanaSavedObjectMeta: {
searchSourceJSON: '{"indexRefName":"kibanaSavedObjectMeta.searchSourceJSON.index"}',
},
},
}),
});

const runTask = runTaskFnFactory(reportingCore, logger);
const runTest = async () => {
await runTask(
'cool-job-id',
{
headers: encryptedHeaders,
browserTimezone: 'US/Alaska',
savedObjectId: '123-456-abc-defgh',
objectType: 'saved search',
title: 'Test Search',
version: '7.17.0',
},
new CancellationToken(),
stream
);
};

await expect(runTest).rejects.toEqual(
new Error('Could not find reference for kibanaSavedObjectMeta.searchSourceJSON.index')
);
});

test('invalid saved search', async () => {
reportingCore.getSavedObjectsClient = jest.fn().mockResolvedValue({ get: jest.fn() });
const runTask = runTaskFnFactory(reportingCore, logger);
const runTest = async () => {
await runTask(
'cool-job-id',
{
headers: encryptedHeaders,
browserTimezone: 'US/Alaska',
savedObjectId: '123-456-abc-defgh',
objectType: 'saved search',
title: 'Test Search',
version: '7.17.0',
},
new CancellationToken(),
stream
);
};

await expect(runTest).rejects.toEqual(new Error('Saved search object is not valid'));
});
@@ -0,0 +1,109 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import { SavedObject } from 'kibana/server';
import type { SearchSourceFields } from 'src/plugins/data/common';
import type { VisualizationSavedObjectAttributes } from 'src/plugins/visualizations/common';
import { DeepPartial } from 'utility-types';
import { JobParamsCSV } from '../..';
import { injectReferences, parseSearchSourceJSON } from '../../../../../../src/plugins/data/common';
import { CSV_JOB_TYPE } from '../../../common/constants';
import { getFieldFormats } from '../../services';
import type { RunTaskFn, RunTaskFnFactory } from '../../types';
import { decryptJobHeaders } from '../common';
import { CsvGenerator } from '../csv_searchsource/generate_csv';
import { getSharingData } from './lib';
import type { TaskPayloadCsvFromSavedObject } from './types';

type RunTaskFnType = RunTaskFn<TaskPayloadCsvFromSavedObject>;
type SavedSearchObjectType = SavedObject<
VisualizationSavedObjectAttributes & { columns?: string[]; sort: Array<[string, string]> }
>;
type ParsedSearchSourceJSON = SearchSourceFields & { indexRefName?: string };

function isSavedObject(
savedSearch: SavedSearchObjectType | unknown
): savedSearch is SavedSearchObjectType {
return (
(savedSearch as DeepPartial<SavedSearchObjectType> | undefined)?.attributes
?.kibanaSavedObjectMeta?.searchSourceJSON != null
);
}

export const runTaskFnFactory: RunTaskFnFactory<RunTaskFnType> = (reporting, _logger) => {
const config = reporting.getConfig();

return async function runTask(jobId, job, cancellationToken, stream) {
const logger = _logger.clone([CSV_JOB_TYPE, 'execute-job', jobId]);

const encryptionKey = config.get('encryptionKey');
const headers = await decryptJobHeaders(encryptionKey, job.headers, logger);
const fakeRequest = reporting.getFakeRequest({ headers }, job.spaceId, logger);
const uiSettings = await reporting.getUiSettingsClient(fakeRequest, logger);
const savedObjects = await reporting.getSavedObjectsClient(fakeRequest);
const dataPluginStart = await reporting.getDataService();
const fieldFormatsRegistry = await getFieldFormats().fieldFormatServiceFactory(uiSettings);

const [es, searchSourceStart] = await Promise.all([
(await reporting.getEsClient()).asScoped(fakeRequest),
await dataPluginStart.search.searchSource.asScoped(fakeRequest),
]);

const clients = {
uiSettings,
data: dataPluginStart.search.asScoped(fakeRequest),
es,
};
const dependencies = {
searchSourceStart,
fieldFormatsRegistry,
};

// Get the Saved Search Fields object from ID
const savedSearch = await savedObjects.get('search', job.savedObjectId);

if (!isSavedObject(savedSearch)) {
throw new Error(`Saved search object is not valid`);
}

// allowed to throw an Invalid JSON error if the JSON is not parseable.
const searchSourceFields: ParsedSearchSourceJSON = parseSearchSourceJSON(
savedSearch.attributes.kibanaSavedObjectMeta.searchSourceJSON
);

const indexRefName = searchSourceFields.indexRefName;
if (!indexRefName) {
throw new Error(`Saved Search data is missing a reference to an Index Pattern!`);
}

// Inject references into the Saved Search Fields
const searchSourceFieldsWithRefs = injectReferences(
{ ...searchSourceFields, indexRefName },
savedSearch.references ?? []
);

// Form the Saved Search attributes and SearchSource into a config that's compatible with CsvGenerator
const { columns, searchSource } = await getSharingData(
{ uiSettings },
await searchSourceStart.create(searchSourceFieldsWithRefs),
savedSearch,
job.timerange
);

const jobParamsCsv: JobParamsCSV = { ...job, columns, searchSource };
const csv = new CsvGenerator(
jobParamsCsv,
config,
clients,
dependencies,
cancellationToken,
logger,
stream
);
return await csv.generateData();
};
};
@@ -0,0 +1,40 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import {
CSV_SAVED_OBJECT_JOB_TYPE as CSV_JOB_TYPE,
LICENSE_TYPE_BASIC,
LICENSE_TYPE_ENTERPRISE,
LICENSE_TYPE_GOLD,
LICENSE_TYPE_PLATINUM,
LICENSE_TYPE_CLOUD_STANDARD,
LICENSE_TYPE_TRIAL,
} from '../../../common/constants';
import { CreateJobFn, ExportTypeDefinition, RunTaskFn } from '../../types';
import { createJobFnFactory } from './create_job';
import { runTaskFnFactory } from './execute_job';
import { JobParamsCsvFromSavedObject, TaskPayloadCsvFromSavedObject } from './types';

export const getExportType = (): ExportTypeDefinition<
CreateJobFn<JobParamsCsvFromSavedObject>,
RunTaskFn<TaskPayloadCsvFromSavedObject>
> => ({
id: CSV_JOB_TYPE,
name: CSV_JOB_TYPE,
jobType: CSV_JOB_TYPE,
jobContentExtension: 'csv',
createJobFnFactory,
runTaskFnFactory,
validLicenses: [
LICENSE_TYPE_TRIAL,
LICENSE_TYPE_BASIC,
LICENSE_TYPE_CLOUD_STANDARD,
LICENSE_TYPE_GOLD,
LICENSE_TYPE_PLATINUM,
LICENSE_TYPE_ENTERPRISE,
],
});