From 5def848d2cee99e337a46f57893c14be1d1c7052 Mon Sep 17 00:00:00 2001 From: Vadim Kibana <82822460+vadimkibana@users.noreply.github.com> Date: Mon, 30 Sep 2024 18:14:58 +0200 Subject: [PATCH 1/4] [ES|QL] AST package documentation (#194296) Updates documentation for the ES|QL AST package. --- packages/kbn-esql-ast/README.md | 97 ++------- packages/kbn-esql-ast/src/builder/README.md | 39 ++++ packages/kbn-esql-ast/src/parser/README.md | 144 ++++++++++++- .../kbn-esql-ast/src/pretty_print/README.md | 76 ++++++- packages/kbn-esql-ast/src/visitor/README.md | 202 +++++++++++++++++- packages/kbn-esql-ast/src/walker/README.md | 125 ++++++++--- 6 files changed, 575 insertions(+), 108 deletions(-) create mode 100644 packages/kbn-esql-ast/src/builder/README.md diff --git a/packages/kbn-esql-ast/README.md b/packages/kbn-esql-ast/README.md index 76232d371b9cb..f7be5248f2ca0 100644 --- a/packages/kbn-esql-ast/README.md +++ b/packages/kbn-esql-ast/README.md @@ -1,89 +1,38 @@ -# ES|QL utility library +# ES|QL AST library -## Folder structure +The general idea of this package is to provide low-level ES|QL parsing, +building, traversal, pretty-printing, and manipulation features on top of a +custom compact AST representation, which is designed to be resilient to many +grammar changes. -This library brings all the foundation data structure to enable all advanced features within an editor for ES|QL as validation, autocomplete, hover, etc... -The package is structure as follow: +Contents of this package: -``` -src - |- antlr // => contains the ES|QL grammar files and various compilation assets - | ast_factory.ts // => binding to the Antlr that generates the AST data structure - | ast_errors.ts // => error translation utility from raw Antlr to something understandable (somewhat) - | antlr_error_listener.ts // => The ES|QL syntax error listener - | antlr_facade.ts // => getParser and getLexer utilities - | ... 
// => miscellaneas utilities to work with AST -``` - -### Basic usage - -#### Get AST from a query string +- [`builder` — Contains the `Builder` class for AST node construction](./src/builder/README.md). +- [`parser` — Contains text to ES|QL AST parsing code](./src/parser/README.md). +- [`walker` — Contains the ES|QL AST `Walker` utility](./src/walker/README.md). +- [`visitor` — Contains the ES|QL AST `Visitor` utility](./src/visitor/README.md). +- [`pretty_print` — Contains code for formatting AST to text](./src/pretty_print/README.md). -This module contains the entire logic to translate from a query string into the AST data structure. -The `getAstAndSyntaxErrors` function returns the AST data structure, unless a syntax error happens in which case the `errors` array gets populated with a Syntax error. -##### Usage +## Demo -```js -import { getAstAndSyntaxErrors } from '@kbn/esql-ast'; +Much of the functionality of this package is demonstrated in the demo UI. You +can run it in Storybook, using the following command: -const queryString = "from index | stats 1 + avg(myColumn) "; -const { ast, errors} = await astProvider(queryString); - -if(errors){ - console.log({ syntaxErrors: errors }); -} -// do stuff with the ast +```bash +yarn storybook esql_ast_inspector ``` -## How does it work - -The general idea of this package is to provide all ES|QL features on top of a custom compact AST definition (all data structure types defined in `./types.ts`) which is designed to be resilient to many grammar changes. 
-The pipeline is the following: +Alternatively, you can start Kibana with *Example Plugins* enabled, using: +```bash +yarn start --run-examples ``` -Antlr grammar files -=> Compiled grammar files (.ts assets in the antlr folder) -=> AST Factory (Antlr Parser tree => custom AST) -``` - -Each feature function works with the combination of the AST and the definition files: the former describe the current statement in a easy to traverse way, while the definitions describe what's the expected behaviour of each node in the AST node (i.e. what arguments should it accept? How many arguments? etc...). -While AST requires the grammar to be compiled to be updated, definitions are static files which can be dynamically updated without running the ANTLR compile task. - -#### AST - -The AST is generated by 2 files: `ast_factory.ts` and its buddy `ast_walker.ts`: -* `ast_factory.ts` is a binding to Antlr and access the Parser tree -* Parser tree is passed over to `ast_walker` to append new AST nodes - -In general Antlr is resilient to grammar errors, in the sense that it can produe a Parser tree up to the point of the error, then stops. This is useful to perform partial tasks even with broken queries and this means that a partial AST can be produced even with an invalid query. - -### Keeping ES|QL up to date - -In general when operating on changes here use the `yarn kbn watch` in a terminal window to make sure changes are correctly compiled. - -### How to add new commands/options -When a new command/option is added to ES|QL it is done via a grammar update. -Therefore adding them requires a two step phase: -* Update the grammar with the new one - * add/fix all AST generator bindings in case of new/changed TOKENS in the `lexer` grammar file -* Update the definition files for commands/options +Then navigate to the *ES|QL AST Inspector* plugin in the Kibana UI. -To update the grammar: -1. 
Make sure the `lexer` and `parser` files are up to date with their ES counterparts - * an existing Kibana CI job is updating them already automatically -2. Run the script into the `package.json` to compile the ES|QL grammar. -3. open the `ast_factory.ts` file and add a new `exit` method -4. write some code in the `ast_walker/ts` to translate the Antlr Parser tree into the custom AST (there are already few utilites for that, but sometimes it is required to write some more code if the `parser` introduced a new flow) - * pro tip: use the `http://lab.antlr.org/` to visualize/debug the parser tree for a given statement (copy and paste the grammar files there) -5. if something goes wrong with new quoted/unquoted identifier token, open the `ast_helpers.ts` and check the ids of the new tokens in the `getQuotedText` and `getUnquotedText` functions - please make sure to leave a comment on the token name -#### Debug and fix grammar changes (tokens, etc...) +## Keeping ES|QL AST library up to date -On TOKEN renaming or with subtle `lexer` grammar changes it can happens that test breaks, this can be happen for two main issues: -* A TOKEN name changed so the `ast_walker.ts` doesn't find it any more. Go there and rename the TOKEN name. -* TOKEN order changed and tests started failing. This probably generated some TOKEN id reorder and there are two functions in `ast_helpers.ts` who rely on hardcoded ids: `getQuotedText` and `getUnquotedText`. - * Note that the `getQuotedText` and `getUnquotedText` are automatically updated on grammar changes detected by the Kibana CI sync job. - * to fix this just look at the commented tokens and update the ids. If a new token add it and leave a comment to point to the new token name. - * This choice was made to reduce the bundle size, as importing the `esql_parser` adds some hundreds of Kbs to the bundle otherwise. 
\ No newline at end of file
+While working on changes in this package, run `yarn kbn watch` in a terminal
+window to make sure the changes are correctly compiled.
diff --git a/packages/kbn-esql-ast/src/builder/README.md b/packages/kbn-esql-ast/src/builder/README.md
new file mode 100644
index 0000000000000..8b874579dab29
--- /dev/null
+++ b/packages/kbn-esql-ast/src/builder/README.md
@@ -0,0 +1,39 @@
+# Builder
+
+Contains the `Builder` class for AST node construction. It provides the most
+low-level, stateless AST node construction API.
+
+The `Builder` API can be used when constructing AST nodes from scratch manually,
+and it is also used by the parser to construct the AST nodes during the parsing
+process.
+
+When parsing, the AST nodes will typically have more information, such as the
+position in the source code, and other metadata. When constructing the AST nodes
+manually, this information is not available, but the `Builder` API can still be
+used, as it permits skipping the metadata.
+
+
+## Usage
+
+Construct a `literal` expression node:
+
+```typescript
+import { Builder } from '@kbn/esql-ast';
+
+const node = Builder.expression.literal.numeric({ value: 42, literalType: 'integer' });
+```
+
+Returns:
+
+```js
+{
+  type: 'literal',
+  literalType: 'integer',
+  value: 42,
+  name: '42',
+
+  location: { min: 0, max: 0 },
+  text: '',
+  incomplete: false,
+}
+```
diff --git a/packages/kbn-esql-ast/src/parser/README.md b/packages/kbn-esql-ast/src/parser/README.md
index 1500be94c40c8..e054c8999714c 100644
--- a/packages/kbn-esql-ast/src/parser/README.md
+++ b/packages/kbn-esql-ast/src/parser/README.md
@@ -1,6 +1,91 @@
+# ES|QL Parser
+
+The Kibana ES|QL parser uses the ANTLR library for lexing and parse tree (CST)
+generation. The ANTLR grammar is imported from the Elasticsearch repository in
+an automated CI job.
+
+We use two ANTLR outputs: (1) the token stream and (2) the parse tree. From
+these we (1) generate the Abstract Syntax Tree (AST), (2) perform syntax
+validation, (3) perform syntax highlighting, and (4) extract formatting
+(comments and whitespace) and assign it to AST nodes.
+
+In general, ANTLR is resilient to grammar errors: it can produce a parse tree
+up to the point of the error, then stop. This is useful for performing partial
+tasks even with broken queries, and it means that a partial AST can be produced
+even for an invalid query.
+
+
+## Folder structure
+
+The parser is structured as follows:
+
+```
+src/
+|- parser/                          Contains the logic to parse the ES|QL query and generate the AST.
+|  |- factories.ts                  Contains AST node factories.
+|  |- antlr_error_listener.ts       Contains the error listener which collects ANTLR syntax errors.
+|  |- esql_ast_builder_listener.ts  Contains code which traverses the ANTLR CST and builds the AST.
+|
+|- antlr/                           Contains the autogenerated ES|QL ANTLR grammar files and various compilation assets.
+   |- esql_lexer.g4                 Contains the ES|QL ANTLR lexer grammar.
+   |- esql_parser.g4                Contains the ES|QL ANTLR parser grammar.
+```
+
+
+## Usage
+
+### Get AST from a query string
+
+The `parse` function returns the AST data structure; if a syntax error happens,
+the `errors` array gets populated with syntax errors.
+
+```js
+import { parse } from '@kbn/esql-ast';
+
+const src = "FROM index | STATS 1 + AVG(myColumn) ";
+const { root, errors } = parse(src);
+
+if (errors.length) {
+  console.log({ syntaxErrors: errors });
+}
+
+// do stuff with the ast
+```
+
+The `root` is the root node of the AST. The AST is a tree structure where each
+node represents a part of the query. Each node has a `type` property which
+indicates the type of the node.
+
+
+### Parse a query and populate the AST with comments
+
+When the `parse` function is called with the `withFormatting` flag set to
+`true`, the AST will be populated with comments.
+
+```js
+import { parse } from '@kbn/esql-ast';
+
+const src = "FROM /* COMMENT */ index";
+const { root } = parse(src, { withFormatting: true });
+```
+
+
 ## Comments
 
-### Inter-node comment places
+By default, when parsing, the AST does not include any *formatting* information,
+such as comments or whitespace. This is because the AST is designed to be
+compact and to be used for syntax validation, syntax highlighting, and other
+high-level operations.
+
+However, sometimes it is useful to have comments attached to the AST nodes. The
+parser can collect all comments when the `withFormatting` flag is set to `true`
+and attach them to the AST nodes. The comments are attached to the closest node,
+while also considering the surrounding punctuation.
+
+### Inter-node comments
+
+Currently, inter-node comments are attached to the node on their left side.
 
 Around colon in source identifier:
@@ -25,3 +110,60 @@ Time interface expressions:
 ```eslq
 STATS 1 /* asdf */ DAY
 ```
+
+
+## Internal Details
+
+
+### How does it work?
+
+The pipeline is the following:
+
+1. ANTLR grammar files are added to Kibana.
+2. ANTLR grammar files are compiled to `.ts` assets in the `antlr` folder.
+3. A query is parsed to a CST by ANTLR.
+4. The `ESQLAstBuilderListener` traverses the CST and builds the AST.
+5. Optionally:
+   1. Comments and whitespace are extracted from the ANTLR lexer's token stream.
+   2. The comments and whitespace are attached to the AST nodes.
+
+
+### How to add new commands/options?
+
+When a new command or option is added to ES|QL, it is done via a grammar update.
+Therefore, adding it is a two-step process: update the grammar, then update the
+AST builder code.
+
+To update the grammar:
+
+1. Make sure the `lexer` and `parser` files are up to date with their ES
+   counterparts.
+   * An existing Kibana CI job already updates them automatically.
+2. Run the compile script in `package.json` to compile the ES|QL grammar.
+3. Open the `esql_ast_builder_listener.ts` file and add a new `exit` method.
+4.
Write some code in the `esql_ast_builder_listener.ts` to translate the
+   ANTLR parse tree into the custom AST (there are already a few utilities for
+   that, but sometimes it is required to write some more code if the `parser`
+   introduced a new flow).
+   * Pro tip: use http://lab.antlr.org/ to visualize/debug the parse tree for
+     a given statement (copy and paste the grammar files there).
+5. If something goes wrong with a new quoted/unquoted identifier token, open
+   `helpers.ts` and check the IDs of the new tokens in the `getQuotedText` and
+   `getUnquotedText` functions; please make sure to leave a comment with the
+   token name.
+
+
+#### Debug and fix grammar changes (tokens, etc.)
+
+On token renaming, or with subtle `lexer` grammar changes, tests can break.
+This usually happens for one of two reasons:
+
+* A token name changed, so the `esql_ast_builder_listener.ts` does not find it
+  any more. Go there and rename the token.
+* The token order changed and tests started failing. This probably reordered
+  some token IDs, and there are two functions in `helpers.ts` which rely on
+  hardcoded IDs: `getQuotedText` and `getUnquotedText`.
+  * Note that `getQuotedText` and `getUnquotedText` are automatically updated
+    on grammar changes detected by the Kibana CI sync job.
+  * To fix this, look at the commented tokens and update the IDs. If there is
+    a new token, add it and leave a comment pointing to the token name.
+  * This choice was made to reduce the bundle size, as importing the
+    `esql_parser` otherwise adds some hundreds of kBs to the bundle.
diff --git a/packages/kbn-esql-ast/src/pretty_print/README.md b/packages/kbn-esql-ast/src/pretty_print/README.md
index 48066697a5a7e..1d600fc19d3bc 100644
--- a/packages/kbn-esql-ast/src/pretty_print/README.md
+++ b/packages/kbn-esql-ast/src/pretty_print/README.md
@@ -4,20 +4,82 @@ human-readable string. This is useful for debugging or for displaying the AST
 to the user.
-This module provides a number of pretty-printing options. +This module provides a number of pretty-printing facilities. There are two +main classes that provide pretty-printing: + +- `BasicPrettyPrinter` — provides the basic pretty-printing to a single + line. +- `WrappingPrettyPrinter` — provides more advanced pretty-printing, which + can wrap the query to multiple lines, and can also wrap the query to a + specific width. ## `BasicPrettyPrinter` -The `BasicPrettyPrinter` class provides the most basic pretty-printing—it -prints a query to a single line. Or it can print a query with each command on -a separate line, with the ability to customize the indentation before the pipe -character. +The `BasicPrettyPrinter` class provides the simpler pretty-printing +functionality—it prints a query to a single line. Or, it can print a query +with each command on a separate line, with the ability to customize the +indentation before the pipe character. + +Usage: + +```typescript +import { parse, BasicPrettyPrinter } from '@kbn/esql-ast'; + +const src = 'FROM index | LIMIT 10'; +const { root } = parse(src); +const text = BasicPrettyPrinter.print(root); + +console.log(text); // FROM index | LIMIT 10 +``` + +It can print each command on a separate line, with a custom indentation before +the pipe character: + +```typescript +const text = BasicPrettyPrinter.multiline(root, { pipeTab: ' ' }); +``` It can also print a single command to a single line; or an expression to a -single line. +single line. Below is the summary of the top-level functions: - `BasicPrettyPrinter.print()` — prints query to a single line. - `BasicPrettyPrinter.multiline()` — prints a query to multiple lines. - `BasicPrettyPrinter.command()` — prints a command to a single line. -- `BasicPrettyPrinter.expression()` — prints an expression to a single line. +- `BasicPrettyPrinter.expression()` — prints an expression to a single + line. + +See `BasicPrettyPrinterOptions` for formatting options. 
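For instance, a single command can be printed on its own. The sketch below assumes the parsed query exposes its commands via a `root.commands` array; exact property names may differ:

```typescript
import { parse, BasicPrettyPrinter } from '@kbn/esql-ast';

const { root } = parse('FROM index | WHERE a > 10 | LIMIT 10');

// Print just the second command (`WHERE a > 10`) on a single line.
const text = BasicPrettyPrinter.command(root.commands[1]);
console.log(text);
```
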
For example, the
+`lowercase` option allows you to lowercase all ES|QL keywords:
+
+```typescript
+const text = BasicPrettyPrinter.print(root, { lowercase: true });
+```
+
+The `BasicPrettyPrinter` prints only *left* and *right* multi-line comments,
+which do not have line breaks, as this formatter is designed to print a query
+to a single line. If you need to print a query to multiple lines, use the
+`WrappingPrettyPrinter`.
+
+
+## `WrappingPrettyPrinter`
+
+The *wrapping pretty printer* can print a query to multiple lines, and can wrap
+the text to a new line if the line width exceeds a certain threshold. It also
+prints all comments attached to the AST (including ones that force the text
+to be wrapped).
+
+Usage:
+
+```typescript
+import { parse, WrappingPrettyPrinter } from '@kbn/esql-ast';
+
+const src = `
+  FROM index /* this is a comment */
+  | LIMIT 10`;
+const { root } = parse(src, { withFormatting: true });
+const text = WrappingPrettyPrinter.print(root);
+```
+
+See the `WrappingPrettyPrinterOptions` interface for available formatting
+options.
+
diff --git a/packages/kbn-esql-ast/src/visitor/README.md b/packages/kbn-esql-ast/src/visitor/README.md
index c952c8a34d8d9..20d55c0967e10 100644
--- a/packages/kbn-esql-ast/src/visitor/README.md
+++ b/packages/kbn-esql-ast/src/visitor/README.md
@@ -1,4 +1,28 @@
-## High-level AST structure
+# `Visitor` Traversal API
+
+The `Visitor` traversal API provides a feature-rich way to traverse the ES|QL
+AST. It is more powerful than the [`Walker` API](../walker/README.md), as it
+allows you to traverse the AST in a more flexible way.
+
+The `Visitor` API allows you to start traversal from the root node, a command
+statement, or an expression. Unlike the `Walker` API, the `Visitor` does not
+automatically traverse the entire AST. Instead, the developer has to manually
+call the necessary *visit* methods to traverse the AST.
This allows you
+to traverse the AST in a more flexible way: traverse only the parts of the AST
+that are needed, traverse the AST in a different order, or traverse it multiple
+times.
+
+The `Visitor` API is also more powerful than the `Walker` API, as for each
+visitor callback it provides a *context* object, which contains information
+about the current node, its parent node, and the whole parent chain up to the
+root node.
+
+In addition, each visitor callback can return a value (*output*), which is then
+passed to the parent node at the place where the visitor was called. Also, when
+a child is visited, the parent node can pass *input* to the child visitor.
+
+
+## About ES|QL AST structure
 
 Broadly, there are two AST node types: (1) commands (say `FROM ...`, like
 *statements* in other languages), and (2) expressions (say `a + b`, or `fn()`).
@@ -59,7 +83,8 @@ As of this writing, the following expressions are defined:
 - Column identifier expression, `{type: "column"}`, like `@timestamp`
 - Function call expression, `{type: "function"}`, like `fn(123)`
 - Literal expression, `{type: "literal"}`, like `123`, `"hello"`
-- List literal expression, `{type: "list"}`, like `[1, 2, 3]`, `["a", "b", "c"]`, `[true, false]`
+- List literal expression, `{type: "list"}`, like `[1, 2, 3]`,
+  `["a", "b", "c"]`, `[true, false]`
 - Time interval expression, `{type: "interval"}`, like `1h`, `1d`, `1w`
 - Inline cast expression, `{type: "cast"}`, like `abc::int`, `def::string`
 - Unknown node, `{type: "unknown"}`
@@ -67,3 +92,176 @@ As of this writing, the following expressions are defined:
 Each expression has a `visitExpressionX` callback, where `X` is the type of
 the expression. If a expression-specific callback is not found, the generic
 `visitExpression` callback is called.
+
+
+## `Visitor` API Usage
+
+The `Visitor` API is used to traverse the AST. The process is as follows:
+
+1. Create a new `Visitor` instance.
+2.
Register callbacks for the nodes you are interested in.
+3. Call the `visitQuery`, `visitCommand`, or `visitExpression` method to start
+   the traversal.
+
+For example, the below code snippet prints the type of each expression node:
+
+```typescript
+new Visitor()
+  .on('visitExpression', (ctx) => console.log(ctx.node.type))
+  .on('visitCommand', (ctx) => [...ctx.visitArguments()])
+  .on('visitQuery', (ctx) => [...ctx.visitCommands()])
+  .visitQuery(root);
+```
+
+The `visitQuery` callback visits all commands using `ctx.visitCommands()`; the
+`visitCommand` callback visits all arguments using `ctx.visitArguments()`; and
+finally, the `visitExpression` callback prints the type of each expression
+node.
+
+Above we started the traversal from the root node, using the `.visitQuery(root)`
+method. However, one can start the traversal from any node by calling the
+following methods:
+
+- `.visitQuery()` — Start traversal from the root node.
+- `.visitCommand()` — Start traversal from a command node.
+- `.visitExpression()` — Start traversal from an expression node.
+
+
+### Specifying Callbacks
+
+The simplest way to traverse the AST is to specify the below three callbacks:
+
+- `visitQuery` — Called for every query node. (Normally once.)
+- `visitCommand` — Called for every command node.
+- `visitExpression` — Called for every expression node.
+
+
+However, you can be more specific and register callbacks for specific command
+and expression types. This way, the context `ctx` provided to the callback will
+have helpful methods specific to the node type.
+
+When a more specific callback is found, the generic `visitCommand` or
+`visitExpression` callback is not called for that node.
+
+You can specify a specific callback for each command, instead of the generic
+`visitCommand`:
+
+- `visitFromCommand` — Called for every `FROM` command node.
+- `visitLimitCommand` — Called for every `LIMIT` command node.
+- `visitExplainCommand` — Called for every `EXPLAIN` command node. +- `visitRowCommand` — Called for every `ROW` command node. +- `visitMetricsCommand` — Called for every `METRICS` command node. +- `visitShowCommand` — Called for every `SHOW` command node. +- `visitMetaCommand` — Called for every `META` command node. +- `visitEvalCommand` — Called for every `EVAL` command node. +- `visitStatsCommand` — Called for every `STATS` command node. +- `visitInlineStatsCommand` — Called for every `INLINESTATS` command node. +- `visitLookupCommand` — Called for every `LOOKUP` command node. +- `visitKeepCommand` — Called for every `KEEP` command node. +- `visitSortCommand` — Called for every `SORT` command node. +- `visitWhereCommand` — Called for every `WHERE` command node. +- `visitDropCommand` — Called for every `DROP` command node. +- `visitRenameCommand` — Called for every `RENAME` command node. +- `visitDissectCommand` — Called for every `DISSECT` command node. +- `visitGrokCommand` — Called for every `GROK` command node. +- `visitEnrichCommand` — Called for every `ENRICH` command node. +- `visitMvExpandCommand` — Called for every `MV_EXPAND` command node. + +Similarly, you can specify a specific callback for each expression type, instead +of the generic `visitExpression`: + +- `visitColumnExpression` — Called for every column expression node, say + `@timestamp`. +- `visitSourceExpression` — Called for every source expression node, say + `tsdb_index`. +- `visitFunctionCallExpression` — Called for every function call + expression node. Including binary expressions, such as `a + b`. +- `visitLiteralExpression` — Called for every literal expression node, say + `123`, `"hello"`. +- `visitListLiteralExpression` — Called for every list literal expression + node, say `[1, 2, 3]`, `["a", "b", "c"]`. +- `visitTimeIntervalLiteralExpression` — Called for every time interval + literal expression node, say `1h`, `1d`, `1w`. 
+- `visitInlineCastExpression` — Called for every inline cast expression
+  node, say `abc::int`, `def::string`.
+- `visitRenameExpression` — Called for every rename expression node, say
+  `a AS b`.
+- `visitOrderExpression` — Called for every order expression node, say
+  `@timestamp ASC`.
+
+
+### Using the Node Context
+
+Each visitor callback receives a `ctx` object, which contains a reference to
+the parent node's context:
+
+```typescript
+new Visitor()
+  .on('visitExpression', (ctx) => {
+    ctx.parent
+  });
+```
+
+Each visitor callback also provides various methods to visit the child nodes,
+if needed. For example, to visit all arguments of a command node:
+
+```typescript
+const expressions = [];
+
+new Visitor()
+  .on('visitExpression', (ctx) => expressions.push(ctx.node))
+  .on('visitCommand', (ctx) => {
+    for (const output of ctx.visitArguments()) {
+    }
+  });
+```
+
+The node context object may also have node-specific methods. For example, the
+`LIMIT` command context has the `.numeric()` method, which returns the numeric
+value of the `LIMIT` command:
+
+```typescript
+new Visitor()
+  .on('visitLimitCommand', (ctx) => {
+    console.log(ctx.numeric());
+  })
+  .on('visitCommand', () => null)
+  .on('visitQuery', (ctx) => [...ctx.visitCommands()])
+  .visitQuery(root);
+```
+
+
+### Using the Visitor Output
+
+Each visitor callback can return an *output*, which is then passed to the
+parent callback. This allows you to pass information from a child node to its
+parent node.
+
+For example, the below code snippet collects all column names in the AST:
+
+```typescript
+const columns = new Visitor()
+  .on('visitExpression', (ctx) => null)
+  .on('visitColumnExpression', (ctx) => ctx.node.name)
+  .on('visitCommand', (ctx) => [...ctx.visitArguments()])
+  .on('visitQuery', (ctx) => [...ctx.visitCommands()])
+  .visitQuery(root);
+```
+
+
+### Using the Visitor Input
+
+Analogous to the output, each visitor callback can receive an *input* value.
+This allows you to pass information from a parent node to its child nodes.
+
+For example, the below code snippet prints all column names prefixed with the
+text `"prefix"`:
+
+```typescript
+new Visitor()
+  .on('visitExpression', (ctx) => null)
+  .on('visitColumnExpression', (ctx, INPUT) => console.log(INPUT + ctx.node.name))
+  .on('visitCommand', (ctx) => [...ctx.visitArguments("prefix")])
+  .on('visitQuery', (ctx) => [...ctx.visitCommands()])
+  .visitQuery(root);
+```
diff --git a/packages/kbn-esql-ast/src/walker/README.md b/packages/kbn-esql-ast/src/walker/README.md
index 74e834e9095bc..4614350279b0c 100644
--- a/packages/kbn-esql-ast/src/walker/README.md
+++ b/packages/kbn-esql-ast/src/walker/README.md
@@ -1,41 +1,118 @@
-# ES|QL AST Walker
+# `Walker` Traversal API
 
-The ES|QL AST Walker is a utility that traverses the ES|QL AST and provides a
-set of callbacks that can be used to perform introspection of the AST.
+The ES|QL AST `Walker` is a utility that traverses the ES|QL AST. The developer
+can provide a set of callbacks which are called when the walker visits a
+specific type of node.
+
+The `Walker` utility allows you to start a traversal from any node, not just
+the root node.
+
+
+## Low-level API
 
 To start a new *walk* you create a `Walker` instance and call the `walk()`
 method with the AST node to start the walk from.
 
 ```ts
-
-import { Walker, getAstAndSyntaxErrors } from '@kbn/esql-ast';
+import { parse, Walker } from '@kbn/esql-ast';
 
 const walker = new Walker({
-  // Called every time a function node is visited.
-  visitFunction: (fn) => {
+  /**
+   * Visit commands
+   */
+  visitCommand: (node: ESQLCommand) => {
+    // Called for every command node.
+  },
+  visitCommandOption: (node: ESQLCommandOption) => {
+    // Called for every command option node.
+  },
+
+  /**
+   * Visit expressions
+   */
+  visitFunction: (fn: ESQLFunction) => {
+    // Called every time a function expression is visited.
console.log('Function:', fn.name);
   },
-  // Called every time a source identifier node is visited.
-  visitSource: (source) => {
+  visitSource: (source: ESQLSource) => {
+    // Called every time a source identifier expression is visited.
     console.log('Source:', source.name);
   },
+  visitQuery: (node: ESQLAstQueryExpression) => {
+    // Called for every query node.
+  },
+  visitColumn: (node: ESQLColumn) => {
+    // Called for every column node.
+  },
+  visitLiteral: (node: ESQLLiteral) => {
+    // Called for every literal node.
+  },
+  visitListLiteral: (node: ESQLList) => {
+    // Called for every list literal node.
+  },
+  visitTimeIntervalLiteral: (node: ESQLTimeInterval) => {
+    // Called for every time interval literal node.
+  },
+  visitInlineCast: (node: ESQLInlineCast) => {
+    // Called for every inline cast node.
+  },
 });
 
-const { ast } = getAstAndSyntaxErrors('FROM source | STATS fn()');
-walker.walk(ast);
+const { root } = parse('FROM source | STATS fn()');
+
+walker.walk(root);
 ```
 
-Conceptual structure of an ES|QL AST:
-
-- A single ES|QL query is composed of one or more source commands and zero or
-  more transformation commands.
-- Each command is represented by a `command` node.
-- Each command contains a list expressions named in ES|QL AST as *AST Item*.
-  - `function` — function call expression.
-  - `option` — a list of expressions with a specific role in the command.
-  - `source` — s source identifier expression.
-  - `column` — a field identifier expression.
-  - `timeInterval` — a time interval expression.
-  - `list` — a list literal expression.
-  - `literal` — a literal expression.
-  - `inlineCast` — an inline cast expression.
+It is also possible to provide a single `visitAny` callback that is called for
+any node type that does not have a specific visitor.
+
+```ts
+import { Walker } from '@kbn/esql-ast';
+
+const walker = new Walker({
+  visitAny: (node: ESQLProperNode) => {
+    // Called for any node type that does not have a specific visitor.
+  },
+});
+
+walker.walk(ast);
+```
+
+
+## High-level API
+
+There are a few high-level utility functions that are implemented on top of the
+low-level API, for your convenience:
+
+- `Walker.walk` — Walks the AST and calls the appropriate visitor functions.
+- `Walker.commands` — Walks the AST and extracts all command statements.
+- `Walker.params` — Walks the AST and extracts all parameter literals.
+- `Walker.find` — Finds and returns the first node that matches the search criteria.
+- `Walker.findAll` — Finds and returns all nodes that match the search criteria.
+- `Walker.match` — Matches a single node against a template object.
+- `Walker.matchAll` — Matches all nodes against a template object.
+- `Walker.findFunction` — Finds the first function that matches the predicate.
+- `Walker.hasFunction` — Searches for at least one occurrence of a function or expression in the AST.
+- `Walker.visitComments` — Visits all comments in the AST.
+
+The `Walker.walk()` method is simply syntactic sugar around the low-level
+`new Walker().walk()` method.
+
+The `Walker.commands()` method returns a list of all commands. This also
+includes nested commands, once they become supported in ES|QL.
+
+The `Walker.params()` method collects all param literals, such as the unnamed
+`?` param, named `?param` params, or ordered `?1` params.
+
+The `Walker.find()` and `Walker.findAll()` methods are used to search for nodes
+in the AST that match specific criteria. The criteria are specified using a
+predicate function.
+
+The `Walker.match()` and `Walker.matchAll()` methods are also used to search for
+nodes in the AST, but unlike `find` and `findAll`, they use a template object
+to match the nodes.
+
+`Walker.findFunction()` is a simple utility to find the first function that
+matches a predicate. `Walker.hasFunction()` returns `true` if at least one
+function or expression in the AST matches the predicate.
+
+The `Walker.visitComments()` method is used to visit all comments in the AST.
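For example, comment visiting might look like the sketch below. This assumes the `parse` API described in the parser README and that the callback receives a comment node with a `text` property; the exact callback signature is an assumption:

```typescript
import { parse, Walker } from '@kbn/esql-ast';

// Parse with formatting enabled so comments are collected.
const { root } = parse('FROM index /* checkpoint */ | LIMIT 10', {
  withFormatting: true,
});

// Log the text of every comment found in the query.
Walker.visitComments(root, (comment) => {
  console.log(comment.text);
});
```
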
+You specify a callback that is called for each comment node. From 7aa64b6ed59488ab10a5136199b69de0c86668af Mon Sep 17 00:00:00 2001 From: Julia Rechkunova Date: Mon, 30 Sep 2024 18:20:21 +0200 Subject: [PATCH 2/4] [OneDiscover] Add EBT event to track field usage (#193996) - Closes https://github.com/elastic/kibana/issues/186156 - Closes https://github.com/elastic/kibana/issues/189454 ## Summary This PR adds new EBT event type `discover_field_usage` which we use for tracking adding and removing grid columns and adding filters via +/-/exists buttons. Properties of the added events consist of: `eventType`: `dataTableSelection`, `dataTableRemoval`, or `filterAddition` `fieldName`: name of the field if it's from ECS schema `filterOperation`: `+`, `-`, or `_exists_` Screenshot 2024-09-25 at 17 51 27 ## Testing Enable "Usage collection" global setting. Navigate to Discover and observe `kibana-browser` requests in Network tab. ### Checklist - [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios --- .../discover/public/__mocks__/services.ts | 2 + .../application/context/context_app.test.tsx | 1 + .../application/context/context_app.tsx | 31 +- .../components/layout/discover_documents.tsx | 32 +- .../components/layout/discover_layout.tsx | 40 ++- src/plugins/discover/public/build_services.ts | 10 +- .../context_awareness/__mocks__/index.tsx | 8 +- .../profiles_manager.test.ts | 4 +- .../context_awareness/profiles_manager.ts | 10 +- src/plugins/discover/public/plugin.tsx | 86 +++-- .../discover_ebt_context_manager.test.ts | 95 ----- .../services/discover_ebt_context_manager.ts | 75 ---- .../services/discover_ebt_manager.test.ts | 242 +++++++++++++ .../public/services/discover_ebt_manager.ts | 219 ++++++++++++ .../context_awareness/_data_source_profile.ts | 101 +----- .../discover/context_awareness/_telemetry.ts | 326 ++++++++++++++++++ 
.../apps/discover/context_awareness/index.ts | 1 + 17 files changed, 946 insertions(+), 337 deletions(-) delete mode 100644 src/plugins/discover/public/services/discover_ebt_context_manager.test.ts delete mode 100644 src/plugins/discover/public/services/discover_ebt_context_manager.ts create mode 100644 src/plugins/discover/public/services/discover_ebt_manager.test.ts create mode 100644 src/plugins/discover/public/services/discover_ebt_manager.ts create mode 100644 test/functional/apps/discover/context_awareness/_telemetry.ts diff --git a/src/plugins/discover/public/__mocks__/services.ts b/src/plugins/discover/public/__mocks__/services.ts index 3d78239558f3e..f00d105444630 100644 --- a/src/plugins/discover/public/__mocks__/services.ts +++ b/src/plugins/discover/public/__mocks__/services.ts @@ -45,6 +45,7 @@ import { SearchResponse } from '@elastic/elasticsearch/lib/api/types'; import { urlTrackerMock } from './url_tracker.mock'; import { createElement } from 'react'; import { createContextAwarenessMocks } from '../context_awareness/__mocks__'; +import { DiscoverEBTManager } from '../services/discover_ebt_manager'; export function createDiscoverServicesMock(): DiscoverServices { const dataPlugin = dataPluginMock.createStartContract(); @@ -245,6 +246,7 @@ export function createDiscoverServicesMock(): DiscoverServices { singleDocLocator: { getRedirectUrl: jest.fn(() => '') }, urlTracker: urlTrackerMock, profilesManager: profilesManagerMock, + ebtManager: new DiscoverEBTManager(), setHeaderActionMenu: jest.fn(), } as unknown as DiscoverServices; } diff --git a/src/plugins/discover/public/application/context/context_app.test.tsx b/src/plugins/discover/public/application/context/context_app.test.tsx index 9c77d1e40bbb2..7a99194cad575 100644 --- a/src/plugins/discover/public/application/context/context_app.test.tsx +++ b/src/plugins/discover/public/application/context/context_app.test.tsx @@ -72,6 +72,7 @@ describe('ContextApp test', () => { contextLocator: { 
getRedirectUrl: jest.fn(() => '') }, singleDocLocator: { getRedirectUrl: jest.fn(() => '') }, profilesManager: discoverServices.profilesManager, + ebtManager: discoverServices.ebtManager, timefilter: discoverServices.timefilter, uiActions: discoverServices.uiActions, } as unknown as DiscoverServices; diff --git a/src/plugins/discover/public/application/context/context_app.tsx b/src/plugins/discover/public/application/context/context_app.tsx index e0dfa985b594e..b0fc1342a8f72 100644 --- a/src/plugins/discover/public/application/context/context_app.tsx +++ b/src/plugins/discover/public/application/context/context_app.tsx @@ -56,6 +56,8 @@ export const ContextApp = ({ dataView, anchorId, referrer }: ContextAppProps) => navigation, filterManager, core, + ebtManager, + fieldsMetadata, } = services; const isLegacy = useMemo(() => uiSettings.get(DOC_TABLE_LEGACY), [uiSettings]); @@ -199,15 +201,36 @@ export const ContextApp = ({ dataView, anchorId, referrer }: ContextAppProps) => ); const addFilter = useCallback( - async (field: DataViewField | string, values: unknown, operation: string) => { + async (field: DataViewField | string, values: unknown, operation: '+' | '-') => { const newFilters = generateFilters(filterManager, field, values, operation, dataView); filterManager.addFilters(newFilters); if (dataViews) { const fieldName = typeof field === 'string' ? field : field.name; await popularizeField(dataView, fieldName, dataViews, capabilities); + void ebtManager.trackFilterAddition({ + fieldName: fieldName === '_exists_' ? String(values) : fieldName, + filterOperation: fieldName === '_exists_' ? 
'_exists_' : operation, + fieldsMetadata, + }); } }, - [filterManager, dataViews, dataView, capabilities] + [filterManager, dataViews, dataView, capabilities, ebtManager, fieldsMetadata] + ); + + const onAddColumnWithTracking = useCallback( + (columnName: string) => { + onAddColumn(columnName); + void ebtManager.trackDataTableSelection({ fieldName: columnName, fieldsMetadata }); + }, + [onAddColumn, ebtManager, fieldsMetadata] + ); + + const onRemoveColumnWithTracking = useCallback( + (columnName: string) => { + onRemoveColumn(columnName); + void ebtManager.trackDataTableRemoval({ fieldName: columnName, fieldsMetadata }); + }, + [onRemoveColumn, ebtManager, fieldsMetadata] ); const TopNavMenu = navigation.ui.AggregateQueryTopNavMenu; @@ -271,8 +294,8 @@ export const ContextApp = ({ dataView, anchorId, referrer }: ContextAppProps) => isLegacy={isLegacy} columns={columns} grid={appState.grid} - onAddColumn={onAddColumn} - onRemoveColumn={onRemoveColumn} + onAddColumn={onAddColumnWithTracking} + onRemoveColumn={onRemoveColumnWithTracking} onSetColumns={onSetColumns} predecessorCount={appState.predecessorCount} successorCount={appState.successorCount} diff --git a/src/plugins/discover/public/application/main/components/layout/discover_documents.tsx b/src/plugins/discover/public/application/main/components/layout/discover_documents.tsx index 2fe2a4f5a8f93..77befc4dc334f 100644 --- a/src/plugins/discover/public/application/main/components/layout/discover_documents.tsx +++ b/src/plugins/discover/public/application/main/components/layout/discover_documents.tsx @@ -117,7 +117,7 @@ function DiscoverDocumentsComponent({ const services = useDiscoverServices(); const documents$ = stateContainer.dataState.data$.documents$; const savedSearch = useSavedSearchInitial(); - const { dataViews, capabilities, uiSettings, uiActions } = services; + const { dataViews, capabilities, uiSettings, uiActions, ebtManager, fieldsMetadata } = services; const [ dataSource, query, @@ -200,6 +200,22 
@@ function DiscoverDocumentsComponent({ settings: grid, }); + const onAddColumnWithTracking = useCallback( + (columnName: string) => { + onAddColumn(columnName); + void ebtManager.trackDataTableSelection({ fieldName: columnName, fieldsMetadata }); + }, + [onAddColumn, ebtManager, fieldsMetadata] + ); + + const onRemoveColumnWithTracking = useCallback( + (columnName: string) => { + onRemoveColumn(columnName); + void ebtManager.trackDataTableRemoval({ fieldName: columnName, fieldsMetadata }); + }, + [onRemoveColumn, ebtManager, fieldsMetadata] + ); + const setExpandedDoc = useCallback( (doc: DataTableRecord | undefined) => { stateContainer.internalState.transitions.setExpandedDoc(doc); @@ -299,14 +315,22 @@ function DiscoverDocumentsComponent({ columnsMeta={customColumnsMeta} savedSearchId={savedSearch.id} onFilter={onAddFilter} - onRemoveColumn={onRemoveColumn} - onAddColumn={onAddColumn} + onRemoveColumn={onRemoveColumnWithTracking} + onAddColumn={onAddColumnWithTracking} onClose={() => setExpandedDoc(undefined)} setExpandedDoc={setExpandedDoc} query={query} /> ), - [dataView, onAddColumn, onAddFilter, onRemoveColumn, query, savedSearch.id, setExpandedDoc] + [ + dataView, + onAddColumnWithTracking, + onAddFilter, + onRemoveColumnWithTracking, + query, + savedSearch.id, + setExpandedDoc, + ] ); const configRowHeight = uiSettings.get(ROW_HEIGHT_OPTION); diff --git a/src/plugins/discover/public/application/main/components/layout/discover_layout.tsx b/src/plugins/discover/public/application/main/components/layout/discover_layout.tsx index 49e645e3f2206..bc9cad72a5eb6 100644 --- a/src/plugins/discover/public/application/main/components/layout/discover_layout.tsx +++ b/src/plugins/discover/public/application/main/components/layout/discover_layout.tsx @@ -78,6 +78,8 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { spaces, observabilityAIAssistant, dataVisualizer: dataVisualizerService, + ebtManager, + fieldsMetadata, } = useDiscoverServices(); 
const pageBackgroundColor = useEuiBackgroundColor('plain'); const globalQueryState = data.query.getState(); @@ -154,6 +156,22 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { settings: grid, }); + const onAddColumnWithTracking = useCallback( + (columnName: string) => { + onAddColumn(columnName); + void ebtManager.trackDataTableSelection({ fieldName: columnName, fieldsMetadata }); + }, + [onAddColumn, ebtManager, fieldsMetadata] + ); + + const onRemoveColumnWithTracking = useCallback( + (columnName: string) => { + onRemoveColumn(columnName); + void ebtManager.trackDataTableRemoval({ fieldName: columnName, fieldsMetadata }); + }, + [onRemoveColumn, ebtManager, fieldsMetadata] + ); + // The assistant is getting the state from the url correctly // expect from the index pattern where we have only the dataview id useEffect(() => { @@ -175,9 +193,14 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { if (trackUiMetric) { trackUiMetric(METRIC_TYPE.CLICK, 'filter_added'); } + void ebtManager.trackFilterAddition({ + fieldName: fieldName === '_exists_' ? String(values) : fieldName, + filterOperation: fieldName === '_exists_' ? '_exists_' : operation, + fieldsMetadata, + }); return filterManager.addFilters(newFilters); }, - [filterManager, dataView, dataViews, trackUiMetric, capabilities] + [filterManager, dataView, dataViews, trackUiMetric, capabilities, ebtManager, fieldsMetadata] ); const getOperator = (fieldName: string, values: unknown, operation: '+' | '-') => { @@ -222,8 +245,13 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { if (trackUiMetric) { trackUiMetric(METRIC_TYPE.CLICK, 'esql_filter_added'); } + void ebtManager.trackFilterAddition({ + fieldName: fieldName === '_exists_' ? String(values) : fieldName, + filterOperation: fieldName === '_exists_' ? 
'_exists_' : operation, + fieldsMetadata, + }); }, - [data.query.queryString, query, trackUiMetric] + [data.query.queryString, query, trackUiMetric, ebtManager, fieldsMetadata] ); const onFilter = isEsqlMode ? onPopulateWhereClause : onAddFilter; @@ -274,8 +302,8 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { return undefined; } - return () => onAddColumn(draggingFieldName); - }, [onAddColumn, draggingFieldName, currentColumns]); + return () => onAddColumnWithTracking(draggingFieldName); + }, [onAddColumnWithTracking, draggingFieldName, currentColumns]); const [sidebarToggleState$] = useState>( () => new BehaviorSubject({ isCollapsed: false, toggle: () => {} }) @@ -396,10 +424,10 @@ export function DiscoverLayout({ stateContainer }: DiscoverLayoutProps) { sidebarPanel={ { const { usageCollection } = plugins; @@ -223,7 +223,7 @@ export const buildServices = memoize( noDataPage: plugins.noDataPage, observabilityAIAssistant: plugins.observabilityAIAssistant, profilesManager, - ebtContextManager, + ebtManager, fieldsMetadata: plugins.fieldsMetadata, logsDataAccess: plugins.logsDataAccess, }; diff --git a/src/plugins/discover/public/context_awareness/__mocks__/index.tsx b/src/plugins/discover/public/context_awareness/__mocks__/index.tsx index a15b7aa26a8a0..153d401cc980a 100644 --- a/src/plugins/discover/public/context_awareness/__mocks__/index.tsx +++ b/src/plugins/discover/public/context_awareness/__mocks__/index.tsx @@ -23,7 +23,7 @@ import { } from '../profiles'; import { ProfileProviderServices } from '../profile_providers/profile_provider_services'; import { ProfilesManager } from '../profiles_manager'; -import { DiscoverEBTContextManager } from '../../services/discover_ebt_context_manager'; +import { DiscoverEBTManager } from '../../services/discover_ebt_manager'; import { createLogsContextServiceMock } from '@kbn/discover-utils/src/__mocks__'; export const createContextAwarenessMocks = ({ @@ -152,12 +152,12 @@ export const 
createContextAwarenessMocks = ({ documentProfileServiceMock.registerProvider(documentProfileProviderMock); } - const ebtContextManagerMock = new DiscoverEBTContextManager(); + const ebtManagerMock = new DiscoverEBTManager(); const profilesManagerMock = new ProfilesManager( rootProfileServiceMock, dataSourceProfileServiceMock, documentProfileServiceMock, - ebtContextManagerMock + ebtManagerMock ); const profileProviderServices = createProfileProviderServicesMock(); @@ -173,7 +173,7 @@ export const createContextAwarenessMocks = ({ contextRecordMock2, profilesManagerMock, profileProviderServices, - ebtContextManagerMock, + ebtManagerMock, }; }; diff --git a/src/plugins/discover/public/context_awareness/profiles_manager.test.ts b/src/plugins/discover/public/context_awareness/profiles_manager.test.ts index 87965edbe7488..da5ad8b56dcf3 100644 --- a/src/plugins/discover/public/context_awareness/profiles_manager.test.ts +++ b/src/plugins/discover/public/context_awareness/profiles_manager.test.ts @@ -21,7 +21,7 @@ describe('ProfilesManager', () => { beforeEach(() => { jest.clearAllMocks(); mocks = createContextAwarenessMocks(); - jest.spyOn(mocks.ebtContextManagerMock, 'updateProfilesContextWith'); + jest.spyOn(mocks.ebtManagerMock, 'updateProfilesContextWith'); }); it('should return default profiles', () => { @@ -62,7 +62,7 @@ describe('ProfilesManager', () => { mocks.documentProfileProviderMock.profile, ]); - expect(mocks.ebtContextManagerMock.updateProfilesContextWith).toHaveBeenCalledWith([ + expect(mocks.ebtManagerMock.updateProfilesContextWith).toHaveBeenCalledWith([ 'root-profile', 'data-source-profile', ]); diff --git a/src/plugins/discover/public/context_awareness/profiles_manager.ts b/src/plugins/discover/public/context_awareness/profiles_manager.ts index 2c8b1c7d16cb0..6b7bef5e02294 100644 --- a/src/plugins/discover/public/context_awareness/profiles_manager.ts +++ b/src/plugins/discover/public/context_awareness/profiles_manager.ts @@ -25,7 +25,7 @@ import type { 
DocumentContext, } from './profiles'; import type { ContextWithProfileId } from './profile_service'; -import { DiscoverEBTContextManager } from '../services/discover_ebt_context_manager'; +import { DiscoverEBTManager } from '../services/discover_ebt_manager'; interface SerializedRootProfileParams { solutionNavId: RootProfileProviderParams['solutionNavId']; @@ -53,7 +53,7 @@ export interface GetProfilesOptions { export class ProfilesManager { private readonly rootContext$: BehaviorSubject>; private readonly dataSourceContext$: BehaviorSubject>; - private readonly ebtContextManager: DiscoverEBTContextManager; + private readonly ebtManager: DiscoverEBTManager; private prevRootProfileParams?: SerializedRootProfileParams; private prevDataSourceProfileParams?: SerializedDataSourceProfileParams; @@ -64,11 +64,11 @@ export class ProfilesManager { private readonly rootProfileService: RootProfileService, private readonly dataSourceProfileService: DataSourceProfileService, private readonly documentProfileService: DocumentProfileService, - ebtContextManager: DiscoverEBTContextManager + ebtManager: DiscoverEBTManager ) { this.rootContext$ = new BehaviorSubject(rootProfileService.defaultContext); this.dataSourceContext$ = new BehaviorSubject(dataSourceProfileService.defaultContext); - this.ebtContextManager = ebtContextManager; + this.ebtManager = ebtManager; } /** @@ -206,7 +206,7 @@ export class ProfilesManager { private trackActiveProfiles(rootContextProfileId: string, dataSourceContextProfileId: string) { const dscProfiles = [rootContextProfileId, dataSourceContextProfileId]; - this.ebtContextManager.updateProfilesContextWith(dscProfiles); + this.ebtManager.updateProfilesContextWith(dscProfiles); } } diff --git a/src/plugins/discover/public/plugin.tsx b/src/plugins/discover/public/plugin.tsx index e6430f82c62fe..dbbcc90a7d451 100644 --- a/src/plugins/discover/public/plugin.tsx +++ b/src/plugins/discover/public/plugin.tsx @@ -59,7 +59,7 @@ import { RootProfileService } from 
'./context_awareness/profiles/root_profile'; import { DataSourceProfileService } from './context_awareness/profiles/data_source_profile'; import { DocumentProfileService } from './context_awareness/profiles/document_profile'; import { ProfilesManager } from './context_awareness/profiles_manager'; -import { DiscoverEBTContextManager } from './services/discover_ebt_context_manager'; +import { DiscoverEBTManager } from './services/discover_ebt_manager'; /** * Contains Discover, one of the oldest parts of Kibana @@ -149,8 +149,12 @@ export class DiscoverPlugin this.urlTracker = { setTrackedUrl, restorePreviousUrl, setTrackingEnabled }; this.stopUrlTracking = stopUrlTracker; - const ebtContextManager = new DiscoverEBTContextManager(); - ebtContextManager.initialize({ core }); + const ebtManager = new DiscoverEBTManager(); + ebtManager.initialize({ + core, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: true, + }); core.application.register({ id: PLUGIN_ID, @@ -176,7 +180,7 @@ export class DiscoverPlugin window.dispatchEvent(new HashChangeEvent('hashchange')); }); - ebtContextManager.enable(); + ebtManager.enableContext(); const services = buildServices({ core: coreStart, @@ -188,12 +192,12 @@ export class DiscoverPlugin history: this.historyService.getHistory(), scopedHistory: this.scopedHistory, urlTracker: this.urlTracker!, - profilesManager: await this.createProfilesManager( - coreStart, - discoverStartPlugins, - ebtContextManager - ), - ebtContextManager, + profilesManager: await this.createProfilesManager({ + core: coreStart, + plugins: discoverStartPlugins, + ebtManager, + }), + ebtManager, setHeaderActionMenu: params.setHeaderActionMenu, }); @@ -226,7 +230,7 @@ export class DiscoverPlugin }); return () => { - ebtContextManager.disableAndReset(); + ebtManager.disableAndResetContext(); unlistenParentHistory(); unmount(); appUnMounted(); @@ -296,11 +300,12 @@ export class DiscoverPlugin } const getDiscoverServicesInternal = () => { + const 
ebtManager = new DiscoverEBTManager(); // It is not initialized outside of Discover return this.getDiscoverServices( core, plugins, - this.createEmptyProfilesManager(), - new DiscoverEBTContextManager() // it's not enabled outside of Discover + this.createEmptyProfilesManager({ ebtManager }), + ebtManager ); }; @@ -326,11 +331,15 @@ export class DiscoverPlugin return { rootProfileService, dataSourceProfileService, documentProfileService }; } - private createProfilesManager = async ( - core: CoreStart, - plugins: DiscoverStartPlugins, - ebtContextManager: DiscoverEBTContextManager - ) => { + private async createProfilesManager({ + core, + plugins, + ebtManager, + }: { + core: CoreStart; + plugins: DiscoverStartPlugins; + ebtManager: DiscoverEBTManager; + }) { const { registerProfileProviders } = await import('./context_awareness/profile_providers'); const { rootProfileService, dataSourceProfileService, documentProfileService } = this.createProfileServices(); @@ -341,7 +350,7 @@ export class DiscoverPlugin rootProfileService, dataSourceProfileService, documentProfileService, - ebtContextManager + ebtManager ); await registerProfileProviders({ @@ -349,21 +358,18 @@ export class DiscoverPlugin dataSourceProfileService, documentProfileService, enabledExperimentalProfileIds, - services: this.getDiscoverServices(core, plugins, profilesManager, ebtContextManager), + services: this.getDiscoverServices(core, plugins, profilesManager, ebtManager), }); return profilesManager; - }; - - private createEmptyProfilesManager() { - const { rootProfileService, dataSourceProfileService, documentProfileService } = - this.createProfileServices(); + } + private createEmptyProfilesManager({ ebtManager }: { ebtManager: DiscoverEBTManager }) { return new ProfilesManager( - rootProfileService, - dataSourceProfileService, - documentProfileService, - new DiscoverEBTContextManager() // it's not enabled outside of Discover + new RootProfileService(), + new DataSourceProfileService(), + new 
DocumentProfileService(), + ebtManager ); } @@ -371,7 +377,7 @@ export class DiscoverPlugin core: CoreStart, plugins: DiscoverStartPlugins, profilesManager: ProfilesManager, - ebtContextManager: DiscoverEBTContextManager + ebtManager: DiscoverEBTManager ) => { return buildServices({ core, @@ -383,11 +389,13 @@ export class DiscoverPlugin history: this.historyService.getHistory(), urlTracker: this.urlTracker!, profilesManager, - ebtContextManager, + ebtManager, }); }; private registerEmbeddable(core: CoreSetup, plugins: DiscoverSetupPlugins) { + const ebtManager = new DiscoverEBTManager(); // It is not initialized outside of Discover + const getStartServices = async () => { const [coreStart, deps] = await core.getStartServices(); return { @@ -396,16 +404,20 @@ export class DiscoverPlugin }; }; - const getDiscoverServicesInternal = async () => { + const getDiscoverServicesForEmbeddable = async () => { const [coreStart, deps] = await core.getStartServices(); - const ebtContextManager = new DiscoverEBTContextManager(); // it's not enabled outside of Discover - const profilesManager = await this.createProfilesManager(coreStart, deps, ebtContextManager); - return this.getDiscoverServices(coreStart, deps, profilesManager, ebtContextManager); + + const profilesManager = await this.createProfilesManager({ + core: coreStart, + plugins: deps, + ebtManager, + }); + return this.getDiscoverServices(coreStart, deps, profilesManager, ebtManager); }; plugins.embeddable.registerReactEmbeddableSavedObject({ onAdd: async (container, savedObject) => { - const services = await getDiscoverServicesInternal(); + const services = await getDiscoverServicesForEmbeddable(); const initialState = await deserializeState({ serializedState: { rawState: { savedObjectId: savedObject.id }, @@ -429,7 +441,7 @@ export class DiscoverPlugin plugins.embeddable.registerReactEmbeddableFactory(SEARCH_EMBEDDABLE_TYPE, async () => { const [startServices, discoverServices, { getSearchEmbeddableFactory }] = await 
Promise.all([ getStartServices(), - getDiscoverServicesInternal(), + getDiscoverServicesForEmbeddable(), import('./embeddable/get_search_embeddable_factory'), ]); diff --git a/src/plugins/discover/public/services/discover_ebt_context_manager.test.ts b/src/plugins/discover/public/services/discover_ebt_context_manager.test.ts deleted file mode 100644 index 3b2836325b671..0000000000000 --- a/src/plugins/discover/public/services/discover_ebt_context_manager.test.ts +++ /dev/null @@ -1,95 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the "Elastic License - * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side - * Public License v 1"; you may not use this file except in compliance with, at - * your election, the "Elastic License 2.0", the "GNU Affero General Public - * License v3.0 only", or the "Server Side Public License, v 1". - */ - -import { BehaviorSubject } from 'rxjs'; -import { coreMock } from '@kbn/core/public/mocks'; -import { DiscoverEBTContextManager } from './discover_ebt_context_manager'; - -const coreSetupMock = coreMock.createSetup(); - -describe('DiscoverEBTContextManager', () => { - let discoverEBTContextManager: DiscoverEBTContextManager; - - beforeEach(() => { - discoverEBTContextManager = new DiscoverEBTContextManager(); - }); - - describe('register', () => { - it('should register the context provider', () => { - discoverEBTContextManager.initialize({ core: coreSetupMock }); - - expect(coreSetupMock.analytics.registerContextProvider).toHaveBeenCalledWith({ - name: 'discover_context', - context$: expect.any(BehaviorSubject), - schema: { - discoverProfiles: { - type: 'array', - items: { - type: 'keyword', - _meta: { - description: 'List of active Discover context awareness profiles', - }, - }, - }, - }, - }); - }); - }); - - describe('updateProfilesWith', () => { - it('should update the profiles with the provided props', 
() => { - const dscProfiles = ['profile1', 'profile2']; - const dscProfiles2 = ['profile21', 'profile22']; - discoverEBTContextManager.initialize({ core: coreSetupMock }); - discoverEBTContextManager.enable(); - - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); - - discoverEBTContextManager.updateProfilesContextWith(dscProfiles2); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles2); - }); - - it('should not update the profiles if profile list did not change', () => { - const dscProfiles = ['profile1', 'profile2']; - const dscProfiles2 = ['profile1', 'profile2']; - discoverEBTContextManager.initialize({ core: coreSetupMock }); - discoverEBTContextManager.enable(); - - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); - - discoverEBTContextManager.updateProfilesContextWith(dscProfiles2); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); - }); - - it('should not update the profiles if not enabled yet', () => { - const dscProfiles = ['profile1', 'profile2']; - discoverEBTContextManager.initialize({ core: coreSetupMock }); - - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); - }); - - it('should not update the profiles after resetting unless enabled again', () => { - const dscProfiles = ['profile1', 'profile2']; - discoverEBTContextManager.initialize({ core: coreSetupMock }); - discoverEBTContextManager.enable(); - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); - discoverEBTContextManager.disableAndReset(); - expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - 
expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); - discoverEBTContextManager.enable(); - discoverEBTContextManager.updateProfilesContextWith(dscProfiles); - expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); - }); - }); -}); diff --git a/src/plugins/discover/public/services/discover_ebt_context_manager.ts b/src/plugins/discover/public/services/discover_ebt_context_manager.ts deleted file mode 100644 index 12ea918c495d9..0000000000000 --- a/src/plugins/discover/public/services/discover_ebt_context_manager.ts +++ /dev/null @@ -1,75 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the "Elastic License - * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side - * Public License v 1"; you may not use this file except in compliance with, at - * your election, the "Elastic License 2.0", the "GNU Affero General Public - * License v3.0 only", or the "Server Side Public License, v 1". 
- */ - -import { BehaviorSubject } from 'rxjs'; -import { isEqual } from 'lodash'; -import type { CoreSetup } from '@kbn/core-lifecycle-browser'; - -export interface DiscoverEBTContextProps { - discoverProfiles: string[]; // Discover Context Awareness Profiles -} -export type DiscoverEBTContext = BehaviorSubject; - -export class DiscoverEBTContextManager { - private isEnabled: boolean = false; - private ebtContext$: DiscoverEBTContext | undefined; - - constructor() {} - - // https://docs.elastic.dev/telemetry/collection/event-based-telemetry - public initialize({ core }: { core: CoreSetup }) { - const context$ = new BehaviorSubject({ - discoverProfiles: [], - }); - - core.analytics.registerContextProvider({ - name: 'discover_context', - context$, - schema: { - discoverProfiles: { - type: 'array', - items: { - type: 'keyword', - _meta: { - description: 'List of active Discover context awareness profiles', - }, - }, - }, - // If we decide to extend EBT context with more properties, we can do it here - }, - }); - - this.ebtContext$ = context$; - } - - public enable() { - this.isEnabled = true; - } - - public updateProfilesContextWith(discoverProfiles: DiscoverEBTContextProps['discoverProfiles']) { - if ( - this.isEnabled && - this.ebtContext$ && - !isEqual(this.ebtContext$.getValue().discoverProfiles, discoverProfiles) - ) { - this.ebtContext$.next({ - discoverProfiles, - }); - } - } - - public getProfilesContext() { - return this.ebtContext$?.getValue()?.discoverProfiles; - } - - public disableAndReset() { - this.updateProfilesContextWith([]); - this.isEnabled = false; - } -} diff --git a/src/plugins/discover/public/services/discover_ebt_manager.test.ts b/src/plugins/discover/public/services/discover_ebt_manager.test.ts new file mode 100644 index 0000000000000..0ed20dacdb0ce --- /dev/null +++ b/src/plugins/discover/public/services/discover_ebt_manager.test.ts @@ -0,0 +1,242 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. 
under one + * or more contributor license agreements. Licensed under the "Elastic License + * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side + * Public License v 1"; you may not use this file except in compliance with, at + * your election, the "Elastic License 2.0", the "GNU Affero General Public + * License v3.0 only", or the "Server Side Public License, v 1". + */ + +import { BehaviorSubject } from 'rxjs'; +import { coreMock } from '@kbn/core/public/mocks'; +import { DiscoverEBTManager } from './discover_ebt_manager'; +import { FieldsMetadataPublicStart } from '@kbn/fields-metadata-plugin/public'; + +describe('DiscoverEBTManager', () => { + let discoverEBTContextManager: DiscoverEBTManager; + + const coreSetupMock = coreMock.createSetup(); + + const fieldsMetadata = { + getClient: jest.fn().mockResolvedValue({ + find: jest.fn().mockResolvedValue({ + fields: { + test: { + short: 'test', + }, + }, + }), + }), + } as unknown as FieldsMetadataPublicStart; + + beforeEach(() => { + discoverEBTContextManager = new DiscoverEBTManager(); + }); + + describe('register', () => { + it('should register the context provider and custom events', () => { + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: true, + }); + + expect(coreSetupMock.analytics.registerContextProvider).toHaveBeenCalledWith({ + name: 'discover_context', + context$: expect.any(BehaviorSubject), + schema: { + discoverProfiles: { + type: 'array', + items: { + type: 'keyword', + _meta: { + description: 'List of active Discover context awareness profiles', + }, + }, + }, + }, + }); + + expect(coreSetupMock.analytics.registerEventType).toHaveBeenCalledWith({ + eventType: 'discover_field_usage', + schema: { + eventName: { + type: 'keyword', + _meta: { + description: + 'The name of the event that is tracked in the metrics i.e. 
dataTableSelection, dataTableRemoval', + }, + }, + fieldName: { + type: 'keyword', + _meta: { + description: "Field name if it's a part of ECS schema", + optional: true, + }, + }, + filterOperation: { + type: 'keyword', + _meta: { + description: "Operation type when a filter is added i.e. '+', '-', '_exists_'", + optional: true, + }, + }, + }, + }); + }); + }); + + describe('updateProfilesWith', () => { + it('should update the profiles with the provided props', () => { + const dscProfiles = ['profile1', 'profile2']; + const dscProfiles2 = ['profile21', 'profile22']; + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: false, + }); + discoverEBTContextManager.enableContext(); + + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); + + discoverEBTContextManager.updateProfilesContextWith(dscProfiles2); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles2); + }); + + it('should not update the profiles if profile list did not change', () => { + const dscProfiles = ['profile1', 'profile2']; + const dscProfiles2 = ['profile1', 'profile2']; + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: false, + }); + discoverEBTContextManager.enableContext(); + + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); + + discoverEBTContextManager.updateProfilesContextWith(dscProfiles2); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); + }); + + it('should not update the profiles if not enabled yet', () => { + const dscProfiles = ['profile1', 'profile2']; + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: false, + }); 
+ + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); + }); + + it('should not update the profiles after resetting unless enabled again', () => { + const dscProfiles = ['profile1', 'profile2']; + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: true, + shouldInitializeCustomEvents: false, + }); + discoverEBTContextManager.enableContext(); + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); + discoverEBTContextManager.disableAndResetContext(); + expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toEqual([]); + discoverEBTContextManager.enableContext(); + discoverEBTContextManager.updateProfilesContextWith(dscProfiles); + expect(discoverEBTContextManager.getProfilesContext()).toBe(dscProfiles); + }); + }); + + describe('trackFieldUsageEvent', () => { + it('should track the field usage when a field is added to the table', async () => { + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: false, + shouldInitializeCustomEvents: true, + }); + + await discoverEBTContextManager.trackDataTableSelection({ + fieldName: 'test', + fieldsMetadata, + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenCalledWith('discover_field_usage', { + eventName: 'dataTableSelection', + fieldName: 'test', + }); + + await discoverEBTContextManager.trackDataTableSelection({ + fieldName: 'test2', + fieldsMetadata, + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenLastCalledWith('discover_field_usage', { + eventName: 'dataTableSelection', // non-ECS fields would not be included in properties + }); + }); + + it('should track the field usage when a field is removed from 
the table', async () => { + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: false, + shouldInitializeCustomEvents: true, + }); + + await discoverEBTContextManager.trackDataTableRemoval({ + fieldName: 'test', + fieldsMetadata, + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenCalledWith('discover_field_usage', { + eventName: 'dataTableRemoval', + fieldName: 'test', + }); + + await discoverEBTContextManager.trackDataTableRemoval({ + fieldName: 'test2', + fieldsMetadata, + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenLastCalledWith('discover_field_usage', { + eventName: 'dataTableRemoval', // non-ECS fields would not be included in properties + }); + }); + + it('should track the field usage when a filter is created', async () => { + discoverEBTContextManager.initialize({ + core: coreSetupMock, + shouldInitializeCustomContext: false, + shouldInitializeCustomEvents: true, + }); + + await discoverEBTContextManager.trackFilterAddition({ + fieldName: 'test', + fieldsMetadata, + filterOperation: '+', + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenCalledWith('discover_field_usage', { + eventName: 'filterAddition', + fieldName: 'test', + filterOperation: '+', + }); + + await discoverEBTContextManager.trackFilterAddition({ + fieldName: 'test2', + fieldsMetadata, + filterOperation: '_exists_', + }); + + expect(coreSetupMock.analytics.reportEvent).toHaveBeenLastCalledWith('discover_field_usage', { + eventName: 'filterAddition', // non-ECS fields would not be included in properties + filterOperation: '_exists_', + }); + }); + }); +}); diff --git a/src/plugins/discover/public/services/discover_ebt_manager.ts b/src/plugins/discover/public/services/discover_ebt_manager.ts new file mode 100644 index 0000000000000..420eb6c244444 --- /dev/null +++ b/src/plugins/discover/public/services/discover_ebt_manager.ts @@ -0,0 +1,219 @@ +/* + * Copyright Elasticsearch B.V. 
and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the "Elastic License + * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side + * Public License v 1"; you may not use this file except in compliance with, at + * your election, the "Elastic License 2.0", the "GNU Affero General Public + * License v3.0 only", or the "Server Side Public License, v 1". + */ + +import { BehaviorSubject } from 'rxjs'; +import { isEqual } from 'lodash'; +import type { CoreSetup } from '@kbn/core-lifecycle-browser'; +import type { FieldsMetadataPublicStart } from '@kbn/fields-metadata-plugin/public'; + +const FIELD_USAGE_EVENT_TYPE = 'discover_field_usage'; +const FIELD_USAGE_EVENT_NAME = 'eventName'; +const FIELD_USAGE_FIELD_NAME = 'fieldName'; +const FIELD_USAGE_FILTER_OPERATION = 'filterOperation'; + +type FilterOperation = '+' | '-' | '_exists_'; + +export enum FieldUsageEventName { + dataTableSelection = 'dataTableSelection', + dataTableRemoval = 'dataTableRemoval', + filterAddition = 'filterAddition', +} +interface FieldUsageEventData { + [FIELD_USAGE_EVENT_NAME]: FieldUsageEventName; + [FIELD_USAGE_FIELD_NAME]?: string; + [FIELD_USAGE_FILTER_OPERATION]?: FilterOperation; +} + +export interface DiscoverEBTContextProps { + discoverProfiles: string[]; // Discover Context Awareness Profiles +} +export type DiscoverEBTContext = BehaviorSubject; + +export class DiscoverEBTManager { + private isCustomContextEnabled: boolean = false; + private customContext$: DiscoverEBTContext | undefined; + private reportEvent: CoreSetup['analytics']['reportEvent'] | undefined; + + constructor() {} + + // https://docs.elastic.dev/telemetry/collection/event-based-telemetry + public initialize({ + core, + shouldInitializeCustomContext, + shouldInitializeCustomEvents, + }: { + core: CoreSetup; + shouldInitializeCustomContext: boolean; + shouldInitializeCustomEvents: boolean; + }) { + if (shouldInitializeCustomContext) { + // 
Register Discover specific context to be used in EBT + const context$ = new BehaviorSubject({ + discoverProfiles: [], + }); + core.analytics.registerContextProvider({ + name: 'discover_context', + context$, + schema: { + discoverProfiles: { + type: 'array', + items: { + type: 'keyword', + _meta: { + description: 'List of active Discover context awareness profiles', + }, + }, + }, + // If we decide to extend EBT context with more properties, we can do it here + }, + }); + this.customContext$ = context$; + } + + if (shouldInitializeCustomEvents) { + // Register Discover events to be used with EBT + core.analytics.registerEventType({ + eventType: FIELD_USAGE_EVENT_TYPE, + schema: { + [FIELD_USAGE_EVENT_NAME]: { + type: 'keyword', + _meta: { + description: + 'The name of the event that is tracked in the metrics i.e. dataTableSelection, dataTableRemoval', + }, + }, + [FIELD_USAGE_FIELD_NAME]: { + type: 'keyword', + _meta: { + description: "Field name if it's a part of ECS schema", + optional: true, + }, + }, + [FIELD_USAGE_FILTER_OPERATION]: { + type: 'keyword', + _meta: { + description: "Operation type when a filter is added i.e. 
'+', '-', '_exists_'", + optional: true, + }, + }, + }, + }); + this.reportEvent = core.analytics.reportEvent; + } + } + + public enableContext() { + this.isCustomContextEnabled = true; + } + + public disableAndResetContext() { + this.updateProfilesContextWith([]); + this.isCustomContextEnabled = false; + } + + public updateProfilesContextWith(discoverProfiles: DiscoverEBTContextProps['discoverProfiles']) { + if ( + this.isCustomContextEnabled && + this.customContext$ && + !isEqual(this.customContext$.getValue().discoverProfiles, discoverProfiles) + ) { + this.customContext$.next({ + discoverProfiles, + }); + } + } + + public getProfilesContext() { + return this.customContext$?.getValue()?.discoverProfiles; + } + + private async trackFieldUsageEvent({ + eventName, + fieldName, + filterOperation, + fieldsMetadata, + }: { + eventName: FieldUsageEventName; + fieldName: string; + filterOperation?: FilterOperation; + fieldsMetadata: FieldsMetadataPublicStart | undefined; + }) { + if (!this.reportEvent) { + return; + } + + const eventData: FieldUsageEventData = { + [FIELD_USAGE_EVENT_NAME]: eventName, + }; + + if (fieldsMetadata) { + const client = await fieldsMetadata.getClient(); + const { fields } = await client.find({ + attributes: ['short'], + fieldNames: [fieldName], + }); + + // excludes non ECS fields + if (fields[fieldName]?.short) { + eventData[FIELD_USAGE_FIELD_NAME] = fieldName; + } + } + + if (filterOperation) { + eventData[FIELD_USAGE_FILTER_OPERATION] = filterOperation; + } + + this.reportEvent(FIELD_USAGE_EVENT_TYPE, eventData); + } + + public async trackDataTableSelection({ + fieldName, + fieldsMetadata, + }: { + fieldName: string; + fieldsMetadata: FieldsMetadataPublicStart | undefined; + }) { + await this.trackFieldUsageEvent({ + eventName: FieldUsageEventName.dataTableSelection, + fieldName, + fieldsMetadata, + }); + } + + public async trackDataTableRemoval({ + fieldName, + fieldsMetadata, + }: { + fieldName: string; + fieldsMetadata: 
FieldsMetadataPublicStart | undefined; + }) { + await this.trackFieldUsageEvent({ + eventName: FieldUsageEventName.dataTableRemoval, + fieldName, + fieldsMetadata, + }); + } + + public async trackFilterAddition({ + fieldName, + fieldsMetadata, + filterOperation, + }: { + fieldName: string; + fieldsMetadata: FieldsMetadataPublicStart | undefined; + filterOperation: FilterOperation; + }) { + await this.trackFieldUsageEvent({ + eventName: FieldUsageEventName.filterAddition, + fieldName, + fieldsMetadata, + filterOperation, + }); + } +} diff --git a/test/functional/apps/discover/context_awareness/_data_source_profile.ts b/test/functional/apps/discover/context_awareness/_data_source_profile.ts index ecf4b2fb29c4c..35e3552afa655 100644 --- a/test/functional/apps/discover/context_awareness/_data_source_profile.ts +++ b/test/functional/apps/discover/context_awareness/_data_source_profile.ts @@ -12,115 +12,16 @@ import expect from '@kbn/expect'; import type { FtrProviderContext } from '../ftr_provider_context'; export default function ({ getService, getPageObjects }: FtrProviderContext) { - const { common, discover, unifiedFieldList, dashboard, header, timePicker } = getPageObjects([ + const { common, discover, unifiedFieldList } = getPageObjects([ 'common', 'discover', 'unifiedFieldList', - 'dashboard', - 'header', - 'timePicker', ]); const testSubjects = getService('testSubjects'); const dataViews = getService('dataViews'); const dataGrid = getService('dataGrid'); - const monacoEditor = getService('monacoEditor'); - const ebtUIHelper = getService('kibana_ebt_ui'); - const retry = getService('retry'); - const esArchiver = getService('esArchiver'); - const kibanaServer = getService('kibanaServer'); - const dashboardAddPanel = getService('dashboardAddPanel'); describe('data source profile', () => { - describe('telemetry', () => { - before(async () => { - await esArchiver.loadIfNeeded('test/functional/fixtures/es_archiver/logstash_functional'); - await 
kibanaServer.importExport.load('test/functional/fixtures/kbn_archiver/discover'); - }); - - after(async () => { - await kibanaServer.importExport.unload('test/functional/fixtures/kbn_archiver/discover'); - }); - - it('should set EBT context for telemetry events with default profile', async () => { - await common.navigateToApp('discover'); - await discover.selectTextBaseLang(); - await discover.waitUntilSearchingHasFinished(); - await monacoEditor.setCodeEditorValue('from my-example-* | sort @timestamp desc'); - await ebtUIHelper.setOptIn(true); - await testSubjects.click('querySubmitButton'); - await discover.waitUntilSearchingHasFinished(); - - const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { - eventTypes: ['performance_metric'], - withTimeoutMs: 500, - }); - - expect(events[events.length - 1].context.discoverProfiles).to.eql([ - 'example-root-profile', - 'default-data-source-profile', - ]); - }); - - it('should set EBT context for telemetry events when example profile and reset', async () => { - await common.navigateToApp('discover'); - await discover.selectTextBaseLang(); - await discover.waitUntilSearchingHasFinished(); - await monacoEditor.setCodeEditorValue('from my-example-logs | sort @timestamp desc'); - await ebtUIHelper.setOptIn(true); - await testSubjects.click('querySubmitButton'); - await discover.waitUntilSearchingHasFinished(); - - const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { - eventTypes: ['performance_metric'], - withTimeoutMs: 500, - }); - - expect(events[events.length - 1].context.discoverProfiles).to.eql([ - 'example-root-profile', - 'example-data-source-profile', - ]); - - // should reset the profiles when navigating away from Discover - await testSubjects.click('logo'); - await retry.waitFor('home page to open', async () => { - return (await testSubjects.getVisibleText('euiBreadcrumb')) === 'Home'; - }); - await testSubjects.click('addSampleData'); - - await retry.try(async () => { - const 
eventsAfter = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { - eventTypes: ['click'], - withTimeoutMs: 500, - }); - - expect(eventsAfter[eventsAfter.length - 1].context.discoverProfiles).to.eql([]); - }); - }); - - it('should not set EBT context for embeddables', async () => { - await dashboard.navigateToApp(); - await dashboard.gotoDashboardLandingPage(); - await dashboard.clickNewDashboard(); - await timePicker.setDefaultAbsoluteRange(); - await ebtUIHelper.setOptIn(true); - await dashboardAddPanel.addSavedSearch('A Saved Search'); - await header.waitUntilLoadingHasFinished(); - await dashboard.waitForRenderComplete(); - const rows = await dataGrid.getDocTableRows(); - expect(rows.length).to.be.above(0); - await testSubjects.click('dashboardEditorMenuButton'); - - const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { - eventTypes: ['click'], - withTimeoutMs: 500, - }); - - expect( - events.every((event) => !(event.context.discoverProfiles as string[])?.length) - ).to.be(true); - }); - }); - describe('ES|QL mode', () => { describe('cell renderers', () => { it('should render custom @timestamp but not custom log.level', async () => { diff --git a/test/functional/apps/discover/context_awareness/_telemetry.ts b/test/functional/apps/discover/context_awareness/_telemetry.ts new file mode 100644 index 0000000000000..587de698f9336 --- /dev/null +++ b/test/functional/apps/discover/context_awareness/_telemetry.ts @@ -0,0 +1,326 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the "Elastic License + * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side + * Public License v 1"; you may not use this file except in compliance with, at + * your election, the "Elastic License 2.0", the "GNU Affero General Public + * License v3.0 only", or the "Server Side Public License, v 1". 
+ */ + +import expect from '@kbn/expect'; +import type { FtrProviderContext } from '../ftr_provider_context'; + +export default function ({ getService, getPageObjects }: FtrProviderContext) { + const { common, discover, unifiedFieldList, dashboard, header, timePicker } = getPageObjects([ + 'common', + 'discover', + 'unifiedFieldList', + 'dashboard', + 'header', + 'timePicker', + ]); + const testSubjects = getService('testSubjects'); + const dataGrid = getService('dataGrid'); + const dataViews = getService('dataViews'); + const monacoEditor = getService('monacoEditor'); + const ebtUIHelper = getService('kibana_ebt_ui'); + const retry = getService('retry'); + const esArchiver = getService('esArchiver'); + const kibanaServer = getService('kibanaServer'); + const dashboardAddPanel = getService('dashboardAddPanel'); + + describe('telemetry', () => { + describe('context', () => { + before(async () => { + await esArchiver.loadIfNeeded('test/functional/fixtures/es_archiver/logstash_functional'); + await kibanaServer.importExport.load('test/functional/fixtures/kbn_archiver/discover'); + }); + + after(async () => { + await kibanaServer.importExport.unload('test/functional/fixtures/kbn_archiver/discover'); + }); + + it('should set EBT context for telemetry events with default profile', async () => { + await common.navigateToApp('discover'); + await discover.selectTextBaseLang(); + await discover.waitUntilSearchingHasFinished(); + await monacoEditor.setCodeEditorValue('from my-example-* | sort @timestamp desc'); + await ebtUIHelper.setOptIn(true); + await testSubjects.click('querySubmitButton'); + await discover.waitUntilSearchingHasFinished(); + + const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['performance_metric'], + withTimeoutMs: 500, + }); + + expect(events[events.length - 1].context.discoverProfiles).to.eql([ + 'example-root-profile', + 'default-data-source-profile', + ]); + }); + + it('should set EBT context for telemetry events 
when example profile and reset', async () => { + await common.navigateToApp('discover'); + await discover.selectTextBaseLang(); + await discover.waitUntilSearchingHasFinished(); + await monacoEditor.setCodeEditorValue('from my-example-logs | sort @timestamp desc'); + await ebtUIHelper.setOptIn(true); + await testSubjects.click('querySubmitButton'); + await discover.waitUntilSearchingHasFinished(); + + const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['performance_metric'], + withTimeoutMs: 500, + }); + + expect(events[events.length - 1].context.discoverProfiles).to.eql([ + 'example-root-profile', + 'example-data-source-profile', + ]); + + // should reset the profiles when navigating away from Discover + await testSubjects.click('logo'); + await retry.waitFor('home page to open', async () => { + return (await testSubjects.getVisibleText('euiBreadcrumb')) === 'Home'; + }); + await testSubjects.click('addSampleData'); + + await retry.try(async () => { + const eventsAfter = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['click'], + withTimeoutMs: 500, + }); + + expect(eventsAfter[eventsAfter.length - 1].context.discoverProfiles).to.eql([]); + }); + }); + + it('should not set EBT context for embeddables', async () => { + await dashboard.navigateToApp(); + await dashboard.gotoDashboardLandingPage(); + await dashboard.clickNewDashboard(); + await timePicker.setDefaultAbsoluteRange(); + await ebtUIHelper.setOptIn(true); + await dashboardAddPanel.addSavedSearch('A Saved Search'); + await header.waitUntilLoadingHasFinished(); + await dashboard.waitForRenderComplete(); + const rows = await dataGrid.getDocTableRows(); + expect(rows.length).to.be.above(0); + await testSubjects.click('dashboardEditorMenuButton'); + + const events = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['click'], + withTimeoutMs: 500, + }); + + expect( + events.length > 0 && + events.every((event) => 
!(event.context.discoverProfiles as string[])?.length) + ).to.be(true); + }); + }); + + describe('events', () => { + beforeEach(async () => { + await common.navigateToApp('discover'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + }); + + it('should track field usage when a field is added to the table', async () => { + await dataViews.switchToAndValidate('my-example-*'); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + await ebtUIHelper.setOptIn(true); + await unifiedFieldList.clickFieldListItemAdd('service.name'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + + const [event] = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event.properties).to.eql({ + eventName: 'dataTableSelection', + fieldName: 'service.name', + }); + + await unifiedFieldList.clickFieldListItemAdd('_score'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + + const [_, event2] = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event2.properties).to.eql({ + eventName: 'dataTableSelection', + }); + }); + + it('should track field usage when a field is removed from the table', async () => { + await dataViews.switchToAndValidate('my-example-logs'); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + await ebtUIHelper.setOptIn(true); + await unifiedFieldList.clickFieldListItemRemove('log.level'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + + const [event] = await 
ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event.properties).to.eql({ + eventName: 'dataTableRemoval', + fieldName: 'log.level', + }); + }); + + it('should track field usage when a filter is added', async () => { + await dataViews.switchToAndValidate('my-example-logs'); + await discover.waitUntilSearchingHasFinished(); + await ebtUIHelper.setOptIn(true); + await dataGrid.clickCellFilterForButtonExcludingControlColumns(0, 0); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + + const [event] = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event.properties).to.eql({ + eventName: 'filterAddition', + fieldName: '@timestamp', + filterOperation: '+', + }); + + await unifiedFieldList.clickFieldListExistsFilter('log.level'); + + const [_, event2] = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event2.properties).to.eql({ + eventName: 'filterAddition', + fieldName: 'log.level', + filterOperation: '_exists_', + }); + }); + + it('should track field usage for doc viewer too', async () => { + await dataViews.switchToAndValidate('my-example-logs'); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + await ebtUIHelper.setOptIn(true); + + await dataGrid.clickRowToggle(); + await discover.isShowingDocViewer(); + + // event 1 + await dataGrid.clickFieldActionInFlyout('service.name', 'toggleColumnButton'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + + // event 2 + await dataGrid.clickFieldActionInFlyout('log.level', 'toggleColumnButton'); + await header.waitUntilLoadingHasFinished(); + await 
discover.waitUntilSearchingHasFinished(); + + // event 3 + await dataGrid.clickFieldActionInFlyout('log.level', 'addFilterOutValueButton'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + + const [event1, event2, event3] = await ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event1.properties).to.eql({ + eventName: 'dataTableSelection', + fieldName: 'service.name', + }); + + expect(event2.properties).to.eql({ + eventName: 'dataTableRemoval', + fieldName: 'log.level', + }); + + expect(event3.properties).to.eql({ + eventName: 'filterAddition', + fieldName: 'log.level', + filterOperation: '-', + }); + }); + + it('should track field usage on surrounding documents page', async () => { + await dataViews.switchToAndValidate('my-example-logs'); + await discover.waitUntilSearchingHasFinished(); + await unifiedFieldList.waitUntilSidebarHasLoaded(); + + await dataGrid.clickRowToggle({ rowIndex: 1 }); + await discover.isShowingDocViewer(); + + const [, surroundingActionEl] = await dataGrid.getRowActions(); + await surroundingActionEl.click(); + await header.waitUntilLoadingHasFinished(); + await ebtUIHelper.setOptIn(true); + + await dataGrid.clickRowToggle({ rowIndex: 0 }); + await discover.isShowingDocViewer(); + + // event 1 + await dataGrid.clickFieldActionInFlyout('service.name', 'toggleColumnButton'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + + // event 2 + await dataGrid.clickFieldActionInFlyout('log.level', 'toggleColumnButton'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + + // event 3 + await dataGrid.clickFieldActionInFlyout('log.level', 'addFilterOutValueButton'); + await header.waitUntilLoadingHasFinished(); + await discover.waitUntilSearchingHasFinished(); + + const [event1, event2, event3] = await 
ebtUIHelper.getEvents(Number.MAX_SAFE_INTEGER, { + eventTypes: ['discover_field_usage'], + withTimeoutMs: 500, + }); + + expect(event1.properties).to.eql({ + eventName: 'dataTableSelection', + fieldName: 'service.name', + }); + + expect(event2.properties).to.eql({ + eventName: 'dataTableRemoval', + fieldName: 'log.level', + }); + + expect(event3.properties).to.eql({ + eventName: 'filterAddition', + fieldName: 'log.level', + filterOperation: '-', + }); + + expect(event3.context.discoverProfiles).to.eql([ + 'example-root-profile', + 'example-data-source-profile', + ]); + }); + }); + }); +} diff --git a/test/functional/apps/discover/context_awareness/index.ts b/test/functional/apps/discover/context_awareness/index.ts index 655f4460883d1..f937f38c741f9 100644 --- a/test/functional/apps/discover/context_awareness/index.ts +++ b/test/functional/apps/discover/context_awareness/index.ts @@ -38,6 +38,7 @@ export default function ({ getService, getPageObjects, loadTestFile }: FtrProvid loadTestFile(require.resolve('./_root_profile')); loadTestFile(require.resolve('./_data_source_profile')); + loadTestFile(require.resolve('./_telemetry')); loadTestFile(require.resolve('./extensions/_get_row_indicator_provider')); loadTestFile(require.resolve('./extensions/_get_row_additional_leading_controls')); loadTestFile(require.resolve('./extensions/_get_doc_viewer')); From f207c2c176ec6d96768f4fefec546596cce57463 Mon Sep 17 00:00:00 2001 From: Kurt Date: Mon, 30 Sep 2024 12:34:04 -0400 Subject: [PATCH 3/4] ESLint Rule to discourage hashes being created with unsafe algorithms (#190973) Closes https://github.com/elastic/kibana/issues/185601 ## Summary Using non-compliant algorithms with Node Cryptos createHash function will cause failures when running Kibana in FIPS mode. We want to discourage usages of such algorithms. 
--------- Co-authored-by: Sid Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com> Co-authored-by: Elastic Machine --- .../src/bundle_routes/utils.ts | 2 +- .../src/bootstrap/bootstrap_renderer.ts | 2 +- .../src/get_migration_hash.ts | 2 +- packages/kbn-es/src/install/install_source.ts | 4 +- packages/kbn-eslint-config/.eslintrc.js | 1 + packages/kbn-eslint-plugin-eslint/index.js | 1 + .../rules/no_unsafe_hash.js | 166 ++++++++++++++++++ .../rules/no_unsafe_hash.test.js | 142 +++++++++++++++ .../report_failures_to_file.ts | 2 +- .../kbn-optimizer/src/common/dll_manifest.ts | 2 +- .../server/rest_api_routes/internal/fields.ts | 2 +- .../server/routes/fullstory.ts | 2 +- .../common/plugins/cases/server/routes.ts | 2 +- 13 files changed, 320 insertions(+), 10 deletions(-) create mode 100644 packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.js create mode 100644 packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.test.js diff --git a/packages/core/apps/core-apps-server-internal/src/bundle_routes/utils.ts b/packages/core/apps/core-apps-server-internal/src/bundle_routes/utils.ts index 05a31f85a51cc..ee115cda6e5b8 100644 --- a/packages/core/apps/core-apps-server-internal/src/bundle_routes/utils.ts +++ b/packages/core/apps/core-apps-server-internal/src/bundle_routes/utils.ts @@ -13,7 +13,7 @@ import * as Rx from 'rxjs'; import { map, takeUntil } from 'rxjs'; export const generateFileHash = (fd: number): Promise => { - const hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash const read = createReadStream(null as any, { fd, start: 0, diff --git a/packages/core/rendering/core-rendering-server-internal/src/bootstrap/bootstrap_renderer.ts b/packages/core/rendering/core-rendering-server-internal/src/bootstrap/bootstrap_renderer.ts index 757862d1d3c6c..8aa0d2a6c0387 100644 --- a/packages/core/rendering/core-rendering-server-internal/src/bootstrap/bootstrap_renderer.ts +++ 
b/packages/core/rendering/core-rendering-server-internal/src/bootstrap/bootstrap_renderer.ts @@ -114,7 +114,7 @@ export const bootstrapRendererFactory: BootstrapRendererFactory = ({ publicPathMap, }); - const hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash hash.update(body); const etag = hash.digest('hex'); diff --git a/packages/core/test-helpers/core-test-helpers-so-type-serializer/src/get_migration_hash.ts b/packages/core/test-helpers/core-test-helpers-so-type-serializer/src/get_migration_hash.ts index 461188703b3aa..c65f6330e176b 100644 --- a/packages/core/test-helpers/core-test-helpers-so-type-serializer/src/get_migration_hash.ts +++ b/packages/core/test-helpers/core-test-helpers-so-type-serializer/src/get_migration_hash.ts @@ -16,7 +16,7 @@ type SavedObjectTypeMigrationHash = string; export const getMigrationHash = (soType: SavedObjectsType): SavedObjectTypeMigrationHash => { const migInfo = extractMigrationInfo(soType); - const hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash const hashParts = [ migInfo.name, diff --git a/packages/kbn-es/src/install/install_source.ts b/packages/kbn-es/src/install/install_source.ts index 7dfbe8d7bd5b3..244b349002829 100644 --- a/packages/kbn-es/src/install/install_source.ts +++ b/packages/kbn-es/src/install/install_source.ts @@ -84,7 +84,7 @@ async function sourceInfo(cwd: string, license: string, log: ToolingLog = defaul log.info('on %s at %s', chalk.bold(branch), chalk.bold(sha)); log.info('%s locally modified file(s)', chalk.bold(status.modified.length)); - const etag = crypto.createHash('md5').update(branch); + const etag = crypto.createHash('md5').update(branch); // eslint-disable-line @kbn/eslint/no_unsafe_hash etag.update(sha); // for changed files, use last modified times in hash calculation @@ -92,7 +92,7 @@ async function sourceInfo(cwd: string, license: string, log: ToolingLog = defaul 
etag.update(fs.statSync(path.join(cwd, file.path)).mtime.toString()); }); - const cwdHash = crypto.createHash('md5').update(cwd).digest('hex').substr(0, 8); + const cwdHash = crypto.createHash('md5').update(cwd).digest('hex').substr(0, 8); // eslint-disable-line @kbn/eslint/no_unsafe_hash const basename = `${branch}-${task}-${cwdHash}`; const filename = `${basename}.${ext}`; diff --git a/packages/kbn-eslint-config/.eslintrc.js b/packages/kbn-eslint-config/.eslintrc.js index a68dc6ecd949e..205e5b182e215 100644 --- a/packages/kbn-eslint-config/.eslintrc.js +++ b/packages/kbn-eslint-config/.eslintrc.js @@ -314,6 +314,7 @@ module.exports = { '@kbn/eslint/no_constructor_args_in_property_initializers': 'error', '@kbn/eslint/no_this_in_property_initializers': 'error', '@kbn/eslint/no_unsafe_console': 'error', + '@kbn/eslint/no_unsafe_hash': 'error', '@kbn/imports/no_unresolvable_imports': 'error', '@kbn/imports/uniform_imports': 'error', '@kbn/imports/no_unused_imports': 'error', diff --git a/packages/kbn-eslint-plugin-eslint/index.js b/packages/kbn-eslint-plugin-eslint/index.js index 1b9c04a2b7918..5ff3d70ae8a32 100644 --- a/packages/kbn-eslint-plugin-eslint/index.js +++ b/packages/kbn-eslint-plugin-eslint/index.js @@ -19,5 +19,6 @@ module.exports = { no_constructor_args_in_property_initializers: require('./rules/no_constructor_args_in_property_initializers'), no_this_in_property_initializers: require('./rules/no_this_in_property_initializers'), no_unsafe_console: require('./rules/no_unsafe_console'), + no_unsafe_hash: require('./rules/no_unsafe_hash'), }, }; diff --git a/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.js b/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.js new file mode 100644 index 0000000000000..2088c196ddd60 --- /dev/null +++ b/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.js @@ -0,0 +1,166 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. 
Licensed under the "Elastic License + * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side + * Public License v 1"; you may not use this file except in compliance with, at + * your election, the "Elastic License 2.0", the "GNU Affero General Public + * License v3.0 only", or the "Server Side Public License, v 1". + */ + +const allowedAlgorithms = ['sha256', 'sha3-256', 'sha512']; + +module.exports = { + allowedAlgorithms, + meta: { + type: 'problem', + docs: { + description: 'Allow usage of createHash only with allowed algorithms.', + category: 'FIPS', + recommended: false, + }, + messages: { + noDisallowedHash: + 'Usage of {{functionName}} with "{{algorithm}}" is not allowed. Only the following algorithms are allowed: [{{allowedAlgorithms}}]. If you need to use a different algorithm, please contact the Kibana security team.', + }, + schema: [], + }, + create(context) { + let isCreateHashImported = false; + let createHashName = 'createHash'; + let cryptoLocalName = 'crypto'; + let usedFunctionName = ''; + const sourceCode = context.getSourceCode(); + + const disallowedAlgorithmNodes = new Set(); + + function isAllowedAlgorithm(algorithm) { + return allowedAlgorithms.includes(algorithm); + } + + function isHashOrCreateHash(value) { + if (value === 'hash' || value === 'createHash') { + usedFunctionName = value; + return true; + } + return false; + } + + function getIdentifierValue(node) { + const scope = sourceCode.getScope(node); + if (!scope) { + return; + } + const variable = scope.variables.find((variable) => variable.name === node.name); + if (variable && variable.defs.length > 0) { + const def = variable.defs[0]; + if ( + def.node.init && + def.node.init.type === 'Literal' && + !isAllowedAlgorithm(def.node.init.value) + ) { + disallowedAlgorithmNodes.add(node.name); + return def.node.init.value; + } + } + } + + return { + ImportDeclaration(node) { + if (node.source.value === 'crypto' || node.source.value === 'node:crypto') { + 
node.specifiers.forEach((specifier) => { + if ( + specifier.type === 'ImportSpecifier' && + isHashOrCreateHash(specifier.imported.name) + ) { + isCreateHashImported = true; + createHashName = specifier.local.name; // Capture local name (renamed or not) + } else if (specifier.type === 'ImportDefaultSpecifier') { + cryptoLocalName = specifier.local.name; + } + }); + } + }, + VariableDeclarator(node) { + if (node.init && node.init.type === 'Literal' && !isAllowedAlgorithm(node.init.value)) { + disallowedAlgorithmNodes.add(node.id.name); + } + }, + AssignmentExpression(node) { + if ( + node.right.type === 'Literal' && + node.right.value === 'md5' && + node.left.type === 'Identifier' + ) { + disallowedAlgorithmNodes.add(node.left.name); + } + }, + CallExpression(node) { + const callee = node.callee; + + if ( + callee.type === 'MemberExpression' && + callee.object.name === cryptoLocalName && + isHashOrCreateHash(callee.property.name) + ) { + const arg = node.arguments[0]; + if (arg) { + if (arg.type === 'Literal' && !isAllowedAlgorithm(arg.value)) { + context.report({ + node, + messageId: 'noDisallowedHash', + data: { + algorithm: arg.value, + allowedAlgorithms: allowedAlgorithms.join(', '), + functionName: usedFunctionName, + }, + }); + } else if (arg.type === 'Identifier') { + const identifierValue = getIdentifierValue(arg); + if (disallowedAlgorithmNodes.has(arg.name) && identifierValue) { + context.report({ + node, + messageId: 'noDisallowedHash', + data: { + algorithm: identifierValue, + allowedAlgorithms: allowedAlgorithms.join(', '), + functionName: usedFunctionName, + }, + }); + } + } + } + } + + if (isCreateHashImported && callee.name === createHashName) { + const arg = node.arguments[0]; + if (arg) { + if (arg.type === 'Literal' && !isAllowedAlgorithm(arg.value)) { + context.report({ + node, + messageId: 'noDisallowedHash', + data: { + algorithm: arg.value, + allowedAlgorithms: allowedAlgorithms.join(', '), + functionName: usedFunctionName, + }, + }); + } else 
if (arg.type === 'Identifier') { + const identifierValue = getIdentifierValue(arg); + if (disallowedAlgorithmNodes.has(arg.name) && identifierValue) { + context.report({ + node, + messageId: 'noDisallowedHash', + data: { + algorithm: identifierValue, + allowedAlgorithms: allowedAlgorithms.join(', '), + functionName: usedFunctionName, + }, + }); + } + } + } + } + }, + }; + }, +}; diff --git a/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.test.js b/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.test.js new file mode 100644 index 0000000000000..d384ea40819eb --- /dev/null +++ b/packages/kbn-eslint-plugin-eslint/rules/no_unsafe_hash.test.js @@ -0,0 +1,142 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the "Elastic License + * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side + * Public License v 1"; you may not use this file except in compliance with, at + * your election, the "Elastic License 2.0", the "GNU Affero General Public + * License v3.0 only", or the "Server Side Public License, v 1". 
+ */ + +const { RuleTester } = require('eslint'); +const { allowedAlgorithms, ...rule } = require('./no_unsafe_hash'); + +const dedent = require('dedent'); + +const joinedAllowedAlgorithms = `[${allowedAlgorithms.join(', ')}]`; + +const ruleTester = new RuleTester({ + parser: require.resolve('@typescript-eslint/parser'), + parserOptions: { + sourceType: 'module', + ecmaVersion: 2018, + ecmaFeatures: { + jsx: true, + }, + }, +}); + +ruleTester.run('@kbn/eslint/no_unsafe_hash', rule, { + valid: [ + // valid import of crypto and call of createHash + { + code: dedent` + import crypto from 'crypto'; + crypto.createHash('sha256'); + `, + }, + // valid import and call of createHash + { + code: dedent` + import { createHash } from 'crypto'; + createHash('sha256'); + `, + }, + // valid import and call of createHash with a variable containing a compliant algorithm + { + code: dedent` + import { createHash } from 'crypto'; + const myHash = 'sha256'; + createHash(myHash); + `, + }, + // valid import and call of hash with a variable containing a compliant algorithm + { + code: dedent` + import { hash } from 'crypto'; + const myHash = 'sha256'; + hash(myHash); + `, + }, + ], + + invalid: [ + // invalid call of createHash when calling from crypto + { + code: dedent` + import crypto from 'crypto'; + crypto.createHash('md5'); + `, + errors: [ + { + line: 2, + message: `Usage of createHash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + // invalid call of createHash when importing directly + { + code: dedent` + import { createHash } from 'crypto'; + createHash('md5'); + `, + errors: [ + { + line: 2, + message: `Usage of createHash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. 
If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + // invalid call of createHash when calling with a variable containing md5 + { + code: dedent` + import { createHash } from 'crypto'; + const myHash = 'md5'; + createHash(myHash); + `, + errors: [ + { + line: 3, + message: `Usage of createHash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + // invalid import and call of hash when importing directly + { + code: dedent` + import { hash } from 'crypto'; + hash('md5'); + `, + errors: [ + { + line: 2, + message: `Usage of hash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + { + code: dedent` + import _crypto from 'crypto'; + _crypto.hash('md5'); + `, + errors: [ + { + line: 2, + message: `Usage of hash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + + { + code: dedent` + import { hash as _hash } from 'crypto'; + _hash('md5'); + `, + errors: [ + { + line: 2, + message: `Usage of hash with "md5" is not allowed. Only the following algorithms are allowed: ${joinedAllowedAlgorithms}. 
If you need to use a different algorithm, please contact the Kibana security team.`, + }, + ], + }, + ], +}); diff --git a/packages/kbn-failed-test-reporter-cli/failed_tests_reporter/report_failures_to_file.ts b/packages/kbn-failed-test-reporter-cli/failed_tests_reporter/report_failures_to_file.ts index 7876efb8502a5..b1e3997ebf030 100644 --- a/packages/kbn-failed-test-reporter-cli/failed_tests_reporter/report_failures_to_file.ts +++ b/packages/kbn-failed-test-reporter-cli/failed_tests_reporter/report_failures_to_file.ts @@ -127,7 +127,7 @@ export async function reportFailuresToFile( // Jest could, in theory, fail 1000s of tests and write 1000s of failures // So let's just write files for the first 20 for (const failure of failures.slice(0, 20)) { - const hash = createHash('md5').update(failure.name).digest('hex'); + const hash = createHash('md5').update(failure.name).digest('hex'); // eslint-disable-line @kbn/eslint/no_unsafe_hash const filenameBase = `${ process.env.BUILDKITE_JOB_ID ? 
process.env.BUILDKITE_JOB_ID + '_' : '' }${hash}`; diff --git a/packages/kbn-optimizer/src/common/dll_manifest.ts b/packages/kbn-optimizer/src/common/dll_manifest.ts index 0a5bebefdeca5..fc8c597110156 100644 --- a/packages/kbn-optimizer/src/common/dll_manifest.ts +++ b/packages/kbn-optimizer/src/common/dll_manifest.ts @@ -20,7 +20,7 @@ export interface ParsedDllManifest { } const hash = (s: string) => { - return Crypto.createHash('sha1').update(s).digest('base64').replace(/=+$/, ''); + return Crypto.createHash('sha1').update(s).digest('base64').replace(/=+$/, ''); // eslint-disable-line @kbn/eslint/no_unsafe_hash }; export function parseDllManifest(manifest: DllManifest): ParsedDllManifest { diff --git a/src/plugins/data_views/server/rest_api_routes/internal/fields.ts b/src/plugins/data_views/server/rest_api_routes/internal/fields.ts index 7b13704f3c50a..0d8f8b4dd67b5 100644 --- a/src/plugins/data_views/server/rest_api_routes/internal/fields.ts +++ b/src/plugins/data_views/server/rest_api_routes/internal/fields.ts @@ -21,7 +21,7 @@ import { parseFields, IBody, IQuery, querySchema, validate } from './fields_for' import { DEFAULT_FIELD_CACHE_FRESHNESS } from '../../constants'; export function calculateHash(srcBuffer: Buffer) { - const hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash hash.update(srcBuffer); return hash.digest('hex'); } diff --git a/x-pack/plugins/cloud_integrations/cloud_full_story/server/routes/fullstory.ts b/x-pack/plugins/cloud_integrations/cloud_full_story/server/routes/fullstory.ts index 03e38baee4e91..d983191c726df 100644 --- a/x-pack/plugins/cloud_integrations/cloud_full_story/server/routes/fullstory.ts +++ b/x-pack/plugins/cloud_integrations/cloud_full_story/server/routes/fullstory.ts @@ -26,7 +26,7 @@ export const renderFullStoryLibraryFactory = (dist = true) => headers: HttpResponseOptions['headers']; }> => { const srcBuffer = await fs.readFile(FULLSTORY_LIBRARY_PATH); - const 
hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash hash.update(srcBuffer); const hashDigest = hash.digest('hex'); diff --git a/x-pack/test/cases_api_integration/common/plugins/cases/server/routes.ts b/x-pack/test/cases_api_integration/common/plugins/cases/server/routes.ts index 10139f636c809..3269f9f059446 100644 --- a/x-pack/test/cases_api_integration/common/plugins/cases/server/routes.ts +++ b/x-pack/test/cases_api_integration/common/plugins/cases/server/routes.ts @@ -19,7 +19,7 @@ import { CASES_TELEMETRY_TASK_NAME } from '@kbn/cases-plugin/common/constants'; import type { FixtureStartDeps } from './plugin'; const hashParts = (parts: string[]): string => { - const hash = createHash('sha1'); + const hash = createHash('sha1'); // eslint-disable-line @kbn/eslint/no_unsafe_hash const hashFeed = parts.join('-'); return hash.update(hashFeed).digest('hex'); }; From fefa59f41206c534297813af2cb6f732c2c59aeb Mon Sep 17 00:00:00 2001 From: Davis Plumlee <56367316+dplumlee@users.noreply.github.com> Date: Mon, 30 Sep 2024 12:37:29 -0400 Subject: [PATCH 4/4] [Security Solution] Test plan for rule `type` field diff algorithm (#193372) ## Summary Related ticket: https://github.com/elastic/kibana/issues/190482 Adds test plan for diff algorithm for `type` field diff algorithm implemented here: https://github.com/elastic/kibana/pull/193369 ### For maintainers - [ ] This was checked for breaking API changes and was [labeled appropriately](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process) --- .../upgrade_review_algorithms.md | 117 ++++++++++++++++-- 1 file changed, 109 insertions(+), 8 deletions(-) diff --git a/x-pack/plugins/security_solution/docs/testing/test_plans/detection_response/prebuilt_rules/upgrade_review_algorithms.md b/x-pack/plugins/security_solution/docs/testing/test_plans/detection_response/prebuilt_rules/upgrade_review_algorithms.md index 
e65d366e0f44c..c4a39a994144f 100644 --- a/x-pack/plugins/security_solution/docs/testing/test_plans/detection_response/prebuilt_rules/upgrade_review_algorithms.md +++ b/x-pack/plugins/security_solution/docs/testing/test_plans/detection_response/prebuilt_rules/upgrade_review_algorithms.md @@ -17,12 +17,16 @@ Status: `in progress`. - [Rule field doesn't have an update and has no custom value - `AAA`](#rule-field-doesnt-have-an-update-and-has-no-custom-value---aaa) - [**Scenario: `AAA` - Rule field is any type**](#scenario-aaa---rule-field-is-any-type) - [Rule field doesn't have an update but has a custom value - `ABA`](#rule-field-doesnt-have-an-update-but-has-a-custom-value---aba) - - [**Scenario: `ABA` - Rule field is any type**](#scenario-aba---rule-field-is-any-type) + - [**Scenario: `ABA` - Rule field is any type except rule `type`**](#scenario-aba---rule-field-is-any-type-except-rule-type) + - [**Scenario: `ABA` - Rule field is rule `type`**](#scenario-aba---rule-field-is-rule-type) - [Rule field has an update and doesn't have a custom value - `AAB`](#rule-field-has-an-update-and-doesnt-have-a-custom-value---aab) - - [**Scenario: `AAB` - Rule field is any type**](#scenario-aab---rule-field-is-any-type) + - [**Scenario: `AAB` - Rule field is any type except rule `type`**](#scenario-aab---rule-field-is-any-type-except-rule-type) + - [**Scenario: `AAB` - Rule field is rule `type`**](#scenario-aab---rule-field-is-rule-type) - [Rule field has an update and a custom value that are the same - `ABB`](#rule-field-has-an-update-and-a-custom-value-that-are-the-same---abb) - - [**Scenario: `ABB` - Rule field is any type**](#scenario-abb---rule-field-is-any-type) + - [**Scenario: `ABB` - Rule field is any type except rule `type`**](#scenario-abb---rule-field-is-any-type-except-rule-type) + - [**Scenario: `ABB` - Rule field is rule `type`**](#scenario-abb---rule-field-is-rule-type) - [Rule field has an update and a custom value that are NOT the same - 
`ABC`](#rule-field-has-an-update-and-a-custom-value-that-are-not-the-same---abc) + - [**Scenario: `ABC` - Rule field is rule `type`**](#scenario-abc---rule-field-is-rule-type) - [**Scenario: `ABC` - Rule field is a number or single line string**](#scenario-abc---rule-field-is-a-number-or-single-line-string) - [**Scenario: `ABC` - Rule field is a mergeable multi line string**](#scenario-abc---rule-field-is-a-mergeable-multi-line-string) - [**Scenario: `ABC` - Rule field is a non-mergeable multi line string**](#scenario-abc---rule-field-is-a-non-mergeable-multi-line-string) @@ -37,6 +41,7 @@ Status: `in progress`. - [**Scenario: `-AB` - Rule field is an array of scalar values**](#scenario--ab---rule-field-is-an-array-of-scalar-values) - [**Scenario: `-AB` - Rule field is a solvable `data_source` object**](#scenario--ab---rule-field-is-a-solvable-data_source-object) - [**Scenario: `-AB` - Rule field is a non-solvable `data_source` object**](#scenario--ab---rule-field-is-a-non-solvable-data_source-object) + - [**Scenario: `-AB` - Rule field is rule `type`**](#scenario--ab---rule-field-is-rule-type) ## Useful information @@ -74,7 +79,7 @@ Status: `in progress`. #### **Scenario: `AAA` - Rule field is any type** -**Automation**: 10 integration tests with mock rules + a set of unit tests for each algorithm +**Automation**: 11 integration tests with mock rules + a set of unit tests for each algorithm ```Gherkin Given field is not customized by the user (current version == base version) @@ -85,6 +90,7 @@ And field should not be shown in the upgrade preview UI Examples: | algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | "query" | "query" | "query" | "query" | | single line string | name | "A" | "A" | "A" | "A" | | multi line string | description | "My description.\nThis is a second line." | "My description.\nThis is a second line." | "My description.\nThis is a second line." 
| "My description.\nThis is a second line." | | number | risk_score | 1 | 1 | 1 | 1 | @@ -99,7 +105,7 @@ Examples: ### Rule field doesn't have an update but has a custom value - `ABA` -#### **Scenario: `ABA` - Rule field is any type** +#### **Scenario: `ABA` - Rule field is any type except rule `type`** **Automation**: 10 integration tests with mock rules + a set of unit tests for each algorithm @@ -124,9 +130,27 @@ Examples: | esql_query | esql_query | {query: "FROM query WHERE true", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | {query: "FROM query WHERE true", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | ``` +#### **Scenario: `ABA` - Rule field is rule `type`** + +**Automation**: 1 integration test with mock rules + a set of unit tests for each algorithm + +```Gherkin +Given field is customized by the user (current version != base version) +And field is not updated by Elastic in this upgrade (target version == base version) +Then for field the diff algorithm should output the target version as the merged one with a non-solvable conflict +And field should be returned from the `upgrade/_review` API endpoint +And field should be shown in the upgrade preview UI + +Examples: +| algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | "query" | "saved_query" | "query" | "query" | +``` + +Notes: `type` field can only be changed between `query` and `saved_query` rule types in the UI and API via normal conventions, but the logic for others is still covered + ### Rule field has an update and doesn't have a custom value - `AAB` -#### **Scenario: `AAB` - Rule field is any type** +#### **Scenario: `AAB` - Rule field is any type except rule `type`** **Automation**: 10 integration tests with mock rules + a set of unit tests for each algorithm @@ -151,9 +175,27 @@ Examples: | esql_query | esql_query | {query: "FROM query WHERE true", language: "esql"} | {query: 
"FROM query WHERE true", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | ``` +#### **Scenario: `AAB` - Rule field is rule `type`** + +**Automation**: 1 integration test with mock rules + a set of unit tests for each algorithm + +```Gherkin +Given field is not customized by the user (current version == base version) +And field is updated by Elastic in this upgrade (target version != base version) +Then for field the diff algorithm should output the target version as the merged one with a non-solvable conflict +And field should be returned from the `upgrade/_review` API endpoint +And field should be shown in the upgrade preview UI + +Examples: +| algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | "query" | "query" | "saved_query" | "saved_query" | +``` + +Notes: `type` field can only be changed between `query` and `saved_query` rule types in the UI and API via normal conventions, but the logic for others is still covered + ### Rule field has an update and a custom value that are the same - `ABB` -#### **Scenario: `ABB` - Rule field is any type** +#### **Scenario: `ABB` - Rule field is any type except rule `type`** **Automation**: 10 integration tests with mock rules + a set of unit tests for each algorithm @@ -179,8 +221,46 @@ Examples: | esql_query | esql_query | {query: "FROM query WHERE true", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | {query: "FROM query WHERE false", language: "esql"} | ``` +#### **Scenario: `ABB` - Rule field is rule `type`** + +**Automation**: 1 integration test with mock rules + a set of unit tests for each algorithm + +```Gherkin +Given field is customized by the user (current version != base version) +And field is updated by Elastic in this upgrade (target version != base version) +And customized field is the same as 
the Elastic update in this upgrade (current version == target version) +Then for field the diff algorithm should output the target version as the merged one with a non-solvable conflict +And field should be returned from the `upgrade/_review` API endpoint +And field should be shown in the upgrade preview UI + +Examples: +| algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | "query" | "saved_query" | "saved_query" | "saved_query" | +``` + +Notes: `type` field can only be changed between `query` and `saved_query` rule types in the UI and API via normal conventions, but the logic for others is still covered + ### Rule field has an update and a custom value that are NOT the same - `ABC` +#### **Scenario: `ABC` - Rule field is rule `type`** + +**Automation**: 1 integration test with mock rules + a set of unit tests for the algorithms + +```Gherkin +Given field is customized by the user (current version != base version) +And field is updated by Elastic in this upgrade (target version != base version) +And customized field is different than the Elastic update in this upgrade (current version != target version) +Then for field the diff algorithm should output the target version as the merged one with a non-solvable conflict +And field should be returned from the `upgrade/_review` API endpoint +And field should be shown in the upgrade preview UI + +Examples: +| algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | "query" | "saved_query" | "threshold" | "threshold" | +``` + +Notes: `type` field can only be changed between `query` and `saved_query` rule types in the UI and API via normal conventions, but the logic for others is still covered. This test case scenario cannot currently be reached. 
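Taken together, the `ABA`, `AAB`, `ABB`, and `ABC` scenarios for the rule `type` field all behave the same way: the merged version is always the target version, and any difference between base, current, and target versions yields a non-solvable conflict. A minimal sketch of that decision table (illustrative helper and constant names, not the actual Kibana implementation):

```javascript
// Illustrative sketch of the rule `type` diff algorithm behavior described
// by the scenarios in this test plan (names are hypothetical, not Kibana's).
function ruleTypeDiff(base, current, target) {
  // AAA: nothing changed anywhere -> no conflict; otherwise the target
  // version wins with a non-solvable conflict (ABA, AAB, ABB, ABC).
  const allEqual = base === current && current === target;
  return {
    mergedVersion: target,
    conflict: allEqual ? 'NO_CONFLICT' : 'NON_SOLVABLE',
  };
}

// ABA row from the table above: base "query", current "saved_query",
// target "query" -> merged "query" with a non-solvable conflict.
console.log(ruleTypeDiff('query', 'saved_query', 'query'));
```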
+ +#### **Scenario: `ABC` - Rule field is a number or single line string** **Automation**: 2 integration tests with mock rules + a set of unit tests for the algorithms @@ -328,7 +408,7 @@ #### **Scenario: `-AA` - Rule field is any type** -**Automation**: 9 integration tests with mock rules + a set of unit tests for each algorithm +**Automation**: 11 integration tests with mock rules + a set of unit tests for each algorithm ```Gherkin Given at least 1 installed prebuilt rule has a new version available @@ -340,6 +420,7 @@ And field should not be shown in the upgrade preview UI Examples: | algorithm | field_name | base_version | current_version | target_version | merged_version | +| rule type | type | N/A | "query" | "query" | "query" | | single line string | name | N/A | "A" | "A" | "A" | | multi line string | description | N/A | "My description.\nThis is a second line." | "My description.\nThis is a second line." | "My description.\nThis is a second line." | | number | risk_score | N/A | 1 | 1 | 1 | @@ -438,3 +519,23 @@ Examples: | algorithm | base_version | current_version | target_version | merged_version | | data_source | N/A | {type: "index_patterns", "index_patterns": ["one", "two", "three"]} | {type: "data_view", "data_view_id": "A"} | {type: "data_view", "data_view_id": "A"} | ``` + +#### **Scenario: `-AB` - Rule field is rule `type`** + +**Automation**: 1 integration test with mock rules + a set of unit tests for the algorithm + +```Gherkin +Given at least 1 installed prebuilt rule has a new version available +And the base version of the rule cannot be determined +And customized field is different than the Elastic update in this upgrade (current version != target version) +Then for field the diff algorithm should output the target version as the merged version with a non-solvable conflict +And field should be returned from the 
`upgrade/_review` API endpoint +And field should be shown in the upgrade preview UI + +Examples: +| algorithm | base_version | current_version | target_version | merged_version | +| rule type | N/A | "query" | "saved_query" | "saved_query" | +``` + +Notes: `type` field can only be changed between `query` and `saved_query` rule types in the UI and API via normal conventions, but the logic for others is still covered