diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 086c31140dad4..1aa57cc822b23 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -754,7 +754,7 @@ src/plugins/saved_objects_finder @elastic/kibana-data-discovery test/plugin_functional/plugins/saved_objects_hidden_from_http_apis_type @elastic/kibana-core test/plugin_functional/plugins/saved_objects_hidden_type @elastic/kibana-core src/plugins/saved_objects_management @elastic/kibana-core -src/plugins/saved_objects @elastic/kibana-core +src/plugins/saved_objects @elastic/appex-sharedux packages/kbn-saved-objects-settings @elastic/appex-sharedux src/plugins/saved_objects_tagging_oss @elastic/appex-sharedux x-pack/plugins/saved_objects_tagging @elastic/appex-sharedux diff --git a/docs/CHANGELOG.asciidoc b/docs/CHANGELOG.asciidoc index 074dc6ffd05f0..3c102cbbf9384 100644 --- a/docs/CHANGELOG.asciidoc +++ b/docs/CHANGELOG.asciidoc @@ -10,6 +10,7 @@ Review important information about the {kib} 8.x releases. +* <> * <> * <> * <> @@ -77,6 +78,360 @@ Review important information about the {kib} 8.x releases. include::upgrade-notes.asciidoc[] + +[[release-notes-8.16.0]] +== {kib} 8.16.0 + +For information about the {kib} 8.16.0 release, review the following information. + +[float] +[[deprecations-8.16.0]] +=== Deprecations + +The following functionality is deprecated in 8.16.0, and will be removed in 9.0.0. +Deprecated functionality does not have an immediate impact on your application, but we strongly recommend +you make the necessary updates after you upgrade to 8.16.0. + +[discrete] +* The Logs Stream is now hidden by default in favor of the Logs Explorer app. +[%collapsible] +==== +*Details* + +You can find the Logs Explorer app in the navigation menu under Logs > Explorer, or as a separate tab in Discover. For more information, refer to ({kibana-pull}194519[#194519]). + +*Impact* + +You can still show the Logs Stream app again by navigating to Stack Management > Advanced Settings and by enabling the `observability:enableLogsStream` setting. +==== + +[discrete] +* Deprecates the Observability AI Assistant specific advanced setting `observability:aiAssistantLogsIndexPattern`. +[%collapsible] +==== +*Details* + +The Observability AI Assistant specific advanced setting for Logs index patterns `observability:aiAssistantLogsIndexPattern` is deprecated and no longer used. The AI Assistant will now use the existing **Log sources** setting `observability:logSources` instead. For more information, refer to ({kibana-pull}192003[#192003]). + +//*Impact* + +//!!TODO!! +==== + + + +[float] +[[features-8.16.0]] +=== Features +{kib} 8.16.0 adds the following new and notable features. + +AGPL license:: +* Adds AGPL 3.0 license ({kibana-pull}192025[#192025]). +Alerting:: +* Adds TheHive connector ({kibana-pull}180138[#180138]). +* Adds flapping settings per rule ({kibana-pull}189341[#189341]). +Cases:: +* Support TheHive connector in cases ({kibana-pull}180931[#180931]). +Dashboards and visualizations:: +* Adds the ability to star your favorite dashboards and quickly find them ({kibana-pull}189285[#189285]). +* Adds a chart showing usage statistics to the dashboard details ({kibana-pull}187993[#187993]). +* Adds metric styling options in *Lens* ({kibana-pull}186929[#186929]). +* Adds support for coloring table cells by terms with color mappings assignments. This is supported for both Rows and Metric dimensions ({kibana-pull}189895[#189895]). +Data ingestion and Fleet:: +* Support content packages in UI ({kibana-pull}195831[#195831]). 
+* Advanced agent monitoring options UI for HTTP endpoint and diagnostics ({kibana-pull}193361[#193361]). +* Adds option to have Kafka dynamic topics in outputs ({kibana-pull}192720[#192720]). +* Adds support for GeoIP processor databases in Ingest Pipelines ({kibana-pull}190830[#190830]). +//// +!!TODO!! The above PR had a lengthy release note description: +The Ingest Pipelines app now supports adding and managing databases for the GeoIP processor. Additionally, the pipeline creation flow now includes support for the IP Location processor. +//// +* Adds agentless ux creation flow ({kibana-pull}189932[#189932]). +* Enable feature flag for reusable integration policies ({kibana-pull}187153[#187153]). +Discover:: +* When writing ES|QL queries, you now get recommendations to help you get started ({kibana-pull}194418[#194418]). +* Enhances the inline documentation experience in ES|QL mode ({kibana-pull}192156[#192156]). +* Adds the ability to break down the histogram by field for ES|QL queries in Discover ({kibana-pull}193820[#193820]). +* Adds a summary column to the Documents table when exploring log data in Discover ({kibana-pull}192567[#192567]). +* Adds row indicators to the Documents table when exploring log data in Discover ({kibana-pull}190676[#190676]). +* Moves the button to switch between ES|QL and classic modes to the toolbar ({kibana-pull}188898[#188898]). +* Adds density settings to allow further customization of the Documents table layout ({kibana-pull}188495[#188495]). +* Enables the time picker for indices without the @timestamp field when editing ES|QL queries ({kibana-pull}184361[#184361]). +Elastic Observability solution:: +* Show monitors from all permitted spaces !! ({kibana-pull}196109[#196109]). +* Adds experimental logs overview to the observability hosts and service overviews ({kibana-pull}195673[#195673]). +* Show alerts for entities ({kibana-pull}195250[#195250]). +* Create sub-feature role to manage APM settings write permissions ({kibana-pull}194419[#194419]). +* Adds related alerts tab to the alert details page ({kibana-pull}193263[#193263]). +* Adds labels field !! ({kibana-pull}193250[#193250]). +* Implement _ignored root cause identification flow ({kibana-pull}192370[#192370]). +* Enable page for synthetics ({kibana-pull}191846[#191846]). +* Settings add config to enable default rules ({kibana-pull}190800[#190800]). +* Added alerts page ({kibana-pull}190751[#190751]). +* Monitor list add bulk delete ({kibana-pull}190674[#190674]). +* Delete monitor API via id param !! ({kibana-pull}190210[#190210]). +* Enable metrics and traces in the Data Set Quality page ({kibana-pull}190043[#190043]). +* Adds alert grouping functionality to the observability alerts page ({kibana-pull}189958[#189958]). +* Adds a new SLO Burn Rate embeddable ({kibana-pull}189429[#189429]). +* The Slack Web API Alert Connector is now supported as a default connector for Synthetics and Uptime rules ({kibana-pull}188437[#188437]). +* Adds option to enable backfill transform ({kibana-pull}188379[#188379]). +* Save the ECS group by fields at the AAD root level ({kibana-pull}188241[#188241]). +* Adds last value aggregation ({kibana-pull}187082[#187082]). +* Improve synthetics alerting ({kibana-pull}186585[#186585]). +* Make overview grid embeddable ({kibana-pull}160597[#160597]). +Elastic Security solution:: +For the Elastic Security 8.16.0 release information, refer to {security-guide}/release-notes.html[_Elastic Security Solution Release Notes_]. 
+Kibana security:: +* Adds an API endpoint `POST security/roles` that can be used to bulk create or update roles ({kibana-pull}189173[#189173]). +* Automatic Import can now create integrations for logs in the CSV format ({kibana-pull}194386[#194386]). +* Adds an error handling framework to Automatic Import that provides error messages with more context to user ({kibana-pull}193577[#193577]). +* When running in FIPS mode, Kibana forbids usage of PKCS12 configuration options ({kibana-pull}192627[#192627]). +Machine Learning:: +* Adds new section for creating daylight saving time calendar events ({kibana-pull}193605[#193605]). +* Anomaly Detection: Adds a page to list supplied job configurations ({kibana-pull}191564[#191564]). +* Redesigns start/update model deployment dialog to support adaptive resources ({kibana-pull}190243[#190243]). +* File upload: Adds support for PDF files ({kibana-pull}186956[#186956]). +* Adds Pattern analysis embeddable for dashboards ({kibana-pull}186539[#186539]). +Management:: +* This release introduces a fresh, modern look for the console, now featuring the Monaco editor. We've added a file import and export functionality, and the console is fully responsive with stackable panels for a smoother experience. New buttons allow for quick clearing of editor values and output. Additionally, the history and config tabs were improved to enhance usability. ({kibana-pull}189748[#189748]). + +For more information about the features introduced in 8.16.0, refer to <>. + +[[enhancements-and-bug-fixes-v8.16.0]] +=== Enhancements and bug fixes + +For detailed information about the 8.16.0 release, review the enhancements and bug fixes. + + +[float] +[[enhancement-v8.16.0]] +=== Enhancements +Alerting:: +* Allow users to select template while adding a case action in the rule ({kibana-pull}190701[#190701]). +* New full-page rule form in the Stack Management app ({kibana-pull}194655[#194655]). +Dashboards and visualizations:: +* Adds compressed style for dashboard controls ({kibana-pull}190636[#190636]). +* Adds the ability to duplicate a managed dashboard from its `managed` badge ({kibana-pull}189404[#189404]). +* Adds the ability to expand the height of various sections in the Edit ES|QL visualization flyout ({kibana-pull}193453[#193453]). +* Improves the query authoring experience when editing an ES|QL visualization ({kibana-pull}186875[#186875]). +* Syncs the cursor for time series charts powered by ES|QL ({kibana-pull}192837[#192837]). +* Gauge and metric Lens visualizations are no longer experimental ({kibana-pull}192359[#192359]). +* Sets gauge default palette to "temperature" in *Lens* ({kibana-pull}191853[#191853]). +* Supports fuzzy search on field pickers and field lists in *Lens* ({kibana-pull}186894[#186894]). +Data ingestion and Fleet:: +* Update max supported package version ({kibana-pull}196551[#196551]). +* Adds additional columns to Agent Logs UI ({kibana-pull}192262[#192262]). +* Show `+build` versions for Elastic Agent upgrades ({kibana-pull}192171[#192171]). +* Added format parameter to `agent_policies` APIs ({kibana-pull}191811[#191811]). +* Adds toggles for `agent.monitoring.http.enabled` and `agent.monitoring.http.buffer.enabled` to agent policy advanced settings ({kibana-pull}190984[#190984]). +* Support integration policies without agent policy references (aka orphaned integration policies) ({kibana-pull}190649[#190649]). +* Changed the UX of the Edit Integration Policy page to update agent policies ({kibana-pull}190583[#190583]). 
+* Allow `traces` to be added to the `monitoring_enabled` array in Agent policies ({kibana-pull}189908[#189908]). +* Create task that periodically unenrolls inactive agents ({kibana-pull}189861[#189861]). +* Adds setup technology selector to add integration page ({kibana-pull}189612[#189612]). +* Support integration-level outputs ({kibana-pull}189125[#189125]). +Discover:: +* Renames the Documents tab to Results in ES|QL mode ({kibana-pull}197833[#197833]). +* Adds a cluster details tab for CCS data sources when inspecting requests in ES|QL mode ({kibana-pull}195373[#195373]). +* Adds the query time to the list of statistics when inspecting requests in ES|QL mode ({kibana-pull}194806[#194806]). +* Improves display of error messages in ES|QL mode ({kibana-pull}191320[#191320]). +* Adds a help menu to the ES|QL mode ({kibana-pull}190579[#190579]). +* Initializes the ES|QL editor with time named parameters when switching from the classic mode with a data view without @timestamp ({kibana-pull}189367[#189367]). +* Adds the ability to select multiple rows from the Documents table using "Shift + Select" ({kibana-pull}193619[#193619]). +* Adds the ability to filter on field names and values in the expanded document view ({kibana-pull}192299[#192299]). +* Adds filtering for selected fields ({kibana-pull}191930[#191930]). +* Adds a dedicated column to the document viewer flyout for pinning and unpinning rows ({kibana-pull}190344[#190344]). +* Improves absolute column width handling ({kibana-pull}190288[#190288]). +* Allows filtering by field type in the document viewer flyout ({kibana-pull}189981[#189981]). +* Improves the document viewer flyout to remember the last active tab ({kibana-pull}189806[#189806]). +* Adds ability to hide fields with null values from the document viewer ({kibana-pull}189601[#189601]). +* Adds the ability to copy selected rows as text ({kibana-pull}189512[#189512]). +* Adds a log level badge cell renderer to the Discover logs profile ({kibana-pull}188281[#188281]). +* Shows ECS field descriptions in Discover and adds markdown support for field descriptions ({kibana-pull}187160[#187160]). +* Adds support for the Log overview tab to the Discover log profile ({kibana-pull}186680[#186680]). +* Adds default app state extension and log integration data source profiles ({kibana-pull}186347[#186347]). +* Allows to select and deselect all rows in the grid at once ({kibana-pull}184241[#184241]). +* Limits the height of long field values by default ({kibana-pull}183736[#183736]). +ES|QL editor:: +* Changes the auto-focus to be on the ES|QL editor when loading the page ({kibana-pull}193800[#193800]). +* Updates the autocomplete behavior for `SORT` to be in line with other field-list-based experiences like `KEEP` in ES|QL queries ({kibana-pull}193595[#193595]). +* Adds `all (*)` to the list of suggestions for `COUNT` functions in ES|QL queries ({kibana-pull}192205[#192205]). +* Improves ES|QL autocomplete suggestions for `case()` expressions ({kibana-pull}192135[#192135]). +* Opens suggestions automatically for sources lists and `ENRICH` functions when writing ES|QL queries ({kibana-pull}191312[#191312]). +* Improves wrapping and readability for ES|QL queries ({kibana-pull}191269[#191269]). +* Improves suggestions based on previous function arguments and date suggestions for `bucket` functions in ES|QL queries ({kibana-pull}190828[#190828]). +* Show the `LIMIT` information in the ES|QL editor's footer ({kibana-pull}190498[#190498]). 
+* Opens suggestions automatically for field lists in ES|QL queries ({kibana-pull}190466[#190466]). +* Integrates a time picker for date fields into the ES|QL editor ({kibana-pull}187047[#187047]). +* Improves ES|QL support for Elasticsearch sub-types in AST for both validation and autocomplete ({kibana-pull}189689[#189689]). +* Adds ECS information to the ES|QL editor suggestions and prioritizes fields based on ECS information on the editor ({kibana-pull}187922[#187922]). +* Improves `BY` suggestions in ES|QL queries to include pipe and comma operators ({kibana-pull}189458[#189458]). +* Makes the suggestion menu open automatically in more places in ES|QL queries ({kibana-pull}189585[#189585]). +* Adds hints upon hover for function argument types and time system types ({kibana-pull}191881[#191881]). +Elastic Observability solution:: +* Enable Kubernetes Otel flow ({kibana-pull}196531[#196531]). +* Pass function responses when copying conversation ({kibana-pull}195635[#195635]). +* Turn 'fast filter' on by default and ensure tech preview badge shows when turned on ({kibana-pull}193710[#193710]). +* Custom Service Name Cell ({kibana-pull}192381[#192381]). +* Remove manage_transform and manage_ingest_pipeline privilege requirements ({kibana-pull}190572[#190572]). +* Create new formula for CPU Usage metric ({kibana-pull}189261[#189261]). +* Adds customizable header for quickstart flows ({kibana-pull}188340[#188340]). +* Change Kubernetes guide to link to observability onboarding ({kibana-pull}188322[#188322]). +* Adds KB user instructions ({kibana-pull}187607[#187607]). +* Refactor Synthetics Overview page for increased scalability ({kibana-pull}187092[#187092]). +* Improve synthetics alerting ({kibana-pull}186585[#186585]). +* Annotations Initial phase ({kibana-pull}184325[#184325]). +Elastic Search solution:: +* Adds Alibaba AI Search to Deletion, search and filtering of inference endpoints ({kibana-pull}190783[#190783]). +Elastic Security solution:: +For the Elastic Security 8.16.0 release information, refer to {security-guide}/release-notes.html[_Elastic Security Solution Release Notes_]. +Kibana security:: +* Enhances Open API spec generation to include Route Security Authorization if available ({kibana-pull}197001[#197001]). +* Automatic Import now analyzes larger number of samples to generate an integration ({kibana-pull}196233[#196233]). +* Extended `KibanaRouteOptions` to include security configuration at the route definition level ({kibana-pull}191973[#191973]). +* Adds several UX improvements to the management of Spaces in **Stack Management > Spaces**, including the ability to assign Roles to an existing Space. ({kibana-pull}191795[#191795]). +* Displays an "invalid file" error when selecting unsupported file types for the user profile image ({kibana-pull}190077[#190077]). +* Displays a warning to users whenever role mappings with empty `any` or `all` rules are created or updated ({kibana-pull}189340[#189340]). +* Adds support for CHIPS cookies ({kibana-pull}188519[#188519]). +* Adds support for Permissions Policy reporting ({kibana-pull}186892[#186892]). +Machine Learning:: +* File upload: enables check for model allocations ({kibana-pull}197395[#197395]). +* Data visualizer: Adds icons for semantic text, sparse vector, and dense vector ({kibana-pull}196069[#196069]). +* Updates vCPUs ranges for start model deployment ({kibana-pull}195617[#195617]). +* Adds ML tasks to the Kibana audit log ({kibana-pull}195120[#195120]). 
+* Anomaly Detection: adds ability to delete forecasts from job ({kibana-pull}194896[#194896]). +* Updates for Trained Models table layout and model states ({kibana-pull}194614[#194614]). +* Log rate analysis: ensures ability to sort on Log rate change ({kibana-pull}193501[#193501]). +* Single Metric Viewer: Enables cross-filtering for 'by', 'over', and 'partition' field values ({kibana-pull}193255[#193255]). +* Adds link to anomaly detection configurations from Integration > Assets tab ({kibana-pull}193105[#193105]). +* Anomaly Explorer: Displays markers for scheduled events in distribution-type anomaly charts ({kibana-pull}192377[#192377]). +* Serverless Security: Adds ES|QL visualizer menu item to the nav ({kibana-pull}192314[#192314]). +* Updates icons for Machine Learning embeddable dashboard panel types ({kibana-pull}191718[#191718]). +* AIOps: Uses no minimum time range by default for pattern analysis ({kibana-pull}191192[#191192]). +* Links to ML assets from Integration > Assets tab ({kibana-pull}189767[#189767]). +* Utilizes the `DataViewLazy` in ML plugin ({kibana-pull}189188[#189188]). +* AIOps: Chunks groups of field candidates into single queries for top items and histograms ({kibana-pull}189155[#189155]). +* AIOps: Updates fields filter popover to be able to filter fields from analysis (not just grouping) ({kibana-pull}188913[#188913]). +* Single Metric Viewer embeddable: adds forecasting ({kibana-pull}188791[#188791]). +* Adds new custom rule action to force time shift ({kibana-pull}188710[#188710]). +* AIOps: Chunks groups of field candidates into single queries ({kibana-pull}188137[#188137]). +* AIOps: Adds log rate analysis to alert details page contextual insight ({kibana-pull}187690[#187690]). +* Adds ability to toggle visibility for empty fields when choosing an aggregation or field in Anomaly detection, data frame analytics ({kibana-pull}186670[#186670]). +* Anomaly Detection: Adds popover links menu to anomaly explorer charts ({kibana-pull}186587[#186587]). +Management:: +* Adds an option to show or hide empty fields in dropdown lists in Transform ({kibana-pull}195485[#195485]). +* Adds a confirmation dialog when deleting a transform from a warning banner ({kibana-pull}192080[#192080]). +* Improves the autocomplete to suggest fields for the `dense_vector` type in Console ({kibana-pull}190769[#190769]). +* Adds the ability to view an ILM policy details in read-only mode ({kibana-pull}186955[#186955]). + +[float] +[[fixes-v8.16.0]] +=== Bug fixes +Alerting:: +* Show up to 1k maintenance windows in the UI ({kibana-pull}198504[#198504]) +* Skip scheduling actions for the alerts without scheduledActions ({kibana-pull}195948[#195948]). +* Fixes Stack Alerts feature API access control ({kibana-pull}193948[#193948]). +* Remove unintended internal find routes API with public access ({kibana-pull}193757[#193757]). +* Convert timestamp before passing to validation ({kibana-pull}192379[#192379]). +* Grouped over field is not populated correctly when editing a rule ({kibana-pull}192297[#192297]). +* Mark slack rate-limiting errors as user errors ({kibana-pull}192200[#192200]). +* Fixes maintenance window filtering with wildcards ({kibana-pull}194777[#194777]). +* Fixes search filters in rules, alerts, and maintenance windows ({kibana-pull}193623[#193623]). +Cases:: +* Use absolute time ranges when adding visualizations to a case ({kibana-pull}189168[#189168]). +* Fixes custom fields with long text that could not be edited in the UI ({kibana-pull}190490[#190490]). 
+Dashboards and visualizations:: +* Correctly show full screen mode when opening a dashboard or panel from a URL that contains the fullScreenMode parameter ({kibana-pull}196275[#196275]) and ({kibana-pull}190086[#190086]). +* Fixes an issue that could cause a the dashboard list to stay in loading state ({kibana-pull}195277[#195277]). +* Correctly use the same field icons as Discover ({kibana-pull}194095[#194095]). +* Fixes an issue where panels could disappear from a dashboard when canceling edit after saving the dashboard ({kibana-pull}193914[#193914]). +* Adds scroll margin to panels ({kibana-pull}193430[#193430]). +* Fixes an issue with the breadcrumb update icon not working when clicked ({kibana-pull}192240[#192240]). +* Fixes an issue where unsaved changes could remain after saving a dashboard ({kibana-pull}190165[#190165]). +* Fixes an issue causing the flyout to close when canceling the Save to library action ({kibana-pull}188995[#188995]). +* Fixes incomplete string escaping and encoding in *TSVB* ({kibana-pull}196248[#196248]). +* Fixes an issue where label truncation in heat map legends was not working properly in *Lens* ({kibana-pull}195928[#195928]). +* Fixes an issue where the color picker and axis side settings were incorrectly available in the breakdown dimension editor for XY charts in *Lens* ({kibana-pull}195845[#195845]). +* Fixes the tooltip position on faceted charts in *Vega* ({kibana-pull}194620[#194620]). +* Fixes the filter out legend action for ES|QL visualizations ({kibana-pull}194374[#194374]). +* Fixes element sizing issues in full screen mode in *Vega* ({kibana-pull}194330[#194330]). +* Fixes the default cell text alignment setting for non-numeric field types in *Lens* ({kibana-pull}193886[#193886]). +* Limits the height of the query bar input for long KQL queries ({kibana-pull}193737[#193737]). +* Makes the title correctly align left after removing an icon in **Lens** metric charts ({kibana-pull}191057[#191057]). +* Fixes a "No data" error caused by the "Collapse by" setting in **Lens** metric charts ({kibana-pull}190966[#190966]). +* Fixes an issue causing the color of a cell to disappear when clicking the "Expand cell" icon in *Lens* ({kibana-pull}190618[#190618]). +* Removes unnecessary index pattern references from Lens charts ({kibana-pull}190296[#190296]). +* Fixes several accessibility issues ({kibana-pull}188624[#188624]). +Data ingestion and Fleet:: +* Revert "Fix client-side validation for agent policy timeout fields" ({kibana-pull}194338[#194338]). +* Adds proxy arguments to install snippets ({kibana-pull}193922[#193922]). +* Rollover if dimension mappings changed in dynamic templates ({kibana-pull}192098[#192098]). +Discover:: +* Fixes an issue with search highlighting ({kibana-pull}197607[#197607]). +* Correctly pass embeddable filters to the Surrounding Documents page ({kibana-pull}197190[#197190]). +* Fixes trailing decimals dropped from client side validation messages ({kibana-pull}196570[#196570]). +* Fixes several validation issues and creates an expression type evaluator for ES|QL queries ({kibana-pull}195989[#195989]). +* Fixes duplicate autocomplete suggestions for `WHERE` clauses and suggestions with no space in between in ES|QL queries ({kibana-pull}195771[#195771]). +* Improves variable and field name handling in ES|QL queries ({kibana-pull}195149[#195149]). +* Fixes an issue where the Unified Field List popover could get cut off ({kibana-pull}195147[#195147]). 
+* Fixes the width for saved object type columns ({kibana-pull}194388[#194388]). +* Adds tooltips to Discover button icons ({kibana-pull}192963[#192963]). +* Excludes inactive integration data stream suggestions ({kibana-pull}192953[#192953]). +* Fixes new variables being suggested in incorrect places ({kibana-pull}192405[#192405]). +* Only log requests in the Inspector when they completed ({kibana-pull}191232[#191232]). +ES|QL editor:: +* Fixes an issue where the autocomplete suggestions could cause duplicate entries in ES|QL queries ({kibana-pull}190465[#190465]). +* Fixes several styling issues in the ES|QL editor ({kibana-pull}190170[#190170]). +Elastic Observability solution:: +* Change the slice outcome from bad to good whenever there is no data during the slice window ({kibana-pull}196942[#196942]). +* Make agent names generic with otel-native mode ({kibana-pull}195594[#195594]). +* Avoid showing unnecessary error toast ({kibana-pull}195331[#195331]). +* Use `fields` instead of `_source` on APM queries ({kibana-pull}195242[#195242]). +* Fixes ping heatmap payload ({kibana-pull}195107[#195107]). +* Fixes rule modal warnings in the developer console ({kibana-pull}194766[#194766]). +* Avoid AI assistant overlaying AI conversations ({kibana-pull}194722[#194722]). +* Improve loading state for metric items ({kibana-pull}192930[#192930]). +* Fixes issue where heatmap UI crashes on undefined histogram data ({kibana-pull}192508[#192508]). +* Calculate the latest metadata lookback based on the calculated history delay ({kibana-pull}191324[#191324]). +* Remove dedicated language setting ({kibana-pull}190983[#190983]). +* Change latest metric to use @timestamp ({kibana-pull}190417[#190417]). +* Prevent initial error when adding filters ({kibana-pull}190214[#190214]). +* Display error message when failing to enable machine learning anomaly detection in Inventory ({kibana-pull}189627[#189627]). +* Convert route validation to Zod ({kibana-pull}188691[#188691]). +* Fixes functions table height in asset details view profiling tab ({kibana-pull}188650[#188650]). +* Adds four decimal places float validation for transaction_sample_rate ({kibana-pull}188555[#188555]). +* Centralize data fetching and better control of when data can be refreshed ({kibana-pull}187736[#187736]). +* Fixes heatmap on monitor detail/history page for very large doc counts ({kibana-pull}184177[#184177]). +* Adds settings to serverless allowlist ({kibana-pull}190098[#190098]). +* Set missing group to false by default and show checkbox value in disable mode ({kibana-pull}188402[#188402]). +Elastic Search solution:: +* Fixes an issue with the {ref}/es-connectors-network-drive.html[Network Drive connector] where advanced configuration fields were not displayed for CSV file role mappings with `Drive Type: Linux` selected. +Elastic Security solution:: +For the Elastic Security 8.16.0 release information, refer to {security-guide}/release-notes.html[_Elastic Security Solution Release Notes_]. +Kibana platform:: +* Fixes an issue causing a wrong date to show in the header of a report when generated from relative date ({kibana-pull}197027[#197027]). +* Fixes an issue where the Created and Updated timestamps for Dashboards were ignoring the default timezone settings in Advanced settings. ({kibana-pull}196977[#196977]). +* Fixes an issue causing searches including a colon `:` character to show inaccurate results ({kibana-pull}190464[#190464]). 
+Kibana security:: +* Fixes an issue where an LLM was likely to generate invalid processors containing array access in Automatic Import ({kibana-pull}196207[#196207]). +Machine Learning:: +* File upload: fixes PDF character count limit ({kibana-pull}197333[#197333]). +* Data Drift: Updates brush positions on window resize fix ({kibana-pull}196830[#196830]). +* AIOps: Fixes issue where some queries cause filters to not be applied ({kibana-pull}196585[#196585]). +* Transforms: Limits the data grid result window ({kibana-pull}196510[#196510]). +* Fixes Anomaly Swim Lane Embeddable not updating properly on query change ({kibana-pull}195090[#195090]). +* Hides ES|QL based saved searches in ML & Transforms ({kibana-pull}195084[#195084]). +* Fixes query for pattern analysis and change point analysis ({kibana-pull}194742[#194742]). +* Anomaly explorer: Shows data gaps and connect anomalous points on Single Metric Charts ({kibana-pull}194119[#194119]). +* Fixes file upload with no ingest pipeline ({kibana-pull}193744[#193744]). +* Disables field statistics panel in Dashboard if ES|QL is disabled ({kibana-pull}193587[#193587]). +* Fixes display of assignees when attaching ML panels to a new case ({kibana-pull}192163[#192163]). +* Anomaly explorer: Fixes the order of the coordinates displayed on the map tooltip ({kibana-pull}192077[#192077]). +* Fixes links to the Single Metric Viewer from the Annotations and Forecasts tables ({kibana-pull}192000[#192000]). +* Trained models: fixes responsiveness of state column for smaller displays ({kibana-pull}191900[#191900]). +* File upload: increases timeout for upload request ({kibana-pull}191770[#191770]). +* Improves expired license check ({kibana-pull}191503[#191503]). +Management:: +* Fixes the pagination of the source documents data grid in Transforms ({kibana-pull}196119[#196119]). +* Fixes autocomplete suggestions after a comma in Console ({kibana-pull}189656[#189656]). + + [[release-notes-8.15.3]] == {kib} 8.15.3 @@ -3852,7 +4207,7 @@ In 8.1.0 and later, {kib} uses the field caps API, by default, to determine the `visualization:visualize:legacyPieChartsLibrary` has been removed from *Advanced Settings*. The setting allowed you to create aggregation-based pie chart visualizations using the legacy charts library. For more information, refer to {kibana-pull}146990[#146990]. *Impact* + -In 7.14.0 and later, the new aggregation-based pie chart visualization is available by default. For more information, check link:https://www.elastic.co/guide/en/kibana/current/add-aggregation-based-visualization-panels.html[Aggregation-based]. +In 7.14.0 and later, the new aggregation-based pie chart visualization is available by default. For more information, check <>. ==== [discrete] diff --git a/docs/upgrade-notes.asciidoc b/docs/upgrade-notes.asciidoc index 5bde93df15490..98f7feeac2d6a 100644 --- a/docs/upgrade-notes.asciidoc +++ b/docs/upgrade-notes.asciidoc @@ -1019,7 +1019,7 @@ In 8.1.0 and later, {kib} uses the field caps API, by default, to determine the `visualization:visualize:legacyPieChartsLibrary` has been removed from *Advanced Settings*. The setting allowed you to create aggregation-based pie chart visualizations using the legacy charts library. For more information, refer to {kibana-pull}146990[#146990]. *Impact* + -In 7.14.0 and later, the new aggregation-based pie chart visualization is available by default. For more information, check link:https://www.elastic.co/guide/en/kibana/current/add-aggregation-based-visualization-panels.html[Aggregation-based]. 
+In 7.14.0 and later, the new aggregation-based pie chart visualization is available by default. For more information, check <>. ==== [discrete] @@ -1694,6 +1694,30 @@ When you create *Lens* visualization, the default for the *Legend width* is now [float] ==== Elastic Observability solution +[discrete] +[[deprecation-192003]] +* Deprecated the Observability AI Assistant specific advanced setting `observability:aiAssistantLogsIndexPattern`. (8.16) +[%collapsible] +==== +*Details* + +The Observability AI Assistant specific advanced setting for Logs index patterns `observability:aiAssistantLogsIndexPattern` is deprecated and no longer used. The AI Assistant will now use the existing **Log sources** setting `observability:logSources` instead. For more information, refer to ({kibana-pull}192003[#192003]). + +//*Impact* + +//!!TODO!! +==== + +[discrete] +[[deprecation-194519]] +* The Logs Stream was hidden by default in favor of the Logs Explorer app. (8.16) +[%collapsible] +==== +*Details* + +You can find the Logs Explorer app in the navigation menu under Logs > Explorer, or as a separate tab in Discover. For more information, refer to ({kibana-pull}194519[#194519]). + +*Impact* + +You can still show the Logs Stream app again by navigating to Stack Management > Advanced Settings and by enabling the `observability:enableLogsStream` setting. +==== + [discrete] [[deprecation-120689]] diff --git a/docs/user/images/dashboard-star.png b/docs/user/images/dashboard-star.png new file mode 100644 index 0000000000000..25219d8866c0b Binary files /dev/null and b/docs/user/images/dashboard-star.png differ diff --git a/docs/user/images/dashboard-usage.png b/docs/user/images/dashboard-usage.png new file mode 100644 index 0000000000000..e18843511e21a Binary files /dev/null and b/docs/user/images/dashboard-usage.png differ diff --git a/docs/user/images/discover-log-level.png b/docs/user/images/discover-log-level.png new file mode 100644 index 0000000000000..a6de92c0ae020 Binary files /dev/null and b/docs/user/images/discover-log-level.png differ diff --git a/docs/user/images/esql-autocomplete-suggestions.png b/docs/user/images/esql-autocomplete-suggestions.png new file mode 100644 index 0000000000000..bd78201b0d121 Binary files /dev/null and b/docs/user/images/esql-autocomplete-suggestions.png differ diff --git a/docs/user/images/esql-suggestions.png b/docs/user/images/esql-suggestions.png new file mode 100644 index 0000000000000..234f0339003a1 Binary files /dev/null and b/docs/user/images/esql-suggestions.png differ diff --git a/docs/user/images/ip-location-processor.png b/docs/user/images/ip-location-processor.png new file mode 100644 index 0000000000000..b1de4a540f52d Binary files /dev/null and b/docs/user/images/ip-location-processor.png differ diff --git a/docs/user/images/metric-customization.png b/docs/user/images/metric-customization.png new file mode 100644 index 0000000000000..238df1aee82ac Binary files /dev/null and b/docs/user/images/metric-customization.png differ diff --git a/docs/user/images/monaco-console.png b/docs/user/images/monaco-console.png new file mode 100644 index 0000000000000..3bdd4be4eb498 Binary files /dev/null and b/docs/user/images/monaco-console.png differ diff --git a/docs/user/images/solution-view-obs.png b/docs/user/images/solution-view-obs.png new file mode 100644 index 0000000000000..4ae5942dbae37 Binary files /dev/null and b/docs/user/images/solution-view-obs.png differ diff --git a/docs/user/images/space-settings.png b/docs/user/images/space-settings.png new file mode 
100644 index 0000000000000..a3a38c1ca88c7 Binary files /dev/null and b/docs/user/images/space-settings.png differ diff --git a/docs/user/images/table-coloring.png b/docs/user/images/table-coloring.png new file mode 100644 index 0000000000000..6c96daf381164 Binary files /dev/null and b/docs/user/images/table-coloring.png differ diff --git a/docs/user/whats-new.asciidoc b/docs/user/whats-new.asciidoc index 2a726ba3dc4f3..25568518ad2ec 100644 --- a/docs/user/whats-new.asciidoc +++ b/docs/user/whats-new.asciidoc @@ -1,175 +1,144 @@ [[whats-new]] -== What's new in 8.15 +== What's new in 8.16 -Here are the highlights of what's new and improved in 8.15. +Here are the highlights of what's new and improved in 8.16. For detailed information about this release, check the <>. -Previous versions: {kibana-ref-all}/8.14/whats-new.html[8.14] | {kibana-ref-all}/8.13/whats-new.html[8.13] | {kibana-ref-all}/8.12/whats-new.html[8.12] | {kibana-ref-all}/8.11/whats-new.html[8.11] | {kibana-ref-all}/8.10/whats-new.html[8.10] | {kibana-ref-all}/8.9/whats-new.html[8.9] | {kibana-ref-all}/8.8/whats-new.html[8.8] | {kibana-ref-all}/8.7/whats-new.html[8.7] | {kibana-ref-all}/8.6/whats-new.html[8.6] | {kibana-ref-all}/8.5/whats-new.html[8.5] | {kibana-ref-all}/8.4/whats-new.html[8.4] | {kibana-ref-all}/8.3/whats-new.html[8.3] | {kibana-ref-all}/8.2/whats-new.html[8.2] | {kibana-ref-all}/8.1/whats-new.html[8.1] | {kibana-ref-all}/8.0/whats-new.html[8.0] +Previous versions: {kibana-ref-all}/8.15/whats-new.html[8.15] | {kibana-ref-all}/8.14/whats-new.html[8.14] | {kibana-ref-all}/8.13/whats-new.html[8.13] | {kibana-ref-all}/8.12/whats-new.html[8.12] | {kibana-ref-all}/8.11/whats-new.html[8.11] | {kibana-ref-all}/8.10/whats-new.html[8.10] | {kibana-ref-all}/8.9/whats-new.html[8.9] | {kibana-ref-all}/8.8/whats-new.html[8.8] | {kibana-ref-all}/8.7/whats-new.html[8.7] | {kibana-ref-all}/8.6/whats-new.html[8.6] | {kibana-ref-all}/8.5/whats-new.html[8.5] | {kibana-ref-all}/8.4/whats-new.html[8.4] | {kibana-ref-all}/8.3/whats-new.html[8.3] | {kibana-ref-all}/8.2/whats-new.html[8.2] | {kibana-ref-all}/8.1/whats-new.html[8.1] | {kibana-ref-all}/8.0/whats-new.html[8.0] [discrete] -=== ES|QL +=== Solution-oriented navigation +On Elastic Cloud Hosted deployments running on version 8.16, you can now navigate Kibana using a lighter, solution-oriented left navigation menu, called **Solution view**. -[discrete] -==== Filter UX improvements in ES|QL +There are four selectable solution views: Search, Observability, Security, and Classic. Search, Observability, and Security are the new navigation menus. Each of those brings simplicity by focusing the left navigation menu on a relevant subset of features, scoped to its associated use cases, and offers a dedicated home page. Classic has the same navigation menu as 8.15 and before. -We're thrilled to unveil a complete overhaul of filtering in the ES|QL UX. Now, you can seamlessly filter data by browsing a time series chart, allowing for quick and intuitive time-based filtering. Interactive chart filtering lets you refine your data directly by clicking on any chart, while creating WHERE clause filters from the Discover table or sidebar has never been easier. These enhancements streamline data exploration and analysis, making your ES|QL experience more efficient and user-friendly than ever. +Each space has its own solution view setting which determines the navigation experience for all users of that space. 
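If you manage spaces programmatically rather than through the UI, the same change can be scripted. The following is a minimal sketch, not taken from this release, that assumes the 8.16 Spaces HTTP API accepts a `solution` property with the values `es`, `oblt`, `security`, or `classic`; the deployment URL, API key, and space details are placeholders to replace with your own.

[source,typescript]
----
// Sketch: set a space's solution view via the Kibana Spaces HTTP API.
// Assumptions: the 8.16 API accepts a `solution` property; KIBANA_URL and
// API_KEY are placeholders for your own deployment and credentials.
const KIBANA_URL = 'https://my-deployment.kb.example.com'; // placeholder
const API_KEY = '<api-key>'; // placeholder

type SolutionView = 'es' | 'oblt' | 'security' | 'classic';

async function setSolutionView(spaceId: string, name: string, solution: SolutionView) {
  const response = await fetch(`${KIBANA_URL}/api/spaces/space/${spaceId}`, {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/json',
      'kbn-xsrf': 'true', // required header for Kibana HTTP APIs
      Authorization: `ApiKey ${API_KEY}`,
    },
    // The update endpoint replaces the space object, so pass the existing name.
    body: JSON.stringify({ id: spaceId, name, solution }),
  });
  if (!response.ok) {
    throw new Error(`Failed to update space ${spaceId}: ${response.status}`);
  }
  return response.json();
}

// Example: switch the default space to the Observability solution view.
setSolutionView('default', 'Default', 'oblt').catch(console.error);
----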
-*Filter by clicking a chart:* +When creating a new deployment, you will now be asked to choose between one of the 3 new solution views for your default space. If you prefer to stick with the classic, multi-layered navigation, you can do so once the deployment is created by navigating to your space settings. -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt965a5190f246f7c8/669a7d41e5f7c84793b031cb/filter-by-clicking-chart.gif[Filter by clicking a chart] +Deployments upgrading from a previous version to 8.16 keep the classic navigation. Admins can enable one of the new solution views from the space settings. -*Filter by browsing a time series chart:* +image::images/solution-view-obs.png[Example of observability solution view] +_The Observability solution view and its Home page._ -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blta20c9a93dded707c/669a7d40843f93a02fe51013/filter-by-brushing-time-series.gif[Filter by browsing a time series chart] +[discrete] +=== Discover and ES|QL -*Create WHERE clause filters from Discover table or sidebar:* +[discrete] +==== Contextual Data presentation -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt50ac35ab3af29ff8/669a7d4006a6fafe4c7cb39d/create-where-clause-filters-from-sidebar.gif[Create WHERE clause filters from Discover table or sidebar] +In this release, Discover introduces enhanced contextual data presentation. Previously, you needed to manually select relevant fields and set up your workspace before diving into data exploration. Now, Discover automatically tailors the user experience based on the data being explored, powered by a scalable contextual architecture. For example, when analyzing logs, you'll see a *log.level* field rendered directly in the table, a custom Logs overview in the document viewer, and log.level indicators on individual rows. +image::images/discover-log-level.png[Log level badge displaying in the Discover grid] [discrete] -==== Field statistics in ES|QL +==== Recommended ES|QL queries -Field statistics are now available in ES|QL. This feature is designed to provide comprehensive insights for each data field. With this enhancement, you can access detailed statistics such as distributions, averages, and other key metrics, helping you quickly understand your data. This makes data exploration and quality assessment more efficient, providing deeper insights and streamlining the analysis of field-level data in ES|QL. +Writing ES|QL queries just got easier. Many users face challenges when authoring queries, and even more so when unfamiliar with the syntax or data structure. This can lead to inefficiencies in data analysis and visualization. We want to reduce the time it takes to create queries and to lower the learning curve for both new and existing users by suggesting recommended queries within the ES|QL Help menu and from the auto-complete. -image::images/field-statistics-esql.png[Field statistics in ES|QL] +image::images/esql-suggestions.png[A list of suggestions to get started with an ES|QL query, width=30%] +_Recommended ES|QL queries from the ES|QL help menu_ -[discrete] -==== Integrations support in the ES|QL editor when using FROM command. +image::images/esql-autocomplete-suggestions.png[A list of suggestions in the autocomplete menu of an ES|QL query, width=50%] +_Recommended ES|QL queries from auto-complete suggestions_ -We're excited to announce enhanced support for integrations in the ES|QL editor with the *FROM* command. 
Previously, you could only access indices, but now you can also view a list of installed integrations directly within the editor. This improvement streamlines your workflow, making it easier to manage and utilize various integrations while working with your data. - -image::images/integrations-in-esql.png[Accessing an integration from ES|QL] [discrete] === Dashboards [discrete] -==== Field statistics in Dashboards - -It's now easier than ever to include your field statistics view from **Discover** into **Dashboards**. While running investigations, it is very common that you need to see some field information, such as unique values and their distribution, to make sense of the data. Select the fields that you want with your ES|QL query and get the document count, values, and distribution in your dashboard so you don't have to navigate back and forth to **Discover** to see this information. +==== Manage dashboards more easily and efficiently +As part of a series of improvements to help you find and manage your dashboards https://www.elastic.co/guide/en/kibana/8.15/whats-new.html#_view_dashboard_creator_and_last_editor[started in version 8.15], the new default way to sort your dashboards is by recently viewed, and we are adding an option to star your favorite dashboards, as well as some statistics to monitor the usage of your dashboards. -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt9bc52ff7851acc52/669a4f6a490fbc64fa22f279/field-statistics.gif[Showing field statistics panel in Dashboards] +You can find your favorite dashboards in the new **Starred** tab. -[discrete] -==== Statistics in legends +image::images/dashboard-star.png[Viewing starred dashboards] -Accelerate time to insights by summarizing the values of your charts using average, minimum, maximum, median, and variance, among many others. You can add these statistics for **Lens** and ES|QL visualizations. It is important to note that these statistics are computed using the data points from the chart considering the aggregation used and not the raw data. In the following example, the chart shows the median memory per host, so the Max = 15.3KB for the first series (artifacts.elastic.co) is the maximum value of the median memory per host. +By opening a dashboard's details using the “info” icon from the dashboard list view, you can now get a sense of the popularity of that dashboard with a histogram showing how many times the dashboard was viewed in the last 90 days. -image::images/statistics-in-legends.png[Statistics in legends] +image::images/dashboard-usage.png[Dashboard usage chart] -You can find the option to select statistics for your legends along with an explanation for each calculation when editing your visualization, as shown in the following image. +[discrete] +==== Log Pattern Analysis dashboard panels +Log Pattern Analysis panels are now available for you to add to your dashboards, making AIOps even more embedded in your workflows and where you need it. When filtering patterns, the dashboard’s data adjusts accordingly. You can also choose the filtering to transition you into Discover for further exploration. 
-image::images/statistics-in-legends2.png[Select statistics in legends] +image:https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt8288e01386b5830c/67222fb0d2da223e27bc1e67/log_analysis_panel.gif[Log pattern analysis panel in dashboards] [discrete] -==== View dashboard creator and last editor +==== Color text values in tables +Previously, you could only decide to color numeric values in tables. We're adding the ability to also color your string values. You can decide whether you want to color the whole cell, or only the text. -You can now see who created and who last updated a dashboard. +image::images/table-coloring.png[Coloring table cells with string values] -You can find the creator information right from the dashboard list. -image::images/dashboard-creator.png[Dashboard creator column in dashboard list] +[discrete] +==== Formatting options for your metrics +We've received a lot of feedback asking for more flexibility to customize the appearance of your metrics. In this version, we are adding the ability to customize the title and value alignment, as well as the font size. Selecting the *Fit* option will adjust the font size and make the metric value occupy the entire panel. -Quickly find all dashboards created by the same user with a simple filter. +image::images/metric-customization.png[Customization options for a metric panel] -image::images/dashboard-creator-filter.png[Filtering dashboards by creator] -Note that the creator information will be visible only for dashboards created on or after version 8.14. -You can also see who last updated a dashboard by clicking the dashboard information icon from the dashboard list. The creator is also visible next to it. This information is immutable and cannot be changed. +//[discrete] +//=== Alerting, cases, and connectors -image::images/dashboard-last-editor.png[Dashboard details panel with the name of the last editor] [discrete] -=== Discover +=== Managing {kib} and data [discrete] -==== Push flyout for Discover document viewer +==== Edit space access from the space settings +As an admin, you can now assign roles to and edit role permissions on a given space directly from the settings of that space. -You can now seamlessly view document details and the main table simultaneously in **Discover** with the new _push_ flyout. You can adjust the width of the flyout to suit your needs and explore your data much more easily. - -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/bltb40a408acf4ab688/669a58ea9fecd85219d58ed2/discover-push-flyout.gif[Resizable push flyout in Discover] +Prior to 8.16, you could only do this from the role settings, which was counterintuitive. +image::space-settings.png[Editing space settings with new options] [discrete] -=== Alerting, cases, and connectors +==== New IP Location processor +Enhancing location information based on IP addresses just got easier with the new IP Location processor. In addition to the existing free GeoLite offerings from MaxMind, we have integrated with MaxMind’s premium GeoIP databases for users who have licensed MaxMind’s products. If you're an Enterprise Elastic customer, you now have an additional third-party product, IP Info, available for use as well. These additional data sources provide improved options for enriching data with location information associated with IP addresses to improve telemetry and insights. To utilize these features beyond the free MaxMind GeoIP database, you will need to have licensed premium MaxMind products and/or the IP Info database. 
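For reference, the new processor is configured much like the existing GeoIP processor. The sketch below is illustrative rather than code from this release: it assumes the processor type is named `ip_location` and accepts GeoIP-style options such as `field`, `target_field`, and `ignore_missing`, and it uses placeholder connection details and field names.

[source,typescript]
----
import { Client } from '@elastic/elasticsearch';

// Sketch: create an ingest pipeline that enriches documents with location
// data from an IP address. Assumptions: the processor type is `ip_location`
// with geoip-style options; node URL, API key, and field names are placeholders.
const client = new Client({
  node: 'https://my-deployment.es.example.com:443', // placeholder
  auth: { apiKey: '<api-key>' }, // placeholder
});

async function createIpLocationPipeline() {
  await client.ingest.putPipeline({
    id: 'ip-location-enrich',
    description: 'Add location details derived from source.ip',
    processors: [
      {
        ip_location: {
          field: 'source.ip', // assumed source field
          target_field: 'source.geo',
          ignore_missing: true,
        },
      },
    ],
  });
}

createIpLocationPipeline().catch(console.error);
----

Which database backs the lookup depends on the MaxMind or IP Info databases you have licensed and added through the database management support in Ingest Pipelines.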
-[discrete] -==== Case templates - -{kib} cases offer a new powerful capability to enhance the efficiency of your analyst teams with <>. -You can manage multiple templates, each of which can be used to auto-populate values in a case with pre-defined knowledge. -This streamlines the investigative process and significantly reduces time to resolution. +image::images/ip-location-processor.png[The IP Location processor] [discrete] -==== Case custom fields are GA +==== File uploader PDF support +The file uploader provides a quick way to upload data and start using Elastic. In 8.16, we are improving it to allow you to upload data from PDF files. -In 8.11, <> were added to cases and they are now moving from technical preview to general availability. -You can set custom field values in your templates to enhance consistency across cases. +image:https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blte8f0b295330b7e68/67222fb0ca492a5044b51bd8/file_uploader_pdf.gif[File uploader with PDF support] [discrete] -==== {sn} additional fields +=== Developer Tools Console redesign +We're excited to introduce a number of improvements to the overall user experience on one of our most popular features: **Console**. If you're new to Console, you will be welcomed by an onboarding tour that will help you get started quickly with your first requests. And if you're already a regular Console user, you will notice a variety of new features, including the ability to copy outputs to the clipboard, import and export request files, enjoy improved responsiveness, and other quality of life improvements. -You can now create enriched {sn} tickets based on detected alerts with a more comprehensive structure that matches the {sn} ticket scheme. -A new JSON field is now available as part of the {sn} action, which enables you to send any field from {kib} alerts to {sn} tickets. - -[discrete] -==== {webhook-cm} SSL auth support - -It's common for organizations to integrate with third parties using secured authentication. -Currently, most of the available case connectors use basic authentication (user and passwords or tokens), which might not be sufficient to meet organization security policies. -With this release, the <> now supports client certification, which enables you to leverage the connector for secured integration with third parties. - -The {webhook-cm} connector also moves from technical preview to general availability in this release. +image::images/monaco-console.png[Console's redesign featuring the Monaco editor] [discrete] === Machine Learning [discrete] -==== Improved UX for Log Pattern Analysis in Discover +==== The Inference API is now Generally Available -Analyze large volumes of logs efficiently, in very short times with Log Pattern Analysis in **Discover**. In 8.15, we redesigned the Log Pattern Analysis user flow in **Discover** to make it easier to use. Discover log patterns with one click for the message field (and other applicable text fields) and easily filter in and out logs to drastically reduce MTTR. - -image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt7e63d7e764ab183e/669a807bd316c7015db35458/ml-log-pattern-analysis.gif[New log pattern analysis interface] +Starting in 8.16, the {ref}/inference-apis.html[Inference API] is now GA, offering production-level stability, robustness and performance. 
Elastic’s Inference API integrates the state-of-the-art in AI inference, including ELSER, your Elastic hosted models and {ref}/put-inference-api.html#put-inference-api-desc[an increasing array of external models and tasks] in a unified, lean syntax. Used with {ref}/semantic-text.html[semantic_text] or the vector fields supported by the Elastic vector database, you can perform AI search, reranking, and completion with simplicity. In 8.16, we're also adding streamed completions for improved flows and real time interactions and GenAI experiences. [discrete] -==== Log Rate Analysis contextual insights in serverless Observability +==== ELSER and trained models adaptive resources and chunking strategies -You can now see insights in natural language, for example for the root cause of a log rate change or threshold alert, in Log Rate Analysis. This feature is currently only available for Observability serverless projects. +From 8.16, ELSER and the other AI search and NLP models you use in Elastic automatically adapt resource consumption according to the inference load, providing the performance you need during peak times and reducing the cost during slow periods, all the way down to zero cost during idle times. -image::images/obs-log-rate-analysis-insigths.png[Log Rate Analysis contextual insights in serverless Observability] +We're also improving the UX through which you deploy your models. You can provision search-optimized and ingest-optimized model deployments with a one-click selection. An optimized configuration is created without the need to specify parameters such as threads and allocations. Combined with the flexibility of ML auto-scaling on Elastic Cloud and the incredible elasticity of Elastic Cloud Serverless, you are in full control of both performance and cost. -[discrete] -==== Inference API improvements +image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt429790e1de1b4f93/67222fb048ec8c73255ef4eb/trained_models.gif[Trained models and ELSER] -The inference API provides a seamless, intuitive interface to perform inference and other tasks against proprietary, hosted, and integrated external services. In 8.15, we're extending it with the following capabilities: +In addition, from 8.16 you can choose between a word or sequence-based chunking strategy to use with your trained models, and you can also customize the maximum size and overlap parameters. A suitable chunking strategy can result in gains depending on the model you use, the length and nature of the texts and the length and complexity of the search queries. -* Support for Anthropic's chat completion API. -* Ability to host cross encoder models and perform the reranking task. - - -[discrete] -=== Managing {kib} users and objects [discrete] -==== Sharing improvements +==== Support for Daylight Saving Time changes in Anomaly Detection -You can now share a dashboard, search, or Lens object in one click. When sharing an object, the most common actions are directly presented to you, and a short link is automatically generated, making it simpler than ever to share your work. +In 8.16, we are introducing support for DST changes in Anomaly Detection. Set up a DST calendar by selecting the right timezone and apply it to your anomaly detection jobs individually or in groups. This feature eliminates any false positives that you may have experienced previously due to Daylight Saving Time changes, and works without the need for your intervention for many years ahead. 
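Under the hood this builds on the existing machine learning calendar APIs. The sketch below is a rough illustration, not code from this release: it creates a calendar, attaches it to a job group, and adds a single hand-written event, whereas the new UI generates the recurring DST events for the timezone you select (and may set additional event fields that vary by version). The calendar id, job group, and dates are placeholders.

[source,typescript]
----
import { Client } from '@elastic/elasticsearch';

// Sketch: the calendar plumbing the DST feature builds on. The Kibana UI
// generates the per-transition events for the selected timezone; here we add
// a single placeholder event by hand.
const client = new Client({
  node: 'https://my-deployment.es.example.com:443', // placeholder
  auth: { apiKey: '<api-key>' }, // placeholder
});

async function createDstCalendar() {
  await client.ml.putCalendar({
    calendar_id: 'dst-europe-berlin', // placeholder id
    description: 'Daylight saving time changes for Europe/Berlin',
  });

  // Apply the calendar to a job group so every job in the group is covered.
  await client.ml.putCalendarJob({
    calendar_id: 'dst-europe-berlin',
    job_id: 'production-metrics', // placeholder job group name
  });

  // One example event; the UI creates one per DST transition, years ahead.
  await client.ml.postCalendarEvents({
    calendar_id: 'dst-europe-berlin',
    events: [
      {
        description: 'Spring forward 2025 (placeholder)',
        start_time: Date.parse('2025-03-30T01:00:00Z'),
        end_time: Date.parse('2025-03-30T04:00:00Z'),
      },
    ],
  });
}

createDstCalendar().catch(console.error);
----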
-image::images/share-modal.png[New object share modal, width=50%] - -[discrete] -==== Quick API key creation - -Many API keys don’t require custom settings, so we made it simple to generate a standard key. From the **Endpoints & API keys** top menu in Search, you can create a key in seconds. - -image::images/create-simple-api-key.png[Shortcut to create an API key, width=60%] - -[discrete] -==== Filtering by User in Kibana Audit Logs +image::https://images.contentstack.io/v3/assets/bltefdd0b53724fa2ce/blt5fb82f18cde26710/67222fb086339971144a31e5/daylight_savings.gif[DST support in Anomaly Detection] -We are pleased to share that ignoring events by user in Kibana audit logs is now possible. This enhancement will give you more flexibility to reduce the overall number of events logged by the Kibana audit logs service and to control the volume of data being generated in audit logs. While we currently offer a number of ways to do this using the `xpack.security.audit.ignore_filters.[]` configuration setting, there wasn't an easy option to filter by user. With this addition, you can configure Kibana audit logs to ignore events based on values from the following fields: users, spaces, outcomes, categories, types and actions. \ No newline at end of file diff --git a/examples/response_stream/public/containers/app/pages/page_redux_stream/hooks.ts b/examples/response_stream/public/containers/app/pages/page_redux_stream/hooks.ts index f1c8c671611a8..735e70916593f 100644 --- a/examples/response_stream/public/containers/app/pages/page_redux_stream/hooks.ts +++ b/examples/response_stream/public/containers/app/pages/page_redux_stream/hooks.ts @@ -8,10 +8,9 @@ */ import type { TypedUseSelectorHook } from 'react-redux'; -import { useDispatch, useSelector, useStore } from 'react-redux'; -import type { AppDispatch, AppStore, RootState } from './store'; +import { useDispatch, useSelector } from 'react-redux'; +import type { AppDispatch, RootState } from './store'; // Use throughout your app instead of plain `useDispatch` and `useSelector` export const useAppDispatch: () => AppDispatch = useDispatch; export const useAppSelector: TypedUseSelectorHook = useSelector; -export const useAppStore: () => AppStore = useStore; diff --git a/package.json b/package.json index 17e873a89c7be..cda08f1a5b06f 100644 --- a/package.json +++ b/package.json @@ -1111,7 +1111,7 @@ "del": "^6.1.0", "diff": "^5.1.0", "dotenv": "^16.4.5", - "elastic-apm-node": "^4.8.0", + "elastic-apm-node": "^4.8.1", "email-addresses": "^5.0.0", "eventsource-parser": "^1.1.1", "execa": "^5.1.1", @@ -1497,7 +1497,7 @@ "@octokit/rest": "^17.11.2", "@parcel/watcher": "^2.1.0", "@playwright/test": "=1.46.0", - "@redocly/cli": "^1.25.7", + "@redocly/cli": "^1.25.8", "@statoscope/webpack-plugin": "^5.28.2", "@storybook/addon-a11y": "^6.5.16", "@storybook/addon-actions": "^6.5.16", @@ -1718,7 +1718,7 @@ "exit-hook": "^2.2.0", "expect": "^29.7.0", "expose-loader": "^0.7.5", - "express": "^4.21.0", + "express": "^4.21.1", "faker": "^5.1.0", "fetch-mock": "^7.3.9", "file-loader": "^4.2.0", diff --git a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/apis/helpers/common.ts b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/apis/helpers/common.ts index 27bd0918b0f9b..870f6833b4edc 100644 --- a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/apis/helpers/common.ts +++ b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/apis/helpers/common.ts @@ -105,10 +105,14 @@ 
export class CommonHelper { if (!id) { return SavedObjectsUtils.generateId(); } - // only allow a specified ID if we're overwriting an existing ESO with a Version - // this helps us ensure that the document really was previously created using ESO - // and not being used to get around the specified ID limitation - const canSpecifyID = (overwrite && version) || SavedObjectsUtils.isRandomId(id); + + const shouldEnforceRandomId = this.encryptionExtension?.shouldEnforceRandomId(type); + + // Allow specified ID if: + // 1. we're overwriting an existing ESO with a Version (this helps us ensure that the document really was previously created using ESO) + // 2. enforceRandomId is explicitly set to false + const canSpecifyID = + !shouldEnforceRandomId || (overwrite && version) || SavedObjectsUtils.isRandomId(id); if (!canSpecifyID) { throw SavedObjectsErrorHelpers.createBadRequestError( 'Predefined IDs are not allowed for saved objects with encrypted attributes unless the ID is a UUID.' diff --git a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/repository.encryption_extension.test.ts b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/repository.encryption_extension.test.ts index f5c8c8518a58a..cf66621565577 100644 --- a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/repository.encryption_extension.test.ts +++ b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/lib/repository.encryption_extension.test.ts @@ -261,6 +261,7 @@ describe('SavedObjectsRepository Encryption Extension', () => { it(`fails if non-UUID ID is specified for encrypted type`, async () => { mockEncryptionExt.isEncryptableType.mockReturnValue(true); + mockEncryptionExt.shouldEnforceRandomId.mockReturnValue(true); mockEncryptionExt.decryptOrStripResponseAttributes.mockResolvedValue({ ...encryptedSO, ...decryptedStrippedAttributes, @@ -291,6 +292,25 @@ describe('SavedObjectsRepository Encryption Extension', () => { ).resolves.not.toThrowError(); }); + it('allows to opt-out of random ID enforcement', async () => { + mockEncryptionExt.isEncryptableType.mockReturnValue(true); + mockEncryptionExt.shouldEnforceRandomId.mockReturnValue(false); + mockEncryptionExt.decryptOrStripResponseAttributes.mockResolvedValue({ + ...encryptedSO, + ...decryptedStrippedAttributes, + }); + + const result = await repository.create(encryptedSO.type, encryptedSO.attributes, { + id: encryptedSO.id, + version: mockVersion, + }); + + expect(client.create).toHaveBeenCalled(); + expect(mockEncryptionExt.isEncryptableType).toHaveBeenCalledWith(encryptedSO.type); + expect(mockEncryptionExt.shouldEnforceRandomId).toHaveBeenCalledWith(encryptedSO.type); + expect(result.id).toBe(encryptedSO.id); + }); + describe('namespace', () => { const doTest = async (optNamespace: string, expectNamespaceInDescriptor: boolean) => { const options = { overwrite: true, namespace: optNamespace }; @@ -483,6 +503,7 @@ describe('SavedObjectsRepository Encryption Extension', () => { it(`fails if non-UUID ID is specified for encrypted type`, async () => { mockEncryptionExt.isEncryptableType.mockReturnValue(true); + mockEncryptionExt.shouldEnforceRandomId.mockReturnValue(true); const result = await bulkCreateSuccess(client, repository, [ encryptedSO, // Predefined IDs are not allowed for saved objects with encrypted attributes unless the ID is a UUID ]); @@ -529,6 +550,25 @@ describe('SavedObjectsRepository Encryption Extension', () => { expect(result.saved_objects.length).toBe(1); 
expect(result.saved_objects[0].error).toBeUndefined(); }); + + it('allows to opt-out of random ID enforcement', async () => { + mockEncryptionExt.isEncryptableType.mockReturnValue(true); + mockEncryptionExt.shouldEnforceRandomId.mockReturnValue(false); + mockEncryptionExt.decryptOrStripResponseAttributes.mockResolvedValue({ + ...encryptedSO, + ...decryptedStrippedAttributes, + }); + + const result = await bulkCreateSuccess(client, repository, [ + { ...encryptedSO, version: mockVersion }, + ]); + + expect(client.bulk).toHaveBeenCalled(); + expect(result.saved_objects).not.toBeUndefined(); + expect(result.saved_objects.length).toBe(1); + expect(result.saved_objects[0].error).toBeUndefined(); + expect(result.saved_objects[0].id).toBe(encryptedSO.id); + }); }); describe('#bulkUpdate', () => { diff --git a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/mocks/saved_objects_extensions.mock.ts b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/mocks/saved_objects_extensions.mock.ts index 2061bb63240b2..9dc7c0f0133c5 100644 --- a/packages/core/saved-objects/core-saved-objects-api-server-internal/src/mocks/saved_objects_extensions.mock.ts +++ b/packages/core/saved-objects/core-saved-objects-api-server-internal/src/mocks/saved_objects_extensions.mock.ts @@ -17,6 +17,7 @@ const createEncryptionExtension = (): jest.Mocked => ({ diff --git a/packages/core/saved-objects/core-saved-objects-api-server-mocks/src/saved_objects_extensions.mock.ts b/packages/core/saved-objects/core-saved-objects-api-server-mocks/src/saved_objects_extensions.mock.ts index 2a2d121b568be..776ecfe3a7385 100644 --- a/packages/core/saved-objects/core-saved-objects-api-server-mocks/src/saved_objects_extensions.mock.ts +++ b/packages/core/saved-objects/core-saved-objects-api-server-mocks/src/saved_objects_extensions.mock.ts @@ -18,6 +18,7 @@ const createEncryptionExtension = (): jest.Mocked => ({ diff --git a/packages/core/saved-objects/core-saved-objects-server/src/extensions/encryption.ts b/packages/core/saved-objects/core-saved-objects-server/src/extensions/encryption.ts index 3fdb29203fe13..4560ab5672666 100644 --- a/packages/core/saved-objects/core-saved-objects-server/src/extensions/encryption.ts +++ b/packages/core/saved-objects/core-saved-objects-server/src/extensions/encryption.ts @@ -39,6 +39,14 @@ export interface ISavedObjectsEncryptionExtension { */ isEncryptableType: (type: string) => boolean; + /** + * Returns false if ESO type explicitly opts out of highly random UID + * + * @param type the string name of the object type + * @returns boolean, true by default unless explicitly set to false + */ + shouldEnforceRandomId: (type: string) => boolean; + /** * Given a saved object, will return a decrypted saved object or will strip * attributes from the returned object if decryption fails. diff --git a/packages/kbn-search-api-panels/components/language_client_panel.tsx b/packages/kbn-search-api-panels/components/language_client_panel.tsx index 2f89c8da7578c..2c07d3118d943 100644 --- a/packages/kbn-search-api-panels/components/language_client_panel.tsx +++ b/packages/kbn-search-api-panels/components/language_client_panel.tsx @@ -62,8 +62,12 @@ export const LanguageClientPanel: React.FC = ({ width={euiTheme.size.xl} /> - -
{language.name}
+ + {language.name} diff --git a/src/core/server/integration_tests/saved_objects/migrations/group3/incompatible_cluster_routing_allocation.test.ts b/src/core/server/integration_tests/saved_objects/migrations/group3/incompatible_cluster_routing_allocation.test.ts index 56488f571ef16..8213c880c0fa4 100644 --- a/src/core/server/integration_tests/saved_objects/migrations/group3/incompatible_cluster_routing_allocation.test.ts +++ b/src/core/server/integration_tests/saved_objects/migrations/group3/incompatible_cluster_routing_allocation.test.ts @@ -121,7 +121,7 @@ describe('incompatible_cluster_routing_allocation', () => { await root.preboot(); await root.setup(); - root.start().catch(() => { + const startPromise = root.start().catch(() => { // Silent catch because the test might be done and call shutdown before starting is completed, causing unwanted thrown errors. }); @@ -165,6 +165,7 @@ describe('incompatible_cluster_routing_allocation', () => { { retryAttempts: 100, retryDelayMs: 500 } ); + await startPromise; // Wait for start phase to complete before shutting down await root.shutdown(); }); }); diff --git a/src/plugins/dashboard/public/dashboard_app/hooks/use_observability_ai_assistant_context.tsx b/src/plugins/dashboard/public/dashboard_app/hooks/use_observability_ai_assistant_context.tsx index c20e8fcd1dc76..39ae4594d5bc8 100644 --- a/src/plugins/dashboard/public/dashboard_app/hooks/use_observability_ai_assistant_context.tsx +++ b/src/plugins/dashboard/public/dashboard_app/hooks/use_observability_ai_assistant_context.tsx @@ -114,9 +114,11 @@ export function useObservabilityAIAssistantContext({ }, metric: { type: 'object', + properties: {}, }, gauge: { type: 'object', + properties: {}, }, pie: { type: 'object', @@ -158,6 +160,7 @@ export function useObservabilityAIAssistantContext({ }, table: { type: 'object', + properties: {}, }, tagcloud: { type: 'object', diff --git a/src/plugins/data/server/search/session/session_service.ts b/src/plugins/data/server/search/session/session_service.ts index 4ef741ec78387..48b563c5585ca 100644 --- a/src/plugins/data/server/search/session/session_service.ts +++ b/src/plugins/data/server/search/session/session_service.ts @@ -402,8 +402,8 @@ export class SearchSessionService implements ISearchSessionService { const session = await this.get(deps, user, sessionId); const requestHash = createRequestHash(searchRequest.params); if (!Object.hasOwn(session.attributes.idMapping, requestHash)) { - this.logger.error(`SearchSessionService: getId | ${sessionId} | ${requestHash} not found`); - this.logger.debug( + this.logger.debug(`SearchSessionService: getId | ${sessionId} | ${requestHash} not found`); + this.logger.error( `SearchSessionService: getId not found search with params: ${JSON.stringify( searchRequest.params )}` diff --git a/src/plugins/saved_objects/kibana.jsonc b/src/plugins/saved_objects/kibana.jsonc index 86aa1ab920725..d73cacdd8068c 100644 --- a/src/plugins/saved_objects/kibana.jsonc +++ b/src/plugins/saved_objects/kibana.jsonc @@ -2,7 +2,7 @@ "type": "plugin", "id": "@kbn/saved-objects-plugin", "owner": [ - "@elastic/kibana-core" + "@elastic/appex-sharedux" ], "group": "platform", "visibility": "shared", @@ -15,4 +15,4 @@ "dataViews" ] } -} \ No newline at end of file +} diff --git a/x-pack/.i18nrc.json b/x-pack/.i18nrc.json index 7afbc9dc704c4..e1e8478aa0517 100644 --- a/x-pack/.i18nrc.json +++ b/x-pack/.i18nrc.json @@ -3,6 +3,7 @@ "paths": { "xpack.actions": "plugins/actions", "xpack.aiops": [ + "packages/ml/aiops_common", 
"packages/ml/aiops_components", "packages/ml/aiops_log_pattern_analysis", "packages/ml/aiops_log_rate_analysis", diff --git a/x-pack/packages/ai-infra/inference-common/index.ts b/x-pack/packages/ai-infra/inference-common/index.ts index 6de7ce3bb8008..502d8e86a0beb 100644 --- a/x-pack/packages/ai-infra/inference-common/index.ts +++ b/x-pack/packages/ai-infra/inference-common/index.ts @@ -25,12 +25,15 @@ export { type ToolChoice, type ChatCompleteAPI, type ChatCompleteOptions, - type ChatCompletionResponse, + type ChatCompleteCompositeResponse, type ChatCompletionTokenCountEvent, type ChatCompletionEvent, type ChatCompletionChunkEvent, type ChatCompletionChunkToolCall, type ChatCompletionMessageEvent, + type ChatCompleteStreamResponse, + type ChatCompleteResponse, + type ChatCompletionTokenCount, withoutTokenCountEvents, withoutChunkEvents, isChatCompletionMessageEvent, @@ -48,7 +51,10 @@ export { export { OutputEventType, type OutputAPI, + type OutputOptions, type OutputResponse, + type OutputCompositeResponse, + type OutputStreamResponse, type OutputCompleteEvent, type OutputUpdateEvent, type Output, diff --git a/x-pack/packages/ai-infra/inference-common/src/chat_complete/api.ts b/x-pack/packages/ai-infra/inference-common/src/chat_complete/api.ts index c6ffa9d4c8d5d..cb91f4e53e8ae 100644 --- a/x-pack/packages/ai-infra/inference-common/src/chat_complete/api.ts +++ b/x-pack/packages/ai-infra/inference-common/src/chat_complete/api.ts @@ -6,16 +6,35 @@ */ import type { Observable } from 'rxjs'; -import type { ToolOptions } from './tools'; +import type { ToolCallsOf, ToolOptions } from './tools'; import type { Message } from './messages'; -import type { ChatCompletionEvent } from './events'; +import type { ChatCompletionEvent, ChatCompletionTokenCount } from './events'; /** * Request a completion from the LLM based on a prompt or conversation. * - * @example using the API to get an event observable. + * By default, The complete LLM response will be returned as a promise. + * + * @example using the API in default mode to get promise of the LLM response. + * ```ts + * const response = await chatComplete({ + * connectorId: 'my-connector', + * system: "You are a helpful assistant", + * messages: [ + * { role: MessageRole.User, content: "Some question?"}, + * ] + * }); + * + * const { content, tokens, toolCalls } = response; + * ``` + * + * Use `stream: true` to return an observable returning the full set + * of events in real time. + * + * @example using the API in stream mode to get an event observable. 
* ```ts * const events$ = chatComplete({ + * stream: true, * connectorId: 'my-connector', * system: "You are a helpful assistant", * messages: [ @@ -24,20 +43,44 @@ import type { ChatCompletionEvent } from './events'; * { role: MessageRole.User, content: "Another question?"}, * ] * }); + * + * // using the observable + * events$.pipe(withoutTokenCountEvents()).subscribe((event) => { + * if (isChatCompletionChunkEvent(event)) { + * // do something with the chunk event + * } else { + * // do something with the message event + * } + * }); + * ``` */ -export type ChatCompleteAPI = ( - options: ChatCompleteOptions -) => ChatCompletionResponse; +export type ChatCompleteAPI = < + TToolOptions extends ToolOptions = ToolOptions, + TStream extends boolean = false +>( + options: ChatCompleteOptions +) => ChatCompleteCompositeResponse; /** * Options used to call the {@link ChatCompleteAPI} */ -export type ChatCompleteOptions = { +export type ChatCompleteOptions< + TToolOptions extends ToolOptions = ToolOptions, + TStream extends boolean = false +> = { /** * The ID of the connector to use. - * Must be a genAI compatible connector, or an error will be thrown. + * Must be an inference connector, or an error will be thrown. */ connectorId: string; + /** + * Set to true to enable streaming, which will change the API response type from + * a single {@link ChatCompleteResponse} promise + * to a {@link ChatCompleteStreamResponse} event observable. + * + * Defaults to false. + */ + stream?: TStream; /** * Optional system message for the LLM. */ @@ -53,14 +96,44 @@ export type ChatCompleteOptions } & TToolOptions; /** - * Response from the {@link ChatCompleteAPI}. + * Composite response type from the {@link ChatCompleteAPI}, + * which can be either an observable or a promise depending on + * whether API was called with stream mode enabled or not. + */ +export type ChatCompleteCompositeResponse< + TToolOptions extends ToolOptions = ToolOptions, + TStream extends boolean = false +> = TStream extends true + ? ChatCompleteStreamResponse + : Promise>; + +/** + * Response from the {@link ChatCompleteAPI} when streaming is enabled. * * Observable of {@link ChatCompletionEvent} */ -export type ChatCompletionResponse = Observable< +export type ChatCompleteStreamResponse = Observable< ChatCompletionEvent >; +/** + * Response from the {@link ChatCompleteAPI} when streaming is not enabled. + */ +export interface ChatCompleteResponse { + /** + * The text content of the LLM response. + */ + content: string; + /** + * The eventual tool calls performed by the LLM. + */ + toolCalls: ToolCallsOf['toolCalls']; + /** + * Token counts + */ + tokens?: ChatCompletionTokenCount; +} + /** * Define the function calling mode when using inference APIs. * - native will use the LLM's native function calling (requires the LLM to have native support) diff --git a/x-pack/packages/ai-infra/inference-common/src/chat_complete/events.ts b/x-pack/packages/ai-infra/inference-common/src/chat_complete/events.ts index 92c49e6ee7fc0..73396b3e2b905 100644 --- a/x-pack/packages/ai-infra/inference-common/src/chat_complete/events.ts +++ b/x-pack/packages/ai-infra/inference-common/src/chat_complete/events.ts @@ -76,30 +76,38 @@ export type ChatCompletionChunkEvent = tool_calls: ChatCompletionChunkToolCall[]; }; +/** + * Token count structure for the chatComplete API. 
+ */ +export interface ChatCompletionTokenCount { + /** + * Input token count + */ + prompt: number; + /** + * Output token count + */ + completion: number; + /** + * Total token count + */ + total: number; +} + /** * Token count event, send only once, usually (but not necessarily) * before the message event */ export type ChatCompletionTokenCountEvent = InferenceTaskEventBase & { - tokens: { - /** - * Input token count - */ - prompt: number; - /** - * Output token count - */ - completion: number; - /** - * Total token count - */ - total: number; - }; + /** + * The token count structure + */ + tokens: ChatCompletionTokenCount; }; /** - * Events emitted from the {@link ChatCompletionResponse} observable + * Events emitted from the {@link ChatCompleteResponse} observable * returned from the {@link ChatCompleteAPI}. * * The chatComplete API returns 3 type of events: diff --git a/x-pack/packages/ai-infra/inference-common/src/chat_complete/index.ts b/x-pack/packages/ai-infra/inference-common/src/chat_complete/index.ts index 8199af4cf068b..ca69f39b273e5 100644 --- a/x-pack/packages/ai-infra/inference-common/src/chat_complete/index.ts +++ b/x-pack/packages/ai-infra/inference-common/src/chat_complete/index.ts @@ -6,10 +6,12 @@ */ export type { - ChatCompletionResponse, + ChatCompleteCompositeResponse, ChatCompleteAPI, ChatCompleteOptions, FunctionCallingMode, + ChatCompleteStreamResponse, + ChatCompleteResponse, } from './api'; export { ChatCompletionEventType, @@ -18,6 +20,7 @@ export { type ChatCompletionEvent, type ChatCompletionChunkToolCall, type ChatCompletionTokenCountEvent, + type ChatCompletionTokenCount, } from './events'; export { MessageRole, diff --git a/x-pack/packages/ai-infra/inference-common/src/chat_complete/tool_schema.ts b/x-pack/packages/ai-infra/inference-common/src/chat_complete/tool_schema.ts index bb4742c6b74d9..fd935785f74f5 100644 --- a/x-pack/packages/ai-infra/inference-common/src/chat_complete/tool_schema.ts +++ b/x-pack/packages/ai-infra/inference-common/src/chat_complete/tool_schema.ts @@ -11,9 +11,9 @@ interface ToolSchemaFragmentBase { description?: string; } -interface ToolSchemaTypeObject extends ToolSchemaFragmentBase { +export interface ToolSchemaTypeObject extends ToolSchemaFragmentBase { type: 'object'; - properties?: Record; + properties: Record; required?: string[] | readonly string[]; } @@ -40,6 +40,9 @@ interface ToolSchemaTypeArray extends ToolSchemaFragmentBase { items: Exclude; } +/** + * A tool schema property's possible types. + */ export type ToolSchemaType = | ToolSchemaTypeObject | ToolSchemaTypeString diff --git a/x-pack/packages/ai-infra/inference-common/src/output/api.ts b/x-pack/packages/ai-infra/inference-common/src/output/api.ts index 677d2f7015c2a..3355042910a61 100644 --- a/x-pack/packages/ai-infra/inference-common/src/output/api.ts +++ b/x-pack/packages/ai-infra/inference-common/src/output/api.ts @@ -6,39 +6,140 @@ */ import type { Observable } from 'rxjs'; -import type { Message, FunctionCallingMode, FromToolSchema, ToolSchema } from '../chat_complete'; -import type { OutputEvent } from './events'; +import { Message, FunctionCallingMode, FromToolSchema, ToolSchema } from '../chat_complete'; +import { Output, OutputEvent } from './events'; /** * Generate a response with the LLM for a prompt, optionally based on a schema. * - * @param {string} id The id of the operation - * @param {string} options.connectorId The ID of the connector that is to be used. - * @param {string} options.input The prompt for the LLM. 
- * @param {string} options.messages Previous messages in a conversation. - * @param {ToolSchema} [options.schema] The schema the response from the LLM should adhere to. + * @example + * ```ts + * // schema must be defined as full const or using the `satisfies ToolSchema` modifier for TS type inference to work + * const mySchema = { + * type: 'object', + * properties: { + * animals: { + * description: 'the list of animals that are mentioned in the provided article', + * type: 'array', + * items: { + * type: 'string', + * }, + * }, + * }, + * } as const; + * + * const response = outputApi({ + * id: 'extract_from_article', + * connectorId: 'my-connector connector', + * schema: mySchema, + * input: ` + * Please find all the animals that are mentioned in the following document: + * ## Document¬ + * ${theDoc} + * `, + * }); + * + * // output is properly typed from the provided schema + * const { animals } = response.output; + * ``` */ export type OutputAPI = < TId extends string = string, - TOutputSchema extends ToolSchema | undefined = ToolSchema | undefined + TOutputSchema extends ToolSchema | undefined = ToolSchema | undefined, + TStream extends boolean = false >( - id: TId, - options: { - connectorId: string; - system?: string; - input: string; - schema?: TOutputSchema; - previousMessages?: Message[]; - functionCalling?: FunctionCallingMode; - } -) => OutputResponse; + options: OutputOptions +) => OutputCompositeResponse; + +/** + * Options for the {@link OutputAPI} + */ +export interface OutputOptions< + TId extends string = string, + TOutputSchema extends ToolSchema | undefined = ToolSchema | undefined, + TStream extends boolean = false +> { + /** + * The id of the operation. + */ + id: TId; + /** + * The ID of the connector to use. + * Must be an inference connector, or an error will be thrown. + */ + connectorId: string; + /** + * Optional system message for the LLM. + */ + system?: string; + /** + * The prompt for the LLM. + */ + input: string; + /** + * The schema the response from the LLM should adhere to. + */ + schema?: TOutputSchema; + /** + * Previous messages in the conversation. + * If provided, will be passed to the LLM in addition to `input`. + */ + previousMessages?: Message[]; + /** + * Function calling mode, defaults to "native". + */ + functionCalling?: FunctionCallingMode; + /** + * Set to true to enable streaming, which will change the API response type from + * a single promise to an event observable. + * + * Defaults to false. + */ + stream?: TStream; +} + +/** + * Composite response type from the {@link OutputAPI}, + * which can be either an observable or a promise depending on + * whether API was called with stream mode enabled or not. + */ +export type OutputCompositeResponse< + TId extends string = string, + TOutputSchema extends ToolSchema | undefined = ToolSchema | undefined, + TStream extends boolean = false +> = TStream extends true + ? OutputStreamResponse + : Promise< + OutputResponse< + TId, + TOutputSchema extends ToolSchema ? FromToolSchema : undefined + > + >; + +/** + * Response from the {@link OutputAPI} when streaming is not enabled. + */ +export interface OutputResponse { + /** + * The id of the operation, as specified when calling the API. + */ + id: TId; + /** + * The task output, following the schema specified as input. + */ + output: TOutput; + /** + * Potential text content provided by the LLM, if it was provided in addition to the tool call. + */ + content: string; +} /** - * Response from the {@link OutputAPI}. 
+ * Response from the {@link OutputAPI} in streaming mode. * - * Observable of {@link OutputEvent} + * @returns Observable of {@link OutputEvent} */ -export type OutputResponse< +export type OutputStreamResponse< TId extends string = string, TOutputSchema extends ToolSchema | undefined = ToolSchema | undefined > = Observable< diff --git a/x-pack/packages/ai-infra/inference-common/src/output/index.ts b/x-pack/packages/ai-infra/inference-common/src/output/index.ts index ceac178f47faa..a3039005b2f7c 100644 --- a/x-pack/packages/ai-infra/inference-common/src/output/index.ts +++ b/x-pack/packages/ai-infra/inference-common/src/output/index.ts @@ -5,7 +5,13 @@ * 2.0. */ -export type { OutputAPI, OutputResponse } from './api'; +export type { + OutputAPI, + OutputOptions, + OutputCompositeResponse, + OutputResponse, + OutputStreamResponse, +} from './api'; export { OutputEventType, type OutputCompleteEvent, diff --git a/x-pack/packages/ml/aiops_common/constants.ts b/x-pack/packages/ml/aiops_common/constants.ts index 39a0fdc5842c8..1a75e929c147a 100644 --- a/x-pack/packages/ml/aiops_common/constants.ts +++ b/x-pack/packages/ml/aiops_common/constants.ts @@ -5,6 +5,8 @@ * 2.0. */ +import { i18n } from '@kbn/i18n'; + /** * AIOPS_PLUGIN_ID is used as a unique identifier for the aiops plugin */ @@ -28,3 +30,14 @@ export const AIOPS_EMBEDDABLE_ORIGIN = { DISCOVER: 'discover', ML_AIOPS_LABS: 'ml_aiops_labs', } as const; + +export const AIOPS_EMBEDDABLE_GROUPING = [ + { + id: 'logs-aiops', + getDisplayName: () => + i18n.translate('xpack.aiops.embedabble.groupingDisplayName', { + defaultMessage: 'Logs AIOps', + }), + getIconType: () => 'machineLearningApp', + }, +]; diff --git a/x-pack/packages/ml/aiops_common/tsconfig.json b/x-pack/packages/ml/aiops_common/tsconfig.json index 806b5b07e847e..ffd8c074a421d 100644 --- a/x-pack/packages/ml/aiops_common/tsconfig.json +++ b/x-pack/packages/ml/aiops_common/tsconfig.json @@ -15,6 +15,7 @@ ], "kbn_references": [ "@kbn/ml-is-populated-object", + "@kbn/i18n", ], "exclude": [ "target/**/*", diff --git a/x-pack/packages/ml/aiops_components/src/document_count_chart/document_count_chart.tsx b/x-pack/packages/ml/aiops_components/src/document_count_chart/document_count_chart.tsx index 39291762b8fca..d9f68fe7ef890 100644 --- a/x-pack/packages/ml/aiops_components/src/document_count_chart/document_count_chart.tsx +++ b/x-pack/packages/ml/aiops_components/src/document_count_chart/document_count_chart.tsx @@ -426,7 +426,12 @@ export const DocumentCountChart: FC = (props) => { <> {isBrushVisible && (
-
+ {/** + * We need position:relative on this parent container of the BrushBadges, + * because of the absolute positioning of the BrushBadges. Without it, the + * BrushBadges would not be positioned correctly when used in embedded panels. + */} +
= (props) => { const d3BrushContainer = useRef(null); const brushes = useRef([]); + // id to prefix html ids for the brushes since this component can be used + // multiple times within dashboard and embedded charts. + const htmlId = useMemo(() => htmlIdGenerator()(), []); + // We need to pass props to refs here because the d3-brush code doesn't consider // native React prop changes. The brush code does its own check whether these props changed then. // The initialized brushes might otherwise act on stale data. @@ -135,10 +141,10 @@ export const DualBrush: FC = (props) => { const xMax = x(maxRef.current) ?? 0; const minExtentPx = Math.round((xMax - xMin) / 100); - const baselineBrush = d3.select('#aiops-brush-baseline'); + const baselineBrush = d3.select(`#aiops-brush-baseline-${htmlId}`); const baselineSelection = d3.brushSelection(baselineBrush.node() as SVGGElement); - const deviationBrush = d3.select('#aiops-brush-deviation'); + const deviationBrush = d3.select(`#aiops-brush-deviation-${htmlId}`); const deviationSelection = d3.brushSelection(deviationBrush.node() as SVGGElement); if (!isBrushXSelection(deviationSelection) || !isBrushXSelection(baselineSelection)) { @@ -260,7 +266,7 @@ export const DualBrush: FC = (props) => { .insert('g', '.brush') .attr('class', 'brush') .attr('id', (b: DualBrush) => { - return 'aiops-brush-' + b.id; + return `aiops-brush-${b.id}-${htmlId}`; }) .attr('data-test-subj', (b: DualBrush) => { // Uppercase the first character of the `id` so we get aiopsBrushBaseline/aiopsBrushDeviation. @@ -339,6 +345,7 @@ export const DualBrush: FC = (props) => { drawBrushes(); } }, [ + htmlId, min, max, width, diff --git a/x-pack/packages/ml/aiops_components/src/progress_controls/progress_controls.tsx b/x-pack/packages/ml/aiops_components/src/progress_controls/progress_controls.tsx index 098f7038f82c8..173f33e08f0b4 100644 --- a/x-pack/packages/ml/aiops_components/src/progress_controls/progress_controls.tsx +++ b/x-pack/packages/ml/aiops_components/src/progress_controls/progress_controls.tsx @@ -70,7 +70,6 @@ export const ProgressControls: FC> = (pr const { euiTheme } = useEuiTheme(); const runningProgressBarStyles = useAnimatedProgressBarBackground(euiTheme.colors.success); - const analysisCompleteStyle = { display: 'none' }; return ( @@ -144,32 +143,30 @@ export const ProgressControls: FC> = (pr ) : null} - - - - + + + + + + + - - - - - - + + + ) : null} {children} diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/constants.ts b/x-pack/packages/ml/aiops_log_rate_analysis/constants.ts index 054bb876a4f7a..a9812a7507441 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/constants.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/constants.ts @@ -33,3 +33,9 @@ export const RANDOM_SAMPLER_SEED = 3867412; /** Highlighting color for charts */ export const LOG_RATE_ANALYSIS_HIGHLIGHT_COLOR = 'orange'; + +/** */ +export const EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE = 'aiopsLogRateAnalysisEmbeddable' as const; + +/** */ +export const LOG_RATE_ANALYSIS_DATA_VIEW_REF_NAME = 'aiopsLogRateAnalysisEmbeddableDataViewId'; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/hooks.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/hooks.ts index 4652d604c5d61..d02a3bea22bf3 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/hooks.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/hooks.ts @@ -6,10 +6,9 @@ */ import type { TypedUseSelectorHook } from 'react-redux'; -import { useDispatch, useSelector, useStore } from 'react-redux'; -import type { 
AppDispatch, AppStore, RootState } from './store'; +import { useDispatch, useSelector } from 'react-redux'; +import type { AppDispatch, RootState } from './store'; // Improves TypeScript support compared to plain `useDispatch` and `useSelector` export const useAppDispatch: () => AppDispatch = useDispatch; export const useAppSelector: TypedUseSelectorHook = useSelector; -export const useAppStore: () => AppStore = useStore; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/index.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/index.ts index 785bb02c24f31..7f7710ec23f3b 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/index.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/index.ts @@ -11,6 +11,7 @@ export { setAnalysisType, setAutoRunAnalysis, setDocumentCountChartData, + setGroupResults, setInitialAnalysisStart, setIsBrushCleared, setStickyHistogram, @@ -23,9 +24,10 @@ export { setPinnedSignificantItem, setSelectedGroup, setSelectedSignificantItem, -} from './log_rate_analysis_table_row_slice'; + setSkippedColumns, +} from './log_rate_analysis_table_slice'; export { LogRateAnalysisReduxProvider } from './store'; -export { useAppDispatch, useAppSelector, useAppStore } from './hooks'; +export { useAppDispatch, useAppSelector } from './hooks'; export { useCurrentSelectedGroup } from './use_current_selected_group'; export { useCurrentSelectedSignificantItem } from './use_current_selected_significant_item'; export type { GroupTableItem, GroupTableItemGroup, TableItemAction } from './types'; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.test.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.test.ts index 4f829b0e0bf5a..5b4946dc2eb2b 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.test.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.test.ts @@ -9,14 +9,16 @@ import { httpServiceMock } from '@kbn/core/public/mocks'; import type { FetchFieldCandidatesResponse } from '../queries/fetch_field_candidates'; -import { fetchFieldCandidates } from './log_rate_analysis_field_candidates_slice'; +import { fetchFieldCandidates, getDefaultState } from './log_rate_analysis_field_candidates_slice'; const mockHttp = httpServiceMock.createStartContract(); describe('fetchFieldCandidates', () => { it('dispatches field candidates', async () => { const mockDispatch = jest.fn(); - const mockGetState = jest.fn(); + const mockGetState = jest.fn().mockReturnValue({ + logRateAnalysisFieldCandidates: getDefaultState(), + }); const mockResponse: FetchFieldCandidatesResponse = { isECS: false, @@ -60,7 +62,12 @@ describe('fetchFieldCandidates', () => { payload: { fieldSelectionMessage: '2 out of 5 fields were preselected for the analysis. 
Use the "Fields" dropdown to adjust the selection.', - fieldFilterSkippedItems: [ + initialFieldFilterSkippedItems: [ + 'another-keyword-field', + 'another-text-field', + 'yet-another-text-field', + ], + currentFieldFilterSkippedItems: [ 'another-keyword-field', 'another-text-field', 'yet-another-text-field', diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.ts index aa5cb969e5401..07b1cd6fee402 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_field_candidates_slice.ts @@ -90,10 +90,14 @@ export const fetchFieldCandidates = createAsyncThunk( ...selectedKeywordFieldCandidates, ...selectedTextFieldCandidates, ]; - const fieldFilterSkippedItems = fieldFilterUniqueItems.filter( + const initialFieldFilterSkippedItems = fieldFilterUniqueItems.filter( (d) => !fieldFilterUniqueSelectedItems.includes(d) ); + const currentFieldFilterSkippedItems = ( + thunkApi.getState() as { logRateAnalysisFieldCandidates: FieldCandidatesState } + ).logRateAnalysisFieldCandidates.currentFieldFilterSkippedItems; + thunkApi.dispatch( setAllFieldCandidates({ fieldSelectionMessage: getFieldSelectionMessage( @@ -102,7 +106,13 @@ export const fetchFieldCandidates = createAsyncThunk( fieldFilterUniqueSelectedItems.length ), fieldFilterUniqueItems, - fieldFilterSkippedItems, + initialFieldFilterSkippedItems, + // If the currentFieldFilterSkippedItems is null, we're on the first load, + // only then we set the current skipped fields to the initial skipped fields. + currentFieldFilterSkippedItems: + currentFieldFilterSkippedItems === null + ? 
initialFieldFilterSkippedItems + : currentFieldFilterSkippedItems, keywordFieldCandidates, textFieldCandidates, selectedKeywordFieldCandidates, @@ -116,18 +126,20 @@ export interface FieldCandidatesState { isLoading: boolean; fieldSelectionMessage?: string; fieldFilterUniqueItems: string[]; - fieldFilterSkippedItems: string[]; + initialFieldFilterSkippedItems: string[]; + currentFieldFilterSkippedItems: string[] | null; keywordFieldCandidates: string[]; textFieldCandidates: string[]; selectedKeywordFieldCandidates: string[]; selectedTextFieldCandidates: string[]; } -function getDefaultState(): FieldCandidatesState { +export function getDefaultState(): FieldCandidatesState { return { isLoading: false, fieldFilterUniqueItems: [], - fieldFilterSkippedItems: [], + initialFieldFilterSkippedItems: [], + currentFieldFilterSkippedItems: null, keywordFieldCandidates: [], textFieldCandidates: [], selectedKeywordFieldCandidates: [], @@ -145,6 +157,12 @@ export const logRateAnalysisFieldCandidatesSlice = createSlice({ ) => { return { ...state, ...action.payload }; }, + setCurrentFieldFilterSkippedItems: ( + state: FieldCandidatesState, + action: PayloadAction + ) => { + return { ...state, currentFieldFilterSkippedItems: action.payload }; + }, }, extraReducers: (builder) => { builder.addCase(fetchFieldCandidates.pending, (state) => { @@ -157,4 +175,5 @@ export const logRateAnalysisFieldCandidatesSlice = createSlice({ }); // Action creators are generated for each case reducer function -export const { setAllFieldCandidates } = logRateAnalysisFieldCandidatesSlice.actions; +export const { setAllFieldCandidates, setCurrentFieldFilterSkippedItems } = + logRateAnalysisFieldCandidatesSlice.actions; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_slice.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_slice.ts index 251f0d3263800..8399e896900c6 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_slice.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_slice.ts @@ -34,6 +34,7 @@ export interface LogRateAnalysisState { autoRunAnalysis: boolean; initialAnalysisStart: InitialAnalysisStart; isBrushCleared: boolean; + groupResults: boolean; stickyHistogram: boolean; chartWindowParameters?: WindowParameters; earliest?: number; @@ -48,6 +49,7 @@ function getDefaultState(): LogRateAnalysisState { autoRunAnalysis: true, initialAnalysisStart: undefined, isBrushCleared: true, + groupResults: false, documentStats: { sampleProbability: 1, totalCount: 0, @@ -98,6 +100,9 @@ export const logRateAnalysisSlice = createSlice({ state.intervalMs = action.payload.intervalMs; state.documentStats = action.payload.documentStats; }, + setGroupResults: (state: LogRateAnalysisState, action: PayloadAction) => { + state.groupResults = action.payload; + }, setInitialAnalysisStart: ( state: LogRateAnalysisState, action: PayloadAction @@ -127,6 +132,7 @@ export const { setAnalysisType, setAutoRunAnalysis, setDocumentCountChartData, + setGroupResults, setInitialAnalysisStart, setIsBrushCleared, setStickyHistogram, diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_row_slice.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_row_slice.ts deleted file mode 100644 index 3da98e4cc80ff..0000000000000 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_row_slice.ts +++ /dev/null @@ -1,72 +0,0 @@ -/* - * Copyright Elasticsearch B.V. 
and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import type { PayloadAction } from '@reduxjs/toolkit'; -import { createSlice } from '@reduxjs/toolkit'; - -import type { SignificantItem } from '@kbn/ml-agg-utils'; - -import type { GroupTableItem } from './types'; - -type SignificantItemOrNull = SignificantItem | null; -type GroupOrNull = GroupTableItem | null; - -export interface LogRateAnalysisTableRowState { - pinnedGroup: GroupOrNull; - pinnedSignificantItem: SignificantItemOrNull; - selectedGroup: GroupOrNull; - selectedSignificantItem: SignificantItemOrNull; -} - -function getDefaultState(): LogRateAnalysisTableRowState { - return { - pinnedGroup: null, - pinnedSignificantItem: null, - selectedGroup: null, - selectedSignificantItem: null, - }; -} - -export const logRateAnalysisTableRowSlice = createSlice({ - name: 'logRateAnalysisTableRow', - initialState: getDefaultState(), - reducers: { - clearAllRowState: (state: LogRateAnalysisTableRowState) => { - state.pinnedGroup = null; - state.pinnedSignificantItem = null; - state.selectedGroup = null; - state.selectedSignificantItem = null; - }, - setPinnedGroup: (state: LogRateAnalysisTableRowState, action: PayloadAction) => { - state.pinnedGroup = action.payload; - }, - setPinnedSignificantItem: ( - state: LogRateAnalysisTableRowState, - action: PayloadAction - ) => { - state.pinnedSignificantItem = action.payload; - }, - setSelectedGroup: (state: LogRateAnalysisTableRowState, action: PayloadAction) => { - state.selectedGroup = action.payload; - }, - setSelectedSignificantItem: ( - state: LogRateAnalysisTableRowState, - action: PayloadAction - ) => { - state.selectedSignificantItem = action.payload; - }, - }, -}); - -// Action creators are generated for each case reducer function -export const { - clearAllRowState, - setPinnedGroup, - setPinnedSignificantItem, - setSelectedGroup, - setSelectedSignificantItem, -} = logRateAnalysisTableRowSlice.actions; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.test.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.test.ts new file mode 100644 index 0000000000000..498ada00654f0 --- /dev/null +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.test.ts @@ -0,0 +1,130 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import { configureStore } from '@reduxjs/toolkit'; +import { + logRateAnalysisTableSlice, + localStorageListenerMiddleware, + setSkippedColumns, + getPreloadedState, + AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, + type LogRateAnalysisResultsTableColumnName, +} from './log_rate_analysis_table_slice'; + +describe('getPreloadedState', () => { + beforeEach(() => { + localStorage.clear(); + }); + + it('should return default state when localStorage is empty', () => { + const state = getPreloadedState(); + expect(state).toEqual({ + skippedColumns: ['p-value', 'Baseline rate', 'Deviation rate'], + pinnedGroup: null, + pinnedSignificantItem: null, + selectedGroup: null, + selectedSignificantItem: null, + }); + }); + + it('should return state with skippedColumns from localStorage', () => { + const skippedColumns = ['Log rate', 'Doc count']; + localStorage.setItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, JSON.stringify(skippedColumns)); + + const state = getPreloadedState(); + expect(state.skippedColumns).toEqual(skippedColumns); + }); + + it('should return default state when localStorage contains invalid JSON', () => { + localStorage.setItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, 'invalid-json'); + + const state = getPreloadedState(); + expect(state).toEqual({ + skippedColumns: ['p-value', 'Baseline rate', 'Deviation rate'], + pinnedGroup: null, + pinnedSignificantItem: null, + selectedGroup: null, + selectedSignificantItem: null, + }); + }); + + it('should return default state when localStorage does not contain skippedColumns', () => { + localStorage.setItem('someOtherKey', JSON.stringify(['someValue'])); + + const state = getPreloadedState(); + expect(state).toEqual({ + skippedColumns: ['p-value', 'Baseline rate', 'Deviation rate'], + pinnedGroup: null, + pinnedSignificantItem: null, + selectedGroup: null, + selectedSignificantItem: null, + }); + }); +}); + +type Store = ReturnType; + +describe('localStorageListenerMiddleware', () => { + let store: Store; + + beforeEach(() => { + localStorage.clear(); + store = configureStore({ + reducer: { + logRateAnalysisTable: logRateAnalysisTableSlice.reducer, + }, + middleware: (getDefaultMiddleware) => + getDefaultMiddleware().prepend(localStorageListenerMiddleware.middleware), + }) as Store; + }); + + it('should save skippedColumns to localStorage when setSkippedColumns is dispatched', () => { + const skippedColumns: LogRateAnalysisResultsTableColumnName[] = ['Log rate', 'Doc count']; + store.dispatch(setSkippedColumns(skippedColumns)); + + const storedSkippedColumns = localStorage.getItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS); + expect(storedSkippedColumns).toEqual(JSON.stringify(skippedColumns)); + }); + + it('should handle invalid JSON in localStorage gracefully', () => { + localStorage.setItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, 'invalid-json'); + const skippedColumns: LogRateAnalysisResultsTableColumnName[] = ['Log rate', 'Doc count']; + store.dispatch(setSkippedColumns(skippedColumns)); + + const storedSkippedColumns = localStorage.getItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS); + expect(storedSkippedColumns).toEqual(JSON.stringify(skippedColumns)); + }); + + it('should not overwrite other localStorage keys', () => { + const otherKey = 'someOtherKey'; + const otherValue = ['someValue']; + localStorage.setItem(otherKey, JSON.stringify(otherValue)); + + const skippedColumns: LogRateAnalysisResultsTableColumnName[] = ['Log rate', 'Doc count']; + store.dispatch(setSkippedColumns(skippedColumns)); + + const storedOtherValue = 
localStorage.getItem(otherKey); + expect(storedOtherValue).toEqual(JSON.stringify(otherValue)); + }); + + it('should update localStorage when skippedColumns are updated multiple times', () => { + const initialSkippedColumns: LogRateAnalysisResultsTableColumnName[] = ['Log rate']; + store.dispatch(setSkippedColumns(initialSkippedColumns)); + + let storedSkippedColumns = localStorage.getItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS); + expect(storedSkippedColumns).toEqual(JSON.stringify(initialSkippedColumns)); + + const updatedSkippedColumns: LogRateAnalysisResultsTableColumnName[] = [ + 'Log rate', + 'Doc count', + ]; + store.dispatch(setSkippedColumns(updatedSkippedColumns)); + + storedSkippedColumns = localStorage.getItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS); + expect(storedSkippedColumns).toEqual(JSON.stringify(updatedSkippedColumns)); + }); +}); diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.ts new file mode 100644 index 0000000000000..1d9c83dea98a6 --- /dev/null +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/log_rate_analysis_table_slice.ts @@ -0,0 +1,172 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { PayloadAction } from '@reduxjs/toolkit'; +import { createSlice, createListenerMiddleware } from '@reduxjs/toolkit'; + +import { i18n } from '@kbn/i18n'; +import type { SignificantItem } from '@kbn/ml-agg-utils'; + +import type { GroupTableItem } from './types'; + +export const AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS = 'aiops.logRateAnalysisResultColumns'; + +export const commonColumns = { + ['Log rate']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.logRateColumnTitle', { + defaultMessage: 'Log rate', + }), + ['Doc count']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.docCountColumnTitle', { + defaultMessage: 'Doc count', + }), + ['p-value']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.pValueColumnTitle', { + defaultMessage: 'p-value', + }), + ['Impact']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.impactColumnTitle', { + defaultMessage: 'Impact', + }), + ['Baseline rate']: i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.baselineRateColumnTitle', + { + defaultMessage: 'Baseline rate', + } + ), + ['Deviation rate']: i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.deviationRateColumnTitle', + { + defaultMessage: 'Deviation rate', + } + ), + ['Log rate change']: i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.logRateChangeColumnTitle', + { + defaultMessage: 'Log rate change', + } + ), + ['Actions']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.actionsColumnTitle', { + defaultMessage: 'Actions', + }), +}; + +export const significantItemColumns = { + ['Field name']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.fieldNameColumnTitle', { + defaultMessage: 'Field name', + }), + ['Field value']: i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.fieldValueColumnTitle', + { + defaultMessage: 'Field value', + } + ), + ...commonColumns, +} as const; + +export type LogRateAnalysisResultsTableColumnName = keyof typeof significantItemColumns | 'unique'; + +type SignificantItemOrNull = SignificantItem | null; +type GroupOrNull 
= GroupTableItem | null; + +export interface LogRateAnalysisTableState { + skippedColumns: LogRateAnalysisResultsTableColumnName[]; + pinnedGroup: GroupOrNull; + pinnedSignificantItem: SignificantItemOrNull; + selectedGroup: GroupOrNull; + selectedSignificantItem: SignificantItemOrNull; +} + +function getDefaultState(): LogRateAnalysisTableState { + return { + skippedColumns: ['p-value', 'Baseline rate', 'Deviation rate'], + pinnedGroup: null, + pinnedSignificantItem: null, + selectedGroup: null, + selectedSignificantItem: null, + }; +} + +export function getPreloadedState(): LogRateAnalysisTableState { + const defaultState = getDefaultState(); + + const localStorageSkippedColumns = localStorage.getItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS); + + if (localStorageSkippedColumns === null) { + return defaultState; + } + + try { + defaultState.skippedColumns = JSON.parse(localStorageSkippedColumns); + } catch (err) { + // eslint-disable-next-line no-console + console.warn('Failed to parse skipped columns from local storage:', err); + } + + return defaultState; +} + +export const logRateAnalysisTableSlice = createSlice({ + name: 'logRateAnalysisTable', + initialState: getDefaultState(), + reducers: { + clearAllRowState: (state: LogRateAnalysisTableState) => { + state.pinnedGroup = null; + state.pinnedSignificantItem = null; + state.selectedGroup = null; + state.selectedSignificantItem = null; + }, + setPinnedGroup: (state: LogRateAnalysisTableState, action: PayloadAction) => { + state.pinnedGroup = action.payload; + }, + setPinnedSignificantItem: ( + state: LogRateAnalysisTableState, + action: PayloadAction + ) => { + state.pinnedSignificantItem = action.payload; + }, + setSelectedGroup: (state: LogRateAnalysisTableState, action: PayloadAction) => { + state.selectedGroup = action.payload; + }, + setSelectedSignificantItem: ( + state: LogRateAnalysisTableState, + action: PayloadAction + ) => { + state.selectedSignificantItem = action.payload; + }, + setSkippedColumns: ( + state: LogRateAnalysisTableState, + action: PayloadAction + ) => { + state.skippedColumns = action.payload; + }, + }, +}); + +// Action creators are generated for each case reducer function +export const { + clearAllRowState, + setPinnedGroup, + setPinnedSignificantItem, + setSelectedGroup, + setSelectedSignificantItem, + setSkippedColumns, +} = logRateAnalysisTableSlice.actions; + +// Create listener middleware +export const localStorageListenerMiddleware = createListenerMiddleware(); + +// Add a listener to save skippedColumns to localStorage whenever it changes +localStorageListenerMiddleware.startListening({ + actionCreator: setSkippedColumns, + effect: (action, listenerApi) => { + const state = listenerApi.getState() as { logRateAnalysisTable: LogRateAnalysisTableState }; + try { + const serializedState = JSON.stringify(state.logRateAnalysisTable.skippedColumns); + localStorage.setItem(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, serializedState); + } catch (err) { + // eslint-disable-next-line no-console + console.warn('Failed to save state to localStorage:', err); + } + }, +}); diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/store.tsx b/x-pack/packages/ml/aiops_log_rate_analysis/state/store.tsx index 1589b27348d89..9fd8e8240dde3 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/store.tsx +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/store.tsx @@ -15,12 +15,19 @@ import { streamSlice } from '@kbn/ml-response-stream/client'; import { logRateAnalysisResultsSlice } from '../api/stream_reducer'; 
import { logRateAnalysisSlice } from './log_rate_analysis_slice'; -import { logRateAnalysisTableRowSlice } from './log_rate_analysis_table_row_slice'; +import { + logRateAnalysisTableSlice, + getPreloadedState, + localStorageListenerMiddleware, +} from './log_rate_analysis_table_slice'; import { logRateAnalysisFieldCandidatesSlice } from './log_rate_analysis_field_candidates_slice'; import type { InitialAnalysisStart } from './log_rate_analysis_slice'; const getReduxStore = () => configureStore({ + preloadedState: { + logRateAnalysisTable: getPreloadedState(), + }, reducer: { // General page state logRateAnalysis: logRateAnalysisSlice.reducer, @@ -28,11 +35,13 @@ const getReduxStore = () => logRateAnalysisFieldCandidates: logRateAnalysisFieldCandidatesSlice.reducer, // Analysis results logRateAnalysisResults: logRateAnalysisResultsSlice.reducer, - // Handles running the analysis - logRateAnalysisStream: streamSlice.reducer, - // Handles hovering and pinning table rows - logRateAnalysisTableRow: logRateAnalysisTableRowSlice.reducer, + // Handles running the analysis, needs to be "stream" for the async thunk to work properly. + stream: streamSlice.reducer, + // Handles hovering and pinning table rows and column selection + logRateAnalysisTable: logRateAnalysisTableSlice.reducer, }, + middleware: (getDefaultMiddleware) => + getDefaultMiddleware().prepend(localStorageListenerMiddleware.middleware), }); interface LogRateAnalysisReduxProviderProps { @@ -54,6 +63,6 @@ export const LogRateAnalysisReduxProvider: FC< }; // Infer the `RootState` and `AppDispatch` types from the store itself -export type AppStore = ReturnType; +type AppStore = ReturnType; export type RootState = ReturnType; export type AppDispatch = AppStore['dispatch']; diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_group.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_group.ts index 9653691d3efd4..a19bd3e18a735 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_group.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_group.ts @@ -10,8 +10,8 @@ import { createSelector } from '@reduxjs/toolkit'; import type { RootState } from './store'; import { useAppSelector } from './hooks'; -const selectSelectedGroup = (s: RootState) => s.logRateAnalysisTableRow.selectedGroup; -const selectPinnedGroup = (s: RootState) => s.logRateAnalysisTableRow.pinnedGroup; +const selectSelectedGroup = (s: RootState) => s.logRateAnalysisTable.selectedGroup; +const selectPinnedGroup = (s: RootState) => s.logRateAnalysisTable.pinnedGroup; const selectCurrentSelectedGroup = createSelector( selectSelectedGroup, selectPinnedGroup, diff --git a/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_significant_item.ts b/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_significant_item.ts index d189d16fc2fa0..f7327d3033df0 100644 --- a/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_significant_item.ts +++ b/x-pack/packages/ml/aiops_log_rate_analysis/state/use_current_selected_significant_item.ts @@ -11,9 +11,8 @@ import type { RootState } from './store'; import { useAppSelector } from './hooks'; const selectSelectedSignificantItem = (s: RootState) => - s.logRateAnalysisTableRow.selectedSignificantItem; -const selectPinnedSignificantItem = (s: RootState) => - s.logRateAnalysisTableRow.pinnedSignificantItem; + s.logRateAnalysisTable.selectedSignificantItem; +const selectPinnedSignificantItem 
= (s: RootState) => s.logRateAnalysisTable.pinnedSignificantItem; const selectCurrentSelectedSignificantItem = createSelector( selectSelectedSignificantItem, selectPinnedSignificantItem, diff --git a/x-pack/packages/ml/field_stats_flyout/field_stats_flyout_provider.tsx b/x-pack/packages/ml/field_stats_flyout/field_stats_flyout_provider.tsx index 678dec7d36f42..4e7a501140b01 100644 --- a/x-pack/packages/ml/field_stats_flyout/field_stats_flyout_provider.tsx +++ b/x-pack/packages/ml/field_stats_flyout/field_stats_flyout_provider.tsx @@ -7,9 +7,8 @@ import type { PropsWithChildren, FC } from 'react'; import React, { useCallback, useState } from 'react'; -import type { CoreStart } from '@kbn/core/public'; +import type { ThemeServiceStart } from '@kbn/core-theme-browser'; import type { DataView } from '@kbn/data-plugin/common'; -import type { DataPublicPluginStart } from '@kbn/data-plugin/public'; import type { FieldStatsServices } from '@kbn/unified-field-list/src/components/field_stats'; import type { TimeRange as TimeRangeMs } from '@kbn/ml-date-picker'; import type { FieldStatsProps } from '@kbn/unified-field-list/src/components/field_stats'; @@ -18,32 +17,18 @@ import { getProcessedFields } from '@kbn/ml-data-grid'; import { stringHash } from '@kbn/ml-string-hash'; import { lastValueFrom } from 'rxjs'; import { useRef } from 'react'; -import { useKibana } from '@kbn/kibana-react-plugin/public'; import { getMergedSampleDocsForPopulatedFieldsQuery } from './populated_fields/get_merged_populated_fields_query'; import { FieldStatsFlyout } from './field_stats_flyout'; import { MLFieldStatsFlyoutContext } from './use_field_stats_flyout_context'; import { PopulatedFieldsCacheManager } from './populated_fields/populated_fields_cache_manager'; -type Services = CoreStart & { - data: DataPublicPluginStart; -}; - -function useDataSearch() { - const { data } = useKibana().services; - - if (!data) { - throw new Error('Kibana data service not available.'); - } - - return data.search; -} - /** * Props for the FieldStatsFlyoutProvider component. * * @typedef {Object} FieldStatsFlyoutProviderProps * @property dataView - The data view object. * @property fieldStatsServices - Services required for field statistics. + * @property theme - The EUI theme service. * @property [timeRangeMs] - Optional time range in milliseconds. * @property [dslQuery] - Optional DSL query for filtering field statistics. * @property [disablePopulatedFields] - Optional flag to disable populated fields. 
@@ -51,6 +36,7 @@ function useDataSearch() { export type FieldStatsFlyoutProviderProps = PropsWithChildren<{ dataView: DataView; fieldStatsServices: FieldStatsServices; + theme: ThemeServiceStart; timeRangeMs?: TimeRangeMs; dslQuery?: FieldStatsProps['dslQuery']; disablePopulatedFields?: boolean; @@ -79,12 +65,13 @@ export const FieldStatsFlyoutProvider: FC = (prop const { dataView, fieldStatsServices, + theme, timeRangeMs, dslQuery, disablePopulatedFields = false, children, } = props; - const search = useDataSearch(); + const { search } = fieldStatsServices.data; const [isFieldStatsFlyoutVisible, setFieldStatsIsFlyoutVisible] = useState(false); const [fieldName, setFieldName] = useState(); const [fieldValue, setFieldValue] = useState(); @@ -187,6 +174,7 @@ export const FieldStatsFlyoutProvider: FC = (prop fieldValue, timeRangeMs, populatedFields, + theme, }} > = (props) => { const { field, label, onButtonClick, disabled, isEmpty, hideTrigger } = props; - const themeVars = useThemeVars(); + const theme = useFieldStatsFlyoutThemeVars(); + const themeVars = useCurrentEuiThemeVars(theme); + const emptyFieldMessage = isEmpty ? ' ' + i18n.translate('xpack.ml.newJob.wizard.fieldContextPopover.emptyFieldInSampleDocsMsg', { diff --git a/x-pack/packages/ml/field_stats_flyout/tsconfig.json b/x-pack/packages/ml/field_stats_flyout/tsconfig.json index 0010d79432e34..df70aa27788b8 100644 --- a/x-pack/packages/ml/field_stats_flyout/tsconfig.json +++ b/x-pack/packages/ml/field_stats_flyout/tsconfig.json @@ -23,9 +23,7 @@ "@kbn/i18n", "@kbn/react-field", "@kbn/ml-anomaly-utils", - "@kbn/kibana-react-plugin", "@kbn/ml-kibana-theme", - "@kbn/core", "@kbn/ml-data-grid", "@kbn/ml-string-hash", "@kbn/ml-is-populated-object", @@ -33,5 +31,6 @@ "@kbn/ml-is-defined", "@kbn/field-types", "@kbn/ui-theme", + "@kbn/core-theme-browser", ] } diff --git a/x-pack/packages/ml/field_stats_flyout/use_field_stats_flyout_context.ts b/x-pack/packages/ml/field_stats_flyout/use_field_stats_flyout_context.ts index ec6c28873011c..121426352e6e4 100644 --- a/x-pack/packages/ml/field_stats_flyout/use_field_stats_flyout_context.ts +++ b/x-pack/packages/ml/field_stats_flyout/use_field_stats_flyout_context.ts @@ -7,6 +7,7 @@ import { createContext, useContext } from 'react'; import type { TimeRange as TimeRangeMs } from '@kbn/ml-date-picker'; +import type { ThemeServiceStart } from '@kbn/core-theme-browser'; /** * Represents the properties for the MLJobWizardFieldStatsFlyout component. 
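The hunks above and below replace the implicit useKibana() lookup with an explicitly passed ThemeServiceStart that travels through the flyout context. A minimal sketch of that dependency-injection pattern, assuming illustrative names (the real context lives in use_field_stats_flyout_context.ts and the real provider in field_stats_flyout_provider.tsx):

// Sketch only: a provider that receives the theme service as a prop and exposes
// it through context, so consumers no longer depend on the kibana-react plugin.
import React, { createContext, useContext } from 'react';
import type { FC, PropsWithChildren } from 'react';
import type { ThemeServiceStart } from '@kbn/core-theme-browser';

interface FlyoutContextValue {
  theme?: ThemeServiceStart;
}

const FlyoutContext = createContext<FlyoutContextValue>({ theme: undefined });

export const FlyoutProviderSketch: FC<PropsWithChildren<{ theme: ThemeServiceStart }>> = ({
  theme,
  children,
}) => <FlyoutContext.Provider value={{ theme }}>{children}</FlyoutContext.Provider>;

// Consumers read the theme from context; it stays optional so callers that do
// not pass it yet keep compiling while they migrate.
export const useFlyoutTheme = (): ThemeServiceStart | undefined =>
  useContext(FlyoutContext).theme;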
@@ -21,6 +22,7 @@ interface MLJobWizardFieldStatsFlyoutProps { fieldValue?: string | number; timeRangeMs?: TimeRangeMs; populatedFields?: Set; + theme?: ThemeServiceStart; } /** @@ -34,6 +36,7 @@ export const MLFieldStatsFlyoutContext = createContext {}, timeRangeMs: undefined, populatedFields: undefined, + theme: undefined, }); /** @@ -41,5 +44,25 @@ export const MLFieldStatsFlyoutContext = createContext() { populatedFields, }; } + +export type UseFieldStatsTrigger = typeof useFieldStatsTrigger; diff --git a/x-pack/plugins/aiops/public/components/change_point_detection/fields_config.tsx b/x-pack/plugins/aiops/public/components/change_point_detection/fields_config.tsx index b1c0e3d89f35a..f967fffd45647 100644 --- a/x-pack/plugins/aiops/public/components/change_point_detection/fields_config.tsx +++ b/x-pack/plugins/aiops/public/components/change_point_detection/fields_config.tsx @@ -638,7 +638,7 @@ export const FieldsControls: FC> = ({ }) => { const { splitFieldsOptions, combinedQuery } = useChangePointDetectionContext(); const { dataView } = useDataSource(); - const { data, uiSettings, fieldFormats, charts, fieldStats } = useAiopsAppContext(); + const { data, uiSettings, fieldFormats, charts, fieldStats, theme } = useAiopsAppContext(); const timefilter = useTimefilter(); // required in order to trigger state updates useTimeRangeUpdates(); @@ -677,6 +677,7 @@ export const FieldsControls: FC> = ({ } : undefined } + theme={theme} > diff --git a/x-pack/plugins/aiops/public/components/document_count_content/document_count_content/document_count_content.tsx b/x-pack/plugins/aiops/public/components/document_count_content/document_count_content/document_count_content.tsx index 23caf21c39ee3..4dbf021e3b10b 100644 --- a/x-pack/plugins/aiops/public/components/document_count_content/document_count_content/document_count_content.tsx +++ b/x-pack/plugins/aiops/public/components/document_count_content/document_count_content/document_count_content.tsx @@ -15,6 +15,7 @@ import type { import { useAppSelector } from '@kbn/aiops-log-rate-analysis/state'; import { DocumentCountChartRedux } from '@kbn/aiops-components'; +import { AIOPS_EMBEDDABLE_ORIGIN } from '@kbn/aiops-common/constants'; import { useAiopsAppContext } from '../../../hooks/use_aiops_app_context'; @@ -37,17 +38,29 @@ export const DocumentCountContent: FC = ({ barHighlightColorOverride, ...docCountChartProps }) => { - const { data, uiSettings, fieldFormats, charts } = useAiopsAppContext(); + const { data, uiSettings, fieldFormats, charts, embeddingOrigin } = useAiopsAppContext(); const { documentStats } = useAppSelector((s) => s.logRateAnalysis); const { sampleProbability, totalCount, documentCountStats } = documentStats; if (documentCountStats === undefined) { - return totalCount !== undefined ? ( + return totalCount !== undefined && embeddingOrigin !== AIOPS_EMBEDDABLE_ORIGIN.DASHBOARD ? 
( ) : null; } + if (embeddingOrigin === AIOPS_EMBEDDABLE_ORIGIN.DASHBOARD) { + return ( + + ); + } + return ( diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_content/log_rate_analysis_content.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_content/log_rate_analysis_content.tsx index 7bf43037f45c0..2821b59353b52 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_content/log_rate_analysis_content.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_content/log_rate_analysis_content.tsx @@ -7,7 +7,7 @@ import { isEqual } from 'lodash'; import React, { useCallback, useEffect, useMemo, useRef, type FC } from 'react'; -import { EuiButton, EuiEmptyPrompt, EuiHorizontalRule, EuiPanel } from '@elastic/eui'; +import { EuiButton, EuiEmptyPrompt, EuiSpacer, EuiPanel } from '@elastic/eui'; import type * as estypes from '@elastic/elasticsearch/lib/api/typesWithBodyKey'; import type { BarStyleAccessor } from '@elastic/charts/dist/chart_types/xy_chart/utils/specs'; @@ -28,13 +28,16 @@ import { setInitialAnalysisStart, useAppDispatch, useAppSelector, + setGroupResults, } from '@kbn/aiops-log-rate-analysis/state'; +import { AIOPS_EMBEDDABLE_ORIGIN } from '@kbn/aiops-common/constants'; import { DocumentCountContent } from '../../document_count_content/document_count_content'; import { LogRateAnalysisResults, type LogRateAnalysisResultsData, } from '../log_rate_analysis_results'; +import { useAiopsAppContext } from '../../../hooks/use_aiops_app_context'; export const DEFAULT_SEARCH_QUERY: estypes.QueryDslQueryContainer = { match_all: {} }; const DEFAULT_SEARCH_BAR_QUERY: estypes.QueryDslQueryContainer = { @@ -69,9 +72,11 @@ export const LogRateAnalysisContent: FC = ({ onAnalysisCompleted, onWindowParametersChange, }) => { + const { embeddingOrigin } = useAiopsAppContext(); + const dispatch = useAppDispatch(); - const isRunning = useAppSelector((s) => s.logRateAnalysisStream.isRunning); + const isRunning = useAppSelector((s) => s.stream.isRunning); const significantItems = useAppSelector((s) => s.logRateAnalysisResults.significantItems); const significantItemsGroups = useAppSelector( (s) => s.logRateAnalysisResults.significantItemsGroups @@ -116,6 +121,7 @@ export const LogRateAnalysisContent: FC = ({ const { documentCountStats } = documentStats; function clearSelectionHandler() { + dispatch(setGroupResults(false)); dispatch(clearSelection()); dispatch(clearAllRowState()); } @@ -200,7 +206,11 @@ export const LogRateAnalysisContent: FC = ({ const changePointType = documentCountStats?.changePoint?.type; return ( - + {showDocumentCountContent && ( = ({ barStyleAccessor={barStyleAccessor} /> )} - + {showLogRateAnalysisResults && ( = ({ + timeRange, +}) => { + const { uiSettings } = useAiopsAppContext(); + const { dataView } = useDataSource(); + const { filters, query } = useFilterQueryUpdates(); + const appState = getDefaultLogRateAnalysisAppState({ + searchQuery: buildEsQuery( + dataView, + query ?? [], + filters ?? [], + uiSettings ? 
getEsQueryConfig(uiSettings) : undefined + ), + filters, + }); + const { searchQuery } = useSearch({ dataView, savedSearch: null }, appState, true); + + const timeRangeParsed = useMemo(() => { + if (timeRange) { + const min = datemath.parse(timeRange.from); + const max = datemath.parse(timeRange.to); + if (min && max) { + return { min, max }; + } + } + }, [timeRange]); + + return ( + <> + + + + ); +}; diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_options.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_options.tsx new file mode 100644 index 0000000000000..0aba8e2d763d4 --- /dev/null +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_options.tsx @@ -0,0 +1,190 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { FC } from 'react'; +import React from 'react'; + +import { EuiButtonGroup, EuiFlexGroup, EuiFlexItem, EuiText } from '@elastic/eui'; + +import { i18n } from '@kbn/i18n'; +import { + clearAllRowState, + setGroupResults, + useAppDispatch, + useAppSelector, +} from '@kbn/aiops-log-rate-analysis/state'; +import { + commonColumns, + significantItemColumns, + setSkippedColumns, + type LogRateAnalysisResultsTableColumnName, +} from '@kbn/aiops-log-rate-analysis/state/log_rate_analysis_table_slice'; +import { setCurrentFieldFilterSkippedItems } from '@kbn/aiops-log-rate-analysis/state/log_rate_analysis_field_candidates_slice'; + +import { ItemFilterPopover as FieldFilterPopover } from './item_filter_popover'; +import { ItemFilterPopover as ColumnFilterPopover } from './item_filter_popover'; + +const groupResultsMessage = i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResults', + { + defaultMessage: 'Smart grouping', + } +); +const fieldFilterHelpText = i18n.translate('xpack.aiops.logRateAnalysis.page.fieldFilterHelpText', { + defaultMessage: + 'Deselect non-relevant fields to remove them from the analysis and click the Apply button to rerun the analysis. 
Use the search bar to filter the list, then select/deselect multiple fields with the actions below.', +}); +const columnsFilterHelpText = i18n.translate( + 'xpack.aiops.logRateAnalysis.page.columnsFilterHelpText', + { + defaultMessage: 'Configure visible columns.', + } +); +const disabledFieldFilterApplyButtonTooltipContent = i18n.translate( + 'xpack.aiops.analysis.fieldSelectorNotEnoughFieldsSelected', + { + defaultMessage: 'Grouping requires at least 2 fields to be selected.', + } +); +const disabledColumnFilterApplyButtonTooltipContent = i18n.translate( + 'xpack.aiops.analysis.columnSelectorNotEnoughColumnsSelected', + { + defaultMessage: 'At least one column must be selected.', + } +); +const columnSearchAriaLabel = i18n.translate('xpack.aiops.analysis.columnSelectorAriaLabel', { + defaultMessage: 'Filter columns', +}); +const columnsButton = i18n.translate('xpack.aiops.logRateAnalysis.page.columnsFilterButtonLabel', { + defaultMessage: 'Columns', +}); +const fieldsButton = i18n.translate('xpack.aiops.analysis.fieldsButtonLabel', { + defaultMessage: 'Fields', +}); +const groupResultsOffMessage = i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResultsOff', + { + defaultMessage: 'Off', + } +); +const groupResultsOnMessage = i18n.translate( + 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResultsOn', + { + defaultMessage: 'On', + } +); +const resultsGroupedOffId = 'aiopsLogRateAnalysisGroupingOff'; +const resultsGroupedOnId = 'aiopsLogRateAnalysisGroupingOn'; + +export interface LogRateAnalysisOptionsProps { + foundGroups: boolean; + growFirstItem?: boolean; +} + +export const LogRateAnalysisOptions: FC = ({ + foundGroups, + growFirstItem = false, +}) => { + const dispatch = useAppDispatch(); + + const { groupResults } = useAppSelector((s) => s.logRateAnalysis); + const { isRunning } = useAppSelector((s) => s.stream); + const fieldCandidates = useAppSelector((s) => s.logRateAnalysisFieldCandidates); + const { skippedColumns } = useAppSelector((s) => s.logRateAnalysisTable); + const { fieldFilterUniqueItems, initialFieldFilterSkippedItems } = fieldCandidates; + const fieldFilterButtonDisabled = + isRunning || fieldCandidates.isLoading || fieldFilterUniqueItems.length === 0; + const toggleIdSelected = groupResults ? resultsGroupedOnId : resultsGroupedOffId; + + const onGroupResultsToggle = (optionId: string) => { + dispatch(setGroupResults(optionId === resultsGroupedOnId)); + // When toggling the group switch, clear all row selections + dispatch(clearAllRowState()); + }; + + const onVisibleColumnsChange = (columns: LogRateAnalysisResultsTableColumnName[]) => { + dispatch(setSkippedColumns(columns)); + }; + + const onFieldsFilterChange = (skippedFieldsUpdate: string[]) => { + dispatch(setCurrentFieldFilterSkippedItems(skippedFieldsUpdate)); + }; + + // Disable the grouping switch toggle only if no groups were found, + // the toggle wasn't enabled already and no fields were selected to be skipped. 
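The comment above explains when the grouping toggle is disabled. The sketch below condenses how that toggle is wired in the new LogRateAnalysisOptions component: the selected id is derived from the Redux groupResults flag, and changes dispatch back into the store. The hook and action names follow the imports in this file; the labels and ids are simplified for illustration, so this is not the full component.

// Sketch only: condensed grouping toggle wiring.
import React from 'react';
import type { FC } from 'react';
import { EuiButtonGroup } from '@elastic/eui';
import {
  clearAllRowState,
  setGroupResults,
  useAppDispatch,
  useAppSelector,
} from '@kbn/aiops-log-rate-analysis/state';

export const GroupResultsToggleSketch: FC<{ foundGroups: boolean }> = ({ foundGroups }) => {
  const dispatch = useAppDispatch();
  const { groupResults } = useAppSelector((s) => s.logRateAnalysis);

  // Disable only when there is nothing to group and grouping is not already on.
  const disabled = !foundGroups && !groupResults;

  return (
    <EuiButtonGroup
      legend="Smart grouping"
      isDisabled={disabled}
      idSelected={groupResults ? 'on' : 'off'}
      options={[
        { id: 'off', label: 'Off' },
        { id: 'on', label: 'On' },
      ]}
      onChange={(optionId) => {
        dispatch(setGroupResults(optionId === 'on'));
        // Clearing row state keeps selection consistent when switching table views.
        dispatch(clearAllRowState());
      }}
    />
  );
};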
+ const disabledGroupResultsSwitch = !foundGroups && !groupResults; + + const toggleButtons = [ + { + id: resultsGroupedOffId, + label: groupResultsOffMessage, + 'data-test-subj': 'aiopsLogRateAnalysisGroupSwitchOff', + }, + { + id: resultsGroupedOnId, + label: groupResultsOnMessage, + 'data-test-subj': 'aiopsLogRateAnalysisGroupSwitchOn', + }, + ]; + + return ( + <> + + + + {groupResultsMessage} + + + + + + + + + + + void} + /> + + + ); +}; diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_results.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_results.tsx index ad8e7740b505d..1eb4f8fd0af0d 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_results.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis/log_rate_analysis_results.tsx @@ -11,12 +11,13 @@ import { isEqual } from 'lodash'; import type * as estypes from '@elastic/elasticsearch/lib/api/typesWithBodyKey'; import { + EuiButtonIcon, EuiButton, - EuiButtonGroup, EuiCallOut, EuiEmptyPrompt, EuiFlexGroup, EuiFlexItem, + EuiToolTip, EuiSpacer, EuiText, } from '@elastic/eui'; @@ -26,6 +27,7 @@ import { ProgressControls } from '@kbn/aiops-components'; import { cancelStream, startStream } from '@kbn/ml-response-stream/client'; import { clearAllRowState, + setGroupResults, useAppDispatch, useAppSelector, } from '@kbn/aiops-log-rate-analysis/state'; @@ -37,8 +39,7 @@ import { import { i18n } from '@kbn/i18n'; import { FormattedMessage } from '@kbn/i18n-react'; import type { SignificantItem, SignificantItemGroup } from '@kbn/ml-agg-utils'; -import { useStorage } from '@kbn/ml-local-storage'; -import { AIOPS_ANALYSIS_RUN_ORIGIN } from '@kbn/aiops-common/constants'; +import { AIOPS_ANALYSIS_RUN_ORIGIN, AIOPS_EMBEDDABLE_ORIGIN } from '@kbn/aiops-common/constants'; import type { AiopsLogRateAnalysisSchema } from '@kbn/aiops-log-rate-analysis/api/schema'; import type { AiopsLogRateAnalysisSchemaSignificantItem } from '@kbn/aiops-log-rate-analysis/api/schema_v3'; import { @@ -50,15 +51,6 @@ import { fetchFieldCandidates } from '@kbn/aiops-log-rate-analysis/state/log_rat import { useAiopsAppContext } from '../../hooks/use_aiops_app_context'; import { useDataSource } from '../../hooks/use_data_source'; -import { - commonColumns, - significantItemColumns, -} from '../log_rate_analysis_results_table/use_columns'; -import { - AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, - type AiOpsKey, - type AiOpsStorageMapped, -} from '../../types/storage'; import { getGroupTableItems, @@ -66,68 +58,15 @@ import { LogRateAnalysisResultsGroupsTable, } from '../log_rate_analysis_results_table'; -import { ItemFilterPopover as FieldFilterPopover } from './item_filter_popover'; -import { ItemFilterPopover as ColumnFilterPopover } from './item_filter_popover'; import { LogRateAnalysisInfoPopover } from './log_rate_analysis_info_popover'; -import type { ColumnNames } from '../log_rate_analysis_results_table'; +import { LogRateAnalysisOptions } from './log_rate_analysis_options'; -const groupResultsMessage = i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResults', - { - defaultMessage: 'Smart grouping', - } -); const groupResultsHelpMessage = i18n.translate( 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResultsHelpMessage', { defaultMessage: 'Items which are unique to a group are marked by an asterisk (*).', } ); -const groupResultsOffMessage = i18n.translate( - 
'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResultsOff', - { - defaultMessage: 'Off', - } -); -const groupResultsOnMessage = i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.groupedSwitchLabel.groupResultsOn', - { - defaultMessage: 'On', - } -); -const resultsGroupedOffId = 'aiopsLogRateAnalysisGroupingOff'; -const resultsGroupedOnId = 'aiopsLogRateAnalysisGroupingOn'; -const fieldFilterHelpText = i18n.translate('xpack.aiops.logRateAnalysis.page.fieldFilterHelpText', { - defaultMessage: - 'Deselect non-relevant fields to remove them from the analysis and click the Apply button to rerun the analysis. Use the search bar to filter the list, then select/deselect multiple fields with the actions below.', -}); -const columnsFilterHelpText = i18n.translate( - 'xpack.aiops.logRateAnalysis.page.columnsFilterHelpText', - { - defaultMessage: 'Configure visible columns.', - } -); -const disabledFieldFilterApplyButtonTooltipContent = i18n.translate( - 'xpack.aiops.analysis.fieldSelectorNotEnoughFieldsSelected', - { - defaultMessage: 'Grouping requires at least 2 fields to be selected.', - } -); -const disabledColumnFilterApplyButtonTooltipContent = i18n.translate( - 'xpack.aiops.analysis.columnSelectorNotEnoughColumnsSelected', - { - defaultMessage: 'At least one column must be selected.', - } -); -const columnSearchAriaLabel = i18n.translate('xpack.aiops.analysis.columnSelectorAriaLabel', { - defaultMessage: 'Filter columns', -}); -const columnsButton = i18n.translate('xpack.aiops.logRateAnalysis.page.columnsFilterButtonLabel', { - defaultMessage: 'Columns', -}); -const fieldsButton = i18n.translate('xpack.aiops.analysis.fieldsButtonLabel', { - defaultMessage: 'Fields', -}); /** * Interface for log rate analysis results data. @@ -173,10 +112,12 @@ export const LogRateAnalysisResults: FC = ({ documentStats: { sampleProbability }, stickyHistogram, isBrushCleared, + groupResults, } = useAppSelector((s) => s.logRateAnalysis); - const { isRunning, errors: streamErrors } = useAppSelector((s) => s.logRateAnalysisStream); + const { isRunning, errors: streamErrors } = useAppSelector((s) => s.stream); const data = useAppSelector((s) => s.logRateAnalysisResults); const fieldCandidates = useAppSelector((s) => s.logRateAnalysisFieldCandidates); + const { skippedColumns } = useAppSelector((s) => s.logRateAnalysisTable); const { currentAnalysisWindowParameters } = data; // Store the performance metric's start time using a ref @@ -185,61 +126,37 @@ export const LogRateAnalysisResults: FC = ({ const abortCtrl = useRef(new AbortController()); const previousSearchQuery = useRef(searchQuery); - const [groupResults, setGroupResults] = useState(false); const [overrides, setOverrides] = useState( undefined ); const [shouldStart, setShouldStart] = useState(false); - const [toggleIdSelected, setToggleIdSelected] = useState(resultsGroupedOffId); - const [skippedColumns, setSkippedColumns] = useStorage< - AiOpsKey, - AiOpsStorageMapped - >(AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, ['p-value', 'Baseline rate', 'Deviation rate']); - // null is used as the uninitialized state to identify the first load. 
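The comment above documents the pattern being removed in this hunk: local state starts as null so the first non-empty set of defaults is applied exactly once, without overwriting later user selections. A minimal, generic sketch of that init-once pattern (names are illustrative):

// Sketch only: `null` marks "not initialized yet"; the effect seeds the state
// a single time and then leaves it to the user.
import { useEffect, useState } from 'react';

export function useInitOnceSketch(defaults: string[]) {
  const [skippedFields, setSkippedFields] = useState<string[] | null>(null);

  useEffect(() => {
    if (skippedFields === null && defaults.length > 0) {
      setSkippedFields(defaults);
    }
  }, [defaults, skippedFields]);

  return [skippedFields, setSkippedFields] as const;
}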
- const [skippedFields, setSkippedFields] = useState(null); - - const onGroupResultsToggle = (optionId: string) => { - setToggleIdSelected(optionId); - setGroupResults(optionId === resultsGroupedOnId); - - // When toggling the group switch, clear all row selections - dispatch(clearAllRowState()); + const [embeddableOptionsVisible, setEmbeddableOptionsVisible] = useState(false); + + const onEmbeddableOptionsClickHandler = () => { + setEmbeddableOptionsVisible((s) => !s); }; - const { - fieldFilterUniqueItems, - fieldFilterSkippedItems, - keywordFieldCandidates, - textFieldCandidates, - } = fieldCandidates; - const fieldFilterButtonDisabled = - isRunning || fieldCandidates.isLoading || fieldFilterUniqueItems.length === 0; - - // Set skipped fields only on first load, otherwise we'd overwrite the user's selection. + const { currentFieldFilterSkippedItems, keywordFieldCandidates, textFieldCandidates } = + fieldCandidates; + useEffect(() => { - if (skippedFields === null && fieldFilterSkippedItems.length > 0) - setSkippedFields(fieldFilterSkippedItems); - }, [fieldFilterSkippedItems, skippedFields]); + if (currentFieldFilterSkippedItems === null) return; - const onFieldsFilterChange = (skippedFieldsUpdate: string[]) => { dispatch(resetResults()); - setSkippedFields(skippedFieldsUpdate); setOverrides({ loaded: 0, remainingKeywordFieldCandidates: keywordFieldCandidates.filter( - (d) => !skippedFieldsUpdate.includes(d) + (d) => !currentFieldFilterSkippedItems.includes(d) ), remainingTextFieldCandidates: textFieldCandidates.filter( - (d) => !skippedFieldsUpdate.includes(d) + (d) => !currentFieldFilterSkippedItems.includes(d) ), regroupOnly: false, }); startHandler(true, false); - }; - - const onVisibleColumnsChange = (columns: ColumnNames[]) => { - setSkippedColumns(columns); - }; + // custom check to trigger on currentFieldFilterSkippedItems change + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [currentFieldFilterSkippedItems]); function cancelHandler() { abortCtrl.current.abort(); @@ -299,18 +216,20 @@ export const LogRateAnalysisResults: FC = ({ dispatch(resetResults()); setOverrides({ remainingKeywordFieldCandidates: keywordFieldCandidates.filter( - (d) => skippedFields === null || !skippedFields.includes(d) + (d) => + currentFieldFilterSkippedItems === null || !currentFieldFilterSkippedItems.includes(d) ), remainingTextFieldCandidates: textFieldCandidates.filter( - (d) => skippedFields === null || !skippedFields.includes(d) + (d) => + currentFieldFilterSkippedItems === null || !currentFieldFilterSkippedItems.includes(d) ), }); } // Reset grouping to false and clear all row selections when restarting the analysis. if (resetGroupButton) { - setGroupResults(false); - setToggleIdSelected(resultsGroupedOffId); + dispatch(setGroupResults(false)); + // When toggling the group switch, clear all row selections dispatch(clearAllRowState()); } @@ -372,12 +291,13 @@ export const LogRateAnalysisResults: FC = ({ // eslint-disable-next-line react-hooks/exhaustive-deps }, [shouldStart]); + // On mount, fetch field candidates first. Once they are populated, + // the actual analysis will be triggered. 
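The comment above describes the new bootstrapping order: the mount effect only fetches field candidates, and a second effect keyed on the populated candidate state starts the analysis. A simplified sketch of that two-step chain; the thunk and selector names follow the imports in this file, while the surrounding hook and its parameters are illustrative.

// Sketch only: fetch candidates on mount, start the analysis once they arrive.
import { useEffect } from 'react';
import { useAppDispatch, useAppSelector } from '@kbn/aiops-log-rate-analysis/state';
import { fetchFieldCandidates } from '@kbn/aiops-log-rate-analysis/state/log_rate_analysis_field_candidates_slice';

export function useAnalysisBootstrapSketch(
  startParams: Parameters<typeof fetchFieldCandidates>[0],
  startAnalysis: () => void
) {
  const dispatch = useAppDispatch();
  const { currentFieldFilterSkippedItems } = useAppSelector(
    (s) => s.logRateAnalysisFieldCandidates
  );

  // Step 1: on mount, only request the field candidates.
  useEffect(() => {
    dispatch(fetchFieldCandidates(startParams));
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  // Step 2: once the skipped-items state is populated (no longer null), run the analysis.
  useEffect(() => {
    if (currentFieldFilterSkippedItems === null) return;
    startAnalysis();
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [currentFieldFilterSkippedItems]);
}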
useEffect(() => { if (startParams) { dispatch(fetchFieldCandidates(startParams)); dispatch(setCurrentAnalysisType(analysisType)); dispatch(setCurrentAnalysisWindowParameters(chartWindowParameters)); - dispatch(startStream(startParams)); } // eslint-disable-next-line react-hooks/exhaustive-deps }, []); @@ -413,23 +333,6 @@ export const LogRateAnalysisResults: FC = ({ }, 0); const foundGroups = groupTableItems.length > 0 && groupItemCount > 0; - // Disable the grouping switch toggle only if no groups were found, - // the toggle wasn't enabled already and no fields were selected to be skipped. - const disabledGroupResultsSwitch = !foundGroups && !groupResults; - - const toggleButtons = [ - { - id: resultsGroupedOffId, - label: groupResultsOffMessage, - 'data-test-subj': 'aiopsLogRateAnalysisGroupSwitchOff', - }, - { - id: resultsGroupedOnId, - label: groupResultsOnMessage, - 'data-test-subj': 'aiopsLogRateAnalysisGroupSwitchOn', - }, - ]; - return (
= ({ shouldRerunAnalysis={shouldRerunAnalysis || searchQueryUpdated} analysisInfo={} > - - + <> + {embeddingOrigin !== AIOPS_EMBEDDABLE_ORIGIN.DASHBOARD && ( + + )} + {embeddingOrigin === AIOPS_EMBEDDABLE_ORIGIN.DASHBOARD && ( - {groupResultsMessage} + + + - - - - - - - - - - void} - /> - + )} + + {embeddingOrigin === AIOPS_EMBEDDABLE_ORIGIN.DASHBOARD && embeddableOptionsVisible && ( + <> + + + + + + )} + {errors.length > 0 ? ( <> diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/index.ts b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/index.ts index 6813e71704918..c5112723e2784 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/index.ts +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/index.ts @@ -8,4 +8,3 @@ export { getGroupTableItems } from './get_group_table_items'; export { LogRateAnalysisResultsTable } from './log_rate_analysis_results_table'; export { LogRateAnalysisResultsGroupsTable } from './log_rate_analysis_results_table_groups'; -export type { ColumnNames } from './use_columns'; diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table.tsx index 83be306e93f50..e9072c2929f14 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table.tsx @@ -83,13 +83,11 @@ export const LogRateAnalysisResultsTable: FC = }, [allSignificantItems, groupFilter]); const zeroDocsFallback = useAppSelector((s) => s.logRateAnalysisResults.zeroDocsFallback); - const pinnedGroup = useAppSelector((s) => s.logRateAnalysisTableRow.pinnedGroup); - const selectedGroup = useAppSelector((s) => s.logRateAnalysisTableRow.selectedGroup); - const pinnedSignificantItem = useAppSelector( - (s) => s.logRateAnalysisTableRow.pinnedSignificantItem - ); + const pinnedGroup = useAppSelector((s) => s.logRateAnalysisTable.pinnedGroup); + const selectedGroup = useAppSelector((s) => s.logRateAnalysisTable.selectedGroup); + const pinnedSignificantItem = useAppSelector((s) => s.logRateAnalysisTable.pinnedSignificantItem); const selectedSignificantItem = useAppSelector( - (s) => s.logRateAnalysisTableRow.selectedSignificantItem + (s) => s.logRateAnalysisTable.selectedSignificantItem ); const dispatch = useAppDispatch(); diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table_groups.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table_groups.tsx index d69a0fec7200f..6bd0a5e4ce213 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table_groups.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/log_rate_analysis_results_table_groups.tsx @@ -91,8 +91,8 @@ export const LogRateAnalysisResultsGroupsTable: FC s.logRateAnalysisTableRow.pinnedGroup); - const selectedGroup = useAppSelector((s) => s.logRateAnalysisTableRow.selectedGroup); + const pinnedGroup = useAppSelector((s) => s.logRateAnalysisTable.pinnedGroup); + const selectedGroup = useAppSelector((s) => s.logRateAnalysisTable.selectedGroup); const dispatch = useAppDispatch(); const isMounted = useMountedState(); diff --git 
a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_columns.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_columns.tsx index f3b8195767101..c5b7a83e33641 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_columns.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_columns.tsx @@ -21,6 +21,11 @@ import { type SignificantItem, SIGNIFICANT_ITEM_TYPE } from '@kbn/ml-agg-utils'; import { getCategoryQuery } from '@kbn/aiops-log-pattern-analysis/get_category_query'; import type { FieldStatsServices } from '@kbn/unified-field-list/src/components/field_stats'; import { useAppSelector } from '@kbn/aiops-log-rate-analysis/state'; +import { + commonColumns, + significantItemColumns, + type LogRateAnalysisResultsTableColumnName, +} from '@kbn/aiops-log-rate-analysis/state/log_rate_analysis_table_slice'; import { getBaselineAndDeviationRates, getLogRateChange, @@ -40,55 +45,6 @@ const TRUNCATE_TEXT_LINES = 3; const UNIQUE_COLUMN_WIDTH = '40px'; const NOT_AVAILABLE = '--'; -export const commonColumns = { - ['Log rate']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.logRateColumnTitle', { - defaultMessage: 'Log rate', - }), - ['Doc count']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.docCountColumnTitle', { - defaultMessage: 'Doc count', - }), - ['p-value']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.pValueColumnTitle', { - defaultMessage: 'p-value', - }), - ['Impact']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.impactColumnTitle', { - defaultMessage: 'Impact', - }), - ['Baseline rate']: i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.baselineRateColumnTitle', - { - defaultMessage: 'Baseline rate', - } - ), - ['Deviation rate']: i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.deviationRateColumnTitle', - { - defaultMessage: 'Deviation rate', - } - ), - ['Log rate change']: i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.logRateChangeColumnTitle', - { - defaultMessage: 'Log rate change', - } - ), - ['Actions']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.actionsColumnTitle', { - defaultMessage: 'Actions', - }), -}; - -export const significantItemColumns = { - ['Field name']: i18n.translate('xpack.aiops.logRateAnalysis.resultsTable.fieldNameColumnTitle', { - defaultMessage: 'Field name', - }), - ['Field value']: i18n.translate( - 'xpack.aiops.logRateAnalysis.resultsTable.fieldValueColumnTitle', - { - defaultMessage: 'Field value', - } - ), - ...commonColumns, -} as const; - export const LOG_RATE_ANALYSIS_RESULTS_TABLE_TYPE = { GROUPS: 'groups', SIGNIFICANT_ITEMS: 'significantItems', @@ -96,8 +52,6 @@ export const LOG_RATE_ANALYSIS_RESULTS_TABLE_TYPE = { export type LogRateAnalysisResultsTableType = (typeof LOG_RATE_ANALYSIS_RESULTS_TABLE_TYPE)[keyof typeof LOG_RATE_ANALYSIS_RESULTS_TABLE_TYPE]; -export type ColumnNames = keyof typeof significantItemColumns | 'unique'; - const logRateHelpMessage = i18n.translate( 'xpack.aiops.logRateAnalysis.resultsTable.logRateColumnTooltip', { @@ -213,7 +167,7 @@ export const useColumns = ( const { earliest, latest } = useAppSelector((s) => s.logRateAnalysis); const timeRangeMs = { from: earliest ?? 0, to: latest ?? 
0 }; - const loading = useAppSelector((s) => s.logRateAnalysisStream.isRunning); + const loading = useAppSelector((s) => s.stream.isRunning); const zeroDocsFallback = useAppSelector((s) => s.logRateAnalysisResults.zeroDocsFallback); const { documentStats: { documentCountStats }, @@ -271,7 +225,10 @@ export const useColumns = ( [currentAnalysisType, buckets] ); - const columnsMap: Record> = useMemo( + const columnsMap: Record< + LogRateAnalysisResultsTableColumnName, + EuiBasicTableColumn + > = useMemo( () => ({ ['Field name']: { 'data-test-subj': 'aiopsLogRateAnalysisResultsTableColumnFieldName', @@ -615,20 +572,21 @@ export const useColumns = ( ); const columns = useMemo(() => { - const columnNamesToReturn: Partial> = isGroupsTable - ? commonColumns - : significantItemColumns; + const columnNamesToReturn: Partial> = + isGroupsTable ? commonColumns : significantItemColumns; const columnsToReturn = []; for (const columnName in columnNamesToReturn) { if ( Object.hasOwn(columnNamesToReturn, columnName) === false || - skippedColumns.includes(columnNamesToReturn[columnName as ColumnNames] as string) || + skippedColumns.includes( + columnNamesToReturn[columnName as LogRateAnalysisResultsTableColumnName] as string + ) || ((columnName === 'p-value' || columnName === 'Impact') && zeroDocsFallback) ) continue; - columnsToReturn.push(columnsMap[columnName as ColumnNames]); + columnsToReturn.push(columnsMap[columnName as LogRateAnalysisResultsTableColumnName]); } if (isExpandedRow === true) { diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_discover_action.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_discover_action.tsx index ec4284d6452e5..765f1435a93ad 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_discover_action.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_discover_action.tsx @@ -28,8 +28,8 @@ export const useViewInDiscoverAction = (dataViewId?: string): TableItemAction => const { application, share, data } = useAiopsAppContext(); const discoverLocator = useMemo( - () => share.url.locators.get('DISCOVER_APP_LOCATOR'), - [share.url.locators] + () => share?.url.locators.get('DISCOVER_APP_LOCATOR'), + [share?.url.locators] ); const discoverUrlError = useMemo(() => { diff --git a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_log_pattern_analysis_action.tsx b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_log_pattern_analysis_action.tsx index dbac6fbe8c9f3..ec1f6774b6b46 100644 --- a/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_log_pattern_analysis_action.tsx +++ b/x-pack/plugins/aiops/public/components/log_rate_analysis_results_table/use_view_in_log_pattern_analysis_action.tsx @@ -32,7 +32,7 @@ const viewInLogPatternAnalysisMessage = i18n.translate( export const useViewInLogPatternAnalysisAction = (dataViewId?: string): TableItemAction => { const { application, share, data } = useAiopsAppContext(); - const mlLocator = useMemo(() => share.url.locators.get('ML_APP_LOCATOR'), [share.url.locators]); + const mlLocator = useMemo(() => share?.url.locators.get('ML_APP_LOCATOR'), [share?.url.locators]); const generateLogPatternAnalysisUrl = async ( groupTableItem: GroupTableItem | SignificantItem diff --git a/x-pack/plugins/aiops/public/components/time_field_warning.tsx 
b/x-pack/plugins/aiops/public/components/time_field_warning.tsx new file mode 100644 index 0000000000000..beb64917af538 --- /dev/null +++ b/x-pack/plugins/aiops/public/components/time_field_warning.tsx @@ -0,0 +1,32 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import { EuiSpacer, EuiCallOut } from '@elastic/eui'; +import { i18n } from '@kbn/i18n'; +import React from 'react'; + +export const TimeFieldWarning = () => { + return ( + <> + +

+ {i18n.translate('xpack.aiops.embeddableMenu.timeFieldWarning.title.description', { + defaultMessage: 'The analysis can only be run on data views with a time field.', + })} +

+
+ + + ); +}; diff --git a/x-pack/plugins/aiops/public/embeddables/change_point_chart/change_point_chart_initializer.tsx b/x-pack/plugins/aiops/public/embeddables/change_point_chart/change_point_chart_initializer.tsx index 08cf6ffd7dcb1..e69511fe45f92 100644 --- a/x-pack/plugins/aiops/public/embeddables/change_point_chart/change_point_chart_initializer.tsx +++ b/x-pack/plugins/aiops/public/embeddables/change_point_chart/change_point_chart_initializer.tsx @@ -18,16 +18,23 @@ import { EuiHorizontalRule, EuiTitle, } from '@elastic/eui'; +import { DatePickerContextProvider, type DatePickerDependencies } from '@kbn/ml-date-picker'; +import { UI_SETTINGS } from '@kbn/data-plugin/common'; +import type { FieldStatsServices } from '@kbn/unified-field-list/src/components/field_stats'; import { ES_FIELD_TYPES } from '@kbn/field-types'; import { i18n } from '@kbn/i18n'; import { FormattedMessage } from '@kbn/i18n-react'; import { isPopulatedObject } from '@kbn/ml-is-populated-object'; +import { FieldStatsFlyoutProvider } from '@kbn/ml-field-stats-flyout'; +import { useTimefilter } from '@kbn/ml-date-picker'; import { pick } from 'lodash'; import type { FC } from 'react'; import React, { useCallback, useEffect, useMemo, useState } from 'react'; import usePrevious from 'react-use/lib/usePrevious'; import { + ChangePointDetectionContextProvider, ChangePointDetectionControlsContextProvider, + useChangePointDetectionContext, useChangePointDetectionControlsContext, } from '../../components/change_point_detection/change_point_detection_context'; import { DEFAULT_AGG_FUNCTION } from '../../components/change_point_detection/constants'; @@ -38,7 +45,8 @@ import { PartitionsSelector } from '../../components/change_point_detection/part import { SplitFieldSelector } from '../../components/change_point_detection/split_field_selector'; import { ViewTypeSelector } from '../../components/change_point_detection/view_type_selector'; import { useAiopsAppContext } from '../../hooks/use_aiops_app_context'; -import { DataSourceContextProvider } from '../../hooks/use_data_source'; +import { useDataSource, DataSourceContextProvider } from '../../hooks/use_data_source'; +import { FilterQueryContextProvider } from '../../hooks/use_filters_query'; import { DEFAULT_SERIES } from './const'; import type { ChangePointEmbeddableRuntimeState } from './types'; @@ -53,11 +61,18 @@ export const ChangePointChartInitializer: FC = ({ onCreate, onCancel, }) => { + const appContextValue = useAiopsAppContext(); const { + data: { dataViews }, unifiedSearch: { ui: { IndexPatternSelect }, }, - } = useAiopsAppContext(); + } = appContextValue; + + const datePickerDeps: DatePickerDependencies = { + ...pick(appContextValue, ['data', 'http', 'notifications', 'theme', 'uiSettings', 'i18n']), + uiSettingsKeys: UI_SETTINGS, + }; const [dataViewId, setDataViewId] = useState(initialInput?.dataViewId ?? ''); const [viewType, setViewType] = useState(initialInput?.viewType ?? 
'charts'); @@ -135,15 +150,21 @@ export const ChangePointChartInitializer: FC = ({ }} /> - - - - - + + + + + + + + + + + @@ -190,7 +211,13 @@ export const FormControls: FC<{ onChange: (update: FormControlsProps) => void; onValidationChange: (isValid: boolean) => void; }> = ({ formInput, onChange, onValidationChange }) => { + const { charts, data, fieldFormats, theme, uiSettings } = useAiopsAppContext(); + const { dataView } = useDataSource(); + const { combinedQuery } = useChangePointDetectionContext(); const { metricFieldOptions, splitFieldsOptions } = useChangePointDetectionControlsContext(); + const timefilter = useTimefilter(); + const timefilterActiveBounds = timefilter.getActiveBounds(); + const prevMetricFieldOptions = usePrevious(metricFieldOptions); const enableSearch = useMemo(() => { @@ -238,10 +265,33 @@ export const FormControls: FC<{ [formInput, onChange] ); + const fieldStatsServices: FieldStatsServices = useMemo(() => { + return { + uiSettings, + dataViews: data.dataViews, + data, + fieldFormats, + charts, + }; + }, [uiSettings, data, fieldFormats, charts]); + if (!isPopulatedObject(formInput)) return null; return ( - <> + updateCallback({ maxSeriesToPlot: v })} onValidationChange={(result) => onValidationChange(result === null)} /> - + ); }; diff --git a/x-pack/plugins/aiops/public/embeddables/change_point_chart/embeddable_change_point_chart_factory.tsx b/x-pack/plugins/aiops/public/embeddables/change_point_chart/embeddable_change_point_chart_factory.tsx index 7cf39eb1cf4ae..2ce1a46780db1 100644 --- a/x-pack/plugins/aiops/public/embeddables/change_point_chart/embeddable_change_point_chart_factory.tsx +++ b/x-pack/plugins/aiops/public/embeddables/change_point_chart/embeddable_change_point_chart_factory.tsx @@ -11,7 +11,6 @@ import { } from '@kbn/aiops-change-point-detection/constants'; import type { Reference } from '@kbn/content-management-utils'; import type { StartServicesAccessor } from '@kbn/core-lifecycle-browser'; -import { type DataPublicPluginStart } from '@kbn/data-plugin/public'; import type { DataView } from '@kbn/data-views-plugin/common'; import { DATA_VIEW_SAVED_OBJECT_TYPE } from '@kbn/data-views-plugin/common'; import type { ReactEmbeddableFactory } from '@kbn/embeddable-plugin/public'; @@ -38,32 +37,8 @@ import type { ChangePointEmbeddableState, } from './types'; -export interface EmbeddableChangePointChartStartServices { - data: DataPublicPluginStart; -} - export type EmbeddableChangePointChartType = typeof EMBEDDABLE_CHANGE_POINT_CHART_TYPE; -export const getDependencies = async ( - getStartServices: StartServicesAccessor -) => { - const [ - { http, uiSettings, notifications, ...startServices }, - { lens, data, usageCollection, fieldFormats }, - ] = await getStartServices(); - - return { - http, - uiSettings, - data, - notifications, - lens, - usageCollection, - fieldFormats, - ...startServices, - }; -}; - export const getChangePointChartEmbeddableFactory = ( getStartServices: StartServicesAccessor ) => { @@ -88,20 +63,6 @@ export const getChangePointChartEmbeddableFactory = ( buildEmbeddable: async (state, buildApi, uuid, parentApi) => { const [coreStart, pluginStart] = await getStartServices(); - const { http, uiSettings, notifications, ...startServices } = coreStart; - const { lens, data, usageCollection, fieldFormats } = pluginStart; - - const deps = { - http, - uiSettings, - data, - notifications, - lens, - usageCollection, - fieldFormats, - ...startServices, - }; - const { api: timeRangeApi, comparators: timeRangeComparators, @@ -120,7 +81,7 @@ 
export const getChangePointChartEmbeddableFactory = ( const blockingError = new BehaviorSubject(undefined); const dataViews$ = new BehaviorSubject([ - await deps.data.dataViews.get(state.dataViewId), + await pluginStart.data.dataViews.get(state.dataViewId), ]); const api = buildApi( diff --git a/x-pack/plugins/aiops/public/embeddables/change_point_chart/types.ts b/x-pack/plugins/aiops/public/embeddables/change_point_chart/types.ts index 4a39020a299c9..f86270bdaf19d 100644 --- a/x-pack/plugins/aiops/public/embeddables/change_point_chart/types.ts +++ b/x-pack/plugins/aiops/public/embeddables/change_point_chart/types.ts @@ -15,14 +15,6 @@ import type { SerializedTimeRange, SerializedTitles, } from '@kbn/presentation-publishing'; -import type { FC } from 'react'; -import type { SelectedChangePoint } from '../../components/change_point_detection/change_point_detection_context'; - -export type ViewComponent = FC<{ - changePoints: SelectedChangePoint[]; - interval: string; - onRenderComplete?: () => void; -}>; export interface ChangePointComponentApi { viewType: PublishingSubject; diff --git a/x-pack/plugins/aiops/public/embeddables/index.ts b/x-pack/plugins/aiops/public/embeddables/index.ts index b7d9ad25951fb..dae1f0eb3eeec 100644 --- a/x-pack/plugins/aiops/public/embeddables/index.ts +++ b/x-pack/plugins/aiops/public/embeddables/index.ts @@ -9,6 +9,7 @@ import type { CoreSetup } from '@kbn/core-lifecycle-browser'; import type { EmbeddableSetup } from '@kbn/embeddable-plugin/public'; import { EMBEDDABLE_CHANGE_POINT_CHART_TYPE } from '@kbn/aiops-change-point-detection/constants'; import { EMBEDDABLE_PATTERN_ANALYSIS_TYPE } from '@kbn/aiops-log-pattern-analysis/constants'; +import { EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE } from '@kbn/aiops-log-rate-analysis/constants'; import type { AiopsPluginStart, AiopsPluginStartDeps } from '../types'; export const registerEmbeddables = ( @@ -23,4 +24,8 @@ export const registerEmbeddables = ( const { getPatternAnalysisEmbeddableFactory } = await import('./pattern_analysis'); return getPatternAnalysisEmbeddableFactory(core.getStartServices); }); + embeddable.registerReactEmbeddableFactory(EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE, async () => { + const { getLogRateAnalysisEmbeddableFactory } = await import('./log_rate_analysis'); + return getLogRateAnalysisEmbeddableFactory(core.getStartServices); + }); }; diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/embeddable_log_rate_analysis_factory.tsx b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/embeddable_log_rate_analysis_factory.tsx new file mode 100644 index 0000000000000..592ec32cef120 --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/embeddable_log_rate_analysis_factory.tsx @@ -0,0 +1,211 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import { + EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE, + LOG_RATE_ANALYSIS_DATA_VIEW_REF_NAME, +} from '@kbn/aiops-log-rate-analysis/constants'; +import type { Reference } from '@kbn/content-management-utils'; +import type { StartServicesAccessor } from '@kbn/core-lifecycle-browser'; +import type { DataView } from '@kbn/data-views-plugin/common'; +import { DATA_VIEW_SAVED_OBJECT_TYPE } from '@kbn/data-views-plugin/common'; +import type { ReactEmbeddableFactory } from '@kbn/embeddable-plugin/public'; +import { i18n } from '@kbn/i18n'; +import { + apiHasExecutionContext, + fetch$, + initializeTimeRange, + initializeTitles, + useBatchedPublishingSubjects, +} from '@kbn/presentation-publishing'; + +import fastIsEqual from 'fast-deep-equal'; +import { cloneDeep } from 'lodash'; +import React, { useMemo } from 'react'; +import useObservable from 'react-use/lib/useObservable'; +import { BehaviorSubject, distinctUntilChanged, map, skipWhile } from 'rxjs'; +import { getLogRateAnalysisEmbeddableWrapperComponent } from '../../shared_components'; +import type { AiopsPluginStart, AiopsPluginStartDeps } from '../../types'; +import { initializeLogRateAnalysisControls } from './initialize_log_rate_analysis_analysis_controls'; +import type { + LogRateAnalysisEmbeddableApi, + LogRateAnalysisEmbeddableRuntimeState, + LogRateAnalysisEmbeddableState, +} from './types'; + +export const getLogRateAnalysisEmbeddableFactory = ( + getStartServices: StartServicesAccessor +) => { + const factory: ReactEmbeddableFactory< + LogRateAnalysisEmbeddableState, + LogRateAnalysisEmbeddableRuntimeState, + LogRateAnalysisEmbeddableApi + > = { + type: EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE, + deserializeState: (state) => { + const serializedState = cloneDeep(state.rawState); + // inject the reference + const dataViewIdRef = state.references?.find( + (ref) => ref.name === LOG_RATE_ANALYSIS_DATA_VIEW_REF_NAME + ); + // if the serializedState already contains a dataViewId, we don't want to overwrite it. (Unsaved state can cause this) + if (dataViewIdRef && serializedState && !serializedState.dataViewId) { + serializedState.dataViewId = dataViewIdRef?.id; + } + return serializedState; + }, + buildEmbeddable: async (state, buildApi, uuid, parentApi) => { + const [coreStart, pluginStart] = await getStartServices(); + + const { + api: timeRangeApi, + comparators: timeRangeComparators, + serialize: serializeTimeRange, + } = initializeTimeRange(state); + + const { titlesApi, titleComparators, serializeTitles } = initializeTitles(state); + + const { + logRateAnalysisControlsApi, + serializeLogRateAnalysisChartState, + logRateAnalysisControlsComparators, + } = initializeLogRateAnalysisControls(state); + + const dataLoading = new BehaviorSubject(true); + const blockingError = new BehaviorSubject(undefined); + + const dataViews$ = new BehaviorSubject([ + await pluginStart.data.dataViews.get( + state.dataViewId ?? 
(await pluginStart.data.dataViews.getDefaultId()) + ), + ]); + + const api = buildApi( + { + ...timeRangeApi, + ...titlesApi, + ...logRateAnalysisControlsApi, + getTypeDisplayName: () => + i18n.translate('xpack.aiops.logRateAnalysis.typeDisplayName', { + defaultMessage: 'log rate analysis', + }), + isEditingEnabled: () => true, + onEdit: async () => { + try { + const { resolveEmbeddableLogRateAnalysisUserInput } = await import( + './resolve_log_rate_analysis_config_input' + ); + + const result = await resolveEmbeddableLogRateAnalysisUserInput( + coreStart, + pluginStart, + parentApi, + uuid, + false, + logRateAnalysisControlsApi, + undefined, + serializeLogRateAnalysisChartState() + ); + + logRateAnalysisControlsApi.updateUserInput(result); + } catch (e) { + return Promise.reject(); + } + }, + dataLoading, + blockingError, + dataViews: dataViews$, + serializeState: () => { + const dataViewId = logRateAnalysisControlsApi.dataViewId.getValue(); + const references: Reference[] = dataViewId + ? [ + { + type: DATA_VIEW_SAVED_OBJECT_TYPE, + name: LOG_RATE_ANALYSIS_DATA_VIEW_REF_NAME, + id: dataViewId, + }, + ] + : []; + return { + rawState: { + timeRange: undefined, + ...serializeTitles(), + ...serializeTimeRange(), + ...serializeLogRateAnalysisChartState(), + }, + references, + }; + }, + }, + { + ...timeRangeComparators, + ...titleComparators, + ...logRateAnalysisControlsComparators, + } + ); + + const LogRateAnalysisEmbeddableWrapper = getLogRateAnalysisEmbeddableWrapperComponent( + coreStart, + pluginStart + ); + + const onLoading = (v: boolean) => dataLoading.next(v); + const onRenderComplete = () => dataLoading.next(false); + const onError = (error: Error) => blockingError.next(error); + + return { + api, + Component: () => { + if (!apiHasExecutionContext(parentApi)) { + throw new Error('Parent API does not have execution context'); + } + + const [dataViewId] = useBatchedPublishingSubjects(api.dataViewId); + + const reload$ = useMemo( + () => + fetch$(api).pipe( + skipWhile((fetchContext) => !fetchContext.isReload), + map((fetchContext) => Date.now()) + ), + [] + ); + + const timeRange$ = useMemo( + () => + fetch$(api).pipe( + map((fetchContext) => fetchContext.timeRange), + distinctUntilChanged(fastIsEqual) + ), + [] + ); + + const lastReloadRequestTime = useObservable(reload$, Date.now()); + const timeRange = useObservable(timeRange$, undefined); + + const embeddingOrigin = apiHasExecutionContext(parentApi) + ? parentApi.executionContext.type + : undefined; + + return ( + + ); + }, + }; + }, + }; + + return factory; +}; diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/index.ts b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/index.ts new file mode 100644 index 0000000000000..2203d4c64bc8b --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/index.ts @@ -0,0 +1,8 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +export { getLogRateAnalysisEmbeddableFactory } from './embeddable_log_rate_analysis_factory'; diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/initialize_log_rate_analysis_analysis_controls.ts b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/initialize_log_rate_analysis_analysis_controls.ts new file mode 100644 index 0000000000000..9d8a49b8f0e9c --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/initialize_log_rate_analysis_analysis_controls.ts @@ -0,0 +1,43 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { StateComparators } from '@kbn/presentation-publishing'; +import { BehaviorSubject } from 'rxjs'; +import type { LogRateAnalysisComponentApi, LogRateAnalysisEmbeddableState } from './types'; + +type LogRateAnalysisEmbeddableCustomState = Omit< + LogRateAnalysisEmbeddableState, + 'timeRange' | 'title' | 'description' | 'hidePanelTitles' +>; + +export const initializeLogRateAnalysisControls = (rawState: LogRateAnalysisEmbeddableState) => { + const dataViewId = new BehaviorSubject(rawState.dataViewId); + + const updateUserInput = (update: LogRateAnalysisEmbeddableCustomState) => { + dataViewId.next(update.dataViewId); + }; + + const serializeLogRateAnalysisChartState = (): LogRateAnalysisEmbeddableCustomState => { + return { + dataViewId: dataViewId.getValue(), + }; + }; + + const logRateAnalysisControlsComparators: StateComparators = + { + dataViewId: [dataViewId, (arg) => dataViewId.next(arg)], + }; + + return { + logRateAnalysisControlsApi: { + dataViewId, + updateUserInput, + } as unknown as LogRateAnalysisComponentApi, + serializeLogRateAnalysisChartState, + logRateAnalysisControlsComparators, + }; +}; diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/log_rate_analysis_embeddable_initializer.tsx b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/log_rate_analysis_embeddable_initializer.tsx new file mode 100644 index 0000000000000..bf53f07677739 --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/log_rate_analysis_embeddable_initializer.tsx @@ -0,0 +1,230 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import type { FC } from 'react'; +import React, { useEffect, useMemo, useState, useCallback } from 'react'; +import { pick } from 'lodash'; +import useMountedState from 'react-use/lib/useMountedState'; + +import { + EuiFlyoutHeader, + EuiTitle, + EuiFlyoutBody, + EuiForm, + EuiFormRow, + EuiButton, + EuiButtonEmpty, + EuiFlexGroup, + EuiFlexItem, + EuiFlyoutFooter, + EuiSpacer, +} from '@elastic/eui'; + +import type { IndexPatternSelectProps } from '@kbn/unified-search-plugin/public'; +import { euiThemeVars } from '@kbn/ui-theme'; +import { i18n } from '@kbn/i18n'; +import { FormattedMessage } from '@kbn/i18n-react'; +import { isPopulatedObject } from '@kbn/ml-is-populated-object'; +import type { DataViewsPublicPluginStart } from '@kbn/data-views-plugin/public'; + +import { TimeFieldWarning } from '../../components/time_field_warning'; + +import type { LogRateAnalysisEmbeddableRuntimeState } from './types'; + +export interface LogRateAnalysisEmbeddableInitializerProps { + dataViews: DataViewsPublicPluginStart; + IndexPatternSelect: React.ComponentType; + initialInput?: Partial; + onCreate: (props: LogRateAnalysisEmbeddableRuntimeState) => void; + onCancel: () => void; + onPreview: (update: LogRateAnalysisEmbeddableRuntimeState) => Promise; + isNewPanel: boolean; +} + +export const LogRateAnalysisEmbeddableInitializer: FC< + LogRateAnalysisEmbeddableInitializerProps +> = ({ + dataViews, + IndexPatternSelect, + initialInput, + onCreate, + onCancel, + onPreview, + isNewPanel, +}) => { + const isMounted = useMountedState(); + + const [formInput, setFormInput] = useState( + pick(initialInput ?? {}, ['dataViewId']) as LogRateAnalysisEmbeddableRuntimeState + ); + + // State to track if the selected data view is time based, undefined is used + // to track that the check is in progress. + const [isDataViewTimeBased, setIsDataViewTimeBased] = useState(); + + const isFormValid = useMemo( + () => + isPopulatedObject(formInput, ['dataViewId']) && + formInput.dataViewId !== '' && + isDataViewTimeBased === true, + [formInput, isDataViewTimeBased] + ); + + const updatedProps = useMemo(() => { + return { + ...formInput, + title: isPopulatedObject(formInput) + ? i18n.translate('xpack.aiops.embeddableLogRateAnalysis.attachmentTitle', { + defaultMessage: 'Log rate analysis', + }) + : '', + }; + }, [formInput]); + + useEffect( + function previewChanges() { + if (isFormValid) { + onPreview(updatedProps); + } + }, + [isFormValid, onPreview, updatedProps, isDataViewTimeBased] + ); + + const setDataViewId = useCallback( + (dataViewId: string | undefined) => { + setFormInput({ + ...formInput, + dataViewId: dataViewId ?? '', + }); + setIsDataViewTimeBased(undefined); + }, + [formInput] + ); + + useEffect( + function checkIsDataViewTimeBased() { + setIsDataViewTimeBased(undefined); + + const { dataViewId } = formInput; + + if (!dataViewId) { + return; + } + + dataViews + .get(dataViewId) + .then((dataView) => { + if (!isMounted()) { + return; + } + setIsDataViewTimeBased(dataView.isTimeBased()); + }) + .catch(() => { + setIsDataViewTimeBased(undefined); + }); + }, + [dataViews, formInput, isMounted] + ); + + return ( + <> + + +

+            {isNewPanel
+              ? i18n.translate('xpack.aiops.embeddableLogRateAnalysis.config.title.new', {
+                  defaultMessage: 'Create log rate analysis',
+                })
+              : i18n.translate('xpack.aiops.embeddableLogRateAnalysis.config.title.edit', {
+                  defaultMessage: 'Edit log rate analysis',
+                })}
+ + + + + <> + { + setDataViewId(newId ?? ''); + }} + data-test-subj="aiopsLogRateAnalysisEmbeddableDataViewSelector" + /> + {isDataViewTimeBased === false && ( + <> + + + + )} + + + + + + + + + + + + + + + + + + + + + ); +}; diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/resolve_log_rate_analysis_config_input.tsx b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/resolve_log_rate_analysis_config_input.tsx new file mode 100644 index 0000000000000..a066b5bc722f2 --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/resolve_log_rate_analysis_config_input.tsx @@ -0,0 +1,94 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { CoreStart } from '@kbn/core/public'; +import { tracksOverlays } from '@kbn/presentation-containers'; +import { toMountPoint } from '@kbn/react-kibana-mount'; +import React from 'react'; +import type { AiopsPluginStartDeps } from '../../types'; +import { LogRateAnalysisEmbeddableInitializer } from './log_rate_analysis_embeddable_initializer'; +import type { LogRateAnalysisComponentApi, LogRateAnalysisEmbeddableState } from './types'; + +export async function resolveEmbeddableLogRateAnalysisUserInput( + coreStart: CoreStart, + pluginStart: AiopsPluginStartDeps, + parentApi: unknown, + focusedPanelId: string, + isNewPanel: boolean, + logRateAnalysisControlsApi: LogRateAnalysisComponentApi, + deletePanel?: () => void, + initialState?: LogRateAnalysisEmbeddableState +): Promise { + const { overlays } = coreStart; + + const overlayTracker = tracksOverlays(parentApi) ? parentApi : undefined; + + let hasChanged = false; + return new Promise(async (resolve, reject) => { + try { + const cancelChanges = () => { + if (isNewPanel && deletePanel) { + deletePanel(); + } else if (hasChanged && logRateAnalysisControlsApi && initialState) { + // Reset to initialState in case user has changed the preview state + logRateAnalysisControlsApi.updateUserInput(initialState); + } + + flyoutSession.close(); + overlayTracker?.clearOverlays(); + }; + + const update = async (nextUpdate: LogRateAnalysisEmbeddableState) => { + resolve(nextUpdate); + flyoutSession.close(); + overlayTracker?.clearOverlays(); + }; + + const preview = async (nextUpdate: LogRateAnalysisEmbeddableState) => { + if (logRateAnalysisControlsApi) { + logRateAnalysisControlsApi.updateUserInput(nextUpdate); + hasChanged = true; + } + }; + + const flyoutSession = overlays.openFlyout( + toMountPoint( + , + coreStart + ), + { + ownFocus: true, + size: 's', + type: 'push', + paddingSize: 'm', + hideCloseButton: true, + 'data-test-subj': 'aiopsLogRateAnalysisEmbeddableInitializer', + 'aria-labelledby': 'logRateAnalysisConfig', + onClose: () => { + reject(); + flyoutSession.close(); + overlayTracker?.clearOverlays(); + }, + } + ); + + if (tracksOverlays(parentApi)) { + parentApi.openOverlay(flyoutSession, { focusedPanelId }); + } + } catch (error) { + reject(error); + } + }); +} diff --git a/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/types.ts b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/types.ts new file mode 100644 index 0000000000000..d2255e6dacb87 --- /dev/null +++ b/x-pack/plugins/aiops/public/embeddables/log_rate_analysis/types.ts @@ -0,0 +1,39 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. 
under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { DefaultEmbeddableApi } from '@kbn/embeddable-plugin/public'; +import type { + HasEditCapabilities, + PublishesDataViews, + PublishesTimeRange, + PublishingSubject, + SerializedTimeRange, + SerializedTitles, +} from '@kbn/presentation-publishing'; + +export interface LogRateAnalysisComponentApi { + dataViewId: PublishingSubject; + updateUserInput: (update: LogRateAnalysisEmbeddableState) => void; +} + +export type LogRateAnalysisEmbeddableApi = DefaultEmbeddableApi & + HasEditCapabilities & + PublishesDataViews & + PublishesTimeRange & + LogRateAnalysisComponentApi; + +export interface LogRateAnalysisEmbeddableState extends SerializedTitles, SerializedTimeRange { + dataViewId: string; +} + +export interface LogRateAnalysisEmbeddableInitialState + extends SerializedTitles, + SerializedTimeRange { + dataViewId?: string; +} + +export type LogRateAnalysisEmbeddableRuntimeState = LogRateAnalysisEmbeddableState; diff --git a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/embeddable_pattern_analysis_factory.tsx b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/embeddable_pattern_analysis_factory.tsx index e7b1d6da3be61..d84043ea5f637 100644 --- a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/embeddable_pattern_analysis_factory.tsx +++ b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/embeddable_pattern_analysis_factory.tsx @@ -11,7 +11,6 @@ import { } from '@kbn/aiops-log-pattern-analysis/constants'; import type { Reference } from '@kbn/content-management-utils'; import type { StartServicesAccessor } from '@kbn/core-lifecycle-browser'; -import { type DataPublicPluginStart } from '@kbn/data-plugin/public'; import type { DataView } from '@kbn/data-views-plugin/common'; import { DATA_VIEW_SAVED_OBJECT_TYPE } from '@kbn/data-views-plugin/common'; import type { ReactEmbeddableFactory } from '@kbn/embeddable-plugin/public'; @@ -37,32 +36,6 @@ import type { PatternAnalysisEmbeddableState, } from './types'; -export interface EmbeddablePatternAnalysisStartServices { - data: DataPublicPluginStart; -} - -export type EmbeddablePatternAnalysisType = typeof EMBEDDABLE_PATTERN_ANALYSIS_TYPE; - -export const getDependencies = async ( - getStartServices: StartServicesAccessor -) => { - const [ - { http, uiSettings, notifications, ...startServices }, - { lens, data, usageCollection, fieldFormats }, - ] = await getStartServices(); - - return { - http, - uiSettings, - data, - notifications, - lens, - usageCollection, - fieldFormats, - ...startServices, - }; -}; - export const getPatternAnalysisEmbeddableFactory = ( getStartServices: StartServicesAccessor ) => { @@ -87,20 +60,6 @@ export const getPatternAnalysisEmbeddableFactory = ( buildEmbeddable: async (state, buildApi, uuid, parentApi) => { const [coreStart, pluginStart] = await getStartServices(); - const { http, uiSettings, notifications, ...startServices } = coreStart; - const { lens, data, usageCollection, fieldFormats } = pluginStart; - - const deps = { - http, - uiSettings, - data, - notifications, - lens, - usageCollection, - fieldFormats, - ...startServices, - }; - const { api: timeRangeApi, comparators: timeRangeComparators, @@ -119,8 +78,8 @@ export const getPatternAnalysisEmbeddableFactory = ( const blockingError = new BehaviorSubject(undefined); const dataViews$ = new BehaviorSubject([ - await 
deps.data.dataViews.get( - state.dataViewId ?? (await deps.data.dataViews.getDefaultId()) + await pluginStart.data.dataViews.get( + state.dataViewId ?? (await pluginStart.data.dataViews.getDefaultId()) ), ]); diff --git a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysys_component_wrapper.tsx b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_component_wrapper.tsx similarity index 100% rename from x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysys_component_wrapper.tsx rename to x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_component_wrapper.tsx diff --git a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_initializer.tsx b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_initializer.tsx index f44fff343fb50..ef185518638b8 100644 --- a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_initializer.tsx +++ b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/pattern_analysis_initializer.tsx @@ -33,6 +33,7 @@ import { useAiopsAppContext } from '../../hooks/use_aiops_app_context'; import { DataSourceContextProvider } from '../../hooks/use_data_source'; import type { PatternAnalysisEmbeddableRuntimeState } from './types'; import { PatternAnalysisSettings } from '../../components/log_categorization/log_categorization_for_embeddable/embeddable_menu'; +import { TimeFieldWarning } from '../../components/time_field_warning'; import { RandomSampler } from '../../components/log_categorization/sampling_menu'; import { DEFAULT_PROBABILITY, @@ -62,6 +63,7 @@ export const PatternAnalysisEmbeddableInitializer: FC { const { + data: { dataViews }, unifiedSearch: { ui: { IndexPatternSelect }, }, @@ -166,7 +168,7 @@ export const PatternAnalysisEmbeddableInitializer: FC - + { ); }; - -const TimeFieldWarning = () => { - return ( - <> - -

-        {i18n.translate(
-          'xpack.aiops.logCategorization.embeddableMenu.timeFieldWarning.title.description',
-          {
-            defaultMessage: 'Pattern analysis can only be run on data views with a time field.',
-          }
-        )}
- - - ); -}; diff --git a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/types.ts b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/types.ts index f78934b9075f1..710e18823a2bb 100644 --- a/x-pack/plugins/aiops/public/embeddables/pattern_analysis/types.ts +++ b/x-pack/plugins/aiops/public/embeddables/pattern_analysis/types.ts @@ -14,18 +14,12 @@ import type { SerializedTimeRange, SerializedTitles, } from '@kbn/presentation-publishing'; -import type { FC } from 'react'; import type { MinimumTimeRangeOption } from '../../components/log_categorization/log_categorization_for_embeddable/minimum_time_range'; import type { RandomSamplerOption, RandomSamplerProbability, } from '../../components/log_categorization/sampling_menu/random_sampler'; -export type ViewComponent = FC<{ - interval: string; - onRenderComplete?: () => void; -}>; - export interface PatternAnalysisComponentApi { dataViewId: PublishingSubject; fieldName: PublishingSubject; diff --git a/x-pack/plugins/aiops/public/hooks/use_aiops_app_context.ts b/x-pack/plugins/aiops/public/hooks/use_aiops_app_context.ts index 03762a7ba70ba..c240ec90bc1df 100644 --- a/x-pack/plugins/aiops/public/hooks/use_aiops_app_context.ts +++ b/x-pack/plugins/aiops/public/hooks/use_aiops_app_context.ts @@ -5,7 +5,7 @@ * 2.0. */ -import { createContext, type FC, type PropsWithChildren, useContext } from 'react'; +import { createContext, type FC, useContext } from 'react'; import type { ObservabilityAIAssistantPublicStart } from '@kbn/observability-ai-assistant-plugin/public'; import type { IStorageWrapper } from '@kbn/kibana-utils-plugin/public'; @@ -24,17 +24,12 @@ import type { ThemeServiceStart, } from '@kbn/core/public'; import type { LensPublicStart } from '@kbn/lens-plugin/public'; -import { type EuiComboBoxProps } from '@elastic/eui/src/components/combo_box/combo_box'; -import { type DataView } from '@kbn/data-views-plugin/common'; -import type { - FieldStatsProps, - FieldStatsServices, -} from '@kbn/unified-field-list/src/components/field_stats'; -import type { TimeRange as TimeRangeMs } from '@kbn/ml-date-picker'; import type { EmbeddableStart } from '@kbn/embeddable-plugin/public'; import type { CasesPublicStart } from '@kbn/cases-plugin/public'; import type { UsageCollectionSetup } from '@kbn/usage-collection-plugin/public'; import type { UiActionsStart } from '@kbn/ui-actions-plugin/public'; +import type { FieldStatsFlyoutProviderProps } from '@kbn/ml-field-stats-flyout/field_stats_flyout_provider'; +import type { UseFieldStatsTrigger } from '@kbn/ml-field-stats-flyout/use_field_stats_trigger'; /** * AIOps app context value to be provided via React context. @@ -98,7 +93,7 @@ export interface AiopsAppContextValue { /** * Used to create deep links to other plugins. */ - share: SharePluginStart; + share?: SharePluginStart; /** * Used to create lens embeddables. */ @@ -115,18 +110,8 @@ export interface AiopsAppContextValue { * Deps for unified fields stats. 
*/ fieldStats?: { - useFieldStatsTrigger: () => { - renderOption: EuiComboBoxProps['renderOption']; - closeFlyout: () => void; - }; - FieldStatsFlyoutProvider: FC< - PropsWithChildren<{ - dataView: DataView; - fieldStatsServices: FieldStatsServices; - timeRangeMs?: TimeRangeMs; - dslQuery?: FieldStatsProps['dslQuery']; - }> - >; + useFieldStatsTrigger: UseFieldStatsTrigger; + FieldStatsFlyoutProvider: FC; }; embeddable?: EmbeddableStart; cases?: CasesPublicStart; diff --git a/x-pack/plugins/aiops/public/hooks/use_data_source.tsx b/x-pack/plugins/aiops/public/hooks/use_data_source.tsx index 081e10a34de65..ef574a348b928 100644 --- a/x-pack/plugins/aiops/public/hooks/use_data_source.tsx +++ b/x-pack/plugins/aiops/public/hooks/use_data_source.tsx @@ -8,10 +8,10 @@ import type { FC, PropsWithChildren } from 'react'; import React, { createContext, useCallback, useContext, useEffect, useState } from 'react'; import type { DataView } from '@kbn/data-views-plugin/common'; +import type { DataViewsPublicPluginStart } from '@kbn/data-views-plugin/public'; import type { SavedSearch } from '@kbn/saved-search-plugin/public'; import { EuiEmptyPrompt } from '@elastic/eui'; import { FormattedMessage } from '@kbn/i18n-react'; -import { useAiopsAppContext } from './use_aiops_app_context'; export const DataSourceContext = createContext({ get dataView(): never { @@ -30,6 +30,7 @@ export interface DataViewAndSavedSearch { } export interface DataSourceContextProviderProps { + dataViews: DataViewsPublicPluginStart; dataViewId?: string; savedSearchId?: string; /** Output resolves data view objects */ @@ -43,20 +44,14 @@ export interface DataSourceContextProviderProps { * @constructor */ export const DataSourceContextProvider: FC> = ({ + dataViews, dataViewId, - savedSearchId, children, onChange, }) => { const [value, setValue] = useState(); const [error, setError] = useState(); - const { - data: { dataViews }, - // uiSettings, - // savedSearch: savedSearchService, - } = useAiopsAppContext(); - /** * Resolve data view or saved search if exists. */ diff --git a/x-pack/plugins/aiops/public/shared_components/change_point_detection.tsx b/x-pack/plugins/aiops/public/shared_components/change_point_detection.tsx index 9afbd9e1c4c8d..53997219fd639 100644 --- a/x-pack/plugins/aiops/public/shared_components/change_point_detection.tsx +++ b/x-pack/plugins/aiops/public/shared_components/change_point_detection.tsx @@ -11,7 +11,6 @@ import type { CoreStart } from '@kbn/core-lifecycle-browser'; import { UI_SETTINGS } from '@kbn/data-service'; import type { TimeRange } from '@kbn/es-query'; import { DatePickerContextProvider } from '@kbn/ml-date-picker'; -import { KibanaRenderContextProvider } from '@kbn/react-kibana-context-render'; import { pick } from 'lodash'; import React, { useEffect, useMemo, useState, type FC } from 'react'; import type { Observable } from 'rxjs'; @@ -141,33 +140,34 @@ const ChangePointDetectionWrapper: FC = ({ width: 100%; `} > - - - - - - - - - - - - - - - + + + + + + + + + + + + +
); }; diff --git a/x-pack/plugins/aiops/public/shared_components/index.tsx b/x-pack/plugins/aiops/public/shared_components/index.tsx index 1c5b85c4b79b6..b347d3ee24cac 100644 --- a/x-pack/plugins/aiops/public/shared_components/index.tsx +++ b/x-pack/plugins/aiops/public/shared_components/index.tsx @@ -11,6 +11,7 @@ import type { CoreStart } from '@kbn/core-lifecycle-browser'; import type { AiopsPluginStartDeps } from '../types'; import type { ChangePointDetectionSharedComponent } from './change_point_detection'; import type { PatternAnalysisSharedComponent } from './pattern_analysis'; +import type { LogRateAnalysisEmbeddableWrapper } from './log_rate_analysis_embeddable_wrapper'; const ChangePointDetectionLazy = dynamic(async () => import('./change_point_detection')); @@ -37,3 +38,24 @@ export const getPatternAnalysisComponent = ( }; export type { PatternAnalysisSharedComponent } from './pattern_analysis'; + +const LogRateAnalysisEmbeddableWrapperLazy = dynamic( + async () => import('./log_rate_analysis_embeddable_wrapper') +); + +export const getLogRateAnalysisEmbeddableWrapperComponent = ( + coreStart: CoreStart, + pluginStart: AiopsPluginStartDeps +): LogRateAnalysisEmbeddableWrapper => { + return React.memo((props) => { + return ( + + ); + }); +}; + +export type { LogRateAnalysisEmbeddableWrapper } from './log_rate_analysis_embeddable_wrapper'; diff --git a/x-pack/plugins/aiops/public/shared_components/log_rate_analysis_embeddable_wrapper.tsx b/x-pack/plugins/aiops/public/shared_components/log_rate_analysis_embeddable_wrapper.tsx new file mode 100644 index 0000000000000..9f2a88e73461c --- /dev/null +++ b/x-pack/plugins/aiops/public/shared_components/log_rate_analysis_embeddable_wrapper.tsx @@ -0,0 +1,179 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
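As a rough consumer-side sketch of the shared component getter defined above — the `coreStart`/`pluginStart` values and the render site are assumptions, while the getter and the wrapper props come from this diff:

```tsx
// Hypothetical consumer that received the core/plugin start contracts elsewhere (sketch only).
const LogRateAnalysisEmbeddableWrapper = getLogRateAnalysisEmbeddableWrapperComponent(
  coreStart,
  pluginStart
);

export const MyLogRatePanel = () => (
  <LogRateAnalysisEmbeddableWrapper
    dataViewId="logs-*"
    timeRange={{ from: 'now-24h', to: 'now' }}
    onLoading={() => {}}
    onRenderComplete={() => {}}
    onError={() => {}}
  />
);
```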
+ */ + +import { pick } from 'lodash'; +import React, { useEffect, useMemo, useState, type FC } from 'react'; +import usePrevious from 'react-use/lib/usePrevious'; +import type { Observable } from 'rxjs'; +import { BehaviorSubject, combineLatest, distinctUntilChanged, map } from 'rxjs'; +import { createBrowserHistory } from 'history'; + +import { UrlStateProvider } from '@kbn/ml-url-state'; +import { Router } from '@kbn/shared-ux-router'; +import { AIOPS_EMBEDDABLE_ORIGIN } from '@kbn/aiops-common/constants'; +import type { CoreStart } from '@kbn/core-lifecycle-browser'; +import { UI_SETTINGS } from '@kbn/data-service'; +import { LogRateAnalysisReduxProvider } from '@kbn/aiops-log-rate-analysis/state'; +import type { TimeRange } from '@kbn/es-query'; +import { DatePickerContextProvider } from '@kbn/ml-date-picker'; +import type { SignificantItem } from '@kbn/ml-agg-utils'; + +import { AiopsAppContext, type AiopsAppContextValue } from '../hooks/use_aiops_app_context'; +import { DataSourceContextProvider } from '../hooks/use_data_source'; +import { ReloadContextProvider } from '../hooks/use_reload'; +import { FilterQueryContextProvider } from '../hooks/use_filters_query'; +import type { AiopsPluginStartDeps } from '../types'; + +import { LogRateAnalysisForEmbeddable } from '../components/log_rate_analysis/log_rate_analysis_for_embeddable'; + +/** + * Only used to initialize internally + */ +export type LogRateAnalysisPropsWithDeps = LogRateAnalysisEmbeddableWrapperProps & { + coreStart: CoreStart; + pluginStart: AiopsPluginStartDeps; +}; + +export type LogRateAnalysisEmbeddableWrapper = FC; + +export interface LogRateAnalysisEmbeddableWrapperProps { + dataViewId: string; + timeRange: TimeRange; + /** + * Component to render if there are no significant items found + */ + emptyState?: React.ReactElement; + /** + * Outputs the most recent significant items + */ + onChange?: (significantItems: SignificantItem[]) => void; + /** + * Last reload request time, can be used for manual reload + */ + lastReloadRequestTime?: number; + /** Origin of the embeddable instance */ + embeddingOrigin?: string; + onLoading: (isLoading: boolean) => void; + onRenderComplete: () => void; + onError: (error: Error) => void; +} + +const LogRateAnalysisEmbeddableWrapperWithDeps: FC = ({ + // Component dependencies + coreStart, + pluginStart, + // Component props + dataViewId, + timeRange, + embeddingOrigin, + lastReloadRequestTime, +}) => { + const deps = useMemo(() => { + const { lens, data, usageCollection, fieldFormats, charts, share, storage, unifiedSearch } = + pluginStart; + + return { + data, + lens, + usageCollection, + fieldFormats, + charts, + share, + storage, + unifiedSearch, + ...coreStart, + }; + }, [coreStart, pluginStart]); + + const datePickerDeps = { + ...pick(deps, ['data', 'http', 'notifications', 'theme', 'uiSettings', 'i18n']), + uiSettingsKeys: UI_SETTINGS, + }; + + const aiopsAppContextValue = useMemo(() => { + return { + embeddingOrigin: embeddingOrigin ?? AIOPS_EMBEDDABLE_ORIGIN.DEFAULT, + ...deps, + }; + }, [deps, embeddingOrigin]); + + const [manualReload$] = useState>( + new BehaviorSubject(lastReloadRequestTime ?? 
Date.now()) + ); + + useEffect( + function updateManualReloadSubject() { + if (!lastReloadRequestTime) return; + manualReload$.next(lastReloadRequestTime); + }, + [lastReloadRequestTime, manualReload$] + ); + + const resultObservable$ = useMemo>(() => { + return combineLatest([manualReload$]).pipe( + map(([manualReload]) => Math.max(manualReload)), + distinctUntilChanged() + ); + }, [manualReload$]); + + const history = createBrowserHistory(); + + // We use the following pattern to track changes of dataViewId, and if there's + // a change, we unmount and remount the complete inner component. This makes + // sure the component is reinitialized correctly when the options of the + // dashboard panel are used to change the data view. This is a bit of a + // workaround since originally log rate analysis was developed as a standalone + // page with the expectation that the data view is set once and never changes. + const prevDataViewId = usePrevious(dataViewId); + const [_, setRerenderFlag] = useState(false); + useEffect(() => { + if (prevDataViewId && prevDataViewId !== dataViewId) { + setRerenderFlag((prev) => !prev); + } + }, [dataViewId, prevDataViewId]); + const showComponent = prevDataViewId === undefined || prevDataViewId === dataViewId; + + // TODO: Remove data-shared-item as part of https://github.com/elastic/kibana/issues/179376> + return ( +
+ {showComponent && ( + + + + + + + + + + + + + + + + + + )} +
+ ); +}; + +// eslint-disable-next-line import/no-default-export +export default LogRateAnalysisEmbeddableWrapperWithDeps; diff --git a/x-pack/plugins/aiops/public/shared_components/pattern_analysis.tsx b/x-pack/plugins/aiops/public/shared_components/pattern_analysis.tsx index 78261cd1f62f0..f601474a5707f 100644 --- a/x-pack/plugins/aiops/public/shared_components/pattern_analysis.tsx +++ b/x-pack/plugins/aiops/public/shared_components/pattern_analysis.tsx @@ -10,7 +10,6 @@ import type { CoreStart } from '@kbn/core-lifecycle-browser'; import { UI_SETTINGS } from '@kbn/data-service'; import type { TimeRange } from '@kbn/es-query'; import { DatePickerContextProvider } from '@kbn/ml-date-picker'; -import { KibanaRenderContextProvider } from '@kbn/react-kibana-context-render'; import { pick } from 'lodash'; import React, { useEffect, useMemo, useState, type FC } from 'react'; import type { Observable } from 'rxjs'; @@ -20,7 +19,7 @@ import type { RandomSamplerOption, RandomSamplerProbability, } from '../components/log_categorization/sampling_menu/random_sampler'; -import { PatternAnalysisEmbeddableWrapper } from '../embeddables/pattern_analysis/pattern_analysys_component_wrapper'; +import { PatternAnalysisEmbeddableWrapper } from '../embeddables/pattern_analysis/pattern_analysis_component_wrapper'; import { AiopsAppContext, type AiopsAppContextValue } from '../hooks/use_aiops_app_context'; import { DataSourceContextProvider } from '../hooks/use_data_source'; import { FilterQueryContextProvider } from '../hooks/use_filters_query'; @@ -139,31 +138,32 @@ const PatternAnalysisWrapper: FC = ({ padding: '10px', }} > - - - - - - - - - - - - - + + + + + + + + + + +
); }; diff --git a/x-pack/plugins/aiops/public/types/storage.ts b/x-pack/plugins/aiops/public/types/storage.ts index a4a29dda2f2c3..ea6fde6b06552 100644 --- a/x-pack/plugins/aiops/public/types/storage.ts +++ b/x-pack/plugins/aiops/public/types/storage.ts @@ -19,14 +19,12 @@ export const AIOPS_RANDOM_SAMPLING_PROBABILITY_PREFERENCE = 'aiops.randomSamplingProbabilityPreference'; export const AIOPS_PATTERN_ANALYSIS_MINIMUM_TIME_RANGE_PREFERENCE = 'aiops.patternAnalysisMinimumTimeRangePreference'; -export const AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS = 'aiops.logRateAnalysisResultColumns'; export type AiOps = Partial<{ [AIOPS_FROZEN_TIER_PREFERENCE]: FrozenTierPreference; [AIOPS_RANDOM_SAMPLING_MODE_PREFERENCE]: RandomSamplerOption; [AIOPS_RANDOM_SAMPLING_PROBABILITY_PREFERENCE]: number; [AIOPS_PATTERN_ANALYSIS_MINIMUM_TIME_RANGE_PREFERENCE]: MinimumTimeRangeOption; - [AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS]: string[]; }> | null; export type AiOpsKey = keyof Exclude; @@ -39,8 +37,6 @@ export type AiOpsStorageMapped = T extends typeof AIOPS_FROZ ? RandomSamplerProbability : T extends typeof AIOPS_PATTERN_ANALYSIS_MINIMUM_TIME_RANGE_PREFERENCE ? MinimumTimeRangeOption - : T extends typeof AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS - ? string[] : null; export const AIOPS_STORAGE_KEYS = [ @@ -48,5 +44,4 @@ export const AIOPS_STORAGE_KEYS = [ AIOPS_RANDOM_SAMPLING_MODE_PREFERENCE, AIOPS_RANDOM_SAMPLING_PROBABILITY_PREFERENCE, AIOPS_PATTERN_ANALYSIS_MINIMUM_TIME_RANGE_PREFERENCE, - AIOPS_LOG_RATE_ANALYSIS_RESULT_COLUMNS, ] as const; diff --git a/x-pack/plugins/aiops/public/ui_actions/create_change_point_chart.tsx b/x-pack/plugins/aiops/public/ui_actions/create_change_point_chart.tsx index f9078f575818a..4d7ed26e295f5 100644 --- a/x-pack/plugins/aiops/public/ui_actions/create_change_point_chart.tsx +++ b/x-pack/plugins/aiops/public/ui_actions/create_change_point_chart.tsx @@ -12,7 +12,9 @@ import type { UiActionsActionDefinition } from '@kbn/ui-actions-plugin/public'; import { IncompatibleActionError } from '@kbn/ui-actions-plugin/public'; import { EMBEDDABLE_CHANGE_POINT_CHART_TYPE } from '@kbn/aiops-change-point-detection/constants'; import type { CoreStart } from '@kbn/core-lifecycle-browser'; + import type { AiopsPluginStartDeps } from '../types'; + import type { ChangePointChartActionContext } from './change_point_action_context'; const parentApiIsCompatible = async ( diff --git a/x-pack/plugins/aiops/public/ui_actions/create_log_rate_analysis_actions.tsx b/x-pack/plugins/aiops/public/ui_actions/create_log_rate_analysis_actions.tsx new file mode 100644 index 0000000000000..588903e96aa16 --- /dev/null +++ b/x-pack/plugins/aiops/public/ui_actions/create_log_rate_analysis_actions.tsx @@ -0,0 +1,91 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import { i18n } from '@kbn/i18n'; +import type { PresentationContainer } from '@kbn/presentation-containers'; +import type { EmbeddableApiContext } from '@kbn/presentation-publishing'; +import type { UiActionsActionDefinition } from '@kbn/ui-actions-plugin/public'; +import { IncompatibleActionError } from '@kbn/ui-actions-plugin/public'; +import type { CoreStart } from '@kbn/core-lifecycle-browser'; +import { EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE } from '@kbn/aiops-log-rate-analysis/constants'; +import { AIOPS_EMBEDDABLE_GROUPING } from '@kbn/aiops-common/constants'; + +import type { + LogRateAnalysisEmbeddableApi, + LogRateAnalysisEmbeddableInitialState, +} from '../embeddables/log_rate_analysis/types'; +import type { AiopsPluginStartDeps } from '../types'; + +import type { LogRateAnalysisActionContext } from './log_rate_analysis_action_context'; + +const parentApiIsCompatible = async ( + parentApi: unknown +): Promise => { + const { apiIsPresentationContainer } = await import('@kbn/presentation-containers'); + // we cannot have an async type check, so return the casted parentApi rather than a boolean + return apiIsPresentationContainer(parentApi) ? (parentApi as PresentationContainer) : undefined; +}; + +export function createAddLogRateAnalysisEmbeddableAction( + coreStart: CoreStart, + pluginStart: AiopsPluginStartDeps +): UiActionsActionDefinition { + return { + id: 'create-log-rate-analysis-embeddable', + grouping: AIOPS_EMBEDDABLE_GROUPING, + getIconType: () => 'logRateAnalysis', + getDisplayName: () => + i18n.translate('xpack.aiops.embeddableLogRateAnalysisDisplayName', { + defaultMessage: 'Log rate analysis', + }), + async isCompatible(context: EmbeddableApiContext) { + return Boolean(await parentApiIsCompatible(context.embeddable)); + }, + async execute(context) { + const presentationContainerParent = await parentApiIsCompatible(context.embeddable); + if (!presentationContainerParent) throw new IncompatibleActionError(); + + try { + const { resolveEmbeddableLogRateAnalysisUserInput } = await import( + '../embeddables/log_rate_analysis/resolve_log_rate_analysis_config_input' + ); + + const initialState: LogRateAnalysisEmbeddableInitialState = { + dataViewId: undefined, + }; + + const embeddable = await presentationContainerParent.addNewPanel< + object, + LogRateAnalysisEmbeddableApi + >({ + panelType: EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE, + initialState, + }); + + if (!embeddable) { + return; + } + + const deletePanel = () => { + presentationContainerParent.removePanel(embeddable.uuid); + }; + + resolveEmbeddableLogRateAnalysisUserInput( + coreStart, + pluginStart, + context.embeddable, + embeddable.uuid, + true, + embeddable, + deletePanel + ); + } catch (e) { + return Promise.reject(); + } + }, + }; +} diff --git a/x-pack/plugins/aiops/public/ui_actions/create_pattern_analysis_action.tsx b/x-pack/plugins/aiops/public/ui_actions/create_pattern_analysis_action.tsx index 81127559e4e3d..f840e896abac4 100644 --- a/x-pack/plugins/aiops/public/ui_actions/create_pattern_analysis_action.tsx +++ b/x-pack/plugins/aiops/public/ui_actions/create_pattern_analysis_action.tsx @@ -12,13 +12,16 @@ import type { UiActionsActionDefinition } from '@kbn/ui-actions-plugin/public'; import { IncompatibleActionError } from '@kbn/ui-actions-plugin/public'; import type { CoreStart } from '@kbn/core-lifecycle-browser'; import { EMBEDDABLE_PATTERN_ANALYSIS_TYPE } from '@kbn/aiops-log-pattern-analysis/constants'; +import { AIOPS_EMBEDDABLE_GROUPING } from '@kbn/aiops-common/constants'; + import type { 
AiopsPluginStartDeps } from '../types'; -import type { PatternAnalysisActionContext } from './pattern_analysis_action_context'; import type { PatternAnalysisEmbeddableApi, PatternAnalysisEmbeddableInitialState, } from '../embeddables/pattern_analysis/types'; +import type { PatternAnalysisActionContext } from './pattern_analysis_action_context'; + const parentApiIsCompatible = async ( parentApi: unknown ): Promise => { @@ -33,16 +36,7 @@ export function createAddPatternAnalysisEmbeddableAction( ): UiActionsActionDefinition { return { id: 'create-pattern-analysis-embeddable', - grouping: [ - { - id: 'ml', - getDisplayName: () => - i18n.translate('xpack.aiops.navMenu.mlAppNameText', { - defaultMessage: 'Machine Learning and Analytics', - }), - getIconType: () => 'logPatternAnalysis', - }, - ], + grouping: AIOPS_EMBEDDABLE_GROUPING, getIconType: () => 'logPatternAnalysis', getDisplayName: () => i18n.translate('xpack.aiops.embeddablePatternAnalysisDisplayName', { diff --git a/x-pack/plugins/aiops/public/ui_actions/index.ts b/x-pack/plugins/aiops/public/ui_actions/index.ts index d14856fd28733..6081541c448e7 100644 --- a/x-pack/plugins/aiops/public/ui_actions/index.ts +++ b/x-pack/plugins/aiops/public/ui_actions/index.ts @@ -18,6 +18,7 @@ import { createOpenChangePointInMlAppAction } from './open_change_point_ml'; import type { AiopsPluginStartDeps } from '../types'; import { createCategorizeFieldAction } from '../components/log_categorization'; import { createAddPatternAnalysisEmbeddableAction } from './create_pattern_analysis_action'; +import { createAddLogRateAnalysisEmbeddableAction } from './create_log_rate_analysis_actions'; export function registerAiopsUiActions( uiActions: UiActionsSetup, @@ -27,6 +28,7 @@ export function registerAiopsUiActions( const openChangePointInMlAppAction = createOpenChangePointInMlAppAction(coreStart, pluginStart); const addChangePointChartAction = createAddChangePointChartAction(coreStart, pluginStart); const addPatternAnalysisAction = createAddPatternAnalysisEmbeddableAction(coreStart, pluginStart); + const addLogRateAnalysisAction = createAddLogRateAnalysisEmbeddableAction(coreStart, pluginStart); uiActions.addTriggerAction(ADD_PANEL_TRIGGER, addPatternAnalysisAction); uiActions.addTriggerAction(ADD_PANEL_TRIGGER, addChangePointChartAction); @@ -39,4 +41,6 @@ export function registerAiopsUiActions( ); uiActions.addTriggerAction(CONTEXT_MENU_TRIGGER, openChangePointInMlAppAction); + + uiActions.addTriggerAction(ADD_PANEL_TRIGGER, addLogRateAnalysisAction); } diff --git a/x-pack/plugins/aiops/public/ui_actions/log_rate_analysis_action_context.ts b/x-pack/plugins/aiops/public/ui_actions/log_rate_analysis_action_context.ts new file mode 100644 index 0000000000000..d0c1ef715b84c --- /dev/null +++ b/x-pack/plugins/aiops/public/ui_actions/log_rate_analysis_action_context.ts @@ -0,0 +1,24 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
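The action context guard added in the new file below is meant to be consumed by panel actions; a hedged sketch of that pattern follows (the action definition itself is hypothetical and not part of this change):

```ts
import { isLogRateAnalysisEmbeddableContext } from './log_rate_analysis_action_context';

// Hypothetical panel action that only runs against log rate analysis embeddables.
const myPanelAction = {
  id: 'my-log-rate-analysis-panel-action',
  async isCompatible(context: unknown) {
    return isLogRateAnalysisEmbeddableContext(context);
  },
  async execute(context: unknown) {
    if (!isLogRateAnalysisEmbeddableContext(context)) return;
    // `context.embeddable` is now typed as LogRateAnalysisEmbeddableApi.
    context.embeddable.updateUserInput({ dataViewId: 'logs-*' });
  },
};
```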
+ */ + +import { isPopulatedObject } from '@kbn/ml-is-populated-object'; +import { apiIsOfType, type EmbeddableApiContext } from '@kbn/presentation-publishing'; +import { EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE } from '@kbn/aiops-log-rate-analysis/constants'; +import type { LogRateAnalysisEmbeddableApi } from '../embeddables/log_rate_analysis/types'; + +export interface LogRateAnalysisActionContext extends EmbeddableApiContext { + embeddable: LogRateAnalysisEmbeddableApi; +} + +export function isLogRateAnalysisEmbeddableContext( + arg: unknown +): arg is LogRateAnalysisActionContext { + return ( + isPopulatedObject(arg, ['embeddable']) && + apiIsOfType(arg.embeddable, EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE) + ); +} diff --git a/x-pack/plugins/aiops/tsconfig.json b/x-pack/plugins/aiops/tsconfig.json index 188f8b275fbac..e8b4f4f3ed972 100644 --- a/x-pack/plugins/aiops/tsconfig.json +++ b/x-pack/plugins/aiops/tsconfig.json @@ -63,7 +63,6 @@ "@kbn/presentation-containers", "@kbn/presentation-publishing", "@kbn/presentation-util-plugin", - "@kbn/react-kibana-context-render", "@kbn/react-kibana-context-theme", "@kbn/react-kibana-mount", "@kbn/rison", @@ -79,6 +78,8 @@ "@kbn/observability-ai-assistant-plugin", "@kbn/ui-theme", "@kbn/apm-utils", + "@kbn/ml-field-stats-flyout", + "@kbn/shared-ux-router", ], "exclude": [ "target/**/*", diff --git a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_object_type_definition.ts b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_object_type_definition.ts index d8ce2daa6efbe..bb07842e2bab5 100644 --- a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_object_type_definition.ts +++ b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_object_type_definition.ts @@ -16,6 +16,7 @@ export class EncryptedSavedObjectAttributesDefinition { public readonly attributesToEncrypt: ReadonlySet; private readonly attributesToIncludeInAAD: ReadonlySet | undefined; private readonly attributesToStrip: ReadonlySet; + public readonly enforceRandomId: boolean; constructor(typeRegistration: EncryptedSavedObjectTypeRegistration) { if (typeRegistration.attributesToIncludeInAAD) { @@ -49,6 +50,8 @@ export class EncryptedSavedObjectAttributesDefinition { } } + this.enforceRandomId = typeRegistration.enforceRandomId !== false; + this.attributesToEncrypt = attributesToEncrypt; this.attributesToStrip = attributesToStrip; this.attributesToIncludeInAAD = typeRegistration.attributesToIncludeInAAD; diff --git a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.test.ts b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.test.ts index 1691d3f4c0610..67c972ec5f859 100644 --- a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.test.ts +++ b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.test.ts @@ -2405,3 +2405,24 @@ describe('#decryptAttributesSync', () => { }); }); }); + +describe('#shouldEnforceRandomId', () => { + it('defaults to true if enforceRandomId is undefined', () => { + service.registerType({ type: 'known-type-1', attributesToEncrypt: new Set(['attr']) }); + expect(service.shouldEnforceRandomId('known-type-1')).toBe(true); + }); + it('should return the value of enforceRandomId if it is defined', () => { + service.registerType({ + type: 'known-type-1', + attributesToEncrypt: new Set(['attr']), + enforceRandomId: false, + }); + service.registerType({ + type: 'known-type-2', + 
attributesToEncrypt: new Set(['attr']), + enforceRandomId: true, + }); + expect(service.shouldEnforceRandomId('known-type-1')).toBe(false); + expect(service.shouldEnforceRandomId('known-type-2')).toBe(true); + }); +}); diff --git a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.ts b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.ts index 44072a0828d48..d2c7d9975a9ca 100644 --- a/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.ts +++ b/x-pack/plugins/encrypted_saved_objects/server/crypto/encrypted_saved_objects_service.ts @@ -33,6 +33,7 @@ export interface EncryptedSavedObjectTypeRegistration { readonly type: string; readonly attributesToEncrypt: ReadonlySet; readonly attributesToIncludeInAAD?: ReadonlySet; + readonly enforceRandomId?: boolean; } /** @@ -152,6 +153,16 @@ export class EncryptedSavedObjectsService { return this.typeDefinitions.has(type); } + /** + * Checks whether the ESO type has explicitly opted out of enforcing random IDs. + * @param type Saved object type. + * @returns boolean - true unless explicitly opted out by setting enforceRandomId to false + */ + public shouldEnforceRandomId(type: string) { + const typeDefinition = this.typeDefinitions.get(type); + return typeDefinition?.enforceRandomId !== false; + } + /** * Takes saved object attributes for the specified type and, depending on the type definition, * either decrypts or strips encrypted attributes (e.g. in case AAD or encryption key has changed diff --git a/x-pack/plugins/encrypted_saved_objects/server/saved_objects/saved_objects_encryption_extension.ts b/x-pack/plugins/encrypted_saved_objects/server/saved_objects/saved_objects_encryption_extension.ts index 01c35c7403fdf..45e0f6a46c892 100644 --- a/x-pack/plugins/encrypted_saved_objects/server/saved_objects/saved_objects_encryption_extension.ts +++ b/x-pack/plugins/encrypted_saved_objects/server/saved_objects/saved_objects_encryption_extension.ts @@ -40,6 +40,10 @@ export class SavedObjectsEncryptionExtension implements ISavedObjectsEncryptionE return this._service.isRegistered(type); } + shouldEnforceRandomId(type: string) { + return this._service.shouldEnforceRandomId(type); + } + async decryptOrStripResponseAttributes>( response: R, originalAttributes?: T diff --git a/x-pack/plugins/enterprise_search/public/applications/shared/api_key/create_api_key_flyout.tsx b/x-pack/plugins/enterprise_search/public/applications/shared/api_key/create_api_key_flyout.tsx index 38217df269fd1..c72f56c656e49 100644 --- a/x-pack/plugins/enterprise_search/public/applications/shared/api_key/create_api_key_flyout.tsx +++ b/x-pack/plugins/enterprise_search/public/applications/shared/api_key/create_api_key_flyout.tsx @@ -210,6 +210,7 @@ export const CreateApiKeyFlyout: React.FC = ({ onClose defaultMessage: 'Store this API key', })} titleSize="xs" + role="alert" > {i18n.translate('xpack.enterpriseSearch.apiKey.apiKeyStepDescription', { diff --git a/x-pack/plugins/fleet/common/types/models/epm.ts b/x-pack/plugins/fleet/common/types/models/epm.ts index 3aa65dc3adcd4..827130d802f22 100644 --- a/x-pack/plugins/fleet/common/types/models/epm.ts +++ b/x-pack/plugins/fleet/common/types/models/epm.ts @@ -124,10 +124,25 @@ export type InstallablePackage = RegistryPackage | ArchivePackage; export type AssetsMap = Map; +export interface ArchiveEntry { + path: string; + buffer?: Buffer; +} + +export interface ArchiveIterator { + traverseEntries: (onEntry: (entry: ArchiveEntry) => Promise) 
=> Promise; + getPaths: () => Promise; +} + export interface PackageInstallContext { packageInfo: InstallablePackage; + /** + * @deprecated Use `archiveIterator` to access the package archive entries + * without loading them all into memory at once. + */ assetsMap: AssetsMap; paths: string[]; + archiveIterator: ArchiveIterator; } export type ArchivePackage = PackageSpecManifest & diff --git a/x-pack/plugins/fleet/server/routes/epm/file_handler.test.ts b/x-pack/plugins/fleet/server/routes/epm/file_handler.test.ts index 1eb8387f69751..5690c32c2d7fd 100644 --- a/x-pack/plugins/fleet/server/routes/epm/file_handler.test.ts +++ b/x-pack/plugins/fleet/server/routes/epm/file_handler.test.ts @@ -15,7 +15,7 @@ import { getBundledPackageByPkgKey } from '../../services/epm/packages/bundled_p import { getFile, getInstallation } from '../../services/epm/packages/get'; import type { FleetRequestHandlerContext } from '../..'; import { appContextService } from '../../services'; -import { unpackBufferEntries } from '../../services/epm/archive'; +import { unpackArchiveEntriesIntoMemory } from '../../services/epm/archive'; import { getAsset } from '../../services/epm/archive/storage'; import { getFileHandler } from './file_handler'; @@ -29,7 +29,7 @@ jest.mock('../../services/epm/packages/get'); const mockedGetBundledPackageByPkgKey = jest.mocked(getBundledPackageByPkgKey); const mockedGetInstallation = jest.mocked(getInstallation); const mockedGetFile = jest.mocked(getFile); -const mockedUnpackBufferEntries = jest.mocked(unpackBufferEntries); +const mockedUnpackBufferEntries = jest.mocked(unpackArchiveEntriesIntoMemory); const mockedGetAsset = jest.mocked(getAsset); function mockContext() { diff --git a/x-pack/plugins/fleet/server/routes/epm/file_handler.ts b/x-pack/plugins/fleet/server/routes/epm/file_handler.ts index 0f22a31c1aa72..994f52a71c224 100644 --- a/x-pack/plugins/fleet/server/routes/epm/file_handler.ts +++ b/x-pack/plugins/fleet/server/routes/epm/file_handler.ts @@ -17,7 +17,7 @@ import { defaultFleetErrorHandler } from '../../errors'; import { getAsset } from '../../services/epm/archive/storage'; import { getBundledPackageByPkgKey } from '../../services/epm/packages/bundled_packages'; import { pkgToPkgKey } from '../../services/epm/registry'; -import { unpackBufferEntries } from '../../services/epm/archive'; +import { unpackArchiveEntriesIntoMemory } from '../../services/epm/archive'; const CACHE_CONTROL_10_MINUTES_HEADER: HttpResponseOptions['headers'] = { 'cache-control': 'max-age=600', @@ -69,7 +69,7 @@ export const getFileHandler: FleetRequestHandler< pkgToPkgKey({ name: pkgName, version: pkgVersion }) ); if (bundledPackage) { - const bufferEntries = await unpackBufferEntries( + const bufferEntries = await unpackArchiveEntriesIntoMemory( await bundledPackage.getBuffer(), 'application/zip' ); diff --git a/x-pack/plugins/fleet/server/routes/epm/kibana_assets_handler.ts b/x-pack/plugins/fleet/server/routes/epm/kibana_assets_handler.ts index 8fe83f98669d1..ad0bec6397ee8 100644 --- a/x-pack/plugins/fleet/server/routes/epm/kibana_assets_handler.ts +++ b/x-pack/plugins/fleet/server/routes/epm/kibana_assets_handler.ts @@ -22,6 +22,7 @@ import type { FleetRequestHandler, InstallKibanaAssetsRequestSchema, } from '../../types'; +import { createArchiveIteratorFromMap } from '../../services/epm/archive/archive_iterator'; export const installPackageKibanaAssetsHandler: FleetRequestHandler< TypeOf, @@ -69,6 +70,7 @@ export const installPackageKibanaAssetsHandler: FleetRequestHandler< packageInfo, paths: 
installedPkgWithAssets.paths, assetsMap: installedPkgWithAssets.assetsMap, + archiveIterator: createArchiveIteratorFromMap(installedPkgWithAssets.assetsMap), }, }); diff --git a/x-pack/plugins/fleet/server/services/epm/archive/archive_iterator.ts b/x-pack/plugins/fleet/server/services/epm/archive/archive_iterator.ts new file mode 100644 index 0000000000000..369b32412bd82 --- /dev/null +++ b/x-pack/plugins/fleet/server/services/epm/archive/archive_iterator.ts @@ -0,0 +1,83 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import type { AssetsMap, ArchiveIterator, ArchiveEntry } from '../../../../common/types'; + +import { traverseArchiveEntries } from '.'; + +/** + * Creates an iterator for traversing and extracting paths from an archive + * buffer. This iterator is intended to be used for memory efficient traversal + * of archive contents without extracting the entire archive into memory. + * + * @param archiveBuffer - The buffer containing the archive data. + * @param contentType - The content type of the archive (e.g., + * 'application/zip'). + * @returns ArchiveIterator instance. + * + */ +export const createArchiveIterator = ( + archiveBuffer: Buffer, + contentType: string +): ArchiveIterator => { + const paths: string[] = []; + + const traverseEntries = async ( + onEntry: (entry: ArchiveEntry) => Promise + ): Promise => { + await traverseArchiveEntries(archiveBuffer, contentType, async (entry) => { + await onEntry(entry); + }); + }; + + const getPaths = async (): Promise => { + if (paths.length) { + return paths; + } + + await traverseEntries(async (entry) => { + paths.push(entry.path); + }); + + return paths; + }; + + return { + traverseEntries, + getPaths, + }; +}; + +/** + * Creates an archive iterator from the assetsMap. This is a stop-gap solution + * to provide a uniform interface for traversing assets while assetsMap is still + * in use. It works with a map of assets loaded into memory and is not intended + * for use with large archives. + * + * @param assetsMap - A map where the keys are asset paths and the values are + * asset buffers. + * @returns ArchiveIterator instance. 
+ * + */ +export const createArchiveIteratorFromMap = (assetsMap: AssetsMap): ArchiveIterator => { + const traverseEntries = async ( + onEntry: (entry: ArchiveEntry) => Promise + ): Promise => { + for (const [path, buffer] of assetsMap) { + await onEntry({ path, buffer }); + } + }; + + const getPaths = async (): Promise => { + return [...assetsMap.keys()]; + }; + + return { + traverseEntries, + getPaths, + }; +}; diff --git a/x-pack/plugins/fleet/server/services/epm/archive/extract.ts b/x-pack/plugins/fleet/server/services/epm/archive/extract.ts index 84aa161385cb3..9f5f90959d144 100644 --- a/x-pack/plugins/fleet/server/services/epm/archive/extract.ts +++ b/x-pack/plugins/fleet/server/services/epm/archive/extract.ts @@ -11,13 +11,12 @@ import * as tar from 'tar'; import yauzl from 'yauzl'; import { bufferToStream, streamToBuffer } from '../streams'; - -import type { ArchiveEntry } from '.'; +import type { ArchiveEntry } from '../../../../common/types'; export async function untarBuffer( buffer: Buffer, filter = (entry: ArchiveEntry): boolean => true, - onEntry = (entry: ArchiveEntry): void => {} + onEntry = async (entry: ArchiveEntry): Promise => {} ) { const deflatedStream = bufferToStream(buffer); // use tar.list vs .extract to avoid writing to disk @@ -37,7 +36,7 @@ export async function untarBuffer( export async function unzipBuffer( buffer: Buffer, filter = (entry: ArchiveEntry): boolean => true, - onEntry = (entry: ArchiveEntry): void => {} + onEntry = async (entry: ArchiveEntry): Promise => {} ): Promise { const zipfile = await yauzlFromBuffer(buffer, { lazyEntries: true }); zipfile.readEntry(); @@ -45,9 +44,12 @@ export async function unzipBuffer( const path = entry.fileName; if (!filter({ path })) return zipfile.readEntry(); - const entryBuffer = await getZipReadStream(zipfile, entry).then(streamToBuffer); - onEntry({ buffer: entryBuffer, path }); - zipfile.readEntry(); + try { + const entryBuffer = await getZipReadStream(zipfile, entry).then(streamToBuffer); + await onEntry({ buffer: entryBuffer, path }); + } finally { + zipfile.readEntry(); + } }); return new Promise((resolve, reject) => zipfile.on('end', resolve).on('error', reject)); } diff --git a/x-pack/plugins/fleet/server/services/epm/archive/index.ts b/x-pack/plugins/fleet/server/services/epm/archive/index.ts index 5943f8f838fcb..ed9ff2a5e4b72 100644 --- a/x-pack/plugins/fleet/server/services/epm/archive/index.ts +++ b/x-pack/plugins/fleet/server/services/epm/archive/index.ts @@ -5,13 +5,20 @@ * 2.0. 
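To make the intent of the archive iterator concrete, a short usage sketch — the `archiveBuffer` value is assumed, while the API surface is the one defined above:

```ts
// Sketch: memory-efficient traversal of a package archive.
const archiveIterator = createArchiveIterator(archiveBuffer, 'application/zip');

// List entry paths; entry buffers are not retained after traversal.
const paths = await archiveIterator.getPaths();

// Visit entries one at a time; only the current entry's buffer is held in memory.
await archiveIterator.traverseEntries(async ({ path, buffer }) => {
  if (buffer && path.endsWith('manifest.yml')) {
    const manifest = buffer.toString('utf8');
    // ...parse the manifest here
  }
});
```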
*/ -import type { AssetParts, AssetsMap } from '../../../../common/types'; +import type { + ArchiveEntry, + ArchiveIterator, + AssetParts, + AssetsMap, +} from '../../../../common/types'; import { PackageInvalidArchiveError, PackageUnsupportedMediaTypeError, PackageNotFoundError, } from '../../../errors'; +import { createArchiveIterator } from './archive_iterator'; + import { deletePackageInfo } from './cache'; import type { SharedKey } from './cache'; import { getBufferExtractor } from './extract'; @@ -20,66 +27,85 @@ export * from './cache'; export { getBufferExtractor, untarBuffer, unzipBuffer } from './extract'; export { generatePackageInfoFromArchiveBuffer } from './parse'; -export interface ArchiveEntry { - path: string; - buffer?: Buffer; -} - export async function unpackBufferToAssetsMap({ - name, - version, contentType, archiveBuffer, + useStreaming, }: { - name: string; - version: string; contentType: string; archiveBuffer: Buffer; -}): Promise<{ paths: string[]; assetsMap: AssetsMap }> { - const assetsMap = new Map(); - const paths: string[] = []; - const entries = await unpackBufferEntries(archiveBuffer, contentType); - - entries.forEach((entry) => { - const { path, buffer } = entry; - if (buffer) { - assetsMap.set(path, buffer); - paths.push(path); - } - }); - - return { assetsMap, paths }; + useStreaming: boolean | undefined; +}): Promise<{ paths: string[]; assetsMap: AssetsMap; archiveIterator: ArchiveIterator }> { + const archiveIterator = createArchiveIterator(archiveBuffer, contentType); + let paths: string[] = []; + let assetsMap: AssetsMap = new Map(); + if (useStreaming) { + paths = await archiveIterator.getPaths(); + // We keep the assetsMap empty as we don't want to load all the assets in memory + assetsMap = new Map(); + } else { + const entries = await unpackArchiveEntriesIntoMemory(archiveBuffer, contentType); + + entries.forEach((entry) => { + const { path, buffer } = entry; + if (buffer) { + assetsMap.set(path, buffer); + paths.push(path); + } + }); + } + + return { paths, assetsMap, archiveIterator }; } -export async function unpackBufferEntries( +/** + * This function extracts all archive entries into memory. + * + * NOTE: This is potentially dangerous for large archives and can cause OOM + * errors. Use 'traverseArchiveEntries' instead to iterate over the entries + * without storing them all in memory at once. + * + * @param archiveBuffer + * @param contentType + * @returns All the entries in the archive buffer + */ +export async function unpackArchiveEntriesIntoMemory( archiveBuffer: Buffer, contentType: string ): Promise { + const entries: ArchiveEntry[] = []; + const addToEntries = async (entry: ArchiveEntry) => void entries.push(entry); + await traverseArchiveEntries(archiveBuffer, contentType, addToEntries); + + // While unpacking a tar.gz file with unzipBuffer() will result in a thrown + // error, unpacking a zip file with untarBuffer() just results in nothing. + if (entries.length === 0) { + throw new PackageInvalidArchiveError( + `Archive seems empty. Assumed content type was ${contentType}, check if this matches the archive type.` + ); + } + return entries; +} + +export async function traverseArchiveEntries( + archiveBuffer: Buffer, + contentType: string, + onEntry: (entry: ArchiveEntry) => Promise +) { const bufferExtractor = getBufferExtractor({ contentType }); if (!bufferExtractor) { throw new PackageUnsupportedMediaTypeError( `Unsupported media type ${contentType}. 
Please use 'application/gzip' or 'application/zip'` ); } - const entries: ArchiveEntry[] = []; try { const onlyFiles = ({ path }: ArchiveEntry): boolean => !path.endsWith('/'); - const addToEntries = (entry: ArchiveEntry) => entries.push(entry); - await bufferExtractor(archiveBuffer, onlyFiles, addToEntries); + await bufferExtractor(archiveBuffer, onlyFiles, onEntry); } catch (error) { throw new PackageInvalidArchiveError( `Error during extraction of package: ${error}. Assumed content type was ${contentType}, check if this matches the archive type.` ); } - - // While unpacking a tar.gz file with unzipBuffer() will result in a thrown error in the try-catch above, - // unpacking a zip file with untarBuffer() just results in nothing. - if (entries.length === 0) { - throw new PackageInvalidArchiveError( - `Archive seems empty. Assumed content type was ${contentType}, check if this matches the archive type.` - ); - } - return entries; } export const deletePackageCache = ({ name, version }: SharedKey) => { diff --git a/x-pack/plugins/fleet/server/services/epm/archive/parse.ts b/x-pack/plugins/fleet/server/services/epm/archive/parse.ts index 530ca804f24eb..8cccfe9982457 100644 --- a/x-pack/plugins/fleet/server/services/epm/archive/parse.ts +++ b/x-pack/plugins/fleet/server/services/epm/archive/parse.ts @@ -40,7 +40,7 @@ import { import { PackageInvalidArchiveError } from '../../../errors'; import { pkgToPkgKey } from '../registry'; -import { unpackBufferEntries } from '.'; +import { traverseArchiveEntries } from '.'; const readFileAsync = promisify(readFile); export const MANIFEST_NAME = 'manifest.yml'; @@ -160,9 +160,8 @@ export async function generatePackageInfoFromArchiveBuffer( contentType: string ): Promise<{ paths: string[]; packageInfo: ArchivePackage }> { const assetsMap: AssetsBufferMap = {}; - const entries = await unpackBufferEntries(archiveBuffer, contentType); const paths: string[] = []; - entries.forEach(({ path: bufferPath, buffer }) => { + await traverseArchiveEntries(archiveBuffer, contentType, async ({ path: bufferPath, buffer }) => { paths.push(bufferPath); if (buffer && filterAssetPathForParseAndVerifyArchive(bufferPath)) { assetsMap[bufferPath] = buffer; diff --git a/x-pack/plugins/fleet/server/services/epm/archive/storage.ts b/x-pack/plugins/fleet/server/services/epm/archive/storage.ts index dd6321445df75..8f6f151383d5a 100644 --- a/x-pack/plugins/fleet/server/services/epm/archive/storage.ts +++ b/x-pack/plugins/fleet/server/services/epm/archive/storage.ts @@ -15,6 +15,7 @@ import { SavedObjectsErrorHelpers } from '@kbn/core/server'; import { ASSETS_SAVED_OBJECT_TYPE } from '../../../../common'; import type { + ArchiveEntry, InstallablePackage, InstallSource, PackageAssetReference, @@ -24,7 +25,6 @@ import { PackageInvalidArchiveError, PackageNotFoundError } from '../../../error import { appContextService } from '../../app_context'; import { setPackageInfo } from '.'; -import type { ArchiveEntry } from '.'; import { filterAssetPathForParseAndVerifyArchive, parseAndVerifyArchive } from './parse'; const ONE_BYTE = 1024 * 1024; diff --git a/x-pack/plugins/fleet/server/services/epm/elasticsearch/ingest_pipeline/install.ts b/x-pack/plugins/fleet/server/services/epm/elasticsearch/ingest_pipeline/install.ts index a456734747324..5a4672f67fe53 100644 --- a/x-pack/plugins/fleet/server/services/epm/elasticsearch/ingest_pipeline/install.ts +++ b/x-pack/plugins/fleet/server/services/epm/elasticsearch/ingest_pipeline/install.ts @@ -16,14 +16,13 @@ import type { PackageInfo, } from 
'../../../../types'; import { getAssetFromAssetsMap, getPathParts } from '../../archive'; -import type { ArchiveEntry } from '../../archive'; import { FLEET_FINAL_PIPELINE_CONTENT, FLEET_FINAL_PIPELINE_ID, FLEET_FINAL_PIPELINE_VERSION, } from '../../../../constants'; import { getPipelineNameForDatastream } from '../../../../../common/services'; -import type { PackageInstallContext } from '../../../../../common/types'; +import type { ArchiveEntry, PackageInstallContext } from '../../../../../common/types'; import { appendMetadataToIngestPipeline } from '../meta'; import { retryTransientEsErrors } from '../retry'; diff --git a/x-pack/plugins/fleet/server/services/epm/elasticsearch/transform/mappings.test.ts b/x-pack/plugins/fleet/server/services/epm/elasticsearch/transform/mappings.test.ts index f34015bf77697..de962850fba8c 100644 --- a/x-pack/plugins/fleet/server/services/epm/elasticsearch/transform/mappings.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/elasticsearch/transform/mappings.test.ts @@ -5,6 +5,8 @@ * 2.0. */ +import { createArchiveIteratorFromMap } from '../../archive/archive_iterator'; + import { loadMappingForTransform } from './mappings'; describe('loadMappingForTransform', () => { @@ -13,6 +15,7 @@ describe('loadMappingForTransform', () => { { packageInfo: {} as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }, 'test' @@ -49,6 +52,7 @@ describe('loadMappingForTransform', () => { ), ], ]), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [ '/package/ti_opencti/2.1.0/elasticsearch/transform/latest_ioc/fields/ecs.yml', '/package/ti_opencti/2.1.0/elasticsearch/transform/latest_ioc/fields/ecs-extra.yml', diff --git a/x-pack/plugins/fleet/server/services/epm/kibana/assets/install.ts b/x-pack/plugins/fleet/server/services/epm/kibana/assets/install.ts index 276478099daf8..bf5684f29c205 100644 --- a/x-pack/plugins/fleet/server/services/epm/kibana/assets/install.ts +++ b/x-pack/plugins/fleet/server/services/epm/kibana/assets/install.ts @@ -325,16 +325,16 @@ export async function deleteKibanaAssetsAndReferencesForSpace({ await saveKibanaAssetsRefs(savedObjectsClient, pkgName, [], true); } +const kibanaAssetTypes = Object.values(KibanaAssetType); +export const isKibanaAssetType = (path: string) => { + const parts = getPathParts(path); + + return parts.service === 'kibana' && (kibanaAssetTypes as string[]).includes(parts.type); +}; + export function getKibanaAssets( packageInstallContext: PackageInstallContext ): Record { - const kibanaAssetTypes = Object.values(KibanaAssetType); - const isKibanaAssetType = (path: string) => { - const parts = getPathParts(path); - - return parts.service === 'kibana' && (kibanaAssetTypes as string[]).includes(parts.type); - }; - const result = Object.fromEntries( kibanaAssetTypes.map((type) => [type, []]) ) as Record; diff --git a/x-pack/plugins/fleet/server/services/epm/kibana/assets/install_with_streaming.ts b/x-pack/plugins/fleet/server/services/epm/kibana/assets/install_with_streaming.ts new file mode 100644 index 0000000000000..fca6cf27a0cd7 --- /dev/null +++ b/x-pack/plugins/fleet/server/services/epm/kibana/assets/install_with_streaming.ts @@ -0,0 +1,115 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
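Because `PackageInstallContext` now requires an `archiveIterator`, callers that only hold an in-memory `assetsMap` (as in the test and handler changes above) can adapt with `createArchiveIteratorFromMap`; a hedged sketch, with placeholder values and illustrative import paths:

```ts
import { createArchiveIteratorFromMap } from '../../archive/archive_iterator';
import type { PackageInstallContext } from '../../../../../common/types';

// `packageInfo`, `paths`, and `assetsMap` are placeholders; in real code they come
// from the installed package record or the unpacked archive.
const packageInstallContext: PackageInstallContext = {
  packageInfo,
  paths,
  // Kept for callers that still read assets from memory (deprecated).
  assetsMap,
  // Stop-gap iterator backed by the same in-memory map.
  archiveIterator: createArchiveIteratorFromMap(assetsMap),
};
```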
+ */ + +import type { SavedObject, SavedObjectsClientContract } from '@kbn/core/server'; + +import type { Installation, PackageInstallContext } from '../../../../../common/types'; +import type { KibanaAssetReference, KibanaAssetType } from '../../../../types'; +import { getPathParts } from '../../archive'; + +import { saveKibanaAssetsRefs } from '../../packages/install'; + +import type { ArchiveAsset } from './install'; +import { + KibanaSavedObjectTypeMapping, + createSavedObjectKibanaAsset, + isKibanaAssetType, + toAssetReference, +} from './install'; +import { getSpaceAwareSaveobjectsClients } from './saved_objects'; + +interface InstallKibanaAssetsWithStreamingArgs { + pkgName: string; + packageInstallContext: PackageInstallContext; + spaceId: string; + savedObjectsClient: SavedObjectsClientContract; + installedPkg?: SavedObject | undefined; +} + +const MAX_ASSETS_TO_INSTALL_IN_PARALLEL = 100; + +export async function installKibanaAssetsWithStreaming({ + spaceId, + packageInstallContext, + savedObjectsClient, + pkgName, + installedPkg, +}: InstallKibanaAssetsWithStreamingArgs): Promise { + const { archiveIterator } = packageInstallContext; + + const { savedObjectClientWithSpace } = getSpaceAwareSaveobjectsClients(spaceId); + + const assetRefs: KibanaAssetReference[] = []; + let batch: ArchiveAsset[] = []; + + await archiveIterator.traverseEntries(async ({ path, buffer }) => { + if (!buffer || !isKibanaAssetType(path)) { + return; + } + const savedObject = JSON.parse(buffer.toString('utf8')) as ArchiveAsset; + const assetType = getPathParts(path).type as KibanaAssetType; + const soType = KibanaSavedObjectTypeMapping[assetType]; + if (savedObject.type !== soType) { + return; + } + + batch.push(savedObject); + assetRefs.push(toAssetReference(savedObject)); + + if (batch.length >= MAX_ASSETS_TO_INSTALL_IN_PARALLEL) { + await bulkCreateSavedObjects({ + savedObjectsClient: savedObjectClientWithSpace, + kibanaAssets: batch, + refresh: false, + }); + batch = []; + } + }); + + // install any remaining assets + if (batch.length) { + await bulkCreateSavedObjects({ + savedObjectsClient: savedObjectClientWithSpace, + kibanaAssets: batch, + // Use wait_for with the last batch to ensure all assets are readable once the install is complete + refresh: 'wait_for', + }); + } + + // Update the installation saved object with installed kibana assets + await saveKibanaAssetsRefs(savedObjectsClient, pkgName, assetRefs); + + return assetRefs; +} + +async function bulkCreateSavedObjects({ + savedObjectsClient, + kibanaAssets, + refresh, +}: { + kibanaAssets: ArchiveAsset[]; + savedObjectsClient: SavedObjectsClientContract; + refresh?: boolean | 'wait_for'; +}) { + if (!kibanaAssets.length) { + return []; + } + + const toBeSavedObjects = kibanaAssets.map((asset) => createSavedObjectKibanaAsset(asset)); + + const { saved_objects: createdSavedObjects } = await savedObjectsClient.bulkCreate( + toBeSavedObjects, + { + // We only want to install new saved objects without overwriting existing ones + overwrite: false, + managed: true, + refresh, + } + ); + + return createdSavedObjects; +} diff --git a/x-pack/plugins/fleet/server/services/epm/package_service.ts b/x-pack/plugins/fleet/server/services/epm/package_service.ts index 661475dfadc09..a097db584b460 100644 --- a/x-pack/plugins/fleet/server/services/epm/package_service.ts +++ b/x-pack/plugins/fleet/server/services/epm/package_service.ts @@ -39,7 +39,10 @@ import type { InstallResult } from '../../../common'; import { appContextService } from '..'; -import type { 
CustomPackageDatasetConfiguration, EnsurePackageResult } from './packages/install'; +import { + type CustomPackageDatasetConfiguration, + type EnsurePackageResult, +} from './packages/install'; import type { FetchFindLatestPackageOptions } from './registry'; import { getPackageFieldsMetadata } from './registry'; @@ -56,6 +59,7 @@ import { } from './packages'; import { generatePackageInfoFromArchiveBuffer } from './archive'; import { getEsPackage } from './archive/storage'; +import { createArchiveIteratorFromMap } from './archive/archive_iterator'; export type InstalledAssetType = EsAssetReference; @@ -381,12 +385,14 @@ class PackageClientImpl implements PackageClient { } const { assetsMap } = esPackage; + const archiveIterator = createArchiveIteratorFromMap(assetsMap); const { installedTransforms } = await installTransforms({ packageInstallContext: { assetsMap, packageInfo, paths, + archiveIterator, }, esClient: this.internalEsClient, savedObjectsClient: this.internalSoClient, diff --git a/x-pack/plugins/fleet/server/services/epm/packages/assets.ts b/x-pack/plugins/fleet/server/services/epm/packages/assets.ts index a82b5c0d103b2..3bb84c0d23163 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/assets.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/assets.ts @@ -5,9 +5,9 @@ * 2.0. */ +import type { ArchiveEntry } from '../../../../common/types'; import type { AssetsMap, PackageInfo } from '../../../types'; import { getAssetFromAssetsMap } from '../archive'; -import type { ArchiveEntry } from '../archive'; const maybeFilterByDataset = (packageInfo: Pick, datasetName: string) => diff --git a/x-pack/plugins/fleet/server/services/epm/packages/get.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/get.test.ts index 2dc295762e33a..5711c8fcccaf4 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/get.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/get.test.ts @@ -27,6 +27,8 @@ import { auditLoggingService } from '../../audit_logging'; import * as Registry from '../registry'; +import { createArchiveIteratorFromMap } from '../archive/archive_iterator'; + import { getInstalledPackages, getPackageInfo, getPackages, getPackageUsageStats } from './get'; jest.mock('../registry'); @@ -915,6 +917,7 @@ owner: elastic`, MockRegistry.getPackage.mockResolvedValue({ paths: [], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), packageInfo: { name: 'my-package', version: '1.0.0', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install.test.ts index 709e0d84d70fc..6b3a31eda649e 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install.test.ts @@ -442,6 +442,24 @@ describe('install', () => { expect(response.status).toEqual('installed'); }); + + it('should use streaming installation for the detection rules package', async () => { + jest.spyOn(licenseService, 'hasAtLeast').mockReturnValue(true); + + const response = await installPackage({ + spaceId: DEFAULT_SPACE_ID, + installSource: 'registry', + pkgkey: 'security_detection_engine', + savedObjectsClient: savedObjectsClientMock.create(), + esClient: {} as ElasticsearchClient, + }); + + expect(response.error).toBeUndefined(); + + expect(installStateMachine._stateMachineInstallPackage).toHaveBeenCalledWith( + expect.objectContaining({ useStreaming: true }) + ); + }); }); describe('upload', () => { diff --git 
a/x-pack/plugins/fleet/server/services/epm/packages/install.ts b/x-pack/plugins/fleet/server/services/epm/packages/install.ts index 1ea6f29cad839..ebe5acc35178d 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install.ts @@ -76,6 +76,7 @@ import { deleteVerificationResult, unpackBufferToAssetsMap, } from '../archive'; +import { createArchiveIteratorFromMap } from '../archive/archive_iterator'; import { toAssetReference } from '../kibana/assets/install'; import type { ArchiveAsset } from '../kibana/assets/install'; import type { PackageUpdateEvent } from '../../upgrade_sender'; @@ -107,6 +108,12 @@ import { removeInstallation } from './remove'; export const UPLOAD_RETRY_AFTER_MS = 10000; // 10s const MAX_ENSURE_INSTALL_TIME = 60 * 1000; +const PACKAGES_TO_INSTALL_WITH_STREAMING = [ + // The security_detection_engine package contains a large number of assets and + // is not suitable for regular installation as it might cause OOM errors. + 'security_detection_engine', +]; + export async function isPackageInstalled(options: { savedObjectsClient: SavedObjectsClientContract; pkgName: string; @@ -449,6 +456,7 @@ async function installPackageFromRegistry({ // TODO: change epm API to /packageName/version so we don't need to do this const { pkgName, pkgVersion: version } = Registry.splitPkgKey(pkgkey); let pkgVersion = version ?? ''; + const useStreaming = PACKAGES_TO_INSTALL_WITH_STREAMING.includes(pkgName); // if an error happens during getInstallType, report that we don't know let installType: InstallType = 'unknown'; @@ -478,11 +486,12 @@ async function installPackageFromRegistry({ } // get latest package version and requested version in parallel for performance - const [latestPackage, { paths, packageInfo, assetsMap, verificationResult }] = + const [latestPackage, { paths, packageInfo, assetsMap, archiveIterator, verificationResult }] = await Promise.all([ latestPkg ? 
Promise.resolve(latestPkg) : queryLatest(), Registry.getPackage(pkgName, pkgVersion, { ignoreUnverified: force && !neverIgnoreVerificationError, + useStreaming, }), ]); @@ -490,6 +499,7 @@ async function installPackageFromRegistry({ packageInfo, assetsMap, paths, + archiveIterator, }; // let the user install if using the force flag or needing to reinstall or install a previous version due to failed update @@ -542,6 +552,7 @@ async function installPackageFromRegistry({ ignoreMappingUpdateErrors, skipDataStreamRollover, retryFromLastState, + useStreaming, }); } catch (e) { sendEvent({ @@ -580,6 +591,7 @@ async function installPackageWithStateMachine(options: { ignoreMappingUpdateErrors?: boolean; skipDataStreamRollover?: boolean; retryFromLastState?: boolean; + useStreaming?: boolean; }): Promise { const packageInfo = options.packageInstallContext.packageInfo; @@ -599,6 +611,7 @@ async function installPackageWithStateMachine(options: { skipDataStreamRollover, packageInstallContext, retryFromLastState, + useStreaming, } = options; let { telemetryEvent } = options; const logger = appContextService.getLogger(); @@ -696,6 +709,7 @@ async function installPackageWithStateMachine(options: { ignoreMappingUpdateErrors, skipDataStreamRollover, retryFromLastState, + useStreaming, }) .then(async (assets) => { logger.debug(`Removing old assets from previous versions of ${pkgName}`); @@ -785,6 +799,7 @@ async function installPackageByUpload({ } const { packageInfo } = await generatePackageInfoFromArchiveBuffer(archiveBuffer, contentType); const pkgName = packageInfo.name; + const useStreaming = PACKAGES_TO_INSTALL_WITH_STREAMING.includes(pkgName); // Allow for overriding the version in the manifest for cases where we install // stack-aligned bundled packages to support special cases around the @@ -807,17 +822,17 @@ async function installPackageByUpload({ packageInfo, }); - const { assetsMap, paths } = await unpackBufferToAssetsMap({ - name: packageInfo.name, - version: pkgVersion, + const { paths, assetsMap, archiveIterator } = await unpackBufferToAssetsMap({ archiveBuffer, contentType, + useStreaming, }); const packageInstallContext: PackageInstallContext = { packageInfo: { ...packageInfo, version: pkgVersion }, assetsMap, paths, + archiveIterator, }; // update the timestamp of latest installation setLastUploadInstallCache(); @@ -837,6 +852,7 @@ async function installPackageByUpload({ authorizationHeader, ignoreMappingUpdateErrors, skipDataStreamRollover, + useStreaming, }); } catch (e) { return { @@ -1004,12 +1020,14 @@ export async function installCustomPackage( acc.set(asset.path, asset.content); return acc; }, new Map()); - const paths = [...assetsMap.keys()]; + const paths = assets.map((asset) => asset.path); + const archiveIterator = createArchiveIteratorFromMap(assetsMap); const packageInstallContext: PackageInstallContext = { assetsMap, paths, packageInfo, + archiveIterator, }; return await installPackageWithStateMachine({ packageInstallContext, @@ -1341,16 +1359,20 @@ export async function installAssetsForInputPackagePolicy(opts: { ignoreUnverified: force, }); + const archiveIterator = createArchiveIteratorFromMap(pkg.assetsMap); packageInstallContext = { assetsMap: pkg.assetsMap, packageInfo: pkg.packageInfo, paths: pkg.paths, + archiveIterator, }; } else { + const archiveIterator = createArchiveIteratorFromMap(installedPkgWithAssets.assetsMap); packageInstallContext = { assetsMap: installedPkgWithAssets.assetsMap, packageInfo: installedPkgWithAssets.packageInfo, paths: 
installedPkgWithAssets.paths, + archiveIterator, }; } diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.test.ts index 174076a9e9b1b..73b78a6cc4aa0 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.test.ts @@ -38,6 +38,8 @@ import { updateCurrentWriteIndices } from '../../elasticsearch/template/template import { installIndexTemplatesAndPipelines } from '../install_index_template_pipeline'; +import { createArchiveIteratorFromMap } from '../../archive/archive_iterator'; + import { handleState } from './state_machine'; import { _stateMachineInstallPackage } from './_state_machine_package_install'; import { cleanupLatestExecutedState } from './steps'; @@ -110,6 +112,7 @@ describe('_stateMachineInstallPackage', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -172,6 +175,7 @@ describe('_stateMachineInstallPackage', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -208,6 +212,7 @@ describe('_stateMachineInstallPackage', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -257,6 +262,7 @@ describe('_stateMachineInstallPackage', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -336,6 +342,7 @@ describe('_stateMachineInstallPackage', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }, installType: 'install', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.ts index 1f10d40feba38..c941b6d60d63b 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/_state_machine_package_install.ts @@ -48,11 +48,13 @@ import { updateLatestExecutedState, cleanupLatestExecutedState, cleanUpKibanaAssetsStep, + cleanUpUnusedKibanaAssetsStep, cleanupILMPoliciesStep, cleanUpMlModelStep, cleanupIndexTemplatePipelinesStep, cleanupTransformsStep, cleanupArchiveEntriesStep, + stepInstallKibanaAssetsWithStreaming, } from './steps'; import type { StateMachineDefinition, StateMachineStates } from './state_machine'; import { handleState } from './state_machine'; @@ -73,6 +75,7 @@ export interface InstallContext extends StateContext { skipDataStreamRollover?: boolean; retryFromLastState?: boolean; initialState?: INSTALL_STATES; + useStreaming?: boolean; indexTemplates?: IndexTemplateEntry[]; packageAssetRefs?: PackageAssetReference[]; @@ -83,7 +86,7 @@ export interface InstallContext extends StateContext { /** * This data 
structure defines the sequence of the states and the transitions */ -const statesDefinition: StateMachineStates = { +const regularStatesDefinition: StateMachineStates = { create_restart_installation: { nextState: INSTALL_STATES.INSTALL_KIBANA_ASSETS, onTransition: stepCreateRestartInstallation, @@ -152,6 +155,31 @@ const statesDefinition: StateMachineStates = { }, }; +const streamingStatesDefinition: StateMachineStates = { + create_restart_installation: { + nextState: INSTALL_STATES.INSTALL_KIBANA_ASSETS, + onTransition: stepCreateRestartInstallation, + onPostTransition: updateLatestExecutedState, + }, + install_kibana_assets: { + onTransition: stepInstallKibanaAssetsWithStreaming, + nextState: INSTALL_STATES.SAVE_ARCHIVE_ENTRIES, + onPostTransition: updateLatestExecutedState, + }, + save_archive_entries_from_assets_map: { + onPreTransition: cleanupArchiveEntriesStep, + onTransition: stepSaveArchiveEntries, + nextState: INSTALL_STATES.UPDATE_SO, + onPostTransition: updateLatestExecutedState, + }, + update_so: { + onPreTransition: cleanUpUnusedKibanaAssetsStep, + onTransition: stepSaveSystemObject, + nextState: 'end', + onPostTransition: updateLatestExecutedState, + }, +}; + /* * _stateMachineInstallPackage installs packages using the generic state machine in ./state_machine * installStates is the data structure providing the state machine definition @@ -166,6 +194,10 @@ export async function _stateMachineInstallPackage( const logger = appContextService.getLogger(); let initialState = INSTALL_STATES.CREATE_RESTART_INSTALLATION; + const statesDefinition = context.useStreaming + ? streamingStatesDefinition + : regularStatesDefinition; + // if retryFromLastState, restart install from last install state // if force is passed, the install should be executed from the beginning if (retryFromLastState && !force && installedPkg?.attributes?.latest_executed_state?.name) { diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_create_restart_installation.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_create_restart_installation.test.ts index 2b653728d6574..e5a7fed55fe87 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_create_restart_installation.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_create_restart_installation.test.ts @@ -31,6 +31,8 @@ import { auditLoggingService } from '../../../../audit_logging'; import { restartInstallation, createInstallation } from '../../install'; import type { Installation } from '../../../../../../common'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepCreateRestartInstallation } from './step_create_restart_installation'; jest.mock('../../../../audit_logging'); @@ -84,6 +86,7 @@ describe('stepCreateRestartInstallation', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -120,6 +123,7 @@ describe('stepCreateRestartInstallation', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -164,6 +168,7 @@ describe('stepCreateRestartInstallation', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -208,6 
+213,7 @@ describe('stepCreateRestartInstallation', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_delete_previous_pipelines.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_delete_previous_pipelines.test.ts index 7d8a251433bb5..06201770ee2e2 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_delete_previous_pipelines.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_delete_previous_pipelines.test.ts @@ -24,6 +24,8 @@ import { deletePreviousPipelines, } from '../../../elasticsearch/ingest_pipeline'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepDeletePreviousPipelines } from './step_delete_previous_pipelines'; jest.mock('../../../elasticsearch/ingest_pipeline'); @@ -84,6 +86,7 @@ describe('stepDeletePreviousPipelines', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -276,6 +279,7 @@ describe('stepDeletePreviousPipelines', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_ilm_policies.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_ilm_policies.test.ts index 2cf9b23bb9adb..4c106a0c68f15 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_ilm_policies.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_ilm_policies.test.ts @@ -24,6 +24,8 @@ import { installIlmForDataStream } from '../../../elasticsearch/datastream_ilm/i import { ElasticsearchAssetType } from '../../../../../types'; import { deleteILMPolicies, deletePrerequisiteAssets } from '../../remove'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepInstallILMPolicies, cleanupILMPoliciesStep } from './step_install_ilm_policies'; jest.mock('../../../archive/storage'); @@ -56,6 +58,7 @@ const packageInstallContext = { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; let soClient: jest.Mocked; @@ -239,6 +242,7 @@ describe('stepInstallILMPolicies', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }, installType: 'install', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_index_template_pipelines.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_index_template_pipelines.test.ts index d258747edc6ef..1c368cfd998d3 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_index_template_pipelines.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_index_template_pipelines.test.ts @@ -37,6 +37,8 @@ const 
mockDeletePrerequisiteAssets = deletePrerequisiteAssets as jest.MockedFunc typeof deletePrerequisiteAssets >; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepInstallIndexTemplatePipelines, cleanupIndexTemplatePipelinesStep, @@ -122,6 +124,7 @@ describe('stepInstallIndexTemplatePipelines', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -281,6 +284,7 @@ describe('stepInstallIndexTemplatePipelines', () => { ], } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -431,6 +435,7 @@ describe('stepInstallIndexTemplatePipelines', () => { ], } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -521,6 +526,7 @@ describe('stepInstallIndexTemplatePipelines', () => { ], } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -574,6 +580,7 @@ describe('stepInstallIndexTemplatePipelines', () => { owner: { github: 'elastic/fleet' }, } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; appContextService.start( @@ -647,6 +654,7 @@ describe('cleanupIndexTemplatePipelinesStep', () => { ], } as any, assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], }; const mockInstalledPackageSo: SavedObject = { diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.test.ts index 52c93c61c16e1..cf9d953868b6a 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.test.ts @@ -23,8 +23,25 @@ import { deleteKibanaAssets } from '../../remove'; import { KibanaSavedObjectType, type Installation } from '../../../../../types'; -import { stepInstallKibanaAssets, cleanUpKibanaAssetsStep } from './step_install_kibana_assets'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; +import { + stepInstallKibanaAssets, + cleanUpKibanaAssetsStep, + stepInstallKibanaAssetsWithStreaming, + cleanUpUnusedKibanaAssetsStep, +} from './step_install_kibana_assets'; + +jest.mock('../../../kibana/assets/saved_objects', () => { + return { + getSpaceAwareSaveobjectsClients: jest.fn().mockReturnValue({ + savedObjectClientWithSpace: jest.fn(), + savedObjectsImporter: jest.fn(), + savedObjectTagAssignmentService: jest.fn(), + savedObjectTagClient: jest.fn(), + }), + }; +}); jest.mock('../../../kibana/assets/install'); jest.mock('../../remove', () => { return { @@ -58,6 +75,7 @@ const packageInstallContext = { } as any, paths: ['some/path/1', 'some/path/2'], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; describe('stepInstallKibanaAssets', () => { @@ -82,6 +100,7 @@ describe('stepInstallKibanaAssets', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -102,7 +121,7 @@ 
describe('stepInstallKibanaAssets', () => { }); await expect(installationPromise).resolves.not.toThrowError(); - expect(mockedInstallKibanaAssetsAndReferencesMultispace).toBeCalledTimes(1); + expect(installKibanaAssetsAndReferencesMultispace).toBeCalledTimes(1); }); esClient = elasticsearchServiceMock.createClusterClient().asInternalUser; appContextService.start(createAppContextStartContractMock()); @@ -121,6 +140,7 @@ describe('stepInstallKibanaAssets', () => { logger: loggerMock.create(), packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -144,6 +164,60 @@ describe('stepInstallKibanaAssets', () => { }); }); +describe('stepInstallKibanaAssetsWithStreaming', () => { + beforeEach(async () => { + soClient = savedObjectsClientMock.create(); + esClient = elasticsearchServiceMock.createClusterClient().asInternalUser; + appContextService.start(createAppContextStartContractMock()); + }); + + it('should rely on archiveIterator instead of in-memory assetsMap', async () => { + const assetsMap = new Map(); + assetsMap.get = jest.fn(); + assetsMap.set = jest.fn(); + + const archiveIterator = { + traverseEntries: jest.fn(), + getPaths: jest.fn(), + }; + + const result = await stepInstallKibanaAssetsWithStreaming({ + savedObjectsClient: soClient, + esClient, + logger: loggerMock.create(), + packageInstallContext: { + assetsMap, + archiveIterator, + paths: [], + packageInfo: { + title: 'title', + name: 'xyz', + version: '4.5.6', + description: 'test', + type: 'integration', + categories: ['cloud', 'custom'], + format_version: 'string', + release: 'experimental', + conditions: { kibana: { version: 'x.y.z' } }, + owner: { github: 'elastic/fleet' }, + }, + }, + installType: 'install', + installSource: 'registry', + spaceId: DEFAULT_SPACE_ID, + }); + + expect(result).toEqual({ installedKibanaAssetsRefs: [] }); + + // Verify that assetsMap was not used + expect(assetsMap.get).not.toBeCalled(); + expect(assetsMap.set).not.toBeCalled(); + + // Verify that archiveIterator was used + expect(archiveIterator.traverseEntries).toBeCalled(); + }); +}); + describe('cleanUpKibanaAssetsStep', () => { const mockInstalledPackageSo: SavedObject = { id: 'mocked-package', @@ -302,3 +376,84 @@ describe('cleanUpKibanaAssetsStep', () => { expect(mockedDeleteKibanaAssets).not.toBeCalled(); }); }); + +describe('cleanUpUnusedKibanaAssetsStep', () => { + const mockInstalledPackageSo: SavedObject = { + id: 'mocked-package', + attributes: { + name: 'test-package', + version: '1.0.0', + install_status: 'installing', + install_version: '1.0.0', + install_started_at: new Date().toISOString(), + install_source: 'registry', + verification_status: 'verified', + installed_kibana: [] as any, + installed_es: [] as any, + es_index_patterns: {}, + }, + type: PACKAGES_SAVED_OBJECT_TYPE, + references: [], + }; + + const installationContext = { + savedObjectsClient: soClient, + savedObjectsImporter: jest.fn(), + esClient, + logger: loggerMock.create(), + packageInstallContext, + installType: 'install' as const, + installSource: 'registry' as const, + spaceId: DEFAULT_SPACE_ID, + retryFromLastState: true, + initialState: 'install_kibana_assets' as any, + }; + + beforeEach(async () => { + soClient = savedObjectsClientMock.create(); + esClient = elasticsearchServiceMock.createClusterClient().asInternalUser; + appContextService.start(createAppContextStartContractMock()); + }); + + it('should not clean up assets if they all present in the new package', 
async () => { + const installedAssets = [{ type: KibanaSavedObjectType.dashboard, id: 'dashboard-1' }]; + await cleanUpUnusedKibanaAssetsStep({ + ...installationContext, + installedPkg: { + ...mockInstalledPackageSo, + attributes: { + ...mockInstalledPackageSo.attributes, + installed_kibana: installedAssets, + }, + }, + installedKibanaAssetsRefs: installedAssets, + }); + + expect(mockedDeleteKibanaAssets).not.toBeCalled(); + }); + + it('should clean up assets that are not present in the new package', async () => { + const installedAssets = [ + { type: KibanaSavedObjectType.dashboard, id: 'dashboard-1' }, + { type: KibanaSavedObjectType.dashboard, id: 'dashboard-2' }, + ]; + const newAssets = [{ type: KibanaSavedObjectType.dashboard, id: 'dashboard-1' }]; + await cleanUpUnusedKibanaAssetsStep({ + ...installationContext, + installedPkg: { + ...mockInstalledPackageSo, + attributes: { + ...mockInstalledPackageSo.attributes, + installed_kibana: installedAssets, + }, + }, + installedKibanaAssetsRefs: newAssets, + }); + + expect(mockedDeleteKibanaAssets).toBeCalledWith({ + installedObjects: [installedAssets[1]], + spaceId: 'default', + packageInfo: packageInstallContext.packageInfo, + }); + }); +}); diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.ts index b5a1fff91d3b8..aabd23f2eb9cc 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_kibana_assets.ts @@ -11,7 +11,9 @@ import { withPackageSpan } from '../../utils'; import type { InstallContext } from '../_state_machine_package_install'; import { deleteKibanaAssets } from '../../remove'; +import type { KibanaAssetReference } from '../../../../../../common/types'; import { INSTALL_STATES } from '../../../../../../common/types'; +import { installKibanaAssetsWithStreaming } from '../../../kibana/assets/install_with_streaming'; export async function stepInstallKibanaAssets(context: InstallContext) { const { savedObjectsClient, logger, installedPkg, packageInstallContext, spaceId } = context; @@ -37,6 +39,26 @@ export async function stepInstallKibanaAssets(context: InstallContext) { return { kibanaAssetPromise }; } +export async function stepInstallKibanaAssetsWithStreaming(context: InstallContext) { + const { savedObjectsClient, installedPkg, packageInstallContext, spaceId } = context; + const { packageInfo } = packageInstallContext; + const { name: pkgName } = packageInfo; + + const installedKibanaAssetsRefs = await withPackageSpan( + 'Install Kibana assets with streaming', + () => + installKibanaAssetsWithStreaming({ + savedObjectsClient, + pkgName, + packageInstallContext, + installedPkg, + spaceId, + }) + ); + + return { installedKibanaAssetsRefs }; +} + export async function cleanUpKibanaAssetsStep(context: InstallContext) { const { logger, @@ -65,3 +87,44 @@ export async function cleanUpKibanaAssetsStep(context: InstallContext) { }); } } + +/** + * Cleans up Kibana assets that are no longer in the package. As opposite to + * `cleanUpKibanaAssetsStep`, this one is used after the package assets are + * installed. + * + * This function compares the currently installed Kibana assets with the assets + * in the previous package and removes any assets that are no longer present in the + * new installation. 
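+ *
+ * Illustrative sketch only (not part of the implementation) of the comparison this
+ * step performs, keyed on the id/type pair of each asset reference:
+ *
+ *   previously installed_kibana: dashboard-1, dashboard-2
+ *   newly installed assets:      dashboard-1
+ *   => assets to remove:         dashboard-2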
+ * + */ +export async function cleanUpUnusedKibanaAssetsStep(context: InstallContext) { + const { logger, installedPkg, packageInstallContext, spaceId, installedKibanaAssetsRefs } = + context; + const { packageInfo } = packageInstallContext; + + if (!installedKibanaAssetsRefs) { + return; + } + + logger.debug('Clean up Kibana assets that are no longer in the package'); + + // Get the assets installed by the previous package + const previousAssetRefs = installedPkg?.attributes.installed_kibana ?? []; + + // Remove any assets that are not in the new package + const nextAssetRefKeys = new Set( + installedKibanaAssetsRefs.map((asset: KibanaAssetReference) => `${asset.id}-${asset.type}`) + ); + const assetsToRemove = previousAssetRefs.filter( + (existingAsset) => !nextAssetRefKeys.has(`${existingAsset.id}-${existingAsset.type}`) + ); + + if (assetsToRemove.length === 0) { + return; + } + + await withPackageSpan('Clean up Kibana assets that are no longer in the package', async () => { + await deleteKibanaAssets({ installedObjects: assetsToRemove, spaceId, packageInfo }); + }); +} diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_mlmodel.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_mlmodel.test.ts index 1afb436eb4361..df939f3a458b6 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_mlmodel.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_mlmodel.test.ts @@ -22,6 +22,8 @@ import { createAppContextStartContractMock } from '../../../../../mocks'; import { installMlModel } from '../../../elasticsearch/ml_model'; import { deleteMLModels, deletePrerequisiteAssets } from '../../remove'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepInstallMlModel, cleanUpMlModelStep } from './step_install_mlmodel'; jest.mock('../../../elasticsearch/ml_model'); @@ -53,6 +55,7 @@ const packageInstallContext = { } as any, paths: ['some/path/1', 'some/path/2'], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; let soClient: jest.Mocked; let esClient: jest.Mocked; diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_transforms.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_transforms.test.ts index 1ac2383950b05..3bf07d52c6cbf 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_transforms.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_install_transforms.test.ts @@ -22,6 +22,8 @@ import { createAppContextStartContractMock } from '../../../../../mocks'; import { installTransforms } from '../../../elasticsearch/transform/install'; import { cleanupTransforms } from '../../remove'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepInstallTransforms, cleanupTransformsStep } from './step_install_transforms'; jest.mock('../../../elasticsearch/transform/install'); @@ -52,6 +54,7 @@ const packageInstallContext = { } as any, paths: ['some/path/1', 'some/path/2'], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; describe('stepInstallTransforms', () => { diff --git 
a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_remove_legacy_templates.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_remove_legacy_templates.test.ts index 39e7159596ba8..7fa00a1c57f57 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_remove_legacy_templates.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_remove_legacy_templates.test.ts @@ -24,6 +24,8 @@ import { appContextService } from '../../../../app_context'; import { createAppContextStartContractMock } from '../../../../../mocks'; import { removeLegacyTemplates } from '../../../elasticsearch/template/remove_legacy'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepRemoveLegacyTemplates } from './step_remove_legacy_templates'; jest.mock('../../../elasticsearch/template/remove_legacy'); @@ -82,6 +84,7 @@ describe('stepRemoveLegacyTemplates', () => { } as any, paths: ['some/path/1', 'some/path/2'], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; appContextService.start( createAppContextStartContractMock({ diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.test.ts index b03c146640488..255572d57cf49 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.test.ts @@ -21,6 +21,8 @@ import { appContextService } from '../../../../app_context'; import { createAppContextStartContractMock } from '../../../../../mocks'; import { saveArchiveEntriesFromAssetsMap, removeArchiveEntries } from '../../../archive/storage'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepSaveArchiveEntries, cleanupArchiveEntriesStep } from './step_save_archive_entries'; jest.mock('../../../archive/storage', () => { @@ -60,6 +62,7 @@ const packageInstallContext = { Buffer.from('{"content": "data"}'), ], ]), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; const getMockInstalledPackageSo = ( installedEs: EsAssetReference[] = [] diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.ts index b0d5bb67627a6..7db44bb243f85 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_archive_entries.ts @@ -14,17 +14,32 @@ import { withPackageSpan } from '../../utils'; import type { InstallContext } from '../_state_machine_package_install'; import { INSTALL_STATES } from '../../../../../../common/types'; +import { MANIFEST_NAME } from '../../../archive/parse'; export async function stepSaveArchiveEntries(context: InstallContext) { - const { packageInstallContext, savedObjectsClient, installSource } = context; + const { packageInstallContext, savedObjectsClient, installSource, useStreaming } = context; - const { packageInfo } = packageInstallContext; + const { packageInfo, 
archiveIterator } = packageInstallContext; + + let assetsMap = packageInstallContext?.assetsMap; + let paths = packageInstallContext?.paths; + // For stream based installations, we don't want to save any assets but + // manifest.yaml due to the large number of assets in the package. + if (useStreaming) { + assetsMap = new Map(); + await archiveIterator.traverseEntries(async (entry) => { + if (entry.path.endsWith(MANIFEST_NAME)) { + assetsMap.set(entry.path, entry.buffer); + } + }); + paths = Array.from(assetsMap.keys()); + } const packageAssetResults = await withPackageSpan('Update archive entries', () => saveArchiveEntriesFromAssetsMap({ savedObjectsClient, - assetsMap: packageInstallContext?.assetsMap, - paths: packageInstallContext?.paths, + assetsMap, + paths, packageInfo, installSource, }) diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_system_object.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_system_object.test.ts index aecdd0b2552c4..8d80c236aefb0 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_system_object.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_save_system_object.test.ts @@ -21,6 +21,8 @@ import { createAppContextStartContractMock } from '../../../../../mocks'; import { auditLoggingService } from '../../../../audit_logging'; import { packagePolicyService } from '../../../../package_policy'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepSaveSystemObject } from './step_save_system_object'; jest.mock('../../../../audit_logging'); @@ -67,6 +69,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -133,6 +136,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_update_current_write_indices.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_update_current_write_indices.test.ts index c7f3c040b7966..017805d34efef 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_update_current_write_indices.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/step_update_current_write_indices.test.ts @@ -22,6 +22,8 @@ import { appContextService } from '../../../../app_context'; import { createAppContextStartContractMock } from '../../../../../mocks'; import { updateCurrentWriteIndices } from '../../../elasticsearch/template/template'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { stepUpdateCurrentWriteIndices } from './step_update_current_write_indices'; jest.mock('../../../elasticsearch/template/template'); @@ -86,6 +88,7 @@ describe('stepUpdateCurrentWriteIndices', () => { } as any, paths: ['some/path/1', 'some/path/2'], assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), }; appContextService.start( createAppContextStartContractMock({ diff --git 
a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/update_latest_executed_state.test.ts b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/update_latest_executed_state.test.ts index d963e5fea44c9..aea879aba5479 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/update_latest_executed_state.test.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/install_state_machine/steps/update_latest_executed_state.test.ts @@ -32,6 +32,8 @@ import { auditLoggingService } from '../../../../audit_logging'; import type { PackagePolicySOAttributes } from '../../../../../types'; +import { createArchiveIteratorFromMap } from '../../../archive/archive_iterator'; + import { updateLatestExecutedState } from './update_latest_executed_state'; jest.mock('../../../../audit_logging'); @@ -61,6 +63,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -116,6 +119,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -153,6 +157,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', @@ -198,6 +203,7 @@ describe('updateLatestExecutedState', () => { logger, packageInstallContext: { assetsMap: new Map(), + archiveIterator: createArchiveIteratorFromMap(new Map()), paths: [], packageInfo: { title: 'title', diff --git a/x-pack/plugins/fleet/server/services/epm/packages/remove.ts b/x-pack/plugins/fleet/server/services/epm/packages/remove.ts index ac3f5def5d09c..3892eaa951e5f 100644 --- a/x-pack/plugins/fleet/server/services/epm/packages/remove.ts +++ b/x-pack/plugins/fleet/server/services/epm/packages/remove.ts @@ -148,6 +148,7 @@ export async function deleteKibanaAssets({ const namespace = SavedObjectsUtils.namespaceStringToId(spaceId); + // TODO this should be the installed package info, not the package that is being installed const minKibana = packageInfo.conditions?.kibana?.version ? 
minVersion(packageInfo.conditions.kibana.version) : null; diff --git a/x-pack/plugins/fleet/server/services/epm/registry/index.ts b/x-pack/plugins/fleet/server/services/epm/registry/index.ts index bb4d612aa7de3..75b9869d0a7c6 100644 --- a/x-pack/plugins/fleet/server/services/epm/registry/index.ts +++ b/x-pack/plugins/fleet/server/services/epm/registry/index.ts @@ -54,6 +54,8 @@ import { resolveDataStreamFields, resolveDataStreamsMap, withPackageSpan } from import { verifyPackageArchiveSignature } from '../packages/package_verification'; +import type { ArchiveIterator } from '../../../../common/types'; + import { fetchUrl, getResponse, getResponseStream } from './requests'; import { getRegistryUrl } from './registry_url'; @@ -309,11 +311,12 @@ async function getPackageInfoFromArchiveOrCache( export async function getPackage( name: string, version: string, - options?: { ignoreUnverified?: boolean } + options?: { ignoreUnverified?: boolean; useStreaming?: boolean } ): Promise<{ paths: string[]; packageInfo: ArchivePackage; assetsMap: AssetsMap; + archiveIterator: ArchiveIterator; verificationResult?: PackageVerificationResult; }> { const verifyPackage = appContextService.getExperimentalFeatures().packageVerification; @@ -340,18 +343,18 @@ export async function getPackage( setVerificationResult({ name, version }, latestVerificationResult); } - const { assetsMap, paths } = await unpackBufferToAssetsMap({ - name, - version, + const contentType = ensureContentType(archivePath); + const { paths, assetsMap, archiveIterator } = await unpackBufferToAssetsMap({ archiveBuffer, - contentType: ensureContentType(archivePath), + contentType, + useStreaming: options?.useStreaming, }); if (!packageInfo) { packageInfo = await getPackageInfoFromArchiveOrCache(name, version, archiveBuffer, archivePath); } - return { paths, packageInfo, assetsMap, verificationResult }; + return { paths, packageInfo, assetsMap, archiveIterator, verificationResult }; } export async function getPackageFieldsMetadata( @@ -397,7 +400,7 @@ export async function getPackageFieldsMetadata( } } -function ensureContentType(archivePath: string) { +export function ensureContentType(archivePath: string) { const contentType = mime.lookup(archivePath); if (!contentType) { diff --git a/x-pack/plugins/fleet/server/services/package_policies/experimental_datastream_features.ts b/x-pack/plugins/fleet/server/services/package_policies/experimental_datastream_features.ts index edf31991634b9..cd1c26942aa0c 100644 --- a/x-pack/plugins/fleet/server/services/package_policies/experimental_datastream_features.ts +++ b/x-pack/plugins/fleet/server/services/package_policies/experimental_datastream_features.ts @@ -30,6 +30,7 @@ import { applyDocOnlyValueToMapping, forEachMappings, } from '../experimental_datastream_features_helper'; +import { createArchiveIteratorFromMap } from '../epm/archive/archive_iterator'; export async function handleExperimentalDatastreamFeatureOptIn({ soClient, @@ -75,6 +76,7 @@ export async function handleExperimentalDatastreamFeatureOptIn({ return prepareTemplate({ packageInstallContext: { assetsMap, + archiveIterator: createArchiveIteratorFromMap(assetsMap), packageInfo, paths, }, diff --git a/x-pack/plugins/inference/README.md b/x-pack/plugins/inference/README.md index 1807da7f29faa..935ae31bd6bc6 100644 --- a/x-pack/plugins/inference/README.md +++ b/x-pack/plugins/inference/README.md @@ -4,13 +4,12 @@ The inference plugin is a central place to handle all interactions with the Elas external LLM APIs. 
Its goals are: - Provide a single place for all interactions with large language models and other generative AI adjacent tasks. -- Abstract away differences between different LLM providers like OpenAI, Bedrock and Gemini -- Host commonly used LLM-based tasks like generating ES|QL from natural language and knowledge base recall. +- Abstract away differences between different LLM providers like OpenAI, Bedrock and Gemini. - Allow us to move gradually to the \_inference endpoint without disrupting engineers. ## Architecture and examples -![CleanShot 2024-07-14 at 14 45 27@2x](https://github.com/user-attachments/assets/e65a3e47-bce1-4dcf-bbed-4f8ac12a104f) +![architecture-schema](https://github.com/user-attachments/assets/e65a3e47-bce1-4dcf-bbed-4f8ac12a104f) ## Terminology @@ -21,8 +20,22 @@ The following concepts are commonly used throughout the plugin: - **tools**: a set of tools that the LLM can choose to use when generating the next message. In essence, it allows the consumer of the API to define a schema for structured output instead of plain text, and having the LLM select the most appropriate one. - **tool call**: when the LLM has chosen a tool (schema) to use for its output, and returns a document that matches the schema, this is referred to as a tool call. +## Inference connectors + +Performing inference, or more globally communicating with the LLM, is done using stack connectors. + +The subset of connectors that can be used for inference are called `genAI`, or `inference` connectors. +Calling any inference APIs with the ID of a connector that is not inference-compatible will result in the API throwing an error. + +The list of inference connector types: +- `.gen-ai`: OpenAI connector +- `.bedrock`: Bedrock Claude connector +- `.gemini`: Vertex Gemini connector + ## Usage examples +The inference APIs are available via the inference client, which can be created using the inference plugin's start contract: + ```ts class MyPlugin { setup(coreSetup, pluginsSetup) { @@ -40,9 +53,9 @@ class MyPlugin { async (context, request, response) => { const [coreStart, pluginsStart] = await coreSetup.getStartServices(); - const inferenceClient = pluginsSetup.inference.getClient({ request }); + const inferenceClient = pluginsStart.inference.getClient({ request }); - const chatComplete$ = inferenceClient.chatComplete({ + const chatResponse = inferenceClient.chatComplete({ connectorId: request.body.connectorId, system: `Here is my system message`, messages: [ @@ -53,13 +66,9 @@ class MyPlugin { ], }); - const message = await lastValueFrom( - chatComplete$.pipe(withoutTokenCountEvents(), withoutChunkEvents()) - ); - return response.ok({ body: { - message, + chatResponse, }, }); } @@ -68,33 +77,190 @@ class MyPlugin { } ``` -## Services +## APIs -### `chatComplete`: +### `chatComplete` API: `chatComplete` generates a response to a prompt or a conversation using the LLM. Here's what is supported: -- Normalizing request and response formats from different connector types (e.g. 
OpenAI, Bedrock, Claude, Elastic Inference Service) +- Normalizing request and response formats from all supported connector types - Tool calling and validation of tool calls -- Emits token count events -- Emits message events, which is the concatenated message based on the response chunks +- Token usage stats / events +- Streaming mode to work with chunks in real time instead of waiting for the full response + +#### Standard usage + +In standard mode, the API returns a promise resolving with the full LLM response once the generation is complete. +The response will also contain the token count info, if available. + +```ts +const chatResponse = inferenceClient.chatComplete({ + connectorId: 'some-gen-ai-connector', + system: `Here is my system message`, + messages: [ + { + role: MessageRole.User, + content: 'Do something', + }, + ], +}); + +const { content, tokens } = chatResponse; +// do something with the output +``` + +#### Streaming mode + +Passing `stream: true` when calling the API enables streaming mode. +In that mode, the API returns an observable instead of a promise, emitting chunks in real time. + +That observable emits three types of events: + +- `chunk` the completion chunks, emitted in real time +- `tokenCount` token count event, containing info about token usages, eventually emitted after the chunks +- `message` full message event, emitted once the source is done sending chunks + +The `@kbn/inference-common` package exposes various utilities to work with this multi-events observable: + +- `isChatCompletionChunkEvent`, `isChatCompletionMessageEvent` and `isChatCompletionTokenCountEvent` which are type guard for the corresponding event types +- `withoutChunkEvents` and `withoutTokenCountEvents` + +```ts +import { + isChatCompletionChunkEvent, + isChatCompletionMessageEvent, + withoutTokenCountEvents, + withoutChunkEvents, +} from '@kbn/inference-common'; -### `output` +const chatComplete$ = inferenceClient.chatComplete({ + connectorId: 'some-gen-ai-connector', + stream: true, + system: `Here is my system message`, + messages: [ + { + role: MessageRole.User, + content: 'Do something', + }, + ], +}); -`output` is a wrapper around `chatComplete` that is catered towards a single use case: having the LLM output a structured response, based on a schema. It also drops the token count events to simplify usage. +// using and filtering the events +chatComplete$.pipe(withoutTokenCountEvents()).subscribe((event) => { + if (isChatCompletionChunkEvent(event)) { + // do something with the chunk event + } else { + // do something with the message event + } +}); + +// or retrieving the final message +const message = await lastValueFrom( + chatComplete$.pipe(withoutTokenCountEvents(), withoutChunkEvents()) +); +``` + +#### Defining and using tools + +Tools are defined as a record, with a `description` and optionally a `schema`. The reason why it's a record is because of type-safety. +This allows us to have fully typed tool calls (e.g. when the name of the tool being called is `x`, its arguments are typed as the schema of `x`). + +The description and schema of a tool will be converted and sent to the LLM, so it's important +to be explicit about what each tool does. 
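+
+Once the LLM answers with a tool call (see the full example just below), the result of
+executing the tool is usually sent back so the conversation can continue. The following is
+only a rough sketch of that round trip: the message roles and field names used for the
+follow-up messages (`MessageRole.Assistant`, `MessageRole.Tool`, `toolCallId`, `response`)
+are assumptions here, not a definitive contract.
+
+```ts
+// assume `chatResponse` comes from a previous chatComplete call that returned a tool call
+const toolCall = chatResponse.toolCalls[0];
+
+// execute the tool on our side, e.g. actually add the two numbers
+const toolResult = { sum: 4 + 9 };
+
+const followUp = await inferenceClient.chatComplete({
+  connectorId: 'some-gen-ai-connector',
+  messages: [
+    { role: MessageRole.User, content: 'How much is 4 plus 9?' },
+    // echo the tool call chosen by the LLM back to it
+    { role: MessageRole.Assistant, content: null, toolCalls: chatResponse.toolCalls },
+    // and provide the result of executing that tool
+    { role: MessageRole.Tool, toolCallId: toolCall.toolCallId, response: toolResult },
+  ],
+});
+```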
+ +```ts +const chatResponse = inferenceClient.chatComplete({ + connectorId: 'some-gen-ai-connector', + system: `Here is my system message`, + messages: [ + { + role: MessageRole.User, + content: 'How much is 4 plus 9?', + }, + ], + toolChoice: ToolChoiceType.required, // MUST call a tool + tools: { + date: { + description: 'Call this tool if you need to know the current date' + }, + add: { + description: 'This tool can be used to add two numbers', + schema: { + type: 'object', + properties: { + a: { type: 'number', description: 'the first number' }, + b: { type: 'number', description: 'the second number'} + }, + required: ['a', 'b'] + } + } + } as const // as const is required to have type inference on the schema +}); -### Observable event streams +const { content, toolCalls } = chatResponse; +const toolCall = toolCalls[0]; +// process the tool call and eventually continue the conversation with the LLM +``` + +### `output` API + +`output` is a wrapper around the `chatComplete` API that is catered towards a specific use case: having the LLM output a structured response, based on a schema. +It's basically just making sure that the LLM will call the single tool that is exposed via the provided `schema`. +It also drops the token count info to simplify usage. + +Similar to `chatComplete`, `output` supports two modes: normal full response mode by default, and optional streaming mode by passing the `stream: true` parameter. + +```ts +import { ToolSchema } from '@kbn/inference-common'; -These APIs, both on the client and the server, return Observables that emit events. When converting the Observable into a stream, the following things happen: +// schema must be defined as full const or using the `satisfies ToolSchema` modifier for TS type inference to work +const mySchema = { + type: 'object', + properties: { + animals: { + description: 'the list of animals that are mentioned in the provided article', + type: 'array', + items: { + type: 'string', + }, + }, + vegetables: { + description: 'the list of vegetables that are mentioned in the provided article', + type: 'array', + items: { + type: 'string', + }, + }, + }, +} as const; -- Errors are caught and serialized as events sent over the stream (after an error, the stream ends). -- The response stream outputs data as [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events) -- The client that reads the stream, parses the event source as an Observable, and if it encounters a serialized error, it deserializes it and throws an error in the Observable. +const response = inferenceClient.outputApi({ + id: 'extract_from_article', + connectorId: 'some-gen-ai-connector', + schema: mySchema, + system: + 'You are a helpful assistant and your current task is to extract informations from the provided document', + input: ` + Please find all the animals and vegetables that are mentioned in the following document: + + ## Document + + ${theDoc} + `, +}); + +// output is properly typed from the provided schema +const { animals, vegetables } = response.output; +``` ### Errors -All known errors are instances, and not extensions, from the `InferenceTaskError` base class, which has a `code`, a `message`, and `meta` information about the error. This allows us to serialize and deserialize errors over the wire without a complicated factory pattern. +All known errors are instances, and not extensions, of the `InferenceTaskError` base class, which has a `code`, a `message`, and `meta` information about the error. 
+This allows us to serialize and deserialize errors over the wire without a complicated factory pattern. -### Tools +Type guards for each type of error are exposed from the `@kbn/inference-common` package, such as: -Tools are defined as a record, with a `description` and optionally a `schema`. The reason why it's a record is because of type-safety. This allows us to have fully typed tool calls (e.g. when the name of the tool being called is `x`, its arguments are typed as the schema of `x`). +- `isInferenceError` +- `isInferenceInternalError` +- `isInferenceRequestError` +- ...`isXXXError` diff --git a/x-pack/plugins/inference/common/create_output_api.test.ts b/x-pack/plugins/inference/common/create_output_api.test.ts new file mode 100644 index 0000000000000..b5d380fa9aac6 --- /dev/null +++ b/x-pack/plugins/inference/common/create_output_api.test.ts @@ -0,0 +1,122 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import { firstValueFrom, isObservable, of, toArray } from 'rxjs'; +import { + ChatCompleteResponse, + ChatCompletionEvent, + ChatCompletionEventType, +} from '@kbn/inference-common'; +import { createOutputApi } from './create_output_api'; + +describe('createOutputApi', () => { + let chatComplete: jest.Mock; + + beforeEach(() => { + chatComplete = jest.fn(); + }); + + it('calls `chatComplete` with the right parameters', async () => { + chatComplete.mockResolvedValue(Promise.resolve({ content: 'content', toolCalls: [] })); + + const output = createOutputApi(chatComplete); + + await output({ + id: 'id', + stream: false, + functionCalling: 'native', + connectorId: '.my-connector', + system: 'system', + input: 'input message', + }); + + expect(chatComplete).toHaveBeenCalledTimes(1); + expect(chatComplete).toHaveBeenCalledWith({ + connectorId: '.my-connector', + functionCalling: 'native', + stream: false, + system: 'system', + messages: [ + { + content: 'input message', + role: 'user', + }, + ], + }); + }); + + it('returns the expected value when stream=false', async () => { + const chatCompleteResponse: ChatCompleteResponse = { + content: 'content', + toolCalls: [{ toolCallId: 'a', function: { name: 'foo', arguments: { arg: 1 } } }], + }; + + chatComplete.mockResolvedValue(Promise.resolve(chatCompleteResponse)); + + const output = createOutputApi(chatComplete); + + const response = await output({ + id: 'my-id', + stream: false, + connectorId: '.my-connector', + input: 'input message', + }); + + expect(response).toEqual({ + id: 'my-id', + content: chatCompleteResponse.content, + output: chatCompleteResponse.toolCalls[0].function.arguments, + }); + }); + + it('returns the expected value when stream=true', async () => { + const sourceEvents: ChatCompletionEvent[] = [ + { type: ChatCompletionEventType.ChatCompletionChunk, content: 'chunk-1', tool_calls: [] }, + { type: ChatCompletionEventType.ChatCompletionChunk, content: 'chunk-2', tool_calls: [] }, + { + type: ChatCompletionEventType.ChatCompletionMessage, + content: 'message', + toolCalls: [{ toolCallId: 'a', function: { name: 'foo', arguments: { arg: 1 } } }], + }, + ]; + + chatComplete.mockReturnValue(of(...sourceEvents)); + + const output = createOutputApi(chatComplete); + + const response$ = await output({ + id: 'my-id', + stream: true, + connectorId: '.my-connector', + input: 'input message', + }); + + 
expect(isObservable(response$)).toEqual(true); + const events = await firstValueFrom(response$.pipe(toArray())); + + expect(events).toEqual([ + { + content: 'chunk-1', + id: 'my-id', + type: 'output', + }, + { + content: 'chunk-2', + id: 'my-id', + type: 'output', + }, + { + content: 'message', + id: 'my-id', + output: { + arg: 1, + }, + type: 'complete', + }, + ]); + }); +}); diff --git a/x-pack/plugins/inference/common/create_output_api.ts b/x-pack/plugins/inference/common/create_output_api.ts index 450114c892cba..e5dd2eeda2cbd 100644 --- a/x-pack/plugins/inference/common/create_output_api.ts +++ b/x-pack/plugins/inference/common/create_output_api.ts @@ -5,24 +5,36 @@ * 2.0. */ -import { map } from 'rxjs'; import { - OutputAPI, - OutputEvent, - OutputEventType, ChatCompleteAPI, ChatCompletionEventType, MessageRole, + OutputAPI, + OutputEventType, + OutputOptions, + ToolSchema, withoutTokenCountEvents, } from '@kbn/inference-common'; +import { isObservable, map } from 'rxjs'; import { ensureMultiTurn } from './utils/ensure_multi_turn'; -export function createOutputApi(chatCompleteApi: ChatCompleteAPI): OutputAPI { - return (id, { connectorId, input, schema, system, previousMessages, functionCalling }) => { - return chatCompleteApi({ +export function createOutputApi(chatCompleteApi: ChatCompleteAPI): OutputAPI; +export function createOutputApi(chatCompleteApi: ChatCompleteAPI) { + return ({ + id, + connectorId, + input, + schema, + system, + previousMessages, + functionCalling, + stream, + }: OutputOptions) => { + const response = chatCompleteApi({ connectorId, - system, + stream, functionCalling, + system, messages: ensureMultiTurn([ ...(previousMessages || []), { @@ -41,27 +53,42 @@ export function createOutputApi(chatCompleteApi: ChatCompleteAPI): OutputAPI { toolChoice: { function: 'structuredOutput' as const }, } : {}), - }).pipe( - withoutTokenCountEvents(), - map((event): OutputEvent => { - if (event.type === ChatCompletionEventType.ChatCompletionChunk) { + }); + + if (isObservable(response)) { + return response.pipe( + withoutTokenCountEvents(), + map((event) => { + if (event.type === ChatCompletionEventType.ChatCompletionChunk) { + return { + type: OutputEventType.OutputUpdate, + id, + content: event.content, + }; + } + return { - type: OutputEventType.OutputUpdate, id, + output: + event.toolCalls.length && 'arguments' in event.toolCalls[0].function + ? event.toolCalls[0].function.arguments + : undefined, content: event.content, + type: OutputEventType.OutputComplete, }; - } - + }) + ); + } else { + return response.then((chatResponse) => { return { id, + content: chatResponse.content, output: - event.toolCalls.length && 'arguments' in event.toolCalls[0].function - ? event.toolCalls[0].function.arguments + chatResponse.toolCalls.length && 'arguments' in chatResponse.toolCalls[0].function + ? chatResponse.toolCalls[0].function.arguments : undefined, - content: event.content, - type: OutputEventType.OutputComplete, }; - }) - ); + }); + } }; } diff --git a/x-pack/plugins/inference/public/chat_complete.test.ts b/x-pack/plugins/inference/public/chat_complete.test.ts new file mode 100644 index 0000000000000..b297db1f2fb2c --- /dev/null +++ b/x-pack/plugins/inference/public/chat_complete.test.ts @@ -0,0 +1,63 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import { omit } from 'lodash'; +import { httpServiceMock } from '@kbn/core/public/mocks'; +import { ChatCompleteAPI, MessageRole, ChatCompleteOptions } from '@kbn/inference-common'; +import { createChatCompleteApi } from './chat_complete'; + +describe('createChatCompleteApi', () => { + let http: ReturnType; + let chatComplete: ChatCompleteAPI; + + beforeEach(() => { + http = httpServiceMock.createStartContract(); + chatComplete = createChatCompleteApi({ http }); + }); + + it('calls http.post with the right parameters when stream is not true', async () => { + const params = { + connectorId: 'my-connector', + functionCalling: 'native', + system: 'system', + messages: [{ role: MessageRole.User, content: 'question' }], + }; + await chatComplete(params as ChatCompleteOptions); + + expect(http.post).toHaveBeenCalledTimes(1); + expect(http.post).toHaveBeenCalledWith('/internal/inference/chat_complete', { + body: expect.any(String), + }); + const callBody = http.post.mock.lastCall!; + + expect(JSON.parse((callBody as any[])[1].body as string)).toEqual(params); + }); + + it('calls http.post with the right parameters when stream is true', async () => { + http.post.mockResolvedValue({}); + + const params = { + connectorId: 'my-connector', + functionCalling: 'native', + stream: true, + system: 'system', + messages: [{ role: MessageRole.User, content: 'question' }], + }; + + await chatComplete(params as ChatCompleteOptions); + + expect(http.post).toHaveBeenCalledTimes(1); + expect(http.post).toHaveBeenCalledWith('/internal/inference/chat_complete/stream', { + asResponse: true, + rawResponse: true, + body: expect.any(String), + }); + const callBody = http.post.mock.lastCall!; + + expect(JSON.parse((callBody as any[])[1].body as string)).toEqual(omit(params, 'stream')); + }); +}); diff --git a/x-pack/plugins/inference/public/chat_complete.ts b/x-pack/plugins/inference/public/chat_complete.ts index 5319f7c31c381..64cb5533f20be 100644 --- a/x-pack/plugins/inference/public/chat_complete.ts +++ b/x-pack/plugins/inference/public/chat_complete.ts @@ -5,14 +5,31 @@ * 2.0. 
*/ -import { from } from 'rxjs'; import type { HttpStart } from '@kbn/core/public'; -import type { ChatCompleteAPI } from '@kbn/inference-common'; +import { + ChatCompleteAPI, + ChatCompleteCompositeResponse, + ChatCompleteOptions, + ToolOptions, +} from '@kbn/inference-common'; +import { from } from 'rxjs'; import type { ChatCompleteRequestBody } from '../common/http_apis'; import { httpResponseIntoObservable } from './util/http_response_into_observable'; -export function createChatCompleteApi({ http }: { http: HttpStart }): ChatCompleteAPI { - return ({ connectorId, messages, system, toolChoice, tools, functionCalling }) => { +export function createChatCompleteApi({ http }: { http: HttpStart }): ChatCompleteAPI; +export function createChatCompleteApi({ http }: { http: HttpStart }) { + return ({ + connectorId, + messages, + system, + toolChoice, + tools, + functionCalling, + stream, + }: ChatCompleteOptions): ChatCompleteCompositeResponse< + ToolOptions, + boolean + > => { const body: ChatCompleteRequestBody = { connectorId, system, @@ -22,12 +39,18 @@ export function createChatCompleteApi({ http }: { http: HttpStart }): ChatComple functionCalling, }; - return from( - http.post('/internal/inference/chat_complete', { - asResponse: true, - rawResponse: true, + if (stream) { + return from( + http.post('/internal/inference/chat_complete/stream', { + asResponse: true, + rawResponse: true, + body: JSON.stringify(body), + }) + ).pipe(httpResponseIntoObservable()); + } else { + return http.post('/internal/inference/chat_complete', { body: JSON.stringify(body), - }) - ).pipe(httpResponseIntoObservable()); + }); + } }; } diff --git a/x-pack/plugins/inference/scripts/evaluation/evaluation.ts b/x-pack/plugins/inference/scripts/evaluation/evaluation.ts index 0c70bf81eddad..425b7e334074a 100644 --- a/x-pack/plugins/inference/scripts/evaluation/evaluation.ts +++ b/x-pack/plugins/inference/scripts/evaluation/evaluation.ts @@ -89,8 +89,8 @@ function runEvaluations() { const evaluationClient = createInferenceEvaluationClient({ connectorId: connector.connectorId, evaluationConnectorId, - outputApi: (id, parameters) => - chatClient.output(id, { + outputApi: (parameters) => + chatClient.output({ ...parameters, connectorId: evaluationConnectorId, }) as any, diff --git a/x-pack/plugins/inference/scripts/evaluation/evaluation_client.ts b/x-pack/plugins/inference/scripts/evaluation/evaluation_client.ts index d35c214542255..99eaf02494af7 100644 --- a/x-pack/plugins/inference/scripts/evaluation/evaluation_client.ts +++ b/x-pack/plugins/inference/scripts/evaluation/evaluation_client.ts @@ -6,8 +6,7 @@ */ import { remove } from 'lodash'; -import { lastValueFrom } from 'rxjs'; -import { type OutputAPI, withoutOutputUpdateEvents } from '@kbn/inference-common'; +import type { OutputAPI } from '@kbn/inference-common'; import type { EvaluationResult } from './types'; export interface InferenceEvaluationClient { @@ -67,11 +66,12 @@ export function createInferenceEvaluationClient({ output: outputApi, getEvaluationConnectorId: () => evaluationConnectorId, evaluate: async ({ input, criteria = [], system }) => { - const evaluation = await lastValueFrom( - outputApi('evaluate', { - connectorId, - system: withAdditionalSystemContext( - `You are a helpful, respected assistant for evaluating task + const evaluation = await outputApi({ + id: 'evaluate', + stream: false, + connectorId, + system: withAdditionalSystemContext( + `You are a helpful, respected assistant for evaluating task inputs and outputs in the Elastic Platform. 
Your goal is to verify whether the output of a task @@ -84,10 +84,10 @@ export function createInferenceEvaluationClient({ quoting what the assistant did wrong, where it could improve, and what the root cause was in case of a failure. `, - system - ), + system + ), - input: ` + input: ` ## Criteria ${criteria @@ -99,37 +99,36 @@ export function createInferenceEvaluationClient({ ## Input ${input}`, - schema: { - type: 'object', - properties: { - criteria: { - type: 'array', - items: { - type: 'object', - properties: { - index: { - type: 'number', - description: 'The number of the criterion', - }, - score: { - type: 'number', - description: - 'The score you calculated for the criterion, between 0 (criterion fully failed) and 1 (criterion fully succeeded).', - }, - reasoning: { - type: 'string', - description: - 'Your reasoning for the score. Explain your score by mentioning what you expected to happen and what did happen.', - }, + schema: { + type: 'object', + properties: { + criteria: { + type: 'array', + items: { + type: 'object', + properties: { + index: { + type: 'number', + description: 'The number of the criterion', + }, + score: { + type: 'number', + description: + 'The score you calculated for the criterion, between 0 (criterion fully failed) and 1 (criterion fully succeeded).', + }, + reasoning: { + type: 'string', + description: + 'Your reasoning for the score. Explain your score by mentioning what you expected to happen and what did happen.', }, - required: ['index', 'score', 'reasoning'], }, + required: ['index', 'score', 'reasoning'], }, }, - required: ['criteria'], - } as const, - }).pipe(withoutOutputUpdateEvents()) - ); + }, + required: ['criteria'], + } as const, + }); const scoredCriteria = evaluation.output.criteria; diff --git a/x-pack/plugins/inference/scripts/evaluation/scenarios/esql/index.spec.ts b/x-pack/plugins/inference/scripts/evaluation/scenarios/esql/index.spec.ts index d9071b3f0ae3f..49a82db8124e9 100644 --- a/x-pack/plugins/inference/scripts/evaluation/scenarios/esql/index.spec.ts +++ b/x-pack/plugins/inference/scripts/evaluation/scenarios/esql/index.spec.ts @@ -9,8 +9,7 @@ import expect from '@kbn/expect'; import type { Logger } from '@kbn/logging'; -import { firstValueFrom, lastValueFrom, filter } from 'rxjs'; -import { isOutputCompleteEvent } from '@kbn/inference-common'; +import { lastValueFrom } from 'rxjs'; import { naturalLanguageToEsql } from '../../../../server/tasks/nl_to_esql'; import { chatClient, evaluationClient, logger } from '../../services'; import { EsqlDocumentBase } from '../../../../server/tasks/nl_to_esql/doc_base'; @@ -66,11 +65,10 @@ const retrieveUsedCommands = async ({ answer: string; esqlDescription: string; }) => { - const commandsListOutput = await firstValueFrom( - evaluationClient - .output('retrieve_commands', { - connectorId: evaluationClient.getEvaluationConnectorId(), - system: ` + const commandsListOutput = await evaluationClient.output({ + id: 'retrieve_commands', + connectorId: evaluationClient.getEvaluationConnectorId(), + system: ` You are a helpful, respected Elastic ES|QL assistant. 
Your role is to enumerate the list of ES|QL commands and functions that were used @@ -82,34 +80,32 @@ const retrieveUsedCommands = async ({ ${esqlDescription} `, - input: ` + input: ` # Question ${question} # Answer ${answer} `, - schema: { - type: 'object', - properties: { - commands: { - description: - 'The list of commands that were used in the provided ES|QL question and answer', - type: 'array', - items: { type: 'string' }, - }, - functions: { - description: - 'The list of functions that were used in the provided ES|QL question and answer', - type: 'array', - items: { type: 'string' }, - }, - }, - required: ['commands', 'functions'], - } as const, - }) - .pipe(filter(isOutputCompleteEvent)) - ); + schema: { + type: 'object', + properties: { + commands: { + description: + 'The list of commands that were used in the provided ES|QL question and answer', + type: 'array', + items: { type: 'string' }, + }, + functions: { + description: + 'The list of functions that were used in the provided ES|QL question and answer', + type: 'array', + items: { type: 'string' }, + }, + }, + required: ['commands', 'functions'], + } as const, + }); const output = commandsListOutput.output; diff --git a/x-pack/plugins/inference/scripts/load_esql_docs/utils/output_executor.ts b/x-pack/plugins/inference/scripts/load_esql_docs/utils/output_executor.ts index 62cfd8f877e3f..f4014db0e6e8d 100644 --- a/x-pack/plugins/inference/scripts/load_esql_docs/utils/output_executor.ts +++ b/x-pack/plugins/inference/scripts/load_esql_docs/utils/output_executor.ts @@ -5,7 +5,6 @@ * 2.0. */ -import { lastValueFrom } from 'rxjs'; import type { OutputAPI } from '@kbn/inference-common'; export interface Prompt { @@ -27,13 +26,13 @@ export type PromptCallerFactory = ({ export const bindOutput: PromptCallerFactory = ({ connectorId, output }) => { return async ({ input, system }) => { - const response = await lastValueFrom( - output('', { - connectorId, - input, - system, - }) - ); + const response = await output({ + id: 'output', + connectorId, + input, + system, + }); + return response.content ?? 
''; }; }; diff --git a/x-pack/plugins/inference/scripts/util/kibana_client.ts b/x-pack/plugins/inference/scripts/util/kibana_client.ts index b599ab81a4af4..ad6c21cf4b248 100644 --- a/x-pack/plugins/inference/scripts/util/kibana_client.ts +++ b/x-pack/plugins/inference/scripts/util/kibana_client.ts @@ -15,6 +15,7 @@ import { inspect } from 'util'; import { isReadable } from 'stream'; import { ChatCompleteAPI, + ChatCompleteCompositeResponse, OutputAPI, ChatCompletionEvent, InferenceTaskError, @@ -22,6 +23,8 @@ import { InferenceTaskEventType, createInferenceInternalError, withoutOutputUpdateEvents, + type ToolOptions, + ChatCompleteOptions, } from '@kbn/inference-common'; import type { ChatCompleteRequestBody } from '../../common/http_apis'; import type { InferenceConnector } from '../../common/connectors'; @@ -154,7 +157,7 @@ export class KibanaClient { } createInferenceClient({ connectorId }: { connectorId: string }): ScriptInferenceClient { - function stream(responsePromise: Promise) { + function streamResponse(responsePromise: Promise) { return from(responsePromise).pipe( switchMap((response) => { if (isReadable(response.data)) { @@ -174,14 +177,18 @@ export class KibanaClient { ); } - const chatCompleteApi: ChatCompleteAPI = ({ + const chatCompleteApi: ChatCompleteAPI = < + TToolOptions extends ToolOptions = ToolOptions, + TStream extends boolean = false + >({ connectorId: chatCompleteConnectorId, messages, system, toolChoice, tools, functionCalling, - }) => { + stream, + }: ChatCompleteOptions) => { const body: ChatCompleteRequestBody = { connectorId: chatCompleteConnectorId, system, @@ -191,15 +198,29 @@ export class KibanaClient { functionCalling, }; - return stream( - this.axios.post( - this.getUrl({ - pathname: `/internal/inference/chat_complete`, - }), - body, - { responseType: 'stream', timeout: NaN } - ) - ); + if (stream) { + return streamResponse( + this.axios.post( + this.getUrl({ + pathname: `/internal/inference/chat_complete/stream`, + }), + body, + { responseType: 'stream', timeout: NaN } + ) + ) as ChatCompleteCompositeResponse; + } else { + return this.axios + .post( + this.getUrl({ + pathname: `/internal/inference/chat_complete/stream`, + }), + body, + { responseType: 'stream', timeout: NaN } + ) + .then((response) => { + return response.data; + }) as ChatCompleteCompositeResponse; + } }; const outputApi: OutputAPI = createOutputApi(chatCompleteApi); @@ -211,8 +232,13 @@ export class KibanaClient { ...options, }); }, - output: (id, options) => { - return outputApi(id, { ...options }).pipe(withoutOutputUpdateEvents()); + output: (options) => { + const response = outputApi({ ...options }); + if (options.stream) { + return (response as any).pipe(withoutOutputUpdateEvents()); + } else { + return response; + } }, }; } diff --git a/x-pack/plugins/inference/server/chat_complete/api.ts b/x-pack/plugins/inference/server/chat_complete/api.ts index 62a1ea8b26146..cf325e72ddf3a 100644 --- a/x-pack/plugins/inference/server/chat_complete/api.ts +++ b/x-pack/plugins/inference/server/chat_complete/api.ts @@ -11,32 +11,37 @@ import type { Logger } from '@kbn/logging'; import type { KibanaRequest } from '@kbn/core-http-server'; import { type ChatCompleteAPI, - type ChatCompletionResponse, + type ChatCompleteCompositeResponse, createInferenceRequestError, + type ToolOptions, + ChatCompleteOptions, } from '@kbn/inference-common'; import type { InferenceStartDependencies } from '../types'; import { getConnectorById } from '../util/get_connector_by_id'; import { getInferenceAdapter } from 
'./adapters'; -import { createInferenceExecutor, chunksIntoMessage } from './utils'; +import { createInferenceExecutor, chunksIntoMessage, streamToResponse } from './utils'; -export function createChatCompleteApi({ - request, - actions, - logger, -}: { +interface CreateChatCompleteApiOptions { request: KibanaRequest; actions: InferenceStartDependencies['actions']; logger: Logger; -}) { - const chatCompleteAPI: ChatCompleteAPI = ({ +} + +export function createChatCompleteApi(options: CreateChatCompleteApiOptions): ChatCompleteAPI; +export function createChatCompleteApi({ request, actions, logger }: CreateChatCompleteApiOptions) { + return ({ connectorId, messages, toolChoice, tools, system, functionCalling, - }): ChatCompletionResponse => { - return defer(async () => { + stream, + }: ChatCompleteOptions): ChatCompleteCompositeResponse< + ToolOptions, + boolean + > => { + const obs$ = defer(async () => { const actionsClient = await actions.getActionsClientWithRequest(request); const connector = await getConnectorById({ connectorId, actionsClient }); const executor = createInferenceExecutor({ actionsClient, connector }); @@ -73,7 +78,11 @@ export function createChatCompleteApi({ logger, }) ); - }; - return chatCompleteAPI; + if (stream) { + return obs$; + } else { + return streamToResponse(obs$); + } + }; } diff --git a/x-pack/plugins/inference/server/chat_complete/utils/index.ts b/x-pack/plugins/inference/server/chat_complete/utils/index.ts index dea2ac65f4755..d3dc2010cba3a 100644 --- a/x-pack/plugins/inference/server/chat_complete/utils/index.ts +++ b/x-pack/plugins/inference/server/chat_complete/utils/index.ts @@ -12,3 +12,4 @@ export { type InferenceExecutor, } from './inference_executor'; export { chunksIntoMessage } from './chunks_into_message'; +export { streamToResponse } from './stream_to_response'; diff --git a/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.test.ts b/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.test.ts new file mode 100644 index 0000000000000..939997a5fef15 --- /dev/null +++ b/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.test.ts @@ -0,0 +1,73 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. 
+ */ + +import { of } from 'rxjs'; +import { ChatCompletionEvent } from '@kbn/inference-common'; +import { chunkEvent, tokensEvent, messageEvent } from '../../test_utils/chat_complete_events'; +import { streamToResponse } from './stream_to_response'; + +describe('streamToResponse', () => { + function fromEvents(...events: ChatCompletionEvent[]) { + return of(...events); + } + + it('returns a response with token count if both message and token events got emitted', async () => { + const response = await streamToResponse( + fromEvents( + chunkEvent('chunk_1'), + chunkEvent('chunk_2'), + tokensEvent({ prompt: 1, completion: 2, total: 3 }), + messageEvent('message') + ) + ); + + expect(response).toEqual({ + content: 'message', + tokens: { + completion: 2, + prompt: 1, + total: 3, + }, + toolCalls: [], + }); + }); + + it('returns a response with tool calls if present', async () => { + const someToolCall = { + toolCallId: '42', + function: { + name: 'my_tool', + arguments: {}, + }, + }; + const response = await streamToResponse( + fromEvents(chunkEvent('chunk_1'), messageEvent('message', [someToolCall])) + ); + + expect(response).toEqual({ + content: 'message', + toolCalls: [someToolCall], + }); + }); + + it('returns a response without token count if only message got emitted', async () => { + const response = await streamToResponse( + fromEvents(chunkEvent('chunk_1'), messageEvent('message')) + ); + + expect(response).toEqual({ + content: 'message', + toolCalls: [], + }); + }); + + it('rejects an error if message event is not emitted', async () => { + await expect( + streamToResponse(fromEvents(chunkEvent('chunk_1'), tokensEvent())) + ).rejects.toThrowErrorMatchingInlineSnapshot(`"No message event found"`); + }); +}); diff --git a/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.ts b/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.ts new file mode 100644 index 0000000000000..4bae4fda767cb --- /dev/null +++ b/x-pack/plugins/inference/server/chat_complete/utils/stream_to_response.ts @@ -0,0 +1,42 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import { toArray, map, firstValueFrom } from 'rxjs'; +import { + ChatCompleteResponse, + ChatCompleteStreamResponse, + createInferenceInternalError, + isChatCompletionMessageEvent, + isChatCompletionTokenCountEvent, + ToolOptions, + withoutChunkEvents, +} from '@kbn/inference-common'; + +export const streamToResponse = ( + streamResponse$: ChatCompleteStreamResponse +): Promise> => { + return firstValueFrom( + streamResponse$.pipe( + withoutChunkEvents(), + toArray(), + map((events) => { + const messageEvent = events.find(isChatCompletionMessageEvent); + const tokenEvent = events.find(isChatCompletionTokenCountEvent); + + if (!messageEvent) { + throw createInferenceInternalError('No message event found'); + } + + return { + content: messageEvent.content, + toolCalls: messageEvent.toolCalls, + tokens: tokenEvent?.tokens, + }; + }) + ) + ); +}; diff --git a/x-pack/plugins/inference/server/routes/chat_complete.ts b/x-pack/plugins/inference/server/routes/chat_complete.ts index d4d0d012a78cd..582d4ceb97d45 100644 --- a/x-pack/plugins/inference/server/routes/chat_complete.ts +++ b/x-pack/plugins/inference/server/routes/chat_complete.ts @@ -5,8 +5,14 @@ * 2.0. 
*/ -import { schema, Type } from '@kbn/config-schema'; -import type { CoreSetup, IRouter, Logger, RequestHandlerContext } from '@kbn/core/server'; +import { schema, Type, TypeOf } from '@kbn/config-schema'; +import type { + CoreSetup, + IRouter, + Logger, + RequestHandlerContext, + KibanaRequest, +} from '@kbn/core/server'; import { MessageRole, ToolCall, ToolChoiceType } from '@kbn/inference-common'; import type { ChatCompleteRequestBody } from '../../common/http_apis'; import { createInferenceClient } from '../inference_client'; @@ -84,6 +90,32 @@ export function registerChatCompleteRoute({ router: IRouter; logger: Logger; }) { + async function callChatComplete({ + request, + stream, + }: { + request: KibanaRequest>; + stream: T; + }) { + const actions = await coreSetup + .getStartServices() + .then(([coreStart, pluginsStart]) => pluginsStart.actions); + + const client = createInferenceClient({ request, actions, logger }); + + const { connectorId, messages, system, toolChoice, tools, functionCalling } = request.body; + + return client.chatComplete({ + connectorId, + messages, + system, + toolChoice, + tools, + functionCalling, + stream, + }); + } + router.post( { path: '/internal/inference/chat_complete', @@ -92,23 +124,22 @@ export function registerChatCompleteRoute({ }, }, async (context, request, response) => { - const actions = await coreSetup - .getStartServices() - .then(([coreStart, pluginsStart]) => pluginsStart.actions); - - const client = createInferenceClient({ request, actions, logger }); - - const { connectorId, messages, system, toolChoice, tools, functionCalling } = request.body; - - const chatCompleteResponse = client.chatComplete({ - connectorId, - messages, - system, - toolChoice, - tools, - functionCalling, + const chatCompleteResponse = await callChatComplete({ request, stream: false }); + return response.ok({ + body: chatCompleteResponse, }); + } + ); + router.post( + { + path: '/internal/inference/chat_complete/stream', + validate: { + body: chatCompleteBodySchema, + }, + }, + async (context, request, response) => { + const chatCompleteResponse = await callChatComplete({ request, stream: true }); return response.ok({ body: observableIntoEventSourceStream(chatCompleteResponse, logger), }); diff --git a/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/generate_esql.ts b/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/generate_esql.ts index 26a8fb63ce013..3d8701eba72db 100644 --- a/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/generate_esql.ts +++ b/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/generate_esql.ts @@ -71,6 +71,7 @@ export const generateEsqlTask = ({ chatCompleteApi({ connectorId, functionCalling, + stream: true, system: `${systemMessage} # Current task diff --git a/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/request_documentation.ts b/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/request_documentation.ts index aea428208be1d..06e75db09bdc9 100644 --- a/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/request_documentation.ts +++ b/x-pack/plugins/inference/server/tasks/nl_to_esql/actions/request_documentation.ts @@ -33,8 +33,10 @@ export const requestDocumentation = ({ }) => { const hasTools = !isEmpty(tools) && toolChoice !== ToolChoiceType.none; - return outputApi('request_documentation', { + return outputApi({ + id: 'request_documentation', connectorId, + stream: true, functionCalling, system, previousMessages: messages, diff --git 
a/x-pack/plugins/inference/server/test_utils/chat_complete_events.ts b/x-pack/plugins/inference/server/test_utils/chat_complete_events.ts new file mode 100644 index 0000000000000..4b09ca9c4dc5a --- /dev/null +++ b/x-pack/plugins/inference/server/test_utils/chat_complete_events.ts @@ -0,0 +1,39 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import { + ChatCompletionChunkEvent, + ChatCompletionEventType, + ChatCompletionTokenCountEvent, + ChatCompletionMessageEvent, + ChatCompletionTokenCount, + ToolCall, +} from '@kbn/inference-common'; + +export const chunkEvent = (content: string = 'chunk'): ChatCompletionChunkEvent => ({ + type: ChatCompletionEventType.ChatCompletionChunk, + content, + tool_calls: [], +}); + +export const messageEvent = ( + content: string = 'message', + toolCalls: Array> = [] +): ChatCompletionMessageEvent => ({ + type: ChatCompletionEventType.ChatCompletionMessage, + content, + toolCalls, +}); + +export const tokensEvent = (tokens?: ChatCompletionTokenCount): ChatCompletionTokenCountEvent => ({ + type: ChatCompletionEventType.ChatCompletionTokenCount, + tokens: { + prompt: tokens?.prompt ?? 10, + completion: tokens?.completion ?? 20, + total: tokens?.total ?? 30, + }, +}); diff --git a/x-pack/plugins/ml/public/application/data_frame_analytics/pages/analytics_creation/components/configuration_step/configuration_step_form.tsx b/x-pack/plugins/ml/public/application/data_frame_analytics/pages/analytics_creation/components/configuration_step/configuration_step_form.tsx index 3212eba8b2ddd..7fd678e98f6fd 100644 --- a/x-pack/plugins/ml/public/application/data_frame_analytics/pages/analytics_creation/components/configuration_step/configuration_step_form.tsx +++ b/x-pack/plugins/ml/public/application/data_frame_analytics/pages/analytics_creation/components/configuration_step/configuration_step_form.tsx @@ -576,6 +576,7 @@ export const ConfigurationStepForm: FC = ({ fieldStatsServices={fieldStatsServices} timeRangeMs={indexData.timeRangeMs} dslQuery={jobConfigQuery} + theme={services.theme} > diff --git a/x-pack/plugins/ml/public/application/jobs/new_job/pages/new_job/wizard_steps.tsx b/x-pack/plugins/ml/public/application/jobs/new_job/pages/new_job/wizard_steps.tsx index 6c4600be5d25e..b44c523bc57cf 100644 --- a/x-pack/plugins/ml/public/application/jobs/new_job/pages/new_job/wizard_steps.tsx +++ b/x-pack/plugins/ml/public/application/jobs/new_job/pages/new_job/wizard_steps.tsx @@ -123,6 +123,7 @@ export const WizardSteps: FC = ({ currentStep, setCurrentStep }) => { fieldStatsServices={fieldStatsServices} timeRangeMs={timeRangeMs} dslQuery={jobCreator.query} + theme={services.theme} > <> diff --git a/x-pack/plugins/observability_solution/infra/public/pages/metrics/metric_detail/lib/get_filtered_metrics.ts b/x-pack/plugins/observability_solution/infra/public/pages/metrics/metric_detail/lib/get_filtered_metrics.ts index 36433a2bca016..a6cfd30eaa26d 100644 --- a/x-pack/plugins/observability_solution/infra/public/pages/metrics/metric_detail/lib/get_filtered_metrics.ts +++ b/x-pack/plugins/observability_solution/infra/public/pages/metrics/metric_detail/lib/get_filtered_metrics.ts @@ -18,11 +18,13 @@ export const getFilteredMetrics = ( .filter((data) => data && data.source === 'metrics') .map((data) => data && data.name); return requiredMetrics.filter((metric) => { - 
const metricModelCreator = metrics.tsvb[metric]; + const metricModelCreator = metrics?.tsvb[metric] ?? null; // We just need to get a dummy version of the model so we can filter // using the `requires` attribute. - const metricModel = metricModelCreator(TIMESTAMP_FIELD, 'test', '>=1m'); - return metricMetadata.some((m) => m && metricModel.requires.includes(m)); + const metricModel = metricModelCreator + ? metricModelCreator(TIMESTAMP_FIELD, 'test', '>=1m') + : { requires: [''] }; // when tsvb is not defined (host & container) + return metricMetadata.some((m) => m && metricModel?.requires?.includes(m)); }); }; diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/index.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/index.ts index faa848192fd46..44fe7d0cb1a8c 100644 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/index.ts +++ b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/index.ts @@ -10,16 +10,6 @@ import { cpu } from './snapshot/cpu'; import { memory } from './snapshot/memory'; import { rx } from './snapshot/rx'; import { tx } from './snapshot/tx'; -import { containerCpuKernel } from './tsvb/container_cpu_kernel'; -import { containerCpuUsage } from './tsvb/container_cpu_usage'; -import { containerDiskIOOps } from './tsvb/container_diskio_ops'; -import { containerDiskIOBytes } from './tsvb/container_disk_io_bytes'; -import { containerK8sCpuUsage } from './tsvb/container_k8s_cpu_usage'; -import { containerK8sMemoryUsage } from './tsvb/container_k8s_memory_usage'; -import { containerK8sOverview } from './tsvb/container_k8s_overview'; -import { containerMemory } from './tsvb/container_memory'; -import { containerNetworkTraffic } from './tsvb/container_network_traffic'; -import { containerOverview } from './tsvb/container_overview'; import type { ContainerFormulas } from './formulas'; import { ContainerCharts } from './charts'; @@ -30,18 +20,6 @@ export const containerSnapshotMetricTypes = Object.keys(containerSnapshotMetrics >; export const metrics: InventoryMetricsWithCharts = { - tsvb: { - containerOverview, - containerCpuUsage, - containerCpuKernel, - containerDiskIOOps, - containerDiskIOBytes, - containerNetworkTraffic, - containerMemory, - containerK8sCpuUsage, - containerK8sOverview, - containerK8sMemoryUsage, - }, snapshot: containerSnapshotMetrics, getFormulas: async () => await import('./formulas').then(({ formulas }) => formulas), getCharts: async () => await import('./charts').then(({ charts }) => charts), diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_kernel.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_kernel.ts deleted file mode 100644 index f469a9e86ad49..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_kernel.ts +++ /dev/null @@ -1,34 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerCpuKernel: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerCpuKernel', - requires: ['docker.cpu'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'kernel', - split_mode: 'everything', - metrics: [ - { - field: 'docker.cpu.kernel.pct', - id: 'avg-cpu-kernel', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_usage.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_usage.ts deleted file mode 100644 index f4efc7de43663..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_cpu_usage.ts +++ /dev/null @@ -1,34 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerCpuUsage: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerCpuUsage', - requires: ['docker.cpu'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'cpu', - split_mode: 'everything', - metrics: [ - { - field: 'docker.cpu.total.pct', - id: 'avg-cpu-total', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_disk_io_bytes.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_disk_io_bytes.ts deleted file mode 100644 index 7320349f92e8d..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_disk_io_bytes.ts +++ /dev/null @@ -1,81 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerDiskIOBytes: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerDiskIOBytes', - requires: ['docker.diskio'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'read', - split_mode: 'everything', - metrics: [ - { - field: 'docker.diskio.read.bytes', - id: 'max-diskio-read-bytes', - type: 'max', - }, - { - field: 'max-diskio-read-bytes', - id: 'deriv-max-diskio-read-bytes', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-diskio-read-bytes', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-diskio-read-bytes' }], - script: 'params.rate > 0.0 ? 
params.rate : 0.0', - }, - ], - }, - { - id: 'write', - split_mode: 'everything', - metrics: [ - { - field: 'docker.diskio.write.bytes', - id: 'max-diskio-write-bytes', - type: 'max', - }, - { - field: 'max-diskio-write-bytes', - id: 'deriv-max-diskio-write-bytes', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-diskio-write-bytes', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-diskio-write-bytes' }], - script: 'params.rate > 0.0 ? params.rate : 0.0', - }, - { - id: 'calc-invert-rate', - script: 'params.rate * -1', - type: 'calculation', - variables: [ - { - field: 'posonly-deriv-max-diskio-write-bytes', - id: 'var-rate', - name: 'rate', - }, - ], - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_diskio_ops.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_diskio_ops.ts deleted file mode 100644 index a6887fb4e7638..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_diskio_ops.ts +++ /dev/null @@ -1,81 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerDiskIOOps: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerDiskIOOps', - requires: ['docker.diskio'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'read', - split_mode: 'everything', - metrics: [ - { - field: 'docker.diskio.read.ops', - id: 'max-diskio-read-ops', - type: 'max', - }, - { - field: 'max-diskio-read-ops', - id: 'deriv-max-diskio-read-ops', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-diskio-read-ops', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-diskio-read-ops' }], - script: 'params.rate > 0.0 ? params.rate : 0.0', - }, - ], - }, - { - id: 'write', - split_mode: 'everything', - metrics: [ - { - field: 'docker.diskio.write.ops', - id: 'max-diskio-write-ops', - type: 'max', - }, - { - field: 'max-diskio-write-ops', - id: 'deriv-max-diskio-write-ops', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-diskio-write-ops', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-diskio-write-ops' }], - script: 'params.rate > 0.0 ? 
params.rate : 0.0', - }, - { - id: 'calc-invert-rate', - script: 'params.rate * -1', - type: 'calculation', - variables: [ - { - field: 'posonly-deriv-max-diskio-write-ops', - id: 'var-rate', - name: 'rate', - }, - ], - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_cpu_usage.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_cpu_usage.ts deleted file mode 100644 index 67372f50d750b..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_cpu_usage.ts +++ /dev/null @@ -1,34 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerK8sCpuUsage: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerK8sCpuUsage', - requires: ['kubernetes.container'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'cpu', - split_mode: 'everything', - metrics: [ - { - field: 'kubernetes.container.cpu.usage.limit.pct', - id: 'avg-cpu', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_memory_usage.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_memory_usage.ts deleted file mode 100644 index 066d993817b75..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_memory_usage.ts +++ /dev/null @@ -1,34 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerK8sMemoryUsage: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerK8sMemoryUsage', - requires: ['kubernetes.container'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'memory', - split_mode: 'everything', - metrics: [ - { - field: 'kubernetes.container.memory.usage.limit.pct', - id: 'avg-memory', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_overview.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_overview.ts deleted file mode 100644 index 470704140efb4..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_k8s_overview.ts +++ /dev/null @@ -1,45 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. 
Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerK8sOverview: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerK8sOverview', - requires: ['kubernetes.container'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'cpu', - split_mode: 'everything', - metrics: [ - { - field: 'kubernetes.container.cpu.usage.limit.pct', - id: 'avg-cpu-total', - type: 'avg', - }, - ], - }, - { - id: 'memory', - split_mode: 'everything', - metrics: [ - { - field: 'kubernetes.container.memory.usage.limit.pct', - id: 'avg-memory-total', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_memory.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_memory.ts deleted file mode 100644 index e0572692d60fc..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_memory.ts +++ /dev/null @@ -1,34 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerMemory: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerMemory', - requires: ['docker.memory'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'memory', - split_mode: 'everything', - metrics: [ - { - field: 'docker.memory.usage.pct', - id: 'avg-memory', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_network_traffic.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_network_traffic.ts deleted file mode 100644 index d957d51a41648..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_network_traffic.ts +++ /dev/null @@ -1,93 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerNetworkTraffic: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerNetworkTraffic', - requires: ['docker.network'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'tx', - metrics: [ - { - field: 'docker.network.outbound.bytes', - id: 'max-net-out', - type: 'max', - }, - { - field: 'max-net-out', - id: 'deriv-max-net-out', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-net-out', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-net-out' }], - script: 'params.rate > 0.0 ? params.rate : 0.0', - }, - { - function: 'sum', - id: 'seriesagg-sum', - type: 'series_agg', - }, - ], - split_mode: 'terms', - terms_field: 'docker.network.interface', - }, - { - id: 'rx', - metrics: [ - { - field: 'docker.network.inbound.bytes', - id: 'max-net-in', - type: 'max', - }, - { - field: 'max-net-in', - id: 'deriv-max-net-in', - type: 'derivative', - unit: '1s', - }, - { - id: 'posonly-deriv-max-net-in', - type: 'calculation', - variables: [{ id: 'var-rate', name: 'rate', field: 'deriv-max-net-in' }], - script: 'params.rate > 0.0 ? params.rate : 0.0', - }, - { - id: 'calc-invert-rate', - script: 'params.rate * -1', - type: 'calculation', - variables: [ - { - field: 'posonly-deriv-max-net-in', - id: 'var-rate', - name: 'rate', - }, - ], - }, - { - function: 'sum', - id: 'seriesagg-sum', - type: 'series_agg', - }, - ], - split_mode: 'terms', - terms_field: 'docker.network.interface', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_overview.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_overview.ts deleted file mode 100644 index c2d234be2eed7..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/container/metrics/tsvb/container_overview.ts +++ /dev/null @@ -1,67 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const containerOverview: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'containerOverview', - requires: ['docker'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'cpu', - split_mode: 'everything', - metrics: [ - { - field: 'docker.cpu.total.pct', - id: 'avg-cpu-total', - type: 'avg', - }, - ], - }, - { - id: 'memory', - split_mode: 'everything', - metrics: [ - { - field: 'docker.memory.usage.pct', - id: 'avg-memory', - type: 'avg', - }, - ], - }, - { - id: 'tx', - split_mode: 'everything', - metrics: [ - { - field: 'docker.network.out.bytes', - id: 'avg-network-out', - type: 'avg', - }, - ], - }, - { - id: 'rx', - split_mode: 'everything', - metrics: [ - { - field: 'docker.network.in.bytes', - id: 'avg-network-in', - type: 'avg', - }, - ], - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/index.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/index.ts index ce56e7bd59b0b..82408084e9472 100644 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/index.ts +++ b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/index.ts @@ -5,7 +5,6 @@ * 2.0. */ import { snapshot } from './snapshot'; -import { tsvb } from './tsvb'; import { InventoryMetricsWithCharts } from '../../types'; import type { HostFormulas } from './formulas'; import type { HostCharts } from './charts'; @@ -18,7 +17,6 @@ export const hostSnapshotMetricTypes = Object.keys(exposedHostSnapshotMetrics) a >; export const metrics: InventoryMetricsWithCharts = { - tsvb, snapshot, getFormulas: async () => await import('./formulas').then(({ formulas }) => formulas), getCharts: async () => await import('./charts').then(({ charts }) => charts), diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_cpu_usage.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_cpu_usage.ts deleted file mode 100644 index bcafeb4ebc4cf..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_cpu_usage.ts +++ /dev/null @@ -1,254 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostCpuUsage: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostCpuUsage', - requires: ['system.cpu'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'user', - metrics: [ - { - field: 'system.cpu.user.pct', - id: 'avg-cpu-user', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-user', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'system', - metrics: [ - { - field: 'system.cpu.system.pct', - id: 'avg-cpu-system', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-system', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'steal', - metrics: [ - { - field: 'system.cpu.steal.pct', - id: 'avg-cpu-steal', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'avg-cpu-steal', - id: 'var-avg', - name: 'avg', - }, - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'irq', - metrics: [ - { - field: 'system.cpu.irq.pct', - id: 'avg-cpu-irq', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-irq', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'softirq', - metrics: [ - { - field: 'system.cpu.softirq.pct', - id: 'avg-cpu-softirq', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-softirq', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'iowait', - metrics: [ - { - field: 'system.cpu.iowait.pct', - id: 'avg-cpu-iowait', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', - id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-iowait', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'nice', - metrics: [ - { - field: 'system.cpu.nice.pct', - id: 'avg-cpu-nice', - type: 'avg', - }, - { - field: 'system.cpu.cores', - id: 'max-cpu-cores', - type: 'max', - }, - { - id: 'calc-avg-cores', - script: 'params.avg / params.cores', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cores', 
- id: 'var-cores', - name: 'cores', - }, - { - field: 'avg-cpu-nice', - id: 'var-avg', - name: 'avg', - }, - ], - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_info.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_info.ts deleted file mode 100644 index 4cc2b574362d7..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_info.ts +++ /dev/null @@ -1,56 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostDockerInfo: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostDockerInfo', - requires: ['docker.info'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'running', - metrics: [ - { - field: 'docker.info.containers.running', - id: 'max-running', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'paused', - metrics: [ - { - field: 'docker.info.containers.paused', - id: 'max-paused', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'stopped', - metrics: [ - { - field: 'docker.info.containers.stopped', - id: 'max-stopped', - type: 'max', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_overview.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_overview.ts deleted file mode 100644 index df56a21dbf5b7..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_overview.ts +++ /dev/null @@ -1,67 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostDockerOverview: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostDockerOverview', - requires: ['docker.info'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'top_n', - series: [ - { - id: 'total', - metrics: [ - { - field: 'docker.info.containers.total', - id: 'max-total', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'running', - metrics: [ - { - field: 'docker.info.containers.running', - id: 'max-running', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'paused', - metrics: [ - { - field: 'docker.info.containers.paused', - id: 'max-paused', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'stopped', - metrics: [ - { - field: 'docker.info.containers.stopped', - id: 'max-stopped', - type: 'max', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_cpu.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_cpu.ts deleted file mode 100644 index fd9eb97419736..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_cpu.ts +++ /dev/null @@ -1,37 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostDockerTop5ByCpu: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostDockerTop5ByCpu', - requires: ['docker.cpu'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'avg-cpu', - metrics: [ - { - field: 'docker.cpu.total.pct', - id: 'avg-cpu-metric', - type: 'avg', - }, - ], - split_mode: 'terms', - terms_field: 'container.name', - terms_order_by: 'avg-cpu', - terms_size: 5, - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_memory.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_memory.ts deleted file mode 100644 index cad828671d232..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_docker_top_5_by_memory.ts +++ /dev/null @@ -1,37 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostDockerTop5ByMemory: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostDockerTop5ByMemory', - requires: ['docker.memory'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'avg-memory', - metrics: [ - { - field: 'docker.memory.usage.pct', - id: 'avg-memory-metric', - type: 'avg', - }, - ], - split_mode: 'terms', - terms_field: 'container.name', - terms_order_by: 'avg-memory', - terms_size: 5, - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_filesystem.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_filesystem.ts deleted file mode 100644 index ce284345410fc..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_filesystem.ts +++ /dev/null @@ -1,38 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostFilesystem: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostFilesystem', - requires: ['system.filesystem'], - filter: 'system.filesystem.device_name:\\/*', - index_pattern: indexPattern, - time_field: timeField, - interval, - type: 'timeseries', - series: [ - { - id: 'used', - metrics: [ - { - field: 'system.filesystem.used.pct', - id: 'avg-filesystem-used', - type: 'avg', - }, - ], - split_mode: 'terms', - terms_field: 'system.filesystem.device_name', - terms_order_by: 'used', - terms_size: 5, - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_cpu_cap.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_cpu_cap.ts deleted file mode 100644 index d33e4cdeb34cd..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_cpu_cap.ts +++ /dev/null @@ -1,58 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostK8sCpuCap: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostK8sCpuCap', - map_field_to: 'kubernetes.node.name', - requires: ['kubernetes.node'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'capacity', - metrics: [ - { - field: 'kubernetes.node.cpu.allocatable.cores', - id: 'max-cpu-cap', - type: 'max', - }, - { - id: 'calc-nanocores', - type: 'calculation', - variables: [ - { - id: 'var-cores', - field: 'max-cpu-cap', - name: 'cores', - }, - ], - script: 'params.cores * 1000000000', - }, - ], - split_mode: 'everything', - }, - { - id: 'used', - metrics: [ - { - field: 'kubernetes.node.cpu.usage.nanocores', - id: 'avg-cpu-usage', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_disk_cap.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_disk_cap.ts deleted file mode 100644 index e9e512a136631..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_disk_cap.ts +++ /dev/null @@ -1,45 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; -export const hostK8sDiskCap: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostK8sDiskCap', - map_field_to: 'kubernetes.node.name', - requires: ['kubernetes.node'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'capacity', - metrics: [ - { - field: 'kubernetes.node.fs.capacity.bytes', - id: 'max-fs-cap', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'used', - metrics: [ - { - field: 'kubernetes.node.fs.used.bytes', - id: 'avg-fs-used', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_memory_cap.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_memory_cap.ts deleted file mode 100644 index ccc227ef1854e..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_memory_cap.ts +++ /dev/null @@ -1,46 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostK8sMemoryCap: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostK8sMemoryCap', - map_field_to: 'kubernetes.node.name', - requires: ['kubernetes.node'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'capacity', - metrics: [ - { - field: 'kubernetes.node.memory.allocatable.bytes', - id: 'max-memory-cap', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'used', - metrics: [ - { - field: 'kubernetes.node.memory.usage.bytes', - id: 'avg-memory-usage', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_overview.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_overview.ts deleted file mode 100644 index 2da74d50dff1b..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_overview.ts +++ /dev/null @@ -1,155 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostK8sOverview: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostK8sOverview', - requires: ['kubernetes'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'top_n', - series: [ - { - id: 'cpucap', - split_mode: 'everything', - metrics: [ - { - field: 'kubernetes.node.cpu.allocatable.cores', - id: 'max-cpu-cap', - type: 'max', - }, - { - field: 'kubernetes.node.cpu.usage.nanocores', - id: 'avg-cpu-usage', - type: 'avg', - }, - { - id: 'calc-used-cap', - script: 'params.used / (params.cap * 1000000000)', - type: 'calculation', - variables: [ - { - field: 'max-cpu-cap', - id: 'var-cap', - name: 'cap', - }, - { - field: 'avg-cpu-usage', - id: 'var-used', - name: 'used', - }, - ], - }, - ], - }, - { - id: 'diskcap', - metrics: [ - { - field: 'kubernetes.node.fs.capacity.bytes', - id: 'max-fs-cap', - type: 'max', - }, - { - field: 'kubernetes.node.fs.used.bytes', - id: 'avg-fs-used', - type: 'avg', - }, - { - id: 'calc-used-cap', - script: 'params.used / params.cap', - type: 'calculation', - variables: [ - { - field: 'max-fs-cap', - id: 'var-cap', - name: 'cap', - }, - { - field: 'avg-fs-used', - id: 'var-used', - name: 'used', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'memorycap', - metrics: [ - { - field: 'kubernetes.node.memory.allocatable.bytes', - id: 'max-memory-cap', - type: 'max', - }, - { - field: 'kubernetes.node.memory.usage.bytes', - id: 'avg-memory-usage', - type: 'avg', - }, - { - id: 'calc-used-cap', - script: 'params.used / params.cap', - type: 'calculation', - variables: [ - { - field: 'max-memory-cap', - id: 'var-cap', - name: 'cap', - }, - { - field: 'avg-memory-usage', - id: 'var-used', - name: 'used', - }, - ], - }, - ], - split_mode: 'everything', - }, - { - id: 'podcap', - metrics: [ - { - field: 'kubernetes.node.pod.capacity.total', - id: 'max-pod-cap', - type: 'max', - }, - { - field: 'kubernetes.pod.uid', - id: 
'card-pod-name', - type: 'cardinality', - }, - { - id: 'calc-used-cap', - script: 'params.used / params.cap', - type: 'calculation', - variables: [ - { - field: 'max-pod-cap', - id: 'var-cap', - name: 'cap', - }, - { - field: 'card-pod-name', - id: 'var-used', - name: 'used', - }, - ], - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_pod_cap.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_pod_cap.ts deleted file mode 100644 index 85cb798eaf9b9..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_k8s_pod_cap.ts +++ /dev/null @@ -1,47 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostK8sPodCap: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostK8sPodCap', - requires: ['kubernetes.node'], - map_field_to: 'kubernetes.node.name', - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - - series: [ - { - id: 'capacity', - metrics: [ - { - field: 'kubernetes.node.pod.allocatable.total', - id: 'max-pod-cap', - type: 'max', - }, - ], - split_mode: 'everything', - }, - { - id: 'used', - metrics: [ - { - field: 'kubernetes.pod.uid', - id: 'avg-pod', - type: 'cardinality', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_load.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_load.ts deleted file mode 100644 index bef170c743e6c..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_load.ts +++ /dev/null @@ -1,56 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostLoad: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostLoad', - requires: ['system.cpu'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'load_1m', - metrics: [ - { - field: 'system.load.1', - id: 'avg-load-1m', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - { - id: 'load_5m', - metrics: [ - { - field: 'system.load.5', - id: 'avg-load-5m', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - { - id: 'load_15m', - metrics: [ - { - field: 'system.load.15', - id: 'avg-load-15m', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_memory_usage.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_memory_usage.ts deleted file mode 100644 index bda81dea4bd28..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_memory_usage.ts +++ /dev/null @@ -1,78 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. - */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostMemoryUsage: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostMemoryUsage', - requires: ['system.memory'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'free', - metrics: [ - { - field: 'system.memory.free', - id: 'avg-memory-free', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - { - id: 'used', - metrics: [ - { - field: 'system.memory.actual.used.bytes', - id: 'avg-memory-used', - type: 'avg', - }, - ], - split_mode: 'everything', - }, - { - id: 'cache', - metrics: [ - { - field: 'system.memory.actual.used.bytes', - id: 'avg-memory-actual-used', - type: 'avg', - }, - { - field: 'system.memory.used.bytes', - id: 'avg-memory-used', - type: 'avg', - }, - { - id: 'calc-used-actual', - script: 'params.used - params.actual', - type: 'calculation', - variables: [ - { - field: 'avg-memory-actual-used', - id: 'var-actual', - name: 'actual', - }, - { - field: 'avg-memory-used', - id: 'var-used', - name: 'used', - }, - ], - }, - ], - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_network_traffic.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_network_traffic.ts deleted file mode 100644 index b3dfc5e91ba0a..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_network_traffic.ts +++ /dev/null @@ -1,109 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostNetworkTraffic: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostNetworkTraffic', - requires: ['system.network'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'timeseries', - series: [ - { - id: 'tx', - metrics: [ - { - field: 'host.network.egress.bytes', - id: 'avg-net-out', - type: 'avg', - }, - { - id: 'max-period', - type: 'max', - field: 'metricset.period', - }, - { - id: '3216b170-f192-11ec-a8e3-dd984b7213e2', - type: 'calculation', - variables: [ - { - id: '34e64c30-f192-11ec-a8e3-dd984b7213e2', - name: 'value', - field: 'avg-net-out', - }, - { - id: '3886cb80-f192-11ec-a8e3-dd984b7213e2', - name: 'period', - field: 'max-period', - }, - ], - script: 'params.value / (params.period / 1000)', - }, - ], - filter: { - language: 'kuery', - query: 'host.network.egress.bytes : * ', - }, - split_mode: 'everything', - }, - { - id: 'rx', - metrics: [ - { - field: 'host.network.ingress.bytes', - id: 'avg-net-in', - type: 'avg', - }, - { - id: 'calc-invert-rate', - script: 'params.rate * -1', - type: 'calculation', - variables: [ - { - field: 'avg-net-in', - id: 'var-rate', - name: 'rate', - }, - ], - }, - { - id: 'max-period', - type: 'max', - field: 'metricset.period', - }, - { - id: '3216b170-f192-11ec-a8e3-dd984b7213e2', - type: 'calculation', - variables: [ - { - id: '34e64c30-f192-11ec-a8e3-dd984b7213e2', - name: 'value', - field: 'calc-invert-rate', - }, - { - id: '3886cb80-f192-11ec-a8e3-dd984b7213e2', - name: 'period', - field: 'max-period', - }, - ], - script: 'params.value / (params.period / 1000)', - }, - ], - filter: { - language: 'kuery', - query: 'host.network.ingress.bytes : * ', - }, - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_system_overview.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_system_overview.ts deleted file mode 100644 index 69ebd0aa35947..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/host_system_overview.ts +++ /dev/null @@ -1,130 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { TSVBMetricModelCreator, TSVBMetricModel } from '../../../types'; - -export const hostSystemOverview: TSVBMetricModelCreator = ( - timeField, - indexPattern, - interval -): TSVBMetricModel => ({ - id: 'hostSystemOverview', - requires: ['system.cpu', 'system.memory', 'system.load', 'system.network'], - index_pattern: indexPattern, - interval, - time_field: timeField, - type: 'top_n', - series: [ - { - id: 'cpu', - split_mode: 'everything', - metrics: [ - { - field: 'system.cpu.total.norm.pct', - id: 'avg-cpu-total', - type: 'avg', - }, - ], - }, - { - id: 'load', - split_mode: 'everything', - metrics: [ - { - field: 'system.load.5', - id: 'avg-load-5m', - type: 'avg', - }, - ], - }, - { - id: 'memory', - split_mode: 'everything', - metrics: [ - { - field: 'system.memory.actual.used.pct', - id: 'avg-memory-actual-used', - type: 'avg', - }, - ], - }, - { - id: 'rx', - metrics: [ - { - field: 'host.network.ingress.bytes', - id: 'avg-net-in', - type: 'avg', - }, - { - id: 'max-period', - type: 'max', - field: 'metricset.period', - }, - { - id: '3216b170-f192-11ec-a8e3-dd984b7213e2', - type: 'calculation', - variables: [ - { - id: '34e64c30-f192-11ec-a8e3-dd984b7213e2', - name: 'value', - field: 'avg-net-in', - }, - { - id: '3886cb80-f192-11ec-a8e3-dd984b7213e2', - name: 'period', - field: 'max-period', - }, - ], - script: 'params.value / (params.period / 1000)', - }, - ], - filter: { - language: 'kuery', - query: 'host.network.ingress.bytes : * ', - }, - split_mode: 'everything', - }, - { - id: 'tx', - metrics: [ - { - field: 'host.network.egress.bytes', - id: 'avg-net-out', - type: 'avg', - }, - { - id: 'max-period', - type: 'max', - field: 'metricset.period', - }, - { - id: '3216b170-f192-11ec-a8e3-dd984b7213e2', - type: 'calculation', - variables: [ - { - id: '34e64c30-f192-11ec-a8e3-dd984b7213e2', - name: 'value', - field: 'avg-net-out', - }, - { - id: '3886cb80-f192-11ec-a8e3-dd984b7213e2', - name: 'period', - field: 'max-period', - }, - ], - script: 'params.value / (params.period / 1000)', - }, - ], - filter: { - language: 'kuery', - query: 'host.network.egress.bytes : * ', - }, - split_mode: 'everything', - }, - ], -}); diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/index.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/index.ts deleted file mode 100644 index fb91ee9dd8b27..0000000000000 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/host/metrics/tsvb/index.ts +++ /dev/null @@ -1,42 +0,0 @@ -/* - * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one - * or more contributor license agreements. Licensed under the Elastic License - * 2.0; you may not use this file except in compliance with the Elastic License - * 2.0. 
- */ - -import { hostSystemOverview } from './host_system_overview'; -import { hostCpuUsage } from './host_cpu_usage'; -import { hostLoad } from './host_load'; -import { hostMemoryUsage } from './host_memory_usage'; -import { hostNetworkTraffic } from './host_network_traffic'; -import { hostFilesystem } from './host_filesystem'; - -import { hostK8sOverview } from './host_k8s_overview'; -import { hostK8sCpuCap } from './host_k8s_cpu_cap'; -import { hostK8sPodCap } from './host_k8s_pod_cap'; -import { hostK8sDiskCap } from './host_k8s_disk_cap'; -import { hostK8sMemoryCap } from './host_k8s_memory_cap'; - -import { hostDockerTop5ByMemory } from './host_docker_top_5_by_memory'; -import { hostDockerTop5ByCpu } from './host_docker_top_5_by_cpu'; -import { hostDockerOverview } from './host_docker_overview'; -import { hostDockerInfo } from './host_docker_info'; - -export const tsvb = { - hostSystemOverview, - hostCpuUsage, - hostLoad, - hostMemoryUsage, - hostNetworkTraffic, - hostFilesystem, - hostK8sOverview, - hostK8sCpuCap, - hostK8sPodCap, - hostK8sDiskCap, - hostK8sMemoryCap, - hostDockerOverview, - hostDockerInfo, - hostDockerTop5ByMemory, - hostDockerTop5ByCpu, -}; diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/metrics.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/metrics.ts index 00937a064a7b8..510e655c0b8d7 100644 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/metrics.ts +++ b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/metrics.ts @@ -5,10 +5,8 @@ * 2.0. */ -import { metrics as hostMetrics } from './host/metrics'; import { metrics as sharedMetrics } from './shared/metrics'; import { metrics as podMetrics } from './kubernetes/pod/metrics'; -import { metrics as containerMetrics } from './container/metrics'; import { metrics as awsEC2Metrics } from './aws_ec2/metrics'; import { metrics as awsS3Metrics } from './aws_s3/metrics'; import { metrics as awsRDSMetrics } from './aws_rds/metrics'; @@ -16,10 +14,8 @@ import { metrics as awsSQSMetrics } from './aws_sqs/metrics'; export const metrics = { tsvb: { - ...hostMetrics.tsvb, ...sharedMetrics.tsvb, ...podMetrics.tsvb, - ...containerMetrics.tsvb, ...awsEC2Metrics.tsvb, ...awsS3Metrics.tsvb, ...awsRDSMetrics.tsvb, diff --git a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/types.ts b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/types.ts index f4054f289ae7a..8ad0ff886ebe6 100644 --- a/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/types.ts +++ b/x-pack/plugins/observability_solution/metrics_data_access/common/inventory_models/types.ts @@ -46,7 +46,6 @@ export const InventoryMetricRT = rt.keyof({ hostSystemOverview: null, hostCpuUsageTotal: null, hostCpuUsage: null, - hostFilesystem: null, hostK8sOverview: null, hostK8sCpuCap: null, hostK8sDiskCap: null, @@ -55,17 +54,12 @@ export const InventoryMetricRT = rt.keyof({ hostLoad: null, hostMemoryUsage: null, hostNetworkTraffic: null, - hostDockerOverview: null, - hostDockerInfo: null, - hostDockerTop5ByCpu: null, - hostDockerTop5ByMemory: null, podOverview: null, podCpuUsage: null, podMemoryUsage: null, podLogUsage: null, podNetworkTraffic: null, containerOverview: null, - containerCpuKernel: null, containerCpuUsage: null, containerDiskIOOps: null, containerDiskIOBytes: null, @@ -276,7 +270,7 @@ export const SnapshotMetricTypeRT 
= rt.keyof(SnapshotMetricTypeKeys); export type SnapshotMetricType = rt.TypeOf<typeof SnapshotMetricTypeRT>; export interface InventoryMetrics { - tsvb: { [name: string]: TSVBMetricModelCreator }; + tsvb?: { [name: string]: TSVBMetricModelCreator }; snapshot: { [name: string]: MetricsUIAggregation | undefined }; defaultSnapshot: SnapshotMetricType; /** This is used by the inventory view to calculate the appropriate amount of time for the metrics detail page. Some metrics like awsS3 require multiple days where others like host only need an hour.*/ diff --git a/x-pack/plugins/observability_solution/observability/public/components/alert_overview/helpers/map_rules_params_with_flyout.ts b/x-pack/plugins/observability_solution/observability/public/components/alert_overview/helpers/map_rules_params_with_flyout.ts index b5819631406d7..e9e8714bf85b9 100644 --- a/x-pack/plugins/observability_solution/observability/public/components/alert_overview/helpers/map_rules_params_with_flyout.ts +++ b/x-pack/plugins/observability_solution/observability/public/components/alert_overview/helpers/map_rules_params_with_flyout.ts @@ -286,7 +286,7 @@ export const mapRuleParamsWithFlyout = (alert: TopAlert): FlyoutThresholdData[] const { thresholdComparator, threshold } = ruleParams as EsQueryRuleParams; const ESQueryFlyoutMap = { observedValue: [alert.fields[ALERT_EVALUATION_VALUE]], - threshold: threshold.join(' AND '), + threshold: [threshold].flat().join(' AND '), comparator: thresholdComparator, pctAboveThreshold: getPctAboveThreshold( threshold, diff --git a/x-pack/plugins/session_view/public/methods/index.tsx b/x-pack/plugins/session_view/public/methods/index.tsx index ad8d77660b3b1..43295737c21f1 100644 --- a/x-pack/plugins/session_view/public/methods/index.tsx +++ b/x-pack/plugins/session_view/public/methods/index.tsx @@ -28,10 +28,7 @@ const SUPPORTED_PACKAGES = [ CLOUD_DEFEND_DATA_SOURCE, AUDITBEAT_DATA_SOURCE, ]; -const INDEX_REGEX = new RegExp( - `([a-z0-9_-]+\:)?[a-z0-9\-.]*(${SUPPORTED_PACKAGES.join('|')})`, - 'i' -); +const INDEX_REGEX = new RegExp(`([a-z0-9_-]+\:)?[a-z0-9-.]*(${SUPPORTED_PACKAGES.join('|')})`, 'i'); export const DEFAULT_INDEX = 'logs-*'; export const CLOUD_DEFEND_INDEX = 'logs-cloud_defend.*'; diff --git a/x-pack/plugins/transform/public/app/sections/create_transform/components/wizard/wizard.tsx b/x-pack/plugins/transform/public/app/sections/create_transform/components/wizard/wizard.tsx index ab2865b85eb8a..2c078f93627cc 100644 --- a/x-pack/plugins/transform/public/app/sections/create_transform/components/wizard/wizard.tsx +++ b/x-pack/plugins/transform/public/app/sections/create_transform/components/wizard/wizard.tsx @@ -111,7 +111,7 @@ export const CreateTransformWizardContext = createContext<{ export const Wizard: FC = React.memo(({ cloneConfig, searchItems }) => { const { showNodeInfo } = useEnabledFeatures(); const appDependencies = useAppDependencies(); - const { uiSettings, data, fieldFormats, charts } = appDependencies; + const { uiSettings, data, fieldFormats, charts, theme } = appDependencies; const { dataView } = searchItems; // The current WIZARD_STEP @@ -247,6 +247,7 @@ export const Wizard: FC = React.memo(({ cloneConfig, searchItems }) fieldStatsServices={fieldStatsServices} timeRangeMs={stepDefineState.timeRangeMs} dslQuery={transformConfig.source.query} + theme={theme} > = }); } + core.savedObjects.registerType({ + name: TYPE_WITH_PREDICTABLE_ID, + hidden: false, + namespaceType: 'single', + mappings: deepFreeze({ + properties: { + publicProperty: { type: 'keyword' },
publicPropertyExcludedFromAAD: { type: 'keyword' }, + publicPropertyStoredEncrypted: { type: 'binary' }, + privateProperty: { type: 'binary' }, + }, + }), + }); + + deps.encryptedSavedObjects.registerType({ + type: TYPE_WITH_PREDICTABLE_ID, + attributesToEncrypt: new Set([ + 'privateProperty', + { key: 'publicPropertyStoredEncrypted', dangerouslyExposeValue: true }, + ]), + attributesToIncludeInAAD: new Set(['publicProperty']), + enforceRandomId: false, + }); + core.savedObjects.registerType({ name: SAVED_OBJECT_WITHOUT_SECRET_TYPE, hidden: false, diff --git a/x-pack/test/encrypted_saved_objects_api_integration/tests/encrypted_saved_objects_api.ts b/x-pack/test/encrypted_saved_objects_api_integration/tests/encrypted_saved_objects_api.ts index 4687a01858260..23aa9017e52ea 100644 --- a/x-pack/test/encrypted_saved_objects_api_integration/tests/encrypted_saved_objects_api.ts +++ b/x-pack/test/encrypted_saved_objects_api_integration/tests/encrypted_saved_objects_api.ts @@ -26,6 +26,8 @@ export default function ({ getService }: FtrProviderContext) { 'saved-object-with-secret-and-multiple-spaces'; const SAVED_OBJECT_WITHOUT_SECRET_TYPE = 'saved-object-without-secret'; + const TYPE_WITH_PREDICTABLE_ID = 'type-with-predictable-ids'; + function runTests( encryptedSavedObjectType: string, getURLAPIBaseURL: () => string, @@ -900,5 +902,129 @@ export default function ({ getService }: FtrProviderContext) { } }); }); + + describe('enforceRandomId', () => { + describe('false', () => { + it('#create allows setting non-random ID', async () => { + const id = 'my_predictable_id'; + + const savedObjectOriginalAttributes = { + publicProperty: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + }; + + const { body: response } = await supertest + .post(`/api/saved_objects/${TYPE_WITH_PREDICTABLE_ID}/${id}`) + .set('kbn-xsrf', 'xxx') + .send({ attributes: savedObjectOriginalAttributes }) + .expect(200); + + expect(response.id).to.be(id); + }); + + it('#bulkCreate not enforcing random ID allows to specify ID', async () => { + const bulkCreateParams = [ + { + type: TYPE_WITH_PREDICTABLE_ID, + id: 'my_predictable_id', + attributes: { + publicProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + }, + }, + { + type: TYPE_WITH_PREDICTABLE_ID, + id: 'my_predictable_id_2', + attributes: { + publicProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + }, + }, + ]; + + const { + body: { saved_objects: savedObjects }, + } = await supertest + .post('/api/saved_objects/_bulk_create') + .set('kbn-xsrf', 'xxx') + .send(bulkCreateParams) + .expect(200); + + expect(savedObjects).to.have.length(bulkCreateParams.length); + expect(savedObjects[0].id).to.be('my_predictable_id'); + expect(savedObjects[1].id).to.be('my_predictable_id_2'); + }); + }); + + describe('true or undefined', () => { + it('#create setting a predictable id on ESO types that have not opted out throws an error', async () => { + const id = 'my_predictable_id'; + + const savedObjectOriginalAttributes = { + publicProperty: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + }; + + const 
{ body: response } = await supertest + .post(`/api/saved_objects/saved-object-with-secret/${id}`) + .set('kbn-xsrf', 'xxx') + .send({ attributes: savedObjectOriginalAttributes }) + .expect(400); + + expect(response.message).to.contain( + 'Predefined IDs are not allowed for saved objects with encrypted attributes unless the ID is a UUID.' + ); + }); + + it('#bulkCreate setting random ID on ESO types that have not opted out throws an error', async () => { + const bulkCreateParams = [ + { + type: SAVED_OBJECT_WITH_SECRET_TYPE, + id: 'my_predictable_id', + attributes: { + publicProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + }, + }, + { + type: SAVED_OBJECT_WITH_SECRET_TYPE, + id: 'my_predictable_id_2', + attributes: { + publicProperty: randomness.string(), + publicPropertyExcludedFromAAD: randomness.string(), + publicPropertyStoredEncrypted: randomness.string(), + privateProperty: randomness.string(), + }, + }, + ]; + + const { + body: { saved_objects: savedObjects }, + } = await supertest + .post('/api/saved_objects/_bulk_create') + .set('kbn-xsrf', 'xxx') + .send(bulkCreateParams) + .expect(200); + + expect(savedObjects).to.have.length(bulkCreateParams.length); + + savedObjects.forEach((savedObject: any) => { + expect(savedObject.error.message).to.contain( + 'Predefined IDs are not allowed for saved objects with encrypted attributes unless the ID is a UUID.' + ); + }); + }); + }); + }); }); } diff --git a/x-pack/test/fleet_api_integration/apis/epm/index.js b/x-pack/test/fleet_api_integration/apis/epm/index.js index 3caed7da79f65..cae50abdae762 100644 --- a/x-pack/test/fleet_api_integration/apis/epm/index.js +++ b/x-pack/test/fleet_api_integration/apis/epm/index.js @@ -31,6 +31,7 @@ export default function loadTests({ loadTestFile, getService }) { loadTestFile(require.resolve('./install_update')); loadTestFile(require.resolve('./install_tsds_disable')); loadTestFile(require.resolve('./install_tag_assets')); + loadTestFile(require.resolve('./install_with_streaming')); loadTestFile(require.resolve('./bulk_upgrade')); loadTestFile(require.resolve('./bulk_install')); loadTestFile(require.resolve('./update_assets')); diff --git a/x-pack/test/fleet_api_integration/apis/epm/install_by_upload.ts b/x-pack/test/fleet_api_integration/apis/epm/install_by_upload.ts index e6fa2930cf84d..e32328b4e22cc 100644 --- a/x-pack/test/fleet_api_integration/apis/epm/install_by_upload.ts +++ b/x-pack/test/fleet_api_integration/apis/epm/install_by_upload.ts @@ -195,7 +195,7 @@ export default function (providerContext: FtrProviderContext) { .send(buf) .expect(400); expect((res.error as HTTPError).text).to.equal( - '{"statusCode":400,"error":"Bad Request","message":"Archive seems empty. Assumed content type was application/gzip, check if this matches the archive type."}' + '{"statusCode":400,"error":"Bad Request","message":"Manifest file manifest.yml not found in paths."}' ); }); diff --git a/x-pack/test/fleet_api_integration/apis/epm/install_with_streaming.ts b/x-pack/test/fleet_api_integration/apis/epm/install_with_streaming.ts new file mode 100644 index 0000000000000..152e3dfd4c69d --- /dev/null +++ b/x-pack/test/fleet_api_integration/apis/epm/install_with_streaming.ts @@ -0,0 +1,65 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. 
Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ +import type { Client } from '@elastic/elasticsearch'; +import { INGEST_SAVED_OBJECT_INDEX } from '@kbn/core-saved-objects-server'; +import { Installation } from '@kbn/fleet-plugin/server/types'; +import expect from 'expect'; +import { FtrProviderContext } from '../../../api_integration/ftr_provider_context'; +import { skipIfNoDockerRegistry, isDockerRegistryEnabledOrSkipped } from '../../helpers'; + +export default function (providerContext: FtrProviderContext) { + const { getService } = providerContext; + const es: Client = getService('es'); + const supertest = getService('supertest'); + const fleetAndAgents = getService('fleetAndAgents'); + + const uninstallPackage = async (pkg: string, version: string) => { + await supertest.delete(`/api/fleet/epm/packages/${pkg}/${version}`).set('kbn-xsrf', 'xxxx'); + }; + const installPackage = (pkg: string, version: string, opts?: { force?: boolean }) => { + return supertest + .post(`/api/fleet/epm/packages/${pkg}/${version}`) + .set('kbn-xsrf', 'xxxx') + .send({ force: !!opts?.force }); + }; + + const getInstallationSavedObject = async (pkg: string): Promise<Installation | undefined> => { + const res: { _source?: { 'epm-packages': Installation } } = await es.transport.request({ + method: 'GET', + path: `/${INGEST_SAVED_OBJECT_INDEX}/_doc/epm-packages:${pkg}`, + }); + + return res?._source?.['epm-packages'] as Installation; + }; + + describe('Installs a package using stream-based approach', () => { + skipIfNoDockerRegistry(providerContext); + + before(async () => { + await fleetAndAgents.setup(); + }); + + describe('security_detection_engine package', () => { + after(async () => { + if (!isDockerRegistryEnabledOrSkipped(providerContext)) return; + await uninstallPackage('security_detection_engine', '8.16.0'); + }); + it('should install security-rule assets from the package', async () => { + await installPackage('security_detection_engine', '8.16.0').expect(200); + const installationSO = await getInstallationSavedObject('security_detection_engine'); + expect(installationSO?.installed_kibana).toEqual( + expect.arrayContaining([ + expect.objectContaining({ + id: expect.any(String), + type: 'security-rule', + }), + ]) + ); + }); + }); + }); +} diff --git a/x-pack/test/fleet_api_integration/apis/fixtures/test_packages/security_detection_engine/security_detection_engine-8.16.0.zip b/x-pack/test/fleet_api_integration/apis/fixtures/test_packages/security_detection_engine/security_detection_engine-8.16.0.zip new file mode 100644 index 0000000000000..b57713203546d Binary files /dev/null and b/x-pack/test/fleet_api_integration/apis/fixtures/test_packages/security_detection_engine/security_detection_engine-8.16.0.zip differ diff --git a/x-pack/test/functional/apps/aiops/index.ts b/x-pack/test/functional/apps/aiops/index.ts index 8706d3d242c6b..75ad41f2e21fb 100644 --- a/x-pack/test/functional/apps/aiops/index.ts +++ b/x-pack/test/functional/apps/aiops/index.ts @@ -31,6 +31,7 @@ export default function ({ getService, loadTestFile }: FtrProviderContext) { loadTestFile(require.resolve('./log_rate_analysis')); loadTestFile(require.resolve('./log_rate_analysis_anomaly_table')); + loadTestFile(require.resolve('./log_rate_analysis_dashboard_embeddable')); loadTestFile(require.resolve('./change_point_detection')); loadTestFile(require.resolve('./log_pattern_analysis')); loadTestFile(require.resolve('./log_pattern_analysis_in_discover')); diff --git
a/x-pack/test/functional/apps/aiops/log_rate_analysis_dashboard_embeddable.ts b/x-pack/test/functional/apps/aiops/log_rate_analysis_dashboard_embeddable.ts new file mode 100644 index 0000000000000..331a89821335d --- /dev/null +++ b/x-pack/test/functional/apps/aiops/log_rate_analysis_dashboard_embeddable.ts @@ -0,0 +1,104 @@ +/* + * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import { EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE } from '@kbn/aiops-log-rate-analysis/constants'; + +import { FtrProviderContext } from '../../ftr_provider_context'; +import { logRateAnalysisTestData } from './log_rate_analysis_test_data'; + +const testDataSetup = logRateAnalysisTestData[0]; +const testDataPanel = { + type: 'testData', + suiteSuffix: 'with multi metric job', + panelTitle: `AIOps log rate analysis for ${testDataSetup.sourceIndexOrSavedSearch}`, + dashboardTitle: `AIOps log rate analysis for ${ + testDataSetup.sourceIndexOrSavedSearch + } ${Date.now()}`, +}; +export default function ({ getService, getPageObjects }: FtrProviderContext) { + const PageObjects = getPageObjects([ + 'common', + 'console', + 'dashboard', + 'header', + 'home', + 'security', + 'timePicker', + ]); + const aiops = getService('aiops'); + + // AIOps / Log Rate Analysis lives in the ML UI so we need some related services. + const ml = getService('ml'); + + const from = 'Apr 16, 2023 @ 00:39:02.912'; + const to = 'Jun 15, 2023 @ 21:45:26.749'; + + describe('log rate analysis in dashboard', function () { + before(async () => { + await aiops.logRateAnalysisDataGenerator.generateData(testDataSetup.dataGenerator); + + await ml.testResources.setKibanaTimeZoneToUTC(); + + await ml.securityUI.loginAsMlPowerUser(); + await ml.testResources.createDataViewIfNeeded( + testDataSetup.sourceIndexOrSavedSearch, + '@timestamp' + ); + + await PageObjects.common.setTime({ from, to }); + }); + + after(async () => { + await ml.testResources.deleteDataViewByTitle(testDataSetup.sourceIndexOrSavedSearch); + await aiops.logRateAnalysisDataGenerator.removeGeneratedData(testDataSetup.dataGenerator); + await PageObjects.common.unsetTime(); + }); + + describe(testDataPanel.suiteSuffix, function () { + before(async () => { + await PageObjects.dashboard.navigateToApp(); + }); + + after(async () => { + await ml.testResources.deleteDashboardByTitle(testDataPanel.dashboardTitle); + }); + + it('should open initializer flyout', async () => { + await PageObjects.dashboard.clickNewDashboard(); + await aiops.dashboardEmbeddables.assertDashboardIsEmpty(); + await aiops.dashboardEmbeddables.openEmbeddableInitializer( + EMBEDDABLE_LOG_RATE_ANALYSIS_TYPE + ); + }); + + it('should select data view', async () => { + await aiops.dashboardEmbeddables.assertLogRateAnalysisEmbeddableDataViewSelectorExists(); + await aiops.dashboardEmbeddables.selectLogRateAnalysisEmbeddableDataView( + testDataSetup.sourceIndexOrSavedSearch + ); + }); + + it('should create new log rate analysis panel', async () => { + await aiops.dashboardEmbeddables.clickLogRateAnalysisInitializerConfirmButtonEnabled(); + await PageObjects.timePicker.pauseAutoRefresh(); + await aiops.dashboardEmbeddables.assertDashboardPanelExists(testDataPanel.panelTitle); + await aiops.logRateAnalysisPage.assertAutoRunButtonExists(); + await PageObjects.dashboard.saveDashboard(testDataPanel.dashboardTitle); + }); + + it('should 
run log rate analysis', async () => { + await aiops.dashboardEmbeddables.assertDashboardPanelExists(testDataPanel.panelTitle); + await aiops.logRateAnalysisPage.clickAutoRunButton(); + // Wait for the analysis to finish + await aiops.logRateAnalysisPage.assertAnalysisComplete( + testDataSetup.analysisType, + testDataSetup.dataGenerator + ); + }); + }); + }); +} diff --git a/x-pack/test/functional/apps/spaces/solution_view_flag_enabled/solution_tour.ts b/x-pack/test/functional/apps/spaces/solution_view_flag_enabled/solution_tour.ts index 7b058aa36ba92..ac8281f45a56e 100644 --- a/x-pack/test/functional/apps/spaces/solution_view_flag_enabled/solution_tour.ts +++ b/x-pack/test/functional/apps/spaces/solution_view_flag_enabled/solution_tour.ts @@ -17,10 +17,20 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) { const browser = getService('browser'); const es = getService('es'); const log = getService('log'); + const retry = getService('retry'); describe('space solution tour', () => { let version: string | undefined; + const getGlobalSettings = async () => { + const doc = await es.get( + { id: `config-global:${version}`, index: '.kibana' }, + { headers: { 'kbn-xsrf': 'spaces' }, ignore: [404] } + ); + const value = (doc?._source as any)?.['config-global'] || null; + return value; + }; + const removeGlobalSettings = async () => { version = version ?? (await kibanaServer.version.get()); version = version.replace(/-SNAPSHOT$/, ''); @@ -37,7 +47,10 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) { throw error; }); - await PageObjects.common.sleep(500); // just to be on the safe side + await retry.tryForTime(3000, async () => { + const value = await getGlobalSettings(); + return value === null; + }); }; before(async () => { @@ -66,6 +79,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) { before(async () => { _defaultSpace = await spacesService.get('default'); + await removeGlobalSettings(); // Make sure we start from a clean state await PageObjects.common.navigateToUrl('management', 'kibana/spaces', { shouldUseHashForSubUrl: false, diff --git a/x-pack/test/functional/page_objects/search_sessions_management_page.ts b/x-pack/test/functional/page_objects/search_sessions_management_page.ts index 6d704387e21f2..8694dc51dfbd6 100644 --- a/x-pack/test/functional/page_objects/search_sessions_management_page.ts +++ b/x-pack/test/functional/page_objects/search_sessions_management_page.ts @@ -37,7 +37,7 @@ export function SearchSessionsPageProvider({ getService, getPageObjects }: FtrPr id: ((await row.getAttribute('data-test-search-session-id')) ?? '').split('id-')[1], name: $.findTestSubject('sessionManagementNameCol').text().trim(), status: $.findTestSubject('sessionManagementStatusLabel').attr('data-test-status'), - mainUrl: $.findTestSubject('sessionManagementNameCol').text(), + mainUrl: $.findTestSubject('sessionManagementNameCol').attr('href'), created: $.findTestSubject('sessionManagementCreatedCol').text(), expires: $.findTestSubject('sessionManagementExpiresCol').text(), searchesCount: Number($.findTestSubject('sessionManagementNumSearchesCol').text()), diff --git a/x-pack/test/functional/services/aiops/dashboard_embeddables.ts b/x-pack/test/functional/services/aiops/dashboard_embeddables.ts new file mode 100644 index 0000000000000..a24cec24734da --- /dev/null +++ b/x-pack/test/functional/services/aiops/dashboard_embeddables.ts @@ -0,0 +1,110 @@ +/* + * Copyright Elasticsearch B.V. 
and/or licensed to Elasticsearch B.V. under one + * or more contributor license agreements. Licensed under the Elastic License + * 2.0; you may not use this file except in compliance with the Elastic License + * 2.0. + */ + +import expect from '@kbn/expect'; + +import { FtrProviderContext } from '../../ftr_provider_context'; + +export function AiopsDashboardEmbeddablesProvider({ getService }: FtrProviderContext) { + const comboBox = getService('comboBox'); + const retry = getService('retry'); + const testSubjects = getService('testSubjects'); + const find = getService('find'); + const dashboardAddPanel = getService('dashboardAddPanel'); + + return { + async assertLogRateAnalysisEmbeddableInitializerExists() { + await retry.tryForTime(10 * 1000, async () => { + await testSubjects.existOrFail('aiopsLogRateAnalysisEmbeddableInitializer', { + timeout: 1000, + }); + }); + }, + + async assertLogRateAnalysisEmbeddableInitializerNotExists() { + await retry.tryForTime(10 * 1000, async () => { + await testSubjects.missingOrFail('aiopsLogRateAnalysisEmbeddableInitializer', { + timeout: 1000, + }); + }); + }, + + async assertInitializerConfirmButtonEnabled(subj: string) { + await retry.tryForTime(60 * 1000, async () => { + await testSubjects.existOrFail(subj); + await testSubjects.isEnabled(subj); + }); + }, + + async assertDashboardIsEmpty() { + await retry.tryForTime(60 * 1000, async () => { + await testSubjects.existOrFail('emptyDashboardWidget'); + }); + }, + + async assertDashboardPanelExists(title: string) { + await retry.tryForTime(5000, async () => { + await find.existsByLinkText(title); + }); + }, + + async assertLogsAiopsSectionExists(expectExist = true) { + await retry.tryForTime(60 * 1000, async () => { + await dashboardAddPanel.clickEditorMenuButton(); + await dashboardAddPanel.verifyEmbeddableFactoryGroupExists('logs-aiops', expectExist); + }); + }, + + async clickLogRateAnalysisInitializerConfirmButtonEnabled() { + const subj = 'aiopsLogRateAnalysisConfirmButton'; + await retry.tryForTime(60 * 1000, async () => { + await this.assertInitializerConfirmButtonEnabled(subj); + await testSubjects.clickWhenNotDisabledWithoutRetry(subj); + await this.assertLogRateAnalysisEmbeddableInitializerNotExists(); + }); + }, + + async openEmbeddableInitializer(mlEmbeddableType: 'aiopsLogRateAnalysisEmbeddable') { + const name = { + aiopsLogRateAnalysisEmbeddable: 'Log rate analysis', + }; + await retry.tryForTime(60 * 1000, async () => { + await dashboardAddPanel.clickEditorMenuButton(); + await testSubjects.existOrFail('dashboardPanelSelectionFlyout', { timeout: 2000 }); + + await dashboardAddPanel.verifyEmbeddableFactoryGroupExists('logs-aiops'); + + await dashboardAddPanel.clickAddNewPanelFromUIActionLink(name[mlEmbeddableType]); + await testSubjects.existOrFail('aiopsLogRateAnalysisControls', { timeout: 2000 }); + }); + }, + + async assertLogRateAnalysisEmbeddableDataViewSelectorExists() { + await testSubjects.existOrFail( + 'aiopsLogRateAnalysisEmbeddableDataViewSelector > comboBoxInput' + ); + }, + + async assertLogRateAnalysisEmbeddableDataViewSelection(dataViewValue: string) { + const comboBoxSelectedOptions = await comboBox.getComboBoxSelectedOptions( + 'aiopsLogRateAnalysisEmbeddableDataViewSelector > comboBoxInput' + ); + expect(comboBoxSelectedOptions).to.eql( + [dataViewValue], + `Expected data view selection to be '${dataViewValue}' (got '${comboBoxSelectedOptions}')` + ); + }, + + async selectLogRateAnalysisEmbeddableDataView(dataViewValue: string) { + await comboBox.set( + 
'aiopsLogRateAnalysisEmbeddableDataViewSelector > comboBoxInput', + dataViewValue + ); + await this.assertLogRateAnalysisEmbeddableDataViewSelection(dataViewValue); + }, + }; +} diff --git a/x-pack/test/functional/services/aiops/index.ts b/x-pack/test/functional/services/aiops/index.ts index 71de8c397c073..39abf21375f86 100644 --- a/x-pack/test/functional/services/aiops/index.ts +++ b/x-pack/test/functional/services/aiops/index.ts @@ -7,6 +7,7 @@ import type { FtrProviderContext } from '../../ftr_provider_context'; +import { AiopsDashboardEmbeddablesProvider } from './dashboard_embeddables'; import { LogRateAnalysisPageProvider } from './log_rate_analysis_page'; import { LogRateAnalysisResultsTableProvider } from './log_rate_analysis_results_table'; import { LogRateAnalysisResultsGroupsTableProvider } from './log_rate_analysis_results_groups_table'; @@ -16,6 +17,7 @@ import { ChangePointDetectionPageProvider } from './change_point_detection_page' import { MlTableServiceProvider } from '../ml/common_table_service'; export function AiopsProvider(context: FtrProviderContext) { + const dashboardEmbeddables = AiopsDashboardEmbeddablesProvider(context); const logRateAnalysisPage = LogRateAnalysisPageProvider(context); const logRateAnalysisResultsTable = LogRateAnalysisResultsTableProvider(context); const logRateAnalysisResultsGroupsTable = LogRateAnalysisResultsGroupsTableProvider(context); @@ -27,6 +29,7 @@ export function AiopsProvider(context: FtrProviderContext) { const changePointDetectionPage = ChangePointDetectionPageProvider(context, tableService); return { + dashboardEmbeddables, changePointDetectionPage, logRateAnalysisPage, logRateAnalysisResultsTable, diff --git a/x-pack/test/functional/services/aiops/log_rate_analysis_page.ts b/x-pack/test/functional/services/aiops/log_rate_analysis_page.ts index 66c4e64f05efe..0f7b14e3e8be7 100644 --- a/x-pack/test/functional/services/aiops/log_rate_analysis_page.ts +++ b/x-pack/test/functional/services/aiops/log_rate_analysis_page.ts @@ -266,6 +266,22 @@ export function LogRateAnalysisPageProvider({ getService, getPageObject }: FtrPr ); }, + async clickAutoRunButton() { + await testSubjects.clickWhenNotDisabledWithoutRetry( + 'aiopsLogRateAnalysisContentRunAnalysisButton' + ); + + await retry.tryForTime(30 * 1000, async () => { + await testSubjects.missingOrFail('aiopsLogRateAnalysisContentRunAnalysisButton'); + }); + }, + + async assertAutoRunButtonExists() { + await retry.tryForTime(5000, async () => { + await testSubjects.existOrFail('aiopsLogRateAnalysisContentRunAnalysisButton'); + }); + }, + async assertNoAutoRunButtonExists() { await testSubjects.existOrFail('aiopsLogRateAnalysisNoAutoRunContentRunAnalysisButton'); }, diff --git a/x-pack/test/search_sessions_integration/tests/apps/discover/async_search.ts b/x-pack/test/search_sessions_integration/tests/apps/discover/async_search.ts index 3e9429ee5ed97..1f768780a9c95 100644 --- a/x-pack/test/search_sessions_integration/tests/apps/discover/async_search.ts +++ b/x-pack/test/search_sessions_integration/tests/apps/discover/async_search.ts @@ -29,8 +29,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) { const kibanaServer = getService('kibanaServer'); const toasts = getService('toasts'); - // FLAKY: https://github.com/elastic/kibana/issues/195955 - describe.skip('discover async search', () => { + describe('discover async search', () => { before(async () => { await esArchiver.loadIfNeeded('x-pack/test/functional/es_archives/logstash_functional'); await 
@@ -115,6 +114,8 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
     it('relative timerange works', async () => {
       await common.navigateToApp('discover');
       await header.waitUntilLoadingHasFinished();
+      const url = await browser.getCurrentUrl();
+
       await searchSessions.save();
       await searchSessions.expectState('backgroundCompleted');
       const searchSessionId = await getSearchSessionId();
@@ -125,8 +126,14 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
       await searchSessionsManagement.goTo();
       const searchSessionListBeforeRestore = await searchSessionsManagement.getList();
       const searchesCountBeforeRestore = searchSessionListBeforeRestore[0].searchesCount;
+
       // navigate to Discover
-      await searchSessionListBeforeRestore[0].view();
+      // Instead of clicking the link to navigate to Discover, we load Discover from scratch (just like we did when we
+      // ran the search session before saving). This ensures that the same number of requests are made.
+      // await searchSessionListBeforeRestore[0].view();
+      const restoreUrl = new URL(searchSessionListBeforeRestore[0].mainUrl, url).href;
+      await browser.navigateTo(restoreUrl);
+      await header.waitUntilLoadingHasFinished();
 
       await searchSessions.expectState('restored');
       expect(await discover.hasNoResults()).to.be(true);
diff --git a/x-pack/test/security_solution_cypress/cypress/e2e/investigations/sourcerer/sourcerer_timeline.cy.ts b/x-pack/test/security_solution_cypress/cypress/e2e/investigations/sourcerer/sourcerer_timeline.cy.ts
index 9d2e1ac2e11a5..3560ff4bfd4c4 100644
--- a/x-pack/test/security_solution_cypress/cypress/e2e/investigations/sourcerer/sourcerer_timeline.cy.ts
+++ b/x-pack/test/security_solution_cypress/cypress/e2e/investigations/sourcerer/sourcerer_timeline.cy.ts
@@ -40,7 +40,8 @@ import { closeTimeline, openTimelineById } from '../../../tasks/timeline';
 const siemDataViewTitle = 'Security Default Data View';
 const dataViews = ['logs-*', 'metrics-*', '.kibana-event-log-*'];
 
-describe('Timeline scope', { tags: ['@ess', '@serverless', '@skipInServerless'] }, () => {
+// Failing: See https://github.com/elastic/kibana/issues/198943
+describe.skip('Timeline scope', { tags: ['@ess', '@serverless', '@skipInServerless'] }, () => {
   before(() => {
     waitForRulesBootstrap();
   });
diff --git a/x-pack/test_serverless/api_integration/test_suites/common/platform_security/authorization.ts b/x-pack/test_serverless/api_integration/test_suites/common/platform_security/authorization.ts
index 187efa4e860a8..a43da7314bed4 100644
--- a/x-pack/test_serverless/api_integration/test_suites/common/platform_security/authorization.ts
+++ b/x-pack/test_serverless/api_integration/test_suites/common/platform_security/authorization.ts
@@ -39,6 +39,7 @@ export default function ({ getService }: FtrProviderContext) {
   const log = getService('log');
   const svlCommonApi = getService('svlCommonApi');
   const roleScopedSupertest = getService('roleScopedSupertest');
+  const platformSecurityUtils = getService('platformSecurityUtils');
   const es = getService('es');
   let supertestAdminWithCookieCredentials: SupertestWithRoleScopeType;
   let supertestAdminWithApiKey: SupertestWithRoleScopeType;
@@ -56,6 +57,7 @@ export default function ({ getService }: FtrProviderContext) {
       });
     });
     after(async function () {
+      await platformSecurityUtils.clearAllRoles();
       await supertestAdminWithApiKey.destroy();
     });
diff --git a/yarn.lock b/yarn.lock
index cbd5dd93a43fd..e2a1487cf91af 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -8481,7 +8481,7 @@
     require-from-string "^2.0.2"
     uri-js-replace "^1.0.1"
 
-"@redocly/cli@^1.25.7":
+"@redocly/cli@^1.25.8":
   version "1.25.8"
   resolved "https://registry.yarnpkg.com/@redocly/cli/-/cli-1.25.8.tgz#fecd62d9ee1d564e6f0e1522f2c5648f514ce02b"
   integrity sha512-oVFN3rpGFqupx57ZS0mF2B8grnk3i0xjTQrrMm1oftF3GEf7yTg5JzwnWi8KKRWuxin4qI7j+Id5AKgNQNmTKA==
@@ -14880,10 +14880,10 @@ cookie-signature@1.0.6:
   resolved "https://registry.yarnpkg.com/cookie-signature/-/cookie-signature-1.0.6.tgz#e303a882b342cc3ee8ca513a79999734dab3ae2c"
   integrity sha1-4wOogrNCzD7oylE6eZmXNNqzriw=
 
-cookie@0.6.0:
-  version "0.6.0"
-  resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.6.0.tgz#2798b04b071b0ecbff0dbb62a505a8efa4e19051"
-  integrity sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==
+cookie@0.7.1:
+  version "0.7.1"
+  resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.7.1.tgz#2f73c42142d5d5cf71310a74fc4ae61670e5dbc9"
+  integrity sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==
 
 cookie@^0.5.0:
   version "0.5.0"
@@ -16693,10 +16693,10 @@ elastic-apm-node@3.46.0:
     traverse "^0.6.6"
     unicode-byte-truncate "^1.0.0"
 
-elastic-apm-node@^4.8.0:
-  version "4.8.0"
-  resolved "https://registry.yarnpkg.com/elastic-apm-node/-/elastic-apm-node-4.8.0.tgz#10d17c3bbd127b8bab9cb264750936a81b4339ad"
-  integrity sha512-XEfkWWQlIyv72QTCgScFzXWYM2znm/mA+6I8e2DMmr3lBdwemOTxBZw9jExu4OQ2uMc+Ld8wc5bbikkAYp4nng==
+elastic-apm-node@^4.8.1:
+  version "4.8.1"
+  resolved "https://registry.yarnpkg.com/elastic-apm-node/-/elastic-apm-node-4.8.1.tgz#b0ff8e04d7109c64b98608de73936c8740849f60"
+  integrity sha512-gryl5pJWd9Cp7qK+McHIFm9VlsqKipoXWIsKOGyPGzrddIVKKA8ONnLBwymQ1A2x4uXuk1HnIVwBvyk56vukaw==
   dependencies:
     "@elastic/ecs-pino-format" "^1.5.0"
     "@opentelemetry/api" "^1.4.1"
@@ -17861,17 +17861,17 @@ expr-eval@^2.0.2:
   resolved "https://registry.yarnpkg.com/expr-eval/-/expr-eval-2.0.2.tgz#fa6f044a7b0c93fde830954eb9c5b0f7fbc7e201"
   integrity sha512-4EMSHGOPSwAfBiibw3ndnP0AvjDWLsMvGOvWEZ2F96IGk0bIVdjQisOHxReSkE13mHcfbuCiXw+G4y0zv6N8Eg==
 
-express@^4.17.1, express@^4.17.3, express@^4.18.2, express@^4.21.0:
-  version "4.21.0"
-  resolved "https://registry.yarnpkg.com/express/-/express-4.21.0.tgz#d57cb706d49623d4ac27833f1cbc466b668eb915"
-  integrity sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==
+express@^4.17.1, express@^4.17.3, express@^4.18.2, express@^4.21.1:
+  version "4.21.1"
+  resolved "https://registry.yarnpkg.com/express/-/express-4.21.1.tgz#9dae5dda832f16b4eec941a4e44aa89ec481b281"
+  integrity sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==
   dependencies:
     accepts "~1.3.8"
     array-flatten "1.1.1"
     body-parser "1.20.3"
     content-disposition "0.5.4"
     content-type "~1.0.4"
-    cookie "0.6.0"
+    cookie "0.7.1"
     cookie-signature "1.0.6"
     debug "2.6.9"
     depd "2.0.0"