
[META] OpenSearch Dashboards Forecasting Dev Ops Metrics #70

Open
5 tasks
xeniatup opened this issue Jul 29, 2023 · 1 comment
Labels
enhancement New feature or request

Comments


xeniatup commented Jul 29, 2023

This is a meta issue for UX work related to predicting the behavior of time series data in OpenSearch Dashboards.
The solution covers creating and managing forecasting jobs from a data source, and consuming forecasting results visualized on a graph, based on this feature proposal.

Forecasting will show the user where their metrics are headed so that they can prepare accordingly (e.g. disk space, server instance upgrades, inventory planning).

Is your feature request related to a problem? Please describe.

DevOps engineers and SREs can leverage the AI/ML capabilities of OpenSearch to identify anomalies in their log data, but the insights gained are retroactive by definition. Users who want to make sure their applications, service tiers, or services are up and fully functional would like to be proactively notified when their area of responsibility is about to have problems.

With Forecasting, users can configure a job that takes into account the granularity and amount of the data and trains a machine learning model to generate predicted datapoints for a desired period of time in the future. The forecasting results are visualized on a graph.
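As a rough illustration of the configuration described above, the sketch below shows what a forecasting job definition might look like. All field names here are hypothetical and chosen for illustration; they are not the actual OpenSearch Forecasting API.

```typescript
// Hypothetical shape of a forecasting job configuration.
// Field names are illustrative, not the real plugin schema.
interface ForecastJobConfig {
  name: string;
  indexPattern: string; // data source the forecast reads from
  metricField: string;  // numeric field to predict
  interval: { value: number; unit: "minutes" | "hours" | "days" }; // data granularity
  horizon: number;      // number of future intervals to predict
}

// Example: forecast disk usage 24 intervals (about 4 hours) ahead.
const example: ForecastJobConfig = {
  name: "disk-usage-forecast",
  indexPattern: "metrics-*",
  metricField: "disk.used_percent",
  interval: { value: 10, unit: "minutes" },
  horizon: 24,
};
```

The key idea is that granularity (`interval`) and horizon together determine how much historical data the model needs before it can initialize.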

Describe the solution you'd like

The following UX proposal describes the user workflow for setting up and managing a forecast, and the user experience of viewing forecasting results. The recommendation includes the use of OUI components to provide consistency and ease of use for both users and developers.

Forecast lifecycle

A forecast (forecasting job) can exist in one of the following states:

  • Inactive - a forecast that hasn't been started yet
  • Inactive: stopped - a forecast stopped by the user after running
  • Awaiting data to initialize forecast - the forecast is attempting to start, but there is not enough data for the model to begin initializing
  • Awaiting data to restart forecast - the forecast is attempting to restart after running, but there is not enough data
  • Initializing test - the forecast is building a model to run a test
  • Initializing forecast - the forecast is building a model to start running continuously
  • Test complete - the forecast generated a test result and stopped
  • Running - the forecast is running continuously
  • Test initialization failed
  • Forecast initialization failed
  • Forecast failed

Some state transitions happen automatically (for example, going from “Initializing test” to “Test complete”); others are triggered by the user (for a forecast to enter “Initializing forecast”, the user starts the forecast).
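The lifecycle above can be sketched as a small state machine. This is a minimal illustration of the states and a plausible transition table assumed from the descriptions in this proposal; the actual transitions implemented by the plugin may differ.

```typescript
// States from the lifecycle list above, as a string-literal union.
type ForecastState =
  | "INACTIVE"
  | "INACTIVE_STOPPED"
  | "AWAITING_DATA_TO_INIT"
  | "AWAITING_DATA_TO_RESTART"
  | "INIT_TEST"
  | "INIT_FORECAST"
  | "TEST_COMPLETE"
  | "RUNNING"
  | "INIT_TEST_FAILED"
  | "INIT_FORECAST_FAILED"
  | "FORECAST_FAILED";

// Assumed transition table: some edges fire automatically
// (e.g. INIT_TEST -> TEST_COMPLETE), others require a user
// action such as starting or stopping the forecast.
const transitions: Record<ForecastState, ForecastState[]> = {
  INACTIVE: ["INIT_TEST", "INIT_FORECAST", "AWAITING_DATA_TO_INIT"],
  INACTIVE_STOPPED: ["INIT_FORECAST", "AWAITING_DATA_TO_RESTART"],
  AWAITING_DATA_TO_INIT: ["INIT_TEST", "INIT_FORECAST"],
  AWAITING_DATA_TO_RESTART: ["INIT_FORECAST"],
  INIT_TEST: ["TEST_COMPLETE", "INIT_TEST_FAILED"],
  INIT_FORECAST: ["RUNNING", "INIT_FORECAST_FAILED"],
  TEST_COMPLETE: ["INIT_FORECAST"],
  RUNNING: ["INACTIVE_STOPPED", "FORECAST_FAILED"],
  INIT_TEST_FAILED: ["INIT_TEST"],
  INIT_FORECAST_FAILED: ["INIT_FORECAST"],
  FORECAST_FAILED: ["INIT_FORECAST"],
};

// Returns true when `to` is a legal next state from `from`.
function canTransition(from: ForecastState, to: ForecastState): boolean {
  return transitions[from].includes(to);
}
```

A table like this also gives the UI a single place to decide which actions (start, stop, run test) to enable for the forecast's current state.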

(Screenshot attached, 2023-07-18)

User flows


dblock commented Jun 6, 2024

[Triage -- attendees 1, 2, 3, 4, 5, 6, 7]
