diff --git a/docs/user-guide/explore/best-practices.md b/docs/user-guide/explore/best-practices.md
new file mode 100644
index 00000000..e4c61452
--- /dev/null
+++ b/docs/user-guide/explore/best-practices.md
@@ -0,0 +1,118 @@
+---
+sidebar_position: 6
+title: Explore Best Practices
+description: Best practices in Log management and Explore
+image: https://dytvr9ot2sszz.cloudfront.net/logz-docs/social-assets/docs-social.jpg
+keywords: [logz.io, explore, dashboard, log analysis, observability]
+---
+
+Once you've sent your data to Logz.io, you can search and query your logs to identify, debug, and monitor issues as quickly and effectively as possible.
+
+Explore supports a few query methods, including:
+
+## Simple Search
+
+Simple Search offers an intuitive way to build queries by selecting fields, conditions, and values.
+
+Click the search bar or start typing to see available fields, add operators, and choose values. To use a custom value, type it and click the + sign. Press Enter to apply the query or Tab to add another condition.
+
+Free-text searches automatically convert into Lucene queries.
+
+## Lucene
+
+Logz.io supports Lucene for more advanced queries.
+
+Search for free text by typing the string you want to find. For example, `error` returns all words containing this string, while wrapping it in quotation marks, `"error"`, returns only the exact word you're searching for.
+
+![See error](https://dytvr9ot2sszz.cloudfront.net/logz-docs/explore-dashboard/basic-search-search-word.png)
+
+Search for a value in a specific field:
+
+`log_level:ERROR`
+
+Use the boolean operators AND, OR, and NOT to create more complex searches. For example, to find logs with a specific log level that also mention a certain term:
+
+`log_level:ERROR AND Kubernetes`
+
+To perform **range-related searches**, fields must be mapped as numbers (long, float, double, etc.). You can then use the following syntax. For example, to find all logs with a `LogSize` between 2000 and 3000:
+
+`LogSize:[2000 TO 3000]`
+
+To narrow the search further, combine the range with another field condition:
+
+`LogSize:[2000 TO 3000] AND eventType:MODIFIED`
+
+Or combine two ranges:
+
+`LogSize:[2000 TO 3000] AND logzio-signature:[700000000 TO 710000000]`
+
+To exclude a term from your search, use the following syntax:
+
+`LogSize:[2000 TO 3000] AND NOT (name:"agent-k8s")`
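+
+Putting these pieces together, a single Lucene query can combine free text, a field match, a numeric range, and an exclusion. The line below is a sketch; `log_level`, `LogSize`, and `name` are illustrative field names, so substitute fields that exist in your own mapping:
+
+`"connection timeout" AND log_level:ERROR AND LogSize:[2000 TO 3000] AND NOT (name:"agent-k8s")`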
+
+
+## Filters
+
+Use filters to refine your search, whether you're using Simple Search or Lucene. Open a string field to view its related values, or open a numeric field to choose a range. For example, `LogSize` lets you select the size of the logs you're interested in:
+
+![numeric filters](https://dytvr9ot2sszz.cloudfront.net/logz-docs/explore-dashboard/logsize-explore-aug27.png)
+
+
+## Regex in Lucene
+
+:::caution
+Using regex can overload your system and cause performance issues in your account. If regex is necessary, it's best to apply filters and use shorter time frames.
+:::
+
+Logz.io uses Apache Lucene's regular expression engine to parse regex queries, supporting the `regexp` and `query_string` query types.
+
+While Lucene's regex supports all Unicode characters, several characters are reserved as operators and cannot be searched on their own:
+
+`. ? + * | { } [ ] ( ) " \`
+
+Depending on the optional operators enabled, some additional characters may also be reserved. These characters are:
+
+`# @ & < > ~`
+
+However, you can still search for reserved characters by escaping them with a backslash or wrapping them in double quotes. For example:
+
+`\*` will render as a * sign.
+
+`\#` will render as a # sign.
+
+`\()` will render as brackets.
+
+To use regex in a search query in OpenSearch, use the following template:
+
+`fieldName:/.*value.*/`
+
+For example, say you have a field called `sentence` that holds the following line: "The quick brown fox jumps over the lazy dog".
+
+To find one of the values in the field, such as `fox`, use the following query:
+
+`sentence:/.*fox.*/`
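+
+The same template also accepts richer patterns. The examples below are sketches that continue the `sentence` example above; the `url` field in the last line is illustrative, so adjust it to your own mapping:
+
+`sentence:/qu.ck/` finds `quick`, because `.` matches any single character.
+
+`sentence:/jump.*/` matches any term beginning with `jump`, such as `jumps`.
+
+`url:/.*\.php/` escapes the dot with a backslash so it matches a literal `.php` ending.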
+
+
+## Edit log table view
+
+You can add additional columns to your logs table view.
+
+Find the field you'd like to add, hover over it, and click the **Toggle column in table** button.
+
+![Add field](https://dytvr9ot2sszz.cloudfront.net/logz-docs/explore-dashboard/toggle-in-table-sep9.png)
+
+Once added, you can drag a column to reposition it or click the **X** to remove it.
+
+Save your query to quickly access it whenever needed. The query itself is saved, while its results update according to the time frame you choose.
+
+![Save field](https://dytvr9ot2sszz.cloudfront.net/logz-docs/explore-dashboard/saved-search-sep9.png)
+
+
+## Select logs' time frame
+
+By default, results are displayed for the last 15 minutes. You can change this time frame by clicking the time picker. Choose an option from the quick menu, or switch to the absolute view and type the exact time frame you want to view.
+
+![Time frame options](https://dytvr9ot2sszz.cloudfront.net/logz-docs/explore-dashboard/time-picker-sep9.png)
\ No newline at end of file
diff --git a/docs/user-guide/quick-start.md b/docs/user-guide/quick-start.md
index 4a4afa89..c5abe904 100644
--- a/docs/user-guide/quick-start.md
+++ b/docs/user-guide/quick-start.md
@@ -8,53 +8,48 @@ keywords: [logs, metrics, traces, logz.io, getting started]
 ---
 
+Logz.io is a scalable, end-to-end cloud monitoring service that combines the best open-source tools with a fully managed SaaS platform. It provides unified log, metric, and trace collection with AI/ML-enhanced features for improved troubleshooting, faster response times, and cost management.
-
-Logz.io is an end-to-end cloud monitoring service built for scale. It’s the best-of-breed open source monitoring tools on a fully managed cloud service.
-
-One unified SaaS platform to collect and analyze logs, metrics, and traces, combined with human-powered AI/ML features to improve troubleshooting, reduce response time and help you manage costs.
-
-
-Whether you are a new user or looking for a refresher on Logz.io, you are invited to join one of our engineers for a **[training session on the Logz.io platform](https://logz.io/training/)**!
-
+Whether you’re new to Logz.io or need a refresher, join one of our engineers for a **[training session on the Logz.io platform](https://logz.io/training/)**!
 
 ## Send your data to Logz.io
 
-Once you’ve set up your account, you can start sending your data.
-
-Logz.io provides various tools, integrations, and methods to send data and monitor your Logs, Metrics, Traces, and SIEM.
+After setting up your account, you can start sending your data to Logz.io using various tools, integrations, and methods for monitoring Logs, Metrics, Traces, and SIEM.
 
-The fastest and most seamless way to send your data is through our **Telemetry Collector**. It lets you easily configure your data-sending process by executing a single line of code, providing a complete observability platform to monitor and improve your logs, metrics, and traces.
+The quickest way is through our **Telemetry Collector**, which simplifies data configuration with a single line of code, enabling full observability across your systems.
 
 [**Get started with Telemetry Collector**](https://app.logz.io/#/dashboard/integrations/collectors?tags=Quick%20Setup).
 
-If you prefer to send your data manually, Logz.io offers numerous methods to do so, and here are some of the more popular ones based on what you’d like to monitor:
+If you prefer a manual approach, Logz.io offers multiple methods tailored to different monitoring needs. Here are some popular options:
 
 |**Logs**|**Metrics**|**Traces**|**Cloud SIEM**|
 | --- | --- | --- | --- |
 |[Filebeat](https://app.logz.io/#/dashboard/integrations/Filebeat-data)|[.NET](https://app.logz.io/#/dashboard/integrations/dotnet)|[Jaeger installation](https://app.logz.io/#/dashboard/integrations/Jaeger-data)|[Cloudflare](https://app.logz.io/#/dashboard/integrations/Cloudflare-network)
 |[S3 Bucket](https://app.logz.io/#/dashboard/integrations/AWS-S3-Bucket)|[Prometheus](https://app.logz.io/#/dashboard/integrations/Prometheus-remote-write)|[OpenTelemetry installation](https://app.logz.io/#/dashboard/integrations/OpenTelemetry-data)|[NGINX](https://app.logz.io/#/dashboard/integrations/Nginx-load)
-|[cURL](https://app.logz.io/#/dashboard/integrations/cURL-data)|[Azure Kubernetes Service](https://app.logz.io/#/dashboard/integrations/Kubernetes)|[Docker](https://app.logz.io/#/dashboard/integrations/Docker)|[Active directory](https://app.logz.io/#/dashboard/integrations/Active-Directory)
-|[JSON uploads](https://app.logz.io/#/dashboard/integrations/JSON)|[Google Kubernetes Engine over OpenTelemetry](https://app.logz.io/#/dashboard/integrations/Kubernetes)|[Kubernetes](https://app.logz.io/#/dashboard/integrations/Kubernetes)|[CloudTrail](https://app.logz.io/#/dashboard/integrations/AWS-CloudTrail)
-|[Docker container](https://app.logz.io/#/dashboard/integrations/Docker)|[Amazon EC2](https://app.logz.io/#/dashboard/integrations/AWS-EC2)|[Go instrumentation](https://app.logz.io/#/dashboard/integrations/GO)|[Auditbeat](https://app.logz.io/#/dashboard/integrations/auditbeat) |
+|[cURL](https://app.logz.io/#/dashboard/integrations/cURL-data)|[Java](https://app.logz.io/#/dashboard/integrations/Java)|[Docker](https://app.logz.io/#/dashboard/integrations/Docker)|[Active directory](https://app.logz.io/#/dashboard/integrations/Active-Directory)
+|[HTTP uploads](https://app.logz.io/#/dashboard/integrations/HTTP)|[Node.js](https://app.logz.io/#/dashboard/integrations/Node-js)|[Kubernetes](https://app.logz.io/#/dashboard/integrations/Kubernetes)|[CloudTrail](https://app.logz.io/#/dashboard/integrations/AWS-CloudTrail)
+|[Python](https://app.logz.io/#/dashboard/integrations/Python)|[Amazon EC2](https://app.logz.io/#/dashboard/integrations/AWS-EC2)|[Go instrumentation](https://app.logz.io/#/dashboard/integrations/GO)|[Auditbeat](https://app.logz.io/#/dashboard/integrations/auditbeat) |
 
-Browse the complete list of available shipping methods [here](https://docs.logz.io/docs/category/send-your-data/).
+Browse the complete list of available shipping methods [here](https://app.logz.io/#/dashboard/integrations/collectors).
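+
+If you just want to confirm that data is flowing before committing to an integration, you can ship a single test log with cURL. The line below is a minimal sketch that assumes the bulk HTTPS listener at `listener.logz.io` (the host varies by region) and uses a placeholder `<ACCOUNT-TOKEN>`; see the cURL integration page for the exact endpoint and token for your account:
+
+`echo '{"message": "hello from curl"}' | curl -X POST "https://listener.logz.io:8071?token=<ACCOUNT-TOKEN>&type=test" --data-binary @-`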
 
-To learn more about shipping your data, check out **Shipping Log Data to Logz.io**:
+
+
 ### Parsing your data
 
-Logz.io offers automatic parsing [for over 50 log types](https://docs.logz.io/docs/user-guide/data-hub/log-parsing/default-parsing/).
+Logz.io automatically parses [over 50 log types](https://docs.logz.io/docs/user-guide/data-hub/log-parsing/default-parsing/).
+
+If your log type isn't listed, or you want to send custom logs, we offer parsing-as-a-service as part of your subscription. Just reach out to our **Support team** via chat or email us at [help@logz.io](mailto:help@logz.io?subject=Parse%20my%20data) with your request.
 
-If you can't find your log type, or if you're interested in sending custom logs, Logz.io will parse the logs for you. Parsing-as-a-service is included in your Logz.io subscription; just open a chat with our **Support team** with your request, you can also email us at [help@logz.io](mailto:help@logz.io).
 
-###### Additional resources
+