During an incident investigation, you might need to run complex queries, such as combining attributes from multiple log sources or transforming log data, to analyze your logs. Use Log Workspaces to run these kinds of queries.
You can create a workspace from the Workspaces page or from the Log Explorer.
On the Log Workspaces page:
In the Log Explorer:
In addition to the default columns, you can add your own columns to your workspace:
You can take existing Log Explorer queries with Calculated Fields and open them directly in Workspaces. To transfer these queries from the Log Explorer, click Open in New Workspace. The Calculated Fields are automatically converted into a Transformation cell.
You can also create Calculated Fields directly within a Workspace to define a computed field from existing data sources. These fields can be reused in subsequent analysis:
You can add the following cells to your workspace:
Cells that depend on other cells are automatically updated when any cell they depend on changes.
At the bottom of your workspace, click any of the cell tiles to add it to your workspace. After adding a cell, you can click the dataset on the left side of your workspace page to go directly to that cell.
You can add a logs query or a reference table as a data source.
select only timestamp, customer id, transaction id from the transaction logs
Add the Visualization cell to display your data as a:
```
status:error
```
Click the Transformation tile to add a cell for filtering, aggregating, and extracting data. If you are using an analysis cell as your data source, you can also filter the data in SQL first.
The following is an example dataset:
| timestamp | host | message |
|---|---|---|
| May 29 11:09:28.000 | shopist.internal | Submitted order for customer 21392 |
| May 29 10:59:29.000 | shopist.internal | Submitted order for customer 38554 |
| May 29 10:58:54.000 | shopist.internal | Submitted order for customer 32200 |
Use the following grok syntax to extract the customer ID from the `message` and add it to a new column called `customer_id`:

```
Submitted order for customer %{notSpace:customer_id}
```
This is the resulting dataset in the transformation cell after the extraction:
| timestamp | host | message | customer_id |
|---|---|---|---|
| May 29 11:09:28.000 | shopist.internal | Submitted order for customer 21392 | 21392 |
| May 29 10:59:29.000 | shopist.internal | Submitted order for customer 38554 | 38554 |
| May 29 10:58:54.000 | shopist.internal | Submitted order for customer 32200 | 32200 |
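The `%{notSpace:customer_id}` grok pattern behaves like a regular expression with a named capture group of non-space characters. A minimal Python sketch of the same extraction (illustrative only, not Datadog's grok engine, using sample rows from the table above):

```python
import re

# %{notSpace:customer_id} roughly maps to a named group matching non-space characters
PATTERN = re.compile(r"Submitted order for customer (?P<customer_id>\S+)")

rows = [
    {"timestamp": "May 29 11:09:28.000", "host": "shopist.internal",
     "message": "Submitted order for customer 21392"},
    {"timestamp": "May 29 10:59:29.000", "host": "shopist.internal",
     "message": "Submitted order for customer 38554"},
]

for row in rows:
    match = PATTERN.search(row["message"])
    # Add the extracted value as a new customer_id column; None when no match
    row["customer_id"] = match.group("customer_id") if match else None

# rows[0]["customer_id"] == "21392"
```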
Click the Text cell to add a markdown cell so you can add information and notes.
This example workspace has:
Three data sources:
trade_start_logs
trade_execution_logs
trading_platform_users
Three derived datasets, which are the results of data that has been transformed from filtering, grouping, or querying using SQL:
parsed_execution_logs
transaction_record
transaction_record_with_names
One treemap visualization.
This diagram shows the different transformation and analysis cells the data sources go through.
The example starts off with two logs data sources:
trade_start_logs
trade_execution_logs
The next cell in the workspace is the transform cell `parsed_execution_logs`. It uses the following grok parsing syntax to extract the transaction ID from the `message` column of the `trade_execution_logs` dataset and adds it to a new column called `transaction_id`.
```
transaction %{notSpace:transaction_id}
```
An example of the resulting `parsed_execution_logs` dataset:
| timestamp | host | message | transaction_id |
|---|---|---|---|
| May 29 11:09:28.000 | shopist.internal | Executing trade for transaction 56519 | 56519 |
| May 29 10:59:29.000 | shopist.internal | Executing trade for transaction 23269 | 23269 |
| May 29 10:58:54.000 | shopist.internal | Executing trade for transaction 96870 | 96870 |
| May 31 12:20:01.152 | shopist.internal | Executing trade for transaction 80207 | 80207 |
The analysis cell `transaction_record` uses the following SQL command to select specific columns from the `trade_start_logs` and `trade_execution_logs` datasets, rename the status `INFO` to `OK`, and join the two datasets.
```sql
SELECT
    start_logs.timestamp,
    start_logs.customer_id,
    start_logs.transaction_id,
    start_logs.dollar_value,
    CASE
        WHEN executed_logs.status = 'INFO' THEN 'OK'
        ELSE executed_logs.status
    END AS status
FROM
    trade_start_logs AS start_logs
JOIN
    trade_execution_logs AS executed_logs
ON
    start_logs.transaction_id = executed_logs.transaction_id;
```
An example of the resulting `transaction_record` dataset:
| timestamp | customer_id | transaction_id | dollar_value | status |
|---|---|---|---|---|
| May 29 11:09:28.000 | 92446 | 085cc56c-a54f | 838.32 | OK |
| May 29 10:59:29.000 | 78037 | b1fad476-fd4f | 479.96 | OK |
| May 29 10:58:54.000 | 47694 | cb23d1a7-c0cb | 703.71 | OK |
| May 31 12:20:01.152 | 80207 | 2c75b835-4194 | 386.21 | ERROR |
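The SQL above is an inner join on `transaction_id` combined with a `CASE` expression that maps `INFO` to `OK`. A rough Python sketch of the same logic, under the assumption of simplified dict rows rather than the real log schema:

```python
start_logs = [
    {"timestamp": "May 29 11:09:28.000", "customer_id": "92446",
     "transaction_id": "085cc56c-a54f", "dollar_value": 838.32},
]
execution_logs = [
    {"transaction_id": "085cc56c-a54f", "status": "INFO"},
    {"transaction_id": "no-match", "status": "ERROR"},
]

# Index execution logs by the join key, transaction_id
by_txn = {log["transaction_id"]: log for log in execution_logs}

transaction_record = []
for s in start_logs:
    e = by_txn.get(s["transaction_id"])
    if e is None:
        continue  # inner join: rows without a matching execution log are dropped
    # CASE WHEN status = 'INFO' THEN 'OK' ELSE status END
    status = "OK" if e["status"] == "INFO" else e["status"]
    transaction_record.append({**s, "status": status})

# transaction_record[0]["status"] == "OK"
```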
Then the reference table `trading_platform_users` is added as a data source:
| customer_name | customer_id | account_status |
|---|---|---|
| Meghan Key | 92446 | verified |
| Anthony Gill | 78037 | verified |
| Tanya Mejia | 47694 | verified |
| Michael Kaiser | 80207 | fraudulent |
The analysis cell `transaction_record_with_names` runs the following SQL command to take the customer name and account status from `trading_platform_users`, append them as columns, and join the result with the `transaction_record` dataset:
```sql
SELECT tr.timestamp, tr.customer_id, tpu.customer_name, tpu.account_status, tr.transaction_id, tr.dollar_value, tr.status
FROM transaction_record AS tr
LEFT JOIN trading_platform_users AS tpu ON tr.customer_id = tpu.customer_id;
```
An example of the resulting `transaction_record_with_names` dataset:
| timestamp | customer_id | customer_name | account_status | transaction_id | dollar_value | status |
|---|---|---|---|---|---|---|
| May 29 11:09:28.000 | 92446 | Meghan Key | verified | 085cc56c-a54f | 838.32 | OK |
| May 29 10:59:29.000 | 78037 | Anthony Gill | verified | b1fad476-fd4f | 479.96 | OK |
| May 29 10:58:54.000 | 47694 | Tanya Mejia | verified | cb23d1a7-c0cb | 703.71 | OK |
| May 31 12:20:01.152 | 80207 | Michael Kaiser | fraudulent | 2c75b835-4194 | 386.21 | ERROR |
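Unlike the earlier inner join, the `LEFT JOIN` keeps every transaction row even when no matching user exists in the reference table, leaving the appended columns `NULL`. A minimal Python sketch of that behavior, using hypothetical rows (the second transaction's `customer_id` is invented to show the unmatched case):

```python
transaction_record = [
    {"timestamp": "May 31 12:20:01.152", "customer_id": "80207",
     "transaction_id": "2c75b835-4194", "dollar_value": 386.21, "status": "ERROR"},
    {"timestamp": "May 31 12:21:07.000", "customer_id": "11111",
     "transaction_id": "aaaa-bbbb", "dollar_value": 10.00, "status": "OK"},
]
trading_platform_users = [
    {"customer_id": "80207", "customer_name": "Michael Kaiser",
     "account_status": "fraudulent"},
]

# Index the reference table by the join key, customer_id
users_by_id = {u["customer_id"]: u for u in trading_platform_users}

result = []
for tr in transaction_record:
    u = users_by_id.get(tr["customer_id"])
    result.append({
        **tr,
        # LEFT JOIN: rows with no matching user keep None (SQL NULL) here
        "customer_name": u["customer_name"] if u else None,
        "account_status": u["account_status"] if u else None,
    })

# result[0]["customer_name"] == "Michael Kaiser"; result[1]["customer_name"] is None
```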
Finally, a treemap visualization cell is created from the `transaction_record_with_names` dataset, filtered for `status:error` logs and grouped by `dollar_value`, `account_status`, and `customer_name`.