## Splunk data source

Right now it only implements writing to [Splunk](https://www.splunk.com/) - both batch & streaming. The registered data source name is `splunk`.

By default, this data source puts all columns into the `event` object and sends it to Splunk together with metadata (`index`, `source`, ...). This behavior can be changed by providing the `single_event_column` option to specify which string column should be used as the single value of `event`.
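
For illustration, here is roughly what the resulting payload looks like in both modes. The shape follows the standard [Splunk HEC event format](https://docs.splunk.com/Documentation/Splunk/9.3.1/Data/FormateventsforHTTPEventCollector); the row contents and option values below are made up:

```python
# Hypothetical row: {"level": "INFO", "message": "user logged in"}

# Default mode - all columns become fields of the `event` object:
default_payload = {
    "index": "main",     # from the `index` option
    "source": "my-app",  # from the `source` option
    "event": {"level": "INFO", "message": "user logged in"},
}

# With .option("single_event_column", "message") - only that string column
# is used as the value of `event`:
single_column_payload = {
    "index": "main",
    "source": "my-app",
    "event": "user logged in",
}
```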

Supported options:

- `url` (string, required) - URL of the Splunk HTTP Event Collector (HEC) endpoint to send data to. For example, `http://localhost:8088/services/collector/event`.
- `token` (string, required) - HEC token to [authenticate to HEC endpoint](https://docs.splunk.com/Documentation/Splunk/9.3.1/Data/FormateventsforHTTPEventCollector#HTTP_authentication).
- `index` (string, optional) - name of the Splunk index to send data to. If omitted, the default index configured for the HEC endpoint is used.
- `source` (string, optional) - the source value to assign to the event data.
- `remove_indexed_fields` (boolean, optional, default: `false`) - whether indexed fields should be removed from the `event` object.
- `batch_size` (int, optional, default: `50`) - the size of the buffer to collect payloads before sending them to Splunk.
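
A minimal usage sketch for both batch and streaming writes. The URL, token, and checkpoint path are placeholders, and it is assumed that the package's data source class has already been registered with the Spark session:

```python
# Batch write - rows are buffered in batches of `batch_size` and posted to HEC.
(
    df.write.format("splunk")
    .option("url", "http://localhost:8088/services/collector/event")
    .option("token", "00000000-0000-0000-0000-000000000000")  # placeholder HEC token
    .option("index", "main")
    .mode("append")
    .save()
)

# Streaming write works the same way via writeStream.
query = (
    streaming_df.writeStream.format("splunk")
    .option("url", "http://localhost:8088/services/collector/event")
    .option("token", "00000000-0000-0000-0000-000000000000")  # placeholder HEC token
    .option("checkpointLocation", "/tmp/splunk-checkpoint")   # placeholder path
    .start()
)
```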
77
77
78
+
## Microsoft Sentinel / Azure Monitor

Right now it only implements writing to [Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/overview/) - both batch & streaming. The registered data source name is `ms-sentinel`. The integration uses the [Logs Ingestion API of Azure Monitor](https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector#connect-with-the-log-ingestion-api), so it is also exposed as `azure-monitor`.

To push data you need to create a Data Collection Endpoint (DCE), a Data Collection Rule (DCR), and a custom table in a Log Analytics workspace. See the [documentation](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview) for a description of this process. The structure of the data in the DataFrame should match the structure of the defined custom table.
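
For example, if the custom table was defined with `TimeGenerated`, `Level`, and `Message` columns (a hypothetical schema used only for illustration), the DataFrame should carry the same fields:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical DataFrame matching a custom table defined with
# TimeGenerated (datetime), Level (string), and Message (string) columns.
df = spark.createDataFrame(
    [("INFO", "user logged in"), ("ERROR", "payment failed")],
    ["Level", "Message"],
).withColumn("TimeGenerated", F.current_timestamp())
```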

This connector uses an Azure service principal client ID/secret for authentication - you need to grant the correct permissions (the `Monitoring Metrics Publisher` role) to the service principal on the DCE and DCR.
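
A minimal write sketch, assuming the data source class has been registered. The option names below (`dce_endpoint`, `dcr_id`, `stream_name`, `tenant_id`, `client_id`, `client_secret`) are assumptions for illustration - this excerpt does not list the actual option names, so consult the full options list:

```python
# A hedged sketch - every option name and value below is a placeholder/assumption.
(
    df.write.format("ms-sentinel")  # also registered as "azure-monitor"
    .option("dce_endpoint", "https://my-dce.eastus-1.ingest.monitor.azure.com")  # hypothetical
    .option("dcr_id", "dcr-00000000000000000000000000000000")                    # hypothetical
    .option("stream_name", "Custom-MyTable_CL")                                  # hypothetical
    .option("tenant_id", "<tenant-id>")  # service principal credentials
    .option("client_id", "<client-id>")
    .option("client_secret", "<client-secret>")
    .mode("append")
    .save()
)
```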