Custom Security Event Workflows
In this section you will learn how to build your first Custom Security Event Workflow.
Note: The Findings Intake API facilitates the ingestion of security findings into the XDR data warehouse using the Open Cybersecurity Schema Framework (OCSF) version 1.4. This enables seamless integration of external threat data, enhancing Cisco XDR's detection and response capabilities. At this time (September 2025) the Findings Intake service is in Beta, which means not all features are available yet. The main consumer of this API currently is the new Custom Security Event Workflow. More functionality will be released to the Findings Intake service before the Beta flag is removed.
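As a rough illustration of the kind of record the Findings Intake API consumes, the sketch below builds a minimal OCSF-style network event. The field values here (class_uid, activity_id, severity_id, the endpoint structure) are assumptions for illustration only; consult the OCSF 1.4 schema for the authoritative attribute list before relying on them.

```python
import json

# Illustrative only: a minimal OCSF-style network activity record.
# The exact field set required by the Findings Intake API is an
# assumption here -- check the OCSF 1.4 schema and the API reference.
finding = {
    "class_uid": 4001,          # Network Activity class (OCSF)
    "activity_id": 6,           # e.g. Traffic (assumed value)
    "severity_id": 3,           # Medium
    "time": 1735689600000,      # event time, epoch milliseconds
    "src_endpoint": {"ip": "10.0.0.5"},
    "dst_endpoint": {"ip": "203.0.113.10", "port": 443},
    "metadata": {"version": "1.4", "product": {"name": "Splunk"}},
}

payload = json.dumps(finding)
print(payload)
```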
Use Cases
- Custom Data Integration: allows organizations to bring their own threat intelligence or security findings into Cisco XDR.
- Automated Ingestion: facilitates the automated flow of external findings into the XDR platform.
- Enhanced Correlation: integrates external data with native telemetry sources for comprehensive threat analysis.
How to create your first Custom Security Event Workflow
Creating the Workflow template
This workflow can be used to ingest security events from a custom source, set up the required mapping to our data model of the Cisco XDR Data Warehouse, and associate an Automation Rule to trigger the workflow to execute.
In XDR Automate, click on Create Workflow and select Custom Security Event Workflow. When creating a workflow, you can start with a blank workflow or choose the type of workflow to gain some configuration assistance in which the system pre-configures certain settings according to the "intent" of the workflow to expedite the build process.
Enter a descriptive name for the workflow in the Workflow display name field; a workflow must have a name before it can be saved. For this example, we will build a Schedule-based workflow for Network events from Splunk.
From the Security event type drop-down list, choose the type of security event you intend to ingest, such as Email or Network.
From the Ingestion method drop-down list, choose how the security event will be ingested, such as by Webhook (push) for immediate delivery or Schedule (pull) for periodic polling. The push method is preferred but not supported by all sources.
If you choose Schedule (pull):
- From the Select an HTTP target to use drop-down list, choose a new or existing HTTP endpoint target. This binds the target to the workflow and creates the module instance in the background so that the events are ingested with the correct source. Choose your Splunk target here if you want to follow the example of this guide.
- From the Select or create a schedule automation rule drop-down list, choose a new or existing Schedule rule, which can be used to automatically trigger this workflow to execute when configured conditions are met. For more information, see the Schedule Rule topic.
If you choose Webhook (push):
- Choose a new or existing Webhook rule either here now or in the workflow's properties later. Once the workflow is validated, the associated Webhook rule can automatically trigger this workflow to execute when conditions are met. For more information, see the Webhook Rule topic.
- Click Continue to open the workflow in the Workflow Editor. The canvas contains a pre-defined group of actions, which you must either Dismiss to remove or Accept (preferred) to keep. These groups show what is required to handle security event ingestion from a custom source: the first block gets the data in from your custom source, the next parses the data, and the last maps the results to your custom schema. How to use these groups is covered in the sections below.
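For the Webhook (push) path, the external source simply POSTs the raw event to the workflow's trigger URL. The sketch below only builds the request body and headers; the URL is a placeholder, since the real trigger URL (and any authentication it requires) is generated by XDR when the Webhook rule is created.

```python
import json

# Hypothetical trigger URL -- XDR generates the real one when the
# Webhook rule is created; this placeholder value will not work.
WEBHOOK_URL = "https://automate.example.com/webhooks/REPLACE_ME"

# Example event shaped like the Splunk firewall record used later
# in this guide (field names are assumptions for illustration).
event = {
    "event_id": "12345",
    "src_ip": "10.0.0.5",
    "dst_ip": "203.0.113.10",
    "action": "deny",
}

body = json.dumps(event).encode("utf-8")
headers = {"Content-Type": "application/json"}

# Sending is omitted so the sketch stays self-contained; with the
# real URL you could POST `body` using urllib.request or requests.
print(len(body), headers["Content-Type"])
```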
Getting the data in: drag and drop an activity here that pulls the data from the custom source. This group only appears when using the Schedule ingestion method, because with the Webhook method the data is already pushed in. An example is covered in the section below.
Parsing the data: drag and drop an activity here that parses the data and retrieves the relevant fields. An example is covered in the section below.
Mapping the data: map variables to the parsed data. Contains a Cisco-managed atomic action such as XDR - Analytics - Ingest Email Security Event or XDR - Analytics - Ingest Network Security Event, depending on the type of security event.
Click the atomic action to view its properties and complete the configuration. Scroll through the list and fill in as many fields as possible. Required fields are indicated by a red asterisk.
Important note on mapping data: The required fields Module Instance ID and Workflow Run ID are output variable references that the system sets automatically when the target was chosen. The Module Instance ID is mapped to an Integration Module in your XDR tenant. It is resolved by the Intake API and used as the name of the Custom Event Source. This can be a Module Instance of an existing Integration Module Type (e.g. Splunk), or you can create a placeholder Module Instance using the new Custom OCSF Event Source Module Type (change the URL to .eu. or .apjc. as needed for your region).
- In the Workflow Properties, under Intent Options, the Workflow Intent should be set to Custom Security Event. The Security Event Type, Ingestion Type, Automation Rule Type, and Target should also be populated based on your earlier selections. The rule type cannot be removed from the workflow with this intent, and the workflow will not be executed automatically unless a validated automation rule is associated with it.
Getting the data in
This example use case runs a Splunk search query and parses the results for ingestion into the XDR data warehouse.
- Add the Splunk - Run Search and Splunk - Get Search Results atomic actions to your workflow in the first placeholder group ("getting the data in").
- Make sure the Workflow target (Splunk) is properly inherited by these atomic actions (this is the default) and provide the query. For example:
index=security sourcetype=firewall event_id=12345
This query retrieves a specific firewall event with event_id=12345. Enhance the query so it looks for a very specific Network event type that adheres to the considerations from the introduction of this guide.
Parsing the data
Use the JSONPath atomic action to extract relevant fields from the Splunk query result. For example:
- Add the JSONPath action in the "parsing the data" group.
- Input: the JSON response from the Splunk - Get Search Results atomic action.
- Example JSONPath expressions:
$.result[0].src_ip → Source IP
$.result[0].dst_ip → Destination IP
$.result[0].action → Action taken (e.g., allow/deny)
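To make the behavior of these expressions concrete, the pure-Python sketch below walks a sample payload the same way the JSONPath expressions would. The sample result structure is an assumption for illustration, not the exact shape of the Splunk - Get Search Results response.

```python
# Sample payload shaped like a Splunk results response (assumed shape).
sample = {
    "result": [
        {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.10", "action": "deny"}
    ]
}

def resolve(data, path):
    """Resolve a simple JSONPath like '$.result[0].src_ip' by hand.

    Handles only dotted keys and single numeric indexes -- enough to
    mirror the expressions above, not a full JSONPath implementation.
    """
    value = data
    for part in path.lstrip("$.").replace("]", "").split("."):
        if "[" in part:
            key, idx = part.split("[")
            value = value[key][int(idx)]
        else:
            value = value[part]
    return value

print(resolve(sample, "$.result[0].src_ip"))   # 10.0.0.5
print(resolve(sample, "$.result[0].action"))   # deny
```

In the workflow itself the JSONPath atomic action does this resolution for you; the sketch only shows which values the expressions select.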
Mapping the data
The JSONPath queries performed earlier are now available as Output Variables in the XDR - Analytics - Ingest Network Security Event atomic action. Map the parsed JSONPath variables to the Custom Security Event input variables. For example:
source_address = src_ip
destination_address = dst_ip
event_action = action
Note: You may need to perform minor transformations to format the data correctly before mapping. This can be done using a Python activity or using a combination of conditions and setting local variables.
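As a sketch of the kind of minor transformation the note describes, the snippet below normalizes a vendor-specific action value and converts an epoch timestamp before mapping. The input field values and the allowed output vocabulary are assumptions for illustration.

```python
from datetime import datetime, timezone

# Hypothetical raw values as they might come out of the JSONPath step.
raw_action = "blocked"      # vendor-specific wording (assumed)
raw_epoch = 1735689600      # event time in epoch seconds (assumed)

# Normalize the action to the allow/deny vocabulary used in mapping.
ACTION_MAP = {"blocked": "deny", "dropped": "deny", "permitted": "allow"}
event_action = ACTION_MAP.get(raw_action, raw_action)

# Convert epoch seconds to an ISO 8601 UTC timestamp.
event_time = datetime.fromtimestamp(raw_epoch, tz=timezone.utc).isoformat()

print(event_action)  # deny
print(event_time)    # 2025-01-01T00:00:00+00:00
```

In the workflow this logic could live in a Python activity, or be expressed with conditions and local variables as the note suggests.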
Testing your workflow and bringing it to production
- Validate the workflow in the Workflow Editor.
- Trigger the workflow manually with test events to confirm parsing and mapping.
- Check the ingested events in the Cisco XDR Data Warehouse.
- Associate the workflow with an Automation Rule (Schedule) so it runs automatically.
- Move from testing to production by enabling the rule and monitoring for successful ingestion.