Configure Outputs
This section guides you through the process of setting up output destinations for your NetFlow data. You can specify multiple destinations, each with its own format and data type. Customize the output destinations to meet your specific needs, whether it's sending data to a syslog server, an AWS S3 bucket, a Splunk instance, or another supported platform.
You may add up to sixteen output destinations, specifying the format and the kind of data to be sent to each destination.
Click the add symbol to create an output destination, then select the desired Output Type from the drop-down list.
Output Types
NFO supports the following types of outputs:
Type | Description |
---|---|
Syslog (UDP) | Indicates the destination where data is sent in syslog format |
Syslog (JSON) | Indicates the destination where data is sent in JSON format |
AWS S3 | Indicates the destination is AWS S3 buckets |
Disk | Indicates the destination is a disk |
Splunk HEC | Indicates that the destination is Splunk HEC. NFO sends data to Splunk HEC in key=value format |
Splunk Observability Metrics | Indicates that the destination is Splunk Observability Cloud (SignalFx) |
Azure Blob Storage Syslog | Indicates the destination is Azure Blob Storage in Syslog format |
Azure Blob Storage JSON | Indicates the destination is Azure Blob Storage in JSON format |
Azure Log Analytics Workspace | Indicates the destination is Microsoft Azure Log Analytics Workspace (Azure Monitor, Sentinel) |
Kafka Syslog | Indicates the destination is Kafka in Syslog format |
Kafka JSON | Indicates the destination is Kafka in JSON format |
OpenSearch | Indicates the destination is OpenSearch (e.g. Amazon OpenSearch Service) |
ClickHouse | Indicates that the destination is your ClickHouse database |
Repeater (UDP) | Indicates that flow data received by NFO should be retransmitted to that destination, e.g. your legacy NetFlow collector or another NFO instance |
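To illustrate the Repeater (UDP) concept, the sketch below forwards a received UDP datagram unchanged to another destination. This is not NFO code, just a minimal model of byte-for-byte retransmission; the function name and all addresses are placeholders.

```python
import socket

def repeat_once(sock_in: socket.socket, sock_out: socket.socket,
                forward_addr: tuple) -> int:
    """Receive one UDP datagram and retransmit it unchanged,
    the way a Repeater (UDP) output forwards raw flow packets.
    Illustrative sketch only; not part of NFO."""
    data, _src = sock_in.recvfrom(65535)   # one NetFlow/IPFIX/sFlow datagram
    sock_out.sendto(data, forward_addr)    # e.g. a legacy collector address
    return len(data)
```

Because the datagram is forwarded byte-for-byte, the downstream collector sees the same flow records the original exporter sent.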
Output Filters
If you have only one output, the Output filter is not applicable, so it will revert to All.
You can set filters for each output:
Output Filter | Description |
---|---|
All | Indicates the destination for all data generated by NFO, both by Modules and by Original NetFlow/IPFIX/sFlow one-to-one conversion |
Modules Output Only | Indicates the destination will receive data only generated by enabled NFO Modules |
Original NetFlow/IPFIX only | Indicates the destination for all NetFlow/IPFIX data, translated into syslog or JSON, one-to-one. NetFlow/IPFIX Options records from the original flow data are also translated one-to-one and sent to this output. Use this option to archive all underlying flow records NFO processes for forensics. This destination is typically Hadoop or another inexpensive storage, as the volume for this destination can be quite high |
Original sFlow only | Indicates the destination for sFlow data, translated into syslog or JSON, one-to-one. Use this option to archive all underlying sFlow records NFO processes for forensics. This destination is typically configured to send output to inexpensive syslog storage, as the volume for this destination can be quite high |
Module Filter
Here you can enter a comma-separated list of nfc_id values for the Modules to be included in this output destination.
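The expected syntax is simply IDs separated by commas, with whitespace around each ID ignored. A minimal sketch of parsing such a list (the helper name and the sample IDs are illustrative, not part of NFO):

```python
def parse_module_filter(value: str) -> list:
    """Split a comma-separated nfc_id list into trimmed, non-empty entries."""
    return [item.strip() for item in value.split(",") if item.strip()]

# The IDs below are placeholders, not real NFO module identifiers.
parse_module_filter("10050, 10067,10090")  # → ['10050', '10067', '10090']
```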
Preview
To preview your output, click the preview icon.
A pop-up window will appear, allowing you to specify preview parameters.
Preview parameter | Description |
---|---|
Capture filter | Use a regular expression to filter the specific output records you want to examine |
Buffer size (messages) | Specify the maximum number of records to capture |
Capture time | Set a time window to limit the preview to a specific timeframe |
Capture filter example: to preview messages with `src_ip=172.31.24.19`, use the following:

`src_ip=172\.31\.24\.19`
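The backslashes matter because the Capture filter is a regular expression: an unescaped `.` matches any single character, so escaping the dots restricts the match to the literal IP address. A quick Python illustration of the same pattern (Python's `re` module is used here only for demonstration; NFO's regex dialect may differ slightly):

```python
import re

strict = re.compile(r"src_ip=172\.31\.24\.19")
loose = re.compile(r"src_ip=172.31.24.19")   # unescaped dots

record = "src_ip=172.31.24.19 dst_ip=10.0.0.5"
assert strict.search(record)                  # matches the intended address
assert not strict.search("src_ip=172x31x24x19")
assert loose.search("src_ip=172x31x24x19")    # '.' matched 'x': too broad
```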
Once you've adjusted these settings, click Start.

Important! Wait for 30 seconds, then click Refresh.
You can view the captured data in the window.