The Common Event Format (CEF) is a popular log format used primarily in the context of security event logging. With CEF, disparate logs — including those generated by network activity, authentication attempts, and system-level actions — are standardized on a consistent schema with defined components, enabling holistic analysis within downstream security platforms.
With Edge Delta Security Data Pipelines, teams can enhance threat detection capabilities and operational visibility at scale by standardizing and enriching CEF logs before forwarding them downstream. Edge Delta’s CEF Pipeline Pack automates the entire process — integrating seamlessly into any Edge Delta Pipeline to streamline log parsing and structuring, without the need for manual configurations.
Once processed, the transformed CEF logs can be forwarded to any security analytics backend (e.g., Splunk, Sentinel, IBM QRadar), while a full copy of your raw logs can be routed to cost-effective storage for long-term retention.
Below, we’ll break down each of the five processing steps in the CEF Pack, and demonstrate what you can expect once it’s operational.
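For reference, a raw CEF message is a versioned, pipe-delimited header followed by an optional key-value extension. The illustrative samples below (values are hypothetical) show the two shapes the pack handles: a bare CEF message, and the same message wrapped in a syslog header.

CEF:0|Security|threatmanager|1.0|100|worm successfully stopped|10|src=10.0.0.1 dst=2.1.2.2 spt=1232
Sep 29 08:26:10 fw01 CEF:0|Security|threatmanager|1.0|100|worm successfully stopped|10|src=10.0.0.1 dst=2.1.2.2 spt=1232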
Processing Pathway: Route Syslog Entries
After initial ingestion into Edge Delta, logs move through the syslog_router Route node. This node applies regex-based conditional logic to differentiate logs containing syslog headers from those without, redirecting each to separate processing routes. Any logs that fail to match either condition fall into the unmatched route and are forwarded to the Other Logs outbound destination node.
- name: syslog_router
  type: route
  paths:
  - path: without_syslog
    condition: regex_match(item["body"], "^CEF:")
    exit_if_matched: true
  - path: with_syslog
    condition: regex_match(item["body"], "CEF:")
    exit_if_matched: true
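To make the branching concrete outside of YAML, here is a minimal Python sketch of the same decision logic. It illustrates the two regex conditions above; it is not Edge Delta's implementation.

import re

def route(body: str) -> str:
    # A log that starts with "CEF:" has no syslog header
    if re.match(r"^CEF:", body):
        return "without_syslog"
    # A log that merely contains "CEF:" carries a syslog header before the payload
    if re.search(r"CEF:", body):
        return "with_syslog"
    # Anything else falls through to the unmatched route
    return "unmatched"

route("CEF:0|Security|threatmanager|...")           # -> "without_syslog"
route("Sep 29 08:26:10 fw01 CEF:0|Security|...")    # -> "with_syslog"
route("plain syslog line with no CEF payload")      # -> "unmatched"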
Processing Pathway: Parse Syslog Headers
Logs with a syslog header move to the aptly named with_syslog_parsing node, which uses OpenTelemetry Transform Language (OTTL) statements to perform four processing steps:
1. The ExtractGrokPatterns function is used to parse the log message body into structured attributes. It applies a Grok pattern with named captures to extract specific components stored in the log body — such as the month, day, time, host, and remaining message content — which are then saved in the cache["grok"] dictionary for future access.
2. The log item’s timestamp field is then updated by combining the cached month, day, and time fields with the current year. The Concat function merges these components into a space-delimited string, which is parsed by the Time function using a predefined format. The resulting datetime object is converted to Unix time in milliseconds, and the final value is assigned to the timestamp field.
3. The resource["host.name"] attribute is set to the host value stored in cache["grok"]["host"].
4. The final expression captures the remainder of the log message following the host and assigns it to attributes["cef_body"].
- name: with_syslog_parsing
  type: ottl_transform
  statements: |-
    set(cache["grok"], ExtractGrokPatterns(Decode(body, "utf-8"), "%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{NOTSPACE:host} %{GREEDYDATA:rest}", true))
    set(timestamp, UnixMilli(Time(Concat([cache["grok"]["month"], cache["grok"]["day"], String(Year(Now())), cache["grok"]["time"]], " "), "%h %d %Y %H:%M:%S", "UTC")))
    set(resource["host.name"], cache["grok"]["host"])
    set(attributes["cef_body"], cache["grok"]["rest"])
Processing Pathway: Extract CEF Directly
Logs without syslog headers are routed to the without_syslog_parsing node, where an OTTL statement is used to place the log body into the cef_body attribute in preparation for entering the common_cef_parsing node.
- name: without_syslog_parsing
  type: ottl_transform
  statements: |-
    set(attributes["cef_body"], Decode(body, "utf-8"))
Processing Pathway: Common CEF Parsing
Both sets of logs flowing through the with_syslog_parsing and without_syslog_parsing nodes then converge into the common_cef_parsing node. This OTTL Transform node parses the CEF-formatted cef_body string by splitting it into structured components for more granular and effective log analysis:
- The cef_body string is split using the | delimiter, with each segment stored in the cache["cef"] array, effectively dissecting the CEF message into its constituent fields.
- The CEF version is extracted from cache["cef"][0] using the Substring function, converted to an integer, and then stored in attributes["cef_version"].
- Descriptive elements — such as product information, event class, and event severity — are then stored directly as fields in the log’s attributes.
- If the final segment (cache["cef"][7]) contains any additional key-value pairs, they’re parsed using Edge Delta’s custom EDXParseKeyValue OTTL function. It uses = as the key-value delimiter and spaces as the pair delimiter, converting the data into a structured map stored in attributes["cef_attributes"]. A Python sketch of this whole decomposition follows the configuration below.
- name: common_cef_parsing
  type: ottl_transform
  statements: |-
    set(cache["cef"], Split(attributes["cef_body"], "|"))
    set(attributes["cef_version"], Int(Substring(cache["cef"][0], 4, 1)))
    set(attributes["cef_device_vendor"], cache["cef"][1])
    set(attributes["cef_device_product"], cache["cef"][2])
    set(attributes["cef_device_version"], cache["cef"][3])
    set(attributes["cef_device_event_class_id"], cache["cef"][4])
    set(attributes["cef_name"], cache["cef"][5])
    set(attributes["cef_severity"], cache["cef"][6])
    set(attributes["cef_attributes"], EDXParseKeyValue(cache["cef"][7], "=", " ", true)) where cache["cef"][7] != ""
Processing Pathway: Cleanup Attributes
All remaining logs pass through the remove_keys node, which deletes the temporary processing field item["attributes"]["cef_body"] for efficiency.
- name: remove_keys
  type: generic_transform
  transformations:
  - field_path: item["attributes"]["cef_body"]
    operation: delete
Finally, processed logs are sent to CEF Logs, a compound_output path, where they exit the pack ready for routing to SIEMs, XDRs, and other downstream destinations.
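Putting it all together: feeding the sample syslog-wrapped message from earlier through the pack would yield a structured log along these lines. The exact envelope depends on your pipeline, so treat this as an illustration derived from the statements above.

timestamp: <Sep 29 08:26:10 of the current year, in Unix milliseconds>
resource:
  host.name: fw01
attributes:
  cef_version: 0
  cef_device_vendor: Security
  cef_device_product: threatmanager
  cef_device_version: "1.0"
  cef_device_event_class_id: "100"
  cef_name: worm successfully stopped
  cef_severity: "10"
  cef_attributes:
    src: 10.0.0.1
    dst: 2.1.2.2
    spt: "1232"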
See the CEF Pack in Action
To start taking advantage of the CEF Pack’s capabilities, you’ll need an existing pipeline in Edge Delta.
If you haven’t set up a pipeline yet, go to the Pipelines section, click on “New Fleet,” and choose either Edge Fleet or Cloud Fleet based on your hosting environment. Then, follow the setup instructions to complete the pipeline configuration.
Once your pipeline is active, open the Pipelines menu, select “Knowledge,” and navigate to “Packs.” Scroll down to find the CEF Pack and click “Add Pack.” This will move the pack to your library, which you can access anytime from the “Pipelines” menu under “Packs.”
To install the pack into an existing pipeline, return to your Pipelines dashboard, select the pipeline where you want to add the CEF Pack, and enter Edit Mode. In Edit Mode, click “Add Processor,” navigate to “Packs,” and choose the CEF Pack.
You can then rename the pack from “CEF Pack” to something else if preferred. Once ready, click “Save Changes” to apply the pack to your pipeline. Afterward, go back to the Pipeline Builder and drag and drop the initial connection from your CEF logs source into the pack.
To complete the setup, you’ll need to add destinations. Edge Delta Security Data Pipelines allow you to route your processed CEF logs to any SIEM — Microsoft Sentinel, CrowdStrike Falcon, and Splunk, to name a few — or any other downstream security destination. You can also route a copy of your raw CEF logs to any storage destination for compliance purposes or long-term analysis.

Get Started with the CEF Pack
Ready to try out the CEF Pack? Check out our documentation to learn more. And if you’re new to Edge Delta, explore our free playground to see how we provide a stronger, safer foundation for your security data.