TECHNOLOGY
Interflow
Normalized, Enriched Data
A Data Fusion engine that makes your telemetry more valuable, automatically. Interflow is a normalized, enriched data model that lets IT and security tools speak the same language, so you can detect and respond to every threat.
Why Interflow?
Raw logs from IT and security tools don't interoperate with each other. PCAP is too heavyweight for security analysis, and NetFlow alone is not enough. Interflow solves these problems with a normalized, enriched data model purpose-designed for security.
With Interflow, your security team can:
- Stop doing manual data munging – Interflow is produced automatically
- Reduce data volume – PCAP-to-Interflow data reduction can be up to two orders of magnitude
- Correlate across seemingly unrelated events – standard key values make correlation easy (see the sketch after this list)
- Interpret data quickly – easy-to-understand records reduce analyst training time
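To illustrate the correlation point above: once events from different tools share normalized key fields, joining them becomes a simple lookup. The Python sketch below uses hypothetical field names (not Interflow's actual schema) to correlate a firewall event with an EDR alert on a shared src_ip key.

# Hypothetical sketch: correlating events from different tools once they
# share normalized key fields. Field names are illustrative only.

firewall_events = [
    {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7", "dst_port": 443,
     "event_type": "traffic", "action": "allowed"},
]

edr_alerts = [
    {"src_ip": "10.0.0.5", "process": "powershell.exe",
     "event_type": "alert", "severity": "high"},
]

def correlate_by_key(events_a, events_b, key):
    """Pair up records from two sources that share the same value
    for a normalized key field."""
    index = {}
    for rec in events_a:
        index.setdefault(rec.get(key), []).append(rec)
    matches = []
    for rec in events_b:
        for other in index.get(rec.get(key), []):
            matches.append((other, rec))
    return matches

for fw, edr in correlate_by_key(firewall_events, edr_alerts, "src_ip"):
    print(f"Host {fw['src_ip']}: {edr['severity']} EDR alert "
          f"({edr['process']}) with outbound traffic to {fw['dst_ip']}")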

"Users can enhance their favorite EDR tools with full integration into an XDR platform, obtaining greater visibility."
Jon Oltsik
Senior Principal Analyst and ESG Fellow

"Stellar Cyber reduced our analysis expenses and enabled us to kill threats far more quickly."
Central IT Department
University of Zurich

"Stellar Cyber delivers built-in Network Detection & Response (NDR), Next Gen SIEM and Automated Response."
Rik Turner
Principal Analyst, Infrastructure Solutions

"Sportscar Performance XDR For A Family Sedan Budget!"
Gartner Peer Insights
How it Works
1. Data is collected from everywhere, from Integrations and Stellar Cyber Sensors.
2. Data is reduced and filtered, depending on the Integration or Sensor, to retain only relevant security information.
3. Data is enriched with Threat Intelligence and other event context, such as details about the users and assets involved.
4. Normalization forces source data into a standard data model, regardless of where it came from.
5. The resulting Interflow record is stored in Stellar Cyber's Data Lake for analysis.
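The flow above can be summarized in code. The Python sketch below is a simplified, hypothetical illustration of the collect, reduce/filter, enrich, normalize, and store steps; the field names, lookup tables, and the store_record helper are assumptions for illustration, not Stellar Cyber's actual implementation.

# Hypothetical end-to-end sketch of the pipeline described above:
# collect -> reduce/filter -> enrich -> normalize -> store.
# All field names and helpers are illustrative assumptions.

THREAT_INTEL = {"203.0.113.7": {"reputation": "malicious", "tags": ["c2"]}}
ASSET_DB = {"10.0.0.5": {"owner": "alice", "asset_type": "laptop"}}

def reduce_event(raw):
    """Drop events and fields with no security relevance."""
    if raw.get("action") == "heartbeat":   # filter out noise
        return None
    return {k: v for k, v in raw.items() if k != "debug_payload"}

def enrich(event):
    """Attach threat intelligence and asset/user context."""
    event["threat_intel"] = THREAT_INTEL.get(event.get("dst_ip"), {})
    event["asset"] = ASSET_DB.get(event.get("src_ip"), {})
    return event

def normalize(event, source):
    """Force the event into one standard record shape, whatever the source."""
    return {
        "source": source,
        "src_ip": event.get("src_ip"),
        "dst_ip": event.get("dst_ip"),
        "event_type": event.get("event_type", "unknown"),
        "threat_intel": event.get("threat_intel", {}),
        "asset": event.get("asset", {}),
    }

def store_record(record):
    """Stand-in for writing the finished record to a data lake."""
    print("stored:", record)

raw_events = [
    {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7",
     "event_type": "dns_query", "debug_payload": "..."},
    {"action": "heartbeat"},
]

for raw in raw_events:
    reduced = reduce_event(raw)
    if reduced is None:
        continue
    store_record(normalize(enrich(reduced), source="sensor"))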
360° Visibility
