Normalized, Enriched Data

A Data Fusion engine that makes your telemetry more valuable,
automatically. Interflow is a normalized, enriched data model
that allows IT and security tools to talk the same language, so
that you can detect and respond to every threat.

Why Interflow?

Raw logs from IT and security tools don’t interoperate with each
other. PCAP is too heavyweight for security analysis, and NetFlow
doesn’t carry enough context. Interflow solves these problems with
a normalized, enriched data model purpose-built for security.
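
To make the interoperability problem concrete, here is a small Python sketch of the same connection as two hypothetical tools might report it. The log formats and field names (src, local_ip, and so on) are invented for illustration and are not taken from any real product.

```python
import re

# Two hypothetical raw logs describing the same connection. Formats and
# field names are invented for illustration, not taken from real products.
firewall_syslog = "Oct 12 08:31:02 fw01 DENY src=10.0.0.5 dst=8.8.8.8 dport=53"
edr_json = {"timestamp": 1697099462, "local_ip": "10.0.0.5", "remote_ip": "8.8.8.8"}

# Without normalization, the same host hides behind different key names
# ("src" vs. "local_ip"), so a query written for one tool misses the other.
src_from_syslog = re.search(r"src=(\S+)", firewall_syslog).group(1)
src_from_edr = edr_json["local_ip"]
assert src_from_syslog == src_from_edr == "10.0.0.5"  # same host, two shapes
```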

With Interflow, your security team is able to:

  • Stop doing manual data munging – Interflow is produced automatically
  • Reduce data volume – PCAP-to-Interflow data reduction can be up to two orders of magnitude
  • Correlate across seemingly unrelated events – Standard key values make correlation easy (see the sketch after this list)
  • Interpret data easily – Reduce analyst training time with easy-to-understand data
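
As a sketch of the correlation point above, the example below groups events from two different tools by one shared key. The records and field names (srcip, msgtype) are assumptions made for the example, not the actual Interflow schema.

```python
from collections import defaultdict

# Illustrative only: these records and field names are assumptions for the
# example, not the actual Interflow schema.
firewall_event = {"srcip": "10.0.0.5", "dstip": "8.8.8.8", "msgtype": "traffic_denied"}
edr_event = {"srcip": "10.0.0.5", "process": "powershell.exe", "msgtype": "process_start"}

def correlate(events, key):
    """Group events from any tool by one shared, normalized key."""
    groups = defaultdict(list)
    for event in events:
        if key in event:
            groups[event[key]].append(event)
    return groups

# Both events land in the same bucket because the key name and value format
# match across tools -- no per-source parsing or field mapping is needed.
by_host = correlate([firewall_event, edr_event], key="srcip")
print(len(by_host["10.0.0.5"]))  # -> 2
```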

How it Works

1. Data is collected from everywhere, from Integrations and Stellar Cyber Sensors.
2. Data is reduced and filtered depending on the Integration and Sensor, to retain only relevant security information.
3. Data is enriched with Threat Intelligence and other event context, such as details about the users and assets involved.
4. Normalization forces the source data into a standard data model, regardless of where it came from.
5. The resulting Interflow record is stored in Stellar Cyber’s Data Lake for analysis.
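
To make the five steps concrete, here is a minimal Python sketch of the same collect, filter, enrich, normalize, and store flow. Every function, field name, and rule in it is an assumption made for illustration; it is not Stellar Cyber’s implementation.

```python
# A minimal sketch of the five-stage flow described above. Function names,
# field names, and the filter/enrichment rules are illustrative assumptions.

RELEVANT_TYPES = {"conn", "dns", "http"}       # assumed filter policy
THREAT_INTEL = {"203.0.113.7": "known_c2"}     # assumed intel feed

def collect():
    # Stage 1: data arrives from Integrations and Sensors (stubbed here).
    yield {"type": "conn", "src": "10.0.0.5", "dst": "203.0.113.7"}
    yield {"type": "heartbeat", "sensor": "dp-1"}  # operational noise

def reduce_and_filter(events):
    # Stage 2: keep only security-relevant records.
    return (e for e in events if e.get("type") in RELEVANT_TYPES)

def enrich(event):
    # Stage 3: attach threat intelligence and other event context.
    event["threat"] = THREAT_INTEL.get(event.get("dst"))
    return event

def normalize(event):
    # Stage 4: force the record into one standard key set.
    return {"srcip": event["src"], "dstip": event["dst"],
            "event_type": event["type"], "threat": event["threat"]}

def store(record, data_lake):
    # Stage 5: persist the normalized, enriched record for analysis.
    data_lake.append(record)

data_lake = []
for e in reduce_and_filter(collect()):
    store(normalize(enrich(e)), data_lake)
print(data_lake)  # one enriched, normalized record; the heartbeat was dropped
```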


Why Interflow is Critical for AI

Data is the fuel for AI. If data quality is poor, the resulting AI will perform poorly. If data
complexity is high, the resulting AI will struggle to scale. That’s why Interflow is critical for AI –
it ensures high-quality data with reduced complexity.