TECHNOLOGY

Interflow

Normalized, enriched data

A Data Fusion engine that makes your telemetry more valuable, automatically. Interflow is a normalized, enriched data model that lets IT and security tools speak the same language, so that you can detect and respond to every threat.


Why Interflow?

Raw logs from IT and security tools don't interoperate with each other.
PCAP is too heavyweight for security analysis; NetFlow doesn't carry enough detail. Interflow solves these problems with a normalized, enriched data model purpose-built for security.

With Interflow, your security team is able to:
1. Stop doing manual data munging – Interflow is produced automatically
2. Reduce data volume – PCAP-to-Interflow data reduction can be up to two orders of magnitude
3. Correlate across seemingly unrelated events – standard key values make correlation easy (see the sketch after this list)
4. Work with highly interpretable data – easy-to-understand records reduce analyst training time
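
For example, because every record shares standard key values, correlating events from different tools can be a simple key lookup. A minimal Python sketch, where the field names (srcip, dstip, event_type) are illustrative stand-ins, not Stellar Cyber's actual schema:

    # Hypothetical Interflow-style records from two different tools.
    firewall_event = {"srcip": "10.0.0.5", "dstip": "8.8.8.8", "event_type": "firewall_deny"}
    edr_event = {"srcip": "10.0.0.5", "hostname": "laptop-42", "event_type": "process_start"}

    def correlate(events, key):
        """Group events from any source by a shared standard key."""
        groups = {}
        for event in events:
            groups.setdefault(event.get(key), []).append(event)
        return groups

    # Both events land in the same bucket because they share "srcip".
    print(correlate([firewall_event, edr_event], "srcip")["10.0.0.5"])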

How It Works


Data is collected from everywhere, via Integrations and Stellar Cyber Sensors.
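
A hedged sketch of what such a collection layer can look like; the Source class and its fetch method are assumptions for illustration, not a real Stellar Cyber API:

    class Source:
        """Stand-in for an Integration or a Sensor."""
        def __init__(self, name, records):
            self.name = name
            self.records = records

        def fetch(self):
            return self.records

    def collect(sources):
        """Merge raw telemetry from every source into one stream."""
        for source in sources:
            for raw in source.fetch():
                yield {"source": source.name, "raw": raw}

    stream = collect([Source("aws_integration", ["login ok"]),
                      Source("network_sensor", ["tcp syn 10.0.0.5"])])
    print(list(stream))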


Data is reduced and filtered depending on the Integration or Sensor, retaining only security-relevant information.
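
One way to picture this reduction step, as a sketch: drop records that carry no security signal before anything is stored. The event_type values below are assumed purely for illustration:

    # Keep only records a security analyst would care about.
    SECURITY_RELEVANT = {"firewall_deny", "malware_alert", "auth_failure"}

    def reduce_stream(records):
        for record in records:
            if record.get("event_type") in SECURITY_RELEVANT:
                yield record  # retained; everything else is dropped, shrinking volume

    events = [{"event_type": "heartbeat"}, {"event_type": "auth_failure"}]
    print(list(reduce_stream(events)))  # only the auth_failure survives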

Normalization forces source data into a standard data model, regardless of where it came from.
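
Normalization can be pictured as a per-source field mapping into a single schema. A minimal sketch; both the mappings and the target field names are assumptions, not the published Interflow schema:

    # Each source names the same concept differently; map them to one schema.
    FIELD_MAPS = {
        "firewall": {"src": "srcip", "dst": "dstip"},
        "zeek": {"id.orig_h": "srcip", "id.resp_h": "dstip"},
    }

    def normalize(source, raw):
        """Rewrite source-specific keys into the standard data model."""
        mapping = FIELD_MAPS[source]
        return {mapping.get(key, key): value for key, value in raw.items()}

    print(normalize("zeek", {"id.orig_h": "10.0.0.5", "id.resp_h": "8.8.8.8"}))
    # -> {'srcip': '10.0.0.5', 'dstip': '8.8.8.8'}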


Data is enriched with Threat Intelligence and other event context, such as details about the users and assets involved.
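
Enrichment can then be a series of lookups keyed on those standard fields. In this sketch, the threat-intelligence feed and asset inventory are placeholder tables, not real data sources:

    THREAT_INTEL = {"8.8.8.8": {"reputation": "benign"}}
    ASSETS = {"10.0.0.5": {"owner": "alice", "host": "laptop-42"}}

    def enrich(record):
        """Attach threat intelligence and user/asset context to a record."""
        record["ti"] = THREAT_INTEL.get(record.get("dstip"), {})
        record["asset"] = ASSETS.get(record.get("srcip"), {})
        return record

    print(enrich({"srcip": "10.0.0.5", "dstip": "8.8.8.8"}))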


The resulting Interflow record is stored in Stellar Cyber's Data Lake for analysis.
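
Put together, the stored record might look like the output below: compact, normalized, and enriched, ready to be queried. The fields are illustrative only:

    import json

    interflow_record = {
        "srcip": "10.0.0.5",
        "dstip": "8.8.8.8",
        "event_type": "firewall_deny",
        "ti": {"reputation": "benign"},
        "asset": {"owner": "alice", "host": "laptop-42"},
    }
    print(json.dumps(interflow_record, indent=2))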


360° Visibility


Why Interflow Is Critical For AI

Data is the fuel for AI. If data quality is poor, the resulting AI performs poorly. If data complexity is high, the resulting AI struggles to scale. That's why Interflow is critical for AI: it ensures quality data with reduced complexity.