Overview

NetFlow is a network protocol that records metadata about IP traffic flows between devices — source IP, destination IP, ports, protocols, and byte counts. By importing NetFlow capture files into Clarity, you can augment your CMDB relationships with real, observed communication patterns.
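To make those record fields concrete, here is a minimal sketch of decoding one NetFlow v5 flow record with Python's struct module. The 48-byte v5 record layout is the standard Cisco export format, but the sample bytes are synthetic and the field selection is illustrative; this is not Clarity's parser.

```python
import struct
import socket

# NetFlow v5 flow record layout (48 bytes, big-endian): addresses, interface
# indices, packet/byte counters, timestamps, ports, flags, protocol, AS/mask.
V5_RECORD = struct.Struct("!IIIHHIIIIHHBBBBHHBBH")

def parse_v5_record(data: bytes) -> dict:
    """Decode one 48-byte NetFlow v5 record into the fields that matter for
    dependency analysis: endpoints, ports, protocol, and byte count."""
    f = V5_RECORD.unpack(data)
    return {
        "src_ip": socket.inet_ntoa(struct.pack("!I", f[0])),
        "dst_ip": socket.inet_ntoa(struct.pack("!I", f[1])),
        "packets": f[5],
        "bytes": f[6],
        "src_port": f[9],
        "dst_port": f[10],
        "protocol": f[13],  # 6 = TCP, 17 = UDP
    }

# Synthetic record: 10.0.0.5:51000 -> 10.0.0.9:1433 over TCP, 61440 bytes
sample = V5_RECORD.pack(
    int.from_bytes(socket.inet_aton("10.0.0.5"), "big"),
    int.from_bytes(socket.inet_aton("10.0.0.9"), "big"),
    0, 1, 2,            # next hop, input/output ifIndex
    42, 61440,          # packets, octets
    0, 0,               # first/last sysuptime
    51000, 1433,        # src/dst port
    0, 0x18, 6, 0,      # pad, TCP flags, protocol, ToS
    0, 0, 24, 24, 0,    # src/dst AS, src/dst masks, pad
)
flow = parse_v5_record(sample)
print(flow["src_ip"], "->", flow["dst_ip"], flow["bytes"], "bytes")
```

The same five fields (endpoints, ports, protocol, bytes) are what drive every use case described below.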

This is particularly valuable during the Discovery phase when you need to understand which devices talk to each other, identify undocumented dependencies, and validate that move groups don't break critical traffic paths.

When to Use NetFlow

Dependency Discovery
Identify real communication paths between servers that aren't documented in the CMDB — critical for safe move group planning.
Move Group Validation
Verify that planned move groups don't separate devices that actively communicate at high volume.
Application Mapping Enrichment
Feed traffic data into Application Mapping graphs to visualise actual network relationships alongside CMDB-defined ones.
Firewall Rule Planning
Understand which ports and protocols are in active use to plan firewall rules for the target environment.
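The move group validation use case above boils down to a simple check: given observed flows and a device-to-group assignment, flag high-volume pairs that a grouping would split. The flow dictionary shape and the byte threshold here are illustrative assumptions, not Clarity's API.

```python
from collections import defaultdict

def cross_group_flows(flows, group_of, threshold_bytes=1_000_000):
    """Return (src, dst, total_bytes) pairs where the two endpoints sit in
    different move groups and exchanged more than threshold_bytes."""
    totals = defaultdict(int)
    for f in flows:
        totals[(f["src_ip"], f["dst_ip"])] += f["bytes"]
    violations = [
        (src, dst, total)
        for (src, dst), total in totals.items()
        if group_of.get(src) and group_of.get(dst)
        and group_of[src] != group_of[dst]
        and total > threshold_bytes
    ]
    return sorted(violations, key=lambda v: -v[2])

flows = [
    {"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "bytes": 900_000},
    {"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "bytes": 400_000},
    {"src_ip": "10.0.0.7", "dst_ip": "10.0.0.9", "bytes": 50_000},
]
groups = {"10.0.0.5": "Wave 3", "10.0.0.7": "Wave 3", "10.0.0.9": "Wave 4"}
print(cross_group_flows(flows, groups))
```

Here the 10.0.0.5 to 10.0.0.9 pair (1.3 MB total) crosses the Wave 3 / Wave 4 boundary and would be flagged; the low-volume pair is ignored.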

Prerequisites

Before you begin

You need NetFlow capture files exported from your network infrastructure. Clarity supports both NetFlow v5 and v9 binary formats.

  • NetFlow capture files (.nfcap). Exported from routers, switches, or dedicated NetFlow collectors (e.g. nfdump, SolarWinds, PRTG).
  • Devices already in CMDB. NetFlow records are matched to CMDB devices by IP address. Import your device inventory first for best results.
  • Administrator role. NetFlow import requires admin-level access.
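Because matching is purely IP-based, the join between flow records and CMDB devices can be thought of as a dictionary lookup from flow endpoints to device records. A minimal sketch of that idea follows; the device record fields and function name are hypothetical, not Clarity's data model.

```python
def match_flows_to_devices(flows, devices):
    """Attach CMDB device names to flow endpoints by IP address, and
    collect the IPs that matched no device (these surface as warnings)."""
    by_ip = {d["ip"]: d["name"] for d in devices if d.get("ip")}
    matched, unmatched_ips = [], set()
    for f in flows:
        src, dst = by_ip.get(f["src_ip"]), by_ip.get(f["dst_ip"])
        if src and dst:
            matched.append((src, dst, f["bytes"]))
        else:
            unmatched_ips.update(
                ip for ip in (f["src_ip"], f["dst_ip"]) if ip not in by_ip
            )
    return matched, unmatched_ips

devices = [{"name": "app01", "ip": "10.0.0.5"}, {"name": "db01", "ip": "10.0.0.9"}]
flows = [
    {"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "bytes": 4096},
    {"src_ip": "10.0.0.5", "dst_ip": "172.16.0.1", "bytes": 512},
]
matched, misses = match_flows_to_devices(flows, devices)
print(matched)  # the app01 -> db01 flow correlates
print(misses)   # 172.16.0.1 has no CMDB device
```

This is why importing the device inventory first matters: a device with no IP in the CMDB can never appear in `by_ip`, so its flows are unmatchable.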

Importing NetFlow Data

NetFlow imports are managed as named job definitions. Navigate to Integrations → NetFlow to see the job table, then follow the steps below.

1. Create a new job

Click the New button in the toolbar. The editor modal opens with two fields: a Name for the job (e.g. DC1 Week 15) and a NetFlow Binary File upload control.

2. Upload your NetFlow file

Click the file picker and select a NetFlow binary capture file (.nfcap, .nfcapd, .bin, or .dat). The file uploads immediately — when you see ✓ Ready next to the filename, the file is staged and ready.

3. Save the job

Click Create. The job definition is saved and appears as a row in the DataTable. You can edit the job later to replace the staged file at any time.

4. Run the import

Click the ▶ Run button on the job row. The button is enabled only when a file has been staged. A progress modal opens and streams live updates via SignalR — do not close the browser tab during processing.

5. Review results

When processing completes, the progress modal shows a final summary (records created, updated, warnings, errors, time taken). The job row updates with the Last Run timestamp and a Last Result badge — green for Success, red for Failure. Check Import History for the full audit trail.

Supported Formats

  • NetFlow v5 (.nfcap, .nfcapd, .bin, .dat) — the most common format. Fixed-length records with source/destination IP, ports, protocol, and byte counts.
  • NetFlow v9 (.nfcap, .nfcapd, .bin, .dat) — template-based format with extensible fields. Supports IPv6 and additional metadata.
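One useful detail behind the v5/v9 distinction: both export packet formats open with a 16-bit big-endian version field, so raw export data can be sanity-checked by reading just the first two bytes. Note that nfcapd capture files wrap records in nfdump's own file format, so this sketch applies to raw export packets and is illustrative only.

```python
import struct

def netflow_version(packet: bytes) -> int:
    """Read the 16-bit big-endian version field that opens every
    NetFlow export packet header (5 for v5, 9 for v9)."""
    if len(packet) < 2:
        raise ValueError("truncated packet")
    (version,) = struct.unpack("!H", packet[:2])
    return version

# Synthetic headers: version followed by the record/flowset count field
print(netflow_version(struct.pack("!HH", 9, 2)))   # a v9 packet
print(netflow_version(struct.pack("!HH", 5, 30)))  # a v5 packet
```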

Import History

Every NetFlow run — whether it succeeds or fails — is written to a shared integration history log. To view it, go to Integrations (the main integrations page) and click the history icon on the NetFlow card. This opens the Import History page filtered to NetFlow runs.

Each history entry shows:

  • Job Name — the name of the job definition that was run.
  • Date — when the run was triggered.
  • Result — Success or Failure badge.
  • Success / Warning / Error counts — flow record processing totals.
  • Time Taken — processing duration in seconds.
  • Run By — the user who triggered the run.

At-a-glance status on the job row

The Last Run and Last Result columns on the job DataTable show the outcome of the most recent run inline — so you can see at a glance whether a job needs re-running without opening the full history.

Example Workflow

Real-world example
Discovering hidden database dependencies before a wave migration

A migration architect is planning Wave 3, which includes 40 application servers. The CMDB shows known database connections, but she suspects there are undocumented dependencies — batch jobs, reporting queries, and legacy integrations that aren't captured in the CMDB.

She exports a week's worth of NetFlow data from the data centre switches, uploads the .nfcap files to Clarity, and navigates to Application Mapping. The force-directed graph now shows traffic-based edges alongside CMDB relationships — revealing three servers that communicate heavily with a database server not currently in Wave 3.

She adds the database server to Wave 3 and avoids a potential cut-over failure.

Tips

Import NetFlow data before building move groups

Traffic data reveals dependencies that static CMDB records miss. Importing NetFlow before you define move groups ensures your groupings respect actual communication patterns.

  • Capture at least one week of traffic. Shorter capture windows may miss batch jobs, weekly reports, or infrequent integration patterns.
  • Ensure CMDB devices have IP addresses populated. NetFlow matching works by IP address. Devices without IPs in the CMDB won't be matched to flow records.
  • Re-import periodically. Network traffic patterns change. Re-importing fresh NetFlow data before each wave ensures you're working with current dependencies.

Common Mistakes & Troubleshooting

  • High warning count after import. Warnings typically indicate flow records with IP addresses that don't match any CMDB device. This is expected for external traffic (internet, CDN, cloud services). Focus on internal IP mismatches — these may indicate devices missing from your CMDB.
  • No relationships appearing in Application Mapping. Verify that the source and destination IPs in the NetFlow data match IP addresses on CMDB devices. If devices were imported without IPs, the flow records cannot be correlated.
  • Unsupported file format. Ensure the file is a valid .nfcap binary capture, not a CSV or text-based export. The parser expects the native NetFlow binary format.
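The warning triage described above (ignore external addresses, investigate internal ones) can be automated with Python's standard ipaddress module. A minimal sketch, assuming you have collected the set of unmatched IPs from an import run:

```python
import ipaddress

def triage_unmatched(ips):
    """Split unmatched flow IPs into internal (private-range) addresses,
    which may be devices missing from the CMDB, and external addresses,
    which are usually expected (internet, CDN, cloud services)."""
    internal, external = [], []
    for ip in ips:
        (internal if ipaddress.ip_address(ip).is_private else external).append(ip)
    return sorted(internal), sorted(external)

unmatched = ["10.0.3.44", "8.8.8.8", "192.168.10.2", "151.101.1.69"]
internal, external = triage_unmatched(unmatched)
print("investigate:", internal)  # private-range IPs worth adding to the CMDB
print("expected:", external)     # public IPs, likely external traffic
```

Anything in the internal list is a candidate for a missing CMDB entry; the external list can normally be disregarded.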