
Data Flow Fields


1. Description

  • What it’s for: A narrative overview of this data flow — what data moves, between which components, and why.

  • What to include:

    • The source and destination of the data.
    • What the data represents and why it needs to flow.
    • Any key processing or transformation at a high level (detail goes in the Data Mapping tab).
    • The business process or event that triggers or relies on this flow.
  • Example: "Supplier invoice data received from the EDI gateway is validated, enriched with purchase order references, and loaded into the ERP Accounts Payable module. This flow supports the automated invoice matching process and replaces the current manual email-and-spreadsheet process."


2. Sizing

  • What it’s for: The volume characteristics of this data flow — how much data is involved.

  • What to include:

    • Number of records per run or per day.
    • File sizes or payload sizes if applicable.
    • Peak vs average volumes if they differ significantly.
    • Any growth expectations over the life of the solution.
  • Example: "Average 200 invoices per day; peak 1,500 at month-end. Each invoice record is approximately 2KB. Expected 15% annual growth in invoice volume."
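The sizing figures in the example above can be sanity-checked with a quick back-of-envelope calculation. This sketch uses only the numbers from the example; the 3-year projection horizon is an assumption added for illustration.

```python
# Back-of-envelope sizing for the example invoice flow.
AVG_INVOICES_PER_DAY = 200      # average volume from the example
PEAK_INVOICES_PER_DAY = 1_500   # month-end peak from the example
RECORD_SIZE_KB = 2              # approximate size per invoice record
ANNUAL_GROWTH = 0.15            # 15% expected annual growth

avg_daily_kb = AVG_INVOICES_PER_DAY * RECORD_SIZE_KB    # typical daily payload
peak_daily_kb = PEAK_INVOICES_PER_DAY * RECORD_SIZE_KB  # month-end daily payload
# Assumed horizon: average daily volume after 3 years of compound growth.
year3_avg = AVG_INVOICES_PER_DAY * (1 + ANNUAL_GROWTH) ** 3

print(avg_daily_kb, "KB/day average;", peak_daily_kb, "KB/day at peak;",
      round(year3_avg), "invoices/day in year 3")
```

Writing the numbers down this way makes it easy to spot when a peak figure, not the average, should drive capacity decisions.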


3. Frequency

  • What it’s for: How often this data flow occurs.

  • What to include:

    • The trigger: scheduled (include interval), event-driven, real-time, or on-demand.
    • For scheduled flows: the schedule and any blackout windows.
    • Any latency requirements (e.g. “must complete within 30 minutes of trigger”).
  • Example: "Triggered by EDI gateway on receipt of each invoice batch; typically 4–6 batches per day during business hours. Month-end batches may arrive outside business hours and must be processed within 2 hours of receipt."
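The three fields above describe one data flow record. As a minimal sketch, the structure might look like the following; this is a hypothetical schema for illustration, not the metamodel's actual storage format, and the field values are condensed from the examples above.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """Hypothetical record mirroring the three documented fields."""
    name: str
    description: str  # narrative: source, destination, purpose, trigger
    sizing: str       # volumes: records/day, payload size, peaks, growth
    frequency: str    # trigger type, schedule, latency requirements

# Populated from the worked example in this section.
invoice_flow = DataFlow(
    name="Supplier invoice ingestion",
    description="EDI gateway to ERP Accounts Payable; supports automated "
                "invoice matching, replacing a manual spreadsheet process.",
    sizing="Avg 200 invoices/day, peak 1,500 at month-end, ~2KB each, "
           "15% annual growth.",
    frequency="Event-driven on batch receipt; 4-6 batches/day; month-end "
              "batches must complete within 2 hours of receipt.",
)
print(invoice_flow.name)
```

Keeping the three fields distinct, rather than folding sizing and frequency into the description, makes them easier to review and compare across flows.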


Relationships

Business Traceability

Relationship | What to link
Supports Business Requirement | Business Requirements that this data flow helps to satisfy
Supports Non-Functional Requirement | Non-Functional Requirements (throughput, latency, data retention) that this flow addresses
Supports Transition Requirement | Transition Requirements that this flow helps to deliver (e.g. a migration flow)
Supports Business Process | Business Processes that this data flow supports or enables
Supports Business Scenario | Business Scenarios in which this data flow participates
Supports Business Outcome | Business Outcomes that this data flow contributes to
Implements Business Rule | Business Rules that this data flow enforces (e.g. data residency, transformation rules)
Implements Business Reference | Standards, policies, or regulatory documents that govern this data flow

Data and Information

Relationship | What to link
Uses Business Information | Business Information items that this flow carries or transforms
Uses Data Set | Data Sets (the structured data schemas) that are involved in this flow
Uses Data Store | Data Stores that this flow reads from or writes to

Technical Components

Relationship | What to link
Uses API | APIs through which this data flow is implemented
Uses Stream | Message queues or event streams used by this flow
Uses Service | Backend services that participate in this flow
Uses Technology | Technologies used to implement this flow (e.g. ETL tool, messaging platform)

Quality Attributes

Relationship | What to link
Uses Availability | Availability mechanisms applied to this flow (e.g. retry logic, failover)
Uses Recoverability | Recovery mechanisms in place if this flow fails (e.g. dead-letter queues, reprocessing)
Uses Performance | Performance mechanisms and targets for this flow
Uses Security | Security controls applied to data in transit (encryption, access control, audit logging)
Uses Deployability | Deployment patterns relevant to this flow
Uses Observability | Observability mechanisms (logging, monitoring, alerting) applied to this flow

Acceptance and Testing

Relationship | What to link
Has Acceptance Criteria | Business-level criteria for verifying this data flow works as intended
Has Implementation Acceptance Criteria | Technical criteria for verifying the implementation of this flow
Has Test Data | Test data sets used to test this flow

RAID and Work

Relationship | What to link
Has Assumption | Assumptions made about this data flow (e.g. assumed availability of source data, assumed API contract)
Has Risk | Risks associated with this flow (e.g. data quality risk, volume risk, latency risk)
Has Issue | Known issues affecting the definition or implementation of this flow
Has Task | Tasks assigned to this flow

Design

Relationship | What to link
Supports Design Decision | Design Decisions that are informed by or affect this data flow
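Each relationship above is a typed link from the data flow to another element in the model. As a minimal sketch under assumed names (the `Link` type and the sample targets are hypothetical, though the relationship names come from the tables above), the traceability for the invoice flow might be recorded like this:

```python
from dataclasses import dataclass

@dataclass
class Link:
    relationship: str  # relationship name, e.g. "Supports Business Process"
    target: str        # name of the linked model element

# Hypothetical traceability links for the example invoice flow.
links = [
    Link("Supports Business Process", "Invoice matching"),
    Link("Uses Data Store", "ERP Accounts Payable"),
    Link("Uses Stream", "EDI invoice batch queue"),
    Link("Has Risk", "Month-end volume spike"),
]

# Group targets by relationship type for a quick traceability summary.
by_type: dict[str, list[str]] = {}
for link in links:
    by_type.setdefault(link.relationship, []).append(link.target)
```

Grouping by relationship type makes gaps obvious, for example a flow with no linked Risks or no Acceptance Criteria.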