
EIGHT SOURCES.
ONE TRUTH.

A financial services firm was producing contradictory board reports from the same underlying data. Eight source systems, no single source of truth, three days to close the books. We fixed it in 14 weeks.

3 days → 4 hrs
Board reporting time
0
Data discrepancies post-launch
8 → 1
Source systems unified

The problem

The client operated eight data sources feeding three separate reporting environments. Each had its own extract logic, its own definition of key business metrics, and its own maintenance owner. When the board requested a consolidated performance report, the finance team needed three days to reconcile the numbers — and still regularly produced reports where the revenue figure on page 2 did not match the revenue figure on page 7.

The problem had been building for four years. Each time a new data requirement emerged, the path of least resistance was to add another extract or another reporting table rather than to fix the underlying architecture. By the time we engaged, the team had lost track of which figures were authoritative, and two members of the analytics team spent the majority of their time reconciling numbers rather than delivering analysis.

The business impact went beyond reporting delay. The lack of a trusted source of truth was eroding executive confidence in data-driven decisions. When the numbers are unreliable, the default is to trust instinct — and the analytics investment stops paying for itself.

What was broken

Our assessment phase identified four root causes:

  • No canonical business metric definitions. Revenue, margin, and headcount were each defined differently across three systems, and none of the definitions was documented anywhere.

  • Extract-on-extract architecture. Reports pulled from transformed extracts rather than from source data, so when an upstream source schema changed, the breakage stayed invisible until report figures went wrong.

  • No medallion separation. Raw, cleansed, and aggregated data lived in the same layer with no governance between them. Analysts were joining directly on partially transformed tables.

  • Ownership gaps. No one owned the data platform architecture. The DBA owned the SQL Server instances. The BI team owned the Tableau layer. No one owned the middle.

What we built

We designed and delivered a three-layer Azure data platform using a medallion architecture: a bronze layer for raw ingestion from all eight sources, a silver layer for validated, cleaned, and conformed data, and a gold layer for business-ready aggregates consumed by the Tableau reporting layer.

Azure Data Factory handled ingestion from all eight sources with full lineage metadata. dbt managed all transformation logic in version-controlled SQL with test coverage on every critical metric definition. Azure Synapse Analytics served as the semantic layer for Tableau. Every metric in the gold layer had a single canonical definition, documented in the code and verified by automated tests that ran with every pipeline execution.
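As a simplified sketch of the kind of automated metric check described above (not the client's actual code — the table shapes, field names, and tolerance are illustrative assumptions), a reconciliation test that runs with every pipeline execution might look like this:

```python
# Hypothetical gate run after each pipeline execution: the canonical
# gold-layer revenue aggregate must reconcile with the sum of the
# validated silver-layer line items it was built from.

def reconcile_revenue(silver_line_items, gold_monthly, tolerance=0.01):
    """Return (ok, diff): whether silver detail matches the gold aggregate."""
    silver_total = sum(row["amount"] for row in silver_line_items)
    gold_total = sum(row["revenue_amount"] for row in gold_monthly)
    diff = abs(silver_total - gold_total)
    return diff <= tolerance, diff

# Example run: detail rows sum to the published aggregate, so the
# pipeline would pass this gate.
silver = [{"amount": 120.0}, {"amount": 80.0}]
gold = [{"revenue_amount": 200.0}]
ok, diff = reconcile_revenue(silver, gold)
```

In the real platform this logic lived in dbt tests against the warehouse rather than in application code, but the principle is the same: one definition, checked on every run, so a discrepancy fails the pipeline instead of reaching a board report.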

The Tableau reporting layer was rebuilt with certified data sources pointing to the gold layer. Dashboards that had previously taken 45–90 seconds to load now loaded in under 6 seconds. Board reporting moved from a 3-day manual reconciliation process to a 4-hour automated refresh-and-review workflow.

The outcome

The platform went live at week 14 with zero data loss and no service disruption to existing reporting. In the 12 months following launch, there were zero data discrepancy incidents — compared to an average of 2–3 per board cycle previously.

The two analytics team members who had spent most of their time reconciling numbers shifted to product analytics and customer analysis work. The executive team began using the Tableau dashboards in board meetings rather than producing separate static reports.

The platform was sized for 5× the current data volume with no architectural changes required.

Related

  • Healthcare Tableau Migration Case Study
  • Azure Cost Reduction Case Study
  • Data Architecture Consulting
  • Azure Cloud Engineering
  • Data Architect New York
  • What is a Data Architecture Audit?