2 Openings · Full-time · Remote — US or Australia

Data Engineer

We are hiring two data engineers to join our delivery team. You will build and maintain production-grade data pipelines for enterprise clients — taking architecture specifications and turning them into reliable, maintainable systems that actually run in prod. No fluff. Real pipelines. Real stakes.


What You'll Do

  • Design, build, and maintain production data pipelines for enterprise clients across ingestion, transformation, and delivery layers
  • Implement ELT/ETL workflows using Azure Data Factory, dbt, and Databricks on Azure — handling scale, performance, and reliability requirements
  • Collaborate with the data architect to translate architectural designs into working, maintainable pipeline code
  • Write clean, well-structured Python and SQL — code that engineers can maintain and extend after you
  • Monitor pipeline health, implement alerting, and own incident response for data reliability issues
  • Work directly with client data and engineering teams to understand source systems, data contracts, and downstream requirements
  • Contribute to internal engineering standards, code review processes, and shared tooling

What You'll Need

  • 4+ years of experience building production data pipelines in a professional engineering role
  • Strong Azure data engineering experience: Azure Data Factory, Azure Data Lake Storage, Databricks, and Synapse Analytics
  • Proficient in Python (PySpark and pandas) and SQL at a production level
  • Hands-on experience with dbt for data transformation in a cloud lakehouse or warehouse environment
  • Understanding of data modeling patterns — star schema, medallion architecture, and data vault
  • Experience working in client-facing or professional services environments is a plus
  • Comfort navigating ambiguity in early-stage client projects where requirements evolve

Nice to Have

  • Experience with Apache Kafka or Azure Event Hubs for real-time data pipelines
  • Familiarity with infrastructure-as-code tools (Terraform, Bicep) for data platform provisioning
  • Tableau or Power BI development experience
  • AWS or GCP data engineering background
  • Data quality tooling (Great Expectations, dbt tests, Soda)

What We Offer

  • Fully remote — work from anywhere in the US or Australia
  • Competitive mid-to-senior compensation (discussed at offer stage)
  • Work on real enterprise environments — not toy data sets
  • Tight-knit team with strong architectural leadership above and below
  • Exposure to a wide range of industries and data environments

Ready to apply?

Send your CV and work samples to hello@dataarchitect.co with subject line: Data Engineer Application

