Overview
We're working with a client looking for a strong Azure Data Architect / Engineer to help shape and build out their data platform.
It's a mix of hands-on engineering and higher-level design, with a big focus on modern tooling across Azure and Microsoft Fabric. They're moving towards a more unified, lakehouse-style setup, so it's a good opportunity to work on something fairly greenfield rather than just maintaining legacy pipelines.
What you'll be doing
Designing and building data pipelines across Azure and Fabric
Working on lakehouse architecture (Fabric / Data Lake)
Getting involved in data modelling for reporting and analytics
Helping move existing solutions into a more modern Azure/Fabric setup
Working closely with analysts and business stakeholders to understand requirements
Keeping an eye on performance, cost, and general optimisation
Inputting into architecture decisions (especially if you're more senior)
What they're looking for
Solid experience across Azure data services (ADF, Synapse, Data Lake, Azure SQL)
Some exposure to or interest in Microsoft Fabric (OneLake, pipelines, warehouse, etc.)
Strong SQL and data modelling fundamentals
Experience building ETL/ELT pipelines end-to-end
Experience with Python and/or Spark (Databricks or Fabric) is a big plus
Comfortable working in a collaborative, non-siloed environment
If you're interested, please reach out.