Data Engineer

Salary
£500 Per Day
Location
Leeds, United Kingdom
Type
Contract
Workplace
Hybrid
Published
Jan 29, 2026
Ref
168075
Location: Leeds, United Kingdom (Hybrid)

Duration: 6 months

Rate: Up to £500 per day (DOE)

IR35 Status: Outside IR35

We are looking for a highly skilled Senior Data Engineer to join our client's team on an initial 3-month contract to deliver a Microsoft Fabric proof of concept (POC). The position is Outside IR35. You will build and optimise data pipelines and architecture, supporting data transformation, integration, and delivery. The role is ideal for someone with a strong analytical mindset, a deep understanding of data engineering practices, and the ability to adapt to complex client needs.

Key Responsibilities:
  • Design, develop and maintain scalable and high-performance data pipelines for structured and unstructured data.
  • Implement data integration, extraction, transformation, and loading processes using Apache Spark and Python.
  • Develop and maintain dataset documentation and data modelling standards.
  • Work with stakeholders to understand business requirements and translate them into technical data solutions.
  • Ensure system performance through query optimisation, partitioning, and indexing strategies.
  • Contribute to the development and deployment of Power BI dashboards and reports, ensuring appropriate data access and Row-Level Security.
  • Follow DevOps and CI/CD practices, maintaining source control using Git and implementing pull-request workflows.

Required Skills and Experience:
  • Strong proficiency in SQL, with deep knowledge of indexing, data partitioning, and performance tuning for large datasets.
  • Proven recent experience working with Microsoft Fabric (essential).
  • Proven expertise in Python with a focus on data libraries such as Pandas, PySpark, and PyArrow.
  • Comprehensive experience working with Apache Spark, including structured streaming, batch processing, and Delta Lake architecture.
  • Advanced understanding of Power BI visualisation tools, including DAX, data modelling best practices, and implementation of Row-Level Security.
  • Hands-on experience with cloud platforms, preferably Azure, including Azure Data Factory, Azure Data Lake Storage Gen2, Synapse Analytics, and Databricks. Knowledge of AWS or GCP is also acceptable.
  • Experience using version control systems such as Git and applying CI/CD pipelines in data engineering projects.
This is an exciting opportunity to work on cutting-edge data engineering projects with significant impact. If you're a problem solver with a passion for data engineering and want to work in a dynamic, collaborative environment, we'd love to hear from you.

Apply

Gravitas Recruitment Group
© Gravitas Group 2026