Data Engineer

Salary
£350 per day
Location
London, United Kingdom
Type
Contract
Workplace
Remote
Published
Mar 18, 2026
Ref
169542


We are looking for a Data Engineer to join a consulting engagement focused on designing, building, and optimising robust data platforms and pipelines. The company fosters a collaborative environment where technical expertise and problem-solving are valued, with the goal of delivering reliable, well-governed datasets that support data-driven decision-making. The role involves working closely with analysts, engineers, and stakeholders to develop scalable solutions that maintain high performance and cost efficiency within client environments.

Role Overview:

This position involves designing and maintaining scalable ELT/ETL pipelines on Google Cloud Platform, modelling and transforming data using dbt, and optimising data solutions in BigQuery. The Data Engineer will orchestrate workflows with Apache Airflow, develop batch and stream processing solutions, containerise services with Docker, and support deployments with Kubernetes. The role is essential for enabling efficient data workflows and ensuring data quality, security, and governance across projects.

Key Skills & Experience:

• Proven experience delivering production-grade data pipelines and data models

• Strong hands-on experience with GCP, particularly BigQuery

• Solid knowledge of dbt for data modelling, testing, macros, and documentation

• Commercial experience with Apache Airflow for orchestration

• Experience with Google Dataflow (Apache Beam) for data processing

• Proficiency with Docker and Kubernetes for containerisation and deployment

• Good understanding of data governance, security, and access controls

• Strong SQL skills and communication abilities suited for a consulting environment

Key Responsibilities:

• Design, develop, and maintain scalable ELT/ETL pipelines on GCP

• Model, transform, and test data in dbt, following modelling best practices

• Build and optimise BigQuery data solutions, including partitioning and performance tuning

• Orchestrate workflows using Apache Airflow for scheduling, monitoring, and alerting

• Develop batch and stream processing workflows with Google Dataflow

• Containerise services using Docker and support Kubernetes deployments

• Implement data quality checks, lineage, documentation, and CI/CD pipelines

• Collaborate with stakeholders to define requirements and translate them into technical solutions

Requirements:

• Right to work in the UK

• At least 3 years of professional data engineering experience; 1+ year working with mobile data is beneficial

• Extensive experience with GCP, BigQuery, and SQL

• Proven expertise with dbt, Airflow, Docker, Kubernetes, and Dataflow

• Familiarity with data governance, security, and access controls

• Strong communication skills and ability to work in a consulting environment

Nice to Have:

• Experience with integrating multiple data sources (APIs, event data, relational and semi-structured formats)

• Additional experience working on mobile data projects

Candidates are encouraged to apply promptly for this opportunity to contribute to a vital data engineering project with a reputable organisation.

Apply

Gravitas Recruitment Group
© Gravitas Group 2026