Databricks Engineer

London - hybrid - 3 days per week on-site
6 Months+
UMBRELLA only - Inside IR35

Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling.
- Build and manage data transformation workflows in dbt running on Databricks.
- Optimize data models in Delta Lake for performance, scalability, and cost efficiency.
- Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets.
- Implement data quality checks (dbt tests, monitoring) and ensure governance standards.
- Manage and monitor Databricks clusters and SQL Warehouses to support workloads.
- Contribute to CI/CD practices for data pipelines (version control, testing, deployments).
- Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges.
- Document workflows, transformations, and data models for knowledge sharing.

Required Skills and Qualifications
- 3-6 years of experience as a Data Engineer (or similar).
- Hands-on expertise with:
  - dbt (dbt-core, dbt-databricks adapter, testing and documentation)
  - Apache Airflow (DAG design, operators, scheduling, dependencies)
  - Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses)
- Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or similar).
- Proficiency in Python for scripting and pipeline development.
- Experience with CI/CD tools (e.g., GitHub Actions, GitLab CI, Azure DevOps).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Strong problem-solving skills and ..... full job details .....