We are looking for a Senior DevOps Engineer - Data Platform to join the Data and AI Platform team.
As a Senior DevOps Engineer focused on our Data Platform, your mission will be to improve and maintain an efficient, reliable, and scalable platform that enables Data Product developers and owners to develop, deploy, and maintain their data products autonomously and at scale, with clear, well-maintained interfaces and full observability — ensuring seamless data flow across the organization and enabling data-driven decision-making.
Your responsibilities include but are not limited to:
Maintain the Data Product Controller, enabling stakeholders to manage their data products in an automated way (via CI/CD and infrastructure as code) — securely, reliably, and with full governance and ownership
Maintain the Data and AI Platform orchestrator (Dagster) so that Data Product developers can orchestrate their Data Products in a decentralized way (Data Mesh), owning their release process and job pipelines.
Monitor the data platform for performance and reliability, identify and troubleshoot issues, and implement proactive solutions to ensure availability
Provide observability components that give developer teams and data product consumers the right level of insight into costs, data quality, and data lineage
Monitor platform costs and identify optimization and saving opportunities, in collaboration with data engineers, data scientists, and other stakeholders
About our tech environment
GCP (BigQuery, Google Cloud Storage, Pub/Sub, Cloud Run, IAM, Monitoring)
Container & Orchestration: Google Kubernetes Engine, ArgoCD for GitOps
IaC: Terraform/Crossplane
Monitoring: Datadog
Versioning/Continuous Integration: Git
Data Orchestrator: Dagster
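To give a flavor of the "data products as code" approach described above, here is a minimal Terraform sketch of provisioning an isolated BigQuery dataset and a least-privilege service account for a single data product team. All resource names and the project ID are hypothetical illustrations, not our actual platform configuration.

```hcl
# Hypothetical sketch: one data product's infrastructure, owned and
# deployed by its team via CI/CD. Names and project ID are illustrative.

provider "google" {
  project = "example-data-platform" # hypothetical project ID
  region  = "europe-west1"
}

# Dataset owned by a single data product team
resource "google_bigquery_dataset" "orders_product" {
  dataset_id  = "orders_data_product"
  description = "Example data product dataset, managed as code"
  labels = {
    owner = "orders-team"
  }
}

# Dedicated service account so the product's pipelines run with
# least-privilege credentials
resource "google_service_account" "orders_pipeline" {
  account_id   = "orders-pipeline"
  display_name = "Orders data product pipeline"
}

# Grant the pipeline write access only on its own dataset
resource "google_bigquery_dataset_iam_member" "orders_writer" {
  dataset_id = google_bigquery_dataset.orders_product.dataset_id
  role       = "roles/bigquery.dataEditor"
  member     = "serviceAccount:${google_service_account.orders_pipeline.email}"
}
```

In this model each team owns a module like the one above in its own repository, and the platform's CI/CD applies it automatically — which is the kind of self-service workflow this role maintains.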
We leverage AI ethically across our products to empower teams. Discover our AI vision here and learn about our first AI hackathon here!
Who you are
Before you read on — if you don't have the exact profile described below, but you feel this job description matches your skill set, we still encourage you to apply.
You have more than 5 years of experience as a DataOps Engineer or in a similar role, with a proven track record of building and maintaining complex data infrastructure
You have strong proficiency in data engineering and infrastructure tools and technologies (Kubernetes, ArgoCD, Crossplane)
You are proficient in Python or a similar programming language
You are familiar with cloud infrastructure and services, preferably GCP, and have experience with infrastructure-as-code tools such as Terraform
You have excellent problem-solving skills with a focus on identifying and resolving data infrastructure bottlenecks and performance issues
Now it would be fantastic if you:
Have knowledge of data governance principles and best practices for data security
Have experience with continuous integration and continuous delivery (CI/CD) pipelines for data