Senior Data Engineer - Analytics (x/f/m)

Paris, France · Competitive salary

About this role

We are looking for a Senior Data Engineer - Analytics to join the Analytics Engineering team at Doctolib.

As a Senior Data Engineer - Analytics, your mission will be to build data products that provide insights and support decision-making across Doctolib, helping to transform access to healthcare. You will work in a team developing data pipelines and solutions that empower our organization with data-driven insights while supporting our AI strategy.

Working in the tech team at Doctolib involves building innovative products and features to improve the daily lives of care teams and patients. We work in feature teams in an agile environment, while collaborating with product, design, and business teams.

Your responsibilities include but are not limited to:

Building and maintaining data pipelines using Python (Dagster) to support Doctolib's AI strategy

Developing and maintaining data marts in BigQuery using SQL/Jinja, DBT

Creating dashboards for high-level reporting using Tableau

Collaborating with stakeholders to understand their data needs and define specifications

Ensuring data quality, security (GDPR compliance), and availability through monitoring and optimization

About our tech environment

Our solutions are built on a single, fully cloud-native platform that supports web and mobile app interfaces and multiple languages, and is adapted to country-specific and healthcare-specialty requirements. To address these challenges, we are modularizing our platform, which runs in a distributed architecture, into reusable components.

Our stack is composed of Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.

We leverage AI ethically across our products to empower patients and health professionals. Discover our AI vision here and learn about our first AI hackathon here!

Our data stack includes: Kafka/Debezium for data ingestion, Dagster/DBT for orchestration, GCS/BigQuery for data warehousing, and Metabase/Tableau for BI and reporting.
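As a rough, hypothetical illustration of the ingest → transform → report flow described above (not Doctolib's actual code), each tool's role can be reduced to a plain Python function: Kafka/Debezium delivers change events, a dbt-style step aggregates them into a data mart, and a BI step exposes the result:

```python
# Hypothetical sketch of the pipeline stages named above. Real code would use
# Kafka/Debezium, Dagster, dbt, and BigQuery; here each stage is a plain
# function so only the shape of the pipeline is visible.

def ingest():
    # Kafka/Debezium stage: change-data-capture events from source databases
    return [
        {"table": "appointments", "op": "insert", "row": {"id": 1, "status": "booked"}},
        {"table": "appointments", "op": "insert", "row": {"id": 2, "status": "cancelled"}},
    ]

def transform(events):
    # dbt-style stage: aggregate raw events into a small data mart
    mart = {}
    for event in events:
        status = event["row"]["status"]
        mart[status] = mart.get(status, 0) + 1
    return mart

def report(mart):
    # BI stage (Metabase/Tableau): expose the aggregate for dashboards
    return sorted(mart.items())

mart = transform(ingest())
print(report(mart))  # [('booked', 1), ('cancelled', 1)]
```

In the real stack, orchestration (Dagster) schedules and monitors these stages, and the transform step is declarative SQL/Jinja rather than Python.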

Who you are

Before you read on — if you don't have the exact profile described below, but you feel this job description matches your skill set, we still encourage you to apply.

You have at least 7 years of experience as an Analytics Engineer, Data Engineer, or in a similar role

You are proficient in Python, SQL, and DBT for building data pipelines

You have experience with BI tools such as Tableau, Metabase, or similar platforms

You have initial experience with the Google Cloud Platform (GCP) stack

You have a good understanding of functional aspects of data (Sales, Finance, Web Analytics, Product)

You have strong communication skills and can collaborate effectively with business teams

Now it would be fantastic if you:

Have experience with AI products

Have experience with Vertex AI

Have experience with Cursor / Claude

Are familiar with GDPR regulations


What we offer

  • Free comprehensive health insurance for you and your children
  • Parent Care Program: additional leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • Work Council subsidy to refund part of a sport club membership or a creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card
  • Bicycle subsidy

Recruitment process

  • HR screen
  • Live tech interview
  • Behavioral interview and a meeting with the team at the office
  • At least one reference check
  • Offer!

Contract

  • Permanent position
  • Full-time
  • Paris
  • Start date: as soon as possible


Job Details

Posted: 1 April 2026
Closes: 1 May 2026
