Data Infrastructure Engineer

Dublin, Ireland | Competitive | Hybrid

About this role

Intercom is the AI Customer Service company on a mission to help businesses provide incredible customer experiences.

Our AI agent Fin, the most advanced customer service AI agent on the market, lets businesses deliver always-on, impeccable customer service and ultimately transform their customer experiences for the better. Fin can also be combined with our Helpdesk into a complete solution, the Intercom Customer Service Suite, which provides AI-enhanced support for the more complex or high-touch queries that require a human agent.

Founded in 2011 and trusted by nearly 30,000 global businesses, Intercom is setting the new standard for customer service. Driven by our core values, we push boundaries, build with speed and intensity, and consistently deliver incredible value to our customers.

What's the opportunity?

The Data Infrastructure team builds the distributed systems and tools that support Intercom and empower people with information. As the company grows, so do the volume and velocity of our data and the appetite for increasingly sophisticated and specialized, often AI-assisted, data solutions.

Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows, from ingestion through transforming data and evaluating experiments to analyzing usage and running predictive models. We provide a solid data foundation to support various highly impactful business and product-focused projects.

We’re looking for a Data Infrastructure Engineer to join us and collaborate on large-scale data infrastructure initiatives: someone passionate about building solid foundations that deliver high-quality data to our consumers.

What will I be doing?

Evolve the Data Platform by designing and building the next generation of the stack.

Develop, run and support our data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake and Tableau, all in AWS.

Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs.

Develop automation and tooling to support the creation and discovery of high quality analytics data in an environment where dozens of changes can be shipped daily.

Implement systems to monitor our infrastructure, detect and surface data quality issues, and ensure operational excellence.
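In day-to-day terms, "handling dependencies" in pipelines like these means expressing each workflow as a DAG of tasks and executing it in dependency order. As a minimal sketch (task names are hypothetical, and the real orchestration here is Airflow, not this standard-library toy), the core idea looks like:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps mapped to their upstream dependencies.
# In production a workflow like this would be an Airflow DAG; the
# underlying concept is the same: tasks run only after their inputs exist.
PIPELINE = {
    "ingest_mysql": set(),
    "ingest_events": set(),
    "transform_staging": {"ingest_mysql", "ingest_events"},
    "build_marts": {"transform_staging"},
    "refresh_dashboards": {"build_marts"},
}

def run_pipeline(dag: dict[str, set[str]]) -> list[str]:
    """Execute tasks in an order that respects every dependency."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # placeholder for the real task logic
    return order

order = run_pipeline(PIPELINE)
```

An orchestrator like Airflow adds scheduling, retries and backfills on top of this ordering, which is where most of the operational complexity lives.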

Recent projects the team has delivered:

Migrating the MySQL ingestion pipeline from Aurora to PlanetScale

LLM utilisation DAG framework

Tableau Dashboard Performance monitoring

Unified Local Analytics Development Environment for Airflow and DBT

Containerised Snowflake Apps
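Monitoring projects like the ones above typically start from simple batch-level checks: volume, completeness and freshness. A sketch in plain Python (field names and thresholds are illustrative, not Intercom's actual schema):

```python
from datetime import datetime, timedelta, timezone

def check_batch(rows: list[dict], *, min_rows: int, max_age: timedelta) -> list[str]:
    """Return human-readable data-quality issues for a batch (empty if healthy)."""
    issues = []
    # Volume: did we receive at least as many rows as expected?
    if len(rows) < min_rows:
        issues.append(f"volume: expected >= {min_rows} rows, got {len(rows)}")
    # Completeness: flag batches containing null identifiers.
    if any(row.get("user_id") is None for row in rows):
        issues.append("completeness: null user_id")
    # Freshness: is the newest record recent enough?
    if rows:
        newest = max(row["updated_at"] for row in rows)
        age = datetime.now(timezone.utc) - newest
        if age > max_age:
            issues.append(f"freshness: newest record is {age} old")
    return issues
```

A real system would emit these issues to an alerting channel rather than returning them, but the checks themselves tend to stay this simple.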

Requirements

  • You have 3+ years of full-time, professional work experience in the data space using Python and SQL.
  • You have solid experience building and running data pipelines for large and complex datasets, including handling dependencies.
  • You have hands-on cloud provider experience (preferably AWS), including service integrations and automation via the CLI and APIs.
  • You have a solid understanding of data security practices and are passionate about privacy.
  • You have some DevOps experience.
  • You care about your craft.

In addition, it would be a bonus if you have:

  • Worked with Apache Airflow. We use Airflow extensively to orchestrate and schedule all of our data workflows, so a good understanding of the quirks of operating Airflow at scale would be helpful.
  • Experience with or an understanding of the tools and technologies in the modern data stack (Snowflake, DBT).
  • Industry awareness of up-and-coming technologies and vendors.

Benefits
  • Competitive salary and equity in a fast-growing start-up
  • We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen
  • Regular compensation reviews - we reward great work!
  • Pension scheme & match up to 4%
  • Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents
  • Open vacation policy and flexible holidays so you can take time off when you need it
  • Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones
  • If you’re cycling, we’ve got you covered with the Cycle-to-Work Scheme, plus secure bike storage.
  • MacBooks are our standard, but we also offer Windows for certain roles when needed.

Job Details

Posted: 3 March 2026
Closes: 2 April 2026
Work Mode: Hybrid
