Senior Software Engineer

Slovakia · Competitive · Remote · 0 applicants

About this role

Bloomreach is building the world’s premier agentic platform for personalization. We’re revolutionizing how businesses connect with their customers, building and deploying AI agents to personalize the entire customer journey.

We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses.

We’re making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise — available on demand, at every touchpoint in their journey.

We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do.

And we're building all of that on the intelligence of a single AI engine — Loomi AI — so that personalization isn't only autonomous… it's also consistent. From retail to financial services, hospitality to gaming, businesses use Bloomreach to drive higher growth and lasting loyalty. We power personalization for more than 1,400 global brands, including American Eagle, Sonepar, and Pandora.

We want you to join us full-time as a Senior Software Engineer in our Data Pipeline team. We work remote-first (from Central & Eastern Europe), but we are more than happy to meet you in our nice offices in Bratislava, Brno or Prague. And if you are interested in who your engineering manager will be, check out Vaclav's LinkedIn. The salary range starts at 3,700 EUR gross per month and goes up depending on your experience and skills.

Intrigued? Read on 🙂…

Your responsibilities

You will develop a data pipeline that processes large volumes of data reliably and at a high rate, using Python, Go, MongoDB, GCP and much more.

For example, our clients feed their visitors’ behavior into our platform through real-time tracking. The data can then be analyzed and used for marketing automation. We process tens of thousands of requests per second 🚀. You may help improve the efficiency of our workers, improve monitoring, or help with autoscaling of our infrastructure.

Imports are critical for our clients to utilize our platform to the fullest. You may help us improve the throughput and reliability of our imports, as we need to import millions of rows of data, or help us implement an integration with another data store.

We are also responsible for exporting data from our platform to Google BigQuery using Google Dataflow, PySpark and Apache Beam, allowing our clients to access the data.

You will help us run and support our services in production handling high-volume traffic using Google Cloud Platform and Kubernetes.

You will review the code of your peers and they'll review yours. We have high code quality standards and the four-eyes principle is a must!

You will start on easier, self-contained projects and, once you feel at home, you can move on to the real beasts: challenging, complex projects that will leave a mark - we have plenty of those 💪.

Later you might help us by participating in the on-call rotation, keeping our services up and running 🚀.

Your qualifications

You have experience with Python and a solid grasp of engineering practices 🔧.

If you have experience with Go, that's a big advantage 💪.

You have experience with ETL pipelines and ingestion at scale: connecting to various SQL and non-SQL sources, cleaning, sanitizing and transforming the data, and loading the cleaned-up results into storage optimized for a given use case.

You know how to work effectively in a remote-first environment.

You are able to learn and adapt. It'll be handy while exploring new tech, navigating our not-so-small code base, or when iterating on our team processes.

Our tech stack

Python, Go

Apache Kafka, Kubernetes, GitLab

Google Cloud Platform, Dataflow, PySpark, Apache Beam, BigQuery, BigLake tables

Dataproc, Flink

MongoDB, Redis

… and much more 🙂

Your success story

During the first 30 days, you will get to know the team, the company, and the most important processes. You’ll work on your first tasks. We will help you to get familiar with our codebase and our product.

During the first 90 days, you will participate in your first, more complex projects. You will help the team to find solutions to various problems, break the solution down into smaller tasks and participate in implementation. You will learn how we identify problems, how we prioritize our efforts, and how we deliver value to our customers.

During the first 180 days, you’ll become an integral part of the team. You will achieve the first goals we will set together to help you grow and explore new and interesting things. You will help us to deliver multi-milestone projects bringing great value to our customers. You will help us mitigate your first incidents and eventually even join the on-call rotation. You will get a sense of where the team is heading and you’ll help us to shape our future.

Finally, you’ll find out that our values are truly lived by us 💛. We are dreamers and builders. Join us!



Job Details

Posted: 1 April 2026
Closes: 1 May 2026
Work Mode: Remote
