Big Data Solutions Architect (Professional Services)

Remote - France | Competitive salary

About this role

CSQ227R46

As a Big Data Solutions Architect (Resident Solutions Architect) on our Professional Services team, you will work with clients on short- to medium-term engagements addressing their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data. RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.

The impact you will have:

You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases.

Work with engagement managers to scope a variety of professional services work with input from the customer.

Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications.

Consult on architecture and design; bootstrap or implement customer projects, leading to customers' successful understanding, evaluation, and adoption of Databricks.

Provide an escalated level of support for customer operational issues.

You will work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer's needs.

Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.

What we look for:

6+ years of experience in data engineering, data platforms, and analytics

Strong expertise in data warehousing concepts, architecture, and migration strategies.

Comfortable writing code in Python, PySpark, or Scala

Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one

Deep experience with distributed computing using Apache Spark™ and knowledge of Spark runtime internals

Familiarity with CI/CD for production deployments

Working knowledge of MLOps

Experience designing and deploying performant end-to-end data architectures

Experience with technical project delivery, including managing scope and timelines

Documentation and whiteboarding skills

Experience working with clients and managing conflicts

Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects

Data Science expertise is a nice-to-have

Willingness to travel to customers 10-20% of the time

Databricks Certification

About Databricks

Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Job Details

Posted: 30 March 2026
Closes: 29 April 2026
