One of the most successful boutique FinTech firms has put major funding behind the build-out of a state-of-the-art central data platform. They specialise in bespoke cloud-based data engineering solutions, using the latest cloud, data and DevOps technologies to deliver market-leading products. As global partners of Amazon, Google and Microsoft, they get access to the latest cloud and DevOps tech ahead of the rest of the market.

They’re looking for an Azure/GCP Data Engineer to deliver bleeding-edge data platforms for ultra-large-scale data sets, used directly by key business stakeholders. You will have the freedom to use the latest big data/streaming tech such as Kafka and Flink, as well as DevOps tools such as K8s/Istio, TeamCity and GitLab. Culture is also hugely important to them: with a strict “no egos” hiring policy, they have built a genuinely friendly and collaborative engineering environment.

Must have:

  • Data engineering background with experience in big data tech such as Hadoop, Kafka, Spark, or Cassandra
  • Experience with cloud-based infrastructure (Azure and/or GCP)

Nice to have:

  • Experience coding in Python, Java, or another language
  • DevOps experience with tech such as Kubernetes, Docker, TeamCity, Terraform, Jenkins, or similar
  • Certification in Microsoft Azure or Google Cloud

Interview process: fully remote

Benefits:

  • Variable remote working (~2 days WFH per week)
  • Healthcare/dental cover
  • Cycle to work scheme
  • NOTE: this company cannot provide visa sponsorship

Please contact Seb.Coughlin@stanfordblack.com for more information.