Big Data Developer/Architect

Overview

Are you a Big Data Developer or Architect who wants to help create a product that is unique in the market and allows financial institutions to monitor 354 trillion transactions with a daily value in excess of $1 trillion?

We are looking for someone with 3+ years' experience with Hadoop to work alongside talented engineers in our client's Big Data team, developing and integrating technologies and supporting and maintaining the Big Data platform.

You will be used to using best-practice CI/CD methodologies to produce a high-performance system that is secure, easy to deploy and maintainable.

You will relish helping the QA and Support teams to troubleshoot issues and provide solutions.

You will actively help the Machine Learning team to integrate new algorithms into the platform.

What you need to get this job:

  • Passion for Computer Science, Big Data and learning new technologies
  • BSc. or MSc. in Computer Science or related discipline 
  • Minimum of 2 years' experience with Hadoop, plus experience with technologies such as Spark, Kafka, Druid and Presto
  • Ability to configure, manage and optimize multi-node clusters
  • Strong skills in Java, Scala and/or Python
  • Relevant Hadoop Certification (e.g. HDPCD, HDPCA, CCA, CCAH)
  • Experience in dealing with large, complex datasets
  • Working knowledge of Linux/Unix environment

Tom Hanley, Technical Resourcer/Recruiter - Your e-Frontiers consultant for this job

Tom joined e-Frontiers having previously worked in recruitment in the Telecoms industry. He specialises in IT roles, which he sees dominating our daily lives for years to come. He has a Bachelor's Degree in French and Spanish. Tom has been living in Spain for the last 4 years, so he can give first-hand advice on relocating to a new country. He prides himself on matching the right candidate to the right client and achieving satisfaction for both parties.

Specialises in: Java, Business Analysts, Project Managers, Product Owners