Lead Hadoop Engineer needed at Absa Group Limited


Job Title: Lead Hadoop Engineer

Job Location: Gauteng

Deadline: December 04, 2024


Job Summary

  • This role provides an exciting opportunity to roll out a new strategic initiative within the firm: the Enterprise Infrastructure Big Data Service. The Big Data Developer serves as a development and support expert, responsible for the design, development, automation, testing, support and administration of the Enterprise Infrastructure Big Data Service. The role requires experience with both Hadoop and Kafka, and involves building and supporting a real-time streaming platform used by the Absa data engineering community. You will be responsible for developing features, providing ongoing support and administration, and maintaining documentation for the service. The platform provides a messaging queue and a blueprint for integrating with existing upstream and downstream technology solutions.

Job Description

Candidate Description 

  • You will have the opportunity to work directly across the firm with developers, operations staff, data scientists, architects and business constituents to develop and enhance the big data service.
  • Development and deployment of data applications 
  • Design and implementation of infrastructure tooling, and work on horizontal frameworks and libraries
  • Creation of data ingestion pipelines between legacy data warehouses and the big data stack 
  • Automation of application back-end workflows 
  • Building and maintaining backend services created with multiple service frameworks
  • Maintaining and enhancing applications backed by Big Data computation frameworks
  • Be eager to learn new approaches and technologies 
  • Strong problem-solving skills
  • Strong programming skills 
  • Background in computer science, engineering, physics, mathematics or equivalent 
  • Experience working on Big Data platforms (vanilla Hadoop, Cloudera or Hortonworks)
  • Preferred: Experience with Scala or other functional languages (Haskell, Clojure, Kotlin, Clean) 
  • Preferred: Experience with some of the following: Apache Hadoop, Spark, Hive, Pig, Oozie, ZooKeeper, MongoDB, CouchbaseDB, Impala, Kudu, Linux, Bash, version control tools, continuous integration tools 

Education

  • Bachelor’s Degree: Information Technology

How to Apply for this Offer

Interested and qualified candidates should Click here to Apply Now

