Specialist Platform Engineer needed at Absa Group Limited


Job title : Specialist Platform Engineer

Job Location : Gauteng

Deadline : January 09, 2026


Job Summary

  • We are looking for an engineer to join our team to help build and maintain Kafka-based streaming applications and to support the Kafka platform across on-prem and Confluent Cloud environments. The role blends development, platform operations, and observability, providing a unique opportunity to work on distributed systems at scale.

Job Description

Core Responsibilities:

  • Develop, maintain, and optimize Kafka-based applications and event streaming pipelines using Java (Spring/Spring Boot), Python, or .NET.
  • Work with distributed systems concepts: partitions, replication, fault tolerance, scaling, and event-driven architectures.
  • Contribute to provisioning, managing, and securing Kafka clusters both on-prem and in Confluent Cloud.
  • Implement and maintain security and authorization mechanisms, including ACLs, Kerberos, SSL, and OAuth for Confluent Cloud.
  • Automate infrastructure deployment and configuration using Terraform, Ansible, CloudFormation, Docker, or Kubernetes.
  • Configure, monitor, and maintain observability for Kafka clusters, including metrics, alerts, and dashboards (e.g., Prometheus, Grafana, Confluent Control Center, Elasticsearch).
  • Assist in troubleshooting production issues and perform root cause analysis.
  • Collaborate closely with developers, DevOps/SRE teams, and other stakeholders to ensure reliable and performant streaming systems.
  • Contribute to best practices for connector configuration, high availability, disaster recovery, and performance tuning, including streaming applications and pipelines built with Kafka Streams, ksqlDB, Apache Flink, and TableFlow.

Required Skills:

  • Strong programming experience in Java (Spring/Spring Boot), Python, or .NET. Ability to write clean, maintainable, and performant code.
  • Solid understanding of distributed systems principles and event-driven architectures.
  • Hands-on experience with Kafka in production or strong ability to learn quickly.
  • Knowledge of Kafka ecosystem components (Connect, Schema Registry, KSQL, MirrorMaker, Control Center, Kafka Streams, Apache Flink, TableFlow) is a plus.
  • Familiarity with security best practices for Kafka, including ACLs, Kerberos, SSL, and OAuth.
  • Experience with infrastructure as code and containerized environments.
  • Experience with monitoring and observability tools for distributed systems.

Desirable Skills / Bonus Points:

  • Experience with Confluent Cloud or other managed Kafka platforms.
  • Experience with AWS.
  • Experience building streaming pipelines across multiple systems and environments.
  • Familiarity with CI/CD pipelines and automated deployments.

Behavioral / Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent communication and interpersonal skills.
  • Ability to work independently and prioritize across multiple business-as-usual (BAU) and project tasks.
  • Product-minded approach, focusing on delivering value and scalable solutions.

Education

  • Bachelor’s Degree: Information Technology

End Date: December 12, 2025 

How to Apply for this Offer

Interested and qualified candidates should Click here to Apply Now


Disclaimer: MRjobs.co.za is not an employer and does not directly offer jobs. We share available opportunities from verified sources to help job seekers. Please do your due diligence before applying. We are not responsible for any transactions, interviews, or outcomes from third-party employers.