(736) Data Engineer – BSTD needed at South African Reserve Bank


Job title : (736) Data Engineer – BSTD

Job Location : Gauteng, Pretoria

Deadline : December 26, 2024

Detailed description

The successful candidate will be responsible for the following key performance areas:

  • Implement data service standards and frameworks across the SARB to ensure optimised solutions and adherence to best practice in data operations (DataOps), development and operations (DevOps) as well as machine learning operations (MLOps).
  • Take responsibility for BI data pipelines and flows for domain specific analytic implementations across the SARB.
  • Ensure an understanding of clients’ data requirements in order to drive continuous development of data services and address evolving business needs.
  • Design and build data pipelines that are robust, modular, scalable, deployable, reproducible and versioned for analytics and reporting purposes.
  • Continually monitor and optimise domain specific data pipelines to ensure data availability and optimal long-term performance of data pipelines.
  • Implement new data engineering features.
  • Implement data sharing technology services for the SARB, in alignment with the BI and Business Solutions and Technology Department (BSTD) Strategy.
  • Diagnose, manage and enhance the performance of BI data marts and warehouses across the SARB by applying data engineering techniques such as distributed computing and data optimisation.
  • Resolve data issues across BI data marts, data warehouses and data lakes.
  • Implement initiatives to ensure compliance and adherence to security and application standards with respect to all BI data services.
  • Identify and manage the mitigation of risks relating to domain-specific BI data services.
  • Proactively engage and problem-solve with cross-functional stakeholders ‒ from technical data teams to managers ‒ to address their data needs in order to build impactful analytics solutions.
  • Provide reporting and recommendations on data service performance, improvements and data availability for domain-specific solutions to management.
  • Keep abreast of industry best practices and technologies and lead implementation thereof to optimise effective and efficient data pipelines and services.
  • Impart knowledge of the technical environment to other data engineers as well as the systems development, database administration, infrastructure, enterprise architecture and enterprise information management teams.
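The pipeline responsibilities above (robust, modular, reproducible and versioned pipelines) can be illustrated with a minimal sketch. This is not the SARB's actual tooling; the stage names, columns and version stamp are hypothetical, and pandas is assumed only because it is named later in the requirements.

```python
import pandas as pd

PIPELINE_VERSION = "1.0.0"  # hypothetical version stamp for lineage/reproducibility

def extract(raw: dict) -> pd.DataFrame:
    """Load source records into a DataFrame (stand-in for a real data source)."""
    return pd.DataFrame(raw)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Clean and enrich: drop incomplete rows, derive an analytics column."""
    df = df.dropna(subset=["amount"]).copy()
    df["amount_thousands"] = df["amount"] / 1_000
    return df

def load(df: pd.DataFrame) -> pd.DataFrame:
    """Stamp the pipeline version; a real load step would persist to a mart."""
    df["pipeline_version"] = PIPELINE_VERSION
    return df

def run(raw: dict) -> pd.DataFrame:
    # Each stage is a small, independently testable function,
    # which is what keeps the pipeline modular and maintainable.
    return load(transform(extract(raw)))

result = run({"account": ["A", "B", "C"], "amount": [2_000.0, None, 5_000.0]})
print(result)
```

Because each stage is a pure function over a DataFrame, individual steps can be unit-tested, swapped out, and re-run deterministically ‒ the properties the role description asks for.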

Qualifications

Job requirements

To be considered for this position, candidates must be in possession of: 

  • a Bachelor’s degree (NQF 7) in Computer Science, Engineering, Mathematics, Information Management, Technology or an equivalent qualification; and
  • five to eight years’ experience in building, maintaining and optimising data pipelines and services.

Additional requirements include:

  • ability to write clean, maintainable, scalable and robust code in an object-oriented language (e.g. Python, Scala or Java) in a professional setting;
  • proven experience building data pipelines in production for advanced analytics use cases;
  • experience working across structured, semi-structured and unstructured data;
  • experience with database technologies such as Microsoft SQL Server, Oracle Database, MySQL, PostgreSQL, IBM Db2 and NoSQL;
  • familiarity with distributed computing frameworks (e.g. Spark, Dask), cloud platforms (e.g. AWS, Azure, GCP), containerisation, and analytics libraries (e.g. pandas, NumPy, Matplotlib);
  • familiarity with time-series and graph database types and related technologies (Druid, InfluxDB, Neo4J, etc.) would be considered a plus;
  • practical knowledge of software engineering concepts and best practices, including DevOps, Development Security and Operations (DevSecOps) and Data Operations (DataOps), would be considered a plus;
  • ability to scope projects, define workstreams, and effectively lead and mentor more junior colleagues;
  • continuous improvement knowledge and skill;
  • industry, organisational and business awareness, knowledge and skill;
  • quality assurance knowledge and skill;
  • business continuity planning knowledge and skill;
  • information technology (IT) enablement planning knowledge and skill;
  • IT transformation and innovation knowledge and skill;
  • release management knowledge and skill;
  • infrastructure design and development knowledge and skill;
  • workload estimation knowledge and skill;
  • technical analysis knowledge and skill;
  • systems integration knowledge and skill;
  • IT governance knowledge and skill; and
  • continued learning and/or professional development knowledge and skill.

How to Apply for this Offer

Interested and Qualified candidates should Click here to Apply Now
