GCP Data Engineer needed at PBT Group

Job title: GCP Data Engineer

Job Location: Western Cape

Deadline: December 08, 2024

Our team is seeking an experienced Google Cloud Platform (GCP) Data Engineer with a passion for building and optimising data pipelines, architectures, and data sets on GCP. This role involves collaborating closely with data analysts, data scientists, and other stakeholders to ensure high-quality data availability for advanced analytics and machine learning initiatives.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL processes on GCP to support a variety of data applications, including analytics, reporting, and machine learning (a minimal orchestration sketch follows this list).
  • Implement and manage data warehouses and data lakes using GCP services such as BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Composer.
  • Develop, maintain, and enhance data models in line with business requirements, optimising for performance and scalability.
  • Collaborate with cross-functional teams to understand business needs, translate them into technical solutions, and provide data access solutions tailored to specific use cases.
  • Conduct regular performance tuning, capacity management, and data quality checks to ensure the efficiency, security, and reliability of data infrastructure.
  • Implement security best practices, including data encryption, access control, and compliance with organisational and regulatory standards.
  • Develop and maintain comprehensive documentation on data architecture, data models, and processes.
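For illustration, the orchestration work described above might look like the following minimal Cloud Composer (Apache Airflow) sketch, which schedules a daily BigQuery transformation. The DAG id, datasets, and SQL are hypothetical placeholders, not details taken from this posting.

```python
# Minimal Cloud Composer (Airflow 2.x) sketch; the dag_id, datasets, and
# SQL below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_load",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",     # run once per day
    catchup=False,                  # skip backfill of past runs
) as dag:
    # One task: aggregate raw events into a reporting table in BigQuery.
    aggregate_sales = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_sales AS
                    SELECT DATE(event_ts) AS day, SUM(amount) AS total_amount
                    FROM raw.sales_events
                    GROUP BY day
                """,
                "useLegacySql": False,
            }
        },
    )
```

In practice a production DAG of this kind would also carry retries, alerting, and the data quality checks mentioned above.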

Qualifications:

  • Bachelor’s Degree in Computer Science, Information Systems, Engineering, or a related field.
  • 3+ years of experience as a Data Engineer with hands-on expertise in designing, building, and optimising data architectures on GCP.
  • Proficiency in SQL and Python for data transformation, automation, and integration (see the query sketch after this list).
  • Strong knowledge of GCP tools: BigQuery, Cloud Dataflow, Cloud Storage, Cloud Composer (Apache Airflow), Pub/Sub, and Cloud Functions.
  • Experience with data modelling, ETL processes, and data warehousing concepts.
  • Experience with Apache Kafka or other streaming platforms is beneficial.
  • Familiarity with CI/CD and DevOps practices for data pipelines, and with tools such as Docker, Terraform, or Kubernetes, is advantageous.
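As a small illustration of the SQL-plus-Python combination listed above, the sketch below runs a parameterised BigQuery query from Python. It assumes the google-cloud-bigquery client library and application-default credentials; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery client sketch; the project, dataset, and table names
# are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# SQL does the transformation; Python binds parameters and handles results.
query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my_project.sales.orders`
    WHERE order_date >= @cutoff
    GROUP BY customer_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("cutoff", "DATE", "2024-01-01"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.order_count)
```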

Skills:

  • Analytical and problem-solving skills: Ability to diagnose complex data issues and to recommend and implement solutions efficiently.
  • Attention to detail and commitment to data quality: Ensures that data is accurate, reliable, and adheres to company standards.
  • Communication and collaboration skills: Strong written and verbal communication abilities, capable of working well with both technical and non-technical stakeholders.
  • Self-motivated and adaptable: Able to work independently in a fast-paced environment, with a proactive approach to learning and adopting new technologies.

Preferred Competencies:

  • Certifications in Google Cloud (e.g., Professional Data Engineer) are a plus.
  • Experience with machine learning workflows and integration with data pipelines.
  • Knowledge of data governance principles and experience implementing data quality frameworks.

How to Apply for this Offer

Interested and qualified candidates should Click here to Apply Now
