Senior Data Consultant (Cape Town) needed at Cyberlogic
Job title: Senior Data Consultant (Cape Town)
Job location: Cape Town, Western Cape
Deadline: January 18, 2025
Purpose of Position
- As a Senior Data Consultant, you will play a critical role in designing, developing, and maintaining data pipelines to enable data-driven decision-making across our clients’ businesses. You will work closely with a small but highly skilled team of data engineers, data scientists, and business analysts to deliver high-impact solutions. You’ll be expected to bring both technical expertise and a collaborative mindset, helping to shape the architecture and development of data solutions on cloud platforms.
Key Responsibilities
- Design, build, and maintain scalable data pipelines for the collection, transformation, and storage of data.
- Work with large, complex datasets to ensure efficient processing and integration.
- Develop data engineering solutions using Azure technologies and PySpark for distributed data processing (an illustrative sketch of this kind of pipeline follows this list).
- Implement ETL processes and automate data workflows to support analytics, reporting, and business intelligence initiatives.
- Collaborate with data scientists, business analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Ensure data quality, consistency, and reliability across all pipelines and datasets.
- Optimize data models, storage, and processing workflows for performance and scalability.
- Contribute to the architecture and design decisions for cloud-based data platforms.
- Mentor junior team members and foster a collaborative and innovative team culture.
- Troubleshoot and resolve issues related to data pipelines, ensuring high availability and performance.
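For context, the kind of pipeline work described in the responsibilities above might look something like the following minimal PySpark sketch: it reads raw CSV data, applies a simple cleaning and aggregation step, and writes curated Parquet output. The storage paths, container names, and column names here are hypothetical placeholders, not details taken from this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-etl-sketch").getOrCreate()

# Extract: read raw CSV data (path and storage account are assumed examples).
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplestorage.dfs.core.windows.net/sales/"
)

# Transform: cast, clean, and aggregate for downstream reporting.
daily_totals = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_date", "amount"])
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_sales"))
)

# Load: write curated output as Parquet, partitioned for efficient querying.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@examplestorage.dfs.core.windows.net/sales_daily/"
)
```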
Key Requirements
Essential:
- 8-10 years of experience in data engineering or a related role with a strong background in building and optimizing data pipelines.
- Expertise in PySpark for large-scale data processing and distributed computing.
- Strong experience with Azure data technologies (Azure Data Factory, Azure Databricks, Azure SQL Database, etc.).
- Proficiency in SQL and experience working with relational and NoSQL databases.
- Experience with cloud-based data platforms and services, especially in Azure.
- Solid understanding of ETL processes and data modeling techniques.
- Strong problem-solving and troubleshooting skills.
- Experience working with version control systems (e.g., Git) and continuous integration/continuous deployment (CI/CD) practices.
- Strong communication skills and the ability to work effectively within a collaborative, small-team environment.
- A proactive, self-starter attitude with a passion for data engineering and technology.
Preferred:
- Experience with data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake).
- Familiarity with other big data technologies (e.g., Hadoop, Kafka, Spark Streaming).
- Experience with infrastructure as code (IaC) tools such as Terraform or Azure Resource Manager (ARM).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Background in Agile development methodologies.
How to Apply for this Offer
Interested and qualified candidates should click here to apply now.