Cloud DataOps Engineer

Cape Town

Posting Country:  South Africa
Date Posted:  18-Jan-2021
Full Time / Part Time:  Full Time
Contract Type:  Permanent
Joining Vodacom is more than a job; what we do matters. We don’t just carry minutes, texts and data – we carry people’s lives. And that’s a huge responsibility. If you think for a minute about the people you rely on, the likelihood is that they rely on us.

Customers are at the heart of everything we do, and we want to make a difference to their lives and to the communities in which we live and work. We support our people in giving something back to the causes that mean the most to them by helping them give time and money to the charities they love.

And what’s it like to work here? We have created an environment where you can look forward to coming to work and are empowered to be at your best. We offer flexibility in how you work so you can do your job in a way that suits you, opportunities to grow and progress throughout your career, and a choice of benefits to suit your lifestyle.
Role Purpose
The Cloud DataOps Engineer role is based within the Technology Business Unit.

The Cloud DataOps Engineer is a cross-functional role responsible for data ingestion pipelines, monitoring and data quality.
This includes automating data monitoring, alerting and quality checks across the various stages of data transformation and projection.
You will ensure that data quality controls are built into our data ingestion patterns and data governance, so that we serve quality, trusted data to our customers.
In addition, you will support and maintain production data pipelines.
Your responsibilities will include:
Engage with various stakeholders to understand data pipeline demands in order to support various use cases.
Work as a DataOps Engineer across multiple data platforms to integrate data, take responsibility for data quality control, investigate data issues, and formulate data integrity solutions.
Be hands-on and champion the implementation of proactive monitoring, alerting, trend analysis, and robust applications.
Implement, and ensure compliance with, the data governance policy and related controls across multiple data platforms.
Develop and advance data reporting and data quality control (QC) applications and systems.
Triage data research requests and issues, prioritising and systemising them for effective resolution.
Serve as a technical contributor in enhancing and improving ETL processes and data ingestion across multiple platforms.
Communicate and collaborate effectively with multiple product lines, customers, and development teams on anything data related.
Work effectively in an Agile team and collaborate well with your fellow team members.
Support other teams with application performance optimisation and troubleshooting to avoid errors.

The ideal candidate for this role will have:
Matric / Grade 12 (essential).
A 3-year degree in Computer Science, Data Analytics or Data Science.
A postgraduate qualification in the data field would be advantageous.
5+ years of overall IT experience with Big Data, Advanced Analytics, Data Warehousing and Business Intelligence.
A relevant cloud certification at professional or associate level would be advantageous.
Solid experience in building batch and stream data pipelines.
Agile exposure (Kanban or Scrum).

Core competencies and knowledge:
In-depth knowledge of data-as-a-product and information best practices.
Expert-level experience in designing, building and managing data pipelines for batch and streaming applications.
Experience with performance tuning for batch-based applications like Hadoop, including working knowledge of NiFi, YARN, Hive, Airflow and Spark.
Experience with performance tuning streaming-based applications for real-time data processing using Kafka, Confluent Kafka, AWS Kinesis, GCP Pub/Sub or similar.
Working experience across ML and analytics lifecycle capabilities such as data pipelines, data processing, data storage, model lifecycle, data operations, data management and data governance.
Experience in using a wide range of data tools such as Hadoop, Spark, Hive, Cassandra, Airflow, Kafka, Flink, AWS services, GCP services, etc.
Working experience with cloud platforms such as OpenShift, AWS and GCP.
Solid working experience with CI/CD.
Java and Python programming ability would be an advantage.

Closing date for Applications: 26 January 2021

The base location for this role is Cape Town.

The Company’s approved Employment Equity Plan and Targets will be considered as part of the recruitment process. As an Equal Opportunities employer, we actively encourage and welcome people with various disabilities to apply.
Vodacom is committed to an organisational culture that recognises, appreciates and values diversity & inclusion.

Commitment from Vodacom

Vodacom is committed to attracting, developing and retaining the very best people by offering a flexible, motivating and inclusive workplace in which talent is truly recognised, developed and rewarded. We believe that diversity plays an important role in the success of our business, and we are committed to creating an inclusive work environment which respects, values, celebrates and makes the most of people’s individual differences – we are not only multinational but multicultural too. At Vodacom you will have access to the excellent, flexible benefits programme that you would expect from any global company.