Company Description
Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities plus the chance to work alongside some of the sector’s most talented, motivated professionals. Our clients range from individuals and businesses of all sizes to high net worth families and large multinational corporates and institutions. We’re passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve, and creating a real sense of purpose for you.
Job Description
Responsible for designing, building, and optimising real-time streaming data architectures that enable hyper-personalisation strategies. The role involves creating and maintaining event-driven triggers and automated solutions to support dynamic data workflows and real-time processing needs, with a key focus on building innovative solutions that leverage real-time streaming tools to enable immediate customer data insights and actions. The role requires strong data engineering skills, with expertise in building and integrating APIs, customer data platforms, and core business applications. Experience developing and maintaining scalable streaming pipelines using streaming tools (e.g., Kafka, Flink, Spark Streaming, CDC) and utilising intelligent decisioning software such as IDS (SAS Viya) will be advantageous.
Qualifications
Qualification:
- Degree in STEM (Informatics, Statistics, Mathematics, Computer Science, Engineering or related qualification)
Experience Required
- 5-7 years’ experience building databases, warehouses, reporting, and data integration solutions, including building and optimising big data pipelines, architectures, and data sets.
- 5-7 years’ experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement, and working with APIs.
- 8-10 years’ experience with a deep understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product, and transactional information. Knowledge of integration patterns, styles, protocols, and systems theory.
- 8-10 years’ experience in database programming languages including SAS, SQL, PL/SQL, Spark, and/or appropriate data tooling. Experience with data pipeline and workflow management tools.
Additional Information
Behavioural Competencies:
- Adopting Practical Approaches
- Articulating Information
- Checking Things
- Developing Expertise
- Documenting Facts
- Embracing Change
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
- Taking Action
- Team Working
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- IT Knowledge
- Stakeholder Management (IT)