Data Analyst

Listing reference: payco_000171
Listing status: Online
Apply by: 4 June 2026

Position summary
Industry: Financial Services
Job category: Data Processing
Location: Sandton
Contract: Permanent
Remuneration: Market Related
EE position: No

Introduction
We are looking for a Data Analyst with strong skills in analytics, data engineering, reporting, and data modelling. The role focuses on transforming operational and financial data into structured, reliable data assets that support reporting, dashboards, analysis, and business decision-making. The successful candidate will design, build, and maintain ETL and ELT data pipelines, develop scalable data models, and deliver actionable insights using tools such as SQL, Power BI, SSRS, Python, and C#. This role requires strong analytical thinking, solid database knowledge, attention to data quality, and the ability to translate business requirements into practical data solutions. The candidate must be comfortable working with both technical teams and business stakeholders in a fast-paced, delivery-focused environment.
Job description

  • Data Pipeline Development and Maintenance: Design, build, and maintain reliable ETL and ELT pipelines to ingest, transform, validate and structure data from multiple source systems
  • Data Modelling: Develop and maintain efficient data models across Microsoft SQL Server, PostgreSQL and MySQL environments to support reporting, analytics and business decision-making
  • Reporting and Dashboard Development: Design, build and maintain automated reports and dashboards using Power BI, SSRS and similar reporting tools
  • Business Requirements Translation: Engage with business stakeholders to understand reporting and analytics needs and translate these into practical data solutions
  • Data Quality and Integrity: Implement validation, monitoring and reconciliation processes to ensure data accuracy, consistency and reliability across systems
  • Data Analysis and Insight Generation: Analyse complex datasets to identify trends, anomalies, risks and opportunities and provide clear insights to support decision-making
  • Automation and Process Improvement: Identify opportunities to automate manual data processes and reporting workflows to improve efficiency and reduce operational effort
  • Performance Tuning and Optimisation: Optimise SQL queries, data structures, reports and dashboards to improve performance, scalability and responsiveness
  • Cross-Functional Collaboration: Work closely with data engineers, BI developers, software teams and business stakeholders to deliver integrated data and analytics solutions
  • Documentation and Standards: Produce clear documentation for data models, pipelines, reporting logic, and reconciliation rules, while contributing to improved data and BI standards

Minimum requirements

  • Matric Certificate
  • Completed Bachelor’s degree in one of the following fields: Computer Science, Information Systems, Data Science, Engineering, Information Technology or a related field
  • Minimum 3 to 5 years’ experience in a data-focused role, such as Data Analyst, BI Developer or Data Engineer
  • Strong SQL experience, including query writing, stored procedures, views and performance tuning
  • Hands-on experience working with relational databases such as Microsoft SQL Server, PostgreSQL or MySQL
  • Experience building and maintaining ETL or ELT processes and data pipelines
  • Experience developing reports and dashboards using Power BI, SSRS or similar reporting tools
  • Exposure to Python or C# for data processing, scripting, automation or integration work
  • Experience working with financial services, payments, or transactional data environments is advantageous

Technical competencies

  • SQL and Database Proficiency: Strong experience writing SQL queries, stored procedures, views and performing database performance tuning
  • Relational Database Skills: Hands-on experience working with relational databases such as Microsoft SQL Server, PostgreSQL and MySQL
  • ETL and ELT Development: Ability to design, build and maintain data pipelines that ingest, transform, validate and structure data from multiple source systems
  • Data Modelling: Solid understanding of data modelling concepts, including dimensional modelling, data warehousing principles and scalable reporting structures
  • Reporting and Dashboard Development: Experience building automated reports and dashboards using Power BI, SSRS or similar reporting tools
  • Data Analysis and Insight Generation: Ability to analyse large and complex datasets, identify trends, anomalies and opportunities and present clear insights to support business decisions
  • Data Quality and Reconciliation: Strong understanding of data validation, monitoring, reconciliation, accuracy, consistency and reliability across systems
  • Programming and Scripting: Exposure to Python or C# for data processing, scripting, automation, integration or pipeline development
  • Requirements Analysis: Experience engaging with business stakeholders, understanding reporting and analytics requirements and translating them into technical data solutions
  • Automation and Optimisation: Ability to identify manual reporting or data processes that can be automated, improved or optimised
  • AI and Data Preparation: Familiarity with AI or LLM concepts and data preparation for AI-driven solutions is advantageous
