Data Engineer (1 Position)
Reporting to the Head of Data & Insights, the role holder will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and infrastructure across global data platforms. The role ensures that data from various systems is efficiently ingested, transformed, stored, and made available for advanced analytics, reporting, and machine learning use cases in compliance with global data governance and privacy standards.
KEY ACCOUNTABILITIES:
- Design, build, and maintain data pipelines to ingest data from structured and unstructured sources (internal and external).
- Develop and optimize ETL/ELT processes to ensure reliability, scalability, and performance across large datasets.
- Implement data warehousing and data lake architectures using cloud and on-prem technologies (e.g., Snowflake, Azure Synapse, BigQuery, AWS Redshift, Databricks, SSMS).
- Create reusable data assets and frameworks for repeatable and standardized data integration.
- Implement data validation, cleansing, and quality monitoring frameworks.
- Integrate and support Master Data Management (MDM) and Metadata Management practices.
- Partner with the Data Governance and Data Protection Officers to ensure compliance with data protection and privacy laws in Uganda and applicable global data protection regulations.
- Manage data lineage, cataloging, and access control using enterprise tools such as Azure Purview, Collibra, or Alation.
- Build scalable data pipelines using tools such as Azure Data Factory, Apache Airflow, NiFi, or AWS Glue.
- Develop real-time and batch data streaming solutions using Kafka, Event Hubs, or Kinesis.
- Support API-based integrations and data sharing across systems and geographies.
- Work closely with Data Scientists and Analysts to provision and prepare data for predictive and prescriptive modelling.
- Collaborate with BI and reporting teams to ensure data consistency across dashboards and analytical layers.
- Partner with cross-functional teams to define and implement data standards and reusable assets.
- Research and implement best-in-class tools and frameworks for data engineering.
- Lead or contribute to cloud modernization and data platform migration initiatives.
- Ensure cost optimization and performance tuning of data workloads.
- Stay updated on emerging technologies (AI-driven data management, Data Mesh, Data Fabric, GenAI-enhanced data tools).
KNOWLEDGE, SKILLS, AND EXPERIENCE REQUIRED:
- Bachelor’s Degree in Computer Science, Software Engineering, Statistics, Mathematics, Data Science, Information Systems, or other quantitative fields.
- Preferred: Master’s degree or equivalent experience in Data Engineering, Cloud Computing, or Analytics.
- Certifications in one or more of the following:
  - Azure Data Engineer Associate / AWS Certified Data Analytics / Google Professional Data Engineer
  - Databricks Certified Data Engineer
  - Snowflake SnowPro Core / Advanced Architect
- Minimum 3–5 years’ experience in Data Engineering or Data Platform Development.
- Proficiency in SQL, Python, PySpark, or Scala for data transformation.
- Experience with cloud data platforms (Azure, AWS).
- Strong understanding of data modelling, data warehousing, and ETL orchestration.
- Hands-on experience with data versioning, CI/CD for data pipelines, and Infrastructure as Code (IaC) using Terraform or ARM templates.
- Familiarity with data governance frameworks and data privacy principles.
- Experience with modern architecture patterns such as Data Mesh or Data Fabric is a plus.
- Excellent communication, collaboration, and problem-solving skills in cross-functional, multicultural environments.
INVITATION
If you believe you meet the requirements as noted above, please use the link below to apply:
careers.dfcugroup.com
Once there, click on “Career Opportunities” to get started. (We recommend using Google Chrome for the best experience.)
Deadline: Wednesday 29th October 2025
Only short-listed candidates will be contacted.
Please note that all recruitment terms and conditions as stated in the HR Policies and Procedures Manual shall apply.