Monthly Salary: $5,000 – $6,000
Posted: 17 April 2026
Expires: 17 May 2026
Description
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data platforms that support analytics, reporting, and data-driven decision-making. The successful candidate will work closely with data scientists, analysts, and application teams to ensure reliable, secure, and high-performing data solutions across the organisation.
Key Responsibilities
- Design, develop, and maintain robust data pipelines and ETL/ELT workflows for structured and unstructured data
- Architect and manage data warehouses, data lakes, and data marts to support analytics and reporting needs
- Optimize data models for performance, scalability, and reliability
- Ensure data quality, integrity, governance, and security standards are met
- Collaborate with business stakeholders, data scientists, and analysts to translate requirements into technical solutions
- Lead troubleshooting and performance tuning of data pipelines and platforms
- Implement monitoring, logging, and alerting for data workflows
- Mentor junior data engineers and provide technical guidance on best practices
- Contribute to architecture discussions, technical standards, and documentation
Requirements
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field
- 5–8 years of hands-on experience in data engineering or related roles
- Strong proficiency in SQL and data modelling
- Solid experience with ETL/ELT frameworks and data orchestration tools
- Proven experience building and maintaining large-scale data platforms
- Strong analytical, problem-solving, and communication skills
- Ability to work independently and collaborate within cross-functional teams
Technical Skills (Preferred)
- Programming experience in Python, Java, or Scala
- Hands-on experience with cloud platforms (AWS, Azure, or GCP)
- Experience with data warehousing technologies (e.g. Snowflake, BigQuery, Redshift)
- Familiarity with big data technologies (e.g. Spark, Hadoop, Kafka)
- Experience with CI/CD, version control (Git), and Infrastructure as Code (IaC)
- Knowledge of data governance, security, and compliance standards
- Experience working in Agile/DevOps environments
Nice to Have
- Exposure to data visualization or BI tools (e.g. Power BI, Tableau)
- Experience supporting machine learning or advanced analytics workloads
- Prior experience in regulated or large enterprise environments