ZENITH INFOTECH (S) PTE LTD.

Data Engineer - Contract

Professional Contract · 3+ years of experience

Monthly Salary

$5,000 – $6,000

Posting Date

April 14, 2026

Open until April 28, 2026

Skills

Design · Airflow · Apache Spark · Data Modeling · Pipeline Development · Azure Data Factory · Transformation · SQL Views · Database Management · DB2 · Process Analytical Technology · Databases

Job Description

Role Overview

Our client is looking for a skilled and motivated Data Engineer to join the Data & Analytics team. In this role, you will be responsible for designing, developing, and maintaining robust end-to-end data pipelines that power our operational and analytical capabilities.

You will work closely with data analysts, BI developers, data scientists, and business stakeholders to deliver high-quality, reliable data solutions at scale.

Key Responsibilities

Data Pipeline Engineering

• Design, develop, and maintain end-to-end data pipelines for ingesting, transforming, and delivering data from multiple source systems (databases, files, APIs, streaming platforms).

• Build and optimize ETL/ELT workflows using SQL, Python, and enterprise data integration tools.

• Ensure data pipelines are scalable, resilient, and performant to meet operational and analytical requirements.

Database & Data Platform Management

• Work hands-on with RDBMS platforms such as Oracle, DB2, SQL Server, or PostgreSQL for data extraction, transformation, and performance tuning.

• Develop and optimize SQL queries, views, and stored procedures to support reporting and analytics use cases.

• Support data modelling activities (logical and physical) for analytics and reporting layers.

Data Quality, Governance & Operations

• Implement data validation, reconciliation, and monitoring to ensure data accuracy, completeness, and consistency.

• Support operational data activities, including incident investigation, root cause analysis, and remediation.

• Maintain clear documentation for data pipelines, schemas, and operational processes to support audits and knowledge transfer.

Collaboration & Stakeholder Engagement

• Collaborate with business users, product owners, and downstream teams to gather requirements and translate them into technical solutions.

• Work closely with Data Analysts, BI developers, and Data Scientists to enable dashboards, reports, and advanced analytics.

Required Skills & Experience

• Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent practical experience.

• Strong hands-on experience with SQL and relational databases (Oracle, DB2, SQL Server, PostgreSQL).

• Experience building and supporting ETL/data pipelines in enterprise environments.

• Solid understanding of data modelling, data quality, and data lifecycle management.

• Ability to troubleshoot data issues and work in production/operational environments.

• Proven experience with Python for data processing or automation.

Preferred / Nice-to-Have

• Experience with data streaming technologies such as Kafka, Apache Spark, or Apache NiFi.

• Familiarity with BI and visualisation tools such as Tableau, Qlik, or Power BI.

• Knowledge of cloud or hybrid data platforms and workflow orchestration tools (e.g. Airflow, Azure Data Factory).

• Knowledge of Agile/DevOps practices and CI/CD for data pipelines.