Job Description
- Develop ETL/ELT jobs to ingest, transform, and load data from multiple source systems into Data Warehouse / Data Lake platforms based on approved analysis and design documents.
- Analyze and design data storage, data integration, and data processing mechanisms to ensure scalability, performance, and data quality.
- Build and maintain detailed technical specifications, deployment guides, and operation manuals.
- Prepare deployment packages and perform deployments across environments (DEV, UAT, PRODUCTION), including deployment validation and checklist verification.
- Optimize system performance through tuning, refactoring, and upgrading ETL/ELT jobs and data pipelines.
- Investigate, troubleshoot, and resolve data pipeline issues, incidents, and operational problems.
- Collaborate with related teams (Data Architecture, BI, Application, Infrastructure) to ensure smooth data integration and operation.
- Perform other tasks as assigned by management.
Job Requirements
- Bachelor's degree in Computer Science, Data Science, Information Systems, or equivalent practical experience.
- Knowledge of Big Data concepts and architectures.
- Hands-on experience with Databricks (cloud-based) or Oracle (data warehouse) for data processing and analytics (mandatory).
- Experience in relational database development and optimization, including Oracle, SQL Server, MySQL, and DB2 (DB2 is highly preferred).
- Strong understanding of Data Warehouse, Data Modeling, Data Mart, Data Lake, and database design principles.
General Information
How to Apply
Candidates apply online by clicking the Apply button below:
Application deadline: 14/05/2026