Job Description
- Develop ETL/ELT jobs to ingest, transform, and load data from multiple source systems into Data Warehouse / Data Lake platforms based on approved analysis and design documents.
- Analyze and design data storage, data integration, and data processing mechanisms to ensure scalability, performance, and data quality.
- Build and maintain detailed technical specifications, deployment guides, and operation manuals.
- Prepare deployment packages and perform deployments across environments (DEV, UAT, PRODUCTION), including deployment validation and checklist verification.
- Optimize system performance through tuning, refactoring, and upgrading ETL/ELT jobs and data pipelines.
- Investigate, troubleshoot, and resolve data pipeline issues, incidents, and operational problems.
- Collaborate with related teams (Data Architecture, BI, Application, Infrastructure) to ensure smooth data integration and operation.
- Perform other tasks as assigned by management.
Job Requirements
- Bachelor's degree in Computer Science, Data Science, Information Systems, or equivalent practical experience.
- Solid background in data engineering best practices.
- Strong knowledge of Big Data concepts and architectures.
- Hands-on experience with Databricks (Cloud-based) or Oracle (DWH) for data processing and analytics (mandatory requirement).
- Experience in relational database development and optimization, including Oracle, SQL Server, MySQL, and DB2 (DB2 is highly preferred).
- Strong understanding of Data Warehouse, Data Modeling, Data Mart, Data Lake, and database design principles.
- Hands-on experience in building and maintaining ETL/ELT pipelines, especially using Oracle Data Integrator (ODI) or Airflow.
- Experience with Cloud platforms (AWS / Azure / GCP) and cloud-based data architectures.
- Experience with Agile Software Development, with a solid understanding of Agile principles, Scrum methodology, and collaborative delivery models.
- Strong analytical thinking, attention to detail, and problem-solving mindset.
- Team player with a proactive attitude and willingness to continuously learn and self-develop.
Nice to Have (Strong Plus):
- Experience or knowledge of IBM Banking Data Model.
- Understanding and hands-on exposure to DataOps practices, including CI/CD for data pipelines, monitoring, logging, and data quality automation.
Benefits
Laptop
Insurance coverage
Company trips
Allowances
Uniform
Bonus scheme
Healthcare
Training
Salary raises
Business travel expenses
Seniority allowance
Annual leave
Sports club
General Information
- Application deadline: 31/01/2026
- Salary: Competitive