1. Responsibilities:
- Architect and maintain the Data Lakehouse platform utilizing an open-source tech stack (MinIO/SeaweedFS, Apache Iceberg, Airflow, etc.) hosted and orchestrated on Kubernetes.
- Deploy and monitor data applications within our Kubernetes environment. Continuously enhance infrastructure for greater scalability, automate manual processes, and optimize data delivery across the ecosystem.
- Design, build, and maintain highly scalable ELT/ETL data pipelines using Apache Airflow for orchestration, and leverage dbt alongside Trino and Spark to execute complex, large-scale data transformations (see the pipeline sketch after this list).
- Work with stakeholders across the Executive, Marketing, Accounting, and Finance teams, among others, to resolve data-related technical issues and support their data infrastructure needs.
- Establish and follow strict data governance, data quality monitoring, and metadata management processes to ensure a highly reliable single source of truth.
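For illustration only, a minimal sketch of the kind of Airflow-orchestrated pipeline this role owns, assuming a dbt project compiled against the lakehouse; the DAG name, schedule, and project paths are hypothetical placeholders, not part of the actual stack configuration:

```python
# Minimal sketch of an Airflow DAG orchestrating a dbt transformation.
# Illustrative only: DAG id, schedule, and project paths are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_elt",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run dbt models against the lakehouse (e.g. via the Trino adapter);
    # the project and profiles directories are placeholders.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/sales --profiles-dir /opt/dbt",
    )

    # Validate the models with dbt tests after the run completes.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/sales --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```

In practice the dbt models would execute Trino or Spark SQL against Iceberg tables; only the orchestration shape is shown here.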
2. Qualifications:
- 4+ years of experience in a Data Engineer role.
- Degree in Computer Science, Information Systems, Software Engineering, or a related field.
- Broad knowledge of software products used in the F&B industry is a plus.
Candidates should also have experience with the following software, tools, and platforms:
- Experience with programming languages such as Python and Scala.
- Advanced SQL knowledge and experience with both relational and NoSQL databases (e.g. SQL Server, MySQL, PostgreSQL, MongoDB), as well as working familiarity with a wide variety of data sets.
- Experience with data pipeline and workflow management tools such as Airflow and dbt.
- Experience with data processing engines such as Spark and Trino (a minimal Spark sketch follows this list).
- Experience with both OLAP and OLTP databases: data model design, database optimization, and query optimization.
- Hands-on experience building, migrating, or managing data infrastructure on major cloud providers (AWS, GCP, or Azure) is a strong plus.
- Proven experience working with data stacks hosted on Kubernetes. Familiarity with configuring, deploying, and troubleshooting pods/containers using Helm, kubectl, and Docker is highly preferred.
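As an illustration of the Spark transformation work listed above (not an additional requirement), a minimal PySpark sketch that aggregates a raw Iceberg table into an analytics table; the catalog, schema, table, and column names are assumptions:

```python
# Minimal sketch of a Spark batch transformation writing to an Iceberg table.
# Illustrative only: catalog, schema, table, and column names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders_daily_aggregation")   # hypothetical job name
    .getOrCreate()
)

# Read raw orders from the lakehouse (an Iceberg table registered in the catalog).
orders = spark.table("lakehouse.raw.orders")

# Aggregate to a daily revenue fact, a typical large-scale transformation.
daily_revenue = (
    orders
    .groupBy(F.to_date("ordered_at").alias("order_date"), "store_id")
    .agg(
        F.sum("amount").alias("gross_revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Replace the target table in the analytics schema with the new aggregate.
daily_revenue.writeTo("lakehouse.analytics.daily_revenue").createOrReplace()
```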
3. Benefits:
- Work Location: Hanoi Office - 315 Truong Chinh, Thanh Xuan District, Hanoi
- Compensation: Competitive and negotiable, commensurate with qualifications and experience
- Working Hours: Monday to Friday, with flexible check-in arrangements
- Equipment: The Company provides all necessary working equipment, including a desktop or laptop
- Benefits: Full compliance with statutory benefits in accordance with the Labor Law, including mandatory social insurance contributions and 24/7 accident insurance; additionally, private health insurance coverage is provided (subject to job level)
- Incentives: KPI-based performance bonus and 13th-month salary
- Employee Perks: Enjoy a 10%-30% discount on total bills within the Golden Gate system