Work closely with data analysts and the engineering team to design, build, and operate the data platform, including but not limited to data sources, data ingestion, ETL/ELT, the data warehouse, data marts, and data services.
Contribute to the solution design and execution of various data migration projects.
Implement data security and access controls to protect sensitive data; develop and maintain data governance policies and procedures.
Contribute to measuring and improving the accuracy, scalability, performance, reliability, and maintainability of the data platform.
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
Advanced knowledge of modern data architecture and the AWS ecosystem, such as Redshift, Kafka, Airflow, and S3. Experience with Google Firebase Analytics and BigQuery is a big plus.
Proficient in SQL; experience with at least one programming or scripting language, such as Python or Java.
Good understanding of Information Security and Data Governance principles.
Experience in designing and building data pipelines, ETL/ELT processes, and data warehouses.