Key Responsibilities:
1. Data Architecture and Design
• Design and implement scalable, secure, and efficient data pipelines and architectures on AWS.
• Define and manage the architecture for real-time and batch data processing.
2. Data Pipeline Development
• Build ETL/ELT processes to ingest, transform, and store structured and unstructured data from diverse sources.
• Develop solutions for data integration, data quality, and data governance.
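To illustrate the kind of ETL/ELT and data-quality work described above, here is a minimal, self-contained sketch of an extract-transform-load step. All names (`orders.csv` fields such as `amount`) are hypothetical; a production pipeline on AWS would typically read from S3 (for example via boto3 or AWS Glue) and write to a warehouse rather than operate in memory.

```python
import csv
import io
import json

# Illustrative only: a tiny in-memory ETL step. The data and field names
# are hypothetical; on AWS, extract/load would target S3 or a warehouse.
RAW_CSV = """order_id,amount,currency
1001,19.99,USD
1002,5.50,USD
1003,,USD
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Data-quality step: drop rows missing an amount, cast amount to float."""
    return [
        {**row, "amount": float(row["amount"])}
        for row in rows
        if row["amount"]
    ]

def load(rows: list[dict]) -> str:
    """Serialize cleaned rows; stands in for a write to the target store."""
    return json.dumps(rows)

cleaned = transform(extract(RAW_CSV))
print(len(cleaned))  # 2 rows survive the quality filter
```

The separation into `extract`, `transform`, and `load` functions mirrors how pipeline stages are usually kept independently testable and monitorable.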
3. Optimization and Monitoring
• Optimize data processing workflows for performance and cost-effectiveness.
• Monitor data pipelines and systems, troubleshooting and resolving issues promptly.
4. Collaboration
• Partner with data scientists, analysts, and business teams to understand data needs and deliver high-quality solutions.
• Collaborate with DevOps teams to implement CI/CD for data workflows.
5. Security and Compliance
• Ensure data solutions comply with organizational policies and industry security, privacy, and governance standards.
• Implement data anonymization and encryption techniques as needed.
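As a sketch of the anonymization technique mentioned above, the snippet below pseudonymizes a PII field with a keyed hash (HMAC-SHA256), so the same input always maps to the same opaque token without exposing the raw value. The field name and secret are hypothetical; in practice the key would come from a managed secret store (such as AWS Secrets Manager), never a literal in code.

```python
import hashlib
import hmac

# Illustrative only: in production, fetch this from a managed secret store.
SECRET_KEY = b"replace-with-managed-secret"  # hypothetical placeholder

def pseudonymize(value: str) -> str:
    """Return a stable, irreversible token for a PII value (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"user_email": "jane@example.com", "amount": 42.0}
record["user_email"] = pseudonymize(record["user_email"])
print(len(record["user_email"]))  # 64 hex characters
```

A keyed hash (rather than a plain hash) is one common choice because it resists dictionary attacks on low-entropy identifiers while still allowing joins on the pseudonymized column.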
6. Innovation
• Stay updated with emerging AWS technologies and data engineering best practices.
• Drive continuous improvement in processes, tools, and methodologies.