Role overview
You will be a core member of a forward-deployed engagement team that onboards enterprise customers (often highly regulated, high-maturity organizations such as banks) to an AI Data Analyst SaaS platform. Your primary responsibility is to design and implement robust data models and connections so the AI Data Analyst can reliably compute business metrics and answer analytical questions. This is a high-touch role that combines deep technical data modeling and engineering with stakeholder management, requirements gathering, testing and validation, risk mitigation, and project delivery.
What you'll do - key responsibilities
• Customer liaison & discovery
• Lead discovery sessions with technical and non-technical stakeholders to understand source systems, data lineage, business definitions, and reporting needs.
• Map business KPIs/metrics to available data and identify gaps or remediation required.
• Data modeling & metric engineering
• Design logical and physical data models (facts, dimensions, hierarchies, slowly changing dimensions) that reflect customer business semantics and support the AI Data Analyst's metric definitions.
• Define canonical metric specifications (metric definition, calculation SQL/DSL, cohort logic, edge cases).
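A canonical metric specification like the one described above can be captured as a structured artifact. The sketch below is illustrative only: the field names, the `fact_sessions` table, and the metric itself are assumptions, not the platform's actual spec format.

```python
# Minimal sketch of a canonical metric spec (all names are illustrative).
# The goal is a single reviewable artifact covering definition, calculation
# logic, cohort semantics, and edge cases that the customer can sign off on.
monthly_active_users = {
    "name": "monthly_active_users",
    "description": "Distinct users with at least one session in the calendar month.",
    "grain": "month",
    "calculation_sql": """
        SELECT DATE_TRUNC('month', session_start) AS month,
               COUNT(DISTINCT user_id)            AS mau
        FROM fact_sessions
        WHERE is_internal_user = FALSE   -- edge case: exclude staff accounts
        GROUP BY 1
    """,
    "cohort_logic": "a user counts in every calendar month with >= 1 session",
    "edge_cases": [
        "exclude internal/staff users",
        "all timestamps normalized to UTC before month truncation",
    ],
    "owner": "customer-analytics",
}
```

Keeping the SQL, cohort logic, and edge cases in one place makes the spec the single source of truth for both validation and stakeholder sign-off.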
• Platform integration
• Implement data connections, ingestion pipelines, and schema mappings into the SaaS platform (or the customer's cloud data layer), ensuring freshness, reliability, and observability.
• Configure dimensions, attributes, and metric metadata inside the platform so the AI models can consume and reason about the data.
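A schema mapping plus metric metadata might look like the following sketch. Every field name here (including `maps_to`, `role`, and the source table) is an assumption for illustration; the real platform will have its own configuration format.

```python
# Illustrative schema-mapping / metadata configuration (assumed format, not
# the platform's real config). It declares how source columns map onto the
# modeled entity and tags each column with the role the AI needs to reason
# about it (key, event time, measure, or dimension).
SCHEMA_MAPPING = {
    "source_table": "erp.gl_transactions",
    "target_entity": "fact_transactions",
    "columns": {
        "txn_id":     {"maps_to": "transaction_id",   "type": "string",    "role": "key"},
        "txn_ts":     {"maps_to": "transaction_time", "type": "timestamp", "role": "event_time"},
        "amount_ccy": {"maps_to": "amount",           "type": "decimal",   "role": "measure"},
        "branch_cd":  {"maps_to": "branch",           "type": "string",    "role": "dimension"},
    },
    "freshness_sla_minutes": 60,  # alert when ingested data is staler than this
}
```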
• Validation & QA
• Develop and execute test plans to validate AI Data Analyst outputs against agreed-upon metric specs and ground-truth reports; quantify accuracy and identify root causes for discrepancies.
• Create automated and manual validation suites (unit tests, reconciliation queries, data quality checks).
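A reconciliation query of the kind mentioned above can be automated as a small check that compares modeled metrics against a ground-truth report and flags periods outside a tolerance. The table names, schema, and tolerance below are illustrative assumptions; SQLite stands in for the customer's warehouse.

```python
import sqlite3

# Sketch of an automated reconciliation check (illustrative names/schema):
# flag any period whose modeled value diverges from the ground-truth report
# by more than a relative tolerance.
def reconcile(conn, tolerance=0.005):
    rows = conn.execute("""
        SELECT g.month,
               g.revenue AS ground_truth,
               m.revenue AS modeled
        FROM ground_truth_report g
        JOIN modeled_metrics m USING (month)
    """).fetchall()
    failures = []
    for month, truth, modeled in rows:
        if truth and abs(modeled - truth) / abs(truth) > tolerance:
            failures.append((month, truth, modeled))
    return failures

# Tiny in-memory fixture to exercise the check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ground_truth_report (month TEXT, revenue REAL);
    CREATE TABLE modeled_metrics     (month TEXT, revenue REAL);
    INSERT INTO ground_truth_report VALUES ('2024-01', 1000.0), ('2024-02', 1200.0);
    INSERT INTO modeled_metrics     VALUES ('2024-01', 1000.0), ('2024-02', 1100.0);
""")
print(reconcile(conn))  # 2024-02 is off by roughly 8%, so it is flagged
```

Running checks like this on every refresh, alongside standard data-quality tests (row counts, null rates, referential integrity), is what lets the team quantify accuracy and trace discrepancies to a root cause.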
• Project & stakeholder management
• Create project plans, manage timelines, set realistic expectations, and communicate status/risks to customers and internal stakeholders.
• Facilitate sign-offs on metric definitions, data readiness, and production cutovers.
• Risk, security & governance
• Identify data and model risks (PII exposures, inference errors, stale data) and put mitigation controls in place.
• Ensure implementations comply with customer security, data governance, and regulatory requirements.
• Knowledge transfer & documentation
• Produce clear runbooks, metric spec docs, and onboarding artifacts. Train customer users and internal support teams for ongoing operations.
• Continuous improvement
• Feed product/engineering with requirements and lessons learned to improve platform data modeling capabilities and onboarding playbooks.