What are the responsibilities and job description for the Data Product Owner/ Domain lead position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Intake IT Solutions, is seeking the following. Apply via Dice today!
Job Title: Data Product Owner/ Domain lead
Duration: FTE
Location: Newport Beach, CA
Key Skills: Data Product Ownership, SQL & Data Validation, Business-to-Technical Translation, Agile/Scrum Delivery, Data Governance (Collibra/Alation), Snowflake & dbt Exposure, Insurance Domain Expertise
Experience: 12 years
Data Product Strategy | Business-to-Technical Bridge | Governed Data Delivery | Insurance Domain.
8 years in data analytics, data management, or data product roles, with at least 3 years in a product ownership or domain lead capacity on a data modernization program.
Strong SQL fluency: ability to validate data product outputs, write acceptance queries, and review dbt model logic with the engineering team.
Experience working in Agile delivery environments: sprint planning, backlog refinement, story writing, and sprint demos.
Demonstrated ability to translate complex business requirements from domain SMEs into clear, actionable data specifications for engineering teams.
Hands-on experience with data governance tooling (Collibra, Alation, or equivalent): business glossary, data cataloging, certification workflows.
Excellent stakeholder communication skills: able to engage credibly with both business executives and senior data engineers.
Role Summary
We are looking for an experienced Data Product Owner to own the definition, prioritization, and business validation of data products across a Fortune 500 insurance and financial services enterprise's multi-wave data modernization program. You will serve as the business authority for data quality and semantic accuracy, sitting at the intersection of the customer's domain SMEs and WinWire's engineering teams and translating business intent into certified, analytics-ready data products on Snowflake. This is not a passive backlog management role: you will make consequential decisions about what gets built, in what order, and what "done" means.
Key Responsibilities
Data Product Definition & Backlog Ownership
Define and own data product specifications for all Wave 1 domains: Policy & Contract Management, Premium & Billing, Claims & Benefits, Actuarial & Reserves, Financial Accounting, Agent & Distribution, Customer & Party, Product Master, and Compliance & Regulatory
Prioritize the delivery backlog across transformation waves in collaboration with the Delivery Manager, Technical Architect, and the customer Product Owners
Translate business requirements from the customer domain SMEs into validated data specifications, acceptance criteria, and dbt model definitions that the engineering team can execute against
Define data product SLAs (availability, freshness, and quality expectations) in partnership with the DRE Engineering team
Strong communication skills: ability to explain technical concepts clearly.
Proactive, ownership-driven mindset with high accountability.
Ability to collaborate across engineering, operations, and support teams.
Adaptability to fast-paced iterative environments.
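A freshness SLA of the kind named in the responsibilities above can be expressed as a simple pass/fail rule. The sketch below is a hypothetical illustration: the 24-hour threshold, function name, and timestamps are assumptions for demonstration, not details from the posting.

```python
# Hypothetical freshness check for a data product SLA: the product is stale
# if its latest load timestamp is older than the agreed threshold.
from datetime import datetime, timedelta, timezone

FRESHNESS_SLA = timedelta(hours=24)  # illustrative 24-hour freshness target

def is_fresh(last_loaded_at: datetime, now: datetime) -> bool:
    """Return True if the data product meets its freshness SLA."""
    return (now - last_loaded_at) <= FRESHNESS_SLA

now = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
print(is_fresh(datetime(2024, 6, 1, 2, 0, tzinfo=timezone.utc), now))   # loaded 10h ago -> True
print(is_fresh(datetime(2024, 5, 30, 12, 0, tzinfo=timezone.utc), now)) # loaded 48h ago -> False
```

Defining SLAs in this testable form is what lets the role hold engineering accountable to "availability, freshness, and quality expectations" rather than to informal understandings.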
The candidate is expected to deliver the following measurable outcomes:
All 9 Wave 1 data products have clearly documented specifications, accepted acceptance criteria, and a defined certification checklist before engineering begins each domain
Wave 1 UAT completes with fewer than 5% of defects attributed to ambiguous or missing business rules, a direct measure of upstream specification quality
Collibra business glossary has active, reviewed, and approved definitions for all core entities across Wave 1 domains, used and trusted by the customer data stewards
The customer business stakeholders describe the data products as "accurate and trustworthy" in post-launch surveys
Wave 2 backlog is defined, prioritized, and ready for engineering handoff before Wave 1 go-live
Insurance domain knowledge: life insurance, annuities, actuarial, claims, or distribution operations.
Familiarity with ACORD Life & Annuity data standards.
Experience with Profisee MDM or enterprise master data management concepts.
Exposure to Snowflake as a data platform (not required to be a Snowflake engineer).