What are the responsibilities and job description for the Lead Data Architect (Snowflake) position at SAN R&D Business Solutions?
Job Title: Lead Data Architect (Snowflake)
Location: Newport Beach, CA
Work Type: Onsite
Employment Type: FTE/Contract
Visa Requirement: No OPT / CPT
Open to CA LOCALS only
Key skills: Expert in Snowflake-based data platform architecture, AWS S3 lakehouse (medallion), dbt/Matillion pipelines, and enterprise data modeling, with strong governance (Collibra) and CI/CD practices. Experience with large-scale data modernization and AI-enabled data engineering in the insurance domain.
Role Summary:
We are looking for a seasoned Technical Architect to own end-to-end solution architecture for a Fortune 500 insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake/AWS S3/Matillion/dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery workstreams will follow. This is a hands-on architecture role: you will be deeply embedded in the delivery team, not a distant reviewer.
Key Responsibilities:
- Technology stack: Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (Azure DevOps), Git, ACORD Data Model.
- Architect and deliver the enterprise data platform on Snowflake/AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80 source systems and a 7-year historical data migration
- Design the ACORD Life & Annuity-based enterprise data model customized for insurance domains — Policy, Claims, Finance, Actuarial, Agent/Distribution, Customer/Party
- Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
- Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
- Govern architecture decisions across all 5 parallel workstreams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM
Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related technical discipline.
- 12 years of experience in data engineering and analytics, including 3 years in a solution/technical architect role on enterprise-scale programs
- Deep, hands-on expertise in Snowflake — query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
- Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
- Strong command of dbt — project structure, macros, testing frameworks, and CI/CD integration
- Experience architecting data governance solutions using Collibra — catalog, lineage, business glossary, and certification workflows
- Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated (insurance, healthcare, or financial services) environment
- Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines
Soft Skills:
- Strong communication skills—ability to explain technical concepts clearly.
- Proactive, ownership-driven mindset with high accountability.
- Ability to collaborate across engineering, operations, and support teams.
- Adaptability to fast-paced iterative environments.