What are the responsibilities and job description for the Technical Architect position at SAN R&D Business Solutions?
Title - Technical Architect — Data and Analytics
Role - Data Solution Architect (Insurance Domain)
Experience: 12–16 years
Location: United States (Onshore)
Tagline: Platform Architecture | Snowflake Data Mesh | AI-Augmented Delivery | Enterprise-Scale Modernization
Role Summary
We are looking for a seasoned Technical Architect to own end-to-end solution architecture for a Fortune 500 insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake, AWS S3, and dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery work streams will follow. This is a hands-on architecture role — you will be deeply embedded in the delivery team, not a distant reviewer.
Why This Role Matters
This engagement is a generational transformation — migrating 372 production database instances and 80 source systems into a unified, governed, cloud-native data platform over 18–24 months. The architecture you design will directly enable three of the client's most critical strategic programs: real-time operations transformation, new business underwriting modernization, and Finance Oracle Cloud implementation. Getting the foundation right in Wave 1 determines whether Waves 2 and 3 can scale without rework.
What You'll Do
Architecture & Solution Design
Architect and deliver the enterprise data platform on Snowflake and AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80 source systems and a 7-year historical data migration
Design the ACORD Life & Annuity-based enterprise data model customized for insurance domains — Policy, Claims, Finance, Actuarial, Agent/Distribution, Customer/Party
Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
Govern architecture decisions across all 5 parallel work streams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM
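To make the medallion (Bronze–Silver–Gold) layering referenced above concrete, here is a minimal, hedged sketch of the pattern. The record fields, cleansing rules, and aggregation are purely illustrative — in the actual program these layers would live in Snowflake tables managed by Matillion and dbt, not Python.

```python
# Bronze: raw records landed as-is from a hypothetical policy source system.
bronze = [
    {"policy_id": " P-001 ", "premium": "1200.50", "status": "ACTIVE"},
    {"policy_id": "P-002",   "premium": "980.00",  "status": "lapsed"},
    {"policy_id": None,      "premium": "bad",     "status": "ACTIVE"},
]

def to_silver(rows):
    """Silver: cleanse, type-cast, and reject records failing basic rules."""
    silver = []
    for row in rows:
        pid = (row["policy_id"] or "").strip()
        try:
            premium = float(row["premium"])
        except (TypeError, ValueError):
            continue  # a real pipeline would quarantine malformed records
        if not pid:
            continue
        silver.append({"policy_id": pid, "premium": premium,
                       "status": row["status"].upper()})
    return silver

def to_gold(rows):
    """Gold: business-ready aggregate, e.g. total premium by policy status."""
    totals = {}
    for row in rows:
        totals[row["status"]] = totals.get(row["status"], 0.0) + row["premium"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'ACTIVE': 1200.5, 'LAPSED': 980.0}
```

The key idea the sketch shows: Bronze preserves source fidelity, Silver enforces typing and quality rules, and Gold serves curated, consumption-ready shapes.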
Data Platform & Engineering Standards
Design the data ingestion strategy for structured (SQL Server CDC, Oracle, SFTP, APIs) and unstructured (EHR, APS notes, underwriting documents) source systems
Define dbt project structure, modular macro patterns, and Git-integrated version control standards for all transformation logic
Architect the Collibra integration strategy — automated catalog, end-to-end lineage (Matillion → dbt → Snowflake → Tableau/Power BI), business glossary, and certification workflows
Specify the Profisee MDM integration architecture — bi-directional Snowflake Silver layer synchronization, golden record publishing, and insurance-specific match/merge patterns
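The golden-record publishing mentioned above follows a match/merge pattern that can be sketched as below. The match key, field names, and "latest update wins" survivorship rule are illustrative assumptions — Profisee's actual match/merge engine is configurable and far richer than this.

```python
def match_key(rec):
    """Deterministic match on normalized name + date of birth (illustrative)."""
    return (rec["name"].strip().lower(), rec["dob"])

def merge(records):
    """Survivorship rule (assumed): for each attribute, keep the non-null
    value from the most recently updated source record."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value is not None:
                golden[field] = value  # newer records overwrite older values
    return golden

def build_golden_records(source_records):
    """Group source records by match key, then merge each group."""
    groups = {}
    for rec in source_records:
        groups.setdefault(match_key(rec), []).append(rec)
    return [merge(group) for group in groups.values()]

parties = [
    {"name": "Jane Doe ", "dob": "1980-01-01", "phone": None,
     "email": "jane@old.example", "updated": "2023-05-01"},
    {"name": "jane doe", "dob": "1980-01-01", "phone": "555-0100",
     "email": "jane@new.example", "updated": "2024-02-10"},
]

golden = build_golden_records(parties)
print(golden[0]["email"], golden[0]["phone"])  # jane@new.example 555-0100
```

In the real architecture the merged golden records would be published back to the Snowflake Silver layer via the bi-directional synchronization the role defines.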
AI-Driven Delivery & Acceleration
Embed WinWire's WinAIDM AI accelerator framework into the delivery model — AI-powered ingestion, quality validation, dbt transformation, and test case generation
Guide adoption of Snowflake Cortex AI capabilities (AI_EXTRACT, AI_CLASSIFY, AI_COMPLETE) for unstructured data processing within the platform's security perimeter
Define the CI/CD quality gate architecture — automated dbt tests, reconciliation validation (99.9% match), performance benchmarks, and lineage completeness checks
Champion AI-augmented engineering practices (GitHub Copilot, LLM-based accelerators) to drive a 40–50% reduction in development effort across the team
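The reconciliation gate described above (99.9% match) can be sketched as a simple key-level check; the function name and pass/fail shape are hypothetical, and in practice this logic would run as automated dbt tests against Snowflake inside the CI/CD pipeline.

```python
MATCH_THRESHOLD = 0.999  # the program's 99.9% reconciliation target

def reconciliation_gate(source_keys, target_keys, threshold=MATCH_THRESHOLD):
    """Return (passed, match_rate) for a key-level source-vs-target check."""
    source = set(source_keys)
    if not source:
        return True, 1.0  # nothing to reconcile
    matched = len(source & set(target_keys))
    rate = matched / len(source)
    return rate >= threshold, rate

# 10,000 source rows, one missing from the target -> 99.99% match, gate passes.
passed, rate = reconciliation_gate(range(10_000), range(1, 10_000))
print(passed, round(rate, 4))  # True 0.9999
```

A gate like this would block promotion when the match rate dips below threshold, alongside the dbt test, performance benchmark, and lineage completeness checks.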
Technical Leadership & Governance
Lead technical design reviews, architecture decision records (ADRs), and code standards across all engineering work streams
Mentor senior engineers, technical leads, and data modelers — building a high-performing delivery team capable of sustaining the platform post-engagement
Drive the Data Reliability Engineering (DRE) framework: SLO/SLI definition, error budgets, automated monitoring, and incident response patterns for all certified data products
Participate in the three-tier governance model — representing architecture at Program Management and Steering Committee levels
Client Engagement
Translate complex technical architecture into clear, decision-ready recommendations for client technology and business stakeholders
Proactively surface trade-offs (performance vs. cost, speed vs. governance) and recommend options — acting as a trusted guide, not just an executor
Collaborate with the customer's data engineering, infrastructure, and domain SME teams to align platform decisions to business outcomes
Tech Stack Snapshot
Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (Azure DevOps), Git, ACORD Data Model
Must-Have Skills
12+ years in data engineering and analytics, with 3+ years in a solution/technical architect role on enterprise-scale programs
Deep, hands-on expertise in Snowflake — query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
Strong command of dbt — project structure, macros, testing frameworks, and CI/CD integration
Experience architecting data governance solutions using Collibra — catalog, lineage, business glossary, and certification workflows
Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated (insurance, healthcare, or financial services) environment
Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines
Good to Have
Familiarity with ACORD Life & Annuity data standards and insurance domain concepts (Policy, Claims, Actuarial, Reinsurance)
Experience with Profisee MDM or equivalent enterprise MDM platforms
Exposure to Snowflake Cortex AI, AI-assisted development tools (GitHub Copilot, Azure OpenAI), or LLM-based data engineering accelerators
SnowPro Advanced certification (Data Engineer or Architect)
What Success Looks Like (6–12 Months)
Wave 1 platform foundation (Snowflake environments, AWS S3 data lake, Matillion/dbt/Collibra/Profisee, CI/CD pipelines) delivered by Month 3 with no rework required
All 9 Wave 1 certified data products (CMD Life, Finance, Actuarial) achieve SLO targets: 99.9% completeness, 99.5% accuracy, end-to-end lineage in Collibra
Architecture standards and dbt/Matillion templates adopted consistently across all 5 delivery work streams — no divergence in patterns
WinAIDM accelerator framework implemented and contributing to a 40–50% reduction in data engineering effort vs. baseline
Client technical stakeholders describe WinWire as a "trusted architecture guide" — proactive, decision-ready, and commercially aware