What are the responsibilities and job description for the Sr. Fabric Data Engineer position at Inizio Partners?
Role & Responsibilities Overview:
- Architect and implement end‑to‑end data solutions using Microsoft Fabric components including Data Factory, Lakehouse, Warehouse, Real‑Time Analytics, and OneLake.
- Design scalable data models, integration patterns, and storage strategies to support insurance datasets such as policy, claims, billing, actuarial, and customer information.
- Create architectural blueprints, reference architectures, and reusable frameworks that align with enterprise engineering and cloud standards.
- Develop and manage ingestion pipelines using Fabric Data Factory, enabling batch, real‑time streaming, and API‑based data integrations.
- Build Lakehouse‑centered architectures to support advanced analytics, reporting, and machine learning workloads.
- Optimize data pipelines for performance, reliability, and cost efficiency across Azure and Fabric environments.
- Collaborate with Data Governance teams to ensure adherence to enterprise data standards, lineage tracking, metadata management, and data quality rules.
- Implement governance and compliance capabilities within Fabric, including cataloging, sensitivity labeling, and granular access control.
- Support regulatory requirements for insurance data, ensuring compliance with NAIC guidelines, state-level regulations, and data privacy laws.
- Partner with cloud engineering and security teams to optimize Fabric deployments integrated with Azure AD, Microsoft Purview, and Power BI.
- Ensure architectural alignment with enterprise cloud strategy, security frameworks, and operational guardrails.
- Evaluate new Microsoft Fabric features, capabilities, and roadmap items, recommending adoption and modernization strategies.
- Collaborate with business and analytics stakeholders to translate domain requirements into robust, scalable data solutions.
- Provide technical direction and architectural guidance to data engineers, analysts, and BI developers across projects.
- Lead design reviews and enforce architectural consistency across data initiatives and engineering teams.
- Define and implement Fabric best practices for performance tuning, storage optimization, and cost governance.
- Establish strong data security models, including RBAC, encryption, data masking, and secure access patterns.
- Monitor Fabric platform health and proactively manage risks, ensuring high availability and operational resilience.
Candidate Profile:
- 7 years of experience in data architecture, data engineering, or analytics platform design.
- Strong hands-on experience with Microsoft Fabric and related Azure services.
- Deep understanding of insurance data domains (policy, claims, underwriting, actuarial, billing).
- Expertise in data modeling (dimensional, canonical, semantic models).
- Strong knowledge of data governance, metadata management, and data quality practices.
- Proficiency with SQL, Python, and modern data engineering frameworks.
- Excellent communication and stakeholder management skills.
- Experience with Microsoft Purview, Power BI, Databricks, or Azure Synapse.
- Familiarity with modern data architecture concepts (data mesh, data fabric, Lakehouse).
- Certifications such as Azure Data Engineer.
- Prior experience in P&C or Life insurance data modernization programs.
- Able to work from the client location in Madison, Wisconsin.
#madisonjobs #insurancejobs #microsoftfabric #azure #azurejobs #fulltimejobs #dejobs #dataengineerjobs
Salary: $140,000 - $160,000