What are the responsibilities and job description for the Data Architect Microsoft Fabric Transformation position at STAFFXPERT LLC?
Job Title: Data Architect Microsoft Fabric Transformation
Location: St. Paul / Minneapolis, MN (Hybrid)
Job Summary
STAFFXPERT LLC is seeking a Data Architect Microsoft Fabric Transformation on behalf of our client in St. Paul / Minneapolis, MN. This role will lead the modernization of an enterprise data platform, transitioning from legacy systems to a scalable, cloud-based architecture using Microsoft Fabric.
The ideal candidate will play a critical role in designing a robust Lakehouse architecture, enabling advanced analytics, and establishing best practices across data engineering, modeling, governance, and performance optimization.
Key Responsibilities
- Lead the migration from legacy SQL Server-based systems to Microsoft Fabric
- Design and implement scalable Lakehouse architecture (Bronze, Silver, Gold layers)
- Architect end-to-end data pipelines for ingestion, transformation, and delivery
- Develop and optimize data workflows using PySpark and Spark SQL
- Define and enforce data modeling standards across Lakehouse and Warehouse layers
- Deliver high-quality, business-ready datasets for analytics and reporting
- Optimize performance for distributed data processing workloads
- Establish and maintain data governance, quality, and security frameworks
- Collaborate with cross-functional teams including engineering, analytics, and business stakeholders
- Mentor data engineers and promote modern ELT best practices
- Modernize legacy application logic into Fabric-native solutions
Qualifications
- 5 years of experience in Data Architecture or Data Engineering
- Hands-on experience with Microsoft Fabric (Lakehouse, Warehouse, OneLake) or similar platforms such as Databricks, Snowflake, or Azure Synapse
- Strong expertise in Microsoft SQL Server and advanced T-SQL
- Experience with PySpark, Spark SQL, and notebook-based development
- Proven track record of building modern data platforms (Lakehouse or Data Warehouse)
- Strong understanding of ETL/ELT processes and distributed data systems
- Experience with data modeling techniques (star schema, dimensional modeling)
- Experience working in Agile/Scrum environments
- Experience migrating legacy systems to cloud-based architectures
- Background in modernizing C#-based data solutions
- Exposure to real-time or streaming data processing
- Knowledge of data governance, lineage, and metadata management tools
- Experience working in enterprise-scale or regulated environments