What are the responsibilities and job description for the Fabric Data Engineer position at New York Technology Partners?
Position Title: Fabric Data Engineer
Location: Phoenix, AZ (Onsite 5 days)
Experience: 8 years
Responsibilities:
• Builds and maintains ELT/ETL pipelines using Microsoft Fabric tools, enabling efficient data ingestion from multiple sources (a representative sketch follows this list).
• Applies transformations, cleanses, and enriches data to ensure it is ready for analysis and reporting.
• Handles large datasets, optimizing storage and retrieval for performance.
• Implements automation for data processing and integration workflows, reducing manual intervention.
• Works with Platform Architects to ensure infrastructure supports data requirements.
• Partners with report developers to ensure that data is in a usable format and ready for analysis.
• Ensures code reusability and parameterization.
• Focuses on creating interactive, intuitive reports and dashboards using Microsoft Fabric's reporting tools.
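As a rough, hypothetical illustration of the pipeline work above, a parameterized PySpark cleansing step in a Fabric notebook might look like the sketch below. The path, table, and column names are placeholders chosen for this example, not part of the role description; in practice, parameters would be injected by the orchestrating Data Factory pipeline.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative placeholders; real values would be pipeline parameters.
    source_path = "Files/raw/sales"       # raw files landed in the Lakehouse
    target_table = "curated_sales"        # curated Delta table to publish

    spark = SparkSession.builder.getOrCreate()

    # Ingest raw CSV files, then cleanse and enrich before publishing.
    raw = spark.read.option("header", True).csv(source_path)
    cleansed = (
        raw.dropDuplicates()
           .filter(F.col("order_id").isNotNull())          # simple quality rule
           .withColumn("order_date", F.to_date("order_date"))
           .withColumn("load_ts", F.current_timestamp())   # audit column
    )

    # Publish as a Delta table so report developers get analysis-ready data.
    cleansed.write.format("delta").mode("overwrite").saveAsTable(target_table)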
Qualifications:
• Data Factory (in Fabric): Designing and orchestrating data ingestion and transformation pipelines (ETL/ELT).
• Data Engineering Experience (Spark): Using Notebooks (PySpark, Spark SQL, Scala) and Spark Job Definitions for complex data processing, cleansing, enrichment, and large-scale transformations directly on OneLake data.
• Lakehouse Items: Creating and managing Lakehouse structures (Delta tables, files) as the primary landing and processing zone within OneLake.
• OneLake / ADLS Gen2: Understanding storage structures, Delta Lake format, partitioning strategies, and potentially managing Shortcuts (see the partitioning sketch after this list).
• Monitoring Hub: Tracking pipeline runs and Spark job performance.
• Core Responsibilities (Fabric Context):
  • Building ingestion pipelines from diverse sources
  • Implementing data cleansing and quality rules
  • Transforming raw data into curated Delta tables within Lakehouses or Warehouses
  • Optimizing Spark jobs and data layouts for performance and cost
  • Managing pipeline schedules and dependencies
  • Ensuring data security and governance principles are applied to pipelines and data structures
• Excellent communication and collaboration skills
• Bachelor’s degree in Computer Science, Engineering, or a relevant field
• Azure/AWS cloud certifications will be a plus.
• Experience in the manufacturing domain will be a plus.
• Self-driven, with the ability to drive projects to delivery
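As a similar hedged sketch of the Delta Lake and partitioning skills listed above, writing and compacting a partitioned Delta table from a Fabric Spark notebook could look like the following. The table and column names are assumptions made for illustration only.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Re-lay out a curated table partitioned by date, a common strategy for
    # enabling partition pruning on large Delta tables in OneLake.
    events = spark.read.table("curated_sales")   # hypothetical source table
    (
        events.write.format("delta")
              .mode("overwrite")
              .partitionBy("order_date")
              .saveAsTable("sales_by_date")
    )

    # Compact small files so scan performance stays predictable over time.
    spark.sql("OPTIMIZE sales_by_date")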