What are the responsibilities and job description for the Data Engineer position at Dale WorkForce Solutions?
Data Engineer
Direct Hire Opportunity
Somerset, NJ - onsite 5x per week
Position Summary
The Data Engineer is responsible for designing, building, and maintaining robust data pipelines and solutions to support business analytics and reporting. This role focuses on leveraging Microsoft Azure, Microsoft Fabric, PySpark, SQL, APIs, and ETL processes to ensure efficient, secure, and high-quality data delivery for enterprise analytics platforms such as Power BI. The Data Engineer will work fully on-site, collaborating closely with cross-functional teams to drive data efficiency and innovation.
Essential Responsibilities
Data Engineering & Pipeline Development
- Design, develop, and maintain data pipelines using Azure Data Factory, Azure Synapse, and Microsoft Fabric Dataflows.
- Implement and maintain Medallion architecture in Fabric and Azure environments.
- Build ingestion and transformation flows that efficiently convert raw Parquet files into Delta tables to support curated, incremental, and governed datasets optimized for Power BI.
- Implement efficient data lakehouse patterns within Microsoft Fabric.
- Design and maintain star schemas, dimensional models, and semantic layers for Power BI.
- Collaborate with Power BI developers to ensure models are performant and aligned with reporting needs.
- Tune queries, optimize partitioning, and manage performance across Fabric and Synapse environments.
- Ensure solutions follow Azure best practices for scalability, cost efficiency, and security.
- Implement data governance, lineage, and cataloging via Microsoft Purview and Fabric capabilities.
- Partner with IT security and compliance teams to enforce data access controls.
- Handle all Help Desk tickets related to reporting.
- Perform general software and hardware installations as required.
- Handle related email and telephone correspondence in a timely and professional manner.
- Respond to inquiries to resolve problems, provide information, and/or refer requesters to the appropriate personnel.
- Perform other job functions as assigned.
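The incremental, governed loading pattern described above (landing raw files in a bronze layer, then promoting only new or changed rows to a curated silver layer) can be sketched in plain Python. This is a simplified illustration of the watermark-based incremental pattern only, not the actual Fabric/PySpark implementation; the `Order` record type, `promote_to_silver` helper, and epoch-second watermark are hypothetical stand-ins for Delta table reads and timestamp columns.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical record type standing in for rows landed in the bronze layer.
@dataclass
class Order:
    order_id: int
    amount: float
    updated_at: int  # epoch seconds; a real pipeline would use timestamp columns

def promote_to_silver(bronze: List[Order], watermark: int) -> List[Order]:
    """Keep only rows newer than the last processed watermark (incremental load)."""
    return [row for row in bronze if row.updated_at > watermark]

bronze_rows = [
    Order(1, 19.99, 100),
    Order(2, 5.00, 250),
    Order(3, 42.50, 300),
]
last_watermark = 200  # persisted after the previous pipeline run

silver_rows = promote_to_silver(bronze_rows, last_watermark)
new_watermark = max(row.updated_at for row in silver_rows)  # stored for the next run
```

In a Fabric or Synapse pipeline, the same idea would typically be expressed with a PySpark filter over a Delta table and a merge into the silver layer, so each run processes only the delta since the saved watermark.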
Required Qualifications
- Minimum 5 years of relevant experience.
- Strong expertise in data languages (Python, SQL, DAX, M, R, etc.) and handling large data sets using PySpark.
- Strong expertise in Azure Data Factory, Azure Synapse Analytics, Databricks (nice-to-have), and Microsoft Fabric.
- Hands-on experience implementing Medallion architecture in a lakehouse environment.
- Expertise in transforming Parquet datasets into Delta tables for reliable, incremental processing.
- Experience building ETL/ELT pipelines and working with both structured and unstructured data.
- Familiarity with CI/CD for data pipelines is a plus.
- Experience with Power BI integration and data modeling.
- Excellent analytical and problem-solving skills.
- Excellent record-keeping and auditing skills.
- Excellent communication skills, both oral and written.
- Strong attention to detail and ability to self-check work.
- Excellent time management skills.
- Enthusiastic, cooperative team player who is adaptable to new or changing circumstances, professional, sensitive to client needs, self-motivated, creative, and innovative.