What are the responsibilities and job description for the Data Engineer Azure Data Platforms & Data Modeling position at Nisum?
$140K - $145K / PA

What you'll do:
- Design, develop, and maintain scalable, high-performance data pipelines across Azure cloud platforms.
- Build and orchestrate ETL/ELT workflows using Azure Data Factory (ADF).
- Develop enterprise-grade data models (dimensional, relational, lakehouse) to support analytics and reporting needs.
- Implement and optimize data processing solutions using Azure Databricks and distributed computing frameworks.
- Support AI/ML initiatives by enabling curated datasets and workflow integration using Azure Machine Learning pipelines.
- Collaborate with data scientists, analysts, architects, and business stakeholders to deliver trusted data products.
- Ensure best practices in data governance, security, quality, and compliance across platforms.
- Monitor and optimize pipeline reliability, performance, and cost efficiency in production environments.
- Contribute to CI/CD automation and operational excellence for enterprise data workflows.
- Strong data modeling mindset (enterprise, analytics, and lakehouse architectures).
- Ownership of end-to-end data architecture and delivery.
- Ability to work cross-functionally with technical and business teams.
- Clear communication and stakeholder alignment.
- Strong decision-making in platform and architecture design.
- Enterprise focus on reliability, security, and scalability.
What you know:

- 8–10 years of experience in Data Engineering, Data Platform Engineering, or related roles.
- Strong experience building cloud-scale data solutions in the Microsoft Azure ecosystem.
- Hands-on expertise with Azure Data Factory for orchestration and automation.
- Experience with Azure Databricks for large-scale data transformation and processing.
- Strong proficiency in SQL and Python for data engineering workflows.
- Experience working with Azure Data Lake Storage (ADLS) and modern lakehouse architectures.
- Familiarity with Azure Machine Learning pipelines and supporting feature/data workflows for ML teams.
- Strong understanding of data governance, lineage, and quality frameworks.
- Experience with Delta Lake and modern lakehouse design.
- Familiarity with DevOps practices, CI/CD pipelines, and Infrastructure-as-Code.
- Experience working in large-scale enterprise environments with complex data integration needs.
- Knowledge of Azure security best practices (RBAC, Key Vault, encryption).
Education:

- Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field (required).
- Master’s degree is a plus.