What are the responsibilities and job description for the Lead Snowflake Developer (Azure) position at SES?
Job Details
Systems Engineering Services is seeking an Azure/Snowflake Lead based out of the Chicago, IL area.
Top Technical Skills:
- 8 years of overall experience in Data Engineering / Data Warehousing / Analytics.
- 5 years of hands-on experience with Snowflake in production environments.
- Proven experience leading at least one large end-to-end migration from on-prem / legacy DWH to Snowflake on Azure (Netezza, Yellowbrick, Oracle, SQL Server, etc.).
- Strong experience with Azure cloud services: Azure Data Factory, Data Lake Storage (ADLS), Azure Databricks and/or Synapse, Key Vault, and Azure DevOps or GitHub.
Job Description:
- Lead the end-to-end architecture and design of Snowflake-based data platforms on Azure, including integration with Azure services (ADF, Synapse pipelines, Azure Functions, Key Vault, ADLS, etc.).
- Define and implement data modeling standards (star/snowflake schema, data vault, dimensional modeling) tailored for analytics, BI, and downstream data products.
- Design secure, scalable, and cost-efficient Snowflake environments, including databases, schemas, roles, resource monitors, and virtual warehouses.
- Lead migration strategy and roadmap for moving data from legacy/on-prem systems to Snowflake on Azure.
- Work with stakeholders to assess current state (source systems, ETL, reporting, data quality) and design target-state architecture on Snowflake.
- Define migration waves/phases, including data profiling, schema conversion, historical load, incremental load, and cutover strategy.
- Oversee and implement data ingestion pipelines from various sources (databases, flat files, APIs, streaming) into ADLS landing zones and then into Snowflake using tools like Azure Data Factory, Synapse pipelines, or Databricks, plus CDC where applicable.
- Manage data reconciliation and validation to ensure completeness, accuracy, and performance parity (or improvement) compared to legacy platforms.
- Lead a team of data engineers / ETL developers delivering Snowflake-based solutions and migration workstreams.
- Define and enforce coding standards, code review practices, and CI/CD pipelines for Snowflake objects (SQL, stored procedures, views, tasks, streams).
- Design & build ELT/ETL patterns (staging ? raw ? curated ? semantic layers), using tools such as dbt, ADF, Synapse, Databricks, or other orchestration tools.
- Implement automated testing frameworks (unit tests, regression tests, data quality checks) and monitoring against defined SLAs.
- Monitor query performance and optimize Snowflake workloads using query profiling, clustering, partitioning, and warehouse sizing strategies.
- Implement resource monitors, auto-scaling, and auto-suspend policies to optimize compute usage and manage Snowflake consumption costs (a brief SQL sketch of these controls follows this list).
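As a rough illustration of the cost-control items above (resource monitors, auto-suspend, warehouse sizing), the Snowflake SQL below is a minimal sketch only: the names ANALYTICS_RM and ANALYTICS_WH and the credit quota are hypothetical, and resource monitors can only be created by an account administrator.

```sql
-- Minimal cost-control sketch; ANALYTICS_RM / ANALYTICS_WH and the quota
-- values are hypothetical. Run as ACCOUNTADMIN (required for resource monitors).
CREATE RESOURCE MONITOR analytics_rm
  WITH CREDIT_QUOTA = 100            -- illustrative monthly credit budget
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80 PERCENT DO NOTIFY          -- warn before the budget is exhausted
    ON 100 PERCENT DO SUSPEND;       -- suspend assigned warehouses at the quota

CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WITH WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60                  -- suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE                 -- resume automatically on the next query
  INITIALLY_SUSPENDED = TRUE
  RESOURCE_MONITOR = analytics_rm;   -- tie the warehouse to the monitor
```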
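Similarly, the staging → raw → curated layering referenced in the ELT/ETL bullet can be sketched with Snowflake streams and tasks. Everything below is illustrative: the RAW and CURATED schemas, the ORDERS tables and their columns are hypothetical, and the task assumes a virtual warehouse named ANALYTICS_WH (such as the one in the cost-control sketch) already exists.

```sql
-- Illustrative raw-to-curated promotion using a stream plus a scheduled task.
-- Schema, table, and column names are hypothetical.
CREATE SCHEMA IF NOT EXISTS raw;
CREATE SCHEMA IF NOT EXISTS curated;

CREATE TABLE IF NOT EXISTS raw.orders     (order_id NUMBER, order_status STRING);
CREATE TABLE IF NOT EXISTS curated.orders (order_id NUMBER, order_status STRING);

-- Capture new and changed rows as they land in the raw layer.
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

-- Scheduled task that promotes pending rows into the curated layer.
CREATE OR REPLACE TASK curated.load_orders
  WAREHOUSE = analytics_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
AS
  MERGE INTO curated.orders AS tgt
  USING raw.orders_stream   AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.order_status = src.order_status
  WHEN NOT MATCHED THEN INSERT (order_id, order_status)
    VALUES (src.order_id, src.order_status);

ALTER TASK curated.load_orders RESUME;   -- tasks are created in a suspended state
```

In practice the same pattern would typically be generated and version-controlled through dbt, ADF, or Databricks orchestration rather than hand-written DDL, in line with the tooling listed above.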
Jake Lutman
Technical Recruiter
Systems Engineering Services