What are the responsibilities and job description for the AI Snowflake Data Engineer (W2 only) position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Apetan Consulting, is seeking the following. Apply via Dice today!
Position: Senior Snowflake Data Engineer (W2 only)
Location: Detroit, MI (remote, with 2 days onsite per month)
Job Description:
About The Role
We are looking for a Senior Snowflake Data Engineer with deep expertise in modern data
platforms and large‑scale cloud data architectures. This role is part of a high‑visibility initiative
to build a unified enterprise data foundation powering advanced analytics, AI/ML workloads,
and mission‑critical decision systems.
You will design complex Snowflake architectures, lead data engineering best practices, mentor
engineers, and drive end‑to‑end data platform modernization at scale.
This is a role for senior, hands‑on engineers who excel in solving hard problems, optimizing
systems, and driving technical excellence in fast‑paced environments.
Key Responsibilities
Architecture & System Design
- Own the end‑to‑end architecture, design, and optimization of Snowflake environments.
- Build scalable data ingestion, transformation, and orchestration frameworks.
- Architect complex ELT pipelines using Snowflake Streams, Tasks, and Snowpipe.
- Create performant dimensional and data vault models.
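One modeling pattern behind the dimensional-model bullet is the Type 2 slowly changing dimension, where changed records are closed out and re-inserted rather than overwritten. A minimal sketch in plain Python, with all field names illustrative (in Snowflake itself this is usually a MERGE driven by a Stream of changed rows):

```python
from datetime import date

OPEN_END = date(9999, 12, 31)  # sentinel end date meaning "still current"

def apply_scd2(dimension, incoming, as_of):
    """Type 2 SCD merge (illustrative).

    dimension: list of dicts with keys id, attrs, valid_from, valid_to.
    incoming: dict of id -> attrs holding the latest source state.
    """
    current = {r["id"]: r for r in dimension if r["valid_to"] == OPEN_END}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:
            # brand-new entity: open its first version
            dimension.append({"id": key, "attrs": attrs,
                              "valid_from": as_of, "valid_to": OPEN_END})
        elif row["attrs"] != attrs:
            # changed entity: close the old version and open a new one
            row["valid_to"] = as_of
            dimension.append({"id": key, "attrs": attrs,
                              "valid_from": as_of, "valid_to": OPEN_END})
    return dimension
```

The history this produces is what lets analysts query "as of" any point in time, which is the main reason dimensional models carry validity ranges at all.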
Advanced Engineering & Optimization
- Lead performance tuning, including clustering, micro‑partition optimization, and query profiling.
- Drive cost governance, warehouse sizing strategies, and auto‑suspend/auto‑resume configuration.
- Build reusable frameworks for schema evolution and metadata management.
- Develop CI/CD workflows for data transformations and infrastructure-as-code.
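The cost-governance bullet is easy to make concrete: Snowflake bills warehouses per second while running, in credits that double with each size step (XS = 1 credit/hour, S = 2, M = 4, and so on), so auto-suspend savings reduce to simple arithmetic. A back-of-the-envelope sketch (the 6-hours-active workload is an assumed example, not from the posting):

```python
# Snowflake credit rates per hour by warehouse size (doubling per step).
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def monthly_credits(size, active_hours_per_day, days=30):
    """Credits consumed if the warehouse runs only while active."""
    return CREDITS_PER_HOUR[size] * active_hours_per_day * days

# Compare an always-on Medium warehouse with one that auto-suspends
# outside an assumed ~6 active hours per day.
always_on = monthly_credits("M", 24)  # runs around the clock
suspended = monthly_credits("M", 6)   # suspends when idle
savings = always_on - suspended
```

Even this toy calculation shows why auto-suspend and right-sizing are the first levers pulled in warehouse cost reviews: the idle hours dominate the bill.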
AI/ML Data Enablement
- Partner closely with AI/ML teams to deliver feature‑ready datasets and high‑throughput data access.
- Architect data flows to support model training, validation, and batch/real-time inference.
- Enable feature stores, embedding pipelines, and vectorized data workflows where needed.
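"Feature-ready datasets" typically means raw events pre-aggregated into one row of model inputs per entity. A minimal sketch of that batch step, with event shape and feature names purely illustrative:

```python
from collections import defaultdict

def build_features(events):
    """Turn raw (user_id, page) click events into per-user feature rows
    of the kind an ML team would consume for training (illustrative)."""
    counts = defaultdict(int)
    pages = defaultdict(set)
    for user_id, page in events:
        counts[user_id] += 1
        pages[user_id].add(page)
    return {u: {"event_count": counts[u],
                "distinct_pages": len(pages[u])}
            for u in counts}
```

In a feature-store setup, the same transformation would be versioned and materialized on a schedule so that training and inference read identical feature definitions.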
Leadership & Collaboration
- Provide technical leadership to data engineering teams, drive best practices, and guide design decisions.
- Work with cross‑functional stakeholders, including platform engineering, product, and analytics teams.
- Lead code reviews, mentor junior engineers, and raise the overall engineering bar.
- Implement strong role-based access control, data masking, and enterprise‑grade security controls.
- Establish data quality SLAs: validation rules, anomaly detection, and automated alerting.
- Build monitoring dashboards for pipeline observability, reliability metrics, and incident response.
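The anomaly-detection bullet above can be as simple as a statistical check on daily pipeline volumes. A sketch of one common form, a z-score test against historical row counts (the threshold and metric are assumptions, not the posting's spec):

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Flag today's row count as anomalous when it sits more than
    `threshold` standard deviations from the historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # no historical variance: any deviation at all is suspicious
        return today != mean
    return abs(today - mean) / stdev > threshold
```

A check like this usually feeds the alerting and dashboard layers mentioned in the same bullets, turning a silent data drop into a paged incident.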
Required Qualifications
- 6–12 years of experience in data engineering, with deep hands‑on Snowflake expertise.
- Expert-level proficiency in SQL, advanced query optimization, and distributed data processing.
- Strong experience with Python and building production-grade data pipelines.
- Hands‑on experience with Airflow, dbt, Dagster, or similar orchestration/ELT tools.
- Strong understanding of cloud ecosystems (AWS/Google Cloud Platform/Azure), including IAM and related security services.
- Proven track record designing enterprise-scale data architectures for complex analytics workloads.
- Experience leading engineering efforts, mentoring, and driving technical direction.
- Experience supporting AI/ML engineering workflows or building ML‑ready data layers.
- Deep knowledge of Snowflake features such as:
- Zero-copy cloning
- Resource monitors
- Streams, Tasks, Pipes
- Time Travel & Fail-safe
- Exposure to event-driven data pipelines, Kafka, Kinesis, Pub/Sub, or similar platforms.
- Background in consulting, platform modernization, or large enterprise transformation programs.
What Success Looks Like
- You design high‑performance, scalable Snowflake data systems that handle complex workloads.
- You proactively identify architectural gaps and deliver robust, forward-looking solutions.
- You mentor engineers and become a technical backbone for the data platform.
- You consistently deliver reliable, high-quality data to downstream AI, analytics, and reporting consumers.