What are the responsibilities and job description for the Snowflake Data Engineer - Chicago, IL / Columbus, OH position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Radiantze, is seeking the following. Apply via Dice today!
Job Title: Snowflake Data Engineer
Location: Columbus, OH / Chicago, IL
Duration: Long Term
Job Summary
We are looking for a skilled Snowflake Data Engineer to design, build, and maintain scalable data pipelines and data warehouse solutions using Snowflake. The ideal candidate should have strong experience in cloud data platforms, ETL/ELT processes, and performance optimization.
Key Responsibilities
- Design and implement data solutions using Snowflake Cloud Data Platform
- Build and maintain scalable ETL/ELT pipelines
- Develop data models (star/snowflake schemas)
- Optimize Snowflake performance (clustering, partitioning, query tuning)
- Integrate data from multiple sources (APIs, databases, flat files)
- Work with cloud platforms like AWS / Azure / Google Cloud Platform
- Ensure data quality, integrity, and governance
- Collaborate with data analysts, scientists, and business teams
- Automate workflows using tools like Airflow, dbt, or similar
- Monitor and troubleshoot data pipelines
Required Skills
- Strong experience with Snowflake
- Proficiency in SQL and query optimization
- Experience with Python / Scala / Java
- Hands-on with ETL tools (Informatica, Talend, dbt, etc.)
- Knowledge of cloud platforms (AWS S3, Azure Data Lake, Google Cloud Platform BigQuery)
- Experience with data warehousing concepts
- Familiarity with version control (Git)
- Experience with Snowpipe, Streams, and Tasks
- Knowledge of Data Lake architecture
- Experience with CI/CD pipelines
- Understanding of data governance and security
- Certification in Snowflake or cloud platforms
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3 years of experience in data engineering or similar role
- Experience in real-time streaming (Kafka, Spark Streaming)
- Exposure to BI tools (Power BI, Tableau)
- Agile/Scrum experience
Tech Stack
- Snowflake
- SQL
- Python
- dbt / Airflow
- AWS / Azure / Google Cloud Platform
- Git
Radiantze Inc