What are the responsibilities and job description for the Snowflake Developer position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, B2B Technology, is seeking the following. Apply via Dice today!
Our direct client is looking for a Data Engineer/Snowflake Developer, hybrid in the Minneapolis, MN area. Duration: 12 months with extension. Please send your rate, contact details, and current location.
No H-1B, OPT/CPT
Primary Job Duties/ Tasks
- Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine required transformations and optimal migration strategies into Snowflake.
- Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture using Snowflake.
- Re-engineer legacy reporting logic by translating it into Snowflake SQL and optimizing its performance.
- Develop and automate ELT/ETL data pipelines using Snowflake's native features and tools such as Snowpipe, Streams, Tasks, and Informatica, and integration with external orchestration tools (e.g., dbt, Airflow).
- Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker).
- Optimize query performance and data governance by implementing best practices in Snowflake for security, access control, caching, clustering, and cost monitoring.
- Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.
Required Skills/ Experience
- Expert-level SQL programming is REQUIRED for this position.
- Proven experience with Snowflake platform architecture and data warehousing concepts.
- Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
- Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
- Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
- Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
- Familiarity with dimensional modeling (Star/Snowflake Schema), OLAP concepts, and reporting layers for BI tools.
- Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
- Strong understanding of current data governance concepts and best practices.
- Knowledge of data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
- Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend.
- Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
- Familiarity with Agile/SCRUM frameworks and experience working in iterative development cycles.
- Experience with Oracle Data Warehouse.
- Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions).
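For candidates gauging fit, the pipeline automation duties above map to Snowflake's native objects. The following is a minimal sketch of a Snowpipe → Stream → Task flow; all object names (raw_orders, orders_stage, transform_wh, etc.) are hypothetical placeholders, not details from this role.

```sql
-- Continuous load from an external stage (e.g., an S3 bucket) via Snowpipe
CREATE PIPE raw_orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_orders
  FROM @orders_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Stream to capture new rows landing in the raw table
CREATE STREAM raw_orders_stream ON TABLE raw_orders;

-- Task that merges captured changes into a curated table on a schedule,
-- running only when the stream actually has new data
CREATE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  MERGE INTO curated.orders t
  USING raw_orders_stream s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount)
    VALUES (s.order_id, s.amount);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK merge_orders_task RESUME;
```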
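The governance requirements (RBAC, masking policies, secure views) likewise correspond to concrete Snowflake DDL. Below is a hedged sketch under assumed names (analyst_role, PII_ADMIN, reporting.marts.customers); it is illustrative only, not part of the client's environment.

```sql
-- Role-based access control: grant a reporting role the access it needs
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE reporting TO ROLE analyst_role;
GRANT USAGE ON SCHEMA reporting.marts TO ROLE analyst_role;

-- Dynamic data masking: hide email addresses from non-privileged roles
CREATE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE reporting.marts.customers
  MODIFY COLUMN email SET MASKING POLICY mask_email;

-- Secure view: definition is hidden from consumers, safe for sharing
CREATE SECURE VIEW reporting.marts.customer_summary AS
  SELECT customer_id, region, email
  FROM reporting.marts.customers;

GRANT SELECT ON VIEW reporting.marts.customer_summary TO ROLE analyst_role;
```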