What are the responsibilities and job description for the Snowflake Developer position at B2B Technology?
Our direct client is looking for a Data Engineer/Snowflake Developer, hybrid in the Minneapolis, MN area. Duration: 12 months with possible extension. Please send your rate, contact details, and current location.
No H-1B or OPT/CPT.
Primary Job Duties/Tasks
- Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine required transformations and optimal migration strategies into Snowflake.
- Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture using Snowflake.
- Re-engineer legacy reporting logic by translating it into Snowflake SQL and optimizing its performance.
- Develop and automate ELT/ETL data pipelines using Snowflake's native features (Snowpipe, Streams, Tasks), tools such as Informatica, and integration with external orchestration tools (e.g., dbt, Airflow).
- Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker).
- Optimize query performance and data governance by implementing best practices in Snowflake for security, access control, caching, clustering, and cost monitoring.
- Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.
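To illustrate the kind of pipeline automation the duties above describe, here is a minimal sketch of a Streams-and-Tasks incremental load in Snowflake SQL. All object names (raw_orders, curated_orders, transform_wh, etc.) are hypothetical placeholders, not anything specified in this posting.

```sql
-- Capture inserts into a hypothetical landing table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A scheduled task that moves new rows into a curated table
-- only when the stream actually has data.
CREATE OR REPLACE TASK load_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO curated_orders (order_id, amount, loaded_at)
  SELECT order_id, amount, CURRENT_TIMESTAMP()
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to activate.
ALTER TASK load_orders RESUME;
```

In practice, Snowpipe (or an orchestrator such as dbt or Airflow) would feed the landing table, and the stream/task pair handles the downstream transformation.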
Job Specific Knowledge, Skills, and Abilities (KSAs)
- Expert-level SQL programming is REQUIRED for this position.
- Proven experience with Snowflake platform architecture and data warehousing concepts.
- Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
- Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
- Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
- Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
- Familiarity with dimensional modeling (Star/Snowflake Schema), OLAP concepts, and reporting layers for BI tools.
- Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
- Strong understanding of current data governance concepts and best practices.
- Knowledge of data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
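As a concrete example of the governance skills listed above (masking policies and RBAC), here is a minimal Snowflake SQL sketch. The role, database, and column names are hypothetical placeholders.

```sql
-- Mask an email column for everyone except a privileged role.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
  RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
       ELSE '*** MASKED ***' END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY email_mask;

-- Basic RBAC: a read-only reporting role scoped to one schema.
CREATE ROLE IF NOT EXISTS reporting_ro;
GRANT USAGE ON DATABASE analytics TO ROLE reporting_ro;
GRANT USAGE ON SCHEMA analytics.curated TO ROLE reporting_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE reporting_ro;
```

Secure views and secure shares layered on top of roles like this are the usual way Snowflake exposes governed data to BI tools.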
Preferred KSAs:
- Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend.
- Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
- Familiarity with Agile/SCRUM frameworks and experience working in iterative development cycles.
- Experience with Oracle Data Warehouse.
- Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions).