What are the responsibilities and job description for the Snowflake Developer position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Arbor Tek Systems, is seeking the following. Apply via Dice today!
Job Title: Snowflake Developer
Location: Dallas, Texas (local candidates only)
Mail:
Summary:
We are seeking a skilled Snowflake Developer to design, develop, and optimize data solutions on the Snowflake platform. The ideal candidate will have strong experience in cloud data warehousing, ETL/ELT processes, and performance tuning, along with expertise in SQL and modern data tools.
Key Responsibilities:
- Design and develop scalable data models and pipelines using Snowflake
- Build and optimize ETL/ELT workflows using tools like Azure Data Factory / DBT / Informatica / Airflow
- Write complex and efficient SQL queries for data transformation and reporting
- Implement data integration from multiple sources (APIs, databases, flat files, etc.)
- Monitor and improve query performance and warehouse utilization in Snowflake
- Work with cloud platforms like AWS / Azure / Google Cloud Platform for data solutions
- Ensure data quality, governance, and security best practices
- Collaborate with data analysts, data engineers, and business stakeholders
- Troubleshoot data issues and provide production support
Required Skills and Qualifications:
- Strong hands-on experience with Snowflake Data Warehouse
- Expertise in SQL and data modeling concepts (Star Schema, Snowflake Schema)
- Experience with ETL/ELT tools (ADF, DBT, Talend, Informatica, etc.)
- Knowledge of cloud platforms (AWS / Azure / Google Cloud Platform)
- Familiarity with Python or scripting languages
- Experience with data warehousing concepts and large-scale data processing
- Understanding of performance tuning and optimization in Snowflake
- Experience with dbt (Data Build Tool)
- Knowledge of Data Lakes (S3, ADLS, GCS)
- Familiarity with CI/CD pipelines and DevOps practices
- Exposure to BI tools (Power BI, Tableau, Sigma, etc.)
- Snowflake certifications (preferred but not mandatory)