What are the responsibilities and job description for the Snowflake Developer only USC position at Ampstek?
Role :: Snowflake Developer
Location :: Johnston, RI 02919
Duration :: Long Term Contract
Eligibility :: U.S. citizens (USC) only
We are seeking an experienced Snowflake Developer to design, develop, and optimize data pipelines and data warehouse solutions using Snowflake. The ideal candidate will have strong expertise in cloud data engineering, SQL, ELT/ETL processes, and performance tuning. This role involves working closely with data architects, analysts, and business stakeholders to deliver scalable, secure, and high-performance data solutions.
Key Responsibilities
• Design, develop, and maintain scalable Snowflake Data Warehouse solutions.
• Build and optimize ELT/ETL pipelines using Snowflake features such as Streams, Tasks, Snowpipe, and Stored Procedures.
• Develop complex SQL scripts, transformations, and data models to support analytics and reporting.
• Implement data ingestion from various data sources (cloud storage, APIs, databases, third-party tools).
• Configure Snowflake components including Warehouses, Databases, Schemas, Roles, Stages, and Security integrations.
• Ensure performance tuning, query optimization, and efficient resource utilization.
• Implement robust data governance, security, and access control aligned with organizational policies.
• Troubleshoot data issues, perform root-cause analysis, and ensure high data quality.
• Collaborate with data architects, BI developers, and business stakeholders for solution design.
• Work with cloud platforms such as AWS / Azure / GCP for integration and deployment.
• Automate Snowflake deployments using Python scripting, dbt, and CI/CD pipelines.
• Support migration of legacy data warehouses to Snowflake.
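As a minimal sketch of the Streams/Tasks pattern named in the responsibilities above (all object names — `raw_orders`, `orders_stream`, `etl_wh`, and so on — are hypothetical placeholders, not part of the role description):

```sql
-- Hypothetical example: incremental ELT with a Stream feeding a scheduled Task.
-- A Stream tracks change records (CDC) on the source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A Task runs the load only when the Stream actually has new data.
CREATE OR REPLACE TASK load_orders
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO dim_orders (order_id, amount, loaded_at)
  SELECT order_id, amount, CURRENT_TIMESTAMP()
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_orders RESUME;
```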
Required Skills & Qualifications
• Bachelor’s degree in Computer Science, Engineering, or related field.
• 3–8 years of experience in Data Engineering or BI development.
• Strong hands-on experience with Snowflake development and administration.
• Proficiency in SQL, data modeling (Star/Snowflake schema), and query optimization.
• Experience with ETL/ELT tools such as Informatica, Talend, Matillion, dbt, ADF, Glue, or Airflow.
• Knowledge of cloud platforms (AWS, Azure, or Google Cloud).
• Experience with Python for automation and data processing.
• Familiarity with CI/CD pipelines (Git, Jenkins, GitHub Actions, Azure DevOps, etc.).
• Understanding of data governance, security, and compliance standards.
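To illustrate the star-schema data modeling and SQL proficiency listed above, a typical analytics query joins a fact table to its dimensions (table and column names here are illustrative only):

```sql
-- Hypothetical star-schema query: one fact table, two dimension tables.
SELECT d.calendar_month,
       p.category,
       SUM(f.sales_amount) AS total_sales
FROM   fact_sales  f
JOIN   dim_date    d ON f.date_key    = d.date_key
JOIN   dim_product p ON f.product_key = p.product_key
GROUP BY d.calendar_month, p.category;
```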
Preferred Skills
• Experience with Snowflake advanced features like Zero-Copy Cloning, Time Travel, Secure Data Sharing, and Materialized Views.
• Knowledge of modern data architectures (Data Lake, Data Mesh, Lakehouse).
• Exposure to BI tools such as Tableau, Power BI, Looker, or ThoughtSpot.
• Experience with real-time data ingestion using Kafka, Kinesis, or streaming tools.
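The advanced features listed under Preferred Skills can be sketched in one or two statements each (object names are hypothetical):

```sql
-- Zero-Copy Cloning: an instant, metadata-only copy of a database,
-- commonly used to spin up dev/test environments without extra storage.
CREATE DATABASE analytics_dev CLONE analytics;

-- Time Travel: query a table as it existed one hour (3600 seconds) ago.
SELECT *
FROM dim_orders AT (OFFSET => -3600);
```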