What are the responsibilities and job description for the Sr. Data Engineer (Snowpark, Python, SQL, Azure)- No H1B or C2C position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, TechLink Resources, is seeking the following. Apply via Dice today!
PLEASE READ THE JOB DESCRIPTION CAREFULLY. THIS IS AN ON-SITE LOS ANGELES POSITION. MUST BE ABLE TO INTERVIEW IN PERSON. NO H-1B AND NO C2C.
Senior Data Engineer (Snowflake / Python / AI-Assisted Development)
About The Role
We're looking for a Senior Data Engineer to help build and modernize enterprise data pipelines in a cloud-based environment. This role is highly hands-on and focused on execution and delivery, working closely with architects and team leads to implement scalable data solutions.
You'll play a key role in migrating existing data pipelines from Azure Data Factory into Snowflake (Snowpark) while leveraging modern AI-assisted development tools (like Cursor and Copilot) to improve speed, quality, and efficiency.
What You'll Be Doing
- Build, refactor, and maintain scalable data pipelines for enterprise data systems
- Migrate existing ADF pipelines into Snowflake Snowpark solutions
- Design and implement data transformations for reporting, analytics, and KPIs
- Work heavily with Snowflake (SQL and Snowpark) for data processing and storage
- Develop and optimize Python-based data workflows
- Integrate data from multiple sources and ensure high data quality
- Collaborate with engineering leadership to execute on defined requirements
- Leverage AI tools (Cursor, Copilot, etc.) for coding, debugging, and optimization
- Ensure solutions meet performance, security, and data governance standards
What We're Looking For
- Strong hands-on coder (not just design/architecture)
- Deep experience with Python and SQL for data engineering
- Proven experience working with Snowflake in production
- Experience with Azure Data Factory and translating pipelines into code
- Someone who has done cloud data migrations (Azure-to-Snowflake preferred)
- Comfortable working in Agile/Scrum environments
- Familiar with using AI coding tools to move faster and write better code
Required Qualifications
- 3-5 years in Data Engineering / Data Integration
- Advanced experience with Python and SQL (complex transformations and tuning)
- Strong experience with Snowflake
- Experience with Azure Data Factory (ADF)
- Experience working in Agile teams
- Experience using AI-assisted development tools (Cursor, Copilot, etc.)
Nice to Have
- Experience with Snowflake Snowpark, Tasks, and Streams
- Exposure to AWS or Azure cloud environments
- Experience with CI/CD pipelines for data engineering
- Background in large-scale data migration projects
Why Join
- High-impact work modernizing enterprise data systems
- Exposure to cutting-edge AI-assisted development practices
- Strong engineering culture focused on delivery and quality
- Opportunity to work on large-scale cloud data transformations