What are the responsibilities and job description for the Senior Data Engineer - Snowflake Migration & Python - 1597 position at aKube Inc?
City: Burbank, CA
Onsite/Hybrid/Remote: Hybrid (4 days onsite per week, no flexibility)
Duration: 12 Months
Rate Range: Up to $96/hr on W2
Work Authorization: GC, USC, and all valid EADs (excluding OPT, CPT, and H1B)
Must Have:
- Snowflake
- Snowpark
- Python
- SQL
- Azure Data Factory
- ETL / data pipeline migration
- REST API integration
- Agile / Scrum
- AI-assisted development tools such as Cursor or Microsoft Copilot
Responsibilities:
- Build, refactor, and support enterprise data pipelines for data collection, transformation, and delivery
- Develop and maintain Snowflake-based data solutions using Snowpark, Python, and SQL
- Migrate existing Azure Data Factory pipelines into Snowflake Snowpark solutions
- Join and transform data from multiple source systems for reporting, dashboards, KPIs, and analytics use cases
- Implement infrastructure that supports secure data storage, processing, and retrieval in Snowflake
- Execute work from defined requirements, technical designs, and priorities set by team leads and architects
- Identify delivery risks, technical issues, or blockers and escalate as needed
- Manage assigned tasks and deliverables against project timelines and sprint commitments
- Apply performance tuning and optimization across Python and SQL workflows
- Use AI-assisted development tools to support coding, refactoring, debugging, and documentation while following engineering standards
- Validate AI-generated output to ensure security, quality, performance, and governance requirements are met
- Share AI tool usage patterns and best practices with the broader engineering team
Qualifications:
- 3 to 5 years of experience in Data Engineering or Data Integration roles
- Strong hands-on experience with Snowflake in a production environment
- Strong hands-on experience with Snowpark pipeline development
- Senior-level Python skills for data engineering and integration workloads
- Advanced SQL skills, including complex transformations and query tuning
- Experience working with Azure Data Factory and translating pipeline logic into Python-based implementations
- Experience migrating ETL or data pipelines across cloud platforms, especially from Azure to Snowflake
- Experience working with REST APIs using Python
- Experience in Agile/Scrum teams with sprint-based delivery
- Understanding of data security, governance, and enterprise engineering standards
- Bachelor’s degree or equivalent practical experience
Nice to Have:
- Snowflake Tasks and Streams
- Snowflake warehouse configuration and optimization
- AWS experience
- Azure experience
- CI/CD for data engineering workflows
- Experience with large-scale cloud data migration projects