What are the responsibilities and job description for the Snowflake Data Engineering with AWS, Python and PySpark position at InfoVision Inc.?
Job Title: Snowflake Data Engineering with AWS, Python and PySpark
Location: Frisco, TX (3 days per week in office)
Duration: 12 months
Required Skills & Experience:
- 10 years of experience in data engineering and data integration roles.
- Expert in working with the Snowflake ecosystem integrated with AWS services and PySpark (see the PySpark sketch after this list).
- 8 years of core data engineering skills: hands-on experience with the Snowflake ecosystem, AWS, core SQL, and Python programming.
- 5 years of hands-on experience building new data pipeline frameworks with AWS, Snowflake, and Python, and the ability to evaluate new ingestion frameworks.
- Hands-on with Snowflake architecture: Virtual Warehouses, Storage and Caching, Snowpipe, Streams, Tasks, and Stages (see the Python sketch after this list).
- Experience with cloud platforms (AWS, Azure, or GCP) and integration with Snowflake.
- Snowflake SQL and Stored Procedures (JavaScript or Python-based).
- Proficient in Python for data ingestion, transformation, and automation.
- Solid understanding of data warehousing concepts (ETL, ELT, data modeling, star/snowflake schema).
- Hands-on with orchestration tools (Airflow, dbt, Azure Data Factory, or similar).
- Proficiency in SQL and performance tuning.
- Familiar with Git-based version control, CI/CD pipelines, and DevOps best practices.
- Strong communication skills and ability to collaborate in agile teams.
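To illustrate the Snowpipe/Streams/Tasks skills the posting asks for, here is a minimal sketch of automating Snowflake ingestion objects from Python. It assumes the snowflake-connector-python package; the connection parameters, object names (RAW_STAGE, ORDERS_PIPE, ORDERS_STREAM, ORDERS_TASK), tables, and S3 bucket are hypothetical placeholders, not values from the posting.

```python
"""Sketch: create a stage, Snowpipe, stream, and task from Python.

All names and credentials below are illustrative placeholders.
"""
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholder account locator
    user="your_user",
    password="your_password",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# External stage over an S3 prefix (storage integration assumed to exist).
cur.execute("""
    CREATE STAGE IF NOT EXISTS RAW_STAGE
      URL = 's3://example-bucket/orders/'
      STORAGE_INTEGRATION = S3_INT
      FILE_FORMAT = (TYPE = PARQUET)
""")

# Snowpipe: continuously loads new files that land in the stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS ORDERS_PIPE AUTO_INGEST = TRUE AS
      COPY INTO RAW.ORDERS
      FROM @RAW_STAGE
      FILE_FORMAT = (TYPE = PARQUET)
      MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")

# A stream tracks new rows in RAW.ORDERS; a task moves them downstream.
cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE RAW.ORDERS")
cur.execute("""
    CREATE TASK IF NOT EXISTS ORDERS_TASK
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO CURATED.ORDERS SELECT * FROM ORDERS_STREAM
""")
cur.execute("ALTER TASK ORDERS_TASK RESUME")  # tasks start suspended

cur.close()
conn.close()
```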
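And for the AWS-plus-PySpark requirement, a minimal sketch of a PySpark job that reads raw files from S3 and appends them to a Snowflake table. It assumes the spark-snowflake connector and Snowflake JDBC driver are on the Spark classpath; the bucket, column names, table, and credentials are placeholders.

```python
"""Sketch: PySpark ingestion from S3 into Snowflake via spark-snowflake."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Read raw Parquet landed in S3 (path is a placeholder).
orders = spark.read.parquet("s3a://example-bucket/orders/")

# Light transformation: normalize a column name, add a load timestamp.
cleaned = (
    orders
    .withColumnRenamed("ORDER_ID", "order_id")
    .withColumn("loaded_at", F.current_timestamp())
)

# Snowflake connection options (all placeholders).
sf_options = {
    "sfURL": "your_account.snowflakecomputing.com",
    "sfUser": "your_user",
    "sfPassword": "your_password",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "RAW",
    "sfWarehouse": "LOAD_WH",
}

# Append into the target table through the connector's Snowflake source.
(
    cleaned.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS")
    .mode("append")
    .save()
)

spark.stop()
```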