Job Title: Data Ops Engineer
Location: Phoenix, AZ
Duration: Long Term Contract
Qualifications:
1. 5 years of overall software engineering experience, including hands-on software development and data engineering
2. 3 years of hands-on coding experience in SQL, Python, and PySpark
3. 3 years of experience with advanced orchestration tools like Apache Airflow
4. 2 years of experience with at least one cloud platform (Azure, AWS, Google Cloud Platform), preferably Google Cloud Platform
5. Experience in building CI/CD processes and pipelines
Roles and Responsibilities:
1. Work in an agile environment
2. Proactively identify and assist in solving recurring data quality or data availability issues
3. Monitor, support, and triage data pipelines that ingest, move, transform, and integrate data as it moves from acquisition to consumption layers
4. Apply exceptional problem-solving and troubleshooting skills; analyze data to identify issues and patterns
5. Effective communication skills with technical and business teams
6. Aspire to be efficient, thorough, and proactive
7. Develop queries and metrics for data platform ad-hoc reporting and/or ETL batch triage
8. Maintain knowledge base and FAQ documentation with instructions for resolving issues that jobs commonly encounter
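As a loose illustration of the data-quality triage work described above, here is a minimal sketch in plain Python. All field names, thresholds, and sample records are hypothetical; in practice this team would likely express such checks in SQL, PySpark, or Airflow sensors rather than raw Python.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical records from an ingestion layer; field names are illustrative.
ROWS = [
    {"id": 1, "amount": 10.5, "loaded_at": datetime(2024, 4, 19)},
    {"id": 2, "amount": None, "loaded_at": datetime(2024, 4, 19)},
    {"id": 3, "amount": 7.0,  "loaded_at": datetime(2024, 4, 1)},
]

def triage(rows, now, staleness=timedelta(days=7)):
    """Count recurring data-quality issues: null values and stale loads."""
    issues = Counter()
    for row in rows:
        if row["amount"] is None:
            issues["null_amount"] += 1
        if now - row["loaded_at"] > staleness:
            issues["stale_load"] += 1
    return issues

report = triage(ROWS, now=datetime(2024, 4, 20))
print(dict(report))  # {'null_amount': 1, 'stale_load': 1}
```

A recurring count for an issue category (rather than a one-off alert) is what distinguishes triage of systemic problems from ad-hoc firefighting.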
Regards,
Surya Prakash
Job Type: Full Time
Industry: IT Outsourcing & Consulting
Salary: $112k-138k (estimate)
Dates: 04/20/2024 to 05/01/2024
Company Website: intraedge.com
Headquarters: Chandler, AZ
Company Size: 500 - 1,000
Founded: 2002
CEO: Kailash Somani
Revenue: $50M - $200M