What are the responsibilities and job description for the Data Engineer (GCP / Telecom / Genesys Migration) position at Holistic Partners, Inc.?
Job Title: Data Engineer (GCP / Telecom / Genesys Migration)
Location: Phoenix, AZ (Preferred) or Walnut Creek, CA (Onsite)
Duration: 12 Months C2H
Must-Have Skills
- BigQuery
- Google Cloud Storage (GCS)
- Airflow / Cloud Composer
- Python
- SQL
- Telecom / Contact Center IVR
- Agile Environment & Communication Skills
Job Overview
We are seeking a Data Engineer (Level 2–3) to support a large-scale enterprise data migration from the Avaya contact center platform to Genesys.
This role focuses on data engineering, not telecom engineering. The environment processes high-volume telecom interaction data such as call recordings, transcripts, and voice metadata.
The ideal candidate will have strong experience building and maintaining data pipelines in Google Cloud Platform (GCP) and handling both structured and semi-structured data.
Key Responsibilities
- Design, build, and maintain scalable data ingestion and transformation pipelines
- Work with telecom datasets:
- Call Detail Records (CDR)
- Transcripts
- Event-based data
- Process multiple data formats:
- JSON transcripts
- XML datasets
- Relational databases
- Build and optimize datasets in BigQuery
- Develop workflows using Airflow / Cloud Composer
- Ingest and manage data using GCS
- Perform data modeling and transformation for analytics/reporting
- Collaborate across teams during the Avaya → Genesys migration
- Ensure data quality, reliability, and performance
- Work in an Agile environment
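Much of the pipeline work above comes down to flattening semi-structured interaction data (JSON transcripts, XML datasets) into tabular rows for a warehouse such as BigQuery. A minimal sketch using only the Python standard library; the schemas and field names (`call_id`, `speaker`, `text`, the `<cdr>` element) are hypothetical illustrations, not the actual Avaya or Genesys formats:

```python
import json
import xml.etree.ElementTree as ET


def flatten_json_transcript(raw: str) -> list[dict]:
    """Flatten a JSON call transcript into one row per utterance.

    Assumes a hypothetical shape: {"call_id": ..., "utterances": [...]}.
    """
    doc = json.loads(raw)
    return [
        {"call_id": doc["call_id"], "speaker": u["speaker"], "text": u["text"]}
        for u in doc["utterances"]
    ]


def flatten_xml_cdr(raw: str) -> list[dict]:
    """Flatten an XML batch of Call Detail Records into rows.

    Assumes hypothetical <cdr> elements whose fields are attributes.
    """
    root = ET.fromstring(raw)
    return [dict(cdr.attrib) for cdr in root.iter("cdr")]


# Example inputs with the hypothetical shapes above
json_raw = '{"call_id": "c-1", "utterances": [{"speaker": "agent", "text": "Hello"}]}'
xml_raw = '<cdrs><cdr caller="555-0100" duration_s="42"/></cdrs>'

rows = flatten_json_transcript(json_raw) + flatten_xml_cdr(xml_raw)
```

In practice the output rows would be written as newline-delimited JSON to GCS and loaded into BigQuery from there, but the flattening step itself is plain Python like this.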
Required Experience
- 5 years as a Data Engineer (Mid–Senior level)
- Strong hands-on experience with Google Cloud Platform (GCP)
- Expertise in:
- BigQuery
- Cloud Storage (GCS)
- Airflow / Cloud Composer
- Python
- SQL
- Experience with ETL/ELT pipeline design
- Ability to independently build and support production-grade pipelines
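The ETL/ELT design experience listed above can be sketched as a small extract → transform → load pipeline. In this illustration an in-memory string stands in for a GCS object and sqlite3 stands in for BigQuery; the record fields (`call_id`, `status`, `duration_ms`) are assumptions for the example:

```python
import json
import sqlite3


def extract(blob: str) -> list[dict]:
    """Extract: parse newline-delimited JSON, as if read from a GCS object."""
    return [json.loads(line) for line in blob.splitlines() if line.strip()]


def transform(records: list[dict]) -> list[tuple]:
    """Transform: keep completed calls and normalize duration to whole seconds."""
    return [
        (r["call_id"], int(r["duration_ms"]) // 1000)
        for r in records
        if r.get("status") == "completed"
    ]


def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write rows into a warehouse table (sqlite3 standing in for BigQuery)."""
    conn.execute("CREATE TABLE IF NOT EXISTS calls (call_id TEXT, duration_s INTEGER)")
    conn.executemany("INSERT INTO calls VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM calls").fetchone()[0]


blob = "\n".join([
    '{"call_id": "c-1", "status": "completed", "duration_ms": "61500"}',
    '{"call_id": "c-2", "status": "abandoned", "duration_ms": "900"}',
])
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(blob)), conn)
```

In the actual role these three steps would typically be separate Airflow / Cloud Composer tasks, with GCS operators on the extract side and a BigQuery load job on the load side.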
Preferred Experience
- Telecom / Contact Center domain knowledge
- Experience with:
- Call Detail Records (CDR)
- IVR systems
- Voice analytics
- Familiarity with data lake architecture and cloud-native solutions
Additional Responsibilities
- Use GCS, BigQuery, Airflow / Cloud Composer, Cloud Data Fusion, Cloud Functions, Python, SQL, and GitHub to build and maintain the Enterprise Data Warehouse (EDW)
- Collaborate with:
- Data Engineering Scrum Team
- Data & Integrations Engineering Team
- Production Data Support Team
- Business Stakeholders
- Design new features and troubleshoot production issues
Key Competencies
- Strong collaboration and communication skills
- Ability to learn new technologies quickly
- Effective time management and prioritization
- Strong problem-solving and analytical thinking
- Experience with Agile Scrum methodology
- Commitment to teamwork, transparency, and delivery excellence
Education & Certifications
- Bachelor’s degree in Computer Science or related field
- 4–5 years of relevant experience
- Experience working in Agile environments
- GCP certifications (preferred)