What are the responsibilities and job description for the Data Engineer position at Sibitalent Corp?
Hi,
Hope you are doing well.
IMMEDIATE INTERVIEW (Interview Process: one round done) - GCP Data Engineers in Hartford, Connecticut - HYBRID (NEED LOCAL CANDIDATE) - OPEN FOR W2 CANDIDATES
Please find the job details below and reply if you're interested in learning more about this role.
Job Title: GCP Data Engineers
Location: Hartford, Connecticut - HYBRID (NEED LOCAL CANDIDATE)
Duration: 6-12 month contract
NEED ONE MANAGER REFERENCE AND THEIR LINKEDIN PROFILE AND OFFICIAL EMAIL ID.
Other Acceptable Titles:
GCP Engineer, Data Engineer, Cloud Engineer
Top 3 must-have technologies:
- GCP
- Python
- SQL
Nice-to-Haves:
- Teradata (platform they are moving off of)
- Healthcare experience
Will be looking for:
- Candidates who are well-versed and hands-on with the required technologies
- Candidates who are self-sufficient and able to work independently
Position Detail:
We are seeking skilled Data Engineer(s) to support a high-impact enterprise data migration initiative. The goal is to migrate data warehouse assets and ETL pipelines from
Teradata to Google Cloud Platform (GCP).
The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.
Key Responsibilities:
- Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
- Analyze and map existing Teradata workloads to appropriate GCP equivalents.
- Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
- Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
- Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python); a brief illustrative sketch follows this list.
- Optimize data storage, query performance, and costs in the cloud environment.
- Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
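As a rough, hedged sketch of the automated data-movement work described above (not the client's actual pipeline), the Python snippet below loads a Teradata table export that is assumed to already be staged as Parquet in Cloud Storage into BigQuery using the google-cloud-bigquery client, then runs a simple row-count check. The project, bucket, dataset, and table names are placeholders invented for the example.

from google.cloud import bigquery

# Placeholder locations; real project, bucket, and dataset names would differ.
SOURCE_URI = "gs://example-migration-bucket/exports/claims/*.parquet"
TARGET_TABLE = "example-project.warehouse.claims"

client = bigquery.Client()

# Load the staged Parquet export into BigQuery, replacing any earlier load of the table.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(SOURCE_URI, TARGET_TABLE, job_config=load_config)
load_job.result()  # Block until the load job completes.

# Basic post-load validation: report the loaded row count for comparison against the source.
rows = client.query(f"SELECT COUNT(*) AS row_count FROM `{TARGET_TABLE}`").result()
print(f"Loaded {next(iter(rows)).row_count} rows into {TARGET_TABLE}")

In practice a Composer (Airflow) DAG would typically orchestrate a step like this per table, with the same validation queries feeding the monitoring and alerting mentioned above.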
Required Skills:
- 4 to 6 years of experience in Data Engineering, with at least 2 years in GCP.
- Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
- Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
- Experience with ETL/ELT pipelines using custom scripting tools (Python/Java).
- Proven ability to refactor and translate legacy logic from Teradata to GCP (an illustrative rewrite follows this list).
- Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
- Strong analytical, troubleshooting, and communication skills.
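To give a flavor of the Teradata-to-BigQuery refactoring called for above, here is a hedged, illustrative before/after: a legacy Teradata query using SEL, TOP, and ADD_MONTHS rewritten in BigQuery standard SQL and submitted through the Python client. The table and column names are invented for the example and are not from this posting.

from google.cloud import bigquery

# Illustrative legacy Teradata SQL: SEL, TOP, and ADD_MONTHS are Teradata idioms.
teradata_sql = """
SEL TOP 10 member_id, claim_total
FROM claims_db.monthly_claims
WHERE claim_date >= ADD_MONTHS(CURRENT_DATE, -3)
ORDER BY claim_total DESC;
"""

# Equivalent BigQuery standard SQL: SELECT/LIMIT and DATE_SUB replace those idioms.
bigquery_sql = """
SELECT member_id, claim_total
FROM `example-project.warehouse.monthly_claims`
WHERE claim_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 MONTH)
ORDER BY claim_total DESC
LIMIT 10
"""

# Run the rewritten query; the project and dataset names above are placeholders.
client = bigquery.Client()
for row in client.query(bigquery_sql).result():
    print(row.member_id, row.claim_total)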
Preferred Qualifications:
- GCP certification (Preferred: Professional Data Engineer).
- Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
- Experience working in the healthcare domain.
- Knowledge of data governance, security, and compliance in cloud ecosystems.
Behavioral Skills:
- Problem-solving mindset
- Attention to detail
- Accountability and ownership
- Curiosity and a commitment to staying current with evolving GCP services
Preference:
Ability to work in the Hartford, CT office at least three days a week.