What are the responsibilities and job description for the Senior GCP Data Engineer position at Empiric?
Are you ready to lead large-scale cloud data transformations on Google Cloud Platform? We’re looking for a Senior GCP Data Engineer to support a major Azure → GCP migration and build modern, scalable data architectures that power analytics and AI-driven platforms.
What You’ll Do:
• Build and optimize GCP data pipelines
• Develop ETL / ELT workflows
• Work with BigQuery, Dataflow, and Dataplex
• Implement streaming / real-time solutions (Pub/Sub, Apache Beam, Kafka)
• Collaborate with stakeholders and deliver client-facing solutions
What We’re Looking For:
• 5 years of data engineering experience
• Hands-on GCP experience (BigQuery, Dataflow, Dataplex)
• Proficient in Python or Java
• Experience with ETL / ELT and streaming data
• Comfortable in a consulting/client-facing environment
• Bonus: experience with Snowflake, Databricks, or cloud migrations
• U.S. Citizen or Green Card Holder (no sponsorship)
Role Details:
• Full-time, remote-friendly (East Coast preferred)
• Travel to NYC every other week for 1–3 business days (paid)
Compensation:
• NYC: $140k–$160k base
• Other locations: $120k–$140k base
• 7% performance bonus
Apply or reach out directly at: Jillian.Lykins@empiric.com