What are the responsibilities and job description for the GCP Data Engineer position at Capgemini?
Role: GCP Data Engineer
Location: Dallas, TX or Charlotte, NC
Full-time
Note: No sponsorship is available for this role; candidates must not require a visa.
Job Description:
- 3+ years of experience designing, developing, and maintaining reusable Terraform modules for Google Cloud Platform services and platform observability components.
- Build and publish Terraform modules to the Terraform Enterprise Private Registry, ensuring modules are parameterized, versioned, and reusable across teams and environments.
- Develop Terraform modules for GCP observability and monitoring, including alerting policies, monitoring dashboards, log-based metrics, uptime checks, metrics scopes, quota monitoring, and notification channels.
- Implement Infrastructure-as-Code modules supporting GCP services such as IAM, KMS, GCS, BigQuery, BigLake, Bigtable, Cloud Composer, GKE, Cloud SQL, Cloud Functions, Data Catalog, Dataplex, Dataproc, and Dataflow.
- Develop SDLC-promotable Terraform code that supports environment-based deployments (dev, test, prod) and enables multi-tenant rollouts across business units and applications.
- Integrate Terraform modules with CI/CD pipelines to support automated validation and deployment, including Terraform plan/apply workflows, linting and static analysis, unit testing (terraform test), and policy validation.
- Conduct unit, integration, and regression testing of Terraform modules to ensure reliable, repeatable, and idempotent infrastructure deployments.
- Troubleshoot provisioning issues related to IAM permissions, API enablement, quotas, state management, dependencies, or resource conflicts.
- Optimize logging and monitoring configurations to balance observability, cost efficiency, and operational performance.
- Maintain comprehensive module documentation, usage examples, runbooks, and deployment guides to support operational excellence and team adoption.
- Analyze PRISMA violation reports and governance findings to identify compliance issues, security risks, and cost optimization opportunities across GCP environments.
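The observability-module work described above (parameterized, environment-aware Terraform for alerting policies and log-based metrics) might look something like this minimal sketch. All names (`project_id`, `environment`, the metric and policy names, the threshold value) are illustrative assumptions, not requirements from the posting:

```hcl
# variables.tf — inputs that make the module reusable across teams/environments
variable "project_id" {
  type        = string
  description = "GCP project to deploy monitoring resources into"
}

variable "environment" {
  type        = string
  description = "Deployment environment: dev, test, or prod"
}

variable "notification_channels" {
  type        = list(string)
  default     = []
  description = "Fully qualified notification channel IDs to alert"
}

# main.tf — a log-based metric and an alerting policy built on top of it
resource "google_logging_metric" "error_count" {
  project = var.project_id
  name    = "${var.environment}-error-count"
  filter  = "severity>=ERROR"
}

resource "google_monitoring_alert_policy" "error_rate" {
  project      = var.project_id
  display_name = "${var.environment} error rate"
  combiner     = "OR"

  conditions {
    display_name = "High error count"
    condition_threshold {
      filter          = "metric.type=\"logging.googleapis.com/user/${google_logging_metric.error_count.name}\" AND resource.type=\"global\""
      comparison      = "COMPARISON_GT"
      threshold_value = 100 # illustrative threshold
      duration        = "300s"
    }
  }

  notification_channels = var.notification_channels
}
```

In practice a module like this would be versioned and published to the private registry, then consumed per environment (dev/test/prod) by varying only the input variables, which is what makes the code SDLC-promotable.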