Zenith Services is Hiring a GCP Data Architect Near Dearborn, MI
Detailed JD along with Primary Skills:
Good to have a GCP certification (either GCP Professional Data Engineer or GCP Professional Cloud Architect)
8 to 10 years of experience architecting data projects, with knowledge of multiple TD/Hadoop/Hive/Spark/Client implementations
5+ years of experience in data modeling and data warehouse and data lake implementation
Working experience implementing Hadoop-to-GCS and Hive-to-BigQuery migration projects
Working experience with Teradata-to-BigQuery migration projects
Ability to identify and gather requirements to define a solution to be built and operated on GCP, and to perform high-level and low-level design for the GCP platform
Capabilities to implement and provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project.
GCP technology areas of Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
Databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore; data analytics with Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes, Docker, managing containers, container autoscaling, and container security
Experience in the design, deployment, configuration, and integration of application infrastructure resources, including GKE clusters, Anthos, Apigee, and DevOps platforms
Application development concepts and technologies (e.g., CI/CD, Java, Python)