
Data Architect (Google Cloud) – IAM Data Modernization

Rivago infotech inc
Dallas, TX Contractor
POSTED ON 5/8/2026
AVAILABLE BEFORE 6/7/2026

Role: Google Cloud Data Architect – IAM Data Modernization with OpenShift Container Platform (OCP)

Location: Dallas, TX / Charlotte, NC (hybrid – 4 days in office)

Contract

End Client: Banking

Implementation partner - ********

Experience: 12 Years

 

OCP experience highly preferred

 

Project/Program

Identity & Access Management (IAM) Data Modernization – migration of an on-premises SQL data warehouse to a target-state Data Lake on Google Cloud Platform. The program enables metrics and reporting, advanced analytics, and GenAI use cases (natural language querying, accelerated summarization, cross-domain trend analysis), leveraging PySpark-based processing, cloud-native DevOps CI/CD pipelines, and containerized deployments on OpenShift (OCP) to deliver scalable, secure, and high-performance data solutions.

 

About Program/Project

 

The IAM Data Modernization project involves migrating an on-premises SQL data warehouse to a target-state Data Lake in the Google Cloud Platform environment. Key highlights include:

  • Integration Scope: 30 source system data ingestions and multiple downstream integrations
  • Capabilities: Metrics, reporting, and GenAI use cases with natural language querying, advanced pattern/trend analysis, faster summarizations, and cross-domain metric monitoring
  • Benefits:
    • Scalability and access to advanced cloud functionality
    • Highly available and performant semantic layer with historical data support
    • Unified data strategy for executive reporting, analytics, and Gen AI across cyber domains

This modernization establishes a single source of truth for enterprise-wide data-driven decision-making.

 

Required Skills

 

DevOps / CI‑CD

  • Experience implementing CI/CD pipelines for data and analytics workloads
  • Familiarity with Git‑based source control, build automation, and deployment strategies

Containers & Platform

  • Experience with OpenShift Container Platform (OCP) for deploying data workloads and services
  • Understanding of containerized architecture, scaling, and environment management
  • Proven ability to build CI/CD pipelines for data and infrastructure workloads
  • Experience managing secrets securely using Google Cloud Platform Secret Manager
  • Ownership of observability, SLOs, dashboards, alerts, and runbooks
  • Proficiency in logging, monitoring, and alerting for data pipelines and platform reliability

Big Data & Processing

  • Hands‑on experience with PySpark for ETL/ELT, data transformation, and performance optimization
  • Solid understanding of distributed data processing concepts
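The PySpark and distributed-processing bullets above come down to expressing ETL as map-and-aggregate steps over keyed records. A minimal single-machine sketch of that shape in plain Python (record fields and values are invented for illustration; in PySpark the same logic would be a `select` followed by `groupBy().sum()` or `reduceByKey`):

```python
from collections import defaultdict

# Toy "extract" output: raw access records (shape assumed for illustration).
records = [
    {"user": "alice", "system": "ldap", "logins": 3},
    {"user": "bob", "system": "ldap", "logins": 1},
    {"user": "alice", "system": "sso", "logins": 2},
]

def transform(record):
    # Project each record to a (key, value) pair, as a select/withColumn
    # step would in PySpark.
    return (record["user"], record["logins"])

def aggregate(pairs):
    # Group-and-sum: the single-machine analogue of reduceByKey.
    totals = defaultdict(int)
    for user, logins in pairs:
        totals[user] += logins
    return dict(totals)

result = aggregate(map(transform, records))
print(result)  # {'alice': 5, 'bob': 1}
```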

Data & Cloud Architecture

  • Strong experience designing data platforms on Google Cloud Platform (GCP)
  • Experience with Data Lakes, data warehousing, and large‑scale migration programs

 

Data Lake Architecture & Storage

  • Proven experience designing and implementing data lake architectures (e.g., Bronze/Silver/Gold or layered models).
  • Strong knowledge of Cloud Storage (GCS) design, including bucket layout, naming conventions, lifecycle policies, and access controls
  • Experience with Hadoop/HDFS architecture, distributed file systems, and data locality principles
  • Hands-on experience with columnar data formats (Parquet, Avro, ORC) and compression techniques
  • Expertise in partitioning strategies, backfills, and large-scale data organization
  • Ability to design data models optimized for analytics and BI consumption
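In practice, the layered-lake and partitioning bullets above often show up as a Hive-style, date-partitioned object layout in GCS. A small sketch (the bucket and dataset names here are hypothetical):

```python
from datetime import date

def partition_path(layer, dataset, run_date):
    # Hive-style partition layout (layer/dataset/dt=YYYY-MM-DD/) commonly
    # used for GCS-backed lakes; "example-iam-lake" is an invented bucket.
    return f"gs://example-iam-lake/{layer}/{dataset}/dt={run_date.isoformat()}/"

path = partition_path("silver", "access_events", date(2026, 5, 8))
print(path)  # gs://example-iam-lake/silver/access_events/dt=2026-05-08/
```

Date-keyed prefixes like this make partition pruning, lifecycle policies, and targeted backfills straightforward.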

 

Data Ingestion & Orchestration

  • Experience building batch and streaming ingestion pipelines using Google Cloud Platform-native services
  • Knowledge of Pub/Sub-based streaming architectures, event schema design, and versioning
  • Strong understanding of incremental ingestion and CDC patterns, including idempotency and deduplication
  • Hands-on experience with workflow orchestration tools (Cloud Composer / Airflow)
  • Ability to design robust error handling, replay, and backfill mechanisms
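The CDC idempotency/deduplication bullet above usually reduces to a last-writer-wins merge keyed on a record id and version, so replaying the same batch is a no-op. A minimal sketch with invented field names:

```python
def apply_cdc(current, changes):
    # Merge CDC events into current state, keeping the highest version per
    # key; replaying the same batch leaves the state unchanged (idempotent).
    state = dict(current)
    for event in changes:
        key = event["id"]
        if key not in state or event["version"] > state[key]["version"]:
            state[key] = event
    return state

state = {"u1": {"id": "u1", "version": 1, "role": "viewer"}}
batch = [
    {"id": "u1", "version": 2, "role": "admin"},
    {"id": "u2", "version": 1, "role": "viewer"},
]
once = apply_cdc(state, batch)
twice = apply_cdc(once, batch)  # replay the same batch
print(once == twice)  # True: replay is a no-op
```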

 

Data Processing & Transformation

  • Experience developing scalable batch and streaming pipelines using Dataflow (Apache Beam) and/or Spark (Dataproc)
  • Strong proficiency in BigQuery SQL, including query optimization, partitioning, clustering, and cost control
  • Hands-on experience with Hadoop MapReduce and ecosystem tools (Hive, Pig, Sqoop)
  • Advanced Python programming skills for data engineering, including testing and maintainable code design
  • Experience managing schema evolution while minimizing downstream impact
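Schema evolution with minimal downstream impact (the last bullet above) is often handled with additive, defaulted fields so that records written before the change stay readable. A tiny tolerant-reader sketch (field names are illustrative):

```python
# Assumed evolved schema: "mfa_enabled" was added after the first records
# were written, with a default for older data.
SCHEMA_DEFAULTS = {"user": None, "system": None, "mfa_enabled": False}

def read_record(raw):
    # Tolerant reader: any field missing from an older record falls back
    # to its schema default instead of raising.
    return {field: raw.get(field, default)
            for field, default in SCHEMA_DEFAULTS.items()}

old = read_record({"user": "alice", "system": "ldap"})  # pre-evolution record
new = read_record({"user": "bob", "system": "sso", "mfa_enabled": True})
print(old["mfa_enabled"], new["mfa_enabled"])  # False True
```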

 

Analytics & Data Serving

  • Expertise in BigQuery performance optimization and data serving patterns
  • Experience building semantic layers and governed metrics for consistent analytics
  • Familiarity with BI integration, access controls, and dashboard standards
  • Understanding of data exposure patterns via views, APIs, or curated datasets

 

Data Governance, Quality & Metadata

  • Experience implementing data catalogs, metadata management, and ownership models
  • Understanding of data lineage for auditability and troubleshooting
  • Strong focus on data quality frameworks, including validation, freshness checks, and alerting
  • Experience defining and enforcing data contracts, schemas, and SLAs
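The freshness checks named above reduce to comparing a dataset's latest load time against an allowed lag and alerting past the threshold. A minimal sketch (the 6-hour lag is an assumed SLA, not from the posting):

```python
from datetime import datetime, timedelta, timezone

def freshness_ok(last_loaded_at, now=None, max_lag=timedelta(hours=6)):
    # True when the latest load is within the allowed lag; a data quality
    # framework would schedule this check and alert on False.
    now = now or datetime.now(timezone.utc)
    return now - last_loaded_at <= max_lag

now = datetime(2026, 5, 8, 12, 0, tzinfo=timezone.utc)
print(freshness_ok(now - timedelta(hours=2), now=now))   # True
print(freshness_ok(now - timedelta(hours=12), now=now))  # False
```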

 

Good to have

Security, Privacy & Compliance

  • Hands-on experience implementing fine-grained access controls for BigQuery and GCS
  • Experience with sprint planning and providing technical guidance to the team
  • Strong stakeholder communication and solution-architecture skills

 

Qualifications

  • Experience: 10–14 years in DevOps and Data Architecture, including 5 years designing on PySpark/Google Cloud Platform/OCP at scale; prior on-prem-to-cloud migration is a must.
  • Education: Bachelor’s/Master’s in Computer Science, Information Systems, or equivalent experience.
  • Certifications: Google Cloud Professional Cloud Architect/DevOps/OCP (required, or to be obtained within 3 months). Plus: Professional Data Engineer, Security Engineer.

Salary: $65 - $70
