
Data Platform Engineer

ExecutivePlacements.com
San Jose, CA Contractor
POSTED ON 4/9/2026
AVAILABLE BEFORE 5/9/2026
Join Mizuho as a Data Platform Engineer!

We are seeking a highly skilled Kafka Platform Engineer to design, build, and operate our enterprise event-streaming platform using Red Hat AMQ Streams (Kafka on OpenShift). In this role, you will be responsible for ensuring a reliable, scalable, secure, and developer-friendly streaming ecosystem. You will work closely with application teams to define and implement event-driven integration patterns, and you will leverage GitLab and Argo CD to automate platform delivery and configuration.

This position requires a strong blend of platform engineering, DevOps practices, Kafka cluster expertise, and architectural understanding of integration/streaming patterns.

Key Responsibilities

Kafka & AMQ Streams Engineering

Design, deploy, and operate AMQ Streams (Kafka) clusters on Red Hat OpenShift.

Configure and manage Kafka components including brokers, KRaft controllers, and MirrorMaker 2.

Explore Kafka Connect and Schema Registry concepts and implementations.

Ensure performance, reliability, scalability, and high availability of the Kafka platform.

Implement cluster monitoring, logging, and alerting using enterprise observability tools.

Manage capacity planning, partition strategies, retention policies, and performance tuning.
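As an illustration of the kind of capacity planning involved, a common back-of-envelope heuristic sizes a topic's partition count from target throughput and per-partition throughput on each side. The function and the workload numbers below are hypothetical, not taken from the role:

```python
import math

def partition_count(target_mb_s: float,
                    producer_mb_s_per_partition: float,
                    consumer_mb_s_per_partition: float) -> int:
    """Sizing heuristic: enough partitions that neither the producer
    nor the consumer side becomes the throughput bottleneck."""
    return max(
        math.ceil(target_mb_s / producer_mb_s_per_partition),
        math.ceil(target_mb_s / consumer_mb_s_per_partition),
    )

# Hypothetical workload: 300 MB/s target, 30 MB/s per partition achievable
# on the producer side, 20 MB/s per partition on the consumer side.
print(partition_count(300, 30, 20))  # 15
```

In practice the result is then rounded up to leave headroom for growth, since increasing partitions later changes key-to-partition mapping.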

Integration Patterns & Architecture

Define and document standardized event-driven integration patterns, including:

  • Event sourcing
  • CQRS
  • Pub/sub messaging
  • Change data capture
  • Stream processing & enrichment
  • Request-reply over Kafka

Guide application teams on using appropriate patterns that align with enterprise architecture.

Establish best practices for schema design, topic governance, data contracts, and message lifecycle management.
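Topic governance of this kind is often enforced automatically, for example as a CI check on topic definitions. A minimal sketch, assuming a hypothetical `<domain>.<dataset>.vN` naming convention (the convention itself is illustrative, not the organization's actual standard):

```python
import re

# Hypothetical convention: <domain>.<dataset>.vN, lowercase, dash-separated words.
TOPIC_PATTERN = re.compile(r"^[a-z][a-z0-9-]*\.[a-z][a-z0-9-]*\.v[0-9]+$")

def is_valid_topic(name: str) -> bool:
    """Return True if the topic name matches the governance convention."""
    return bool(TOPIC_PATTERN.fullmatch(name))

print(is_valid_topic("payments.settlement-events.v1"))  # True
print(is_valid_topic("Payments_Settlements"))           # False
```

Versioning the topic name makes schema evolution explicit: a breaking change ships as a new `.v2` topic rather than mutating consumers in place.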

Security & Governance

Implement enterprise-grade security for Kafka, including RBAC, TLS, ACLs, and authentication/authorization integration (SSO and OAuth).

Maintain governance for topic creation, schema evolution, retention policies, and naming standards.

Ensure adherence to compliance, auditing, and data protection requirements (encryption at rest and in flight).
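On the client side, these controls typically surface as a handful of connection properties. A minimal sketch using librdkafka/confluent-kafka property names; the broker endpoint and file paths are placeholders, not real infrastructure:

```python
# Illustrative producer configuration combining TLS transport encryption
# with token-based authentication (SASL/OAUTHBEARER, e.g. via SSO/OAuth).
# Property names follow librdkafka; all values are placeholders.
producer_config = {
    "bootstrap.servers": "kafka-bootstrap.example.internal:9093",
    "security.protocol": "SASL_SSL",         # TLS + SASL authentication
    "sasl.mechanism": "OAUTHBEARER",         # OAuth-style bearer tokens
    "ssl.ca.location": "/etc/kafka/ca.crt",  # cluster CA for server verification
}
```

Authorization (ACLs/RBAC) is then enforced broker-side against the authenticated principal, so the client config stays the same as permissions evolve.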

Collaboration & Support

Provide platform guidance and troubleshooting expertise to development and integration teams.

Partner with architects, SREs, and developers to drive adoption of event-driven architectures.

Create documentation, runbooks, and internal knowledge-sharing materials.

CI/CD & GitOps Automation

Build and maintain GitOps workflows using Argo CD for declarative deployment of Kafka resources and platform configurations.

Develop CI/CD pipelines in GitLab, enabling automated builds, infrastructure updates, and configuration promotion across environments.

Maintain Infrastructure-as-Code (IaC) repositories and templates for Kafka resources.
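Under a GitOps model, Kafka resources live in Git as Kubernetes custom resources that Argo CD reconciles onto the cluster. A minimal sketch of an AMQ Streams (Strimzi) `KafkaTopic`; the cluster name, topic name, and settings are illustrative:

```yaml
# Illustrative KafkaTopic custom resource, reconciled by the Topic Operator.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: payments.settlement-events.v1
  labels:
    strimzi.io/cluster: prod-kafka   # Kafka cluster this topic belongs to
spec:
  partitions: 12
  replicas: 3
  config:
    retention.ms: 604800000          # 7-day retention
    cleanup.policy: delete
```

Because the resource is declarative, a retention or partition change is a reviewed merge request rather than an ad-hoc CLI command against production.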

Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience with Kafka administration and management.
  • Strong knowledge of OpenShift and container orchestration.
  • Proficiency in scripting languages such as Python or Bash.
  • Experience with monitoring and logging tools (e.g., Splunk, Prometheus, Grafana).
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

Preferred Qualifications

  • Experience with Red Hat OpenShift administration.
  • Knowledge of service mesh patterns (Istio, OpenShift Service Mesh).
  • Familiarity with stream processing frameworks (Kafka Streams, ksqlDB, Flink).
  • Experience using observability stacks (Prometheus, Grafana).
  • Background working in regulated or enterprise-scale environments.
  • Knowledge of DevOps practices and tools (e.g., ArgoCD, Ansible, Terraform).
  • Knowledge of SRE monitoring and logging tools (e.g., Splunk, Prometheus, Grafana).

The expected base salary ranges from $111k to $150k. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, certifications and licenses obtained. Market and organizational factors are also considered. In addition to salary and a generous employee benefits package, successful candidates are eligible to receive a discretionary bonus.

Other Requirements

Mizuho has in place a hybrid working program, with varying opportunities for remote work depending on the nature of the role, needs of your department, as well as local laws and regulatory obligations. Roles in some of our departments have greater in-office requirements that will be communicated to you as part of the recruitment process.

