Mizuho is looking to build data-intensive services and Spark pipelines on Kubernetes and Databricks to support financial services operations in Market Risk, Counterparty Credit Risk, and Liquidity Risk. The role requires end-to-end delivery with strong SDLC practices and mentoring of junior developers.
Requirements
- Expert in Java, Python, and SQL.
- Hands-on experience with Spark/Databricks (Delta, Spark SQL, performance tuning); see the sketch after this list.
- Production Kubernetes experience (deployments, networking, autoscaling).
- Proven SDLC rigor and CI/CD (GitHub Actions/Azure DevOps/Jenkins).
- Finance/risk systems depth in at least one of: Market Risk, Counterparty Credit Risk (CCR), or Liquidity Risk.
- Unity Catalog, Delta Live Tables (DLT), and cluster policies; Terraform; Kafka; Airflow or Databricks Jobs.
- Secure API design (REST/gRPC) with OAuth2/OIDC and RBAC/ABAC authorization.
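
For illustration, a minimal PySpark/Delta sketch of the kind of pipeline work described above. The table names, columns, and the `risk` schema are hypothetical placeholders, not actual Mizuho systems.

```python
# Minimal sketch, assuming a Databricks runtime where Delta is available.
# All table and column names below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw trade data (hypothetical source table).
trades = spark.read.table("raw.trades")

# Aggregate daily notional per counterparty using Spark SQL functions.
daily_exposure = (
    trades
    .withColumn("trade_date", F.to_date("trade_timestamp"))
    .groupBy("counterparty_id", "trade_date")
    .agg(F.sum("notional").alias("total_notional"))
)

# Write the result as a partitioned Delta table (illustrative target name).
(
    daily_exposure.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("trade_date")
    .saveAsTable("risk.daily_counterparty_exposure")
)
```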
Responsibilities
- Develop services (Java, Python) and ETL/ELT (SQL, Spark/PySpark) on Databricks.
- Containerize services and deploy to Kubernetes; ensure logging, metrics, and tracing are in place.
- Apply strong SDLC: design reviews, testing, CI/CD, releases, monitoring.
- Partner with risk stakeholders; deliver audit-ready data contracts and controls (see the sketch after this list).
- Review code, coach junior developers, and lead incident root-cause analyses (RCAs) and remediation.
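
As referenced above, a minimal sketch of an audit-style data contract check in PySpark. The rules and table name are hypothetical; in practice they would be the controls agreed with risk stakeholders.

```python
# Minimal sketch, assuming a Databricks job context; the contract rules and
# table name below are illustrative, not an actual Mizuho control.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("risk.daily_counterparty_exposure")

# Contract check 1: required columns are present.
required_cols = {"counterparty_id", "trade_date", "total_notional"}
missing = required_cols - set(df.columns)
if missing:
    raise ValueError(f"Contract violation: missing columns {missing}")

# Contract check 2: no null keys and no negative notionals.
violations = df.filter(
    F.col("counterparty_id").isNull() | (F.col("total_notional") < 0)
).count()
if violations:
    raise ValueError(f"Contract violation: {violations} bad rows")

print("Data contract checks passed")
```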
Other
- 6+ years of experience in software/data engineering.
- Financial services experience required, ideally in Market Risk, Counterparty Credit Risk, or Liquidity Risk.
- Mizuho has a hybrid working program in place, with remote-work opportunities that vary depending on the nature of the role, the needs of your department, and local laws and regulatory obligations.
- We are looking for candidates who want to contribute to our entrepreneurial culture where people at all levels are inspired to share ideas.
- We maintain a drug-free workplace and reserve the right to require pre- and post-hire drug testing as permitted by applicable law.