MUFG is looking to define, enable, and drive its data strategy by engaging with partners across technology, data, business, and operations to develop and deliver business outcomes. This role will help define cloud DataOps and AI/ML solutions and capabilities in line with the overall technology and business data strategy, and will optimize resource utilization, manage budgets, and implement innovative cost-saving strategies using AI/ML technologies.
Requirements
- Experience with design and engineering of AWS cloud resources for enterprise data platforms, data pipelines (ETL), data controls, and reporting and analytics solutions.
- Strong understanding of data management strategies, best practices, and innovative technologies. Experience with relational databases such as Oracle, Amazon RDS, and Snowflake, and with data virtualization tools such as Starburst and Denodo.
- Experience with data movement tools such as Informatica, AWS DataSync, Amazon FSx, AWS Glue, and Apache Airflow, and with messaging/streaming services such as Amazon Kinesis.
- Experience with data visualization tools such as Tableau, Power BI, and Amazon QuickSight.
- Exposure to AI solutions: Amazon Bedrock, Amazon SageMaker, Amazon Q, AWS Data Automation, LangGraph, Model Context Protocol (MCP), and agentic AI.
- Expertise in AWS cloud services: API Gateway, EC2, S3, IAM, Lambda, RDS, EKS, CloudFront, Route 53, Control Tower, MWAA, EMR, Glue ETL, Athena, AWS Config, KMS, Kinesis, SQS, OpenSearch, etc.
- Experience with CI/CD and DevOps tools: Jenkins, AWS CodeDeploy, AWS CodePipeline, GitHub, Terraform, and CloudBees.
Responsibilities
- Provide leadership in helping the EDDC group with AWS operational aspects: IAM roles, policies and permissions, and security guardrails.
- Guide and drive the reliability, scalability, and performance of MUFG’s cloud infrastructure, platforms, and applications.
- Support cloud architecture and migration as well as the setup of AWS AI/ML solutions (with a focus on Amazon Bedrock, SageMaker, and Amazon Q).
- Work with cutting-edge ML models and AI-driven applications, focusing on improving retrieval-augmented generation (RAG) systems and evolving Model Context Protocol (MCP) practices to develop scalable solutions.
- Utilize Pinecone or SingleStore for scalable vector search, enhancing the performance of search/retrieval systems with fast similarity-search techniques (a minimal illustrative sketch follows this list).
- Manage large datasets, applying best practices in machine learning to ensure robust model training, validation, and deployment pipelines.
- Design and implement robust CI/CD pipelines for AI/ML model development, testing, and deployment, ensuring minimal downtime and maximum efficiency.
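The sketch below is not part of the role description; it is a minimal Python illustration of the kind of vector upsert-and-query flow the Pinecone responsibility above refers to. The index name "docs-index", its tiny 4-dimensional vectors, the placeholder API key, and the metadata fields are assumptions made purely for illustration; in practice the embeddings would come from a model such as one hosted on Amazon Bedrock or SageMaker.

```python
# Minimal, hypothetical sketch of vector upsert and similarity query with the
# Pinecone Python client. The index name, dimension, and vectors are assumed.
from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")  # placeholder credential
index = pc.Index("docs-index")         # assumes an existing 4-dimension index

# Upsert a few toy document embeddings with metadata for later filtering.
index.upsert(vectors=[
    {"id": "doc-1", "values": [0.1, 0.2, 0.3, 0.4], "metadata": {"source": "report"}},
    {"id": "doc-2", "values": [0.4, 0.3, 0.2, 0.1], "metadata": {"source": "memo"}},
])

# Query with an embedding of the same dimension; top_k controls how many
# nearest neighbours are returned to the RAG retrieval step.
results = index.query(vector=[0.1, 0.2, 0.3, 0.4], top_k=2, include_metadata=True)
for match in results.matches:
    print(match.id, match.score, match.metadata)
```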
Other
- The selected colleague will work at an MUFG office or client sites four days per week and work remotely one day.
- 10+ years of data experience in the large financial services sector from a technology, data, and/or business operations perspective.
- Hold either an AWS Associate or Professional certification (or equivalent).
- Proven experience as an Integration Engineer or similar role.
- We do not anticipate providing visa sponsorship/support for this position.