Visa is seeking an engineer to design, optimize, and scale key components of its data platforms across Hadoop and cloud environments, delivering high-impact data solutions and driving engineering best practices.
Requirements
- Strong expertise in distributed data processing using Hadoop, Apache Spark, and modern data lake/lakehouse architectures.
- Advanced programming proficiency in PySpark, Scala, and Python, with experience building production-grade data applications.
- Deep knowledge of SQL, distributed query engines (Presto, Trino, Hive, Spark SQL), and database performance optimization.
- Hands-on experience designing and managing relational databases, NoSQL systems, and scalable storage solutions.
- Proficiency in data modeling, ETL/ELT patterns, and data warehousing design.
- Strong experience with AWS, GCP, and Azure for cloud-based data architectures, including data lakes, orchestration tools, and distributed compute.
- Advanced proficiency with Databricks, including notebook and job optimization, Delta Lake design and tuning, cluster and workspace management, CI/CD integration for data workloads, and performance tuning for distributed processing (a brief sketch of this kind of work follows this list).
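For context, the Databricks and Delta Lake items above translate into day-to-day work along these lines. The snippet below is a minimal PySpark sketch, assuming a Databricks workspace (or any Spark session with Delta Lake configured); the paths, columns (event_ts, amount, merchant_id), and table layout are illustrative placeholders, not details from this posting.

```python
# Minimal sketch of Delta Lake design and tuning work; all names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-tuning-sketch").getOrCreate()

# Ingest raw events and apply a light transformation.
raw = spark.read.parquet("/mnt/landing/transactions_raw")
cleaned = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write a Delta table partitioned by date so downstream engines
# (Spark SQL, Presto/Trino over the lakehouse) can prune partitions.
(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("event_date")
           .save("/mnt/lakehouse/transactions")
)

# Routine maintenance: compact small files and clean up old snapshots.
spark.sql("OPTIMIZE delta.`/mnt/lakehouse/transactions` ZORDER BY (merchant_id)")
spark.sql("VACUUM delta.`/mnt/lakehouse/transactions` RETAIN 168 HOURS")
```

Partitioning by date plus periodic OPTIMIZE/ZORDER and VACUUM is one common pattern for keeping lakehouse tables query-friendly; the right layout ultimately depends on the workload.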
Responsibilities
- Architect and implement large-scale data pipelines, ingestion frameworks, and processing systems that support analytics, real-time insights, and enterprise workloads.
- Lead the design and development of scalable data models, lakehouse structures, and distributed compute solutions across Hadoop and cloud platforms.
- Provide deep technical guidance during design reviews, code reviews, and architectural discussions, influencing engineering decisions within assigned domains.
- Build automation frameworks, reusable components, and self-service tooling that improve platform efficiency and reduce operational overhead.
- Drive platform improvements in data quality, observability, governance, reliability, and performance (a reusable quality-check component of this kind is sketched after this list).
- Lead multi-team technical projects, collaborating closely with engineering, analytics, and product partners to deliver high-impact data solutions.
- Evaluate emerging tools and technologies, recommending enhancements that strengthen scalability and data engineering productivity.
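As an illustration of the reusable-component and data-quality responsibilities above, the sketch below shows a small validation gate an ingestion pipeline could call before publishing a table. It is a minimal sketch; the function name, thresholds, and column names are hypothetical and not drawn from this posting.

```python
# Sketch of a reusable data-quality gate a pipeline framework might expose.
from pyspark.sql import DataFrame, functions as F

def quality_gate(df: DataFrame, key_cols: list, max_null_ratio: float = 0.01) -> DataFrame:
    """Fail fast when key columns are too sparsely populated or duplicated."""
    total = df.count()
    if total == 0:
        raise ValueError("quality gate failed: input DataFrame is empty")

    # Null-ratio check per key column.
    for col in key_cols:
        null_ratio = df.filter(F.col(col).isNull()).count() / total
        if null_ratio > max_null_ratio:
            raise ValueError(f"quality gate failed: {col} is {null_ratio:.2%} null")

    # Uniqueness check across the composite key.
    distinct = df.select(*key_cols).distinct().count()
    if distinct < total:
        raise ValueError(f"quality gate failed: {total - distinct} duplicate keys")

    return df  # validation only; the data passes through unchanged

# Typical use inside an ingestion job (names are placeholders):
# validated = quality_gate(cleaned, key_cols=["transaction_id"])
# validated.write.format("delta").mode("append").save("/mnt/lakehouse/transactions")
```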
Other
- 8 or more years of relevant work experience with a Bachelor's Degree, or at least 5 years of experience with an Advanced Degree (e.g., Master's, MBA, JD, MD), or 2 years of work experience with a PhD.
- Strong communication and collaboration skills to work effectively with product, analytics, and engineering partners.
- Travel 5-10% of the time
- Ability to work in an office setting and operate standard office equipment
- Mental/Physical Requirements: This position will be performed in an office setting. It will require the incumbent to sit and stand at a desk, communicate in person and by telephone, and frequently operate standard office equipment such as telephones and computers.