Global supply chains rely on slow, manual processes like email and spreadsheets, creating inefficiencies in the $13T of goods shipped annually. Salesforce aims to reimagine supply chains with an AI-powered platform for designing, automating, and running end-to-end business processes, with seamless collaboration through familiar channels like email.
Requirements
- 8+ years of professional software development experience, with a significant focus on large-scale data systems, ETL, or integration platforms
- Demonstrated expertise in designing, building, and maintaining production-grade data pipelines (ETL/ELT)
- Expert-level proficiency in at least one modern programming language (e.g., Java, Python, Go) suitable for platform development
- Proven experience designing and implementing robust, external-facing REST APIs and Webhooks
- Deep practical knowledge of building and operating distributed systems, concurrency, and high-availability architectures
- Experience in a lead technical role (LMTS/PMTS level or equivalent), driving and owning complex projects as an Individual Contributor
- Hands-on experience with Salesforce integration technologies such as Mulesoft or Salesforce Data Cloud
Responsibilities
- Architect, design, and deliver high-quality, scalable code for complex data integration and platform features.
- Drive technical decision-making and project execution as the primary Individual Contributor for integration initiatives.
- Conduct thorough code reviews and mentor other engineers on best practices for performance, security, and scalability.
- Collaborate cross-functionally with Product Management and other engineering teams to define requirements and deliver solutions that meet business needs.
- Set the technical direction and standards for all data integration and ETL processes within the platform.
- Identify and mitigate architectural risks associated with scaling our data infrastructure to support exponential customer growth.
- Drive continuous improvement in system performance, observability, and operational efficiency.
Other
- A related technical degree is required
- Excellent written and verbal communication skills
- Experience working with large-scale event streaming platforms (e.g., Kafka, Kinesis)
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes)
- Experience designing or working with GraphQL API implementations