Twilio is scaling out the GTM Data Engineering team's ML and AI infrastructure to better support stakeholders across the organization, including Sales Systems and Marketing, with a robust data foundation for building sophisticated automations.
Requirements
- 5+ years of experience in ML platform development and/or data engineering.
- Proven track record of delivering large-scale data projects and working with business partners.
- Strong understanding of infrastructure components of large-scale ML and AI applications.
- Experience with big data processing frameworks such as Spark, Flink, or Ray.
- Experience with data orchestration tools like Airflow or Dagster.
- Experience with infrastructure-as-code tools (e.g., Terraform) and modern CI/CD pipelines.
- Experience building large-scale distributed systems on AWS or a similar cloud provider.
Responsibilities
- Collaborate with other engineers, business partners, and data scientists to build best-in-class data infrastructure that meets evolving needs.
- Design and optimize infrastructure for managing ML workflows at scale.
- Design high-performance systems to ensure fast and efficient AI agent serving.
- Design and manage reverse ETL pipelines to power sales operations and marketing automation.
- Improve internal tooling and developer experience for our data scientists.
- Develop and maintain our data warehouse (Snowflake) to enable efficient and accurate usage of data.
- Document data pipelines, data models, and data transformation processes.
Other
- Travel may be required for in-person project or team meetings.
- Location: This role is remote, but candidates are not eligible to be hired in CA, CT, NJ, NY, PA, or WA.
- Bachelor's, Master's, or Ph.D. degree in a relevant field.
- Strong communication and collaboration skills.
- Ability to work in a remote-first environment.