Design and implement a unified data interface to power model development, experimentation, evaluation, and real-time inference for Meter's neural-network-driven system for reasoning about raw computer networks.
Requirements
- Experience designing large-scale data infrastructure, ideally across batch and streaming modes
- Ability to think deeply about schema design, versioning, and data quality
- Proficiency in Python and Go
- Experience with Kafka, Postgres, and ClickHouse
- Experience with AWS and Azure
Responsibilities
- Design and implement the Models API: a unified interface for accessing training, evaluation, and deployment data across raw, transformed, and feature-engineered layers (a hypothetical sketch follows this list)
- Ensure backwards compatibility and feature versioning across constantly evolving schemas
- Build scalable pipelines for ingesting, transforming, and serving petabytes of data across Kafka, Postgres, and ClickHouse
- Create CI/CD workflows that evolve the API in lockstep with changes to the underlying data schema
- Enable fine-grained querying of historical and real-time data for any network, at any point in time
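
To make the Models API responsibilities concrete, here is a minimal, purely illustrative Python sketch of what a schema-versioned, point-in-time read through such an interface might look like. Every name in it (`FeatureQuery`, `ModelsClient`, the field names) is an assumption invented for this sketch, not Meter's actual API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


# All names below (FeatureQuery, ModelsClient, field names) are invented for
# illustration; they are not Meter's actual Models API.

@dataclass(frozen=True)
class FeatureQuery:
    network_id: str       # which network's data to read
    layer: str            # "raw" | "transformed" | "features"
    schema_version: str   # pinned so old training jobs stay reproducible
    as_of: datetime       # point-in-time snapshot for historical reads


class ModelsClient:
    """Toy in-memory stand-in for a unified, versioned data interface."""

    def __init__(self) -> None:
        # (network_id, layer, schema_version) -> list of (timestamp, record)
        self._store: dict[tuple[str, str, str], list[tuple[datetime, dict]]] = {}

    def write(self, key: tuple[str, str, str], ts: datetime, record: dict) -> None:
        self._store.setdefault(key, []).append((ts, record))

    def read(self, q: FeatureQuery) -> list[dict]:
        # Point-in-time semantics: only records visible at q.as_of, under the
        # pinned schema version, are returned.
        rows = self._store.get((q.network_id, q.layer, q.schema_version), [])
        return [record for ts, record in rows if ts <= q.as_of]


if __name__ == "__main__":
    client = ModelsClient()
    key = ("net-123", "features", "v2")
    client.write(key, datetime(2024, 1, 1, tzinfo=timezone.utc), {"latency_p95_ms": 12.0})
    client.write(key, datetime(2024, 6, 1, tzinfo=timezone.utc), {"latency_p95_ms": 9.5})

    query = FeatureQuery("net-123", "features", "v2",
                         as_of=datetime(2024, 3, 1, tzinfo=timezone.utc))
    print(client.read(query))  # [{'latency_p95_ms': 12.0}] -- the June write is not yet visible
```

Pinning `schema_version` and `as_of` together is one way to keep an old training or evaluation run reproducible even as the underlying schemas and features evolve.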
Other
- Flexible time off
- Commuter benefits
- Parental leave
- Onsite meals (San Francisco office)
- Medical, dental & vision coverage for you and your dependents