Alchemy solves the problem of building and scaling onchain apps and rollups by providing a complete developer platform with powerful APIs, SDKs, and tools.
Requirements
- Knowledge of multi-stage workflow orchestration frameworks such as Airflow
- Knowledge of Jenkins
- Knowledge of Pachyderm
Responsibilities
- Maintain Alchemy’s batch pipelines that power our production serving systems.
- Set up frameworks and tooling that enable team members to create and debug pipelines independently.
- Track data quality and latency, and set up monitors and alerts to ensure smooth operation.
- Build production DAG workflows for batch data processing and storage.
- Aggregate logs from multiple regions and multiple clouds.
- Design and implement our next-generation data warehouse that aggregates internal and third-party data sources.
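To make the DAG-based batch processing in the responsibilities above concrete, here is a toy sketch of dependency-ordered task execution using only the Python standard library. The task names (extract, transform, load), their bodies, and the pure-Python scheduler are illustrative assumptions, not Alchemy's actual stack; in practice an orchestrator such as Airflow manages this scheduling.

```python
from graphlib import TopologicalSorter

# Hypothetical batch tasks; names and bodies are illustrative only.
results = {}

def extract():
    results["extract"] = [1, 2, 3]  # pretend: pull raw events from a source

def transform():
    results["transform"] = [x * 2 for x in results["extract"]]  # normalize

def load():
    results["load"] = sum(results["transform"])  # write an aggregate downstream

TASKS = {"extract": extract, "transform": transform, "load": load}

# DAG edges: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG declares upstream dependencies.
DEPS = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

def run_dag():
    # TopologicalSorter yields tasks in an order that respects DEPS,
    # so every task runs only after its upstream tasks have finished.
    for name in TopologicalSorter(DEPS).static_order():
        TASKS[name]()
    return results

run_dag()  # runs extract, then transform, then load
```

The same idea scales up in a real orchestrator: declare tasks and their upstream edges, and let the framework handle ordering, retries, and monitoring.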
Other
- Requires a Bachelor’s degree or foreign degree equivalent in Computer Science, Computer Engineering, Information Systems, or a closely related field.
- 4 years of relevant industry experience in data engineering or data infrastructure.
- 401k and unlimited flexible time off
- Comprehensive medical, dental, and vision coverage