Alchemy is looking for an engineer to architect and build new backend systems, and improve existing ones, for a platform serving millions of users globally, with a focus on sophisticated, high-throughput distributed systems in the blockchain space.
Requirements
- Multi-stage orchestration frameworks such as Airflow
- Jenkins
- Pachyderm
Responsibilities
- Maintain Alchemy’s batch pipelines that power our production serving systems.
- Set up frameworks and tools to help team members create and debug pipelines by themselves.
- Track data quality and latency, and set up monitors and alerts to ensure smooth operation.
- Build production DAG workflows for batch data processing and storage.
- Aggregate logs from multiple regions and multiple clouds.
- Design and implement our next generation data warehouse that aggregates internal and third-party data sources.
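To give a flavor of the DAG workflow responsibility above, here is a minimal sketch of a batch pipeline expressed as a dependency graph and executed in topological order, using only Python's standard-library `graphlib`. All task names and bodies (`extract`, `transform`, `load`, `validate`) are hypothetical illustrations, not Alchemy's actual pipeline; a production system would use a framework like Airflow instead.

```python
from graphlib import TopologicalSorter

# Illustrative task bodies; each reads its inputs from a shared results dict.
results = {}

def extract():
    results["extract"] = [1, 2, 3]

def transform():
    results["transform"] = [x * 10 for x in results["extract"]]

def load():
    results["load"] = sum(results["transform"])

def validate():
    # A toy data-quality check of the kind a monitor might run.
    results["validate"] = all(x % 10 == 0 for x in results["transform"])

TASKS = {"extract": extract, "transform": transform,
         "load": load, "validate": validate}

# DAG edges: each task maps to the set of tasks it depends on.
DEPS = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "validate": {"transform"},
}

def run_pipeline():
    # static_order() yields tasks so every dependency runs before its dependents.
    for name in TopologicalSorter(DEPS).static_order():
        TASKS[name]()
    return results

if __name__ == "__main__":
    out = run_pipeline()
    print(out["load"])       # 60
    print(out["validate"])   # True
```

In a real orchestrator the same shape holds: tasks declare upstream dependencies, and the scheduler (rather than a single loop) decides ordering, retries, and parallelism across independent branches like `load` and `validate`.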
Other
- Requires a Bachelor’s degree or foreign degree equivalent in Computer Science, Computer Engineering, Information Systems, or a closely related field.
- 4+ years of relevant industry experience in data engineering or data infrastructure.
- Comprehensive medical, dental, and vision coverage, as well as other benefits such as a 401(k) and unlimited flexible time off.