Rain is rebuilding the global financial pipes that money flows through. As they scale to millions of end users, they need a dedicated Data Engineer to architect the ingestion layer, pipelines, and infrastructure that power their data ecosystem, ensuring timely, accurate, and trustworthy data for all teams.
Requirements
- Strong Python and SQL fundamentals, with real experience building production-grade ETL/ELT.
- Hands-on experience with Airflow, Dagster, Prefect, or similar systems.
- Comfortable designing schemas, optimizing performance, and operating modern cloud warehouses (Snowflake, BigQuery, Redshift).
- Experience ingesting and processing payments data, transaction logs, or ledger systems.
- Exposure to smart contracts, blockchain data structures, or on-chain event ingestion.
- Familiarity with dbt and/or semantic modeling to support analytics layers.
- Prior experience standing up data platforms from 0→1 at early-stage companies.
Responsibilities
- Design, build, and maintain Rain’s core data pipelines, including ingestion from payments processors, card issuers, blockchain nodes, internal services, and third-party APIs.
- Own orchestration and workflow management, implementing Airflow, Dagster, or similar tools to ensure reliable, observable, and scalable data processing.
- Architect and manage Rain’s data warehouse (Snowflake, BigQuery, or Redshift), owning performance tuning, cost optimization, partitioning strategy, and access patterns.
- Develop high-quality ELT/ETL transformations to structure raw logs, transactions, ledgers, and on-chain events into clean, production-grade datasets.
- Implement data quality frameworks and observability (tests, data contracts, freshness checks, lineage) to ensure every dataset is trustworthy.
- Partner closely with backend engineers to instrument new events, define data contracts, and improve telemetry across Rain’s infrastructure.
- Own data reliability at scale, leading root-cause investigations, reducing pipeline failures, and building monitoring and alerting systems.
About You
- Data infrastructure builder – You thrive in early-stage environments, owning pipelines and platforms end-to-end and choosing simplicity without sacrificing reliability.
- Quality-obsessed – You care deeply about data integrity, testing, lineage, and observability.
- Systems thinker – You see data as a platform; you design for reliability, scale, and future users.
- Collaborator – You work well with backend engineers, analytics engineers, and cross-functional stakeholders to define requirements and deliver outcomes.
- Experienced – 5–7+ years in data engineering roles, ideally within fintech, payments, B2B SaaS, or infrastructure-heavy startups.