Realtor.com® is looking for an engineer to design, build, and operate backend services and data integrations that power personalized marketing experiences across CRM and digital marketing channels, ensuring the reliability, observability, and scalability of our data pipelines.
Requirements
- Proficiency in one or more of TypeScript/Node.js, Python, or Java/Kotlin; a strong foundation in writing maintainable, testable code.
- Solid SQL skills and experience with Snowflake or comparable cloud data warehouses; familiarity with data modeling and working with structured/semi-structured data.
- Experience with event-driven systems and streaming technologies (e.g., Kafka/MSK), including building resilient consumers and producers.
- Hands-on experience with AWS services (e.g., Lambda, ECS/EKS, S3, IAM, MSK, API Gateway) and Infrastructure-as-Code (Terraform or CloudFormation).
- CI/CD expertise (preferably with CircleCI + GitHub + Argo) and a testing mindset spanning unit, integration, contract, and data validation tests.
- Strong observability practices (logs, metrics, traces) using tools like New Relic and Splunk; effective incident response and post-incident improvement.
- Experience with the Snowflake data platform and its streaming/batch components; familiarity with DBT and Airflow.
Responsibilities
- Design, build, and operate backend services and data integrations that are secure, observable, and cost-efficient, using modern cloud patterns on AWS.
- Collaborate with data platform teams on the Snowflake and Kafka data platform (streaming and batch) to enable reliable data collection, processing, and access for near real-time marketing and analytics use cases.
- Contribute to streaming pipelines and consumers (e.g., Kafka/MSK) and batch workflows backed by Snowflake/DBT/Airflow/Hightouch, with a focus on correctness, performance, and operational excellence.
- Implement and extend CI/CD pipelines on the company's deployment platform (CircleCI + GitHub + Argo), driving paved-path adoption, test automation, and deployment safety.
- Help instrument, validate, and monitor clickstream and service events in partnership with data reliability teams, improving event health and taxonomy alignment across systems.
- Strengthen observability for services and pipelines (metrics, logging, tracing) to reduce mean time to detection/recovery and enable effective on-call rotations.
- Partner with cross-functional stakeholders (marketing, engineering, product, analytics, privacy) to define requirements, de-risk designs, and ship incremental value.
Other
- 5+ years of experience building backend systems and/or data-intensive services in production, with ownership across design, implementation, and operations.
- Bachelor's degree or equivalent experience.
- Excellent collaboration and communication skills; ability to partner with product, analytics, and other engineering teams to deliver outcomes.
- Experience with CRM and digital marketing platforms (e.g., Braze, Cordial, Google Ads, Meta Ads).
- Background in clickstream data reliability: event taxonomy, validation tooling, event health monitoring, and developer workflows that "shift-left" data quality.