Madhive is seeking a Senior Data Engineer to scale the infrastructure powering its universal pixel and to transform high-volume event data into reliable, actionable signals. The role involves working closely with cross-functional teams to build robust, scalable pipelines and data models while helping ensure high data quality across the platform.
Requirements
- 5+ years of experience in software or data engineering, with a focus on building scalable data infrastructure
- Strong experience building data pipelines and data models, using tools such as Apache Airflow, Databricks, Snowflake, and dbt
- In-depth knowledge of streaming technologies such as Apache Kafka
- Skilled in designing and maintaining ELT/ETL workflows using modern tooling
- Proficient in SQL and comfortable working with both relational and NoSQL databases (e.g., Postgres, Bigtable, Spanner)
- Experience working with cloud platforms, ideally GCP
- Familiarity with JavaScript and front-end tracking concepts, especially in non-browser environments such as connected TV (CTV)
Responsibilities
- Design and implement scalable data pipelines for ingesting, processing, and transforming large volumes of universal pixel and event data
- Build and maintain real-time and batch workflows using tools such as Kafka, Airflow, and BigQuery
- Collaborate with engineers and product teams to ensure event data is captured accurately through our JavaScript-based universal pixel
- Own and optimize ELT processes to support reporting, analytics, and machine learning use cases
- Develop and maintain data models to support internal stakeholders and platform features
- Monitor pipeline health, implement anomaly detection, and maintain high data quality standards
- Contribute to the evolution of our cloud data infrastructure (built on GCP)
Other
- Excellent collaboration and communication skills
- Bonus: experience in adtech, martech, or CTV attribution
- Remote position