LiveRamp is solving the business and technical challenge of building and maintaining robust data pipelines and infrastructure that support data-driven decision-making, enable efficient data flow, and handle data at large scale.
Requirements
- Deep expertise in data modeling, ETL/ELT architecture, and SQL.
- Proficiency in Python for data transformation, automation, and tooling.
- Hands-on experience with modern data stack tools such as dbt, Fivetran, and Airflow (or similar ETL/orchestration systems).
- Experience with at least one major cloud data platform (e.g., GCP BigQuery, AWS Redshift, Azure Synapse, or Snowflake).
- Understanding of data reliability, governance, and distributed computing.
- Experience supporting data platforms (access, cost optimization, monitoring).
- Hands-on experience with Google Cloud Platform (BigQuery).
Responsibilities
- Contribute to the design and maintenance of data pipelines that support business and product decision-making.
- Support the reliability and performance of data workflows across Fivetran, dbt, and Google Cloud (BigQuery).
- Contribute to team-wide adoption of best practices for ETL and data automation with a focus on high data quality and system resilience.
- Administer and optimize data platforms, monitoring data systems for efficiency, scalability, and cost-effectiveness.
- Leverage frameworks and tools that support data pipelines to improve the reliability and observability of our data ecosystem.
- Partner with cross-functional teams (Data Science, Analytics, Engineering) to ensure that data solutions meet organizational needs.
- Mentor and onboard other engineers, contributing to shared standards and technical excellence.
Other
- 5+ years of experience in Data Engineering or a related technical role.
- Excellent communication skills and ability to work cross-functionally with technical and non-technical partners.
- A self-starter who thrives in fast-paced, evolving environments.
- Experience with PySpark or other distributed data processing frameworks.
- Experience building custom data frameworks or automation tooling.