AbbVie's Data Platform team builds and maintains the foundation that powers analytics, ML, and data-driven products. The mission is to enable other teams to move faster, more safely, and more reliably by delivering scalable, observable, and compliant data systems.
Requirements
- Proficiency in Python and TypeScript for data pipeline and backend service development.
- Expertise in PostgreSQL, including schema design, query optimization, and replication.
- Deep understanding of data modeling, ETL/ELT frameworks, and event-driven architecture.
- Hands-on experience with Kafka, Terraform, and AWS data services.
- Proven success operating and scaling production data systems with high reliability and observability.
Responsibilities
- Define and execute the data platform roadmap, covering ingestion, transformation, orchestration, and data access.
- Design, implement, and maintain ETL/ELT pipelines and data models that support analytics, ML, and operational workloads.
- Architect and evolve event-driven data systems using Kafka to enable scalable, decoupled pipelines.
- Maintain and optimize core data tooling such as Snowflake, dbt, Airflow, and Fivetran.
- Ensure system reliability, observability, and performance across development, staging, and production environments.
- Manage AWS-based infrastructure (MWAA, S3, RDS/Postgres, MSK, Glue, Lambda, ECS/EKS) using Terraform.
- Collaborate with Platform and Security Engineering on CI/CD, monitoring, and compliance practices.
Other
- Lead and grow a team of data and software platform engineers, fostering technical excellence, ownership, and collaboration.
- Partner with domain and product engineering teams to define and enforce data contracts, governance, and quality standards.
- Champion automation, scalability, and cost efficiency within the platform.
- 8-10 years of total experience in data, software, and platform engineering roles.
- 2-4 years of experience leading or managing technical teams.