ServiceNow is building a modern, secure, and highly scalable multi-cloud data platform to support next-generation analytics and FinOps governance capabilities.
Requirements
- 10+ years of experience in data engineering with deep expertise in distributed data systems and cloud-native architecture
- Proven experience building and scaling data warehousing solutions on AWS, GCP, and Azure with hybrid cloud integration
- Expert-level proficiency with modern data stack technologies: dbt, Trino/Presto, Apache Airflow, and Apache Iceberg
- Advanced SQL skills and experience with query optimization for large-scale analytics workloads (a short query sketch follows this list)
- Hands-on experience with cloud cost management, FinOps data analysis, and showback/chargeback system implementation
- Strong programming skills in Python, Scala, and SQL with experience in data pipeline development
- Experience with AWS (S3, Glue, EMR, Redshift), GCP (BigQuery, Dataflow, Composer), Azure (Synapse, Data Factory, Event Hubs), and private data center integration
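For a concrete flavor of this stack in practice, here is a minimal query sketch using the `trino` Python client against a hypothetical Iceberg cost table; the host, catalog, table, and column names are all illustrative assumptions, not an existing schema:

```python
# Minimal sketch: querying a hypothetical Iceberg cost table through Trino.
# The host, catalog, schema, table, and columns below are illustrative.
from trino.dbapi import connect

conn = connect(
    host="trino.example.internal",  # assumed cluster endpoint
    port=443,
    user="finops-analytics",
    catalog="iceberg",
    schema="finops",
    http_scheme="https",
)

cur = conn.cursor()
# Filtering on the (assumed) usage_date partition column prunes partitions,
# which is the first lever for bounding scans at petabyte scale.
cur.execute("""
    SELECT cloud_provider,
           service,
           SUM(unblended_cost) AS total_cost
    FROM cost_usage
    WHERE usage_date >= DATE '2024-01-01'
    GROUP BY cloud_provider, service
    ORDER BY total_cost DESC
    LIMIT 20
""")
for provider, service, cost in cur.fetchall():
    print(f"{provider:<8} {service:<30} ${cost:,.2f}")
```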
Responsibilities
- Design and implement scalable data warehousing solutions supporting petabyte-scale cost and usage analytics across AWS, GCP, Azure, and private data centers.
- Architect robust ETL/ELT pipelines using Apache Airflow, dbt, and modern orchestration frameworks for complex multi-cloud data ingestion and transformation (see the DAG sketch after this list).
- Establish data modeling best practices for cost allocation, showback/chargeback, and unit economics analysis while ensuring data quality, lineage, and governance across the platform (an allocation sketch follows this list).
- Build and optimize distributed query engines using Trino/Presto for complex analytics workloads across multi-cloud environments.
- Implement advanced data transformation logic using dbt for cost attribution, resource optimization, and financial analytics.
- Develop high-performance data ingestion connectors for hyperscaler billing APIs, ServiceNow internal systems, and enterprise financial systems (a connector sketch follows this list).
- Optimize query performance and resource utilization for large-scale analytics workloads while maintaining strict SLAs.
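To illustrate the orchestration responsibilities above, here is a minimal Airflow DAG sketch that fans in three hypothetical billing extracts and hands off to dbt; the extract bodies, dbt project path, and model selector are placeholders under assumed conventions, not a prescribed design:

```python
# Minimal sketch of a daily multi-cloud billing DAG (Airflow 2.4+ TaskFlow API).
# The extract bodies, dbt project path, and model selector are placeholders.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["finops"])
def multicloud_billing():
    @task
    def extract_aws():
        ...  # pull AWS cost and usage data into the lake (see connector sketch below)

    @task
    def extract_gcp():
        ...  # pull the GCP billing export

    @task
    def extract_azure():
        ...  # pull Azure cost details via the Cost Management API

    # Once all raw billing lands, build the cost-attribution models with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/finops --select cost_attribution",
    )

    [extract_aws(), extract_gcp(), extract_azure()] >> dbt_run


multicloud_billing()
```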
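For the connector work, here is a minimal sketch of an AWS-side billing pull using the `boto3` Cost Explorer API (`get_cost_and_usage`); production pipelines would more likely ingest the Cost and Usage Report from S3, and the grouping and granularity choices here are assumptions:

```python
# Minimal sketch of an AWS billing connector using the Cost Explorer API.
# Real pipelines typically ingest the Cost and Usage Report from S3 instead;
# Cost Explorer is shown here only because the API is compact.
import boto3


def fetch_daily_costs(start: str, end: str) -> list[dict]:
    """Return daily unblended cost per service between start and end (YYYY-MM-DD)."""
    ce = boto3.client("ce")
    rows = []
    token = None
    while True:
        kwargs = dict(
            TimePeriod={"Start": start, "End": end},
            Granularity="DAILY",
            Metrics=["UnblendedCost"],
            GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
        )
        if token:
            kwargs["NextPageToken"] = token
        resp = ce.get_cost_and_usage(**kwargs)
        for day in resp["ResultsByTime"]:
            for group in day["Groups"]:
                rows.append({
                    "usage_date": day["TimePeriod"]["Start"],
                    "service": group["Keys"][0],
                    "unblended_cost": float(group["Metrics"]["UnblendedCost"]["Amount"]),
                })
        token = resp.get("NextPageToken")
        if not token:
            break
    return rows
```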
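Finally, the showback/chargeback modeling ultimately rests on allocation arithmetic like the following sketch, which spreads shared spend across teams in proportion to their directly tagged spend; the team names are hypothetical and proportional weighting is one common convention, not the required method:

```python
# Minimal sketch of proportional showback allocation: shared (untagged) spend
# is distributed across teams in proportion to their directly tagged spend.
def allocate_showback(tagged: dict[str, float], shared_cost: float) -> dict[str, float]:
    total_tagged = sum(tagged.values())
    return {
        team: direct + shared_cost * (direct / total_tagged)
        for team, direct in tagged.items()
    }

# Example: $1,000 of shared platform spend spread over three teams.
print(allocate_showback({"search": 6_000.0, "ml": 3_000.0, "web": 1_000.0}, 1_000.0))
# -> {'search': 6600.0, 'ml': 3300.0, 'web': 1100.0}
```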
Other
- Bachelor's degree in Computer Science, Engineering, or related technical field required
- Full professional proficiency in English
- Conference speaking experience and thought leadership in data engineering or FinOps communities
- Strong technical writing and documentation skills with ability to translate complex technical concepts for diverse audiences
- Proven ability to work autonomously while coordinating across multiple engineering teams