OXIO is designing, developing, and scaling its modern data platform, migrating toward a North Star data architecture that supports scalable, secure, and intelligent data operations. This work enables advanced analytics use cases across telecom networking, product intelligence, financial reporting, and customer insights, including a Customer 360 platform for fraud detection and personalized engagement.
Requirements
- 10+ years of hands-on experience coding in Python, Spark, and SQL to build and maintain data pipelines.
- 5+ years of experience with AWS cloud services, dbt, and data warehouses such as Snowflake, Databricks, BigQuery, or Redshift.
- Proficient with Dimensional Modeling (Star Schema, Kimball, Inmon) and Data Architecture concepts.
- Advanced SQL skills (comfort with window functions and defining UDFs).
- Experience in implementing real-time and batch data pipelines with strict SLOs, and optimizing data storage and access patterns.
- Proven track record of enhancing data reliability, discoverability, and observability.
- Good understanding of lakehouse storage layers such as Apache Hudi, Delta Lake, or Apache Iceberg.
Responsibilities
- Help build, maintain, and scale our data pipelines that bring together data from various internal and external systems into our data warehouse.
- Partner with upstream engineering teams to enhance data logging patterns and best practices.
- Participate in architectural decisions and help us plan for the company’s data needs as we scale.
- Adopt and evangelize data engineering best practices for data processing, modeling, and lake/warehouse development.
- Advise engineers and other cross-functional partners on how to most efficiently use our data tools.
- Develop data solutions through hands-on coding.
- Help build a comprehensive Customer 360 platform powered by ML models and behavioral data.
Other
- 15+ years of experience as a data engineer and/or analytics engineer building large-scale data platforms and scalable data warehouses.
- Hands-on experience with data visualization and BI tools (Tableau, Power BI, Cube.dev).
- Strong grasp of data governance, compliance frameworks, and secure data sharing.
- Demonstrated success in leading large-scale projects across teams and mentoring others in Data Engineering best practices.
- Experience building operational tools and defining best practices to improve operational efficiency and developer experience—including evaluating and adopting AI-based tools for engineering productivity.