Conviva is looking to optimize digital customer experiences by harnessing full-census, comprehensive client-side telemetry and linking it to the performance of underlying services in real time.
Requirements
- Proficiency with major distributed stream processing frameworks such as Akka Streams, Apache Spark, or Apache Flink.
- Hands-on experience with big data query engines or distributed database systems such as ClickHouse, Apache Druid, Presto, or BigQuery.
- Experience in performance tuning and analysis of distributed systems.
- Experience building massively scalable data infrastructure using commercial and open-source tools.
- Strong programming skills in C++, Rust, Scala, Java, or a similar language.
- Experience in the SaaS space (preferred).
- Track record of contributing to open-source projects (preferred).
Responsibilities
- Develop and maintain the OLAP layer, optimizing data processing and query performance.
- Design, build, and optimize data pipelines for efficient batch and streaming data processing using various open-source frameworks.
- Integrate OLAP solutions into the broader data ecosystem, ensuring high performance and reliability.
- Design, build, maintain, and improve a range of algorithms and their underlying systems.
- Lead critical technical decisions collaboratively, guiding and mentoring team members through engineering challenges.
- Stay current with industry trends and emerging technologies in distributed database systems and stream processing frameworks.
Other
- Foster a positive team culture, promoting code quality, driving initiatives, and ensuring impeccable execution.
- Demonstrate effective collaboration within teams while being open to receiving feedback, embracing a learning mindset, and actively supporting others.
- Adaptable to ambiguity and comfortable in a fast-paced work environment.
- Solid foundation in computer science or a related field.
- 3+ years of industry experience in software development using modern development processes, toolchains, and infrastructure, or a PhD with significant contributions in big data or distributed systems.