Ridgeline is building scalable, reliable systems for a modern cloud platform that deliver clean, timely, and trusted data to internal and external stakeholders.
Requirements
- Proficiency in Python, Kotlin, or Java
- Advanced knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL, Aurora)
- Deep experience in designing and maintaining data pipelines, data models, and ETL/ELT workflows
- Familiarity with big data tools such as Kafka, Spark, or Flink
- Experience working with cloud platforms (preferably AWS)
- Hands-on experience with containerization and orchestration tools such as Docker and Kubernetes
Responsibilities
- Design and build scalable, reliable, high-performance data pipelines that ingest, transform, and deliver data across Ridgeline’s platform
- Develop and maintain data APIs, SDKs, and services to support internal development teams and external clients
- Ensure data quality and integrity by implementing robust testing, validation, and monitoring processes
- Optimize storage, processing, and retrieval mechanisms for performance, scalability, and cost efficiency
- Implement best practices for CI/CD, observability, and security in all data engineering workflows
- Mentor junior engineers and provide technical leadership to foster a culture of continuous learning and innovation
Other
- 8+ years of experience in software engineering with a strong focus on data engineering and distributed systems
- Excellent communication and collaboration skills
- Strong analytical problem-solving abilities with a bias for action
- Ability to take ownership and lead initiatives independently while mentoring others
- Unlimited vacation, educational and wellness reimbursements, and $0 cost employee insurance plans