Akuna Capital is looking to solve the challenge of managing and utilizing large volumes of data to drive business success.
Requirements
- Java/Scala experience required
- Python experience a significant plus
- Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
- Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
- Recent hands-on experience with AWS Cloud development, deployment, and monitoring required
- Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD
- Experience with Databricks/Spark and EKS
Responsibilities
- Drive the ongoing design and expansion of our data platform across a wide variety of data sources
- Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture
- Produce clean, well-tested, and documented code with a clear design to support mission critical applications
- Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs)
- Mentor junior engineers in software and data engineering best practices
- Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed
- Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
Other
- BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
- 5+ years of professional experience developing software applications
- Excellent communication, analytical, and problem-solving skills
- Must possess the ability to react quickly and accurately to rapidly changing market conditions
- The ability to respond to and solve math and coding problems quickly and accurately is an essential function of the role