The DRW FICC Data Engineering team builds the valuable datasets and scalable data infrastructure critical to our trading operations. We work closely with traders, researchers, and other engineering teams to ensure seamless data flow and accessibility, supporting data-driven decision-making.
Requirements
- 2+ years of experience with at least one of Python, Java, or C++, and the ability to work comfortably across multiple programming languages
- Experience with Linux-based, concurrent, high-throughput, low-latency software systems
- Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster)
- Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases
- Ability to dive deep into complex problems, develop intuitive understandings, spot risks early, and minimize complexity
Responsibilities
- Design, build, and maintain systems for both batch processing and real-time streaming of time series datasets, ensuring high data quality and reliability.
- Develop APIs and data access methods for fast, intuitive retrieval of historical and live data, working with both new and existing systems.
- Collaborate with cross-functional teams to deliver data solutions that support diverse trading activities.
- Take full ownership of data products, guiding them from initial concept through to stable production.
- Provide on-call support as needed.
Other
- A track record of working directly with end customers, scoping and delivering production systems in fast-moving and ambiguous environments
- Exceptional interpersonal skills: you communicate clearly with stakeholders as well as other engineers, fostering a collaborative, supportive working environment
- Experience in the financial markets, especially in delta one, store of value, and/or FICC options trading
- A Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Physics, Engineering, or equivalent work experience