The company is looking to support its growing portfolio of work, specifically focusing on GenAI tools and a firmwide virtual assistant. This involves translating stakeholder requirements into application features, designing and building AI tools, and acting as a champion for data science and AI adoption within the organization.
Requirements
- Fluency in Python, with the ability to design and write clean, modular, well-documented code, and a solid understanding of coding best practices
- Ability to logically evolve an architecture from prototype to product, considering technical debt and delivery risk
- Experience with data engineering, APIs, and cloud platforms (ideally AWS) and containerization technologies (Docker)
- Experience with enterprise software development lifecycle and tooling including continuous integration and delivery concepts/technologies
- Experience with machine learning workflows and cloud-scale machine learning infrastructure, including LLMs
- Experience with data orchestrators (Airflow, Dagster) and cloud-based ETL/ELT pipelines
Responsibilities
- Implement: Translate requirements from our broad range of commercial stakeholders into application features.
- Design: Ensure we design and build models and tools that meet the functional and non-functional requirements and remain supportable.
- Develop, test, and maintain software tools and data pipelines for machine learning
- Provide software engineering and design expertise and best practices (Python) with a focus on maintainability, performance, and reliability
- As needed, take ownership of key technical infrastructure
- Engage with projects at any point in their lifecycle; understand and debug bespoke applications, driving performance and reliability
- Actively participate in and lead code reviews, experiment design, and tooling decisions to help drive the team’s velocity and quality
Other
- Act as the primary point of contact in Houston for our GenAI toolset
- Translate: Act as a local champion for data science and AI, helping users adopt tools and articulating their changes and requirements to the wider team.
- The individual will work with our data scientists and machine learning engineers, but will also need to engage directly with the commercial teams (across trading, operations, support functions, etc.).
- Manage relationships and priorities across projects, focused on maximising value
- Master's degree in Computer Science or a related field
- Ability and desire to learn and apply new technologies
- Collaborative approach to problem-solving - the ability to pair program effectively
- Effective technical communicator - both written and verbal; able to translate loose designs into documentation / process / operating model
- Experience in the energy or commodities trading industry, with knowledge of financial markets and trading concepts
- A self-motivated individual who thrives on seeing the results of their work make an impact in the business
- Strong communication skills, both verbal and written
- Proven flexibility, a strong work ethic, and a sense for the art of the possible
- Methodical and organized, with attention to detail - in general, in experimental design, and in code!
- Willingness to share knowledge and learn from others
- An interest in learning about the commodities space
- Resourceful, able to think creatively and adapt in a dynamic environment
- Team player, with an open non-political style and a high level of integrity
- Desire to be a thought-partner in a fast-growing team, and make an impact at a business that sits at the heart of the world’s energy flows