DRW is strengthening its forecasting and analytics capabilities by building and maintaining robust data pipelines, and is looking for a Python Data Engineer to join its Weather team to support this work.
Requirements
- Experience with weather and climate datasets and tooling (e.g., Copernicus, Xarray, Zarr, NetCDF); a brief illustrative sketch follows this list.
- Proficiency in Python programming and experience with libraries such as Pandas, NumPy, and FastAPI.
- Experience with ETL tools and frameworks (e.g., Apache Airflow, Apache NiFi, Talend).
- Strong understanding of relational databases and SQL.
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
- Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
- Experience with version control systems (e.g., Git).
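As a loose illustration of the weather tooling and Python libraries named above (not an additional requirement), the sketch below builds a small gridded temperature dataset with xarray, reduces it to a pandas time series, and persists it as Zarr. The variable names, grid, and output path are hypothetical, and it assumes numpy, pandas, xarray, and zarr are installed.

```python
import numpy as np
import pandas as pd
import xarray as xr

# Build a small gridded 2-metre temperature field in memory
# (a stand-in for a NetCDF or Copernicus download).
times = pd.date_range("2024-01-01", periods=4, freq="6h")
lats = np.linspace(-90.0, 90.0, 19)
lons = np.linspace(0.0, 357.5, 144)
temp = 273.15 + 10.0 * np.random.rand(len(times), len(lats), len(lons))

ds = xr.Dataset(
    {"t2m": (("time", "latitude", "longitude"), temp)},
    coords={"time": times, "latitude": lats, "longitude": lons},
)

# Reduce to an area-mean time series and expose it as a pandas DataFrame
# for downstream analytics.
area_mean = ds["t2m"].mean(dim=["latitude", "longitude"])
df = area_mean.to_dataframe(name="t2m_area_mean")

# Persist the full dataset in a chunked, cloud-friendly format.
ds.to_zarr("t2m_demo.zarr", mode="w")
print(df.head())
```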
Responsibilities
- Design, develop, and maintain efficient and scalable data pipelines using Python (a minimal pipeline sketch follows this list).
- Extract, transform, and load (ETL) data from various sources into our data platform.
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality data solutions.
- Monitor and optimize the performance of data pipelines to ensure data quality and reliability.
- Implement data validation and error-handling mechanisms to ensure data accuracy.
- Work with cloud-based data storage and processing solutions (e.g., AWS, GCP, Azure).
- Stay up-to-date with industry trends and emerging technologies to continuously improve our data engineering capabilities.
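To make the pipeline, validation, and error-handling responsibilities above more concrete, here is a minimal, hypothetical sketch using Apache Airflow's TaskFlow API (assumes Airflow 2.4+ and pandas; the DAG name, stations, and validation rule are invented for illustration). A failed quality check raises an exception so the task fails visibly instead of loading bad data.

```python
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["weather"])
def weather_etl_demo():
    @task
    def extract() -> list[dict]:
        # Placeholder extraction; a real task would pull from an API or object store.
        return [{"station": "KORD", "t2m_k": 275.4}, {"station": "EGLL", "t2m_k": 281.1}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Basic data-quality gate: only physically plausible temperatures pass.
        df = pd.DataFrame(rows)
        bad = df[(df["t2m_k"] < 150) | (df["t2m_k"] > 350)]
        if not bad.empty:
            raise ValueError(f"{len(bad)} rows failed the temperature range check")
        return df.to_dict(orient="records")

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load; a real task would write to a warehouse table.
        print(f"loading {len(rows)} validated rows")

    load(validate(extract()))


weather_etl_demo()
```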
Other
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Minimum of 3 years of experience in data engineering or a related role.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Travel requirements are not specified; occasional travel may be required for collaboration with global teams.