The company is seeking to develop and maintain efficient data pipelines and solutions that support its climate technologies business.
Requirements
- Experience with relational (SQL) databases.
- Experience with data warehouses such as Oracle, SQL Server, and Snowflake.
- Experience with data modeling, data mining and segmentation techniques.
- Experience with ETL/ELT tools such as Pentaho Data Integration, Fivetran, and dbt.
- Experience with batch and real-time data ingestion and processing frameworks.
- Experience with programming languages such as Python and Java.
- Experience with at least one major cloud platform (Azure, AWS, GCP).
Responsibilities
- Develops data pipelines and solutions that transfer and transform data across various systems.
- Maintains deep technical knowledge of data warehouse, data hub, and analytical tooling.
- Ensures data is transformed and stored efficiently for retrieval and use.
- Maintains data systems to ensure optimal performance.
- Develops a deep understanding of the underlying business systems that feed the analytical systems.
- Follows the standard software development lifecycle, source control practices, and code and process standards.
- Maintains and develops technical knowledge through self-training on current toolsets and computing environments, participating in educational opportunities, maintaining professional networks, and joining professional organizations relevant to their technical skills.
Other
- Bachelor’s Degree in Computer Science/Information Technology or equivalent and 3+ years of experience.
- Ability to communicate effectively, both verbally and in writing, with others at all levels of the Company.
- Willingness to travel both domestically and internationally to support global implementations.
- Ability to work in a large, global corporate structure.
- Work Authorization: Copeland will only employ those who are legally authorized to work in the United States.