Equilibrium Energy is revolutionizing the clean energy transition by developing innovative grid-scale energy storage solutions. Our technology and market platform helps utilities, independent power producers, and commercial customers optimize their renewable energy assets, improve grid reliability, and accelerate decarbonization. As a fast-growing climate tech company, we're building infrastructure that will shape the future of energy markets and enable a sustainable energy economy.
Requirements
- Five years of experience developing a globally distributed data services platform.
- Five years of coding experience in Python, Go, Java, or other applicable languages such as Rust or Julia.
- Five years of experience in data analysis and ETL/ELT pipelines, collecting and transforming data to make it accessible to customers.
- Five years of experience developing and operating large-scale data pipelines, demonstrating knowledge of technologies such as Airflow, Luigi, Spark, Kafka, Elasticsearch, Cassandra, relational database management systems, and SQL.
- Five years of experience with relational and transactional databases (e.g., Postgres or MySQL) or document databases (e.g., MongoDB).
- Five years of experience authoring SQL queries to provide analytics and data access to customers or applications.
- Five years of experience building and optimizing big data pipelines, architectures, and data sets.
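The SQL query-authoring requirement above can be illustrated with a minimal sketch. The `readings` table, site names, and values here are hypothetical, and Python's built-in sqlite3 stands in for a production database such as Postgres:

```python
import sqlite3

# Hypothetical example: a tiny in-memory table of energy readings,
# queried with an aggregate to illustrate analytics-style SQL authoring.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (site TEXT, kwh REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("site_a", 10.0), ("site_a", 14.0), ("site_b", 7.5)],
)

# Aggregate total energy per site, ordered for stable output.
rows = conn.execute(
    "SELECT site, SUM(kwh) AS total_kwh FROM readings "
    "GROUP BY site ORDER BY site"
).fetchall()
print(rows)  # [('site_a', 24.0), ('site_b', 7.5)]
```

The same GROUP BY / aggregate pattern is the core of most analytics queries served to customers or downstream applications.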
Responsibilities
- Build data flow channels and processing systems in Python to extract, transform, load, and integrate data from various sources using their APIs.
- Store data in a feature store for machine learning models.
- Develop complex code and scripts to process structured and unstructured data in near real time, and develop data pipelines using Airflow.
- Design and develop database models, create new data schemas in Postgres, and maintain database code using TypeORM migrations.
- Write GraphQL queries using Hasura to fetch and update information in the database.
- Build DevOps pipelines and automate application build and deployment using GitLab for quick go-live cycles that deliver business value faster.
- Participate in system-level software design discussions and present designs to product teams to develop new modules.
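The extract-transform-load work described above can be sketched in plain Python. In production these stages would typically run as scheduled Airflow tasks; here `fetch_raw` is a hypothetical stand-in for an external API call, and the field names are illustrative only:

```python
import json

# Minimal ETL sketch. Each function maps to one pipeline stage.
def fetch_raw():
    # Extract: pretend this payload came from a REST API returning JSON.
    return json.loads(
        '[{"site": "site_a", "kwh": "10.5"}, {"site": "site_b", "kwh": "7.25"}]'
    )

def transform(records):
    # Transform: normalize types and drop records missing required fields.
    return [
        {"site": r["site"], "kwh": float(r["kwh"])}
        for r in records
        if "site" in r and "kwh" in r
    ]

def load(records, store):
    # Load: accumulate kWh per site into an in-memory "store";
    # a real pipeline would write to Postgres or a feature store.
    for r in records:
        store[r["site"]] = store.get(r["site"], 0.0) + r["kwh"]
    return store

store = load(transform(fetch_raw()), {})
print(store)  # {'site_a': 10.5, 'site_b': 7.25}
```

Keeping each stage as a pure function, as above, makes the stages easy to wrap as individual Airflow tasks and to test in isolation.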
Other
- Five years of progressive, post-baccalaureate experience as a Data Engineer, Data Analyst, Quality Analyst, Software Engineer, or a related occupation.
- Five years of experience performing root cause analysis on internal and external data and processes to answer specific questions and identify opportunities for improvement.
- Five years of experience leveraging unstructured datasets to effectively extract complex data from ambiguous requirements.
- Telecommuting Permitted.
- We're looking for collaborative, talented, passionate, and resourceful folks to join our team.