The company is working to transform the electrical grid to deliver safer, more reliable, and more resilient power worldwide.
Requirements
- Proficiency in programming languages such as Python, Java, or SQL
- Strong understanding of data warehousing concepts and ETL processes
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud)
- Experience with CI/CD pipelines and best practices
- Relevant Microsoft certifications
- Experience with Databricks
- Experience with big data technologies (e.g., Hadoop, Spark)
Responsibilities
- Lead the design, development, and maintenance of data pipelines and ETL processes
- Implement standards for best-practice design, development, and governance of the overall data stack
- Integrate data from multiple sources into our data warehouse using a range of technologies and coding techniques
- Ensure data quality and integrity through data validation and cleansing
- Optimize database performance and scalability
- Document data processes and workflows
- Monitor and troubleshoot data pipeline issues
Other
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience
- 5-7 years of proven experience as a Senior Data Engineer or in a similar role
- Excellent communication and teamwork skills
- Attention to detail and a commitment to data accuracy
- Hybrid schedule: 8:00 am – 5:00 pm (Mon–Fri)