John Hancock Life Insurance Company (U.S.A.) is looking for a Data Engineer to design, construct, test, and maintain highly scalable data management systems, ensuring data is accessible, reliable, and efficiently coordinated across the organization.
Requirements
- 3 to 5 years of experience in developing and managing Azure Data Pipelines.
- Demonstrable experience with Databricks, including building and maintaining data pipelines.
- Proficient in ETL (Extract, Transform, Load) processes and best practices.
- Experience with Informatica for data integration and transformation tasks.
- Proficiency in using reporting tools like Power BI and Cognos for data visualization and reporting.
- Solid understanding of data architecture and database management systems.
- Strong programming skills in languages such as Python, SQL, or Scala.
Responsibilities
- Building and maintaining robust and scalable data pipelines that automate the collection, transformation, and storage of data from various sources.
- Ensuring seamless integration of data from different sources into a unified data warehouse or data lake, facilitating easy access and analysis.
- Designing, implementing, and managing databases and data storage solutions that can handle large volumes of data efficiently.
- Implementing processes and tools to ensure the accuracy, consistency, and security of data across the organization.
- Monitoring system performance and making improvements as needed to ensure data systems are running efficiently and effectively.
- Working with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
- Staying up to date with emerging technologies and tools in data engineering and recommending their adoption where appropriate to improve data processes and infrastructure.
Other
- 5 to 7 years of experience in data engineering or a related field.
- Excellent problem-solving abilities and attention to detail.
- Ability to work collaboratively in a team-oriented setting and communicate effectively with technical and non-technical stakeholders.
- Experience with data quality and data governance practices to ensure data integrity and security across the organization.
- Familiarity with performance monitoring and optimization techniques for large-scale data systems.