New York Life is seeking to design and execute a modern data management strategy by building the next generation of insurance solutions in the cloud, leveraging modern and emerging technologies and platforms.
Requirements
Proficiency with Python and PySpark.
Strong knowledge of AWS cloud technologies.
Strong, demonstrable hands-on experience with data engineering and data pipelines.
Proficiency in cloud architecture and best practices, particularly with AWS services such as Amazon RDS, Amazon Redshift, AWS DMS, AWS Glue, Amazon S3, and AWS Lambda.
Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
Identify, recommend, and implement ELT processes and architecture improvements.
Assist with and verify solution design and the production of all design-phase deliverables.
Responsibilities
Build and maintain data engineering solutions on cloud platforms using AWS or Azure services.
Design, develop, and implement scalable data transformations and ETL/ELT processes using Python, PySpark, and/or data integration tools such as IDMC and dbt.
Collaborate with data architects and data analysts to understand data requirements and translate them into scalable, high-performance data integration and pipeline solutions.
Develop and maintain data models and schemas to support data integration and analysis.
Monitor and troubleshoot data pipeline performance, identifying and resolving bottlenecks and issues.
Optimize and tune data integration pipelines for performance, reliability, and scalability.
Implement data quality and validation checks to ensure data accuracy and integrity through testing (an illustrative sketch of this kind of pipeline work follows this list).
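For context only, the sketch below illustrates the kind of PySpark ETL and data quality work these responsibilities describe: extract raw records from S3, apply basic transformations, run a simple validation check, and write curated output. All bucket paths, column names, and thresholds are hypothetical assumptions for illustration, not references to New York Life systems.

    # Minimal PySpark sketch of an extract-transform-validate-load job.
    # Paths, column names, and checks are illustrative assumptions only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("policy_etl_example").getOrCreate()

    # Extract: read raw records from a hypothetical S3 location.
    raw = spark.read.parquet("s3://example-bucket/raw/policies/")

    # Transform: standardize types and remove duplicate keys.
    clean = (
        raw
        .withColumn("effective_date", F.to_date("effective_date"))
        .withColumn("annual_premium", F.col("annual_premium").cast("double"))
        .dropDuplicates(["policy_id"])
    )

    # Data quality check: fail fast if key fields are missing or values are invalid.
    bad_rows = clean.filter(
        F.col("policy_id").isNull() | (F.col("annual_premium") < 0)
    ).count()
    if bad_rows > 0:
        raise ValueError(f"Data quality check failed: {bad_rows} invalid rows")

    # Load: write curated data to a hypothetical curated zone, partitioned by date.
    clean.write.mode("overwrite").partitionBy("effective_date").parquet(
        "s3://example-bucket/curated/policies/"
    )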
Other
Hybrid - 3 days per week
Eight or more years of experience in enterprise-level delivery of data engineering solutions.
Deep understanding of modern data architecture, including experience with data lakes, data warehouses, data marts, relational and dimensional modeling, data quality, and master data management.
Strong background in operational data stores, dimensional modeling, and supporting application data architecture. Experience with Redshift, Snowflake, Databricks SQL, Oracle, Postgres, and MySQL, and an understanding of best-practice architectural concepts for relational data models.
Eight or more years of experience designing and implementing ETL/ELT frameworks for complex data warehouses and data marts.