Nationwide is looking to hire a Data Engineer to join its tech-driven environment, working directly with the Actuarial department within Nationwide Financial under Digital Back Office. The role calls for curiosity and a drive for learning to tackle complex challenges, ownership of projects from inception to completion, and the delivery of data solutions that are secure, reliable, and efficient in support of the company's mission of providing extraordinary care.
Requirements
- Command Linux and Windows environments with ease.
- Develop and maintain systems using Informatica, Talend, and Perl, and use SQL Server, Oracle, and other ETL tools to manage and transform data effectively.
- Utilize your knowledge of Snowflake, Databricks, SQL Server, and Oracle to maintain robust database systems.
- Enhance our systems with Python and UAC scripting capabilities.
- Contribute to our cloud initiatives with your AWS expertise.
- Streamline processes by building and maintaining efficient pipelines.
- Moderate to advanced skills with modern programming and scripting languages (e.g., SQL, R, Python, Spark, UNIX Shell scripting, Perl, or Ruby).
Responsibilities
- Handle build work, incident management, run support for ETL, and third-party software management.
- Employ your problem-solving skills to navigate and resolve multifaceted technical issues.
- Take full ownership of tasks, ensuring meticulous execution and delivery of high-quality work.
- Provide basic to moderate technical consultation on data product projects, analyzing end-to-end data product requirements and existing business processes to lead the design, development, and implementation of data products.
- Produce data building blocks, data models, and data flows for varying client demands such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration.
- Create simple to moderate business-user access methods for structured and unstructured data through techniques such as mapping data to a common data model, NLP, AI, transforming data as necessary to satisfy business rules, statistical computations, and validation of data content.
- Develop and maintain scalable data pipelines for both streaming and batch requirements, and build out new API integrations to support continuing increases in data volume and complexity (see the sketch below).
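For context on the pipeline work described above, here is a minimal sketch of a batch extract-transform-load job in Python. SQLite and the standard library stand in for the platforms named in this posting (SQL Server, Oracle, Snowflake, Informatica), and the table names, columns, and business rule are hypothetical examples, not part of the role's actual systems.

```python
import sqlite3
from datetime import date

# Minimal batch ETL sketch: extract raw rows from a staging table, apply a
# simple business rule, and load the result into a reporting table.
# SQLite stands in for the production databases named in this posting;
# the tables, columns, and premium rule below are hypothetical.

def run_batch_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()

    # Extract: pull today's raw records from a staging table.
    cur.execute(
        "SELECT policy_id, premium, status FROM staging_policies WHERE load_date = ?",
        (date.today().isoformat(),),
    )
    rows = cur.fetchall()

    # Transform: keep only active policies and round premiums to cents.
    transformed = [
        (policy_id, round(premium, 2))
        for policy_id, premium, status in rows
        if status == "ACTIVE"
    ]

    # Load: upsert into the reporting table used by downstream dashboards.
    cur.executemany(
        "INSERT OR REPLACE INTO reporting_policies (policy_id, premium) VALUES (?, ?)",
        transformed,
    )
    conn.commit()
    return len(transformed)
```

In practice the same extract/transform/load steps would run against the enterprise databases and be scheduled and monitored through the ETL and automation tooling listed in the requirements.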
Other
- Serve as the primary point of contact for the Actuarial business within IDS at NF.
- Foster a culture of continuous learning and improvement, staying ahead of the curve in a rapidly evolving tech landscape, including AI.
- Collaborate with cross-functional teams to drive innovation and operational excellence.
- Undergraduate studies in computer science, management information systems, business, statistics, math, or a related field, or comparable experience and education, strongly preferred.
- Three to five years of relevant experience with data quality rules, data management organization/standards and practices, and software development.