LexisNexis Risk Solutions is looking to solve data processing challenges in the insurance industry by ensuring data integrity, availability, and the timely completion of data processes through monitoring, troubleshooting, and automation.
Requirements
- Proficient in SQL, Windows, Unix, HPCC, with working knowledge of Python, Java, or Scala for data processing.
- Solid grasp of ETL processes, data warehousing principles, and big data technologies like Spark and Hadoop.
- Experience with cloud platforms such as AWS, Azure, or GCP, and familiarity with version control systems like Git.
- Hands-on exposure through internships or academic projects in data engineering or analytics.
Responsibilities
- Assist in designing, building, and maintaining reliable data pipelines to support analytics and business intelligence.
- Monitor data flows and troubleshoot issues to ensure data integrity and availability.
- Receive data processes from development teams, analyze them for production readiness, and add them to the automation list.
- Monitor the internal GUI, email, and Teams channels for data processing issues.
- Ensure that all data processes finish in a timely fashion.
- Communicate with development teams when issues arise.
Other
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Document processes, data flows, and system configurations.
- Strong analytical thinking, problem-solving, and collaboration skills.
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.