As part of the security team, Optum is looking to build an investigative data platform that ingests data from events and provides a secure environment for data analysts to understand the data.
Requirements
- 5+ years of writing and deploying Python and/or Java code
- 5+ years of experience with PySpark and Databricks
- 5+ years of experience normalizing unstructured data
- 4+ years of experience with DevOps and CI/CD tools such as GitHub Actions, Kubernetes, Docker, and Terraform
- 2+ years of experience leveraging and deploying Generative AI use cases in production environments
- Expertise in SQL and database fundamentals, with strong experience working with data lakes and warehouses (e.g., Snowflake, Databricks)
- Experience designing and scaling ELT/ETL frameworks with orchestration tools such as Airflow or similar platforms
Responsibilities
- Participate in incident investigations following a data event
- Partner with the team to design and develop a scalable, high-performance data and reporting platform that serves our customers and stakeholders
- Partner with cross-functional stakeholders to understand evolving data needs and define long-term technical solutions
- Drive strategic initiatives around AI solutions, data quality, observability, lineage, and governance, including leveraging AI and AI tools in security
- Introduce and evolve best practices in data modeling, orchestration, testing, and monitoring
- Identify and champion investments in platform scalability, reusability, and operational efficiency
- Build, maintain, and leverage parsing and analytic libraries
Other
- You'll enjoy the flexibility to work remotely from anywhere within the U.S.
- Work comfortably under time-sensitive conditions while ensuring thoroughness.
- Maintain high ethical standards and remain objective and confidential.
- All employees working remotely will be required to adhere to UnitedHealth Group's Telecommuter Policy.
- Candidates are required to pass a drug test before beginning employment.