

Data Analyst - 2332094

Optum

$112,739 - $138,495
Dec 5, 2025
Eden Prairie, MN, US

Optum Services Inc. is looking for a Data Analyst to transform raw data into meaningful information that improves healthcare outcomes and efficiency.

Requirements

  • Data warehousing concepts, ETL concepts, and working with ETL/Data quality tools on any ETL platform
  • Building, deploying, and troubleshooting data extraction and loading pipelines (ETL) using Azure Data Factory (ADF)
  • Writing complex SQL queries and stored procedures, and performance tuning
  • Analyzing and executing data profiles
  • Developing data solutions on the Snowflake Data Warehouse using Snowflake continuous data pipelines with Snowpipe, Streams, and Tasks (a minimal sketch follows this list)
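
As context for the last requirement, here is a minimal sketch of Snowflake continuous ingestion with Snowpipe. The object names (claims_raw, @raw_stage, claims_pipe) are hypothetical, and AUTO_INGEST assumes cloud storage event notifications are already configured on the external stage.

    -- All object names are illustrative, not part of the posting's environment.

    -- Landing table: one VARIANT column holding each raw JSON record.
    CREATE TABLE IF NOT EXISTS claims_raw (
        record VARIANT
    );

    -- Snowpipe: continuously copies new files from the external stage into
    -- the landing table as they arrive.
    CREATE PIPE IF NOT EXISTS claims_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO claims_raw
      FROM @raw_stage/claims/
      FILE_FORMAT = (TYPE = 'JSON');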

Responsibilities

  • Develop and implement a set of techniques or analytics applications to transform raw data into meaningful information using data-oriented programming languages and visualization software.
  • Assist in the overall architecture of the ETL design and proactively provide input on designing, implementing, and automating the ETL flows.
  • Develop ETL pipelines and data flows in and out of the data warehouse using a combination of Azure Data Factory and Snowflake toolsets.
  • Develop idempotent ETL process designs using ADF data flows and pipelines, so that interrupted, incomplete, or failed processes can be rerun without errors.
  • Work in Snowflake virtual warehouses as needed and automate data pipelines using Snowpipe for tedious ETL problems.
  • Capture changes in data dimensions and maintain versions of them using Streams in Snowflake, scheduling the loads with Tasks (see the sketch after this list).
  • Test ETL system code, data designs, pipelines, and data flows.
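
Streams and Tasks are Snowflake's change-capture and scheduling primitives, so the responsibilities above roughly describe the pattern sketched below: a Stream feeding a scheduled Task whose keyed MERGE lets an interrupted or repeated load be rerun without duplicating rows. Every object and field name here (claims_raw, dim_member, etl_wh, member_id, plan, updated_at) is hypothetical.

    -- Stream: tracks rows added to the landing table since it was last consumed.
    CREATE STREAM IF NOT EXISTS claims_raw_stream ON TABLE claims_raw;

    -- Task: wakes on a schedule but only runs when the stream has new data.
    -- Keying the MERGE on member_id makes reruns safe: reprocessing a batch
    -- updates existing rows instead of inserting duplicates.
    CREATE TASK IF NOT EXISTS load_dim_member
      WAREHOUSE = etl_wh
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('claims_raw_stream')
    AS
      MERGE INTO dim_member AS tgt
      USING (
          SELECT record:member_id::STRING AS member_id,
                 record:plan::STRING      AS plan
          FROM claims_raw_stream
          -- keep only the latest version of each member in this batch
          QUALIFY ROW_NUMBER() OVER (
              PARTITION BY record:member_id
              ORDER BY record:updated_at DESC NULLS LAST) = 1
      ) AS src
      ON tgt.member_id = src.member_id
      WHEN MATCHED THEN UPDATE SET plan = src.plan
      WHEN NOT MATCHED THEN INSERT (member_id, plan) VALUES (src.member_id, src.plan);

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK load_dim_member RESUME;

This overwrite-style MERGE is only one option; whether a dimension is versioned (e.g., type 2 history) or updated in place would depend on the actual requirements.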

Other

  • Develop and manage effective working relationships with other departments, groups and personnel
  • Communicate efficiently with the data architect to understand the requirements and business processes, so that data is transformed in a way geared toward the needs of end users
  • Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions
  • Perform root cause analysis on all processes, resolve production issues, and run routine tests on databases, data flows, and pipelines
  • Document the implementations and test cases, and build the deployment documents needed for CI/CD