At Cotiviti, the business problem is ensuring that operational functions run as expected, including managing data implementations, data production, data quality, and data security for its clients.
Requirements
- 2+ years of working knowledge of RDBMS platforms (Oracle, MS SQL, Vertica, etc.) and experience using SQL, PL/SQL, or other data integration/ETL tools.
- 2+ years of experience in data engineering, data analysis, or a related field with a strong track record of building and managing data pipelines.
- 2-4 years’ experience with data aggregation, standardization, linking, quality check mechanisms, and reporting.
- 2-4 years’ experience with big data technologies such as Hadoop and Spark.
- Proficiency in analyzing, designing, and developing solutions and strategies involving relational databases (e.g., Oracle, Vertica, SQL Server), ETL tools (e.g., SSIS, ODI, Informatica), and data warehousing concepts.
- Solid understanding of Linux environments; strong knowledge of shell scripting and file systems.
- Experience in at least one programming language such as Python, Java, Scala, or PowerShell.
Responsibilities
- Oversee ETL processes to ensure they complete successfully, with primary responsibility for providing accurate, timely data across various products to support the company's revenue generation.
- Maintain data pipelines running on a proprietary big data processing platform and provide support to ensure data delivery SLAs are met.
- Maintain data engineering processes using a variety of tools including T-SQL, Spark and Scala, and shell scripting.
- Develop, support, and improve scalable, efficient data ingestion processes and techniques that enhance process efficiency and optimize query performance for our proprietary data applications and systems.
- Implement and perform data validation and quality checks to maintain high data integrity.
- Perform troubleshooting, data analysis, data mining, and root-cause investigation of issues using cutting-edge data analysis tools in a fast-paced environment.
- Develop data transformation specifications for converting source data and loading it into target data warehouse tables using SQL and other data integration/ETL tools.
Other
- Bachelor’s degree in Computer Science or Information Technology, or equivalent work experience.
- Ability to analyze data with a high degree of accuracy, identify root causes, and apply problem-solving skills to troubleshoot data-related issues.
- Excellent verbal, listening, and written communication skills.
- Ability to multitask and prioritize projects to meet scheduled deadlines and tight turnaround times.
- Flexible work schedule.