Oddball is seeking a data engineer to design, build, and maintain secure, scalable data pipelines that enable effective use of enterprise data in the federal space.
Requirements
- 5+ years of experience in data engineering or data platform development.
- Strong proficiency with ETL/ELT tools and frameworks (e.g., Apache Spark, Apache Airflow, Talend, Informatica).
- Solid experience with SQL and working with relational databases (PostgreSQL, Oracle, SQL Server, etc.).
- Familiarity with data warehousing concepts and modern architectures (Snowflake, Redshift, BigQuery, or equivalent).
- Proficiency with Python or another scripting language for building and automating data pipelines.
- Experience designing and supporting cloud-based data environments (AWS preferred; Azure or GCP a plus).
- Knowledge of compliance and security requirements in healthcare or federal environments (HIPAA, NIST, RMF).
Responsibilities
- Design, develop, and maintain ETL/ELT pipelines to move and transform data from multiple sources into enterprise data platforms.
- Build and optimize data models, schemas, and storage solutions to support analytics and reporting.
- Ensure compliance with federal security and data governance frameworks (HIPAA, NIST, CMMC, RMF).
- Collaborate with data stewards, analysts, and business teams to define requirements and deliver trusted data products.
- Implement monitoring and logging for data pipelines to ensure reliability, scalability, and performance.
- Support data integration and migration projects across enterprise systems.
- Document data flows, transformations, and system integrations for audit and compliance needs.
Other
- Applicants must be authorized to work in the United States.
- In alignment with federal contract requirements, certain roles may also require U.S. citizenship and the ability to obtain and maintain a federal background investigation and/or a security clearance.
- Bachelor’s degree required.
- Perform other related duties as assigned.
- Must be able to work fully remotely.