Help clients find answers in their data and support the most critical global cyber missions by organizing data from disparate sources with advanced technology solutions.
Requirements
- 8+ years of experience with ETL and data pipeline development using tools such as Databricks and AWS Glue
- 5+ years of experience with programming languages such as Python, Java, or Scala
- 3+ years of experience with data modeling and data quality, including schema development and medallion architecture
- 3+ years of experience with database management and data warehousing, including DataOps
- Experience working with FDA data, especially related to medical devices
- Experience working with Databricks PVC
- Experience with version control software such as Git
Responsibilities
- Develop and deploy pipelines and platforms that organize disparate data and make it meaningful
- Manage the assessment, design, building, and maintenance of scalable platforms for clients
- Apply expertise in analytical exploration and data examination
- Lead technical teams and drive technical decision-making
- Develop and implement data engineering activities on mission-driven projects
- Build advanced technology solutions
- Guide a multi-disciplinary team of analysts, data engineers, developers, and data consumers
Other
- Ability to collaborate within a cross-functional team
- Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
- Master’s degree
- Excellent communication skills
- Willingness to work in a fast-paced environment