Help clients find answers in their data to support important missions, such as fraud detection, cancer research, and national intelligence, by organizing disparate data and making it meaningful
Requirements
- Experience programming in JavaScript, Java, Python, SQL, Scala, or Bash/Shell scripting
- Experience building scalable ETL/ELT workflows for reporting and analytics
- Experience with CI/CD practices to automate builds, testing, and deployments
- Experience writing PL/SQL scripts and stored procedures for data manipulation, parsing, and processing within an RDBMS such as Oracle or Postgres
- Knowledge of database design principles, query optimization, and index management
- Experience administering, deploying, and managing RDBMS such as Oracle or Postgres on cloud platforms such as AWS
- AWS certifications, such as the AWS Big Data certification
Responsibilities
- Develop and deploy pipelines and platforms that organize disparate data and make it meaningful
- Manage the assessment, design, building, and maintenance of scalable platforms for clients
- Use experience in analytical exploration and data examination to guide a multi-disciplinary team
- Build scalable ETL/ELT workflows for reporting and analytics
- Write PL/SQL scripts and stored procedures for data manipulation, parsing, and processing within an RDBMS
- Determine the best solution required for business and customer needs in a complex enterprise environment
- Independently research, design, document, and implement new data services
Other
- Top Secret clearance
- Bachelor's degree and 10+ years of experience as a database engineer on a large-volume enterprise system, or 14+ years of such experience in lieu of a degree
- TS/SCI clearance with a polygraph
- Ability to work in a fast-paced, agile environment
- Ability to work with and guide a multi-disciplinary team of analysts, data engineers, developers, and data consumers