As a data engineer, you know that organizing big data gathered from disparate sources can yield pivotal insights. As a Data Engineer at Booz Allen, you’ll carry out data engineering work on some of the most mission-driven projects in the industry. You’ll develop and deploy pipelines and platforms that organize disparate data and make it available and meaningful.
Requirements
- 3+ years of experience with data pipelines, including data acquisition, data prep, and database architecture, to make data easily searchable and retrievable
- 3+ years of experience in ETL development using Java, Python, SQL, or Scala
- 2+ years of experience with PostgreSQL or Kafka
- Experience with AWS services, including RDS and RDS Aurora
- Experience with Kafka Connect, Kafka Streams, and other Kafka ecosystem tools
- Experience with NiFi
- Experience with Terraform, Ansible, or other infrastructure-as-code tools
Responsibilities
- Implement data engineering activities on mission-driven projects
- Develop and deploy pipelines and platforms that organize disparate data and make it available and meaningful
- Perform data acquisition, data preparation, and database architecture
- Make data easily searchable and retrievable
- Develop ETL processes
Other
- Ability to work independently and as part of a team
- Active TS/SCI clearance; willingness to take a polygraph exam
- Bachelor’s degree in Computer Science or a related field
- Ability to obtain a Security+ Certification within 6 months of start date