Emerging capabilities in artificial intelligence, machine learning, and autonomous systems demand robust data engineering, especially when integrating complex, diverse data into mission-driven applications for Booz Allen's defense and intelligence clients.
Requirements
- 2+ years of experience with programming languages including Python and C++
- 2+ years of experience developing and maintaining scalable data pipelines and storage solutions
- Experience with modern databases, including PostgreSQL, MongoDB, Neo4j, or DuckDB
- Experience designing ETL/ELT workflows for structured and unstructured data
- Experience building containerized microservices using Docker or Podman and deploying on Kubernetes
- Experience working with vector-based search and AI-native infrastructure such as Llama.cpp or Ollama
- Experience with modern application frameworks like FastAPI, Gin, or Streamlit
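The ETL/ELT requirement above can be illustrated with a minimal extract-transform-load sketch. This is a hedged, standard-library-only example: the CSV payload, the `readings` table, and the use of in-memory SQLite as a stand-in for a production database such as PostgreSQL are all illustrative assumptions, not part of the role description.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV records (illustrative stand-in for an upstream source).
RAW = """sensor_id,reading,unit
a1,12.5,C
a2,,C
a3,97.1,F
"""

def extract(raw: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw)))

# Transform: drop incomplete rows and normalize all readings to Celsius.
def transform(rows: list[dict]) -> list[tuple[str, float]]:
    out = []
    for r in rows:
        if not r["reading"]:
            continue  # skip records with missing readings
        value = float(r["reading"])
        if r["unit"] == "F":
            value = (value - 32) * 5 / 9
        out.append((r["sensor_id"], round(value, 2)))
    return out

# Load: write cleaned records to a relational store
# (sqlite3 here; a real pipeline would target something like PostgreSQL).
def load(records: list[tuple[str, float]]) -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (sensor_id TEXT, celsius REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", records)
    return conn

conn = load(transform(extract(RAW)))
rows = conn.execute(
    "SELECT sensor_id, celsius FROM readings ORDER BY sensor_id"
).fetchall()
```

The same extract/transform/load split scales from this toy to the structured and unstructured sources named above; only the connectors change.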
Responsibilities
- Design and build the pipelines, services, and backend infrastructure that power advanced analytics and AI
- Build robust infrastructure to support both legacy and bleeding-edge data sources, enabling AI/ML model deployment, geospatial visualization, and multi-modal applications
- Connect operational systems with vector databases
- Embed LLMs into field-deployable tools
- Refine skills in API development, data platform integration, and deployment across secure environments, such as AWS GovCloud and containerized infrastructure
- Support data ingestion, transformation, and retrieval via APIs and backend services
- Collaborate with developers, scientists, and analysts in a fast-paced, agile environment
Other
- Bachelor’s degree
- Secret clearance
- TS/SCI clearance (nice to have)