At Booz Allen, we help clients find answers in their data to impact important missions, from fraud detection and cancer research to national intelligence
Requirements
- 2+ years of experience writing software in programming languages, including Python
- 2+ years of experience with ETL operations, including on-premises or cloud infrastructure
- 1+ years of experience with source control and collaboration software, including Git or Atlassian tools
- Knowledge of relational and non-relational database technologies, including SQL or GraphQL
- Knowledge of automation and scripting on Linux or Windows operating systems
- Experience deploying analytics workloads on a platform as a service (PaaS) and software as a service (SaaS), including AWS EMR, Redshift, SageMaker, Databricks, SQL Data Warehouse, or machine learning service
- Experience with distributed or parallel programming frameworks, including Apache Spark or NVIDIA CUDA
Responsibilities
- Develop and deploy the pipelines and platforms that organize and make disparate data meaningful
- Support the assessment, design, development, and maintenance of scalable platforms for clients
- Work with a multi-disciplinary team of analysts, data engineers, developers, and data consumers in a fast-paced, agile environment
- Sharpen skills in analytical exploration and data examination
- Build advanced technology solutions and implement data engineering activities on mission-driven projects
- Develop and deploy large-scale batch and stream analytics pipelines
- Work with integrated groups comprised of customer success managers, infrastructure engineers, data scientists, and software engineers
Other
- Ability to obtain a Secret clearance
- Bachelor's degree
- Master's degree in CS, Computer Engineering, Mathematics, Data Science, Software Engineering, Electrical Engineering, Physics, or a related field
- Cloud Development Certification such as AWS Solutions Architect or Azure Certification
- Information Security Certification such as Security+ or CISSP Certification